Key Technical Highlights
Hybrid Search: Amazon OpenSearch Service handles both vector and keyword search, ensuring relevant retrieval across the intelligence corpus. The platform processes multiple document types through AI-driven custom chunking and Cohere embeddings, capturing nuanced semantics across diverse data sources.
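For illustration, a hybrid query of this kind can be expressed in a few lines with the OpenSearch Python client. The index name, field names, and search-pipeline name below are placeholders, and the sketch assumes a score-normalization search pipeline has already been configured on the domain:

```python
# Minimal sketch of a hybrid (keyword + vector) query against OpenSearch.
# Index name, field names, and the "hybrid-pipeline" search pipeline are
# illustrative assumptions, not details from the project.
from opensearchpy import OpenSearch

client = OpenSearch(
    hosts=[{"host": "search-domain.example.com", "port": 443}],
    use_ssl=True,
)

def hybrid_search(query_text: str, query_vector: list[float], k: int = 10):
    """query_vector: the Cohere embedding of query_text."""
    body = {
        "query": {
            "hybrid": {
                "queries": [
                    # Keyword (BM25) leg of the hybrid query
                    {"match": {"content": {"query": query_text}}},
                    # Vector leg over the embedding field
                    {"knn": {"embedding": {"vector": query_vector, "k": k}}},
                ]
            }
        },
        "size": k,
    }
    # Assumes a search pipeline with a normalization processor was created
    # beforehand to blend the keyword and vector relevance scores.
    return client.search(
        index="intel-reports",
        body=body,
        params={"search_pipeline": "hybrid-pipeline"},
    )
```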
AI Models: Amazon Bedrock provides the foundation, with native integrations for Cohere embeddings (semantic search) and Anthropic Claude (content generation). A GenAI router and planner, aware of chat history, dynamically orchestrates multiple analytical tools, combining document retrieval, geographic intelligence, and route analysis as needed.
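As a rough sketch of what those Bedrock integrations can look like in code, the snippet below embeds a query with Cohere and generates an answer with Claude through the Converse API. The model IDs, region, and prompt format are assumptions for illustration, not the project's actual configuration:

```python
# Illustrative Bedrock calls: Cohere for query embeddings, Claude for
# answer generation. Model IDs and prompt structure are assumptions.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def embed_query(text: str) -> list[float]:
    # Cohere Embed on Bedrock expects a list of texts and an input_type hint.
    resp = bedrock.invoke_model(
        modelId="cohere.embed-multilingual-v3",
        body=json.dumps({"texts": [text], "input_type": "search_query"}),
    )
    return json.loads(resp["body"].read())["embeddings"][0]

def generate_answer(question: str, retrieved_passages: list[str]) -> str:
    # The Converse API offers a model-agnostic way to call Claude on Bedrock.
    context = "\n\n".join(retrieved_passages)
    resp = bedrock.converse(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",
        messages=[{
            "role": "user",
            "content": [{"text": f"Context:\n{context}\n\nQuestion: {question}"}],
        }],
    )
    return resp["output"]["message"]["content"][0]["text"]
```

In a router-and-planner setup, a call like `generate_answer` would be just one of several tools the planner can choose from, alongside the geographic and route-analysis tools mentioned above.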
Beyond Text: The system doesn't just return text; it uses deep analytics to generate complementary visualizations, such as safe-route maps and charts, to aid decision-making.
"One of the key technical achievements in this project is our application of deep analytics to improve the quality of the tool's responses. Text alone is not always the best way to present insights, so we enabled the agent to also provide users with complementary visualizations. For example, if a user wants to know the safest route between point A and point B, the tool will present a clear map with precise, explanatory information." — BigData Boutique team
Cost & Scale: AWS Lambda streamlines ETL procedures — efficiently processing and embedding reports for indexing in OpenSearch — while Amazon ECS hosts the scalable and fault-tolerant agent application. Amazon CloudWatch enables real-time monitoring of service health, usage patterns, and AI model consumption. Throughout the project, careful attention was given to cost optimization, with strategies like selecting cost-effective AI models for each task and minimizing token usage.