LLMs / Coding | MARCH 2024
Reading Time: ~3-4 minutes

Revolutionizing Product Design with an AI-Driven RAG Application

AWS Chatbot - Case study

Skills I relied on: Knowledge base management, LLM integration, Amazon Bedrock, Amazon S3, OpenSearch Serverless, Python development, API usage, Flask deployment.

Problem
Teams often struggle with fragmented information spread across various documents—product narratives, user research, user personas, team meeting notes, and so on. This scattered data makes it difficult to surface the insights buried within it. The question I asked myself was: How can we streamline access to this knowledge and make it actionable in the product design and development process?
The Game-Changing Idea
Imagine if a team could centralize all this information in a knowledge base powered by a large language model (LLM). This LLM would enable the team to ask questions, brainstorm, and gain insights by leveraging everything from product documents to user personas in one place. How would this change the way products are designed and developed?

What if an LLM could analyze pain points from user research and compare them to a product's current feature set? Even better, what if the LLM could suggest improvements or solutions tailored to your target audience based on user personas? For instance, you could ask the LLM to:
"Generate user stories for [your Product] that describe solutions to pain points identified in user research. Provide an output based on [user personas] with three user stories per persona."
Demo
To demonstrate this concept, I coded a simple Retrieval-Augmented Generation (RAG) application. The RAG system connects to an Amazon Bedrock knowledge base and pulls data from an S3 bucket. The app generates insightful responses to user queries, like suggesting new user stories based on pain points identified in user research.
Implementation Approach
By creating a Knowledge Base in Amazon Bedrock and linking it to a data source (e.g., an Amazon S3 bucket storing all product documents), I was able to do just that. Here's a breakdown:
  1. Gather Documents: Upload product narrative documents, user research, personas, and meeting notes to an Amazon S3 bucket.
  2. Create Knowledge Base: Set up a Knowledge Base in Amazon Bedrock.
  3. Configure Data Source: Link the S3 bucket to the Knowledge Base, ensuring seamless retrieval of your product data.
  4. Select Embeddings Model: Choose an embeddings model to represent the document information in vector format.
  5. Create Vector Database: Use Amazon’s "Quick create" option to build a vector database (OpenSearch Serverless) to store and retrieve document vectors.
  6. Build RAG Application: Query your knowledge base through the RetrieveAndGenerate API to generate AI-driven insights.
  7. Integrate in VS Code: Use the AWS SDK for Python (Boto3) to call the API from your code in Visual Studio Code (a sketch of this call follows the list).
  8. Run Your App: Serve the app locally with Flask so the team can interact with the knowledge base (see the Flask sketch below).
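To make steps 6 and 7 concrete, here is a minimal sketch of the RetrieveAndGenerate call with Boto3. The knowledge base ID, region, model ARN, and example question are placeholders rather than values from the original project; substitute your own from the Bedrock console.

```python
import boto3

# Placeholder values -- replace with your own knowledge base ID, region, and model ARN.
KNOWLEDGE_BASE_ID = "YOUR_KB_ID"
MODEL_ARN = "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-sonnet-20240229-v1:0"

# The Bedrock Agent Runtime client exposes the RetrieveAndGenerate API.
bedrock_agent_runtime = boto3.client("bedrock-agent-runtime", region_name="us-east-1")


def query_knowledge_base(question: str) -> str:
    """Retrieve relevant chunks from the knowledge base and generate an answer."""
    response = bedrock_agent_runtime.retrieve_and_generate(
        input={"text": question},
        retrieveAndGenerateConfiguration={
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": KNOWLEDGE_BASE_ID,
                "modelArn": MODEL_ARN,
            },
        },
    )
    return response["output"]["text"]


if __name__ == "__main__":
    print(query_knowledge_base(
        "Generate three user stories per persona that address the pain points "
        "identified in our user research."
    ))
```

The response also includes a citations field listing the retrieved source chunks, which is useful if you want to show the team which document an answer came from.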
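For step 8, a minimal Flask wrapper over the function above, again a sketch rather than the exact app: the rag module name, the /ask route, and the port are my own placeholder choices.

```python
from flask import Flask, jsonify, request

# Assumes query_knowledge_base() from the sketch above lives in rag.py (placeholder name).
from rag import query_knowledge_base

app = Flask(__name__)


@app.route("/ask", methods=["POST"])
def ask():
    """Accept a JSON body like {"question": "..."} and return the generated answer."""
    question = (request.get_json(silent=True) or {}).get("question", "")
    if not question:
        return jsonify({"error": "Missing 'question' field"}), 400
    return jsonify({"answer": query_knowledge_base(question)})


if __name__ == "__main__":
    # Local development server only.
    app.run(debug=True, port=5000)
```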
My Design Process
This project provided an opportunity to define the product design process flow for the cross-functional teams I was working with. Together with my UX Design pair, I set out to create a reference guide for an efficient, cross-functional team workflow. The timeline below represents an ideal product design process flow, from the inception of a design feature through to completion, along with actions to take in response to unplanned events.

This is not an exhaustive list of tasks for each department, but rather a high-level outline of the critical methods used to capture user feedback and input from all Product Teams and integrate them into design solutions through collaboration.

Conclusion
The resulting application generates actionable insights from the knowledge base, helping the team brainstorm, refine features, and create user stories. This experimental project shows great potential for enhancing the product development process and helping designers and developers work more efficiently.
Next Steps
The focus will shift to improving data parsing, exploring more complex product queries, and using Amazon Q Business applications to simplify UI creation for knowledge bases, RAG, and query tools.
Key things I learned
  1. A centralized, LLM-powered knowledge base can transform product design by making insights from scattered documents easily accessible.
  2. Using Amazon Bedrock, S3, and the RetrieveAndGenerate API, teams can build intelligent applications that aid in decision-making and brainstorming.
  3. Continuous learning and experimentation with AI tools can yield powerful new ways of working, even in complex fields like product development.
👊 Hope this case study sparks ideas for your own projects!