Organizations often struggle with information silos: critical knowledge is spread across various systems, making it hard for employees to access, slowing down decision-making, and hurting productivity. Recognizing this, we at base2Services took on the issue during our February HackDay, using Amazon Kendra and AI technologies to build a powerful, unified search interface tailored for our internal teams. This solution has since proven valuable, and we are happy to share our learnings with other organizations navigating similar knowledge management challenges.
Like many organizations, base2Services stores information across multiple platforms, including Confluence, Zendesk, and Slack. This fragmentation can lead to inefficiencies and frustration when team members need to quickly find specific information. Our goal was to develop a solution that could seamlessly query these disparate data sources and provide meaningful answers, not just links to documents.
Amazon Kendra proved to be the ideal foundation for our project. This AI-powered enterprise search service offers machine-learning relevance ranking, per-result confidence scores, and native connectors for the platforms we already use, including Confluence, Zendesk, and Slack.
Here's how we initialized the Kendra client and performed a search:
```python
import boto3

INDEX_ID = 'c0ed0b41-2f39-4eca-b99d-b83386c285e4'

kendra_client = boto3.client('kendra')

def extract_context_from_kendra(index_id, prompt):
    # Retrieve passages relevant to the prompt from the Kendra index.
    results = kendra_client.retrieve(IndexId=index_id, QueryText=prompt)['ResultItems']
    high_confidence_results = []
    medium_confidence_results = []
    for item in results:
        score_confidence = item['ScoreAttributes']['ScoreConfidence']
        if score_confidence in ('HIGH', 'VERY_HIGH'):
            high_confidence_results.append(item['Content'])
        else:
            medium_confidence_results.append(item['Content'])
    # Prefer high-confidence passages; fall back to the rest if none qualify.
    result_content = '\n'.join(high_confidence_results) if high_confidence_results else '\n'.join(medium_confidence_results)
    return result_content
```
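The confidence-bucketing step can also be factored into a pure function so it can be exercised without an AWS connection. This is our own sketch (the function name `select_by_confidence` is ours), with mocked items matching the shape of Kendra's `Retrieve` response:

```python
def select_by_confidence(result_items):
    """Prefer HIGH/VERY_HIGH passages; fall back to the rest if none qualify."""
    high, other = [], []
    for item in result_items:
        confidence = item['ScoreAttributes']['ScoreConfidence']
        (high if confidence in ('HIGH', 'VERY_HIGH') else other).append(item['Content'])
    return '\n'.join(high) if high else '\n'.join(other)

# Example with mocked Retrieve results:
mock = [
    {'Content': 'runbook entry', 'ScoreAttributes': {'ScoreConfidence': 'HIGH'}},
    {'Content': 'old ticket', 'ScoreAttributes': {'ScoreConfidence': 'LOW'}},
]
print(select_by_confidence(mock))  # 'runbook entry'
```

Keeping the filtering logic separate from the API call made it easy to tune the confidence thresholds during the HackDay.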
While Kendra excels at finding relevant documents, we wanted to take our solution a step further. By integrating Amazon Bedrock, a fully managed service offering high-performance foundation models, we were able to transform raw search results into concise, informative answers.
Our application uses Kendra to retrieve relevant documents, then feeds this context into a Bedrock language model. This combination allows us to generate human-like responses to queries, synthesizing information from multiple sources.
Here's how we used Bedrock to generate responses based on Kendra's search results:
```python
import json

bedrock_client = boto3.client(service_name='bedrock-runtime', region_name='us-east-1')

def generate_bedrock_response(message):
    # Request body format for the AI21 Jurassic-2 Ultra model.
    request_body = {
        "prompt": message,
        "maxTokens": 2000,
        "temperature": 0.2,  # low temperature keeps answers grounded in the context
        "topP": 0.2
    }
    response = bedrock_client.invoke_model(
        body=json.dumps(request_body),
        modelId='ai21.j2-ultra',
        accept='application/json',
        contentType='application/json'
    )
    response_body = json.loads(response['body'].read())
    return response_body['completions'][0]['data']['text']
```
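Gluing the two services together is then a matter of wrapping the retrieved context and the user's question into a single prompt. The template below is a simplified sketch of the pattern, not our exact production prompt; `answer_question` assumes the `extract_context_from_kendra` and `generate_bedrock_response` functions above are in scope:

```python
def build_rag_prompt(question, context):
    """Wrap retrieved context and the user's question into one prompt."""
    return (
        "Answer the question using only the context below. "
        "If the context is insufficient, say so.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )

def answer_question(question):
    # Retrieve context from Kendra, then have Bedrock synthesize an answer.
    context = extract_context_from_kendra(INDEX_ID, question)
    return generate_bedrock_response(build_rag_prompt(question, context))
```

Instructing the model to answer only from the supplied context is what turns raw search hits into grounded answers rather than free-form generation.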
To make the solution truly effective, we built a conversational interface on top of the search pipeline, keeping a rolling chat history so that follow-up questions retain context:
```python
import sys

MAX_HISTORY_LENGTH = 5  # how many question/answer turns to keep (tune as needed)

def run_chain(chain, prompt: str, history=[]):
    return chain({"question": prompt, "chat_history": history})

chat_history = []
qa = build_chain()  # constructs the conversational retrieval chain (defined elsewhere)

print("Hello! How can I help you?")
print("Ask a question, start a New search: or CTRL-D to exit.")
print(">", end=" ", flush=True)
for query in sys.stdin:
    if query.strip().lower().startswith("new search:"):
        # "new search:" resets the conversation context.
        query = query.strip().lower().replace("new search:", "")
        chat_history = []
    elif len(chat_history) == MAX_HISTORY_LENGTH:
        # Drop the oldest turn to keep the history bounded.
        chat_history.pop(0)
    result = run_chain(qa, query, chat_history)
    chat_history.append((query, result["answer"]))
    print(result['answer'])
    # ... (code to display sources and prompt for next question)
```
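The history-bounding behaviour in that loop can be expressed as a small pure function, which makes it easy to test in isolation. This is our own sketch (`trim_history` is an illustrative name, not from the original code), and the limit of five turns is an assumed value:

```python
MAX_HISTORY_LENGTH = 5  # assumed limit on (question, answer) turns

def trim_history(history, max_length=MAX_HISTORY_LENGTH):
    """Keep only the most recent (question, answer) pairs."""
    return history[-max_length:]

history = [(f"q{i}", f"a{i}") for i in range(8)]
print(trim_history(history))  # keeps the last five turns, q3 through q7
```

Bounding the history keeps the prompt within the model's context window while still supporting natural follow-up questions.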
We also applied the same retrieval pattern to generate customer case studies, combining internal documentation with content from our public website (WEB_INDEX_ID points to a second Kendra index covering the public site):

```python
def generate_case_study(customer):
    search_terms = f"{customer}+AWS"
    prompt = (
        f"Write a case study for {customer} highlighting what base2Services did for them, "
        f"and include a bulleted list of the AWS services that {customer} uses. "
        "Make sure to only list each service once."
    )
    # Pull context from both the internal index and the public website index.
    internal_base2services_documentation = extract_context_from_kendra(INDEX_ID, search_terms)
    public_website_data = extract_context_from_kendra(WEB_INDEX_ID, search_terms)
    message = f"""You are a base2Services support engineer. Support base2Services customers using only the information provided in the internal base2Services documentation and the public website data.

{prompt}

Internal base2Services documentation:
{internal_base2services_documentation}

Additional information about {customer} from the base2Services website:
{public_website_data}
"""
    return generate_bedrock_response(message)
```
Our project showcases several advanced AI capabilities that address the challenges of distributed information management: retrieval-augmented generation grounded in our own documents, conversational memory for follow-up questions, and synthesis of information across multiple internal and public sources.
The AI-powered search solution our team built during the HackDay not only highlighted our team's technical depth, but also its potential to deliver substantial, practical benefits: faster access to institutional knowledge, less time lost switching between tools, and direct answers instead of lists of links.
Our HackDay project highlights the exciting potential of combining enterprise search with advanced AI models. As we continue to enhance and expand this solution, we see even greater opportunities for improving knowledge management and information accessibility within our organization.
By utilizing cutting-edge technologies like Amazon Kendra and Bedrock, base2Services is not just solving current challenges but also positioning itself at the forefront of AI-driven enterprise solutions. This project exemplifies how we can leverage AI to create more intelligent, efficient, and user-friendly information systems for both our team and our clients.
If your organization is facing similar challenges or if you see the potential for implementing a solution like ours, we invite you to reach out.