First, install the libraries used in the code.
python3 -m venv bedrock
source bedrock/bin/activate
pip install boto3 streamlit
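The app uses boto3, so valid AWS credentials and a region must already be configured (for example via aws configure or an EC2 instance profile). The snippet below is an optional sanity check, a minimal sketch assuming the us-west-2 region used later in the code; the file name check_setup.py is only illustrative.
# check_setup.py - optional sanity check (file name is illustrative)
import boto3

# Confirms which AWS identity boto3 will use; raises an error if credentials are missing
print(boto3.client("sts").get_caller_identity()["Arn"])

# Creating the client only validates the region name; the Knowledge Base itself
# is exercised later when retrieve_and_generate is called
client = boto3.client("bedrock-agent-runtime", region_name="us-west-2")
print("bedrock-agent-runtime client ready in", client.meta.region_name)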
Create a bedrock.py file based on the Python code below. In the code, replace <BedrockKnowledgeBase ID> and <Your Region> with the values for your environment.
import time

import boto3
import streamlit as st

st.set_page_config(
    page_title="Kakaopay_Insurance",
    layout="centered"
)

# Set up the Bedrock Agent Runtime client (use the region where your Knowledge Base is deployed)
bedrock_agent_runtime = boto3.client(
    service_name='bedrock-agent-runtime',
    region_name="us-west-2"
)

# Center the logo image in the middle column
col11, col12, col13 = st.columns([1.5, 1, 3])
with col12:
    st.image("./kakaopay_insurance.png", width=300)

st.markdown("""
<div style="text-align:center;">
    <h2>보험 상담원을 위한 보험 약관 챗봇</h2>
</div>
""", unsafe_allow_html=True)

# Initialize the chat history and replay previous messages
def session_state():
    if "messages" not in st.session_state:
        st.session_state.messages = []
    for message in st.session_state.messages:
        with st.chat_message(message["role"]):
            st.markdown(message["content"])
    return st.session_state
submitted = False

# FAQ picker: selecting a question and pressing Send submits it as the prompt
with st.expander("**자주 하는 질문 리스트(FAQ)** 📌"):
    option = st.selectbox("FAQ 리스트 중 질문을 선택해주세요 ✅",
                          ("",
                           "청약 철회는 어떻게 하나요?",
                           "보험료 환급은 어떻게 이뤄지나요?",
                           "물건을 잃어버리는 사항도 보장이 되나요?",
                           "보험금을 지급하지 않는 경우는 어떠한 사항이 있나요?"
                           )
                          )
    col1, col2, col3 = st.columns([5.5, 1, 1])
    with col3:
        send = st.button('Send')
        if send:
            submitted = True

session_state()
if submitted:
    if prompt := option:
        st.session_state.messages.append({"role": "user", "content": prompt})
        with st.chat_message("user"):
            st.markdown(prompt)
        with st.chat_message("assistant"):
            message_placeholder = st.empty()
            full_response = ""
            # Query the Knowledge Base and generate an answer with the selected model
            response = bedrock_agent_runtime.retrieve_and_generate(
                input={
                    'text': prompt,
                },
                retrieveAndGenerateConfiguration={
                    'type': 'KNOWLEDGE_BASE',
                    'knowledgeBaseConfiguration': {
                        'knowledgeBaseId': '<BedrockKnowledgeBase ID>',
                        'modelArn': 'arn:aws:bedrock:<Your Region>::foundation-model/anthropic.claude-3-sonnet-20240229-v1:0',
                        'retrievalConfiguration': {
                            'vectorSearchConfiguration': {
                                'numberOfResults': 100,
                                'overrideSearchType': 'HYBRID'}
                        }
                    }
                }
            )
            answer = response['output']['text']
            context = response['citations'][0]['retrievedReferences'][0]['content']['text']
            reference = response['citations'][0]['retrievedReferences'][0]['location']['s3Location']['uri']
            response = f"\n\n {answer}\n\n\n **답변 생성을 위해 아래 컨텍스트를 활용했습니다.**\n\n {context}\n\n\n **문서 출처** : {reference}"
            # Simulate a streaming response with a small delay between chunks
            for chunk in response.split(' '):  # fix for https://github.com/streamlit/streamlit/issues/868
                full_response += chunk + ' '
                if chunk.endswith('\n'):
                    full_response += ' '
                time.sleep(0.05)
                # Add a blinking cursor to simulate typing
                message_placeholder.markdown(full_response + "▌")
            message_placeholder.markdown(full_response)
        st.session_state.messages.append({"role": "assistant", "content": full_response})
    submitted = False
if not submitted:
    # Free-form questions typed into the chat input box
    if prompt := st.chat_input("Enter your question"):
        st.session_state.messages.append({"role": "user", "content": prompt})
        with st.chat_message("user"):
            st.markdown(prompt)
        with st.chat_message("assistant"):
            message_placeholder = st.empty()
            full_response = ""
            response = bedrock_agent_runtime.retrieve_and_generate(
                input={
                    'text': prompt,
                },
                retrieveAndGenerateConfiguration={
                    'type': 'KNOWLEDGE_BASE',
                    'knowledgeBaseConfiguration': {
                        'knowledgeBaseId': '<BedrockKnowledgeBase ID>',
                        'modelArn': 'arn:aws:bedrock:<Your Region>::foundation-model/anthropic.claude-3-sonnet-20240229-v1:0',
                        'retrievalConfiguration': {
                            'vectorSearchConfiguration': {
                                'numberOfResults': 100,
                                'overrideSearchType': 'HYBRID'}
                        }
                    }
                }
            )
            answer = response['output']['text']
            context = response['citations'][0]['retrievedReferences'][0]['content']['text']
            reference = response['citations'][0]['retrievedReferences'][0]['location']['s3Location']['uri']
            response = f"{answer}\n\n\n **답변 생성을 위해 아래 컨텍스트를 활용했습니다.**\n\n {context}\n\n\n **문서 출처** : {reference}"
            # Simulate a streaming response with a small delay between chunks
            for chunk in response.split(' '):
                full_response += chunk + ' '
                if chunk.endswith('\n'):
                    full_response += ' '
                time.sleep(0.05)
                # Add a blinking cursor to simulate typing
                message_placeholder.markdown(full_response + "▌")
            message_placeholder.markdown(full_response)
        st.session_state.messages.append({"role": "assistant", "content": full_response})
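Before starting the Streamlit app, it can help to confirm the Knowledge Base ID, region, and model access with a single standalone call. The sketch below reuses the same retrieve_and_generate parameters as bedrock.py; the file name test_kb.py and the sample question are only illustrative, and the placeholders must be replaced just as in the app code.
# test_kb.py - optional standalone check of the Knowledge Base (illustrative)
import boto3

bedrock_agent_runtime = boto3.client(
    service_name='bedrock-agent-runtime',
    region_name='us-west-2'  # replace with your region
)

response = bedrock_agent_runtime.retrieve_and_generate(
    input={'text': '청약 철회는 어떻게 하나요?'},  # sample FAQ question
    retrieveAndGenerateConfiguration={
        'type': 'KNOWLEDGE_BASE',
        'knowledgeBaseConfiguration': {
            'knowledgeBaseId': '<BedrockKnowledgeBase ID>',  # replace with your Knowledge Base ID
            'modelArn': 'arn:aws:bedrock:<Your Region>::foundation-model/anthropic.claude-3-sonnet-20240229-v1:0'
        }
    }
)
print(response['output']['text'])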
Run the command below, then open the web server at the address printed in the output.
streamlit run bedrock.py
You can now view your Streamlit app in your browser.
Local URL: http://localhost:8501
Network URL: http://172.31.14.117:8501
External URL: http://54.190.118.1:8501
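If the app runs on a remote instance such as EC2 and must be reachable from outside, Streamlit's standard server options can be passed on the command line; the address and port below are just examples and can be adjusted to your environment.
streamlit run bedrock.py --server.address 0.0.0.0 --server.port 8501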