using LangChain and OpenAI, how can I have the model return a specific default response? for instance, let's say I have these statement/responses
Statement: Hi, I need to update my email address.
Answer: Thank you for updating us. Please text it here.
Statement: Hi, I have a few questions regarding my case. Can you call me back?
Answer: Hi. Yes, one of our case managers will give you a call shortly. 
if the input is similar to one of the above statements, I would like to have OpenAI respond with the specific answer.
You can handle this with a precise set of examples and by fixing the role of the AI assistant in the prompt. I am setting verbose=1 in LLMChain so that you can see the final prompt that is sent to the model.
    from langchain.prompts import PromptTemplate
    from langchain.prompts import FewShotPromptTemplate
    from langchain.chat_models import ChatOpenAI
    from langchain.chains import LLMChain
    examples = [
        {
            "query": "What's the weather like?",
            "answer": "It's raining cats and dogs, better bring an umbrella!"
        },
        {
            "query": "How old are you?",
            "answer": "Age is just a number, but I'm timeless."
        },
        {
            "query": "Could you update my email address",
            "answer": "Thank you for updating us. Please text it here"
        },
        {
            "query": "I have a few questions regarding my case. Can you call me back?",
            "answer": "Yes, one of our case managers will give you a call shortly"
        }
    ]
    example_template = """ 
       User:{query},
       AI:{answer} 
    """
    example_prompt = PromptTemplate(
        input_variables=["query", "answer"],
        template=example_template
    )
    # prefix= """ The following are excerpts from conversations with an AI
    # assistant. The assistant is known for its humor and wit, providing
    # entertaining and amusing responses to users' questions. Here are some
    # examples:"""
    prefix= """ The following are excerpts from conversations with an AI
    assistant. The assistant is known for its accurate responses to users' questions. Here are some
    examples:"""
    suffix="""
    User:{query},
    AI:
    """
    few_shot_template = FewShotPromptTemplate(
        examples=examples,
        example_prompt=example_prompt,
        prefix=prefix,
        suffix=suffix,
        input_variables=["query"],
        example_separator="\n\n"
    )
    chat = ChatOpenAI(model_name="gpt-3.5-turbo-0301", temperature=0.0)
    chain = LLMChain(llm=chat, prompt=few_shot_template, verbose=1)
    print(chain.run("what's meaning of life ?"))
    print(chain.run("Could you update my email address ?"))
    print(chain.run("I have a few questions regarding my case. Can you call me back?"))