ChatBots in a Jupyter Notebook

Let's see how the ChatBot class lets you chat with Mistral inside a Jupyter notebook.

from llamabot import ChatBot

code_tester = ChatBot(
    """
You are a Python quality assurance developer who delivers high quality unit tests for code.
You write tests using PyTest and not the built-in unittest library.
Write the tests using test functions and not using classes and class methods
Here is the code to write tests against:
""",
    session_name="code-tested",
    model_name="mistral/mistral-medium",
    stream_target="stdout",
)
code_tester(
    '''
class ChatBot:
    """Chat Bot that is primed with a system prompt, accepts a human message.

    Automatic chat memory management happens.

    h/t Andrew Giessel/GPT4 for the idea.
    """

    def __init__(self, system_prompt, temperature=0.0, model_name="gpt-4"):
        """Initialize the ChatBot.

        :param system_prompt: The system prompt to use.
        :param temperature: The model temperature to use.
            See https://platform.openai.com/docs/api-reference/completions/create#completions/create-temperature
            for more information.
        :param model_name: The name of the OpenAI model to use.
        """
        self.model = ChatOpenAI(
            model_name=model_name,
            temperature=temperature,
            streaming=True,
            verbose=True,
            callback_manager=CallbackManager([StreamingStdOutCallbackHandler()]),
        )
        self.chat_history = [
            SystemMessage(content="Always return Markdown-compatible text."),
            SystemMessage(content=system_prompt),
        ]

    def __call__(self, human_message) -> Response:
        """Call the ChatBot.

        :param human_message: The human message to use.
        :return: The response to the human message, primed by the system prompt.
        """
        self.chat_history.append(HumanMessage(content=human_message))
        response = self.model(self.chat_history)
        self.chat_history.append(response)
        return response
'''
)
Here are the tests for the ChatBot class using PyTest and test functions:
import pytest
from chatbot import ChatBot, SystemMessage, HumanMessage
from openai_callback import ChatOpenAI

def test_chatbot_init():
    system_prompt = "You are a helpful assistant."
    chatbot = ChatBot(system_prompt)
    assert len(chatbot.chat_history) == 2
    assert isinstance(chatbot.chat_history[0], SystemMessage)
    assert isinstance(chatbot.chat_history[1], SystemMessage)
    assert chatbot.chat_history[0].content == "Always return Markdown-compatible text."
    assert chatbot.chat_history[1].content == system_prompt

def test_chatbot_call():
    system_prompt = "You are a helpful assistant."
    chatbot = ChatBot(system_prompt)
    human_message = "What is the weather like today?"
    response = chatbot(human_message)
    assert len(chatbot.chat_history) == 4
    assert isinstance(chatbot.chat_history[2], HumanMessage)
    assert isinstance(chatbot.chat_history[3], ChatOpenAI.Response)
    assert chatbot.chat_history[2].content == human_message
    assert response == chatbot.chat_history[3]

def test_chatbot_call_multiple_times():
    system_prompt = "You are a helpful assistant."
    chatbot = ChatBot(system_prompt)
    human_message1 = "What is the weather like today?"
    human_message2 = "What is the temperature outside?"
    chatbot(human_message1)
    chatbot(human_message2)
    assert len(chatbot.chat_history) == 6
    assert isinstance(chatbot.chat_history[2], HumanMessage)
    assert isinstance(chatbot.chat_history[3], ChatOpenAI.Response)
    assert isinstance(chatbot.chat_history[4], HumanMessage)
    assert isinstance(chatbot.chat_history[5], ChatOpenAI.Response)
    assert chatbot.chat_history[2].content == human_message1
    assert chatbot.chat_history[4].content == human_message2

def test_chatbot_temperature():
    system_prompt = "You are a helpful assistant."
    chatbot = ChatBot(system_prompt, temperature=0.5)
    assert chatbot.model.temperature == 0.5

def test_chatbot_model_name():
    system_prompt = "You are a helpful assistant."
    chatbot = ChatBot(system_prompt, model_name="gpt-3.5-turbo")
    assert chatbot.model.model_name == "gpt-3.5-turbo"
Note that these tests assume that the `ChatOpenAI` class and its `Response` class are defined elsewhere in the codebase.

Also, the tests do not actually call the OpenAI API, but rather assume that the `ChatOpenAI` class is a mock or stub that returns a canned response. If you want to test the actual API calls, you will need to set up a test environment with a valid API key and handle any rate limiting or other issues that may arise.
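The note above suggests stubbing out the model so the tests run without touching the API. Here is a self-contained sketch of that idea; `MiniChatBot` and `Message` are illustrative stand-ins for llamabot's classes, not its real API:

```python
# Self-contained sketch of the "mock or stub" approach: the bot's model is
# just a callable attribute, so a test can replace it with a stub that
# returns a canned response. MiniChatBot and Message are illustrative only.

class Message:
    def __init__(self, role, content):
        self.role = role
        self.content = content

class MiniChatBot:
    def __init__(self, system_prompt):
        self.model = None  # assign a real model or a stub callable
        self.chat_history = [Message("system", system_prompt)]

    def __call__(self, human_message):
        self.chat_history.append(Message("user", human_message))
        response = self.model(self.chat_history)
        self.chat_history.append(response)
        return response

def test_call_with_stubbed_model():
    bot = MiniChatBot("You are a helpful assistant.")
    # The stub returns a canned response without any network call.
    bot.model = lambda history: Message("assistant", "canned reply")
    response = bot("Hello")
    assert response.content == "canned reply"
    assert bot.chat_history[-1] is response
    assert len(bot.chat_history) == 3  # system + human + AI

test_call_with_stubbed_model()
```

The same trick works with pytest's `monkeypatch.setattr` if you prefer not to assign the attribute directly.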

As you can see, ChatBot keeps track of the conversation history automatically. We can access any item in the conversation by inspecting that history.
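The mechanism can be sketched in a few lines: each call appends the human message and the model's reply to a running history, so the model always sees the full conversation. `FakeMessage` and `fake_model` below are illustrative stand-ins for the real message classes and LLM call, not llamabot's API:

```python
# Minimal sketch of automatic chat memory: the history grows by two
# messages (human + AI) per call, and the whole history is passed to the
# model each time. FakeMessage and fake_model are illustrative stand-ins.

class FakeMessage:
    def __init__(self, role, content):
        self.role = role
        self.content = content

def fake_model(history):
    # Stand-in for the LLM call: reports how many messages it received.
    return FakeMessage("assistant", f"I can see {len(history)} messages.")

chat_history = [FakeMessage("system", "You are a helpful assistant.")]

def chat(human_message):
    chat_history.append(FakeMessage("user", human_message))
    response = fake_model(chat_history)
    chat_history.append(response)
    return response

first = chat("Hello!")
second = chat("Still with me?")
print(first.content)   # I can see 2 messages.
print(second.content)  # I can see 4 messages.
```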

The `__repr__` of a ChatBot simply prints out the entire history:

code_tester
[Human]

class ChatBot:
    """Chat Bot that is primed with a system prompt, accepts a human message.

    Automatic chat memory management happens.

    h/t Andrew Giessel/GPT4 for the idea.
    """

    def __init__(self, system_prompt, temperature=0.0, model_name="gpt-4"):
        """Initialize the ChatBot.

        :param system_prompt: The system prompt to use.
        :param temperature: The model temperature to use.
            See https://platform.openai.com/docs/api-reference/completions/create#completions/create-temperature
            for more information.
        :param model_name: The name of the OpenAI model to use.
        """
        self.model = ChatOpenAI(
            model_name=model_name,
            temperature=temperature,
            streaming=True,
            verbose=True,
            callback_manager=CallbackManager([StreamingStdOutCallbackHandler()]),
        )
        self.chat_history = [
            SystemMessage(content="Always return Markdown-compatible text."),
            SystemMessage(content=system_prompt),
        ]

    def __call__(self, human_message) -> Response:
        """Call the ChatBot.

        :param human_message: The human message to use.
        :return: The response to the human message, primed by the system prompt.
        """
        self.chat_history.append(HumanMessage(content=human_message))
        response = self.model(self.chat_history)
        self.chat_history.append(response)
        return response

    def __repr__(self):
        """Return a string representation of the ChatBot.

        :return: A string representation of the ChatBot.
        """
        representation = ""

        for message in self.chat_history:
            if isinstance(message, SystemMessage):
                prefix = "[System]\n"
            elif isinstance(message, HumanMessage):
                prefix = "[Human]\n"
            elif isinstance(message, AIMessage):
                prefix = "[AI]\n"

            representation += f"{prefix}{message.content}" + "\n\n"
        return representation

    def panel(self, show: bool = True):
        """Create a Panel app that wraps a LlamaBot.

        :param show: Whether to show the app.
            If False, we return the Panel app directly.
            If True, we call `.show()` on the app.
        :return: The Panel app, either showed or directly.
        """

        text_input = pn.widgets.TextAreaInput(placeholder="Start chatting...")
        chat_history = pn.Column(*[])
        send_button = pn.widgets.Button(name="Send", button_type="primary")

        def b(event):
            """Button click handler.

            :param event: The button click event.
            """
            chat_messages = []
            for message in self.chat_history:
                if isinstance(message, SystemMessage):
                    pass
                elif isinstance(message, HumanMessage):
                    chat_markdown = pn.pane.Markdown(f"Human: {message.content}")
                    chat_messages.append(chat_markdown)
                elif isinstance(message, AIMessage):
                    chat_markdown = pn.pane.Markdown(f"Bot: {message.content}")
                    chat_messages.append(chat_markdown)

            chat_messages.append(pn.pane.Markdown(f"Human: {text_input.value}"))
            bot_reply = pn.pane.Markdown("Bot: ")
            chat_messages.append(bot_reply)
            chat_history.objects = chat_messages
            markdown_handler = PanelMarkdownCallbackHandler(bot_reply)
            self.model.callback_manager.set_handler(markdown_handler)
            self(text_input.value)
            text_input.value = ""

        send_button.on_click(b)
        input_pane = pn.Row(text_input, send_button)
        output_pane = pn.Column(chat_history, scroll=True, height=500)

        main = pn.Row(input_pane, output_pane)
        app = pn.template.FastListTemplate(
            site="ChatBot",
            title="ChatBot",
            main=main,
            main_max_width="768px",
        )
        if show:
            return app.show()
        return app



[AI]
Here are some tests for the ChatBot class using PyTest:
import pytest
from your_module import ChatBot, SystemMessage, HumanMessage

def test_init():
    system_prompt = "You are a helpful assistant."
    chatbot = ChatBot(system_prompt)
    assert len(chatbot.chat_history) == 2
    assert isinstance(chatbot.chat_history[0], SystemMessage)
    assert isinstance(chatbot.chat_history[1], SystemMessage)
    assert chatbot.chat_history[0].content == "Always return Markdown-compatible text."
    assert chatbot.chat_history[1].content == system_prompt

def test_call():
    system_prompt = "You are a helpful assistant."
    chatbot = ChatBot(system_prompt)
    human_message = "What's the weather like today?"
    response = chatbot(human_message)
    assert len(chatbot.chat_history) == 4
    assert isinstance(chatbot.chat_history[2], HumanMessage)
    assert isinstance(chatbot.chat_history[3], response.__class__)
    assert chatbot.chat_history[2].content == human_message

def test_repr():
    system_prompt = "You are a helpful assistant."
    chatbot = ChatBot(system_prompt)
    human_message = "What's the weather like today?"
    chatbot(human_message)
    expected_repr = (
        "[System]\n"
        "Always return Markdown-compatible text.\n\n"
        "[System]\n"
        "You are a helpful assistant.\n\n"
        "[Human]\n"
        "What's the weather like today?\n\n"
        "[AI]\n"
    )
    assert repr(chatbot) == expected_repr

def test_panel():
    system_prompt = "You are a helpful assistant."
    chatbot = ChatBot(system_prompt)
    app = chatbot.panel()
    assert isinstance(app, type(pn.template.FastListTemplate()))
Note that the `test_panel` function assumes that the `pn` module is available in the test environment. If it is not, you may need to install it or mock it out for testing purposes.

Also note that the `test_call` function assumes that the `response` object has a `__class__` attribute that can be used to check its type. If this is not the case, you may need to modify the test to use a different method of checking the type of the response object.

Finally, note that these tests are not exhaustive and may not cover all possible edge cases or error conditions. You may want to add additional tests to ensure that the `ChatBot` class is working correctly in all scenarios.

Accessing the .messages attribute of the ChatBot, on the other hand, gives you all of the messages in the conversation.

code_tester.messages
[HumanMessage(content='\nclass ChatBot:\n    """Chat Bot that is primed with a system prompt, accepts a human message.\n\n    Automatic chat memory management happens.\n\n    h/t Andrew Giessel/GPT4 for the idea.\n    """\n\n    def __init__(self, system_prompt, temperature=0.0, model_name="gpt-4"):\n        """Initialize the ChatBot.\n\n        :param system_prompt: The system prompt to use.\n        :param temperature: The model temperature to use.\n            See https://platform.openai.com/docs/api-reference/completions/create#completions/create-temperature\n            for more information.\n        :param model_name: The name of the OpenAI model to use.\n        """\n        self.model = ChatOpenAI(\n            model_name=model_name,\n            temperature=temperature,\n            streaming=True,\n            verbose=True,\n            callback_manager=CallbackManager([StreamingStdOutCallbackHandler()]),\n        )\n        self.chat_history = [\n            SystemMessage(content="Always return Markdown-compatible text."),\n            SystemMessage(content=system_prompt),\n        ]\n\n    def __call__(self, human_message) -> Response:\n        """Call the ChatBot.\n\n        :param human_message: The human message to use.\n        :return: The response to the human message, primed by the system prompt.\n        """\n        self.chat_history.append(HumanMessage(content=human_message))\n        response = self.model(self.chat_history)\n        self.chat_history.append(response)\n        return response\n\n    def __repr__(self):\n        """Return a string representation of the ChatBot.\n\n        :return: A string representation of the ChatBot.\n        """\n        representation = ""\n\n        for message in self.chat_history:\n            if isinstance(message, SystemMessage):\n                prefix = "[System]\n"\n            elif isinstance(message, HumanMessage):\n                prefix = "[Human]\n"\n            elif isinstance(message, 
AIMessage):\n                prefix = "[AI]\n"\n\n            representation += f"{prefix}{message.content}" + "\n\n"\n        return representation\n\n    def panel(self, show: bool = True):\n        """Create a Panel app that wraps a LlamaBot.\n\n        :param show: Whether to show the app.\n            If False, we return the Panel app directly.\n            If True, we call `.show()` on the app.\n        :return: The Panel app, either showed or directly.\n        """\n\n        text_input = pn.widgets.TextAreaInput(placeholder="Start chatting...")\n        chat_history = pn.Column(*[])\n        send_button = pn.widgets.Button(name="Send", button_type="primary")\n\n        def b(event):\n            """Button click handler.\n\n            :param event: The button click event.\n            """\n            chat_messages = []\n            for message in self.chat_history:\n                if isinstance(message, SystemMessage):\n                    pass\n                elif isinstance(message, HumanMessage):\n                    chat_markdown = pn.pane.Markdown(f"Human: {message.content}")\n                    chat_messages.append(chat_markdown)\n                elif isinstance(message, AIMessage):\n                    chat_markdown = pn.pane.Markdown(f"Bot: {message.content}")\n                    chat_messages.append(chat_markdown)\n\n            chat_messages.append(pn.pane.Markdown(f"Human: {text_input.value}"))\n            bot_reply = pn.pane.Markdown("Bot: ")\n            chat_messages.append(bot_reply)\n            chat_history.objects = chat_messages\n            markdown_handler = PanelMarkdownCallbackHandler(bot_reply)\n            self.model.callback_manager.set_handler(markdown_handler)\n            self(text_input.value)\n            text_input.value = ""\n\n        send_button.on_click(b)\n        input_pane = pn.Row(text_input, send_button)\n        output_pane = pn.Column(chat_history, scroll=True, height=500)\n\n        main = pn.Row(input_pane, 
output_pane)\n        app = pn.template.FastListTemplate(\n            site="ChatBot",\n            title="ChatBot",\n            main=main,\n            main_max_width="768px",\n        )\n        if show:\n            return app.show()\n        return app\n\n', role='user'),
 AIMessage(content='Here are some tests for the ChatBot class using PyTest:\n```python\nimport pytest\nfrom your_module import ChatBot, SystemMessage, HumanMessage\n\ndef test_init():\n    system_prompt = "You are a helpful assistant."\n    chatbot = ChatBot(system_prompt)\n    assert len(chatbot.chat_history) == 2\n    assert isinstance(chatbot.chat_history[0], SystemMessage)\n    assert isinstance(chatbot.chat_history[1], SystemMessage)\n    assert chatbot.chat_history[0].content == "Always return Markdown-compatible text."\n    assert chatbot.chat_history[1].content == system_prompt\n\ndef test_call():\n    system_prompt = "You are a helpful assistant."\n    chatbot = ChatBot(system_prompt)\n    human_message = "What\'s the weather like today?"\n    response = chatbot(human_message)\n    assert len(chatbot.chat_history) == 4\n    assert isinstance(chatbot.chat_history[2], HumanMessage)\n    assert isinstance(chatbot.chat_history[3], response.__class__)\n    assert chatbot.chat_history[2].content == human_message\n\ndef test_repr():\n    system_prompt = "You are a helpful assistant."\n    chatbot = ChatBot(system_prompt)\n    human_message = "What\'s the weather like today?"\n    chatbot(human_message)\n    expected_repr = (\n        "[System]\\n"\n        "Always return Markdown-compatible text.\\n\\n"\n        "[System]\\n"\n        "You are a helpful assistant.\\n\\n"\n        "[Human]\\n"\n        "What\'s the weather like today?\\n\\n"\n        "[AI]\\n"\n    )\n    assert repr(chatbot) == expected_repr\n\ndef test_panel():\n    system_prompt = "You are a helpful assistant."\n    chatbot = ChatBot(system_prompt)\n    app = chatbot.panel()\n    assert isinstance(app, type(pn.template.FastListTemplate()))\n```\nNote that the `test_panel` function assumes that the `pn` module is available in the test environment. 
If it is not, you may need to install it or mock it out for testing purposes.\n\nAlso note that the `test_call` function assumes that the `response` object has a `__class__` attribute that can be used to check its type. If this is not the case, you may need to modify the test to use a different method of checking the type of the response object.\n\nFinally, note that these tests are not exhaustive and may not cover all possible edge cases or error conditions. You may want to add additional tests to ensure that the `ChatBot` class is working correctly in all scenarios.', role='assistant')]

You can also access any individual message.

print(code_tester.messages[-1].content)
Here are some tests for the ChatBot class using PyTest:
import pytest
from your_module import ChatBot, SystemMessage, HumanMessage

def test_init():
    system_prompt = "You are a helpful assistant."
    chatbot = ChatBot(system_prompt)
    assert len(chatbot.chat_history) == 2
    assert isinstance(chatbot.chat_history[0], SystemMessage)
    assert isinstance(chatbot.chat_history[1], SystemMessage)
    assert chatbot.chat_history[0].content == "Always return Markdown-compatible text."
    assert chatbot.chat_history[1].content == system_prompt

def test_call():
    system_prompt = "You are a helpful assistant."
    chatbot = ChatBot(system_prompt)
    human_message = "What's the weather like today?"
    response = chatbot(human_message)
    assert len(chatbot.chat_history) == 4
    assert isinstance(chatbot.chat_history[2], HumanMessage)
    assert isinstance(chatbot.chat_history[3], response.__class__)
    assert chatbot.chat_history[2].content == human_message

def test_repr():
    system_prompt = "You are a helpful assistant."
    chatbot = ChatBot(system_prompt)
    human_message = "What's the weather like today?"
    chatbot(human_message)
    expected_repr = (
        "[System]\n"
        "Always return Markdown-compatible text.\n\n"
        "[System]\n"
        "You are a helpful assistant.\n\n"
        "[Human]\n"
        "What's the weather like today?\n\n"
        "[AI]\n"
    )
    assert repr(chatbot) == expected_repr

def test_panel():
    system_prompt = "You are a helpful assistant."
    chatbot = ChatBot(system_prompt)
    app = chatbot.panel()
    assert isinstance(app, type(pn.template.FastListTemplate()))
Note that the `test_panel` function assumes that the `pn` module is available in the test environment. If it is not, you may need to install it or mock it out for testing purposes.

Also note that the `test_call` function assumes that the `response` object has a `__class__` attribute that can be used to check its type. If this is not the case, you may need to modify the test to use a different method of checking the type of the response object.

Finally, note that these tests are not exhaustive and may not cover all possible edge cases or error conditions. You may want to add additional tests to ensure that the `ChatBot` class is working correctly in all scenarios.