Forays Into AI

The only way to discover the limits of the possible is to go beyond them into the impossible. - Arthur C. Clarke

Creating a Real-time Chat Application with Streamlit and Neo4j

Introduction

Building a real-time chat application can seem daunting, but with the right tools and a bit of Python knowledge, it's entirely achievable. In this tutorial, we'll walk through creating a simple real-time chat application using Streamlit for the frontend and Neo4j as the backend to store chat histories. This setup uses Docker to run the Neo4j database and Streamlit to launch our Python application. We'll also integrate the Mistral 7B Instruct model, served locally through Ollama, to generate the chat responses.

Prerequisites

Before we begin, make sure you have the following installed:

  • Docker
  • Python 3
  • Streamlit

You can install all the necessary Python dependencies by running:

pip install streamlit langchain_community python-dotenv neo4j

Alternatively, use the provided requirements.txt file:

pip install -r requirements.txt
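
If you are putting together requirements.txt yourself rather than using the one provided with the project, it just needs to list the same packages as the pip command above:

streamlit
langchain_community
python-dotenv
neo4j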

Additionally, ensure Docker is running on your machine as we'll use it to host our Neo4j database.

Setting Up Neo4j with Docker

To start our Neo4j database, navigate to the directory containing docker-compose.yml and execute:

docker-compose up -d

This command runs our Neo4j instance in the background. You can access the Neo4j browser interface at http://localhost:7474/browser/ to visualize and interact with the database. Try running the following Cypher query to see the nodes:

MATCH (n) RETURN n

Remember to shut down the Docker container when you're done:

docker-compose down
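
For reference, if you are not using the docker-compose.yml that ships with the project, a minimal sketch along these lines is enough to bring up a single Neo4j instance. The image tag, ports, and password below are assumptions, so keep them in sync with your .env file:

# docker-compose.yml (minimal sketch; adjust the image tag and credentials to your setup)
services:
  neo4j:
    image: neo4j:5
    ports:
      - "7474:7474"   # HTTP port for the Neo4j browser
      - "7687:7687"   # Bolt port used by the Python driver
    environment:
      - NEO4J_AUTH=neo4j/your_password_here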

Ollama

This demo application uses the open-source Mistral model, which you can download with Ollama. Follow the installation instructions on the Ollama website, then pull the model:

ollama pull mistral:7b-instruct
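
Once the pull finishes, you can check that the model is available locally and try it interactively from the terminal before wiring it into the app:

ollama list
ollama run mistral:7b-instruct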

Building the Streamlit Chat App

Let's dive into the Python code. Here’s a simplified version of our main.py, which powers our chat application.

Code Explanation with Comments

# Import required modules and components from various packages
import streamlit as st
from langchain_community.chat_message_histories import Neo4jChatMessageHistory
from langchain_community.chat_models import ChatOllama
from langchain_core.messages import HumanMessage, AIMessage
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts.chat import (
    ChatPromptTemplate,
    MessagesPlaceholder,
)
from dotenv import load_dotenv
import os

# Load environment variables from a .env file into the system environment
load_dotenv()

# Retrieve Neo4j database connection details from environment variables
NEO4J_URI = os.getenv("NEO4J_URI")
NEO4J_USERNAME = os.getenv("NEO4J_USERNAME")
NEO4J_PASSWORD = os.getenv("NEO4J_PASSWORD")
NEO4J_DATABASE = os.getenv("NEO4J_DATABASE")

# Define the local large language model to be used
local_llm = "mistral:7b-instruct"

# Set the programming language for the chat session
programming_language = "java"
chat_history_session_id = f"session_{programming_language}"

# Define a system prompt that explains the AI's role
SYSTEM_PROMPT = """You are an expert programmer in the programming language specified by the user.
Your task is to carefully read the user's question, and provide a clear answer with step-by-step
explanation. Try to give alternative implementations if possible."""

# Setup the chat prompt template with system prompt and placeholders for chat history
chat_prompt = ChatPromptTemplate.from_messages([
    ("system", SYSTEM_PROMPT),
    MessagesPlaceholder(variable_name="chat_history"),
    ("human","Programming Language: {language}\nQuestion: {input}")
])

# Initialize the Neo4j chat message history handler
message_history = Neo4jChatMessageHistory(
    url=NEO4J_URI,
    username=NEO4J_USERNAME,
    password=NEO4J_PASSWORD,
    database=NEO4J_DATABASE,
    session_id=chat_history_session_id
)

# Function to generate a response using the chain of chat operations
def generate_response(chain, input_text, chosen_language=programming_language):
    # Pass the stored messages directly as the chat history; note that
    # list.reverse() reverses in place and returns None, so it cannot be used inline here.
    response = chain.invoke({"input": input_text, "language": chosen_language, "chat_history": message_history.messages})
    message_history.add_messages([HumanMessage(content=input_text), AIMessage(content=response)])
    show_message(response, "assistant")

# Helper function to display messages in the Streamlit interface
def show_message(message, role):
    with container.chat_message(role):
        container.markdown(message)

# Function to display all messages from the chat history
def show_messages():
    for message in message_history.messages:
        if isinstance(message, HumanMessage):
            show_message(message.content, "user")
        else:
            show_message(message.content, "assistant")

# Initialize the language model and chain of operations for parsing and handling responses
llm = ChatOllama(temperature=0, num_predict=1024, model=local_llm)  # num_predict caps the number of generated tokens
chain = chat_prompt | llm | StrOutputParser()

# Set Streamlit page configuration
st.set_page_config(page_title='My Language Tutor')
intro = f"""
Hi, I am your programming assistant.
I can help you with any {programming_language} programming questions you have.
"""
st.info(intro)

# Create a container in Streamlit for managing chat interactions
container = st.container(border=True)

# Display all previous messages
show_messages()

# Input loop for new chat messages
if prompt := st.chat_input("How can I help?"):
    show_message(prompt, "user")
    with st.spinner("Generating response ... please wait"):
        generate_response(chain, prompt)
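
The script reads its Neo4j connection details from a .env file placed next to main.py. Here is a minimal sketch, assuming Neo4j is running locally on the default Bolt port with the default neo4j user and database; replace the password with whatever you configured for your container:

NEO4J_URI=bolt://localhost:7687
NEO4J_USERNAME=neo4j
NEO4J_PASSWORD=your_password_here
NEO4J_DATABASE=neo4j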

Running the Streamlit App

To start the chat application, use the following command:

streamlit run main.py

You should now see the chat interface in your browser (Streamlit serves it at http://localhost:8501 by default), where you can type messages and get responses from the AI model. The full code for this tutorial is available at Chat History With Neo4j.

Conclusion

That's it for a simple chat application with Streamlit and Neo4j, enhanced by an AI language model. Whether you're looking to expand this application or apply these concepts elsewhere, the possibilities are endless. Happy coding!

