How to Use Llama 3 with PandasAI and Ollama Locally

Published on May 05, 2024


Step 1: Introduction to Tools

  1. Large language models like Llama 3 show great potential for data analysis.
  2. Pandas is a powerful tool for data manipulation, and PandasAI layers generative AI on top of Pandas so you can explore, clean, and analyze data with natural-language prompts.
  3. Ollama is a tool for running LLMs (Large Language Models) such as Llama 3 and Mistral locally.

Step 2: Setting Up the Environment

  1. Create and activate a conda environment (here named genai), then install the required packages:
    conda create -n genai
    conda activate genai
    pip install -r requirements.txt
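The tutorial does not show the contents of requirements.txt; a minimal version covering the tools used in this guide would look like the following (these are the actual PyPI package names, but you may want to pin versions for reproducibility):

```
pandasai
streamlit
pandas
```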
    

Step 3: Running Ollama

  1. Download Ollama from the website and install it on your computer.
  2. Start Ollama in the terminal by running:
    ollama serve
    
  3. Load a specific model (e.g., Llama 3) using the command:
    ollama pull llama3
    
  4. List the installed models with:
    ollama list
    

Step 4: Connecting to Ollama in the App

  1. Import the necessary module to connect to Ollama:
    from pandasai.llm.local_llm import LocalLLM
    
  2. Create a model instance and specify the API base and model name:
    model = LocalLLM(api_base="http://localhost:11434/v1", model="llama3")
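For context, Ollama exposes an OpenAI-compatible API on port 11434, which is why the `api_base` ends in `/v1`. A rough sketch of the kind of JSON body that gets POSTed to its `/v1/chat/completions` route on each prompt (the payload is only built here, nothing is sent):

```python
import json

# Ollama's OpenAI-compatible endpoint accepts the standard
# /chat/completions payload shape: a model name plus a message list.
payload = {
    "model": "llama3",
    "messages": [
        {"role": "user", "content": "How many rows are in the dataset?"},
    ],
}

# Serialized, this is the request body sent to
# http://localhost:11434/v1/chat/completions.
body = json.dumps(payload)
print(body)
```

This is why swapping in another local model is as simple as changing the `model` string to anything `ollama list` reports.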
    

Step 5: Building the Data Analysis App

  1. Import Streamlit and set the app title:
    import streamlit as st
    st.title("Data analysis with PandasAI")
    
  2. Create a file uploader widget to upload a dataset:
    uploaded_file = st.file_uploader("Upload a dataset", type=["csv"])
    
  3. If a file is uploaded, read the data with Pandas and display the first few rows (importing pandas at the top of the script, alongside Streamlit):
    import pandas as pd

    if uploaded_file is not None:
        data = pd.read_csv(uploaded_file)
        st.write(data.head(3))
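Outside of Streamlit, the same read-and-preview step can be tried on an in-memory CSV. The column names below are hypothetical, mimicking the Titanic-style dataset that the later prompts assume:

```python
import io

import pandas as pd

# A tiny CSV standing in for the uploaded file (Titanic-style columns).
csv_text = """age,sex,pclass,fare,sibsp
22,male,3,7.25,1
38,female,1,71.28,1
26,female,3,7.92,0
35,male,1,53.10,1
"""

# pd.read_csv accepts any file-like object, which is exactly how the
# Streamlit uploader's file buffer is consumed in the app.
data = pd.read_csv(io.StringIO(csv_text))
print(data.head(3))  # same preview that st.write(data.head(3)) renders
```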
    

Step 6: Interactive Data Analysis

  1. Convert the dataset into a SmartDataframe for interactive analysis, passing the data and the Ollama-backed model in the config:
    from pandasai import SmartDataframe
    df = SmartDataframe(data, config={"llm": model})
    
  2. Create a text area for entering prompts and a button to generate and display responses:
    prompt = st.text_area("Enter your prompt:")
    if st.button("Generate"):
        if prompt:
            with st.spinner("Generating response..."):
                st.write(df.chat(prompt))
    

Step 7: Analyzing the Dataset

  1. Run the app and interact with the dataset by entering prompts:
    • Example prompts: "How many rows and columns are in the dataset?", "What is the average age?", "How many people have more than 3 siblings?", etc.
    • Click on the "Generate" button to see the responses.
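Behind the scenes, PandasAI asks the model to write pandas code for each prompt. The answers to the example prompts correspond roughly to one-liners like these (column names and values are hypothetical, assuming a Titanic-style dataset):

```python
import pandas as pd

# Small Titanic-style sample standing in for the uploaded dataset.
data = pd.DataFrame({
    "age": [22, 38, 26, 35],
    "sibsp": [1, 1, 0, 4],
})

rows, cols = data.shape                 # "How many rows and columns are in the dataset?"
avg_age = data["age"].mean()            # "What is the average age?"
many_sibs = (data["sibsp"] > 3).sum()   # "How many people have more than 3 siblings?"

print(rows, cols, avg_age, many_sibs)
```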

Step 8: Visualizing Data

  1. Generate visualizations by entering prompts like:

    • "Draw a bar chart of the sex column"
    • "Plot a pie chart of the pclass column"
    • "Visualize the distribution of the fare column"
    • "Draw the histogram of the age column"
    • "Draw the histogram of the fare column separated by the sex column"
    • "Draw the heatmap of numerical variables"
  2. Explore different prompts to get meaningful insights from the dataset.
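When you ask for a chart, PandasAI generates plotting code on the fly; a hand-written equivalent of "Draw a bar chart of the sex column" might look like the sketch below (matplotlib is assumed to be installed; the Agg backend is used so it runs without a display, whereas the app itself renders the chart inline):

```python
import matplotlib

matplotlib.use("Agg")  # headless backend; the Streamlit app handles display itself
import matplotlib.pyplot as plt
import pandas as pd

# Hypothetical sample of the Titanic-style "sex" column.
data = pd.DataFrame({"sex": ["male", "female", "female", "male", "female"]})

# Count category frequencies and draw them as a bar chart.
counts = data["sex"].value_counts()
counts.plot(kind="bar")
plt.title("Count by sex")
plt.tight_layout()
plt.savefig("sex_bar_chart.png")
```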

Step 9: Conclusion

  1. Experiment with different prompts to get accurate and insightful responses.
  2. The tutorial demonstrates how to use PandasAI with Llama 3 to build an interactive data analysis app using Streamlit.

By following these steps, you can effectively use Llama 3 with PandasAI and Ollama locally to analyze and visualize data interactively.