How to Use Llama 3 with PandasAI and Ollama Locally
Published on May 05, 2024
Step 1: Introduction to Tools
- Large models like Llama 3 have great potential for data analysis.
- Pandas is a powerful tool for data manipulation, and PandasAI extends it with generative AI so you can explore, clean, and analyze data using natural-language prompts.
- Ollama is a tool for running LLMs (Large Language Models) such as Llama 3 and Mistral locally on your own machine.
Step 2: Setting Up the Environment
- Create and activate a virtual environment, then install the necessary packages:
conda create -n genai python
conda activate genai
pip install -r requirements.txt
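The contents of requirements.txt are not shown in the tutorial; a minimal set of packages for this app would be something like:

```text
pandasai
streamlit
pandas
```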
Step 3: Running Ollama
- Download Ollama from the website and install it on your computer.
- Start Ollama in the terminal by running:
ollama serve
- Load a specific model (e.g., Llama 3) using the command:
ollama pull llama3
- List the installed models with:
ollama list
Step 4: Connecting to Ollama in the App
- Import the necessary module to connect to Ollama:
from pandasai.llm.local_llm import LocalLLM
- Create a model instance and specify the API base and model name:
model = LocalLLM(api_base="http://localhost:11434/v1", model="llama3")
Step 5: Building the Data Analysis App
- Import Streamlit and set the app title:
import streamlit as st
st.title("Data analysis with PandasAI")
- Create a file uploader widget to upload a dataset:
uploaded_file = st.file_uploader("Upload a dataset", type=["csv"])
- If a file is uploaded, read the data using Pandas and display the first few rows:
import pandas as pd

if uploaded_file is not None:
    data = pd.read_csv(uploaded_file)
    st.write(data.head(3))
Step 6: Interactive Data Analysis
- Convert the dataset into a SmartDataframe for interactive analysis, passing the model in the config so PandasAI can query it:
from pandasai import SmartDataframe
df = SmartDataframe(data, config={"llm": model})
- Create a text area for entering prompts and a button to generate responses:
prompt = st.text_area("Enter your prompt:")
if st.button("Generate"):
    if prompt:
        with st.spinner("Generating response..."):
            st.write(df.chat(prompt))
Step 7: Analyzing the Dataset
- Run the app and interact with the dataset by entering prompts:
- Example prompts: "How many rows and columns are in the dataset?", "What is the average age?", "How many people have more than 3 siblings?", etc.
- Click on the "Generate" button to see the responses.
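Behind the scenes, PandasAI translates such prompts into ordinary Pandas operations. As a point of reference, the first three example prompts correspond roughly to the following plain-Pandas code (the small DataFrame here is hypothetical stand-in data, not the uploaded dataset):

```python
import pandas as pd

# Hypothetical stand-in for the uploaded dataset.
data = pd.DataFrame({
    "age": [22, 38, 26, 35],
    "sibsp": [1, 1, 0, 4],  # number of siblings/spouses aboard
})

n_rows, n_cols = data.shape              # "How many rows and columns are in the dataset?"
avg_age = data["age"].mean()             # "What is the average age?"
over_three = (data["sibsp"] > 3).sum()   # "How many people have more than 3 siblings?"

print(n_rows, n_cols, avg_age, over_three)  # 4 2 30.25 1
```

With PandasAI, the model generates and runs code like this for you from the prompt alone.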
Step 8: Visualizing Data
- Generate visualizations by entering prompts like:
- "Draw a bar chart of the sex column"
- "Plot a pie chart of the pclass column"
- "Visualize the distribution of the fare column"
- "Draw the histogram of the age column"
- "Draw the histogram of the fare column separated by the sex column"
- "Draw the heatmap of numerical variables"
- Explore different prompts to get meaningful insights from the dataset.
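For chart prompts, PandasAI generates plotting code from the data. For example, the numbers behind "Draw a bar chart of the sex column" are just category counts, which you could compute directly in Pandas (again using hypothetical stand-in data):

```python
import pandas as pd

# Hypothetical stand-in for the uploaded dataset.
data = pd.DataFrame({"sex": ["male", "female", "male", "male"]})

# A bar chart of the "sex" column plots these category counts.
counts = data["sex"].value_counts()
print(counts.to_dict())  # {'male': 3, 'female': 1}
```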
Step 9: Conclusion
- The tutorial demonstrates how to use PandasAI with Llama 3 to build an interactive data analysis app in Streamlit.
- Experiment with different prompts to get accurate and insightful responses.
By following these steps, you can run Llama 3 locally with PandasAI and Ollama to analyze and visualize data interactively.