How to Create Llama 3 RAG Application using PhiData? (PDF Compatible)

Published on Apr 22, 2024


How to Create a RAG Application with Llama 3 using PhiData

Step 1: Setting Up Environment

  1. Clone the repository by running git clone https://github.com/phidatahq/phidata.
  2. Navigate to the Groq RAG cookbook folder: phidata/cookbook/llms/groq/rag.
  3. Create a virtual environment by running conda create -n phidata python=3.11.
  4. Activate the virtual environment by running conda activate phidata.
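If conda isn't available, a standard Python virtual environment works just as well. A minimal sketch (the .venv directory name is arbitrary):

```shell
# Create and activate a virtual environment using the stdlib venv module
python3 -m venv .venv
. .venv/bin/activate
# VIRTUAL_ENV is set by the activate script when the environment is active
echo "$VIRTUAL_ENV"
```

Packages installed in the next step will then stay isolated from your system Python.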

Step 2: Installing Packages

  1. Install the required packages by running pip install -r requirements.txt.
  2. Export the Groq API key obtained from console.groq.com.
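The key is read from the environment, so export it in the shell that will run the app. The variable name GROQ_API_KEY follows Groq's standard convention, and the value below is a placeholder, not a real key:

```shell
# Placeholder value -- replace with your own key from console.groq.com
export GROQ_API_KEY="your-groq-api-key"
```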

Step 3: Creating a Database

  1. Install Docker Desktop based on your operating system (Mac, Windows, Linux).
  2. Verify the Docker installation by running docker -v.
  3. Create the database by pulling and running phidata's PgVector image.
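Pulling the image alone doesn't start the database; a container must be running before the app can store embeddings. The invocation below is a sketch based on phidata's published PgVector setup — the image tag, port mapping, and credentials are assumptions you may need to adjust:

```shell
# Run PgVector in the background; database name, user, password,
# and host port are example values from phidata's docs
docker run -d \
  -e POSTGRES_DB=ai \
  -e POSTGRES_USER=ai \
  -e POSTGRES_PASSWORD=ai \
  -p 5532:5432 \
  --name pgvector \
  phidata/pgvector:16
```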

Step 4: Running the Application

  1. Download Ollama from the ollama.com website.
  2. Pull the embedding model by running ollama pull nomic-embed-text.
  3. Activate the virtual environment and install the requirements again.
  4. Run the application by executing streamlit run app.py.

Using the Application:

  1. Upload a PDF file and ask questions related to it using the user interface.
  2. You can also drag and drop the file for upload.
  3. The content is divided into chunks, converted to embeddings, and stored in the database.
  4. Ask questions based on the uploaded document to get accurate responses.
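The chunk-embed-store-retrieve loop described above can be sketched in plain Python. Everything here is illustrative — the real app uses PgVector and nomic-embed-text, whereas this toy uses a bag-of-words "embedding" and an in-memory list — but the data flow is the same:

```python
import math
from collections import Counter

def chunk(text, size=40):
    """Split a document into fixed-size character chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def embed(text):
    """Toy embedding: a word-count vector (stand-in for nomic-embed-text)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in a if k in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# "Database": a list of (chunk, embedding) pairs, stand-in for PgVector
store = []
document = "Llama 3 is an open LLM. RAG retrieves relevant chunks. Streamlit serves the UI."
for c in chunk(document):
    store.append((c, embed(c)))

# Retrieval: embed the question and return the closest stored chunk;
# the app would then pass that chunk to Llama 3 as context
question = "What does RAG retrieve?"
q = embed(question)
best = max(store, key=lambda pair: cosine(q, pair[1]))
print(best[0])
```

In the real application this retrieved context is injected into the Llama 3 prompt, which is what lets the model answer from your uploaded PDF.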

Creating a Local Application with Ollama:

  1. Navigate to the ollama folder in the project directory.
  2. Install the required packages by running pip install -r requirements.txt.
  3. Pull the Llama 3 model locally by running ollama pull llama3.
  4. Run the application by executing streamlit run app.py.

Using the Local Application:

  1. Upload a file and ask questions to get accurate responses based on the locally stored data.
  2. The content will be processed, converted to embeddings, and stored in the local database.
  3. Ask questions related to the uploaded document to receive relevant information.

By following these steps, you can create a RAG application using Llama 3 powered by PhiData, either with the Groq API or completely locally on your computer using Ollama.