This is a simple web UI for chatting with LLMs running on your local machine, built with FastAPI and exposed to the internet via ngrok.
Before starting the project, download "Ollama" and pull the models you want to run on your local machine. Then install "ngrok" and sign up for a free account.
- Go to Ollama, download and install it: https://ollama.com/
- Go to ngrok, download and install it: https://ngrok.com/
- Open a terminal and pull some models to your local machine. We will use llama3 and llama2:
ollama pull llama3
ollama pull llama2
- Clone the repository to your local machine:
git clone https://github.com/yelloejp/LLM_WebApp.git
- Navigate to the project directory:
cd LLM_WebApp
- Install the required Python packages:
pip install fastapi uvicorn ollama
- Start uvicorn on your local machine by running the following command:
uvicorn src.main:app --reload --port 8080
- Open a second terminal and navigate to the project directory:
cd LLM_WebApp
- Run the following command in your terminal to add your ngrok authtoken and connect the ngrok agent to your account.
You can copy this command, with your token already filled in, from your ngrok dashboard:
ngrok config add-authtoken <TOKEN>
- Start ngrok by running the following command:
ngrok http 8080
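As an alternative to typing `ngrok http 8080` each time, the authtoken and tunnel can live in the ngrok agent's config file (its location varies by OS; `ngrok config check` prints the path). A sketch using the agent's version-2 config format; the tunnel name `llm_webapp` is an arbitrary choice:

```yaml
version: "2"
authtoken: <TOKEN>
tunnels:
  llm_webapp:
    proto: http
    addr: 8080
```

With this in place, `ngrok start llm_webapp` opens the same tunnel.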
- You will then see a "Forwarding" URL (e.g. https://1234-567-89-10.ngrok-free.app). ngrok forwards any request to this URL to your localhost on port 8080.
- The web application is available under the "/form" path (e.g. https://1234-567-89-10.ngrok-free.app/form).