
python-assistant

Python development AI assistant built on CodeLlama

Screenshot

[Screenshot of the python-assistant Gradio UI]

Installation

  • Install and run Ollama on your system.
  • Create the model using the Modelfile in your terminal:
    $ ollama create python-assistant -f Modelfile
  • Run the Gradio UI via Docker:
    $ docker compose up --build
    OR install and run with Python 3.11 or higher:
    $ pip install -r requirements.txt
    $ python main.py
  • Open the UI in your browser at http://localhost:7860/ (a quick sanity check for the model is sketched below).
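
If you want to confirm that the model itself responds, independent of the UI, the snippet below sends a single prompt to Ollama's HTTP API. It is a minimal sketch, assuming Ollama's default port (11434), the `requests` package, and the `python-assistant` model name created above; the prompt is just an example.

```python
import requests

# One-off request to Ollama's /api/generate endpoint (default port 11434).
response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "python-assistant",  # name used in `ollama create` above
        "prompt": "Write a Python one-liner that reverses a string.",  # example prompt
        "stream": False,  # return a single JSON object instead of a token stream
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["response"])  # the model's full completion
```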

Credits

These people inspired the system instructions in the Modelfile.

Special thanks goes to Matthew Berman for this video on wrapping the Ollama API in a Gradio UI.
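
For readers curious what that wrapping pattern looks like, here is a minimal sketch of a Gradio chat UI backed by Ollama's /api/chat endpoint. It is illustrative only, not the repository's actual main.py; the model name, port, and tuple-style chat history are assumptions based on the setup above.

```python
import requests
import gradio as gr

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default chat endpoint
MODEL_NAME = "python-assistant"                 # model created from the Modelfile


def chat(message, history):
    # Rebuild the conversation in the role/content format Ollama expects.
    messages = []
    for user_msg, assistant_msg in history:
        messages.append({"role": "user", "content": user_msg})
        messages.append({"role": "assistant", "content": assistant_msg})
    messages.append({"role": "user", "content": message})

    resp = requests.post(
        OLLAMA_URL,
        json={"model": MODEL_NAME, "messages": messages, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]


# Serve on port 7860, matching the URL in the Installation steps.
gr.ChatInterface(chat).launch(server_name="0.0.0.0", server_port=7860)
```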
