
kohlivarun5/vscode-ollama


VSCode extension to integrate with a locally running CodeLlama LLM

Prerequisites

  • Download and install ollama
  • Pull the three CodeLlama models:
ollama pull codellama:13b-instruct
ollama pull codellama:13b-code
ollama pull codellama:13b-python
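
Equivalently, the three pulls can be scripted in one loop (a sketch, assuming the `ollama` CLI is already installed and on your PATH):

```shell
# Pull all three CodeLlama variants used by the extension.
# Assumes the `ollama` CLI is installed and its daemon is running.
for variant in instruct code python; do
  ollama pull "codellama:13b-${variant}" || echo "failed to pull codellama:13b-${variant}"
done
```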

Features

Both of the following modes (Instruct and auto-complete) are triggered via the Trigger Inline Suggestion command in the VSCode command palette (Cmd + Shift + P).

Automatic triggers are not supported

Instruct

Trigger Codellama: Ask and provide a prompt for instruction-based question answering.

This uses codellama:13b-instruct
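
The request goes through the local Ollama server. As a rough sketch, an equivalent raw request against Ollama's `/api/generate` endpoint looks like this (default port 11434 assumed; the prompt text is illustrative, not the extension's actual template):

```shell
# Hypothetical instruct-style request to the local Ollama server.
# The prompt here is an example, not the extension's internal prompt.
payload='{"model": "codellama:13b-instruct", "prompt": "Explain what a mutex is.", "stream": false}'
curl -s http://localhost:11434/api/generate -d "$payload" \
  || echo "Ollama server not reachable on localhost:11434"
```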

Explain

Trigger Codellama: Explain to explain the selected code. If no selection is provided, it will attempt to explain the full document.

This uses codellama:13b-instruct

Auto-complete

Write any code and trigger a code completion for it using Trigger Inline Completion

Based on the filetype, it will use codellama:13b-python for Python and codellama:13b-code for other languages
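
The model-selection rule above can be sketched as a tiny shell helper (the function name is ours, not part of the extension):

```shell
# Sketch of the filetype-to-model rule described above.
pick_model() {
  case "$1" in
    python) echo "codellama:13b-python" ;;  # Python files get the Python-tuned model
    *)      echo "codellama:13b-code" ;;    # everything else gets the generic code model
  esac
}

pick_model python   # prints codellama:13b-python
pick_model rust     # prints codellama:13b-code
```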

Known Issues

When switching languages or models within a session, the initial prompt after a switch can be slow, as the new model needs to be loaded into memory. If you end up loading all three models, you might run out of RAM.


Enjoy!
