# Home
Tanmay Jain edited this page Sep 8, 2023
This roadmap outlines our near-term and long-term priorities for advancing Open Interpreter's capabilities and accessibility.
## Unified Model Configuration
- Support configuring models via CLI, Python, and config file
- Standardize on a single `--model` flag instead of separate local/fast modes
- Enable any API or script to be used as a model
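The unified configuration above implies some precedence rule when the same setting arrives from several surfaces. A minimal sketch of one plausible resolution order (CLI flag over Python setting over config file); the function name, precedence order, and default are assumptions for illustration, not the project's actual implementation:

```python
# Hypothetical resolver: the CLI --model flag overrides the Python
# setting, which overrides the config file. Names are illustrative.
def resolve_model(cli_flag=None, python_setting=None, config_file=None):
    """Return the first model name set, in priority order."""
    for source in (cli_flag, python_setting, config_file):
        if source:
            return source
    return "gpt-4"  # assumed default


# Usage: a --model flag on the command line wins over a config-file entry.
model = resolve_model(cli_flag="codellama", config_file="gpt-4")
```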
## CLI & Python Parity
- Make all CLI functionality mirror Python API
- Eliminate inconsistencies between the two interfaces
## Conversation Management
- Enable saving, loading, and resuming conversations as JSON
- Add undo command to return to previous user message
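The two bullets above can be sketched together: conversations are a list of message dicts that round-trips through JSON, and undo truncates the list back past the most recent user message. The function names are hypothetical, not the actual Open Interpreter API:

```python
import json


def save_conversation(messages, path):
    # Persist the message list as JSON so a session can be resumed later.
    with open(path, "w") as f:
        json.dump(messages, f)


def load_conversation(path):
    with open(path) as f:
        return json.load(f)


def undo(messages):
    # Drop trailing messages up to and including the last user turn,
    # returning the conversation to the state before that message.
    while messages and messages[-1]["role"] != "user":
        messages.pop()
    if messages:
        messages.pop()
    return messages
```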
## Refactoring
- Modularize codebase for easier extensions
- Separate core interpreter loop from model interfaces
## Testing & Stability
- Expand test suite coverage
- Fix Windows compatibility issues
- Resolve Code Llama integration problems
## Documentation
- Improve the docs site with examples and tutorials
- Segment guides by user persona
## Open Source Model Support
- Add integrations for Claude, Mixture of Experts, Code Llama, etc.
- Allow any model with a function API to be used
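"Any model with a function API" suggests that a model could be nothing more than a callable mapping a message list to a completion string. A minimal sketch of such an adapter, assuming that interface; the names and the message-dict shape are illustrative, not the project's actual contract:

```python
# Hypothetical adapter: any callable that maps a list of message dicts
# to a completion string can serve as a model backend.
def make_model(completion_fn):
    def model(messages):
        return completion_fn(messages)
    return model


# A trivial scripted backend standing in for any local model or API.
def echo_backend(messages):
    return "You said: " + messages[-1]["content"]


model = make_model(echo_backend)
```

Under this contract, swapping backends (a hosted API, a local Code Llama process, or a shell script wrapper) only requires supplying a different callable.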
## Desktop Application
- Bundle Open Interpreter and Whisper into an installable app
- Include chatbot and browser automation capabilities
## Custom Fine-Tuned Model
- Train a model optimized specifically for code interpretation
- Align it tightly with Open Interpreter's capabilities
## Non-Programmer Support
- Add a no-code mode that shows explanations instead of full code
- Curate use cases and examples for non-developers