Data Neuron is an AI-driven data framework for creating and maintaining an AI data analyst. It supports SQLite, PostgreSQL, MySQL, MSSQL, and CSV files (through DuckDB), and works with major LLMs such as Claude (the default), OpenAI, and Llama (through Groq, NVIDIA, and others), as well as Ollama.
(Demo videos: chat-reports.mp4, neuron-preview-last.mp4)
(Screenshot: Data Neuron preview, 2024-07-25)
A small framework, Data Neuron is optimized for working with subsets of a database, typically handling 10 to 15 tables. Its objective is to give you the ability to maintain and improve the semantic layer/knowledge graph, thereby letting an AI agent with general intelligence become data-intelligent about your specific data.
- Support for multiple database types (SQLite, PostgreSQL, MySQL, MSSQL, and CSV files through DuckDB)
- Natural language to SQL query conversion
- Interactive chat mode for continuous database querying
- Automatic context generation from database schema
- Customizable context for improved query accuracy
- Support for various LLM providers (Claude, OpenAI, Azure, Custom, Ollama)
- Optimized for smaller database subsets (up to 10-15 tables)
Data Neuron can be installed with different database support options:

- Base package (SQLite support only):

  ```
  pip install dataneuron
  ```

- With PostgreSQL support:

  ```
  pip install dataneuron[postgres]
  ```

- With MySQL support:

  ```
  pip install dataneuron[mysql]
  ```

- With MSSQL support:

  ```
  pip install dataneuron[mssql]
  ```

- With all database supports:

  ```
  pip install dataneuron[all]
  ```

- With CSV support:

  ```
  pip install dataneuron[csv]
  ```

Note: if you use zsh, you may need to quote the package name, e.g. `pip install "dataneuron[mysql]"`. CSV support currently does not handle nested folder structures: point Data Neuron at a flat folder of CSV files, and each CSV will be treated as a table.
- Initialize database configuration:

  ```
  dnn --db-init <database_type>
  ```

  Replace `<database_type>` with `sqlite`, `mysql`, `mssql`, or `postgres`. This creates a `database.yaml` file that the framework later uses to connect to your database.
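For reference, a generated `database.yaml` for PostgreSQL might look roughly like the sketch below. The key names are illustrative assumptions, not the framework's documented schema; treat the file that `dnn --db-init` actually generates as the source of truth.

```yaml
# Hypothetical shape of database.yaml (all key names are assumptions)
database:
  type: postgres
  host: localhost
  port: 5432
  dbname: analytics
  user: readonly_user
  password: your_password_here
```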
- Generate context from your database:

  ```
  dnn --init
  ```

  This creates YAML files in the `context/` directory, which form the semantic layer for your data. You will be asked to select a couple of tables so they can be auto-labelled; you can edit the labels later.

- Or start an interactive chat session:

  ```
  dnn --chat
  ```
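The context files generated by `dnn --init` are plain YAML you can edit by hand. As an illustration only, a table entry might resemble the sketch below; the field names are hypothetical, so inspect your generated `context/` files for the real structure.

```yaml
# Hypothetical semantic-layer entry (field names are assumptions)
table: orders
description: Customer orders placed through the storefront
columns:
  - name: order_id
    description: Primary key for the order
  - name: amount
    description: Order total in USD
```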
- Generate reports with an image as input for your dashboards. This requires `wkhtmltopdf` on your system (on macOS: `brew install wkhtmltopdf`):

  ```
  dnn --report
  ```
Data Neuron supports various LLM providers. Set the following environment variables based on your chosen provider:

- Claude (default):

  ```
  CLAUDE_API_KEY=your_claude_api_key_here
  ```

- OpenAI:

  ```
  DATA_NEURON_LLM=openai
  OPENAI_API_KEY=your_openai_api_key_here
  OPENAI_MODEL=gpt-4  # Optional, defaults to gpt-4o
  ```

- Azure:

  ```
  DATA_NEURON_LLM=azure
  AZURE_OPENAI_API_KEY=your_azure_api_key_here
  AZURE_OPENAI_API_VERSION=your_api_version_here
  AZURE_OPENAI_ENDPOINT=your_azure_endpoint_here
  AZURE_OPENAI_DEPLOYMENT_NAME=your_deployment_name_here
  ```

- Custom:

  ```
  DATA_NEURON_LLM=custom
  DATA_NEURON_LLM_API_KEY=your_custom_api_key_here
  DATA_NEURON_LLM_ENDPOINT=your_custom_endpoint_here
  DATA_NEURON_LLM_MODEL=your_preferred_model_here
  ```

  Note: this option may not generate as good a set of results.

- Ollama:

  ```
  DATA_NEURON_LLM=ollama
  DATA_NEURON_LLM_MODEL=your_preferred_local_model_here
  ```
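For example, selecting the OpenAI provider is just a matter of exporting the variables above in your shell before running `dnn` (the key below is a placeholder):

```shell
# Select the OpenAI provider and supply credentials (placeholder values).
export DATA_NEURON_LLM=openai
export OPENAI_API_KEY=your_openai_api_key_here
# Optional: pin a model explicitly (gpt-4o is the documented default).
export OPENAI_MODEL=gpt-4o
```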
- Initialize database config: `dnn --db-init <database_type>`
- Generate context: `dnn --init`
- Start chat mode: `dnn --chat`
In this example there is a folder called `dataset-raw` with files like `events.csv` and `orders.csv`; each CSV will be treated as a table.
(Demo video: duckdb-csv.mp4)
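If you want to try this without your own data, a stdlib-only Python script can generate a flat `dataset-raw` folder in the shape Data Neuron expects: one CSV per table, no nested directories. The file and column names below are arbitrary sample data.

```python
import csv
import os

# Build a minimal flat "dataset-raw" folder: one CSV file per table,
# no nested directories (nested folders are not supported yet).
os.makedirs("dataset-raw", exist_ok=True)

with open("dataset-raw/orders.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["order_id", "customer", "amount"])  # header row = columns
    writer.writerow([1, "alice", 120.50])
    writer.writerow([2, "bob", 80.00])

with open("dataset-raw/events.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["event_id", "order_id", "type"])
    writer.writerow([10, 1, "created"])
```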
To start with SQLite you can just run `pip install dataneuron`; you don't need any extra dependencies.
(Demo video: sqliteneuron.mp4)
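If you don't have a SQLite database handy, Python's stdlib can create a small one to point the framework at (the file and table names below are arbitrary examples):

```python
import sqlite3

# Create a tiny example SQLite database (example.db) with one table.
conn = sqlite3.connect("example.db")
conn.execute("DROP TABLE IF EXISTS customers")  # start fresh on reruns
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO customers (name) VALUES (?)", [("alice",), ("bob",)])
conn.commit()
count = conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
conn.close()
```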
We have exciting plans for the future of Data Neuron:

- Expanded Database Support:
  - Add support for additional databases and data warehouses
  - Integrate with popular cloud data platforms
- API Server Capability:
  - Develop an API server mode to respond to queries based on context
  - Enable seamless integration with other applications and services
- Context Marts:
  - Implement the concept of context marts (e.g., marketing_context_mart, product_context_mart)
  - Allow for more focused and efficient querying within specific domains
- Synthetic Query Generation:
  - Create a system for generating synthetic queries
  - Enhance testing and development processes
- Deterministic Testing:
  - Develop a suite of deterministic tests for query accuracy
  - Enable easy comparison and evaluation of different LLM models
- Continuous Improvement Framework:
  - Implement mechanisms for ongoing learning and refinement of the AI model
  - Incorporate user feedback to enhance query generation accuracy
- Scalability Enhancements:
  - Optimize performance for larger datasets while maintaining focus on subset efficiency
  - Explore distributed processing options for more complex queries
- An Agentic Analyst
We welcome contributions to Data Neuron! Please see our Contributing Guide for more details on how to get started.
To set up Data Neuron for development:
- Clone the repository:

  ```
  git clone https://github.com/databrainhq/dataneuron.git
  cd dataneuron
  ```
- Install dependencies using Poetry:

  ```
  poetry install --all-extras
  ```

  or:

  ```
  poetry install --extras postgres
  ```
- Run tests:

  ```
  poetry run pytest
  ```

  Note: Tests are still being added.
This project is licensed under the MIT License - see the LICENSE file for details.
For questions, suggestions, or issues, please open an issue on the GitHub repository or contact the maintainers directly.
Happy querying with Data Neuron!