Helix is a framework for building multi-model, feedback-looping AI systems. It's like a modular synthesizer for AI.
Read more about the concept in this blog post. In this analogy, if GPT
is a module making a single tone, Helix
is a rack full of modules feeding back into each other, making a beautiful cacophony.
You interact with Helix by using and writing Task Modules, which each provide a single AI capability, and by creating Graphs, which describe a network of those modules and their inputs and outputs.
Helix then loads the graph, runs each module in its own separate process, handles communication between them, and provides a live web interface for interacting with them.
Though the project has lofty goals, Helix as a framework may be practical for all sorts of uses, such as:
- Building feedback-driven, self-training and internally-adversarial AI systems
- Making human-in-the-loop and human-out-of-the-loop AI networks
- Giving real-world capabilities (internet access, UNIX shells, robot controls) to AI systems
- Designing multi-modal AI networks
- Unsupervised knowledge creation
Helix is written in Elixir and provides a web interface with Phoenix LiveView.
🚨🚨🚨 Warning! Helix, left unattended, may eat through OpenAI credits as fast as it can! 🚨🚨🚨
These instructions assume you have Elixir installed.
First, clone this repository and cd
into it.
Then, install the dependencies:
```sh
mix deps.get
```
Copy the environment template file:
```sh
cp .env.tpl .env
```
Next, get your OpenAI API keys and put them in `.env`, along with any other configuration settings you want to set.
Run the application with `source .env && mix phx.server`, or use the provided `run.sh` script. The application will now be running at `localhost:4000`.
Once Helix is running, you can visit localhost:4000 to interact with it.
On the first screen, you can see all of your available graphs:
Choose a graph from the dropdown to preview the rendered graph file. Press "Load Graph" to start the network.
On the next page, you can interact with your network (if it has LiveInput and LiveOutput modules in the graph).
Notice that if your graph has multiple LiveInput targets, you can choose which one to target using the dropdown. Each module in the graph will have its own bubble color.
Graphs are described in DOT format. A very simple GPT feedback graph could be defined like so:
```dot
digraph Daoism {
  Ying [module=GPTModule, prompt="Breathe in."]
  Yang [module=GPTModule, prompt="Breathe out."]
  Ying -> Yang
  Yang -> Ying
}
```
However, DOT is quite limited by itself, so Graph files are actually Liquid templates used to create a DOT file. This makes it much easier to use variable assigns and loops, like so:
```liquid
{% assign ying_prompt = "Your last thought was '{Yang}'. You breathe in and think: " %}
{% assign yang_prompt = "Your last thought was '{Ying}'. You breathe out and think: " %}

digraph Daoism {
  Ying [module=GPTModule, prompt="{{ying_prompt}}"]
  Yang [module=GPTModule, prompt="{{yang_prompt}}"]
  Ying -> Yang
  Yang -> Ying
}
```
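After the Liquid pass, that template renders to a plain DOT file roughly like the following; note that the single-brace `{Ying}` and `{Yang}` references are left intact for Helix's own input templating, described below:

```dot
digraph Daoism {
  Ying [module=GPTModule, prompt="Your last thought was '{Yang}'. You breathe in and think: "]
  Yang [module=GPTModule, prompt="Your last thought was '{Ying}'. You breathe out and think: "]
  Ying -> Yang
  Yang -> Ying
}
```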
Place your graphs in `./priv/graphs`.
A simple syntax is provided for accessing historical inputs. If a module is receiving a signal from YourModule, you can reference that signal as `{YourModule}`. To reference the previous signal received from that module, use `{YourModule.1}`, and so on. You can render the entire input/output history with `{HISTORY}`, and you can reference the input which triggered the current node's execution with `{INPUT}`. This syntax is likely to expand and change.
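For example, a node can weave both the triggering input and an older signal into its prompt. The sketch below is hypothetical (node names and prompts are illustrative), but it uses only the attributes and reference syntax described above:

```dot
digraph Critique {
  Writer [module=GPTModule, prompt="Continue the story: {INPUT}"]
  Editor [module=GPTModule, prompt="The writer's previous draft was '{Writer.1}'. Critique their new draft: {Writer}"]
  Writer -> Editor
  Editor -> Writer
}
```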
The following Task Modules are currently available:

- GPTModule
- GPTDecisionModule
- BBTextModule
  - Uses the Bumblebee framework, but I don't have a GPU to test it properly.
- LiveInputModule
- LiveOutputModule
- ClockModule
- AwaitModule
- StartModule
- PrintModule
- PassthroughModule
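To illustrate how a few of these fit together, a minimal human-in-the-loop graph might look like the sketch below. Node names are arbitrary, and it is only an assumption that these modules need no attributes beyond those shown earlier:

```dot
digraph Chat {
  In     [module=LiveInputModule]
  Answer [module=GPTModule, prompt="Reply helpfully to: {INPUT}"]
  Out    [module=LiveOutputModule]
  In -> Answer
  Answer -> Out
}
```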
Creating a module is very simple. All a module must do is implement `handle_cast({:convey, event}, state)` to receive inputs from other modules, and at the end of that function call `convey(output_value, state)` to pass a message along.
So, the simplest possible passthrough module looks like this:

```elixir
defmodule Helix.Modules.PassthroughModule do
  use Helix.Modules.Module

  def handle_cast({:convey, event}, state) do
    {:noreply, convey(event, state)}
  end
end
```
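Building on the same pattern, a module that transforms its input before passing it along could look like the sketch below. The `ShoutModule` name is purely illustrative, and it assumes that `event` converts cleanly to a string and that `use Helix.Modules.Module` provides `convey/2` exactly as in the passthrough example:

```elixir
defmodule Helix.Modules.ShoutModule do
  use Helix.Modules.Module

  # Receive a signal from an upstream module, upcase it,
  # and convey the transformed value downstream.
  def handle_cast({:convey, event}, state) do
    shouted = event |> to_string() |> String.upcase()
    {:noreply, convey(shouted, state)}
  end
end
```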
Roadmap and ideas:

- Create modified Heex/DOT template format
- Web Interface
  - Image representations
- LiveInput
- Choose Graph from folder, instantiate
- GPTModule templating syntax
- Error Handling / ErrorModule
- Logging, Saving and Restoring
- Use DynamicSupervisor
- More modules: MixModule, ClockModule, OutputModule, TextInputModule
- More modules: ImageInputModule, StableDiffusionModule, HuggingFaceModule, ImageOutputModule, WebSearchModule, WebExtractTextModule, UnixModule, GenModuleModule, AwaitModule, various GPTDecisionModule and Bumblebee modules
- More more modules: SaveFileModule, LoadFileModule
- Refactor module names: they don't need the Module suffix
- Create GitHub Pages blog
- "Guru" example (One Ying -> A Thousand Yangs -> One Ying, etc.)
- Persistent memory
- Self-embedding
- Access to the internet
- Multi-modal stimuli
- Self-Observing Solving a Puzzle
- Recursive problem-solving
- Beat the HF GPT detector
- Rat in A Maze graph
- Brainfuck Interpreter
  - This will require a kludge, since getting the character of a string at an index is not possible with pure OAI.
- Computer Romance
  - Two 'Hers' in a relationship, where the feelings towards each other are based on a constantly-updating synopsis injected into their own prompts.
Please feel free to play around with Helix! I encourage you to share your feedback, ideas, and experiments. GitHub Issues are the best place for this.
If you'd like to make code contributions or submit graphs/modules, please send a pull request.
(c) Rich Jones, 2022+, AGPL.