This is a monorepo for the Hyperobjective LLM router: frontend, backend, and supporting infrastructure.
You will need the environment variables defined in `.env.local` to run the application.

> Note: Do not commit your `.env` file; it contains secrets that would allow others to access your OpenAI and authentication provider accounts.
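Since a committed `.env` file leaks secrets, it is worth confirming that the secret-bearing paths are git-ignored before your first commit. A minimal sketch (the exact entries are assumptions about this repo's layout):

```shell
# Append each secret-bearing path to .gitignore unless it is already listed (illustrative)
for entry in .env .env.local .vercel; do
  grep -qxF "$entry" .gitignore 2>/dev/null || echo "$entry" >> .gitignore
done
```

Running it twice is safe: the `grep -qxF` exact-line check prevents duplicate entries.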
- Install the Vercel CLI: `npm i -g vercel`
- Link your local instance with your Vercel and GitHub accounts (creates a `.vercel` directory): `vercel link`
- Download your environment variables: `vercel env pull`
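After `vercel env pull`, the pulled `.env.local` will contain the project's development variables. A hypothetical sketch (these variable names are illustrative, not the actual schema — the real names come from your Vercel project):

```shell
# Hypothetical .env.local contents -- variable names and values are placeholders
OPENAI_API_KEY="sk-placeholder"
AUTH_SECRET="generate-with-openssl-rand"
```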
- Install Foundry, then install dependencies and start the development server:

  ```bash
  pnpm install
  pnpm dev
  ```
Your app will now be running on `localhost:3000`.
## Message Queue
The message queue is used to communicate between the various services. Start it with:

```bash
pnpm inngest
```
Your message queue will now be running on `localhost:8288`.
Test Driven Development (TDD) is used for all development. Run the test suite in watch mode with:

```bash
pnpm test:watch
```
Use the CLI to rapidly prototype or play around with functionality. The available commands can be found in the `cli` directory.

```bash
pnpm cli --help
```