feat: add env variable to limit LLM usage to gpt3.5
marcusschiesser committed Apr 29, 2024
1 parent a1f02b2 commit 394733b
Showing 3 changed files with 20 additions and 3 deletions.
4 changes: 3 additions & 1 deletion .env.template
@@ -1,2 +1,4 @@
# Your openai api key. (required)
-OPENAI_API_KEY=sk-xxxx
+OPENAI_API_KEY=sk-xxxx
+# Allow all OpenAI models (not only gpt 3.5)
+ALLOW_ALL_MODELS=true
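
The API route (see the `app/api/llm/route.ts` change below) reads this value with `JSON.parse`, so the variable should be the bare JSON literal `true` or `false`; leaving it unset behaves like `false`, and a non-JSON value such as `yes` would make the parse throw. A minimal sketch of how the value is interpreted, assuming exactly that parsing (editorial illustration, not part of the commit):

```ts
// Sketch only: mirrors the JSON.parse-based read in app/api/llm/route.ts below.
// unset or empty          -> falls back to "false" -> restricted to gpt-3.5-turbo
// ALLOW_ALL_MODELS=false  -> restricted to gpt-3.5-turbo
// ALLOW_ALL_MODELS=true   -> any configured OpenAI model is accepted
// ALLOW_ALL_MODELS=yes    -> JSON.parse throws a SyntaxError on the first request
const allowAllModels: boolean = JSON.parse(process.env.ALLOW_ALL_MODELS || "false");
```
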
8 changes: 6 additions & 2 deletions README.md
@@ -17,7 +17,7 @@

Welcome to [LlamaIndex Chat](https://github.com/run-llama/chat-llamaindex). You can create and share LLM chatbots that know your data (PDF or text documents).

-Getting started with LlamaIndex Chat is a breeze. Visit https://chat-llamaindex.vercel.app - a hosted version of LlamaIndex Chat with no user authentication that provides an immediate start.
+Getting started with LlamaIndex Chat is a breeze. Visit https://chat.llamaindex.ai - a hosted version of LlamaIndex Chat with no user authentication that provides an immediate start.

## 🚀 Features

@@ -76,24 +76,28 @@ cp .env.template .env.development.local
Edit environment variables in `.env.development.local`.

#### Building the Docker Image

```bash
docker build -t chat-llamaindex .
```

#### Running in a Docker Container

```bash
docker run -p 3000:3000 --env-file .env.development.local chat-llamaindex
```

#### Docker Compose

For those preferring Docker Compose, we've included a docker-compose.yml file. To run using Docker Compose:

```bash
docker compose up
```

Go to https://localhost:3000 in your web browser.

-__Note__: By default, the Docker Compose setup maps the `cache` and `datasources` directories from your host machine to the Docker container, ensuring data persistence and accessibility between container restarts.
+**Note**: By default, the Docker Compose setup maps the `cache` and `datasources` directories from your host machine to the Docker container, ensuring data persistence and accessibility between container restarts.

### Vercel Deployment

11 changes: 11 additions & 0 deletions app/api/llm/route.ts
@@ -151,6 +151,17 @@ export async function POST(request: NextRequest) {
    );
  }

+  const allowAllModels = JSON.parse(process.env.ALLOW_ALL_MODELS || "false");
+  if (!allowAllModels && config.model !== "gpt-3.5-turbo") {
+    return NextResponse.json(
+      {
+        error:
+          "Only configured to use GPT 3.5. Change model used by the bot or set 'ALLOW_ALL_MODELS' env variable to 'true'.",
+      },
+      { status: 400 },
+    );
+  }

  const llm = new OpenAI({
    model: config.model,
    temperature: config.temperature,
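
The guard above can be read as a small predicate over the requested model. The sketch below is illustrative only (the `isModelAllowed` helper is hypothetical and not part of this commit) and assumes the same parsing and comparison as the hunk above:

```ts
// Illustrative sketch: a hypothetical helper that mirrors the guard added above.
function isModelAllowed(model: string): boolean {
  // JSON.parse turns "true"/"false" into a boolean; an unset variable
  // falls back to "false", i.e. restricted mode.
  const allowAllModels = JSON.parse(process.env.ALLOW_ALL_MODELS || "false");
  return allowAllModels || model === "gpt-3.5-turbo";
}

// With ALLOW_ALL_MODELS unset or set to "false":
//   isModelAllowed("gpt-3.5-turbo") -> true   (the request proceeds to the OpenAI call)
//   isModelAllowed("gpt-4")         -> false  (the route returns HTTP 400)
// With ALLOW_ALL_MODELS="true", every model passes the check.
```

Setting `ALLOW_ALL_MODELS=true` in `.env.development.local` (or in the deployment environment) lifts the restriction, as the error message returned by the route indicates.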
