
chatgpt-artifacts

Bring Claude's Artifacts feature to ChatGPT

Preview

Demo video: artifacts-react-4k.mp4

Usage

Clone this repository

git clone https://github.com/ozgrozer/chatgpt-artifacts.git

Install dependencies

npm install

Duplicate .env.example as .env and add your OpenAI API key

cp .env.example .env
vim .env
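
The .env file only needs the key that the server code reads. A minimal sketch, assuming .env.example uses the same OPENAI_API_KEY variable referenced in /pages/api/chat.js (replace the placeholder with your own key):

OPENAI_API_KEY=your-openai-api-key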

Build the app

npm run build

Start the app

npm start
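
If the project uses the standard Next.js start script (the /pages/api route layout suggests it does), the app should then be available at http://localhost:3000 unless you configure a different port.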

Ollama Support

To use local LLMs such as Llama 3 or Gemma 2 through Ollama, you only need a small change in the code.

Open the /pages/api/chat.js file

// change this
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY
})
// to this
const openai = new OpenAI({
  apiKey: 'ollama', // any non-empty string works; Ollama does not validate the key
  baseURL: 'http://127.0.0.1:11434/v1' // Ollama's OpenAI-compatible endpoint
})

// change this
const stream = await openai.chat.completions.create({
  stream: true,
  model: 'gpt-4o',
  messages: conversations[conversationId]
})
// to this
const stream = await openai.chat.completions.create({
  stream: true,
  model: 'llama3', // any model you have pulled locally (e.g. gemma2)
  messages: conversations[conversationId]
})
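
Ollama must be running locally with the model available before you start the app. A minimal sketch, assuming the Ollama CLI is installed (port 11434 in the baseURL above is Ollama's default):

ollama pull llama3
ollama serve

If the Ollama desktop app or background service is already running, ollama serve is not needed.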

Groq Support

To use Groq, get a Groq API key and make a small change in the code.

Open the /pages/api/chat.js file

// change this
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY
})
// to this
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: 'https://api.groq.com/openai/v1'
})

// change this
const stream = await openai.chat.completions.create({
  stream: true,
  model: 'gpt-4o',
  messages: conversations[conversationId]
})
// to this
const stream = await openai.chat.completions.create({
  stream: true,
  model: 'llama3-70b-8192',
  messages: conversations[conversationId]
})
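
Since the code above still reads process.env.OPENAI_API_KEY, put your Groq API key into that variable in .env. A minimal sketch with a placeholder value:

OPENAI_API_KEY=your-groq-api-key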

License

GPL-3.0
