Groq compatibility? #136

Open
durrantmm opened this issue May 13, 2024 · 12 comments
Labels: enhancement (New feature or request)

@durrantmm

Hi, I access llama3-70b through Groq using this Python tool. I was hoping to also use Groq with gp.nvim. Any plans to support this?

Robitx added the enhancement (New feature or request) label on Aug 4, 2024
@0fflineuser

0fflineuser commented Aug 14, 2024

For me, adding the groq provider works. Example:

return {
	"robitx/gp.nvim",
	dependencies = { "folke/which-key.nvim" },
	config = function()
		local config = {
			providers = {
				groq = {
					disable = false,
					endpoint = "https://api.groq.com/openai/v1/chat/completions",
					secret = os.getenv("GROQ_API_KEY"),
				},
			},
			agents = {
				{
					provider = "groq",
					name = "ChatGroqLlama3.1-70B",
					chat = true,
					command = false,
					-- string with model name or table with model name and parameters
					model = {
						model = "llama-3.1-70b-versatile",
						temperature = 0.6,
						top_p = 1,
						min_p = 0.05,
					},
					system_prompt = require("gp.defaults").chat_system_prompt,
				},
				{
					provider = "groq",
					name = "CodeGroqLlama3.1-70B",
					chat = false,
					command = true,
					model = {
						model = "llama-3.1-70b-versatile",
						temperature = 0.4,
						top_p = 1,
						min_p = 0.05,
					},
					system_prompt = require("gp.defaults").code_system_prompt,
				},
			},
		}
		require("gp").setup(config)
	end,
}
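
Once this loads, the chat and command agents are selectable with :GpAgent and new chats start with :GpChatNew (both commands show up in the plugin dump further down this thread). A minimal keymap sketch; the commands exist, but the key choices here are only an illustration, not part of the config above:

-- illustrative keymaps (placeholders, not from the config above)
vim.keymap.set({ "n", "i" }, "<C-g>c", "<cmd>GpChatNew<cr>", { desc = "New Gp chat" })
vim.keymap.set({ "n", "i" }, "<C-g>t", "<cmd>GpChatToggle<cr>", { desc = "Toggle Gp chat" })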

Robitx self-assigned this on Aug 14, 2024
@yuukibarns

(quotes 0fflineuser's configuration above)

Can't get this to work in my nvim.

@Robitx
Owner

Robitx commented Aug 20, 2024

@yuukibarns could you provide your gp configuration and outputs from :GpInspectPlugin and :GpInspectLog covering the events that didn't work (ideally with log_sensitive = true in the config)?

Just be careful not to paste your secrets.

@yuukibarns

The error message is in HTML format, which is hard to read, so I ignored it before.
After reading it, I found the reason: my API access is blocked; maybe Groq is not supported in my region.

@yuukibarns

Why have I been blocked?

This website is using a security service to protect itself from online attacks. The action you just performed triggered the security solution. There are several actions that could trigger this block including submitting a certain word or phrase, a SQL command or malformed data.

@yuukibarns

And it told me: "Please enable cookies."

@Robitx
Owner

Robitx commented Aug 20, 2024

@yuukibarns the message is from Cloudflare; I can't provide more help without your config and GpInspect data.
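
For what it's worth, a quick way to reproduce the block outside the plugin is to call the endpoint directly from Neovim. A minimal sketch, assuming Groq exposes the OpenAI-compatible /models listing; the curl invocation is illustrative and not part of gp.nvim:

-- standalone check; paste into a file and run with :luafile %
local key = os.getenv("GROQ_API_KEY") or ""
local out = vim.fn.system({
	"curl", "-sS", "https://api.groq.com/openai/v1/models",
	"-H", "Authorization: Bearer " .. key,
})
-- A JSON model list means the key and region are accepted; a Cloudflare HTML
-- page ("Please enable cookies", "Why have I been blocked?") means the request
-- is rejected before it reaches the API.
print(out)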

@yuukibarns

{
	"robitx/gp.nvim",
	config = function()
		local conf = {
			providers = {
				groq = {
					disable = false,
					endpoint = "https://api.groq.com/openai/v1/chat/completions",
					secret = os.getenv("GROQ_API_KEY"),
				},
				openai = {
					disable = true,
					endpoint = "https://api.openai.com/v1/chat/completions",
					-- secret = os.getenv("OPENAI_API_KEY"),
				},
			},
			agents = {
				{
					name = "ChatGroqLlama3.1-70B",
					provider = "groq",
					chat = true,
					command = false,
					-- string with model name or table with model name and parameters
					model = {
						model = "llama-3.1-70b-versatile",
						temperature = 0.6,
						top_p = 1,
						min_p = 0.05,
					},
					system_prompt = require("gp.defaults").chat_system_prompt,
				},
				{
					name = "CodeGroqLlama3.1-70B",
					provider = "groq",
					chat = false,
					command = true,
					model = {
						model = "llama-3.1-70b-versatile",
						temperature = 0.4,
						top_p = 1,
						min_p = 0.05,
					},
					system_prompt = require("gp.defaults").code_system_prompt,
				},
			},
		}
		require("gp").setup(conf)

		-- Setup shortcuts here (see Usage > Shortcuts in the Documentation/Readme)
	end,
},

@yuukibarns

Plugin structure:
{
BufTarget = {
current = 0,
popup = 1,
split = 2,
tabnew = 4,
vsplit = 3
},
Prompt = <function 1>,
Target = {
append = 1,
enew = <function 2>,
new = <function 3>,
popup = 3,
prepend = 2,
rewrite = 0,
tabnew = <function 4>,
vnew = <function 5>
},
_Name = "Gp",
_chat_agents = { "ChatGroqLlama3.1-70B" },
_chat_finder_opened = false,
_command_agents = { "CodeGroqLlama3.1-70B" },
_prepared_bufs = {},
_setup_called = true,
_state = {
chat_agent = "ChatGroqLlama3.1-70B",
command_agent = "CodeGroqLlama3.1-70B",
last_chat = "C:\Users\JZR\AppData\Local\nvim-data/gp/chats/2024-08-20.17-21-14.420.md",
updated = 1724155208
},
_toggle = {},
_toggle_add = <function 6>,
_toggle_close = <function 7>,
_toggle_kind = {
chat = 1,
context = 3,
popup = 2,
unknown = 0
},
_toggle_resolve = <function 8>,
agents = {
["ChatGroqLlama3.1-70B"] = {
chat = true,
command = false,
model = {
min_p = 0.05,
model = "llama-3.1-70b-versatile",
temperature = 0.6,
top_p = 1
},
name = "ChatGroqLlama3.1-70B",
provider = "groq",
system_prompt = "You are a general AI assistant.\n\nThe user provided the additional info about how they would like you to respond:\n\n- If you're unsure don't guess and say you don't know instead.\n- Ask question if you need clarification to provide better answer.\n- Think deeply and carefully from first principles step by step.\n- Zoom out first to see the big picture and then zoom in to details.\n- Use Socratic method to improve your thinking and coding skills.\n- Don't elide any code from your output if the answer requires coding.\n- Take a deep breath; You've got this!\n"
},
["CodeGroqLlama3.1-70B"] = {
chat = false,
command = true,
model = {
min_p = 0.05,
model = "llama-3.1-70b-versatile",
temperature = 0.4,
top_p = 1
},
name = "CodeGroqLlama3.1-70B",
provider = "groq",
system_prompt = "You are an AI working as a code editor.\n\nPlease AVOID COMMENTARY OUTSIDE OF THE SNIPPET RESPONSE.\nSTART AND END YOUR ANSWER WITH:\n\n" } }, buf_handler = <function 9>, chat_respond = <function 10>, cmd = { Agent = <function 11>, Append = <function 12>, ChatDelete = <function 13>, ChatFinder = <function 14>, ChatNew = <function 15>, ChatPaste = <function 16>, ChatRespond = <function 17>, ChatToggle = <function 18>, Context = <function 19>, Enew = <function 20>, New = <function 21>, NextAgent = <function 22>, Popup = <function 23>, Prepend = <function 24>, Rewrite = <function 25>, Stop = <function 26>, Tabnew = <function 27>, Vnew = <function 28>, WhisperAppend = <function 29>, WhisperEnew = <function 30>, WhisperNew = <function 31>, WhisperPopup = <function 32>, WhisperPrepend = <function 33>, WhisperRewrite = <function 34>, WhisperTabnew = <function 35>, WhisperVnew = <function 36> }, config = { chat_assistant_prefix = { "🤖:", "[{{agent}}]" }, chat_conceal_model_params = true, chat_confirm_delete = true, chat_dir = "C:\\Users\\JZR\\AppData\\Local\\nvim-data/gp/chats", chat_finder_pattern = "topic ", chat_free_cursor = false, chat_prompt_buf_type = false, chat_shortcut_delete = { modes = { "n", "i", "v", "x" }, shortcut = "<C-g>d" }, chat_shortcut_new = { modes = { "n", "i", "v", "x" }, shortcut = "<C-g>c" }, chat_shortcut_respond = { modes = { "n", "i", "v", "x" }, shortcut = "<C-g><C-g>" }, chat_shortcut_stop = { modes = { "n", "i", "v", "x" }, shortcut = "<C-g>s" }, chat_template = "# topic: ?\n\n- file: {{filename}}\n{{optional_headers}}\nWrite your queries after {{user_prefix}}. Use `{{respond_shortcut}}` or :{{cmd_prefix}}ChatRespond to generate a response.\nResponse generation can be terminated by using `{{stop_shortcut}}` or :{{cmd_prefix}}ChatStop command.\nChats are saved automatically. To delete this chat, use `{{delete_shortcut}}` or :{{cmd_prefix}}ChatDelete.\nBe cautious of very long chats. Start a fresh chat by using `{{new_shortcut}}` or :{{cmd_prefix}}ChatNew.\n\n---\n\n{{user_prefix}}\n", chat_topic_gen_prompt = "Summarize the topic of our conversation above in two or three words. 
Respond only with those words.", chat_user_prefix = "💬:", cmd_prefix = "Gp", command_auto_select_response = true, command_prompt_prefix_template = "🤖 {{agent}} ~ ", curl_params = <1>{}, log_file = "C:\\Users\\JZR\\AppData\\Local\\nvim-data/gp.nvim.log", log_sensitive = false, openai_api_key = "", state_dir = "C:\\Users\\JZR\\AppData\\Local\\nvim-data/gp/persisted", style_chat_finder_border = "single", style_chat_finder_margin_bottom = 8, style_chat_finder_margin_left = 1, style_chat_finder_margin_right = 2, style_chat_finder_margin_top = 2, style_chat_finder_preview_ratio = 0.5, style_popup_border = "single", style_popup_margin_bottom = 8, style_popup_margin_left = 1, style_popup_margin_right = 2, style_popup_margin_top = 2, style_popup_max_width = 160, template_append = "I have the following from {{filename}}:\n\n{{filetype}}\n{{selection}}\n\n\n{{command}}\n\nRespond exclusively with the snippet that should be appended after the selection above.", template_command = "{{command}}", template_prepend = "I have the following from {{filename}}:\n\n{{filetype}}\n{{selection}}\n\n\n{{command}}\n\nRespond exclusively with the snippet that should be prepended before the selection above.", template_rewrite = "I have the following from {{filename}}:\n\n{{filetype}}\n{{selection}}\n\n\n{{command}}\n\nRespond exclusively with the snippet that should replace the selection above.", template_selection = "I have the following from {{filename}}:\n\n{{filetype}}\n{{selection}}\n\n\n{{command}}", toggle_target = "vsplit", zindex = 49 }, defaults = { chat_system_prompt = "You are a general AI assistant.\n\nThe user provided the additional info about how they would like you to respond:\n\n- If you're unsure don't guess and say you don't know instead.\n- Ask question if you need clarification to provide better answer.\n- Think deeply and carefully from first principles step by step.\n- Zoom out first to see the big picture and then zoom in to details.\n- Use Socratic method to improve your thinking and coding skills.\n- Don't elide any code from your output if the answer requires coding.\n- Take a deep breath; You've got this!\n", chat_template = "# topic: ?\n\n- file: {{filename}}\n{{optional_headers}}\nWrite your queries after {{user_prefix}}. Use `{{respond_shortcut}}` or :{{cmd_prefix}}ChatRespond to generate a response.\nResponse generation can be terminated by using `{{stop_shortcut}}` or :{{cmd_prefix}}ChatStop command.\nChats are saved automatically. To delete this chat, use `{{delete_shortcut}}` or :{{cmd_prefix}}ChatDelete.\nBe cautious of very long chats. Start a fresh chat by using `{{new_shortcut}}` or :{{cmd_prefix}}ChatNew.\n\n---\n\n{{user_prefix}}\n", code_system_prompt = "You are an AI working as a code editor.\n\nPlease AVOID COMMENTARY OUTSIDE OF THE SNIPPET RESPONSE.\nSTART AND END YOUR ANSWER WITH:\n\n",
short_chat_template = "# topic: ?\n- file: {{filename}}\n---\n\n{{user_prefix}}\n"
},
deprecator = {
_deprecated = {},
check_health = <function 37>,
has_old_chat_signature = <function 38>,
has_old_prompt_signature = <function 39>,
is_valid = <function 40>,
report = <function 41>
},
dispatcher = {
config = {
curl_params = <table 1>
},
create_handler = <function 42>,
prepare_payload = <function 43>,
providers = {
groq = {
disable = false,
endpoint = "https://api.groq.com/openai/v1/chat/completions"
}
},
query = <function 44>,
query_dir = "C:\Users\JZR\AppData\Local\Temp\nvim/gp/query",
setup = <function 45>
},
display_chat_agent = <function 46>,
get_chat_agent = <function 47>,
get_command_agent = <function 48>,
helpers = {
autocmd = <function 49>,
create_augroup = <function 50>,
create_user_command = <function 51>,
cursor_to_line = <function 52>,
delete_buffer = <function 53>,
delete_file = <function 54>,
ends_with = <function 55>,
feedkeys = <function 56>,
file_to_table = <function 57>,
find_git_root = <function 58>,
get_buffer = <function 59>,
get_filetype = <function 60>,
last_content_line = <function 61>,
prepare_dir = <function 62>,
set_keymap = <function 63>,
starts_with = <function 64>,
table_to_file = <function 65>,
undojoin = <function 66>,
uuid = <function 67>
},
hooks = {
Implement = <function 68>,
InspectLog = <function 69>,
InspectPlugin = <function 70>
},
imager = {
_agents = { "DALL-E-3-1024x1024-natural", "DALL-E-3-1024x1024-natural-hd", "DALL-E-3-1024x1024-vivid", "DALL-E-3-1024x1024-vivid-hd", "DALL-E-3-1024x1792-natural", "DALL-E-3-1024x1792-natural-hd", "DALL-E-3-1024x1792-vivid", "DALL-E-3-1024x1792-vivid-hd", "DALL-E-3-1792x1024-natural", "DALL-E-3-1792x1024-natural-hd", "DALL-E-3-1792x1024-vivid", "DALL-E-3-1792x1024-vivid-hd" },
_state = {
agent = "DALL-E-3-1024x1024-natural"
},
agents = {
["DALL-E-3-1024x1024-natural"] = {
model = "dall-e-3",
name = "DALL-E-3-1024x1024-natural",
quality = "standard",
size = "1024x1024",
style = "natural"
},
["DALL-E-3-1024x1024-natural-hd"] = {
model = "dall-e-3",
name = "DALL-E-3-1024x1024-natural-hd",
quality = "hd",
size = "1024x1024",
style = "natural"
},
["DALL-E-3-1024x1024-vivid"] = {
model = "dall-e-3",
name = "DALL-E-3-1024x1024-vivid",
quality = "standard",
size = "1024x1024",
style = "vivid"
},
["DALL-E-3-1024x1024-vivid-hd"] = {
model = "dall-e-3",
name = "DALL-E-3-1024x1024-vivid-hd",
quality = "hd",
size = "1024x1024",
style = "vivid"
},
["DALL-E-3-1024x1792-natural"] = {
model = "dall-e-3",
name = "DALL-E-3-1024x1792-natural",
quality = "standard",
size = "1024x1792",
style = "natural"
},
["DALL-E-3-1024x1792-natural-hd"] = {
model = "dall-e-3",
name = "DALL-E-3-1024x1792-natural-hd",
quality = "hd",
size = "1024x1792",
style = "natural"
},
["DALL-E-3-1024x1792-vivid"] = {
model = "dall-e-3",
name = "DALL-E-3-1024x1792-vivid",
quality = "standard",
size = "1024x1792",
style = "vivid"
},
["DALL-E-3-1024x1792-vivid-hd"] = {
model = "dall-e-3",
name = "DALL-E-3-1024x1792-vivid-hd",
quality = "hd",
size = "1024x1792",
style = "vivid"
},
["DALL-E-3-1792x1024-natural"] = {
model = "dall-e-3",
name = "DALL-E-3-1792x1024-natural",
quality = "standard",
size = "1792x1024",
style = "natural"
},
["DALL-E-3-1792x1024-natural-hd"] = {
model = "dall-e-3",
name = "DALL-E-3-1792x1024-natural-hd",
quality = "hd",
size = "1792x1024",
style = "natural"
},
["DALL-E-3-1792x1024-vivid"] = {
model = "dall-e-3",
name = "DALL-E-3-1792x1024-vivid",
quality = "standard",
size = "1792x1024",
style = "vivid"
},
["DALL-E-3-1792x1024-vivid-hd"] = {
model = "dall-e-3",
name = "DALL-E-3-1792x1024-vivid-hd",
quality = "hd",
size = "1792x1024",
style = "vivid"
}
},
cmd = {
Image = <function 71>,
ImageAgent = <function 72>
},
config = {
cmd_prefix = "Gp",
disable = false,
prompt_prefix_template = "🖌️ {{agent}} ~ ",
prompt_save = "🖌️💾 ~ ",
state_dir = "C:\Users\JZR\AppData\Local\nvim-data/gp/persisted",
store_dir = "C:\Users\JZR\AppData\Local\Temp/gp_images"
},
disabled = false,
generate_image = <function 73>,
get_image_agent = <function 74>,
refresh = <function 75>,
setup = <function 76>
},
logger = {
_log_history = { "[2024-08-20.19-59-10.600] [1b57ebd2] DEBUG: creating user command: GpEnew", "[2024-08-20.19-59-10.600] [1b57ebd2] DEBUG: creating user command: GpWhisperEnew", "[2024-08-20.19-59-10.600] [1b57ebd2] DEBUG: creating user command: GpPopup", "[2024-08-20.19-59-10.700] [1b57ebd2] DEBUG: creating user command: GpWhisperPopup", "[2024-08-20.19-59-10.700] [1b57ebd2] DEBUG: creating user command: GpTabnew", "[2024-08-20.19-59-10.700] [1b57ebd2] DEBUG: creating user command: GpWhisperTabnew", "[2024-08-20.19-59-10.800] [1b57ebd2] DEBUG: creating user command: GpRewrite", "[2024-08-20.19-59-10.800] [1b57ebd2] DEBUG: creating user command: GpWhisperRewrite", "[2024-08-20.19-59-10.800] [1b57ebd2] DEBUG: creating user command: GpVnew", "[2024-08-20.19-59-10.800] [1b57ebd2] DEBUG: creating user command: GpWhisperVnew", "[2024-08-20.19-59-10.900] [1b57ebd2] DEBUG: creating user command: GpChatRespond", "[2024-08-20.19-59-10.900] [1b57ebd2] DEBUG: creating user command: GpAgent", "[2024-08-20.19-59-10.900] [1b57ebd2] DEBUG: creating user command: GpContext", "[2024-08-20.19-59-10.100] [1b57ebd2] DEBUG: creating user command: GpWhisperAppend", "[2024-08-20.19-59-10.100] [1b57ebd2] DEBUG: creating user command: GpAppend", "[2024-08-20.19-59-10.100] [1b57ebd2] DEBUG: creating user command: GpChatToggle", "[2024-08-20.19-59-10.100] [1b57ebd2] DEBUG: creating user command: GpChatPaste", "[2024-08-20.19-59-10.130] [1b57ebd2] DEBUG: setup finished", "[2024-08-20.20-00-08.869] [1b57ebd2] DEBUG: state[updated]: disk=1724155150 old=1724155150 new=1724155208", "[2024-08-20.20-00-08.871] [1b57ebd2] DEBUG: running hook: InspectPlugin" },
debug = <function 77>,
error = <function 78>,
info = <function 79>,
now = <function 80>,
setup = <function 81>,
trace = <function 82>,
warning = <function 83>
},
new_chat = <function 84>,
not_chat = <function 85>,
open_buf = <function 86>,
prep_chat = <function 87>,
prep_context = <function 88>,
prep_md = <function 89>,
prepare_commands = <function 90>,
refresh_state = <function 91>,
render = {
append_selection = <function 92>,
popup = <function 93>,
prompt_template = <function 94>,
template = <function 95>,
template_replace = <function 96>
},
repo_instructions = <function 97>,
resolve_buf_target = <function 98>,
setup = <function 99>,
spinner = {
_current_spinner_frame = 1,
_display_spinner = <function 100>,
_spinner_frames = { "01010010", "01101111", "01100010", "01101001", "01110100", "01111000", "00101111", "01100111", "01110000", "00101110", "01101110", "01110110", "01101001", "01101101" },
start_spinner = <function 101>,
stop_spinner = <function 102>
},
tasker = {
_handles = {},
_queries = {},
add_handle = <function 103>,
cleanup_old_queries = <function 104>,
get_query = <function 105>,
grep_directory = <function 106>,
is_busy = <function 107>,
once = <function 108>,
remove_handle = <function 109>,
run = <function 110>,
set_query = <function 111>,
stop = <function 112>
},
vault = {
_obfuscated_secrets = {},
_state = {},
add_secret = <function 113>,
config = {
curl_params = <table 1>,
state_dir = "C:\Users\JZR\AppData\Local\nvim-data/gp/persisted"
},
get_secret = <function 114>,
refresh_copilot_bearer = <function 115>,
resolve_secret = <function 116>,
run_with_secret = <function 117>,
setup = <function 118>
},
whisper = {
Whisper = <function 119>,
check_health = <function 120>,
cmd = {
Whisper = <function 121>
},
config = {
cmd_prefix = "Gp",
curl_params = <table 1>,
disable = false,
endpoint = "https://api.openai.com/v1/audio/transcriptions",
language = "en",
silence = "1.75",
store_dir = "C:\Users\JZR\AppData\Local\Temp/gp_whisper",
style_popup_border = "single",
tempo = "1.75"
},
disabled = false,
setup = <function 122>
}
}
Command params:
{
args = "",
bang = false,
count = -1,
fargs = {},
line1 = 2,
line2 = 2,
mods = "",
name = "GpInspectPlugin",
range = 0,
reg = "",
smods = {
browse = false,
confirm = false,
emsg_silent = false,
hide = false,
horizontal = false,
keepalt = false,
keepjumps = false,
keepmarks = false,
keeppatterns = false,
lockmarks = false,
noautocmd = false,
noswapfile = false,
sandbox = false,
silent = false,
split = "",
tab = -1,
unsilent = false,
verbose = -1,
vertical = false
}
}
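
One detail visible in the dump above: log_sensitive = false, so :GpInspectLog will not show the request details that were asked for. A sketch of temporarily enabling it while debugging, assuming the rest of the config stays as posted (scrub any secrets before sharing the log):

local conf = {
	-- turn back off once the failing request has been captured in the log file
	log_sensitive = true,
	-- ...providers and agents as before...
}
require("gp").setup(conf)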

@yuukibarns

[screenshot attached]

@Robitx
Owner

Robitx commented Aug 20, 2024

@yuukibarns yep, sadly looks like a geo fence issue 🙁 groq/groq-python#32

I've had similar troubles with Gemini in the EU. If you can get a proxy that makes your location appear to be in the US, you can use the curl_params config option to set it up:

-- optional curl parameters (for proxy, etc.) 
-- curl_params = { "--proxy", "https://X.X.X.X:XXXX" } 
curl_params = {}, 
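
For completeness, a sketch of what that could look like together with the groq provider; the proxy address below is a placeholder, not a working endpoint:

require("gp").setup({
	-- route all of gp.nvim's curl requests through the proxy
	curl_params = { "--proxy", "http://127.0.0.1:8080" },
	providers = {
		groq = {
			endpoint = "https://api.groq.com/openai/v1/chat/completions",
			secret = os.getenv("GROQ_API_KEY"),
		},
	},
})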

@yuukibarns

Thanks for your analysis.
I've switched to DeepSeek as an alternative; it is based in mainland China and supports Chinese well.
An amazing plugin, 👍
