
Error decoding response body: expected value at line 1 column 1 #99

Open
jalalirs opened this issue Oct 24, 2023 · 6 comments

jalalirs commented Oct 24, 2023

I am trying to use llm-vscode with a locally deployed Text Generation Inference (TGI) server, but I keep getting the following error:

Error decoding response body: expected value at line 1 column 1

My settings are the following, where <host> and <port> correspond to my server address. I tried both with /generate and without it:

{
    "editor.accessibilitySupport": "off",
    "workbench.colorTheme": "Default Dark+",
    "git.openRepositoryInParentFolders": "always",
    "diffEditor.codeLens": true,
    "llm.attributionEndpoint": "http:https://<host>:<port>/generate",
    "llm.configTemplate": "Custom",
    "llm.modelIdOrEndpoint": "http:https:// <host>:<port>/generate",
    "llm.fillInTheMiddle.enabled": true,
    "llm.fillInTheMiddle.prefix": "<PRE> ",
    "llm.fillInTheMiddle.middle": " <MID>",
    "llm.fillInTheMiddle.suffix": " <SUF>",
    "llm.temperature": 0.2,
    "llm.contextWindow": 4096,
    "llm.tokensToClear": [
        "<EOT>"
    ],
    "llm.enableAutoSuggest": true,
    "llm.documentFilter": {
 

    },
    "llm.tlsSkipVerifyInsecure": true
}
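
For context, the fill-in-the-middle values above are CodeLlama's infilling tokens; my understanding is that the extension concatenates them with the code around the cursor into a single prompt before sending it to /generate. A rough Python sketch of that assembly (illustrative only, not the extension's actual code):

# Sketch (my understanding, not the extension's actual code): how the
# fillInTheMiddle settings above are combined into one prompt string.
FIM_PREFIX = "<PRE> "   # llm.fillInTheMiddle.prefix
FIM_SUFFIX = " <SUF>"   # llm.fillInTheMiddle.suffix
FIM_MIDDLE = " <MID>"   # llm.fillInTheMiddle.middle

def build_fim_prompt(before_cursor: str, after_cursor: str) -> str:
    # CodeLlama infilling layout: <PRE> {prefix} <SUF>{suffix} <MID>,
    # after which the model generates the missing middle and stops at
    # <EOT> (hence "<EOT>" in llm.tokensToClear above).
    return f"{FIM_PREFIX}{before_cursor}{FIM_SUFFIX}{after_cursor}{FIM_MIDDLE}"

print(build_fim_prompt("def add(a, b):\n    ", "\n    return total"))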
github-actions bot commented Nov 24, 2023

This issue is stale because it has been open for 30 days with no activity.

github-actions bot added the stale label Nov 24, 2023
KeyvNari commented

Hello, I also get the same error. Did you find a solution for it?

jalalirs (Author) commented

Unfortunately not yet; I am still waiting for an answer or an update.

github-actions bot removed the stale label Nov 28, 2023
McPatate (Member) commented

Hello, did you check the TGI logs? I assume the response body is not formatted correctly; there may be an issue with the way the response is parsed.

jalalirs (Author) commented Dec 1, 2023

The TGI output is fine and can be consumed by both LangChain and chat-ui. TGI serving codellama-34 also responds fine to a plain Python requests call.
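
The call looks roughly like this (an illustrative sketch, not my exact script; host, port, and parameters are placeholders):

# Illustrative sketch of the direct TGI call that works (placeholders
# for host/port; inputs and parameters are examples, not exact values).
import requests

resp = requests.post(
    "https://<host>:<port>/generate",
    json={
        "inputs": "def fibonacci(n):",
        "parameters": {"max_new_tokens": 64, "temperature": 0.2},
    },
    verify=False,  # same effect as llm.tlsSkipVerifyInsecure
    timeout=30,
)
print(resp.status_code)               # 200
print(resp.text)                      # raw body, valid JSON
print(resp.json()["generated_text"])  # TGI returns {"generated_text": ...}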


github-actions bot commented Jan 1, 2024

This issue is stale because it has been open for 30 days with no activity.

github-actions bot added the stale label Jan 1, 2024