Error with Custom Local Model #1075
I think you should be able to use the existing OpenAI connection class and just set
Thank you @tom-doerr! I tried using the OpenAI conn class, but I am getting another error:

```
---------------------------------------------------------------------------
File /opt/homebrew/Caskroom/miniforge/base/envs/food.copilot/lib/python3.12/site-packages/dspy/primitives/program.py:26, in Module.__call__(self, *args, **kwargs)
File /opt/homebrew/Caskroom/miniforge/base/envs/food.copilot/lib/python3.12/site-packages/dspy/functional/functional.py:180, in TypedPredictor.forward(self, **kwargs)
File /opt/homebrew/Caskroom/miniforge/base/envs/food.copilot/lib/python3.12/site-packages/dspy/predict/predict.py:49, in Predict.__call__(self, **kwargs)
File /opt/homebrew/Caskroom/miniforge/base/envs/food.copilot/lib/python3.12/site-packages/dspy/predict/predict.py:91, in Predict.forward(self, **kwargs)

TypeError: 'NoneType' object is not iterable
```
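For what it's worth, that failing step can be reproduced in isolation. This is my own minimal sketch, not DSPy code: `Predict.forward` iterates over the completions the backend returned, and if a misconfigured or unreachable backend hands back `None` instead of a list, this is exactly the `TypeError` you get.

```python
# Minimal reproduction of the failure mode (my own sketch, not DSPy code).
# If the LM backend returns None instead of a list of completions, any
# iteration over it raises the TypeError seen in the traceback above.
completions = None  # what a misconfigured or unreachable backend can yield

try:
    parsed = [c for c in completions]
except TypeError as e:
    print(e)  # 'NoneType' object is not iterable
```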
The server does not seem to send you any completion; you could try to capture the error message the server might be sending back.
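One way to do that, bypassing both DSPy and the `openai` SDK, is a raw request with only the standard library. A sketch, assuming LM Studio's default local endpoint and the model name from this thread; the `build_chat_request` helper is hypothetical, not part of any library:

```python
import json
import urllib.request

BASE_URL = "http://localhost:1234/v1"  # LM Studio's default local endpoint

def build_chat_request(base_url, model, messages):
    """Build a plain OpenAI-style chat completion request (hypothetical helper)."""
    payload = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        url=base_url.rstrip("/") + "/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json",
                 "Authorization": "Bearer lm-studio"},  # LM Studio ignores the key
        method="POST",
    )

req = build_chat_request(
    BASE_URL,
    "TheBloke/Mistral-7B-Instruct-v0.2-GGUF",
    [{"role": "user", "content": "Introduce yourself."}],
)
try:
    with urllib.request.urlopen(req, timeout=10) as resp:
        # Print the raw body: any server-side error message shows up here.
        print(json.loads(resp.read()))
except OSError as e:
    # If this fails, DSPy was never going to get a completion either.
    print("server unreachable:", e)
```

If even this raw request never reaches the server, the problem is the endpoint configuration (or caching, see below), not DSPy's parsing.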
I had the same problem. In particular, I first set the API endpoint to `http://localhost:1234/v1`, which is incorrect. But then I got stuck, because when I tried `completion_text = lm('This is a test')`, my server did not receive any request. It is, however, possible to send a request directly with `dsp.settings.lm.request("??")`. So my guess is that the misconfigured LM is cached somewhere.

Update: removing the folder `cachedir = os.environ.get('DSP_CACHEDIR') or os.path.join(Path.home(), 'cachedir_joblib')` seems to have solved my problem.
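For anyone else hitting this, a small sketch of that cleanup, using the same path logic as the snippet above (the `dsp_cache_dir` helper is mine, just for illustration):

```python
import os
import shutil
from pathlib import Path

def dsp_cache_dir(env=None):
    """Resolve the DSP cache folder the same way the snippet above does."""
    env = os.environ if env is None else env
    return env.get("DSP_CACHEDIR") or os.path.join(Path.home(), "cachedir_joblib")

cachedir = dsp_cache_dir()
print("DSP cache directory:", cachedir)

# Delete it only if it exists; the cache is rebuilt on the next run.
if os.path.isdir(cachedir):
    shutil.rmtree(cachedir)
```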
Yes, I believe the caching needs to be improved to avoid misconfigured LMs breaking behavior, although I believe the different
Hello,
I am facing an issue with dspy using a custom LM. The LM is Mistral-7B-Instruct-v0.2, deployed on a local inference server with LM Studio. According to LM Studio, this is how the model is called over the API locally:
```python
# Example: reuse your existing OpenAI setup
from openai import OpenAI

# Point to the local server
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

completion = client.chat.completions.create(
    model="TheBloke/Mistral-7B-Instruct-v0.2-GGUF",
    messages=[
        {"role": "system", "content": "Always answer in rhymes."},
        {"role": "user", "content": "Introduce yourself."},
    ],
    temperature=0.7,
)

print(completion.choices[0].message)
```
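For context, this is roughly how I am pointing DSPy at that server. A sketch based on the DSPy 2.x `dspy.OpenAI` client, so treat the parameter names as assumptions; the `api_base` and `api_key` values just mirror the LM Studio example above:

```python
import dspy

# Assumption: dspy.OpenAI accepts an OpenAI-compatible base URL;
# LM Studio ignores the API key but the client requires a non-empty one.
local_lm = dspy.OpenAI(
    model="TheBloke/Mistral-7B-Instruct-v0.2-GGUF",
    api_base="http://localhost:1234/v1/",
    api_key="lm-studio",
    model_type="chat",  # use the chat endpoint, matching the example above
)
dspy.settings.configure(lm=local_lm)
```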
I am running an extraction use case on YouTube transcripts. I have defined several classes to help me with the task, which I am calling like this:
However, the TypedPredictor returns:
````
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
Cell In[7], line 1
----> 1 prediction = cot_predictor(
      2     context=context_description,
      3     transcript=transcript)

File /opt/homebrew/Caskroom/miniforge/base/envs/food.copilot/lib/python3.12/site-packages/dspy/primitives/program.py:26, in Module.__call__(self, *args, **kwargs)
     25 def __call__(self, *args, **kwargs):
---> 26     return self.forward(*args, **kwargs)

File /opt/homebrew/Caskroom/miniforge/base/envs/food.copilot/lib/python3.12/site-packages/dspy/functional/functional.py:190, in TypedPredictor.forward(self, **kwargs)
    188     value = completion[name]
    189     parser = field.json_schema_extra.get("parser", lambda x: x)
--> 190     parsed[name] = parser(value)
    191 except (pydantic.ValidationError, ValueError) as e:
    192     errors[name] = _format_error(e)

File /opt/homebrew/Caskroom/miniforge/base/envs/food.copilot/lib/python3.12/site-packages/dspy/functional/functional.py:152, in TypedPredictor.prepare_signature.<locals>.<lambda>(x, from_json)
    145 from_json = lambda x, type_=type_: type_.model_validate_json(x)
    146 schema = json.dumps(type_.model_json_schema())
    147 signature = signature.with_updated_fields(
    148     name,
    149     desc=field.json_schema_extra.get("desc", "")
    150     + (". Respond with a single JSON object. JSON Schema: " + schema),
    151     format=lambda x, to_json=to_json: (x if isinstance(x, str) else to_json(x)),
--> 152     parser=lambda x, from_json=from_json: from_json(_unwrap_json(x)),
    153     type_=type_,
    154 )
    155 else:  # If input field
    156     format_ = lambda x: x if isinstance(x, str) else str(x)

File /opt/homebrew/Caskroom/miniforge/base/envs/food.copilot/lib/python3.12/site-packages/dspy/functional/functional.py:282, in _unwrap_json(output)
    281 def _unwrap_json(output):
--> 282     output = output.strip()
    283     if output.startswith("```"):
    284         if not output.startswith("```json"):

AttributeError: 'builtin_function_or_method' object has no attribute 'strip'
````
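That final `AttributeError` can be reproduced in miniature. My own sketch, not DSPy code: `_unwrap_json` assumes `output` is a string, but here it received a builtin function/method object instead, so the very first `.strip()` call fails before any JSON parsing starts.

```python
# Sketch of the failure in _unwrap_json (my reproduction, not DSPy code):
# the function expects a str, but got a builtin function/method instead.
output = "".strip  # a builtin_function_or_method, not a str

try:
    output.strip()  # what _unwrap_json tries first
except AttributeError as e:
    print(e)  # 'builtin_function_or_method' object has no attribute 'strip'
```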