BaseLLM bug #47
Hey! @ibiscp, do you know what could be causing this? I tested it here and it happened to me too. Maybe the output to OpenAI shouldn't be str.
I'll debug that.
In frontend/src/CustomNodes/GenericNode.js, L84 is problematic because it produces an id in a form like this

or, more likely,

will fix the problem in the current implementation. We also need to fix the hard-coded "type:str" notation in ParameterComponent. I'd also like to find a way to actually use the type parameter, rather than just parsing the handle's id field, when we validate connections between nodes (the current implementation is likely to break in the future if we introduce a way to manage custom components: using reserved separators in a component id will break the validation), but I'm not sure that's doable, as this is my first time using the reactflow library. May I open a PR from my fork to resolve this issue, or is someone already working on it?
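The separator concern can be illustrated with a small sketch. Everything here is hypothetical: the "|" separator, the field order, and the function names are illustrative only, not Langflow's actual handle id format.

```javascript
// Hypothetical sketch: why validating connections by re-parsing a
// delimiter-joined handle id is fragile, and a safer alternative.

// Build a handle id by joining fields with a reserved separator (illustrative).
function buildHandleId(nodeName, typeName) {
  return [nodeName, typeName].join("|");
}

// Validation that re-parses the id assumes no field contains the separator.
function parseHandleType(handleId) {
  return handleId.split("|")[1];
}

// parseHandleType(buildHandleId("OpenAI", "str")) works as expected,
// but a custom component named "My|Custom" would shift the fields and
// make the parsed "type" wrong.

// Safer alternative: keep the type on the node's data object and compare
// it directly (e.g. inside reactflow's isValidConnection callback),
// instead of recovering it from the id string.
function typesMatch(sourceData, targetData) {
  return sourceData.outputType === targetData.inputType;
}
```

This is why carrying the type as structured node data, rather than encoding it into the handle id, would survive the introduction of arbitrarily named custom components.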
Thanks for the help! I'll test it out now.
I fixed the tooltip part, but changing that will take further investigation afterwards if needed :)
Ok. I'll open a discussion based on your comment so we can maybe think of a better solution.
Fixed in version 0.0.44 |
Original report: I can't connect the model to any tool or to LLMMathChain. The OpenAI model's input type is str. (version 0.0.40 on Windows)