Rendering non-chat flows #1492
Based on your request, it seems you want to execute non-chat workflows directly on the frontend without having to export the JSON workflow into your local environment. Here's a detailed guide on how you can achieve this:
Here's a simplified example of how you might modify the `executeWorkflow` function so it sends a workflow execution request over the existing WebSocket connection:

```typescript
async function executeWorkflow(data: WorkflowExecutionProps): Promise<void> {
  try {
    if (ws.current) {
      const workflowExecutionRequest = {
        type: "execute_workflow", // indicate this is a workflow execution request
        workflowId: data.workflowId,
        inputs: data.inputs,
      };
      ws.current.send(JSON.stringify(workflowExecutionRequest));
    }
  } catch (error) {
    setErrorData({
      title: "There was an error executing the workflow",
      list: [(error as { message: string }).message],
    });
  }
}
```

In this example, the frontend sends an `execute_workflow` message over the same WebSocket connection already used for chat. Remember, this approach requires both frontend and backend modifications: the backend must be capable of receiving workflow execution requests, processing them, and sending back the results.

I hope this helps! If you have any further questions, feel free to ask.
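To illustrate the backend side, here is a minimal sketch of how a handler might dispatch the `execute_workflow` message described above. This is a hypothetical example, not Langflow's actual backend API: `handleSocketMessage` and `runWorkflow` are invented names, and a real implementation would load and execute the stored workflow graph instead of echoing the inputs.

```typescript
// Message shape mirrors the frontend example above.
interface WorkflowExecutionRequest {
  type: "execute_workflow";
  workflowId: string;
  inputs: Record<string, unknown>;
}

// Stand-in for actually running the flow; a real backend would
// execute the workflow graph identified by workflowId here.
function runWorkflow(
  workflowId: string,
  inputs: Record<string, unknown>
): unknown {
  return { workflowId, received: Object.keys(inputs) };
}

// Called for each raw WebSocket message; returns the JSON reply
// the server would send back over the same socket.
function handleSocketMessage(raw: string): string {
  const msg = JSON.parse(raw) as { type: string } & Partial<WorkflowExecutionRequest>;
  if (msg.type === "execute_workflow" && msg.workflowId && msg.inputs) {
    const result = runWorkflow(msg.workflowId, msg.inputs);
    return JSON.stringify({ type: "workflow_result", result });
  }
  // Fall through to the existing chat handling, or report an error.
  return JSON.stringify({ type: "error", detail: `unhandled message type: ${msg.type}` });
}
```

Dispatching on a `type` field like this lets the one WebSocket endpoint serve both chat and non-chat flows without a separate HTTP route.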
We have also built and tested OpenAI's Whisper in a non-chat flow, directly calling it through an API as you mentioned. Indeed, there is a critical need for a feature that allows testing inputs and outputs in a flow that does not include the LLM chain.
I completely agree. We're actively working on this and, to be honest, we're already in the testing phase. We anticipate it will be available in just a few weeks at most.
Hi Team, are there any plans to render the output of non-chat flows? Currently it seems the end node has to be some form of LLMChain to which the chat functionality can connect. I currently have to export the JSON workflow into my local environment and execute it there. Having this functionality on the frontend, if possible, would be great!