
Rendering non-chat flows #1492

Closed
kencheah opened this issue Mar 4, 2024 · 3 comments
Labels
enhancement New feature or request

Comments

@kencheah

kencheah commented Mar 4, 2024

Hi Team, are there any plans to render the output of non-chat flows? Currently it seems the end node has to be some form of LLMChain to which the chat functionality can connect. I currently have to export the JSON workflow to my local environment and execute it there. Having this functionality on the frontend, if possible, would be great!


dosubot bot commented Mar 4, 2024

Based on your request, it seems you want to execute non-chat workflows directly on the frontend without having to export the JSON workflow into your local environment. Here's a detailed guide on how you can achieve this:

  1. WebSocket Connection: The WebSocket connection established in the connectWS function is used to communicate with the backend in real-time. You can utilize this connection to send and receive messages related to non-chat workflows.

  2. Sending Workflow Execution Requests: You can modify the sendAll function or create a similar function dedicated to non-chat workflows. This function should format the request with the necessary workflow details and inputs, then send it through the WebSocket connection. The backend should be able to differentiate between chat and non-chat workflow requests.

  3. Receiving Workflow Execution Responses: In the handleWsMessage function, add logic to handle responses from the backend related to non-chat workflow executions. This could involve updating the UI with the results of the workflow execution or handling any errors that occurred (see the sketch after the example below).

  4. UI Integration: Integrate a UI component or modal that allows users to input the necessary details for executing a non-chat workflow. This could be a form that captures the workflow ID, input parameters, and any other necessary information. Upon submission, use the modified sendAll function to send the execution request to the backend (a minimal form sketch is included further below).

  5. Backend Support: Ensure that your backend is capable of handling requests for non-chat workflow executions. This includes parsing the request, executing the specified workflow with the provided inputs, and sending back the results or any errors.

Here's a simplified example of how you might modify the sendAll function to handle non-chat workflow execution requests:

// Shape of the execution request; define the fields to match your workflow's inputs.
interface WorkflowExecutionProps {
  workflowId: string;
  inputs: Record<string, unknown>;
}

async function executeWorkflow(data: WorkflowExecutionProps): Promise<void> {
  try {
    // ws is assumed to be the ref holding the open WebSocket connection from connectWS
    if (ws.current) {
      const workflowExecutionRequest = {
        type: 'execute_workflow', // Indicate this is a workflow execution request
        workflowId: data.workflowId,
        inputs: data.inputs,
      };
      ws.current.send(JSON.stringify(workflowExecutionRequest));
    }
  } catch (error) {
    setErrorData({
      title: "There was an error executing the workflow",
      list: [(error as { message: string }).message],
    });
  }
}

In this example, WorkflowExecutionProps is a TypeScript interface you would define based on the inputs required to execute a workflow (a minimal stub is shown above). The executeWorkflow function formats the request and sends it through the WebSocket connection. You would need to adjust the backend to handle this new request type and execute the specified workflow.
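For step 3, on the receiving side, here is an equally rough sketch of how the handleWsMessage function could branch on a hypothetical execution_result message type. The message shape (type, error, outputs) and the setWorkflowResult state setter are assumptions for illustration, not part of the existing langflow code; setErrorData is assumed to be the same error helper used above.

function handleWsMessage(event: MessageEvent): void {
  const data = JSON.parse(event.data);

  // Hypothetical message type for non-chat workflow results;
  // adjust the field names to whatever the backend actually sends.
  if (data.type === 'execution_result') {
    if (data.error) {
      setErrorData({
        title: "The workflow execution failed",
        list: [data.error],
      });
    } else {
      // Store the outputs in state so a results panel or modal can render them
      setWorkflowResult(data.outputs);
    }
    return;
  }

  // ...existing chat message handling continues here
}

In practice you would wire setWorkflowResult to whatever state drives the results view, and the exact fields would follow the contract you define on the backend.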

Remember, this approach requires both frontend and backend modifications. The backend must be capable of receiving workflow execution requests, processing them, and sending back the results.
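For step 4, here is a minimal sketch of what the input form could look like in React, assuming the executeWorkflow function above is in scope. The WorkflowRunnerForm component and its plain JSON textarea are purely illustrative and do not correspond to an existing langflow component:

import React, { useState } from 'react';

function WorkflowRunnerForm(): JSX.Element {
  const [workflowId, setWorkflowId] = useState('');
  const [inputsJson, setInputsJson] = useState('{}');

  const handleSubmit = async (event: React.FormEvent) => {
    event.preventDefault();
    // Hand the collected values to the executeWorkflow sketch above
    await executeWorkflow({ workflowId, inputs: JSON.parse(inputsJson) });
  };

  return (
    <form onSubmit={handleSubmit}>
      <input
        value={workflowId}
        onChange={(e) => setWorkflowId(e.target.value)}
        placeholder="Workflow ID"
      />
      <textarea
        value={inputsJson}
        onChange={(e) => setInputsJson(e.target.value)}
        placeholder='Inputs as JSON, e.g. {"text": "hello"}'
      />
      <button type="submit">Run workflow</button>
    </form>
  );
}

A real implementation would validate the JSON before parsing it and surface parse errors through setErrorData instead of letting JSON.parse throw.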

Regarding the current structure of the WebSocket connection in the connectWS function and how it is used in the langflow codebase, the ChatService class in the src/backend/langflow/services/chat/service.py file manages WebSocket connections and handles messaging between the server and clients. It maintains a dictionary of active WebSocket connections (self.active_connections) indexed by client IDs. Each entry in this dictionary is a WebSocket object representing the connection to a client. The ChatService class also handles connection and disconnection, message sending, connection closure, message processing and history management, and WebSocket event handling.

As for the sendAll function, it does not appear in the provided context of the langflow codebase (src/backend/langflow/services/chat/service.py). The snippet provided outlines the structure and functionality of the ChatService class, which includes methods for managing chat connections, sending messages, processing messages, and handling WebSocket connections, but it does not include a sendAll function.

I hope this helps! If you have any further questions, feel free to ask.


About Dosu

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

@kencheah kencheah changed the title from "Rendering non-chat workflows" to "Rendering non-chat flows" Mar 4, 2024
@dosubot dosubot bot added the enhancement New feature or request label Mar 4, 2024
@YamonBot
Contributor

We have also built and tested OpenAI's Whisper in a non-chat flow, directly calling it through an API as you mentioned. Indeed, there is a critical need for a feature that allows testing inputs and outputs in a flow that does not include the LLM chain.

@anovazzi1
Contributor

I completely agree. We're actively working on this and, to be honest, we're already in the testing phase. We anticipate it will be available in just a few weeks at most.

@dosubot dosubot bot added the stale label Jun 20, 2024
@dosubot dosubot bot closed this as not planned (won't fix, can't repro, duplicate, stale) Jun 27, 2024
@dosubot dosubot bot removed the stale label Jun 27, 2024