
[bug] OpenFunctions-v2: how to continue conversation? #488

Closed
tybalex opened this issue Jun 26, 2024 · 1 comment
Labels
hosted-openfunctions-v2 Issues with OpenFunctions-v2

Comments


tybalex commented Jun 26, 2024

This is more a question than a bug: what should I do after the model returns a function call?

I followed this example:

To help with quick prototyping, we provide a hosted Gorilla Openfunctions-v2 model for inference. You can also run it locally or self-host it by downloading the model from [HuggingFace](https://huggingface.co/gorilla-llm/gorilla-openfunctions-v2). The example below demonstrates how to invoke the hosted Gorilla Openfunctions-v2 model:
import openai

def get_gorilla_response(prompt="", model="gorilla-openfunctions-v2", functions=[]):
    openai.api_key = "EMPTY"  # Hosted for free with ❤️ from UC Berkeley
    openai.api_base = "https://luigi.millennium.berkeley.edu:8000/v1"
    try:
        completion = openai.ChatCompletion.create(
            model=model,
            temperature=0.0,
            messages=[{"role": "user", "content": prompt}],
            functions=functions,
        )
        # completion.choices[0].message.content: string form of the function call
        # completion.choices[0].message.functions: JSON form of the function call
        return completion.choices[0]
    except Exception as e:
        print(e)
Prompt the model:
What's the weather like in the two cities of Boston and San Francisco?
Format your function call: The model will return the function call based on your request.
query = "What's the weather like in the two cities of Boston and San Francisco?"
functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA",
                },
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    }
]
Get Your Function Call: The model will return a Python function call based on your request.
This opens up possibilities for developers and non-developers alike, allowing them to leverage complex functionalities without writing extensive code.
Input:

get_gorilla_response(prompt=query, functions=functions)
Output:

[get_current_weather(location='Boston, MA'), get_current_weather(location='San Francisco, CA')]

Assuming I have functions to check the weather of Boston and SF, how do I append those results back and continue the conversation?
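For context, here is a minimal sketch of how I could run the returned call string locally before replying. The dispatch table and the `get_current_weather` stub are my own assumptions (the real lookup would hit a weather API); the model returns plain Python-call syntax, which `ast` can parse without resorting to `eval`:

```python
import ast

def get_current_weather(location, unit="fahrenheit"):
    # Stub standing in for a real weather lookup.
    return f"72 degrees {unit} in {location}"

# Map function names the model may emit to local callables.
DISPATCH = {"get_current_weather": get_current_weather}

def run_calls(call_string):
    """Parse a string like "[f(x=1), g(y=2)]" and run each call."""
    tree = ast.parse(call_string, mode="eval")
    results = []
    for call in tree.body.elts:  # expects a list literal of Call nodes
        func = DISPATCH[call.func.id]
        kwargs = {kw.arg: ast.literal_eval(kw.value) for kw in call.keywords}
        results.append(func(**kwargs))
    return results
```

With the output above, `run_calls("[get_current_weather(location='Boston, MA'), get_current_weather(location='San Francisco, CA')]")` returns one result per call, ready to feed back into the conversation.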

@tybalex tybalex added the hosted-openfunctions-v2 Issues with OpenFunctions-v2 label Jun 26, 2024
@ShishirPatil
Copy link
Owner

Thanks for trying it out @tybalex
Let me preface this by mentioning that OpenFunctions-v2 is currently tested for single-turn use only, so multi-turn behavior is ill-defined. If you still want to try it, you would do it like this:

completion = openai.ChatCompletion.create(
    model="gorilla-openfunctions-v2",
    temperature=0.0,
    messages=[
        {"role": "user", "content": prompt_1},
        {"role": "assistant", "content": response_1},
        {"role": "user", "content": prompt_2},
    ],
    functions=functions,
)

where prompt_1 is your original input, response_1 is the function call returned by the LLM, and prompt_2 is however you want to continue the conversation. Hope this helps!
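Putting the pieces together, a sketch of the full second turn might look like the helper below. The `build_followup_messages` name and the "Here are the results:" framing are my own conventions, not part of the OpenFunctions API, and since v2 is only tested single-turn, treat this as experimental:

```python
def build_followup_messages(prompt_1, response_1, results):
    """Replay turn one and hand locally computed function results back.

    prompt_1   -- the original user prompt
    response_1 -- the function-call string the model returned
    results    -- the outputs of running those calls yourself
    """
    return [
        {"role": "user", "content": prompt_1},
        {"role": "assistant", "content": response_1},
        {"role": "user", "content": "Here are the results: " + str(results)},
    ]

# These messages would then be passed as messages= to
# openai.ChatCompletion.create(model="gorilla-openfunctions-v2", ...).
```

The design choice here is simply to surface the tool results as a plain user turn, since the hosted model predates the dedicated tool/function-result roles.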
