Question: Hugging Face API #2229
Hello @dyomed93, I'm here to assist you with any questions or issues you may have. I'm a bot that can help you solve bugs, answer questions, and become a contributor. Let's see how I can help you today!

You can find the implementation of the "Hugging Face API" module, including the Endpoint URL and model fields, in the Langflow source. Here is the relevant code snippet:

```python
from typing import Optional

from langchain.llms.base import BaseLLM
from langchain.llms.huggingface_endpoint import HuggingFaceEndpoint

from langflow import CustomComponent


class HuggingFaceEndpointsComponent(CustomComponent):
    display_name: str = "Hugging Face Inference API"
    description: str = "LLM model from Hugging Face Inference API."

    def build_config(self):
        return {
            "endpoint_url": {"display_name": "Endpoint URL", "password": True},
            "task": {
                "display_name": "Task",
                "options": ["text2text-generation", "text-generation", "summarization"],
            },
            "huggingfacehub_api_token": {"display_name": "API token", "password": True},
            "model_kwargs": {
                "display_name": "Model Keyword Arguments",
                "field_type": "code",
            },
            "code": {"show": False},
        }

    def build(
        self,
        endpoint_url: str,
        task: str = "text2text-generation",
        huggingfacehub_api_token: Optional[str] = None,
        model_kwargs: Optional[dict] = None,
    ) -> BaseLLM:
        try:
            output = HuggingFaceEndpoint(  # type: ignore
                endpoint_url=endpoint_url,
                task=task,
                huggingfacehub_api_token=huggingfacehub_api_token,
                model_kwargs=model_kwargs or {},
            )
        except Exception as e:
            raise ValueError("Could not connect to HuggingFace Endpoints API.") from e
        return output
```

This class defines the configuration for the Hugging Face API, including the endpoint URL, task, API token, and model keyword arguments. Additionally, the Langflow repository provides specific examples and sample configurations for using the Hugging Face API module with different models, including Mistral 7B. Here is a sample configuration for the MistralAIModelComponent:

```python
def build_config(self):
    return {
        "input_value": {"display_name": "Input", "input_types": ["Text", "Record", "Prompt"]},
        "max_tokens": {
            "display_name": "Max Tokens",
            "advanced": True,
            "info": "The maximum number of tokens to generate. Set to 0 for unlimited tokens.",
        },
        "model_name": {
            "display_name": "Model Name",
            "advanced": False,
            "options": [
                "open-mistral-7b",
                "open-mixtral-8x7b",
                "open-mixtral-8x22b",
                "mistral-small-latest",
                "mistral-medium-latest",
                "mistral-large-latest",
            ],
            "value": "open-mistral-7b",
        },
        "mistral_api_base": {
            "display_name": "Mistral API Base",
            "advanced": True,
            "info": (
                "The base URL of the Mistral API. Defaults to https://api.mistral.ai.\n\n"
                "You can change this to use other APIs like JinaChat, LocalAI and Prem."
            ),
        },
        "mistral_api_key": {
            "display_name": "Mistral API Key",
            "info": "The Mistral API Key to use for the Mistral model.",
            "advanced": False,
            "password": True,
        },
        "temperature": {
            "display_name": "Temperature",
            "advanced": False,
            "value": 0.1,
        },
        "stream": {
            "display_name": "Stream",
            "info": STREAM_INFO_TEXT,
            "advanced": True,
        },
        "system_message": {
            "display_name": "System Message",
            "info": "System message to pass to the model.",
            "advanced": True,
        },
        "max_retries": {
            "display_name": "Max Retries",
            "advanced": True,
        },
        "timeout": {
            "display_name": "Timeout",
            "advanced": True,
        },
        "max_concurrent_requests": {
            "display_name": "Max Concurrent Requests",
            "advanced": True,
        },
        "top_p": {
            "display_name": "Top P",
            "advanced": True,
        },
        "random_seed": {
            "display_name": "Random Seed",
            "advanced": True,
        },
        "safe_mode": {
            "display_name": "Safe Mode",
            "advanced": True,
        },
    }
```

This configuration includes the model "open-mistral-7b" among other options [2][3].
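On the specific question of what goes in the Endpoint URL field, here is a minimal sketch. It assumes the standard hosted Hugging Face Inference API URL pattern (base URL followed by a model repo ID); the Mistral repo ID and the model kwargs shown are illustrative examples, not values confirmed by this issue:

```python
# Hypothetical helper (not part of Langflow): compose an Endpoint URL for a
# model hosted on the Hugging Face Inference API from its repo ID.
HF_INFERENCE_BASE = "https://api-inference.huggingface.co/models"


def endpoint_url_for(model_id: str) -> str:
    # The component's Endpoint URL field expects a full URL; for hosted models
    # this is the Inference API base followed by the model's repo ID.
    return f"{HF_INFERENCE_BASE}/{model_id}"


# Example values one might enter in the component's fields for Mistral 7B
# (model ID and kwargs are assumptions for illustration):
example_config = {
    "endpoint_url": endpoint_url_for("mistralai/Mistral-7B-Instruct-v0.2"),
    "task": "text-generation",
    "model_kwargs": {"max_new_tokens": 256, "temperature": 0.1},
}
print(example_config["endpoint_url"])
# → https://api-inference.huggingface.co/models/mistralai/Mistral-7B-Instruct-v0.2
```

A dedicated (paid) Inference Endpoint would instead use the unique URL shown in its Hugging Face dashboard in place of the hosted pattern above.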
Hi,
I'm using Langflow to create a chatbot based on Mistral 7B, but I can't find any documentation or examples for the "Hugging Face API" module in Langflow, nor what exact values to put in the Endpoint URL or model fields.
Where can I find an example or some documentation about that?
Thank you in advance.