AI Agent simplifies the implementation and use of generative AI with LangChain. It was inspired by the AutoGen project.
Use the package manager pip to install AI Agent.
```shell
pip install ai_enterprise_agent
```
```python
import asyncio

from ai_enterprise_agent.agent import Agent
from ai_enterprise_agent.interface.settings import (CHAIN_TYPE, DATABASE_TYPE,
                                                    DIALECT_TYPE, LLM_TYPE,
                                                    PROCESSING_TYPE, VECTOR_STORE_TYPE)

agent = Agent({
    'processing_type': PROCESSING_TYPE.single,
    'chains': [CHAIN_TYPE.simple_chain],
    'model': {
        "type": LLM_TYPE.azure,
        "api_key": <api_key>,
        "model": <model>,
        "endpoint": <endpoint>,
        "api_version": <api_version>,
        "temperature": 0.0
    },
    "system": {
        "system_message": ""
    },
})

response = asyncio.run(
    agent._call(
        input={
            "question": "Who's Leonardo da Vinci?",
            "chat_thread_id": "<chat_thread_id>"
        }
    )
)
print(response)
```
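Rather than hardcoding the Azure credentials in the configuration dictionary, they can be read from environment variables. A minimal sketch, assuming the variable names below (they are illustrative, not prescribed by the library):

```python
import os

def build_model_config():
    """Build the 'model' section of the Agent config from environment
    variables instead of hardcoded secrets (variable names are assumptions)."""
    return {
        "type": "azure",  # stands in for LLM_TYPE.azure
        "api_key": os.environ.get("AZURE_OPENAI_API_KEY", ""),
        "model": os.environ.get("AZURE_OPENAI_MODEL", ""),
        "endpoint": os.environ.get("AZURE_OPENAI_ENDPOINT", ""),
        "api_version": os.environ.get("AZURE_OPENAI_API_VERSION", ""),
        "temperature": 0.0,
    }

os.environ["AZURE_OPENAI_MODEL"] = "gpt-4o"
config = build_model_config()
print(config["model"])  # → gpt-4o
```

The resulting dictionary can then be passed as the `'model'` entry when constructing the `Agent`.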
When using an LLM in Orchestrator Mode, the agent finds the best way to answer the question from your knowledge base.
```python
agent = Agent({
    'processing_type': PROCESSING_TYPE.orchestrator,
    'chains': [CHAIN_TYPE.simple_chain, CHAIN_TYPE.sql_chain],
    'model': {
        "type": LLM_TYPE.azure,
        "api_key": <api_key>,
        "model": <model>,
        "endpoint": <endpoint>,
        "api_version": <api_version>,
        "temperature": 0.0
    },
    "database": {
        "type": DIALECT_TYPE.postgres,
        "host": <host>,
        "port": <port>,
        "username": <username>,
        "password": <password>,
        "database": <database>,
        "includes_tables": ['table-1', 'table-2'],
    },
    "system": {
        "system_message": ""
    },
})

response = asyncio.run(
    agent._call(
        input={
            "question": "How many employees are there?",
            "chat_thread_id": "<chat_thread_id>"
        }
    )
)
print(response)
```
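Both examples above pass a `"<chat_thread_id>"` placeholder; the name suggests it groups the turns of one conversation. The library does not prescribe a format here, so generating a fresh UUID per conversation and reusing it across calls is one reasonable convention (an assumption, not a documented requirement):

```python
import uuid

# Hypothetical convention: one UUID per conversation, reused for every turn.
chat_thread_id = str(uuid.uuid4())

# Reuse the same id for each _call in the conversation, e.g.:
# agent._call(input={"question": "...", "chat_thread_id": chat_thread_id})
print(len(chat_thread_id))  # → 36
```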
If you've ever wanted to contribute to open source and a great cause, now is your chance! See the contributing docs for more information.
- JP. Nobrega 💬 📖 👀 📢
- Túlio César Gaio 💬 📖 👀 📢