LlamaPacks: Add memary llamapack #13968

Open · wants to merge 24 commits into base: main

Commits (24):
- 58ce34b init: first package (seyeong-han, Jun 5, 2024)
- 4c62740 docs: add llamahub info (seyeong-han, Jun 6, 2024)
- f6c6db1 docs: versioning python and memary (seyeong-han, Jun 6, 2024)
- d878bd0 docs: author's info (seyeong-han, Jun 6, 2024)
- 69dff3e docs: add poetry dependencies (seyeong-han, Jun 6, 2024)
- 054c71f init: first README (seyeong-han, Jun 6, 2024)
- 4d710f9 init: first memary/agent (seyeong-han, Jun 6, 2024)
- ceef367 init: first memary/memory (seyeong-han, Jun 6, 2024)
- 7f6e32c init: first memary/synonym_expand (seyeong-han, Jun 6, 2024)
- b25fdf3 init: memory module tests for pytest (seyeong-han, Jun 6, 2024)
- fc284d1 init: memory module tests for pytest (seyeong-han, Jun 6, 2024)
- 7fd8c50 init: streamlit example (seyeong-han, Jun 6, 2024)
- 577f4e2 init: memary __init__.py (seyeong-han, Jun 6, 2024)
- d75312c init: first agent/llm_api (seyeong-han, Jun 6, 2024)
- 9919418 refactor: run `make lint` (seyeong-han, Jun 7, 2024)
- f5d3b4c fix: add target sources (seyeong-han, Jun 7, 2024)
- 495ea8a fix: add poetry_requirements after running `pants tailor` (seyeong-han, Jun 12, 2024)
- aefa471 refactor: remove and replace with memary library (seyeong-han, Jun 17, 2024)
- 8318047 refactor: follow dependencies with memary library (seyeong-han, Jun 17, 2024)
- 5057b8f init: add MemaryChatAgentPack as base (seyeong-han, Jun 17, 2024)
- 2e4cfba refactor: import MemaryChatAgentPack only (seyeong-han, Jun 17, 2024)
- fa57662 refactor: add test_class (seyeong-han, Jun 17, 2024)
- ac5e65e refactor: rename ChatAgent to MemaryChatAgentPack (seyeong-han, Jun 17, 2024)
- bdd97bc build files (logan-markewich, Jun 18, 2024)
153 changes: 153 additions & 0 deletions llama-index-packs/llama-index-packs-memary/.gitignore
@@ -0,0 +1,153 @@
llama_index/_static
.DS_Store
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
bin/
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
etc/
include/
lib/
lib64/
parts/
sdist/
share/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
.ruff_cache

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# Jupyter Notebook
.ipynb_checkpoints
notebooks/

# IPython
profile_default/
ipython_config.py

# pyenv
.python-version

# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock

# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/

# Celery stuff
celerybeat-schedule
celerybeat.pid

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
pyvenv.cfg

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/

# Jetbrains
.idea
modules/
*.swp

# VsCode
.vscode

# pipenv
Pipfile
Pipfile.lock

# pyright
pyrightconfig.json
7 changes: 7 additions & 0 deletions llama-index-packs/llama-index-packs-memary/BUILD
@@ -0,0 +1,7 @@
python_sources(
sources=["llama_index/packs/memary/*.py"],
)

poetry_requirements(
name="poetry",
)
17 changes: 17 additions & 0 deletions llama-index-packs/llama-index-packs-memary/Makefile
@@ -0,0 +1,17 @@
GIT_ROOT ?= $(shell git rev-parse --show-toplevel)

help: ## Show all Makefile targets.
@grep -E '^[a-zA-Z_-]+:.*?## .*$$' $(MAKEFILE_LIST) | awk 'BEGIN {FS = ":.*?## "}; {printf "\033[33m%-30s\033[0m %s\n", $$1, $$2}'

format: ## Run code autoformatters (black).
pre-commit install
git ls-files | xargs pre-commit run black --files

lint: ## Run linters: pre-commit (black, ruff, codespell) and mypy
pre-commit install && git ls-files | xargs pre-commit run --show-diff-on-failure --files

test: ## Run tests via pytest.
pytest tests

watch-docs: ## Build and watch documentation.
sphinx-autobuild docs/ docs/_build/html --open-browser --watch $(GIT_ROOT)/llama_index/
113 changes: 113 additions & 0 deletions llama-index-packs/llama-index-packs-memary/README.md
@@ -0,0 +1,113 @@
# memary Pack

Agents use LLMs that are currently constrained to finite context windows. memary overcomes this limitation by allowing your agents to store a large corpus of information in knowledge graphs, infer user knowledge through our memory modules, and only retrieve relevant information for meaningful responses.

## CLI Usage

You can download llamapacks directly using `llamaindex-cli`, which comes installed with the `llama-index` python package:

```bash
llamaindex-cli download-llamapack memary --download-dir ./memary
```

You can then inspect the files at `./memary` and use them as a template for your own project.

## Demo

**Notes:** memary currently assumes the local installation method and supports any model available through Ollama:

- LLM running locally using Ollama (Llama 3 8B/40B as suggested defaults) **OR** `gpt-3.5-turbo`
- Vision model running locally using Ollama (LLaVA as suggested default) **OR** `gpt-4-vision-preview`

memary will default to the locally run models unless explicitly specified.

**To run the Streamlit app:**

1. [Optional] If running models locally using Ollama, follow the instructions in this [repo](https://github.com/ollama/ollama).

2. Ensure that a `.env` file exists with any necessary API keys and Neo4j credentials.

```
OPENAI_API_KEY="YOUR_API_KEY"
NEO4J_PW="YOUR_NEO4J_PW"
NEO4J_URL="YOUR_NEO4J_URL"
PERPLEXITY_API_KEY="YOUR_API_KEY"
GOOGLEMAPS_API_KEY="YOUR_API_KEY"
ALPHA_VANTAGE_API_KEY="YOUR_API_KEY"
```

3. How to get API keys:

```
OpenAI key: https://openai.com/index/openai-api

Neo4j: https://neo4j.com/cloud/platform/aura-graph-database/?ref=nav-get-started-cta
Click 'Start for free'
Create a free instance
Open the auto-downloaded txt file and use the credentials

Perplexity key: https://www.perplexity.ai/settings/api

Google Maps:
Keys are generated in the 'Credentials' page of the 'APIs & Services' tab of Google Cloud Console https://console.cloud.google.com/apis/credentials

Alpha Vantage: (this key is for getting real time stock data)
https://www.alphavantage.co/support/#api-key
Recommended: use https://10minutemail.com/ to generate a temporary email to sign up with
```

4. Update the user persona in `streamlit_app/data/user_persona.txt`, using the template in `streamlit_app/data/user_persona_template.txt`. Follow the instructions provided and replace the curly brackets with the relevant information.

5. [Optional] Update the system persona, if needed, in `streamlit_app/data/system_persona.txt`.
6. Run:

```
cd streamlit_app
streamlit run app.py
```
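Missing keys in the `.env` from step 2 typically only surface as runtime errors once a tool is invoked, so a small preflight check can fail fast before launching Streamlit. This is a hedged sketch: the key names are copied from the example above, and which ones are actually required depends on the tools you enable.

```python
import os

# Keys listed in the README's .env example; only the keys for the
# tools you actually use need to be present.
REQUIRED_KEYS = [
    "OPENAI_API_KEY",
    "NEO4J_PW",
    "NEO4J_URL",
    "PERPLEXITY_API_KEY",
    "GOOGLEMAPS_API_KEY",
    "ALPHA_VANTAGE_API_KEY",
]


def missing_keys(env=os.environ):
    """Return the subset of REQUIRED_KEYS that is absent or empty in env."""
    return [k for k in REQUIRED_KEYS if not env.get(k)]
```

Run `missing_keys()` after `load_dotenv()` and abort with a clear message if it returns anything.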

## Usage

```python
from dotenv import load_dotenv

load_dotenv()

from llama_index.packs.memary.agent.chat_agent import ChatAgent

system_persona_txt = "data/system_persona.txt"
user_persona_txt = "data/user_persona.txt"
past_chat_json = "data/past_chat.json"
memory_stream_json = "data/memory_stream.json"
entity_knowledge_store_json = "data/entity_knowledge_store.json"

chat_agent = ChatAgent(
"Personal Agent",
memory_stream_json,
entity_knowledge_store_json,
system_persona_txt,
user_persona_txt,
past_chat_json,
)
```

Pass a subset of `['search', 'vision', 'locate', 'stocks']` as `include_from_defaults` to choose which default tools are available upon initialization.

### Adding Custom Tools

```python
def multiply(a: int, b: int) -> int:
"""Multiply two integers and returns the result integer"""
return a * b


chat_agent.add_tool({"multiply": multiply})
```

More information about creating custom tools for the LlamaIndex ReAct Agent can be found [here](https://docs.llamaindex.ai/en/stable/examples/agent/react_agent/).

### Removing Tools

```python
chat_agent.remove_tool("multiply")
```
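The `add_tool`/`remove_tool` calls above amount to a name-to-callable registry. The sketch below illustrates only that pattern; `ToolRegistry` is a hypothetical stand-in, not memary's actual `ChatAgent`, which wraps a LlamaIndex ReAct agent and converts the callables into agent tools.

```python
class ToolRegistry:
    """Hypothetical dict-backed registry mirroring add_tool/remove_tool."""

    def __init__(self):
        self._tools = {}

    def add_tool(self, tools: dict) -> None:
        # Register callables by name, e.g. {"multiply": multiply}.
        self._tools.update(tools)

    def remove_tool(self, name: str) -> None:
        # Removing an unknown tool is a no-op.
        self._tools.pop(name, None)

    def call(self, name: str, *args):
        return self._tools[name](*args)


def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result."""
    return a * b
```

A registry keyed by name keeps tool addition and removal symmetric, which is why `remove_tool` takes only the tool's name.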
1 change: 1 addition & 0 deletions llama-index-packs/llama-index-packs-memary/examples/BUILD
@@ -0,0 +1 @@
python_sources()