Commit
🚀 feat(utils.py): add support for configurable LLM caching
This commit adds support for configurable LLM caching. The `setup_llm_caching` function now imports the cache class named by `settings.cache` from the `langchain.cache` module and, on success, sets `langchain.llm_cache` to an instance of that class. If the import fails, or any other exception is raised during setup, a warning is logged with the error message.
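A minimal sketch of the behavior described above. The `settings` object and its `cache` attribute are hypothetical stand-ins for the project's real settings; the cache class name is assumed to match one of langchain's public cache classes (e.g. `InMemoryCache`), and the exact signature in the commit may differ.

```python
import importlib
import logging
from types import SimpleNamespace

logger = logging.getLogger(__name__)

# Hypothetical settings object; the real project loads this from its config.
settings = SimpleNamespace(cache="InMemoryCache")


def setup_llm_caching() -> None:
    """Point langchain.llm_cache at an instance of the configured cache class."""
    try:
        # Look up the configured class on the langchain.cache module.
        langchain = importlib.import_module("langchain")
        cache_module = importlib.import_module("langchain.cache")
        cache_class = getattr(cache_module, settings.cache)
        langchain.llm_cache = cache_class()
    except ImportError:
        # langchain (or the cache module) is not importable.
        logger.warning("Could not import cache class %s", settings.cache)
    except Exception as exc:
        # Any other setup failure is logged with the error message.
        logger.warning("Error setting up LLM caching: %s", exc)
```

Because both failure paths only log a warning, calling `setup_llm_caching()` never raises, so application startup is unaffected when caching cannot be configured.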