AIEdX/readme-ai (forked from eli64s/readme-ai)
README-AI

Automated README file generator, powered by large language model APIs

github-actions codecov pypi-version pepy-total-downloads license

Documentation
Quick Links

πŸ“ Overview

Objective

Readme-ai is a developer tool that auto-generates README.md files using a combination of data extraction and generative AI. Simply provide a repository URL or a local path to your codebase, and a well-structured, detailed README file will be generated for you.

Motivation

Readme-ai streamlines documentation creation and maintenance, enhancing developer productivity. The project aims to enable developers of all skill levels, across all domains, to better understand, use, and contribute to open-source software.


πŸ‘Ύ Demo

CLI Usage

readmeai-cli-demo.mov

Offline Mode

readmeai-streamlit-demo.mov

Tip

Offline mode is useful for generating a boilerplate README at no cost. View the offline README.md example here!


🧬 Features

  • Flexible README Generation: Robust repository context extraction combined with generative AI.
  • Multiple LLM Support: Compatible with OpenAI, Ollama, Google Gemini and Offline Mode.
  • Customizable Output: Dozens of CLI options for styling, badges, header designs, and more.
  • Language Agnostic: Works with a wide range of programming languages and project types.
  • Offline Mode: Generate a boilerplate README without calling an external API.

See a few examples of the README-AI customization options below:

default-header
--emojis --image custom --badge-color DE3163 --header-style compact --toc-style links

--image cloud --header-style compact --toc-style fold
cloud-db-logo
--align left --badge-style flat-square --image cloud
gradient-markdown-logo
--align left --badge-style flat --image gradient
custom-logo
--badge-style flat --image custom
skills-light
--badge-style skills-light --image grey
readme-ai-header
--badge-style flat-square
black-logo
--badge-style flat --image black
default-header
--image custom --badge-color 00ffe9 --badge-style flat-square --header-style classic
default-header
--image llm --badge-style plastic --header-style classic
default-header
--image custom --badge-color BA0098 --badge-style flat-square --header-style modern --toc-style fold

See the Configuration section for a complete list of CLI options.

πŸ‘‹ Overview
Overview

    - A high-level introduction to the project, focused on its value proposition and use cases rather than technical details.
llm-overview
🧩 Features
Features Table

    - Generated markdown table that highlights the key technical features and components of the codebase. This table is generated using a structured prompt template.

llm-features
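As a rough illustration of what a structured prompt template looks like, here is a hypothetical sketch; the actual templates ship with readme-ai's configuration and differ in detail (the names `FEATURES_PROMPT` and `build_features_prompt` are assumptions for this example):

```python
# Hypothetical structured prompt template for the features table;
# readme-ai's real prompts live in its configuration files.
FEATURES_PROMPT = (
    "Analyze the repository context below and produce a markdown table\n"
    "with columns | Feature | Summary |, covering architecture, code\n"
    "quality, dependencies, and testing.\n\n"
    "Repository: {repo_name}\n"
    "File summaries:\n{file_summaries}\n"
)

def build_features_prompt(repo_name: str, file_summaries: list[str]) -> str:
    """Fill the template with repository context gathered during extraction."""
    return FEATURES_PROMPT.format(
        repo_name=repo_name,
        file_summaries="\n".join(f"- {s}" for s in file_summaries),
    )
```

The key idea is that the prompt is parameterized by repository context (name, file summaries), so the LLM's output stays grounded in the actual codebase.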
πŸ“„ Codebase Documentation
Repository Structure

    - Directory tree structure is generated using pure Python (tree.py) and embedded in the README.

directory-tree
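Conceptually, the tree generation above needs nothing beyond the standard library. A minimal sketch (hypothetical, not the actual tree.py code), honoring a depth limit like the `--tree-depth` CLI option:

```python
from pathlib import Path

def build_tree(root: Path, prefix: str = "", depth: int = 2) -> str:
    """Recursively render a directory tree as text, similar in spirit to tree.py.

    `depth` is the number of directory levels left to descend into.
    """
    lines = []
    entries = sorted(p for p in root.iterdir() if not p.name.startswith("."))
    for i, entry in enumerate(entries):
        last = i == len(entries) - 1
        connector = "└── " if last else "β”œβ”€β”€ "
        lines.append(f"{prefix}{connector}{entry.name}")
        if entry.is_dir() and depth > 0:
            # Indent children under this entry and recurse one level deeper.
            extension = "    " if last else "β”‚   "
            subtree = build_tree(entry, prefix + extension, depth - 1)
            if subtree:
                lines.append(subtree)
    return "\n".join(lines)
```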
File Summaries

    - Summarizes key files in the codebase; these summaries are also used as context for additional prompts!

llm-summaries
πŸš€ Quickstart Commands
Getting Started

    - Auto-generated setup guides based on language and dependency analysis.
    - Install, Usage, and Test guides are supported for many languages.
    - The parsers module is a collection of tool-specific parsers that extract dependencies and metadata.

quick-start
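To illustrate what such a parser does, here is a hypothetical, simplified requirements.txt parser; the real parsers module covers many more ecosystems (Poetry, npm, Cargo, etc.) and edge cases, and `parse_requirements` is an assumed name for this sketch:

```python
import re

def parse_requirements(text: str) -> list[str]:
    """Extract package names from a requirements.txt-style file (simplified)."""
    names = []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop inline comments
        if not line or line.startswith("-"):  # skip flags like -r, --index-url
            continue
        # The package name is the leading run of name characters,
        # before any version specifier such as ==, >=, or extras.
        match = re.match(r"[A-Za-z0-9._-]+", line)
        if match:
            names.append(match.group(0).lower())
    return names
```

Dependency lists like this feed both the tech-stack badges and the auto-generated install/usage guides.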
πŸ”° Contributing Guidelines
Contributing Guide

    - Dropdown section that outlines the general process for contributing to your project.
    - Provides links to your contributing guidelines, issues page, and more resources.
    - A graph of contributors is also included.

contributing-guidelines
Additional Sections

    - Project Roadmap, Contributing Guidelines, License, and Acknowledgements are included by default.

contributing-and-more

πŸš€ Getting Started

System Requirements:

  • Python 3.9+
  • Package manager/Container: pip, pipx, docker
  • LLM service: OpenAI, Ollama, Google Gemini, Offline Mode
    • Anthropic and LiteLLM coming soon!

Repository URL or Local Path:

Make sure to have a repository URL or local directory path ready for the CLI.

Select an LLM API Service:

  • OpenAI: Recommended, requires an account setup and API key.
  • Ollama: Free and open-source, potentially slower and more resource-intensive.
  • Google Gemini: Requires a Google Cloud account and API key.
  • Offline Mode: Generates a boilerplate README without making API calls.

βš™οΈ Installation

Using pip

pip

❯ pip install readmeai

Using pipx

pipx

❯ pipx install readmeai

Tip

Use pipx to install and run Python command-line applications without causing dependency conflicts with other packages!

Using docker

docker

❯ docker pull zeroxeli/readme-ai:latest

From source

Build readme-ai

Clone repository and navigate to the project directory:

❯ git clone https://github.com/eli64s/readme-ai

❯ cd readme-ai

Using bash

bash

❯ bash setup/setup.sh

Using poetry

Poetry

❯ poetry install

πŸ€– Usage

Environment Variables

OpenAI

Generate an OpenAI API key and set it as the environment variable OPENAI_API_KEY.

# Using Linux or macOS
❯ export OPENAI_API_KEY=<your_api_key>

# Using Windows
❯ set OPENAI_API_KEY=<your_api_key>

Ollama

Pull a model of your choice from the Ollama registry:

# e.g. mistral, llama3, gemma2
❯ ollama pull mistral:latest

Start the Ollama server:

❯ export OLLAMA_HOST=127.0.0.1 && ollama serve

For more details, check out the Ollama repository.

Google Gemini

Generate a Google API key and set it as the environment variable GOOGLE_API_KEY.

❯ export GOOGLE_API_KEY=<your_api_key>

Running README-AI

Using pip

pip

With OpenAI API:

❯ readmeai --repository https://github.com/eli64s/readme-ai \
          --api openai \
          --model gpt-3.5-turbo

With Ollama:

❯ readmeai --repository https://github.com/eli64s/readme-ai \
          --api ollama \
          --model llama3

With Gemini:

❯ readmeai --repository https://github.com/eli64s/readme-ai \
          --api gemini \
          --model gemini-1.5-flash

Advanced Options:

❯ readmeai --repository https://github.com/eli64s/readme-ai \
         --api openai \
         --model gpt-4-turbo \
         --badge-color blueviolet \
         --badge-style flat-square \
         --header-style compact \
         --toc-style fold \
         --temperature 0.1 \
         --tree-depth 2 \
         --image LLM \
         --emojis

Using docker

docker

❯ docker run -it \
    -e OPENAI_API_KEY=$OPENAI_API_KEY \
    -v "$(pwd)":/app zeroxeli/readme-ai:latest \
    -r https://github.com/eli64s/readme-ai

Using streamlit

Streamlit App Try directly in your browser on Streamlit, no installation required! For more details, see the readme-ai-streamlit repository.

From source

Using readme-ai

Using bash

bash

❯ conda activate readmeai
❯ python3 -m readmeai.cli.main -r https://github.com/eli64s/readme-ai

Using poetry

Poetry

❯ poetry shell
❯ poetry run python3 -m readmeai.cli.main -r https://github.com/eli64s/readme-ai

πŸ§ͺ Testing

Using pytest

pytest

❯ make pytest

Using nox

❯ nox -f noxfile.py

Tip

Use nox to test the application against multiple Python environments and dependency versions!


πŸ”§ Configuration

Customize your README generation using these CLI options:

| Option | Description | Default |
|--------|-------------|---------|
| `--align` | Text alignment in the header | `center` |
| `--api` | LLM API service (`openai`, `ollama`, `gemini`, `offline`) | `offline` |
| `--badge-color` | Badge color name or hex code | `0080ff` |
| `--badge-style` | Badge icon style type | `flat` |
| `--base-url` | Base URL for the LLM API endpoint | `v1/chat/completions` |
| `--context-window` | Maximum context window of the LLM API | `3999` |
| `--emojis` | Add emojis to the README header sections | `False` |
| `--header-style` | Header template style | `default` |
| `--image` | Project logo image | `blue` |
| `--model` | Specific LLM model to use | `gpt-3.5-turbo` |
| `--output` | Output filename | `readme-ai.md` |
| `--rate-limit` | Maximum API requests per minute | `5` |
| `--repository` | Repository URL or local directory path | `None` |
| `--temperature` | Creativity level for content generation | `0.9` |
| `--toc-style` | Table of contents template style | `bullets` |
| `--top-p` | Top-p sampling probability | `0.9` |
| `--tree-depth` | Maximum depth of the directory tree structure | `2` |

Tip

For a full list of options, run readmeai --help in your terminal.


Project Badges

The --badge-style option lets you select the style of the default badge set.

Style Preview
default
flat
flat-square
for-the-badge
plastic
skills Python Skill Icon
skills-light Python Skill Light Icon
social

When providing the --badge-style option, readme-ai does two things:

  1. Formats the default badge set to match the selection (e.g. flat, flat-square).
  2. Generates an additional badge set representing your project's dependencies and tech stack (e.g. Python, Docker).
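Conceptually, restyling a badge amounts to rewriting the `style` query parameter on its shields.io URL. A minimal sketch of the idea (hypothetical, not readme-ai's actual code):

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def apply_badge_style(badge_url: str, style: str) -> str:
    """Set or replace the shields.io 'style' query parameter on a badge URL."""
    parts = urlparse(badge_url)
    query = dict(parse_qsl(parts.query))
    query["style"] = style  # overrides any existing style value
    return urlunparse(parts._replace(query=urlencode(query)))
```

Applying this to each URL in the default badge set yields a consistently styled header.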

Example

❯ readmeai --badge-style flat-square --repository https://github.com/eli64s/readme-ai

Output

{... project logo ...}

{... project name ...}

{...project slogan...}


Developed with the software and tools below.

YAML

{... end of header ...}


Project Logo

Select a project logo using the --image option.

blue gradient black
cloud purple grey

For custom images, see the following options:

  • Use --image custom to invoke a prompt to upload a local image file path or URL.
  • Use --image llm to generate a project logo using an LLM API (OpenAI only).

🎨 Examples

| Language/Framework | Output File | Input Repository | Description |
|--------------------|-------------|------------------|-------------|
| Python | readme-python.md | readme-ai | Core readme-ai project |
| TypeScript & React | readme-typescript.md | ChatGPT App | React Native ChatGPT app |
| PostgreSQL & DuckDB | readme-postgres.md | Buenavista | Postgres proxy server |
| Kotlin & Android | readme-kotlin.md | file.io Client | Android file sharing app |
| Python & Streamlit | readme-streamlit.md | readme-ai-streamlit | Streamlit UI for readme-ai |
| Rust & C | readme-rust-c.md | CallMon | System call monitoring tool |
| Go | readme-go.md | docker-gs-ping | Dockerized Go app |
| Java | readme-java.md | Minimal-Todo | Minimalist todo app |
| FastAPI & Redis | readme-fastapi-redis.md | async-ml-inference | Async ML inference service |
| Python & Jupyter | readme-mlops.md | mlops-course | MLOps course materials |
| Flink & Python | readme-local.md | Local Directory | Example using local files |

Note

See additional README file examples here.


πŸ“Œ Roadmap

  • v1.0 release with new features, bug fixes, and improved performance.
  • Develop the readmeai-vscode extension to generate README files (WIP).
  • Add new CLI options to enhance README file customization:
    • --audit to review existing README files and suggest improvements.
    • --template to select a README template style (e.g. ai, data, web).
    • --language to generate README files in other languages (e.g. zh-CN, ES, FR, JA, KO, RU).
  • Develop a robust documentation generator to build full project docs (e.g. Sphinx, MkDocs).
  • Create community-driven README templates and a gallery of readme-ai examples.
  • GitHub Actions script to automatically update README file content on repository push.

πŸ“’ Changelog

Changelog


🀝 Contributing

To grow the project, we need your help! See the links below to get started.



πŸŽ— License

MIT


πŸ‘Š Acknowledgments

⬆️ Top



Languages

  • Python 95.3%
  • Shell 3.8%
  • Other 0.9%