readme-ai

Designed for simplicity, customization, and developer productivity.



[!IMPORTANT] Visit the Official Documentation for detailed guides and tutorials.


Introduction

ReadmeAI is a developer tool that automatically generates README files using a robust repository processing engine and advanced language models. Simply provide a URL or path to your codebase, and a well-structured and detailed README will be generated.

Why Use ReadmeAI?

This project aims to streamline the process of creating and maintaining documentation across all technical disciplines and experience levels. Its core principles are simplicity, customization, and developer productivity.

Demo

Run readmeai from your terminal:

[readmeai-cli-demo]


Features

Let’s begin by exploring various customization options and styles supported by ReadmeAI:

Header Styles & Themes

CLI Command:

$ readmeai --repository https://github.com/eli64s/readme-ai-streamlit \
         --logo custom \
         --badge-color FF4B4B \
         --badge-style flat-square \
         --header-style classic
        

CLI Command:

$ readmeai --repository https://github.com/olliefr/docker-gs-ping \
         --badge-color 00ADD8 \
         --badge-style for-the-badge \
         --header-style modern \
         --navigation-style roman
        

Banner Styles


CLI Command:

$ readmeai --repository https://github.com/eli64s/readme-ai \
         --header-style ascii

CLI Command:

$ readmeai --repository https://github.com/eli64s/readme-ai-streamlit \
         --badge-style for-the-badge \
         --header-style svg

And More!


CLI Command:

$ readmeai --repository https://github.com/jwills/buenavista \
           --align left \
           --badge-style flat-square \
           --logo cloud

CLI Command:

$ readmeai --repository https://github.com/eli64s/readme-ai-streamlit \
           --badge-style flat \
           --logo custom
$ Provide an image file path or URL: \
           https://www.svgrepo.com/show/395851/balloon.svg

CLI Command:

$ readmeai --repository https://github.com/FerrariDG/async-ml-inference \
           --badge-style skills-light \
           --logo grey

CLI Command:

$ readmeai --repository https://github.com/eli64s/readme-ai \
           --logo cloud \
           --header-style compact \
           --navigation-style fold

CLI Command:

$ readmeai --repository https://github.com/eli64s/readme-ai \
           -i custom \
           -bc BA0098 \
           -bs flat-square \
           -hs modern \
           -ns fold

[!IMPORTANT] See the Official Documentation for a complete list of customization options and examples.

Explore additional content sections supported by ReadmeAI:

🔹 Overview

◎ The Overview section provides a high-level summary of the project, including its use case, benefits, and differentiating features.

🔸 Features Table

◎ Generated markdown table that highlights the key technical features and components of the codebase. This table is generated using a structured prompt template.
🔶 Module Analysis

Directory Tree

◎ The project's directory structure is generated using pure Python and embedded in the README. See readmeai.generators.tree for more details.

File Summaries

◎ Summarizes key modules of the project, which are also used as context for downstream [prompts](https://github.com/eli64s/readme-ai/blob/main/readmeai/config/settings/prompts.toml).
🔺 Quickstart Guides

Getting Started

◎ Prerequisites and system requirements are extracted from the codebase during preprocessing. The parsers currently handle the majority of this logic.

Installation Guide

◎ Installation, Usage, and Testing guides are generated based on the project's dependency files and codebase configuration.
🔻 Contributing Guidelines

Contributing Guide

◎ Dropdown section that outlines the general process for contributing to your project.

◎ Provides links to your contributing guidelines, issues page, and more resources.

◎ A graph of contributors is also included.

Additional Sections

◎ Roadmap, Contributing Guidelines, License, and Acknowledgments sections are included by default.


Getting Started

Prerequisites

ReadmeAI requires Python 3.9 or higher, plus one installation method of your choice:

| Requirement | Details |
|---|---|
| • Python ≥ 3.9 | Core runtime |
| **Installation Method** (choose one) | |
| • pip | Default Python package manager |
| • pipx | Isolated environment installer |
| • uv | High-performance package manager |
| • docker | Containerized environment |
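Before installing, you can confirm that your interpreter satisfies the version requirement. This is a quick local sanity check, not part of the readmeai CLI:

```shell
# Exits non-zero with the offending version string if Python is older than 3.9
python3 -c 'import sys; assert sys.version_info >= (3, 9), sys.version'
```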

Supported Repository Platforms

ReadmeAI needs access to your repository to generate a README file. Current supported platforms include:

| Platform | Details |
|---|---|
| File System | Local repository access |
| GitHub | Industry-standard hosting |
| GitLab | Full DevOps integration |
| Bitbucket | Atlassian ecosystem |

Supported LLM API Services

ReadmeAI is model agnostic, with support for the following LLM API services:

| Provider | Best For | Details |
|---|---|---|
| OpenAI | General use | Industry-leading models |
| Anthropic | Advanced tasks | Claude language models |
| Google Gemini | Multimodal AI | Latest Google technology |
| Ollama | Open source | No API key needed |
| Offline Mode | Local operation | No internet required |

Installation

ReadmeAI is available on PyPI as readmeai and can be installed as follows:

Pip

Install with pip (recommended for most users):

❯ pip install -U readmeai

Pipx

With pipx, readmeai will be installed in an isolated environment:

❯ pipx install readmeai

Uv

The fastest way to install readmeai is with uv:

❯ uv tool install readmeai

Docker

To run readmeai in a containerized environment, pull the latest image from [Docker Hub][dockerhub-link]:

❯ docker pull zeroxeli/readme-ai:latest

From source

Click to build readmeai from source
  1. Clone the repository:

    ❯ git clone https://github.com/eli64s/readme-ai
  2. Navigate to the project directory:

    ❯ cd readme-ai
  3. Install dependencies:

    ❯ pip install -r setup/requirements.txt

Alternatively, use the [setup script][setup-script] to install dependencies:

Bash

  1. Run the setup script:

    ❯ bash setup/setup.sh

Or, use poetry to build and install project dependencies:

Poetry

  1. Install dependencies with poetry:

    ❯ poetry install


Additional Optional Dependencies

[!IMPORTANT] To use the Anthropic and Google Gemini clients, extra dependencies are required. Install the package with the following extras:
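For example (the extras names below reflect the project's packaging at the time of writing and may change between releases; check `pyproject.toml` if installation fails):

```shell
❯ pip install "readmeai[anthropic,google-generativeai]"
```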

Usage

Set your API key

When running readmeai with a third-party service, you must provide a valid API key. For example, the OpenAI client is set as follows:

❯ export OPENAI_API_KEY=<your_api_key>

# For Windows users:
❯ set OPENAI_API_KEY=<your_api_key>
Click to view environment variables for Ollama, Anthropic, and Google Gemini
Ollama


Refer to the Ollama documentation for more information on setting up the Ollama server.

To start, follow these steps:

  1. Pull your model of choice from the Ollama repository:

    ❯ ollama pull llama3.2:latest
  2. Start the Ollama server and set the OLLAMA_HOST environment variable:

    ❯ export OLLAMA_HOST=127.0.0.1 && ollama serve
Anthropic
  1. Export your Anthropic API key:

    ❯ export ANTHROPIC_API_KEY=<your_api_key>
Google Gemini
  1. Export your Google Gemini API key:

    ❯ export GOOGLE_API_KEY=<your_api_key>

Using the CLI

Running with an LLM API service

Below is the minimal command required to run readmeai using the OpenAI client:

❯ readmeai --api openai -o readmeai-openai.md -r https://github.com/eli64s/readme-ai 

[!IMPORTANT] The default model is gpt-3.5-turbo, offering the best balance between cost and performance. When using any model from the gpt-4 series and up, please monitor your costs and usage to avoid unexpected charges.

ReadmeAI can easily switch between API providers and models. We can run the same command as above with the Anthropic client:

❯ readmeai --api anthropic -m claude-3-5-sonnet-20240620 -o readmeai-anthropic.md -r https://github.com/eli64s/readme-ai

And finally, with the Google Gemini client:

❯ readmeai --api gemini -m gemini-1.5-flash -o readmeai-gemini.md -r https://github.com/eli64s/readme-ai
Running with local models

We can also run readmeai with free and open-source locally hosted models using Ollama:

❯ readmeai --api ollama --model llama3.2 -r https://github.com/eli64s/readme-ai
Running on a local codebase

To generate a README file from a local codebase, simply provide the full path to the project:

❯ readmeai --repository /users/username/projects/myproject --api openai

Adding more customization options:

❯ readmeai --repository https://github.com/eli64s/readme-ai \
           --output readmeai.md \
           --api openai \
           --model gpt-4 \
           --badge-color A931EC \
           --badge-style flat-square \
           --header-style compact \
           --navigation-style fold \
           --temperature 0.9 \
           --tree-depth 2 \
           --logo LLM \
           --emojis solar
Running in offline mode

ReadmeAI supports offline mode, allowing you to generate README files without an LLM API service.

❯ readmeai --api offline -o readmeai-offline.md -r https://github.com/eli64s/readme-ai

Docker

Run the readmeai CLI in a Docker container:

❯ docker run -it --rm \
    -e OPENAI_API_KEY=$OPENAI_API_KEY \
    -v "$(pwd)":/app zeroxeli/readme-ai:latest \
    --repository https://github.com/eli64s/readme-ai \
    --api openai

Streamlit

Try readme-ai directly in your browser on Streamlit Cloud, no installation required.

See the readme-ai-streamlit repository on GitHub for more details about the application.

[!WARNING] The readme-ai Streamlit web app may not always be up-to-date with the latest features. Please use the command-line interface (CLI) for the most recent functionality.

From source

Click to run readmeai from source

Bash

If you installed the project from source with the bash script, run the following command:

  1. Activate the virtual environment:

    ❯ conda activate readmeai
  2. Run the CLI:

    ❯ python3 -m readmeai.cli.main -r https://github.com/eli64s/readme-ai

Poetry

  1. Activate the virtual environment:

    ❯ poetry shell
  2. Run the CLI:

    ❯ poetry run python3 -m readmeai.cli.main -r https://github.com/eli64s/readme-ai


Testing

The pytest and nox frameworks are used for development and testing.

Install the dependencies with uv:

❯ uv pip install -r pyproject.toml --all-extras

Run the unit test suite using Pytest:

❯ make test

Using nox, test the app against Python versions 3.9, 3.10, 3.11, and 3.12:

❯ make test-nox

[!TIP] Nox is an automation tool for testing applications in multiple environments. This helps ensure your project is compatible across Python versions and environments.


Configuration

Customize your README generation with a variety of supported options and style settings, such as:

| Option | Description | Default |
|---|---|---|
| `--align` | Text alignment in header | center |
| `--api` | LLM API service provider | offline |
| `--badge-color` | Badge color name or hex code | 0080ff |
| `--badge-style` | Badge icon style type | flat |
| `--header-style` | Header template style | classic |
| `--navigation-style` | Table of contents style | bullet |
| `--emojis` | Emoji theme packs prefixed to section titles | None |
| `--logo` | Project logo image | blue |
| `--logo-size` | Logo image size | 30% |
| `--model` | Specific LLM model to use | gpt-3.5-turbo |
| `--output` | Output filename | readme-ai.md |
| `--repository` | Repository URL or local directory path | None |
| `--temperature` | Creativity level for content generation | 0.1 |
| `--tree-depth` | Maximum depth of the directory tree structure | 2 |

Run the following command to view all available options:

❯ readmeai --help

Visit the Official Documentation for a complete guide on configuring and customizing README files.


Examples

Explore a variety of README examples generated by readmeai:

| Tech | Output | Source | Description |
|---|---|---|---|
| Readme-ai | readme-ai.md | readme-ai | Readme-ai project |
| Apache Flink | readme-pyflink.md | pyflink-poc | Pyflink project |
| Streamlit | readme-streamlit.md | readme-ai-streamlit | Streamlit web app |
| Vercel & NPM | readme-vercel.md | github-readme-quotes | Vercel deployment |
| Go & Docker | readme-docker-go.md | docker-gs-ping | Dockerized Go app |
| FastAPI & Redis | readme-fastapi-redis.md | async-ml-inference | Async ML inference service |
| Java | readme-java.md | Minimal-Todo | Minimalist todo Java app |
| PostgreSQL & DuckDB | readme-postgres.md | Buenavista | Postgres proxy server |
| Kotlin | readme-kotlin.md | android-client | Android client app |
| Offline Mode | offline-mode.md | litellm | LLM API service |

Find additional README.md file examples in our examples directory.


Roadmap

Contributing

Contributions are welcome! Please read the Contributing Guide to get started.


Acknowledgments

A big shoutout to the projects below for their awesome work and open-source contributions:

shields.io simpleicons.org tandpfun/skill-icons astrit/css.gg Ileriayo/markdown-badges


🎗 License

Copyright © 2023 readme-ai.
Released under the MIT license.