
Generate your code documentation with doc-comments.ai


Focus on writing your code and let LLMs write the documentation for you.
It takes just a few keystrokes in your terminal, using either the OpenAI API or a 100% local LLM that keeps your data on your machine.

Built with LangChain, llama.cpp and Tree-sitter.


✨ Features

  • 📝 Create documentation comment blocks for all methods in a file
    • e.g. Javadoc, JSDoc, Docstring, Rustdoc (see the sketch after this list)
  • ✍️ Create inline documentation comments in method bodies
  • 🌳 Treesitter integration
  • 💻 Local LLM support
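
As a sketch of what this produces for a Python file (illustrative only; the function and the generated wording are hypothetical and will vary by model):

def load_config(path):
    """
    Loads and parses the configuration file at the given path.

    :param path: Relative path to the configuration file.
    :return: The parsed configuration.
    """
    ...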

Note

Documentation is only added to files without unstaged changes, so nothing is overwritten.
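
In practice this means clearing any unstaged edits in the target file first, for example with standard git commands (not part of doc-comments.ai):

git status   # check for unstaged changes
git stash    # set them aside, or stage/commit them instead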

🚀 Usage

Create documentation for all methods in a file with the GPT-3.5 Turbo model:

aicomments <RELATIVE_FILE_PATH>

Also create inline documentation comments in method bodies:

aicomments <RELATIVE_FILE_PATH> --inline

Use the GPT-4 model (the default is GPT-3.5):

aicomments <RELATIVE_FILE_PATH> --gpt4

Guided mode: confirm documentation generation for each method individually:

aicomments <RELATIVE_FILE_PATH> --guided

Use a local LLM on your machine:

aicomments <RELATIVE_FILE_PATH> --local_model <MODEL_PATH>
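
Assuming the flags can be combined in a single invocation (plausible for the CLI, though this README does not state it explicitly) and using a hypothetical file path, a combined call would look like:

aicomments src/utils.py --inline --guided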

Note

For how to download models from Hugging Face for local use, see Local LLM usage.

Important

Results with a local LLM depend heavily on the model you select. To get results comparable to GPT-3.5/4, you need very large models, which require powerful hardware.

⚙️ Supported Languages

  • Python
  • TypeScript
  • JavaScript
  • Java
  • Rust
  • Kotlin
  • Go
  • C++
  • C
  • Scala

📋 Requirements

  • Python >= 3.9

🔧 Installation

1. OpenAI API usage

Create your personal OpenAI API key and add it to your environment as $OPENAI_API_KEY with:

export OPENAI_API_KEY=<YOUR_API_KEY>
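
To persist the key across sessions, you can append the export to your shell profile; ~/.zshrc below is an assumption, substitute your shell's profile file:

echo 'export OPENAI_API_KEY=<YOUR_API_KEY>' >> ~/.zshrc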

Install with pipx:

pipx install doc-comments-ai

pipx is recommended for installation, but plain pip works as well.
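
The equivalent plain pip installation:

pip install doc-comments-ai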

2. Local LLM usage

When using a local LLM, no API key is required. On first use of --local_model you will be asked to confirm the installation of llama-cpp-python and its dependencies. The installation process takes care of a hardware-accelerated build tailored to your hardware and OS. For further details see: installation-with-hardware-acceleration

The most convenient way to download a model from Hugging Face for local use is the huggingface-cli:

huggingface-cli download TheBloke/CodeLlama-13B-Python-GGUF codellama-13b-python.Q5_K_M.gguf

This downloads the codellama-13b-python.Q5_K_M model to ~/.cache/huggingface/. After the download has finished, the absolute path of the .gguf file is printed to the console; use it as the value for --local_model.
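
Putting both steps together, with the placeholder standing in for the absolute path printed by huggingface-cli:

aicomments <RELATIVE_FILE_PATH> --local_model <ABSOLUTE_PATH_TO>/codellama-13b-python.Q5_K_M.gguf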

Important

Since llama.cpp is used, the model must be in the .gguf format.
