- Fork the repository.
- Create a new branch.
- Make your changes.
- Commit your changes.
- Push your changes to your fork.
- Create a pull request.
- Wait for your pull request to be reviewed.
- Make any changes requested by the reviewer.
- Wait for your pull request to be merged.
- Celebrate your success! 🎉
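The workflow above can be sketched as the following shell commands (the branch name and commit message are placeholders; substitute your own):

```shell
# Fork the repository on GitHub first, then clone your fork
git clone https://github.com/YourUsername/ai_text_to_sql.git
cd ai_text_to_sql

# Create a new branch for your changes (example name)
git checkout -b add-postgres-connector

# Make your changes, then commit them
git add .
git commit -m "Add PostgreSQL data connector"

# Push the branch to your fork, then open a pull request on GitHub
git push -u origin add-postgres-connector
```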
To contribute a new data connector, please follow these steps:
- Create a new module in the `data_connectors` directory.
- Create a new class in the module that inherits from the base `DataConnector` class.
- Add the `name` attribute to the new class to set the name of the connector.
- Implement the `create_connection()` abstract method.
- Implement the `get_tables()` and `get_columns()` methods if the database does not support the SQLAlchemy Inspector.
- Add an import statement for the new class in the `__init__.py` file in the `data_connectors` directory.
- Add unit tests for the new data connector in the `tests` directory.
- Add your new data connector to the list of supported databases in the README.md file.
- Update the package version number in the `__about__.py` file.
- Congratulations, data virtuoso! 🎉 Your mastery of data connectors has unlocked the gates to a realm of boundless data possibilities.
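As a rough illustration of the steps above, here is a minimal sketch of an SQLite connector. The stub `DataConnector` base class is included only to make the example self-contained; the real base class lives in the `data_connectors` directory and its exact constructor and signatures may differ.

```python
import sqlite3


# Hypothetical stub of the base class, included so the sketch runs standalone.
# The real DataConnector in data_connectors may have a different interface.
class DataConnector:
    name = None

    def create_connection(self):
        raise NotImplementedError

    def get_tables(self):
        raise NotImplementedError

    def get_columns(self, table_name):
        raise NotImplementedError


class SQLiteConnector(DataConnector):
    name = "SQLite"  # the `name` attribute identifies the connector

    def __init__(self, database):
        self.database = database
        self._conn = None

    def create_connection(self):
        # Reuse a single connection so in-memory databases keep their state.
        if self._conn is None:
            self._conn = sqlite3.connect(self.database)
        return self._conn

    def get_tables(self):
        # SQLite does not need the SQLAlchemy Inspector; query sqlite_master directly.
        cursor = self.create_connection().execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'"
        )
        return [row[0] for row in cursor.fetchall()]

    def get_columns(self, table_name):
        # PRAGMA table_info returns one row per column; the name is at index 1.
        cursor = self.create_connection().execute(
            f"PRAGMA table_info({table_name})"
        )
        return [row[1] for row in cursor.fetchall()]
```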
To contribute a new LLM connector, please follow these steps:
- Create a new module in the `llm_connectors` directory.
- Create a new class in the module that inherits from the base `LLMConnector` class.
- Add the `name` attribute to the new class to set the name of the connector.
- Implement the `format_database_schema()`, `create_prompt()` and `get_answer()` abstract methods.
- Add an import statement for the new class in the `__init__.py` file in the `llm_connectors` directory.
- Add unit tests for the new LLM connector in the `tests` directory.
- Add your new LLM connector to the list of supported LLMs in the README.md file.
- Update the package version number in the `__about__.py` file.
- Rejoice and let the Fellowship of Language Learning Models raise their virtual goblets in celebration! 🎉 Your mastery of LLM connectors has united the powers of AI language models, forging an alliance that transcends the boundaries of human-machine communication.
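The shape of an LLM connector can be sketched as follows. The stub `LLMConnector` base class and the toy `EchoConnector` (which returns a canned query instead of calling a real LLM API) are illustrative only; the real base class lives in the `llm_connectors` directory and its exact signatures may differ.

```python
# Hypothetical stub of the base class, included so the sketch runs standalone.
class LLMConnector:
    name = None

    def format_database_schema(self, schema):
        raise NotImplementedError

    def create_prompt(self, question, schema):
        raise NotImplementedError

    def get_answer(self, prompt):
        raise NotImplementedError


class EchoConnector(LLMConnector):
    """Toy connector: returns a canned query instead of calling an LLM API."""

    name = "Echo"  # the `name` attribute identifies the connector

    def format_database_schema(self, schema):
        # Render a {table: [columns]} mapping as one "table(col, col)" line each.
        return "\n".join(
            f"{table}({', '.join(columns)})" for table, columns in schema.items()
        )

    def create_prompt(self, question, schema):
        # Combine the schema and the user's question into a single prompt.
        return (
            "Translate the question into SQL.\n"
            f"Schema:\n{self.format_database_schema(schema)}\n"
            f"Question: {question}"
        )

    def get_answer(self, prompt):
        # A real connector would send the prompt to an LLM API here.
        return "SELECT 1"
```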
To set up the local environment for AI-Text-to-SQL, please follow these steps:
- Fork the repository.
- Clone the forked repository.
  ```shell
  git clone https://github.com/YourUsername/ai_text_to_sql.git
  ```
- Create a virtual environment with Python 3.9 or higher.
  ```shell
  conda create -n ai_text_to_sql python=3.9
  ```
- Activate the virtual environment.
  ```shell
  conda activate ai_text_to_sql
  ```
- Install the dependencies.
  ```shell
  pip install -r requirements.txt
  ```
To run the unit tests, please follow these steps:
- Activate the virtual environment you created.
- Run the unit tests. Unit tests are available for all data connectors and LLM connectors; however, the majority of them require a database connection to be set up. As a result, the only unit tests that can be run without a running server are the tests for SQLite with OpenAI. To run these tests, execute the following command:
  ```shell
  python -m unittest discover tests -t tests/excluded_tests
  ```
Note: Please ensure that you include unit tests for any new data connectors or LLM connectors that you create. We can decide later whether to include them in the main test suite.
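A minimal test module might look like the sketch below. `FakeConnector` is a hypothetical stand-in so the example runs without a live database; a real test would import your connector from the package instead.

```python
import unittest


class FakeConnector:
    """Hypothetical stand-in for a real connector; lets the sketch run offline."""

    name = "Fake"

    def get_tables(self):
        return ["users"]


class TestFakeConnector(unittest.TestCase):
    def test_name_attribute(self):
        # Every connector must set the `name` attribute.
        self.assertEqual(FakeConnector.name, "Fake")

    def test_get_tables(self):
        # get_tables() should list the tables in the connected database.
        self.assertEqual(FakeConnector().get_tables(), ["users"])
```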
The release process for AI-Text-to-SQL is completely automated. Once a pull request is merged into the main branch, the GitHub Actions workflow will automatically run the unit tests and build the package. If the unit tests pass, the package will be published to PyPI. Please ensure that you update the package version number in the `__about__.py` file before creating a pull request.