Commit: README

ashpreetbedi committed Feb 10, 2024
1 parent 170be60 commit b11d633
Showing 4 changed files with 63 additions and 13 deletions.
34 changes: 28 additions & 6 deletions CONTRIBUTING.md
@@ -29,14 +29,36 @@ Please run this script before submitting a pull request.

## Adding a new Vector Database

-1. Get your local environment setup by following the [Development setup](#development-setup).
-2. Create a new directory in `phi/vectordb` with the name of the vector database.
-3. Implement the `VectorDb` interface in `phi/vectordb/<your_db>/<your_db>.py`.
-4. Import your `VectorDb` implementation in `phi/vectordb/<your_db>/__init__.py`.
-5. Add a recipe for using your `VectorDb` in `cookbook/<your_db>/assistant.py`.
-6. Format and validate your code by running `./scripts/format.sh`.
+1. Set up your local environment by following the [Development setup](#development-setup).
+2. Create a new directory under `phi/vectordb` for the new vector database.
+3. Create a class for your vector database that implements the `VectorDb` interface.
+   - Your class goes in `phi/vectordb/<your_db>/<your_db>.py`.
+   - The `VectorDb` interface is defined in `phi/vectordb/base`.
+   - Import your `VectorDb` class in `phi/vectordb/<your_db>/__init__.py`.
+   - Check out the `phi/vectordb/pgvector` directory for an example.
+4. Add a recipe for using your `VectorDb` under `cookbook/<your_db>`.
+   - Check out `phidata/cookbook/pgvector` for an example (you do not need to add the `resources.py` file).
+5. Important: format and validate your code by running `./scripts/format.sh`.
+6. Submit a pull request.
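To make the shape of the steps above concrete, here is a minimal sketch. The `VectorDb` base class below is a hypothetical stand-in for the real interface in `phi/vectordb/base` (the actual abstract methods may differ), and the backend just stores strings in memory:

```python
from abc import ABC, abstractmethod
from typing import List


# Hypothetical stand-in for the interface in `phi/vectordb/base`;
# the real abstract methods may differ.
class VectorDb(ABC):
    @abstractmethod
    def create(self) -> None: ...

    @abstractmethod
    def insert(self, documents: List[str]) -> None: ...

    @abstractmethod
    def search(self, query: str, limit: int = 5) -> List[str]: ...


# Rough shape of `phi/vectordb/<your_db>/<your_db>.py`.
class MyVectorDb(VectorDb):
    def __init__(self) -> None:
        self._docs: List[str] = []

    def create(self) -> None:
        # A real backend would create a table or collection here.
        self._docs = []

    def insert(self, documents: List[str]) -> None:
        self._docs.extend(documents)

    def search(self, query: str, limit: int = 5) -> List[str]:
        # A real backend embeds the query and runs a similarity search;
        # this toy version does substring matching.
        return [d for d in self._docs if query.lower() in d.lower()][:limit]
```

The cookbook recipe would then construct `MyVectorDb` and pass it wherever a `VectorDb` is expected.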

+## Adding a new LLM provider
+
+1. Set up your local environment by following the [Development setup](#development-setup).
+2. Create a new directory under `phi/llm` for the new LLM provider.
+3. If the LLM provider supports the OpenAI API spec:
+   - Create a class for your LLM provider that inherits from the `OpenAILike` class in `phi/llm/openai/like.py`.
+   - Your class goes in `phi/llm/<your_llm>/<your_llm>.py`.
+   - Import your class in `phi/llm/<your_llm>/__init__.py`.
+   - Check out `phi/llm/together/together.py` for an example.
+4. If the LLM provider does not support the OpenAI API spec:
+   - Reach out to us on [Discord](https://discord.gg/4MtYHHrgA8) or open an issue to discuss the best way to integrate your LLM provider.
+5. Add a recipe for using your LLM provider under `cookbook/<your_llm>`.
+   - Check out `phidata/cookbook/together` for an example.
+6. Important: format and validate your code by running `./scripts/format.sh`.
+7. Submit a pull request.
+
+Message us on [Discord](https://discord.gg/4MtYHHrgA8) if you have any questions or need help with credits.
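For an OpenAI-compatible provider, the subclass usually just overrides defaults. A sketch using a hypothetical stand-in for `OpenAILike` (the real class lives in `phi/llm/openai/like.py` and carries the request/response logic; the provider name, model id, env var, and URL below are placeholders):

```python
from os import getenv
from typing import Optional


# Hypothetical stand-in for `phi.llm.openai.like.OpenAILike`; the real
# class implements the OpenAI-compatible request handling.
class OpenAILike:
    name: str = "OpenAILike"
    model: str = ""
    api_key: Optional[str] = None
    base_url: str = ""


# Rough shape of `phi/llm/<your_llm>/<your_llm>.py` — placeholder values,
# not a real provider.
class MyLLM(OpenAILike):
    name: str = "MyLLM"
    model: str = "my-org/my-model"
    api_key: Optional[str] = getenv("MY_LLM_API_KEY")
    base_url: str = "https://api.my-llm.example/v1"
```

The `Together` class in this commit follows exactly this pattern: it only sets `name`, `model`, `api_key`, and `base_url`.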

## 📚 Resources

- <a href="https://docs.phidata.com/introduction" target="_blank" rel="noopener noreferrer">Documentation</a>
31 changes: 27 additions & 4 deletions cookbook/together/README.md
@@ -1,19 +1,42 @@
## Together

-1. Install libraries
+> Note: Fork and clone this repository if needed
+
+1. Create a virtual environment

```shell
-pip install -U together
+python3 -m venv ~/.venvs/aienv
+source ~/.venvs/aienv/bin/activate
```

-2. Test Together Assistant
+2. Install libraries
+
+```shell
+pip install -U together phidata
+```
+
+3. Test Together Assistant
+
+- Streaming

```shell
python cookbook/together/assistant.py
```

-3. Test Structured output
+- Without Streaming
+
+```shell
+python cookbook/together/assistant_stream_off.py
+```
+
+4. Test Structured output

```shell
python cookbook/together/pydantic_output.py
```
+
+5. Test function calling
+
+```shell
+python cookbook/together/tool_call.py
+```
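All of these recipes talk to Together's OpenAI-compatible endpoint at `https://api.together.xyz/v1`. A rough sketch of the chat request body they send (field names follow the OpenAI chat API; the model id is the phidata default from `phi/llm/together/together.py`):

```python
import json


def chat_request(prompt: str, stream: bool = False) -> dict:
    """Build an OpenAI-style chat-completions payload."""
    return {
        "model": "mistralai/Mixtral-8x7B-Instruct-v0.1",
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,
    }


# Serialized form, as it would go over the wire (with the TOGETHER_API_KEY
# in an Authorization header).
body = json.dumps(chat_request("Share a healthy breakfast recipe"))
```

Setting `stream=True` corresponds to the streaming recipe; the structured-output and tool-call recipes build on the same payload.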
6 changes: 6 additions & 0 deletions phi/docker/api_client.py
@@ -26,6 +26,12 @@ def create_api_client(self) -> Optional[Any]:
        except Exception as e:
            logger.error("Could not connect to docker. Please confirm docker is installed and running")
            logger.error(e)
+           logger.info("Fix:")
+           logger.info("- If docker is running, please check output of `ls -l /var/run/docker.sock`.")
+           logger.info(
+               '- If file does not exist, please run: `sudo ln -s "$HOME/.docker/run/docker.sock" /var/run/docker.sock`'
+           )
+           logger.info("- More info: https://docs.phidata.com/faq/could-not-connect-to-docker")
            exit(0)
        return self._api_client

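The added lines follow a simple pattern: catch the connection error, then log actionable fix steps instead of only a bare exception. A generic sketch of that pattern with no Docker dependency (`factory` and the hint messages are illustrative; the real code calls `exit(0)` after logging rather than returning `None`):

```python
import logging

logger = logging.getLogger(__name__)


def create_client(factory, fix_hints):
    """Call `factory`; on failure, log the error plus each fix hint."""
    try:
        return factory()
    except Exception as e:
        logger.error("Could not create client")
        logger.error(e)
        logger.info("Fix:")
        for hint in fix_hints:
            logger.info("- %s", hint)
        return None
```

This keeps the error message itself (`logger.error(e)`) while pointing the user at a concrete remediation, such as checking the Docker socket.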
5 changes: 2 additions & 3 deletions phi/llm/together/together.py
@@ -1,12 +1,11 @@
from os import getenv
from typing import Optional

-from phi.llm.openai.chat import OpenAIChat
+from phi.llm.openai.like import OpenAILike


-class Together(OpenAIChat):
+class Together(OpenAILike):
    name: str = "Together"
    model: str = "mistralai/Mixtral-8x7B-Instruct-v0.1"
    api_key: Optional[str] = getenv("TOGETHER_API_KEY")
    base_url: str = "https://api.together.xyz/v1"
-    phi_proxy: bool = False