Improve docker deployment configuration #163

Merged
Merged 1 commit into theroyallab:main on Aug 18, 2024
Conversation

AmgadHasan
Contributor

Is your pull request related to a problem? Please describe.
This PR improves the docker workflow for deploying tabbyAPI. It moves the deployment configuration outside the docker image, which allows us to change deployment settings such as the model, host, and port without having to rebuild the docker image.

Additionally, I created a sample for api_tokens.yml named api_tokens_sample.yml. This makes it easier for new users to understand how to persist their auth tokens.
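For reference, the sample could look roughly like this (a minimal sketch assuming the api_key/admin_key pair that tabbyAPI writes to api_tokens.yml; the values are placeholders):

```yaml
# api_tokens_sample.yml - copy to api_tokens.yml and replace the placeholder values
api_key: your-api-key-here       # key clients use when calling the API
admin_key: your-admin-key-here   # key used for admin endpoints such as model loading
```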

Why should this feature be added?
It makes working with docker a lot easier: deployment settings can be changed without rebuilding the image.

Wiki
The wiki should be updated to account for the changes. Suggested version:
Docker

  1. Install Docker and docker compose from the docs
  2. Install the Nvidia container compatibility layer
    i. For Linux: Nvidia container toolkit
    ii. For Windows: Cuda Toolkit on WSL
  3. Clone TabbyAPI via git clone https://github.com/theroyallab/tabbyAPI
  4. Enter the tabbyAPI directory by cd tabbyAPI
  5. Create a config.yml file from the config_sample.yml to set up the configuration, such as the model, host and port
  6. [optional] Create an api_tokens.yml to specify your API key and admin key. If this step is skipped, they will be auto-generated.
  7. Update the volume mount section in the docker/docker-compose.yml file (an example of a filled-in volume section is sketched after this list):
    volumes:
      # - /path/to/models:/app/models                       # Change me
      # - /path/to/config.yml:/app/config.yml               # Change me
      # - /path/to/api_tokens.yml:/app/api_tokens.yml       # Change me
  8. Run docker compose -f docker/docker-compose.yml up
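For example, once the mounts are uncommented and edited, the volume section could end up looking like this (a sketch only: the service name `tabbyapi` and the host paths are placeholders, and the rest of the repository's docker/docker-compose.yml stays as-is):

```yaml
# docker/docker-compose.yml (excerpt) - only the volume mounts are shown
services:
  tabbyapi:                                        # placeholder service name
    volumes:
      - /home/user/models:/app/models              # host directory holding your models
      - ./config.yml:/app/config.yml               # config created in step 5
      - ./api_tokens.yml:/app/api_tokens.yml       # auth tokens created in step 6
```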

Additional context
Follow up from
#150

bdashore3 merged commit dae3940 into theroyallab:main on Aug 18, 2024
1 check passed
@AmgadHasan
Contributor Author

@bdashore3
Thanks for merging the PR.

Could you please update the wiki as per the suggestions?

Wiki
The wiki should be updated to account for the changes. Suggested version:

Docker

1. Install Docker and docker compose from the [docs](https://docs.docker.com/compose/install/)

2. Install the Nvidia container compatibility layer
   i. For Linux: [Nvidia container toolkit](https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html)
   ii. For Windows: [Cuda Toolkit on WSL](https://docs.nvidia.com/cuda/wsl-user-guide/index.html)

3. Clone TabbyAPI via git clone https://github.com/theroyallab/tabbyAPI

4. Enter the tabbyAPI directory by cd tabbyAPI

5. Create a `config.yml` file from the `config_sample.yml` to set up the configuration, such as the model, host and port

6. [optional] Create an `api_tokens.yml` to specify your API key and admin key. If this step is skipped, they will be auto-generated.

7. Update the volume mount section in the `docker/docker-compose.yml` file:
    volumes:
      # - /path/to/models:/app/models                       # Change me
      # - /path/to/config.yml:/app/config.yml               # Change me
      # - /path/to/api_tokens.yml:/app/api_tokens.yml       # Change me
8. Run docker compose -f docker/docker-compose.yml up
