
[Bug]: llama-index-embeddings-fastembed should use latest fastembed that is 0.3.4 but it is using 0.2.7 #15002

Closed
pydv9991 opened this issue Jul 28, 2024 · 5 comments · Fixed by #15029
Labels
bug Something isn't working triage Issue needs to be triaged/prioritized

Comments

@pydv9991

Bug Description

llama-index-embeddings-fastembed should use the latest fastembed (0.3.4), but it currently resolves to 0.2.7, which causes dependency conflicts.

Version

0.0.79.post1

Steps to Reproduce

Install fastembed-gpu, or plain fastembed at the latest version (0.3.4),
then install llama-index-embeddings-fastembed; it pulls in fastembed 0.2.7, which causes conflicts.
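
A minimal sketch of one way to trigger the resolver error shown below (this command is not from the original report; it assumes a clean virtual environment and uses the versions taken from the logs):

# request both pins in a single pip invocation so the resolver must reconcile them
python -m venv .venv && . .venv/bin/activate
pip install "fastembed==0.3.4" "llama-index-embeddings-fastembed==0.1.6"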

Relevant Logs/Tracebacks

conflict is caused by:
15.53     The user requested fastembed==0.3.4
15.53     llama-index-embeddings-fastembed 0.1.6 depends on fastembed<0.3.0 and >=0.2.2
15.53     The user requested fastembed==0.3.4
@pydv9991 pydv9991 added bug Something isn't working triage Issue needs to be triaged/prioritized labels Jul 28, 2024

dosubot bot commented Jul 28, 2024

To address the issue of llama-index-embeddings-fastembed using fastembed version 0.2.7 instead of the latest version 0.3.4, you need to update the pyproject.toml file to specify the correct version of fastembed. Here is the updated pyproject.toml file:

[build-system]
build-backend = "poetry.core.masonry.api"
requires = ["poetry-core"]

[tool.codespell]
check-filenames = true
check-hidden = true
skip = "*.csv,*.html,*.json,*.jsonl,*.pdf,*.txt,*.ipynb"

[tool.llamahub]
contains_example = false
import_path = "llama_index.embeddings.fastembed"

[tool.llamahub.class_authors]
FastEmbedEmbedding = "llama-index"

[tool.mypy]
disallow_untyped_defs = true
exclude = ["_static", "build", "examples", "notebooks", "venv"]
ignore_missing_imports = true
python_version = "3.8"

[tool.poetry]
authors = ["Your Name <you@example.com>"]
description = "llama-index embeddings fastembed integration"
exclude = ["**/BUILD"]
license = "MIT"
name = "llama-index-embeddings-fastembed"
readme = "README.md"
version = "0.1.4"

[tool.poetry.dependencies]
python = ">=3.8.1,<3.12"
llama-index-core = "^0.10.11.post1"
fastembed = "^0.3.4"

[tool.poetry.group.dev.dependencies]
ipython = "8.10.0"
jupyter = "^1.0.0"
mypy = "0.991"
pre-commit = "3.2.0"
pylint = "2.15.10"
pytest = "7.2.1"
pytest-mock = "3.11.1"
ruff = "0.0.292"
tree-sitter-languages = "^1.8.0"
types-Deprecated = ">=0.1.0"
types-PyYAML = "^6.0.12.12"
types-protobuf = "^4.24.0.4"
types-redis = "4.5.5.0"
types-requests = "2.28.11.8"
types-setuptools = "67.1.0.0"

[tool.poetry.group.dev.dependencies.black]
extras = ["jupyter"]
version = "<=23.9.1,>=23.7.0"

[tool.poetry.group.dev.dependencies.codespell]
extras = ["toml"]
version = ">=v2.2.6"

[[tool.poetry.packages]]
include = "llama_index/"

After updating the pyproject.toml file, make sure to install the fastembed package with the specified version:

pip install fastembed==0.3.4
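
To confirm which version the environment actually resolved (a quick check, not part of the original suggestion; it assumes pip is the installer):

python -c "from importlib.metadata import version; print(version('fastembed'))"   # should print 0.3.4
pip show fastembed                                                                # shows installed version and location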

Additionally, ensure that your code is updated to use the latest fastembed version [1].

To continue talking to Dosu, mention @dosu.

@Ashufet

Ashufet commented Jul 28, 2024

@dosu How do we update the pyproject.toml while installing the package, and how can we do the same through Docker?

@pydv9991
Author

How do we update the pyproject.toml while installing the package with requirements.txt, and how can we do the same through Docker?

@pydv9991
Author

@dosu the code you shared is giving an error:
ValueError: /llama_index does not contain any element


dosubot bot commented Jul 29, 2024

To address the issues and questions raised:

Updating pyproject.toml and Installing the Package

  1. Update pyproject.toml:
    Ensure that the pyproject.toml file specifies fastembed version 0.3.4:

    [tool.poetry.dependencies]
    python = ">=3.8.1,<3.12"
    llama-index-core = "^0.10.11.post1"
    fastembed = "==0.3.4"
  2. Install the dependencies:
    Run the following commands to install the updated dependencies:

    poetry install
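
One caveat worth adding (an assumption about the workflow, not part of the original answer): after editing dependency constraints in pyproject.toml, Poetry needs the lock file refreshed before installing, otherwise it may report that pyproject.toml and poetry.lock are out of sync:

    poetry lock        # re-resolve dependencies after changing pyproject.toml
    poetry install     # install from the refreshed lock file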

Using requirements.txt

  1. Update requirements.txt:
    Ensure that the requirements.txt file specifies fastembed version 0.3.4:

    fastembed==0.3.4
    
  2. Install the dependencies:
    Run the following command to install the dependencies from requirements.txt:

    pip install -r requirements.txt
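
Note, as a hedged sketch: a requirements.txt that lists both packages will still hit the resolver conflict from the original report as long as the published llama-index-embeddings-fastembed release pins fastembed<0.3.0; it only resolves cleanly once a release with the relaxed pin (see the linked fix) is available:

    # example requirements.txt (illustrative combination, not a guaranteed-working pin set)
    fastembed==0.3.4
    llama-index-embeddings-fastembed   # must be a release that allows fastembed>=0.3.0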

Using Docker

  1. Dockerfile Configuration:
    Update your Dockerfile to include the installation of fastembed version 0.3.4. Here is an example Dockerfile:

    FROM python:3.10-slim
    
    WORKDIR /app
    
    # Install Poetry
    RUN pip install poetry
    
    # Copy pyproject.toml and poetry.lock
    COPY pyproject.toml poetry.lock ./
    
    # Install dependencies
    RUN poetry install --no-root
    
    # Copy the rest of the application code
    COPY . .
    
    # Install fastembed version 0.3.4
    RUN poetry run pip install fastembed==0.3.4
    
    CMD ["poetry", "run", "python", "your_main_script.py"]
  2. Build and Run the Docker Container:
    Build the Docker image and run the container:

    docker build -t llamaindex:latest .
    docker run -it --rm llamaindex:latest
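
A quick way to verify the resolved version inside the built image (a sketch assuming the image tag above and the Poetry-based Dockerfile):

    docker run --rm llamaindex:latest poetry run python -c "from importlib.metadata import version; print(version('fastembed'))"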

Fixing the ValueError: /llama_index does not contain any element Error

This error typically occurs when the specified path in the pyproject.toml file does not contain any files or elements. Ensure that the llama_index/ directory is not empty and contains the required files for your project.

In the pyproject.toml file, make sure the configuration is correct:

[tool.poetry]
name = "llama-index-embeddings-fastembed"
version = "0.1.4"
description = "llama-index embeddings fastembed integration"
authors = ["Your Name <you@example.com>"]
readme = "README.md"
license = "MIT"
packages = [{include = "llama_index"}]

Ensure that the llama_index/ directory contains the necessary files and elements.
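
For reference, a sketch of the directory layout the packages entry expects (assuming the usual llama_index namespace-package convention for integration packages; the file names here are illustrative):

llama_index/
└── embeddings/
    └── fastembed/
        ├── __init__.py
        └── base.py        # FastEmbedEmbedding implementation

If pyproject.toml points at llama_index/ but that directory is missing or empty, Poetry's build backend raises the "does not contain any element" error quoted above.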

By following these steps, you should be able to update the pyproject.toml file, install the package, and resolve the ValueError [1][2][3].

To continue talking to Dosu, mention @dosu.
