
[Data] ray.data.from_huggingface with dynamically generated dataset causes No module named 'datasets_modules' error #49529

Open
Jemoka opened this issue Jan 1, 2025 · 0 comments
Labels
bug Something that is supposed to be working; but isn't data Ray Data-related issues triage Needs triage (eg: priority, bug/not-bug, and owning component)

Comments


Jemoka commented Jan 1, 2025

What happened + What you expected to happen

Similar to #28084, but distinct: the additional constraint for reproduction is that it only occurs with datasets that require dynamically generated loading code.

Calling ray.data.from_huggingface with an HF IterableDataset (streaming=True) on a dataset that requires running remote code to generate causes Ray Data to crash when attempting to materialize or otherwise interact with the dataset. This is likely because the datasets_modules package, which HF dynamically imports during dataset generation, is not loaded on the Ray workers when Ray tries to iterate over the dataset.
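The failure mode can be demonstrated without Ray at all: `datasets` registers its generated loader code under a synthetic `datasets_modules` package in `sys.modules`, and a pickled object referencing that package can only be unpickled in a process where the package exists. A minimal stdlib-only sketch of the same mechanism (the module name `dyn_modules` is hypothetical, standing in for `datasets_modules`):

```python
import pickle
import sys
import types

# Create a module at runtime, the way `datasets` creates `datasets_modules`.
mod = types.ModuleType("dyn_modules")
exec("class Record:\n    def __init__(self, value):\n        self.value = value", mod.__dict__)
sys.modules["dyn_modules"] = mod

# Pickling succeeds: the payload stores only the reference "dyn_modules.Record",
# not the class body itself.
payload = pickle.dumps(mod.Record(42))

# Simulate a fresh Ray worker process that never ran the dynamic import.
del sys.modules["dyn_modules"]
try:
    pickle.loads(payload)
except ModuleNotFoundError as e:
    print(e)  # No module named 'dyn_modules'
```

This matches the bottom of the trace below: the worker dies inside `pickle.loads` during deserialization, before any dataset code runs, because nothing on the worker ever recreated the dynamic module.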

Trace from reproducer below:

2025-01-01 12:14:21,433 INFO worker.py:1821 -- Started a local Ray instance.
2025-01-01 12:14:22,142 INFO streaming_executor.py:108 -- Starting execution of Dataset. Full logs are in /tmp/ray/session_2025-01-01_12-14-20_696005_7922/logs/ray-data
2025-01-01 12:14:22,142 INFO streaming_executor.py:109 -- Execution plan of Dataset: InputDataBuffer[Input] -> TaskPoolMapOperator[ReadHuggingFace]
Running 0: 0.00 row [00:00, ? row/s]                           2025-01-01 12:14:22,533  ERROR streaming_executor_state.py:485 -- An exception was raised from a task of operator "ReadHuggingFace->SplitBlocks(40)". Dataset execution will now abort. To ignore this exception and continue, set DataContext.max_errored_blocks.
⚠️   Dataset execution failed: : 0.00 row [00:00, ? row/s]
- ReadHuggingFace->SplitBlocks(40): Tasks: 1; Queued blocks: 0; Resources: 1.0 CPU, 256.0MB object store: : 0.00 row [00:00, ? row/s]
2025-01-01 12:14:22,544 ERROR exceptions.py:73 -- Exception occurred in Ray Data or Ray Core internal code. If you continue to see this error, please open an issue on the Ray project GitHub page with the full stack trace below: https://github.com/ray-project/ray/issues/new/choose
2025-01-01 12:14:22,544 ERROR exceptions.py:81 -- Full stack trace:
Traceback (most recent call last):
  File "/Users/houjun/Documents/Projects/ragdoll/.venv/lib/python3.11/site-packages/ray/data/exceptions.py", line 49, in handle_trace
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "/Users/houjun/Documents/Projects/ragdoll/.venv/lib/python3.11/site-packages/ray/data/_internal/plan.py", line 498, in execute
    blocks = execute_to_legacy_block_list(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/houjun/Documents/Projects/ragdoll/.venv/lib/python3.11/site-packages/ray/data/_internal/execution/legacy_compat.py", line 123, in execute_to_legacy_block_list
    block_list = _bundles_to_block_list(bundles)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/houjun/Documents/Projects/ragdoll/.venv/lib/python3.11/site-packages/ray/data/_internal/execution/legacy_compat.py", line 169, in _bundles_to_block_list
    for ref_bundle in bundles:
  File "/Users/houjun/Documents/Projects/ragdoll/.venv/lib/python3.11/site-packages/ray/data/_internal/execution/interfaces/executor.py", line 37, in __next__
    return self.get_next()
           ^^^^^^^^^^^^^^^
  File "/Users/houjun/Documents/Projects/ragdoll/.venv/lib/python3.11/site-packages/ray/data/_internal/execution/streaming_executor.py", line 153, in get_next
    item = self._outer._output_node.get_output_blocking(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/houjun/Documents/Projects/ragdoll/.venv/lib/python3.11/site-packages/ray/data/_internal/execution/streaming_executor_state.py", line 312, in get_output_blocking
    raise self._exception
  File "/Users/houjun/Documents/Projects/ragdoll/.venv/lib/python3.11/site-packages/ray/data/_internal/execution/streaming_executor.py", line 230, in run
    continue_sched = self._scheduling_loop_step(self._topology)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/houjun/Documents/Projects/ragdoll/.venv/lib/python3.11/site-packages/ray/data/_internal/execution/streaming_executor.py", line 285, in _scheduling_loop_step
    num_errored_blocks = process_completed_tasks(
                         ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/houjun/Documents/Projects/ragdoll/.venv/lib/python3.11/site-packages/ray/data/_internal/execution/streaming_executor_state.py", line 486, in process_completed_tasks
    raise e from None
  File "/Users/houjun/Documents/Projects/ragdoll/.venv/lib/python3.11/site-packages/ray/data/_internal/execution/streaming_executor_state.py", line 453, in process_completed_tasks
    bytes_read = task.on_data_ready(
                 ^^^^^^^^^^^^^^^^^^^
  File "/Users/houjun/Documents/Projects/ragdoll/.venv/lib/python3.11/site-packages/ray/data/_internal/execution/interfaces/physical_operator.py", line 105, in on_data_ready
    raise ex from None
  File "/Users/houjun/Documents/Projects/ragdoll/.venv/lib/python3.11/site-packages/ray/data/_internal/execution/interfaces/physical_operator.py", line 101, in on_data_ready
    ray.get(block_ref)
  File "/Users/houjun/Documents/Projects/ragdoll/.venv/lib/python3.11/site-packages/ray/_private/auto_init_hook.py", line 21, in auto_init_wrapper
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "/Users/houjun/Documents/Projects/ragdoll/.venv/lib/python3.11/site-packages/ray/_private/client_mode_hook.py", line 103, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/houjun/Documents/Projects/ragdoll/.venv/lib/python3.11/site-packages/ray/_private/worker.py", line 2755, in get
    values, debugger_breakpoint = worker.get_objects(object_refs, timeout=timeout)
                                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/houjun/Documents/Projects/ragdoll/.venv/lib/python3.11/site-packages/ray/_private/worker.py", line 906, in get_objects
    raise value.as_instanceof_cause()
ray.exceptions.RayTaskError(RaySystemError): ray::ReadHuggingFace->SplitBlocks(40)() (pid=7949, ip=127.0.0.1)
  At least one of the input arguments for this task could not be computed:
ray.exceptions.RaySystemError: System error: No module named 'datasets_modules'
traceback: Traceback (most recent call last):
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
          ^^^^^^^^^^^^^^^^^^^^^
ModuleNotFoundError: No module named 'datasets_modules'
ray.data.exceptions.SystemException

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Users/houjun/Documents/Projects/ragdoll/data.py", line 61, in <module>
    ds.materialize()
  File "/Users/houjun/Documents/Projects/ragdoll/.venv/lib/python3.11/site-packages/ray/data/dataset.py", line 4884, in materialize
    copy._plan.execute()
  File "/Users/houjun/Documents/Projects/ragdoll/.venv/lib/python3.11/site-packages/ray/data/exceptions.py", line 89, in handle_trace
    raise e.with_traceback(None) from SystemException()
ray.exceptions.RayTaskError(RaySystemError): ray::ReadHuggingFace->SplitBlocks(40)() (pid=7949, ip=127.0.0.1)
  At least one of the input arguments for this task could not be computed:
ray.exceptions.RaySystemError: System error: No module named 'datasets_modules'
traceback: Traceback (most recent call last):
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
          ^^^^^^^^^^^^^^^^^^^^^
ModuleNotFoundError: No module named 'datasets_modules'
(ReadHuggingFace->SplitBlocks(40) pid=7949) No module named 'datasets_modules'
(ReadHuggingFace->SplitBlocks(40) pid=7949) Traceback (most recent call last):
(ReadHuggingFace->SplitBlocks(40) pid=7949)   File "/Users/houjun/Documents/Projects/ragdoll/.venv/lib/python3.11/site-packages/ray/_private/serialization.py", line 460, in deserialize_objects
(ReadHuggingFace->SplitBlocks(40) pid=7949)     obj = self._deserialize_object(data, metadata, object_ref)
(ReadHuggingFace->SplitBlocks(40) pid=7949)           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(ReadHuggingFace->SplitBlocks(40) pid=7949)   File "/Users/houjun/Documents/Projects/ragdoll/.venv/lib/python3.11/site-packages/ray/_private/serialization.py", line 317, in _deserialize_object
(ReadHuggingFace->SplitBlocks(40) pid=7949)     return self._deserialize_msgpack_data(data, metadata_fields)
(ReadHuggingFace->SplitBlocks(40) pid=7949)            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(ReadHuggingFace->SplitBlocks(40) pid=7949)   File "/Users/houjun/Documents/Projects/ragdoll/.venv/lib/python3.11/site-packages/ray/_private/serialization.py", line 272, in _deserialize_msgpack_data
(ReadHuggingFace->SplitBlocks(40) pid=7949)     python_objects = self._deserialize_pickle5_data(pickle5_data)
(ReadHuggingFace->SplitBlocks(40) pid=7949)                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
(ReadHuggingFace->SplitBlocks(40) pid=7949)   File "/Users/houjun/Documents/Projects/ragdoll/.venv/lib/python3.11/site-packages/ray/_private/serialization.py", line 262, in _deserialize_pickle5_data
(ReadHuggingFace->SplitBlocks(40) pid=7949)     obj = pickle.loads(in_band)
(ReadHuggingFace->SplitBlocks(40) pid=7949)           ^^^^^^^^^^^^^^^^^^^^^
(ReadHuggingFace->SplitBlocks(40) pid=7949) ModuleNotFoundError: No module named 'datasets_modules'

Versions / Dependencies

ray==2.40.0
datasets==3.2.0
huggingface-hub==0.27.0

Reproduction script

import datasets
import ray

ds = datasets.load_dataset(
    path="hotpotqa/hotpot_qa",
    name="fullwiki",
    split="train",
    streaming=True,
    trust_remote_code=True,
)
ds = ray.data.from_huggingface(ds)
ds.materialize()

Issue Severity

Medium: It is a significant difficulty but I can work around it.

@Jemoka Jemoka added bug Something that is supposed to be working; but isn't triage Needs triage (eg: priority, bug/not-bug, and owning component) labels Jan 1, 2025
@jcotant1 jcotant1 added the data Ray Data-related issues label Jan 2, 2025