Sequential Execution of Dynamic Mapped Task #26093
Hello, is it possible to define sequential dependencies for dynamically mapped tasks?

The case is as follows:

list_s3_files -> extract -> transform -> load

The extract, transform, and load steps are expanded for all files from the first task. Since the files are partitioned by a timestamp field, the load step should append to the incremental dataset following the order of files generated by list_s3_files.

Thanks
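For concreteness, a minimal sketch of the pipeline described, using Airflow's TaskFlow API and dynamic task mapping; the task bodies and file keys are placeholders. Note that each expanded stage runs its mapped instances in parallel, so nothing here guarantees that the load instances execute in file order:

```python
from airflow.decorators import dag, task
import pendulum


@dag(schedule=None, start_date=pendulum.datetime(2022, 9, 1), catchup=False)
def s3_incremental_pipeline():
    @task
    def list_s3_files() -> list[str]:
        # Placeholder: list keys from the bucket, e.g. with S3Hook.list_keys().
        return ["data/2022-09-01.csv", "data/2022-09-02.csv"]

    @task
    def extract(key: str) -> str:
        ...  # download / read one file
        return key

    @task
    def transform(key: str) -> str:
        ...  # clean / reshape the data for one file
        return key

    @task
    def load(key: str) -> None:
        ...  # append one file's rows to the incremental dataset

    # Each stage is expanded over the previous stage's mapped output,
    # producing one extract/transform/load instance per listed file.
    load.expand(key=transform.expand(key=extract.expand(key=list_s3_files())))


s3_incremental_pipeline()
```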
No. Dynamic tasks run in parallel, and that is the main reason to use them this way. If you want to process your files sequentially, it makes no sense at all to turn them into Dynamic Tasks; you can write your own custom (non-dynamic) task instead.

If, however, you want to do some processing in parallel and THEN join the results sequentially (a typical map-reduce scenario), then you can run the dynamic tasks for each file separately (storing the output somewhere, like S3) and THEN run a single task that takes the results of those parallel-processed files and joins them together. Dynamic Task Mapping has full support for this: it can pass the "array" of outputs (for example, the output files produced), and your next task can take that array and process all the files from it as it finds appropriate.
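A minimal sketch of that map-reduce shape, again with the TaskFlow API; the task names (process, load) and the sort-by-key ordering are assumptions for illustration. The mapped process instances run in parallel, while the single load task receives the whole list of their outputs and can impose the timestamp order itself:

```python
from airflow.decorators import dag, task
import pendulum


@dag(schedule=None, start_date=pendulum.datetime(2022, 9, 1), catchup=False)
def map_reduce_load():
    @task
    def list_s3_files() -> list[str]:
        # Placeholder: list the partitioned input keys from S3.
        return ["data/2022-09-02.csv", "data/2022-09-01.csv"]

    @task
    def process(key: str) -> str:
        # Map step: one instance per file, all running in parallel.
        # Placeholder: transform the file, write the result back to S3,
        # and return the key of the processed output.
        return f"processed/{key}"

    @task
    def load(processed_keys: list[str]) -> None:
        # Reduce step: a single task instance receives the whole "array"
        # of mapped outputs, so it can process them in any order it needs,
        # e.g. by the timestamp encoded in each key.
        for key in sorted(processed_keys):
            ...  # append to the incremental dataset, oldest first

    load(process.expand(key=list_s3_files()))


map_reduce_load()
```

Because the ordering requirement applies only at the load step, this keeps the expensive per-file work parallel while the single reduce task restores the sequential, timestamp-ordered append.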