Remove Storage objects initial check in get_flows #3027

Closed
@mithalee

Description

USE CASE:
I am trying to work with the same version of a flow that I have already uploaded to my S3 bucket. The issue I found is that storage.flows is empty on a newly constructed storage object, so it does not find my existing flow in the S3 bucket.
If I explicitly register the flow name, then I can access my flow through the get_flow method; otherwise I get the error "Flow not contained in the storage".
Please let me know if I am missing anything.
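
For context, the failure seems to come from the initial lookup that get_flow performs against the storage object's in-memory flows dict before it ever touches S3: a freshly constructed S3(bucket=..., key=...) has an empty flows mapping, so the lookup fails even though the flow exists in the bucket. The snippet below is only an illustration of the kind of check this issue asks to remove, not the actual Prefect source:

# Illustration only: roughly the shape of the initial check described in this issue.
class S3Storage:
    def __init__(self, bucket: str, key: str = None):
        self.bucket = bucket
        self.key = key
        self.flows = {}  # flow name -> key; empty on a freshly constructed object

    def get_flow(self, flow_location: str):
        # The problematic part: the lookup only consults the in-memory dict,
        # so a new storage object can never find a flow that already lives in S3.
        if flow_location not in self.flows.values():
            raise ValueError("Flow not contained in the storage")
        # ...otherwise download bucket/flow_location and deserialize the flow...
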
from prefect import Flow
# Import path for the Prefect 0.12/0.13-era storage classes used at the time of this issue
from prefect.environments.storage import S3

# define globally the flow name and flow location, since storage.flows is not able to find my flow from S3
dictFlows = {'flows': {'ETL': 'etl/testflow'}, 'flow_location': 'etl/testflow'}

def test_add_flow_to_S3():
    storage = S3(bucket="test", key="etl/testflow")
    f = Flow("ETL")
    assert f.name not in storage
    with Flow("ETL") as f:
        e = extract()  # extract, transform, load are user-defined tasks (not shown)
        t = transform(e)
        l = load(t)
    flow_location = storage.add_flow(f)
    assert f.name in storage
    storage.build()

def test_get_flow_S3(dictFlows):
    print("i am in get flow")
    storage = S3(bucket="test", key="etl/testflow")
    storage.flows = dictFlows['flows']
    newflow = storage.get_flow('etl/testflow')
    print("S3 FLOW OUTPUT")
    newflow.run()
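
As a stopgap while the check is in place, the serialized flow can also be pulled straight from the bucket and deserialized by hand, bypassing the storage object's flows registry entirely. This is only a sketch: it assumes the object stored at the key is a cloudpickle payload (which, as far as I can tell, is what storage.build() uploads), and load_flow_from_s3 is a hypothetical helper, not a Prefect API:

# Workaround sketch: read the serialized flow directly from S3.
import boto3
import cloudpickle

def load_flow_from_s3(bucket: str, key: str):
    # Download the raw bytes that storage.build() uploaded for this flow
    obj = boto3.client("s3").get_object(Bucket=bucket, Key=key)
    return cloudpickle.loads(obj["Body"].read())

flow = load_flow_from_s3("test", "etl/testflow")
flow.run()
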
NOTE: I discussed the above issue in the Prefect Slack channel, and it looks like a bug.
