Packaging, dependency updates, and other errata #8
Conversation
Force-pushed from b6eec40 to 1734194.
if __name__ == "__main__":
    # Due to https://github.com/python-poetry/poetry/issues/3509 and Poetry
    # assuming the tags, we have to change the wheel ourselves after build
Well this is too bad.
Can we just build the wheels directly rather than via poetry? Will that still have this issue, or is it too inconvenient to do in the first place?
I kinda am, but you need Poetry to get all the files that you want in the wheel set up properly and to have the dependency versions set in the dist info properly (otherwise it gets real hacky).
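For anyone hitting the same thing, here is a minimal sketch of the kind of post-build retagging being discussed. It is not this PR's actual build script: it assumes a `wheel` release new enough to provide the `wheel tags` subcommand plus the `packaging` library, and the `dist/*.whl` glob is just an illustrative default.

```python
# Hypothetical post-build retagging sketch (not this PR's build script).
# Assumes `pip install wheel packaging` and a wheel already built into dist/.
import glob
import subprocess

from packaging import tags


def retag_built_wheels() -> None:
    # Most specific tag supported by the running interpreter/platform,
    # e.g. cp310-cp310-manylinux_2_17_x86_64.
    tag = next(iter(tags.sys_tags()))
    for wheel_path in glob.glob("dist/*.whl"):
        # `wheel tags` rewrites the filename *and* the WHEEL/RECORD metadata,
        # unlike a bare file rename.
        subprocess.run(
            [
                "wheel", "tags",
                "--python-tag", tag.interpreter,
                "--abi-tag", tag.abi,
                "--platform-tag", tag.platform,
                "--remove",  # drop the originally (mis)tagged wheel
                wheel_path,
            ],
            check=True,
        )


if __name__ == "__main__":
    retag_built_wheels()
```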
Force-pushed from 8e0cb94 to 23dd21f.
def from_bridge_payloads(
    payloads: Iterable[temporalio.bridge.proto.common.Payload],
@Sushisource we should have just used the API definition of Payload, lang shouldn't need to bother with these conversions.
Yeah, probably, at this point. I am not a fan of the way it's defined but we're stuck with it so may as well get rid of the extra indirection.
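For context, a rough sketch of the round trip this extra indirection implies, assuming the usual protobuf-generated classes for both the bridge and API `Payload` messages (module paths follow the diff above; treat them as assumptions):

```python
# Hypothetical sketch of the conversion being discussed; module paths are
# taken from the diff above and may not match the real package layout.
from typing import Iterable, List

import temporalio.api.common.v1
import temporalio.bridge.proto.common


def from_bridge_payloads(
    payloads: Iterable[temporalio.bridge.proto.common.Payload],
) -> List[temporalio.api.common.v1.Payload]:
    converted: List[temporalio.api.common.v1.Payload] = []
    for bridge_payload in payloads:
        api_payload = temporalio.api.common.v1.Payload()
        # Both messages have the same wire shape, so parsing the serialized
        # bytes of one into the other preserves metadata and data unchanged.
        api_payload.ParseFromString(bridge_payload.SerializeToString())
        converted.append(api_payload)
    return converted
```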
@@ -13,7 +13,7 @@ jobs:
     fail-fast: true
     matrix:
       python: ["3.7", "3.10"]
-      os: [ubuntu-latest] # TODO: macos-latest, windows-latest
+      os: [ubuntu-latest, macos-latest, windows-latest]
description = "Temporal.io Python SDK"
authors = ["Temporal Technologies Inc <sdk@temporal.io>"]
license = "MIT"
# We need to include proto source that is otherwise excluded via .gitignore.
I still wish we didn't exclude proto files from being in-repo. It is so annoying for a user, if they want to see the proto types/files, to have to download a prebuilt artifact and extract it because we won't commit the source.
They're in-repo via the core submodule though, no?
Install the `temporalio` package from [PyPI](https://pypi.org/project/temporalio). If using `pip` directly, this might
look like:

python -m pip install temporalio
I plan on publishing this once we merge
temporalio/worker.py
# If there is a manager here, we make it flush all pending heartbeats in
# the queue
if manager:
    manager.flush_pending_heartbeats(task_token)
This is required for multiprocess activities because it is technically possible for some heartbeats to remain in the queue unprocessed.
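A rough sketch of the idea, not the SDK's actual implementation: heartbeats from activity child processes land on a shared queue, and completion drains whatever is still queued so the final heartbeats are not lost. All names here are illustrative.

```python
# Hypothetical sketch of flushing queued heartbeats on activity completion;
# PendingHeartbeat, the queue, and dispatch are illustrative names only.
import queue
from typing import Callable, NamedTuple, Sequence


class PendingHeartbeat(NamedTuple):
    task_token: bytes
    details: Sequence[object]


class HeartbeatManagerSketch:
    def __init__(
        self,
        heartbeat_queue: "queue.Queue[PendingHeartbeat]",
        dispatch: Callable[[PendingHeartbeat], None],
    ) -> None:
        # The queue is shared with activity child processes (e.g. a
        # multiprocessing.Manager().Queue()); they put heartbeats on it.
        self._queue = heartbeat_queue
        self._dispatch = dispatch

    def flush_pending_heartbeats(self, task_token: bytes) -> None:
        # Drain everything currently queued so a just-completed activity's
        # final heartbeats are not silently dropped; heartbeats for other
        # task tokens were due anyway, so sending them early is harmless.
        while True:
            try:
                heartbeat = self._queue.get_nowait()
            except queue.Empty:
                return
            self._dispatch(heartbeat)
```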
temporalio/worker.py
            logger.exception("Failed during multiprocess queue poll for heartbeat")
            return

    def flush_pending_heartbeats(self, task_token: bytes) -> None:
        # We're gonna go ahead and just flush them all
This is admittedly lazy and can result in issues if the queue fills up faster than it can be processed. The processing should be quick since it's just scheduling heartbeats with asyncio, not even converting them or anything.
For the next phase, when workflow workers come about, I will probably move all the activity stuff to its own private file/package. I may also provide an alternative SharedStateManager that is a local loopback gRPC server instead of using multiprocessing.Manager(), to see if that works better.
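As a sketch of what "just scheduling heartbeats with asyncio" can look like, assuming a queue-polling thread that hands each heartbeat off to the worker's event loop (the function names and the heartbeat coroutine are hypothetical):

```python
# Hypothetical sketch of handing a heartbeat from a queue-polling thread to
# the asyncio event loop; names and the heartbeat coroutine are illustrative.
import asyncio
from typing import Sequence


async def _send_heartbeat(task_token: bytes, details: Sequence[object]) -> None:
    # Placeholder for the real call that forwards the heartbeat to the server.
    print(f"heartbeat for {task_token!r}: {details}")


def schedule_heartbeat(
    loop: asyncio.AbstractEventLoop, task_token: bytes, details: Sequence[object]
) -> None:
    # Thread-safe hand-off: the polling thread never touches the loop directly;
    # it only schedules a coroutine onto it and returns immediately, which is
    # why draining the queue is cheap.
    asyncio.run_coroutine_threadsafe(_send_heartbeat(task_token, details), loop)
```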
    temporalio.worker.SharedStateManager.create_from_multiprocessing(
        multiprocessing.Manager()
    )
)
We made this eager at the top of the tests because where we originally had it, as a local inside a function, it caused all local vars to be copied during the fork (called implicitly by multiprocessing.Manager()). Those locals included the Python client, which holds a reference to the Rust client; when that Python client was GC'd from the child process, the Rust client was dropped, causing a file descriptor mismatch.
I have opened #11 to be more explicit about how fork works with this SDK.
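A hypothetical conftest-style sketch of that pattern, creating the manager eagerly at module import so the fork happens before any client (and its Rust handle) exists; this is not the repo's actual test setup:

```python
# Hypothetical sketch of the "eager at the top of tests" pattern.
# multiprocessing.Manager() forks its server process at import time, before
# any client exists, so nothing holding a Rust handle is copied into the child.
import multiprocessing

import temporalio.worker

# Created once at module import, before tests construct any clients.
_mp_manager = multiprocessing.Manager()
_shared_state_manager = temporalio.worker.SharedStateManager.create_from_multiprocessing(
    _mp_manager
)


def shared_state_manager() -> temporalio.worker.SharedStateManager:
    # Tests grab the pre-created instance instead of building their own inside
    # a test function (which would fork after a client already exists).
    return _shared_state_manager
```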
Force-pushed from e25f0a2 to 9fc91ec.
Force-pushed from fcbac18 to 8656cc1.
Closing and reopening to attempt to trigger the actions to get going again (they are stuck for whatever reason).
Force-pushed from 58c1bf7 to 16d6299.
Merging.
What was changed