Apache Spark integration #491

Closed
@harpaj

Description

With the recently released integration for Apache Beam and Sentry's general push toward data pipelines, is an integration of Sentry with Apache Spark (especially PySpark) planned?

apache/spark#20151 explicitly mentions that one reason for creating that PR was to make a Sentry integration easier. However, I couldn't even find example code for using Sentry to capture Spark errors, and my own attempts to build an integration on top of the feature introduced by that PR have failed so far.
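For context, the kind of driver-side wrapper I had in mind looks roughly like this. This is a minimal sketch and not Spark-specific: `reporter` stands in for `sentry_sdk.capture_exception`, and `run_job` is a placeholder for whatever triggers a Spark action (executor failures still surface on the driver as an exception, so this only covers driver-visible errors):

```python
import functools

def capture_with(reporter):
    """Decorator: report any exception raised by the wrapped function, then re-raise.

    `reporter` is a placeholder for sentry_sdk.capture_exception; in a real
    setup you would call sentry_sdk.init(dsn=...) once on the driver first.
    """
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            try:
                return fn(*args, **kwargs)
            except Exception as exc:
                reporter(exc)  # e.g. sentry_sdk.capture_exception(exc)
                raise          # keep normal Spark failure behaviour
        return wrapper
    return decorator

# Stand-in for the Sentry client so the sketch runs without sentry_sdk.
captured = []

@capture_with(captured.append)
def run_job():
    # Placeholder for a PySpark action, e.g. df.write.parquet(...)
    raise ValueError("stage failed")

try:
    run_job()
except ValueError:
    pass

print(type(captured[0]).__name__)  # prints "ValueError"
```

This only catches what propagates to the driver; what I was hoping the PR's listener hook would enable is reporting executor-side errors with proper stack traces, which this pattern can't do.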

Metadata

Assignees: no one assigned
Labels: New Integration (Integrating with a new framework or library)
Type: none
Projects: none
Milestone: none
Relationships: none
Development: no branches or pull requests