Given the recently released Apache Beam integration and Sentry's general push toward data pipelines: is a Sentry integration with Apache Spark (especially PySpark) planned?
apache/spark#20151 explicitly mentions that one motivation for that PR was to make a Sentry integration easier. However, I couldn't find any example code for capturing Spark errors with Sentry, and my own attempts to build an integration on top of the feature introduced by that PR have failed so far.
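For reference, here is the kind of stopgap I've been using in the meantime: a driver-side wrapper that forwards job failures to Sentry via `sentry_sdk.capture_exception`. This is only a minimal sketch, not an actual Spark integration; the `report_spark_errors` decorator, `run_job`, and the DSN are all hypothetical, and the wrapper only sees exceptions that surface on the driver, not executor-side failures.

```python
# Hypothetical sketch: capture driver-side Spark job failures and forward
# them to Sentry. The wrapper degrades to a no-op reporter when the
# sentry_sdk package is unavailable, so it has no hard dependency on Sentry.
from functools import wraps

try:
    import sentry_sdk  # assumed to be installed in the driver environment
    sentry_sdk.init(dsn="https://examplePublicKey@o0.ingest.sentry.io/0")  # placeholder DSN
    _capture = sentry_sdk.capture_exception
except ImportError:
    _capture = lambda exc: None  # no-op when Sentry is not installed


def report_spark_errors(fn):
    """Wrap a driver-side job function and report any exception to Sentry."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        try:
            return fn(*args, **kwargs)
        except Exception as exc:
            _capture(exc)   # send the event to Sentry (if configured)
            raise           # keep normal Spark failure semantics
    return wrapper


@report_spark_errors
def run_job(numbers):
    # Stand-in for a real PySpark action, e.g. rdd.count() or df.write(...)
    if any(n < 0 for n in numbers):
        raise ValueError("negative input")
    return sum(numbers)
```

This catches nothing that happens inside executors until the failure propagates back to the driver as a raised exception, which is exactly why a proper integration (ideally hooking the listener machinery that the PR touches) would be preferable.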