AbstractDatadogSparkListener.java
@@ -239,6 +239,8 @@ public void onApplicationEnd(SparkListenerApplicationEnd applicationEnd) {
    }
  }

  // This method is called via reflection in SparkExitAdvice; make sure to update the
  // advice if this method's signature changes.
  public synchronized void finishApplication(
      long time, Throwable throwable, int exitCode, String msg) {
    log.info("Finishing spark application trace");

SparkExitAdvice.java
@@ -0,0 +1,28 @@
package datadog.trace.instrumentation.spark;

import java.lang.reflect.Method;
import net.bytebuddy.asm.Advice;

class SparkExitAdvice {
  @Advice.OnMethodEnter(suppress = Throwable.class)
  public static void enter(@Advice.Argument(0) int exitCode) {
    try {
      // Using reflection because java.lang.* instrumentation has to be done at bootstrap,
      // when the Spark classes have not been injected yet
      Class<?> klass =
          Thread.currentThread()
              .getContextClassLoader()
              .loadClass("datadog.trace.instrumentation.spark.AbstractDatadogSparkListener");
      Object datadogListener = klass.getDeclaredField("listener").get(null);
      if (datadogListener != null) {
        Method method =
            datadogListener
                .getClass()
                .getMethod(
                    "finishApplication", long.class, Throwable.class, int.class, String.class);
        method.invoke(datadogListener, System.currentTimeMillis(), null, exitCode, null);
      }
    } catch (Exception ignored) {
    }
  }
}
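
The advice resolves the listener and the finishApplication method entirely by name, so it is tightly coupled to the shape of AbstractDatadogSparkListener (hence the reminder comment on finishApplication above). A minimal sketch of the contract the advice assumes - the field modifiers and class shape here are assumptions for illustration, not the actual class:

package datadog.trace.instrumentation.spark;

public abstract class AbstractDatadogSparkListener {
  // Read reflectively by SparkExitAdvice via getDeclaredField("listener").get(null),
  // so it must be static; assumed to stay null until the Spark listener is installed
  public static volatile AbstractDatadogSparkListener listener = null;

  // Looked up by SparkExitAdvice via getMethod("finishApplication",
  // long.class, Throwable.class, int.class, String.class) - keep signatures in sync
  public synchronized void finishApplication(
      long time, Throwable throwable, int exitCode, String msg) {
    // ... finish the application trace (see the first hunk in this diff) ...
  }
}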

SparkExitInstrumentation.java
@@ -0,0 +1,35 @@
package datadog.trace.instrumentation.spark;

import static datadog.trace.agent.tooling.bytebuddy.matcher.NameMatchers.named;
import static net.bytebuddy.matcher.ElementMatchers.isDeclaredBy;

import com.google.auto.service.AutoService;
import datadog.trace.agent.tooling.Instrumenter;
import datadog.trace.agent.tooling.InstrumenterModule;
import datadog.trace.api.Config;

@AutoService(InstrumenterModule.class)
public class SparkExitInstrumentation extends InstrumenterModule.Tracing
    implements Instrumenter.ForSingleType, Instrumenter.HasMethodAdvice, Instrumenter.ForBootstrap {

  public SparkExitInstrumentation() {
    super("spark-exit");
  }

  @Override
  protected boolean defaultEnabled() {
    return Config.get().isDataJobsEnabled();
  }

  @Override
  public String instrumentedType() {
    return "java.lang.Runtime";

Inline review comment from @mcculls (Contributor), Mar 17, 2025:
We already instrument java.lang.Shutdown in ShutdownInstrumentation - this calls ShutdownHelper.shutdownAgent(), which in turn calls Agent.shutdown() to safely close out various components.

Other components register a shutdown hook with the JVM using the public API.

I'd prefer it if one of these existing approaches could be (re)used, rather than introducing yet another form of shutdown hook.
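
For reference, the "public API" route mentioned here is a JVM shutdown hook. A minimal sketch (hypothetical example class, not dd-trace-java code) - note that a hook like this runs on any JVM shutdown but cannot observe the exit code, which is the limitation the author's reply below hinges on:

public class ShutdownHookExample {
  public static void main(String[] args) {
    // Register a shutdown hook through the public JVM API
    Runtime.getRuntime()
        .addShutdownHook(
            new Thread(
                () -> {
                  // Runs on System.exit(n), normal termination, or SIGTERM,
                  // but the exit code n is not visible from inside the hook
                  System.out.println("flushing traces before JVM exit");
                }));
    System.exit(3); // the hook runs, but the value 3 is invisible to it
  }
}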


Reply from the PR author (Contributor):

Created a new instrumentation for Runtime.exit() in order to:

  • capture the exit code
  • inject the advice only when data jobs is enabled

Instrumenting Shutdown.exit() in the existing ShutdownInstrumentation instead would not change much, but that advice would then be injected for everyone, not only when data jobs is enabled.

  }

  @Override
  public void methodAdvice(MethodTransformer transformer) {
    transformer.applyAdvice(
        named("exit").and(isDeclaredBy(named("java.lang.Runtime"))),
        packageName + ".SparkExitAdvice");
  }
}