Add some documentation about LambdaStreamingResponseHandler (langchai…
agoncal authored Sep 27, 2024
1 parent 1185515 commit 92fcf9c
Showing 1 changed file with 19 additions and 0 deletions.
19 changes: 19 additions & 0 deletions docs/docs/tutorials/4-response-streaming.md
@@ -64,4 +64,23 @@ model.generate(userMessage, new StreamingResponseHandler<AiMessage>() {
error.printStackTrace();
}
});
```

A more compact way to stream the response is to use the `LambdaStreamingResponseHandler` class.
This utility class provides static methods to create a `StreamingResponseHandler` using lambda expressions.
Streaming the response with a lambda is straightforward: call the `onNext()` static method, passing a lambda expression that defines what to do with each token:

```java
import static dev.langchain4j.model.LambdaStreamingResponseHandler.onNext;

model.generate("Tell me a joke", onNext(System.out::print));
```

The `onNextAndError()` method allows you to define actions for both the `onNext()` and `onError()` events:

```java
import static dev.langchain4j.model.LambdaStreamingResponseHandler.onNextAndError;

model.generate("Tell me a joke", onNextAndError(System.out::print, Throwable::printStackTrace));
```

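For readers who want to try this end to end, here is a self-contained sketch. It is not part of the committed documentation: it assumes the OpenAI integration module (`OpenAiStreamingChatModel`) is on the classpath, that an `OPENAI_API_KEY` environment variable is set, and it uses a short sleep to keep the process alive because `generate()` streams asynchronously.

```java
// Hypothetical end-to-end sketch, not part of the original doc.
// Assumes the langchain4j OpenAI module is on the classpath and OPENAI_API_KEY is set.
import static dev.langchain4j.model.LambdaStreamingResponseHandler.onNextAndError;

import dev.langchain4j.model.chat.StreamingChatLanguageModel;
import dev.langchain4j.model.openai.OpenAiStreamingChatModel;

public class StreamingJokeExample {

    public static void main(String[] args) throws InterruptedException {

        StreamingChatLanguageModel model = OpenAiStreamingChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .modelName("gpt-4o-mini")
                .build();

        // Print each token as it arrives; print the stack trace if streaming fails.
        model.generate("Tell me a joke",
                onNextAndError(System.out::print, Throwable::printStackTrace));

        // generate() returns immediately and streams in the background,
        // so keep the process alive long enough to receive the full response.
        Thread.sleep(10_000);
    }
}
```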