A Spring Boot starter for LLMs (large language models)! It auto-injects an LLM client and works out of the box. Built on WebFlux, it fully supports streaming LLM output. Supported models include OpenAI, ChatGLM, ERNIE Bot (文心一言), and more.
Build and install:

```shell
mvn clean install -U -Dmaven.test.skip=true
```

Maven dependency:
```xml
<dependency>
    <groupId>cn.llm</groupId>
    <artifactId>llm-spring-boot-starter</artifactId>
    <version>2.7.0-SNAPSHOT</version>
</dependency>
```

2.7.0 also matches the Spring Boot version.
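The starter presumably needs model credentials from your application configuration. The exact property keys are not documented here, so the following `application.yml` sketch is an assumption (the `llm.openai.*` property names are hypothetical):

```yaml
# Hypothetical configuration — property names are assumptions,
# not confirmed by the starter's documentation.
llm:
  openai:
    api-key: ${OPENAI_API_KEY}   # read the key from an environment variable
    model: gpt-3.5-turbo
```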
```java
/**
 * @author kepler
 */
@Controller
@RequestMapping("/llm")
public class LLMController {

    @Autowired
    private OpenAI llmClient;

    // Server-Sent Events endpoint: the Flux streams each reply chunk to the client as it arrives
    @GetMapping(value = "/stream", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    @ResponseBody
    public Flux<LLMReply> stream(@RequestParam(name = "q") String question) {
        return llmClient.completeFlux(question);
    }
}
```

- Run cn.llm.App
- Visit http://localhost:8080/llm/stream?q=your+question (the `q` request parameter carries the prompt)
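Because the endpoint produces `text/event-stream`, the reply arrives as Server-Sent Events. One quick way to watch the stream (assuming the app runs on the default port 8080 and a prompt of "hello") is curl with output buffering disabled:

```shell
# -N disables curl's output buffering so each SSE chunk prints as it arrives
curl -N "http://localhost:8080/llm/stream?q=hello"
```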
Questions or feedback? Feel free to contact me.