
Add function return documentation for LLMService (#1721)
- Adds `Returns` documentation to the docstrings of the abstract methods in the LLMService and LLMClient base classes (llm_service.py)

Fixes #1720
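The added sections follow the numpydoc docstring style, which expects a blank line before each section header such as `Returns`. A minimal sketch of the convention, using a hypothetical standalone function (not the actual morpheus code):

```python
import inspect


def get_input_names() -> list[str]:
    """
    Returns the names of the inputs to the model.

    Returns
    -------
    list[str]
        List of input names.
    """
    return ["prompt"]


# inspect.getdoc() strips leading blank lines and dedents, which makes the
# section layout easy to inspect programmatically.
doc = inspect.getdoc(get_input_names)
print(doc)
```

Tools such as Sphinx with the numpydoc extension parse the `Returns` header only when it is separated from the summary by a blank line and underlined with dashes.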

## By submitting this PR I confirm:
- I am familiar with the [Contributing Guidelines](https://github.com/nv-morpheus/Morpheus/blob/main/docs/source/developer_guide/contributing.md).
- When the PR is ready for review, new or existing tests cover these changes.
- When the PR is ready for review, the documentation is up to date with these changes.

Authors:
  - https://github.com/acaklovic-nv

Approvers:
  - Michael Demoret (https://github.com/mdemoret-nv)

URL: #1721
acaklovic-nv authored May 28, 2024
1 parent 6eeff86 commit 3360602
Showing 1 changed file with 30 additions and 0 deletions.
30 changes: 30 additions & 0 deletions morpheus/llm/services/llm_service.py
@@ -30,6 +30,11 @@ class LLMClient(ABC):
def get_input_names(self) -> list[str]:
"""
Returns the names of the inputs to the model.

Returns
-------
list[str]
List of input names.
"""
pass

@@ -42,6 +47,11 @@ def generate(self, **input_dict) -> str:
----------
input_dict : dict
Input containing prompt data.

Returns
-------
str
Generated response for prompt.
"""
pass

@@ -54,6 +64,11 @@ async def generate_async(self, **input_dict) -> str:
----------
input_dict : dict
Input containing prompt data.

Returns
-------
str
Generated async response for prompt.
"""
pass

@@ -80,6 +95,11 @@ def generate_batch(self, inputs: dict[str, list], return_exceptions=False) -> li
Inputs containing prompt data.
return_exceptions : bool
Whether to return exceptions in the output list or raise them immediately.

Returns
-------
list[str] | list[str | BaseException]
List of responses or list of responses and exceptions.
"""
pass

@@ -110,6 +130,11 @@ async def generate_batch_async(self,
Inputs containing prompt data.
return_exceptions : bool
Whether to return exceptions in the output list or raise them immediately.

Returns
-------
list[str] | list[str | BaseException]
List of responses or list of responses and exceptions.
"""
pass

@@ -131,5 +156,10 @@ def get_client(self, *, model_name: str, **model_kwargs) -> LLMClient:
model_kwargs : dict[str, typing.Any]
Additional keyword arguments to pass to the model.

Returns
-------
LLMClient
Client for interacting with LLM models.
"""
pass
