
Add Azure OpenAI LLM #395

Merged (6 commits, Dec 22, 2022)
Conversation

gojira (Contributor) commented Dec 21, 2022

Hi! This PR adds support for the Azure OpenAI service to LangChain.

I've tried to follow the contributing guidelines.

hwchase17 (Contributor) left a comment

just trying to fully understand the differences between this and the current OpenAI class... is it only that it uses engine=self.deployment_name rather than model=self.model_name? are there other differences that may pop up in the future?

just trying to understand, because if that is the only difference it's possible they could share even more code (which might make some of the circular dependency stuff easier)
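The single difference under discussion can be sketched with a toy helper. (A hedged illustration, not LangChain code: `request_params` is hypothetical, and the `model` vs `engine` keys reflect the openai v0.x SDK conventions at the time of this PR.)

```python
# Hypothetical sketch: the request payload is keyed by "model" for standard
# OpenAI and by "engine" (the Azure deployment name) for Azure OpenAI.
def request_params(is_azure: bool, name: str) -> dict:
    key = "engine" if is_azure else "model"
    return {key: name, "prompt": "Hello"}

print(request_params(False, "text-davinci-003"))  # {'model': 'text-davinci-003', 'prompt': 'Hello'}
print(request_params(True, "my-deployment"))      # {'engine': 'my-deployment', 'prompt': 'Hello'}
```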

hwchase17 (Contributor)

and i guess a follow-up, when passing in deployment_name do they also need to specify model_name / is that even a concept? (asking because there is some functionality, like a mapping of model_name to max tokens, that may not be available if it's not passed in)

gojira (Contributor, Author) commented Dec 21, 2022

@hwchase17 that is the only difference yes.

And yes, I suppose it would be better for the example to include the model_name to show how you can use that for the token counts.

hwchase17 (Contributor)

depending on the answer to that, something that might be simpler is just removing all direct passing of model=self.model_name and putting it in _default_params, and then overriding those in some subclasses, eg

from typing import Any, Dict

class OpenAI(BaseOpenAI):
    model_name: str = "text-davinci-003"
    """Model name to use."""

    @property
    def _default_params(self) -> Dict[str, Any]:
        return {**{"model": self.model_name}, **super()._default_params}


class AzureOpenAI(BaseOpenAI):
    deployment_name: str = ""
    """Deployment name to use."""

    @property
    def _default_params(self) -> Dict[str, Any]:
        return {**{"engine": self.deployment_name}, **super()._default_params}

happy to help with that! super exciting to see this added :)
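The override pattern above can be exercised end-to-end with minimal stand-ins. (A sketch only: `BaseOpenAI` here is a dummy carrying just a `temperature` field, and `my-deployment` is a hypothetical deployment name, not the real LangChain base class or a real Azure resource.)

```python
from typing import Any, Dict

class BaseOpenAI:
    """Dummy stand-in for the shared base class."""
    temperature: float = 0.7

    @property
    def _default_params(self) -> Dict[str, Any]:
        return {"temperature": self.temperature}

class OpenAI(BaseOpenAI):
    model_name: str = "text-davinci-003"

    @property
    def _default_params(self) -> Dict[str, Any]:
        # this subclass contributes the "model" key
        return {**{"model": self.model_name}, **super()._default_params}

class AzureOpenAI(BaseOpenAI):
    deployment_name: str = "my-deployment"  # hypothetical deployment name

    @property
    def _default_params(self) -> Dict[str, Any]:
        # this subclass contributes the "engine" key instead
        return {**{"engine": self.deployment_name}, **super()._default_params}

print(OpenAI()._default_params)       # {'model': 'text-davinci-003', 'temperature': 0.7}
print(AzureOpenAI()._default_params)  # {'engine': 'my-deployment', 'temperature': 0.7}
```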

gojira (Contributor, Author) commented Dec 21, 2022

> depending on the answer to that, something that might be simpler is just removing all direct passing of model=self.model_name and putting it in _default_params, and then overriding those in some subclasses [code sketch quoted above]
>
> happy to help with that! super exciting to see this added :)

This sounds really great! Makes a lot of sense and is a lot cleaner!

hwchase17 (Contributor)

i took a stab and it actually turned out to be a tiny bit trickier than i thought (because i have some silly _identifying/default param stuff that is tech debt). anyways, since that's my mess i tried to work around it and i think this should do the trick: #396?

although would appreciate you double-checking since i don't have an azure deployment to test on :) probably needs better docstrings, etc (i like the ones you added)

gojira (Contributor, Author) commented Dec 21, 2022

Yup that works!

I can abandon this one or update this one with your changes?

hwchase17 (Contributor)

> Yup that works!
>
> I can abandon this one or update this one with your changes?

can you update this one with my changes? i like your documentation (the stuff about env vars) + example notebook :)

gojira (Contributor, Author) commented Dec 21, 2022

OK I merged your refactor into the PR.

gojira (Contributor, Author) commented Dec 21, 2022

I reordered the imports in __init__.py - lint runs fine locally now

gojira (Contributor, Author) commented Dec 21, 2022

I merged the 2 openai import lines into one:

from langchain.llms.openai import AzureOpenAI, OpenAI

Now the sort check succeeds:

$ poetry run isort . --check
Skipped 2 files

hwchase17 merged commit 543db9c into langchain-ai:master on Dec 22, 2022