
Support self-hosted models (non-OpenAI flavor) #283

Open
cryoff opened this issue Jan 26, 2024 · 1 comment

Comments

@cryoff

cryoff commented Jan 26, 2024

Proposal

It would be great to be able to use non-OpenAI APIs. For example, in a real-world setting I could take Llama 2 or even Flan-T5, host it somewhere, expose an API endpoint, set up the Pezzo proxy to forward requests to it, and submit custom headers.
Similar functionality is currently implemented in LangChain using custom callbacks; that could be an approach here as well.
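To make the workflow concrete, here is a minimal sketch of what "host a model, expose an endpoint, proxy requests with custom headers" could look like. The proxy URL, header names, and model name below are illustrative assumptions, not Pezzo's actual API; many self-hosted servers expose an OpenAI-compatible chat-completions route, which is what this assumes.

```python
# Hypothetical sketch: build a chat-completion request for a self-hosted,
# OpenAI-compatible model routed through a proxy. All URLs and header
# names here are assumptions for illustration, not Pezzo's real API.

def build_proxy_request(prompt, model="llama-2-7b-chat",
                        proxy_url="http://localhost:3000/proxy/v1/chat/completions",
                        extra_headers=None):
    """Return the URL, headers, and JSON body for a proxied completion call."""
    headers = {"Content-Type": "application/json"}
    if extra_headers:
        # Custom headers the proxy could use for routing/auth to the
        # self-hosted backend (hypothetical header name).
        headers.update(extra_headers)
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return {"url": proxy_url, "headers": headers, "json": body}

req = build_proxy_request(
    "Hello!",
    extra_headers={"X-Upstream-Host": "my-llama-server:8080"},
)
# Against a live proxy this could be sent with: requests.post(**req)
```

The point is only that the client side stays OpenAI-shaped while the custom headers tell the proxy which self-hosted backend to hit.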

Use-Case

No response

Is this a feature you are interested in implementing yourself?

Maybe

@developbiao

Hello, can this project support other LLM models, for example Gemini (Vertex AI) or other third-party providers? Are there any plans to add this feature in the future? Thanks :)
