Hacker News

Is it really LLMs (plural) when you only have OpenAI integration?


Right now it only works with OpenAI chat models (gpt-3.5-turbo, gpt-4), but if there's interest I plan to extend it to support several backends. Each of these would probably wrap an existing library that implements structured output generation, such as https://github.com/outlines-dev/outlines or https://github.com/guidance-ai/guidance. If you have ideas on how this should be done, let me know - a GitHub issue would be great so it's visible to others.


Oh, and some companies offer APIs that match the OpenAI API, and there are open-source projects that do the same for llama models running locally. Since those are compatible with the openai Python package, they will work with magentic too - though some of them do not support function calling.

See for example Anyscale Endpoints https://app.endpoints.anyscale.com/landing and https://github.com/AmineDiro/cria
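The reason these servers are drop-in replacements is that the request shape is identical; only the base URL and model name change. A minimal sketch of this (the local URL and model name below are placeholders, not from any specific project):

```python
import json

# Build an OpenAI-style chat completions request. An OpenAI-compatible
# server (Anyscale Endpoints, cria, etc.) accepts this same payload;
# only the base URL and model name differ.
def chat_request(base_url: str, model: str, user_message: str) -> dict:
    return {
        "url": f"{base_url}/chat/completions",
        "payload": {
            "model": model,
            "messages": [{"role": "user", "content": user_message}],
        },
    }

openai_req = chat_request("https://api.openai.com/v1", "gpt-3.5-turbo", "Hi")
local_req = chat_request("http://localhost:8000/v1", "llama-2-7b-chat", "Hi")

# Same payload structure for both; only the URL differs.
assert openai_req["payload"].keys() == local_req["payload"].keys()
print(json.dumps(local_req, indent=2))
```

So any client built on the openai package can be pointed at one of these servers by overriding the API base URL.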


There's also LocalAI[0] which allows the use of local LLMs with an OpenAI compatible API.

[0] https://github.com/go-skynet/LocalAI


Thanks for sharing! LocalAI supports function calling[0], so this should work for most or all of magentic's features - I'm curious whether concurrent requests work. I will test this out.

[0] https://localai.io/features/openai-functions/
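Function calling works by sending a JSON Schema description of each function alongside the messages, and the model replies with arguments as JSON. A rough sketch of building that schema from a Python signature (this helper and the example function are illustrative, not magentic's or LocalAI's actual code, and only a few types are mapped):

```python
import inspect

# Map a handful of Python annotations to JSON Schema types.
_JSON_TYPES = {int: "integer", float: "number", str: "string", bool: "boolean"}

def function_schema(fn) -> dict:
    """Build an OpenAI-style function-calling schema from a signature."""
    sig = inspect.signature(fn)
    properties = {
        name: {"type": _JSON_TYPES[param.annotation]}
        for name, param in sig.parameters.items()
    }
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "parameters": {
            "type": "object",
            "properties": properties,
            "required": list(properties),
        },
    }

def get_weather(city: str, celsius: bool) -> str:
    """Return the current weather for a city."""
    ...

schema = function_schema(get_weather)
assert schema["parameters"]["properties"]["city"] == {"type": "string"}
```

A server claiming OpenAI compatibility needs to accept this `functions` shape in the request for function calling to work end to end.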


I tried out guidance and encountered endless bugs.


OpenAI offers a few different LLMs :)


text-generation-webui offers an OpenAI-compatible API implementation, specifically to support OpenAI API clients, so just by wrapping the OpenAI API you get support for more than OpenAI.

You could get more flexibility by abstracting over the underlying LLM APIs, but then you also have to deal with the differing feature sets of those APIs, the same conceptual feature exposed through very different parameter structures, etc., etc.
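The abstraction described above can be sketched as a small interface that application code depends on, with each LLM API hidden behind one implementation. All names here are illustrative, not from any real library; the stub backend stands in for a real API client:

```python
from typing import Protocol

class ChatBackend(Protocol):
    """Interface that every LLM backend implements."""
    def complete(self, messages: list[dict]) -> str: ...

class EchoBackend:
    """Stub backend for testing: returns the last user message.
    A real implementation would call OpenAI, a local server, etc.,
    and translate this interface to that API's parameter structure."""
    def complete(self, messages: list[dict]) -> str:
        return messages[-1]["content"]

def ask(backend: ChatBackend, question: str) -> str:
    # Application code depends only on the protocol, so swapping
    # OpenAI for a local model means swapping one object.
    return backend.complete([{"role": "user", "content": question}])

assert ask(EchoBackend(), "ping") == "ping"
```

The cost the comment points out lives inside each backend: every implementation must reconcile its API's particular features and parameters with this one shared interface.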



