Hacker News
czyhandsome on Sept 27, 2023 | on: Show HN: Magentic – Use LLMs as simple Python func...

Do you support custom LLMs?
jackmpcollins on Sept 27, 2023
At the moment, only those that support the OpenAI Chat API, with function calling used for structured outputs. For example, you can use LocalAI[0][1] to run models locally.
[0] https://github.com/go-skynet/LocalAI
[1] https://localai.io/features/openai-functions/
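To make the reply concrete, here is a minimal sketch (not from the thread) of the request shape an OpenAI Chat API-compatible server, such as a LocalAI instance, would need to accept, including a function-calling definition of the kind used for structured outputs. The base URL, model name, and `get_weather` function are illustrative assumptions, not anything magentic or LocalAI specifically defines.

```python
import json

# Hypothetical LocalAI endpoint; LocalAI exposes an OpenAI-compatible API,
# but the host, port, and path here are assumptions for illustration.
BASE_URL = "http://localhost:8080/v1/chat/completions"

request_body = {
    # Model name the local server maps to a locally loaded model (assumption).
    "model": "gpt-3.5-turbo",
    "messages": [
        {"role": "user", "content": "What's the weather in Dublin?"}
    ],
    # OpenAI-style function definition: the model can respond with a call
    # to this function, giving structured (JSON-schema-conforming) output.
    "functions": [
        {
            "name": "get_weather",  # hypothetical function for illustration
            "description": "Look up the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        }
    ],
}

# This is the JSON body that would be POSTed to BASE_URL.
payload = json.dumps(request_body)
```

Any backend that accepts this request format, and can emit a function-call response, should slot in wherever the OpenAI Chat API is expected.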