ellama is a tool for interacting with LLMs from Emacs. It now uses the llm library as a backend and supports multiple providers.
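As a rough sketch of what a provider setup might look like (the `make-llm-ollama` constructor and the `ellama-provider` option are my recollection of the llm/ellama docs, and the model name is just an example -- verify against the current READMEs before copying):

```elisp
;; Sketch: pointing ellama at a local Ollama provider via the llm library.
;; `make-llm-ollama' and `ellama-provider' are assumptions based on the
;; documented llm providers; check your installed versions' docs.
(require 'llm-ollama)
(setopt ellama-provider
        (make-llm-ollama
         :chat-model "zephyr:7b"          ; any model served by local Ollama
         :embedding-model "zephyr:7b"))
```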
That was fast, nice work! I posted in your last thread. Now that you support GPT-4 I can probably use this.
Couple Q’s before I set some time aside to try it out:
- Do you have a public API if I want to define some custom prompt commands? I'd like to do some simple things like make region-specific functions with a custom prompt string. I could hack it up, but it would be helpful to know if there's an API surface I should use to keep things stable between versions.
- Where do you want to go from here?
Sure. You can treat the functions without double dashes as the public API. If you want something specific, open an issue, or you can even use the llm library directly if you want more control.
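To illustrate, a region-specific command might be sketched like this. `ellama-instant` is assumed here to be one of the non-double-dash entry points that accepts a prompt string; substitute whichever public function actually fits and check its signature first:

```elisp
;; Sketch of a custom region command built on ellama's public surface.
;; `ellama-instant' taking a single prompt string is an assumption --
;; verify against your installed ellama version.
(defun my/ellama-explain-region (beg end)
  "Ask the LLM to explain the active region."
  (interactive "r")
  (ellama-instant
   (concat "Explain this code:\n"
           (buffer-substring-no-properties beg end))))
```

Bind it to a key or call it with `M-x my/ellama-explain-region` on an active region.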
Has anyone installed ellama with elpaca? I wanted to play around with it, but elpaca errors out with “Unable to determine recipe URL”. And it doesn’t seem to be a matter of refreshing: that has not helped.
Strange.
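For what it's worth, when elpaca can't infer a recipe URL, spelling the recipe out explicitly sometimes works around it. A sketch (the `:repo` path is an assumption -- confirm the actual repository before using it):

```elisp
;; Sketch: explicit elpaca recipe so no URL auto-detection is needed.
;; The github repo path below is assumed; verify it first.
(elpaca (ellama :host github :repo "s-kostyaev/ellama"))
```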