I'm looking for a local model that doesn't spy on me, can translate text into several languages, and doesn't heat up my hardware too much, since computers are getting more expensive these days.
The Firefox browser has translation built in, and it works fairly well. There's also LibreTranslate as a self-hosted service; it's okay in my experience, though not particularly good. What I tend to do is just copy-paste text into my local LLM and tell it to translate. Most models will do it, but they have to be trained on multiple languages and can't be too small. You could try one of the Ministral models at whatever size fits without heating up your computer, though I'd bet the average model from Meta or Google will do just as well; they all seem to have multilingual capabilities these days. For web use, I'd recommend Firefox. I can read Japanese websites with it. It's not perfect by any means, but it's light on resources and only takes a few seconds, even on battery power on my laptop.
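If you'd rather script the copy-paste step than do it by hand, here's a minimal sketch that sends a translation prompt to a locally running Ollama instance over its documented REST API. It assumes Ollama is listening on its default port (11434) and that you've already pulled a model; the model name `mistral` and the prompt wording are just placeholders, swap in whatever you actually run.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_translate_request(text, target_lang, model="mistral"):
    """Build the JSON payload for Ollama's /api/generate endpoint.
    The model name is an assumption; use whichever model you have pulled."""
    prompt = (
        f"Translate the following text into {target_lang}. "
        f"Reply with only the translation:\n\n{text}"
    )
    return {"model": model, "prompt": prompt, "stream": False}

def translate(text, target_lang, model="mistral"):
    """POST the prompt to the local Ollama server and return its reply."""
    payload = json.dumps(build_translate_request(text, target_lang, model)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        # Non-streaming responses put the full text in the "response" field
        return json.loads(resp.read())["response"].strip()

# Usage (with Ollama running): translate("Guten Morgen!", "English")
```

Everything stays on your machine, so it's as private as the model you run.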
LibreTranslate isn't LLM-based, but it's alright; it'll let you get the general gist of something. If you want something that works with your local Ollama, there's Page Assist for Firefox and its forks. You can pop it into the sidebar, and when you open it, it lets you run prompts on the current page.
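For completeness, a self-hosted LibreTranslate instance exposes a simple `/translate` HTTP API you can call directly. This sketch assumes the default port 5000 and no API key configured; the endpoint and the `q`/`source`/`target`/`format` fields are from LibreTranslate's documented API.

```python
import json
import urllib.request

LT_URL = "http://localhost:5000/translate"  # default port for a self-hosted instance

def build_payload(text, source="auto", target="en"):
    """Payload for LibreTranslate's /translate endpoint."""
    return {"q": text, "source": source, "target": target, "format": "text"}

def lt_translate(text, source="auto", target="en"):
    """POST to a local LibreTranslate server and return the translated text."""
    data = json.dumps(build_payload(text, source, target)).encode()
    req = urllib.request.Request(
        LT_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["translatedText"]

# Usage (with LibreTranslate running): lt_translate("Bonjour le monde", target="en")
```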
LibreTranslate has a new project going on that's focused on AI translation: https://github.com/LibreTranslate/LTEngine
That’s nice, because LLMs seem to translate things better than the regular tools, especially when it comes to things like idioms.