I’ve been playing with both the Thumb and the Unexpected keyboards. I like 'em both but, man, I have to admit I’d like them more if they had that top bar that predicts what you might type next. Is that just a no-go from a privacy perspective? Can that functionality be local?

(I also wouldn’t mind a good voice typing feature)

    • KptnAutismus@lemmy.world · 8 months ago

      i think openboard might be the thing OP is looking for. not using that feature myself, but it seems to be on par with the others i’ve used.

    • SomeBoyo@feddit.de · 8 months ago

My only problem with it is that it removes the currently typed word from the autocomplete bar.

  • AtmaJnana@lemmy.world · 8 months ago

    Yes, very possible. An LLM could be run locally, or at least sandboxed so it only sees your data. In my experience, presumably because there is less training data and fewer iterations, it tends to be slower and produce poorer output.

    Microsoft could also let you control this but of course they do not want to.

    I switched to Openboard. After a few months, it’s still not as good as SwiftKey was, but it’s also not sending all my text input to Microsoft.

    The primary feature I miss from SwiftKey is the ability to insert a GIF easily.

    • therebedragons@lemmy.ml · 8 months ago

      Do any of the open source keyboards have gif integration? I’ve tried floris and anysoft and I miss it so much.

      • AtmaJnana@lemmy.world · 8 months ago

        I have yet to find one. I keep SwiftKey installed and switch inputs when I need to insert a GIF.

  • Tja@programming.dev · 8 months ago

    It can and it will. That is one of the uses of “NPUs” I’m most excited about.

    Basically you can run a (potentially open-source) small LLM on the phone using whatever context the keyboard has access to (at a minimum, what you’ve typed so far) and have the keyboard generate the next token(s).

    Since this is computationally intensive, the model has to be small and you need dedicated hardware to run it efficiently; otherwise you would need a 500W GPU like the big players use. Locally you can do it at around 0.5W. Of course, adjust your expectations accordingly.

    I don’t know of any project doing it right now, but I imagine Microsoft will integrate it into SwiftKey soon, with open-source projects to follow.
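
    To make that concrete, here is a rough sketch of the idea in Python, using the Hugging Face transformers library with distilgpt2 as a stand-in for whatever small model a keyboard would actually ship. This isn't from any real keyboard project, just an illustration of "feed the typed context in, get the next tokens out":

```python
# Sketch only: distilgpt2 stands in for a small on-device model; a real keyboard
# would run a quantized model on an NPU/CPU runtime, not full PyTorch.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "distilgpt2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

def suggest_continuation(typed_so_far: str, max_new_tokens: int = 3) -> str:
    """Predict a short continuation of whatever the user has typed so far."""
    inputs = tokenizer(typed_so_far, return_tensors="pt")
    output = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=False,  # greedy decoding: cheap and deterministic
        pad_token_id=tokenizer.eos_token_id,
    )
    # Keep only the newly generated tokens, not the echoed prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)

print(suggest_continuation("Thanks for the update, I will get back to"))
```

    On a phone you would quantize the model and cap max_new_tokens aggressively, since latency is the whole game for a keyboard.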

    • kevincox@lemmy.ml · 8 months ago

      I think you hugely overestimate what it takes to complete and correct a few words. Maybe you would want some sort of accelerator for fine-tuning, but 1. you probably don’t even need fine-tuning, and 2. you could just run that on the CPU while your device is charging. For inference, modern CPUs are more than powerful enough.
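
      For scale, classic word completion is little more than a frequency-ranked prefix lookup. A toy Python sketch (not any particular keyboard's code) that any phone CPU handles trivially:

```python
# Toy illustration of how cheap plain word completion is: a frequency table
# plus a prefix scan, no accelerator required. Real keyboards add smarter
# dictionaries and n-gram context, but the cost stays tiny.
from collections import Counter

def build_counts(corpus: str) -> Counter:
    """Count word frequencies from whatever text the keyboard has learned from."""
    return Counter(corpus.lower().split())

def complete(prefix: str, counts: Counter, k: int = 3) -> list[str]:
    """Return the k most frequent words starting with the typed prefix."""
    matches = [(word, n) for word, n in counts.items() if word.startswith(prefix.lower())]
    matches.sort(key=lambda item: item[1], reverse=True)
    return [word for word, _ in matches[:k]]

counts = build_counts("the quick brown fox jumps over the quiet quilted queen")
print(complete("qu", counts))  # -> ['quick', 'quiet', 'quilted']
```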

      • Tja@programming.dev · 8 months ago

        Yeah, modern ARM CPUs can run at 3 GHz and play PS4-level games, but I don’t want my phone to become a hand warmer every time I want to type a quick email…

        And of course, I’m not talking about correcting “fuck” to “duck”; I’m talking about ChatGPT-level prediction. Or llama2, or gemini nano, or whatever…

  • kevincox@lemmy.ml · 8 months ago

    While Google isn’t generally good for privacy, GBoard actually does this. IIRC they completely removed the sync service, and your typing history is only kept on-device and in Android backups.

    However, it is otherwise a bit of a privacy nightmare, as many of the other features phone home. But last I checked (~4 years ago, worth checking again) the core typing functionality is fully offline and private.

    So yes, it is possible.

  • technomad@slrpnk.net · 8 months ago

    I’ve been using a version of OpenBoard with Sayboard integrated. It doesn’t work perfectly, and there’s a good amount of frustration that comes with it, but it works well enough for what I need it to do, and I’ll gladly take the trade-off for being less dependent on Google.

    Hopefully it continues to get better, or a more polished alternative comes along.