I’ve been playing with both the Thumb and the Unexpected keyboards. I like 'em both but, man, I have to admit I’d like them more if they had that top bar that predicts what you might be typing. Is that just a no-go from a privacy perspective? Can that functionality be local?

(I also wouldn’t mind a good voice typing feature)

    • Shamot · 11 months ago

      Last updated 18 months ago on F-Droid, but the GitHub still looks active. I hope they’ll soon have a releasable version.

      • Antiochus@lemmy.one · 11 months ago

        The linked project is a fork of the version on F-Droid. You can download the APK directly from GitHub and use it just fine.

        • RovingFox@infosec.pub · 11 months ago

          This.

          Also, I recommend using it with Obtainium. It automatically searches for updates across multiple sources like GitHub, GitLab, F-Droid Official, F-Droid third-party repos, APKPure, Telegram App, HTML, etc.

    • KptnAutismus@lemmy.world · 11 months ago

      I think OpenBoard might be the thing OP is looking for. I’m not using that feature myself, but it seems to be on par with the others I’ve used.

    • SomeBoyo@feddit.de · 11 months ago

      My only problem with it is that it removes the currently typed word from the autocomplete bar.

  • AtmaJnana@lemmy.world · 11 months ago

    Yes, very possible. An LLM could be run locally, or at least sandboxed just for you. In my experience it tends to take longer and produce poorer output, I guess because there is less training data and fewer iterations.

    Microsoft could also let you control this, but of course they don’t want to.

    I switched to OpenBoard a few months ago. It’s not yet as good as SwiftKey was, but it’s also not sending all my text input to Microsoft.

    The primary feature I miss from SwiftKey is the ability to insert a GIF easily.

    • therebedragons@lemmy.ml · 11 months ago

      Do any of the open-source keyboards have GIF integration? I’ve tried Floris and AnySoft and I miss it so much.

      • AtmaJnana@lemmy.world · 11 months ago

        I have yet to find one. I keep SwiftKey installed and switch inputs when I need to insert a GIF.

  • Tja@programming.dev · 11 months ago

    It can and it will. That is one of the uses of “NPUs” I’m most excited about.

    Basically you can run a small (potentially open-source) LLM on the phone using whatever context the keyboard has access to (at a minimum, what you’ve typed so far) and have the keyboard generate the next token(s).

    Since this is computationally intensive, the model has to be small and you need dedicated hardware to run it efficiently; otherwise you’d need a 500 W GPU like the big players. Locally you can do it at around 0.5 W. Of course, adjust your expectations accordingly.
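
    To make that concrete, here’s a rough sketch of what local next-token suggestion could look like. It’s not taken from any existing keyboard project; the Hugging Face transformers library and the tiny distilgpt2 model are just stand-ins for whatever compact model a keyboard would actually ship:

    ```python
    # Hypothetical sketch: greedy next-word suggestion with a small causal LM.
    # "distilgpt2" is only an illustrative stand-in for a phone-sized model.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("distilgpt2")
    model = AutoModelForCausalLM.from_pretrained("distilgpt2")

    def suggest_next(text: str, max_new_tokens: int = 3) -> str:
        """Greedily continue whatever the user has typed so far."""
        inputs = tokenizer(text, return_tensors="pt")
        output = model.generate(
            **inputs,
            max_new_tokens=max_new_tokens,
            do_sample=False,                      # deterministic, keyboard-style
            pad_token_id=tokenizer.eos_token_id,  # avoid the padding warning
        )
        # Keep only the newly generated tokens, i.e. the suggestion itself.
        new_tokens = output[0][inputs["input_ids"].shape[1]:]
        return tokenizer.decode(new_tokens).strip()

    print(suggest_next("I'll be there in"))
    ```

    On a phone you’d quantize the model heavily and hand it to the NPU instead of running PyTorch, but the flow is the same: feed in the typed context, pull a few tokens out, show them in the suggestion bar.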

    I don’t know of any project doing it right now, but I imagine Microsoft will integrate it into SwiftKey soon, with open-source projects to follow.

    • kevincox@lemmy.ml · 11 months ago

      I think you hugely overestimate what it takes to complete and correct a few words. Maybe you’d want some sort of accelerator for fine-tuning, but 1) you probably don’t even need fine-tuning, and 2) you could probably just run it on the CPU while the device is charging. For inference, modern CPUs are more than powerful enough.
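
      For a sense of scale: plain word completion doesn’t even need a neural network. A toy sketch like the one below (the bigram model and the tiny corpus are purely illustrative) already does the basic job and runs effectively instantly on any phone CPU:

      ```python
      # Hypothetical sketch: bigram-based next-word completion.
      # Real keyboards use bigger dictionaries and smarter models,
      # but the point is how cheap this class of problem is.
      from collections import Counter, defaultdict

      corpus = "see you soon . see you tomorrow . thank you so much . thank you again"
      words = corpus.split()

      # Count which word tends to follow which.
      following: dict[str, Counter] = defaultdict(Counter)
      for prev, nxt in zip(words, words[1:]):
          following[prev][nxt] += 1

      def complete(prev_word: str, n: int = 3) -> list[str]:
          """Top-n candidate words to show after `prev_word`."""
          return [w for w, _ in following[prev_word].most_common(n)]

      print(complete("you"))    # e.g. ['soon', 'tomorrow', 'so']
      print(complete("thank"))  # ['you']
      ```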

      • Tja@programming.dev · 11 months ago

        Yeah, modern ARM CPUs can run at 3 GHz and play PS4-level games, but I don’t want my phone to become a hand warmer every time I want to type a quick email…

        And of course, I’m not talking about correcting “fuck” to “duck”; I’m talking about ChatGPT-level prediction. Or Llama 2, or Gemini Nano, or whatever…

  • Shamot · 11 months ago

    This functionality can be local. I use the Google keyboard with internet access blocked and it works. The only things missing are the ability to search for emojis by typing a word (they’re still in the list) and some features I never used and never understood why they’re in a keyboard at all, since they aren’t related to typing text, like the GIFs.

    The only reason I can see for a keyboard to need an internet connection is to update the dictionary when it changes, but that shouldn’t prevent it from working with an outdated dictionary.

    When I searched for alternatives a few months ago, I couldn’t find anything satisfying.

  • kevincox@lemmy.ml · 11 months ago

    While Google isn’t generally good for privacy, Gboard actually does this. IIRC they completely removed the sync service, and your typing history is only kept on-device and in your Android backup.

    However, it is a bit of a privacy nightmare otherwise, as many of the other features phone home. But last I checked (~4 years ago, worth checking again) the core typing functionality is fully offline and private.

    So yes, it is possible.

  • brokenlcd@feddit.it · 11 months ago

    For voice typing, Sayboard may be an option; I’ve never tried it, so I’m not sure whether it’s good or not. As for predictive text, I’ve been looking for a keyboard that can do it as well.

  • technomad@slrpnk.net · 11 months ago

    I’ve been using a version of OpenBoard with Sayboard integrated. It doesn’t work perfectly, and there’s a good amount of frustration that comes with it, but it works well enough for what I need, and I’ll gladly take the trade-off of being less dependent on Google.

    Hopefully it continues to get better, or a more polished alternative comes along.