Trying out the Affinity 2 Beta, which has NPU support for a couple new plugins. The object selector is CRAZY fast…

#tech #technology #AI #photo #editing #photography #geek

  • Kevin@fosstodon.org · 13 days ago

    @SomeGadgetGuy@techhub.social holy moly that’s so cool. How do apps use the NPU? Did Qualcomm open source their driver? (one possible route is sketched after the thread)

  • Overlander_1@techhub.social · 13 days ago

    @SomeGadgetGuy@techhub.social very impressive, and very scary for any graphic design or former gurus who can now see their art/skills diminished and undervalued with every iteration of these programs.

    • SomeGadgetGuy@techhub.social (OP) · 13 days ago

      @Overlander_1@techhub.social Naw. Subject selection has always been a tedious chore. No one’s artistic skills are being challenged in the market JUST because we can identify a subject and draw a border around it a little faster. 😄

        • SomeGadgetGuy@techhub.social (OP) · 13 days ago

          @Overlander_1@techhub.social Oh I’m with you on generative AI slop, but I will admit to enjoying the output of NPUs since the Huawei Kirin chips back in the day.
          That’s how we got cameras that could identify subjects, then Sony’s amazing eye-tracking autofocus, and now subject detection in photo editing apps.
          That kind of on-device practical NPU performance is really interesting.

          Making soulless “art” out of prompts? A lot less interesting…
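
On Kevin’s question of how apps use the NPU: one common route on Windows-on-Snapdragon machines is ONNX Runtime’s QNN execution provider, which hands a model (here, a subject-segmentation network) to the Qualcomm Hexagon NPU and falls back to the CPU when no NPU is available. This is only a minimal sketch of that route, not a claim about how Affinity actually does it; the model file “selector.onnx” and its tensor shapes are hypothetical placeholders.

```python
# Minimal sketch: run a subject-selection model on a Qualcomm NPU via
# ONNX Runtime's QNN execution provider, with a CPU fallback.
# "selector.onnx" and its tensor shapes are hypothetical placeholders.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "selector.onnx",
    providers=[
        ("QNNExecutionProvider", {"backend_path": "QnnHtp.dll"}),  # Hexagon NPU backend
        "CPUExecutionProvider",                                    # fallback if no NPU
    ],
)

# Stand-in for the open photo: one 512x512 RGB image, NCHW layout, float32.
image = np.random.rand(1, 3, 512, 512).astype(np.float32)

# Run the model; assume a single output containing the subject mask.
outputs = session.run(None, {session.get_inputs()[0].name: image})
mask = outputs[0]
print("subject mask shape:", mask.shape)
```

Listing the CPU provider after the QNN one keeps the same code working on machines without an NPU, just more slowly.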