Trying out the Affinity 2 Beta, which has NPU support for a couple new plugins. The object selector is CRAZY fast…
#tech #technology #AI #photo #editing #photography #geek
@SomeGadgetGuy@techhub.social holy molly that’s so cool. How does apps use the NPU? Did Qualcomm open source their driver?
@notanonymous26@fosstodon.org Oh definitely not open source, but Qualcomm has an SDK for devs https://www.qualcomm.com/developer/software/neural-processing-sdk-for-ai
@SomeGadgetGuy@techhub.social very impressive, and very scary for any graphic designers or former gurus who can now see their art/skills diminished and undervalued with every iteration of these programs.
@Overlander_1@techhub.social Naw. Subject selection has always been a tedious chore. No one’s artistic skills are being challenged in the market JUST because we can identify a subject and draw a border around it a little faster. 😄
@SomeGadgetGuy@techhub.social Yet 🫣
But I take your point 👍
@Overlander_1@techhub.social Oh I’m with you on generative AI slop, but I will admit to enjoying the output of NPUs since the Huawei Kirin chips back in the day.
That’s how we got cameras that could identify subjects, then later we got Sony’s amazing eye-tracking autofocus, and now subject detection in photo editing apps.
That kind of on-device practical NPU performance is really interesting. Making soulless “art” out of prompts? A lot less interesting…