Some days, continuing to read the news can be stressful.
Oh sick. Now this is the stuff I’m most excited about with AI lately. Apple’s doing an implementation as well.
Now you could say a command such as “close the window” or “click the picture of a puppy”. It’s an amazing accessibility tool. So much better than those eye-tracking or screen grid coordinate systems we had before.
Or you could issue a command such as “go to this website, add this to my cart, and check out”. Sure, my Alexa or Home can do that with their predefined stores, but this opens up any site or program that a human can operate. So it’s useful for everyone at the end of the day.
IIRC Windows has an accessibility feature where the cursor jumps to the primary/default action when a dialog opens.
Doing it screenshot-based seems inefficient when you could iterate through windows and controls instead.
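For example, here’s a rough Python/ctypes sketch of that idea on Windows. It only enumerates visible top-level windows by title via the Win32 EnumWindows call; real assistive tooling would go further and walk the UI Automation tree down to individual controls, but the point is you get structured objects instead of pixels:

```python
import ctypes
import ctypes.wintypes as wintypes

user32 = ctypes.windll.user32

# Callback signature for EnumWindows: BOOL CALLBACK (HWND, LPARAM)
EnumWindowsProc = ctypes.WINFUNCTYPE(wintypes.BOOL, wintypes.HWND, wintypes.LPARAM)

def list_top_level_windows():
    """Return (hwnd, title) pairs for visible top-level windows."""
    titles = []

    def callback(hwnd, lparam):
        if user32.IsWindowVisible(hwnd):
            length = user32.GetWindowTextLengthW(hwnd)
            if length:
                buf = ctypes.create_unicode_buffer(length + 1)
                user32.GetWindowTextW(hwnd, buf, length + 1)
                titles.append((hwnd, buf.value))
        return True  # keep enumerating

    user32.EnumWindows(EnumWindowsProc(callback), 0)
    return titles

if __name__ == "__main__":
    for hwnd, title in list_top_level_windows():
        print(hex(hwnd), title)
```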
Makes it work universally, even if the GUI isn’t made with a standard toolkit.
Also, it’s AI; they don’t care about efficiency.
Yeah, this is one of those things where accessibility settings can probably get you 90% of the way there, but screenshots and machine learning can close the gap somewhat reliably (even if it’s much less efficient).
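The screenshot-and-ML loop is conceptually pretty simple. A minimal sketch below: the screenshot capture (Pillow) and input synthesis (pyautogui) are real libraries, but ask_vision_model is a hypothetical placeholder for whatever vision model you’d actually call:

```python
# Rough sketch of a screenshot-driven control loop, not anyone's actual implementation.
import time
from PIL import ImageGrab   # pip install pillow
import pyautogui            # pip install pyautogui

def ask_vision_model(image, instruction):
    """Hypothetical placeholder: send the screenshot plus the user's instruction
    to a vision model and get back an action like
    {"type": "click", "x": 120, "y": 340} or {"type": "type", "text": "puppies"}."""
    raise NotImplementedError("plug in your model/API of choice here")

def run_step(instruction):
    screenshot = ImageGrab.grab()                 # capture the whole screen as pixels
    action = ask_vision_model(screenshot, instruction)
    if action["type"] == "click":
        pyautogui.click(action["x"], action["y"])  # click at the model's pixel coordinates
    elif action["type"] == "type":
        pyautogui.typewrite(action["text"], interval=0.02)
    time.sleep(0.5)                               # let the UI settle before the next screenshot
```

Every step re-renders and re-interprets the whole screen, which is exactly why it’s so much less efficient than querying the accessibility tree, and also why it works on anything a human can see.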