Some days, continuing to read the news can be stressful.
IIRC Windows has an accessibility feature where the cursor jumps to the primary default action when a dialog opens.
Doing it screenshot-based seems inefficient if you could instead iterate through the windows and controls.
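To illustrate the idea of walking controls rather than parsing pixels: a minimal sketch that searches a UI tree for a control by role and accessible name. Real accessibility APIs (e.g. Windows UI Automation) are platform-specific, so this uses a made-up in-memory tree; the `Control` type and `find_control` helper are hypothetical, not any actual API.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Control:
    # Hypothetical stand-in for a node from an accessibility API.
    role: str                        # e.g. "window", "button", "textbox"
    name: str = ""
    children: list["Control"] = field(default_factory=list)

def find_control(root: Control, role: str, name: str) -> Optional[Control]:
    """Depth-first search for a control by role and accessible name."""
    if root.role == role and root.name == name:
        return root
    for child in root.children:
        hit = find_control(child, role, name)
        if hit is not None:
            return hit
    return None

# Mock desktop: one window containing two buttons.
desktop = Control("desktop", children=[
    Control("window", "Checkout", children=[
        Control("button", "Add to cart"),
        Control("button", "Check out"),
    ]),
])

button = find_control(desktop, "button", "Check out")
print(button.name if button else "not found")
```

An agent driving the real control tree could then invoke the element's default action directly, with no vision model in the loop.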
Oh sick. Now this is the stuff I’m most excited about with AI lately. Apple’s doing an implementation as well.
Now you could say a command such as “close the window” or “click the picture of a puppy”. It’s an amazing accessibility tool. So much better than those eye-tracking or screen-grid coordinate systems we had prior.
Or issue a command such as “go to this website, add this to my cart, and check out”. Sure, my Alexa or Google Home can do that with their predefined stores, but this opens up any site or program that a human can operate. So it’s useful for everyone at the end of the day.
The fact that it suddenly went to browse a leisure website during work… did they train it on screen recordings of human activity? As if some corporation allowed them to record all of their employees’ activity for training.