Currently, the key AI capability everyone is racing toward is “agentic systems,” an umbrella term for AI that knows more about you and your preferences so it can act on your behalf. To quote Demis Hassabis, Google DeepMind CEO, in a prerecorded video shown at the event: “In the coming months, you’ll be able to ask Gemini to reason about the things you see, whether it’s on your phone or the world around you.”
One part of that is Screen Sharing, a way for the phone to analyze what’s being displayed on the screen. In the example given, a person has asked the phone whether a pair of jeans they’re shopping for would be a good fit. It reads the size chart and description from the web page and replies, “Based on the size chart, a medium is equivalent to a size 38. However, since the jeans have a relaxed fit, you might want to consider sizing down to a 36 for a slightly more fitted look.”
Good advice, but seemingly something the person could deduce by reading the sizes (which for some reason were all in French).
(Why Samsung chose the generic term “Screen Sharing” is a mystery. That phrase already means something to most people: viewing someone else’s screen, as in a Zoom presentation or training session.)