And, let’s face it, no-one wants to trawl through pages and pages of images of skin conditions looking for their own specific issue. This certainly takes some of the pain out of the process.
“Describing an odd mole or rash on your skin can be hard to do with words alone,” said Google in a statement. “Fortunately, there’s a new way Lens can help, with the ability to search skin conditions that are visually similar to what you see on your skin.
“This feature also works if you're not sure how to describe something else on your body, like a bump on your lip, a line on your nails or hair loss on your head.”
While it’s certainly not meant to be used in place of professional diagnosis and treatment, it can provide a lot more information about any concerns, particularly if seeing a doctor is difficult, and it offers a degree of privacy, given that embarrassment stops many people from seeking help for minor conditions.
However, activity is saved to the cloud, so if you're sensitive about your medical history being archived there, you’ll need to turn off Google Lens saves in the Web & App Activity section of your Google Account first.
The company also announced that Google Lens will be integrated into its generative AI chatbot, Bard, which will allow for real-time feedback on image prompts.
Google Lens already assists with visual cues such as translating foreign street signs, directions and menus. Among the new features, you'll be able to upload an image of a food item alongside a “near me” prompt, and Lens will return a list (with pictures) of local spots where you can potentially find the item or dish.
The Google Lens app is available on Android, while iOS users can access it through the Google app.
For more on how Google Lens works, check out this video from the archives.
Source: Google