Google Soli: Palm-sized radar could rev up checkouts by telling apples from oranges

Employing machine learning, RadarCat uses Google’s Soli radar sensors to recognize and distinguish between objects. Image: University of St Andrews

UK researchers from St Andrews University in Scotland have used Google’s mini radar for mobile gesture control to create RadarCat, a classification machine that can accurately distinguish between objects in real time. Rather than using cameras to capture objects in view, the RadarCat machine combines recognition algorithms with radio waves beamed from Google’s Soli radar to identify objects based on the unique traits they reflect back to its sensor.

The mini radar was developed by Google’s ATAP team and unveiled at Google I/O 2015 as Project Soli. The Soli sensors are capable of recognizing fine gestures, such as rubbing two fingers together, when held in front of the radar. However, the St…
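In essence, this is supervised classification: each material or object reflects the radar signal back with a characteristic signature, and a model trained on labelled examples of those signatures can identify new objects in real time. The sketch below is purely illustrative and assumes the raw radar returns have already been reduced to fixed-length feature vectors; the feature set, labels, and choice of model here are hypothetical and not taken from the RadarCat work.

```python
# Illustrative sketch only: RadarCat's actual features and model are not
# described in this excerpt. Assumes each radar return has been reduced to
# a fixed-length feature vector (e.g., summary statistics of the reflected
# signal) paired with the label of the object it was captured from.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Stand-in data: 200 samples x 16 hypothetical radar-signal features.
X = rng.normal(size=(200, 16))
y = rng.choice(["apple", "orange", "glass", "steel"], size=200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Train a conventional classifier on the per-object radar signatures.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# A new radar return is classified by extracting the same features
# and asking the trained model for a label.
print(accuracy_score(y_test, clf.predict(X_test)))
```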


