Plates get ‘smart’ through object recognition

    Image recognition technology gains a foothold in a new wave of gadgets, some of them for the kitchen.

    It isn’t exactly flawless, but “computer vision” capabilities are improving quickly, allowing for automatic recognition of objects ranging from water bottles to white bread. That’s got technologists, including two Philadelphia startups, developing gadgets and apps aimed at improving our well-being.

    Plate runneth over

    Anthony Ortiz pulls out a prototype of his new invention, placing it on a table next to a spread of fruits and vegetables.

    “SmartPlate is the world’s first connected kitchen device that will instantly track and analyze everything you consume,” he says.

    The 10-inch white-and-purple disk (technically, a squircle) is divided into three areas, like a TV dinner tray. In the middle sits a small bump, which houses its cameras.

    “So this camera here, it uses object recognition to identify the food,” says Ortiz. “So it takes a picture, very quickly once you set your food on the plate, and within a second or so, you’ve got the results.”

    Ortiz places a round, green object with a shiny texture on the SmartPlate. The plate’s image recognition software quickly identifies the object as a Granny Smith apple.

    The machine also has an internal scale, allowing it to determine portion size. It then takes that information and runs it against the FDA’s online database of nutritional content. Ortiz holds up his phone, which loads the results.

    “Okay, so, we’ve got a seven-ounce apple,” he says. “It’s got 102 calories, 27 grams of carbs, and two grams of sodium.”

    That information is then automatically logged in the SmartPlate app.
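
    Under the hood, the arithmetic is simple: nutrition databases typically list nutrients per 100 grams, so the plate only has to scale those values by the weighed portion. Here is a minimal sketch of that step in Python; the per-100-gram figures and function names are illustrative assumptions, not SmartPlate’s actual code.

```python
# A minimal sketch of SmartPlate-style portion math (an illustration,
# not the company's code). Assumes per-100 g nutrient values, the form
# most nutrition databases use.

OUNCES_TO_GRAMS = 28.3495

def scale_nutrients(per_100g: dict[str, float], portion_oz: float) -> dict[str, float]:
    """Scale per-100 g nutrient values to a weighed portion."""
    grams = portion_oz * OUNCES_TO_GRAMS
    factor = grams / 100.0
    return {name: round(value * factor, 1) for name, value in per_100g.items()}

# Illustrative database entry for a raw apple (roughly 52 kcal per 100 g).
apple_per_100g = {"calories": 52.0, "carbs_g": 14.0, "fiber_g": 2.4}

print(scale_nutrients(apple_per_100g, portion_oz=7.0))
# A 7 oz (~198 g) apple lands near the 102 calories and 27 grams of
# carbs quoted in the demo.
```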

    “No more manual data entry, no more food journaling with a pen or paper,” as Ortiz says, just an online log of whatever you eat, accessible through a smartphone.

    Ortiz’s idea is that to eat healthier—and lose weight, if that’s your goal—we need a more accurate measure of what we take in. Even the well-meaning among us consistently under-report calorie intake.

    SmartPlate is supposed to be a nonjudgmental check on that.

    “This SmartPlate is not the food police,” he says. “We are not tied to the food police. So it is not going to yell at you when you are eating too fast, you are eating the wrong things. It is just going to give you insight that could possibly help you, if you want the help.”

    Ortiz, 38, is the founder of Fitly, a grocery delivery service that focuses on healthy foods. His new endeavor aims to expand on that concept, but it faces some challenges.

    First, to get an accurate count of calories consumed, you’ve got to use the plate for every meal. That may not happen if you eat out. Also, no two lasagnas are created equal. The FDA’s database gives the average nutritional content, so SmartPlate won’t know the difference between whole-milk and skim-milk mozzarella.

    The app attempts to solve this by allowing manual entry of ingredients. It can also scan the barcode on packaged foods.
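
    For packaged foods, a barcode resolves the ambiguity, since a UPC maps to one exact product. As an illustration only, here is a rough sketch of such a lookup against Open Food Facts, a public product database; the article does not say which database SmartPlate’s app actually queries.

```python
# Hypothetical sketch of a packaged-food lookup by barcode. Open Food
# Facts is used here as a stand-in; the database SmartPlate queries is
# not named in the article.
import json
import urllib.request

def lookup_barcode(upc: str) -> dict:
    """Fetch per-100 g nutriment data for a scanned barcode."""
    url = f"https://world.openfoodfacts.org/api/v0/product/{upc}.json"
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    if data.get("status") != 1:          # 0 means the product isn't listed
        raise LookupError(f"no product found for UPC {upc}")
    return data["product"].get("nutriments", {})
```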

    But will it actually change the way someone eats?

    “Well, just because it comes out, doesn’t mean it works,” says Michael Lowe, who researches eating and weight regulation at Drexel. “A crucial thing will be for the developers of this, either internally or externally, to demonstrate that it is actually capable of fulfilling whatever claims they might be making for it.”

    Lots of research backs up the importance of food journaling as part of a weight loss program, and getting a better handle on proper portion sizes would be helpful.

    But Lowe questions how well SmartPlate solves the problem of under-reporting.

    “The same psychological influences that can defeat food journaling, of course, can also defeat this.”

    One final issue to overcome: it isn’t dishwasher safe.

    An extra set of eyes

    While SmartPlate sees food and catalogues it, four rising sophomores at UPenn are taking image recognition technology in a different direction.

    “If you lack vision, why not supplement it with automated vision?” asks Ben Sandler, one of the creators of ThirdEye.

    It’s an app that runs on Google Glass, the search giant’s wearable computer, which first came out in 2013 but was pulled from shelves while the company reworks its design.

    Sandler explains the concept: “Let’s say I had a bottle. I could feel it was a bottle, but I didn’t know if it was a soda, or water, or something like that. So, I just say, ‘Okay Glass, recognize this.'”

    The voice command prompts Google Glass to take a picture, which then gets run through an image recognition algorithm.

    “And now it is saying ‘processing’ to me to let me know it is trying to figure out what’s in the image.”

    He hears this through a tiny speaker built into Google Glass. Within a few seconds, it correctly names a Poland Spring bottle.
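
    The end-to-end flow is easy to outline: a voice command triggers a photo, the photo goes to an image-recognition service, and the top label is read back through the speaker. Here is a rough sketch of that loop in Python, with stand-in functions, since the article does not name the underlying recognition API.

```python
# A rough sketch of a ThirdEye-style recognition loop (an assumption,
# not the team's code). capture_photo, recognize, and speak stand in
# for Glass's camera, a cloud vision API, and its built-in speaker.

def recognize_object(capture_photo, recognize, speak) -> None:
    """On a voice command, photograph the scene and speak the top label."""
    speak("processing")              # audible feedback while the query runs
    image = capture_photo()          # fired by "Okay Glass, recognize this"
    labels = recognize(image)        # e.g. ranked labels from a vision API
    speak(labels[0] if labels else "Sorry, I couldn't identify that.")

# Toy stand-ins so the sketch runs end to end.
recognize_object(
    capture_photo=lambda: b"raw-image-bytes",
    recognize=lambda image: ["Poland Spring water bottle"],
    speak=print,
)
```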

    With a decent web connection, ThirdEye can quickly tell the difference between a $1 bill and a $5, or between Tylenol and Aleve: the kinds of day-to-day challenges a visually impaired person may face.

    There are already smartphone apps that do similar searches, but ThirdEye’s 18-year-old CEO Rajat Bhageria says, “They are really crappy. You have to get out your phone, you have to hold it in front of your face, you have to press four or five buttons to get to the app, and then maybe after five or six seconds of fumbling around, you get to an answer.

    “That’s what we are trying to fix by putting it right on your face.”

    The startup is partnering with the National Federation of the Blind to improve the app’s functionality, and while it isn’t clear when the next generation of Google Glass will be released, ThirdEye plans to support other brands of smart glasses as well.

    Ben Sandler expects a wave of apps and gadgets based on image recognition.

    “I think for a really long time computer vision was prohibitively expensive. But the algorithms have gotten better, the actual computer architecture to process this stuff has gotten better, to the point where it is now cheap and accessible to more developers,” Sandler says.

    Just where those developers will take this new wave of technology isn’t clear yet. But in these early days of the computer-vision era, where soft-focus lighting has everything looking good, there’s seemingly endless opportunity for improved well-being, for tools that can change the way we interact with and, ultimately, see our world.

    And perhaps eat less junk food.
