Googling a vague description of a desired piece of furniture, like “red leather chair”, is likely to return a slew of unwanted results. So the Cornell researchers designed a neural network in the vein of Shazam, the app that listens to a snippet of music and, in seconds, brings up the song’s title, artist, album and other details.
The system developed at Cornell scans photos of objects, and compares their shape, color and features to a database of “iconic images” taken from manufacturer catalogs and websites. From there, it returns the best match, along with details on who makes it and where it’s available.
“It seems a lot of people want to buy things they see in someone else’s home or in a photo, but they don’t know where to look,” says Sean Bell, one of the paper’s authors.
Online communities like Houzz, Pinterest and LikeThatDecor allow users to share information on products, including where to buy them, but the researchers wanted to automate and streamline that process into a service.
To train the system to recognize objects, the researchers crowdsourced images of household scenes with boxes drawn around items of interest, and fed them into a neural network along with the corresponding iconic images. Rather than trawling through the entire database for every query, the system narrows the range of matches in stages: it first broadly analyzes the image’s edges and lines to weed out the most obviously wrong answers, then looks at more specific parts and shapes, and finally searches for entire objects in a now much smaller list.
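That coarse-to-fine strategy can be sketched in miniature. The toy catalog, descriptor names, and numbers below are all illustrative assumptions, not Grokstyle’s actual data or features; a real system would compare learned neural-network embeddings, but the two-stage idea is the same: a cheap comparison prunes the obviously wrong candidates, and an expensive comparison ranks only the survivors.

```python
import math

# Hypothetical toy catalog: each entry carries a coarse descriptor
# (cheap to compare, e.g. rough edge/line statistics) and a fine
# descriptor (costlier, e.g. part and shape features). All values
# here are made up for illustration.
CATALOG = {
    "red leather chair": {"coarse": [0.9, 0.1], "fine": [0.8, 0.2, 0.7]},
    "blue fabric sofa":  {"coarse": [0.2, 0.8], "fine": [0.1, 0.9, 0.3]},
    "red leather sofa":  {"coarse": [0.8, 0.3], "fine": [0.7, 0.4, 0.6]},
}

def dist(a, b):
    # Euclidean distance between two descriptor vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def search(query_coarse, query_fine, shortlist_size=2):
    # Stage 1: coarse comparison weeds out the most obviously
    # wrong answers, leaving a much smaller shortlist.
    shortlist = sorted(
        CATALOG,
        key=lambda name: dist(query_coarse, CATALOG[name]["coarse"]),
    )[:shortlist_size]
    # Stage 2: the expensive fine comparison runs only on the shortlist.
    return min(shortlist, key=lambda name: dist(query_fine, CATALOG[name]["fine"]))

print(search([0.85, 0.15], [0.78, 0.25, 0.68]))  # → red leather chair
```

The payoff is that the costly fine-grained comparison never touches most of the database, which is what makes searching a large catalog of iconic images practical.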
The researchers have formed a startup, Grokstyle, to offer the service on a subscription basis to retailers and designers, and in the future, similar systems could be developed for other areas, like clothing.