The Ultimate Game of Tag

New photo app recognizes more than just faces
October 1, 2011
Many photo applications can guess the subjects of your digital images using facial-recognition software. But a new application, developed by students from Duke and the University of South Carolina, goes considerably deeper, promising to automatically tag photos in a digital collection with names, places, activities, and even emotions.

The app, called TagSense, taps into the trove of data collected by mobile devices to figure out what’s going on in a set of images. For example, it can use a mobile phone’s sound and motion detectors to determine whether photo subjects are standing still, bowling, or dancing. It can even detect snow or rain by browsing weather conditions at the time and location of the shot.
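The article does not detail TagSense's actual algorithms, but the idea of combining motion and sound readings to guess an activity can be sketched roughly as follows. The thresholds, function name, and tag labels here are all invented for illustration:

```python
# Illustrative sketch of sensor-based activity tagging in the spirit of
# TagSense; thresholds and labels are hypothetical, not from the research.
import statistics

def infer_activity(accel_magnitudes, sound_level_db):
    """Guess a coarse activity tag from accelerometer samples and mic level."""
    motion = statistics.pstdev(accel_magnitudes)  # variability of movement
    if motion < 0.05:
        return "standing still"      # phone barely moving
    if sound_level_db > 70:
        return "dancing"             # vigorous movement in a loud setting
    return "moving"                  # movement without loud audio

# A steady reading suggests the subject is posing for the shot.
print(infer_activity([1.0, 1.0, 1.0], 40))        # standing still
# Jittery readings plus loud music suggest dancing.
print(infer_activity([0.2, 1.8, 0.5, 1.5], 85))   # dancing
```

A real system would also fold in context such as a weather lookup for the shot's time and location, as the article describes.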

To test the application’s tagging prowess, students snapped more than 200 photos at various locations around the Duke campus, including classroom buildings, gyms, and the Nasher Museum of Art. They found TagSense outperformed current versions of Apple’s iPhoto and Google’s Picasa, demonstrating “greater sophistication,” says Romit Roy Choudhury, assistant professor of electrical and computer engineering and an adviser on the project.

Tagging sophistication is important given the exploding number of digital images most users are keeping on personal computers, says Xuan Bao, a Ph.D. student in computer science and one of the developers. Multiple layers of data make it easier to retrieve specific images from large sets, he explains.

“So, for example, if you’ve taken a bunch of photographs at a party, it would be easy at a later date to search for just photographs of happy people dancing,” adds co-developer Chuan Qin, a visiting graduate student from USC. “Or more specifically, what if you just wanted to find photographs only of Mary dancing at the party and didn’t want to look through all the photographs of Mary?”
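The kind of query Qin describes amounts to matching a set of requested tags against each photo's tag set. A minimal sketch, with invented photo records and tags:

```python
# Illustrative multi-tag retrieval over an auto-tagged collection;
# the records and tag vocabulary are made up for this example.
photos = [
    {"file": "img_001.jpg", "tags": {"Mary", "dancing", "happy", "party"}},
    {"file": "img_002.jpg", "tags": {"Mary", "standing still", "party"}},
    {"file": "img_003.jpg", "tags": {"John", "dancing", "party"}},
]

def search(collection, wanted):
    """Return files whose tag set contains every requested tag."""
    return [p["file"] for p in collection if wanted <= p["tags"]]

# "Mary dancing" narrows three party photos down to one.
print(search(photos, {"Mary", "dancing"}))  # -> ['img_001.jpg']
```

Layering tags this way is what lets a user skip past every photo of Mary and jump straight to the ones of Mary dancing.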

The students envision that TagSense would most likely be adopted by groups of people, allowing data to be collected and shared among all of their mobile devices. The current application is a prototype, and the researchers believe that a commercial product could be available in a few years.

The research was supported by the National Science Foundation. Roy Choudhury’s Systems Networking Research Group also receives funding from Microsoft, Nokia, Verizon, and Cisco.