Deep Learning Can’t ‘Spot Obesity from Space,’ But It Can ID the Neighborhood

There’s a bit more to the story, as well as less.

Reports about artificial intelligence’s latest triumphs can sometimes come with, let’s say, a dash of hyperbole. Over the past year alone, we’ve seen headlines claiming AI “can tell whether you’re gay or straight” based on a photograph, that it can “Now Read Better Than Humans, Putting Millions of Jobs at Risk,” and that “An AI-Written Novella Almost Won a Literary Prize.”

In each case, of course, there was a little less to the story once you accounted for factors such as the limited range of photos on a dating site, a reading test that isn’t as cognitively demanding as the ones elementary school kids take, or the fact that the novella-writing AI had a research team serving as its muse the whole way. The AI advances cited in these projects were impressive as part of the technology’s steady, incremental march forward, just not as astounding as they might seem at first glance.

So you might cast a slightly skeptical eye toward a recent report from Science contending that “artificial intelligence spots obesity from space,” which seems to push the gee-whiz exaggerations to another level. But behind the attention-getting image of adding, say, a former governor’s waistline to the Great Wall of China and the Pyramids of Egypt on the list of human achievements astronauts can see with the naked eye from space lies a notable development, not just in machine learning’s ability to spot patterns and interpret their meaning, but in what certain indicators actually mean. As with those other claims, there’s a bit more to the story, as well as less. And in this case, it can also help public health organizations plan healthier communities.

View From Above

According to the study, published in the American Medical Association’s JAMA Network Open, researchers from the University of Washington collected nearly 150,000 satellite images from Google Maps covering Los Angeles, Memphis, San Antonio and the Seattle metro area, and drilled down into the neighborhoods using Google Street View. They ran the images through a deep learning convolutional neural network to assess the prevalence in each neighborhood of features such as crosswalks, building types and green areas. They then used a second model, an elastic net regression, to relate those features to obesity rates in the same areas, based on data from the Centers for Disease Control and Prevention’s 500 Cities project.
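
To make that pipeline concrete, here is a minimal sketch of the two-stage approach: a pretrained image network used as a fixed feature extractor, with an elastic net regression fitted on top. The choice of VGG16, the preprocessing, and the synthetic stand-in data are assumptions for illustration, not the study’s actual code or imagery.

```python
import numpy as np
import torch
from PIL import Image
from sklearn.linear_model import ElasticNetCV
from torchvision import models, transforms

# Stage 1: a pretrained CNN used as a fixed feature extractor.
# (VGG16 and ImageNet preprocessing are assumptions, not the study's exact setup.)
cnn = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
cnn.classifier = cnn.classifier[:-1]  # drop the final classifier layer, keep 4096-dim features
cnn.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def tile_features(img):
    """Return a feature vector for one satellite image tile."""
    with torch.no_grad():
        return cnn(preprocess(img).unsqueeze(0)).squeeze(0).numpy()

# Synthetic stand-ins for neighborhood image tiles and CDC 500 Cities
# obesity prevalence values (illustrative only).
rng = np.random.default_rng(0)
tiles = [Image.fromarray(rng.integers(0, 255, (256, 256, 3), dtype=np.uint8), "RGB")
         for _ in range(40)]
obesity_prevalence = rng.uniform(20, 40, size=len(tiles))  # percent

# Stage 2: elastic net regression from image features to obesity prevalence.
X = np.stack([tile_features(t) for t in tiles])
model = ElasticNetCV(l1_ratio=[0.1, 0.5, 0.9], cv=5, max_iter=5000)
model.fit(X, obesity_prevalence)
print("Estimated prevalence for the first tile:", model.predict(X[:1]))
```

With real imagery and prevalence data in place of the synthetic stand-ins, the same two steps, feature extraction followed by the elastic net fit, reproduce the overall shape of the study’s pipeline.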

The researchers found that the presence of green areas, crosswalks and the distribution of buildings in what they called the “built environment” allowed for better estimates of the prevalence of obesity than more traditional measures, such as the number of gyms or spas in the immediate area. Among the many factors to consider, for instance, is that people who live in higher income areas don’t necessarily have to join a gym around the corner when they can drive to one.

Some other environmental features that might seem to be good indicators, such as land use mix and the density of subways, bus stops and intersections, proved less significant. And while income levels, another go-to factor in health-related studies, helped explain some of the results, the researchers write that “our analyses also suggests that the built environment features more consistently estimate obesity than per capita income across all regions.”
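
As a sketch of that kind of comparison (under the assumption that per-neighborhood feature vectors and income figures are already in hand; the arrays below are random placeholders), one can fit separate elastic net models to the built-environment features, to per capita income alone, and to both, and compare their cross-validated R² scores.

```python
import numpy as np
from sklearn.linear_model import ElasticNetCV
from sklearn.model_selection import cross_val_score

# Placeholder per-neighborhood data (illustrative only): image-derived
# built-environment features, per capita income, and obesity prevalence.
rng = np.random.default_rng(1)
n = 200
built_env = rng.normal(size=(n, 50))
income = rng.normal(size=(n, 1))
obesity = rng.normal(loc=30, scale=5, size=n)

def cv_r2(X, y):
    """Mean cross-validated R^2 for an elastic net on the given predictors."""
    return cross_val_score(ElasticNetCV(cv=5), X, y, cv=5, scoring="r2").mean()

print("Built environment only:", cv_r2(built_env, obesity))
print("Per capita income only:", cv_r2(income, obesity))
print("Both combined:", cv_r2(np.hstack([built_env, income]), obesity))
```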

The researchers, led by the University of Washington’s Adyasha Maharana, don’t contend there’s a consistently precise ratio between the characteristics of built areas and obesity, only that those indicators produced better estimates than other measures. In fact, their results varied among the cities in the study: the system did best in Memphis, where it predicted obesity with 73.3 percent accuracy, but tended to underestimate obesity levels in Seattle, the researchers wrote.
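
One simple way to examine that kind of regional variation, sketched below with placeholder numbers rather than the study’s figures, is to score predictions city by city and check the sign of the mean residual (a negative value indicates systematic underestimation).

```python
import numpy as np
from sklearn.metrics import r2_score

# Hypothetical predicted and actual obesity prevalence by city (placeholder values).
results = {
    "Memphis": (np.array([34.1, 36.5, 31.0]), np.array([33.8, 37.0, 30.5])),
    "Seattle": (np.array([25.0, 27.2, 23.4]), np.array([22.1, 24.9, 21.0])),
}

for city, (actual, predicted) in results.items():
    r2 = r2_score(actual, predicted)
    mean_residual = float(np.mean(predicted - actual))  # negative => underestimation
    print(f"{city}: R^2 = {r2:.2f}, mean residual = {mean_residual:+.2f}")
```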

But they also found that “our approach consistently presents a strong association between obesity prevalence and the built environment indicator across all four regions, despite varying city and neighborhood values.”

New Terrain

Another advantage of using maps made from satellite images is speed: the images didn’t have to be labeled, and the neural network didn’t require adjustments to account for the CDC’s obesity data, so the assessments could be done more quickly than with other methods, which often have to wait for data from additional sources to become available.

The researchers will make their data and computational methods openly available, allowing other teams to run comparison studies in other regions and types of communities, and to put those results to use.

“We show that models fitted solely to the features of the built environment can provide reasonable estimates of neighborhood obesity prevalence,” the researchers write, noting the results could be useful to organizations developing public health programs to combat obesity.

AI is being used to combat obesity in a variety of other ways. Michigan State University researchers use AI as a health coach to help people improve their health habits. A team at the University of Southern California is testing a comprehensive algorithm designed to help people make smarter choices about food. But knowing where the problem exists can help health officials decide where to target programs, whether healthy eating initiatives, exercise programs or efforts to attract healthy food options to an area. In this case, applying AI not only produced accurate results more quickly than other approaches, it also helped identify the most revealing signs of where the problem is most in need of a remedy.