More data isn't always better, says Nate Silver

Top statistician warns that an abundance of data lets statisticians 'cherry pick' data points to get the results they want

Big data may seem to promise big insights, but more isn't always better, cautions statistician Nate Silver, who became one of America's best-known faces of data analysis after his FiveThirtyEight blog correctly predicted the 2012 presidential election result in all 50 states.

The more data there is, "the more people can cherry pick" data points that confirm what they want the data to show, he said.

Abundant data is a notable problem in politics, where many have an interest in the outcome. But it's also an issue in fields ranging from medicine -- where many researchers and journals would rather see studies showing an interesting result than a confirmation of no effect -- to earthquake prediction.

It turns out that along with real insight, big data can bring "a lot of spurious correlations" -- apparent relationships between things that are in fact just random noise, Silver said at the RMS Exceedance conference in Boston today, where RMS announced its new cloud-based RMS(one) risk-management platform.
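
To see how easily noise can masquerade as signal, consider a quick simulation -- an illustration of the cherry-picking problem, not an example from Silver's talk. Generate a thousand unrelated random series, then hunt for the one that best "correlates" with a target:

    import numpy as np

    rng = np.random.default_rng(0)

    # 1,000 completely unrelated "indicators", 50 observations each -- pure noise.
    data = rng.standard_normal((1000, 50))

    # Correlate every indicator against the first one and keep the best match.
    target = data[0]
    corrs = [np.corrcoef(target, row)[0, 1] for row in data[1:]]
    best = max(corrs, key=abs)

    # With enough candidates to cherry-pick from, something will look related.
    print(f"strongest correlation found in pure noise: {best:+.2f}")

With only 50 observations and 999 candidates to search, the best match typically lands near 0.5 -- strong enough to pass for a finding if you don't mention how many candidates were screened.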

In addition to writing the FiveThirtyEight blog, now hosted at The New York Times, Silver is the author of the book The Signal and the Noise: Why So Many Predictions Fail -- but Some Don't.

In his presentation, Silver offered four tips for more effectively gaining -- and sharing -- insight from data:

  1. "Think probabilistically," he urged. "Think in terms of probabilities and not in terms of absolutes."

    Don't be afraid of communicating the level of uncertainty that comes with your predictions -- just as most public opinion polls include margins of error -- even if not all of your audience will understand. Some criticised FiveThirtyEight for stating the confidence level Silver had in his election predictions, but conveying uncertainty is "important and good science".

    Not doing so can have serious consequences, he noted, such as in 1997 when the National Weather Service predicted a 49-foot flood level for the Red River in Grand Forks, ND. Many in the town were reassured by that, since the city's levees were designed to withstand a 51-foot flood.

    Unfortunately, what was not communicated to Grand Forks residents was the likely margin of error based on past forecasts: plus or minus 9 feet. In fact, the river crested at 54 feet and much of the community was flooded. (A back-of-the-envelope sketch of what that margin implied for the levees follows these tips.)

    Today, the National Weather Service is much better about noting the uncertainty level of its forecasts, Silver said, citing the "cone of uncertainty" that comes along with projected hurricane paths. Showing uncertainty "in a visual way is important" in helping people evaluate forecasts.

    Probability forecasts are a "waypoint between ignorance and knowledge," but they are not certainties.

  2. "Know where you're coming from" -- that is, know your weak points, the incentives to reach certain conclusions and the biases against others. "You are defined by your weakest link," he said.

    He noted an experiment on gender bias in which people were shown two similar technical resumes -- one with a female name and one with a male name. People who claimed to have no gender bias were in fact more likely to discriminate against the resume with the female name. Why? Those who were aware of their tendency toward bias were more likely to take action to counteract it, Silver said.

  3. Survey the data landscape, and make sure you have some variance in your data before having confidence in a forecast. (In other words, accurately forecasting the weather in San Diego is not as impressive a feat as doing so in Buffalo.)

    Likewise, forecasting a stable economy is easier than forecasting one marked by booms and busts, which helps explain why so many forecasters were unprepared for the most recent recession: they had built their models on data from 1986-2006, when the economy was unusually stable. A detailed and sophisticated model based on silly assumptions won't do you much good, he noted. (A small illustration of the variance point also follows these tips.)

  4. Finally, trial and error is helpful.

    Models tend to work well when they are developed slowly with a lot of feedback. As with many things in life: "You should be suspicious of miraculous results."
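
To make the Grand Forks numbers concrete, here is a back-of-the-envelope sketch. The normality assumption, and the reading of the 9-foot margin as one standard deviation, are illustrative choices -- not how the National Weather Service actually modelled the forecast:

    from statistics import NormalDist

    # Illustrative assumptions, not the NWS model: treat the 49 ft forecast
    # as the centre of a normal distribution, and the historical +/- 9 ft
    # margin of error as one standard deviation.
    forecast = NormalDist(mu=49, sigma=9)

    levee_height = 51
    p_overtop = 1 - forecast.cdf(levee_height)
    print(f"chance of cresting above the levees: {p_overtop:.0%}")  # roughly 41%

Under those assumptions, the town faced about a two-in-five chance of flooding; the bare point forecast of "49 feet" hid that risk entirely.

As for the variance point, a constant "forecast" looks skilful whenever the data barely move. The sketch below scores the laziest possible strategy -- always predict the long-run average -- against a placid synthetic series and a volatile one (the numbers are made up for illustration):

    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic daily temperatures: a placid climate vs. a volatile one.
    san_diego = 22 + rng.normal(0, 1, 365)   # low variance
    buffalo = 10 + rng.normal(0, 10, 365)    # high variance

    # Always predict the long-run average and measure the error.
    for name, temps in [("placid series", san_diego),
                        ("volatile series", buffalo)]:
        mae = np.abs(temps - temps.mean()).mean()
        print(f"{name}: mean absolute error {mae:.1f} degrees")

The naive forecast is nearly perfect on the placid series and poor on the volatile one, which is why beating it in Buffalo says far more about a model than beating it in San Diego.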
