Fake News: “Lies, damned lies and statistics”

With the sheer volume of media bombarding us each day, it can be tough to identify what is fact, what is opinion (or, perhaps more importantly, what is a credible opinion) and what is just plain made-up B.S. News and social media need to grab our attention in a world where attention is at a premium, so they use shock tactics to get through our attention filters. Their objective is to make it seem like Halloween every day.

There are a lot of reports on “scientific studies”. Unfortunately, the journalists who write these articles often don’t understand the scientific process. They may misrepresent the information to make it more understandable or interesting to the reader, and the study itself may even be totally bogus. You need to become a sceptic and a detective to separate truth from fiction.

Sample size – How many were actually involved?
Studies need a “high” number of participants for the conclusions to be believable – usually in the hundreds or even the thousands. Scientists may run small studies (often called pilots) with just a few people to see if there is potentially a trend. If something does seem to be happening, the pilot can be used to apply for funding for a bigger, more conclusive study. Unfortunately, poor science or poor journalism will try to present these small studies as conclusive, and they simply are not. If there were only 15 people in the trial, you can’t really trust the results: if you ran exactly the same study on 15 other people, there is a good chance you would get quite different numbers. This brings me to….
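To see why small samples are so unreliable, here is a minimal Python sketch – a made-up simulation, not any real study. It runs the “same” trial ten times on a treatment with a true 50% success rate, once with 15 participants and once with 1000:

```python
import random

random.seed(42)  # fixed seed so the demo is reproducible

def trial_success_rate(n):
    """Simulate a treatment with a true 50% success rate on n participants."""
    return sum(random.random() < 0.5 for _ in range(n)) / n

# Run the same "study" ten times at each sample size.
small = [trial_success_rate(15) for _ in range(10)]
large = [trial_success_rate(1000) for _ in range(10)]

print("15 participants:  ", [round(r, 2) for r in small])
print("1000 participants:", [round(r, 2) for r in large])

# The spread (max - min) across repeated small studies is far wider.
print("spread with n=15:  ", round(max(small) - min(small), 2))
print("spread with n=1000:", round(max(large) - min(large), 2))
```

With only 15 participants the measured success rate swings wildly from one run to the next, while the 1000-participant runs agree closely – which is exactly why a single small study tells you very little.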

Repeatability – Can they do it again?
You can’t draw conclusions from one study alone; there need to be several similar studies showing similar results. This is why drug development involves a number of phases, each of which runs a new study with new participants on the same drug. Recently, a research team in South Africa announced that, according to their study, there were only between 250 and 530 Great White Sharks left off the shores of South Africa. The media reported this as established fact, with headlines such as “Sharks on the brink of extinction!”. However, other shark scientists were hugely critical, because multiple long-term studies showed there were many, many more sharks. Just because one experiment showed a result doesn’t mean it should be believed. A few studies showing the same result – now we’re talking.

Hidden, or latent variables – Did something else cause it?
A recent study showed that children who drink full-cream (full-fat) milk are slimmer than children who were given low-fat milk. As a sceptic and detective, you need to ask yourself: “But what else is different about kids who drink full-cream milk?” Perhaps their parents believe in more “natural” food, and those families eat more vegetables and fewer refined foods. So while it would appear that full-cream milk makes children slimmer, the milk may be masking some other hidden, or latent, variable: these children may simply be healthier eaters in general. A good scientist will account for such variables by recording them and including them in the model. A good journalist will report these potential latent variables.
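A hypothetical simulation makes the point concrete. In this sketch (invented numbers, not the actual milk study), a hidden trait – a “healthy household” – drives both the milk choice and the children’s weight, while the milk itself does nothing:

```python
import random

random.seed(1)

# Invented data: a hidden trait ("healthy household") drives BOTH the
# choice of full-cream milk AND a lower weight -- the milk does nothing.
children = []
for _ in range(10000):
    healthy_household = random.random() < 0.5
    # Healthy households are more likely to choose full-cream milk...
    full_cream = random.random() < (0.7 if healthy_household else 0.3)
    # ...and their children weigh less, regardless of milk type.
    weight = random.gauss(30 if healthy_household else 34, 2)
    children.append((healthy_household, full_cream, weight))

def mean_weight(group):
    return sum(w for _, _, w in group) / len(group)

full = [c for c in children if c[1]]
low = [c for c in children if not c[1]]
print("full-cream drinkers:", round(mean_weight(full), 1))  # looks slimmer
print("low-fat drinkers:   ", round(mean_weight(low), 1))

# Controlling for the hidden variable, the apparent milk effect vanishes:
for h in (True, False):
    f = mean_weight([c for c in children if c[0] == h and c[1]])
    l = mean_weight([c for c in children if c[0] == h and not c[1]])
    print(f"healthy_household={h}: full-cream {f:.1f} vs low-fat {l:.1f}")
```

Comparing the raw groups makes full-cream drinkers look slimmer, but once you compare children within the same type of household, the “milk effect” disappears – the hidden variable was doing all the work.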

Generalisation – Can the results of the study be applied more broadly?
Sometimes a study is made to look like it applies to everyone. For example, a study recently reported in the media claimed that “A glass of wine a day is equivalent to an hour at the gym”. If you read the fine print, though, this was a study on lab rats! It doesn’t apply to people at all. Similarly, if a study is run on people of a particular age or ethnicity, the results will not automatically apply to the general population.

Hidden Agenda – Who sponsored the study? Do they have a financial interest in the outcome?
“Study shows there is no association between soft drinks and obesity”. Hmm… let’s have a look at who sponsored that study. The impartiality and credibility of scientists can be at risk when they are given a big grant by the very industry they are studying. Ask yourself: are the funders reputable, or could they have a hidden agenda?

Some websites have a very big agenda. For example, a website such as WeDontLikeVaccinations.com will often misrepresent a study, or just make one up, while sounding very convincing in doing so. It’s better to read about studies in newspapers. Better still, look up the study in Google Scholar. Usually you need a subscription to read the full article the scientists wrote, but you can read what’s known as the abstract – a short summary of the study itself.

Confirmation Bias – Is it just confirming what I want to believe?
If you want to believe wine is good for you, a typical Google search might be “the health benefits of wine” or “is wine good for you?”. Google will return reports showing the positive benefits of wine. The results would be very different if you Googled “the side effects of drinking wine” or “is wine bad for you?”. Similarly, experiments may be set up to confirm a pre-existing belief, and a journalist may report only on the experiments that support the article their readers would enjoy. This is understandable, but you need to be sceptical of any conclusions you draw from it.

In the scientific world, a study or paper needs to be peer-reviewed by other (anonymous) scientists before it is published in a scientific journal. Unfortunately, the media will often report on a study before it has gone through the peer-review process. It can be hard to tell whether a paper has been peer-reviewed, but it’s worth keeping in mind.

So the next time you hear the phrase “a recent scientific study has shown”, we challenge you to be sceptical and look for potential flaws in both the experiment and the reporting itself.

