We’ve heard it all before:

“Green tea has been shown to cure X!”

“This study shows the catechins in tea have been proven to Y!”

“This one study proves tea cures cancer!”

We’ve heard these statements so often we take them for granted. We’ve heard these claims so much we don’t even bother to check the sources anymore. They’re basically fact. But there is one thing we should always remind ourselves:

Not all studies are equal.

The problem is that the media loves to latch onto the latest study that can be spun however they like. Have you ever noticed that certain foods (most commonly coffee and wine) seem to go in and out of vogue from year to year? One year coffee has been "proven" to increase your likelihood of getting sick; the next it's a preventative measure for high blood pressure or some such. Sometimes carbs are the enemy. Sometimes it's fat.

Have you ever noticed that debaters always have data that backs their argument, whether they are on one side of an issue or the other? Why is it that global warming deniers seem to have as much research on their side as those sounding the alarm?

Individual studies by themselves often contradict others trying to prove/disprove the same things. And a lot of times, when we look at studies, we find that they may not have been as reputable as we thought.

Two of the biggest things to look for (and to be clear, these are not the only warning signs) when checking out a study are sample size and who is sponsoring the study.

Who is financing a particular study?

Oftentimes we forget that almost everything–whether a documentary, a newspaper, or a scientific study–has a bias, even sources that claim to have none.

A cola brand (hint: it’s the brand with the red cans) got in trouble with the media a couple of years ago for funding a number of allegedly bogus studies that attempted to convince people that cutting sugar out of a diet isn’t nearly as effective as exercising, and so you really shouldn’t cut all the pop from your diet (something I can tell you is a load of horse hockey). Does this sound fishy? It should.

Studies need to be funded somehow, and getting money from a party who has much to gain from positive results is a great way to get your research done. But such studies need to be taken with a shaker full of salt. The results can be completely accurate, but sometimes the data gets skewed. And we need to be on constant watch for it.

How big is the study?

It’s pretty intuitive: the more data collected, the better, right? Right. The larger the sample size, the more accurate the results can be. It’s much easier to see a trend in a sample of 1000 people than in a sample of 50. So when we see a study claiming to have found a link between the consumption of, say, green tea and a decrease in some disease, a small sample size doesn’t really prove anything except that a larger study is called for.

These kinds of studies are great for news sites and blogs because you can pretty much prove whatever you want with a small enough sample size. The smaller the sample size, the closer it comes to anecdotal evidence, which is a far cry from a proven fact.
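To make the sample-size point concrete, here’s a quick simulation (a toy sketch of my own, not from any study mentioned here). We run many fake "studies" of a fair coin, where the true rate is exactly 50% and there is no real effect at all, and compare how wildly the results swing when each study is tiny versus large:

```python
import random

random.seed(42)  # fixed seed so the demo is repeatable

def sample_mean(n):
    """Simulate n fair coin flips (no real effect) and return the observed heads rate."""
    return sum(random.random() < 0.5 for _ in range(n)) / n

def result_spread(n, trials=1000):
    """Run many studies of size n and report how far apart their results range."""
    means = [sample_mean(n) for _ in range(trials)]
    return max(means) - min(means)

# Small studies swing wildly; big ones cluster near the true 50% rate.
print(f"spread across 1000 studies of n=10:   {result_spread(10):.2f}")
print(f"spread across 1000 studies of n=1000: {result_spread(1000):.2f}")
```

With ten-person "studies," you'll routinely see results like 20% or 80% heads purely by chance, exactly the kind of fluke a headline could call a discovery. With a thousand flips per study, the results hug 50%. Nothing about the coin changed; only the sample size did.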

This isn’t to say that all studies with small sample sizes are bad, but when your only evidence for a claim is one or two very small studies that merely suggest it, you have to wonder how accurate those studies really are.

On top of all of this, there is such a thing as just plain bad science. Studies that look at the wrong variables. Studies that find a correlation and call it causation. Just plain bad science. And these are the studies that get picked up by the media, get a lot of attention, and stick with us long after they are disproven, or even after the “scientists” themselves come out and admit their information was faulty. The damage has been done, and millions still think vaccines give you autism (they don’t), and that sex burns enough calories that we don’t need to exercise (wishful thinking, but no). Data can be manipulated to say whatever we want.

And, just as a reminder: a study that says “X suggests Y” or “A has been shown to B” is not the same thing as saying that “X always means Y” and “A proves B”. A study saying that green tea has been shown to prevent or delay the onset of cancer does not mean that green tea is a cure for cancer.

And we are part of the problem, too! We are all too happy to grab the first link on Google that “proves” our point, even if the study behind it comes from an institution we’ve never heard of and is more than a little questionable. We share these things like viruses, and we’ve unintentionally hurt people with faulty information. I’m not trying to call anyone out–I’m calling everyone out. We need to be more vigilant about finding real research.

The sooner we can commit to finding real information, the better off we are. And if the current American political climate is any indication, we are in desperate need of some real information and honesty.