Becoming Better Researchers

This month I want to take on research. My cortisol levels spike daily reading about the latest “research” in the media. The way it’s portrayed to us, the public, compared to what the data actually shows is appalling. Then companies, especially in the natural products industry, cite “research” to strengthen their claims. Finally, the general public, trying to do their best, “research” topics with little guidance in a skewed, biased system, and the result is just short of mayhem. I’m going to try to address all three.

If you don’t use the social media site Reddit, I recommend it. Just yesterday, someone posted “Various studies have shown that coffee prevents cancer, causes cancer, makes you live longer, makes you die younger and reduces your risk of diabetes.”  

I can’t think of a better opener that highlights all the problems in one apprehension-causing statement. What the heck am I supposed to do? Drink coffee? Not drink coffee? Trust the studies? Ignore the studies? What’s worse - this isn’t the only topic like this. Butter, dairy, gluten - almost everything feels contradictory! What should we do?

Educate yourself. That’s what we’re looking for. We’re going to help. Starting with the basics.

Here’s a piece of data that I like to point out to people when we are discussing clinical trials or research during a consult. It’s a graph charting the use of Internet Explorer as a web browser vs. the US murder rate.

As you can see, if people use Internet Explorer less, fewer people are murdered. Obviously, this isn’t true but data doesn’t lie, right? (I don’t know though - my homicidal tendencies increase when I have to use IE for work stuff)

The lesson here is the difference between correlation and causation, a huge piece of understanding data, research, and the scientific media. Just because two things correlate, meaning they have some sort of relationship, doesn’t mean that one causes the other.
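To make the IE-and-murders point concrete, here is a minimal sketch. The numbers below are illustrative, made-up yearly figures (not real data): two unrelated quantities that both happen to decline will show a near-perfect correlation anyway.

```python
# Two unrelated, declining yearly trends (illustrative numbers, NOT real data):
# Internet Explorer market share (%) and a murder rate per 100,000 people.
ie_share = [90, 84, 76, 67, 58, 49, 40, 32]
murder_rate = [5.7, 5.6, 5.5, 5.4, 5.0, 4.8, 4.7, 4.5]

def pearson(x, y):
    """Plain Pearson correlation coefficient, no external libraries."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

r = pearson(ie_share, murder_rate)
print(f"correlation: {r:.2f}")  # close to 1.0, yet one doesn't cause the other
```

A correlation coefficient near 1.0 here proves nothing about causation - both series simply trend the same direction over time.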

To determine whether a correlation reflects causation, a well-done experiment has to be conducted. Just because people who take Vitamin C during a cold get better doesn’t mean the Vitamin C made them better. Just because people with adrenal fatigue who take Vitamin D get better doesn’t mean it was the Vitamin D at all.

Lots of data illustrate a correlation, and that’s what people cling to. Then better studies are done and show no causation, and people either ignore them or are too entrenched in their habits or ideas to change.

In order to understand all of this confusion, we have to know what we don’t know and work towards being better.

Raise Our Own Standards

I want to kind of “clear the air” and set out a few observations that many people in the scientific community would make about the general public’s perception of research and science in general. This isn’t meant to offend, just to point out three major flaws many people have when it comes to understanding science, studies, research, and medicine.

First off, reading a clinical trial or doing accurate research is a skill. It takes LOTS of practice after thorough education on HOW to research, plus a good amount of statistics education. In other words, it is extremely difficult for a layperson to casually do. It takes effort, truly, to get a firm understanding of what a study is telling you. I’m going to get you started on your journey to make you better at it - don’t you worry!

Secondly, what we, the general public, consider ‘research’ is kind of what we consider ‘food’ to be nowadays. The bar is lower than what it used to be, let’s just put it that way. It’s important to discern between “Google search” and research. As we’ve stated before and as it is generally well known, Google search results can be manipulated. As a test - research whole food vitamins. Where does Nutriplex show up on that list? Almost all of the results on the first few pages aren’t actually whole food products. Research done by many laypeople ends up being too superficial - just scratching the surface, but potentially not penetrating through the layers of marketing, propaganda, and misinformation.

Thirdly, we have to realize the impact of our evolutionary shortcuts on what we believe and idealize. I highly recommend a book called “The Believing Brain,” which discusses how these evolutionary shortcuts were developed for survival but are now used to manipulate us or cause us to entrench ourselves in ideals we favor. We are advocates for a healthy dose of skepticism - don’t believe everything you read or hear - even from us! (Although, if we constantly talk about a lack of quality controls in the industry, and then 4 major retailers - vitamin stores included - are found to have no active ingredients in their herbs, that’s a big sign we might be on to something...)

Now that those are established, let’s beat up the people that are causing the problem and adding to our confusion, then send you to a few places to get a good understanding of how to be a more effective self-researcher on your own.

1. The Media Isn't Reporting For Medicine, They're Reporting For Clicks.

The “scientific media” is the PR side of research and the media organizations looking for increased advertising revenue from views. Both are prioritizing attention over accuracy. 

There are two major themes we see. First, studies that reach conclusions that we expect to see (“Drinking water doesn’t cause cancer”) aren’t really attention-getting. “No Duh” we’d say as we click through to see what the Kardashians are up to. But, if we see “Drinking water causes cancer,” we’ll at least pause for a moment, if not read more. As a result, studies that are done well and not controversial will tend to not be reported, while studies that draw new or odd conclusions will more likely be.

I told ya I’d teach you about studies, and here we go. I want to introduce you to a concept called confidence intervals to further reinforce this point. A confidence interval is the "fudge factor" or, more appropriately, the "fuzz factor." Most statistical analyses shoot for a confidence level of 95-99% - a measure of certainty that we're drawing the right conclusions. One of the biggest factors in maximizing that certainty is studying enough people, called the sample size.

If we wanted to find the average weight of 34-year-old men in this country, it would be pretty intense to try to weigh all of us. Instead, we use fancy math to determine how many people we'd have to weigh to get our certainty to its highest.
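That "fancy math" is, in its simplest textbook form, the sample-size formula for estimating a mean. The weights below (a 30 lb spread, a 2 lb margin of error) are purely illustrative assumptions, not figures from any real survey:

```python
import math

# Textbook sample-size formula for estimating a mean:
#   n = (z * sigma / E)^2
# z     = z-score for the confidence level (1.96 for 95%)
# sigma = assumed standard deviation of the population (illustrative: 30 lb)
# E     = margin of error we'll accept (illustrative: 2 lb)
z_95 = 1.96
sigma = 30.0
margin = 2.0

n = math.ceil((z_95 * sigma / margin) ** 2)
print(n)  # 865 - weigh ~865 men, not every 34-year-old in the country
```

Note how the required sample grows fast as the acceptable margin shrinks - halving the margin roughly quadruples the number of people needed.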

It also tells us, looking at it super simply, that with a 95% confidence level, 5% of the time the results we expect won't happen. And that's assuming we're talking about well-done studies. Assume for a moment that thousands of studies are done well annually (they are) and 5% of them report incorrect results. For every 1,000, you'd have 50 studies that, even though done properly, are reporting incorrectly. For every 19 that say "water doesn't cause cancer," there is one that may say "water causes cancer." Which is the media going to report on?
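The arithmetic above can be checked in two lines. The 1,000-studies-per-year figure is the round number from the text, used purely for illustration:

```python
# Back-of-envelope math: at a 95% confidence level, roughly 1 in 20
# well-done studies will still report a wrong result by chance alone.
confidence = 0.95
studies_per_year = 1000  # illustrative round number

wrong = studies_per_year * (1 - confidence)
print(f"Expected wrong results: {wrong:.0f} per {studies_per_year}")   # 50
print(f"{confidence / (1 - confidence):.0f} correct for every 1 wrong")  # 19
```

So even in a world of nothing but properly run studies, the newsworthy "water causes cancer" outlier shows up on schedule.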

So not only do decently done studies with wild conclusions make the news, but many times studies that reach inappropriate conclusions will make the news.

Another thing the scientific media does is take a new piece of research and extrapolate or speculate in ways the researchers never intended, just to make a shocking headline. We see this often: a paper reaches a conclusion, the authors cite the need for further analysis or study, and the media reports what it took from a brief looksie as fact.

This is the “cancer” thing we frequently talk about with the media.  Cancer isn’t one thing.  A study will look at specific types of cancer, in specific stages, and specific populations.  If you see an article that says "X does Y to cancer," take it with a grain of salt.

So when looking at something used for therapy, we don’t want to make broad generalizations - we can only truly infer or extrapolate to specific scenarios.  Testing something for its impact on one marker of inflammation or sickness doesn't necessarily mean it is good for "immune health."

2. New Data Isn't The Current Consensus, It's Just New Data.

This is one of my favorite articles from The Onion:

Eggs Good For You This Week

BOSTON—According to a Northeastern University study released Monday, eggs—discovered last week by a University of California-Santa Cruz study to be unhealthy, raising serum cholesterol by as much as 20 percent—have beneficial effects on cardiovascular health this week. "Contrary to what was previously thought, consuming an egg a day can lower a person's blood pressure and increase the heart's efficiency for the next week," the Northeastern study stated. The report urged Americans to increase egg consumption immediately, as eggs may be unhealthy again as soon as next Monday.

Each new paper does not represent the current state of knowledge. Period. Exclamation point.

This is crucial: science is the study of everything - drawing conclusions and testing those conclusions rigorously. We MUST be able to reproduce the results for a study to be valid - which is why a single piece of data cannot stand on its own. Science is a journey of trial-and-error.

Humans are extremely complex organisms with varied habits, abilities, and capabilities. Studying them is extremely difficult, especially things such as diet’s impact on outcomes - death, strokes, etc.

So studies are done on similar topics to further our understanding of a subject until we’ve either got an extremely well-done study with lots of people (a large sample size and a high confidence level) or lots of good data trending in similar directions. What you end up with is something like this:

 

I recommend reading the article “Beware the One Man Study” to understand the concept more. As its author states, it is difficult to find a graph like this for any particular topic. That’s one of my biggest complaints about research: this stuff isn’t compiled regularly for the general public. Guess what we may start doing soon!

Medicine is a science that usually has some urgency to it. People are looking for answers today, so new studies are sought after with some immediacy. But not all studies “move the ball down the field.”  Just because it is new, doesn’t mean that it is actually adding to our current knowledge base or furthering our efforts to date. Sometimes, studies can set us back.

Meta-analyses and review articles are good ways to get the current state of affairs, but be wary of these, too. When the New York Times reported this week on fish oil not being beneficial for helping us live longer or healthier, we realized the methods of the underlying studies weren’t entirely appropriate. Many of them dosed less than 1000 mg of EPA and DHA. We don’t know yet what the impact on death and cardiovascular events is from a correct dose of fish oil. We do know many people lack Omega-3s in their diets AND eat way too many Omega-6s, so a healthy dose makes sense and has lots of short-term benefits. We’ll wait to see if it will keep us alive longer!

If you want to know the current state of a science, product, or treatment, you have to look at the whole body of research to really know what’s going on. But that doesn’t sell newspapers, increase click rates, or get attention to increase funding at a school.

3. Not All Data Is Good Data

As we said before, some good studies can reach improper conclusions. More often, weak or poorly done studies reach improper conclusions, get reported in the media, and capitalists latch on and adopt them to move a product.

Papers get retracted all the time for errors or misconduct during the study, but that’s rarely reported. The media just reports; it doesn’t analyze the study’s methods, results, or statistics.

There is A LOT that can go wrong with a study. Here’s a quick list:

  • Plausibility - does their approach make sense? There was a physics paper claiming neutrinos travel faster than the speed of light. Many physicists scratched their heads; some got excited. Further analysis showed the study was done incorrectly, and other studies confirmed the finding was wrong.
  • Poor design - not enough people studied, wrong statistics were chosen, etc.
  • Variables - not considering everything that can skew the data. Multivitamin trials would have to split participants by gender, age, activity level, health risk factors, and diet and exercise regimen - then get enough people in each group, and maintain it all for a long time to determine long-term impact. Very expensive, and hard to do correctly.
  • Bias - huge issue.  Who's motivated to what end as it pertains to the data?
  • Money - Money funds certain research or makes studies hidden and inaccessible. This is how people feel about GMO research.
  • The peer-review process at times can be sketchy.

But the two that I want to focus on are the works cited section and the raw data. By improving how we handle these two things, we could dramatically improve science and remove much of the confusion and misinformation. The works cited section - the references a study uses - lists authors, journals, and pages. It doesn’t really discuss WHY the authors picked each particular reference. Having that information available would be EXTREMELY helpful. Secondly, the raw data collected by a clinical trial doesn’t have to be fully disclosed. Having it available would GREATLY improve the validity of studies - peer review would become something like Wikipedia: crowdsourced and scrutinized.

Research papers are like movies: 99.9% of the stuff that is shown is intentional. There's a lot, potentially, on the cutting room floor. If something isn't mentioned, assume it wasn't done, or was done incorrectly and is being hidden.

If a study reaches a conclusion that is new or unique, it’s probably wrong or, in fact, inconclusive. Just don’t throw the baby out with the bathwater. We can’t sensationalize every single shred of information as definitive or as advancing our understanding. And just having a reference doesn’t mean anything - products can and have referenced studies that have nothing to do with the specifics of their formulation.

Getting Better At Researching

We want you to be better. Here’s what I recommend for researching a topic.

1. Start with what we call tertiary literature - encyclopedias, textbooks - and move to primary literature - the actual research papers. This gives you a general understanding first, then moves to the specific, illustrating how we got to where we are.
2. Read every reference. Why was it referenced? Does it even support what the paper or document is saying?
3. Focus on the methods as much as the results. How did they try to prove their theory? Is it plausible? Did they forget something crucial?
4. Keep digging. If you see something you don’t understand, look it up! You’ll see themes emerge and realize that many topics have similar structures, so looking things up early and often will help your total understanding.
5. Get help. Bring your findings to an expert - doctors and pharmacists are trained in research analysis. It was literally 3 years of specific study for us, with 1 year of statistics. We also have access to almost all journal articles through our relationships with pharmacy colleges across the state!

This is a great guide for beginners.

The Conclusion Always Has The Heart

Our primary mission is to ensure that no matter what product you want to take, you get a pure, potent, and consistent product. If you want to regularly use or try something, you should have the right to do so, which we 100% support. Some people want isolated vitamins, like a one-a-day or Vitamin C. With us, you’ll know we will be 100% transparent about the source as well - no fibs about “whole food” or “food-derived” if the sources are non-natural.

Some of us here are licensed professionals, and all of us are trusted experts. From an advice standpoint, we’ll prioritize the dosages and forms with the most research behind them and what is SAFE. We will still respect your autonomy.

But we have a third, overarching role: protecting you. Protecting you from misinformation and potential health hazards. Most important is our role in protecting you financially. People spend LOTS of money on supplements when they don’t have to, and most of that spending is based on weak data, improper research, and the false claims of an unregulated industry. So please, know we are trying our best to do the best for you.

As with a lot of things today, people cling to ideals despite a lack of evidence to support them, or an overwhelming amount of evidence against them. What we are proposing with this rant is a healthy dose of skepticism, critical thinking, and diligence. When formulating an opinion on a topic, it’s important to know the current state of research, speak with trusted experts, and know what we as a whole don’t know.

A great quote to leave you with: “Be skeptical. But when you get proof, accept proof.” - Michael Specter

Our Woodstock Vitamins products are Vitality Approved, so you can be sure that you're getting quality products: pure, potent, and consistent batch to batch.

Just trying to keep it real...

Neal Smoller, PharmD
Owner, Pharmacist, Big Mouth

neal@woodstockvitamins.com

About Dr. Neal Smoller

Dr. Neal Smoller is a holistic pharmacist, supplement expert, and founder of Woodstock Vitamins. Dr. Neal’s mission is to challenge the natural products industry, redefining holistic care and setting the standard for supplement quality. His methods and products are backed by real science, and with them, he builds and supports his customers’ lifelong wellness strategies.
