Testosterone Boosters Don’t Work

Sergey enjoying the sh*t blizzard that is Google Page 1 for 'Testosterone Boosters'.

tl;dr ... Short and to the Point
Who is this article aimed at?
Anybody who has read the headlines about some academics 'proving' testosterone boosters don't work.
I am busy - summarize it for me

This is an interesting study, but not for the reasons you'd think. What makes it interesting is how little the academics involved understand about the online supplements world and Google search.

A recent (June 2019) study published in The World Journal of Men’s Health makes fascinating reading for anybody who is interested in testosterone boosting supplements.

Entitled “‘Testosterone Boosting’ Supplements Composition and Claims Are not Supported by the Academic Literature”, it was conducted by a team from the Keck School of Medicine and the Institute of Urology at the University of Southern California, Los Angeles.

The results grabbed headlines ("Test Boosters Don't Work!!" and so on), primarily because the testosterone boosting market is fairly well established and worth billions of dollars a year. The study concluded:

On PubMed, 24.8% of supplements had data showing an increase in T with supplementation, 10.1% had data showing a decrease in T, and 18.3% had data showing no change in T. No data were found on 61.5% of supplements on their effect on T

However, never mind the headlines. The study has proved to be a fascinating insight into the dystopian internet nightmare that is the post-'Medic' Google landscape ('Medic' being the colloquial name for the search giant's core algorithm update of August 2018), and it says a lot more about internet search results than it does about testosterone boosters.

The reason?

Bless them … these well-meaning academics decided to select the products for their study of testosterone booster efficacy by typing ‘Testosterone Booster’ into Google.

As anybody involved in the supplements game will be able to tell you:

Ranking top in Google does not mean the product is good. It means the website promoting the product is good.

This study is interesting because of the dreadful results Google now throws out, and because even very clever academics get confused and disoriented in the resulting sh*t blizzard of Google’s page 1 when they don’t know what they are looking at.

A brief lesson in Search Engine Optimization

When you Google ‘Testosterone Booster’ you will get different results than I get when I Google ‘Testosterone Booster’. Why is that?

Many, many reasons. The obvious ones are your location and your previous search history, but there are hundreds. This list is lifted from a Quora page on the topic:

  • Which version of Google you are using (dot co dot UK vs dot com)
  • Your IP address (which feeds both into your ‘identity’ and ‘geo’ location)
  • Your geographic location
  • The Google ‘data centre’ which is serving you: Data Centers – Google
  • Your cookies and browser-stored site visitation data
  • Your IT network and the types of traffic which commonly stem from it
  • Your Google account (if logged in) and its associated search history
  • Information relating to you from Google’s wider network (Google Maps, Google My Business / Places, YouTube) which is intravenously applied
  • The type of device you are using (tablet vs desktop vs laptop vs smartphone)
  • The specifications of your device (screen resolution, processing power etc)
  • The software deployment of your device (which web browser you are using – Chrome vs Dolphin vs Safari vs Torch vs Edge etc.)
  • The OS deployment of your device (Windows 10 vs OSX vs Linux Mint, etc.)
  • The nature and regularity of your queries (are they automated? Are they irregular and thus likely to be processed manually?)
  • Your current search session (what have you previously searched for in the current browsing session, even if you won’t be permanently storing the cookie)
  • Some social data from your Google+ profile
  • Your associated personal search entity
  • Your perceived ‘demographic’ data
  • Retargeting data from Google’s search partners
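
If a researcher wanted more reproducible results, the obvious first step would be to strip the low-hanging personalization signals from the query itself. A minimal sketch in Python (the `pws=0`, `gl`, and `hl` URL parameters are long-standing Google search parameters that switch off personalized results and pin the country and language; how much weight Google still gives them in practice is an open question):

```python
from urllib.parse import urlencode

def depersonalized_search_url(query, country="us", language="en"):
    """Build a Google search URL with personalization reduced.

    pws=0 asks Google not to personalize results; gl and hl pin the
    results country and interface language, so two researchers in
    different places see (more) comparable pages.
    """
    params = {
        "q": query,
        "pws": "0",      # personalized web search off
        "gl": country,   # results country
        "hl": language,  # interface language
    }
    return "https://www.google.com/search?" + urlencode(params)

url = depersonalized_search_url("Testosterone Booster")
print(url)
```

Run from a clean browser session with cookies disabled, that at least puts two researchers in different locations closer to the same page 1.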

The point being: whatever appeared on Google’s page 1 for study lead Mary K. Samplaski when she hit ‘search’ will almost certainly be different from what anybody else gets.

Which is fundamentally a pretty bad way to identify test boosters for a study on test boosters.

What did these products have in them?

This has been the really interesting part for us. Check out the image below and look at some of those headscratching ingredients!

Annoyingly, they haven’t identified any of the products (probably for fear of being sued), but there must be some absolutely terrible products on that list.

Take a look at this image. The original is posted here, but we’ll repost it in case it disappears.

Ingredients List

Some notable points are:

1. Less than two thirds of these supplements contained zinc, an absolute cornerstone of a test booster.
2. Only 17.8% contained Vitamin D (it doesn’t say whether it was D3 but let’s assume it was).
3. They spelled Mucuna pruriens incorrectly.
4. Ashwagandha and Withania somnifera are the same thing.
5. Only 4 of the 45 products contained Bioperine. This is a telling statistic. It tells me there was a lot of rubbish in the list.
6. “Ninety percent of supplements claimed to ‘boost T'” – the marketing departments in those other 10% really need to take a good hard look at themselves in the mirror.
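
For anyone wanting to sanity-check those figures, the arithmetic is simple. A quick illustrative sketch (only the Bioperine count, 4 of 45, is stated directly above; the Vitamin D count of 8 is back-calculated from the quoted 17.8%):

```python
# Back-of-envelope check of the percentages above, assuming the
# 45 products covered by the study.
TOTAL_PRODUCTS = 45

def pct(count, total=TOTAL_PRODUCTS):
    """Percentage of products containing an ingredient, to 1 d.p."""
    return round(100 * count / total, 1)

print(pct(8))  # Vitamin D: 8 of 45 -> 17.8, matching the quoted 17.8%
print(pct(4))  # Bioperine: 4 of 45 -> 8.9
```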

Testogen and TestoFuel?

Two of the biggest products on the test boosting market are TestoFuel and Testogen. You can’t look at lists of test boosters without seeing them; they are to test boosters what Coca-Cola and Pepsi are to sugary brown soft drinks.

And they both have vitamin K in them.

And the results say only one product in their list had vitamin K in it.

And they also say one product had oyster extract in it, which is basically TestoFuel; oyster extract is not a popular test boosting ingredient because it instantly removes both vegetarians and people with shellfish allergies from your target market.

Which means … they managed to miss Testogen, one of the best-known test boosters on the market, from their list of ‘Top 50 Test Boosters’.

Fat Burners

Some of these are blatantly fat burning ingredients. The products containing them must have been fat burners rather than test boosters at all, so no wonder they didn’t claim to boost test:

  1. Yohimbe
  2. 5HTP
  3. Green Tea Extract
  4. Caffeine
  5. Sarsaparilla
  6. Turmeric

How on earth did those land on the list?

Pasta followed by Apple Pie

What are these ingredients doing in a test booster? Are we sure this wasn’t a recipe they found?

  1. Apple Extract
  2. Garlic
  3. Rosemary

Things to look for

This ingredient list is a grim litany of test boosting disasters. The big surprise is that they thought 24.8% of that lot had ingredients that could be scientifically substantiated.

A few takeaways from this:

– Look for Bioperine. It tells you they’ve invested (and spent) a bit more on the formula and care about how effectively it’s absorbed and how well it works.
– Look for Vitamin D3. It is a nailed-on, full bird test boosting superstar; any test booster without it should be discounted immediately.
– Indole-3-Carbinol is massively under-represented in the top 50 t-boosters. It is the best estrogen control ingredient available.


Ultimately, this study isn’t indicative of the quality of Testosterone Boosters. This is indicative of the utter garbage now served up by Google when you stick the term ‘Testosterone Booster’ into the search bar.

It is quite scary though: these are educated people running this exercise, and they’ve managed to turn up the biggest load of absolute rubbish possible while trying to find a testosterone booster. What chance does Joe Average with no PhD or professorship stand?

Caveat emptor!! Do your research, people, and be safe out there 😉
