
Choice Magazine Finds Sunscreens Falling Short of their Advertised SPF – Why?

January 24, 2016

In December, consumer product testing organisation Choice published the results of a study it had recently completed on SPF 50 sunscreens. Six sunscreen products were tested; only two passed. As if that isn’t bad enough, the Australian Choice report also states that consumer group testing in the USA, UK and New Zealand found discrepancies between advertised and re-tested SPF. Something doesn’t look good here.

So we have a situation where several consumer group product testers have tested several sunscreen products in different laboratories and found that a worrying number of those sunscreens don’t live up to their advertised SPF.

So where is the problem?  Are the products faulty or is it the test method or the way that these tests are carried out?

Whatever the truth is I am pretty sure that everyone involved will be hoping very much that the problem isn’t with them.

I don’t know the truth; I wasn’t involved in these tests, and I didn’t make the products, so I don’t have an intimate relationship with those particular formulations and manufacturing facilities. But I have made sunscreens before, and I know a fair bit about how they are tested, about the test methods used and about how and when sunscreen products are put through their SPF testing paces. From that background I can see how this could happen.

The Choice magazine article does point out some of the potential reasons for SPF variation, and they are all logical and worth a look, so I won’t repeat them here. Instead, I’ll look at how three of the tests ran and see what that brings up.

Looking at the Australian vs UK vs USA test results (Australia tested 6 products, the UK 25, the USA 49), we see that only 2/6 (33%) of the Australian-tested products met their SPF, while 21/25 (84%) of the UK-tested products were compliant. Testing in the USA gave a pass rate of 33/49 (67%). While this is interesting, it is hard to draw conclusions from these results: the Australian tests measured products advertised as SPF 50, the UK test measured SPF 30, and the USA tests spanned SPFs from 30 up to 100. Further, the test protocols differed between the labs, making direct comparisons between the three studies impossible. But that doesn’t mean the three sets of results can’t be evaluated on a more general level.

NB: I had to subscribe to get the full results for the UK and USA. 

Here are some things that I did note that might be of interest:

  • Using the USA data, if I take only the SPF 30 products and look at what percentage passed, I get a figure of 88%: 2 of the 17 samples failed to meet their SPF 30 label claim. This is pretty similar to the UK pass rate of 84%. (I’ve sketched these pass-rate sums in the code after this list.)
  • If I take the USA data for all products advertised with an SPF between 50 and 60 (including SPF 50+), we are left with 24 sunscreens, of which 13 passed (54%). The Australian results relate to products in this general SPF range and gave a pass rate of only 33%, but with only 6 products tested it is hard to call that a meaningful result.
  • To see whether it is true that the higher the claimed SPF, the more variable the test results, I looked at the USA results for products with an SPF over 60. There were only 4: two passed and two failed, a 50% pass rate. Again, this is a very small sample, so it is unwise to read too much into it other than to say the idea that higher SPF claims produce more variable results warrants further review.
  • In terms of brand failures, I looked at the three sets of data for any insight. Hawaiian Tropic had at least one product fail in both the UK and USA results but wasn’t tested in Australia. Banana Boat also had failures in two tests: Australia and the USA. As both of these brands are huge and carry many more SKUs than some of the other brands tested, I would urge people not to think of either as an abject failure; both also had products that met their required SPF. If anything, these anomalies make me think something strange and complex is going on rather than a simple case of ‘oh well, that company is clearly doing the wrong thing somewhere along the line’ or ‘that test house has it all wrong’. It is possible that a big brand uses multiple manufacturers and R&D chemists to create its products, which COULD result in one factory producing product of a different quality to another’s, but one would think in-house GMP/QC testing would identify that.
  • The UK test results included a rating out of 5 for ease of application, and I wondered whether that correlated with anything. Products that passed had an average application score of 3.62 (21 products, median 4); products that failed had an average of 3 (4 products, median 3.5). The uneven and small sample sizes make me reluctant to judge this, especially given that the lowest score in both the passed and failed groups was 1 and two of the four failed products scored 4/5. No other data set included a value like this.
  • The USA data also listed the SPF actives in the test results, which made it easy to spot formulations using zinc oxide and titanium dioxide only. Of this group, only 3/10 met their advertised SPF (advertised SPFs ranged from 30 to 60). This was the lowest-compliance group, with a fair few products coming in at around half their stated SPF.
  • The three test groups covered all sorts of sunscreen formats, including aerosols, roll-ons, mists and traditional creams. There were passes and failures across the formats, but not really enough data to say that sunscreen form predicts SPF compliance.
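
To keep the arithmetic above honest, here is a minimal Python sketch of the pass-rate sums. The overall counts are the ones quoted in this post; the per-band USA splits are my own reading of the subscriber results, and the failed-group UK application scores are a hypothetical spread chosen only to be consistent with the quoted mean and median, so treat the output as illustrative rather than official.

```python
from statistics import mean, median

def pass_rate(passed, tested):
    """Pass rate as a whole-number percentage."""
    return round(100 * passed / tested)

# Counts as quoted in this post; per-band USA splits are my own reading
# of the subscriber data, so treat them as illustrative.
groups = {
    "Australia, all (SPF 50)":      (2, 6),
    "UK, all (SPF 30)":             (21, 25),
    "USA, all (SPF 30-100)":        (33, 49),
    "USA, SPF 30 only":             (15, 17),  # 2 of 17 failed
    "USA, SPF 50-60 (incl. 50+)":   (13, 24),
    "USA, SPF over 60":             (2, 4),
    "USA, ZnO/TiO2-only actives":   (3, 10),
}

for label, (passed, tested) in groups.items():
    print(f"{label}: {passed}/{tested} = {pass_rate(passed, tested)}%")

# UK ease-of-application scores for the four failed products: the actual
# scores are not reproduced here, so this list is HYPOTHETICAL, chosen to
# match the stats quoted above (mean 3, median 3.5, lowest 1, two on 4/5).
failed_scores = [1, 3, 4, 4]
print(f"Failed group: mean {mean(failed_scores)}, median {median(failed_scores)}")
```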

The Procter & Gamble Experience.

The Choice Australia commentary on their website describes a case where manufacturer Procter & Gamble sent a product off for testing, telling the labs only that its SPF sat somewhere between 20 and 100; the results came back between 37 and 75. When the labs were instead told the SPF was expected to be around 80, three of them scored the product close to SPF 80, one at 54 and one at around 70. The little involvement I’ve had with SPF testing leads me to find this result interesting but not entirely surprising, because in some ways it is a difficult test.

The less guidance a sunscreen testing lab is given, the more testing has to be done. Testing takes time and costs money, and with human testing being relatively expensive anyway, most brands want to reduce this AND reduce the UV exposure for test subjects by honing in on an approximate SPF before people get involved. A rough SPF estimate can be made by machine, which also checks that the product has a good UVA/UVB balance, but not all test houses run a machine calculation as standard protocol before testing on people. It is often offered as something the client can pay for before testing commences, and, as with everything paid for, some brands will say no. Further, an in-vitro test doesn’t directly correlate with in-vivo results, although it is better than nothing and is a good way of screening out terrible products that are likely to burn people.

So what we are left with is having to test a product whose SPF could be anywhere between 20 and 100 on people. The only thing to do is take a conservative approach, which means lots of time-sapping testing: the UV dose is increased step-wise across the back of the test subject, who might have to sit perfectly still for up to 5-6 hours to cover that range. That’s just not practical.
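
A rough sum shows why the wide bracket matters. The sketch below assumes the lab steps UV doses up in a geometric progression of about 12% per step, which is in the spirit of standard in-vivo protocols such as ISO 24444, though the exact increment here is an assumption on my part.

```python
import math

def doses_needed(spf_low, spf_high, step=1.12):
    """Number of geometric dose steps needed to bracket an SPF somewhere
    in [spf_low, spf_high]. step=1.12 is an assumed ~12% increment,
    roughly in line with standard in-vivo SPF protocols."""
    return math.ceil(math.log(spf_high / spf_low) / math.log(step))

# Product could be anywhere from SPF 20 to SPF 100 (the P&G blind case):
print(doses_needed(20, 100))  # ~15 exposure sites per subject

# Product expected to be about SPF 80, so test a narrow bracket around it:
print(doses_needed(70, 90))   # ~3 exposure sites per subject
```

Each extra step is another exposure site on the volunteer’s back and more time under the lamp, which is exactly why labs want a decent SPF estimate before anyone lies down.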

Any Further Comment.

While there are a few different methods of testing sunscreens in the lab, the only test that really matters is the one you do on your bush walk, swim, work day or beach trip. Lab testing methods for sunscreens have been commented on, critiqued and tweaked time and time again, yet we still see some discrepancy between the results we get in the lab and the results people achieve in real life, and that isn’t surprising given the number of variables involved.

On the product side, of course, it is possible that a product containing multiple ingredients, often made across multiple factories around the world, would vary a bit, although one would hope not substantially.

Finally, on the people side, I am not entirely convinced that a person’s MED (minimal erythemal dose) doesn’t change over time, maybe even over the short term. I have found reports stating that salt bath exposure, fish oil consumption, sun exposure and a high-antioxidant diet can change a person’s MED, although I admit I haven’t found anything that specifically says the MED can change within 16-24 hours, the usual time in testing between exposure and reading the results. However, this aspect of SPF testing does interest me. Is it possible that longer SPF tests involving higher doses of UV trigger an adaptive tolerance response, something a bit more than the slowly, slowly erythema response? Human brains certainly favour thinking that resists change, so why should the skin be any different, especially given that adapting to something takes more effort than pretending it isn’t happening and carrying on as normal? I’m just pondering that and relating it to my own experience. I’m the worst kind of sun-exposed person: I spend lots of time out of the sun, then go for a walk and get burned; then I cover up and avoid the sun ridiculously until I’m healed, only to go walking and burn again a bit later. The only time I behave differently is on the occasional summer holiday, when I tend to go gently brown and not burn at all because I’m experiencing a more even dose of sun, allowing myself to adapt.

[Image: family in stinger suits]

But now I’ve gone on too much…..

The bottom line is that while there is much still to learn and perfect in the area of sunscreens, I am drawn to the conclusion that these discrepancies say more about our lack of understanding than they do about business practices or adherence to protocol. Personally I find that exciting, as it means there is room for much improvement. The only other thing I’d say is that while some sunscreens did fail, all of those tested provided a decent level of SPF coverage and, if used properly, would ‘seem’ to work. The number one reason sunscreens fail out in the real world is that they are not applied thickly enough or re-applied often enough, so if you are worried by these results, make sure you are using best practice to apply your product before hiding under your desk (see the quick sum below).
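
On ‘thickly enough’: lab SPF tests apply product at 2 mg per square centimetre of skin, a rate almost nobody matches in real life. A quick sum, using an assumed typical adult body surface area, shows what that rate means in practice.

```python
# Standard lab application rate for SPF testing.
RATE_MG_PER_CM2 = 2

# Assumed typical adult body surface area: ~1.7 m^2 = 17,000 cm^2.
BODY_AREA_CM2 = 17_000

grams_per_coat = RATE_MG_PER_CM2 * BODY_AREA_CM2 / 1000
print(f"Full-body application at lab rate: about {grams_per_coat:.0f} g")  # ~34 g

# A 200 g tube therefore holds roughly six full-body applications; if one
# tube lasts you a whole summer, you are applying well below the tested rate.
print(f"Full-body coats per 200 g tube: about {200 / grams_per_coat:.0f}")
```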

So there you go.  Very, very interesting.

I would LOVE to see the results of a large and diverse group of products tested to the same protocol across a range of test houses, using one standard, to see how that fared. As they stand, these results are hard to compare.

And one last comment: looking for an alternative to sunscreen for your high SPF protection? I did find this stinger suit very good, but I’m not sure how comfortable it would be if I had to wear it all summer 🙂

Amanda x
