Don't Trust Average Attenuation


By: CJ, 18 August 2018

Homebrewers often look to yeast manufacturers' attenuation ratings when choosing a yeast strain or predicting final gravity. I sometimes find myself thinking "Strain X attenuates 74%, but strain Y attenuates 75%, and I want a dry beer, so I should go with Y, right?" or "I'm aiming for 74% attenuation to hit my desired final gravity, so I should use strain X, right?"

The data I have so far suggests that this is a bad idea: the manufacturer's average attenuation rating corresponds only loosely with actual attenuation. In other words, don't trust average attenuation.

Fun with Statistics

Statistically, the simplest measure of how one thing corresponds to another thing is 'correlation'. A correlation closer to 1 means there is a strong positive relationship between the two things, a correlation near 0 means there is no real relationship between them, and a correlation closer to -1 means there is a strong negative relationship. If we studied the correlation between, say, the number of standard bricks I've stacked on top of each other and the height of my stack, we'd expect to find a correlation of 1. If we studied the correlation between a person's height and their shoe size, we'd expect a pretty strong correlation, but not quite 1, since some people have feet that are quite large or small for their height.
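
If you want to see those two cases in actual numbers, here's a minimal sketch in Python; the brick and shoe figures are invented purely for illustration.

```python
# A toy check of the correlation idea; all numbers are made up.
import numpy as np

# Stack height is exactly (number of bricks) x (height per brick),
# so the correlation comes out at 1.
bricks = np.array([1, 2, 3, 4, 5])
stack_height_in = bricks * 2.5  # pretend each brick adds 2.5 inches
print(np.corrcoef(bricks, stack_height_in)[0, 1])  # 1.0

# Height vs. shoe size: clearly related, but with individual variation,
# so the correlation is strong without reaching 1.
height_in = np.array([62, 65, 68, 70, 73, 75])
shoe_size = np.array([8, 7.5, 10, 9, 12, 11.5])
print(np.corrcoef(height_in, shoe_size)[0, 1])  # roughly 0.89
```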

Another interesting statistical measure of how one thing corresponds to another is the R squared of a linear regression. The idea here is that you try to predict one thing from another on the assumption that they have a straightforward, straight-line relationship (every extra unit of one adds a fixed amount to the other), and then see how closely the actual results match your predictions. The closer the R squared is to 1, the closer the predictions are to the actual results, which is good. An R squared closer to 0 means the predictions are far from the actual results, which is not so good. In my brick example, you should be able to predict perfectly how tall my stack of bricks is if I tell you how many standard bricks are in it: you just multiply the number of bricks by the height of each brick. Your R squared would end up at 1, since there is a perfect 1-brick-equals-X-inches relationship at work: your predictions would exactly match reality. With the shoe example, you'd be less accurate in predicting shoe size from height. Because of individual variation, there would be some difference between what you predicted and what was actually observed. The R squared might be something like .7 or .8, but it wouldn't quite be 1.
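
Here's the regression version of the same toy example. scipy's linregress reports the correlation r, and with a single predictor the R squared is just r squared, so the numbers line up with the correlation sketch above.

```python
# The same toy data, run through a linear regression.
import numpy as np
from scipy.stats import linregress

# Bricks: a perfect straight-line relationship, so R squared is 1.
bricks = np.array([1, 2, 3, 4, 5])
stack_height_in = bricks * 2.5
print(linregress(bricks, stack_height_in).rvalue ** 2)  # 1.0

# Shoes: the fit is good but not perfect, so R squared lands below 1.
height_in = np.array([62, 65, 68, 70, 73, 75])
shoe_size = np.array([8, 7.5, 10, 9, 12, 11.5])
fit = linregress(height_in, shoe_size)
predicted = fit.intercept + fit.slope * height_in  # the regression line
print(predicted - shoe_size)  # small misses due to individual variation
print(fit.rvalue ** 2)  # roughly 0.79: good predictions, but not perfect
```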

Sweet, Sweet Numbers

When it comes to brewing, if there were a strong positive link between the manufacturer's attenuation rating and actual attenuation, we'd expect the data to show a correlation near 1. If the manufacturer's rating were helpful in predicting actual attenuation, we'd also expect an R squared close to 1 when predicting actual attenuation from the average rating.

That's not what we get. Not even close.

I analyzed data on 148 batches, which is all I have available at the moment. The correlation between manufacturer's average attenuation rating and actual attenuation was only 0.218. The R squared was only 0.047. So the connection between what the yeast manufacturer tells you about the strain and what you'll get is pretty tenuous.
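
I did the actual number-crunching in a spreadsheet (see the exercise at the end of this post), but if you'd rather script it, here's a minimal sketch of the same calculation. "batches.csv" and the column names are hypothetical stand-ins for whatever is in your own batch log.

```python
# A sketch of both measures on a batch log; the file and column names
# are hypothetical, so point them at your own records.
import pandas as pd
from scipy.stats import linregress

batches = pd.read_csv("batches.csv")  # one row per batch
fit = linregress(batches["avg_attenuation_rating"],
                 batches["measured_attenuation"])

print(f"correlation: {fit.rvalue:.3f}")       # 0.218 on my 148 batches
print(f"R squared:   {fit.rvalue ** 2:.3f}")  # 0.047
```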

This isn't surprising if you dig deeply into the White Labs website. White Labs posts limited data about batches they brew for their tasting room, which you can look through here. Unfortunately, they don't give much information about the recipe or brewing process, but they do note the apparent attenuation of most batches. If you pick a random White Labs strain, select a few random batches using that strain, and compare their actual attenuation measures, the results are also all over the place. WLP001, listed as having an average attenuation of 76.5%, will attenuate up to 90% in some conditions (check out batch 91.3 on the site linked above).

Is This Significant?

In statistics, a lot rests on 'significance', which is the line drawn between what we're willing to count as "yes, this is a strong correlation..." and "no, this is a weak correlation". There is no absolute rule for the cutoff point: it depends on what you're doing, what you care about, and many other factors. A weak correlation between eating kale and randomly bursting into flames might turn people off of kale, while a strong correlation between eating kale and having a slight headache every 10 years wouldn't. (For the record, no one should ever eat kale regardless.)

To establish some sort of reference point, I looked at the same 148 batches and examined the correlation and R squared between the percentage of specialty malts used in the grist and the measured attenuation. In this case, the correlation was -0.41 and the R squared was 0.17. Not mind-blowing, but these indicate stronger relationships. Basically, if you were trying to predict your attenuation, you'd be better off looking at how much specialty grain you're using rather than the manufacturer's average attenuation rating.
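
The same sketch covers this comparison; you just swap in the specialty-malt column (again, a hypothetical name).

```python
# Same check as before, with percent specialty malt as the predictor.
import pandas as pd
from scipy.stats import linregress

batches = pd.read_csv("batches.csv")
fit = linregress(batches["pct_specialty_malt"],
                 batches["measured_attenuation"])

print(f"correlation: {fit.rvalue:.3f}")       # -0.41 in my data
print(f"R squared:   {fit.rvalue ** 2:.3f}")  # 0.17
```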

But why would this be? Ask yourself a question: how do the yeast manufacturers come up with average attenuation ratings? Please email me if you know, because I have no idea! I tried to track down whether there is some sort of standard protocol for producing this measure (as there is for, say, the SRM or Lovibond rating of a grain), but couldn't find one. So it might be that yeast manufacturers are testing strain attenuation in specific lab conditions that don't replicate what homebrewers are doing outside the lab. Maybe they're using a wort that's mashed in some unusual way, fermenting in some odd way, or using equipment that homebrewers tend not to use. It's hard to tell. Unfortunately, that also means it's dangerous to assume that your batches will be similar enough to theirs that you'll get a similar yeast performance.

Analysis Summary
Feature 1                    Feature 2              Correlation   R Squared
% Specialty Malt             Measured Attenuation   -0.41         0.17
Average Attenuation Rating   Measured Attenuation    0.218        0.047

Bonus!

So out of curiosity, I ranked the 148 batches on which I have data to see which of them had actual attenuation closest to the manufacturer's average attenuation. Of those 148, there were 19 batches within 1 percentage point of the manufacturer's rating, and 2 batches that were dead on. Among the 19, there seemed to be no pattern: some were step mashed and some weren't; some had specialty grain and some didn't; some had simple sugars, some didn't. For the 2 that were dead on, there were some commonalities: both were single infusion mashed at 152°F, had 10-15% specialty grain and no simple sugar in the grist, were sparged with 5-6 gallons of water, were between 1.040 and 1.050 OG, and used non-Belgian/diastaticus yeast without a starter. I also found 1 other batch with the same characteristics that was 1.26 percentage points from the manufacturer average, which is pretty close, though it only ranked 26th in closeness among the 148 batches. Oddly, all three of these batches finished at 1.013.
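
Here's a minimal sketch of that closeness ranking, using the same hypothetical CSV and column names as before.

```python
# Rank batches by how far measured attenuation landed from the
# manufacturer's average rating, in percentage points.
import pandas as pd

batches = pd.read_csv("batches.csv")
batches["gap"] = (batches["measured_attenuation"]
                  - batches["avg_attenuation_rating"]).abs()

ranked = batches.sort_values("gap")
print(ranked.head(20))                # the closest batches
print((batches["gap"] <= 1.0).sum())  # within 1 point: 19 for me
print((batches["gap"] == 0).sum())    # dead on: 2 for me
```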

What's interesting to me is that this profile is very much like a standard American Pale Ale brewed with a normal fly or batch sparge process. Maybe the manufacturers are using a wort that they feel is representative of a standard American craft beer, in the hope that their measures will then apply to average brewing situations. So it might turn out that your Sierra Nevada clone ends up with attenuation closer to what the manufacturer says than your Duvel clone does. Sucks for people like me who'd prefer making a Duvel clone.

An Exercise for the Reader

Anyone with Excel can run the same analyses I did on their own batches. To show what you can figure out with simple tools, I deliberately avoided fancy computer coding for my own results. Just set up a spreadsheet with one row for each batch and one column for each aspect of the brew (OG, average attenuation rating, actual attenuation, percentage of specialty grain, etc.), then install the Analysis ToolPak and have Excel spit out a correlation and a linear regression. I'd be interested to see whether your data analysis has the same outcome as mine. Let me know!