Water Chemistry Questions

Homebrew Talk

You can only remove so much alkalinity, because the reaction runs in both directions and reaches equilibrium at about 65 ppm alkalinity* (as CaCO3). [* provided that sufficient calcium is present; for reaction '1' the Ca++ ion must remain in excess by at least 12 ppm]

1) Ca++ + 2HCO3- <--> CaCO3 + CO2 + H2O

2) CaCO3 + CO2 + H2O <--> Ca(HCO3)2

NOTE: If you exhaust calcium down to its 12 ppm minimum before the water's alkalinity reaches 65 ppm, add more calcium to the boil mix as either CaCl2 or CaSO4.
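The limits above can be sketched in a few lines of Python. The 65 ppm alkalinity floor and 12 ppm calcium minimum are taken from this post; the 0.4 ppm-Ca-per-ppm-alkalinity factor is my own assumption from the stoichiometry of reaction '1' (one Ca++ per CaCO3; molar masses 40 vs. 100), so treat this as a rough sketch, not a water calculator:

```python
ALK_FLOOR = 65.0   # ppm as CaCO3; equilibrium limit cited in the post
CA_FLOOR = 12.0    # ppm Ca++ minimum cited in the post
CA_PER_ALK = 0.4   # assumed: ppm Ca++ consumed per ppm alkalinity (as CaCO3) removed

def boil_decarbonation(alk_ppm, ca_ppm):
    """Return (final_alkalinity, final_calcium, ca_needed) after boiling.

    ca_needed > 0 means calcium runs out before the 65 ppm alkalinity
    floor is reached, so more CaCl2 or CaSO4 should be added.
    """
    removable_by_alk = max(alk_ppm - ALK_FLOOR, 0.0)
    removable_by_ca = max(ca_ppm - CA_FLOOR, 0.0) / CA_PER_ALK
    removed = min(removable_by_alk, removable_by_ca)
    ca_needed = max(removable_by_alk - removable_by_ca, 0.0) * CA_PER_ALK
    return alk_ppm - removed, ca_ppm - removed * CA_PER_ALK, ca_needed

# Example: 200 ppm alkalinity but only 50 ppm Ca++ -> calcium runs out first.
final_alk, final_ca, ca_needed = boil_decarbonation(200.0, 50.0)
```

With 200 ppm alkalinity and 50 ppm calcium, calcium bottoms out at 12 ppm with alkalinity still at 105 ppm, and roughly 16 ppm more Ca++ would be needed to reach the 65 ppm floor.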
 
Much of what Bamforth was referring to in his last sentence (within the words of his which I quoted above) was the confusion surrounding peer-reviewed studies as to whether pH was measured at mash temperature or at room temperature. The old peer-reviewed brewing-industry journals rarely, if ever, state explicitly which was used, and it must be carefully teased out of each study.

Amazingly enough, Bamforth, a former President of the Institute of Brewing and Distilling who was also the Anheuser-Busch Endowed Professor of Malting and Brewing Sciences at the University of California, Davis between 1999 and 2018, is honest enough to admit confusion. Yet all it took for amateur homebrewers to definitively shed such confusion, and to accept as gospel fact that all such measurement has throughout history been taken at room temperature, was for AJ deLange to proclaim it definitively so through his own intuition.

For but one example of this, in the 2004 book 'Brewing: Science and Practice' by Briggs, Boulton, Brookes, and Stevens, at the top of page 115 is found:
Infusion mashes are best carried out at pH 5.2-5.4 (mash temperature), and so will give cooled worts with pH values of about 5.5-5.8.

If we allow for AJ's formula for temperature-induced pH shift, being 0.0055 pH points of shift per degree C, and apply it to the above (presuming 66 degrees C. at infusion mash temperature, and 20 degrees C., the EBC standard for wort pH, as opposed to mash pH readings, for which no standard exists AFAICT), we get:

0.0055 x (66-20) = 0.253 as the pH shift. Let's round it to 0.25 points. Then we get 5.2 + 0.25 = 5.45, and 5.4 + 0.25 = 5.65

And we come to the optimum "room temperature" mash pH range of 5.45 to 5.65 pH, with an ideal midrange target of 5.55 pH. This as opposed to Briggs' 5.5 to 5.8 for room temperature, with an ideal midrange of 5.65.

No matter how you approach it, for Briggs and company, 5.4 is not the ideal room temperature pH midrange for mashing. 5.55 is a better room temperature midrange target if we allow for AJ's observation of the pH shift observed as measurement temperature varies. Or 5.65 for Briggs.
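The conversion above can be written as a tiny helper. The 0.0055/°C coefficient is AJ's rule of thumb as quoted in this thread; the function itself is my own sketch:

```python
SHIFT_PER_C = 0.0055  # AJ deLange's rule of thumb, pH points per degree C

def room_temp_ph(mash_temp_ph, mash_temp_c, room_temp_c=20.0):
    """Convert a pH measured at mash temperature to its expected
    room-temperature reading (pH reads higher in the cooler sample)."""
    return mash_temp_ph + SHIFT_PER_C * (mash_temp_c - room_temp_c)

# Briggs' infusion-mash range of 5.2-5.4 measured at 66 C:
low = room_temp_ph(5.2, 66.0)   # ~5.45
high = room_temp_ph(5.4, 66.0)  # ~5.65
```

Applied to Briggs' 5.2-5.4 at 66 degrees C., this gives roughly 5.45-5.65 at 20 degrees C., matching the arithmetic above.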
 
Here is a picture from a Weyermann presentation of a mash into which they added consecutive increments of acid malt as a percentage of the grist weight. Notice the difference in measured pH at mash temperature (called specifically Mash-pH) vs. at room temperature (called specifically Wort-pH) for each addition of acid malt. I 'speculate' that "perhaps" the pH differential was narrower at the onset because this was a step mash, and only at about step 6 or 7 did the mash reach its final target mash temperature (of perhaps 65-67 C.). Their differential (from step 6 on) is about 0.22 pH points on average. Closer to AJ's 0.25 points than to Briggs' 0.35 pH points.

[Attached image: Acidulated_Malt.png]
 
Note also that Weyermann ended the acidification when they achieved 5.50 pH at room temperature. They did not target 5.40.

Also note that it is a "mash pH" only when measured at "mash temperature". And note that if mash pH's were actually intended to be taken "only" at room temperature, then a single pH column representing the pH progress during the mash as they acidified it would have totally sufficed, and it would have been the rightmost column.

And also note carefully that 'ATC' does not conflate the two columns into one. That is not the purpose of 'ATC'.
No, ATC does not work like conductivity temperature compensation. We cannot adjust a measured sample pH value at one temperature to an expected sample pH value at another temperature (e.g. 25°C), because we do not know how the pH of a sample varies with temperature. For example, the pH value of a water sample may change rapidly as the result of chemical, physical, or biological processes that are temperature dependent. If we want to know what the pH value of the sample is at a certain temperature, we would have to adjust the sample to that temperature and measure the pH. That is why pH is frequently reported with a temperature measurement. We understand that the pH of a sample is temperature dependent. While ATC does allow us to calibrate accurately and adjust the pH electrode calibration when the temperature changes, ATC can’t correct for sample pH/temperature effects, which are unknown.
https://assets.thermofisher.com/TFS...p-Compensation-pH-Measure-ST-ATCPHMEAS-EN.pdf
What ATC does do is assure you that if you measure pH at 66 degrees C. and find it to be 5.27, that truly is the wort's pH at 66 degrees C., and if you subsequently cool the very same wort sample to 20 degrees C. and measure it again, and this time your meter reads 5.50 pH, that is also its true pH. The bottom line is that pHs are naturally lower at higher temperatures. But for a pH meter to accurately indicate pH at differing temperatures (with pH displayed "as it is", and not "as you would like it to be"), the "slope" of the internal formula used by the meter must be changed to make the meter agree with the temperature. All that 'ATC' does is change this slope so you don't have to correct for an incorrect slope.
 
This is a challenging concept for many home brewers to get their heads around, because they naturally think that the temperature correction (ATC) solves the problem. In fact there are two problems when comparing pHs at different temperatures:

1. Temperature affects the way your device reads pH. A correction needs to be made to normalize (or normalise if you're zed-phobic) readings to a standard (e.g., 68° F). This correction is fairly straightforward to understand and parallels how we think about correcting S.G. based on temperature.

2. Temperature can also directly change the pH of a sample. The ATC correction does not account for the fact that the pH of mash at mash temperatures will be lower than when that mash sample is allowed to cool to room temperature. The problem is that this effect is also a temperature effect, and pH meters advertise built-in temperature correction. So it's easy to see how people misunderstand this:

"Wait, I bought a meter that temperature-corrects my 150° mash pH, but now you're telling me that I still need to let it cool down to room temperature before measuring the pH?"

The best way I've found to explain this to people is to do the following experiment the next time that you brew:

1. Measure the mash pH at mash temp, but put the temperature probe in room temperature water (and the pH probe in the mash). This is effectively what would happen if you had no temperature correction (ATC) on your meter. It's an incorrect reading because the device is doing its internal calibration incorrectly because you tricked the device into thinking the sample is at room temp.

2. Now measure the mash pH at mash temp as you normally would by putting both temp and pH probes in the hot mash. This reading (e.g., 5.30) is a temperature-corrected reading. It is an internal correction that is specific to the meter. This reading puts you on a level playing field with scientists all across the world who use ATC when reporting pH. And, and this is important, if temperature did not directly alter the pH of your mash, you could simply use this value as a true pH reading.

3. Let the same mash sample cool to 68° F. Put the temp probe and pH probe in this cooled sample and measure. What you are likely to discover is that the pH now reads higher (e.g., 5.5). Why? Because temperature changes the true pH value. In this case, heating a sample of wort is figuratively the same as adding some acid to it. And your ATC pH meter has no knowledge of what you're sticking it in. It doesn't know whether temperature fundamentally changes the pH of your mash, or aquarium water, or your hot tub. It just knows how to mechanically correct itself to a standard 68°F.

In short, if you want a mash pH of 5.3 at 68° F, you must measure your mash sample at 68° F even if using an ATC pH meter.
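For the curious, the "slope" that ATC corrects is the electrode's Nernst slope, which grows with absolute temperature. Here is a minimal sketch of the idea, assuming an idealized electrode that reads 0 mV at pH 7 (a simplification; real meters calibrate both offset and slope from buffer solutions, and the example voltage below is arbitrary):

```python
import math

R = 8.314462   # gas constant, J/(mol*K)
F = 96485.332  # Faraday constant, C/mol

def nernst_slope_v(temp_c):
    """Ideal electrode slope in volts per pH unit at a given temperature.
    ~0.0592 V/pH at 25 C, larger when hotter. This slope is the only
    thing ATC corrects for."""
    return R * (temp_c + 273.15) * math.log(10) / F

def ph_from_mv(electrode_mv, temp_c):
    """Convert electrode millivolts to pH, assuming an ideal electrode
    reading 0 mV at pH 7 (positive mV on the acid side)."""
    return 7.0 - (electrode_mv / 1000.0) / nernst_slope_v(temp_c)

mv = 110.0  # hypothetical electrode voltage from a hot (66 C) mash sample
ph_with_atc = ph_from_mv(mv, 66.0)     # decoded with the correct 66 C slope
ph_without_atc = ph_from_mv(mv, 25.0)  # decoded with a room-temp slope,
                                       # i.e. the "temp probe in room-
                                       # temperature water" case above
```

The same electrode voltage decodes to different pH values depending on which slope is used, which is exactly the error step 1 of the experiment demonstrates. Nothing in this correction knows, or can know, how the sample's true pH changes with temperature.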
 
Speculation time: (yes, pure intuitive guesswork, wherein I'm fully aware that intuition generally makes for bad science and therefore the probability that this is correct is very low, so take this with a grain of salt)

Observation series 1) The pH masters of yore consistently tend to inform us that the difference between a Wort measured at ~66 degrees C. and a Wort measured at 20 degrees C. is about 0.35 pH points.

Observation series 2) The current observational reality seems to be more on the order of the difference between a Wort measured at ~66 degrees C. and a Wort measured at 20 degrees C. being about roughly between 0.20 and 0.25 pH points.

I wonder (that is to say I'm speculating, or guessing, or pondering) whether older pH meters without ATC trended toward (or clustered around) 0.35 pH points of measured difference, while modern pH meters with ATC cluster around 0.20 to 0.25 pH points of difference across this 66 to 20 degree C. temperature differential. The 'speculative reason' for the difference would be one of slope without correction vs. slope with correction.

This speculative exercise is merely an intuitive attempt to understand why for decades 0.35 pH points was accepted, and now it is not and now it is something more along the lines of 0.22 pH points that is accepted as well as observed (as for Weyermann).
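One quick way to frame this speculation is to back out the per-degree coefficient each reported differential implies across the 66 to 20 degree C. span (my arithmetic, not from any of the sources quoted in this thread):

```python
span_c = 66.0 - 20.0  # the 46 degree C span discussed above

old_school = 0.35 / span_c  # ~0.0076 pH/degree C, the "masters of yore" figure
modern = 0.22 / span_c      # ~0.0048 pH/degree C, roughly the Weyermann figure
aj_rule = 0.0055            # AJ deLange's coefficient, pH/degree C
```

AJ's 0.0055 sits between the two, somewhat closer to the modern observation than to the historical 0.35-point differential.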
 
Why is there no EBC or ASBC or other "Standard" methodology for the taking of a "Mash pH", whereas the methodology for taking of a "Wort pH" is standardized?

I believe the reason "may be" (I.E., I speculate) that there is no standard "mash", and without a "Standard Mash" there can be no standardized methodology for the taking of a "Mash pH".

Within single infusion mashes people can mash across a broad range of temperatures. I.E., no standard.
One can also undertake single decoction or double decoction mashing. Again, no standard.
And one can step mash with a multiplicity of mash step temperatures and times. Again, no standard.
 
This linked dissertation, which I believe to be sourced from AJ deLange since his name appears within the web link, seems to me (if I'm following it correctly, which I indeed may not be) to strongly indicate that mash pH readings, and the titrations whereby buffering is determined, are physically to be carefully undertaken at 47 to 48 degrees C., and then somewhat awkwardly (IMHO) back-calculated to a nominal 20 degrees C. via an under-verified application of AJ's 0.0055 x degrees C. method. Here is the link:

http://themodernbrewhouse.com/wp-content/uploads/2016/11/DeLange-Estimating-Mash-pH.pdf
So even though AJ may have instructed Palmer, for the purposes of publishing the "Water" book, that all mash pHs are to be taken at 20 degrees C., it does not appear that AJ himself (presuming again that this document is his, and that I'm reading it correctly) practices this, except via dubious formula-derived back-calculation to 20 degrees C.

What is clearly needed here is an explanation from @ajdelange himself, whereby to gain understanding of the presumptions from the source and to clear the air of my own presumptions. The 47-48 degrees C. requirement may be an instrument-related requirement, but again I speculate...

A quote from the document:
This is the DI mash pH of this malt at the approximately 47 °C mash temperature of these experiments. We usually like to work in terms of mash pH specified at room temperature but in this case we need to see what is actually happening in the mash and so took pH measurements at mash temperature. Using the general rule of thumb that measured mash pH increases about 0.0055 for each °C decrease in temperature we would add (47-20)*0.0055 = 0.1485 pH and conclude that the room temperature (20 °C) DI pH for this malt is 5.63.

Interestingly, this states that to understand specifically what is "actually" happening within the mash one must measure it at mash temperature.
 
This linked dissertation indicates that the ASBC standards organization does have a procedure for "Congress Mash", as does the EBC (which is EBC Method 4.5.1). Here is the link:

http://www.regional.org.au/au/abts/1999/jones.htm
A chart of the ASBC Congress mash as taken from the linked source looks to be mighty close to 47 degrees C. for the first 30 minutes of the mash. Perhaps this is where 47 degrees C. is derived in the presumed AJ link I provided in the post above this one. The EBC 4.5.1 standard specifically calls for 45 degrees C. for the first 30 minutes of a Congress Mash, followed by a progressive temperature ramp-up similar to ASBC.

[Attached image: ASBC Congress Mash.png]
 