Yet more evidence that commercial brewers do not mash at 5.2 to 5.6 pH ...

Homebrew Talk - Beer, Wine, Mead, & Cider Brewing Discussion Forum

That would be true, for scientists... except for the fact that many (most?) brewers are not scientists.

Unequivocally so, postings on this and other brewing forums bear out your premise, just as universal applause won't greet opinions on this very subject.

I began brewing with a little knowledge of pH, but without knowing it had any relevance to this subject. That mattered not, as then, in 1963, pH meters were horrendously expensive, large and delicate apparatus requiring space on a laboratory bench. They were still much like the machine introduced at the American Chemical Society meeting in San Francisco as seen here. Although this was an improvement on the one in Fig 1 of this paper given to the IoB in 1923, it wasn't made to be lugged inside a mash tun to take a measurement. When handheld pH meters did appear, their cost was prohibitive to any but businesses with large turnover, by which time much information had already been found and recorded for posterity. Some such findings are contained in that 1923 paper, which also records that these were from and of brewing British beers: Milds, Pale Ales and Stouts. Could that imply lagers were mashed at higher pH?
 
See post #555, paragraph 2.

"The consequence of this is that for mash pH readings taken at room temperature with a well calibrated ATC meter the ideal target is going to be around 5.55 to 5.65 pH, with the rather good presumption that if such a room temperature sample reading had instead been taken at around 150-155 degrees F. it would have been in close proximity to 5.40 pH."

Sorry, that's not quite what I asked. I am not using a pH meter. I am relying on software like yours, Brewfather, etc., which simply estimates a pH. Previous wisdom was to aim for a software generated pH of 5.3 to 5.5. Has that changed?

Thanks
 
What's the TL;DR version of this when using brewing software to estimate the mash pH? Target 5.4-5.6 or something higher now?

Target about 5.55-5.60 (as measured at room temperature, anyway!).

That's what I'm doing now based on all this. I am NOT sure how much it matters to final beer quality, but if I've been brewing wrong for the past 20 years, I figure I might as well brew wrong the next 5, 10, 20 years and see if maybe it makes some difference. I honestly don't know yet.
 
Sorry, that's not quite what I asked. I am not using a pH meter. I am relying on software like yours, Brewfather, etc., which simply estimates a pH. Previous wisdom was to aim for a software generated pH of 5.3 to 5.5. Has that changed?

Thanks

IMHO, yes. Target 5.55 - 5.6 as the mid-range for room temperature and 5.40 at mash temperature. Others may disagree. YMMV
 
Unequivocally so, postings on this and other brewing forums bear out your premise, just as universal applause won't greet opinions on this very subject.

I began brewing with a little knowledge of pH, but without knowing it had any relevance to this subject. That mattered not, as then, in 1963, pH meters were horrendously expensive, large and delicate apparatus requiring space on a laboratory bench. They were still much like the machine introduced at the American Chemical Society meeting in San Francisco as seen here. Although this was an improvement on the one in Fig 1 of this paper given to the IoB in 1923, it wasn't made to be lugged inside a mash tun to take a measurement. When handheld pH meters did appear, their cost was prohibitive to any but businesses with large turnover, by which time much information had already been found and recorded for posterity. Some such findings are contained in that 1923 paper, which also records that these were from and of brewing British beers: Milds, Pale Ales and Stouts. Could that imply lagers were mashed at higher pH?

Interesting to see that electronic pH meters existed back in 1923, whereas (per Wikipedia, at least) the first patent for a pH meter (by Beckman) was not issued until October of 1934. I wonder why it took from 1923 (or earlier) until 1934 for a patent to be issued? What might have set the patented meter apart from earlier efforts?

For grins, Beckman sold 444 pH meters in 1935, and grossed $60,000. That's $135 per meter.
 
What pH optima did he specify for "Wort" prior to adjusting it (whether just pre-, during, or post-boil) so as to achieve 5.0-5.2 post boil and cooling? I've expressly stated that pH should be measured upon cooled Wort, but my distinction (and I believe also Weyermann's) is that at that juncture it is a Wort pH that one is measuring, and no longer is it technically a mash pH, whereby via the application of the Weyermann 0.22 point offset one sees a means to "presume" mash pH from de facto Wort pH.

The EBC standard for measuring Wort pH is to do so at 20 degrees C., but this does not make what is being measured a mash pH. Wort pH is at room temperature and mash pH is at mash temperature.
This is quite wrong. A lower pH reading at mash temperature does not mean the mash is more acid. Nor does a higher pH reading at a low temperature make it more alkaline. The pH scale was only designed to be used at 25°C and does not apply exactly at higher and lower temperatures. For example, at boiling point pure water is still neutral (same number of OH- and H3O+ ions) but its pH is 6.14. It is not more acid at boiling point. At 100°C a neutral pH IS 6.14. The pH value of 7.00 being neutral, only applies at 25°C. Room temperature is near enough to 25°C for most purposes but not exactly so. So there is absolutely no point in quoting a mash pH temperature unless you adjust the pH scale for the temperature of your mash. Decent pH meters adjust the pH reading automatically to what it would be at 25°C. Heat or cold does not make water or aqueous liquids more acid or more alkaline...it merely shifts the pH SCALE (not the pH) from its value at 25°C. See: Temperature Dependence of the pH of pure Water
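For anyone who wants to check the pure-water numbers above, the neutral point can be computed from the ion product of water. A minimal sketch in Python, assuming approximate literature pKw values (not figures from this thread):

```python
# Neutral pH of pure water from the ion product of water (pKw).
# At neutrality [H3O+] equals [OH-], so neutral pH = pKw / 2.
# The pKw values below are approximate literature figures.
PKW = {0: 14.94, 25: 13.995, 50: 13.26, 75: 12.70, 100: 12.28}

def neutral_ph(temp_c):
    """Return the pH of pure (neutral) water at temp_c degrees C."""
    return PKW[temp_c] / 2.0

for t in sorted(PKW):
    print(f"{t:3d} C  neutral pH = {neutral_ph(t):.2f}")
```

The 100 C row reproduces the ~6.14 figure quoted above: the water is still exactly neutral, but the neutral point itself has moved.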
 
This is quite wrong. A lower pH reading at mash temperature does not mean the mash is more acid. Nor does a higher pH reading at a low temperature make it more alkaline. The pH scale was only designed to be used at 25°C and does not apply exactly at higher and lower temperatures. For example, at boiling point pure water is still neutral (same number of OH- and H3O+ ions) but its pH is 6.14. It is not more acid at boiling point. At 100°C a neutral pH IS 6.14. The pH value of 7.00 being neutral, only applies at 25°C. Room temperature is near enough to 25°C for most purposes but not exactly so. So there is absolutely no point in quoting a mash pH temperature unless you adjust the pH scale for the temperature of your mash. Decent pH meters adjust the pH reading automatically to what it would be at 25°C. Heat or cold does not make water or aqueous liquids more acid or more alkaline...it merely shifts the pH SCALE (not the pH) from its value at 25°C. See: Temperature Dependence of the pH of pure Water

I beg to differ, and contend that Weyermann is fundamentally correct in recording a roughly 0.22 pH point difference between a Wort measured at 67 degrees C. and then subsequently measured at 20 degrees C.

An ATC pH meter does not correct pH values to 25 degrees C. It assures that the reading will be more in line with being correct at temperatures other than the calibration temperature. But for a meter calibrated at 25 degrees C. the Wort will read about 0.22 points lower than it will at mash temperature, as the referenced Weyermann literature clearly indicates.

Clearly and technically you can argue (precisely as you are doing) that DI water always has the same amount of extant hydroxyl and hydronium ions, such that regardless of temperature it is de facto neutral. But without an understanding of the slope offset, one cannot assume that an actual measured 5.40 pH Wort at 67 C. will also read on the display of the ATC meter the very same 5.40 pH at the appropriate 20 degrees C. (as per EBC) or 25 degrees C. (per many meters' accompanying literature) simply due to the presence of ATC alone. A Wort at 5.40 pH at full mash temperature will instead be observed to read somewhere around 5.62 pH on the same instrument at 20 degrees C.

So what is incorrect is your statement that "Decent pH meters adjust the pH reading automatically to what it would be at 25°C.".

Will ATC adjust the measured sample pH result to the result expected at 25°C? Does it work like conductivity temperature compensation? No, ATC does not work like conductivity temperature compensation. We cannot adjust a measured sample pH value at one temperature to an expected sample pH value at another temperature (e.g. 25°C), because we do not know how the pH of a sample varies with temperature. For example, the pH value of a water sample may change rapidly as the result of chemical, physical, or biological processes that are temperature dependent. If we want to know what the pH value of the sample is at a certain temperature, we would have to adjust the sample to that temperature and measure the pH. That is why pH is frequently reported with a temperature measurement. We understand that the pH of a sample is temperature dependent. While ATC does allow us to calibrate accurately and adjust the pH electrode calibration when the temperature changes, ATC can’t correct for sample pH/temperature effects, which are unknown.

Source: https://assets.thermofisher.com/TFS...p-Compensation-pH-Measure-ST-ATCPHMEAS-EN.pdf
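To illustrate what ATC does (and does not) correct, here is a sketch of the ideal-electrode slope from the Nernst equation. The 66 C sample temperature and pH values are made-up illustrations, and this is a simplified ideal-electrode model, not any particular meter's firmware:

```python
# Ideal glass-electrode slope from the Nernst equation:
# slope = 2.303 * R * T / F (volts per pH unit, T in kelvin).
# ATC measures the sample temperature and converts electrode
# millivolts to pH using the slope at THAT temperature, instead
# of the slope at the calibration temperature.
R = 8.314      # gas constant, J/(mol*K)
F = 96485.0    # Faraday constant, C/mol

def nernst_slope_mv(temp_c):
    """Ideal electrode slope in mV per pH unit at temp_c."""
    return 2.303 * R * (temp_c + 273.15) / F * 1000.0

def ph_reading(e_mv, temp_c):
    """pH inferred from electrode millivolts using the slope at temp_c."""
    return 7.0 - e_mv / nernst_slope_mv(temp_c)

# Hypothetical example: a true pH 5.40 sample at 66 C generates
# this electrode voltage.
e = nernst_slope_mv(66.0) * (7.0 - 5.40)

print(round(ph_reading(e, 66.0), 2))  # ATC (66 C slope): about 5.4
print(round(ph_reading(e, 25.0), 2))  # no ATC (25 C slope): about 5.18
```

Note that this electrode-slope error is the only thing ATC removes; it says nothing about how the chemistry of the sample itself shifts with temperature, which is the Thermo Fisher quote's point.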
 
The question even came up when the "Water" book was being written, indicating late panic and confusion. I quote whom I presume to be Martin Brungard:
During the production of Palmer and Kaminski’s Water book, John Palmer sent a worried message to AJ DeLange and me that questioned if the typical brewing pH range referred to room- or mash-temperature measurement. AJ’s response follows: “If that range is supposed to be the optimum at mash temperature and mash pH is supposed to be 0.3 less than room temperature, the proper range at room temperature would be 5.5 - 5.9 and we would be hard pressed to explain the large number of brewers who noted great improvement in their beers when they got mash pH down to the 5.4-5.6 range. Certainly that has been my experience.” So, all of us agreed that our recommended range of 5.2 to 5.6 is based on ROOM-TEMPERATURE MEASUREMENT.



This (i.e., consensus based upon purely intuitive speculation) is simply not how science is done.
 
This is quite wrong. A lower pH reading at mash temperature does not mean the mash is more acid. Nor does a higher pH reading at a low temperature make it more alkaline. The pH scale was only designed to be used at 25°C and does not apply exactly at higher and lower temperatures. For example, at boiling point pure water is still neutral (same number of OH- and H3O+ ions) but its pH is 6.14. It is not more acid at boiling point. At 100°C a neutral pH IS 6.14. The pH value of 7.00 being neutral, only applies at 25°C. Room temperature is near enough to 25°C for most purposes but not exactly so. So there is absolutely no point in quoting a mash pH temperature unless you adjust the pH scale for the temperature of your mash. Decent pH meters adjust the pH reading automatically to what it would be at 25°C. Heat or cold does not make water or aqueous liquids more acid or more alkaline...it merely shifts the pH SCALE (not the pH) from its value at 25°C. See: Temperature Dependence of the pH of pure Water

Hmm... how much does absorption vs. off-gassing of CO2 have to do with pH changes at different temperatures? Or should we also assume all measurements are taken in a vacuum? It appears to me that as temperature increases, CO2 would (quickly) exit the solution, resulting in an increase in hydronium (H3O+), effectively lowering the pH, such that it makes sense to see a drop to ~6 at boiling with a normal solution at earthly atmospheric conditions. In which case, the pH really *is* lower at the boiling point, not neutral, at least initially if not later on in the equilibrium condition. I don't have any horse in this race but it's interesting to think about anyway.
 
This is quite wrong. A lower pH reading at mash temperature does not mean the mash is more acid. Nor does a higher pH reading at a low temperature make it more alkaline. The pH scale was only designed to be used at 25°C and does not apply exactly at higher and lower temperatures. For example, at boiling point pure water is still neutral (same number of OH- and H3O+ ions) but its pH is 6.14. It is not more acid at boiling point. At 100°C a neutral pH IS 6.14. The pH value of 7.00 being neutral, only applies at 25°C. Room temperature is near enough to 25°C for most purposes but not exactly so. So there is absolutely no point in quoting a mash pH temperature unless you adjust the pH scale for the temperature of your mash. Decent pH meters adjust the pH reading automatically to what it would be at 25°C. Heat or cold does not make water or aqueous liquids more acid or more alkaline...it merely shifts the pH SCALE (not the pH) from its value at 25°C. See: Temperature Dependence of the pH of pure Water
A mash is possibly as far as you can get from "pure water". It is indeed a complex buffered system that will shift its equilibrium point with changing temperature as temperature affects the dissociation constant of its dissolved components in ways that are not easy to predict, hence the need for a standardized measurement temperature.
 
Hmm... how much does absorption vs. off-gassing of CO2 have to do with pH changes at different temperatures?
If you want to see the predicted shift in pH due only to temperature changes you will obviously have to set up a CO2-free test environment. If you don't, then CO2 solubility will become the prevalent factor and you will see a different behaviour, which of course can still be predicted by taking this factor into account.
I'm assuming you were referring to pH shift in DI water. If you were referring to dissolved CO2 effects on mash pH then you can write them off as being totally negligible. There is so little dissolved CO2 in there, and its dissociation constant is so low anyway, that the buffering capacity of the mash completely overshadows any effect CO2 might have on final pH.

For reference, the equilibrium concentration of CO2 at 25°C exposed to an atmospheric CO2 content of 410ppm is only 0.00061 g/l which is really not that much compared to the 5.5 g/l of carbonated beer.
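That 0.00061 g/l figure checks out with Henry's law. A quick sketch, assuming the common literature value of about 0.034 mol/(L·atm) for the Henry's law constant of CO2 in water at 25 C:

```python
# Equilibrium dissolved CO2 from Henry's law: c = KH * p_CO2.
KH_CO2 = 0.034    # mol/(L*atm), CO2 in water at 25 C (literature value)
MW_CO2 = 44.01    # g/mol, molar mass of CO2
p_co2 = 410e-6    # atm, partial pressure of 410 ppm atmospheric CO2

c_mol = KH_CO2 * p_co2    # equilibrium concentration in mol/L
c_g = c_mol * MW_CO2      # converted to g/L
print(f"{c_g:.5f} g/L")   # about 0.00061 g/L
```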
 
I must inject here (as if a broken record) that if one is going to make this change and begin targeting a room temperature mash pH of ballpark 5.6, they must also further downstream (before, during, or post boil) additionally adjust with an acid addition such that the Wort exits boil and cooling at a measured pH of 5.0 to 5.2.
 
I don't see anything in that pdf about pH measurement temp.

Brew on :mug:
Just for the record I am not a chemist... but if temperature is ignored entirely the pH measurement is not accurate. pH is a touchy thing to measure, forget litmus paper and titration. A good meter and temp. correction....
 
It's a long read, but to be aware of the complication of measuring pH in earlier times, it is necessary to read that 1923 paper. pH meters have had a digital readout for only a handful of years. These are fed by electronics beyond the comprehension of most, and include a probe, with calibration buffers, coming off the shelf via Amazon or eBay.

Take a moment to think about those first 50 years of pH meters, which were nothing like those modern devices. Here is a link to a patent from 1969, after I started brewing. Observe the probe, and notice the output that would necessarily require solving a mathematical equation to obtain a result. To my mind, for most of the years that pH meters existed it was impractical to do anything but take a sample of wort to the lab and carry out the job at standard temperature, which was 18C in the UK in 1923.

The reason we have confusion today is only because of meters with digital readout and temperature compensation that are not fully appreciated or properly understood.
 
Ah, this thread is like Déjà vu all over again. Thinking the whole pH readings at room or mash temperature debate is settled, well think again. Let's say you have an expensive pH meter. One that can repeatedly be calibrated and used to sample mash temperature wort. The wort drawn and sampled hot right from the mash tun had a pH reading of 5.2.

The wort sample is then cooled to room temperature and another pH reading is taken. This time the pH meter is calibrated using room temperature calibration solution. What is the pH reading of the cooled wort sample? Are the two pH readings the same?
 
Ah, this thread is like Déjà vu all over again. Thinking the whole pH readings at room or mash temperature debate is settled, well think again. Let's say you have an expensive pH meter. One that can repeatedly be calibrated and used to sample mash temperature wort. The wort drawn and sampled hot right from the mash tun had a pH reading of 5.2.

The wort sample is then cooled to room temperature and another pH reading is taken. This time the pH meter is calibrated using room temperature calibration solution. What is the pH reading of the cooled wort sample? Are the two pH readings the same?

I'm going to guess ~5.42 pH, and I presume they are both the same. The only issue is that whereas no presumption is involved for a solo mash temperature reading of 5.2 pH, presumption is clearly required for a solo room temperature 5.42 pH whereby to back read it to 5.2 during the mash (unless one takes both readings upon the same sample and indeed winds up at 5.42 at room temperature after hitting 5.2 at mash temperature).
 
A mash is possibly as far as you can get from "pure water". It is indeed a complex buffered system that will shift its equilibrium point with changing temperature as temperature affects the dissociation constant of its dissolved components in ways that are not easy to predict, hence the need for a standardized measurement temperature.

This nails it. Since the temperature-related pH slope compensation for pure DI water is well known, and thus we indeed know how the pH of DI water changes with temperature, this knowledge could be incorporated into a pH meter so as to fully correct the pH of DI water at any liquid water temperature to its pH at 20 or 25 degrees C. But then this pH meter would be capable of accomplishing this feat only for pure DI water, and for nothing else, and its displayed compensated pH output for DI water would always read 7.00. And since pure DI water (free of any CO2 contamination) is always de facto neutral, there would be absolutely no purpose in going through the effort to build such a useless pH meter. This is why the quote I included in post #570 above states: "We cannot adjust a measured sample pH value at one temperature to an expected sample pH value at another temperature (e.g. 25°C), because we do not know how the pH of a sample varies with temperature."
 
I beg to differ, and contend that Weyermann is fundamentally correct in recording a roughly 0.22 pH point difference between a Wort measured at 67 degrees C. and then subsequently measured at 20 degrees C.

An ATC pH meter does not correct pH values to 25 degrees C. It assures that the reading will be more in line with being correct at temperatures other than the calibration temperature. But for a meter calibrated at 25 degrees C. the Wort will read about 0.22 points lower than it will at mash temperature, as the referenced Weyermann literature clearly indicates.

Clearly and technically you can argue (precisely as you are doing) that DI water always has the same amount of extant hydroxyl and hydronium ions, such that regardless of temperature it is de facto neutral. But without an understanding of the slope offset, one cannot assume that an actual measured 5.40 pH Wort at 67 C. will also read on the display of the ATC meter the very same 5.40 pH at the appropriate 20 degrees C. (as per EBC) or 25 degrees C. (per many meters' accompanying literature) simply due to the presence of ATC alone. A Wort at 5.40 pH at full mash temperature will instead be observed to read somewhere around 5.62 pH on the same instrument at 20 degrees C.

So what is incorrect is your statement that "Decent pH meters adjust the pH reading automatically to what it would be at 25°C.".



Source: https://assets.thermofisher.com/TFS...p-Compensation-pH-Measure-ST-ATCPHMEAS-EN.pdf
I'm not fundamentally arguing against your points. Of course, you can record and report the pH reading at whatever temperature you want. However, I think it's confusing to quote the lower pH values taken at mash temperature, when people might think that it means the mash is more acidic and might affect the enzymes differently. Also, those using pH meters with and without ATC are going to get different values at the higher temperature, whereas they should agree closely at around room temperature.
One thing that occurs to me (I haven't read it anywhere connected with brewing) is that buffer solutions could possibly have a different acidity at lower and higher temperatures if the balance between the buffering agents is favored by temperature one way or the other. For example, CaSO4.2H2O or gypsum is more soluble in water at lower temperatures and less soluble at higher temperatures, so its effect on pH balance could vary with temperature. Also in bicarbonate buffer, heating drives off the dissolved CO2 so might be expected to raise pH. In my tap water, boiling the water raises the pH from 7.3 to 8.0 but leaves the water with a much weaker buffering power due to loss of calcium carbonate. The grain mash is basically a big, complicated buffer with phosphates, bicarbonates and plenty more besides. So, on top of the artifact of the pH scale shown for pure water, there might be other real effects of temperature on acidity in buffer solutions and therefore the typical mash. Since the ATC can give compensation for the pure water temperature discrepancy in pH readings it might in the end be best to measure pH at mash temperature and take the ATC compensated reading, which may truly differ from a reading taken at 25°C or around room temperature.
 
Since the ATC can give compensation for the pure water temperature discrepancy in pH readings it might in the end be best to measure pH at mash temperature and take the ATC compensated reading, which may truly differ from a reading taken at 25°C or around room temperature.
ATC does not do that at all. All that the ATC function does is compensate for the change in probe response at different temperatures.

If one is using a non-ATC meter at a temperature that differs from the calibration temperature then the measurement has to either be corrected manually (using a correction table provided with the instrument) or discarded.

Calibration solutions do suffer from a shift in pH at temperatures that differ from the standard calibration temperature. Usually there is a table showing the change either on the label or in the accompanying documentation.

It doesn't really matter at what temperature mash pH is taken, as long as everybody is taking the reading at the exact same temperature; otherwise the different measurements cannot be compared with each other, hence the need for a standard temperature.
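Those label tables can be put to use in code. A sketch using illustrative values resembling a typical pH 7.00 buffer label (the numbers are hypothetical, not taken from any specific product):

```python
# Hypothetical buffer-label table: certified pH of a nominal
# 7.00 buffer at several temperatures (illustrative values only).
BUFFER_7_TABLE = [(0, 7.12), (10, 7.06), (20, 7.02), (25, 7.00),
                  (30, 6.99), (40, 6.98), (50, 6.97)]

def buffer_ph(temp_c):
    """Linearly interpolate the buffer's certified pH at temp_c."""
    pts = BUFFER_7_TABLE
    if temp_c <= pts[0][0]:
        return pts[0][1]
    for (t0, p0), (t1, p1) in zip(pts, pts[1:]):
        if temp_c <= t1:
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)
    return pts[-1][1]

# Calibrating at 22 C? Use the table value, not the nominal 7.00:
print(round(buffer_ph(22.0), 3))
```

This is the correction a non-ATC user would apply by hand from the bottle label when calibrating away from 25 C.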
 
Since the ATC can give compensation for the pure water temperature discrepancy in pH readings ....

But ATC doesn't accomplish what you seemingly hope it accomplishes. And the reason (falsifying what you claimed within your initial post) is that factually acidic liquids are indeed more acidic at higher temperatures. I'll quote another definitive source (as seen on page 10):

Every measuring solution has a characteristic temperature and pH behaviour (temperature coefficient). In general one has to assume that a temperature change results in a pH change (see buffer/temperature table). The reason for this is the temperature dependent dissociation which causes a change in the H+ concentration. This pH change is real, not a measuring error, and cannot be compensated for by use of ATC. This has to be taken into consideration if pH values obtained at different temperatures are to be compared. Experimentally, samples should be measured at the same temperature.
https://www.mt.com/mt_ext_files/Edi..._0x000248ff00025c9a00093c4a_files/guideph.pdf
 
Here is an online calculator that agrees with your outlook, but is actually simulating what you will observe with either an ATC pH meter straight up, or a non ATC meter after the mathematical corrections have been applied. If you input a pH of 5.20 at a temperature of 152 degrees F. and hit "calculate" its output (as adjusted to standardized room temperature) is amazingly 5.4207. There is your ~0.22 pH offset, but this calculator is doing what the reputable links I've listed above claim is simply not possible for an ATC pH meter. Visit the online calculator here.

https://www.hamzasreef.com/Contents/Calculators/PhTempCorrection.php
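For what it's worth, the calculator's output is consistent with a simple linear correction of roughly 0.0053 pH per degree C. The slope below is inferred from its 5.20-at-152-F to 5.4207 example; the site does not document its actual formula, so treat this as a reverse-engineered approximation:

```python
# Linear temperature correction inferred from the calculator's
# example output (slope is an assumption, not a documented formula).
SLOPE = 0.0053    # pH units per degree C (inferred)

def to_room_temp(ph_hot, temp_f):
    """Estimate the 25 C reading of a sample measured hot at temp_f."""
    temp_c = (temp_f - 32.0) * 5.0 / 9.0
    return ph_hot + SLOPE * (temp_c - 25.0)

print(round(to_room_temp(5.20, 152.0), 2))   # about 5.42
```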
 
ATC does not do that at all. All that the ATC function does is compensate for the change in probe response at different temperatures.

If one is using a non-ATC meter at a temperature that differs from the calibration temperature then the measurement has to either be corrected manually (using a correction table provided with the instrument) or discarded.

Calibration solutions do suffer from a shift in pH at temperatures that differ from the standard calibration temperature. Usually there is a table showing the change either on the label or in the accompanying documentation.

It doesn't really matter at what temperature mash pH is taken, as long as everybody is taking the reading at the exact same temperature; otherwise the different measurements cannot be compared with each other, hence the need for a standard temperature.
You are right there Vale71. ATC does not compensate for different readings due to effects on the sample being measured, but rather corrects for certain errors due to the electrode, such as electrode slope effects. I was quite wrong there. My apologies to you and anyone else involved, including Silver is Money. The only way to get accurate readings is, as you say, to measure at a similar temperature to that at which the pH meter was calibrated with the buffer solution, which is luckily what I've been doing anyway. Since buffer solutions are made up to give the right pH at 25°C or at least around room temperature, that's the way to go.
 
But ATC doesn't accomplish what you seemingly hope it accomplishes. And the reason (falsifying what you claimed within your initial post) is that factually acidic liquids are indeed more acidic at higher temperatures. I'll quote another definitive source (as seen on page 10):

https://www.mt.com/mt_ext_files/Edi..._0x000248ff00025c9a00093c4a_files/guideph.pdf
Interesting reference. I'm not saying you are wrong but the figures on Page 10 of your reference don't seem to back up your statement that "factually acidic liquids are indeed more acidic at higher temperatures". HCl (0.001 molar) was the only acid shown in the table and it didn't change but stayed at a pH of 3.00 from 20°C to 30°C. There is also the other riddle about whether a different pH reading at a higher temperature necessarily means a different acidity, as in the case of DI water. Perhaps the DI water anomaly is due to unadjustable electrode error rather than the water itself.
 
Here is an online calculator that agrees with your outlook, but is actually simulating what you will observe with either an ATC pH meter straight up, or a non ATC meter after the mathematical corrections have been applied. If you input a pH of 5.20 at a temperature of 152 degrees F. and hit "calculate" its output (as adjusted to standardized room temperature) is amazingly 5.4207. There is your ~0.22 pH offset, but this calculator is doing what the reputable links I've listed above claim is simply not possible for an ATC pH meter. Visit the online calculator here.

https://www.hamzasreef.com/Contents/Calculators/PhTempCorrection.php
Thanks for the calculator. What will they come up with next?!
 
Interesting reference. I'm not saying you are wrong but the figures on Page 10 of your reference don't seem to back up your statement that "factually acidic liquids are indeed more acidic at higher temperatures". HCl (0.001 molar) was the only acid shown in the table and it didn't change but stayed at a pH of 3.00 from 20°C to 30°C. There is also the other riddle about whether a different pH reading at a higher temperature necessarily means a different acidity, as in the case of DI water. Perhaps the DI water anomaly is due to unadjustable electrode error rather than the water itself.

Well, soon you will find that in the world of brewing there is a lot of dogmatic word-of-mouth (and print) circular reasoning that passes as science (sometimes for many decades), both at the amateur and professional level (as attested by Charles Bamforth). Sometimes this circular clutter of obfuscation is so pronouncedly dark and cloudy, and so rooted in print (which, as @dmtaylor says, makes it from that juncture on irrefutably true), that it confounds the entire lot of us. If it matters, I'm not 100% certain or dogmatic as to what to believe in regard to what is out there in print concerning this pH/temperature and ATC/non-ATC matter either, so welcome aboard and happy brewing.
 
Well, soon you will find that in the world of brewing there is a lot of dogmatic word-of-mouth (and print) circular reasoning that passes as science (sometimes for many decades), both at the amateur and professional level (as attested by Charles Bamforth). Sometimes this circular clutter of obfuscation is so pronouncedly dark and cloudy, and so rooted in print (which, as @dmtaylor says, makes it from that juncture on irrefutably true), that it confounds the entire lot of us. If it matters, I'm not 100% certain or dogmatic as to what to believe in regard to what is out there in print concerning this pH/temperature and ATC/non-ATC matter either, so welcome aboard and happy brewing.
Thanks for your interesting discussions and good luck to you too in your brewing.
 
Thanks for your interesting discussions and good luck to you too in your brewing.

Thank you kindly. My best brewing days are well behind me at this juncture, but there are hopefully brighter days for brewing on the horizon that lies ahead. I'm drifting more to the theoretical side of things at this juncture.
 
Thank you kindly. My best brewing days are well behind me at this juncture, but there are hopefully brighter days for brewing on the horizon that lies ahead. I'm drifting more to the theoretical side of things at this juncture.
Home brewing has been great during the COVID lockdown when no pubs have been open in the UK for long periods. The bars still can't serve alcohol unless with a meal to people seated at a table. I particularly like cask-conditioned ales served from cellars by hand pumps, so home brewed bottle-conditioned ale is the only reasonable substitute. I can't drink a whole cask in the few days it lasts.
 
Here is a dissertation by Hach, a well known and reliable manufacturer of pH meters. Basically it states that while ATC can compensate for temperature fluctuations from the ideal for buffer solutions, the same is not possible for actual pH samples since the meter has no way of knowing in advance the specific pH/temperature relationship for a sample.

https://at.hach.com/asset-get.download.jsa?id=25593629889
Some quotes:
Compared with pH buffer solutions, samples can NOT be temperature compensated.
While pH buffer solutions are well known and the pH meters can automatically do temperature compensation (ATC), the temperature behavior of real samples is not known. ATC with samples is normally not possible.
 
I thought it had been decided in this thread that ATC pH meters can compensate for electrode errors in pH measurement (according to the Nernst equation), but not for any particular sample differences in pH at different temperatures. Correcting for electrode errors using ATC would still be useful in samples, even though all errors may not be known. No point having pH meters with ATC in brewing if you can't use them on samples at different temperatures.
I suppose the ATC pH meters might still be useful for calibrating with buffers at say an ambient room temperature so that readings could be corrected to 25°C, or whatever the specified buffer temperature states, instead of having to warm buffers and samples up to exactly the right temperature every time.
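The electrode-slope correction being discussed can be sketched in a few lines of Python. This is an illustration of the Nernst-equation arithmetic only, not any vendor's firmware, and the 66 °C / pH 5.40 numbers are invented for the example; as the Hach document says, it does not (and cannot) model how the sample's own chemistry shifts its true pH with temperature.

```python
# Illustrative sketch only: the electrode-slope (Nernst) correction that
# ATC performs. It cannot model sample-chemistry pH shifts.

R = 8.314     # gas constant, J/(mol*K)
F = 96485.0   # Faraday constant, C/mol

def nernst_slope_mv(temp_c):
    """Ideal electrode slope, in mV per pH unit, at the given Celsius temp."""
    return 1000.0 * 2.303 * R * (temp_c + 273.15) / F

def mv_to_ph(emf_mv, temp_c):
    """Convert a raw EMF (relative to an assumed pH-7 isopotential point)
    to pH, using the slope for the electrode's actual temperature."""
    return 7.0 - emf_mv / nernst_slope_mv(temp_c)

# A pH 5.40 solution at 66 C (about 151 F) would produce this EMF:
emf = (7.0 - 5.40) * nernst_slope_mv(66.0)

ph_with_atc = mv_to_ph(emf, 66.0)     # ATC applies the 66 C slope -> 5.40
ph_without_atc = mv_to_ph(emf, 25.0)  # wrongly assuming a 25 C slope -> ~5.18
print(round(ph_with_atc, 2), round(ph_without_atc, 2))
```

So ATC removes a real, purely instrumental error of a couple tenths of a pH unit over a mash-to-room temperature span, which is exactly why it is still worth having even though it says nothing about the sample itself.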
 
From the HANNA website regarding how ATC works.

“Temperature affects the activity of the ions in solution but does not affect the concentration, therefore meters with temperature compensation correct for this condition.”

“When measuring pH using a pH electrode the temperature error from the electrode varies based on the Nernst Equation as 0.03pH/10C/unit of pH away from pH7. The error due to temperature is a function of both temperature and the pH being measured.

Automatic temperature compensation requires input from a temperature sensor and constantly sends a compensated pH signal to the display. Automatic temperature compensation is useful for measuring pH in systems with wide variations in temperature.”

FWIW, my pH meter is calibrated at ~75°F and mash samples are cooled to ~75°F before taking a pH reading.
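HANNA's 0.03 pH / 10 °C / unit-of-pH-from-7 figure falls straight out of the Nernst slope being proportional to absolute temperature. A quick back-of-envelope check (my own arithmetic, not taken from HANNA):

```python
# The Nernst slope scales with absolute temperature, so a meter that
# ignores a temperature change dT mis-reads by roughly
# (pH - 7) * dT / T in magnitude (electrode error only).

T_REF_K = 298.15  # 25 C in kelvin, a typical calibration temperature

def slope_error_ph(ph_true, dt_c, t_ref_k=T_REF_K):
    """Approximate electrode-only pH error from ignoring dt_c degrees C."""
    return (ph_true - 7.0) * dt_c / t_ref_k

# One pH unit below 7, sample 10 C warmer than calibration:
err = slope_error_ph(6.0, 10.0)
print(round(abs(err), 3))  # ~0.034 pH, i.e. HANNA's ~0.03 figure
```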
 
My confusion lies within only the focused question regarding whether or not pH 5.20 at 152 degrees F. and pH 5.42 at room temperature for Wort are saying the same thing. I've wavered both ways on this, stating 'same' at some junctures and 'different' at other junctures. But:

10^(-5.2) / 10^(-5.42) = 10^(0.22) ≈ 1.66

So 66% more H+ ions (or H3O+) by this measure are extant at 152 degrees F. vs room temp.
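That arithmetic checks out as a one-liner:

```python
# Ratio of hydronium-ion activities implied by the two readings above:
ratio = 10**-5.2 / 10**-5.42   # = 10**0.22
print(round(ratio, 2))  # ~1.66, i.e. about 66% more H3O+ at pH 5.20
```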
 
Are you reading into this that a 5.20 pH at 152 degrees F. will (via ATC compensation) also display as 5.20 pH at room temperature for Wort?
 
No, not at all, just the opposite. I have seen firsthand the difference between pH readings of the same wort at mash temperature and at room temperature. The wort pH taken at mash temperature was 5.20, and the reading of the same wort at room temperature was ~0.20 pH higher.

Our club did a collaboration brew with Red Tank Brewing. Two things struck me: their head brewer measured gravity in Plato and took pH readings at mash temperature. I had done neither as a homebrewer.
 
The ideal pH ranges for the enzymes of import to beer brewing are yet another area of contention, as very little of the published ideal-pH-range data specifies the temperatures at which those pH ranges were measured. By contrast, the temperatures at which the various enzymes reach peak activity, and at which they begin to denature, are well documented.
 
It was not until I saw repeated in-process (mash temp) metering occurring with industrial meters in the Homebrew setting that I started to get spun around the axles.

I do know this: I know a guy who is a pretty consistent and experienced brewer, with maybe the most sophisticated brewing system anyone on this forum has in their home (maybe the most sophisticated home brewery in this country, or the world for that matter) and when he measures in-process, he is targeting 5.4.

The beer is beyond excellent in my subjective and objective opinion. I don’t know what that means in the big picture but just my $0.02.
 