Does the volume of mash water actually matter for mash pH adjustment software?


Silver_Is_Money

Larry Sayre, Developer of 'Mash Made Easy'
Givens:
15L or 30L of DI mash Water, with this DI water at pH 7.00
6 Kg. Grist
Aggregate grist buffer = 45 mEq/Kg.pH
Aggregate pHDI of the grist = 5.65
Target pH = 5.40
88% Lactic Acid at pH 5.40 has an acid strength of 11.451 mEq/mL

For brevity in what is already going to get long, you're going to have to trust me on these two pre-calculated givens:
1) mEq's of acid required to move 15L of 7 pH DI water to pH 5.4 = 0.05822
2) mEq's of acid required to move 30L of 7 pH DI water to pH 5.4 = 0.11643

mEq's of acid required to move the grist alone to pH 5.4:
Delta_pH = (5.65 - 5.40) = mEq's/(45 x 6)
mEq's = 0.25 x 45 x 6 = 67.5

3) mEq's required to move 15L of DI + the Grist to pH 5.4
= 67.5 + 0.05822 = 67.55822

4) mEq's required to move 30L of DI + the Grist to pH 5.4
= 67.5 + 0.11643 = 67.61643

Let's first assume that our software simply ignores the DI water volume in the mash, and thereby calculates that:
67.5/11.451 = 5.89468 mL of 88% Lactic Acid addition called for to hit pH 5.4

Let's next assume that our software adds in the water for 15L:
67.55822/11.451 = 5.899766 mL of 88% Lactic Acid addition called for to hit pH 5.4

Let's lastly assume that our software adds in the water for 30L:
67.61643/11.451 = 5.904849 mL of 88% Lactic Acid addition called for to hit pH 5.4

For the 15L mash water case, our software that completely ignores water volume shorts the lactic acid addition by:
5.899766 - 5.89468 = 0.005086 mL
0.005086 x 11.451 = 0.05824 mEq's

For the 30L mash water case, our software that completely ignores water volume shorts the lactic acid addition by:
5.904849 - 5.89468 = 0.010169 mL
0.010169 x 11.451 = 0.11644 mEq's

For the 15L mash volume our software's pH error is therefore:
Delta_pH = pH_error = 0.05824/(45 x 6)
pH_error = 0.000215704 pH points

For the 30L mash volume our software's pH error is therefore:
Delta_pH = pH_error = 0.11644/(45 x 6)
pH_error = 0.00043126 pH points

Conclusion: Due to strong grist buffering vs. nil DI water buffering, a mash's water volume can safely be ignored by software, as for our above examples the maximum error induced by ignoring the water is 0.00043 pH points. It is only that which is in the water that actually matters; the water itself does not. And technically all water can be considered to be DI water plus added minerals, whether added by Mother Nature or by the brewer. The software only needs to consider the minerals and the grist, not the water.
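
For anyone who wants to poke at the arithmetic themselves, here is a minimal sketch of it in Python rather than Excel. The variable names are mine, and the two DI water mEq figures are simply taken as the pre-calculated givens above rather than re-derived:

# Givens from above
grist_kg = 6.0              # grist mass, kg
grist_buffer = 45.0         # aggregate grist buffering, mEq/(kg*pH)
grist_ph_di = 5.65          # aggregate pHDI of the grist
target_ph = 5.40
acid_strength = 11.451      # 88% lactic acid at pH 5.40, mEq/mL

# mEq needed to move the DI water alone from pH 7.00 to pH 5.40 (pre-calculated givens)
water_meq = {0: 0.0, 15: 0.05822, 30: 0.11643}

# mEq needed to move the grist alone from pH 5.65 to pH 5.40
grist_meq = (grist_ph_di - target_ph) * grist_buffer * grist_kg   # 67.5

for litres, w_meq in water_meq.items():
    acid_ml = (grist_meq + w_meq) / acid_strength
    ph_error_if_ignored = w_meq / (grist_buffer * grist_kg)
    print(f"{litres:>2} L mash water: {acid_ml:.6f} mL lactic acid, "
          f"pH error if the water is ignored = {ph_error_if_ignored:.6f}")

The 0 L row is the "ignore the water entirely" case; the gap between its acid volume and the other two rows is the shorting described above, and the last column is the resulting pH error (any differences in the final digit vs. the figures above are just rounding).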
 

It's one of those things that you calculate because you can, and you're better for doing so. If you are taking the time to write a pH estimation algorithm, it is my personal opinion that you should include everything possible.
 
The magnitude of error introduced by crude volumetric means of acid addition to the mash water alone grossly dwarfs all water volume concerns. As does the incorrect presumption that one's particular lot of nominally 88% Lactic Acid is 88.0000...% (hint: it isn't; some has been measured as high as 92%). As does the inaccuracy of the pH meter itself: even if meters can display to 0.01 pH points, they generally only certify 0.02 pH point accuracy in their reading, and that certification requires perfect calibration using perfect buffers and perfect sample temperature, plus no stirring error (which most people induce), plus several minutes of undisturbed pH meter rest within the sample to reach full reading stability (which no one allows, thinking that the rapidly appearing stability indicator symbol actually means absolute stability, which it doesn't). As does the mega-gross error in presuming that your grist actually behaves, as to its acidity, the way the software merely presumes it will. As does the fact that the software totally ignores the quite noticeable differences in grist acidity induced by step mashing vs. single infusion mashing vs. decoction mashing, etc. No matter how many trivial variables are accounted for, this is still merely ballpark science.

If the volume of water mattered, why do the highly critical titrations carried out in chemical labs, which are not ballparking, always ignore the impact of the DI water carrier of the sample being titrated (at least in my experience in such settings)?
 

Sounds like you are advocating an "I can't measure it right so why should I care about modeling any of it" approach.

A model should account for any and every constituent part of the mash system. Are some of those likely to be small enough to be ignored? Of course. Should you ignore them? I can't see why you would. If I'm in Excel already writing a pH algorithm, I'll include anything that can be calculated. Why? Because we should separate the modeling from the measuring. The model assumes you can measure to the highest degree of accuracy possible. It makes no concessions to measurement error, because it's a math model and that's not its job. It isn't trying to replace measurement.

As for the effect of the volume of water on the mEq of the water itself, we know it matters little for DI water. But for mineralized water... it still does not have a huge effect. But I calculate it anyway because, why not?
 
In light of my having measured it as seen above, which is the very purpose of this thread (albeit having found it wanting), why suggest this?
 
I don't use DI water, as my tap water is soft enough that I am still adding minerals to every beer. But there are minerals in there, so how would software account for the naturally occurring minerals if I didn't tell it how much water I was using?
 
The post I quoted could be misconstrued by the passerby as an endorsement of the hopelessness inherent in measuring pH.

I agree with you that the water's contribution alone is small, but your measurements won't sway me from including it, because to me modeling and measuring are in some ways two different pursuits. We strive to model the mash system the best we can, and we strive to measure the results the best we can.

We will always be somewhat behind the 8-ball on the latter, but that doesn't mean we don't include as much of the former as we can.
 
For brevity in what is already going to get long you're going to have to trust me on these two pre-calculated givens:
1) mEq's of acid required to move 15L of 7 pH DI water to pH 5.4 = 0.05822
2) mEq's of acid required to move 30L of 7 pH DI water to pH 5.4 = 0.11643

A highly simplified method of arriving at the above calculated givens (which ignores the H+ and OH- contributions of the water itself):

10^-5.4 = 0.000003981 moles/Liter of H+ ions from the added acid, or 0.003981 mmoles/L, or for monoprotic H+, mEq's/L
(H+ (or protons if you will) being monoprotic by definition, whereby valence is 1, we conclude that their mmoles/L = their mEq's/L)

1) For 15L of water this method derives that 15L x 0.003981 mEq's/L = 0.059715 mEq's of H+ (as opposed to 0.05822 actual)
2) For 30L of water this method derives that 30L x 0.003981 mEq's/L = 0.119432 mEq's of H+ (as opposed to 0.11643 actual)

Even here the impact of the water itself is yet again shown to be effectively inconsequential.
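
If it helps, that simplified method is only a couple of lines of arithmetic. A quick sketch of it in Python (nothing here that Excel couldn't also do; any difference in the final digit vs. the figures above is just rounding of the 0.003981 term):

target_ph = 5.4
h_meq_per_litre = (10 ** -target_ph) * 1000   # mol/L of H+ -> mmol/L; monoprotic, so this also equals mEq/L

for litres in (15, 30):
    print(f"{litres} L of DI water: {litres * h_meq_per_litre:.6f} mEq of H+")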
 
I don't use DI water, as my tap water is soft enough that I am still adding minerals to every beer. But there are minerals in there, so how would software account for the naturally occurring minerals if I didn't tell it how much water I was using?

To clarify, Larry is talking about the contribution of the water itself sans any mineralization. You can break the source water into a few components (there's a quick sketch of the unit conversions after the list):

1.) the mEq contribution of the “pure water” (de-ionized)
2.) the mEq contribution of the alkalinity of the water
3.) the mEq contribution of the calcium and magnesium of the water
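
Here is a rough sketch (Python, with made-up ppm values, not anyone's actual water report) of the unit conversions behind items 2 and 3: ppm divided by the ion's equivalent weight gives mEq/L, and multiplying by the volume gives mEq. How much acidifying or alkalizing weight a given model then assigns to the calcium and magnesium is a separate modeling question:

volume_l = 30.0               # mash water volume, hypothetical
alkalinity_ppm = 40.0         # alkalinity as CaCO3 (equivalent weight 50.0)
calcium_ppm = 25.0            # Ca++ (equivalent weight 20.04)
magnesium_ppm = 6.0           # Mg++ (equivalent weight 12.15)

alkalinity_meq = alkalinity_ppm / 50.0  * volume_l
calcium_meq    = calcium_ppm   / 20.04 * volume_l
magnesium_meq  = magnesium_ppm / 12.15 * volume_l

print(f"alkalinity: {alkalinity_meq:.2f} mEq, Ca: {calcium_meq:.2f} mEq, Mg: {magnesium_meq:.2f} mEq")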
 
I don't use DI water as my tap water is soft enough that I am still adding minerals to every beer. But there are minerals in there so how would software account for the naturally occurring minerals if I didn't tell it how much water I was using?

But you do tell it. And it doesn't matter to the software whether it is natural mineralization or added mineralization, as previously mentioned within post #1 (and where mineralization includes alkalinity). Ultimately, water in and of itself is by definition deionized. The point I'm making is that the water's own contribution to mash water acidity is so small as to be inconsequential.
 
When we put people on the moon, relativistic effects were ignored due to being so small as to be inconsequential. Perhaps the lander would have landed a few feet away from where it did if they were included. But would that have mattered? It's quite effectively the same for water. It is for all practical purposes only there as a non-reacting carrier medium for everything tossed into it, by man, or by nature.
 
Keep in mind, they were also using slide rules and hand calcs. Streamlining was as much for sanity’s sake as for accuracy’s sake.

We don't have that issue as we use Excel.
 