Water Chemistry Questions

Recently got my water tested, results are below.
[attached water report image]

Had a couple thoughts after the fact that I really should have thought of before sending my water sample in.

1. Does the Bru'n Water spreadsheet account for using 10 gallons of water to make 5 gallons of beer? For example, if my sodium level is 194 ppm and I turn 10 gallons of water into 5 gallons, is my sodium now more concentrated? (I'm simplifying the question; it's not exactly 10 to 5, since some water is absorbed by the malt, etc.) Am I overthinking that?
2. Is there any sort of standard for how much sodium water softeners add? Instead of cutting my tap water with RO or distilled, I could add a splitter before my water softener to get less salty water.
3. I didn't take a sample of water that had been treated with Campden tablets... anyone have the math on how that changes water chemistry?

Thanks in advance!
 
You should have submitted pre-softener water for analysis. Of all waters, softened is generally one of the worst to attempt to brew with. But pre-softened water would not change your water's extremely high alkalinity. You would be best served by using mineral-adjusted RO water instead of your home's water.
 
Water softeners also do not typically alter a water's TDS.

A ballpark wild guess would place your pre-softened water in the vicinity of 110 ppm Ca++ and 25 ppm Mg++.
 
Water worksheets do not need to be further adjusted for water losses. You enter the volumes of mash and sparge water and let the worksheet do the rest.

To the best of my knowledge, Campden tabs have no effect on the mineral balance of water.

I'd avoid using softened water. Instead, either buy RO or spring water, or precipitate out the carbonate causing the high alkalinity in your hard water. You can cause it to drop out either by pre-boiling (though this wastes a lot of energy) or by lime softening with pickling lime. Either of these requires a surplus of calcium, so you would need to start with your un-softened water, add a little gypsum and/or calcium chloride, and let it settle overnight. I have done this for years with hard, high-alkalinity water because I don't have access to RO water at home, and won't pay for it. It works great for lower-SRM beers (<20). For stouts and porters, go to town with that hard water as is.
 
Campden tablets raise the SO4-- ion concentration a bit. The OP should be aware that his actual SO4 is 117 ppm (mg/L).
 
Don't use water softener sourced water to brew with. Far too much sodium.
 
Here is a water report I received for some source water I'd like to use. Not sure if this info can be cooked out of this chart, but can one of you chemists out there review it and let me know if these can be determined from it, and what the values would be:
Ca2+
Mg2+
Na+
Cl-
SO42-
HCO3-

[attached water report image]
 
Guessing Ca++ = ~17.68 mg/L (and not higher than ~24.6 or lower than 0) [note: if Ca++ is ~24.6, then Mg++ must be zero]
Guessing Mg++ = ~4.17 mg/L (and not higher than ~14.9 or lower than 0) [note: if Mg++ is ~14.9, then Ca++ must be zero]
Guessing Alkalinity as CaCO3 = ~71 mg/L
Guessing HCO3- = ~86.6 mg/L

The rest are on the report.
Na+ = 6.11
Cl- = 2.3
SO4-- = ~0.19

Total Hardness = 61.4 = 2.5(Ca++) + 4.12(Mg++)

I assumed ~28% of total hardness from Mg++, and 72% of total hardness from Ca++, which seems fairly average for fresh water overall (but by no means is meant to be firmly representative of any single "real world" water source).

2.5(17.68) + 4.12(4.17) = 61.38 (decently close to 61.4, with difference due only to rounding)

(2.5 x 17.68)/61.4 x 100 = 72% of total hardness from Ca++
(4.12 x 4.17)/61.4 x 100 = 28% of total hardness from Mg++

If it was all I had to go by, I would run with these guesses. You will need to add some calcium chloride and gypsum to boost the calcium, chloride, and sulfate ions. And (recipe dependent) you will need to add a small amount of lactic or phosphoric acid to lower the alkalinity.
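
If it helps, here's that partitioning math as a minimal Python sketch. The 72/28 Ca/Mg split is my assumed "typical fresh water" ratio from above, not a measurement:

```python
# Minimal sketch of the hardness-partitioning guess above. The 72/28
# Ca/Mg split is an assumed "typical fresh water" ratio, not a measurement.

CA_FACTOR = 2.5   # ppm hardness (as CaCO3) per ppm Ca++ (i.e., 100/40)
MG_FACTOR = 4.12  # ppm hardness (as CaCO3) per ppm Mg++ (i.e., ~100/24.3)

def split_hardness(total_hardness_as_caco3, ca_fraction=0.72):
    """Estimate (Ca++, Mg++) in mg/L from total hardness as CaCO3."""
    ca = total_hardness_as_caco3 * ca_fraction / CA_FACTOR
    mg = total_hardness_as_caco3 * (1.0 - ca_fraction) / MG_FACTOR
    return ca, mg

ca, mg = split_hardness(61.4)
print(f"Ca++ ~ {ca:.2f} mg/L, Mg++ ~ {mg:.2f} mg/L")
# -> Ca++ ~ 17.68 mg/L, Mg++ ~ 4.17 mg/L
```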
 
Assuming distilled sparge water, I'm seeing a commonly cited 5 ml of 88% lactic acid solution per gallon. About how many drops per gallon is this?

A rule of thumb is 20 drops per ml. But since you mentioned sparge water, it's not normally necessary to acidify distilled sparge water. It has no buffering capacity, so the amount of acid needed to acidify it to match mash pH is minuscule, and basically irrelevant. The distilled sparge water (even untreated) won't cause the runoff pH to get significantly higher.

I am curious about "5ml lactic acid 88% solution per gallon." What's the context you're seeing that in?
 
A couple of references online mentioned that 5 ml/gal number. They also mentioned sparge water being optimally pH 5.4 to 5.7, which is lower than the roughly 7.4 of distilled. So I was looking for a quick and dirty method of acidifying distilled water to take it from 7.4 down to, say, pH 5.4. Drops per gallon.
 
A couple of references online mentioned that 5 ml/gal number. They also mentioned sparge water being optimally pH 5.4 to 5.7, which is lower than the roughly 7.4 of distilled. So I was looking for a quick and dirty method of acidifying distilled water to take it from 7.4 down to, say, pH 5.4. Drops per gallon.

If you want to treat distilled water (with a theoretical pH of 7.0) to hit pH 5.7, it will take about 0.006 ml of 88% lactic acid to treat 10 gallons, i.e., 0.012 drops (not ml) per gallon. (But again, it's not necessary.)
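
For anyone who wants the unit conversion spelled out, a quick sketch (assuming the rough 20 drops/ml rule mentioned earlier; actual drop size varies by dropper):

```python
# Quick sketch of the drops-per-gallon arithmetic above, assuming the
# rough 20 drops/ml rule of thumb (actual drop size varies by dropper).

DROPS_PER_ML = 20.0

acid_ml = 0.006   # ~ml of 88% lactic acid for 10 gallons (estimate above)
gallons = 10.0

drops_per_gallon = acid_ml * DROPS_PER_ML / gallons
print(f"~{drops_per_gallon:.3f} drops per gallon")  # -> ~0.012
```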
 
2.5 mL of 88% Lactic Acid will remove the alkalinity from 5 gallons of your tap water. At that juncture you can sparge with it.
 
My tap water here in SoCal is terrible. It has an organic smell from the reservoir they're pulling it from; really gross. Not exactly what I envision for my Bavarian Pilsner.

Anyway, so I'm buying distilled water and building it up to Munich profile.

For the sparge I'm not going crazy, but I am using distilled water with lactic acid drops to adjust the pH to around 5.4. Originally I was figuring around three drops per 1-gallon jug. I've heard several estimates on this; I'm just trying to figure out a quick and easy guide to acidifying distilled jug water: 0 drops, 1 drop, 3 drops, 20 drops, or whatever.
 
As has already been stated, there is no practical way to measure out the tiny amount of acid required to move quality deionized water to pH 5.4 without falling below pH 5.4, and the entire exercise of attempting to do so is unnecessary.

If you are capable of acidifying distilled water without falling below pH 5.4, your distilled water source is likely not of very good quality. At around 8-10 ppm alkalinity, the required addition of 88% lactic acid would need to be held to no more than 1/4 mL into 5 gallons.

More importantly, I've come across two peer reviewed major brewing industry documents dating to the 1950s-60s which indicate that you can sparge successfully with water of up to 50 ppm alkalinity without intervention (meaning without acidifying it), and with little to no fear of the dreaded release of tannins. So perhaps our modern era's preoccupation with hitting pH 5.4 with sparge water became massively overblown at some juncture, and from that juncture on was perpetuated merely via circular reasoning, the blind leading the blind, or, as one forum member not long ago called it, circle jerking, whereby rumor becomes fact through repetition from one homebrewing writer to another with zero verification or peer review.

One of the two old brewing industry documents referencing 50 ppm alkalinity even indicated that the water's pH can measure 9 and it won't matter as long as alkalinity is at or below 50 ppm. But this same document in another location also stated that keeping alkalinity at or below 25 ppm for sparge water is considered good practice.

Water with up to a ballpark TDS of 60 would likely be at 50 ppm or less alkalinity. The several brands of distilled I've measured for TDS have typically come in at a ballpark 8-15 ppm (and even higher than 20 ppm for one gallon), and I've measured really good RO at between 2 and 6 ppm TDS. Therefore, in my experience, good quality RO is quite often more pure than store-bought distilled. But much RO water is questionable as to its quality....

Here's a secret: water at pH 5.4 does not at all mean water with zero ppm of alkalinity. The zero point for alkalinity varies to a small degree, but hovers quite near pH 4.3. Depending on pH, water with ballpark 250 ppm alkalinity, when acidified to pH 5.4, will have ballpark 25 ppm remaining alkalinity; and water at ballpark 470 ppm alkalinity will still have ballpark 50 ppm alkalinity remaining when acidified to pH 5.4.
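
For those curious where those ballparks come from, here's a rough sketch of my reasoning, treating alkalinity as essentially bicarbonate and using only carbonic acid's first dissociation (pK1 ~ 6.35 at room temperature). This is a simplification, not a rigorous method:

```python
# Ballpark sketch: fraction of bicarbonate alkalinity surviving
# acidification to a target pH, treating alkalinity as ~[HCO3-] and using
# only carbonic acid's first dissociation (pK1 ~ 6.35 at room temperature).
# Ignores CO2 off-gassing, ionic strength, etc.

PK1 = 6.35

def remaining_alkalinity(alk_ppm_as_caco3, target_ph):
    fraction = 1.0 / (1.0 + 10.0 ** (PK1 - target_ph))
    return alk_ppm_as_caco3 * fraction

for alk in (250.0, 470.0):
    print(f"{alk:.0f} ppm at pH 5.4 -> ~{remaining_alkalinity(alk, 5.4):.0f} ppm remaining")
# -> ~25 ppm and ~47 ppm, in line with the ballparks above
```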
 
One of the two old brewing industry documents referencing 50 ppm alkalinity even indicated that the water's pH can measure 9 and it won't matter as long as alkalinity is at or below 50 ppm.

It seems to me a claim like that must include some kind of assumption (even if not stated) about the buffering capacity of the wort and the "length" of the sparge, i.e. how dilute the final runnings get.
 
Here is a water report I received for some source water I'd like to use. Not sure if this info can be cooked out of this chart, but can one of you chemists out there review it and let me know if these can be determined from it, and what the values would be:
Ca2+
Mg2+
Na+
Cl-
SO42-
HCO3-

[attached water report image]
That's probably a decent brewing water except for the manganese. It's getting to the level where you might pick up a metallic flavor in the water, and that can echo into the beer. Using a greensand filter to reduce the manganese should help. There are whole-house filters like that, and they are typically regenerated with a potassium permanganate solution.

The rest of the ionic content is likely to be within acceptable range. You can't assume that the magnesium content will be a certain percentage of the hardness, but it's not likely that it will be too high. You can either call the water company and speak to the water quality lab person or you can send a sample off to a reputable water lab to determine the ionic content needed for brewing.
 
Hmm. I'm curious as to what they mean by "Practical tests demonstrate..." Other than that, it sounds like they are ballparking it. 100ppm. 50ppm. Nice round numbers.

Well, Bamforth effectively made it clear that even within the major brewing industry much of what passes as peer reviewed science is merely unsubstantiated rumor which gets passed along from one peer reviewed author to another with no effort to verify or substantiate, such that the (pseudo) verification comes merely from repetition. Simply repeat something long enough and it has a way of becoming scientific truth through the power of circular reasoning.

There have been surprisingly few (if any) detailed studies of the precise impact of pH on mashing performance and wort composition. Textbooks of brewing make reference to "optimum" pH's for parameters such as extract and "wort filtration", though they are conspicuous by the lack of references. One textbook refers to a previous textbook! It seems that a largely empirical approach has been employed. How the data has been generated and on what scale (lab mashes are not always good mimics of commercial mashes) is unclear. Furthermore, the manner by which the pH has been adjusted in such studies is seldom apparent, despite its tremendous importance.
Charles W. Bamforth
 
Well, Bamforth effectively made it clear that even within the major brewing industry much of what passes as peer reviewed science is merely unsubstantiated rumor which gets passed along from one peer reviewed author to another with no effort to verify or substantiate, such that the verification comes from repetition. Simply repeat something long enough and it has a way of becoming scientific truth.

No doubt. And having read a lot of Bamforth, I'll add that he himself is not always immune from this.
 
Much of what Bamforth was referring to in his last sentence (within the words of his which I quoted above) was the confusion surrounding peer reviewed studies as to whether the pH was measured at mash temperature or at room temperature. It is rarely if ever explicitly stated in the old peer reviewed brewing industry level journals as to which, and must be carefully teased out of the study.

Amazingly enough, Bamforth, a former President of the Institute of Brewing and Distilling who was also the Anheuser-Busch Endowed Professor of Malting and Brewing Sciences at the University of California, Davis between 1999 and 2018, is honest enough to admit confusion. But all it took for amateur homebrewers to definitively shed such confusion, and accept as gospel fact that all such measurement has throughout history been taken at room temperature, was for AJ deLange to proclaim that it was definitively so through his own intuition.
 
AJ had a very good basis for STATING that pH measurement was historically performed at room temperature: prior to a couple of decades ago, all pH measuring equipment was bulky, expensive, and not suited to placing into a hot tun or kettle. I'd welcome information showing that his statement was wrong.
 
Apologies if I am posting in an incorrect place. I am attempting to use tap water for brewing for the first time. I am attempting to make a Berliner weisse; current water report information is shown in the picture below.

As I don't have any RO or distilled water available to me at the moment, I decided to pre-boil the water to reduce hardness. I did not add any extra chalk, and boiled for 15 minutes.

Now I am having a hard time figuring out how much calcium or bicarbonate I reduced with the boil. The tools I have available are a pH meter and a TDS meter.

I'd appreciate it if anyone could help me get some sort of estimate. Also welcome are tips on what additions should be made to the water for the Berliner weisse style. Or if I am posting in the wrong place, please point me to the correct thread. Big thanks!

[attached water report image]
 
If the above represents your starting water, your post-boil water will have ballpark 37.6 ppm calcium, 65 ppm alkalinity, and 79.3 ppm bicarbonate. I believe you will need to maintain the boil for about 15 minutes. During that time you may lose on the order of 0.2 to 0.25 gallons to evaporation.
 
You can only remove so much alkalinity, because the reaction travels in both directions and finds equilibrium at about 65 ppm alkalinity* (as CaCO3). [* provided that sufficient calcium is present; for reaction '1' the Ca++ ion must be present in an excess or surplus of at least 12 ppm]

1) Ca++ + 2HCO3- <--> CaCO3 + CO2 + H2O

2) CaCO3 + CO2 + H2O <--> Ca(HCO3)2

NOTE: If you exhaust calcium to its minimum of 12 ppm before the water's alkalinity hits 65 ppm, add more calcium to the boil mix as either CaCl2 or CaSO4.
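
If you want to reproduce this kind of estimate yourself, here's a minimal sketch of the arithmetic as outlined above. The starting values are placeholders chosen to reproduce the ballpark figures I quoted (the actual report was in the attached image):

```python
# Rough sketch of the pre-boil decarbonation estimate described above.
# Assumptions: a practical equilibrium floor of ~65 ppm alkalinity (as
# CaCO3); each ppm of alkalinity precipitated removes ~0.4 ppm Ca++
# (40/100, the Ca/CaCO3 mass ratio per reaction 1); and
# HCO3- = alkalinity-as-CaCO3 * 61/50. Valid only while Ca++ stays above
# its ~12 ppm surplus minimum (see the NOTE above).

ALK_FLOOR = 65.0   # ppm as CaCO3
CA_PER_ALK = 0.4   # ppm Ca++ removed per ppm alkalinity removed

def boil_decarbonate(ca_ppm, alk_ppm_as_caco3):
    removed = max(alk_ppm_as_caco3 - ALK_FLOOR, 0.0)
    ca_after = ca_ppm - removed * CA_PER_ALK
    alk_after = alk_ppm_as_caco3 - removed
    hco3_after = alk_after * 61.0 / 50.0
    return ca_after, alk_after, hco3_after

# Placeholder starting values (the real report was in the attached image):
ca, alk, hco3 = boil_decarbonate(ca_ppm=80.0, alk_ppm_as_caco3=171.0)
print(f"post-boil: ~{ca:.1f} ppm Ca++, ~{alk:.0f} ppm alkalinity, ~{hco3:.1f} ppm HCO3-")
# -> ~37.6 ppm Ca++, ~65 ppm alkalinity, ~79.3 ppm HCO3-
```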
 
Much of what Bamforth was referring to in his last sentence (within the words of his which I quoted above) was the confusion surrounding peer reviewed studies as to whether the pH was measured at mash temperature or at room temperature. It is rarely if ever explicitly stated in the old peer reviewed brewing industry level journals as to which, and must be carefully teased out of the study.

Amazingly enough, Bamforth, a former President of the Institute of Brewing and Distilling who was also the Anheuser-Busch Endowed Professor of Malting and Brewing Sciences at the University of California, Davis between 1999 and 2018, is honest enough to admit confusion. But all it took for amateur homebrewers to definitively shed such confusion, and accept as gospel fact that all such measurement has throughout history been taken at room temperature, was for AJ deLange to proclaim that it was definitively so through his own intuition.

For but one example of this, in the 2004 published book titled 'Brewing, Science and Practice' by Briggs, Boulton, Brookes, and Stevens, at the top of page 115 is found:
Infusion mashes are best carried out at pH 5.2-5.4 (mash temperature), and so will give cooled worts with pH values of about 5.5-5.8.

If we allow for AJ's formula for temperature-induced pH shift, 0.0055 pH points of shift per degree C, and apply it to the above (presuming 66 degrees C at infusion mash temperature, and 20 degrees C, the EBC standard for wort pH readings; for mash pH readings no standard exists, AFAICT), we get:

0.0055 x (66-20) = 0.253 as the pH shift. Let's round it to 0.25 points. Then we get 5.2 + 0.25 = 5.45, and 5.4 + 0.25 = 5.65

And we come to the optimum "room temperature" mash pH range of 5.45 to 5.65, with an ideal midrange target of 5.55. This as opposed to Briggs' 5.5 to 5.8 for room temperature, with an ideal midrange of 5.65.

No matter how you approach it, for Briggs and company 5.4 is not the ideal room-temperature midrange pH for mashing. 5.55 is a better room-temperature midrange target if we allow for AJ's observed pH shift as measurement temperature varies; or 5.65 per Briggs.
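
The arithmetic is trivial, but here it is as a sketch, assuming AJ's 0.0055 pH per degree C rule of thumb:

```python
# Sketch of the temperature-shift arithmetic above, assuming AJ's rule
# of thumb of 0.0055 pH points per degree C.

SHIFT_PER_C = 0.0055

def ph_at_room_temp(ph_at_mash_temp, mash_c=66.0, room_c=20.0):
    """Convert a pH read at mash temperature to its room-temperature value."""
    return ph_at_mash_temp + SHIFT_PER_C * (mash_c - room_c)

for ph_hot in (5.2, 5.4):  # Briggs' range, measured at mash temperature
    print(f"{ph_hot:.2f} at 66 C -> ~{ph_at_room_temp(ph_hot):.2f} at 20 C")
# -> ~5.45 and ~5.65, the room-temperature range derived above
```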
 
Here is a picture from a Weyermann presentation of a mash into which they added consecutive increments of acid malt as a percentage of the grist weight. Notice the difference in measured pH at mash temperature (specifically called Mash-pH) vs. at room temperature (specifically called Wort-pH) for each addition of acid malt. I 'speculate' that 'perhaps' the pH differential was narrower at the onset because this was a step mash, and only at about step 6 or 7 did the mash reach its final target mash temperature (of perhaps 65-67 C). Their differential (from step 6 on) is about 0.22 pH points on average, closer to AJ's 0.25 points than to Briggs' 0.35 pH points.

[chart: Weyermann acidulated malt additions vs. Mash-pH and Wort-pH]
 
Note also that Weyermann ended the acidification when they achieved 5.50 pH at room temperature. They did not target 5.40.

Also note that it is a "mash pH" only when measured at "mash temperature". And note that if mash pH's were actually intended to be taken only at room temperature, then a single pH column representing the pH progress during the mash as they acidified it would have totally sufficed, and it would have been the rightmost column.

And also note carefully that 'ATC' does not conflate the two columns into one. That is not the purpose of 'ATC'.
No, ATC does not work like conductivity temperature compensation. We cannot adjust a measured sample pH value at one temperature to an expected sample pH value at another temperature (e.g. 25°C), because we do not know how the pH of a sample varies with temperature. For example, the pH value of a water sample may change rapidly as the result of chemical, physical, or biological processes that are temperature dependent. If we want to know what the pH value of the sample is at a certain temperature, we would have to adjust the sample to that temperature and measure the pH. That is why pH is frequently reported with a temperature measurement. We understand that the pH of a sample is temperature dependent. While ATC does allow us to calibrate accurately and adjust the pH electrode calibration when the temperature changes, ATC can’t correct for sample pH/temperature effects, which are unknown.
https://assets.thermofisher.com/TFS...p-Compensation-pH-Measure-ST-ATCPHMEAS-EN.pdf
What ATC does do is assure you that if you measure pH at 66 degrees C and find it to be 5.27, that is truly the wort's pH at 66 degrees C, and if you subsequently cool the very same wort sample to 20 degrees C and measure it again, and this time your meter reads 5.50, that is also its true pH. The bottom line is that pH's are naturally lower at higher temperatures. But for a pH meter to accurately indicate pH's at differing temperatures (with pH's displayed "as they are", not "as you would like them to be"), the "slope" of the internal formula used by the meter must be changed to agree with the temperature. All that ATC does is change this slope so you don't have to correct for an incorrect slope.
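
To illustrate what that slope actually is, here's a sketch using the ideal Nernst electrode response (real electrodes deviate somewhat; this is only to show why the slope must scale with temperature):

```python
# Sketch of the electrode "slope" that ATC corrects: the ideal Nernst
# response of a pH electrode, in mV per pH unit, scales linearly with
# absolute temperature. Real electrodes deviate somewhat from ideal.

R = 8.314       # gas constant, J/(mol*K)
F = 96485.0     # Faraday constant, C/mol
LN10 = 2.302585

def nernst_slope_mv(temp_c):
    """Ideal electrode slope in mV per pH unit at the given temperature."""
    return LN10 * R * (temp_c + 273.15) / F * 1000.0

for t in (20.0, 25.0, 66.0):
    print(f"{t:.0f} C -> {nernst_slope_mv(t):.2f} mV/pH")
# -> ~58.17, ~59.16, ~67.29 mV/pH; ATC rescales the meter by this factor
```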
 
What ATC does do is assure you that if you measure pH at 66 degrees C and find it to be 5.27, that is truly the wort's pH at 66 degrees C, and if you subsequently cool the very same wort sample to 20 degrees C and measure it again, and this time your meter reads 5.50, that is also its true pH. The bottom line is that pH's are naturally lower at higher temperatures. But for a pH meter to accurately indicate pH's at differing temperatures (with pH's displayed "as they are", not "as you would like them to be"), the "slope" of the internal formula used by the meter must be changed to agree with the temperature. All that ATC does is change this slope so you don't have to correct for an incorrect slope.

This is a challenging concept for many home brewers to get their heads around because they naturally think that the temperature correction (ATC) solves the problem. In fact there are two problems when comparing pH's at different temperatures:

1. Temperature affects the way your device reads pH. A correction needs to be made to normalize (or normalise if you're zed-phobic) readings to a standard temperature (e.g., 68°F). This correction is fairly straightforward to understand and parallels how we think about correcting S.G. based on temperature.

2. Temperature can also directly change the pH of a sample. The ATC correction does not account for the fact that the pH of mash at mash temperatures will be lower than when that mash sample is allowed to cool to room temperature. The problem is that this effect is also a temperature effect, and pH meters advertise built-in temperature correction. So it's easy to see how people misunderstand this:

"Wait, I bought a meter that temperature-corrects my 150° mash pH, but now you're telling me that I still need to let it cool down to room temperature before measuring the pH?"

The best way I've found to explain this to people is to do the following experiment the next time that you brew:

1. Measure the mash pH at mash temp, but put the temperature probe in room-temperature water (and the pH probe in the mash). This is effectively what would happen if you had no temperature correction (ATC) on your meter. The reading is incorrect because you've tricked the device into thinking the sample is at room temp, so its internal correction is applied wrongly.

2. Now measure the mash pH at mash temp as you normally would by putting both temp and pH probes in the hot mash. This reading (e.g., 5.30) is a temperature-corrected reading. It is an internal correction that is specific to the meter. This reading puts you on a level playing field with scientists all across the world who use ATC when reporting pH. And, and this is important, if temperature did not directly alter the pH of your mash, you could simply use this value as a true pH reading.

3. Let the same mash sample cool to 68° F. Put the temp probe and pH probe in this cooled sample and measure. What you are likely to discover is that the pH now reads higher (e.g., 5.5). Why? Because temperature changes the true pH value. In this case, heating a sample of wort is figuratively the same as adding some acid to it. And your ATC pH meter has no knowledge of what you're sticking it in. It doesn't know whether temperature fundamentally changes the pH of your mash, or aquarium water, or your hot tub. It just knows how to mechanically correct itself to a standard 68°F.

In short, if you want a mash pH of 5.3 at 68° F, you must measure your mash sample at 68° F even if using an ATC pH meter.
 
Speculation time: (yes, pure intuitive guesswork, wherein I'm fully aware that intuition generally makes for bad science and therefore the probability that this is correct is very low, so take this with a grain of salt)

Observation series 1) The pH masters of yore consistently tend to inform us that the difference between a Wort measured at ~66 degrees C. and a Wort measured at 20 degrees C. is about 0.35 pH points.

Observation series 2) The current observational reality seems to be that the difference between a Wort measured at ~66 degrees C and one measured at 20 degrees C is roughly 0.20 to 0.25 pH points.

I wonder (that is to say, I'm speculating, guessing, pondering) whether older pH meters without ATC trended toward (or clustered around) 0.35 pH points of measured difference, while modern pH meters with ATC cluster around 0.20 to 0.25 pH points of difference across this 66 to 20 degrees C temperature differential. The 'speculative reason' for the difference would be one of slope without correction vs. slope with correction.

This speculative exercise is merely an intuitive attempt to understand why for decades 0.35 pH points was accepted, and why now something more along the lines of 0.22 pH points is both accepted and observed (as with Weyermann).
 
Why is there no EBC or ASBC or other "Standard" methodology for the taking of a "Mash pH", whereas the methodology for taking of a "Wort pH" is standardized?

I believe the reason "may be" (I.E., I speculate) that there is no standard "mash", and without a "Standard Mash" there can be no standardized methodology for the taking of a "Mash pH".

Within single infusion mashes people can mash across a broad range of temperatures. I.E., no standard.
One can also undertake single decoction or double decoction mashing. Again, no standard.
And one can step mash with a multiplicity of mash step temperatures and times. Again, no standard.
 
This linked dissertation, which I believe to be sourced from AJ deLange since his name appears within the web link, seems to me when read carefully (if I'm following it correctly, which I may well not be) to strongly indicate that mash pH readings, and the titrations whereby buffering is determined, are physically to be undertaken with care at 47 to 48 degrees C, and then somewhat awkwardly (IMHO) back-calculated to a nominal 20 degrees C via an under-verified application of AJ's 0.0055 x degrees C method. Here is the link:

http://**********************/wp-content/uploads/2016/11/DeLange-Estimating-Mash-pH.pdf
So even though AJ may have instructed Palmer, for the purposes of publishing the "Water" book, that all mash pH's are to be taken at 20 degrees C, it does not appear that AJ himself (presuming again that this document is his, and that I'm reading it correctly) practices this, except via dubious formula-derived back-calculation to 20 degrees C.

What is clearly needed here is an explanation from @ajdelange himself, straight from the source, to clear the air of my presumptions. The 47-48 degrees C requirement may be instrument-related, but again I speculate...

A quote from the document:
This is the DI mash pH of this malt at the approximately 47 °C mash temperature of these experiments. We usually like to work in terms of mash pH specified at room temperature but in this case we need to see what is actually happening in the mash and so took pH measurements at mash temperature. Using the general rule of thumb that measured mash pH increases about 0.0055 for each °C decrease in temperature we would add (47-20)*0.0055 = 0.1485 pH and conclude that the room temperature (20 °C) DI pH for this malt is 5.63

Interestingly, this states that to understand specifically what is "actually" happening within the mash one must measure it at mash temperature.
 
This linked dissertation indicates that the ASBC standards organization does have a procedure for "Congress Mash", as does the EBC (which is EBC Method 4.5.1). Here is the link:

http://www.regional.org.au/au/abts/1999/jones.htm
A chart of the ASBC Congress Mash as taken from the linked source looks to be mighty close to 47 degrees C for the first 30 minutes of the mash. Perhaps this is where the 47 degrees C in the presumed AJ link I provided in the post above is derived. The EBC 4.5.1 standard specifically calls for 45 degrees C for the first 30 minutes of a Congress Mash, followed by a progressive temperature ramp-up similar to ASBC's.

[chart: ASBC Congress Mash temperature profile]
 