Judochop
Well-Known Member
It wasn't my intent, but I ended up running a controlled experiment, and the results confound me.
Here are the details:
I brewed 10 gallons of a red ale, OG = 1.053, and split the wort into 2 six-gallon carboys.
I made a single 2L starter with 3 yeast packets (Wyeast 1728 Scottish Ale) on a stir plate, swirled till uniform, and pitched even amounts into the 2 carboys.
Both batches were fermented for 2 weeks, and both ended with a final gravity of 1.008-1.009. (I ALWAYS get high attenuation with 1728, despite Wyeast's predictive claims.)
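For what it's worth, those gravities put the apparent attenuation in the low-to-mid 80s, which is well above the roughly 70% range Wyeast publishes for 1728 (going from memory on their number). A quick sketch of the arithmetic, with a made-up helper name:

```python
def apparent_attenuation(og, fg):
    """Apparent attenuation as a percentage: (OG - FG) / (OG - 1.000)."""
    return (og - fg) / (og - 1.000) * 100

# OG 1.053 down to FG 1.008-1.009, per the batch numbers above
print(round(apparent_attenuation(1.053, 1.008), 1))  # 84.9
print(round(apparent_attenuation(1.053, 1.009), 1))  # 83.0
```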
After 2 weeks of fermentation, I moved both carboys to the garage to crash cool and added equal doses of gelatin finings. Both spent 2 weeks cold @ ~38F before kegging.
So far, a fairly controlled experiment, yes? Here's the one obvious difference.
One batch was fermented @ 60F in well-controlled conditions using a 2-stage controller, a mini-fridge and a fermwrap. I bumped up the temp to 65F for the last bit of fermentation.
The other batch was fermented @ 66-68F in my uncontrolled downstairs bathroom, with slight (+/- 2 degree) temp fluctuations over the course of primary fermentation. I bumped up the temp with a spare fermwrap to 70-72F for the last bit of fermentation.
In terms of flocculation/clarity, the results between the two batches are night and day. I mean, they're not even close. The batch that fermented colder looks like a perfect ruby when held up to the light. The batch that fermented warm looks like it finished fermenting that day and never got crashed at all. Visually speaking, one is a homebrewer's dream, and the other a disappointing failure. Both taste good/clean. No infection or wildness detected in either. They do taste marginally different, but that is to be expected.
Anyone have any theories that would explain this difference in clarity?