Hi Dotball. Glad the install went well for you. That's the intent of course, but it's a big world with lots of variables.
day_trippr is exactly right, of course. I'd like to elaborate while I wait for my coffee to brew:
And another question is where can you get good starting data for a particular brew? I can find a lot of instructions for xyz lager or abc ale with mash steps and boil times, hop additions, etc. but they all seem to lack proper temp schedules for fermenting.
I think 95% of this is technique and research rather than software. The first place to go for the proper fermentation temps is the yeast supplier. If you look at White Labs' page for WLP001 California Ale Yeast, you'll see that proper temps for this particular yeast are 68°F - 73°F. If you skip over to Wyeast's most popular yeast in the same category, you'll see that the 1056 American Ale yeast prefers temps in the 60°F - 70°F range. So, where you might choose 70°F for a middle-range setting for the WLP001, it would be at the top end of what's okay for the 1056. Even though I've been brewing since '91 (I prefer not to do the math on how many years that is anymore) and a judge since '95, me choosing the temp you should be fermenting your beers is a responsibility I do not want. So, the quick answer is to ask the yeast guys. All of the suppliers I have used will have this data on their website. That's where you should be looking.
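To make the "ask the yeast guys" step concrete, here's a minimal sketch. The lookup table hard-codes the two supplier ranges quoted above (WLP001 and 1056); the midpoint rule is one arbitrary starting choice on my part, not a supplier recommendation:

```python
# Hypothetical lookup table, hard-coding the two supplier ranges quoted
# above. Always confirm against the supplier's own page before brewing.
YEAST_TEMP_RANGES_F = {
    "WLP001": (68.0, 73.0),  # White Labs California Ale
    "1056": (60.0, 70.0),    # Wyeast American Ale
}

def midpoint_setpoint(strain: str) -> float:
    """One arbitrary starting rule: the middle of the published range."""
    low, high = YEAST_TEMP_RANGES_F[strain]
    return (low + high) / 2

print(midpoint_setpoint("WLP001"))  # 70.5
print(midpoint_setpoint("1056"))    # 65.0
```

Note how the same "middle of the range" rule lands you at two quite different temps for two yeasts in the same category - which is exactly why I won't pick your number for you.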
You mention temp profiles too and this is a subject to which I've been directing a few brain cells as of late. It's easy as humans to say "Hold it at 70°F till fermentation is finished, then ramp it up to 75°F for 3 days, then cold crash at 55°F." My question about such an algorithm is: "When is it finished fermenting?" Time is a bad judge, as we all learn sooner or later. One batch will go gangbusters and be 90% attenuated in two days. The next will take four days to do the same work. So, time is out.
What about the specific gravity? Taking my "house ale" as an example, it's an American Amber which starts at 1.042 (yes, that's low; it's intended to be a session beer.) I use Wyeast 1056 for this, which has an attenuation rate of 73-77% - in the lab. That means if I do my part right, the FG will be 1.0113-1.0097. As a human, it's easy for me to watch the gravity curve (especially with a Tilt, which I think everyone should have!) and see if it's "done." When I do so I'm not really looking at the number so much as the flattening of the curve indicating my yeasties are finished. Computers do not really look at shapes, they look at numbers. There's roughly a 0.002 gravity window in there where my beer might be done (again, that's assuming I have perfect conditions.)
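For anyone who wants to check my arithmetic, the apparent-attenuation math is a one-liner - this is just the standard gravity-points calculation, nothing tool-specific:

```python
def expected_fg(og: float, attenuation_pct: float) -> float:
    """FG predicted by apparent attenuation: the percentage of gravity
    points above 1.000 that the yeast consumes."""
    points = og - 1.0
    return 1.0 + points * (1 - attenuation_pct / 100)

# Wyeast 1056's published lab range of 73-77% on a 1.042 wort:
print(round(expected_fg(1.042, 73), 4))  # 1.0113
print(round(expected_fg(1.042, 77), 4))  # 1.0097
```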
To get my five gallons of American Amber to the style-appropriate 2.3 volumes of CO2, if I were bottling I would use ~107g of corn sugar. That equates to roughly 0.002 in specific gravity. I think you can begin to see the issue here. If I tell a computer that my beer is done fermenting at 1.013 but it could really have gone to 1.010, I might cold crash a beer that's not fermented out with 0.003 of SG still to go. Then if I add in my 107 grams of corn sugar I've now got a surplus of 0.005, which works out to somewhere around five volumes of CO2 when it's bottled. That's not quite a bottle-bomb, but it's a fizzy gusher and not what you are looking for.
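Putting rough numbers to the carbonation side: the sketch below uses common approximations - about 0.51 g of CO2 per gram of dextrose fermented, 1.96 g/L of dissolved CO2 per "volume," and an assumed ~0.85 volumes of residual CO2 left in the beer. Those constants are my assumptions for illustration, not gospel:

```python
GAL_TO_L = 3.785

def co2_volumes(dextrose_g: float, batch_gal: float, residual_vol: float = 0.85) -> float:
    """Very rough carbonation estimate.
    Assumes ~0.51 g CO2 per g dextrose fermented, 1.96 g/L of dissolved
    CO2 per 'volume', and residual_vol already dissolved in the beer."""
    g_per_l = dextrose_g / (batch_gal * GAL_TO_L)
    return residual_vol + (g_per_l * 0.51) / 1.96

print(round(co2_volumes(107, 5), 2))  # right around the 2.3-volume target

# The 0.005 surplus scenario: 2.5x the fermentables of a normal priming dose
surplus_sugar = 107 * (0.005 / 0.002)
print(round(co2_volumes(surplus_sugar, 5), 2))  # well over 4 volumes
```

Depending on how much residual CO2 you assume, the surplus scenario lands somewhere between four and five volumes - gusher territory either way.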
There are other projects which have begun to add in gravity-based temperature profiles, but I think you can see the issue here now that I have put some numbers to "paper." Teaching a human to judge "done" and teaching a computer to do the same are pretty different prospects. The results matter, and even though a computer is "precise," handing it the wrong definition of "done" leads to widely varying results.
One way is to do some math, figure out that the gravity drop has levelled out, and at some arbitrary level of change over time declare it "done." As you begin to peel back the layers of the onion, however, you see that the original problem hides several others. For instance, the gravity reading will bounce up and down even sitting in plain water. Remember that 0.002 is the level at which I would carbonate, so even a variance of 0.001 would make a difference. So now I have to add some form of smoothing (something humans do well, but computers have all sorts of ways and choices of formulae, all offering different results.)
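To show what I mean about smoothing choices, here are two common filters over some made-up noisy Tilt readings. Neither is "the right one," and they don't agree on what the current gravity is:

```python
# Made-up noisy readings; a real series would be much longer.
readings = [1.0132, 1.0128, 1.0131, 1.0127, 1.0130, 1.0126, 1.0129]

def moving_average(xs, n=3):
    """Simple trailing window average."""
    return [sum(xs[i - n + 1:i + 1]) / n for i in range(n - 1, len(xs))]

def ema(xs, alpha=0.3):
    """Exponential smoothing: newer points weighted by alpha."""
    out = [xs[0]]
    for x in xs[1:]:
        out.append(alpha * x + (1 - alpha) * out[-1])
    return out

# Two perfectly reasonable filters, two different "current gravities":
print(moving_average(readings)[-1])
print(ema(readings)[-1])
```

The two disagree only in the fourth or fifth decimal here, but remember the scale we're working at: the whole carbonation budget is 0.002.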
Then, of course, remember that there are points where we might want to make changes for which gravity is the only determining factor. So now I would have two different inputs to juggle - a schedule that takes pure attenuation into account, plus the "shape of the curve." If I coded that, how much time would I spend explaining it?
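A sketch of what that combined logic might look like. Every threshold here - the window size, the "flat" cutoff - is an arbitrary choice I'd then have to explain and defend:

```python
def looks_done(history, target_fg, window=4, max_drop=0.0003):
    """Hypothetical 'done' test combining both signals: the gravity must
    be at or below a target AND the curve must have flattened (total drop
    across the last `window` smoothed readings under `max_drop`)."""
    if len(history) < window:
        return False
    recent = history[-window:]
    flat = (recent[0] - recent[-1]) < max_drop
    attenuated = recent[-1] <= target_fg
    return flat and attenuated

# Still dropping fast: not done.
print(looks_done([1.042, 1.030, 1.020, 1.014], 1.0113))      # False
# Flattened out at/below the lab-predicted FG: call it done.
print(looks_done([1.042, 1.030, 1.020, 1.014,
                  1.0112, 1.0111, 1.0111, 1.0110], 1.0113))  # True
```

Change the window or the cutoff and the same history can flip from "done" to "not done" - which is the whole problem with handing this judgment to a computer.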
The point being, there's a sort of diminishing return here. How many hours to do this, versus the hours I need to allocate to the Python 2.7 to 3.x uplift which is becoming more mandatory every day? I think there are few people who really use the temp profile the way it was originally intended. I personally use it when I am ready to make a large change. For instance, when I think the beer is done, I might crash it 10°F over 24 hours. Until that point, I use Beer Constant.
Long story short: Select a Beer Constant temp that your yeast supplier recommends, and then you be the judge of when the beer is done in order to cold crash or whatever else you will do. Someone else's curve might not be the right one for your recipe, your equipment, your process and your yeast. Using temp control to keep what the yeast supplier recommends will be a large improvement in your process and your beer. The last 0.1% you might get from automating that last part is likely not something to which I'll be devoting a lot of time. Who knows though? I love a challenge.
My question is can we use an extra temp probe for a second FV in the same chamber and have the averages smoothed between the 3 probes? So normal chamber probe (one of) and one probe in each of two FV with the same brew or at least expecting the same temp schedule.
So now I have my coffee in hand and I thought about this while pouring a cup. There are two issues here I believe:
- Let's say you have one fermenter that's really taken off and a second that has not yet. The first will be generating its own heat - maybe 3-5 degrees if it really gets going. The second will sit very close to the chamber temp until it starts going well. If you lower the chamber temp based on an average, you will also be chilling down the one that's not fermenting as well yet, further slowing it down. The strength of a temp controller like this is the ability to hold a fermenting liquid to within ±0.1°F or so. If you average between two fermenters, you're throwing that advantage away and honestly making things worse. My recommendation for that scenario would be:
- Get a larger fermenter and do it in one vessel; or
- Use a second fermentation chamber and control both correctly; or
- Use chamber constant and at least know that the chamber is being well controlled. That's an "average" of sorts, and will at least not penalize the second fermenter based on the activity of the first. A fan in the chamber will really help here.
- Program Space - the Arduino Uno is severely limited, and the last change I made necessitated going through all the code and finding a way to remove 27 characters of string storage (27 letters that were in messages) so I could get it to run without crashing. Adding new functionality in the controller itself is unlikely without switching to a new platform.
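A toy illustration of the probe-averaging problem from the first bullet above, with made-up numbers:

```python
# Made-up probe readings: one active, self-heating fermenter and one
# that hasn't taken off yet, both in a chamber set to 68°F.
probe_active_fv = 74.0
probe_lagging_fv = 68.0
setpoint = 68.0

average = (probe_active_fv + probe_lagging_fv) / 2
error = average - setpoint
print(average, error)  # 71.0 3.0 -> the controller chills the chamber

# The chamber now cools BOTH vessels: not enough for the active one,
# and below setpoint for the lagging one, slowing it down further.
```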
Yikes, that's a lot of typing. It was an interesting set of questions though which present some challenges that are likely not well/optimally solved by computers. I enjoyed "talking" it out.