kanzimonson
Well-Known Member
I'm a baker by profession, and pulling my breads out of the oven at the proper internal temp matters to bread quality just as much as proper mash temps matter to beer quality. The owner of my bakery, who has 30 years of experience in the food science industry, makes a different argument for how to calibrate an analog thermometer.
He claims that analog thermometers are not as linear as we believe. Therefore, if you calibrate your thermometer at the freezing point, it's not going to be as accurate at higher temps, such as mashing temps or bread doneness. Calibrating at boiling would be better because it's closer to mashing, but that's still a good 50-70°F off.
Here's how we calibrate our analogs in the bakery: we calibrate a Thermapen at the freezing point, then use the Thermapen to calibrate the analogs in water that's close to our bread-doneness temps. This way, we've minimized the error at the temps that matter most. Of course, we're trusting the Thermapen to be correct, but at least it's computing its readings from its sensor rather than depending on a linear dial scale.
A point of evidence for my theory: have you ever calibrated your analog at freezing, and then tried it in boiling and it reads something other than 212 (usually lower)? Happens to me every time. So at this point, do you adjust your analog to boiling temp? But won't that throw off your freezing temp adjustment? Maybe you could try to strike a balance between the two? But then, how can you trust that any number is correct at this point?
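For anyone who'd rather split the difference mathematically than eyeball it: here's a minimal sketch (Python, and the names are mine, not any standard) of a two-point linear correction that uses both your ice-bath reading and your boiling-water reading. It assumes sea-level boiling at 212°F and that the dial's error is at least roughly linear between the two points, which is exactly the assumption worth questioning, but it's still better than trusting either single point alone.

```python
# Two-point linear correction for an analog dial thermometer.
# Assumes the dial's error is linear between the two reference points
# and that water boils at 212 deg F (sea level; adjust for altitude).
ICE_TRUE = 32.0    # deg F, true freezing point of water
BOIL_TRUE = 212.0  # deg F, true boiling point at sea level

def corrected(dial_reading, ice_reading, boil_reading):
    """Map a raw dial reading to a corrected temperature, using the
    readings the same dial gave in an ice bath and in boiling water."""
    span = (BOIL_TRUE - ICE_TRUE) / (boil_reading - ice_reading)
    return ICE_TRUE + (dial_reading - ice_reading) * span

# Example: a dial that reads true at freezing but 4 deg low at boiling.
# A mash reading of 150 on that dial is really about 152.7 deg F.
print(round(corrected(150, 32, 208), 1))
```

So instead of nudging the calibration nut back and forth, you can leave the dial alone and correct the reading on paper at the temps you care about.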
So you see why I wanted to bring this up. I think people have just blindly accepted their calibration methods as correct. Of course, I know proposing something that goes against brewers' laziness (like making a yeast starter FFS) always sparks outrage, but I'm curious how others feel about this.