A couple of buddies and I are pretty new to brewing, but we've read up quite a bit and done our research. We did 4 extract kits which turned out fairly well, then decided to venture into all-grain a couple of weeks ago.
Due to personal preference we are non-plastic, all stainless/metal. Since I knew our efficiency was going to be off on our first go-around, I assumed 65% efficiency and factored in a little extra grain on our first 2 batches a couple of weeks ago. It was super hard to maintain mash temps since, depending on where we took readings in the mash (grain vs. water pockets), we got varying temps. We only missed our gravities by about .006 though, so it wasn't too bad considering how much we struggled monitoring/maintaining mash temps.
We decided to get a pump and set up a quasi-"RIMS" system. We don't have a RIMS tube though, so we're direct heating with propane for now. Because of this, I thought our efficiency would be pretty decent, so I set it at 72% in BeerSmith and brewed away. We held our mash at 152°F consistently for the full 60 minutes (recirculating the whole time), mashed out, and batch sparged with 2 equal batches. This time we missed our gravities by about .007 and .008. I was so confident with our "RIMS" system that we'd get fairly decent efficiency because we were so spot-on with our temps.
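Since efficiency scales linearly with gravity points, you can back-calculate what your actual brewhouse efficiency was from the size of the gravity miss. A minimal sketch (the 1.052 target OG below is a made-up example; plug in your own recipe's numbers from BeerSmith):

```python
# Back-calculate actual brewhouse efficiency from a gravity miss.
# Assumption: efficiency scales linearly with gravity points
# (points = (SG - 1.000) * 1000), which holds for all-grain mashes.

def actual_efficiency(assumed_eff, target_og, measured_og):
    """Scale the assumed efficiency by the ratio of measured
    gravity points to predicted gravity points."""
    target_points = (target_og - 1.0) * 1000
    measured_points = (measured_og - 1.0) * 1000
    return assumed_eff * measured_points / target_points

# Hypothetical example: BeerSmith set to 72% efficiency,
# target OG 1.052, measured OG 1.044 (a .008 miss).
print(round(actual_efficiency(72, 1.052, 1.044), 1))  # about 60.9
```

Running your real numbers through this tells you what to enter in BeerSmith for the next batch, so you hit your target gravity even if the underlying crush/sparge issue isn't fixed yet.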
There are really only 2 things I can think of that could cause the issue. When we recirculate, we don't have a sparge arm or anything; our copper line just ends in a T, and we basically drip the wort off the 2 ends of the T back onto the grain bed. Because of this, we stirred the grains every 10 minutes or so. I'm assuming having some type of rotating sparge arm would help?
The only other thing I can think of is that we mill our grain at our LHBS. I've heard people say that LHBSs set their mill gap too wide (an inefficient crush) so that brewers have to buy more grain. We plan on investing in our own grain mill, but I'm curious whether anyone has any other suggestions?
I've made sure all of our thermometers are calibrated as well.