This comes up a lot when folks compare programs head to head.
A couple of the biggest issues that cause consternation, especially when people compare programs, or take a recipe from BYO or somewhere else, input it, and find differences, have to do with batch size settings and which IBU formula the software defaults to.
The final volume of a lot of Palmer and Jamil's recipes, and some of those in magazines, is usually 5.5 or 6 gallons, whereas most of us write recipes for the standard 5 gallons. That alone often accounts for differences in what the software reports; the same grain bill diluted into 6 gallons instead of 5 will show a noticeably lower OG. Make sure the final volumes match.
The other thing is that there are several different formulas used to calculate IBUs (Tinseth, Rager, and Garetz are the common ones), and they give different numbers. Somewhere in the manual or in the software it should tell you what the default setting is, and even give you the option to change it to match, but often it isn't made obvious.
Here's an explanation, for example, of how Beer Calculus calculates it, from the Hopville blog:
Previously, the default IBU calculation for Beer Calculus was based on an average of a few popular formulas. It did four calculations (Garetz, Rager, Tinseth, and the legacy Hopville calc) and averaged them together. I chose to blend a few conflicting numbers together instead of committing to a single one by default. That neutral position tended to cause some confusion among both types of brewers: those who cared which formula was in use, but didn’t know you could change it, and those who didn’t care at all. Plus, the only indication that a formula selection was being made was a subtle message “avg” near the IBU result – pretty vague about what was happening behind the scenes. Recipes now default to the Tinseth formula. Hopefully this will satisfy those who prefer this formula, and also clarify the default calculation to folks who don’t really care.
So Beer Calculus defaults to Tinseth. Comparing it against software that defaults to a different formula is going to show different IBUs.
One of the most recent threads discussing this is here: http://www.homebrewtalk.com/f84/diff...ftware-218066/
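For reference, here's roughly what the Tinseth math looks like for a single hop addition. This is just a sketch based on the published formula; the function and variable names are mine, and real software layers on corrections for things like pellets vs. whole hops.

[code]
import math

def tinseth_ibu(alpha_acid_pct, hop_ounces, boil_minutes, batch_gallons, boil_gravity):
    """Rough sketch of the Tinseth IBU estimate for one hop addition."""
    # Alpha acids added to the wort, in mg/L
    mg_per_liter = alpha_acid_pct / 100 * hop_ounces * 7490 / batch_gallons

    # "Bigness" factor: higher-gravity worts utilize hops less efficiently
    bigness = 1.65 * 0.000125 ** (boil_gravity - 1.0)

    # Boil-time factor: utilization climbs with boil length, then levels off
    boil_time_factor = (1 - math.exp(-0.04 * boil_minutes)) / 4.15

    return mg_per_liter * bigness * boil_time_factor

# Example: 1 oz of 6% AA hops, 60 minute boil, 5 gallons of 1.050 wort -> ~21 IBU
print(round(tinseth_ibu(6.0, 1.0, 60, 5.0, 1.050), 1))
[/code]

Run the same addition through Rager or Garetz and you'll get a different number, which is the whole point: none of them is "the" IBU, they're just different models.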
The other thing has to do with the efficiency a given recipe was created with versus the efficiency setting in the particular software. 75% is usually the default, but a lot of folks, especially people who have their systems dialed in, may have a higher or lower efficiency setting in their own software, so the anticipated OG and FG may be different.
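As a rough illustration of how much that one setting matters (the point value and numbers below are just typical assumptions, not pulled from any particular program):

[code]
def estimated_og(grain_lbs, ppg, efficiency, batch_gallons):
    """Estimate original gravity from a simple single-grain bill.
    ppg = points per pound per gallon for the grain (~37 for 2-row)."""
    gravity_points = grain_lbs * ppg * efficiency / batch_gallons
    return 1 + gravity_points / 1000

# Same 10 lb grain bill into 5 gallons at two different efficiency settings
print(round(estimated_og(10, 37, 0.75, 5), 3))  # ~1.056 at 75%
print(round(estimated_og(10, 37, 0.65, 5), 3))  # ~1.048 at 65%
[/code]

Eight gravity points of difference from the efficiency setting alone, with nothing wrong in either program.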
None of these are faults in the software, or mean that one program is better than another. Often it's the user's own settings that are off.
But in terms of accuracy, they're all accurate; you might think of them simply as speaking different languages. As long as you stay consistent in using one over any other, it will be right.
In reality it's all somewhat arbitrary anyway; they're just numbers. A better analogy than the languages one above would be Fahrenheit vs. Celsius, or Brix vs. specific gravity: all valid, accurate scales that just present the same data differently.
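To stretch that analogy, converting between the scales is just arithmetic. The temperature conversion is exact; the Brix one below is only the common rough rule of thumb, not an exact standard:

[code]
def c_to_f(celsius):
    """Exact conversion between the two temperature scales."""
    return celsius * 9 / 5 + 32

def sg_to_brix(sg):
    """Rough rule of thumb: about 4 gravity points per degree Brix."""
    return (sg - 1) * 1000 / 4

print(c_to_f(20))         # 68.0 F
print(sg_to_brix(1.048))  # ~12 Brix
[/code]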