To echo what others have said first - measuring efficiency depends heavily on getting accurate gravity and volume measurements, and +/- 0.004 is pretty darn close - nice work!
When you're following a recipe, it's typically making an assumption about your overall efficiency (usually called brewhouse efficiency) being somewhere in the 70%-80% range. So assuming your gravity and volume were measured perfectly, it means you exceeded their 'guess' at your efficiency by a few percent.
Overall, efficiency is a measure of how much potential sugar made it into your wort. Grains have an extract potential value, which is what you'd get at 100% efficiency. For example, Beersmith lists basic US 2-row pale malt with a 1.036 potential, or 36 'points'. This value is a measure of points per pound per gallon. In this example, if you used 1 pound of 2-row to make 1 gallon of wort at 100% efficiency, it should come out at a gravity of 1.036. (1 lb x 36 points) / 1 gal = 36, or 1.036 SG.
Let's say you dump another gallon of water in there - you would get (1 lb x 36 points) / 2 gal = 18 points, or 1.018.
Boil it back down to 1 Gal and you should be back at 1.036.
The idea is that, unless you somehow gain or lose sugar along the way, your total gravity points stay the same - only the volume they're dissolved in changes.
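If it helps to see the arithmetic laid out, here's a quick Python sketch of the points-per-pound-per-gallon math above (the function name and layout are just my own, using the 2-row numbers from the example):

```python
def gravity(ppg, pounds, gallons):
    """Specific gravity from total gravity points spread over a volume.

    ppg: the grain's extract potential in points per pound per gallon
    (e.g. 36 for US 2-row). Total points = ppg * pounds; dividing by
    gallons gives points per gallon, and 1000 points = 1.000 of gravity.
    """
    return 1 + (ppg * pounds) / gallons / 1000

# 1 lb of 2-row in 1 gal at 100% efficiency:
print(round(gravity(36, 1, 1), 3))  # 1.036

# Dilute the same sugar into 2 gal - half the points per gallon:
print(round(gravity(36, 1, 2), 3))  # 1.018
```

Boiling back down to 1 gallon just reverses the second call - same total points, smaller volume, so you land back at 1.036.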
Now let's say you actually go and do this experiment (1 gal of wort from 1 lb of 2-row) and you get a gravity of 1.027, or 27 points. Your efficiency is simply 27 / 36, or 75%.
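That efficiency calculation is just measured points over potential points, which you can sketch in a couple of lines (again, the function name is my own shorthand, not from any brewing tool):

```python
def efficiency(measured_points, ppg, pounds, gallons):
    """Fraction of the grain's potential sugar that made it into the wort.

    measured_points: your actual gravity reading in points (e.g. 27 for 1.027)
    ppg * pounds / gallons: the points you'd see at 100% efficiency
    """
    potential_points = ppg * pounds / gallons
    return measured_points / potential_points

# 1.027 measured from 1 lb of 2-row (36 ppg) in 1 gal:
print(efficiency(27, 36, 1, 1))  # 0.75, i.e. 75%
```

The same function works for a full-size batch - plug in your total grain bill, your measured volume, and the weighted potential of your grains.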
There are LOTS of things that can influence efficiency, from grain crush, to dead space losses, to mash pH and water chemistry, but it all boils down (no pun intended) to how well you were able to get all of that potential sugar into your soon-to-be-beer.