Basically, you check what it reads at two temperatures you're sure about (such as boiling water and ice water, i.e. water at the transition between states). Water generally won't exist above its boiling point unless it's under pressure, so if your thermometer reads 215 F in boiling water at sea level (where water boils at 212 F under standard atmospheric pressure), it's off by three degrees at that end. Then check it in ice water, water that's actively turning to ice, which should read 32 F. If you have a liquid-based thermometer (alcohol or mercury), I believe that's all you need to calibrate it, since the correction should be linear: there may be a scaling factor (say x1.2) and an offset (say -3), but it shouldn't be more complicated than converting between Celsius and Fahrenheit.
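To make the two-point idea concrete, here's a minimal sketch in Python of how you could turn the ice-water and boiling-water readings into a linear correction. The function name, argument names, and the example readings (33 F in ice water, 215 F in boiling water) are hypothetical, just for illustration; the 32 F / 212 F reference points assume sea-level atmospheric pressure.

```python
def make_calibration(raw_ice, raw_boil, true_ice=32.0, true_boil=212.0):
    """Return a function that converts a raw thermometer reading to a
    corrected temperature, assuming the error is linear (scale + offset),
    which should hold for liquid-based (alcohol/mercury) thermometers.

    raw_ice / raw_boil: what the thermometer actually read in ice water
    and in boiling water; true_ice / true_boil: the known reference
    temperatures (32 F and 212 F at sea level).
    """
    scale = (true_boil - true_ice) / (raw_boil - raw_ice)
    offset = true_ice - scale * raw_ice
    return lambda raw: scale * raw + offset

# Hypothetical example: a thermometer that reads 33 F in ice water
# and 215 F in boiling water at sea level.
correct = make_calibration(raw_ice=33.0, raw_boil=215.0)
print(correct(215.0))  # maps back to ~212 F
print(correct(33.0))   # maps back to ~32 F
```

Note that if the thermometer's only problem were a constant offset, the two reference readings would give the same correction and the scale factor would come out to 1.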
As I just pointed out, though, water generally won't exist* at atmospheric pressure above its boiling point, so if you have boiling water, you've got your reference right! If you're losing a lot of volume, you may want to reduce the power supplied, since that will reduce evaporation; but I wouldn't worry about the temperature of the liquid, as the excess energy should be lost through evaporation.
*As water is a poor conductor of heat, heat may build up in some areas (typically near the source) and could cause "scorching". A rolling boil helps move the liquid around and distribute the heat, so it's a good thing to aim for. If you can reduce the power and still maintain one, go for it.
If anyone wants to disagree, please point out my mistakes! I'm afraid I've already decided to RDWHAHB as exams are finally over!