Don't compare neutral lines; just consider the power demand of what you are running. If you are running something (a heating element, for example) that is pulling 15 amps, you are fine. If you are running two elements on the same 30 amp circuit, each pulling 15 amps, then you are pushing it. To be safe, use 80% as a rule of thumb: if it's a 30 amp circuit, anything over 24 amps is pushing it.
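That 80% rule of thumb is just arithmetic, so here's a quick sketch of it (the function names are mine, just for illustration):

```python
# Rough check for the 80% continuous-load rule of thumb described above.

def max_continuous_amps(breaker_amps: float) -> float:
    """Continuous loads should stay at or under 80% of the breaker rating."""
    return breaker_amps * 0.8

def is_within_rule(load_amps: float, breaker_amps: float) -> bool:
    """True if the total load stays under the 80% guideline."""
    return load_amps <= max_continuous_amps(breaker_amps)

print(max_continuous_amps(30))   # 24.0 -- the cutoff for a 30 A circuit
print(is_within_rule(15, 30))    # True  -- one 15 A element is fine
print(is_within_rule(30, 30))    # False -- two 15 A elements is pushing it
```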
Also, on a 110/120V circuit the hot wire acts as the 'in' and the neutral acts as the 'out' (if that makes any sense); under normal conditions the current on each should match. A GFCI compares these two, and if they don't match up it trips, sensing that current is leaking from the hot to somewhere other than the neutral (could be through a properly installed ground, or it could be through your body [hopefully not]). This isn't the 'scientific' explanation, but it helps to basically understand it. A straight 220/240V circuit doesn't use a neutral; the two hots are out of phase with each other (their voltages swing in opposite directions many times per second, so the hots take turns acting as in and out).
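The hot-vs-neutral comparison the GFCI does can be sketched as a toy model. The ~5 mA trip threshold is the typical figure for US personal-protection GFCIs (roughly 4-6 mA); the function and values here are just illustrative, not from any real device:

```python
# Toy model of a GFCI's trip decision: compare current going out on the
# hot with current coming back on the neutral. Any mismatch means current
# is leaking somewhere else (ground wire, or worse, a person).

TRIP_THRESHOLD_AMPS = 0.005  # ~5 mA, typical for US GFCI receptacles

def gfci_trips(hot_amps: float, neutral_amps: float) -> bool:
    """Trip if hot and neutral currents differ by more than the threshold."""
    leakage = abs(hot_amps - neutral_amps)
    return leakage > TRIP_THRESHOLD_AMPS

print(gfci_trips(12.000, 12.000))  # False -- all current returns on neutral
print(gfci_trips(12.010, 12.000))  # True  -- 10 mA is leaking somewhere
```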
I'm not an electrician or an engineer, but this is how I understand it in layman's terms. I may get bashed for not using the proper terminology or processes, but in a quick minute it helped me to understand it.