I know this is probably a subject that has been discussed and beaten to death, but I've spent the last 2 hours reading long threads about PWM, and I haven't really seen anyone answer what I'm looking for.
From all of the discussion, it looks like people who are making their own PWM circuits or modifying existing ones are designing them with a 0.25-2Hz base frequency. In some of the recent projects I've worked on, I've used triacs that did zero-crossing detection and switched at the 60Hz line frequency. Switching the SSR on at the zero crossing, rather than somewhere else in the cycle, makes for much cleaner power.
Is the common ~1Hz solution just there to minimize mid-cycle switching, or is it to keep the PWM implementation simpler? Also, if you're using a 555 circuit and a pot to control the duty cycle, do you limit the minimum duty cycle so that you're not switching both on and off within a single cycle of the mains power?
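To make the question concrete, here's a rough sketch of how I've been thinking about it (my own assumptions, not from any particular controller): with a zero-crossing SSR on 60Hz mains, the output can only change state at zero crossings, so a slow PWM period effectively quantizes into whole half-cycles, and any requested on-time shorter than one half-cycle just rounds to off.

```python
# Assumption: a zero-crossing SSR on 60 Hz mains can only switch at
# zero crossings, so a slow PWM period divides into half-cycle "slots".

MAINS_HZ = 60
PWM_PERIOD_S = 1.0                               # ~1 Hz base frequency
HALF_CYCLES = int(2 * MAINS_HZ * PWM_PERIOD_S)   # 120 slots per PWM period

def effective_duty(requested: float) -> float:
    """Round the requested duty cycle to the nearest whole half-cycle."""
    on_slots = round(requested * HALF_CYCLES)
    return on_slots / HALF_CYCLES

# A 1 Hz period gives 120 half-cycle steps, i.e. ~0.83% resolution.
print(effective_duty(0.50))    # exactly representable: 60 of 120 slots
print(effective_duty(0.003))   # below half a slot, rounds to fully off
```

If that model is right, a faster PWM base frequency would just shrink the number of slots per period and coarsen the resolution, which would be one argument for the ~1Hz convention.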
I'm using a pretty powerful and customizable industrial control system, so I can implement my PWM however makes the most sense. As I start to tinker with it, though, I don't want to just blindly follow the HBT conventions without knowing some of the background reasons.
Any light you can shed on that background would be greatly appreciated.