
Power Conditioner vs Voltage Regulator: the ultimate battle in power quality gear. To understand the difference, we must look at some history and the reasons each of these important power protection components evolved. Let's take a deep dive:
Power Conditioner vs Voltage Regulator Introduction
Since the advent of electronic systems, power-related disturbances have been able to destroy components, disrupt system operation, and interfere with productivity. Almost everyone has experienced the effects of power problems at one time or another, and it is a common belief that system failures are caused by voltage "sags" and "surges." Electronic technology is continuously evolving, however, and it is important to recognize that this evolution has changed the way systems respond to power disturbances. The switch-mode power supply (SMPS) effectively addressed these issues, but its adoption in place of the linear power supply opened modern computers to a fatal flaw of its own.
The Evolution
When John Atanasoff and Clifford Berry invented the first digital computer at Iowa State University in 1939, they built a machine whose fundamental logic circuitry relied on vacuum tubes. These were high-voltage, low-current devices powered by a basic linear power supply. From the ENIAC, EDVAC, and UNIVAC systems that followed to the more familiar systems of the mid-1980s, little changed in power supply design. By the late '80s, however, engineers had begun using large numbers of integrated circuits, which themselves were being built with increasing numbers of transistor junctions. The result was a "low voltage" computer that used substantial amounts of current. Linear power supply technology of the time was inefficient; a supply capable of meeting the current delivery requirements of the rapidly growing computer circuitry would have been significantly larger than its predecessors. Designers were striving to make computers smaller, and larger power supplies were simply not compatible with that goal. The result was the introduction of the SMPS. This design eliminated the 60 Hz step-down transformer and series regulator section in favor of a pulse-width-modulated, high-frequency circuit capable of converting rectified line voltage into usable, well-regulated DC power for the computer's logic circuitry.
Fundamental Differences
This technological change is responsible for some fundamental differences in the way systems respond to power problems. The linear power supply rectified incoming line voltage and fed the logic circuitry through a series regulator. The range of this regulator was limited, however, and an input voltage that was too high or too low would quickly cause problems. Low input voltage would cause the supply's output to "fold back," dropping below the operating tolerance of the logic circuitry. Input voltage that was too high would activate the power supply's "crowbar" protection circuit, and in the process of protecting itself, the supply's output would again fall below the tolerance of the computer's electronics. Because line voltage variations are frequent, sags and surges were commonly the culprit in early electronic system failures. Dedicated electrical circuits were the first line of defense against this condition, and when they proved ineffective, a voltage regulator was normally specified.
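To make that narrow operating window concrete, here is a minimal Python sketch of the behavior just described. The turns-ratio scaling, headroom, and trip thresholds are illustrative assumptions, not figures from any particular supply.
```python
# Sketch of the linear supply's narrow window, per the behavior described
# above: the series-pass regulator can only drop excess voltage, so the
# unregulated DC behind it must stay inside a band. Too low and the output
# folds back; too high and protection trips. All values are illustrative
# assumptions, not a real design.

V_OUT    = 5.0    # regulated logic rail, volts
HEADROOM = 2.0    # minimum drop the pass element needs, volts
V_DC_NOM = 9.0    # assumed unregulated DC at a nominal 120 V line, volts
V_DC_MAX = 11.0   # above this, treat protection as tripping (assumed)

def supply_state(line_v: float) -> str:
    v_dc = V_DC_NOM * line_v / 120.0   # unregulated DC tracks the line
    if v_dc < V_OUT + HEADROOM:
        return "foldback: output sags below logic tolerance"
    if v_dc > V_DC_MAX:
        return "crowbar: protection fires, output collapses"
    return "regulating normally"

for line_v in (120.0, 100.0, 90.0, 150.0):  # nominal, mild sag, deep sag, surge
    print(f"{line_v:5.1f} V line -> {supply_state(line_v)}")
```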
Switch-mode supplies are very different. The series regulator has been eliminated along with the input step-down transformer, and the supply draws current from the AC line for only portions of each power line cycle. Not only are switch-mode supplies considerably smaller and more efficient, they are also largely immune to voltage sags and surges. The explanation lies in the way they operate.
Duty Cycle Is Everything
Because the switch-mode supply draws current for only a brief period, much can happen to the line voltage while the switcher is "turned off" with little effect on its operation. If line voltage sags or surges while the supply is "turned on," the supply compensates for the variation by adjusting its duty cycle, the fraction of each cycle over which it conducts. With less peak voltage available, the supply simply draws current for a longer period of time, and its outputs still deliver well-regulated +5 and +12 volts under full rated load.
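A rough numeric illustration of this compensation, assuming an idealized buck-derived converter stage where the output is the input scaled by the duty cycle (real converters add rectification, transformer ratios, and losses):
```python
# Minimal sketch: how an idealized buck-derived SMPS stage holds its output
# steady by lengthening its duty cycle as the input voltage sags. The
# topology and the voltage figures are illustrative assumptions, not a
# specific product's design.

V_OUT = 12.0  # regulated output rail, volts

def required_duty_cycle(v_in: float) -> float:
    """For an ideal buck stage, Vout = D * Vin, so D = Vout / Vin."""
    return V_OUT / v_in

for v_in in (170.0, 150.0, 120.0, 100.0):  # nominal bus, then deepening sags
    d = required_duty_cycle(v_in)
    print(f"input {v_in:5.1f} V -> duty cycle {d:.3f} ({d*100:.1f}% on-time)")
```
As the input falls, the controller stretches the "on" time in the same proportion, which is why the regulated rails never see the sag.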
Built In Voltage Regulation
The tolerance of switch-mode power supplies to voltage regulation problems is well documented. In fact, it is this inherent tolerance to voltage variation that makes it possible to run a modern system from a standby UPS, in which the computer may operate entirely without input power for as much as 5 or 6 milliseconds while the load is transferred to a battery-powered inverter. Switch-mode power supplies can be said to contain their own "built-in" regulation capability. It is important to note that most voltage regulators can only regulate adequately down to 75% of nominal line voltage; switch-mode supplies naturally tolerate voltages well below the regulation range of most regulators.
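A back-of-the-envelope calculation shows why several milliseconds of ride-through is plausible: the energy stored in the supply's bulk capacitor carries the load while the input is absent. The capacitance, bus voltages, and load power below are illustrative assumptions.
```python
# Hold-up time estimate for an SMPS bulk capacitor. The usable energy is
# what is stored between the nominal DC bus voltage and the minimum bus
# voltage the converter can still regulate from. All component values are
# illustrative assumptions.

C_BULK = 470e-6   # bulk capacitance, farads (assumed)
V_NOM  = 325.0    # peak of a rectified 230 V line, volts (~170 V on 120 V lines)
V_MIN  = 200.0    # assumed minimum bus voltage the converter regulates from
P_LOAD = 250.0    # power drawn by the load, watts (assumed)

# Usable stored energy: E = 0.5 * C * (V_nom^2 - V_min^2)
energy_j = 0.5 * C_BULK * (V_NOM**2 - V_MIN**2)
holdup_ms = energy_j / P_LOAD * 1000.0

print(f"stored energy: {energy_j:.1f} J -> hold-up ≈ {holdup_ms:.1f} ms")
```
With these assumed values the hold-up comfortably exceeds the 5 or 6 millisecond transfer gap of a standby UPS.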
Compatibility Issues Abound
The most popular types of regulators are tap-switching autotransformers (or transformers) and ferroresonant transformers. Regardless of type, these regulators accomplish their function by controlling the current flowing in an electrical circuit, and this has implications for the proper operation of switch-mode supplies. Voltage regulators tend to be high-impedance sources, which restrict the amount of current available to the supply at any instant. Under these circumstances the switch-mode supply can be "starved" for current, causing significant voltage distortion on the output of the regulator. Significant noise generation may result, and there is conjecture in the industry about the stress placed on the supply by permanently altering its duty cycle. These are compatibility issues of the first order. Voltage regulation is simply no longer necessary for switch-mode technology, and eliminating its misapplication eliminates any concern for compatibility along with it.
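A simple way to see the "starvation" effect is to compare how much voltage the SMPS's current peaks shave off each crest of the waveform on a stiff source versus a high-impedance regulator output. The impedance and current figures below are illustrative assumptions.
```python
# Sketch of why a high-impedance regulator output distorts under an SMPS
# load: the supply draws its current in short peaks near each voltage crest,
# and each peak drops V = I * Z across the source impedance, flattening the
# waveform. The impedance and current values are illustrative assumptions.

PEAK_CURRENT = 8.0  # amps drawn by the SMPS at the crest of each half-cycle

for source, z_ohms in (("stiff branch circuit", 0.25),
                       ("high-impedance regulator output", 2.0)):
    droop = PEAK_CURRENT * z_ohms
    print(f"{source:32s}: {droop:4.1f} V shaved off each voltage crest")
```
The stiff source barely notices the pulse; the high-impedance source loses a substantial slice of each crest, which is exactly the flat-topped distortion seen at a regulator's output under switch-mode loads.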
Appropriate Solutions
In the migration from the linear supply to the switcher, the input step-down transformer was eliminated, and with it the system's natural immunity to common-mode noise and high-voltage impulses was totally lost. Today's power protection solutions recognize that these immunities must be restored. An appropriate solution for modern systems incorporates a surge diverter, an isolation transformer, and a noise filter. These three elements work in concert with the natural voltage-regulating ability of the switch-mode supply to provide all the power protection elements modern systems require.
Conclusion
Voltage regulators no longer provide any needed protection for modern computer systems. Their continued use is largely due to the industry's failure to educate its customers about the power protection needs of modern systems. Solutions that combine isolation transformers, surge diverters, and noise filters are far more effective and do not introduce compatibility issues that can create more power problems than they solve.