Apparent power, measured in volt-amperes (VA), represents the total power in an alternating current (AC) circuit. Real power, measured in watts (W), is the power actually consumed by the load. The power factor, a dimensionless value between 0 and 1, relates the two: it is the ratio of real power to apparent power (PF = W / VA). In a purely resistive circuit, the power factor is 1, and VA equals W. However, inductive or capacitive loads introduce reactance, causing a phase difference between voltage and current; this lowers the power factor, and VA exceeds W. For example, a device rated at 100 watts with a power factor of 0.8 would have an apparent power of 125 VA (100 W / 0.8 = 125 VA).
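The VA calculation above is simple enough to express directly. The sketch below, with a hypothetical helper name, applies PF = W / VA to recover apparent power from a wattage rating and a power factor:

```python
def apparent_power_va(real_power_w: float, power_factor: float) -> float:
    """Apparent power (VA) from real power (W) and power factor.

    Uses the relation PF = W / VA, so VA = W / PF.
    """
    if not 0.0 < power_factor <= 1.0:
        raise ValueError("power factor must be in (0, 1]")
    return real_power_w / power_factor

# The 100 W, PF 0.8 device from the text:
print(apparent_power_va(100, 0.8))  # 125.0
```

At PF = 1.0 (a purely resistive load), the function returns the wattage unchanged, matching the VA = W case described above.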
Understanding the distinction between apparent and real power is crucial for several reasons. Proper sizing of equipment such as generators and uninterruptible power supplies (UPS) requires considering the total VA demand to avoid overload. Ignoring the power factor and considering only watts can lead to undersized equipment that fails to deliver the necessary power. The growing prevalence of non-linear loads, such as electronic devices, has placed greater emphasis on VA ratings to ensure system reliability and efficiency.
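The sizing guidance above can be sketched as a small calculation: sum each load's VA demand (W / PF) rather than its wattage, then add a margin. The function name, example loads, and 25% headroom figure here are illustrative assumptions, not a standard:

```python
def required_ups_va(loads, headroom=0.25):
    """Estimate the VA rating needed to carry a set of loads.

    loads: iterable of (watts, power_factor) tuples.
    headroom: fractional sizing margin (0.25 = 25%), an assumed default.
    """
    # Sum VA per load, not watts: a low-PF load draws more VA than W.
    total_va = sum(watts / pf for watts, pf in loads)
    return total_va * (1 + headroom)

# Hypothetical loads: 300 W at PF 0.9 and 100 W at PF 0.6.
# VA demand: 300/0.9 + 100/0.6 = 500 VA; with 25% headroom -> 625 VA.
print(round(required_ups_va([(300, 0.9), (100, 0.6)])))  # 625
```

Note that summing only the wattages (400 W) would understate the demand by 100 VA here, illustrating how ignoring power factor leads to undersized equipment.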