Electromagnetic Induction

By R.W. Hurst, Editor



Electromagnetic induction is the process of generating electric current by changing a magnetic field, forming the basis for transformers, generators, and electric motors in power generation and energy conversion systems.

 

What is Electromagnetic Induction?

Electromagnetic induction is a fascinating and vital concept shaping how we generate, transfer, and utilize electricity.

✅ Process where a changing magnetic field induces an electric current in a conductor

✅ Fundamental principle behind transformers, generators, and electric motors

✅ Essential for modern power generation, transmission, and energy conversion

 

Its underlying principles, Faraday's Law and Lenz's Law, explain the phenomena that occur when magnetic fields interact with conductors. At the same time, its applications, such as electric generators, transformers, induction heating, and wireless charging, have revolutionized our daily lives. As we continue to explore and understand this phenomenon, we can look forward to even more innovations and advancements in electricity and magnetism. Understanding Faraday's Law is essential to grasp how changing magnetic fields generate electricity through electromagnetic induction.

 


 

Fundamental Laws of Electromagnetic Induction

Faraday's Law of electromagnetic induction, formulated by Michael Faraday, is a key principle stating that the EMF induced in a coil is proportional to the rate of change of the magnetic flux (dΦ/dt) passing through the coil. The more rapidly the magnetic field changes, the greater the induced voltage. This relationship between electricity and magnetism laid the foundation for the development of generators, transformers, and other key electrical devices and systems.
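As a quick illustration of this proportionality, the sketch below (Python, using assumed example values for the coil and flux change) estimates the average EMF induced when the flux through a coil changes over a given time interval.

```python
# Minimal sketch of Faraday's Law: average EMF = -N * (change in flux) / (change in time).
# The coil turns, flux values, and time interval are assumed example numbers.

def average_emf(turns: int, flux_initial_wb: float, flux_final_wb: float, dt_s: float) -> float:
    """Average induced EMF (volts) for a coil of `turns` turns as flux changes over dt_s seconds."""
    d_flux = flux_final_wb - flux_initial_wb
    return -turns * d_flux / dt_s

# Example: flux through a 200-turn coil rises from 0.01 Wb to 0.03 Wb in 0.5 s.
print(average_emf(200, 0.01, 0.03, 0.5))  # -8.0 V (the sign follows Lenz's Law convention)
```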

Lenz's Law, another crucial principle, explains the direction of the induced current. It states that the induced current always flows in a direction that opposes the change in magnetic flux producing it, acting as a stabilizing influence. By combining Faraday's Law and Lenz's Law, we can better understand how electromagnetic induction operates in real-world applications. For a deeper understanding of how voltage is produced in coils, explore the principles of Lenz's Law.

 

Core Applications: Generators and Transformers

One notable application is the electrical generator, which converts mechanical energy into electrical energy. In a typical generator, a coil of wire rotates within a magnetic field, creating a changing magnetic flux. This change induces a voltage and current in the coil, generating electricity. Generators are commonly used in power plants, automobiles, and other devices requiring a continuous electrical supply. A three-phase transformer uses electromagnetic induction to efficiently transfer power between circuits at different voltages.
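A common idealization of such a generator is a flat coil rotating at constant speed in a uniform field, giving e(t) = N·B·A·ω·sin(ωt). The sketch below evaluates that expression; the turns, field strength, coil area, and rotational speed are assumed example values, not figures from the article.

```python
import math

# Sketch of generator EMF for a coil rotating in a uniform magnetic field:
# e(t) = N * B * A * omega * sin(omega * t).  All parameter values are assumed for illustration.
N = 100                          # turns
B = 0.5                          # flux density, tesla
A = 0.02                         # coil area, m^2
rpm = 3600                       # rotational speed
omega = 2 * math.pi * rpm / 60   # angular speed, rad/s

peak_emf = N * B * A * omega
for t in (0.0, 0.002, 0.004):    # a few instants in time, seconds
    print(f"t = {t:.3f} s  e = {peak_emf * math.sin(omega * t):7.1f} V")
print(f"peak EMF = {peak_emf:.1f} V")
```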

Another critical application is the transformer, which transfers electrical energy between circuits with different voltages. Transformers consist of two induction coils wrapped around a shared magnetic core. When an alternating current flows through the primary coil, it creates a magnetic field that induces a voltage in the secondary coil. By adjusting the number of turns in the coils, transformers can step up or down the voltage as needed for specific electrical systems.
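For an ideal transformer, the voltage ratio follows the turns ratio, Vs/Vp = Ns/Np. The short sketch below applies that relation to assumed winding counts and an assumed primary voltage to show step-up and step-down behaviour (losses ignored).

```python
# Ideal-transformer sketch: secondary voltage follows the turns ratio (losses ignored).
# Winding counts and the 120 V primary voltage are assumed example values.

def secondary_voltage(v_primary: float, n_primary: int, n_secondary: int) -> float:
    return v_primary * n_secondary / n_primary

print(secondary_voltage(120.0, 500, 2500))  # step-up:   600.0 V
print(secondary_voltage(120.0, 500, 50))    # step-down:  12.0 V
```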

 

Special Phenomena: Eddy Currents, Mutual Inductance, and Self-Inductance

Eddy currents are an intriguing aspect of electromagnetic induction. These currents form when a magnetic field changes, causing swirling, closed loops of electric current within a conductor. Eddy currents generate heat and can lead to energy losses in electrical systems, but they also have practical applications, such as in metal detectors and induction heating. Devices that use electromagnetic induction, such as electric motors, rely on core principles of inductance to regulate energy transfer.

Two critical aspects of electromagnetic induction are mutual inductance and self-inductance. Mutual inductance occurs when the current in one coil induces a voltage in a nearby coil due to their shared magnetic field. Self-inductance refers to a coil's ability to induce a voltage within itself when the current flowing through it changes. Understanding these concepts is crucial for designing and analyzing complex electrical circuits and systems.
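Both effects can be written in the same form: a self-induced EMF is -L·di/dt, and the EMF mutually induced in a neighbouring coil is -M·di/dt. The sketch below evaluates both for assumed inductances and an assumed rate of current change.

```python
# Sketch of self- and mutual inductance.  L, M, and di/dt are assumed example values.

def induced_emf(inductance_h: float, di_dt_a_per_s: float) -> float:
    """EMF (volts) induced by a changing current: e = -L * di/dt (or -M * di/dt for a coupled coil)."""
    return -inductance_h * di_dt_a_per_s

L_self = 0.25    # self-inductance of the coil, henries
M_mutual = 0.05  # mutual inductance with a nearby coil, henries
di_dt = 40.0     # rate of change of current, A/s

print(induced_emf(L_self, di_dt))    # -10.0 V across the coil itself
print(induced_emf(M_mutual, di_dt))  #  -2.0 V induced in the neighbouring coil
```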

 

Demonstrating the Principle: Coil and Magnet Interaction

In many demonstrations, electromagnetic induction involves the interaction of a coil and a magnet, either a permanent magnet or an electromagnet. A bar magnet, for example, may be moved near a coil, or the coil may be moved around the magnet. This motion changes the magnetic flux through the coil and induces a voltage in it, demonstrating the dynamic relationship between electricity and magnetism. Learning the basics of electrical resistance helps explain how eddy currents generate heat in conductive materials.

 

Modern Impact and Emerging Technologies

Electromagnetic induction has an immense impact on modern life, from generating electricity in power plants to operating devices like transformers, electric motors, and wireless chargers. The principles of Faraday's Law and Lenz's Law help explain the complex phenomena that occur when magnetic fields interact with conductors. By harnessing these principles, scientists and engineers have created practical applications that have revolutionized our daily lives.

Electromagnetic induction demonstrates the fundamental relationship between electricity and magnetism and is integral to many aspects of our modern world. Faraday's Law and Lenz's Law underpin this phenomenon, governing the induced voltage and current when magnetic fields change. The applications of electromagnetic induction range from electric generators and transformers to the study of eddy currents, mutual inductance, and self-inductance. By understanding these principles, we can continue to innovate and develop new technologies to improve our daily lives and advance our understanding of the world.

 

Induction Heating

Electromagnetic induction heating is another practical application that relies on electromagnetic induction. This technology utilizes eddy currents to heat conductive materials, such as metals. By placing a conductive material within a rapidly changing magnetic field, eddy currents are generated within the material, producing heat due to their resistance. This heating method has been employed in various industrial processes, including metal hardening, welding, and cooking.

Wireless charging is an emerging technology that utilizes electromagnetic induction to transfer energy between two coils. When an electric current flows through the primary coil, it generates a magnetic field, which induces a voltage in the secondary coil, providing power to the device. This technology has seen increasing adoption in recent years, particularly for charging electric vehicles and smartphones, offering convenience and reduced dependence on cords and cables.

 


Electricity Cost Principles Explained

Electricity cost reflects kWh rates, tariffs, demand charges, power factor penalties, and TOU peak/off-peak pricing, driven by load profiles, utility billing, transmission and distribution fees, and efficiency measures in industrial, commercial, and residential systems.

 

What Is Electricity Cost?

Electricity cost is the total price per kWh including energy, demand, and network charges under applicable tariffs.

✅ Includes energy (kWh), demand (kW), and fixed charges

✅ Varies by TOU tariffs, peak/off-peak, and seasons

✅ Affected by power factor, load profile, and efficiency

 

Understanding electricity cost starts with how much electricity we consume and how that consumption is measured.

At its core, understanding power use starts with grasping what electricity is and how it behaves in circuits.

Power use at any moment is measured in watts. For a refresher, see what a watt represents to relate device ratings to instantaneous power. For example:

  • A 100-watt light bulb uses 100 watts.
  • A typical desktop computer uses 65 watts.
  • A central air conditioner uses about 3500 watts.

These device ratings illustrate electric load in practical terms as each appliance contributes to total demand.

To understand electricity pricing, you first need to know how much energy you're using. When you use 1000 watts for an hour, that's a kilowatt-hour (kWh). For example:

  • Ten 100-watt light bulbs on for an hour, is 1 kWh
  • Ten 100-watt light bulbs on for 1/2 an hour, is 0.5 kWh
  • Ten 50-watt light bulbs on for an hour, is 0.5 kWh
  • One 60-watt light bulb on for an hour, is 0.06 kWh (60/1000)
  • Running a 3500-watt air conditioner for an hour is 3.5 kWh.
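The same arithmetic in a short sketch: energy in kWh is watts × hours ÷ 1000. The figures below simply reuse the bulb and air-conditioner examples from the list above.

```python
# Energy (kWh) = power (W) x time (h) / 1000.  Examples mirror the list above.

def kwh(watts: float, hours: float, quantity: int = 1) -> float:
    return quantity * watts * hours / 1000.0

print(kwh(100, 1.0, quantity=10))  # ten 100 W bulbs for an hour     -> 1.0 kWh
print(kwh(100, 0.5, quantity=10))  # ten 100 W bulbs for half an hour -> 0.5 kWh
print(kwh(60, 1.0))                # one 60 W bulb for an hour        -> 0.06 kWh
print(kwh(3500, 1.0))              # 3500 W air conditioner for 1 h   -> 3.5 kWh
```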

The average U.S. household used 10,654 kWh a year in 2001, or 888 kWh/mo. (Dept. of Energy) The U.S. as a whole used 3,883 billion kWh in 2003, or 13,868 kWh per person based on a population of 300 million. (Dept. of Energy)

 

Watt-hours

For smaller items we use the term watt-hours instead of kilowatt-hours. For example, we say a 60-watt light bulb uses 60 watt-hours of electricity per hour, rather than 0.060 kWh. If you're unsure, this overview of what a watt-hour means clarifies the relationship between power and time.

Note that the "-hours" part is important. Without it we'd have no idea what period of time we were talking about.

If you ever see a reference without the amount of time specified, it's almost certainly per hour.

If your device lists amps instead of watts, then just multiply the amps times the voltage to get the watts. For example:

2.5 amps x 120 volts = 300 watts

Trivia: On a peak day in 2009, California used 50,743 megawatt-hours of electricity, or 50,743,000,000 watt-hours.

How much does electricity cost?

 


It depends on where you live (like Ontario), how much you use, and possibly when you use it. There are also fixed charges that you pay every month no matter how much electricity you use. For example, I pay $6/mo. for the privilege of being a customer of the electric company, no matter how much energy I use. Local infrastructure and electricity supply conditions can also influence pricing tiers.

Check your utility bill for the rates in your area. If it's not on your bill then look it up on the utility's website. National summaries of electricity prices help you compare trends across regions.

The electric company measures how much electricity you use in kilowatt-hours. The abbreviation for kilowatt-hour is kWh. Note that on your bill there can be multiple charges per kWh (e.g., one for the "base rate", another for "fuel") and you have to add them all up to get the total cost per kWh. This measurement is recorded by a watt-hour meter that cumulatively tracks energy over time.

Most utility companies charge a higher rate when you use more than a certain amount of energy, and they also charge more during summer months when electric use is higher. As an example, here are the residential electricity rates for Austin, Texas (as of November 2003):

  • First 500 kilowatt-hours: 5.8¢ per kilowatt-hour (kWh)
  • Additional kilowatt-hours (May-Oct.): 10¢ per kWh
  • Additional kilowatt-hours (Nov.-Apr.): 8.3¢ per kWh

These figures include a fuel charge of 2.265¢ per kWh.
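To show how tiered pricing works, the sketch below applies the Austin rates above to the 888 kWh average monthly usage cited below; treating the month as a summer month is an assumption for illustration.

```python
# Sketch of a tiered residential bill using the Austin, Texas rates quoted above
# (5.8 cents/kWh for the first 500 kWh, 10 cents/kWh for additional summer kWh).
# The 888 kWh monthly usage is the DOE average cited below; the summer month is assumed.

def summer_bill(kwh_used: float, tier1_limit: float = 500.0,
                tier1_rate: float = 0.058, tier2_rate: float = 0.10) -> float:
    tier1 = min(kwh_used, tier1_limit)
    tier2 = max(kwh_used - tier1_limit, 0.0)
    return tier1 * tier1_rate + tier2 * tier2_rate

print(f"${summer_bill(888):.2f}")  # 500*0.058 + 388*0.10 = $67.80
```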

The average cost of residential electricity was 9.86¢/kWh in the U.S. in March 2006. The average household used 888 kWh/mo. in 2001 and would pay $87.56 for it based on the March 2006 average rate. (Dept. of Energy)

The cost of electricity varies by region. In 2003 the price ranged from 5.81¢ in Tennessee to 12¢ in California, 14.314¢ in New York, and 16.734¢ in Hawaii. In Summer 2001, electricity was a whopping 20¢/kWh in parts of California.

 


Electricity Power Explained

Electricity power underpins voltage, current, and wattage in generation, transmission, and distribution across the grid, integrating AC/DC systems, transformers, generators, load management, power factor correction, and renewable energy for efficient, reliable supply.

 

What Is Electricity Power?

Electricity power is energy transfer rate: voltage times current, yielding watts in AC/DC systems.

✅ P = V * I; apparent, real, reactive power relationships

✅ Power factor improves efficiency and reduces I^2R losses

✅ Transformers step voltage for transmission efficiency

 

The makeup of the electricity power sector has been in transition in recent history. In most provinces, there has been a transition from traditional electric utilities (which are mostly provincially owned crown corporations) to private generation, transmission, and distribution assets. For context, industry overviews such as electricity generation in Canada outline how policy and market structures influence capacity additions today.

In 2014, Canadian utilities and industrial generators produced more than 639 terawatt-hours of electricity. When it comes to hydroelectric power, Canada is the world's second-largest producer, with hydroelectricity accounting for 59.3 per cent of Canada's supply. Other sources that contribute include coal, uranium, natural gas, petroleum, and non-hydro renewables. About 67 per cent of Canada's electricity comes from renewable sources like wind, solar, and hydroelectric power stations, and 82 per cent from non-greenhouse-gas-emitting sources. A breakdown of major fuels is available through sources of electricity for readers comparing provincial mixes.

Hydropower output is intrinsically linked to water availability and operations, and resources on water and electricity explain how hydrology and infrastructure shape generation profiles.

In 2014, almost 59 terawatt-hours of electricity were exported to the United States, while about 13 terawatt-hours were imported from the United States. These trade patterns reflect how provinces balance load and generation portfolios, as noted in discussions of electricity supply dynamics under varying market conditions.

Canada's industrial sector is responsible for the largest share of electricity demand because of energy-intensive industrial activities like mining, oil and gas production, pulp and paper production, and other primary industry uses. The residential and commercial sectors consume much smaller quantities of electricity for lighting, heating, and communications. Detailed consumption patterns can be explored in Canada's electricity demand by sector and region.

R&D programs are aimed at maintaining the operating capability of the nation's existing nuclear power plants and developing the next generation of nuclear technologies. Nuclear energy is the world's largest source of emission-free electricity. The Nuclear Energy program is working to develop cost-efficient technologies that further enhance nuclear safety, minimize the generation of nuclear waste, and further reduce the risk of proliferation. Within the broader generation portfolio, resources like nuclear are profiled in guides on how electricity is generated across different technologies.

To meet the rising demand of the 21st century, significant improvements in America’s electric system are necessary. Blackouts serve as a powerful reminder of the critical role electricity plays in the everyday lives of people.

 

Voltages, Currents, and Power

Electric current, voltage, and circuit behaviour all rely on the flow of electrons. Voltage drives current through loads such as solid-state devices and other equipment that needs electric power. Electric power is the rate at which electrical energy is transferred, measured in watts, while voltage is measured in volts. In AC systems, related quantities include apparent power, reactive power, and real power in watts. For a refresher on fundamental concepts and terminology, see what is electricity to connect these quantities with practical applications.
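A minimal sketch relating these quantities for a single-phase AC load: real power P = V·I·pf, apparent power S = V·I, and reactive power Q = sqrt(S² - P²). The voltage, current, and power factor below are assumed example values.

```python
import math

# Single-phase power-triangle sketch.  Voltage, current, and power factor are assumed values.
V = 240.0   # RMS volts
I = 20.0    # RMS amps
pf = 0.85   # power factor (lagging)

S = V * I                      # apparent power, volt-amperes
P = S * pf                     # real power, watts
Q = math.sqrt(S**2 - P**2)     # reactive power, VAR

print(f"S = {S:.0f} VA, P = {P:.0f} W, Q = {Q:.0f} VAR")
```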

Electrical engineers and scientists continue to develop new technologies for storing energy and transmitting it through power circuits, improving the energy efficiency of the electric industry. For instance, the copper wires used in typical transmission lines lose a percentage of the electricity passing through them because of resistance over long distances, which causes the wires to heat up. But "superconducting" materials have no resistance, and if they are used to transmit electricity in the future, very little of the electricity will be lost.

 


Faraday's Law Explained

Faraday’s Law describes electromagnetic induction, where changing magnetic fields generate electricity. Discovered in 1831 by Michael Faraday, it revealed how magnetism produces electric current and became the foundation of modern electrical science.

 

What is Faraday’s Law?

Faraday's Law is a fundamental principle of electromagnetism describing how a changing magnetic field induces an electromotive force (emf) in a conductor. The law: 

✅ Explains electromagnetic induction and induced current

✅ Is essential for transformers, generators, and electric motors

✅ Is the foundation of modern physics and electrical engineering

For a deeper look at the mathematics, mechanisms, and engineering uses, see our detailed guide on Faraday’s Law of Induction.

 

Michael Faraday, the Discoverer of Faraday’s Law

In 1831, Michael Faraday performed experiments with coils of wire, magnets, and galvanometers. He observed that when a magnet was moved through a coil, or when the magnetic field around the coil was changed, a current was generated.

Faraday’s Law proved that electricity could be produced directly from magnetism — a revolutionary concept at the time. Faraday’s experiments were deceptively simple but profoundly important, showing that energy could be converted from one form (mechanical motion of the magnet) into another (electric current). 

 

 

Theoretical Context

Faraday’s Law provided one of the first clear demonstrations that electricity and magnetism are not separate forces but part of the same phenomenon. This insight was later formalized by James Clerk Maxwell, whose Maxwell’s equations mathematically unified electromagnetism.

Alongside Ampère’s Law and the Biot–Savart Law, Faraday’s Law gave scientists a new framework to describe the invisible forces of nature. It also introduced the idea of fields, regions of influence surrounding magnets and electrical charges, which became central to modern physics. Read our article titled: Who Invented Electricity for more information about the foundation of electricity.

 

Understanding Electromagnetic Induction

Faraday’s Law showed that induction can occur in several ways:

  • By moving a conductor through a magnetic field

  • By changing the strength of a magnetic field

  • By altering the orientation or size of the conductor’s loop in the field

In each case, the result is the same: a changing magnetic flux produces an EMF. The direction of the induced current is explained by Lenz’s Law, which states that the induced current always opposes the change that created it, ensuring energy conservation.

The principle of electromagnetic induction is the basis for the operation of various electrical devices, including:

  • Generators: These devices convert mechanical energy into electrical energy by rotating a coil within a magnetic field, causing the magnetic flux to change and inducing an EMF in the coil.

  • Transformers: Transformers work on the principle of mutual induction, where a changing current in the primary coil produces a varying magnetic field, which in turn induces an EMF in the secondary coil.

  • Induction motors: In these motors, a changing magnetic field is created by the stator, inducing an EMF in the rotor conductors, which generates a current that interacts with the magnetic field to produce torque and drive the motor.

 

Coils, Flux, and Electrodynamics

Faraday’s induction experiments showed that the induced voltage in a coil is equal to the rate of change of magnetic flux through it. This relationship depends on several factors, including the magnetic field strength, the loop area, and the number of turns in the coil. In fact, the induced EMF is proportional to the rate at which a coil of wire cuts across magnetic field lines, meaning that more turns in the coil increase the overall flux linkage.
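Written compactly in standard notation (not reproduced from the article itself), the relationship described above is:

```latex
% Faraday's Law for an N-turn coil; Phi_B is the magnetic flux through one turn
\mathcal{E} = -N \frac{d\Phi_B}{dt},
\qquad
\Phi_B = B A \cos\theta \quad \text{(uniform field through a flat loop)}
```

Here N is the number of turns, B the flux density, A the loop area, and θ the angle between the field and the loop's normal, which is why field strength, loop area, and turn count all affect the induced EMF.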

From the perspective of electrodynamics, this effect illustrates how charges respond to forces within a conductor. The Lorentz force explains why electrons move when exposed to a changing field, while a time-varying field generates a non-conservative electric field that sustains induction. These principles are embedded in the Maxwell–Faraday equation, which extends Faraday’s insight into the framework of Maxwell’s equations.

Faraday also noted that induction is not limited to mutual coupling between separate coils. The principle of self-induction arises when a coil’s own changing current produces an induced EMF opposing the original change. This phenomenon became central to the design of inductors and transformers.

Ultimately, the legacy of Faraday’s Law extends to our understanding of electromagnetic waves. By linking induction with Maxwell’s field theory, Faraday’s Law explains how time-varying fields generate waves that propagate through space — a concept that underpins modern communications and physics.

 

Why Faraday’s Law Mattered

At the time of its discovery, electricity was known mainly from batteries and static experiments. Faraday’s Law proved that electricity could be generated in continuous and controlled ways. This breakthrough opened the door to inventions that would transform society, from power generation to communication technologies.

 

Everyday Applications of Faraday’s Law

The importance of Faraday’s Law extends far beyond the laboratory. It forms the basis of countless technologies, including:

  • Electric generators – converting mechanical energy into electrical power.

  • Transformers – enabling efficient power transmission over long distances.

  • Induction motors – powering everything from industrial machines to household appliances.

  • Wireless charging systems – transferring energy without physical connections.

  • Magnetic storage and security – used in swipe cards and sensors.

  • Induction heating and cooktops – converting magnetic energy into heat for practical use.

  • Eddy current braking – providing non-contact braking in trains, amusement rides, and industrial systems by inducing currents that resist motion.

  • Inductors in circuits – storing energy in magnetic fields and controlling current in electrical systems.

  • Wireless energy transfer in EVs – enabling electric vehicles to charge without plugs using electromagnetic induction.

These applications show how Faraday’s 19th-century experiments continue to power the 21st-century world.

 

Faraday’s Legacy

Faraday’s Law was not just a scientific milestone but also a turning point for technology and industry. Without his insights, there would be no electric power grids, no telecommunications as we know them, and no modern electronics.

The law also highlighted an important principle: that simple experiments can lead to world-changing insights. With little formal education, Faraday reshaped science and technology, showing that careful observation and curiosity could unlock nature’s secrets.

Faraday’s Law remains one of the most important discoveries in physics. By revealing that changing magnetic fields could create electricity, it provided the foundation for modern electrical engineering, industrial power systems, and everyday devices.

Faraday’s Law stands as a reminder that the forces of nature — magnetism, electricity, and motion — are deeply interconnected. Faraday’s insight into electromagnetic induction continues to shape the modern world, demonstrating that a discovery made nearly two centuries ago remains a vital force in our lives today.

 

Frequently Asked Questions

 

What did Faraday’s Law prove?

Faraday’s experiments with coils and magnets in 1831 proved that a changing magnetic field can induce an electromotive force (EMF) in a conductor. This showed that electricity can be generated from magnetism, a discovery that revolutionized science and technology.

 

Why is Faraday’s Law important today?

Faraday’s Law is the foundation of electrical power generation and transmission. It explains how generators, transformers, and induction motors work, making it essential to modern energy systems, communication technologies, and countless everyday devices.

 

How does Faraday’s Law relate to Maxwell’s equations?

Faraday’s Law was later formalized in Maxwell’s equations as the Maxwell–Faraday equation, which shows that a changing magnetic field produces an electric field. This unification confirmed that electricity and magnetism are two aspects of the same force.

 


Definition of a Rectifier

Definition of a rectifier explains AC to DC conversion using diodes, bridge or full-wave circuits, filters, and regulators to supply stable DC, reduce ripple, and power electronics, chargers, and DC motor drives.

 

What Is the Definition of a Rectifier?

An explanation of devices that convert AC to DC using diodes and filtering to deliver regulated, low-ripple DC power.

✅ Converts alternating current to direct current via diodes

✅ Topologies: half-wave, full-wave, bridge rectifier

✅ Uses filters and regulators to minimize ripple and stabilize DC

 

Understanding the definition of a rectifier is fundamental to grasping the complexities of power electronics and electrical engineering. Rectifiers are pivotal in converting AC power to DC power, a critical process for operating numerous electronic devices. By delving into the intricacies of half-wave, full-wave, and bridge rectifiers, as well as the silicon-controlled rectifier, it becomes clear that these devices are crucial in shaping the modern world of electronics. For readers new to the basics, an overview of the difference between AC and DC can clarify foundational concepts.


 

In electrical engineering, the term refers to a device that plays a crucial role in converting alternating current (AC) into direct current (DC). It is an essential component in power electronics, facilitating the conversion of AC power, commonly used in households and businesses, to the DC power that electronic devices rely on. To contextualize this conversion, understanding alternating current behavior in typical distribution systems explains the need for rectification.

They come in various forms, such as diodes, mercury arc valves, and silicon-controlled rectifiers (SCRs). A diode, the most basic rectifying component, allows current to flow in only one direction. This feature helps convert AC power, which alternates direction, to DC power that flows in a single, continuous direction. For measurement and notation conventions, see how DC current is defined and applied in circuit analysis.

AC to DC conversion regulates voltage to create a steady and reliable energy supply. Throughout this conversion, rectifiers often assume different configurations depending on the specific application or requirement. Two popular configurations are the half-wave rectifier and the full-wave rectifier. Downstream stabilization is often handled by a voltage regulator that smooths residual ripple and safeguards loads.

A half-wave type allows only one half of the AC waveform to pass through while blocking the other half. This type is relatively simple and cost-effective but has a few downsides, such as lower efficiency and increased ripple voltage. Ripple voltage refers to the residual periodic variation of the DC output voltage. The result is an unsteady DC output, which might be undesirable in certain applications.

On the other hand, the full-wave type allows both halves of the AC waveform to pass through, effectively doubling the output frequency and reducing the ripple voltage. This configuration improves efficiency and provides a smoother, more consistent DC output voltage. The full-wave type has two primary forms: the center-tap configuration and the bridge rectifier.

The bridge rectifier is a highly efficient configuration that uses four diodes in a specific arrangement. This configuration eliminates the need for a center-tapped transformer, a key component in center-tap full-wave rectifiers, thus reducing overall size and cost. In addition, the bridge rectifier provides better voltage regulation and reduced harmonic distortion, making it ideal for various applications. For consistent terminology across datasheets, reviewing key electricity terms can prevent misinterpretation during design.

They serve numerous purposes in various fields. For instance, they are used in power supply circuits of electronic devices, battery chargers, and welding equipment. In addition, they help in voltage regulation for DC power supplies and extract radio signals from transmitted modulated signals. Many of these uses depend on stable direct current characteristics to ensure predictable device behavior.

Silicon-controlled rectifiers (SCRs) represent a more advanced type of rectifier. As silicon-based semiconductor switches, SCRs can control large amounts of current with a small input signal. This ability to manage current flow in a controlled manner sets SCRs apart from regular rectifiers, like diodes. As a result, SCRs are often used in power control applications, such as motor speed controllers, light dimmers, and temperature regulators.

Understanding how rectifiers play a pivotal role in industries and everyday applications is essential. One such application is in electric vehicles (EVs), where rectifiers contribute to charging systems by converting AC power from the grid to DC power for charging the battery. Additionally, they are utilized in renewable energy systems, like solar power and wind power, to convert the harvested energy into a stable and usable form for various applications. At a broader level, a concise electricity definition helps frame how conversion stages integrate with generation, storage, and loads.

Another application is in telecommunications equipment, where they help maintain a continuous and stable power supply. In the aviation and marine industries, rectifiers ensure sensitive equipment operates with minimal interference from fluctuating power sources. Moreover, they are employed in power supply units of medical devices, computers, and other critical equipment that demand stable power supplies.


How does it work in AC to DC conversion?

Its primary function is to convert AC voltage, which periodically reverses direction, into DC voltage, which flows in one direction. In its simplest form, it consists of a diode, which allows current to pass in only one direction. When connected to an AC power source, the diode rectifies the current by blocking the negative half-cycles of the AC waveform. More advanced configurations, like full-wave rectifiers and bridge rectifiers, rectify both the positive and negative half-cycles, increasing efficiency and providing a smoother output.
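A hedged simulation of that behaviour: the sketch below samples one cycle of a sine wave and compares half-wave rectification (negative half-cycles blocked) with full-wave rectification (both half-cycles passed with one polarity). The amplitude, frequency, and sample count are assumed example values, and no filtering is modelled.

```python
import math

# Sketch comparing half-wave and full-wave rectification of a 60 Hz, 170 V-peak sine wave.
# Amplitude, frequency, and sample count are assumed example values; no filtering is modelled.
PEAK_V, FREQ_HZ, SAMPLES = 170.0, 60.0, 1000

period_s = 1.0 / FREQ_HZ
times = [n * period_s / SAMPLES for n in range(SAMPLES)]
vin = [PEAK_V * math.sin(2 * math.pi * FREQ_HZ * t) for t in times]

half_wave = [max(v, 0.0) for v in vin]   # single diode blocks the negative half-cycle
full_wave = [abs(v) for v in vin]        # bridge passes both half-cycles with one polarity

avg = lambda samples: sum(samples) / len(samples)
print(f"half-wave average output: {avg(half_wave):5.1f} V")  # ~ Vpeak/pi   = 54 V
print(f"full-wave average output: {avg(full_wave):5.1f} V")  # ~ 2*Vpeak/pi = 108 V
```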


What are the differences between half-wave and full-wave rectifiers?

The key difference between half-wave and full-wave rectifiers lies in their efficiency and output. A half-wave type only allows one-half of the AC waveform to pass through, blocking the other half. As a result, it has lower efficiency, increased ripple voltage, and produces an unsteady DC output.

On the other hand, a full-wave type allows both halves of the AC waveform to pass through, leading to increased efficiency and reduced ripple voltage. In addition, the full-wave type produces a smoother, more consistent DC output voltage, making it more suitable for applications that require a stable power supply. The full-wave type can be implemented using a center-tap or bridge rectifier configuration.

 


Electrical Energy

Electrical energy is the power derived from the movement of electrons through a conductor. It powers homes, industries, and modern technology. This form of energy can be generated from various sources, including fossil fuels, solar, wind, and hydroelectric systems.

 

What is Electrical Energy?

Electrical energy is the result of electric charge flow, used to perform work in countless applications across daily life and industry.

✅ Powers lighting, heating, and electronic devices across all sectors

✅ Generated by power plants from renewable and non-renewable sources

✅ Converted from and into other energy forms, like mechanical or thermal

 

Electrical Energy: The Fundamental Source of Power

Electric charge is a property of certain subatomic particles (e.g., electrons and protons) that interacts with the electromagnetic field and causes attractive and repulsive forces between them. Electric charge gives rise to one of the four fundamental forces of nature and is a conserved property of matter that can be quantified. In this sense, the phrase "quantity of electricity" is used interchangeably with "charge of electricity" and "quantity of charge." There are two types of charge: we call one kind positive and the other negative. Through experimentation, we find that like-charged objects repel and oppositely charged objects attract one another. Coulomb's law gives the magnitude of the force of attraction or repulsion. For a broader understanding, visit how electricity works to see how it is generated, transmitted, and used across power grids and homes.

 

How Electric Fields Transfer Energy Between Charges

Michael Faraday introduced the concept of the electric field. The field force acts between two charges, in the same way that the gravitational field force acts between two masses. However, the electric field is a little different: gravitational force depends on the masses of two bodies, whereas the electric force depends on their electric charges. While gravity can only pull two masses together, the electric force can be attractive or repulsive. If both charges have the same sign (e.g., both positive), there will be a repulsive force between the two. If the charges are opposite, an attractive force will exist between the two bodies (as seen in static electricity). The magnitude of the force varies inversely with the square of the distance between the two bodies and is proportional to the product of the unsigned magnitudes of the two charges. Discover how electrical load determines the demand placed on a system’s energy supply.
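The hedged sketch below evaluates Coulomb's law, F = k·q₁·q₂/r², for two assumed point charges, showing both the magnitude and the attract/repel behaviour described above.

```python
# Coulomb's law sketch: F = k * q1 * q2 / r**2.  Charges and separation are assumed values.
K = 8.9875e9  # Coulomb constant, N*m^2/C^2

def coulomb_force(q1_c: float, q2_c: float, r_m: float) -> float:
    """Signed force in newtons: positive => repulsion (like charges), negative => attraction."""
    return K * q1_c * q2_c / r_m**2

print(coulomb_force(2e-6, 3e-6, 0.05))   # +21.6 N, like charges repel
print(coulomb_force(2e-6, -3e-6, 0.05))  # -21.6 N, opposite charges attract
```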

 

Electrical Energy and Voltage Explained

The electric potential difference between two points is defined as the work done per unit charge (against electric forces) in moving a positive point charge slowly from one point to another. Choose one point as a reference with zero potential; the electric potential at any other point equals the work required per unit charge to move a positive test charge from the reference point to that location. For isolated charges, the reference point is usually taken to be infinity. Potential is measured in volts (1 volt = 1 joule/coulomb). The electric potential is analogous to temperature: there is a different temperature at every point in space, and the temperature gradient indicates the direction and magnitude of the driving force behind heat flow. Similarly, there is an electric potential at every point in space, and its gradient indicates the direction and magnitude of the driving force behind the movement of charge. To understand how voltage influences electrical energy, it's essential to grasp how potential difference drives current through circuits.

 

Electric Current: Flow of Charge That Powers Technology

Current is a flow of electric charge, and its intensity is measured in amperes. Examples of electric currents include metallic conduction, where electrons flow through a conductor such as a metal wire, and electrolysis, where ions (charged atoms) flow through liquids. The charged particles themselves often move quite slowly, while the electric field that drives them propagates at close to the speed of light. See energy conduction for more information. Alternating current and direct current each transfer electrical energy differently, depending on the application.

Devices that utilize charge flow principles in materials are referred to as electronic devices.

A direct current (DC) is a unidirectional flow, while an alternating current (AC) reverses direction repeatedly. The time average of an alternating current is zero, but its RMS value is not.
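A small sketch of that point: the average of a sampled sine wave over one full cycle is essentially zero, but its RMS value is not. The amplitude below is an assumed example value.

```python
import math

# Average vs. RMS of one cycle of a sine wave (amplitude is an assumed example value).
PEAK = 170.0
N = 10000
samples = [PEAK * math.sin(2 * math.pi * n / N) for n in range(N)]

mean = sum(samples) / N
rms = math.sqrt(sum(v * v for v in samples) / N)
print(f"time average = {mean:.3f} V")   # ~0
print(f"RMS value    = {rms:.1f} V")    # PEAK / sqrt(2) = 120.2 V
```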

Ohm's Law is an important relationship describing how current, voltage, and resistance are related. Explore the principles behind Ohm’s Law to see how voltage, current, and resistance interrelate in determining energy flow. Learn how electrical resistance affects energy loss and heat generation in systems.

For historical reasons, current is said to flow from the most positive part of a circuit to the most negative part. The current thus defined is referred to as conventional current. It is now known that, depending on the conditions, a current can consist of a flow of charged particles in either direction, or even in both directions at once. The positive-to-negative convention is widely used to simplify this situation. If another definition is used, for example "electron current", it should be explicitly stated.

 


Voltage Drop Formula

The voltage drop formula is V = I × R, where V represents the voltage drop, I represents the current in amperes, and R represents the resistance in ohms. It calculates the amount of voltage lost through a conductor and is essential for designing safe and efficient electrical systems.

 

What is Voltage Drop Formula?

A voltage drop formula can help you manually calculate the voltage drop in cables under full load in branch circuits. It does not matter if you are working with copper or aluminum conductors. To accurately calculate the voltage drop for a given cable size, length, and current, you need to know the resistance of the type of cable you’re using. 

✅ V = I × R for DC circuits

✅ Accounts for impedance in AC circuits

✅ Helps ensure safe, efficient electrical design

Use our voltage drop calculator to quickly determine wire size, conductor efficiency, and line loss for safer, more reliable electrical installations.

 


To determine voltage drop in an electrical system, engineers and electricians use a standard formula that accounts for conductor resistance, current, and circuit length. The National Electrical Code (NEC) provides guidelines for the maximum voltage drop allowed in various installations to ensure efficiency and safety. For both single-phase and three-phase circuits, excessive voltage drop can lead to poor equipment performance and energy loss. To simplify calculations, a voltage drop calculator can quickly assess whether a circuit meets NEC recommendations and optimize conductor sizing accordingly. You can learn more about how electrical resistance affects circuits in our detailed voltage drop guide, which explains causes, calculations, and solutions for safe, efficient power distribution.

 

DC / single phase calculation

The voltage drop V in volts (V) is equal to the wire current I in amps (A) times 2 times the one-way wire length L in feet (ft) times the wire resistance per 1000 feet R in ohms (Ω/kft) divided by 1000:

 

Vdrop (V) = Iwire (A) × Rwire (Ω)

= Iwire (A) × (2 × L (ft) × Rwire (Ω/kft) / 1000 (ft/kft))

 

The voltage drop V in volts (V) is equal to the wire current I in amps (A) times 2 times the one-way wire length L in meters (m) times the wire resistance per 1000 meters R in ohms (Ω/km) divided by 1000:

 

Vdrop (V) = Iwire (A) × Rwire (Ω)

= Iwire (A) × (2 × L (m) × Rwire (Ω/km) / 1000 (m/km))

 

3-phase calculation

The line-to-line voltage drop V in volts (V) is equal to the square root of 3 times the wire current I in amps (A) times one-way wire length L in feet (ft) times the wire resistance per 1000 feet R in ohms (Ω/kft) divided by 1000:

 

Vdrop (V) = √3 × Iwire (A) × Rwire (Ω)

= 1.732 × Iwire (A) × (L(ft) × Rwire (Ω/kft) / 1000(ft/kft))

 

The line-to-line voltage drop V in volts (V) is equal to the square root of 3 times the wire current I in amps (A) times one-way wire length L in meters (m) times the wire resistance per 1000 meters R in ohms (Ω/km) divided by 1000:

 

Vdrop (V) = √3 × Iwire (A) × Rwire (Ω)

= 1.732 × Iwire (A) × (L(m) × Rwire (Ω/km) / 1000(m/km))
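The sketch below implements the single-phase and three-phase formulas above; the current, one-way run length, and resistance per 1000 feet are assumed example values (roughly 12 AWG copper), not figures taken from the article.

```python
import math

# Voltage-drop sketch using the formulas above.  Current, one-way length, and
# wire resistance per 1000 ft are assumed example values (roughly 12 AWG copper).

def vdrop_single_phase(i_a: float, length_ft: float, r_ohm_per_kft: float) -> float:
    return i_a * (2 * length_ft * r_ohm_per_kft / 1000.0)

def vdrop_three_phase(i_a: float, length_ft: float, r_ohm_per_kft: float) -> float:
    return math.sqrt(3) * i_a * (length_ft * r_ohm_per_kft / 1000.0)

I, L_ft, R_kft = 16.0, 100.0, 1.588
print(f"single-phase drop: {vdrop_single_phase(I, L_ft, R_kft):.2f} V")  # ~5.08 V
print(f"three-phase drop:  {vdrop_three_phase(I, L_ft, R_kft):.2f} V")   # ~4.40 V
```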

 

Wire diameter calculations

The n gauge wire diameter dn in inches (in) is equal to 0.005 in times 92 raised to the power of (36 minus the gauge number n) divided by 39:

 

dn (in) = 0.005 in × 92^((36-n)/39)

 

The n gauge wire diameter dn in millimetres (mm) is equal to 0.127 mm times 92 raised to the power of (36 minus the gauge number n) divided by 39:

 

dn (mm) = 0.127 mm × 92^((36-n)/39)

 

Wire cross-sectional area calculations

The n gauge wire's cross-sectional area An in kilo-circular mils (kcmil) is equal to 1000 times the square of the wire diameter dn in inches (in):

 

An (kcmil) = 1000 × dn² = 0.025 kcmil × 92^((36-n)/19.5)

 

The n gauge wire's cross-sectional area An in square inches (in²) is equal to pi divided by 4 times the square of the wire diameter dn in inches (in):

 

An (in²) = (π/4) × dn² = 0.000019635 in² × 92^((36-n)/19.5)

 

The n gauge wire's cross-sectional area An in square millimetres (mm²) is equal to pi divided by 4 times the square of the wire diameter dn in millimetres (mm):

 

An (mm²) = (π/4) × dn² = 0.012668 mm² × 92^((36-n)/19.5)

 

Wire resistance calculations

The n gauge wire resistance R in ohms per kilofoot (Ω/kft) is equal to 0.3048 × 10⁹ times the wire's resistivity ρ in ohm-metres (Ω·m) divided by 25.4² times the cross-sectional area An in square inches (in²):

 

R (Ω/kft) = 0.3048 × 10⁹ × ρ (Ω·m) / (25.4² × An (in²))

 

The n gauge wire resistance R in ohms per kilometre (Ω/km) is equal to 10⁹ times the wire's resistivity ρ in ohm-metres (Ω·m) divided by the cross-sectional area An in square millimetres (mm²):

 

R (Ω/km) = 10⁹ × ρ (Ω·m) / An (mm²)
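The sketch below implements the gauge formulas above for diameter, area, and resistance; the copper resistivity used is an assumed typical figure (about 1.68 × 10⁻⁸ Ω·m at 20 °C), so the printed resistances are approximate.

```python
import math

# AWG wire sketch based on the formulas above.  The copper resistivity value is
# an assumed typical figure (~1.68e-8 ohm-metres at 20 C), so results are approximate.

def awg_diameter_mm(n: int) -> float:
    return 0.127 * 92 ** ((36 - n) / 39)

def awg_area_mm2(n: int) -> float:
    return math.pi / 4 * awg_diameter_mm(n) ** 2

def resistance_ohm_per_km(n: int, resistivity_ohm_m: float = 1.68e-8) -> float:
    return 1e9 * resistivity_ohm_m / awg_area_mm2(n)

for gauge in (10, 12, 14):
    print(f"AWG {gauge}: d = {awg_diameter_mm(gauge):.3f} mm, "
          f"A = {awg_area_mm2(gauge):.3f} mm^2, "
          f"R = {resistance_ohm_per_km(gauge):.2f} ohm/km")
```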

 

