What is Inductance?

Inductance is the property of an electrical circuit that resists changes in current by storing energy in a magnetic field. It underpins transformers, motors, inductors, and AC power systems, making it essential in the field of electrical engineering.

 

What is Inductance?

Inductance is a fundamental electrical property that describes how circuits resist changes in current flow by generating a magnetic field.

✅ Opposes changes in current by storing energy in a magnetic field

✅ Measured in henrys (H), used in inductors and transformers

✅ Essential in AC power systems, filters, and motor design

This principle is central to electromagnetic induction and underlies many devices in modern power and electronics.

The principle of inductance was first discovered by Michael Faraday, who showed that moving a magnet through a coil of wire induced a voltage across the coil. This discovery, known as Faraday’s Law, explains the phenomenon of electromagnetic induction. Heinrich Lenz later refined this understanding, demonstrating that the induced voltage always acts to oppose the change that created it — a principle now known as Lenz’s Law. These discoveries underpin the operation of inductors, transformers, and motors.

Because inductance opposes changes in current, an inductor presents high impedance to high-frequency signals while passing lower-frequency signals and DC with little opposition. This is why inductors are sometimes referred to as "chokes": they effectively block higher frequencies. A common application of a choke is in a radio amplifier biasing circuit, where the collector of a transistor must be supplied with a DC voltage without allowing the RF (radio frequency) signal to conduct back into the DC supply. To understand how inductance interacts with voltage changes in AC circuits, see our article on voltage.



Imagine a wire 1,000,000 miles (about 1,600,000 kilometres) long. Imagine that we form this wire into a huge loop and then connect its ends to the terminals of a battery, as shown in Figure 1, to drive current through the wire. Inductors are often paired with capacitors in circuits, so it's helpful to review the basics of what is a capacitor and how they function.

If we used a short wire for this experiment, the current would begin to flow immediately and attain a level limited only by the resistance in the wire and the resistance in the battery. But because we have an extremely long wire, the electrons require some time to work their way from the negative battery terminal, around the loop, and back to the positive terminal. Therefore, it will take some time for the current to reach its maximum level. Since inductance is used to oppose changes in current, our guide on what is current electricity provides essential background.

The magnetic field produced by the loop will initially be small during the first few moments when current flows only in part of the loop. The field will build up as the electrons get around the loop. Once the electrons reach the positive battery terminal, allowing a steady current to flow around the entire loop, the magnetic field quantity reaches its maximum and levels off, as shown in Figure 2. At that time, we'll have a certain amount of energy stored in the magnetic field. The amount of stored energy will depend on the inductance of the loop, which depends on its overall size. We symbolize inductance, as a property or as a mathematical variable, by writing an italicized, uppercase letter L. Our loop constitutes an inductor. To abbreviate "inductor," we write an upper-case, non-italicized letter L.

 


 

Fig. 1. We can use a huge, imaginary loop of wire to illustrate the principle of inductance.

 

Counter EMF and the RL Time Constant

As current begins to flow in a conductor, the expanding magnetic field induces a counter-electromotive force (emf) that opposes the applied voltage. At the instant of starting, this counter emf nearly equals the source voltage, keeping the current flow small. As the magnetic field stabilizes, the counter emf weakens until the current reaches its maximum.

Because inductors resist changes in current, the current in an RL circuit cannot change instantly. The rate of change is governed by the time constant (τ):

  • After one time constant, the current reaches approximately 63% of its maximum value.

  • After 5 time constants, the current is essentially steady.

This exponential buildup parallels the charging process of capacitors in RC circuits.
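The exponential buildup described above can be sketched numerically. The component values below (12 V source, 100 Ω, 0.5 H) are illustrative choices, not figures from the article:

```python
import math

def rl_current(t, V=12.0, R=100.0, L=0.5):
    """Current in a series RL circuit t seconds after the switch closes:
    i(t) = (V/R) * (1 - e^(-t/tau)), where tau = L/R.
    V, R, and L are illustrative values, not from the article."""
    tau = L / R
    return (V / R) * (1 - math.exp(-t / tau))

tau = 0.5 / 100.0     # 5 ms time constant
i_max = 12.0 / 100.0  # 0.12 A steady-state current

print(round(rl_current(tau) / i_max, 3))      # ~0.632 after one time constant
print(round(rl_current(5 * tau) / i_max, 3))  # ~0.993 after five time constants
```

The printed fractions confirm the two rules of thumb: roughly 63% of maximum current after one time constant, and essentially steady current after five.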

 

 

Building Inductance with Coils and Cores

Obviously, we can't make a wire loop measuring anywhere near 1,000,000 miles in circumference. But we can wind fairly long lengths of wire into compact coils. When we do that, the magnetic flux for a given length of wire increases compared with the flux produced by a single-turn loop, increasing the inductance. If we place a ferromagnetic rod, called a core, inside a coil of wire, we can increase the flux density and further enhance the inductance. High-frequency signals blocked by inductors are often explained using the concept of impedance, detailed in our page on what is impedance.

We can achieve values of L many times greater with a ferromagnetic core than with a similar-sized coil having an air core, a solid plastic core, or a solid wooden core. (Plastic and dry wood have permeability values that differ little from air or a vacuum; engineers occasionally use these materials as coil cores or "forms" to add structural rigidity to the windings without significantly changing the inductance.) The current that an inductor can handle depends on the diameter of the wire. But the value of L also depends on the number of turns in the coil, the diameter of the coil, and the overall shape of the coil. Many circuits involve a balance between resistance and inductance, so reviewing what is electrical resistance will add depth to your understanding.

Key physical parameters that affect inductance include:

  • Number of turns: inductance varies as the square of the turns.

  • Cross-sectional area: a larger area increases inductance.

  • Length of the coil: longer coils reduce inductance.

  • Core material: magnetic materials (iron, steel, nickel) concentrate flux and increase inductance.

If we hold all other factors constant, the inductance of a helical coil increases as the square of the number of turns of wire. Inductance also increases with the coil's diameter, since a wider coil encloses more cross-sectional area. If we "stretch out" a coil with a certain number of turns and a certain diameter while holding all other parameters constant, its inductance decreases. Conversely, if we "squash up" an elongated coil while holding all other factors constant, the inductance goes up. For insight into how inductors shape power quality, especially reactive power, read our overview of reactive power.
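These proportionalities follow from the standard long-solenoid approximation, L = μ₀μᵣN²A/l. A minimal sketch (the coil dimensions are illustrative, and the formula assumes length much greater than diameter):

```python
import math

MU_0 = 4 * math.pi * 1e-7  # permeability of free space, H/m

def solenoid_inductance(turns, diameter_m, length_m, mu_r=1.0):
    """Approximate inductance of a long single-layer coil:
    L = mu0 * mu_r * N^2 * A / l. Accurate when length >> diameter."""
    area = math.pi * (diameter_m / 2) ** 2
    return MU_0 * mu_r * turns ** 2 * area / length_m

base = solenoid_inductance(100, 0.01, 0.05)
# Doubling the turns quadruples L; stretching the coil to twice the length halves it.
print(solenoid_inductance(200, 0.01, 0.05) / base)  # 4.0
print(solenoid_inductance(100, 0.01, 0.10) / base)  # 0.5
```

The two ratios show the square-law dependence on turns and the inverse dependence on coil length described above; raising mu_r models adding a ferromagnetic core.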

 

Self-Inductance

Even a straight conductor exhibits some inductance, but when the current in a coil changes, the change in the magnetic field induces a voltage across that same coil. This phenomenon is known as self-inductance, and it always opposes changes in current. The effect is central to the operation of inductors, transformers, and motor windings.
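Quantitatively, the self-induced voltage is v = L · di/dt. A tiny sketch with illustrative numbers:

```python
def induced_voltage(L, di, dt):
    """Self-induced EMF magnitude: v = L * (di/dt).
    By Lenz's law, its polarity opposes the change in current."""
    return L * (di / dt)

# A 10 mH coil whose current ramps from 0 A to 1 A in 1 ms
# develops about 10 V across itself.
print(round(induced_voltage(0.010, 1.0, 0.001), 1))  # 10.0
```

This is why interrupting an inductive circuit abruptly (a very small dt) can produce a large voltage spike.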

 

Normal and Abnormal Conditions

Under normal circumstances, the inductance of a coil (or any other type of device designed to function as an inductor) remains constant regardless of the strength of the signal we apply. In this context, "abnormal circumstances" refer to an applied signal so strong that the inductor wire melts or the core material overheats excessively. Good engineering sense demands that such conditions should never arise in a well-designed electrical or electronic system.

 


 

Fig. 2. Relative magnetic flux in and around a huge loop of wire connected to a current source, as a function of time.

 


What is Electricity?

Electricity is the flow of electric charge, usually through a conductor like wire. It powers lights, appliances, and machines by converting energy into motion, heat, or light. Electricity can be generated from sources such as fossil fuels, wind, solar, or water.

 

What is electricity?

Electricity is a fundamental form of energy created by the movement of electrons.

✅ Powers homes, industries, and electronic devices

✅ Flows through circuits as an electric current

✅ Generated from renewable and non-renewable sources

The power we use is a secondary energy source because it is produced by converting primary energy sources such as coal, natural gas, nuclear, solar, and wind energy into electrical power. It is also referred to as an energy carrier, meaning it can be converted into other forms of energy, such as mechanical or thermal energy.

Primary energy sources are either renewable or nonrenewable; electricity itself is neither, because it is a secondary energy source produced from them.

To understand why electrons move in the first place, start with voltage, the electrical “pressure” that pushes charge through every circuit.

 

Electricity Has Changed Everyday Life

Although most people rarely think about electricity, it has profoundly changed how we live. It is as essential as air or water, yet we tend to take it for granted—until it’s gone. Electricity powers heating and cooling systems, appliances, communications, entertainment, and modern conveniences that past generations never imagined.

Before widespread electrification began just over a century ago, homes were lit with candles or oil lamps, food was cooled with ice blocks, and heating was provided by wood- or coal-burning stoves.

The steady stream of electrons we use daily is explored in our primer on current electricity.

 

Discovering Electricity: From Curiosity to Power Grid

Scientists and inventors began unlocking the secrets of electricity as early as the 1600s. Over the next few centuries, their discoveries built the foundation for the electric age.

Benjamin Franklin demonstrated that lightning is a form of electricity.

Thomas Edison invented the first commercially viable incandescent light bulb.

Nikola Tesla pioneered the use of alternating current (AC), which enabled the efficient transmission of electricity over long distances. He also experimented with wireless electricity.

Curious why Tesla’s ideas beat Edison’s? Our article on alternating current breaks down the advantages of alternating current (AC) over direct current (DC).

Before Tesla’s innovations, arc lighting used direct current (DC) but was limited to outdoor and short-range applications. His work made it possible for electricity to be transmitted to homes and factories, revolutionizing lighting and industry.

 

Understanding Electric Charge and Current

Electricity is the movement of electrically charged particles, typically electrons. These particles can move either statically, as in a buildup of charge, or dynamically, as in a flowing current.

All matter is made of atoms, and each atom consists of a nucleus with positively charged protons and neutral neutrons, surrounded by negatively charged electrons. Usually, the number of protons and electrons is balanced. But when that balance is disturbed, with electrons gained or lost, the atom carries a net charge, and the movement of those electrons constitutes an electric current.

For a step-by-step walkthrough of everything from circuits to safety, visit how electricity works.

 

Electricity as a Secondary Energy Source

Electricity doesn’t occur naturally in a usable form. It must be generated by converting other types of energy. In fact, electricity is a manufactured product. That’s why electricity is called a secondary energy source—it carries energy from its original form to where we need it.

We generate electricity by transforming mechanical energy—such as spinning a turbine—into electrical energy. This conversion happens at power plants that use a variety of fuels and methods:

  • Fossil fuels (coal, oil, natural gas)

  • Nuclear energy

  • Renewable sources like wind, solar, and hydroelectric

If turbines, magnets, and power plants intrigue you, see how electricity is generated for a deeper dive.

 

How Electricity Was Brought Into Homes

Before electricity generation began on a mass scale, cities often developed near waterfalls, where water wheels powered mills and machines. The leap from mechanical energy to electrical energy enabled power to travel not just across a town, but across entire countries.

Beginning with Franklin’s experiments and followed by Edison’s breakthrough with indoor electric light, the practical uses of electricity expanded rapidly. Tesla’s AC power system made widespread electric distribution feasible, bringing light, heat, and industry to homes and cities worldwide.

 

How Transformers Changed Everything

To transmit electricity efficiently over long distances, engineers working for George Westinghouse, notably William Stanley, developed the practical transformer. This device adjusts the voltage of electrical power to match its purpose: high for long-range transmission, low for safe use in homes.

Transformers made it possible to supply electricity to homes and businesses far from power plants. The electric grid became a coordinated system of generation, transmission, distribution, and regulation.

Even today, most of us rarely consider the complexity behind our wall sockets. But behind every outlet lies a vast infrastructure keeping electricity flowing safely and reliably.

 

How Is Electricity Generated?

Electric generators convert mechanical energy into electricity using the principles of magnetism. When a conductor—such as a coil of wire—moves through a magnetic field, an electric current is induced.

In large power stations, turbines spin magnets inside massive generators. These turbines are driven by steam, water, or wind. The rotating magnet induces small currents in the coils of wire, which combine into a single continuous flow of electric power.

Discover the principle that turns motion into power in electromagnetic induction, the heart of every modern generator.

 

Measuring Electricity

Electricity is measured in precise units. The amount of power being used or generated is expressed in watts (W), named after inventor James Watt.

  • One watt is a small unit of power; 1,000 watts equal one kilowatt (kW).

  • Energy use over time is measured in kilowatt-hours (kWh).

  • A 100-watt bulb burning for 10 hours uses 1 kWh of electricity.

These units are what you see on your electric bill. They represent how much electricity you’ve consumed over time—and how much you’ll pay.
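The billing arithmetic above can be sketched in a few lines. The $0.15/kWh rate is an illustrative assumption, not a figure from the article:

```python
def energy_kwh(power_watts, hours):
    """Energy consumed, in kilowatt-hours: W * h / 1000."""
    return power_watts * hours / 1000.0

def cost(power_watts, hours, rate_per_kwh=0.15):
    """Cost of running a load; the rate is an illustrative assumption."""
    return energy_kwh(power_watts, hours) * rate_per_kwh

print(energy_kwh(100, 10))       # 1.0 kWh, the article's bulb example
print(round(cost(100, 10), 2))   # cost at the assumed rate
```

Running the article's 100-watt bulb for 10 hours indeed comes out to exactly 1 kWh.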

When it’s time to decode your energy bill, the chart in electrical units makes watts, volts, and amps clear.

 


What is Capacitance?

Capacitance is the ability of a component to store electrical charge between plates separated by insulation, measured in farads. It is vital in circuits, electronics, and energy storage for voltage regulation, stability, and efficient power system performance.

 

What is Capacitance?

Capacitance is a fundamental concept in electrical engineering and physics, describing a system's ability to store energy in the form of an electrical charge.

✅ Measures the ability to store electric charge in a circuit

✅ Essential in filtering, timing, and power conditioning

✅ Found in capacitors used in electronics, motors, and power supplies

Capacitance (C = Q / V) measures the amount of electric charge (Q) a system stores per volt (V); unit: farad (F).


Capacitance is the property of a system that allows it to store electric charge when a voltage, or electric potential difference, is applied across its terminals. This stored charge represents the system’s charge storage capacity, which is directly proportional to the capacitance value and the applied voltage. In alternating current (AC) circuits, capacitors do not pass current in the same way resistors do; instead, they oppose changes in voltage through a phenomenon known as capacitive reactance. This reactance decreases as the frequency of the AC signal increases, making capacitors essential in filtering, tuning, and timing applications. To understand how stored charge behaves across voltage differences, refer to our guide on capacitive load, which explores how capacitors affect AC circuits. For a detailed breakdown of how capacitance is measured, check out the unit of capacitance to understand farads and their practical conversions.
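The frequency dependence of capacitive reactance mentioned above follows Xc = 1/(2πfC). A quick sketch with an illustrative 10 µF capacitor:

```python
import math

def capacitive_reactance(freq_hz, capacitance_f):
    """Capacitive reactance Xc = 1 / (2 * pi * f * C), in ohms.
    Falls as frequency or capacitance rises."""
    return 1.0 / (2 * math.pi * freq_hz * capacitance_f)

c = 10e-6  # 10 uF, an illustrative value
print(round(capacitive_reactance(60, c), 1))    # ~265.3 ohms at 60 Hz
print(round(capacitive_reactance(6000, c), 1))  # ~2.7 ohms at 6 kHz
```

A hundredfold increase in frequency cuts the reactance a hundredfold, which is exactly why capacitors pass high frequencies readily while blocking DC.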

 

What Determines Capacitance?

The capacitance of a capacitor is determined by its geometry and the properties of the dielectric material between the conductive plates. The unit of capacitance is the farad (F). Capacitors are often rated in microfarads (μF) or picofarads (pF), depending on their size and intended use.  For the basics of components that store electrical energy, see what is a capacitor to learn how these devices function and their role in circuits.

Learn how stored charge relates to apparent and real power in AC systems by visiting our apparent power vs real power comparison.

When a capacitor is connected to a voltage source, it charges, storing energy as an electrical field between its conductive plates. The amount of energy stored in a capacitor is proportional to its capacitance and the square of the voltage across it. When the voltage source is removed, the capacitor will slowly discharge, releasing the stored energy as an electrical current. Capacitance is also central to power factor optimization—discover how this works with an automatic power factor controller.
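The stored-energy relationship stated above is E = ½CV². A minimal sketch with illustrative values:

```python
def capacitor_energy(capacitance_f, voltage_v):
    """Energy stored in a charged capacitor: E = 0.5 * C * V^2, in joules."""
    return 0.5 * capacitance_f * voltage_v ** 2

# Doubling the voltage quadruples the stored energy (square-law dependence).
print(round(capacitor_energy(100e-6, 10), 6))  # 0.005 J for a 100 uF cap at 10 V
print(round(capacitor_energy(100e-6, 20), 6))  # 0.02 J at 20 V
```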

 

RC Circuits: The Role of Capacitance in Charging

A capacitor's charging and discharging behaviour is determined by its capacitance and the properties of the surrounding circuit. For example, in a simple circuit with a resistor and a capacitor, the capacitor charges rapidly at first when connected to a voltage source, then more slowly as it approaches the source voltage; when the source is removed, the stored energy discharges through the resistor in the same exponential fashion. The time constant of the circuit, which describes the rate at which the capacitor charges and discharges, is the product of the resistance and capacitance of the circuit.
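This charging curve follows v(t) = Vs · (1 − e^(−t/RC)). A sketch with illustrative component values (5 V source, 1 kΩ, 100 µF, giving τ = 0.1 s):

```python
import math

def rc_voltage(t, V_source=5.0, R=1000.0, C=100e-6):
    """Capacitor voltage t seconds after charging begins through a resistor:
    v(t) = Vs * (1 - e^(-t / (R*C))). Component values are illustrative."""
    tau = R * C  # 0.1 s time constant
    return V_source * (1 - math.exp(-t / tau))

print(round(rc_voltage(0.1) / 5.0, 3))  # ~0.632 of full charge after one tau
print(round(rc_voltage(0.5) / 5.0, 3))  # ~0.993 after five tau
```

Note the symmetry with the RL circuit discussed earlier: there the current builds exponentially; here the voltage does.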

Capacitors are used in various electronic applications, from filtering noise in power supplies to storing energy in flash memory. The capacitance of a capacitor can be tuned by selecting different dielectric materials, changing the geometry of the conductive plates, or adding additional plates in parallel. To explore how capacitors behave when connected end-to-end, visit capacitance in series, which explains the reciprocal formula and voltage behavior.

 

Dielectric Materials and Their Effect on Capacitance

Dielectric materials are crucial to the operation of capacitors, as they serve as insulators between the conductive plates. The dielectric constant of a material describes its ability to store electrical energy and is a key parameter in determining the capacitance of a capacitor. Dielectric materials can be solid, such as ceramics or plastics, or liquid, such as electrolytes.
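The role of geometry and the dielectric constant can be made concrete with the parallel-plate formula C = kε₀A/d. The plate dimensions and dielectric constant below are illustrative:

```python
EPSILON_0 = 8.854e-12  # vacuum permittivity, F/m

def parallel_plate_capacitance(area_m2, separation_m, k=1.0):
    """Parallel-plate capacitance: C = k * epsilon0 * A / d,
    where k is the dielectric constant (1.0 for air or vacuum)."""
    return k * EPSILON_0 * area_m2 / separation_m

air = parallel_plate_capacitance(0.01, 0.001)           # 100 cm^2 plates, 1 mm gap
ceramic = parallel_plate_capacitance(0.01, 0.001, k=6)  # illustrative ceramic
print(round(air * 1e12, 1))      # ~88.5 pF with an air gap
print(round(ceramic / air, 1))   # 6.0: the dielectric multiplies C by k
```

Swapping the air gap for a dielectric multiplies the capacitance by the dielectric constant, which is why capacitor datasheets emphasize the dielectric material.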

Capacitors can store energy in various forms, from high-voltage electrical energy in power supplies to low-voltage electrical energy in portable electronic devices. The energy stored in a capacitor can provide a quick burst of power to a circuit or smooth out variations in a power supply. If you're curious about how capacitors add their values in parallel, the capacitance in parallel page illustrates how capacitances sum directly, enhancing energy storage.

 

How Capacitance Compares to Resistance and Inductance

Resistance slows down the transfer rate of charge carriers (usually electrons) by "brute force." In this process, some energy is invariably converted from electrical form to heat. Resistance is said to consume power for this reason. Resistance is present in DC as well as in AC circuits and works the same way for either direct or alternating current. Capacitor performance depends on safe voltage levels; our page on capacitor voltage rating explains these limits in detail.

Inductance impedes the flow of AC charge carriers by temporarily storing the energy as a magnetic field. This energy is later returned to the circuit. For high-capacitance setups in electrical systems, see how banks of capacitors are configured in our capacitor bank overview.

Capacitance impedes the flow of AC charge carriers by temporarily storing the energy as an electric field. This energy is given back later, just as it is in an inductor. Capacitance is not generally necessary in pure-DC circuits. However, it can have significance in circuits where DC is pulsating and not steady. If you're studying how capacitance affects reactive energy, visit our breakdown of reactive power formula in electrical circuits.

 

Capacitance in AC Circuits and Frequency Response

Capacitance, like inductance, can appear unexpectedly or unintentionally. As with inductance, this effect becomes more evident as the AC frequency increases.

Capacitance in electric circuits is deliberately introduced by a device called a capacitor. It was discovered by the Prussian scientist Ewald Georg von Kleist in 1745 and independently by the Dutch physicist Pieter van Musschenbroek at about the same time while investigating electrostatic phenomena. They discovered that electricity generated by an electrostatic machine could be stored for a period and then released. The device, which came to be known as the Leyden jar, consisted of a stoppered glass vial or jar filled with water, with a nail piercing the stopper and dipping into the water. By holding the jar in hand and touching the nail to the conductor of an electrostatic machine, they found that a shock could be obtained from the nail after disconnecting it by touching it with the free hand.

This reaction showed that some of the electricity from the machine had been stored. A simple but fundamental step in the evolution of the capacitor was taken by the English astronomer John Bevis in 1747 when he replaced the water with metal foil, forming a lining on the inside surface of the glass and another covering the outside surface. The interaction of capacitance and system reactance is a key part of understanding power quality, as explained on our reactor reactance in power systems page.

 

A Visual Thought Experiment: Capacitance Between Metal Plates

Imagine two very large, flat sheets of metal, such as copper or aluminum, that are excellent electrical conductors. Suppose they are each the size of the state of Nebraska and are placed one on top of the other, separated by just a foot of space. What will happen if these two sheets of metal are connected to the terminals of a battery, as shown in Fig. 11-1?

 



Fig. 11-1. Two plates will become charged electrically, one positively and the other negatively.

 

The two plates will become charged electrically, one positively and the other negatively. You might think this would take a little while because the sheets are so big, and that assumption is correct.

If the plates were small, they would both become charged almost instantly, attaining a relative voltage equal to the battery's voltage. But because the plates are gigantic, it will take a while for the negative one to "fill up" with electrons, and it will take an equal amount of time for the other one to get electrons "sucked out." Finally, however, the voltage between the two plates will be equal to the battery voltage, and an electric field will exist in the space between the plates.

This electric field will be small at first; the plates don't charge immediately. However, the negative charge and the positive charge will increase over time, at a rate depending on the size of the plates and the distance between them. Figure 11-2 is a relative graph showing the intensity of the electric field between the plates as a function of time elapsed since the plates were connected to the battery terminals.

 



Fig. 11-2. Relative electric field intensity, as a function of time, between two metal plates connected to a voltage source. 

 


What is a Resistor?

A resistor is an electronic component that limits or regulates the flow of electric current, manages voltage levels, and safeguards circuits in electrical and electronic devices, ensuring stable performance and preventing component damage.

 

What is a resistor?

A resistor is an electronic component designed to create electrical resistance in a circuit.

✅ Limits or regulates electric current flow in circuits

✅ Controls voltage levels for proper device operation

✅ Protects electrical and electronic components from damage

In electronic components and circuits, resistors play a crucial role. But what exactly is a resistor, and why are they so important? This comprehensive guide will explain the basics of resistors, explore different types and applications, and answer common questions related to their function and use. 

Their primary function is to control and limit the flow of electrical current, ensuring the proper operation of electronic devices and, in addition, introducing resistance to help maintain stable voltage and current levels in circuits, protecting sensitive components from damage due to excess current.

 

Electrical Resistance

Understanding electrical resistance is essential to grasping how resistors control current flow and protect sensitive components in circuits. The value of a resistor is determined by its electrical resistance, which is measured in ohms (Ω). Resistance is directly related to Ohm's law, a fundamental principle in electronics that states that the current (I) flowing through a conductor between two points is directly proportional to the voltage (V) across those points and inversely proportional to the resistance (R). In simpler terms, Ohm's law is expressed by the equation V = I × R. Resistors work alongside capacitors and other components to regulate voltage and ensure stable performance in electronic devices. The unit of electrical resistance, the ohm (Ω), defines how much a resistor opposes the flow of electric current.
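Ohm's law takes one line of code. The 20 mA / 330 Ω values below are illustrative:

```python
def ohms_law_voltage(current_a, resistance_ohm):
    """Ohm's law: V = I * R."""
    return current_a * resistance_ohm

# 20 mA through a 330-ohm resistor drops about 6.6 V across it.
print(round(ohms_law_voltage(0.020, 330), 2))  # 6.6
```

Rearranging gives the other two forms, I = V/R and R = V/I, which are used just as often in practice.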

Various types of resistors are available, each with its own set of applications and characteristics. Some common resistor types include fixed resistors, variable resistors, carbon film resistors, metal foil resistors, metal oxide film resistors, and wire-wound resistors.

As the name suggests, fixed resistors have a fixed resistance value and are often used for general-purpose applications. Carbon film and metal film resistors are popular examples of fixed resistors, with the latter offering higher accuracy and stability. On the other hand, wire-wound resistors are constructed using a metal wire wrapped around a core, providing excellent heat dissipation and making them suitable for high-power applications.

 

Types of Resistors

Variable resistors, also known as potentiometers or rheostats, allow users to adjust the resistance manually. These components are typically used for fine-tuning and controlling various aspects of electronic circuits, such as volume or light intensity. Different types of resistors offer unique properties for specific applications, from precision electronics to high-power systems.

Resistor colour codes identify the value, tolerance, and sometimes the temperature coefficient of fixed resistors. The colour code consists of a series of coloured bands, with each colour representing a specific number. To read the colour code, you need to learn the number assigned to each colour and understand the sequence of bands.
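The band-reading procedure can be sketched as a small lookup for the common 4-band scheme (two digit bands, a multiplier, a tolerance band). This simplified sketch omits the gold and silver multiplier bands (×0.1 and ×0.01):

```python
# Standard digit values for resistor colour bands.
DIGITS = {"black": 0, "brown": 1, "red": 2, "orange": 3, "yellow": 4,
          "green": 5, "blue": 6, "violet": 7, "grey": 8, "white": 9}
TOLERANCE = {"gold": 5.0, "silver": 10.0}  # percent

def decode_4band(band1, band2, multiplier, tolerance):
    """Resistance in ohms and tolerance in percent from four band colours."""
    value = (DIGITS[band1] * 10 + DIGITS[band2]) * 10 ** DIGITS[multiplier]
    return value, TOLERANCE[tolerance]

print(decode_4band("yellow", "violet", "red", "gold"))  # (4700, 5.0): 4.7 kOhm +/-5%
```

Reading yellow-violet-red-gold as 4-7 followed by two zeros gives the familiar 4.7 kΩ, 5% part.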

The primary difference between fixed and variable resistors is the ability to adjust the resistance value. Fixed resistors have a predetermined resistance that cannot be changed, while variable resistors can be adjusted to obtain the desired resistance within a certain range.

 

Power Dissipation

Power dissipation is the heat a resistor generates when electrical current flows through it. This heat can affect the performance and reliability of a resistor and, in some cases, may cause damage to the component or the circuit. To prevent such issues, resistors are designed with a power rating, which indicates the maximum amount of power they can safely dissipate.
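Dissipation follows P = I²R, and comparing it against the power rating is the usual design check. The 50% derating factor below is a common rule of thumb, not a figure from the article:

```python
def dissipated_power(current_a, resistance_ohm):
    """Heat dissipated in a resistor: P = I^2 * R, in watts."""
    return current_a ** 2 * resistance_ohm

def within_rating(current_a, resistance_ohm, rating_w, derating=0.5):
    """True if dissipation stays under the rated power times a derating
    factor (50% derating is a common rule of thumb, assumed here)."""
    return dissipated_power(current_a, resistance_ohm) <= rating_w * derating

print(round(dissipated_power(0.1, 100), 3))  # 1.0 W for 100 mA through 100 ohms
print(within_rating(0.1, 100, 0.25))         # False: a quarter-watt part would overheat
```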

A resistor is integral to electronic circuits and can be found in virtually every electronic device. They come in various shapes, sizes, and materials to suit various applications. With their ability to control electrical current and maintain the stability of circuits, resistors play a vital role in the successful operation of electronic devices.

 

What is a resistor?

Resistors are essential electronic components that help regulate electrical current and voltage within circuits. Their various types and applications cater to different needs in the electronics world. Understanding resistors and their characteristics is crucial for anyone working with electronic circuits or looking to build their own devices.

 


How Is Electricity Generated?

Electricity is produced by converting various energy sources, such as fossil fuels, nuclear, solar, wind, or hydro, into electrical energy using turbines and generators. These systems harness mechanical or chemical energy and transform it into usable power.

 

How Is Electricity Generated?

✅ Converts energy sources like coal, gas, wind, or sunlight into power

✅ Uses generators driven by turbines to create electrical current

✅ Supports global power grids and industrial, commercial, and residential use

 

Understanding Electricity Generation

Electricity generation is the lifeblood of modern civilization, powering homes, industries, hospitals, transportation systems, and digital infrastructure. But behind the flip of a switch lies a vast and complex process that transforms raw energy into electrical power. At its core, electricity is generated by converting various forms of energy—mechanical, thermal, chemical, or radiant—into a flow of electric charge through systems engineered for efficiency and reliability.

Understanding the role of voltage is essential in this process, as it determines the electrical pressure that drives current through circuits.

According to the Energy Information Administration, the United States relies on a diverse mix of technologies to produce electric power, including fossil fuels, nuclear power, and renewables. In recent years, the rapid growth of solar photovoltaic systems and the widespread deployment of wind turbines have significantly increased the share of clean energy in the national grid. These renewable systems often use turbines to generate electricity by converting natural energy sources—sunlight and wind—into mechanical motion and ultimately electrical power. This transition reflects broader efforts to reduce emissions while meeting rising electric power demand.

 

How Power Generation Works

Most electricity around the world is produced using turbines and generators. These devices are typically housed in large-scale power plants. The process begins with an energy source—such as fossil fuels, nuclear reactions, or renewable inputs like water, wind, or sunlight—which is used to create movement. This movement, in turn, drives a turbine, which spins a shaft connected to a generator. Inside the generator, magnetic fields rotate around conductive coils, inducing a voltage and producing alternating current (AC) electricity. This method, known as electromagnetic induction, is the fundamental mechanism by which nearly all electric power is made.
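For a coil rotating at constant speed in a uniform magnetic field, electromagnetic induction yields a sinusoidal EMF, e(t) = N·B·A·ω·sin(ωt), which is why generators naturally produce alternating current. All the numbers below (50 turns, 0.1 T field, 100 cm² coil, 60 Hz) are illustrative assumptions:

```python
import math

def generator_emf(t, turns=50, field_t=0.1, area_m2=0.01, freq_hz=60):
    """Instantaneous EMF of a flat coil rotating in a uniform field:
    e(t) = N * B * A * omega * sin(omega * t). All values illustrative."""
    omega = 2 * math.pi * freq_hz
    return turns * field_t * area_m2 * omega * math.sin(omega * t)

# Peak EMF occurs a quarter cycle in, when the coil sweeps through
# the flux fastest; at 60 Hz that is t = 1/240 s.
print(round(generator_emf(1 / (4 * 60)), 2))  # ~18.85 V peak
```

Real generators stack many such coils and much stronger fields, but the sinusoidal shape of the output is the same.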

In designing and maintaining electrical systems, engineers must also consider voltage drop, which can reduce efficiency and power quality. You can evaluate system losses using our interactive voltage drop calculator, and better understand the math behind it using the voltage drop formula.

 

Energy Sources Used in Power Production

Steam turbines remain the dominant technology in global energy production. These are especially common in plants that burn coal, natural gas, or biomass, or that rely on nuclear fission. In a typical thermal power plant, water is heated to create high-pressure steam, which spins the turbine blades. In nuclear facilities, this steam is generated by the immense heat released when uranium atoms are split. While highly efficient, these systems face environmental and safety concerns—greenhouse gas emissions from fossil fuels, radioactive waste and accident risk from nuclear power.

Power quality in these plants can be impacted by voltage sag, which occurs when systems experience a temporary drop in electrical pressure, often due to sudden large loads or faults. Managing such variations is crucial to stable output.

 

The Rise of Renewable Energy in Electricity Generation

Alongside these large-scale thermal technologies, renewable sources have grown significantly. Hydroelectric power harnesses the kinetic energy of falling or flowing water, typically from a dam, to spin turbines. Wind energy captures the movement of air through large blades connected to horizontal-axis turbines. Solar power generates electricity in two distinct ways: photovoltaic cells convert sunlight directly into electric power using semiconductors, while solar thermal plants concentrate sunlight to heat fluids and produce steam. Geothermal systems tap into the Earth’s internal heat to generate steam directly or via heat exchangers.

These renewable systems offer major advantages in terms of sustainability and environmental impact. They produce no direct emissions and rely on natural, often abundant energy flows. However, they also face limitations. Solar and wind power are intermittent, meaning their output fluctuates with weather and time of day. Hydropower and geothermal are geographically constrained, only viable in certain regions. Despite these challenges, renewables now account for a growing share of global electricity generation and play a central role in efforts to decarbonize the energy sector.

In areas where water and electricity coexist—such as hydroelectric plants—understanding the risks associated with water and electricity is critical to ensure operational safety and prevent electrocution hazards.

 

Generators and Turbines: The Heart of Electricity Generation

Generators themselves are marvels of electromechanical engineering. They convert rotational kinetic energy into electrical energy through a system of magnets and copper windings. Their efficiency, durability, and capacity to synchronize with the grid are critical to a stable electric power supply. In large plants, multiple generators operate in parallel, contributing to a vast, interconnected grid that balances supply and demand in real-time.

Turbines, powered by steam, water, gas, or wind, generate the rotational force needed to drive the generator. Their design and performance have a significant impact on the overall efficiency and output of the plant. Measuring output accurately requires devices like a watthour meter or wattmeters, which are standard tools in generation stations.

Technicians often use formulas such as Watt’s Law to determine power consumption and verify performance. Understanding what ammeters measure also plays a role in monitoring electrical current flowing through generator systems.


What do Ammeters Measure?

An ammeter measures electric current in a circuit, displayed in amperes (A). Connected in series with low internal resistance to reduce burden voltage, it ensures accurate readings for testing, fault detection, and diagnostics.

 

What do Ammeters Measure?

Ammeters are instruments that measure the flow of electric current in a circuit.

✅ Measure electric current in amperes, connected in series with low internal resistance to minimize burden voltage.

✅ Available in analog, digital, clamp, and current transformer designs.

✅ Used for testing, fault detection, continuity checks, and diagnostics.

Electric current is measured in amperes, hence the name "ammeter." The term is sometimes incorrectly written as "ampmeter." Understanding how an ammeter works is easier when you first explore the basics of electricity fundamentals, including how voltage, current, and resistance interact in a circuit.

An ammeter measures electric current in a circuit, expressed in amperes (A). It must be connected in series with the load so that all the current flows through it, and is designed with low internal resistance to minimize burden voltage, thereby ensuring accurate readings without significantly affecting the circuit’s performance. The measurement unit for an ammeter is the ampere, explained in detail on our what is an ampere page, which also covers its relationship to other electrical units.

Ammeters, like ohmmeters and voltmeters, are built around a sensitive current detector such as the moving coil of a galvanometer. To measure the current flowing through a circuit element, an ammeter is placed in series with it. The ammeter is designed to offer very low resistance to the current, so that it does not appreciably change the circuit it is measuring. To do this, a small resistor is placed in parallel with the galvanometer to shunt most of the current around it. Its value is chosen so that when the design current flows through the meter, the needle deflects to its full-scale reading. A galvanometer's full-scale current is very small: on the order of milliamperes. To see how ammeters fit into broader measurement tools, check out our guide on what is a voltmeter and what is a multimeter, which measure multiple electrical properties.

A digital ammeter, by contrast, has no moving parts. It uses an analog-to-digital converter (ADC) to measure the voltage across the shunt resistor. The ADC is read by a microcomputer that performs the calculations to display the current through the resistor.

 

How an Ammeter Works

An ammeter works by being placed in series with the circuit so that all the current flows through it. Inside, a shunt resistor with very low internal resistance creates a small, measurable voltage drop proportional to the current. In analog designs, this current is partly diverted around a sensitive moving-coil mechanism, which displays the reading on a scale. In digital designs, the voltage drop across the shunt is measured by an analog-to-digital converter (ADC) and calculated into an amperes value, ensuring accurate measurement without significantly disturbing the circuit’s performance. Accurate current measurement also depends on understanding what is electrical resistance and how it affects current flow, especially in low-resistance ammeter designs.
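The digital path described above (shunt voltage drop, then Ohm's law) reduces to a one-line calculation. This is a minimal sketch with an assumed 1-milliohm shunt, not the firmware of any particular meter.

```python
# Sketch of the digital-ammeter idea: read the voltage across a known
# low-value shunt resistor and recover the current with Ohm's law.
SHUNT_OHMS = 0.001          # 1 milliohm precision shunt (assumed value)

def current_from_shunt(v_drop_volts):
    """I = V / R across the shunt (Ohm's law)."""
    return v_drop_volts / SHUNT_OHMS

# An ADC reading of 50 mV across the shunt implies a 50 A load:
print(f"{current_from_shunt(0.050):.1f} A")
```

The shunt is deliberately tiny so that at 50 A it dissipates only 2.5 W and drops only 50 mV, barely disturbing the circuit under test.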

 

Types and Mechanisms

Analog ammeter – Includes moving-coil (D'Arsonval) and moving-iron types, which use magnetic deflection to display current on a scale. These designs are valued for their simplicity, durability, and ability to provide continuous current readings.

Digital ammeter – Uses a shunt resistor to create a small voltage drop proportional to the current. This voltage is measured by an analog-to-digital converter (ADC) and displayed as a precise digital value. Digital ammeters often feature higher accuracy, wider measurement ranges, and additional functions such as data logging.

Clamp meter – Measures current without breaking the circuit by detecting the magnetic field around a conductor. This method is ideal for quick testing, especially in high-current applications or where live connections cannot be interrupted.

Current transformer (CT) ammeter – Designed for high-current AC systems, this type uses a transformer to scale down large primary currents into a safe, measurable secondary current for the meter.

 

Shunts and Operation

A shunt resistor is a precision, low-resistance component used in many ammeters. In analog designs, it is placed in parallel with the meter movement, diverting most of the current to protect the instrument. In many digital designs, it is placed in series with the circuit. By measuring the voltage drop across the shunt and applying Ohm's law, the meter accurately calculates the current. This approach allows the measurement of very large currents without damaging the meter and helps maintain measurement stability.
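Sizing the parallel shunt for an analog movement follows directly from the fact that both branches share the same voltage. The movement ratings below are assumed example values.

```python
# Sketch: sizing a parallel shunt so a sensitive meter movement reads a
# larger full-scale current. Example ratings, not a real instrument.
def shunt_resistance(i_full_scale, i_meter, r_meter):
    """Shunt carries (I_total - I_meter); both branches share one voltage."""
    v = i_meter * r_meter                # voltage across the movement
    return v / (i_full_scale - i_meter)  # R_shunt = V / I_shunt

# A 1 mA, 50-ohm movement rescaled to read 1 A full scale:
r_sh = shunt_resistance(i_full_scale=1.0, i_meter=0.001, r_meter=50.0)
print(f"Shunt: {r_sh:.4f} ohms")   # ~0.0501 ohms
```

Note how small the shunt must be: at full scale, 999 mA of the 1 A bypasses the movement through a resistor of roughly a twentieth of an ohm.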

 

Applications and Value

Ammeters are essential tools in electrical testing, short-circuit detection, continuity testing, and system diagnostics. They help identify overloads, open circuits, and unstable current conditions that may indicate equipment faults or inefficiencies.

In industrial, commercial, and residential settings, ammeters are used for equipment maintenance, troubleshooting, and performance monitoring. Specialized variants such as milliammeters and microammeters are designed for extremely low current measurements, while integrating ammeters track current over time to determine total electrical charge delivered to a device or system. For historical context on the development of measuring instruments, visit our history of electricity page to learn how electrical science evolved over time.

 

Practical Applications of Ammeters

Ammeters are used in a wide range of electrical and electronic work:

  • Automotive diagnostics – Measuring current draw from the battery to detect parasitic drains, starter motor issues, and charging system faults.

  • Solar panel and battery monitoring – Tracking current output from photovoltaic arrays and the charging/discharging rates of storage batteries to optimize system efficiency.

  • Industrial motor maintenance – Monitoring motor current to identify overload conditions, detect bearing wear, or confirm correct load operation.

  • Household appliance servicing – Checking current draw to troubleshoot faulty components or ensure devices operate within safe limits.

  • Power distribution systems – Ensuring current levels remain within capacity for cables, fuses, and protective devices.

 


What is a Voltmeter?

What is a voltmeter? A voltmeter is an electrical measuring instrument used to determine voltage across circuit points. Common in electronics, engineering, and power systems, it ensures accuracy, safety, and efficiency when monitoring voltage and diagnosing electrical performance.

 

What is a Voltmeter?

A voltmeter provides a method to accurately measure voltage, the difference in electric potential between two points in a circuit, without changing the voltage in that circuit. It is also the instrument used for measuring voltage drop.

✅ Ensures accurate voltage measurement for safety and performance

✅ Used in electrical engineering, electronics, and power systems

✅ Helps diagnose faults and maintain efficient operation

Electrical current consists of a flow of charge carriers. Voltage, also known as electromotive force (EMF) or potential difference, manifests as "electrical pressure" that enables current to flow. Given an electric circuit under test with a constant resistance, the current through the circuit varies directly in proportion to the voltage across the circuit. A voltmeter measures potential difference, which directly relates to Ohm’s Law, the fundamental equation connecting voltage, current, and resistance in circuits.

A voltmeter can take many forms, from the classic analog voltmeter with a moving needle to modern instruments like the digital voltmeter (DVM) or the versatile digital multimeter. These tools are essential for measuring electrical values in electronic devices, enabling technicians to measure voltage, current, and resistance with precision and accuracy. While analog units provide quick visual feedback, digital versions deliver more precise measurements across wider voltage ranges, making them indispensable for troubleshooting and maintaining today’s complex electrical systems.

A voltmeter can be tailored to have various full-scale ranges by switching different values of resistance in series with the microammeter, as shown in Fig. 3-6. A voltmeter exhibits high internal resistance because the resistors have large ohmic values. The greater the supply voltage, the larger the internal resistance of the voltmeter because the necessary series resistance increases as the voltage increases. To understand how a voltmeter works, it helps to first review basic electricity, as voltage, current, and resistance form the foundation of all electrical measurements.
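The series-resistance idea behind Fig. 3-6 can be sketched as a short calculation: the total resistance must limit the current at the desired full-scale voltage to the movement's full-scale current. The movement ratings below are assumed for illustration.

```python
# Sketch of the multiplier idea in Fig 3-6: a series resistor turns a
# sensitive microammeter into a voltmeter. Example ratings are assumed.
def series_resistance(v_full_scale, i_full_scale, r_meter):
    """Total R must limit current to the meter's full-scale value."""
    return v_full_scale / i_full_scale - r_meter

# A 50 uA, 2000-ohm movement scaled to read 10 V full scale:
r_series = series_resistance(v_full_scale=10.0, i_full_scale=50e-6, r_meter=2000.0)
print(f"Series resistor: {r_series:.0f} ohms")   # 198000 ohms
```

This also makes the text's point concrete: a 100 V range would need roughly ten times the resistance, so the voltmeter's internal resistance grows with its full-scale voltage.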

 


 

Fig 3-6. A simple circuit using a microammeter (μA) to measure DC voltage.

 

A voltmeter, whether digital or analog, should have high resistance, and the higher the better. You don't want the meter to draw a lot of current from the power source. (Ideally, it wouldn't draw any current at all.) The power-supply current should go, as much as possible, towards operating whatever circuit or system you want to use, not into getting a meter to tell you the voltage. A voltmeter is commonly used to measure voltage drop across conductors or devices, helping electricians ensure circuits operate efficiently and safely. For quick calculations, a voltage drop calculator provides accurate estimates of conductor losses based on length, size, and current. Understanding the voltage drop formula allows engineers and technicians to apply theoretical principles when designing or troubleshooting electrical systems.

Also, you might not want to keep the voltmeter constantly connected in parallel in the circuit. You may need the voltmeter for testing various circuits. You don't want the behavior of a circuit to be affected the moment you connect or disconnect the voltmeter. The less current a voltmeter draws, the less it affects the behavior of anything that operates from the power supply. Engineers often ask: What is a voltmeter?  They use a voltmeter in power system analysis, where accurate voltage readings are crucial for ensuring safety, reliability, and optimal performance.

Alternative types of voltmeters use electrostatic deflection, rather than electromagnetic deflection, to produce their readings. Electric fields produce forces, just as magnetic fields do, so a pair of electrically charged plates will attract or repel each other. An electrostatic voltmeter exploits the attractive force between two plates carrying opposite charges, or between plates with a large potential difference. Figure 3-7 portrays the functional mechanics of an electrostatic meter; it constitutes, in effect, a sensitive, calibrated electroscope. Because nothing but air, a nearly perfect electrical insulator, exists between the plates, an electrostatic voltmeter draws essentially no current from the power supply. A properly designed electrostatic meter can measure both AC voltage and DC voltage. However, the construction tends to be fragile, and mechanical vibration can influence the reading.

 

 

Fig 3-7. Functional drawing of an electrostatic voltmeter movement.

 


If you connect an ammeter directly across a source of voltage, such as a battery, the meter needle will deflect. In fact, a milliammeter needle will probably be "pinned" if you do this with it, and a microammeter might well be wrecked by the force of the needle striking the pin at the top of the scale. For this reason, you should never connect milliammeters or microammeters directly across voltage sources. An ammeter, perhaps with a range of 0-10 A, may not deflect to full scale if it is placed across a battery; however, it's still a bad idea to do so, as it will rapidly drain the battery. Some batteries, such as automotive lead-acid cells, can explode under these conditions. This is because all ammeters have low internal resistance. They are designed that way deliberately. They are meant to be connected in series with other parts of a circuit, not right across the power supply. Because voltage is inseparable from current, learning what is current electricity provides deeper insight into why voltmeters are vital diagnostic tools.

But if you place a large resistor in series with an ammeter, and then connect the ammeter across a battery or other type of power supply, you no longer have a short circuit. The ammeter will give an indication that is directly proportional to the voltage of the supply. The smaller the full-scale reading of the ammeter, the larger the resistance needed to get a meaningful indication on the meter. Using a microammeter and a very large resistor in series, you can devise a voltmeter that draws only a small current from the source.

So, What is a Voltmeter? In summary, a voltmeter is a fundamental instrument for electrical work, allowing professionals and students to accurately measure voltage and understand circuit behaviour. Whether using an analog or digital design, voltmeters and multimeters provide precise insights that support safety, efficiency, and reliable performance in electrical systems.
