What do Ammeters Measure?



An ammeter measures electric current in a circuit, displayed in amperes (A). Connected in series with low internal resistance to reduce burden voltage, it ensures accurate readings for testing, fault detection, and diagnostics.

 


Ammeters are measuring instruments that indicate the flow of electric current in a circuit.

✅ Measure electric current in amperes, connected in series with low internal resistance to minimize burden voltage.

✅ Available in analog, digital, clamp, and current transformer designs.

✅ Used for testing, fault detection, continuity checks, and diagnostics.

Electric current is measured in amperes, hence the name "ammeter" (a contraction of "ampere meter"); the term is sometimes written incorrectly as "ampmeter". Understanding how an ammeter works is easier when you first explore the basics of electricity fundamentals, including how voltage, current, and resistance interact in a circuit.

An ammeter measures electric current in a circuit, expressed in amperes (A). It must be connected in series with the load so that all the current flows through it, and is designed with low internal resistance to minimize burden voltage, thereby ensuring accurate readings without significantly affecting the circuit’s performance. The measurement unit for an ammeter is the ampere, explained in detail on our what is an ampere page, which also covers its relationship to other electrical units.

Ammeters, like ohmmeters and voltmeters, are traditionally built around a sensitive current detector such as the moving coil of a galvanometer. To measure the current flowing through a circuit element, the ammeter is placed in series with that element. It is designed to present very low resistance to the current so that it does not appreciably change the circuit it is measuring. To achieve this, a small resistor is placed in parallel with the galvanometer to shunt most of the current around it; its value is chosen so that, at the meter's design current, the movement deflects to its full-scale reading. A galvanometer's full-scale current is very small, on the order of milliamperes or less. To see how ammeters fit into broader measurement tools, check out our guide on what is a voltmeter and what is a multimeter, which measure multiple electrical properties.
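To illustrate how the shunt value mentioned above might be chosen, here is a minimal sketch in Python; the full-scale current, coil resistance, and desired range are assumed figures for illustration, not values from any particular meter.

```python
# Sketch: sizing the parallel shunt for an analog ammeter movement.
# All values are illustrative assumptions, not taken from a real meter.

I_FULL_SCALE = 1e-3   # galvanometer full-scale current: 1 mA (assumed)
R_COIL = 50.0         # galvanometer coil resistance in ohms (assumed)
I_RANGE = 1.0         # desired ammeter range: 1 A full scale (assumed)

# At full-scale deflection the coil and the shunt share the same voltage drop,
# so R_shunt = (I_full_scale * R_coil) / (I_range - I_full_scale).
R_SHUNT = (I_FULL_SCALE * R_COIL) / (I_RANGE - I_FULL_SCALE)

print(f"Required shunt resistance: {R_SHUNT * 1000:.1f} milliohms")  # about 50.1 milliohms
```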

Ammeters may be analog or digital. A digital ammeter uses an analog-to-digital converter (ADC) to measure the voltage across a shunt resistor; the ADC is read by a microcomputer that performs the calculation and displays the current flowing through the resistor.

 

How an Ammeter Works

An ammeter works by being placed in series with the circuit so that all the current flows through it. Inside, a shunt resistor with very low resistance creates a small, measurable voltage drop proportional to the current. In analog designs, most of the current is diverted through the shunt around a sensitive moving-coil mechanism, which displays the reading on a scale. In digital designs, the voltage drop across the shunt is measured by an analog-to-digital converter (ADC) and converted into a reading in amperes, ensuring accurate measurement without significantly disturbing the circuit’s performance. Accurate current measurement also depends on understanding what is electrical resistance and how it affects current flow, especially in low-resistance ammeter designs.
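For readers who like to see the digital signal chain spelled out, the short sketch below follows that sequence; the ADC resolution, reference voltage, amplifier gain, and shunt resistance are all assumed values chosen purely for illustration.

```python
# Sketch: converting an ADC reading across a shunt into a current value.
# All component values are illustrative assumptions.

ADC_BITS = 12          # assumed ADC resolution
V_REF = 2.048          # assumed ADC reference voltage in volts
GAIN = 50.0            # assumed gain of the amplifier in front of the ADC
R_SHUNT = 0.01         # assumed shunt resistance in ohms (10 milliohms)

def adc_counts_to_amperes(counts: int) -> float:
    """Convert a raw ADC reading into the current through the shunt."""
    v_adc = counts / (2 ** ADC_BITS - 1) * V_REF   # voltage seen at the ADC input
    v_shunt = v_adc / GAIN                          # undo the amplifier gain
    return v_shunt / R_SHUNT                        # Ohm's law: I = V / R

# Example: a mid-scale reading
print(f"{adc_counts_to_amperes(2048):.3f} A")
```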

 

Types and Mechanisms

Analog ammeter – Includes moving-coil (D'Arsonval) and moving-iron types, which use magnetic deflection to display current on a scale. These designs are valued for their simplicity, durability, and ability to provide continuous current readings.

Digital ammeter – Uses a shunt resistor to create a small voltage drop proportional to the current. This voltage is measured by an analog-to-digital converter (ADC) and displayed as a precise digital value. Digital ammeters often feature higher accuracy, wider measurement ranges, and additional functions such as data logging.

Clamp meter – Measures current without breaking the circuit by detecting the magnetic field around a conductor. This method is ideal for quick testing, especially in high-current applications or where live connections cannot be interrupted.

Current transformer (CT) ammeter – Designed for high-current AC systems, this type uses a transformer to scale down large primary currents into a safe, measurable secondary current for the meter.
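As a simple illustration of the CT scaling described above, the sketch below multiplies a secondary reading by the transformer ratio; the 200:5 ratio and the measured value are assumptions used only as an example.

```python
# Sketch: scaling a CT secondary reading back to the primary current.
# The 200:5 ratio and the example reading are illustrative assumptions.

CT_PRIMARY_RATED = 200.0    # rated primary current in amperes
CT_SECONDARY_RATED = 5.0    # rated secondary current in amperes

def primary_current(secondary_reading: float) -> float:
    """Multiply the meter's secondary reading by the CT ratio."""
    return secondary_reading * (CT_PRIMARY_RATED / CT_SECONDARY_RATED)

print(primary_current(3.2))  # a 3.2 A secondary reading corresponds to 128 A on the primary
```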

 

Shunts and Operation

A shunt resistor is a precision, low-resistance component used in many ammeters. In analog designs, it is placed in parallel with the meter movement, diverting most of the current to protect the instrument. In certain digital designs, it is placed in series with the circuit. By measuring the voltage drop across the shunt and applying Ohm’s law, the meter accurately calculates the current. This approach allows for measurement of very large currents without damaging the meter and helps maintain measurement stability.

 

Applications and Value

Ammeters are essential tools in electrical testing, short-circuit detection, continuity testing, and system diagnostics. They help identify overloads, open circuits, and unstable current conditions that may indicate equipment faults or inefficiencies.

In industrial, commercial, and residential settings, ammeters are used for equipment maintenance, troubleshooting, and performance monitoring. Specialized variants such as milliammeters and microammeters are designed for extremely low current measurements, while integrating ammeters track current over time to determine total electrical charge delivered to a device or system. For historical context on the development of measuring instruments, visit our history of electricity page to learn how electrical science evolved over time.

 

Practical Applications of Ammeters

Ammeters are used in a wide range of electrical and electronic work:

  • Automotive diagnostics – Measuring current draw from the battery to detect parasitic drains, starter motor issues, and charging system faults.

  • Solar panel and battery monitoring – Tracking current output from photovoltaic arrays and the charging/discharging rates of storage batteries to optimize system efficiency.

  • Industrial motor maintenance – Monitoring motor current to identify overload conditions, detect bearing wear, or confirm correct load operation.

  • Household appliance servicing – Checking current draw to troubleshoot faulty components or ensure devices operate within safe limits.

  • Power distribution systems – Ensuring current levels remain within capacity for cables, fuses, and protective devices.

 


 

Related Articles

 


Unit of Capacitance Explained

The unit of capacitance is the farad (F), which measures the amount of electric charge a capacitor stores per volt. Typically expressed in microfarads, nanofarads, or picofarads, it is essential in electronics, circuit design, and energy storage systems.

 

What is a Unit of Capacitance?

The unit of capacitance, the farad (F), measures the amount of electric charge a capacitor can store per volt. It is essential for understanding the function of capacitors in electronics, circuits, and energy storage technologies.

✅ 1 farad equals 1 coulomb per volt

✅ Common values include microfarads, nanofarads, and picofarads

✅ Used in electronics, circuits, power systems, and capacitor design

 

Capacitance is determined by the electrical charge, symbolized by the letter Q and measured in coulombs. Discover how capacitance interacts with other electrical quantities and gain a deeper understanding of its role in circuit design and performance. The coulomb is abbreviated with the letter C, the same letter used for capacitance, which can be confusing. One coulomb of charge is equivalent to approximately 6.25 × 10^18 electrons. The basic unit of capacitance is the farad, denoted by the letter F. By definition, one farad is the capacitance that stores one coulomb of charge when one volt is applied across its plates. The general formula for capacitance in terms of charge and voltage is:

C = Q / V

where C is the capacitance in farads, Q is the stored charge in coulombs, and V is the voltage across the plates in volts.


 

Understanding the Unit of Electric Capacitance

The unit of electric capacitance describes how a capacitor stores an electrical charge. This is achieved through two conductive plates, which form the essential structure of a parallel plate capacitor. These plates are separated by an insulating material, known as the dielectric, which prevents direct current flow while allowing the device to store energy.

A capacitor is a widely used electronic component, and it belongs to the class of passive electronic components since it does not generate energy but only stores it temporarily. The concept of capacitance was first defined by the English physicist Michael Faraday (1791–1867), whose pioneering work in electromagnetism laid the foundation of electrical science and made him one of the most influential figures in the field.

In modern practice, capacitance is measured in the SI derived unit of the farad (F). Because a farad is large, smaller units such as the nanofarad (nF) are commonly used to describe practical capacitors found in circuits. Whether measured in farads, microfarads, or nanofarads, the unit of electric capacitance remains the standard way of expressing a capacitor’s ability to store charge for reliable operation in electronic systems.

 

Farad in Practical Use

In practical terms, one farad is a large amount of capacitance. Typically, in electronics, much smaller units are used. The two more common smaller units are the microfarad (μF), which is 10^-6 farad, and the picofarad (pF), which is 10^-12 farad. To better understand the core principles behind charge and voltage, see our overview on what is a capacitor, which explains how capacitance functions in practical circuits.
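Because the prefixes are easy to mix up, here is a quick conversion sketch; the 4.7 nF value is an arbitrary example.

```python
# Sketch: converting between common capacitance units (example value is arbitrary).
farads = 4.7e-9          # 4.7 nF expressed in farads

print(f"{farads * 1e6:.4f} uF")   # 0.0047 uF
print(f"{farads * 1e9:.1f} nF")   # 4.7 nF
print(f"{farads * 1e12:.0f} pF")  # 4700 pF
```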

Voltage Rating of a Capacitor: Capacitors have limits on the voltage that can be applied across their plates. The technician must be aware of the voltage rating, which specifies the maximum DC voltage that can be applied without risking damage to the device. This voltage rating is typically referred to as the breakdown voltage, working voltage, or simply the voltage rating. If the voltage applied across the plates is too great, the dielectric will break down, and arcing will occur between the plates. The capacitor is then short-circuited, and the possible flow of direct current through it can cause damage to other parts of the equipment. For foundational knowledge that supports capacitance calculations, our what is voltage article defines the relationship between electric potential and stored charge.

A capacitor that can be safely charged to 500 volts DC cannot be safely subjected to AC or pulsating DC whose effective values are 500 volts. An alternating voltage of 500 volts (RMS) has a peak voltage of 707 volts, and a capacitor to which it is applied should have a working voltage of at least 750 volts. The capacitor should be selected so that its working voltage is at least 50 percent greater than the highest voltage to be applied. Learn about different types of components that influence total capacitance by reading our guide on types of capacitors, which compares materials, ratings, and applications.
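The arithmetic behind that guidance is straightforward; the sketch below uses the 500 V RMS figure from the example above and the 50 percent margin stated as a rule of thumb.

```python
import math

# Sketch: checking a capacitor voltage rating against an AC source
# (the 500 V RMS figure and 50% margin come from the example above).
v_rms = 500.0
v_peak = v_rms * math.sqrt(2)      # about 707 V peak for a 500 V RMS sine wave
v_rating_min = 1.5 * v_rms         # rule of thumb: rating at least 50% above the applied voltage

print(f"Peak voltage: {v_peak:.0f} V")
print(f"Suggested minimum working voltage: {v_rating_min:.0f} V")
```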

 

Voltage Rating and Dielectric Thickness

The voltage rating of the capacitor is a factor in determining the actual capacitance, as capacitance decreases with increasing dielectric thickness. A high-voltage capacitor with a thick dielectric must have a larger plate area to achieve the same capacitance as a similar low-voltage capacitor with a thin dielectric.

 

Table 1 – Dielectric Strength of Common Materials

Dielectric Material | Approx. Dielectric Strength (kV/mm) | Relative Permittivity (εr) | Notes / Applications
Vacuum | 30 | 1.0 | Reference value, ideal insulator
Air | 3 | ~1.0 | Baseline, used as standard
Paper | 16 | 3–4 | Used in older capacitors
Glass | 9–14 | 4–10 | High stability, low loss
Mica | 100 | 5–7 | Precision capacitors, RF use
Ceramic | 10–40 | 6–12 (varies) | Common in small capacitors
Polystyrene | 20–30 | 2.5–2.7 | Low loss, stable
Polyethylene | 20–30 | 2.2 | High-voltage applications
Teflon (PTFE) | 60–170 | 2.1 | Excellent insulator, stable
Oil (transformer) | 10–15 | 2.2–2.3 | Used in HV capacitors and transformers
Quartz | 8–10 | ~3.8 | Stable, heat resistant

 

Factors Affecting A Unit of Capacitance

  1. The capacitance of parallel plates is directly proportional to the area of the plates. A larger plate area produces a larger capacitance, and a smaller area produces less capacitance. If we double the area of the plates, there is room for twice as much charge. The charge that a capacitor can hold at a given potential difference is doubled, and since C = Q/V, the capacitance is doubled.

  2. The capacitance of parallel plates is inversely proportional to the spacing between them.

  3. The dielectric material affects the capacitance of parallel plates. The dielectric constant of a vacuum is defined as 1, and that of air is very close to 1. These values are used as a reference, and all other materials have values specified in relation to air (vacuum).

The strength of some commonly used dielectric materials is listed in Table 1. The voltage rating also depends on frequency, as the losses and resultant heating effect increase with higher frequencies. Discover how capacitance fits into the broader context of energy flow in circuits by visiting our what is electrical resistance page, offering insights on resistance and its effect on voltage and current.
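Those three factors come together in the standard parallel-plate formula, C = εr × ε0 × A / d. The sketch below evaluates it for an assumed small capacitor; the plate area, spacing, and relative permittivity are illustrative assumptions only.

```python
# Sketch: parallel-plate capacitance C = eps_r * eps_0 * A / d.
# Plate area, spacing, and relative permittivity are illustrative assumptions.

EPS_0 = 8.854e-12        # permittivity of free space, in F/m

area = 1e-4              # plate area: 1 cm^2 expressed in m^2
spacing = 1e-4           # plate separation: 0.1 mm expressed in m
eps_r = 8.0              # relative permittivity of an assumed ceramic dielectric

capacitance = eps_r * EPS_0 * area / spacing
print(f"{capacitance * 1e12:.1f} pF")   # roughly 70.8 pF
```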

 


What is Considered High Voltage? HV Applications Explained

What is considered high voltage? Per IEC/IEEE, voltages above 1 kV AC or 1.5 kV DC are classed as high voltage. The threshold is linked to insulation coordination, arc-flash risk, transmission lines, substations, switchgear ratings, clearance/creepage distances, and dielectric breakdown in power systems.

 

What Is Considered High Voltage?

Voltages above 1 kV AC or 1.5 kV DC are classed as high voltage per IEC/IEEE in power systems.

✅ IEC/IEEE: >1 kV AC or >1.5 kV DC thresholds

✅ Categories: MV, HV, EHV, UHV in power transmission

✅ Impacts insulation, clearance, arc-flash, switchgear ratings

 

What is Considered High Voltage?

In the world of electrical engineering, understanding voltage levels is crucial. So you might be asked to define high voltage. But what is considered HV? This article explores the definition, classification, and applications of HV and the safety concerns and precautions that come with it. For foundational context, the concept of voltage underpins how these levels are defined and compared across systems.


 

According to the International Electrotechnical Commission (IEC), HV is typically defined as any voltage above 1000 volts for alternating current (AC) systems and 1500 volts for direct current (DC) systems. However, the term "HV" can also refer to voltages as low as 50 volts in some safety regulations, depending on the context. For example, the US Occupational Safety and Health Administration (OSHA) defines HV as 600 volts or higher in their safety guidelines. Standards often reference nominal voltage values that guide equipment ratings, insulation clearances, and test criteria in practice.

High voltage systems are essential in electric power transmission and distribution, allowing electricity to be transported over long distances with minimal energy loss. Power lines, transmission lines, and transformers all play a role in the power transmission and distribution process. Transformers are used to step up or down voltage levels, depending on whether the electricity is transported over long distances or distributed to end-users. At the point of use, networks step down to low voltage levels suitable for residential and commercial equipment before final delivery.

Voltage classification is a method for organizing voltages based on their range. There are four primary classifications of voltage levels: low voltage (LV), medium voltage (MV), high voltage (HV), and extra-high voltage (EHV). Ultra-high voltage (UHV) is a further classification for exceptionally high voltages, typically used in long-distance power transmission projects. In distribution grids, medium voltage tiers bridge the gap between long-distance transmission and local feeders in a coordinated hierarchy.

Insulation is a crucial aspect of HV systems, as it prevents electrical current from leaking and causing short circuits, equipment damage, or even fires. Different types of insulation are used depending on the voltage level and application, such as air, gas, oil, or solid materials like plastics and ceramics. For clarity on terminology used in insulation, dielectric strength, and creepage distances, consult common electricity terms that standardize communication across projects.

HV circuits and equipment, such as transformers and switchgear, are designed to handle higher voltages safely and efficiently. These devices are essential components of power distribution networks and are subject to strict design, manufacturing, and testing standards to ensure reliability and safety.

Working with high voltage circuits presents several electrical hazards, such as electric shock, arc flash, and fires. To mitigate these risks, electrical safety measures must be put in place. Workers with HV equipment must follow safety procedures and use appropriate personal protective equipment (PPE), such as insulated gloves, safety glasses, and arc flash suits. Comprehensive electricity safety programs integrate procedures, labeling, lockout/tagout, and training to reduce incident rates.

So, what is considered high voltage? As mentioned earlier, the IEC defines HV as anything above 1,000 volts for AC and above 1,500 volts for DC. However, some safety regulations treat voltages as low as 50 or 600 volts as HV.

HV is used in power transmission and distribution to transport electricity efficiently over long distances. Transmission lines, transformers, and other equipment are designed to handle HVs and are integral to power distribution networks.

Safety concerns associated with HV systems include electric shock, arc flash, and fires. Proper safety procedures and protective equipment are necessary to minimize these risks. Understanding the broader dangers of electricity helps contextualize HV-specific risks and informs mitigation strategies.

Transformers handle HV levels by stepping up or stepping down the voltage, allowing for efficient power transmission and distribution. They are designed to withstand HV stresses and are subject to rigorous testing and standards.

Various types of insulation are needed for HV applications, including air, gas, oil, and solid materials like plastics and ceramics. The choice of insulation depends on the level and specific application requirements.

The different classifications of voltage levels include low, medium, HV, extra HV, and ultra HV. These classifications help categorize voltage ranges for various applications and safety standards.

When working with HV equipment, workers should follow safety procedures, use appropriate personal protective equipment, and undergo regular training to stay updated on best practices and safety guidelines.

In conclusion, understanding what is considered HV is crucial for electrical systems' safe and efficient operation. HV plays a vital role in power transmission and distribution, allowing electricity to be transported over long distances with minimal losses. Proper insulation, transformers, and other equipment are designed to handle HV levels and ensure the reliability of the electrical infrastructure. Safety concerns associated with HV systems must be addressed through stringent safety procedures, protective equipment, and worker training. We can maintain a safe and efficient electrical infrastructure by adhering to these guidelines and understanding the importance of classifications.


High, Extra-High, and Ultra-High Voltage Classifications

High, extra-high, and ultra-high voltage classifications are categories used to define the levels within electrical systems, particularly in power transmission and distribution networks. These classifications help standardize the design, manufacturing, and operation of electrical equipment and ensure safety and efficiency.


High Voltage (HV):

HV is typically defined as levels between 1000 volts (1 kV) and 100,000 volts (100 kV) for alternating current (AC) systems and between 1500 volts (1.5 kV) and 100,000 volts (100 kV) for direct current (DC) systems. HV systems are commonly used in electric power transmission and distribution networks, substations, and industrial facilities. HV allows for efficient power transmission over long distances while reducing energy loss due to resistance.


Extra-High Voltage (EHV):

Extra-high voltage refers to levels above 100,000 volts (100 kV) and up to 300,000 volts (300 kV) for AC systems and between 100,000 volts (100 kV) and 800,000 volts (800 kV) for DC systems. EHV systems are primarily used for long-distance power transmission, where higher levels reduce energy losses even further. EHV lines and equipment require specialized design, manufacturing, and maintenance to ensure safety, reliability, and efficiency. The use of extra-high voltage is also associated with more stringent safety protocols and larger right-of-way requirements for transmission lines.


 


Ultra-High Voltage (UHV):

Ultra-high voltage classification is designated for levels above 300,000 volts (300 kV) for AC systems and above 800,000 volts (800 kV) for DC systems. UHV systems are used in large-scale power transmission projects that aim to transmit massive amounts of electricity over very long distances with minimal losses. These projects typically connect major power generation sources, such as hydroelectric or nuclear plants, to far-off load centers or densely populated urban areas. As a result, UHV systems demand the highest level of engineering expertise, rigorous testing, and specialized equipment to ensure their safe and efficient operation.
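As a compact summary of the AC thresholds quoted in this article, a simple classifier might look like the sketch below. Note that the boundaries follow this article's convention; other standards draw the lines differently, so treat this as illustrative rather than definitive.

```python
# Sketch: classifying an AC system voltage using the thresholds quoted above.
# Boundaries follow this article's convention; other standards differ.

def classify_ac_voltage(volts: float) -> str:
    if volts <= 1_000:
        return "Low or medium voltage (at or below 1 kV)"
    if volts <= 100_000:
        return "High voltage (HV)"
    if volts <= 300_000:
        return "Extra-high voltage (EHV)"
    return "Ultra-high voltage (UHV)"

for v in (600, 13_800, 230_000, 765_000):
    print(v, "->", classify_ac_voltage(v))
```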


 

 


How Is Electricity Generated?

Electricity is produced by converting various energy sources, such as fossil fuels, nuclear, solar, wind, or hydro, into electrical energy using turbines and generators. These systems harness mechanical or chemical energy and transform it into usable power.

 

How Is Electricity Generated?

✅ Converts energy sources like coal, gas, wind, or sunlight into power

✅ Uses generators driven by turbines to create electrical current

✅ Supports global power grids and industrial, commercial, and residential use

 

Understanding Electricity Generation

Electricity generation is the lifeblood of modern civilization, powering homes, industries, hospitals, transportation systems, and digital infrastructure. But behind the flip of a switch lies a vast and complex process that transforms raw energy into electrical power. At its core, electricity is generated by converting various forms of energy—mechanical, thermal, chemical, or radiant—into a flow of electric charge through systems engineered for efficiency and reliability.

Understanding the role of voltage is essential in this process, as it determines the electrical pressure that drives current through circuits.

According to the Energy Information Administration, the United States relies on a diverse mix of technologies to produce electric power, including fossil fuels, nuclear power, and renewables. In recent years, the rapid growth of solar photovoltaic systems and the widespread deployment of wind turbines have significantly increased the share of clean energy in the national grid. These renewable systems often use turbines to generate electricity by converting natural energy sources—sunlight and wind—into mechanical motion and ultimately electrical power. This transition reflects broader efforts to reduce emissions while meeting rising electric power demand.

 

How Power Generation Works

Most electricity around the world is produced using turbines and generators. These devices are typically housed in large-scale power plants. The process begins with an energy source—such as fossil fuels, nuclear reactions, or renewable inputs like water, wind, or sunlight—which is used to create movement. This movement, in turn, drives a turbine, which spins a shaft connected to a generator. Inside the generator, magnetic fields rotate around conductive coils, inducing a voltage and producing alternating current (AC) electricity. This method, known as electromagnetic induction, is the fundamental mechanism by which nearly all electric power is made.
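The electromagnetic induction mentioned above is usually summarized by Faraday's law, which in its standard textbook form relates the induced electromotive force to the rate of change of magnetic flux through the generator's coils:

```latex
% Faraday's law of electromagnetic induction (standard textbook form)
\mathcal{E} = -N \, \frac{d\Phi_B}{dt}
```

Here the EMF is in volts, N is the number of turns in the coil, and the flux through each turn is in webers; the minus sign expresses Lenz's law, meaning the induced current opposes the change that produces it.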

In designing and maintaining electrical systems, engineers must also consider voltage drop, which can reduce efficiency and power quality. You can evaluate system losses using our interactive voltage drop calculator, and better understand the math behind it using the voltage drop formula.

 

Energy Sources Used in Power Production

Steam turbines remain the dominant technology in global energy production. These are especially common in plants that burn coal, natural gas, or biomass, or that rely on nuclear fission. In a typical thermal power plant, water is heated to create high-pressure steam, which spins the turbine blades. In nuclear facilities, this steam is generated by the immense heat released when uranium atoms are split. While highly efficient, these systems face environmental and safety concerns—greenhouse gas emissions from fossil fuels, radioactive waste and accident risk from nuclear power.

Power quality in these plants can be impacted by voltage sag, which occurs when systems experience a temporary drop in electrical pressure, often due to sudden large loads or faults. Managing such variations is crucial to stable output.

 

The Rise of Renewable Energy in Electricity Generation

Alongside these large-scale thermal technologies, renewable sources have grown significantly. Hydroelectric power harnesses the kinetic energy of falling or flowing water, typically from a dam, to spin turbines. Wind energy captures the movement of air through large blades connected to horizontal-axis turbines. Solar power generates electricity in two distinct ways: photovoltaic cells convert sunlight directly into electric power using semiconductors, while solar thermal plants concentrate sunlight to heat fluids and produce steam. Geothermal systems tap into the Earth’s internal heat to generate steam directly or via heat exchangers.

These renewable systems offer major advantages in terms of sustainability and environmental impact. They produce no direct emissions and rely on natural, often abundant energy flows. However, they also face limitations. Solar and wind power are intermittent, meaning their output fluctuates with weather and time of day. Hydropower and geothermal are geographically constrained, only viable in certain regions. Despite these challenges, renewables now account for a growing share of global electricity generation and play a central role in efforts to decarbonize the energy sector.

In areas where water and electricity coexist—such as hydroelectric plants—understanding the risks associated with water and electricity is critical to ensure operational safety and prevent electrocution hazards.

 

Generators and Turbines: The Heart of Electricity Generation

Generators themselves are marvels of electromechanical engineering. They convert rotational kinetic energy into electrical energy through a system of magnets and copper windings. Their efficiency, durability, and capacity to synchronize with the grid are critical to a stable electric power supply. In large plants, multiple generators operate in parallel, contributing to a vast, interconnected grid that balances supply and demand in real-time.

Turbines, powered by steam, water, gas, or wind, generate the rotational force needed to drive the generator. Their design and performance have a significant impact on the overall efficiency and output of the plant. Measuring output accurately requires devices like a watthour meter or wattmeters, which are standard tools in generation stations.

Technicians often use formulas such as Watt’s Law to determine power consumption and verify performance. Understanding what ammeters measure also plays a role in monitoring electrical current flowing through generator systems.


What is Electricity?

Electricity is the flow of electric charge, usually through a conductor like wire. It powers lights, appliances, and machines by converting energy into motion, heat, or light. Electricity can be generated from sources such as fossil fuels, wind, solar, or water.

 

What is electricity?

Electricity is a fundamental form of energy created by the movement of electrons.

✅ Powers homes, industries, and electronic devices

✅ Flows through circuits as an electric current

✅ Generated from renewable and non-renewable sources

The power we use is a secondary energy source because it is produced by converting primary energy sources such as coal, natural gas, nuclear, solar, and wind energy into electrical power. It is also referred to as an energy carrier, meaning it can be converted into other forms of energy, such as mechanical or thermal energy.

Primary energy sources are either renewable or nonrenewable, but electricity itself is neither; as an energy carrier, it simply delivers the energy converted from those sources.

To understand why electrons move in the first place, start with voltage, the electrical “pressure” that pushes charge through every circuit.

 

Electricity Has Changed Everyday Life

Although most people rarely think about electricity, it has profoundly changed how we live. It is as essential as air or water, yet we tend to take it for granted—until it’s gone. Electricity powers heating and cooling systems, appliances, communications, entertainment, and modern conveniences that past generations never imagined.

Before widespread electrification began just over a century ago, homes were lit with candles or oil lamps, food was cooled with ice blocks, and heating was provided by wood- or coal-burning stoves.

The steady stream of electrons we use daily is explored in our primer on current electricity.

 

Discovering Electricity: From Curiosity to Power Grid

Scientists and inventors began unlocking the secrets of electricity as early as the 1600s. Over the next few centuries, their discoveries built the foundation for the electric age.

Benjamin Franklin demonstrated that lightning is a form of electricity.

Thomas Edison invented the first commercially viable incandescent light bulb.

Nikola Tesla pioneered the use of alternating current (AC), which enabled the efficient transmission of electricity over long distances. He also experimented with wireless electricity.

Curious why Tesla’s ideas beat Edison’s? Our article on alternating current breaks down the advantages of alternating current (AC) over direct current (DC).

Before Tesla’s innovations, arc lighting used direct current (DC) but was limited to outdoor and short-range applications. His work made it possible for electricity to be transmitted to homes and factories, revolutionizing lighting and industry.

 

Understanding Electric Charge and Current

Electricity is the movement of electrically charged particles, typically electrons. These particles can move either statically, as in a buildup of charge, or dynamically, as in a flowing current.

All matter is made of atoms, and each atom consists of a nucleus with positively charged protons and neutral neutrons, surrounded by negatively charged electrons. Usually, the number of protons and electrons is balanced. But when that balance is disturbed—when electrons are gained or lost—an electric current is formed as those electrons move.

For a step-by-step walkthrough of everything from circuits to safety, visit how electricity works.

 

Electricity as a Secondary Energy Source

Electricity doesn’t occur naturally in a usable form. It must be generated by converting other types of energy. In fact, electricity is a manufactured product. That’s why electricity is called a secondary energy source—it carries energy from its original form to where we need it.

We generate electricity by transforming mechanical energy—such as spinning a turbine—into electrical energy. This conversion happens at power plants that use a variety of fuels and methods:

  • Fossil fuels (coal, oil, natural gas)

  • Nuclear energy

  • Renewable sources like wind, solar, and hydroelectric

If turbines, magnets, and power plants intrigue you, see how electricity is generated for a deeper dive.

 

How Electricity Was Brought Into Homes

Before electricity generation began on a mass scale, cities often developed near waterfalls, where water wheels powered mills and machines. The leap from mechanical energy to electrical energy enabled power to travel not just across a town, but across entire countries.

Beginning with Franklin’s experiments and followed by Edison’s breakthrough with indoor electric light, the practical uses of electricity expanded rapidly. Tesla’s AC power system made widespread electric distribution feasible, bringing light, heat, and industry to homes and cities worldwide.

 

How Transformers Changed Everything

To transmit electricity efficiently over long distances, George Westinghouse and his engineer William Stanley developed a practical transformer. This device adjusts the voltage of electrical power to match its purpose—high for long-range travel, low for safe use in homes.

Transformers made it possible to supply electricity to homes and businesses far from power plants. The electric grid became a coordinated system of generation, transmission, distribution, and regulation.

Even today, most of us rarely consider the complexity behind our wall sockets. But behind every outlet lies a vast infrastructure keeping electricity flowing safely and reliably.

 

How Is Electricity Generated?

Electric generators convert mechanical energy into electricity using the principles of magnetism. When a conductor—such as a coil of wire—moves through a magnetic field, an electric current is induced.

In large power stations, turbines spin magnets inside massive generators. These turbines are driven by steam, water, or wind. The rotating magnet induces small currents in the coils of wire, which combine into a single continuous flow of electric power.

Discover the principle that turns motion into power in electromagnetic induction, the heart of every modern generator.

 

Measuring Electricity

Electricity is measured in precise units. The amount of power being used or generated is expressed in watts (W), named after inventor James Watt.

  • One watt is a small unit of power; 1,000 watts equal one kilowatt (kW).

  • Energy use over time is measured in kilowatt-hours (kWh).

  • A 100-watt bulb burning for 10 hours uses 1 kWh of electricity.

These units are what you see on your electric bill. They represent how much electricity you’ve consumed over time—and how much you’ll pay.

When it’s time to decode your energy bill, the chart in electrical units makes watts, volts, and amps clear.
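The bulb example above is a simple multiplication; here it is as a short sketch, with an assumed electricity price used purely for illustration.

```python
# Sketch: energy use and cost for the 100 W bulb example above.
# The price per kWh is an illustrative assumption.

power_watts = 100.0
hours = 10.0
price_per_kwh = 0.15       # assumed tariff in dollars per kWh

energy_kwh = power_watts * hours / 1000.0   # 100 W x 10 h = 1,000 Wh = 1 kWh
cost = energy_kwh * price_per_kwh

print(f"{energy_kwh:.2f} kWh, costing about ${cost:.2f}")
```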

 


What is a Voltmeter?

What is a voltmeter? A voltmeter is an electrical measuring instrument used to determine the voltage across circuit points. Common in electronics, engineering, and power systems, it ensures accuracy, safety, and efficiency when monitoring voltage and diagnosing electrical performance.

 

What is a Voltmeter?

A Voltmeter provides a method to accurately measure voltage, which is the difference in electric potential between two points in a circuit, without changing the voltage in that circuit. It is an instrument used for measuring voltage drop.

✅ Ensures accurate voltage measurement for safety and performance

✅ Used in electrical engineering, electronics, and power systems

✅ Helps diagnose faults and maintain efficient operation

Electrical current consists of a flow of charge carriers. Voltage, also known as electromotive force (EMF) or potential difference, manifests as "electrical pressure" that enables current to flow. Given an electric circuit under test with a constant resistance, the current through the circuit varies directly in proportion to the voltage across the circuit. A voltmeter measures potential difference, which directly relates to Ohm’s Law, the fundamental equation connecting voltage, current, and resistance in circuits.

A voltmeter can take many forms, from the classic analog voltmeter with a moving needle to modern instruments like the digital voltmeter (DVM) or the versatile digital multimeter. These tools are essential for measuring electrical values in electronic devices, enabling technicians to measure voltage, current, and resistance with precision and accuracy. While analog units provide quick visual feedback, digital versions deliver more precise measurements across wider voltage ranges, making them indispensable for troubleshooting and maintaining today’s complex electrical systems.

A voltmeter can be tailored to have various full-scale ranges by switching different values of resistance in series with the microammeter, as shown in Fig. 3-6. A voltmeter exhibits high internal resistance because these series resistors have large ohmic values; the higher the full-scale voltage range, the larger the series resistance required, and therefore the larger the meter's internal resistance. To understand how a voltmeter works, it helps to first review basic electricity, as voltage, current, and resistance form the foundation of all electrical measurements.

 


 

Fig 3-6. A simple circuit using a microammeter (µA) to measure DC voltage.

 

A Voltmeter, whether digital or analog, should have high resistance, and the higher the better. You don't want the meter to draw a lot of current from the power source. (Ideally, it wouldn't draw any current at all.) The power-supply current should go, as much as possible, towards operating whatever circuit or system you want to use, not into getting a meter to tell you the voltage. A voltmeter is commonly used to measure voltage drop across conductors or devices, helping electricians ensure circuits operate efficiently and safely. For quick calculations, a voltage drop calculator provides accurate estimates of conductor losses based on length, size, and current. Understanding the voltage drop formula allows engineers and technicians to apply theoretical principles when designing or troubleshooting electrical systems.

Also, you might not want to keep the voltmeter constantly connected in parallel in the circuit. You may need the voltmeter for testing various circuits, and you don't want the behavior of a circuit to be affected the moment you connect or disconnect the voltmeter. The less current a voltmeter draws, the less it affects the behavior of anything that operates from the power supply. Engineers rely on voltmeters in power system analysis, where accurate voltage readings are crucial for ensuring safety, reliability, and optimal performance.

Alternative types of voltmeters use electrostatic deflection, rather than electromagnetic deflection, to produce their readings. Remember that electric fields produce forces, just as magnetic fields do; therefore, a pair of electrically charged plates attracts or repels each other. An electrostatic voltmeter utilizes the attractive force between two plates carrying opposite electric charges, or a large potential difference. Figure 3-7 portrays the functional mechanics of an electrostatic meter; it constitutes, in effect, a sensitive, calibrated electroscope. Because nothing but air exists between the plates, and air is a nearly perfect electrical insulator, an electrostatic voltmeter draws essentially no current from the power supply. A properly designed electrostatic meter can measure both AC voltage and DC voltage. However, the construction tends to be fragile, and mechanical vibration can influence the reading.

 

 

Fig 3-7. Functional drawing of an electrostatic voltmeter movement.

 


If you connect an ammeter directly across a source of voltage, such as a battery, the meter needle will deflect. In fact, a milliammeter needle will probably be "pinned" if you do this with it, and a microammeter might well be wrecked by the force of the needle striking the pin at the top of the scale. For this reason, you should never connect milliammeters or microammeters directly across voltage sources. An ammeter, perhaps with a range of 0-10 A, may not deflect to full scale if it is placed across a battery; however, it's still a bad idea to do so, as it will rapidly drain the battery. Some batteries, such as automotive lead-acid cells, can explode under these conditions. This is because all ammeters have low internal resistance. They are designed that way deliberately. They are meant to be connected in series with other parts of a circuit, not right across the power supply. Because voltage is inseparable from current, learning what is current electricity provides deeper insight into why voltmeters are vital diagnostic tools.

But if you place a large resistor in series with an ammeter, and then connect the ammeter across a battery or other type of power supply, you no longer have a short circuit. The ammeter will give an indication that is directly proportional to the voltage of the supply. The smaller the full-scale reading of the ammeter, the larger the resistance needed to get a meaningful indication on the meter. Using a microammeter and a very large series resistor, a voltmeter can be devised that draws only a small current from the source.
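As a minimal sketch of that idea, the series "multiplier" resistance for a given voltage range follows directly from Ohm's law; the meter's full-scale current, coil resistance, and desired range below are assumptions, not values taken from the figures in this article.

```python
# Sketch: sizing the series (multiplier) resistor that turns a microammeter into a voltmeter.
# The full-scale current, coil resistance, and desired range are illustrative assumptions.

I_FULL_SCALE = 50e-6     # assumed 50 uA full-scale movement
R_COIL = 2000.0          # assumed coil resistance in ohms
V_RANGE = 10.0           # desired full-scale voltage range in volts

# At full-scale deflection the whole range voltage appears across coil + multiplier:
R_MULTIPLIER = V_RANGE / I_FULL_SCALE - R_COIL

print(f"Series multiplier resistance: {R_MULTIPLIER:,.0f} ohms")   # about 198,000 ohms
```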

So, What is a Voltmeter? In summary, a voltmeter is a fundamental instrument for electrical work, allowing professionals and students to accurately measure voltage and understand circuit behaviour. Whether using an analog or digital design, voltmeters and multimeters provide precise insights that support safety, efficiency, and reliable performance in electrical systems.

