What is Ohm's Law?



Ohm’s Law defines the essential link between voltage, current, and resistance in electrical circuits. It provides the foundation for circuit design, accurate troubleshooting, and safe operation in both AC and DC systems, making it a core principle of electrical engineering.

 

What is Ohm’s Law?

Ohm’s Law is a fundamental principle of electrical engineering and physics, describing how voltage, current, and resistance interact in any circuit.

✅ Defines the relationship between voltage, current, and resistance

✅ Provides formulas for design, safety, and troubleshooting

✅ Essential for understanding both AC and DC circuits

When asking what is Ohm’s Law, it is useful to compare it with other fundamental rules like Kirchhoff’s Law and Ampere’s Law, which expand circuit analysis beyond a single equation.

 

What is Ohm's Law as a Fundamental Principle

Ohm's Law describes the relationship between voltage, current, and resistance in electrical circuits. By understanding this principle, engineers can design safe and efficient circuits, and technicians can troubleshoot and repair faulty ones. Its applications range from designing and selecting circuit components to identifying defective ones, making it essential knowledge for anyone working with electrical circuits and systems.

 

Who was Georg Ohm?

Georg Simon Ohm, born in 1789 in Erlangen, Germany, was a physicist and mathematician who sought to explain the nature of electricity. In 1827, he published The Galvanic Circuit Investigated Mathematically, a groundbreaking work that defined the proportional relationship between voltage, current, and resistance. Though his research was initially dismissed, it later became recognized as one of the cornerstones of modern electrical science.

His work introduced key concepts such as electrical resistance and conductors, and his law became fundamental to circuit design and analysis. The scientific community honored his contribution by naming the unit of resistance — the ohm (Ω) — after him. Today, every student and professional who studies electricity carries his legacy forward.

Georg Simon Ohm

 

What is the Ohm’s Law Formula?

At the heart of the law is a simple but powerful equation:

V = I × R

  • V is voltage, measured in volts (V)

  • I is current, measured in amperes (A)

  • R is resistance, measured in ohms (Ω)

Rearranging the formula gives I = V/R and R = V/I, making it possible to solve for any unknown value when the other two are known. This flexibility allows engineers to calculate required resistor values, predict circuit performance, and confirm safe operating conditions.
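As a quick illustration, here is a minimal Python sketch (not from the original article; the values are arbitrary) that wraps all three rearrangements in one helper and solves for whichever quantity is unknown:

    def ohms_law(v=None, i=None, r=None):
        """Solve V = I * R for whichever value is missing (give exactly two)."""
        if v is None:
            return i * r          # V = I x R
        if i is None:
            return v / r          # I = V / R
        if r is None:
            return v / i          # R = V / I
        raise ValueError("leave exactly one of v, i, r unset")

    print(ohms_law(v=9, r=3))     # current: 3.0 A
    print(ohms_law(i=2, r=6))     # voltage: 12 V
    print(ohms_law(v=12, i=2))    # resistance: 6.0 ohms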

In both DC and AC systems, the law provides the same basic relationship. In AC, where current and voltage vary with time, resistance is replaced with impedance, but the proportional link remains the same.

The Ohm’s Law equation explains how the electric current flowing through a circuit depends on the applied voltage and resistance: current is directly proportional to voltage and inversely proportional to resistance. To maintain consistency in calculations, the law employs standard units: volts (V) for voltage, amperes (A) for current, and ohms (Ω) for resistance. Because the formula defines the relationship between these values, it connects directly to related concepts such as electrical resistance and voltage.

 

Understanding the Formula

The strength of Ohm’s Law lies in its versatility. With just two known values, the third can be calculated, turning raw measurements into useful information. For an engineer, this might mean calculating the resistor needed to protect a sensitive device. For a technician, it may indicate whether a failing motor is caused by excess resistance or a low supply voltage.

 

How the Formula Works in Practice

Consider a simple example: a 12-volt battery connected to a 6-ohm resistor. Using the law, the current is I = V/R = 12 ÷ 6 = 2 amperes. If resistance doubles, the current halves. If the voltage increases, the current rises proportionally.

In practical terms, Ohm’s Law is used to:

  • calculate resistor values in electronic circuits,

  • verify safe current levels in wiring and equipment,

  • determine whether industrial loads are drawing excessive power,

  • troubleshoot faults by comparing measured and expected values.

Each of these tasks depends on the same simple equation first described nearly two centuries ago. Applying Ohm’s Law often involves calculating current in DC circuits and comparing it with alternating current systems, where impedance replaces simple resistance.
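To make the AC point concrete, here is a hedged Python sketch using complex numbers, where impedance Z replaces R in the same equation (the series R-L load and its values are assumed purely for illustration):

    import math

    # Series R-L load on a 120 V, 60 Hz supply (illustrative values)
    V = 120.0                             # RMS volts
    R = 8.0                               # resistance, ohms
    L = 0.02                              # inductance, henries
    f = 60.0                              # frequency, Hz

    Z = complex(R, 2 * math.pi * f * L)   # impedance Z = R + jwL
    I = V / Z                             # Ohm's law with impedance: I = V / Z

    print(f"|Z| = {abs(Z):.2f} ohms, |I| = {abs(I):.2f} A, "
          f"phase = {math.degrees(math.atan2(I.imag, I.real)):.1f} deg")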

 

Modern Applications of Ohm’s Law

Far from being outdated, Ohm’s Law remains central to modern technology. In electronics, it ensures safe current levels in devices from smartphones to medical equipment. In renewable energy, it governs the design and balance of solar panels and wind turbines. In automotive and electric vehicle systems, battery management and charging depend on accurate application of the law. Even in telecommunications, it ensures signals travel efficiently across cables and transmission lines. In power engineering, Ohm’s Law works alongside Watt’s Law and power factor to determine efficiency, energy use, and safe operating conditions.

These examples demonstrate that the law is not a relic of early science but an active tool guiding the design and operation of contemporary systems.

 

Resistance, Conductivity, and Real-World Limits

Resistance is a material’s opposition to current flow, while conductivity — its inverse — describes how freely charge moves. Conductors, such as copper and aluminum, are prized for their high conductivity, while insulators, like rubber and glass, prevent unwanted current flow.

In reality, resistance can change with temperature, pressure, and frequency, making some devices nonlinear. Semiconductors, diodes, and transistors do not always follow Ohm’s Law precisely. In AC systems, resistance expands to impedance, which also considers inductance and capacitance. Despite these complexities, the proportional relationship between voltage and current remains an essential approximation for analysis and design. Exploring basic electricity and related principles of electricity and magnetism shows why Ohm’s Law remains a cornerstone of both theoretical study and practical engineering.

 

Frequently Asked Questions


What is an example of Ohm's Law?

A simple example in action is a circuit consisting of a battery, a resistor, and a light bulb. If the voltage supplied by the battery increases, the current flowing through the circuit will also increase, causing the light bulb to glow brighter. Conversely, if the resistance of the circuit is increased by adding another resistor, the current flowing through the circuit will decrease, causing the light bulb to dim.


What are the three formulas in Ohm's Law?

The three formulas are I = V/R, V = IR, and R = V/I. These formulas can be used to solve a wide range of problems involving electrical circuits.


Does Ohm’s Law apply to all electrical devices?

Not always. Devices such as diodes and transistors are nonlinear, meaning their resistance changes with operating conditions. In these cases, Ohm’s Law provides only an approximation.

When asking What is Ohm’s Law, it becomes clear that it is far more than a formula. It is the framework that makes electricity predictable and manageable. By linking voltage, current, and resistance, it offers a universal foundation for design, troubleshooting, and innovation. From the earliest experiments to today’s electronics and power grids, Georg Ohm’s insight remains as relevant as ever.

Related News

What is a Capacitor?

A capacitor is an electrical component that stores and releases energy in a circuit. It consists of two conductive plates separated by an insulator and is commonly used for filtering, power conditioning, and energy storage in electronic and electrical systems.

 

What is a Capacitor?

A capacitor is a key component in electronics and power systems. It temporarily stores electrical energy and is widely used in both AC and DC circuits.

✅ Stores and discharges electrical energy efficiently

✅ Used in filtering, timing, and power factor correction

✅ Found in electronics, motors, and power supplies

A capacitor stores electric charge that can be released when needed. In this article, we will delve into the fundamentals of capacitors, including their functions, types, and applications. To better understand how capacitors support overall system performance, explore our Power Quality overview covering the fundamentals of voltage stability and energy flow.


A capacitor consists of two metallic plates separated by an insulating material known as the dielectric. The dielectric can be made from various materials, such as mica, paper, or ceramic. When voltage is applied across the plates, positive charges accumulate on one plate, while negative charges accumulate on the opposite plate. The amount of capacitor charge that can be stored depends on several factors, including plate area, plate separation, dielectric material, and voltage ratings. Capacitors are often used in capacitor banks to improve power factor and reduce energy losses in electrical systems.

How does a capacitor work? The primary function of a capacitor in an electronic circuit is to store electrical energy. Capacitors can be used for various purposes, such as filtering, timing, and coupling or decoupling signals. In addition, they play a crucial role in power supplies, ensuring that the output voltage remains stable even when there are fluctuations in the input voltage. Learn how capacitive loads influence circuit behavior and why they require precise capacitor selection for optimal performance.

A capacitor stores energy through the electrostatic field created between its plates. The stored energy can be calculated using the formula E = 0.5 * C * V^2, where E is the stored energy, C is the capacitance, and V is the voltage across the capacitor. Capacitance, measured in Farads, is a measure of a capacitor's ability to store charge. The capacitor voltage rating is crucial for ensuring safe operation and preventing dielectric breakdown during voltage spikes.
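As a quick check of that energy formula, here is a minimal Python sketch with assumed component values:

    # Energy stored in a capacitor: E = 0.5 * C * V^2
    C = 470e-6                 # capacitance in farads (470 uF, assumed)
    V = 25.0                   # voltage across the capacitor, volts (assumed)

    E = 0.5 * C * V**2         # energy in joules
    print(f"Stored energy: {E:.4f} J")   # about 0.147 J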

There are several types of capacitors, each with unique applications. Common types include ceramic, electrolytic, film, and tantalum capacitors. Ceramic capacitors are widely used due to their low cost and small size; they are ideal for high-frequency applications and decoupling in power supply circuits. Electrolytic capacitors, on the other hand, are popular for their high capacitance values and are commonly used in filtering and energy storage applications. Capacitors also play a crucial role in power factor correction, enabling industrial systems to reduce demand charges and enhance energy efficiency.

Dielectric materials used in capacitors can be organic (such as paper) or inorganic (such as ceramic). The choice of dielectric material depends on factors like the desired capacitance value, voltage rating, and operating temperature range. Additionally, different dielectric materials exhibit varying properties, making them suitable for specific applications. For a deeper understanding of energy relationships, see how apparent power differs from real and reactive power in systems using capacitors.

A capacitor can be classified as polarized or non-polarized based on the presence or absence of polarity. Polarized capacitors, like electrolytic capacitors, have a positive and a negative terminal and must be connected correctly in a circuit to function properly. Non-polarized capacitors, like ceramic capacitors, do not have a specific polarity and can be connected in any orientation.

A capacitor behaves differently in AC and DC circuits. In DC circuits, once a capacitor is charged, it blocks the flow of current, essentially acting as an open circuit. In AC circuits, however, capacitors pass alternating current. This phenomenon is described as displacement current, which arises from the continuous charging and discharging of the plates.
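One way to see this numerically is through capacitive reactance, Xc = 1/(2πfC), a standard result the article itself does not derive. The sketch below (with an assumed capacitance) shows reactance becoming infinite at DC and falling as frequency rises:

    import math

    def capacitive_reactance(f_hz, c_farads):
        """Xc = 1 / (2 * pi * f * C); approaches infinity as f -> 0 (DC)."""
        if f_hz == 0:
            return math.inf        # DC: a charged capacitor blocks current
        return 1.0 / (2 * math.pi * f_hz * c_farads)

    C = 10e-6                      # 10 uF, assumed
    for f in (0, 60, 1000):
        print(f"{f:>5} Hz -> Xc = {capacitive_reactance(f, C):.1f} ohms")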

So, what is a capacitor? Understanding how a capacitor works is essential for anyone interested in electronics. Capacitors play a vital role in a wide range of applications, from energy storage and filtering to signal coupling and decoupling, and knowing the various types and their specific applications enables you to make informed decisions when designing or troubleshooting electronic circuits. Explore how an automatic power factor controller dynamically adjusts capacitor usage to maintain an efficient power factor in real time.

 

Related Articles

 


How Is Electricity Generated?

It is produced by converting various energy sources, such as fossil fuels, nuclear, solar, wind, or hydro, into electrical energy using turbines and generators. These systems harness mechanical or chemical energy and transform it into usable power.

 

How Is Electricity Generated?

✅ Converts energy sources like coal, gas, wind, or sunlight into power

✅ Uses generators driven by turbines to create electrical current

✅ Supports global power grids and industrial, commercial, and residential use

 

Understanding Electricity Generation

Electricity generation is the lifeblood of modern civilization, powering homes, industries, hospitals, transportation systems, and digital infrastructure. But behind the flip of a switch lies a vast and complex process that transforms raw energy into electrical power. At its core, electricity is generated by converting various forms of energy—mechanical, thermal, chemical, or radiant—into a flow of electric charge through systems engineered for efficiency and reliability.

Understanding the role of voltage is essential in this process, as it determines the electrical pressure that drives current through circuits.

According to the Energy Information Administration, the United States relies on a diverse mix of technologies to produce electric power, including fossil fuels, nuclear power, and renewables. In recent years, the rapid growth of solar photovoltaic systems and the widespread deployment of wind turbines have significantly increased the share of clean energy in the national grid. These renewable systems often use turbines to generate electricity by converting natural energy sources—sunlight and wind—into mechanical motion and ultimately electrical power. This transition reflects broader efforts to reduce emissions while meeting rising electric power demand.

 

How Power Generation Works

Most electricity around the world is produced using turbines and generators. These devices are typically housed in large-scale power plants. The process begins with an energy source—such as fossil fuels, nuclear reactions, or renewable inputs like water, wind, or sunlight—which is used to create movement. This movement, in turn, drives a turbine, which spins a shaft connected to a generator. Inside the generator, magnetic fields rotate around conductive coils, inducing a voltage and producing alternating current (AC) electricity. This method, known as electromagnetic induction, is the fundamental mechanism by which nearly all electric power is made.

In designing and maintaining electrical systems, engineers must also consider voltage drop, which can reduce efficiency and power quality. You can evaluate system losses using our interactive voltage drop calculator, and better understand the math behind it using the voltage drop formula.
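The underlying math is just Ohm's Law applied to the conductor itself. Below is a minimal sketch, not the site's calculator, using assumed values for a round-trip copper run:

    # Voltage drop over an out-and-back copper run: Vdrop = I * R_wire
    RHO_COPPER = 1.68e-8           # resistivity of copper, ohm*meters
    length_m = 30.0                # one-way run length, meters (assumed)
    area_m2 = 3.31e-6              # 12 AWG cross-section, about 3.31 mm^2
    current_a = 15.0               # load current, amperes (assumed)

    r_wire = RHO_COPPER * (2 * length_m) / area_m2   # both conductors
    v_drop = current_a * r_wire
    print(f"R = {r_wire:.4f} ohms, drop = {v_drop:.2f} V "
          f"({100 * v_drop / 120:.1f}% of a 120 V supply)")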

 

Energy Sources Used in Power Production

Steam turbines remain the dominant technology in global energy production. These are especially common in plants that burn coal, natural gas, or biomass, or that rely on nuclear fission. In a typical thermal power plant, water is heated to create high-pressure steam, which spins the turbine blades. In nuclear facilities, this steam is generated by the immense heat released when uranium atoms are split. While highly efficient, these systems face environmental and safety concerns—greenhouse gas emissions from fossil fuels, radioactive waste and accident risk from nuclear power.

Power quality in these plants can be impacted by voltage sag, which occurs when systems experience a temporary drop in electrical pressure, often due to sudden large loads or faults. Managing such variations is crucial to stable output.

 

The Rise of Renewable Energy in Electricity Generation

Alongside these large-scale thermal technologies, renewable sources have grown significantly. Hydroelectric power harnesses the kinetic energy of falling or flowing water, typically from a dam, to spin turbines. Wind energy captures the movement of air through large blades connected to horizontal-axis turbines. Solar power generates electricity in two distinct ways: photovoltaic cells convert sunlight directly into electric power using semiconductors, while solar thermal plants concentrate sunlight to heat fluids and produce steam. Geothermal systems tap into the Earth’s internal heat to generate steam directly or via heat exchangers.

These renewable systems offer major advantages in terms of sustainability and environmental impact. They produce no direct emissions and rely on natural, often abundant energy flows. However, they also face limitations. Solar and wind power are intermittent, meaning their output fluctuates with weather and time of day. Hydropower and geothermal are geographically constrained, only viable in certain regions. Despite these challenges, renewables now account for a growing share of global electricity generation and play a central role in efforts to decarbonize the energy sector.

In areas where water and electricity coexist—such as hydroelectric plants—understanding the risks associated with water and electricity is critical to ensure operational safety and prevent electrocution hazards.

 

Generators and Turbines: The Heart of Electricity Generation

Generators themselves are marvels of electromechanical engineering. They convert rotational kinetic energy into electrical energy through a system of magnets and copper windings. Their efficiency, durability, and capacity to synchronize with the grid are critical to a stable electric power supply. In large plants, multiple generators operate in parallel, contributing to a vast, interconnected grid that balances supply and demand in real-time.

Turbines, powered by steam, water, gas, or wind, generate the rotational force needed to drive the generator. Their design and performance have a significant impact on the overall efficiency and output of the plant. Measuring output accurately requires devices like a watthour meter or wattmeters, which are standard tools in generation stations.

Technicians often use formulas such as Watt’s Law to determine power consumption and verify performance. Understanding what ammeters measure also plays a role in monitoring electrical current flowing through generator systems.
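For example, a hypothetical spot check with Watt's Law (P = V × I; single-phase values assumed purely for illustration):

    # Watt's Law: power = voltage * current
    # (single-phase, unity power factor assumed for simplicity)
    v_line = 480.0         # terminal voltage, volts (assumed)
    i_line = 120.0         # measured current, amperes (assumed)

    p_watts = v_line * i_line
    print(f"Output: {p_watts / 1000:.1f} kW")   # 57.6 kW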

Related Articles

 


What is a Voltage Regulator?

What is a voltage regulator? A control circuit that stabilizes DC output from AC/DC power supplies using feedback, reference, and PWM; includes linear LDOs and switching buck/boost converters, improving line/load regulation, ripple suppression, and efficiency.

 

What Is a Voltage Regulator?

It keeps voltage steady despite load changes, using linear or switching control to cut ripple and protect circuits.

✅ Maintains setpoint via reference, error amplifier, feedback loop

✅ Linear LDOs offer low noise; dropout defined by headroom

✅ Switching buck/boost provide high efficiency; EMI needs filtering

 

What is a voltage regulator, and how does it work?

A voltage regulator is a component of the power supply unit that maintains a constant voltage supply through all operational conditions. Voltage regulators can regulate both AC and DC voltages, ensuring a steady, constant voltage supply. The output voltage is usually lower than the input voltage. The regulator compares the output voltage to a reference voltage and uses the difference to adjust the output voltage. An external voltage source or a circuit within the regulator typically sets the reference voltage. The regulator monitors the output voltage and adjusts it to maintain the reference voltage, which ensures a constant output voltage despite fluctuations in the input voltage or load conditions. For a succinct refresher on fundamentals, review what voltage is and how it is quantified in electrical systems.


 


Why is voltage regulation important in electronic circuits?

Voltage regulation is essential in electronic circuits because all electronic devices are designed to run at predetermined power ratings, including voltage and current. Therefore, the voltage supply should ideally be constant and steady for the device's proper functioning. Any variation in the voltage supply can lead to device malfunction or even failure. Voltage regulation ensures proper device operation and prevents damage due to voltage fluctuations. Design targets often align with a system's nominal voltage to ensure interoperability and safety margins.


What are the different types of voltage regulators?

Voltage regulators can be classified by physical design, active components, and working principle. Linear and switching regulators are the most common classes of active voltage regulators, which use amplifying components such as transistors or op-amps.

Linear regulators use amplifying components like transistors or op-amps to regulate the output voltage. They are simple and reliable but less efficient as they waste excess power as heat. Linear regulators are suitable for low-power applications where efficiency is not a major concern. In many loads the effective behavior of a resistor network shapes the current draw and thermal budget.

Switching regulators, on the other hand, use inductors and capacitors to store and transfer energy, making them more efficient than linear regulators. They are commonly used in battery-powered devices as they consume less power. Switching regulators are more complex than linear regulators and require careful design and layout.

They can also be classified based on their physical design. Voltage regulators used in low-voltage electronic devices are usually integrated circuits. Power distribution centers providing AC power to residential and industrial consumers use more sophisticated and mechanically large voltage regulators that maintain a rated voltage regardless of consumption demands across the area. For context, consumer gadgets often operate within defined low-voltage categories that influence package choice and safety standards.


Can a voltage regulator be used for both AC and DC power sources?

Yes, they can be used for both AC and DC power sources. AC voltage regulators are used in power distribution centers to maintain a constant voltage supply to consumers. DC voltage regulators are used in electronic devices that run on DC power sources, such as batteries or DC power supplies. When selecting between sources, it helps to understand the difference between AC and DC and how each impacts regulation strategy.


What is the difference between a voltage regulator and a voltage stabilizer?

Voltage regulators and voltage stabilizers are similar in function: both hold the output voltage steady. The main difference lies in their working principle. A regulator maintains a constant output by continuously adjusting the output against a reference voltage, whereas a voltage stabilizer uses a transformer together with voltage regulator ICs to stabilize the output.


How do you choose the right one for a specific application?

When choosing one for a specific application, several factors should be considered, including the input voltage range, output voltage range, output current requirements, efficiency, and operating temperature range. During prototyping, verify rails with a calibrated voltmeter to confirm stability under representative loads.

The input voltage range refers to the maximum and minimum input voltages the regulator can handle. The output voltage range is the range of output voltages the regulator can provide. The output current requirement refers to the amount of current the regulator must supply to the load. Efficiency is an essential factor, as it determines how much power is wasted as heat; higher-efficiency regulators consume less power and generate less heat, which is especially important in battery-powered devices. The operating temperature range also matters, as some regulators can operate only within certain limits.
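To make the efficiency trade-off concrete, here is a hedged sketch of the usual first-order estimate for a linear regulator, where everything above the output voltage is dissipated as heat (all values assumed):

    # First-order linear-regulator estimate: P_loss = (Vin - Vout) * Iload
    v_in = 12.0            # input voltage, volts (assumed)
    v_out = 5.0            # regulated output, volts (assumed)
    i_load = 0.5           # load current, amperes (assumed)

    p_out = v_out * i_load
    p_loss = (v_in - v_out) * i_load          # dissipated as heat
    efficiency = p_out / (p_out + p_loss)
    print(f"Output {p_out:.2f} W, heat {p_loss:.2f} W, "
          f"efficiency {efficiency:.0%}")     # about 42%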

It is crucial to select the appropriate type of regulator for the application. For example, linear regulators are suitable for low-power applications where efficiency is not a major concern while switching regulators are more appropriate for high-power applications that require higher efficiency.

There are various types, including adjustable, boost (step-up), buck-boost, and constant-output regulators. Adjustable regulators allow the user to set the output voltage as needed, making them versatile across applications. Boost and buck-boost regulators can raise (or raise and lower) the output voltage relative to the input, making them useful where the input voltage is lower than the required output voltage. Constant-output regulators maintain a fixed output voltage despite changes in input voltage or load conditions.

In electronic circuits, voltage fluctuations and ripple voltage can be problematic. Voltage fluctuations refer to rapid changes in the voltage level, while ripple voltage refers to the residual AC voltage that remains after rectification. Voltage regulators can help minimize voltage fluctuations and ripple voltage in electronic circuits, ensuring proper device operation. After rectification, understanding what a rectifier does helps explain the origin of ripple and filtering needs.

 

Related Articles


What is a Voltmeter?

What is a voltmeter? A voltmeter is an electrical measuring instrument used to determine voltage across circuit points. Common in electronics, engineering, and power systems, it ensures accuracy, safety, and efficiency when monitoring current and diagnosing electrical performance.

 

What is a Voltmeter?

A voltmeter provides a way to accurately measure voltage, the difference in electric potential between two points in a circuit, without changing the voltage in that circuit. It is also the instrument used for measuring voltage drop.

✅ Ensures accurate voltage measurement for safety and performance

✅ Used in electrical engineering, electronics, and power systems

✅ Helps diagnose faults and maintain efficient operation

Electrical current consists of a flow of charge carriers. Voltage, also known as electromotive force (EMF) or potential difference, manifests as "electrical pressure" that enables current to flow. Given an electric circuit under test with a constant resistance, the current through the circuit varies directly in proportion to the voltage across the circuit. A voltmeter measures potential difference, which directly relates to Ohm’s Law, the fundamental equation connecting voltage, current, and resistance in circuits.

A voltmeter can take many forms, from the classic analog voltmeter with a moving needle to modern instruments like the digital voltmeter (DVM) or the versatile digital multimeter. These tools are essential for measuring electrical values in electronic devices, enabling technicians to measure voltage, current, and resistance with precision and accuracy. While analog units provide quick visual feedback, digital versions deliver more precise measurements across wider voltage ranges, making them indispensable for troubleshooting and maintaining today’s complex electrical systems.

A voltmeter can be tailored to have various full-scale ranges by switching different values of resistance in series with the microammeter, as shown in Fig. 3-6. A voltmeter exhibits high internal resistance because the resistors have large ohmic values. The greater the supply voltage, the larger the internal resistance of the voltmeter because the necessary series resistance increases as the voltage increases. To understand how a voltmeter works, it helps to first review basic electricity, as voltage, current, and resistance form the foundation of all electrical measurements.

 


 

Fig 3-6. A simple circuit using a microammeter (μA) to measure DC voltage.
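The series ("multiplier") resistance in such a circuit follows directly from Ohm's Law: R_series = V_range / I_full_scale - R_meter. A minimal sketch with assumed meter constants shows why the internal resistance grows with the range:

    # Series multiplier resistor for a voltmeter built on a microammeter
    i_full_scale = 50e-6       # 50 uA movement (assumed)
    r_meter = 2000.0           # coil resistance, ohms (assumed)

    for v_range in (1, 10, 100):
        r_series = v_range / i_full_scale - r_meter
        print(f"{v_range:>3} V range -> series R = {r_series:,.0f} ohms")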

 

A voltmeter, whether digital or analog, should have high internal resistance: the higher, the better. You don't want the meter to draw much current from the power source; ideally, it would draw none at all. The power-supply current should go, as much as possible, toward operating whatever circuit or system you are testing, not toward driving a meter. A voltmeter is commonly used to measure voltage drop across conductors or devices, helping electricians ensure circuits operate efficiently and safely. For quick calculations, a voltage drop calculator provides accurate estimates of conductor losses based on length, size, and current, and understanding the voltage drop formula allows engineers and technicians to apply theoretical principles when designing or troubleshooting electrical systems.

Also, you might not want to keep the voltmeter constantly connected in parallel in the circuit; you may need it for testing various circuits, and you don't want a circuit's behavior to change the moment you connect or disconnect the meter. The less current a voltmeter draws, the less it affects anything operating from the power supply. This is why engineers rely on voltmeters in power system analysis, where accurate voltage readings are crucial for safety, reliability, and optimal performance.

Alternative types of voltmeters use electrostatic deflection, rather than electromagnetic deflection, to produce their readings. Remember that electric fields produce forces, just as magnetic fields do; a pair of electrically charged plates attracts or repels accordingly. An electrostatic voltmeter utilizes the attractive force between two plates carrying opposite charges or a large potential difference. Figure 3-7 portrays the functional mechanics of an electrostatic meter; it constitutes, in effect, a sensitive, calibrated electroscope. Such a meter draws essentially no current from the power supply, since nothing but air exists between the plates, and air is a nearly perfect electrical insulator. A properly designed electrostatic meter can measure both AC and DC voltage. However, the construction tends to be fragile, and mechanical vibration can influence the reading.

 

 

Fig 3-7. Functional drawing of an electrostatic voltmeter movement.

 

It's always good when a voltmeter has a high internal resistance. The reason for this is that you don't want the voltmeter to draw a significant amount of current from the power source. This current should go, as much as possible, towards working whatever circuit is hooked up to the supply, and not just into getting a reading of the voltage. Additionally, you may not want or need to have the voltmeter constantly connected in the circuit; instead, you might need it for testing various circuits. You don't want the behavior of the circuit to be affected the instant you connect the voltmeter to the supply. The less current a voltmeter draws, the less it will affect the behavior of anything that is working from the power supply.

If you connect an ammeter directly across a source of voltage, such as a battery, the meter needle will deflect. In fact, a milliammeter needle will probably be "pinned," and a microammeter might well be wrecked by the force of the needle striking the pin at the top of the scale. For this reason, you should never connect milliammeters or microammeters directly across voltage sources. An ammeter with a range of, say, 0-10 A may not deflect to full scale if placed across a battery, but it's still a bad idea, as it will rapidly drain the battery; some batteries, such as automotive lead-acid cells, can explode under these conditions. This is because all ammeters have low internal resistance. They are designed that way deliberately: they are meant to be connected in series with other parts of a circuit, not right across the power supply. Because voltage is inseparable from current, learning what current electricity is provides deeper insight into why voltmeters are vital diagnostic tools.

But if you place a large resistor in series with an ammeter and then connect the pair across a battery or other power supply, you no longer have a short circuit. The ammeter will give an indication that is directly proportional to the voltage of the supply. The smaller the full-scale reading of the ammeter, the larger the resistance needed to get a meaningful indication. Using a microammeter and a very large resistor in series, a voltmeter can be devised that draws only a small current from the source.

So, what is a voltmeter? In summary, a voltmeter is a fundamental instrument for electrical work, allowing professionals and students to accurately measure voltage and understand circuit behavior. Whether using an analog or digital design, voltmeters and multimeters provide precise insights that support safety, efficiency, and reliable performance in electrical systems.

Related Articles

 


What do Ammeters Measure?

An ammeter measures electric current in a circuit, displayed in amperes (A). Connected in series with low internal resistance to reduce burden voltage, it ensures accurate readings for testing, fault detection, and diagnostics.

 

What do Ammeters Measure?

Ammeters are instruments that measure the flow of electric current in a circuit.

✅ Measure electric current in amperes, connected in series with low internal resistance to minimize burden voltage.

✅ Available in analog, digital, clamp, and current transformer designs.

✅ Used for testing, fault detection, continuity checks, and diagnostics.

Electric current is measured in amperes, hence the name "ammeter." (The term is sometimes written incorrectly as "ampmeter.") Understanding how an ammeter works is easier when you first explore the basics of electricity fundamentals, including how voltage, current, and resistance interact in a circuit.

An ammeter measures electric current in a circuit, expressed in amperes (A). It must be connected in series with the load so that all the current flows through it, and is designed with low internal resistance to minimize burden voltage, thereby ensuring accurate readings without significantly affecting the circuit’s performance. The measurement unit for an ammeter is the ampere, explained in detail on our what is an ampere page, which also covers its relationship to other electrical units.

Ammeters (like ohmmeters and voltmeters) are built around a sensitive current detector, such as the moving coil of a galvanometer. To measure the current flowing through a circuit element, an ammeter is placed in series with it. The ammeter is designed to offer very low resistance so that it does not appreciably change the circuit being measured. To achieve this, a small resistor is placed in parallel with the galvanometer to shunt most of the current around it; its value is chosen so that, at the design current, the meter deflects to its full-scale reading. A galvanometer's full-scale current is very small: on the order of milliamperes. To see how ammeters fit into broader measurement tools, check out our guides on what is a voltmeter and what is a multimeter, which measure multiple electrical properties.

Ammeters may be analog or digital. A digital ammeter uses an analog-to-digital converter (ADC) to measure the voltage across a shunt resistor; a microcomputer reads the ADC and calculates the current through the resistor for display.

 

How an Ammeter Works

An ammeter works by being placed in series with the circuit so that all the current flows through it. Inside, a shunt resistor with very low internal resistance creates a small, measurable voltage drop proportional to the current. In analog designs, this current is partly diverted around a sensitive moving-coil mechanism, which displays the reading on a scale. In digital designs, the voltage drop across the shunt is measured by an analog-to-digital converter (ADC) and calculated into an amperes value, ensuring accurate measurement without significantly disturbing the circuit’s performance. Accurate current measurement also depends on understanding what is electrical resistance and how it affects current flow, especially in low-resistance ammeter designs.

 

Types and Mechanisms

Analog ammeter – Includes moving-coil (D'Arsonval) and moving-iron types, which use magnetic deflection to display current on a scale. These designs are valued for their simplicity, durability, and ability to provide continuous current readings.

Digital ammeter – Uses a shunt resistor to create a small voltage drop proportional to the current. This voltage is measured by an analog-to-digital converter (ADC) and displayed as a precise digital value. Digital ammeters often feature higher accuracy, wider measurement ranges, and additional functions such as data logging.

Clamp meter – Measures current without breaking the circuit by detecting the magnetic field around a conductor. This method is ideal for quick testing, especially in high-current applications or where live connections cannot be interrupted.

Current transformer (CT) ammeter – Designed for high-current AC systems, this type uses a transformer to scale down large primary currents into a safe, measurable secondary current for the meter.

 

Shunts and Operation

A shunt resistor is a precision, low-resistance component used in many ammeters. In analog designs, it is placed in parallel with the meter movement, diverting most of the current to protect the instrument. In certain digital designs, it is placed in series with the circuit. By measuring the voltage drop across the shunt and applying Ohm’s law, the meter accurately calculates the current. This approach allows for measurement of very large currents without damaging the meter and helps maintain measurement stability.
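Because the shunt and the meter movement share the same voltage drop, the shunt value follows directly from Ohm's Law. Here is a minimal sketch with assumed movement specifications:

    # Shunt sizing for an analog movement: the shunt carries I_total - I_meter
    # at the same voltage drop as the movement itself.
    i_meter = 1e-3             # 1 mA full-scale movement (assumed)
    r_meter = 50.0             # movement resistance, ohms (assumed)
    i_total = 10.0             # desired full-scale circuit current, amperes

    v_drop = i_meter * r_meter                 # 50 mV across the movement
    r_shunt = v_drop / (i_total - i_meter)     # Ohm's law on the shunt path
    print(f"Shunt: {r_shunt * 1000:.3f} milliohms for {i_total} A full scale")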

 

Applications and Value

Ammeters are essential tools in electrical testing, short-circuit detection, continuity testing, and system diagnostics. They help identify overloads, open circuits, and unstable current conditions that may indicate equipment faults or inefficiencies.

In industrial, commercial, and residential settings, ammeters are used for equipment maintenance, troubleshooting, and performance monitoring. Specialized variants such as milliammeters and microammeters are designed for extremely low current measurements, while integrating ammeters track current over time to determine total electrical charge delivered to a device or system. For historical context on the development of measuring instruments, visit our history of electricity page to learn how electrical science evolved over time.

 

Practical Applications of Ammeters

Ammeters are used in a wide range of electrical and electronic work:

  • Automotive diagnostics – Measuring current draw from the battery to detect parasitic drains, starter motor issues, and charging system faults.

  • Solar panel and battery monitoring – Tracking current output from photovoltaic arrays and the charging/discharging rates of storage batteries to optimize system efficiency.

  • Industrial motor maintenance – Monitoring motor current to identify overload conditions, detect bearing wear, or confirm correct load operation.

  • Household appliance servicing – Checking current draw to troubleshoot faulty components or ensure devices operate within safe limits.

  • Power distribution systems – Ensuring current levels remain within capacity for cables, fuses, and protective devices.

 

Related Articles

 


What is Electricity?

Electricity is the flow of electric charge, usually through a conductor like wire. It powers lights, appliances, and machines by converting energy into motion, heat, or light. Electricity can be generated from sources such as fossil fuels, wind, solar, or water.

 

What is electricity?

Electricity is a fundamental form of energy created by the movement of electrons.

✅ Powers homes, industries, and electronic devices

✅ Flows through circuits as an electric current

✅ Generated from renewable and non-renewable sources

The power we use is a secondary energy source because it is produced by converting primary energy sources such as coal, natural gas, nuclear, solar, and wind energy into electrical power. It is also referred to as an energy carrier, meaning it can be converted into other forms of energy, such as mechanical or thermal energy.

Primary energy sources are either renewable or nonrenewable, but electricity itself is neither: as a secondary energy source, it simply carries energy from one form to where it is needed.

To understand why electrons move in the first place, start with voltage, the electrical “pressure” that pushes charge through every circuit.

 

Electricity Has Changed Everyday Life

Although most people rarely think about electricity, it has profoundly changed how we live. It is as essential as air or water, yet we tend to take it for granted—until it’s gone. Electricity powers heating and cooling systems, appliances, communications, entertainment, and modern conveniences that past generations never imagined.

Before widespread electrification began just over a century ago, homes were lit with candles or oil lamps, food was cooled with ice blocks, and heating was provided by wood- or coal-burning stoves.

The steady stream of electrons we use daily is explored in our primer on current electricity.

 

Discovering Electricity: From Curiosity to Power Grid

Scientists and inventors began unlocking the secrets of electricity as early as the 1600s. Over the next few centuries, their discoveries built the foundation for the electric age.

Benjamin Franklin demonstrated that lightning is a form of electricity.

Thomas Edison invented the first commercially viable incandescent light bulb.

Nikola Tesla pioneered the use of alternating current (AC), which enabled the efficient transmission of electricity over long distances. He also experimented with wireless electricity.

Curious why Tesla’s ideas beat Edison’s? Our article on alternating current breaks down the advantages of alternating current (AC) over direct current (DC).

Before Tesla’s innovations, arc lighting used direct current (DC) but was limited to outdoor and short-range applications. His work made it possible for electricity to be transmitted to homes and factories, revolutionizing lighting and industry.

 

Understanding Electric Charge and Current

Electricity is the movement of electrically charged particles, typically electrons. These particles can move either statically, as in a buildup of charge, or dynamically, as in a flowing current.

All matter is made of atoms, and each atom consists of a nucleus with positively charged protons and neutral neutrons, surrounded by negatively charged electrons. Usually, the number of protons and electrons is balanced. But when that balance is disturbed—when electrons are gained or lost—an electric current is formed as those electrons move.

For a step-by-step walkthrough of everything from circuits to safety, visit how electricity works.

 

Electricity as a Secondary Energy Source

Electricity doesn’t occur naturally in a usable form. It must be generated by converting other types of energy. In fact, electricity is a manufactured product. That’s why electricity is called a secondary energy source—it carries energy from its original form to where we need it.

We generate electricity by transforming mechanical energy—such as spinning a turbine—into electrical energy. This conversion happens at power plants that use a variety of fuels and methods:

  • Fossil fuels (coal, oil, natural gas)

  • Nuclear energy

  • Renewable sources like wind, solar, and hydroelectric

If turbines, magnets, and power plants intrigue you, see how electricity is generated for a deeper dive.

 

How Electricity Was Brought Into Homes

Before electricity generation began on a mass scale, cities often developed near waterfalls, where water wheels powered mills and machines. The leap from mechanical energy to electrical energy enabled power to travel not just across a town, but across entire countries.

Beginning with Franklin’s experiments and followed by Edison’s breakthrough with indoor electric light, the practical uses of electricity expanded rapidly. Tesla’s AC power system made widespread electric distribution feasible, bringing light, heat, and industry to homes and cities worldwide.

 

How Transformers Changed Everything

To transmit electricity efficiently over long distances, George Westinghouse's team, led by engineer William Stanley, developed a practical transformer. This device adjusts the voltage of electrical power to match its purpose: high for long-range travel, low for safe use in homes.

Transformers made it possible to supply electricity to homes and businesses far from power plants. The electric grid became a coordinated system of generation, transmission, distribution, and regulation.

Even today, most of us rarely consider the complexity behind our wall sockets. But behind every outlet lies a vast infrastructure keeping electricity flowing safely and reliably.

 

How Is Electricity Generated?

Electric generators convert mechanical energy into electricity using the principles of magnetism. When a conductor—such as a coil of wire—moves through a magnetic field, an electric current is induced.

In large power stations, turbines spin magnets inside massive generators. These turbines are driven by steam, water, or wind. The rotating magnet induces small currents in the coils of wire, which combine into a single continuous flow of electric power.

Discover the principle that turns motion into power in electromagnetic induction, the heart of every modern generator.

 

Measuring Electricity

Electricity is measured in precise units. The amount of power being used or generated is expressed in watts (W), named after inventor James Watt.

  • One watt is a small unit of power; 1,000 watts equal one kilowatt (kW).

  • Energy use over time is measured in kilowatt-hours (kWh).

  • A 100-watt bulb burning for 10 hours uses 1 kWh of electricity.

These units are what you see on your electric bill. They represent how much electricity you’ve consumed over time—and how much you’ll pay.
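That bill arithmetic fits in a few lines. A minimal sketch (the electricity rate is assumed, not from the article):

    # Energy = power (kW) x time (hours); cost = energy x rate
    power_w = 100.0            # the 100-watt bulb from the example above
    hours = 10.0
    rate_per_kwh = 0.15        # $/kWh, assumed for illustration

    energy_kwh = (power_w / 1000) * hours      # 1.0 kWh
    print(f"{energy_kwh:.1f} kWh -> ${energy_kwh * rate_per_kwh:.2f}")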

When it’s time to decode your energy bill, the chart in electrical units makes watts, volts, and amps clear.

 
