What is an Ampere?

An ampere is the standard unit of electric current in the International System of Units (SI). It measures the flow of electric charge in a circuit, with one ampere equal to one coulomb of charge passing through a point per second.
The ampere (A) was formerly defined in terms of the electromagnetic force between two straight, parallel conductors carrying electric current.
✅ Measures electric current or flow of electric charge per second
✅ Defined as one coulomb of charge per second in a conductor
✅ Essential in circuit design, safety, and load calculations
Scientific Definition and Formula
The ampere is defined as one coulomb of charge passing a point per second (1 A = 1 C/s). In a circuit, the current that flows is related to voltage and resistance by Ohm's Law:

I = V / R

where:

- I is current in amperes
- V is voltage in volts
- R is resistance in ohms

Exploring Ohm's Law in more depth shows how voltage and resistance together determine how much current flows.
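As a minimal sketch of that relationship in code (the function name and example values are illustrative, not from the article):

```python
def current_from_ohms_law(voltage_volts: float, resistance_ohms: float) -> float:
    """Return current in amperes from Ohm's Law: I = V / R."""
    if resistance_ohms <= 0:
        raise ValueError("resistance must be a positive number of ohms")
    return voltage_volts / resistance_ohms

# Example: a 12 V supply across a 4-ohm resistor drives 3 A of current.
print(current_from_ohms_law(12.0, 4.0))  # 3.0
```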
Safety Considerations
Electric current levels and their effects on the human body:
- 1 mA: barely perceptible
- 5–10 mA: painful shock
- 50 mA: can cause breathing difficulty
- 100 mA: potentially fatal if it passes through the chest
Even small currents can be dangerous if they take the wrong path through the body, especially in wet conditions.
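As a rough sketch of how those thresholds could be encoded (the ranges mirror the list above and are approximate; the function name is hypothetical):

```python
def shock_effect(current_ma: float) -> str:
    """Map a current in milliamperes to the approximate effect listed above."""
    if current_ma < 1:
        return "below the threshold of perception"
    if current_ma < 5:
        return "barely perceptible"
    if current_ma < 50:
        return "painful shock"
    if current_ma < 100:
        return "possible breathing difficulty"
    return "potentially fatal if it passes through the chest"

print(shock_effect(10))   # painful shock
print(shock_effect(120))  # potentially fatal if it passes through the chest
```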
Applications of Amperes
- Power system design: selecting proper wire gauges and protective devices
- Circuit protection: fuses and circuit breakers are rated in amperes
- Electronics: current limits are vital in component design
- Battery ratings: indicate how much current a battery can safely deliver
An ammeter is essential for measuring current directly in amperes within a circuit.
Although an ammeter measures the flow of charge, which is coulombs per second, its scale is calibrated and marked in amperes. For most practical purposes, current is quoted in amperes rather than coulombs per second. Note the use of the prefixes micro and milli to represent very small currents and kilo and mega to represent very large ones. The article on the ampere explains why one coulomb per second is foundational to electrical theory, and exploring power factor reveals how reactive energy and real power interact in systems carrying large currents.
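A small sketch of how those SI prefixes scale the ampere (the prefix factors are standard SI; the dictionary and function names are hypothetical):

```python
# Standard SI prefix factors relative to one ampere.
PREFIX_FACTORS = {
    "µA": 1e-6,   # microampere
    "mA": 1e-3,   # milliampere
    "A": 1.0,     # ampere
    "kA": 1e3,    # kiloampere
    "MA": 1e6,    # megaampere
}

def to_amperes(value: float, unit: str) -> float:
    """Convert a current given with an SI prefix into plain amperes."""
    return value * PREFIX_FACTORS[unit]

print(to_amperes(50, "mA"))  # 0.05
print(to_amperes(2, "µA"))   # 2e-06
```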
A current of a few milliamperes will give you a startling shock. About 50 mA will jolt you severely, and 100 mA can cause death if it flows through your chest cavity.
An ordinary 100-watt light bulb draws a current of about 1 A. An electric iron draws approximately 10 A; an entire household normally uses between 10 A and 50 A, depending on the size of the house, the types of appliances it has, and also the time of day, week, or year. Learning about the watt helps readers see how power (watts) relates to current (amperes) and voltage.
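Those figures follow from the relation P = V × I, so I = P / V; a quick check, assuming a nominal 120 V household supply (the helper name and wattages are illustrative):

```python
def current_from_power(power_watts: float, voltage_volts: float) -> float:
    """Return current in amperes from power and voltage: I = P / V."""
    return power_watts / voltage_volts

# A 100 W bulb on a 120 V supply draws roughly 0.8 A, close to the 1 A cited above.
print(round(current_from_power(100, 120), 2))   # 0.83
# An electric iron of about 1200 W draws roughly 10 A.
print(round(current_from_power(1200, 120), 1))  # 10.0
```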
The amount of current that flows in an electrical circuit depends on both the voltage and the resistance. There are some circuits in which extremely large currents, say 1000 A, flow; this might happen through a metal bar placed directly at the output of a massive electric generator. The resistance is extremely low in this case, and the generator is capable of driving huge amounts of charge. In some semiconductor electronic devices, such as microcomputers, a few nanoamperes are often sufficient for many complex processes. Some electronic clocks draw so little current that their batteries last as long as they would if left on the shelf without being used at all. Reading about electricity safety shows why even small currents—measured in amperes—can pose serious hazards.
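To see why such a tiny drain barely shortens battery life, here is a back-of-the-envelope estimate (the capacity and drain figures are assumed for illustration, not taken from the article):

```python
# Rough battery-life estimate: capacity (mAh) divided by average drain (mA).
capacity_mah = 2000.0  # assumed capacity of a typical alkaline AA cell
drain_ma = 0.01        # assumed clock drain of about 10 microamperes

hours = capacity_mah / drain_ma
years = hours / (24 * 365)
print(f"Approximate run time: {years:.0f} years")  # ~23 years, so shelf life is the real limit
```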