# How Moments Work

Moments are mathematical expressions of the relationship between a physical quantity and its distance from a fixed reference point. A moment describes the turning effect that a quantity, typically a force, has about that point. Here’s how moments work:

A force can cause an object to rotate about a pivot, and a moment is a measure of this turning effect. The magnitude of a moment depends on both the size of the force and its perpendicular distance from the axis of rotation. For example, a force applied to a door handle 12 cm from the hinge produces a moment about the hinge, described by convention as clockwise or anticlockwise depending on the direction of the force. A moment arises whenever the force’s line of action does not pass through the pivot, because such a force cannot be counterbalanced by an equal and opposite force along the same line of action.
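The door-handle calculation can be sketched in a few lines of Python. The 20 N force is an assumed value for illustration; the 12 cm distance comes from the example above.

```python
def moment(force_n: float, distance_m: float) -> float:
    """Moment of a force about a pivot: force times perpendicular distance (N·m)."""
    return force_n * distance_m

# Assumed 20 N push on a handle 0.12 m (12 cm) from the hinge:
handle_moment = moment(20.0, 0.12)
print(handle_moment)  # approximately 2.4 N·m
```

Note that the same force applied closer to the hinge yields a smaller moment, which is why handles are placed far from the hinge.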

A moment also expresses the tendency of a force to set a body rotating about a point, and its magnitude is the product of the force and the perpendicular distance from the point to the force’s line of action. For example, a 10 N force applied 0.12 m from a pivot produces a 1.2 N·m moment about it. When several forces act, calculate the moment of each force separately, and note the sense (clockwise or anticlockwise) next to each moment under consideration.
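Keeping track of the sense of each moment can be done by signing the moments, a common convention (assumed here) being anticlockwise positive and clockwise negative. This sketch sums several signed moments about one point; the force values are illustrative.

```python
def net_moment(contributions):
    """Sum signed moments about a point.

    contributions: iterable of (force_n, distance_m, sense),
    where sense is +1 for anticlockwise and -1 for clockwise (assumed convention).
    """
    return sum(force * dist * sense for force, dist, sense in contributions)

# The 10 N force at 0.12 m from the text, taken clockwise, plus an
# assumed 5 N anticlockwise force at 0.3 m:
total = net_moment([(10.0, 0.12, -1), (5.0, 0.3, +1)])
print(total)  # net moment in N·m; positive means anticlockwise overall
```

A net moment of zero means the turning effects balance and the body is in rotational equilibrium.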

The mathematical concept of a moment goes back to Archimedes’ discovery of the principle of the lever. Archimedes observed that the turning effect of a force depends on its distance from the reference point: the moment of a force is the product of the applied force and that distance, and a lever balances when the moments on either side of the fulcrum are equal. The term “moment” has since become a standard technical term in both physics and mathematics.
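The lever principle can be turned into a small calculation: if a load F1 sits a distance d1 from the fulcrum, the balancing force at distance d2 follows from F1·d1 = F2·d2. The numbers below are assumed for illustration.

```python
def balancing_force(f1_n: float, d1_m: float, d2_m: float) -> float:
    """Archimedes' lever law: F1 * d1 = F2 * d2, solved for F2."""
    return f1_n * d1_m / d2_m

# Assumed load: 100 N placed 0.5 m from the fulcrum,
# balanced by a force applied 2.0 m away on the other side:
print(balancing_force(100.0, 0.5, 2.0))  # 25.0
```

The longer arm needs only a quarter of the force, which is exactly the mechanical advantage a lever provides.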