Does the amplitude of a wave depend on distance from the source?

The energy carried by a wave depends on the wave’s amplitude and its velocity. Waves can be put into two categories: those that spread, like water waves on a pond, sound waves, or electromagnetic waves; and those that are confined to a narrow region, like waves on a rope or electrical oscillations on a wire. A water wave spreads in two dimensions over the surface of a pond, lake, wide river, or ocean. As it spreads, its energy is distributed around an ever-larger circle, so the energy delivered to a particular location decreases in proportion to the distance from the source; because the energy of a wave is proportional to the square of its amplitude, the amplitude decreases as the square root of the distance. Sound and electromagnetic waves usually spread in three dimensions. Their energy is spread over an ever-larger sphere, so the intensity (energy per unit area) decreases as the square of the distance from the source, and the amplitude decreases in proportion to the distance.
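The inverse-square behavior of a spreading wave can be sketched numerically. The function below is a minimal illustration, assuming a small source radiating power `P_source` uniformly in all directions (the function name and values are made up for this example):

```python
import math

def spherical_wave(P_source, r):
    """Intensity and relative amplitude of a wave spreading in three
    dimensions, such as sound from a small source.
    P_source: emitted power in watts (illustrative value)
    r: distance from the source in meters"""
    intensity = P_source / (4 * math.pi * r**2)  # energy spread over a sphere
    amplitude = math.sqrt(intensity)             # amplitude goes as the square root of intensity
    return intensity, amplitude

# Doubling the distance quarters the intensity and halves the amplitude.
I1, A1 = spherical_wave(1.0, 10.0)
I2, A2 = spherical_wave(1.0, 20.0)
print(round(I1 / I2, 6))  # 4.0
print(round(A1 / A2, 6))  # 2.0
```

The same sketch applied to a water wave would divide by the circumference of a circle, `2 * math.pi * r`, instead of the area of a sphere, giving intensity falling as 1/r and amplitude as 1/√r.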

On a rope or wire the wave doesn’t spread, but a different mechanism often reduces the amplitude. In a rope there is friction between the fibers, which converts some of the wave’s kinetic energy to thermal energy. If a signal is sent through a wire as an oscillating voltage, the resistance of the wire converts some of the electrical energy to thermal energy, reducing the voltage and thus the amplitude of the wave. This loss can be offset by placing amplifiers along the wire, which put energy back into the wave and restore its amplitude.
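The loss-and-amplify balance along a wire can be sketched as bookkeeping in decibels. The numbers below are hypothetical (real cables and amplifiers vary widely); the point is only that periodic amplification can cancel the steady resistive loss:

```python
def signal_level_db(level_in_db, length_km, loss_db_per_km, amp_spacing_km, amp_gain_db):
    """Track a signal's level along a lossy line with amplifiers placed
    at regular intervals (all parameter values are illustrative)."""
    level = level_in_db
    for km in range(1, length_km + 1):
        level -= loss_db_per_km          # resistance turns signal energy into heat
        if km % amp_spacing_km == 0:
            level += amp_gain_db         # an amplifier puts energy back into the wave
    return level

# A 100 km line losing 2 dB/km, with an amplifier every 50 km adding 100 dB,
# delivers the signal at its original level:
print(signal_level_db(0.0, 100, 2.0, 50, 100.0))  # 0.0
```

Without the amplifiers the same line would lose 200 dB, reducing the signal's power by a factor of 10²⁰.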
