• Electromagnetics I
• Ch 4
• Loc 4.7

# Divergence Theorem

The Divergence Theorem relates an integral over a volume to an integral over the surface bounding that volume. This is useful in a number of situations that arise in electromagnetic analysis. In this section, we derive this theorem.

Consider a vector field $\mathbf{A}$ representing a flux density, such as the electric flux density $\mathbf{D}$ or the magnetic flux density $\mathbf{B}$. The divergence of $\mathbf{A}$ is

$$\nabla \cdot \mathbf{A} = f \tag{4.7.1}$$

where $f$ is the flux per unit volume through an infinitesimally-small closed surface surrounding the point at which the divergence is evaluated. Since $f$ is flux per unit volume, we can obtain the flux through any larger contiguous volume $\mathcal{V}$ by integrating over $\mathcal{V}$; i.e.,

$$\text{flux through } \mathcal{V} = \int_{\mathcal{V}} f \, dv \tag{4.7.2}$$
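To make this concrete, here is a short worked example (added for illustration; it is not part of the original derivation). Consider the field $\mathbf{A} = \hat{\mathbf{x}}x + \hat{\mathbf{y}}y + \hat{\mathbf{z}}z$. Its divergence is

$$f = \nabla \cdot \mathbf{A} = \frac{\partial}{\partial x}x + \frac{\partial}{\partial y}y + \frac{\partial}{\partial z}z = 3$$

so the flux through any volume $\mathcal{V}$ is simply three times the volume of $\mathcal{V}$, regardless of its shape.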

In the Cartesian system, $\mathcal{V}$ can be interpreted as a three-dimensional grid of infinitesimally-small cubes having side lengths $dx$, $dy$, and $dz$, respectively. Note that the flux out of any face of one of these cubes is equal to the flux into the cube that is adjacent through that face. That is, the portion of the total flux that flows between cubes cancels when added together. In fact, the only fluxes which do not cancel in the integration over $\mathcal{V}$ are those corresponding to faces which lie on the bounding surface $\mathcal{S}$, since the integration stops there. Stating this mathematically:

$$\int_{\mathcal{V}} f \, dv = \oint_{\mathcal{S}} \mathbf{A} \cdot d\mathbf{s} \tag{4.7.3}$$

Thus, we have converted a volume integral into a surface integral.
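The cancellation argument above can be checked numerically. The sketch below (an illustration added here, not from the original text; the field is an arbitrary hypothetical choice) partitions the unit cube into $n^3$ small cubes, computes the net outward flux of each small cube from face samples, and sums: every interior face is shared by two cubes with opposite outward orientations, so only faces on the bounding surface survive.

```python
import numpy as np

# Discrete version of the cancellation argument on the unit cube,
# partitioned into n^3 small cubes of side h.
n = 8
h = 1.0 / n
faces = np.linspace(0.0, 1.0, n + 1)    # face coordinates
cells = (np.arange(n) + 0.5) * h        # cell-center coordinates

# Hypothetical field A = (x^2, y*z, sin(x)); sample the face-normal
# component of A at each face center, times the face area h^2.
X, Y, Z = np.meshgrid(faces, cells, cells, indexing="ij")
Fx = (X**2) * h * h                      # flux through x-faces
X, Y, Z = np.meshgrid(cells, faces, cells, indexing="ij")
Fy = (Y * Z) * h * h                     # flux through y-faces
X, Y, Z = np.meshgrid(cells, cells, faces, indexing="ij")
Fz = np.sin(X) * h * h                   # flux through z-faces

# Net outward flux of each small cube: out the "+" face, in the "-" face.
per_cube = ((Fx[1:] - Fx[:-1])
            + (Fy[:, 1:] - Fy[:, :-1])
            + (Fz[:, :, 1:] - Fz[:, :, :-1]))
total = per_cube.sum()

# Flux through the outer boundary faces only.
boundary = (Fx[-1].sum() - Fx[0].sum()
            + Fy[:, -1].sum() - Fy[:, 0].sum()
            + Fz[:, :, -1].sum() - Fz[:, :, 0].sum())

print(total, boundary)   # equal: interior faces cancel exactly
```

The agreement is exact (to floating-point rounding), because the sum over cubes telescopes: each interior face contribution appears once with each sign.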

To obtain the Divergence Theorem, we return to Equation 4.7.1. Integrating both sides of that equation over $\mathcal{V}$, we obtain

$$\int_{\mathcal{V}} \left( \nabla \cdot \mathbf{A} \right) dv = \int_{\mathcal{V}} f \, dv \tag{4.7.4}$$

Now applying Equation 4.7.3 to the right hand side:

$$\int_{\mathcal{V}} \left( \nabla \cdot \mathbf{A} \right) dv = \oint_{\mathcal{S}} \mathbf{A} \cdot d\mathbf{s} \tag{4.7.5}$$

The Divergence Theorem (Equation 4.7.5) states that the integral of the divergence of a vector field over a volume is equal to the flux of that field through the surface bounding that volume.
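As a sanity check, the two sides of the theorem can be compared numerically for a specific field. The sketch below (added for illustration; the field $\mathbf{A} = \hat{\mathbf{x}}x^2 + \hat{\mathbf{y}}yz + \hat{\mathbf{z}}\sin x$ is a hypothetical choice, for which $\nabla \cdot \mathbf{A} = 2x + z$) evaluates both integrals over the unit cube by midpoint-rule quadrature.

```python
import numpy as np

# Check the Divergence Theorem for A = (x^2, y*z, sin(x)) over the
# unit cube [0,1]^3; div A = 2x + z, and both sides should equal 3/2.
n = 64
mid = (np.arange(n) + 0.5) / n           # midpoint quadrature nodes
dv, ds = 1.0 / n**3, 1.0 / n**2

# Left side: volume integral of div A.
X, Y, Z = np.meshgrid(mid, mid, mid, indexing="ij")
lhs = ((2 * X + Z) * dv).sum()

# Right side: outward flux, face by face.  On each face, A·n is the
# face-normal component with the sign of the outward normal.
U, V = np.meshgrid(mid, mid, indexing="ij")
rhs = 0.0
rhs += (1.0**2 * np.ones_like(U)).sum() * ds   # x = 1 face: A_x = 1
rhs -= (0.0**2 * np.ones_like(U)).sum() * ds   # x = 0 face: A_x = 0
rhs += (1.0 * V).sum() * ds                    # y = 1 face: A_y = z
rhs -= (0.0 * V).sum() * ds                    # y = 0 face: A_y = 0
rhs += np.sin(U).sum() * ds                    # z = 1 face: A_z = sin(x)
rhs -= np.sin(U).sum() * ds                    # z = 0 face: same, cancels

print(lhs, rhs)   # both approximately 1.5
```

Since the integrands here are linear in each coordinate, the midpoint rule reproduces the exact value $3/2$ on both sides up to rounding.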

The principal utility of the Divergence Theorem is to convert problems that are defined in terms of quantities known throughout a volume into problems that are defined in terms of quantities known over the bounding surface, and vice versa.