Astronomers learn about an astronomical source by measuring the strength of its radiation as a function of direction on the sky (by mapping or imaging) and frequency (spectroscopy), plus other quantities (time, polarization) that we ignore for now. We need precise and quantitative definitions to describe the strength of radiation and how it varies with distance between the source and the observer. The concepts of brightness and flux density are deceptively simple, but they regularly trip up experienced astronomers. It is very important to understand them clearly because they are so fundamental.

We start with the simplest possible case of radiation traveling from a source through empty space (so there is no absorption, scattering, or emission along the way) to an observer. In the ray-optics approximation, radiated energy flows in straight lines. This approximation is valid only for systems much larger than the wavelength $\lambda$ of the radiation, a criterion easily met by astronomical sources. You may find it helpful to visualize electromagnetic radiation as a stream of light particles (photons), essentially bullets that travel in straight lines at the speed of light. To motivate the following mathematical definitions, imagine you are looking at the Sun. The "brightness" of the Sun appears to be about the same over most of the Sun's surface, which looks like a nearly uniform disk even though it is a sphere. This means, for example, that a photograph of the Sun would expose the film equally across the Sun's disk. It also turns out that the exposure would not change if photographs were made at different distances from the Sun, from points near Mars, the Earth, and Venus, for example.

The Sun in three imaginary photos taken from a long distance (left), medium distance (center), and short distance (right) would have a constant brightness but increasing angular size.

Only the angular size of the Sun changes with the distance between the Sun and the observer. The photo taken from near Venus would not be overexposed, and the one from near Mars would not be underexposed. The number of photons falling on the film per unit area per unit time per unit solid angle does not depend on the distance between the source and the observer. The total number of photons falling on the film per unit area per unit time (or the total energy absorbed per unit area per unit time) does decrease with increasing distance. Thus we distinguish between the *brightness* of the Sun, which does not depend on distance, and the apparent *flux*, which does.
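A short numerical sketch makes the distinction concrete (the solar radius and planetary distances below are approximate values assumed for illustration). The total flux falls as $1/d^2$, but the solid angle subtended by the Sun's disk also falls as $1/d^2$, so their ratio, the brightness, is the same at every distance:

```python
import math

R_sun = 6.96e8   # solar radius in meters (approximate)
AU = 1.496e11    # astronomical unit in meters

# Illustrative observer distances: near Venus, Earth, and Mars
for name, d in [("Venus", 0.72 * AU), ("Earth", 1.00 * AU), ("Mars", 1.52 * AU)]:
    # Solid angle of the Sun's disk (small-angle approximation), in steradians
    omega = math.pi * (R_sun / d) ** 2
    # Total flux: photons per unit area per unit time, in arbitrary units ∝ 1/d^2
    flux = 1.0 / d ** 2
    # Brightness: photons per unit area per unit time per unit solid angle
    brightness = flux / omega
    print(f"{name}: solid angle = {omega:.3e} sr, brightness = {brightness:.3e}")
```

Running this shows the solid angle shrinking by a factor of $(1.52/0.72)^2 \approx 4.5$ between Venus and Mars while the brightness stays exactly constant, which is the content of the thought experiment above.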

Note also that the number of photons per unit area hitting the film is proportional to $\cos\theta$ if the normal to the film is tilted by an angle $\theta$ from the ray direction. This is just the same projection effect that reduces the amount of water collected by a tilted rain gauge by $\cos\theta$. Likewise at the source, such as the spherical Sun, the projected area perpendicular to the line of sight scales as $\cos\theta$.
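The rain-gauge analogy can be checked with a few lines of code (a minimal sketch; the function name is chosen for illustration):

```python
import math

def collected_fraction(theta_deg):
    """Fraction of the incident flux intercepted by a surface whose
    normal is tilted by theta_deg degrees from the ray direction."""
    return math.cos(math.radians(theta_deg))

for theta in (0, 30, 60, 90):
    print(f"tilt {theta:2d} deg -> fraction {collected_fraction(theta):.3f}")
```

A face-on surface ($\theta = 0$) collects the full flux, a surface tilted by $60^\circ$ collects half, and an edge-on surface ($\theta = 90^\circ$) collects nothing.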

Using the ray-optics approximation, we can define the specific intensity (sometimes called spectral intensity, spectral brightness, spectral radiance, or, loosely, just brightness) $I_\nu$ in terms of

- $d\sigma$ = infinitesimal surface area (e.g., of a detector)

- $\theta$ = the angle between a "ray" of radiation and the normal to the surface $d\sigma$

- $d\Omega$ = infinitesimal solid angle measured at the observer's location
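As a sketch of where these definitions lead (anticipating the full statement, and writing $d\nu$ for an infinitesimal frequency interval, which is not in the list above), the infinitesimal power $dP$ received within bandwidth $d\nu$ from solid angle $d\Omega$ by the tilted surface $d\sigma$ is

$$dP = I_\nu \cos\theta \, d\sigma \, d\Omega \, d\nu,$$

so $I_\nu$ has dimensions of power per unit area per unit frequency per unit solid angle (e.g., $\mathrm{W\,m^{-2}\,Hz^{-1}\,sr^{-1}}$). The $\cos\theta$ factor is exactly the projection effect described above.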