We have seen that deconvolution tries to select one answer from the many that are possible. `CLEAN' uses a procedure to select a plausible image from the feasible set. Some of `CLEAN''s problems arise just because it is procedural, so there is no simple equation describing the output image. This makes it difficult to analyze the errors (noise) in `CLEAN'. By contrast, the Maximum Entropy Method (MEM) is not procedural: the image selected is that which fits the data, to within the noise level, and also has maximum entropy. The use of the term entropy has led to some confusion about the justification for MEM. There is no consensus on this subject (e.g., Frieden 1972; Wernecke & D'Addario 1976; Gull & Daniell 1978; Jaynes 1982; Narayan & Nityananda 1984, 1986; Cornwell & Evans 1985). The authors' preferred justification defines the entropy as something which, when maximized, produces a positive image with a compressed range in pixel values. Image entropy thus defined is therefore not to be confused with a ``physical entropy'' (see Cornwell 1984a). The compression in pixel values forces the MEM image to be ``smooth'', and the positivity forces super-resolution on bright, isolated objects. There are many possible forms of this extended type of entropy (see, e.g., Narayan & Nityananda 1984), but one of the best for general-purpose use is:
\begin{displaymath}
H = - \sum_k I_k \ln \left( \frac{I_k}{M_k e} \right) ,
\end{displaymath}
where $M$ is a ``default'' image that incorporates a priori knowledge about the object. For example, a low resolution image of the object can be used to good effect as the default.
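As a concrete illustration, the entropy in the form $H = -\sum_k I_k \ln(I_k/M_k e)$ (the Cornwell & Evans 1985 form) can be evaluated directly; this is a minimal sketch, with the function name chosen here for illustration:

```python
import numpy as np

def entropy(image, default):
    """Image entropy H = -sum_k I_k ln(I_k / (M_k e)).

    Both `image` (I) and `default` (M) must be strictly positive
    arrays of the same shape; positivity is built into the definition,
    since the logarithm diverges for non-positive pixel values.
    """
    return -np.sum(image * np.log(image / (default * np.e)))
```

Note that, in the absence of any data constraint, $H$ is maximized by $I = M$, where it equals the total flux of the default image; any departure of the image from the default costs entropy, which is what drives the compression in pixel values.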
A requirement that each visibility point be fitted exactly is nearly always incompatible with the positivity of the MEM image. Consequently, the data are usually incorporated in a constraint that the fit, $\chi^2$, of the predicted visibility to that observed, be close to the expected value:
\begin{displaymath}
\chi^2 = \sum_k \frac{\left| V_{\rm obs}(u_k,v_k) - V_{\rm pred}(u_k,v_k) \right|^2}{\sigma_k^2} ,
\end{displaymath}
where $\sigma_k$ is the noise level on the $k$-th visibility.
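The misfit statistic and its expected value are easy to check numerically. A minimal sketch (function name chosen for illustration), assuming $\sigma$ is the per-component (real/imaginary) noise rms, in which case $N$ complex visibilities contribute an expected $\chi^2$ of about $2N$:

```python
import numpy as np

def chi2(v_pred, v_obs, sigma):
    """Chi-squared misfit between predicted and observed complex visibilities.

    sigma is the rms noise on each real/imaginary component, so for N
    complex visibilities the expected value of chi^2 is about 2N.
    """
    return np.sum(np.abs(v_pred - v_obs) ** 2 / sigma**2)
```

Constraining $\chi^2$ to this expected value, rather than to zero, is what leaves the maximization room to keep the image positive.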
Simply maximizing $H$ subject to the constraint that $\chi^2$ be equal to its expected value leads to an image which fits the long spacings much too well (better than $1\sigma$) and the zero and short spacings very poorly. The cause of this effect is that the entropy is insensitive to spatial information. It can be avoided (Cornwell & Evans 1985) by constraining the predicted zero-spacing flux density to be that provided by the user.
Algorithms for solving this maximization problem have been given by Wernecke & D'Addario (1976), by Cornwell & Evans (1985), and by Skilling & Bryan (1984). The Cornwell-Evans algorithm was coded in the NRAO's Astronomical Image Processing System (classic AIPS) as `VTESS'. This algorithm works well for many cases and its code is in the public domain. It is generally faster than `CLEAN' for large images, the break-even point being around one million pixels.
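The structure of such an algorithm can be conveyed by a toy one-dimensional example. The sketch below is not the Cornwell-Evans (VTESS) or Skilling-Bryan algorithm: it simply performs exponentiated gradient ascent on the Lagrangian $J = H - \alpha\chi^2$ with the multiplier $\alpha$ held fixed (a real solver would adjust $\alpha$, and the zero-spacing flux, to meet the target $\chi^2$). The multiplicative update guarantees the image stays positive, and all names and parameter values are illustrative:

```python
import numpy as np

def mem_deconvolve(dirty, beam, default, alpha=2.0, eta=0.05,
                   n_iter=300, sigma=1.0):
    """Toy 1-D MEM deconvolution by exponentiated gradient ascent.

    Maximizes J = H - alpha * chi^2, where
      H        = -sum_k I_k ln(I_k / (M_k e)),  so  dH/dI_k = -ln(I_k / M_k)
      chi^2    = sum_k (B*I - D)_k^2 / sigma^2  (B = beam, D = dirty image)
    The multiplicative update I <- I * exp(eta * grad J) keeps I > 0.
    """
    image = default.copy()
    for _ in range(n_iter):
        residual = np.convolve(image, beam, mode="same") - dirty
        # gradient of chi^2 is the residual correlated with the beam
        grad_chi2 = 2.0 * np.convolve(residual, beam[::-1], mode="same") / sigma**2
        grad_H = -np.log(image / default)
        image = image * np.exp(eta * (grad_H - alpha * grad_chi2))
    return image
```

At a stationary point the update exponent vanishes, giving $I_k = M_k \exp(-\alpha\,\partial\chi^2/\partial I_k)$: each pixel is the default modulated by the data misfit, which is the characteristic MEM fixed-point form.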
MEM is an extremely flexible approach to deconvolution that can readily handle heterogeneous data types; this has made it particularly powerful for mosaicing.
1996 November 4