Algorithm to increase the largest aberration that can be reconstructed from Hartmann sensor measurements.
Conventional Hartmann sensor processing relies on locating the centroid of the image formed behind each element of a lenslet array. These centroid locations are used to compute the local gradient of the incident aberration, from which the phase of the incident wave front is calculated. The largest aberration that can reliably be sensed with a conventional Hartmann sensor must have a local gradient small enough that the spot formed by each lenslet stays confined to the area behind that lenslet; if the local gradient is larger, spots form under neighboring lenslets, causing a form of cross talk between the wave-front sensor channels. We describe a wave-front reconstruction algorithm that processes the whole image measured by a Hartmann sensor together with a conventional image formed through the same incident aberration. We show that this algorithm can accurately estimate aberrations even when the aberration is strong enough that many of the spots formed by individual lenslets fall outside the local region of the Hartmann sensor detector plane defined by the edges of a lenslet.
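To make the conventional processing concrete, the sketch below shows centroid-based slope estimation for a Hartmann sensor image, assuming a square lenslet grid and a hypothetical `lenslet_slopes` helper (the function name, geometry parameters, and small-angle conversion are illustrative assumptions, not the paper's algorithm). It is exactly this per-subaperture confinement assumption that the paper's whole-image reconstruction algorithm removes.

```python
import numpy as np

def lenslet_slopes(image, n_sub, pitch_px, focal_length, pixel_size):
    """Conventional Hartmann processing sketch (illustrative, not the
    paper's method): divide the detector image into n_sub x n_sub
    subapertures of pitch_px pixels each, locate each spot centroid,
    and convert its displacement from the subaperture center into a
    local wave-front slope via the small-angle approximation."""
    slopes = np.zeros((n_sub, n_sub, 2))
    ys, xs = np.mgrid[0:pitch_px, 0:pitch_px]
    for i in range(n_sub):
        for j in range(n_sub):
            sub = image[i * pitch_px:(i + 1) * pitch_px,
                        j * pitch_px:(j + 1) * pitch_px]
            total = sub.sum()
            if total == 0:
                continue  # no light behind this lenslet
            # intensity-weighted centroid within the subaperture
            cy = (ys * sub).sum() / total
            cx = (xs * sub).sum() / total
            # displacement (pixels) from the subaperture center
            dy = cy - (pitch_px - 1) / 2
            dx = cx - (pitch_px - 1) / 2
            # slope (radians) = physical displacement / focal length
            slopes[i, j, 0] = dy * pixel_size / focal_length
            slopes[i, j, 1] = dx * pixel_size / focal_length
    return slopes
```

Note that this scheme silently fails in exactly the regime the paper targets: a spot that wanders into a neighboring subaperture is attributed to the wrong lenslet, corrupting both slope estimates.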