For space-based Earth observation and solar system observation, both high revisit rates and high angular resolution are assets for many applications. Unfortunately, the two requirements conflict: high revisit rates call for a constellation of small platforms, while high angular resolution requires large optics and therefore a large platform. A deployable satellite concept has been proposed that could provide both, combining high revisit rates with an angular resolution of roughly 1 meter on the ground. This concept relies, however, on the capacity to maintain the phasing of the mirror segments to a precision of a few tens of nanometers at visible wavelengths while the structure undergoes strong and dynamic thermal gradients. Within the constrained volume of a CubeSat, the system must reuse the science images themselves to measure the phasing errors. In this paper we address the key issue of focal-plane wavefront sensing for a segmented pupil from a single image using deep learning. We present a first demonstration of measurement on a point source: the neural network correctly identifies the piston, tip, and tilt coefficients of each petal with errors below the 15 nm per petal limit.
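As a rough illustration of the single-image, learning-based phasing approach described above, the sketch below shows one possible convolutional regressor that maps a focal-plane image of a point source to per-petal piston/tip/tilt coefficients. This is a minimal sketch under stated assumptions, not the authors' network: the architecture, the six-petal pupil, the 64x64 image size, and the placeholder training data are all illustrative choices not taken from the paper.

```python
# Illustrative sketch only (not the paper's architecture): a small CNN that
# regresses per-petal piston/tip/tilt coefficients from a single focal-plane
# image of a point source. Petal count (6) and image size (64x64) are assumed.
import torch
import torch.nn as nn

N_PETALS = 6                 # assumed number of deployable mirror petals
N_COEFFS = 3 * N_PETALS      # piston, tip, tilt per petal

class PhasingCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                             # 64 -> 32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                             # 32 -> 16
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                             # 16 -> 8
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, 256), nn.ReLU(),
            nn.Linear(256, N_COEFFS),    # predicted coefficients (e.g. in nm)
        )

    def forward(self, x):
        return self.regressor(self.features(x))

# Training-loop sketch: focal-plane images of the point source labelled with
# the piston/tip/tilt values used to simulate them (placeholders here).
model = PhasingCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

images = torch.randn(32, 1, 64, 64)        # placeholder for simulated PSFs
coeffs = torch.randn(32, N_COEFFS) * 50.0  # placeholder labels (nm)

for epoch in range(10):
    optimizer.zero_grad()
    pred = model(images)
    loss = loss_fn(pred, coeffs)           # target accuracy: < 15 nm per petal
    loss.backward()
    optimizer.step()
```

In practice the training set would be built from simulated (or lab-acquired) point-source images spanning the expected range of piston/tip/tilt errors, and the residual error after correction would be compared against the 15 nm per petal requirement.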