An observational system for tele-micro-operation is proposed, combining a dynamic focusing system with a high-speed image-processing system based on the "depth from focus" criterion. In our previous work (2000), we proposed a system that obtains the "all-in-focus image" and the depth of an object simultaneously. In micro-operation tasks such as micro-surgery and DNA manipulation, the shallow depth of field of a microscope severely limits observability. The "all-in-focus image", which keeps the texture in focus over the entire image, is therefore useful for observing micro environments through a microscope. It is also important to obtain a depth map and render the 3D micro virtual environment in real time so that micro objects can be manipulated intuitively. Our previous system, built on a dynamic focusing lens and a smart sensor, required 2 s to obtain the all-in-focus image and the depth; real-time micro-operation demands a rate of at least 30 frames/s. This paper briefly reviews the "depth from focus" criterion used to obtain the all-in-focus image and the 3D reconstruction of the micro environment simultaneously. After discussing the limitations of our previous system, a new frame-rate system is constructed with a high-speed video camera and FPGA hardware. To apply this system to a real microscope, a new criterion for reconstructing the all-in-focus image is proposed. Finally, micro-scale observations demonstrate the validity of the system.
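The "depth from focus" idea underlying the system can be illustrated with a minimal sketch: for each pixel of a focal stack (images captured at different lens settings), evaluate a local focus measure, take the focal plane that maximizes it as the depth estimate, and copy that plane's pixel into the all-in-focus image. The function names, the Laplacian-based focus measure, and the window size below are illustrative assumptions, not the authors' FPGA implementation.

```python
import numpy as np

def focus_measure(img, ksize=3):
    """Per-pixel focus measure: squared Laplacian, smoothed over a
    ksize x ksize window (assumed measure; the paper's criterion differs)."""
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4.0 * img)
    m = lap ** 2
    k = ksize // 2
    out = np.zeros_like(m)
    for dy in range(-k, k + 1):          # box-filter accumulation
        for dx in range(-k, k + 1):
            out += np.roll(np.roll(m, dy, 0), dx, 1)
    return out

def depth_from_focus(stack):
    """stack: (N, H, W) focal stack, one image per focal plane.
    Returns (depth index map, all-in-focus image)."""
    fm = np.stack([focus_measure(s) for s in stack])   # (N, H, W)
    depth = np.argmax(fm, axis=0)                      # best plane per pixel
    aif = np.take_along_axis(stack, depth[None], axis=0)[0]
    return depth, aif
```

A frame-rate system would evaluate such a measure for every pixel of every focal slice within one video frame, which is what motivates the high-speed camera and FPGA hardware discussed in the paper.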