Medical-grade Sterilizable Target for Fluid-immersed Fetoscope Optical Distortion Calibration

* These authors contributed equally
Bioengineering
 

Summary

This article describes the design and development of a sterilizable custom camera optical distortion calibration target for the peri-operative, fluid-immersed calibration of endoscopes during endoscopic interventions.

Cite this Article


Nikitichev, D. I., Shakir, D. I., Chadebecq, F., Tella, M., Deprest, J., Stoyanov, D., Ourselin, S., Vercauteren, T. Medical-grade Sterilizable Target for Fluid-immersed Fetoscope Optical Distortion Calibration. J. Vis. Exp. (120), e55298, doi:10.3791/55298 (2017).

Abstract

We have developed a calibration target for use with fluid-immersed endoscopes within the context of the GIFT-Surg (Guided Instrumentation for Fetal Therapy and Surgery) project. One of the aims of this project is to engineer novel, real-time image processing methods for intra-operative use in the treatment of congenital birth defects, such as spina bifida and the twin-to-twin transfusion syndrome. The developed target allows for the sterility-preserving optical distortion calibration of endoscopes within a few minutes. Good optical distortion calibration and compensation are important for mitigating undesirable effects like radial distortions, which not only hamper accurate imaging using existing endoscopic technology during fetal surgery, but also make acquired images less suitable for potentially very useful image computing applications, like real-time mosaicing. This paper proposes a novel fabrication method to create an affordable, sterilizable calibration target suitable for use in a clinical setup. This method involves etching a calibration pattern by laser cutting a sandblasted stainless steel sheet. The target was validated using the camera calibration module provided by OpenCV, a state-of-the-art software library popular in the computer vision community.

Introduction

Camera calibration is a well-known problem in the computer vision field that has been intensively studied over the years1,2,3. A key step of camera calibration procedures is to estimate the parameters of a distortion model, as well as the intrinsic camera parameters, by extracting a grid of points with a known geometry from camera images with sub-pixel accuracy. Calibration targets with a checkerboard pattern featuring black and white squares are commonly used for this purpose. Circular blobs offer an alternative pattern4,5,6.
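To make the notion of a grid with known geometry concrete, the model ("object") points of an asymmetric circle grid can be generated in a few lines of Python. This is only a sketch: the staggered-column layout mirrors the common OpenCV-style convention for asymmetric grids, and the dimensions used here are illustrative.

```python
# Model coordinates of the circle centres in an asymmetric circle grid.
# The layout convention (alternate columns shifted by half the spacing)
# follows the common OpenCV-style asymmetric grid; conventions may
# differ between calibration tools.

def asymmetric_grid_points(rows, cols, spacing):
    """Return (x, y, z) model coordinates of each circle centre."""
    points = []
    for c in range(cols):
        for r in range(rows):
            x = c * spacing / 2.0                   # columns half a period apart
            y = (2 * r + c % 2) * spacing / 2.0     # odd columns shifted down
            points.append((x, y, 0.0))              # planar target: z = 0
    return points

grid = asymmetric_grid_points(rows=3, cols=11, spacing=2.0)
print(len(grid))  # 33 circle centres
```

These model points, paired with the corresponding sub-pixel circle detections in each image, are what the calibration routine consumes to estimate the camera parameters.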

In recent years, there has been a growing interest in the development of surgical navigation technology for fetal surgery procedures, such as the treatment of twin-to-twin transfusion syndrome (TTTS) on fetuses7,8,9,10. As the field of view of the fetoscope (i.e., an endoscope used in fetal surgical procedures) is very limited, methods for mapping the placental vasculature without the use of external trackers have been proposed to aid TTTS surgery11,12,13. Optical distortions within fetoscopic images have adverse effects on these computational mosaicing methods that rely on visual information extraction11. Thus, there is an unmet need for a cost-efficient and fast tool for peri-operatively calibrating fetoscopes so that optical distortion compensation can be done in real-time during the intervention.

Due to the fact that the fetoscope is immersed in amniotic fluid during the intervention, the refraction index difference between air and amniotic fluid renders classical in-air camera calibration methods unsuitable for fetal surgery procedures. Estimating fluid-immersed camera parameters from in-air camera parameters is a difficult task and requires at least one image of the fluid-immersed calibration target14. Furthermore, peri-operative, fluid-immersed fetoscopic camera calibration is currently impractical due to sterilization requirements and restrictions on the materials allowed in the operating theater. Due to these reasons, calibrating endoscopes for optical distortions is typically not part of the current clinical workflow. The work in this manuscript is an attempt to close this camera calibration gap by designing and producing a sterilizable and practical optical distortion calibration target featuring a pattern of asymmetric circles. Previously, Wengert et al. fabricated a custom calibration device featuring an oxidized aluminum plate as the calibration target. Their method, however, works only in conjunction with the custom calibration algorithm they developed15.

Protocol

1. Target Fabrication

  1. Sandblasting
    1. Prepare a 316 stainless steel sheet with a 1.2-mm thickness. Using a pencil or a nail, draw a 40 mm x 40 mm square onto the sheet with the aid of a ruler.
    2. Cut out the drawn square using a manual metal cutter. CAUTION! Keep fingers clear of the cutting blade.
    3. Use a file to round the corners and edges of the sample. CAUTION! The cut edges are very sharp; handle with care.
    4. Prepare a straight wooden or metal block slightly larger than the stainless steel sheet and place the cut sheet on it to avoid bending the sample during sandblasting.
    5. Place the assembly in the internal blast chamber. Remember to use a dust collector and to tightly seal the internal blast chamber; otherwise, the sand will spread all over during the process. Wear safety goggles to protect the eyes.
    6. Mount the sample on a 1-2 cm-thick piece of wood held in a vice, as the high-pressure sand flow can deform an unsupported sample. Position the blast gun perpendicular to and at least 4-5 cm away from the metal surface, then apply the foot control to start sandblasting. During sandblasting, hold the sample down firmly at the edge of the piece of wood or clamp it with another vice.
    7. Repeat the sandblasting on the other side if it is desirable to have a calibration pattern engraved on both sides.
  2. Laser patterning
    1. Design a pattern of asymmetric circles, as shown in Figure 1.
    2. Prepare a drawing exchange format (DXF) file of the design using CAD software or a suitable scripting language.
      NOTE: For convenience, a Python application that can generate DXF files for the design mentioned in this paper is provided as part of the compact GUI application16.
    3. Import the DXF files into the laser cutting software.
    4. Set up the following parameters for background etching. Laser Power: 40%, Scan Speed: 80 cm/s, Frequency: 4,000 Hz, Number of Passes: 1.
    5. Set up the following parameters for etching the pattern. Laser Power: 40%, Scan Speed: 2.1 cm/s, Frequency: 4,000 Hz, Number of Passes: 1.
    6. Put the sample on the working platform and align the cutting pattern using the software.
    7. After the laser etching is complete, clean the sample by dipping it in alcohol. Do not use any wipes, as they usually leave undesirable residue.
  3. Sterilization
    1. Wrap the cleaned sample in a sterilization package and insert it into the sterilization unit (autoclave).
    2. Add water (not distilled water) to the autoclave and follow the user's guide/manufacturer's recommendations to sterilize the target.
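The DXF file from step 1.2.2 can also be generated programmatically. The sketch below writes an entities-only DXF (a reduced subset that many laser-cutter packages accept) containing a staggered grid of circles; the grid dimensions, layer name, and group-code usage are illustrative, and the generator shipped with endocal16 should be preferred in practice.

```python
# Minimal DXF writer for the asymmetric circle pattern (step 1.2.2).
# Emits only an ENTITIES section with CIRCLE entities -- a reduced DXF
# subset; a full-featured library such as ezdxf is preferable for
# production use. All dimensions are in millimetres and illustrative.

def circle_entity(x, y, radius):
    # DXF group codes: 8 = layer, 10/20 = centre x/y, 40 = radius
    return "0\nCIRCLE\n8\n0\n10\n%.3f\n20\n%.3f\n40\n%.3f\n" % (x, y, radius)

def asymmetric_grid_dxf(rows=3, cols=11, spacing=2.0, radius=0.5):
    body = "0\nSECTION\n2\nENTITIES\n"
    for c in range(cols):
        for r in range(rows):
            x = c * spacing / 2.0                      # column pitch
            y = r * spacing + (c % 2) * spacing / 2.0  # stagger odd columns
            body += circle_entity(x, y, radius)
    return body + "0\nENDSEC\n0\nEOF\n"

with open("pattern.dxf", "w") as f:
    f.write(asymmetric_grid_dxf())
```

The resulting file can then be imported into the laser cutting software as in step 1.2.3.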

2. Peri-operative Calibration

  1. Calibration software
    1. Install the "endocal" endoscope calibration software package provided on GitHub16 (follow the instructions in the README file therein).
      NOTE: This software wraps the OpenCV camera calibration module17 in an easy-to-use convenience application. The provided application runs in two modes: online and offline. The online mode acquires the video stream directly from compatible frame-grabber hardware. The offline mode allows for loading endoscope images either from a video file or a folder with a number of video frames saved as image files. See README for supported hardware and detailed instructions on how to use these two modes.
  2. Endoscopic video acquisition
    NOTE: The following instructions are for online calibration (as described above), but they are also applicable to offline calibration.
    1. Place the calibration target in a sterile fluid container, such as a gallipot.
    2. Fill the container with the target fluid or a similar sterile substance.
      NOTE: For instance, in fetoscopic procedures, the target fluid is amniotic fluid. Since the optical properties of amniotic fluid are similar to saline water18,19, sterile saline water can be used for calibrating the fetoscope.
    3. Adjust the zoom and sharpness of the endoscope as desired.
    4. Immerse the endoscope in the fluid and hold it at a distance from the calibration target similar to the distance from the anatomy that the endoscope will later be used at.
    5. Launch the calibration application and start the camera acquisition.
    6. Move the tip of the endoscope slightly for different views while keeping the whole calibration pattern in view of the camera. For optimal performance, keep the elliptical legend around the calibration pattern within the circular view of the endoscope.
      NOTE: Video frames that are usable for calibration are indicated by a virtual pattern overlay, as seen in Figure 3.
    7. Acquire at least the minimum number of endoscopic camera views required for calibration (as indicated in the endocal window).
      NOTE: The current version of endocal requires at least 10 endoscopic camera views for calibration, a heuristically selected number of views at which the calibration error appears to be minimal and to follow a stable pattern20.
    8. Press the calibration key, as indicated on the endocal window, to start the calibration process using the images acquired so far.
  3. Saving and using the calibration parameters
    1. Press the key indicated on the endocal window to save the resulting calibration parameters in a YAML ("YAML Ain't Markup Language") file21.
    2. Group the calibration parameters into the camera matrix and distortion coefficients, as explained in the OpenCV camera calibration module17.
      NOTE: After performing the calibration, the calibration application automatically displays the distortion-corrected image to the right of the original endoscope image.
    3. Use the distortion-corrected video feed during a fetoscopic procedure for pure visualization or for real-time placental mosaicing11.
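The YAML file saved in step 2.3.1 is written in OpenCV's FileStorage dialect (a %YAML:1.0 directive plus !!opencv-matrix tags), which some generic YAML parsers reject. As a sketch, assuming the usual OpenCV layout and key names (they are assumptions here, not taken from the endocal documentation), the camera matrix and distortion coefficients can be recovered with the standard library alone:

```python
import re

# Sketch: extract the camera matrix and distortion coefficients from an
# OpenCV-FileStorage-style YAML file (step 2.3.2). The key names and
# numerical values below are illustrative, not actual calibration output.

SAMPLE = """%YAML:1.0
camera_matrix: !!opencv-matrix
   rows: 3
   cols: 3
   dt: d
   data: [ 400., 0., 160., 0., 400., 120., 0., 0., 1. ]
distortion_coefficients: !!opencv-matrix
   rows: 5
   cols: 1
   dt: d
   data: [ -0.2, 0.05, 0., 0., 0. ]
"""

def read_matrix(text, key):
    """Return the flat 'data' list of the named !!opencv-matrix node."""
    block = re.search(key + r":.*?data:\s*\[(.*?)\]", text, re.S).group(1)
    return [float(v) for v in block.split(",")]

K = read_matrix(SAMPLE, "camera_matrix")               # fx, 0, cx, 0, fy, cy, ...
dist = read_matrix(SAMPLE, "distortion_coefficients")  # k1, k2, p1, p2, k3
print(K[0], K[2])  # focal length fx and optical centre cx
```

In practice, the file produced by endocal can be read the same way, or loaded directly with OpenCV's own FileStorage reader.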

Representative Results

We created a sterilizable calibration target by etching a pattern of asymmetric circles on a sandblasted stainless steel sheet, whose design is shown in Figure 1. An exemplar setup showing this calibration target in action together with a fetoscope is shown in Figure 2. To feed this design into the laser etching software, a custom application was implemented in the Python programming language16. Creating the design pattern involves iteratively etching parallel lines on the metal sheet. For the pattern to have a consistent color, the distance between these lines should be less than the width of the laser beam, which is 45 µm for the Violino (Laservall) laser cutter (see the inset of Figure 1).

Figure 1: Design of the engraved pattern featuring a 3-by-11 grid of asymmetric circles. Inset: zoomed-in view of the grid of asymmetric circles. The distance between the lines is 45 µm (equal to the laser beam width), and each circle has a diameter of 1 mm. Other sizes could be used for the grid as well, but this one was found to be optimal with respect to the fetoscope field of view.

Figure 2: Exemplar setup with the calibration target in use. The tip of the water-immersed fetoscope is directed at the calibration target on the right. On the left, a British penny provides scale information.

The fabricated calibration target allows for the detection of the circular pattern in the endoscopic video stream with OpenCV17; the detected circle locations are then sorted into the pre-defined asymmetric circular grid (see Figure 3). Using this information in conjunction with the known grid geometry, the internal camera parameters can be estimated. These include the camera matrix and the distortion coefficients. The camera matrix consists of the focal lengths and the optical centers along the x- and y-axes of the 2D image plane. The distortion coefficients are based on the Brown-Conrady model3. Note that for this work, only the radial distortion parameters were estimated. For a brief discussion of the theory, with practical examples, see the webpage of the OpenCV camera calibration module17 and the MATLAB camera calibration toolbox22. More details about the camera calibration procedure are available in Zhang's work20. The endocal software repository features a sample dataset of 10 endoscopic views of the fabricated calibration target16. Using this dataset, a calibration with an average re-projection error of 0.28 pixels (min: 0.16, max: 0.45) was obtained. This is comparable to the 0.25 pixels reported by Wengert et al. using their custom calibration algorithm15. The same research group, however, reported a re-projection error of 0.6 pixels in a more recent paper when using their method15 for calibrating an endoscopic camera used for placental mosaicing18.
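To make the estimated parameters concrete, the following pure-Python sketch applies the radial part of the Brown-Conrady model3 to a normalized image point, projects it through a camera matrix, and computes the re-projection error metric quoted above. All numerical values are illustrative, not results from the actual calibration.

```python
import math

# Radial part of the Brown-Conrady distortion model (the only part
# estimated in this work), applied to normalized image coordinates,
# followed by projection through the camera matrix. All numbers here
# are made up for demonstration.

def distort_radial(x, y, k1, k2):
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor

def project(x, y, fx, fy, cx, cy, k1, k2):
    xd, yd = distort_radial(x, y, k1, k2)
    return fx * xd + cx, fy * yd + cy   # pixel coordinates

def rms_reprojection_error(observed, projected):
    # Root-mean-square distance, in pixels, between detected and
    # model-predicted blob centres -- the error quoted in the text.
    sq = [(u - up) ** 2 + (v - vp) ** 2
          for (u, v), (up, vp) in zip(observed, projected)]
    return math.sqrt(sum(sq) / len(sq))

u, v = project(0.1, 0.0, fx=400.0, fy=400.0, cx=160.0, cy=120.0,
               k1=-0.2, k2=0.05)
print(u, v)  # a negative k1 (barrel distortion) pulls the point towards the centre
```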

Figure 3: Real-time detection of the calibration pattern. A screenshot from the calibration application16 featuring the detected calibration pattern overlaid onto the live video stream using the virtual reality visualization from OpenCV17. Note that each detected column of the calibration pattern is emphasized by a different color. The detected circles, in conjunction with the known geometry, are used for computing the camera parameters.

The estimated camera parameters are used for optical distortion correction. Figure 4 shows a rectangular chessboard pattern, as viewed using a fetoscope, where optical distortions make the lines appear as curves. Note that the lines appear straight in the distortion-corrected image.
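Distortion correction amounts to inverting the distortion model: for each corrected point, one seeks the undistorted coordinate whose distorted image coincides with the observation. The radial model has no closed-form inverse, so a fixed-point iteration is commonly used (OpenCV's undistortion works along similar lines); the sketch below handles only the radial terms k1 and k2, matching the model used in this work, and uses made-up coefficient values.

```python
# Sketch of undistortion by fixed-point iteration: recover the
# undistorted normalized coordinate (x, y) from a distorted one
# (xd, yd) under the radial model x_d = x * (1 + k1*r^2 + k2*r^4).

def undistort_point(xd, yd, k1, k2, iterations=20):
    x, y = xd, yd                      # initial guess: the distorted point
    for _ in range(iterations):
        r2 = x * x + y * y
        factor = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / factor, yd / factor
    return x, y

# Round trip: distorting a point and then undistorting it should
# recover the original coordinates (illustrative coefficients).
k1, k2 = -0.2, 0.05
x0, y0 = 0.3, -0.2
r2 = x0 * x0 + y0 * y0
f = 1.0 + k1 * r2 + k2 * r2 * r2
xd, yd = x0 * f, y0 * f
x1, y1 = undistort_point(xd, yd, k1, k2)
print(round(x1, 6), round(y1, 6))
```

For mild endoscopic distortions, the iteration converges in a handful of steps; stronger distortions may need more iterations or a damped update.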

Figure 4: Optical distortion correction. A screenshot from the calibration application16 showing the live video image from a fetoscope viewing the checkerboard pattern (left) and the distortion-corrected image (right). Three exemplary lines are drawn in both images, each straight from one corner to another. Due to the optical distortions, these lines appear as curves in the original fetoscope image.

Discussion

Sandblasting is an important step in the fabrication process because the raw metal surface prominently reflects the endoscope light, making it impossible to detect the circles; it is difficult to distinguish them even with the naked eye (see Figure 5). Note that the surface of the target shown was already laser-etched; however, etching alone does not diminish the light reflection.

Figure 5: Calibration target with no sandblasting applied. As seen from the endoscope view on the left, the glare from the endoscope light on the material surface makes it difficult even for the naked eye to distinguish the circles (there is a circle just to the southeast of the large reflection). Note that the surface of this target (i.e., the "background") was already etched, but this does not help in the absence of sandblasting.

Prior to pattern etching, it is also important to etch the surface of the whole sample. This is necessary because the sandblasted surface has many specular reflections (see Figure 6), which interfere with blob detection.

Figure 6: Sandblasted surface with no etching. Although not as prominent as on the raw metal surface, the relatively small specular reflections (some highlighted with yellow arrows) are still sufficient to prevent blob detection from succeeding, so no calibration can be performed with this target.

Applying the laser at different speeds gives different background colors. The background color plays a significant role in the contrast between the circles and the background, so it is vital to determine the optimal background color. For this purpose, a plate with circles etched against a set of different backgrounds was created (see Figure 7). The backgrounds were tested using the feature detection module of OpenCV23, which is used in the OpenCV camera calibration module17. In this work, the target was made of stainless steel, as it is the most common and reliable material used in clinics for medical devices. This material is widely available, inexpensive, robust, and easy to sterilize. Other materials, such as anodized aluminum, could potentially be used for the calibration target, but this is left for future work.

Figure 7: Stainless steel plate featuring a palette of different background colors etched with the laser. Practical experiments were conducted in conjunction with the OpenCV feature detection module to determine which background color gives the optimal result in terms of blob-to-background contrast23. The endoscope view on the left shows the plate. The moderate background colors (i.e., those other than the darkest and lightest ones) in this palette yield better blob detection.

One of the advantages of this work is that performing a calibration using the fabricated target takes only 2-3 min. Most of the effort goes into manually stabilizing the endoscope to obtain decent views of the calibration pattern. Using a custom-built endoscope holder could eliminate the need for manual stabilization, which in turn could significantly shorten the calibration time.

Video 1: Optical distortion calibration performed using the developed calibration target together with the endocal software.

An advantage of our work compared to the work of Wengert et al.15 is that the OpenCV camera calibration module17 can be used as is for calibration, without requiring any modification or custom parameterization. Because OpenCV is a well-established and well-maintained software package and is very popular in the computer vision community, using it eliminates the need for writing and maintaining custom software. For the reader's convenience, a compact GUI application is provided16, which the reader can easily install and use to test new calibration targets. One disadvantage of our method compared to Wengert et al.15 is that their method is more robust to occlusions of the pattern, as it does not require the detection of all blobs.

Initially, a calibration target with a checkerboard pattern was fabricated for this work. However, this type of calibration target proved to be unsuitable in experiments due to the difficulty of detecting the corners of the checkerboard squares. Corner detection relies on histogram-based image binarization (see the OpenCV source code24). This implies the need for a clear color contrast between the dark and light squares, which could not be guaranteed with our checkerboard pattern, partially due to specular reflections, like the ones shown in Figure 6. Such specular reflections are present even after background etching; however, the detection of the circles seems to be less sensitive to this shortcoming.

In the current setup, only perpendicular views of the calibration target allow for successful blob detection. This is due to the specular reflections from the target surface hampering blob detection at oblique angles. We are working to further improve the target so as to allow for the acquisition of views at a wider range of angles, which could potentially improve the quality of performed calibrations20.

In the real-time placental mosaicing pipeline that was previously proposed11, the computation of the transformation that maps image pairs relies on the successful detection and grouping of features. However, optical distortions cause a group of features with a rigid geometry to appear different across images. This leads to inaccuracies in the computed transformations, which in turn cause drift in the resulting image mosaics. Because the most prominent optical distortions occur towards the edges, endoscopic images are currently cropped to their innermost regions. A good correction for optical distortions would potentially allow a larger part of each image to be incorporated into the mosaicing process. The benefit would be two-fold. First, it would increase the number of detected features in each image, potentially improving the computation of the image transformations. Second, it would allow the whole target anatomical surface to be reconstructed in a shorter time.

Disclosures

The authors have nothing to disclose.

Acknowledgments

This work was supported through an Innovative Engineering for Health award by the Wellcome Trust [WT101957], the Engineering and Physical Sciences Research Council (EPSRC) [NS/A000027/1], and a National Institute for Health Research Biomedical Research Centre UCLH/UCL High Impact Initiative. Jan Deprest is being funded by the Fonds voor Wetenschappelijk Onderzoek Vlaanderen (FWO; JD as clinical researcher 1.8.012.07). Danail Stoyanov receives funding from the EPSRC (EP/N013220/1, EP/N022750/1), the EU-FP7 project CASCADE (FP7-ICT-2913-601021), and the EU-Horizon2020 project EndoVESPA (H2020-ICT- 2015-688592). Sebastien Ourselin receives funding from the EPSRC (EP/H046410/1, EP/J020990/1, EP/K005278) and the MRC (MR/J01107X/1). Marcel Tella is supported by the EPSRC-funded UCL Centre for Doctoral Training in Medical Imaging (EP/L016478/1).

Materials

Name Company Catalog Number Comments
1.2 mm Metal sheet 316 Grade, 40 mm by 40 mm
Water container at least 50 mm by 50 mm by 30 mm
A sterilization package
Saline water
Manual metal cutter
A file to round up the corners
A wooden or metal block 50 mm by 50 mm at least 10 mm thick
A vice (desirable but not required)
Sand Blasting machine
GUI application to create .dxf file with the pattern https://github.com/gift-surg/endocal
PC
Laser Cutter
Autoclave
An endoscope calibration software from GitHub: https://github.com/gift-surg/endocal
Endoscope
OpenCV camera calibration module http://docs.opencv.org/2.4/doc/tutorials/calib3d/camera_calibration/camera_calibration.html
Safety goggles
A lab coat
A ruler and a marker
Alcohol (preferably ethanol) for dust removal and cleaning


References

  1. Zhang, Z., Matsushita, Y., Ma, Y. Camera calibration with lens distortion from low-rank textures. CVPR 2011 IEEE Conference on Computer Vision and Pattern Recognition, Washington, D.C., USA, IEEE Computer Society. 2321-2328 (2011).
  2. Devernay, F., Faugeras, O. D. Automatic calibration and removal of distortion from scenes of structured environments. SPIE's 1995 International Symposium on Optical Science, Engineering, and Instrumentation. International Society for Optics and Photonics. 62-72 (1995).
  3. Duane, C. B. Close-range camera calibration. Photogramm. Eng. 37, (8), 855-866 (1971).
  4. Mallon, J., Whelan, P. F. Which pattern? biasing aspects of planar calibration patterns and detection methods. Pattern recognition letters. 28, (8), 921-930 (2007).
  5. Balletti, C., Guerra, F., Tsioukas, V., Vernier, P. Calibration of Action Cameras for Photogrammetric Purposes. Sensors. 14, (9), 17471-17490 (2014).
  6. Heikkila, J. Geometric camera calibration using circular control points. IEEE Transactions on pattern analysis and machine intelligence. 22, (10), 1066-1077 (2000).
  7. Deprest, J. A., et al. Fetal surgery is a clinical reality. Seminars in fetal and neonatal medicine. 15, (1), Elsevier. 58-67 (2009).
  8. Watanabe, M., Flake, A. W. Fetal surgery: Progress and perspectives. Advances in pediatrics. 57, (1), 353-372 (2010).
  9. Lewi, L., Deprest, J., Hecher, K. The vascular anastomoses in monochorionic twin pregnancies and their clinical consequences. American journal of obstetrics and gynecology. 208, (1), 19-30 (2013).
  10. Yamashita, H., et al. Miniature bending manipulator for fetoscopic intrauterine laser therapy to treat twin-to-twin transfusion syndrome. Surgical Endoscopy. 22, (2), 430-435 (2008).
  11. Daga, P., et al. Real-time mosaicing of fetoscopic videos using SIFT. Proc. SPIE 9786, Medical Imaging 2016: Image-Guided Procedures, Robotic Interventions, and Modeling. 97861R. International Society for Optics and Photonics. (2016).
  12. Yang, L., et al. Image mapping of untracked free-hand endoscopic views to an ultrasound image-constructed 3D placenta model. The International Journal of Medical Robotics and Computer Assisted Surgery. 11, (2), 223-234 (2015).
  13. Liao, H., et al. Fast image mapping of endoscopic image mosaics with three-dimensional ultrasound image for intrauterine fetal surgery. Minimally invasive therapy & allied technologies. 18, (6), 332-340 (2009).
  14. Chadebecq, F., et al. Practical Dry Calibration With Medium Adaptation For Fluid-Immersed Endoscopy. Hamlyn Symposium on Medical Robotics, (2015).
  15. Wengert, C., Reeff, M., Cattin, P. C., Székely, G. Bildverarbeitung für die Medizin 2006: Algorithmen, Systeme, Anwendungen. Handels, H., et al. (eds). Springer, Berlin Heidelberg. 419-423 (2006).
  16. Shakir, D. I. Compact GUI application for optical distortion calibration of endoscopes. Available from: https://github.com/gift-surg/endocal (2016).
  17. Camera calibration With OpenCV. Available from: http://docs.opencv.org/2.4/doc/tutorials/calib3d/camera_calibration/camera_calibration.html (2016).
  18. Reeff, M., Gerhard, F., Cattin, P. C., Székely, G. Mosaicing of endoscopic placenta images. Citeseer. (2011).
  19. Steigman, S. A., Kunisaki, S. M., Wilkins-Haug, L., Takoudes, T. C., Fauza, D. O. Optical properties of human amniotic fluid: implications for videofetoscopic surgery. Fetal diagnosis and therapy. 27, (2), 87-90 (2009).
  20. Zhang, Z. A flexible new technique for camera calibration. IEEE Transactions on Pattern Analysis and Machine Intelligence. 22, (11), 1330-1334 (2000).
  21. Evans, C. C. The Official YAML Web Site. Available from: http://yaml.org (2016).
  22. Bouguet, J. -Y. Camera Calibration Toolbox for Matlab. Available from: http://www.vision.caltech.edu/bouguetj/calib_doc/ (2015).
  23. Common Interfaces of Feature Detectors. Available from: http://docs.opencv.org/2.4/modules/features2d/doc/common_interfaces_of_feature_detectors.html (2016).
  24. Open Source Computer Vision Library. Available from: https://github.com/opencv/opencv (2016).
