Bioengineering

Area-based Image Analysis Algorithm for Quantification of Macrophage-fibroblast Cocultures

Published: February 15, 2022 doi: 10.3791/63058

Summary

We present a method that utilizes a generalizable, area-based image analysis approach to determine cell counts. Analysis of different cell populations exploited the significant differences in cell height and structure between distinct cell types within an adaptive algorithm.

Abstract

Quantification of cells is necessary for a wide range of biological and biochemical studies. Conventional image analysis of cells typically employs either fluorescence detection approaches, such as immunofluorescent staining or transfection with fluorescent proteins, or edge detection techniques, which are often error-prone due to noise and other non-idealities in the image background.

We designed a new algorithm that could accurately count and distinguish macrophages and fibroblasts, cells of different phenotypes that often colocalize during tissue regeneration. MATLAB was used to implement the algorithm, which differentiated distinct cell types based on differences in height from the background. A primary algorithm was developed using an area-based method to account for variations in cell size/structure and high-density seeding conditions.

Non-idealities in cell structures were accounted for with a secondary, iterative algorithm utilizing internal parameters such as cell coverage computed using experimental data for a given cell type. Finally, an analysis of coculture environments was carried out using an isolation algorithm in which various cell types were selectively excluded based on the evaluation of relative height differences within the image. This approach was found to accurately count cells within a 5% error margin for monocultured cells and within a 10% error margin for cocultured cells.

Introduction

Software is routinely implemented during image analysis to ensure that the results are accurate, efficient, and unbiased. For cell-based assays, a common problem is the misidentification of cells. Images with improper focal and contrast settings may exhibit cell blurring, in which the boundaries of individual cells become hard to identify1. The presence of extraneous image features such as pores, bubbles, or other undesired objects can hamper counting by slowing the process and leading to misidentification. Furthermore, cell counting can be onerous, and counting hundreds of replicates can be extremely time-consuming. Moreover, an inherent subjective bias exists during manual counting, and decision-making regarding cell identification is therefore often inaccurate2. Automated software offers exciting potential to bypass all of these issues by rapidly and precisely differentiating cells from extraneous objects, including objects too subtle or numerous for reliable manual detection, based on well-defined identification criteria that reduce the influence of investigator bias. Common techniques to identify cells using automated software involve two main methods: segmentation and thresholding3. Herein, we demonstrate a generalizable area-based protocol that enables rapid, accurate, and inexpensive cell counting within a widely accessible software framework.

Segmentation techniques, such as edge detection, seek to isolate individual cells by utilizing intensity differences within an image. Intensity changes that distinguish a cell from the rest of the image most often consist of sharp changes in brightness4. Edge detection involves a regularizing filtering step, followed by a differentiation step in which intensity changes are detected. The differentiation process identifies edges and contours where intensity changes sharply within the image, and these edges and contours are correlated with cell presence. Although noisy images can be run through denoising algorithms4, edge detection techniques are best suited to analyzing images with low background noise. The process functions optimally when cell boundaries are clearly and easily distinguishable and are not obscured by brightness contours unrelated to cell presence, cell blur, extraneous objects, or defined internal cell structures1,2. If an image is particularly noisy, cells may be further distinguished through fluorescent staining or transfection with fluorescent proteins2,5. Although this significantly improves the accuracy of segmentation techniques, it requires added costs and additional time to prepare cell cultures for imaging.

Thresholding techniques involve the division of an image into two categories: the foreground and the background, with cells assigned to the foreground3. These techniques utilize color/contrast changes to define the apparent height of an object; objects that are routinely 'taller' than the background can be easily identified as cells. The watershed-transform functions in this way by associating surfaces with light pixels as the foreground and those with dark pixels as the background6,7. Through height-based identification, thresholding techniques can routinely distinguish noise from desired objects, provided they exist within the same focal plane. When paired with an area-based quantification, a watershed-transform can accurately identify groups of objects in environments where typical segmentation techniques such as edge detection would be inaccurate.

Watershed-transforms are commonly coupled with segmentation techniques to prepare images for a cleaner analysis, resulting in higher accuracy of cell counting. For this process, the watershed-transform is used to highlight potential regions of interest prior to segmentation. A watershed-transform provides unique benefits by identifying cells in the foreground of images, which can improve the accuracy of segmentation analysis by removing potential false positives for cells, such as uneven patches of background. However, difficulties can arise when attempting to adapt cell-based images to a watershed-transform. Images with high cell density can be plagued with undersegmentation, in which aggregates of cells are identified as a singular group rather than as individual components. The presence of noise or sharp intensity changes can also result in oversegmentation, in which the algorithm overisolates cells, resulting in excessive and inaccurate cell counts8.
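
For orientation, the basic thresholding and watershed operations described above can be reproduced with standard MATLAB Image Processing Toolbox functions. The snippet below is a generic illustration of this conventional approach (the file name and the naive object count are placeholders for demonstration), not the area-based algorithm developed in this work.

    % Generic illustration of thresholding followed by a watershed-transform.
    % 'cells.tiff' is a placeholder file name.
    I  = imread('cells.tiff');            % grayscale brightfield image
    bw = imbinarize(I);                   % global (Otsu) threshold: foreground vs. background
    D  = -bwdist(~bw);                    % negated distance transform of the foreground
    L  = watershed(D);                    % split touching objects into catchment basins
    L(~bw) = 0;                           % discard labels that fall in the background
    numObjects = numel(unique(L(L > 0))); % naive object count, prone to over/undersegmentation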

Herein, we detail a method to minimize the primary drawbacks of the watershed-transform by incorporating components of a thresholding analysis within an area-based quantification algorithm, as depicted in Figure 1. Notably, this algorithm was implemented with open-source and/or widely available software, and application of this cell-counting framework was possible without expensive reagents or complex cell preparation techniques. RAW264.7 macrophages were used to demonstrate the method due to their critical role in regulating connective tissue maintenance and wound healing processes9. Additionally, NIH/3T3 fibroblasts were analyzed due to their key role in tissue maintenance and repair. Fibroblast cells often coexist with and support macrophages, generating the need to distinguish these phenotypically distinct cell types in coculture studies.

Cell counts from images with high viable cell density (VCD) could be quantified reliably and efficiently by calculating the area covered by the cells and the average area occupied by a single cell. The use of thresholding as opposed to segmentation for cell identification also enabled more complex analyses, such as experiments in which different cell types in cocultures were analyzed concurrently. NIH/3T3 fibroblasts, which are often found to colocalize with RAW264.7 macrophages within a wound healing site, were found to grow at a focal plane that was distinct from the focal plane of macrophages10. Accordingly, multiple thresholding algorithms were run to define the background and foreground depending on the cell type being analyzed, enabling accurate counting of two different cell types within the same image.

Protocol

1. Cell culture and image acquisition

  1. Culture RAW264.7 macrophages at 37 °C and 5% CO2 in Dulbecco's Modified Eagle Medium (DMEM) supplemented with 10% fetal bovine serum (FBS), 1% penicillin-streptomycin, 1.5 g/L sodium bicarbonate, and 5 µM β-mercaptoethanol.
    1. For monoculture imaging, culture RAW264.7 cells at a density of 25,000 cells/cm2 in a 5 mL cell culture flask with 1 mL of medium.
  2. Culture NIH/3T3 cells at 37 °C and 5% CO2 in DMEM supplemented with 10% fetal bovine serum and 1% penicillin-streptomycin11.
  3. For coculture imaging, culture RAW264.7 macrophages and NIH/3T3 fibroblasts together at varied ratios, at a total density of 25,000 cells/cm2. Use coculture medium that is 1 part RAW264.7 medium (DMEM supplemented with 10% FBS, 1% penicillin-streptomycin, 1.5 g/L sodium bicarbonate, and 5 µM β-mercaptoethanol) and 1 part NIH/3T3 medium (DMEM supplemented with 10% fetal bovine serum and 1% penicillin-streptomycin).
  4. Following seeding, incubate cells at 37 °C and 5% CO2 until they reach a viable cell density corresponding to approximately 80% confluence.
  5. Image cells with an inverted microscope equipped with a 40x objective. Acquire all images in grayscale and export them in the raw '.czi' file format.
    1. Determine the capacity of the algorithm to accurately evaluate images with a range of image qualities. Acquire images with varying foci, producing both 'non-bulbous' images (Figure 2) and 'bulbous' images (Figure 3).
    2. Export and convert cell images to the 8-bit tiff (.tiff) image format in ImageJ by applying the 'Batch Convert' function to the raw '.czi' files prior to MATLAB analysis. Store converted images in a local folder and manually transfer these image files to the relevant MATLAB bin.

2. Image analysis of monocultures, utilizing primarily the "monoculture.m" file

NOTE: The following steps were performed using MATLAB. Three files were used for the MATLAB protocol: "process.m" (Supplemental coding file 1), which contains the algorithm; "monoculture.m" (Supplemental coding file 2), which is run to analyze monoculture images; and "coculture_modified.m" (Supplemental coding file 3), which is run to analyze coculture images.

  1. Use the area-based method to obtain images of cells showing relative height variations. Display the output of each substep by applying the 'imshow()' function to the corresponding subimage. Copy and paste the image file for analysis into the bin, enter the file name in the following command, and press Run to start the program (a consolidated MATLAB sketch of steps 2.1-2.4 is provided at the end of this section).
    imread('filename.tiff')
    1. Analyze images by 'opening by reconstruction' followed by 'closing by reconstruction' using source functions8 to accentuate the foreground relative to the background. Use the first command for opening by reconstruction and the second and third commands, in sequence, for closing by reconstruction.
      'imopen()'
      'imerode()'
      'imreconstruct()'
  2. Binarize the reconstructed images to pure black and pure white pixels utilizing a percentile-based identification system. Note that cells are converted to a pure white pixel value of '255' in this system, whereas background and extraneous (non-cellular) objects are converted to a pure black pixel value of '0'.
    1. Distinguish cells from the background by using a percentile difference from the 'maximum relevant pixel,' the largest pixel value comprising at least 0.5% of a given image.
    2. Analyze and evaluate the pixel values for RAW264.7 macrophages. If these values are within 4.5% of the maximum relevant pixel, then mark the pixel as cellular.
      NOTE: This command can be issued using a simple if statement.
    3. For images containing bulbous cell profiles, such as those seen in Figure 2, implement an iterative procedure to correct for erroneous binarization at the centers of the cells (termed 'islands;' see below), as follows.
    4. Determine an initial guess for the total cell coverage; 60% was utilized for this study. Note that the number of cells present within the image depends on the cell coverage; the relationship between cell number and cell coverage can therefore be determined experimentally. Using this relationship, determine the variables 'alpha' and 'kappa.'
      NOTE: 'Alpha' represents a 3 x 1 vector containing the following relative height percentiles: the height percentile at which cells were identified, the height percentile at which the background was identified, and the height percentile at which islands (regions in which intensity values were significantly lower than cell values) typically resided. 'Kappa' represents the total area covered by cells in the image.
    5. Run the algorithm, which will (i) analyze images using initial estimates of alpha and kappa to fill a portion of the islands, and (ii) use the postanalysis cell counts and coverage to recalculate kappa. If the kappa value is within 10% of the initial guess, proceed to the next step.
      NOTE: For images containing RAW264.7 and NIH/3T3 cells, alpha was found to be [4.3, 5.5, 10]. In other words, a difference of 4.3% from the maximum relevant pixel determined the height percentile at which cells were identified; a difference of 5.5% from the maximum relevant pixel determined the height percentile at which the background was identified; and a difference of 10% from the maximum relevant pixel determined the height percentile at which islands typically resided.
  3. Following binarization of the image, determine the average cell area automatically using the open-source circle finding algorithm, which performs a Hough transform on the binarized image12,13,14. Use the following command to obtain a vector of all center locations and radii of circles found within the image.
    '[centers, radii] = imfindcircles(A, [minradius, maxradius])'
    'A' in this command is the image of choice, and [minradius, maxradius] is the range of radii that the algorithm will attempt to detect.
    NOTE: This procedure to determine the average cell area assumes that the morphology of macrophage cells was both circular and consistent among cells10. From experimental data, the radii of macrophages are observed to be most commonly between 30 and 50 pixels at 40x magnification. This pixel range is defined as the acceptable range for the circle finding algorithm. The radii might be significantly different for other cell types and would require determination using experimental analysis.
    1. Use the radii outputs to calculate the area of each detected cell and average these values to obtain the average cell area. Analyze at least 10 cells to ensure accurate area identification.
  4. Determine the total number of cells within the image using the average cell area.
    1. Loop through the image matrix and count the total number of cell pixels. Determine the total cell coverage by dividing the number of cell pixels by the total number of pixels within the image. Determine the total number of cells within the image by dividing the number of cell pixels by the average area of a cell.
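
The following is a minimal MATLAB sketch assembling the commands from steps 2.1-2.4 into one script. The structuring-element size, the coding of the 0.5% and 4.5% cutoffs, and the histogram-based definition of the 'maximum relevant pixel' are illustrative interpretations of the protocol text rather than the exact implementation, and the iterative alpha/kappa correction for bulbous images is omitted; the published implementation is provided in "monoculture.m" and "process.m".

    % Minimal sketch of the monoculture analysis (steps 2.1-2.4).
    I  = imread('filename.tiff');                 % 8-bit grayscale image from step 1.5.2
    se = strel('disk', 15);                       % structuring-element size is an assumption

    % Step 2.1.1: opening-by-reconstruction, then closing-by-reconstruction
    Ie      = imerode(I, se);
    Iobr    = imreconstruct(Ie, I);
    Iobrd   = imdilate(Iobr, se);
    Iobrcbr = imcomplement(imreconstruct(imcomplement(Iobrd), imcomplement(Iobr)));

    % Step 2.2: percentile-based binarization relative to the 'maximum relevant pixel'
    counts      = imhist(Iobrcbr);                                      % histogram over intensities 0-255
    maxRelevant = find(counts >= 0.005*numel(Iobrcbr), 1, 'last') - 1;  % brightest value covering >= 0.5% of pixels
    bw          = Iobrcbr >= (1 - 0.045)*maxRelevant;                   % pixels within 4.5% of the maximum relevant pixel

    % Step 2.3: average cell area from the circular Hough transform
    [~, radii] = imfindcircles(bw, [30 50]);      % macrophage radii of 30-50 pixels at 40x
    avgArea    = mean(pi*radii.^2);               % mean area of the detected circular cells

    % Step 2.4: area-based cell count
    cellPixels = nnz(bw);
    coverage   = cellPixels/numel(bw);            % fraction of the image covered by cells
    cellCount  = cellPixels/avgArea;              % estimated number of cells in the image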

3. Image analysis of cocultures, utilizing primarily the "coculture_modified.m" file

NOTE: The following steps were performed using MATLAB.

  1. Use the area-based method to obtain images of cells showing relative height variations. Display the output of each substep by applying the 'imshow()' function to the corresponding subimage. Copy and paste the image file for analysis into the bin, enter the file name in the following command, and press Run to start the program (a MATLAB sketch of the coculture-specific steps is provided at the end of this section).
    'imread('filename.tiff')'
    1. Analyze images by 'opening by reconstruction' followed by 'closing by reconstruction' using source functions10 to accentuate the foreground relative to the background. Use the first command for opening by reconstruction and the second and third commands, in sequence, for closing by reconstruction.
      'imopen()'
      'imerode()'
      'imreconstruct()'
  2. Conduct an analysis of cocultures containing RAW264.7 and NIH/3T3 cells by using two selective binarizations, obtained by exploiting the height differences between the two cell types.
    1. Experimentally determine a parameter 'phi' using images of RAW264.7 and NIH/3T3 cells, with phi representing the proportional height difference between the two cell types. Guess an initial phi value and iterate until cell counts and coverage match closely with manual counts, specific to the RAW264.7 and NIH/3T3 coculture.
      NOTE: The value for phi used in these studies was found to be 3.2. Phi is used to selectively filter an image to appear to contain only RAW264.7 cells, as seen in Figure 4C.
    2. Determine RAW264.7 cell counts and total cell coverage in a similar fashion as for monoculture images.
  3. Analyze the watershed-transformed image again without the phi parameter, detecting both macrophages and fibroblasts. Acquire NIH/3T3 fibroblast data by selectively subtracting RAW264.7 cell pixels, obtained using the standard thresholding and area-based quantification methods described in step 2.
    1. Once the entire image has been analyzed, determine the total number of NIH/3T3 pixels by subtracting the number of RAW264.7 pixels from the total number of cell pixels. Calculate the coverage of NIH/3T3 and RAW264.7 cells by dividing the number of cell pixels for each cell line by the total number of image pixels.
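
The following minimal sketch continues from the monoculture sketch in section 2 (reusing the processed image 'Iobrcbr', the maximum relevant pixel 'maxRelevant', and the Hough-derived average macrophage area 'avgArea') and illustrates the selective binarization and subtraction logic of steps 3.2-3.3. The precise way in which phi enters the intensity cutoff is an assumption here; the published implementation is provided in "coculture_modified.m".

    % Minimal sketch of the coculture analysis (steps 3.2-3.3); the use of phi
    % to relax the cutoff is an illustrative assumption.
    phi = 3.2;                                    % experimentally fitted height-difference parameter (step 3.2.1)
    tol = 0.045;                                  % monoculture tolerance from step 2.2.2 (assumed baseline)

    bwRAW = Iobrcbr >= (1 - tol)*maxRelevant;     % tight cutoff: taller RAW264.7 cells only (step 3.2)
    bwAll = Iobrcbr >= (1 - tol*phi)*maxRelevant; % relaxed cutoff: macrophages and fibroblasts (step 3.3)

    rawPixels = nnz(bwRAW);
    nihPixels = nnz(bwAll) - rawPixels;           % NIH/3T3 pixels by subtraction (step 3.3.1)

    rawCount    = rawPixels/avgArea;              % macrophage count from the average macrophage area
    rawCoverage = rawPixels/numel(bwAll);         % fraction of the image covered by macrophages
    nihCoverage = nihPixels/numel(bwAll);         % fraction of the image covered by fibroblasts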

Representative Results

Analysis of non-bulbous RAW264.7 macrophages was conducted in a monoculture setting at 25,000 cells/cm2. Representative images were taken of the cell culture and processed in MATLAB following conversion to 8-bit tiff in ImageJ. Algorithm outputs throughout the process were recorded and documented in Figure 2 for the representative image. In this image, the algorithm counted 226 cells, and this image count was verified by comparison with a manual count that identified 241 cells (6.2% error). Algorithm outputs for at least 10 images of RAW264.7 cell counts had an average error of 4.5 ± 1.9%.

Analysis of bulbous RAW264.7 macrophages was performed using an iterative algorithm. In most images, the bulbous effect observed was primarily a result of focusing at a focal plane slightly above that of the substrate to which the cells were adherent. Accordingly, most images were acquired in non-bulbous form, for which analysis was significantly easier. The algorithm necessary to analyze bulbous images was developed for scenarios in which suboptimal focus was impossible to avoid due to meniscus effects or other optical interference. Typical algorithm outputs throughout the process were recorded and documented in Figure 3. In this representative image analysis, the algorithm counted 221 cells within the image, which was verified with a manual count of 252 cells (12.3% error).

Analysis of cocultures containing both RAW264.7 macrophages and NIH/3T3 fibroblasts was conducted to determine the capacity to differentiate between two distinct cell types within a single image. Due to the highly variable cell morphologies of NIH/3T3 fibroblasts, manual/automatic cell counts could not be obtained accurately, and cell coverages for fibroblasts were instead compared qualitatively to the original image. Representative algorithm outputs throughout the process were recorded and documented in Figure 4. For this image, the RAW264.7 macrophage count was 137, and this count was verified with a manual count of 155 (11.6% error). Algorithm outputs for at least 10 RAW264.7 macrophage counts had an average error of 7.8 ± 3.9%.

The robustness of the cell counting algorithm was also verified using a blind study to compare automatic cell identification with manual user counts. Five images with varying cell densities were selected at random, and these images were blindly counted by three different users. The manual counts were compared with one another and with the results of the automatic cell counting algorithm; the comparison of manual and automated counting results is shown in Table 1. Furthermore, the segmentation accuracy of this method was tested using 'ground truth' images derived with conventional segmentation techniques. The DICE coefficient20,21 was utilized as a performance metric, with an average value of 0.85 across five images. An example overlay can be seen in Figure 5.
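
For reference, the DICE overlap reported above can be computed directly from two binary masks of equal size; in this minimal sketch, 'bwAuto' and 'bwTruth' are hypothetical mask variables produced by this method and by the 'ground truth' segmentation, respectively.

    % Dice coefficient between the automated mask and the 'ground truth' mask.
    overlap = nnz(bwAuto & bwTruth);                           % pixels labeled as cell by both masks
    diceCoefficient = 2*overlap/(nnz(bwAuto) + nnz(bwTruth));  % 1 = perfect agreement, 0 = no overlap

Recent releases of the MATLAB Image Processing Toolbox also provide a built-in dice() function that performs the same calculation.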

Figure 1: Generalized area-based algorithm. A process in which an image is prepared for segmentation analysis through watershed-transforms to determine regional maxima and minima10. This proposed method modifies the process (original process in black) by skipping the watershed ridge line determination and subsequent steps, adopting instead a percentile-based system coupled with an area-based quantification process (highlighted in red).

Figure 2: RAW264.7 non-bulbous algorithm output. The figure shows the intermediate image processing steps leading to the quantification of non-bulbous RAW264.7 macrophage counts. (A) Initial processing of the image to grayscale in ImageJ. The original image converted to 8-bit tiff and grayscale in ImageJ. (B) Postprocessing of the grayscale image. The image from A following opening and closing by reconstruction, as described in step 2.1.1. (C) Postbinarization of the processed image. Panel B after binarization, performed as described in step 2.2. (D) Final image with representative cells for average area calculations. The image with blue circles indicating the cells used within the Hough-transform to identify the average area of a cell, as described in step 2.3. Scale bars = 20 µm.

Figure 3: RAW264.7 bulbous algorithm output. The figure shows the intermediate image processing steps leading to the quantification of bulbous RAW264.7 macrophage counts. (A) Image of bulbous RAW264.7 macrophages. The original image converted to 8-bit tiff and grayscale in ImageJ. (B) Postprocessing of the grayscale image. The image from A following opening and closing by reconstruction, as described in step 2.1.1. (C) Postbinarization of the processed image with clear islands. The typical output without the iterative algorithm when analyzing bulbous images, with 'islands' of black pixels at the centers of the cells. The blue circles represent the cells used within the Hough-transform to identify the average area of a cell, as described in step 2.3. (D) Final image with representative cells and islands filled in using the iterative algorithm. The image after the iterative algorithm, as described in step 2.2.2. Scale bars = 20 µm.

Figure 4: RAW264.7 and NIH/3T3 coculture algorithm output. The figure shows the intermediate image processing steps leading to the quantification of coculture images containing RAW264.7 macrophages and NIH/3T3 fibroblasts. (A) Image of NIH/3T3 and RAW264.7 cells. The original image converted to 8-bit tiff and grayscale in ImageJ. (B) Postprocessing of grayscale image. The postprocessing image of A following opening and closing by reconstruction, as described in step 3.1.1. (C) Postbinarization of processed image with clear selection of macrophages based on height-based isolation. The selective isolation of RAW264.7 macrophages, as described in step 3.2.2. (D) Final image with clusters of macrophages and fibroblasts identified. The entire image containing both RAW264.7 macrophages and NIH/3T3 fibroblasts, as described in step 3.3. A sample macrophage cell is outlined with a green circle, and a sample fibroblast cell is outlined with a red oval. Scale bars = 20 µm.

Figure 5: DICE parameter-segmentation performance, RAW264.7 macrophages. The image shows an overlay between the 'ground truth' segmentation approach and this method. The white regions are cells detected by both the 'ground truth' and this method, while the purple and green regions are false negatives and false positives, respectively. The 'ground truth' segmentation was obtained by common segmentation techniques and uses region-of-interest tools to correct for erroneous segmentation.

Image   | Counter 1 | Counter 2 | Counter 3 | Automatic | Manual Counts Average (± STDEV) | Error Compared with Automatic
Image 1 | 151       | 148       | 145       | 142       | 148 ± 3.0                       | 4.22%
Image 2 | 164       | 166       | 168       | 173       | 166 ± 2.0                       | 4.05%
Image 3 | 255       | 253       | 245       | 239       | 251 ± 5.3                       | 5.02%
Image 4 | 153       | 152       | 157       | 166       | 154 ± 2.6                       | 7.22%
Image 5 | 103       | 106       | 100       | 111       | 103 ± 3.0                       | 7.20%

Table 1: Manual/automatic cell counts and robustness test.

Supplemental Information: Morphological differences in image analysis of macrophage and fibroblast cocultures.

Supplemental coding file 1: "process.m", the MATLAB file necessary to run the algorithms. No manual actions are required; this file contains the algorithm called by the other scripts.

Supplemental coding file 2: "monoculture.m", the MATLAB file utilized for analyzing monoculture images.

Supplemental coding file 3: "coculture_modified.m", the MATLAB file utilized for analyzing coculture images.

Discussion

We designed a general area-based procedure that accurately and efficiently counted cells on the basis of cell height, allowing for stain-free quantitation of cells even in coculture systems. A critical step in this procedure was the implementation of a relative intensity system by which cells could be differentiated. The use of a relative height analysis served two purposes: external parameters became unnecessary, because the relative parameters were constant for a given cell type, and absolute brightness/contrast levels did not need to be entered for every image. Furthermore, the use of a percentile-based system enabled the analysis of multiple cell types within the same image, provided that the different cell types resided at unique heights within the image.

The percentile-based system was defined as a difference from the maximum pixel value rather than from the average. This was done to prevent skewing of the algorithm by the average intensity value, which depended on the number and confluence of the cells present in the image. Furthermore, the 'maximum relevant pixel' was implemented to prevent maximum-intensity outliers from significantly affecting the data; to qualify, a pixel value had to represent at least 0.5% of the pixel population, a cutoff determined experimentally. Additional steps included the use of an area-based quantification system rather than counting individual entities. This ensured that the effects of cell fragmentation or erroneous binarization were minimized when counting cells, as the total affected area was small compared to the area of a cell.

Various troubleshooting methods were implemented within this algorithm. The initial efforts to quantify images of cell monocultures required methods to accurately identify both bulbous and non-bulbous cells. A unique phenomenon was observed for the bulbous cell images, in that the centers of the cells were identified as background following the watershed-transform, and these regions appeared as islands within the cells. A thorough analysis revealed that the loss of cell centers occurred due to an excessively convex structure within the image, which resulted in an inversion in the watershed-transform. A solution was implemented specifically for bulbous images, involving an iterative process known as plugging, in which the islands were filled in as true cell values. In this procedure, the cell coverage and the extent of island formation were used to determine the accuracy of the image. Generally, the cell coverage was used to estimate the extent of island formation, as an unusually low cell coverage was most typically caused by a high level of island formation. The relationship between cell coverage and the extent of island formation was determined through experimental data.
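
The published plugging routine iterates on the alpha and kappa parameters described in the protocol; as a much simpler stand-in that only conveys the intent, enclosed background regions inside the binarized cells can be filled with a morphological hole fill, for example:

    % Simplified stand-in for island plugging: fill enclosed background regions
    % inside the binarized cells. This does not reproduce the iterative
    % alpha/kappa correction, which also validates the resulting cell coverage.
    bwPlugged = imfill(bw, 'holes');
    coverage  = nnz(bwPlugged)/numel(bwPlugged);   % recompute coverage after filling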

Additional troubleshooting was necessary based on the observation that using an 'average pixel' threshold (defined as the mean pixel intensity within the image) to differentiate cells, rather than a maximum pixel, led to inconsistencies. The average intensity of the image was found to increase as a function of the number of cells in the image, which could skew results based on the density of cells within the image. An alternative iteration of the algorithm was designed employing a maximum-intensity benchmark; however, this value was also inconsistent, as high-intensity outliers significantly skewed the data and led to unreliable results. Ultimately, the use of a maximum relevant pixel was adopted. The maximum relevant pixel was found to accurately account for high-intensity outliers, leading to the most consistent results when differentiating monoculture and coculture images.

Several limitations of the algorithm were identified when analyzing coculture images using the percentile-based system to selectively identify distinct types of cells. Occasionally, NIH/3T3 cells would grow contiguously with RAW264.7 cells in a multilayered aggregate, which rendered the image extremely difficult to analyze. Although this was mostly observed for coculture images, monoculture images of cells with abnormal culture conditions or high confluence levels could also contain multilayered cell aggregates. While the algorithm could successfully detect multilayered aggregates as distinct from the background, it was not possible to obtain accurate cell quantification from this 2D image analysis on cell multilayers. Furthermore, cell debris within the image could hamper algorithm accuracy. The use of an area-based analysis served to minimize the error associated with erroneous detection, as small areas of cell debris influence cell counts less in an area-based analysis than in segmentation.

Additionally, the conversion of czi images to 8-bit TIFF, performed to ensure simplicity and uniformity, resulted in pixels of type uint8, i.e., whole numbers from 0 to 255. Thus, intensity differences could only be quantified with a maximum precision of 1/256, or 0.39%. In practice, pixel values typically spanned only from approximately 210 to 250, resulting in a realistic precision of roughly 2.5%. Although this was found to be adequate for monoculture analysis, in which cells are extremely distinct from the background, the selective subtraction step within the coculture analysis required much more precision. Although reasonably accurate data could still be obtained from coculture images, the overall accuracy of coculture analysis was reduced compared to that of monoculture images. Furthermore, the use of an area-based identification system required an average cell area to obtain cell counts. This parameter is useful when dealing with noisy images or with cells of uniform geometries but would hinder quantification for area-variable cells.
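
The precision figures quoted above follow directly from the available intensity levels; a short worked check (the 210-250 span is the typical range reported above):

    % Worked check of the precision estimates for 8-bit images.
    levels             = 256;              % uint8 intensity levels (0-255)
    fullPrecision      = 1/levels;         % ~0.0039, i.e., 0.39% of the full range
    usedRange          = 250 - 210;        % typical span of pixel values in these images
    realisticPrecision = 1/usedRange;      % ~0.025, i.e., ~2.5% of the usable range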

The significance of this algorithm relative to existing methods lies in the use of established methods such as the watershed-transform and morphological image operations to analyze mono- and cocultured cells without fluorescence staining procedures. Although several prior studies have reported methods to successfully distinguish and count cells in the absence of staining, these alternative approaches often required complex culture preparations or less accessible software16. For example, Yilmaz and colleagues used automated counting analysis techniques to quantify immune cells through biochips with immunomagnetic beads on micropads15.

We also designed new and impactful approaches to address non-random noise patterns, such as the 'islands' that appeared at the centers of cells. Identification of different cell lines was also possible by utilizing the varying 'height' values of the cells, which could be inferred from pixel intensities. The use of relative rather than absolute parameters provided several advantages, enabling automation of the algorithm and providing more flexibility in image acquisition, as exact brightness, contrast, or color settings did not need to be preserved. The area-based identification provided accurate results even when cells were clustered, as is common with many cell types. In contrast, manual counting and segmentation are difficult, error-prone, and may require a trained user to identify cells accurately and efficiently.

Future applications and development of the algorithm will focus on fine-tuning and further optimization of the cell-counting procedure. Furthermore, the use of experimental parameters to distinguish different cell types by relative intensity can be verified by testing additional cocultures; for example, the identification of neurons/astrocytes for neurotoxicity analysis17. Further applications can be made in the wound-healing field, where representative proportions of immune cells can play a prominent role in the inflammatory and anti-inflammatory processes. Improvements in cell quantification of macrophages and fibroblasts could provide additional insight into the paracrine versus juxtacrine cell signaling effects relevant for evaluating foreign body response outcomes18,19. In the Supplemental Information, we explore possible approaches for incorporating morphometric parameters into the algorithm to aid in the accuracy of cell identification within coculture systems.

Extensions can also be made for more complex cocultures containing three or more different cell types within images, as opposed to just binary mixtures. Due to the limited precision available for 8-bit images, 16-bit images may be required; these provide additional information but may significantly complicate analysis due to broader intensity ranges and distributions. The goal of this study was to develop a robust protocol for automated cell counting that is optimized for diverse cell culture environments. The algorithm allows for accurate cell counts even in bulbous cell image acquisitions. Its self-correcting iterative code is useful for stain-free brightfield microscopy imaging of immunologically relevant cell culture systems.

Disclosures

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was funded in part by the National Institutes of Health (R01 AR067247) and in part by the Delaware INBRE program, supported by a grant from the National Institute of General Medical Sciences-NIGMS (P20 GM103446) from the National Institutes of Health and the State of Delaware. The contents of the manuscript do not necessarily reflect the views of the funding agencies.

Materials

Name | Company | Catalog Number | Comments
Axio Observer 7 Inverted Microscope | Zeiss | 1028290770 |
β-mercaptoethanol | Life Technologies | 21985023 |
Cell Scrapers | CellTreat | 229310 |
Dulbecco's Modified Eagle Medium | Fisher Scientific | 12430047 |
Dulbecco's PBS | Fisher Scientific | 14190144 |
MATLAB Software | MathWorks | 2021A |
NIH/3T3 Cells | ATCC | ATCC CRL-1658 |
Penicillin–Streptomycin | Sigma Aldrich | P4333-20ML |
RAW264.7 Cells | ATCC | ATCC TIB-71 |
Sodium Bicarbonate | Sigma Aldrich | S6014-25G |
T75 Cell Culture Flask | Corning | CLS3814-24EA |

References

  1. Young, D., Glasbey, C., Gray, A., Martin, N. Identification and sizing of cells in microscope images by template matching and edge detection. Fifth International Conference on Image Processing and its Applications, 1995. , 266-270 (1995).
  2. Zhu, R., Sui, D., Qin, H., Hao, A. An extended type cell detection and counting method based on FCN. Proc. - 2017 IEEE 17th International Conference on Bioinformatics and Bioengineering (BIBE). , 51-56 (2018).
  3. Choudhry, P. High-throughput method for automated colony and cell counting by digital image analysis based on edge detection. PLoS One. 11 (2), 0148469 (2016).
  4. Torre, V., Poggio, T. On edge detection. MIT Artificial Intelligence Lab Memo 768. , 1-9 (1984).
  5. Fuller, M. E., et al. Development of a vital fluorescent staining method for monitoring bacterial transport in subsurface environments. Applied and Environmental Microbiology. 66 (10), 4486-4496 (2000).
  6. Meyer, F. Topographic distance and watershed lines. Signal Processing. 38 (1), 113-125 (1994).
  7. Image Processing Toolbox Documentation. MATLAB_Marker-controlled watershed segmentation. MathWorks. , Available from: https://www.mathworks.com/help/images/marker-controlled-watershed-segmentation.html (2022).
  8. Bala, A. An improved watershed image segmentation technique using MATLAB. International Journal of Scientific and Engineering Research. 3 (6), 1206-1209 (2012).
  9. Glaros, T., Larsen, M., Li, L. Macrophages and fibroblasts during inflammation, tissue damage and organ injury. Frontiers in Bioscience (Landmark Edition). 14, 3988-3993 (2009).
  10. Witherel, C., Abebayehu, D., Barker, T., Spiller, K. Macrophage and fibroblast interactions in biomaterial-mediated fibrosis. Advanced Healthcare Materials. 8 (4), 1801451 (2019).
  11. Urello, M., Kiick, K., Sullivan, M. Integration of growth factor gene delivery with collagen-triggered wound repair cascades using collagen-mimetic peptides. Bioengineering & Translational Medicine. 1 (2), 207-219 (2016).
  12. Image Processing Toolbox Documentation. MATLAB_Detect and measure circular objects in an image. MathWorks. , Available from: https://www.mathworks.com/help/images/detect-and-measure-circular-objects-in-an-image.html (2022).
  13. Atherton, T., Kerbyson, D. Size invariant circle detection. Image and Vision Computing. 17, 795-803 (1999).
  14. Maitra, M., Kumar Gupta, R., Mukherjee, M. Detection and counting of red blood cells in blood cell images using Hough transform. International Journal of Computer Applications. 53 (16), 18-22 (2012).
  15. Uslu, F., Icoz, K., Tasdemir, K., Doğan, R. S., Yilmaz, B. Image-analysis based readout method for biochip: Automated quantification of immunomagnetic beads, micropads and patient leukemia cell. Micron. 133, 102863 (2020).
  16. Uslu, F., Icoz, K., Tasdemir, K., Yilmaz, B. Automated quantification of immunomagnetic beads and leukemia cells from optical microscope images. Biomedical Signal Processing and Control. 49, 473-482 (2019).
  17. Anderl, J., Redpath, S., Ball, A. A neuronal and astrocyte co-culture assay for high content analysis of neurotoxicity. Journal of Visualized Experiments: JoVE. (27), e1173 (2009).
  18. Holt, D., Chamberlain, L., Grainger, D. Cell-cell signaling in co-cultures of macrophages and fibroblasts. Biomaterials. 31 (36), 9382-9394 (2010).
  19. Boddupalli, A., Zhu, L., Bratlie, K. Methods for implant acceptance and wound healing: material selection and implant location modulate macrophage and fibroblast phenotypes. Advanced Healthcare Materials. 5 (20), 2575-2594 (2016).
  20. Wang, Z., Li, H. Generalizing cell segmentation and quantification. BMC Bioinformatics. 18 (1), 189 (2017).
  21. Zou, K. H., et al. Statistical validation of image segmentation quality based on a spatial overlap index. Academic Radiology. 11 (2), 178-189 (2004).

Cite this Article

Borjigin, T., Boddupalli, A., Sullivan, M. O. Area-based Image Analysis Algorithm for Quantification of Macrophage-fibroblast Cocultures. J. Vis. Exp. (180), e63058, doi:10.3791/63058 (2022).
