We present a method that utilizes a generalizable, area-based image analysis approach to determine cell counts. Analysis of different cell populations exploited the significant differences in cell height and structure between distinct cell types within an adaptive algorithm.
Quantification of cells is necessary for a wide range of biological and biochemical studies. Conventional image analysis of cells typically employs either fluorescence detection approaches, such as immunofluorescent staining or transfection with fluorescent proteins, or edge detection techniques, which are often error-prone due to noise and other non-idealities in the image background.
We designed a new algorithm that could accurately count and distinguish macrophages and fibroblasts, cells of different phenotypes that often colocalize during tissue regeneration. MATLAB was used to implement the algorithm, which differentiated distinct cell types based on differences in height from the background. A primary algorithm was developed using an area-based method to account for variations in cell size/structure and high-density seeding conditions.
Non-idealities in cell structures were accounted for with a secondary, iterative algorithm utilizing internal parameters such as cell coverage computed using experimental data for a given cell type. Finally, an analysis of coculture environments was carried out using an isolation algorithm in which various cell types were selectively excluded based on the evaluation of relative height differences within the image. This approach was found to accurately count cells within a 5% error margin for monocultured cells and within a 10% error margin for cocultured cells.
Software is routinely implemented during image analysis techniques to ensure that the results are accurate, efficient, and unbiased. For cell-based assays, a common problem is the misidentification of cells. Images with improper focal and contrast settings may lead to cell blurring, in which the boundary of individual cells becomes hard to identify1. The presence of extraneous image features such as pores, bubbles, or other undesired objects can hamper counting procedures by slowing the counting process and leading to misidentification. Furthermore, cell counting can be onerous, and counting hundreds of replicates can be extremely time-consuming. Moreover, an inherent subjective bias exists during manual counting, and therefore decision-making regarding cell identification is often inaccurate2. Automated software offers exciting potential to bypass all these issues by rapidly and precisely differentiating cells from extraneous objects, including objects far beyond human capacity for precise detection, based on well-defined identification criteria that reduce the influence of investigator bias. Common techniques to identify cells using automated software involve two main methods: segmentation and thresholding3. Herein, we demonstrate a generalizable area-based protocol that enables rapid, accurate, and inexpensive cell counting within a widely accessible software framework.
Segmentation techniques, such as edge detection, seek to isolate individual cells by utilizing intensity differences within an image. Intensity changes that distinguish a cell from the rest of the image most often consist of sharp changes in brightness4. Edge detection involves a regularizing filtering step, followed by a differentiation step in which intensity changes are detected. The differentiation step identifies edges and contours within the image where intensity changes sharply, and these edges and contours are correlated with cell presence. Although images with noise can be run through denoising algorithms4, edge detection techniques are ideally used for analyzing images with low background noise. The process functions optimally when cell boundaries are clearly and easily distinguishable and are not impeded by brightness contours unrelated to cell presence, cell blur, extraneous objects, or defined internal cell structures1,2. If an image is particularly noisy, cells may be further distinguished through fluorescent staining or transfection with fluorescent proteins2,5. Although this significantly improves the accuracy of segmentation techniques, it requires added costs and additional time investments to prepare cell cultures for imaging.
Thresholding techniques involve the division of an image into two categories: the foreground and the background, with cells assigned to the foreground3. These techniques utilize color/contrast changes to define the apparent height of an object; objects that are routinely 'taller' than the background can be easily identified as cells. The watershed-transform functions in this way by associating surfaces with light pixels as the foreground and those with dark pixels as the background6,7. Through height-based identification, thresholding techniques can routinely distinguish noise from desired objects, provided they exist within the same focal plane. When paired with an area-based quantification, a watershed-transform can accurately identify groups of objects in environments where typical segmentation techniques such as edge detection would be inaccurate.
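The thresholding idea described above can be sketched in a few lines. This is a Python/NumPy illustration, not the MATLAB implementation used in this work; the median-based background estimate and the `offset` value are assumptions for the toy example.

```python
import numpy as np

def threshold_foreground(image, offset=30):
    """Split a grayscale image into foreground (cells) and background.

    Pixels brighter than (background level + offset) are treated as
    'taller' foreground objects. `offset` is a hypothetical parameter,
    not a value from the published algorithm.
    """
    background_level = np.median(image)       # estimate of the background plane
    mask = image > background_level + offset  # True where a cell is presumed present
    return mask

# Toy example: a dark background with two bright "cells"
img = np.full((8, 8), 100, dtype=np.uint8)
img[1:3, 1:3] = 200
img[5:7, 5:7] = 210
mask = threshold_foreground(img)
print(int(mask.sum()))  # 8 foreground pixels
```

In practice, the offset would be replaced by the relative, percentile-based criterion discussed later, so that no absolute brightness value needs to be supplied per image.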
Watershed-transforms are commonly coupled with segmentation techniques to prepare images for a cleaner analysis, resulting in higher accuracy of cell counting. For this process, the watershed-transform is used to highlight potential regions of interest prior to segmentation. A watershed-transform provides unique benefits by identifying cells in the foreground of images, which can improve the accuracy of segmentation analysis by removing potential false positives for cells, such as uneven patches of background. However, difficulties can arise when attempting to adapt cell-based images to a watershed-transform. Images with high cell density can be plagued with undersegmentation, in which aggregates of cells are identified as a singular group rather than as individual components. The presence of noise or sharp intensity changes can also result in oversegmentation, in which the algorithm overisolates cells, resulting in excessive and inaccurate cell counts8.
Herein, we detail a method to minimize the primary drawbacks of the watershed-transform by incorporating components of a thresholding analysis within an area-based quantification algorithm, as depicted in Figure 1. Notably, this algorithm was implemented with open-source and/or widely available software, and application of this cell-counting framework was possible without expensive reagents or complex cell preparation techniques. RAW264.7 macrophages were used to demonstrate the method due to their critical role in regulating connective tissue maintenance and wound healing processes9. Additionally, NIH/3T3 fibroblasts were analyzed due to their key role in tissue maintenance and repair. Fibroblast cells often coexist with and support macrophages, generating the need to distinguish these phenotypically distinct cell types in coculture studies.
Cell counts from images with high viable cell density (VCD) could be quantified reliably and efficiently by calculating the area covered by the cells and the average area occupied by a single cell. The use of thresholding as opposed to segmentation for cell identification also enabled more complex analyses, such as experiments in which different cell types in cocultures were analyzed concurrently. NIH/3T3 fibroblasts, which are often found to colocalize with RAW264.7 macrophages within a wound healing site, were found to grow at a focal plane that was distinct from the focal plane of macrophages10. Accordingly, multiple thresholding algorithms were run to define the background and foreground depending on the cell type being analyzed, enabling accurate counting of two different cell types within the same image.
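The area-based quantification amounts to dividing the total foreground area by the average single-cell area. A minimal NumPy sketch (illustrative only; in the paper's workflow the average cell area comes from a Hough-transform estimate of representative cells, whereas here it is supplied directly):

```python
import numpy as np

def area_based_count(mask, avg_cell_area):
    """Estimate cell count as total covered area / average single-cell area.

    Robust to touching cells: a blob of N fused cells still contributes
    N cell-areas' worth of pixels, unlike per-object counting.
    """
    covered = int(np.count_nonzero(mask))
    return covered / avg_cell_area

# Toy binary mask: three touching "cells" of ~25 px each form one 75-px blob,
# which a per-object count would register as a single cell.
mask = np.zeros((20, 20), dtype=bool)
mask[2:7, 2:17] = True  # 5 x 15 = 75 foreground pixels
print(round(area_based_count(mask, avg_cell_area=25)))  # 3
```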
1. Cell culture and image acquisition
2. Image analysis of monocultures, primarily utilizing the "monoculture.m" file
NOTE: The following steps were performed using MATLAB. Three files were used for the MATLAB protocol: "process.m" (Supplemental coding file 1), the file containing the algorithm, "monoculture.m" (Supplemental coding file 2), the file to run for analyzing monoculture images, and "coculture_modified.m" (Supplemental coding file 3), the file to run for analyzing coculture images.
3. Image analysis of cocultures, primarily utilizing the "coculture_modified.m" file
NOTE: The following steps were performed using MATLAB.
Analysis of non-bulbous RAW264.7 macrophages was conducted in a monoculture setting at 25,000 cells/cm2. Representative images were taken of the cell culture and processed in MATLAB following conversion to 8-bit TIFF in ImageJ. Algorithm outputs throughout the process were recorded and documented in Figure 2 for the representative image. In this image, the algorithm counted 226 cells, and this image count was verified by comparison with a manual count that identified 241 cells (6.2% error). Algorithm outputs for at least 10 images of RAW264.7 cells had an average error of 4.5 ± 1.9%.
Analysis of bulbous RAW264.7 macrophages was performed using an iterative algorithm. In most images, the bulbous effect observed was primarily a result of focusing at a focal plane that was slightly above the focal plane of the substrate to which the cells were adherent. Accordingly, most images were acquired in non-bulbous form, for which analysis was significantly easier. The algorithm necessary to analyze bulbous images was developed for scenarios in which suboptimal focusing was impossible to avoid due to meniscus effects or other optical interference. Typical algorithm outputs throughout the process were recorded and documented in Figure 3. In this representative image analysis, the algorithm counted 221 cells within the image, which was verified with a manual count of 252 cells (12.3% error).
Analysis of cocultures containing both RAW264.7 macrophages and NIH/3T3 fibroblasts was conducted to determine the capacity to differentiate between two distinct cell types within a single image. Due to the highly variable cell morphologies of NIH/3T3 fibroblasts, manual/automatic cell counts could not be obtained accurately, and cell coverages for fibroblasts were instead compared qualitatively to the original image. Representative algorithm outputs throughout the process were recorded and documented in Figure 4. For this image, the RAW264.7 macrophage count was 137, and this count was verified with a manual count of 155 (11.6% error). Algorithm outputs for at least 10 RAW264.7 macrophage counts had an average of 7.8 ± 3.9% error.
The robustness of the cell counting algorithm was also verified using a blind study to compare automatic cell identification with manual user counts. Five images were selected at random, with varying cell densities, and these images were blindly counted by three different users. The manual counts were compared with one another and with the results of the automatic cell counting algorithm. The comparison of manual and automated counting results is shown in Table 1. Furthermore, the segmentation accuracy of this method was tested by utilizing 'ground truth' images derived using conventional segmentation techniques. The DICE coefficient20,21 was utilized as a performance metric, with an average value of 0.85 across five images. An example overlay can be seen in Figure 5.
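For reference, the DICE coefficient compares two binary segmentation masks as 2|A∩B| / (|A| + |B|). A minimal NumPy sketch (illustrative only, not the evaluation code used in this study):

```python
import numpy as np

def dice_coefficient(pred, truth):
    """Dice similarity between two binary masks: 2|A∩B| / (|A| + |B|)."""
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    intersection = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    return 1.0 if total == 0 else 2.0 * intersection / total

# Two 16-px squares offset by one pixel: 9 px overlap -> 2*9/32 = 0.5625
truth = np.zeros((10, 10), dtype=bool); truth[2:6, 2:6] = True
pred = np.zeros((10, 10), dtype=bool); pred[3:7, 3:7] = True
print(dice_coefficient(pred, truth))  # 0.5625
```

A value of 1.0 indicates perfect agreement with the ground truth; the 0.85 average reported above indicates substantial overlap between this method and conventional segmentation.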
Figure 1: Generalized area-based algorithm. A process in which an image is prepared for segmentation analysis through the watershed-transform to determine regional maxima and minima10. The proposed method modifies the process (original process in black) by skipping the watershed ridge line determination and subsequent steps, adopting instead a percentile-based system coupled with an area-based quantification process (highlighted in red).
Figure 2: RAW264.7 non-bulbous algorithm output. The figure shows the intermediate image processing steps leading to the quantification of non-bulbous RAW264.7 macrophage counts. (A) Initial processing of image to grayscale in ImageJ. The original image converted to 8-bit TIFF and grayscale in ImageJ. (B) Postprocessing of grayscale image. The postprocessed image of A following opening and closing by reconstruction, as described in step 2.1.1. (C) Postbinarization of processed image. Panel B post binarization, performed as described in step 2.2. (D) Final image with representative cells for average area calculations. The image with blue circles indicating the cells used within the Hough-transform to identify the average area of a cell, as described in step 2.3. Scale bars = 20 µm.
Figure 3: RAW264.7 bulbous algorithm output. The figure shows the intermediate image processing steps leading to the quantification of bulbous RAW264.7 macrophage counts. (A) Image of bulbous RAW264.7 macrophages. The original image converted to 8-bit TIFF and grayscale in ImageJ. (B) Postprocessing of grayscale image. The postprocessed image of A following opening and closing by reconstruction, as described in step 2.1.1. (C) Post binarization of processed image with clear islands. The typical output without the iterative algorithm when analyzing bulbous images, with 'islands' of black pixels at the centers of the cells. The blue circles represent the cells used within the Hough-transform to identify the average area of a cell, as described in step 2.3. (D) Final image with representative cells, islands filled in using the iterative algorithm. The image post iterative algorithm, as described in step 2.2.2. Scale bars = 20 µm.
Figure 4: RAW264.7 and NIH/3T3 coculture algorithm output. The figure shows the intermediate image processing steps leading to the quantification of coculture images containing RAW264.7 macrophages and NIH/3T3 fibroblasts. (A) Image of NIH/3T3 and RAW264.7 cells. The original image converted to 8-bit TIFF and grayscale in ImageJ. (B) Postprocessing of grayscale image. The postprocessed image of A following opening and closing by reconstruction, as described in step 3.1.1. (C) Postbinarization of processed image with clear selection of macrophages based on height-based isolation. The selective isolation of RAW264.7 macrophages, as described in step 3.2.2. (D) Final image with clusters of macrophages and fibroblasts identified. The entire image containing both RAW264.7 macrophages and NIH/3T3 fibroblasts, as described in step 3.3. A sample macrophage cell is outlined with a green circle, and a sample fibroblast cell is outlined with a red oval. Scale bars = 20 µm.
Figure 5: DICE parameter-segmentation performance, RAW264.7 macrophages. The image shows an overlay between the 'ground truth' segmentation approach and this method. The white regions are cells detected by both the 'ground truth' and this method, while the purple and green regions are false negatives and false positives, respectively. The 'ground truth' segmentation was obtained by common segmentation techniques and uses region-of-interest tools to correct for erroneous segmentation.
Image | Counter 1 | Counter 2 | Counter 3 | Automatic | Manual Counts Average (±STDEV) | Error Compared with Automatic
Image 1 | 151 | 148 | 145 | 142 | 148 ± 3.0 | 4.22%
Image 2 | 164 | 166 | 168 | 173 | 166 ± 2.0 | 4.05%
Image 3 | 255 | 253 | 245 | 239 | 251 ± 5.3 | 5.02%
Image 4 | 153 | 152 | 157 | 166 | 154 ± 2.6 | 7.22%
Image 5 | 103 | 106 | 100 | 111 | 103 ± 3.0 | 7.20%
Table 1: Manual/automatic cell counts and robustness test.
Supplemental Information: Morphological differences in image analysis of macrophage and fibroblast cocultures.
Supplemental coding file 1: "process.m", the MATLAB file necessary to run the algorithms. No manual actions are required; this file contains the algorithm called by the other two scripts.
Supplemental coding file 2: "monoculture.m", the MATLAB file utilized for analyzing monoculture images.
Supplemental coding file 3: "coculture_modified.m", the MATLAB file utilized for analyzing coculture images.
We designed a general area-based procedure that accurately and efficiently counted cells on the basis of cell height, allowing for stain-free quantitation of cells even in coculture systems. Critical steps for this procedure included the implementation of a relative intensity system by which cells could be differentiated. The use of a relative height analysis served two purposes: external parameters became unnecessary, as relative parameters were constant for a given cell type, and absolute brightness/contrast levels did not need to be entered for every image. Furthermore, the use of a percentile-based system enabled the analysis of multiple cell types within the same image, provided that the different cell types resided at unique heights within the image.
The percentile-based system was defined as a difference from the maximum pixel value as opposed to the average. This was done to prevent skewing of the algorithm based on the average intensity value, which was dependent on the number and confluence of cells present in the image. Furthermore, the 'maximum relevant pixel' was implemented to prevent maximum-intensity outliers from significantly affecting the data; to qualify, a maximum relevant pixel value needed to represent at least 0.5% of the pixel population (an experimentally determined cutoff). Additional steps included the use of an area-based quantification system rather than counting individual entities. This ensured that cell fragmentation or erroneous binarization would be minimized when counting cells, as the total affected area was minimal compared to the area of a cell.
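The 'maximum relevant pixel' rule can be illustrated as follows. This is a Python/NumPy sketch, not the authors' MATLAB code; the 0.5% cutoff is the experimentally chosen value from the text, while the image values are invented for the example.

```python
import numpy as np

def max_relevant_pixel(image, min_fraction=0.005):
    """Return the brightest intensity whose pixel count is at least
    `min_fraction` (0.5%) of the image, so that a handful of bright
    outlier pixels cannot define the maximum."""
    counts = np.bincount(image.ravel(), minlength=256)  # 8-bit histogram
    min_count = min_fraction * image.size
    relevant = np.nonzero(counts >= min_count)[0]
    return int(relevant.max())

# 100 x 100 image: background at 100, 200 pixels at 230, 3 outliers at 255
img = np.full((100, 100), 100, dtype=np.uint8)
img.ravel()[:200] = 230
img.ravel()[200:203] = 255
print(max_relevant_pixel(img))  # 230: the 255 outliers cover only 0.03%
```

Thresholds defined as a percentile offset from this value stay stable across images, unlike thresholds tied to the raw maximum or to the cell-density-dependent mean.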
Various troubleshooting methods were implemented within this algorithm. The initial efforts to quantify images of cell monocultures required methods to accurately identify both bulbous and non-bulbous cells. A unique phenomenon was observed for the bulbous cell images, in that the centers of the cells were identified as background following the watershed-transform, and these regions appeared as islands within the cell. A thorough analysis revealed that the missing cell centers were caused by an excessively convex structure within the image, which resulted in an inversion in the watershed-transform. A solution was implemented specifically for bulbous images, involving an iterative process known as plugging, in which the islands were filled in as true cell values. In this procedure, the cell coverage and extent of island formation were used to assess the accuracy of the image. Generally, the cell coverage was used to determine the extent of island formation, as an unusually low cell coverage was most typically caused by a high level of island formation. The relationship between cell coverage and the extent of island formation was determined through experimental data.
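The plugging step can be approximated with a flood fill from the image border: background regions fully enclosed by foreground are reclassified as cell pixels. This is a pure-NumPy sketch; the published MATLAB code applies the fill iteratively with cell-coverage checks, which are omitted here.

```python
import numpy as np

def plug_islands(mask):
    """Fill interior 'islands' of background inside cells.

    Background pixels reachable from the image border stay background;
    any enclosed background region is converted to foreground.
    """
    h, w = mask.shape
    outside = np.zeros_like(mask, dtype=bool)
    # Seed the flood fill with every background pixel on the border
    stack = [(r, c) for r in range(h) for c in (0, w - 1) if not mask[r, c]]
    stack += [(r, c) for c in range(w) for r in (0, h - 1) if not mask[r, c]]
    while stack:
        r, c = stack.pop()
        if outside[r, c] or mask[r, c]:
            continue
        outside[r, c] = True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w:
                stack.append((nr, nc))
    return mask | ~outside  # foreground plus any enclosed holes

# "Bulbous cell" whose binarized center was misclassified as background
mask = np.zeros((9, 9), dtype=bool)
mask[2:7, 2:7] = True
mask[4, 4] = False          # the island at the cell center
plugged = plug_islands(mask)
print(bool(plugged[4, 4]))  # True: the island is filled
```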
Additional troubleshooting was necessary based on the observation that using an 'average pixel' threshold (defined as the mean pixel intensity within the image) to differentiate cells, rather than a maximum pixel, led to inconsistencies. The average intensity of the image was found to increase as a function of the number of cells in the image, which could skew results based on the density of cells within the image. An alternative iteration of the algorithm was designed employing a maximum-intensity benchmark; however, this value was also inconsistent, as high-intensity outliers significantly skewed the data and led to unreliable results. Ultimately, the use of a maximum relevant pixel was adopted. The maximum relevant pixel was found to accurately account for high-intensity outliers, leading to the most consistent results when differentiating monoculture and coculture images.
Several limitations of the algorithm were identified when analyzing coculture images using the percentile-based system to selectively identify distinct types of cells. Occasionally, NIH/3T3 cells would grow contiguously with RAW264.7 cells in a multilayered aggregate, which rendered the image extremely difficult to analyze. Although this was mostly observed for coculture images, monoculture images of cells with abnormal culture conditions or high confluence levels could also contain multilayer cell aggregates. While the algorithm could successfully detect multilayered aggregates as unique from the background, it was not possible to obtain accurate cell quantification using this 2D image analysis on cell multilayers. Furthermore, cell debris within the image could hamper algorithm accuracy. The use of an area-based analysis algorithm served to minimize the error associated with erroneous detection, as small areas of cell debris would influence the cell counts less using an area-based analysis than segmentation.
Additionally, the conversion of czi images to 8-bit TIFF, performed to ensure simplicity and uniformity, resulted in pixel types of uint8, whole numbers from 0 to 255. Thus, differences could only be quantified at a maximum precision of 1/256, or 0.39%. Generally, pixel intensities only spanned values between 210 and 250, resulting in a realistic precision of 1/40, or 2.5%. Although this was found to be adequate for monoculture analysis, in which cells are extremely distinct from the background, the selective subtraction step within the coculture analysis required much more precision. Although reasonably accurate data could still be obtained from coculture images, the overall accuracy of coculture analysis was reduced compared to the accuracy of analysis from monoculture images. Furthermore, the use of an area-based identification system required an average cell area to obtain cell counts. This parameter is useful when dealing with noisy images or with cells of uniform geometries but would hinder quantification for area-variable cells.
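The precision argument above can be reproduced in a few lines (assuming the stated 210 to 250 intensity spread):

```python
# Relative precision of uint8 intensities: one intensity step over the
# range actually used by the image.
full_range_precision = 1 / 256             # ~0.39% over the full 0-255 range
observed_spread = 250 - 210                # typical spread in these cell images
realistic_precision = 1 / observed_spread  # 1/40 = 2.5%
print(f"{full_range_precision:.2%}, {realistic_precision:.2%}")  # 0.39%, 2.50%
```

A 16-bit image would narrow each step to 1/65536 of the range, which motivates the 16-bit extension discussed below at the cost of more complex intensity distributions.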
The significance of this algorithm as opposed to existing methods lies in the use of established methods, such as the watershed-transform and morphological image operations, to analyze mono- and cocultured cells without fluorescence staining procedures. Although several prior studies have reported methods to successfully distinguish and count cells in the absence of staining, these alternative approaches often required complex culture approaches or inaccessible software16. For example, Yilmaz and colleagues used automated counting analysis techniques to quantify immune cells through biochips with immunomagnetic beads on micropads15.
We also designed new and impactful approaches to address non-random noise patterns, such as 'islands,' that appeared at the centers of cells. Identification of different cell lines was also possible by utilizing the varying 'height' values of the cells, which could be inferred from pixel intensities. The use of relative parameters rather than absolute parameters provided several advantages by enabling automation of the algorithm and providing more flexibility for the acquisition of images, as exact brightness, contrast, or color settings did not need to be preserved. The area-based identification provided accurate results even when cells were clustered, as is common with many cell types. In contrast, manual counting and segmentation are difficult, error-prone, and may require a trained user to identify cells accurately and efficiently.
Future applications and development of the algorithm will focus on fine-tuning and further optimization of the cell-counting procedure. Furthermore, the use of experimental parameters to distinguish different cell types by relative intensity can be verified by testing of further cocultures; for example, the identification of neurons/astrocytes for neurotoxicity analysis17. Further applications can be made in the wound-healing field, where representative proportions of immune cells can play a prominent role in the inflammatory and anti-inflammatory processes. Improvements in cell quantification of macrophages and fibroblasts could provide additional insight into the paracrine versus juxtacrine cell signaling effects relevant for evaluating foreign body response outcomes18,19. In Supplemental Information, we explore the possible mechanisms of incorporating morphometric parameters in the algorithm to aid in the accuracy of cell identification within coculture systems.
Extensions can also be made for more complex cocultures, which would focus on three or more different cell types within an image, as opposed to binary mixtures. Due to the limited precision available for 8-bit images, 16-bit images may be required; these provide additional information but may significantly complicate analysis due to wider intensity ranges and distributions. The goal of this study was to develop a robust protocol for automated cell counting that is optimized for diverse cell culture environments. The algorithm allows for accurate cell counts even in bulbous cell image acquisitions. Its self-correcting iterative code is useful for stain-free brightfield microscopy imaging of immunologically relevant cell culture systems.
The authors have nothing to disclose.
This work was funded in part by the National Institutes of Health (R01 AR067247) and in part by the Delaware INBRE program, supported by a grant from the National Institute of General Medical Sciences-NIGMS (P20 GM103446) from the National Institutes of Health and the State of Delaware. The contents of the manuscript do not necessarily reflect the views of the funding agencies.
Axio Observer 7 Inverted Microscope | Zeiss | 1028290770 | |
β-mercaptoethanol | Life Technologies | 21985023 | |
Cell Scrapers | CellTreat | 229310 | |
Dulbecco's Modified Eagle Medium | Fisher Scientific | 12430047 | |
Dulbecco's PBS | Fisher Scientific | 14190144 | |
MATLAB Software | MathWorks | 2021A | |
NIH/3T3 Cells | ATCC | ATCC CRL – 1658 | |
Penicillin–Streptomycin | Sigma Aldrich | P4333-20ML | |
RAW264.7 Cells | ATCC | ATCC TIB – 71 | |
Sodium Bicarbonate | Sigma Aldrich | S6014-25G | |
T75 Cell Culture Flask | Corning | CLS3814-24EA |