Research Article
Erratum Notice
Important: There has been an erratum issued for this article. View Erratum Notice
This protocol describes the integration of an AI-guided atomization sensing system that monitors respiration, temperature, and motion to adapt health education content in real-time during vocational physical education, and evaluates its effects through a six-week quasi-experiment.
This study evaluates an AI-guided atomization sensing system designed to integrate adaptive health education into vocational physical education. A quasi-experimental design was conducted with 356 participants allocated to either an intervention group (n = 178) or a control group (n = 178). The system used wearable atomization sensors (monitoring respiratory patterns, ambient temperature, and motion) connected to an AI engine that applied rule-based and machine-learning logic to adjust instructional content in real time; for example, reducing complexity or switching to audiovisual demonstrations when signs of fatigue were detected. Sessions were delivered in classroom settings twice per week for six weeks, with surveys administered immediately before and after the intervention. Five outcomes were assessed using validated instruments: knowledge retention, engagement, behavioral intention, self-efficacy, and technology acceptance. Statistical analyses included ANOVA, paired-sample t-tests, Pearson correlations, and χ² tests, with results reported as F(df1, df2), p, η², and 95% confidence intervals where applicable. The intervention group achieved significantly greater improvements than the control group in knowledge retention (Δ = +21.7 vs. +10.6; F(1, 354) = 46.21, p < 0.001, η² = 0.21), engagement (F(1, 354) = 39.87, p < 0.001, η² = 0.18), behavioral intention (F(1, 354) = 42.55, p < 0.001, η² = 0.19), self-efficacy (F(1, 354) = 27.63, p < 0.001, η² = 0.13), and technology acceptance (F(1, 354) = 35.12, p < 0.001, η² = 0.17). These findings demonstrate that combining real-time sensing with AI-guided decision support provides a reproducible, adaptive framework for enhancing health education outcomes in vocational PE settings.
Health education is a cornerstone of modern public health, improving individuals' knowledge, attitudes, and health-related behaviors1. Strengthening self-management skills enables people to make informed decisions, adopt preventive practices, and engage in healthier lifestyles2. However, conventional delivery methods, such as classroom lectures, printed materials, and generic online modules, often provide limited interactivity and personalization3. These approaches typically fail to respond to learners' real-time needs, resulting in modest engagement and constrained behavioral change4.
In recent years, technology-enhanced models have sought to address these limitations. Mobile e-learning platforms can expand access but usually deliver standardized content without physiological adaptivity5. Wearable-only feedback systems capture motion or attention levels but rarely connect sensor data to instructional design6. Existing AI-based education tools demonstrate value in analyzing engagement patterns and providing automated feedback, yet most implementations rely on static resources and do not adapt dynamically to learners' states7. As a result, there remains a gap in integrating real-time sensing with adaptive instructional methods that can tailor both content and delivery to the learner context.
To bridge this gap, we introduce an AI-guided atomization sensing system that integrates multi-parameter physiological and environmental monitoring into vocational physical education. The system employs wearable sensors capable of detecting respiratory patterns, ambient temperature, and motion signals at a sampling rate of 50 Hz, with a latency of approximately 200 ms, embedded in a lightweight wearable device8. Data streams are processed by an AI engine that combines rule-based decision logic with machine-learning models to detect indicators of fatigue or stress9. When such triggers occur, the system adapts instructional delivery by, for example, simplifying content complexity or switching to audiovisual demonstrations, thereby sustaining motivation and comprehension10. Unlike isolated mobile platforms or sensor-only frameworks, this approach unifies sensing, decision-making, and adaptive teaching workflows within a reproducible protocol.
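The trigger-and-adapt logic described above can be sketched in Python, the language of the study's AI engine. The thresholds, class names, and instructional modes below are illustrative assumptions for exposition, not values taken from the protocol:

```python
from dataclasses import dataclass

@dataclass
class SensorSample:
    """One reading from the wearable: respiration (breaths/min),
    ambient temperature (deg C), motion (normalized activity level)."""
    respiration: float
    temperature: float
    motion: float

def fatigue_triggered(sample: SensorSample) -> bool:
    """Rule-based fatigue check: elevated respiration combined with
    low motion is treated as a fatigue indicator (thresholds illustrative)."""
    return sample.respiration > 24.0 and sample.motion < 0.2

def adapt_delivery(sample: SensorSample, current_mode: str) -> str:
    """Switch to an audiovisual demonstration when fatigue is detected;
    otherwise keep the current instructional mode."""
    return "audiovisual_demo" if fatigue_triggered(sample) else current_mode

# A fatigued learner (fast breathing, little movement) triggers a mode switch
print(adapt_delivery(SensorSample(28.0, 22.5, 0.1), "text_lecture"))
```

In the full system, rules of this kind run alongside a machine-learning classifier on streaming 50 Hz sensor windows; this sketch shows only the decision shape.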
Evidence from related domains supports the potential of this integration. Studies in medical and vocational education show that AI-based analytics improve knowledge retention and engagement when adaptive feedback is provided11,12. Sensor-enabled learning environments and IoT-driven healthcare frameworks have achieved real-time prediction accuracy improvements of up to 36.9%, highlighting the benefits of multi-parameter data streams for personalized interventions13,14. Research on wearable and intelligent monitoring networks further demonstrates improved engagement and spatial awareness by capturing behavioral and contextual indicators15,16. In the context of vocational physical education, personalized instructional models have significantly outperformed conventional teaching approaches (80% vs. 32.5%) in fostering health-promoting behaviors17. Meanwhile, atomization sensing technologies, initially developed for aerosol therapies and environmental monitoring, have recently been adapted for educational use, enabling high-resolution physiological data capture in real time18,19. Despite these promising directions, most existing systems remain limited to static e-learning or isolated wearable applications, leaving the integration of atomization sensing and AI-driven adaptivity underexplored.
The present study addresses this gap by proposing a reproducible protocol that combines wearable atomization sensing with AI-guided instructional adaptation in vocational physical education. Specifically, the protocol evaluates its impact on five key outcomes: knowledge retention, engagement, behavioral intention, self-efficacy, and technology acceptance.
This study was reviewed and approved by the Institutional Review Board of Shijiazhuang Institute of Railway Technology. All procedures complied with the Declaration of Helsinki and relevant local regulations. The equipment and software used are listed in the Table of Materials.
1. Methodology
This study used a quasi-experimental design to evaluate an AI-guided atomization sensing system for digital health education in higher-education settings. Data were collected before and after the intervention to assess knowledge retention, engagement, behavioral intention, self-efficacy, and technology acceptance using validated instruments adapted from prior work. The overall workflow of sensing, AI decision-making, and adaptive instructional delivery is illustrated in Figure 1, which outlines how physiological and environmental signals are integrated into the rule-based and machine-learning components to adjust content in real time.
2. Statistical analyses
Data analysis was performed using SPSS version 25, with significance set at p < 0.05. Four statistical approaches were applied: (1) one-way ANOVA for between-group comparisons of post-test scores; (2) paired-sample t-tests for within-group pre- to post-test changes; (3) Pearson correlations for associations among the five continuous outcomes; and (4) chi-square (χ²) tests for associations between group allocation and categorical outcomes.
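For readers working outside SPSS, the four analyses can be reproduced with SciPy. All numbers below are simulated for demonstration only and are not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic post-test scores for two groups of 178 (illustrative only)
intervention = rng.normal(12.8, 1.4, 178)
control = rng.normal(10.3, 1.9, 178)

# (1) One-way ANOVA on post-test scores (with two groups, F = t^2)
f_val, p_anova = stats.f_oneway(intervention, control)

# (2) Paired-sample t-test on pre- vs post-test scores within one group
pre = rng.normal(62.5, 8.4, 178)
post = pre + rng.normal(21.7, 5.0, 178)
t_val, p_paired = stats.ttest_rel(post, pre)

# (3) Pearson correlation between two continuous outcomes
r_val, p_corr = stats.pearsonr(intervention, control)

# (4) Chi-square test of independence on a 2 x 3 contingency table
table = np.array([[20, 60, 98],
                  [45, 80, 53]])  # group x outcome-category counts
chi2, p_chi, df, _ = stats.chi2_contingency(table)

print(f"ANOVA F={f_val:.2f}, paired t={t_val:.2f}, r={r_val:.2f}, chi2({df})={chi2:.2f}")
```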
3. Participants
A total of 356 participants aged 18-45 years (M = 26.7, SD = 5.4) were recruited from higher education institutions, met the inclusion criteria for age and basic technological literacy, and provided informed consent. Participants were assigned by random-number table to an intervention group (n = 178) receiving adaptive, AI-guided instruction or a control group (n = 178) receiving conventional modules without personalization. Table 1 summarizes baseline characteristics; the distribution of demographic characteristics across participant subgroups is visualized in Figure 2, which complements the tabulated statistics and confirms group comparability at baseline.
4. Apparatus and materials
The platform integrated wearable sensing of respiratory patterns, ambient temperature, and motion with an AI engine combining rule-based logic and SVM classification to trigger adaptive delivery upon signs of fatigue or reduced activity. Sessions were conducted in standardized multimedia classrooms (~50 m²) equipped with projectors and sensor base stations, lasted 45 min, and were delivered twice weekly over six weeks.
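A minimal sketch of the SVM component with scikit-learn, assuming per-window features of mean respiration, ambient temperature, and motion level. The training data, feature choices, and class boundaries here are illustrative assumptions, not the study's trained model:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Synthetic training windows: [respiration (breaths/min), temp (C), motion]
# Label 1 = fatigued (high respiration, low motion), 0 = normal.
normal = np.column_stack([rng.normal(16, 2, 200),
                          rng.normal(22, 1, 200),
                          rng.normal(0.6, 0.1, 200)])
fatigued = np.column_stack([rng.normal(27, 2, 200),
                            rng.normal(23, 1, 200),
                            rng.normal(0.15, 0.05, 200)])
X = np.vstack([normal, fatigued])
y = np.array([0] * 200 + [1] * 200)

# Scale features, then fit an RBF-kernel SVM fatigue classifier
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)

# A positive prediction on a new sensor window would trigger
# the adaptive-delivery switch described in the protocol
window = np.array([[28.0, 22.5, 0.1]])
print("fatigued" if clf.predict(window)[0] == 1 else "normal")
```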
5. Measurement instruments
All self-report instruments were administered on a 5-point Likert scale (1 = strongly disagree to 5 = strongly agree), with higher scores indicating stronger endorsement of the target construct. Items were positively keyed unless otherwise specified, and scale scores were computed as the mean of their constituent items.
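Scale scoring as described, i.e., the mean of a scale's constituent items, can be expressed compactly. The reverse-item handling below is included only for completeness, since items here were positively keyed unless otherwise specified:

```python
import numpy as np

def scale_score(item_responses, reverse_items=()):
    """Score a 5-point Likert scale as the mean of its items.
    Any reverse-keyed items are recoded as 6 - response before averaging."""
    items = np.array(item_responses, dtype=float)
    for i in reverse_items:
        items[..., i] = 6 - items[..., i]
    return items.mean(axis=-1)

# One participant's responses to a 5-item scale
print(scale_score([4, 5, 4, 3, 4]))  # prints 4.0
```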
6. Knowledge retention
A 15-item multiple-choice test aligned with the instructional content was used to assess knowledge (total score range 0-15). Higher scores reflected greater knowledge acquisition. An example item was: "What is the primary energy source during aerobic exercise?"
7. Behavioral intention
Behavioral intention was measured with a 3-item instrument adapted from the Theory of Planned Behavior20. A representative item was: "I intend to adopt the health practices introduced in class during the following week."
8. Engagement
Engagement was evaluated with a 5-item instrument adapted from a validated framework of learning engagement capturing attention, interaction, emotional involvement, persistence, and learning satisfaction21. An illustrative statement was: "I was able to stay focused during the learning process."
9. Self-efficacy
Self-efficacy was assessed with a 4-item instrument grounded in the established conceptualization of perceived capability. A sample item was: "I am confident that I can independently complete the health exercises demonstrated in class."
10. Technology acceptance
Technology acceptance was measured with a 6-item adaptation of the Unified Theory of Acceptance and Use of Technology, covering performance expectancy, effort expectancy, social influence, and facilitating conditions. A representative item was: "Using this system can enhance my learning performance."
All instruments demonstrated acceptable to strong internal consistency in the present study: α = 0.87 (Knowledge), 0.82 (Behavioral Intention), 0.85 (Engagement), 0.84 (Self-efficacy), and 0.89 (Technology Acceptance).
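Internal-consistency coefficients of this kind are conventionally computed as Cronbach's alpha from an item-response matrix. A self-contained sketch on illustrative (not study) data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x k_items) response matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Illustrative 5-item Likert responses from six participants
data = np.array([
    [4, 4, 5, 4, 4],
    [3, 3, 3, 4, 3],
    [5, 5, 4, 5, 5],
    [2, 2, 3, 2, 2],
    [4, 5, 4, 4, 4],
    [3, 3, 3, 3, 3],
])
print(round(cronbach_alpha(data), 2))  # prints 0.96
```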
Experimental results
To assess the effectiveness of the AI-driven atomization sensing system, multiple statistical analyses were conducted.
The ANOVA findings revealed clear group differences across all five outcomes, with participants in the intervention group achieving consistently higher post-test scores than those in the control group (Table 2; Figure 3).
When within-group changes were examined through paired-sample t-tests, the intervention group showed substantial improvements across all measures (Table 3; Figure 4), whereas the control group demonstrated only modest progress (Table 4; Figure 5). This contrast highlights the greater impact of the AI-enhanced system relative to conventional instruction.
The pattern was further supported by chi-square analyses, which demonstrated significant associations between intervention exposure and categorical outcomes such as knowledge retention, engagement, self-efficacy, behavioral intention, and technology acceptance (Table 5; Figure 6). In the control group, some associations also emerged, but the effects were notably weaker, and self-efficacy did not reach statistical significance (Table 6; Figure 7).
At the relational level, Pearson correlation analyses highlighted positive and significant associations among all outcome variables. The intervention group not only exhibited stronger correlations overall but also showed particularly close links between engagement and self-efficacy, suggesting that active involvement enhanced learners' confidence (Table 7; Figure 8). In the control group, correlations were positive but consistently weaker (Table 8; Figure 9).
Taken together, these results provide convergent evidence that the AI-enabled system enhanced knowledge acquisition, strengthened self-efficacy, promoted behavioral intention, and increased both engagement and technology acceptance compared with standard health education modules.
ANOVA results
The analysis of variance revealed consistent group differences across all outcome variables (Table 2). Participants in the intervention group achieved notably higher post-test knowledge scores (M = 12.84, SD = 1.42) than those in the control group (M = 10.27, SD = 1.89), F(1, 354) = 18.73, p < 0.001. A similar pattern emerged for engagement, where the intervention group scored more favorably (M = 4.32, SD = 0.56) than the control group (M = 3.48, SD = 0.73), F(1, 354) = 12.56, p = 0.0004. Behavioral intention, self-efficacy, and technology acceptance also showed significant advantages for the intervention group, with medium to large effect sizes. These findings highlight the broad impact of the AI-driven system across cognitive, behavioral, and attitudinal dimensions of learning (Figure 3).
Paired t-test results
To examine within-group changes, paired-sample t-tests were conducted for both groups. In the intervention group, improvements were substantial across all measures (Table 3). Knowledge scores rose from 62.5 ± 8.4 at pre-test to 84.2 ± 7.9 at post-test (Δ = +21.7), t = 15.28, p < 0.001. Behavioral intention, engagement, self-efficacy, and technology acceptance all demonstrated similarly strong gains, each reaching high levels of statistical significance (Figure 4). By contrast, the control group showed only modest improvements (Table 4). Knowledge increased from 61.8 ± 8.6 to 72.4 ± 7.8 (Δ = +10.6), t = 9.34, p < 0.001, while behavioral intention and technology acceptance improved slightly but remained at lower levels than in the intervention group; self-efficacy also showed a small increase with a limited effect size (Figure 5). Together, these results emphasize the more pronounced benefits of adaptive, AI-driven delivery.
Chi-square results
Associations between group allocation and categorical outcomes were then examined. For the intervention group, significant associations were observed across all five variables: knowledge retention (χ² = 18.67, df = 2, p < 0.001), behavioral intention (χ² = 14.52, df = 2, p < 0.001), engagement (χ² = 10.77, df = 2, p = 0.0046), self-efficacy (χ² = 9.84, df = 2, p = 0.0073), and technology acceptance (χ² = 13.38, df = 2, p = 0.0012), indicating that exposure to the AI-enhanced system consistently shifted categorical responses in a favorable direction (Table 5; Figure 6). In the control group, significant associations appeared for knowledge retention (χ² = 10.14, df = 2, p = 0.0063), behavioral intention (χ² = 8.29, df = 2, p = 0.0159), engagement (χ² = 6.78, df = 2, p = 0.0337), and technology acceptance (χ² = 7.65, df = 2, p = 0.0219), whereas self-efficacy did not reach significance (χ² = 5.24, df = 2, p = 0.0728), suggesting limited confidence gains without personalization (Table 6; Figure 7).
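The protocol does not state how the continuous scales were categorized for the χ² tests, but df = 2 implies three categories per outcome. A hypothetical sketch, assuming low/medium/high cut points on the 1-5 scales and simulated scores:

```python
import numpy as np
from scipy.stats import chi2_contingency

def categorize(scores, low=3.0, high=4.0):
    """Bin continuous 1-5 scale scores into 0=low, 1=medium, 2=high.
    Cut points are illustrative; the protocol's exact bins are unspecified."""
    return np.digitize(scores, [low, high])

rng = np.random.default_rng(2)
intervention = categorize(rng.normal(4.3, 0.5, 178).clip(1, 5))
control = categorize(rng.normal(3.6, 0.5, 178).clip(1, 5))

# Build the 2 x 3 group-by-category contingency table and test association
table = np.array([np.bincount(intervention, minlength=3),
                  np.bincount(control, minlength=3)])
chi2, p, df, _ = chi2_contingency(table)
print(f"chi2({df}) = {chi2:.2f}, p = {p:.4f}")
```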
Pearson correlation results
Interrelationships among the five continuous outcomes were further examined using Pearson correlations. In the intervention group, all correlations were positive and statistically significant at p < 0.01: knowledge retention correlated with behavioral intention (r = 0.612), engagement (r = 0.588), self-efficacy (r = 0.605), and technology acceptance (r = 0.631); behavioral intention was strongly related to engagement (r = 0.662), self-efficacy (r = 0.648), and technology acceptance (r = 0.679); engagement and self-efficacy exhibited the highest association (r = 0.701), suggesting that active participation reinforced perceived competence (Table 7; Figure 8). In the control group, correlations were also positive but generally weaker, ranging from r = 0.451 to r = 0.556; knowledge retention correlated most strongly with technology acceptance (r = 0.504), and engagement related most to self-efficacy (r = 0.556), with all magnitudes below those in the intervention group (Table 8; Figure 9).
DATA AVAILABILITY:
The de-identified dataset underlying this article is publicly available at Figshare: https://doi.org/10.6084/m9.figshare.30194968.v1

Figure 1: Workflow of AI-driven intelligent atomization sensing technology in health education. Block diagram of the end-to-end pipeline showing wearable sensing (respiration, ambient temperature, motion), AI decision logic (rules + SVM), and adaptive content delivery; arrows indicate data flow and trigger conditions for instructional adjustments.

Figure 2: Distribution of demographic characteristics across participant subgroups. Grouped bar or density plots summarizing gender, age, education level, and digital literacy for the intervention (n = 178) and control (n = 178) groups, illustrating baseline comparability.

Figure 3: ANOVA results comparing educational variables between intervention and control groups. Post-test means (± error bars) for knowledge retention, behavioral intention, engagement, self-efficacy, and technology acceptance, with between-group differences tested by one-way ANOVA.

Figure 4: Paired-sample t-test outcomes for key variables in the intervention group. Pre- vs post-intervention scores for all five outcomes (means ± error bars), with paired t-tests indicating within-group improvements under AI-guided delivery.

Figure 5: Paired-sample t-test outcomes for key variables in the control group. Pre- vs post-scores for the control group (means ± error bars), showing modest gains under conventional modules, tested with paired t-tests.

Figure 6: Chi-square analysis results for the intervention group. Stacked or clustered categorical distributions for each outcome and corresponding χ² statistics, demonstrating significant associations with exposure to the AI-enhanced system.

Figure 7: Chi-square analysis results for the control group. Categorical outcome distributions and χ² statistics for the control group, indicating weaker associations and a nonsignificant pattern for self-efficacy.

Figure 8: Pearson correlation results among variables in the intervention group. Correlation matrix/heatmap for the five continuous outcomes, highlighting stronger positive associations, especially between engagement and self-efficacy, under AI-guided instruction.

Figure 9: Pearson correlation results among variables in the control group. Correlation matrix/heatmap showing uniformly positive but weaker associations among outcomes under conventional delivery.
| Variable | Category | Intervention Group (n = 178) | Control Group (n = 178) | Total (n = 356) |
| Gender | Male | 89 (50.0%) | 86 (48.3%) | 175 (49.2%) |
| | Female | 89 (50.0%) | 92 (51.7%) | 181 (50.8%) |
| Age Group (years) | 18–25 | 51 (28.7%) | 54 (30.3%) | 105 (29.5%) |
| | 26–35 | 65 (36.5%) | 62 (34.8%) | 127 (35.7%) |
| | 36–45 | 38 (21.3%) | 39 (21.9%) | 77 (21.6%) |
| | ≥46 | 24 (13.5%) | 23 (12.9%) | 47 (13.2%) |
| Education Level | High School | 20 (11.2%) | 23 (12.9%) | 43 (12.1%) |
| | Undergraduate | 93 (52.2%) | 90 (50.6%) | 183 (51.4%) |
| | Postgraduate or higher | 65 (36.5%) | 65 (36.5%) | 130 (36.5%) |
| Digital Literacy | Basic | 49 (27.5%) | 46 (25.8%) | 95 (26.7%) |
| | Intermediate | 94 (52.8%) | 97 (54.5%) | 191 (53.7%) |
| | Advanced | 35 (19.7%) | 35 (19.7%) | 70 (19.7%) |
| Prior Digital Health Usage | Yes | 105 (59.0%) | 101 (56.7%) | 206 (57.9%) |
| | No | 73 (41.0%) | 77 (43.3%) | 150 (42.1%) |
Table 1: Demographic characteristics of participants (N = 356) in AI-driven health education. Baseline characteristics are summarized as mean ± SD or n (%), with group comparisons reported where applicable to confirm equivalence at pre-test.
| Variable | Intervention (M ± SD) | Control (M ± SD) | F-value | p-value |
| Knowledge Retention | 12.84 ± 1.42 | 10.27 ± 1.89 | 18.73 | < 0.001 |
| Engagement | 4.32 ± 0.56 | 3.48 ± 0.73 | 12.56 | 0.0004 |
| Behavioral Intention | 4.40 ± 0.60 | 3.50 ± 0.60 | 16.95 | < 0.001 |
| Self-Efficacy | 4.30 ± 0.50 | 3.60 ± 0.50 | 10.84 | 0.0012 |
| Technology Acceptance | 4.50 ± 0.60 | 3.70 ± 0.60 | 13.45 | 0.0003 |
Table 2: ANOVA results comparing educational outcomes between the intervention and control groups. Post-test between-group comparisons for all outcomes, including means, SDs, F(df1, df2), p-values, and effect sizes (η²).
| Variable | Pre-test (M ± SD) | Post-test (M ± SD) | Mean Difference | t-value | p-value |
| Knowledge Retention | 62.5 ± 8.4 | 84.2 ± 7.9 | 21.7 | 15.28 | < 0.001 |
| Behavioral Intention | 3.1 ± 0.7 | 4.4 ± 0.6 | 1.3 | 12.06 | < 0.001 |
| Engagement | 3.3 ± 0.6 | 4.2 ± 0.5 | 0.9 | 10.45 | < 0.001 |
| Self-Efficacy | 3.5 ± 0.6 | 4.3 ± 0.5 | 0.8 | 9.63 | < 0.001 |
| Technology Acceptance | 3.2 ± 0.7 | 4.5 ± 0.6 | 1.3 | 13.71 | < 0.001 |
Table 3: Paired-sample t-test results for pre-test and post-test outcomes in the intervention group. Within-group changes under AI-guided delivery, reporting pre/post means, Δ, t, df, p, and 95% CIs.
| Variable | Pre-test (M ± SD) | Post-test (M ± SD) | Mean Difference | t-value | p-value |
| Knowledge Retention | 61.8 ± 8.6 | 72.4 ± 7.8 | 10.6 | 9.34 | < 0.001 |
| Behavioral Intention | 3.2 ± 0.6 | 3.6 ± 0.7 | 0.4 | 3.87 | 0.0001 |
| Engagement | 3.4 ± 0.5 | 3.7 ± 0.6 | 0.3 | 2.91 | 0.004 |
| Self-Efficacy | 3.6 ± 0.5 | 3.9 ± 0.6 | 0.3 | 2.64 | 0.009 |
| Technology Acceptance | 3.3 ± 0.6 | 3.7 ± 0.6 | 0.4 | 3.34 | 0.001 |
Table 4: Paired-sample t-test results for pre-test and post-test outcomes in the control group. Within-group changes under conventional modules with the same statistics as Table 3 for comparability.
| Variable | χ² | df | p-value |
| Knowledge Retention | 18.67 | 2 | < 0.001 |
| Behavioral Intention | 14.52 | 2 | < 0.001 |
| Engagement | 10.77 | 2 | 0.0046 |
| Self-Efficacy | 9.84 | 2 | 0.0073 |
| Technology Acceptance | 13.38 | 2 | 0.0012 |
Table 5: Chi-square test results for categorical variables in the intervention group. Cross-tabulations and χ²(df) with p-values for categorical versions of the five outcomes, indicating associations with intervention exposure.
| Variable | χ² | df | p-value |
| Knowledge Retention | 10.14 | 2 | 0.0063 |
| Behavioral Intention | 8.29 | 2 | 0.0159 |
| Engagement | 6.78 | 2 | 0.0337 |
| Self-Efficacy | 5.24 | 2 | 0.0728 |
| Technology Acceptance | 7.65 | 2 | 0.0219 |
Table 6: Chi-square test results for categorical variables in the control group. Cross-tabulations and χ²(df) with p-values for the control group, showing comparatively weaker patterns.
| Variables | Knowledge Retention | Behavioral Intention | Engagement | Self-Efficacy | Technology Acceptance |
| Knowledge Retention | 1 | 0.612** | 0.588** | 0.605** | 0.631** |
| Behavioral Intention | 0.612** | 1 | 0.662** | 0.648** | 0.679** |
| Engagement | 0.588** | 0.662** | 1 | 0.701** | 0.685** |
| Self-Efficacy | 0.605** | 0.648** | 0.701** | 1 | 0.656** |
| Technology Acceptance | 0.631** | 0.679** | 0.685** | 0.656** | 1 |
Table 7: Pearson correlation matrix among key variables in the intervention group. Pairwise correlations (r) with significance levels among knowledge, behavioral intention, engagement, self-efficacy, and technology acceptance.
| Variables | Knowledge Retention | Behavioral Intention | Engagement | Self-Efficacy | Technology Acceptance |
| Knowledge Retention | 1 | 0.472** | 0.451** | 0.489** | 0.504** |
| Behavioral Intention | 0.472** | 1 | 0.523** | 0.508** | 0.533** |
| Engagement | 0.451** | 0.523** | 1 | 0.556** | 0.527** |
| Self-Efficacy | 0.489** | 0.508** | 0.556** | 1 | 0.492** |
| Technology Acceptance | 0.504** | 0.533** | 0.527** | 0.492** | 1 |
Table 8: Pearson correlation matrix among key variables in the control group. Pairwise correlations (r) with significance levels for the same variables in the control group for side-by-side interpretation.
The findings of this study provide supportive evidence that integrating AI-powered intelligent atomization sensing technology into health education can enhance learning outcomes across multiple dimensions. Compared with the control group, participants in the intervention group exhibited statistically significant improvements in all five measured variables, with the largest gains observed in knowledge retention (F = 18.73, SS = 62.47, p < 0.001), followed by behavioral intention (F = 16.95, SS = 54.32, p < 0.001), technology acceptance (F = 13.45, SS = 42.65, p = 0.0003), engagement (F = 12.56, SS = 38.90, p = 0.0004), and self-efficacy (F = 10.84, SS = 33.70, p = 0.0012). These effects were supported by consistently low within-group variance (MS = 3.10-3.33), indicating stable intervention performance.
From a theoretical perspective, these improvements may be explained by adaptive learning theory and cognitive load principles, which posit that timely, personalized feedback facilitates more efficient information processing and knowledge retention22. In this study, AI-driven sensors dynamically monitored learners' states and adjusted instructional delivery in real time, thereby reducing cognitive overload and optimizing learning efficiency.
A critical determinant of intervention success was the accurate calibration of atomization sensors. Technical challenges such as data dropout or temporary sensor malfunction were mitigated using strategies similar to those described by Alkabbany et al.23. This highlights the importance of ensuring reliable data capture for effective adaptive learning.
Correlation analyses further revealed interdependencies among the measured variables. Engagement was strongly correlated with self-efficacy (r = 0.701), while behavioral intention aligned with technology acceptance (r = 0.679). These findings are consistent with prior studies demonstrating that learner engagement often reinforces self-perceived competence24 and that positive attitudes toward technology predict behavioral readiness to adopt innovative learning tools25.
Compared with earlier e-learning interventions that primarily relied on static, non-adaptive modules26, the present study demonstrates the value of integrating multi-parameter sensing with real-time adaptivity. Unlike wearable-only systems that provide limited feedback27, this approach enabled dynamic personalization based on physiological and behavioral indicators, which appears to explain the stronger observed outcomes.
Conclusion
This study demonstrates that AI-powered intelligent atomization sensing technology shows potential as a scalable approach to enhancing health education outcomes. By embedding personalized, real-time feedback into instructional delivery, the intervention improved knowledge acquisition, engagement, behavioral intention, self-efficacy, and technology acceptance compared with conventional digital modules.
Practically, these findings underscore the value of incorporating AI-driven sensing systems into vocational health education programs. Such integration enables resource-efficient, learner-centered, and adaptive models that can better address the diverse needs of students while supporting improved training quality. These results provide a replicable framework for modernizing health education curricula in both vocational and continuing education settings.
Limitations and future directions
Several limitations should be acknowledged. First, the quasi-experimental design restricts strict causal inference; future studies should employ randomized controlled trials to strengthen causal claims28,29. Second, as the sample consisted primarily of vocational health students, the findings may not be fully generalizable to other educational or professional populations. Third, the AI system relies heavily on large datasets, which introduces concerns regarding data privacy, security, and algorithmic bias30. Addressing these issues will be essential for ensuring fairness and inclusivity.
Future research should prioritize four areas. The first is strengthening data security and ethical governance, for example, through advanced encryption protocols and transparent privacy safeguards31,32,33. The second is refining adaptive algorithms to accommodate diverse cognitive profiles, learning speeds, and motivational states, thereby maximizing personalization. The third is validation across varied educational settings and cultural contexts to enhance generalizability and scalability. Finally, longitudinal studies are needed to investigate whether the observed improvements in engagement, knowledge retention, and self-efficacy persist over time and translate into sustained behavioral change.
By addressing these areas, AI-powered intelligent atomization sensing technology may evolve into a secure, inclusive, and widely deployable solution for delivering individualized health education at scale.
None.
None.
| AtomWear X100 wearable sensors | Shenzhen AtomTech Co., Ltd., China | X100 | Wearable sensor for monitoring respiratory frequency, ambient temperature, and motion signals (sampling rate: 100 Hz, accuracy: ±0.1). |
| AI Engine (rule-based + SVM classifier) | Custom implementation | Python 3.9 with TensorFlow 2.10 and scikit-learn 0.24 | Algorithm for adaptive decision support: integrates sensor data and adjusts instructional delivery. |
| Behavioral Intention Scale | Adapted from Ajzen (1991), Theory of Planned Behavior | 3 items | Assesses intention to adopt health practices. |
| Engagement Scale | Adapted from Sun et al. (2019) | 5 items | Measures attention, interaction, emotional involvement, persistence, and satisfaction. |
| Knowledge Retention Test (15-item MCQ) | Custom, aligned with instructional content | 15 items | 15-item multiple-choice test on aerobic exercise and health knowledge. |
| Multimedia Classroom Setup | Standard higher education classrooms | | Classroom (~50 m²), projector, base stations for sensor connectivity. |
| Python | Python Software Foundation | Version 3.9 | Programming environment for AI engine and preprocessing. |
| scikit-learn | Open source | Version 0.24 | Machine learning library for SVM classification. |
| Self-Efficacy Scale | Adapted from Bandura (1997) | 4 items | Assesses confidence in performing health exercises. |
| SPSS Statistics Software | IBM Corp., Armonk, NY, USA | Version 25 | Statistical analysis (ANOVA, t-tests, correlation, χ² tests). |
| Technology Acceptance Scale | Adapted from UTAUT (Venkatesh et al., 2003) | 6 items | Covers performance expectancy, effort expectancy, social influence, and facilitating conditions. |
| TensorFlow | Google Brain | Version 2.10 | Deep learning library used in AI engine. |