Treadmills are essential to the study of human and animal locomotion as well as for applied diagnostics in both sports and medicine. The quantification of relevant biomechanical and physiological variables requires precise regulation of treadmill belt velocity (TBV). Here, we present a novel method for time-efficient tracking of TBV using standard 3D motion capture technology. Further, we analyzed TBV fluctuations of four different treadmills as seven participants walked and ran at target speeds ranging from 1.0 to 4.5 m/s. Using the novel method, we show that TBV regulation differs between treadmill types, and that certain features of TBV regulation are affected by the subjects’ body mass and locomotion speed. With higher body mass, the TBV reductions in the braking phase of stance became larger, although this relationship differed between locomotion speeds and treadmill types (significant body mass × speed × treadmill type interaction). Average belt speeds varied between about 98 and 103% of the target speed. For three of the four treadmills, TBV reduction during the stance phase of running was more intense (> 5% of target speed) and occurred earlier (before 50% of stance), in contrast to the typical overground center-of-mass velocity patterns reported in the literature. Overall, the results of this study emphasize the importance of monitoring TBV during locomotor research and applied diagnostics. We provide a novel method that is freely accessible on MATLAB’s file exchange server (“getBeltVelocity.m”), allowing TBV tracking to become standard practice in locomotion research.
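The abstract does not spell out the tracking algorithm, so the following is only a minimal sketch of the underlying idea: differentiating the position of a marker fixed to the belt surface. The function name and the central-difference scheme are illustrative assumptions, not the actual getBeltVelocity.m implementation.

```python
# Hypothetical sketch: estimating treadmill belt velocity (TBV) from the
# anterior-posterior positions of a reflective marker fixed to the belt,
# using central finite differences. NOT the actual getBeltVelocity.m logic.

def belt_velocity(x_positions, sample_rate):
    """Central-difference velocity estimates (m/s) from marker positions (m)."""
    dt = 1.0 / sample_rate
    v = []
    for i in range(1, len(x_positions) - 1):
        v.append((x_positions[i + 1] - x_positions[i - 1]) / (2 * dt))
    return v

# A marker moving at a constant 3.0 m/s sampled at 200 Hz:
xs = [3.0 * i / 200.0 for i in range(5)]
velocities = belt_velocity(xs, 200)  # each estimate is ~3.0 m/s
```

In practice, several belt markers would be tracked and their velocities averaged per frame, and gaps (when a marker passes under the rear of the belt) would need handling.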
Background: Running overuse injuries (ROIs) occur within a complex, partly injury-specific interplay between training loads and extrinsic and intrinsic risk factors. Biomechanical risk factors (BRFs) are related to the individual running style. While BRFs have been reviewed regarding general ROI risk, no systematic review has addressed BRFs for specific ROIs using a standardized methodology.
Objective: To identify and evaluate the evidence for the most relevant BRFs for ROIs determined during running and to suggest future research directions.
Design: Systematic review considering prospective and retrospective studies (PROSPERO ID: 236832).
Data Sources: PubMed and Connected Papers. The search was performed in February 2021.
Eligibility Criteria: English language. Studies on participants whose primary sport is running addressing the risk for the seven most common ROIs and at least one kinematic, kinetic (including pressure measurements), or electromyographic BRF. A BRF needed to be identified in at least one prospective or two independent retrospective studies. BRFs needed to be determined during running.
Results: Sixty-six articles fulfilled our eligibility criteria. Levels of evidence for specific ROIs ranged from conflicting to moderate evidence. Running populations and methods applied varied considerably between studies. While some BRFs appeared for several ROIs, most BRFs were specific for a particular ROI. Most BRFs derived from lower-extremity joint kinematics and kinetics were located in the frontal and transverse planes of motion. Further, plantar pressure, vertical ground reaction force loading rate and free moment-related parameters were identified as kinetic BRFs.
Conclusion: This study offers a comprehensive overview of BRFs for the most common ROIs, which might serve as a starting point to develop ROI-specific risk profiles of individual runners. We identified limited evidence for most ROI-specific risk factors, highlighting the need for performing further high-quality studies in the future. However, consensus on data collection standards (including the quantification of workload and stress tolerance variables and the reporting of injuries) is warranted.
Activities for rehabilitation and prevention are often lengthy and associated with pain and frustration. Their playful enrichment (hereafter: gamification) can counteract this, resulting in so-called “exergames”. However, in contrast to games designed solely for entertainment, the increased motivation and immersion in gamified training can lead to a reduced perception of pain and thus to health deterioration. Therefore, it is necessary to monitor activities continuously. Only an AI-based system able to generate autonomous interventions could free up therapists’ costly time and allow better training at home. An automated adjustment of the movement training’s difficulty as well as individualized goal setting and control are essential to achieve such autonomy. This article’s contribution is two-fold: (1) We portray the potentials of gamification in the health area. (2) We present a framework for smart rehabilitation and prevention training allowing autonomous, dynamic, and gamified interactions.
Background:
Ankle braces aim to reduce lateral ankle sprains. Beyond protection, factors such as sports performance, motion restriction, and users’ perceptions are relevant for user compliance and thus injury prevention. Novel adaptive protection systems claim to change their mechanical behavior based on the intensity of motion (eg, the inversion velocity), unlike traditional passive concepts of ankle bracing.
Purpose:
To compare the performance of a novel adaptive brace with 2 passive ankle braces while considering protection, sports performance, freedom of motion, and subjective perception.
Study Design:
Controlled laboratory study.
Methods:
The authors analyzed 1 adaptive and 2 passive (one lace-up and one rigid brace) ankle braces, worn in a low-cut, indoor sports shoe, which was also the no-brace reference condition. We performed material testing using an artificial ankle joint system at high and low inversion velocities. Further, 20 male, young, healthy team sports athletes were analyzed using 3-dimensional motion analysis in sports-related movements to address protection, sports performance, and active range of motion dimensions. Participants rated subjective comfort, stability, and restriction experienced when using the products.
Results:
Subjective stability ratings did not differ between the adaptive and passive systems. The rigid brace was superior in restricting peak inversion during the biomechanical testing compared with the other braces. However, in the material test, the adaptive brace increased its stiffness by approximately 400% at the fast compared with the slow inversion velocity, demonstrating its adaptive behavior and stiffness values similar to those of the passive braces. We identified minor differences in sports performance tasks. The adaptive brace improved active ankle range of motion as well as subjective comfort and restriction ratings.
Conclusion:
The adaptive brace offered similar protective effects in high-velocity inversion situations to those of the passive braces while improving range of motion, comfort, and restriction rating during noninjurious motions.
Clinical Relevance:
Protection systems are only effective when used. Compared with traditional passive ankle brace technologies, the novel adaptive brace might increase user compliance by improving comfort and freedom of movement while offering similar protection in injurious situations.
Background: Many countries have restricted public life in order to contain the spread of the novel coronavirus (SARS-CoV-2). As a side effect of related measures, physical activity (PA) levels may have decreased.
Objective: We aimed (1) to quantify changes in PA and (2) to identify variables potentially predicting PA reductions.
Methods: A systematic review with random-effects multilevel meta-analysis was performed, pooling the standardized mean differences in PA measures before and during public life restrictions.
Results: A total of 173 trials with moderate methodological quality (modified Downs and Black checklist) were identified. Compared to pre-pandemic levels, total PA (SMD − 0.65, 95% CI − 1.10 to − 0.21) and walking (SMD − 0.52, 95% CI − 0.76 to − 0.29) decreased, while sedentary behavior increased (SMD 0.91, 95% CI 0.17 to 1.65). Reductions in PA affected all intensities to a similar degree (light: SMD − 0.35, 95% CI − 0.61 to − 0.09, p = .013; moderate: SMD − 0.33, 95% CI − 0.60 to − 0.02; vigorous: SMD − 0.33, 95% CI − 0.58 to − 0.08). Moderator analyses revealed no influence of variables such as sex, age, body mass index, or health status. However, Australia was the only continent without a PA reduction, and cross-sectional trials yielded higher effect sizes (p < .05).
Conclusion: Public life restrictions associated with the COVID-19 pandemic resulted in moderate reductions in PA levels and large increases in sedentary behavior. Health professionals and policy makers should therefore join forces to develop strategies counteracting the adverse effects of inactivity.
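As a rough illustration of the pooling step described in the Methods above, a random-effects estimate over study-level SMDs can be sketched with the DerSimonian–Laird heterogeneity estimator. The review itself used a multilevel model, which this simplification does not reproduce, and the input numbers below are invented.

```python
import math

# Hedged sketch: DerSimonian-Laird random-effects pooling of standardized
# mean differences (SMDs). Toy data; not the review's actual model or data.

def pool_random_effects(smds, variances):
    """Return (pooled SMD, 95% CI) under a random-effects model."""
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, smds)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, smds))  # Cochran's Q
    df = len(smds) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                          # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]         # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, smds)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Three hypothetical studies reporting PA reductions:
est, ci = pool_random_effects([-0.4, -0.8, -0.6], [0.04, 0.05, 0.06])
```

A multilevel meta-analysis would additionally model the nesting of multiple effect sizes within trials, which is why the pooled values in the abstract cannot be recovered from a sketch like this.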
Governments have restricted public life during the COVID-19 pandemic, inter alia closing sports facilities and gyms. As regular exercise is essential for health, this study examined the effect of pandemic-related confinements on physical activity (PA) levels. A multinational survey was performed in 14 countries. Times spent in moderate-to-vigorous physical activity (MVPA) as well as in vigorous physical activity only (VPA) were assessed using the Nordic Physical Activity Questionnaire (short form). Data were obtained for leisure and occupational PA pre- and during restrictions. Compliance with PA guidelines was calculated based on the recommendations of the World Health Organization (WHO). In total, n = 13,503 respondents (39 ± 15 years, 59% females) were surveyed. Compared to pre-restrictions, overall self-reported PA declined by 41% (MVPA) and 42.2% (VPA). Reductions were higher for occupational vs. leisure time, young and old vs. middle-aged persons, previously more active vs. less active individuals, but similar between men and women. Compared to pre-pandemic, compliance with WHO guidelines decreased from 80.9% (95% CI: 80.3–81.7) to 62.5% (95% CI: 61.6–63.3). Results suggest PA levels have substantially decreased globally during the COVID-19 pandemic. Key stakeholders should consider strategies to mitigate loss in PA in order to preserve health during the pandemic.
Purpose
To summarize the mechanical loading of the spine in different activities of daily living and sports.
Methods
Since direct measurement is not feasible in sports activities, a mathematical model was applied to quantify the spinal loading of more than 600 physical tasks in more than 200 athletes from several sports disciplines. The outcomes were compressive force and torque (normalized to body weight/mass) at L4/L5.
Results
The data demonstrate high compressive forces on the lumbar spine in sport-related activities, which are much higher than the forces reported for normal daily activities and work tasks. Ballistic jumping and landing skills in particular yield high estimated compression at L4/L5 of more than ten times body weight. Jumping, landing, heavy lifting, and weight training in sports demonstrate compressive forces significantly higher than guideline recommendations for working tasks.
Conclusion
These results may help to identify acute and long-term risks of low back pain and, thus, may guide the development of preventive interventions for low back pain or injury in athletes.
Anterior cruciate ligament (ACL) ruptures are frequent in the age group of 15–19 years, particularly for female athletes. Although injury-prevention programs effectively reduce severe knee injuries, little is known about the underlying mechanisms and changes of biomechanical risk factors. Thus, this study analyzes the effects of a neuromuscular injury-prevention program on biomechanical parameters associated with ACL injuries in elite youth female handball players. In a nonrandomized, controlled intervention study, 19 players allocated to control (n = 12) and intervention (n = 7) groups were investigated for single- and double-leg landings as well as unanticipated side-cutting maneuvers before and after a 12-week study period. The lower-extremity motion of the athletes was captured using a three-dimensional motion capture system consisting of 12 infrared cameras. A lower-body marker set of 40 markers together with a rigid body model, including a forefoot, rearfoot, shank, thigh, and pelvis segment, in combination with two force plates was used to determine knee joint angles, resultant external joint moments, and vertical ground reaction forces. The two groups did not differ significantly during pretesting. Only the intervention group showed significant improvements in the initial knee abduction angle during single-leg landing (p = 0.038; d = 0.518), knee flexion moment during double-leg landings (p = 0.011; d = −1.086), knee abduction moment during single- (p = 0.036; d = 0.585) and double-leg landing (p = 0.006; d = 0.944) and side-cutting (p = 0.015; d = 0.561), as well as vertical ground reaction force during double-leg landing (p = 0.004; d = 1.482). The control group demonstrated no significant changes in kinematics or kinetics. However, at post-intervention, the two groups did not differ significantly in any of the biomechanical outcomes except for the normalized knee flexion moment of the dominant leg during single-leg landing.
This study provides first indications that the implementation of a training intervention with specific neuromuscular exercises has positive impacts on biomechanical risk factors associated with ACL injury risk and, therefore, may help prevent severe knee injuries in elite youth female handball players.
In pandemic times, the possibilities for conventional sports activities are severely limited; many sports facilities are closed or can only be used with restrictions. To counteract this lack of health activities and social exchange, people are increasingly adopting new digital sports solutions—a behavior change that had already started with the trend towards fitness apps and activity trackers. Existing research suggests that digital solutions increase the motivation to move and stay active. This work further investigates the potentials of digital sports, incorporating the dimensions of gender and preference for team versus individual sports. The study focuses on potential users, who were mostly younger professionals and academics. The results show that the COVID-19 pandemic had a significant negative impact on sports activity, particularly for persons preferring team sports. To compensate, most participants use more digital sports than before, and there is a positive correlation between the time spent physically active during the pandemic and the increase in motivation through digital sports. Nevertheless, there is still considerable skepticism regarding the potential of digital sports solutions to increase the motivation to do sports, increase performance, or raise a sense of team spirit when done in groups.
Non-contact anterior cruciate ligament injuries typically occur during cutting maneuvers and are associated with high peak knee abduction moments (KAM) within early stance. To screen athletes for injury risk or quantify the efficacy of prevention programs, it may be necessary to design tasks that mimic game situations. Thus, this study compared KAMs and ranking consistency of female handball players in three sport-specific fake-and-cut tasks of increasing complexity. The biomechanics of female handball players (n = 51, mean ± SD: 66.9 ± 7.8 kg, 1.74 ± 0.06 m, 19.2 ± 3.4 years) were recorded with a 3D motion capture system and force plates during three standardized fake-and-cut tasks. Task 1 was designed as a simple pre-planned cut, task 2 included catching a ball before a pre-planned cut in front of a static defender, and task 3 was designed as an unanticipated cut with three dynamic defenders involved. Inverse dynamics were used to calculate peak KAM within the first 100 ms of stance. KAM was decomposed into the frontal plane knee joint moment arm and the resultant ground reaction force. Repeated-measures ANOVAs (α ≤ 0.05) were used to reveal differences in the KAM magnitudes, moment arm, and resultant ground reaction force for the three tasks. Spearman's rank correlations were calculated to test the ranking consistency of the athletes' KAMs. There was a significant task main effect on KAM (p = 0.02; ηp² = 0.13). The KAM in the two complex tasks was significantly higher (task 2: 1.73 Nm/kg; task 3: 1.64 Nm/kg) than the KAM in the simplest task (task 1: 1.52 Nm/kg). The ranking of the peak KAM was consistent regardless of the task complexity. Comparing tasks 1 and 2, the increase in KAM resulted from an increased frontal plane moment arm. Comparing tasks 1 and 3, the higher KAM in task 3 resulted from an interplay between both the moment arm and the resultant ground reaction force. In contrast to previous studies, unanticipated cutting maneuvers did not produce the highest KAMs.
These findings indicate that the players have developed an automated sport-specific cutting technique that is utilized in both pre-planned and unanticipated fake-and-cut tasks.
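The decomposition described above amounts to a simple product: the mass-normalized KAM scales with both the frontal-plane moment arm and the resultant ground reaction force. A toy calculation (with invented numbers, not values from the study) illustrates this:

```python
# Illustrative only: mass-normalized knee abduction moment (Nm/kg)
# approximated as frontal-plane moment arm x resultant GRF / body mass.

def kam_per_kg(moment_arm_m, grf_n, body_mass_kg):
    """Approximate frontal-plane KAM normalized to body mass (Nm/kg)."""
    return moment_arm_m * grf_n / body_mass_kg

# Hypothetical values: a 1 cm longer moment arm at the same GRF
base = kam_per_kg(0.05, 2000.0, 66.9)   # ~1.49 Nm/kg
wider = kam_per_kg(0.06, 2000.0, 66.9)  # ~1.79 Nm/kg
```

This makes the abstract's comparison concrete: between tasks 1 and 2 only the moment arm term grew, whereas between tasks 1 and 3 both factors contributed.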
Biomechanical Risk Factors of Injury-Related Single-Leg Movements in Male Elite Youth Soccer Players
(2022)
Altered movement patterns during single-leg movements in soccer increase the risk of lower-extremity non-contact injuries. The identification of biomechanical parameters associated with lower-extremity injuries can enrich knowledge of injury risks and facilitate injury prevention. Fifty-six elite youth soccer players performed a single-leg drop landing task and an unanticipated side-step cutting task. Three-dimensional ankle, knee and hip kinematic and kinetic data were obtained, and non-contact lower-extremity injuries were documented throughout the season. Risk profiling was assessed using a multivariate approach utilising a decision tree model (classification and regression tree method). The decision tree model indicated peak knee frontal plane angle, peak vertical ground reaction force, ankle frontal plane moment and knee transverse plane angle at initial contact (in this hierarchical order) for the single-leg landing task as important biomechanical parameters to discriminate between injured and non-injured players. Hip sagittal plane angle at initial contact, peak ankle transverse plane angle and hip sagittal plane moment (in this hierarchical order) were indicated as risk factors for the unanticipated cutting task. Ankle, knee and hip kinematics, as well as ankle and hip kinetics, during single-leg high-risk movements can provide a good indication of injury risk in elite youth soccer players.
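A classification-and-regression-tree (CART) model of the kind used above repeatedly selects the variable and threshold that best separate injured from non-injured players. A stripped-down sketch of that core step, a Gini-impurity split search on a single variable with invented data, could look like this:

```python
# Illustrative sketch (not the study's actual model): the core CART step of
# finding the best binary threshold split for one variable by Gini impurity.

def gini(labels):
    """Gini impurity for binary labels (0 = uninjured, 1 = injured)."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 2 * p * (1 - p)

def best_split(values, labels):
    """Return (threshold, weighted impurity) of the best binary split."""
    pairs = sorted(zip(values, labels))
    best = (None, float("inf"))
    for i in range(1, len(pairs)):
        thr = (pairs[i - 1][0] + pairs[i][0]) / 2  # midpoint candidate
        left = [l for v, l in pairs if v <= thr]
        right = [l for v, l in pairs if v > thr]
        imp = (len(left) * gini(left) + len(right) * gini(right)) / len(pairs)
        if imp < best[1]:
            best = (thr, imp)
    return best

# Hypothetical peak knee frontal-plane angles (deg) and injury outcomes:
angles = [2.0, 3.0, 8.0, 9.0, 10.0, 1.0]
injured = [0, 0, 1, 1, 1, 0]
thr, imp = best_split(angles, injured)  # perfect split -> impurity 0
```

A full CART implementation applies this search recursively over all candidate variables, which is how the hierarchical ordering of parameters reported in the abstract arises.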
Purpose
To (1) identify neuromuscular and biomechanical injury risk factors in elite youth soccer players and (2) assess the predictive ability of a machine learning approach.
Material and Methods
Fifty-six elite male youth soccer players (age: 17.2 ± 1.1 years; height: 179 ± 8 cm; mass: 70.4 ± 9.2 kg) performed a 3D motion analysis, postural control testing, and strength testing. Non-contact lower extremities injuries were documented throughout 10 months. A least absolute shrinkage and selection operator (LASSO) regression model was used to identify the most important injury predictors. Predictive performance of the LASSO model was determined in a leave-one-out (LOO) prediction competition.
Results
Twenty-three non-contact injuries were registered. The LASSO model identified concentric knee extensor peak torque, hip transversal plane moment in the single-leg drop landing task and center of pressure sway in the single-leg stance test as the three most important predictors for injury in that order. The LASSO model was able to predict injury outcomes with a likelihood of 58% and an area under the ROC curve of 0.63 (sensitivity = 35%; specificity = 79%).
Conclusion
The three most important variables for predicting the injury outcome suggest the importance of neuromuscular and biomechanical performance measures in elite youth soccer. These preliminary results may have practical implications for future directions in injury risk screening and planning, as well as for the development of customized training programs to counteract intrinsic injury risk factors. However, the poor predictive performance of the final model confirms the challenge of predicting sports injuries, and the model must therefore be evaluated in larger samples.
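The LASSO step above can be illustrated with a minimal coordinate-descent solver (soft-thresholding on standardized predictors). This is a generic textbook sketch with toy data, not the model fitted in the study; real pipelines would add an intercept, convergence checks, and cross-validated selection of the penalty.

```python
# Generic LASSO via coordinate descent; illustrative, with invented data.

def soft_threshold(rho, lam):
    """Soft-thresholding operator used in LASSO coordinate descent."""
    if rho < -lam:
        return rho + lam
    if rho > lam:
        return rho - lam
    return 0.0

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent LASSO; columns of X assumed standardized."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # partial residual excluding feature j
            resid = [y[i] - sum(beta[k] * X[i][k] for k in range(p) if k != j)
                     for i in range(n)]
            rho = sum(X[i][j] * resid[i] for i in range(n)) / n
            z = sum(X[i][j] ** 2 for i in range(n)) / n
            beta[j] = soft_threshold(rho, lam) / z
    return beta

# Toy data: the outcome depends only on the first feature; the penalty
# shrinks the irrelevant second coefficient exactly to zero.
X = [[1.0, 1.0], [-1.0, 1.0], [1.0, -1.0], [-1.0, -1.0]]
y = [2.0, -2.0, 2.0, -2.0]
beta = lasso_cd(X, y, lam=0.5)  # beta[1] is driven to 0.0
```

The exact-zero coefficients are what make LASSO a variable-selection tool, which is how the study arrived at its ranked shortlist of injury predictors.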
Appraising the Methodological Quality of Sports Injury Video Analysis Studies: The QA-SIVAS Scale
(2023)
Background
Video analysis (VA) is commonly used in the assessment of sports injuries and has received considerable research interest. Until now, no tool has been available for the assessment of study quality. Therefore, the objective of this study was to develop and evaluate a valid instrument that reliably assesses the methodological quality of VA studies.
Methods
The Quality Appraisal for Sports Injury Video Analysis Studies (QA-SIVAS) scale was developed using a modified Delphi approach including expert consensus and pilot testing. Reliability was examined through the intraclass correlation coefficient (ICC(3,1)) and free-marginal kappa statistics with three independent raters. Construct validity was investigated by comparing QA-SIVAS with expert ratings using Kendall's tau analysis. Rating time was studied by applying the scale to 21 studies and computing the mean rating time per article.
Results
The QA-SIVAS scale consists of an 18-item checklist addressing the study design, data source, conduct, report, and discussion of VA studies in sports injury research. Inter- and intra-rater reliability were excellent with ICCs > 0.97. Expert ratings revealed a high construct validity (0.71; p < 0.001). Mean rating time was 10 ± 2 min per article.
Conclusion
QA-SIVAS is a reliable and valid instrument that can be easily applied to sports injury research. Future studies in the field of VA should adhere to standardized methodological criteria and strict quality guidelines.
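The ICC form reported here, ICC(3,1), follows the standard Shrout–Fleiss mean-square formula for a two-way mixed model (consistency, single rater). A small self-contained version, with an invented ratings table, might look like:

```python
# Illustrative ICC(3,1) from a subjects-by-raters table of scores.
# Two-way mixed model, consistency definition, single rater.

def icc_3_1(ratings):
    """ICC(3,1) for an n-subjects x k-raters table."""
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(r) for r in ratings) / (n * k)
    row_means = [sum(r) / k for r in ratings]
    col_means = [sum(ratings[i][j] for i in range(n)) / n for j in range(k)]
    ss_total = sum((x - grand) ** 2 for r in ratings for x in r)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # between subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # between raters
    ms_rows = ss_rows / (n - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Invented example: rater 2 scores every study exactly one point higher
# than rater 1, so consistency is perfect and ICC(3,1) equals 1.0.
icc = icc_3_1([[1.0, 2.0], [2.0, 3.0], [3.0, 4.0]])
```

The consistency definition deliberately ignores the constant offset between raters; an absolute-agreement form (ICC(2,1)) would penalize it.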
Relationships between External, Wearable Sensor-Based, and Internal Parameters: A Systematic Review
(2023)
Micro-electro-mechanical systems (MEMS) are used to record the training and match play of intermittent team sport athletes. Paired with estimates of internal responses or adaptations to exercise, practitioners gain insight into players’ dose–response relationship, which facilitates the prescription of training stimuli to optimize performance, prevent injuries, and guide rehabilitation processes. A systematic review on the relationship between external, wearable-based, and internal parameters in team sport athletes, compliant with the PRISMA guidelines, was conducted. The literature search was performed from the earliest record to 1 September 2020 using the databases PubMed, Web of Science, CINAHL, and SPORTDiscus. A total of 66 full-text articles were reviewed, encompassing 1541 athletes. In total, 109 different relationships between variables were reviewed. The most investigated relationship across sports was found between (session) rating of perceived exertion ((session-)RPE) and PlayerLoad™ (PL), with predominantly moderate to strong associations (r = 0.49–0.84). Relationships between internal parameters and highly dynamic, anaerobic movements were heterogeneous. Relationships with average heart rate (HR) and Edwards’ and Banister’s training impulse (TRIMP) seem to be reflected in parameters of overall activity such as PL and total distance (TD) for running-intensive team sports. PL may further be suitable to estimate overall subjective perception. To identify highly fine-structured loading (relative to a certain type of sport), more specific measures and devices are needed. Individualization of parameters could help enhance practicality.
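PlayerLoad™ is commonly published as the accumulated rate of change of triaxial accelerometer output. A sketch following that commonly cited formula is shown below; the divisor of 100 follows the published convention, and the example data are invented.

```python
import math

# Sketch of a PlayerLoad-style accumulated load metric from triaxial
# accelerometer series, following the commonly published formula.
# Illustrative only; vendor implementations may filter and scale differently.

def player_load(ax, ay, az):
    """Accumulated load from three equal-length acceleration series."""
    total = 0.0
    for t in range(1, len(ax)):
        total += math.sqrt(((ax[t] - ax[t - 1]) ** 2 +
                            (ay[t] - ay[t - 1]) ** 2 +
                            (az[t] - az[t - 1]) ** 2) / 100.0)
    return total

# A single 3-4-0 change in acceleration contributes sqrt(25/100) = 0.5:
pl = player_load([0.0, 3.0], [0.0, 4.0], [0.0, 0.0])
```

Because the metric accumulates every change in acceleration, it tracks overall movement volume, which is consistent with its reported association with session-RPE and total distance.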
Running stability is the ability to withstand naturally occurring minor perturbations during running. It is susceptible to external and internal running conditions such as footwear or fatigue. However, both its reliable measurability and the extent to which laboratory measurements reflect outdoor running remain unclear. This study aimed to evaluate the intra- and inter-day reliability of running stability as well as the comparability of different laboratory and outdoor conditions. Competitive runners completed runs on a motorized treadmill in a research laboratory and overground both indoors and outdoors. Running stability was determined as the maximum short-term divergence exponent from the raw gyroscope signals of wearable sensors mounted at four body locations (sternum, sacrum, tibia, and foot). Sacrum measurements demonstrated the highest reliability (good to excellent; ICC = 0.85 to 0.91), while tibia measurements showed the lowest (moderate to good; ICC = 0.55 to 0.89). Treadmill measurements yielded systematically lower values than both overground conditions for all sensor locations (relative bias = -9.8% to -2.9%). The two overground conditions, however, showed high agreement (relative bias = -0.3% to 0.5%; relative limits of agreement = 9.2% to 15.4%). Our results imply moderate to excellent reliability for both overground and treadmill running, providing a foundation for further research on running stability.
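The short-term divergence exponent is typically estimated with Rosenstein's algorithm: embed the signal in state space, track how initially nearest neighbors drift apart, and take the slope of the mean log-distance curve. The following is a simplified pure-Python sketch with fixed embedding parameters and an invented test signal; a real analysis would tune the embedding dimension and delay per signal and normalize the slope by stride time.

```python
import math

# Simplified Rosenstein-style divergence exponent; illustrative only.

def divergence_exponent(signal, dim=3, delay=2, horizon=10, theiler=5):
    """Slope of mean log distance between initially nearest state-space
    neighbors over `horizon` samples (positive = divergence)."""
    # time-delay embedding
    m = len(signal) - (dim - 1) * delay
    states = [[signal[i + k * delay] for k in range(dim)] for i in range(m)]

    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    # nearest neighbor for each point, outside a Theiler window,
    # restricted so both trajectories can still evolve `horizon` steps
    pairs = []
    for i in range(m - horizon):
        best, best_d = None, float("inf")
        for j in range(m - horizon):
            if abs(i - j) > theiler:
                d = dist(states[i], states[j])
                if 0 < d < best_d:
                    best, best_d = j, d
        if best is not None:
            pairs.append((i, best))

    # mean log divergence at each step ahead
    mean_log = []
    for step in range(horizon + 1):
        logs = [math.log(dist(states[i + step], states[j + step]))
                for i, j in pairs
                if dist(states[i + step], states[j + step]) > 0]
        mean_log.append(sum(logs) / len(logs))

    # least-squares slope of the divergence curve
    n = len(mean_log)
    xbar, ybar = (n - 1) / 2, sum(mean_log) / n
    num = sum((x - xbar) * (y - ybar) for x, y in enumerate(mean_log))
    den = sum((x - xbar) ** 2 for x in range(n))
    return num / den

# A purely periodic signal should yield an exponent near zero:
sig = [math.sin(0.3 * i) for i in range(200)]
lam = divergence_exponent(sig)
```

Higher exponents indicate faster local divergence, i.e. lower running stability, which is why footwear or fatigue effects show up in this measure.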
In a randomized controlled cross-over study, ten male runners (26.7 ± 4.9 years; recent 5-km time: 18:37 ± 1:07 min:s) performed an incremental treadmill test (ITT) and a 3-km time trial (3-km TT) on a treadmill while wearing either carbon fiber insoles with downwards curvature or insoles made of butyl rubber (control condition) in light road racing shoes (Saucony Fastwitch 9). Oxygen uptake, respiratory exchange ratio, heart rate, blood lactate concentration, stride frequency, stride length, and time to exhaustion were assessed during the ITT. After the ITT, all runners rated their perceived exertion, perceived shoe comfort, and perceived shoe performance. Running time, heart rate, blood lactate levels, stride frequency, and stride length were recorded during, and shoe comfort and shoe performance after, the 3-km TT. None of the parameters obtained during or after the ITT differed between the two conditions [range: p = 0.188 to 0.948 (alpha level: 0.05); Cohen's d = 0.021 to 0.479] except for shoe comfort, which showed better scores for the control insoles (p = 0.001; d = −1.646). Likewise, no parameters during or after the 3-km TT differed between conditions (p = 0.200 to 1.000; d = 0.000 to 0.501) except for shoe comfort, which again favored the control insoles (p = 0.017; d = −0.919). Running with carbon fiber insoles with downwards curvature did not change running performance, any submaximal or maximal physiological or biomechanical parameter, or perceived exertion compared to the control condition, while shoe comfort was impaired. Wearing carbon fiber insoles with downwards curvature during treadmill running is therefore not beneficial compared to running with control insoles.
Young female handball players represent a high-risk population for anterior cruciate ligament (ACL) injuries. While the external knee abduction moment (KAM) is known to be a risk factor, it is unclear how cutting technique affects KAMs in sport-specific cutting maneuvers. Further, the effect of added game specificity (e.g., catching a ball or faking defenders) on KAMs and cutting technique remains unknown. Therefore, this study aimed: (i) to test if athletes grouped into different clusters of peak KAMs produced during three sport-specific fake-and-cut tasks of different complexities differ in cutting technique, and (ii) to test whether technique variables change with task complexity. Fifty-one female handball players (67.0 ± 7.7 kg, 1.70 ± 0.06 m, 19.2 ± 3.4 years) were recruited. Athletes performed at least five successful handball-specific sidestep cuts of three different complexities, ranging from simple pre-planned fake-and-cut maneuvers to catching a ball and performing an unanticipated fake-and-cut maneuver with dynamic defenders. A k-means cluster algorithm with a squared Euclidean distance metric was applied to the KAMs of all three tasks. The optimal cluster number of k = 2 was calculated using the average silhouette width. Statistical differences in technique variables between the two clusters and the tasks were analyzed using repeated-measures ANOVAs (task complexity) with nested groupings (clusters). KAMs differed by 64.5%, on average, between clusters. When pooling all tasks, athletes with high KAMs showed 3.4° more knee valgus, 16.9% higher downward and 8.4% higher resultant velocity at initial ground contact, and 20.5% higher vertical ground reaction forces at peak KAM. Unlike most other variables, knee valgus angle was not affected by task complexity, likely because it is part of inherent movement strategies and partly determined by anatomy.
Since the high KAM cluster showed higher vertical center of mass excursions and knee valgus angles in all tasks, it is likely that this is part of an automated motor program developed over the players' careers. Based on these results, reducing knee valgus and downward velocity bears the potential to mitigate knee joint loading and therefore ACL injury risk.
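The clustering pipeline described above, k-means on peak KAMs with the cluster number chosen by average silhouette width, can be sketched for scalar KAM values. The data below are invented, and the initialization is simplified relative to a production implementation:

```python
# Illustrative sketch of the study's clustering approach: k-means on scalar
# peak-KAM values plus the average silhouette width used to choose k.

def kmeans_1d(values, k, n_iter=50):
    """Plain k-means on scalars (squared Euclidean distance), initialized
    with k evenly spaced points of the sorted data (requires k >= 2)."""
    s = sorted(values)
    centers = [s[round(i * (len(s) - 1) / (k - 1))] for i in range(k)]
    labels = [0] * len(values)
    for _ in range(n_iter):
        labels = [min(range(k), key=lambda c: (v - centers[c]) ** 2)
                  for v in values]
        for c in range(k):
            members = [v for v, l in zip(values, labels) if l == c]
            if members:
                centers[c] = sum(members) / len(members)
    return labels, centers

def mean_silhouette(values, labels):
    """Average silhouette width over all points (higher = better k)."""
    clusters = set(labels)
    scores = []
    for i, v in enumerate(values):
        own = [values[j] for j, l in enumerate(labels)
               if l == labels[i] and j != i]
        if not own:
            continue
        a = sum(abs(v - o) for o in own) / len(own)       # intra-cluster
        b = min(sum(abs(v - values[j])                    # nearest other
                    for j, l in enumerate(labels) if l == c)
                / labels.count(c)
                for c in clusters if c != labels[i])
        scores.append((b - a) / max(a, b))
    return sum(scores) / len(scores)

# Invented peak KAMs (Nm/kg) for six athletes with low/high loading styles:
kams = [1.10, 1.15, 1.20, 2.00, 2.05, 2.10]
labels, centers = kmeans_1d(kams, 2)
sil = mean_silhouette(kams, labels)  # well separated, close to 1
```

Repeating the silhouette computation for several candidate k and keeping the maximum is how an "optimal" cluster number such as k = 2 is selected in practice.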