Background: Many countries have restricted public life in order to contain the spread of the novel coronavirus (SARS-CoV-2). As a side effect of related measures, physical activity (PA) levels may have decreased.
Objective: We aimed (1) to quantify changes in PA and (2) to identify variables potentially predicting PA reductions.
Methods: A systematic review with random-effects multilevel meta-analysis was performed, pooling the standardized mean differences in PA measures before and during public life restrictions.
Results: A total of 173 trials of moderate methodological quality (modified Downs and Black checklist) were identified. Compared to pre-pandemic levels, total PA (SMD −0.65, 95% CI −1.10 to −0.21) and walking (SMD −0.52, 95% CI −0.76 to −0.29) decreased, while sedentary behavior increased (SMD 0.91, 95% CI 0.17 to 1.65). Reductions in PA affected all intensities to a similar degree (light: SMD −0.35, 95% CI −0.61 to −0.09, p = .013; moderate: SMD −0.33, 95% CI −0.60 to −0.02; vigorous: SMD −0.33, 95% CI −0.58 to −0.08). Moderator analyses revealed no influence of variables such as sex, age, body mass index, or health status. However, Australia was the only continent without a PA reduction, and cross-sectional trials yielded higher effect sizes (p < .05).
Conclusion: Public life restrictions associated with the COVID-19 pandemic resulted in moderate reductions in PA levels and large increases in sedentary behavior. Health professionals and policy makers should therefore join forces to develop strategies counteracting the adverse effects of inactivity.
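The pooling procedure behind this meta-analysis can be illustrated with a minimal sketch: study-level standardized mean differences are combined under a DerSimonian-Laird random-effects model. This is a simplification of the multilevel model used in the review, and the study-level values below are invented for illustration only.

```python
import math

def dersimonian_laird(effects, variances):
    """Pool effect sizes with a DerSimonian-Laird random-effects model."""
    w = [1 / v for v in variances]
    fixed = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
    # Cochran's Q quantifies between-study heterogeneity
    q = sum(wi * (ei - fixed) ** 2 for wi, ei in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)          # between-study variance estimate
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * ei for wi, ei in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Invented (not real) study-level SMDs for total PA and their variances
smds = [-0.8, -0.5, -0.7, -0.4]
variances = [0.04, 0.05, 0.03, 0.06]
pooled, ci = dersimonian_laird(smds, variances)
print(f"pooled SMD = {pooled:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```

With identical study effects the heterogeneity estimate collapses to zero and the pooled value equals the common effect, which is a quick sanity check on the implementation.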
Injury prevention is essential in running due to the risk of overuse injury development. Tailoring running shoes to individual needs may be a promising strategy to reduce this risk. Novel manufacturing processes allow the production of individualised running shoes that incorporate features meeting individual biomechanical and experiential needs. However, specific ways to individualise footwear to reduce injury risk are poorly understood. Therefore, this scoping review provides an overview of (1) footwear design features that have the potential for individualisation; and (2) the literature on differential responses to footwear design features between selected groups of individuals. These purposes focus exclusively on reducing the risk of overuse injuries. We included English-language studies on adults that analysed: (1) potential interaction effects between footwear design features and subgroups of runners or covariates (e.g., age, sex) for running-related biomechanical risk factors or injury incidences; (2) footwear comfort perception for a systematically modified footwear design feature. Most of the included articles (n = 107) analysed male runners. Female runners may be more susceptible to footwear-induced changes and overuse injury development; future research should target more heterogeneous sampling. Several footwear design features (e.g., midsole characteristics, upper, outsole profile) show potential for individualisation. However, the literature addressing individualised footwear solutions and their potential to reduce biomechanical risk factors is limited. Future studies should leverage more extensive data collections considering relevant covariates and subgroups while systematically modifying isolated footwear design features to inform footwear individualisation.
Non-contact anterior cruciate ligament injuries typically occur during cutting maneuvers and are associated with high peak knee abduction moments (KAM) within early stance. To screen athletes for injury risk or quantify the efficacy of prevention programs, it may be necessary to design tasks that mimic game situations. Thus, this study compared KAMs and ranking consistency of female handball players in three sport-specific fake-and-cut tasks of increasing complexity. The biomechanics of female handball players (n = 51, mean ± SD: 66.9 ± 7.8 kg, 1.74 ± 0.06 m, 19.2 ± 3.4 years) were recorded with a 3D motion capture system and force plates during three standardized fake-and-cut tasks. Task 1 was designed as a simple pre-planned cut, task 2 included catching a ball before a pre-planned cut in front of a static defender, and task 3 was designed as an unanticipated cut with three dynamic defenders involved. Inverse dynamics were used to calculate peak KAM within the first 100 ms of stance. KAM was decomposed into the frontal plane knee joint moment arm and the resultant ground reaction force. Repeated-measures ANOVAs (α ≤ 0.05) were used to reveal differences in KAM magnitude, moment arm, and resultant ground reaction force across the three tasks. Spearman's rank correlations were calculated to test the ranking consistency of the athletes' KAMs. There was a significant task main effect on KAM (p = 0.02; ηp² = 0.13). The KAM in the two complex tasks was significantly higher (task 2: 1.73 Nm/kg; task 3: 1.64 Nm/kg) than in the simplest task (task 1: 1.52 Nm/kg). The ranking of the peak KAM was consistent regardless of task complexity. Comparing tasks 1 and 2, the increase in KAM resulted from an increased frontal plane moment arm. Comparing tasks 1 and 3, the higher KAM in task 3 resulted from an interplay between the moment arm and the resultant ground reaction force. In contrast to previous studies, unanticipated cutting maneuvers did not produce the highest KAMs.
These findings indicate that the players have developed an automated sport-specific cutting technique that is utilized in both pre-planned and unanticipated fake-and-cut tasks.
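The ranking-consistency analysis rests on Spearman's rank correlation between athletes' peak KAMs across tasks. A minimal sketch of that statistic follows; the KAM values are hypothetical, not the study's data.

```python
def ranks(values):
    """Assign ranks (1 = smallest), averaging ranks over tied values."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1            # average rank for the tied block
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical peak KAMs (Nm/kg) for five athletes in tasks 1 and 3
kam_task1 = [1.2, 1.6, 1.4, 2.0, 1.1]
kam_task3 = [1.3, 1.8, 1.5, 2.1, 1.2]
print(f"rho = {spearman(kam_task1, kam_task3):.2f}")  # ranking preserved -> rho = 1.00
```

A rho near 1 across tasks, as reported in the abstract, means an athlete ranked as high-KAM in the simple cut remains high-KAM in the complex cuts.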
Biomechanical Risk Factors of Injury-Related Single-Leg Movements in Male Elite Youth Soccer Players
(2022)
Altered movement patterns during single-leg movements in soccer increase the risk of lower-extremity non-contact injuries. The identification of biomechanical parameters associated with lower-extremity injuries can enrich knowledge of injury risks and facilitate injury prevention. Fifty-six elite youth soccer players performed a single-leg drop landing task and an unanticipated side-step cutting task. Three-dimensional ankle, knee and hip kinematic and kinetic data were obtained, and non-contact lower-extremity injuries were documented throughout the season. Risk profiling was assessed using a multivariate approach utilising a decision tree model (classification and regression tree method). The decision tree model indicated peak knee frontal plane angle, peak vertical ground reaction force, ankle frontal plane moment and knee transverse plane angle at initial contact (in this hierarchical order) for the single-leg landing task as important biomechanical parameters to discriminate between injured and non-injured players. Hip sagittal plane angle at initial contact, peak ankle transverse plane angle and hip sagittal plane moment (in this hierarchical order) were indicated as risk factors for the unanticipated cutting task. Ankle, knee and hip kinematics, as well as ankle and hip kinetics, during single-leg high-risk movements can provide a good indication of injury risk in elite youth soccer players.
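The classification and regression tree (CART) method referenced above selects, at each node, the parameter and threshold that best separate injured from non-injured players. Below is a minimal sketch of the underlying Gini-based split search; the feature values are invented, and peak knee frontal plane angle is used only as an example.

```python
def gini(labels):
    """Gini impurity of a set of 0/1 injury labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def best_split(feature, labels):
    """Find the threshold minimizing the weighted Gini impurity of the halves."""
    best = (float("inf"), None)
    for t in sorted(set(feature)):
        left = [l for f, l in zip(feature, labels) if f <= t]
        right = [l for f, l in zip(feature, labels) if f > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        if score < best[0]:
            best = (score, t)
    return best  # (weighted impurity, threshold)

# Invented peak knee frontal plane angles (deg) and injury outcomes (1 = injured)
knee_angle = [2.0, 8.5, 3.1, 9.7, 2.4, 7.9]
injured    = [0,   1,   0,   1,   0,   1]
impurity, threshold = best_split(knee_angle, injured)
print(f"best threshold: {threshold}, impurity: {impurity:.2f}")  # → best threshold: 3.1, impurity: 0.00
```

CART builds the hierarchical risk profile by applying this search recursively, feature by feature, which yields the ordered list of discriminating parameters the abstract describes.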
Relationships between External, Wearable Sensor-Based, and Internal Parameters: A Systematic Review
(2023)
Microelectromechanical systems (MEMS) are used to record training and match play of intermittent team sport athletes. Paired with estimates of internal responses or adaptations to exercise, practitioners gain insight into players' dose–response relationship, which facilitates the prescription of training stimuli to optimize performance, prevent injuries, and guide rehabilitation processes. A systematic review on the relationship between external, wearable-based, and internal parameters in team sport athletes, compliant with the PRISMA guidelines, was conducted. The literature search was performed from the earliest record to 1 September 2020 using the databases PubMed, Web of Science, CINAHL, and SPORTDiscus. A total of 66 full-text articles encompassing 1541 athletes were reviewed, covering 109 different relationships between variables. The most investigated relationship across sports was that between (session) rating of perceived exertion ((session-)RPE) and PlayerLoad™ (PL), with predominantly moderate to strong associations (r = 0.49–0.84). Relationships between internal parameters and highly dynamic, anaerobic movements were heterogeneous. Relationships involving average heart rate (HR) and Edwards' and Banister's training impulse (TRIMP) seem to be reflected in parameters of overall activity, such as PL and total distance (TD), for running-intensive team sports. PL may further be suitable to estimate overall subjective perception. To identify finely structured loading relative to a certain type of sport, more specific measures and devices are needed. Individualization of parameters could be helpful to enhance practicality.
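As an illustration of how such external-load parameters are derived, the accelerometer-based PlayerLoad™ metric accumulates the instantaneous rate of change of acceleration across the three axes. The sketch below follows the commonly published form of the formula; the exact vendor implementation and scaling may differ, and the short trace is invented.

```python
import math

def player_load(ax, ay, az):
    """Accumulated PlayerLoad-style metric: summed Euclidean norm of the
    sample-to-sample change in triaxial acceleration, divided by 100
    (the commonly reported scaling; vendor implementations may differ)."""
    total = 0.0
    for i in range(1, len(ax)):
        dx = ax[i] - ax[i - 1]
        dy = ay[i] - ay[i - 1]
        dz = az[i] - az[i - 1]
        total += math.sqrt(dx * dx + dy * dy + dz * dz)
    return total / 100.0

# Invented short accelerometer trace (g) for a brief movement burst
ax = [0.0, 0.3, 0.1, -0.2, 0.0]
ay = [1.0, 1.2, 0.9, 1.1, 1.0]
az = [0.0, -0.1, 0.2, 0.0, -0.1]
print(f"PlayerLoad ≈ {player_load(ax, ay, az):.4f}")
```

Because it accumulates movement regardless of direction or intensity zone, PL behaves as a measure of overall activity, which is consistent with its reported associations with session-RPE and TRIMP.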
Running stability is the ability to withstand naturally occurring minor perturbations during running. It is susceptible to external and internal running conditions such as footwear or fatigue. However, both its reliable measurability and the extent to which laboratory measurements reflect outdoor running remain unclear. This study aimed to evaluate the intra- and inter-day reliability of running stability as well as the comparability of different laboratory and outdoor conditions. Competitive runners completed runs on a motorized treadmill in a research laboratory and overground both indoors and outdoors. Running stability was determined as the maximum short-term divergence exponent from the raw gyroscope signals of wearable sensors mounted at four body locations (sternum, sacrum, tibia, and foot). Sacrum sensor measurements demonstrated the highest reliabilities (good to excellent; ICC = 0.85 to 0.91), while tibia measurements showed the lowest (moderate to good; ICC = 0.55 to 0.89). Treadmill measurements yielded systematically lower values than both overground conditions for all sensor locations (relative bias = -9.8% to -2.9%). The two overground conditions, however, showed high agreement (relative bias = -0.3% to 0.5%; relative limits of agreement = 9.2% to 15.4%). Our results imply moderate to excellent reliability for both overground and treadmill running, providing a foundation for further research on running stability.
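The maximum short-term divergence exponent used as the stability measure is commonly estimated with a Rosenstein-style algorithm: time-delay embedding of the sensor signal, nearest-neighbour tracking, and a linear fit to the initial mean log-divergence curve. The sketch below is a simplification with generic, illustrative parameters; it does not reproduce the study's actual embedding choices or signal preprocessing.

```python
import numpy as np

def divergence_exponent(signal, dim=5, delay=10, horizon=50):
    """Simplified Rosenstein-style estimate of the maximum short-term
    divergence exponent (parameter values are illustrative only)."""
    n = len(signal) - (dim - 1) * delay
    # Time-delay embedding into a dim-dimensional state space
    emb = np.column_stack([signal[i * delay: i * delay + n] for i in range(dim)])
    usable = n - horizon
    dists = np.linalg.norm(emb[:usable, None] - emb[None, :usable], axis=2)
    # Exclude temporally close points when picking nearest neighbours
    for i in range(usable):
        lo, hi = max(0, i - delay), min(usable, i + delay + 1)
        dists[i, lo:hi] = np.inf
    nn = np.argmin(dists, axis=1)
    # Mean log distance between each point and its neighbour, tracked over time
    logdiv = np.zeros(horizon)
    for k in range(horizon):
        d = np.linalg.norm(emb[np.arange(usable) + k] - emb[nn + k], axis=1)
        logdiv[k] = np.mean(np.log(d + 1e-12))
    # Slope of the initial divergence curve = short-term exponent
    return np.polyfit(np.arange(horizon), logdiv, 1)[0]

# Example: a chaotic logistic-map series should yield a positive exponent
x = [0.4]
for _ in range(600):
    x.append(4 * x[-1] * (1 - x[-1]))
print(f"exponent ≈ {divergence_exponent(x, dim=2, delay=1, horizon=10):.3f}")
```

Higher exponents indicate faster divergence of initially similar movement states, i.e. lower local dynamic stability.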
In a randomized controlled cross-over study, ten male runners (26.7 ± 4.9 years; recent 5-km time: 18:37 ± 1:07 min:s) performed an incremental treadmill test (ITT) and a 3-km time trial (3-km TT) on a treadmill while wearing either carbon fiber insoles with downwards curvature or insoles made of butyl rubber (control condition) in light road racing shoes (Saucony Fastwitch 9). Oxygen uptake, respiratory exchange ratio, heart rate, blood lactate concentration, stride frequency, stride length and time to exhaustion were assessed during the ITT. After the ITT, all runners rated their perceived exertion, perceived shoe comfort and perceived shoe performance. Running time, heart rate, blood lactate levels, stride frequency and stride length were recorded during, and shoe comfort and shoe performance after, the 3-km TT. None of the parameters obtained during or after the ITT differed between the two conditions [p = 0.188 to 0.948 (alpha level: 0.05); Cohen's d = 0.021 to 0.479], except for shoe comfort, which showed better scores for the control insoles (p = 0.001; d = −1.646). Likewise, no parameters during or after the 3-km TT differed between conditions (p = 0.200 to 1.000; d = 0.000 to 0.501), except for shoe comfort, which again favored the control insoles (p = 0.017; d = −0.919). Running with carbon fiber insoles with downwards curvature did not change running performance, any submaximal or maximal physiological or biomechanical parameter, or perceived exertion compared to the control condition, while shoe comfort was impaired. Wearing carbon fiber insoles with downwards curvature during treadmill running is therefore not beneficial compared to running with control insoles.
Young female handball players represent a high-risk population for anterior cruciate ligament (ACL) injuries. While the external knee abduction moment (KAM) is known to be a risk factor, it is unclear how cutting technique affects KAMs in sport-specific cutting maneuvers. Further, the effect of added game specificity (e.g., catching a ball or faking defenders) on KAMs and cutting technique remains unknown. Therefore, this study aimed (i) to test whether athletes grouped into different clusters of peak KAMs produced during three sport-specific fake-and-cut tasks of different complexities differ in cutting technique, and (ii) to test whether technique variables change with task complexity. Fifty-one female handball players (67.0 ± 7.7 kg, 1.70 ± 0.06 m, 19.2 ± 3.4 years) were recruited. Athletes performed at least five successful handball-specific sidestep cuts at three different complexities, ranging from simple pre-planned fake-and-cut maneuvers to catching a ball and performing an unanticipated fake-and-cut maneuver with dynamic defenders. A k-means clustering algorithm with a squared Euclidean distance metric was applied to the KAMs of all three tasks; the optimal number of clusters (k = 2) was determined using the average silhouette width. Statistical differences in technique variables between the two clusters and across tasks were analyzed using repeated-measures ANOVAs (task complexity) with nested groupings (clusters). KAMs differed by 64.5%, on average, between clusters. When pooling all tasks, athletes with high KAMs showed 3.4° more knee valgus, 16.9% higher downward and 8.4% higher resultant velocity at initial ground contact, and 20.5% higher vertical ground reaction forces at peak KAM. Unlike most other variables, knee valgus angle was not affected by task complexity, likely because it is part of inherent movement strategies and partly determined by anatomy.
Since the high KAM cluster showed higher vertical center of mass excursions and knee valgus angles in all tasks, it is likely that this is part of an automated motor program developed over the players' careers. Based on these results, reducing knee valgus and downward velocity bears the potential to mitigate knee joint loading and therefore ACL injury risk.
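The cluster analysis described above can be sketched in outline: k-means is run for candidate cluster numbers, and the average silhouette width selects the optimum. The implementation below is a simplification (deterministic farthest-point initialization rather than the study's exact procedure), and the per-athlete KAM triplets are invented.

```python
import numpy as np

def kmeans(X, k, iters=100):
    """Lloyd's k-means with squared Euclidean distances and a
    deterministic farthest-point initialization (a simplification)."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels

def avg_silhouette(X, labels):
    """Average silhouette width: mean over points of (b - a) / max(a, b)."""
    n = len(X)
    d = np.linalg.norm(X[:, None] - X[None], axis=2)
    widths = []
    for i in range(n):
        same = (labels == labels[i]) & (np.arange(n) != i)
        a = d[i, same].mean() if same.any() else 0.0
        b = min(d[i, labels == c].mean() for c in set(labels) if c != labels[i])
        widths.append((b - a) / max(a, b))
    return float(np.mean(widths))

# Invented peak-KAM triplets (tasks 1-3, Nm/kg) for a low- and a high-KAM group
rng = np.random.default_rng(1)
low = rng.normal([1.2, 1.3, 1.25], 0.05, size=(6, 3))
high = rng.normal([2.0, 2.1, 2.05], 0.05, size=(6, 3))
X = np.vstack([low, high])
scores = {k: avg_silhouette(X, kmeans(X, k)) for k in (2, 3, 4)}
print("optimal k:", max(scores, key=scores.get))  # expected: 2 for this data
```

For clearly separated low- and high-KAM groups, as found in the study, the silhouette width peaks at k = 2, reproducing the two-cluster structure the abstract reports.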