In a randomized controlled cross-over study, ten male runners (26.7 ± 4.9 years; recent 5-km time: 18:37 ± 1:07 min:s) performed an incremental treadmill test (ITT) and a 3-km time trial (3-km TT) on a treadmill while wearing either carbon fiber insoles with downwards curvature or insoles made of butyl rubber (control condition) in light road racing shoes (Saucony Fastwitch 9). Oxygen uptake, respiratory exchange ratio, heart rate, blood lactate concentration, stride frequency, stride length and time to exhaustion were assessed during the ITT. After the ITT, all runners rated their perceived exertion, perceived shoe comfort and perceived shoe performance. Running time, heart rate, blood lactate levels, stride frequency and stride length were recorded during, and shoe comfort and shoe performance after, the 3-km TT. None of the parameters obtained during or after the ITT differed between the two conditions [range: p = 0.188 to 0.948 (alpha value: 0.05); Cohen's d = 0.021 to 0.479], except for shoe comfort, which was rated better for the control insoles (p = 0.001; d = −1.646). Likewise, none of the parameters obtained during or after the 3-km TT differed between the two conditions (p = 0.200 to 1.000; d = 0.000 to 0.501), except for shoe comfort, which again favored the control insoles (p = 0.017; d = −0.919). Running with carbon fiber insoles with downwards curvature did not change running performance, any submaximal or maximal physiological or biomechanical parameter, or perceived exertion compared to the control condition, while shoe comfort was impaired. Wearing carbon fiber insoles with downwards curvature during treadmill running is therefore not beneficial compared to running with control insoles.
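The comparisons above are reported as p values with Cohen's d effect sizes. As a rough, hypothetical illustration of how such an effect size can be computed for a cross-over design — here as the mean pairwise difference divided by the standard deviation of the differences, one common convention; the study's exact formula is not stated — consider:

```python
import math

def cohens_d_paired(a, b):
    """Cohen's d for paired data: mean difference / SD of differences."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    return mean / sd

# Hypothetical 3-km times (s) for ten runners in both insole conditions;
# these numbers are invented, not the study's data.
carbon = [652, 671, 640, 660, 648, 655, 667, 643, 659, 650]
control = [650, 673, 641, 658, 649, 653, 668, 641, 660, 649]
print(round(cohens_d_paired(carbon, control), 3))
```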
Appraising the Methodological Quality of Sports Injury Video Analysis Studies: The QA-SIVAS Scale
(2023)
Background
Video analysis (VA) is commonly used in the assessment of sports injuries and has received considerable research interest. Until now, no tool has been available for the assessment of study quality. Therefore, the objective of this study was to develop and evaluate a valid instrument that reliably assesses the methodological quality of VA studies.
Methods
The Quality Appraisal for Sports Injury Video Analysis Studies (QA-SIVAS) scale was developed using a modified Delphi approach including expert consensus and pilot testing. Reliability was examined through the intraclass correlation coefficient (ICC(3,1)) and free-marginal kappa statistics by three independent raters. Construct validity was investigated by comparing QA-SIVAS scores with expert ratings using Kendall's tau. Rating time was studied by applying the scale to 21 studies and computing the mean rating time per article.
Results
The QA-SIVAS scale consists of an 18-item checklist addressing the study design, data source, conduct, report, and discussion of VA studies in sports injury research. Inter- and intra-rater reliability were excellent with ICCs > 0.97. Expert ratings revealed a high construct validity (0.71; p < 0.001). Mean rating time was 10 ± 2 min per article.
Conclusion
QA-SIVAS is a reliable and valid instrument that can be easily applied to sports injury research. Future studies in the field of VA should adhere to standardized methodological criteria and strict quality guidelines.
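The reliability analysis above rests on ICC(3,1), the two-way mixed-effects, consistency, single-rater intraclass correlation. A minimal sketch with invented ratings (three raters scoring five hypothetical studies; rater 3 is systematically stricter, which the consistency form of the ICC ignores):

```python
def icc_3_1(ratings):
    """ICC(3,1): two-way mixed, consistency, single rater.
    ratings: list of [subject][rater] scores."""
    n = len(ratings)      # subjects (e.g., rated study articles)
    k = len(ratings[0])   # raters
    grand = sum(sum(r) for r in ratings) / (n * k)
    subj_means = [sum(r) / k for r in ratings]
    rater_means = [sum(ratings[i][j] for i in range(n)) / n for j in range(k)]
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
    ss_rater = n * sum((m - grand) ** 2 for m in rater_means)
    ss_total = sum((x - grand) ** 2 for r in ratings for x in r)
    ss_err = ss_total - ss_subj - ss_rater
    ms_subj = ss_subj / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err)

# Invented example: five studies, three raters; rater 3 roughly 2 points
# stricter than rater 1 but with a consistent ordering of studies.
scores = [[16, 17, 14], [12, 13, 10], [18, 18, 15], [9, 10, 7], [15, 16, 13]]
print(round(icc_3_1(scores), 3))
```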
Background:
Ankle braces aim to reduce lateral ankle sprains. Beyond protection, factors such as sports performance, motion restriction, and users' perceptions influence user compliance and thus injury prevention. Unlike traditional passive bracing concepts, novel adaptive protection systems claim to change their mechanical behavior based on the intensity of motion (e.g., the inversion velocity).
Purpose:
To compare the performance of a novel adaptive brace with 2 passive ankle braces while considering protection, sports performance, freedom of motion, and subjective perception.
Study Design:
Controlled laboratory study.
Methods:
We analyzed 1 adaptive and 2 passive (one lace-up and one rigid) ankle braces, worn in a low-cut, indoor sports shoe, which also served as the no-brace reference condition. We performed material testing using an artificial ankle joint system at high and low inversion velocities. Further, 20 young, healthy male team sports athletes were analyzed using 3-dimensional motion analysis during sports-related movements to address the protection, sports performance, and active range of motion dimensions. Participants rated the subjective comfort, stability, and restriction experienced when using the products.
Results:
Subjective stability ratings did not differ between the adaptive and passive systems. The rigid brace was superior in restricting peak inversion during the biomechanical testing compared with the other braces. In the material test, however, the adaptive brace increased its stiffness by approximately 400% at the fast compared with the slow inversion velocity, demonstrating its adaptive behavior and stiffness values similar to those of the passive braces. We identified only minor differences in the sports performance tasks. The adaptive brace improved active ankle range of motion as well as subjective comfort and restriction ratings.
Conclusion:
The adaptive brace offered similar protective effects in high-velocity inversion situations to those of the passive braces while improving range of motion, comfort, and restriction rating during noninjurious motions.
Clinical Relevance:
Protection systems are only effective when used. Compared with traditional passive ankle brace technologies, the novel adaptive brace might increase user compliance by improving comfort and freedom of movement while offering similar protection in injurious situations.
Background: Running overuse injuries (ROIs) occur within a complex, partly injury-specific interplay between training loads and extrinsic and intrinsic risk factors. Biomechanical risk factors (BRFs) are related to the individual running style. While BRFs have been reviewed regarding general ROI risk, no systematic review has addressed BRFs for specific ROIs using a standardized methodology.
Objective: To identify and evaluate the evidence for the most relevant BRFs for ROIs determined during running and to
suggest future research directions.
Design: Systematic review considering prospective and retrospective studies (PROSPERO ID: 236832).
Data Sources: PubMed and Connected Papers. The search was performed in February 2021.
Eligibility Criteria: English language. Studies on participants whose primary sport is running addressing the risk for the seven most common ROIs and at least one kinematic, kinetic (including pressure measurements), or electromyographic BRF. A BRF needed to be identified in at least one prospective or two independent retrospective studies. BRFs needed to be determined during running.
Results: Sixty-six articles fulfilled our eligibility criteria. Levels of evidence for specific ROIs ranged from conflicting to moderate evidence. Running populations and methods applied varied considerably between studies. While some BRFs appeared for several ROIs, most BRFs were specific for a particular ROI. Most BRFs derived from lower-extremity joint kinematics and kinetics were located in the frontal and transverse planes of motion. Further, plantar pressure, vertical ground reaction force loading rate and free moment-related parameters were identified as kinetic BRFs.
Conclusion: This study offers a comprehensive overview of BRFs for the most common ROIs, which might serve as a starting point for developing ROI-specific risk profiles of individual runners. We identified only limited evidence for most ROI-specific risk factors, highlighting the need for further high-quality studies. Moreover, consensus on data collection standards (including the quantification of workload and stress tolerance variables and the reporting of injuries) is warranted.
Background: Many countries have restricted public life in order to contain the spread of the novel coronavirus (SARS-CoV2). As a side effect of related measures, physical activity (PA) levels may have decreased.
Objective: We aimed (1) to quantify changes in PA and (2) to identify variables potentially predicting PA reductions.
Methods: A systematic review with random-effects multilevel meta-analysis was performed, pooling the standardized mean differences in PA measures before and during public life restrictions.
Results: A total of 173 studies with moderate methodological quality (modified Downs and Black checklist) were identified. Compared to pre-pandemic levels, total PA (SMD −0.65, 95% CI −1.10 to −0.21) and walking (SMD −0.52, 95% CI −0.76 to −0.29) decreased, while sedentary behavior increased (SMD 0.91, 95% CI 0.17 to 1.65). Reductions in PA affected all intensities to a similar degree (light: SMD −0.35, 95% CI −0.61 to −0.09, p = .013; moderate: SMD −0.33, 95% CI −0.60 to −0.02; vigorous: SMD −0.33, 95% CI −0.58 to −0.08). Moderator analyses revealed no influence of variables such as sex, age, body mass index, or health status. However, the only continent without a PA reduction was Australia, and cross-sectional studies yielded higher effect sizes (p < .05).
Conclusion: Public life restrictions associated with the COVID-19 pandemic resulted in moderate reductions in PA levels and large increases in sedentary behavior. Health professionals and policy makers should therefore join forces to develop strategies counteracting the adverse effects of inactivity.
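The pooled SMDs above come from a random-effects model. The review reports a multilevel variant, which is more involved; the classic DerSimonian-Laird approach sketched below, with invented per-study effects and variances, illustrates the basic pooling idea:

```python
import math

def pool_random_effects(effects, variances):
    """DerSimonian-Laird random-effects pooling: returns (pooled SMD, 95% CI)."""
    w = [1.0 / v for v in variances]                 # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)    # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]   # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

smds = [-0.9, -0.4, -0.7, -0.2, -1.1]   # hypothetical per-study SMDs
vars_ = [0.04, 0.06, 0.05, 0.08, 0.07]  # their sampling variances
est, ci = pool_random_effects(smds, vars_)
print(round(est, 2), tuple(round(x, 2) for x in ci))
```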
Running stability is the ability to withstand naturally occurring minor perturbations during running. It is susceptible to external and internal running conditions such as footwear or fatigue. However, both its reliable measurability and the extent to which laboratory measurements reflect outdoor running remain unclear. This study aimed to evaluate the intra- and inter-day reliability of running stability as well as the comparability of different laboratory and outdoor conditions. Competitive runners completed runs on a motorized treadmill in a research laboratory and overground both indoors and outdoors. Running stability was determined as the maximum short-term divergence exponent from the raw gyroscope signals of wearable sensors mounted at four body locations (sternum, sacrum, tibia, and foot). Sacrum measurements demonstrated the highest reliability (good to excellent; ICC = 0.85 to 0.91), while tibia measurements showed the lowest (moderate to good; ICC = 0.55 to 0.89). Treadmill measurements yielded systematically lower values than both overground conditions for all sensor locations (relative bias = -9.8% to -2.9%). The two overground conditions, however, showed high agreement (relative bias = -0.3% to 0.5%; relative limits of agreement = 9.2% to 15.4%). Our results imply moderate to excellent reliability for both overground and treadmill running, providing a foundation for further research on running stability.
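The maximum short-term divergence exponent used above is typically estimated with an algorithm in the spirit of Rosenstein et al.: delay-embed the signal, track how nearest-neighbor distances grow over time, and fit the initial slope of the mean log-divergence curve. A simplified, illustrative sketch — parameters and test signals are arbitrary choices, not the study's pipeline:

```python
import math

def divergence_exponent(x, dim=3, tau=1, theiler=20, horizon=5):
    """Short-term divergence exponent (per sample) via a simplified
    Rosenstein-style estimate on a 1-D time series x."""
    n = len(x) - (dim - 1) * tau
    emb = [tuple(x[i + j * tau] for j in range(dim)) for i in range(n)]

    def dist(a, b):
        return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

    # nearest neighbor of each reference point, excluding temporal neighbors
    nn = []
    for i in range(n - horizon):
        best_j, best_d = None, float("inf")
        for j in range(n - horizon):
            if abs(i - j) > theiler:
                d = dist(emb[i], emb[j])
                if d < best_d:
                    best_j, best_d = j, d
        nn.append(best_j)

    # mean log distance after k steps, then a least-squares slope over k
    logd = []
    for k in range(horizon + 1):
        s = [math.log(max(dist(emb[i + k], emb[nn[i] + k]), 1e-12))
             for i in range(n - horizon)]
        logd.append(sum(s) / len(s))
    ks = range(horizon + 1)
    kbar = sum(ks) / len(ks)
    lbar = sum(logd) / len(logd)
    return (sum((k - kbar) * (l - lbar) for k, l in zip(ks, logd))
            / sum((k - kbar) ** 2 for k in ks))

# A chaotic logistic-map series should diverge faster than a periodic signal.
chaos, xv = [], 0.123
for _ in range(600):
    xv = 4.0 * xv * (1.0 - xv)
    chaos.append(xv)
periodic = [math.sin(2 * math.pi * i / 20.7) for i in range(600)]
lam_chaos = divergence_exponent(chaos)
lam_periodic = divergence_exponent(periodic)
print(lam_chaos > lam_periodic)
```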
Non-contact anterior cruciate ligament injuries typically occur during cutting maneuvers and are associated with high peak knee abduction moments (KAM) within early stance. To screen athletes for injury risk or quantify the efficacy of prevention programs, it may be necessary to design tasks that mimic game situations. Thus, this study compared KAMs and ranking consistency of female handball players in three sport-specific fake-and-cut tasks of increasing complexity. The biomechanics of female handball players (n = 51, mean ± SD: 66.9 ± 7.8 kg, 1.74 ± 0.06 m, 19.2 ± 3.4 years) were recorded with a 3D motion capture system and force plates during three standardized fake-and-cut tasks. Task 1 was designed as a simple pre-planned cut, task 2 included catching a ball before a pre-planned cut in front of a static defender, and task 3 was designed as an unanticipated cut with three dynamic defenders involved. Inverse dynamics were used to calculate peak KAM within the first 100 ms of stance. KAM was decomposed into the frontal plane knee joint moment arm and resultant ground reaction force. Repeated-measures ANOVAs (α ≤ 0.05) were used to reveal differences in the KAM magnitudes, moment arm, and resultant ground reaction force for the three tasks. Spearman's rank correlations were calculated to test the ranking consistency of the athletes' KAMs. There was a significant task main effect on KAM (p = 0.02; ηp² = 0.13). The KAM in the two complex tasks was significantly higher (task 2: 1.73 Nm/kg; task 3: 1.64 Nm/kg) than the KAM in the simplest task (task 1: 1.52 Nm/kg). The ranking of the peak KAM was consistent regardless of the task complexity. Comparing tasks 1 and 2, the increase in KAM resulted from an increased frontal plane moment arm. Comparing tasks 1 and 3, the higher KAM in task 3 resulted from an interplay between the moment arm and the resultant ground reaction force. In contrast to previous studies, unanticipated cutting maneuvers did not produce the highest KAMs.
These findings indicate that the players have developed an automated sport-specific cutting technique that is utilized in both pre-planned and unanticipated fake-and-cut tasks.
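The abstract decomposes the peak KAM into a frontal plane moment arm and the resultant ground reaction force. A hypothetical numeric sketch of that decomposition follows; the axis convention, lever arm, and force values are invented for illustration, and sign conventions for abduction moments vary between labs:

```python
import math

# Axes (illustrative): x medio-lateral, y anterior-posterior, z vertical.
def cross(a, b):
    """Cross product of two 3-D vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

grf = (180.0, -250.0, 1900.0)   # N, hypothetical cutting-step GRF
lever = (0.01, 0.10, -0.45)     # m, knee joint centre -> centre of pressure

moment = cross(lever, grf)      # Nm, moment of the GRF about the knee
frontal_moment = moment[1]      # component about the anterior-posterior axis
f_res = math.sqrt(sum(c * c for c in grf))      # resultant GRF magnitude
moment_arm = abs(frontal_moment) / f_res        # frontal plane moment arm (m)
body_mass = 66.9                # kg, sample mean reported in the study
print(round(frontal_moment / body_mass, 2), round(moment_arm, 3))
```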
Young female handball players represent a high-risk population for anterior cruciate ligament (ACL) injuries. While the external knee abduction moment (KAM) is known to be a risk factor, it is unclear how cutting technique affects KAMs in sport-specific cutting maneuvers. Further, the effect of added game specificity (e.g., catching a ball or faking defenders) on KAMs and cutting technique remains unknown. Therefore, this study aimed: (i) to test if athletes grouped into different clusters of peak KAMs produced during three sport-specific fake-and-cut tasks of different complexities differ in cutting technique, and (ii) to test whether technique variables change with task complexity. Fifty-one female handball players (67.0 ± 7.7 kg, 1.70 ± 0.06 m, 19.2 ± 3.4 years) were recruited. Athletes performed at least five successful handball-specific sidestep cuts of three different complexities ranging from simple pre-planned fake-and-cut maneuvers to catching a ball and performing an unanticipated fake-and-cut maneuver with dynamic defenders. A k-means cluster algorithm with a squared Euclidean distance metric was applied to the KAMs of all three tasks. The optimal cluster number (k = 2) was determined using the average silhouette width. Statistical differences in technique variables between the two clusters and the tasks were analyzed using repeated-measures ANOVAs (task complexity) with nested groupings (clusters). KAMs differed by 64.5%, on average, between clusters. When pooling all tasks, athletes with high KAMs showed 3.4° more knee valgus, 16.9% higher downward and 8.4% higher resultant velocity at initial ground contact, and 20.5% higher vertical ground reaction forces at peak KAM. Unlike most other variables, knee valgus angle was not affected by task complexity, likely because it is part of inherent movement strategies and partly determined by anatomy.
Since the high KAM cluster showed higher vertical center of mass excursions and knee valgus angles in all tasks, it is likely that this is part of an automated motor program developed over the players' careers. Based on these results, reducing knee valgus and downward velocity bears the potential to mitigate knee joint loading and therefore ACL injury risk.
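The clustering step described above — k-means on each athlete's per-task peak KAMs, with the average silhouette width selecting the number of clusters — can be sketched as follows. The data are synthetic two-group KAM profiles, not the study's measurements, and the deterministic initialization is an illustrative choice:

```python
import math
import random

def dist2(a, b):
    """Squared Euclidean distance between two points."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k, iters=50):
    """Plain k-means; init spreads k centers along the sorted data."""
    pts = sorted(points)
    centers = pts[::max(1, len(pts) // k)][:k]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[min(range(k), key=lambda c: dist2(p, centers[c]))].append(p)
        centers = [tuple(sum(xs) / len(g) for xs in zip(*g)) if g else centers[j]
                   for j, g in enumerate(groups)]
    return [min(range(k), key=lambda c: dist2(p, centers[c])) for p in points]

def avg_silhouette(points, labels):
    """Average silhouette width over all points."""
    def mean_d(p, members):
        return sum(math.sqrt(dist2(p, q)) for q in members) / len(members)
    scores = []
    for p, lab in zip(points, labels):
        own = [q for q, l in zip(points, labels) if l == lab and q is not p]
        if not own:
            continue
        a = mean_d(p, own)   # cohesion: mean distance to own cluster
        b = min(mean_d(p, [q for q, l in zip(points, labels) if l == c])
                for c in set(labels) if c != lab)   # separation
        scores.append((b - a) / max(a, b))
    return sum(scores) / len(scores)

# Synthetic peak KAMs (Nm/kg) for tasks 1-3: a low- and a high-KAM group.
rng = random.Random(1)
low = [tuple(1.1 + 0.1 * rng.random() for _ in range(3)) for _ in range(25)]
high = [tuple(2.0 + 0.1 * rng.random() for _ in range(3)) for _ in range(26)]
athletes = low + high
best_k = max(range(2, 5),
             key=lambda k: avg_silhouette(athletes, kmeans(athletes, k)))
print(best_k)
```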
Treadmills are essential to the study of human and animal locomotion as well as for applied diagnostics in both sports and medicine. The quantification of relevant biomechanical and physiological variables requires a precise regulation of treadmill belt velocity (TBV). Here, we present a novel method for time-efficient tracking of TBV using standard 3D motion capture technology. Further, we analyzed TBV fluctuations of four different treadmills as seven participants walked and ran at target speeds ranging from 1.0 to 4.5 m/s. Using the novel method, we show that TBV regulation differs between treadmill types, and that certain features of TBV regulation are affected by the subjects’ body mass and their locomotion speed. With higher body mass, the TBV reductions in the braking phase of stance became higher, even though this relationship differed between locomotion speeds and treadmill type (significant body mass × speed × treadmill type interaction). Average belt speeds varied between about 98 and 103% of the target speed. For three of the four treadmills, TBV reduction during the stance phase of running was more intense (> 5% target speed) and occurred earlier (before 50% of stance phase) unlike the typical overground center of mass velocity patterns reported in the literature. Overall, the results of this study emphasize the importance of monitoring TBV during locomotor research and applied diagnostics. We provide a novel method that is freely accessible on Matlab’s file exchange server (“getBeltVelocity.m”) allowing TBV tracking to become standard practice in locomotion research.
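The core idea of tracking TBV with motion capture can be sketched as follows: a marker fixed to the belt moves through the capture volume, and its frame-to-frame displacement along the running direction divided by the frame interval gives the instantaneous belt speed. This is a simplified, hypothetical reconstruction of the idea, not the published getBeltVelocity.m:

```python
def belt_velocity(y_positions, sample_rate):
    """Central-difference speed (m/s) from a belt marker's y-coordinates (m)."""
    dt = 1.0 / sample_rate
    return [abs(y_positions[i + 1] - y_positions[i - 1]) / (2 * dt)
            for i in range(1, len(y_positions) - 1)]

# Hypothetical 200 Hz marker track on a belt targeting 3.5 m/s, with a brief
# 4% slowdown mimicking the braking phase of stance.
rate, target = 200, 3.5
speeds = [target * (0.96 if 40 <= i < 60 else 1.0) for i in range(120)]
y, ys = 0.0, []
for v in speeds:
    ys.append(y)
    y -= v / rate          # the marker travels backwards along the y-axis
tbv = belt_velocity(ys, rate)
print(round(min(tbv), 2), round(max(tbv), 2))
```

In practice the marker leaves and re-enters the capture volume with each belt revolution, so a real implementation must also stitch together the visible segments.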