Leeds Beckett University - City Campus,
Woodhouse Lane,
LS1 3HE
Professor Ben Jones
Professor
Ben is a Professor of Sports Science. He completed his BSc (Hons), MSc and PhD at Leeds Beckett University before becoming a Senior Lecturer in Sport and Exercise Physiology, and received his Professorship in 2017 at the age of 31.
About
Ben's PhD focused on the physiology and biochemistry of rugby, investigating the fluid and electrolyte balance of players. Since his PhD, his research has focused on aspects of sports performance, talent identification and development, sports medicine and injury prevention, which have had direct impact on policy and practice in numerous sports.
Ben holds consultancy roles with Premiership Rugby as their Sports Science and Medicine Research lead, and the Rugby Football League as their Strategic Lead for Performance Science and Research. He leads the overall research strategy for both organisations, which aims to improve player welfare and performance. Both research strategies involve applying the latest technology and analysis techniques to sport.
Ben also holds Visiting Professor positions at Australian Catholic University and the University of New England in Australia, and the University of Cape Town in South Africa. He is an active member of the UK Concussion Network, the DCMS Concussion Innovation and Technology Panel, and the UK Concussion Prevention Network. He is a member of various RFL board sub-committees and English Rugby Union advisory groups, and has engaged in a number of parliamentary discussions.
Since completing his PhD in 2013, Ben has published more than 250 peer-reviewed publications, edited numerous books, and secured research and consultancy income exceeding £6 million. He has also supervised over 30 PhD students and examined over 30 PhD theses internationally. He is frequently invited to deliver keynote presentations at scientific conferences globally, with a main focus on how research can positively impact sport from a player welfare and performance perspective. He has appeared numerous times in mainstream media, including on BBC Breakfast to discuss concussion in sport.
In a charitable capacity, Ben has supported the Kevin Sinfield OBE challenges for MND, during which the team have raised over £8 million. He worked closely with both Kevin Sinfield and Rob Burrow at Leeds Rhinos, where he held a consultancy role for 15 years.
Research interests
Ben's main research interests are within applied sports science and medicine, with direct translation into policy or practice.
For example, during COVID-19, Ben led numerous seminal SARS-CoV-2 transmission papers in sport. These include the Team Sports Risk Exposure Frameworks, which were adopted by the UK Government to assess SARS-CoV-2 transmission risk within sports, and also to identify increased-risk contacts who were required to isolate, should an infectious individual participate in the same sports activity. This framework was also adopted by other sports organisations globally. He also led the first study in the world to evaluate SARS-CoV-2 transmission between athletes during sport, which informed contact tracing and the return of community and recreational sport during the pandemic.
Since then, Ben has undertaken a number of studies aimed at reducing concussion and sub-concussion in sport. These studies include evaluating law changes, coaching and player interventions, and equipment trials. He is leading on numerous instrumented mouthguard projects globally. Instrumented mouthguards are used to measure head accelerations, which can inform precise prevention interventions. He is also leading the ongoing validation of instrumented mouthguards, to meet industry-defined minimum performance standards, as well as other league-wide projects in both codes of rugby, including player tracking, player profiling and fitness testing, and injury surveillance projects.
Publications (486)
Monitoring Fatigue and Recovery
Nutrition and Ergogenic Aids for Rugby
Fitness Testing for Rugby
Tackle characteristics associated with concussion in elite men’s rugby union: unpicking the differences between tacklers and ball-carriers
Objective: To identify characteristics of tackling, of being tackled and interactions between tackle characteristics that are associated with concussion. Methods: A case-control study in male professional rugby union players in England over five seasons (2016/2017 to 2020/2021) analysed characteristics of tackles that led to a clinically diagnosed concussion (cases), and a control group of tackles that did not result in a concussion. ORs were plotted against the overall frequency of each tackle characteristic. Results: 231 tackles resulting in concussions (tackler 178, 77%; ball-carrier 53, 23%), alongside 9963 control tackles, were analysed. For tacklers, 'head to torso' (…). Conclusions: Lower tackles reduce the chances of concussion to ball-carriers. The influence of tackle height on concussion to tacklers is more nuanced, but the chances are relatively low when contact is made with the ball-carrier's torso. These findings support ongoing implementation of strategies to reduce concussion risk by lowering tackle height to target the torso.
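The odds ratios reported in case-control designs like this reduce to a 2x2 cross-product. A minimal sketch in Python, with made-up counts for illustration only (these are not the study's data):

```python
def odds_ratio(cases_exposed, cases_unexposed, controls_exposed, controls_unexposed):
    """OR for a 2x2 case-control table: (a/b) / (c/d)."""
    return (cases_exposed / cases_unexposed) / (controls_exposed / controls_unexposed)

# Hypothetical tackle characteristic: present in 60 of 231 concussion tackles
# and in 1200 of 9963 control tackles (illustrative counts).
or_value = odds_ratio(60, 231 - 60, 1200, 9963 - 1200)
```

Plotting such ORs against how often each characteristic occurs, as the study does, shows which risky tackle types are also common enough to matter for prevention.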
Tackle Technique and Changes in Playerload™ During a Simulated Tackle: An Exploratory Study
In collision sports, the tackle has the highest injury incidence and is key to successful performance. Although the contact load of players has been measured using microtechnology, this has not been related to tackle technique. The aim of this study was to explore how PlayerLoad™ changes between different levels of tackling technique during a simulated tackle. Nineteen rugby union players performed twelve tackles on a tackle contact simulator (n = 228 tackles). Each tackle was recorded with a video camera and each player wore a Catapult OptimEye S5. Tackles were analysed using tackler proficiency criteria and split into three categories: low-scoring (≤5 arbitrary units; AU), medium-scoring (6 and 7 AU) and high-scoring tackles (≥8 AU). High-scoring tackles recorded a higher PlayerLoad™ at tackle completion. The PlayerLoad™ trace was also less variable in the high-scoring tackles. The variability in the PlayerLoad™ trace may be a consequence of players not shortening their steps before contact. This reduced their ability to control their movement during the contact and post-contact phases of the tackle and increased the variability. Using the PlayerLoad™ trace in conjunction with subjective technique assessments offers coaches and practitioners insight into the physical-technical relationship of each tackle to optimise tackle skill training and match preparation.
Video analysis of contact technique during head collisions in rugby union
Background: Rugby union is characterised by frequent and dynamic collisions. Players typically aim to avoid direct contact to the head due to its potential for serious head injury. However, head collisions still occur, possibly due to players either being unaware of the impending contact, or executing poor technique during, or prior to, contact. Objective: Video analysis of contact technique and head collisions in rugby union. Design: Retrospective video analysis. Setting: Professional rugby union players. Participants: Video footage of 211 contact events. Assessment of Risk Factors: Contact characteristics and contact technique for attackers and defenders during head collisions. Attackers and defenders were categorised into higher and lower risk roles depending on which had the higher potential for injury. Main Outcome Measurements: Contact descriptors and contact proficiency scores. Results: Eighty-four percent of head collisions occurred during the tackle, followed by aerial collisions (10%) and rucks (6%). Eighty-two percent of collisions occurred with an opponent. Higher risk players were aware of the impending contact 70% of the time. Mean contact proficiency score (arbitrary units; AU) in front-on tackled ball-carriers was 6.4±3.2 and 8.2±3.2 AU for ball-carriers at higher and lower injury risk, respectively (p≤0.01, effect size=0.6, moderate). The mean contact proficiency score in front-on tackles was 9.8±3.7 and 9.2±3.5 AU for tacklers at higher and lower injury risk, respectively (p>0.05, ES=0.2, small). Conclusions: The tackle event accounted for most head collisions. Most players were aware of the impending contact. Higher injury risk ball-carriers in front-on tackles scored relatively low in proficiency compared to lower injury risk ball-carriers. For the contact proficiency score, each technical criterion was equally weighted. Failure to execute specific criteria (e.g. head up and forward) may increase the risk of head collisions compared to other criteria (e.g. fending). Future studies on contact techniques should weight technical criteria more appropriately.
Adaptation and physical development in a professional rugby league academy
The what and how of video analysis research in rugby union: a critical review
Background: Video analysis is a common tool used in rugby union research to describe match performance. Studies using video analysis range from broad statistical studies of commercial databases to in-depth case studies of specific match events. The range of types of studies using video analysis in rugby union, and how different studies apply the methodology, can make it difficult to compare the results of studies and translate the findings to a real-world setting. In an attempt to consolidate the information on video analysis in rugby, a critical review of the literature was performed. Main body: Ninety-two studies were identified. The studies were categorised based on the outcome of the study and the type of research question, sub-categorised as 'what' and 'how' studies. Each study was reviewed using a number of questions related to the application of video analysis in research. There was a large range in the sample sizes of the studies reviewed, with some of the studies being under-powered. Concerns were raised about the generalisability of some of the samples. One hundred percent of 'how' studies included at least one contextual variable in their analyses, with 86% of 'how' studies including two or more contextual variables. These findings show that the majority of studies describing how events occur in matches attempted to provide context to their findings. The majority of studies (93%) provided practical applications for their findings. Conclusion: The review raised concerns about the usefulness of some of the findings to coaches and practitioners. To facilitate the transfer and adoption of research findings into practice, the authors recommend that the results of 'what' studies inform the research questions of 'how' studies, and the findings of 'how' studies provide the practical applications for coaches and practitioners.
Background: Rugby league is a popular collision sport among Australian adolescent and young adult men. Concussion is one of the more common injuries in rugby league. Few studies have examined concussion in youth rugby league. To examine medically diagnosed concussions from a single season within two elite-level pathway rugby league competitions by evaluating game play risk factors and conducting a video review of potential concussion signs. Methods: All players involved in the Queensland Rugby League's (QRL) under 18 years and under 20 years age group competitions during the 2019 season were included in this study. Data included all head injury assessments (HIAs) identified in real-time through the QRL injury surveillance system for these two QRL age group competitions. The purpose of this study was to (i) report the rates of HIAs and medically diagnosed concussions; (ii) examine video signs of potential concussion; (iii) review game play risk factors related to HIAs and concussions; and (iv) determine the number of days until a concussed player returned to match play and the number of subsequent games missed by concussed players. Results: There were 86 HIAs and 30 medically diagnosed concussions from the two competitions. The concussion incidence was 2.93 per 1000 player match hours in the under 18-year age group and 5.75 per 1000 player match hours in the under 20-year age group. Slow to stand was the most commonly observed video sign (78.6%; 22/28 concussions). Most concussed players (91%, 21/23) missed at least one subsequent game (M = 1.4, SD = 1.7, range = 0–7 games), with the average days to return-to-play being 15.7 (SD = 7.0, range = 7–41 days). Conclusions: In elite-level pathway rugby league, the incidences of HIAs and medically diagnosed concussions were higher in the under 20 age group than the under 18 age group. Both age groups had lower incidences of HIAs and concussions than professional adult rugby league players. 
Return-to-play following concussion was similar across the two age groups and differed considerably compared to the elite level, with a longer time before return to play for the younger elite level development pathway players.
Why tennis players felt the heat at the Australian Open
Effect of evolocumab on carotid plaque composition in asymptomatic carotid artery stenosis (EVOCAR-1) using magnetic resonance imaging
Background and Aims: To determine the effect of evolocumab treatment in patients with asymptomatic carotid artery stenosis ≥50% on carotid plaque morphology and composition, as determined by magnetic resonance imaging. Methods: We conducted a double-blind randomized controlled trial in patients with asymptomatic carotid artery plaque with ≥50% stenosis and low-density lipoprotein-associated cholesterol (LDL-C) ≥1.8 mmol/L, despite standard lipid-lowering therapy, with 12 months of evolocumab or placebo injection every two weeks. The primary endpoint was the between-group difference in the absolute change from baseline in carotid plaque lipid-rich necrotic core (LRNC), assessed by carotid magnetic resonance. Results: Due to interrupted recruitment during the COVID-19 pandemic, 33 patients (36% female) were randomised, which was less than the target of 52. Mean age was 68.7 years (SD, 8.5) and baseline LDL-C 2.4 mmol/L (SD, 0.7). LDL-C was reduced with evolocumab to 0.8 mmol/L (SD, 0.5) vs 2.2 mmol/L (SD, 0.7) with placebo at 3 months (between-group absolute difference -1.3 mmol/L [95% confidence interval (CI), -1.7 to -0.9], p < 0.001). Evolocumab treatment was associated with a favourable change in LRNC at 12 months of -16 mm³ (SD, 54), whereas the placebo group showed -4 mm³ (SD, 44). Between-group differences did not reach statistical significance, with a placebo-adjusted LRNC change of -17 mm³ (95% CI, -45 to 12; p = 0.25). Percentage carotid plaque LRNC was also numerically reduced at 12 months; however, this did not reach statistical significance (-2.4% vessel wall volume [95% CI, -5.7 to 0.9], p = 0.16). Conclusion: Intensive LDL-C lowering with the addition of evolocumab to maximally tolerated lipid-lowering therapy did not lead to a statistically significant change in vulnerable plaque phenotype characteristics in patients with asymptomatic carotid artery stenosis, but the study was underpowered due to under-recruitment in the context of the COVID-19 pandemic.

The optimal fluid for the cyclist…..
Hydration status of Rugby League Players during Home Match Play throughout the 2008 Super League Season
The hydration status of rugby league players during competitive home match play was assessed throughout the 2008 Super League season. Fourteen players from 2 Super League clubs were monitored (72 observations). On arrival, 2 h prior to kick off, following normal prematch routines, players' body mass was measured after a urine void. Prematch fluid intake, urine output, and osmolality were assessed until kick off, with additional measurements at half time. Fluid intake was also monitored during match play for club B only, and final measurements of variables were made at the end of the match. Mean body mass loss per match was 1.28 ± 0.7 kg (club A, 1.15 kg; club B, 1.40 kg), which would equate to an average level of dehydration of 1.31% (mass loss, assumed to be water loss, expressed as a percentage of body mass), with considerable intra-individual coefficient of variation (CV, 47%). Mean fluid intake for club B was 0.64 ± 0.5 L during match play, while fluid loss was 2.0 ± 0.7 L, with considerable intra-individual CV (51% and 34%, respectively). Body mass losses were primarily a consequence of body fluid losses not being completely balanced by fluid intake. Furthermore, these data show that there is large inter- and intra-individual variability of hydration across matches, highlighting the need for future assessment of individual relevance. Mean urine osmolality was 396 ± 252 mosm·kg-1 on arrival, 237 ± 177 mosm·kg-1 prematch, 315 ± 133 mosm·kg-1 at half time, and 489 ± 150 mosm·kg-1 postmatch.
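The dehydration figure quoted above (mass loss, assumed to be water loss, as a percentage of pre-match body mass) is a simple calculation; a minimal sketch with illustrative numbers, not the study's individual data:

```python
def dehydration_percent(pre_mass_kg, post_mass_kg, fluid_intake_kg=0.0):
    """Mass loss as % of pre-match body mass, plus estimated sweat loss.
    Sweat loss = mass loss + fluid taken in during play (urine losses ignored)."""
    mass_loss = pre_mass_kg - post_mass_kg
    sweat_loss = mass_loss + fluid_intake_kg
    return 100 * mass_loss / pre_mass_kg, sweat_loss

# Hypothetical player: 97.7 kg pre-match, the squad-mean 1.28 kg mass loss,
# and 0.64 kg (~0.64 L) of fluid drunk during play.
pct, sweat = dehydration_percent(97.7, 96.42, 0.64)
```

With these inputs the function reproduces the abstract's mean dehydration level of roughly 1.31%.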
Talent development environments (TDEs) strive to develop junior athletes towards senior elite performance; however, they are subject to a range of contextual factors influencing their operations. This study aimed to investigate the influence of contextual factors on efficiency and effectiveness across all English rugby union men's academies. Fourteen focus groups were conducted, one for each academy. Underpinned by pragmatic research philosophy, focus group discussions were analysed via reflexive thematic analysis. Analysis led to the generation of four themes to explain the impact of contextual factors: "multiple loosely connected concurrent environments", "regulation drives practice", "organisational influences" and "searching for bang for buck". Findings suggest complex interactions between a network of individuals and organisations, both internal and external to the structure of the talent system. In this context, it seemed inadequate to only consider the role of a single TDE. Overall, results reflect that contextual and resource challenges constrain practice within English rugby union academies. In practice, we suggest the need to consider the tension of regulation to enhance minimum standards, against the increased autonomy that may result from flexibility of regulation to facilitate enhanced efficiency and effectiveness.
Underwater near-infrared spectroscopy measurements of muscle oxygenation: laboratory validation and preliminary observations in swimmers and triathletes
The purpose of this research was to waterproof a near-infrared spectroscopy device (PortaMon, Artinis Medical Systems) to enable NIR measurement during swim exercise. Candidate materials were initially tested for waterproof suitability by comparing light intensity values during phantom-based tissue assessment. Secondary assessment involved repeated isokinetic exercises ensuring reliability of the results obtained from the modified device. Tertiary assessment required analysis of the effect of water immersion and temperature upon device function. Initial testing revealed that merely covering the PortaMon light sources with waterproof materials considerably affected the NIR light intensities. Modifying a commercially available silicone covering through the addition of a polyvinyl chloride material (impermeable to NIR light transmission) produces an acceptable compromise. Bland-Altman analysis indicated that exercise-induced changes in tissue saturation index (TSI %) were within acceptable limits during laboratory exercise. Although water immersion had a small but significant effect upon NIR light intensity, this resulted in a negligible change in the measured TSI (%). We then tested the waterproof device in vivo illustrating oxygenation changes during a 100 m freestyle swim case study. Finally, a full study compared club level swimmers and triathletes. Significant changes in oxygenation profiles when comparing upper and lower extremities for the two groups were revealed, reflecting differences in swim biomechanics.
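The Bland-Altman analysis mentioned above reduces to a bias (mean difference) and 95% limits of agreement over paired measurements. A generic sketch of that calculation, with illustrative paired TSI (%) readings rather than the authors' data:

```python
from statistics import mean, stdev

def bland_altman(device_a, device_b):
    """Paired measurements from two devices.
    Returns (bias, lower 95% limit of agreement, upper 95% limit)."""
    diffs = [a - b for a, b in zip(device_a, device_b)]
    bias, sd = mean(diffs), stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical tissue saturation index (%) pairs: original vs waterproofed device.
tsi_original   = [68.0, 65.5, 62.0, 58.5, 61.0, 66.0]
tsi_waterproof = [67.5, 65.0, 62.5, 58.0, 60.0, 66.5]
bias, lo, hi = bland_altman(tsi_original, tsi_waterproof)
```

Agreement is judged acceptable when the limits of agreement fall within a pre-specified clinically or physiologically tolerable range.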
The Challenge Point Framework (CPF) guides practice design for optimal motor skill learning. The CPF's use and prevalence have not been reported. This review's aims were to: (i) identify research areas that use the CPF, (ii) determine the CPF's prevalence across research areas and (iii) summarise applications of the CPF across research areas. A systematic scoping review, following modified PRISMA-ScR guidelines, was conducted. Papers referencing Guadagnoli and Lee's (2004) "Challenge Point Framework" paper were reviewed against inclusion/exclusion criteria. Data from 100 included papers were analysed for (1) numerical, (2) thematic and (3) descriptive summaries. Four themes were identified and common CPF applications were identified within each theme. CPF use has been viewed favourably, whilst its limitations have been acknowledged (e.g., lack of practical application research).
The Use of Portable NIRS to Measure Muscle Oxygenation and Haemodynamics During a Repeated Sprint Running Test
Portable near-infrared spectroscopy (NIRS) devices were originally developed for use in exercise and sports science by Britton Chance in the 1990s (the RunMan and microRunman series). However, only recently, with the development of more robust and wireless systems, has routine use in elite sport become possible. As with the medical use of NIRS, finding applications of the technology that are relevant to practitioners is the key issue. One option is to use NIRS to track exercise training-induced adaptations in muscle. Portable NIRS devices enable monitoring during the normal 'field' routines used to assess fitness, such as repeat sprint shuttle tests. Knowledge about the acute physiological responses to these specific tests has practical applications within team sport training prescription, where development of both central and peripheral determinants of high-intensity intermittent exercise needs to be considered. The purpose of this study was to observe NIRS-detected parameters during a repeat sprint test. We used the PortaMon, a two-wavelength spatially resolved NIR spectrometer manufactured by Artinis Medical Systems, to assess NIR changes in the gastrocnemius muscle of both the left and right leg during high-intensity running. Six university standard rugby players were assessed (age 20 ± 1.5 years; height 183 ± 1.0 cm; weight 89.4 ± 5.8 kg; body fat 12.2 ± 3.0%); the subjects completed nine repeated shuttle runs, which incorporated forward, backward and change of direction movements. Individual sprint time, total time to complete the test, blood lactate response (BL), heart rate values (HR) and haemoglobin variables (ΔHHb, ΔtHb, ΔHbO2) were measured.
Integration of research into practice; the Carnegie Adolescent Rugby Research (CARR) project
Oppositely biased glucagon-like peptide-1 receptor agonism does not differentially affect lipid metabolism in APOE*3-Leiden CETP mice
Aims [D3,G40,K41.C16 diacid]exendin-4 (acyl-ExD3) and [F1,G40,K41.C16 diacid]exendin-4 (acyl-ExF1) are oppositely biased glucagon-like peptide-1 (GLP-1) receptor agonists that preferentially promote β-arrestin recruitment or G protein-induced signalling, respectively. The latter is more favourable in glycaemic control and induces a steeper reduction in body weight in diet-induced obese mice. Here, we compared the effects of G protein-biased agonist acyl-ExF1 to those of β-arrestin-biased agonist acyl-ExD3 on lipid metabolism in hyperlipidaemic mice. Materials and methods APOE*3-Leiden.CETP mice were treated with saline, acyl-ExD3 or acyl-ExF1 via intraperitoneal injections for 6 weeks or intracerebroventricular infusion for 18 days. Body weight and composition were monitored at regular intervals, as were plasma glucose, triglyceride and cholesterol levels. At endpoint, mice were injected with very low-density lipoprotein (VLDL)-like particles containing glycerol tri[3H]oleate to study triglyceride-derived fatty acid uptake by peripheral tissues including brown and white adipose tissue (BAT and WAT). Results Upon peripheral treatment, body weight gain was prevented and plasma glucose levels were reduced by acyl-ExF1, but circulating lipids were not affected by either acyl-ExF1 or acyl-ExD3. In contrast, central administration of either agonist strongly reduced plasma triglyceride and cholesterol levels, but did not affect glucose levels. Acyl-ExD3 and acyl-ExF1 increased [3H]oleate uptake by adipose tissue, reaching statistical significance for the uptake by BAT and WAT, respectively, compared to vehicle treatment. Conclusion The oppositely biased GLP-1 receptor agonists acyl-ExD3 and acyl-ExF1 do not differentially affect lipid metabolism in APOE*3-Leiden.CETP mice, while effects on glucose homeostasis and prevention of body weight gain are more pronounced upon peripheral acyl-ExF1 treatment.
Muscle Oxygen Changes following Sprint Interval Cycling Training in Elite Field Hockey Players
This study examined the effects of Sprint Interval Cycling (SIT) on muscle oxygenation kinetics and performance during the 30-15 Intermittent Fitness Test (30-15IFT). Significant changes were seen in the 30-15IFT for the SIT group, but not the CON group (pre = 5.37 ± 0.27 to 5.39 ± 0.30 m·s-1).

The aim of this review was to consolidate and synthesise rugby union (RU) and rugby league (RL) studies on tackle and RU studies on ruck technique for rugby stakeholders. Forty-nine studies were identified (20 in RL and 29 in RU). RL studies primarily focussed on identifying factors that impact tackling ability. Leaner, fitter players, with greater lower body strength, tended to have more proficient tackle technique. Experience and level of play were positively associated with tackling ability. These findings highlight the importance of developing tackle technique and physical qualities to allow players to progress to higher levels. Research in RU mostly focussed on identifying tackle and ruck techniques associated with performance measures and injury outcomes. Eleven tackle techniques and five ball-carrier techniques were associated with both performance measures and injury outcomes. These findings support national injury prevention programmes that advocate that safe contact technique is also effective technique.
Correction to: Quantifying the Collision Dose in Rugby League: A Systematic Review, Meta-analysis, and Critical Analysis
An amendment to this paper has been published and can be accessed via the original article.
The Validity of Automated Tackle Detection in Women's Rugby League
Abstract
Cummins, C, Charlton, G, Naughton, M, Jones, B, Minahan, C, and Murphy, A. The validity of automated tackle detection in women's rugby league.
Variable long-term developmental trajectories of short sprint speed and jumping height in English Premier League academy soccer players: An applied case study
Growth and maturation affect long term physical performance, making the appraisal of athletic ability difficult. We sought to longitudinally track youth soccer players to assess the developmental trajectory of athletic performance over a 6-year period in an English Premier League academy. Age-specific z-scores were calculated for sprint and jump performance from a sample of male youth soccer players (n = 140). A case study approach was used to analyse the longitudinal curves of the six players with the longest tenure. The trajectories of the sprint times of players 1 and 3 were characterised by a marked difference in respective performance levels up until peak height velocity (PHV) when player 1 achieved a substantial increase in sprint speed and player 3 experienced a large decrease. Player 5 was consistently a better performer than player 2 until PHV when the sprint and jump performance of the former markedly decreased and he was overtaken by the latter. Fluctuations in players’ physical performance can occur quickly and in drastic fashion. Coaches must be aware that suppressed, or inflated, performance could be temporary and selection and deselection decisions should not be made based on information gathered over a short time period.
The aim of this study was to identify the changes in movement variability and movement velocity during a six-week training period using a resistance horizontal forward–backward task without (NOBALL) or with (BALL) the constraint of catching and throwing a rugby ball in the forward phase. Eleven elite male rugby union players (mean ± SD: age 25.5 ± 2.0 years, height 1.83 ± 0.06 m, body mass 95 ± 18 kg, rugby practice 14 ± 3 years) performed eight repetitions of NOBALL and BALL conditions once a week in a rotational flywheel device. Velocity was recorded by an attached rotary encoder while acceleration data were used to calculate sample entropy (SampEn), multiscale entropy, and the complexity index. SampEn showed no significant decrease for the NOBALL (ES = -0.64 ± 1.02) and a significant decrease for the BALL (ES = -1.71 ± 1.16; p < 0.007) condition. Additionally, movement velocity showed a significant increase for NOBALL (ES = 1.02 ± 1.05; p < 0.047) and for BALL (ES = 1.25 ± 1.08; p < 0.025) between weeks 1 and 6. The complexity index showed higher levels of complexity in the BALL condition, specifically in the first three weeks. Movement velocity and complex dynamics were adapted to the constraints of the task after a four-week training period. Entropy measures seem a promising signal-processing technique to identify when these exercise tasks should be changed.
Returning to Play after Prolonged Training Restrictions in Professional Collision Sports
Abstract
The COVID-19 pandemic in 2020 has resulted in widespread training disruption in many sports. Some athletes have access to facilities and equipment, while others have limited or no access, severely limiting their training practices. A primary concern is that the maintenance of key physical qualities (e.g. strength, power, high-speed running ability, acceleration, deceleration and change of direction), game-specific contact skills (e.g. tackling) and decision-making ability are challenged, impacting performance and injury risk on resumption of training and competition. In extended periods of reduced training, without targeted intervention, changes in body composition and function can be profound. However, there are strategies that can dramatically mitigate potential losses, including resistance training to failure with lighter loads, plyometric training, exposure to high-speed running to ensure appropriate hamstring conditioning, and nutritional intervention. Athletes may require psychological support given the challenges associated with isolation and a change in regular training routine. While training restrictions may result in a decrease in some physical and psychological qualities, athletes can return in a positive state following an enforced period of rest and recovery. On return to training, the focus should be on progression of all aspects of training, taking into account the status of individual athletes.
Current Research in Rugby
Total Score of Athleticism: Holistic Athlete Profiling to Enhance Decision-Making
ABSTRACT
Oftentimes, the various coaching staff, sport science, and medical practitioners of a sports club require a single, holistic indication of an athlete's athleticism. Currently, there is no consensus on how this is best defined, and thus, a total score of athleticism (TSA) may provide one such method. The TSA is derived from the average of Z-scores (or T-scores in the case of small samples) from a sport-specific testing battery, ensuring athletes are judged across all the relevant fitness capacities that best define the physical demands of competition. To aid readers in using the TSA, this article also details how it is computed in EXCEL.
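The Z-score averaging described above can be sketched in Python. A minimal sketch only: the test names and values are illustrative, and lower-is-better tests (e.g. sprint times) are assumed to be inverted first so that a higher TSA always means a better all-round athlete:

```python
from statistics import mean, stdev

def z_scores(values):
    """Standardise one test's results across the squad."""
    mu, sigma = mean(values), stdev(values)
    return [(v - mu) / sigma for v in values]

def total_score_of_athleticism(battery):
    """battery: {test_name: [one score per athlete, same order in every test]}.
    TSA = mean of each athlete's Z-scores across the whole testing battery."""
    per_test = [z_scores(scores) for scores in battery.values()]
    n_athletes = len(next(iter(battery.values())))
    return [mean(test[i] for test in per_test) for i in range(n_athletes)]

# Illustrative battery for three athletes (hypothetical data).
battery = {
    "10m_sprint_negated_s": [-1.80, -1.75, -1.90],  # times negated: higher = faster
    "cmj_height_cm": [38.0, 42.0, 45.0],
    "yo_yo_distance_m": [1600, 1800, 2200],
}
tsa = total_score_of_athleticism(battery)
```

Because each test's Z-scores are centred on the squad mean, the TSA values sum to zero across the squad; an individual's score shows how far above or below the squad average their overall athleticism sits.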
Applied Research in Practice
Introduction: Effective playing time in soccer is typically < 60 min per game, and while players may reposition themselves when the ball is out of play, the physical demand likely decreases during this period. Therefore, if this period is included when quantifying match demands, the physical requirements of soccer players may be under-reported. This study investigated an alternative method for quantifying external workload called ball in play (BiP), which analyses the data excluding stoppages and thus potentially offers a more insightful analysis of match demands. Methods: Whole match demands, as typically recorded via GPS, were compared to those based on BiP and maximum BiP, with the latter representing worst-case-scenario phases of play. The 25 elite male youth soccer players (age: 17.9 ± 0.6 years; height: 174.8 ± 6.2 cm; body mass: 66.3 ± 8.1 kg) who participated in this study were also categorised into positional groups (defender, midfielder, and forward) to assess differences in positional demands. Results: While no differences were noted based on position, whole match metrics were significantly lower than mean and maximum BiP metrics (p < 0.05). There was also a significant difference in maximum BiP outputs across different in-play durations when comparing 30–60 seconds, 60–90 seconds, and > 90 seconds. Conclusion: These data allow practitioners to gain a deeper understanding of the physical demands imposed on players and plan sessions using targets that better represent match demands.
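The BiP approach described above amounts to restricting the analysis to periods when the ball is in play. A toy sketch of that filtering on GPS-style data follows; the timestamps, per-sample distances and window boundaries are invented for illustration, and real implementations would work on provider-specific exports.

```python
def match_vs_bip_distance(samples, bip_windows):
    """samples: list of (time_s, metres_covered_since_last_sample) records.
    bip_windows: list of (start_s, end_s) intervals when the ball was in play.
    Returns (whole_match_distance, ball_in_play_distance) in metres."""
    whole = sum(d for _, d in samples)
    bip = sum(d for t, d in samples
              if any(start <= t <= end for start, end in bip_windows))
    return whole, bip

# Invented example: four samples, two ball-in-play windows
samples = [(0, 5.0), (10, 6.0), (20, 0.5), (30, 7.0)]
windows = [(0, 12), (25, 40)]
whole, bip = match_vs_bip_distance(samples, windows)
# whole includes the stoppage sample at t = 20; bip excludes it
```

Dividing the BiP distance by total ball-in-play time (rather than match time) gives the intensity-style metrics the study compares against whole-match values.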
Training Loads and Exposure in Late Specialization Sports: Challenges and Solutions
Integrating Research within Sport: Understanding Physical Qualities, Injury Risk, Match Demands & Performance
OBJECTIVES: To examine the interactions between SARS-CoV-2 positive players and other players during rugby league matches and determine within-match SARS-CoV-2 transmission risk. METHODS: Four Super League matches in which SARS-CoV-2 positive players were subsequently found to have participated were analysed. Players were identified as increased-risk contacts, and player interactions and proximities were analysed by video footage and global positioning system (GPS) data. The primary outcome was new positive cases of SARS-CoV-2 within 14 days of the match in increased-risk contacts and other players participating in the matches. RESULTS: Out of 136 total players, there were 8 SARS-CoV-2 positive players, 28 players identified as increased-risk contacts and 100 other players in the matches. Increased-risk contacts and other players were involved in 11.4±9.0 (maximum 32) and 4.0±5.2 (maximum 23) tackles, respectively. From GPS data, increased-risk contacts and other players were within 2 m of SARS-CoV-2 positive players on 10.4±18.0 (maximum 88) and 12.5±20.7 (maximum 121) occasions, totalling 65.7±137.7 (maximum 689) and 89.5±169.4 (maximum 1003) s, respectively. Within 14 days of the match, one increased-risk contact and five players returned positive SARS-CoV-2 reverse transcriptase PCR (RT-PCR) tests, and 27 increased-risk contacts and 95 other participants returned negative SARS-CoV-2 RT-PCR tests. Positive cases were most likely traced to social interactions, car sharing and wider community transmission and not linked to in-match transmission. CONCLUSION: Despite tackle involvements and close proximity interactions with SARS-CoV-2 positive players, in-match SARS-CoV-2 transmission was not confirmed. While larger datasets are needed, these findings suggest rugby presents a lower risk of viral transmission than previously predicted.
Insights and Challenges from the Implementation of a League-Wide GPS Project in Rugby League
BACKGROUND: Collisions (i.e. tackles, ball carries, and collisions) in rugby league have the potential to increase injury risk, delay recovery, and influence individual and team performance. Understanding the collision demands of rugby league may enable practitioners to optimise player health, recovery, and performance. OBJECTIVE: The aim of this review was to (1) characterise the dose of collisions experienced within senior male rugby league match-play and training, (2) systematically and critically evaluate the methods used to describe the relative and absolute frequency and intensity of collisions, and (3) provide recommendations on collision monitoring. METHODS: A systematic search of electronic databases (PubMed, SPORTDiscus, Scopus, and Web of Science) using keywords was undertaken. A meta-analysis provided a pooled mean of collision frequency or intensity metrics on comparable data sets from at least two studies. RESULTS: Forty-three articles addressing the absolute (n) or relative collision frequency (n·min-1) or intensity of senior male rugby league collisions were included. Meta-analysis of video-based studies identified that forwards completed approximately twice the number of tackles per game than backs (n = 24.6 vs 12.8), whilst ball carry frequency remained similar between backs and forwards (n = 11.4 vs 11.2). Variable findings were observed at the subgroup level, with a limited number of studies suggesting wide-running forwards, outside backs, and hit-up forwards complete similar ball carries whilst tackling frequency differed. For microtechnology, at the team level, players complete an average of 32.7 collisions per match. Limited data suggested hit-up and wide-running forwards complete the most collisions per match when compared to adjustables and outside backs.
Relative to playing time, forwards (n·min-1 = 0.44) complete a far greater frequency of collisions than backs (n·min-1 = 0.16), with data suggesting hit-up forwards undertake more than adjustables and outside backs. Studies investigating g-force intensity zones utilised five unique intensity schemes, with zones ranging from 2–3 g to 13–16 g. Given the disparity between device setups and zone classification systems between studies, further analyses were inappropriate. It is recommended that practitioners independently validate microtechnology against video to establish criterion validity. CONCLUSIONS: Video- and microtechnology-based methods have been utilised to quantify collisions in rugby league, with differential collision profiles observed between forward and back positional groups, and their distinct subgroups. The ball carry demands of forwards and backs were similar, whilst tackle demands were greater for forwards than backs. Microtechnology has been used inconsistently to quantify collision frequency and intensity. Despite widespread popularity, a number of the microtechnology devices have yet to be appropriately validated. Limitations exist in using microtechnology to quantify collision intensity, including the lack of consistency and limited validation. Future directions include application of machine learning approaches to differentiate types of collisions in microtechnology datasets.
Monitoring and Evaluating Load in Team Sports
Applying Science to Sports Performance
PURPOSE: The aim of this study was to describe the physical demands during U18 elite basketball games according to the game quarter, and to identify a smaller subset of variables and threshold scores that distinguish players' physical performance in each quarter. METHODS: Data were collected from the ninety-four players who participated in the study (age: 17.4 ± 0.74 years; height: 199.0 ± 0.1 cm; body mass: 87.1 ± 13.1 kg), competing in the Euroleague Basketball Next Generation Tournament. Players' movements during the games were measured using a portable local positioning system (LPS) (WIMU PRO®, Realtrack Systems SL, Almería, Spain) and included relative distance (total distance / playing duration), relative distance in established speed zones, high-intensity running (18.1–24.0 km·h-1), sprinting (> 24.1 km·h-1), player load, peak speed (km·h-1), peak acceleration (m·s-2), number of total accelerations and total decelerations, and high-intensity accelerations (> 2 m·s-2) and decelerations (< -2 m·s-2). RESULTS: There was an overall decrease in distance covered, player load, and number of high-intensity accelerations and decelerations between the first and last quarter of the games in all playing positions. A classification tree analysis showed that first-quarter performance was characterised by distance covered (above 69.0 m), distance covered at < 6.0 km·h-1 and accelerations (> 2 m·s-2), whereas fourth-quarter performance was characterised by distance covered (below 69.0 m) and distance covered at 12.1–18.0 km·h-1. CONCLUSIONS: A significant reduction in physical demands occurs during elite U18 basketball games, especially between the first and last quarters, for players in all playing positions.
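The classification tree analysis above identifies threshold scores (such as 69.0 m of distance covered) that separate quarters. At its core this is a threshold search minimising node impurity; the sketch below implements a single-split "stump" with the Gini criterion on invented data, whereas the study would have used full classification tree software.

```python
def best_threshold(values, labels):
    """One-level classification 'stump': find the split point on a single
    variable that best separates two classes (labels 0/1) by Gini impurity."""
    def gini(group):
        n = len(group)
        if n == 0:
            return 0.0
        p = sum(group) / n           # proportion of class 1 in this node
        return 2 * p * (1 - p)

    pairs = sorted(zip(values, labels))
    best_score, best_thr = float("inf"), None
    for i in range(1, len(pairs)):
        thr = (pairs[i - 1][0] + pairs[i][0]) / 2  # midpoint candidate split
        left = [l for v, l in pairs if v <= thr]
        right = [l for v, l in pairs if v > thr]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(pairs)
        if score < best_score:
            best_score, best_thr = score, thr
    return best_thr

# Invented relative distances (m/min), labelled 1 = first quarter, 0 = fourth
dist = [72.0, 70.5, 69.5, 68.0, 66.0, 64.5]
quarter = [1, 1, 1, 0, 0, 0]
thr = best_threshold(dist, quarter)  # split lands between 68.0 and 69.5
```

A full tree repeats this search recursively on each resulting node and across all candidate variables.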
The Training Recovery Cycle: Applying Research to Practice
Physical demands of elite basketball during an official U18 international tournament
The aims of this study were (a) to compare players’ physical demands between different playing positions in elite U18 basketball games and (b) to identify different clusters of performance. Data were collected from 94 male subjects (age: 17.4 ± 0.7 years) competing in a Euroleague Basketball Tournament. Guards covered a greater relative distance than centres and forwards (small to moderate effect). Forwards and guards had more peak accelerations, high accelerations and high decelerations than centres (moderate to large effects). A cluster analysis classified all cases into three different groups (Lower, Medium and Higher activity demands), containing 37.4%, 52.8% and 9.8% of the cases, respectively. The high accelerations, high decelerations, peak accelerations and total distance covered were the variables that contributed most to classifying the players into the new groups. The percentage of cases distributed in the clusters differed according to playing position, game type (worst vs worst, mixed opposition, best vs best) and team. Centres have lower physical demands than guards, especially in the number of accelerations and decelerations at high intensity and the peak acceleration. Each team has a different activity profile, which does not seem to influence the tournament outcome.
Using principal component analysis to develop performance indicators in professional rugby league
Previous research on performance indicators in rugby league has suggested that dimension reduction techniques should be utilised when analysing sporting data sets with a large number of variables. Forty-five rugby league team performance indicators, from all 27 rounds of the 2012, 2013 and 2014 European Super League seasons, collected by Opta, were reduced to 10 orthogonal principal components, with standardised team scores produced for each component. Forced-entry logistic (match outcome) and linear (points difference) regression models were used alongside exhaustive chi-square automatic interaction detection decision trees to determine how well each principal component predicted success. The 10 principal components explained 81.8% of the variance in points difference and classified match outcome correctly ~90% of the time. Results suggested that if a team increased “amount of possession” and “making quick ground” component scores, they were more likely to win (β = 15.6, OR = 10.1 and β = 7.8, OR = 13.3, respectively). Decision trees revealed that “making quick ground” was an important predictor of match outcome, followed by “quick play” and “amount of possession”. The use of PCA provided a useful guide on how teams can increase their chances of success by improving performances on a collection of variables, instead of analysing variables in isolation.
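For readers unfamiliar with the dimension reduction step, the first principal component of a standardised indicator matrix can be found by power iteration on its covariance matrix, and each team's standardised component score is the projection onto that component. The sketch below is pure Python and illustrative only: the study reduced 45 indicators to 10 components, which calls for a full eigendecomposition in statistical software, and the two-indicator data here are invented.

```python
import statistics

def standardise(col):
    m, s = statistics.mean(col), statistics.stdev(col)
    return [(x - m) / s for x in col]

def first_principal_component(rows):
    """rows: list of observations (e.g. one per team-match), each a tuple of
    indicator values. Returns (loadings, component_scores) for the first PC,
    via power iteration on the covariance matrix of the standardised data."""
    cols = [standardise(list(c)) for c in zip(*rows)]
    n, p = len(rows), len(cols)
    cov = [[sum(cols[i][k] * cols[j][k] for k in range(n)) / (n - 1)
            for j in range(p)] for i in range(p)]
    v = [1.0] * p
    for _ in range(200):  # converges to the dominant eigenvector
        w = [sum(cov[i][j] * v[j] for j in range(p)) for i in range(p)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    scores = [sum(v[i] * cols[i][k] for i in range(p)) for k in range(n)]
    return v, scores

# Invented two-indicator example where the indicators move together,
# so both loadings come out equal (1/sqrt(2) each)
rows = [(40.0, 1200.0), (45.0, 1350.0), (50.0, 1500.0), (55.0, 1650.0)]
loadings, scores = first_principal_component(rows)
```

Subsequent components are found the same way after removing (deflating) each fitted component, which is why the 10 retained components are orthogonal.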
Perspectives of applied collaborative sport science research within professional team sports
The purpose of the study was to examine the perspectives of both academics and practitioners in relation to forming applied collaborative sport science research within team sports. Ninety-three participants who had previously engaged in collaborative research partnerships within team sports completed an online survey which focused on motivations and barriers for forming collaborations, using blinded sliding scales (0–100) and rank-order lists. Research collaborations were mainly formed to improve team performance (Academic: 73.6 ± 23.3; Practitioner: 84.3 ± 16.0; effect size (ES) = 0.54, small). Academics ranked the importance of journal articles significantly higher than practitioners did (Academic: Mrank = 53.9; Practitioner: 36.0; z = −3.18, p = .001, p < q). However, practitioners rated one-to-one communication as more preferential (Academic: Mrank = 41.3; Practitioner: 56.1; z = −2.62, p = .009, p < q). Some potential barriers were found in terms of staff buy-in (Academic: 70.0 ± 25.5; Practitioner: 56.8 ± 27.3; ES = 0.50, small) and funding (Academic: 68.0 ± 24.9; Practitioner: 67.5 ± 28.0; ES = 0.02, trivial). Both groups revealed low motivation for invasive mechanistic research (Academic: 36.3 ± 24.2; Practitioner: 36.4 ± 27.5; ES = 0.01, trivial), with practitioners having a preference towards ‘fast’ type research. There was general agreement between academics and practitioners on forming research collaborations. Some potential barriers still exist (e.g. staff buy-in and funding), with practitioners preferring ‘fast’ informal research dissemination compared to the ‘slow’ quality-control approach of academics.
Tackle and ruck technique proficiency within academy and senior club rugby union
This study examined the validity of a tool that assesses tackle and ruck technique in training and established reference data for tackle, ball-carry and ruck technique at different levels of play in rugby union. One hundred and thirty-one amateur rugby union players (37 senior, 51 first-grade academy and 43 second-grade academy players) participated in a two-on-two contact drill. The drill was filmed and the players’ tackle, ball-carry and ruck technique were assessed using standardized technical criteria. Senior-level players scored significantly higher in all three assessments: tackle technique senior vs academy 1st (p < 0.01, effect size (ES) = 0.7, moderate), senior vs academy 2nd (p < 0.01, ES = 0.7, moderate); ball-carry technique senior vs academy 1st (p < 0.01, ES = 0.6, moderate), senior vs academy 2nd (p < 0.01, ES = 0.8, moderate); ruck technique senior vs academy 1st (p < 0.01, ES = 0.7, moderate), senior vs academy 2nd (p < 0.01, ES = 0.4, small). These findings emphasize the importance of developing contact technique to allow players to progress to higher levels, and provide validity for an assessment tool which can facilitate this process.
Fatigue and Recovery in Youth Rugby
Identifying and understanding the fatigue response of youth rugby players following training and competition is necessary to avoid chronic fatigue, underperformance, and injury. There is an abundance of monitoring tools that have been used in youth rugby cohorts to describe the fatigue response. Tests typically include assessment of lower- and upper-body neuromuscular function, biochemical markers, and athlete self-reported measures. Researchers in youth rugby have typically assessed fatigue measures following training, single matches, and during periods of congested fixtures. The fatigue response following training depends upon the training volume and design (e.g., non-contact vs. contact). Fatigue is present for up to 72 hours following single matches and has the potential for accumulation during periods of congested fixtures. Despite the limited number of studies investigating fatigue and recovery in youth rugby players, this chapter presents the current findings from the literature with the aim to provide an understanding of how youth rugby players recover from competition. This information can help to guide training prescription and recovery methods when youth athletes may be most vulnerable to performance decrements or negative health consequences.
Research and Innovation in Sport, Creating a Big Impact in Rugby
This study investigated the relationship between physical qualities and contact technique proficiency in rugby union players. Thirty-eight (n = 38) male academy rugby union players participated in the study. Physical measures of anthropometry, functional mobility, strength endurance, strength, power, speed, agility, and anaerobic and aerobic endurance were assessed. Tackler, ball-carrier, and ruck technique were assessed using video analysis of a standardised two-on-two contact drill. Seven physical qualities were moderately associated with tackler technique: Push-ups (r2 = 0.2; β = 0.04; p = 0.005; ES = 0.26), Sit-ups (r2 = 0.2; β = 0.08; p = 0.004; ES = 0.27), Relative 1RM Bench Press (r2 = 0.2; β = 2.32; p = 0.003; ES = 0.29), Broad Jump (r2 = 0.2; β = 0.03; p = 0.009; ES = 0.22), Agility (r2 = 0.2; β = −0.47; p = 0.019; ES = 0.19), 40m-Speed with Ball (r2 = 0.1; β = 0.93; p = 0.027; ES = 0.16) and Functional Mobility (r2 = 0.2; β = 0.16; p = 0.007; ES = 0.25). There was a large association between ball-carrier technique and Medicine Ball Throw (r2 = 0.3; β = 1.13; p = 0.001; ES = 0.37), and a moderate association between ruck technique and agility without (r2 = 0.2; β = −0.75; p = 0.005; ES = 0.21) and with (r2 = 0.2; β = −0.55; p = 0.015; ES = 0.29) the ball. The findings demonstrate the important contribution of physical strength and conditioning to contact technique in rugby union players. Contact technique training should be accompanied by physical strength and conditioning, as improvements in physical qualities may serve as foundational components to underpin improvements in technique.
Objectives: Effective tackle technique is associated with reduced injury risk and improved performance in contact. Injury prevention programmes aim to provide players with knowledge of effective technique. However, little is known of the impact of this knowledge on a player's technique in the tackle. This study aimed to determine the association between knowledge of proper tackle technique and tackle technique proficiency in training. Methods: Fifty-three rugby union players participated in a tackle contact drill and, thereafter, completed a questionnaire. The drill was filmed, and the players' tackle and ball-carry technique were assessed using standardised technical proficiency criteria. In the questionnaire, the players were asked to rate the importance of each tackle and ball-carry technique on a 5-point Likert scale, for both injury prevention and performance tackle outcomes. Linear regression was performed to assess the relationship between knowledge of the importance of proper tackle technique and tackle technique proficiency during the drill. Results: No association was found between players' knowledge of the importance of proper technique and tackle contact technique in training, for either injury prevention or performance. Conclusion: The lack of association between players' knowledge and actual tackle contact technique reveals the gap between knowledge of safe and effective techniques and knowledge of how to execute those techniques.
Coach and player rating of perceived challenge (RPC) as a skill monitoring tool in rugby union
To determine the relationship between player and coach rating of perceived challenge (RPC) for different training sessions over a competitive rugby union season. A secondary aim was to explore the relationship between player RPC and player session rating of perceived exertion (RPE). We used an observational longitudinal study design to monitor 51 male highly-trained under-21 rugby union players and four coaches over an 11-week competitive rugby season (a total of 1798 training session observations). Player RPC (0 to 10 arbitrary units (AU)) and RPE ratings (0 to 10 AU) were collected after team sessions (a technical and tactical field-based session with all players training together), split sessions (a technical and tactical field-based session where players trained separately according to their positional grouping (forwards and backs)) and gym sessions (non-field-based sessions with all players training together). Coach RPC ratings were only collected after team sessions and split sessions. A weak positive relationship (rho = 0.26; 95% CI: 0.09–0.42; p < .001) was found for split sessions (player RPC: 4.40; 95% CI: 3.87–4.87 AU; coach RPC: 4.25; 95% CI: 3.92–4.60 AU), while a moderate positive relationship (rho = 0.37; 95% CI: 0.31–0.43; p < .001) was found between player RPC (4.29; 95% CI: 4.00–4.55) and coach RPC (4.96; 95% CI: 4.89–5.05) for team sessions. Forwards had a higher RPC (5.33; 95% CI: 4.50–5.65) compared to backs (3.45; 95% CI: 2.88–4.00) for split (p < .001) and team sessions (forwards' RPC: 4.66; 95% CI: 4.37–4.94; backs' RPC: 3.84; 95% CI: 3.38–4.26; p < .001). In conclusion, using a rating to quantify the perceived challenge of training, we found coaches may be overestimating how challenging their training sessions are. Forwards rated field sessions as more challenging than backs, which likely represents their additional technical and tactical demands from training scrums, line-outs and mauls.
While the RPC has strong theoretical justification as a rating tool to potentially fill the gap in quantifying the perceived challenge of training, thorough validity studies are yet to be conducted on the scale; these are the required next steps if the RPC is going to form part of a coach's and practitioner's toolbox to optimise skill training.
Objective: The tackle is the most injurious event in rugby league and carries the greatest risk of concussion. This study aims to replicate previous research conducted in professional men's rugby league by examining the association between selected tackle characteristics and head impact events (HIEs) in women's professional rugby league. Methods: We reviewed and coded 83 tackles resulting in an HIE and every tackle (6,318 tackles) that did not result in an HIE for three seasons (2018–2020) of the National Rugby League Women's (NRLW) competition. Tackle height, body position of the tackler and ball carrier, as well as the location of head contact with the other player's body were evaluated. Propensity of each situation that caused an HIE was calculated as HIEs per 1,000 tackles. Results: The propensity for tacklers to sustain an HIE was 6.60 per 1,000 tackles (95% CI: 4.87–8.92), similar to that of the ball carrier (6.13 per 1,000 tackles, 95% CI: 4.48–8.38). The greatest risk of an HIE to either the tackler or ball carrier occurred when head proximity was above the sternum (21.66 per 1,000 tackles, 95% CI: 16.55–28.35). HIEs were most common following impacts between two heads (287.23 HIEs per 1,000 tackles, 95% CI: 196.98–418.84). The lowest propensity for both tackler (2.65 per 1,000 tackles, 95% CI: 0.85–8.20) and ball carrier HIEs (1.77 per 1,000 tackles, 95% CI: 0.44–7.06) occurred when the head was in proximity to the opponent's shoulder and arm. No body position (upright, bent or unbalanced/off feet) was associated with an increased propensity of HIE to either tackler or ball carrier. Conclusions: In the NRLW competition, tacklers and ball carriers have a similar risk of sustaining an HIE during a tackle, differing from men's NRL players, where tacklers have a higher risk of HIEs. Further studies involving larger samples are needed to validate these findings.
However, our results indicate that injury prevention initiatives in women's rugby league should focus on how the ball carrier engages in contact during the tackle as well as how the tackler executes the tackle.
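Propensity in the study above is an event rate per 1,000 tackles with a 95% confidence interval. One common way to compute such an interval treats the event count as Poisson and uses a log-normal approximation; the sketch below uses invented counts, not the study's data, and is not necessarily the exact method the authors applied.

```python
import math

def propensity_per_1000(events, tackles):
    """Event rate per 1,000 tackles with an approximate 95% CI
    (Poisson count, log-normal approximation)."""
    rate = events / tackles * 1000.0
    half_width = 1.96 / math.sqrt(events)  # half-width on the log scale
    return rate, rate * math.exp(-half_width), rate * math.exp(half_width)

# Invented counts: 42 head impact events across 6,401 tackles
rate, low, high = propensity_per_1000(42, 6401)
```

The log-scale interval is asymmetric around the rate, which matches the shape of the intervals reported in the abstract (wider above the estimate than below).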
This review and meta‐analysis aimed to describe the current rugby‐7s injury epidemiological literature by examining injury data from both sexes, all levels of play, and their associated risk factors. Studies published up until March 2024 were included. These studies were retrieved from six databases using search terms related to rugby‐7s or sevens, tackle, collision, collision sport, injury, athlete, incidence rate, mechanism, and risk factor. Only peer‐reviewed original studies using prospective or retrospective cohort designs with a clearly defined rugby‐7s sample were considered. Included studies needed to report one injury outcome variable. Non‐English and qualitative studies, reviews, conference papers, and abstracts were excluded. Twenty studies were included. The meta‐analysis used the DerSimonian–Laird continuous random‐effects method to calculate the pooled estimated means and 95% confidence interval. The estimated mean injury incidence rate for men was 108.5/1000 player‐hours (95% CI: 85.9–131.0) and 76.1/1000 player‐hours (95% CI: 48.7–103.5) for women. The estimated mean severity for men was 33.9 days (95% CI: 20.7–47.0) and 44.2 days (95% CI: 32.1–56.3) for women. Significantly more match injuries occurred in the second half of matches, were acute, located at the lower limb, diagnosed as joint/ligament, and resulted from being tackled. Fatigue, player fitness, and previous injuries were associated with an increased risk of injury. There were no statistically significant differences between women's and men's injury profiles. However, the inherent cultural and gendered factors which divide the two sports should not be ignored. The findings from this review will help pave the way forward beyond the foundational stages of injury prevention research in rugby‐7s.
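The DerSimonian–Laird estimator mentioned above pools per-study means by weighting each study with the inverse of its within-study variance plus an estimated between-study variance (tau-squared). A compact sketch follows, with invented study estimates rather than the review's data:

```python
import math

def dersimonian_laird(means, ses):
    """Pool study means under a DerSimonian-Laird random-effects model.
    means: per-study estimates; ses: their standard errors.
    Returns (pooled_mean, ci_low, ci_high) with a 95% CI."""
    v = [se * se for se in ses]
    w = [1.0 / vi for vi in v]                    # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * m for wi, m in zip(w, means)) / sw
    q = sum(wi * (m - fixed) ** 2 for wi, m in zip(w, means))  # Cochran's Q
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - (len(means) - 1)) / c)   # between-study variance
    wr = [1.0 / (vi + tau2) for vi in v]          # random-effects weights
    pooled = sum(wi * m for wi, m in zip(wr, means)) / sum(wr)
    se_pooled = math.sqrt(1.0 / sum(wr))
    return pooled, pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled

# Invented incidence rates (per 1,000 player-hours) and standard errors
pooled, ci_low, ci_high = dersimonian_laird([110.0, 95.0, 120.0],
                                            [12.0, 15.0, 10.0])
```

When the between-study heterogeneity estimate is zero, the weights collapse to the fixed-effect case; otherwise larger tau-squared pulls the weights toward equality across studies.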
Objectives: To provide normative reference values for the Sport Concussion Assessment Tool – 5th Edition for elite-level male rugby league players. Design: A descriptive cross-sectional study. Methods: Baseline Sport Concussion Assessment Tool – 5th Edition scores were obtained from 1005 National Rugby League players during the 2018 and 2019 preseasons. Normative values were calculated for the Standardized Assessment of Concussion, Symptom Evaluation (i.e., severity and number), and the Modified Balance Error Scoring System for each group and in total. Players self-identified their cultural heritage or ethnicity to be ‘Pasifika (Pacific Islander) or Maori’ (n = 243; 24.2 %) or ‘Indigenous Australian’ (n = 82; 8.2 %). Those who identified as any other race, ethnicity, or cultural heritage were combined into a single group (n = 680; 67.7 %). Results: In total the median Standardized Assessment of Concussion score was 27 (interquartile range = 25–28), the median symptom severity was 0 (interquartile range = 0–2), the median symptom number was 0 (interquartile range = 0–1), and the median Modified Balance Error Scoring System error score was 3 (interquartile range = 1–5). Reporting 4 of 22 symptoms and 6 of 132 on the total severity score was uncommon. There was no significant difference between the cultural heritage or ethnicity groups for Standardized Assessment of Concussion scores, symptom severity or number, or Modified Balance Error Scoring System errors (p-values > 0.05). Conclusions: This normative data will assist with the clinical interpretation of Sport Concussion Assessment Tool – 5th Edition scores following a concussion in the National Rugby League.
Letter: The physiological impact of an N‐terminal Halo‐tag on glucose‐dependent insulinotropic polypeptide receptor function in mice
The role of glucose dependent insulinotropic polypeptide (GIP) signalling in metabolic homeostasis is a hot area of research. The study of GIP receptor (GIPR) expression and trafficking, vital aspects in understanding GIPR physiology and pharmacology, has been hindered by the lack of validated GIPR antibodies, meaning that studies have relied on the presence of Gipr mRNA to determine GIPR expression in mouse and human tissue [1–3]. The development of fluorophore-conjugated GIPR agonists has provided some insights into endogenous GIPR distribution and behaviour [3,4], but offers only indirect evidence of receptor localisation and may be misleading in some contexts, for example, if agonists dissociate from receptors after entering the endocytic pathway.
The purpose of this study was to compare tackle and ruck frequencies between pool and knockout matches during the Men's International World Rugby Sevens Series and also determine which technical determinants increase the likelihood of tackle success within each stage of the tournament. Video analysis of all matches during the 2018/2019 International Men's Rugby Sevens World Series was conducted (n = 449 matches). This equated to 21,226 tackle contact events and 6,345 ruck events. Each tackle event was further coded for tackle descriptors (type of tackle, direction of contact and point of body contact) and tackle outcomes (successful and unsuccessful). No differences were found between the mean tackles per match of pool and knockout stages (pool 47.5, 95% CI 46.5–48.6 vs. knockout 46.9, 95% CI 45.7–48.0). There was a significant difference (p < 0.001) in mean rucks per match between pool and knockout stages (pool 14.8, 95% CI 14.2–15.4 vs. knockout 13.3, 95% CI 12.7–13.9). In conclusion, tackle frequencies per match remained consistent across the series and between the different competition stages and match halves. Ruck frequencies, on the other hand, decreased from the first tournament to latter parts of the series, and fewer rucks were observed in the knockout stage of the tournaments. The frequency and higher likelihood of tackle success for arm tackles in Sevens highlight a unique demand of Sevens, which strengthens the argument for Sevens‐specific tackle training and coaching.
The conversion rate of junior athletes to senior performance is typically used to judge efficiency in talent development. This study aimed to calculate the conversion rate of English academy rugby union players conferred with 'high-potential' 'England Academy Player' status, distinguishing them from their age-matched peers, from academy selection into the Premiership, the elite domestic league. Conversion was calculated overall and between academies, considering when players were conferred high-potential status. In total, 3127 male players were sampled. Players conferred with high-potential status early (prior to achieving a senior contract) accounted for 68.03% of elite debuts and had increased odds of converting compared to other player sub-populations (OR: 5.82, 95% CI: 4.60–7.37). Regression analysis inferred that acquiring high-potential status at younger ages increased the odds of a Premiership appearance by 2.501. However, delimiting to players achieving senior contracts, players conferred 'high-potential' after accomplishing a senior contract had a greater relative conversion (46.34%) compared to early-status players (40.27%). Academy conversion rates and contribution to net development varied considerably (14.63–45.28% and 2.72–12.84%, respectively). Findings suggest that whilst high-potential status influences progression, other important factors influence player development. Results demonstrate the need to consider how the specific context of each academy within the talent system influences player development.
Chronic GIPR agonism results in pancreatic islet GIPR functional desensitisation
Objectives: There is renewed interest in targeting the glucose-dependent insulinotropic polypeptide receptor (GIPR) for treatment of obesity and type 2 diabetes. G-protein coupled receptor desensitisation is suggested to reduce the long-term efficacy of glucagon-like-peptide 1 receptor (GLP-1R) agonists and may similarly affect the efficacy of GIPR agonists. We explored the extent of pancreatic GIPR functional desensitisation with sustained agonist exposure. Methods: A long-acting GIPR agonist, GIP108, was used to probe the effect of sustained agonist exposure on cAMP responses in dispersed pancreatic islets using live cell imaging, with rechallenge cAMP responses after prior agonist treatment used to quantify functional desensitisation. Receptor internalisation and β-arrestin-2 activation were investigated in vitro using imaging-based assays. Pancreatic mouse GIPR desensitisation was assessed in vivo via intraperitoneal glucose tolerance testing. Results: GIP108 treatment led to weight loss and improved glucose homeostasis in mice. Prolonged exposure to GIPR agonists produced homologous functional GIPR desensitisation in isolated islets. GIP108 pre-treatment in vivo also reduced the subsequent anti-hyperglycaemic response to GIP re-challenge. GIPR showed minimal agonist-induced internalisation or β-arrestin-2 activation. Conclusions: Although GIP108 chronic treatment improved glucose tolerance, it also resulted in partial desensitisation of the pancreatic islet GIPR. This suggests that ligands with reduced desensitisation tendency might lead to improved in vivo efficacy. Understanding whether pancreatic GIPR desensitisation affects the long-term benefits of GIPR agonists in humans is vital to design effective metabolic pharmacotherapies.
Objectives To investigate the network of stakeholders involved in rugby union research across the globe. Methods Using author affiliations listed on scientific publications, we identified the organisations that contributed to rugby union research from 1977 to 2022 and examined collaboration through co-authorship indicators. We determined the locations and sectors of identified organisations and constructed a collaboration network. Network metrics, including degree centrality and betweenness centrality, were computed to identify influential organisations and measure intersector collaboration. Results There has been an increase in scientific knowledge creation and collaboration between organisations for rugby union research over time. Among the sectors, the university, professional sports team and sports governing body sectors exhibit the highest intersectoral and intrasectoral density. Predominantly, influential actors are located in England, Australia, France, New Zealand, Ireland and South Africa. Australian Catholic University, Leeds Beckett University, Stellenbosch University, Swansea University, University College London and the University of Cape Town emerge as influential actors between 2016 and 2022. Conclusions Our study underscores the ongoing growth of scientific knowledge generation in rugby union, primarily led by organisations in tier 1 rugby-playing nations within the university sector. Intersectoral collaboration with sports governing bodies plays a crucial role, acting as a broker between sectors. However, the overall level of collaboration between and within sectors is low. These results highlight an opportunity for improved collaboration, as the organisations driving knowledge creation have been identified.
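Degree centrality and betweenness centrality, the two metrics named in the Methods, can be sketched on a toy collaboration network. This brute-force pure-Python version is illustrative for small graphs only and is not the bibliometric tooling used in the study; the organisation names in the toy graph are invented:

```python
from collections import deque
from itertools import combinations

def degree_centrality(adj):
    # Fraction of the other nodes each node is directly connected to.
    n = len(adj)
    return {v: len(nbrs) / (n - 1) for v, nbrs in adj.items()}

def shortest_paths(adj, s, t):
    # BFS keeping every predecessor on a shortest path, then unwinding.
    dist, preds = {s: 0}, {s: []}
    q = deque([s])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                preds[w] = [u]
                q.append(w)
            elif dist[w] == dist[u] + 1:
                preds[w].append(u)
    if t not in dist:
        return []
    out = []
    def unwind(v, path):
        if v == s:
            out.append([s] + path)
            return
        for p in preds[v]:
            unwind(p, [v] + path)
    unwind(t, [])
    return out

def betweenness(adj):
    # Sum over node pairs of the fraction of shortest paths through v.
    nodes = list(adj)
    bc = {v: 0.0 for v in nodes}
    for s, t in combinations(nodes, 2):
        paths = shortest_paths(adj, s, t)
        if not paths:
            continue
        for v in nodes:
            if v in (s, t):
                continue
            bc[v] += sum(1 for p in paths if v in p) / len(paths)
    return bc

# Toy network: a governing body "brokering" between two clusters
adj = {
    "GoverningBody": {"UniA", "UniB", "ClubA", "ClubB"},
    "UniA": {"GoverningBody", "UniB"},
    "UniB": {"GoverningBody", "UniA"},
    "ClubA": {"GoverningBody", "ClubB"},
    "ClubB": {"GoverningBody", "ClubA"},
}
```

In this toy graph the governing body has maximal degree centrality and carries all shortest paths between the two clusters, mirroring the "broker between sectors" role described in the conclusions.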
Trial matches are frequently used for team preparation in rugby league competitions, making it essential to understand the demands experienced to assess their specificity to actual competition. Consequently, this study aimed to compare the activity demands between pre-season trial matches and early in-season rugby league matches. Following a repeated-measures observational design, 39 semi-professional, male rugby league players from two clubs were monitored using microsensors during two trial matches and the first two in-season matches across two consecutive seasons. Total distance, average speed, peak speed, absolute and relative high-speed running (HSR; > 18 km · h-1) and low-speed running (LSR; < 18 km · h-1) distance, as well as absolute and relative impacts, accelerations (total and high-intensity > 3 m · s-2), and decelerations (total and high-intensity < -3 m · s-2) were measured. Linear mixed models and Cohen's d effect sizes were used to compare variables between match types. Playing duration was greater for in-season matches (p < 0.001, d = 0.64). Likewise, higher (p < 0.001, d = 0.45-0.70) activity volumes were evident during in-season matches indicated via total distance, HSR distance, LSR distance, total accelerations, high-intensity accelerations, total decelerations, and high-intensity decelerations. Regarding activity intensities, a higher average speed (p = 0.008, d = 0.31) and relative LSR distance (p = 0.005, d = 0.31) only were encountered during in-season matches. Despite players completing less volume, the average activity intensities and impact demands were mostly similar between trial and early in-season matches. These findings indicate trial matches might impose suitable activity stimuli to assist players in preparing for early in-season activity intensities.
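Cohen's d, the effect size reported alongside the linear mixed models above, is a standardised mean difference using a pooled SD. A minimal sketch with made-up values (the example numbers are not from the study):

```python
import math
import statistics as st

def cohens_d(group1, group2):
    """Cohen's d with a pooled sample SD: (mean2 - mean1) / SD_pooled."""
    n1, n2 = len(group1), len(group2)
    s1, s2 = st.stdev(group1), st.stdev(group2)
    pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (st.mean(group2) - st.mean(group1)) / pooled

# Hypothetical total-distance values (m/min) for trial vs in-season matches
d = cohens_d([102, 98, 105, 99], [108, 104, 111, 106])
```

Positive d here indicates the second condition (in-season) has the higher mean, matching the direction of the comparisons reported above.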
Background: In rugby league (RL), the ability to repeatedly engage in the tackle, whether as a ball carrier or tackler, is essential for team success and player performance. The tackle is also the leading cause of injury, with over 90% of total injuries occurring during the tackle in professional and amateur cohorts. To effectively reduce the risk of injury and optimise performance, establishing the extent of the 'problem' through injury surveillance or descriptive performance studies is required. Objective: The purpose of this narrative synthesis was to systematically search and synthesise tackle injury epidemiology and tackle performance frequency in RL. To achieve this objective, a systematic review was conducted. Methods: The search was limited to English-only articles published between January 1995 and October 2018. Based on the search criteria, a total of 53 studies were found: 32 focused on tackle injury epidemiology (nine case studies) and 21 focused on tackle frequency. Results: In general, over 600 tackles may occur during an RL match. Tackle injury frequencies (both overall and time-loss injuries) ranged from 47% to 94% at the professional level, and from 38% to 96% at lower levels of play. A greater proportion of injuries in professional RL are severe time-loss injuries when compared to lower levels of play. Most time-loss and overall injuries occur to players who are tackled, i.e., ball carriers, across all levels of play. Conclusion: This narrative synthesis will facilitate tackle injury prevention and performance research in RL, and act as a reference document for coaches and practitioners.
Background: The risk of concussion at the elite level of rugby league has been extensively evaluated. However, very little concussion research has been conducted at the semi-elite level. Purpose: To examine cases of medically diagnosed concussion from a single season of adult men's semi-elite rugby league. Methods: A retrospective review of the 2019 Queensland Cup season head injury assessment surveillance program was completed. All Head Injury Assessment (HIA) cases, including cases of medically diagnosed concussion, were retrospectively video reviewed, and game-play characteristic variables, along with video signs of concussion, were coded. These data were combined with the return-to-play data to form the research database. Results: There were 132 players removed for HIAs in 170 games. Thirty-six players were medically diagnosed with concussions, which equates to an incidence rate of 6.11 concussions per 1000 player match hours, or one concussion every 4.7 matches. All concussions occurred in a tackle event in which the player was struck in the head/face. Possible balance disturbance was the most commonly observed video sign (97.2%; 35/36), with slow to stand also commonly observed in concussed players (91.7%; 33/36). Most concussed players (63.9%; 23/36) did not miss a game following the concussion. Conclusion: This is one of the first studies to review video footage of concussions in semi-elite rugby league. These findings build on the growing body of video analysis research in rugby league and suggest that retrospective review of video of incidents may offer insights into modifiable risk factors that could help reduce concussion in rugby league.
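The reported incidence rate can be reproduced from the abstract's own figures, assuming the usual rugby league exposure model of 26 players on the field (13 per side) for an 80-minute match; those exposure assumptions are standard for the sport but are not stated in the abstract:

```python
games = 170
concussions = 36
players_on_field = 26   # 13 per side in rugby league (assumption)
match_minutes = 80      # standard senior match length (assumption)

# Player match hours = games x players x match duration in hours
exposure_hours = games * players_on_field * match_minutes / 60
rate = concussions / exposure_hours * 1000   # per 1000 player match hours
games_per_concussion = games / concussions

print(round(rate, 2), round(games_per_concussion, 1))  # → 6.11 4.7
```

Both headline figures (6.11 per 1000 player match hours; one concussion every 4.7 matches) fall out of this arithmetic.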
OBJECTIVES: This study analysed the overall sentiment of attitudes, opinions, views and emotions expressed in posts on X related to red-carded and yellow-carded tackles during the 2019 Rugby World Cup (RWC). METHODS: Sentiment analysis was conducted on posts on X about red or yellow cards issued at the 2019 RWC. Posts were classified as 'agree', 'disagree' and 'neutral'. The frequency of posts, red cards, yellow cards, all injuries, tackle injuries and total number of tackles per match were also synced to the 45-match playing schedule. RESULTS: Five tackle-related red cards and 15 tackle-related yellow cards were issued during the 2019 RWC, with 377 and 302 posts identified for each card type, respectively. For red cards, 42% of posts (n=158/377) agreed with the referee's decision, 19% (n=71/377) disagreed and 40% were neutral. For yellow cards, 24% (n=73/302) agreed with the referee's decision, 33% (n=99/302) disagreed and 43% were neutral. CONCLUSIONS: For red cards, posts were 2.2 times more likely to agree with the referee's decision than disagree. Posts that agreed with a red card decision were also more likely to be shared (reposted) than posts that disagreed. In contrast, sentiments expressed for yellow card decisions were mixed. This may be related to interpreting the degree of danger and whether mitigation is applied. Within the ecosystem of rugby, sharing sentiments on social media plays a powerful role in creating a positive player welfare narrative.
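The headline "2.2 times more likely to agree" figure can be checked directly from the per-post red-card counts reported (n/377):

```python
# Red-card post counts as reported in the abstract
agree, disagree, total = 158, 71, 377
neutral = total - agree - disagree

ratio = agree / disagree  # agree-to-disagree likelihood ratio
print(round(ratio, 1), neutral)  # → 2.2 148
```

The percentages also reconcile: 158/377 ≈ 42% agree, 71/377 ≈ 19% disagree, with the remaining 148 posts (≈40%) neutral.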
A metabolic comparison of GIPR agonism versus GIPR antagonism in male mice
Abstract
Aims
Targeting the glucose-dependent insulinotropic polypeptide receptor (GIPR) is of renewed interest for the treatment of obesity and type 2 diabetes. This study compared the metabolic effects of GIPR agonism and GIPR antagonism.
Materials and methods
In this study, the metabolic impacts of a GIPR agonist (GIP108) and antagonist (NN‐GIPR‐Ant) were evaluated in lean and high‐fat diet (HFD)–induced obese male mice. We assessed the impacts on food intake, body weight, glucose and insulin tolerance, liver triglyceride levels, bone markers and adipose tissue lipolytic gene expression.
Results
In lean mice, neither peptide affected food intake or body weight, but GIP108 improved glucose tolerance. In obese mice, both agents reduced food intake and body weight, with NN‐GIPR‐Ant producing more sustained appetite suppression. Energy expenditure remained unchanged, as weight loss matched that of pair‐fed controls. GIP108 improved glucose tolerance independently of weight loss, whereas NN‐GIPR‐Ant reduced insulin sensitivity compared to pair‐fed controls. Both treatments slightly increased liver triglyceride content compared to their pair‐fed controls, and no treatment significantly affected plasma bone marker levels. Finally, NN‐GIPR‐Ant reduced the expression of adipose tissue lipolytic genes.
Conclusions
Our data highlight the distinct metabolic effects of GIPR agonism and antagonism, offering insights for their future application in personalised metabolic disease treatments. Further human studies are needed to understand the long-term metabolic impacts of these therapies.
The Contact Conundrum: Are We Introducing Contact at the Correct Time in Youth Sports?
Abstract
Participation in sport offers numerous physiological, psychological, and social benefits, yet injury remains an inherent risk, particularly in collision-based sports. Increasing scrutiny surrounds these sports, especially for youth, with inconsistency in the age for introducing deliberate contact (e.g., body checking, tackle) and debate regarding proposals for banning high-risk actions to reduce injuries. This article explores the policies and controversies regarding how, and when, physical contact is introduced in sports. Current policies vary significantly across sports, sexes, and national jurisdictions, leading to inconsistent implementation and outcomes. We outline arguments for both delaying and lowering the contact introduction age, including implications for participation rates, skill acquisition, and injury risk. Raising the age may reduce injury history and cumulative head impacts, while earlier, progressive contact training may enhance technical competence. Growth, maturation and size discrepancies further complicate such policy decisions. Evidence supports multimodal approaches, including training guidelines (e.g., reduced contact in practices), neuromuscular training, and rule modifications, to enhance safety without compromising play. Weight-based categorisation and bio-banding (grouping players by attributes associated with growth and/or maturation instead of chronological age) strategies show potential for injury-risk reduction but lack comprehensive evaluation. Despite polarised opinions, developing sport-specific recommendations on best practices for contact introduction remains critical to ensuring athlete welfare and sustainable participation in collision sports.
Fluorescent GLP1R/GIPR dual agonist probes reveal cell targets in the pancreas and brain
Abstract
Dual agonists targeting glucagon-like peptide-1 receptor (GLP1R) and glucose-dependent insulinotropic polypeptide receptor (GIPR) are breakthrough treatments for patients with type 2 diabetes and obesity. Compared to GLP1R agonists, dual agonists show superior efficacy for glucose lowering and weight reduction. However, delineation of dual agonist cell targets remains challenging. Here, we develop and test daLUXendin and daLUXendin+, non-lipidated and lipidated fluorescent GLP1R/GIPR dual agonist probes, and use them to visualize cellular targets. daLUXendins are potent GLP1R/GIPR dual agonists that advantageously show less functional selectivity for mouse GLP1R over mouse GIPR. daLUXendins label rodent and human pancreatic islet cells, with a signal intensity of β cells > α cells = δ cells. Systemic administration of daLUXendin strongly labels GLP1R-expressing cells.
Binding kinetics, bias, receptor internalization and effects on insulin secretion in vitro and in vivo of a novel GLP-1R/GIPR dual agonist, HISHS-2001
Abstract
Aims
The use of incretin analogues has emerged as an effective approach to achieve both enhanced insulin secretion and weight loss in Type 2 diabetes (T2D) patients. Agonists which bind and stimulate multiple receptors have shown particular promise. However, off‐target effects remain a complication of using these agents, and modified versions with optimised pharmacological profiles and/or biased signalling are sought.
Materials and Methods
Ligand synthesis was achieved using standard solid‐phase techniques. Assessments of GLP‐1R‐binding kinetics, G protein recruitment and receptor internalisation were performed using biochemical and imaging approaches. Insulin secretion was measured in purified mouse and human islets, and drug efficacy was assessed in hyperglycaemic db/db mice.
Results
We describe the synthesis and properties of a molecule which binds to both glucagon-like peptide-1 (GLP-1) and glucose-dependent insulinotropic polypeptide (GIP) receptors (GLP-1R and GIPR) to enhance insulin secretion. HISHS-2001 shows increased affinity at the GLP-1R, as well as a tendency towards reduced internalisation and recycling at this receptor versus the FDA-approved dual GLP-1R/GIPR agonist tirzepatide. HISHS-2001 also displayed significantly greater bias towards cAMP generation versus β-arrestin 2 recruitment compared to tirzepatide.
Conclusion
HISHS‐2001 represents a novel dual receptor agonist with a promising pharmacological profile and actions. Future clinical studies will be needed to assess the safety and efficacy of this molecule in humans.
GLP-1R associates with VAPB and SPHKAP at ERMCSs to regulate β-cell mitochondrial remodelling and function
Abstract
Glucagon-like peptide-1 receptor (GLP-1R) agonists (GLP-1RAs) ameliorate mitochondrial health by increasing mitochondrial turnover in metabolically relevant tissues. Mitochondrial adaptation to metabolic stress is crucial to maintain pancreatic β-cell function and prevent type 2 diabetes (T2D) progression. While the GLP-1R is well-known to stimulate cAMP production leading to Protein Kinase A (PKA) and Exchange Protein Activated by cyclic AMP 2 (Epac2) activation, there is a lack of understanding of the molecular mechanisms linking GLP-1R signalling with mitochondrial and β-cell functional adaptation. Here, we present a comprehensive study in β-cell lines and primary islets that demonstrates that, following GLP-1RA stimulation, GLP-1R-positive endosomes associate with the endoplasmic reticulum (ER) membrane contact site (MCS) tether VAPB at ER-mitochondria MCSs (ERMCSs), where active GLP-1R engages with SPHKAP, an A-kinase anchoring protein (AKAP) previously linked to T2D and adiposity risk in genome-wide association studies (GWAS). The inter-organelle complex formed by endosomal GLP-1R, ER VAPB and SPHKAP triggers a pool of ERMCS-localised cAMP/PKA signalling via the formation of a PKA-RIα biomolecular condensate which leads to changes in mitochondrial contact site and cristae organising system (MICOS) complex phosphorylation, mitochondrial remodelling, and β-cell functional adaptation, with important consequences for the regulation of β-cell insulin secretion and survival to stress.
In vivo functional profiling and structural characterization of the human <i>GLP1R</i> A316T variant
Glucagon-like peptide-1 receptor agonists (GLP-1RAs) are effective therapies for type 2 diabetes (T2D) and obesity, yet patient responses are variable.
Peak Power: A Severity Measure for Head Acceleration Events Associated with Suspected Concussions
Abstract
Objectives
In elite rugby union, suspected concussions lead to immediate removal from play for either permanent exclusion or a temporary 12-min assessment as part of the Head Injury Assessment 1 (HIA1) protocol. The study aims to retrospectively identify a head acceleration event (HAE) severity measure associated with HIA1 removals in elite rugby union using instrumented mouthguards (iMGs).
Methods
HAEs were recorded from 215 men and 325 women, with 30 and 28 HIA1 removals from men and women, respectively. Logistic regression was used to identify whether peak power, maximum principal strain (MPS) and/or the Head Acceleration Response Metric (HARM) were associated with HIA1 events compared to non-cases. Optimal threshold values were determined using the Youden Index. Area under the curve (AUC) was compared using a paired-sample approach. Significance was set at p < 0.05.
Results
All three severity measures (peak power, HARM, MPS) were associated with HIA1 removals in both the men's and women's game. Peak power was the most consistent of the three severity measures for HIA1 removals based on paired-sample AUC comparisons in the men's and women's games. HARM and MPS performed worse than peak linear acceleration in the women's game based on AUC comparisons (p = 0.006 and 0.001, respectively), with MPS also performing worse than peak angular acceleration (p = 0.001).
Conclusion
Peak power, a measure based on fundamental mechanics and commonly communicated in sports performance, was the most effective metric associated with HIA1 removals in elite rugby. The study bridges the gap by identifying a consistent HAE severity measure applicable across sexes.
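The Youden Index used above selects the threshold that maximises J = sensitivity + specificity − 1 across candidate cut-offs. A sketch with hypothetical severity values; the numbers and units are illustrative, not the study's data:

```python
def youden_threshold(cases, non_cases, thresholds):
    """Pick the cut-off maximising J = sensitivity + specificity - 1.
    cases / non_cases hold severity values for HIA1 removals vs other HAEs."""
    best = None
    for t in thresholds:
        sens = sum(v >= t for v in cases) / len(cases)       # true positive rate
        spec = sum(v < t for v in non_cases) / len(non_cases)  # true negative rate
        j = sens + spec - 1
        if best is None or j > best[0]:
            best = (j, t, sens, spec)
    return best

# Hypothetical peak-power values (arbitrary units), for illustration only
cases = [8, 9, 10, 12, 15]
non_cases = [1, 2, 2, 3, 4, 5, 6, 7]
j, t, sens, spec = youden_threshold(cases, non_cases, range(1, 16))
```

With these made-up samples the groups separate cleanly, so the chosen cut-off achieves J = 1; real HAE data overlap, which is why the abstract reports a sensitivity/specificity trade-off rather than a perfect threshold.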
Spatially diffuse cAMP signalling with oppositely biased GLP-1 receptor agonists in β-cells despite differences in receptor localisation
Internalisation of G protein-coupled receptors (GPCRs) can contribute to altered cellular responses by directing signalling from non-canonical locations, such as endosomes. If signalling processes are locally constrained, active receptors in different subcellular locations could produce different downstream effects. This phenomenon may be relevant to the optimal targeting of the glucagon-like peptide-1 receptor (GLP-1R), a type 2 diabetes and obesity target GPCR for which several ligands with varying internalisation tendency have been discovered. To investigate, we compared the signalling localisation effects of two prototypical GLP-1RAs with opposite signal bias and effects on GLP-1R trafficking: exendin-asp3 (ExD3), a full agonist that drives rapid internalisation, and exendin-phe1 (ExF1), which shows much slower internalisation. After using bioorthogonal labelling and fluorescent agonist conjugates to verify the divergent trafficking patterns of ExF1 and ExD3 in β-cell lines and primary pancreatic islets, we used live cell biosensors to monitor signalling at different subcellular locations. This revealed that cAMP/PKA/ERK signalling in β-cells is in fact distributed widely across the cell over short- (<5 min) and medium-term (up to 60 min) stimulation at pharmacological (>10 pM) concentrations, with no major differences in signal localisation that could be linked to internalised versus cell surface-bound GLP-1R. Moreover, washout experiments highlighted that, whilst fast-internalising ExD3 shows much greater accumulation and binding to GLP-1R in endosomes than slow-internalising ExF1, it is a rather inefficient driver of both cAMP production in β-cells and insulin secretion from perfused rat pancreata. These data provide a greater understanding of the cellular effects of biased GLP-1R agonism.
Return on investment? Associations between resources and effectiveness of player development in a male rugby union talent system
Objectives: Substantial research has considered the factors contributing to effective talent systems, including environmental features and resources. This study aimed to quantitatively explore associations between academy resources and outcome effectiveness in the English male Premiership rugby talent system. Design: This study utilised a retrospective analysis of archival data. We compared academy resources (human, financial, contextual) between the 2016/17 and 2019/20 seasons with academy outcomes between the 2020/21 and 2023/24 seasons. Academy outcomes were operationalised as the number of Premiership players developed and the Premiership appearances subsequently made. Thirteen of fourteen possible academies were included in the analysis. Methods: Cost effectiveness was considered by dividing academy total financial investment (inflation adjusted) by the number of developed Premiership players, with one-way ANOVA comparing differences between academies. To explore the possible influence of academy resources on academy outcomes, three sets of multiple linear regression models were constructed. Results: Cost of development did not significantly differ between academies (F(12, 30) = 1.740, p = 0.107, η² = 0.41). Regression analysis suggested that resources had limited impact on cost effectiveness, which was primarily driven by the number of players developed. Resource availability did not significantly predict the number of players developed. However, staff count emerged as a significant negative predictor of Premiership appearances (B = −2.168, p = 0.006), suggesting that increased staffing did not necessarily enhance development outcomes. Conclusions: Whilst causation cannot be inferred, these findings suggest that academy effectiveness is not necessarily based upon resource availability. Instead, the strategic utilisation of available resources may be more critical in supporting effective talent development.
Physical Qualities in Youth Rugby
The quantification and development of physical qualities of youth rugby players is vital to support athlete preparation and long-term development. This chapter summarises and presents the research quantifying the physical qualities and their development of male youth rugby union and rugby league players and compares between age grades and playing positions, whilst considering the effect on career attainment and coaches’ perspectives. A range of research presents the physical qualities of youth rugby players, including stature, body mass, body fat percentage, muscular strength, muscular power, linear speed, change of direction, and aerobic capacity. Differences are apparent by age grade and position. However, the research has several limitations, including the presentation of small sample sizes and lack of consistency in the measures used. Future research should consider the use of national standardised testing batteries due to the inconsistency in testing methods and small samples limiting the reporting of positional differences. Practitioners can use the results from this review to evaluate the physical qualities of youth rugby players to enhance training prescription and goal setting.
Monitoring Fatigue and Recovery in Youth Rugby
Identifying and understanding the fatigue response of youth rugby players following training and competition is paramount to avoid chronic fatigue, underperformance and injury. Fatigue resulting from rugby participation may last for up to 72 hours, and the potential exists for its accumulation during periods of congested fixtures, although this can depend on contextual and player-related factors (e.g., time between matches, playing level). The fatigue response following training depends upon the training volume and content (e.g., (non)inclusion of contact). As such, practitioners seek to monitor players' fatigue primarily through assessments of lower- and upper-body neuromuscular function, and athlete self-reported measures. However, the successful implementation of a fatigue monitoring system depends on several factors related to the needs of the environment, and the collection, analysis and visualisation of the data. This chapter presents (1) an overview of surrogate measures of fatigue, (2) the current findings from the literature related to the fatigue response to youth rugby training and competition and (3) practical considerations for the successful development of a fatigue monitoring system for young rugby players.
Objectives The purpose of this study was to investigate head kinematic variables in elite men's and women's rugby union and their ability to predict player removal for an off-field (HIA1) head injury assessment. Methods Instrumented mouthguard (iMG) data were collected for 250 men and 132 women from 1865 and 807 player-matches, respectively, and synchronised to video-coded match footage. Head peak resultant linear acceleration (PLA), peak resultant angular acceleration (PAA) and peak change in angular velocity (dPAV) were extracted from each head acceleration event (HAE). HAEs were linked to documented HIA1 events, with ten logistic regression models for men and women, each using a random subset of non-case HAEs, calculated to identify kinematic variables associated with HIA1 events. Receiver operating characteristic (ROC) curves were used to describe thresholds for HIA1 removal. Results Increases in PLA and dPAV were significantly associated with an increasing likelihood of HIA1 removal in the men's game, with ORs ranging from 1.05–1.12 and 1.13–1.18, respectively. The optimal values to maximise both sensitivity and specificity for detecting an HIA1 were 1.96 krad⋅s−2, 24.29 g and 14.75 rad⋅s−1 for PAA, PLA and dPAV, respectively. Only one model had any significant variable associated with increasing the likelihood of an HIA1 removal in the women's game: PAA, with an OR of 8.51 (1.23–58.66). The optimal values for sensitivity and specificity for women were 2.01 krad⋅s−2, 25.98 g and 15.38 rad⋅s−1 for PAA, PLA and dPAV, respectively. Conclusion PLA and dPAV were predictive of men's HIA1 events. Further HIA1 data are needed to understand the role of head kinematic variables in the women's game. The calculated spectrum of sensitivity and specificity of iMG alerts for HIA1 removals in men and women presents a starting point for further discussion about using iMGs as an additional trigger in the existing HIA process.
Objective To examine the likelihood of head acceleration events (HAEs) as a function of previously identified risk factors: match time, player status (starter or substitute) and pitch location in elite-level men’s and women’s rugby union matches. Methods Instrumented mouthguard data were collected from 179 and 107 players in the men’s and women’s games and synchronised to video-coded match footage. Head peak resultant linear acceleration (PLA) and peak resultant angular acceleration were extracted from each HAE. Field location was determined for HAEs linked to a tackle, carry or ruck. HAE incidence was calculated per player hour across PLA recording thresholds with 95% CIs estimated. Propensity was calculated as the percentage of contact events that caused HAEs across PLA recording thresholds, with a 95% CI estimated. Significance was assessed by non-overlapping 95% CIs. Results 29 099 and 6277 HAEs were collected from 1214 and 577 player-matches in the men’s and women’s games. No significant differences in match quarter HAE incidence or propensity were found. Substitutes had higher HAE incidence than starters at lower PLA recording thresholds for men but similar HAE propensity. HAEs were more likely to occur in field locations with high contact event occurrence. Conclusion Strategies to reduce HAE incidence need not consider match time or status as a substitute or starter as HAE rates are similar throughout matches, without differences in propensity between starters and substitutes. HAE incidence is proportional to contact frequency, and strategies that reduce either frequency or propensity for contact to cause head contact may be explored.
As a coach, coach educator and researcher, Wilbur contributed significantly to the field of rugby science, both as an author and reviewer. The aim of this research note, structured as a 'short report', is to highlight some of Wilbur's many achievements.
Disclosure of possible concussions in National Rugby League Women's Premiership players
Objectives: This study investigated the disclosure and reasons for non-disclosure of possible concussions and their symptoms in National Rugby League Women's (NRLW) Premiership players in Australia. Design: Cross-sectional survey. Methods: During the 2022 NRLW season, NRLW players were invited to participate in a voluntary, anonymous, online survey exploring (i) player demographics, (ii) rugby playing history, (iii) concussion disclosure, and (iv) instances of, and reasons for, non-disclosure of possible concussions to medical staff during the past two seasons. Logistic regression analyses were used to identify reasons for non-disclosure of possible concussions in NRLW players. Results: Of the 132 eligible participants, 86 players responded to the survey and 63 % (n = 54/86) reported that they always disclosed a possible concussion during the past two seasons. A substantial number of NRLW players surveyed (n = 32/86, 37 %) did not disclose a possible concussion to their team or medical staff on one or more occasions. Sixty-three players (73 %) always reported symptoms during a medical assessment. Twenty-three players (27 %) did not disclose their symptoms during a medical assessment, primarily during or after a game or training session (n = 12/23, 52 %). Of the players who did not disclose their possible concussion symptoms, the two main reasons for non-disclosure were 'not wanting to be ruled out of the game or training session' (n = 8/23, 35 %) and not being 'sure if the symptoms were related to concussion' (n = 8/23, 35 %). Most surveyed players (n = 74/86, 86 %) reported attending mandatory concussion education sessions at their respective clubs. Conclusion: We found high rates of non-disclosure amongst NRLW players, which is inconsistent with previous research suggesting that women are more aware of their symptoms than men and more likely to disclose their concussions. Not wanting to be ruled out of the game or training session and being unsure whether the symptoms were related to concussion were the two most common reasons for non-disclosure. Concussion education initiatives could promote a supportive culture fostering disclosure amongst all stakeholders to ensure optimal player welfare.
Structure of force variability during squats performed with an inertial flywheel device under stable versus unstable surfaces
The use of unstable surfaces during resistance training has been shown to maintain or reduce force production. However, the effect of unstable surfaces on force variability has not been assessed using non-linear methods, which may be better suited to detecting changes in movement variability throughout a given movement. Consequently, this study compared the use of stable versus unstable surfaces on force variability during bilateral squats performed with an inertial flywheel device (Eccoteck, Byomedic System SCP, Spain). Twenty healthy men (mean ± SD: age 22.9 ± 2.9 years, height 1.81 ± 0.07 m, body mass 76.4 ± 7.6 kg and 1RM back squat 110.9 ± 19.7 kg) with a minimum of four years of resistance training experience performed six sets of six repetitions of squats at maximal concentric effort with one minute of rest between sets. Force output on the vertical axis was measured using a strain gauge and the results were processed using non-linear sample entropy (SampEn). Results showed no differences for any of the dependent variables between stable and unstable conditions. SampEn showed no differences between conditions (chi-squared = 0.048, p = 0.827), while Forcemean and SampEn presented a small correlation (r = 0.184; p < 0.01). No changes in entropy were found over the course of the sets. Together, these results suggest that the structure of force variability is similar between stable and unstable surfaces. This lack of difference may be due to postural and anticipatory adjustments. Consequently, by introducing unstable surfaces to the flywheel bilateral squat exercise, practitioners may not observe changes in Forcemean and force variability when compared to stable-surface training, suggesting that increased training volume or intensity may be required in unstable environments to achieve a desired training stimulus.
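Sample entropy (SampEn) quantifies the (ir)regularity of a force-time series: lower values indicate a more repeatable structure. Below is a common simplified textbook variant (m = 2, r = 0.2 × SD, Chebyshev distance), written here for illustration and not taken from the authors' processing pipeline:

```python
import math

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): -ln(A/B), where B counts matching template pairs of
    length m within tolerance r (Chebyshev distance) and A does the same
    for length m + 1. A common simplified variant."""
    if r is None:
        mean = sum(x) / len(x)
        sd = math.sqrt(sum((v - mean) ** 2 for v in x) / len(x))
        r = 0.2 * sd  # conventional tolerance: 20% of the series SD
    def count_matches(k):
        templates = [x[i:i + k] for i in range(len(x) - k + 1)]
        n = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    n += 1
        return n
    b = count_matches(m)
    a = count_matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

# A perfectly periodic signal vs a pseudo-random one (fixed LCG, reproducible)
regular = [0, 1] * 50
s, irregular = 1, []
for _ in range(100):
    s = (s * 1103515245 + 12345) % 2**31
    irregular.append(s / 2**31)
```

The periodic signal yields a SampEn near zero while the pseudo-random one is far higher, which is the contrast the study exploits when comparing force signals between surface conditions.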
Objectives To describe and compare the incidence and propensity of head acceleration events (HAEs) using instrumented mouthguards (iMG) by playing position in a season of English elite-level men's and women's rugby union matches. Methods iMG data were collected for 255 men and 133 women from 1,865 and 807 player-matches, respectively, and synchronised to video-coded match footage. Head peak resultant linear acceleration (PLA) and peak resultant angular acceleration (PAA) were extracted from each HAE. Mean incidence and propensity values were calculated across different recording thresholds for forwards and backs in addition to positional groups (front row, second row, back row, half backs, centres, back three) with 95% confidence intervals (CI) estimated. Significance was determined based on 95% CI not overlapping across recording thresholds. Results For both men and women, HAE incidence was twice as high for forwards as for backs across the majority of recording thresholds. HAE incidence and propensity were significantly lower in the women's game compared to the men's game. Back-row and front-row players had the highest incidence across all HAE thresholds for men's forwards, while women's forward positional groups and men's and women's back positional groups were similar. Tackles and carries exhibited a greater propensity to result in HAEs for forward positional groups and the back three in the men's game, and the back row in the women's game. Conclusion These data offer valuable benchmark and comparative data for future research, HAE mitigation strategies, and management of HAE exposure in elite rugby players. Positional-specific differences in HAE incidence and propensity should be considered in future mitigation strategies.
This study aimed to quantify the variability of physical, technical, and subjective task-load demands in small-sided games (SSGs), and the effect of manipulating pitch size and player numbers in SSGs in adolescent rugby union (RU) players. Twenty-six subjects completed six conditions in a crossover study design. In each condition, subjects played 4 × 3-min periods of an SSG. Games were completed with either 4 × 4, 6 × 6 or 12 × 12 players on either a small (W: 25 m, L: 30 m), medium (W: 30 m, L: 40 m), or large (W: 35 m, L: 50 m) pitch. Match demands were assessed using global navigation satellite systems, heart rate (HR) monitors, ratings of perceived exertion, the National Aeronautics and Space Administration task-load index (NASA-TLX) and video analysis. Statistical analysis comprised typical error, coefficient of variation (CV) and intra-class correlations to assess variability, and linear mixed effects modelling to assess differences between conditions. A range of variability was observed in technical (CV = 25.00% to 52.38%), physical (CV = 4.12% to 51.18%) and subjective task-loads (CV = 7.65% to 17.14%) between identical games. Reducing player numbers increased physical demands such as m/min (ES range = 0.45 to 1.45), technical exposures such as total involvements (ES range = 0.04 to 0.63), and effort, physical and temporal task-loads. Increasing pitch size caused greater movement demands such as m/min (ES range = 0.11 to 0.79), but did not change the technical demands.
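The typical error and CV statistics used above to quantify between-game variability can be sketched as follows. This is a minimal illustration of the common typical-error method for two repeated trials (SD of the between-trial differences divided by √2); the study's exact computation may differ, and the trial values in the test are invented.

```python
import numpy as np

def typical_error(trial1, trial2):
    """Typical error of measurement between two repeated trials:
    SD of the between-trial differences divided by sqrt(2)."""
    diffs = np.asarray(trial1, float) - np.asarray(trial2, float)
    return np.std(diffs, ddof=1) / np.sqrt(2)

def cv_percent(trial1, trial2):
    """Coefficient of variation: typical error expressed as a
    percentage of the grand mean across both trials."""
    grand_mean = np.mean(np.concatenate([trial1, trial2]))
    return 100 * typical_error(trial1, trial2) / grand_mean
```

A CV of, say, 50% for a technical variable between identical games (as reported above for some technical demands) would indicate that a single SSG gives a very noisy estimate of a player's typical technical involvement.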
Player movement in rugby league is complex, being spatiotemporal and multifaceted. Modeling this complexity to provide robust measures of player activity and load has proved difficult, with important aspects of player movement yet to be considered. These include the influence of time-varying covariates on player activity and the combination of different dimensions of player movement. Few studies have simultaneously categorized player activity into different activity states and investigated factors influencing the transition between states, or compared player activity and load profiles between matches and training. This study applied hidden Markov models (HMMs), a data-driven, multivariate approach, to rugby league training and match GPS data to i) demonstrate how HMMs can combine multiple variables in a data-driven way to effectively categorize player movement states, ii) investigate the influence of two time-varying covariates (score difference and elapsed match time) on player activity states, and iii) compare player activity and load profiles within and between training and match modalities. HMMs were fitted to player GPS, accelerometer and heart rate data of one English Super League team across 60 training sessions and 35 matches. Distinct activity states were detected for both matches and training, with transitions between states in matches influenced by score difference and elapsed time, and clear differences in activity and load profiles between training and matches. HMMs can model the complexity of player movement to effectively profile player activity and load in rugby league and have the potential to facilitate new research across several sports.
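As a hedged illustration of the state-decoding step only, here is a minimal log-space Viterbi decoder for a univariate Gaussian HMM. The study fitted multivariate models (GPS, accelerometer and heart rate) with time-varying covariates, so the single speed-like variable and all parameters below are invented for illustration.

```python
import numpy as np

def viterbi_gaussian(obs, means, sds, trans, init):
    """Most likely hidden-state sequence for a univariate Gaussian HMM,
    computed with the Viterbi algorithm in log space for stability."""
    n_states = len(means)
    obs = np.asarray(obs, float)
    # Log emission density for every (time, state) pair.
    log_emit = (-0.5 * ((obs[:, None] - means) / sds) ** 2
                - np.log(sds * np.sqrt(2 * np.pi)))
    log_trans = np.log(trans)
    delta = np.log(init) + log_emit[0]          # best log-prob per state
    back = np.zeros((len(obs), n_states), dtype=int)
    for t in range(1, len(obs)):
        scores = delta[:, None] + log_trans     # shape (from_state, to_state)
        back[t] = np.argmax(scores, axis=0)     # best predecessor per state
        delta = scores[back[t], np.arange(n_states)] + log_emit[t]
    # Trace back the best path from the best final state.
    path = [int(np.argmax(delta))]
    for t in range(len(obs) - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```

With two invented states (low- vs high-intensity movement) and sticky transition probabilities, a sequence of low then high observations decodes to a single state switch, which is the kind of activity-state segmentation the study performs at much larger scale.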
Quantifying the highest intensity of competition (the maximal intensity period [MIP]) for varying durations in team sports has been used to identify training targets to inform the preparation of players. However, its usefulness has recently been questioned since it may still underestimate the training intensity required to produce specific physiological adaptations. Within this conceptual review, we aimed to: (i) describe the methods used to determine the MIP; (ii) compare the data obtained using MIP or whole-match analysis, considering the influence of different contextual factors; (iii) rationalise the use of the MIP in team sports practice and (iv) provide limitations and future directions in the area. Different methods are used to determine the MIP, with MIP values far greater than those derived from averaging across the whole match, although they could be affected by contextual factors that should be considered in practice. Additionally, while the MIP might be utilised during sport-specific drills, it is inappropriate to inform the intensity of interval-based, repeated sprint and linear speed training modes. Lastly, MIP does not consider any variable of internal load, a major limitation when informing training practice. In conclusion, practitioners should be aware of the potential use or misuse of the MIP.
Objective To establish a consensus on the structure and process of healthcare services for patients with concussion in England to facilitate better healthcare quality and patient outcomes. Design This consensus study followed the modified Delphi methodology with five phases: participant identification, item development, two rounds of voting and a meeting to finalise the consensus statements. The predefined threshold for agreement was set at ≥70%. Setting Specialist outpatient services. Participants Members of the UK Head Injury Network were invited to participate. The network consists of clinical specialists in head injury practising in emergency medicine, neurology, neuropsychology, neurosurgery, paediatric medicine, rehabilitation medicine and sports and exercise medicine in England. Primary outcome measure A consensus statement on the structure and process of specialist outpatient care for patients with concussion in England. Results 55 items were voted on in the first round. 29 items were removed following the first voting round and 3 items were removed following the second voting round. Items were modified where appropriate. A final 18 statements reached consensus, covering 3 main topics in specialist healthcare services for concussion: the care pathway to structured follow-up, prognosis and measures of recovery, and provision of outpatient clinics. Conclusions This work presents statements on how healthcare services for patients with concussion in England could be redesigned to meet patients' health needs. Future work will seek to implement these into the clinical pathway.
Wearing Regulation Soft‐Padded Headgear Does Not Reduce the Risk of Head Injuries in Professional Men's Rugby Players: An Observational Cohort Study
There is no empirical evidence that soft-padded headgear is protective against head injury risk in rugby. However, studies that have assessed purported protective effects have not accounted for rates of contact. The aim of this study was to compare head injury rates while considering tackle-event exposure in players with and without headgear. In the 2018 and 2019 professional men's Super Rugby seasons, video analysts recorded headgear use, playing position, match time and head injury assessments (a proxy for head injury risk) for each player. Tackle-event involvements for each player were obtained from a third-party video analysis provider. Tackle-related head injury rates were calculated per 1000 h (incidence) and per 1000 tackle-events (propensity), and compared between headgear and non-headgear wearers using incidence rate ratios (IRRs) with 95% confidence intervals and Poisson regression models. Players wearing headgear were involved in more tackles per match than players without headgear (IRR 1.07, 95% CI 1.05–1.09). Head injury incidence (IRR: 1.78, 95% CI: 1.11–2.70) and propensity (IRR: 1.66, 95% CI: 1.04–2.52) were higher in players wearing headgear. However, statistical models found no difference in this risk between positional groups. A lack of protective effect is consistent with previous studies and could be explained by World Rugby's headgear design regulations, while the increased risk may be a result of greater injury susceptibility. As World Rugby's headgear regulations change and further advancements in headgear are made, it is important to continue to examine their effect on head injury risk at an individual level.
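The incidence rate ratios above can be sketched with the standard log-normal approximation, where SE(ln IRR) = √(1/a + 1/b). This is a simplified two-group comparison; the study's Poisson regression models adjust for more than this, and the event counts in the usage example are invented.

```python
import math

def incidence_rate_ratio(events_a, exposure_a, events_b, exposure_b):
    """Incidence rate ratio of group A vs group B with a 95% CI from
    the standard log-normal approximation."""
    irr = (events_a / exposure_a) / (events_b / exposure_b)
    se_log = math.sqrt(1 / events_a + 1 / events_b)      # SE of ln(IRR)
    lower = math.exp(math.log(irr) - 1.96 * se_log)
    upper = math.exp(math.log(irr) + 1.96 * se_log)
    return irr, lower, upper
```

An IRR whose 95% CI excludes 1.0 (as for the headgear incidence IRR of 1.78, CI 1.11–2.70) indicates a rate difference unlikely to be explained by chance alone.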
The primary aim of this thesis was to evaluate the dietary intake, energy expenditure and energy balance of young professional male rugby league players across the season. Twenty participants from one European Super League U19 academy were recruited. Prior to assessing 'free-living' dietary intake, study 1 investigated the validity of a traditional and a contemporary dietary assessment tool to measure the total energy intake of young professional rugby league players (n = 12). Findings highlighted a small and moderate mean bias for under-reporting by Snap-N-Send, and a very likely higher (4.96 (0.97) MJ; ES = 0.30 (0.07); p = 0.0021) total energy expenditure across a five-day microcycle matched for physical activity demands. Findings highlighted the large resting metabolic rates (11.63 (2.46) MJ.day-1), total energy expenditures (18.93 (3.18) MJ.day-1) and moderate physical activity levels (1.6 (0.2)) of players. Players were in a self-reported negative energy balance (-1.63 (1.73) MJ.day-1; p = 0.233) and lost body mass (-0.65 (0.78) kg; p = 0.076). To support a desired increase in player body mass, case study 1 established the large energy requirements (resting metabolic rate: 14.7 MJ.day-1; total energy expenditure: 22.4 MJ.day-1; physical activity level: 1.5) of one young professional rugby league player across a two-week pre-season microcycle, using the Behaviour Change Wheel to design and implement a behavioural intervention over a twelve-week period. Study 4 findings highlighted the large resting metabolic rates (10.26 (1.49) MJ.day-1), total energy expenditures (16.15 (0.77) MJ.day-1) and moderate physical activity levels (1.6 (0.2)) of players. Players were in a self-reported negative energy balance (-0.70 (1.11) MJ.day-1; p = 0.145) and lost body mass (-0.3 (0.6) kg; p = 0.222). Sub-analysis within study 4 (n = 5) demonstrated an almost certainly higher total energy expenditure across a pre-season (study 3) vs.
in-season microcycle (0.02 MJ.kg.BM-1; ES = 1.14 (0.41); p = 0.004). Study 5 and study 6 utilised data collected within studies 3 and 4 to investigate the validity of wearable technology or Snap-N-Send to measure the total energy expenditure or total energy intake of young professional rugby league players across 'true' free-living conditions (n = 8), respectively. To support appropriate energy intakes following match-play, case study 2 investigated three-day changes in resting metabolic rate following a competitive young professional RL match (n = 5). Findings demonstrated almost certain, most likely and likely increases in resting metabolic rate at 24 (2.38 (1.02) MJ.day-1; ES = 1.06 (0.43); p = 0.006), 48 (1.44 (0.93) MJ.day-1; ES = 0.62 (0.38); p = 0.025) and 72 hours (0.94 (0.78) MJ.day-1; ES = 0.40 (0.32); p = 0.055) after baseline. In conclusion, studies presented within this thesis establish the large resting and total energy requirements of young professional male rugby league players across the season, which are increased in the days following training-based collisions and competitive match-play. Players require support to consistently achieve the desired manipulations of energy balance required for optimal physical development across pivotal youth development periods.
Illness incidence, prevalence and prevention experiences in rugby
Illness in athletes can cause time-loss and performance restriction from training and competition, as well as negatively affect physiological functioning. Team sport athletes, and in particular rugby players, may be at increased risk of illness due to close proximity to teammates and competitors, as well as the high-intensity exercise and collisions involved in the sport itself. Despite a growing wealth of literature assessing illness epidemiology in rugby, little is known about the incidence of illness in male academy rugby players, and no studies have assessed the lived experience of illness through qualitative inquiry. Therefore, the aim of this thesis was to investigate the presence and experiences of illness in rugby using a mixed methods approach. This programme of research includes three study chapters which 1) investigate and summarise the incidence, prevalence, and count of illnesses across full-contact football-code team sports through a systematic review, 2) assess the incidence, prevalence, and consequences of illness in male academy rugby league players, and 3) explore the barriers and enablers to the uptake of illness prevention guidelines in rugby. Chapter 3 highlighted that full-contact football-code team sport athletes are most commonly affected by respiratory system illnesses; however, consistent methods of reporting were lacking, and future research should utilise International Olympic Committee (IOC) recommended definitions and monitoring tools. Chapter 5 identified that consequences of illness, such as loss of body mass and sleep disruptions, may negatively impact a male academy rugby league player's development. Chapter 7 highlighted that tackling inequalities in resources between men's and women's cohorts is critical to effectively implement illness prevention guidelines. Education of coaches and players is essential, and emphasis must be placed on continuing preventative behaviours adopted due to COVID-19.
In summary, the findings from this thesis suggest that illnesses pose less epidemiological burden than injury; however, the consequences of illness may be detrimental to athlete development and future progression. This thesis offers new behavioural-science-driven practical recommendations to effectively implement illness prevention guidelines in the rugby context.
OBJECTIVES: Describe the highest frequency and variability for tackle events in rugby league. Investigate seasonal differences in total tackle events per match over a seven-year period. DESIGN: Retrospective observational. METHODS: Tackle events (i.e., ball carrier events [attacker] and tackler involvements [defender]) from 864 male professional rugby league players competing in 1176 Super League matches from 2014 to 2020 were included. A series of linear mixed effect models were used to determine the frequency and variability during peak 1-, 3-, 5-, 10-, 20-, 40-min and whole-match tackle events per player per match at a positional group level. Differences between seasons for the total number of tackle events per match were compared using a one-way analysis of variance with Tukey's honestly significant difference test. RESULTS: Tackle events were greatest for Props (51.5 [47.7-55.4] per match). Within-players, between-matches, and between-seasons variability was <10 % for tackle events. There were significantly fewer tackle events and tackler involvements per match in 2014, and significantly more tackle events per match in season 2020b, when compared with all other seasons. CONCLUSIONS: Large between-position variability in peak tackle events, ball carrier events, and tackler involvements suggests that coaches should separate players into positional groups and prescribe training accordingly. The total number of tackle events, ball carrier events, and tackler involvements was significantly greater in season 2020b when compared to seasons 2014 to 2019 (inclusive), which may be a consequence of rule changes introduced to the sport.
The 2019 and 2020 Super League (SL) seasons included several competition rule changes. This study aimed to quantify the difference between the 2018, 2019 and 2020 SL seasons for duration, locomotor and event characteristics of matches. Microtechnology and match event data were analysed from 11 SL teams, comprising 124 players, from 416 competitive matches across a three-year data collection period. Due to an enforced suspension of league competition as a consequence of COVID-19 restrictions, and subsequent rule changes upon return to play, season 2020 was divided into season 2020a (i.e. pre-COVID suspension) and season 2020b (i.e. post-COVID suspension). Duration, locomotor variables, and match events were analysed per whole-match and ball-in-play (BIP) periods, with differences between seasons determined using mixed-effects models. There were significant (p ≤ 0.05) reductions in whole-match and BIP durations for adjustables and backs in 2019 when compared to 2018, albeit the magnitude of reduction was less during BIP analyses. Despite reduced duration, adjustables reported an increased average speed, suggesting reduced recovery time between bouts. Both forwards and adjustables also experienced an increase in missed tackles between the 2018 and 2019 seasons. When comparing 2019 to 2020a, adjustables and backs increased their average speed and distance, whilst all positional groups increased average acceleration for both whole-match and BIP analyses. When comparing 2020a to 2020b, all positional groups experienced reduced average speed and average acceleration for both whole-match and BIP analyses. Forwards experienced an increased number of tackles and carries, adjustables experienced an increased number of carries, and backs experienced an increased number of missed tackles when comparing these variables between seasons 2020a and 2020b.
Rule changes have a greater effect on whole-match duration and locomotor characteristics than on those reported during BIP periods, which suggests the implemented rule changes have removed stagnant time from matches. Amendments to tackle-related rules within matches (e.g., introduction of the 'six-again' rule) increase the number of collision-related events such as carries and tackles.
Editorial ‘Tackling’ safety through a systems thinking approach: building safety culture within sport
In 2023, we described a collective approach to safety in rugby (including league, union and sevens), outlining the different stakeholders along the passive–active injury prevention intervention continuum.1 We highlighted the current ‘passive’ measures in place to reduce concussion risk including tackle law policies, and the importance of promoting ‘active’ measures such as good tackle technique training. To extend this collective approach, the purpose of this editorial is to (1) highlight the importance of collecting and reporting on ‘near misses’ to promote safety culture and (2) describe a systems thinking method to advocate for shared responsibility in injury prevention.
Incidence, severity and burden of injury and illness at the men's, women's and wheelchair Rugby League World Cup 2021
Objectives: Describe the injury and illness incidence, severity and burden during the men's, women's and wheelchair Rugby League World Cup (RLWC). Design: Retrospective cohort epidemiological study. Methods: Injury and illness diagnosis and estimated return-to-play duration following consensus definitions were reported for men's (n = 16 teams), women's (n = 8 teams) and wheelchair (n = 8 teams) players during the 2021 RLWC. Results: Match injury incidence per 1000 player-hours was 91.2 (95 % CI; 74.0 to 111.1, n = 98) injuries for men, 115.4 (95 % CI; 88.1 to 148.5, n = 60) injuries for women, and 80.0 (95 % CI; 45.7 to 129.9, n = 16) injuries for wheelchair, with mean severities of 17, 10 and 8 days, respectively. Training injury incidence per 1000 player-hours was 2.4 (95 % CI; 1.3 to 3.9, n = 15) injuries for men, 1.7 (95 % CI; 0.5 to 4.3, n = 4) injuries for women, and 6.4 (95 % CI; 2.1 to 15.0, n = 5) injuries for wheelchair. Match concussion incidence per 1000 player-hours was 11.2 (95 % CI; 5.8 to 19.5, n = 12) concussions for men, 19.2 (95 % CI; 9.2 to 35.4, n = 10) concussions for women, and 10.0 (95 % CI; 1.2 to 36.1, n = 2) concussions for wheelchair. There were 13, 5 and 3 episodes of illness in the men's, women's and wheelchair competitions, respectively. Conclusions: Injury incidence was highest in women, whereas the highest severity was found in men. Concussion incidence was highest in women. The match injury incidence at the RLWC was similar to domestic elite rugby league. Different factors, such as variations in physical characteristics, rule differences, and travel could have contributed to injury and illness observations.
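Incidence per 1000 player-hours with a 95% CI can be sketched as follows, assuming a Poisson model with a normal approximation on the log scale (SE(ln rate) = 1/√n). The player-hours value in the test is back-calculated from the reported men's match rate purely for illustration, and the paper's exact CI method may differ slightly.

```python
import math

def match_injury_incidence(injuries, player_hours):
    """Injuries per 1000 player-hours with an approximate 95% CI,
    treating the injury count as Poisson-distributed."""
    rate = 1000 * injuries / player_hours
    se_log = 1 / math.sqrt(injuries)            # SE of ln(rate)
    lower = rate * math.exp(-1.96 * se_log)
    upper = rate * math.exp(+1.96 * se_log)
    return rate, lower, upper
```

For the men's match data (n = 98 injuries), this approximation reproduces a CI close to the reported 74.0 to 111.1.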
This study aimed to introduce a novel Bayesian Mixture Model approach to the development of an EPV model in rugby league, which could produce a smooth pitch surface and estimate individual possession outcome probabilities. 99,966 observations from the 2021 Super League season were used. A set of 33 centres (30 in the field of play, 3 in the opposition try area) were located across the pitch. Each centre held the probability of five possession outcomes occurring (converted/unconverted try, penalty, drop goal and no points). Probabilities at each centre were interpolated to all locations on the pitch and estimated using a Bayesian approach. An EPV measure was derived from the possession outcome probabilities and their points value. The model produced a smooth pitch surface, which was able to provide different possession outcome probabilities and EPVs for every location on the pitch. Differences between team attacking and defensive plots were visualised and an actual vs expected player rating system was developed. The model provides significantly more flexibility than previous zonal approaches, allowing much more insightful results to be obtained. It could easily be adapted to other sports with similar data structures.
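The final EPV step described above (combining possession outcome probabilities with their points value) reduces to a probability-weighted sum. A minimal sketch, assuming standard rugby league scoring for the five outcomes in the abstract's order (converted try 6, unconverted try 4, penalty 2, drop goal 1, no points 0); the probabilities in the example are invented.

```python
def expected_possession_value(probabilities, point_values=None):
    """EPV at a pitch location: the probability-weighted sum of the
    points each possession outcome is worth.

    Outcome order follows the abstract: converted try, unconverted try,
    penalty, drop goal, no points.
    """
    if point_values is None:
        point_values = [6, 4, 2, 1, 0]  # standard rugby league scoring
    assert abs(sum(probabilities) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * v for p, v in zip(probabilities, point_values))
```

Evaluating this at every interpolated pitch location yields the smooth EPV surface the model produces, and comparing actual points scored with the summed EPV of possessions underlies the actual-vs-expected player rating idea.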
Tackle technique of rugby union players during head impact tackles compared to injury-free tackles
The majority of head injuries in rugby union occur during tackles in which the head receives an impact. Head-impacted tackles may be a result of poor tackle technique. Therefore, the purpose of this study was to analyse ball-carrier and tackler technique proficiency in head-impacted tackles and compare the technique proficiency to successfully completed tackles in real-match situations. Design: Retrospective video analysis. Methods: Video footage of head impacts involving the 'head impacted player' (n = 157) and the opposing 'impacting player' (n = 156) was scored for contact technique using a list of technical criteria and compared to contact technique scores of role- and tackle-type-matched injury-free, successful tackles (n = 170). Results: Ball-carriers contacting their head during front-on head-impacted tackles (mean 6.4, 95%CI 5.6–7.1 AU, out of a total score of 14) scored significantly lower than the 'impacting player' (mean 8.1, 95%CI 7.1–9.1 AU, p < 0.01, ES = 0.5, small) and successful ball-carriers (successful ball-carrier mean 9.4, 95%CI 8.9–9.9 AU, p < 0.0001, ES = 1.1, moderate). Tackler contact proficiency scores during successful front-on tackles (mean 12.3, 95%CI 11.6–12.9 AU, out of a total score of 16) were significantly greater than tackler contact proficiency scores for the 'head impacted player' (mean 9.8, 95%CI 8.6–10.9, p < 0.001, ES = 0.8, moderate) and the 'impacting player' (mean 9.3, 95%CI 8.4–10.1, p < 0.0001, ES = 1.0, moderate). Conclusions: Both the ball-carrier and the tackler show a technical deficiency when there is a head impact in matches. The implication of this finding is that players and coaches need to acknowledge that both the ball-carrier and the tackler are responsible for each other's safety during the tackle.
The cluster analysis of elite rugby league players identified groups of distinct playing positions that can be referred to as broad positional groups. However, the identified positional groups were based on traditional indicators (physical and technical–tactical) that provided no information about the exact match-based movement activities that led to such similarity grouping, and the classification of elite rugby league players into these broad positional groups remains unexplored. Hence, this study sought the best model to classify elite rugby league players into positional groups, using data characterised by movement patterns to uncover the similar movement activities of distinct playing positions within a positional group. Key movement patterns for the positional group classification and differences between the groups were also investigated. A total of 18,173 unique movement patterns were derived from 422 players' GPS data across the 2019 and 2020 seasons, of which only 36 were identified as key patterns. The highest classification accuracies were 77.58% using all unique patterns and 74.5% using the key patterns, outperforming studies that used traditional indicators. Further analyses based on key patterns revealed differences between forwards and backs. These findings establish movement patterns as viable indicators to classify rugby league players into positional groups, enabling coaches and trainers to develop position-specific training programmes that cater to the unique physical demands of each position, leading to better player development and team performance. Movement patterns are therefore recommended as an alternative approach to quantifying players' external loads and obtaining granular information.
PURPOSE: Injuries result from complex interactions between a variety of internal and external risk factors. Due to the complex-systems nature of injuries, it is not possible to place responsibility for injury risk management solely within a single domain of professional practice. Instead, an interdisciplinary collaboration between technical/tactical coaches, strength and conditioning coaches, team doctors, physical therapists and sport scientists is more likely to have a meaningful impact on injury risk. The purpose of this study is to describe the development and application of a multidisciplinary model for reducing team injury risk in professional rugby union. METHODS: Epidemiological injury data were collected from a team of professional rugby union players over the course of 5 consecutive seasons. Following each season, a multidisciplinary audit was conducted to identify areas where risk mitigation strategies could be applied within the team environment. These strategies took the form of technical/tactical, strength and conditioning, player monitoring, medical and therapeutic interventions that were all applied concurrently (Figure 1). The effectiveness of this program was assessed against the total team injury burden per season, as well as the burden from contact and non-contact injuries. 95% confidence intervals were calculated using standard equations, and values were considered significantly different if the 95% confidence intervals did not overlap. RESULTS: Overall team injury burden decreased significantly (9, 95%CI 5 to 13 %) from 2012 to 2016. Non-contact injuries were also significantly reduced (39, 95%CI 34 to 44 %), while contact injury burden increased to a lesser degree (21, 95%CI 15 to 27 %) during the same period. Specific examples of multidisciplinary interventions are provided. CONCLUSIONS: The range of skills required to effectively manage injury risk in professional collision sport crosses disciplinary boundaries.
The evidence presented here points to the effectiveness of a multidisciplinary approach to reducing injury risk. PRACTICAL APPLICATION: The multidisciplinary model of injury risk management will encourage increased collaboration across professional disciplines within sport. This model will likely be applicable across a range of team and individual sports.
Research has characterised the strength characteristics of elite youth male soccer players, although little is known about female players. This study investigated the influence of age and maturity status on strength characteristics in 157 female soccer players (U16: n=46, U14: n=43, U12: n=38, U10: n=30), recruited from three elite female soccer academies. Linear mixed models were used to determine the difference by age or maturation. Peak force (PF) was possibly and likely greater for older age groups; however, relative PF was most likely trivial between consecutive age groups. Relative impulse at 100 and 300 ms was very likely greater at U12 than U10, likely and possibly less at U12 than U14, and most likely less and possibly greater at U16 than U14. Relative PF was likely less at pre-peak height velocity (Pre-PHV) than Circa-PHV, and at Circa- than Post-PHV. Relative impulse at 100 and 300 ms was most likely lower for Pre-PHV than Circa-PHV and Pre-PHV than Post-PHV, and possibly greater at Circa- than Post-PHV. Age and maturation impact upon PF and impulse; thus, practitioners should account for individual maturation status when comparing players. These data provide reference strength data for elite youth female soccer players, which can be used when monitoring player development.
To minimize underperformance, injury, and illness, and to enhance readiness for training and match-play, post-match responses are commonly monitored within professional rugby. As no clear consensus exists regarding the magnitude and duration of post-match recovery, this review summarized the literature (17 studies yielded from literature searching/screening) reporting neuromuscular (countermovement jump [CMJ], peak power output [PP], and flight time [FT]), biochemical (creatine kinase [CK]) or endocrine (cortisol [C] and testosterone [T] concentrations), and subjective (wellness questionnaire and muscle soreness) indices after rugby match-play. For neuromuscular responses (11 studies), reductions in PP <31.5% occurred <30 minutes after match, returning to baseline within 48-72 hours. Post-match reductions in FT of <4% recovered after 48 hours. For biochemical and endocrine responses (14 studies), increases in CK, ranging from 120 to 451%, peaked between 12 and 24 hours, returning to baseline within 72 hours of match-play. Initial increases of <298% in C and reductions in T concentrations (<44%) returned to pre-match values within 48-72 hours. Mood disturbances (6 studies) required 48-72 hours to normalize after peak decrements of <65% at 24 hours. This review highlights that 72 hours were needed to restore perturbations in neuromuscular, biochemical and endocrine, and subjective/perceptual responses after competitive rugby match-play. Notably, only 4 studies reported responses in more ecologically valid scenarios (i.e., those in which regular training and recovery strategies were used) while also reporting detailed match demands. A lack of research focusing on youth players was also evident, as only 3 studies profiled post-match responses in younger athletes. Deeper insight regarding post-match responses in ecologically valid scenarios is therefore required.
In professional academy rugby league (RL) players, this two-part study examined: A) the within- and between-day reliability of isometric mid-thigh pulls (IMTP), countermovement jumps (CMJ), and a wellness questionnaire (n = 11), and B) profiled the responses with acceptable reliability (no between-trial differences, between-day coefficient of variation (CV) ≤10% and intraclass correlation coefficient (ICC) ≥0.8) for 120 h (baseline: -3, +24, +48, +72, +96, +120 h) following RL match-play (n = 10). In part A, force at 200 and 250 ms, and peak force (PF), demonstrated acceptable within- (CV%: 3.67-8.41%, ICC: 0.89-0.93) and between-day (CV%: 4.34-8.62%, ICC: 0.87-0.92) reliability for IMTP. Most CMJ variables demonstrated acceptable within-day reliability (CV%: 3.03-7.34%, ICC: 0.82-0.98), but only six (i.e., flight-time, PF, peak power (PP), relative PP, velocity at take-off (VTO), jump-height (JH)) showed acceptable between-day reliability (CV%: 2.56-6.79%, ICC: 0.83-0.91). Only total wellness demonstrated acceptable between-day reliability (CV%: 7.05%, ICC: 0.90) from the questionnaire. In part B, reductions of 4.75% and 9.23% (vs. baseline: 2.54 m∙s⁻¹; 0.33 m) occurred at +24 h for CMJ VTO and JH, respectively. Acceptable reliability was observed in some, but not all, variables, and the magnitude and time-course of post-match responses were test and variable specific. Practitioners should therefore be mindful of the influence that the choice of recovery monitoring tool may have upon the practical interpretation of the data.
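The acceptability criteria used in part A (between-day CV ≤10% and ICC ≥0.8) can be sketched as a short calculation. A minimal Python example, assuming a one-way random-effects ICC(1,1) and a per-athlete CV averaged across the squad; the study's exact ICC model and CV method are not stated here, so both choices (and the data) are assumptions for illustration:

```python
import numpy as np

def reliability_stats(data):
    """Between-day reliability for an (athletes x days) array.

    Returns (mean CV%, one-way random-effects ICC(1,1))."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    # Coefficient of variation per athlete, averaged across the squad
    cv = np.mean(np.std(data, axis=1, ddof=1) / np.mean(data, axis=1)) * 100
    # One-way ANOVA variance partition for ICC(1,1)
    grand = data.mean()
    row_means = data.mean(axis=1)
    ss_between = k * np.sum((row_means - grand) ** 2)
    ss_within = np.sum((data - row_means[:, None]) ** 2)
    msb = ss_between / (n - 1)          # between-athlete mean square
    msw = ss_within / (n * (k - 1))     # within-athlete mean square
    icc = (msb - msw) / (msb + (k - 1) * msw)
    return cv, icc

# Hypothetical IMTP peak force (N) for 5 athletes on 2 days
day1 = [2000, 2500, 3000, 3500, 4000]
day2 = [2050, 2450, 3050, 3450, 4050]
cv, icc = reliability_stats(np.column_stack([day1, day2]))
acceptable = cv <= 10 and icc >= 0.8  # acceptance criteria used in the study
```

With consistent day-to-day values like these, the CV is small and the ICC approaches 1, so the variable would pass the acceptance criteria.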
The effect of physical fatigue on tackling technique in Rugby Union
Objectives To measure the change in tackling technique of rugby union players following an acute bout of physically fatiguing exercise. Design Randomised cross-over study design with a physical fatigue condition and a no-physical fatigue condition (control). Methods Nineteen male amateur club rugby union players (n = 19) participated, and a total of 887 tackles were analysed. During each condition, each player performed four sets of six tackles (three dominant and three non-dominant shoulder) on a contact simulator. Between each set of tackles in the physical fatigue condition, players performed the prolonged high-intensity intermittent running ability test. Using video, each player's tackling proficiency for each tackle was measured by awarding either one point or zero points depending on whether a particular technique was performed or not. The sum of these points represents the player's tackling proficiency (score out of 9, measured in arbitrary units). Results For the non-dominant shoulder, a difference between fatigue and control was found at set two (Fatigue 7.3 [7.1–7.6] AU vs. Control 7.6 [7.4–7.9] AU, p = 0.06, ES = 0.3 small) and set three (Fatigue 7.3 [7.0–7.5] AU vs. Control 7.7 [7.5–7.9] AU, p = 0.006, ES = 0.5 small). During the control condition, tackling proficiency scores improved from baseline for non-dominant tackles (Baseline 7.4 [7.2–7.6] AU vs. Set two 7.6 [7.4–7.9] AU, p = 0.08, ES = 0.3 small; vs. Set three 7.7 [7.5–7.9] AU, p = 0.05, ES = 0.4 small). Conclusions This study shows that physical fatigue can potentially affect rugby union players' tackling technique. Therefore, players should develop the technical capacity to resist the effects of physical fatigue during the tackle.
Sleep Quality and Quantity of International Rugby Sevens Players During Pre-season
Leduc, C, Jones, B, Robineau, J, Piscione, J, and Lacome, M. Sleep quality and quantity of international rugby sevens players during pre-season. J Strength Cond Res 33(7): 1878-1886, 2019-The aim of this study was to investigate the influence of training load on objective and subjective sleep measures among elite rugby sevens players during pre-season. Nine international male rugby sevens players participated in this study. Actigraphic and subjective sleep assessments were performed on a daily basis to measure sleep parameters. Training load was measured during the entire pre-season period, and sleep data from the highest and lowest training load weeks were used in the analysis through magnitude-based inferences. During the highest training load, likely to possibly small, moderate decreases in time in bed (effect sizes; ±90% confidence limits: -0.42; ±0.44 for session rating of perceived exertion [sRPE], -0.69; ±0.71 for total distance covered [TDC]) and total sleep time (-0.20; ±0.37 for sRPE, -0.23; ±0.35 for TDC) were found. Possibly small (-0.21; ±0.35 for high-speed distance, -0.52; ±0.73 for acceleration/deceleration [A/D]) and likely moderate (-0.74; ±0.67 for TDC) decreases were observed in subjective sleep quality. Possibly small to very likely moderate changes in sleep schedule were observed. Sleep quantity and subjective sleep quality appear to deteriorate during periods of higher training load. This study highlights the necessity of monitoring and improving sleep among elite rugby sevens players, especially during intense periods of training.
Correction: Quantifying Collision Frequency and Intensity in Rugby Union and Rugby Sevens: A Systematic Review
The following errors are noted and corrected: 1. In Abstract, Results, sentence 5: ‘156.1 (121.2–191.0)’ should have been ‘171.2 (140.5–201.8)’. 2. In section Microtechnology, Rugby Union Training, final sentence: ‘contacts’ should have been ‘tackles’ and vice versa. 3. In section Video-Based Analysis, Rugby Union Match Play, sentence 3: ‘156.1 (121.2–191.0)’ should have been ‘171.2 (140.5–201.8)’. 4. In Table 4: The following additional data have been added to the Vaz et al. (2010) (89) row: Column 2: S12 competition: 95 matches; Column 5: 112.7 ± 33.1; Column 6: 99.4 ± 3.0. The original version of Table 4 has been replaced with the version shown below: 5. Fig. 5c: The two entries for Vaz et al. 2010 (89) have now been removed from this figure. The original version of Fig. 5 has been replaced with the version shown below.
Relationships of Contact Technique in Training and Matches With Performance and Injury Outcomes in Male Rugby Union
The aims of this study were 3-fold: (1) to compare technical proficiency scores between training and matches for tackling, ball-carrying, and rucking outcomes; (2) to determine the relationship between technique in training and technique in matches for tackling, ball-carrying, and rucking; and (3) to determine how contact technique (in training and matches) relates to match performance and injury outcomes. Twenty-four male players from an amateur rugby union club participated in the study. At the beginning of the season, players' contact technique proficiency was assessed in a training drill. Contact technique in matches was assessed during 14 competitive matches. Technical proficiency was assessed using standardized criteria, and the outcomes of each tackle, ball carry, and ruck were recorded. In training and matches, positive performance outcomes were associated with higher contact technique proficiency scores. For instance, in both settings, tackle technique was significantly lower in missed tackles when compared to effective and ineffective tackles. Players' contact technique scores in matches also had a positive effect on their tackle performance in matches. Ball-carry technique was associated with tackle breaks in matches (P < .05, r² = .31). In training and match environments, tackler, ball-carrier, and ruck technique scores were significantly associated with effective tackles, ball carries, and rucks. Despite the relationship between technical proficiency scores and performance, scores were small to moderately higher in training compared with matches. The current study highlights the importance of contact skill training, in different environments and conditions, to ensure that skills developed in training are transferred to match performance.
Mental Fatigue Impairs Tackling Technique in Amateur Rugby Union Players
Purpose: To test the effects of mental fatigue (MF) on tackling technique on the dominant and nondominant shoulders in rugby union. Methods: Twenty male amateur rugby union players and a total of 953 tackles were analyzed. A randomized crossover counterbalanced design was used across a non-MF (control) and an MF condition. During each condition, each player performed 24 tackles, divided into 4 sets of 6 tackles (3 tackles on each shoulder). In the MF condition, players performed the Stroop Task between each set of tackles. A video recording of each tackle was used to evaluate each player's technical proficiency. A score of 1 point was awarded if a specific technique was performed correctly, and 0 points if not. The total score, measured in arbitrary units (AU) out of 11, represents the player's overall tackling proficiency. Results: Overall, players displayed a significantly lower technical proficiency score in the MF condition compared to control (set 2: control 7.30 [7.04–7.57] AU vs MF 6.91 [6.70–7.12] AU, P = .009, effect size [ES] = 0.30 small; set 3: control 7.34 [7.11–7.57] AU vs MF 6.88 [6.66–7.11] AU, P = .002, ES = 0.37 small). For the nondominant shoulder, players had a significantly lower technical proficiency score during the MF condition at set 2 (control 7.05 [6.68–7.41] AU vs MF 6.69 [6.42–6.96] AU, P = .047, ES = 0.29 small) and set 3 (control 7.14 [6.83–7.45] AU vs MF 6.61 [6.35–6.87] AU, P = .007, ES = 0.49 small). Conclusions: MF can diminish a player's overall tackling proficiency, especially when tackling on the nondominant shoulder. The physiological mechanism for this finding may be impaired executive function and suboptimal functioning of neural signals and pathways, which result in less skillful coordination of movement.
To further understand and explain MF-induced physiological changes in tackling, the feasibility of monitoring brain activity (such as electroencephalogram) and neuromuscular function (such as electromyogram) needs to be investigated. The findings from this study may also contribute to the development of more effective tackle training programs for injury prevention and performance.
Editorial Team Sport Risk Exposure Framework-2 (TS-REF-2) to identify sports activities and contacts at increased SARS-CoV-2 transmission risk during the COVID-19 pandemic
The coronavirus disease 2019 (COVID-19) pandemic has caused disruption to professional and recreational sports across the world. The SARS-CoV-2 virus can be transmitted by relatively large respiratory droplets that behave ballistically, and by exhaled aerosol droplets, which potentially pose a greater risk. This review provides a summary of end-to-end SARS-CoV-2 transmission risk factors for sport and an overview of transmission mechanisms to be considered by all stakeholders. The risk of SARS-CoV-2 transmission is greatest indoors, and is primarily influenced by the ventilation of the environment and the close proximity of individuals. The SARS-CoV-2 transmission risks outdoors, e.g. via water, and from fomites, appear lower than initially thought. Mitigation strategies include good end-to-end scenario planning of activities to optimise physical distancing, face mask wearing, and the hygiene practices of individuals, the environment and equipment. The identification and removal of infectious individuals should be undertaken by means of temperature checks and COVID-19 symptom screening, and the use of diagnostic monitoring tests to identify asymptomatic individuals. Using adequate video footage, data from proximity technology and subject interviews, the identification and isolation of ‘close contacts’ should also be undertaken to limit SARS-CoV-2 transmission within sporting environments and into the wider community. Sports should aim to undertake activities outdoors where possible, given the lower SARS-CoV-2 transmission risk in comparison to indoor environments.
Video‐based technical feedback and instruction improves tackling technique of community rugby union players
ABSTRACT
The aims of this study were to test the change and retention of players' overall tackling technique and technical components following a player-specific video-based technical feedback and instruction intervention on both their dominant and non-dominant shoulders. Twenty-four (n = 24) rugby union players participated in a non-randomized control-intervention study, which consisted of a video-based technical feedback and instruction group (video-based technical feedback) and a no video-based technical feedback and instruction group (control). During 3 sessions (baseline, intervention, retention) separated by one week, participants in each group performed six tackles (3 tackles on each shoulder) on a tackle simulator. In total, 432 tackles (video-based technical feedback = 216, control = 216) were analysed. Each tackle was analysed using a standardized list of technical criteria (arbitrary units, AU). For the dominant shoulder, tackling technique scores significantly improved from baseline to intervention for both groups. For the non-dominant shoulder, only the video-based technical feedback group improved their tackling technique from baseline to intervention (baseline 6.89 [6.33–7.45] AU vs. intervention 7.72 [7.35–8.10] AU, p = .001, ES = 0.60 moderate). For the retention session, the video-based technical feedback group scored significantly higher than the control group for dominant (video-based technical feedback 8.00 [7.60–8.40] AU vs. control 7.22 [6.83–7.62] AU, p = .014, ES = 0.66 moderate) and non-dominant (video-based technical feedback 8.11 [7.81–8.41] AU vs. control 7.22 [6.90–7.55] AU, p = .004, ES = 0.96 moderate) tackles. This study demonstrates the efficacy of video-based technical feedback as a method to optimize tackle training for player safety and performance.
BACKGROUND: Collisions in rugby union and sevens have a high injury incidence and burden, and are also associated with player and team performance. Understanding the frequency and intensity of these collisions is therefore important for coaches and practitioners to adequately prepare players for competition. The aim of this review is to synthesise the current literature to provide a summary of the collision frequencies and intensities for rugby union and rugby sevens based on video-based analysis and microtechnology. METHODS: A systematic search using key words was done on four different databases from 1 January 1990 to 1 September 2021 (PubMed, Scopus, SPORTDiscus and Web of Science). RESULTS: Seventy-three studies were included in the final review, with fifty-eight studies focusing on rugby union, while fifteen studies explored rugby sevens. Of the included studies, four focused on training (three in rugby union and one in sevens), three focused on both training and match-play (two in rugby union and one in rugby sevens), while the remaining sixty-six studies explored collisions from match-play. The studies included provincial, national, international, professional, experienced, novice and collegiate players. Most of the studies used video-based analysis (n = 37) to quantify collisions. In rugby union, on average a total of 22.0 (19.0-25.0) scrums, 116.2 (62.7-169.7) rucks, and 171.2 (140.5-201.8) tackles occur per match. In sevens, on average 1.8 (1.7-2.0) scrums, 4.8 (0-11.8) rucks and 14.1 (0-32.8) tackles occur per match. CONCLUSIONS: This review showed more studies quantified collisions in matches compared to training. To ensure athletes are adequately prepared for match collision loads, training should be prescribed to meet the match demands. Per minute, rugby sevens players perform more tackles and ball carries into contact than rugby union players, and forwards experienced more impacts and tackles than backs.
Forwards also perform more very heavy impacts and severe impacts than backs in rugby union. To improve the relationship between matches and training, integrating both video-based analysis and microtechnology is recommended. The frequency and intensity of collisions in training and matches may lead to adaptations for a "collision-fit" player and lend itself to general training principles such as periodisation for optimum collision adaptation. Trial Registration PROSPERO registration number: CRD42020191112.
The efficacy of a multimodal recovery strategy implemented within 4 hours of rugby league (RL) training was investigated using repeated-measures, randomized, crossover methods in 10 professional academy RL players (age: 17 ± 1 years). Following standardized training (5,383 m covered, 350-m high-speed running, 28 repeated high-intensity efforts, 24 collisions), players completed a multimodal recovery (REC) strategy (i.e., ∼640 kcal meal + ∼1,285 kcal snacks or drinks, cold-water immersion, sleep hygiene recommendations) or control (i.e., ∼640 kcal meal: CONT) practices. Isometric mid-thigh pulls (IMTP), countermovement jumps (CMJ), and wellness questionnaires were completed before (−3 hours) and after (+24, +48 hours) training. The recovery strategy influenced IMTP peak force (p = 0.026), but between-trial differences were undetectable. No other between-trial effects (all p > 0.05) were seen for IMTP, CMJ, or wellness variables. Training-induced reductions in CMJ peak power (−4 ± 6% vs baseline: 4,878 ± 642 W) at +24 hours (p = 0.016) dissipated by +48 hours. Fatigue and lower-body soreness reduced by 16 ± 19% (p = 0.01) and 32 ± 44% (p = 0.024) at +48 hours versus +24 hours, respectively. Relative to CONT (i.e., posttraining nutrition), the effects of a single bout of recovery practices appeared limited when implemented after RL-specific training. Therefore, when training included limited collisions, balanced postexercise meals appeared equally effective relative to a multimodal recovery strategy. Transient changes in performance and wellness variables after training may have implications for practitioners. Consecutive training sessions, including a high frequency and intensity of eccentric muscle actions, should be carefully planned, especially near match-play.
This study aimed to assess the self-reported frequency and severity of gastrointestinal symptoms (GIS) at rest and around rugby training and match play in male and female rugby union players. An online questionnaire was sent to registered rugby union players (sevens or fifteens). Thirteen GIS were assessed alongside perceptions of appetite around rugby and rest using Likert and visual analog scales. Questions investigating a range of medical and dietary factors were included. Three hundred and twenty-five players (male n=271, female n=54) participated in the study. More frequent GIS (at least one GIS experienced weekly/more often) was reported by players at rest (n=203; 62%) compared to around rugby (n=154; 47%). The overall severity of GIS was low (mild discomfort), but a portion of players (33%) did report symptoms of moderate severity around rugby. Female players reported more frequent and severe symptoms compared to male counterparts (p<0.001). Self-reported appetite was significantly lower after matches compared to training. There were no dietary or medical factors associated with GIS severity scores. This study describes GIS characteristics in male and female rugby union players. Half of the players assessed experienced some form of GIS that may affect nutrition, training, or performance, and should thus be a consideration for practitioners supporting this cohort.
The original article can be found online at https://doi.org/10.1007/s40279-023-01953-7
OBJECTIVES: The aim of this study was to examine head acceleration event (HAE) propensity and incidence during elite-level men's and women's rugby union matches. METHODS: Instrumented mouthguards (iMGs) were fitted in 92 male and 72 female players from nine elite-level clubs and three international teams. Data were collected during 406 player matches (239 male, 167 female) using iMGs and video analysis. Incidence was calculated as the number of HAEs per player hour and propensity as the proportion of contact events resulting in an HAE at a range of linear and angular thresholds. RESULTS: HAE incidence above 10 g was 22.7 and 13.2 per hour in men's forwards and backs and 11.8 and 7.2 per hour in women's forwards and backs, respectively. Propensity varied by contact event, with 35.6% and 35.4% of men's tackles and carries and 23.1% and 19.6% of women's tackles and carries producing HAEs above 1.0 krad/s2. Tackles produced significantly more HAEs than carries, and incidence was greater in forwards compared with backs for both sexes and in men compared with women. Women's forwards were 1.6 times more likely to experience a medium-magnitude HAE from a carry than women's backs. Propensity was similar from tackles and carries, and between positional groups, while significantly higher in men than women. The initial collision stage of the tackle had a higher propensity than other stages. CONCLUSION: This study quantifies HAE exposures in elite rugby union players using iMGs. Most contact events in rugby union resulted in lower-magnitude HAEs, while higher-magnitude HAEs were comparatively rare. An HAE above 40 g occurred once every 60-100 min in men and 200-300 min in women. Future research on mechanisms for HAEs may inform strategies aimed at reducing HAEs.
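The two exposure metrics defined in this abstract (incidence as HAEs per player hour, propensity as the proportion of contact events producing an HAE above a threshold) are simple rates. A minimal sketch; the numbers below are hypothetical and not taken from the study:

```python
def hae_incidence(n_haes, player_minutes):
    """HAE incidence: events per player-hour of match exposure."""
    return n_haes / (player_minutes / 60.0)

def hae_propensity(n_haes_over_threshold, n_contact_events):
    """Propensity: percentage of contact events producing an HAE
    above a given linear (g) or angular (krad/s^2) threshold."""
    return 100.0 * n_haes_over_threshold / n_contact_events

# Hypothetical example: 45 HAEs >10 g across 120 player-minutes,
# and 89 of 250 tackles exceeding 1.0 krad/s^2
incidence = hae_incidence(45, 120)    # 22.5 per player-hour
propensity = hae_propensity(89, 250)  # 35.6%
```

Expressing exposures per player hour is what allows comparison across positional groups and sexes despite different amounts of match time.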
Objectives Contact with the head should be avoided during a rugby league tackle, given the inherent risks of head injuries. This study aimed to characterise a sample of tackles retrospectively identified as resulting in a potential head injury by the Rugby Football League (RFL) match review panel. Design Retrospective video analysis study. Methods 746 tackles, identified by the RFL match review panel from the men's 2018 and 2019 Super League seasons, were analysed. Video clips were coded using an adapted analysis framework, characterising tackle stage, head contact, affected player, offending player/surface, offending body part/surface and tackle sanctioning. Data were reported as frequencies and percentages. Results The majority of tackles resulting in a potential head injury occurred in the initial tackle contact stage (n = 590, 79.2%). The ball-carrier was most frequently affected (n = 372, 49.9%) compared to initial tacklers (n = 213, 28.6%). The initial tackler was the most frequently impacting player (n = 268, 36.0%), with the majority of potential head injuries occurring from direct head contact by the arm (n = 230, 34.1%), shoulder (n = 170, 25.2%) and head/neck (n = 145, 21.5%) of the impacting player. Head contact was present in 90.6% (n = 675) of the tackles resulting in a potential head injury. Only 16.1% (n = 109) of direct head contact events received a sanction from on-field match officials. Conclusion The initial tackle contact between the ball-carrier and initial tackler remains the area of focus for research into potential head injuries in elite-level men's rugby league, to improve awareness and understanding of the mechanisms of injury.
The purpose of the present study was to evaluate the anthropometry and fitness, and change in these characteristics over time, of youth rugby league players by using maturity status to determine annual categories instead of traditional chronological annual-age grouping. One hundred and twenty-one male rugby league players were assessed using anthropometric (i.e., height, sitting height, body mass and sum of four skinfolds) and fitness (i.e., vertical jump, medicine ball chest throw, 10 m and 20 m sprint and multi-stage fitness test; MSFT) measures over a 5-year period. Each player was classified into one of six maturity groups based on their maturity offset (years from peak height velocity; e.g., 1.5 YPHV). MANOVA analyses identified significant (p<0.001) main effects of maturity group for cross-sectional characteristics and longitudinal change in performance over time. Analyses demonstrated that more mature groups had greater anthropometric and fitness characteristics, except for endurance performance (MSFT -2.5 YPHV = 1872 ± 18 m vs 2.5 YPHV = 1675 ± 275 m). For longitudinal changes in characteristics over time, a significant effect was only identified for height and sitting height (p<0.05). These findings provide comparative data for anthropometric and fitness characteristics, and change in performance over time, in accordance with maturity status within youth rugby league players. Classifying players into annual maturity groups may be an additional or alternative assessment method for evaluating anthropometry and fitness performance in adolescent populations. Further, tracking performance changes over time, especially in relation to maturation, may reduce the limitations associated with chronological annual-age grouping.
A retrospective analysis of the longitudinal development of physical qualities associated with career attainment in academy rugby league players
Understanding how players experience head-acceleration events (HAE) whilst playing rugby is a priority area of research. In both rugby union and league, video analysis frameworks have been developed to comprehensively define key features of contact events. However, these frameworks were developed prior to recent advances in our understanding regarding the proportion of HAEs that occur due to head-to-ground mechanisms, and do not consider important post-contact variables. Therefore, there is a need to supplement the existing frameworks in order to capture how players fall and land post-tackle. This study used the Delphi method with an interdisciplinary, international team of researchers, coaches and video analysts (working with a variety of playing levels in rugby union and league) to establish a consensus for defining falling and landing events. Subsequently, a draft framework was developed, on which the research team provided feedback via online meetings, culminating in a falling/landing framework on which each member of the research team rated agreement via a nine-point Likert-type scale, with consensus deemed to be reached when the median score was ≥ 7. The median scores were 8.0 (7.8–8.0), 8.0 (7.0–9.0) and 8.0 (8.0–9.0) for ‘Additional Contextual Characteristics for Carry and Tackle Events,’ ‘Falling Characteristics of Tackle and Carry Events,’ and ‘Landing Characteristics of Tackle and Carry Events,’ respectively. This novel framework defines more comprehensive falling and landing variables to capture post-contact injury and performance markers in both rugby union and league, through a standardised approach.
Movement demands of international rugby league players during the 2013 rugby league world cup by group stage and position
Youth Rugby
This chapter aims to introduce the content of the Young Rugby Player: Science and Application book. This is achieved by summarising the codes of rugby (union and league) and introducing the concept of youth athletic development and its importance for all young rugby players. A key aspect here is providing participation, performance and development goals for all young rugby players regardless of their age, ability and experience. To support this, an applied example of the England Rugby Football Union's Age Grade Rugby system is presented. This applied example showcases how a national organisation has developed a system to support player participation, performance and development related to the needs of young rugby players. The chapter then concludes with a short and concise summary of each chapter, which are all key to supporting organisations and stakeholders in providing evidence-informed practices to young rugby players across the world.
Consensus on a video analysis framework of descriptors and definitions by the Rugby Union Video Analysis Consensus group
Using an expert consensus-based approach, a rugby union Video Analysis Consensus (RUVAC) group was formed to develop a framework for video analysis research in rugby union. The aim of the framework is to improve the consistency of video analysis work in rugby union and help enhance the overall quality of future research in the sport. To reach consensus, a systematic review and Delphi method study design was used. After a systematic search of the literature, 17 articles were used to develop the final framework that described and defined key actions and events in rugby union (rugby). Thereafter, a group of researchers and practitioners with experience and expertise in rugby video analysis formed the RUVAC group. Each member of the group examined the framework of descriptors and definitions and rated their level of agreement on a 5-point agreement Likert scale (1: Strongly disagree; 2: Disagree; 3: Neither agree nor disagree; 4: Agree; 5: Strongly agree). The mean rating of agreement was 4.6 (4.3-4.9), 4.6 (4.4-4.9), 4.7 (4.5-4.9), 4.8 (4.6-5.0) and 4.8 (4.6-5.0) for the tackle, ruck, scrum, line-out and maul, respectively. The RUVAC group recommends using this consensus as the starting framework when conducting rugby video analysis research. Which variables to use (if not all) depends on the objectives of the study. Furthermore, the intention of this consensus is to help integrate video data with other data (e.g., injury surveillance).
Objectives: To compare the game-play characteristics between the European Super League (ESL) and the National Rugby League (NRL) competitions. Methods: Eleven team performance indicators were extracted from each match played by every ESL and NRL team over their respective 2016 seasons. Data were averaged, classified according to competition (two levels: ESL and NRL), and modelled using univariate and multivariate techniques. Specifically, effect size statistics enabled between-group comparisons, while non-metric multidimensional scaling enabled multivariate insights into competition dissimilarity. Results: Seven of the 11 performance indicators showed ‘large’ to ‘very large’ effects. Notably, NRL game-play generated fewer ‘line breaks’, ‘errors’, ‘tackles’ and ‘dummy half runs’ relative to ESL game-play (d > 1.2). Despite the NRL generating fewer ‘all runs’ (d = 1.27 [0.57-1.95]), game-play in this competition generated greater ‘all run distances’ relative to the ESL (d = 1.78 [1.02-2.51]). Non-metric multidimensional scaling revealed clear multivariate competition dissimilarity, with ESL and NRL teams occupying distinct positions on the ordination surface. Further, there was a greater spread in the relative positioning of NRL teams compared to ESL teams, indicating greater team dissimilarity within the NRL. Conclusions: Our observations may be explained by differing competition rule interpretations, in addition to differing game strategies and player skill capabilities. Both coaches and talent recruitment managers associated with these competitions may consider our data to assist with the identification and recruitment of suitable players from these respective competitions.
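The between-competition comparisons above rest on standardised effect sizes. A minimal sketch of Cohen's d with a pooled standard deviation, one common formulation; the study's exact effect size calculation is not stated here, and the per-team values below are hypothetical:

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Cohen's d: standardised mean difference using the pooled SD."""
    na, nb = len(group_a), len(group_b)
    pooled_sd = (((na - 1) * stdev(group_a) ** 2 +
                  (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2)) ** 0.5
    return (mean(group_a) - mean(group_b)) / pooled_sd

# Hypothetical per-team season averages of 'all run distance' (m)
esl = [1350, 1400, 1380, 1420, 1360]
nrl = [1500, 1550, 1530, 1570, 1510]
d = cohens_d(nrl, esl)  # d > 1.2 would fall in the 'very large' band used above
```

The sign of d simply reflects the order of the groups; the magnitude is what is compared against qualitative thresholds such as d > 1.2.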
The tackle in South African youth rugby union – gap between coaches’ knowledge and training behaviour
In youth rugby union matches, tackle-related injuries account for 60% of all injuries, 62% of concussion injuries and almost 50% of spinal cord injuries (youth and amateur). Because of this high risk of injury, the inclusion of the tackle in youth rugby has been a topic of discussion in the public and a high priority research area for World Rugby. What a coach knows and his/her attitude toward player safety directly impacts the risk and performance profile of a player. The purpose of this study is to describe the tackle knowledge, attitudes and training behaviours of youth rugby coaches. The entire population of Western Province Rugby Union Premier A1 division (highest level of school rugby) under-19 rugby coaches (n = 8) completed a knowledge and attitude questionnaire and 96 field-training sessions were observed over four weeks. Coaches rated tackling (mean 3.9, 95% confidence interval 3.3–4.4), rucking (mean 3.8, 95% confidence interval 3.0–4.5) and ball-carrying (mean 3.6, 95% confidence interval 2.6–4.6) as high-risk of injury facets of play (H = 30.8, p < 0.001). Coaching proper technique was rated as very important for safety (mean 4.6, 95% confidence interval 4.2–5.0) and performance (mean 4.8, 95% confidence interval 4.4–5.0, U = 28, p > 0.05). Of the 96 observed training sessions, tackle training was recorded 16% of the time (vs. 84% no tackle training, p < 0.001). Coaches were aware of the risk of injury in the tackle and rated the coaching of proper technique of utmost importance. These positive knowledge and attitudes did not transfer into their tackle training. The discrepancy between coaches' tackle knowledge and attitudes, and their training of the tackle might be related to how competent they believe themselves to be in delivering tackle training.
Purpose: The purpose of this study was to examine the criterion and construct validity of an isometric mid-thigh pull dynamometer to assess whole body strength in professional rugby league players. Methods: Fifty-six male rugby league players, (33 senior and 23 youth professional players) performed four isometric mid-thigh pull efforts (i.e. two on the dynamometer and two on the force platform) in a randomised and counterbalanced order. Results: Isometric peak force was underestimated (P<0.05) using the dynamometer compared to the force platform (95% LoA: -213.5 ± 342.6 N). Linear regression showed that peak force derived from the dynamometer explained 85% (adjusted R2 = 0.85, SEE = 173 N) of the variance in the dependent variable, with the following prediction equation derived: predicted peak force = [1.046 * dynamometer peak force] + 117.594. Cross-validation revealed a non-significant bias (P>0.05) between the predicted and peak force from the force platform, and an adjusted R2 (79.6%), that represented shrinkage of 0.4% relative to the cross-validation model (80%). Peak force was greater for the senior compared to youth professionals using the dynamometer (2261.2 ± 222 cf. 1725.1 ± 298.0 N, respectively; P<0.05). Conclusion: The isometric mid-thigh pull assessed using a dynamometer underestimates criterion peak force but is capable of distinguishing muscle function characteristics between professional rugby league players of different standards.
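The prediction equation reported in the abstract above can be applied directly; a minimal sketch (the function name is illustrative, not from the paper):

```python
def predict_platform_peak_force(dynamometer_peak_force_n: float) -> float:
    """Estimate force-platform peak force (N) from a dynamometer reading,
    using the regression equation reported in the abstract:
    predicted peak force = 1.046 * dynamometer peak force + 117.594."""
    return 1.046 * dynamometer_peak_force_n + 117.594
```

For example, a dynamometer reading of 2000 N would be corrected upwards, consistent with the finding that the dynamometer underestimates criterion peak force.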
Ready or Not: A Narrative Synthesis of Sports Medicine Practitioners' Practices During Return to Play in the Management of Musculoskeletal Injuries
The purpose of this narrative synthesis was to identify and synthesise the literature focused on sports medicine practitioners' (SMPs) decision-making practices during return to play (RTP) after musculoskeletal (MSK) injury. Using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, four electronic databases were searched from the start of the database to July 2024 using terms related to SMPs and RTP in MSK injury. The Appraisal tool for Cross-Sectional Studies (AXIS) and the Joanna Briggs Institute (JBI) critical appraisal tools were used to assess the overall quality of the identified studies. A narrative synthesis format was considered the most appropriate methodological approach to review and synthesise the pool of literature. Data synthesis included the participating SMPs' profession, study sample size, injury location, activity level, RTP outcome measures and results. Data were further characterised by the RTP practices for specific MSK injuries, including spine, shoulder, wrist, hand, hip, knee, ankle and foot. Eighty-seven (n = 87) publications were identified based on the inclusion and exclusion criteria. Forty-seven percent (n = 41) of the studies focused on surgeon practices and 29.9% (n = 26) reported practices of multidisciplinary teams (MDTs). Almost half of all studies (40.2%; n = 35) addressed knee injuries, 85.7% (n = 30/35) specific to the anterior cruciate ligament (ACL). Eighty-three percent (n = 34/41) of medical doctors considered injury and postoperative timelines compared to other SMP groups (47.8%; n = 22/46). Multidisciplinary team studies reported the use of psychological readiness (50%; n = 13/26) and sport-specific testing (38.5%; n = 10/26) criteria in RTP studies. Functional assessment and strength are reported in at least 50% of physiotherapist (n = 18) and rehabilitation specialist (n = 2) studies.
Reference to RTP frameworks, guidelines and protocols in RTP decision-making was found in less than 20% of the publications. Input from other SMPs to assist decision-making was also addressed in less than 20% of the studies. From these studies, shared decision-making with an athlete-centred approach is preferred. The type of sport and the ambition of the athlete were the biggest influencing factors on decisions surrounding RTP, both reported in 26.4% (n = 23) of all SMP studies. This suggests an athlete-centred approach to SMPs' RTP decision-making. Similar RTP criteria were used between practitioner groups, although criteria were weighted differently, due to the different scopes of practice and complexity surrounding RTP decisions. This review provides context for future research to assist and guide RTP decision-making practices after MSK injuries. The need for clear, uncomplicated and practical definitions, guidelines, protocols and criteria will improve the RTP process and reduce the risk of reinjury after MSK injury. This review included all study designs and there was heterogeneity in the analysed studies, which can be viewed as a limitation.
Trial Registration
The review was registered with the International Prospective Register of Systematic Reviews (PROSPERO) (registration ID: CRD42021270638) and OSF registries (registration doi:
Injury incidence and characteristics in South African school first team rugby: A case study
Background: Despite its apparent popularity, participation in the sport of rugby union is accompanied by a significant risk of injury. Concerned parties have recently questioned whether this risk is acceptable within school populations. This is difficult to assess within the South African schools' population as no recent longitudinal injury studies exist. Objectives: To determine the training habits, rugby-related exposure and injury risk within a population of South African high school first team rugby players. Methods: Training and match exposure in both school and provincial competition were examined and the resultant injuries were longitudinally observed for the duration of a South African high school rugby season. Results: Match (79, 95%CI 52-105 injuries/1000 h) and training (7, 95%CI 3-11 injuries/1000 h) injury incidences were demonstrated to be greater than previously reported incidences in similar populations in England and Ireland. Weeks where players were exposed to both school and provincial competition (34, 95%CI 19-49 injuries/1000 h) had significantly (p<0.05) greater injury incidences than during school competition alone (19, 95%CI 12-26 injuries/1000 h). Conclusion: The injury risk demonstrated was greater than expected and represents reason for concern. Possible reasons for the high injury incidence recorded may be the frequency of games played within the season, and the overlap of school and provincial competitions. It should be noted that these results were taken from one school over one season and might not be representative of the incidence of school rugby injuries overall. However, this research demonstrates the need for a multi-school longitudinal study within South African schools rugby to determine the overall risk.
THE RELATIONSHIP BETWEEN TACKLE COACHING METHODS AND PLAYERS' TACKLE TRAINING ATTITUDES AND BEHAVIOURS
Background: The tackle event in rugby is a technical and physical contest between opposing players. A player's ability to tolerate and contest during a tackle is a prerequisite for safe participation and success in rugby. The attitude and behaviour of players towards safety have been identified as risk factors for injury. How a skill is coached may influence the player's attitude and actions when executing the skill in training and match play. Objective: The purpose of this study was to investigate the relationship between tackle coaching methods and players' tackle training attitudes and behaviours. Design: Cross-sectional survey. Setting: High schools. Participants: 164 under-19 male rugby players. Assessment of Risk Factors and Main Outcome Measurements: Questionnaire using a 5-point Likert scale for importance (attitude), quantity (behaviour) and frequency (behaviour); associations between tackle coaching methods and players' tackle training attitudes and behaviours were examined using the χ² test. Results: The more time spent on coaching proper technique to prevent injuries, the higher players rated the importance of injury prevention (28% somewhat important-very important). Conclusions: This is the first study to report on the relationship between tackle coaching methods and players' tackle training attitudes and behaviours. When coaches offered verbal instruction and spent more time coaching proper technique to prevent injuries, players tended to have a more positive attitude toward injury prevention when training the tackle.
Background: In tackle-collision sports, the tackle has the highest incidence, severity, and burden of injury. Head injuries and concussions during the tackle are a major concern within tackle-collision sports. To reduce concussion and head impact risk, evaluating optimal tackle techniques to inform tackle-related prevention strategies has been recommended. The purpose of this study was to perform a systematic scoping review of player-level tackle training intervention studies in all tackle-collision sports. Methods: Arksey and O'Malley's five-stage scoping review process and Levac et al.'s framework were used, along with the Preferred Reporting Items for Systematic Reviews and Meta-Analysis extension for Scoping Reviews (PRISMA-ScR) checklist. The main inclusion criteria were that the study included an intervention aimed at improving a player's tackle abilities, and the intervention had to be delivered/implemented at the player-level in a training setting. Results: Thirteen studies were included in this review: seven studies in American Football (54%), followed by a combined cohort of rugby union and rugby league players (three studies; 23%), rugby union (two studies; 15%), and one study reported on a rugby league cohort (8%). Studies focused primarily on the tackler, with the intervention incorporating a form of instruction or feedback, delivered through video or an expert coach. Other interventions included an 8-week strength and power training programme, designing practice sessions based on baseline data, and helmetless training in American Football. All interventions demonstrated a favourable change in the outcome measured, which included tackler and ball-carrier kinematics based on motion capture video, tackler proficiency scoring, tackling task analysis, head impact frequencies by xPatch head-impact sensor technology, head impact kinematics using head-impact sensors (helmet or skin patches) and football tackle kinematics with motion capture systems or video.
Conclusion: This review shows that a range of studies have been undertaken focusing on player-level training interventions. The quality of the studies was rated as ‘good’, and all studies showed improvements in outcome measures. Coaches and policy makers should ensure tackle technique is profiled alongside other player characteristics, and that an evidence-based approach to improving player tackling is adopted, improving performance and reducing injury risk.
Over the past five decades, running has become increasingly popular across the world. Although running is accessible to many people and is associated with numerous health benefits, it also carries a risk of sustaining a running related injury (RRI) (Malisoux et al., 2020). Among these RRIs, the biomechanical changes and issues runners face when transitioning to a new pair of shoes are of special interest and not well understood.
Youth Rugby
This text is essential reading for all scientists, students and applied researchers wanting to develop world-class, evidence-based programmes for their youth athletes.
Purpose: The purpose of this invited commentary is to discuss the use of principal component analysis (PCA) as a dimension reduction and visualisation tool to assist in decision making and communication when analysing complex multivariate data sets associated with the training of athletes. Conclusions: Using PCA it is possible to transform a data matrix into a set of orthogonal composite variables called principal components (PC), with each PC being a linear weighted combination of the observed variables and with all PCs uncorrelated to each other. The benefit of transforming the data using PCA is that the first few PCs generally capture the majority of the information (i.e. variance) contained in the observed data, with the first PC accounting for the highest amount of variance and each subsequent PC capturing less of the total information. Consequently, through PCA it is possible to visualise complex data sets containing multiple variables on simple 2D scatterplots without any great loss of information, thereby making it much easier to convey complex information to coaches. In the future, athlete monitoring companies should integrate PCA into their client packages to better support practitioners trying to overcome the challenges associated with multivariate data analysis and interpretation. In the interim, we present here an overview of PCA and associated R code to assist practitioners working within the field to integrate PCA into their athlete monitoring process.
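The transformation the commentary describes can be illustrated with a few lines of code. The published commentary provides R code; the following is an unofficial Python/NumPy sketch of the same idea (centre the data, take the singular value decomposition, keep the first two components for a 2D scatterplot):

```python
import numpy as np

def pca(X: np.ndarray, n_components: int = 2):
    """Project observations onto the first principal components.

    X: (n_samples, n_variables) matrix of athlete monitoring data.
    Returns (scores, explained_variance_ratio) where `scores` are the
    PC coordinates suitable for a 2D scatterplot."""
    Xc = X - X.mean(axis=0)               # centre each variable
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T     # linear weighted combinations
    var_ratio = (S ** 2) / (S ** 2).sum() # variance captured per PC
    return scores, var_ratio[:n_components]
```

As the commentary notes, the PC scores are uncorrelated with each other and the first component captures the largest share of the variance, so a plot of the first two score columns conveys most of the information in the full data matrix.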
The aim of this study was to identify between-position (forwards vs. backs) differences in movement variability in cumulative tackle events training during both attacking and defensive roles. Eleven elite adolescent male rugby league players volunteered to participate in this study (mean ± SD, age; 18.5 ± 0.5 years, height; 179.5 ± 5.0 cm, body mass; 88.3 ± 13.0 kg). Participants performed a drill encompassing four blocks of six tackling (i.e. tackling an opponent) and six tackled (i.e. being tackled by an opponent while carrying a ball) events (i.e. 48 total tackles) while wearing a micro-technological inertial measurement unit (WIMU, Realtrack Systems, Spain). The acceleration data were used to calculate sample entropy (SampEn) to analyse the movement variability during tackle performance. In tackling actions, SampEn showed significant between-position differences in block 1 (p = 0.0001) and block 2 (p = 0.0003). Significant between-block differences were observed in backs (block 1 vs 3, p = 0.0021; and block 1 vs 4, p = 0.0001) but not in forwards. When being tackled, SampEn showed significant between-position differences in block 1 (p = 0.0007) and block 3 (p = 0.0118). Significant between-block differences were only observed for backs in block 1 vs 4 (p = 0.0025). Movement variability shows a progressive reduction with cumulative tackle events, especially in backs and when in the defensive role (tackling). Forwards present lower movement variability values in all blocks, particularly in the first block, both in the attacking and defensive role. Entropy measures can be used by practitioners as an alternative tool to analyse the temporal structure of variability of tackle actions and quantify the load of these actions according to playing position.
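Sample entropy of an acceleration trace can be computed as follows; this is a simplified sketch of the general SampEn algorithm, not the study's exact implementation (the template length m = 2 and tolerance r = 0.2 × SD used here are common defaults, and are assumptions):

```python
import numpy as np

def sample_entropy(x, m: int = 2, r_factor: float = 0.2) -> float:
    """Sample entropy (SampEn) of a 1-D signal such as accelerometer data.
    m: template length; tolerance r = r_factor * SD of the signal.
    Lower SampEn indicates a more regular (less variable) signal."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    n = len(x)

    def count_matches(length):
        # Count template pairs whose Chebyshev distance is within r,
        # excluding self-matches.
        templates = np.array([x[i:i + length] for i in range(n - length)])
        count = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates - templates[i]), axis=1)
            count += int(np.sum(d <= r)) - 1
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")
```

A smooth, repetitive signal (e.g. a sine wave) yields a lower SampEn than random noise, which is the sense in which the study interprets reduced entropy as reduced movement variability.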
A Comparison of Two Data Acquisition Threshold Values on Head Acceleration Event Counts from an Instrumented Mouthguard
Introduction Small-sided games (SSGs) are used to train physical qualities while practicing sport specific skills. Live Global Positioning Systems (GPS) data can provide feedback during these games; however, the impact of feedback on subsequent locomotor performance is unknown. This study aimed to investigate if providing ‘live’ GPS feedback to players in between bouts of SSGs altered locomotor performance. Methods Using a reverse counterbalanced design, twenty male university rugby players received either feedback or no-feedback (control) during ‘off-side’ touch rugby SSGs. Eight 5v5, 6x4 minute SSGs were played over four days (2/day) with a 20-minute rest between SSGs and at least 72-hours rest between days. Teams were assigned to feedback (4-games) with verbal feedback provided during a 2-minute between bout rest interval, or no feedback (4-games) for the day. Locomotor performance was measured via a 10 Hz GPS and variables were analysed using a linear mixed model, reported using effect sizes (ES) and 90% confidence intervals and then interpreted via magnitude-based inferences. Results Over the full SSG (6x4 min bouts) there was a possibly trivial (ES = 0.15 [-0.03, 0.34]) difference between conditions in total distance (2200 (156) vs. 2177 (186) m). There was also possibly trivial (ES = 0.18 [0.00, 0.37]) and likely trivial (ES = -0.07 [-0.27, 0.13]) differences between conditions in low- and high-speed distance. Between bouts there was a possibly or likely trivial (ES = 0.08 to 0.14) difference in total distance for bouts 2, 4, 5 and 6, with unclear (ES = -0.01 [-0.24, 0.22]) differences in bout 3. There was a likely trivial (ES = 0.11 [-0.01, 0.22]) difference in total distance covered during the first minute of each bout. Discussion In this study, verbal feedback did not alter locomotor performance in rugby players during SSGs. 
These data suggest that technical and tactical aspects of SSGs might reduce any ergogenic effects of feedback, although it is unknown if the type of feedback provided nullified any potential effects. Furthermore, extrinsic motivating factors such as team success are likely to be perceived as more important than locomotor performance. Future research should endeavour to investigate if these findings are consistent across other forms of feedback, bout durations, football codes, playing levels or training modalities. Conclusions Verbal feedback of distance covered during bouts of SSGs does not alter subsequent locomotor performance.
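Standardised mean differences of the kind reported throughout these abstracts (Cohen's d style effect sizes) are computed against a pooled standard deviation. A generic sketch follows; it is not the study's exact analysis pipeline (which used linear mixed models), just the textbook two-group formula:

```python
import math

def cohens_d(group_a, group_b) -> float:
    """Cohen's d: standardised mean difference between two groups,
    using the pooled standard deviation."""
    na, nb = len(group_a), len(group_b)
    ma = sum(group_a) / na
    mb = sum(group_b) / nb
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)  # sample variance A
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)  # sample variance B
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled_sd
```

Values near 0.2, 0.6 and 1.2 are conventionally read as small, moderate and large, which is the scale against which the "trivial" differences above were judged.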
Objective: This study aimed to: (a) identify the association between external workloads and injury-risk in the subsequent week; and (b) understand the effectiveness of workload variables in establishing injury-risk. Design: Retrospective cohort study. Methods: Workload and injury (soft-tissue) data were collected from forty-eight professional male rugby league players. Load variables included duration (min), total distance (m), relative distance (m·min−1), high speed distance (m; >20 km·h−1), very-high speed distance (m; >25 km·h−1), acceleration and deceleration efforts (count) and PlayerLoad (arbitrary unit: AU). Cumulative two-, three- and four-weekly loads, Acute:Chronic Workload Ratio (ACWR), Mean-Standard Deviation Workload Ratio (MSWR) and strain values were calculated and divided into three equally-sized bins (low, moderate and high). Generalised Estimating Equations analysed relationships between workload variables and injury probability in the subsequent week. Results: Injury-risk increased alongside increases in the ACWR for duration, total distance and PlayerLoad. Conversely, injury-risk decreased (Area Under Curve: 0.569-0.585) with increases in the four-weekly duration, total distance, accelerations, decelerations and PlayerLoad. For relative distance, high four-weekly workloads (high: >60 m·min−1) demonstrated a positive association with injury-risk, whilst high two-weekly loads (high: >82 m·min−1
) were negatively associated. Conclusions: A range of external workload metrics and summary statistics demonstrate either positive or negative associations with injury-risk status. Such findings provide the framework for the development of decision-support systems in which external workload metrics (e.g. total or high speed distance) can be uniquely and routinely monitored across a range of summary statistics (i.e. cumulative weekly loads and ACWR) in order to optimise player performance and welfare.
Elite Rugby League Players’ Signature Movement Patterns and Position Prediction
Although sports on-field activities occur sequentially, traditional performance indicators quantify players’ activities without regard to their sequential nature. Nowadays, movement patterns are used to sequentially quantify players’ activities to understand match demands on players. However, the specific behavioural (i.e., signature) movement patterns of rugby league players per playing position remain unknown and the prediction of rugby league players into all nine playing positions based on their movement patterns is largely unexplored. Hence, this study identified the signature movement patterns of elite rugby league players per position and revealed the contribution of movement patterns towards the prediction of players into positions during the 2019 and 2020 seasons. Varying numbers of signature movement patterns were identified across playing positions, with centres having the highest number of signature patterns (i.e. 1241). Random Forest best predicted elite rugby league players’ positions at 73.41% accuracy, 0.74 recall, and 0.73 F1 and precision scores based on movement patterns’ relative frequency values, and the top contributing movement patterns were identified. Therefore, we recommend sports stakeholders recognise the signature and contributing movement patterns of players per playing position when making decisions regarding training programmes, talent identification and recruitment.
Establishing dose-response relationships between training load and fatigue can help the planning of training. The aim was to establish the relative importance of external training load measurements in relation to the musculoskeletal response on a group and individual player level. Sixteen elite male rugby league players were monitored across three seasons. Two- to seven-day exponentially weighted moving averages (EWMA) were calculated for total distance, and individualised speed thresholds (via the 30-15 Intermittent Fitness Test) derived from global positioning systems. The sit and reach, dorsiflexion lunge, and adductor squeeze tests represented the musculoskeletal response. Partial least squares and repeated measures correlation analyses established the relative importance of training load measures and then investigated their relationship to the collective musculoskeletal response for individual players through the construction of latent variables. On a group level, 2- and 3-day EWMA total distance had the highest relative importance to the collective musculoskeletal response (p < 0.0001). However, the magnitude of relationships on a group (r value = 0.20) and individual (r value = 0.06) level were trivial to small. The lack of variability in the musculoskeletal response over time suggests practitioners adopting such measures to understand acute musculoskeletal fatigue responses should do so with caution.
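Exponentially weighted moving averages of the kind used above follow a standard recursive form; a minimal sketch (the smoothing constant λ = 2/(span + 1) is the common convention for span-based EWMAs, and may differ in detail from the study's implementation):

```python
def ewma(daily_loads, span_days: int) -> float:
    """Exponentially weighted moving average of a series of daily
    training loads, returning the most recent smoothed value.
    lambda = 2 / (span_days + 1), so a shorter span weights recent
    days more heavily (e.g. the 2- and 3-day EWMAs above)."""
    lam = 2.0 / (span_days + 1)
    value = daily_loads[0]
    for load in daily_loads[1:]:
        value = lam * load + (1 - lam) * value  # recent day vs history
    return value
```

With a 3-day span, λ = 0.5, so each new day contributes half of the smoothed value; longer spans (towards 7 days) dilute any single day's influence.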
BACKGROUND: Netball is one of the most popular women's sports in the world. Since gaining professional status in 2008 there has been a rapid growth in research in the applied sports science and medicine of the sport. A scoping review of the area would provide practitioners and researchers with an overview of the current scientific literature to support on-court performance, player welfare and injury reduction. OBJECTIVE: The primary objective was to identify the current research on the applied sports science and medicine of netball. Additionally, the article provides a brief summary of the research in each topic of sports science and medicine in netball and identifies gaps in the current research. METHODS: Systematic searches of PubMed, SPORTDiscus, MEDLINE and CINAHL were undertaken from earliest record to Dec 2020 and reference lists were manually searched. The PRISMA-ScR protocol was followed. Studies were eligible for inclusion if they investigated netball as a sport or the applied sport science and medicine of netball athletes. RESULTS: 962 studies were identified in the initial search, 150 of which met the inclusion criteria. Injury was the most highly investigated sport science and medicine topic (n = 45), followed by physical qualities (n = 37), match characteristics (n = 24), biomechanics (n = 15), psychology (n = 13), fatigue and recovery (n = 9), training load (n = 4) and nutrition (n = 3). A range of cohorts were used from school to elite and international standards. All cohorts were female netballers, except for one study. A rapid growth in studies over recent years was demonstrated with 65% of studies published in the last decade. There still remain gaps in the literature, with a low evidence base for nutrition, training load, and fatigue and recovery.
CONCLUSION: This scoping review summarises the current evidence base and key findings that can be used in practice to enhance the applied sport science and medical support to netball athletes across a range of playing standards, and support the growth of the sport. It is evident that netball as a sport is still under-researched.
The peak locomotor characteristics of Super League (rugby league) match-play
This study quantified the position-, duration-, and phase-of-play specific peak locomotor characteristics of senior professional rugby league match-play at a multi-club level. Match-play data were collected from 378 male professional rugby league players, from 11 clubs, across two competitive seasons. A total of 9643 match-observations were analysed; 10-Hz instantaneous velocity and acceleration from Catapult S5 microtechnology units were aligned with video footage to determine the phase-of-play and duration-specific peak locomotor characteristics (average running speed, relative high-speed running [HSR; >5.5 m·s−1], average absolute acceleration). Linear mixed effect models were used to determine positional differences for each dependent variable and differences between phases-of-play. Positional differences for the duration-specific and phase-of-play peak locomotor characteristics were identified. Fullbacks had greater peak HSR during defensive sets (86 ± 70 m·min−1) vs. all other positions (effect size = 0.26 to 0.49, small). Wingers demonstrated the greatest between-phase differences, with greater peak locomotor characteristics (effect size = 1.23 to 1.65, large) during attacking-defensive set transition vs. defensive sets. The multi-club normative data, and the differences identified, provide practitioners with valuable information for the consideration of training practices; the incorporation of phases-of-play enables greater consideration of technical-tactical factors whilst preparing players for the peak periods of competition.
In collision sports, the tackle has the highest injury incidence, and is key to a successful performance. Although the contact load of players has been measured using microtechnology, this has not been related to tackle technique. The aim of this study was to explore how PlayerLoad™ changes between different levels of tackling technique during a simulated tackle. Nineteen rugby union players performed twelve tackles on a tackle contact simulator (n = 228 tackles). Each tackle was recorded with a video camera and each player wore a Catapult OptimEye S5. Tackles were analysed using tackler proficiency criteria and split into three categories: low scoring (≤5 arbitrary units; AU), medium scoring (6 and 7 AU) and high scoring tackles (≥8 AU). High scoring tackles recorded a higher PlayerLoad™ at tackle completion. The PlayerLoad™ trace was also less variable in the high scoring tackles. The variability in the PlayerLoad™ trace may be a consequence of players not shortening their steps before contact. This reduced their ability to control their movement during the contact and post-contact phase of the tackle and increased the variability. Using the PlayerLoad™ trace in conjunction with subjective technique assessments offers coaches and practitioners insight into the physical-technical relationship of each tackle to optimise tackle skill training and match preparation.
The application of pattern mining algorithms to extract movement patterns from sports big data can improve training specificity by facilitating a more granular evaluation of movement. Since movement patterns can only occur as consecutive, non-consecutive, or non-sequential, this study aimed to identify the best set of movement patterns for player movement profiling in professional rugby league and quantify the similarity among distinct movement patterns. Three pattern mining algorithms (l-length Closed Contiguous [LCCspm], Longest Common Subsequence [LCS] and AprioriClose) were used to extract patterns to profile elite rugby football league hookers (n = 22 players) and wingers (n = 28 players) match-games movements across 319 matches. Jaccard similarity score was used to quantify the similarity between algorithms’ movement patterns and machine learning classification modelling identified the best algorithm’s movement patterns to separate playing positions. LCCspm and LCS movement patterns shared a 0.19 Jaccard similarity score. AprioriClose movement patterns shared no significant Jaccard similarity with LCCspm (0.008) and LCS (0.009) patterns. The closed contiguous movement patterns profiled by LCCspm best-separated players into playing positions. Multi-layered Perceptron classification algorithm achieved the highest accuracy of 91.02% and precision, recall and F1 scores of 0.91 respectively. Therefore, we recommend the extraction of closed contiguous (consecutive) over non-consecutive and non-sequential movement patterns for separating groups of players.
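The Jaccard similarity scores reported between the algorithms' pattern sets follow the standard set definition, |A ∩ B| / |A ∪ B|; a minimal sketch:

```python
def jaccard_similarity(patterns_a, patterns_b) -> float:
    """Jaccard similarity between two collections of movement patterns:
    size of the intersection over size of the union, ranging from
    0 (no shared patterns) to 1 (identical pattern sets)."""
    a, b = set(patterns_a), set(patterns_b)
    union = a | b
    return len(a & b) / len(union) if union else 0.0
```

Under this measure, the 0.19 score between LCCspm and LCS patterns means fewer than a fifth of the patterns in their combined set were found by both algorithms, while scores below 0.01 against AprioriClose indicate almost entirely disjoint pattern sets.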
The Demands of Youth Rugby Match-Play
The quantification of the technical, tactical and physical demands of match-play in youth rugby is important for the appropriate prescription of training practices. Differences in the demands of match-play have been identified between positions, playing standards, age grades and phases-of-play. This chapter presents the research that has explored the physical and technical-tactical match demands of youth rugby. The chapter then provides a practical overview of how practitioners can utilise match-demands data to assist in appropriate training prescription through the adaptation, manipulation and evaluation of training drills and practices. The chapter concludes with a range of recommendations and practical implications for the use of match-play data within youth rugby environments.
Using an expert consensus-based approach, a netball video analysis consensus (NVAC) group of researchers and practitioners was formed to develop a video analysis framework of descriptors and definitions of physical, technical and contextual aspects for netball research. The framework aims to improve the consistency of language used within netball investigations. It also aims to guide injury mechanism reporting and identification of injury risk factors. The development of the framework involved a systematic review of the literature and a Delphi process. In conjunction with commercially used descriptors and definitions, 19 studies were used to create the initial framework of key descriptors and definitions in netball. In a two round Delphi method consensus, each expert rated their level of agreement with each of the descriptors and associated definition on a 5-point Likert scale (1-strongly disagree; 2-somewhat disagree; 3-neither agree nor disagree; 4-somewhat agree; 5-strongly agree). The median (IQR) rating of agreement was 5.0 (0.0), 5.0 (0.0) and 5.0 (0.0) for physical, technical and contextual aspects, respectively. The NVAC group recommends usage of the framework when conducting video analysis research in netball. The use of descriptors and definitions will be determined by the nature of the work and can be combined to incorporate further movements and actions used in netball. The framework can be linked with additional data, such as injury surveillance and microtechnology data.
OBJECTIVES: The current study retrospectively compared the physical qualities of elite academy rugby league players (aged 16-19 years) by career attainment level (i.e., academy or professional). DESIGN: Retrospective cross-sectional and longitudinal design. METHODS: Eighty-one academy rugby league players were assessed for physical qualities (height, body mass, skinfolds, speed, momentum, vertical jump, Yo-Yo Level 1 and 1-RM squat, bench press and prone row) at the Under 17-19 age categories between 2007 and 2012. Players' career attainment level was determined in 2014. Longitudinal changes in physical qualities between Under 17s and 19s were compared by career attainment level. RESULTS: Professional players demonstrated moderate significant advantages for height (d=0.98) and 1-RM squat (d=0.66) at the Under 17s, 1-RM bench press (d=0.76) at the Under 18s and 1-RM prone row (d=0.73) at the Under 19s age categories when compared to academy players. When assessed longitudinally (Under 17s-19s), professional players significantly outperformed academy players for 1-RM squat (η2=0.20). Professional players also demonstrated greater increases in body mass (8.2 vs. 2.9 kg) and 10 m momentum (47 vs. 17 kg·m·s−1) than academy players between the Under 17s and 19s. CONCLUSIONS: Advanced physical qualities, particularly height and absolute strength, within 16-19 year old players may contribute to attaining professional status in rugby league. Further, the development of body mass and momentum for players within an academy is an important consideration in the progress towards professional rugby league. Therefore, practitioners should aim to identify and develop the physical qualities, especially size and strength, within academy rugby league players.
Peak movement and collision demands of professional rugby league competition
To quantify the peak movement and contact demands of National Rugby League (NRL) and European Super League (ESL) competition, players were tracked during 10 NRL (166 files) and 10 ESL (143 files) matches using microtechnology devices. The peak 1- to 5-min periods were then calculated for average match speed (m·min−1), and acceleration (m·s−2) when 0, 1, 2, and ≥3 collisions per min occurred. Linear mixed effect models and Cohen’s effect size statistic (± 90%CI) were used to determine the differences in movement profiles when collisions occurred. Compared to no collision periods, as frequency of collisions per minute increased, there were progressive reductions in running speed for most positional groups. The addition of 1 or more collisions per min resulted in average effect size reductions in match speed of −0.14 for NRL forwards, −0.89 for NRL backs, −0.48 for ESL forwards, and −2.41 for ESL backs. ESL forwards had the highest frequency of peak periods involving 3 or more collisions per min, 22% of all periods, followed by NRL forwards (14%), NRL backs (10%) and ESL backs (8%). This study highlights the peak movement and collision demands of professional rugby league competition and allows practitioners to develop training drills that reflect worst case scenarios.
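Peak 1- to 5-min periods of this kind are typically derived as maximal rolling averages over the speed trace. A minimal sketch assuming a per-minute speed series with invented values (the study's microtechnology software and sampling are not replicated here):

```python
def peak_period(speed_m_per_min, window):
    """Maximal rolling average of a per-minute speed trace over `window` minutes."""
    best = float("-inf")
    for i in range(len(speed_m_per_min) - window + 1):
        best = max(best, sum(speed_m_per_min[i:i + window]) / window)
    return best

# Hypothetical minute-by-minute match speeds (m·min⁻¹)
speeds = [80, 95, 120, 140, 110, 70, 60, 90]
print(peak_period(speeds, 1))  # peak 1-min period
print(peak_period(speeds, 3))  # peak 3-min period
```

Longer windows smooth out short bursts, which is why the peak 1-min value always equals or exceeds the peak 5-min value for the same trace.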
BACKGROUND: With the increasing professionalisation of youth sports, training load monitoring is increasingly common in adolescent athletes. However, the research examining the relationship between training load and changes in physical qualities, injury, or illness in adolescent athletes is yet to be synthesised in a systematic review. OBJECTIVE: To systematically examine the research assessing internal and external methods of monitoring training load and physical qualities, injury, or illness in adolescent athletes. METHODS: Systematic searches of SPORTDiscus, Web of Science, CINAHL and SCOPUS were undertaken from the earliest possible records to March 2022. Search terms included synonyms relevant to adolescents, athletes, physical qualities, injury, or illness. To be eligible for inclusion, articles were required to: 1) be original research articles; 2) be published in a peer-reviewed journal; 3) include participants aged between 10 and 19 years participating in competitive sport; 4) report a statistical relationship between a measure of internal and/or external load and physical qualities, injury or illness. Articles were screened and assessed for methodological quality. A best-evidence synthesis was conducted to identify trends in the relationships reported. RESULTS: The electronic search yielded 4,125 articles. Following screening and a review of references, 59 articles were included. The most commonly reported load monitoring tools were session ratings of perceived exertion (n = 29) and training duration (n = 22). Results of the best-evidence synthesis identified moderate evidence of positive relationships between resistance training volume load and improvement in strength, and between throw count and injury. However, evidence for other relationships between training load and change in physical qualities, injury, or illness was limited or inconsistent. CONCLUSIONS: Practitioners should consider monitoring resistance training volume load for strength training.
Additionally, where appropriate, monitoring throw counts may be useful in identifying injury risk. However, given the lack of clear relationships between singular measures of training load and physical qualities, injury, or illness, researchers should consider multivariate methods of analysing training load, as well as factors that may mediate the load-response relationship, such as maturation.
Drawing from skill acquisition and development literature, we present a novel tackle skill training framework. The framework outlines the training purpose (technique proficiency, technique capacity, skill proficiency and skill capacity), skill workload measurements (available information, task difficulty, rating of perceived challenge, skill load), as well as the training conditions and coaching style for the tackle in rugby union. Using this framework and skill load measurements, we propose a pre-season tackle training plan. This tackle skill framework and skill load measurements serve as potential preventive measures for tackle injury risk while improving players’ tackle performance.
Rugby union is a late specialisation sport. As a consequence, youth players may still be engaged in other activities and sports throughout the year as they transition to rugby specialisation. Limited research exists quantifying rugby union training and matches as well as engagement in other activities and sports. Therefore, the aim of this study was to quantify and compare rugby union training, matches and other activities of elite youth U15 and U16 rugby union players at different stages of the season.
Practitioners prescribe numerous training modes to develop the varied physical qualities professional rugby league players must express during competition. The aim of the current study was to determine how the magnitude of external and internal training load per minute of time differs between modes in professional rugby league players. These data were collected from 17 players across 716 individual sessions (mean (SD) sessions: 42 (13) per player), which were categorised by mode (conditioning, small-sided games, skills and sprint training). Derived from global positioning systems (5Hz with 15Hz interpolation), the distances covered within arbitrary speed and metabolic-power thresholds were determined to represent the external load. Session rating of perceived exertion (sRPE) and individualised training impulse (iTRIMP) represented the internal load. All data were made relative to session duration. The differences in time-relative load methods between each mode were assessed using magnitude-based inferences. Small-sided games and conditioning very likely to almost certainly produced the greatest relative internal and external loads. Sprint training provided players with the greatest sprinting and maximal-power distances without a concomitant increase in internal load. The metabolic-power method complements speed-based quantification of the external load, particularly during small-sided games and skills training. In practice, establishing normative loads per minute of time for each mode can be useful to plan future training by multiplying this value by the planned session duration.
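The per-minute normalisation described above is arithmetically simple: any session load (e.g., sRPE load, conventionally RPE multiplied by duration) is divided by session duration. A minimal sketch with hypothetical sessions; the function names are invented for illustration:

```python
def srpe_load(rpe, duration_min):
    """Session RPE load (arbitrary units): CR-10 RPE multiplied by duration."""
    return rpe * duration_min

def load_per_minute(total_load, duration_min):
    """Express any internal or external session load relative to session time."""
    return total_load / duration_min

# Hypothetical sessions: a short small-sided-games block vs. a long skills session
ssg = srpe_load(rpe=8, duration_min=25)
skills = srpe_load(rpe=4, duration_min=60)

# Absolute load favours the longer session, but the per-minute view shows
# small-sided games as the denser mode, matching the pattern in the abstract.
print(ssg, skills)
print(load_per_minute(ssg, 25), load_per_minute(skills, 60))
```

Planning then runs in reverse: a normative per-minute load for a mode multiplied by a planned duration gives the expected session load.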
To investigate the effect of training mode (conditioning and skills) on multivariate training load relationships in professional rugby league via principal component analysis. Four measures of training load (internal: heart rate exertion index, session rating of perceived exertion; external: PlayerLoad™, individualised high-speed distance) were collected from 23 professional male rugby league players over the course of one 12-wk preseason period. Training was categorised by mode (skills or conditioning) and then subjected to a principal component analysis. Extraction criteria were set at an eigenvalue of greater than 1. Modes that extracted more than 1 principal component were subject to a Varimax rotation. Skills extracted 1 principal component, explaining 57% of the variance. Conditioning extracted 2 principal components (1st: internal; 2nd: external), explaining 85% of the variance. The presence of multiple training load dimensions (principal components) during conditioning training provides further evidence of the influence of training mode on the ability of individual measures of external or internal training load to capture training variance. Consequently, a combination of internal and external training load measures is required during certain training modes.
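The eigenvalue-greater-than-1 extraction rule (the Kaiser criterion) can be sketched as follows. The data below are simulated with two latent dimensions standing in for "internal" and "external" load, so two components are retained; this mirrors the extraction logic only, not the study's dataset or its Varimax rotation step:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated session-level data for four load measures (columns): two driven by
# a latent "internal" factor, two by a latent "external" factor.
internal = rng.normal(size=100)
external = rng.normal(size=100)
X = np.column_stack([
    internal + rng.normal(scale=0.3, size=100),
    internal + rng.normal(scale=0.3, size=100),
    external + rng.normal(scale=0.3, size=100),
    external + rng.normal(scale=0.3, size=100),
])

# PCA on the correlation matrix; retain components with eigenvalue > 1
R = np.corrcoef(X, rowvar=False)
eigenvalues = np.linalg.eigvalsh(R)[::-1]  # sorted descending
retained = int(np.sum(eigenvalues > 1))
explained = eigenvalues[:retained].sum() / eigenvalues.sum()
print(retained, round(float(explained), 2))
```

When only one component clears the threshold, a single dimension summarises the load measures (the "skills" pattern); two retained components indicate that internal and external measures capture distinct variance (the "conditioning" pattern).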
The ability to accurately evaluate player and team performances in professional sport is particularly valuable. Doing so provides competitive advantages, including extracting important information regarding the tactical strategies of future oppositions and producing player rating systems. A common method of evaluating player and team performances is via expected possession value (EPV) models. EPV models assign a value to every location and/or action on the pitch, which reflects the probability of points being scored within a given time period. EPV models have been produced in several sports, including football, basketball and ice hockey. However, there is limited research surrounding these models in rugby league. Rugby league has a unique set of rules, including a six tackle attacking set and five possible scoring options at the end of a possession. These two factors, alongside the poor data availability in the sport, ensure that the majority of previous methods cannot be adapted for use in rugby league. Therefore, the aim of this thesis was to develop new methodologies for evaluating player and team performances in rugby league. In the first section of this thesis (studies 1 and 2), previous Markov models using zonal approaches were applied, adapted and extended in rugby league to provide insights into player and team performances. Six EPV models were produced with varying zone sizes using Markov Reward Processes. The Kullback-Leibler Divergence was used to evaluate the zone sizes which could reproduce future team attacking performances. The model was then extended to incorporate actions and context nodes using Markov Decision Processes. Novel methods of evaluating player and team performances were also produced. In the second section (studies 3 and 4), novel models producing smooth pitch surfaces were developed. The spatial trends of team attacking performances were evaluated using Kernel Density Estimation.
Two novel Wasserstein distance metrics were used to provide valuable insights into team performances. A novel approach to the estimation of individual possession outcomes was also proposed using a Bayesian mixture model approach. The model used linear and bilinear interpolation techniques for its weights to produce a smooth pitch surface. Novel performance metrics evaluating player and team performances were also created. The research provides new methodologies for use within rugby league, providing zonal and smooth EPV models through which player and team performances can be evaluated. Professional experts appraised the results positively and validated the models' use within the sport.
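The zonal Markov Reward Process idea behind an EPV model can be illustrated with a toy example. The zones, transition probabilities, reward values and discount factor below are all invented for illustration; the thesis's fitted models are far larger and estimated from match data:

```python
import numpy as np

# Toy pitch with three zones plus an absorbing "possession over" state.
P = np.array([
    [0.5, 0.3, 0.1, 0.1],   # own half
    [0.2, 0.4, 0.3, 0.1],   # midfield
    [0.1, 0.2, 0.4, 0.3],   # attacking zone
    [0.0, 0.0, 0.0, 1.0],   # possession over (absorbing)
])
R = np.array([0.0, 0.0, 0.8, 0.0])  # invented immediate expected points per visit

# Value iteration: V <- R + gamma * P @ V, iterated to convergence.
gamma = 0.9
V = np.zeros(4)
for _ in range(1000):
    V = R + gamma * P @ V

print(np.round(V, 2))  # expected possession value per zone
```

The resulting vector assigns every zone a value reflecting expected future points, so moving the ball from a low-value to a high-value zone (or the reverse) can be credited to the players involved.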
Science of Sport: Rugby
A Framework for Planning your Practice: A Coach's Perspective
Despite the athlete monitoring cycle becoming increasingly popular within sport, very little evidence exists with regards to the relationships present between its measures or its relationship with illness incidence in youth athletes. The aim of this thesis was to evaluate the true predictive ability of an integrated athlete monitoring cycle model, incorporating measures of the training dose (training load), training recovery (sleep) and training response (wellness questionnaires (DWB and PRS), countermovement jumps and salivary IgA (s-IgA)), with regards to illness incidence in youth athletes. Study 1 outlined the reliability and usefulness of DWB (poor/marginal), PRS (poor/marginal) and CMJ (good/useful). Despite study 1’s findings, study 2 showed that CMJ was not suitable for use as a training response measure in youth athletes. Studies 3 and 4 supported the use of the sleep quality subscale as a training recovery measure rather than within the DWB training response measure (which was reduced to the four item DWBno-sleep). The overall DWBno-sleep score, fatigue, stress and mood were statistically related to the training recovery, whereas only muscle soreness was related to the training dose. Statistically, PRS was related to both the training dose and recovery. Despite the presence of these statistical relationships, only the effect of training load, including match exposure, on PRS was practically interpretable. Unfortunately, technical issues prevented the true predictive ability of an integrated athlete monitoring cycle model with regards to illness incidence being tested. However, study 5 showed that s-IgA measures could not accurately predict illness in youth athletes. Furthermore, analysis of the longitudinal trends of s-IgA, DWBno-sleep and PRS showed that the subjective fatigue/wellness measures were more responsive to qualitative events than objective measures of immune function. 
Overall, the results of this thesis provide support for the use of the integrated athlete monitoring cycle in youth athletes, particularly when subjective training response measures are included. However, future research needs to consider the true predictive ability of the proposed integrated athlete monitoring cycle model with regards to illness incidence.
Talent Identification and Development
Selective head-and-neck cooling as a treatment method for concussions in elite male rugby union players: the Clinical Observed Outcomes with Local HEad-and-neck cooling After Diagnosed concussions (COOLHEAD) study protocol
Sports-related concussions (SRCs) typically occur when the brain is hyperthermic. Acute head-and-neck cooling should, therefore, reduce the brain’s metabolic demands, with the potential to improve recovery following an SRC. Elite ice hockey players who underwent head-and-neck cooling after sustaining an SRC showed reduced return-to-play times, although further investigation is warranted. This paper aims to describe the methods proposed for investigating the clinical effects and feasibility of acute head-and-neck cooling in elite male rugby union players.
A quasi-experimental study will be conducted in two professional male rugby competitions (clusters): the United Rugby Championship ‘intervention group’ and the PREM Rugby ‘standard care group’. Both groups will follow World Rugby’s standardised, graduated return-to-play concussion management protocols. In addition to this, within 30 min of the SRC, the intervention group will be offered head-and-neck cooling for 45 min. The quantitative phase of the study will collect return-to-play times and clinical outcomes in both groups (sample size calculated: 100 concussions per cohort). The qualitative phase will explore the experiences of players and medical teams with the intervention. Intention-to-treat and per-protocol analyses, using appropriate regression modelling techniques, will adjust for possible confounders between the two groups, and thematic content analysis will be employed in the analysis of the respective phases.
The Clinical Observed Outcomes with Local HEad-neck cooling After Diagnosed concussions study will provide evidence regarding acute head-and-neck cooling as a potential adjunct treatment to current concussion management in elite male rugby union.
Concussions in contact sports are challenging for athletes, health professionals and sporting bodies to prevent, detect and manage. Design of interventions for primary prevention, early recognition of concussion and continuing to improve postconcussion management are essential for protecting athletes and promoting brain health. Over the last decade, there have been advancements in video technology for analysing head impact events and improvements in the clinical management of concussions. This study protocol describes how researchers, clinicians and staff from the Australasian National Rugby League (NRL) have brought these advancements together and developed a database of videos with head impact events and clinical outcomes. The intended outputs from this work will enhance the understanding of head impact events in the NRL, from biomechanical and gameplay factors to concussion and return-to-play outcomes. Publishing this protocol increases the transparency of this large-scale effort to better identify head impacts and their relationship to concussion and player movement behaviour, to contextualise these variables, generate new knowledge and support the reproducibility of these emerging findings. Between 2017 and 2023, over 5250 head contact cases were recorded in the database, from which >1700 head injury assessments were performed, and >600 concussions were diagnosed. Future studies using these data are planned to inform both primary and secondary injury prevention initiatives, such as risk analysis and prediction of game scenarios that result in concussion, as well as investigation of features and factors that help to inform the duration of recovery and return to play.
Background Head-on-head impacts are a risk factor for concussion, which is a concern for sports. Computer vision frameworks may provide an automated process to identify head-on-head impacts, although this has not been applied or evaluated in rugby. Methods This study developed and evaluated a novel computer vision framework to automatically classify head-on-head and non-head-on-head impacts. Tackle events from professional rugby league matches were coded as either head-on-head or non-head-on-head impacts. These included non-televised standard-definition and televised high-definition video clips to train (n=341) and test (n=670) the framework. A computer vision framework consisting of two deep learning networks, an object detection algorithm and three-dimensional Convolutional Neural Networks, was employed and compared with the analyst-coded criterion. Sensitivity, specificity and positive predictive value were reported. Results The overall performance evaluation of the framework to classify head-on-head impacts against manual coding had a sensitivity, specificity and positive predictive value (95% CIs) of 68% (58% to 78%), 84% (78% to 88%) and 0.61 (0.54 to 0.69) in standard-definition clips, and 65% (55% to 75%), 84% (79% to 89%) and 0.61 (0.53 to 0.68) in high-definition clips. Conclusion The study introduces a novel computer vision framework for head-on-head impact detection. Governing bodies may also use the framework in real time, or for retrospective analysis of historical videos, to establish head-on-head rates and evaluate prevention strategies. Future work should explore the application of the framework to other head-contact mechanisms and also the utility in real time to identify potential events for clinical assessment.
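The evaluation metrics above follow directly from the framework's confusion matrix against analyst coding. A minimal sketch; the counts below are hypothetical, chosen only to reproduce values of the order reported (68% sensitivity, 84% specificity, 0.61 PPV), and are not the study's actual confusion matrix:

```python
def classification_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity and positive predictive value from confusion counts."""
    sensitivity = tp / (tp + fn)   # true positives / all analyst-coded positives
    specificity = tn / (tn + fp)   # true negatives / all analyst-coded negatives
    ppv = tp / (tp + fp)           # true positives / all framework-flagged positives
    return sensitivity, specificity, ppv

# Hypothetical counts: 100 coded head-on-head impacts (68 detected)
# and 269 non-head-on-head impacts (226 correctly rejected).
sens, spec, ppv = classification_metrics(tp=68, fp=43, tn=226, fn=32)
print(round(sens, 2), round(spec, 2), round(ppv, 2))
```

PPV depends on how rare the positive class is, which is why it is reported alongside sensitivity and specificity: with few true head-on-head impacts, even a specific classifier flags a meaningful share of false positives.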
Athlete external load is typically quantified as volumes or discretised threshold values using distance, speed and time. A framework accounting for the movement sequences of athletes has previously been proposed using radio frequency data. This study developed a framework to identify sequential movement sequences using GPS-derived spatiotemporal data in team-sports and establish its stability. Thirteen rugby league players during one match were analysed to demonstrate the application of the framework. The framework (Sequential Movement Pattern-mining [SMP]) applies techniques to analyse i) geospatial data (i.e., decimal degree latitude and longitude), ii) determine players' turning angles, iii) improve movement descriptor assignment, thus improving movement unit formation and iv) improve the classification and identification of players’ frequent SMP. The SMP framework allows for sub-sequences of movement units to be condensed, removing repeated elements, which offers a novel technique for the quantification of similarities or dis-similarities between players and playing positions. The SMP framework provides a robust and stable method that allows, for the first time, the analysis of GPS-derived data and identifies the frequent SMP of field-based team-sport athletes. The application of the SMP framework in practice could optimise the outcomes of training of field-based team-sport athletes by improving training specificity.
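The condensing step, in which sub-sequences of movement units have repeated elements removed, amounts to collapsing consecutive duplicates. A minimal sketch with hypothetical movement-unit labels (the SMP framework's actual descriptor assignment from speed and turning angle is not reproduced here):

```python
from itertools import groupby

def condense(sequence):
    """Collapse consecutive repeated movement units,
    e.g. walk-walk-jog-jog-jog -> walk-jog."""
    return [unit for unit, _ in groupby(sequence)]

# Hypothetical movement-unit labels for a short passage of play
raw = ["walk", "walk", "jog", "jog", "jog", "turn-left", "jog", "jog", "sprint"]
print(condense(raw))
```

Condensed sequences are what make pattern mining tractable: two players covering the same transition structure at different durations map to the same pattern, so their similarity can be quantified.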
This study aims to (a) quantify the movement patterns during rugby league match-play and (b) identify if differences exist by levels of competition within the movement patterns and units through the sequential movement pattern (SMP) algorithm. Global Positioning System data were analysed from three competition levels: four Super League regular (regular-SL), three Super League (semi-)Finals (final-SL) and four international rugby league (international) matches. The SMP framework extracted movement pattern data for each athlete within the dataset. Between competition levels, differences were analysed using linear discriminant analysis (LDA). Movement patterns were decomposed into their composite movement units; then Kruskal-Wallis rank-sum and Dunn post-hoc tests were used to show differences. The SMP algorithm found 121 movement patterns comprised mainly of "walk" and "jog" based movement units. The LDA had an accuracy score of 0.81, showing good separation between competition levels. Linear discriminants 1 and 2 explained 86% and 14% of the variance, respectively. The Kruskal-Wallis test found differences between competition levels for 9 of 17 movement units. Differences were primarily present between regular-SL and international, with other combinations showing fewer differences. Movement units which showed significant differences between competition levels were mainly composed of low velocities with mixed acceleration and turning angles. The SMP algorithm found 121 movement patterns across all levels of rugby league match-play, of which 9 were found to show significant differences between competition levels. Of these nine, all showed significant differences between international and domestic levels, whereas only four showed differences within the domestic levels. This study shows the SMP algorithm can be used to differentiate between levels of rugby league and that higher levels of competition may have greater velocity demands.
Editorial: Applying diffusion of innovation theory to evaluate the attributes of the new tackle law in rugby football codes
Rule changes within football-code team sports aim to improve performance, enhance player welfare, increase competitiveness, and provide player development opportunities. This manuscript aimed to review research investigating the effects of rule changes in football-code team sports. A systematic search of electronic databases (PubMed, ScienceDirect, CINAHL, MEDLINE, and SPORTDiscus) was performed to August 2023; keywords related to rule changes, football-code team sports, and activity type. Studies were excluded if they failed to investigate a football-code team sport, did not quantify the change of rule, or were review articles. Forty-six studies met the eligibility criteria. Four different football codes were reported: Australian rules football (n = 4), rugby league (n = 6), rugby union (n = 16), soccer (n = 20). The most common category was physical performance and match-play characteristics (n = 22). Evidence appears at a high risk of bias partly due to the quasi-experimental nature of included studies, which are inherently non-randomised, but also due to the lack of control for confounding factors within most studies included. Rule changes can result in unintended consequences for performance (e.g., longer breaks in play) and affect player behaviour (i.e., reduce tackler height in rugby) but might not achieve the desired outcome (i.e., unchanged concussion incidence). Coaches and governing bodies should regularly and systematically investigate the effects of rule changes to understand their influence on performance and injury risk. It is imperative that future studies analysing rule changes within football codes account for confounding factors by implementing suitable study designs and statistical analysis techniques.
Instrumented mouthguards (iMGs) have the potential to quantify head acceleration exposures in sport. The Rugby Football League is looking to deploy iMGs to quantify head acceleration exposures as part of the Tackle and Contact Kinematics, Loads and Exposure (TaCKLE) project. iMGs and associated software platforms are novel, thus limited validation studies exist. The aim of this paper is to describe the methods that will determine the validity (ie, laboratory validation of kinematic measures and on-field validity) and feasibility (ie, player comfort and wearability and practitioner considerations) of available iMGs for quantifying head acceleration events in rugby league. Phase 1 will determine the reliability and validity of iMG kinematic measures (peak linear acceleration, peak rotational velocity, peak rotational acceleration), based on laboratory criterion standards. Players will have three-dimensional dental scans and be provided with available iMGs for phase 2 and phase 3. Phase 2 will determine the on-field validity of iMGs (ie, identifying true positive head acceleration events during a match). Phase 3 will evaluate player perceptions of fit (too loose, too tight, bulky, small/thin, held mouth open, held teeth apart, pain in jaw muscles, uneven bite), comfort (on lips, gum, tongue, teeth) and function (speech, swallowing, dry mouth). Phase 4 will evaluate the practical feasibility of iMGs, as determined by practitioners using the system usability scale (preparing iMG system and managing iMG data). The outcome will provide a systematic and robust assessment of a range of iMGs, which will help inform the suitability of each iMG system for the TaCKLE project.
Objectives Quantify and identify factors associated with concussion underreporting in Super League rugby league players. Design Cross-sectional survey. Methods During the 2022 preseason, 422 Men's and Women's Super League players completed an online survey quantifying player demographics, rugby playing history, concussion history, prevalence of, and reasons for, underreporting concussion, concussion knowledge and long-term implications and perceptions of concussion. Results Overall, 20% of respondents stated they did not report concussion-related symptoms to medical staff during the 2020 and 2021 seasons. The two most common reasons for underreporting concussion were ‘didn't want to be ruled out of a match’ (35%) and ‘didn't want to let down team’ (24%). 65% of players reported an appropriate level of knowledge about concussion and potential long-term implications at the start of their senior rugby career, versus 89% now. In relation to concussion knowledge, symptoms were correctly identified on 74% of occasions. 57% of players surveyed were concerned about the potential long-term implications from concussion, and 11% of players would encourage their/family members' children to not play rugby league. Conclusions The proportion of Super League players who did not report concussion symptoms was similar to rugby league players in Australia. The main reasons for not reporting concussion appeared to be due to perceptions of what is beneficial for the team, suggesting both performance and medical staff should collectively encourage players to report concussion. A player's attitude towards concussion is potentially an individual modifiable risk factor and should be considered within the concussion management of players.
Several microtechnology devices quantify the external load of team sports using Global Positioning Systems sampling at 5, 10, or 15 Hz. However, for short, explosive actions, such as collisions, these sample rates may be limiting. It is known that very high-frequency sampling is capable of capturing changes in actions over a short period of time. Therefore, the aim of this study was to compare the mean acceleration and entropy values obtained from 100 Hz and 1000 Hz tri-axial accelerometers in tackling actions performed by rugby players. A total of 11 elite adolescent male rugby league players (mean ± SD; age: 18.5 ± 0.5 years; height: 179.5 ± 5.0 cm; body mass: 88.3 ± 13.0 kg) participated in this study. Participants performed tackles (n = 200), which were recorded using two triaxial accelerometers sampling at 100 Hz and 1000 Hz, respectively. The devices were placed together inside the Lycra vests on the players’ backs. The mean acceleration, sample entropy (SampEn), and approximate entropy (ApEn) were analyzed. For mean acceleration, the 1000 Hz accelerometer obtained greater values (p < 0.05). However, SampEn and ApEn were greater with the 100 Hz accelerometer (p < 0.05). A large relationship was observed between the two devices in all the parameters analyzed (R2 > 0.5; p < 0.0001). Sampling frequency can affect the quality of the data collected, and a higher sampling frequency potentially allows for the collection of more accurate motion data. A frequency of 1000 Hz may be suitable for recording short and explosive actions.
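Sample entropy (SampEn), one of the regularity measures compared above, can be sketched as follows. This is a simplified version using an absolute tolerance `r` (in practice `r` is commonly set to 0.2 times the series' standard deviation), with short invented signals rather than accelerometer traces:

```python
import math

def sample_entropy(series, m=2, r=0.25):
    """Sample entropy: -ln(A/B), where B counts pairs of matching templates of
    length m and A of length m+1 (self-matches excluded). Lower values indicate
    a more regular signal. r is an absolute tolerance in the series' units."""
    n = len(series)

    def count_matches(length):
        templates = [series[i:i + length] for i in range(n - length + 1)]
        matches = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                # Chebyshev distance between templates must be within tolerance
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    matches += 1
        return matches

    b, a = count_matches(m), count_matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

# A periodic signal is highly regular, so its SampEn is low;
# an irregular signal of the same length scores higher.
regular = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
irregular = [0.3, 1.4, 0.1, 0.9, 1.8, 0.2, 1.1, 0.5, 1.6, 0.0]
print(sample_entropy(regular) < sample_entropy(irregular))
```

Because SampEn is sensitive to how finely a signal is resolved, the same tackle recorded at 100 Hz and 1000 Hz can yield different entropy values, which is consistent with the differences reported in the abstract.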
To critically evaluate sport nutrition services available to male and female international rugby unions. Fifteen participants, representing 16 international rugby unions, including nine female and seven male teams (one participant worked with both a female and male union), responded to an online survey. Twelve of the unions recruited were ranked in the top 10 globally by World Rugby. Twelve unions employed accredited nutrition practitioners with significant experience (> 5 years: n = 5; > 10 years: n = 4) and advanced qualifications (master's degrees: n = 8; doctorates: n = 2). Three unions did not employ a qualified nutrition practitioner (female: n = 2; male: n = 1). Full-time employment was more common among nutrition practitioners serving male (n = 4/5) versus female (n = 3/6) unions. Practitioners served male unions for more hours per week (42 ± 28) than female unions (24 ± 20). Practitioners were involved in sport science meetings (n = 14/15), anti-doping education, menu design, strategy development (n = 13/15), body composition assessments, individual consultations (n = 12/15), focusing on fuelling, recovery and injury rehabilitation (n = 14/15). Participants were “moderately confident” (n = 8/15) in using behaviour change techniques. Most participants agreed on the lack of female-specific nutrition guidance (n = 14/15), relying on guidance for male players due to limited evidence (n = 7/9). This study provides the first critical reflection of sport nutrition service delivery within international rugby. The findings highlight gender disparities for female players, with reduced applied support and a lack of female-specific guidelines. Recommendations include enhancing practitioner training in behaviour change, hiring qualified nutritionists, deemphasising body composition assessment, and conducting more research to improve nutrition services, especially for women.
Ranking enables coaches, sporting authorities, and pundits to determine the relative performance of individual athletes and teams in comparison to their peers. While ranking is relatively straightforward in sports that employ traditional leagues, it is more difficult in sports where competition is fragmented (e.g. athletics, boxing, etc.), with not all competitors competing against each other. In such situations, complex points systems are often employed to rank athletes. However, these systems have the inherent weakness that they frequently rely on subjective assessments in order to gauge the calibre of the competitors involved. Here we show how two Internet derived algorithms, the PageRank (PR) and user preference (UP) algorithms, when utilised with a simple ‘who beat who’ matrix, can be used to accurately rank track athletes, avoiding the need for subjective assessment. We applied the PR and UP algorithms to the 2015 IAAF Diamond League men’s 100m competition and compared their performance with the Keener, Colley and Massey ranking algorithms. The top five places computed by the PR and UP algorithms, and the Diamond League ‘2016’ points system were all identical, with the Kendall’s tau distance between the PR standings and ‘2016’ points system standings being just 15, indicating that only 5.9% of pairs differed in their order between these two lists. By comparison, the UP and ‘2016’ standings displayed a less strong relationship, with a tau distance of 95, indicating that 37.6% of the pairs differed in their order. When compared with the standings produced using the Keener, Colley and Massey algorithms, the PR standings appeared to be closest to the Keener standings (tau distance = 67, 26.5% pair order disagreement), whereas the UP standings were more similar to the Colley and Massey standings, with the tau distances between these ranking lists being only 48 (19.0% pair order disagreement) and 59 (23.3% pair order disagreement) respectively. 
In particular, the UP algorithm ranked ‘one-off’ victors more highly than the PR algorithm, suggesting that the UP algorithm captures alternative characteristics to the PR algorithm, which may be more suitable for predicting future performance in, say, knockout tournaments, rather than for use in competitions such as the Diamond League. As such, these Internet-derived algorithms appear to have considerable potential for objectively assessing the relative performance of track athletes, without the need for complicated points equivalence tables. Importantly, because both algorithms utilise a ‘who beat who’ model, they automatically adjust for the strength of the competition, thus avoiding the need for subjective decision making.
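The ranking pipeline described above can be sketched in a few lines. A minimal illustration, assuming hypothetical win counts for four sprinters and the standard damped power-iteration form of PageRank; each defeat is treated as a ‘vote’ from loser to winner, and the Kendall’s tau distance simply counts pairs ordered differently by two standings:

```python
from itertools import combinations

import numpy as np

# Hypothetical 'who beat who' matrix: wins[i][j] = times athlete i beat athlete j.
athletes = ["A", "B", "C", "D"]
wins = np.array([
    [0, 3, 2, 2],
    [0, 0, 2, 1],
    [0, 1, 0, 2],
    [0, 1, 1, 0],
], dtype=float)

def pagerank_standings(wins, d=0.85, tol=1e-9):
    """PageRank on a losses->winners graph: each defeat is a vote for the winner.

    Column j of the transition matrix spreads athlete j's rank across the
    athletes who beat them; an unbeaten athlete votes uniformly (dangling node).
    """
    n = wins.shape[0]
    beaten = wins.sum(axis=0)                       # times each athlete lost
    T = np.where(beaten > 0, wins / np.where(beaten > 0, beaten, 1), 1.0 / n)
    r = np.full(n, 1.0 / n)
    while True:
        r_next = (1 - d) / n + d * T @ r
        if np.abs(r_next - r).sum() < tol:
            return r_next
        r = r_next

def kendall_tau_distance(rank_a, rank_b):
    """Number of item pairs ordered differently by two standings lists."""
    pos_a = {x: i for i, x in enumerate(rank_a)}
    pos_b = {x: i for i, x in enumerate(rank_b)}
    return sum(1 for x, y in combinations(rank_a, 2)
               if (pos_a[x] - pos_a[y]) * (pos_b[x] - pos_b[y]) < 0)

scores = pagerank_standings(wins)
standings = [a for _, a in sorted(zip(-scores, athletes))]
print(standings)
```

Because rank flows along defeats, beating a highly ranked opponent is worth more than beating a lowly ranked one, which is how the method adjusts automatically for the strength of the competition.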
Preventing the spread of infection during matches and training activities is a major challenge facing all sports returning from the enforced COVID-19 shutdown. During training and matches, rugby league players make contact with others, which can result in SARS-CoV-2 virus transmission. While these interactions characterise the appeal of the game, a number of them can be avoided, including shaking hands and conversing after the match. This paper presents a framework underpinned by behavioural science (the capability, opportunity, motivation and behaviour model, COM-B) to support stakeholders in helping players adopt new social distance norms and behaviours. This framework helps to ensure the players have the capability, opportunity and motivation to adopt new COVID-19 risk-minimising behaviours, to which they will need to commit 100%.
Changes in dietary intake, immune function and performance monitors throughout a season in professional rugby league players
Throughout a rugby league (RL) season players are exposed to a high volume of competitive fixtures. Cumulative loads and inadequate nutrition may suppress immune function, which would have negative consequences for health and performance. As no study has monitored immune function and dietary intake throughout a RL season, the aim of the study was to identify the pattern of dietary intake and critical time points where immune function and selected performance monitors are compromised. Following ethics approval, 20 male volunteer professional RL players (25.4±3.2 y, 98.2±8.4 kg) were monitored the day prior to a competitive fixture for 25 weeks of the season. Weekly changes in neuromuscular function (CMJ), wellbeing (5-point questionnaire), training loads (sRPE), salivary testosterone (sTest) and immunoglobulin A (sIgA) were recorded. Dietary intake (4-day food diary) was assessed at the start, middle and end of the season. Changes from the overall mean were inferred via Cohen’s d effect sizes, using means and standard deviations calculated from a linear mixed model to account for missing data. Moderate increases in training load occurred in weeks 3, 5, 9, 10, 12 and 20 and a very large increase occurred in week 21. Moderate decreases in CMJ flight time occurred in week 14 and small decreases occurred in weeks 6, 7, 20, 22, 24 and 25. For all wellbeing parameters small decreases occurred in weeks 7 and 17, with small increases in stress, soreness and fatigue also occurring in week 20. Mean sTest was 119.8±55.5 pg.ml-1, with small declines occurring in weeks 7, 10, 13, 20 and 22. Overall sIgA concentrations ranged from 1.03-1.18 µg.min-1, but compared to the overall mean (1.12 ± 0.17 µg.min-1) small decreases were observed in weeks 6, 9, 16, 21, 23 and 24. Dietary intakes were consistent during the recording periods with mean energy intake ranging from 2811-3149 kcal.day-1.
Carbohydrate, protein, and fat intakes ranged from 2.9-3.2 g.kg.BM-1, 2.0-2.1 g.kg.BM-1, and 1.1-1.4 g.kg.BM-1 respectively. The findings suggest that a consistent intake throughout a competitive season did not prevent reductions in immune function and selected performance monitors. Further research to develop appropriate periodised nutritional strategies to maintain immune function and support player health and performance is needed.
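The weekly changes in the abstract above were inferred via Cohen’s d against the season mean. A minimal sketch using the sIgA figures quoted (1.03 vs. an overall 1.12 ± 0.17 µg.min-1); the magnitude labels follow thresholds common in this literature and are an assumption, not necessarily the study’s exact scale:

```python
def cohens_d(value, overall_mean, overall_sd):
    """Standardised difference of a weekly value from the overall season mean."""
    return (value - overall_mean) / overall_sd

def magnitude(d):
    """Qualitative label for |d| (assumed thresholds: 0.2/0.6/1.2/2.0)."""
    a = abs(d)
    for cut, label in [(0.2, "trivial"), (0.6, "small"),
                       (1.2, "moderate"), (2.0, "large")]:
        if a < cut:
            return label
    return "very large"

# Lowest weekly sIgA value vs. the overall mean reported in the abstract
d = cohens_d(1.03, 1.12, 0.17)
print(round(d, 2), magnitude(d))  # a 'small' decrease, matching the abstract
```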
Purpose: To explore the effects of travel related to international rugby sevens competition on sleep patterns. Methods: Seventeen international male rugby sevens players participated in this study. Sleep assessments were performed daily during two separate Sevens World Series competition legs (Oceania and America). The duration of each competition leg was subdivided into key periods (pre-tour, pre-competition, tournaments 1 and 2, relocation and post-tour) lasting 2 to 7 nights. Linear mixed models in combination with magnitude-based decisions were used to assess 1) the difference between pre-season and key periods and 2) the effect of travel direction (eastward or westward). Results: Shorter total sleep time (hh:mm) was observed during tournament 2 (mean ± SD, 06:16 ± 01:08), relocation (06:09 ± 01:09) and the pre-tour week (06:34 ± 01:24) compared with pre-season (06:52 ± 01:00). Worse sleep quality (AU) was observed during tournaments 1 (6.1 ± 2.0) and 2 (5.7 ± 1.2), as well as during the relocation week (6.3 ± 1.5), than during pre-season (6.5 ± 1.8). When traveling eastward compared with westward, earlier fall-asleep time was observed during tournament 1 (ES -0.57, 90%CI [-1.12 to -0.01]), relocation week (-0.70 [-1.11 to -0.28]), and post-tour (-0.57 [-0.95 to -0.18]). However, possibly trivial and unclear differences were observed during the pre-competition week (0.15 [-0.15 to 0.45]) and tournament 2 (0.81 [-0.29 to 1.91]). Conclusion: Sleep patterns of elite rugby sevens players are robust to the effects of long-haul travel and jet lag. However, staff should consider promoting sleep during the tournament and relocation weeks.
Developing an expert consensus in rugby nutrition: a Delphi study
Background: Limited empirical evidence exists which has direct translation for nutrition practitioners working with team sport athletes. Furthermore, given the lack of rugby-specific nutrition recommendations, it is important to provide a framework on the requirements of players in order to enhance the standard of nutritional practice in rugby. The primary aim of this research was to develop an expert consensus on optimum nutritional practices for rugby players. Methods: To obtain expert opinion, a Delphi poll was implemented; this survey technique allows a consensus to be established where information is currently contradictory or insufficient (1). Following ethics approval, seven expert nutrition practitioners from across the United Kingdom (UK) and Ireland were recruited for the study. Practitioners were required to have at least two years' continuous experience in professional rugby clubs. All recruited practitioners had at least three years' elite experience and were working with international rugby teams at the time of data collection. During the initial stage of the research, three of the UK national nutrition leads were invited to participate in standardised open-ended interviews. A total of 359 statements, divided into 20 topic areas, were generated from these interviews and included in the first round of voting in the Delphi poll. Using a 5-point Likert scale, all practitioners selected their level of agreement with the statements from strongly disagree to strongly agree. For a statement to be agreed upon as good practice, 75% of the practitioners were required to vote in agreement. Statements which were not agreed upon were recirculated for the second round of voting. During the first round of voting, all practitioners were invited to provide additional statements for inclusion in the final round.
Results: Following the two rounds of voting a total of 201 statements were agreed upon; an indicative sample is provided below. Practitioners encouraged estimating energy needs from lean body mass measures but recommended that caution should be exercised when using predictive equations with rugby players. Dietary intakes should be recorded with caution, with additional supporting quality control measures adopted. High protein intakes (>2 g.kg.BM-1) are deemed acceptable by practitioners provided other nutrient consumption is not compromised. There appears to be varied practice in carbohydrate consumption across playing positions, and players may often sacrifice carbohydrate intake for body composition goals. Body composition is a primary driver for the rugby player; however, vanity and body image also play a large role, which may compromise players' rugby goals. Social media has increased interest in food and can be used as an educational tool; however, players sometimes receive inappropriate information. A trackable system of supplement use should be in place and any supplementation protocols implemented should have an evidence basis. Practitioners should always have an alternative food strategy to any supplements used. All supplement strategies should be assessed throughout the season for psychological and physiological variation, and it is important to be aware of the large influence senior players have over young players regarding supplement use. Finally, a key message was that even in a team sport setting it is important to acknowledge that responses to nutrition and training are highly individual and variable. Despite obtaining much agreement on appropriate practices, a relatively high level of disparity in opinion occurred in the areas of fat intake, recovery and nutrition for supporting illness.
Discussion: While current research in rugby nutrition is developing, this Delphi study presents a unique approach to conducting research in the field of applied sport and exercise nutrition. These statements will facilitate expert practitioners in moving towards a consensus regarding optimum nutritional strategies for the rugby player. This research can be utilised by new or developing practitioners while an empirical evidence base is being established. Conclusion: As well as providing a consensus, the present study highlights areas where practitioners should exercise caution regarding recommendations, as more research is required to help inform their practice. References: 1. Hasson F, Keeney S, McKenna H. Research guidelines for the Delphi survey technique. Journal of Advanced Nursing 2000; 32: 1008-1015.
In recent years an understanding has developed that sports injuries are the emergent outcomes of complex, dynamic systems. Thus, the importance of local contextual factors on injury outcomes is increasingly being acknowledged. These realisations place injury prevention research at a crossroads. Currently, injury prevention researchers develop universally applicable injury prevention solutions, but the adoption of these solutions in practice is low. This occurs because implementation contexts are both unique and dynamic in nature, and as a result singular, static solutions are often incompatible. In contrast, practitioners address injury prevention through iterative cycles of trial and error, aiming to optimise the injury prevention process within their own unique contexts. The purpose of this critical review is to draw attention to the misalignment between research and practice-based approaches to injury prevention. In light of this, we propose alternative research approaches that acknowledge the process-driven nature of injury prevention in practice. We propose that a core focus of sport injury prevention research should be to provide practitioners with useful and relevant information to support their decision-making around their localised injury prevention practice. Through this approach injury prevention research ceases to be about what works, and begins to engage with what works, in what contexts, and why.
In recent years there has been an exponential rise in the professionalism and success of female sports. Practitioners (e.g., sport science professionals) aim to apply evidence-informed approaches to optimise athlete performance and well-being. Evidence-informed practices should be derived from research literature. Given the lack of research on elite female athletes, this is challenging at present. This limits the ability to adopt an evidence-informed approach when working in female sports, and as such, we are likely failing to maximize the performance potential of female athletes. This article discusses the challenges of applying an evidence base derived from male athletes to female athletes. A conceptual framework is presented, which depicts the need to question the current (male) evidence base due to the differences of the "female athlete" and the "female sporting environment," which pose a number of challenges for practitioners working in the field. Until a comparable applied sport science research evidence base is established in female athletes, evidence-informed approaches will remain a challenge for those working in female sport.
ABSTRACT Flywheels are resistance training devices that can increase lean body mass, strength, and power. However, due to their unique design, with the inertia from the concentric portion directly relating to the force applied during the eccentric portion, monitoring the training stimulus can be difficult. Consequently, the aim of this study was to assess the validity of the kMeter app for quantifying force and power at a range of different isoinertial loads from a flywheel training device when compared against a criterion measure. Eleven subjects volunteered to take part in this study, with each subject completing between 5-35 repetitions of the harness squat with 0.05, 0.10 and 0.15 kg·m2 isoinertial loads. A synchronised dual force plate and tri-camera optoelectronic setup was used as the criterion measure to calculate force and power output, while the kMeter app was used as the practical measure. Very large to nearly perfect relationships were observed between the two measures, with trivial to moderate bias reported. Additionally, the typical error of the estimate (TEE) was found to be <10% at all isoinertial loads. These findings suggest that the kMeter app, when used in conjunction with the kBox flywheel device, demonstrates acceptable levels of validity. However, due to the TEE, the kMeter app may not be able to accurately detect small differences and may therefore not be suitable for research purposes. These findings suggest that the kMeter app is an acceptable method of monitoring flywheel resistance training. Furthermore, it is advised that practitioners utilise mean power rather than mean force. Keywords: Flywheel; Validity; kBox; kMeter; Force; Power
Darrall-Jones, J, Roe, G, Cremen, E, and Jones, B. Can team-sport athletes accurately run at submaximal sprinting speeds? Implications for rehabilitation and warm-up protocols.
This study aimed to evaluate team attacking performances in rugby league via expected possession value (EPV) models. Location data from 59,233 plays in 180 Super League matches across the 2019 Super League season were used. Six EPV models were generated using arbitrary zone sizes (EPV-308 and EPV-77) or aggregated according to the total zone value generated during a match (EPV-37, EPV-19, EPV-13 and EPV-9). Attacking sets were considered as Markov Chains, allowing the value of each zone visited to be estimated based on the outcome of the possession. The Kullback-Leibler Divergence was used to evaluate the reproducibility of the value generated from each zone (the reward distribution) by teams between matches. Decreasing the number of zones improved the reproducibility of reward distributions between matches but reduced the variation in zone values. After six previous matches, the subsequent match's zones had been visited on 95% or more occasions for EPV-19 (95±4%), EPV-13 (100±0%) and EPV-9 (100±0%). The KL Divergence values were infinity (EPV-308), 0.52±0.05 (EPV-77), 0.37±0.03 (EPV-37), 0.20±0.02 (EPV-19), 0.13±0.02 (EPV-13) and 0.10±0.02 (EPV-9). This study supports the use of EPV-19 and EPV-13, but not EPV-9 (too little variation in zone values), to evaluate team attacking performance in rugby league.
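The reproducibility metric above is the Kullback-Leibler Divergence between a team's zone reward distributions in consecutive matches. A minimal sketch with hypothetical reward shares for nine zones (the actual zone values are not reproduced here); note how a zone with no reward in one match forces an infinite divergence, the behaviour reported for EPV-308:

```python
import numpy as np

def kl_divergence(p, q):
    """KL divergence D(p || q) between two (normalised) reward distributions.

    Returns infinity when p places probability on a zone that q never
    rewards -- which is why sparse, fine-grained zonings blow up.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p, q = p / p.sum(), q / q.sum()
    support = p > 0
    if np.any(q[support] == 0):
        return float("inf")
    return float(np.sum(p[support] * np.log(p[support] / q[support])))

# Hypothetical zone reward shares for one team across two matches (EPV-9 style)
match1 = [0.30, 0.18, 0.14, 0.10, 0.09, 0.07, 0.05, 0.04, 0.03]
match2 = [0.28, 0.20, 0.12, 0.11, 0.08, 0.08, 0.06, 0.04, 0.03]
print(kl_divergence(match1, match2))  # small value -> similar distributions
```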
Hydration Status of Professional Rugby League Players during Match Play
Purpose: This study evaluated whether glycogen-associated water is a protected entity not subject to normal osmotic homeostasis. An investigation into practical and theoretical aspects of the functionality of this water as a determinant of osmolality, dehydration, and glycogen concentration was undertaken. Methods: In vitro experiments were conducted to determine the intrinsic osmolality of glycogen–potassium phosphate mixtures as would be found intra-cellularly at glycogen concentrations of 2% for muscle and 5 and 10% for liver. Protected water would not be available to ionic and osmotic considerations, whereas free water would obey normal osmotic constraints. In addition, the impact of 2 L of sweat loss in situations of muscle glycogen repletion and depletion was computed to establish whether water associated with glycogen is of practical benefit (e.g., to increase “available total body water”). Results: The osmolality of glycogen–potassium phosphate mixtures is predictable at 2% glycogen concentration (predicted 267, measured 265.0 ± 4.7 mOsmol.kg-1), indicating that glycogen-associated water is completely available to all ions and is likely part of the greater osmotic system of the body. At higher glycogen concentrations (5 and 10%), there was a small amount of glycogen water (~10–20%) that could be considered protected. However, the majority of the glycogen-associated water behaved according to normal osmotic considerations. The theoretical exercise of selective dehydration (2 L) indicated a marginal advantage to components of total body water such as plasma volume (1.57% or 55 mL) when starting exercise glycogen replete. Conclusion: Glycogen-associated water does not appear to be a separate reservoir and is not able to uniquely replete water loss during dehydration.

Total energy expenditure (TEE) has been quantified in elite senior rugby league (RL) and rugby union (RU) players using multiple measures, with criterion measures lacking in RU and academy players. Robust measures of TEE are required as prediction equations used to estimate energy requirements are often unsuitable for athletes. This study quantified TEE of 27 elite male English academy (U16 and U20) and senior (U24) RL and RU players during a 14-day in-season period using doubly labelled water (DLW). Resting metabolic rate (RMR), using indirect calorimetry, and physical activity level (PAL) were also measured (TEE:RMR). Predicted TEE, determined by published equations, was compared to measured TEE by age group. Differences in TEE (RL, 4369 ± 979; RU, 4365 ± 1122; U16, 4010 ± 744; U20, 4414 ± 688; U24, 4761 ± 1523 Kcal.day-1) and PAL (overall mean 2.0 ± 0.4) were unclear. RMR was very likely greater for RL (2366 ± 296 Kcal.day-1) than RU players (2123 ± 269 Kcal.day-1). Relative RMR for U16, U20 and U24 (27 ± 4, 23 ± 3 and 26 ± 5 Kcal.Kg-1.day-1) was very likely greater for U20 than U24 players.
Differences in TEE estimated by the Schofield, Cunningham and Harris-Benedict equations compared with DLW were unclear, likely and unclear for U16 (187 ± 614; -489 ± 564 and -90 ± 579 Kcal.day-1), likely, very likely and likely for U20 (-449 ± 698; -785 ± 650 and -452 ± 684 Kcal.day-1) and all unclear for U24 players (-428 ± 1292; -605 ± 1493 and -461 ± 1314 Kcal.day-1). Due to large variability between individuals, negligible differences in TEE were observed by code, and ~350-400 Kcal.day-1 differences between consecutive age groups were unclear. Differences in RMR may be due to training exposure and match play. The remaining components of TEE (i.e. thermic effect of feeding and activity thermogenesis) may reflect the differences in contact demands between codes, as RU players typically engage in more static exertions than RL players during match play. Prediction equations are currently insufficient to differentiate between individual variability in TEE. The importance of practitioners providing individual support for the elite rugby player is highlighted. Finally, the TEE measured in this study using the gold standard DLW method can be used as reference data for elite rugby players of different codes and ages, during an in-season training period.
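The prediction equations compared above are published RMR formulas, typically multiplied by a physical activity level (PAL) to estimate TEE. A sketch for a hypothetical U24 player, using the commonly cited men's constants for each equation (these constants should be verified against the original publications before applied use):

```python
def harris_benedict(mass_kg, height_cm, age_y):
    """Harris-Benedict (men): RMR in kcal/day."""
    return 66.47 + 13.75 * mass_kg + 5.003 * height_cm - 6.755 * age_y

def cunningham(fat_free_mass_kg):
    """Cunningham: RMR in kcal/day from fat-free mass."""
    return 500 + 22 * fat_free_mass_kg

def schofield_18_30_men(mass_kg):
    """Schofield (men, 18-30 y): RMR in kcal/day."""
    return 15.057 * mass_kg + 692.2

# Hypothetical U24 player: 100 kg, 185 cm, 23 y, 85 kg fat-free mass
pal = 2.0  # overall mean PAL reported in the study
for name, rmr in [("Harris-Benedict", harris_benedict(100, 185, 23)),
                  ("Cunningham", cunningham(85)),
                  ("Schofield", schofield_18_30_men(100))]:
    print(f"{name}: RMR {rmr:.0f} kcal/day -> predicted TEE {rmr * pal:.0f} kcal/day")
```

Even with identical inputs the three equations disagree by a few hundred kcal/day, and none accounts for the individual variability the DLW data revealed, which is the study's central point.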
Effectiveness of a commercially available guarana supplement on subjective measures in elite rugby players: a case study
Global anti-doping policy indicates that athlete support personnel (ASP, e.g., doctors, nutritionists) can play an important role in fostering supportive environments that protect against intentional and inadvertent doping. Yet, research into ASP anti-doping roles is limited and no study has examined how (if at all) different members of ASP work together. Therefore, this study investigated anti-doping roles of ASP in a single sports club environment via semi-structured interviews. Through inductive reflexive thematic analysis, three overarching themes were constructed: 1) Everyone has responsibility for anti-doping, but most of the work rests unevenly on a few shoulders, 2) Education is fundamental to doping prevention, and 3) (Preventing doping) It’s all about the way we work with players and each other. As the first study of its kind, the findings indicated that actions taken to prevent doping varied across ASP working together in the same environment. The nutritionist and medical staff were most active in anti-doping efforts and least active were strength and conditioning coaches. Factors underpinning anti-doping roles were individuals’ relevant expertise/training and overall job responsibilities (e.g., supplements, medications) related to risk of doping. Staff also connected their doping prevention efforts to the club’s person-centred philosophy, which prioritised ‘individualisation’ and supportive relationships. While the data indicates potential for anti-doping responsibilities to be shared amongst ASP who work well together and trust one another, it revealed that reliance on one or two ASP in any environment might allow other ASP to neglect their opportunity to have a positive influence on players’ doping-related decisions.
Is there a place for static stretching in warm-up routines of soccer players?
The objective of the study was to investigate the impact of static and dynamic stretching, singularly or combined within a warm-up, on a soccer-specific intermittent protocol (SSIP). Seven semi-professional players aged 21.9(±2.0) years [height 181.5(±9.4) cm, body mass 74.8(±6.4) kg], following screening and ethical approval, completed a VO2max test and were familiarised with the SSIP. Within a 3-week period, participants undertook 3 different warm-ups before completing the SSIP, which consisted of three bouts of exercise with a 15-min recovery after the 2nd bout to simulate a soccer match. After the 3rd bout participants completed an intermittent time to exhaustion test (ITTE). Warm-ups consisted of 5-min activity, then 10-min of Static Stretches (SS), Dynamic Stretches (DS) or SS+DS (SS-DS) in a randomised crossover order. Core temperature (Tc) and VO2 were recorded throughout. Testing took place in an environmental chamber replicating UK conditions (9°C, 50% relative humidity). ITTE was 8.2(±2.3), 9.0(±4.6), and 10.7(±5.9) min for the SS, DS and SS-DS conditions. Tc in the SS-DS condition (38.4±0.3°C) was higher (P<0.05) than in the SS condition (37.9±0.4°C) during the 1st bout. Tc was 38.3(±0.6)°C for the DS condition. Differences in Tc during the first two bouts disappeared in the 3rd. Participants in the DS (78±9.5%) and SS-DS (76.3±15.1%) conditions exercised at a lower (P<0.05) %VO2max than in the SS condition (83.0±15.4%). Combining SS and DS is more beneficial than SS alone during a warm-up, and may be more beneficial than DS alone. A second warm-up is necessary as benefits associated with the initial warm-up dissipate during the half-time break.
Sports invest in research to optimise performance and enhance athlete wellbeing. Involving stakeholders allows research priorities to be determined, maximising the adoption and relevance of research findings. A three‐round modified Delphi process was used to establish wellbeing and performance research priorities for Premiership Rugby (the professional men's rugby union competition in England). Up to 10 research priorities were provided during Round 1 (grouped into higher‐order categories and themes via content analysis). In Rounds 2 and 3, participants ranked higher‐order categories on a one to five Likert scale. Consensus was defined as ≥ 70% agreement. Sixty‐five participants responded in Round 1 (41 and 32 in Rounds 2 and 3). Staff and player experience of working or playing in the Premiership was 11.0 (4.5–16.5) and 7.0 (6.0–8.5) years respectively. Following Round 1, 393 research priorities were provided, from which 53 higher‐order research priorities and 26 categories were identified within three themes: performance, wellbeing and injury. Following Round 3, 21 research priorities reached consensus within performance (n = 7), wellbeing (n = 6) and injury (n = 8). Research priorities for a professional sports league were established through a pragmatic research lens, ensuring priorities were practically minded and developed with minimal resource requirements, minimal burden for participants and in a short amount of time; this approach can be applied in other leagues. Research priorities deemed feasible and lacking a relevant evidence base can be addressed in future studies to maximise impact and complement the ongoing research programmes already established by the professional league and governing body.
This study aimed to quantify the frequency of individual and team contact events during rugby union match play in top domestic and international men’s and women’s competitions. Analyst‐coded player individual and team contact event types (tackles, carries, attacking rucks and defensive rucks, lineouts, scrums and mauls) from the 2022/2023 rugby union season were analysed from top domestic and international competitions across the world using generalised linear mixed models. For both women’s and men’s rugby, competitions generally had similar numbers of contact events per playing position. Where differences were observed, most ranged between 0.5 and six per contact event per full game equivalent (FGE). Similar trends were observed when comparing women’s to men’s rugby. However, within‐game accumulation of these different contact events for certain positional groups may have a significant impact (e.g., a front five player called up from a Farah Palmer Cup team to play in WXV1 could be involved in as many as six more attacking rucks, three more tackles and five more mauls per game on average). Furthermore, the small differences between competitions per FGE may accrue across matches and thus result in far greater exposures across a season (e.g., a front five player in Premiership Rugby may make 48 more tackles over 20 matches than in Top 14 on average). Although a high proportion of contact events per FGE were similar between competitions and sexes per playing position, differences that were observed may have important implications for players transitioning between competitions and the long‐term exposure of players to higher‐risk contact events.
OBJECTIVE: Develop a questionnaire to monitor symptoms of player-perceived shoulder function/dysfunction. DESIGN: 3-Stage Online Delphi Study. METHODS: Participants: surgeons, sports and exercise medics, academic researchers, strength and conditioning coaches, therapists and athletes, split by level of expertise/experience. Stage-1: experts (n = 12) rated constructs/items from the steering group and made changes/proposed additional constructs/items. Stage-2: experts rated/amended new constructs/items from stage-1. Stage-3: experienced professionals (n = 25) rated/ranked constructs/items from stage-2. Consensus thresholds were defined per stage (≥50% agreement/4-5 rating on 1-5 Likert scale (stages 1-2), ≥68% agreement, and items ranked for perceived importance (stage-3)). RESULTS: Stage-1, all four constructs (a. Activities of daily living, b. Range of motion, c. Strength and conditioning, d. Sports specific training and competition) and 26/42 original items achieved consensus. Twelve items were combined into five items. Four new items were also proposed. Stage-2, the combined items and three of the four new items achieved consensus. Stage-3, the four constructs and 22 items all achieved consensus. CONCLUSIONS: Following a 3-stage online Delphi process, involving expert and experienced clinicians, practitioners and athletes, a new four-construct, 22-item RSF questionnaire has been developed which can be used with rugby players to monitor perceived shoulder performance and symptoms.
Reliability and usefulness of linear speed testing in field-based sport athletes
Seasonal changes in session external training load in professional rugby league players: a case study from an elite European rugby league squad
Contact and Head Acceleration Characteristics of a Women's Rugby Union Team During an International Tournament
This study aimed to describe the characteristics of contact and head acceleration event (HAE) exposure in an international women's rugby union team, across an international tournament, encompassing match and training contexts. Using a retrospective case study design, the contact and HAE exposure of 28 women's rugby union players were assessed using video analysis and instrumented mouthguards (iMGs). In a three‐week tournament, three matches and 16 training sessions were coded using consensus operational definitions, and synchronized with iMG data. Exposure duration was recorded for each player, facilitating analysis of contact frequency, and HAE incidence per player hour. The probability of contact events to result in HAEs was reported. Training accounted for 71% (forwards) and 81% (backs) of weekly contact count. Forwards had a greater contact frequency than backs during matches (58.0 ± 10.5 vs. 21.3 ± 8.6 events per player hour). The probability for an HAE was greater in matches than training, with large inter‐individual variability observed. During matches and training, the tackle event accounted for 82% and 71% of HAEs ≥ 25g, and 79% and 78% of HAEs ≥ 1.5 krad/s2, respectively.
This study aimed to investigate the incidence, severity, and burden of injury in English elite youth female soccer players. Qualified therapists at six English girls' academies prospectively recorded all injuries that required medical attention or caused time loss for matches and training in 375 elite youth female soccer players (U10, U12, U14 and U16) during the 2019/2020 season. One hundred and eleven time-loss injuries (52 from training, 59 from matches) were sustained, resulting in 1,946 days absent (779 days from training injuries, 1,167 days from match injuries) from soccer activities. The injury incidence for matches (9.3/1000 hours, 95% CIs: 7.2-11.9) was significantly greater than training (1.1/1000 hours, 95% CIs: 0.9-1.5, p<0.001). Additionally, the injury burden for matches (183 days lost/1000 hours, 95% CIs: 142-237) was significantly greater than training (17 days lost/1000 hours, 95% CIs: 13-22, p<0.001). Injury incidence and burden were greatest in the U16 age group, and were found to increase with age. Whilst injury incidence and burden are greater in matches than training, a large proportion of preventable injuries, soft-tissue and non-contact in nature, were sustained in training. Findings provide comparative data for elite youth female soccer players.
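The incidence and burden figures above are exposure-normalised rates. A sketch, assuming the log-normal approximation for a 95% CI on a rate (the paper's exact CI method is not stated) and a hypothetical exposure of 6,344 match hours chosen so the 59 match injuries reproduce the reported 9.3/1000 h:

```python
import math

def incidence_per_1000h(n_injuries, exposure_hours):
    """Injury incidence per 1000 player-hours with an approximate 95% CI.

    The CI uses exp(ln(rate) +/- 1.96/sqrt(n)), a common approximation for
    Poisson-distributed counts; this is an assumption, not the paper's method.
    """
    rate = n_injuries / exposure_hours * 1000
    half_width = 1.96 / math.sqrt(n_injuries)
    return rate, rate * math.exp(-half_width), rate * math.exp(half_width)

def burden_per_1000h(days_lost, exposure_hours):
    """Injury burden: days lost per 1000 player-hours."""
    return days_lost / exposure_hours * 1000

rate, lo, hi = incidence_per_1000h(59, 6344)       # hypothetical match exposure
print(f"match incidence {rate:.1f}/1000 h (95% CI {lo:.1f}-{hi:.1f})")
print(f"match burden {burden_per_1000h(1167, 6344):.0f} days lost/1000 h")
```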
Illness prevention is essential for athlete health management, but little is known about its uptake in sport. Prior to the pandemic, the International Olympic Committee (IOC) published a consensus statement recommending illness prevention guidelines are implemented in sports. Yet, little is known about guideline uptake. Therefore, this study aimed to explore the (1) illness experiences of rugby players and athlete support personnel and (2) barriers and enablers to illness prevention guideline uptake in rugby, using the lens of behaviour change theory. In a bid to inform and enhance athlete welfare, we sought to amplify the voices of participants through qualitative inquiry. Between August 2020 and May 2021, 16 semi-structured interviews were undertaken with players and athlete support personnel working across rugby. Analysis was conducted using Braun and Clarke’s reflexive thematic analysis. Prior to COVID-19, participants deemed illness to be of little concern, with experience of illnesses and the global pandemic critical enablers to guideline uptake. The rugby environment was a barrier to illness prevention, particularly in women’s and academy teams where resource deficiency was highlighted. ‘Rugby identity’ acted as both a barrier and enabler with participants’ passion for rugby driving both guideline adherence and non-adherence. Tackling resource inequalities between men’s and women’s cohorts is critical to effectively implement guidelines. Coach and player education is essential, and emphasis must be placed on continuing preventative behaviours adopted due to COVID-19. Our findings offer new insight into illness prevention, moving away from prevailing quantitative research, and instead voicing players’ experiences.
The purpose of the present study was to evaluate the anthropometric and physical characteristics of English regional academy rugby union players by age category (under-16, under-18 and under-21). Data were collected on 67 academy players at the beginning of the pre-season period and comprised anthropometric (height, body mass and sum of 8 skinfolds) and physical (5 m, 10 m, 20 m and 40 m sprint, acceleration, velocity and momentum; 505 agility; vertical jump; Yo-Yo intermittent recovery test level 1; 30-15 Intermittent Fitness Test; absolute and relative 3 repetition maximum (3RM) front squat, split squat, bench press, prone row and chin-up; and isometric mid-thigh pull) measures. One-way analysis of variance demonstrated significant increases across the three age categories (p < 0.05) for height (e.g., 16s = 178.8 ± 7.1; 18s = 183.5 ± 7.2; 21s = 186.7 ± 6.61 cm), body mass (e.g., 16s = 79.4 ± 12.8; 18s = 88.3 ± 11.9; 21s = 98.3 ± 10.4 kg), countermovement jump height and peak power, sprint momentum, velocity and acceleration, and absolute, relative and isometric (e.g., 16s = 2157.9 ± 309.9; 18s = 2561.3 ± 339.4; 21s = 3104.5 ± 354.0 N) strength. Momentum, maximal speed and the ability to maintain acceleration were all discriminating factors between age categories, suggesting that these variables may be more important to monitor than sprint times alone. These findings highlight that anthropometric and physical characteristics develop across age categories and provide comparative data for English academy rugby union players.
The physical match demands of professional rugby union are well established (Cahill et al., 2013, Journal of Sports Sciences, 31, 229–237). However, there is a lack of evidence for adolescent players, especially in the UK. Therefore, the purpose of this study was to quantify and compare the demands placed upon adolescent players representing county teams across three age groups (U16, U18 and U20) and two playing positions (forwards and backs). Two county representative games for each age group were assessed, with a total of 112 independent observations collected. Players were classified into age group categories and by position (forwards: U16 [n = 20], U18 [n = 21], U20 [n = 18]; backs: U16 [n = 15], U18 [n = 19], U20 [n = 19]). Match demands were analysed via a microtechnology unit (OptimEye S5, Catapult Innovations, Melbourne, Australia) that contained a GPS and triaxial accelerometer sampling at 10 and 100 Hz, respectively. The magnitudes of difference between age groups within positions for locomotive and accelerometer-based variables were investigated using Cohen’s d effect sizes (±90% CL). Institutional ethical approval was granted. For forwards, unclear differences between age groups were observed for total distance (TD), but relative distance (RD) showed very large (U16 vs. U20; d = −2.87 ± 0.53) and large (U18 vs. U20; d = −1.81 ± 0.52) differences between groups. Moderate effect sizes were found for both maximum sprint velocity (Vmax; d = −1.03 ± 0.53) and total sprinting distance (d = −0.78 ± 0.53) between U16 and U20. When normalised for time, PlayerLoadSlowTM (PLslow·min-1) increased with age, showing moderate effects for U16 versus U18 (d = 0.68 ± 0.52) and U16 versus U20 (d = 0.80 ± 0.54). For backs, unclear differences between age groups were observed for TD, but RD showed moderate differences for U16 versus U20 (d = −0.88 ± 0.58) and U18 versus U20 (d = −1.01 ± 0.54). 
Small effect sizes were observed for Vmax (d = −0.52 ± 0.54) and total sprinting distance (d = −0.46 ± 0.54) between U18 and U20, whereas U16 versus U20 showed a small difference for Vmax only (d = −0.46 ± 0.56). PLslow·min-1 increased with age, demonstrating a moderate difference between U16 and U18 (d = 0.86 ± 0.57) and a small difference between U16 and U20 (d = 0.56 ± 0.57). This study shows that the absolute locomotive demands are similar between age groups, although differences were found when demands were expressed relative to time. This is likely due to differences in playing time between age groups and the consequent fatigue and/or pacing strategies adopted by players. The increase in PLslow·min-1 with age suggests an increase in static exertions. Future research should explore the interaction between physical and technical performances at different ages in adolescent rugby.
Training load and movement demands of English adolescent rugby union players
The purpose of the present study was to evaluate the anthropometric, sprint and high-intensity running profiles of English academy rugby union players by playing position, and to investigate the relationships between anthropometric, sprint and high-intensity running characteristics. Data were collected from 67 academy players following the off-season period and consisted of anthropometric (height, body mass, sum of 8 skinfolds [∑SF]) measures, a 40 m linear sprint (5, 10, 20, 30 and 40 m splits), the Yo-Yo intermittent recovery test level 1 (Yo-Yo IRTL-1) and the 30-15 intermittent fitness test (30-15IFT). Forwards displayed greater stature, body mass, ∑SF, sprint times and sprint momentum, with lower high-intensity running ability and sprint velocities, than backs. Comparisons between age categories demonstrated body mass and sprint momentum to have the largest differences at consecutive age categories for forwards and backs, whilst 20-40 m sprint velocity was discriminating for forwards between under-16s, 18s and 21s. Relationships between anthropometric characteristics, sprint velocity, momentum and high-intensity running ability demonstrated body mass to negatively impact sprint velocity (10 m; r = -0.34 to -0.46) and positively affect sprint momentum (e.g., 5 m; r = 0.85 to 0.93), with large to very large negative relationships with the Yo-Yo IRTL-1 (r = -0.65 to -0.74) and 30-15IFT (r = -0.59 to -0.79). These findings suggest that there are distinct anthropometric, sprint and high-intensity running ability differences between and within positions in junior rugby union players. The development of sprint and high-intensity running ability may be impacted by continued increases in body mass, as there appears to be a trade-off between momentum, velocity and the ability to complete high-intensity running.
Group and individual monitoring of sprint and strength performance in adolescent rugby union players
The influence of body mass on the 30-15 Intermittent Fitness Test in Rugby Union players
INTRODUCTION: Rugby league is an invasion sport characterised by frequent accelerations, decelerations, changes of direction and collisions between players. The game is played worldwide at junior and senior age groups and across competitive levels ranging from amateur to elite. Rugby league academy competition is an important step along the pathway to professional player status. Despite the importance of this pathway, to date no research has examined injury risk at senior academy level in England. METHODS: Three professional rugby league academies were recruited to this observational prospective cohort investigation. Eighty-one players were included in the investigation. Physiotherapists from each academy were recruited and remunerated to act as injury surveillance officers throughout the season. Match injuries were recorded using a time-loss definition consistent with the consensus adopted within rugby union. Injury incidence, injury severity and injury burden were all calculated. RESULTS: An injury incidence of 85 (95%CI 67 to 103) injuries/1000 h was observed during the 59 matches played. This equates to 1.5 (95%CI 1.2 to 1.8) time-loss injuries per match. The mean severity of injury was 22 ± 19 days, resulting in an overall injury burden of 1870 (95%CI 1785 to 1955) days/1000 h. The tackle event was the most common cause of injury (69% of injuries), with the tackled player injured more frequently than the tackler. Forwards sustained a greater proportion of injuries than backs, and ankle sprains (11, 95%CI 4 to 17 per 1000 h) were the most commonly diagnosed injuries, but the shoulder joint was the most commonly injured site (17, 95%CI 9 to 25 per 1000 h). CONCLUSION: Overall, the incidence of injury for academy rugby league was similar to that reported in senior professional rugby league (78 injuries/1000 h), but the mean severity and overall burden of injury were lower. 
Injury patterns indicate that academy players are at a higher risk of concussion and shoulder joint injuries than senior professional players. This suggests that the specific focus for injury risk management in academy rugby league should be on players’ tackle technique and prevention strategies for concussion and shoulder injuries. REFERENCES: Fitzpatrick, A. C., Naylor, A. S., Myler, P., & Robertson, C. (2018). A three-year epidemiological prospective cohort study of rugby league match injuries from the European Super League. Journal of Science and Medicine in Sport, 21(2), 160-165.
Introduction Age-grade (e.g., U18) rugby union players play at multiple playing levels across a season, including international and academy competition. One method for quantifying the physical characteristics of different playing levels is to calculate the maximum locomotor intensity using relative distance (m·min-1) and high-speed (>5.5 m·s-1) relative distance (HSm·min-1). The aims of the study were to quantify the maximum locomotor intensities from match-play and compare between U18 international and academy levels. Methods In total, 142 U18 male rugby union players provided 232 observations. During match-play each player wore a micro-technology device (S5 Optimeye, Catapult Sports) that contained a global positioning system. Using the raw instantaneous speed (m·s-1) downloaded at 10 Hz, variables were calculated using a rolling mean, advanced in 0.1 s steps, over time durations (15, 30 s and 1, 2, 2.5, 3, 4, 5, 10 min) relevant to age-grade rugby union. Players were split into four positional groups: front row; back five; scrum-halves; and inside and outside backs. A linear mixed model was used to account for the repeated measurements of players, and results were interpreted with effect sizes (ES) ±90% confidence intervals and classified as trivial (0.00-0.19), small (0.20-0.59), moderate (0.60-1.19) and large (1.20-1.99). Ethics approval was granted by Leeds Beckett University. Results Differences between levels in relative distance were trivial or small for all time durations and positions, with relative distance ranging from 148 ± 16 to 189 ± 17 m·min-1 in the one-minute duration. High-speed relative distance for one minute ranged from 26 ± 11 to 71 ± 24 HSm·min-1 and was greater in international players across all comparisons. The differences in high-speed relative distance were moderate to large (ES = 1.17 ± 0.64 to 1.59 ± 0.64) in front row players. 
Differences for the back five positional group were small (ES = 0.31 ± 0.52 to 0.45 ± 0.57) for high-speed relative distance. There were small differences between levels for scrum-halves in the 15 s, 30 s and 1 min durations (ES = 0.56 ± 0.79 to 0.59 ± 0.78), with moderate to large differences in time durations ≥2 min (ES = 0.82 ± 0.87 to 1.24 ± 0.93). The differences in high-speed relative distance were trivial to small (ES = 0.02 ± 0.51 to 0.39 ± 0.58) for inside and outside backs. Conclusion Relative distance was similar between playing levels and appears comparable to data from senior international rugby union match-play in previous studies. A greater amount of high-speed relative distance per minute is completed during U18 international matches compared with U18 academy matches. Coaches working with rugby players can use this information to appropriately overload the intensity of running, specific to time durations and positions.
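The maximum locomotor intensity described in this abstract is a peak rolling mean over a 10 Hz speed trace. A minimal sketch of that calculation, assuming a list of instantaneous speeds in m·s-1 (function and variable names are illustrative, not from the study):

```python
# Sketch: peak rolling-mean speed over a fixed window from a 10 Hz
# instantaneous speed trace (m/s), expressed as relative distance (m/min).
# Assumes a plain Python list of speeds; names here are illustrative.

def max_mean_speed(speeds_ms, window_s, hz=10):
    """Peak rolling-mean speed (m/s) over a `window_s`-second window."""
    n = int(window_s * hz)
    if len(speeds_ms) < n:
        raise ValueError("speed trace shorter than the requested window")
    # Maintain a running window sum so the scan is O(len(speeds_ms)).
    window_sum = sum(speeds_ms[:n])
    best = window_sum / n
    for i in range(n, len(speeds_ms)):
        window_sum += speeds_ms[i] - speeds_ms[i - n]
        best = max(best, window_sum / n)
    return best

def max_relative_distance(speeds_ms, window_s, hz=10):
    """Peak rolling mean converted to metres per minute (m/min)."""
    return max_mean_speed(speeds_ms, window_s, hz) * 60

# Example: a constant 3.0 m/s trace for 60 s yields 180 m/min.
trace = [3.0] * 600  # 600 samples at 10 Hz = 60 s
print(max_relative_distance(trace, 60))  # 180.0
```

The same function applied with a high-speed threshold filter (keeping only samples above 5.5 m·s-1) would yield the HSm·min-1 variant reported above.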
PURPOSE: To compare the external loads and external:internal load ratios (EL:IL) during match-play of adolescent collision sport athletes playing at both elite (i.e., academy) and sub-elite (i.e., school) standards. METHODS: Following ethics approval, seventeen elite adolescent male rugby union players (mean ± SD age = 17 ± 1 years) were recruited for this study. Global positioning system (GPS) locomotor (i.e., relative distance [RD; m·min-1], low speed activity [LSA; relative distance <61% maximum velocity [Vmax]], high speed running [HSR; relative distance ≥61% Vmax]), and accelerometer (relative PlayerLoadTM [RPL; AU·min-1], PLSLOW [relative accelerations <2 m·s-1], PLFAST [relative accelerations ≥2 m·s-1]) external loads, and session rating of perceived exertion (sRPE) internal load measures were obtained from 22 matches resulting in 86 match files (39 sub-elite and 47 elite match files; 5 ± 2 match files per subject). Perceptual wellbeing measures (i.e., fatigue, sleep quality, upper-body and lower-body soreness, stress, and mood) were also recorded using a 5-point Likert scale on the mornings pre- and post-match. Data were analysed using Cohen’s d effect sizes (d) and magnitude-based inferences. RESULTS: Differences in external loads were unclear between playing standards for RD (d = -0.2) and RPL (d = 0.0). However, subjects’ EL:IL were very likely lower during elite compared to sub-elite matches for both RD:sRPE (d = -1.0) and RPL:sRPE (d = -0.8), due to the very likely greater sRPE during elite matches (d = -1.1). There were unclear differences between sub-elite and elite matches for distribution of GPS and accelerometer variables (i.e., LSA, HSR, PLSLOW, and PLFAST; Table 1). Changes in total perceptual wellbeing were possibly greater following elite compared to sub-elite matches, with unclear differences for changes in sleep quality, lower-body soreness, stress and mood. 
However, changes in perceptual fatigue and upper-body soreness were both likely greater following elite matches. CONCLUSIONS: Adolescent rugby union players had similar locomotor and accelerometer external loads (i.e., RD, LSA, HSR, RPL, PLSLOW and PLFAST) during elite and sub-elite standard matches. However, EL:IL was substantially reduced in higher-standard matches, indicating a higher perception of effort for a given external load. Greater changes in perceptual fatigue and upper-body soreness following elite match-play may be related to a greater magnitude of collision-based activity at higher playing standards. This may be due to the increased body mass and running velocities of opponents at higher playing standards, which may not be fully accounted for using external load measures alone. PRACTICAL APPLICATION: As collision sports involve more than just movement and acceleration demands, the inclusion of subjective load measures (i.e., sRPE) or EL:IL may provide further insight into the true demands of training or match-play than GPS and accelerometer data alone.
Objectives: Due to the complex-systems nature of injuries, the responsibility for injury risk management cannot lie solely within a single domain of professional practice. Interdisciplinary collaboration between technical/tactical coaches, strength and conditioning coaches, team doctors, physical therapists and sport scientists is likely to have a meaningful impact on injury risk. This study describes the application and efficacy of a multidisciplinary approach to reducing team injury risk in professional rugby union. Design: Observational longitudinal cohort study. Methods: Epidemiological injury data were collected from a professional rugby union team for 5 consecutive seasons. Following each season, these data informed multidisciplinary intervention strategies to reduce injury risk. The effectiveness of these strategies was iteratively assessed to inform future interventions. Specific examples of intervention strategies are provided. Results: Overall team injury burden displayed a likely beneficial decrease (-8%; injury rate ratio (IRR) 0.9, 95%CI 0.9 to 1.0) from 2012 to 2016. This was achieved through a most likely beneficial improvement in non-contact injury burden (-39%; IRR 0.6, 95%CI 0.6 to 0.7). Contact injury burden increased over the same period, although to a lesser extent (+18%; IRR 1.2, 95%CI 1.1 to 1.3, most likely harmful). Conclusions: The range of skills required to effectively manage complex injury phenomena in professional collision sport crosses disciplinary boundaries. The evidence presented here points to the effectiveness of a multidisciplinary approach to reducing injury risk. This model is likely to be applicable across a range of team and individual sports.
Talent Identification
The increased growth and professionalism of women’s football have led to an increased investment in talent identification and development of players from a young age. Governing bodies are now investing in talent identification and development environments such as academies, which are starting for girls as young as 10 years, and which are aligned with similar academy structures that are in place in the boys’ and men’s pathway. Talent identification in women’s football has traditionally been based on viewing players in a trial game or training session environment, whereby the players aim to impress coaches. This approach is not informed by scientific evidence, but rather coaches’ subjective preconceived notion of the ideal player, which, when used in isolation, may result in repetitive misjudgements and limited consistency. However, in recent years there has been an increased amount of research exploring talent identification and development in the women’s game. In this chapter, an overview of existing research is provided, as well as case study examples of talent-identification challenges and recommendations for talent identification and development practices.
This mixed-methods study aimed to assess the agreement between coaches' rankings of youth rugby league players and objective physical performance data, and to gather coaches' subjective descriptions of their players' performance. Five hundred and eight male rugby league players (U16 n = 255, U18 n = 253) completed a fitness testing battery of anthropometric and physical performance measures. Subsequently, twenty-two rugby (n = 11) and strength and conditioning (S&C) (n = 11) coaches ranked each player's physical qualities using a 4-point Likert scale (1 - top 25%; 2 - 25-50%; 3 - 50-75%; and 4 - bottom 25%) and described their performance. U16 S&C coaches displayed fair agreement when assessing players' body mass (39.3%, κ = 0.20). U18 rugby coaches demonstrated fair agreement for strength and size (42.5%, κ = 0.23) and body mass (48.7%, κ = 0.31), whilst both U18 rugby and S&C coaches showed fair agreement for endurance (39.8%, κ = 0.25 and 44.3%, κ = 0.29, respectively). Three higher-order themes were identified from coaches' descriptions of players when evaluating performance, including physical, rugby and attitude characteristics. Overall, coaches could not accurately assess players' physical performance against fitness testing data, though findings suggest coaches adopt a multidimensional approach when evaluating players' performance. Practitioners within talent development systems should utilise both objective and subjective assessments when making decisions regarding players' performance.
The purpose of this study was to (a) provide comparative isometric midthigh pull (IMTP) force-time characteristics for elite youth soccer players and (b) determine the effect of age and maturation on IMTP force-time characteristics. Elite male youth soccer players (U12 n = 51; U13 n = 54; U14 n = 56; U15 n = 45; U16 n = 39; and U18 n = 48) across 3 maturity offset groups (Pre n = 117; circa n = 84; and Post–peak height velocity n = 92) performed 2 maximal IMTP trials on a portable force platform (1,000 Hz). Absolute and relative values for peak force (PF) and impulse over 100 and 300 ms were analyzed. A full Bayesian regression model was used to provide probable differences similar to that of a frequentist p value. Advanced age and maturation resulted in superior IMTP force-time characteristics. Peak force demonstrated high probabilities of a difference between all consecutive age groups (p > 0.95). For absolute and relative impulse (100 and 300 ms), only 2 consecutive age groups (U14–15s and U16–18s) demonstrated high probabilities of a difference (p > 0.95) with large effects (d = 0.59–0.93). There were high probable differences between all maturity offset groups for PF and impulse with medium to large effects (d = 0.56–3.80). These were reduced when expressed relative to body mass (relative PF and relative impulse). This study provides comparative IMTP force-time characteristics of elite male youth soccer players. Practitioners should consider individual maturation status when comparing players given the impact this has on force expression.
Fitness testing is common practice within youth athletes. However, the interpretation of fitness data often occurs within chronological annual-age categories, resulting in athletes being (dis)advantaged due to age or maturity discrepancies. Instead, evaluating fitness performance against rolling averages may be more appropriate. This article presents a novel method for analyzing fitness testing data in youth athletes using Z-scores according to rolling averages for both chronological age and maturity status. This analysis technique allows the dual ability to interpret youth fitness performance according to age and maturation, enhancing accuracy of data interpretation for talent identification, development and strength and conditioning programming.
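The rolling-average approach described above can be expressed as a Z-score computed against peers within a window of decimal age, rather than a fixed annual-age category. A minimal sketch under that assumption (the window width and data below are illustrative, not the article's method or dataset):

```python
# Sketch: fitness Z-score against a rolling average for decimal age,
# instead of a fixed annual-age band. Window width and all data here
# are illustrative assumptions.
from statistics import mean, stdev

def rolling_z(player_age, player_score, ages, scores, half_window=0.5):
    """Z-score of a player's test result versus peers whose decimal age
    falls within +/- half_window years of the player's age."""
    peers = [s for a, s in zip(ages, scores)
             if abs(a - player_age) <= half_window]
    if len(peers) < 2:
        raise ValueError("not enough peers in the age window")
    return (player_score - mean(peers)) / stdev(peers)

# Hypothetical squad: decimal ages and sprint-test scores.
ages = [13.8, 14.0, 14.2, 14.4, 15.5]
scores = [20.0, 22.0, 24.0, 26.0, 40.0]
z = rolling_z(14.1, 24.0, ages, scores)
print(round(z, 2))  # peers are the four players aged 13.8-14.4
```

The same function could be applied with maturity offset in place of decimal age, giving the dual chronological-age and maturity-status interpretation the article recommends.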
Introduction Understanding the physical demands of rugby union can assist coaches in the preparation of players. Match demands in senior players for domestic competitions (Cahill et al., 2013) and international games (Quarrie et al., 2013) are well established. However, despite adolescent rugby union players playing concurrently at various standards, no study has attempted to compare them. Therefore, the purpose of this study was to compare the physical demands of U18 school versus academy rugby union match-play. Methods A full season of games from the academy (6 games) was analysed and matched by six games from the school standard. Each player wore a microtechnology unit which contained a global positioning system and tri-axial accelerometer, in addition to a heart rate monitor. The players were split into forwards and backs, with only players who participated in the entire game included in the subsequent analysis (forwards: school [n=25], academy [n=21]; backs: school [n=25], academy [n=24]). All data were analysed using magnitude-based inferences. Institutional ethical approval was granted. Results Forwards: Total distance was almost certainly greater in academy forwards (5461 ± 360 vs. 4881 ± 388 m). Distance walking was unclear between the two groups, but jogging, striding and sprinting were almost certainly, very likely and likely greater, respectively, in academy forwards in comparison to school forwards. PlayerLoadTM slow was possibly greater for academy forwards, whilst mean and maximum heart rate were likely lower for academy forwards. Backs: Total distance was very likely greater in academy backs (5597 ± 383 vs. 5260 ± 441 m). Distance walking and sprinting were unclear. Distance jogging was almost certainly greater in the academy backs and striding was possibly greater. PlayerLoadTM slow was possibly greater in academy backs, whilst mean and maximum heart rate were unclear between the two groups. 
Discussion This study shows that academy rugby union provides forwards and backs with a greater physical demand than school players of the same position. The increase in PlayerLoadTM slow suggests an increase in static exertions for academy players. Future research should look to explore the interaction between physical and technical performances between different standards of adolescent rugby union.
This study aimed to investigate rugby league coaches’ perceptions of physical qualities for current and future performance, while also establishing the training practices of Under-16 and Under-19 players. Twenty-four practitioners (rugby coach, strength and conditioning coach) working within nine Super League clubs completed a questionnaire. The questionnaire required practitioners to rank eleven physical qualities (i.e., strength, power, acceleration, maximum speed, aerobic endurance, change of direction, agility, height, body mass, lean mass and fat mass) by importance for current performance, future performance and career longevity according to playing position (forwards, backs, hookers and halves). Practitioners were asked to provide detail on the frequency and duration of each type of training session completed during a typical week throughout each phase of the season: pre-season, in-season (early), in-season (mid), and in-season (late). Typically, practitioners ranked strength, power and acceleration qualities highest, and endurance and anthropometric qualities lowest. The importance of physical qualities varied according to playing level and position. Training practices of U16 and U19 players differed during each phase of the season, with U19 players undertaking greater training volumes than U16 players. Overall, the physical qualities coaches perceived as most important were not reflected within their training practices. Rugby league practitioners can use this information as a reference source to design long-term athletic development plans, prescribe training and inform player development procedures. Moreover, these data can inform and improve training practices while influencing the design of pre-season preparatory phases and in-season periods.
Poster presentation
The purpose of this study was to determine the relationship between matchday wellness status and a technical-tactical performance construct during rugby match-play. One hundred and thirty-three male rugby union players (73 forwards and 60 backs) from five under-18 national squads who participated in the under-18 Six Nations competition completed a subjective wellness questionnaire on each matchday morning. Players subjectively rated each item (sleep quality, fatigue, muscle soreness, stress and mood) on a five-point Likert scale to calculate their daily wellness status (i.e., the difference between matchday and baseline perceived wellness). Technical-tactical performance during match-play was quantified by coding individual key performance indicators (e.g., number of carries, number of tackles). Partial least squares correlation analysis (PLSCA) was employed to compute the latent variables of perceived wellness status (X matrix) and technical-tactical performance (Y matrix) for each player observation (n=271). The latent variables are a construct of each variable group, enabling higher dimensional data to be visualised more simply. Linear mixed-effect models were later conducted to assess the relationships between the latent variables. The effect of perceived wellness status on technical-tactical performance was statistically significant in forwards (p=0.042), not statistically significant in backs (p=0.120), and accounted for 4.9% and 1.9% of the variance in the technical-tactical performance construct, respectively. The findings of this study suggest that perceived wellness status can influence technical-tactical match performance, but the practical significance of these findings should be interpreted with caution given the amount of variance in technical-tactical performance accounted for by the models.
Academy rugby league competition is an important step along the pathway to professional status, but little is known about injury at this level of the game. The aim of this research was to establish the nature, incidence and burden of injury in English academy rugby league. Using an observational prospective cohort study design, and a time-loss injury definition, the injury outcomes of three professional rugby league academies were recorded during the 2017 season. A total of 87 injuries occurred in 59 matches for an overall injury incidence of 85 (95%CI 67-103) injuries per 1000 hours played. The mean severity of injury was 22 ± 19 days resulting in an overall injury burden of 1898 (95%CI 1813-1983) days lost per 1000 hours. The tackle event was the most common cause of injury (77% of all injuries). Forwards sustained a greater proportion of injuries than backs (forwards 67% vs. backs 33% of injuries). Concussion (13 (6-20) per 1000 hours) and ankle sprains (11 (4-17) per 1000 hours) were the most commonly diagnosed injuries. The shoulder joint was the most commonly injured site (17 (9-25) per 1000 hours). The incidence of injury for academy rugby league is similar to senior professional rugby league.
The whole-match demands of rugby union are well established; however, it is unclear how these vary during specific phases of play within a match. For example, the influence of phases of play (attacking or defending) on the movement and physical demands is yet to be quantified. Therefore, the aim of this study was to investigate the influence of attacking and defensive phases of play on the movement (e.g., running) and physical (e.g., accelerometer activity) demands for forwards and backs. With institutional ethics approval, 50 male academy rugby union players (age: 17.6 ± 0.6 years; stature: 183.0 ± 6.8 cm; body mass: 89.4 ± 10.9 kg) from one regional rugby union academy were tracked during match-play using microsensor technology (Optimeye S5, Catapult Innovations, Melbourne, Australia). A total of 260 observations were collected over 2 seasons (12 matches). Differences in maximum sprint velocity (Vmax), relative distance and PlayerLoadTM (PL.min-1) were assessed using magnitude-based inferences. The mean length of matches was 74.8 ± 3.3 min, whilst the mean amount of time the ball was in play was 27.4 ± 2.9 min. The mean amount of time spent attacking per match was lower than defending (12.7 ± 3.1 vs. 14.7 ± 2.5 min). There were a lower number of attacking phases (27 ± 9) compared to defensive phases (31 ± 10), whilst the mean phase was similar in length (26 ± 17 vs. 26 ± 18 s). The demands were almost certainly greater when defending compared to attacking for forwards: Vmax (3.3 ± 1.8 vs. 4.1 ± 1.5 m.s-1), relative distance (97.9 ± 53.7 vs. 121.8 ± 48.8 m.min-1) and PL.min-1 (10.6 ± 5.3 vs. 12.7 ± 4.6 AU.min-1). When defending, relative distance was very likely greater (101.6 ± 66.4 vs. 121.4 ± 60.9 m.min-1), and Vmax (3.7 ± 2.1 vs. 4.2 ± 1.8 m.s-1) and PL.min-1 (10.7 ± 7.6 vs. 12.4 ± 7.4 AU.min-1) were both likely greater compared to attacking for backs. 
The movement and physical demands were consistently greater when defending for both positional groups, although a smaller disparity between phases was observed for backs than forwards. This indicates backs have greater movement demands during attacking phases, which was also reflected in the higher Vmax. The greater PL.min-1 for the forwards during defending suggests a greater involvement in tackles and rucks. These data provide practitioners with reference data when replicating match-specific phases of play.
Positional differences in total weekly in-season training loads of elite schoolboy rugby union players
Adolescent rugby union players may participate with multiple teams concurrently, with elite players potentially participating with school, club, regional academy, county and national representative teams simultaneously. The accumulated workloads of these athletes may have a substantial impact on their athletic development, health and wellbeing, with excessive workloads related to both illness and injury (Gabbett et al., 2014, Sports Medicine, 7, 989-1003). Therefore, the aim of this study was to quantify the total accumulated weekly workloads of elite schoolboy rugby union players and examine the differences between playing positions. Following institutional ethics approval, twenty elite schoolboy rugby union players (age 17.4 ± 0.7 years) were recruited from an under-18 regional academy and categorised by playing position: forwards (n=10) and backs (n=10). Each participant was allocated a microtechnology unit (10 Hz global positioning system and accelerometer) to be worn during all rugby training to quantify external loads, whilst internal loads (session rating of perceived exertion (s-RPE)) for rugby, strength, and conditioning/other training modalities were reported daily using an online training questionnaire. Data were collected during the in-season over a 10-week period, with a total of 97 complete weekly observations (5 ± 3 weeks per participant). Mean weekly data were calculated for each participant to control for multiple and uneven observations. Differences between positions were analysed using Cohen’s d effect sizes (ES) and magnitude-based inferences. There were unclear differences between forwards and backs for total weekly training volumes (301 ± 107 vs. 301 ± 80 min) and s-RPE (1186 ± 380 vs. 1249 ± 365 AU). However, the total weekly external training loads were substantially different between positions (moderate-large) with backs recording likely-very likely greater total distance (13063 ± 3933 vs. 
10195 ± 2242 m), low-speed activity (12142 ± 3672 vs. 9694 ± 2215 m), high-speed running (807 ± 387 vs. 482 ± 174 m), very high-speed running (34 ± 51 vs. 5 ± 8 m), peak velocity (8.0 ± 0.3 vs. 7.1 ± 0.4 m·s-1) and PlayerLoad™ (1246 ± 345 vs. 1002 ± 279 AU). Elite schoolboy rugby union players are exposed to high in-season workloads, which are greater than those previously reported in professional adult players during both in-season (Bradley et al., 2015, European Journal of Sport Science, 15, 469-479), and pre-season (Bradley et al., 2014, Journal of Strength and Conditioning Research, 29, 534-544) training periods. The appropriateness of these high workloads needs to be questioned for optimal athletic development and player welfare.
Validity of an online daily training load questionnaire and weekly training diary for adolescent team sport athletes
In late specialisation team sports, adolescent athletes may participate with multiple teams concurrently, as they are not contracted to one particular organisation. Therefore, the monitoring of accumulated training loads can be challenging. In a recent study, manual collection of rating of perceived exertion (RPE) was shown to be robust between 5 minutes and 24-hours post-exercise (Christen et al., 2016, International Journal of Sports Physiology and Performance, doi: 10.1123/ijspp.2015-0438). If a self-reported measure of global training load such as session-rating of perceived exertion (sRPE; intensity x time (Foster et al., 2001, Journal of Strength and Conditioning Research, 15, 109-115)) were proved valid, it could provide a simple solution for accurate and reliable training load monitoring. Therefore, the purpose of this study was to assess the level of agreement between the criterion session-rating of perceived exertion (sRPE30min) and the practical measure of a remote web-based training load questionnaire 24-hours post-training (sRPE24h) in adolescent athletes. A secondary aim was to assess the level of agreement between weekly summated sRPE24h values (ƩsRPE24h) and a weekly web-based training diary (sRPEweekly) for all field-based training accumulated during a subsequent training week. Following institutional ethics approval, thirty-six male adolescent rugby union players (age 16.7 ± 0.5 years) were recruited from a regional academy. Criterion measures (sRPE30min) were recorded by the principal investigator 30 minutes after a field-based training session. Participants then completed the sRPE24h via a web-based training load questionnaire 24-hours post-training, reporting both session time and intensity. In addition, on a subsequent week, participants completed the sRPE24h daily and then completed the sRPEweekly at the end of the week to recall all training times and intensities over those 7-days. 
The agreement between sRPE30min and sRPE24h, as well as between ƩsRPE24h and sRPEweekly were assessed using mean percentage bias, typical error of the estimate (TEE), and Pearson correlation coefficients, all with 90% confidence limits. There were trivial biases between sRPE30min and sRPE24h for sRPE (0.3%; -0.9, 1.5), with small TEE (4.3%; 3.6, 5.4) and nearly-perfect correlations (0.99; 0.98, 0.99). There were trivial biases between ƩsRPE24h and sRPEweekly for sRPE (5.9%; -2.1, 14.2), with moderate TEE (28.5%; 23.3, 36.9) and very-large correlations (0.87; 0.78, 0.93). The results of this study show that the use of an online daily questionnaire is a valid and robust measure to remotely quantify training loads in adolescent athletes. However, the weekly training diary was found to have a substantial TEE, which would limit practical application.
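As a concrete illustration of the sRPE calculation (intensity x time; Foster et al., 2001) and the agreement statistics described above, a minimal sketch is given below. All values and function names are illustrative assumptions, not study data, and the simplified statistics omit the confidence limits and typical-error scaling used in the published analysis.

```python
# Hypothetical sketch of session-RPE (sRPE) training load and two of the
# agreement statistics used above (mean percentage bias, Pearson r).
# Values are invented for illustration only.
import math
import statistics

def srpe(rpe, duration_min):
    """sRPE training load (AU) = intensity (CR-10 RPE) x session time (min)."""
    return rpe * duration_min

def mean_pct_bias(criterion, practical):
    """Mean percentage bias of the practical measure vs. the criterion."""
    return statistics.mean(
        100 * (p - c) / c for c, p in zip(criterion, practical))

def pearson_r(x, y):
    """Pearson correlation coefficient."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x)
                    * sum((b - my) ** 2 for b in y))
    return num / den

# Criterion (30 min post-session) vs. practical (24 h online questionnaire)
srpe_30min = [srpe(r, t) for r, t in [(7, 60), (5, 45), (8, 75), (6, 50)]]
srpe_24h   = [srpe(r, t) for r, t in [(7, 60), (5, 50), (8, 70), (6, 50)]]

print(round(mean_pct_bias(srpe_30min, srpe_24h), 1))
print(round(pearson_r(srpe_30min, srpe_24h), 2))
```

With near-identical recalled times and intensities, the bias is trivial and the correlation near-perfect, mirroring the pattern reported for sRPE24h against sRPE30min.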
Do Training Exposures Prepare Adolescent Rugby Union Players for Match-Play?
Introduction The aim of the study was to assess the differences in external loads and movement demands between training and match-play in adolescent rugby union (RU) players. For mean training and match demands to be similar, athletes would have to be exposed to the equivalent loading of multiple matches within their training week, which may be impractical. An alternative analysis of comparing maximum demands instead of mean demands between conditions may offer a more appropriate comparison. Methods Sixty-one adolescent RU players (mean ± SD; age 17.0 ± 0.7 years) were recruited from four teams representing school and regional academy RU. Players were categorised into four independent groups based on playing standard and position; school forwards (SF; n=15), school backs (SB; n=15), academy forwards (AF; n=16) and academy backs (AB; n=15). Global positioning system and tri-axial accelerometer data were obtained from 61 match files during 6 matches, and 152 training files during 15 training sessions. Maximum data from individual training sessions that elicited the highest value for each variable were used in the comparison to match-play. This approach was used to account for variance in specific training objectives for respective sessions (i.e. collision or running focus), as mean data may dilute the magnitude of exposure of specific loads within the training week. Differences between training and match-play measures were analysed using paired t-tests, Cohen’s d effect sizes and 90% confidence intervals. Results Within the SF group, training was significantly greater than matches for total and relative high speed running (HSR [≥61% maximum velocity (Vmax)] and HSR·min-1), total and relative very high speed running (VHSR [>90% Vmax] and VHSR·min-1), and relative peak velocity (%Vmax). In contrast, in the SB group, training was significantly less than match-play for total distance (TD), %Vmax, VHSR and VHSR·min-1. 
Within the AF group, match-play was significantly greater than training for relative low speed activity (LSA·min-1 [<61% Vmax]). Finally, within the AB group, training was significantly greater than match-play for relative PlayerLoad™ (PL·min-1) and HSR·min-1. Discussion Players from a higher representative standard (i.e. academy) were exposed to external loads and movement demands during their training week comparable to the demands of match-play. However, within schools, due to positional differences during match-play, training sessions may under-prepare backs and over-prepare forwards. Coaches should ensure that training sessions reflect the standard- and position-specific demands of match-play to adequately prepare players for competition.
The purpose of the present study was to investigate the validity of an isometric mid-thigh pull dynamometer against a criterion measure (i.e., 1,000 Hz force platform) for assessing muscle strength in male youth athletes. Twenty-two male adolescent (age 15.3 ± 0.5 years) rugby league players performed four isometric mid-thigh pull efforts (i.e., two on the dynamometer and two on the force platform) separated by 5 minutes rest in a randomised and counterbalanced order. Mean bias, typical error of estimate (TEE) and Pearson correlation coefficient for peak force (PF) and peak force minus body weight (PFBW) from the force platform were validated against peak force from the dynamometer (DynoPF). When compared to PF and PFBW, mean bias (with 90% confidence limits) for DynoPF was very large (-32.4 [-34.2 to -30.6] %) and moderate (-10.0 [-12.8 to -7.2] %), respectively. The TEE was moderate for both PF (8.1 [6.3 to 11.2] %) and PFBW (8.9 [7.0 to 12.4] %). Correlations between DynoPF and PF (r = 0.90 [0.79 to 0.95]) and PFBW (r = 0.90 [0.80 to 0.95]) were nearly perfect. The isometric mid-thigh pull assessed using a dynamometer underestimated PF and PFBW obtained using a criterion force platform. However, strong correlations between the dynamometer and force platform suggest that a dynamometer provides an appropriate alternative to assess isometric mid-thigh pull strength when a force platform is not available. Therefore, practitioners can use an isometric mid-thigh pull dynamometer to assess strength in the field with youth athletes but should be aware that it underestimates peak force.
Playing rugby union matches causes a number of fatigue responses, including reduced lower body neuromuscular function (NMF) (commonly measured by countermovement jump (CMJ)) (1). The time course of this response following match play is well established in professional (2) and academy (3) level rugby union players, in whom NMF takes at least 60 hours to recover. No data exist for high school level rugby union players, but these players are often exposed to multi-game tournaments and festivals (2 games in 3 days, or 3 games in 5 days) within their competition structures. Aim. The aim of this case study is to document the NMF response to playing three rugby union matches within five days. This will provide useful information to practitioners who must manage fatigue and recovery of youth rugby union players who play multiple games within short time periods.
This study quantified the field-based external training loads of professional rugby league players using global positioning systems technology across a playing season. Eleven professional rugby league players were monitored during all field-based training activities during the 2014 Super League season. Training sessions undertaken in preseason (n = 211 observations), early (n = 194 observations), middle (n = 171 observations) and late (n = 206 observations) phases of the in-season were averaged for each player and used in the analyses. Large reductions in external training loads between preseason and in-season periods were observed. Within season, a decrease in intensity (relative distance, absolute and relative total-HSR) with a limited change in training duration was observed. These data provide a useful reference for coaches working with similar cohorts, while future research should quantify the adequacy of the training loads reported, considering impact on performance and injury.
Rugby league is a collision team sport played at junior and senior levels worldwide, whereby players require highly developed anthropometric and physical qualities (i.e., speed, change of direction speed, aerobic capacity, muscular strength and power). Within junior levels, professional clubs and national governing bodies implement talent identification and development programmes to support the development of youth (i.e., 13-20 years) rugby league players into professional athletes. This review presents and critically appraises the anthropometric and physical qualities of elite male youth rugby league players aged between 13 and 20 years by age category, playing standard and playing position. Height, body mass, body composition, linear speed, change of direction speed, aerobic capacity, muscular strength and power characteristics are presented and demonstrate that qualities develop with age and differentiate between playing standard and playing position. This highlights the importance of anthropometric and physical qualities for the identification and development of youth rugby league players. However, factors such as maturity status, variability in development, longitudinal monitoring and career attainment should be considered to help understand, identify and develop the physical qualities of youth players. Further extensive research is required into the anthropometric and physical qualities of youth rugby league players, specifically considering national standardized testing batteries, links between physical qualities and match performance, together with intervention studies, to inform the physical development of youth rugby league players for talent identification and development purposes.
Background: Despite its apparent popularity, participation in the sport of rugby union is accompanied by a significant risk of injury. Concerned parties have recently questioned whether this risk is acceptable within school populations. This is difficult to assess within the South African schools’ population as no recent longitudinal injury studies exist. Objectives: To determine the training habits, rugby-related exposure and injury risk within a population of South African high school first team rugby players. Methods: Training and match exposure in both school and provincial competition were examined and the resultant injuries were longitudinally observed for the duration of a South African high school rugby season. Results: Match (79, 95%CI 52-105 injuries/1000 h) and training (7, 95%CI 3-11 injuries/1000 h) injury incidences were demonstrated to be greater than previously reported incidences in similar populations in England and Ireland. Weeks where players were exposed to both school and provincial competition (34, 95%CI 19-49 injuries/1000 h) had significantly (p<0.05) greater injury incidences than during school competition alone (19, 95%CI 12-26 injuries/1000 h). Conclusions: The injury risk demonstrated was greater than expected and represents reasons for concern. Possible reasons for the high injury incidence recorded may be the frequency of games played within the season, and the overlap of school and provincial competitions. It should be noted that these results were taken from one school over one season and might not be representative of the incidence of school rugby injuries overall. However, this research demonstrates the need for a multi-school longitudinal study within South African schools rugby to determine the overall risk.
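The incidences above are expressed as injuries per 1000 player-hours with 95% confidence intervals. The abstract does not state the CI method used; one common normal approximation to the Poisson rate is sketched below, with invented exposure figures purely for illustration.

```python
# Hedged sketch: injury incidence per 1000 player-hours with an approximate
# 95% CI. The function name and example exposure are assumptions, not taken
# from the study.
import math

def injury_incidence(injuries, exposure_hours):
    """Injuries per 1000 player-hours, with an approximate 95% CI."""
    rate = injuries / exposure_hours * 1000
    se = rate / math.sqrt(injuries)  # normal approximation to the Poisson SE
    return rate, rate - 1.96 * se, rate + 1.96 * se

# e.g. 20 match injuries over 250 player-hours of match exposure (invented)
rate, lower, upper = injury_incidence(20, 250)
print(f"{rate:.0f} (95% CI {lower:.0f}-{upper:.0f}) injuries/1000 h")
```

Note how small injury counts widen the interval markedly, which is why the match and combined-competition incidences above carry such broad confidence limits.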
Data from global positioning system (GPS) technology are typically presented as the distances covered in specific locomotor categories (e.g., walking, jogging, striding, sprinting). Differences are found when categorisations are made using either pre-defined absolute thresholds or thresholds relative to maximum speed. However, there are two distinct methods of using relative speed thresholds currently employed in the literature, although no study has attempted to compare them. Therefore, the purpose of this study was to compare the differences in data when analysing the same GPS files relative to speed, using either a maximum velocity sprint (Vmax) or a maximum velocity achieved during match-play (Vpeak). Following institutional ethics approval, 99 GPS files were analysed from rugby union match-play and split between forwards (n=59) and backs (n=40). The male participants involved were part of a Regional Academy and had the following characteristics (age: 17.5 ± 0.7 years; stature: 183.6 ± 6.6 cm; body mass: 90.6 ± 10.6 kg). Vmax was established by players performing a maximum 40 m sprint, whilst Vpeak was defined as the maximum velocity achieved during each match. The locomotor categories were defined as walking 0-20%, jogging 20-50%, striding 50-80% and sprinting 80-100% (Duthie et al., 2006) of either Vmax or Vpeak. Data were analysed using magnitude-based inferences. The mean Vmax and Vpeak for all players were 8.7 ± 0.6 m·s-1 and 7.2 ± 0.9 m·s-1, respectively. There were almost certain differences in walking (2088 ± 298 vs. 1611 ± 435 m), striding (670 ± 244 vs. 1197 ± 375 m) and sprinting (28 ± 29 vs. 145 ± 73 m) between Vmax and Vpeak, for forwards. There was also a likely difference in jogging (2674 ± 313 vs. 2502 ± 301 m). Very likely differences were found for walking (2414 ± 288 vs. 2177 ± 347 m) and striding (708 ± 159 vs. 927 ± 347 m) for backs. There was also an almost certain difference in sprinting (66 ± 41 vs. 
151 ± 49 m) whilst an unclear difference was found for jogging (2409 ± 433 vs. 2338 ± 352 m) for backs. The use of relative thresholds using Vpeak seems to overestimate the distance covered in striding and sprinting whilst underestimating walking and jogging. Practitioners should look to use Vmax for relative speed thresholds as Vpeak from match-play is likely to change from match-to-match and consequently misrepresent the movement demands that players are exposed to.
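The locomotor categorisation described above (distances binned by percentage of maximum velocity; Duthie et al., 2006) can be sketched as follows. The 10 Hz sampling and the band boundaries are taken from the abstracts; the function name and speed trace are hypothetical.

```python
# Hedged sketch of binning GPS speed samples into relative locomotor bands.
# Bands follow Duthie et al. (2006): walking 0-20%, jogging 20-50%,
# striding 50-80%, sprinting 80-100% of maximum velocity.
BANDS = [("walking", 0.0, 0.20), ("jogging", 0.20, 0.50),
         ("striding", 0.50, 0.80), ("sprinting", 0.80, float("inf"))]

def categorise(speeds_ms, vmax, hz=10):
    """Sum distance (m) per locomotor band from GPS speed samples (m/s)."""
    dt = 1.0 / hz
    totals = {name: 0.0 for name, _, _ in BANDS}
    for v in speeds_ms:
        frac = v / vmax
        for name, lo, hi in BANDS:
            if lo <= frac < hi:
                totals[name] += v * dt  # distance = speed x sample interval
                break
    return totals

# Invented trace binned against a sprint-test Vmax (8.7 m/s) vs. a lower
# match-play Vpeak (7.2 m/s), the two reference methods compared above
trace = [1.2, 2.0, 3.5, 4.4, 6.0, 7.0, 1.0]
print(categorise(trace, vmax=8.7))
print(categorise(trace, vmax=7.2))
```

Running the same trace against the lower Vpeak shifts samples into higher bands, which illustrates why Vpeak-relative thresholds inflate striding and sprinting distances.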
Women's sport has seen substantial growth in recent years, with increased attention to athlete performance and welfare. To support the ongoing professionalisation of women's rugby, performance and wellbeing must be prioritised. This study used a three-round Delphi process to establish performance and wellbeing research priorities for Premiership Women's Rugby (PWR) in England. In Round 1, players and staff provided research priorities, which were grouped into higher-order categories and themes via content analysis. In Rounds 2 and 3, participants ranked higher-order categories on a 1-5 Likert scale. Consensus was defined as ≥70% agreement. Seventy-seven participants responded in Round 1 (47 and 43 in Rounds 2 and 3, respectively). Player and staff experience of playing or working in PWR was 5.0 (2.0-7.0) and 2.5 (2.0-4.0) years, respectively. Following Round 1, 321 research priorities were provided, from which 32 higher-order research priorities and 14 categories were identified within three themes: performance, wellbeing and injury. Following Round 3, nine research priorities reached consensus within performance (n = 1), wellbeing (n = 4) and injury (n = 4). The highest rated priority was ‘Investigate the impact of being a dual-career athlete on wellbeing, and any support mechanisms required’ (79%). Future research should prioritise studies which are feasible and currently lack a comprehensive evidence-base. This will enable researchers and governing bodies to address relevant knowledge gaps and inform ongoing performance and player safety initiatives. The research priorities identified in this study, by PWR players and staff, could be investigated to support the development of women's rugby domestically. These findings may also be applicable to other women's sports and leagues globally.
Problems and solutions in fitness testing in sport: a profiling tool for physical qualities
Talent identification
The ability of sporting organisations to identify and develop athletic talent into the sporting superstars of tomorrow has significantly grown and intensified in recent years. This chapter focuses on the talent identification step of this process. Talent identification is defined as ‘recognising participants with the potential at an earlier age to become elite performers in the future’, with multiple talent identification systems employed across the world. Talent identification processes can be informed by coach recommendations, training/competition observations and fitness assessments, although a collaborative approach is recommended. Although coaches believe they can identify talent, a multitude of research exploring the phenomenon has emerged over the last two decades. This chapter reviews some of this research, with a focus on the physical qualities and methods that may be used as talent identification tools. The chapter then presents numerous problems related to talent identification within young athletes including the performance vs potential debate, early identification = early specialisation, annual age grouping and maturity variability, and unidimensional and cross-sectional approaches to talent identification. Numerous potential solutions to help practitioners optimise their talent identification programmes in young athletes are then discussed.
Background and aims Within adolescent female rugby union, various effective injury prevention strategies are available to players to mitigate injury. However, little is known regarding the players’ attitudes, beliefs and behaviours towards those strategies, as well as injuries. The primary aim of the study was to investigate the attitudes, beliefs, behaviours and injury-reporting behaviours of adolescent female rugby players regarding injury and injury prevention strategies. The secondary aim was to examine associations between individual factors (e.g., player demographics) and injury-reporting behaviours. Methods Participants completed an online cross-sectional survey and were recruited from under-16 and under-18 rugby teams in schools/colleges, clubs and developing player pathway programmes in England. Results 1062 players were contacted to participate, 424 responded and 422 met the eligibility criteria; 79 participants had incomplete responses. 14% of players had not previously reported a suspected concussion to a coach/medical staff member, and 37% had not previously reported sustaining one or more musculoskeletal (MSK) injuries to a coach/medical staff member. Factors cited for non-disclosure of concussion and MSK injuries included not wanting to miss rugby sessions (43% and 39%) and not knowing that symptom(s) were related to an injury (11% and 17%). Players held positive attitudes, beliefs and behaviours towards injury and injury prevention, but their understanding of the effectiveness of protective equipment varied. Conclusion This study provides a greater understanding of adolescent female rugby players’ attitudes, beliefs and behaviours towards injury and injury prevention and aids in the development of effective injury prevention initiatives.
This PhD study is based on considering the sequential nature of match activity occurrences to understand the (physical) demand of games on players as well as uncover players’ behavioural movement patterns during competitive matches. It involves quantifying players’ external load (i.e., completed match activities) using movement patterns that provide information on how players accumulated external load, in contrast to existing physical and technical-tactical performance indicators. The quantification of external load helps with team and player improvement, training specificity and even talent identification and recruitment. Elite rugby league players were the participants of this PhD study because rugby league is a physically intense team sport in which activities happen quickly and within a stipulated timeframe. Existing player movement profiling frameworks were investigated, revealing the use of a sequential pattern mining algorithm to extract movement patterns. However, the algorithm used in these existing frameworks was found to have limitations: it identifies few movement patterns, providing only the longest common patterns within a cluster of movement sequences while discarding other interesting patterns. Furthermore, a review of sequential pattern mining algorithms revealed that none of the existing algorithms is suitable for player movement profiling. The state-of-the-art algorithm for extracting closed contiguous patterns (i.e., CCSpan) does not scale well on large datasets or on data with long sequences. Also, CCSpan lacks a parameter to define a maximal length of extracted patterns, which is useful as a sliding window. 
This PhD study's contributions to the body of knowledge include the development and optimisation of a novel pattern mining algorithm for extracting a user-defined length of frequent closed contiguous patterns, called LCCspm (l-length closed contiguous sequential pattern mining), among others. An intrinsic evaluation of the LCCspm algorithm was conducted in terms of speed and memory consumption. Results revealed it is four, seven or ten times faster than the state-of-the-art algorithm when tested on natural data in three different use cases. An extrinsic evaluation of the LCCspm algorithm revealed and validated that its movement patterns are best suited to profiling players into playing positions when compared with other obtainable types of movement patterns. Furthermore, this PhD study applied LCCspm and other artificial intelligence algorithms to discover signature movement patterns of elite rugby league players per playing position, identify (key) movement patterns as predictor variables for classifying players into playing positions, and later establish a set of movement performance indicators useful for assessing players’ performance variability across playing positions. In the broader scope of computing, this PhD study contributes the LCCspm algorithm as an advancement of pattern mining algorithms. The application of LCCspm can be extended to other sports domains and challenges, allowing for more accurate analysis and insights into player behaviour and team dynamics. Additionally, LCCspm can be applied in any field where the consecutiveness of items matters during the analysis of patterns.
Accurate quantification of energy intake is imperative in athletes; however, traditional dietary assessment tools are frequently inaccurate. Therefore, this study investigated the validity of a contemporary dietary assessment tool and wearable technology to determine the total energy intake (TEI) of professional young athletes. The TEI of eight professional young male rugby league players was determined by three methods: Snap-N-Send; SenseWear Armbands (SWA) combined with metabolic power; and doubly labelled water (DLW; intake-balance method; criterion) across a combined ten-day pre-season and seven-day in-season period. Changes in fasted body mass were recorded, alongside changes in body composition via isotopic dilution and a validated energy density equation. Energy intake was calculated via the intake-balance method. Snap-N-Send non-significantly over-reported pre-season and in-season energy intake by 0.21 (2.37) MJ·day-1 (p = 0.833) and 0.51 (1.73) MJ·day-1 (p = 0.464), respectively. This represented a trivial and small standardised mean bias, and very large and large typical error. SenseWear Armbands and metabolic power significantly under-reported pre-season and in-season TEI by 3.51 (2.42) MJ·day-1 (p = 0.017) and 2.18 (1.85) MJ·day-1 (p = 0.021), respectively. This represents a large and moderate standardised mean bias, and very large and very large typical error. There was a most likely larger daily error reported by SWA and metabolic power than Snap-N-Send across pre-season (3.30 (2.45) MJ·day-1; ES = 1.26 ± 0.68; p = 0.014) and in-season periods (1.67 (2.00) MJ·day-1; ES = 1.27 ± 0.70; p = 0.012). This study demonstrates the enhanced validity of Snap-N-Send for assessing athlete TEI over combined wearable technology, although caution is required when determining the individual TEIs of athletes via Snap-N-Send.
LCCspm: l-Length Closed Contiguous Sequential Patterns Mining Algorithm to Find Frequent Athlete Movement Patterns from GPS
The analysis of athletes’ spatiotemporal data provides actionable insights for strength and conditioning and customized training designs. The identification of unique movements and adjacent match events of team-sport athletes is important. It helps to understand the demands of a match and to advance training programs by improving training specificity. In this paper, we present a novel l-length Closed Contiguous sequential pattern mining (LCCspm) algorithm. To validate LCCspm, England Rugby Football League (RFL) Super League players’ movements and Fédération Internationale de Football Association (FIFA) 2018 football World Cup event datasets were used. The algorithm was compared with an existing algorithm (i.e., CCSpan). Empirically, the most frequently discovered closed contiguous patterns from the RFL data were 1-7 length movement patterns, while 10-40 length patterns were those discovered in the men’s FIFA 2018 World Cup data. This reflects the duration at which RFL and FIFA football match events usually occur and how data granularity influences results. LCCspm greatly outperforms CCSpan in terms of scalability, runtime and memory usage. The use of LCCspm instead of CCSpan for mining closed contiguous sequences, regardless of the length of patterns and size of the database, is recommended as it offers timely retrieval of patterns with less compute.
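To make the contiguity and length constraints concrete, here is a minimal, hypothetical illustration of the core idea: counting frequent contiguous subsequences of movement labels up to a user-defined maximum length l (the sliding window). This is not the published LCCspm implementation, which additionally enforces closedness and is optimised for scale; all names and data below are invented.

```python
# Hedged sketch: frequent *contiguous* subsequence mining with a maximum
# pattern length, the two constraints LCCspm is built around. Closedness
# (keeping only patterns with no equally frequent super-pattern) is omitted.
from collections import Counter

def frequent_contiguous(sequences, max_len, min_support):
    counts = Counter()
    for seq in sequences:
        seen = set()  # count each pattern once per sequence (support, not frequency)
        for length in range(1, max_len + 1):
            for i in range(len(seq) - length + 1):
                seen.add(tuple(seq[i:i + length]))
        counts.update(seen)
    return {p: s for p, s in counts.items() if s >= min_support}

# Movement labels per game, e.g. derived from GPS locomotor states (invented)
games = [["jog", "sprint", "jog", "walk"],
         ["jog", "sprint", "jog", "jog"],
         ["walk", "jog", "sprint", "jog"]]
patterns = frequent_contiguous(games, max_len=3, min_support=3)
print(patterns)
```

Contiguity matters here: "jog, sprint, jog" only counts when those labels occur back-to-back, which is what distinguishes this family of algorithms from general (gapped) sequential pattern mining.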
This study assessed the potential physiological and perceptual drivers of fluid intake (FI) and thirst sensation (TS) during intermittent exercise. Ten male rugby players (17 ± 1 years, stature: 179.1 ± 4.2 cm, body mass (BM): 81.9 ± 8.1 kg) participated in 6x6 min small-sided games, interspersed with 2 min rest, where FI was ad libitum during rest periods. Pre and post measurements of BM, subjective ratings (thirst, thermal comfort, thermal sensation, mouth dryness), plasma osmolality (POsm), serum sodium concentration (S[Na+]), haematocrit and haemoglobin (to calculate plasma volume change; PV) were taken. FI was measured during rest periods. BM change was -0.17 ± 0.59% and FI was 0.88 ± 0.38 L. Pre to post, POsm decreased (-3.1 ± 2.3 mOsm·kg−1; p = 0.002) and S[Na+] remained similar (-0.3 ± 0.7 mmol·L-1, p = 0.193). ∆PV was 5.84 ± 3.65%. FI displayed a relationship with pre POsm (r = -0.640, p = 0.046), pre thermal comfort (r = 0.651; p = 0.041), ∆S[Na+] (r = 0.816, p = 0.004), and ∆PV (r = 0.740; p = 0.014). ∆TS displayed a relationship with pre mouth dryness (r = 0.861, p = 0.006) and ∆mouth dryness (r = 0.878, p = 0.004). Yet only a weak positive relationship between ∆TS and FI was observed (r = 0.085, p = 0.841). These data, observed at an ambient temperature of 13.6 ± 0.9 °C, suggest that team sport athletes drink in excess of fluid homeostasis requirements and TS in cool conditions; however, this was not influenced by thermal discomfort.
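The abstract derives plasma volume change (∆PV) from haematocrit and haemoglobin but does not name the equation; a widely used approach for this is Dill and Costill (1974), sketched below with illustrative values that are not study data.

```python
# Hedged sketch of percent plasma volume change from pre/post haemoglobin
# (Hb, g/dL) and haematocrit (Hct, as a fraction), following the commonly
# used Dill & Costill (1974) method. Example values are invented.

def plasma_volume_change(hb_pre, hb_post, hct_pre, hct_post):
    """%ΔPV = 100 * (Hb_pre / Hb_post) * ((1 - Hct_post) / (1 - Hct_pre)) - 100"""
    return 100.0 * (hb_pre / hb_post) * ((1 - hct_post) / (1 - hct_pre)) - 100.0

# Illustrative pre/post values: a fall in Hb and Hct implies haemodilution,
# i.e. an expansion of plasma volume (positive ΔPV)
print(round(plasma_volume_change(15.0, 14.5, 0.45, 0.43), 2))
```

A positive result, as reported above (∆PV = 5.84 ± 3.65%), indicates plasma volume expanded from pre to post, consistent with the players drinking beyond fluid-balance requirements.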
OBJECTIVES: Concussion is a common injury in rugby union ('rugby') and yet its diagnosis is reliant on clinical judgment. Oculomotor testing could provide an objective measure to assist with concussion diagnosis. NeuroFlex® evaluates oculomotor function using a virtual-reality headset. This study examined differences in NeuroFlex® performance in clinician-diagnosed concussed and not concussed elite male rugby players over three seasons. METHODS: NeuroFlex® testing was completed alongside 140 head injury assessments (HIAs) in 122 players. The HIA is used for suspected concussion events. Of these 140 HIAs, 100 were eventually diagnosed as concussed, 38 were not concussed (2 were unclear). Eight of the 61 NeuroFlex® metrics were analysed as they were comparable at all time points. These eight metrics, from three oculomotor domains (vestibulo-ocular reflex, smooth pursuit and saccades), were tested for their ability to distinguish between concussed and not concussed players using mean differences/odds ratios and corresponding 95% confidence intervals (CIs). General and generalised linear mixed models, accounting for baseline test performance, were used to determine any meaningful differences in concussed and not concussed players. The diagnostic accuracy of these differences was provided by the area under the receiver operating characteristic curve (AUC). RESULTS: Only one of the eight metrics (number of saccades, smooth pursuit domain) had clear differences in performance between concussed and not concussed players at the HIA during the match (odds ratio: 0.76, 95%CI: 0.54-0.98) and after 48 hours (0.74, 95%CI: 0.52-0.96). However, the direction of this difference was contrary to clinical expectations (concussed performed better than not concussed) and the AUC for this outcome was also poor (0.52). CONCLUSION: NeuroFlex® was unable to distinguish between concussed and not concussed players in this elite male cohort. 
Future research could study other cohorts, later time points before return to play, and the tool's role in rehabilitation.
Re: The Integration of Internal and External Training Load Metrics in Hurling – interpretation beyond a significant relationship required
Identifying Contextual Influences on Training Load: An Example in Professional Rugby Union
We aimed to investigate the contextual factors influencing training load (TL), as determined by session rating of perceived exertion (sRPE-TL), accumulated within a match-to-match microcycle in rugby union players. sRPE-TL data were collected daily from 35 professional rugby union players from the same team in the English Championship over the course of an in-season period. Players were split by positional groups (backs and forwards) and sRPE-TL data were categorized as: field-based on-feet sRPE-TL (sRPEField-TL), gym-based sRPE-TL (sRPEGym-TL), and the total summation of both (sRPETotal-TL). Three 2-level linear mixed models were built for each dependent variable in each positional group, with magnitude-based inferences applied. Long between-match recovery cycles (≥7 days) resulted in very likely to almost certainly small to moderate increases in sRPE-TL for all modalities and positions (fixed effect [mean range] = 28.5%–42.0%), apart from sRPEField-TL for forwards. For backs, there was a very likely small decrease in sRPEField-TL as the season progressed (−16.7% per trimester). Losing the last league match was associated with very likely and almost certainly small decreases in sRPETotal-TL and sRPEGym-TL for backs (−20.7% and −36.4%, respectively). Losing the last match in any competition resulted in a very likely small increase in sRPEField-TL (21.2%) and a possibly small decrease in sRPEGym-TL (−18.5%) for backs—with a likely smaller sRPEGym-TL for forwards (−33.4%). The strength of the upcoming opposition had no effect on sRPE-TL. Our findings highlight some of the multifactorial contextual factors that must be considered when planning and evaluating training microcycles.
The aim of this study was to investigate the differences and long-term reliability in perceptual, metabolic, and neuromuscular responses to velocity loss resistance training protocols. Using a repeated, counterbalanced, crossover design, twelve team-sport athletes completed five sets of barbell back-squats at a load corresponding to a mean concentric velocity of ~0.70 m·s-1. On different days, repetitions were performed until a 10%, 20% or 30% velocity loss was attained, with outcome measures collected after each set. Sessions were repeated after four weeks. There were substantial between-protocol differences in post-set differential ratings of perceived exertion (dRPE, i.e., breathlessness and leg muscles, AU) and blood lactate concentration (B[La], mmol·L-1), such that 30%>20%>10% by small to large magnitudes. Differences in post-set countermovement jump (CMJ) variables were small for most variables, such that 30%<20%<10%. Standard deviations representing four-week variability of post-set responses to each protocol were: dRPE, 8-11; B[La], 0.8-1.0; CMJ height, 1.6-2.0; CMJ PPO, 1.0-1.8; CMJ PCV, 0.04-0.06; CMJ 100ms-Impulse, 5.7-11.9. Velocity loss thresholds control the magnitude of perceptual, metabolic, and neuromuscular responses to resistance training. For practitioners wanting to reliably prescribe training that can induce a given perceptual, metabolic, or neuromuscular response, it is strongly advised that velocity-based thresholds are implemented.
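The stopping rule that underpins velocity-loss prescription can be illustrated with a minimal sketch. This is a hypothetical helper, not code from the study, and it assumes velocity loss is computed relative to the fastest repetition of the set:

```python
def reps_until_velocity_loss(rep_velocities, loss_threshold=0.20):
    """Count repetitions completed before a velocity-loss threshold is exceeded.

    The set terminates once mean concentric velocity drops more than
    `loss_threshold` (a fraction, e.g. 0.20 for a 20% loss) below the
    fastest repetition recorded so far in the set.
    """
    best = 0.0
    completed = 0
    for velocity in rep_velocities:
        best = max(best, velocity)
        if velocity < best * (1.0 - loss_threshold):
            break  # threshold exceeded: terminate the set here
        completed += 1
    return completed
```

For example, with repetition velocities of 0.70, 0.72, 0.68, 0.60 and 0.55 m·s-1, a 20% threshold (cut-off 0.576 m·s-1) would end the set after the fourth repetition, whereas a stricter 10% threshold would end it after the third.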
Purpose: This study investigated whether providing Global Positioning Systems feedback to players in between bouts of small-sided games (SSGs) can alter locomotor, physiological, and perceptual responses. Methods: Using a reverse counterbalanced design, twenty male university rugby players received either feedback or no-feedback during ‘off-side’ touch rugby SSGs. Eight 5v5, 6 x 4 minute SSGs were played over four days. Teams were assigned to a feedback or no feedback condition (control) each day, with feedback provided during the 2 minute between bout rest interval. Locomotor, heart rate, and differential rating of perceived exertion (dRPE) of breathlessness and leg muscle exertion were measured and analysed using a linear mixed model. Outcomes were reported using effect sizes (ES) and 90% confidence intervals, and then interpreted via magnitude-based decisions. Results: Very likely trivial to unclear differences at all time points were observed in heart rate and dRPE measures. Possibly to very likely trivial effects were observed between-conditions, including total distance (ES= 0.15 [-0.03, 0.34]), high-speed distance (ES= -0.07 [-0.27, 0.13]), and maximal sprint speed (ES= 0.11 [-0.11, 0.34]). All within-bout comparisons showed very likely to unclear differences, apart from possible increases in low-speed distance in bout 2 (ES= 0.23 [0.01, 0.46]) and maximal sprint speed in bout 4 (ES= 0.21 [-0.04, 0.45]). Conclusions: In this study, verbal feedback did not alter locomotor, physiological, or perceptual responses in rugby players during SSGs. This may be due to contextual factors (e.g., opposition) or due to the type (i.e., distance) or low frequency of feedback provided.
The Effect of Rugby Union Match Play on Sleep Patterns and Subsequent Impact on Postmatch Fatigue Responses
Purpose: Sleep is recognized as an important recovery strategy, yet little is known regarding its impact on postmatch fatigue. The aims of this study were to (1) describe sleep and postmatch fatigue, (2) understand how sleep is affected by contextual and match factors, and (3) assess how changes in sleep can affect postmatch fatigue. Methods: Twenty-three male rugby union players were monitored across 1 season (N = 71 player–match observations). Actigraphy was used during preseason to establish baseline sleep quality and quantity. Sleep was then measured 1 and 2 days after each match day (MD + 1 and MD + 2). Global positioning systems, notational analysis, and rating of perceived exertion represented external and internal load from matches. Subjective wellness and a standardized run were used to characterize postmatch fatigue 2 days prior (baseline) and at MD + 1 and MD + 2. Linear mixed models established the magnitude of change (effect size [ES]) between baseline, MD + 1, and MD + 2 for sleep and postmatch fatigue. Stepwise forward selection analysis ascertained the effect of match load on sleep and the effect of sleep on postmatch fatigue. Each analysis was combined with magnitude-based decisions. Results: Sleep characteristics and neuromuscular and perceptual postmatch fatigue were negatively affected at MD + 1 and MD + 2 (ES = small to very large). Kickoff and travel time had the greatest effect on sleep (ES = small). Wellness and soreness were influenced by sleep (fall-asleep time and fragmentation index) and collisions, respectively (ES = small). Conclusion: Sleep quality and quantity were affected independently of the match load (ie, running activity) sustained, and changes in sleep marginally affected postmatch fatigue.
The purpose of this study was to investigate the influence of maturity status on the physical characteristics of youth female soccer players. One hundred fifty-seven players from 3 elite soccer academies in England completed assessments of anthropometry, strength (isometric midthigh pull), lower-body power (countermovement jump [CMJ]), aerobic capacity (Yo-Yo intermittent recovery test level 1), change of direction (CoD: 505-left/right), and speed (10 and 30 m). Each player was classified into 1 of 6 maturity groups based on their estimated years from peak height velocity (YPHV). Magnitude-based inferences were used to assess for the practical significance between consecutive groups. Speed, CoD time, CMJ, and aerobic capacity were all possibly to most likely better in more mature players. However, there was a likely difference in relative peak force between maturity groups −0.5 YPHV (27.13 ± 4.24 N·kg−1) and 0.5 YPHV (24.62 ± 3.70 N·kg−1), which was associated with a likely difference in 10-m sprint time (−0.5 YPHV: 2.00 ± 0.12 vs. 0.5 YPHV: 2.08 ± 0.16 seconds) and unclear changes in CMJ and CoD time. Findings provide novel comparative data for this cohort relative to maturity status and can be used by strength and conditioning coaches to inform the design of training programs for youth female soccer players. Strength and conditioning coaches should be aware that youth female soccer players may experience a decrease in relative strength around peak height velocity, which may impact upon the speed, CoD time, and CMJ of players.
On-field spacing has been linked to successful performance in a number of sports; to date, there is limited research investigating this within rugby league. This study aims to (a) quantify the defensive dispersal during rugby league match-play and (b) identify if contextual factors are associated with the dispersal. Global Positioning System data were analysed from 47 European Super League matches (1598 player files). Defensive dispersal was calculated for 1959 defensive sets of rugby league. Linear mixed models were used to analyse the effects of contextual factors on the average defensive dispersal per set when accounting for team and fixture. On-field position and match half were found to significantly affect defensive dispersal. However, set length, play-the-ball length, and final score difference were found to have minimal impact on defensive dispersal. This study demonstrates that defensive dispersal in rugby league can be measured using GPS data and may be strongly influenced by on-field positioning. As such, it quantifies an important element of tactical preparation for rugby league teams.
This study investigated differences in external training load between microcycle lengths and its variation between microcycles, players, and head coaches. Commonly used external training load variables, including total-, high-speed- (5-7 m∙s-1), and sprint-distance (> 7 m∙s-1), alongside combined high acceleration and deceleration distance (> 2 m∙s-2), each also expressed relative to time, were collected using microtechnology within a repeated measures design from 54 male rugby league players from one Super League team over four seasons. 4337 individual observations across ninety-one separate microcycles and six individual microcycle lengths (5 to 10 days) were included. Linear mixed effects models established the differences in training load between microcycle lengths and the variation between microcycles, players and head coaches. The largest magnitude of difference in training load was seen when comparing 5-day with 9-day (ES = 0.31 to 0.53) and 10-day (ES = 0.19 to 0.66) microcycles. The greatest number of differences between microcycles were observed in high- (ES = 0.3 to 0.53) and sprint-speed (ES = 0.2 to 0.42) variables. Between-microcycle variability ranged from 11% to 35% depending on the training load variable. Training load also varied between players (5-65%) and head coaches (6-20%), with most variability existing within high-speed (19-43%) and sprinting (19-65%) variables. Overall, differences in training load between microcycle lengths exist, likely due to manipulation of session duration. Furthermore, training load varies between microcycles, players and head coaches.
Objective: This study aimed to investigate the prevalence of self-reported shoulder dysfunction, using the Rugby Shoulder Score (RSS) reported in arbitrary units (AU), in rugby players available for match selection (uninjured). Design: Cross-sectional survey. Methods: Paper survey at the mid-point of the season of uninjured players (n = 86 males, mean age (±SD): 26 ± 6.9 y, from 8 squads: professional, n = 34; amateur, n = 52), using the RSS, subjective impact on rugby performance and previous shoulder injury, analysed using a Mann-Whitney U test. Results: 55% of players reported a level of RSS dysfunction despite being uninjured. Players who also reported their shoulder was impacting on performance had a significantly higher median RSS (61, IQR 28 AU, p = 0.02) than those who reported no impact on performance (40, IQR 22 AU). Conclusions: Findings from this study show that over half of players were playing with a level of self-reported shoulder dysfunction. This figure is higher in the professional game, for those with a history of previous injury and for forwards.
Objectives: Identify tackle characteristics associated with concussions in male professional rugby league. Design: Case-control study. Methods: Tackles resulting in 196 clinically diagnosed concussions and 6592 non-concussive tackles were analysed, from the men's rugby league Super League between 2018 and 2022. Eleven tackle characteristics were coded for each tackle, and Firth penalised logistic regression models were employed to identify influential variables through forward stepwise selection. Three multivariate models were produced; all (i.e., ball-carrier and tackler), tackler, and ball-carrier concussions. Results: Of the 196 concussions, 70 % occurred to the tackler and 30 % to the ball-carrier. Initial impact location on the ball-carrier was identified as a predictor in all models, specifically the shorts, upper- and lower-leg (OR 9.1–12.3, compared to shoulder) for tacklers and head/neck (OR 66.1, compared to shoulder) for ball-carriers. Tackler head placement in front of the ball-carrier (OR 8.5, compared to away from the body) and a ball-carrier leading arm in any position (OR 4.8–22.1, compared to no leading arm) provided the greatest odds of a tackler concussion. For player's body position the greatest risk of concussion for all players was observed when both players were falling/diving (OR 8.8, compared to both players upright). One (OR 4.9, compared to two) and four (OR 3.7, compared to two) defender tackles provide the greatest odds for all concussions. Conclusions: Concussion prevention strategies should aim to reduce head impacts by deterring initial contact with the ball-carrier's head/neck. Tackle technique should prioritise making initial impact with the torso and avoid the head being in front of the ball-carrier and any leading arms.
Background Body mass and composition (fat and fat-free mass) manipulation is a common practice in sport, yet it can pose significant risks to athlete health and wellbeing. Practitioners must continually adapt to the growing body of evidence to implement safe, effective and context-specific practice. Objective This scoping review aimed to summarise dietary recommendations for altering body mass or composition in male and female, adult non-disabled athletes and appraise how these expert-group led recommendations have evolved over time. Methods Electronic databases, including SCOPUS, PubMed, SPORTDiscus, CINAHL Complete and APA PsycINFO were searched (last search 2 August 2024) without date restrictions. Papers were included if they provided dietary recommendations for altering body mass or composition in adult non-disabled athlete populations and were published by an expert organisation. Results From 6068 records screened, 73 documents were included, comprising 45 consensus statements, 27 position stands and 1 practice guideline, endorsed by 14 organisations and developed by 328 experts from 25 countries. Athletics (n = 19), aquatics (n = 7) and team sports (n = 5) were the most represented, leaving many sports underrepresented. A total of 50 documents were standalone rather than part of an updated series. Only 40 papers addressed specific targets, rates or timing of outcome changes. Individualised, realistic and health-focussed targets were recommended, aligned with the athlete’s sport, position, sex, age and competition phase, with gradual changes (e.g. 0.5–1.0 kg/week fat loss) to enhance performance. Common strategies for altering body mass and composition included creating an energy surplus (500–1000 kcal/day) or deficit (250–1000 kcal/day), maintaining energy availability above 30 kcal/kg fat-free mass/day, and periodising carbohydrate intake (3–12 g/kg/day) on the basis of training demands. 
Protein intake (1.6–2.4 g/kg/day) was recommended across 4–6 feeds from high-quality sources, alongside targeted supplements such as creatine, whey protein and a multi-vitamin and mineral. Recommendations focussed minimal attention on nutrients such as fats, fibre or micronutrients, and the language used was often vague, leaving significant room for interpretation. Conclusions Developing sport-specific, behaviourally anchored and regularly updated dietary recommendations, informed by athlete and multidisciplinary team input, is recommended. This approach would provide actionable, athlete-centred strategies that effectively support body composition goals whilst prioritising health, wellbeing and performance. OSF Registration https://doi.org/10.17605/OSF.IO/B4YJT
This study evaluated the anthropometric, speed and endurance characteristics of English academy soccer players, comparing players who obtained a ‘professional’ contract at 18 years old with those who did not (‘academy’). A total of 443 male academy soccer players from an English professional club undertook anthropometric (height and body mass), speed (10 and 20 m sprint) and endurance (Yo-Yo intermittent endurance test level 2 [Yo-Yo]) assessments between 2005 and 2012. Significant improvements with age were found for speed and endurance at each annual age group up until the U18 age category. Significant differences were only observed between ‘professional’ and ‘academy’ players for 10 m (p = 0.003, η2 = 0.01) and 20 m (p = 0.001, η2 = 0.01) speed at U16 and U18 and Yo-Yo performance (p = 0.001, η2 = 0.12) at the U18 age category. Practitioners should use speed and endurance assessments for monitoring physical development of players rather than for talent identification purposes.
Objective This systematic review and meta-analysis aimed to determine differences in body composition between playing standard and age in male rugby union and rugby league athletes. Eligibility criteria The MOOSE (Meta-analysis of Observational Studies in Epidemiology) guidelines for design, implementation, and reporting were followed. Studies were required to be in male rugby union or league and have body composition as the primary or secondary outcome. Data were required to be presented separately for positional groups and body composition presented as whole-body. Data sources PubMed, Cochrane Library, MEDLINE, SPORTDiscus, and CINAHL via EBSCOhost. Risk of bias The methodological quality of the included studies was evaluated using a modified assessment scale. Results 58 studies were included for meta-analysis. Results highlighted significantly higher fat-free mass in senior elite than senior sub-elite or junior elite athletes for all RU and RL forwards. Small and non-significant differences were found in fat mass between rugby union playing standards and age categories. Rugby league senior elite forwards had less fat mass than junior elite forwards. Conclusions Practitioners should prioritise training and nutritional strategies that maximise fat-free mass development, especially in junior elite cohorts.
The purpose of this study was to investigate the validity of global positioning system (GPS) and micro-electrical-mechanical-system (MEMS) data generated in real-time via a dedicated receiver. Post-session data acted as criterion as it is used to plan the volume and intensity of future training and is downloaded directly from the device. 25 professional rugby league players completed two training sessions wearing a MEMS device (Catapult S5, firmware version: 2.27). During sessions, real-time data was collected via the manufacturer receiver and dedicated software (Openfield v1.14) which was positioned outdoors at the same location for every session. GPS variables included total-, low- (0 to 3 m∙s-1), moderate- (3.1 to 5 m∙s-1), high- (5.1 to 7 m∙s-1) and very-high-speed (> 7.1 m∙s-1) distances. MEMS data included total session PlayerLoad™. When compared to post-session data, mean bias for total-, low-, moderate-, high- and very-high-speed distances were all trivial, with the typical error of the estimate (TEE) small, small, trivial, trivial and small respectively. Pearson correlation coefficients for total-, low-, moderate-, high- and very-high-speed distances were nearly perfect, nearly perfect, perfect, perfect and nearly perfect respectively. For PlayerLoad™, mean bias was trivial whilst TEE was moderate and correlation nearly perfect. Practitioners should be confident that when interpreting real-time speed-derived metrics, the data generated in real-time is comparable to that downloaded directly from the device post-session. However, practitioners should refrain from interpreting accelerometer derived data (i.e. PlayerLoad™) or acknowledge the moderate error associated with this real-time measure.
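The agreement statistics reported in this validity study (mean bias, typical error of the estimate and Pearson correlation) can be sketched as follows. This is an illustrative implementation, not the authors' analysis code; `criterion` stands in for the post-session download and `practical` for the real-time feed:

```python
import numpy as np

def agreement_stats(criterion, practical):
    """Mean bias, typical error of the estimate (TEE) and Pearson r
    between a criterion measure and a practical measure."""
    x = np.asarray(criterion, dtype=float)
    y = np.asarray(practical, dtype=float)
    bias = float(np.mean(y - x))                    # systematic offset
    r = float(np.corrcoef(x, y)[0, 1])              # Pearson correlation
    # TEE: standard error of the estimate from regressing criterion on practical
    slope, intercept = np.polyfit(y, x, 1)
    residuals = x - (slope * y + intercept)
    tee = float(np.std(residuals, ddof=2))          # ddof=2: slope and intercept fitted
    return bias, tee, r
```

In practice each metric (e.g. total distance per session) would be paired across the two collection modes, and the bias, TEE and r interpreted against magnitude thresholds as in the abstract.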
Rugby union needs a contact skill-training programme
INTRODUCTION: For the optimal development of youth rugby league players, knowledge of the match demands across the different levels is required. The peak demands of game play can be termed the ‘worst case scenario’ (WCS). Quantification of the WCS aids in the prescription of appropriate training drills. This study aimed to quantify, and compare, the absolute and WCS running demands of rugby league match-play between professional club and international youth levels.
Background Rugby union match demands are complex, requiring the development of multiple physical qualities concurrently. Quantifying the physical qualities of age grade rugby union players is vital for practitioners to support athlete preparation and long-term development. Aim This systematic review aimed to identify the methods used to quantify the physical qualities of male age grade (≤ Under-20) rugby union players, present the normative values for physical qualities, and compare physical qualities between age grades and positions. Methods Electronic databases were systematically reviewed from the earliest record to November 2019 using key words relating to sex, age, sport and physical testing. Results Forty-two studies evaluated the physical qualities of age grade rugby union players. Seventy-five tests were used to quantify body composition, muscular strength, muscular power, linear speed, change of direction ability, aerobic capacity and anaerobic endurance. Thirty-one studies met the eligibility criteria to present the physical qualities. Physical qualities differentiate between age groups below Under-16, while differences in older age groups (Under-16 to Under-20) are not clear. Positional differences are present with forwards possessing greater height, body mass, body fat percentage and strength while backs are faster and have greater aerobic capacities. Conclusions A wide variety of tests are used to assess physical qualities limiting between study comparisons. Although differences in older age grades are unclear, older age groups (Under-19-20) generally performed better in physical tests. Positional differences are associated with match demands where forwards are exposed to less running but a greater number of collisions. Practitioners can use the results from this review to evaluate the physical qualities of age grade rugby union players to enhance training prescription, goal setting and player development. 
Future research should consider the use of national standardised testing batteries due to the inconsistency in testing methods and small samples limiting the reporting of positional differences.
Youth Rugby
Youth Rugby provides a summary of the latest and most up-to-date research evidence in relation to developing the youth rugby player. The book provides an overview of the latest scientific research for key topics related to the youth rugby player across the codes of rugby (union, league and sevens; mainly league and union in youth players), whilst also summarising the quality of the evidence available and the limitations of this research and highlighting key future research directions. The book covers a range of fundamental scientific topics relating to paediatric exercise science, human physiology, youth athletic development and high-performance sport. Each author is an experienced researcher within their respective discipline related to the youth rugby player. The book includes chapters on:
• Long-term athletic development, growth and maturation, talent identification and the physical demands of youth rugby training and match-play.
• Physical characteristics and the current evidence behind training methods to promote desired physical qualities.
• Fatigue and recovery, the tackle, psychosocial development, nutrition and injury prevalence and prevention.
This text is essential reading for all scientists, students and applied researchers wanting to develop world-class, evidence-based programmes for their youth athletes.
Profiling Physical Qualities in Youth Rugby
The profiling (including measurement, analysis and evaluation) of the physical qualities of youth rugby players is vital for practitioners to support athlete preparation and long-term development. This chapter summarises the relevant research detailing how physical qualities are measured within youth rugby players. This includes a summary of methods that assess anthropometry and body composition, strength, power, speed, agility and change of direction, aerobic and anaerobic capacity, and movement skills. The second part of the chapter presents the practical application of a national standardised fitness assessment for 14–19-year-old rugby league players. This section highlights how the national standardised fitness assessment was (1) designed, (2) conducted, and (3) analysed, reported and evaluated using the ProPQ (Profiling Physical Qualities) Tool. The ProPQ Tool provides an interactive data analysis tool that can be used by academy managers, coaches and sport science staff for enhancing the physical development of rugby players. The chapter concludes with a range of recommendations and implications for practice for profiling the physical qualities of youth rugby players.
Match and collision characteristics and exposures across world rugby union
This study aimed to quantify the duration-specific peak average running speeds of Academy-level rugby league match play, and compare between playing positions. Global positioning system data were collected from 149 players competing across 9 teams during 21 professional Academy (under-19) matches. Players were split into 6 positions: hookers (n = 40), fullbacks (n = 24), halves (n = 47), outside backs (n = 104), middles (n = 118), and backrow forwards (n = 104). Data were extracted and the 10-Hz raw velocity files exported to determine the peak average running speeds, via moving averages of speed (m·min−1), for 10- and 30-second, and 1- to 5- and 10-minute durations. The data were log transformed and analyzed using linear mixed-effect models followed by magnitude-based inferences, to determine differences between positions. Differences in the peak average running speeds are present between positions, indicating the need for position-specific prescription of velocity-based training. Fullbacks perform possibly to most likely greater average running speeds than all other positions, at each duration, except at 10 seconds vs. outside backs. Other differences are duration dependent. For 10 seconds, the average running speed is most likely greater for outside backs vs. the hookers, middles, and backrow forwards, but likely to most likely lower for 10 minutes. Hookers have possibly trivial or lower average speed for 10 seconds vs. middles and backrow forwards, but very likely greater average running speed for 10 minutes. The identified peak average running speeds of Academy-level match play seem similar to previously reported values of senior professional level.
Objectives: To quantify, and compare, the whole-, half- and peak-match running demands of professional club and international under-16 rugby league match-play. Methods: Four professional club (n = 30) and two international (n = 23) under-16 matches were analysed using 10-Hz micro-technology units, with players analysed according to positional groups. Absolute (m) and relative (RD; m.min–1) total, high speed (>5 m·s–1; HSR) and sprint (>7 m·s–1) distance were analysed for whole- and half-match periods alongside maximum velocity (VMAX; m.s–1). Peak running demands were determined via moving averages of RD for 10, 30, and 60- to 600-seconds. Results: International forwards had most likely higher whole-match relative sprint distance and VMAX, and 1st-half RD, than club level, and had very likely higher peak running demands at 60-, 180- and 600-second durations. For backs, whole-game RD was most likely higher and total and sprint distance were likely higher at club level matches. Peak RD was also very likely higher for club backs at 10- and 60-seconds. Conclusions: The running demand differences between club and international level at the under-16 age group are position dependent, with greater running demands at club level for backs, but at international level for forwards.
BACKGROUND: Quantifying the peak match demands within the football codes is useful for the appropriate prescription of external training load. Wearable microtechnology devices can be used to identify the peak match demands, although various methodologies exist at present. OBJECTIVES: This systematic review aimed to identify the methodologies and microtechnology-derived variables used to determine the peak match demands, and to summarise current data on the peak match demands in the football codes. METHODS: A systematic search of electronic databases was performed from earliest record to May 2018; keywords relating to microtechnology, peak match demands and football codes were used. RESULTS: Twenty-seven studies met the eligibility criteria. Six football codes were reported: rugby league (n = 7), rugby union (n = 5), rugby sevens (n = 4), soccer (n = 6), Australian Football (n = 2) and Gaelic Football (n = 3). Three methodologies were identified: moving averages, segmental and 'ball in play'. The moving averages is the most commonly used (63%) and superior method, identifying higher peak demands than other methods. The most commonly used variables were relative distance covered (63%) and external load in specified speed zones (57%). CONCLUSION: This systematic review has identified moving averages to be the most appropriate method for identifying the peak match demands in the football codes. Practitioners and researchers should choose the most relevant duration-specific period and microtechnology-derived variable for their specific needs. The code specific peak match demands revealed can be used for the prescription of conditioning drills and training intensity.
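The moving-average method identified as superior in this review can be sketched as follows. This is a hypothetical helper, not taken from any reviewed study, and it assumes a 10-Hz instantaneous speed trace from a microtechnology device:

```python
import numpy as np

def peak_moving_average(speeds_mps, hz=10, window_s=60):
    """Peak average running speed (m·min-1) over a rolling window.

    speeds_mps: instantaneous speed samples (m/s) recorded at `hz` Hz.
    window_s:   duration of the rolling window in seconds.
    """
    n = int(hz * window_s)
    if len(speeds_mps) < n:
        raise ValueError("speed trace shorter than the requested window")
    # distance covered per sample (m) = speed (m/s) / sampling rate (Hz)
    dist = np.asarray(speeds_mps, dtype=float) / hz
    # rolling sum of distance over the window via cumulative sums
    csum = np.concatenate(([0.0], np.cumsum(dist)))
    window_dist = csum[n:] - csum[:-n]
    # convert metres per `window_s` seconds to metres per minute
    return float(window_dist.max() * 60.0 / window_s)
```

The same routine, run at several window durations (e.g. 1 to 10 minutes), yields the duration-specific peak demands that the reviewed studies report as relative distance in m·min-1.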
The most frequently occurring contact events in rugby union are the tackle and ruck. The ability to repeatedly engage and win the tackle and ruck has been associated with team success. To win the tackle and ruck, players have to perform specific techniques. These techniques have not been studied at the highest level of rugby union. Therefore, the purpose of this study was to identify technical determinants of tackle and ruck performance at the highest level of rugby union. A total of 4479 tackle and 2914 ruck events were coded for the Six Nations and Championship competitions. Relative risk ratio (RR), the ratio of the probability of an outcome occurring when a characteristic was observed (versus the non-observed characteristic), was determined using multinomial logistic regression. Executing front-on tackles reduced the likelihood of offloads and tackle breaks in both competitions (Six Nations: RR 3.0 Behind tackle, 95% confidence interval [95% CI]: 1.9-4.6, effect size [ES] = large, P < 0.001; Championship: RR 2.9 Jersey tackle, 95% CI: 1.3-6.4, ES = moderate, P = 0.01). Fending during contact increased the chances of offloading and breaking the tackle in both competitions (Six Nations: RR 4.5 Strong, 95% CI: 2.2-9.2, ES = large, P < 0.001; Championship: RR 5.1 Moderate, 95% CI: 3.5-7.4, ES = large, P < 0.001). For the ruck, actively placing the ball increased the probability of maintaining possession (Six Nations: RR 2.2, 95% CI: 1.1-4.3, ES = moderate, P = 0.03; Championship: RR 4.0, 95% CI: 1.3-11.8, ES = large, P = 0.01). The techniques identified in this study should be incorporated and emphasised during training to prepare players for competition. Furthermore, these techniques need to be added to coaching manuals for the tackle and ruck.
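The probability ratio underlying the relative risk reported above can be illustrated with a minimal two-group calculation. Note that the study itself estimated RRs via multinomial logistic regression; this hypothetical helper only demonstrates the definition given in the abstract:

```python
def relative_risk(events_with, n_with, events_without, n_without):
    """Relative risk: ratio of the probability of an outcome when a
    characteristic is observed versus when it is not observed."""
    p_with = events_with / n_with          # P(outcome | characteristic)
    p_without = events_without / n_without # P(outcome | no characteristic)
    return p_with / p_without
```

For instance, if an offload follows 30 of 100 tackles with a strong fend but only 10 of 100 tackles without one, the relative risk of an offload given a strong fend is 3.0.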
The aim of the present study was to investigate (a) the differences in the movement skills and physical qualities between academy and senior rugby league players, and (b) the relationships between movement skills and physical qualities. Fifty-five male rugby league players (Senior, n=18; Under 19, n=23; Under 16, n=14) undertook a physical testing battery including anthropometric (stature & body mass), strength (isometric mid-thigh pull; IMTP) and power (countermovement jump; CMJ) qualities, alongside the athletic ability assessment (AAA; comprised of overhead squat, double lunge, single-leg Romanian deadlift, press-up and pull-up exercises). Univariate analysis of variance demonstrated significant (p<0.001) differences in body mass, IMTP peak force, CMJ mean power, and AAA movement skills between groups. The greatest observed differences for total movement skills, peak force and mean power were identified between the Under 16 and Under 19 academy age groups. Spearman's rank correlation coefficients demonstrated a significant moderate (r=0.31) relationship between peak force and total movement skill. Furthermore, trivial (r=0.01) and small (r=0.13; r=0.22) relationships were observed between power qualities and total movement skill. These findings highlight that both movement skills and physical qualities differentiate between academy age groups, and provide comparative data for English senior and academy rugby league players.
Head acceleration events (HAEs) are acceleration responses of the head following external short-duration collisions. The potential risk of brain injury from a single high-magnitude HAE or repeated occurrences makes them a significant concern in sport. Instrumented mouthguards (iMGs) can approximate HAEs. The distinction between sensor acceleration events, the iMG datum for approximating HAEs and HAEs themselves, which have been defined as the in vivo event, is made to highlight limitations of approximating HAEs using iMGs. This article explores the technical limitations of iMGs that constrain the approximation of HAEs and discusses important conceptual considerations for stakeholders interpreting iMG data. The approximation of HAEs by sensor acceleration events is constrained by false positives and false negatives. False positives occur when a sensor acceleration event is recorded despite no (in vivo) HAE occurring, while false negatives occur when a sensor acceleration event is not recorded after an (in vivo) HAE has occurred. Various mechanisms contribute to false positives and false negatives. Video verification and post-processing algorithms offer effective means for eradicating most false positives, but mitigation for false negatives is less comprehensive. Consequently, current iMG research is likely to underestimate HAE exposures, especially at lower magnitudes. Future research should aim to mitigate false negatives, while current iMG datasets should be interpreted with consideration for false negatives when inferring athlete HAE exposure.
Using markerless motion capture to explore changes in tackle kinematics and load-based tackling technique proficiency
The purpose of this study is to explore how PlayerLoad™ and tackle kinematics change during a simulated tackle in relation to tackle technique proficiency. Twenty amateur male rugby union players performed 12 tackles on a tackle contact simulator while wearing a microtechnology device. Each tackle was video-recorded and analysed using tackler proficiency criteria. Based on these criteria, tackles were split into three categories: lower (≤7 AU), medium (8 AU) and higher scoring tackles (≥9 AU). Markerless motion capture was used to derive kinematic variables for each tackle. Kinematic data and PlayerLoad™
variables were analysed between the three different categories. Power of the shoulder at contact in the higher scoring tackles (27.8 [95% CI: 11.36-44.3] kW) was significantly higher than in the lower scoring tackles (7.9 [5.3-10.5] kW). Force of the shoulder at contact was higher in the lower technique scoring tackles (3.6 [2.6-4.5] kN) than in the higher scoring tackles (−2.1 [−0.8 to 5.1] kN), but in the opposite direction to where the tackle is made. Technically proficient tackles are more powerful because players apply their force correctly at the point of contact. Despite the higher technical proficiency group being more powerful, the external loads experienced by the player may be similar for all tackles.

Background: Athletes are recognised as being at elevated risk of disordered eating (DE), with particularly high prevalence observed in ‘weight-sensitive’ sports. However, limited research has examined this issue within contact sports such as rugby, where unique physical demands and cultural norms may shape eating behaviours differently. This study aimed to investigate the prevalence, characteristics, and contextual drivers of DE among national-level male and female rugby union and rugby league players in England. A mixed-methods, cross-sectional design was employed utilising an online questionnaire incorporating the Eating Disorders Examination Questionnaire (EDEQ), the Muscularity Orientated Eating Test (MOET) and open-ended qualitative questions. Participants (n = 182) included current players from Premiership Women’s Rugby, the Gallagher Premiership, the Women’s Super League, and the Men’s Super League. Quantitative data were analysed using descriptive and non-parametric statistics, and qualitative responses underwent reflexive content analysis. Results: Data from the EDEQ showed 28% of female and 9% of male players exceeded clinical thresholds for DE, with the highest prevalence among front-row forwards and players with a self-reported body mass index of >25 kg/m².
Muscularity-oriented DE was similar in both sexes (MOET mean scores: females = 13.8; males = 13.0). Athletes reported DE behaviours including binge eating (40% of participants), compulsive exercise (33%), vomiting (2%), and laxative use (1%). Qualitative findings revealed a perceived link between body composition and performance, with 82% of players attempting to manipulate their bodies in the previous 12 months. Sixty-three percent of players reported pressure to change their body composition, commonly driven by coaches (17%) or intrinsic motivations (16%), and 44% experienced body-related comments in the rugby setting, frequently perceived as insults/jokes (19%), with players reporting emotional distress in response (11%). Conclusion: National-level rugby players – particularly females, forwards, and individuals with a high self-reported BMI – are at elevated risk of DE. The findings reveal a high-pressure sporting culture in which physical size is paradoxically both required and ridiculed. To address this complex issue, comprehensive and system-wide prevention strategies are urgently needed. These include challenging harmful cultural norms, cultivating psychological safety, and prioritising the holistic health and wellbeing of players across both men’s and women’s national-level rugby.
Changes in Markers of Fatigue Following a Competitive Match in Elite Academy Rugby Union Players
Understanding differences in locomotor and collision characteristics between phases of play can help rugby league coaches develop training prescription. There are no data currently available describing these differences at the elite international level. The aim of our study was to determine the differences in average speed (m·min⁻¹), high-speed running (>5.5 m·s⁻¹) per minute and collision frequencies per minute (n·min⁻¹) between attack and defence during the 2017 Rugby League World Cup (RLWC). Methods: Microtechnology data were collected from 24 male professional rugby league players from the same international squad across six matches of the RLWC. Data were then subject to exclusion criteria and stratified into forwards (n = 9) and backs (n = 7) before being analysed with linear mixed-effects models. Results: When comparing attack with defence, forwards and backs had substantially slower average speeds (effect size [ES]; ±90% confidence limits: −2.31; ±0.31 and −1.17; ±0.25) and substantially greater high-speed distance per minute (1.61; ±0.59 and 4.41; ±1.19). Forwards completed substantially more collisions per minute when defending (2.75; ±0.32) whilst backs completed substantially more when attacking (0.63; ±0.70). There was greater within- and between-player variability for collision frequency (coefficient of variation [CV] range; 25–28%) and high-speed distance (18–33%) per minute when compared to average speed (6–12%). Conclusions: There are distinct differences in locomotor and collision characteristics when attacking and defending during international rugby league match-play, yet the variability of high-speed running and collisions per minute is large. These data may be useful to plan or evaluate training practices.
Male academy rugby league players are required to undertake field and resistance training to develop the technical, tactical and physical qualities important for success in the sport. However, limited research is available exploring the training load of academy rugby league players. Therefore, the purpose of this study was to quantify the field and resistance training loads of academy rugby league players during a pre-season period and compare training loads between playing positions (i.e., forwards vs. backs). Field and resistance training load data from 28 adolescent male (age 17 ± 1 years) rugby league players were retrospectively analysed following a 13-week pre-season training period (85 total training observations; 45 field sessions and 40 resistance training sessions). Global positioning system microtechnology and estimated repetition volume were used to quantify external training load, and session rating of perceived exertion (sRPE) was used to quantify internal training load. Positional differences (forwards n = 13 and backs n = 15) in training load were established using a linear mixed effect model. Mean weekly training frequency was 7 ± 2 sessions, with duration totalling 324 ± 137 minutes and a mean sRPE of 1562 ± 678 arbitrary units (AU). Backs covered more high-speed distance than forwards in weeks two (p = 0.024) and 11 (p = 0.028). Compared to the forwards, backs completed more lower body resistance training volume in week one (p = 0.02), and more upper body volume in week three (p < 0.001) and week 12 (p = 0.005). The findings provide novel data on the field and resistance-based training load undertaken by academy rugby league players across a pre-season period, highlighting relative uniformity between playing positions. Quantifying training load can support objective decision making for the prescription and manipulation of future training, ultimately aiming to maximise training within development pathways.
Near Infrared Spectroscopy (NIRS) Observation of Vastus Lateralis (Muscle) and Prefrontal Cortex (Brain) Tissue Oxygenation During Synchronised Swimming Routines in Elite Athletes
The development of underwater Near-Infrared Spectroscopy (uNIRS) has enabled the measurement of tissue oxygenation within the swim environment. Unique physiological responses, such as the diving reflex, have been shown to occur during synchronized swimming and demonstrate an innate oxygen-conserving reflex. However, the prevalence of a sudden loss of consciousness (‘hypoxic blackout’) is an ongoing concern in this swim population. The purpose of this study was to investigate the reported low tissue oxygen conditions experienced by elite-level synchronized swimmers (SyncS) during swim routines. Changes in peripheral muscle and brain oxygenation (Tissue Saturation Index (TSI %)) were continuously recorded during simulated synchronized swim routines. Six elite female synchronized swimmers were assessed (age 29.0 ± 4.4 years; height 168.4 ± 7.1 cm; weight 53.2 ± 3.2 kg; quadriceps skinfold 10.2 ± 0.8 mm). ΔTSI (%) between the vastus lateralis (VL) and prefrontal cortex (PFC) was analyzed using paired (two-tailed) t-tests, with the level of significance set at p < 0.05. A significant difference (p = 0.001) was found in ΔTSI (%) between the VL and PFC. During dynamic leg kicking exercise, the initial effect of each leg kicking sequence was a rapid drop in TSI (%), consistent with an initial constriction (a drop in muscle blood flow) accompanied by an increase in oxygen consumption. Cerebral oxygenation (PFC) remained largely unchanged during both maximal breath-hold and vigorous exercise, presumably due to protective mechanisms in the brain in this population. We conclude that uNIRS is able to provide novel insights into SyncS hemodynamic responses and could be used to inform on the safety of new routines.
Sequential movement pattern-mining (SMP) in field-based team-sport: A framework for quantifying spatiotemporal data and improving training specificity?
Athlete external load is typically quantified as volumes or discretised threshold values using distance, speed and time. A framework that accounts for the movement sequences of athletes has previously been proposed using radio frequency data. This study developed a framework to identify sequential movement sequences using GPS-derived spatiotemporal data in team-sports and to establish its stability. Data from 13 rugby league players during one match were analysed to demonstrate the application of the framework. The framework (Sequential Movement Pattern-mining [SMP]) applies analysis techniques to (i) handle geospatial data (i.e., decimal degree latitude and longitude), (ii) determine players' turning angles, (iii) improve movement descriptor assignment and thus movement unit formation, and (iv) improve the classification and identification of players' frequent SMPs. The SMP framework allows sub-sequences of movement units to be condensed, removing repeated elements, which offers a novel technique for quantifying similarities or dissimilarities between players and playing positions. The SMP framework provides a robust and stable method which allows, for the first time, the analysis of GPS-derived data to identify the frequent SMPs of field-based team-sport athletes. Applying the SMP framework in practice could optimise the training outcomes of field-based team-sport athletes by improving training specificity.
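Step (ii) of the framework, determining turning angles from decimal-degree coordinates, can be sketched as the heading change across three consecutive GPS fixes. The equirectangular projection and the function names below are illustrative assumptions, not the framework's actual implementation:

```python
import math

# Sketch: a turning angle from three consecutive GPS fixes (decimal-degree
# latitude/longitude). The local flat-earth projection is a reasonable
# simplification at pitch scale; names and method are hypothetical.

def to_local_xy(lat, lon, lat0, lon0):
    """Project decimal degrees to metres around a reference point."""
    r = 6_371_000  # mean Earth radius, m
    x = math.radians(lon - lon0) * r * math.cos(math.radians(lat0))
    y = math.radians(lat - lat0) * r
    return x, y

def turning_angle(p1, p2, p3):
    """Heading change (degrees) at p2 for the path p1 -> p2 -> p3."""
    lat0, lon0 = p2
    x1, y1 = to_local_xy(*p1, lat0, lon0)
    x3, y3 = to_local_xy(*p3, lat0, lon0)
    h_in = math.atan2(-x1, -y1)      # heading of leg p1 -> p2
    h_out = math.atan2(x3, y3)       # heading of leg p2 -> p3
    turn = math.degrees(h_out - h_in)
    return (turn + 180) % 360 - 180  # wrap to [-180, 180)

# Straight run north, then a right-angle turn east
path = [(53.8000, -1.5500), (53.8001, -1.5500), (53.8001, -1.5499)]
print(round(turning_angle(*path)))  # 90
```

Discretising such angles into direction classes (e.g., straight, left, right) is one way movement descriptors could then be assigned before pattern mining.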
Measuring and Analysing Physical Qualities in Youth Rugby
To support the understanding, preparation, and long-term development of youth rugby players, the accurate measurement of their physical qualities is vital. This chapter summarises how anthropometry, body composition, strength, power, speed, agility and change-of-direction, aerobic and anaerobic capacity, and athletic movement skills are measured within youth rugby players and discusses the accuracy and reliability of these methods. Furthermore, the implications of using these different testing methods within research are considered. Due to the large discrepancies in testing outcomes between rugby players of similar ages, this chapter will provide recommendations for accurate and reproducible testing of youth rugby players. Additionally, future research directions are provided that will enhance the understanding of youth rugby player development.
The Young Rugby Player: Science and Application
Injury Risk and Prevention in Youth Rugby
Injury risk is a concern for youth rugby, with research since the 1980s reporting the injury incidence within the codes. This chapter aims to review the existing literature on injury risk (including patterns of injury, risk factors, and concussion) and the injury prevention strategies used within youth rugby union and rugby league. In summary, the chapter shows that injury incidence is low for children participating but increases during adolescence. Most injuries occur to the knee, shoulder, and head, with the tackle the major cause of injury. Numerous injury prevention strategies (e.g., equipment, law modification, integrative neuromuscular training programmes) have been investigated and shown to be effective. Future research should use consistent injury definitions, evaluate injury risk according to other player development factors (e.g., grouping strategies, physical development), and consider wider adoption, implementation, and maintenance of injury prevention strategies to make the codes of rugby as safe as possible.
Talent Identification in Male Youth Rugby: An Ecological Perspective
In this chapter, the ecological dynamics framework is used to provide an overview of talent identification research in male youth rugby. Specifically, the literature and research implications are reviewed and synthesised using three constraints: (a) the task (i.e., participation history), (b) the performer (i.e., psychological characteristics, technical and tactical skills, physical factors), and (c) the environment (i.e., relative age effects, sociocultural influences). In summary, it is highlighted that talent identification in male youth rugby cannot be based upon any performance characteristic in isolation and that the interaction amongst all constraints should be considered when identifying young talent. Moreover, these constraints appear to be contingent on (a) age group, (b) competition level, (c) nationality, and (d) playing position. Limitations of the current literature and proposed directions for future research are discussed emphasising the need for multidisciplinary and longitudinal research within male and female rugby players.
Kinanthropometry and Grouping Strategies in Youth Rugby
A variety of kinanthropometric measurements (the study of size, shape, proportion, composition and maturation) have been used to characterise youth rugby-playing cohorts. Herein, differences between age-grades and playing groups (forwards and backs) have been established, whilst maturation appears to influence performance and selection in talent development programmes. Additionally, anthropometric-based grading methods of youth players have been applied as an alternative to traditional age grouping strategies. However, there is a lack of transparency as a consequence of limited detail in the methods for the measures used and limited research examining (1) the differences beyond comparisons of forwards and backs in players of the same age; (2) community age-grade rugby; and (3) youth female rugby. Furthermore, whilst anthropometric-based ‘grouping’ methods appear theoretically sound, there is currently a lack of research to support their proposed benefits.
Long-term Athletic Development: The Youth Rugby Player
The concept of training youth athletes is not novel; however, since the turn of the millennium, there has been a significant increase in interest surrounding the efficacy of various training approaches on the holistic development of both children and adolescents. Despite this growing interest, practitioners should remember that youth athletes are a unique population and that their exposure to sports training and strength and conditioning programmes will coincide with normal growth and maturity-related changes in a range of physical, physiological and psychosocial qualities. Understanding key principles of paediatric exercise science and growth and maturation is therefore important to support long-term athletic development of youths. This chapter aims to introduce the key concepts of long-term athletic development related to the youth rugby player, including growth and maturation, physical development, injury risk, training load management and psychosocial development.
Sprint Development in Football Code Athletes
Within the football codes, sprint performance is considered an important capacity for success and is therefore targeted within athletic development programmes. However, the concurrent and complex nature of physical preparation for the football codes presents several challenges for effective sprint development. This thesis aimed to evaluate and enhance the understanding of the development of sprint performance in football code athletes to support the delivery of best practice. The thesis comprises sequential sections presented through a series of chapters. First, systematic reviews with meta-analyses evaluate the evidence base for the development of sprint performance (short and medium-long distances). Second, a practitioner survey analyses the applied training practices and the justifications for the organisation and evaluation of sprint development. The last section provides observations and an evaluation of profiling methods for phase- and distance-specific sprint performance, using a case study of combined training methodologies in elite male youth rugby league athletes. The systematic reviews and meta-analyses showed that sport-only training and short sprints with incomplete rest appear insufficient to enhance sprint performance in football code athletes. Instead, sprint development requires either, or preferably a combination of, methods that improve sprinting skills (i.e., sprints performed with physical or coordinative overload) and methods that improve the athlete's physical characteristics (i.e., plyometrics and resistance training). Combined with the surveys and case studies, this research showed that a one-size-fits-all approach to sprint development (i.e., exercises, loading etc.) is not applicable; instead, effective training strategies depend upon the individuals and the context in which they are applied.
Therefore, the content of the training (e.g., training frequency, exercise selection, training load prescription) is highly variable in research and practice, but so is the training response. Applying frequent and embedded monitoring of key variables (i.e., mechanical profiling) can support personalised and potentially improved training practices. Sprint development in football code practice is challenging (particularly long-term) due to the complexity and at times, competing requirements of an athlete’s development. Therefore, if an individual or team of football code athletes aims to enhance sprint performance, it requires prioritisation from all the key stakeholders.
Developmentally Appropriate Coaching Practice for Children Playing Rugby
This book provides a comprehensive and accessible overview of the research behind the preparation, development and performance of the young rugby player.
The study aimed to illustrate how contact (from match‐event data) and head acceleration event (HAE) (from instrumented mouthguard [iMG]) data can be combined to inform match limits within rugby. Match‐event data from one rugby union and rugby league season, including all competitive matches involving players from the English Premiership and Super League, were used. Playing exposure was summarised as full game equivalents (FGE; total minutes played/80). Expected contact and HAE exposures at arbitrary thresholds were estimated using match‐event and iMG data. Generalised linear models were used to identify differences in contact and HAE exposure per FGE. For 30 FGEs, forwards had greater contact than backs in rugby union (n = 1272 vs. 618) and league (n = 1569 vs. 706). As HAE magnitude increased, the differences between positional groups decreased (e.g., rugby union; n = 34 and 22 HAE >40 g for forwards and backs playing 30 FGEs). Currently, only a relatively small proportion of rugby union (2.5%) and league (7.3%) players exceeded 25 FGEs. Estimating contact and HAEs per FGE allows policymakers to prospectively plan and model estimated overall and position‐specific loads over a season and longer term. Reducing FGE limits by a small amount would currently only affect contact and HAE exposure for a small proportion of players who complete the most minutes. This may be beneficial for this cohort but is not an effective HAE and contact exposure reduction strategy at a population level, which requires individual player management. Given the positional differences, FGE limits should exist to manage appropriate HAE and contact exposure.
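The full game equivalent definition above (total minutes played/80) and the reported per-30-FGE contact counts can be combined into a simple exposure estimate. Treating the published counts as linear per-FGE rates is an illustrative simplification, not the study's generalised linear model:

```python
# Full game equivalents (FGE) as defined in the abstract: total minutes / 80.
# The per-30-FGE contact count below is taken from the reported figures
# (1272 contacts per 30 FGEs for rugby union forwards); assuming a linear
# per-FGE rate is a simplification for illustration.

def full_game_equivalents(total_minutes):
    """Playing exposure expressed as 80-minute match equivalents."""
    return total_minutes / 80

def expected_contacts(total_minutes, contacts_per_30_fge):
    """Expected contact events, scaling a per-30-FGE count linearly."""
    return full_game_equivalents(total_minutes) * contacts_per_30_fge / 30

# A rugby union forward playing 2,000 minutes in a season:
print(full_game_equivalents(2000))           # 25.0 FGE
print(round(expected_contacts(2000, 1272)))  # 1060 expected contacts
```

This kind of per-FGE scaling is what lets policymakers model the overall and position-specific loads that different FGE limits would permit.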
Diet is an ever-changing, poorly characterised and multifaceted phenomenon. Consequently, traditional dietary assessment methods demonstrate considerable random intra- and inter-individual day-to-day variation and systematic over- or under-reporting bias (errors of reliability and validity; Beaton et al. 1997; Freedman et al. 2015) across populations (Pérez-Rodrigo et al. 2015). Expressed practically, true assessments of energy intake are misrepresented by hundreds of calories per day (Archer et al. 2016), erroneously informing medical conclusions (Schoenfeld & Ioannidis 2013), media claims (Archer, Pavela & Lavie 2015) and national dietary guidelines (Chowdhury et al. 2014). Ultimately, the enormous potential of nutrition research to drive national health, patient welfare and public service (Dhurandhar et al. 2015), urgently necessitates, and ethically obligates, the valid assessment of diet within all dietetic output.
To ensure that elite adolescent athletes meet their unique training, growth and maturation demands, it is imperative to have access to valid measures of energy intake. Contemporary methods demand close attention to detail, meaning that athletes often do not fully adhere to real-time protocols. This study represents the first investigation of a real-time dietary assessment designed using a comprehensive behaviour change framework (COM-B). In a crossover design, 12 elite adolescent male rugby players recorded their energy intake via an estimated food diary (est-FD) and a photography-based mobile assessment ('Snap-n-Send') combined with a 24-h dietary recall interview. Two 4-day assessment periods, one month apart, were divided into three separate recording environments: 96 h (free-living plus researcher-observed), 72 h (free-living) and 10 h (researcher-observed). All foods and beverages were provided and weighed by the research team to quantify actual intakes. 'Snap-n-Send' showed a small mean bias for under-reporting across the 96 h (-0.75 MJ day(-1); 95% confidence interval [CI] for bias = -5.7% to -2.2%, p < .001), 72 h (-0.76 MJ day(-1); 95% CI for bias = -5.6% to -2.1%, p = .004) and 10 h (-0.72 MJ day(-1); 95% CI for bias = -8.1% to -0.1%; p = .067) environments. The est-FD showed a moderate mean bias for under-reporting across the 96 h (-2.89 MJ day(-1); 95% CI for bias = -17.9% to -10.2%; p < .001), 72 h (-2.88 MJ day(-1); 95% CI for bias = -17.9% to -10.1%; p < .001) and 10 h (-2.52 MJ day(-1); -26.1% to -5.3%; p = .023) environments. The results evidence the ability of 'Snap-n-Send' to accurately assess the diet of elite adolescent athletes, signalling the promise of this comprehensive, theory-based behavioural approach to valid dietary assessment.
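The mean bias figures above are differences between recorded and actual intake, averaged per day. A minimal sketch with invented values and a normal-approximation confidence interval (the study's own analysis may have used a paired t-distribution rather than this z-based interval):

```python
import statistics
from statistics import NormalDist

# Sketch of a mean-bias calculation: recorded minus actual energy intake
# (MJ/day), averaged across athletes, with a normal-approximation 95% CI.
# All values below are invented for illustration.

def mean_bias_ci(recorded, actual, level=0.95):
    diffs = [r - a for r, a in zip(recorded, actual)]
    bias = statistics.mean(diffs)
    se = statistics.stdev(diffs) / len(diffs) ** 0.5
    z = NormalDist().inv_cdf(0.5 + level / 2)  # ~1.96 for a 95% interval
    return bias, (bias - z * se, bias + z * se)

recorded = [11.2, 10.8, 12.1, 11.5, 10.9, 11.8]  # hypothetical MJ/day
actual = [12.0, 11.5, 12.9, 12.2, 11.7, 12.5]
bias, (low, high) = mean_bias_ci(recorded, actual)
print(round(bias, 2))  # -0.75
```

A negative bias, as here, indicates under-reporting relative to the weighed intakes.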
Objectives Soccer leagues reflect the partial standings of the teams involved after each round of competition. However, the ability of partial league standings to predict end-of-season position has largely been ignored. Here we analyze historical partial standings from English soccer to understand the mathematics underpinning league performance and evaluate the predictive ‘power’ of partial standings. Methods Match data (1995-2017) from the four senior English leagues was analyzed, together with random match scores generated for hypothetical leagues of equivalent size. For each season the partial standings were computed and Kendall’s normalized tau-distance and Spearman r-values determined. Best-fit power-law and logarithmic functions were applied to the respective tau-distance and Spearman curves, with the ‘goodness-of-fit’ assessed using the R2 value. The predictive ability of the partial standings was evaluated by computing the transition probabilities between the standings at rounds 10, 20 and 30 and the final end-of-season standings for the 22 seasons. The impact of reordering match fixtures was also evaluated. Results All four English leagues behaved similarly, irrespective of the teams involved, with the tau-distance conforming closely to a power law (R2>0.80) and the Spearman r-value obeying a logarithmic function (R2>0.87). The randomized leagues also conformed to a power-law, but had a different shape. In the English leagues, team position relative to end-of-season standing became ‘fixed’ much earlier in the season than was the case with the randomized leagues. In the Premier League, 76.9% of the variance in the final standings was explained by round-10, 87.0% by round-20, and 93.9% by round-30. Reordering of match fixtures appeared to alter the shape of the tau-distance curves. Conclusions All soccer leagues appear to conform to mathematical laws, which constrain the league standings as the season progresses. 
This means that partial standings can be used to predict end-of-season league position with reasonable accuracy.
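The normalized Kendall tau-distance used above can be computed directly as the fraction of discordant team pairs between a partial and a final standing; normalizing by the total number of pairs is the usual convention, assumed here:

```python
from itertools import combinations

# Normalized Kendall tau-distance between a partial-season standing and the
# final standing: the fraction of team pairs ordered differently by the two
# rankings (0 = identical order, 1 = fully reversed). Team labels are
# illustrative.

def kendall_tau_distance(ranking_a, ranking_b):
    pos_a = {team: i for i, team in enumerate(ranking_a)}
    pos_b = {team: i for i, team in enumerate(ranking_b)}
    pairs = list(combinations(ranking_a, 2))
    discordant = sum(
        (pos_a[x] - pos_a[y]) * (pos_b[x] - pos_b[y]) < 0 for x, y in pairs
    )
    return discordant / len(pairs)

round10 = ["A", "B", "C", "D"]  # standings after round 10
final = ["A", "C", "B", "D"]    # end-of-season standings
print(kendall_tau_distance(round10, final))  # 1 discordant pair of 6
```

Computing this distance at each round of a season produces the curve to which the study fits its power-law function.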
Oral contraceptive use in Premiership and Championship women’s rugby union: Perceived symptomology, management strategies and performance and wellness effects
The aim of this study was to investigate the prevalence of oral contraceptive use in domestic rugby union, to compare symptomology by contraceptive use, and to determine symptom management strategies. Additionally, to characterise the perceived influence of oral contraceptive use and non-use on wellness and performance. A total of 238 Premiership and Championship women’s rugby union players completed an online questionnaire. The survey was comprised of questions relating to player characteristics, hormonal or non-hormonal contraceptive characteristics, perceived symptomology, symptom management strategies, and performance and wellness characteristics. The prevalence of oral contraceptive users was 26%. Non-hormonal contraceptive users reported greater perceived negative symptomology (i.e., back pain, nausea, sore breasts) and performance and wellness effects (i.e., fatigue, stress, mood, concentration, power, match-play) than oral contraceptive users. The most common symptom management strategies were medication (33%), nutritional interventions (20%), and training modulation (20%). Twelve percent of players had previously spoken to staff about their menstrual cycle (i.e., regular and irregular) or contraceptive use. The most common barriers to speaking to staff were ‘male staff’ (29%) and ‘club culture’ (24%). The importance of assisting non-hormonal contraceptive users in managing symptoms is evident. Emphasis on overcoming barriers to staff-player dialogue regarding menstrual/contraceptive cycle is required.
Fatigue in team sports has been widely researched, with a number of systematic reviews summarising the acute (i.e., within 48 hours) response in outdoor sports. However, the fatigue response in indoor court-based sports is likely to differ from that in outdoor sports due to smaller playing fields, harder surfaces, and greater match frequencies, and should therefore be considered separately. This study aimed to conduct a systematic review of acute fatigue in indoor court-based team sports, identify the methods and markers used to measure acute fatigue, and describe acute fatigue responses. A systematic search of the electronic databases (PubMed, SPORTDiscus, MEDLINE and CINAHL) was conducted from the earliest record to June 2023. Included studies investigated a physical, technical, perceptual, or physiological response measured before and after training, match, or tournament play. One hundred and eight studies were included, measuring 142 markers of fatigue. Large variability in methods, fatigue markers and the timeline of measurements was present. Cortisol (n = 43), creatine kinase (n = 28), countermovement jump (n = 26) and testosterone (n = 23) were the most frequently examined fatigue markers. Creatine kinase displayed the most consistent trend, increasing 10–204% at 24-hours across sports. There is large variability across studies in the methods and markers used to determine acute fatigue responses in indoor court-based team sports. Future researchers should focus on markers that display high reliability and transfer to practice. The robustness of studies may be increased by ensuring appropriate methods and timescales of fatigue marker measurement are used. Further research is required to determine which combination of markers best describes a fatigue response.
Influence of arrival hydration status and fluid availability on hydration markers during a rugby league match simulation protocol.
Descriptions and definitions for the rugby league tackle
INTRODUCTION Rugby league (RL) tackle research using video analysis has typically drawn its variables from two sources. The exception is King et al. (2010), who described characteristics of the RL tackle event such as the number of tacklers and the tackle height of the first tackler. The majority of investigations, however, have either adopted technical variables from rugby union (RU) tackle research (Speranza et al., 2017) or technical criteria from coaching cues (Gabbett, 2008). In doing so, content validity and relevance to RL could be questioned (O’Donoghue, 2014). The aim of this study was to adopt a 5-stage process to determine tackle variables which are valid and reliable for RL research. METHODS A 5-stage process was undertaken based upon recommendations by O’Donoghue (2014). STAGE 1 involved a synthesis of the literature, examining the phases of the tackle, the variables used to describe the tackle, and the descriptions of these variables in previous research. A draft variable list was then developed before the start of STAGE 2. To achieve content validity and relevance, STAGE 2 formed an expert group of practitioners to critique the previously formed draft variable list and develop new phases, variables and descriptors. STAGE 3 refined the variable list based upon the practitioner consultation. STAGE 4 established expert group agreement on the refined variable list. Finally, STAGE 5 tested the intra- and inter-rater reliability of the list using Kappa statistics (McHugh, 2012). RESULTS The agreed variable list comprised 6 phases, including defensive start point, pre-contact, initial contact, post-contact and play-the-ball phases. Within these phases, 66 variables were determined. The intra- and inter-rater reliability testing resulted in at least moderate agreement (>0.7) (McHugh, 2012) for all phases. 
DISCUSSION Due to possessing both strong relevance to the RL tackle and good levels of reliability, researchers can be confident that the variables within the list are valid for research purposes (O’Donoghue, 2014). In addition, the rigorous 5-stage process of validating the content of the variable list could be applied when determining variables for other sports and actions, giving researchers confidence that such variables are valid and can be used consistently. Furthermore, the findings show that although there are similarities between the RU and RL tackle, clear differences exist, which justifies the need for RL-specific variables in tackle research.
Calcium, vitamin D and iron status of elite rugby players during a competitive season.
Immune responses and dietary intake of elite rugby union players during pre-season training
There is a developing base of research assessing hormonal status in rugby players as a means to monitor training and performance; however, to date no research has investigated the impact of dietary intake on immunity. The objectives of the study were to monitor immune responses, as well as assess dietary intake, body composition and performance of elite rugby union players. Following ethics approval, nine players (height 185.8±6.2cm, age 28.0±3.4yrs) were assessed at the start and end of a 4-week pre-season training period for dietary intake (4-day food diary), body composition (sum of 8 [Σ8] skinfold sites), one repetition maximum (1RM) strength (bench press [BP] and prone row [PR]) and endurance (1200m run). Saliva immunoglobulin A (sIgA) measures were taken at the start and end of each week (~8.30am). Mean energy intakes were 12145.9±3772.3kJ (week 1) and 12419.7±2385.5kJ (week 4). During both weeks of dietary assessment, protein and carbohydrate consumption was 2.3±0.7 and 2.7±1.1g·kg BM-1·d-1 respectively. Protein, carbohydrate and fat contributed 32%, 36% and 32% of energy intake, respectively. Despite relatively low energy consumption (approximately 45% below recommendations), there was only a 0.7±2.4kg reduction in body mass (103.7±13.7 to 103±13.4kg) and a 12±2% reduction in Σ8 skinfolds from week 1 to week 4 (106.7±44.3 to 94.17±43.6mm). Mean sIgA over the training period was 47.9 µg·min-1, with small intra-player variability observed throughout the 4-week pre-season training period (CV 5%). sIgA was strongly correlated with fat (r=0.68) and saturated fat intake (r=0.62). sIgA was also moderately correlated with protein intake (r=0.40), Σ8 skinfolds (r=0.39) and endomorphy (r=0.43). Conversely, strong and moderate negative correlations occurred between sIgA and mesomorphy (r=-0.51), and sIgA and ectomorphy (r=-0.49). 
Significant improvements (P<0.05) were observed for 1RM BP (4.0 ± 4.93 kg) and 1200m run time (23 ± 5.27 sec), with 1200m times strongly correlating with sIgA (r=0.56). Conversely, there were moderate negative correlations between improvements in strength and sIgA (r=-0.42). In summary, players with a higher dietary fat intake and endomorphic characteristics demonstrated a better immune status than leaner ectomorphic players. Those showing better immune function also produced greater gains in endurance. The negative correlations between strength and sIgA were likely due to enhanced rates of catabolism as a result of resistance training. The pre-season training period elicited improvements in body composition despite dietary intake that was inadequate when compared with guidelines, highlighting the potential error of applying the recommended nutrient intake guidelines to an elite rugby union population.
Nutrition strategies and supplements may have a role to play in diminishing exercise-associated gastrointestinal cell damage and permeability. The aim of this systematic review was to determine the influence of dietary supplements on markers of exercise-induced gut endothelial cell damage and/or permeability. Five databases were searched through to February 2021. Studies were selected that evaluated indirect markers of gut endothelial cell damage and permeability in response to exercise with and without a specified supplement, including with and without water. Acute and chronic supplementation protocols were included. Twenty-seven studies were included. The studies investigated a wide range of supplements, including bovine colostrum, glutamine, probiotics, supplemental carbohydrate and protein, nitrate or nitrate precursors, and water, across a variety of endurance exercise protocols. The majority of studies using bovine colostrum and glutamine demonstrated a reduction in selected markers of gut cell damage and permeability compared to placebo conditions. Carbohydrate intake before and during exercise and maintaining euhydration may partially mitigate gut damage and permeability, and these measures coincide with other performance nutrition strategies. Single-strain probiotics showed some positive findings, but the results are likely strain-, dosage- and duration-specific. Bovine colostrum, glutamine, carbohydrate supplementation and maintaining euhydration may reduce exercise-associated endothelial damage and improve gut permeability. Despite the large heterogeneity across the selected studies, appropriate inclusion of different nutrition strategies could mitigate the initial phases of gastrointestinal cell disturbances in athletes associated with exercise. However, research is needed to clarify whether this will contribute to improved athlete gastrointestinal and performance outcomes.
Post-Exercise Hyponatremia in Premiership Rugby Union Players
Objectives To assess the incidence, prevalence and consequences of illness in one professional academy rugby league club during an in-season period. Design Observational prospective cohort study. Method Seventeen male rugby league players (age 17.7 ± 0.7 years, stature 178.8 ± 5.1 cm, body mass 87.2 ± 9.6 kg) completed a weekly self-report illness questionnaire using an amended version of the Oslo Sports Trauma Research Centre (OSTRC) questionnaire on health problems. Results A total of 24 new illnesses were reported over the 25-week study period. 65% of players experienced at least one illness during the study. The incidence of illness in this cohort was 14.3 per 1000-player days, with the respiratory system being most commonly affected (n = 15; 62.5%). The average weekly illness prevalence was 10.3%. Time-loss illness incidence was 1.4 per 1000-player days. Loss of body mass and sleep disruptions were the most commonly reported consequences of illness episodes. Mean body mass loss during a period of illness was 2.2 ± 0.6 kg. Conclusions Academy rugby league players are most commonly affected by respiratory illness with a total of nineteen training and competition days lost to illness. Associated consequences of illness, such as loss of body mass and sleep disruptions may present a challenge and negatively impact a rugby league player’s development. Appropriate medical provisions should be provided for Academy rugby league players to support them during periods of illness to limit the impact of these consequences.
Background An increasing number of epidemiological studies assessing the incidence, prevalence and severity of injury in youth female sport are available. However, no study has sought to synthesise the current evidence base across all youth female sports. As such, a systematic review and meta-analysis of injury in this cohort is necessary to understand the diversity of injury and its associated burden between sports, in addition to identifying the density of research available. Objective To conduct a systematic review and meta-analysis of epidemiological data of injuries in youth female athletes, with particular attention to injury incidence, mean days lost and injury burden. Methods Searches were performed in PubMed, EBSCO (SportDiscus with Full Text, MEDLINE, APA PsycInfo, CINAHL, Academic Search Complete) and Cochrane databases. Studies were considered if they reported time-loss injury incidence or prevalence in youth female (≤ 19 years old) athletes. Study quality and risk of bias were assessed using the STROBE sports injury and illness surveillance (STROBE-SIIS) extension, the Newcastle-Ottawa Scale, and funnel plots, respectively. Injury incidence and burden rate data were modelled using a mixed-effect Poisson regression model. Days lost data were modelled using a generalised linear mixed model. Results Thirty-two studies were included. The overall incidence rate, mean days lost per injury, and burden rate were 4.4 injuries per 1000 h (95% CI 3.3–5.9), 10 days (95% CI 6–15), and 46 days per 1000 h (95% CI 23–92), respectively. Forty percent of athletes sustained at least one time-loss injury. Competitive level was a significant moderator for match and training injury incidence, with elite youth athletes presenting greater pooled injury incidence estimates than non-elite athletes (p = 0.0315 and p = 0.0047, respectively). The influence of moderators on days lost and injury burden could not be assessed due to an insufficient number of studies for analysis. 
Conclusion Despite a broad inclusion criterion, there is limited injury surveillance research available across youth female sport. Outside of soccer, little research density is evident, with single studies available in popular team sports such as Australian Rules Football and Rugby Union. Insufficient study numbers reporting mean days lost and injury burden data were available for analysis, and pooled days lost data could only be estimated for soccer. This highlights a need for future research to report days lost data alongside injury number and exposure, so that burden can be calculated and the full risk of injury to youth female athletes can be identified.
Fluid and sodium balance is important for performance and health; however, limited data in rugby union players exist. The purpose of the study was to evaluate body mass (BM) change (dehydration) and blood[Na] change during exercise. Data were collected from 10 premiership rugby union players over a 4-week period. Observations included match play (23 subject observations), field (45 subject observations), and gym (33 subject observations) training sessions. Arrival urine samples were analyzed for osmolality, and samples during exercise were analyzed for [Na]. Body mass and blood[Na] were determined pre- and postexercise. Sweat[Na] was analyzed from sweat patches worn during exercise, and fluid intake was measured during exercise. Calculations of fluid and Na loss were made. Mean arrival urine osmolality was 423 ± 157 mOsm·kg-1, suggesting players were adequately hydrated. After match play, field, and gym training, BM loss was 1.0 ± 0.7, 0.3 ± 0.6, and 0.1 ± 0.6%, respectively. Fluid loss was significantly greater during match play (1.404 ± 0.977 kg) than field (1.008 ± 0.447 kg, p = 0.021) and gym training (0.639 ± 0.536 kg, p < 0.001). Fluid intake was 0.955 ± 0.562, 1.224 ± 0.601, and 0.987 ± 0.503 kg during match play, field, and gym training, respectively. Players were hyponatremic on 43% of observations when BM increased, 57% when BM was maintained, and 35% when there was a BM loss of 0.1-0.9%. Blood[Na] was representative of normonatremia when BM loss was >1.0%. The findings demonstrate that rugby union players are adequately hydrated on arrival, fluid intake is excessive compared with fluid loss, and some players are at risk of developing hyponatremia.
The purpose of the present study was to evaluate: 1) whether there were differences in sprint times at 5, 10, 20, 30 and 40 m between rugby union and rugby league players; and 2) the reliability and usefulness of linear sprint testing in adolescent rugby players. Data were collected on 28 rugby union and rugby league academy players over two testing sessions, with three days' rest between sessions. Rugby league players were faster at 5 m than rugby union players, with further differences unclear. Sprint times at 10, 20, 30 and 40 m were all reliable (CV = 3.1%, 1.8%, 2.0% and 1.3%), but the typical error (TE) was greater than the smallest worthwhile change (SWC; 0.2 x between-subject SD), rating the test as marginal for usefulness. While the test was incapable of detecting the SWC, we recommend that practitioners and researchers use Hopkins' proposed method (22), whereby the change score of the individual at each split (± TE expressed as a CV) is plotted against the SWC; visually inspecting whether the TE crosses into the SWC can identify whether a change is both real (greater than the noise of the test, i.e., >TE) and of practical significance (>SWC). Researchers and practitioners can use the TE and SWC from the present study to assess changes in performance of adolescent rugby players when using single-beam timing gates.
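The decision logic described above (a change must exceed the typical error to be "real" and the smallest worthwhile change to be "worthwhile") can be sketched as follows. This is an illustrative sketch only, not the authors' analysis; the function names and the example split time and SD are hypothetical, while the 3.1% CV and the 0.2 x SD factor are taken from the abstract.

```python
# Sketch of TE vs SWC interpretation for sprint-split change scores.
# All function names and example values are illustrative, not from the study.

def swc(between_subject_sd: float, factor: float = 0.2) -> float:
    """Smallest worthwhile change: 0.2 x between-subject SD (Hopkins)."""
    return factor * between_subject_sd

def typical_error(time_s: float, cv_percent: float) -> float:
    """Typical error at a given sprint time, from a CV expressed in %."""
    return time_s * cv_percent / 100.0

def interpret_change(change_s: float, te_s: float, swc_s: float) -> str:
    """Classify an observed change score against the TE and the SWC."""
    if abs(change_s) <= te_s:
        return "within test noise"       # cannot distinguish from error
    if abs(change_s) <= swc_s:
        return "real but trivial"        # exceeds noise, below worthwhile
    return "real and worthwhile"         # exceeds both thresholds

# Hypothetical 10 m split: baseline 1.80 s, CV 3.1% (reported), SD 0.10 s
te = typical_error(1.80, 3.1)   # ~0.056 s, larger than the SWC below
threshold = swc(0.10)           # 0.020 s
print(interpret_change(-0.03, te, threshold))
```

Because the TE here exceeds the SWC, a change of 0.03 s falls within test noise, illustrating why the abstract rates the test as only marginal for usefulness.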
Reducing the unknown of fluid balance and sodium loss for a recreationally trained athlete competing in the Marathon des Sables
The Marathon des Sables consists of 5.5 marathons in 6 days, while competitors are required to carry all food and living provisions in temperatures exceeding 50°C. Competitors may undertake sports science support prior to the event, likely to focus on hydration strategies due to the extreme heat of the event and the association with dehydration and potential hypernatraemia or hyponatraemia. Despite this, no data exist to aid practitioners when supporting clients running the Marathon des Sables. The purpose of the support was to educate a 40-year-old recreationally trained male client on the effect of running speed and rucksack weight on fluid loss, Na+ loss and blood[Na+] change during exercise in the heat. The client completed five trials, manipulating running speed and rucksack weight: 6.0 km h-1 (0 and 10 kg), 10.0 km h-1 (0 and 10 kg) and 8.0 km h-1 (7.5 kg) in a heat chamber (40°C and 20% relative humidity) for 30 min on a motorised treadmill. Ad libitum water intake was permitted (measured via mass change of drinks bottles) and body mass (BM) was determined pre- and post-exercise. Sweat patches were worn during exercise and analysed for [Na+]. Blood[Na+] was also measured pre- and post-exercise. Fluid loss was calculated (BMpre (kg) – BMpost (kg) + fluid intake (kg)) and estimations of fluid loss at various ecologically valid running speeds and rucksack weights were calculated. Na+ loss was calculated based on [Na+] and fluid loss. Institutional ethical approval to use case study data for research purposes was obtained retrospectively. Fluid loss was 0.54 L h-1 at 6.0 km h-1 (0.0 kg), 0.94 L h-1 at 6.0 km h-1 (10.0 kg), 1.22 L h-1 at 10.0 km h-1 (0.0 kg) and 2.12 L h-1 at 10.0 km h-1 (10.0 kg). There was a strong significant positive relationship between fluid loss and sweat[Na+] (r = 0.966, P = 0.034) and a strong negative relationship between fluid loss and Δblood[Na+] (r = -0.776, P = 0.269). 
Estimations of potential fluid and Na+ deficits at the end of the day, dependent on the duration of stage, running speed and rucksack weight, were presented to the client. The physiological responses during this support were normal: an increase in fluid loss with increasing running speed and rucksack weight, and an increase in sweat [Na+] with increasing fluid loss, due to a reduction in Na+ reabsorption in the sweat gland duct (Cage and Dobson, 1965, Journal of Clinical Investigation, 44, 1270–1276). Despite the known findings, the client reported that the support was sufficient to provide confidence in his hydration strategy and an increased awareness of pacing, thus completing the Marathon des Sables safely without developing debilitating dehydration, hypernatraemia or hyponatraemia.
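The fluid-loss formula stated in the case study (fluid loss = BMpre - BMpost + fluid intake) and the derived sodium-loss and hourly-rate estimates can be sketched as below. This is an illustrative sketch only; the function names and the example trial values are hypothetical, not taken from the client's data.

```python
# Sketch of the fluid- and Na+-loss calculations described in the case study.
# Formula from the abstract: fluid loss (kg) = BM_pre - BM_post + fluid intake.
# Example values are hypothetical.

def fluid_loss_kg(bm_pre: float, bm_post: float, intake_kg: float) -> float:
    """Fluid (sweat) loss over a session; 1 kg is treated as ~1 L."""
    return bm_pre - bm_post + intake_kg

def sodium_loss_mmol(fluid_loss_l: float, sweat_na_mmol_l: float) -> float:
    """Total Na+ loss estimated from sweat [Na+] and fluid loss."""
    return fluid_loss_l * sweat_na_mmol_l

def hourly_rate(loss: float, duration_min: float) -> float:
    """Scale a per-session loss to a per-hour rate."""
    return loss * 60.0 / duration_min

# Hypothetical 30-min heat-chamber trial: 83.7 -> 83.2 kg, 0.3 kg drunk
loss = fluid_loss_kg(83.7, 83.2, 0.3)   # 0.8 kg in 30 min
print(round(hourly_rate(loss, 30), 2))  # 1.6 L/h
```

Scaling per-session losses to hourly rates in this way is what allows the end-of-day deficit estimates described above to be projected across stages of differing duration.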
Stepping into the unknown: Providing multidisciplinary support to an ultra-endurance race debutant
The purpose of this article is to provide a descriptive and reflective account of the multidisciplinary support (i.e., nutrition, physiology and psychology) provided to a 40-year-old male client entering the Marathon des Sables (MdS) for the first time. Reflections will be provided from client and practitioner perspectives. An initial assessment phase, consisting of intake interview (e.g., initial introductions and contracting), sport analysis and client needs analysis (e.g., nutritional requirements for training/racing; performance profiling and physiological testing), was conducted 16 weeks prior to the race to identify key priorities and inform support provision. Professional codes of ethics and conduct (e.g., British Association of Sport and Exercise Sciences [BASES], British Psychological Society [BPS] and Sport and Exercise Nutrition Register [SENR]) were consistently adhered to. Institutional ethical approval to use case study data for research purposes was obtained retrospectively. “Priority areas” (e.g., fitness and mental endurance) and “unknown aspects” (e.g., heat/hydration, nutrition and tent-mates) of race preparation and completion were initially identified. Based on this information, a programme of client-tailored support consisting of psychological skills training (e.g., goal-setting), dietary analysis (e.g., completion/evaluation of diet diaries) and physiological testing (e.g., heat chamber trials to monitor fluid balance and sodium loss during treadmill running) was provided. “Unknown aspects” became particularly salient for client and practitioners following a serious ankle injury sustained by the client within the first month of the support programme. Despite intentions to provide an interdisciplinary support programme, a predominantly multidisciplinary approach was adopted. 
Furthermore, in the face of limited time and availability, the support team employed a largely client-led consultancy approach (i.e., the client was empowered to take responsibility for decision-making and problem-solving). Client feedback (e.g., “cautious confidence” about hydration/nutrition, adapting the overall race goal to “complete” rather than “compete” and being able to draw on previous experiences of dealing with unfamiliar scenarios) indicated that the consultancy approach was successful in facilitating client involvement in developing appropriate race strategies. Although the support team did not accompany the client to the race venue, remote support (i.e., online messages) was provided throughout the race. During a debrief interview conducted 3 days post-race, the client reflected on the sense of “reassurance” which the support programme had provided en route to achieving his adapted goal of race completion. This case study provides an account of the multidisciplinary support offered to an ultra-marathon debutant over a short yet turbulent timeframe. Reflections on the challenges, successes and learning experiences encountered during the support programme demonstrate that developing the ability to adapt to novel and unexpected circumstances represents an important challenge for both clients and practitioners alike.
Client and practitioner collaboration in nutritional planning and preparation for the Marathon des Sables
The Marathon Des Sables (MdS) is a multistage ultra-endurance footrace across the Sahara Desert. Event organisation supplies rationed water (9 L · day–1) but entrants must provide and carry a minimum of 2000 kcals · day–1, selecting provisions best suited to personal needs, health, environmental conditions, weight and backpack preference. This case study account details the pre-race nutritional planning and preparation of a 40-year-old, recreationally trained but occupationally sedentary male client. The purpose of collaboration was to develop nutritional strategies to support training, alongside an event food plan that adhered to race regulations and maximised energy delivery within a client-determined backpack food weight allowance of 5 kg. The client (body mass 83.7 kg) self-referred 16 weeks prior to the event. Following initial assessment and dietary analysis, a nutrition intervention was designed to sustain training. Total energy requirement (TER) was predicted using basal metabolic rate and a physical activity level of 1.7 (2934 kcal). Nutritional targets were set based on American College of Sports Medicine (ACSM) guidelines; carbohydrate (CHO) 6–7 g · kgBM–1 · day–1, protein 1.2–1.7 g · kgBM–1 · day–1, fat 20–25% TER. A tailored event food plan was formulated and trialled pre-race. Post-race debrief occurred 3 days following completion. Initial assessment identified the need to shift the balance of CHO (mean intake 305 g, target 502–586 g) and protein (mean intake 192 g, target 100–117 g) contributions to TER to support training. With emphasis placed on recovery nutrition strategies, an augmented CHO intake was advised as training duration increased. The client was taught to self-manage using the CHO content of common foods. 
In respect of the event plan, using the limits of his own knowledge and generic information provided on the race website, the client had set a minimum target daily energy intake equivalent to 3000 kcal · day–1 using a range of freeze-dried meals and snack foods, and sought the practitioner’s expertise to optimise the energy availability of race provisions, resulting in a mean energy and macronutrient profile of 3082 kcal, 389 g CHO, 110 g protein and 132 g fat. The post-race debrief indicated the intervention provided reassurance through the process of negotiation and practitioner expertise in accommodating the client’s taste preferences to formulate an event plan that was “fine-tuned” to safeguard product durability in extreme heat while offering sufficient variety to ensure palatability and consumption. With the growing popularity of ultra-endurance events and the associated time and financial commitments of participants, this case study highlights the success of individualised nutrition strategies to facilitate performance and build client confidence in race completion.
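The per-kilogram targets and the TER calculation described in the case study reduce to simple arithmetic, sketched below. This is an illustrative sketch only: the function names are hypothetical, the 83.7 kg body mass, PAL of 1.7 and the 6–7 g/kg CHO and 1.2–1.7 g/kg protein ranges come from the abstract, and the BMR value used in the example is an assumption (the case study does not state which BMR equation was used).

```python
# Sketch of the TER and ACSM-style per-kg macronutrient target arithmetic.
# Function names are illustrative; the example BMR is an assumption.

def ter_kcal(bmr_kcal: float, pal: float = 1.7) -> float:
    """Total energy requirement = BMR x physical activity level."""
    return bmr_kcal * pal

def cho_target_g(body_mass_kg: float, low: float = 6.0, high: float = 7.0):
    """Daily CHO target range (g) from per-kg guideline values."""
    return (low * body_mass_kg, high * body_mass_kg)

def protein_target_g(body_mass_kg: float, low: float = 1.2, high: float = 1.7):
    """Daily protein target range (g) from per-kg guideline values."""
    return (low * body_mass_kg, high * body_mass_kg)

bm = 83.7                       # client body mass from the case study
print(cho_target_g(bm))         # ~ (502, 586) g, matching the abstract
print(protein_target_g(bm))
print(round(ter_kcal(1726.0)))  # a BMR of ~1726 kcal x 1.7 reproduces ~2934 kcal
```

The CHO range computed here reproduces the 502–586 g target reported above, which is how such per-kg guidelines are translated into client-specific daily gram targets.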
Purpose: The aim of this study was to investigate the associations between matched mechanical variables derived from both vertical and horizontal force-velocity-power (FVP) profiling, and the performance outcome variables within squat jump (SJ) and sprint performance. Methods: 20 elite male academy rugby league players (age 17.6±0.9 years; height 179.9±6.6cm; body mass 91.2±11.8kg) performed two maximal 40m sprints. The sprints were recorded using a radar gun device (Stalker ATS II, Applied Concepts, Dallas, TX, USA), which obtained instantaneous speed-time measurements. In addition, the participants performed two maximal SJ (~90° knee angle) repetitions with the following loads: 0kg, 20kg, 40kg, 60kg and 80kg. An Optojump (OptoJump Next Microgate, Bolzano, Italy) was used to record the SJs, providing jump height (cm) for each load. Body-mass-relative vertical and horizontal mechanical variables (theoretical maximal values of force (F0) (N/kg), velocity (V0) (m/s), power (Pmax) (W/kg)) and the slope of the F-V linear relationship (Sfv) were calculated. Sprint performance was determined from the modelled velocity-time data (2m, 5m, 10m, 20m sprint times (s) and Vmax (m/s)). Pearson’s correlation coefficients (r) assessed the relationships between matched vertical and horizontal mechanical variables (F0 vertical & horizontal, V0 vertical & horizontal, Pmax vertical & horizontal and Sfv vertical & horizontal) and SJ and sprint performance. Results: Table 1 shows the correlation coefficients between the sprint and SJ force-velocity profiles and performance variables. There was no significant correlation between vertical and horizontal FVP matched mechanical variables (p > 0.05). The correlations between vertical FVP variables and sprint performance, and between horizontal FVP variables and SJ performance, failed to reach statistical significance (p > 0.05). 
Moderate (r = -0.32) to near-perfect (r = 1.0) significant correlations (p < 0.05) were found between mechanical and performance variables, with the most important variables shifting depending on the testing task. Conclusions: The absence of significant correlations between the vertical and horizontal FVP profiles suggests that they provide distinct information about the athlete’s mechanical capacities. The magnitude of the correlations between mechanical variables and sprint performance shifted across the velocity-time curve; therefore, performance is determined by separate qualities depending on the distance, whereas Pmax showed the greatest correlation with SJ height. Practical Application: To ensure specific, accurate and comprehensive characterisation of athletes’ physical qualities, FVP profiles should be determined with exercises that are maximally mechanically similar to the targeted performance task. These results will aid practitioners in test selection and in the prescription and individualisation of training by providing important information as to the most influential variables for developing SJ and sprint performance.
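The FVP variables named above (F0, V0, Sfv, Pmax) follow from a straight-line fit of the force-velocity relationship across the loaded jumps. The sketch below shows an assumed version of that workflow, not the authors' code; the force/velocity values are hypothetical, and Pmax is taken at the apex of the resulting parabolic power-velocity curve (F0·V0/4), a standard property of a linear F-V profile.

```python
# Sketch of deriving F0, V0, Sfv and Pmax from a linear F-V fit.
# Data points are hypothetical; this is not the study's analysis code.
import numpy as np

def fv_profile(force_n_kg, velocity_m_s):
    """Least-squares fit F = F0 + Sfv * v; returns (F0, V0, Sfv, Pmax)."""
    sfv, f0 = np.polyfit(velocity_m_s, force_n_kg, 1)  # slope, intercept
    v0 = -f0 / sfv            # extrapolated velocity at zero force
    pmax = f0 * v0 / 4.0      # apex of the parabolic power-velocity curve
    return f0, v0, sfv, pmax

# Hypothetical mean force (N/kg) and velocity (m/s) across 0-80 kg loads
force = np.array([30.0, 34.0, 38.0, 42.0, 46.0])
vel = np.array([2.4, 2.0, 1.6, 1.2, 0.8])

f0, v0, sfv, pmax = fv_profile(force, vel)
print(round(f0, 1), round(v0, 2), round(sfv, 1), round(pmax, 1))
```

Because the vertical and horizontal profiles are fitted from different tasks (loaded jumps vs. modelled sprint data), the same four parameters can differ between them, consistent with the lack of correlation between matched variables reported above.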
A plethora of research exists examining the physical qualities of rugby league players. However, no research has investigated practitioners’ insights into the use, analysis and perceptions of such fitness testing data, which is vital for applying research into practice. Therefore, this study aimed to examine practitioners’ (coaches and strength & conditioning [S&C] coaches) perceptions and challenges of using fitness testing and the development of physical qualities. Twenty-four rugby league practitioners were purposefully sampled and completed a semi-structured interview. Interviews were transcribed and thematically analysed, identifying five themes (it’s important, but it’s not everything; monitoring; evaluation and decision making; motivation; and other external challenges). The theme of “it’s important, but it’s not everything” emerged as a fundamental issue with regard to fitness testing and the use of such data, in that physical data alone do not inform coaches’ decisions. There appear to be conflicts between coaches’ and S&C coaches’ perceptions and use of fitness data, identifying the complexities of supporting players in multidisciplinary teams. Collectively, the findings highlight the multifaceted nature of academy rugby league and suggest that practitioners should utilise fitness testing to inform player evaluations, positively influence training and assist with decision making. Moreover, practitioners should understand the combination of factors that influence fitness testing and work collaboratively to enhance talent development strategies.
Background Short-sprint (≤20m) performance is an important quality for success in the football codes. Therefore, developing an evidence base for understanding training methods to enhance short-sprint performance is key for practitioners. However, current systematic reviews are limited by 1) a lack of focus on football code athletes, 2) a lack of consideration of all training modalities, and 3) a failure to account for the normal training practices undertaken by intervention groups within their analysis. Therefore, this review aimed to 1) conduct a systematic review of the scientific literature evaluating training interventions upon short-sprint performance within football code athletes, 2) undertake a meta-analysis to assess the magnitude of change of short-sprint performance following training interventions, and 3) identify how moderator variables affect the training response. Methods A systematic search of electronic databases was conducted. A random-effects meta-analysis was performed to establish standardised mean differences with 95% confidence intervals. This identified the magnitude and direction of the individual training effects of intervention subgroups (primary, secondary, combined-specific, tertiary and combined training methods) on short-sprint performance while considering moderator variables (i.e., football code, sex, age, playing standard, phase of season). Results 121 studies met the inclusion criteria, totalling 3,419 athletes. Significant improvements (small-large) were found between pre- and post-training in short-sprint performance for the combined, secondary, tertiary and combined-specific training methods. No significant effect was found for primary or sport-only training. No individual mode was found to be the most effective. Between-subgroup analysis identified that football code, age, playing standard and phase of season all moderated the overall magnitude of training effects. 
Conclusions This review provides the largest systematic review and meta-analysis of short-sprint performance development methods and the only one to assess football code athletes exclusively. Practitioners can apply combined, secondary and tertiary training methods to improve short-sprint performance within football code athletes. The application of sport-only and primary methods does not appear to improve short-sprint performance. Regardless of the population characteristics, short-sprint performance can be enhanced by increasing either the magnitude or the orientation of force an athlete can generate in the sprinting action, or both.
Sawczuk, T, Jones, B, Scantlebury, S, and Till, K. Influence of perceptions of sleep on well-being in youth athletes.
This study examined the relationship between accelerometer metrics and both collisions and running demands during rugby union match-play. Twelve under-18 forwards and 14 under-18 backs were recruited from a professional rugby union club. Six competitive matches were filmed, during which players wore micro-technology units (Optimeye S5, Catapult Innovations, Melbourne, Australia). Video footage was analysed for total collisions, while GPS data were analysed for total distance. The accelerometer metrics analysed were PlayerLoad™ (PL), PlayerLoad™ 2D (PL2D), and PlayerLoad™ slow (PLslow). A total of 81 player observations were included in the final analysis. Data were analysed using ordinary least squares regression. A 10-fold cross-validation analysis was used to validate the findings. All PL variables demonstrated very large relationships with collisions in the forwards, while PLslow demonstrated the largest relationship (large) with collisions in the backs. Therefore, based on the strong relationship in both forwards and backs, PLslow may provide the most useful metric for measuring collision-based activity in both positional groups during match-play. Additionally, nearly perfect and very large relationships were observed between PL and total distance for forwards and backs respectively, suggesting that PL can be successfully used to quantify running demands when other methods are unavailable, for example during indoor training.
RFL 2020 Senior Super League Injury Surveillance
Background Within the football codes, medium-distance (i.e., > 20 m and ≤ 40 m) and long-distance (i.e., > 40 m) sprint performance and maximum velocity sprinting are important capacities for success. Despite this, no research has identified the most effective training methods for enhancing medium- to long-distance sprint outcomes. Objectives This systematic review with meta-analysis aimed to (1) analyse the ability of different methods to enhance medium- to long-distance sprint performance outcomes (0–30 m, 0 to > 30 m, and the maximum sprinting velocity phase [Vmax]) within football code athletes and (2) identify how moderator variables (i.e., football code, sex, age, playing standard, phase of season) affected the training response. Methods We conducted a systematic search of electronic databases and performed a random-effects meta-analysis (within-group changes and pairwise between-group differences) to establish standardised mean differences (SMDs) with 95% confidence intervals and 95% prediction intervals. This identified the magnitude and direction of the individual training effects of intervention subgroups (sport only; primary, secondary, tertiary, and combined training methods) on medium- to long-distance sprint performance while considering moderator variables. Results In total, 60 studies met the inclusion criteria (26 with a sport-only control group), totalling 111 intervention groups and 1500 athletes. The within-group changes design reported significant performance improvements (small–moderate) between pre- and post-training for the combined, secondary (0–30 and 0 to > 30 m), and tertiary training methods (0–30 m). A significant moderate improvement was found in the Vmax phase performance only for tertiary training methods, with no significant effect found for sport only or primary training methods. The pairwise between-group differences design (experimental vs. control) reported favourable performance improvements (large SMD) for the combined (0 to > 30 m), primary (Vmax phase), secondary (0–30 m), and tertiary methods (all outcomes) when compared with the sport-only control groups. Subgroup analysis showed that the significant differences between the meta-analysis designs consistently demonstrated a larger effect in the pairwise between-group differences than the within-group changes. No individual training mode was found to be the most effective. Subgroup analysis identified that football code, age, and phase of season moderated the overall magnitude of training effects. Conclusions This review provides the first systematic review and meta-analysis of all sprint performance development methods exclusively in football code athletes. Secondary, tertiary, and combined training methods appeared to improve medium- to long-distance sprint performance of football code athletes. Tertiary training methods should be implemented to enhance Vmax phase performance. Neither sport-only nor primary training methods appeared to enhance medium- to long-distance sprint performance. Performance changes may be attributed to adaptations specific to the acceleration phase, the Vmax phase, or both, but not exclusively Vmax. Regardless of the population characteristics, sprint performance can be enhanced by increasing either the magnitude or the orientation of force an athlete can generate in the sprinting action, or both.
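The standardised mean differences reported in these meta-analyses are typically computed as a between-group difference divided by a pooled standard deviation, with a small-sample correction. A minimal sketch of the pairwise between-group form (the review's exact estimator and variance model are not given in the abstract, so this is illustrative only):

```python
import math

def hedges_g(mean_exp, sd_exp, n_exp, mean_ctrl, sd_ctrl, n_ctrl):
    """Standardised mean difference (Hedges' g): Cohen's d computed with a
    pooled SD, multiplied by the small-sample correction factor J."""
    df = n_exp + n_ctrl - 2
    pooled_sd = math.sqrt(((n_exp - 1) * sd_exp ** 2 +
                           (n_ctrl - 1) * sd_ctrl ** 2) / df)
    d = (mean_exp - mean_ctrl) / pooled_sd
    j = 1 - 3 / (4 * df - 1)  # small-sample correction
    return d * j
```

For sprint times, an improvement is a reduction, so a favourable training effect appears as a negative g; for example, `hedges_g(4.0, 0.5, 20, 4.5, 0.5, 20)` ≈ −0.98.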
This cross-sectional study evaluated the sprint and jump mechanical profiles of male academy rugby league players, the differences between positions, and the associations between mechanical profiles and sprint performance. Twenty academy rugby league players performed 40-m sprints and squat jumps at increasing loads (0-80 kg) to determine individual mechanical (force-velocity-power) and performance variables. The mechanical variables (absolute and relative theoretical maximal force-velocity-power, force-velocity linear relationship, and mechanical efficiency) were determined from the mechanical profiles. Forwards had significantly (p < 0.05) greater vertical and horizontal force and momentum, but jumped lower (unloaded) and were slower than backs. No athlete presented an optimal jump profile. No associations were found between jump and sprint mechanical variables. Absolute theoretical maximal vertical force significantly (p < 0.05) correlated (r = 0.71-0.77) with sprint momentum. Moderate (r = -0.47) to near-perfect (r = 1.00) significant associations (p < 0.05) were found between sprint mechanical and performance variables. The largest associations shifted from maximum relative horizontal force-power generation and application to maximum velocity capabilities and force application at high velocities as distance increased. The jump and sprint mechanical profiles appear to provide distinctive and highly variable information about academy rugby league players' sprint and jump capacities. Associations between mechanical variables and sprint performance suggest horizontal and vertical profiles differ and should be trained accordingly.
Background The evaluation of physical qualities in talent identification and development systems is vital and commonplace in supporting youth athletes towards elite sport. However, the complex and dynamic development of physical qualities, in addition to temporal challenges associated with the research design, such as unstructured data collection and missing data, requires appropriate statistical methods to be applied in research to optimise the understanding and knowledge of long-term physical development. Aim To collate and evaluate the application of methodological and statistical methods used in studies investigating the development of physical qualities within youth athletes. Methods Electronic databases were systematically searched from the earliest record to June 2021 and reference lists were hand searched in accordance with the PRISMA guidelines. Studies were included if they tested physical qualities over a minimum of 3 timepoints, were observational in nature and used youth sporting populations. Results Forty articles met the inclusion criteria. The statistical analysis methods applied were qualitatively assessed against the theoretical underpinnings (i.e. multidimensional development, non-linear change, and between- and within-athlete change) and temporal challenges (i.e. time-variant and time-invariant variables, missing data, treatment of time and repeated measures) encountered with longitudinal physical testing research. Multilevel models were implemented most frequently (50%) and were the most appropriately used statistical analysis method when qualitatively compared against the longitudinal challenges. Independent-groups ANOVA, MANOVA and chi-square tests were also used, yet failed to address any of the challenges posed within longitudinal physical testing research.
Conclusions This methodological review identified the statistical methods currently employed within longitudinal physical testing research, which addressed the theoretical and temporal challenges of such research with varying success. The findings can be used to support the selection of statistical methods when evaluating the development of youth athletes, taking into account the challenges presented.
This study assessed the influence of training load, exposure to match play and sleep duration on two daily wellbeing measures in youth athletes. Forty-eight youth athletes (age 17.3 ± 0.5 years) completed a daily wellbeing questionnaire (DWB), the Perceived Recovery Status scale (PRS), and provided details on the previous day’s training loads (TL) and self-reported sleep duration (sleep) every day for 13 weeks (n = 2727). Linear mixed models assessed the effect of TL, exposure to match play and sleep on DWB and PRS. An increase in TL had a most likely small effect on muscle soreness (d = −0.43; ±0.10) and PRS (d = −0.37; ±0.09). Match play had a likely small additive effect on muscle soreness (d = −0.26; ±0.09) and PRS (d = −0.25; ±0.08). An increase in sleep had a most likely moderate effect on sleep quality (d = 0.80; ±0.14); a most likely small effect on DWB (d = 0.45; ±0.09) and fatigue (d = 0.42; ±0.11); and a likely small effect on PRS (d = 0.25; ±0.09). All other effects were trivial or did not reach the pre-determined threshold for practical significance. The influence of sleep on multiple DWB subscales and the PRS suggests that practitioners should consider the recovery of an athlete alongside the training stress imposed when considering deviations in wellbeing measures.
This study investigated the changes in measures of neuromuscular fatigue and physical performance in young professional rugby union players during a preseason training period. Fourteen young (age: 19.1 ± 1.2 years) professional rugby union players participated in the study. Changes in measures of lower body neuromuscular fatigue (countermovement jump (CMJ) mean power, mean force, flight-time) and physical performance (lower body strength, 40 m sprint velocity) were assessed during an 11-week preseason period using magnitude-based inferences. CMJ mean power was likely to very likely decreased during week 2 (-8.1 ± 5.5% to -12.5 ± 6.8%), and likely to almost certainly decreased from weeks 5 to 11 (-10 ± 4.3% to -14.7 ± 6.9%), while CMJ flight-time demonstrated likely to very likely decreases during week 2, weeks 4-6 (-2.41 ± 1% to -3.3 ± 1.3%), and weeks 9-10 (-1.9 ± 0.9% to -2.2 ± 1.5%). Despite this, possible improvements in lower body strength (5.8 ± 2.7%) and very likely improvements in 40 m velocity (5.5 ± 3.6%) were made. Relationships between changes in CMJ metrics and lower body strength or 40 m sprint velocity were trivial or small (<0.22). Increases in lower body strength and 40 m velocity occurred over the course of an 11-week preseason despite the presence of neuromuscular fatigue (as measured by CMJ). The findings of this study question the usefulness of CMJ for monitoring fatigue in the context of strength and sprint velocity development. Future research is needed to ascertain the consequences of negative changes in CMJ in the context of rugby-specific activities to determine the usefulness of this test as a measure of fatigue in this population.
Sleep patterns of elite youth team-sport athletes prior to and during international competition
Objective: To examine the effects of international competition on sleep patterns of elite youth team-sport athletes from two national squads compared to a baseline period. Methods: Fifty elite male youth rugby players from two squads were assessed two weeks before (HOME) and throughout two match-day cycles (matchday−1, matchday, matchday+1) of an international competition (COMP). Players were selected to represent their nation during the Six Nations Festival and completed daily self-reported sleep diaries before and during a competitive period. Linear mixed models were used to examine differences between HOME and COMP, and within camp days. Effect sizes ± 90% confidence intervals (ES ± 90% CI) were calculated to quantify the magnitude of pairwise differences. Results: Participants spent more time in bed (34.6 ± 13.9 min; ES = 0.26 ± 0.19), slept for longer (35.4 ± 12.7 min; ES = 0.30 ± 0.19), and woke up later (36.5 ± 9.5 min; ES = 0.41 ± 0.20) in COMP compared to HOME, but maintained their regular bedtime (−1.8 ± 11.2 min; ES = 0.02 ± 0.19), sleep onset latency (4.1 ± 3.2 min; ES = 0.17 ± 0.25) and rating of sleep quality (0.30 ± 0.17; ES = 0.17 ± 0.19). Conclusions: Elite youth team-sport athletes sleep for longer during a competition camp compared to home, resulting from a delay in wake-up times. This highlights the opportunity for implementing interventions to improve sleep patterns in international-level team sport athletes in their daily environment.
Annual changes in lean and fat mass in elite youth rugby league players
Bigger, stronger, faster: The differences in physical qualities between player development group and England academy players in youth rugby union
The quantification of internal (i.e., the physical stress imposed on the athlete) and external (i.e., distance covered) training load is viewed as essential to determine whether an athlete is adapting to a training programme, whilst minimising the risk of injury and overreaching. Although research has established correlations between internal measures of training load (i.e., session rating of perceived exertion [s-RPE] vs. summated heart rate zone method; Borresen & Lambert, 2008, International Journal of Sports Physiology and Performance, 3, 16-30), limited research exists comparing internal and external methods in team sports. The aim of this study was to establish the accuracy of s-RPE to quantify internal and external training load in adolescent rugby and hockey. Following institutional ethics approval, 22 youth sport (rugby & hockey) athletes were monitored across 125 training sessions (64 rugby & 61 hockey). External training load was monitored using a microtechnology unit to determine total distance and PlayerLoad, whilst internal loads were monitored using heart rate (summated heart rate zones) and s-RPE. Pearson correlation coefficients and 90% confidence intervals were calculated. Fisher's r-to-z transformation compared the correlations between rugby and hockey. For summated HR zones and s-RPE, a large correlation (r = 0.58, 90% CI: 0.43 to 0.70) was found for rugby with a very large correlation (r = 0.75, 90% CI: 0.64 to 0.83) for hockey. In rugby, large correlations were found between s-RPE and PlayerLoad (r = 0.64, 90% CI: 0.50 to 0.75), and total distance (r = 0.66, 90% CI: 0.52 to 0.76). In hockey, large and moderate correlations were found between s-RPE and PlayerLoad (r = 0.55, 90% CI: 0.39 to 0.69) and total distance (r = 0.42, 90% CI: 0.23 to 0.58) respectively. No significant differences were found between the correlations of internal and external measures between sports.
The large and moderate correlations found between measures of total distance & PlayerLoad and s-RPE appear to support the theory that an individual's internal load is influenced by the external load they are exposed to, highlighting the need for future research within this area. Furthermore, the large correlations found between s-RPE and the summated heart rate zones method highlight the potential for s-RPE to be used as an efficient technique in quantifying internal training load within adolescent rugby and hockey athletes. This suggests coaches can confidently monitor the internal training load of their athletes using s-RPE methods when HR technology is not available.
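The two internal-load measures compared above are both simple accumulations. As a sketch (the zone weighting below follows the commonly used Edwards-style approach; the zone boundaries and exact weighting used in the study are not stated in the abstract, so these are assumptions):

```python
def session_rpe(rpe, duration_min):
    """s-RPE internal load (arbitrary units): post-session rating of
    perceived exertion (CR-10 scale) multiplied by session duration
    in minutes."""
    return rpe * duration_min

def summated_hr_zones(minutes_in_zone):
    """Summated heart-rate zone load: minutes accumulated in each of five
    ascending intensity zones, weighted by the zone number (1-5)."""
    return sum((i + 1) * mins for i, mins in enumerate(minutes_in_zone))
```

For example, a 60-minute session rated 7 yields an s-RPE load of 420 AU, and 10, 10, 20, 15 and 5 minutes in zones 1-5 yields a summated heart-rate zone load of 175 AU.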
The purpose of this study was to determine the between-day reliability of the Hamstring Solo for measuring peak eccentric knee flexor force (EKF) during the Nordic hamstring curl. Data were collected on 18 male professional rugby union players across two testing sessions separated by 7 days. There was no between-session difference in EKF for the left (p = 0.440 – 0.580) or right (p = 0.477 – 0.656) leg when using the best of 1 (left = 405.3±88.2 N vs. 412.8±92.7 N; right = 408.0±88.1 N vs. 416.7±85.2 N), 2 (left = 409.9±87.6 N vs. 415.0±96.2 N; right = 413.0±87.5 N vs. 418.3±86.2 N), or 3 repetitions (left = 411.2±88.2 N vs. 417.3±92.7 N; right = 417.7±87.4 N vs. 417.7±87.4 N). The between-day reliability of EKF peak force was acceptable for the left (7.2 to 8.3%) and right (8.3 to 9.8%) leg, with the typical error lowest when using the best of three repetitions. The smallest worthwhile change (SWC) was similar for the left (4.2 – 4.3%) and right (3.6 – 3.7%) leg when using the best of 3 repetitions. As the typical error was greater than the SWC for both the left (1.71 × the SWC) and right (2.24 × the SWC) legs, changes of 2.71 (Δ 41 N; 11%) and 3.24 (Δ 47 N; 12%) × SWC are required to detect a small change in EKF peak force, taking into account the typical error. Practitioners can use the reliability statistics from this study to monitor EKF peak force in professional rugby union players when using the Hamstring Solo device. It is recommended that when monitoring EKF peak force with the Hamstring Solo, practitioners use the best of 3 repetitions.
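The decision rule in this abstract, where a measured change must exceed the typical error before it can be interpreted against the smallest worthwhile change, can be sketched as follows (the 0.2 × between-subject SD convention for the SWC is an assumption based on common practice, not stated in the abstract):

```python
import math
import statistics

def typical_error(test1, test2):
    """Typical error of measurement: SD of the test-retest difference
    scores divided by sqrt(2)."""
    diffs = [b - a for a, b in zip(test1, test2)]
    return statistics.stdev(diffs) / math.sqrt(2)

def smallest_worthwhile_change(scores, factor=0.2):
    """SWC: a fraction (conventionally 0.2) of the between-subject SD."""
    return factor * statistics.stdev(scores)

def detectable_change_in_swc(te, swc):
    """Change required, expressed in SWC multiples, to detect a small
    change once the typical error is accounted for: TE/SWC + 1."""
    return te / swc + 1
```

With a typical error of 1.71 × SWC, as for the left leg above, a change of 2.71 × SWC would be required.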
Purpose: To assess the relationships between training load, sleep duration and three daily wellbeing, recovery and fatigue measures in youth athletes. Methods: Fifty-two youth athletes completed three maximal countermovement jumps (CMJ), a daily wellbeing questionnaire (DWB), the Perceived Recovery Status scale (PRS), and provided details on their previous day's training loads (training) and self-reported sleep duration (sleep) on four weekdays over a seven-week period. Partial correlations, linear mixed models and magnitude-based inferences were used to assess the relationships between the predictor variables (training; sleep) and the dependent variables (CMJ; DWB; PRS). Results: There was no relationship between CMJ and training (r = -0.09; ±0.06) or sleep (r = 0.01; ±0.06). The DWB was correlated with sleep (r = 0.28; ±0.05, small), but not training (r = -0.05; ±0.06). The PRS was correlated with training (r = -0.23; ±0.05, small), but not sleep (r = 0.12; ±0.06). The DWB was sensitive to low sleep (d = -0.33; ±0.11) relative to moderate, while the PRS was sensitive to high (d = -0.36; ±0.11) and low (d = 0.29; ±0.17) training relative to moderate. Conclusions: The PRS is a simple tool to monitor the training response, but DWB may provide a greater understanding of the athlete's overall wellbeing. The CMJ was not associated with the training or sleep response in this population.
The monitoring of training load is important to ensure athletes are adapting optimally to a training stimulus. Before quantification of training load can take place, coaches must be confident that the tools available are accurate. We aimed to quantify the within-participant correlation between the session rating of perceived exertion (s-RPE) and summated heart rate zone (sHRz) methods of monitoring internal training load. Training load (s-RPE and heart rate) data were collected for rugby, soccer and field hockey field-based training sessions over a 14-week in-season period. A total of 397 sessions were monitored (rugby n = 170, soccer n = 114 and field hockey n = 113). Within-subject correlations between s-RPE and sHRz were quantified for each sport using a general linear model. Large correlations between s-RPE and the sHRz method were found for rugby (r = 0.68; 95% CI 0.59–0.75) and field hockey (r = 0.60; 95% CI 0.47–0.71) with a very large correlation found for soccer (r = 0.72; 95% CI 0.62–0.80). No significant differences were found between the correlations for each sport. The very large and large correlations found between s-RPE and the sHRz methods support the use of s-RPE in quantifying internal training load in youth sport.
The purpose of this study was to evaluate the anthropometric and performance characteristics of high-level youth female soccer players by annual-age category (Under 10 (U10)–U16). Data were collected from 157 female soccer players (U16, n = 46; U14, n = 43; U12, n = 38; U10, n = 30), recruited from three high-level female soccer academies in England. Players completed assessments of anthropometry (height and body mass), isometric mid-thigh pull strength, jump height, aerobic capacity, change of direction (505-left/right) and speed (10 and 30 m). Magnitude-based inferences were used to assess for practical significance between consecutive age groups. Height (very likely–most likely), body mass (very likely–most likely), absolute strength (most likely), jump height (likely–very likely) and distance on the Yo-Yo Intermittent Recovery Level 1 (YYIRL1) (possibly–most likely) were greater in older players. Both speed and change of direction time were most likely to very likely lower in older players. However, only most likely trivial–possibly trivial differences were observed in relative strength between age groups. Findings suggest that physical characteristics, except for relative strength, differentiate by age category. These findings provide comparative data and target reference data for such populations and can be used by coaches and practitioners for player development purposes. Practitioners should be aware that relative strength does not differ between age categories in high-level youth female soccer players.
Introduction It is well-recognised that fulfilling the role of a coach is multi-faceted. In rugby, some of these coaching facets have been studied, however the research has not been reviewed. Reviewing the literature on rugby coaches will inform and guide policies, coach education, research and practice. Therefore, the purpose of this study is to provide a scoping review of the current coach-focused literature on rugby union, rugby league and rugby sevens. Methods A scoping review was conducted on five electronic databases (EBSCOhost, PubMed, Scopus, SPORTDiscus, Web of Science) until January 2022 using the PRISMA-ScR guidelines. Participants had to be coaches within rugby union, sevens and league to be included. Data were extracted and analysed to form a numerical and thematic summary. Results 105 articles were included. 76% of the studies were on rugby union, 14% on league, 1% on sevens and the remainder focused on a combination of rugby cohorts or did not specify. Three themes were identified via a thematic analysis based on the content of the articles, these were coach knowledge (68%), coach pedagogies (29%), and coach development (4%). Conclusion The main finding of this review is that research on rugby coaches has centred on their understanding of the risk, prevention, and management of injuries. Educational resources should cover injuries arising from all aspects of rugby play and training. The importance of the athlete-coach relationship and coach reflective practices was another significant finding. Coaches are encouraged to have a broad understanding of various aspects related to the player's welfare, which can be developed using formal and/or nonformal learning.
The Effect of Changing Weekly Contact Training Duration Beyond Current Guidelines on Head Acceleration Events in Rugby Union
Abstract
Background
This study simulated the effect of reducing contact training duration on overall in-season head acceleration event (HAE) exposure within men’s and women’s rugby union.
Methods
Players (n = 982) from two professional men’s and two semi-professional women’s competitions wore instrumented mouthguards in training and match-play for one season. Generalised linear mixed models were used to estimate the in-season weekly HAE exposures per position, sex and contact type. Simulation of modelled estimates evaluated the impact of reducing contact load guidelines by 25%, 50% and 75% (scenario 1), and replacing full contact training with controlled contact (scenario 2) or non-contact (scenario 3) training for different seasonal match exposures. Previously established contact load guidelines were used as a reference point.
Results
HAEs decreased by a maximum of 3.2 per week (0–95 HAEs per season; 0–23%). In scenario 1, the decrease in HAEs was disproportionately smaller than the reduction in contact training duration (e.g. a 23.7% reduction in overall rugby minutes for a 7% decrease in HAEs). Scenario 2 decreased HAEs similarly to scenario 1 but with no reduction in contact time. Scenario 3 decreased HAEs proportionally with contact time reductions (e.g. an 8.9% decrease in HAEs >10 g for a 9.6% reduction in overall rugby minutes).
Conclusions
HAEs were reduced in all scenarios, but the reduction was relatively small due to the low overall rate of HAEs in training. Policymakers should be aware of the tradeoffs involved in any change. Managing individuals with higher HAE exposures may be more appropriate than reducing contact training guidelines.
The aim was to use a combination of video analysis and microtechnology (10 Hz global positioning system [GPS]) to quantify and compare the speed and acceleration of ball-carriers and tacklers during the pre-contact phase (contact − 0.5 s) of the tackle event during rugby league match-play. Data were collected from 44 professional male rugby league players from two Super League clubs across two competitive matches. Tackle events were coded and subject to three stages of inclusion criteria to identify front-on tackles. 10 Hz GPS data were synchronised with video to extract the speed and acceleration of the ball-carrier and tackler into each front-on tackle (n = 214). Linear mixed effects models (effect size [ES], confidence intervals, p-values) compared differences. Overall, ball-carriers (4.73 ± 1.12 m·s⁻¹) had greater speed into front-on tackles than tacklers (2.82 ± 1.07 m·s⁻¹; ES = 1.69). Ball-carriers accelerated (0.67 ± 1.01 m·s⁻²) into contact whilst tacklers decelerated (-1.26 ± 1.36 m·s⁻²; ES = 1.74). Positional comparisons showed speed was greater during back vs. back (ES = 0.66) and back vs. forward (ES = 0.40) than forward vs. forward tackle events. Findings can be used to inform strategies to improve performance and player welfare.
Time to embrace the complexity when analysing GPS data? A systematic review of contextual factors on match running in rugby league
This systematic review aimed to identify and summarise associations between currently identified contextual factors and match running in senior male professional rugby league. Eligible articles included at least one contextual factor and used GPS to measure at least one displacement variable within competitive senior, male, professional rugby league matches. Of the 15 included studies, the identified contextual factors were grouped into factors related to individual characteristics (n = 3), match result (n = 4), team strength (n = 2), opposition strength (n = 3), match conditions (n = 6), technical and tactical demands (n = 6), spatial and temporal characteristics (n = 7), and nutrition (n = 1). Speed was the most commonly reported measure of match running (100%), followed by distance (47%), and acceleration (20%). Inconsistencies were found between studies for most contextual factors on match running. Higher speeds were generally associated with higher fitness, and were observed earlier in the match and whilst defending. All 15 studies utilised a univariate approach to quantify associations of a contextual factor. The inconsistencies found in the associations of given contextual factors highlight the complex and multi-faceted nature of match running. Therefore, practitioners should consider contextual factors when analysing and interpreting GPS data.
The purpose of this study was to investigate the neuromuscular and perceptual fatigue responses of elite rugby players during the inaugural Under-18 (U18) Six Nations Festival. One hundred and thirty-three male players from five national squads (73 forwards, 60 backs) were examined during the competition. Each national squad was involved in three matches separated by 96 h each. Over the competition, players completed a daily questionnaire to monitor perceived well-being (WB) and performed daily countermovement jumps (CMJ) to assess neuromuscular function (NMF). Reductions in WB were substantial 24 h after the first and second match in forwards (d=0.77±0.21, p<0.0001; d=0.84±0.22, p< 0.001) and backs (d=0.89±0.22, p <0.0001; d=0.58±0.23, p<0.0001) but reached complete recovery in time for the subsequent match. Reductions in CMJ height were substantial 24 h after the first and second match for forwards (d=0.31±0.15, p=0.001; d=0.25±0.17, p=0.0205) and backs (d=0.40±0.17, p=0.0001; d=0.28±0.17, p=0.0062) and recovered at 48 h after match-play. Average WB and CMJ height attained complete recovery within matchday cycles in the investigated international competition. The findings of this study can be useful for practitioners and governing bodies involved with fixture scheduling and training prescription during competitive periods.
Background Head acceleration events (HAEs) are an increasing concern in collision sports owing to potential negative health outcomes. Objectives The objective of this study is to describe the probabilities of HAEs in tackles of differing heights and body positions in elite men’s and women’s rugby union. Methods Instrumented mouthguards (iMGs) were worn in men’s (n = 24 teams, 508 players, 782 observations) and women’s (n = 26 teams, 350 players, 1080 observations) rugby union matches. Tackle height (i.e. point of contact on ball-carrier) and body positions of tacklers and ball-carriers were labelled for all tackles in which a player wore an iMG. HAEs from the initial impact were identified. Mean player, tackler and ball-carrier exceedance probabilities for various peak linear and angular acceleration thresholds were estimated from ordinal mixed-effects models. Results Contact with the ball-carrier's head/neck resulted in the highest mean HAE probabilities for both sexes. The probability of an HAE to the ball-carrier decreased as tackle height lowered. The highest probability for the tackler was initial contact to the ball-carrier's upper leg. Body position influenced the probability of HAEs, with falling/diving ball-carriers resulting in higher mean probabilities. When a player, regardless of role, was bent-at-waist, elevated HAE probabilities were observed in men’s competitions. Women’s data demonstrated similar probabilities of an HAE for all body positions. Conclusions Initial contact to the ball-carrier’s head/neck had the highest chance of an HAE, whilst role-specific differences are apparent for different tackle heights and body positions. Future player-welfare strategies targeting contact events should therefore consider HAE mechanisms along with current literature.
This study aimed to quantify and compare mean head acceleration event (HAE) incidence within and between men's and women's rugby union competitions; quantify the incidence of HAEs during all contact‐events and describe individual player incidence. Players competing during the 2022/2023 season in women's (337 players; Premiership Women's Rugby, Farah Palmer Cup) and men's (371 players; Premiership Rugby, Currie Cup and Super Rugby) competitions wore instrumented mouthguards (iMGs). Mean HAE incidences using peak linear (PLA) and peak angular acceleration (PAA) were quantified by sex, positional groups and individual players per competition and for contact‐events across a range of magnitude thresholds. Within positional groups, there was high between‐player variability, with some players experiencing up to a 3‐fold greater mean HAE incidence than their positional average. Per full‐game equivalent (FGE), men had significantly higher HAE incidences in most positional groups and HAE magnitude thresholds compared to women ranging from approximately 0.11–3.44 HAEs per FGE. Incidence of HAEs (PLA > 25 g) per FGE was lowest in scrums (0.00–0.04/FGE) and highest for tackles and ball carries (0.21–1.97/FGE) in both women and men, whereas mauling was a frequent source of HAEs for men's back row (0.95/FGE). No significant differences were observed between competitions for most positional groups and HAE magnitude thresholds in both men and women. Per FGE, HAE incidences were similar within, but significant differences were apparent between men's and women's players. The scrum had the lowest HAE incidence of all contact‐events. Individual players can show large variation from the mean, emphasising the importance of HAE mitigation strategies that include individual player monitoring and management processes.
Purpose: The aim of this research was to investigate the convergent validity, reliability and sensitivity over a week of training of a standardized running test to measure neuromuscular fatigue. Methods: Twenty male rugby union players were recruited for the study, which took place during preseason. The standardized running test consisted of four 60 m runs paced at ≈5 m·s⁻¹ with 33 seconds of recovery between trials. Data from microelectromechanical systems (MEMS) were used to calculate a running load index (RLI), which was a ratio between the mechanical load and the speed performed during runs. RLI was calculated by using either the entire duration of the run or a constant-velocity period. For each type of calculation, either an individual directional component or the sum of the three components of the accelerometer was used. A measure of leg stiffness was used to assess the convergent validity of the RLI. Results: Unclear to large relationships between leg stiffness and RLI were found (r ranged from -0.20 to 0.62). Regarding reliability, small to moderate (0.47 to 0.86) standardized typical errors were found. The sensitivity analysis showed that leg stiffness presented a very likely trivial change over the course of one week of training, while RLI showed very likely small to most likely large changes. Conclusion: This study showed that RLI is a practical method to measure neuromuscular fatigue. Additionally, such a methodology aligns with the constraints of the elite team-sport setting owing to its ease of implementation in practice.
The Quality, Quantity, and Intraindividual Variability of Sleep Among Students and Student-Athletes
© 2019 The Author(s). Background: Student-athletes are subject to significant demands due to their concurrent sporting and academic commitments, which may affect their sleep. This study aimed to compare the self-reported sleep quality, quantity, and intraindividual variability (IIV) of students and student-athletes through an online survey. Hypothesis: Student-athletes will have a poorer sleep quality and quantity and experience more IIV. Study Design: Case-control study. Level of Evidence: Level 4. Methods: Sleep quality was assessed using the Pittsburgh Sleep Quality Index (PSQI), while sleep quantity and IIV were assessed using the Consensus Sleep Diary. Initially, the PSQI and additional questions regarding sport participation habits were completed by 138 participants (65 students, 73 student-athletes). From within this sample, 44 participants were recruited to complete the sleep diary for a period of 14 days. Results: The mean PSQI score was 6.89 ± 3.03, with 65% of the sample identified as poor sleepers, but no difference was observed between students and student-athletes. Analysis of sleep patterns showed only possibly to likely small differences in sleep schedule, sleep onset latency, and subjective sleep quality between groups. IIV analysis showed likely moderate to possibly small differences between groups, suggesting more variable sleep patterns among student-athletes. Conclusion: This study highlights that sleep issues are prevalent within the university student population and that student-athletes may be at greater risk due to more variable sleep patterns. Clinical Relevance: University coaches should consider these results to optimize sleep habits of their student-athletes.
Objectives Identify the frequency, propensity, and factors related to tackle events which result in contact with the head in elite-level women's rugby league. Design Prospective video analysis study. Methods Video footage from 59 Women's Super League matches was analysed (n = 14,378 tackle events). All tackle events were coded as no head contact or head contact. Other independent variables included: area contacting head, impacted player, concussion outcome, penalty outcome, round of competition, time in match and team standard. Results There were 83.0 ± 20.0 (propensity 304.0/1000 tackle events) head contacts per match. The propensity of head contact was significantly greater for the tackler than the ball-carrier (178.5 vs. 125.7/1000 tackle events; incidence rate ratio 1.42, 95 % confidence interval 1.34 to 1.50). Head contacts from an arm, shoulder or head occurred significantly more often than from any other contact type. The propensity of concussions was 2.7/1000 head contacts. There was no significant influence of team standard or time in match on the propensity of head contacts. Conclusions The observed head contacts can inform interventions, primarily focusing on the tackler not contacting the ball-carrier's head. The tackler's head should also be appropriately positioned to avoid contact with the ball-carrier's knee (highest propensity for concussion). The findings are consistent with other research in men's rugby. Law modifications and/or enforcement (reducing the number of un-penalised head contacts), concurrent with coaching interventions (optimising head placement or reducing the head being contacted), may help minimise head contact risk factors for women's rugby league.
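The propensity and incidence rate ratio reported above can be reproduced from raw counts. The sketch below uses the standard Wald (log-normal) confidence interval; the event counts are back-calculated for illustration and are an assumption, not the study's raw data.

```python
import math

def propensity(events, exposures, per=1000):
    """Events per 1000 tackle events."""
    return per * events / exposures

def irr_with_ci(events_a, exp_a, events_b, exp_b, z=1.96):
    """Incidence rate ratio with a Wald (log-normal) 95% CI -- a standard
    approximation, assumed rather than quoted from the paper's methods."""
    irr = (events_a / exp_a) / (events_b / exp_b)
    se = math.sqrt(1 / events_a + 1 / events_b)
    return (irr,
            math.exp(math.log(irr) - z * se),
            math.exp(math.log(irr) + z * se))

# Illustrative counts chosen to match the reported propensities
tackler_contacts, carrier_contacts, n_events = 2566, 1807, 14378
irr, lo, hi = irr_with_ci(tackler_contacts, n_events, carrier_contacts, n_events)
```

With these counts the tackler propensity comes out near 178.5/1000 and the IRR near 1.42, consistent with the figures in the abstract.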
This study quantified and compared the movement characteristics of elite domestic and international netball match-play, including fifteen individual players who competed at both levels. Microtechnology data were collected across 75 matches in a league-wide study from players (n = 113) competing in the Netball Superleague (elite domestic) and from international players (n = 23) in 22 international matches. Players were categorised according to the seven playing positions. Accelerometer-derived variables were analysed per whole-match and per quarter, for both absolute (i.e., volume) and relative to duration (i.e., intensity [per minute]) values. The median playing duration ranged across positions from 23.6 to 42.4 minutes at international and 31.6 to 48.1 minutes at domestic level. Relative variables were greater in international matches than in elite domestic competition across all positions. Moderate to large effect sizes (1.00–1.50) were found between playing levels for PlayerLoadTM per minute (AU·min-1). Significant decreases in both absolute and relative variables were observed across quarters for both competition levels. The movement characteristics are position dependent, with greater absolute characteristics at domestic level across whole-match analysis, but greater relative characteristics at international level. These findings provide practitioners with information to guide training prescription, return-to-play protocols, and transitioning athletes between levels of competition.
This study aimed to establish the validity and reliability of the prone Yo-YoIRL1 in elite female rugby league players (part one) and determine the anthropometric and physical characteristics contributing to 15m prone Yo-YoIRL1 performance (part two). In part one, 21 subjects completed one Yo-YoIRL1, one 20m and two 15m prone Yo-YoIRL1 tests over four sessions, with 7–14 days in-between. In part two, ten subjects completed a testing battery, including body mass, height, dual-energy x-ray absorptiometry, isometric mid-thigh pull (IMTP), isometric bench-press (IBP), 10m and 20m sprints and an incremental treadmill test. The 15m prone Yo-YoIRL1 demonstrated poor reliability with a typical error of 68m (21%) and a smallest worthwhile change of 54m (9%). Validity analysis found the prone versions of the Yo-YoIRL1 were not sensitive measures of intermittent running performance. Both prone Yo-YoIRL1 test distances demonstrated large mean bias (76% and -37%, respectively) and typical error of the estimate (19% and 21%, respectively) in comparison to the Yo-YoIRL1. Body mass (r = -0.89), lean mass (r = -0.64), body fat % (r = -0.68), (l∙min-1) (r = -0.64), IMTP (r = -0.69), IBP (r = -0.15), 10m (r = -0.77) and 20m (r = -0.72) momentum displayed large negative relationships with 15m prone Yo-YoIRL1 performance. Due to the poor validity of the 20m prone Yo-YoIRL1, the poor validity and reliability of the 15m prone Yo-YoIRL1, and the anthropometric and physical characteristics which negatively impact performance, practitioners should reconsider the use of the prone Yo-YoIRL1 test to monitor high intensity intermittent running performance.
Participation in women’s rugby league has been growing since the foundation of the English women’s rugby league Super League in 2017. However, the evidence base to inform women’s rugby league remains sparse. This study provides the largest quantification of anthropometric and physical qualities of women’s rugby league players to date, identifying differences between positions (forwards & backs) and playing standards (Women’s Super League [WSL] vs. International). The height, weight, body composition, lower body strength, jump height, speed and aerobic capacity of 207 players were quantified during the pre-season period. Linear mixed models and effect sizes were used to determine differences between positions and standards. Forwards were significantly (p < 0.05) heavier (forwards: 82.5 ± 14.8kg; backs: 67.7 ± 9.2kg) and had a greater body fat % (forwards: 37.7 ± 6.9%; backs: 30.4 ± 6.3%) than backs. Backs had significantly greater lower body power, measured via jump height (forwards: 23.5 ± 4.4cm; backs: 27.6 ± 4.9cm), faster sprint times over 10m (forwards: 2.12 ± 0.14s; backs: 1.98 ± 0.11s), 20m (forwards: 3.71 ± 0.27s; backs: 3.46 ± 0.20s), 30m (forwards: 5.29 ± 0.41s; backs: 4.90 ± 0.33s) and 40m (forwards: 6.91 ± 0.61s; backs: 6.33 ± 0.46s), and greater aerobic capacity (forwards: 453.4 ± 258.8m; backs: 665.0 ± 298.2m) than forwards. Additionally, international players were found to have greater anthropometric and physical qualities in comparison to their WSL counterparts. This study adds to the limited evidence base surrounding the anthropometric and physical qualities of elite women’s rugby league players. Comparative values for anthropometric and physical qualities are provided, which practitioners may use to evaluate the strengths and weaknesses of players, informing training programs to prepare players for the demands of women’s rugby league.
Improved race walking performance in a thermally stressful environment following intermittent heat acclimation by a Commonwealth Games Champion
Rugby league (RL) carries a high injury incidence, with 61% of injuries occurring in tackles. The ball carrier has a higher injury incidence than the defender, therefore understanding the mechanisms occurring during injurious tackles is important. Given the dynamic, open nature of tackling, characteristics influencing tackle outcome likely encompass complex networks of dependencies. This study aims to identify important classifying characteristics of the tackle related to ball carrier injurious and non-injurious events in RL and identify the characteristics’ capability to correctly classify those events. Forty-one ball carrier injuries were identified, with 205 matched non-injurious tackles identified as controls. Each case and control was analysed retrospectively through video analysis. Random forest models were built to (1) filter tackle characteristics possessing relative importance for classifying tackles resulting in injurious/non-injurious outcomes and (2) determine sensitivity and specificity of tackle characteristics to classify injurious and non-injurious events. Six characteristics were identified as possessing relative importance for classifying injurious tackles: ‘tackler twisted ball carrier’s legs when legs were planted on ground’, ‘the tackler and ball carrier collide heads’, ‘the tackler used body weight to tackle ball carrier’, ‘the tackler has obvious control of the ball carrier’, ‘the tackler was approaching tackle sub-maximally’ and ‘tackler’s arms were below shoulder level, elbows were flexed’. Sensitivity and specificity of the random forest were 0.995 and 0.525, respectively. The study identified tackle characteristics that can be modified in an attempt to reduce injury. Additional injury data are needed to establish relationship networks of characteristics and analyse specific injuries.
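The random forest workflow above can be sketched as follows. The binary tackle characteristics and labels here are simulated for illustration, so the importances, sensitivity and specificity produced are not the study's results.

```python
# Hypothetical sketch of the case-control workflow: binary tackle
# characteristics -> random forest -> feature importance plus
# sensitivity/specificity. All data below are simulated.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
n = 246                                             # 41 cases + 205 controls
X = rng.integers(0, 2, size=(n, 6)).astype(float)   # six binary characteristics
y = np.array([1] * 41 + [0] * 205)                  # 1 = injurious tackle

model = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
pred = model.predict(X)
tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ranked = np.argsort(model.feature_importances_)[::-1]  # most important first
```

In practice the model would be evaluated on held-out or cross-validated data rather than the training set used here for brevity.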
Limited data exist on the hydration status of female athletes, with no data available on female rugby players. The objective of this study was to investigate the habitual hydration status on arrival, sweat loss, fluid intake, sweat Na loss and blood [Na] during field training and match-play in ten international female rugby league players. Urine osmolality on arrival to match-play (382 ± 302 mOsmol·kg-1) and training (667 ± 260 mOsmol·kg-1) was indicative of euhydration. Players experienced a body mass loss of 0.50 ± 0.45 and 0.56 ± 0.53% during match-play and training, respectively. During match-play, players consumed 1.21 ± 0.43 kg of fluid and had a sweat loss of 1.54 ± 0.48 kg. During training, players consumed 1.07 ± 0.90 kg of fluid, in comparison to 1.25 ± 0.83 kg of sweat loss. Blood [Na] was well regulated (Δ-0.7 ± 3.4 and Δ-0.4 ± 2.6 mmol·L-1) despite sweat [Na] of 47.8 ± 5.7 and 47.2 ± 6.3 mmol·L-1 during match-play and training. The findings of this study show mean blood [Na] appears to be well regulated despite losses of Na in sweat and electrolyte-free fluid consumption. For the duration of the study players did not experience a body mass loss (dehydration >2%) indicative of a reduction in exercise performance, thus habitual hydration strategies appear adequate. Practitioners should evaluate the habitual hydration status of athletes to determine if interventions above habitual strategies are warranted.
When performing resistance training, verbal and visual kinematic feedback are known to enhance performance. Additionally, providing verbal encouragement can assist in the attenuation of fatigue. However, the effects of these forms of feedback have never been compared. Consequently, this study aimed to quantify the effects of verbal and visual kinematic feedback, and verbal encouragement, on barbell velocity during the back squat. Furthermore, changes in performance were related to individual reported conscientiousness. Twelve semi-professional rugby union players volunteered to participate in the study, which required subjects to complete a set of the barbell back squat under four conditions (i.e. no-feedback (control), verbal feedback of kinematic information (verbal), visual feedback of kinematic information (visual), and verbal encouragement (encouragement)). Additionally, participants completed a questionnaire prior to the study to assess conscientiousness. Magnitude-based inferences were used to assess differences between conditions, while Spearman’s rank correlation coefficient was used to assess relationships between conscientiousness and changes in barbell velocity. All three forms of feedback showed almost certain improvements in barbell velocity, while differences between interventions were likely to very likely trivial. Changes in barbell velocity showed small to large inverse relationships with conscientiousness. These findings suggest that practitioners should supply kinematic feedback (verbally or visually) or, when technology is not available, provide athletes with encouraging statements while resistance training. Verbal encouragement may be of greatest benefit for individuals who demonstrate low levels of conscientiousness. Given these findings, practitioners are advised to use either technology or verbal encouragement to manipulate acute training outcomes.
Validity of 10 Hz GPS and Timing Gates for Assessing Maximum Velocity in Professional Rugby Union Players
The purpose of this study was to investigate the validity of timing gates and 10 Hz GPS units (Catapult Optimeye S5) against a criterion measure (50 Hz radar gun) for assessing maximum sprint velocity (Vmax). Nine male professional rugby union players performed three maximal 40 m sprints with three minutes rest between each effort, with Vmax assessed simultaneously via timing gates, 10 Hz GPSOpen (Openfield software), GPSSprint (Sprint software) and radar gun. Eight players wore 3 GPS units, while one player wore a single unit during each sprint. When compared to the radar gun, mean bias for GPSOpen, GPSSprint and timing gates was trivial, small and small, respectively. The typical error of the estimate (TEE) was small for timing gates and GPSOpen, while moderate for GPSSprint. Correlations with the radar gun were nearly perfect for all measures. Mean bias, TEE and correlations between GPS units were trivial, small and nearly perfect, respectively, while a small TEE existed when GPSOpen was compared to GPSSprint. Based on these findings, both 10 Hz GPS and timing gates provide valid measures of 40 m Vmax assessment when compared with a radar gun. However, as error did exist between measures, the same testing protocol should be used when assessing 40 m Vmax over time. Furthermore, in light of the above results, it is recommended that when assessing changes in GPS-derived Vmax over time, practitioners should use the same unit for each player and perform the analysis with the same software, preferably Catapult Openfield.
ABSTRACT Purpose: Feedback can enhance acute physical performance. However, the effects of feedback on physical adaptation has received little attention. Therefore, the purpose of this study was to determine the effect of feedback during a four-week training programme on jump, sprint and strength adaptations. Methods: Twenty-eight semi-professional male rugby union players were strength-matched into two groups (feedback and non-feedback).
Objectives Assess the validity and feasibility of current instrumented mouthguards (iMGs) and associated systems. Methods Phase I; four iMG systems (Biocore-Football Research Inc (FRI), HitIQ, ORB, Prevent) were compared against dummy headform laboratory criterion standards (25, 50, 75, 100 g). Phase II; four iMG systems were evaluated for on-field validity of iMG-triggered events against video-verification to determine true-positives, false-positives and false-negatives (20±9 player matches per iMG). Phase III; four iMG systems were evaluated by 18 rugby players, for perceptions of fit, comfort and function. Phase IV; three iMG systems (Biocore-FRI, HitIQ, Prevent) were evaluated for practical feasibility (System Usability Scale (SUS)) by four practitioners. Results Phase I; total concordance correlation coefficients were 0.986, 0.965, 0.525 and 0.984 for Biocore-FRI, HitIQ, ORB and Prevent. Phase II; different on-field kinematics were observed between iMGs. Positive predictive values were 0.98, 0.90, 0.53 and 0.94 for Biocore-FRI, HitIQ, ORB and Prevent. Sensitivity values were 0.51, 0.40, 0.71 and 0.75 for Biocore-FRI, HitIQ, ORB and Prevent. Phase III; player perceptions of fit, comfort and function were 77%, 6/10, 55% for Biocore-FRI, 88%, 8/10, 61% for HitIQ, 65%, 5/10, 43% for ORB and 85%, 8/10, 67% for Prevent. Phase IV; SUS (preparation-management) was 51.3–50.6/100, 71.3–78.8/100 and 83.8–80.0/100 for Biocore-FRI, HitIQ and Prevent. Conclusion This study shows differences between current iMG systems exist. Sporting organisations can use these findings when evaluating which iMG system is most appropriate to monitor head acceleration events in athletes, supporting player welfare initiatives related to concussion and head acceleration exposure.
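The positive predictive values and sensitivities above follow directly from video-verification counts. A minimal sketch with hypothetical counts (not the study's raw data):

```python
def ppv(tp, fp):
    """Share of iMG-triggered events that were video-verified as true."""
    return tp / (tp + fp)

def sensitivity(tp, fn):
    """Share of video-verified head acceleration events the iMG captured."""
    return tp / (tp + fn)

# Hypothetical counts for one iMG system:
# 94 verified triggers, 6 false triggers, 31 missed events
ppv_value = ppv(94, 6)            # high PPV: few false triggers
sens_value = sensitivity(94, 31)  # lower sensitivity: some events missed
```

Note that a system can combine a high PPV with modest sensitivity (few false alarms but missed events), which is why both metrics were reported per device.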
Previous research in academy rugby league players has evaluated the development of physical qualities according to chronological age. However, no study has considered the training age, defined as the number of formalized years of strength and conditioning training, of these players. Therefore, the purpose of this study was to present and compare the annual changes in physical qualities of academy rugby league players according to training age. Sixty-one academy players undertook a fitness testing assessment including anthropometric (height, body mass, sum of four skinfolds) and physical (10 and 20m sprint, 10m momentum, vertical jump, Yo-Yo intermittent recovery test level 1 [Yo-Yo IRTL1], one-repetition maximum [1-RM] squat, bench press and prone row) measures at the start of pre-season on two consecutive annual occasions. Players were categorized into one of three training age groups (i.e., 0, 1 or 2 years) and were analyzed using magnitude-based inferences. Almost certain, very likely or likely annual improvements were identified for body mass, 10m momentum, Yo-Yo IRTL1, vertical jump and all strength measures for the three training age groups. When training age groups were compared, the 1-year group showed possibly or likely lower strength increases than the 0-year group. However, the 2-year training age group demonstrated possibly or likely greater strength changes compared to the 1-year group. These findings suggest that training age is an important consideration for strength and conditioning practitioners, but it is likely to be a combination of chronological age, biological maturity and training experience, alongside dynamic inter-player variability, that influences the physical development of academy rugby league players.
PURPOSE: Prescribing resistance training using velocity loss thresholds can enhance exercise quality by mitigating neuromuscular fatigue. Since little is known regarding performance during these protocols, we aimed to assess the effects of 10%, 20%, and 30% velocity loss thresholds on kinetic, kinematic, and repetition characteristics in the free-weight back squat. METHODS: Using a randomised crossover design, sixteen resistance-trained men were recruited to complete five sets of the barbell back squat. Lifting load corresponded to a mean concentric velocity (MV) of ~0.70 m·s-1 (115 ± 22 kg). Repetitions were performed until a 10%, 20% or 30% MV loss was attained. RESULTS: Set MV and power output were substantially higher in the 10% protocol (0.66 m·s-1 and 1341 W, respectively), followed by the 20% (0.62 m·s-1 and 1246 W) and 30% protocols (0.59 m·s-1 and 1179 W). There were no substantial changes in MV (-0.01 to -0.02 m·s-1) or power output (-14 to -55 W) across the five sets for all protocols, and individual differences in these changes were typically trivial to small. Mean set repetitions were substantially higher in the 30% protocol (7.8), followed by the 20% (6.4) and 10% protocols (4.2). There were small to moderate reductions in repetitions across the five sets during all protocols (-39%, -31%, -19%, respectively) and individual differences in these changes were small to very large. CONCLUSIONS: Velocity-based training prescription maintains kinetic and kinematic output across multiple sets of the back squat, with repetition ranges being highly variable. Our findings therefore challenge traditional repetition-based resistance training paradigms and add support to a velocity-based approach.
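The set-termination velocities implied by these thresholds can be computed directly from the first-repetition MV of ~0.70 m·s-1:

```python
def termination_velocity(first_rep_mv, velocity_loss_pct):
    """Velocity at which a set is terminated for a given velocity-loss
    threshold: the set ends once mean concentric velocity drops below it."""
    return first_rep_mv * (1 - velocity_loss_pct / 100)

# From a first-repetition MV of 0.70 m/s:
thresholds = {vl: round(termination_velocity(0.70, vl), 2) for vl in (10, 20, 30)}
# 10% -> 0.63 m/s, 20% -> 0.56 m/s, 30% -> 0.49 m/s
```

These values match the cut-off velocities used in the protocols described above and in the related study later in this list.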
This study aimed to identify which physical and technical-tactical performance indicators (PI) can classify between levels of rugby league match-play. Data were collected from 46 European Super League (ESL) and 36 under-19 Academy (Academy) level matches over two seasons. Thirty-one ESL players and 41 Academy players participated. Microtechnology units were used to analyse the physical PI and matches were videoed and coded for individual technical-tactical PI, resulting in 157 predictor variables. Data were split into training and testing datasets. Random forests (RF) were built to reduce the dimensionality of the data, identify variables of importance and build classification models. To aid practical interpretation, conditional inference (CI) trees were built. Nine variables were identified as most important for backs, classifying between levels with 83% (RF) and 78% (CI tree) accuracy. The combination of variables with the highest classification rate was PlayerLoad2D, PlayerLoadSLOW per kg body mass and high-speed running distance. Four variables were identified as most important for forwards, classifying with 68% (RF) and 64% (CI tree) accuracy. Defensive play-the-ball losses alone had the highest classification rate for forwards. The identified PI and their unique combinations can be developed during training to aid in progression through the rugby league playing pathway.
Understanding the most demanding passages of European Super League competition can optimise training prescription. We established positional and match half differences in peak relative distances (m·min-1) across durations, and the number of collisions, high-speed- and very-high-speed-distance completed in the peak 10 min period. Moving-averages (10 s, 30 s, 1 min, 5 min, 10 min) of instantaneous speed (m·s-1) were calculated from 25 professional rugby league players during 25 matches via microtechnology. Maximal m·min-1 was taken for each duration for each half. Concurrently, collisions (n), high-speed- (5 to 7 m·s-1; m) and very-high-speed-distance (> 7 m·s-1; m) were coded during each peak 10 min. Mixed-effects models determined differences between positions and halves. Aside from peak 10 s, trivial differences were observed in peak m·min-1 between positions or halves across durations. During peak 10 min periods, adjustables, full- and outside-backs ran more at high-speed and very-high-speed whilst middle- and edge-forwards completed more collisions. Peak m·min-1 is similar between positional groups across a range of durations and is maintained between halves of the match. Practitioners should consider that whilst the overall peak locomotor "intensity" is similar, how they achieve this differs between positions, with forwards also exposed to additional collision bouts.
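The moving-average method above reduces to a rolling window over instantaneous speed. A minimal sketch, assuming 1 Hz speed samples (the study's actual sampling rate and processing may differ):

```python
def peak_m_per_min(speeds_ms, window_s):
    """Highest rolling-average running intensity (m/min) over a window of
    `window_s` seconds, given 1 Hz speed samples in m/s."""
    best = 0.0
    for i in range(len(speeds_ms) - window_s + 1):
        window = speeds_ms[i:i + window_s]
        best = max(best, 60 * sum(window) / window_s)  # m/s -> m/min
    return best

# Toy 70 s speed trace: jogging at 2 m/s with a 10 s burst at 5 m/s
trace = [2.0] * 30 + [5.0] * 10 + [2.0] * 30
peak_10s = peak_m_per_min(trace, 10)
```

Shorter windows isolate the burst (here 300 m/min over 10 s), while longer windows dilute it, which is why peak values fall as the moving-average duration grows.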
To alleviate issues arising from the over/under prescription of training load, coaches must ensure that desired athlete responses to training are being achieved. The present study aimed to assess the level of agreement between the coach intended (pre-session) and observed (post-session) rating of perceived exertion (RPE), with athlete RPE during different training intensities (easy, moderate, hard). Coach intended RPE was taken prior to all field based training sessions over an 8 week in-season period. Following training, all coaches and athletes, who were participants in hockey, netball, rugby and soccer, were asked to provide an RPE measure for the completed session. Sessions were then classified based on the coaches' intended RPE, with a total of 28, 125 and 66 easy, moderate and hard training sessions collected, respectively. A univariate analysis of variance was used to calculate within-participant correlations between coach intended/observed RPE and athlete RPE. Moderate correlations were found between coach intended and athlete RPE for sessions intended to be moderate and hard, whilst a small correlation was found for sessions intended to be easy. The level of agreement between coach and athlete RPE improved following training, with coaches altering their RPE to align with those of the athlete. Despite this, moderate and small differences between coach observed and athlete RPE persisted for sessions intended to be easy and moderate, respectively. Coaches should therefore incorporate strategies to monitor training load to increase the accuracy of training periodisation and reduce potential over/under prescription of training.
Identification of pattern mining algorithm for rugby league players positional groups separation based on movement patterns
The application of pattern mining algorithms to extract movement patterns from sports big data can improve training specificity by facilitating a more granular evaluation of movement. As there are various pattern mining algorithms, this study aimed to determine which algorithm discovers the best set of movement patterns for player movement profiling in professional rugby league and the similarity in extracted movement patterns between the algorithms. Three pattern mining algorithms (l-length Closed Contiguous [LCCspm], Longest Common Subsequence [LCS] and AprioriClose) were used to profile elite rugby football league hookers (n = 22 players) and wingers (n = 28 players) match-game movements across 319 matches. Machine learning classification algorithms were used to identify which algorithm gives the best set of movement patterns to separate playing positions, with the Jaccard similarity score identifying the extent of similarity between the algorithms' movement patterns. LCCspm and LCS movement patterns shared a 0.19 Jaccard similarity score. AprioriClose movement patterns shared no significant similarity with LCCspm and LCS patterns. The closed contiguous movement patterns profiled by LCCspm best separated players into playing positions. A Multi-layered Perceptron algorithm achieved the highest accuracy of 91.02%, with precision, recall and F1 scores of 0.91. Therefore, we recommend the extraction of closed contiguous (consecutive) over non-consecutive movement patterns for separating groups of players.
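The Jaccard similarity used to compare the algorithms' pattern sets is simply intersection over union. A sketch with toy pattern labels (invented, not the study's extracted patterns):

```python
def jaccard(patterns_a, patterns_b):
    """Jaccard similarity between two sets of movement patterns:
    |intersection| / |union|."""
    a, b = set(patterns_a), set(patterns_b)
    return len(a & b) / len(a | b)

# Toy movement-pattern labels (hypothetical)
lccspm_patterns = {"walk-jog", "jog-stride", "stride-sprint"}
lcs_patterns = {"walk-jog", "jog-stride", "sprint-decel", "jog-walk"}
score = jaccard(lccspm_patterns, lcs_patterns)  # 2 shared of 5 total
```

A score near 0 (as between AprioriClose and the other two algorithms) means the algorithms surface almost entirely different pattern sets from the same data.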
PURPOSE: The study aimed to identify which combination of external and internal training load (TL) metrics capture similar or unique information for individual professional players during skills training in rugby union using principal component analysis (PCA). METHOD: TL data were collected from twenty-one male professional rugby union players across a competitive season. This included PlayerLoad™, total distance (TD), and individualised high-speed distance (HSD; >61% maximal velocity; all external TL) obtained from a micro-technology device worn by each player (Optimeye X4, Catapult Innovations, Melbourne, Australia) and the session-rating of perceived exertion (sRPE; internal TL). PCA was conducted on each individual to extract the underlying combinations of the four TL measures that best describe the total information (variance) provided by the measures. TL measures with PC "loadings" (PCL) above 0.7 were deemed to possess well-defined relationships with the extracted PC. RESULTS: The findings show that from the four TL measures, the majority of an individual's TL information (1st PC: 55 to 70%) during skills training can be explained by either sRPE (PCL: 0.72 to 0.95), TD (PCL: 0.86 to 0.98) or PlayerLoad™ (PCL: 0.71 to 0.98). HSD was the only variable to relate to the 2nd PC (PCL: 0.72 to 1.00), which captured additional TL information (+19 to 28%). CONCLUSIONS: Findings suggest practitioners could quantify the TL of professional rugby union skills training with one of PlayerLoad™, TD, or sRPE, plus HSD, while limiting the amount of TL information omitted.
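An eigendecomposition-based PCA on z-scored TL measures illustrates how loadings above 0.7 emerge when three measures share a common "volume" signal while HSD is largely independent. The simulated data below are an assumption for illustration only, not the players' data.

```python
# Hypothetical PCA of four training-load measures for one player.
import numpy as np

rng = np.random.default_rng(1)
base = rng.normal(size=200)                  # shared "volume" signal
data = np.column_stack([
    base + 0.3 * rng.normal(size=200),       # sRPE
    base + 0.3 * rng.normal(size=200),       # total distance
    base + 0.3 * rng.normal(size=200),       # PlayerLoad
    rng.normal(size=200),                    # high-speed distance (independent)
])
data = (data - data.mean(axis=0)) / data.std(axis=0)   # z-score each measure
cov = np.cov(data, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
loadings = eigvecs[:, order] * np.sqrt(eigvals[order])  # component loadings
explained = eigvals[order] / eigvals.sum()              # variance per PC
```

With this structure, the first three measures load heavily (|PCL| > 0.7) on PC1 while HSD does not, mirroring the pattern reported above.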
Understanding the locomotor characteristics of competition can help rugby league (RL) coaches optimise training prescription. To date, no research exists on the locomotor characteristics of women’s RL. The aim was to compare whole match and peak locomotor characteristics of women’s RL competition at international (RL World Cup [WRLWC]) and domestic level (Super League [WSL]). Microtechnology data were collected from 58 players from three WSL clubs and one WRLWC team. Participants were classified into forwards (n = 30) and backs (n = 28). Partial least squares correlation analysis established which variables were important to discriminate between the level of competition (international vs. domestic) and positional group (forwards vs. backs). Linear mixed-effects models estimated the differences between standards of competition and positional group for those variables. International forwards were most likely exposed to greater peak 1-min average acceleration (standardised mean difference = 1.23 [0.42 to 2.04]) and peak 3-min average acceleration (1.13 [0.41 to 1.85]) than domestic forwards. International backs likely completed greater peak 1-min average acceleration (0.83 [0.08 to 1.58]) than domestic backs and possibly greater high-speed-distances (0.45 [−0.17 to 1.07]). Findings highlight the need for position-specific training across levels to prepare representative players for the increased match characteristics of international competition.
This study aimed to determine the similarity between and within positions in professional rugby league in terms of technical performance and match displacement. Here, the analyses were repeated on 3 different datasets which consisted of technical features only, displacement features only, and a combined dataset including both. Each dataset contained 7617 observations from the 2018 and 2019 Super League seasons, including 366 players from 11 teams. For each dataset, feature selection was initially used to rank features regarding their importance for predicting a player's position for each match. Subsets of 12, 11, and 27 features were retained for technical, displacement, and combined datasets for subsequent analyses. Hierarchical cluster analyses were then carried out on the positional means to find logical groupings. For the technical dataset, 3 clusters were found: (1) props, loose forwards, second-row, hooker; (2) halves; (3) wings, centres, fullback. For displacement, 4 clusters were found: (1) second-rows, halves; (2) wings, centres; (3) fullback; (4) props, loose forward, hooker. For the combined dataset, 3 clusters were found: (1) halves, fullback; (2) wings and centres; (3) props, loose forward, hooker, second-rows. These positional clusters can be used to standardise positional groups in research investigating either technical, displacement, or both constructs within rugby league.
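The clustering step above can be sketched with SciPy: hierarchical (Ward) clustering of per-position feature means, cut to a fixed number of clusters. The per-position values below are invented placeholders, not the study's retained features.

```python
# Sketch of the positional clustering step with invented feature means.
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

positions = ["prop", "hooker", "half", "wing", "centre", "fullback"]
means = [  # hypothetical [carries per match, metres per match] per position
    [14.0, 110.0], [11.0, 95.0], [5.0, 60.0],
    [9.0, 150.0], [8.0, 140.0], [10.0, 170.0],
]
Z = linkage(pdist(means), method="ward")        # Ward hierarchical clustering
labels = fcluster(Z, t=3, criterion="maxclust")  # cut tree into 3 clusters
clusters = {pos: int(lab) for pos, lab in zip(positions, labels)}
```

Positions whose feature means sit close together in this space fall into the same cluster, which is how the positional groupings above were derived.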
Recognising and removing players with suspected sport-related concussions is crucial for community sports. Objectives Quantify rates and factors associated with non-reporting of concussion symptoms in community rugby league. Methods Overall, 484 community rugby league players aged ≥18 years and 965 parents of rugby league players aged <18 years completed an online survey regarding concussion history, knowledge, prevalence and reasons for non-reporting of concussion, long-term implications and perceptions of concussion. Results Thirty-five percent of players aged ≥18 years and 22% of parents of players aged <18 years reported at least one concussion in the last two seasons. Forty-three percent of players aged ≥18 years and 5% of parents of players aged <18 years surveyed stated they did not report concussion-related symptoms sustained during the 2020 and 2021 seasons. The two most common reasons for non-reporting of concussion symptoms were ‘didn’t want to be ruled out of a match’ and ‘didn’t want to let down the team’. Players aged ≥18 years who received external coaching pressures around concussion were more likely to not report concussion symptoms. Over 40% of parents and players were concerned about the potential long-term implications. Ten percent of players aged ≥18 years and 7% of parents of players aged <18 years would encourage their family members/children to not play rugby league. Conclusions Non-reporting rates of suspected concussion symptoms in adult community players were twice as high as in professional rugby league, with similar reasons (wanting to play and not letting the team down). Engaging coaches to prioritise brain health and providing broader and appropriate education on concussion should be focused on, given the concerns reported by community players and parents.
The measurement, analysis, and reporting of physical qualities within sport is vital for practitioners to support athlete development. However, several challenges (e.g., establishing comparative data, managing large data sets) complicate this process within sport. This article presents 7 challenges associated with physical testing in sport and offers solutions to overcome them. These solutions are supported by a description of the Profiling Physical Qualities (ProPQ) tool. The ProPQ tool uses advanced data analysis, visualization, and interactive elements to enhance stakeholders' use of data to optimize player development and coaching practices. The ProPQ tool is currently used across rugby league in England.
Introduction: This study compared the effects of 10, 20, and 30% velocity loss (VL) thresholds on differential ratings of perceived exertion (dRPE), lactate, and countermovement jump height (CMJ) during, immediately post-, and 24 hours post-five sets of the barbell back-squat. Methods: In a randomised-crossover design, 15 resistance-trained males completed five sets of the back-squat with an initial mean concentric velocity of 0.70±0.01 m·s-1 and a set termination threshold of either 10% (0.63 m·s-1), 20% (0.56 m·s-1), or 30% (0.49 m·s-1) VL. External load was manipulated throughout each session to ensure the first repetition of sets 2-5 was 0.70±0.06 m·s-1. Participants provided fingertip lactate at the completion of each set, while CMJ was collected pre-, post-, and 24 hours post-exercise. dRPE for the legs and lungs was provided at the completion of the 5th set. Three minutes of rest was provided between sets, while barbell velocity was assessed during exercise to guide set termination. Results: Peak lactate responses in the 30% condition were likely (effect size ±90% confidence interval: 1.45±2.29) and almost certainly (4.56±1.66) greater when compared to the 20% and 10%, respectively. In the 10, 20, and 30% conditions, CMJ height was reduced by 11.3% (±2.4), 14.0% (±3.3), and 20.0% (±3.4) immediately post-exercise. Additionally, dRPE (mean (±SD)) of the legs and lungs were, 10%: 27±12 and 20±9; 20%: 53±16 and 50±17; and 30%: 65±18 and 65±17. At 24 hours post-training, CMJ performance was, 10%: +0.7% (±2.4); 20%: -0.6% (±2.0); and 30%: -2.7% (±2.7). Conclusion: Different VL thresholds during the back-squat cause varying perceptual, metabolic, and neuromuscular responses. The use of 30% VL thresholds can cause substantially greater metabolic responses and potentially attenuate neuromuscular function at 24 hours post-training. Alternatively, a 10% VL can mitigate perceived exertion and changes in metabolic responses.
These findings should be considered during the planning of velocity-based resistance training programmes.
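The set-termination velocities in the abstract follow directly from the initial velocity and the velocity-loss percentage. A minimal sketch of that arithmetic (the helper name is ours, not the study's):

```python
def vl_threshold(initial_velocity, velocity_loss_pct):
    """Set-termination velocity (m·s-1) for a given velocity-loss percentage."""
    return round(initial_velocity * (1 - velocity_loss_pct / 100), 2)

# reproduces the thresholds reported for a 0.70 m·s-1 initial mean velocity:
# a set ends once mean concentric velocity drops below the threshold
thresholds = {vl: vl_threshold(0.70, vl) for vl in (10, 20, 30)}
```

This yields the 0.63, 0.56 and 0.49 m·s-1 cut-offs used for the 10, 20 and 30% VL conditions.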
Objectives: To quantify the incidence of concussion and compare between playing levels in male rugby league. Design: Retrospective cohort. Methods: Between 2016 and 2022, medically diagnosed concussions in Super League, Championship, and Academy competitions were reported to the Rugby Football League via club medical staff. Anonymised data were analysed using generalized linear mixed-effects models by season, month, and between competitions. Results: Overall, 1,403 concussions were identified from 104,209 player-match hours. Concussion incidence for Super League, Championship, and Academy was 15.5, 10.5, and 14.3 per 1,000 player-match hours, respectively. Championship concussion incidence was significantly lower than the Super League (p<0.001) and Academy (p<0.001). No significant differences were identified between years for Super League (range: 13.3 to 18.8 per 1,000 player-match hours) and Championship (range: 8.4 to 12.1 per 1,000 player-match hours). In Academy (range: 9.6 to 20.5 per 1,000 player-match hours), concussion incidence was significantly greater in 2021 compared to earlier years (2016, p=0.01 and 2017, p=0.03). No significant differences were identified between months for any competition. Conclusions: The incidence of concussion is greater in Super League and Academy compared to the Championship. Academy concussion incidence has increased over time. Different factors between and within competitions, such as changes to medical standards and knowledge, could have influenced the identification and diagnosis of concussion.
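The incidence figures above are simple exposure-normalised rates. A one-line helper shows the calculation, using the overall counts from the abstract:

```python
def incidence_per_1000_hours(events, exposure_hours):
    """Injury incidence rate per 1,000 player-match hours."""
    return 1000 * events / exposure_hours

# overall figures from the abstract: 1,403 concussions over 104,209 hours
overall = incidence_per_1000_hours(1403, 104209)  # ≈ 13.5 per 1,000 hours
```

The per-competition rates (15.5, 10.5, 14.3) come from the same formula applied to each competition's own counts and exposure.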
Objectives The aim of this study was to describe the incidence and magnitude of head acceleration events (HAEs) during elite men’s and women’s rugby union training for different contact training levels and drill types. Method Data were collected during the 2022–23 and 2023–24 seasons from 203 men and 125 women from 13 clubs using instrumented mouthguards (iMGs) during in-season training. One author reviewed the training videos to identify the contact level and drill type. HAE incidence was calculated per player minute. Results For men’s forwards and backs, only 4.7% and 5.8% of HAEs were ≥ 25 g and ≥ 1.5 Krad/s2, and 3.4% and 4.4% for women’s forwards and backs, respectively. The incidence of ≥ 5 g and ≥ 0.4 Krad/s2 was highest during full-contact training for men’s forwards (0.20/min) and backs (0.16/min) and women’s forwards (0.10/min). HAE incidence was 2–3 times higher during repetition-based compared with game-based training drills for men’s forwards (0.25/min vs 0.09/min) and backs (0.22/min vs 0.09/min) and women’s forwards (0.09/min vs 0.04/min) and backs (0.08/min vs 0.03/min). HAE incidences were halved when repetition-based training drills used pads compared with no pads for men’s forwards (0.21/min vs 0.44/min) and backs (0.17/min vs 0.30/min), and women’s forwards (0.06/min vs 0.14/min) and backs (0.06/min vs 0.10/min). Conclusion The average HAE incidence (~ 13–20% of weekly HAEs) and magnitude during an in-season training week is very low compared with matches. Opportunities to materially reduce HAE exposure in training are likely more limited than previously assumed. Future research on HAE load and injury, and understanding players’ specific weekly training exposure, may inform effective individual player management.
This study aimed to quantify contact-events and associated head acceleration event (HAE) probabilities in semi-elite women's rugby union. Instrumented mouthguards (iMGs) were worn by players competing in the 2023 Farah Palmer Cup season (13 teams, 217 players) during 441 player-matches. Maximum peak linear acceleration (PLA) and peak angular acceleration (PAA) per-event were used as estimates of in vivo HAE (HAEmax), linked to video analysis-derived contact-events and analysed using mixed-effects regression. Back-rows had the highest number of contact-events per full-match (44.1 [41.2 to 47.1]). No differences were apparent between front-five and centres, or between half-backs and outside-backs. The probability of higher HAEmax occurring was greatest in ball-carries, followed by tackles, defensive rucks and attacking rucks. Probability profiles were similar between positions but the difference in contact-events for each position influenced HAEmax exposure. Overall, most HAEmax were relatively low. For example, the probability of a back-row experiencing a PLA HAEmax ≥25g was 0.045 (0.037-0.054) for ball carries (1 in every 22 carries), translating to 1 in every 2.3 full games. This study presents the first in-depth analysis of contact-events and associated HAEmax in semi-elite women's rugby union. The HAEmax profiles during contact-events can help inform both policy and research into injury mitigation strategies.
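The "1 in every N events" figures quoted above are reciprocals of the per-event probabilities; for example:

```python
def one_in_every(probability):
    """Convert a per-event probability into a '1 in every N events' frequency."""
    return 1 / probability

# per the abstract: probability of a back-row ball-carry producing
# a PLA HAEmax ≥25g was 0.045
carries = one_in_every(0.045)  # roughly 1 in every 22 carries
```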
Objective To (1) systematically review the literature to identify which match-related risk factors and mechanisms of rugby tackle events result in musculoskeletal injury; concussion; head injury assessments; and head impacts or head accelerations; and (2) identify the perceived importance and feasibility of potential intervention strategies for tackle-related injury reduction in the rugby codes. Design A systematic search was performed using PRISMA guidelines. Risk factors/mechanisms associated with tackle injuries across the rugby codes were extracted. After extraction, 50 international rugby experts participated in a Delphi poll. Via content analysis, expert-recommended risk factors/mechanisms were developed. In round two, experts rated all risk factors and mechanisms for importance to injury risk. In round three, the feasibility of law changes, coach and player education and training as interventions to reduce injury risk for each injury risk factor/mechanism deemed important during round two were rated. Data sources PubMed [MEDLINE], Scopus, SPORTDiscus [EBSCOhost] and CINAHL. Eligibility criteria Eligible studies included cohort, observational and cross-sectional designs that included male or female rugby union, league, or sevens players. Results Thirty-seven eligible studies were identified, with 138 injury risk factors/mechanisms extracted. Seventy percent of the studies were rated ‘high quality’ and 30% moderate quality. Thirty-eight new risk factors/mechanisms were recommended by the expert group, eight being identified as important and highly feasible for modification by an intervention strategy. ‘The tackler placing their head on the incorrect side of the ball carrier’ was described as the most important mechanism, with ‘training’ and ‘coach/player education’ thought to be highly feasible interventions.
Conclusion Numerous risk factors or mechanisms associated with tackle-related injury appear important and modifiable, helping to guide interventions to reduce injury risk in the rugby tackle.
Different talent development environments (TDEs) exhibit varying training practices within rugby league talent identification and development systems (TIDS), which may influence rates of talent development and the subsequent productivity of each TDE. This study aimed to compare physical qualities and rates of physical development between different rugby league TDEs within the same TIDS, alongside differences between groups of TDEs based on their level of productivity. A sample of 261 youth rugby league players from six academy teams (i.e., TDEs) within the professional TIDS were tested as part of a league-wide fitness testing battery for measures of anthropometrics, strength, power, speed, and cardiovascular fitness. Linear mixed models revealed medium, significant differences in maximum sprint velocity at the beginning of the season (η² = 0.05, p = 0.03) and large, significant differences in the development of prone Yo-Yo IR1 distance over time (η² = 0.14–0.18, p < 0.001) between TDEs. No significant differences between groups of TDEs based on their productivity were found. These findings indicate that variability in the practices of TDEs mostly leads to small or trivial differences in physical qualities and physical development. Differences in physical qualities and physical development do not appear to relate to the productivity of TDEs; therefore, TDEs should focus on holistic development to maximise productivity.
Female sports have recently seen a dramatic rise in participation and professionalism worldwide. Despite progress, the infrastructure and general sport science provisions in many female sports are behind their male counterparts. From a performance perspective, marked differences in physical and physiological characteristics can be seen between the sexes. Although the physical preparation practices used with male athletes are known, there is currently no published literature pertaining exclusively to female athletes. This information would provide invaluable data for researcher and practitioner alike. This survey therefore aimed to examine current practices utilized in the female rugby codes (union, league, and sevens). A questionnaire assessing seasonal physical preparation practices, recovery, monitoring and sport science technology, and unique aspects in female rugby was developed. Thirty-seven physical preparation practitioners (32 males, 5 females) responded to the questionnaire. Most participants (78%) worked with national or regional/state level female athletes. Performance testing was more frequently assessed in the pre- (97%) and in-season (86%) than off-season (23%). Resistance, cardiovascular, sprint and plyometric training, and recovery sessions were all believed to be important to enhancing performance and were implemented by most participants (≥ 89%). Sport science technologies were commonly (54%) utilized to inform current practice. Menstrual cycle phase was monitored by 22% of practitioners. The most frequently reported unique considerations in female rugby codes included psycho-social aspects (41%), the menstrual cycle (22%), and physical differences (22%). Practitioners working in female rugby can use the presented data to inform and develop current practices.
Objectives: In part 1, the objective was to undertake a systematic scoping review of applied sports science and sports medicine in women’s rugby, and in part 2 to develop a consensus statement on future research priorities. Design: In part 1, a systematic search of PubMed (MEDLINE), Scopus and SPORTDiscus (EBSCOhost) was undertaken from the earliest records to January 2021. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) 2020, the PRISMA extension for Scoping Reviews, and the PRISMA extension protocols were followed. In part 2, 31 international experts in women’s rugby (ie, elite players, sports scientists, medical clinicians, sports administrators) participated in a three-round Delphi consensus method. These experts reviewed the findings from part 1 and subsequently provided a list of priority research topics in women’s rugby. Research topics were grouped into expert-based themes and expert-based subthemes via content analysis. Expert-based themes and expert-based subthemes were ranked from very low to very high research priority on a 1–5 Likert scale. Consensus was defined by ≥70% agreement. The median research priority agreement and IQR were calculated for each expert-based theme and subtheme. Data sources: PubMed (MEDLINE), Scopus and SPORTDiscus (EBSCOhost). Eligibility criteria for selecting studies: Studies were eligible for inclusion if they investigated applied sports science or sports medicine in women’s rugby. Results: In part 1, the systematic scoping review identified 123 studies, which were categorised into six sports science and sports medicine evidence-based themes: injury (n=48), physical performance (n=32), match characteristics (n=26), fatigue and recovery (n=6), nutrition (n=6), and psychology (n=5). In part 2, the Delphi method resulted in three expert-based themes achieving consensus on future research priority in women’s rugby: injury (5.0 (1.0)), female health (4.0 (1.0)) and physical performance (4.0 (1.0)). 
Summary/Conclusion: This two-part systematic scoping review and Delphi consensus is the first study to summarise the applied sports science and sports medicine evidence base in women’s rugby and establish future research priorities. The summary tables from part 1 provide valuable reference information for researchers and practitioners. The three expert-based themes that achieved consensus in part 2 (injury, female health and physical performance) provide clear direction and guidance on future research priorities in women’s rugby. The findings of this two-part study facilitate efficient and coordinated use of scientific resources towards high-priority research themes relevant to a wide range of stakeholders in women’s rugby.
Professional collision-sport athletes report uniquely large energy expenditures across the season (1-4), as determined by gold standard assessment of resting metabolic rate (RMR (5)) and total energy expenditure (TEE (6)). Such expenditures are possibly a consequence of strenuous match demands, which repeatedly expose players to substantial exercise- and collision-induced muscle damage (7). Recovery from such large perturbations of homeostasis (8) is likely to be energetically expensive (9), in part determining the distinct in-season energetic demands of professional collision-sport athletes. Aim. Accurately determining the effect of match play on resting metabolism is essential to optimise acute manipulation of energy balance, player recovery and long-term athlete development. Therefore, for the first time, this case report investigated the metabolic cost of a professional young rugby league match.
Women’s rugby (rugby league, rugby union and rugby sevens) has recently grown in participation and professionalisation. There is under-representation of women-only cohorts within applied sport science and medicine research and within the women’s rugby evidence base. The aims of this article are: Part 1: to undertake a systematic-scoping review of the applied sport science and medicine of women’s rugby, and Part 2: to develop a consensus statement on future research priorities. This article will be designed in two parts: Part 1: a systematic-scoping review, and Part 2: a three-round Delphi consensus method. For Part 1, systematic searches of three electronic databases (PubMed (MEDLINE), Scopus, SPORTDiscus (EBSCOhost)) will be performed from the earliest record. These databases will be searched to identify any sport science and medicine themed studies within women’s rugby. The Preferred Reporting Items for Systematic Reviews and Meta-analyses extension for Scoping Reviews will be adhered to. Part 2 involves a three-round Delphi consensus method to identify future research priorities. Identified experts in women’s rugby will be provided with overall findings from Part 1 to inform decision-making. Participants will then be asked to provide a list of research priority areas. Over the three rounds, priority areas achieving consensus (≥70% agreement) will be identified. This study has received institutional ethical approval. When complete, the manuscript will be submitted for publication in a peer-reviewed journal. The findings of this article will have relevance for a wide range of stakeholders in women’s rugby, including policymakers and governing bodies.
Applied sport science and medicine of women’s rugby
Instrumented mouthguards (iMGs) are a novel technology being used within rugby to quantify head acceleration events. Understanding practitioners' perceptions of the barriers and facilitators to their use is important to support implementation and adoption. This study assessed men's and women's rugby union and league iMG managers' perceptions of staff and player interest in the technology, data and barriers to use. Forty‐six iMG managers (men's rugby union and league n = 20 and n = 9 and women's rugby union and league n = 7 and n = 10) completed an 18‐question survey. Perceived interest in data varied across staff roles with medical staff being reported as having the most interest. The iMG devices were perceived as easy to use but uncomfortable. Several uses of data were identified, including medical applications, player monitoring and player welfare. The comfort, size and fit of the iMG were reported as the major barriers to player use. Time constraints and a lack of understanding of data were barriers to engagement with the data. Continued education on how iMG data can be used is required to increase player and staff buy‐in, alongside improving comfort of the devices. Studies undertaken with iMGs investigating player performance and welfare outcomes will make data more useful and increase engagement.
Introduction Research investigating the rugby league (RL) tackle using video analysis has typically drawn its variables from two sources. The exception is King et al (2010), who described characteristics of the RL tackle event such as the number of tacklers and the tackle height of the first tackler. The majority of investigations, however, have either adopted technical variables from rugby union (RU) tackle research (Speranza et al., 2017) or technical criteria from coaching cues (Gabbett, 2008). In doing so, content validity and relevance to RL could be questioned (O’Donoghue, 2014). The aim of this study was to adopt a 5 stage process to determine tackle variables which are valid and reliable for RL research. Method A 5 stage process was undertaken based upon recommendations by O’Donoghue (2014). STAGE 1 involved a synthesis of the literature, examining the phases of the tackle, the variables used to describe the tackle, and the descriptions of those variables within research. A draft variable list was then developed before the start of STAGE 2. To achieve content validity and relevance, STAGE 2 formed an expert group of practitioners to critique the previously formed draft variable list and develop new phases, variables and descriptors. STAGE 3 refined the variable list based upon the practitioner consultation. STAGE 4 established expert group agreement on the refined variable list. Finally, STAGE 5 tested the intra- and inter-rater reliability of the list using Kappa statistics (McHugh, 2012). Results The agreed variable list comprised 6 phases, including defensive start point, pre-contact, initial contact, post-contact and play the ball phases. Within the phases, 66 variables were determined. The intra- and inter-rater reliability testing resulted in at least moderate agreement (>0.7) (McHugh, 2012) for all phases.
Discussion Because the variables possess strong relevance to the RL tackle and demonstrate good levels of reliability, researchers can be confident that the variables within the list are valid for research purposes (O’Donoghue, 2014). In addition, the rigorous 5 stage process used to validate the content of the variable list should be adopted when determining variables for other sports and actions, so that researchers can be confident those variables are valid and can be used consistently for research purposes. Furthermore, the findings show that although there are similarities between the RU and RL tackle, clear differences exist, justifying the need for RL-specific variables in tackle research.
The purpose of this study was to determine the importance of physical qualities for speed and change of direction (CoD) ability in female soccer players. Data were collected on 10 female soccer players who were part of a professional English Women’s Super League team. Player assessments included anthropometric (stature and body mass), body composition (dual-energy X-ray absorptiometry), speed (10 m, 30 m sprint), CoD ability (505 agility), aerobic (Yo-Yo Intermittent Recovery Test), lower-body strength (bilateral knee extensions) and power (countermovement jump [CMJ], squat jump [SJ], 30 cm drop jump [DJ]) measures. The relationships between the variables were evaluated using eigenvector analysis and Pearson correlation analysis. Multiple linear regression revealed that the performance variables (10 and 20 m speed, mean 505, and CoD deficit mean) can be predicted with almost 100% accuracy (i.e. adjusted R2 > 0.999) using various combinations of the predictor variables (DJ height, CMJ height, SJ height, lean body mass). An increase of one standard deviation (SD) in DJ height was associated with reductions of -5.636 and -9.082 SD in 10 m and 20 m sprint times. A one SD increase in CMJ also results in reductions of -3.317 and -0.922 SD, respectively, in mean 505 and CoD deficit mean values. This study provides comparative data for professional English female soccer players that can be used by strength and conditioning coaches when monitoring player development and assessing the effectiveness of training programmes. Findings highlight the importance of developing reactive strength to improve speed and CoD ability in female soccer players.
Aim Exercise appears to cause damage to the epithelial lining of the human gastrointestinal tract and elicit a significant increase in gut permeability. Objective The aim of this review was to determine the effect of an acute bout of exercise on gut damage and permeability outcomes in healthy populations using a meta-analysis. Methods PubMed, The Cochrane Library, as well as MEDLINE, SPORTDiscus and CINAHL via EBSCOhost, were searched through February 2019. Studies were selected that evaluated urinary (ratio of disaccharide/monosaccharide excretion) or plasma markers [intestinal Fatty Acid Binding Protein (i-FABP)] of gut permeability and gut cell damage in response to a single bout of exercise. Results A total of 34 studies were included. A random-effects meta-analysis was performed and showed a large and a moderate effect size for markers of gut damage (i-FABP) (ES 0.81; 95% CI 0.63–0.98; n = 26; p < 0.001) and gut permeability (disaccharide/monosaccharide ratio) (ES 0.70; 95% CI 0.29–1.11; n = 17; p < 0.001), respectively. Exercise performed in hot conditions (> 23 °C) further increased markers of gut damage compared with thermoneutral conditions [ES 1.06 (95% CI 0.88–1.23) vs. 0.66 (95% CI 0.43–0.89); p < 0.001]. Exercise duration did not have any significant effect on gut damage or permeability outcomes. Conclusions These findings demonstrate that a single bout of exercise increases gut damage and gut permeability in healthy participants, with gut damage being exacerbated in hot environments. Further investigation into nutritional strategies to minimise gut damage and permeability after exercise is required. PROSPERO database number (CRD42018086339).
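As an illustration of how per-study effects combine into a pooled effect size with a confidence interval, here is a minimal inverse-variance sketch. It is a fixed-effect simplification with invented per-study numbers, not the random-effects model or the data used in the review:

```python
def pool_effects(effects, variances):
    """Inverse-variance weighted pooled effect size with a 95% CI
    (fixed-effect sketch; the review itself used a random-effects model)."""
    weights = [1 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = (1 / sum(weights)) ** 0.5          # standard error of the pooled effect
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# hypothetical per-study standardised mean differences and their variances
es, ci = pool_effects([0.9, 0.7, 0.85], [0.04, 0.02, 0.05])
```

A random-effects model adds a between-study variance term to each weight, which widens the interval when studies disagree.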
Relationships of dietary intake with age, body mass and body composition in professional adolescent rugby league and rugby union players
Due to the unique energetic demands of professional young collision-sport athletes, accurate assessment of energy balance is required. Consequently, this is the first study to simultaneously investigate the energy intake, expenditure and balance of professional young rugby league players across a pre-season period. The total energy expenditure of six professional young male rugby league players was measured via doubly labelled water over a fourteen-day assessment period. Resting metabolic rate was measured and physical activity level calculated. Dietary intake was reported via Snap-N-Send over a non-consecutive ten-day assessment period, alongside changes in fasted body mass and hydration status. Accordingly, energy balance was inferred. The mean (standard deviation) difference between total energy intake (16.73 (1.32) MJ·day-1) and total energy expenditure (18.36 (3.05) MJ·day-1) measured over the non-consecutive ten-day period was unclear (-1.63 (1.73) MJ·day-1; ES = 0.91 ±1.28; p = 0.221). This corresponded to a most likely trivial decrease in body mass (-0.65 (0.78) kg; ES = 0.04 ±0.03; p = 0.097). Resting metabolic rate and physical activity level across the fourteen-day pre-season period were 11.20 (2.16) MJ·day-1 and 1.7 (0.2), respectively. For the first time, this study utilises gold standard assessment techniques to elucidate the distinctly large energy expenditures of professional young rugby league players across a pre-season period, emphasising a requirement for equally large energy intakes to achieve targeted body mass and composition adaptations. Accordingly, it is imperative that practitioners regularly assess the energy balance of professional young collision-sport athletes to ensure their unique energetic requirements are achieved.
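The inferred energy balance above is simply the difference between measured intake and expenditure; with the abstract's group means:

```python
def energy_balance(intake_mj, expenditure_mj):
    """Daily energy balance in MJ·day-1 (negative = deficit)."""
    return round(intake_mj - expenditure_mj, 2)

balance = energy_balance(16.73, 18.36)  # group means from the abstract
```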
The purpose of the present study was to evaluate the anthropometric and physical characteristics of English academy rugby league players by annual-age category (under 16s-under 20s) and between backs and forwards. Data were collected on 133 academy players over a 6-year period (resulting in a total of 257 assessments). Player assessments comprised anthropometric (height, body mass, sum of 4 skinfolds) and physical (vertical jump, 10- and 20-m sprint, estimated V̇O2max via the yo-yo intermittent recovery test level 1, absolute 1 repetition maximum [1RM], and relative squat, bench press, and prone row) measures. Univariate analysis of variance demonstrated significant (p ≤ 0.05) increases in height, body mass, vertical jump, absolute, and relative strength measures across the 5 annual-age categories (e.g., body mass: under 16s = 75.2 ± 11.1, under 20s = 88.9 ± 8.5 kg; vertical jump: under 16s = 45.7 ± 5.2, under 20s = 52.8 ± 5.4 cm; 1RM bench press: under 16s = 73.9 ± 13.2, under 20s = 114.3 ± 15.3 kg). Independent t-tests identified significant (p ≤ 0.05) differences between backs and forwards for anthropometric (e.g., under 16s body mass: backs = 68.4 ± 8.6, forwards = 80.9 ± 9.7 kg) and physical (e.g., under 19s 20-m sprint: backs = 3.04 ± 0.08, forwards = 3.14 ± 0.12 s; under 18s relative squat: backs = 1.65 ± 0.18, forwards = 1.51 ± 0.17 kg·kg-1) characteristics that were dependent on the age category and measure assessed. Findings highlight that anthropometric and physical characteristics develop across annual-age categories and between backs and forwards in academy rugby league players. These findings provide comparative data for such populations and support the need to monitor player development in junior rugby league players.
OBJECTIVES: Full-contact football-code team sports present a unique environment for illness risk. During training and match-play, players are exposed to high-intensity collisions which may result in skin-on-skin abrasions and transfer of bodily fluids. Understanding the incidence of all illnesses and infections and their impact on time-loss from training and competition is important to improve athlete care within these sports. This review aimed to systematically report, quantify and compare the type, incidence, prevalence and count of illnesses across full-contact football-code team sports. DESIGN/METHODS: A systematic search of Cochrane Library, MEDLINE, SPORTDiscus, PsycINFO and CINAHL electronic databases was performed from inception to October 2019; keywords relating to illness, athletes and epidemiology were used. Studies were excluded if they did not quantify illness or infection, involve elite athletes, investigate full-contact football-code sports, or were review articles. RESULTS: Twenty-eight studies met the eligibility criteria. Five different football codes were reported: American football (n=10), Australian rules football (n=3), rugby league (n=2), rugby sevens (n=3) and rugby union (n=9). One multi-sport study included both American football and rugby union. There is a distinct lack of consensus on illness monitoring methodology. CONCLUSIONS: Full-contact football-code team sport athletes are most commonly affected by respiratory system illnesses. Due to varying monitoring methodologies, illness incidence could only be compared between studies that used matching incidence exposure measures. High-quality illness surveillance data collection is an essential component of effective and targeted illness prevention in athletes.
The original article has been updated to add the missing Electronic Supplemental Material.
Purpose Head acceleration events (HAEs) are a growing concern in contact sports, prompting two rugby governing bodies to mandate instrumented mouthguards (iMGs). This has resulted in an influx of data, imposing financial and time constraints. This study presents two computational methods that leverage a dataset of video-coded match events: cross-correlation synchronisation aligns iMG data to a video recording by providing playback timestamps for each HAE, enabling analysts to locate them in video footage; and post-synchronisation event matching identifies the coded match event (e.g. tackles and ball carries) from a video analysis dataset for each HAE; this process is important for calculating the probability of match events resulting in HAEs. Given the professional context of iMGs in rugby, utilising commercial sources of coded match event datasets may expedite iMG analysis. Methods Accuracy and validity of the methods were assessed via video verification during 60 rugby matches. The accuracy of cross-correlation synchronisation was determined by calculating synchronisation error, whilst the validity of post-synchronisation event matching was evaluated using diagnostic accuracy measures (e.g. positive predictive value [PPV] and sensitivity). Results Cross-correlation synchronisation yielded mean synchronisation errors of 0.61–0.71 s, with all matches synchronised within 3 s of error. Post-synchronisation event matching achieved PPVs of 0.90–0.95 and sensitivity of 0.99–1.00 for identifying the correct match events for HAEs. Conclusion Both methods achieved high accuracy and validity with the data sources used in this study. Implementation depends on the availability of a dataset of video-coded match events; however, integrating commercially available video-coded datasets offers the potential to expedite iMG analysis, improve feedback timeliness, and augment research analysis.
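The cross-correlation synchronisation idea can be illustrated on toy data: slide one event train against the other and keep the offset with the highest overlap. This sketch uses impulse trains standing in for coded match events and iMG triggers; it is an assumption-laden miniature, not the published implementation:

```python
def best_lag(reference, signal, max_lag):
    """Offset (in samples) by which `signal` trails `reference`,
    found by maximising their cross-correlation over candidate lags."""
    def xcorr(lag):
        return sum(reference[i] * signal[i + lag]
                   for i in range(len(reference))
                   if 0 <= i + lag < len(signal))
    return max(range(-max_lag, max_lag + 1), key=xcorr)

# the iMG "clock" runs 3 samples behind the video in this toy example
ref = [0, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 0]
sig = [0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 1]
lag = best_lag(ref, sig, 5)
```

Applying the recovered lag to iMG timestamps yields video playback times for each HAE, which is what lets analysts jump straight to the relevant footage.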
Seasonal Changes in Anthropometric, Fitness and Strength Characteristics within Academy Rugby League Players
Speed, Momentum and Peak Power Characteristics of Academy Rugby League Backs and Forwards by Annual-Age Category
Anthropometric, Fitness and Strength Characteristics of Academy Rugby League Backs and Forwards by Annual-Age Category
The Longitudinal Development of Strength Characteristics within Academy Rugby League Players
Professional rugby league clubs implement training programmes for the development of anthropometric and physical characteristics within an academy programme. However, research that examines seasonal changes in these characteristics is limited. The purpose of this study was to evaluate the seasonal changes in anthropometric and physical characteristics of academy rugby league players by age category (i.e., under 14, 16, 18, 20). Data were collected on 75 players pre- and postseason over a 6-year period (resulting in a total of 195 assessments). Anthropometric (body mass, sum of 4 skinfolds) and physical (10- and 20-m sprint, vertical jump, Yo-Yo intermittent recovery test and 1 repetition maximum squat, bench press, and prone row) measures were collected. The under 14s and 16s showed greater seasonal improvements in body mass (e.g., under 14s = 7.4 ± 4.3% vs. under 20s = 1.2 ± 3.3%) and vertical jump performance than under 18s and under 20s. In contrast, under 18s and under 20s players showed greater seasonal improvements in Yo-Yo performance and 10-m sprint (e.g., under 14s = 1.3 ± 3.9% vs. under 20s = -1.9 ± 1.2%) in comparison to under 14s and under 16s. Seasonal strength improvements were greater for the under 18s compared with under 20s. This study provides comparative data for seasonal changes in anthropometric and physical characteristics within rugby league players aged 13-20 years. Coaches should be aware that seasonal improvements in speed may not exist within younger age categories until changes in body mass stabilize, and should consider monitoring changes in other characteristics (e.g., momentum). Large interplayer variability suggests that player development should be considered on an individual and longitudinal basis.
The purpose of this study was to evaluate the annual and long-term (i.e., 4 years) development of anthropometric and physical characteristics in academy (16-20 years) rugby league players. Players were assessed at the start of preseason over a 6-year period and were required to be assessed on consecutive years to be included in the study (Under 16-17, n = 35; Under 17-18, n = 44; Under 18-19, n = 35; Under 19-20, n = 16). A subset of 15 players was assessed for long-term changes over 4 years (Under 16-19). Anthropometric (height, body mass, sum of 4 skinfolds) and physical (10- and 20-m sprint, 10-m momentum, vertical jump, yo-yo intermittent recovery test level 1, 1 repetition maximum [1RM] squat, bench press, and prone row) assessments were collected. Paired t-tests and repeated measures analysis of variance demonstrated significant annual (e.g., body mass, U16 = 76.4 ± 8.4, U17 = 81.3 ± 8.3 kg; p < 0.001, d = 0.59) and long-term (e.g., vertical jump, Under 16 = 44.1 ± 3.8, Under 19 = 52.1 ± 5.3 cm; p < 0.001, d = 1.74) changes in anthropometric and physical characteristics. Greater percentage changes were identified between the Under 16-17 age categories compared with the other ages (e.g., 1RM squat, U16-17 = 22.5 ± 19.5 vs. U18-19 = 4.8 ± 6.4%). Findings demonstrate the annual and long-term development of anthropometric and physical characteristics in academy rugby league players, establishing that greater changes occur at younger ages upon the commencement of a structured training program within an academy. Coaches should understand the long-term development of physical characteristics and use longitudinal methods for monitoring and evaluating player performance and development.
Sub-optimal calcium, vitamin D and iron intakes are typical in athletes. However, quantification by dietary intake may be erroneous, with biomarkers providing a more accurate assessment. This study aimed to determine the calcium, vitamin D and iron status of 8 junior (i.e., under-18 [U18]; age 15.5 ± 0.5 years; height 180.4 ± 6.7 cm; body mass 81.6 ± 14.3 kg) and 12 senior (i.e., over-18 [O18]; age 19.7 ± 1.8 years; height 184.9 ± 6.9 cm; body mass 97.4 ± 14.4 kg) male rugby union players, and assess their adequacy against reference values. Fasted serum calcium, 25(OH)D and ferritin concentrations were analysed using enzyme-linked immunosorbent assays during the in-season period (March-April). U18 had very likely greater calcium concentrations than O18 (2.40 ± 0.08 vs. 2.25 ± 0.19 mmol.l-1). Differences between U18 and O18 were unclear for 25(OH)D (20.21 ± 11.57 vs. 29.02 ± 33.69 nmol.l-1) and ferritin (59.33 ± 34.61 vs. 85.25 ± 73.53 µg.l-1). Compared to reference values, all U18 had adequate serum calcium concentrations, whereas 33% and 67% of O18 were deficient and adequate, respectively. All U18 and 83% of O18 had severely deficient, deficient or inadequate vitamin D concentrations. Adequate (8%) and optimal (8%) concentrations of vitamin D were observed in O18. All U18 and 75% of O18 had adequate ferritin concentrations. Potential toxicity (17%) and deficient (8%) ferritin concentrations were observed in O18. Vitamin D intake should be increased and multiple measures obtained throughout the season. More research is required on the variation of micronutrient status throughout the season.
Designing and implementing successful dietary intervention is integral to the role of sport nutrition professionals as they attempt to positively change the dietary behaviours of athletes. High-performance sport is a time-pressured environment where immediate results can often supersede pursuit of the most effective evidence-based practice. However, efficacious dietary intervention necessitates comprehensive, systematic and theoretical behavioural design and implementation if the habitual dietary behaviours of athletes are to be positively changed. Therefore, this case study demonstrates how the Behaviour Change Wheel was used to design and implement an effective nutritional intervention within professional rugby league. The eight-step intervention targeted athlete consumption of a high-quality dietary intake of 25.1 MJ each day, to achieve an overall body mass increase of 5 kg across a twelve-week intervention period. The Capability, Opportunity, Motivation-Behaviour model and APEASE criteria were used to identify population-specific intervention functions, policy categories, behaviour change techniques and modes of intervention delivery. The resulting intervention was successful, increasing the average daily energy intake of the athlete to 24.5 MJ, corresponding to a 6.2 kg body mass gain. Despite consuming 0.6 MJ less per day than targeted, secondary outcome measures of diet quality, strength, body composition and immune function all substantially improved, supporting a sufficient energy intake and the overall efficacy of a behavioural approach. Ultimately, the Behaviour Change Wheel provides sport nutrition professionals with an effective and practical step-wise method via which to design and implement effective nutritional interventions for use within high-performance sport.
Good nutrition is essential for the physical development of adolescent athletes; however, data on dietary intakes of adolescent rugby players are lacking. This study quantified and evaluated dietary intake in 87 elite male English academy rugby league (RL) and rugby union (RU) players by age (under-16 (U16) and under-19 (U19) years old) and code (RL and RU). Relationships of intakes with body mass and composition (sum of 8 skinfolds) were also investigated. Using 4-day diet and physical activity diaries, dietary intake was compared to adolescent sports nutrition recommendations and the UK national food guide. Dietary intake did not differ by code, whereas U19s consumed greater energy (3366 ± 658 vs. 2995 ± 774 kcal.day-1), protein (207 ± 49 vs. 150 ± 53 g.day-1) and fluid (4221 ± 1323 vs. 3137 ± 1015 ml.day-1) than U16s. U19s consumed a better quality diet than U16s (greater intakes of fruit and vegetables; 4.4 ± 1.9 vs. 2.8 ± 1.5 servings.day-1; non-dairy proteins; 3.9 ± 1.1 vs. 2.9 ± 1.1 servings.day-1) and lower intakes of fats and sugars (2.0 ± 1.9 vs. 3.6 ± 2.1 servings.day-1). The relationship between protein intake and body mass was moderate (r = 0.46, p < 0.001); other relationships were weak. The findings of this study suggest adolescent rugby players consume adequate dietary intakes in relation to current guidelines for energy, macronutrient and fluid intake. Players should improve the quality of their diet by replacing intakes from the fats and sugars food group with healthier choices, while maintaining current energy and macronutrient intakes.
Dietary intakes differ across age groups in professional adolescent rugby league and rugby union players
Hydration status of rugby league players during home match play throughout the 2008 Super League season.
The hydration status of rugby league players during competitive home match play was assessed throughout the 2008 Super League season. Fourteen players from 2 Super League clubs were monitored (72 observations). On arrival, 2 h prior to kick off, following normal prematch routines, players' body mass was measured following a urine void. Prematch fluid intake, urine output, and osmolality were assessed until kick off, with additional measurements at half time. Fluid intake was also monitored during match play for club B only, and final measurements of variables were made at the end of the match. Mean body mass loss per match was 1.28 ± 0.7 kg (club A, 1.15 kg; club B, 1.40 kg), which would equate to an average level of dehydration of 1.31% (mass loss, assumed to be water loss, expressed as a percentage of body mass), with considerable intra-individual coefficient of variation (CV, 47%). Mean fluid intake for club B was 0.64 ± 0.5 L during match play, while fluid loss was 2.0 ± 0.7 L, with considerable intra-individual CV (51% and 34%, respectively). Mean urine osmolality was 396 ± 252 mosm·kg-1 on arrival, 237 ± 177 mosm·kg-1 prematch, 315 ± 133 mosm·kg-1 at half time, and 489 ± 150 mosm·kg-1 postmatch. Body mass losses were primarily a consequence of body fluid losses not being completely balanced by fluid intake. Furthermore, these data show that there is large inter- and intra-individual variability of hydration across matches, highlighting the need for future assessment of individual relevance.
Purpose: Longitudinal studies assessing the seasonal development of strength, speed and power qualities are limited in youth soccer players. The purpose of this study was to evaluate the seasonal changes in the physical development of elite youth soccer players across Pre-, Circa- and Post-Peak Height Velocity (PHV), against similar age- and maturity-matched control groups. Methods: One hundred and twelve male elite youth soccer players (Pre-PHV n = 55; Circa-PHV n = 21; Post-PHV n = 36) and 38 controls consisting of non-elite active participants (Pre-PHV n = 18; Circa-PHV n = 10; Post-PHV n = 10) all undertook isometric mid-thigh pull strength, 10–30 m sprints, change of direction speed (CODs) and countermovement jump (CMJ) tests pre- and post-season. Results: The elite Circa-PHV group improved more than the control group for all physical qualities between pre- and post-season. The elite Pre-PHV group improved more in sprints, CODs, CMJ jump height and strength, while the elite Post-PHV group improved more in CODs and strength than their respective control groups. Conclusion: Findings suggest that systematic academy soccer training enhances the development of physical qualities in youth soccer players but maturity status may impact upon such adaptations.
This study aimed to identify and compare the training frequency and intensity (via session rating of perceived exertion load (sRPE load)) of representative and non-representative late adolescent athletes. Thirty-six team sport athletes completed a web-based questionnaire daily over an 8-month period, reporting their training/match activities from the previous day. Athletes were categorised as representative (academy/county/international) or non-representative (club/school) depending on the highest level at which they participated in their sport. Mean weekly frequencies and sRPE load of different training/match activities were quantified for each athlete across five school terms. Mann-Whitney U tests established the significance of differences and effect sizes between playing standards for mean weekly frequencies and mean sRPE load. Within-athlete weekly sRPE loads were highly variable for both playing standards; however, representative-level athletes participated in significantly more activity outside of school compared to non-representative athletes during November to December (effect size; 0.43 – club technical training; 0.36 – club matches), January to February (effect size; 0.78 – club technical training; 0.75 – club matches) and February to March (effect size; 0.63 – club technical training; 0.44 – club matches). Therefore, club and school coaches must ensure that all elements of representative athletes' training schedules are coordinated and flexible to promote positive adaptations to training, such as skill and physical development, and prevent maladaptive responses such as overuse injury and non-functional overreaching. A cooperative and malleable training schedule between club/school coaches and the athlete will allow the athlete to perform on multiple fronts whilst also being able to meet the demands of additional stressors such as schoolwork.
The development of a youth team sport athlete is a complex process. This paper outlines challenges which may restrict the optimal balance between training and recovery and provides solutions to help practitioners overcome these challenges. To facilitate positive youth athletic development, training aims must be aligned between stakeholders to synchronise periods of intensified training and recovery. Within- and between-athlete variations in weekly training load must be managed and practitioners should attempt to ensure the intended load of training equals the load perceived by the athlete. Furthermore, practitioners should be cognizant of the athletes’ non-sport related stressors to enable both academic and sporting pursuits. Whilst each of these challenges adds intricacy, they may be overcome through collaboration, monitoring and if necessary, the modification of the athletes’ training load.
Purpose: To evaluate the relative importance and predictive ability of salivary immunoglobulin A (s-IgA) measures with regards to upper respiratory illness (URI) in youth athletes. Methods: Over a 38-week period, 22 youth athletes (age = 16.8 [0.5] y) provided daily symptoms of URI and 15 fortnightly passive drool saliva samples, from which s-IgA concentration and secretion rate were measured. Kernel-smoothed bootstrapping generated a balanced data set with simulated data points. The random forest algorithm was used to evaluate the relative importance (RI) and predictive ability of s-IgA concentration and secretion rate with regards to URI symptoms present on the day of saliva sampling (URIday), within 2 weeks of sampling (URI2wk), and within 4 weeks of sampling (URI4wk). Results: The percentage deviation from average healthy s-IgA concentration was the most important feature for URIday (median RI 1.74, interquartile range 1.41–2.07). The average healthy s-IgA secretion rate was the most important feature for URI4wk (median RI 0.94, interquartile range 0.79–1.13). No feature was clearly more important than any other when URI symptoms were identified within 2 weeks of sampling. The values for median area under the curve were 0.68, 0.63, and 0.65 for URIday, URI2wk, and URI4wk, respectively. Conclusions: The RI values suggest that the percentage deviation from average healthy s-IgA concentration may be used to evaluate the short-term risk of URI, while the average healthy s-IgA secretion rate may be used to evaluate the long-term risk. However, the results show that neither s-IgA concentration nor secretion rate can be used to accurately predict URI onset within a 4-week window in youth athletes.
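The kernel-smoothed bootstrapping mentioned in the Methods above can be sketched with numpy alone: observed rows are resampled with replacement and jittered with Gaussian noise, so the balanced data set contains simulated points scattered around the originals rather than exact duplicates. This is an illustrative sketch, not the study's code; the function name and the Silverman-style bandwidth rule are assumptions.

```python
import numpy as np

def smoothed_bootstrap(x, n_samples, bandwidth=None, rng=None):
    """Draw a kernel-smoothed bootstrap sample from the rows of `x`.

    Each draw is a randomly chosen observed row plus Gaussian noise,
    so simulated points fill in around the originals.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.atleast_2d(np.asarray(x, dtype=float))
    if bandwidth is None:
        # per-feature Silverman-style rule of thumb (an assumption here)
        bandwidth = 1.06 * x.std(axis=0, ddof=1) * len(x) ** (-1 / 5)
    idx = rng.integers(0, len(x), size=n_samples)
    noise = rng.normal(0.0, bandwidth, size=(n_samples, x.shape[1]))
    return x[idx] + noise

# example: inflate 6 minority-class observations (e.g. s-IgA concentration
# and secretion rate on illness days) to 50 simulated points
rng = np.random.default_rng(42)
minority = rng.normal([100.0, 2.5], [20.0, 0.5], size=(6, 2))
simulated = smoothed_bootstrap(minority, 50, rng=rng)
```

A balanced training set for a classifier can then be formed by stacking the simulated minority rows with the majority-class rows.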
Despite the importance and complexity of developing sprint performance in football code athletes, there are limited studies exploring practitioners’ practices to improve sprinting. Therefore, this study aimed to describe and evaluate the practices used with elite football code athletes to develop sprint performance. Ninety subjects completed a survey comprised of four sections (coaching demographic, evaluation of training, organisation of training, and training protocols). Survey responses showed that 98% of practitioners monitor sprint performance, and 92% integrated monitoring strategies into sprint development programmes to inform training. All practitioners used combined training methods including specific (e.g., sprints with or without overload) and non-specific (e.g., strength training or plyometrics) methods targeting the underpinning determinants of sprint performance. Most practitioners reported prescribing 1-3 or 2-4 days·wk-1 for sprint development, both in-season and pre-season. Sprint development programmes were uncommon in the off-season. Most specific sprint training sessions were reportedly shorter in duration (5-15 and 15-30 min) than non-specific sprint training methods (30-45 and >45 min) irrespective of the season phases. Sprint development was integrated before and after sport-specific training, regularly using warm-ups and gym sessions. Specific training methods were also implemented in separate sessions. The specific content (e.g., exercise selection, training load prescription) was highly variable between practitioners. This study represents the first detailed survey (practices and justification) of sprint development practices (evaluation and organisation of training protocols) in football code cohorts. These findings present multiple methods of structuring, integrating and manipulating sprint training based on the training aims and the individual context.
This study investigated the seasonal change in physical performance of 113 (Under 10: U10 (n=20), U12 (n=30), U14 (n=31) and U16 (n=32)) elite youth female soccer players. Players completed testing pre-, mid- and post-season, including speed (10 and 30m sprint), change of direction (CoD; 505 test), power (Countermovement jump, CMJ), strength (isometric midthigh pull) and aerobic capacity (YoYo Intermittent Recovery Test Level 1; YYIRL1).
Youth athletes frequently participate in multiple sports or for multiple teams within the same sport. To optimise player development and minimise undesirable training outcomes (e.g., overuse injuries), practitioners must be cognizant of an athlete's training load within and outside of their practice. The present study aimed to establish the validity of a 24-hour (s-RPE24) and 72-hour (s-RPE72) recall of session rating of perceived exertion (s-RPE) against the criterion measure of s-RPE collected 30 minutes post-training (s-RPE30). Thirty-eight adolescent athletes provided an s-RPE30 following the first field-based training session of the week. Approximately 24 hours later, subjects were asked to recall the intensity and duration of the previous day's training. The following week, subjects once again provided an s-RPE30 measure post-training before recalling the intensity and duration of the session approximately 72 hours later. A nearly perfect correlation (0.98 [0.97 - 0.99]) was found between s-RPE30 and s-RPE24, with a small typical error of estimate (TEE; 8.3% [6.9 - 10.5]) and trivial mean bias (-1.1% [-2.8 - 0.6]). Despite a large correlation between s-RPE30 and s-RPE72 (0.73 [0.59 - 0.82]) and a trivial mean bias (-0.2% [-6.8 - 6.8]), there was a large TEE (35.3% [29.6 - 43.9]). s-RPE24 provides a valid measure of retrospectively quantifying s-RPE; however, the large error associated with s-RPE72 suggests it is not a suitable method for monitoring training load in youth athletes.
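The validity statistics reported above (correlation, mean bias and typical error of the estimate) can be computed from paired criterion and recall measures as in this generic numpy sketch. It works in raw units and omits the log-transformation and confidence intervals typically used in such analyses, so it is illustrative rather than a reproduction of the paper's analysis; all names and example values are assumed.

```python
import numpy as np

def validity_stats(criterion, practical):
    """Pearson r, mean bias, and typical error of the estimate (TEE).

    TEE is taken as the SD of residuals when the criterion is regressed
    on the practical measure (raw units).
    """
    criterion = np.asarray(criterion, dtype=float)
    practical = np.asarray(practical, dtype=float)
    r = np.corrcoef(criterion, practical)[0, 1]
    bias = (practical - criterion).mean()
    slope, intercept = np.polyfit(practical, criterion, 1)
    residuals = criterion - (slope * practical + intercept)
    tee = residuals.std(ddof=2)  # two regression parameters estimated
    return r, bias, tee

# example: a recall measure that tracks the criterion with small noise
rng = np.random.default_rng(1)
crit = rng.normal(300.0, 60.0, size=200)         # e.g. s-RPE30 (AU)
recall = crit + rng.normal(0.0, 15.0, size=200)  # small recall error
r, bias, tee = validity_stats(crit, recall)
```

With the simulated recall error above, r is high, bias is near zero, and TEE sits close to the injected noise level, mirroring how the three statistics summarise agreement.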
Can the Physical Development Trajectories of Rugby League Players at Different Age Groups Inform the Talent Pathway? A Multi‐Club Study of 261 Players
The structure of a talent identification and development system (TIDS), in terms of its starting, entry, and exit points is an important consideration for sporting organisations. Early talent identification decisions can be ineffective due to unpredictable and individually variable talent development. Physical qualities are a key contributor to performance in rugby league. Therefore, understanding physical development differences between age groups can inform the structure of the rugby league TIDS by highlighting key phases of development. Between‐player variability in physical development must also be considered to understand the generalisability of age‐group trends. Consequently, this study aimed to compare rates of physical development between annual age groups (i.e., U15, 16, 17, 18) in 261 youth rugby league players from multiple clubs, considering individual differences in development rates. Latent growth curve analysis was used to model rates of physical development for size (i.e., height, mass), strength, power, speed, and cardiovascular fitness in each age group. Results showed that U15s had significantly faster rates of development for body size and strength qualities compared with all older age groups, with large between‐player variability. No differences were apparent between age groups for power, speed, or cardiovascular fitness. These findings suggest that early talent identification and (de)selection decisions may ignore the potential development of body size and strength qualities, which occurs at individually variable rates. Such findings can inform the structure and design of the rugby league TIDS by highlighting expected rates of physical development based on players' age groups.
PURPOSE: To assess indirect markers of intestinal endothelial cell damage and permeability in academy rugby players in response to rugby training at the beginning and end of preseason. METHODS: Blood and urinary measures (intestinal fatty acid binding protein and lactulose:rhamnose) as measures of gastrointestinal cell damage and permeability were taken at rest and after a standardised collision-based rugby training session in 19 elite male academy rugby players (age: 20 ± 1 years, backs: 89.3 ± 8.4 kg; forwards: 111.8 ± 7.6 kg) at the start of preseason. A subsample (n = 5) repeated the protocol after six weeks of preseason training. Gastrointestinal symptoms (GIS; range of thirteen standard symptoms), aerobic capacity (30-15 intermittent fitness test), and strength (1 repetition maximum) were also measured. RESULTS: Following the rugby training session at the start of preseason, there was an increase (median; interquartile range) in intestinal fatty acid binding protein (2140; 1260-2730 to 3245; 1985-5143 pg/ml, p = 0.003) and lactulose:rhamnose (0.31; 0.26-0.34 to 0.97; 0.82-1.07, p < 0.001). After six weeks of preseason training, players' physical qualities improved, and the same trends in blood and urinary measures were observed within the subsample. Overall, the frequency and severity of GIS were low and not correlated to markers of endothelial damage. CONCLUSIONS: Rugby training resulted in increased intestinal endothelial cell damage and permeability compared to rest. A similar magnitude of effect was observed after six weeks of preseason training. This was not related to the experience of GIS.
Video analysis research into the rugby league tackle typically uses technical criteria from coaching cues or tackle variables from rugby union. As such, the content validity and relevance could be questioned. A video analysis framework which establishes appropriate variables for rugby league is therefore required. The aim of this study was to adopt a 5-stage process to establish a video analysis framework for the rugby league tackle, which was content valid, relevant and reliable. The 5-stage process included 1) creation of draft variable list (video analysis framework), using available rugby tackle research, 2) expert group recruitment and critique, 3) refinement of video analysis framework to establish content validity, 4) response process validity task and agreement within expert group, 5) intra- and inter-reliability testing using Kappa statistics. The agreed video analysis framework comprised 6 phases including; tackle event, defensive start point, pre-contact, initial contact, post-contact and play the ball. Within the identified phases, 63 variables were established. The intra- and inter-reliability testing resulted in strong agreement (κ = 0.81–1.0) within all phases. The 5-stage process allowed for the creation of a valid, relevant and reliable video analysis framework. The video analysis framework can be used in rugby league tackle research, categorising complex tackle events such as injurious or optimal tackles, improving both player welfare and performance. Furthermore, the application of the video analysis framework to future rugby league research will increase the coherence and usefulness of research findings.
We investigated 3-compartment body composition across one competitive season in professional male rugby union players using dual-energy X-ray absorptiometry (GE iDXA). Thirty-five players from one English Premiership team (forwards: n=20, age: 25.5±4.7 years; backs: n=15, age: 26.1±4.5 years) received one total body DXA scan at pre-season (August), mid-season (January) and end-season (May), enabling quantification of body mass, total and regional fat mass, lean mass, percentage tissue fat mass (%TFM) and bone mineral content (BMC). Both team and individual changes were evaluated, and for the latter, least significant change (LSC) was derived from precision data and applied as per International Society for Clinical Densitometry guidelines. Mean body mass remained stable throughout the season (p>0.05), but total fat mass and %TFM increased from pre to end-season, and mid to end-season (p<0.05). There were also statistically significant increases in total-body BMC across the season (p<0.05). In backs, there was a loss of lean mass between mid and end-season (p<0.01). Individual evaluation using LSC and Bland Altman analysis revealed a meaningful loss of lean mass in 17 players and a gain of fat mass in 21 players from pre to end-season. Twelve players exhibited no change. Strategies to improve the maintenance of pre-season lean/fat ratios across the season for professional rugby union players might be beneficial to performance and health, and thus require exploration. We recommend that future studies include an individualised approach to DXA body composition monitoring, which can be achieved through application of derived LSC.
The Longitudinal Development of Anthropometric and Fitness Characteristics within Academy Rugby League Players
Till, K, Jones, B, Darrall-Jones, J, Emmonds, S, and Cooke, C. Longitudinal development of anthropometric and physical characteristics within academy rugby league players. J Strength Cond Res 29(6): 1713-1722, 2015. The purpose of this study was to evaluate the annual and long-term (i.e., 4 years) development of anthropometric and physical characteristics in academy (16-20 years) rugby league players. Players were assessed at the start of preseason over a 6-year period and were required to be assessed on consecutive years to be included in the study (Under 16-17, n = 35; Under 17-18, n = 44; Under 18-19, n = 35; Under 19-20, n = 16). A subset of 15 players was assessed for long-term changes over 4 years (Under 16-19). Anthropometric (height, body mass, sum of 4 skinfolds) and physical (10- and 20-m sprint, 10-m momentum, vertical jump, yo-yo intermittent recovery test level 1, 1 repetition maximum [1RM] squat, bench press, and prone row) assessments were collected. Paired t-tests and repeated measures analysis of variance demonstrated significant annual (e.g., body mass, U16=76.4 ± 8.4, U17=81.3 ± 8.3 kg; p < 0.001, d=0.59) and long-term (e.g., vertical jump, Under 16=44.1 ± 3.8, Under 19=52.1 ± 5.3 cm; p < 0.001, d=1.74) changes in anthropometric and physical characteristics. Greater percentage changes were identified between the Under 16-17 age categories compared with the other ages (e.g., 1RM squat, U16-17 = 22.5 ± 19.5 vs. U18-19 = 4.8 ± 6.4%). Findings demonstrate the annual and long-term development of anthropometric and physical characteristics in academy rugby league players, establishing that greater changes occur at younger ages upon the commencement of a structured training program within an academy. Coaches should understand the long-term development of physical characteristics and use longitudinal methods for monitoring and evaluating player performance and development.
The effect of distance covered, number of high intensity efforts and heart rate on the decision-making accuracy of professional Rugby League referees
Rugby league referees have the responsibility of enforcing the laws of the game and can influence the outcome based on their decisions. Performance demands inherent in refereeing involve fitness and positioning, law knowledge and application, contextual judgement and game management (Weston et al., 2012, International Journal of Sports Medicine, 42, 615–617). No study to date has investigated the relationship between the physiological and movement demands of refereeing and penalty accuracy. The aim was to quantify penalty accuracy scores of rugby league referees and determine the relationship with total distance covered (TD), high intensity distance (HIT) and mean heart rate per half and per 10-min period of a match. With institutional ethical approval, all 8 professional Super League referees participated in this study. During the 2012 season, 148 Super League matches were analysed using 10-Hz GPS units (MinimaxV4; Catapult Sports, Australia) and 1-Hz heart rate monitors (Polar Electro, Kempele, Finland). Decision-making demands were quantified using Opta Stats (Leeds, UK), which were retrospectively reviewed by an expert referee review panel to determine the accuracy of decisions when awarding or not awarding a penalty. A dependent t-test was used to assess the differences between halves. Repeated measures ANOVA was conducted with a Bonferroni post hoc to assess the differences between 10-min match periods, in addition to Cohen’s d effect sizes. Pearson’s product correlation was used to determine relationships. Super League referees made the correct penalty decision on 74 ± 5% of occasions. Significantly more distance was covered (3586 ± 394 vs. 3514 ± 424 m, P = 0.009, d = 0.18), and a significantly greater heart rate (154 ± 9 vs. 149 ± 9 beats.min-1, P = 0.001, d = 0.56) was achieved in the first compared to the second half. There was no significant difference (P = 0.812) in penalty accuracy (75 ± 4 vs. 73 ± 6 %) or HIT (P = 0.081) between halves.
When observed per half and per 10-min period, there was no significant relationship between penalty accuracy scores and TD (r = –0.023, P = 0.645), HIT (r = 0.093, P = 0.18) or heart rate (r = 0.129, P = 0.135). Findings suggest that the physiological and movement demands of refereeing in rugby league are not significantly related to penalty accuracy scores per 40-min or 10-min period. While no significant relationship was observed between TD, HIT or heart rate and accuracy, further research is required to investigate confounding variables (i.e. refereeing experience and fitness levels) that may further influence penalty accuracy. Given the small sample population of professional referees (n = 8), a case study approach to future research is recommended.
The aim of this study was to investigate the difference in head acceleration event (HAE) incidence between training and match‐play in women's and men's players competing at the highest level of domestic rugby union globally. Players from Women's (Premiership Women's Rugby, Farah Palmer Cup) and Men's (Premiership Rugby, Currie Cup) rugby union competitions wore instrumented mouthguards during matches and training sessions during the 2022/2023 seasons. Peak linear (PLA) and angular (PAA) acceleration were calculated from each HAE and included within generalized linear mixed‐effects models. The incidence of HAEs was significantly greater in match‐play compared to training for all magnitude thresholds in both forwards and backs, despite players spending approximately 1.75–2.5 times more time in training. For all HAEs (PLA > 5 g and PAA > 400 rad/s2), incidence rate ratios (IRRs) for match versus training ranged from 2.80 (95% CI: 2.38–3.30; men's forwards) to 4.00 (3.31–4.84; women's forwards). At higher magnitude thresholds (PLA > 25 g; PAA > 2000 rad/s2), IRRs ranged from 3.64 (2.02–6.55; PAA > 2000 rad/s2 in men's backs) to 11.70 (6.50–21.08; PAA > 2000 rad/s2 in women's forwards). Similar trends were observed in each competition. Players experienced significantly more HAEs during match‐play than training, particularly at higher magnitude thresholds. Where feasible, HAE mitigation strategies may have more scope for HAE reduction if targeted at match‐play, particularly where higher magnitude HAEs are the primary concern. However, the number of HAEs associated with different training drills requires exploration to understand if HAEs can be reduced in training, alongside optimizing match performance (e.g., enhancing contact technique).
To compare the probability of tackle success (the tackler preventing the ball‐carrier and ball from progressing towards the tackler's try‐line) when contacting the ball‐carrier at different heights (shoulder, mid‐torso and legs) for different types of tackles (active, passive, smother and arm) while accounting for other tackler situational factors within seven playing levels. Video footage of 271 male rugby union matches was analysed across seven playing groups (Under [U] 12, n = 25 matches; U14, n = 35; U16, n = 39; U18 Amateur, n = 39; U18 Elite, n = 38; Senior Amateur, n = 40 and Senior Elite, n = 50) across England, New Zealand, South Africa, Portugal and the USA (a total of 51,106 tackles). A multi‐level logistic regression model with tackle success as the outcome variable and first point of contact and type of tackle as the explanatory variables was computed. Included in the model as confounders were the situational variables tackle direction, tackle sequence, number of players in the tackle and attacker intention. Post‐estimation marginal effects were used to calculate the probabilities (expressed as a percentage %) of tackle success for each interaction between tackle type (active shoulder, smother, passive shoulder and arm) and the first point of contact (shoulder, mid‐torso and legs). The probability of tackle success in relation to where the ball‐carrier is contacted varied by tackle type and within each age group. The probabilities (Pr) for contacting the shoulder versus mid‐torso at the senior levels (elite and amateur) did not differ in relation to tackle success (for instance, for active shoulder tackles within senior elite; shoulder Pr 86% 95% CI 82–89 and mid‐torso Pr 82% 95% CI 77–86), whereas at the junior levels, contacting the shoulder had a higher probability than other points of contact.
Active shoulder tackles had the highest probability of tackle success across the different playing levels across the different contact heights, whereas arm tackles had the lowest probability (for instance, for mid‐torso tackles within senior elite, active Pr 82% 95% CI 77–86 vs. arm Pr 69% 95% CI 64–75). Coaches and practitioners can use this information to improve tackle training design and planning within the different age groups and facilitate player development.
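The tackle-success probabilities reported here come from post-estimation marginal effects of a multilevel logistic model. The core transformation, from summed log-odds terms to a probability, is simple; the sketch below uses entirely made-up coefficients purely to illustrate that step:

```python
import math

def success_probability(intercept, *effects):
    """Convert a sum of log-odds terms into a probability via the logistic function."""
    log_odds = intercept + sum(effects)
    return 1 / (1 + math.exp(-log_odds))

# Hypothetical log-odds terms: baseline, active-shoulder tackle, shoulder-height contact
p = success_probability(0.8, 0.6, 0.4)
```

The real model additionally includes random effects and the situational confounders listed above, so its marginal probabilities are averaged over those terms rather than read off a single linear predictor.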
This study investigated sources of variability in the overall and phase-specific running match characteristics in elite rugby league. Microtechnology data were collected from 11 Super League (SL) teams, across 322 competitive matches within the 2018 and 2019 seasons. Total distance, high-speed running (HSR) distance (>5.5 m·s−1), average speed, and average acceleration were assessed. Variability was determined using linear mixed models, with random intercepts specified for player, position, match, and club. Large within-player coefficients of variation (CV) were found across whole match, ball-in-play, attack and defence for total distance (CV range = 24% to 35%) and HSR distance (37% to 96%), whereas small to moderate CVs (≤10%) were found for average speed and average acceleration. Similarly, there was higher between-player, -position, and -match variability in total distance and HSR distance when compared with average speed and average acceleration across all periods. All metrics were stable between-teams (≤5%), except HSR distance (16% to 18%). The transition period displayed the largest variability of all phases, especially for distance (up to 42%) and HSR distance (up to 165%). Absolute measures of displacement display large within-player and between-player, -position, and -match variability, yet average acceleration and average speed remain relatively stable across all match-periods.
This study examined the relative contribution of exercise duration and intensity to team-sport athletes’ training load. Male, professional rugby league (n = 10) and union (n = 22) players were monitored over 6- and 52-week training periods, respectively. Whole-session (load) and per-minute (intensity) metrics were monitored (league: session rating of perceived exertion training load [sRPE-TL], individualised training impulse, total distance, BodyLoad™; union: sRPE-TL, total distance, high-speed running distance, PlayerLoad™). Separate principal component analyses were conducted on the load and intensity measures to consolidate raw data into principal components (PC, k = 4). The first load PC captured 70% and 74% of the total variance in the rugby league and rugby union datasets, respectively. Multiple linear regression subsequently revealed that session duration explained 73% and 57% of the variance in the first load PC, respectively, while the four intensity PCs explained an additional 24% and 34%, respectively. Across two professional rugby training programmes, the majority of the variability in training load measures was explained by session duration (~60–70%), while a smaller proportion was explained by session intensity (~30%). When modelling the training load, training intensity and duration should be disaggregated to better account for their between-session variability.
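A minimal sketch of the dimension-reduction step described above: PCA on standardised load metrics, then checking how much of the first component is explained by session duration alone. The data are synthetic stand-ins for the proprietary metrics, constructed so that duration dominates each whole-session measure:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 120
duration = rng.uniform(30, 120, n)      # session duration (min)
intensity = rng.uniform(0.8, 1.2, n)    # hypothetical per-minute intensity factor

# Four synthetic whole-session load metrics, each largely a product of duration
loads = np.column_stack([duration * intensity * k + rng.normal(0, 5, n)
                         for k in (1.0, 0.8, 1.2, 0.9)])

# PCA via SVD of the standardised data
z = (loads - loads.mean(0)) / loads.std(0, ddof=1)
_, s, vt = np.linalg.svd(z, full_matrices=False)
explained = s**2 / (s**2).sum()         # proportion of variance per component
pc1_scores = z @ vt[0]                  # first principal-component scores

# Share of PC1 variance attributable to duration alone (cf. 57-73% in the study)
r2 = np.corrcoef(duration, pc1_scores)[0, 1] ** 2
```

Because every synthetic metric scales with duration, PC1 absorbs most of the total variance and duration explains most of PC1, mirroring the qualitative pattern the abstract reports.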
This two-part study evaluated the inter- and intra-unit reliability of Catapult Vector S7 microtechnology units in an indoor court-sport setting. In part-one, 27 female netball players completed a controlled movement series on two separate occasions to assess the inter- and intra-unit reliability of inertial movement analysis (IMA) variables (acceleration, deceleration, changes of direction and jumps). In part-two, 13 female netball players participated in 10 netball training sessions to assess the inter-unit reliability of IMA and PlayerLoad™ variables. Participants wore two microtechnology units placed side-by-side. Reliability was assessed using intraclass correlation coefficient (ICC), coefficient of variation (CV) and typical error (TE). Total IMA events showed good inter-unit reliability during the movement series (ICC, 1.00; CV, 3.7%) and training sessions (ICC, 0.99; CV, 4.5%). Inter-unit (ICC, 0.97; CV, 4.7%) and intra-unit (ICC, 0.97; CV, 4.3%) reliability for total IMA jump count was good in the movement series, with moderate CV (7.7%) during training. Reliability decreased when IMA counts were categorised by intensity and movement type. PlayerLoad™ (ICC, 1.00; CV, 1.5%) and associated variables showed good inter-unit reliability, except peak PlayerLoad™ (moderate) and PlayerLoad™ Slow (moderate). Counts of IMA variables, when considered as total and low-medium counts, and PlayerLoad™ variables are reliable for monitoring indoor court-sports players.
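Of the reliability statistics used here, the typical error and CV have compact closed forms for paired trials (the ICC additionally requires an ANOVA or mixed-model variance decomposition, omitted here). A sketch with invented counts from two side-by-side units:

```python
import math

def typical_error_and_cv(unit1, unit2):
    """Typical error (TE) and coefficient of variation (CV%) for paired trials."""
    diffs = [a - b for a, b in zip(unit1, unit2)]
    mean_d = sum(diffs) / len(diffs)
    sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (len(diffs) - 1))
    te = sd_d / math.sqrt(2)                     # TE = SD of differences / sqrt(2)
    grand_mean = (sum(unit1) + sum(unit2)) / (2 * len(unit1))
    return te, 100 * te / grand_mean             # CV expressed as % of the mean

# Hypothetical total IMA counts recorded by the two units across four sessions
te, cv = typical_error_and_cv([10, 12, 14, 16], [11, 12, 13, 17])
```

Dividing the SD of the between-unit differences by √2 apportions the error equally to the two units, which is why TE is the conventional single-unit noise estimate.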
Rugby league has a relatively high injury risk, with the tackle having the greatest injury propensity. The number of tackles players engage in prior to an injurious tackle may influence injury risk, but this has yet to be investigated. Therefore, this study investigated whether rugby league players are involved in more tackles (as either tackler or ball carrier) (i) in the 10-min or (ii) 1-min periods prior to an injurious tackle-event, and whether this differed (iii) for ball carriers vs. tacklers, and (iv) forwards vs. backs. Video analysis was utilised to quantify the number and rate of tackles in the 10-min periods prior to 61 tackle-related injuries. One thousand two hundred and eighty 10-min periods where players were not injured were used as matched controls. Generalized mixed linear models were used to analyse the mean total number and rate of tackles. Injured players were involved in significantly fewer tackles during the 10-min period, yet significantly more tackles during the final minute prior to the injurious tackle-event, compared to non-injured players. There were no differences between ball carriers vs. tacklers during the 10-min period. Both injured position groups were involved in significantly more tackles in the final minute. Additional match data sources are needed to further inform injury preventive strategies of tackle events.
Head acceleration events (HAEs) can potentially have adverse consequences for athlete brain health. In sports in which head injuries have the highest incidence, identifying strategies to reduce HAE frequency and magnitude is a priority. Neck training is a potential strategy to mitigate the magnitude of HAEs. This two-part study aimed to (1) systematically review the literature on neck training interventions in sport and (2) undertake an expert Delphi consensus on the best practices for neck training implementation to reduce HAEs in sport. Part I: a systematic search of four databases was undertaken from the earliest records to September 2024. The PRISMA (Preferred Reporting Items for Systematic Review and Meta-Analysis) guidelines were followed, and a quality assessment was completed using a modified Downs and Black assessment tool and the GRADE (Grading of Recommendations Assessment, Development and Evaluation) approach. Papers were eligible if they both (1) implemented a reproducible exercise intervention targeting the neck within collision, combat or motor sport, and (2) assessed outcomes relating to either: the physical profile of the neck; head/neck injury incidence; and/or HAEs. Part II: 18 international experts, with experience in research and/or applied practice of neck exercise training, concussion and/or HAEs, reviewed the part I findings before completing a three-round Delphi consensus process. Part I included 21 papers, highlighting the heterogeneity of existing interventions. Part II resulted in 57 statements coded into five categories: contextual factors (n=17), neck training periodisation (n=12), training adaptations (n=10), neck training content (n=15) and athlete adherence (n=3). This study presents recommendations for neck exercise training aiming to reduce HAEs in sport, supporting both practice and future research.
This study aimed to establish consensus on injury risk factors in netball via a combined systematic review and Delphi method approach. A systematic search of databases (PubMed, Scopus, MEDLINE, SPORTDiscus, CINAHL) was conducted from inception until June 2023. Twenty-four risk factors were extracted from 17 studies and combined with a three-round Delphi approach to achieve consensus. In round-one, experts listed perceived risk factors for injury in netball which were combined with the risk factors identified via the systematic review. In round-two and round-three, experts rated their level of agreement with each risk factor on a 5-point Likert scale (1-strongly disagree to 5-strongly agree). Consensus was defined as 80% agreement (with <10% in disagreement). In round-three, experts also rated the priority for mitigating the risk factor (1-very low to 5-very high). Nineteen experts participated in round-one and round-two, and sixteen participated in round-three (response rate 84%). One-hundred and nine risk factors for injury were identified by the systematic review and experts combined. Sixty-one risk factors reached consensus, categorised into five groups: ‘individual characteristics’ (n=22), ‘lifestyle’ (n=11), ‘training and competition’ (n=14), ‘sport science and medical provision’ (n=6) and ‘facilities and equipment’ (n=8). ‘Poor landing technique/mechanics’ had a median (interquartile range) mitigation priority rating of 5(1), while all others had median ratings of 3-4.5. This study identifies a range of risk factors for injury, provides focus areas for injury prevention, and highlights the importance of a multi-disciplinary approach to injury mitigation in netball.
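The consensus rule used in studies like the two above (≥80% agreement with <10% disagreement on a 5-point Likert scale) is straightforward to apply per risk factor; the ratings below are invented purely for illustration:

```python
def reaches_consensus(ratings, agree_pct=80, disagree_pct=10):
    """Apply the consensus rule: >=80% rate 4-5 (agree) and <10% rate 1-2 (disagree)."""
    n = len(ratings)
    agree = 100 * sum(r >= 4 for r in ratings) / n
    disagree = 100 * sum(r <= 2 for r in ratings) / n
    return agree >= agree_pct and disagree < disagree_pct

# Hypothetical round-three ratings from a 16-expert panel for one risk factor
ratings = [5, 4, 4, 5, 4, 3, 4, 5, 4, 4, 5, 4, 3, 4, 5, 4]
ok = reaches_consensus(ratings)
```

Here 14 of 16 experts (87.5%) agree and none disagree, so the factor would reach consensus; neutral ratings (3) count towards neither threshold.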
Super League (SL) and Championship (RLC) rugby league players will compete against each other in 2015 and beyond. To identify possible discrepancies, this study compared the anthropometric profile and body composition of current SL (full-time professional) and RLC (part-time semi-professional) players using dual-energy X-ray absorptiometry (DXA). A cross-sectional design involved DXA scans on 67 SL (n=29 backs, n=38 forwards) and 46 RLC (n=20 backs, n=26 forwards) players during preseason. A one-way ANOVA was used to compare age, stature, body mass, soft tissue fat percentage, bone mineral content (BMC), total and regional (i.e., arms, legs and trunk) fat and lean mass between SL forwards, SL backs, RLC forwards and RLC backs. No significant differences in age, stature or body mass were observed. SL forwards and backs had relatively less soft tissue fat (17.5 ± 3.7 and 14.8 ± 3.6 vs. 21.4 ± 4.3 and 20.8 ± 3.8%), greater BMC (4,528 ± 443 and 4,230 ± 447 vs. 4,302 ± 393 and 3,971 ± 280 g), greater trunk lean mass (37.3 ± 3.0 and 35.3 ± 3.8 vs. 34.9 ± 32.3 and 32.3 ± 2.6 kg) and less trunk fat mass (8.5 ± 2.7 and 6.2 ± 2.1 vs. 10.7 ± 2.8 and 9.5 ± 2.9 kg) than RLC forwards and backs. Observed differences may reflect selection based on favourable physical attributes, or training adaptations. To reduce this discrepancy, some RLC players should reduce fat mass and increase lean mass, which may be of benefit for the 2015 season and beyond.
Advances in rugby body composition: Comparison between Elite English Academy rugby league and professional Super League players
The assessment of body size and body composition is essential when evaluating and monitoring the development of Academy rugby league (RL) players. To date, no study has explored relative three-compartment body composition in Academy players compared to professional Super League (SL) players. The purpose of this study was to compare body size and relative body composition in Academy RL players and SL players using dual energy X-ray absorptiometry (DXA). With institutional research ethics approval, 63 European SL players from two clubs (backs: n = 25, age 25.7 ± 4.3 years; forwards: n = 38, age 26.1 ± 4.9 years) and 32 Academy players from one club (backs: n = 14, age 18.1 ± 1.0 years; forwards: n = 18, age 18.1 ± 0.9 years), received one total-body DXA scan (Lunar iDXA, GE Healthcare, Little Chalfont, Buckinghamshire) during pre-season, in a euhydrated state (urine osmolality <700 mOsmol · kg-1). The regions of interest on scan images were manually adjusted where necessary by a qualified densitometrist, according to manufacturer guidelines. Independent t-tests compared height, body mass and percentage body fat (%BF). Multivariate analysis with height and body mass as covariates examined positional differences in body composition by level. Effect size was calculated using Cohen’s d. SL players were taller (backs: 181.3 ± 6.1 vs. 179.5 ± 5.3 cm; forwards: 184.3 ± 5.5 vs. 179.1 ± 6.2 cm; P = 0.005, d = 0.33–0.89) and heavier (backs: 90.2 ± 9.1 vs. 83.1 ± 6.8 kg; forwards: 99.8 ± 8.1 vs. 90.1 ± 9.0 kg; P < 0.001, d = 0.88–1.13) than Academy players. %BF was greater in Academy compared to SL forwards (20.1 ± 3.0 vs. 17.5 ± 3.7%; P = 0.01, d = 0.77), but similar between levels in backs (16.1 ± 3.0 vs. 14.9 ± 3.6 %). In Academy forwards, total fat mass (FM) was greater (Δ 3.0 (0.9) kg, P = 0.009, d = 0.85), and total lean mass (LM) was lower (Δ −2.8 (0.9) kg, P = 0.016, d = 0.88) than in SL forwards.
Relative to body size, total and regional FM, LM and BMC in Academy backs were similar to SL backs. Academy forwards had greater arm and leg FM than SL forwards (Δ 2.7 (1.0) kg, P = 0.05, d = 0.83; Δ 1.7 (0.3) kg, P < 0.001, d = 1.5) and lower arm (Δ −58.1 (16.3) g, P = 0.004, d = 1.03) and trunk (Δ −92.4 (31.4) g, P = 0.025, d = 0.78) BMC. Our findings of lower LM and BMC relative to body size in Academy forwards suggest that these players are still developing. This corresponds with longitudinal reports elsewhere that the majority of adult fat-free mass is achieved during the late second to early third decade, following the attainment of adult height and bone size. The longitudinal tracking of body size and composition of Academy RL players to senior level is a direction for future research.
Advances in rugby body composition: Seasonal changes in Premiership rugby union players
Body composition analysis is regularly conducted in professional rugby union (RU) players to monitor changes in body mass (BM), fat mass (FM), lean mass (LM), percentage body fat (%BF) and bone mineral content (BMC). It would be desirable for RU players to maintain LM for the duration of the season, due to the high levels of muscular power and strength required for performance. To date, the seasonal changes in body composition associated with professional RU have not been documented. The purpose was to investigate acute changes in body composition during a competitive season in professional RU players using dual energy X-ray absorptiometry (DXA). With institutional ethical approval, players were recruited from an English Premiership club (n = 23, age: 25.9 ± 4.7 years, height: 187.2 ± 7.7 cm). Players received a total-body DXA scan (Lunar iDXA, GE Healthcare) during each of three phases of the competitive season (pre-season (August), mid-season (January) and post-season (May)) in a euhydrated state (urine osmolality
Purpose: To compare the physical qualities between academy and international youth rugby league (RL) players using principal component analysis. Methods: Six hundred fifty-four males (age = 16.7 [1.4] y; height = 178.4 [13.3] cm; body mass = 82.2 [14.5] kg) from 11 English RL academies participated in this study. Participants completed anthropometric, power (countermovement jump), strength (isometric midthigh pull; IMTP), speed (10 and 40 m speed), and aerobic endurance (prone Yo-Yo IR1) assessments. Principal component analysis was conducted on all physical quality measures. A 1-way analysis of variance with effect sizes was performed on 2 principal components (PCs) to identify differences between academy and international backs, forwards, and pivots at under 16 and 18 age groups. Results: Physical quality measures were reduced to 2 PCs explaining 69.4% of variance. The first PC (35.3%) was influenced by maximum and 10-m momentum, absolute IMTP, and body mass. Ten and forty-meter speed, body mass and fat, prone Yo-Yo, IMTP relative, maximum speed, and countermovement jump contributed to PC2 (34.1%). Significant differences (P < .05, effect size = −1.83) were identified between U18 academy and international backs within PC1. Conclusion: Running momentum, absolute IMTP, and body mass contributed to PC1, while numerous qualities influenced PC2. The physical qualities of academy and international youth RL players are similar, excluding U18 backs. Principal component analysis can reduce the dimensionality of a data set and help identify overall differences between playing levels. Findings suggest that RL practitioners should measure multiple physical qualities when assessing physical performance.
The effect of body mass on 30:15 end stage running speed in rugby union players.
Objectives Professional sporting organisations invest considerable resources collecting and analysing data in order to better understand the factors that influence performance. Recent advances in non-invasive technologies, such as global positioning systems (GPS), mean that large volumes of data are now readily available to coaches and sport scientists. However, analysing such data can be challenging, particularly when sample sizes are small and data sets contain multiple highly correlated variables, as is often the case in a sporting context.
Interpreting the physical qualities of youth athletes is complex due to the effects of growth, maturation and development. This study aimed to evaluate the effect of position, chronological age, relative age and maturation on the physical qualities of elite male academy rugby union players. 1,424 participants (n=2,381 observations) from nine Rugby Football Union Regional Academies prospectively completed a physical testing battery at three time points, across three playing seasons. Anthropometrics, body composition, muscular power, muscular strength, speed, aerobic capacity and running momentum were assessed. Positional differences were identified for all physical qualities. The largest effect sizes were observed for the associations between chronological age (d=0.65 to 0.73) and maturation (d=-0.77 to -0.69) and body mass related variables (i.e. body mass and running momentum). Relative strength, maximum velocity and aerobic capacity were the only models to include two fixed effects, with all other models including at least three fixed effects (i.e. position and a combination of chronological age, relative age and maturation). These findings suggest a multidimensional approach considering position, chronological age, relative age and maturation is required to effectively assess the physical qualities of male age grade rugby union players. Therefore, practitioners should use regression equations rather than traditional descriptive statistic tables to provide individualised normative comparisons, thus enhancing the application of testing results for talent identification and player development.
This study quantified and compared the collision and non-collision match characteristics across age categories (i.e. U12, U14, U16, U18, Senior) for both amateur and elite playing standards from Tier 1 rugby union nations (i.e. England, South Africa, New Zealand). Two-hundred and one male matches (5911 min ball-in-play) were coded using computerised notational analysis, including 193,708 match characteristics (e.g. 83,688 collisions, 33,052 tackles, 13,299 rucks, 1006 mauls, 2681 scrums, 2923 lineouts, 44,879 passes, 5568 kicks). Generalised linear mixed models with post-hoc comparisons and cluster analysis compared the match characteristics by age category and playing standard. Overall significant differences (p < 0.001) between age category and playing standard were found for the frequency of match characteristics, and tackle and ruck activity. The frequency of characteristics increased with age category and playing standard except for scrums and tries that were the lowest at the senior level. For the tackle, the percentage of successful tackles, frequency of active shoulder, sequential and simultaneous tackles increased with age and playing standard. For ruck activity, the number of attackers and defenders were lower in U18 and senior than younger age categories. Cluster analysis demonstrated clear differences in all and collision match characteristics and activity by age category and playing standard. These findings provide the most comprehensive quantification and comparison of collision and non-collision activity in rugby union demonstrating increased frequency and type of collision activity with increasing age and playing standard. These findings have implications for policy to ensure the safe development of rugby union players throughout the world.
Identifying the external training load variables which influence subjective internal response will help reduce the mismatch between coach-intended and athlete-perceived training intensity. Therefore, this study aimed to reduce external training load measures into distinct principal components (PCs), plot internal training response (quantified via session Rating of Perceived Exertion [sRPE]) against the identified PCs and investigate how the prescription of PCs influences subjective internal training response. Twenty-nine school to international level youth athletes wore microtechnology units for field-based training sessions. sRPE was collected post-session and assigned to the microtechnology unit data for the corresponding training session. 198 rugby union, 145 field hockey and 142 soccer observations were analysed. The external training variables were reduced to two PCs for each sport cumulatively explaining 91%, 96% and 91% of sRPE variance in rugby union, field hockey and soccer, respectively. However, when internal response was plotted against the PCs, the lack of separation between low-, moderate- and high-intensity training sessions precluded further analysis as the prescription of the PCs do not appear to distinguish subjective session intensity. A coach may therefore wish to consider the multitude of physiological, psychological and environmental factors which influence sRPE alongside external training load prescription.
The aim of this study was to quantify the mean weekly training load (TL) of elite adolescent rugby union players participating in multiple teams, and examine the differences between playing positions. Twenty elite male adolescent rugby union players (17.4 ± 0.7 years) were recruited from a regional academy and categorised by playing position; forwards (n=10) and backs (n=10). Global positioning system and accelerometer microtechnology was used to quantify external TL, and session-rating of perceived exertion (sRPE) was used to quantify internal TL during all sessions throughout a 10-week in-season period. A total of 97 complete observations (5 ± 3 weeks per participant) were analysed, and differences between positions were assessed using Cohen’s d effect sizes (ES) and magnitude-based inferences. Mean weekly sRPE was 1217 ± 364 AU (between-subject coefficient of variation (CV) = 30%), with a total distance (TD) of 11629 ± 3445 m (CV = 30%), and PlayerLoad™ (PL) of 1124 ± 330 AU (CV = 29%). Within-subject CV ranged between 5-78% for sRPE, 24-82% for TD, and 19-84% for PL. Mean TD (13063 ± 3933 vs. 10195 ± 2242 m) and PL (1246 ± 345 vs. 1002 ± 279 AU) were both likely greater for backs compared to forwards (moderate ES); however, differences in sRPE were unclear (small ES). Although mean internal TLs and volumes were low, external TLs were higher than previously reported during pre-season and in-season periods in senior professional players. Additionally, the large between-subject and within-subject variation in weekly TL suggests players participate in a chaotic training system.
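The positional comparisons above rest on Cohen's d effect sizes. Using the total-distance group means and standard deviations reported in this abstract (n = 10 per position), the pooled-SD formula reproduces an effect size of roughly 0.9, consistent with the "moderate" label on the magnitude-based scale used:

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d using the pooled standard deviation of the two groups."""
    pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled

# Backs vs forwards mean weekly total distance (m), values from the abstract
d = cohens_d(13063, 3933, 10, 10195, 2242, 10)
```

Note the study paired these effect sizes with magnitude-based inferences, which additionally weigh the uncertainty in d, not just its point estimate.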
Objectives To report two years of training injury data in senior and academy professional rugby league. Design Prospective cohort study. Method Match and training time-loss injuries and exposure data were recorded from two seasons of the European Super League competition. Eleven/12 (2021) and 12/12 (2022) senior and 8/12 (2021) and 12/12 (2022) academy teams participated. Training injuries are described in detail and overall match injuries referred to for comparison only. Results 224,000 training exposure hours were recorded, with 293 injuries at the senior level (mean [95 % confidence interval]: 3 [2–3] per 1000 h) and 268 at the academy level (2 [2–3] per 1000 h), accounting for 31 % and 40 % of all injuries (i.e., matches and training). The severity of training injuries (senior: 35 [30–39], academy: 36 [30–42] days-lost) was similar to match injuries. Lower-limb injuries had the greatest injury incidence at both levels (senior: 1.85 [1.61–2.12], academy: 1.28 [1.08–1.51] per 1000 h). Head injuries at the academy level had greater severity (35 [25–45] vs. 18 [12–14] days-lost; p < 0.01) and burden (17 [16–18] vs. 4 [4–5] days-lost per 1000 h; p = 0.02) than senior level. At the senior level, the incidence of contact injuries was lower than non-contact injuries (risk ratio: 0.29 [0.09–0.88], p = 0.02). Conclusion Training injuries accounted for about a third of injuries, with similar injury severity to match-play. Within training there is a higher rate of non-contact vs. contact injuries. Whilst current injury prevention interventions target matches, these data highlight the importance of collecting high quality training injury data to develop and evaluate injury prevention strategies in training.
Objectives To compare match injury incidence, severity and burden in men's and women's elite rugby league. Design A prospective cohort epidemiological study. Methods Time loss match injury data were collected from all men's (11,301 exposure hours) and women's (5,244 exposure hours) Super League clubs. Results Injury incidence and burden were not different between men and women (mean [95 % CI]; 54 [45 to 65] vs. 60 [49 to 74] per 1000 match-hours; p = 0.39, and 2332 [1844 to 2951] vs. 1951 [1560 to 2440] days lost per 1000 match-hours; p = 0.26). However, injury severity was greater for men than women (42 [35 to 50] vs. 35 [29 to 42]; p = 0.01). Lower limbs accounted for 54 % and 52 % of injuries for men and women, with the head/face the most frequently injured location due to concussion (12 [10 to 15] and 10 [8 to 14] per 1000 match-hours for men and women). Injuries to the knee had the greatest burden for men and women (708 [268 to 1868] and 863 [320 to 2328] days lost per 1000 match-hours). Being tackled was the most common injury mechanism for men and women (28 % and 38 %) with greater burden (p < 0.01) than other injury mechanisms. Conclusions Male and female rugby league players have similar injury incidence and burden; however, injury severity was higher in men. Head/face injuries have the highest injury incidence and knee injuries have the highest burden. These injuries should be the focus for prevention initiatives at a league (via laws), player, and coach level, with equal and specific focus for both men's and women's rugby league players.
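Incidence, severity, and burden as reported in these epidemiology abstracts are linked by a simple identity: burden equals incidence multiplied by mean severity. A sketch with hypothetical season totals chosen to be on the same scale as the men's figures above:

```python
def injury_rates(injuries, exposure_hours, total_days_lost):
    """Incidence, mean severity, and burden, expressed per 1000 exposure-hours."""
    incidence = 1000 * injuries / exposure_hours       # injuries per 1000 h
    severity = total_days_lost / injuries              # mean days lost per injury
    burden = 1000 * total_days_lost / exposure_hours   # days lost per 1000 h
    return incidence, severity, burden

# Hypothetical totals: 610 injuries and 25,620 days lost over 11,301 match-hours
inc, sev, bur = injury_rates(610, 11301, 25620)
```

Because burden = incidence × severity, two cohorts can share a similar burden while differing in severity, exactly the pattern reported for men's and women's Super League above.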
Player profiling can aid talent identification and development by highlighting strengths and weaknesses, and by evaluating training interventions. However, there is currently no consensus in rugby league on the qualities, skills, and characteristics (i.e., factors) which should be profiled, or the methods to use to assess these factors. Consequently, the aims of this two-part study were to 1) establish the most common factors and methods for profiling rugby league players, through a systematic scoping review, and 2) develop consensus on the factors and methods experts believe should be used when profiling rugby league players. In Part 1, a systematic scoping review of studies profiling rugby league players was conducted according to the PRISMA guideline for Scoping Reviews. In Part 2, a panel of 32 experts were invited to participate in a sequential three-round Delphi consensus, used to identify the factors that they believed should be profiled in rugby league players and associated methods of assessment. Part 1 identified 370 studies, which assessed varying numbers of factors from five higher-order themes; physical (n=247, 67%), health-related (n=129, 35%), other (n=60, 16%; e.g., playing experience, level of education), technical-tactical (n=58, 16%), and psychological (n=25, 7%). Only 3% of these studies featured female participants (n=11). In Part 2, 120 factors were initially identified, of which 85 reached consensus (≥70% agreement). This included 22 physical, 22 psychological, 20 technical-tactical, 15 health-related, and six player information factors. Collectively, these findings evidence the multidimensional nature of talent in rugby league, highlighting a range of factors across several domains that should be considered when identifying and monitoring talent in the sport.
Furthermore, technical-tactical and psychological factors were identified as areas for future research, due to the large number of factors which reached consensus in these areas and the comparatively low amount of research conducted in them.
This study aimed to 1) develop a consensus (≥70% agreement between experts) on injury risk factors specific to women playing rugby league, 2) establish the importance of the identified injury risk factors and the feasibility of mitigating these risk factors and 3) establish context specific barriers to injury risk management. Aim 1: A Delphi panel, consisting of 12 experts in rugby league and injury (e.g., physiotherapists, research scientists), were asked to identify injury risk factors specific to women playing rugby league. Aim 2: Seven coaches of women's rugby league teams were asked to rate each risk factor that achieved consensus by their importance and feasibility to manage. Aim 3: Coaches reported barriers which restrict injury risk factor mitigation. Of the 53 injury risk factors which achieved consensus, the five injury risk factors with the highest combination of importance and feasibility ratings were: "poor tackle technique", "a lack of pre-season intensity", "training sessions are too short", "the current medical standards", and "limited access to physiotherapists". Following the identification of injury risk factors, their feasibility to manage and context specific barriers, this study proposes three constraint-driven, integrated solutions which may reduce the barriers which limit injury risk factor management.
BACKGROUND: Elite rugby players experience poor sleep quality and quantity. This lack of sleep could compromise post-exercise recovery. Therefore, it appears central to encourage sleep in order to improve recovery kinetics. However, the effectiveness of an acute ergogenic strategy such as sleep extension on recovery has yet to be investigated among athletes. AIM: To compare the effects of a single night of sleep extension with an active recovery session (CON) on post-exercise recovery kinetics. METHODS: In a randomised cross-over design, 10 male rugby union players participated in two evening training sessions (19:30) involving collision activity, 7 days apart. After each session, participants either extended their sleep to 10 hours or attended an early morning recovery session (07:30). Prior to (PRE), immediately after (POST 0 hour [h]), 14 h (POST 14) and 36 h (POST 36) post-training, neuromuscular, perceptual and cognitive measures of fatigue were assessed. Objective sleep parameters were monitored two days before the training session and over the two-day recovery period. RESULTS: The training session induced substantial decreases in countermovement jump mean power and wellness across all time points, while heart rate recovery decreased at POST 0 in both conditions. Sleep extension resulted in greater total sleep time (effect size [90% confidence interval]: 5.35 [4.56 to 6.14]) but greater sleep fragmentation than CON (2.85 [2.00 to 3.70]). Between-group differences highlighted a faster recovery of cognitive performance following sleep extension (-1.53 [-2.33 to -0.74]) at POST 14, while autonomic function (-1.00 [-1.85 to -0.16]) and upper-body neuromuscular function (-0.78 [-1.65 to 0.08]) were better in CON. However, no difference in recovery status between groups was observed at POST 36.
CONCLUSION: The main finding of this study suggests that a single night of sleep extension positively affected cognitive function but did not improve neuromuscular function the day after a late exercise bout.
Objectives: To describe head acceleration events (HAEs) experienced by professional male rugby union players during tackle, ball-carry, and ruck events using instrumented mouthguards (iMGs). Design: Prospective observational cohort. Methods: Players competing in the 2023 Currie Cup (141 players) and Super Rugby (66 players) seasons wore iMGs. The iMG-recorded peak linear acceleration (PLA) and peak angular acceleration (PAA) were used as in vivo HAE approximations and linked to contact-event data captured using video analysis. Using the maximum PLA and PAA per contact event (HAEmax), ordinal mixed-effects regression models estimated the probabilities of HAEmax magnitude ranges occurring, while accounting for the multilevel data structure. Results: As HAEmax magnitude increased, the probability of occurrence decreased. The probability of a HAEmax ≥15g was 0.461 (0.435–0.488) (approximately 1 in every 2) and ≥45g was 0.031 (0.025–0.037) (1 in every 32) during ball carries. The probability of a HAEmax >15g was 0.381 (0.360–0.404) (1 in every 3) and >45g was 0.019 (0.015–0.023) (1 in every 53) during tackles. The probability of higher-magnitude HAEmax occurring was greatest during ball carries, followed by tackles, defensive rucks and attacking rucks, with some ruck types having similar profiles to tackles and ball carries. No clear differences between positions were observed. Conclusion: Higher-magnitude HAEmax were relatively infrequent in professional men's rugby union players. Contact events appear different, but no differences were found between positions. The occurrence of HAEmax was associated with the roles players performed within contact events, not their actual playing position. Defensive rucks may warrant greater consideration in injury prevention research.
Instrumented Mouthguards in Men’s Rugby League: Quantifying the Incidence and Probability of Head Acceleration Events at a Group and Individual Level
Abstract
Background
There is growing concern that exposure to head acceleration events (HAEs) may be associated with long-term neurological effects.
Objectives
To quantify the incidence and probability of HAEs during men’s professional rugby league match-play on a group and individual basis using instrumented mouthguards (iMGs).
Methods
A total of 91 men’s professional rugby league players participating in the 2023 Super League season wore iMGs, resulting in the collection of 775 player matches (mean 8.3 matches per player). Incidence of HAEs (rate of HAEs per median playing time) was calculated via generalised linear mixed models. Probability of HAEs (likelihood of experiencing an HAE during a tackle-event) was calculated using an ordinal mixed effects regression model.
Results
The mean incidence of HAEs exceeding 25 g per median playing time ranged from 0.86–1.88 for back positions and 1.83–2.02 for forward positions. The probability of exceeding 25 g during a tackle event was higher for ball-carriers (6.29%, 95% confidence intervals [CI] 5.27–7.58) than tacklers (4.26%, 95% CI 3.48–5.26). Several players exhibited considerably higher incidence and probability than others, e.g. one player averaged 5.02 HAEs exceeding 25 g per median playing time and another had a probability of 20.00% of exceeding 25 g during a tackle event as a ball-carrier and 34.78% as a tackler.
Conclusions
This study quantifies the incidence and probability of HAEs in men’s rugby league match-play, advancing our understanding of HAE exposure in men’s rugby league. These findings support the development of individualised HAE mitigation strategies targeted at individuals with elevated HAE exposures.
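The distinction drawn above between the per-event probability of an HAE and its time-normalised incidence can be illustrated with simple arithmetic. The sketch below is a toy illustration with made-up exposure numbers, not the paper's mixed-effects estimation:

```python
# Toy illustration of the two HAE summaries used above:
# (1) probability of an HAE per contact event, and
# (2) incidence scaled to a common (median) playing time.

def expected_haes(p_per_event: float, events: int) -> float:
    """Expected HAEs in a match = per-event probability x number of events."""
    return p_per_event * events

def incidence_per_median_time(hae_count: float, minutes_played: float,
                              median_minutes: float) -> float:
    """Scale a player's HAE count to a common (median) exposure time."""
    return hae_count / minutes_played * median_minutes

# Hypothetical: a ball-carrier with a 6.29% chance of exceeding 25 g per
# carry (the group-level figure above) making 20 carries in a match.
exp_haes = expected_haes(0.0629, 20)
# Hypothetical: that count occurred in 70 minutes; scale to an 80-minute median.
rate = incidence_per_median_time(exp_haes, 70.0, 80.0)
```

This separation is why both measures are reported: probability characterises risk per contact involvement, while incidence reflects overall exposure given playing time and contact frequency.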
Research into the physiological and movement demands of Rugby League (RL) referees is limited, with only one study in the European Super League (SL). To date, no studies have considered decision-making in RL referees. The purpose of this study was to quantify penalty accuracy scores of RL referees and determine the relationship between penalty accuracy and total distance covered (TD), high-intensity running (HIR) and heart rate per 10-min period of match-play. Time-motion analysis was undertaken on 8 referees over 148 European SL games during the 2012 season using 10 Hz GPS analysis and heart rate monitors. The number and timing of penalties awarded were quantified using Opta Stats. Referees awarded the correct decision on 74 ± 5% of occasions. The lowest accuracy was observed in the last 10-minute period of the game (67 ± 13%), with a moderate drop (ES = 0.86) in accuracy observed between 60-70 minutes and 70-80 minutes. Despite this, only small correlations were observed between HRmean, total distance, HIR efforts and penalty accuracy, although a moderate correlation was observed between maximum velocity and accuracy. Despite the small correlations observed, it would be rash to assume that the physiological and movement demands of refereeing have no influence on decision-making. More likely, other confounding variables influence referee decision-making accuracy, requiring further investigation. Findings can be used by referees and coaches to inform training protocols, ensuring training is specific to both the cognitive and physical demands of matches.
The purpose of the current study was to investigate the anthropometric, body composition and fitness characteristics of female rugby league players by playing position. Data were collected on 27 players who were part of the English elite women's rugby league squad. Player assessments comprised anthropometric (stature and body mass), body composition (dual-energy X-ray absorptiometry) and fitness (lower-body power [countermovement jump (CMJ), 20 kg jump squat (JS) and 30 cm drop jump], 5, 10, 20, 30, and 40 m sprint, 505 agility, Yo-Yo intermittent recovery test level 1) measures. Players were classified into playing position (i.e., forwards and backs) prior to analysis. A multivariate analysis of variance (MANOVA) demonstrated significant (p<0.05) differences for body mass, stature, total fat, lean mass and percentage body fat between forwards and backs. Positional differences were also observed for speed, agility and lower-body power. Significant relationships were observed between total body fat and all fitness variables, and total lean mass was related to CMJ and JS peak power. This study provides comparative data for female rugby league forwards and backs. Body fat was strongly associated with performance and should therefore be considered in developing fitness characteristics. The relationship to match performance and trainability of these characteristics warrants further investigation.
Criterion data for total energy expenditure (TEE) in elite rugby are lacking, and prediction equations may not reflect TEE accurately. This study quantified the TEE of 27 elite male rugby league (RL) and rugby union (RU) players (U16, U20, U24 age groups) during a 14-day in-season period using doubly labelled water (DLW). Measured TEE was also compared with estimates from prediction equations. Resting metabolic rate (RMR) was measured using indirect calorimetry, and physical activity level (PAL) estimated (TEE:RMR). Differences in measured TEE were unclear by code and age (RL, 4369 ± 979; RU, 4365 ± 1122; U16, 4010 ± 744; U20, 4414 ± 688; U24, 4761 ± 1523 kcal.day-1). Differences in PAL (overall mean 2.0 ± 0.4) were unclear. Very likely differences were observed in RMR by code (RL, 2366 ± 296; RU, 2123 ± 269 kcal.day-1). Differences in relative RMR between U20 and U24 were very likely (U16, 27 ± 4; U20, 23 ± 3; U24, 26 ± 5 kcal.kg-1.day-1). Differences were observed between measured and estimated TEE, using the Schofield, Cunningham and Harris-Benedict equations, for U16 (187 ± 614, unclear; -489 ± 564, likely and -90 ± 579, unclear kcal.day-1), U20 (-449 ± 698, likely; -785 ± 650, very likely and -452 ± 684, likely kcal.day-1) and U24 players (-428 ± 1292; -605 ± 1493 and -461 ± 1314 kcal.day-1, all unclear). Rugby players have high TEE, which should be acknowledged. Large inter-player variability in TEE was observed, demonstrating heterogeneity within groups; thus, published equations may not estimate TEE appropriately.
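The PAL calculation (TEE:RMR) and the comparison of measured TEE against a prediction equation reduce to simple arithmetic, sketched below. The Schofield coefficients shown are the published equation for males aged 18–30; all player values are hypothetical, and the study's actual comparison used measured, not assumed, PAL:

```python
# Sketch of the TEE bookkeeping described above (hypothetical player values).
# PAL is the ratio of total energy expenditure to resting metabolic rate.

def physical_activity_level(tee_kcal: float, rmr_kcal: float) -> float:
    """PAL = TEE : RMR."""
    return tee_kcal / rmr_kcal

def schofield_rmr_18_30(body_mass_kg: float) -> float:
    """Schofield (1985) RMR estimate for males aged 18-30, kcal/day."""
    return 15.057 * body_mass_kg + 692.2

measured_tee = 4400.0   # kcal/day, from doubly labelled water (illustrative)
measured_rmr = 2200.0   # kcal/day, from indirect calorimetry (illustrative)

pal = physical_activity_level(measured_tee, measured_rmr)
predicted_tee = schofield_rmr_18_30(95.0) * pal  # predicted RMR x activity level
bias = measured_tee - predicted_tee  # positive => equation under-estimates TEE
```

The sign and spread of such biases across players is what the abstract summarises: a group-level equation can under- or over-estimate an individual athlete's TEE by several hundred kcal per day.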
Prediction of adult performance from early age talent identification in sport remains difficult. Talent identification research has generally been performed using univariate analysis, which ignores multivariate relationships. To address this issue, this study used a novel higher-dimensional model to orthogonalize multivariate anthropometric and fitness data from junior rugby league players, with the aim of differentiating future career attainment. Anthropometric and fitness data from 257 Under-15 rugby league players were collected. Players were grouped retrospectively according to their future career attainment (i.e., amateur, academy, professional). Players were blindly and randomly divided into an exploratory (n = 165) and validation dataset (n = 92). The exploratory dataset was used to develop and optimize a novel higher-dimensional model, which combined singular value decomposition (SVD) with receiver operating characteristic analysis. Once optimized, the model was tested using the validation dataset. SVD analysis revealed 60 m sprint and agility 505 performance were the most influential characteristics in distinguishing future professional players from amateur and academy players. The exploratory dataset model was able to distinguish between future amateur and professional players with a high degree of accuracy (sensitivity = 85.7%, specificity = 71.1%; p<0.001), although it could not distinguish between future professional and academy players. The validation dataset model was able to distinguish future professionals from the rest with reasonable accuracy (sensitivity = 83.3%, specificity = 63.8%; p = 0.003). Through the use of SVD analysis, it was possible to objectively identify criteria to distinguish future career attainment with a sensitivity over 80% using anthropometric and fitness data alone. As such, this suggests that SVD analysis may be a useful analysis tool for research and practice within talent identification.
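A minimal sketch of the general idea follows, combining SVD-based orthogonalisation with a ROC-style threshold search on synthetic data. This is not the study's optimised model: the data, group sizes, and the Youden-index selection criterion are illustrative assumptions.

```python
import numpy as np

# Synthetic two-group data standing in for anthropometric/fitness columns.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (40, 4)), rng.normal(1.5, 1, (40, 4))])
y = np.array([0] * 40 + [1] * 40)   # 0 = amateur, 1 = professional (toy labels)

# Orthogonalise the feature matrix with SVD and project onto the first
# right singular vector to obtain a single composite score per player.
Xc = X - X.mean(axis=0)                      # centre columns first
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[0]
if np.corrcoef(scores, y)[0, 1] < 0:         # SVD sign is arbitrary; orient scores
    scores = -scores

def best_youden_threshold(scores, y):
    """ROC-style scan: cut-point maximising sensitivity + specificity - 1."""
    best_t, best_j = None, -1.0
    for t in np.sort(scores):
        pred = scores >= t
        sens = pred[y == 1].mean()
        spec = (~pred)[y == 0].mean()
        j = sens + spec - 1
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

threshold, youden_j = best_youden_threshold(scores, y)
```

The appeal of this approach, as the abstract notes, is that the composite score respects correlations between measures (e.g., sprint and agility times) that univariate cut-offs ignore.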
Influence of 5, 10 and 20 second movement demands on rugby league referee penalty accuracy
Accessing off-field brains in sport; an applied research model to develop practice.
Objectives: To retrospectively compare the longitudinal physical development of junior rugby league players between the Under 13 and 15 age categories in relation to their adult career attainment outcome. Design: Retrospective longitudinal design. Methods: Fifty-one former junior rugby league players were retrospectively grouped according to their career attainment outcome as adults (i.e., amateur, academy or professional). As juniors, players undertook a physical testing battery on three consecutive annual occasions (Under 13s, 14s, 15s) including height, body mass, sum of four skinfolds, maturation, vertical jump, medicine ball chest throw, 10–60 m sprint, agility 505 and estimated VO2max. Results: Future professional players were younger than academy players, with a greater estimated VO2max compared to amateur players. Between Under 13s and 15s, professional players (5.8 ± 2.5 cm) increased sitting height more than amateur (4.4 ± 2.1 cm) and academy (4.1 ± 1.4 cm) players. Logistic regression analyses demonstrated that improvements in sitting height, 60 m sprint, agility 505 and estimated VO2max distinguished between amateur and professional players with a high degree of accuracy (sensitivity = 86.7%, specificity = 91.7%). Conclusions: Findings demonstrate that the development of anthropometric, maturational and physical qualities in junior rugby league players aged between 13 and 15 years contributed to adulthood career attainment outcomes. Results suggest that the age, maturity and size advantages commonly observed in adolescent-focused talent identification research and practice may not be sensitive to changes in later stages of development in order to correctly identify career attainment. Practitioners should identify, monitor and develop the physical qualities of adolescent rugby league players with long-term athlete development in mind.
To A) evaluate the difference in performance of the 30-15 Intermittent Fitness Test (30-15IFT) across four squads in a professional rugby union club in the United Kingdom (UK), and B) consider body mass in the interpretation of the end velocity of the 30-15IFT (VIFT). One hundred and fourteen rugby union players completed the 30-15IFT mid-season. VIFT demonstrated small and possibly lower (ES = -0.33; 4/29/67) values in the Under 16s compared to the Under 21s, with further comparisons unclear. With body mass included as a covariate, all differences were moderate to large, and very likely to almost certainly lower in the squads with lower body mass, with the exception of comparisons between the Senior and Under 21 squads. The data demonstrate that there appears to be a ceiling to the VIFT attained in rugby union players, which does not increase from Under 16s to Senior level. However, the associated increases in body mass with increased playing level suggest that the ability to perform high-intensity running increases with age, although this is not translated into greater VIFT due to the detrimental effect of body mass on change of direction. Practitioners should be aware that VIFT is unlikely to improve; however, it should be monitored during periods where increases in body mass are evident.
Purpose: This study compared the body size and three-compartment body composition between academy and senior professional rugby league players using dual-energy X-ray absorptiometry (DXA). Methods: Academy (age 18.1±1.1 years; n=34) and senior (age 26.2±4.6 years; n=63) rugby league players received one total-body DXA scan. Height, body mass and body fat percentage, alongside total and regional fat mass, lean mass and bone mineral content (BMC), were compared. Independent t-tests with Cohen's d effect sizes and multivariate analysis of covariance (MANCOVA), controlling for height and body mass, with partial eta squared (η2) effect sizes, were used to compare total and regional body composition. Results: Senior players were taller (183.2±5.8 vs. 179.2±5.7 cm; p=0.001; d=0.70) and heavier (96.5±9.3 vs. 86.5±9.0 kg; p<0.001; d=1.09) with a lower body fat percentage (16.3±3.7 vs. 18.0±3.7%; p=0.032; d=0.46) than academy players. MANCOVA identified significant overall main effects for total and regional body composition between academy and senior players. Senior players had lower total fat mass (p<0.001, η2=0.15), greater total lean mass (p<0.001, η2=0.14) and greater total BMC (p=0.001, η2=0.12) than academy players. For regional sites, academy players had significantly greater fat mass at the legs (p<0.001; η2=0.29) than senior players. Conclusions: The lower age, height, body mass and BMC of academy players suggest that these players are still developing musculoskeletal characteristics. Gradual increases in lean mass and BMC whilst controlling fat mass are an important consideration for practitioners working with academy rugby league players, especially within the lower body.
Body composition analysis using dual-energy X-ray absorptiometry (DXA) is becoming increasingly popular in both clinical and sports science settings. Obesity, characterised by high fat mass (FM), is associated with larger precision errors; however, precision errors for athletic groups with high levels of lean mass (LM) are unclear. Total-body (TB) and regional (limbs and trunk) body composition were determined from two consecutive total-body scans (GE Lunar iDXA) with re-positioning in 45 elite male rugby league players (age: 21.8 ± 5.4 years; BMI: 27.8 ± 2.5 kg.m-2). The root-mean-square standard deviations (percentage coefficients of variation) were TB bone mineral content (BMC): 24 g (1.7%), TB LM: 321 g (1.6%), and TB FM: 280 g (2.3%). Regional precision values were superior for measurements of BMC: 4.7-16.3 g (1.7-2.1%) and LM: 137-402 g (2.0-2.4%), than for FM: 63-299 g (3.1-4.1%). The precision error of DXA body composition measurements in elite male rugby players is higher than that reported elsewhere for normal adult populations and similar to that reported in those who are obese. Caution is advised when interpreting longitudinal DXA-derived body composition measurements in male rugby players, and population-specific least significant change values should be adopted.
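The precision statistics reported here (root-mean-square SD, %CV and the least significant change used to judge longitudinal measurements) follow a standard duplicate-scan calculation, sketched below with made-up lean-mass values. The 2.77 multiplier is the conventional 95%-confidence LSC factor, not a figure from this study:

```python
import math

# Duplicate-scan precision sketch (illustrative values, not the study's data).

def precision(scan1, scan2):
    """Root-mean-square SD, %CV and least significant change from paired repeats."""
    n = len(scan1)
    # For duplicates, each pair's SD is |difference| / sqrt(2), so the RMS-SD
    # across subjects is sqrt(sum of squared differences / 2n).
    rms_sd = math.sqrt(sum((a - b) ** 2 for a, b in zip(scan1, scan2)) / (2 * n))
    grand_mean = sum(scan1 + scan2) / (2 * n)
    cv_percent = 100 * rms_sd / grand_mean
    # Least significant change at 95% confidence = 2.77 x precision error.
    lsc = 2.77 * rms_sd
    return rms_sd, cv_percent, lsc

lean1 = [77.0, 80.5, 74.2]   # total-body lean mass, kg (scan 1, illustrative)
lean2 = [77.4, 80.1, 74.5]   # same players re-scanned after re-positioning
rms_sd, cv, lsc = precision(lean1, lean2)
```

The abstract's closing recommendation follows directly from this arithmetic: a larger population-specific RMS-SD inflates the LSC, so a change in a rugby player's DXA measurement must be correspondingly larger before it can be treated as real.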
Due to the focus of research within athletic populations, little is known about the hydration strategies of rugby league referees. We observed all 8 full-time professional referees during 31 Super League matches to investigate the drinking strategies and magnitude of dehydration (body mass loss) experienced by referees during match-play. Referees arrived and remained euhydrated (urine osmolality pre- and post-match: 558 ± 310 and 466 ± 283 mOsmol•kg-1). Mean body mass change was -0.7 ± 0.8%, fluid loss was 890 ± 435 g, and fluid intake was 444 ± 167, 438 ± 190, 254 ± 108 and 471 ± 221 g during pre-match, the first half, half-time and the second half, respectively. This study suggests elite referees adopt appropriate hydration strategies during match-play to prevent large reductions in body mass, although individual variability was observed. Future research should investigate dehydration in referees from other sports and its effects on refereeing performance.
Six-year changes in body composition of UK professional rugby league players using dual-energy X-ray absorptiometry
Recent research has demonstrated that greater player body mass, lean mass (LM) and lower percentage body fat (%BF) are positively related to rugby league performance (e.g. Gabbett et al., 2011, Journal of Sports Sciences, 29, 1655–1664). Correspondingly, over recent years, there has been an increasing emphasis on player size and muscularity in the professional sport. However, to date, there has been no published data on the longitudinal changes in the body composition of senior professional rugby league players. Therefore, the purpose of this study was to investigate changes in three-compartment body composition over six years, in UK professional rugby league players using dual energy X-ray absorptiometry (DXA). Following institutional ethical approval, 12 professional rugby league players (baseline age: 25.0 ± 3.9 years, height: 183.4 ± 8.4 cm) from one European Super League club received total body DXA scans (Lunar iDXA, GE Healthcare) midseason in 2008 and 2014 when euhydrated (urine osmolality < 700 mOsmol · kg−1). The regions of interest on scan images were checked and manually adjusted where necessary by a qualified densitometrist according to DXA manufacturer guidelines. The primary outcomes were total body mass, %BF, total and regional fat mass (FM), LM and bone mineral content (BMC). A repeated measures multivariate analysis of variance (MANOVA), controlling for chronological age, examined differences between the two time points. Effect sizes were calculated. The repeated measures MANOVA found an overall significant effect for time (P = 0.048, = 0.99). Univariate analysis identified increases in total body mass (95.3 ± 12.2 vs. 98.5 ± 12.2 kg, P = 0.005, d = 0.26), total LM (77.2 ± 8.6 vs. 79.8 ± 9.6 kg, P = 0.006, d = 0.29) and leg LM (25.8 ± 2.8 vs. 27.6 ± 3.8 kg, P = 0.049, d = 0.54) across the six-year period. Increases were also found for total BMC (4324 ± 566 vs. 
4575 ± 582 g, P < 0.001, d = 0.44) and BMC at the arms (P = 0.006, d = 0.36), legs (P = 0.001, d = 0.43) and trunk (P < 0.001, d = 0.45) regions over the six-year period. No changes were identified in %BF or FM across the six-year period. This study demonstrates that senior professional rugby league players competing in the European Super League over a six-year period have increased total body mass, which can be predominantly explained by a gain in LM of the lower body. Such findings may reflect the increasing physical demands of the professional game and a greater emphasis on lower body resistance training. These players had remained competitive in the professional sport for six years, which suggests that increasing LM and BMC may be beneficial to career longevity.
© 2016 National Strength and Conditioning Association. This study determined the magnitude of change in adductor strength after a competitive match in academy rugby union players and examined the relationship between locomotive demands of match-play and changes in postmatch adductor strength. A within-subject repeated measures design was used. Fourteen academy rugby union players (age, 17.4 ± 0.8 years; height, 182.7 ± 7.6 cm; body mass, 86.2 ± 11.6 kg) participated in the study. Each player performed 3 maximal adductor squeezes at 45° of hip flexion before and immediately, 24, 48, and 72 hours postmatch. Global positioning system was used to assess locomotive demands of match-play. Trivial decreases in adductor squeeze scores occurred immediately (-1.3 ± 2.5%; effect size [ES] = -0.11 ± 0.21; likely, 74%) and 24 hours after match (-0.7 ± 3%; ES = -0.06 ± 0.25; likely, 78%), whereas a small but substantial increase occurred at 48 hours (3.8 ± 1.9%; ES = 0.32 ± 0.16; likely, 89%) before reducing to trivial at 72 hours after match (3.1 ± 2.2%; ES = 0.26 ± 0.18; possibly, 72%). Large individual variation in adductor strength was observed at all time points. The relationship between changes in adductor strength and distance covered at sprinting speed (VO2max 81%) was large immediately postmatch (p = 0.056, r = -0.521), moderate at 24 hours (p = 0.094, r = -0.465), and very large at 48 hours postmatch (p = 0.005, r = -0.707). Players who cover greater distances sprinting may suffer greater adductor fatigue in the first 48 hours after competition. The assessment of adductor strength using the adductor squeeze test should be considered postmatch to identify players who may require additional rest before returning to field-based training.
This study established the between-day reliability and sensitivity of a countermovement jump (CMJ), plyometric push-up, wellbeing questionnaire and whole blood creatine kinase concentration [CK] in elite male youth rugby union players. The study also established the between-day reliability of 1, 2 or 3 CMJ and plyometric push-up attempts. Twenty-five players completed tests on 2 occasions separated by 5 days (of rest). Between-day typical error (TE), coefficient of variation (CV) and smallest worthwhile change (SWC) were calculated for the wellbeing questionnaire, [CK] and CMJ and plyometric push-up metrics (peak/mean power, peak/mean force, height, flight-time and flight-time to contraction-time ratio) for 1 maximal effort or taking the highest score from 2 or 3 maximal efforts. The results from this study suggest that CMJ mean power (2 or 3 attempts), peak force or mean force, and plyometric push-up mean force (from 2 or 3 attempts) should be used for assessing lower- and upper-body neuromuscular function respectively, due to both their acceptable reliability (CV<5%) and good sensitivity (CV<SWC).
Changes in sprint and jump height during an academic year in high school adolescent and youth sport athletes
Purpose: This study quantified the frequencies and timings of rugby union match-play phases (i.e., attacking, defending, ball in play (BIP) and ball out of play (BOP)) and then compared the physical characteristics of attacking, defending and BOP between forwards and backs. Methods: Data were analysed from 59 male rugby union academy players (259 observations). Each player wore a micro-technology device (Optimeye S5, Catapult), with video footage analysed for phase timings and frequencies. Dependent variables were analysed using a linear mixed-effects model and assessed with magnitude-based inferences and Cohen's d effect sizes (ES). Results: Attack, defence, BIP and BOP times were 12.7 ± 3.1, 14.7 ± 2.5, 27.4 ± 2.9 and 47.4 ± 4.1 min, respectively. Mean attack (26 ± 17 s), defence (26 ± 18 s) and BIP (33 ± 24 s) phases were shorter than BOP phases (59 ± 33 s). The relative distance in attacking phases was similar (112.2 ± 48.4 vs. 114.6 ± 52.3 m·min-1, ES = 0.00 ±0.23) between forwards and backs, while greater in forwards (114.5 ± 52.7 vs. 109.0 ± 54.8 m·min-1, ES = 0.32 ±0.23) during defence and greater in backs during BOP (ES = -0.66 ±0.23). Conclusion: Total time in attack, defence and therefore BIP was less than in BOP. Relative distance was greater in forwards during defence, greater in backs during BOP and similar between positions during attack. Players should be exposed to training intensities from in-play phases (i.e., attack and defence) rather than whole-match data, and should practise technical skills at these intensities.
It is unknown whether instantaneous visual feedback of resistance training outcomes can enhance barbell velocity in younger athletes. Therefore, the purpose of this study was to quantify the effects of visual feedback on mean concentric barbell velocity in the back squat, and to identify changes in motivation, competitiveness, and perceived workload. In a randomised crossover design (Feedback vs. Control), feedback of mean concentric barbell velocity was or was not provided throughout a set of 10 repetitions in the barbell back squat. Magnitude-based inferences were used to assess changes between conditions, with mean concentric velocity almost certainly greater in the Feedback (0.70 ±0.04 m·s-1) than the Control (0.65 ±0.05 m·s-1) condition. Additionally, individual repetition mean concentric velocity ranged from possibly (repetition two: 0.79 ±0.04 vs. 0.78 ±0.04 m·s-1) to almost certainly (repetition 10: 0.58 ±0.05 vs. 0.49 ±0.05 m·s-1) greater when feedback was provided, while almost certain differences were observed in motivation, competitiveness, and perceived workload. Providing adolescent male athletes with visual kinematic information while completing resistance training is beneficial for the maintenance of barbell velocity during a training set, potentially enhancing physical performance. Moreover, these improvements were observed alongside increases in motivation, competitiveness and perceived workload, providing insight into the underlying mechanisms responsible for the performance gains observed. Given the observed maintenance of barbell velocity during a training set, practitioners can use this technique to manipulate training outcomes during resistance training.
Background: Post-match fatigue has yet to be investigated in academy rugby union players. Objectives: To determine the magnitude of change in upper-body (plyometric push-up (PP) flight-time) and lower-body (countermovement jump (CMJ) mean power) neuromuscular function (NMF), whole blood creatine kinase (CK) and perception of well-being following a competitive match in academy rugby union players. Methods: Fourteen academy rugby union players participated in the study. Measures were taken 2 h pre-match (baseline) and immediately post-match. Further testing was also undertaken at 24, 48 and 72 h post-match. Changes in measures from baseline were determined using magnitude-based inferences. Results: Decreases in CMJ mean power were likely substantial immediately post-match (-5.5±3.3%), very likely at 24 h (-7±3.9%), likely at 48 h (-5.8±5.4%), and likely trivial at 72 h (-0.8±3.8%) post-match. PP flight-time was very likely reduced immediately (-15.3±7.3%) and 24 h (-11.5±5.7%) post-match, while possibly increased at 48 h (3.5±6.0%) and likely trivial at 72 h (-0.9±5.4%) post-match. Decreases in perception of well-being were almost certainly substantial at 24 h (-24.0±4.3%), very likely at 48 h (-8.3±5.9%), and likely substantial at 72 h (-3.6±3.7%) post-match. Increases in CK were almost certainly substantial immediately (138.5±33%), 24 h (326±78%) and 48 h (176±62%) post-match, while very likely substantial at 72 h (57±35%) post-match. Conclusion: These findings demonstrate the transient and multidimensional nature of post-match fatigue in academy rugby union players. Furthermore, the results demonstrate the individual nature of recovery, with many players demonstrating recovery profiles different from the group average. Keywords: collision sport, monitoring, sports injuries
Organized Chaos in Late Specialization Team Sports: Weekly Training Loads of Elite Adolescent Rugby Union Players.
Phibbs, PJ, Jones, B, Roe, G, Read, DB, Darrall-Jones, J, Weakley, J, Rock, A, and Till, K. Organized chaos in late specialization team sports: weekly training loads of elite adolescent rugby union players. J Strength Cond Res 32(5): 1316-1323, 2018-The aim of this study was to quantify the mean weekly training load (TL) of elite adolescent rugby union players participating in multiple teams and examine the differences between playing positions. Twenty elite male adolescent rugby union players (17.4 ± 0.7 years) were recruited from a regional academy and categorized by playing position: forwards (n = 10) and backs (n = 10). Global positioning system and accelerometer microtechnology was used to quantify external TL, and session rating of perceived exertion (sRPE) was used to quantify internal TL during all sessions throughout a 10-week in-season period. A total of 97 complete observations (5 ± 3 weeks per participant) were analyzed, and differences between positions were assessed using Cohen's d effect sizes (ES) and magnitude-based inferences. Mean weekly sRPE was 1,217 ± 364 arbitrary units (AU) (between-subject coefficient of variation [CV] = 30%), with a total distance (TD) of 11,629 ± 3,445 m (CV = 30%), and PlayerLoad (PL) of 1,124 ± 330 AU (CV = 29%). Within-subject CV ranged between 5 and 78% for sRPE, 24 and 82% for TD, and 19 and 84% for PL. Mean TD (13,063 ± 3,933 vs. 10,195 ± 2,242 m) and PL (1,246 ± 345 vs. 1,002 ± 279 AU) were both likely greater for backs compared with forwards (moderate ES); however, differences in sRPE were unclear (small ES). Although mean internal TLs and volumes were low, external TLs were higher than previously reported during preseason and in-season periods in senior professional players. Additionally, the large between-subject and within-subject variation in weekly TL suggests that players participate in a chaotic training system.
Purpose: To quantify and compare the maximum running intensities during rugby union match-play. Methods: Running intensity was quantified using micro-technology devices (S5 Optimeye, Catapult) from 202 players during 24 matches (472 observations). Instantaneous speed was used to calculate relative distance (m·min-1) using a 0.1 s rolling mean for different time durations (15 and 30 s and 1, 2, 2.5, 3, 4, 5, and 10 min). Data were analysed using a linear mixed-model and assessed with magnitude-based inferences and Cohen’s d effect sizes (ES). Results: Running intensity for consecutive durations (e.g., 15 s vs. 30 s, 30 s vs. 1 min, etc.) decreased as time increased (ES = 0.48-2.80). Running intensity was lower in forwards than backs during all durations (-0.74 ±0.21 to -1.19 ±0.21). Running intensity for the second row and back row positions was greater than the front row players at all durations (-0.58 ±0.38 to -1.18 ±0.29). Running intensity for scrum-halves was greater (0.46 ±0.43 to 0.86 ±0.39) than inside and outside backs for all durations besides 15 and 30 s. Conclusions: Front rowers and scrum-halves were markedly different from other sub-positional groups and should be conditioned appropriately. Coaches working in academy rugby can use this information to appropriately overload the intensity of running, specific to time durations and positions.
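The duration-specific running intensities above come from a moving-average peak search over the speed trace. A minimal sketch of that idea; the 10 Hz sample rate and example trace below are assumptions for illustration, not data from the study:

```python
# Hedged sketch of peak running intensity over a rolling window,
# in the spirit of the moving-average approach described.
def peak_rolling_mean(speeds_m_per_s, window_samples):
    """Highest mean speed (m/s) over any contiguous window of samples."""
    if window_samples > len(speeds_m_per_s):
        raise ValueError("window longer than the recording")
    window_sum = sum(speeds_m_per_s[:window_samples])
    best = window_sum
    for i in range(window_samples, len(speeds_m_per_s)):
        # Slide the window one sample: add the new value, drop the oldest.
        window_sum += speeds_m_per_s[i] - speeds_m_per_s[i - window_samples]
        best = max(best, window_sum)
    return best / window_samples

# Assumed 10 Hz trace: a 15 s window is 150 samples; m/s * 60 gives m/min.
speeds = [2.0] * 300 + [6.0] * 150 + [2.0] * 300
print(peak_rolling_mean(speeds, 150) * 60)  # peak m/min over 15 s
```

Repeating the search with longer windows (30 s, 1 min, ...) reproduces the expected pattern in the abstract: peak intensity falls as the averaging duration grows.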
The primary aim of the study was to assess the level of agreement between the criterion session-rating of perceived exertion (sRPE30min) and a practical measure of a self-reported web-based training load questionnaire 24-hours post-training (sRPE24h) in adolescent athletes. The secondary aim was to assess the agreement between weekly summated sRPE24h values (ƩsRPE24h) and a weekly web-based training diary (sRPEweekly) for all field-based training accumulated on a subsequent training week. Thirty-six male adolescent rugby players (age 16.7 ± 0.5 years) were recruited from a regional academy. sRPE30min measures were recorded 30-minutes following a typical field-based training session. Participants then completed the sRPE24h via a web-based training load questionnaire 24-hours post-training, reporting both session duration and intensity. In addition, on a subsequent week, participants completed the sRPE24h daily and then completed the sRPEweekly at the end of the week, using the same web-based platform, to recall all field-based training session durations and intensities over those seven days. Biases were trivial between sRPE30min and sRPE24h for sRPE (0.3% [-0.9 to 1.5]), with nearly perfect correlations (0.99 [0.98 to 0.99]), and small typical error of the estimate (TEE; 4.3% [3.6 to 5.4]). Biases were trivial between ƩsRPE24h and sRPEweekly for sRPE (5.9% [-2.1 to 14.2]), with very large correlations (0.87 [0.78 to 0.93]), and moderate TEE (28.5% [23.3 to 36.9]). The results of this study show that sRPE24h is a valid and robust method to quantify training loads in adolescent athletes. However, sRPEweekly was found to have a substantial TEE (29%), limiting practical application.
Limited information is available regarding the training loads (TLs) of adolescent rugby union players. One hundred and seventy male players (age 16.1 ± 1.0 years) were recruited from ten teams representing two age categories (under-16 and under-18) and three playing standards (school, club and academy). Global positioning systems, accelerometers, heart rate and session-rating of perceived exertion (s-RPE) methods were used to quantify mean session TLs. Session demands differed between age categories and playing standards. Under-18 academy players were exposed to the highest session TLs in terms of s-RPE (236 ± 42 AU), total distance (4176 ± 433 m), high speed running (1270 ± 288 m) and PlayerLoad™ (424 ± 56 AU). Schools players had the lowest session TLs in both respective age categories. Training loads and intensities increased with age and playing standard. Individual monitoring of TL is key to enable coaches to maximise player development and minimise injury risk.
The purpose of this study was to quantify the physical demands of representative adolescent rugby union match-play and investigate the difference between playing positions and age groups. Players (n=112) were classified into 6 groups by playing position (forwards and backs) and age group (U16, U18, U20). The physical demands were measured using microsensor-based technology and analysed using magnitude-based inferences to assess practical importance. Backs had a greater relative distance (except U16s) and a greater high-speed running distance per minute than forwards, with the magnitude of difference between the positions becoming larger in older age groups. Forwards had higher values of PlayerLoad™ per minute (accumulated accelerations from the three axes of movement) and PlayerLoad™ slow per minute (accumulated accelerations from the three axes of movement where velocity is <2 m·s-1) than backs at all age groups. Relative distance, low- and high-speed running per minute all had a trend to be lower in older age groups for both positions. PlayerLoad™ per minute was greater in U18 than U16 and U20 for both positions. PlayerLoad™ slow per minute was greater for older age groups besides the U18 and U20 comparisons, which were unclear. The contrasts in physical demands experienced by different positions reinforce the need for greater exposure to sprinting and collision-based activity for backs and forwards, respectively. Given PlayerLoad™ metrics peak at U18 and locomotor demands seem to be lower in older ages, the demands of representative adolescent rugby union do not seem to be greater at U20 as expected.
The purpose of this study was to investigate longitudinal body composition of professional rugby union players over one competitive season. Given the potential for variability in changes, and as the first to do so, we conducted individual analysis in addition to analysis of group means. Thirty-five professional rugby union players from one English Premiership team (forwards: n = 20, age: 25.5 ± 4.7 years; backs: n = 15, age: 26.1 ± 4.5 years) received one total-body dual-energy X-ray absorptiometry (DXA) scan at preseason (August), midseason (January) and endseason (May), enabling quantification of body mass, total and regional fat mass, lean mass, percentage tissue fat mass (%TFM) and bone mineral content (BMC). Individual analysis was conducted by applying least significant change (LSC), derived from our previously published precision data and in accordance with International Society for Clinical Densitometry (ISCD) guidelines. Mean body mass remained stable throughout the season (p > 0.05), but total fat mass and %TFM increased from pre to endseason, and mid to endseason (p < 0.05). There were also statistically significant increases in total-body BMC across the season (p < 0.05). In both groups, there was a loss of lean mass between mid and endseason (p < 0.018). Individual evaluation using LSC and Bland-Altman analysis revealed a meaningful loss of lean mass in 17 players and a gain of fat mass in 21 players from pre to endseason. Twelve players had no change and there were no differences by playing position. There were individual gains or no net changes in BMC across the season for 10 and 24 players, respectively. This study highlights the advantages of an individualised approach to DXA body composition monitoring and this can be achieved through application of derived LSC.
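The individual analysis above rests on the least significant change (LSC) rule; by ISCD convention the LSC is 2.77 × the precision error of the measurement. A minimal sketch with illustrative numbers (the study's own precision data are not reproduced here):

```python
# Hedged sketch of the ISCD least-significant-change rule used for
# individual DXA monitoring (the 0.5 kg precision error is illustrative).
def least_significant_change(precision_error: float) -> float:
    """Smallest change exceeding measurement noise at 95% confidence."""
    return 2.77 * precision_error  # 2.77 = 1.96 * sqrt(2)

def is_real_change(baseline: float, follow_up: float,
                   precision_error: float) -> bool:
    """True if the observed change is larger than the LSC."""
    return abs(follow_up - baseline) > least_significant_change(precision_error)

# e.g. lean mass (kg) with an assumed DXA precision error of 0.5 kg
print(is_real_change(80.0, 82.0, 0.5))  # 2.0 kg change vs LSC of 1.385 kg
```

Classifying each player's pre-to-post change against the LSC, rather than comparing group means, is what allows the study to report how many individuals meaningfully gained or lost mass.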
PURPOSE: The purpose of this study was to evaluate changes in performance of a 6-second cycle ergometer test (CET) and countermovement jump (CMJ) during a 6-week training block in professional rugby union players. METHODS: Twelve young professional rugby union players performed two CET and CMJ on the first and fourth morning of every week prior to the commencement of daily training during a 6-week training block. Standardised changes in the highest score of two CET and CMJ efforts were assessed using linear mixed modelling and magnitude-based inferences. RESULTS: Following increases in training load during weeks three to five, moderate decreases in CMJ peak and mean power, and small decreases in flight-time were observed during weeks five and six that were very likely to almost certainly greater than the smallest worthwhile change, suggesting neuromuscular fatigue. However, only small decreases, possibly greater than the smallest worthwhile change, were observed in CET peak power. Changes in CMJ peak and mean power, were moderately greater than in CET peak power during this period, while the difference between flight-time and CET peak power was small. CONCLUSIONS: The greater weekly changes in CMJ metrics in comparison to CET may indicate differences in the capacities of these tests to measure training induced lower-body neuromuscular fatigue in rugby union players. However, future research is needed to ascertain the specific modes of training that elicit changes in CMJ and CET in order to determine the efficacy of each test for monitoring neuromuscular function in rugby union players.
Training that is efficient and effective is of great importance to an athlete. One method of improving efficiency is by incorporating supersets into resistance training routines. However, the structuring of supersets is still unexplored. Therefore, the purpose of this study was to assess the effects of agonist-antagonist (A-A), alternate peripheral (A-P), and similar biomechanical (SB) superset configurations on rating of perceived exertion (RPE) and kinetic and kinematic changes during the bench press. Ten subjects performed resistance training protocols in a randomized-crossover design, with magnitude-based inferences assessing changes/differences within and between protocols. Changes in RPE were very likely and almost certainly greater in the A-P and SB protocols when compared with the A-A, while all superset protocols had very likely to almost certain reductions in mean velocity and power from baseline. Reductions in mean velocity and power were almost certainly greater in the SB protocol, with differences between the A-A and A-P protocols being unclear. Decreases in peak force were likely and almost certain in the A-A and SB protocols respectively, with changes in A-P being unclear. Differences between these protocols showed likely greater decreases in SB peak forces when compared to A-A, with all other superset comparisons being unclear. This study demonstrates the importance of exercise selection when incorporating supersets into a training routine. It is suggested that the practitioner uses A-A supersets when aiming to improve training efficiency and minimize reductions in kinetic and kinematic output of the agonist musculature while completing the barbell bench press.
The provision of instantaneous visual kinematic feedback has been shown to improve physical performance and psychological traits. However, this research has only investigated changes across a single set of exercise in adolescent males. Therefore, the aim of this study was to assess the effects of visual kinematic feedback on kinematic outputs during multiple sets of the jump squat in adolescent female athletes. In addition, motivation and competitiveness were also assessed. Eleven adolescent female athletes volunteered to take part in this study. In a randomised-crossover study design, subjects either were or were not provided with peak concentric velocity using visual feedback during three sets of six repetitions of the jump squat. A linear position transducer measured peak concentric velocity of each repetition across the three sets, while motivation and competitiveness were measured before and after exercise. Magnitude-based inferences were used to assess changes between conditions, with mean peak concentric velocity (mean ±90%CI: 0.23 ±0.04 m·s-1; ES ±90%CI: 2.73 ±0.44; percent ±90%CI: 10.3 ±1.8) and power (mean ±90%CI: 330 ±53 W; ES ±90%CI: 2.87 ±0.52; percent ±90%CI: 16.5 ±3.2) almost certainly greater when feedback was provided. Furthermore, motivation almost certainly improved (ES ±90%CI: 2.81 ±0.63) when feedback was provided, while competitiveness was almost certainly greater (ES ±90%CI: 4.88 ±0.58) following the provision of kinematic feedback. Findings from this study demonstrate that providing adolescent female athletes with visual kinematic information while completing plyometric exercise is beneficial for performance and can enhance psychological responses across multiple sets. Consequently, practitioners are advised to utilise kinematic feedback during training to enhance training quality and improve motivation and competitiveness.
This study investigated the change in body composition and bone mineral content (BMC) of senior rugby league players between 2008 and 2014. Twelve male professional rugby league players (age, 24.6±4.0 years; stature, 183.4±8.4 cm) received a DXA scan during pre-season in 2008 and 2014. Between 2008 and 2014, very likely increases in leg lean mass, total trunk and leg BMC, and a likely increase in arm BMC and possible increases in body mass, total and trunk fat mass, and total, trunk and arm lean mass were observed. Unlikely decreases and unclear changes in leg and arm fat mass were also found. Large negative correlations were observed between age and body mass (r=-0.72), lean mass (r=-0.70), fat mass (r=-0.61), and BMC (r=-0.84) change. Three participants (19.1 ± 1.6 years) increased lean mass by 7.0 – 9.3 kg. Younger players had the largest increases in lean mass during this period, although an older player (30-year-old) still increased lean mass. Differences in body composition change were also observed for participants of the same age, thus contextual factors should be considered. This study demonstrates the individuality of body composition changes in senior professional rugby players, while highlighting the potential for change in young athletes.
Adolescent rugby players benefit from the implementation of resistance training. However, resistance training practices and how they influence short-term physical change are unknown. Therefore, the purpose of this study was to quantify resistance training practices, evaluate physical development, and relate these changes to resistance training variables across 12 weeks in adolescent rugby union players. Thirty-five male adolescent rugby union players participated in the study, with subjects completing an anthropometric and physical testing battery pre- and post- a 12-week in-season mesocycle. Subjects recorded resistance training frequency, exercises, repetitions, load, minutes, and rating of perceived exertion for each session using weekly training diaries during the 12-week period. Paired sample t-tests and Cohen’s d effect sizes were used to assess change, while Pearson correlation coefficients assessed relationships between variables. Resistance training practices were variable, while significant (p ≤0.05) improvements in body mass, countermovement jump (CMJ) height, front squat, bench press, and chin up strength were observed. Resistance training volume load had moderate to strong relationships with changes in CMJ (r =0.71), chin up (r =0.73) and bench press (r =0.45). Frequency of upper and lower body compound exercises had significant moderate to large relationships with changes in CMJ (r =0.68), chin up (r =0.65), and bench press (r =0.41). Across a 12-week in-season period, adolescent rugby union players have varying resistance training practices, while anthropometric and physical characteristics appear to improve. Given the observed relationships, increased volume loads through the implementation of free-weight compound exercises could be an effective method for improving physical qualities in young rugby players. Keywords: rugby union, resistance training, strength, power
PURPOSE: Investigate the acute and short-term (i.e., 24 h) effects of traditional (TRAD), superset (SS), and tri-set (TRI) resistance training protocols on perceptions of intensity and physiological responses. METHODS: Fourteen male participants completed a familiarisation session and three resistance training protocols (i.e., TRAD, SS, and TRI) in a randomised-crossover design. Rating of perceived exertion, lactate concentration ([Lac]), creatine kinase concentration ([CK]), countermovement jump (CMJ), testosterone, and cortisol concentrations were measured pre, immediately post, and 24-h post the resistance training sessions with magnitude-based inferences assessing changes/differences within/between protocols. RESULTS: TRI reported possible to almost certainly greater efficiency and rating of perceived exertion, although session perceived load was very likely lower. SS and TRI had very likely to almost certainly greater lactate responses during the protocols, with changes in [CK] being very likely and likely increased at 24 h, respectively. At 24-h post-training, CMJ variables in the TRAD protocol had returned to baseline; however, SS and TRI were still possibly to likely reduced. Possible increases in testosterone immediately post SS and TRI protocols were reported, with SS showing possible increases at 24-h post-training. TRAD and SS showed almost certain and likely decreases in cortisol immediately post, respectively, with TRAD reporting likely decreases at 24-h post-training. CONCLUSIONS: SS and TRI can enhance training efficiency and reduce training time. However, acute and short-term physiological responses differ between protocols. Athletes can utilise SS and TRI resistance training, but may require additional recovery post-training to minimise effects of fatigue.
The aims of this study were to determine the variability of weekly match and training loads in adolescent rugby union players across a competitive season, and to investigate the effect of match frequency on load distribution across different activities. Internal match and training load data (i.e., session-rating of perceived exertion: sRPE) were collected daily from 20 players from a regional academy across a 14-week season. Data were analysed using a mixed-effects linear model, and variability was reported as a coefficient of variation (CV). Differences between 0-, 1-, 2-, and 3-match weeks were assessed using Cohen’s d effect sizes and magnitude-based inferences. Mean weekly total match and training sRPE load was 1425 ± 545 arbitrary units (AU), with a between-player CV of 10 ±6% and within-player CV of 37 ±3%. Mean week-to-week change in total sRPE load was 497 ± 423 AU (35%), and 40% of weekly observations were outside of the suggested acute:chronic workload ratio ‘safe zone’. Total weekly sRPE loads increased substantially with match frequency (1210 ± 571 AU, 1511 ± 489 AU, and 1692 ± 517 AU, for 0-, 1-, and 2-match weeks, respectively), except for 3-match weeks (1520 ± 442 AU). Weekly match and training loads were highly variable for adolescent rugby players during the competitive season, and match frequency has a substantial effect on the distribution of loads. Therefore, match and training loads should be coordinated, monitored, and managed on an individual basis to protect players from negative training consequences, and to promote long term athlete development.
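The acute:chronic workload ratio referenced above is conventionally computed as the most recent week's load over a rolling four-week mean. A minimal sketch; the 0.8-1.3 'safe zone' bounds are the commonly cited ones and an assumption here, not values reported by this study:

```python
# Hedged sketch of the acute:chronic workload ratio (ACWR) from
# weekly sRPE loads (numbers below are illustrative).
def acwr(weekly_loads):
    """Acute:chronic ratio: last week's load over the 4-week rolling mean."""
    if len(weekly_loads) < 4:
        raise ValueError("need at least 4 weeks of loads")
    acute = weekly_loads[-1]
    chronic = sum(weekly_loads[-4:]) / 4
    return acute / chronic

loads = [1200, 1400, 1500, 2200]  # AU; final week spikes
ratio = acwr(loads)
print(round(ratio, 2), 0.8 <= ratio <= 1.3)  # ratio, inside 'safe zone'?
```

A week-to-week change of the size the abstract reports (around 500 AU) can easily push this ratio outside the assumed 0.8-1.3 band, which is how 40% of observations fell outside the 'safe zone'.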
The aim of this study was to compare the physical and movement demands between training and match-play in schoolboy and academy adolescent rugby union (RU) players. Sixty-one adolescent male RU players (mean ± SD; age 17.0 ± 0.7 years) were recruited from four teams representing school and regional academy standards. Players were categorised into four groups based on playing standard and position: schoolboy forwards (n=15), schoolboy backs (n=15), academy forwards (n=16) and academy backs (n=15). Global positioning system and accelerometry measures were obtained from training and match-play to assess within-group differences between conditions. In total, data were analysed from 79 match files across 8 matches (1.3 ± 0.5 matches per participant) and 152 training files across 15 training sessions (2.5 ± 0.5 training sessions per participant). Schoolboy forwards were underprepared for low-intensity activities experienced during match-play, with schoolboy backs underprepared for all movement demands. Academy forwards were exposed to similar physical demands in training to matches, with academy backs similar to or exceeding values for all measured variables. Schoolboy players were underprepared for many key, position-specific aspects of match-play, which could place them at greater risk of injury and hinder performance, unlike academy players who were better prepared.
The aim was to compare the physical characteristics of under-18 academy and schoolboy rugby union competition by position (forwards and backs). Using a microsensor unit, match characteristics were recorded in 66 players. Locomotor characteristics were assessed by maximum sprint speed (MSS) and total, walking, jogging, striding and sprinting distances. The slow component (<2 m·s-1) of PlayerLoad™ (PLslow), which is the accumulated accelerations from the three axes of movement, was analysed as a measure of low-speed activity (e.g., rucking). A linear mixed-model was assessed with magnitude-based inferences. Academy forwards and backs almost certainly and very likely covered greater total distance than school forwards and backs. Academy players from both positions were also very likely to cover greater jogging distances. Academy backs were very likely to accumulate greater PLslow and the academy forwards a likely greater sprinting distance than school players in their respective positions. The MSS, total, walking and sprinting distances were greater in backs (likely-almost certainly), while forwards accumulated greater PLslow (almost certainly) and jogging distance (very likely). The results suggest that academy-standard rugby better prepares players to progress to senior competition compared to schoolboy rugby.
The purpose of this study was to determine the between-day reliability of commonly used strength measures in male youth athletes, while considering resistance training experience. Data were collected on 25 male athletes over two testing sessions, with 72 hours rest between, for the 3RM front squat, chin up and bench press. Subjects were initially categorized by resistance training experience (inexperienced: 6-12 months; experienced: >2 years). The assessment of the between-day reliability (coefficient of variation [CV%]) showed the front squat (experienced: 2.90%; inexperienced: 1.90%), chin up (experienced: 1.70%; inexperienced: 1.90%), and bench press (experienced: 4.50%; inexperienced: 2.40%) were all reliable measures of strength in both groups. Comparison between groups for the error of measurement for each exercise showed trivial differences. When both groups were combined, the CV% for the front squat, bench press, and chin up were 2.50%, 1.80%, and 3.70%, respectively. This study provides scientists and practitioners with the between-day reliability reference data to determine real and practical changes for strength in male youth athletes with different resistance training experience. Furthermore, this study demonstrates that 3RM front squat, chin up and bench press are reliable exercises to quantify strength in male youth athletes.
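Between-day reliability figures like these are commonly derived as a typical error expressed as a CV%. A hedged sketch of one standard formulation (the study's exact calculation is not specified here, and the numbers below are illustrative):

```python
# Hedged sketch of a between-day typical error expressed as a CV%,
# one common way test-retest reliability is quantified.
import math

def typical_error_cv(day1, day2):
    """CV%: SD of paired differences / sqrt(2), relative to the grand mean."""
    diffs = [b - a for a, b in zip(day1, day2)]
    n = len(diffs)
    mean_diff = sum(diffs) / n
    sd_diff = math.sqrt(sum((d - mean_diff) ** 2 for d in diffs) / (n - 1))
    typical_error = sd_diff / math.sqrt(2)
    grand_mean = (sum(day1) + sum(day2)) / (2 * n)
    return 100 * typical_error / grand_mean

day1 = [100.0, 110.0, 95.0, 120.0]   # e.g. 3RM loads (kg), session 1
day2 = [102.0, 108.0, 97.0, 121.0]   # same athletes, 72 h later
print(round(typical_error_cv(day1, day2), 2))
```

A practitioner can then treat any change larger than roughly the CV% of the test as a real change rather than day-to-day noise, which is the applied point the abstract makes.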
Repeated physical contact in rugby union is thought to contribute to post-match fatigue; however, no evidence exists on the effect of contact activity during field-based training on fatigue responses. Therefore, the purpose of this study was to examine the effect of contact during training on fatigue markers in rugby union players. Twenty academy rugby union players participated in the cross-over study. The magnitude of change in upper- and lower-body neuromuscular function (NMF), whole blood creatine kinase concentration [CK] and perception of well-being was assessed pre-training (baseline), immediately and 24 h post-training following contact and non-contact, field-based training. Training load was measured using mean heart rate, session rating of perceived exertion (sRPE) and microtechnology (Catapult Optimeye S5). The inclusion of contact during field-based training almost certainly increased mean heart rate (9.7; ±3.9%) and sRPE (42; ±29.2%) and resulted in likely and very likely greater decreases in upper-body NMF (-7.3; ±4.7% versus 2.7; ±5.9%) and perception of well-being (-8.0; ±4.8% versus -3.4; ±2.2%) 24 h post-training, respectively, and almost certainly greater elevations in [CK] (88.2; ±40.7% versus 3.7; ±8%). The exclusion of contact from field-based training almost certainly increased running intensity (19.8; ±5%) and distance (27.5; ±5.3%), resulting in possibly greater decreases in lower-body NMF (-5.6; ±5.2% versus 2.3; ±2.4%). Practitioners should be aware of the different demands and fatigue responses of contact and non-contact, field-based training and can use this information to appropriately schedule such training in the weekly microcycle.
The Incidence of Head Acceleration Events During Pitch‐Based Training and Match Play in Professional Men's Rugby League
This study aimed to describe the incidence of head acceleration events (HAEs) during pitch‐based in‐season training and matches in professional male rugby league. Data were recorded using instrumented mouthguards from 108 players (70 forwards and 38 backs) at nine Super League teams (2024 season), resulting in 468 player‐training sessions and 665 player‐matches included. Peak linear and angular acceleration were calculated from each HAE and analyzed using generalized linear mixed‐effects models. During the 468 player‐training sessions, 814 HAEs above the lowest magnitude threshold (5 g and 400 rad.s-2) were recorded.
The purpose of this study was to quantify the total energy expenditure (TEE) of international female rugby union players. Fifteen players were assessed over 14-days throughout an international multi-game tournament, which represented two consecutive one-match microcycles. Resting metabolic rate (RMR) and TEE were assessed by indirect calorimetry and doubly labelled water, respectively. Physical activity level (PAL) was estimated (TEE:RMR). Mean RMR, TEE, and PAL were 6.60 ± 0.93 MJ.day-1, 13.51 ± 2.28 MJ.day-1 and 2.0 ± 0.3 AU, respectively. There was no difference in TEE (13.74 ± 2.31 vs. 13.92 ± 2.10 MJ.day-1; p = 0.754), or PAL (2.06 ± 0.26 AU vs. 2.09 ± 0.23 AU; p = 0.735) across microcycles, despite substantial decreases in training load (total distance: -8088 m, collisions: -20 n, training duration: -252 min). After correcting for body composition, there was no difference in TEE (13.80 ± 1.74 vs. 13.16 ± 1.97 adj. MJ.day-1, p = 0.190), RMR (6.49 ± 0.81 vs. 6.73 ± 0.83 adj. MJ.day-1, p = 0.633) or PAL (2.15 ± 0.14 vs 1.87 ± 0.26 AU, p = 0.090) between forwards and backs. For an injured participant (n = 1), TEE reduced by 1.7 MJ.day-1 from pre-injury. For participants with illness (n = 3), TEE was similar to pre-illness (+0.49 MJ.day-1). The energy requirements of international female rugby players were consistent across one-match microcycles. Forwards and backs had similar adjusted energy requirements. These findings are critical to inform the dietary guidance provided to female rugby players.
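The physical activity level reported above is the simple ratio TEE:RMR. A quick arithmetic check using the group means from the abstract:

```python
# Hedged arithmetic check of the physical activity level (PAL) figure:
# PAL = total energy expenditure / resting metabolic rate.
def physical_activity_level(tee_mj_day: float, rmr_mj_day: float) -> float:
    """PAL (AU) from TEE and RMR, both in MJ/day."""
    return tee_mj_day / rmr_mj_day

pal = physical_activity_level(13.51, 6.60)  # group means from the abstract
print(round(pal, 1))
```

The ratio of the reported means reproduces the stated PAL of 2.0 AU.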
Background: Athlete exposure to contact could be a risk factor for injury. Governing bodies should provide guidelines preventing overexposure to contact. Objectives: Describe the current contact load practices and perceptions of contact load requirements within men’s and women’s rugby league to allow the Rugby Football League (RFL) to develop contact load guidelines. Methods: Participants (n=450 players, n=46 coaching staff, n=32 performance staff, n=23 medical staff) completed an online survey of 27 items, assessing the current contact load practices and perceptions within four categories: “current contact load practices” (n=12 items), “perceptions of required contact load” (n = 6 items), “monitoring of contact load” (n=3 items), and “the relationship between contact load and recovery” (n=6 items). Results: During men’s Super League pre-season, full contact and controlled contact training was typically undertaken for 15-30 minutes per week, and wrestling training for 15-45 minutes per week. During the in-season, these three training types were all typically undertaken for 15-30 minutes per week. In women’s Super League, all training modalities were undertaken for up to 30 minutes per week in the pre- and in-season periods. Both men’s and women’s Super League players and staff perceived 15-30 minutes of full contact training per week was enough to prepare players for the physical demands of rugby league, but a higher duration may be required to prepare for the technical contact demands. Conclusion: Men’s and women’s Super League clubs currently undertake more contact training during pre-season than in-season, which was planned by coaches and is deemed adequate to prepare players for the demands of rugby league. This study provides data to develop contact load guidelines to improve player welfare whilst not impacting performance.
Roe, G, Shaw, W, Darrall-Jones, J, Phibbs, PJ, Read, D, Weakley, JJ, Till, K, and Jones, B. Reliability and validity of a medicine ball-contained accelerometer for measuring upper-body neuromuscular performance. J Strength Cond Res 32(7): 1915-1918, 2018. The aim of the study was to assess the between-day reliability and validity of a medicine ball-contained accelerometer (MBA) for assessing upper-body neuromuscular performance during a throwing task. Ten professional rugby union players partook in the study. Between-day reliability was assessed from the best score attained during 2 sets of 3 throws, on 2 testing occasions separated by 7 days. Validity was assessed against a criterion measure (optoelectronic system) during 75 throws from a subgroup of 3 participants. The MBA exhibited a small between-day error of 2.2% (90% confidence intervals; 2.0-4.6%) and an almost perfect relationship with a criterion measure (r = 0.91 [90% CIs; 0.87-0.94]). However, the mean bias and standard error were moderate (7.9% [90% CIs; 6.6-9.2%] and 4.9% [90% CIs; 4.2-5.7%], respectively). Practitioners using an MBA to assess neuromuscular performance of the upper body must take into account the overestimation and error associated with such assessment with respect to a criterion measure. However, as the error associated with between-day testing was small and testing is easy to implement in applied practice, an MBA may provide a useful tool for monitoring upper-body neuromuscular performance over time.
Background: In England, rugby union is a popular sport and is widely played within schools. Despite the large participation numbers, the movement and physical demands of the sport and how they progress by age have not been explored. Method: Ninety-six male rugby union players wore microtechnology devices during six rugby union matches within the education pathway to investigate the movement and physical demands of match-play. To quantify the positional differences and progression by age, data were obtained for participants at the under 16 (U16) (n=31 participants), under 18 (U18) (n=34 participants) and university (n=31 participants) levels. Players were further divided into forwards and backs. Data were analysed using magnitude-based inferences. Results: For the movement demands, U16 total distance and ‘striding’ was likely higher for forwards than backs, whereas at U18, unclear differences were observed and for university players the inverse was observed (very likely). In all age groups sprint distance was likely to very likely greater for backs than forwards. Forwards had greater physical demands than backs at all age groups. For consecutive age groups, U16 had a likely higher relative distance than U18, and U18 had a likely lower relative distance than university players. Physical demands were similar across age groups for forwards, and greater for backs at older age groups. Conclusion: The movement and physical demands of rugby union players participating in schools (U16 and U18) may not be as expected; however, the findings from university players show a similar pattern to the senior game.
The implementation of long-term athletic development (LTAD) aims to improve the health, physical activity and performance of all youth. Contemporary LTAD models suggest that a broad range of physical and psycho-social competencies should be developed in youth, but few resources are available for coaches that describe ‘how’ to achieve these outcomes. This paper provides an overview of a coaching session framework called RAMPAGE (Raise, Activate, Mobilise, Prepare, Activity, Games, Evaluate). The framework provides practitioners with information on what can be planned and delivered, and when, within a coaching session, across multiple ages and stages of development and within multiple contexts (e.g., physical education, talent development).
Background: Growing evidence highlights that elite rugby union players experience poor sleep quality and quantity, which can be detrimental to performance. Objectives: This study aimed to i) compare objective sleep measures of rugby union players between age categories over a one-week period, and ii) compare self-reported measures of sleep to wristwatch actigraphy as the criterion. Methods: Two hundred and fifty-three nights of sleep were recorded from 38 players representing four different age groups (i.e. under 16, under 18, senior academy, elite senior) in a professional rugby union club in the United Kingdom (UK). Linear mixed models and magnitude-based decisions were used for analysis. Results: The analysis of sleep schedules showed that U16 players went to bed and woke up later than their older counterparts (small differences). In general, players obtained seven hours of sleep per night, with trivial or unclear differences between age groups. The validity analysis highlighted a large relationship between objective and subjective sleep measures for bedtime (r = 0.56 [0.48 to 0.63]) and get-up time (r = 0.70 [0.63 to 0.75]). A large standardised typical error (1.50 [1.23 to 1.88]) was observed for total sleep time. Conclusion: This study highlights that differences exist in sleep schedules between rugby union players in different age categories, which should be considered when planning training. Additionally, self-reported measures overestimated sleep parameters. Coaches should consider these results to optimise the sleep habits of their players and should interpret self-reported sleep measures with caution.
Longitudinal changes in anthropometric, physiological, and physical qualities of international women’s rugby league players
Purpose: Collision sports are characterised by frequent high-intensity collisions that induce substantial muscle damage, potentially increasing the energetic cost of recovery. Therefore, this study investigated the energetic cost of collision-based activity for the first time across any sport. Methods: Using a randomised crossover design, six professional young male rugby league players completed two different five-day pre-season training microcycles. Players completed either a collision (COLL; 20 competitive one-on-one collisions) or non-collision (nCOLL; matched for kinematic demands, excluding collisions) training session on the first day of each microcycle, exactly seven days apart. All remaining training sessions were matched and did not involve any collision-based activity. Total energy expenditure was measured using doubly labelled water, the gold-standard method. Results: Collisions resulted in a very likely higher (4.96 ± 0.97 MJ; ES = 0.30 ±0.07; p=0.0021) total energy expenditure across the five-day COLL training microcycle (95.07 ± 16.66 MJ) compared with the nCOLL training microcycle (90.34 ± 16.97 MJ). The COLL training session also resulted in a very likely higher (200 ± 102 AU; ES = 1.43 ±0.74; p=0.007) session rating of perceived exertion and a very likely greater (-14.6 ± 3.3%; ES = -1.60 ±0.51; p=0.002) decrease in wellbeing 24h later. Conclusions: A single collision training session considerably increased total energy expenditure. This may explain the large energy expenditures of collision sport athletes, which appear to exceed kinematic training and match demands. These findings suggest fuelling professional collision-sport athletes appropriately for the "muscle damage caused" alongside the kinematic "work required". Key words: Nutrition, Recovery, Contact, Rugby
Participation in women’s rugby league has been growing since the foundation of the English women’s rugby league Super League in 2017. However, the evidence base to inform women’s rugby league remains sparse. This study provides the largest quantification of anthropometric and physical qualities of women’s rugby league players to date, identifying differences between positions (forwards & backs) and playing level (Women’s Super League [WSL] vs. International). The height, weight, body composition, lower body strength, jump height, speed and aerobic capacity of 207 players were quantified during the pre-season period. Linear mixed models and effect sizes were used to determine differences between positions and levels. Forwards were significantly (p < 0.05) heavier (forwards: 82.5 ± 14.8kg; backs: 67.7 ± 9.2kg) and had a greater body fat percentage (forwards: 37.7 ± 6.9%; backs: 30.4 ± 6.3%) than backs. Backs had significantly greater lower body power, measured via jump height (forwards: 23.5 ± 4.4cm; backs: 27.6 ± 4.9cm), were faster over 10m (forwards: 2.12 ± 0.14s; backs: 1.98 ± 0.11s), 20m (forwards: 3.71 ± 0.27s; backs: 3.46 ± 0.20s), 30m (forwards: 5.29 ± 0.41s; backs: 4.90 ± 0.33s) and 40m (forwards: 6.91 ± 0.61s; backs: 6.33 ± 0.46s), and had greater aerobic capacity (forwards: 453.4 ± 258.8m; backs: 665.0 ± 298.2m) than forwards. Additionally, international players were found to have greater anthropometric and physical qualities in comparison to their WSL counterparts. This study adds to the limited evidence base surrounding the anthropometric and physical qualities of elite women’s rugby league players. Comparative values for anthropometric and physical qualities are provided, which practitioners may use to evaluate the strengths and weaknesses of players, informing training programs to prepare players for the demands of women’s rugby league.
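Positional comparisons like the one above are typically reported as standardised effect sizes alongside group means and SDs. As a rough illustration only, Cohen's d can be recovered from the summary statistics reported in the abstract; this sketch assumes equal group sizes (the per-group n is not reported), and is a simpler stand-in for the study's linear mixed models:

```python
import math

def cohens_d(mean1, sd1, mean2, sd2):
    """Cohen's d using a simple pooled SD (equal group sizes assumed)."""
    pooled_sd = math.sqrt((sd1 ** 2 + sd2 ** 2) / 2)
    return (mean1 - mean2) / pooled_sd

# Body mass (kg) from the abstract: forwards 82.5 +/- 14.8, backs 67.7 +/- 9.2
d_mass = cohens_d(82.5, 14.8, 67.7, 9.2)
print(f"Cohen's d for body mass: {d_mass:.2f}")  # ~1.2, conventionally 'large'
```

The same calculation can be applied to any of the reported pairs (jump height, sprint times, aerobic capacity) to gauge the magnitude of positional differences.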
Rugby union (RU) is a skill-collision team sport played at junior and senior levels worldwide. Within England, age-grade rugby governs the participation and talent development of youth players. The RU player development pathway has recently been questioned, regarding player performance and wellbeing, which sport science research can address. The purpose of this review was to summarise and critically appraise the literature in relation to the applied sport science of male age-grade RU players in England, focusing upon 1) match-play characteristics, 2) training exposures, 3) physical qualities, 4) fatigue and recovery, 5) nutrition, 6) psychological challenges and development, and 7) injury. Current research evidence suggests that age, playing level and position influence the match-play characteristics of age-grade RU. Training exposures of players are described as ‘organised chaos’ due to the multiple environments and stakeholders involved in coordinating training schedules. Fatigue is apparent up to 72 hours post match-play. Well-developed physical qualities are important for player development and injury risk reduction. The nutritional requirements are high due to the energetic costs of collisions. Concerns around psychological characteristics (e.g., perfectionism) have also been identified. Injury risk is an important consideration, with prevention strategies available. This review highlights the important multi-disciplinary aspects of sport science for developing age-grade RU players for continued participation and player development. The review describes where some current practices may not be optimal, provides a framework to assist practitioners to effectively prepare age-grade players for the holistic demands of youth RU, and considers areas for future research.
Weakley, JJS, Till, K, Read, DB, Leduc, C, Roe, GAB, Phibbs, PJ, Darrall-Jones, J, and Jones, B. Jump training in rugby union players: barbell or hexagonal bar? J Strength Cond Res XX(X): 000-000, 2018-The countermovement jump (CMJ) is an exercise that can develop athletic performance. Jumping with a conventional barbell (BAR) or hexagonal barbell (HEX) allows the intensity to be increased. However, the bar that provides greater adaptations is unknown. Therefore, this study aimed to assess changes in loaded and unloaded CMJ with either a BAR or HEX across a 4-week mesocycle in rugby union players. Twenty-nine subjects were strength-matched and randomized into 2 groups. Subjects completed 3 sets of CMJ at 20% of 1 repetition maximum back squat, 3 times per week for 4 weeks, using either a BAR or HEX. Subjects completed an unloaded CMJ on a force plate before and after the intervention, whereas the highest peak concentric velocity during the jump squat was recorded in the first and last training sessions using a linear position transducer. Magnitude-based inferences assessed meaningful changes within- and between-groups. Possibly greater improvements in unloaded CMJ were found in the HEX group in jump height (effect size ± 90% confidence intervals: 0.27 ± 0.27), relative peak (0.21 ± 0.23), and mean power (0.32 ± 0.36). In addition, likely to very likely greater improvements were observed in the HEX group in peak velocity (0.33 ± 0.27), relative mean power (0.53 ± 0.30), mean force (0.47 ± 0.27), and 100-ms impulse (0.60 ± 0.48). Similar raw changes in jump squat peak velocity occurred (0.20-0.25 m·s-1), despite the likely greater ES occurring with the BAR (0.32 ± 0.26). These results indicate that training with the HEX leads to superior unloaded CMJ adaptations. In addition, practitioners should use either the HEX or BAR when aiming to enhance loaded jump ability.
© 2017 Thomas Sawczuk, Ben Jones, Sean Scantlebury, Jonathan Weakley, Dale Read, Nessan Costello, Joshua David Darrall-Jones, Keith Stokes, and Kevin Till. This study aimed to evaluate the between-day reliability and usefulness of a fitness testing battery in a group of youth sport athletes. Fifty-nine youth sport athletes (age = 17.3 ± 0.7 years) undertook a fitness testing battery including the isometric mid-thigh pull, counter-movement jump, 5–40 m sprint splits, and the 5-0-5 change of direction test on two occasions separated by 7 days. Usefulness was assessed by comparing the reliability (typical error) to the smallest worthwhile change. The typical error was 5.5% for isometric mid-thigh pull and 3.8% for counter-movement jump. The typical error values were 2.7, 2.5, 2.2, 2.2, and 1.8% for the 5, 10, 20, 30, and 40 m sprint splits, and 4.1% (left) and 5.4% (right) for the 5-0-5 tests. The smallest worthwhile change ranged from 1.1 to 6.1%. All tests were identified as having “good” or “acceptable” reliability. The isometric mid-thigh pull and counter-movement jump had “good” usefulness; all other tests had “marginal” usefulness.
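The usefulness rating above compares each test's typical error (TE) against the smallest worthwhile change (SWC). A minimal sketch of those two statistics, using hypothetical jump-height data rather than the study's values:

```python
import math

def typical_error(day1, day2):
    """TE = standard deviation of the difference scores / sqrt(2)."""
    diffs = [b - a for a, b in zip(day1, day2)]
    mean_d = sum(diffs) / len(diffs)
    sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (len(diffs) - 1))
    return sd_d / math.sqrt(2)

def smallest_worthwhile_change(scores):
    """SWC = 0.2 x between-subject SD (a 'small' standardised effect)."""
    mean_s = sum(scores) / len(scores)
    sd = math.sqrt(sum((s - mean_s) ** 2 for s in scores) / (len(scores) - 1))
    return 0.2 * sd

def usefulness(te, swc):
    """Rate a test: TE below SWC is 'good' (in practice a tolerance is used)."""
    if te < swc:
        return "good"
    if te == swc:
        return "acceptable"
    return "marginal"

# Hypothetical countermovement-jump heights (cm) for 8 athletes on two days
day1 = [32.1, 35.4, 30.2, 38.8, 33.5, 36.0, 31.7, 34.9]
day2 = [32.8, 34.9, 30.9, 39.5, 33.1, 36.6, 32.3, 35.2]

te = typical_error(day1, day2)
swc = smallest_worthwhile_change(day1)
print(f"TE = {te:.2f} cm, SWC = {swc:.2f} cm, usefulness: {usefulness(te, swc)}")
```

A test whose TE is smaller than the SWC can detect meaningful changes in an individual athlete; one whose TE exceeds the SWC cannot distinguish real change from measurement noise.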
Limited research has compared the physical qualities of adolescent rugby union (RU) players across differing playing standards. This study therefore compared the physical qualities of academy and school Under-18 RU players. One hundred and eighty-four (professional regional academy, n = 55; school, n = 129) male RU players underwent a physical testing battery to quantify height, body mass, strength (bench press and pull-up), speed (10, 20 and 40 m), 10 m momentum (calculated as 10 m velocity × body mass) and a proxy measure of aerobic fitness (Yo-Yo Intermittent Recovery Test Level 1; IRTL1). The practical significance of differences between playing levels was assessed using magnitude-based inferences. Academy players were taller (very likely small), heavier (likely moderate) and stronger (bench press possibly large; pull-up plus body mass likely small) than school players. Academy players were faster than school players over 20 and 40 m (possibly and likely small), although differences in 10 m speed were not apparent (possibly trivial). Academy players displayed greater 10 m momentum (likely moderate) and greater IRTL1 performance (likely small) than school players. These findings suggest that body size, strength, running momentum, 40 m speed and aerobic fitness contribute to a higher playing standard in adolescent rugby union.
Background: Body composition and bone health are important for netball from a performance and health perspective (e.g., bone stress injury), given the typical characteristics of players and demands of the game. Objectives: The objectives of this study are to quantify and compare the positional group-specific body composition and site-specific bone health outcomes of netball players and to establish within-season changes in these variables. Methods: Forty-seven female netball players (senior: n=23, under-21: n=24) from one Netball Super League (NSL) franchise participated across three seasons (2021-2023). Dual-energy X-ray absorptiometry (DEXA) scans were conducted four times per season. Total body, anteroposterior lumbar spine and total hip scans were performed. General and generalised linear mixed models were used to compare positional groups and age groups, and to investigate within-season changes. Results: Goal circle netball players had greater total mass and bone mass than midcourt netball players at both levels (p<0.05, effect size: moderate to very large), but not when scaled for height. Senior players had greater lean mass, bone mass, total bone mineral density and bone mineral content than under-21 players (p<0.05, effect size: moderate to very large). No group-level significant changes were observed across a playing season, but individual trends varied. Conclusion: These findings highlight the importance of continued physical development in the under-21 squad before progressing to a senior squad, as well as the need for individualised approaches to nutritional and training interventions that support physical development, addressing positional requirements and developmental stages. Future research should explore longitudinal body composition trajectories across career phases and multiple teams to refine normative benchmarks.
Accurately determining total energy expenditure enables the precise manipulation of energy balance within professional collision-based sports. Therefore, this study investigated the ability of isolated or combined wearable technology to determine the total energy expenditure of professional young rugby league players across a typical pre-season and in-season period. Total energy expenditure was measured via doubly labelled water, the criterion method, across a fourteen-day pre-season (n=6) and seven-day in-season (n=7) period. Practical measures of total energy expenditure included SenseWear Pro3 Armbands in isolation and combined with metabolic power derived from microtechnology units. SenseWear Pro3 Armbands significantly under-reported pre-season (5.00 (2.52) MJ.day-1; p = 0.002) and in-season (2.86 (1.15) MJ.day-1; p < 0.001) total energy expenditure, demonstrating a large and extremely large standardised mean bias, and a very large and large typical error, respectively. Combining metabolic power with SenseWear Pro3 Armbands almost certainly improved pre-season (0.95 (0.15) MJ.day-1; ES = 0.32 ±0.04; p < 0.001) and in-season (1.01 (0.15) MJ.day-1; ES = 0.88 ±1.05; p < 0.001) assessment. However, SenseWear Pro3 Armbands combined with metabolic power continued to significantly under-report pre-season (4.04 (2.38) MJ.day-1; p = 0.004) and in-season (2.18 (0.96) MJ.day-1; p = 0.002) expenditure, demonstrating a large and very large standardised mean bias, and a very large and large typical error, respectively. These findings demonstrate the limitations of utilising isolated or combined wearable technology to accurately determine the total energy expenditure of professional collision-based sport athletes across different stages of the season.
The importance of contributors that can result in negative player outcomes in sport, and the feasibility of and barriers to modifying these to optimise player health and well-being, have yet to be established. Within the rugby codes (rugby league, rugby union and rugby sevens), within male and female cohorts and across playing levels (full-time senior, part-time senior, age grade), this project aims to develop a consensus on contributors to negative biopsychosocial outcomes in rugby players (known as the CoNBO study) and establish stakeholder-perceived importance of the identified contributors and barriers to their management. This project will consist of three parts. Part 1: a systematic review; part 2: a three-round expert Delphi study; and part 3: stakeholder rating of feasibility and barriers to management. Within part 1, systematic searches of electronic databases (PubMed, Scopus, MEDLINE, SPORTDiscus, CINAHL) will be performed. The systematic review protocol is registered with PROSPERO. Studies will be searched to identify physical, psychological and/or social factors resulting in negative player outcomes in rugby. Part 2 will consist of a three-round expert Delphi consensus study to establish additional physical, psychological and/or social factors that result in negative player outcomes in rugby and their importance. In part 3, stakeholders (eg, coaches, chief executive officers and players) will provide perceptions of the feasibility and barriers to modifying the identified factors within their setting. On completion, several manuscripts will be submitted for publication in peer-reviewed journals. The findings of this project have worldwide relevance for stakeholders in the rugby codes. PROSPERO registration number CRD42022346751.
Objective Within women’s rugby league (n=12 teams), we (1) identified modifiers for head-to-head contacts informed by sport partners (eg, players, coaches, match officials); (2) compared head-to-head contact and concussion rates to the previous two seasons following a one-season tackle technique coaching intervention and (3) explored barriers and enablers of the intervention. Methods A multi-method design was used. Part 1: Mitigation strategies were identified by sport partners reviewing footage of head-to-head contacts, informing the development of a coach-targeted tackle technique intervention. Part 2 evaluated the intervention, comparing head-to-head contact and concussion incidence rates (IRs). Interviews with coaches and players (n=6) explored barriers and enablers to effective implementation and compliance with the intervention. Results Sport partners reported tacklers were more responsible for head-to-head contacts and lowering the tackle height was the most frequently suggested mitigation strategy preintervention and postintervention. Head-to-head contact rates were significantly lower during the intervention than preintervention (IR 59; 95% CI 56 to 62 vs IR 28; 95% CI 25 to 30/1000 tackle events); however, concussion rates showed no difference. Perceived barriers to the intervention included underdeveloped physical and technical foundations of players, lack of knowledge and understanding of the intervention and its purpose, and the environmental context and lack of resources in women’s rugby league. Beliefs about the consequences of the tackle and concussion were perceived as barriers and enablers. Conclusions Head-to-head contact rates were significantly lower; however, concussion rates did not decrease following a tackle technique coaching intervention. Reduced head-to-head contacts are potentially due to an increased focus on head injury reduction and increased player/coach awareness and support.
The Young Rugby Player
The authors contributing to this book are world leading in their respective fields, ranging from academics researching rugby performance to practitioners delivering this information within the professional game.
This is the first study to assess longitudinal changes in anthropometric, physiological, and physical qualities of international women’s rugby league players. Thirteen forwards and 11 backs were tested three times over a 10-month period. Assessments included: standing height and body mass, body composition measured by dual x-ray absorptiometry (DXA), a blood panel, resting metabolic rate (RMR) assessed by indirect calorimetry, aerobic capacity (i.e., VO2max) evaluated by an incremental treadmill test, and isometric force production measured by a force plate. During the pre-season phase, lean mass increased significantly by ~2% for backs (testing point 1: 47 kg; testing point 2: 48 kg) and forwards (testing point 1: 50 kg; testing point 2: 51 kg) (p ≤ 0.05). Backs significantly increased their VO2max by 22% from testing point 1 (40 ml kg-1 min-1) to testing point 3 (49 ml kg-1 min-1) (p ≤ 0.04). The VO2max of forwards increased by 10% from testing point 1 (41 ml kg-1 min-1) to testing point 3 (45 ml kg-1 min-1); however, this change was not significant (p ≥ 0.05). Body mass (values represent the range of means across the three testing points) (backs: 68 kg; forwards: 77–78 kg), fat mass percentage (backs: 25–26%; forwards: 30–31%), resting metabolic rate (backs: 7 MJ day-1; forwards: 7 MJ day-1), isometric mid-thigh pull (backs: 2106–2180 N; forwards: 2155–2241 N), isometric bench press (backs: 799–822 N; forwards: 999–1024 N), isometric prone row (backs: 625–628 N; forwards: 667–678 N) and bloods (backs: ferritin 21–29 ug/L, haemoglobin 137–140 g/L, iron 17–21 umol/L, transferrin 3 g/L, transferrin saturation 23–28%; forwards: ferritin 31–33 ug/L, haemoglobin 141–145 g/L, iron 20–23 umol/L, transferrin 3 g/L, transferrin saturation 26–31%) did not change (p ≥ 0.05). This study provides novel longitudinal data which can be used to better prepare women rugby league players for the unique demands of their sport, underpinning female athlete health.
To establish the criterion-assessed energy and fluid requirements of female netball players, 13 adult players from a senior Netball Super League squad were assessed over 14 days in a cross-sectional design, representing a two- and one-match microcycle, respectively. Total energy expenditure (TEE) and water turnover (WT) were measured by doubly labeled water. Resting and activity energy expenditure were measured by indirect calorimetry and Actiheart, respectively. Mean 14-day TEE was 13.46 ± 1.20 MJ day−1 (95% CI, 12.63–14.39 MJ day−1). Resting energy expenditure was 6.53 ± 0.60 MJ day−1 (95% CI, 6.17–6.89 MJ day−1). Physical activity level was 2.07 ± 0.19 arbitrary units (AU) (95% CI, 1.95–2.18 AU). Mean WT was 4.1 ± 0.9 L day−1 (95% CI, 3.6–4.7 L day−1). Match days led to significantly greater TEE than training (+2.85 ± 0.70 MJ day−1; 95% CI, +1.00– +4.70 MJ day−1; p = 0.002) and rest (+4.85 ± 0.70 MJ day−1; 95% CI, +3.13–+6.56 MJ day−1; p < 0.001) days. Matches led to significantly greater energy expenditure (+1.85 ± 1.27 MJ; 95% CI, +0.95–+2.76 MJ day−1; p = 0.001) than court-based training sessions. There was no significant difference in TEE (+0.03 ± 0.35 MJ day−1; 95% CI, −0.74–+0.80 MJ day−1; p = 0.936) across weeks. Calibrated Actiheart 5 monitors underestimated TEE (−1.92 ± 1.21 MJ day−1). Energy and fluid turnover were greatest on match days, followed by training and rest days, with no difference across weeks. This study provides criterion-assessed energy and fluid requirements to inform dietary guidance for female netball players.
Determining key performance indicators and classifying players accurately between competitive levels is one of the classification challenges in sports analytics. A recent study applied the Random Forest algorithm to identify important variables for classifying rugby league players into academy and senior levels, achieving 82.0% and 67.5% accuracy for backs and forwards, respectively. However, the classification accuracy could be improved due to limitations in the existing method. Therefore, this study aimed to introduce and implement a feature selection technique to identify key performance indicators in rugby league positional groups and assess the performances of six classification algorithms. Fifteen and fourteen of 157 performance indicators were identified as key performance indicators for backs and forwards, respectively, by the correlation-based feature selection method, with seven common indicators between the positional groups. Classification results show that models developed using the key performance indicators performed better for both positional groups than models developed using all performance indicators. 5-Nearest Neighbour produced the best classification accuracy for backs and forwards (accuracy = 85% and 77%), higher than the previous method’s accuracies. When analysing classification questions in sport science, researchers are encouraged to evaluate multiple classification algorithms, and a feature selection method should be considered for identifying key variables.
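As a toy illustration of the pipeline described above (filter-style feature selection followed by k-nearest-neighbour classification), the sketch below ranks features by absolute Pearson correlation with the class label and keeps the top k. Note this is a simpler stand-in for the study's correlation-based feature selection (CFS), and all player data below are invented, not the study's 157 indicators:

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

def top_k_features(X, y, k):
    """Indices of the k features most correlated (in absolute value) with the label."""
    scores = [abs(pearson([row[j] for row in X], y)) for j in range(len(X[0]))]
    return sorted(range(len(scores)), key=lambda j: scores[j], reverse=True)[:k]

def knn_predict(X_train, y_train, x, k=5):
    """Majority vote among the k nearest training points (Euclidean distance)."""
    dists = sorted((math.dist(row, x), label) for row, label in zip(X_train, y_train))
    votes = [label for _, label in dists[:k]]
    return max(set(votes), key=votes.count)

# Hypothetical players: [sprint score, strength score, noise]; label 0 = academy, 1 = senior
X = [[6.2, 100, 3], [6.0, 110, 7], [5.8, 120, 2],
     [5.1, 150, 9], [5.0, 160, 1], [4.9, 155, 4]]
y = [0, 0, 0, 1, 1, 1]

keep = top_k_features(X, y, 2)                    # drops the noise column
X_sel = [[row[j] for j in keep] for row in X]
print(knn_predict(X_sel, y, X_sel[0], k=3))       # classifies the first player
```

In practice one would evaluate such a pipeline with cross-validation and compare several classifiers, as the study did across its six algorithms.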
Activities (1)
Monitoring and evaluating practice in elite rugby
Current teaching
Delivers lectures on the Sport and Exercise Science undergraduate and postgraduate taught provisions
Module leader: The Physiology of Sports Conditioning (MSc Strength and Conditioning)
PhD and Postgraduate Research Supervision
Teaching Activities (7)
Illness incidence, prevalence and prevention experiences in rugby
01 January 2021 - 28 March 2023
Lead supervisor
Identifying Athlete Movement Signatures Using Wearable Technology from Spatiotemporal Data
01 February 2019
Joint supervisor
Identification of head injuries and head impacts in Elite Men's Rugby League
01 October 2021
Joint supervisor
Pending
01 October 2025 - 30 September 2029
Joint supervisor
Concussion and tackle technique in women's rugby league
01 October 2021 - 31 March 2026
Joint supervisor
Player Profiling in Youth Rugby League: Implications for Talent Identification and Development
01 February 2022
Joint supervisor
Quantifying Head Acceleration Exposure in Elite Rugby using Instrumented Mouthguards
01 February 2022
Advisor
Featured Research Projects
News & Blog Posts
Leeds Beckett University extends partnership with Rugby Football League for three years
- 15 Sep 2025
LBU's groundbreaking research influences tackle law changes to protect rugby league players from head injuries
- 08 Dec 2023
Instrumented mouthguards to be deployed throughout Rugby League
- 06 Jul 2022
LBU support England Women Rugby League on their road to the World Cup
- 11 Mar 2022
Rugby Football League launches extensive research project with LBU
- 07 Jan 2022
LBU supports Kevin Sinfield OBE with epic Extra Mile Challenge
- 19 Nov 2021
England Wheelchair Rugby League undertake performance testing at Leeds Beckett University
- 11 Oct 2021
Building strong relationships with professional and international rugby teams
- 12 May 2021
University supports Leeds Rhinos legend’s epic marathon challenge bid
- 01 Dec 2020
Developing the Healthy Young Rugby Player
- 30 Mar 2020
"Director of Research & Knowledge Exchange","profileimage": "/-/media/images/staff/professor-susan-backhouse.jpg","profilelink": "/staff/professor-susan-backhouse/","department": "Carnegie School of Sport","numberofpublications": "151","numberofcollaborations": "9"},{"id": "29040","name": "Liam Colbert","jobtitle": "Research Assistant/Project Officer","profileimage": "/-/media/images/staff/default.jpg","profilelink": "none","department": "Carnegie School of Sport","numberofpublications": "1","numberofcollaborations": "1"},{"id": "24698","name": "Lara Wilson","jobtitle": "Research Assistant","profileimage": "/-/media/images/staff/default.jpg","profilelink": "/staff/lara-wilson/","department": "Carnegie School of Sport","numberofpublications": "3","numberofcollaborations": "3"},{"id": "1909","name": "Professor James McKenna","jobtitle": "Professor","profileimage": "/-/media/images/staff/professor-james-mckenna.jpg","profilelink": "/staff/professor-james-mckenna/","department": "Carnegie School of Sport","numberofpublications": "418","numberofcollaborations": "5"},{"id": "21351","name": "Dr Gareth Jowett","jobtitle": "Senior Lecturer","profileimage": "/-/media/images/staff/lbu-approved/css/gareth-jowett.jpg","profilelink": "/staff/dr-gareth-jowett/","department": "Carnegie School of Sport","numberofpublications": "38","numberofcollaborations": "1"},{"id": "27450","name": "Professor Mark Gilthorpe","jobtitle": "Professor","profileimage": "/-/media/images/staff/professor-mark-gilthorpe.jpg","profilelink": "/staff/professor-mark-gilthorpe/","department": "Carnegie School of Sport","numberofpublications": "297","numberofcollaborations": "4"},{"id": "24750","name": "Anthony Moore","jobtitle": "Postgraduate researcher","profileimage": "https://www.leedsbeckett.ac.uk","profilelink": "https://www.leedsbeckett.ac.uk/pgr-students/anthony-moore/","department": "Carnegie School of Sport","numberofpublications": "2","numberofcollaborations": "1"},{"id": "16532","name": "Professor 
David Morley","jobtitle": "Consulting Professor","profileimage": "/-/media/images/staff/professor-david-morley.jpg","profilelink": "/staff/professor-david-morley/","department": "Carnegie School of Sport","numberofpublications": "131","numberofcollaborations": "4"},{"id": "22663","name": "Ben Nicholson","jobtitle": "Senior Lecturer","profileimage": "/-/media/images/staff/ben-nicholson.jpg","profilelink": "/staff/ben-nicholson/","department": "Carnegie School of Sport","numberofpublications": "10","numberofcollaborations": "8"},{"id": "10053","name": "Dr Alex Dinsdale","jobtitle": "Senior Lecturer","profileimage": "/-/media/images/staff/dr-alex-dinsdale.png","profilelink": "/staff/dr-alex-dinsdale/","department": "Carnegie School of Sport","numberofpublications": "20","numberofcollaborations": "7"},{"id": "24748","name": "Dr Thomas Hughes","jobtitle": "Post Doctoral Research Fellow","profileimage": "/-/media/images/staff/default.jpg","profilelink": "none","department": "Carnegie School of Sport","numberofpublications": "2","numberofcollaborations": "1"},{"id": "28108","name": "Dr Anna Stodter","jobtitle": "Senior Lecturer","profileimage": "/-/media/images/staff/dr-anna-stodter.jpg","profilelink": "/staff/dr-anna-stodter/","department": "Carnegie School of Sport","numberofpublications": "36","numberofcollaborations": "3"},{"id": "7149","name": "Louise Sutton","jobtitle": "Head of Subject","profileimage": "/-/media/images/staff/louise-sutton.jpg","profilelink": "/staff/louise-sutton/","department": "Carnegie School of Sport","numberofpublications": "24","numberofcollaborations": "9"},{"id": "6163","name": "Dr Amy Brightmore","jobtitle": "Senior Lecturer","profileimage": "/-/media/images/staff/dr-amy-brightmore.jpg","profilelink": "/staff/dr-amy-brightmore/","department": "Carnegie School of Sport","numberofpublications": "25","numberofcollaborations": "12"},{"id": "17144","name": "Dr Jamie Matu","jobtitle": "Reader","profileimage": 
"/-/media/images/staff/dr-jamie-matu.png","profilelink": "/staff/dr-jamie-matu/","department": "School of Health","numberofpublications": "83","numberofcollaborations": "5"},{"id": "19523","name": "Dr Alex Griffiths","jobtitle": "Senior Lecturer","profileimage": "/-/media/images/staff/dr-alex-griffiths.png","profilelink": "/staff/dr-alex-griffiths/","department": "School of Health","numberofpublications": "34","numberofcollaborations": "3"},{"id": "941","name": "Dr Gareth Nicholson","jobtitle": "Course Director","profileimage": "/-/media/images/staff/dr-gareth-nicholson.jpg","profilelink": "/staff/dr-gareth-nicholson/","department": "Carnegie School of Sport","numberofpublications": "65","numberofcollaborations": "8"},{"id": "16938","name": "Dr Alexander Bond","jobtitle": "Reader","profileimage": "/-/media/images/staff/dr-alexander-bond.jpg","profilelink": "/staff/dr-alexander-bond/","department": "Carnegie School of Sport","numberofpublications": "47","numberofcollaborations": "1"},{"id": "14175","name": "Dr Andrew Drake","jobtitle": "Athletics Talent Hub Manager","profileimage": "/-/media/images/staff/default.jpg","profilelink": "/staff/dr-andrew-drake/","department": "Beckett Sport - Sports Coaching","numberofpublications": "23","numberofcollaborations": "1"},{"id": "21363","name": "Mike Hopkinson","jobtitle": "Senior Lecturer","profileimage": "/-/media/images/staff/mike-hopkinson.png","profilelink": "/staff/mike-hopkinson/","department": "Carnegie School of Sport","numberofpublications": "7","numberofcollaborations": "5"},{"id": "26921","name": "Lois Mackay","jobtitle": "Postdoctoral Research Fellow","profileimage": "/-/media/images/staff/default.jpg","profilelink": "/staff/lois-mackay/","department": "Carnegie School of Sport","numberofpublications": "7","numberofcollaborations": "6"},{"id": "27744","name": "Thomas Briscoe","jobtitle": "Postgraduate researcher","profileimage": "https://www.leedsbeckett.ac.uk","profilelink": 
"https://www.leedsbeckett.ac.uk/pgr-students/thomas-briscoe/","department": "Carnegie School of Sport","numberofpublications": "1","numberofcollaborations": "1"},{"id": "27401","name": "Sam Wild","jobtitle": "KTP Associate - Applied Performance & Health Practitioner","profileimage": "/-/media/images/staff/default.jpg","profilelink": "none","department": "Knowledge Exchange","numberofpublications": "4","numberofcollaborations": "3"},{"id": "27401","name": "Samuel Wild","jobtitle": "Postgraduate researcher","profileimage": "https://www.leedsbeckett.ac.uk","profilelink": "https://www.leedsbeckett.ac.uk/pgr-students/samuel-wild/","department": "Carnegie School of Sport","numberofpublications": "4","numberofcollaborations": "3"},{"id": "120","name": "Costas Tsakirides","jobtitle": "Senior Lecturer","profileimage": "/-/media/images/staff/costas-tsakirides.jpg","profilelink": "/staff/costas-tsakirides/","department": "Carnegie School of Sport","numberofpublications": "29","numberofcollaborations": "1"},{"id": "5725","name": "Dr Matthew Barlow","jobtitle": "Senior Lecturer","profileimage": "/-/media/images/staff/dr-matthew-barlow.png","profilelink": "/staff/dr-matthew-barlow/","department": "Carnegie School of Sport","numberofpublications": "70","numberofcollaborations": "12"},{"id": "5385","name": "Peter Mackreth","jobtitle": "Dean of School","profileimage": "/-/media/images/staff/lbu-approved/css/peter-mackreth.jpg","profilelink": "/staff/peter-mackreth/","department": "Carnegie School of Sport","numberofpublications": "23","numberofcollaborations": "6"},{"id": "28321","name": "Molly Fownes-Walpole","jobtitle": "Research Assistant/Project Officer","profileimage": "/-/media/images/staff/default.jpg","profilelink": "none","department": "Carnegie School of Sport","numberofpublications": "1","numberofcollaborations": "1"},{"id": "25760","name": "Marina Alexander","jobtitle": "Consultant - Radiographer","profileimage": "/-/media/images/staff/default.jpg","profilelink": 
"none","department": "Carnegie School of Sport","numberofpublications": "4","numberofcollaborations": "3"},{"id": "25693","name": "Stephanie Roe","jobtitle": "Research Assistant/Project Officer","profileimage": "/-/media/images/staff/stephanie-roe.jpg?la=en","profilelink": "/staff/stephanie-roe/","department": "Carnegie School of Sport","numberofpublications": "6","numberofcollaborations": "4"},{"id": "19644","name": "Benjamin Samuels","jobtitle": "Postgraduate researcher","profileimage": "https://www.leedsbeckett.ac.uk","profilelink": "https://www.leedsbeckett.ac.uk/pgr-students/benjamin-samuels/","department": "Carnegie School of Sport","numberofpublications": "1","numberofcollaborations": "1"},{"id": "10998","name": "Neil Holmes","jobtitle": "Senior Lecturer","profileimage": "/-/media/images/staff/neil-holmes.png","profilelink": "/staff/neil-holmes/","department": "Carnegie School of Sport","numberofpublications": "2","numberofcollaborations": "1"},{"id": "21041","name": "Dr Tom Mitchell","jobtitle": "Senior Lecturer","profileimage": "/-/media/images/staff/dr-tom-mitchell.jpg","profilelink": "/staff/dr-tom-mitchell/","department": "Carnegie School of Sport","numberofpublications": "70","numberofcollaborations": "1"},{"id": "21346","name": "Dr Ian Cowburn","jobtitle": "Senior Lecturer","profileimage": "/-/media/images/staff/dr-ian-cowburn.jpg","profilelink": "/staff/dr-ian-cowburn/","department": "Carnegie School of Sport","numberofpublications": "71","numberofcollaborations": "1"},{"id": "20069","name": "Dr Emily Williams","jobtitle": "Course Director","profileimage": "/-/media/images/staff/dr-emily-williams.jpg","profilelink": "/staff/dr-emily-williams/","department": "Carnegie School of Sport","numberofpublications": "32","numberofcollaborations": "1"},{"id": "22293","name": "Parag Parelkar","jobtitle": "Senior Learning Support Officer","profileimage": "/-/media/images/staff/parag-parelkar.png","profilelink": "none","department": "Carnegie School of 
Sport","numberofpublications": "7","numberofcollaborations": "1"},{"id": "21230","name": "Dr Antonis Stavropoulos-Kalinoglou","jobtitle": "Reader","profileimage": "/-/media/images/staff/dr-antonis-stavropoulos-kalinoglou.jpg","profilelink": "/staff/dr-antonis-stavropoulos-kalinoglou/","department": "Carnegie School of Sport","numberofpublications": "91","numberofcollaborations": "1"},{"id": "19085","name": "Dr Oliver Wilson","jobtitle": "Senior Lecturer","profileimage": "/-/media/images/staff/dr-oliver-wilson.png","profilelink": "/staff/dr-oliver-wilson/","department": "Carnegie School of Sport","numberofpublications": "31","numberofcollaborations": "1"}],"links": [{"source": "2781","target": "14388"},{"source": "2781","target": "23395"},{"source": "2781","target": "20863"},{"source": "2781","target": "27402"},{"source": "2781","target": "20329"},{"source": "2781","target": "23427"},{"source": "2781","target": "27892"},{"source": "2781","target": "19301"},{"source": "2781","target": "25771"},{"source": "2781","target": "2945"},{"source": "2781","target": "29066"},{"source": "2781","target": "16981"},{"source": "2781","target": "22664"},{"source": "2781","target": "20332"},{"source": "2781","target": "6995"},{"source": "2781","target": "19506"},{"source": "2781","target": "18478"},{"source": "2781","target": "2279"},{"source": "2781","target": "2479"},{"source": "2781","target": "26917"},{"source": "2781","target": "24300"},{"source": "2781","target": "25790"},{"source": "2781","target": "23421"},{"source": "2781","target": "20327"},{"source": "2781","target": "24298"},{"source": "2781","target": "18200"},{"source": "2781","target": "3805"},{"source": "2781","target": "7247"},{"source": "2781","target": "10606"},{"source": "2781","target": "5777"},{"source": "2781","target": "3446"},{"source": "2781","target": "12931"},{"source": "2781","target": "18026"},{"source": "2781","target": "12041"},{"source": "2781","target": "3604"},{"source": "2781","target": 
"29040"},{"source": "2781","target": "24698"},{"source": "2781","target": "1909"},{"source": "2781","target": "21351"},{"source": "2781","target": "27450"},{"source": "2781","target": "24750"},{"source": "2781","target": "16532"},{"source": "2781","target": "22663"},{"source": "2781","target": "10053"},{"source": "2781","target": "24748"},{"source": "2781","target": "28108"},{"source": "2781","target": "7149"},{"source": "2781","target": "6163"},{"source": "2781","target": "17144"},{"source": "2781","target": "19523"},{"source": "2781","target": "941"},{"source": "2781","target": "16938"},{"source": "2781","target": "14175"},{"source": "2781","target": "21363"},{"source": "2781","target": "26921"},{"source": "2781","target": "27744"},{"source": "2781","target": "27401"},{"source": "2781","target": "27401"},{"source": "2781","target": "120"},{"source": "2781","target": "5725"},{"source": "2781","target": "5385"},{"source": "2781","target": "28321"},{"source": "2781","target": "25760"},{"source": "2781","target": "25693"},{"source": "2781","target": "19644"},{"source": "2781","target": "10998"},{"source": "2781","target": "21041"},{"source": "2781","target": "21346"},{"source": "2781","target": "20069"},{"source": "2781","target": "22293"},{"source": "2781","target": "21230"},{"source": "2781","target": "19085"}]}