
Numerous studies have found links between sleep quality and/or quantity and injury risk in athletes of various ages and competitive levels. This has encouraged the development of novel sleep technologies and popularized sleep monitoring among sports teams. However, one limitation of traditional research is its focus on reporting mean data from groups of subjects. While mean data are certainly useful for generalizing results to a broad population, individual responses tend to get lost in the mix. For coaches and sports medicine practitioners, it is the outliers and individual responses that are most meaningful and interesting, because in high-performance sport there is considerable inter-individual variability in how athletes respond and adapt to training. For this reason, case-study research can be very useful for informing practice.

A new case study published ahead of print in the Journal of Strength and Conditioning Research explored the association between markers of sleep quality, both the night before and during the week leading up to an injury, in an elite UEFA soccer player (a 7-year starter). Sleep was monitored nightly throughout 4 months of a competitive season via wrist actigraphy. In addition, the athlete rated perceived sleep quality and regularly reported potential causes of disturbed sleep to coaching staff. The athlete sustained three separate injuries during the observation period: a groin injury on day 12 (sidelined for 13 days), a hamstring strain on day 26 (sidelined for 10 days) and an ankle sprain on day 115 (sidelined for 29 days).

The results showed that sleep onset latency (i.e., how long it took the athlete to fall asleep) both the night before injury and during the week leading up to the injury was significantly longer than baseline values obtained during the preseason (effect sizes = 3.1 and 1.6, respectively). Additionally, sleep efficiency (i.e., the ratio of total time spent asleep to total time in bed) both the night before injury and during the week leading up to the injury was significantly lower than baseline values (effect sizes = 3.2 and 2.8, respectively). The athlete slept worse following night games due to late meals and post-match interviews. Other contributors to poor sleep were feelings of anxiousness about the next match and general soreness. The authors conclude that sleep quality was substantially worse than the athlete's baseline sleep performance prior to injury occurrence. Thus, monitoring sleep quality and intervening when it deteriorates may be a useful strategy for reducing injury risk in athletes.
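
Both metrics are simple to compute from actigraphy output. As a rough sketch with invented numbers (not the study's data), sleep efficiency and a single-night effect size against a preseason baseline could be calculated like this:

```python
import statistics

def sleep_efficiency(minutes_asleep, minutes_in_bed):
    # Sleep efficiency: proportion of time in bed actually spent asleep (%)
    return 100 * minutes_asleep / minutes_in_bed

def effect_size_vs_baseline(value, baseline):
    # Cohen's d of one observation against a baseline sample:
    # (value - baseline mean) / baseline standard deviation
    return (value - statistics.mean(baseline)) / statistics.stdev(baseline)

# Hypothetical preseason sleep-onset latencies (minutes per night)
baseline_latency = [12, 15, 10, 14, 11, 13, 16]

print(sleep_efficiency(420, 480))                      # 7 h asleep of 8 h in bed
print(effect_size_vs_baseline(35, baseline_latency))   # a markedly delayed night
```

A coach could track these nightly and treat a large positive latency effect size (or a drop in efficiency) as a prompt to ask the athlete about sleep disturbances.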

Reference:

Nedelec, M. et al. Case study: sleep and injury in elite soccer. A mixed method approach. Journal of Strength and Conditioning Research. In press.

Cold water immersion is one of the most common modalities used by sports teams to enhance athletes' recovery from training and competition. Following intensive training or a competition, athletes reluctantly submerge their lower body in ice-cold water. Athletes often report that this reduces pain and soreness following training. Its impact on performance recovery is less clear, however, as the research tends to be conflicting. Of note, it has been shown that chronic use of cold water immersion following resistance training can impair strength and hypertrophy gains. It is thought that blunting the inflammatory response through cold water immersion may be responsible for the attenuated size and strength gains. Needless to say, more research is needed for coaches to understand when and how to implement cold water immersion with athletes.

A new study published ahead of print in the Journal of Strength and Conditioning Research evaluated the effects of cold water immersion on subjective and objective markers of recovery. A club rugby team comprising 16 male athletes (~20 years old) was split into a cold water immersion group and a control group (n = 8 per group). All subjects performed a simulated rugby match to induce fatigue and muscle damage. Before, immediately after, and at 24 and 48 hours post-training, countermovement jump, maximal voluntary isometric contraction, serum creatine kinase and perceived muscle soreness were evaluated. The cold water immersion protocol involved submerging the lower limbs in 10 degree Celsius water for 2 rounds of 5 minutes with 2.5 minutes of rest between rounds. The control condition involved seated passive rest for 15 minutes while drinking a sports drink.

The results showed that at 24 and 48 hours post-training, perceived soreness levels were lower for the immersion group compared with the control group. Creatine kinase levels progressively increased across time for the control group but did not substantially increase for the immersion group. Countermovement jump performance decreased similarly in both groups post-training but demonstrated better recovery to baseline at 24 and 48 hours in the immersion group. Maximal voluntary isometric contraction followed a similar trend as countermovement jump performance. The authors conclude that repeated cold water immersion following competitions may alleviate the effects of training-induced muscle damage and speed up performance recovery.

Reference:

Barber, S. et al. The Efficacy of Repeated Cold Water Immersion on Recovery Following a Simulated Rugby Union Protocol. Journal of Strength and Conditioning Research. In press.

Programming resistance training during the in-season competitive phase can be quite challenging, because coaches need to ensure that athletes receive a sufficient training stimulus to maintain strength, power and muscle mass without compromising recovery and performance. Thus, picking the right days of the week for resistance training sessions so that athletes recover sufficiently before the next match is an important consideration. Post-exercise recovery is multifaceted: cardiovascular, neuromuscular, perceptual and biochemical variables are all affected by training and likely recover at varying time-points. Qualities such as maximal force production and rate of force development are of particular importance to team-sport athletes due to their strong associations with sprinting and jumping performance.

A new study published ahead of print in the Journal of Science and Medicine in Sport monitored markers of fatigue and recovery for 48 hours following a full-body resistance training session in trained men. Eight subjects with >2 years of routine resistance training experience performed an intensive resistance training session modeled on typical in-season training routines of contact-based team sports. The session included various squat variations, power cleans, presses, rows and pull-ups. Loads ranged between 1-2x body weight for most exercises. Before training and at 1, 24 and 48 hours post-training, subjects rated their perceived levels of fatigue, soreness, mood, stress and sleep quality on a 1-5 point scale. Leg soreness was also evaluated seated at rest and following a body-weight squat. At the same time-points (plus immediately post-training), maximal torque, rate of torque development and thigh muscle activation were evaluated.

The results showed that perceptual measures of fatigue and soreness did not return to baseline until 48 hours post-training. Maximal torque and muscle contractility recovered to baseline at 24 hours post-exercise. Interestingly, rate of torque development and muscle activation did not significantly change throughout the observation period. This study demonstrated a disconnect in the time-course of recovery between perceptual and neuromuscular measures following resistance training: despite subjects reporting high levels of fatigue and soreness at 24 hours post-training, neuromuscular performance was not different from pre-exercise levels.

Reference:

Marshall, P. W., Cross, R., & Haynes, M. (2017). The fatigue of a full body resistance exercise session in trained men. Journal of Science and Medicine in Sport. In press.

Post-activation potentiation techniques are commonly used during training sessions to transiently enhance sprinting or jumping performance. The barbell back squat, deadlift and Olympic lifts tend to be the most commonly used potentiating exercises. Each of these movements predominantly involve vertical force production which would be optimal for vertical jumping and peak velocity sprinting. However, for improving acceleration ability, a potentiating exercise that involves horizontal force production (i.e., the barbell hip-thrust) may be an effective alternative to squat and deadlift variations. Acceleration ability may be more important than peak running velocity in team sport athletes who rarely achieve maximum, uninterrupted running speeds. However, research investigating the potentiating effects of the barbell hip-thrust on acceleration performance is limited.

A new study published ahead of print in the Journal of Sports Sciences evaluated the acute effects of barbell hip-thrusting on subsequent short-distance sprint performance. Eighteen male team-sport athletes performed two different experimental protocols of hip-thrusting followed by electronically timed sprints in a randomized-crossover design. The first protocol involved sets of hip-thrusts loaded with 50% of 1RM; the second involved sets loaded with 85% of 1RM. Fifteen-meter sprint times were assessed at baseline and at 15 seconds, 4 minutes and 8 minutes following each protocol. The researchers wanted to determine how moderate versus heavy loads impact acceleration ability compared with baseline performance and between conditions.

At 15 seconds post-hip-thrust with 85% of 1RM, sprint times were significantly higher (i.e., performance was impaired). Both protocols resulted in significantly faster sprint times at 4 minutes and 8 minutes post-hip-thrust. Comparing the two protocols against each other, sprint times were faster at 4 minutes and 8 minutes following the 85% of 1RM protocol. Individual 1RM barbell hip-thrust values were positively associated with individual post-activation potentiation sprint responses, meaning that athletes with greater hip-thrust strength saw the biggest changes in sprint performance following the experimental protocol. The authors conclude that heavy and moderate barbell hip-thrust loads can positively affect acute sprint performance, but that the effect depends on the recovery time between the protocol and sprinting (i.e., 4-8 minutes of rest is required) as well as individual strength level.

Reference:

Dello Iacono, A. et al. Loaded hip thrust-based PAP protocol effects on acceleration and sprint performance of handball players. Journal of Sports Sciences. Ahead of print.

Commitment to a structured and progressive strength and conditioning program can have a tremendous impact on an athlete's performance potential. However, unless coaches keep records of changes in both performance markers and markers of strength and power, it is difficult to determine the magnitude of these improvements and whether they're related. For example, maximal lower body strength is often correlated with vertical jumping ability and sprinting speed in cross-sectional studies. Comparatively less research has demonstrated a relationship between longitudinal improvements in lower body strength or power and improvements in athletic performance tasks. Collegiate athletes are exposed to mandatory strength and conditioning training throughout their college playing careers, so evaluating yearly changes in performance in this population can provide insight into the effects of chronic strength and power training on performance.

A new study published ahead of print in the Journal of Strength and Conditioning Research evaluated yearly changes in strength, power and performance among an NCAA Division-1 women's volleyball team. A team of 29 volleyball players was grouped according to how many years of strength and conditioning training they had been exposed to with the team: 0-1 years, 1-2 years and 2-3 years. The variables monitored across time included static and countermovement jump heights (unloaded and with 11 and 20 kg of resistance) and peak force production from an isometric mid-thigh pull on a force plate. The researchers determined the magnitude of change in each variable for each group to establish the improvement observed after ~1, ~2 and ~3 years of training relative to initial baseline testing when athletes first joined the team.

The results showed that static and countermovement jump variables demonstrated average improvements of 5.6-12.3% following 1 year of training, 6.6-12.7% following 2 years of training and 16.6-30.6% after 3 years of training. Peak force production from the isometric mid-thigh pull demonstrated an average improvement of 20.5% after 1 year of training, 22.2% after 2 years of training and 31.5% after 3 years of training. The authors conclude that a combination of traditional resistance training and Olympic weightlifting (alongside volleyball training) can stimulate long-term improvements in maximal strength and jumping ability in collegiate volleyball players. This was observed despite no additional plyometric training outside of volleyball practice.
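
Coaches replicating this kind of yearly audit only need baseline and follow-up test values. A minimal sketch with hypothetical numbers (not the study's raw data):

```python
def pct_improvement(baseline, current):
    # Percent change relative to the athlete's initial baseline test
    return 100 * (current - baseline) / baseline

# Hypothetical countermovement jump heights (cm): intake vs. after 3 years
print(round(pct_improvement(40.0, 49.2), 1))  # 23.0 (% improvement)
```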

Reference:

Kavanaugh, A.A., et al. Long-term changes in jump performance and maximum strength in a cohort of NCAA Division I women's volleyball athletes. Journal of Strength and Conditioning Research. In press.

Maximum acceleration ability and sprinting speed are key variables that separate elite from sub-elite athletes. The ability to outrun an opponent is required in virtually all team sports, can be a major determinant of match outcome, and is thus an important focus of training. Being able to predict and monitor sprint and acceleration ability is of great use to coaches, because sprint testing is time-consuming and fatigue-inducing; frequent maximal sprint testing throughout a competitive season would be undesirable. Establishing predictors of running performance is therefore an active area of research among sport scientists, who aim to determine which tests can serve as indicators of sprint performance. While this topic has received considerable attention among youth and amateur athletes, limited research has been conducted with elite-level athletes.

A new study published ahead of print in the Journal of Strength and Conditioning Research sought to identify the best predictors of sprint performance among elite sprinters and jumpers. A sample of 19 Olympic- or international-level track and field athletes (~24 years old; 12 men, 7 women) performed an electronically timed 60 m sprint with splits at every 10 m increment. In addition, subjects underwent a series of tests that were used as predictor variables for acceleration and maximal sprinting speed. These included squat and countermovement jumps, drop jumps from 45 and 75 cm, mean and peak power from the jump squat and half squat exercises, and contraction velocity of the thigh muscles via tensiomyography. Correlations were performed to determine which predictor variables were strongly associated with sprint times.

The results showed that both squat jump height and countermovement jump height were similarly and very strongly related to sprinting speed at each split (r values ranged from -0.76 to -0.88). Drop jump height from both 45 and 75 cm was also strongly predictive of sprint performance at each split (r values ranged from -0.75 to -0.85), while contact time and reactive strength index demonstrated weaker associations with sprint speed. While peak and mean propulsive velocities (expressed relative to body mass) for the jump squat and half squat were all significantly related to sprint time, mean propulsive velocity from the jump squat demonstrated the strongest associations (r values ranged from -0.73 to -0.86). Tensiomyographic parameters were not significant predictors of sprint speed. These results suggest that jump height and relative strength both appear to be important predictors of sprint performance. Future research is needed to determine whether changes in these parameters (improvements or decrements) relate to changes in sprint performance.

Reference:

Loturco, I. et al. Predictive factors of elite sprint performance: influences of muscle mechanical properties and functional parameters. Journal of Strength and Conditioning Research. In press.

Preseason training for American college football players involves roughly 4 weeks of intense training, typically in hot and humid conditions, with heat stress exacerbated by protective equipment requirements. This high concentration of training in extreme conditions puts players at risk of injury and/or excessive fatigue, and it is no surprise that injury rates are highest during the preseason relative to other phases. Therefore, coaches are interested in managing fatigue and recovery in football players to optimize training adaptations and minimize injury risk. One of the simplest athlete-monitoring tools available is the self-reported wellness measure. Provided that athletes are honest when filling out the questionnaires, their ratings of fatigue and recovery status may be useful for evaluating how individuals are adapting to training.

A new study published ahead of print in the Journal of Strength and Conditioning Research looked at the effects of previous day training demands on perceptual responses among a team of NCAA division 1 football players during preseason training. Movement demands were quantified by GPS devices with integrated tri-axial accelerometers which players wore between their shoulder blades for each practice session. Wellness questionnaires that had athletes rate their perceived levels of fatigue, soreness, mood, stress, sleep quality and sleep quantity on a 1-5 point scale were completed daily before training. The researchers sought to determine if training outputs affect subsequent day recovery status ratings.

The results showed that athletes who accumulated the greatest training demands during practice (e.g., total distance covered, high-intensity accelerations and decelerations, etc.) reported greater levels of perceived soreness and fatigue and poorer sleep quantity compared with athletes who experienced lower workloads. Stress, mood and sleep quality demonstrated less variation among groups. Overall, the wellness questionnaires demonstrated sensitivity to training load, indicating that they may be a useful tool for monitoring training effects in football players. Coaches can use wellness questionnaire data to assess how individuals are coping with training and make appropriate interventions for fatigued athletes.
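
One way such questionnaires could be operationalized (the study does not prescribe a specific decision rule; the threshold below is an illustrative assumption) is to flag an athlete whose daily composite wellness score drops well below their own historical average:

```python
import statistics

def flag_fatigued(history, today, z_threshold=-1.0):
    # Flag when today's composite wellness score falls more than
    # |z_threshold| SDs below the athlete's own historical mean.
    # The -1.0 SD threshold is an illustrative choice, not the study's method.
    z = (today - statistics.mean(history)) / statistics.stdev(history)
    return z < z_threshold

# Hypothetical daily composites (six 1-5 ratings summed, max 30)
history = [24, 25, 23, 26, 24, 25, 23]
print(flag_fatigued(history, 18))  # well below baseline -> consider intervening
```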

Reference:

Wellman AD. et al. Movement demands and perceived wellness associated with pre-season training camp in NCAA Division I college football players. Journal of Strength and Conditioning Research. In press.

Team sports are largely skill-based. All else being equal, the more highly skilled team will typically prevail, which is why coaches are so adamant about technical development during training. Many would even argue that strength and conditioning training may be of little value to highly skilled athletes, since they demonstrate success without participating in rigorous training. However, what these coaches may be overlooking is how improvements in strength and fitness may enhance skill expression, or at the very least enable the athlete to express their skill for longer before fatigue sets in. Muscle strength and sport skill are not necessarily independent of each other, as ample research has shown relationships between strength and physical qualities such as running and jumping. However, further research is needed to evaluate how strength qualities relate to sport-specific tasks that occur during match-play and are not captured by standard performance testing.

A new study published ahead of print in the Science and Medicine in Football journal evaluated relationships between match-play tackle outcomes (i.e., successful or unsuccessful, knocking the player down, etc.), tackling ability and physical qualities in a men’s rugby team. A total of 15 rugby matches were assessed with tackle characteristics and outcomes from over 2300 tackles from 16 players recorded and coded for analysis. In addition, physical qualities (i.e., strength and power) as well as a standardized assessment of tackling ability were performed. Relationships were then quantified among the variables to determine what technical and physical aspects were related with successful tackle outcomes during competitive match-play.

The results showed that three technical characteristics were associated with successful tackle outcomes during match-play: a medium body position (flexed knees and hips), utilizing a shoulder or "smother" tackle, and producing leg drive upon contact. Tackles made from a front-on position with a contact zone at the chest region reduced the odds of a missed tackle. Interestingly, lower body maximal strength was significantly associated with players' ability to exhibit a medium body position as well as to knock the ball carrier onto their back. This study nicely demonstrates how physical qualities such as strength can contribute, directly or indirectly, to technical performance during match-play.

Reference:

Speranza, M. J., Gabbett, T. J., Greene, D. A., Johnston, R. D., & Townshend, A. D. (2017). Tackle characteristics and outcomes in match-play rugby league: the relationship with tackle ability and physical qualities. Science and Medicine in Football, 1-7.

Some of the top minds in the field of strength and conditioning believe that the key to athletic performance comes down to how much force an athlete can put into the ground and how quickly that force can be applied. While this may be an oversimplification, it is hard to dispute that peak force and rate of force development are integral characteristics affecting sprinting, jumping and rapid changes of direction, each of which is a key element of team-sport play. One of the best ways to assess lower body force production along with rate of force development is the isometric mid-thigh pull, performed on a force plate. The isometric mid-thigh pull has been related to athletic performance variables such as sprinting and jumping in rugby players; however, how this marker relates to performance metrics among college basketball players remains to be determined.

A new study published ahead of print in the Journal of Strength and Conditioning Research evaluated relationships between isometric mid-thigh pull-derived variables and various athletic performance tests among a group of NCAA Division 1 basketball players. Eight male and 15 female college basketball players were tested for peak force production and rate of force development at 50, 100, 150, 200 and 250 ms. In addition, all players performed a 20 m sprint test (with 5 and 10 m splits) while power, velocity and force were evaluated via a tethered device (1080 Sprint). One-repetition maximum front squat and hang clean, as well as vertical jump height and agility performance, were also evaluated. Bivariate correlations were used to quantify relationships among the variables.

The results showed that isometric mid-thigh pull peak force was significantly related to 1RM front squat (r = 0.71), hang clean (r = 0.89), vertical jump height (r = 0.81) and agility (r = -0.66). In addition, peak force was significantly related to sprint time for all split intervals (r values ranged from -0.62 to -0.69), average and peak sprint velocity (r values ranged from 0.50 to 0.70), and average sprint force and average sprint power (r values ranged from 0.48 to 0.73). Interestingly, rate of force development between 50-250 ms was significantly associated with average sprint force and power over the first 5 m. The authors conclude that the initial acceleration kinetics of a sprint (~5 m) are strongly affected by rate of force development and that practitioners may use the isometric mid-thigh pull test to assess and monitor performance in collegiate basketball players.

Reference:

Townsend, J. R., Bender, D., Vantrease, W., Hudy, J., Huet, K., Williamson, C., … & Mangine, G. T. (2017). Isometric Mid-Thigh Pull Performance Is Associated With Athletic Performance And Sprinting Kinetics In Division I Men And Women's Basketball Players. The Journal of Strength & Conditioning Research.

Older studies in animal models suggested that extreme static stretching elicited substantial increases in muscle hypertrophy. Trainees have since been trying to find different ways to apply static stretching techniques for the purposes of building more lean tissue or improving resistance training performance. The inter-set rest period has been an area of focus for implementing various stretching protocols. However, it is well established today that excessive stretching of the agonist muscles between sets can result in substantial decrements in performance, such as reduced force output and fewer repetitions performed. For example, static stretching of the pectoral muscles between sets of the bench press will decrease power and endurance on the subsequent set. However, it remains unclear whether static stretching of the antagonist muscle group offers any benefits.

A recent study published in Research in Sports Medicine evaluated the performance effects of antagonistic muscle group static stretching between sets of a resistance training exercise. A group of 10 male subjects with resistance training experience repeated two workouts in a randomized order. Both workouts consisted of 3 sets of 10RM (repetitions to failure) of the wide-grip seated row with 2 minutes rest between sets. For the experimental condition, the subjects underwent 40 seconds of pectoralis major static stretching (antagonist muscles during the row) directly before each set. The outcome variables that were recorded included the number of repetitions performed per set and surface electromyography (muscle activation) of the latissimus dorsi, biceps brachii and pectoralis major muscles.

The results showed that for the control condition (no stretching between sets), the number of repetitions performed dropped by ~7% between sets 1-2 and by ~13% between sets 2-3. This was significantly different from the stretching condition, which only saw decrements of ~4.5% and ~9.4% between sets 1-2 and 2-3, respectively. EMG analysis showed that agonist muscle activation was significantly greater in the stretching condition than in the control condition, while antagonist muscle activation (pectoralis major) did not differ between conditions. This study demonstrates that 40 seconds of antagonist static stretching between sets offsets the decline in repetition performance and increases agonist muscle activity, which may translate into superior training adaptations, such as size and strength gains, over a longitudinal training program.

Reference:

Miranda, H., Maia, M. D. F., Paz, G. A., & Costa, P. B. (2015). Acute effects of antagonist static stretching in the inter-set rest period on repetition performance and muscle activation. Research in Sports Medicine, 23(1), 37-50.