The Digital Athlete, from the NFL and Amazon Web Services, aims to prevent injuries.


The NFL’s collaboration with AWS has helped reduce concussions and create position-specific helmets. (Courtesy of the NFL)

The NFL’s most important football player of the future will do anything without complaining. Replay the same game over and over again? Absolutely. Wear a new helmet or new cleats? Happy to. Adapt to a change of venue, weather or even rules? No problem.

It’s the Digital Athlete, a core part of the NFL’s health and safety initiative, developed in conjunction with cloud partner Amazon Web Services. It is a composite, by position group, of an NFL player. Some of it is already functional, some still aspirational. All of it is cutting-edge.

“It’s really a computer simulation model of a football player that can be used to replicate scenarios in the game environment,” said Jennifer Langton, senior vice president of health and innovation at the NFL. “So whether it’s variations based on position or environmental factors, you can change those factors in a simulation model and see what the impact might be, which is really, really new.”
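To make the idea of that kind of scenario modeling concrete, here is a minimal sketch in Python: vary one environmental factor, rerun many simulated plays, and compare a simple exposure metric. The positions, surfaces, coefficients and the severity score itself are invented for illustration; this is not the NFL’s or AWS’s actual model.

```python
# Toy scenario simulation: change a factor, rerun many plays, compare a metric.
# All factors and coefficients below are hypothetical placeholders.
import random
from statistics import mean

def simulate_play(position: str, surface: str) -> float:
    """Return a hypothetical head-impact severity score for one simulated play."""
    base = {"lineman": 1.0, "receiver": 0.6, "quarterback": 0.4}[position]
    surface_factor = {"grass": 1.0, "turf": 1.1}[surface]  # assumed effect size
    return random.gauss(mu=base * surface_factor, sigma=0.2)

def run_scenario(position: str, surface: str, plays: int = 10_000) -> float:
    """Average severity over many simulated plays for one scenario."""
    return mean(simulate_play(position, surface) for _ in range(plays))

if __name__ == "__main__":
    for surface in ("grass", "turf"):
        print(surface, round(run_scenario("lineman", surface), 3))
```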

The NFL has been collaborating with AWS on its Next Gen Stats since 2017 and is halfway through an expanded three-year collaboration targeting player safety initiatives. The work is part of the larger $60 million commitment the league has made to its engineering roadmap. Among the milestones so far: a 24% reduction in concussions over the past three seasons (2018-20) compared to the previous three; an increase from 40% to 99% in the share of players wearing the best-performing helmets for safety; and the creation of the first position-specific helmet. All of these data points – and more – helped inform the league’s decision to expand from 16 regular-season games to 17 starting with the 2021 campaign.

The Digital Athlete relies on a wide range of inputs that generate nearly three terabytes of data per week: video review, equipment analytics, mouthguard sensors, game and practice performance data, and more. This abundance of sensor data, coupled with video, helps engineers capture the full range of NFL player experience and use it to create a digital twin – not an identical Tom Brady or Derrick Henry, but a virtual placeholder replicating the attributes of a player at each position.
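As a toy illustration of how per-player measurements could be rolled up into a position-group composite, the sketch below simply averages a couple of made-up attributes by position. The field names, values and the use of a plain mean are assumptions; the real pipeline fuses video, equipment, sensor and tracking data at terabyte scale.

```python
# Build a position-level composite ("digital twin" placeholder) from per-player
# records. Records and fields are invented for illustration.
from collections import defaultdict
from statistics import mean

players = [
    {"position": "QB", "top_speed_mph": 18.2, "mass_kg": 102},
    {"position": "QB", "top_speed_mph": 19.1, "mass_kg": 98},
    {"position": "WR", "top_speed_mph": 21.4, "mass_kg": 91},
]

def composite_by_position(records):
    """Average each numeric attribute within a position group."""
    grouped = defaultdict(list)
    for rec in records:
        grouped[rec["position"]].append(rec)
    return {
        pos: {
            "top_speed_mph": mean(r["top_speed_mph"] for r in group),
            "mass_kg": mean(r["mass_kg"] for r in group),
        }
        for pos, group in grouped.items()
    }

print(composite_by_position(players))
```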

“The idea is to create a representation of NFL athletes that we can put into scenarios, and we’re not risking anyone,” said Sam Huddleston, chief data scientist at Biocore, the company led by NFL Engineering Committee chairman Jeff Crandall.

Prior to joining Biocore, Huddleston spent two decades working for the Department of Defense in various Army, Navy and NATO positions.

“We have this phrase that we used to use in DoD that started in counter-IED – we’re still working ‘left of the boom,’” Huddleston said. “That’s how I try to approach these injuries: we have to stay ‘left of the boom.’ So if we know something is at risk and we have the time, let’s create the time to step in and get rid of the injuries.”

The league had been compiling its NFL Injury Surveillance System (NFLISS) data for decades, but progress in improving player safety was slow because much of the work was still done manually: frame-by-frame game video review, injury data extraction and player identification. The league was only able to perform such detailed analysis on about 50 games per year.

Four years ago, the NFL held a road show to meet technology partners that could amplify the work at speed and scale by creating a data lake with a multitude of inputs, ranging from NFLISS records to Next Gen Stats player tracking data and head impacts recorded by its mouthguard sensor pilot program.

After a pair of crowdsourced competitions to develop efficient algorithms, the same frame-by-frame examination was then performed using the new computer vision techniques. The program processed every game from the past two seasons during the recent Christmas break.
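The detection models themselves are not public, but the downstream logic of such a frame-by-frame check can be sketched: given helmet bounding boxes per frame (stubbed here), flag pairs whose boxes overlap beyond a threshold as candidate impacts. Everything below is illustrative, not the league’s actual code.

```python
# Flag candidate helmet contacts by measuring bounding-box overlap per frame.
# Box coordinates here are stub data standing in for model detections.
from itertools import combinations

Box = tuple[float, float, float, float]  # x1, y1, x2, y2

def iou(a: Box, b: Box) -> float:
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def flag_contacts(helmets_by_frame: dict[int, list[Box]], threshold: float = 0.3):
    """Yield (frame, box_a, box_b) for helmet pairs that overlap strongly."""
    for frame, boxes in helmets_by_frame.items():
        for a, b in combinations(boxes, 2):
            if iou(a, b) >= threshold:
                yield frame, a, b

frames = {0: [(10, 10, 20, 20), (40, 40, 50, 50)],
          1: [(10, 10, 20, 20), (14, 12, 24, 22)]}  # stub detections
print(list(flag_contacts(frames)))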

“The Digital Athlete was part of the NFL’s vision when it came to AWS,” said Priya Ponnapalli, senior manager of the Amazon Machine Learning Solutions Lab.

The Digital Athlete is more of a catch-all term than a specific software application. “To understand athlete injuries, you need to have a complete history of what happened to the athlete before, during, and even to some degree after,” Huddleston said. “And, so, we’ve built this program around the ability to take every sensor that collects information about what’s going on in the game and stitch it together into an athlete-centric view.”
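A hedged sketch of that “stitching” step might look like the following: several hypothetical sensor streams joined into one athlete-centric timeline keyed by player ID and timestamp. The stream names, fields and IDs are invented; only the join idea carries over.

```python
# Merge multiple time-stamped feeds into one per-athlete view.
# Stream contents below are made-up examples.
from collections import defaultdict

tracking = [{"player_id": 17, "t": 3.2, "speed_mps": 7.1}]
mouthguard = [{"player_id": 17, "t": 3.2, "peak_accel_g": 18.4}]
equipment = [{"player_id": 17, "helmet_model": "Model-X"}]  # static per game

def athlete_view(player_id, *streams):
    """Collect every record about one player, grouped by timestamp."""
    events = defaultdict(dict)
    for stream in streams:
        for rec in stream:
            if rec.get("player_id") == player_id:
                key = rec.get("t", "static")  # records without a timestamp are static context
                events[key].update(rec)
    return dict(sorted(events.items(), key=lambda kv: str(kv[0])))

print(athlete_view(17, tracking, mouthguard, equipment))
```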

All NFL broadcast footage has already been migrated to the AWS cloud, so once the next-gen stats and game video timestamps are in sync, engineers can retroactively generate data back to 2016.
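Conceptually, once the clocks are aligned, mapping a tracking timestamp onto a broadcast frame is a small calculation. The sketch below assumes a known clip start time and a standard broadcast frame rate; real synchronization also has to handle clock drift and cuts, so treat this as illustrative only.

```python
# Map an absolute tracking timestamp onto the nearest video frame index,
# assuming a known stream start time and constant frame rate.
def timestamp_to_frame(t_seconds: float, video_start: float, fps: float = 59.94) -> int:
    """Nearest frame index for a tracking sample taken at t_seconds."""
    return round((t_seconds - video_start) * fps)

# Example: a sample 12.5 s after the clip starts, at broadcast frame rate.
print(timestamp_to_frame(112.5, video_start=100.0))  # -> 749
```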

Video review, equipment analysis, game and practice performance data and more go into creating a ‘digital twin’ for NFL players like Arizona Cardinals quarterback Kyler Murray. (Courtesy of the NFL)

The injury reconstruction work conducted by the Biocore team necessarily followed an injury on the field, but the team never had comparative data. Video analytics combined with player tracking data will fill those gaps. Engineers were able to determine that 60% of all injuries involve a lower limb, and even identified a closing speed that makes a player more susceptible to concussion.
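Closing speed can be estimated from tracking data as the rate at which the gap between two players shrinks just before contact. The short sketch below shows that arithmetic on made-up positions; the specific threshold the engineers identified is not reproduced here.

```python
# Estimate closing speed from two players' positions over a small time step.
import math

def closing_speed(p1_start, p1_end, p2_start, p2_end, dt):
    """Closing speed (m/s): how fast the gap between two players shrinks over dt seconds."""
    gap_start = math.dist(p1_start, p2_start)
    gap_end = math.dist(p1_end, p2_end)
    return (gap_start - gap_end) / dt

# Player 1 stationary at the origin; player 2 closes 1 m in 0.1 s -> 10 m/s.
print(closing_speed((0, 0), (0, 0), (5, 0), (4, 0), dt=0.1))
```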

The objective is for this work to be done in concert among the league’s departments and committees. At an NFL-sponsored innovation panel last fall, Atlanta Falcons CEO Rich McKay, who serves as chairman of the competition committee, cited the importance of this artificial intelligence and machine learning research.

“We used to project what we thought was going to happen, but we really didn’t know,” McKay said. “We know now. I think that helps us, in terms of innovation, to progress much more quickly.”

From rearranging kickoff formations to banning players from lowering their helmets, the engineering team is trying to design changes that are minimally invasive.

“We want to preserve the game,” Huddleston said. “But we want to remove the very, very specific things that cause injury.”

Although a number of biomechanical tracking and analysis options exist for other team sports, they have been conspicuously absent from contact sports such as football and hockey.

Keeping track of which player is which during on-field pileups has always been a major impediment to pose or skeletal analysis. Such identification was at the heart of two computer vision challenges the NFL and AWS sponsored, one to automatically mark helmet impacts and the second to identify the players involved.

One of the ways the NFL has begun to address this, Langton said, is to apply additional computer vision algorithms on top of Intel TrueView’s 360-degree volumetric video to generate skeletal modeling of players. This is used in tandem with high-fidelity broadcast footage for robust 3D motion capture.
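The fusion of volumetric and broadcast views rests on classic multi-view geometry. As a simplified illustration, the sketch below triangulates a single 3D joint from its 2D detections in two toy cameras using a direct linear transform; real skeletal modeling tracks many joints across many calibrated views, with smoothing on top. The camera matrices and point are invented values.

```python
# Triangulate one 3D point from two calibrated camera views (DLT).
import numpy as np

def triangulate(P1, P2, pt1, pt2):
    """Recover a 3D point from 2D observations pt1, pt2 under 3x4 projections P1, P2."""
    A = np.vstack([
        pt1[0] * P1[2] - P1[0],
        pt1[1] * P1[2] - P1[1],
        pt2[0] * P2[2] - P2[0],
        pt2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]                      # homogeneous solution
    return X[:3] / X[3]

# Two toy cameras: identity view and one shifted 1 unit along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
point = np.array([0.5, 0.2, 4.0, 1.0])               # known 3D point (homogeneous)
proj = lambda P, X: (P @ X)[:2] / (P @ X)[2]          # project to image coordinates
print(triangulate(P1, P2, proj(P1, point), proj(P2, point)))  # ~ [0.5, 0.2, 4.0]
```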

Just knowing what equipment a player is wearing at all times is useful in the event of an injury, as is knowing which field surfaces are in use: grass or one of 12 unique artificial turf recipes across the NFL’s 30 stadiums.

To that end, the NFL has installed digital scanners in every stadium and in locker rooms at practice facilities so that every player’s gear is recorded before every game and practice. Helmets, shoulder pads and cleats are all tagged with numbered labels.
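The value of those scans is that the gear worn at any moment becomes queryable after the fact. A toy sketch of that record-keeping, with an assumed table layout, labels and IDs, might look like this:

```python
# Minimal gear-scan log: tie each scanned item to a player and a session,
# then query what a player wore on a given day. Schema and values are assumed.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE gear_scans (
    player_id INTEGER, session_date TEXT, item_type TEXT, item_label TEXT)""")
conn.executemany(
    "INSERT INTO gear_scans VALUES (?, ?, ?, ?)",
    [(1, "2021-10-03", "helmet", "H-1042"),
     (1, "2021-10-03", "cleats", "C-0881")],
)

# What was player 1 wearing on game day?
for row in conn.execute(
        "SELECT item_type, item_label FROM gear_scans "
        "WHERE player_id = ? AND session_date = ?", (1, "2021-10-03")):
    print(row)
```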

No blade of grass is left to chance. The league is piloting a device that traverses the playing surface before games to map the football field. It’s like a Mars rover for the NFL, and it helps feed a database of interactions between field type and gear, such as cleats or helmets.

In a side deal with the NFLPA, Langton said, all wearable player performance data is made available to the engineering committee — not to anyone working directly in the league office — for the benefit of player health and safety initiatives. (The NFLPA declined to comment for this story.)

Just as the NFL has made position-specific helmets a priority, Huddleston said the ultimate goal will be true personalization: cleat and helmet recommendations for each athlete based on game, location and weather.

“It’s really the first time we’ve had the completeness of all the data that our engineers had access to,” Langton said, “and the Digital Athlete has helped us put it in that environment.”

