Villanova Engineering Professor Develops Virtual Reality Baseball Training Environment

A cutting-edge project by Mark Jupina, PhD, assistant professor of Electrical and Computer Engineering, is giving Villanova baseball players the opportunity to hone their pitch recognition and eye tracking capabilities against pitchers in the virtual realm. PITCHvr—which stands for Perceptual Image Trainer for the Complete Hitter in the Virtual Realm—can be used in a virtual reality CAVE environment or with head-mounted displays such as the HTC Vive Pro and Oculus Rift. The current version of PITCHvr, called Vision, focuses on eye training. Later versions this year will include specific training modes for hitters, catchers and umpires.

PITCHvr uses either Major League Baseball pitch data available from the PITCHf/x database or customized pitch data based on the user's specifications. Regardless of the source of the pitch data, Jupina uses a model to recreate the motion of a pitched ball—including the path, velocity, orientation and spin of the baseball—from the batter's perspective. In the virtual or augmented realm, there are no limits to how the training experience can be varied and analyzed.
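
For readers curious about the underlying data, the sketch below shows one common way a PITCHf/x-style record can be replayed as a flight path: the public PITCHf/x fields report an initial position, initial velocity and average acceleration for each pitch (in feet and feet per second), so the ball's location at any instant follows from constant-acceleration kinematics. This is an illustrative approximation rather than Jupina's actual model, and the numeric values are invented to loosely resemble a four-seam fastball.

```python
import numpy as np

def pitch_position(t, p0, v0, a):
    """Ball position t seconds after release under the constant-acceleration
    model used for PITCHf/x data: p(t) = p0 + v0*t + 0.5*a*t**2."""
    return p0 + v0 * t + 0.5 * a * t ** 2

# Illustrative values only (feet, feet/second). PITCHf/x axes: x points to the
# catcher's right, y runs from home plate toward the mound, z points up;
# pitches are reported from y0 = 50 ft.
p0 = np.array([-1.5, 50.0, 5.9])     # release point (x0, y0, z0)
v0 = np.array([6.0, -134.0, -6.0])   # initial velocity (vx0, vy0, vz0), ~91 mph
a = np.array([-8.0, 28.0, -16.0])    # average acceleration incl. drag and spin

# Sample the path until the ball reaches the front of home plate (y ~ 1.4 ft),
# roughly 0.38 s of flight at this speed.
for t in np.linspace(0.0, 0.38, 8):
    x, y, z = pitch_position(t, p0, v0, a)
    print(f"t={t:0.3f} s  x={x:+0.2f} ft  y={y:5.2f} ft  z={z:0.2f} ft")
```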

“PITCHvr will allow batters to see more pitches and hone their pitch recognition abilities, while at the same time providing an opportunity for the Villanova research community to design new systems, which will lead to a better understanding of how to further enhance a player’s vision training experience,” says Jupina.

Jupina has developed data sets for three different types of curveballs, two sliders, a change-up and four fastballs, including a four-seam, two-seam and cutter. The velocity range, amount of break, spin rate and other characteristics of each pitch type can be altered to match high school, college and professional levels. PITCHvr includes both left-handed and right-handed pitching avatars with accurate pitching mechanics and grips. The release point of the pitched ball can also be varied.
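
As a minimal sketch of how such a pitch library might be parameterized (the field names, values and structure below are assumptions for illustration, not the actual PITCHvr data format):

```python
from dataclasses import dataclass

@dataclass
class PitchSpec:
    """One pitch type in a hypothetical training library."""
    name: str
    velocity_mph: tuple       # (min, max) release speed for the chosen level
    spin_rate_rpm: int        # typical spin rate
    break_inches: tuple       # (horizontal, vertical) movement
    release_point_ft: tuple   # (x, y, z) release point of the pitching avatar
    handedness: str           # "L" or "R" pitching avatar

# Example entries, loosely at a college level; all numbers are illustrative.
LIBRARY = [
    PitchSpec("four-seam fastball", (88, 93), 2200, (-6, 16), (-1.8, 54.0, 5.9), "R"),
    PitchSpec("curveball", (72, 78), 2500, (6, -8), (-1.8, 54.0, 6.1), "R"),
    PitchSpec("change-up", (78, 83), 1700, (-9, 6), (-1.8, 54.0, 5.8), "R"),
]
```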

Jupina has worked closely with Villanova's head baseball coach Kevin Mulvey, a former standout pitcher for the University who reached the major leagues, as well as his Wildcat players, gathering their input on the experience and how it could impact the team. Their feedback was instrumental in creating the PITCHvr Vision system.

“The opportunity to work alongside Dr. Jupina and his team as he put together the amazing experience that PITCHvr offers has been incredible,” says Mulvey. “Being able to answer questions, add input and enhance the overall experience from the baseball side of things was fun and exciting. Thanks to Mark’s efforts, our student-athletes are now able to experience something that wouldn’t otherwise be available to them had it not been for his willingness to include our program from the start.”

Jupina and his team have added a variety of training assets to the virtual realm, providing a set of tools for various training modes as well as a means of evaluating a player's performance. These training tools can help the brain better anticipate the path of a pitch. For example, in perceptual image training, other senses, such as hearing, can be used to help train the eyes to follow a moving ball. Because hearing is a faster and highly discriminating sense, it can assist sight in training the brain. A unique soundscape file is created for each pitch, composed of a sequence of audible tones representing the path and the initial velocity of the moving ball. Consequently, the soundscape for a curveball sounds very different from that of a fastball, since their paths and velocities are quite different.
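
The snippet below is a rough illustration of the soundscape idea, not the actual PITCHvr audio pipeline: it maps a pitch's height over time onto a short sequence of tones, with the total duration tied to the flight time, so a slower, sharply dropping curveball produces an audibly different sequence than a fastball. The tone mapping, height samples and file names are assumptions.

```python
import numpy as np
from scipy.io import wavfile

SAMPLE_RATE = 44_100

def soundscape(heights_ft, flight_time_s, f_lo=300.0, f_hi=1200.0):
    """Map a sequence of ball heights to a sequence of sine tones.
    A higher ball maps to a higher tone, and the whole sequence lasts as
    long as the pitch's flight, so faster pitches sound more compressed."""
    heights = np.asarray(heights_ft, dtype=float)
    span = np.ptp(heights) or 1.0                     # avoid divide-by-zero
    freqs = f_lo + (heights - heights.min()) / span * (f_hi - f_lo)
    tone_len = flight_time_s / len(freqs)
    t = np.linspace(0.0, tone_len, int(SAMPLE_RATE * tone_len), endpoint=False)
    return np.concatenate([0.3 * np.sin(2 * np.pi * f * t) for f in freqs])

# A fastball drops only slightly; a curveball drops sharply and arrives later.
fastball = soundscape([5.9, 5.0, 4.1, 3.4, 2.9], flight_time_s=0.40)
curveball = soundscape([6.1, 5.6, 4.6, 3.2, 1.8], flight_time_s=0.55)
wavfile.write("fastball.wav", SAMPLE_RATE, fastball.astype(np.float32))
wavfile.write("curveball.wav", SAMPLE_RATE, curveball.astype(np.float32))
```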

Other features include:

  • The player's position in the batter's box is tracked and the strike zone is automatically determined, allowing the system to adjust pitch locations accordingly. For example, knowing whether a player bats left-handed or right-handed, the system will throw more pitches away from the batter than inside, matching the trend seen at higher levels of the game.
  • Speech recognition is used to capture a player's response during training so that the player's performance can be automatically scored. The player's performance during the training session is tracked, and AI (artificial intelligence) adjusts the level of difficulty to match it, yielding an optimal training session (a simplified sketch of such an adaptive loop appears after this list).
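
The snippet below is a minimal sketch of how such an adaptive loop could work; the difficulty tiers, scoring window and thresholds are assumptions made for illustration and are not taken from PITCHvr.

```python
import random

# Hypothetical difficulty tiers: velocity band (mph) and share of breaking balls.
LEVELS = [
    {"velocity": (78, 84), "breaking_ball_rate": 0.20},  # easiest
    {"velocity": (84, 90), "breaking_ball_rate": 0.35},
    {"velocity": (90, 96), "breaking_ball_rate": 0.50},  # hardest
]

def next_level(level, recent_correct, window=10, up_at=0.8, down_at=0.5):
    """Adjust difficulty from the fraction of correct pitch-recognition calls
    (as captured by speech recognition) over the last `window` pitches."""
    if not recent_correct:
        return level
    recent = recent_correct[-window:]
    accuracy = sum(recent) / len(recent)
    if accuracy >= up_at and level < len(LEVELS) - 1:
        return level + 1          # player is succeeding: make it harder
    if accuracy <= down_at and level > 0:
        return level - 1          # player is struggling: ease off
    return level

def throw_pitch(level):
    """Pick a pitch consistent with the current difficulty tier."""
    cfg = LEVELS[level]
    speed = round(random.uniform(*cfg["velocity"]), 1)
    kind = "breaking ball" if random.random() < cfg["breaking_ball_rate"] else "fastball"
    return kind, speed
```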

Currently in development is the integration of Pupil Labs eye trackers and Narbis EEG and EMG sensors into the Vive Pro headset to measure the player's degree of smooth eye pursuit, level of focus and concentration, and degree of relaxation. Jupina is working with sports psychologists, vision experts and baseball training facilities to further develop and evaluate the PITCHvr training system.
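
One plausible way to score smooth pursuit from eye-tracker output is to compare the gaze direction against the ball's direction frame by frame; the sketch below illustrates that idea with made-up inputs and an arbitrary 3-degree tolerance, and is not based on the Pupil Labs API or PITCHvr internals.

```python
import numpy as np

def pursuit_score(gaze_dirs, ball_dirs, tolerance_deg=3.0):
    """Fraction of frames in which the gaze unit vector stays within a small
    angular error of the ball's unit direction vector from the eye."""
    gaze = np.asarray(gaze_dirs, dtype=float)
    ball = np.asarray(ball_dirs, dtype=float)
    cos_err = np.clip(np.sum(gaze * ball, axis=1), -1.0, 1.0)
    error_deg = np.degrees(np.arccos(cos_err))
    return float(np.mean(error_deg < tolerance_deg))
```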

PITCHvr has garnered interest from several Major League Baseball teams, which have sent personnel to see and learn more about the project. It has also been highlighted in the media by WIRED, The Philadelphia Inquirer and, most recently, the SportTechie story "Villanova, MLB Team Eye Virtual Reality Training for Batters and Catchers."

The PITCHvr project has been a true multidisciplinary effort. Jupina has engaged Thomas Toppino, PhD, professor and chair of Villanova's Department of Psychological and Brain Sciences; Frank Klassner, PhD, professor, and Andrew Grace in Computing Sciences; Edmond Dougherty, Engineering Entrepreneurship director and president of Novation Tech LLC; and two computer engineering graduate students, David Lewis and Noah Schwanke.