Augmented reality system for accessible play, iGYM, goes international

Using iGYM’s computer vision module, the U-M team partnered with the University of Tsukuba’s FUTUREGYM Team to develop new interactive games that allow children of all abilities to play together.
A demonstration of the iGYM/FUTUREGYM collaboration, which aims to bring inclusive play to all children with the help of augmented reality. Photo Credit: University of Tsukuba.

The future of recreational play is digitally augmented—and accessible. At least, it can be with iGYM, the spatial augmented reality (AR) system for inclusive play and exercise developed by a team of University of Michigan researchers. 

“The creation of iGYM has shown the powerful potential of augmented reality to help people of varying abilities access recreational activities,” said Roland Graf, an Associate Professor at the Penny W. Stamps School of Art & Design, who co-leads the project. “This technology creates equitable opportunities for people to improve their health and quality of life, and promises to become one of many ways sports and exercise can be adapted to specific needs.”

iGYM uses projected AR to create a room-sized, interactive game environment, initially modeled after a life-sized version of air hockey. The field, goals, and ball are projections that interact with players, who can pass the ball to teammates and score goals. The number of projectors determines how many players can participate: two projectors allow for a 2v2 game, while four support a 4v4 game.

Hun-Seok Kim, an Associate Professor of Electrical and Computer Engineering (ECE), co-leads the project and developed the computer vision system that makes the gameplay possible.

“We wanted to make the system more easily deployable for other places, so we separated the computer vision module from the game engine module,” Kim said. “We use a standard ROS2-based distributed service protocol to attach different games developed by different people to the same computer vision engine. So anyone can design a game, and then they simply use our system to demonstrate it.”

The computer vision module and the game engine module run on separate computers and communicate over the internet. The computer vision module tells the game engine module where each player is on the field and how they are moving. A designer uses that information to build their own game in Unity, a popular game development platform.
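As a rough illustration of that separation, the sketch below shows how a computer vision node might publish player positions over ROS 2 for a game module to consume. The topic name, message type, and update rate are illustrative assumptions rather than iGYM’s actual interface.

```python
# A minimal sketch, assuming a ROS 2 (rclpy) setup: the vision node publishes
# player positions on a topic that a separately running game module subscribes to.
# The topic name and PoseArray message type are illustrative, not iGYM's real API.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Pose, PoseArray


class PlayerPositionPublisher(Node):
    def __init__(self):
        super().__init__('player_position_publisher')
        self.publisher = self.create_publisher(PoseArray, 'igym/player_positions', 10)
        self.timer = self.create_timer(1.0 / 30.0, self.publish_positions)  # ~30 Hz

    def publish_positions(self):
        msg = PoseArray()
        msg.header.stamp = self.get_clock().now().to_msg()
        # In the real system these coordinates would come from the ceiling-camera
        # tracker; here we publish a single placeholder player.
        pose = Pose()
        pose.position.x, pose.position.y = 2.5, 1.0  # meters on the play field
        msg.poses.append(pose)
        self.publisher.publish(msg)


def main():
    rclpy.init()
    rclpy.spin(PlayerPositionPublisher())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```

On the game side, a Unity project would commonly reach such a topic through a ROS–TCP bridge and map the incoming coordinates onto the projected field.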

“We want people all over the world to be able to access and enjoy this system, so we’re always looking for opportunities to collaborate,” Kim said. “We’re happy to share our source code so other groups can easily integrate it with their own system.”

In October 2023, the iGYM team partnered with FUTUREGYM, a research group from the University of Tsukuba that designs activities and curricula inclusive of children with both physical and intellectual disabilities.

After integrating the iGYM system with their large-scale floor projection system, FUTUREGYM hosted a social event on October 28th for students from the Otsuka Special Needs School and the University of Tsukuba-affiliated Sakado High School in Japan. Students played a series of games, including AirSoccer, in a successful demonstration of the iGYM-FUTUREGYM system. 

“With the goal of empowering both students and teachers, the system offers inclusive experiences and enhances the educational curriculum, fostering inclusivity and innovation in education,” stated a press release by FUTUREGYM.

Another way the developers are working to improve the iGYM system is by involving more current U-M students in the project. The iGYM team developed a new class, “Air Play: Inclusive Augmented Reality Game Development,” a Faculty Engineering/Arts Student Teams (FEAST) course offered through Arts Engine. The interdisciplinary design course, which also counts as a Multidisciplinary Design Program (MDP) course, involves faculty and students from the College of Engineering (CoE).

“There are three main teams: technical development, user interface, and entrepreneurship, and we all meet bi-weekly to ensure we’re working cohesively to improve the system for the users,” Kim said. “We can find the right task for any student, so it doesn’t really matter what background they have.” 

One of the improvements to iGYM that came out of the FEAST course was a change to the user interface design. 

“The students made two types of web-based user interfaces: one for the operator and one for the player,” Kim said. “This made it a lot more interactive and fun to use.”

The operator’s interface allows them to test how changing certain parameters, such as difficulty settings for different players, will impact the gameplay. The player’s interface allows them to choose which games to play and select desired settings, such as the amount of play time. 
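As a sketch of what such settings could look like in code, the snippet below models a hypothetical per-session configuration that a web interface might send to the game engine; the field names and defaults are assumptions, not the actual iGYM schema.

```python
# A minimal sketch of per-session settings an operator or player interface
# might submit. Field names and default values are illustrative assumptions.
import json
from dataclasses import dataclass, field, asdict


@dataclass
class PlayerSettings:
    player_id: str
    difficulty: float = 1.0       # e.g., scales ball speed toward this player
    circle_radius_m: float = 0.5  # base peripersonal circle radius, in meters


@dataclass
class SessionSettings:
    game: str = "AirSoccer"
    play_time_s: int = 300
    players: list = field(default_factory=list)


session = SessionSettings(players=[
    PlayerSettings("p1", difficulty=0.8),
    PlayerSettings("p2", difficulty=1.2, circle_radius_m=0.7),
])
print(json.dumps(asdict(session), indent=2))  # the payload a web UI could POST
```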

Students also helped refine solutions to challenges involving the “peripersonal circle,” an individual’s immediate area of motion. A non-disabled person can easily stretch this area by lunging or pivoting, but someone with a physical disability may have a smaller range of motion, giving the non-disabled player an unfair advantage.

To address this, Kim and his team created a button that a player with a disability can press to momentarily expand their peripersonal circle, effectively producing a kick. They designed many kinds of buttons to suit different players’ abilities: some were large, some small; some could be pressed with the fingers or hands, while others could be attached to the inside of a player’s knees. Each player received whichever button best suited their abilities.
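The core idea can be sketched in a few lines: for a short window after the button is pressed, the player’s effective radius grows, so the game registers a kick on any ball within it. The radii and timing below are illustrative assumptions, not the team’s actual values.

```python
# A minimal sketch of the expanding peripersonal circle. When the player's
# switch fires, the radius used for collision checks briefly grows.
import time


class PeripersonalCircle:
    def __init__(self, base_radius_m=0.5, kick_radius_m=1.2, kick_duration_s=0.4):
        self.base_radius_m = base_radius_m
        self.kick_radius_m = kick_radius_m
        self.kick_duration_s = kick_duration_s
        self._kick_started_at = None

    def press_button(self):
        """Called when the player's switch is activated."""
        self._kick_started_at = time.monotonic()

    def current_radius(self):
        """Radius the game engine should use for collision checks this frame."""
        if self._kick_started_at is not None:
            if time.monotonic() - self._kick_started_at < self.kick_duration_s:
                return self.kick_radius_m
            self._kick_started_at = None  # kick window expired
        return self.base_radius_m
```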

However, students in the FEAST course further improved these designs. 

“Last time, we used a wireless mouse as the button,” Kim said. “But now the students created a custom box, which uses Bluetooth low energy technology, which can be attached to a wheelchair.”
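For readers curious how such a box might talk to the rest of the system, the sketch below listens for notifications from a Bluetooth Low Energy switch using the open-source bleak library. The device address and characteristic UUID are placeholders, not the team’s actual hardware interface.

```python
# A minimal sketch, assuming a BLE switch that sends a notification on each press.
# The address and characteristic UUID below are hypothetical placeholders.
import asyncio
from bleak import BleakClient

BUTTON_ADDRESS = "AA:BB:CC:DD:EE:FF"                        # hypothetical device address
BUTTON_CHAR_UUID = "0000ffe1-0000-1000-8000-00805f9b34fb"   # hypothetical characteristic


def on_button_notification(_sender, data: bytearray):
    # In a real integration, this would trigger the player's "kick" action.
    print("button event:", data.hex())


async def main():
    async with BleakClient(BUTTON_ADDRESS) as client:
        await client.start_notify(BUTTON_CHAR_UUID, on_button_notification)
        await asyncio.sleep(60)  # listen for one minute


asyncio.run(main())
```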

But there are still improvements to be made, particularly in how the computer vision system identifies and tracks individual players. Currently, the system detects players using a camera mounted on the ceiling. A computer vision algorithm compares each image frame to a reference image of the empty field, allowing it to detect when a player occupies a given area of the field.
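In outline, that frame-differencing approach looks something like the sketch below, written with OpenCV; the threshold and minimum blob size are illustrative assumptions.

```python
# A minimal sketch of background subtraction against a reference image of the
# empty field. Threshold and minimum blob area are illustrative assumptions.
import cv2
import numpy as np

reference = cv2.imread("empty_field.png", cv2.IMREAD_GRAYSCALE)  # field with no players
kernel = np.ones((5, 5), np.uint8)


def detect_players(frame_bgr, min_area=1500):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, reference)                      # what changed vs. empty field
    _, mask = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)    # remove small noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centers = []
    for c in contours:
        if cv2.contourArea(c) >= min_area:
            x, y, w, h = cv2.boundingRect(c)
            centers.append((x + w // 2, y + h // 2))         # player centroid, in pixels
    return centers
```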

“But the problem is, everyone appears very similar when you’re looking down at them from the ceiling,” Kim said. “So right now, we have to apply the same parameter for everyone on the field, but if we could detect which person is which without slowing down the system running on a regular desktop computer, then we could personalize the parameter for each player.”

Kim’s team is working to add features that make it possible to identify each individual player during gameplay. This includes exploring wireless technologies such as ultra-wideband (UWB) transceivers and Radio Frequency Identification (RFID), both of which use radio waves to localize people or objects. RFID is often used for laser tag and other location-based indoor games, but it requires a large number of readers, each with limited range.
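One way tag-based localization could complement the camera is by matching each tag’s coarse position to the nearest anonymous blob in the camera’s view, as sketched below. This is purely illustrative of the idea, not the team’s chosen approach.

```python
# A minimal sketch: assign identities by pairing each tag reading (UWB or RFID)
# with the nearest camera-tracked position. Distances are in meters.
import math


def assign_identities(tag_positions, tracked_positions, max_distance_m=1.0):
    """tag_positions: {tag_id: (x, y)}; tracked_positions: [(x, y), ...]."""
    assignments = {}
    for tag_id, (tx, ty) in tag_positions.items():
        best, best_dist = None, max_distance_m
        for idx, (px, py) in enumerate(tracked_positions):
            dist = math.hypot(tx - px, ty - py)
            if dist < best_dist:
                best, best_dist = idx, dist
        if best is not None:
            assignments[tag_id] = best  # index of the camera track for this tag
    return assignments
```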

“There are pros and cons for each type of technology, so we are going to investigate different options to find the one that works best for our system,” Kim said. “It’s a challenging problem, but if we can solve it, we’ll be able to support many new kinds of games as well.”

The project is also co-led by Michael Nebeling, an Associate Professor in the School of Information. Sun Young Park, an Associate Professor at the Penny W. Stamps School of Art & Design, also contributed to the project.