Mason NSF CAREER award looks at compressing and transmitting panoramic video for accurate analysis


Panoramic video footage is a useful tool for capturing important information, whether identifying suspects or monitoring the response to a natural disaster such as an earthquake or wildfire.

George Mason University assistant professor Zhisheng Yan in the Department of Information Sciences and Technology will lead a National Science Foundation (NSF) CAREER research project, Machine-centered Cyberinfrastructure for Panoramic Video Analytics in Science and Engineering Monitoring, to further develop and enhance machine-centric video compression and transmission. The project will develop methods for compressing and transmitting footage captured by 360-degree panoramic cameras, which produce copious amounts of data, into a more manageable form for analysis on computing servers.

According to Yan, the benefit of 360-degree video footage is the larger field of view it captures, but transmitting it can be challenging without efficient compression.

“The use of 360-degree panoramic video is seen as an important tool for data collection in a variety of spaces, particularly when it comes to identifying wildlife, filming airport traffic, and suspect recognition,” says Yan. “All the views are available.”

The end consumer of 360-degree video footage is no longer always a human eye but often a computer algorithm, says Yan. That is why video compression and transmission must be redesigned to ensure the algorithms analyze footage properly and retain the data that is needed.

“The need to redesign compression and transmission is not necessarily for video quality, but to optimize analytic results and accuracy for the computer systems ‘viewing’ the video,” says Yan.  

He adds that research on traditional camera and video systems has shown that software programs often run into issues when analyzing panoramic videos that are too large for the systems to handle. This is where machine-centric video compression and transmission can help these systems generate an accurate analysis.
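To make the idea concrete, here is a minimal sketch, assuming a Python/OpenCV pipeline that is not part of Yan's project: it lowers the JPEG quality of a frame step by step and keeps the lowest setting at which a toy analysis routine (a hypothetical stand-in for a real detection model) still returns roughly the same result as on the uncompressed frame. The point is that the compression target is the analysis outcome, not perceived visual quality.

```python
# Illustrative sketch only -- not the project's actual method. It shows the
# general idea of "machine-centric" compression: search for the lowest JPEG
# quality at which a downstream analysis routine still produces roughly the
# same output as it does on the raw frame.
import cv2
import numpy as np


def toy_analysis(frame: np.ndarray) -> int:
    """Hypothetical stand-in for a server-side analysis algorithm.
    Here it simply counts bright pixels; a real system would run a
    detection or recognition model on the 360-degree frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return int((gray > 200).sum())


def lowest_quality_preserving_result(frame: np.ndarray, step: int = 5) -> int:
    """Return the lowest JPEG quality whose decoded frame still yields
    nearly the same analysis result as the uncompressed frame
    (a crude proxy for preserving analytic accuracy)."""
    reference = toy_analysis(frame)
    best = 100
    for quality in range(100, 0, -step):
        ok, encoded = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, quality])
        if not ok:
            break
        decoded = cv2.imdecode(encoded, cv2.IMREAD_COLOR)
        # Accept the lower quality only if the analysis result stays within 1%.
        if abs(toy_analysis(decoded) - reference) > 0.01 * max(reference, 1):
            break
        best = quality
    return best


if __name__ == "__main__":
    # A synthetic equirectangular-sized frame stands in for real 360-degree footage.
    frame = np.random.randint(0, 256, (960, 1920, 3), dtype=np.uint8)
    print("Lowest quality preserving the analysis result:",
          lowest_quality_preserving_result(frame))
```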

Yan will be the sole principal investigator for the project and anticipates working alongside student researchers throughout its duration. The Machine-centered Cyberinfrastructure for Panoramic Video Analytics in Science and Engineering Monitoring project will begin in June 2022 and run for about five years.

Until then, Yan says he will focus on literature review and preparation.

“Our first focus will be on compression technology, and then we will focus on the transmission aspect,” he says. “We’ll do testing on 360-degree panoramic video samples to see what works best.”