CS professors Sarita Adve and Shenlong Wang collaborate with Meta research to improve XR graphics

6/2/2025 Bruce Adams

CS professors Sarita V. Adve and Shenlong Wang have received a grant from Meta for a three-phase research project aimed at creating rich, immersive experiences on all-day-wear extended reality (XR) devices and making content generation available to end users within an acceptable power envelope.


Our vision is for comfortable use throughout the day, just like my glasses. I don't think about my glasses on my face. I wear them all day long. I need them all day long, and that's where we want to go with our vision.

Sarita Adve


Creating rich, immersive experiences on all-day-wear extended reality (XR) devices is a challenging engineering task. Constraints on power use, imposed by thermal dissipation and battery life, are major barriers, and generating and merging digital information and graphics into real-world and virtual environments takes substantial computing power.

Meta has sponsored academic research by computer science (CS) professors Sarita V. Adve and Shenlong Wang from The Grainger College of Engineering Siebel School of Computing and Data Science at the University of Illinois Urbana-Champaign. Their project aims to increase realism and interactivity by orders of magnitude, making content generation available to end users and achieving all of the above within an acceptable power envelope for XR devices. The three-phase project began in the fall of 2024, with a conclusion in May of 2028.

[Image: A person wears an XR headset and reaches upward.]

[Image: Three people in a room; one wears a virtual reality (VR) headset.]

Adve says, "Today, the battery lives of the systems we have on the market are low. For example, the Apple Vision Pro is about two hours, and that's with a battery pack in your pocket. Plus, the weight of these headsets is about 500 grams, and they soon get uncomfortable. Our vision is for comfortable use throughout the day, just like my glasses. I don't think about my glasses on my face. I wear them all day long. I need them all day long, and that's where we want to go with our vision."

The key to comfortable all-day-wear devices is to reduce their power consumption. But power constraints limit the computation that a headset can do. Adve explains, "There is orders of magnitude of gap between what we have today and where we want to be. Today's headsets use about 20 to 30W of power, but ideally, for comfort, we want to be at a few hundred milliwatts while still providing rich experiences. One way to get there is to move computation from the headset to another device. In the near term, that device might be like your phone in your pocket. But ideally, we don't want to be carrying multiple devices, and so we want to move computation to the cloud. This creates a distributed XR system and brings new challenges with higher latency and limited bandwidth."

A futuristic immersive experience that motivates Adve comes from her role as a teacher. "When I teach, I often ask my students who attend in person what motivates them to come to class. They say coming to class keeps them engaged. They ask questions, and really like the experience and learn a lot. It would be wonderful if we could provide the same experience to students in remote locations who take our courses online. Using immersive technologies, they could reconstruct the classroom in their homes and transport their virtual selves to the physical classroom, enabling interactions with other students like they were together in person. These technologies can transform how we educate and train."

But doing this at scale is quite futuristic right now. "A lot of things need to happen for it to come together, and that's what I mean by orders of magnitude gaps that need to be overcome. We are chipping away at them with our work."

Adve points to other XR uses. She is excited to collaborate with Carle Illinois College of Medicine and Jump ARCHES Simulation Center for use of XR in medical training and healthcare procedures.  

"A lot of this work is in the context of our IMMERSE Center for Immersive Computing," Adve notes, "which is a campus-wide center. I'm very excited about bringing together departments and units across campus so that we can leverage the technology work in the context of the needs of the application."


Wang’s contribution to the research concerns the ongoing challenge of generating experiences. The aim is AI-based content creation that combines traditional graphics rendering with a simulation pipeline. By using large language models (LLMs) to synthesize programs from text instructions, the team will build an intuitive prompt interface for scene editing, so users will no longer need expertise in 3D modeling software.
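To make the idea concrete, here is a minimal sketch of what a prompt-driven scene-editing interface might look like. The `SceneEdit` structure and the rule-based `synthesize_edit` function are illustrative assumptions, not the team's actual system; in the real pipeline an LLM, not pattern matching, would translate the instruction into an editing program.

```python
# Hypothetical sketch: turning a text instruction into a structured scene edit.
# A rule-based parser stands in here for the LLM program synthesis step.
from dataclasses import dataclass, field


@dataclass
class SceneEdit:
    action: str                    # e.g. "add", "move", "delete"
    obj: str                       # the scene object the edit targets
    params: dict = field(default_factory=dict)


def synthesize_edit(prompt: str) -> SceneEdit:
    """Toy stand-in for LLM program synthesis: map a plain-text
    instruction to a scene-edit command the renderer could execute."""
    words = prompt.lower().split()
    if words[0] in ("add", "place"):
        return SceneEdit("add", words[-1])
    if words[0] == "move":
        # e.g. "move lamp left" -> object "lamp", direction "left"
        return SceneEdit("move", words[1], {"direction": words[2]})
    if words[0] in ("delete", "remove"):
        return SceneEdit("delete", words[-1])
    raise ValueError(f"unrecognized instruction: {prompt!r}")
```

A user could then type "add a red chair" or "move lamp left" and never open a 3D modeling tool; the point of the LLM approach is that the mapping from language to editing program is learned rather than hand-coded as above.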

Wang says collaboration with Meta to develop AI for headsets will proceed in two ways: computer‑vision models will “map the geometry, lighting, and semantics of your surroundings in real time, turning raw camera data into an accurate digital twin of the physical world,” and generative models will “synthesize virtual objects, avatars, and effects that blend seamlessly with what the cameras see, letting the real and virtual coexist without obvious boundaries.” Wang and Adve together will address efficiency through algorithmic innovation, system/algorithm/hardware co-design, and smart offloading to “push power-hungry workloads to the cloud and lengthen battery life giving cooler, lighter hardware.”

Splitting computation between a headset and the cloud presents “a multi-objective optimization problem,” Wang notes. “Sending data to the cloud saves battery but costs bandwidth and round-trip time; on-device processing has the opposite trade-off. Network variability, privacy requirements, and thermal limits all move the Pareto frontier.” As Wang says, “the ultimate KPI (Key Performance Indicator) is user comfort and presence,” so translating that subjective experience into something measurable that an optimizer can use will be essential.
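The trade-off Wang describes can be sketched as a simple weighted-cost comparison. Everything below is an illustrative assumption, not the project's actual model: the constants (joules per FLOP, radio energy per byte, round-trip time, link bandwidth) are rough placeholders, and a real optimizer would also account for network variability, privacy, and thermal limits.

```python
# Hedged sketch: deciding whether an XR workload should run on the headset
# or be offloaded to the cloud. All numbers are illustrative placeholders.

def offload_cost(compute_flops, data_bytes,
                 device_j_per_flop=1e-10,   # headset compute energy
                 radio_j_per_byte=1e-7,     # energy to transmit one byte
                 cloud_rtt_s=0.04,          # network round-trip time
                 bandwidth_bps=50e6,        # uplink bandwidth, bits/s
                 device_flops_per_s=1e11,   # headset compute speed
                 cloud_flops_per_s=1e13,    # cloud compute speed
                 w_energy=1.0, w_latency=10.0):
    """Return (choice, (cost, energy_J, latency_s)) for the cheaper option
    under a weighted energy-plus-latency cost."""
    # On-device: pay full compute energy and headset-speed latency.
    e_dev = compute_flops * device_j_per_flop
    t_dev = compute_flops / device_flops_per_s
    dev = (w_energy * e_dev + w_latency * t_dev, e_dev, t_dev)
    # Cloud: pay radio energy, plus network and (faster) remote compute time.
    e_cloud = data_bytes * radio_j_per_byte
    t_cloud = (cloud_rtt_s
               + data_bytes * 8 / bandwidth_bps
               + compute_flops / cloud_flops_per_s)
    cloud = (w_energy * e_cloud + w_latency * t_cloud, e_cloud, t_cloud)
    return ("device", dev) if dev[0] <= cloud[0] else ("cloud", cloud)
```

With these placeholder numbers, a compute-heavy frame with little data to ship favors the cloud, while a light task with a large payload stays on the headset; translating "comfort and presence" into the weights `w_energy` and `w_latency` is exactly the measurement problem Wang highlights.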

Wang says that “consistency and physical plausibility are the big challenges in using AI for XR content. Imagine a virtual actor,” he explains, “who:

  • Maintains identity over time – appearance, voice, and style must stay coherent frame-to-frame;
  • Understands its environment – it must ‘see’ your furniture to place its feet on the floor, not through it; and
  • Obeys physics – interactions should respect gravity, collisions, and material properties so nothing breaks immersion.”

The ultimate KPI is user comfort and presence.

Shenlong Wang

The Siebel School team of faculty and students works closely with Meta researchers, meeting weekly. Adve says, “We've had connections with Meta in the past. Our IMMERSE white paper presented a lot of opportunity for collaboration. We were introduced to Meta’s relevant researchers, and we took it on from there. We were working on something specific at that time; Meta joined our meetings, and we started working together. Then, one of my students went to work at Meta for a summer internship, and that work continues together as well. Once the collaboration was established, we talked about funding and wrote this proposal based on topics of mutual interest. It was a very natural thing to do and got funded. And now it's an even more tight-knit funded collaboration.”


Grainger Engineering Affiliations

Sarita V. Adve is an Illinois Grainger Engineering professor of computer science, the Richard T. Cheng Professor of computer science, and director of IMMERSE.

Shenlong Wang is an Illinois Grainger Engineering professor of computer science.



This story was published June 2, 2025.