MIXR UMD

News


2023


From Magical Machines to Real-World Design

Legend of Zelda Inspires New Engineering Course

By Robert Herschbach 

With nearly 20 million copies sold since May, The Legend of Zelda: Tears of the Kingdom isn’t just the fastest-selling Nintendo game of all time. As the basis for a new engineering course at the University of Maryland, the video game could be at the forefront of a new movement in higher education.

Players of the open-world action-adventure game–the latest in a popular 37-year-old franchise–control protagonist Link as he and the eponymous Princess Zelda navigate the world of Hyrule and contend with an unknown evil presence. Some of Link’s explorations, which include a subterranean realm and a series of floating sky islands, rely on the creation of gliders, rockets and other machines. 

When UMD Associate Professor Ryan D. Sochol realized how important the players’ design of these gadgets is to completing the game’s quest, he devised a course that incorporates the game in place of traditional computer-aided design (CAD) and engineering software.

“As I played through Tears of the Kingdom, I couldn’t believe how much I was relying on my engineering training,” said Sochol. “The more experience I had with the game’s CAD assembly interface, numerous machine elements and sophisticated physics, the more I felt it offered unique means to help students hone their skills in machine design.”

Just a few months after the game’s release, Sochol’s “The Legend of Zelda: A Link to Machine Design” course launched this semester, providing undergraduate students with an uncommon opportunity to gain experience designing, prototyping and testing new types of vehicles, robots and machines—all within the virtual world of the game.

Samuel Graham, Jr., dean of the A. James Clark School of Engineering, said the course exemplifies a new approach to teaching in higher education, geared toward greater incorporation of immersive media, including virtual, augmented, and mixed reality, in the classroom.

“The Clark School prides itself on providing our students a rich mix of classroom and hands-on experiences, preparing them to tackle big challenges and position themselves for success in the workforce,” Graham said. “Gaming is a doorway for young people to become interested in engineering and computer science, and create simulation tools that help us solve real world challenges. Our Legend of Zelda class plays on that appeal and provides a powerful mix of intellectual and practical tools.” 

In one of the course’s projects, students created a transforming robot that can run on land and swim in water, and then raced their robots in the game to see whose was fastest. The project, along with others based on aerial vehicles, is designed to help students build their proficiencies in machine design and engineering—but it won’t necessarily make them better at Zelda, Sochol cautioned.

“The machines created for the design projects aren’t too useful if you’re looking to beat the game, but it enables us to teach engineering in the way it ideally should be taught—as something that is engaging, challenging, exciting and fun,” he said.


Luke Rose '26, a mechanical engineering major who also plans to minor in robotics and autonomous systems, was part of the team that won the in-class competition. 

"The biggest impact this course has had on me was that it offered me a different approach to machine design, allowing me to more easily think about constructing mechanical systems as a sum of the components rather than the (far more complex) whole," Rose said. "This has helped me in other major-related courses because, similar to groups of animals moving together, the individual components are following relatively simple rules which lead to very complex motion on the whole."

The game world allowed the class to put into action concepts they'd learned in theory in other classes but that would be impractical for students to build in the real world, said mechanical engineering major Rheanna King '26.

"It felt like I was actually experiencing an engineer's job firsthand by doing calculations, designing, building and even going back to square one and starting over," King said. "The course showed me what it's like to be an engineer and how what we're learning in other classes can be tied together and applied."

This is not the first foray into gamified engineering for Sochol, who joined the UMD mechanical engineering faculty in 2015. He and researchers in his Bioinspired Advanced Manufacturing Laboratory made headlines when they demonstrated a 3D-printed soft robotic hand by playing Super Mario Bros.—work that led to a $3 million grant from the National Institutes of Health to build soft robots for neurosurgery.

Interest is high in the melding of gaming and machine design; a “Hyrule Engineering” group on Reddit has amassed more than 150,000 subscribers, and the waitlist for Sochol’s class this semester was double the number of available seats.

Fortunately, Sochol plans to offer the Zelda course every semester for the foreseeable future.


https://today.umd.edu/from-magical-machines-to-real-world-design

IEEE Pulse: New Center Primes the Extended Reality Frontier

August 31, 2023

Author(s): Leslie Mertz

Collaboration between academia, industry, and government lays the groundwork for medical AR and VR technologies

Extended reality has reached a critical point in biomedicine. The technology is accelerating and excitement about potential health care benefits is mounting, but bridging the gap between envisioning what could be and actually making it happen is still a work in progress. Part of the challenge is coordinating all the major players to ease the development pipeline so virtual, augmented, and mixed reality can reach their potential in everything from diagnostics and surgeries to clinician training and telemedicine.

A new university, industry, and U.S. government collaboration is now in place to address that challenge [1]. Called the Center for Medical Innovations in Extended Reality (MIXR), it will bring together the researchers pursuing virtual, augmented, and mixed reality technologies that clinicians need and want; the companies seeking to develop and make products that both have strong market potential and can earn the necessary approvals from the U.S. Food and Drug Administration (FDA); and FDA officials who are working through the agency’s regulatory responsibilities and procedures as medical extended-reality projects surge.

Funded with a $5 million grant from the National Science Foundation (NSF) [2], [3], [4], MIXR is a joint center between the University of Maryland, College Park; the University of Maryland, Baltimore; and the University of Michigan. The FDA and several technology companies, including industry giants Sony and Microsoft, are also partners.

Its time has come

The timing is right for MIXR, said MIXR lead-site principal investigator (PI) Amitabh Varshney, Ph.D., professor of computer science at the University of Maryland and IEEE Fellow (Figure 1). “Just over the last few years, virtual and augmented reality devices have become commoditized so you can go to electronics stores and buy very high-powered headsets, which was not possible before. There have been amazing advances in artificial intelligence and machine learning, and we now have 5G and soon 6G wireless capabilities,” he said. “This confluence makes this a perfect time to invest the effort and resources to advance the field in a direction that will benefit society at large.”


Figure 1. A long-time researcher in virtual reality (since his doctoral studies at the University of North Carolina under noted computer architect Fred Brooks), Amitabh Varshney is helping to start a new university, industry, and government collaboration called the Center for Medical Innovations in Extended Reality (MIXR). Varshney, Ph.D., is a professor of computer science and the dean of the College of Computer, Mathematical, and Natural Sciences at the University of Maryland, and a lead-site principal investigator for MIXR. (Photo courtesy of the University of Maryland.)

The tipping point between dream and reality is here, agreed Rishi Reddy, M.D., M.B.A., a MIXR partner-site PI and the director of the Center for Surgical Innovation at the University of Michigan (Figure 2). “I don’t think we would have been able to make much headway just five years ago, but with 5G, we have arrived at an opportune time to really plan out thoughtfully what we can accomplish with extended reality in health care,” he said.


Figure 2. Rishi Reddy (left) is the director of the Center for Surgical Innovation and the José José Alvarez Research Professor in Thoracic Surgery at the University of Michigan. He is also a partner-site PI for MIXR. (Photo courtesy of the University of Maryland.)

Extended reality cannot come soon enough, remarked partner-site PI Sarah Murthi, M.D., of the University of Maryland School of Medicine (Figure 3). While surgeons are already able to access real-time computed tomography (CT), ultrasound, and other images during procedures, the images are 2D and appear on separate computer screens. That means surgeons must continually look back and forth from patient to various screens as they carry out often-complex operations, she said. “You can’t see the images without turning away, and you can’t blend one image with another, so it’s still much like it was in the 1950s. Extended reality, on the other hand, can disrupt how data is displayed, and allow providers to see different data streams, fuse data, and overlay real-time images right on the patient” (Figure 4).


Figure 3. Sarah Murthi is the director of the critical-care ultrasound program at the R. Adams Cowley Shock Trauma Center in Baltimore, a professor of surgery at the University of Maryland School of Medicine, and a MIXR PI. (Photo courtesy of the University of Maryland.)

Reddy also sees great potential in student training: trainees could see what the surgeon is seeing, including the 3D overlays on the patient, and ask for clarifications as the real-time operation progresses. He believes extended reality will also be very helpful for preoperative patient counseling, and for allowing patients to connect more meaningfully with physicians during a telehealth visit.


Figure 4. Murthi wears a virtual reality headset during an ultrasound procedure. She foresees extended-reality technologies that will allow clinicians to see different data streams, fuse data, and overlay real-time images on the patient. (Photo courtesy of the University of Maryland.)

“Overall, these technologies provide a lot of powerful opportunities to positively impact health care and the way that we currently function,” Reddy said. “Our goal with MIXR is to do that in a consistent way, which includes validating that these virtual, augmented, and mixed reality systems do make surgeries safer, improve the patient experience, and improve training and education.”

Smoothing the way

MIXR will evaluate the entire extended-reality pipeline (Figure 5). That evaluation begins with acquisition of the virtual scene, Varshney said. In training medical students on a surgical procedure, that would mean acquiring the entire technique so the students can observe at close range every step the surgeon takes. “To make it truly scalable, you would need an array of cameras to capture the procedure, so the students can place themselves in the shoes of the surgeon and see it as it is being performed.”


Figure 5. Varshney, Murthi, and Reddy (left to right) hope MIXR will speed the development of safe and effective extended-reality technologies for health care, whether that be for diagnosis, treatment, medical training, or patient education. (Photo courtesy of the University of Maryland.)

Displays in extended-reality headsets will require evaluation to ensure they perform well. This includes such technical aspects as resolution and field of view, but also issues of user fatigue, Varshney said. “Right now, people can only wear headsets for a short period before they get headaches or nausea, so if a surgeon is starting a 6-hour operation but can only wear the headset for 20 minutes, that is not good. We are looking at what the industry and university roadmap should be so that we design and evolve headsets that are truly effective, ergonomic, and comfortable for people to wear for hours on end so they can be productive at their task.”

Other focus areas for MIXR include best practices for using artificial intelligence and machine learning in the context of visualization and visual immersion for medical diagnosis and medical procedures, and an examination of how next-generation wireless technologies may impact extended reality. As an example of the latter, Varshney proposed that 6G might support a technology to allow a doctor to appear via hologram in a rural or remote location, such as a battlefield, and provide medical direction to on-site personnel who can provide immediate care for a patient.

MIXR will look at the full range of health-related applications. One area in which Reddy sees the technology playing an important role is helping clinicians perform at a high level. “This could be a great boon to procedural training, because there are certain procedures that we surgeons only do once or twice a year, so augmented reality or virtual simulation is a way to practice and maintain skill sets for rare procedures,” he remarked.

Creating a new path

One more critical MIXR emphasis lies in facilitating regulatory approval for extended reality, because it falls outside the normal realm of the FDA, noted Murthi. “If it is a drug, or if it is something that is completely automated, such as a robot doing a tumor resection, that is clearly something for the FDA. A physician decision, such as my decision to choose one antibiotic over another or whether to read certain procedural information before surgery, is not something the FDA oversees,” she said. “Here is the question: What happens with something like augmented reality, where a physician is making a decision but that physician is heavily influenced by this technology? That falls somewhere in between and makes things much murkier for the FDA.”

Within these muddy waters, the FDA hopes to restructure the approval process for extended-reality devices in surgery and medicine, according to Varshney. Rather than going through a lengthy process to approve one-by-one requests to use a certain company’s headset for a specific procedure, the FDA is proposing that universities and companies create teams to define more inclusive guidelines. For example, a guideline might state that any extended-reality headset meeting a certain set of specifications for brightness, resolution, field of view, and weight is appropriate for use with X, Y, and Z surgical procedures, he said. “This way, the whole approval process would be much more streamlined.”
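As a purely illustrative sketch of how such a spec-based guideline might be expressed in software, the check reduces to comparing a headset's specifications against the guideline's thresholds. Every field name, threshold, and procedure label below is invented; this encodes no actual FDA guideline, and the procedure names simply mirror the "X, Y, and Z" placeholders in the quote above:

```python
# Hypothetical sketch of a spec-based approval check. All fields, thresholds,
# and procedure names are invented for illustration, not drawn from any FDA guideline.
from dataclasses import dataclass


@dataclass
class HeadsetSpec:
    brightness_nits: float
    resolution_ppd: float      # pixels per degree
    field_of_view_deg: float
    weight_g: float


@dataclass
class Guideline:
    min_brightness_nits: float
    min_resolution_ppd: float
    min_field_of_view_deg: float
    max_weight_g: float
    approved_procedures: tuple[str, ...]  # placeholder names only


def approved_uses(headset: HeadsetSpec, guideline: Guideline) -> tuple[str, ...]:
    """Return the procedures a headset qualifies for under the guideline."""
    qualifies = (headset.brightness_nits >= guideline.min_brightness_nits
                 and headset.resolution_ppd >= guideline.min_resolution_ppd
                 and headset.field_of_view_deg >= guideline.min_field_of_view_deg
                 and headset.weight_g <= guideline.max_weight_g)
    return guideline.approved_procedures if qualifies else ()


# Example: one hypothetical guideline and one hypothetical headset.
guideline = Guideline(200.0, 30.0, 90.0, 600.0,
                      ("procedure_X", "procedure_Y", "procedure_Z"))
headset = HeadsetSpec(brightness_nits=250.0, resolution_ppd=34.0,
                      field_of_view_deg=100.0, weight_g=550.0)
print(approved_uses(headset, guideline))
```

The point of the sketch is the shift in workflow: instead of the FDA reviewing each headset-procedure pairing individually, any device meeting the published thresholds would automatically qualify for the listed uses.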

This type of streamlining would be a game-changer both in the United States and abroad, commented Reddy. “Similar groups around the world are working on accelerating the extended-reality field, but MIXR is in the unique position of having the FDA as a partner. Because the U.S. is such a big market for all these extended-reality use cases, if the FDA approval process becomes more efficient, that really opens up a lot of opportunities for companies wherever they’re based.”

MIXR will also have an impact on the research end. “We are already beginning to fund specific research projects that have been collaboratively designed and defined by the FDA, industry, and MIXR,” Varshney said. Although the center is only funding research projects at the three partner universities, he added, “the outcomes of these research projects will help accelerate the evolution of the virtual/augmented reality field for medicine.”

The NSF funded MIXR to jumpstart the entire extended-reality field, not to push through specific research or products, Murthi noted. “This is specifically pre-commercial, so the point is not for the University of Maryland or the University of Michigan to develop a company to turn a profit, or for a certain tech company to make and sell a product. The idea is to address fundamental questions and issues that are inhibiting or could accelerate the whole field.”

Put another way, the center will advance regulatory standards and lead to foundational insights in a precommercial way, such that all companies in the extended-reality space will benefit, Varshney said, adding: “It’s a win-win-win across university, government, and industry, so it is really an amazing center in that regard.”

References

  1. MIXR: Center for Medical Innovations in Extended Reality. Accessed: Apr. 11, 2023. [Online]. Available: https://www.mixrcenter.org/
  2. IUCRC Phase I: University of Maryland, College Park: Center for Medical Innovations in Extended Reality (MIXR), Award Abstract # 2137229. Accessed: Apr. 11, 2023. [Online]. Available: https://www.nsf.gov/awardsearch/showAward?AWD_ID=2137229&HistoricalAwards=false
  3. IUCRC Phase I: University of Maryland, College Park: Center for Medical Innovations in Extended Reality (MIXR), Award Abstract # 2137187. Accessed: Apr. 11, 2023. [Online]. Available: https://www.nsf.gov/awardsearch/showAward?AWD_ID=2137187&HistoricalAwards=false
  4. IUCRC Phase I: University of Maryland, College Park: Center for Medical Innovations in Extended Reality (MIXR), Award Abstract # 2137207. Accessed: Apr. 11, 2023. [Online]. Available: https://www.nsf.gov/awardsearch/showAward?AWD_ID=2137207&HistoricalAwards=false

https://www.embs.org/pulse/articles/new-center-primes-the-extended-reality-frontier/

July 26, 2023

Using Emerging Tech, Students Create Innovative Art, Interactive Games in Incubator Program

By Karen Shih ’09

Andrei Davydov '25, a studio art and immersive media design double major, demonstrates how to use a motion capture suit during a New Works Incubator workshop as Kate Landenheim, assistant artist-in-residence at the School of Theatre, Dance, and Performance Studies, looks on.
Photo by John T. Consoli

Deep in the maze-like white corridors of the A.V. Williams Building, gentle tides wash up driftwood and crab shells; oppressed workers rise up against a surveillance state; and aspiring pop stars perfect their recordings in a studio.

“This is pretty sick,” said Marilyn Ortega ’24—in the best sense of the word, of course—as she offered feedback to peers presenting their art installations and interactive games.

They weren’t being created for a class or for an internship but for the Immersive Media Design (IMD) program’s New Works Incubator, an eight-week summer program that gives any University of Maryland student a chance to try out emerging technologies and develop unique, offbeat projects.

“Students really take the opportunity and run with it,” said IMD lecturer Jonathan David Martin, who developed the incubator in 2021.

IMD, a major that debuted the previous fall and now has 122 students, encompasses augmented and virtual reality (AR and VR), motion capture, 3D scanning, audio and video editing, and more. All have creative and real-world applications, such as immersive video games or simulated surgeries for physician training.

“A lot of classes don’t have time to focus in on how all of this tech works, so we’re able to go a little deeper with equipment or software over the summer,” said IMD Lab Manager Ian McDermott, who led the program this year.

The incubator draws students from across campus, though most are computer science and art majors. They attend two workshops per week on different tech topics in June, then work on their projects and get peers’ and mentors’ feedback throughout July, culminating with a presentation at the end of the month. In the fall, some are invited to showcase their work at The Clarice Smith Performing Arts Center’s NextNOW Fest.

The incubator is supported in part by the Arts for All initiative, which connects arts, technology and social justice, enabling McDermott to bring in faculty mentors and purchase technology that students request.

“Whatever we want to do, they make possible. It’s really fun,” said IMD major Malaya Heflin ’24, who worked with IMD and computer science major Fiza Mulla ’25 and CS major Arti Dhareshwar ’25 to develop an AR musical escape room featuring a jazz club, an electronic dance music techno-city and a pop recording studio. To advance through levels, participants solve puzzles using a tablet, identifying famous quotes by musicians or matching a pitch for three seconds.

She hopes to work in game development or for a theme park, and says the incubator is giving her valuable experience.

Other projects ranged from the educational, teaching users how to play obscure instruments like the ocarina, to the political, demonstrating the daily obstacles that Chinese workers encounter in a highly censored state, as well as more speculative approaches.

“I want to immortalize and transform things through digital media,” said Jill Stauffer, M.F.A. ’25. Having lived along the coast of Rhode Island after college, they have powerful memories of walking along the shore. They are developing a motion sensor-illuminated exhibit of 3D-printed wave-worn rocks and fragmented shells, with projected videos and sounds of waves washing up on sand.

“I’m honoring the memory of being on the beach, but with a dystopian twist,” they said. “What if this is the only way this can be experienced 100 years from now?”

https://today.umd.edu/summ-ar-immersion

June 21, 2023

Engineering’s New Immersive Flight Simulator Will Test the Rigors of Flight for Safer Skies

By Maggie Haslam

XR Simulation and Control Lab
Photo by John T. Consoli

Last week, I made an unexpected stop at the McDonald’s drive-thru in the most literal sense. As I barreled toward the parking lot, the rotors of the Sikorsky S-76 helicopter I was flying sheared the golden arches clean off the building’s façade.

No one was hurt, and I didn’t scrap a $20 million chopper in the process. The removal of my virtual reality (VR) headset transported me instantly (and with relief) to a spacious lab on the fourth floor of the IDEA Factory at the University of Maryland. But my churning stomach and frayed nerves reflected an all-too-realistic seven minutes in the cockpit, and a grim realization: Flying is hard.

This crash-and-burn scenario was enabled by the new Extended Reality Flight Simulation and Control Lab, launched by the Department of Aerospace Engineering this spring. It’s the first university-based facility in the United States to reproduce different flying conditions in various types of aircraft through motion-based VR simulation and haptics, which provide information through tactile feedback like rumbling and vibrations.

“Our objective is to increase immersion and recreate scenarios that are difficult to simulate otherwise,” said Assistant Professor Umberto Saetti, who founded and directs the lab—“and ultimately, increase flight safety.”

Saetti’s setup is about as immersive as you can get without leaving terra firma: What looks like an upscale gaming chair bolted to an elevated platform seems to magically transform once you buckle in and gear up. VR goggles conjure an empty cockpit and a full instrument panel at your virtual fingertips, with a sun-drenched tarmac just beyond. Headphones flood your ears with the roar of the engine—and in my case, the thump of the rotor blades—with the platform oscillating in response to the controls: a gentle nudge of the cyclic (or control) stick forward and my chopper comes to life, pitching me up and out over the small town beyond the airport.

He developed his lab after seeing the limitations of conventional flight simulators, which use big-screen projectors built on top of gigantic motion bases and carry price tags of several million dollars (Saetti’s setup, by contrast, runs around $400,000). And while they excel at training pilots to operate just one type of aircraft, at UMD’s lab, the sky’s the limit: With just a few keystrokes, a variety of aircraft can be flown through the simulator, from a Black Hawk helicopter to an F-18 fighter jet. Within a few months, a lunar lander will join the rotation.

The team is also testing a full-body haptic feedback suit, traditionally used for gaming, to explore new methods of providing sensory cues for navigating low-visibility situations and hostile flight scenarios, or for assisting visually impaired pilots. Made by Teslasuit, the snug, Catwoman-like getup I wriggled into in the IDEA Factory ladies’ room is outfitted with over 100 transcutaneous electrical muscle stimulation patches, similar to what is used in physical therapy, to replicate everything from the feeling of rain on your skin to the jolt of a bullet puncturing body armor. Gentle tingling on my shoulders during a separate, blindfolded simulation nudged me to maneuver left or right to keep my plane level.

“If you only feel the motion of the aircraft and can still fly without vision, that could be useful in a number of scenarios,” said Michael Morcos, a graduate student working in Saetti’s lab. “We’re trying to prove that’s actually possible.”

Certain weather-related flight conditions, such as the fog encountered by John F. Kennedy Jr. off the coast of Martha’s Vineyard during his fatal flight in 1999, can put the cues a pilot gets from their instruments (which are correct) in direct opposition with what they are feeling; haptics, said Saetti, could help eliminate confusion about what’s up and what’s down. The suit—which monitors biometrics like heart rate, pulse and cardiorespiratory activity—could also help pilots track external activity like approaching aircraft.

Saetti’s team is currently conducting research for the U.S. Army and Navy, as well as NASA, which together have provided the lab with $1.78 million in funding so far this year. In the future, the researchers plan to work with kinesiology Professor Bradley Hatfield to monitor brain activity and track stress and other human responses to flight.

Someday, the lab’s haptic innovations could take flight on actual aircraft to reduce a pilot’s workload and enable more difficult missions without compromising safety. “Our job is to come up with and demonstrate new ideas, then the companies can do the rest,” said Saetti.

Although precision flying runs in my blood—my grandfather piloted B-24 bombers during WWII—it’s clear I have a lot to learn. That’s the beauty of the simulators, said Saetti: Whether you’re a seasoned military or commercial flier or a land-loving writer, you can undergo incredibly realistic, often tricky flight experiences without leaving the ground.

Or imperiling someone’s Big Mac.

https://today.umd.edu/up-in-the-air-without-leaving-the-ground

May 05, 2023

Performance at The Clarice and its Virtual Reality Element Bring Listeners to Chesapeake Bay

By Sala Levin ’10

The causeway that connects Hoopers Island in the Chesapeake Bay to Maryland's mainland inspired composer Alexandra Gardner to consider the impact of climate change on this part of the state. On Sunday, her Chesapeake Bay-influenced musical piece will be performed by the Tesla Quartet at The Clarice as part of "Rising Tides."
Photo by Jay Fleming

 

On a precariously narrow two-lane roadway, bordered on both sides by water lapping nearly at the asphalt surface, a string ensemble performs, seemingly oblivious to its surroundings.

That’s because the four musicians of the Tesla Quartet aren’t really performing on this causeway, which leads to Hoopers Island—three watermen’s villages perched in the Chesapeake Bay off the coast of Dorchester County. They’re there thanks to augmented reality (AR), which blends the virtual world with the physical one, to tell a musical story about how climate change is ravaging this part of Maryland.

On Sunday, The Clarice Smith Performing Arts Center will present “Rising Tides,” a new musical performance comprising a series of commissioned pieces reflecting how the state of Maryland, especially the water-centric Eastern Shore, is seeing communities, farmland and public infrastructure increasingly succumb to rising sea levels and other consequences of climate change. Concertgoers can watch and listen to the pieces in AR through the app ImmerSphere, which places the musicians in the spots that influenced the composers.

Richard Scerbo, artistic planning program director at The Clarice, said he has long been considering “how the music that we’re programming here at The Clarice can have an impact on our communities and speak to social issues of our time. Climate change has been on my list.”

So he, along with the Tesla Quartet (who are being presented as part of The Clarice’s Visiting Artist Program), approached Maryland-based composers Alexandra Gardner and Adrian B. Sims ’22 to see if they might be interested in writing pieces that spoke to the impact of climate change on the Chesapeake Bay and the people who live on its shores. “The project sounded right up my alley,” said Gardner. “A lot of my work is inspired by the natural world and natural sciences, and of course I’m concerned about climate change.”

The timing was serendipitous: Gardner and her friends had recently visited Hoopers Island and been struck by the proximity of the water to the skinny causeway that connects the islands to the mainland. “It was a really sobering experience to drive out there and see the ditches beside the road as you approach the island full of water on a dry summer day,” she said. “We were like, ‘Wow, what’s it like to drive this when it’s actually raining, or at night when there’s no light?’”

Gardner’s piece is made up of three movements, titled “Causeway,” “Ceremony” and “Ghost Pines.” Each is inspired by a location on Hoopers Island, locations that can be seen—and heard—through ImmerSphere. In the app, the Tesla Quartet’s music is complemented by the ambient sounds of the environment surrounding them: the gentle splashing of water, the whoosh of wind through tall grasses, or the eerie silence of a pine forest whose trees have been killed off by saltwater encroachment.

The “Ghost Pines” movement is “noisy and scratchy,” Gardner said, echoing “what I imagine (is) the sound of a tree having the nutrients sucked out of it.”

Gardner hopes that the performance—and its AR/VR component—will encourage people to face the issue of climate change in the state. “People often don’t realize there’s a problem until they see it or are in it, so this is a way to immerse people in this environment so they can experience a little bit of it, or a feeling of it, even if they can’t actually be there,” she said.

https://today.umd.edu/hearing-and-seeing-climate-change-through-music-and-ar

Published April 27, 2023

Aerospace engineering graduate student Michael Morcos tries out one of the BRUNNER motion-base VR simulators installed at University of Maryland (UMD) assistant professor Umberto Saetti's Extended Reality Flight Simulation and Control Lab. UMD is one of the only universities in the U.S. to house motion-base VR simulation equipment, which can reproduce flying conditions in a wide variety of aircraft. Photo credit: University of Maryland/Joanna Avery.

New motion-base Virtual Reality (VR) simulators installed at the University of Maryland’s (UMD) E.A. Fernandez IDEA Factory will enable researchers and students to develop and test advanced flight control systems and innovative cueing methods, and to study human-machine interaction.

Used in combination with full-body haptic feedback suits, the simulators support fully immersive, Extended Reality (XR) experiences that replicate piloting a variety of aircraft, including fixed wing airplanes as well as rotorcraft.

“We’re harnessing extended reality in ways that push the research envelope and support exciting new applications,” said Umberto Saetti, who joined the UMD aerospace engineering (AE) faculty in 2022 as an assistant professor. The equipment is housed at Saetti’s Extended Reality Flight Simulation and Control Lab at the IDEA Factory.

The NOVASIM VR Simulator is designed by BRUNNER, a leading company in virtual-reality flight simulation hardware, and provides “a fully equipped motion platform, connected with all necessary units, including joystick, rudder pedals and yokes in a more lightweight, flexible way compared to other FFS [full flight simulators],” the company said.

It represents a step forward from the bulky systems–often costing millions of dollars per unit–that were used in the past for similar types of research, said Saetti, who teaches and conducts research as part of the AE department’s renowned Alfred Gessow Rotorcraft Center (AGRC).

“We used to rely on large projected screens that were mounted on large motion base systems, and you would need a different cockpit model for each type of aircraft that you wanted to simulate. In our system, by contrast, we can just swap out one virtual cockpit for another, and plug in any flight dynamics that we are interested in.”

Dr. Umberto Saetti, assistant professor, University of Maryland Department of Aerospace Engineering.

Haptic Teslasuits, meanwhile, allow users to experiment with receiving information on aircraft motion, desired landing zone locations and rotorcraft noise via tactile rather than visual or auditory cues. “Through electro-muscular stimulation, the pilot can receive this information through an alternate sensory channel,” Saetti said.

In military contexts, haptics could be used to alert pilots that an aircraft is being followed, indicating the other aircraft’s relative position, velocity, and acceleration. Saetti also plans to utilize the system to build on his previously published research on flight control systems in helicopters when landing on ships.

The technology also opens up the possibility that users who do not meet the visual requirements to pilot aircraft could someday do so. Indeed, with haptics, an aircraft could even be flown by a visually impaired or blind pilot. 

The idea has precedents. In 1946, Helen Keller took the controls of a four-engine Douglas C-54 Skymaster and flew it for 20 minutes, en route from Rome to Paris. In 2002, deaf-blind teenager Katie Inman successfully demonstrated the use of tactile signing in flight instruction, flying the plane with one hand while communicating with her instructor through signs drawn in her other palm.

In a virtual flying environment equipped with haptics, a pilot like Keller or Inman could learn to fly an aircraft without needing to see or hear. “Say a helicopter rolls to the right,” Saetti said. “The pilot would recognize this through the feel on his or her body, and also be able to take the correct steps to recover, by detecting the changes in haptic stimulation.”
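A minimal, hypothetical sketch of such a roll cue (not the lab's actual software; the channel layout, gains, and limits are all invented for illustration) might map roll attitude to left- and right-side stimulation levels like this:

```python
# Hypothetical sketch of a roll-axis haptic cue, not the lab's actual software.
# A positive roll angle (rolling right) drives the right-side channel; a negative
# angle drives the left. The gain and saturation limit are invented for illustration.

def roll_cue(roll_deg: float, max_roll_deg: float = 30.0) -> tuple[float, float]:
    """Map roll angle to (left, right) stimulation intensities in [0, 1]."""
    # Normalize roll to [-1, 1], saturating at the angle treated as a "full cue."
    level = max(-1.0, min(1.0, roll_deg / max_roll_deg))
    right = max(0.0, level)    # stronger cue on the side the aircraft rolls toward
    left = max(0.0, -level)
    return left, right


if __name__ == "__main__":
    for roll in (-20.0, 0.0, 12.0):
        left, right = roll_cue(roll)
        print(f"roll={roll:+.0f} deg -> left={left:.2f}, right={right:.2f}")
```

Under this mapping, a right roll produces a proportionally stronger cue on the right side of the body, so the pilot can both detect the upset and gauge its severity without visual reference; a real system would presumably add filtering and per-user calibration.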

Alison Flatau, chair of the UMD aerospace engineering department, said she is excited to see Saetti and other faculty investigating novel areas of research with the help of advanced technologies. “Aerospace engineering is an increasingly interdisciplinary, fast-moving field, and we aim to be on the leading edge of research that embraces creative insights and opportunities to advance our field,” Flatau said. “The new systems being made available at Dr. Saetti’s lab provide capabilities that are available at few other universities.”

The E.A. Fernandez IDEA Factory was opened at UMD in 2022 with the goal of fostering technology innovation and advances through collaboration across engineering, the arts, business, and science. In addition to the Extended Reality Flight Simulation and Control Lab, it hosts the Maryland Robotics Center’s Robotics and Autonomy Lab as well as other labs and facilities. 

https://eng.umd.edu/news/story/umd-aerospace-gets-vr-flight-simulators

March 7, 2023

Greg Muraski

UMD Smith School students using VR

Students studying supply chain management at the University of Maryland’s Robert H. Smith School of Business are learning their way around a warehouse – without even having to leave their dorm rooms.

The Smith School’s Humberto Coronado is reimagining the way undergraduate supply chain students learn about warehouse management, using virtual reality technology for a completely new immersive learning experience.

Thanks to a University of Maryland Teaching and Learning Innovation grant, Coronado was able to buy 45 VR headsets and adapt an existing supply chain management course to include an experiential learning module driven by immersive technology. He won the grant last summer; his was one of 90 projects to receive funding in the university’s multimillion-dollar commitment to transform teaching and learning. Coronado, the academic director of the Master of Science in Supply Chain Management program, teaches both undergraduate and graduate students and hopes to eventually use the technology with master’s students as well.

“Using VR is a game-changer for the way we can teach many supply chain concepts,” says Coronado, who had never used the technology before getting the grant to buy the goggles. He worked with an outside company to design the virtual warehouse space where his students will use the technology to explore and learn.

Students received the VR goggles and accompanying handheld controllers on Feb. 10, experiencing the virtual reality platform together – many for the first time. They learned the basics of how to use the technology in class, and each student got to take a set with them for the month-long series of learning modules they’ll be completing both individually and in teams. Outside of class, teams will agree on meeting times, when they’ll put on their VR goggles wherever they are – at home, in their dorm room, anywhere – and meet up in the virtual warehouse together to complete assignments.

“Before this technology, I would have to stand in class and ask students to imagine walking into a distribution facility. But how can I ask my students to imagine something they’ve never seen?” Coronado says. “This technology allows me to get rid of the ‘imagine’ and say, ‘Put on your goggles, go into this facility.’”

So what do they see when they put on the headsets? They’re in a huge distribution warehouse together, where they see each other’s avatars and, using the handheld controllers, they’re able to move around, interact with each other and assess the facility together. In some cases, they’ll be pinpointing problems – like safety hazards or inefficiencies that could cause slowdowns and mistakes – and figuring out how to fix them.

After trying the headsets for the first time, Coronado’s students were surprised at just how realistic the virtual world felt.

“It gave a totally different perspective than just watching a video,” said Krystal Taveras ’24. “I’m really excited about how much learning I’m going to be able to do. I’m a supply chain major and I think it’s going to be really beneficial for me to get a real look at what companies use and what they do.”

Anthony Marcelli ’24 is also excited for the learning opportunity and how it might help prepare him for a career. “I’ve talked to a lot of different employers and recruiters in supply chain and 90% of what you learn happens in the warehouse, hands-on, in-person, and that’s just something you don’t get sitting in a classroom,” he said. “I think this is just going to give us some opportunities that we never would have had before.”

Coronado said that’s really the goal – for students to use the experience to help get the jobs they want when they graduate.

“They will be prepared with a higher level of knowledge and capabilities – far beyond what they can get just from a textbook,” he says. “They’ll be able to go into a job interview with real examples from these facilities.”

Coronado says large companies with logistics and distribution operations – such as Amazon and DHL – are already using this kind of technology to train their employees, and Smith students will have a leg up when entering the job market.

“They are going to be exposed to these technologies that are driving this field and they will get an idea of the new skill sets that are required.”

Coronado hopes future iterations of the course could include training content from companies to create a pipeline of students they could hire.

He says that in the near future many warehouse facilities will be fully automated, with no people inside; employees will sit somewhere else, doing everything from computers and using this VR technology.

“I want our students to see all the technologies that are playing into this new field that is growing so fast and requiring new skill sets. I want them to be exposed to that understanding that supply chain management is driven by technology. You can't do anything in supply chain management if it's not with technology.”

https://www.rhsmith.umd.edu/news/smith-students-learn-supply-chain-management-immersive-vr

Announced February 16, 2023

PI: Christopher Ellis (AGNR), Professor, Plant Science and Landscape Architecture

Awarded Grant Type: Individual Project Grant
Topics: Climate Change and Social Justice
College Represented: AGNR

 

Across the United States, people are increasingly at risk of flooding due to climate change. This is especially true in low-income urban communities, which need the assistance of design professionals who can help analyze flooding problems and identify solutions. Advancements in innovative 3D virtual reality (3D VR) technology can be used to visualize flooding—immersing viewers in the scene at human scale—in ways that current methods simply cannot do on paper or computer screens.

The 3D VR technology is now affordable and easily integrated with standard off-the-shelf design software to develop realistic 3D VR simulations of actual neighborhoods under threat of flooding. This proposal is to work with state and local agencies in Maryland and Baltimore to conduct a comprehensive scientific study to determine if and how 3D VR is a more effective tool to advocate design solutions for the growing problem of severe urban flooding.

The goals of this study have three parts, which are explained in detail in the proposal:

  1. Stakeholder engagement workshops to assess the strengths of communicating flood scenarios and design solutions using 3D VR with city officials and important stakeholders.
  2. A laboratory experiment that tests the strengths of visualizing urban flooding with 3D VR technology, resulting in usable scientific evidence.
  3. Interviews with design professionals who currently use 3D VR, to document when and how 3D VR is being used to support site analysis, conceptual design, onsite review, construction administration, and public presentations.

https://research.umd.edu/urban-flooding


2022


The Daily Record (Maryland)

September 11, 2022

Gina Gallucci-White

When people think of virtual and augmented reality as well as other immersive media technologies, many focus on the entertainment aspect, but these tools have become an asset in medical care.

In May, the University of Maryland School of Medicine announced a partnership with the University of Maryland, College Park, and the University of Michigan to create the Center for Medical Innovations in Extended Reality (MIXR). Established through $5 million from the National Science Foundation’s Industry-University Cooperative Research Centers program, the center aims to accelerate the development of these technologies to use in clinical trials and eventually more broadly in medical care.

Companies like Microsoft, Meta, Google and others will also be providing funding and expertise to the team to develop, test and certify these technologies to use in the medical field.

“Virtual reality has many uses in a health care setting,” said Amitabh Varshney, dean and professor at the University of Maryland, College Park, College of Computer, Mathematical and Natural Sciences and the MIXR lead-site principal investigator. “For training, we’ve already done studies that show people can retain information better—nearly 9% better—than if they were to view the same information on a 2-D desktop screen.”

Staff has also developed virtual reality training prototypes for specialized surgical techniques like emergency fasciotomy, in which the fascia is cut to relieve tension or pressure and restore circulation to an area of tissue or muscle.

For augmented reality, the team has developed a point-of-care ultrasound prototype that displays information directly on the patient so the physician does not have to keep looking away to a monitor.

“These examples are just the beginning,” notes Varshney. “With the added momentum and synergy that our new center will bring — including working with federal regulatory experts to bring new devices and technologies to clinical settings more quickly — we anticipate a time in the not-too-distant future when immersive headsets will be just as commonplace in a hospital setting as a stethoscope.”

Officials note MIXR is needed because of the rapid movement in the private sector to advance new immersive technologies used for gaming, entertainment, education and training. This has filtered down to scientists and physicians using these same visualization tools in a clinical setting or for advanced medical training.

“We believe our new center will serve as a focal point for industry to collaborate — at the highest level — with academia and health care professionals to build, test and certify new devices that can greatly improve patient care and medical training,” Varshney said.

In 2017, Varshney, along with Dr. Sarah Murthi, launched the initial work with the Maryland Blended Reality Center (MBRC).

“It has been tremendously exciting and rewarding to work with Dr. Sarah Murthi and her colleagues in Baltimore,” he said. “They represent the very best in emergency medicine. Now, with added participation from technology leaders like Google, Microsoft and others, we believe we’ve developed a critical mass to move our ideas forward quickly and efficiently. The common theme of using technology to improve patient outcomes has been driving our efforts from the start. This is particularly satisfying for me as a computer scientist.”

MBRC will continue to work on other immersive projects that are not directly related to medicine and health care, including implicit bias training and using immersive environments to train foreign language professionals at a very high level. They have also collaborated with artists and performers to bring new ideas in classical music and opera to the stage.

“So, while some of the new activities of MIXR may overlap with our work in MBRC, we see them as separate, yet complementary, entities,” Varshney said.

Some of the new activities that MIXR staff are exploring have not yet been used in a medical setting to a great extent. Murthi is working on helping patients cope with physical and emotional trauma through immersion in another world with a focus on quadriplegic patients who are hospitalized with acute spinal cord injury. Another collaborator, Dr. Luana Colloca, is a physician scientist using immersive technologies to reduce the need for addictive opioid pain medications.

Varshney and his UM colleagues are in the process of finalizing a HoloCamera studio featuring more than 300 immersive cameras fused together to create 3-D visualizations that will help train health care providers performing difficult medical procedures.

“We are in the final stages of addressing technical challenges that have arisen in our fusing together 300 immersive cameras,” he said. “The system works but we need it to work seamlessly for what we have in mind — high-end training for emergency medical procedures. We anticipate working with our partners in Baltimore on user-study training scenarios within the next six months.”

The collaborators have planned a three-day kickoff in College Park in October to bring all the MIXR partners together including scientists, physicians, private technology firms, and federal regulatory experts. The event is designed to brainstorm their agenda for the immediate future and the next five years. “We are certainly excited for what is yet to come,” Varshney said.

https://thedailyrecord.com/2022/09/11/new-center-for-medical-innovations-in-extended-reality-launched-at-um-umsom/

New York Times

Some care facilities are giving older adults a way to visit their pasts to boost their well-being.

May 6, 2022

John Faulkner, 76, was becoming emotionally withdrawn before he arrived at Central Parke Assisted Living and Memory Care, the community where he lives in Mason, Ohio. He had once been an avid traveler, but cognitive decline ended that, and he became socially isolated. By the time Mr. Faulkner arrived at Central Parke, he would sit alone in his room for hours, according to Esther Mwilu, who organizes activities for the community.

His treatment plan for dementia-related anxiety included antipsychotic drugs and reminiscence therapy, a decades-old practice in which older adults engage with reminders of their youth — like music or personal photographs — meant to bring about memories and cultivate joy and meaning.

Mr. Faulkner was underwhelmed by the nostalgia. So the staff at Central Parke tried again but used virtual reality. While studies suggest that traditional reminiscence therapy can significantly improve the well-being of older people, V.R. has the potential to make it more immersive and impactful. By putting on a headset, Mr. Faulkner could walk along the virtual Cliffs of Moher in western Ireland, just as he’d done with his wife several years earlier.

That was a turning point. Now, three months later, he has a 45-minute V.R. reminiscence therapy session every Monday. Ms. Mwilu said he requires less medication for anxiety and is more social. He has even started teaching classes for other residents, such as how to make paper airplanes.

Roughly a half-dozen companies today focus on providing V.R. reminiscence therapy for seniors in care communities. One of the largest of these, Rendever, works with more than 450 facilities in the United States, Canada and Australia, while another, MyndVR, has partnered with several hundred.

They are part of a growing trend of using virtual reality in health care, including treating patients with trauma and chronic pain. And with the number of people over age 65 in America expected to almost double by 2060, the need for technological aids like V.R. for elder care is only increasing. More than 11 million Americans act as unpaid caregivers for a relative with dementia. The middle-aged “sandwich generation,” juggling careers and multiple caretaking roles, is looking to V.R. and other technologies, such as robo-pets, for support.

Eddie Rayden of Rhode Island said his 91-year-old mother, Eileen, brightened when using V.R. to see the Cleveland neighborhood where she grew up. “She immediately lit up,” he said. “All of a sudden, she was standing in front of the house she hadn’t been to in 80-plus years.”

How it works

The concept of reminiscence therapy goes back to 1963. Many psychiatrists at the time discouraged anything that seemed like living in the past, but one, Robert Butler, who later founded the National Institute on Aging, argued that seniors could get therapeutic value from putting their lives into perspective. Since then, psychologists have increasingly recommended using old wedding videos or favorite childhood meals as tools to benefit older people, including those with dementia. Experts say seniors troubled by declines in short-term memory often feel reassured when recalling the distant past, especially their young adulthood.

Over the past decade, faster and more powerful computing has made virtual reality more realistic and has led to studies showing how older people can use V.R. to re-experience meaningful parts of their lives. In 2018, researchers from the Massachusetts Institute of Technology found that virtual reality reduced depression and isolation among seniors. Other studies have suggested that V.R. reminiscence improves morale, engagement and cognition and eases anxiety by stimulating mental activity, though it cannot necessarily reverse cognitive decline.

Still, larger studies are needed before everyone over the age of 75 is putting on a headset. Dr. Jeremy Bailenson, director of Stanford’s Virtual Human Interaction Lab, is currently leading a clinical trial in 12 states to try to get more data at scale.

“I would never want V.R. to completely replace non-V.R. reminiscence therapy,” he said, but “different people need different tools.”

Senior communities today can pay companies for headsets and access to a library of virtual experiences, many of which are designed for reminiscence therapy. Seniors can participate individually or, more typically, in group sessions.

Prescriptions are not required, and participants often outnumber the headsets. Caretakers and researchers said they start to see benefits after multiple sessions over one to two months. Stephen Eatman, a vice president for Sunshine Retirement Living, which manages Central Parke, said the company’s use of antipsychotics has decreased as much as 70 percent in seniors using V.R. therapy.

In addition to reliving trips to places like Ireland, users can teleport to nightclubs that remind them of their youth. MyndVR offers visits to flamenco, ragtime and classical music venues, complete with musicians and actors dressed in the style of the day.

Family members have created location-based life stories, including vacations and childhood homes, for those undergoing V.R. therapy.

But users are not limited to prepackaged nostalgic experiences. Relatives, friends and caretakers can also record a 3-D video of a wedding or other event that the person can virtually attend over and over to reinforce new memories. Other family members search Google Street View for important places in a senior’s life that can be converted into V.R. realms.

Dorothy Yu, a business consultant from Weston, Mass., had the streets around the University of Missouri campus converted to V.R. so her father could see the buildings where he’d been a professor. Her father, now a 90-something resident of Maplewood Senior Living in Massachusetts, remembers the work he did there with pride, both during the sessions and afterward, she said.

“I’ve never seen anything like the reactions to this technology,” said Brian Geyser, a vice president at Maplewood, which now offers V.R. in each of its 17 communities, which are mostly in the Northeast.

Not right for everyone

To participate in V.R. therapy, you have to strap on a headset that covers your eyes and blocks all light but for the 3-D world you enter. For some older people who didn’t grow up with computers, such immersive technology can be overwhelming, said Amanda Lazar, a human-computer interaction researcher at the University of Maryland.

“The face is a very personal part of the body,” said Davis Park, vice president of the Front Porch Center for Innovation and Wellbeing, a nonprofit that brings technology, including V.R., to senior communities. Someone with dementia may worry when their eyes are covered or have trouble understanding the purpose of strapping a machine over their face at all, Mr. Park said.

To mitigate these risks, Sunshine Retirement limits V.R. activities to certain rooms where seniors can move around safely. They also avoid showing seniors places that could set off traumatic memories, said Mr. Eatman, but people’s reactions are tough to predict.

Most providers also limit V.R. reminiscence sessions to 45 minutes, though even at that length, it can cause dizziness and headaches, especially with certain medications. Headsets may also be too heavy for some older adults’ necks or may not account for hearing and vision impairments.

While companies like Rendever have V.R. simulations that can bring back good memories, headsets can sometimes overwhelm patients, especially those with dementia or who are easily confused.

Another downside: V.R. can be socially isolating. Traditionally, reminiscence therapy has encouraged groups of seniors to bond over special memories with one another and caretakers. “If someone puts on a headset, the people around them are blocked out,” said Dr. Lazar.

The Iona Washington Home Center in Southeast D.C. tries to solve this by projecting seniors’ V.R. experiences onto a 2-D screen for others to watch and discuss. The center, run by a nonprofit, received its V.R. headsets through a government grant, which is common for retirement communities. “People around here don’t have much money,” said Keith Jones, the program specialist. “Most of them didn’t get to see the world.” When he takes groups to another country in V.R., Mr. Jones positions the few members who’ve been there at the head of the table to share their memories.

The future of the memory metaverse

In the future, V.R. may offer another way for seniors to combat loneliness — by stepping into the experience with their loved ones.

Tamara Afifi, a researcher at the University of California, Santa Barbara, has studied V.R. and dementia and is investigating new technologies that let relatives take trips together. Ms. Rayden, a 91-year-old resident of Maravilla Senior Living in Santa Barbara, participated in Dr. Afifi’s research. She and her 66-year-old son took a tour of her old Cleveland neighborhood together, even though he was in Rhode Island.

“I showed him where we played hopscotch and sledded in winter,” she said. “It was important that he knew the home we had and the neighborhood. It was my childhood. It brought back wonderful memories.”

Since Ms. Rayden’s husband died in 2019, she’s struggled with sadness and loneliness. Virtual reality has allowed her to take her son to Florida’s Intracoastal Waterway, where she’d enjoyed fishing vacations with her husband. “He loved fishing,” she said. “Such happy memories.”

Ruth Grande, executive director at Maravilla, said that adult children can “stop being caretakers for 30 minutes” when they have these experiences with their loved ones. “They remember what it’s like to enjoy being with their relative,” she said.

Matt Fuchs is a freelance writer based in Silver Spring, Md.

https://nyti.ms/3FpUuAe

May 9, 2022

Amitabh Varshney, professor of computer science and dean of the College of Computer, Mathematical, and Natural Sciences, leads a new multi-institutional center to advance medical innovations and regulatory science for extended reality technologies.

Ultrasound data displayed directly on a patient via augmented reality headsets. Immersive “grand rounds” for medical students and faculty even when they’re in different locations. Virtual reality landscapes matched with classical opera to transport people with painful injuries outside of themselves, reducing the need for potentially addictive opioids.

These medical examples of extended reality (XR)—the umbrella term for technology based in virtual and augmented reality or other immersive media—are already being prototyped or tested in clinical trials. But the technology’s widespread use in hospitals and other health care settings is currently hampered by technical challenges and sparse regulatory guidelines.

Now, with $5 million from the U.S. National Science Foundation (NSF) and technology titans including Google, Microsoft and Meta (formerly known as Facebook), a trio of academic institutions are collaborating with industry and the federal government to develop, test and certify XR technologies in medicine and health care.

The new Center for Medical Innovations in Extended Reality, known as MIXR, joins University of Maryland computer scientists and engineers with physicians and clinicians at the University of Maryland School of Medicine in Baltimore and the University of Michigan to improve medical training, patient management and health care outcomes across all areas of clinical practice.

The award is part of NSF’s Industry-University Cooperative Research Centers (IUCRC) program, designed to jumpstart breakthrough research by enabling close and sustained engagement between industry innovators, world-class academic teams and government agencies.

Behrooz Shirazi, acting deputy division director of the NSF’s Division of Computer and Network Systems and a program director for IUCRC, called MIXR one of the first national centers at the intersection of medical and computing sciences. “We expect this vibrant collaboration to produce significant societal and health care impacts,” he said.

In addition to Google, Microsoft and Meta, other technology companies involved in MIXR are Sony, Magic Leap, Health2047, GigXR, Brainlab and apoQlar.

Another key partner in the MIXR initiative will be federal regulatory experts working at the U.S. Food and Drug Administration, ensuring that safe, effective and innovative clinical solutions make it to patients as soon as possible.

“We’ll work closely with our industry and government partners to answer any scientific questions regarding regulatory evaluations and decisions needed for the widescale clinical use of these devices,” said Amitabh Varshney, professor and dean of the College of Computer, Mathematical, and Natural Sciences at the University of Maryland.

Varshney is the lead site principal investigator on the project, and is joined by partner-site PIs Sarah Murthi, M.D., an associate professor of surgery at the University of Maryland School of Medicine, and Mark Cohen, M.D., a professor and vice chair of surgery at the University of Michigan Medical School with appointments in pharmacology and biomedical engineering. All three have extensive experience developing and using immersive technologies in a medical setting.

Varshney and Murthi co-direct the Maryland Blended Reality Center (MBRC), launched in 2017 as part of MPowering the State, the strategic partnership between the University of Maryland, College Park and the University of Maryland, Baltimore.

Early projects out of MBRC focused on prototyping new diagnostic tools to assist physicians at the renowned R Adams Cowley Shock Trauma Center in Baltimore, where Murthi is a critical care doctor and director of the critical care ultrasound program. This includes innovative AR medical displays that could improve how bedside procedures are done. 

MBRC clinicians also teamed up with the Maryland Institute College of Art to test a virtual reality platform that can help patients deal with physical and emotional trauma through immersion in another world, with a focus on quadriplegic patients who are hospitalized with acute spinal cord injury.  

In 2018, Murthi and Varshney co-authored an op-ed in the Harvard Business Review that detailed how augmented reality could improve patient care and lower costs in hospital settings.

“Immersive technologies have the potential to fundamentally change, improve and reduce the cost of medical training and of maintaining clinical skills across all aspects of health care,” Murthi said.

Cohen, who leads the Center for Surgical Innovation and trains new physicians at Michigan, said that using XR in medical rounds—adding in virtual reality-based illustrations or augmented reality data overlaid on a patient—makes for a richer experience for both teacher and student, whether they are in person or observing virtually from hundreds of miles away.

“We realized that having this ability to interact virtually with both the patients and residents—pulling up holographic windows and showing diagnostic imaging and labs—would greatly enhance the educational experience in these traditional grand rounds,” he said.

In an interview published last year, Cohen said he is also interested in combining XR technology with machine learning, hoping to leverage sophisticated immersive diagnostic imaging resources with artificial intelligence algorithms to “better predict when diseases will flare up, and how to improve the way we follow and treat chronic diseases like heart failure, cancer and diabetes.”

The MIXR initiative is heavily dependent on powerful computing resources. At Maryland, those resources will be handled by the University of Maryland Institute for Advanced Computer Studies. This includes building and maintaining a soon-to-be-unveiled “HoloCamera” studio, in which more than 300 immersive cameras are fused together to capture medical environments in full 3D.

The new camera system can be used to record cinematic-quality 3D demos of surgeons teaching intricate procedures like a lower extremity fasciotomy, a limb-saving technique in which the sheath of tissue encasing a muscle is cut to treat loss of circulation.
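The HoloCamera’s actual pipeline isn’t described here, but the generic idea behind multi-camera volumetric capture is that each calibrated camera back-projects its view into a shared world space, and the per-camera results are merged. A minimal Python sketch under that assumption (the calibration values, depth images and function names are invented for illustration):

import numpy as np

# Generic multi-camera fusion sketch (not the HoloCamera's actual pipeline):
# each calibrated camera back-projects its depth image into world space,
# and the per-camera point clouds are merged into one volumetric capture.

def backproject(depth, K, cam_to_world):
    """Lift a depth image (meters) into a world-space point cloud."""
    h, w = depth.shape
    v, u = np.mgrid[:h, :w]
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3)
    rays = pix @ np.linalg.inv(K).T                 # camera-space rays
    pts_cam = rays * depth.reshape(-1, 1)           # scale rays by depth
    pts_h = np.concatenate([pts_cam, np.ones((pts_cam.shape[0], 1))], axis=1)
    return (pts_h @ cam_to_world.T)[:, :3]          # into world coordinates

# Invented calibration for two of the (many) cameras.
K = np.array([[500., 0., 320.], [0., 500., 240.], [0., 0., 1.]])
cam_a = np.eye(4)                                    # camera A at the origin
cam_b = np.eye(4); cam_b[0, 3] = 0.5                 # camera B offset 0.5 m

depth_a = np.full((480, 640), 2.0)                   # placeholder depth images
depth_b = np.full((480, 640), 2.1)

cloud = np.vstack([backproject(depth_a, K, cam_a),
                   backproject(depth_b, K, cam_b)])
print(cloud.shape)   # merged point cloud, ready for meshing or rendering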

Joseph JaJa, professor and chair of the Department of Electrical and Computer Engineering at UMD, is also providing expertise. As the lead-site co-PI, he will support the integration of high-performance computing and machine learning into the XR technology being developed and use his extensive experience working with industry to foster stronger collaborative efforts.

Barbara Brawn, currently working with Murthi and Varshney as associate director of MBRC, will serve as the liaison between MIXR researchers and technology companies keen to see their latest hardware and software tools used to save lives and improve medical training.

“The synergy in MIXR will be contagious,” Varshney said. “Our industry partners will push forward new ideas and novel technologies. The scientists and physicians will help refine and test those ideas. And we both will work with the FDA to bring these technologies from the lab to the proper health care setting where they can have an exponential impact.”

—Story by Tom Ventsias


https://www-hlb.cs.umd.edu/article/2022/05/bringing-health-care%E2%80%99s-vision-tomorrow-focus

April 28, 2022

by Hayleigh Moore

Two immersive media projects developed by iSchool students featured in Arts for All showcase.

AJ Rudd (left) and Aishwarya Tare (right)

AJ Rudd, an HCIM student, and Aishwarya Tare, an Information Science student, participated in the Immersive Media + Arts for All Showcase earlier this month, which immersed attendees in interactive media exhibits developed by current students, other members of the UMD community and guest artists. Held April 2-8, 2022, the showcase demonstrated the ways that immersive media can bring the arts into dialogue with cutting-edge digital technology to transform public spaces and further social good through installations, performances and talks.

Spray AR

To address the issues of accessibility and the “ephemeral nature of graffiti,” AJ Rudd joined a student team to develop an original app called Spray AR, which lets users experience spray painting with their mobile devices, no actual paint required. AJ’s collaborator, Jason Alexander Fotso-Puepi, initially pitched the idea of a graffiti experience using augmented reality (AR). The app was developed in Unity, a cross-platform game engine for creating 2D and 3D games and interactive experiences.

“Due to the issue of public destruction of property, graffiti is not always accessible to everyone. AR afforded us the ability to provide anyone with the experience of spray painting a building, a bridge or another area of their choosing,” AJ said. 

A demo of Spray AR, available on YouTube, shows how a user can create graffiti on nearly any surface. AJ and his collaborators plan to refine the basic mechanics already in place before introducing more complex functionality, such as AJ’s idea to integrate blockchain technology into the app. Blockchain verification of artworks in the real world would directly address the short life of graffiti in physical spaces.
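The article doesn’t detail Spray AR’s internals, but the core mechanic of virtual spray painting (cast a ray from the device into the scene, then stamp soft color where it lands) can be sketched engine-agnostically. A minimal Python illustration; the plane setup, falloff model and names are assumptions, not the app’s actual code:

import numpy as np

# Illustrative sketch of an AR "spray" mechanic: cast a ray from the
# device into the scene, find where it hits a surface, and stamp a
# soft dot of color into that surface's paint texture.

def intersect_plane(origin, direction, plane_point, plane_normal):
    """Return the point where a ray meets a plane, or None if parallel."""
    denom = np.dot(direction, plane_normal)
    if abs(denom) < 1e-6:
        return None
    t = np.dot(plane_point - origin, plane_normal) / denom
    return origin + t * direction if t > 0 else None

def spray(texture, u, v, color, radius=8):
    """Blend `color` into `texture` around texel (u, v) with soft falloff."""
    h, w, _ = texture.shape
    ys, xs = np.ogrid[:h, :w]
    falloff = np.exp(-((xs - u) ** 2 + (ys - v) ** 2) / (2 * radius ** 2))
    texture[...] = texture * (1 - falloff[..., None]) + color * falloff[..., None]

# Example: a 256x256 white "wall" five meters away, sprayed at its center.
wall = np.ones((256, 256, 3))
hit = intersect_plane(np.array([0., 0., 0.]), np.array([0., 0., 1.]),
                      np.array([0., 0., 5.]), np.array([0., 0., -1.]))
if hit is not None:
    spray(wall, u=128, v=128, color=np.array([1.0, 0.2, 0.1]))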

The desire to bridge the gap between art and technology while remaining local helped inspire AJ’s decision to enroll in the iSchool’s Master’s in Human-Computer Interaction (HCIM) program. AJ previously earned his bachelor’s degree in studio art with an emphasis on sculpture from UMD in 2019. He will graduate this May after successfully defending his master’s thesis, with plans to pursue a career in AR and, eventually, a PhD leading to a more research-oriented role.

“Sincerely, Ecocriticism”

The idea of sending a postcard to your loved ones is all about telling them what you’re experiencing, but behind the scenery found on these postcards, there is often a sinister, not-so-picture-perfect reality. The “Sincerely, Ecocriticism” project created by iSchool undergraduate student Aishwarya Tare challenges the divide between our everyday lives and the natural world through a physical form many of us feel nostalgia for: postcards. Using 100% recycled card stock, Photoshop and Unity, Aishwarya created different versions of “picture perfect” postcards that show the signs of unsustainable behavior usually omitted from the scenery.

“We tend to only send postcards that have really pretty pictures, but the truth is that even on the prettiest beaches, there is trash, and boats, surfers, and other human footprints. Having to send the picture to a loved one even when it isn’t the most beautiful untouched image, forces us to rethink how we view nature as a commodity,” said Aishwarya.

Aishwarya said the idea was inspired by FUTURES, an event she attended at the Smithsonian featuring artist-in-residence Carlos Carmonamedia. He had created postcards that attendees could send to their future selves, prompting them to reflect on their current reality.

Over the past two years, Aishwarya has been working on her startup, Chat Health, a virtual assistant she developed to help students more easily access their campus health resources and receive personalized, empathy-driven health information. Chat Health won second-place honors in the Quattrone Venture Track at the 2022 Pitch Dingman Competition, an annual competition that gives the university’s most talented student entrepreneurs the opportunity to compete for seed funding and venture development resources. She is also an intern at the Mixed Augmented Virtual Reality Innovation Center (MAVRIC) and a Do Good Institute Fellow, building an augmented reality community garden for mental health.

The Immersive Media + Arts for All Showcase is a weeklong event produced and presented by the University of Maryland’s Immersive Media Design (IMD) program, the College of Arts and Humanities, the College of Computer, Mathematical, and Natural Sciences and the campuswide Arts for All initiative.

More About Arts for All:

The University of Maryland’s new Arts for All initiative partners the arts with the sciences, technology and other disciplines to develop new and reimagined curricular and experiential offerings that nurture different ways of thinking to spark dialogue, understanding, problem solving and action. It bolsters a campus-wide culture of creativity and innovation, making Maryland a national leader in leveraging the combined power of the arts, technology and social justice to collaboratively address grand challenges.

https://ischool.umd.edu/news/putting-the-ar-into-art/

New Showcase Highlights Students’ Work in Virtual and Augmented Reality, Other Emerging Media

April 01, 2022

By Sala Levin ’10

A guest tests out “Pixel Party,” an immersive video piece created by Brayan Pinto '22, Manuela Fantcho '25 and Caroline Dinh ’25 that offers a new perspective on time, space and perception. Photo courtesy of Maria Herd

In a darkened section of UMD’s Herman Maril Gallery is an unexpected bounty of stimuli for the senses. Just noodle around (skillfully or not) on a keyboard whose keys are each linked via software to a projection; depending on what key is pressed, a colored shape appears on the walls around the instrument. Play a melody, and a rainbow of triangles, squares and trapezoids erupts.

“Starlight Symphony,” an installation by Sean Preston MFA ’22, is one of the high-tech works on display in the “Immersive Media + Arts For All Showcase,” which runs April 2–8 in five buildings across campus. The inaugural event will highlight the University of Maryland’s Immersive Media Design (IMD) program, a major offered jointly by the College of Arts and Humanities (ARHU) and the College of Computer, Mathematical, and Natural Sciences, through exhibits, workshops and panel discussions.
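The one-key, one-shape mapping described above is easy to express in code. A hypothetical Python sketch of such a mapping; the shape and color tables and the sizing rule are invented for illustration, not taken from the installation:

# Hypothetical sketch of a key-to-shape mapping: each MIDI note selects
# a shape and color to project. The tables below are invented for
# illustration, not taken from "Starlight Symphony."

SHAPES = ["triangle", "square", "trapezoid"]
COLORS = ["red", "orange", "yellow", "green", "blue", "indigo", "violet"]

def shape_for_note(note: int, velocity: int) -> dict:
    """Map a MIDI note (0-127) to a projected shape description."""
    return {
        "shape": SHAPES[note % len(SHAPES)],    # pitch picks the shape
        "color": COLORS[note % len(COLORS)],    # cycle colors by pitch
        "size": 0.2 + velocity / 127,           # louder notes draw larger
    }

# Playing a C-major triad yields three distinct shapes:
for note in (60, 64, 67):
    print(shape_for_note(note, velocity=90))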

The showcase is a collaboration between IMD and the university’s Arts for All initiative, which brings together the arts, technology and social justice to spark innovation and new ways of thinking.

One goal of the showcase, said Roger Eastman, professor of the practice in computer science and director of the IMD program, is to introduce students, faculty and staff to what is, for many, an unfamiliar concept. “The most common question we get is, ‘What is immersive media?’” he said. “The objective of this showcase is to show off these technologies and their artistic potential.”

The 12 student projects on display “provide great examples of how our students are working at the cutting edge of immersive media, and how diverse (their work is) both in terms of subject matter and also the ways they’re using immersive design,” said Jonathan David Martin, IMD lecturer and program manager of the showcase.

“Making Space,” a multimedia piece by Emily Pan ’23, Lei Danielle Escobal ’24 and Casey Taira ’23, examines what it’s like to be Asian American in 2022 through video, dance, poetry and painting. In the film, Taira’s performance of a dance she choreographed is overlaid with Escobal’s writing and artwork created by Pan and Taira. Plant and tree imagery—including the ginkgo, the national tree of China—suggests the importance of one’s roots, while Escobal’s poems touch on the anxiety many Asian Americans feel living in a country with a history of colonialism. The COVID-19 pandemic and its accompanying spike in hate crimes have also increased existing insecurities.

“When you see a lot of hate of your own culture, you take the chance to become (proud of) it,” Pan said. “That’s the main goal of our piece.”

Other student work invites visitors into a barren underground facility in virtual reality, a movie-like experience by turns comedic and terrifying. In another piece, visitors can spray paint their own graffiti via augmented reality, which overlays computerized visuals on the real world.

Additional programming includes a keynote speech from Gabo Arora of Johns Hopkins University’s Interactive Storytelling and Emerging Technologies program. Arora is the founder of the United Nations’ division for virtual and augmented reality initiatives. A panel including a number of faculty members will discuss how immersive media can make a positive social impact.

Patrick Warfield, professor of musicology and ARHU associate dean for arts and programming, noted that virtual and augmented reality can spark insights that only come from intimacy and proximity. Reading about a Syrian refugee camp and seeing it for yourself through virtual reality are two strikingly different experiences, he said.

Emerging media can “bring us so close to a face-to-face experience,” Warfield said. “We get to deeply experience the lives of others when we’re immersed in video and sound.”

https://today.umd.edu/get-immersed-in-arts-and-tech

February 11, 2022

By Annie Krakower

Students Create Virtual Tour to Showcase Decades of U Street Culture

As part of its Sprinternship this winter, a team of Terps photographed U Street locations such as Ben’s Chili Bowl, the Howard Theatre and Industrial Bank to create a VR tour of the area.
Photos courtesy of Maxine Hsu

Even though she grew up with U Street practically in her backyard, Montgomery County native Maxine Hsu ’25 never knew just how many cultural gems lined the D.C. corridor: the stages of jazz and big band legends, longstanding family-owned shops and restaurants, and homes and haunts of Black trailblazers. Now, the computer science major and a team of Terps are helping others step out on “Black Broadway”—whether they live a few miles away or a few thousand.

During a three-week micro-internship this winter with local extended reality marketing company Capitol Interactive, Hsu and fellow students Saniya Nazeer ’25 and Kia Williams ’24 used a 360-degree camera to create an interactive tour of historic U Street, providing VR views to immerse users in locations that played a key role in Black Washington, ranging from restaurants to theaters to banks. The project will become part of Black Broadway on U, a multiplatform initiative created by alum Shellée Haynesworth ’84 to amplify the stories of the community.

“It’s just interesting to me that there’s so much history in a place that’s so local, that’s so nearby,” Hsu said. “I would just love to be a part of the movement to preserve that history in D.C. because U Street is actually being gentrified, and so a lot of history is being lost.”

The collaboration was among the first Sprinternships from Break Through Tech DC at UMD: quick, jam-packed programs that offer students who identify as women or nonbinary, as well as other underrepresented students, real-world experience through a “tangible project that can be implemented,” said Kate Atchison, UMD’s site lead at Break Through Tech DC, which strives to make the technology industry more inclusive. The Terp trio working at Capitol Interactive, one of 15 host organizations, landed on the virtual tour idea thanks to founder Joseph Cathey’s connection with Haynesworth.

The broadcast journalism alum launched the Black Broadway on U website in 2014 after driving around 14th and U streets with her grandmother, who had lived and worked there as a barber. She couldn’t believe how much the area—which endured widespread damage in the 1968 riots and has been held up in recent decades as a prime example of D.C.’s gentrification—had changed.

That led Haynesworth on a historical deep dive, where she discovered rich stories that went beyond the music and entertainment of Cab Calloway and Billie Holiday to include civil rights activists like suffragette Mary Church Terrell, Black scientists such as blood bank pioneer Dr. Charles Drew, and buildings designed by Black architects and financed by the Black community.

“The history was so fragmented that we needed a platform or a destination where people could go to learn. The goal was to tell the story at the intersection of technology so it could have a digital destination and reach a wider audience,” she said. “I felt that it’s important to expand the narrative of the African American experience.”

The Sprinternship project fit perfectly with that mission. The three students met virtually with Haynesworth, then used the Black Broadway on U website as a guide for initial research. In the winter term’s final week, after Haynesworth had helped them contact and coordinate meetings at various U Street locations, Hsu photographed those places with a 360-degree camera while Williams and Haynesworth wrote the tour’s narrative and worked with Nazeer to compile everything. Haynesworth also secured a narrator and provided music and archival photos to round out the tour.

The team was able to capture around a dozen locations, including Ben’s Chili Bowl, a culinary community staple since 1958; Industrial Bank, one of the largest Black-owned and -operated commercial banks in the United States; and the Whitelaw Hotel, an important lodging and social center for African Americans during segregation.
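Under the hood, a 360-degree tour like this is typically organized as a graph of panoramic scenes connected by clickable hotspots. A hypothetical Python sketch of that structure, with field names invented and only the locations drawn from the article:

# Hypothetical data structure for a 360-degree VR tour: a graph of
# panoramic scenes linked by hotspots. Field names are invented;
# the locations are drawn from the article.

from dataclasses import dataclass, field

@dataclass
class Hotspot:
    label: str          # text shown to the viewer
    yaw_deg: float      # where the marker sits in the panorama
    target: str         # id of the scene the hotspot jumps to

@dataclass
class Scene:
    pano_image: str                      # path to the 360-degree photo
    narration: str                       # audio or text for this stop
    hotspots: list[Hotspot] = field(default_factory=list)

tour = {
    "bens_chili_bowl": Scene("bens.jpg", "A culinary staple since 1958.",
                             [Hotspot("To Industrial Bank", 90.0, "industrial_bank")]),
    "industrial_bank": Scene("bank.jpg", "One of the largest Black-owned banks.",
                             [Hotspot("Back to Ben's", 270.0, "bens_chili_bowl")]),
}

def walk(scene_id: str):
    """Print one stop on the tour and the exits a viewer could click."""
    scene = tour[scene_id]
    print(scene.narration, "| exits:", [h.label for h in scene.hotspots])

walk("bens_chili_bowl")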

The group is planning to share the virtual experience with the public in the coming weeks on the Black Broadway on U website.

“It’s just amazing what you can do with technology. It makes me more excited to delve deeper into computer science,” Hsu said. “There are some technological innovations that haven’t even happened yet. Maybe I can help with that, and maybe I can help with other projects that can preserve history in this way.”

Follow Black Broadway on U on Facebook and Instagram @blackbroadwayonu and on Twitter @blkbroadwayonu.

https://today.umd.edu/a-vr-view-of-black-broadway

2021


Victoria Stavish

November 5, 2021

The University of Maryland and Jigsaw, a unit within Google, announced a partnership to create virtual reality training aimed at improving police de-escalation and communication across the United States.

The partnership was born out of Jigsaw’s “Trainer” platform, which aims to use technology to improve interactions between police and the communities they serve. Trainer provides partners with improved virtual reality technology and systems to help them conduct research into how officer training can be improved.

Rashawn Ray, a sociology professor and executive director of this university’s Lab for Applied Social Science Research (LASSR), said that this university was chosen as a partner, along with the University of Cincinnati, Georgetown University Law Center and Morehouse College, because of its prowess in the field of criminal justice and policing research.

As a partner, the lab can continue its research with better equipment and more collaboration across different departments and the other participating schools.

“[Trainer] gives us the opportunity to collaborate more broadly, and bring more people in,” Ray said. “We’re moving at it in a very, very big way.”

The research team ultimately wants police departments to use virtual reality regularly in their training. This would be less expensive for police departments, add a more evaluative component to police training and allow for better communication and de-escalation training, said Connor Powelson, a sociology doctoral student working on this research.

“What’s important is that [officers are] trained not on these lethal encounters, which are relatively rare. We need to train officers on these very common social interactions where they’re talking to people,” Powelson said.

Their partnership with the Trainer platform can help them do just that.

“This program is focused on, ‘How do people talk to people?’ ‘How do officers talk to civilians in these policing situations?’” Powelson said. “We need to train officers on these more mild cases where they’re just talking to people as necessary.”

LASSR has evaluated officer biases using virtual reality training for about four years. Previously, the research team asked police departments across the country to implement virtual reality headsets in their implicit bias training, which allowed the researchers to better understand whether current implicit bias training is effective.

“What those trainings turn into, those in classroom implicit bias trainings … is a checkbox on a sheet instead of an understanding of, ‘Is this actually making officers better?’” Powelson said. “We want to add meat to the bones of other training programs.”

The research team focused on evaluating unconscious race biases, officer use of respectful or disrespectful language and how officers perceive situations based on the likelihood of criminality or victimhood, Powelson said.

What they found was that officers who have more unconscious biases typically treat Black men with less respect. This outcome is more likely in situations where perceived criminality or victimhood of a person is unclear, Powelson said.

“One huge reason that we started this program is because there’s no national database on police use of force,” Powelson said. “There’s very poor data on officer use of force that we can use to evaluate discrimination behaviors across policies, across departments.”

The new partnership with Trainer allows the team to expand its research and take advantage of more resources than before.

Powelson said new collaborations with the computer science department at this university will allow the team to vary conditions such as character attitudes, skin tone, size, age and ability.

They also plan to work with neuroscientists to better track and understand facial expressions and eye movements in officers.

Genesis Fuentes, a sociology doctoral student working on this research, said now that they are more collaborative and can work with virtual reality technology alongside people from other disciplines, it’s easier to develop and work with the software.

“We understand that we’re social scientists,” Fuentes said. “We’re not going to sit here and try to create … virtual reality software.”

The team’s virtual reality software is useful beyond directly training police officers, Powelson said. They also use their technology and research to improve community relations by facilitating a positive and educational environment for officers to interact with the community.

Powelson said the team has taken their virtual reality headsets to school assemblies, where they allow officers and students to experience the same situation virtually and then discuss why they acted differently than each other in the virtual situation.

“It creates this conversation where trust can be built,” Powelson said. “Understanding can be built and ultimately it does a lot of good for the community.”

https://dbknews.com/2021/11/05/umd-google-virtual-reality-police/

September 20, 2021

by Marlia Nash

ARTECHOUSE at NextNOW
Maryland Night Live performs during NextNOW Fest. (Kurt Leinemann)

On Sept. 17, students filled up Kay Theatre to watch Maryland Night Live. For two hours, students doubled over in laughter at skits and cried at heartfelt musical performances by AwBi and Benny Roman. 

Following the performance, students gathered at the front entrance of The Clarice Smith Performing Arts Center, where they were greeted by the Festival Snapshots with Monumental Magazine marketplace.

NextNOW Fest has historically been held at The Clarice, full of art galleries, musical performances and other interactive exhibits. As recently as 2019, students could expect just a weekend full of these activities.

Megan Pagado Wells, associate director of programming, wanted to expand NextNOW across the campus and across disciplines. This was the first year the festival brought technology disciplines into the arts, making it more inclusive of students with those interests.

“We’ve expanded to seven days of programming across campus from Monday to Sunday,” Wells shared. “We’ve [also] partnered with amazing partners … like Studio A and Stamp Gallery at the Parren J. Mitchell Art-Sociology building, with the UMD Art Gallery and the Michelle Smith Collaboratory for Visual Culture, with the UMD book lab … and of course the immersive media design showcase.”

The immersive media design program is a new component of both NextNOW and the university, having premiered in fall 2020. The program’s exhibit was featured in the Iribe Center and The Clarice.

The projects selected by current immersive media design students were popular with students strolling by. Students such as Caroline Dinh enjoyed the exhibitions and even picked out a favorite. Dinh, a freshman immersive media design major, shared that they loved “Within Reality.”

Brayan Pinto, a senior studio art major, is the creator of “Within Reality.”

“My goal was just to make sure people have fun, come see it, like dance around pretty much,” Pinto shared. And students did exactly that. “My project is a visual effect project that takes the input from a camera and then outputs a special effect on top of it.”

Pinto spent his summer on campus with other immersive media design students working on the project under the supervision of Jonathan Martin, a lecturer in the immersive media design program.

“It started with them coming up with their concept of designing, and then testing and prototyping it, and that was both something that they did individually,” Martin said. 

One particular fan of the exhibit was the dean of the arts and humanities college, Bonnie Thornton Dill.

“I loved watching the dancers as that went on, but I also really liked the meditative piece,” Dill said, referring to Pinto’s project and “Meditations,” a project by Cassiel Arcilla, a junior immersive media design and English major.

Dill spoke to students at the Iribe Center about the Arts for All initiative.

“The arts, plus tech, plus [the] social justice piece. [We] really wanted people on all parts of the campus,” Dill said. “It’s like really hearing the students talk about what they’re doing and also seeing the kinds of the double majors people are doing or the various interests that they have and how they come together.”

On the other side of the campus at The Clarice, ARTECHOUSE had its own interactive installations. The organization is based in just three locations: New York, Miami and Washington, D.C. But students at NextNOW Fest had exclusive access to an exhibit right here on campus.

Danielle Azu, an ARTECHOUSE representative, said the installation featured in the Grand Pavilion, Renewal 2121, presented traditional Japanese scenes and cultural motifs set in the year 2121, with the objective of highlighting climate change and its future effects on Japan.

This is the second time ARTECHOUSE has appeared at NextNOW, and students once again enjoyed it. Freshman computer science major Vruti Soni’s awe at the exhibit captured the festival’s goal of creating a more inclusive arts environment for people with interests in technology.

“When I first came, I thought that colors really drew you to the exhibit. All of them had art in tech, and I hadn’t seen that before,” Soni said.

Amy Yim, a junior math major with a minor in arts leadership, was a student curator for NextNOW and spoke about merging the sciences with art.

“I want more of the sciences to be integrated because there are so often very separate ideas in people’s mind[s],” Yim said.

NextNOW coordinators and student curators had much more to offer than these two exhibits. The lineup had something for everyone, from comedy to art to technology.

Even if you weren’t able to come to NextNOW, it came to you. Whether you were crossing McKeldin listening to Terrapin Brass or walking through Stamp with the Ignis Wind Quintet, you had a small taste of NextNOW.

CORRECTION: Due to a reporting error, a previous version of this story misquoted Megan Pagado Wells. Wells said that NextNOW Fest partnered with the Michelle Smith Collaboratory for Visual Culture, not the Michelle Smith Performing Arts Library. This story has been updated.

https://dbknews.com/2021/09/20/nextnow-fest-umd-clarice-fall-2021/

Image by Stocksy
Facebook and other companies are betting a 3D online realm known as the "metaverse" will be the next big thing for the internet, but a UMD social media researcher suggests the public might not be clamoring to leave the real world behind just yet.

UMD Social Media Expert Explains Why Facebook and Others Want to Create an Alternate, Online World

By Chris Carroll

August 11, 2021

While those of a certain age might recall exotic novelties like accessing primitive chat rooms, logging onto AltaVista or receiving an “E-Mail,” the sense of wonder that accompanied the early internet is long gone.

Now a group of companies is placing a bet on a new internet frontier—one they hope could reinvigorate that sense of technological awe and endless possibilities for connection—with plans to build a 3D virtual world online that users can explore and interact in, called the “metaverse.”

Tech watchers took notice in April when Epic Games, founded by CEO Tim Sweeney ’93, announced it had raised $1 billion in funding to develop its metaverse vision after players of its smash hit Fortnite began hanging out in the game world (for instance, for a virtual concert by Ariana Grande) when not competing. 

Interest in the idea went into overdrive, however, when Facebook founder and CEO Mark Zuckerberg announced last month that the world’s dominant social media company saw its future in the metaverse. Just as what the metaverse will look like is unclear, so are details of Facebook’s planned shift. But expect the company to deploy huge resources and energy both to stay ahead of the tech curve and to divert attention from some of its current problems, said Professor of Information Studies Jennifer Golbeck, a University of Maryland computer scientist who studies social media algorithms.

Golbeck spoke to Maryland Today about what the internet of the future could look like:

The metaverse seems to be a big deal, but what is it, actually?
The idea of a metaverse has been around for years. Facebook didn’t just develop it, and it wouldn’t be a Facebook product, like Instagram. They’re one company that’s part of it. What it actually is, is still somewhat undefined; you could possibly see it interacting with the real world through augmented reality, like with the old Google Glass, with a visual overlay on the real world providing information as you walked around. Where Facebook is really focusing is a virtual space. Most of us have seen or maybe tried the Oculus virtual reality goggles—Facebook owns Oculus, and that seems to be their vision of how you experience the metaverse.

What would you actually see wearing the goggles? A cartoon world, or photorealistic?
I think it will be a blend. As with many online experiences, a lot of how it looks will depend on your hardware. It’s not going to be a platform run by a single corporation or an app like (online 3D virtual world) Second Life. That would suggest that you've got some more heterogeneity to how it looks and how it works. So theoretically, you could walk from something built by one company or organization to another, and the world would completely change.

Does Facebook’s interest make the metaverse more likely to take hold?
Facebook is definitely going to pour a lot of money into it, and they have access to a big user base, so yes. If you want to see the metaverse realized in a way where it features a whole lot of interesting stuff going on, and is a place you can interact with people, this is good for that. But Facebook tries a lot of things people don’t want, and I’ve wondered if this is a solution in search of a problem. Are people really clamoring en masse to leave the real world behind? I’m not so sure they are.

Why is Facebook betting so much on this, then?
After all the issues they’ve had, Facebook wants to reestablish the reputation that they had originally, which is someone who's coming along and doing really innovative new stuff that makes the world a better place, like something that helps us maintain relationships with people we might have lost touch with. I don’t have any insider knowledge of what goes on in Facebook, but I think there must be intense pressure to know what the next big thing is going to be and stay out in front of it.

How could this go bad?
I think all the problems Facebook has had with people using its platform for bad ends will be intensified in a metaverse application, because a lot of what goes on at Facebook these days is just trying to manage the bad stuff. There’s a lot of talk about the anti-vax stuff and the insurrection-related stuff, but there’s also child porn and many other things that don’t get a lot of attention, and I have to say, Facebook does a really good job stopping a lot of that content, and their moderators suffer for it. 

The problem for the metaverse is that we have technology that can automatically flag text or images that might have child porn or certain kinds of violence, but we don’t know how to manage this virtual world space. That means it will probably be effectively unregulated for a while. You’ll have a lot of techie people who want to try new things there, and that will include some jerks. So I don’t think that the “Facebook is evil” idea they’re dealing with is going to go away.

What could be good?
I went to a National Science Foundation review panel that was held in Second Life about five years ago, and it was great. You could talk over each other in a way much more natural than on Zoom, and you had visual cues that helped communication too. Because you have a visual avatar, you don’t have to be on camera—you don’t have to look at your stupid face all day.

So just because I don’t really think people are going to want to spend their lives in virtual reality doesn’t mean there aren’t some good uses for this, like better ways of doing telework. I think if Facebook overlooks that aspect, they’re missing an opportunity. But I think they have a bigger, more encompassing vision than just a better Zoom. I just don’t think it will work.

https://today.umd.edu/well-versed-metaverse-2d603b47-faf3-4681-b785-a446d80ad4a7

Photo courtesy of Maryland Blended Reality Center
A test subject experiences a potentially stomach-churning virtual reality fly-through of a space station while her brain activity is monitored. Eric Krokos ’13, M.S. ’15, Ph.D. ’18 is one of the authors of a new paper on measuring so-called “cybersickness” via EEG.

Better Understanding of VR-Induced Discomfort Could Broaden Tech’s Reach

By Maria Herd

July 07, 2021

If a virtual world has ever left you feeling nauseous or disoriented, you’re familiar with cybersickness, and you’re hardly alone. The intensity of virtual reality (VR)—whether that’s standing on the edge of a waterfall in Yosemite or engaging in tank combat with your friends—creates a stomach-churning challenge for 30-80% of users.

In a first-of-its-kind study, researchers at the University of Maryland recorded VR users’ brain activity using electroencephalography (EEG) to better understand and work toward solutions to cybersickness. The research, conducted by computer science alum Eric Krokos ’13, M.S. ’15, Ph.D. ’18, and Amitabh Varshney, a professor of computer science and dean of the College of Computer, Mathematical, and Natural Sciences, was published recently in the journal Virtual Reality.

The term cybersickness derives from motion sickness, but instead of physical movement, it’s the perception of movement in a virtual environment that triggers physical symptoms such as nausea and disorientation. While there are several theories about why it occurs, the lack of a systematic, quantified way of studying cybersickness has hampered progress that could help make VR accessible to a broader population.

Krokos and Varshney are among the first to use EEG—which records brain activity through sensors on the scalp—to measure and quantify cybersickness for VR users, and were able to establish a correlation between the recorded brain activity and self-reported symptoms of their participants. The work provides a new benchmark—helping cognitive psychologists, game developers and physicians as they seek to learn more about cybersickness and how to alleviate it.

“Establishing a strong correlation between cybersickness and EEG-measured brain activity is the first step toward interactively characterizing and mitigating cybersickness, and improving the VR experience for all,” Varshney said. 

EEG headsets have been widely used to measure motion sickness, but prior research on cybersickness has relied on users to accurately recall their symptoms through questionnaires filled out after users have removed their headsets and left the immersive environment. 

The UMD researchers said that such methods provide only qualitative data that makes it difficult to assess in real time which movements or attributes of the virtual environment are affecting users. 

Another complication is that not all people suffer from the same physical symptoms when experiencing cybersickness, and cybersickness may not be the only cause of these symptoms.

Without the existence of a reliable tool to measure and interactively quantify cybersickness, understanding and mitigating it remains a challenge, said Varshney, a leading researcher in immersive technologies and co-director of the Maryland Blended Reality Center.

For the UMD study, participants were fitted with both a VR headset and an EEG recording device, then experienced a minute-long virtual fly-through of a futuristic spaceport. The simulation included quick drops and gyrating turns designed to evoke a moderate degree of cybersickness.

Participants also self-reported their level of discomfort in real time with a joystick. This helped the researchers identify which segments of the fly-through intensified users’ symptoms.
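The paper’s actual analysis isn’t reproduced here, but the basic approach (testing whether EEG band power tracks the moment-to-moment discomfort ratings) can be sketched. A minimal Python illustration on synthetic data; the sampling rate, frequency band and windowing are assumptions, not the study’s parameters:

import numpy as np
from scipy import signal, stats

# Illustrative sketch: correlate EEG band power with joystick-reported
# discomfort over a one-minute VR fly-through. All data here is synthetic;
# the sampling rate, band and windowing are assumptions, not the study's.

fs = 256                                  # EEG sampling rate (Hz), assumed
t = np.arange(0, 60, 1 / fs)              # one-minute session
discomfort = np.clip(np.cumsum(np.random.randn(60)) / 10, 0, 1)  # 1 Hz joystick trace
eeg = np.random.randn(t.size) + np.repeat(discomfort, fs) * np.sin(2 * np.pi * 5 * t)

# Power in the theta band (4-8 Hz) over one-second windows.
freqs, times, spec = signal.spectrogram(eeg, fs=fs, nperseg=fs, noverlap=0)
theta = spec[(freqs >= 4) & (freqs <= 8)].mean(axis=0)

# Correlate windowed band power against the per-second discomfort ratings.
n = min(theta.size, discomfort.size)
r, p = stats.pearsonr(theta[:n], discomfort[:n])
print(f"theta power vs. discomfort: r={r:.2f}, p={p:.3f}")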

This work was supported by the National Science Foundation, the state of Maryland’s MPowering the State Initiative and the NVIDIA CUDA Center of Excellence program. 

https://today.umd.edu/researchers-record-brain-waves-measure-cybersickness-173f8081-0d63-403e-ad92-9151f3b42b92

Maryland Today Campus & Community

Students Designing Virtual Astronaut Assistant for NASA Competition

By Chris Carroll

March 9, 2021

Illustration by Shutterstock
A UMD student group is designing a virtual robot avatar that would appear in a display built into an astronaut's helmet to assist with lunar exploration. Their design will go head-to-head with those of other student groups next month in a NASA-sponsored competition.

The United States plans to return people to the moon in 2024, and if a University of Maryland student club has its way, astronauts from NASA’s Artemis program won’t be exploring the stark lunar landscapes alone—not exactly, anyway.

Members of the XR Club, which focuses on virtual and augmented (or “mixed”) reality, have designed a virtual robot assistant that would exist only in an astronaut helmet’s heads-up display, where it could provide information to assist with tasks ranging from navigation to equipment repair. The idea—headed for a NASA-sponsored student showdown—is that an interactive, seemingly embodied helper could be more intuitive to use than a computer-like display.

And if it provides a little entertainment while capering among the craters, that would only be true to its conceptual roots, says senior Sahil Mayenkar, a computer engineering major and president of the XR Club. The design was inspired by Hollywood, starting with JARVIS, the artificial intelligence embedded in Marvel movie protagonist Tony Stark’s Iron Man suit. Visually, the team’s own ARTEMIS (Augmented Reality Trusty Extraterrestrial Mission-ready Intelligent Sidekick) is based on EVE, the angelic robot in the Pixar film “WALL-E.”

“Mixed reality allows us to bring characters to life as something we can see in front of ourselves and interact with, and it doesn’t take up a ton of space or have the complications of an actual robot on the moon,” Mayenkar says.

The team’s adviser is Matthias Zwicker, professor and interim chair in the Department of Computer Science, and an expert in the intersection of artificial intelligence and computer graphics.

“It is exciting to see them using these cutting-edge augmented reality technologies to work on such a fascinating real-world problem,” he said. “There is no doubt that this experience will be a huge asset for the future careers of these students.”

The design is part of a competition—NASA’s Spacesuit User Interface Technologies for Students (SUITS) challenge—that will be run virtually in late April by space agency staff at Johnson Space Center in Houston. The space agency is looking to the students for innovative ways to use augmented reality to extend the capabilities of astronauts, something Mayenkar and his teammates practically dream about.

“The idea is if we can bring humans and machines closer together, it can almost give us superpowers.”

https://today.umd.edu/moonwalk-imagery-friend-3128f09a-603a-4f30-856d-d8025e6e3bbc

2020

Photo by John T. Consoli
Brandon Morse, an associate professor of art, and Roger Eastman, professor of the practice in computer science, were instrumental in developing the curriculum for a new four-year major in immersive media design.

First Introductory Course Offered This Fall, Two More Expected Next Spring

By Maria Herd

September 21, 2020

The University of Maryland has a new four-year undergraduate program that combines art with computer science to prepare students to design and develop immersive media content and tools.

The immersive media design (IMD) major is co-taught by art and computer science faculty with expertise in virtual and augmented reality, digital art, projected imagery, computer graphics, 3D modeling, and user interfaces spanning audio, visual and tactile platforms.

“The goal is to graduate students who can collaborate effectively across creative and technical boundaries, and will excel in their field, whether that’s in computing, health care, education, advertising, gaming or the visual and performing arts,” said Roger Eastman, a professor of the practice in computer science and inaugural director of the program.

The program kicked off this fall with one introductory course, with two more being offered in Spring 2021. 

IMD features two tracks. Innovative Coders, for students focused on computer science, offers a Bachelor of Science degree. Emerging Creatives, with coursework focused on digital art, offers a Bachelor of Arts degree.

Dani Feng, a sophomore in computer science intending to major in immersive media design, has her career sights set on the animation industry. Feng said that she dreams of designing digital tools that help artists better tell stories across a broad range of styles.

“I want to have the knowledge from both worlds, and be able to look at my work with both a technical eye and creative eye,” she said. 

The program is designed to be collaborative, with core digital art courses featuring small classes and extensive group project work, said Brandon Morse, an associate professor of art who helped develop the curriculum with Eastman.

Morse, a digital artist whose work has been showcased internationally, said that IMD students won’t need to look far for creative opportunities outside the classroom. The region has seen an explosion of immersive design opportunities in the past few years at venues like ARTECHOUSE and the REACH at the Kennedy Center.

IMD has a dedicated space in the A.V. Williams Building that is undergoing renovation. In addition, IMD faculty and students will use digital art labs and fabrication resources in the Parren J. Mitchell Art-Sociology Building, as well as a high-bay research lab in the Brendan Iribe Center for Computer Science and Engineering.

“Our computing program is strong, interest in digital media is expanding dramatically, and our location next to government agencies and companies excited about new immersive technologies offer unprecedented internship and employment opportunities,” said Amitabh Varshney, professor and dean of the College of Computer, Mathematical, and Natural Sciences.

Varshney played a key role in establishing the new major, co-chairing a task force in 2016 and teaching the university’s first undergraduate course in virtual reality that same year.

The IMD program also bolsters the university’s standing as an arts-tech integrative campus, said Bonnie Thornton Dill, professor and dean of the College of Arts and Humanities.

“This new program, at the intersection of art and technology, is a tremendous opportunity for students to develop their abilities in innovative ways and to expand their creativity and career opportunities,” she said.

https://today.umd.edu/new-major-immerses-students-coding-and-creativity-74170f80-0359-4859-a0ba-b0fa5f31c885

Photo courtesy of the Maryland Blended Reality Center
A tester in the Maryland Blended Reality Center views a weather system in virtual reality. The new system developed by UMD meteorologists and computer visualization experts allows a highly intuitive view of atmospheric data.

Revolutionary Virtual Reality System Gives 3D Vantage of Satellite Data

By Chris Carroll Feb 21, 2020

Glancing out the window tells you less about the weather than stepping outside. The same principle might also apply to meteorologists who rely only on computer screens to understand vast quantities of atmospheric data about developing weather patterns, including dangerous storms.

Now, UMD researchers are developing a groundbreaking system that lets forecasters don a virtual reality headset and “fly through” the atmosphere. It lets them zoom up next to temperature gradients, keep pace with differing wind speeds and soak up information on the atmospheric moisture content at different altitudes in a way that’s supremely intuitive—not to mention cool to look at.

“The current status is you chop up data into two-dimensional layers,” said Mason Quick, a meteorologist in the Cooperative Institute for Satellite Earth System Studies (CISESS). “But these are in fact three-dimensional datasets, so we’re now viewing them in their native form.”
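The system’s internals aren’t described in the article, but the shift Quick describes, from stacked 2-D slices to a native 3-D volume, is straightforward to illustrate. A minimal Python sketch with invented grid shapes and variable names:

import numpy as np

# Illustrative sketch: satellite products often arrive as 2-D lat/lon
# fields, one per pressure level. Stacking them yields the native 3-D
# volume a VR viewer can fly through. Shapes and names are invented.

n_levels, n_lat, n_lon = 12, 180, 360
levels_hpa = np.linspace(1000, 100, n_levels)      # pressure levels, surface to aloft

# One 2-D moisture field per level (here, random placeholders).
layers = [np.random.rand(n_lat, n_lon) for _ in levels_hpa]

# The "native form": a single (level, lat, lon) volume.
moisture = np.stack(layers, axis=0)

# A VR renderer samples this volume along the viewer's line of sight;
# here we just pull the column above one location as a stand-in.
lat_idx, lon_idx = 90, 180
profile = moisture[:, lat_idx, lon_idx]
print(dict(zip(levels_hpa.round(), profile.round(3))))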

The meteorologists worked with computer visualization experts in the Maryland Blended Reality Center (MBRC), led by Professor Amitabh Varshney, dean of the College of Computer, Mathematical, and Natural Sciences; together they designed a prototype system that incorporates some of the more than 100 available satellite atmospheric data sources. 

For its first test, they fed it about a dozen datasets that documented a dramatic weather event in the Pacific Ocean—an atmospheric river of extremely moist air flowing in a narrow, directional stream—that one year ago drenched the U.S. Pacific coast, prompting deadly floods and mudslides in California and record snowfall in Washington state. 

Researchers in CISESS and the MBRC, a program of the MPowering the State initiative with the University of Maryland, Baltimore, are working to fine-tune the system, improve the user interface and potentially work in more satellite datasets. But it already provides a far more visceral experience than viewing a map-like weather data presentation on a screen, and it’s easy to envision a whole new way of tuning into the nightly weather forecast as VR technology continues to make inroads into the mainstream.

The wow factor is not necessarily going to sell professionals at the National Oceanic and Atmospheric Administration—CISESS’s target user group for the technology—but the system could supplement current, highly refined methods of analyzing weather, says CISESS meteorologist Patrick Meyers. It could be a source of new insights, because quite literally, he said, “It’s providing a whole new dimension.”

https://today.umd.edu/forecasts-future-427554c4-7719-42ae-af24-a4f045281bf2

2019


Maryland Today Research

AR: The Trend Changing How You Shop During the Holidays

By Karen Johnson

November 22, 2019

Animation by Valerie Morgan
Companies are using augmented reality (AR) experiences that let shoppers use a smartphone to superimpose images of products in the real-world location (or on the person) where they would go if purchased.

Holiday shopping has always required some imagination and a bit of guesswork. These days, retailers are turning to augmented reality to help consumers eliminate some of that guesswork.

Companies across consumer sectors are rolling out augmented reality (AR) experiences that offer appealing, practical ways to use your smartphone to superimpose images of products you’re interested in—a table or a jacket, for instance—in the real-world location (or on the person) where they would go if you ponied up the cash.

Until very recently, this seemed like a futuristic pipe dream, said UMD marketing Professor Jie Zhang. Now, both in-store and online AR technologies are drawing positive reviews from consumers.

The technology has been steadily improving, driving up interest for in-app and in-store use, said Zhang, the Harvey Sanders Fellow of Retail Management at the Robert H. Smith School of Business.

“The core audience of mobile-commerce activities has been younger and tech-savvy consumers, who are particularly receptive to technological innovations,” Zhang said. “AR apps allow them to virtualize the look and feel of merchandise which they could not physically inspect, and thus substantially enhance the confidence in their purchase decisions.”

Want to see how those cool new shoes will look on you or someone you love? Lacoste, Converse, Nike and Gucci all have AR technology that can help you visualize the footwear before you buy it. For clothes, Gap and American Apparel, among others, offer similar apps.

Wondering how a comfy armchair will look in your sister’s new apartment? Apps from Anthropologie, Magnolia Market and Ikea can help you do that.

Even buying makeup for someone else has become easier with AR execution from L’Oreal Paris and Sephora.

Retailers are also using gamification in the apps, piquing the shopper's interest and building loyalty toward the retailers offering them, Zhang said, while learning more about how consumers shop.

“A key limitation of retailing via digital channels is the lack of opportunity for a shopper to touch and feel products before placing orders,” she said. “AR apps are an effective way to reduce this limitation. If done well, AR apps can help a retailer attract and retain more consumers and generate higher spending from them.”

If that makes you wonder what might be next, Zhang has the answer. Over the next five years she anticipates further integration of artificial intelligence, mobile technology, AR and virtual reality tools, as well as sensory devices that incorporate the senses of touch and smell, along with audio and visual effects.

https://today.umd.edu/ar-trend-changing-how-you-shop-during-holidays-64dc0477-07de-4f46-bb11-b3211c571e15

Yosemite National Park
©2019 Photography by Scott Highton
An all-night “Star Party” gathers under the clear skies above Glacier Point, which overlooks Yosemite Valley at an elevation of 7,214 feet. The image is part of Virtual Yosemite, the creation of photographer Scott Highton ’78.

November 15, 2019

Alum’s Website Uses VR to Bring You Yosemite’s Splendors

Click here to go to Yosemite National Park. Seriously, try it.

OK, admittedly we haven’t invented teleportation, but the collection you’ll find of immersive, 360-degree images that swoop from thundering waterfalls to majestic summits and back down to that other Yosemite National Park staple—traffic jams—is the next best thing.

There’s nothing else on the web quite like Virtual Yosemite, the creation of photographer Scott Highton ’78, with hundreds of virtual reality panoramas that allow you to look up, down, zoom in and explore California’s most famous grouping of natural landmarks.

For Highton, a professional photographer who lives a short drive from the 1,169-square-mile park, Yosemite was the perfect place to capture a breathtaking virtual reality experience, with a plethora of locations that draw the eye in every direction.

“With its dramatic cliffs, waterfalls and mountains, Yosemite certainly has lots of these,” he said. “It’s also a stunning natural location overall, with almost indescribably beautiful light for those willing to look for it.” 

Highton has worked with VR technologies for 25 years, and has seen the popularity of the medium come and go. But for any new media format to truly take hold, he said, what matters more than the technology itself is meaningful content.

He spent more than two years creating the images for the Virtual Yosemite website, enlivened by audio from the scenes as well as brief text descriptions. He plans to continue expanding the site and keep it free of charge for the next decade or more.

“Virtual Yosemite is a project that is intended to both celebrate and preserve one of our most precious natural environments,” Highton said. “That provides value to everyone, whether financial or not.”

Read on to find out about some of Highton’s favorite images from the project.

An all-night “Star Party” gathers under the clear skies above Glacier Point, which overlooks Yosemite Valley at an elevation of 7,214 feet. These events are hosted every few weeks during the summer months by various astronomy groups, whose members bring their high-powered telescopes and other equipment for the public to look through and explore the night skies. In this 360-degree panoramic image, the Milky Way is clearly visible, along with light trails from commercial airliners flying high above the park. The green beams of light between ground and sky are laser pointers used by the astronomers to guide participants’ views toward certain stars and constellations. Participants are encouraged to use red-filtered flashlights to help preserve their night vision.

Grand Dining Room

The majestic interior of the Grand Dining Room of Yosemite's historic Ahwahnee Hotel, home to the annual holiday Bracebridge Dinner at Yosemite. The hotel was designed in the 1920s by famed architect Gilbert Stanley Underwood and was built in only 11 months at a cost of $1.225 million (about $18 million in today’s dollars). It has been host to presidents, royalty, celebrities and countless “normal folk” from around the world. (©2016 Photography by Scott Highton)

Virtual Yosemite

A screen grab from the Virtual Yosemite website shows the base of the 400-foot-long “cable route” to the summit of Half Dome (elevation 8,842 feet). For hikers, this is the last section of the ascent of what is ultimately a 14- to 16-mile round trip. It is a turnaround point for many, who are exhausted from the long slog to get there, the high altitude and the “pucker” factor of facing the steep climb over polished granite to the summit. The route has become so popular in recent years that the National Park Service instituted a lottery system for daily hiking permits. To mitigate the climber “traffic jams” that used to occur at the cable section, often resulting in waits of an hour or longer, only 275 people per day are now permitted to climb Half Dome during the summer season. (©2017 Photography by Scott Highton)

Yosemite Website

The opening screen of Virtual Yosemite features a dramatic view of the sheer granite face of Half Dome from a location known as the Diving Board, made famous by Ansel Adams’ 1927 photo, “Monolith, The Face of Half Dome.” There is no official trail to this overhang, and getting there requires off-trail wilderness travel, as well as basic climbing skills. (©2013 Photography by Scott Highton)

Yosemite Falls

Another Virtual Yosemite image captures the view over the precipice of Yosemite Falls into Yosemite Valley, almost 2,500 feet below. Comprising three sections, Yosemite Falls is the tallest waterfall in California, the sixth-tallest in the U.S., and the 20th-tallest in the world. There are over 800 miles of trails in Yosemite, and many of them lead to spectacular viewpoints like this one. Water flow over Yosemite Falls is usually at its peak in late spring and early summer, and generally stops completely by the end of the summer. It flows again with the arrival of rains and snow in winter. During particularly cold winters, a “beard” of ice formed by freezing spray surrounds the granite face of the waterfall. Each of the small red targets in the screen image is a link to an additional location in the park that Virtual Yosemite viewers can transition to. (©2016 Photography by Scott Highton)

https://today.umd.edu/virtual-valley-2da0b685-2648-4875-8a64-25f2dd77b6b0

MPLEXteam
Photo by John T. Consoli
Mike Sorokin and Galen Stetsyuk are working to combat VR nausea as they develop “Core Disruption,” which they hope will be the first commercial video game released by their company, MPLEX.

Two Terps Take on Virtual Reality’s Big Bugaboo

By Maya Pottiger ’17, M.Jour. ’20

September 25, 2019

Rumbling across a dark landscape, an enemy tank appears in the distance. Your heart rate picks up, and as you take aim, preparing to destroy the intruder … you can’t take another second of this.

You pull off your VR mask, the battlefield fading away as your living room materializes in front of you, and take deep breaths, fighting nausea.

Simulation sickness affects the majority of VR users, but virtual reality gaming doesn’t have to end with losing your lunch. Two Terps, Mike Sorokin ’18 and Galen Stetsyuk ’20, are working to combat nausea in this realm as they develop “Core Disruption,” which they hope will be the first commercial video game released by their company, MPLEX.

“Our goal with MPLEX is to realize the full potential of immersive technologies,” Stetsyuk said. “(VR is) bringing millions of people to a new platform; it’s not something that’s ever easy.”

Sorokin and Stetsyuk trace their interests in game development to their childhoods: In middle school, Sorokin started hacking into servers for popular multiplayer online games to get experience with coding languages. Similarly, Stetsyuk focused on ways he could’ve improved the games he was playing.

MPLEX
Photo by John T. Consoli

The pair met as freshmen in 2014, and quickly recognized their mutual interests in gaming and computer science. Soon, they started attending hackathons together, and after UMD alum and Oculus VR co-founder Brendan Iribe donated VR headsets to the university, Stetsyuk and Sorokin became interested in VR’s surging potential. That year, they founded MPLEX, a virtual reality entertainment company. Stetsyuk is the CEO and Sorokin is the CTO.

(And the name MPLEX? It’s a “complex of M’s,” Stetsyuk said—a combined reference to Maryland, the first letter of Sorokin’s given name, “Mike,” the initial of another partner who has since left, and the first letter of Marcus, the moniker Stetsyuk planned to assume. The name change plan fizzled, but the company name stuck.)

In June, the pair received $250,000 in seed funding from businessman and philanthropist Robert Hisaoka '79, with the investment covering new equipment and staff for the company.

They’re not the first game developers to attack the problem of virtual reality sickness, which could limit consumers’ willingness to buy into the technology that many believe could be the next big wave in entertainment.

Even when using high-quality VR technology, Stetsyuk himself still experiences simulation sickness. Proposed solutions have included limiting players’ in-game movement or constricting their field of vision. Both negatively impact gameplay, Stetsyuk said; MPLEX’s solution is different.

“We actually have a pretty clever way of handling the different types of movement in such a way that the environment reflects your real-world environment,” Stetsyuk said. “If you’re sitting upright in your living room, we want to make sure that in the game world, you’re always in an upright position, even when you’re on a tilted axis on a vehicle, like if you’re going up or down a hill.”
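What Stetsyuk describes amounts to decoupling the camera’s notion of “up” from the vehicle’s tilt. As a rough illustration only (a minimal sketch with assumed vector conventions, not MPLEX’s code), a game can let the camera inherit the vehicle’s heading while its up vector stays pinned to the world’s:

    # Minimal sketch (not MPLEX's code): keep the VR camera's horizon level
    # while a vehicle pitches and rolls, so a seated player stays visually
    # upright. The axis conventions here are assumptions.
    import numpy as np

    def yaw_only(vehicle_forward):
        """Project the vehicle's forward vector onto the ground plane, so
        the camera inherits heading (yaw) but not pitch or roll."""
        flat = np.array([vehicle_forward[0], 0.0, vehicle_forward[2]])
        norm = np.linalg.norm(flat)
        if norm < 1e-6:                       # vehicle pointing straight up/down
            return np.array([0.0, 0.0, 1.0])  # fall back to world forward
        return flat / norm

    def camera_basis(vehicle_forward):
        """Build a camera frame whose 'up' is always world up."""
        forward = yaw_only(vehicle_forward)
        up = np.array([0.0, 1.0, 0.0])        # world up, never the vehicle's up
        right = np.cross(up, forward)
        return right, up, forward

    # A vehicle climbing a 30-degree hill still yields a level camera.
    pitched = np.array([0.0, np.sin(np.radians(30)), np.cos(np.radians(30))])
    right, up, forward = camera_basis(pitched)
    print(up)  # [0. 1. 0.] -- the horizon stays level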

Holly DeArmond, managing director of the Dingman Center for Entrepreneurship, said Sorokin and Stetsyuk have a clear vision for addressing the problem from scientific and business standpoints.

“Not only are they committed to the business, they want Maryland to be a leader, and they want the students here to be educated and experts in this area,” she said.

As they work to launch their own game, Sorokin and Stetsyuk are trying to create opportunities for fellow Terps as well.

The pair founded the VR Club on campus, which grew to 300 members in its first year. (It has since merged with the AR Club to form the XR Club.) Last semester, they also taught a Student-Initiated Course on game development, which included some of the techniques they’ve discovered for reducing simulation sickness.

Lucien Parsons, director of the Mixed/Augmented/Virtual Reality Innovation Center (MAVRIC) on campus, has helped guide Stetsyuk and Sorokin over the last few years. Parsons has 15 years of experience as a video game developer, and he said it has been gratifying to watch MPLEX find its direction and see its success.

“What they are trying to do is very hard,” Parsons said, “and they have been open to constructive criticism and to learning from their mistakes, which is a huge part of building that first game.”

https://today.umd.edu/rx-vr-nausea-e116e0a1-0cf1-4507-a140-550e881801e4

Maryland Today Arts & Culture

Immersive Digital Settings Form Backdrop for Violin Professor's Performances

By Tom Ventsias

July 9, 2019

VRViolinPerformance
A viewer uses virtual reality goggles to watch Irina Muresanu, associate professor of violin, perform in a virtual environment, also visible on a TV screen. Muresanu, below, celebrates diverse musical cultures with her "Four Strings Around the World" album and performances.

The playful notes of “Tango Etude No. 3” by Argentinian composer Astor Piazzolla dance amid a lush backdrop of manicured hedgerows and crimson azaleas at the National Arboretum in Washington, D.C.

Irina Muresanu, an associate professor of violin in the University of Maryland School of Music, adds another burst of color, wearing a floral gown as vibrant as the piece she’s performing directly in front of you.

Yet neither you nor Muresanu is actually there.

What you’re seeing is a hologram of the artist, who—with sound flowing from her 1849 Giuseppe Rocca violin—has been digitally captured and transported to this and other locations that represent musical compositions from different cultures.

This new virtual reality experience is a collaboration between the university’s College of Arts and Humanities and the College of Computer, Mathematical, and Natural Sciences (CMNS). It builds on Muresanu’s “Four Strings Around the World” project, a studio album and series of live concerts that celebrate diverse musical cultures through the unifying voice of the solo violin.

Muresanu said this virtual adaptation of Four Strings offers a trove of possibilities in both musical education and performance.

“We could have only dreamt of something like this several years ago,” she said, giving the example of virtually performing in the exact location that inspired a musical masterpiece. “But that’s what we’re supposed to do as academics—take the dream one step closer toward reality.”

Making the dream happen, however, meant overcoming several technical and logistical challenges.

Television stations and movie productions have long used green screens to superimpose weather anchors in front of forecast maps, or place actors in elaborate settings that are impractical to film. But that technology is good only for two-dimensional viewing on a television or movie screen, or more recently, on a smartphone.

Full immersion, where viewers can experience a scene in 360 degrees as if they were there in person, requires a much higher level of technical proficiency, said Amitabh Varshney, a professor of computer science and dean of CMNS.

The innovation for the violin project came out of the Maryland Blended Reality Center (MBRC), where, among other efforts, researchers are developing new immersive technologies to capture the intricate hand movements of surgeons at the R Adams Cowley Shock Trauma Center in Baltimore. “We’re focused on accurately representing the highest levels of nuances and details, so that it can be used as a teaching tool,” Varshney said.

The MBRC team applied the same technology used to record a surgery to virtually render the rapid hand and bow arm movements of Muresanu’s violin playing. They also incorporated spatial audio, meaning 3D sound that “moves” in tandem as a listener turns his or her head or looks up and down.
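Spatial audio of this kind starts by re-expressing each sound source in the listener’s head frame every time the head moves. A minimal sketch of that step (my illustration with assumed conventions, not the MBRC pipeline):

    # Toy spatial-audio step: rotate a source into the listener's head frame,
    # then derive a simple left/right level difference (equal-power panning).
    import numpy as np

    def rotation_y(yaw):
        """Rotation matrix for the listener's head yaw about the vertical axis."""
        c, s = np.cos(yaw), np.sin(yaw)
        return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

    def stereo_gains(source_pos, head_yaw):
        """Pan a mono source between the ears based on where it sits
        relative to the listener's current head orientation."""
        head_relative = rotation_y(-head_yaw) @ source_pos
        azimuth = np.arctan2(head_relative[0], head_relative[2])  # + = to the right
        pan = (np.sin(azimuth) + 1) / 2        # 0 = hard left, 1 = hard right
        return np.sqrt(1 - pan), np.sqrt(pan)

    # A violin two meters straight ahead sounds centered until the head turns.
    left, right = stereo_gains(np.array([0.0, 0.0, 2.0]), head_yaw=0.0)
    print(round(left, 2), round(right, 2))     # 0.71 0.71 -- dead center

Real systems add head-related transfer functions and elevation cues, but the head-relative transform above is the piece that makes sound “move” when the listener turns.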

After filming Muresanu on a soundstage performing three pieces from Four Strings, Varshney’s team hit the road to film the immersive background settings needed to complement the music.

The National Arboretum in peak bloom represents the joy of Piazzolla’s South American tango. Three majestic New York City cathedrals are the backdrop for Bach’s powerful “Chaconne” from the Partita in D minor.

For composer George Enescu’s “Airs in Romanian Folk Style,” Muresanu had a special request: Growing up in Bucharest, she had always wanted to perform in the Romanian Athenaeum, one of Europe’s grandest concert halls. When Varshney’s technical team was unable to find suitable 360-degree footage from inside the building, Muresanu gained access for a professional European crew to film it.

Now Muresanu and Varshney are seeking private support for the additional technical and staffing resources needed to film an entire virtual concert. They believe the technology will be useful for teaching: Violin students from anywhere in the world can analyze—multiple times at any speed from any angle—the motions of Muresanu’s hands and bow arm.

Virtual reality technology also helps democratize the performing arts, Varshney said. Attending a live concert performance can be expensive and inconvenient, if not impossible for, say, hospital patients or those with low incomes. “We don’t want anyone to be deprived of these amazing gems of human performances that can lift people up in a very dramatic way,” he said.

https://today.umd.edu/four-strings-around-virtual-world-34315b13-a2af-4c80-9685-d83a65c2bad2

Maryland Today Research

UMD Researcher’s VR App Helps People With Autism Learn to Interact With Police

By Chris Carroll

September 12, 2019

Autism VR App
VR image courtesy of Floreo
UMD computer scientist Vibha Sazawal, below, helped create an app designed to help teach autistic people how to navigate social situations, like police encounters, that are difficult for them.

Down the street from a crime scene, a police officer approaches a young man who’s behaving oddly. He refuses to make eye contact with the officer, whose questions go unanswered. Then the young man turns and runs.

To an untrained eye, it appears the officer has located the suspect—and simply needs to give chase. But instead, he or she may have found someone with autism. The neurological disorder, often characterized by difficulties with communication, social interaction and sensory processing, can make it hard to follow directions and calmly interact.

And when interactions with the police go wrong, it can turn tragic. In a notorious 2016 incident in Florida, an officer shot and wounded a caregiver after the autistic man he was with didn’t comply with orders to put down a toy train mistaken for a weapon. In another incident this June, an off-duty police officer killed a nonverbal man with cognitive disabilities he said had attacked him in line at a store, and shot the man’s parents as well.

Fears about such incidents helped inspire University of Maryland computer scientist Vibha Sazawal’s idea for a virtual reality app so people with developmental disabilities can safely practice dicey social situations, like police stops, that can baffle them. Sazawal’s son, Manoj, 9, has autism.

“You don’t know how many unwritten social rules there are to follow until you see what happens when someone doesn’t know what the rules are,” says Sazawal, a lecturer in the Department of Computer Science and a visiting research scientist in the Institute for Advanced Computer Studies (UMIACS).

Starting in 2016, she developed the app, Floreo, with Manoj’s father, Vijay Ravindran, a software engineer and former engineering director at Amazon. She had been an assistant professor of computer science at UMD until she resigned to care for Manoj several years ago; she left Floreo to return to the university last year.

To practice interaction in a virtual world, Floreo users wear a VR headset while a therapist, teacher or parent controls the session. Users encounter cartoony police officers—first a friendly female cop who gently questions the user on a peaceful sidewalk during daylight hours. In later stages, more insistent male officers interrogate the user on a darkened street with flashing lights, sirens and loud noises that could put anyone on edge. In each case, the user is asked to choose the correct response.

“When my son first tried it, the virtual police officer asked him a question, and he responded, ‘I really like your car,’” Sazawal said. “That’s not an answer to any of the questions. But we start at that first level and practice these skills before moving to higher levels where it’s more challenging.”

Some research suggests that immersive virtual reality training environments have advantages over other training simulations, and Sazawal is collaborating with researchers at Children’s Hospital of Philadelphia and St. Joseph’s University in ongoing research to determine whether Floreo’s police interaction program beats video training programs designed to impart the same skills.

Researchers have divided study participants into one group that uses Floreo and another that watches videos. Afterward, they’re tested in interactions with real Philadelphia police officers following predetermined scripts to see whether either program effectively taught the autistic study participants any useful skills.

Floreo is more than just an educational game, said Joseph McCleery, an assistant professor of psychology and executive director of academic programs in the Kinney Center for Autism Education and Support at St. Joseph’s University in Philadelphia.

“The thing that makes Floreo unique is the fact that it has the iPad link to it in real time,” said McCleery, who was part of the app’s development team. “It’s actually a lesson where a certified teacher or a parent gets to control what happens.”

Although there are safety concerns about the overuse of VR in children, whose visual systems can be harmed, McCleery said, such training can be used judiciously to prime users for practice sessions with actual officers, which are even rarer: “Generally, you don’t have police officers who can come into the classroom all the time, but perhaps you practice in virtual reality a few times and follow it with a real-life experience.”

With clinical testing still in progress, Sazawal said she’s already seen enough to know virtual reality training—with its tightly controlled environments that can be tuned to minimize stress for a population group prone to overstimulation—has lots of promise.

“Even on some basic skills, like making eye contact, we’ve seen that after using Floreo for a while and then observing them in real life, people do improve,” she said.

Sazawal hopes that eventually, scripted experiences in the virtual world of the Floreo app will open the door to safer, fuller lives for Manoj and other autistic children.

“There’s a juggling act—wanting him to have all kinds of life experiences on one hand, but wanting to shield him from potentially negative interactions,” she said.

https://today.umd.edu/practice-thats-no-game-5740f5e3-72a4-447f-8926-b9e47102da9a

Maryland Today Arts & Culture

Art Installation to Illustrate Climate Change Unexpectedly Disrupted—by Climate Change Itself

By Sala Levin ’10

Sep 18, 2019

Arctic Ice Project Cy Keener
Photo by Stephanie S. Cordle
Assistant Professor of Art Cy Keener installs part of his work tracking Arctic ice melt at the VisArts gallery in Rockville, Md.

Cy Keener stepped out onto the nearly monochromatic, frozen landscape surrounding the northernmost city in the U.S. The scene outside at Utqiagvik, Alaska (formerly known as Barrow) was breathtaking this April morning: Giant, fractured blocks of sea ice loomed over the assistant professor of art, and the stillness was at odds with the ocean that churned silently and invisibly beneath the surface.

Keener was at once awed and melancholic. He knew that this vast expanse of ice at the top of the world—the oldest sections of which have shrunk by 95% since 1980—could vanish within a few decades.

Some 3,400 miles away, in a street-facing window of the Rockville, Md., VisArts center, Keener planned to visually document this ice from May to September as it slowly thinned. Using sensors buried two meters into the ice, Keener and his collaborator, Justine Holzman of the University of Toronto, intended to track its thickness daily, transforming that information into “Sea Ice 71.348778º N, 156.690918º W,” an art installation in which hanging strips of 6-foot-long, blue-green polyester film would reflect the depth of the ice. Over the warm months, the lengths of the ever-growing number of strips—Keener added new ones every four days—would dramatically shorten.
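The mapping from sensor to sculpture is, at its core, a simple scaling. A back-of-the-envelope sketch of how a day’s reported thickness could set that day’s strip length (my reading of the description; the specific numbers are assumptions, not Keener’s actual process):

    # Hypothetical data-to-sculpture mapping: scale each day's sensor-reported
    # ice thickness to the length of that day's hanging film strip.
    def strip_length_feet(thickness_m, full_thickness_m=2.0, full_strip_ft=6.0):
        """Linear mapping: sensors buried two meters deep -> a full 6-foot strip."""
        clamped = max(0.0, min(thickness_m, full_thickness_m))
        return clamped / full_thickness_m * full_strip_ft

    # Thinning ice yields visibly shorter strips as the months pass.
    for day, thickness in [("May 1", 1.8), ("Jun 15", 1.1), ("Aug 1", 0.4)]:
        print(day, round(strip_length_feet(thickness), 1), "ft")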

But then, two snags: First, a polar bear destroyed one of the two sensors. (Standard job hazard.) Then, the piece of ice containing the second sensor detached from land and floated out to sea in mid-June. The ice further broke up, and the buoy traveled into open water. Keener could no longer receive data about ice thickness—unprecedented warming had already melted the ice he was depending on.

“Of course, I was disappointed, but I also think it’s indicative of what’s going on in the sense that in past years that ice might not have broken off” until much later, said Keener.

Instead of hanging strips, Keener made a series of six 30-by-70-inch maps of Arctic sea ice extent for 2019 to compare with sea ice extent in 2007.

Trained as both an artist and architect, Keener has long been interested in how technology, art and the environment intersect. He’s used sensors to track the movement of stones along a riverbed during flooding and buoys to monitor ocean currents. While working on a glacier project, Keener met a researcher from the National Ice Center, who linked Keener with a National Science Foundation-funded Arctic expedition.

“One of the struggles of art that tries to engage in issues like climate is that it gets cloistered away in a gallery setting … where not that many people go,” said Keener. “I liked the idea of the street being the audience, as opposed to whoever wandered into [a] gallery. It’s a 3-D billboard for melting Arctic ice.”

https://today.umd.edu/cold-hard-act-7c6279fc-5942-4a46-8508-737d4b1385de

2018

Maryland Today

UMD Artists, Researchers Open New Views of Opera

By Chris Carroll

October 1, 2018

For decades, audiences have watched from their seats as a group of nuns are marched to the guillotine at the climax of Francis Poulenc’s 1956 opera “Dialogues of the Carmelites.” 

Now, thanks to a collaboration between the Maryland Opera Studio and the Maryland Blended Reality Center, they can be on stage with the performers through virtual reality (VR) technology. 

It can get almost uncomfortably intimate, as when a doomed young novice sings a hymn, seemingly staring into the eyes of the viewer only two feet away. The viewer is wearing a VR headset, but too immersed in the performance to even notice.

“You can look at the micro-expressions of the performers, you can see the gleam in their eye, and really establish empathy with them,” says Amitabh Varshney, a professor of computer science, dean of the College of Computer, Mathematical, and Natural Sciences and one of the project’s leaders. 

Allowing viewers to teleport through 360-degree views of the stage and even the orchestra pit with VR can enhance the experience for opera newbies and seasoned viewers alike, says Craig Kier, director of the Maryland Opera Studio, who’s also leading the research.

“We see this not only as an access point for someone who’s not familiar with opera, to demystify it, but for someone who is familiar—to really put them in the driver’s seat right in the middle of it all,” Kier says. “The complexity of this entire art form invites a more immersive experience.”

But, Kier says, he and Varshney are ever-mindful of the need not to degrade the traditional opera experience. So before incorporating VR imaging in public performances, they need to find ways, for example, to hide cameras still visible onstage in the Poulenc opera filming—maybe in scenery, maybe someday in tiny flying drones unnoticeable to the audience. 

Although grounded in art, the project is an offshoot of broader research in medical uses of virtual and augmented reality between the University of Maryland and the University of Maryland, Baltimore through the MPowering the State initiative that combines the strengths of both institutions. In this case, they’re exploring whether VR representations of artistic performances can lessen hospital patients’ need for drugs to control pain and anxiety.

“We feel like we are in the early stages of a new genre of visual communication,” says Varshney. “You can use it for experiencing art in a new way, or perhaps a patient who has to be isolated can use it to be with their loved ones virtually—there are so many possible uses for it.”

https://today.umd.edu/virtually-onstage-1566ab40-ed91-47fb-8a48-5e7e3ecae827

Iciar Andreu

September 6, 2018

Story by Maria Herd

A passion for both film and technology led Iciar Andreu to the University of Maryland, where she graduated in May with a dual degree in film studies and computer science.

The 22-year-old native of Spain first considered UMD because of its strong computer science program.

“Computer science was always a good option for me because it’s something I’m good at and its applications are very broad,” she says.

Studying film as a second major was a simple choice for Andreu—movies piqued her interest from an early age.

Her family’s move from Madrid to the suburbs of Washington, D.C. at age 14 only increased her appreciation for the big screen.

“It was a big change,” she recalls. “It actually helped with my love of movies, because if you don’t see them translated, the acting is better.”

Andreu’s best use of her film-meets-computer skillset came during a 12-month stint at the Maryland Blended Reality Center (MBRC), where computer scientists are partnering with others to develop visual computing technologies for healthcare and high-impact training programs.

Working throughout her senior year and the summer after graduation, Andreu contributed to MBRC projects that explored virtual reality (VR) for implicit bias training and transformed how an opera performance can be experienced.

“We hired her as a student worker because of her film background combined with strong skills in computer science, which is not that common,” says Barbara Brawn-Cinani, the associate director of MBRC, who oversees the center’s day-to-day activities.

Andreu’s talents were quickly put to good use, says Brawn-Cinani. She assisted a team that partnered with UMD sociologists and Prince George’s County Police Department to develop a series of VR training videos to help police officers recognize implicit bias.

Implicit bias is an unconscious attitude toward a social group, and has recently become increasingly significant to law enforcement due to a greater public awareness and outcry over police-involved shootings.

“The goal of this type of training isn’t to point fingers, but to help officers become more self-aware that they may have an implicit bias,” says Amitabh Varshney, professor of computer science and dean of the College of Computer, Mathematical, and Natural Sciences. “Lately, we have seen a lot of divide in society, and we’re hoping this will help heal that.”

Varshney is co-director of the MBRC, a position in which he actively establishes partnerships that use the power of VR and other immersive media tools to improve healthcare and education, as well as enhance the visual and performing arts.

For her role in the implicit bias project, Andreu filmed three separate scenarios of a police officer’s interaction with actors of varied races and genders, resulting in a series of virtual simulations that represent what officers may encounter while on duty. One scene had an officer stop a car, then come to the window to ask for license and registration. Another was stopping someone on the street who was holding an item in their pocket that could be a phone or weapon. In the third scenario, an officer stops someone who may be on drugs or have a disability that is causing them to act irrationally.

Andreu says that one of the biggest challenges was keeping the 360-camera steady while filming from the officer’s point of view. She gives the example of moving the camera up to the car window during a traffic stop.

“The problem with VR is that if the camera is shaky, you can get dizzy really fast looking through the headset,” she says. “We bought stabilizers and tried different things involving both hardware and software.”
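On the software side, one common stabilization strategy is to low-pass filter the camera’s orientation track so frame-to-frame jitter never reaches the viewer’s headset. A toy sketch of that idea (an assumption on my part, not the MBRC pipeline):

    # Toy software stabilization: smooth per-frame camera angles with an
    # exponential moving average so small hand jitters are filtered out.
    import numpy as np

    def smooth_orientations(angles, alpha=0.15):
        """Low-pass filter a sequence of (yaw, pitch, roll) camera angles in
        radians; smaller alpha means heavier smoothing. Frames would then
        be re-rendered with the smoothed angles instead of the raw ones."""
        smoothed = [np.asarray(angles[0], dtype=float)]
        for frame in angles[1:]:
            prev = smoothed[-1]
            smoothed.append(prev + alpha * (np.asarray(frame) - prev))
        return np.array(smoothed)

    # Simulated jittery approach to a car window: the smoothed track shakes
    # far less from frame to frame than the raw one does.
    rng = np.random.default_rng(0)
    raw = np.cumsum(rng.normal(0, 0.02, size=(120, 3)), axis=0)
    stable = smooth_orientations(raw)
    print(np.abs(np.diff(raw, axis=0)).mean() >
          np.abs(np.diff(stable, axis=0)).mean())  # True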

Andreu filmed all of the scenes, edited them, and then stitched all of the audio and video together. Other MBRC staff helped design software interfaces that monitor officers’ reactions including eye movement, heart rate, stress levels in their voice, and whether they reach for a weapon.

All 1,800 officers on the Prince George’s County police force are undergoing training through Andreu’s modules.

“The scope of the project is really amazing. That’s a very big contribution for a young researcher,” says Brawn-Cinani. “The level of work she produced was at the graduate or postdoc level.”

Andreu also used her skills as a videographer and film editor to capture performances by the Maryland Opera Studio, a graduate-level program at UMD that trains people for careers with professional opera companies worldwide.

The VR opera project involved filming portions of dress rehearsals for two famous operas—Dialogues of the Carmelites and La Clemenza di Tito—with multiple 360 cameras. Using innovative technology developed by MBRC staff, viewers wearing VR headsets can transport themselves to different vantage points on stage to experience the beauty and emotion of an opera performance from new angles.

“Opera is the most complex performing art that exists because there’s the orchestra that’s playing in the pit, the singers on stage, the principal singers, the chorus, and then there’s sets and costumes,” says Craig Kier, director of the Maryland Opera Studio. “We bring all of these things together. To capture it in a way that is more than by photos—in a way that is actually enhancing your experience as either a singer on stage or as someone that is witnessing it—is remarkable.”

One technological hurdle, says Andreu, was filming and stitching together 360-degree video that was taken on a large stage with varied lighting and sound.

“You get these bright lights coming [from] above and the rest is dark. It’s very difficult to have a middle ground with that,” she explains.

Sida Li, a research programmer in MBRC who collaborated with Andreu on several projects, emphasized how helpful it was for her to have a background in both film and computer science.

“It made the communication between her and us very easy because all of these videos she produced were going into our software pipeline,” he says.

Andreu’s work with MBRC has not gone unnoticed. Her team presented their implicit bias work at an event for state legislators in Annapolis highlighting ongoing research funded by the University of Maryland Strategic Partnership: MPowering the State, which provided significant resources to launch MBRC.

For the opera VR project—in addition to their work being highlighted in a video clip and in the university’s Terp alumni magazine—the MBRC team joined singers from the Maryland Opera Studio for a unique performance before the University System of Maryland Board of Regents.

The opera singers performed several numbers followed by the MBRC team presenting the same performance in virtual reality.

Andreu says she especially enjoyed explaining the MBRC technology to board members, most of whom had not yet experienced VR, and were therefore caught off guard by the lifelike visual immersion.

“Since the camera is right at the edge of the stage, they feel like if they take a step back they’re going to fall into the orchestra pit,” she says.

According to Andreu, one of the current challenges in the field of VR is that there are not yet strong programs for editing video, so she created one herself.

“Jokingly, I called it Bravo,” she says, borrowing the word Spanish speakers use for “good job” after a performance.

When she wasn’t in class or at MBRC, Andreu could be found at a Terrapins game filming, editing and directing video for the Big Ten Network. As a student worker, she helped capture soccer, lacrosse, wrestling, basketball, field hockey and volleyball games.

Andreu started graduate school this fall at the University of Pennsylvania, pursuing a master’s degree in computer graphics and game technology.

Ultimately, she wants to produce special effects for movies, and this program will give her the skills to excel in that field.

Andreu encourages students who are interested in VR to try out the technology in various capacities.

“Be open to all the possibilities and know that it can fit your interests,” she says, referencing how she worked on both performance art and sociology projects.

Andreu points out that there are different roles to experiment with as well—from directing VR films to designing backgrounds to overcoming 3-D audio challenges.

“There is a lot of interest in immersive technologies by [people and organizations] that can provide significant funding, so there is a huge opportunity here,” she says.

https://www.umiacs.umd.edu/about-us/news/student%E2%80%99s-passion-film-and-technology-leads-innovative-vr-content

July 26, 2018

By Maryland Today Staff

Researchers Conduct In-Depth Analysis of Educational Use of VR

Can VR improve the way we learn? University of Maryland researchers studied the impact of virtual reality on memory recall, comparing virtual reality to more traditional ways of learning information.

May 23, 2017

By Sala Levin ’10

Students Strive to Enhance Museum Experience Using Technology

Step inside Renoir’s beloved “Luncheon of the Boating Party” and pet the little dog perched—rather unhygienically—on the table. Taste the grapes that are all that remain of lunch. Try on one of the women’s flower-festooned hats. Take a swig of wine from one of the already-opened bottles.

Visitors to The Phillips Collection, the famed D.C. museum, may someday be able to do all this, with the first steps taken by students in the University of Maryland’s First-Year Innovation & Research Experience (FIRE) program.

Through the new partnership between the university and the museum, these freshmen are exploring how virtual reality (VR) and other new technologies can enhance visitors’ experience at the museum and their appreciation of the artwork on display.

“UMD and the Phillips are interested in advancing the arts and using the research capacity of UMD to aid in that cause,” says David Cronrath, special assistant to the provost and campus liaison with the Phillips. “The FIRE program introduces students to the museum and its range of visual and musical arts, and uses the museum as a research focus to enhance museum-goers’ experience.”

The two-semester Phillips Virtual Culture program is one of 14 research options available to undergraduates enrolled in FIRE. Freshmen first took a general introductory course in the fall before selecting a more specialized course of study; at the end of the spring semester, students in the Phillips Virtual Culture stream presented prototypes of projects they plan to execute next semester.

The course “allows the Phillips to function as a playground for FIRE students because they get a real-world application to test out their ideas,” says UMD-Phillips Collection Postdoctoral Fellow in Virtual Culture Nicole Riesenberger. She’s teaching the course with research educator Kyungjin Yoo and Amitabh Varshney, director of the Institute for Advanced Computer Studies, computer science professor and interim vice president for research at UMD.

In addition to VR research, the 35 computer science majors and one art history major are also learning web development and mobile technology, perhaps leading to an app that could help visitors navigate the Phillips. The museum, known as the nation’s first museum of modern art, houses works by masters including Georgia O’Keeffe, Mark Rothko and Vincent van Gogh.

“You want to have a good balance between engaging the visitors with tech and also allowing them to still appreciate the art how they normally would,” says Mark Keller ’20. His prototype is of an app that would allow visitors to scan QR codes of artwork and then rate whether they liked the work; this information would be used to create a personalized path through the collection highlighting works the visitor is likely to enjoy. (Example: No Pollocks for Degas-lovers.)
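One loose sketch of how such a recommender could work (hypothetical titles, tags and scoring on my part, not Keller’s prototype): weight each art style by the visitor’s ratings, then rank the works they haven’t seen yet.

    # Hypothetical personalized-path scorer: QR-code ratings weight art styles,
    # and unseen works are ordered by the visitor's average score per style.
    from collections import defaultdict

    GALLERY = {  # invented works tagged by style
        "Luncheon of the Boating Party": ["impressionism", "figurative"],
        "Dancers at the Barre":          ["impressionism", "figurative"],
        "No. 9":                         ["abstract-expressionism"],
        "Composition":                   ["abstract-expressionism"],
    }

    def personalized_path(ratings):
        """ratings: {title: 1-5 score from QR scans}. Returns the remaining
        works ordered by how well their styles match the visitor's tastes."""
        style_scores = defaultdict(list)
        for title, score in ratings.items():
            for style in GALLERY[title]:
                style_scores[style].append(score)

        def affinity(title):
            scores = [s for style in GALLERY[title] for s in style_scores[style]]
            return sum(scores) / len(scores) if scores else 3.0  # neutral prior

        unseen = [t for t in GALLERY if t not in ratings]
        return sorted(unseen, key=affinity, reverse=True)

    # A visitor who loves the Renoir is routed to more Impressionism first.
    print(personalized_path({"Luncheon of the Boating Party": 5, "No. 9": 2}))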

For students like Keller—a computer science major who was drawn to the Phillips research stream by his appreciation for art—the course represents the opportunity to meld two fields of study that haven’t often come into contact. “I’ve enjoyed the challenge of determining how to enhance the visitor experience at the Phillips while also not taking away from the gravity of the art,” he says.

https://today.umd.edu/vr-v-art-68eab4c8-aba9-4452-bd68-8ef22cd16260

March 20, 2018

By Sarah Murthi, MD, and Amitabh Varshney

ARSurgery
mathisworks/Hayon Thapaliya/Getty Images

HBR Summary

While medical imaging has radically evolved, how images are displayed is basically the same as it was in 1950. Visual data are always shown on a 2D flat screen, on displays that force health care providers to look away from the patient, and even away from their own hands while operating. Augmented reality (AR), a set of technologies that superimpose digital information on the physical world, has the potential to change all of this. Researchers at the Maryland Blended Reality Center’s “Augmentarium” are prototyping AR applications in medicine, as are teams at Stanford, Duke and Johns Hopkins. In an envisioned application, a surgeon using an AR headset would be able to see digital images and other data directly overlaid on her field of view. The surgeon needn’t look away from the patient to multiple different displays to gather and interpret this information. Thus the technology has the potential to improve care and reduce errors.

Some of the biggest medical advances of the last few decades have been in diagnostic imaging — ultrasonography, mammography, computerized tomography (CT), magnetic resonance imaging (MRI) and so on. The same forces that have propelled technology developments elsewhere — tiny cameras, smaller and faster processors, and real-time data streaming — have revolutionized how doctors use imaging in performing procedures. Almost every surgery involves some sort of scan prior to incision. Even in emergencies, surgeons have ultrasound or CT to help guide the procedure. Imaging can now be performed in real time at the point of care during procedures, both big and small.

Yet, while imaging has radically evolved, how images are displayed is basically the same as it was in 1950. Visual data are always shown on a 2D flat screen, on displays that force health care providers to look away from the patient, and even away from their own hands while operating. Further, the images are not displayed from the perspective of the viewer, but rather from that of the imaging device: doctors have to use skill and imagination to understand and mentally project the images into the patient while they are doing procedures. Finally, different types of visual data are displayed separately, so doctors have to direct additional attention to mentally fusing multiple image types, such as angiography and CT, into a coherent representation of the patient. Acquiring this skill takes years of training.

Augmented reality (AR), a set of technologies that superimpose digital information on the physical world, has the potential to change all of this. In our research at the Maryland Blended Reality Center’s “Augmentarium,” we are prototyping AR applications in medicine, as are teams at Stanford, Duke and Johns Hopkins. In an envisioned application, a surgeon using an AR headset such as Microsoft’s HoloLens would be able to see digital images and other data directly overlaid on her field of view. In such a scenario, the headset might display a hovering echocardiogram with vital signs and data on the characteristics of the patient’s aneurysm directly above the surgical field. The surgeon needn’t look away from the patient to multiple different displays to gather and interpret this information.
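The hard part of such an overlay is registration: expressing a point from the scan’s coordinate system in the headset’s current view. A toy sketch of that chain of transforms (illustrative only; clinical systems add tracking, calibration and far more care):

    # Toy AR registration: map a scan-space landmark into headset view via
    # assumed rigid transforms, then project it onto the display.
    import numpy as np

    def make_rigid(rotation, translation):
        """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
        T = np.eye(4)
        T[:3, :3], T[:3, 3] = rotation, translation
        return T

    # Assumed calibrations: scan space -> tracked patient space -> headset space.
    scan_to_patient = make_rigid(np.eye(3), np.zeros(3))
    patient_to_headset = make_rigid(np.eye(3), np.array([0.0, -0.1, 0.5]))

    def project_to_display(scan_point, focal=800, cx=640, cy=360):
        """Pinhole projection of a scan-space point into display pixels."""
        p = patient_to_headset @ scan_to_patient @ np.append(scan_point, 1.0)
        x, y, z = p[:3]
        return (focal * x / z + cx, focal * y / z + cy)

    # A landmark 2 cm deep in the patient lands at these display pixels:
    print(project_to_display(np.array([0.01, 0.05, 0.02])))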

AR’s potential ability to concurrently display imaging data and other patient information could save lives and decrease medical errors. This is especially true for procedures done outside an operating room. The OR may be the safest place in the hospital, where one patient has an entire team of 4 to 8 dedicated doctors and nurses. Because everyone has pre-operative imaging, the procedures are generally well-planned. Anesthesiologists monitor the patient’s physiology and administer pain-controlling and life-saving medications. Surgical nurses make sure all of the necessary equipment is immediately available. Surgeons can be completely immersed in the operative task. But time in the room is extremely costly, and ORs are solidly booked with elective cases. Elective operations are an essential source of revenue for all hospitals, so there is incredible pressure to keep ORs full and flowing. Small, emergent procedures do not easily fit into this reality. As a result, many of these procedures are done outside the OR in intensive care units and emergency departments. It’s during these “bedside procedures” that patients may be most at risk and where AR could provide some of the greatest benefit.

https://hbr.org/2018/03/how-augmented-reality-will-make-surgery-safer

The Diamondback

Prince George’s County Police are getting new implicit bias training

February 15, 2018

By Leah Brennan

In March, Prince George’s County Police Department officers will head to the University of Maryland for a new implicit bias training.

The training, which will run from March until November, aims to help the department’s 1,700 sworn members examine and confront their implicit biases, which are biases that people are “unable or unwilling to admit,” said Rashawn Ray, a sociology professor at this university and one of the head researchers behind the training, in a Feb. 2 press conference.

“We think it has the potential to serve as a national model,” PGPD spokeswoman Jennifer Donelan said. “We hope as we’re able to garner information and sort of gauge how the training is going, that others follow suit.”

Kris Marsh, another sociology professor and head researcher, said in the press conference that 50 different officers will come to this university every Tuesday for about 10 hours and experience the training in three major components — standard lectures, discussions and virtual reality scenarios.

The virtual reality technology enables researchers to collect physiological data, which allows them to show officers when their heart rate or pulse may be elevated, Marsh said in the conference. Researchers then provide that information to the training academy, which builds additional models and further trains officers to work through these points, she added.
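In its simplest form, that feedback is a thresholding problem: compare heart-rate samples from a scenario against the officer’s own baseline and flag the moments that stand out. A minimal sketch (assumed data format and threshold, not the researchers’ system):

    # Toy physiological flagging: mark timestamps in a VR scenario where an
    # officer's heart rate climbs well above their own resting baseline.
    import statistics

    def flag_elevated(samples, baseline_bpm, threshold=1.15):
        """samples: [(seconds_into_scenario, bpm)]. Returns timestamps where
        heart rate exceeds the baseline by the given ratio (15% by default)."""
        return [t for t, bpm in samples if bpm > baseline_bpm * threshold]

    session = [(0, 72), (30, 75), (60, 96), (90, 103), (120, 80)]
    baseline = statistics.mean(bpm for _, bpm in session[:2])  # calm opening
    print(flag_elevated(session, baseline))  # [60, 90] -- moments to debrief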

Officers will also undergo a debriefing at the end of each session, and will be able to log into a website to see group-level data, as well as videos of researchers explaining that data, Marsh wrote in an email. Data will be presented at the group level to protect officers’ anonymity, she added.

“We’re not trying to pick on the one who isn’t doing well, but look at all the ones that are doing well and see how we can train better for the future for other officers to think about how they have biases and to move past those biases so we can move towards a bias-free policing,” Marsh said at the conference.

Donelan emphasized in the press conference that the training is not just focused on racial implicit bias. Scenarios in the training include situations where people exhibit autistic behavior, or have hearing loss, Marsh said.

“We want the officer to be able to understand the difference between someone who is noncompliant and someone who has a hearing loss,” Marsh said. “We also … built a scenario for someone who is acting out in autistic behavior and we want the officer to be able to understand and see and recognize the difference between someone who is autistic and someone who is noncompliant or nonresponsive.”

The research team has developed over 90 scenarios, Ray said at the conference.

The new training, which has been in the works for more than two years, is in partnership with this university’s sociology department and behavioral and social sciences college, as well as this university’s Institute for Advanced Computer Studies, the MLAW program and the University of Maryland School of Medicine, according to a PGPD press release. Prince George’s County Police Chief Henry Stawinski initiated the plan “to take standard police academy training on implicit bias to a new level,” the release read.

Abir Muhuri, a junior mechanical engineering major, said he worked as an undergraduate research assistant for a lab that used VR to monitor emergency dispatch workers. He said he thinks it would be “really interesting” to implement VR in the PGPD training.

“It provides a lot of data and that’s really useful for just tracking how the training goes and how improvements could be made, besides just qualitatively observing that,” Muhuri said. “Also, [the training is] for a great cause because I think bias affects a lot of people. I’ve heard a lot of personal stories about how bias can just affect someone on a daily basis.”

Sophomore information sciences major Avery Parker said he thinks the VR training is “very important” and that it will “really help out in the field.”

“One of the major shortcomings in training comes up in real life situations — you’ve got to think on the dime. You’ve got to make an immediate decision,” Parker said. “And I think you don’t get that to the full extent in practicals and what not, so I think having the virtual simulations and whatnot will help incoming cadets to actively assess situations without ever being inside them.”

CORRECTION: Due to an error, a previous version of this story said officers would be able to see data and videos the researchers collected. The videos would not be of the training, but of researchers explaining data at the collective level. This story has been updated.

https://dbknews.com/0999/12/31/arc-ypnkiagafzhgxlkmpjvfebdlau/

Formerly located online at: http://www.dbknews.com/2018/02/16/pg-county-police-implicit-bias-training-umd/

2017

In Baltimore and Prince George’s County, researchers are advising police as they attempt to rebuild trust with residents.

Police and universities collaborate on reforms
Genesis Fuentes (left) and Dr. Kris Marsh (right) are part of a team within the University of Maryland Department of Sociology that has been developing implicit bias trainings for police officers. “Once you change one officer at a time then you start to change the culture,” Marsh said. (Capital News Service / Photo by Helen Parshall)
  • By Helen Parshall and Teri West
  • Capital News Service
  • December 18, 2017

BALTIMORE — University professors and students are lending their expertise to local police departments looking for new ways to rebuild relationships and accountability with the communities they serve.

Universities “can explore grant opportunities that we may not be aware of or eligible to apply for without their partnership,” Baltimore Police Commissioner Kevin Davis said in an interview with Capital News Service.

“The marriage of academic institutions and boots on the ground police departments allows us to explore new opportunities as well as new technologies,” he said.

The School of Social Work at the University of Maryland, Baltimore, is partnering with the Baltimore Police Department on community policing efforts while the College Park campus’ sociology department is working with the Prince George’s County Police Department to help officers identify their biases. Johns Hopkins University began a partnership with the police department in 2016 to develop programs aimed at reducing violent crime, according to a university press release.

A broken relationship

The Department of Justice’s 2016 investigation of the Baltimore Police Department, which led to the consent decree between the Justice Department and the police, found that “the relationship between the police and the community in Baltimore is broken.”

“Many residents throughout the City of Baltimore, and particularly in impoverished, primarily minority, neighborhoods, described being belittled, disbelieved, and disrespected by officers,” the report said.

“The consent decree and the DOJ report told us a lot of what we already knew about community-police relations in Baltimore,” said Wendy Shaia, executive director of the Social Work Community Outreach Service at the University of Maryland, Baltimore. “I think this is a real opportunity for us to change those relationships.”

Two School of Social Work projects are in the works.

The first proposal will pair students from the school with officers working in four key “transformation zones” across the city. These zones are areas that had “elevated concentrations of gun-related crimes” in 2016, according to the police department’s website.

The department intends to collaborate with outside organizations in these zones. The idea is that community-driven intervention strategies will reduce violent crimes overall.

The Johns Hopkins Bloomberg School of Public Health is already working with officers in the transformation zone program, according to the police department's website.

The goal of the University of Maryland proposal, Shaia said, is to be able to place one graduate student working as a community analyst in each of the four zones, with a fifth student overseeing the broader mission of the program across all the communities.

“As social workers, we have the perfect set of skills to help bridge the gap between community members and the police,” Shaia said. “We want our students to think about community issues from a macro perspective so that ... they are recognizing all the societal, institutional, and structural issues that contribute to the complex and rippling impacts of poverty and oppression.”

The second project, said Richard Barth, dean of the School of Social Work, is a certificate program that would offer specialized training for police officers on community issues such as intimate-partner violence, suicide, youth violence, child abuse and neglect.

Like police, social workers are “first responders, and they have to be out working in families with many of the same problems,” Barth said. “We work in a lot of the same sectors, with a lot of the same populations as the police.”

Barth said that the School of Social Work surveyed officers to find out what they would want from a certificate program with the support of the police department’s Chief Ganesha Martin and Capt. Rhonda McCoy of the Department of Justice Compliance, Accountability and External Affairs Division. The survey received more than 600 responses, Barth said.

“As the leading state organization for training people that work in protective services, we have a lot of training capacity,” Barth said. “A substantial portion of 911 calls are not about violence prevention, but really are more about social problems that need a response.”

“We see ourselves as an asset,” Barth continued. “We work closely with the police and we would be delighted to do more training, but there are many other ways we can be of assistance to the police, and that involves the possibility of police-social work teams.”

Building ties between officers and citizens in Baltimore is vital to this process, Shaia said, but it also goes further into addressing the “deep and long and wide history of racial segregation and oppression” in Baltimore.

“Baltimore was ground-zero for racialized housing discrimination in the country,” Shaia said. “I think what we’re seeing now is simply the result of our very deep, racialized history in this city. And until we begin to recognize it for what it is, that is always going to create a disconnect between our criminal justice system and our community.”

The School of Social Work has been talking with the Baltimore Police Department since before the consent decree was signed to offer support.

Shaia said that addressing these structural issues — including education, substance abuse, employment, infrastructure and access to housing — holds the key to beginning to tackle crime in Baltimore.

“We can’t just tackle crime by itself. We've been doing that for years, and it hasn't been working,” said Shaia. “It's going to take more than the police department. They are just one part of our infrastructure.”

Help from virtual reality

The University of Maryland, College Park has been working with the Prince George’s County Police Department to develop simulation-based, virtual reality trainings to help officers recognize unconscious attitudes they bring to their policing.

Dr. Kris Marsh and Dr. Rashawn Ray, both sociology professors at the University of Maryland, College Park, study implicit bias — prejudice that develops and plays out unconsciously. Implicit bias has been associated with differences in policing of individuals depending on their race, gender or sexual orientation.

Marsh and Ray have created and begun offering classroom lessons on implicit bias for Prince George’s officers. Now, they are taking their research and merging it with existing virtual reality technology to show officers how their biases can show up on the streets.

In 2016, the sociology professors approached computer scientists in the University of Maryland Institute for Advanced Computer Studies with a proposal to develop a virtual reality simulation to complement their teachings.

“There’s some data that suggests that virtual reality has a longer-term effect than some of this two-dimensional kind of stuff that’s out there,” Marsh said.

The Maryland Blended Reality Center, a collaboration between the university’s College Park and Baltimore campuses that innovates with virtual and augmented reality, embraced the sociologists’ challenge. It anticipates debuting the implicit bias trainings in March 2018, said the center’s associate director, Barbara Brawn-Cinani, who works closely with director Amitabh Varshney.

Officers will wear headsets for training simulations, allowing for close detection of facial expressions of civilians they encounter in each scene, she said.

While other virtual reality police trainings focus on strengthening officers’ decision-making abilities, these trainings aim to make officers aware of their personal biases.

“In an immersive environment where you’re able to track things on a much more refined level those things become very hard to hide,” Brawn-Cinani said. “Sociology’s initial thought was it’s very hard to work with implicit bias on the basis of it being unconscious, and if we have a way to really get into people’s heads it will allow us to help them address that and become aware of this.”

Police departments across the country, including Baltimore, use virtual reality trainings to provide officers a risk-free setting to practice use of force and verbal de-escalation.

Baltimore Police Sgt. Regina Richardson participated in virtual reality scenarios for the first time during her annual training in early December. She said she enjoyed the experience and was pleased when she performed correctly.

“For something where we need to kind of just hold back a second to see what we have before we go blasting, I think it’s a good idea,” she said.

https://cnsmaryland.org/baltimore-university/

October 10, 2017

By Daniel Oyefusi '19

UMD Student Creates VR Prototype of Cold War-Era Berlin for Newseum

The daytime silence on an empty street in East Berlin is complete, and eerie. On one side, Communist propaganda posters are plastered on a brick wall. A barbed wire fence stands on the right, with police on patrol just around the corner.

This is one scene that brings to life the heart-pounding tensions of the Cold War through a new exhibit at the Newseum in Washington, D.C. “Berlin Wall: The Virtual Reality Experience” opened over the summer, inspired by a prototype developed by Mukul Agarwal M.S. ’17, a human-computer interaction graduate student working in the Augmentarium at the University of Maryland Institute for Advanced Computer Studies (UMIACS). 

“We want to connect with visitors on an emotional level and help them understand the values that the museum recognizes: to promote, defend and explain the five freedoms of the First Amendment,” says Mitch Gelman, chief technology officer at the Newseum.

Newseum officials reached out to Amitabh Varshney, professor of computer science and director of UMIACS, to see if the museum could work with UMD to develop a proof of concept using virtual reality technology. Over three weeks last year, Agarwal worked with movie director Cutter Hodierne on the concept and storyline for the model, which Newseum officials then took to HTC, maker of the Vive headset, to create a more polished version.

“The thing that was exciting about this project is that it is being used in a real-world setting,” Varshney says.

Since the Newseum’s 2008 opening just blocks from the U.S. Capitol, an exhibit has featured pieces of the original Berlin Wall and an East Berlin guard tower, as well as images of the wall from news archives as part of its mission to showcase the value of freedom of the press. These artifacts were used to develop the model, but the virtual reality experience provides an added dimension to the historic period. 

With the help of a VR headset and two handheld controllers, Newseum visitors can relive the anxiety of Berlin citizens. Over seven minutes, they’ll find themselves atop a guard tower overlooking the wall, searching for courageous wall-jumpers; sifting through artifacts in a Berlin home, where moving a crate reveals a hidden escape tunnel; and eventually using a sledgehammer to break down the wall that divided Germany for almost 30 years.

“I appreciated working with digital archivists and others at the Newseum on a topic that has so much historical significance,” Agarwal says. “(Working on the model) feels amazing. It feels good that I was a part of it and was able to make it happen successfully.” 

https://today.umd.edu/against-wall-efec68f3-a9a4-420e-ab93-143f5504851d

October 2017

Recently, the U.S. Department of Commerce awarded the University of Maryland $500,000 to launch the Mixed/Augmented/Virtual Reality Innovation Center (MAVRIC). MAVRIC will connect the university’s world-class research expertise and state-of-the-art facilities with the assets of academic, public sector, and corporate partners in Maryland, Washington, D.C., and Virginia. Mid-Atlantic Crossroads (MAX) is proud to support MAVRIC with its robust research and technology infrastructure. With a strong foundation in place, MAVRIC is poised to become the East Coast hub of immersive media.

https://www.thequilt.net/quilt-news/max-technologies-support-umds-mavric-initiative/

University of Maryland plans AR/VR innovation center

UMD received a $500,000 federal grant for the project. The effort will include collaboration with startups and Baltimore universities.

By: Stephen Babcock

September 27, 2017 11:43am

The University of Maryland, College Park received a $500,000 federal grant that will help align efforts around developing immersive media.

With the award from the U.S. Department of Commerce Economic Development Administration’s i6 Challenge competition, the university is planning to launch the Mixed/Augmented/Virtual Reality Innovation Center (MAVRIC). According to a press release, it’s intending to “aggregate and accelerate” research and training in those areas throughout the region.

“Innovation is a significant driver of growth for the U.S. economy, and immersive media technology is poised to disrupt several key industries,” said Julie Lenzer, a leader in Maryland innovation circles who became UMD Associate Vice President for Innovation and Economic Development last year and is principal investigator for MAVRIC. The center, she said, “is well-positioned to emerge as the East Coast hub of immersive media, and we will power that drive with a community-based, collaborative approach to commercializing these technologies.”

In terms of existing assets to build on, UMD itself has an Augmentarium and Virtual Reality Cave. Additionally, plans are in place for more infrastructure with a new six-floor computer science and technology center named after Oculus cofounder and UMD alum Brendan Iribe.

MAVRIC’s team is also looking off campus. They are planning to extend partnerships with Morgan State University and Coppin State University, and tap into startups and businesses using the technology in the region.

According to the release, the center will focus efforts on three areas: media, simulation and training, and arts and entertainment.

There are signs of a groundswell in virtual and augmented reality throughout the area. As we’ve seen with other tech ecosystems, a central spot to align those efforts could help forge new growth.

https://technical.ly/diversity-equity-inclusion/university-maryland-ar-vr-center/

September 26, 2017

UMD Awarded U.S. Department of Commerce Grant to Launch Immersive Media Innovation Ecosystem

From catching Pokémon in the real world to donning a virtual reality headset to see and feel what it was like to scale the Berlin Wall before its fall, advancements in immersive media have set the stage for the next digital revolution. The University of Maryland will lead this revolution with the launch of the Mixed/Augmented/Virtual Reality Innovation Center, called MAVRIC, which has been awarded a $500,000 grant from the U.S. Department of Commerce Economic Development Administration (EDA).

“We are already leaders in this dynamic, growing field, and the project promises to make our entire region a national hot spot for immersive media development,” said University of Maryland President Wallace D. Loh. “It will become an economic and technological boon to Maryland, Virginia, and Washington, D.C.”

Co-funded by the university and the EDA’s Regional Innovation Strategies (RIS) program i6 Challenge Grant award, MAVRIC will build on university assets such as the new Brendan Iribe Center for Computer Science and Innovation, as well as other relevant assets across the region. The center will aggregate and accelerate the research and training capabilities of universities in the region, the direct needs and projects of the corporate and public sector, and the innovation engine of startups and small businesses to advance mixed, augmented, and virtual reality technologies in three select verticals: media, simulation and training, and arts and entertainment.

Immersive media is a term used to describe virtual reality, augmented reality, and mixed reality. Many industries are adopting immersive media as the next iteration of their business, as is evident in the surge of 3-D video and virtual reality use in industries other than gaming. For example, immersive media has the potential to change the way viewers experience the news, a movie, or a sporting event. Beyond media and entertainment, immersive media technology is being used to transform training for medical clinicians, manufacturing operators, military, and public safety professionals.

UMD is home to a robust research and technology infrastructure to support MAVRIC, including the Mid-Atlantic Crossroads (MAX), which provides high-speed access and cutting-edge network capabilities; the Augmentarium, an interactive computer visualization lab; the Virtual Reality Cave, which is used to advance the integration of wearables and sensors, and study human performance and human error within high-stress situations; and the forthcoming Brendan Iribe Center for Computer Science and Innovation, which will feature six floors of specialized labs to support groundbreaking research in virtual and augmented reality, 360-degree video, artificial intelligence, robotics, computer vision, algorithms, programming languages and systems.

“Innovation is a significant driver of growth for the U.S. economy, and immersive media technology is poised to disrupt several key industries,” said UMD Associate Vice President for Innovation and Economic Development and MAVRIC Principal Investigator Julie Lenzer. “MAVRIC is well-positioned to emerge as the East Coast hub of immersive media, and we will power that drive with a community-based, collaborative approach to commercializing these technologies.”

The center also aims to ensure a strong pipeline of diverse talent in the region. To stock this pipeline, the center will partner with higher education institutions such as Morgan State University and Coppin State University to promote and support school-based and community special-interest clubs in the field, harnessing the creativity of science, technology, engineering, arts, and math (STEAM) students in underserved urban and rural communities. Additionally, MAVRIC will partner with the university and local businesses to shape the creation of a new immersive media curriculum to prepare graduates for jobs in the field.

“MAVRIC will foster the development of immersive media technologies by building a network of influencers and executive champions, supporting the participation of traditionally underrepresented groups, and providing the strategic support needed to build a successful technology cluster,” said MAVRIC Program Director Lucien Parsons.

In addition to Lenzer and Parsons, the MAVRIC team includes collaborators UMD Interim Vice President for Research and Professor of Computer Science Amitabh Varshney and Philip Merrill College of Journalism Associate Dean for Academic Affairs and Master’s Program Director Rafael Lorente. Associate Professor of American Studies Sheri Parks serves as MAVRIC’s community engagement liaison.

Externally, the team collected a record 54 support letters from regional and national stakeholders. Interest and support were offered by investors, other universities, and the state, as well as private sector companies of all sizes, from startups to multinational corporations.

The i6 Challenge grant was awarded through the Regional Innovation Strategies (RIS) program, a highly competitive national program led by the EDA’s Office of Innovation and Entrepreneurship. The i6 Challenge competition fosters the development of centers for innovation and entrepreneurship that accelerate the commercialization of innovations and ideas into companies, jobs, products, and services.

https://spac.umd.edu/news/story/umd-awarded-us-department-of-commerce-grant-to-launch-immersive-media-innovation-ecosystem

Press Release

Contact: EDA Public Affairs Department, (202) 482-4085
September 20, 2017

WASHINGTON – U.S. Secretary of Commerce Wilbur Ross today announced that 42 organizations — including nonprofits, institutions of higher education, and entrepreneurship-focused organizations — from 28 states will receive over $17 million to create and expand cluster-focused proof-of-concept and commercialization programs, and early-stage seed capital funds through the Economic Development Administration’s (EDA) Regional Innovation Strategies (RIS) program.

This fourth cohort of Regional Innovation Strategies awardees expands the RIS portfolio to eight new states and continues to build vibrant regional entrepreneurial economies. Selected from a pool of more than 217 applicants, the awardees include a Philadelphia business incubator where startups scale to export worldwide; an aerospace manufacturing incubator in West Virginia; a commercialization program for advanced timber technology in rural Maine; agriculture technology commercialization efforts in Iowa, Nebraska, and California; and a new space technology commercialization effort in El Paso.

“The Trump Administration is committed to strengthening U.S. production and exports, which are essential to our nation’s economic growth,” said Secretary of Commerce Wilbur Ross. “These projects will enable entrepreneurs in communities across the United States to start new businesses, manufacture innovative products, and export them throughout the world – increasing America’s global competitiveness.”

The Office of Innovation and Entrepreneurship (OIE), housed within the U.S. Department of Commerce’s Economic Development Administration (EDA), leads the Regional Innovation Strategies Program to spur innovation capacity-building activities in regions across the nation. The program is authorized through the America COMPETES Reauthorization Act of 2010 and has received dedicated appropriations since FY2014.

The RIS grants, broken into two categories — the i6 Challenge and the Seed Fund Support (SFS) Grants — were awarded to:

Seed Fund Support Awardees

i6 Challenge Awardees

https://www.eda.gov/archives/2021/news/press-releases/2017/09/20/ris.htm

The Diamondback

April 6, 2017

UMD researchers’ augmented reality technology could help doctors in the operating room

by Jack Roscoe

Augmentarium

Researchers at the University of Maryland’s Augmentarium introduced their augmented reality technology last week, taking a step toward the technology’s use in the operating room.

A team of five physicians and researchers, including Amitabh Varshney, a computer science professor and Augmentarium director, publicly demonstrated augmented reality technology designed to assist in intubation — putting a tube down a patient’s airway — and ultrasounds to a small crowd at the Newseum on March 27. The software used in the demonstration, which ran on Oculus and HoloLens headsets, was developed at the Augmentarium, Varshney said.

The demonstrated augmented reality technology projects real-time information from the ultrasound onto the user’s field of view, said Barbara Brawn-Cinani, associate director at the university’s Center for Health-related Informatics and Bioimaging. This allows medical staff to see ultrasound images, for example, at the same time they’re looking at the patient, rather than having to repeatedly look away at small screens displaying the images, Brawn-Cinani said.
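The core of such an overlay is a small amount of geometry: anchor the live 2-D ultrasound image to the tracked probe and project it into the headset’s view each frame. A minimal sketch of that idea, assuming a calibrated tracker and a view-projection matrix; the names and numbers below are hypothetical, not the Augmentarium’s software:

```python
# Illustrative sketch only, not the Augmentarium's actual software.
# Anchors a 2-D ultrasound image plane to a tracked probe pose so it
# appears in the wearer's field of view.
import numpy as np

def pose_to_matrix(position, rotation):
    """Build a 4x4 world transform from a tracked position and 3x3 rotation."""
    m = np.eye(4)
    m[:3, :3] = rotation
    m[:3, 3] = position
    return m

def project_quad(view_proj, probe_pose, width_m=0.10, height_m=0.08):
    """Project the four corners of the ultrasound image plane (defined in
    the probe's local frame) into normalized device coordinates."""
    corners = np.array([
        [-width_m / 2, 0.0, 0.0, 1.0],
        [ width_m / 2, 0.0, 0.0, 1.0],
        [ width_m / 2, -height_m, 0.0, 1.0],
        [-width_m / 2, -height_m, 0.0, 1.0],
    ])
    clip = (view_proj @ probe_pose @ corners.T).T  # to clip space
    return clip[:, :3] / clip[:, 3:4]              # perspective divide

# Each frame: read the tracker, rebuild the quad, texture it with the
# latest ultrasound frame. Identity matrices here keep the demo short.
probe = pose_to_matrix(np.array([0.0, -0.2, -0.5]), np.eye(3))
print(project_quad(np.eye(4), probe))
```

In a working system the projected quad would be textured with each new ultrasound frame and redrawn at the headset’s display rate, which is what keeps the image locked to the patient rather than to a separate screen.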

The technology was a collaborative effort between the Augmentarium and the University of Maryland School of Medicine’s shock trauma center and has been in development for about a year, Varshney said. It is still being refined, he added.

“Our next big steps are user studies for clinical practitioners,” Brawn-Cinani said. “We need to determine if this is an advantageous tool for use in clinical settings.”

Augmented reality technology adds data — such as visual floor plans for those in construction or detail for military operations — to a headset user’s physical environment rather than creating a new environment like virtual reality technology does, said Brian Servia, president of the augmented reality club on the campus and a junior computer engineering major.

And in addition to helping medical personnel with tests such as ultrasounds, augmented reality technology could help create three-dimensional simulated medical environments in which to train health care providers, according to the Augmentarium website. Non-invasive reminders that project some form of data onto the user’s field of view are another main application of augmented reality — in medicine and also in architecture and the military, Servia said.

“Some surgeries are very complex,” he said. “Maybe we can standardize the entire surgery throughout the United States with this lightweight device that can augment the steps of the surgery that are non-invasive.”

Implementing this technology in hospitals around the country will involve researching how best to present information to surgeons on the headsets, securing the necessary permissions and approvals from entities such as the Food and Drug Administration, and popularizing the technology among health care professionals, Varshney said.

The Augmentarium, founded in December 2014 with a $1.6 million grant from the National Science Foundation and support from this university, makes use of cameras, displays and sensors to research augmented and virtual reality. The Augmentarium also receives funding from the MPowering the State initiative, a formal partnership between this university and the University of Maryland, Baltimore, Varshney said.

On top of this collaboration with the shock trauma center, the Augmentarium is working on other applications of augmented reality technology, from fluid dynamics to the performing arts. The center is planning to work with the sports medicine center in Cole Field House, Varshney said, to better detect subtle effects of traumatic brain injuries that have previously been impossible to visualize.

“This amazing technology could really have a significant impact on our society,” Varshney said.

Brawn-Cinani noted the medical field is still the center’s “richest application area.”

While augmented reality is “ahead of its time,” Servia said, it will begin to see more practical application in the future as hardware limitations are solved. He said a common complaint among military research personnel was the durability and size of current augmented reality headsets. People are becoming more accustomed to the prospect of augmented reality, however, especially in the medical industry, Servia said.

“We are looking at virtual and augmented reality as basically being the next great shift in how the human society consumes information,” Varshney said.

CORRECTION: Due to a reporting error, a previous version of this story incorrectly referenced Barbara Brawn-Cinani as an assistant director at the university’s Center for Health-related Informatics and Bioimaging. She is an associate director. This story has been updated.

https://dbknews.com/2017/04/06/umd-augmented-reality-surgery/

2016

Augmentarium

Jun 24, 2016

By Maryland Today Staff

UMD Research Elevates Virtual Reality Beyond Entertainment

Augmentarium

The room might be the strangest you’ve ever seen—soaring and palatial, with walls covered in a bizarre selection of portraits. Directly ahead is a photo of Taylor Swift, with one of Shrek to the right. Turn around to face the hawkish visage of inventor Nikola Tesla.

This isn’t a weird dream, but a virtual reality (VR) environment constructed by computer science Ph.D. student Eric Krokos and viewed through an advanced virtual reality headset. He’s testing the hypothesis that people can better recall visual information—like whose virtual portrait hangs where in the room—in realistic-looking environments, rather than when viewed in two dimensions on a flat screen.

“People are good at dealing with information spatially, but putting it on a screen makes it more abstract,” Krokos says. “Can we leverage a virtual reality headset to improve someone’s memory?”

An answer in the affirmative will have major implications for the fields of education and training. And it will illustrate how UMD researchers are helping virtual reality—now driven by gaming and entertainment—grow up a little.

After years of anticipation, VR hit the mainstream in recent weeks with the release of Facebook’s Oculus Rift headset. You need only try the device to understand gamers’ excitement. Most of us will never drive a Formula One car or face a mob of zombies while low on ammo, but the full audio-visual immersion afforded by the Rift and by the competing headsets that Sony and HTC are releasing this year almost makes it feel like we are.

Virtual reality game sales should reach $496 million in 2016, according to the global information firm IHS, and soar from there.

But Maryland researchers working in a unique computer visualization lab called the Augmentarium are convinced the technology has more profound uses. Virtual reality and its cousin, augmented reality—which overlays computer imagery on the real world using a heads-up display like Google Glass—could revolutionize schooling, medicine, public safety and more.

“The companies developing the technology are focused where the money is now,” says Amitabh Varshney, director of UMD’s Institute for Advanced Computer Studies (UMIACS). “We’re focused five or 10 years into the future, asking how we can use it to have a significant societal impact in critical areas.”

For instance, would cities be safer if police officers watching a disorienting array of security monitors could slip on a headset for a 3-D virtual view of the city stitched together from myriad camera feeds? UMIACS is testing just such a system outside the A.V. Williams Building using existing security cameras.

“The point is to present visual information in a more understandable way,” Varshney says. “Instead of a person appearing and disappearing on different screens, which is confusing, the system follows him.”
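One way to picture the “following” Varshney describes: map every camera’s detections into a single shared ground-plane coordinate frame, so one track persists as a person walks between views. The sketch below is hypothetical; the homographies, camera names and gating threshold are invented for illustration and are not UMIACS’s system:

```python
# Hypothetical sketch: instead of a person "appearing and disappearing
# on different screens," each camera's detections are mapped into one
# shared ground-plane frame so a single track can follow them.
import numpy as np

# Per-camera homographies (image pixels -> ground-plane meters),
# obtained offline by calibrating each fixed security camera.
HOMOGRAPHIES = {
    "cam_a": np.eye(3),
    "cam_b": np.array([[1.0, 0.0, 12.0],   # cam_b offset 12 m east
                       [0.0, 1.0,  0.0],
                       [0.0, 0.0,  1.0]]),
}

def to_world(camera_id, pixel_xy):
    """Map a detection's image coordinates into shared world coordinates."""
    h = HOMOGRAPHIES[camera_id]
    p = h @ np.array([pixel_xy[0], pixel_xy[1], 1.0])
    return p[:2] / p[2]

def associate(track_pos, detections, gate_m=2.0):
    """Greedy nearest-neighbor association of a track to new detections,
    regardless of which camera produced them."""
    best = min(detections, key=lambda d: np.linalg.norm(to_world(*d) - track_pos))
    return best if np.linalg.norm(to_world(*best) - track_pos) < gate_m else None

# A person walks out of cam_a's view and into cam_b's: same world track.
track = to_world("cam_a", (11.5, 3.0))
print(associate(track, [("cam_b", (0.2, 3.1)), ("cam_b", (5.0, 9.0))]))
```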

Medicine is another area that may be ripe for a virtual revolution. UMIACS researchers are working with doctors at the R Adams Cowley Shock Trauma Center at the University of Maryland Medical Center in Baltimore to make training films more visceral and instructive.

UMIACS researcher Sujal Bista ’05, M.S. ’10, Ph.D. ’14 last year filmed an abdominal surgery using a camera array that allows viewers to switch perspectives to see around intrusions like a doctor’s hand and view the surgery more closely. Future filming will use more capable camera setups, focusing on rare surgeries normally unseen by students. It’s the closest thing possible to participating in delicate procedures, Varshney says.

And using ultrasound, researchers are aiming to give doctors the power to look at patients and virtually see inside them, with real-time images projected on heads-up display glasses.

“The effect of much of the current imaging technology has been to drive us away from patients, even though we would prefer to be with the patient and focusing on them,” says Dr. Sarah Murthi of Shock Trauma. “Hopefully, what something like this will do is bring us back to the patient and away from the computer.”

Gaming is a great showcase for the technology, but Murthi hopes people understand how much more practical its uses can be.

“Being able to develop this technology for medicine depends on it being broadly accepted in society for uses beyond entertainment,” she says. “If we’re soon using it for navigation and many other practical things, the momentum from that could absolutely lead to a revolution in medicine.”

https://today.umd.edu/not-playing-games-d2939e91-1819-4d85-bf72-8c9521a72591

Thursday, May 19, 2016

For 29 students at the University of Maryland, the Spring 2016 semester ended with a virtual walk in the desert, a junkyard demolition derby and a digital waterfall created from tweets.

The undergraduate and graduate students were part of the university’s first class in virtual reality, the exploding immersive technology that is expected to generate more than $1 billion in sales this year.

Students spent the semester learning the technical aspects of virtual reality (fully computer-generated visual content) and augmented reality (overlaying digital information on real-world settings).

They were also tasked with designing virtual demos that they presented as final class projects, offering an array of visually immersive scenarios that included fighting a swarm of zombies, touring a virtual version of the UMD campus and using a sensor-laden glove that tracked human hand movements.

“It was amazing to see the quality of the demos that came out of the class and lab sessions in a relatively short period of time,” says Galen Stetsyuk, a sophomore majoring in computer science who took the class to build upon his skills as a game developer.

Stetsyuk, president of the student-run UMD VR Club, was part of a team that designed an interactive game called “Junkyard Wars,” where players build a virtual car from scrap parts and then smash it into their opponent’s vehicle.

Another team developed a virtual reality version of “Shade,” the interactive horror fiction game that turns a quiet morning in your apartment into a sand-filled quest to survive in the barren desert.

Alexandria BenDebba, a graduating senior in computer science and member of the Shade team, says she took the class to learn new programming skills.

“Most of the other programming I’ve done involved web-based applications—nothing that involved 3-D models. So it was really interesting to code objects that you had to interact with. It took object-oriented programming to a much more literal state,” she says.
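A toy example makes her point concrete: in VR code, a class maps onto a literal 3-D object you can grab and move. This sketch is illustrative only, not taken from the course materials:

```python
# Illustrative only: a scene object with a world position that a
# tracked hand can pick up and set down.
from dataclasses import dataclass

@dataclass
class Grabbable:
    """A 3-D scene object a player can hold and move."""
    name: str
    position: tuple = (0.0, 0.0, 0.0)
    held: bool = False

    def on_grab(self):
        self.held = True

    def on_release(self, drop_position):
        self.held = False
        self.position = drop_position

crate = Grabbable("supply_crate", position=(1.0, 0.0, -2.0))
crate.on_grab()
crate.on_release((0.5, 0.0, -1.0))
print(crate)
```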

One team merged social media with virtual reality. They created a virtual community you could walk through, identifying who lived where by overlaying a person’s Facebook photos on their home’s exterior walls. To relax, you could head to the edge of town and watch a cascading waterfall of your friends’ Twitter messages.

Through a series of case studies and hands-on demos, students were also exposed to potential uses of virtual reality in areas like augmented navigation, medical education and training, and sports training and rehabilitation.

Amitabh Varshney, a professor of computer science who co-taught the course, was impressed by how students recognized that immersive technologies have applications beyond gaming.

Varshney—who is leading several major research efforts on campus in virtual and augmented reality—recalls a class discussion where the students theorized a virtual game of “Pong” could be used to help people with motion disabilities.

“They are very interested in looking at using virtual and augmented reality in societally important applications,” he says.

* * * *

“CMSC 498W: Introduction to Virtual and Augmented Reality” was offered through the university’s Department of Computer Science. It covered topics like generating virtual worlds, tracking and registering visual images, and digitally rendering haptics (interactions involving touch) and 3-D audio.

Amitabh Varshney, professor of computer science and director of the University of Maryland Institute for Advanced Computer Studies, and Derek Juba, an adjunct lecturer in computer science at UMD and a computer scientist at the National Institute of Standards and Technology, co-taught the class.


https://www.umiacs.umd.edu/about-us/news/umd-students-show-final-projects-virtual-reality-class

2015

The University of Maryland announced today a gift of $31 million from Oculus VR co-founder and CEO and UMD alumnus Brendan Iribe – the largest gift in the university's history. The majority of the gift, $30 million, will help fund construction of the Brendan Iribe Center for Computer Science and Innovation, and the remaining $1 million will establish the Brendan Iribe Scholarship in Computer Science.

Virtual and Augmented Reality to Revolutionize Surgery

By Chris Carroll

November 13, 2015

To the uninitiated, the scene is terrifying. A patient lies spread-eagle on an operating table, draped except for his midsection, where a trauma doctor digs for bullet fragments in a hollowed-out cavity large enough to swallow a bowling ball.

But to researchers in the University of Maryland’s Institute for Advanced Computer Studies (UMIACS) and doctors at the R Adams Cowley Shock Trauma Center at the University of Maryland Medical Center in Baltimore, it’s step one on the path to the future of emergency surgery.

The scene is playing out in 3-D on a giant screen in UMIACS’ Augmentarium, where researchers seek to graft the power of computing onto everyday life.

Postdoctoral computer science researcher Sujal Bista ’05, M.S. ’10, Ph.D. ’14 filmed the surgery at Shock Trauma with an innovative (and still-secret) video setup using cameras that present multiple viewpoints. Viewers can change perspective and see around objects in the scene, like doctors’ hands or medical equipment.

The filming experience was disturbing, Bista says—particularly the acrid smell of blood vessel cauterization. 

“If you’re not used to it, it’s very strong,” he says.

Giving viewers the same sense of immersion (minus the smell) is the first objective in this collaborative research, says Amitabh Varshney, computer science professor, UMIACS director and lead investigator for the project. The surgeries Bista is filming will be used primarily for training.

“We’re working on building virtual environments in which multiple students and teachers are immersed concurrently and can view the reconstructions in any environment with a shared perspective,” Varshney says.

Students and teachers will share virtual experiences using 3-D headsets like the Oculus Rift, which UMIACS has stocked up on, thanks to last year’s $31 million gift from Oculus VR co-founder and CEO Brendan Iribe. A $1.5 million gift from his mother, Elizabeth Iribe, will establish an endowed chair in virtual and augmented reality, fostering continued research of this type.

The environment in a trauma hospital like Shock Trauma, which established the “golden hour” ideal of saving lives through speedy care, is disorienting for medical students, says Dr. Sarah Murthi, who is working with the UMD researchers.

But giving them an Oculus Rift headset and letting them acclimate by exploring the scene in virtual reality could revolutionize training, she says.

“We throw medical students and young trainees into the chaos of trauma with almost no preparation. We expect them to just pick up how to manage several patients at once, without really training them how to do that,” Murthi says. “Virtual reality will allow us a way to teach what I call spinning plates—paying attention to several things at once and not losing track of any of them.”

The next step, which will take place in coming years, is introducing virtual and augmented reality into clinical practice, Varshney says.

“If there was a virtual reality simulation where you could practice a particularly difficult surgery before you do it, the actual surgery would be faster and safer,” he says.

And during surgery, doctors could be guided by information and imagery projected on see-through goggles, perhaps listing the steps or showing systems in the body.

Further out, the researchers envision a world where remote-controlled robots on distant battlefields or in the backs of ambulances could act as the virtual hands of skilled surgeons at places like Shock Trauma, speeding the delivery of quality care dramatically.

“Augmented reality has the potential to fundamentally change and improve the practice of medicine—both in how we train and how we care for patients,” Murthi says.

https://today.umd.edu/visions-future-c2197096-241a-4d6a-8fd9-e2412497a56f

February 07, 2015

By Lauren Brown

Oculus VR CEO's Vision: UMD in Forefront of Virtual Reality

Brendan Iribe aims to transform entertainment, communication, education and more with the most hotly anticipated tech advance since the smartphone: a pair of goggles and an operating system offering a totally immersive 3-D experience in virtual reality.

The Maryland alumnus’ boldness extends to the university, where he envisions a building that will become a model for the study of computer science, allowing students and faculty to explore the potential of virtual reality, as well as robotics, computer vision, computer-human interaction and immersive science.

Iribe, co-founder and CEO of Oculus VR, has pledged $30 million to Maryland to help fund construction of the Brendan Iribe Center for Computer Science and Innovation. With an additional $1 million supporting scholarships in computer science, his is the largest gift in university history. His longtime business partner, Oculus co-founder and Chief Software Architect Michael Antonov ’03, is pledging $4 million to support construction of the building and scholarships, and Iribe’s mother, Elizabeth Iribe, is contributing $3 million for new professorships in the Department of Computer Science.

“‘Giving while living’ is what many people are saying now,” says Iribe, whose company was acquired by Facebook in July for approximately $2 billion. “Giving back at this time allows us to participate in the school right now instead of waiting until we’re retired or much older. We’re going to be able to go back and not just talk there, but be able to help spread a lot of this technology and innovation that we’re creating at Oculus.”

The building, expected to open in 2017, will be prominently located at the corner of Campus Drive and Route 1 and designed to encourage collaboration, with open work spaces, community areas and “makerspaces” where students and faculty can experiment and create.

“Brendan’s remarkable vision will catapult our computer science department to an even greater level of national distinction,” says university President Wallace D. Loh. “It will spark student creativity, galvanize collaborative innovation and entrepreneurship across campus, and stimulate tech-based economic development in the state.”

Iribe and Antonov met as freshmen in Fall 1997 when they lived in Denton Hall. With Iribe as the business visionary and Antonov and Andrew Reisse ’01 as coding whizzes, they founded their first company, SonicFusion, in 1999 with lofty ambitions: to create a better windowing system than Windows.

“We were in a little bit over our heads,” Iribe later joked.

After freshman year, Iribe withdrew from Maryland to continue growing the newly renamed Scaleform; Antonov briefly did the same, then returned and juggled classes and coding with Reisse. After six long years of being broke, they finally licensed their technology and started developing a new Flash player for 3-D applications.

Their software was used in hundreds of video games by Activision, Disney and more, and in 2011, Greenbelt-based Scaleform was acquired by Autodesk for $36 million. The trio then moved to a cloud gaming company called Gaikai in California, and figured out how to stream games onto smart TVs. The next year, Sony bought that company, this time for $380 million.

“We were starting to get the hang of it,” Iribe says drily.

In 2012, Iribe met Palmer Luckey, who as a home-schooled teen had cobbled together a virtual reality (VR) headset called the Oculus Rift, predicting it would someday offer true immersion—the holy grail of gaming. John Carmack, the godfather of 3-D gaming who created Doom and Quake, took a duct-taped prototype of the Rift to a top video game conference, where he declared it “probably the best VR demo the world has ever seen.”

Iribe and his team came on board to formally create a company and launched a Kickstarter campaign to raise $250,000 to keep improving the Rift. It blew past that goal in hours and ultimately brought in more than $2.4 million. The first developer kits shipped out, and the ideas multiplied for how it could be used beyond gaming: to help children learn about the solar system or human body, to take homebound (or cash-poor) users on virtual vacations, to put sports fans in the middle of a game.

The company’s evolving headset was winning awards and positive press when Reisse was killed in a hit-and-run crash near his Santa Ana, Calif., home in May 2013. Antonov, Iribe and their co-workers, along with Reisse’s parents, Robert ’76 and Dana ’73, quickly funded a new scholarship for Terp computer science students in his name.

“We wanted Andrew to be remembered and to support the kind of independence, creativity in computer science and love of nature, which he had,” Antonov says.

The first scholarship recipient invited Antonov and Iribe back to Maryland for Bitcamp, an April event that drew 700 students nationwide to hack together websites, apps and computer hardware projects. It was a stunner when, 10 days before the pair arrived, the Oculus-Facebook deal was announced.

Department Chair Samir Khuller gave them a tour of the computer science classrooms and labs carved out of the A.V. Williams Building, built in 1987 as office space. Iribe recalls, “My first thought was, this is pretty depressing. How can people get inspired to create the future in a space like this?”

Someone mentioned that computer science needs a new home. “I said, ‘We can fix that. How much is a building?’ The more we thought and talked about it, the more excited we became about the opportunity to transform University of Maryland with a new computer science building that inspires students the same way our offices and engineering labs inspire and attract the best and brightest in the industry.”

His offer was a godsend for the department, where undergraduate enrollment had doubled in the past five years, to 1,700, and is expected to double again in the next decade. As a result, students are working in a maze of cubicles in four buildings spread across campus. It’s hurting Maryland’s recruitment of faculty and graduate students, says Professor Emeritus Bill Pugh, who is spearheading the Iribe Center’s fundraising effort, as well as kicking in $500,000 of his own money.

He envisions a vibrant place that spurs cutting-edge research in the department and the university’s Institute for Advanced Computer Studies and offers courses, events and creative projects that attract students from practically every major.

“We need a building where formal and informal learning co-exist so our students can imagine and invent products that will change our world,” he says. “This is a once-in-a-generation opportunity.”

https://today.umd.edu/record-31m-gift-jump-start-computer-science-building-63cc8961-34d0-4f5c-8dea-70f683fbc02b

2014

AR Illustration

TERP magazine Fall 2014

Record Gift (pages 1 and 2)

 

Tuesday, September 9, 2014

Visualization tools that allow a surgeon to “see through” a patient during surgery. Wearable sensors that encourage healthier behavior. Virtual reality headsets that train soldiers to remove a wounded comrade from the battlefield.

Innovations like these and more under development at the University of Maryland just got a big boost.

The National Science Foundation (NSF) has awarded UMD a $600,000 Major Research Instrumentation grant in support of virtual and augmented reality research and education. Virtual reality (VR) either mimics real-world settings or creates fantasy worlds; augmented reality (AR) embeds digital information into real-world settings.

Both fields are expected to expand exponentially in the near future in applications tied to scientific visualization, entertainment, military uses, architecture, navigation, education, prototyping, collaborative communication, and more.

The NSF grant will be used to purchase new equipment and provide infrastructure support for a 1,000-square-foot “augmentarium” now under construction. The Virtual and Augmented Reality Laboratory will feature interactive projection displays, robotic mounts and high-speed computing clusters that, university officials say, will position UMD as a leader for the effective visualization of large and complex data.

When launched later this year, the interactive lab will bring together researchers from across campus and beyond to explore ideas and technologies that combine real-time data within virtual settings and backdrops.

“These technologies will engage our faculty and students to explore new pathways of discovery that can have far-reaching scientific and societal benefits,” says Jayanth Banavar, dean of the College of Computer, Mathematical, and Natural Sciences (CMNS).

The virtual and augmented reality lab is the latest addition to the university’s Institute for Advanced Computer Studies (UMIACS), with its director, Amitabh Varshney, named as principal investigator of the NSF award.

Varshney and other researchers in the Graphics and Visual Informatics Laboratory—an NVIDIA CUDA Center of Excellence—have led recent efforts in virtual and augmented reality research at Maryland and are expected to play a lead role in the new lab when it is fully operational.

Projects supported by the NSF grant include: understanding large-scale astronomy data (Lee Mundy in astronomy); weather and climate prediction (Kayo Ide in atmospheric and oceanic science); characterization of stem cells (Peter Bajcsy at the National Institute of Standards and Technology and Antonio Cardone in UMIACS); large-scale simulation of rotorcraft brownouts via accelerated algorithms (Ramani Duraiswami in computer science and UMIACS); data visualization for cybersecurity (Tudor Dumitras in electrical and computer engineering and UMIACS); user interaction with augmented reality (Catherine Plaisant in UMIACS and Rama Chellappa in electrical and computer engineering and UMIACS); visualization of big data (Joseph JaJa in electrical and computer engineering and UMIACS, and Varshney); and augmented reality-based, image-guided surgical interventions (Rao Gullapalli in radiology at the University of Maryland, Baltimore, and Varshney).

The grant also provides for education and training, with plans for the Maryland Center for Women in Computing to develop workshops that encourage middle school and high school girls to participate in VR and AR projects.

Other research will study how people interact with VR and AR technologies, and how they can best be used in an educational setting.

The virtual and augmented reality lab will also take advantage of several cross-institutional partnerships supported by MPowering the State, which joins scientists at the University of Maryland with physicians, clinicians and other health experts at the University of Maryland, Baltimore.

“We look forward to further collaboration with our colleagues in Baltimore to identify opportunities and leverage our combined strengths in computing power and clinical expertise,” says Varshney.

Researchers from both institutions envision specialized headgear that surgeons can wear in an operating room, providing real-time patient and surgical data that is “overlaid” on top of a patient during surgery.

Additional projects will look at wearable sensors that can track the movement of people suffering from neurological disorders like Parkinson’s disease, keeping tabs on the disease’s progression in hopes of providing better therapeutic outcomes. Also under discussion are wearable technologies that monitor human activity and offer visual feedback via warnings or positive responses, both of which can promote healthier behavior.

The federal government is very interested in VR and AR applications, including the development of specialized headwear that can provide soldiers with critical information like weather, combat efficiency, and the location of both friendly and adversarial forces. Other applications can train soldiers in essential tasks, such as evacuating wounded comrades while under fire on a virtual battlefield.

Initial seed funding for the UMD virtual and augmented reality lab came from CMNS, with additional support from the university’s Division of Research and the provost’s office.


https://www.umiacs.umd.edu/about-us/news/umd-receives-nsf-major-research-instrumentation-award-support-virtual-and-augmented

Brendan Iribe
©2014 Photography by Mike Denison

By Mike Denison - April 4, 2014

Capital News Service

COLLEGE PARK – Tech innovator Brendan Iribe has a penchant for ambitious goals. His first company aimed to make software better than what Microsoft and Apple had to offer — with just three employees.

Five penniless years passed before Iribe tasted success. But now, he’s the CEO of a tech company worth $2 billion.

Fresh off of his company’s March 25 acquisition by Facebook, Iribe addressed criticisms about the acquisition and teased the future of his company’s unique virtual reality headset during a speech at the University of Maryland Friday afternoon.

Virtual reality is a term used to describe video games or other computer experiences that are vastly more immersive than what traditional game consoles can provide. Virtual reality convinces the user that they are not simply controlling an avatar in an artificial world, but that they are in the artificial world themselves.

Iribe’s revolutionary product, the Oculus Rift, is a virtual reality system designed for video games. It takes the form of a pair of goggles that projects a separate image to each eye and tracks where the player’s head moves, altering the view of the game world accordingly. The technology was created by Palmer Luckey, who had originally made a working version of the Rift with duct tape and hot glue, according to Iribe.
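That mechanism is simple to outline. The sketch below places two virtual cameras half an interpupillary distance apart and re-orients both each frame from the tracked head rotation; all names and values are illustrative, not Oculus SDK code:

```python
# Minimal illustration of head-tracked stereo rendering: one camera per
# eye, offset by half the interpupillary distance, both re-oriented each
# frame from the tracked head yaw. Not Oculus SDK code.
import numpy as np

IPD = 0.064  # typical interpupillary distance in meters (assumed)

def yaw_matrix(theta):
    """3x3 rotation for a head turn of theta radians about the vertical axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

def eye_views(head_pos, head_yaw):
    """Return (position, forward) for the left and right eye cameras."""
    r = yaw_matrix(head_yaw)
    right = r @ np.array([1.0, 0.0, 0.0])     # head's local right axis
    forward = r @ np.array([0.0, 0.0, -1.0])  # head's local view direction
    left_eye = head_pos - right * IPD / 2
    right_eye = head_pos + right * IPD / 2
    return (left_eye, forward), (right_eye, forward)

# Each frame: read the tracker, compute both eye cameras, render twice.
left, right = eye_views(np.array([0.0, 1.7, 0.0]), head_yaw=np.pi / 6)
print(left, right)
```

Rendering the scene once per eye from these two slightly different viewpoints is what produces the depth cue; updating both from the tracker every frame is what makes the world appear to stay still as the head turns.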

But when Iribe tried the system for himself, he knew he’d seen something special. It wasn’t the first attempt to create virtual reality for video games, but it was far cheaper than previous iterations. And that meant it just might work.

“It is incredible to work with a small team on a big idea. So, what bigger idea than virtual reality?” said the CEO of Oculus VR.

The acquisition by Facebook was met with some backlash from the video game community, many of whom feared a lack of independence would ruin Oculus’ innovation. Markus “Notch” Persson, the developer of the wildly popular game Minecraft, publicly announced a severing of ties with Oculus after the acquisition, saying on Twitter “Facebook creeps me out.”

Iribe, a former Maryland student, defended the acquisition, citing the massive costs of turning his vision for Oculus into a reality. He said the necessary technology for a new piece of gaming hardware often costs around $1 billion. Before the Facebook acquisition, Oculus had raised about $100 million.

Iribe said that Facebook’s massive user base and available capital was exactly what Oculus needed to succeed. When confronted with the costs of achieving the goals for Oculus, Facebook CEO Mark Zuckerberg said, “Well, we have a lot of money,” according to Iribe.

Iribe said that Facebook had the ability and resources to make Oculus and virtual reality affordable and commercially viable. He added that it was not in Facebook’s best interest to overrun Oculus’s independence, citing Instagram and WhatsApp as companies acquired by the social media giant without losing independence.

“Frankly, we didn’t want to be acquired. We wanted to be independent,” said Iribe. “But when we saw this partnership and how much sense it made, we really wanted to … go into the partnership if it meant we could stay Oculus and stay who we are. We’d keep our Oculus hoodies, we’d keep our Oculus email addresses, and we’d really keep our independence.”

While he emphasized that the team’s primary focus is gaming, Iribe said that there were many other fields that could benefit from Oculus and other virtual reality systems. He said he hoped that Oculus could eventually be used in the medical, architectural, communication and even travel industries.

Oculus was far from Iribe’s first entrepreneurial endeavor in games, however. While still a student at Maryland in 1999, he and two other students created the company that later became Scaleform, an application for designing video game interfaces. However, the company made “exactly zero dollars and zero cents” until five years after its creation. After Scaleform was bought by Autodesk, Iribe led another video game company, Gaikai, until it was bought by Sony, not long before he got involved with Oculus.

Iribe said he has not forgotten the lessons learned during the lengthy dry spell despite Oculus VR’s tremendous growth.

“You just have to keep with it,” he said. “Most of the time, these things don’t happen nearly as fast as, say, Oculus.”

https://cnsmaryland.org/2014/04/04/oculus-virtual-reality-ceo-defends-facebook-acquisition/

TERP magazine Spring 2014

Virtual Reality, Made More Real

 
