
Video Library


IMD@UMD Workshop 2021


Watch all 15 virtual presentations given by UMD faculty and students. Below are descriptions of each presentation.

Dean Amitabh Varshney

Workshop with opening remarks from Roger Eastman, Director of Immersive Media Design; Amitabh Varshney, Professor and Dean of the College of Computer, Mathematical, and Natural Sciences (CMNS); and Bonnie Thornton Dill, Professor and Dean of the College of Arts and Humanities (ARHU). Watch the Opening Remarks.

Virtually On Stage

The School of Theatre, Dance, and Performance Studies looks to create a game-driven training space for theatre students. Andrew Robert Cissna is a visiting assistant professor in the graduate lighting design program at the School of Theatre, Dance, and Performance Studies. He is a professional lighting designer for plays in D.C. and around the country. Watch this presentation.

Environmental Documentation

Terminal Front, a collaborative effort funded by the University of Maryland, uses LiDAR and photogrammetry to create a virtual reality experience of the Helheim Glacier in Greenland. The project blends 4D LiDAR data sets shared by glaciologists, artist-led immersive media design, and the CyArk organization’s expertise in place-based digital storytelling.

Cy Keener is an assistant professor of art and an interdisciplinary artist who uses environmental sensing and kinetic sculpture to record and represent environmental phenomena. He also teaches Digital Media Theory and Culture in the Immersive Media Design program. Watch this presentation.

Adam Marton

Adam Marton is the director of Capital News Service at the Philip Merrill College of Journalism. He is focused on quality storytelling across media, using design and technology to tell rich, human stories. Marton is a visual journalist and designer specializing in the presentation of the news, including data visualization, front-end development, and information graphics. Watch this presentation.

North Brentwood

The high-density documentation of cultural heritage sites and artifacts is becoming more common as the cost of laser scanners and photogrammetry drops and their processing becomes more user-friendly. Interpreting this material for the public is still a challenge. North Brentwood is a historic African American town just south of College Park.

A project is underway to document the built environment before gentrification permanently obscures it. VR presents an opportunity to immerse diverse publics across the globe into an interactive environment where they can engage with the history of this small town.

AR provides an opportunity to share historic documents, photographs and oral histories with any visitor who has a smartphone. Linking these two technologies to disseminate significant cultural heritage stories will encourage visitation to the town and enable new residents and visitors to learn what has made North Brentwood's history so special.

Stefan Woehlke is a Ph.D. candidate in archaeology at the School of Architecture, Planning, and Preservation, where he is focusing on heritage, diaspora theory and post-emancipation studies. He is the co-founder of a D.C. food tour company, where his passions for food, history and culture intersect. Watch this presentation.

Mapping Malvasia

Caroline Paganussi is an art historian because of classes she took during a study abroad semester in Bologna, home to Europe’s first university. The classroom was often the city itself, especially its churches. As the Robert H. Smith Fellow for Digital Art History, she is working with Gregory on approximating that experience through XR (AR within VR) for online teaching.

Ironically, the pandemic, for which such teaching is ideal, precluded gathering much of the necessary content and media this past summer. Nevertheless, they are prototyping the concept, and at the same time have pivoted to structuring the data of a period book about Bolognese art so that it can be mapped and ultimately experienced as an augmented walk through the city.

Caroline Paganussi is a Ph.D. candidate and Jenny Rhee Fellow studying Italian Renaissance Art in the Department of Art History and Archaeology. Broadly, her research focuses on the exploration and expression of humanistic discourse in painting and printmaking in the fifteenth and sixteenth centuries. Watch this presentation.

Digital Storytelling

Virtual reality, augmented reality, and mixed reality represent emerging spatial technologies that are changing our relationship not just to each other, but to our environment as well. Helping humanity understand how best to use these technologies requires spatial storytellers and technologists who can collaborate in these new mediums. As part of the Immersive Media Design program, we're bringing together a diverse group of students to explore the untapped narrative potential of spatial technologies.

Jonathan David Martin is a director, actor, producer and teaching artist focused on the creation and presentation of contemporary works of performance. He teaches the Immersive Media Design course Augmented Reality Design for Creatives and Coders. He is also the co-artistic director of Smoke & Mirrors Collaborative, a non-profit that creates original works for theater, VR/XR and the web with socially relevant themes. Watch this presentation.

Ephemeral Media

Mollye Bendell is a studio arts lecturer and the first faculty member to teach Intro to Immersive Media Design at UMD. As a professional artist, she creates digital and analog sculptures using the intangible nature of electronic media as a metaphor for exploring vulnerability, visibility and longing in a world that can feel isolating. Watch this presentation.

Sense of Presence

Virtual reality technology promises the unique capability to create a sense of presence for users in virtual environments. This requires the computation of highly detailed and photo-realistic imagery in a matter of milliseconds, which is a challenge even for today's specialized high-performance processors. Matthias Zwicker discusses how research and advances in computer graphics rendering algorithms are bringing us closer to achieving movie-quality rendering in interactive VR applications.

Matthias Zwicker is a professor and the chair of the Department of Computer Science, as well as a lead faculty member of the Immersive Media Design major. He develops high-quality rendering and signal processing techniques that are used for computer graphics, data-driven modeling, and animation. Watch this presentation.

MBRC

In this panel, the MBRC team will discuss some of their most impactful ongoing projects, including a virtual reality tool for weather forecasting, an immersive concert, and a virtual module for developing surgical skills.

Barbara Brawn is the associate director of the Maryland Blended Reality Center, a lab that is dedicated to advancing visual computing for health care and innovative training for professionals in high-impact areas using AR and VR technologies. Brawn is followed by MBRC AR/VR researchers Jon Heagerty, Sida Li and Eric Lee. Watch this presentation.

HSIS

Monifa Vaughn-Cooke is an assistant professor of mechanical engineering. Her interdisciplinary research aims to identify the behavioral mechanisms associated with system risk propagation to inform the design of user-centric products and systems, with the ultimate goal of improving productivity and safety. Watch this presentation.

Project Intent

The INTENT autism app will use mixed reality to develop an empathy tool that alters sensory input to the user in ways that mimic the perceptions of autistic individuals, and to assess the tool's efficacy in promoting greater understanding, acceptance, and inclusion of people with autism and other neurodivergent conditions.

Kathryn Dow-Burger is a clinical associate professor in the Department of Hearing and Speech Sciences. Her interests focus on social communication disorders, language-learning disabilities, emergent literacy, and stuttering. She is the founder and program designer of UMD's Social Interaction Group Network for All (SIGNA).

Elizabeth Redcay is an associate professor in psychology and director of the Developmental Social Cognitive Neuroscience Lab. Her research examines the development and neural correlates of social interaction and social cognition in both typical and atypical development, specifically individuals with autism spectrum disorder.

Andrew Begel is a principal researcher in the Ability Group at Microsoft Research. His work looks at how technology and AI can play a role in extending the capabilities and enhancing the quality of life for people with disabilities. Watch this presentation.

NSF I-Corps

I-Corps is a National Science Foundation program designed to foster, grow and nurture innovation ecosystems regionally and nationally. In this session, Kunitz will discuss how the program provides real-world, hands-on training to researchers and early-stage technology entrepreneurs on how to turn innovations into successful products that solve societal problems.

Daniel Kunitz represents the I-Corps program at the A. James Clark School of Engineering. He is the executive director of DC I-Corps Node, a role in which he advises and provides support to all early-stage technology companies in the NSF-sponsored DC I-Corps program. Watch this presentation.

UMD Student Experiences

This project, originally created to give first-time UMD students an augmented reality tour of campus during the pandemic, evolved into that and so much more. UMD has an extremely rich history; only by physically seeing that juxtaposition can we grasp how many generations of students, and how much change, it has taken to create what stands before us.

Aishwarya Tare '22 is majoring in information science with a focus in human-centered design and a minor in art history. She is interested in rendering experiences using interaction design skills in AR/VR, human-computer interaction, and product design. Watch this presentation.

XRClub

The XR Club is a student organization at UMD that provides students with resources and opportunities to explore virtual, augmented and mixed reality through hands-on tutorials, projects, hackathons, game nights, open lab hours, mentor office hours and more! This presentation will give an overview of the club and feature some of its recent projects, including the NASA SUITS challenge.

Sahil Mayenkar is a computer engineering major and the president of the XR Club. He previously taught the student-initiated course CMSC388M Introduction to Mobile XR, and worked as an augmented reality developer intern at New Wave. Watch this presentation.


MAVRIC 2020 Conference


This two-day virtual conference was held September 10th and 11th, 2020. It was co-hosted by the Chesapeake Digital Health Exchange. The 2020 conference explored ways that extended reality (VR/AR/MR) is impacting health care, business, art, intelligence, defense, and government.


MAVRIC 2019 Conference


This two-day conference, held September 17th and 18th, 2019, was hosted by Booz Allen Hamilton and took place at the Booz Allen Innovation Center in Washington, D.C. The conference brought together the east coast’s top corporate, government, research, and startup talent in XR to present on topics ranging from health care, simulation and training (both corporate and governmental), and advanced research in the technology and design of interactive experiences to education, data analysis and cybersecurity.


More UMD XR-Related Videos

Jul 15, 2021 - Researchers at the University of Maryland recorded VR users’ brain activity using electroencephalography (EEG) to better understand cybersickness and work toward solutions to prevent it.

The research team of Eric Krokos, Ph.D., and Amitabh Varshney, professor and dean of the College of Computer, Mathematical, and Natural Sciences, is the first to establish a correlation between the recorded brain activity and the self-reported symptoms of their participants.

Their work provides a new benchmark—helping cognitive psychologists, game developers and physicians as they seek to learn more about cybersickness and how to alleviate it.

This project was partially funded by the National Science Foundation.

March 25, 2021 - Immersive visualization technology is rapidly changing the way three-dimensional data is displayed and interpreted. Rendering meteorological data in four dimensions (space and time) exploits the full data volume and provides weather forecasters with a complete representation of the time-evolving atmosphere. With the National Oceanic and Atmospheric Administration, we created an interactive immersive “fly-through” of real weather data. Users are visually guided through satellite observations of an event that caused heavy precipitation, flooding, and landslides in the western United States. This narrative and display highlights how VR tools can supplement traditional forecasting and enhance meteorologists’ ability to predict weather events.

August 27, 2019 - What if a piece of music could transport you to the place it was written, inspired by, or first performed? Through virtual reality, this collaboration between University of Maryland's College of Arts and Humanities (ARHU) and College of Computer, Mathematical, and Natural Sciences (CMNS) offers an immersive experience that takes audiences beyond the auditory. By virtually placing the performer within environments significant to the music, this UMD project provides a rich experience with possibilities in both musical education and performance.

July 24, 2018 - See how University of Maryland researchers are exploring the power of virtual and augmented reality. Their work was supported by the National Science Foundation, the state of Maryland’s MPower initiative and the NVIDIA CUDA Center of Excellence program.

“Virtual memory palaces: immersion aids recall” was authored by Eric Krokos, a 5th-year doctoral student in computer science, Catherine Plaisant, a senior research scientist in the University of Maryland Institute for Advanced Computer Studies, and Amitabh Varshney, professor and dean, College of Computer, Mathematical, and Natural Sciences.

May 1, 2018 - With colleagues at the University of Maryland School of Nursing, we are exploring virtual reality and music therapy for non-opioid pain interventions. For this project, we have captured and recreated a 360-degree navigable performance by the Maryland Opera Studio, which is part of a comparative study of VR environments as pain interventions. The combination of VR and music to decrease pain perception is truly innovative. Combining the novelty and immersion of VR with the known therapeutic effects of music presents a unique and promising approach to patient care, and could help further understanding of the complex relationship between music and healing.

August 30, 2017 - Using cutting-edge VR technology, the University of Maryland played a crucial role in developing a virtual reality environment that lets Newseum visitors interact with the Berlin Wall.

October 12, 2016 - While medical imaging has radically evolved, images are displayed in the same way they were in 1950. Visual data are shown on 2D flat screens, on displays that force health care providers to look away from the patient, and even away from their own hands while operating. AR’s ability to concurrently display imaging data and other patient information could save lives and decrease medical errors. This is especially true for procedures done outside an operating room; during “bedside procedures” patients may be most at risk and AR could provide the greatest benefits. We are currently developing and testing several AR tools for patient care and diagnostics, including intubation and ultrasound.