Neukom/ITC AR/VR Symposium

Jones Media Center
Tuesday, December 3, 2019


11:30-12:15 – VR Demos Area Open

11:30-12:15 – Boxed Lunches from King Arthur Flour Available

12:15 – Opening remarks by Dan Rockmore, Associate Dean for the Sciences

12:20-3:00 – Presentations

3:00-3:30 – Discussion, Moderated by Dan Rockmore

3:30-4:00 – VR Demos Area Open

4:00 – Optional reception (wine and cheese) in the VR demos area




1 The Virtual Basilica of St Paul: An Annotated Virtual Reality Reconstruction of one of Rome’s Most Venerable Churches

  • Nicola (Nick) Camerlenghi (Associate Professor, Art History)

This project employs Virtual Reality to reconstruct a selection of important historical models from the millennial history of the Basilica of St. Paul in Rome, where one of Christianity’s founding figures, St. Paul, is buried and venerated. The building was tragically lost to fire in 1823, which has limited its exposure to scholarly inquiry. The project is currently entering its second generation, in which annotations are being added to the models to make them more approachable to a broader audience and peer-reviewable by experts.

2 Teaching Sign Language in AR

  • David Kraemer (Assistant Professor, Education)
  • Devin Balkcom (Associate Professor, Computer Science)
  • Xia Zhou (Associate Professor, Computer Science – not presenting)

Teaching human motion, such as swimming, dance, or sign language, is hard: words only roughly describe a motion, and visual demonstration requires the learner to map external motion onto their own body. The goal of this project is to provide automated assistance for learning motion tasks, sign language in particular, by providing AR instruction and feedback based on the sensed motion of the learner.

3 Immersive 3D Visualization of Landscapes for Virtual Fieldwork in the Geosciences  

  • Jonathan Chipman (Director, Citrin Family GIS/Applied Spatial Analysis Lab)
  • Marisa Palucis (Assistant Professor, Earth Sciences)

We are developing ways to use VR for fieldwork in the geosciences, building detailed photogrammetric models of landscapes on Earth and Mars that faculty and students can explore and interact with. This is particularly valuable for inaccessible locations (e.g., Mars) and for remote, difficult-to-access, hazardous, or environmentally sensitive field sites on Earth. It also lets students become familiar with remote landscapes before going into the field, and allows students with disabilities to participate in fieldwork.

4 Launching the DEV Studio

  • John Bell (Director, Data Experiences and Visualization Studio)

The Data Experiences and Visualization Studio (DEV Studio) is a new campus resource for working with XR in teaching and research. It helps explore technologies that bridge the virtual and physical worlds, including 3D scanning and digitization, augmented and virtual reality, and multimodal interfaces. This talk will give a brief overview of plans to develop the Studio and offer ideas for how it can fit into research and teaching practices across disciplines.

5 HoloLens – The Power of Augmented/Mixed Reality 

  • A.J. Herrmann (Business Sales Specialist, Microsoft)

Microsoft HoloLens is the first self-contained holographic computer, delivering heads-up, hands-free computing that enables you to interact with high-definition holograms in your world. This blended environment becomes your canvas, where you can create and enjoy a wide range of mixed reality experiences without disengaging from the task at hand. 

6 3D Representations in Paleoanthropology

  • Jeremy DeSilva (Associate Professor, Anthropology)

Spatial understanding and awareness are key components of teaching paleoanthropology. 3D technologies, both new and old, are powerful pedagogical tools for teaching students about evolution.

7 The Uncanny Valley effect in AR and VR

  • Lorie Loeb (Research Professor, Computer Science; Faculty Director/Co-Founder, DALI Lab; Director, Digital Arts Programs; Deputy Director of UI/UX Design for the Emerging Technology Core at the Center for Technology and Behavioral Health)
  • Kala Goyal ’20
  • Sung Jun Park ’19 (Not presenting)
  • Ting Yan GR ’19 (Not presenting)
  • Xia Zhou (Associate Professor, Computer Science – not presenting)

The Uncanny Valley is the uneasy response people have to computer-generated characters or robots that look almost real, but not real enough. We explore the Uncanny Valley effect in AR and VR in order to better understand how these altered states of reality might affect viewers’ perception of CGI characters. Using a range of characters from cartoony to hyper-real, we test them in both AR and VR settings and use a mix of biosensor data and surveys to track viewers’ responses and discover what makes a character appear “creepy” as opposed to “cute” or “realistic”.

8 ARKit on the iPhone and iPad

  • Lars Ljungholm (Apple, Systems Engineer)
  • Beth Marshdoyle (Apple, Account Executive) 

Apple is working to blur the lines between the virtual and reality so that your classroom could become the cosmos. The past could be as vivid as the present. And the familiar could look like nothing you’ve ever seen. With iPhone and iPad, those experiences are not only possible, they’re here. Augmented reality is a new way to use technology that transforms how you work, learn, play, and connect with almost everything around you. We will be presenting a short overview of these new technologies and how Apple is making them a reality!

9 Image-Guided Surgery – an AR Application

  • David Roberts, M.D. (Professor of Neurosurgery Active Emeritus, Geisel; Adjunct Professor, Engineering)
  • Keith D. Paulsen (Robert A. Pritzker Professor, Biomedical Engineering; Professor of Radiology, Geisel; Scientific Director, Advanced Imaging Center, Dartmouth-Hitchcock; Project Leader, Center for Surgical Innovation, Dartmouth-Hitchcock; Co-Director, Cancer Imaging and Radiobiology Research Program, Norris Cotton Cancer Center)

Image-guided surgery superimposes preoperative imaging information, such as from MRI and CT, onto the surgical field to provide navigational assistance during operative procedures. Refined co-registration strategies and updating of image sets using sparse data have improved accuracy, functionality, and safety.

10 Tarsier Vision Goggles

  • Tim Tregubov (Director/Co-Founder, DALI Lab; Senior Lecturer, Computer Science)

DALI has collaborated on numerous design and development projects related to mixed reality. In this session, Tim will describe what that process looks like using the Tarsier Goggles project as an example.

11 Design Tips and Tricks from Entangled

  • Max Seidmann (Senior Game Designer at Professor Mary Flanagan’s Tiltfactor Lab)

Entangled is a multidimensional VR escape room/puzzle game that was created to study the impact of VR avatars on gender bias, and the impact of VR on construal levels. I will briefly touch upon the findings, but will then focus on three design questions that everyone making a new VR experience needs to ask themselves, and the solutions we came up with in Entangled.

12 Dartmouth & Magic Leap: Building a Spatial Computing Journey Together

  • Stuart E. Trafford (Magic Leap)
  • Andrew Shepard ’89 (Chief Growth Officer, Immersion Analytics)

Magic Leap has built the next frontier of computing: a spatial computing platform that seamlessly blends our digital and physical worlds. Stuart Trafford from the Magic Leap Education team will discuss how this technology can benefit higher-ed institutions like Dartmouth in teaching and learning as well as in academic research, across disciplines as diverse as Computer Science, Architecture, Data Science, Art History, and Human Anatomy.

13 Student XR Work

  • James Mahoney (Digital Arts Lecturer, Computer Science Department)

Students in the 3D Digital Modeling, AR and VR Design, and AR and VR Development courses get hands-on experience creating XR software. I will be presenting work done by students in these XR classes offered by the Computer Science Department in 2019.

14 How Colorful is Visual Experience? Evidence from Gaze-Contingent Virtual Reality

  • Caroline Robertson (Assistant Professor, Psychological and Brain Sciences)

We used gaze-contingent rendering in immersive VR to reveal the limits of human color awareness during real-world, active vision.

15 Enhancing the Student Experience in XR

  • Kala Goyal ’20
  • Lindsay Kusnarowis ’20

To help enhance and diversify student experiences and engagement with XR, Kala Goyal ’20 and Lindsay Kusnarowis ’20 discuss their own and other students’ past experiences working with XR in courses and various labs across campus.


This symposium is sponsored by the Associate Dean for the Sciences and co-sponsored by the Dartmouth Library; Information, Technology, and Consulting; and the Neukom Institute.