Embodied Code

A Platform for Embodied Coding in Virtual and Augmented Reality



Overview

The increasing sophistication and availability of Augmented and Virtual Reality (AR/VR) technologies hold the potential to transform how we teach and learn computational concepts and coding. This project develops a platform for creative coding in virtual and augmented reality. The Embodied Coding Environment (ECE) is a node-based system built in the Unity game engine. It is conceptualized as a merged digital/physical workspace in which spatial representations of code, the visual outputs of that code, and user editing histories are co-located in a virtual 3D space.
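To give a concrete sense of what a node-based, spatially co-located code representation can look like in Unity, the sketch below shows a minimal, hypothetical CodeNode component. The class name, fields, and Evaluate method are illustrative assumptions only and are not taken from the ECE codebase.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative sketch only: a minimal "code node" that lives in 3D space.
// The names used here (CodeNode, inputs, Evaluate) are hypothetical and
// do not correspond to the actual ECE API.
public class CodeNode : MonoBehaviour
{
    // Upstream nodes whose outputs feed this node.
    public List<CodeNode> inputs = new List<CodeNode>();

    // A label shown on the node in the virtual workspace, e.g. "add" or "rotate".
    public string operation = "noop";

    // Evaluate this node by first evaluating its inputs, then combining them.
    // This toy version just sums its inputs; a full node-based language would
    // dispatch on the operation type.
    public virtual float Evaluate()
    {
        float result = 0f;
        foreach (CodeNode input in inputs)
        {
            result += input.Evaluate();
        }
        return result;
    }

    // Because each node is a GameObject, its location in the scene is simply
    // transform.position, so the code graph, its visual output, and editing
    // history can share one coordinate space.
    void OnDrawGizmos()
    {
        // Draw connections to upstream nodes so the graph is visible in the editor.
        Gizmos.color = Color.cyan;
        foreach (CodeNode input in inputs)
        {
            if (input != null)
            {
                Gizmos.DrawLine(transform.position, input.transform.position);
            }
        }
    }
}
```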

It has been theorized that learners’ abilities to understand and reason about functions, algorithms, conditionals, and other abstract computational concepts stem in part from more fundamental sensorimotor and perceptual experiences of the physical world. Our own work, for instance, has revealed that computer science (CS) educators incorporate a wide range of metaphors grounded in tangible experience into their lessons on computational concepts, such as demonstrating sorting algorithms with a deck of cards or illustrating the transfer of information between functions by throwing paper airplanes. Our long-term research centers on how a coding platform that supports these types of embodied conceptual phenomena can make learning to code a more intuitive process.

Getting Started

Follow our Getting Started Guide to run the app on your headset.

Video

Video documentation for CHI ’22 Interactivity. Videos by Timothy Wood and Tommy Sharkey.

Workshops, Presentations, Papers

Team

Contact

To learn more, contact PI Ying Wu at ycwu@ucsd.edu

Participating Labs

Support

This work is supported by the National Science Foundation under Grant #2017042.