Project 2: AR Campus Tour Experience

In collaboration with Catherine Liu

Anthony Pan
Nov 21, 2021



Building Personas:

To begin this project, we thought about the different kinds of people who would come to visit CMU and created personas based on them. We considered visitors of different ages, physical abilities, income levels, races, nationalities, and interests. While there are many more personas we could account for, we felt these three were enough as a starting point.

The Experience:

When thinking about the campus tour experience, we considered what the AR medium could add that visitors cannot get on a regular campus tour. We asked ourselves what affordances a digital medium provides that a physical one cannot. We agreed that many significant events on campus happen only once a year, and visitors would miss them if they weren't there at the right time. So we began listing some of those events: buggy racing, Spring Carnival, experiencing the different seasons, club fairs, and an assortment of other student activities.

We settled on creating a virtual tour experience with the central theme of exploring and experiencing Carnegie Mellon University throughout the year — CMU 365. We wanted to focus on the specific interactions with the 4 seasons as well as the student club fairs in the spring and fall.

The premise of the interaction is simple: create an engaging experience that would convince a prospective student to come to this university. To accomplish this, we decided to build a digital version of the club fair that most prospective students miss during their campus tours. As visitors walk toward the cut, visual displays of the club fair appear in their headset: tables showcasing the different clubs, along with holographic projections of people to make the experience more authentic. Within the headset, the viewer can access a map showing the layout of the club fair as well as a list of club names. After selecting a club they are interested in, a waypoint activates to direct them to the corresponding table.

We wanted people to physically move through the space to create an immersive experience, so they could feel what it is like to participate in a club fair without it physically happening. We also wanted the digital interaction to respond and change according to what is happening in reality.
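
The selection-to-waypoint step can be sketched in code. This is a minimal illustration rather than part of our actual prototype; the booth names, coordinates, and the flat 2D layout of the cut are all assumptions.

```python
import math
from dataclasses import dataclass

@dataclass
class Booth:
    club: str        # club name shown in the list
    position: tuple  # hypothetical (x, y) spot on the cut, in meters

def activate_waypoint(club_name, booths, visitor_position):
    """Find the selected club's table and return what a waypoint
    would display: the distance to walk and the heading to face."""
    booth = next(b for b in booths if b.club == club_name)
    dx = booth.position[0] - visitor_position[0]
    dy = booth.position[1] - visitor_position[1]
    return {
        "club": club_name,
        "distance_m": round(math.hypot(dx, dy), 1),
        "heading_deg": round(math.degrees(math.atan2(dy, dx)), 1),
    }

# e.g. a visitor at the edge of the cut selecting a (made-up) Buggy booth
waypoint = activate_waypoint(
    "Buggy", [Booth("Buggy", (10.0, 0.0)), Booth("Chess", (4.0, 3.0))], (0.0, 0.0)
)
```

In a real headset the positions would come from spatial anchors rather than hand-typed coordinates, but the lookup-then-point logic would be the same.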

We also wanted to account for the various weather conditions and physical constraints people could encounter on campus tours. For example, people with physical disabilities may not be able to move across the cut as easily as someone without them, and weather like snow or rain may make the cut hard to access. With this in mind, we decided to connect the AR experience to a weather application that updates the experience according to current conditions. During bad weather, tables would be placed tangent to the sidewalks surrounding the cut, so people could access them from the comfort of the overhang. The same layout would be available for people with physical disabilities.
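
The weather fallback described above boils down to a simple rule. Here is a rough sketch, assuming a weather app reports a condition string and that tables sit at fixed intervals; the condition names and spacings are invented for illustration.

```python
def place_tables(condition, num_tables):
    """Pick anchor lines for the fair tables based on current weather.
    'rain' or 'snow' -> line the tables along the covered sidewalk so
    visitors (including those with limited mobility) can reach them
    from under the overhang; anything else -> spread them across the
    open lawn of the cut. The 3 m / 5 m spacings are placeholders."""
    if condition in ("rain", "snow"):
        return [("sidewalk", i * 3.0) for i in range(num_tables)]
    return [("lawn", i * 5.0) for i in range(num_tables)]

rainy = place_tables("rain", 3)   # tables hug the sidewalk
sunny = place_tables("clear", 3)  # tables spread over the lawn
```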

When interacting with the tables, prospective students will be able to learn what the club does on campus and how it engages with the Carnegie Mellon community, ask questions about the club, and get extra information like a website or Facebook page for future contact or research.

4 Seasons Experience

To account for the four seasons, visitors will be prompted outside the UC (where they start the tour) to choose a season they are interested in experiencing. The seasons will be presented on the trees outside the UC: through the HoloLens, if the visitor selects autumn, the tree leaves turn orange, and if they select winter, the leaves fall. Based on the chosen season, different activities will also appear; the visitor can pick one they are interested in, and a waypoint will be projected on the ground to direct them across campus to where that activity usually takes place.
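
Conceptually, the season picker is a lookup from season to tree appearance plus a filtered activity list. A toy sketch of that mapping follows; the leaf states come from our concept, but the exact activity lists here are placeholders.

```python
# season -> how the UC trees render, and which activities to offer
SEASONS = {
    "spring": {"leaves": "green",  "activities": ["Spring Carnival", "buggy racing"]},
    "summer": {"leaves": "full",   "activities": ["campus walks"]},
    "autumn": {"leaves": "orange", "activities": ["the FAIR club fair"]},
    "winter": {"leaves": "fallen", "activities": ["snow on the cut"]},
}

def select_season(season):
    """Return the tree rendering state and the activity list to show."""
    s = SEASONS[season]
    return s["leaves"], s["activities"]

leaves, activities = select_season("autumn")
```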

In class on Tuesday, we took test videos of the user walkway that we can layer the graphic interface onto. We also took note of how the rainy weather that day could be accounted for in the experience, and how we could adapt the walkways to be more indoors.

For our next steps, we will explore some lo-fi prototypes with paper and graphics photoshopped over pictures.

To flesh out our concept, we started by storyboarding our two main interactions: choosing an activity and exploring the fair.

From there, we looked at the videos we took and selected specific moments where interaction can occur, from which we took screenshots and imported them into Figma. On top of the screenshots, we made the graphic interaction prototypes at different stages so that we could visualize what the projections could look like in real life.

Here are the storyboards for the 2 experiences:

Table Interaction
4 seasons interaction

Lo-fi prototypes:

We built out some lo-fi prototypes for class. We used images and overlaid them with the potential UI and AR elements.

4 seasons interaction
Table Interaction

We also overlaid the UI on a video in After Effects to simulate what it would be like to walk with the headset on. We will show it in class tomorrow.

Self Reflection:

As the prevalence of digital media in our physical environments increases daily, what is the role and/or responsibility of designers in shaping our environments?


As digital media continues to spread through our environments, designers have a unique opportunity to use these new tools to create interactions that enhance how we perceive and experience reality. Designers can shape how digital interfaces help people navigate spaces, take in information about their surroundings, facilitate learning, and much more! I think it is important for designers to take advantage of the rapid development of technology while also keeping in mind issues like the universality of their designs and sustainability. Questions like how we can bring these new technologies to everyone, or how these digital interfaces can be used to reduce waste and educate people about sustainability, are vital to creating a better future in which more people can enjoy these mixed digital and physical environments.

I also think it is an interesting concept to enhance our physical interactions with digital mediums. Take the HoloLens example of showing people how to build or assemble things: anything along those lines is a glimpse of the possibilities granted by digital media in our physical environments. I'm not sure what other methods exist to create similar interactions in the same vein, but the potential is there, and designers just need to explore it.

Thanksgiving Break Progress: 11.29.21

Based on feedback we got in class, we focused on incorporating more 3D elements that visitors can interact with. We decided to remove many of the 2D graphics, such as the map and club list. Instead, the map becomes a 3D hologram you can pull out from your wrist, HoloLens-style, and each table starts with a character you interact with, who can show 3D models of information as needed. To show the extent of these 3D assets, we will focus on the buggy booth, where the visitor can physically interact with a buggy model and see the hill the teams race on. Additionally, instead of having the visitor choose a club in advance, they are encouraged to explore the different booths based on the floating logos above them.
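
Replacing menu selection with free exploration means each floating logo needs its own trigger, such as a gaze test. Below is a minimal sketch of the cone check a headset might use; the 5-degree threshold is an assumption, and the real HoloLens gaze APIs work differently.

```python
import math

def gaze_hit(gaze_dir, logo_dir, threshold_deg=5.0):
    """True when the visitor's gaze ray falls within a small cone
    around the direction from their head to a booth's floating logo."""
    dot = sum(g * l for g, l in zip(gaze_dir, logo_dir))
    norms = math.dist((0, 0, 0), gaze_dir) * math.dist((0, 0, 0), logo_dir)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norms))))
    return angle <= threshold_deg

looking_at_logo = gaze_hit((0.0, 0.0, 1.0), (0.0, 0.05, 1.0))  # ~2.9 degrees off
looking_away    = gaze_hit((1.0, 0.0, 0.0), (0.0, 0.0, 1.0))   # 90 degrees off
```

In practice a dwell timer would sit on top of this check so a passing glance does not open a booth.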

This is how it will look through the HoloLens:

To prototype, we made a table in SketchUp and placed the file into Aero so we could begin seeing how it would look within the space. However, there were many interactions we could not prototype in Aero, so we also took videos and layered graphics on top of them in After Effects. While doing this, we had trouble changing the trees for the different seasons, and we will need to experiment to find a good balance between realism and level of fidelity. Going forward, we will also create interaction videos for the table interactions, for which we will have to make a 3D model of the character and different assets.

Here are the videos we took in Aero and After Effects.

table from Aero

Catherine also made a script to give us a rough outline of what we wanted to do with the rest of the video.

Welcome to the CMU365 experience. To begin the tour, put on your HoloLens. Our goal is to bring the year-round CMU experience to you.

As you begin this tour, pick a season you'd like to experience by tapping the button on your left wrist. Select a season and apply it to the tour by dragging it to a nearby tree. Once the tree changes, CMU365 will have finished loading your events. Choose from a list of student events, and a waypoint will direct you to its location on campus.

As you walk to your destination, you will listen to a brief introduction of the event.

“The FAIR is CMU’s annual club fair that is usually held on the cut. Various student clubs and organizations will have tables where students can talk with club members to learn more about a specific club and join clubs of interest. This is one of CMU’s largest student-run events and is where many students take the first step toward making relationships and deepening their academic and personal hobbies.”

Once you arrive at the cut, feel free to walk amongst the projected tables and explore different clubs. Find an interesting club? Tap on the floating logo.

“Welcome to CMU Buggy! We compete in the annual buggy race held during Carnival”

“What is buggy?”

“Good question! (hill terrain appears on table) This annual student-led relay race partners designers, engineers, mechanics, and athletes. Together, they build and operate a buggy — an aerodynamic vehicle with no engine. In just over two minutes, teams race around a .84-mile track with buggy and driver leading the way around Schenley Park’s Flagstaff Hill, with speeds up to 40 mph.”

“What does a buggy car look like?”

“(pulls up 3D model) A buggy car is designed with many factors in mind, such as its size, form, and material. Feel free to interact with this buggy car.”

“What is it like to be a buggy driver?”

“You can experience it yourself! (Play 360 video)”

“Where can I contact this club?”

You can drag and save interesting club links to the bookmarks tab located in the interface on your wrist.


In class, we spoke with Daphne who advised us to think more critically about the elements in our table interaction. The main point we discussed was the hologram that appears. Originally, we were going to create a simple 3D character model, but Daphne suggested changing it to an actual hologram of a person behind the table. This is so that it is more realistic to the actual club fair where you will talk with club members who are sitting there.

We also considered showing two people on the tour instead of one, since we could end the video with them talking about the club and then suggesting going to another table. However, we felt that one person was enough to show the concept and interactions, so a second was not necessary.

We also found a picture of the UC trees in summer, so we went out to reshoot some clips from that specific angle so that we could overlay the picture in After Effects.

We did a lot in After Effects, combining 3D models we made in Aero with the clips we reshot yesterday. We were able to overlay the UI we developed in Figma and rotoscope the clips so that the hand appears in front of the buttons, making the AR experience feel more realistic. We also used a photo from the fall to overlay on the trees, as if the leaves were actually being rendered and placed on the tree.

Currently, we are working on the navigation to the tables scene. This scene includes the user walking towards the cut after selecting the Fair. We used Adobe Aero to create tables that we placed in 3D space. We still need to place logos of the clubs above the tables within the Aero clip and finish filming the table interaction and its assets.

Finally, we outlined our presentation to review for Thursday.


We worked on the presentation and rehearsed our parts during finals week. We structured the presentation based on the outline Daphne and Peter gave us. We started by explaining the significance of our concept, detailing what the current tour was missing and how AR could enhance the existing system. We then talked about personas and the significance of designing for a broad range of audiences, explaining which factors we were accommodating, like physical disabilities and weather conditions. Next, we created a system map of all the different parts involved in our tour, from the hardware to the cloud database. For the rest of the presentation, we outlined where the tour started and ended and what people would do on it, showed the final video, and discussed the interactions in more detail using stills. At the end, we recapped the AR tour using the verbs that encapsulated the goals of the AR experience.

In After Effects, we added the hologram, filmed the table interaction, created assets in Adobe Aero, and finished small details like sound and overall pacing. For the table experience, we created assets in Aero and went to the cut to film. Icons float above the different tables to mimic the club fair. In After Effects, we added the gaze recognition icon, a waypoint, and a club description. We wanted to focus on natural interactions, such as gaze recognition, audio dialogue, and wrist interactions that are easily accessible throughout the tour. Moving on to the table itself, we set it up so that the different elements on the table, like the buggy and the terrain model, appear at different times between questions. We recorded Catherine on a green screen so that she appears as a hologram behind the table. We recorded questions and answers, added a “listening” icon underneath the hologram AI, created a logo for the tour experience, and had John Henley narrate over the video. For the narration, we tried to clearly state each interaction along with the concept behind it: the tree interaction shows the seasons changing, and the table hologram enables more natural conversation. We also emphasized these interactions in our presentation, highlighting the tree and table parts of the tour.

Here is a copy of our slides:

And here is a link to our final video!