
STEM Education on Structural Biology through an Immersive Learning Environment

Sabrina Chow, Cornell University

Week 1: Introduction and Project Proposal 🎳

The first few days in NYC for the REU started with an introduction to the rest of the cohort and the facilities. I went bowling with Dr. Wole and the other REU students, which was a lot of fun and very competitive. The next day, we all gathered at Hunter College and toured the building. We met some of the mentors, including my own: Kendra Krueger. The day after was the start of the class “CSCI49383 – VR, AR, Mixed Reality,” where I learned about the ideal principles of VR, how it works, and its history. After class, I worked with some of the other students to brainstorm for our proposals over poke and boba.

Later, I met with Kendra to write up the details of my project proposal. Kendra is the STEM Outreach and Education Manager at the Advanced Science Research Center (ASRC), and from our conversation, I can tell that she is truly an educator at heart. I’m really excited to work on this project, which will enhance the learning experience for K-12 students visiting the Illumination space in the ASRC. Kendra gave me two different paths to go down, but ultimately, I have decided to focus on structural biology instead of neuroscience. It’s a subject I’m more comfortable with and I think I can create a good STEM education project about it. Finally on Friday, I met with the rest of the cohort, where we got an introduction to Paraview and presented our project proposals.

A snapshot of the Paraview tutorial we went through.

Week 2: Working at the Advanced Science Research Center 🧪

I started the week by getting set up at the ASRC and being introduced to the other high school and undergraduate researchers working there over the summer. I got to talk more with Kendra about my project and briefly met Eta Isiorho, a researcher at the ASRC whose expertise in structural biology and crystallization I will be relying on. Then, I attended a lab safety training session over Zoom so that I’ll be able to enter Eta’s lab. I also used the time to complete CITI training since the world was on fire and it wasn’t safe to go outside (see photo below).

Smoky air outside the dorm from wind blowing down the smoke from the Canadian wildfires. The AQI was almost 300.

Towards the end of the week, I attended the SciComms conference at the ASRC with the rest of the VR-REU cohort. The format was that each presenter gave an informal presentation of their research, followed by a formal, more scientific version. It was really interesting to hear about the wide variety of projects going on around us, and I think attending will really help prepare me for our symposium at the end of these 8 weeks.

Part of the science/research art project at the SciComms conference. The question was, “What about research inspires you?” For me, it’s my love for animals and therefore, biology.

For my project, I continued to compile sources and take notes for my literature review. I was hoping to create a mock-up for the project, but after meeting with Kendra and Eta, I think I will need to readjust my project to fit both of their expectations.

Week 3: Making Progress 📱

This week, I started to get into the meat of the project. Since this is a large project, I knew I had to break it down into smaller pieces. First, I made a mockup of what I wanted my application to look like using Figma (see below).

This is my general idea for the application that I’m developing. Students will be able to use their devices to see molecules and more through AR.

Second, I began to work with Xcode to create the real app. This took a bit longer than I expected since I am still getting reacquainted with Xcode and Swift, but I now have the general layout.

The first look at my application in the Xcode storyboard.

Looking forward, I will need to work on the functionality of the application. That will be the most difficult part of the project, but I’ve found many YouTube tutorials that will help me understand how Apple’s RealityKit works, so I am hopeful. Another issue I’ve been considering is how I will distribute my application. If I go through the official Apple App Store, I will need to submit the app for review and prepare it with the proper certificates, etc.

Outside of my project, I also met a couple more times with Eta. She showed me the crystallization lab at the ASRC and taught me more about the software she uses. I’m hoping to use some of that software to create videos of the molecules. In addition, I attended the CUNY Graduate Sciences Information Session and learned more about the process of applying to grad school. Finally, towards the end of the week, Dr. Wole taught us about VMD.

Pictures of the VMD interface: one with overlaid structures, and one with a molecule’s functional group selected.

Week 4: Application Framework 🛠️

For this week, I created the structural framework of my application in Xcode. I finished the storyboard for the application and began to make ViewControllers. The vast majority of the week was spent implementing the collection tab. In hindsight, I think there are still ways that I could have made the code more efficient. For example, I made three separate UICollectionViews instead of just using the built-in sections. Switching to the sections approach would require writing custom section handling, though, so I will most likely not change this unless I have spare time at the end of the project.

The implemented version of the collections tab for my application.
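
For anyone curious, here is a minimal sketch of what that sections approach might look like. This is not my actual code; the section names, the "MoleculeCell" identifier, and the item arrays are placeholder assumptions.

import UIKit

// Minimal sketch: one UICollectionView driven by a data source with
// multiple sections instead of three separate collection views.
enum CollectionSection: Int, CaseIterable {
    case proteins, crystallography, scientists
}

class CollectionViewController: UIViewController, UICollectionViewDataSource {
    @IBOutlet weak var collectionView: UICollectionView!

    // One array of item names per section (placeholder data).
    var items: [[String]] = [[], [], []]

    override func viewDidLoad() {
        super.viewDidLoad()
        collectionView.dataSource = self
    }

    func numberOfSections(in collectionView: UICollectionView) -> Int {
        return CollectionSection.allCases.count
    }

    func collectionView(_ collectionView: UICollectionView,
                        numberOfItemsInSection section: Int) -> Int {
        return items[section].count
    }

    func collectionView(_ collectionView: UICollectionView,
                        cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
        let cell = collectionView.dequeueReusableCell(
            withReuseIdentifier: "MoleculeCell", for: indexPath)
        // Configure the cell with items[indexPath.section][indexPath.item] here.
        return cell
    }
}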

I also worked on implementing the pop-up page that shows up when a molecule is selected from the Collection tab. Each molecule will have more detailed information about what the image is showing and why it is relevant (in general and to the ASRC’s scientists).

This is the pop-up page that shows more details about a selected molecule from the Collection tab.
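
Continuing the sketch above (again with hypothetical names rather than my real classes), the pop-up can be presented from the collection view’s delegate callback when a cell is tapped:

import UIKit

// Sketch of presenting a detail pop-up for the tapped molecule.
// MoleculeDetailViewController and its properties are placeholders.
class MoleculeDetailViewController: UIViewController {
    var moleculeName: String = ""
    // In the real app this view shows the molecule image and its description.
}

extension CollectionViewController: UICollectionViewDelegate {
    func collectionView(_ collectionView: UICollectionView,
                        didSelectItemAt indexPath: IndexPath) {
        let detail = MoleculeDetailViewController()
        detail.moleculeName = items[indexPath.section][indexPath.item]
        // Present as a modal sheet over the Collection tab.
        // (collectionView.delegate = self must also be set, e.g. in viewDidLoad.)
        present(detail, animated: true)
    }
}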

The only things left to do for these parts are:

  • The actual game part. Users will need to unlock the molecules through the AR camera, which means they should not be clickable until the user has scanned the corresponding code.
  • The molecules. The image files used were random examples taken directly from the RCSB PDB. I will need to find relevant molecules and their images, hopefully from Eta.
  • The descriptions. I will need to write the different blurbs and have Kendra look over them. My goal for the little descriptions is that they will be informational without having too much scientific jargon.

I think for this upcoming week, I will reach out to Eta and Kendra about getting files. Other than that, I will be focusing on implementing the AR part because I suspect that will be the most difficult. Once I have the files, I will also need to convert them from .pdb/.xyz/etc. to a 3D compatible format. Fingers crossed!

Week 5: Plateau-ing 🥲

This week, I started out by figuring out how to convert between file formats. Most protein files are saved as .PDB (old) or .mmCIF (new). First, I needed to change from those formats to .OBJ, a standard 3D file format. VMD and PyMOL are both supposed to have native converter tools, but when I tried to convert files using these two programs, the resulting files were almost or completely empty. Eventually, I found that the Chimera program worked best for converting the .PDB/.mmCIF files to .OBJ. Second, I would have to go from .OBJ to .USDZ, the 3D file format Apple uses, based on Pixar’s USD. The newest version of the application, ChimeraX, was the best at creating a .OBJ compatible with Apple’s RealityConverter tool, which takes 3D files and converts them to .USDZ. The final file did not have color, which is definitely not ideal, but I think I will deal with that later.

A snapshot of RealityConverter taking in a .OBJ file and creating this .USDZ file.

Next, I worked on implementing the actual game functions. This required setting up ‘communication’ between the different ViewControllers. I tried many different methods, but I found that the best way was to put the functions that change the items inside the Collection class and create instances of the Collection class in each ViewController that needed to use those functions.
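
Here is a rough sketch of that idea with hypothetical type and property names, not my actual classes: the item data lives in shared storage inside the Collection class, so every ViewController that creates a Collection instance reads and writes the same state.

import Foundation

// Rough sketch of sharing state through the Collection class.
// Item, the property names, and the sample data are placeholders.
struct Item {
    let name: String
    var isUnlocked: Bool = false
}

class Collection {
    // Shared storage, so every Collection instance sees the same items.
    private static var items: [Item] = [
        Item(name: "Example molecule A"),
        Item(name: "Example molecule B"),
    ]

    var allItems: [Item] {
        return Collection.items
    }

    func unlockItem(named name: String) {
        if let index = Collection.items.firstIndex(where: { $0.name == name }) {
            Collection.items[index].isUnlocked = true
        }
    }
}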

Finally, I’ve been trying to learn and use the basics of RealityKit in the app. Specifically, I want to use the image tracking feature. I need to track multiple images, and each image should show a specific model. I have an idea of how to do it, but I have not been able to test it. Also, I still need the actual files that I will use in the application.

Week 6: Everything is Looking Up 🥳

I began the week knowing that I would need to get the AR function implemented. My goal is to do user testing next week, and I can’t do that with an app that doesn’t have AR, since the whole point of this program is to use XR in an innovative way. I was starting to panic as the deadline quickly approached.

As a result, I worked on setting up the AR. I finally began to test on my iPhone instead of the built-in simulator on my laptop. On the first tab of my application, there is an ARView. My first issue was that this ARView was just showing up as a black screen. Eventually, I got it to work with the camera by adding the camera permission (the NSCameraUsageDescription key) to the app’s .plist file (property list).

The first tab of my application with the working camera.

My application uses images to track where each model should be placed. Therefore, using my phone camera allowed me to actually see the model from different perspectives. I made a couple of sample scenes in Apple’s Reality Composer and then imported the project into my Xcode project. In Reality Composer, I was able to display the model by scanning the image (as seen below), so I assumed that it would work in Xcode. It did not.

This was the sample molecule model in RealityComposer using my phone camera. The model is sitting on top of the QR code.

I ran into a complete roadblock with the AR in the middle of the week. I found that my Scene was loading correctly but was not connecting to the Anchor. I honestly think I spent 12 hours searching for a solution, adding and testing code over and over. What was the solution, you might ask? Deleting everything except the code that loads the models onto the screen… Sometimes the simplest answer is the solution. As a result, each model correctly showed up when its respective image appeared. (I’m updating this blog in the middle of the week because I need to share that I succeeded 🥹.)

Using ARView to scan the QR codes and place the models from RealityComposer on their respective QR code.
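
For reference, the surviving code was essentially just the scene-loading calls. A stripped-down sketch looks like this; it assumes a Reality Composer project named “Experience” with a scene called “Molecule1,” since Xcode generates the Experience.loadMolecule1() loader from those names, so both are assumptions here.

import UIKit
import RealityKit

class ARViewController: UIViewController {
    @IBOutlet weak var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        do {
            // The scene's anchor is the QR-code image set in Reality Composer,
            // so appending it is enough: the model appears once the camera
            // detects the matching image.
            let moleculeScene = try Experience.loadMolecule1()
            arView.scene.anchors.append(moleculeScene)
        } catch {
            print("Failed to load Reality Composer scene: \(error)")
        }
    }
}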

Then, I worked on the ‘unlocking’ feature. This required reworking my Item class and learning about how Substrings work in Swift. Thankfully, it was not nearly as difficult as the AR stuff. I also spent some time downloading QR codes for the image tracking. Finally, I worked on downloading files from the Protein Data Bank and writing descriptions for them.
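
The Substring part boils down to pulling an ID out of the detected reference image’s name (the ARImageAnchor’s referenceImage carries that name) and using it to flip the matching item’s unlocked flag. A tiny sketch, assuming a hypothetical naming scheme of “qr_01”, “qr_02”, and so on:

import Foundation

// Sketch of the parsing step behind unlocking, assuming reference images
// are named like "qr_03". The naming scheme is a placeholder.
func itemID(fromImageName name: String) -> Int? {
    guard name.hasPrefix("qr_") else { return nil }
    // dropFirst returns a Substring, so convert it before storing it.
    let suffix: Substring = name.dropFirst(3)
    return Int(suffix)
}

// Example: itemID(fromImageName: "qr_03") == 3; that ID then marks the
// corresponding Item as unlocked in the Collection.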

Week 7: Final Stretch 🏃🏻‍♀️

This week consisted of fixing a lot of small things before doing user testing. For example, one issue I had was that I needed to load the arrays of my Item class before ‘capturing’ a picture of the Item’s model in AR. My solution was to switch the positions of the Collection and AR tabs. This meant the Collection tab would load first. I also think logically it makes more sense for the Collection to be first as an introduction to the application.

Another major part was fixing the QR codes I was using. I originally generated the QR codes online using QRFY. Then I used a photo editor to add a blank space in the middle with the QR code’s number. The issue I ran into was that the QR codes were too similar: Apple’s AR system kept mixing them up, resulting in tons of overlapping models. At first, I thought the issue was that I was testing it on a screen; however, I printed them out and they were still glitching. I then spent a couple of hours in the photo editor adjusting the QR codes by hand until Apple’s tools accepted that they were different enough.

I also learned how to download the files with color. Instead of using the .OBJ file format, I started using the .GLB/.GLTF format. The command in ChimeraX looks like: “save filename.glb textureColors true”.

I then collected all of the files that I would need. I decided on three major subjects to talk about: 1) protein structure, 2) x-ray crystallography, and 3) scientists at the ASRC. All of the protein molecules were downloaded from the RCSB Protein Data Bank. I wanted to use 3D models for the x-ray crystallography process, but I realized I did not have enough time to make them myself. Therefore, I used images from a presentation that Eta had sent me. For the scientist spotlights, I went through the faculty page of the ASRC and read a ton of papers until I found two more faculty members (in addition to Eta) who work with protein structures. Once I had all the images and converted the files, I put them into my Reality Composer project. Then, I wrote captions for each one.

What my Reality Composer project looked like with the correct models and QR codes.

Finally, on Thursday of this week, I did user testing. It was a bit nerve-wracking, especially because I learned that I could not install the application on everyone’s phones. Apparently Apple only lets you install a development build onto three devices, and for some reason, I could only get it onto my phone and one other person’s phone. I even tried to sign up for the paid developer program ($100 fee…), but it would take 2-3 days to get approved and it was the morning of user testing. Ultimately, I decided to split the group into two and have everyone share the two phones.

The testing itself went pretty well! I was pleasantly surprised by how invested everyone was in finding all of the QR codes. Everyone was also quite impressed by the AR. The models stay still on the QR code, so moving around in real life allows the user to see a model from different perspectives.

Another part of the Thursday tour was my speech. I was invited to give a 30-minute speech to my cohort about something related to research. My topic of choice was “Surviving Academia 101.” This is something I feel pretty strongly about since I am still figuring out my path through academia. To be honest, it certainly was not the most well-rehearsed speech, but I think (and hope) that my passion for the subject made up for it. I talked about my experience with wanting to do research but not feeling like I belonged.

A photo of me giving a speech to my cohort about research.

Week 8: Saying Goodbye 🥲

Over the weekend, I started to write my final paper on Overleaf. I had already set up the general structure and some of the sections. Thanks, earlier me! The main things I did were updating my methods section and starting to look at the results. Although I did not get as much user data as I would have liked, I definitely had enough to consider my work a preliminary study.

On Monday, I went in to help test some of the other students’ projects. It was really impressive seeing what everyone else had accomplished in just 8 weeks! Dr. Wole also helped me with some questions I had about how to visualize my data. Later that night, all of us students met up for dinner. It was so much fun.

2023 VR-REU students dinner

For the rest of the week, I spent most of my time grinding out the paper and preparing my slides for the symposium. I honestly did not have too much trouble with using LaTeX. My real issue was finding the right words to summarize everything I did. There were parts where I wanted to overshare about the process (specifically to complain about all the problems I had run into). There were also parts where I had no idea what to write. Still, by writing a couple sentences and then switching when I ran into a mental roadblock, I began to make significant progress. Writing my paper while also working on the presentation helped a lot as well, since I could just take the information from the paper and simplify it for the presentation. Before long, the slide deck for my presentation was done.

My presentation on what I had spent the last 8 weeks doing. It was 10 minutes long with another 2 minutes for questions.

Thursday morning was the VR-REU Symposium. One by one, we presented our projects, talking about how our projects took shape, what challenges we had faced, and our results. Even though I’d seen everyone’s projects by that point, it was quite interesting to hear about how they addressed issues with their projects.

Finally, Friday arrived. Our last day! It’s hard to believe that time passed by that quickly. I went to our usual classroom in Hunter and finished up my paper. I submitted it to the ACM Interactive Surfaces and Spaces poster session. Then, we said our goodbyes.

For my last night in NYC, I went for a quick walk through Central Park and reflected. I’m so grateful that I got to be a participant in this REU. I’ve learned so much from Dr. Wole, Kendra, and my fellow students. I challenged myself with a project that was all my own, and I am very proud of how it turned out. Wishing everyone else the best with their future endeavors! I know you all will do amazing things!

Final Paper:
Sabrina Chow, Kendra Krueger, and Oyewole Oyekoya. 2023. IOS Augmented Reality Application for Immersive Structural Biology Education. In Companion Proceedings of the 2023 Conference on Interactive Surfaces and Spaces (ISS Companion ’23). Association for Computing Machinery, New York, NY, USA, 14–18. https://doi.org/10.1145/3626485.3626532 – pdf
