Personal Project

The Maryland Anthropocene

An Immersive AR Installation Exploring Invisible Ecological Consequences

Skills: AR Development, 3D Modeling, Unity, Visual Design, Fabrication

Project Details

Team: Individual Project (with mentorship from Jonathan Martin)

Timeline: May 2025 - September 2025

Tools & Methods: Unity, Vuforia, Blender, Adobe Illustrator, Laser Cutter, Photogrammetry

Role: AR Designer & Developer, Fabricator

Overview

The Maryland Anthropocene is an immersive augmented reality installation that explores how human infrastructure leads to unexpected ecological consequences. Through a combination of physical models and AR technology, visitors discover how invasive species are introduced and thrive in Maryland because of our built environment and daily choices.

Showcased at the University of Maryland's NextNOW Fest 2025, this installation combined laser-cut wooden models, AR animations, spatial audio, and narrative storytelling to reveal the invisible connections between human activity and environmental impact.

Video: Full installation at NextNOW Fest 2025

The Challenge

Most environmental effects from human activity are invisible to us. Organisms and chemical processes exist in places we can't see, making us unaware of how our actions directly shape the ecology around us.

I wanted to create an experience that would reveal these hidden connections and help people understand their relationship with the natural world in a tangible, memorable way.

How do you make invisible ecological processes visible and personally meaningful?

Research & Concept Development

The inspiration for this project came from the Feral Atlas, a project that investigates the connections between nonhuman entities and human infrastructure. I started by diving into the Atlas and researching feral effects specific to Maryland and the Chesapeake Bay region. I was drawn to augmented reality as the medium because it can ground fantastical discovery in physical reality, letting you see what's actually around you while revealing the invisible.

Image: Early sketches and concept drawings

The concept evolved into a miniaturized landscape that visitors could walk around, discovering organisms and processes through AR as they moved. Each discovery would be tied to a specific structure or area, creating a narrative journey through the installation.

Image: Mood board and inspiration

Narrowing the Scope

My initial proposal was too ambitious for my timeline, with too many species and feral effects that didn't connect cohesively. After discussing with my mentor Jonathan Martin, an XR specialist in the Immersive Media department at the University of Maryland, I refined the concept around a central theme: organisms transported across the sea that become invasive in new environments.

Four Species, Four Ecosystems

  • Rats: urban environments
  • Cats: suburban/rural areas
  • Emerald Ash Borers: wooded areas
  • Zebra Mussels: waterways

This focus allowed visitors to spend meaningful time with each species, learning how they arrived in Maryland, why they thrived due to human behavior, and (most importantly!) how individuals can make small changes to lessen these impacts.

Technical Implementation

Building the Physical Environment

I designed the landscape elements in Adobe Illustrator and fabricated them using a laser cutter. Starting with cardboard prototypes let me experiment quickly and cheaply before committing to the final walnut wood models.

Image: Cardboard prototypes and laser cutting process

The physical installation included buildings, houses, trees, and a reservoir, all arranged on a table with bright blue iridescent fabric representing water. Each structure contained internal lighting that made it glow softly, as if alive and inhabited.

Image: Final wooden models and table setup

The AR Tracking Challenge

Getting the AR tracking right was the most technically demanding part of this project. Poor tracking breaks immersion immediately: the AR looks "off" and users disconnect from the experience.

I experimented with multiple photogrammetry approaches in RealityScan, from professional turntable setups to just my smartphone. The challenge became finding the balance between model quality (needed for good tracking) and file size (the tablets hosting the AR had limited RAM).

Video: Testing AR tracking performance with a prototype model
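
Keeping that file-size constraint honest is something you can automate on the Unity side. Below is a minimal editor sketch of the idea, flagging oversized scans at import; the class name and the 50,000-vertex ceiling are my own illustrative choices, not the project's actual numbers.

    using UnityEngine;
    using UnityEditor;

    // Editor-only sketch (place in an Editor folder): warn when an imported
    // scan exceeds a vertex budget. The budget below is a placeholder; the
    // real ceiling depends on the tablets' RAM and how many targets load at once.
    public class ScanBudgetCheck : AssetPostprocessor
    {
        const int VertexBudget = 50000; // hypothetical per-model ceiling

        void OnPostprocessModel(GameObject model)
        {
            int total = 0;
            foreach (var filter in model.GetComponentsInChildren<MeshFilter>())
            {
                if (filter.sharedMesh != null)
                    total += filter.sharedMesh.vertexCount;
            }

            if (total > VertexBudget)
                Debug.LogWarning($"{assetPath}: {total} vertices exceeds the {VertexBudget} budget");
        }
    }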

Workflow for optimizing 3D models for AR tracking:

  1. Scan models with smartphone using RealityScan
  2. Import scans into Blender to clean up geometry and reduce complexity
  3. Test tracking performance on AR tablets (see the debug sketch after this list)
  4. Iterate until tracking is solid but files remain lightweight
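
For the tablet tests in step 3, a small debug component can surface the tracking state in real time. This sketch assumes the Vuforia Engine 10+ observer API; the class name is my own.

    using UnityEngine;
    using Vuforia;

    // Debug sketch: attach to the same GameObject as a target's
    // ObserverBehaviour and log every tracking-status change during tests.
    public class TrackingDebug : MonoBehaviour
    {
        ObserverBehaviour observer;

        void Start()
        {
            observer = GetComponent<ObserverBehaviour>();
            if (observer != null)
                observer.OnTargetStatusChanged += OnStatusChanged;
        }

        void OnStatusChanged(ObserverBehaviour behaviour, TargetStatus status)
        {
            // Status values include TRACKED, EXTENDED_TRACKED, LIMITED, and NO_POSE.
            Debug.Log($"{behaviour.TargetName}: {status.Status} ({status.StatusInfo})");
        }

        void OnDestroy()
        {
            if (observer != null)
                observer.OnTargetStatusChanged -= OnStatusChanged;
        }
    }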

This took significant trial and error, but the final result tracked beautifully. At NextNOW Fest, multiple visitors commented, "I've never seen AR that tracked so well" and "I kept forgetting the AR wasn't really there."

Animation & Directing Attention

Because users experience the installation in 3D space, they can look anywhere at any time. Early testing revealed that people would miss key animations or get confused about where to focus. I needed to intentionally guide their attention through the narrative.

My solution combined several techniques:

  1. Movement cues: animals moved toward the next focus area after their section completed
  2. Environmental elements: AR objects appeared only during relevant story moments
  3. Spatial audio: 3D sound design that shifted with the area of focus (busy street sounds for urban areas, birds chirping in wooded ones); see the setup sketch after this list
  4. Sound effects: emphasized key animation moments to draw attention
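
To give a sense of the spatial audio setup: each area's ambience can sit on its own 3D-positioned AudioSource so the sound swells as the tablet's camera approaches that zone. The sketch below shows the general idea; the distance values are illustrative, not the installation's actual settings.

    using UnityEngine;

    // Sketch: one looping, fully 3D ambience source per area (street, woods,
    // reservoir). Assign the area's ambience clip to the AudioSource in the
    // Inspector; volume then falls off naturally with distance across the table.
    [RequireComponent(typeof(AudioSource))]
    public class AreaAmbience : MonoBehaviour
    {
        void Start()
        {
            var source = GetComponent<AudioSource>();
            source.spatialBlend = 1f;                          // fully 3D
            source.rolloffMode = AudioRolloffMode.Logarithmic;
            source.minDistance = 0.2f;                         // full volume up close
            source.maxDistance = 2f;                           // fades out across the table
            source.loop = true;
            source.Play();
        }
    }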

I used Unity's timeline system to orchestrate these elements with precise timing. I also added captions synced to the audio narrative, which required custom scripting to control correctly.
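
A minimal sketch of how captions can be kept in sync with the narration; the structure and names here are my own, not the project's actual script.

    using UnityEngine;
    using UnityEngine.UI;

    // Caption-sync sketch: show the latest caption whose timestamp has passed
    // in the narration clip. Timestamps are authored by hand to match the audio.
    public class CaptionSync : MonoBehaviour
    {
        [System.Serializable]
        public struct Caption
        {
            public float time;          // seconds into the narration
            [TextArea] public string text;
        }

        public AudioSource narration;
        public Text captionLabel;
        public Caption[] captions;      // must be sorted by time

        void Update()
        {
            string current = "";
            foreach (var caption in captions)
            {
                if (narration.time >= caption.time)
                    current = caption.text;
                else
                    break;
            }
            captionLabel.text = current;
        }
    }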

Audio Narration

Rather than using computer-generated voices, I hired a voice actress to record the scripts. This added a warm, human quality that visitors really connected with. The audio went through multiple iterations: early versions ran too long and lost users' attention when there wasn't enough happening visually. I had to balance sharing meaningful information with maintaining engagement through movement and sound.

Visual Design

The visual style evolved through several demos and critiques to create cohesion between the physical and digital elements.

Color Palette

The dark walnut wood models established a neutral, earthy foundation. I paired this with a beige tablecloth, light enough for AR elements to pop but subdued enough to feel natural. Internal lighting in each structure created soft, warm glows.

Video: Final models up close

AR environmental elements followed this same beige and brown palette with pale glowing accents. This restrained, near-monochrome approach let the physical and digital blend together, making users forget what was real and what was AR.

Against this neutral backdrop, two elements controlled attention:

  • The bright blue water fabric, which connected everything and emphasized that all four stories involved ships moving across the globe
  • Each species' vibrant signature color, which grabbed focus and made its story memorable

Image: AR overlay on physical models showing color integration

Graphic Design

I designed a logo using the font Kiona, a simple, structured, modern font that echoes the infrastructure we build. The logo includes a graphic flourish of buildings, houses, trees, and boats running through the typography like the blue river of the installation.

Image: Logo and poster design

The poster features the four central organisms on a water-like background matching the installation's iridescent blue fabric. The animals are rendered in monochrome white to let the blue dominate and tie everything together.

Results & Reception

The Maryland Anthropocene was showcased at NextNOW Fest 2025, where hundreds of visitors experienced the installation.

Image: Visitors experiencing the installation at NextNOW Fest

What I Learned

This project pushed me in lots of ways I didn't expect. Technically, I became far more confident diagnosing and solving complex problems after spending hours (and sometimes days) optimizing 3D scans, debugging Unity timelines, and cutting new prototypes.

More importantly, I learned how to direct attention and craft experiences in 3D space, which was new for me as a mostly digital interface designer. Every small detail matters, like the color of a light, the direction an object moves, and the timing of a sound effect. These micro-decisions combine to create immersion and guide people through a story without them realizing they're being guided.

I put months of work and a lot of heart into The Maryland Anthropocene, and the visitor response reinforced the value of all those small details. Many people commented on how the project looked or sounded, but I especially loved when they talked about feeling genuinely more connected to their local environment and community.

The project highlights destructive human processes and their invisible consequences. Many of these systems are too large for any one person to change, but small individual actions can still go a long way. The goal of The Maryland Anthropocene was always for visitors to leave feeling both more aware and more hopeful about their relationship with the world around them.

You can watch this full demo of the installation to see how everything came together!

Check out more of my projects!