The New Yorker:

Video-game engines were designed to mimic the mechanics of the real world. They’re now used in movies, architecture, military simulations, and efforts to build the metaverse.

By Anna Wiener

On a warm afternoon last fall, Steven Caron, a technical artist at the video-game company Quixel, stood at the edge of a redwood grove in the Oakland Hills. “Cross your eyes, kind of blur your eyes, and get a sense for what’s here,” he instructed. There was a circle of trees, some logs, and a wooden fence; two tepee-like structures, made of sticks, slumped invitingly. Quixel creates and sells digital assets—the objects, textures, and landscapes that compose the scenery and sensuous elements of video games, movies, and TV shows. Its immodest mission is to “scan the world.” In the past few years, Caron and his co-workers have travelled widely, creating something like a digital archive of natural and built environments as they exist in the early twenty-first century: ice cliffs in Sweden; sandstone boulders from the shrublands of Pakistan; wooden temple doors in Japan; ceiling trim from the Bożków Palace, in Poland. That afternoon, he just wanted to scan a redwood tree. The ideal assets are iconic, but not distinctive: in theory, any one of them can be repeated, like a rubber stamp, such that a single redwood could compose an entire forest. “Think about more generic trees,” he said, looking around. We squinted the grove into lower resolution.

Quixel is a subsidiary of the behemoth Epic Games, which is perhaps best known for its blockbuster multiplayer game Fortnite. But another of Epic’s core products is its “game engine”—the software framework used to make games—called Unreal Engine. Video games have long bent toward realism, and in the past thirty years engines have become more sophisticated: they can now render near-photorealistic graphics and mimic real-world physics. Animals move a lot like animals, clouds cast shadows, and snow falls more or less as expected. Sound bounces, and moves more slowly than light. Most game developers rely on third-party engines like Unreal and its competitors, including Unity. Increasingly, engines are also used to build other types of imaginary worlds, becoming a kind of invisible infrastructure. Recent movies like “Barbie,” “The Batman,” “Top Gun: Maverick,” and “The Fabelmans” all used Unreal Engine to create virtual sets. In 2022, Epic gave the Sesame Workshop a grant to scan the sets for “Sesame Street.” Architects now make models of buildings in Unreal. NASA uses it to visualize the terrain of the moon. Some Amazon warehouse workers are trained in part in gamelike simulations; most virtual-reality applications rely on engines.