Metaverse Imaginaries Vignette #2: A Day in the Life of an Engineer, AI, AR and VR

Riz Virk
Feb 14, 2022

--

Figure 1: Source: Ars Technica

As part of my doctoral research on the metaverse, I am creating some sci-fi vignettes that can be used as “imaginaries” of the future of the metaverse. An “imaginary” is a term that social scientists like to use for a vision of the future, usually (though not always) rooted in science fiction, which starts out with a small group but then gets adopted by a larger collective (companies, governments, etc.) and eventually becomes a “sociotechnical imaginary.”

See Metaverse Imaginary #1 here. Each Metaverse Imaginary vignette has a specific theme and incorporates multiple threads (see below). I will also eventually add some commentary about the state of each imaginary vis-à-vis the startups and larger companies toiling to turn these imaginaries into reality. I’m putting these up for feedback and questions to spur discussion about the various future scenarios of the Metaverse.

Let’s jump right into the second vignette, with a short discussion afterwards. Note that this is a near-future imaginary and is more about the Future of Work, though not as extreme as some futures where entirely new jobs and careers are created in the Metaverse. As an example, in Ready Player One, Wade’s mother had two full-time jobs in the OASIS (the book’s fictional version of the Metaverse): one as a telemarketer and one in an online brothel. Rather, this vignette is about how the Metaverse might change the work of people whose existing jobs could make use of it. Other vignettes will get further into the Metaverse economy.

Using AR and VR technology at work (source: Shutterstock)

Vignette #2: A Day in the Life of an Engineer: AR, AI Assistants and more

Susan got up early for her engineering team meeting. Since the engineering team was spread across Buenos Aires, San Francisco, Bangalore, and Beijing, and corporate headquarters was in London, they had settled on GMT. Why was it that corporate, which was probably the smallest office, got to set a convenient time for the meeting? She brought up her Aivie, or AI Avatar, a personal assistant she had named Laure.

“Laure, can you get me a Chai Latte,” she said as she put her smart glasses on. Of course, Laure was hooked into the smart speaker in the apartment, but she preferred to see her assistant. The glasses were still in AR mode, and Laure appeared next to her in a not-quite-human form: a furry avatar that looked like a cross between a teddy bear and a cartoon character. She had set it so that only she could see Laure, so the avatar wouldn’t inconveniently appear in the middle of a work meeting.

“Yes, I have sent the order. Would you like to go pick it up?” asked the Aivie. When Susan paused, wondering if she had time to go down, Laure responded: “It will be tight to get there in time. I can have it delivered.”

“Very good,” muttered Susan as she got set up for work. She had dressed, but only in the generic cream sweatsuit she used while working at home. The cream color made it easy to mask, like a green screen in the old days of film and TV production. Since the coffee shop was just downstairs, she knew Laure would find a nearby courier; all they had to do was pick up the drink and bring it up to her apartment. Luckily, Laure could authorize her to come up when the front desk called, so the delivery wouldn’t bother Susan.

The drink arrived just before Susan’s avatar arrived in the 3D conference room. She had brought up her usual work avatar, which looked just like her but was dressed in more professional clothes. The landscape of the meeting looked like a cross between playground monkey bars and something out of an old Mad Max movie she’d seen when she was a kid; she thought it was called Thunderdome or something.

She was floating up on one of the monkey bars. Her boss, Herbert, the VP of Engineering, was standing in the center of the dome-like contraption with a big screen above him. Technically, the screens were only necessary when viewing the real world, but it was a metaphor left over from the old days when people met in conference rooms and used actual screens to project things.

The engineers on her sub-team, Hiram and Lisa, popped up near her, and the other sub-teams appeared in various parts of the 3D space, including the offshore team she worked closely with in Bangalore: Ranjan, Rajiv, and Preeti. At least that’s where they were supposed to be; physical location didn’t carry the same meaning anymore. She sent them private messages of greeting and set up a private chat group so they could talk to each other while the presentation started.

Joining Herbert was William, the lead product manager for their software, Automation 3.0. He brought up a 3D chart, floating in the air, of all the dependencies and milestones for getting their next release, 3.2, out the door. It was a major upgrade; they’d already been working on it for the better part of six months and were behind schedule. Herbert and William reviewed the major objectives. When her team’s turn came, their module zoomed up and, as the lead engineer, she floated to each of the specific points, updated the rest of the group on where things stood, and then turned it over to Ranjan to cover the challenge the offshore team was facing. It was all very routine.

Once the status portion of the meeting was done, Sharon, the VP of Customer Success, who was located on the West Coast of the US, appeared.

“Our software, as you know, drives the robots that manufacture cars for the new Tesla plant in China. We’ve been working closely with them, but there’s a problem. It started as a small glitch in the robots, and they traced it to our control system.”

She brought up a 3D model in a realistic subpatch that simulated the physics of Earth quite well. It showed a giant robot arm and a partially constructed Tesla car. “Susan, we need your team to figure out what’s going on. This has become a second-tier support issue and has halted their assembly line, so we need to give it top priority.”

Susan asked if the Beijing team had looked into it. Sharon responded that they were already on-site and indicated Chou, the support engineer there, whose avatar lit up, floating on the other side of the giant dome. They had only gotten so far, he explained, his voice booming across the entire dome. It was a strange trick used to amplify a voice so that everyone could hear; in a realistic sim, you could only hear those immediately near you, and sound faded with distance. They had tracked it down to her module. Susan agreed and asked Rajiv, the best debugger on the team, to come with her as soon as the meeting ended.

The meeting ended, and Rajiv and Susan found themselves inside the realistic subpatch, a sub-sim that was a digital twin of the Chinese Tesla plant. Since Chou and one other engineer were on-site, they could provide a video view, and once the assembly line was shut down, they could holo-project her there to help with support. While they waited for the next break in the production line, Chou simulated the procedure using the simulated robotic arm and the half-built car. Tesla had been one of the first car companies to use the Metaverse, building a 3D environment that was a digital replica of one of their plants using an old NVIDIA relic called the Omniverse.

When the realistic subpatches came up and Earthverse 2 went live, the Omniverse tech had been incorporated, extended by teams of developers from around the world. At any one time, thousands of engineers were upgrading Earthverse 2 and the metaverse in general; depending on which sim and verse you were visiting, things could change right before your eyes. In a realistic subpatch, though, live updates weren’t allowed and everything had to be scheduled, since the whole point was that the laws of physics should be simulated exactly. “Patch will be updated in 1 hour,” flashed a virtual billboard above their heads.

Standing with Susan and Rajiv was Jared, an AI troubleshooter that knew the interfaces between all the systems and could help diagnose the problem, though Jared wasn’t good enough to come up with real solutions. Chou had already been working with Jared, who looked like one of the old Doctor Who Doctors, no doubt a joke by Jared’s designers. Susan smiled; she knew one of the developers in London, and he was a Doctor Who fanatic.

A few minutes later, the plant had a production break, and Susan and Rajiv both shifted to projection mode. They now found themselves inside the actual physical manufacturing plant. Chou was no longer in avatar form. Rajiv looked the same, but Susan had chosen a different set of clothes for this visit: her standard “site visit” outfit and avatar. The others knew it was her. Susan now felt like she was actually in the plant, though her mobility was limited by the range of the portable holo-projector Chou had brought with him. Chou, however, could move around easily, and she could shift to his point of view (or at least the point of view from his glasses).

Jared was also there but didn’t need to be projected. Chou pointed out the problem. “Jared,” said Susan, “please show me the code objects for modules 7–11 and the interfaces.” Rather than opening a window of code, Jared popped up several holographic boxes, each representing a module, connected to one another by numerous lines, each representing an interaction or exchange of data.

Susan zoomed in on the lines between modules 7 and 8, using the hand controls with her left hand; her Chai Latte was still in her right. “Color-code and show the values,” said Susan, and the lines started changing colors. She could see text moving back and forth, too much and too fast to read. “Slow it down,” she said, zooming in further and, again with her left hand, selecting several of the lines to examine.

“I think I see the problem,” she said, taking a sip of her Chai Latte. “Let’s go back into the sim, try out a fix, and see what that yields…”

Things to Think About

I’m putting these up for feedback and to spur discussion, so feel free to leave comments. Some questions to think about include:

  • How far in the future is this vignette (1 year, 5 years, 10 years, or more)?
  • What are some of the technologies that we already have, and what are some of the technologies that still need to be developed?
  • Is the development a simple matter of better engineering, or does it require any new breakthroughs?
  • Most of us are used to working remotely via Zoom; how does the experience change if you are in a virtual space with other avatars? What kinds of issues come up in a work environment?
  • How does this vignette reflect on issues of identity in the “real” world and among avatars? Would you have different sets of clothes for your avatar to wear in different work-related environments?
  • How about this idea of being in AR/VR and projecting your avatar onto a physical location? Would others there be able to see you? Would you be able to see others?
  • Consider that the metaverse could be a set of very different use cases, each with its own physics engine, but linked together via the avatar as the critical component. What would it take to make this a reality?
  • How is the idea of a realistic sim different from the metaverse environments most of us envision when we think of working in the metaverse? How good would the laws of physics have to be? How does this relate to NVIDIA’s Omniverse or other simulation software?
  • What is the role of AI and virtual humans in the metaverse? In this vignette, we have Jared, an AI assistant that is not quite AGI (artificial general intelligence) but is an expert on coding, and in fact might be able to pass the Turing Test. Is it possible that AI assistants will know more about our technical systems and be able to debug them faster, or in conjunction with humans? In the context of work, what does a “centaur” (a term for a human-AI team) provide that neither the human nor the AI could provide alone?
  • In this vignette we also have Laure, a personal AI assistant. If we use the Metaverse for work, what elements can we bring from our own lives, such as our assistants and the data that knows about us? Would it be the same as installing our own software on a work laptop? Or are we talking about something completely different?
  • How does this relatively short imaginary relate to past imaginaries (TV shows, movies, or books), and how does it measure up to the various visions of the Metaverse being put forward by companies like Meta, Microsoft, NVIDIA (Omniverse), Epic, Unity, Decentraland, Upland, The Sandbox, and others?

--

Riz Virk

The Simulation Hypothesis, Play Labs @ MIT, Startups/VC, Sci Fi, Bitcoin, Consciousness, Space, Video Games: visit www.zenentrepreneur.com