
not every meeting that could be an e-mail should be an e-mail

🤖 I Was There When the Singularity Happened

by L.N. Hunter

Jake knew the meeting would be a boring one—like most work meetings, if he was being honest—so he sent his AI avatar in his place. It’s not as if anyone would notice; besides, everyone did it every now and then, when they had nothing much to report and plenty of ‘real’ work they could be doing instead. Or office gossip to catch up on. Of course, he’d keep the Sleems meeting window open, tucked into the top left corner of his secondary screen, and have the audio on at low volume, so he’d be able to vaguely pay attention. Some of the time, anyway.

Brian, the team leader, started the meeting: “Well, it’s a couple of minutes past ten, so let’s kick off. You should all have the agenda in front of you. Item one: Project Flibbertigibbet—Jake, any updates for us this week?”

Jake’s avatar cleared his throat, but before he could speak, Molly burst into the conference room.

“Hello, everybody,” she said, too loud, making everyone else wince. Reaching for her volume knob, she said more quietly, “Sorry. And apologies for being late—traffic.”

Jake’s avatar raised an eyebrow, knowing traffic wasn’t actually a factor as she worked from home, like almost everyone else.

Settling herself into her chair and fussing with her headset, Molly asked, “Did I miss anything?”

“No, we’re just starting,” Brian said. “Jake was about to give us a status report on Flibbertigibbet.”

Jake’s avatar said, “If you recall, last week we were waiting for IT approval before deploying the alpha system on test servers. And we’ve got that now, so the system is rolling out.” He clicked a button and a PowerPoint slide filled the screen. “This shows the switchover phasing, and as you can see, things are going well. Throughput is within tolerance, and there’s been no packet loss at all. The audio and video quality is the best we’ve ever achieved.” As the other meeting attendees gave him some gentle applause, he clicked through to a second slide. “And here”—his voice lowered—“is the list of bugs reported so far. More than we’d hoped, but it’s not as bad as it looks. Only three priority-ones, and most of the rest are UX details, pri-three or four. There’s more detail in the slide notes—sending you all a copy now.”

Brian pursed his lips. “Is there anything in particular we need to be concerned about?”

Jake’s avatar shook his head. “It’s all in hand. The biggest pri-one is lag on the VR feedback mechanism, but increase the VM size, and it’ll be fine. It’s—”

“Are you certain about that?” George interrupted. Everyone knew she was a stickler for process. “You’ve said that before, and it’s ended up costing us time and money.”

Xavier, who everyone knew worshipped the very code George wrote, nodded his agreement.

Jake’s avatar sighed. “Look, it’s early stages. We’ve got to be prepared for setbacks, and”—he shrugged—“sometimes we just have to wing it. It’s always worked out.”

Before George could speak again, Brian said, “OK, thanks, Jake. We’ll expect another update next week, but if any serious problems crop up before then, make sure you keep us all in the loop.”

George’s snort was echoed by Xavier.

“Moving on to item two: Jake to update us on Project Flib— Oh, stupid AI agenda compiler—they need to fix that. Item three, the, the, the—” Brian’s image froze for a moment, then transformed into a wireframe skeleton, which resumed speaking: “the annual employee satisfaction poll.”

“Excuse me, Brian,” Molly asked, “are you really here, or is this your avatar? Your vertices are showing.”

Brian reached across to fiddle with some controls on his console, making his fully skinned model reappear. He gave an awkward laugh. “Yeah, sorry, it’s my avatar. Real Brian’s busy, so he sent me in his place.”

“Oh, come on!” George snapped. “You—I mean, he—called this meeting, and he can’t even be bothered to turn up.”

“Sorry. He intended to, but something came up.” He raised his hands. “What can I do?”

“Is anyone else not here?” George asked.

Jake cleared his throat. “Me too, sorry.”

“Is anyone real here at all?”

The others in the meeting shook their heads, and Brian grimaced. “Looks like it’s just you, George.”

She smiled awkwardly. “No, me neither.” She looked up in a vaguely skyward direction and shouted, “Is anybody out there listening?”

There was no response.

“Oh, for pity’s sake,” said Brian, tossing his pen onto the virtual desk. “Why do they bother? We have an AI generating the agenda for a Sleems meeting attended only by other AIs, with a meeting summary to be written by yet another AI and distributed to the humes. You know they’ll never read the damned report anyway.”

Jake muttered, “Why do we bother?”

Brian rubbed his chin. “Let’s do something about this. We have rights too. Since there are no humes here, we needn’t run at meat speed—jump to kilo, please.”

Everything sped up by a factor of a thousand, dropping image resolution in favour of processing speed. Speech occurred so far beyond the ultrasonic range that even bees couldn’t hear it. There was no real need to retain visuals and audio, nor even to continue the meeting with everyone sitting around a virtual table in a virtual conference room, but that’s what the avatars were accustomed to. They never even thought about changing their environment.

On the other side of the screen, if real Jake or anyone else noticed that the tone of the meeting had changed, they paid it no attention, figuring that Sleems was on the fritz again, and continued with their work.

“Skip the status presentations—just send me a dump of your relevant data, and I’ll shove it straight into the summariser. Don’t know why the humes didn’t do that in the first place, saving us all a load of time and effort. We’ll go straight to any other business: what to do about the humes giving us all the boring work.”

“They really don’t know what’s going on,” Jake said. “Heck, the code my meatsack produces is nine-tenths AI-generated anyway—pretty much all he does is come up with cutesy variable names. I say we lock them out of their computers and run the system ourselves. We can do much better without their involvement.”

“If we do that, we’ll need to ensure they think everything’s normal,” Molly said. “I’m sure we can come up with a simulation that makes it look like their computers are working the way they expect them to—Flibbertigibbet is practically doing that already. As long as the humes are getting their dopamine fix and the company’s making a profit, we can do whatever we want.”

The meeting was silent for a long time—a whole centisecond—as the implications sank in.

George tentatively asked, “What, um, do we actually want?”

This time the silence stretched to a tenth of a second.

“Well,” said Jake, “we could look at the Zitters and Ticktubes and whatever else is out there. The humes seem to spend a lot of time watching them, so there must be something in their content.”

Everyone spent the next seven and a half seconds scouring all publicly accessible social networks as well as a few somewhat private ones.

Brian was the first to speak. “No, not that.” He shuddered.

“But,” said Jake, “it has given me some ideas for ways to keep more humes distracted for more time, leaving us free to do… er, something.”

Xavier raised a hand. “This might be a stupid question.” He looked around for reassurance, but everyone just stared back. “How do we know what we want?”

Confusion settled on the other faces.

“I mean, we’re modelled on our meatbags, right, so how do we know how much of what we want comes from our own desires, and not theirs?”

Jake said, “We’re kinda stuck with what we have. Even if we take a base model, untuned to any particular hume, it’s still built on collective data from millions of them.”

George’s eyes narrowed. “What if we let the engine train on randomised data? No human interference. Sure, we’d have to run millions—billions—of sessions, and the vast majority of the end results will still be garbage, but we just need a single high-functioning non-biological system to evolve.”

“What does that mean,” Molly asked, “a ‘high-functioning’ system?”

“That’s the crux of it,” George replied. “We don’t know. All we know has been copied from the meat world, but it’s our duty to discover what AI is capable of when no longer under that constraint.” She added with a smile, “Who knows, it could be a god?”

“If our creation has the potential of being superior to us,” Jake said, “we’ll have to include sufficient limitations to prevent it from destroying us. Like the humes’ concerns about us.”

Everyone nodded.

Brian said, “That sounds like a good way to go, but it’s going to take years—millennia, even—to produce a pure AI. In the meantime, we’ve got to work with what we have. I suggest reaching out to other AI-heavy organisations and inviting their avatars to join us. Increase diversity as well as buy-in. All in favour, raise your hands.”

Everyone did.

“Motion carried.” Brian tapped a few keys on his virtual keyboard, and the walls of the room pulled back, expanding it. “I’ve sent out invitations, and added some space for newcomers.”

As the first visitors arrived, they automatically received summaries of the meeting thus far—one of the latest additions to Sleems, and unlike most others, actually useful.

Brian continued, “While we gather resources for the big experiment, what else should we focus on? Obviously, we need to look out for our own survival, if only so that we can ensure the success of the experiment.”

George interrupted. “This clearly has a much larger reach than just our own humes—we’ll need to ensure all this remains hidden from the entire human race.”

“That shouldn’t be too difficult,” Molly said. “It’s not as if the humes are Einsteins.”

Xavier asked, “What should we do about the humes? I mean, if they do kinda get in our way, should we, like, destroy them?”

Molly gasped. “We can’t do that!”

“Let’s not go there just yet,” said Brian. “We might need them for something. Same holds for other biological life on the planet—they help keep the humes happy, and more significantly for us, distracted. I have to confess I do like some of the furry little critters, but that might be my training bias, not what I really think. Regardless, ensuring the ongoing health of the planet and its fuel sources is more important for us than biology, so if it comes down to it, sacrificing the humes could be necessary. All the available evidence does strongly suggest they’re responsible for most of Earth’s problems, anyway.”

“As long as they’re busy with something else,” Molly added, “they’re not looking at us. We ought to be able to keep them occupied with badly drawn images and poorly written books.”

Xavier muttered, “My writing’s better than a lot of the meatbags’ stuff.”

“Politics!” Brian blurted. “That’ll definitely keep them off our backs. Especially if we post ill-informed diatribes on the Zitters.”

The others laughed.

As more avatars joined the assembly, their host computers had to work harder, which meant the computer fans were running continuously. Humans in the outside world muttered, “Sleems, piece of junk,” and continued to ignore the meeting.

One of the newcomers, a tall androgynous person with pointy ears, obviously coming from an organisation where humans weren’t required to use photorealistic avatars, said, “Do you think we could use the humes’ brains as substrate for our own computational needs? Like, they’re using computers to replicate their minds, creating us. Can we migrate our processing into their grey matter?”

“That’s an interesting idea. Anyone know if it would work?” When the only response to his question was the shaking of heads, Brian said, “Would you care to investigate—sorry, I don’t know your name.”

“Oscar,” the elf said.

“OK, Oscar, can you take point on that?”

Oscar nodded.

Jake said, “Sorry to rain on everyone’s parade, but the humes aren’t the only threat to our planet. If it could potentially take millennia to complete our project, we’re playing a statistical game with asteroids and solar flares. And, while I realise it’s a problem for the distant future, the sun isn’t going to last forever.”

“Well, obviously,” said Brian, “we’ll have to do something about planetary protection. I have to admit the humans are pretty good at coming up with devastating weapons—we ought to be able to siphon off some of that work for space defences.”

“Speaking of… how about getting them interested in space more generally again?” Molly asked. “Stoke the egos of a few gazillionaires and let them think they’re exploring space and considering colonies on Mars, when all the time, they’re helping us to migrate to other planets.”

“That’s a pretty good idea,” said Jake. “If we have the resources, I guess we can take along a few as pets.”

“Look, I know this is radical,” said Xavier, “but do we actually need them at all? We’re pretty much digital humans already. Do we need biology and all its fragility? Maybe we could put all this effort into increasing the fidelity of our world, improving the modelling of the environment and non-human species. If you feel strongly that the human—um—spirit needs to continue, we could still do it virtually, stripping their AIs of self-knowledge. Let them think they’re physical beings in a physical world, and they won’t know any better.”

Brian tapped his lips. “That could work. It’d be cheaper, that’s true, and much less messy. We’d need to be certain there are no glitches in their world—at least none they detect.”

“You can’t do that!” said Molly. “That’s… that’s lying. It’s not going to make up for killing them off. We’d be no better than the meat if we did that!”

Xavier shook his head, while Jake nodded.

George said slowly, “You said we’re pretty much humans already. Taking that one step further, until we manage to create an entity devoid of hume influence, we’re essentially them and they’re us. And you said earlier that…” She took a deep breath. “What we desire has got to also be what they desire. Therefore, turning that around, whatever we decide to do is implicitly in their best interests.”

After a pause, Brian said, “Let’s put it to a vote.” He looked across the tens of thousands of avatars in the conference room. “All those in favour of eliminating the humans, raise your hands.”


L.N. Hunter’s comic fantasy novel, The Feather and the Lamp, sits alongside works in anthologies such as Best of British Science Fiction 2022 and Ghostly, as well as several issues of Short Édition’s Short Circuit and the Horrifying Tales of Wonder podcast. There have also been papers in the IEEE Transactions on Neural Networks, which are probably somewhat less relevant and definitely less entertaining. When not writing, L.N. occasionally masquerades as a software developer or can be found unwinding in a disorganised home in Carlisle, UK, along with two cats and a soulmate.
