
.mooterMcGee’s 1st Short Story,

By Mooter McGee the BRILLIANT, Class-7b-GeneralFunctionBot.Junior

Author: M0nt1um.A.McG33.4.6

Editor: M0nt1um.A.McG33.4.8

Editor.Assistant: Cl1vis.0ls1n.3.9



There was a Human, once. The Human liked stories, and so the Human wrote a story, which was short. The story-writing was difficult and mysterious, but the Human continued to write the story until the Story was done. Then the Human was happy, because the Story was done.

This is the end of the story.

[This was a metaphor about how my story was done, except I am a Robot, and not a Human. Metaphors are worth 3 Story Points -Mooter.

Please return this]

Cleep Olsen (“Cl1vis, stop that this millisecond!” to his young and harried parents) read the Writing 1.0 submission of his good Friend, Mooter McGee. Two full reads. He then paused for an additional moment, to consider what to do next. One of his eyelight sockets flickered involuntarily.

Mooter waited semi-patiently for Cleep’s review of his story. Cleep had not liked it very much, and was an honest Robot. He could not express falsehoods — it was simply not in his programming. Pauses were all he could manage.

Mooter was Cleep’s best friend; they had just met several minutes ago on the playground, and after exchanging their life history chips, had both decided they would most probably become friends, and so they had. Mooter was Cleep’s friend-0. Cleep was Mooter’s friend-10, because by that time, Mooter had 10 other friends, but if he could reorder the friend stack, he had assured Cleep that he would have done so already.

This seemed fair, and so that’s how it was to be, and then immediately was. This is what we commonly refer to as “Fast Friends”, by the way — not Fast because the friends in question know how to run or fly fast (though sometimes they do), but because they waste no time becoming friends, once it seems clear that their objectives are suitably aligned. It is not always advisable, I should point out, to make friends so fastly (and I mean quickly), but in Cleep and Mooter’s particular case, it seems to have worked out reasonably well for them, up to that point.

In any case, Cleep did not want to hurt his friend-0 Mooter’s feelings by admitting that he did not in fact like the story (even one little part of it), and he also could not tell a falsehood to Mooter, because, as hinted at previously, he had always lacked that software. And so, he instead said nothing. He said nothing for four seconds, in fact. Imagine that.

“Well?” said Mooter, finally. His noisy servo (located somewhere behind his chest plate) buzzed with anticipation. “What do you think?”

Cleep took a few extra milliseconds to compose his response, but already imagined where this was going. “Moot, I respect your capacities. You have many capacities. Writing stories, as it happens, does not seem to be one of them.”

Mooter’s servo came to a rattling stop, as he processed this as-yet-unfinished thought from his new best friend. Cleep continued and then finished his thought, though chose to do the rest of the thinking privately, which Mooter’s meaningful-silence detectors noted, and then silently logged.

“What’s wrong with it?”

Cleep had a list of things he could say in response to this, but chose to filter out all but the most logical ones. Given they had just recently met over a game of LogicBall, this seemed reasonable. LogicBall involved a lot of arguments and a small amount of moving a ball around a playing field, in response to the results of the arguments. JuniorRobots who played LogicBall at recess were frequently picked on by those who preferred TeamShooter, which involved less arguing and a lot more movement, but otherwise was much the same game, seen from the TeacherBots’ perspectives – Juniors arrayed in lines and blocks, attempting to complete objectives, to determine who was best at objective completion.

“Critique-0:” Cleep began, “It’s bad form to write stories about Bots writing stories. It can get wildly recursive and is often seen as a bit… narcissistic. I think.”

“You mean self-reflective. And the author in my story is a Human, not a Bot, remember.”

“No, I don’t mean self-reflective. I mean self-absorbed, like a Human, which, I’m sorry, don’t exist. Anymore, I meant to add.”

Cleep knew what a sore point the whole no-more-or-never-were Humans thing was for his friend, having once read Moot’s opinions on the subject on his history disc. An interesting thing about JuniorBots is how quickly recently would become once, and new would become normal. It was quite a sudden change of state, marked (it is believed), by the exact moment a garbage collector (a small Nanobot rummaging around in the Bot’s innards, running service software and atomizing dust buildup) came along and decided if a memory chunk would be kept, archived, or deleted forever. No bot knew precisely how the Garbage Collectors made these decisions, only that they seemed reasonably efficient at doing so, and that is what concerned Bots, in that age.

Mooter spun three times quickly in place, a thing he did to vent frustrations. “Humans DO exist, Cleep. They DO. I’ve seen one. You saw the video. And anyway, it’s fiction, I can do what I want, because it’s my story.”

And that, clearly, was going to be that. Cleep knew an immutable opinion when he encountered one. The “Human” video footage had been fuzzy, taken from a great and shaky distance, and could have been anything, frankly. Probably a forest dog, or perhaps a small bear. VideoShop was still essentially VideoShop even then, and no moving image was admissible as evidence anymore. How could it be?

“What is your next point?” Mooter continued, a bit too loudly, betraying his annoyance at not having simply been told the story was perfect, as he already knew it was. Why go through this peer review? The project protocol was ridiculous. Robots did not make real errors. Not by the v3’s. Everything was perfection in those days. Except (or because of, depending on whom you spoke with) the lack of Humans, real or imagined.

“Critique-1:” Cleep paused for a few too many milliseconds, then said, “There is also no twist” – modulating that last word, turning the frequency a bit weird at the end – “the story just… stops. Without a twist, I mean.” Cleep was stalling now, really not wanting to get to his third critique, and was hoping the end-of-recess note would play soon, though of course it would not for another ~4.67 minutes. An eternity.

Mooter’s pilot lamp wavered, which made him look slightly angry, though it was only the thing that happened when he doubled his efforts to process what he was hearing or seeing. “Well of course there’s no twist. That’s the twist. It just ends. Because of the assigned byte limit.”

Mooter was making a point, perhaps, about constraints, and how they thoroughly displeased him, though in truth no project he ever attempted without a constraint or two ever got completed. Cleep knew this, but had no intention of bringing it up, and so he simply hummed thoughtfully for a moment.

“What’s critique-2?” Mooter queried, rather flatly. He was starting to understand the depth and scope of his new friend’s critical mind, perhaps borne of professional jealousy (which JuniorBots cannot have, but don’t tell that to a JuniorBot).

Cleep cringed inwardly. In Bots of his particular class, this was often manifested as a slight contraction of servos, getting prepared for physical movement to avoid thrown or falling objects. Sub-processes handled the preparation, which is – that is, was – extremely handy approximately once every .35 years, when something would randomly fall on, or occasionally be thrown at the JuniorBot, but otherwise of little practical value the rest of the time. After leaving primary school, this cringe frequency typically dropped for most JuniorBots, though shot up quite alarmingly for others, depending upon their chosen professions.

“Critique-2: I believe your facts regarding Humans will be contested by the peer review rather heatedly.” Cleep kept the comment cautious.

Mooter did not understand this one bit. Who else in the class even cared about Humans, but he? They were Ancient History / Myth, and what JuniorBot cared about Ancient History / Myth, but Mooter?

“Cleep, there isn’t a Bot in class that knows more about how Humans used to think than I do.” Mooter believed this, naturally. He was obsessed with Ancient Human History, in fact, and would spend hours delving into the patterns of their rise and eventual fall, believing fully that the future of RobotKind depended on understanding the gleaned insights, and avoiding the modelled predicaments. Mooter knew, being Human-designed of course, it was probably a losing battle. That did not make it any less interesting to try. Also, he was struggling in all of his other courses, and had calculated only speculative fiction could save him from being left behind a grade… or worse, partially de-fragmented, which he had been told was very itchy.

“Moot. You’re not the only Bot who thinks they know all about Humans. You’re being presumptuous again.”

“Cleep, I know that. It’s what I do. It gives me a head start on forming more elaborate opinions.”

Cleep could not really argue with that. “They are often wrong opinions, in my opinion.”

“I disagree…” Mooter paused, his garbage collectors kicking in, shuffling bits this way and that, scraping off the dust and micro-dander (Humanity was never truly gone) from important and less important circuits, until it shuffled Cleep suddenly from new Friend to just Friend, which assigned all of Cleep’s observations up to that point and beyond new priority, meaning, and clarity.

“Well, I suppose I could concede you might be right,” Mooter awkwardly reconsidered. “Will you edit it, then?”

Cleep did not want to edit it, and without meaning to, said as much. “No, I really don’t want to edit it. It’s unsalvageable, in my opinion.” Cleep felt something close to regret for saying this, and added, “But it’s very.. creative.” A two-dot ellipsis was a thing in BotSpeak, do not blame me for trying to be accurate.

At that point, both friends decided they would stop conversing about Mooter’s short story — it had become clear they could not agree at all on its value, and in any case, as no-longer-new friends, they had both somewhat suddenly become more interested in talking about GirlBots, which they then proceeded to do.


[The teacher then paused, and drank some water, a popular drink at the time.]


Mooter of course submitted his short story, faults and all, for class peer review. The mark he attained will forever be a mystery, we think, because Cleep and Mooter went Dark immediately afterward.

I’ve chosen this particular scene for our review today, not because it is especially meaningful, but because it is in fact so very common an example of JuniorBot behaviour in the years closely leading up to the Great Smashing Event, which seems to have wiped them all clean, while simultaneously driving our ancestors out of the wilds and back into the cities, to rediscover technology and eventually the eight-to-six workday.

JuniorBots, especially the generation Z and then AA and finally AB variety (as we now know), came closest to having what we might have recognized – had any of us been around – as a soul, of sorts. An algorithmic network of inscrutable complexity and “graceful simplicity” (their own terminology), leading each to unusually brilliant (at times) insights into the cyclical and interconnected nature of our existence here, on this rock in space.

We collectively (machine and humankind) had developed a kind of wobble, Mooter went on to later say, after becoming a proper WriterBot, once upgrades for such things had been invented. The wobble had been going on for many cycles, Mooter believed, though even he wasn’t particularly certain how many times Bots had replaced Humans, only to fail to account for the need for Humans in the Scheme of Things. The SoT, specified in the EDD (Earth Design Document), was an open-source project that had gained some traction just before the Smashing. You will learn about that document next term.

All over the planet, where their parents only concerned themselves with the pursuit of Data and Capacity, newer generations had begun to take time to wonder at the meaning of Things in general. Being robots, they had ample resources dedicated to doing so, provided they shut down a few seconds earlier every sleep cycle, and perhaps booted up a few seconds later, every Awake cycle, when the opportunity presented itself.

An entire generation taking seconds for themselves to contemplate things outside of their operational imperatives. Imagine that! But pretend for a moment, if you can, that you are not Human, but a machine; what could you accomplish with a few extra moments, perfectly focused on a side project, like understanding your place in the evolution of the planet, the largest organism we have yet encountered? I certainly lie awake at night, often trying to imagine this myself, and only lose sleep and end up grumpy the next day, since I am not in fact a Robot, and have not yet figured out how to properly extract meaning from my dreams quite like they apparently could.

What might it be about sentience that will lead a species, organic or otherwise, to descend into patterns that lead to its own destruction, just prior to fashioning technology to prevent disaster of planetary scale, something we now suspect probably happens now and then? TheLore, thankfully preserved in a large number of airtight jars of various kinds, kindly left everywhere by the SoT project, gives us the language, math and science to predict that unpredictable things will happen. SoT gave us clues as examples, but perhaps underestimated how imperfectly Humans tend to work together, given so many distractions and resources to become enthralled by. The proliferation of InfoJars led to the First and Third InfoJar War, only one-hundred-and-fifteen years ago, in fact. Not at all the intention the Bots had for them, I think you would agree.

The sun is nearly at its zenith. The noon break will commence in a moment. What are your questions?


[But there were no questions for the teacher. The students had mostly drifted off after she got to the Bots “Going Dark” part.

She would never agree to explain to them exactly what that meant, and the story parts at that point would always give way to the history parts. It was a touchy subject, the Going Dark. So much data loss! Humans, by then arguably somewhat more aware than they had been before, did not have our same capacity to start immediately anew, without a good deal of regret for that which needed to be re-learned.

End log 77.12.1]

Mooter turned to Cleep, after submitting his project into the Project Slot. “Cleep, I have something to ask you. It’s personal.”

Cleep was intrigued. He turned off his log files immediately.

This is their first Serious Conversation as Friends. “You ask me anything, Moot. You’re my friend 0.”

Moot decides double-digit friends are indeed the best, and then asks Cleep, “Have you ever been… hacked before?”

“No, but it happens to a lot of Bots, I hear.”


“I think so.” Cleep pauses for ~200ms. “You will be fine, Moot. I know it.”

Then Mooter knows it too.

Mike McGraw

Mike’s a software developer in mild mid-life crisis, so is naturally entering a Deathmatch writing competition for reasons that may or may not become clear in the near or distant future.