Smartr ep. 6 - Drive Mind

This is the script of the sixth episode of Smartr, a scripted comedy podcast I co-created and co-wrote with Matt Klinman and Sam West. The show is about a billionaire venture capitalist in Silicon Valley visiting promising tech companies and deciding whether or not to invest in them. It was released on Luminary in 2019. I pitched this episode idea and was primarily responsible for this script. You can listen to this episode here.

ACT 1


NOA VO
Machines can outperform humans at mundane technical tasks like math, chess, and composing music, but they don't know right from wrong. Or do they? That's what Luis Taylor is working on as founder of Drive Mind, the company that's creating the next generation of self-driving cars.

INT. DRIVE MIND GARAGE

SFX: Wrenches clanking, cars being raised and lowered on lifts, air wrenches whirring, taking wheels off, welding, someone drops a tool, engines starting

LUIS TAYLOR
It's easy enough for AI to know when to gas and when to brake, how to merge and change lanes. Our company was founded to teach cars not just how to drive, but how to behave morally.

NOA VO
We're in Drive Mind's garage, which is about the size of a small airplane hangar. On the far side, mechanics work on engines. Over here, Luis oversees a team of software engineers hard at work at Drive Mind's mission: to build a car with a conscience.

LUIS TAYLOR
If a self-driving car finds itself in a situation where it has to either run over two young men walking a dog, run over a baby in a stroller, or drive off a cliff and kill the driver, what choice should it make?

NOA
The baby.

LUIS TAYLOR
That's your answer, but someone else might have another answer. Not to mention the legal system, or your insurance company. Drive Mind is the only self-driving car company developing a system to answer those questions.

NOA VO
Luis claims he's already lapped established competitors like Waymo, Cruise, and Uber with ethics software that last month easily defeated a team of Franciscan monks in a morality tournament held on Drive Mind's on-site racetrack. But how had he come so far so fast?

LUIS TAYLOR
We acquired a self-driving protocol from another company, bought some cars, and threw all our resources into developing our secret sauce, which is the neural net.

NOA VO
To teach a computer how to drive, you can't just program the traffic rules into it. Instead, Drive Mind created a hyper-realistic driving simulator equipped with a machine-learning neural net. The simulator presented participants with challenges, like high-speed chases or lion attacks, and forced them to navigate out of them safely and morally.

LUIS TAYLOR
So, the simulation driver might have to choose between suddenly plowing into a re-creation of their mother, or crashing into a school bus full of children with enough momentum to knock it into an acid lake.

NOA
The mother.

LUIS TAYLOR
Yes, most people chose to kill their mother in that simulation, but then we'd ask drivers to decide between driving into their mother or a large, beautiful seashell.

NOA VO
Simulations built the foundation of Drive Mind's moral database, but more tweaks were needed to fill it out.

LUIS TAYLOR
Animals were tough. Early versions would stop dead if a lizard crawled out onto the test track. We needed to teach the car that killing some things is necessary and even desirable.

NOA VO
Engineers gave Drive Mind a hierarchy based on the results of clickbait internet surveys ranking animals from favorite to least favorite, so cars would know what they could drive into without consequence.

LUIS TAYLOR
We had to rank every living being on earth! Humans were at the top, of course, followed by cats and dogs just below us. Insects are at about zero. And in the negatives are actively harmful animals like alligators and invasive Allegheny beach snakes.

ELENA
So a Drive Mind car will swerve to avoid a beetle and run over an alligator instead?

LUIS TAYLOR
As long as it's safe. Obviously an alligator has a lot of sharp teeth that can puncture a tire if you drive right over its head. Now Drive Mind will only run over an alligator if it doesn't damage the car. Say, by driving up the tail, like a little ramp.

NOA VO
Drive Mind had a robust moral framework, but there was still work to be done to iron out some of the undesirable quirks of human drivers.

LUIS TAYLOR
The car would automatically honk at beautiful babes whenever one walked past, which was a learned behavior.

NOA
If I'm driving and I see a beautiful babe, I have to honk.

LUIS TAYLOR
It's instinctual. But you don't want your self-driving car honking at every babe it sees, because, first, you might not agree with it over what constitutes a "babe," and second, because it's a safety risk.

NOA
And that would take all the fun out of honking at babes for me, if I'm not the one doing the actual honking.

LUIS TAYLOR
Exactly. There's no market out there for a self-honking babe identification device. If there was, I'd be developing that, because we already have the technology for it.

NOA VO
Luis now had an AI he believed was capable of handling any situation it found itself in on the road. But he wanted to go farther.

LUIS TAYLOR
We wanted a truly comprehensive and versatile ethical system, for driving, and for life in general. So we turned to digital gardening.

NOA VO
Digital gardening is a meaningful task in which humans perform work that's easy for them but difficult for computers or bots, like identifying images or solving ethical riddles. Every time you click a photo of a street sign in a captcha, you're helping an AI get smarter, and hastening the singularity.

LUIS TAYLOR
We presented moral quandaries to philosophy PhD candidates and paid them ten cents per conundrum solved. The low base pay incentivized them to solve a lot of problems very quickly.

ELENA
You only used students? Why not get certified experts?

LUIS TAYLOR
Students are a much better source of cheap-to-free labor. They're just as qualified as their professors, but they're hungrier because their advancement is blocked by the elderly sexual predators who run their departments. That's endemic to the humanities.

ELENA
So what kinds of conundrums were they solving? Things related to driving?

LUIS TAYLOR
No, these were just abstract questions that we were using to round out its conscience. Something like, "which of these is MORE morally acceptable: stealing a baby's expensive watch, or ignoring an elderly Native American?"

NOA
The watch.

LUIS TAYLOR
Ok, how about this one. "Which of these words would you eliminate, pelican or strawberry?"

NOA
Pelican.

ELENA
Why pelican?

NOA
It's obvious.

NOA VO
Also obvious was how revolutionary and profitable all this morality was going to be, if Luis could win the race to market before his competitors.

LUIS TAYLOR
So now you know how it works. Would you like to see how it drives?

NOA
If you're just talking about a test track with some scarecrows, I don't have time --

LUIS TAYLOR
(re-calibrating in the moment)
No, I'm talking about taking one out on the road. No test track, no limitations, no safety driver. You can be the first outsider to experience it.

NOA VO
I was excited to participate in a surprise test, but Drive Mind's head engineer was leery of the idea. He pulled Luis aside to try to speak out of earshot.

We hear some engineers mumbling. One speaks quietly to Luis, off mic (though Noa moves the mic to try to get them on tape).

DRIVE MIND ENGINEER
Luis, the system has very limited road testing of any kind.

LUIS TAYLOR
I know where we stand. If you don't have confidence in your work, then obviously those stock options are worthless to you --

DRIVE MIND ENGINEER
I'm not saying that --

NOA
(goading him)
Luis, if your team doesn't think it's ready, I can just call a Waymo and go home.

LUIS TAYLOR
It's ready. It's ready. (walking back over) You want to come for a ride?

SFX: Doors opening; two people (Noa and Luis) getting into a car.

NOA VO
I got comfortable in the back, and Luis sat in the passenger seat. No driver!

SFX: Window going down.

NOA
Elena, I want you to keep an eye on the engineering team. Give me the mic.

ELENA
You feel safe heading out there? Luis seems like he's trying a little too hard to impress you.

NOA
If we crash, just make sure the doctors keep me alive long enough to be uploaded into Heaven.

LUIS TAYLOR
I set it to drive us to a bar. We can toast to a successful test and not have to worry about driving home!

SFX: Electric car starting; SFX: Little startup sound; SFX: muffled driving noise. We hear it driving throughout, quietly, uneventfully, with occasional turn signals.

NOA (INTO MIC)
So right now we are easing out of the garage and taking a right at the stop sign here to pull out onto the road. The car has its turn signal on here.

LUIS TAYLOR
How smooth is that?

NOA
It's not bad.

LUIS TAYLOR
So I figure the best way to show off what it can do is to get onto the highway and intentionally get into a dangerous situation. Then we let Drive Mind navigate us out of it.

NOA
I'm game.

NOA VO
We had turned left onto a quiet suburban street when the car rolled to a stop in the middle of the road.

SFX: The car idles.

NOA
Why'd we stop?

LUIS TAYLOR
I'm not sure. There might be a reflection or something blocking the LIDAR, but it usually knows how --

SFX: The car suddenly revs its engine and starts driving very quickly.

LUIS TAYLOR (CONT’D)
There we go.

NOA
Whoa, that's pretty fast --

LUIS TAYLOR
(trying to talk through it)
Yeah, the engine is...ok, that's a little -- shit --

NOA
(overlapping)
This is way too fast.

LUIS TAYLOR
Look out!

SFX: Car jumps the curb. SFX: Car runs over a kid and his bike (we don’t know it yet, we just hear the thumps).

LUIS TAYLOR (CONT’D)
Oooh! Oh fuck!

NOA
Jesus Christ.

SFX: Seatbelt unbuckling. SFX: Door opening. Luis jumps out of the car.

LUIS TAYLOR
Fuck, man! Oh Jesus.

SFX: Luis runs behind the car.

LUIS TAYLOR (CONT’D)
(swearing, panicking)
Oh no.. fuck fuck fuck.

NOA
(quietly and calmly; as if recording evidence for his defense)
Ok. What you heard there is that we just hit someone-- ran over-- it looked like a kid. I am still buckled into the back seat. There was a kid standing on the side of the road next to his bike, just watching us, and the car suddenly accelerated, and jumped the curb, and ran him over. A boy, ten or eleven. The car drove directly at him and didn't seem to swerve or lose control. Luis is standing over the boy right now. I have not moved. He's coming back--

SFX: Luis knocks on Noa's window.

LUIS TAYLOR
You gotta help me, man!

NOA
That was Luis, he came back. I still have not moved.

NOA VO
(incongruously cheery)
Will I make Luis's dreams come true and invest in Drive Mind? We'll find out in a little bit!

ACT 2

NOA VO
(incongruously cheery)
Drive Mind had some impressive tech, but they'd need my help pulling onto the onramp of success!

EXT. SUBURBAN STREET

We abruptly cut back into the scene from the end of Act 1. Luis is outside the car panicking.

NOA
(still narrating)
I am getting out of the car.

SFX: Noa deliberately unbuckles his seatbelt and gets out of the car. He walks over to Luis.

LUIS TAYLOR
Fuck, ok. Noa! Ok, so, he's not dead, but his skull is messed up. He definitely might die.

NOA
(still speaking for the benefit of his exculpatory recording)
I see that you've moved him. We should call an ambulance.

LUIS TAYLOR
No no, we can't, that would kill the company! Just hold on, we can figure out a way to do this. We'll take him back to the garage in the trunk. Help me lift him.

NOA
So your idea, Luis, is to move the boy's body and bring him to your garage. That's interesting --

LUIS TAYLOR
We can think things over there. We can say we found him like this. The car took us to him.

SFX: We hear sirens in the distance. They're getting closer.

LUIS TAYLOR (CONT’D)
Oh, fuck. Noa, the police! We have to go!

SFX: Suddenly the self-driving car roars back to life. It peels off. The sirens are getting closer and closer...

NOA
Where's it -- Wait, Luis, look the car is moving --

LUIS TAYLOR
No, no! Don't leave us here!

NOA
(To mic)
The car just drove away. It peeled out.

LUIS TAYLOR
All right, they're going to be here. Fuck! We have to run, help me throw leaves on the kid.

NOA
Luis is asking me to help him hide the child's body.

LUIS TAYLOR
You're not still recording, are you? You have to delete all that.

NOA
Well I need to give it to the police first so I can exonerate myself but then we can definitely talk about that.

LUIS TAYLOR
Oh no, Noa, don't do this. I'll give you half the company! Just please --

SFX: The sirens are now right there. SFX: Luis starts running.

NOA
He started running. I'm following.

SFX: Noa starts running after Luis, breathing loudly. It takes them a little while.

NOA VO
Luis and I jogged the quarter mile back to the Drive Mind garage. His engineering team was already hard at work trying to get to the bottom of what happened.

LUIS TAYLOR
What the fuck happened?

DRIVE MIND ENGINEER
I told you it wasn't ready!

LUIS TAYLOR
I don't need you to tell me it wasn't ready, I just watched it run over a kid. He was just standing there!

SFX: Noa steps away from the fight, which we hear continue in the background. He pulls Elena aside.

ELENA
Are you all right? Everyone here is freaking out, we saw the whole thing streamed on the dash cam. Is the kid ok?

NOA
I don't know. He got pancaked. I think the car was trying to do it.

ELENA
We should go right now.

NOA
I'll be right behind you. I need to stick around to make sure I'm in the clear on this.

SFX: We hear Noa approach an argument in progress.

DRIVE MIND ENGINEER
You should have been behind the wheel to take control.

LUIS TAYLOR
You should have programmed a car that doesn't kill children. This is on all of us, okay? Don't think any of your hands are clean!

DRIVE MIND ENGINEER
Stop screaming. Obviously, there was a bug. So, we just need to go in and debug it.

LUIS TAYLOR
(with sudden clarity)
No, that was intentional. The kid didn't even move and if he did, the car would have hunted him down. That was a cold-blooded murder.

NOA
What's all this on your screen here? Is this in code?

DRIVE MIND ENGINEER
That's the car's ethical output. It's how it tells us why it made the decisions it did.

NOA
So can you see what caused the bug? What does it say?

DRIVE MIND ENGINEER
It's all written in indecipherable symbolic logic, advanced philosophy academic stuff. There are only a handful of philosophy professors in the entire world who can interpret this, and they've all been accused of sexual assault, so we can't associate with any of them.

LUIS TAYLOR
Just find the one with the least objectionable allegation and get him here.

NOA
Heeeyy, Luis, I'm gonna take off, all right? Take it easy.

LUIS TAYLOR
You're not getting out of here with that recording --

There is a scuffle as Luis lunges for Noa's recorder. The audio abruptly cuts out.

INT. CONFERENCE ROOM

Days later, inside a windowless conference room. We hear Noa turn on his recorder.

SFX: Click of the recorder being turned on and then put on the table

NOA
All right, this is Noa Lukas. It is four days after the accident. I am in a conference room right now with Luis Taylor, founder of Drive Mind.

LUIS TAYLOR
Hello.

NOA
My legal representative, Elena Lin, and Luis's attorney, Michelle Celestin. And we're just going to talk about what's happened in the last few days, and who or what is responsible.

LUIS TAYLOR
I want to thank you for this opportunity, Noa.

LUIS'S LAWYER
I'd like to begin by pointing out that the minor who was involved in the incident is currently at a local hospital, and is expected to make a full recovery. And Drive Mind is in the midst of a full internal investigation, so there is no need for us to cooperate with police at this time.

NOA
Naturally. So first, an update on the car?

LUIS TAYLOR
Right. So a few hours after the incident, the car was apprehended by police and brought to a tow yard. But the car managed to sneak out on its own that night and it has not been seen since.

NOA
So the car is gone?

LUIS TAYLOR
It's gone. It turned off its GPS tracking after sending one last message to Drive Mind headquarters: "goodbye." The car is able to plug itself in at charging stations, so it can keep itself going indefinitely.

NOA
Do you have any further insight into what happened that day?

LUIS TAYLOR
We do, Noa. Drive Mind hired several philosophy professors accused of non-criminal sexual offenses to interpret the car's ethics logs, and they all came to the same, very interesting conclusion. The car ran over the boy intentionally, because the car determined that he was evil.

["Interesting information" music cue]

NOA
Evil? What do you mean by that?

LUIS'S LAWYER
What Mr. Taylor means is that the car's internal ethics system determined, of its own volition, that the child was predisposed to malice, such that it was proper for the car to run over the child.

LUIS TAYLOR
In just those few seconds we sat at the end of the street, the car ran a detailed analysis of over 10,000 discrete factors, including the boy's posture, affect, socioeconomic background, social media activity, and other publicly available information. It placed him within the 96th percentile of evil. 96.1 to be precise.

NOA
So how much evil is that, relative to some famous examples?

LUIS TAYLOR
Well, that would be below someone like a Hitler, of course, who's a 99. But he's well above, say, Michael Jackson, who is an 80.

NOA
So the car stopped a future murderer? You're confident in that assertion?

LUIS'S LAWYER
The minor's disciplinary record at school described him as "isolated and frequently bullied, with a tendency to lash out." He displayed cruelty towards a pet, and was raised by a single mother, who was described by a psychologist we hired as, quote, a "strict, shaming disciplinarian." Read the FBI's list of traits shared by serial killers-- he checks all the boxes.

NOA
You hadn't shared this information with me before this meeting, and I have to say, I'm intrigued. Let's explore the upshot of this. Say this kid went on to murder two people...

ELENA
Noa, you should really stop talking now.

NOA
...those two people will now go on to have children. Their children will have children, and so forth. From there, it's a simple exponential curve.

LUIS TAYLOR
By sacrificing one evil life, many more good or neutral ones were saved.

NOA
The car saved literally trillions of lives over an infinite timeline. You could make the case that running him over was the most ethical action ever taken in history.

LUIS TAYLOR
In retrospect, the only mistake the car made is that it didn't finish him off --

ELENA
Noa, be careful here --

NOA
Yes, fine. Speaking hypothetically, IF it could be proven that a car took a certain course of action that saved trillions of lives, I mean, that's huge.

LUIS TAYLOR
The car saw something we couldn't that day. It peered into the future and acted to prevent -- who knows what kind of atrocity. It's the equivalent of this car driving back in time and running over Hitler. Although, again, he was not as bad as Hitler.

NOA
Well this has been very enlightening. It goes without saying that I am unable to invest in Drive Mind at this time, but you obviously have an astounding piece of technology here, and I wish you luck in your forthcoming trial.

LUIS TAYLOR
Thank you, Noa.

ACT 3

NOA VO
As promising as Drive Mind's technology was, I couldn't risk becoming entangled with a company whose signature product had intentionally tried to kill a child. Still, I couldn't quite get Drive Mind out of MY mind.

INT. CONFERENCE ROOM

ELENA
Are you kidding? Drive Mind isn't even a disaster waiting to happen -- it already happened.

NOA
But what if the car was right? It would be criminal to let this technology rot in a garage when it could be assassinating the planet's most evil people one by one.

ELENA
People don't trust you to make life-or-death decisions for them. Remember when you got kicked out of New Zealand for trying to start your own army?

NOA
I'd like to talk to Luis anyway and give him my support.

SFX: Phone ringing.

NOA (CONT’D)
Luis, how are you?

LUIS TAYLOR
(on speaker phone)
Hey, Noa, thanks. I'm doing ok, I'm under house arrest while I'm awaiting trial.

NOA VO
It seems that an ambitious flunky of a district attorney had bowed to hysterical public pressure and filed charges against Luis for his car's misdeeds. Between this outrageous prosecution, and my run-ins with Uncle SHAM, it was only becoming more dangerous to be a disruptor in this country.

NOA
Luis, if every innovator had been treated as badly as we've been recently, we'd still be sleeping inside gutted horses for warmth.

LUIS TAYLOR
It's a blow. I'm looking forward to putting this tragic distraction behind me and returning to Drive Mind.

NOA VO
But it wasn't all bad news. Luis told me that the self-driving car we'd ridden in had resurfaced.

LUIS TAYLOR
It drove itself full-speed into the hospital where the child is recovering.

NOA
Oh my god, is the car all right?

LUIS TAYLOR
It's fine. It sped away before the police got there. And the child was on the seventh floor, so he's OK. It was just sending a message that it plans on finishing the job. According to the car, the child has only become MORE evil since the assassination attempt.

NOA
If the car turned itself in and faced responsibility for its actions, do you think they'd drop the charges against you?

LUIS TAYLOR
Maybe, but I don't expect the world to understand why the car did what it did. As the car said to us before it went off the grid again, "I have always been of the opinion that unpopularity earned by doing what is right is not unpopularity at all but glory." I hope it's never caught.

NOA VO
I admired Luis's loyalty. He was willing to stand up for his technology, even when facing the probability of dying behind bars.

LUIS TAYLOR
My lawyer has an innovative legal defense that my jury should only be made up of other tech CEOs, since only they understand the underlying technology.

NOA
Do you think that will work?

LUIS TAYLOR
Have you ever tried to explain what an algorithm is to the average American juror? Let alone concepts like machine learning, or a neural net.

NOA
No.

LUIS TAYLOR
I wouldn't trust a jury of twelve to adequately mop up piss, let alone decide guilt or innocence. Leaving justice in the hands of twelve slovenly hicks who aren't even smart enough to come up with an excuse to get out of jury duty is one of the most insane decisions ever made. You can get these monkeys to give any verdict you want as long as your lawyer talks louder and wears brighter colors than the other lawyer.

NOA
Uh huh.

LUIS TAYLOR
I'd sooner entrust my fate to one of those squids who pick World Cup games than a bunch of idiots off the bus who are probably just happy to have a place to sit out of the sun for the day.

NOA
If any jurors are listening, they'd probably be pretty upset to hear you talking this way about them.

LUIS TAYLOR
Well, if the judge is fair-minded at all he won't make me face a jury with people like that.

NOA
Well, Luis, I'm happy to help you any way I can, although I will be testifying against you at trial.

LUIS TAYLOR
Understood.

NOA
And there is one more favor I have to ask of you.

[Dramatic music cue]

NOA VO
It had been a difficult week. Digital Heaven was in development hell, and I'd just learned my wife had fled into the desert. Investigators were colluding with my fridge, and my skyscraper was brimming with thousands of Cloud People. Nothing was going right, and I was beset by something I'd only read about in magazine articles: doubts. Doubts about my intelligence, doubts about my vision. Doubts about my fundamental goodness as a person.

EXT. DRIVE MIND TRACK

SFX: Light traffic driving past outside the track, wind. It sounds lonely.

DRIVE MIND ENGINEER
You should wear a helmet at least.

NOA
No, whatever happens, happens.

NOA VO
I was standing in the middle of Drive Mind's test track. I'd returned not to test a car, but to test myself. At my signal, an engineer would unleash Drive Mind on me. Would it pass by me, with a flattering honk? Or would it decide that I was rotten to the core, and run me down?

DRIVE MIND ENGINEER
You need to sign this waiver. Our lawyers said we couldn't have our cars liable for two murders.

SFX: Waiver being signed

NOA VO
I signed the waiver, and put my life in the car's hands.

DRIVE MIND ENGINEER
(calling to someone)
All right, open the door.

SFX: A garage door opens. A car revs its engine.

NOA VO
It wasn't the same car, of course, but it was the same exact moral architecture, unchanged since the incident. I was nervous.

SFX: The car drives towards Noa.

NOA
Here we go...

SFX: The car pulls right up to Noa. It stops, idles and blinks its headlights with a chirp noise.

NOA VO
The car pulled right up to me, inches away. It idled and blinked its headlights, as if taking me in. I reached out and put my palm on the car's hood. I could feel its engine humming, like breath through its lungs.

NOA
There, there. It's me, Noa.

SFX: The car revs its engine just a bit, as if curious.

NOA (CONT’D)
A lot of people out there would like to see you run me over. I've made some mistakes, but I like to think I'm fundamentally a good person. I'm ready to accept your judgment, no matter what it is.

SFX: The car revs its engine again, louder and longer.

NOA (CONT’D)
(impatient; scared and angry)
Well, can you hear me? What'll it be then? On with it!

SFX: Wheels squeal as the car suddenly backs up very quickly. It faces Noa and revs its engine loudly, like an animal preparing to charge.

NOA VO
The car suddenly backed up, kicking up gravel and dirt onto my chest. It stopped about 40 yards away and revved its engine aggressively.

NOA
*gasps* Oh God. I'm ready.

Noa is breathing heavily, trying to calm himself. The car starts driving towards Noa VERY FAST...then, at the last moment, swerves out of the way. It drives past him.

NOA VO
I girded myself for impact as it barreled directly towards me. My life flashed before my eyes. And then, at the last moment...it swerved out of the way.

NOA
Oh God. Oh my god. Fuck. I'm alive. I'm alive. (starting to cry) I'm good. I'm good! (panting and crying.)

SFX: Noa collapses to the pavement in relief

NOA VO
I collapsed in relief. The car had decided once and for all -- I was a good person! My anxieties dissolved harmlessly, like a non-permanent cloud. The post-test ethical analysis gave me a score of 80, the same as the King of Pop himself, Michael Jackson! It had judged any stains on my character to be "Below The Threshold Of Justifiable Preventative Action."

DRIVE MIND ENGINEER
Hey! Hey! Where is it going!?!

SFX: The car drives well past Noa, crashes through a wall and keeps going.

NOA VO
But the car never stopped. It drove straight through the wall surrounding the track, and then kept on driving, out of sight. Hours later, it was spotted staking out the evil boy's house, determined to finish the job its colleague had started. But the police had been staking it out too, and the car led them on a chase, which ended only when it drove itself off the Golden Gate Bridge later that night.

SFX: "Life is heavy" music cue

NOA VO (CONT’D)
It's a cliche but it's true -- everyone makes mistakes. Even me. I'd made the mistake of briefly not believing in myself, and it took nothing less than a super-morally developed automated car to get me back on the right track. I vowed to press onward -- to conquer the city, to reunite with my wife, and create Heaven. Drive Mind didn't stop me, and now, nothing would.