Fractals, recursions, and setbacks in life

In Jurassic Park (pp. 189-190), Ian Malcolm discusses the idea of fractals and recursion.

In short, a small part of something will look the same as a bigger part of that something. For example, the peak of a mountain will look similar in shape to a small piece of that mountain if you were to put it under a microscope.

He claims that this is also true of events.

Think of a graph of the stock market. A line graph mapping a single day of trading will look similar to a week of trading if you zoom out. Zoom out again, and that week will look much like a year. The ups and downs at each scale look remarkably similar.
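This isn’t just a metaphor; it’s measurable. As a rough sketch (my own illustration, not from the book), a simple random walk, a common toy model of stock prices, is statistically self-similar: the size of its swings over k steps grows like the square root of k, so a “day,” a “week,” and a “year” differ only by a scale factor, not in shape.

```python
import random
import math

# Toy model: a +1/-1 random walk standing in for a stock price.
random.seed(42)
steps = [random.choice([-1, 1]) for _ in range(100_000)]
walk = [0]
for s in steps:
    walk.append(walk[-1] + s)

def increment_std(series, lag):
    """Standard deviation of changes measured `lag` steps apart."""
    diffs = [series[i + lag] - series[i] for i in range(0, len(series) - lag, lag)]
    mean = sum(diffs) / len(diffs)
    return math.sqrt(sum((d - mean) ** 2 for d in diffs) / len(diffs))

# Self-similarity: fluctuations over 7 steps should be roughly
# sqrt(7) times the 1-step fluctuations -- the "week" is just a
# rescaled "day".
day = increment_std(walk, 1)
week = increment_std(walk, 7)
ratio = week / (day * math.sqrt(7))
```

Here `ratio` comes out close to 1: zooming out stretches the graph, but the character of the ups and downs stays the same.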

The same is true for each of our lives. The line mapping the “good things” in our lives will go up, and then something akin to a stock market crash will drive it back down.

You’ll see this in your day: perhaps you’re incredibly productive in the morning, but a bad meeting can send your day’s plan careening off in another direction.

Your week will have a series of good days, followed by awful days where someone cuts you off in traffic and sends you flying into a tree. Or perhaps your child comes down with the flu, and you’re cleaning up vomit for the next three days.

You’ll have a series of great months, thinking everything is about to turn around this year, then your father dies, devastating your family and all the plans you had imagined.

Like the stock market, your life will go up, then fall. And if you survive it, you can rest assured it will happen again. It is inevitable.

“We have soothed ourselves into imagining sudden change as something that happens outside the normal order of things. An accident, like a car crash. Or beyond our control, like a fatal illness. We do not conceive of sudden, radical, irrational change as built into the very fabric of existence. Yet it is.” —Ian Malcolm, Jurassic Park

Bringing about our own extinction

David Meerman Scott published a fascinating article a few days ago comparing modern AI companies to Enron and the financial scandal that brought it down in 2001.

But one paragraph in particular stood out to me and warrants quoting in full:

Altman says there’s a chance that so-called Artificial General Intelligence (which is still years or decades away) has the possibility of turning against humans. “I think that whether the chance of existential calamity is 0.5 percent or 50 percent, we should still take it seriously,” Altman says. “I don’t have an exact number, but I’m closer to the 0.5 than the 50.” (Source)

Terrifying, right?

I would argue that if you are creating something that has anything other than a 0% chance of wiping out humanity, you probably shouldn’t do it. 

For example: marketing Pepsi to be consumed in massive amounts, while definitely bad for humans, doesn’t run the risk of causing mass extinction.

On the other hand, bringing Tyrannosaurus rex back to life definitely has a greater than 0% chance of doing just that.

Now, I’m not a doomsday prepper by any stretch of the imagination… But when someone tells me there’s even a small chance that what they’re making could turn out like The Matrix, I start to worry. 

It’s as if they never watched I, Robot or read Jurassic Park (which is actually about runaway technology, not dinosaurs). 

These companies have a responsibility to guarantee that this doesn’t happen. We already made this mistake with nuclear weapons. And that threat still looms large over our heads, especially right now during the Russo-Ukrainian War.

We have enough threats to deal with. Let’s not create more of our own volition.

I’ll leave you with my favorite quote from Jurassic Park:

“Your scientists were so preoccupied with whether or not they could that they didn’t stop to think if they should.”
