Release: The Dark Continent (A Bridge Between Worlds 4)

Halfway across the bridge now, and we’re still going strong.

For Damonte, crossing the bridge between worlds was like going back in time. Choosing not to return home was one of the hardest sacrifices he had ever made. But it might be for the best. Here in this world, among a different sort of people, he has a chance. A chance to make a difference, a chance to right a wrong. A chance not only to be free, but to truly understand what freedom means.

The Otherworld series remains exclusive to my Patreon, and you can pick up this installment, as well as the rest of the story, for a pledge of only a few dollars a month.

A Bridge Between Worlds continues with Part 5, “The Lessons Learned”, coming September 25. Check back for more info, and remember to keep reading!

Future past: computers

Today, computers are ubiquitous. They’re so common that many people simply can’t function without them, and they’ve been around long enough that most can’t remember a time when they didn’t have them. (I straddle the boundary on this one. I can remember my early childhood, when I didn’t know about computers—except for game consoles, which don’t really count—but those days are very hazy.)

If the steam engine was the invention that began the Industrial Revolution, then the programmable, multi-purpose device I’m using to write this post started the Information Revolution. Because that’s really what it is. That’s the era we’re living in.

But did it have to turn out that way? Is there a way to have computers (of any sort) before the 1940s? Did we have to wait for Turing and the like? Or is there a way for an author to build a plausible timeline that gives us the defining invention of our day in a day long past? Let’s see what we can see.


Defining exactly what we mean by “computer” is a difficult task fraught with peril, so I’ll keep it simple. For the purposes of this post, a computer is an automated, programmable machine that can calculate, tabulate, or otherwise process arbitrary data. It doesn’t have to have a keyboard, a CPU, or an operating system. You just have to be able to tell it what to do and know that it will indeed do what you ask.

By that definition, of course, the first true computers came about around World War II. At first, they were mostly used for military and government purposes, later filtering down into education, commerce, and the public. Now, after a lifetime, we have them everywhere, to the point where some people think they have too much influence over our daily lives. That’s evolution, but the invention of the first computers was a revolution.


We think of computers as electronic, digital, binary. In a more abstract sense, though, a computer is nothing more than a machine. A very, very complex machine, to be sure, but a machine nonetheless. Its purpose is to execute a series of steps, in the manner of a mathematical algorithm, on a set of input data. The result is then output to the user, but the exact means is not important. Today, it’s 3D graphics and cutesy animations. Twenty years ago, it was more likely to be a string of text in a terminal window, while the generation before that might have settled for a printout or paper tape. In all these cases, the end result is the same: the computer operates on your input to give you output. That’s all there is to it.

The key to making computers, well, compute is their programmability. Without a way to give the machine a new set of instructions to follow, you have a single-purpose device. Those are nice, and they can be quite useful (think of, for example, an ASIC cryptocurrency miner: it can’t do anything else, but its one function can more than pay for itself), but they lack the necessary ingredient to take computing to the next level. They can’t expand to fill new roles, new niches.

How a computer gets its programs, how they’re created, and what operations are available are all implementation details, as they say. Old code might be written in Fortran, stored on ancient reel-to-reel tape. The newest JavaScript framework might exist only as bits stored in the nebulous “cloud”. But they, as well as everything in between, have one thing in common: they’re Turing complete. They can all perform a specific set of actions proven to be the universal building blocks of computing. (You can find simulated computers that have only a single available instruction, but that instruction can construct anything you can think of.)

Basically, the minimum requirements for Turing completeness are changing values in memory and branching. Obviously, these imply actually having memory (or other storage) and a means of diverting the flow of execution. Again, implementation details. As long as you can do those, you can do just about anything.
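Those two ingredients, a memory write and a conditional branch, are exactly what the "one instruction" machines mentioned above bundle into a single operation. A well-known example is subleq (subtract and branch if the result is less than or equal to zero). Here's a minimal interpreter sketch in Python; the memory layout and the halt-on-negative-jump convention are choices made for this sketch, since real subleq variants differ on such details:

```python
def run_subleq(mem, ip=0):
    """Run a subleq program. mem holds both code and data.
    Each instruction is three cells (a, b, c):
      mem[b] -= mem[a]; if the result is <= 0, jump to c, else fall through.
    A negative jump target (or running off the end) halts the machine."""
    while 0 <= ip <= len(mem) - 3:
        a, b, c = mem[ip], mem[ip + 1], mem[ip + 2]
        mem[b] -= mem[a]
        ip = c if mem[b] <= 0 else ip + 3
    return mem

# Program: add the values in cells 18 and 19 into cell 20 (cell 21 is scratch).
# Each branch target is simply the next instruction, except the final halt.
program = [
    18, 21, 3,    # scratch -= A  (scratch = -A)
    21, 20, 6,    # Z -= scratch  (Z = A)
    21, 21, 9,    # scratch = 0
    19, 21, 12,   # scratch -= B  (scratch = -B)
    21, 20, 15,   # Z -= scratch  (Z = A + B)
    21, 21, -1,   # scratch = 0; result <= 0, so branch to -1: halt
    7, 5, 0, 0,   # A = 7, B = 5, Z = 0, scratch = 0
]

result = run_subleq(program)
print(result[20])  # 7 + 5 = 12
```

Notice that addition itself never appears: the program builds it out of nothing but subtraction and branching, which is the whole point. Any machine that can do those two things, however slowly, is in the same club as the laptop on your desk.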


It should come as no surprise that Alan Turing was the one who worked all that out. Quite a few others made their mark on computing, as well. George Boole (1815-64) gave us the fundamentals of computer logic (which is why we call true/false values boolean). Charles Babbage (1791-1871) designed the precursors to programmable computers, while Ada Lovelace (1815-52) used those designs to create what is considered to be the first program. The Jacquard loom, named after Joseph Marie Jacquard (1752-1834), was a practical display of programming that influenced the first computers. And the list goes on.

Earlier precursors aren’t hard to find. Jacquard’s loom was a refinement of older machines that attempted to automate weaving by feeding a pattern into the loom that would allow it to move the threads in a predetermined way. Pascal and Leibniz worked on calculators. Napier and Oughtred made what might be termed analog computing devices. The oldest object that we can call a computer by even the loosest definition, however, dates back much farther, all the way to classical Greece: the Antikythera mechanism.

So computers aren’t necessarily a product of the modern age. Maybe digital electronics are, because transistors and integrated circuits require serious precision and fine tooling. But you don’t need an ENIAC to change the world, much less a Mac. Something on the level of Babbage’s machines (if he had ever finished them; finishing things was never his strong suit) could trigger an earlier Information Age. Even nothing more than a fast way to multiply, divide, and find square roots—the kind of thing a pocket calculator can do instantly—would advance mathematics, and thus most of the sciences.
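As a taste of what even that much would buy, consider Heron's method for square roots, an algorithm known since antiquity: repeatedly average a guess with the target divided by that guess. It needs only the four basic arithmetic operations, exactly what a mechanical calculating engine could offer. The Python below is just an illustration of the algorithm, not a claim about any period implementation:

```python
def heron_sqrt(n, tolerance=1e-9):
    """Approximate the square root of n by Heron's method.
    Each pass uses only addition, division, and multiplication,
    so a machine that can do basic arithmetic can run it."""
    guess = max(n, 1.0)
    while abs(guess * guess - n) > tolerance:
        # Average the guess with n / guess; the true root lies between them.
        guess = (guess + n / guess) / 2
    return guess

print(round(heron_sqrt(2), 6))  # 1.414214
```

The method converges fast (roughly doubling the number of correct digits per pass), which is why a handful of turns of a crank could beat a table lookup.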

But can it be done? Well, maybe. Programmable automatons date back about a thousand years. True computing machines probably need at least Renaissance-era tech, mostly for gearing and the like. To put it simply: if you can make a clock that keeps good time, you’ve got all you need to make a rudimentary computer. On the other hand, something like a “hydraulic” computer (using water instead of electricity or mechanical power) might be doable even earlier, assuming you can find a way to program it.

For something Turing complete, rather than a custom-built analog solver like the Antikythera mechanism, things get a bit harder. Not impossible, mind you, but very difficult. A linear set of steps is fairly easy, but once you start adding branches and loops (a loop is nothing more than a branch that goes back to an earlier location), you also need memory, along with the infrastructure to manage it, like an instruction pointer.

If you want digital computers, or anything that does any sort of work in parallel, then you’ll probably also need a clock source for synchronization. Thus, you may have another hard “gate” on the timeline, because water clocks and hourglasses probably won’t cut it. Again, gears are the bare minimum.

Output may be able to go on the same medium as input. If it can, great! You can do a lot more that way, since you’d be able to feed the result of one program into another, a bit like what functional programmers call composition. That’s also the way to bring about compilers and other programs whose results are their own set of instructions. Of course, this requires a medium that can be both read and written with relative ease by machines. Punched cards and paper tape are the historical early choices there, with disks, memory, and magnetic tape all coming much later.
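To make the composition idea concrete, here's a toy sketch in Python. Each "program" reads and writes the same medium (here, plain text), so one's output can be fed straight into the next, just as one deck of punched cards could feed another machine. The function names are invented for illustration:

```python
def compose(*programs):
    """Chain text-in/text-out 'programs' so each one's output
    becomes the next one's input."""
    def pipeline(data):
        for program in programs:
            data = program(data)
        return data
    return pipeline

def tabulate(text):
    """One 'program': turn raw text into a word/length table."""
    return "\n".join(f"{word}\t{len(word)}" for word in text.split())

def total(table):
    """Another 'program': sum the second column of such a table."""
    return str(sum(int(line.split("\t")[1]) for line in table.splitlines()))

# Because both share a medium, they snap together into a new program.
count_letters = compose(tabulate, total)
print(count_letters("hello brave new world"))  # 18
```

Neither piece knows about the other; the shared read/write medium is what lets small programs combine into bigger ones, which is exactly the leverage a read-write medium like paper tape provides.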

Thus, creating the tools looks to be the hardest part about bringing computation into the past. And it really is. The leaps of logic that Turing and Boole made were not special, not miraculous. There’s nothing saying an earlier mathematician couldn’t discover the same foundations of computer science. They’d have to have the need, that’s all. Well, the need and the framework. Algebra is a necessity, for instance, and you’d also want number theory, set theory, and a few others.

All in all, computers are a modern invention, but they’re a modern invention with enough precursors that we could plausibly shift their creation back in time a couple of centuries without stretching believability. You won’t get an iPhone in the Enlightenment, but the most basic tasks of computation are just barely possible in 1800. Or, for that matter, 1400. Even if using a computer for fun takes until our day, the more serious efforts it speeds up might be worth the comparatively massive cost in engineering and research.

But only if people had a reason to make the things in the first place. We had World War II. An alt-history could do the same with, say, the Thirty Years’ War or the American Revolution. Necessity is the mother of invention, so it’s said, so what could make someone need a computer? That’s a question best left to the creator of a setting, which is you.

On the weather

It’s hot right now. Maybe not where you live, maybe not when you’re reading this, but today, for me, is a hot, steamy day on the edge of summer. There’s a slight chance of thunderstorms; I can see them on the local radar, and I’d give them 50-50 odds of getting here before they die down for the day.

Weather is an important part of our lives. Unless you live in an underground bunker or a climate-controlled habitat dome (Fallout and Surviving Mars fans can speak up here), you have to deal with it on a daily basis. Some of humanity’s first attempts at predicting the future were purely about the weather: winds, tides, rains, and storms. We go to great lengths to forecast it, and it’s so ingrained in our culture that the most generic icebreaker we have is “How about that weather?”

For storytelling purposes, weather is mostly background information. You don’t even have to put it in, really; it’s assumed to be a sunny day (or clear night) unless stated otherwise. But a little bit of inclement weather can serve a purpose, if thrown in at the right time.

Have you ever seen the rain

Rain, of course, is the most obvious type of “bad” weather. We associate rainy days with dreariness, lethargy, and sadness. Harder rains can cause flooding, while a mere drizzle does nothing but annoy.

But that’s a bit biased. In temperate regions (like most of the US and Europe), rain can fall at any time throughout the year. Warm and cold fronts bring rain, and tropical cyclones can produce massive amounts. That’s how weather works around here. In tropical regions, however, you’re more likely to have distinct wet and dry seasons. The wet season, often what would be “winter”, can see daily showers and light thunderstorms. In contrast, the dry season is, well, dry. Some places, even in rainforests, can go months without even a trace of rainfall. Out-of-season rain is an event for these locales, and it’s usually caused by a storm—in fantasy, there might even be ulterior motives.

Most of all, rain sets a tone for a scene. A rainy day is…blah. You don’t want to go outside. All you want to do is either sleep or stare out the window. That’s a great time for introspection, dialogue, and all the hallmarks of what TV writers call the “bottle” episode. Your characters are stuck together, so now’s the time to let it all out.

The thunder rolls

Beyond rain, we have the thunderstorm. (Okay, some storms don’t have rainfall, or they have the virga phenomenon, where the rain evaporates before it reaches the ground. Bear with me here.) Storms produce lightning, which then creates thunder. Larger ones can drop hail, ranging from tiny pellets to softball-sized chunks of ice. Depending on where you—or your characters—live, tornadoes are also a possibility.

A thunderstorm represents violence, the fury of nature. It’s a good time for characters to wonder if the world is mad at them specifically. The aftermath brings a chance to spot and repair damage, as some severe thunderstorms and tornadoes can destroy houses, knock down trees and power lines, etc. A few, alas, are even deadly. (I used a killer storm in Written in Black and White, for instance.) If you can’t find a story in the tornado outbreaks that struck Joplin, Missouri, or Ringgold, Georgia, a few years ago, then I don’t know what to tell you.

Lightning also kills, though that’s rarer. In fantasy settings, especially those with active deities, that might also provide a bit of a hook. For the sci-fi side of the coin, consider the more extreme storms that could occur on other worlds. I don’t just mean the Great Red Spot here; Earthlike planets with thicker atmospheres, for example, would certainly have stronger winds in their storms.

Let it snow

I’m a kid at heart, so snow is obviously my favorite sort of inclement weather. It’s got all the same downsides as rain, but add to those the cold, the lack of traction on icy roads, and sheer weight. Then again, it also gives us snowball fights, snowmen, sledding, skiing, and so on. For children, snow is fun. For the working man, it’s terrible. A perfect dichotomy, if you ask me.

Heavier snowfalls do the same thing as heavy rains and severe storms: keep people inside. (Sometimes, they keep people inside for far too long. Look at, say, the Donner Party.) But where a thunderstorm usually lasts only an hour or two at most, the aftermath of a blizzard can stick around for a week or more. In places that don’t often see large amounts of snow (like Tennessee in 1993), that causes massive headaches for the populace. Set a story in older days, before technology allowed us to store more than a week of food without trouble, and you have an even bigger problem. A two-foot blanket of snow in a place that wasn’t expecting it could be the prelude to a disaster. And speaking of disasters…

The weather outside is frightful

Some of our most destructive disasters stem from the weather. Tornado outbreaks strike across the Great Plains in the US and Canada, sometimes also creeping into the American Southeast. I know those all too well: one 2011 twister touched down less than a mile from my house. Hurricanes and tropical storms, not as common in Europe or on the West Coast, strike the eastern US fairly often. We all remember Katrina and the others from the wild 2005 season, but every portion of the coast has a tale from Andrew, Hugo, Camille, Opal, Rita, or one of the many other retired names on the NHC list.

A true weather disaster is a story in itself, but it can also provide the impetus or backdrop for a story. The storm might be on the periphery, but it will affect the characters even from a great distance. News reports trickle in, loved ones may ask for help—you get the idea. All you have to do is turn on the TV or check the Internet to see what happens when a natural disaster strikes.

And that really goes for anything to do with the weather. We’ve got sites and channels dedicated to nothing else. You can’t miss it. The hard part is figuring out how to integrate it with your story. The first question to ask there has to be: do you need to? Maybe it’s enough to say that it was a cloudy day, or that rain was striking the roof.

If that’s not the case, and you do need a storm to spice things up, think about what storms do in real life. They bring people together, either physically (because it’s too dangerous to be outside) or emotionally (every major disaster brings out the charitable contributions). They can destroy homes, change lives. But they can also be a time to shine. We can always find the hero who threw himself atop his kids so the tornado would take him instead, or the boater who made six trips to the houses of flood victims, or whatever you’re looking for.

Or it might just be a little rain. That wouldn’t hurt.