Writing World War II

Today, there is no more popular war than World War II. No other war in history has been the focus of so much attention, attention that spans the gap between nonfiction and fiction. And for good reason, too. World War II gave us some of the most inspiring stories, some of the most epic battles (in the dramatic and FX senses), and an overarching narrative that perfectly fits so many of the common conflicts and tropes known to writers.

The list of WWII-related stories is far too long for this post to even scratch the surface, so I won’t try. Suffice it to say that in the 70 years since the war ended, thousands of works have been penned, ranging from the sappy (Pearl Harbor) to the gritty (Saving Private Ryan), from the lighthearted romp (Red Tails) to the cold drama (Schindler’s List). Oh, and those are only the movies. That’s not counting the excellent TV series (Band of Brothers, The Pacific) or the myriad books concerning this chapter of our history.

World War II, then, is practically a genre of its own, and it’s a very cluttered one. No matter the medium, a writer wishing to tackle this subject will have a harder time than usual. Most of the “good” stories have been done, and done well. In America, at least, many of the heroes are household names: Easy Company, the Tuskegee Airmen, the USS Arizona and the Enola Gay. The places are etched into our collective memory, as well, from Omaha Beach and Bastogne to Pearl Harbor, Iwo Jima, and Hiroshima. It’s a crowded field, to put it mildly.

Time is running out

But you’re a writer. You’re undaunted. You’ve got this great idea for a story set in WWII, and you want to tell it. Okay, that’s great. Just because something happened within the last century doesn’t get you out of doing your homework.

First and foremost, now is the last good chance to write a WWII story. By “now”, I mean within the next decade, and there’s a very good reason for that. This is 2016. The war ended right around 70 years ago. Most of the soldiers were conscripts, many straight out of high school, or else young volunteers, so they were typically about 18 to 25 years old when they went into service. That means the youngest WWII veterans are at least in their late 80s, with most in their 90s. They won’t live forever. We’ve seen that in this decade, as the final World War I veterans passed on, and an entire era left living memory.

Yes, there are countless interviews, written or recorded, with WWII vets. The History Channel used to show nothing else. But nothing compares to a face-to-face conversation with someone who literally lived through history. One of the few good things to come out of my public education was the chance to meet one of the real Tuskegee Airmen, about twenty years ago. The next generation of schoolchildren likely won’t have that same opportunity.

Give it a shot

Whether through personal contact or the archives and annals of a generation, you’ll need research. Partly, that’s for the same reason: WWII is within living memory, so you have eyewitnesses who can serve as fact-checkers. (Holocaust deniers, for instance, will only get bolder once there’s no one left who can directly prove them wrong.) Also, WWII was probably the most documented war of all time. Whatever battle you can think of, there’s some record of it. Unlike previous conflicts, there’s not a lot of room to slip through the cracks.

On the face of it, that seems to limit the space available for historical fiction. But it’s not that bad. Yes, the battles were documented, as were many of the units, the aircraft, and even the strategies. However, they didn’t write down everything. It’s easy enough to pick a unit—bonus points if it’s one that was historically wiped out to the last man, so there’s no one left to argue—and use it as the basis for your tale.

And that highlights another thing about WWII. War stories of older times often fixate on a single soldier, a solitary hero. With World War II, though, we begin to see the unit itself becoming a character. That’s how it worked with Band of Brothers, for instance. And this unit-based approach is a good one for a story focused on military actions. Soldiers don’t fight alone, and many of the great field accomplishments of WWII were due to the bravery of a squad, a company, or a squadron.

If your story happens away from the front lines, on the other hand, then it’s back to individuals. And what a cast of characters you have. Officers, generals, politicians, spies…you name it, you can find it. But these figures tend to be better known, and that limits your choices for deviating from history.

Diverging parallels

While the war itself is popular enough, as are some of the events that occurred at the same time, what might have happened instead is just as ripe for storytelling. Amazon’s The Man in the High Castle (based on the Philip K. Dick novel of the same name) is one such example of an alternate WWII, and I’ve previously written a post that briefly touched on another possible outcome.

I think the reason why WWII gets so much attention from the alternate-history crowd is the potential for disaster. The “other” side—the Axis—was so evil that giving them a victory forces a dystopian future, and dystopia is a storyteller’s favorite condition, because it’s a breeding ground for dramatic conflict and tension. And there’s also a general sense that we got the best possible outcome from the war; thus, following that logic, any other outcome is an exercise in contrast. It’s not the escapism that I like from my fiction, but it’s a powerful statement in its own right, and it may be what draws you into the realm of what-ifs.

The post I linked above is all about making an alternate timeline, but I’ll give a bit of a summary here. The assumption is that everything before a certain point happened exactly as it did, but one key event didn’t. From there, everything changes, causing a ripple effect up to the present. For World War II, that’s only 70 years, but that’s more than enough time for great upheaval.

Most people will jump to one conclusion there: the Nazis win. True, that’s one possible (but unlikely, in my opinion) outcome, but it’s not the only one. Some among the Allies argued for continuing the war, moving to attack the Soviets next. That would have preempted the entire Cold War, with all the knock-on effects that implies. What if Japan hadn’t surrendered? Imagine a nuclear bomb dropped on Tokyo, and what that would do to history. The list goes on, ad infinitum.

Fun, fun, fun

Any genre fits World War II. Any kind of story can be told within that span of years. Millions of people were involved, and billions are still experiencing its reverberations. Although it’s hard to talk of a war lasting more than half a decade as a single event, WWII is, collectively speaking, the most defining event of the last century. It’s a magnet for storytelling, as the past 70 years have shown. In a way, despite the horrors visited upon the world during that time, we can even see it as fun.

Too many people see World War II as Hitler, D-Day, Call of Duty, and nukes. But it was far more than that. It was the last great war, in many ways. And great wars make for great stories, real or fictional.

On ancient artifacts

I’ve been thinking about this subject for some time, but it was only after reading this article (and the ones linked there) that I decided it would make a good post. The article is about a new kind of data storage, created by femtosecond laser bursts into fused quartz. In other words, as the researchers helpfully put it, memory crystals. They say that these bits of glass can last (for all practical purposes) indefinitely.

A common trope in fiction, especially near-future sci-fi, is the mysterious artifact left behind by an ancient, yet unbelievably advanced, civilization. Whether it’s stargates in Egypt, monoliths on Europa, or the Prothean archives on Mars, the idea is always the same: some lost race left their knowledge, their records, or their technology, and we are the ones to rediscover them. I’m even guilty of it; my current writing project is a semi-fantasy novel revolving around the same concept.

It’s easy enough to say that an ancient advanced artifact exists in a story. Making it fit is altogether different, particularly if you’re in the business of harder science fiction. Most people will skim over the details, but there will always be the sticklers who point out that your clever idea is, in fact, physically impossible. But let’s see what we can do about that. Let’s see how much we can give the people living a hundred, a thousand, or even a million years in the future.

Built to last

If your computer is anything like mine, it might last a decade. Two, if you’re lucky. Cell phones? They’re all but made to break every couple of years. Writable CDs and DVDs may be able to stand up to a generation or two of wear, and flash memory is too new to really know. In our modern world of convenience, disposability, and frugality, long-lasting goods aren’t popular. We buy the cheap consumer models, not the high-end or mil-spec stuff. When something can become obsolete the moment you open the box, that’s not even all that unwise. Something that has to survive the rigors of the world, though, needs to be built to a higher standard.

For most of our modern technology, it’s just plain too early to tell how long it can really last. An LED might be rated for 11,000 hours, a hard drive for 100,000, but that’s all statistics. Anything can break tomorrow, or outlive its owner. Even in one of the most extreme environments we can reach, life expectancy is impossible to guess. Opportunity landed on Mars in 2004, and it was expected to last 90 days. Twelve years later, it’s still rolling.
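
To make “that’s all statistics” concrete, here’s a minimal sketch that treats a rated lifetime as the mean of an exponential failure model. The model choice and the numbers are illustrative assumptions on my part, not how manufacturers actually rate their parts.

```python
import math

def survival_probability(hours: float, rated_mean_hours: float) -> float:
    """Chance a part still works after `hours`, assuming failures follow
    an exponential distribution whose mean is the rated lifetime."""
    return math.exp(-hours / rated_mean_hours)

# A drive "rated" for 100,000 hours can still die young...
print(survival_probability(10_000, 100_000))   # ~0.90 after roughly a year of uptime
# ...or comfortably outlive its owner's interest in it.
print(survival_probability(300_000, 100_000))  # ~0.05 after ~34 years of uptime
```

The point isn’t the exact numbers; it’s that a rating is a distribution, not a promise.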

But there’s a difference between surviving a very long time and being designed to. To make something that will survive untold years, you have to know what you’re doing. Assuming money and energy are effectively unlimited—a fair assumption for a super-advanced civilization—some amazing things can be achieved, but they won’t be making iPhones.

Material things

Many things that we use as building materials are prone to decay. In a lot of cases, that’s a feature, not a bug, but making long-term time capsules isn’t one of those cases. Here, decay, decomposition, collapse, and chemical alteration are all very bad things. So most plastics are out, as are wood and other biological products—unless, of course, you’re using some sort of cryogenics. Crossing off all organics might be casting too wide a net, but not by much.

We can look to archaeology for a bit of guidance here. Stone stands the test of time in larger structures, especially in the proper climate. The same goes for (some) metal and glass, and we know that clay tablets can survive millennia. Given proper storage, many of these materials easily get you a thousand years or more of use. Conveniently, most of them are good for data, too, whether that’s in the form of cuneiform tablets or nanoscale fused quartz.

Any artifact made to stand the test of time is going to be made out of something that lasts. That goes for all of its parts, not just the core structure. The longer something needs to last, the simpler it must be, because every additional complexity is one more potential point of failure.

Power

Some artifacts might need to be powered, and that presents a seemingly insurmountable problem. Long-term storage of energy is very, very hard right now. Batteries won’t cut it; most of them are lucky to last ten years. For centuries or longer, we need something better.

There aren’t a lot of options here. Supercapacitors aren’t that much better than batteries in this regard. Most of the other options for energy storage require complex machinery, and “complex” here should be read as “failure-prone”.

One possibility that seems promising is a radioisotope thermoelectric generator (RTG), like NASA uses in space probes. These use the heat of radioactive decay to create electricity, and they work as long as there’s radioactivity left in the material you’re using. They’re high-tech, but they don’t require too much in the way of peripheral complexity. They can work, but there’s a trade-off: the longer the RTG needs to run, the longer-lived (and thus weaker) the isotope must be, so the less power you get out of it at any given moment. Few isotopes fall into that sweet spot of half-life and decay energy that makes them worthwhile.
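
Here’s a minimal sketch of that trade-off, using the standard exponential-decay formula. The 100 W starting figure is an arbitrary assumption; the 87.7-year half-life is that of plutonium-238, the isotope NASA actually favors.

```python
def rtg_power(initial_watts: float, half_life_years: float, years: float) -> float:
    """Power remaining after `years`, assuming output falls off with the
    isotope's radioactive decay (ignoring thermocouple degradation)."""
    return initial_watts * 0.5 ** (years / half_life_years)

# Plutonium-238, half-life ~87.7 years, starting at a nominal 100 W:
for t in (0, 100, 500, 1000):
    print(f"{t:>4} years: {rtg_power(100.0, 87.7, t):8.3f} W")
# After a millennium, barely 0.04 W remains. A longer-lived isotope would
# still be decaying, but it would put out far less power from the start.
```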

Well, if we can’t store the energy we need, can we store a way to make it? As blueprints, it’s easy, but then you’re dependent on the level of technology of those who find the artifact. Almost anything else, however, runs into the complexity problem. There are some promising leads in solar panels that might work, but it’s too early to say how long they would last. Your best bet might actually be a hand crank!

Knowledge

One of the big reasons for an artifact to exist is to provide a cache of knowledge for future generations. If that’s all you need, then you don’t have to worry too much about technology. The fused-quartz glass isn’t that bad an option. If nothing else, it might inspire the discoverers to invent a way to read it. What knowledge to include then becomes the important question.

Scale is the key. What’s the difference between the “knowers” and the “finders”? If it’s too great, the artifact may need to include lots and lots of bootstrapping information. Imagine sending a sort of inverse time capsule to, say, a thousand years ago. (For the sake of argument, we’ll assume you also provide a way to read the data.) People in 1016 aren’t going to understand digital electronics, or the internal combustion engine, or even modern English. Not only do you need to put in the knowledge you want them to have, you also have to provide the knowledge to get them to where it would be usable. A few groups are working on ways to do this whole bootstrap process for potential communication with an alien race, and their work might come in handy here.

Deep time

The longer something must survive, the more likely it won’t. There are just too many variables, too many things we can’t control. This is even more true once you get seriously far into the future. That’s the “ancient aliens” option, and it’s one of the hardest to make work.

The Earth is like a living thing. It moves, it shifts, it convulses. The plates of the crust slide around, and the continents are not fixed in place. The climate changes over the millennia, from Ice Age to warm period and back. Seas rise and fall, rivers change course, and mountains erode. The chances of an artifact surviving on the surface of our world for a million years are quite remote.

On other bodies, it’s hit or miss, almost literally. Most asteroids and moons are geologically dead, and thus fairly safe over these unfathomable timescales, but there’s always the minute possibility of a direct impact. A few unearthly places (Mars and Titan come to mind) have enough in the way of weather to present problems like those on Earth, but the majority of solid rock in the solar system is usable in some fashion.

Deep space, you might think, would be the perfect place for an ancient artifact. If it’s big enough, you could even disguise it as an asteroid or moon. However, space is a hostile place. It’s full of radiation and micrometeorites, both of which could affect an artifact. Voyager 2 has its golden record, but how long will it survive? In theory, forever. In practice, it’ll get hit eventually. Maybe not for a million years, but you never know.

Summing up

Ancient artifacts, whether from aliens or a lost race of humans, work well as a plot device in many stories. Most of the time, you don’t have to worry about how they’re made or how they survived for so long. But when you do, it helps to think about what’s needed to make something like an artifact. In modern times, we’re starting to build a few such things ourselves. Voyager 2, the Svalbard Global Seed Vault, and their kin can act, in a sense, as our legacy. Ten thousand years from now, no matter what happens, they’ll likely still be around. What else will be?

Out of the dark: building the Dark Ages

We have an awful lot of fiction out there set in something not entirely unlike our Middle Ages. Almost every cookie-cutter fantasy world is faux-medieval, and that’s just the ones that aren’t even trying to be. The Renaissance and early Industrial Era also get plenty of love, and Roman antiquity even comes up from time to time. But there’s one time period in our history that seems a bit…left out. I’m talking about those centuries after Rome fell to the barbarian hordes, but before William crossed the Channel to give England the same fate. I’m talking about the Dark Ages.

A brighter shade of dark

Now, as we know today, what previous generations called the Dark Ages weren’t really all that dark. Sure, there were Vikings and Vandals, barbarians and Britons, Goths and Gauls, but it wasn’t a complete disaster. The reason we speak of the “Dark Ages”, though, is contrast. Rome was a magnificent empire by any account, and those who pinned the “Dark Age” moniker on its fallen children were living in the equally “shining” Enlightenment. By comparison, the time between wasn’t exactly grand.

Even with our modern knowledge, the notion of a Dark Age is still useful, even if it doesn’t quite mean what we think it means. In general, we can use it to refer to any period of technological, social, and political stagnation and regression. That’s not to say there wasn’t progress in the Dark Ages. One great book about the period is titled Cathedral, Forge, and Waterwheel, and the title alone is a pretty good indication of some of the advancement that did happen.

Compared to what came before—the Roman empire, with its Colosseum and aqueducts and roads—there’s a huge difference, especially at the start of the Dark Ages. In some parts of Europe, particularly those farthest from the imperial center, general conditions fell to their lowest levels in hundreds of years. While the Empire itself actually did survive in the east in the form of the Byzantines (who were even considered the “true” emperors by the first generations of barbarian kings), the west was shattered, and it showed. But they dug themselves out of that hole, as we know.

Dying light

So, even granting our more limited definition of “Dark Ages”, what caused them? Well, there are a lot of theories. The Western Empire fell in 476, of course, after Rome itself had been sacked twice in the preceding decades, and that’s usually considered a primary cause. A serious cold snap starting around 536 couldn’t have helped matters. Plagues around the same time combined with the war and famine to cause even greater death, completing the quartet of the Horsemen.

But all that together shouldn’t have been enough to devastate the society of western Europe, should it? If it happened today, it wouldn’t, because our world is so connected, so small, relative to Roman times. If the whole host of apocalyptic horror visited the EU today, hundreds of millions of people would die, but we wouldn’t have a new Dark Age. The reason can be summed up in one word: continuity.

Yes, half of the Roman Empire survived. In a way, it was the stronger half, but it was also the more distant half. When Rome fell, when all the other catastrophes visited its remnants, the effect was to cause a cultural break. Many parts of the empire were already more or less autonomous, growing ever more apart, and the loss of the “center of gravity” that was Rome merely hastened the process.

A look at Britain illustrates this. After Rome all but gave up on its island province, Britain all but gave up on Rome. Outside of the monasteries, Rome was practically forgotten within a few generations, once the Saxons and their other Germanic friends rolled in. The Danes that started vacationing there in the ninth century cared even less for news from four hundred years ago. By the time William came conquering, Anglo-Saxon England was a far cry from Roman Britannia. This is an extreme example, though, because there was almost no continuity in Britain to start with, so there wasn’t much to lose. However, similar stories appear throughout Europe.

Recurring nightmare

Although Europe’s Dark Ages are a thousand years past, they aren’t the only example of that kind of discontinuity. Something of the same sort happened in Greece two thousand years before that, in the wake of the Bronze Age collapse. The native peoples of America can be considered to have a Dark Age that started circa 1500, as the mighty empires of Mexico and Peru fell to Spanish invaders.

In every case, though, it’s more than just the fall of a civilization. A Dark Age needs a prolonged period of destruction, probably at least two generations long. To make an age go Dark requires severe population loss, a total breakdown of government, and the forcing of a kind of “siege mentality” on a society. Climatic shifts are just a bonus. In all, a Dark Age results from a perfect storm of causes, all of which combine to break a people. Eventually, due to the death, destruction, and constant need to be on guard, everything else falls by the wayside. There simply aren’t enough people to keep things going. Once those that are left start dying off, the noose tightens. The circle is broken, and darkness settles in.

That naturally leads to another question: could we have a new Dark Age? It’s hard to imagine, in our present time of progress, something ever causing it to stop, but that doesn’t make it impossible. Indeed, almost the entire sub-genre of post-apocalyptic fiction hinges on this very event. It can happen, but—thankfully—it won’t be easy.

What would it take, then? Well, like the Dark Ages that have come before, it would be a combination of factors. Something causing death on a massive, unprecedented scale. Something to put humanity on the back foot, to disrupt the flow of society so completely that it would take more than a lifetime to recover. And if recovery takes more than a lifetime, it never truly comes, because there would be no one left who remembered the “old days”. There would be no more continuity.

I can think of a few ways that could work. The ever-popular asteroid or comet impact is an easy one, and it even has the knock-on effect of a severe climate shock. Nuclear war never really seemed likely in my lifetime, but I was born in 1983, so I missed the darker days of the Cold War. I did watch WarGames, though, and I remember seeing those world maps lighting up at the end. Even two hundred years after an exchange like that, though, I don’t think we’d be looking at a Fallout game.

Other options all have their problems. An incredibly virulent outbreak (Plague, Inc. or your favorite zombie movie) might work, but it would have to be so bad that it makes the 1918 flu look like the common cold. Zika is in the news right now, but it simply won’t cut it, nor would Ebola. You need something highly infectious, but with a long incubation period and a massive mortality rate. It’s hard to find a virus that fits all three of those, for evolutionary reasons. The other forms of infectious agents—bacteria, fungi, prions—all have their own disadvantages.

Climate change is the watchword of the day, but it won’t cause a Dark Age by itself. It’s too slow, and even the most alarming predictions don’t take us to temperatures much higher than a few thousand years ago, and that’s assuming that nobody ever does anything about it. No matter what you believe about global warming, you can’t make it enough to break us without some help.

Terminator-style AI is another possibility, one looking increasingly likely these days. It has some potential for catastrophe, but I’m not sure about using it as the continuity-breaker. The same goes for nanotech bots and the like. Maybe they’ll enslave us, but they won’t beat us down so badly that we lose everything.

And then there’s aliens. (Insert History Channel guy here.) An alien-imposed destruction of civilization would be the logical extension of the barbarian hordes into the global future. Their attacks would likely be massive enough to influence the planet’s climate. They would cause us to huddle together for mutual defense, assuming they left any of us alive and then left us alone. Yeah, that could work. It needs a lot of ifs, but it’s plausible enough to make for a good story.

The light returns

The Dark Age has to come to an end. It can’t last forever. But there’s no easy signal that it’s over. Instead, it’s a gradual thing. The key point here, though, is that what comes out of the Dark Age won’t be the same as what went in. Look again at Europe. After Rome fell, some of its advances—concrete is a good example—were lost to its descendants for a thousand years. Yet the continent did finally surpass the empire.

Over time, the natural course of progress will lift the darkened area back to a level near enough to where it left off, and things can proceed from there. It will be a different place, though, and that’s because of the discontinuity that caused the darkness in the first place. The old ways become lost, yes, but once we discover the new ways, they’ll be even better.

We stand on the shoulders of giants, as Newton said. Those giants are our ancestors, whether physically or culturally. Sometimes they fall, and sometimes the fall is bad enough that it breaks them. Then we must stand on our own and become our own giants. The Dark Age is that time when we’re standing alone.

Life below zero: building the Ice Age

As I write this post, parts of the US are digging themselves out of a massive snowstorm. (Locally, of course, the anti-snow bubble was in full effect, and the Tennessee Valley area got only a dusting.) Lots of snow, cold temperatures, and high winds create a blizzard, a major weather event that comes around once every few years.

But our world has gone through extended periods of much colder weather. In fact, we were basically born in one. I’m talking about ice ages. In particular, I’m referring to the Ice Age, the one that ended about 10,000 years ago, as it’s far better known and understood than any of the others throughout the history of the planet.

The very phrase “Ice Age” conjures up images of woolly mammoths lumbering across a frozen tundra, of small bands of humanity struggling to survive, of snow-covered evergreen forests and blue walls of ice. Really, if you think about it, it paints a picturesque landscape as fascinating as it seems inhospitable. In that, it’s no different from Antarctica or the Himalayas or Siberia today…or Mars tomorrow. The Earth of the Ice Age, as a place, is one that fuels the imagination simply because it is so different. But the question I’d like to ask is: is there a story in the Ice Age?

Lands of always winter

To answer that question, we first need to think about what the Ice Age is. A “glaciation event”, to use the technical term, is pretty self-explanatory. Colder global temperatures mean more of the planet’s surface is below freezing (0° Celsius, hence the name of this post), which means water turns to ice. The longer the subzero temps, the longer the ice can stick around. Although the seasons don’t actually change, the effect is a longer and longer winter, complete with all the wintry trappings: snow, frozen ponds and lakes, plant-killing frosts, and so on.

We don’t fully understand what causes these glaciation events to start and stop. Some of them last for tens or even hundreds of thousands of years. The worst can cover the whole world in ice, creating a so-called “Snowball Earth” scenario. (While interesting in its own right, that particular outcome doesn’t concern us here. On a snowball world, there’s little potential for surface activity. Life can survive in the deep, unfrozen oceans, but that doesn’t sound too exciting, in my opinion.)

If that weren’t bad enough, an Ice Age can be partially self-sustaining. As the icecaps grow—not just the ones at the poles, but anywhere—the Earth becomes more reflective. Higher surface reflectivity (a higher albedo, in climate terms) means that less sunlight is absorbed, dropping temperatures further. And that allows the ice to spread, in a feedback loop best served cold.
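
That feedback loop is easy to caricature in code. Below is a toy zero-dimensional energy-balance model: the physical constants are standard, but the albedo ramp and the flat 33 K greenhouse offset are crude assumptions I made up for illustration, nothing like a real climate model.

```python
SIGMA = 5.67e-8    # Stefan-Boltzmann constant, W/m^2/K^4
S0 = 1361.0        # solar constant at Earth, W/m^2
GREENHOUSE = 33.0  # rough present-day greenhouse warming, K (assumed fixed)

def albedo(temp_k: float) -> float:
    """Assumption: a colder surface means more ice, hence more reflection."""
    if temp_k >= 285.0:
        return 0.3  # largely ice-free
    if temp_k <= 255.0:
        return 0.6  # heavily glaciated
    return 0.6 - 0.3 * (temp_k - 255.0) / 30.0  # linear ramp in between

def settle(temp_k: float, steps: int = 100) -> float:
    """Iterate 'absorbed sunlight = radiated heat' until it stops moving."""
    for _ in range(steps):
        effective = ((S0 / 4.0) * (1.0 - albedo(temp_k)) / SIGMA) ** 0.25
        temp_k = effective + GREENHOUSE
    return temp_k

print(settle(290.0))  # a warm start settles warm, around 288 K
print(settle(230.0))  # a cold start locks into the ice, around 254 K
```

Same sun, two stable climates; the only difference is which side of the ice line you start on.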

Living on the edge

But we know life survived the Ice Age. We’re here, after all. The planet-wide extinctions that ended the Pleistocene epoch came at the end of the glaciation, not during it. So not only can life survive in the time of ice, it can thrive. How?

Well, that’s where the difference between “ice age” and “snowball” comes in. First off, the whole world wasn’t completely frozen over 20,000 years ago. Yes, there were glaciers, and they extended quite far from the poles. (Incidentally, the glaciers that covered the eastern half of America stopped not that far from where I live.) But plenty of ice-free land existed, especially in the tropics. Oh, and guess where humanity came from?

Even in the colder regions, life was possible. We see that today in Alaska, for instance. And the vagaries of climate mean that, strangely enough, that part of the world wasn’t much colder than it is today. So one lead on Ice Age life can be found by studying the polar regions of the present, from polar bears to penguins and Eskimos to explorers.

The changing face

But the world was a different place in the Ice Age, and that was entirely because of the ice. The climate played by different rules. Hundreds of feet of ice covering millions of square miles will do that.

The first thing to note is that the massive ice sheets covering the higher latitudes functioned, climatically speaking, just like those at the poles. Cold air is denser than warm air, so it sinks. That creates a high-pressure area that doesn’t really move all that much. In temperate regions, high-pressure systems cause clockwise winds along their boundaries (in the Northern Hemisphere, at least), but their interiors tend to be stable.

Anyone who lives in the South knows about the summer ridge that builds every year, sending temperatures soaring to 100°F and causing air-quality and fire danger warnings. For weeks, we suffer in miserable heat and suffocating humidity, with no rain in sight. It’s awful, and it’s the main reason I hate summer. But think of that same situation, changing the temperatures from the nineties Fahrenheit to the twenties. Colder air holds less moisture, so you have a place with dry, stale air and little prospect for relief. In other words, a cold desert.
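
If you want numbers behind “colder air holds less moisture,” the Magnus approximation (a standard empirical formula for saturation vapor pressure) does the job; the two temperatures below just stand in for the scenario above.

```python
import math

def saturation_vapor_pressure(temp_c: float) -> float:
    """Magnus approximation, in hPa; good to ~1% at everyday temperatures."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

print(saturation_vapor_pressure(35.0))  # mid-90s F summer ridge: ~56 hPa
print(saturation_vapor_pressure(-5.0))  # the low-20s F version:  ~4 hPa
```

Dropping from the nineties to the twenties cuts the air’s moisture capacity by more than a factor of ten, which is why the result is a cold desert.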

That’s the case on the ice sheets, and some thinkers extend that to the area around them. Having so much of the Earth’s water locked into near-permanent glaciers means that there will be less precipitation overall, even in the warm tropics. That has knock-on effects in those climates. Rainforests will be smaller, for example, and much of the land will be more like savannas or steppes, like the African lands that gave birth to modern humans.

But there are still prospects for precipitation. The jet stream will move, stray winds will blow. And the borders of the ice sheets will be active. This is for two reasons. First, the glaciers aren’t stationary. They expand and contract with the subtle seasonal and long-term changes in temperature. Second, that’s where the strongest winds will likely be. Receding glaciers can form lakes, and winds can spread the moisture from those lakes. The result? Lake-effect precipitation, whether rain or snow. The lands of ice will be cold and dry, the subtropics warm (or just warmer) and dry, but the boundary between them has the potential to be vibrant, if cool.

Making it work

So we have two general areas of an Ice Age world that can support the wide variety of life necessary for civilization: the warmer, wetter tropics and the cool convergence zones around the bases of the glaciers. If you know your history, those are exactly the places where the first major progress occurred: the savannas of Africa, the shores of the Mediterranean, the outskirts of Siberia and Beringia.

For people living in the Ice Age, life is tough. Growing seasons are shorter, more because of temperature than sunlight; the first crops weren’t domesticated until after the ice was mostly gone, when more of the world could support agriculture. Staying warm is a priority, and that makes fire a core part of survival. Clothing reflects the cold: furs, wool, insulation. Housing is a must, if only to have a safe place for a fire and a bed. Society, too, will be shaped by these needs.

But the Ice Age is dynamic. Fixed houses are susceptible to moving or melting glaciers. A small shift in temperature (in either direction) changes the whole landscape. Nomadic bands might be better suited to the periphery of the ice sheets, with the cities at a safe distance.

The long summer

And then the Ice Age comes to an end. Again, there’s no real consensus on why, but it has to happen. We’re proof of that. And when it does happen…

Rising temperatures at the end of a glaciation event are almost literally earth-shattering. The glaciers recede and melt (not completely; we’ve still got a few left over from our last Ice Age, and not just at the poles), leaving destruction in their wake. Sea levels rise, as you’d expect, but locally they can also fall, as the continents rebound once the weight of the ice is lifted.

The tundra shrinks, squeezing out those plants and animals adapted to it. Conversely, those used to warmer climes now have a vast expanse of fresh, new land. Precipitation begins to increase as ice turns to water and then evaporates. The world just after the Ice Age is probably going to be a swampy one. Eventually, though, things balance out. The world’s climate reaches an island of stability. Except when it doesn’t.

Our last Ice Age ended in fits and starts. Centuries of relative warmth could be wiped out in a geological instant. The last gasp was the Younger Dryas, a cold snap that started around 13,000 years ago and lasted around a tenth of that time. To put that into perspective, if it were ending right now (2016), it would have started around the time of the Merovingians and the Muslim conquest of Spain. But we don’t even know if the Younger Dryas was part of the Ice Age, or if it had another cause. (One hypothesis even claims it was caused by a meteor striking the earth!) Whether it was or wasn’t the dying ember of the Ice Age doesn’t matter much, though; it was close enough that we can treat it as if it were.

In the intervening millennia, our climate has changed demonstrably. This has nothing to do with global warming, whatever you think on that topic. No, I’m talking about the natural changes of a planet leaving a glacial period. We can see the evidence of ancient sea levels and rainfall patterns. The whole Bering Strait was once a land bridge, the Sahara a land of green. And Canada was a frozen wasteland. Okay, some things never change.

All this is to say that the Ice Age doesn’t have to mean mammoths and tundra and hunter-gatherers desperate for survival. It can be a time of creation and advancement, too.

Colonization and the New World

It’s common knowledge that the Old World of Europe, Asia, and Africa truly met the New World of the Americas in 1492, when Columbus landed in the Caribbean. Of course, we now know that there was contact before that, such as the Vikings in Newfoundland about a thousand years ago. But Columbus and those who followed him—Cortés, Pizarro, de Soto, Cabot, and all those other explorers and conquerors Americans learn about in history class—those were the ones who made lasting contact between the two shores of the Atlantic.

Entire volumes have been written over the last five centuries about the exploration, the conquest, the invasion of the Americas. There’s no need to repeat any of it here. But the subject of the New World is one that doesn’t seem to get a lot of exposure in the world of fiction, with the notable exception of science fiction. And I think that’s a shame, because it’s an awfully interesting topic for a story. It’s full of adventure, of gaining knowledge, of conflict and warfare. Especially for American writers (not limited to the United States, but all of North and South America), it’s writing about the legacy we inherited, and it’s odd that we would rather tell stories about the history of the other side of the ocean.

Written by the victors

Of course, one of the main reasons why we don’t write many stories about exploration and colonization is political. We know a lot about the Spaniards and Englishmen and Frenchmen that discovered (using that term loosely) the lands of America. We have written histories of those first conquistadors, of those that came after, and of the later generations that settled in the new lands. We don’t, however, have much of anything from the other side.

A lot of that is due to the way first contact played out. We all know the story. Columbus discovered his Indians (to use his own term), Cortés played them against each other to conquer them, and smallpox decimated them. Those that survived were in no position to tell their tale. Most of them didn’t have a familiar system of writing, and most of the written works that did exist were destroyed. And then came centuries of subjugation. Put that all together, and it’s no wonder that we only have one side of the tale of the New World.

But this already suggests story possibilities. We could write from one point of view or the other (or both, for that matter), setting our tale in the time of first contact or shortly after, in the upheaval that followed. This is quite popular in science fiction, where the “New World” is really a whole new world, a planet that was inhabited when we arrived. That’s the premise of Avatar, for example.

Life of a colony

Colonization has existed for millennia, but it’s only since 1492 that it became such a central part of world history. The Europeans who moved into the Americas found them filled with wonders and dangers. For the Spanish, the chief problem—aside from the natives—was the climate, as Mexico, Central America, and the Caribbean mostly fall into the tropical belt, far removed from mid-latitude Spain.

The English had it a little better; the east coast of the United States isn’t all that different from England, except that the winters can be harsher. (This was even more the case a few hundred years ago, in the depths of the Little Ice Age.) It’s certainly easier to go from York to New York than Madrid to Managua.

No matter the climate, though, colonists had to adapt. Especially in those times, when a resupply voyage was a long and perilous journey, they had to learn to live off the land. And they did. They learned about the new plants (corn, potatoes, tomatoes, and many more) and animals (bison and llamas, to name the biggest examples), they mapped out river systems and mountain chains. And we have reaped the benefits ever since.

Building a colony can be fun in an interactive setting; Colonization wouldn’t exist otherwise. For a novel or visual work, it’s a little harder to make work, because the idea is that a colony starts out exciting and new, but it needs to become routine. Obviously, if it doesn’t, then that’s a place where we can find a story. Paul Kearney’s Monarchies of God is a great series that has a “settling new lands” sequence. In the science fiction realm of colonizing outer space, you also have such works as Kim Stanley Robinson’s Red Mars (and its colorful sequels).

Terra nullius

Whenever people moved into new land, there was always the possibility that they were the first ones there. It happened about 20,000 years ago in Alaska, about 50,000 in Australia, and less than 1,000 in Hawaii. Even in the Old World, there were firsts, sometimes even in recorded history. Iceland, for example, was uninhabited all the way through Roman times. And in space, everywhere is a first, at least until we find evidence of alien life.

Settling “no man’s land” is different from settling in land that’s already inhabited, and that would show in a story with that setting. There are no outsiders to worry about. All conflict is either internal to the colonists’ population or environmental. That makes for a harder story to write, I think, but one more suited to character drama and the extended nature of books and TV series. It doesn’t have to be entirely without action, though; it’s just that something like a natural disaster would be more likely than war.

This is one place where we can—must—draw the distinction between space-based sci-fi and earthly fiction or fantasy. On earth (or a similar fictitious world), we’re not alone. There are animals, plants, pests everywhere we go. We have sources of food and water, but also of disease. In deep space, such as a story about colonizing the asteroid belt, there’s nothing out there. Nothing living, at least. Settlers would have to bring their own food, their own water, their own shelter. They would need to create a closed, controlled ecosystem. But that doesn’t leave much room for the “outside” work of exploration, except as a secondary plot.

Go forth

I’m not ashamed to admit that I could read an entire book about nothing but the early days of a fictional colony, whether in the Americas or on an alien planet. I’ll also admit that I’m not your average reader. Most people want some sort of action, some drama, some reason for being there in the first place. And there’s nothing wrong with that.

But let’s look at that question. Why does the colony exist at all? The Europeans were looking for wealth at first, with things like religious freedom and manifest destiny coming later on. The exploration of space appears to be headed down the same path, with commercial concerns taking center stage, though pure science is another competitor. Even simple living space can be a reason to venture forth. That seems to have been the case for the Vikings, and plenty of futuristic stories posit a horribly overcrowded Earth and the need to claim the stars.

Once you have a reason for having a colonial settlement, then you can turn to its nature. The English made villages and towns, the French trading posts. Antarctica isn’t actually settled—by international agreement, it can’t be—but the scientific outposts there point to another possibility. If there are preexisting settlements, like native cities, then there’s the chance that the colonists might move into one of them instead of setting up their own place. That’s basically what happened to Tenochtitlan, now known as Mexico City.

Colonies are interesting, both in real history and in fiction. They can work as settings in many different genres, including space opera, fantasy, steampunk (especially the settling of the Wild West), and even mystery (we still don’t know what really happened at Roanoke Island). Even just a colonial backdrop can add flavor to a story, giving it an outside pressure, whether by restless natives or the cold emptiness of space. A colony is an island, in a sense, an island in a sea of hostility, fertile ground for one’s imagination.

Alternate histories

For a lot of people, especially writers and other dreamers, one of the great questions, a question that provokes more thought, debate, and even argument than almost any other, is “What if?” What if one single part of history were changed? What would be the result? These alternate histories are somewhat popular, as fictional sub-genres go, and they aren’t just limited to the written word. It’s a staple of Star Trek series, for example, to travel into the past or visit the “mirror universe”, either of which involves a specific change that can completely alter the present (their present, mind you, which would be our future).

What-if scenarios are also found in nonfiction works. Look at the history section of your favorite bookstore, digital or physical. You’ll find numerous examples asking things like “What if the D-Day invasion had failed?” or (much earlier in the timeline) “What if Alexander had gone west to conquer, instead of east?” Some books focus on a single one of these questions, concocting an elaborate alternative to our known history. Others stuff a number of possibilities into a single work, necessarily giving each of them a less detailed look.

And altering the course of history is a fun diversion, too. Not only that, but it can make a great story seed. You don’t have to write a novel of historical fiction to use “real” history and change things around a little bit. Plenty of fantasy is little more than a retelling of one part of the Middle Ages, with only the names changed to protect the innocent. Sci-fi also benefits, simply because history, in the broadest strokes, does repeat itself. The actors are different, but the play remains the same.

Divergence

So, let’s say you do want to construct an alternate timeline. That could easily fill an entire book—there’s an idea—but we’ll stick to the basics in this post. First and foremost, believability is key. Sure, it’s easy to say that the Nazis and Japanese turned the tide in World War II, eventually invading the US and splitting it between them. (World War II, by the way, is a favorite for speculators. I don’t know why.) But there’s more to it than that.

The Butterfly Effect is a well-known idea that can help us think about how changing history can work. As in the case of the butterfly flapping its wings and causing a hurricane, small differences in the initial conditions can grow into much larger repercussions. And the longer the time since the breakaway point, the bigger the changes will be.
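
The classic toy demonstration of that sensitivity is the logistic map, a one-line chaotic system. It’s not a model of history, just a sketch of how a one-in-a-billion difference in starting conditions eventually swamps everything.

```python
def step(x: float, r: float = 4.0) -> float:
    """One tick of the logistic map, which is chaotic at r = 4."""
    return r * x * (1.0 - x)

# Two "timelines" differing by one part in a billion at the start:
a, b = 0.400000000, 0.400000001
for n in range(1, 51):
    a, b = step(a), step(b)
    if n % 10 == 0:
        print(f"step {n:2}: A = {a:.6f}  B = {b:.6f}")
# The two timelines track each other closely at first; by around step 30
# they have diverged completely and never line up again.
```

And moving the breakaway point further back is just running more steps.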

I’m writing this on September 21, and some of the recent headlines include the Emmy Awards, the Greek elections, and the Federal Reserve’s decision to hold interest rates, rather than raising them. Change any bit of any of these, and the world today isn’t going to be much different. Go back a few years, however, and divergences grow more numerous, and they have more impact. Obviously, one of the biggest events of the current generation is the World Trade Center attacks in 2001. Get rid of those (as Family Guy did in one of their time-travel episodes), and most of the people alive today would still be here, but the whole world would change around them.

It’s not hard to see how this gets worse as you move the breakaway back in time. Plenty of people—including some that might be reading this—have ancestors who fought in World War II. And plenty of those would be wiped out if a single battle had gone differently, if a single unit’s fortunes were changed. World War I, the American Civil War (or your local equivalent), and so on: each earlier turning point causes more and more difference in the final outcome. Go back in time to assassinate Genghis Khan before he began his conquests, for instance, and millions of people in the present never would have been born.

Building a history

It’s not just the ways that things would change, or the people that wouldn’t have lived. Those are important parts of an alternate history, but they aren’t the only parts. History is fractal. The deeper you go, the more detail you find. You could spend a lifetime working out the ramifications of a single change, or you could shrug it off and focus on only the highest levels. Either way is acceptable, but they fit different styles.

The rest of this post is going to look at a few different examples of altering history, of changing a single event and watching the ripples in time that it creates. They go in reverse chronological order, and they’re nothing more than the briefest glances. Deeper delving will have to wait for later posts, unless you want to take up the mantle.

Worked example 1: The Nazi nuke

Both ways of looking at alternate timelines, however, require us to follow logical pathways. Let’s look at the tired, old scenario of Germany getting The Bomb in WWII. However it happens, it happens. It’s plausible—the Axis powers produced a lot of scientific talent that ended up on the Allied side, including Albert Einstein, Wernher von Braun, and Enrico Fermi. It’s not that great a leap to say that, with that talent retained, a German atomic bomb could have arrived a couple of years early.

But what does that do to the world? Well, it obviously gives the Axis an edge in the war; given their leaders’ tendencies, it’s not too much of a stretch to say that such a weapon would have been used, possibly on a large city like London. (In the direst scenario, it’s used on Berlin, to stop the Red Army.) Nuclear weapons would still have the same production problems they had in our 1940s, so we wouldn’t have a Cold War-era “hundreds of nukes ready to launch” situation. At most, we’d have a handful of blasts, most likely on big cities. That would certainly be horrible, but it wouldn’t really affect the outcome of the war that much, only the scale of destruction. The Allies would likely end up with The Bomb, too, whether through parallel development, defections, or espionage. In this case, the Soviets might get it earlier as well, which could lead to a longer, darker Cold War.

There’s not really a logical path from an earlier, more widespread nuclear weapon to a Nazi invasion of America, though. Russia, yes, although their army would have something to say about that. But invading the US would require a severe increase in manpower and a series of major victories in Europe. (The Japanese, on the other hand, wouldn’t have nearly as much trouble, especially if they could wrap up their problems with China.) The Man in the High Castle is a good story, but we need more than one change to make it happen.

Worked example 2: The South shall rise

Another what-if that’s popular with American authors involves the Civil War. Specifically, what if the South, the Confederacy, had fought the Union to a stalemate, or even won? On the surface, this one doesn’t have as much military impact, although we’d need to tweak the manpower and supply numbers in favor of our new victors. (Maybe France offered their help or something.) Economically and socially, however, there’s a lot of fertile ground for change.

The first and most obvious difference: in 1865 Dixie, slavery would still exist. That was, after all, the main reason for the war in the first place. So we can accept that as a given, but that doesn’t necessarily mean it would still be the case 150 years later. Slavery started out as an economic measure as much as a racial one. Plantations, especially those growing cotton, needed a vast amount of labor. Slaves were seen as the cheapest and simplest way of filling that need. The racial aspects only came later.

Even by the end of the Civil War, however, the Industrial Revolution was coming into full force. Steam engines were already there, and railroads were growing all around. It’s not too far-fetched to see the South investing in machinery, especially if it turned out to be a better, more efficient, less rebellious method of harvesting. It’s natural—for a Yankee, anyway—to think of Southerners as backwards rednecks, but an independent Confederacy could conceivably be quite advanced in this specific area. (There are problems with this line of reasoning, I’ll admit. One of them is that the kind of cotton grown in the South isn’t as amenable to machine harvesting as others. Still, any automation would cut down on the number of slaves needed.)

The states of the Confederacy depended on agriculture, and that wouldn’t change much. Landowners would be reluctant to give up their slaves—Southerners, as I know from personal experience, tend to be conservative—but it’s possible that they could be wooed by the economic factors. The more farming can be automated, the less sense servile labor makes. Remember, even though slaves didn’t have to be paid, they did have costs: housing, for example. (Conversely, slavery can still exist if the economic factors don’t add up in favor of automation. We can see the same thing today, with low-wage, illegal immigrant labor, a common “problem” in the South.)

Socially, of course, the ramifications of a Confederate victory would be much more important. It’s very easy to imagine the racism of slavery coming to the fore, even if automation ends the practice itself. That part might not change much from our own history, except in the timing. Persecuted, separated, or disfavored minorities are easy to find in the modern world, and their experiences can be a good guide here. Not just the obvious examples—the Palestinians, the Kurds, and the natives of America and Australia—but those less noteworthy, like the Chechens or even the Ainu. Revolt and rebellion might become common, even to the point of developing autonomous regions.

This might even be more likely, given the way the Confederacy was made. It was intended to be a weak national government with strong member states, more like the EU than the US. That setup, as anyone familiar with modern Europe will attest, almost nurtures the idea of secession. It’s definitely within the realm of possibility that the Confederate states would break up even further, maybe even to the point of individual nations, and a “black” state might splinter off from this. If you look closely, you can see that the US became much more centralized after the Civil War, giving more and more power to the federal government. The Confederates might have to do that, too, which would smack of betrayal.

Worked example 3: Gibbon’s nightmare

One of the other big “change the course of history” events is the fall of the Roman Empire, and that will be our last example today. How we prevent such a collapse isn’t obvious. Stopping the barbarian hordes from sacking Rome really only buys time; the whole system was hopelessly corrupt already. For the sake of argument, let’s say that we found the single turning point that will stop the whole house of cards from falling. What does this do to history?

Well, put simply, it wrecks it. The Western world of the last fifteen hundred years is a direct result of the Romans and their fall. Now, we can salvage a lot by deciding that the ultimate event merely shifted power away from Rome, into the Eastern (Byzantine) Empire centered on Constantinople. That helps a lot, since the Goths and Vandals and Franks and whatnot mostly respected the authority of the Byzantines, at least in the beginning. Doing it like this might delay the inevitable, but it’s not the fun choice. Instead, let’s see what happens if the Roman Empire as a whole remains intact. Decadent, perhaps, and corrupt at every level, but whole. What happens next?

If we can presume some way of keeping it together over centuries, down to the present day, then we have a years-long project for a team of writers, because almost every aspect of life would be different. The Romans had a slave economy (see above for how that plays out), the trappings of republican government, and some pretty advanced technology, especially compared to their immediate successors. We can’t assume that all of this would carry down through the centuries, though. Even the Empire went through its regressive times. The modern world might be 400 years more advanced, but it’s no less likely that development would be set back by a hundred years or more. The Romans liked war, and war is a great driver of technology, but you eventually run out of people to fight, and a successful empire requires empire-building. And a Pax Romana can lead to stagnation.

But the Dark Ages wouldn’t have happened, not like they really did. The spread of Islam might have been stopped early on, or simply contained in Arabia, but that would have also prevented their own advances in mathematics and other sciences. The Mongol invasions could have been stopped by imperial armies, or they could have been the ruin of Rome on a millennium-long delay. Exploration might not have happened at the same pace, although expeditions to the Orient would be an eventual necessity. (It gets really fun if you posit that China becomes a superpower in the same timeline. You could even have a medieval-era Cold War.)

Today’s world, in this scenario, would be different in every way, especially in the West. Medieval Europe was held together by the Christian Church. Our hypothetical Romans would have that, sure, but also the threat of empire to go with it. Instead of the patchwork of nation-states that marked the Middle Ages, you would have a hegemony. There might be no need for the Crusades, but also no need for the great spiritual works iconic of the Renaissance. And how would political theory grow in an eternal empire? It likely wouldn’t; it’s only when people can see different states with different systems of government that such things come about. If everybody is part of The One Empire, what use is there in imagining another way of doing things?

I could go on, but I won’t. This is a well without a bottom, and it only gets deeper as you fall further. It’s the Abyss, and it can and will stare back at you. One of my current writing projects involves something like an alternate timeline—basically, it’s a planet where Native Americans were allowed to develop without European influence—and it has taken me down roads I’ve never dreamed of traveling. Even after spending hundreds of hours thinking about it, I still don’t feel like I’ve done more than scratch the surface. But that’s worldbuilding for you.