The Second Enlightenment

The Third Dark Age is upon us. We live in the modern equivalent of the final days of Rome, waiting for the sack that finishes off the Empire once and for all. Like the Huns and their Bronze Age counterparts, invaders run rampant in our towns and cities, not only unhindered by those who claim to lead us, but actively supported by them. Meanwhile, alleged academics want to banish all knowledge of the past, for fear of the masses recognizing our decline.

Can we halt our decline? Probably not, given how far down the path we’ve already come. But that doesn’t mean we can’t work to ensure that it is as short as possible, and that our descendants are not left in a centuries-long era of regression and barbarism.

Awakening

Out of the Greek Dark Age came legends: Homer and Exodus, mythic tales of people overcoming the great and powerful with more than a little help from their chosen deities. By the time the dust had settled, the world had changed irrevocably, a full break with the past. Gone were the Hittites and Trojans and Canaanites, while Darius and Alexander were still hundreds of years away.

It was a long road back, but we got there in the end. Eventually, rational thought returned to the Western world, largely confined to Greece at this stage. Philosophy was born, and with it the awakening of wisdom, of reason.

Much, much later, the fall of Rome and rise of Islam brought the Medieval Dark Age to Europe, all but extinguishing that light. And while the cultural, technological, and even scientific knowledge of the West rose from the mire after relatively few generations, the higher purpose of wisdom, of the kind of knowledge that creates civilizations and jump-starts human progress, lay dormant far longer. Instead, Europe looked to religion, to mysticism and myth, for another few centuries.

Only when science had advanced far enough to prove the fairytale stories of Jewish scripture demonstrably false could the Enlightenment begin. And only when it began was Europe able to cast off the last of the darkness.

That didn’t begin until the early 17th century, with great thinkers like Galileo and Bruno at the start, followed later by Newton and Spinoza. Eventually, the Enlightenment even began to fracture, different regions going their separate ways. The French Enlightenment, for instance, gave us the rational philosophy of Descartes and the like: ways to look at the world that didn’t invoke the supernatural. The English and even Scottish, on the other hand, contributed the wisdom of politics, economics, and the "hard" sciences. Last, but most important, was the American Enlightenment, bringing the liberal (in the classical sense) values of French thinkers together with the moral imperatives of free speech, free markets, and freedom of religion that came from their rivals across the Channel.

In that sense, then, there were still bits of darkness in the world as late as 1800. (Really, not all of them have left even now, but I digress.) Even though Europe didn’t have wandering hordes of invaders anymore, we in the West still needed a thousand years after Charlemagne to truly return to the glory days he was trying to emulate.

And that pairs up well with the Greek Dark Age. Yes, Homer was writing his epics in the ninth or tenth century BC, but they were the beginning, not the end, of Greece’s rise. Most of the great thinkers we associate with the Greek school of philosophy came much later, in the fifth and fourth centuries BC. In other words, the better part of a millennium after the invasions of the Sea Peoples. In a very real sense, then, the path from darkness to light was longer for medieval Europe.

Learning from the past

We must do better than that. The effects of our Third Dark Age can’t last for a thousand years. We’ve come too far as a species to allow ourselves to be dragged back into the darkness, no matter what the "traditionalist" right and "inclusive" left wish for us.

So how do we do it?

First, we must keep knowledge alive. True knowledge, the wisdom passed down to us and created in our own time. Digital collections such as Library Genesis are a good thing—the fact that elites hate them so much is a pretty good indication of that—but they have the downside of being, well, digital. If the Third Dark Age collapse is too great, ubiquitous computing won’t be a given. In addition to distributed, censorship-resistant online libraries, then, we need open, secure libraries in the physical world. The Library of Alexandria, except there’s one in every city, and their shared goal is to archive as much knowledge as possible, in ways that will endure even the harshest decline.

Second, those of us who are awake to the peril must continue to share that knowledge. Within our family, our community, and our country (in that order), we should be training others in the skills we possess, while also passing down what we have learned. And this includes the greatest lessons of all: that the world is not some divine mystery; that humanity is inherently a good and positive force; that science is not reserved to the elite, or those with the right credentials, but is something every one of us experiences every single day.

Third on the list is a greater focus on that community. The post-collapse time of the previous Dark Ages was a reversion to a heavily decentralized world. An anti-globalism, or a "localism", if you will. In modern times, I can foresee that creating a multitude of city-states; we’re already pretty close to this with New York, London, and a few others. But even rural areas will have to become more self-reliant as the Third Dark Age brings a fall of the American Empire. We can’t do that if we don’t know our neighbors. (We also can’t band together to resist invasions without that sense of brotherhood, so this strengthening of community has more than one beneficial effect.)

Fourth, reconnecting with our past, in the sense of doing useful work outside of the internet. This can mean writing books, building a shed, or almost anything else. The key is that it has a physical presence: a tangible manifestation of our knowledge, which matters more in a world that will come to see knowledge as worthless in itself. I’m not saying to become a prepper—that might be more useful in other collapse circumstances—but to prepare for a major shift in what society deems important.

Above all, we need to remember, to preserve, to teach the world that the coming darkness is not eternal. There is a light beyond, and it is not the light at the end of the tunnel. It is the dawn of a new day. Working together, recognizing what we are losing and why, we have the chance to bring that new dawn faster than in our ancestors’ previous two attempts.

If knowledge is power, our job is to be a generator. And that’s what you need to keep the lights on in a disaster.

The Third Dark Age

Twice before, the West faced a crisis, a series of unfortunate events that led ultimately to the decline of the reigning powers of the civilized world, a long stretch of technological stagnation or even regression, and a loss of the cultural achievements of those who came before. In short, a Dark Age.

Our world today is on the same track. We’re following those same steps, dealing with those same crises. Unlike the past, however, we have the ability to recognize what is happening, and to stop it. But we can only do that if we acknowledge our situation. To do that, we must understand the warning signs and the parallels.

The last Dark Age

Most people in the Western world have heard of the Dark Age. (Sometimes referred to as the Dark Ages, but the singular is important here, as you’ll see shortly.) The time after the fall of the Roman Empire was a period of barbarism in Europe, a long stretch of Vikings and serfdom and general horror.

Archaeological finds show that our legends of this time are exaggerated. Alas, these finds have given ammunition to those on both sides of the political spectrum who wish to argue that the Dark Age never even happened. The left will point to algebra and the Almagest to say, "See? Muslims kept making advances where white Europeans failed." The right, meanwhile, counters with, "Look at those cathedrals! That’s proof that Christianity is what kept civilization going."

Both are wrong, of course. The few Muslim inventions—and their occasional translation of ancient knowledge—don’t make up for the ravages of the Moorish conquest of Spain. The cathedrals built in the 9th century weren’t constructed from Roman concrete, because knowledge of how to make that was lost along with so much else. There was a Dark Age, no doubt about it. The only questions are how long it lasted and just how dark it was.

By any reasonable estimation, the fall of Rome was the tipping point. Many of the Gothic tribes that settled in Italy, France, and Spain at the time still considered themselves vassals of the Empire, to some extent, and some continued to pay homage to Constantinople, the eastern capital where the flame of civilization was kept alive. But even that had ended by 540, following a volcanic winter (caused by an eruption in Central America!) and attendant famines and plagues. So we can put the start of the Dark Age around 500 AD, plus or minus about 40 years.

When did it end? Tradition has it lasting as late as 1066, with the Norman Conquest. Academics like to credit Charlemagne’s coronation in 800 as ending it. I’d say the best date comes in between those. The early not-quite-Renaissance of the late ninth and early tenth centuries marks European culture’s rise from its nadir far better than the end of the Merovingian era does. Personally, I’d use 927, the year of Æthelstan’s coronation as king of England, as a good compromise, but you could make a case for anywhere in the range of 870-940.

In between, most of the continent was a mess. Rational thought took a back seat to mysticism and monasticism. Texts, cultural contributions, and even general knowledge of the Roman Empire all fell away, until the Romans themselves almost became mythologized. The typical Hollywood portrayal of medieval peasants in dirty, tattered clothing, treading muddy streets to go back to their thatch hovels, isn’t entirely accurate, but it’s probably closer to the truth than the almost romantic notions of some traditionalists.

The European Dark Age was, to sum up, a time when the strong ruled, the weak toiled, and the wisdom of the past was forgotten. What’s worse is that it wasn’t the first time that had happened.

The one before

As far back in history from the start of the European Dark Age as it is from our present day, the lands of the Mediterranean faced a crisis. This was precipitated by invasions from what are commonly called the Sea Peoples, a later collective name given to groups who are mostly known only from a few Egyptian accounts. We can tentatively identify some of them from such accounts, however: the Achaeans, Sicilians, Sardinians, and Philistines. Possibly also the Etruscans, though this etymology is on somewhat shakier ground.

Whoever they were, the Sea Peoples invaded Egypt, most of the Levant, and Anatolia. They clashed with the major civilizations of that time—Egyptians, Hittites, and so on—and ultimately wore them down so much that they fell into their own decline. It wasn’t a conquest, but more like a war of attrition, the same way forces in Palestine and Lebanon (almost the exact same place!) are bleeding their occupiers dry as we speak.

Dates are hard to pin down this far back in history, but the one most commonly given for the start, or perhaps the height, of the troubles is 1177 BC, owing to the popular book of the same name. That book isn’t bad, except for the part where it plays the usual academic trick of courting notability by minimizing the impact of known historical events.

This "Greek Dark Age", as it’s commonly called, isn’t as much known, but its effects were no less drastic than the European one that started a millennium and a half later. The Hittites fell out of history entirely, to the point where our only knowledge of them as recently as 200 years ago was a mention in the Bible. The Egyptians fared a little better, but lost most of their holdings east of the Sinai to the Philistines and Canaanites, who—in another event paralleled by modern times—later lost them to invading Hebrews. Farther north, Troy fell to an alliance that included Sea Peoples; its collapse was so total that the modern West thought the whole city was a myth until it was rediscovered.

Three thousand years is a long time, so it’s only reasonable that we have far less data about the Greek Dark Age. We don’t know a lot of details about it, but what we do know shows that it follows the same pattern as the one that befell Europe later on.

What’s to come

The biggest contributor to both of the previous Dark Ages was invasion. Rome was invaded by Goths, Huns, Vandals, and (later) Moors, all of whom picked apart the bones of the empire and left little behind for its citizens. The Sea Peoples did much the same to the powers of the Bronze Age, even when Ramesses II tried to resettle some of them.

It’s not hard to see that pattern repeating today. Our own country is being invaded as we speak, as are so many of the major Western powers. Millions of "asylum-seekers" who consume resources but refuse to assimilate, who provoke or cause violence, who care nothing for the sentiments of those who call this land home. The Haitians eating pets in Ohio, the Venezuelans capturing apartment complexes in Colorado, the rape gangs of England…these are the modern Sea Peoples, the modern Huns and Moors. And they are one tip of the trident thrust our way.

The apocalyptic conditions of the 530s contributed to the European Dark Age, as well as to the fall of a number of smaller cultures in the Middle East, where the resulting power vacuum provided fertile ground for a Muslim conquest. It’s harder to pinpoint a major ecological disaster for the Greek Dark Age; probably the closest is an impact (or airburst) event on the shores of the Dead Sea circa 1600 BC, the historical basis for the Sodom and Gomorrah myth. But that is surely too far back. Whatever the cause, the Sea Peoples clearly had some reason to migrate south. Perhaps it’s linked to the fall of the Minoans on Crete, another total collapse in that era.

Today, we don’t seem to have as much to worry about on that front. We’re in a stable climate epoch, a period of global "greening" while temperatures remain steady and comfortable. In our case, the ecological angle of collapse might come from an overreaction on the part of—or simply led by—doomsayers who claim our relatively quiescent climate is somehow a bad thing, and that we need to go back to the days of the Little Ice Age.

More likely, ecology will be used as a way to contribute to the collapse. We already see that happening, as clean nuclear plants are shut down and replaced with toxic solar panels and bird-killing turbines. Eugenics is another possibility: the attempts by the so-called "elite" to force us to eat bugs or genetically modified plants, to take experimental drugs that are shown to have a deleterious effect on our health.

The third and final pillar that must fall to create a Dark Age is cultural continuity. In modern times, that one isn’t so much collapsing as being demolished. The entire agenda of ideologies such as progressivism and communism is to create a clean break with the past, with the traditions and customs that brought us to where we are. What little history is allowed to be learned is shown through a distorted lens, and too many who should oppose such acts instead welcome them, hoping that, in the chaos that follows, their particular ideologies will have a chance to step forward.

To be continued

Our new Dark Age, then, will come from those factors: unchecked immigration, ecological fear-mongering, and the destruction of our heritage. That’s not to say these things will start happening soon. No, they’re already happening. With every border crossing, every falsified temperature record, every statue torn down, we sink deeper into the darkness. We’re on the path of decline right now. We have been for almost the entire 21st century.

The question then becomes: what are we going to do about it? In the next post, I’ll offer my own thoughts on a solution. To combat the Third Dark Age, I believe we’ll need a Second Enlightenment.

Lands of the lost

Recently, I finished reading Fingerprints of the Gods. I picked it up because I found the premise interesting, and because the mainstream media made such a big deal about author Graham Hancock getting a Netflix miniseries to showcase his unorthodox theories. I went into the book hoping there would be something tangible about those theories. Unfortunately, there isn’t.

Time of ice

The basic outline of the book is this: What if an advanced civilization existed before all known historical ones, and imparted some of its wisdom to those later civilizations as a way of outliving its own demise?

Put like that, it’s an intriguing proposition, one that has cropped up in many places over the past three decades. The Stargate franchise—one of my favorites, I must admit—is based largely on Hancock’s ideas, along with those of noted crackpots like Erich von Daniken. Chrono Trigger, widely regarded as one of the greatest video games of all time, uses the concept as a major plot point. Plenty of novels, especially in fantasy genres, suppose an ancient "builder" race or culture whose fingerprints are left within the world in some fashion.

It was this last point that piqued my interest, because my Otherworld series revolves around exactly this. I even unknowingly used some of Hancock’s hypotheses for it. The timing of my ancients leaving Earth for their second world matches that of his ancients’ final collapse. The use of archaeoastronomy as a path to their knowledge arises in my books. Even using the prehistoric Mesoamericans as the catalyst wasn’t an original idea of mine; in my case, however, I did it so I wouldn’t have to deal with the logistics of the characters traveling to another continent.

Some of the questions Hancock asks are ones that need to be asked. It’s clear that ancient historical cultures the world over have some common themes which arise in their earliest mythology. Note, though, that these aren’t the specific ones he lists. The flood of Noah and Gilgamesh is entirely different from those of cultures beyond the Fertile Crescent and Asia Minor, for example, because it most likely stems from oral traditions of the breaking of the Bosporus, which led to a massive expansion of the Black Sea. Celts, to take one instance, would instead have a flood myth pointing to the flooding of what is now the Dogger Bank; peoples of New Guinea might have one relating to the inundation of the Sunda region; American Indian myths may have preserved echoes of the flooding of Beringia; and so on.

While the details Hancock tries to use don’t always work, the broad strokes of his supposition have merit. There are definitely astronomical alignments in many prehistoric structures, and some of them are downright anachronistic. Too many indigenous American cultures have myths about people who most definitely are not Amerind. (And now I’m wondering if Kennewick Man was a half-breed. I may need to incorporate that into a book…)

The possibility can’t yet be ruled out that cultures with technology more advanced than their direct successors did exist in the past. We know that Dark Ages happen, after all. We have historical records of two in the West (the familiar medieval Dark Age beginning c. 500 AD and the Greek Dark Age that started c. 1200 BC), and we’re very likely on the threshold of what might one day be termed the Progressive Dark Age.

With the cataclysmic end of the Ice Age and the catastrophic Younger Dryas cold snap, which now seems likely to have been caused by at least one asteroid impact, there’s a very good impetus for the "breaking the chain" effect that leads to a Dark Age, one that would erase most traces of such an advanced civilization.

Habeas corpus

Of course, the biggest problem with such a theory is the lack of evidence. Even worse, Hancock, like most unorthodox scholars, argues from an "absence of evidence is not evidence of absence" line of thought. Which is fine, but it’s not science. Science is about making testable and falsifiable predictions about the world. It’s not simply throwing out what-ifs and demanding that the establishment debunk them.

The onus is on those who propose alternative theories, and this is where Hancock fails miserably. Rarely in the book does he offer any hard evidence in favor of his conjecture. Instead, he most often uses the "beyond the scope of this book" cop-out (to give him credit, that does make him exactly like any orthodox academic) or takes a disputed data point as proof that, since the establishment can’t explain it, he must be right. It’s traditional crackpottery, and that’s unfortunate. I would’ve liked a better accounting of the actual evidence.

Probably the most disturbing aspect of the book is the author’s insistence on taking myths at face value. We know that mythology is absolutely false—the Greek gods don’t exist, for example—but that it can often hide clues to historical facts.

To me, one of the most interesting examples of this is also one of the most recent: the finding in 2020 of evidence pointing to an impact or airburst event near the shore of the Dead Sea sometime around 1600 BC. This event apparently not only destroyed a town so violently that it vaporized human flesh, but also scattered salt from the sea over such a wide region that it literally salted the earth. And the only reference, oral or written, to this disaster is as a metaphor, in the Jewish fable of Sodom and Gomorrah.

Myths, then, can be useful to historians and archaeologists, but they’re certainly not a primary source. The nameless town on the shore of the Dead Sea wasn’t wiped out by a capricious deity’s skewed sense of justice, but by a natural, if rare, disaster. Similarly, references in Egyptian texts to gods who ruled as kings don’t literally mean that those gods existed. Because they didn’t.

In the same vein, Hancock focuses too much on numerological coincidences, assuming that they must have some deeper meaning. But the simple fact is that many cultures could independently hit upon the idea of dividing the sky into 360 degrees. It’s a highly composite number, after all, and close enough to the number of days in the year that it wouldn’t be a huge leap. That the timeworn faces of the Giza pyramids are currently in certain geometric ratios doesn’t mean that they always were, that they were intended to be, or that they were meant as a message from ten thousand years ago.

Again, the burden of proof falls on the one making the more outlandish claims. Most importantly, if there did exist an ancient civilization with enough scientific and technological advancement to pose as gods around the world, there should be evidence of their existence. Direct, physical evidence. An industrial civilization puts out tons upon tons of waste. It requires natural resources, including some that are finite. The more people who supposedly lived in this Quaternary Atlantis, the more likely it is that we would have stumbled upon their remains by now.

Even more than this, the scope of Hancock’s conjecture is absurdly small. He draws most of his conclusions from three data points: Egypt, Peru, and Central America. Really, that’s more like two and a half, because there were prehistoric connections between the two halves of the Americas—potatoes and corn had to travel somehow. Rarely does he point to India, where Dravidians mangled the myths of the Yamnaya into the Vedas. China, which became literate around the same time as Egypt, is almost never mentioned. Did the ancients just not bother with them? What about Stonehenge, which is at least as impressive, in terms of the necessary engineering, as the Pyramids?

Conclusion

I liked the book. Don’t get me wrong on that. It had some thought-provoking moments, and it makes for good novel fodder. I’ll definitely have to make a mention of Viracocha in an Otherworld story at some point.

As a scientific endeavor, or even as an introduction to an unorthodox theory, it’s almost useless. There are too many questions, too few answers, and too much moralizing. There’s also a strain of romanticism, which is common to a lot of people who study archaeological findings but aren’t themselves archaeologists. At many points, Hancock denigrates modern society while upholding his supposed lost civilization as a Golden Age of humanity. You know, exactly like Plato and Francis Bacon did.

That said, it’s worth a read if only to see what not to do. In a time when real science is under attack, and pseudoscience is given preferential treatment in society, government, and media, it’s important to know that asking questions is the first step. Finding evidence to support your assertions is the next, and it’s necessary if you want to advance our collective knowledge.

The last Dark Age

In the title of this post, “last” means “previous” rather than “final”, for I truly believe we are on the precipice of a new Dark Age. With that in mind, it’s not that bad an idea to look back at the one that came before.

Defining the moment

A lot of modern academics don’t even like talking about the Dark Ages. They prefer the bland descriptor “Early Middle Ages” instead. But that line of thinking is faulty in multiple respects.

First, the reasoning given for calling the Dark Ages something else is that the “darkness” of the times was a localized phenomenon. Outside of Europe, it wasn’t all that dark. The Islamic world, for instance, had a bit of a renaissance around the same time, and China barely noticed the troubles of the West at all.

However, this same logic should dictate that the Middle Ages are no less localized. After all, the term comes from post-medieval sources who placed that time between their modern era and the classical period of the Greeks and Romans. Similarly, is referring to the Iron Age (which began around the time of the Greek Dark Age, circa 1177 BC) any less patronizing? Iron tools were never developed by natives in the Americas or Australia; what was the Iron Age in Anatolia would have been nothing more than the later Stone Age in Mesoamerica. The Middle Ages aren’t “middle” at all, except through the same lens that gives us the Dark Ages.

The second reason why it’s an error to conflate the Dark Ages with the Middle Ages is character, and it’s the subject of this post.

Beginning and ending

Before we can get to that, though, we need to define the limits of the period. The beginning is fairly easy, because Europe’s decline can be traced directly to the fall of Rome in 476 AD. This event was the culmination of decades of barbarian activity, with the entire empire facing threats from waves of migrant Vandals, Goths, Huns, and others. Those peoples slowly encroached upon Roman territory, nipping away at the borders, until they were able to reach the capital itself. Rome was taken, and the last western emperor, Romulus Augustulus, fled into exile. Or was sent there. Conflicting tales exist, but the gist is clear: Europe no longer bowed to Rome.

Things didn’t change overnight, of course. The barbarian kings often paid homage to the Byzantine emperors, who continued to style themselves Roman all the way to the 15th century. For a time, they considered themselves successors to the western throne, or at least to the provinces it had once controlled.

No, the Dark Ages only truly began once continuity was lost. That was a slow breakdown over years, decades, generations. The barbarian hordes lacked Roman culture. Without an imperial presence in Europe, that culture began to disappear, fading into memory as those who continued to consider themselves Roman aged and died. Later in the post, we’ll look at what that entailed.

As for when the Dark Ages ended, that’s a tougher question. Some might point to the coronation of Charlemagne as Holy Roman Emperor in 800. Indeed, this did rejuvenate Europe for a time, bringing about the Carolingian Renaissance, and the 9th century gave us a few technological advances; Cathedral, Forge, and Waterwheel, by Joseph and Frances Gies, details some of these, including the three in the book’s title.

Another date might be 927, marking the defeat of the Vikings by Æthelstan, first King of England. This was significant from both a political and religious standpoint, as England became a unified Christian kingdom for the first time in its history; Spain, for instance, wouldn’t manage that for nearly 600 years. And Æthelstan’s victory over the Danes did begin to bring about the changes that define the Middle Ages, such as the feudal system.

Still others would argue that the Dark Ages didn’t really end until William the Conqueror was crowned in 1066. By this point, all the pieces of the Middle Ages were in place, from the manorial society to the schism of Catholic and Orthodox. The Reconquista had begun in Spain, Turks were overrunning Byzantine lands, and the Crusades were about to begin. Clearly, the world had moved on from the Fall of Rome.

Continuum

Personally, I think that’s too late, while the Charlemagne date of 800 seems a bit too early. But it may be that there is no single date we can point to and say, “The Dark Ages ended here.” Rather, there’s a continuum. The period ended at different times in different places throughout Europe, as connections to the past were rediscovered, and connections among those in the present were strengthened.

When the period began, the results were devastating. As Roman rule fell, so too did Roman institutions. The roads, so famous that we enshrine them in aphorisms, began to succumb to the ravages of time. Likewise for the bath, the forum, the legal framework, and the educational system.

The replacements weren’t always up to par, either. One of the reasons the Dark Ages are, well, dark is because of the relative lack of written works from the time. We have tons of Roman-era books: Caesar’s commentaries on the Gallic Wars, Ovid’s masterpieces, the Stoic philosophy of Marcus Aurelius, and even the New Testament of the Bible all come from the Roman world. By contrast, the best-known writings to come from the period 476-1066 are histories like the Anglo-Saxon Chronicle, religious texts such as those by Bede, and Beowulf.

That’s not to say that people in the Dark Ages were stupid. Far from it. Instead, they had different priorities. They lived in a different world, one that didn’t have much opportunity for philosophy. Even when it did, that was almost exclusively the domain of the Church, one of the few institutions that retained some measure of continuity with the previous age.

With the breakdown of Roman society came a change in the way people saw themselves. While the barbarians did become civilized, they didn’t become Romanized. Gone were the trappings of the republic and the scholastic zeal we associate with Late Antiquity. Dark Age society focused more on tribal identity, family honor, and individual heroism. The world, in a sense, shrank for the average person. Some of the changes came from the pagan background of the Gauls, Goths, and others, and those habits persisted even after the tribes converted to Christianity.

The unifying power of the Church may have helped usher in the end of the Dark Ages, in that it created the backdrop for the centralization of secular power, turning petty kingdoms into nation-states. Seven English kingdoms became a single England. Vast swathes of Europe fell under the rule of the emperor in Aachen. And this could be seen as lifting the continent out of the mire. A powerful nation can build bigger than a small tribe; the grand cathedrals begun in the ninth and tenth centuries are evidence of that.

But that didn’t change the fact that so much had been lost. In some places, particularly rural Britain, standards of living (which weren’t all that high in Roman times, to be fair) dropped to a level not seen since the Bronze Age, some 2000 years before. With Roman construction and sanitation forgotten, life expectancies fell, as did urban population. This was the Dark Ages in a nutshell. When Hobbes describes early man’s life as “nasty, brutish, and short,” he’s also talking about post-Roman, pre-medieval Europe. A life without even the most basic trappings of civilization, with little hope for advancement except through heroic deeds, with the specter of death lurking around every corner…that’s not much of a life at all.

Light returns

The Dark Ages did, however, come to an end. As I said above, the ninth century brought about the Carolingian Renaissance, a small uplifting. Much later came the 12th-century version, which brought about the High Middle Ages. Bits of darkness lingered all the way to 1453, when the last vestige of ancient Rome fell to the Ottoman Empire.

Odoacer’s capture of the imperial capital in 476 brought about, in a sense, the end of the world. When Mehmed II took the other Roman capital, Constantinople, a millennium later, the effect was quite different. Instead of a new Dark Age, the end of the Byzantines fanned the flames of the Renaissance. The true Renaissance, the one that deserves the name. By then, so much of classical times had been forgotten by Europe at large, but it was now rediscovered, the bonds reforged.

Dark Ages end when light shines through. Or when enough people decide that they are destined for greater things. In Europe, the three centuries after 476 were a period of stasis, even regression. What little of our modern media touches on this period tends to focus on heroes real or invented: Vikings, The Last Kingdom, and so on. That’s understandable, as the life of the ordinary Saxon in Winchester, the Frank in Paris, or the Lombard in Pavia is relatively dull and uninspiring. The ones whose names we remember are those who rose above that. Heroes exist in every age, no matter what the society around them looks like.

Darkness, in this sense, can be defeated. This is a darkness of ignorance, of barbarism, of tribal infighting. Knowledge is the light that washes it away. To this day, we still can’t recreate some of the progress of Antiquity: we don’t know precisely how the Romans made their concrete, the composition of Greek fire, or the purpose of the Antikythera Mechanism.

Those secrets were lost because continuity was lost. The passing of culture from one generation to the next stopped, breaking a chain that had endured for centuries. With our interconnected world of today, it’s easy to think that can’t happen anymore. After all, we can call up an entire library on our phones. But what happens when that chain is sabotaged? What happens when culture and history are intentionally altered or buried? The result would be a new Dark Age.

Culture and history forgotten. Waves of migrants. Cities sacked. The loss of classical education and scholasticism. Sounds awfully familiar, doesn’t it?

The price of protest

Tin soldiers and Nixon coming…four dead in Ohio

I have written a lot in the past few years to commemorate the 50th anniversary of various spaceflight milestones: the Apollo 8 lunar orbit, Apollo 11’s landing in 1969, and so on. I do that because I love the American space program, of course, but also because I believe its accomplishments rank among the greatest in human history. They are certainly shining lights in the 20th century.

But we must also remember the darker days, lest, to paraphrase Santayana, we be doomed to repeat their mistakes.

This day 50 years ago, on May 4, 1970, four students at Kent State University were shot and killed by National Guard soldiers during a protest against the Vietnam War. Nine others were injured, a college campus became a battlefield, and the entire nation lost whatever vestiges of innocence it still had after years of needless death in the jungles of Southeast Asia.

I was not alive for these events. They were 13 years before I was born; those who lost their lives were over a decade older than my parents! Yet I have seen the documentaries. I’ve read the stories. That is how history survives, through the telling and retelling of events beyond our own experience. In the modern era, we have photographs, television recordings, and other resources beyond mere—and fallible—human memory.

For Kent State, I’ve watched the videos from the tragedy itself, and few things have ever left me more disgusted, more saddened, and more…angry. It boggles my mind how anyone, even soldiers trained in the art of war and encouraged to look at their enemy as less than human, could think this was a good thing, a just thing. Yet they did not hold their fire. If they stopped to think, “These are young Americans, people just like me, and they’re doing what’s right,” then it never showed in their actions.

Worse, however, is the public perception that followed. In the wake of the massacre, polls showed that a vast majority of people in this country supported the soldiers. Yes. About two-thirds of those surveyed said they felt it was justified to use lethal force against peaceful protestors who were defending themselves.

Let’s break that down, shall we? First, protests are a right. The “right of the people peaceably to assemble” is guaranteed in the First Amendment; it doesn’t get the attention of speech, religion, and the press, but it’s right there alongside them. And remember that the Bill of Rights, as I’ve repeatedly stated in my writings, is not a list of those rights the government has granted its citizenry. Rather, it’s an incomplete enumeration of rights we are born with—“endowed by our Creator”, in Jefferson’s terms—that cannot be taken away by a government without resorting to tyranny.

Some may argue that the Kent State protests were not peaceful. After all, the iconic video is of a student throwing a canister of tear gas back at the National Guardsmen called in to maintain order, right? But that argument falls flat when you see that the tear gas came from those same soldiers. It was fired to disperse the crowd. The protestors didn’t like that, so they risked physical danger (not only the chance of getting shot, but even just burns from the canisters themselves) to clear the space they had claimed as their own.

And finally, the notion that killing students was the only way to end the protest would be laughable if it weren’t so sad. They were unarmed. Deescalation should always be the first option. Whatever you think about the protest itself, whether you feel it was wholly justified or dangerously un-American, you cannot convince me that shooting live rounds into a crowd is an acceptable answer. The only way, in my opinion, you could convince yourself is if you accept the premise that these students were enemy collaborators, and the National Guard’s response was legitimate under the rules of engagement.

But that presumes a dangerous proposition: that American citizens opposing a government action they feel is morally wrong constitutes a threat to the nation. And here we see that those lessons learned in Kent State 50 years ago have been forgotten since.


Today, we don’t have the Vietnam War looming over us. The eternal morass of Iraq and Afghanistan, despite taking twice as much time (and counting), has long since lost the furious reactions it once inspired. Trump’s presidency was worth a few marches, the Occupy and Tea Party movements were quashed or commandeered, and even the Great Recession didn’t prompt much in the way of social unrest.

But a virus did.

Rather, the government response to the Wuhan virus, whether on the federal, state, or local level, has, in some places, been enough to motivate protests. The draconian lockdown orders in Michigan, California, North Carolina, and elsewhere, unfounded in science and blatantly unconstitutional, have lit a fire in those most at risk from the continued economic and social devastation. Thousands marching, cars causing gridlock for miles, and beaches flooded with people who don’t want to hurt anyone, but just yearn to breathe free. It’s a stirring sight, a true show of patriotism and bravery.

Yet too many people see it as something else. They believe the protests are dangerous. The governors know what’s best for us, they argue. They have experts backing them up. Stay at home, they say. It’s safe there. Never mind that it isn’t. As we now know through numerous scientific studies, the Wuhan virus spreads most easily in enclosed environments and close quarters. It’s most deadly for the elderly, and some two out of every three deaths (even overcounting per federal guidelines) come from nursing homes and similar places. For the vast majority of people under the age of 60, it is, as the CDC stated on May 1, barely more of a risk than “a recent severe flu season” such as 2017-18. Compared to earlier pandemic flu seasons (e.g., 1957, 1969), it’s not that bad, especially to children.

Of course, people of all sorts are dying from it. That much is true, and my heart cries out for every last one of them. Stopping our lives, ending our livelihoods, is not the answer. People, otherwise healthy people who aren’t senior citizens, die from the flu every year. My cousin did in 2014, and he was 35. That’s the main reason I feared for my life when I was sick back in December; looking back, the symptoms my brother and I showed match better with the Wuhan virus than with the flu, and each week brings new evidence pointing to the conclusion that it was in the US far earlier than we were told. If that is what we had, it didn’t kill us, just like it won’t kill the overwhelming majority of people infected.

Epidemiology isn’t my goal here, however. I merely wanted to remind anyone reading this that the virus, while indeed a serious threat, is not the apocalypse hyped by the media. Common sense, good hygiene, and early medical treatment will help in most cases, and that’s no different from the flu, or the pneumonia that almost put me in the ICU in 2000, or even the common cold.

Now that all indications are showing us on the downslope of the curve, I’d rather look to the coming recovery effort, and the people—the patriots—who have started that conversation in the most public fashion. The Reopen America protestors are doing exactly what Americans should do when they perceive the threat of government tyranny: take to the streets and let your voice be heard. Civil disobedience is alive and well, and that is a good thing. It’s an American thing.

The movement is unpopular, alas. Reopen protestors are mocked and derided. Those who report on them in a favorable light are called out. A quick perusal of Twitter, for instance, will turn up some truly awful behavior. Suggestions that anyone protesting should be required to waive any right to medical treatment. Naked threats of calling Child Protective Services on parents who let their kids play outside. Worst of all, the holier-than-thou smugness of those who would willingly lock themselves away for months, if not years, over something with a 99.8% survival rate, solely on the basis of an appeal to authority.

A past generation would call such people Tories; in modern parlance, they are Karens. I call them cowards. Not because they fear the virus—I did until I learned more about it, and I accept that some people probably do need to be quarantined, and that some commonsense mitigation measures are necessary for a short time.

No, these people are cowards because they have sacrificed their autonomy, their rationality, and their liberty on an altar of fear, offerings to their only god: government. It’s one thing to be risk-averse. We beat worse odds than 500-to-1 all the time, but there’s always a chance. To live your life paralyzed by fear, unable to enjoy it without worrying about all the things that might kill you, that’s a terrible way to live. I know. I’ve been there. But never in my darkest moments did I consider extending my misery to the 320 million other people in this country. That is true cowardice, to be so afraid of the future that you would take it from everyone else.

Protest is a powerful weapon. The Vietnam War proved that beyond a shadow of a doubt. Fifty years ago today, four Ohio students paid the ultimate price for wielding that weapon. But they died believing what they did was right. They died free, because they died in a public expression of the freedom each of us is gifted the day we’re born.

Better that than dying alone in your safe space.

Future past: computers

Today, computers are ubiquitous. They’re so common that many people simply can’t function without them, and they’ve been around long enough that most can’t remember a time when they didn’t have them. (I straddle the boundary on this one. I can remember my early childhood, when I didn’t know about computers—except for game consoles, which don’t really count—but those days are very hazy.)

If the steam engine was the invention that began the Industrial Revolution, then the programmable, multi-purpose device I’m using to write this post started the Information Revolution. Because that’s really what it is. That’s the era we’re living in.

But did it have to turn out that way? Is there a way to have computers (of any sort) before the 1940s? Did we have to wait for Turing and the like? Or is there a way for an author to build a plausible timeline that gives us the defining invention of our day in a day long past? Let’s see what we can see.

Intro

Defining exactly what we mean by “computer” is a difficult task fraught with peril, so I’ll keep it simple. For the purposes of this post, a computer is an automated, programmable machine that can calculate, tabulate, or otherwise process arbitrary data. It doesn’t have to have a keyboard, a CPU, or an operating system. You just have to be able to tell it what to do and know that it will indeed do what you ask.

By that definition, of course, the first true computers came about around World War II. At first, they were mostly used for military and government purposes, later filtering down into education, commerce, and the public. Now, after a lifetime, we have them everywhere, to the point where some people think they have too much influence over our daily lives. That’s evolution, but the invention of the first computers was a revolution.

Theory

We think of computers as electronic, digital, binary. In a more abstract sense, though, a computer is nothing more than a machine. A very, very complex machine, to be sure, but a machine nonetheless. Its purpose is to execute a series of steps, in the manner of a mathematical algorithm, on a set of input data. The result is then output to the user, but the exact means is not important. Today, it’s 3D graphics and cutesy animations. Twenty years ago, it was more likely to be a string of text in a terminal window, while the generation before that might have settled for a printout or paper tape. In all these cases, the end result is the same: the computer operates on your input to give you output. That’s all there is to it.

The key to making computers, well, compute is their programmability. Without a way to give the machine a new set of instructions to follow, you have a single-purpose device. Those are nice, and they can be quite useful (think of, for example, an ASIC cryptocurrency miner: it can’t do anything else, but its one function can more than pay for itself), but they lack the necessary ingredient to take computing to the next level. They can’t expand to fill new roles, new niches.

How a computer gets its programs, how they’re created, and what operations are available are all implementation details, as they say. Old code might be written in Fortran, stored on ancient reel-to-reel tape. The newest JavaScript framework might exist only as bits stored in the nebulous “cloud”. But they, as well as everything in between, have one thing in common: they’re Turing complete. They can all perform a specific set of actions proven to be the universal building blocks of computing. (You can find simulated computers that have only a single available instruction, but that instruction can construct anything you can think of.)

Basically, the minimum requirements for Turing completeness are changing values in memory and branching. Obviously, these imply actually having memory (or other storage) and a means of diverting the flow of execution. Again, implementation details. As long as you can do those, you can do just about anything.
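To make that concrete, here’s a minimal sketch in Python of the single-instruction idea mentioned above: a "subleq" machine, whose only operation is "subtract and branch if the result is less than or equal to zero". Memory writes plus a conditional branch are all it has, and that’s enough for Turing completeness. (The program layout is my own toy convention, purely for illustration.)

```python
def subleq(mem, pc=0):
    """Run a subleq program. Each instruction is three cells (a, b, c):
    mem[b] -= mem[a]; if the result is <= 0, jump to c, else fall through.
    A jump target outside memory halts the machine."""
    while 0 <= pc < len(mem):
        a, b, c = mem[pc:pc + 3]
        mem[b] -= mem[a]
        pc = c if mem[b] <= 0 else pc + 3
    return mem

# Toy program: add mem[9] to mem[10] using the scratch cell mem[11].
# subleq can only subtract, so addition is done as two negations.
prog = [
    9, 11, 3,    # mem[11] -= mem[9]   (scratch = -x)
    11, 10, 6,   # mem[10] -= mem[11]  (accumulator += x)
    11, 11, -1,  # clear the scratch cell and halt (jump to -1)
    7, 35, 0,    # data: x = 7, accumulator = 35, scratch = 0
]
print(subleq(prog)[10])  # prints 42
```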

Practice

It should come as no surprise that Alan Turing was the one who worked all that out. Quite a few others made their mark on computing, as well. George Boole (1815-64) gave us the fundamentals of computer logic (hence we refer to true/false values as boolean). Charles Babbage (1791-1871) designed the precursors to programmable computers, while Ada Lovelace (1815-52) used those designs to create what is considered to be the first program. The Jacquard loom, named after Joseph Marie Jacquard (1752-1834), was a practical display of programming that influenced the first computers. And the list goes on.

Earlier precursors aren’t hard to find. Jacquard’s loom was a refinement of older machines that attempted to automate weaving by feeding a pattern into the loom that would allow it to move the threads in a predetermined way. Pascal and Leibniz worked on calculators. Napier and Oughtred made what might be termed analog computing devices. The oldest object that we can call a computer by even the loosest definition, however, dates back much farther, all the way to classical Greece: the Antikythera mechanism.

So computers aren’t necessarily a product of the modern age. Maybe digital electronics are, because transistors and integrated circuits require serious precision and fine tooling. But you don’t need an ENIAC to change the world, much less a Mac. Something on the level of Babbage’s machines (had he ever finished them, which he was never inclined to do) could trigger an earlier Information Age. Even nothing more than a fast way to multiply, divide, and find square roots—the kind of thing a pocket calculator can do instantly—would advance mathematics, and thus most of the sciences.

But can it be done? Well, maybe. Programmable automatons date back about a thousand years. True computing machines probably need at least Renaissance-era tech, mostly for gearing and the like. To put it simply: if you can make a clock that keeps good time, you’ve got all you need to make a rudimentary computer. On the other hand, something like a “hydraulic” computer (using water instead of electricity or mechanical power) might be doable even earlier, assuming you can find a way to program it.

For something Turing complete, rather than a custom-built analog solver like the Antikythera mechanism, things get a bit harder. Not impossible, mind you, but very difficult. A linear set of steps is fairly easy, but when you start adding in branches and loops (a loop is nothing more than a branch that goes back to an earlier location), you need to add in memory, not to mention all the infrastructure for it, like an instruction pointer.

If you want digital computers, or anything that does any sort of work in parallel, then you’ll probably also need a clock source for synchronization. Thus, you may have another hard “gate” on the timeline, because water clocks and hourglasses probably won’t cut it. Again, gears are the bare minimum.

Output may be able to go on the same medium as input. If it can, great! You can do a lot more that way, since you’d be able to feed the result of one program into another, a bit like what functional programmers call composition. That’s also the way to bring about compilers and other programs whose results are their own set of instructions. Of course, this requires a medium that can be both read and written with relative ease by machines. Punched cards and paper tape are the historical early choices there, with disks, memory, and magnetic tape all coming much later.
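That composition idea is easy to sketch in code, for what it’s worth. Here’s a hypothetical Python illustration (the names are mine): each "program" reads and writes the same medium, plain text here, so the output of one feeds straight into the next.

```python
from functools import reduce

def compose(*programs):
    """Chain programs whose output medium matches their input medium."""
    def run(data):
        return reduce(lambda d, program: program(d), programs, data)
    return run

# Two toy "programs" that read and write the same medium: plain text.
def strip_comments(text):
    return "\n".join(line for line in text.splitlines()
                     if not line.startswith("#"))

def shout(text):
    return text.upper()

pipeline = compose(strip_comments, shout)
print(pipeline("# a comment\nhello"))  # prints HELLO
```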

Thus, creating the tools looks to be the hardest part about bringing computation into the past. And it really is. The leaps of logic that Turing and Boole made were not special, not miraculous. There’s nothing saying an earlier mathematician couldn’t discover the same foundations of computer science. They’d have to have the need, that’s all. Well, the need and the framework. Algebra is a necessity, for instance, and you’d also want number theory, set theory, and a few others.

All in all, computers are a modern invention, but they’re a modern invention with enough precursors that we could plausibly shift their creation back in time a couple of centuries without stretching believability. You won’t get an iPhone in the Enlightenment, but the most basic tasks of computation are just barely possible in 1800. Or, for that matter, 1400. Even if using a computer for fun takes until our day, the more serious efforts it speeds up might be worth the comparatively massive cost in engineering and research.

But only if they had a reason to make the things in the first place. We had World War II. An alt-history could do the same with, say, the Thirty Years’ War or the American Revolution. Necessity, so it’s said, is the mother of invention, so what could make someone need a computer? That’s a question best left to the creator of a setting, which is you.

Future past: steam

Let’s talk about steam. I don’t mean the malware installed on most gamers’ computers, but the real thing: hot, evaporated water. You may see it as just something given off by boiling stew or dying cars, but it’s so much more than that. For steam was the fluid that carried us into the Industrial Revolution.

And whenever we talk of the Industrial Revolution, it’s only natural to think about its timing. Did steam power really have to wait until the 18th century? Is there a way to push back its development by a hundred, or even a thousand, years? We can’t know for sure, but maybe we can make an educated guess or two.

Intro

Obviously, knowledge of steam itself dates back to the first time anybody ever cooked a pot of stew or boiled their day’s catch. Probably earlier than that, if you consider natural hot springs. However you take it, they didn’t have to wait around for a Renaissance and an Enlightenment. Steam itself is embarrassingly easy to make.

Steam is a gas; it’s the gaseous form of water, in the same way that ice is its solid form. Now, ice forms naturally if the temperature gets below 0°C (32°F), so quite a lot of places on Earth can find some way of getting to it. Steam, on the other hand, requires us to take water to its boiling point of 100°C (212°F) at sea level, slightly lower at altitude. Even the hottest parts of the world never get temperatures that high, so steam is, with a few exceptions like that hot spring I mentioned, purely artificial.

Cooking is the main way we come into contact with steam, now and in ages past. Modern times have added others, like radiators, but the general principle holds: steam is what we get when we boil water. Liquid turns to gas, and that’s where the fun begins.

Theory

The ideal gas law tells us how an ideal gas behaves. Now, that’s not entirely appropriate for gases in the real world, but it’s a good enough approximation most of the time. In algebraic form, it’s PV = nRT, and it’s the key to seeing why steam is so useful, so world-changing. Ignore R, because it’s a constant that doesn’t concern us here; the other four variables are where we get our interesting effects. In order: P is the pressure of a gas, V is its volume, n is how much of it there is (in moles), and T is its temperature.

You don’t need to know how to measure moles to see what happens. When we turn water into steam, we do so by raising its temperature. By the ideal gas law, increasing T must be balanced out by a proportional increase on the other side of the equation. We’ve got two choices there, and you’ve no doubt seen them both in action.

First, gases have a natural tendency to expand to fill their containers. That’s why smoke dissipates outdoors, and it’s why that steam rising from the pot gets everywhere. Thus, increasing V is the first choice in reaction to higher temperatures. But what if that’s not possible? What if the gas is trapped inside a solid vessel, one that won’t let it expand? Then it’s the backup option: pressure.

A trapped gas that is heated increases in pressure, and that is the power of steam. Think of a pressure cooker or a kettle, either of them placed on a hot stove. With nowhere to go, the steam builds and builds, until it finds relief one way or another. (In a sealed vessel, this relief can come in the more dramatic form of a rupture, but household appliances rarely get that far.)

As pressure is force per unit of area, and there’s not a lot of area in the spout of a teapot, rising temperatures can produce a lot of force. Enough to scald, enough to push. Enough to…move?
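And to see how pressure becomes push, a companion sketch; the piston size and gauge pressure are again arbitrary example numbers:

```python
import math

def piston_force_newtons(gauge_pressure_pa: float, diameter_m: float) -> float:
    """Force a gas exerts on a circular piston face: F = P * A."""
    area_m2 = math.pi * (diameter_m / 2) ** 2
    return gauge_pressure_pa * area_m2

# Modest steam pressure (about two atmospheres above ambient) acting
# on a hand-sized 10 cm piston:
force = piston_force_newtons(200_000, 0.10)
print(f"{force:.0f} N")  # ~1,571 N, enough to lift roughly 160 kg
```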

Practice

That is the basis for steam power and, by extension, many of the methods of power generation we still use today. A lot of steam funneled through a small area produces a great amount of force. That force is then able to run a pump, a turbine, or whatever is needed, from boats to trains. (And even cars: some of the first automobiles were steam-powered.)

Steam made the Industrial Revolution possible. It made most of what came after possible, as well. And it gave birth to the retro fad of steampunk, because many people find the elaborate contraptions needed to haul superheated water vapor around to be aesthetically pleasing. Yet there is a problem. We’ve found steam-powered automata (e.g., toys, “magic” temple doors) from the Roman era, so what happened? Why did we need over 1,500 years to get from bot to Watt?

Unlike electricity, where there’s no obvious technological roadblock standing between Antiquity and advancement, steam power might legitimately be beyond classical civilizations. Generation of steam is easy—as I’ve said, that was done with the first cooking pot at the latest. And you don’t need an ideal gas law to observe the steam in your teapot shooting a cork out of the spout. From there, it’s not too far a leap to see how else that rather violent power can be utilized.

No, generating small amounts of steam is easy, and it’s clear that the Romans (and probably the Greeks, Chinese, and others) could do it. They could even use it, as the toys and temples show. So why didn’t they take that next giant leap?

The answer here may be a combination of factors. First is fuel. Large steam installations require metaphorical and literal tons of it. The Victorian era thrived on coal, as we know, but coal mining at scale is a comparatively recent development; the Romans knew of coal, and even burned some in Britain, but never in industrial quantities. They could get by with charcoal, but you need a lot of that, and they had much better uses for it. It wouldn’t do to cut down a few acres of forest just to run a chariot down to Ravenna, even for an emperor. Nowadays, we can make steam by many different methods, including renewable options like solar boilers, but those weren’t available back then. Without a massive fuel source, steam—pardon the pun—couldn’t get off the ground.
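To get a feel for the scale of the problem, here’s a back-of-the-envelope sketch. The physical constants are standard; the engine efficiency and charcoal energy figures are loose assumptions on my part:

```python
SPECIFIC_HEAT = 4.186       # kJ per kg of water per °C
LATENT_HEAT = 2257.0        # kJ to vaporize one kg of boiling water
CHARCOAL_ENERGY = 30_000.0  # kJ per kg of charcoal, roughly
EFFICIENCY = 0.05           # assumed: a primitive engine wastes ~95%

def charcoal_kg(steam_kg: float, start_temp_c: float = 20.0) -> float:
    """Charcoal burned to turn water at start_temp_c into steam."""
    heat_needed = steam_kg * (SPECIFIC_HEAT * (100 - start_temp_c) + LATENT_HEAT)
    return heat_needed / (CHARCOAL_ENERGY * EFFICIENCY)

# A day's run at a modest 100 kg of steam per hour, ten hours straight:
print(f"{charcoal_kg(1000):,.0f} kg of charcoal")  # ~1,730 kg
# And several kilograms of wood go into each kilogram of charcoal, so
# an imperial joyride really would cost acres of forest.
```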

Second, and equally important, is the quality of the materials that were available. A boiler, in addition to eating fuel at a frantic pace, has some pretty exacting specifications. It has to be built strong enough to withstand the intense pressures that steam can create (remember our ideal gas law); boiler ruptures were a deadly fixture of the 19th century, and that was with steel. Imagine trying to do it all with brass, bronze, and iron! On top of that, all your valves, tubes, and other machinery must be built to the same high standard. A leak doesn’t just let gas escape; it bleeds away efficiency.
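To put rough numbers on that, here’s a sketch using Barlow’s formula for thin-walled pressure vessels. The material strengths are ballpark assumptions for illustration, not engineering data:

```python
def burst_pressure_kpa(strength_mpa: float, wall_mm: float, diameter_mm: float) -> float:
    """Barlow's formula for a thin-walled cylinder: P = 2*S*t/D (in kPa)."""
    return 2 * strength_mpa * wall_mm / diameter_mm * 1000

# Assumed strengths, very roughly: bronze ~120 MPa, wrought iron
# ~180 MPa, mild steel ~250 MPa.
materials = {"cast bronze": 120, "wrought iron": 180, "mild steel": 250}

# A 500 mm boiler shell with 10 mm walls:
for name, s in materials.items():
    print(f"{name}: bursts near {burst_pressure_kpa(s, 10, 500):,.0f} kPa")
# Steel buys roughly twice the margin of bronze for the same wall, and
# that's before fatigue, casting flaws, and leaky seams have their say.
```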

The ancients couldn’t pull that off. Not for lack of trying, mind you, but they weren’t really equipped for the rigors of steam power. Steel was all but unknown, outside a few special cases. Rubber was an ocean away, on a continent they didn’t know existed. And the joining techniques that keep a boiler airtight (riveted and caulked seams at first, welding much later) weren’t there, either.

Thus, steam power may be too far into the future to plausibly fit into a distant “retro-tech” setting. It really needs improvements in a lot of different areas. That’s not to say that steam itself can’t fit—we know it can—but you’re not getting Roman railroads. On a small scale, using steam is entirely possible, but you can’t build a classical civilization around it. Probably not even a medieval one, at that.

No, it seems that steam as a major power source must wait until the rest of technology catches up. You need a fuel source, whether coal or something else. You absolutely must have ways of creating airtight seals. And you’ll need a way to create strong pressure vessels, which implies some more advanced metallurgy. On the other hand, the science isn’t entirely necessary; if your people don’t know the ideal gas law yet, they’ll probably figure it out pretty soon after the first steam engine starts up. And as for finding uses, well, they’d get to that part without much help, because that’s just what we do.

Future past: Electricity

Electricity is vital to our modern world. Without it, I couldn’t write this post, and you couldn’t read it. That alone should show you just how important it is, but if not, then how about anything from this list: air conditioning, TVs, computers, phones, music players. And that’s just what I can see in the room around me! So electricity seems like a good start for this series. It’s something we can’t live without, but its discovery was relatively recent, as eras go.

Intro

The knowledge of electricity, in some form, goes back thousands of years. The phenomenon itself, of course, began in the first second of the universe, but humans didn’t get around to investigating it until they started looking into just about everything else.

First came static electricity. That’s the kind we’re most familiar with, at least when it comes to directly feeling it. It gives you a shock in the wintertime, it makes your clothes stick together when you pull them out of the dryer, and it’s what causes lightning. At its source, static electricity is nothing more than an imbalance of electrons righting itself. Sometimes, that’s visible, whether as a spark or a bolt, and it certainly doesn’t take modern convenience to produce such a thing.

The root electro-, source of electricity and probably a thousand derivatives, originally comes from Greek. There, it referred to amber, that familiar resin that occasionally has bugs embedded in it. Besides that curious property, amber also has a knack for picking up a static charge, much like wool and rubber. It doesn’t take Ben Franklin to figure that much out.

Static electricity, however, is one-and-done. Once the charge imbalance is fixed, it’s over. That can’t really power a modern machine, much less an era, so the other half of the equation is electric current. That’s the kind that runs the world today, and it’s where we have volts and ohms and all those other terms. It’s what runs through the wires in your house, your computer, your everything.

Theory

The study of current, unlike static electricity, came about comparatively late (or maybe it didn’t; see below). It wasn’t until the 18th century that it really got going, and most of the biggest discoveries had to wait until the 19th. The voltaic pile—which later evolved into the battery—electric generators, and so many more pieces that make up the whole of this electronic age, all of them were invented within the last 250 years. But did they have to be? We’ll see in a moment, but let’s take a look at the real world first.

Although static electricity is indeed interesting, and not just for demonstrations, current is what makes electricity useful, and there are two ways to get it: make it yourself, or extract it from existing materials. The latter is far easier, as you might expect. Most metals are good conductors of electricity, and there are a number of chemical reactions that can produce a bit of voltage. That’s the essence of the battery: two different metals, immersed in an acidic solution, will react in different ways, creating a potential between them. Volta figured this much out, so we measure the potential in volts. (Ohm worked out how voltage and current are related by resistance, so resistance is measured in ohms. And so on, through essentially every scientist of that age.)

Using wires, we can even take this cell and connect it to another in series, increasing the voltage and power available at any one time. Making the cells themselves larger (greater cross-section, more solution) creates a greater reserve of electricity. Put the two together, and you’ve got a way to store as much as you want, then extract it however you need.
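Here’s a minimal sketch of that arithmetic, using an assumed per-cell voltage in the neighborhood of a copper-zinc pair, plus Ohm’s law to show what the stack can drive:

```python
CELL_VOLTS = 1.1  # assumed potential of a single copper-zinc cell

def battery_volts(cells_in_series: int) -> float:
    """Cells wired in series add their potentials."""
    return cells_in_series * CELL_VOLTS

def current_amps(volts: float, resistance_ohms: float) -> float:
    """Ohm's law: I = V / R."""
    return volts / resistance_ohms

v = battery_volts(6)       # a six-cell pile: ~6.6 V
i = current_amps(v, 10.0)  # driving a 10-ohm load: ~0.66 A
print(f"{v:.1f} V, {i:.2f} A")
```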

But batteries eventually run dry. What the modern age needed was a generator. To make that, you need to understand that electricity is but one part of a greater force: electromagnetism. The other half, as you might expect, is magnetism, and that’s the key to generating power. A changing magnetic field induces an electrical potential, which can drive a current. One of the easiest ways to get that is by spinning a magnet inside a coil of wire. (As an experiment, I’ve seen this done with one of those hand-cranked pencil sharpeners, so it can’t be that hard to construct.)

One problem is that the electricity this sort of generator makes isn’t constant. Its potential, assuming a rotating setup, follows a sine-wave pattern from positive to negative. (Because you can have negative volts, remember.) That’s alternating current, or AC, while batteries give you direct current, DC. The difference between the two can be very important, and it was at the heart of one of science’s greatest feuds—Edison and Tesla—but it doesn’t mean too much for our purposes here. Both are electric.
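For the curious, a quick sketch of that alternating potential; the peak voltage and rotation rate are arbitrary example values:

```python
import math

PEAK_VOLTS = 10.0  # arbitrary
FREQ_HZ = 50.0     # arbitrary: fifty full rotations per second

def ac_volts(t_seconds: float) -> float:
    """Instantaneous potential of an ideal AC source: v(t) = V_peak * sin(2*pi*f*t)."""
    return PEAK_VOLTS * math.sin(2 * math.pi * FREQ_HZ * t_seconds)

# Sample a quarter, half, and three quarters of one rotation:
for frac in (0.25, 0.5, 0.75):
    print(f"{frac:.2f} turn: {ac_volts(frac / FREQ_HZ):+.1f} V")
# Prints +10.0, then 0.0 at the crossing, then -10.0: the potential
# swings back and forth, hence "alternating" current.
```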

Practice

What does it take to create electricity? Is there anything special about it that had to wait until 1800 or so?

As a matter of fact, not only was it possible to have something electrical before the Enlightenment, but it may have been done…depending on who you ask. The Baghdad battery is one of those curious artifacts that has multiple plausible explanations. Either it’s a common container for wine, vinegar, or something of that sort, or it’s a 2,000-year-old voltaic cell. The simple fact that this second hypothesis isn’t immediately discarded answers one question: no, nothing about electricity requires advanced technology.

Building a rudimentary battery is so easy that it almost has to have been done before. Two coins (of different metals) stuck into a lemon can give you enough voltage to feel, especially if you touch the contacts to your tongue, like some people do with a 9-volt battery. Potatoes work almost as well, but any fruit or vegetable whose interior is acidic can provide the necessary solution for the electrochemical reactions to take place. From there, it’s not too big a step to a small jar of vinegar. Metals known in ancient times can get you close to a volt from a single cell, and connecting cells in series nets you even larger potentials. It won’t be pretty, but there’s absolutely nothing insurmountable about making a battery using only technology known to the Romans, Greeks, or even Egyptians.

Generators are a bit harder. First off, you need magnets. Lodestones work; they’re naturally magnetized, possibly by lightning strikes, and their curious properties were first noticed as early as 2,500 years ago. But they’re rare and hard to work with, as well as probably being full of impurities. Still, it doesn’t take a genius (or an advanced civilization) to figure out that these can be used to turn other pieces of metal (specifically iron) into magnets of their own.

Really, then, creation of magnets needs iron working, so generators are beyond the Bronze Age by definition. But they aren’t beyond the Iron Age, so Roman-era AC power isn’t impossible. They may not understand how it works, but they have the means to make it. The pieces are there.

The hardest part after that would be wire, because shuffling current around requires it. Copper is a nice balance of cost and conductivity, which is why we use it so much today; gold is far more ductile, while silver offers better conduction, but both are too expensive to use for much even today. The latter two, however, have been seen in wire form since ancient times, which means that ages past knew the methods. (Drawn wire didn’t come about until the Middle Ages, but it’s not the only way to do it.) So, assuming that our distant ancestors could figure out why they needed copper wire, they could probably come up with a way to produce it. It might not have rubber or plastic insulation, but they’d find something.
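A quick sketch makes that comparison concrete; the resistivity values here are the standard room-temperature figures:

```python
import math

RESISTIVITY_OHM_M = {"silver": 1.59e-8, "copper": 1.68e-8, "gold": 2.44e-8}

def wire_resistance_ohms(metal: str, length_m: float, diameter_mm: float) -> float:
    """Resistance of a round wire: R = rho * L / A."""
    radius_m = diameter_mm / 2000  # mm diameter to m radius
    return RESISTIVITY_OHM_M[metal] * length_m / (math.pi * radius_m ** 2)

# Ten meters of 1 mm wire in each metal:
for metal in RESISTIVITY_OHM_M:
    print(f"{metal}: {wire_resistance_ohms(metal, 10.0, 1.0):.3f} ohms")
# Silver beats copper by only a few percent, and gold trails both.
# Copper wins on price, which is why it fills our walls today.
```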

In conclusion, then, even if the Baghdad battery is nothing but a jar with some leftover vinegar inside, that doesn’t mean electricity couldn’t be used by ancient peoples. Technology-wise, nothing at all prevents batteries from being created in the Bronze Age. Power generation might have to wait until the Iron Age, but you can do a lot with just a few cells. And all the pieces were certainly in place in medieval times. The biggest problem after making the things would be finding a use for them, but humans are ingenious creatures. They’d work something out.

Future past: Introduction

With the “Magic and Tech” series on hiatus right now (mostly because I can’t think of anything else to write in it), I had the idea of taking a look at a different type of “retro” technological development. In this case, I want to look at different technologies that we associate with our modern world, and see just how much—or how little—advancement they truly require. In other words, let’s see just what could be made by the ancients, or by medieval cultures, or in the Renaissance.

I’ve been fascinated by this subject for many years, ever since I read the excellent book Lost Discoveries. And it’s very much a worldbuilding pursuit, especially if you’re building a non-Earth human culture or an alternate history. (Or both, in the case of my Otherworld series.) As I’ve looked into this particular topic, I’ve found a few surprises, so this is my chance to share them with you, along with my thoughts on the matter.

The way it works

Like “Magic and Tech”, this series (“Future Past”; you get no points for guessing the reference) will consist of an open-ended set of posts, mostly coming out whenever I decide to write them. Each post will be centered on a specific invention, concept, or discovery, rather than the much broader subjects of “Magic and Tech”. For example, the first will be that favorite of alt-historians: electricity. Others will include the steam engine, various types of power generation, and so on. Maybe you can’t get computers in the Bronze Age—assuming you don’t count the Antikythera mechanism—but you won’t believe what you can get.

Every post in the series will be divided into three main parts. First will come an introduction, where I lay out the boundaries of the topic and throw in a few notes about what’s to come. Next is a “theory” section: a brief description of the technology as we know it. Last and longest is the “practice” part, where we’ll look at just how far we can turn back the clock on the invention in question.

Hopefully, this will be as fun to read as it is to write. And I will get back to “Magic and Tech” at some point, probably early next year, but that will have to wait until I’m more inspired on that front. For now, let’s forget the fantasy magic and turn our eyes to the magic of invention.

On fantasy stasis

In fantasy literature, the medieval era is the most common setting. Sure, you get the “flintlock fantasy” that moves things forward a bit, and then there’s the whole subgenre of urban fantasy, but most of the popular works of the past century center on the High Middle Ages.

It’s not hard to see why. That era has a lot going for it. It’s so far back that it’s well beyond living memory, so there’s nobody who can say, “It’s not really like that!” Records are spotty enough that there’s a lot of room for “hidden” discoveries and alternate histories. You get all the knights and chivalry and nobility as a built-in part of the setting, but you don’t have to worry about gunpowder weapons if you don’t want to, or oceanic exploration, or some of the more complex scientific matters discovered in the Renaissance.

For a fantasy world, of course, medieval times give you mostly the same advantages, but also a few more. It’s less work for you, obviously, as you don’t have to account for the explosion of technology and discovery starting circa 1500. Medieval times were simpler, in a way, and simple makes worldbuilding easy. Magic fits neatly in the gaps of medieval knowledge. The world map can have the blank spaces needed to hide a dragon or a wizard’s lair.

Times are (not) changing

But this presents a problem, because another thing fantasy authors really, really want is a long history, yet without the usual pattern of advancement that comes with such long ages. Just to take examples from some of my personal favorites, let’s see what we’ve got.

  • A Song of Ice and Fire, by George R. R. Martin. You’ll probably know this better as Game of Thrones, the TV show, but the books go into far greater depth concerning the world history. The Others (White Walkers, in the show, for reasons I’ve never clearly understood) last came around some 8,000 years ago. About the only thing that’s changed since is the introduction of iron weaponry.

  • Lord of the Rings, by J. R. R. Tolkien. Everybody knows this one, but how many know Middle-earth’s “internal” history? The Third Age lasts over 3,000 years with no notable technological progress, and that’s on top of the 3,500 years of the Second Age and a First Age (from The Silmarillion) that tacks on another 600 or so. Indeed, most technology in Middle-earth comes from the great enemies: Sauron and Morgoth and Saruman. That’s certainly no coincidence.

  • Mistborn, by Brandon Sanderson. Here’s a case where technology actually regressed over the course of 1,000 years. The tyrannical Lord Ruler suppressed the knowledge of gunpowder (he preferred his ranged fighters to have skill) and turned society from seemingly generic fantasy feudalism into a brutal serfdom. (The newer trilogy, interestingly, upends this trope entirely; the world has gone from essentially zero—because of events at the end of Book 3—to Victorian-era technology in something like 500 years.)

  • Malazan Book of the Fallen, by Steven Erikson. This series already has more timeline errors than I can count, so many that fans have turned the whole thing into a meme, and even the author himself lampooned it in the story. But Erikson takes “fantasy stasis” to a whole new level. The “old” races are over 100,000 years old, there was an ice age somewhere in there, and the best anyone’s done is oceangoing ships and magical explosives, both within the last century or so.

Back in time

It’s a conundrum. Let’s look at our own Western history to see why. A thousand years ago was the Middle Ages, the time when your average fantasy takes place. It’s the time of William the Conqueror, of the Holy Roman Empire and the Crusades and, later, the Black Death. Cathedrals were being built, the first universities founded, and so on. But it was nothing like today. It was truly a whole different world.

Add another thousand years, and you’re in Roman times. You’ve got Caesar, Pliny the Elder, Vesuvius, Jesus. Here, you’re in a world of antiquity, but you have to remember that it’s not really any further back from medieval times than they are from us. If we in 2017 are at the destruction of the One Ring, then the founding of the Shire came not long after all of this, right around the fall of the Roman Empire.

Another millennium takes you to ancient Greece, to the Bronze Age. That’s “Bronze Age” as in “ironworking hasn’t been invented yet”, by the way. Well, it had been, but it was only used in limited circumstances. Three thousand years ago is about the time of the later Old Testament or Homer. Compared to us, it’s totally unrecognizable, but it’s about the same span of time as between the first time the One Ring was worn by someone other than Sauron and the moment Frodo and Sam walked up to Mount Doom.

Let’s try 8,000, like in Westeros. Where does that put us in Earth history? Well, it would be 6000 BC, so before Egypt, Sumeria, Babylon, the Minoans…even the Chinese. The biggest city in the world might have a few thousand people in it—Jericho and Çatalhöyük are about that old. Domestication of animals and plants is still in its infancy at this point in time; you’re closer to the first crops than to the first computers. Bran the Builder would have to have magic to make the Wall. The technology sure wasn’t there yet.

Breaking the ice

And that’s really the problem with so many of these great epic fantasy sagas. Yes, we get to see the grand sweep of history in the background, but it’s only grand because it’s been stretched. In the real world, centuries of stasis simply don’t exist in the eras of these stories. Even the Dark Ages saw substantial progress in some areas, and that’s not counting the massive advancement happening in, say, the Islamic world.

To have this stasis and make it work (assuming it’s not just ancient tales recast in modern terms) requires something supernatural, something beyond what we know. That can be magic or otherworldly beings or even a “caretaker” ruler, but it has to be something. Left to their own devices, people will invent their way out of the Fantasy Dark Age.

Maybe magic replaces technology. That’s an interesting thought, and one that fits in with some of my other writings here. It’s certainly plausible that a high level of magical talent could retard technological development. Magic is often described as far easier than invention, and far more immediately practical.

Supernatural beings can also put a damper on tech levels, but they may also have the opposite effect. If the mighty dragon kills everything that comes within 100 yards, then a gun that can shoot straight at twice that would be invaluable. Frodo’s quest would have been a piece of cake if he’d had even a World War I airplane, and you don’t even have to bring the Eagles into that one! Again, people are smart. They’ll figure these things out, given enough time. Thousands of years is definitely enough time.

Call this a rant if you like. Maybe that’s what it really is. Now, I’m not saying I hate stories that assume hundreds or thousands of years of stagnation. I don’t; some of my favorite books hinge on that very assumption. But worldbuilding can do better. That’s what I’m after. If that means I’ll never write a true work of epic fantasy, then so be it. There’s plenty of wonder out there.