The merchants of despair

I am a humanist.

I’ve said that before, but it bears repeating. Now, most people who call themselves humanists do so out of a kind of rebellious nature. They’re agnostics or atheists who disapprove of such labels for whatever reason. Worse, too many tend to be the “militant” sort of atheist who hold their lack of belief with the same dogmatic zeal as the most fundamentalist Christian or Muslim.

I’m not like that at all. Instead, I see humanism as a celebration of humanity and its accomplishments, as well as a belief in its capability for good. We can achieve great things. We have. History is full of human milestones. We’re the only species on Earth (and, as far as we know, in the universe) to domesticate plants and animals, use spoken and written language, harness the power of fire, work metals, build cities, travel to the moon, cure diseases, split the atom, and a thousand other things. Above all, however, we introspect. We philosophize. We are aware of ourselves in a way no other creature has the capability of being.

That’s beautiful, in my opinion. The creations of man, whether mental, physical, or indeed spiritual, are beautiful. While we have made some awful mistakes and inventions, progress is, on the whole, a good thing for everyone involved. The rapid explosion of progress since the two most pivotal eras in history, the Enlightenment and the Industrial Revolution, has given us much to be thankful for. We live longer, healthier lives than our ancestors. We have more material wealth. We understand the world far better than they could have hoped to.

Some people don’t like that, and I honestly can’t understand why. Why are they so dead set on keeping us poor, sick, ignorant, and isolated? A thirst for power explains a lot of irrational behavior, yes, but naked displays of dominance aren’t usually so…insidious. In 2020 alone, we have seen countless examples of human beings arguing for their own extinction, a position not only evolutionarily suspect but morally bankrupt. Yet this position finds backing in the media, on campus, and even in scientific papers. Why? Is there some kind of secret death cult out there?

Until a couple of weeks ago, I would have dismissed that notion as a conspiracy theory on the same level as the Illuminati and Pizzagate. But then I read a book that made everything click.

Humanity’s enemy

Robert Zubrin is best known for his advocacy, often to the point of mania, of manned Mars missions. For over 30 years, he has led the charge in fighting for a permanent human presence on the Red Planet as soon as possible. Growing up, I heard his name on numerous space documentaries, and I still see interviews he has given on the subject. (The series Mars is one example.)

He has other writings, though. In 2011, he published Merchants of Despair, in which he describes an “antihuman” movement that, according to his theory, has been operating for nearly two centuries with the express goal of controlling population by subverting progress.

Numerous examples show the antihumanists in action. Most are concerned with eugenics, the hateful policy of forced sterilization, abortion, and contraception for a specific set of undesirables: blacks, Jews, Indians, Uighurs, the mentally disabled, etc. The targets change depending on who’s doing the extermination, but the principle remains the same. If we don’t stop “those people” from reproducing, eugenicists claim, they’ll overrun us good and pure folk and drag us down to their level. Obviously, any sensible, rational person would reject such notions, but most people are neither rational nor sensible. Thus, population control movements have grown over the past 200 years.

It began with Malthus, who argued, incorrectly, that the Earth was running out of land for food, and that severe measures to curb population growth had to be implemented right now in order to save our species from extinction. His theory was so wildly inaccurate that it couldn’t even predict past resource use, but he had friends and believers in high places. Malthusian principles created the Irish Potato Famine in the 1840s, then racked up an even greater death toll in 1870s India. In both cases, the country in question was a net exporter of food at the time, yet the British government forced residents to starve in order to prevent some mythical calamity.

Fast forward to the 1930s, and we know what happened. The Nazis were the gold standard for eugenics, raising genocidal population control to an art form. Following the same principles as Malthus, Hitler argued that Germany would eventually be too crowded to feed itself. But now there was an added wrinkle, because science could “prove” that some races were more degenerate than others. And wouldn’t you know it, but Hitler’s enemies just happened to number among them!

Before the true horrors of the Holocaust were revealed—or even started, for that matter—many Americans were wholeheartedly in favor. Herbert Hoover attended the Second International Congress of Eugenics in 1921, seven years before he would be elected President of the United States and plunge our country into the Great Depression. J. P. Morgan was there, too. Representing the British (45 years after the India debacle, mind you) was Charles Darwin’s own son.

That was before World War II. With the end of the war, the opening of the death camps, and the subsequent Nuremberg trials, the whole world got to see what eugenics really looked like. So you’d think that would be the end of it, right?


Now, instead of open calls for extermination, those advocating population control became more subtle in their efforts. The best way to stop overpopulation, they decided, wasn’t to kill people who were already here, but to stop them from being born in the first place. Thanks to some politicking from such notables as Robert McNamara, forced sterilization became a condition of US foreign aid to Third World countries. Doing it at home (mostly for criminals and mental patients) was legal until the 1970s. The entire Vietnam War can be seen as a eugenics experiment, as those in power took the slogan “Better Dead Than Red” literally.

Abortion as a political and population-control tool also saw its birth in this era. Planned Parenthood formed out of the eugenics movement, and its original rhetoric of choice carefully neglected the possibility of choosing to have children. Around the same time, one Communist Party official in China read up on these efforts and got the great idea of limiting all families in his country to one child each. Never mind the disastrous consequences for the fabric of society. Isn’t running out of food worse?

Yet the biggest crime to lay at the feet of the antihumanists is, in my opinion, environmentalism. In the past decade, and especially in the past four years, we’ve seen more radical forms of the Green movement grow like a cancer in our society, but they were there from the start. The Sierra Club has deep ties to eugenics, for instance.


Here’s where it gets interesting. And evil, in my opinion.

We’ve all seen it this year. “Nature is healing,” they say, as they show weeds growing through cracks in concrete or wild animals overrunning a city street. “We are the virus,” they claim, often adding that the Wuhan coronavirus (most likely created in a Chinese lab, so not natural at all) is some kind of divine wrath for our excesses. How a virus with a fatality rate of around 0.1% is supposed to be apocalyptic is beyond me, but you can’t expect logical consistency from some people.

Such extreme environmentalism has been around for over half a century, and Zubrin argues that it shows a more modern form of antihumanism. Instead of calling for deaths or preventing births, green eugenicists want to use economic and government pressure to make having children financially unbearable. To do this, they have blocked the progress of technologies, inventions, and medicines that save lives. We must not help people, they argue, because then those people will breed. Better if they die sick and miserable than be fruitful and multiply.

DDT was the first casualty, according to Zubrin. The endless campaigning against nuclear power is another front in this fight. Though he was writing with incomplete information, he even targets global warming, and here is where the last piece fell into place for me.

We know that the fears of global warming are overrated. Even top climate activists such as Michael Shellenberger (Apocalypse Never) admit this. Current climate trends are well within the limits of human civilization. Sea levels aren’t rising rapidly; the Maldives archipelago, to take one example, was supposed to be completely underwater by 2018, but they’ve now announced that they’re building new airports in anticipation of heavier tourism. Add in the work done by sleuths such as Tony Heller, who illustrate how temperature records are being manipulated to claim accelerated warming, and you get the feeling that somebody somewhere isn’t telling the whole truth.

Earth isn’t going to become a second Venus because we drive too much. In fact, as Zubrin illustrated nine years ago, the slight overall warming predicted through the 21st century is actually beneficial. It increases arable land, and actual climate shifts may open up even more. We’re seeing that today, with record crop yields all over North America.

Those who fail to learn from history will find that it repeats itself. 2020 America is in real danger of turning into a mirror of 1845 Ireland. We have plenty of food. We have plenty of jobs. We have plenty of toilet paper. Yet government control and overblown fears are preventing us from using these resources properly. They’re just saying it’s because of a virus instead of overpopulation by “inferior” races. That’s all.

But the result is the same. Lives are being lost. Not to starvation, as then, but to other preventable factors. Suicidal depression, of course, is one I’m intimately familiar with. Yet we also need to look at the flip side of population control. How many children weren’t born because of lockdown restrictions? How many couples didn’t get a chance to meet because they were under effective house arrest? How many relationships ended (or are on the verge of ending, or never really got going in the first place) due to the loss of a job or the failure to find one?

Whatever that number is, it’s not zero. I know that for a fact.

Humanity’s hope

That’s why I’m a humanist. I see these problems in the world, and I realize how many of them are of our own making. Worse yet, they’re easily fixed. We have the means to give food to everyone on Earth. We have ways of making power literally too cheap to meter. There is more than enough wealth to go around.

We shouldn’t have to force women into tubal ligation surgery out of some fear that they’ll have too many kids. We shouldn’t distribute condoms as business cards or demand IUD implants as conditions for government aid.

We shouldn’t claim that a one-degree change in temperature is going to wipe out all life on Earth. We shouldn’t argue that the cleanest, safest form of energy production we have is actually nothing more than a way to make bombs. We shouldn’t pack millions of people into unsanitary cities, then deny them treatment for the diseases that inevitably occur.

We can be better, but only if we embrace progress. Not progressivism, but progress itself, the liberal ideals of the Enlightenment which state that, as man is the only animal with the capability for reason, it falls to us to use that reason to shape the world, and society, in a positive way.

To do otherwise is to advocate for death on an unimaginable scale. Earth’s population is roughly 7.7 billion at present. With our current technology, we can easily feed, house, and care for at least twice that. But the goals of the environmentalists, the globalists, and others who, I now see, have been aligned with the idea of eugenics all this time, are to reduce our numbers to pre-Industrial levels. The problem with that is simple to recognize: technology allows our carrying capacity to increase. By banning those advances which produce more food or lead to longer, healthier lives, that capacity drops precipitously.

They would kill not the six million of Nazi fame, but over six billion. Some claim the goal is inscribed on the monument known as the Georgia Guidestones: a population not to exceed 500 million. Think about that. To reach that figure, we would first have to let over 90% of the world die. Then, those who survive would be forcibly limited to replacement-level reproduction. How many children would never be born in such a world? How many artists, statesmen, inventors, scientists, friends, and lovers would never take their first breath?

These are our enemies. They must be, for those who value life must always stand against those who preach only death.

Now I understand the cult-like behavior I see so often in the world. It really is a cult. It’s a cult of despair, destruction, and death. Looked at in that light, the lockdowns, the Great Reset, Chinese propaganda, Antifa, global warming fearmongers, and so many other things make sense. They all share one thing in common: they’re antihuman.

Future past: computers

Today, computers are ubiquitous. They’re so common that many people simply can’t function without them, and they’ve been around long enough that most can’t remember a time when they didn’t have them. (I straddle the boundary on this one. I can remember my early childhood, when I didn’t know about computers—except for game consoles, which don’t really count—but those days are very hazy.)

If the steam engine was the invention that began the Industrial Revolution, then the programmable, multi-purpose device I’m using to write this post started the Information Revolution. Because that’s really what it is. That’s the era we’re living in.

But did it have to turn out that way? Is there a way to have computers (of any sort) before the 1940s? Did we have to wait for Turing and the like? Or is there a way for an author to build a plausible timeline that gives us the defining invention of our day in a day long past? Let’s see what we can see.


Defining exactly what we mean by “computer” is a difficult task fraught with peril, so I’ll keep it simple. For the purposes of this post, a computer is an automated, programmable machine that can calculate, tabulate, or otherwise process arbitrary data. It doesn’t have to have a keyboard, a CPU, or an operating system. You just have to be able to tell it what to do and know that it will indeed do what you ask.

By that definition, of course, the first true computers came about around World War II. At first, they were mostly used for military and government purposes, later filtering down into education, commerce, and the public. Now, after a lifetime, we have them everywhere, to the point where some people think they have too much influence over our daily lives. That’s evolution, but the invention of the first computers was a revolution.


We think of computers as electronic, digital, binary. In a more abstract sense, though, a computer is nothing more than a machine. A very, very complex machine, to be sure, but a machine nonetheless. Its purpose is to execute a series of steps, in the manner of a mathematical algorithm, on a set of input data. The result is then output to the user, but the exact means is not important. Today, it’s 3D graphics and cutesy animations. Twenty years ago, it was more likely to be a string of text in a terminal window, while the generation before that might have settled for a printout or paper tape. In all these cases, the end result is the same: the computer operates on your input to give you output. That’s all there is to it.

The key to making computers, well, compute is their programmability. Without a way to give the machine a new set of instructions to follow, you have a single-purpose device. Those are nice, and they can be quite useful (think of, for example, an ASIC cryptocurrency miner: it can’t do anything else, but its one function can more than pay for itself), but they lack the necessary ingredient to take computing to the next level. They can’t expand to fill new roles, new niches.

How a computer gets its programs, how they’re created, and what operations are available are all implementation details, as they say. Old code might be written in Fortran, stored on ancient reel-to-reel tape. The newest JavaScript framework might exist only as bits stored in the nebulous “cloud”. But they, as well as everything in between, have one thing in common: they’re Turing complete. They can all perform a specific set of actions proven to be the universal building blocks of computing. (You can find simulated computers that have only a single available instruction, but that instruction can construct anything you can think of.)

Basically, the minimum requirements for Turing completeness are changing values in memory and conditional branching. Obviously, these imply actually having memory (or other storage) and a means of diverting the flow of execution. Again, implementation details. As long as you can do those, you can do just about anything.
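
Those two requirements are easier to appreciate with a toy in hand. Here’s a minimal sketch of a one-instruction (“subleq”) machine of the sort mentioned above: subtract, then branch if the result is less than or equal to zero. The program layout and cell numbering below are my own illustration, not any historical machine’s.

```python
# A tiny "subleq" interpreter: memory writes plus a conditional branch
# are all it takes to be Turing complete.

def run_subleq(mem, max_steps=10_000):
    """Execute subleq code in-place. Each instruction is three cells:
    a, b, c. Semantics: mem[b] -= mem[a]; if the result <= 0, jump to
    c (a negative c halts); otherwise fall through to the next triple."""
    pc = 0
    for _ in range(max_steps):
        a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
        mem[b] -= mem[a]
        if mem[b] <= 0:
            if c < 0:
                break          # halt
            pc = c             # branch taken
        else:
            pc += 3            # fall through
    return mem

# Add A (cell 9) to B (cell 10) using a scratch cell Z (cell 11):
# Z -= A gives Z = -A; B -= Z gives B += A; Z -= Z clears and halts.
prog = [9, 11, 3,     # Z -= A
        11, 10, 6,    # B -= Z  (i.e., B += A)
        11, 11, -1,   # Z -= Z, then halt
        7, 35, 0]     # data: A=7, B=35, Z=0
print(run_subleq(prog)[10])  # 42
```

Everything else a “real” instruction set offers is convenience; this one operation, given enough memory and patience, can build it all.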


It should come as no surprise that Alan Turing was the one who worked all that out. Quite a few others made their mark on computing, as well. George Boole (1815-64) gave us the fundamentals of computer logic (hence we refer to true/false values as boolean). Charles Babbage (1791-1871) designed the precursors to programmable computers, while Ada Lovelace (1815-52) used those designs to create what is considered to be the first program. The Jacquard loom, named after Joseph Marie Jacquard (1752-1834), was a practical display of programming that influenced the first computers. And the list goes on.

Earlier precursors aren’t hard to find. Jacquard’s loom was a refinement of older machines that attempted to automate weaving by feeding a pattern into the loom that would allow it to move the threads in a predetermined way. Pascal and Leibniz worked on calculators. Napier and Oughtred made what might be termed analog computing devices. The oldest object that we can call a computer by even the loosest definition, however, dates back much farther, all the way to classical Greece: the Antikythera mechanism.

So computers aren’t necessarily a product of the modern age. Maybe digital electronics are, because transistors and integrated circuits require serious precision and fine tooling. But you don’t need an ENIAC to change the world, much less a Mac. Something on the level of Babbage’s machines (had he ever finished them, which he rarely cared to do) could trigger an earlier Information Age. Even nothing more than a fast way to multiply, divide, and find square roots—the kind of thing a pocket calculator can do instantly—would advance mathematics, and thus most of the sciences.

But can it be done? Well, maybe. Programmable automatons date back about a thousand years. True computing machines probably need at least Renaissance-era tech, mostly for gearing and the like. To put it simply: if you can make a clock that keeps good time, you’ve got all you need to make a rudimentary computer. On the other hand, something like a “hydraulic” computer (using water instead of electricity or mechanical power) might be doable even earlier, assuming you can find a way to program it.

For something Turing complete, rather than a custom-built analog solver like the Antikythera mechanism, things get a bit harder. Not impossible, mind you, but very difficult. A linear set of steps is fairly easy, but when you start adding in branches and loops (a loop is nothing more than a branch that goes back to an earlier location), you need to add in memory, not to mention all the infrastructure for it, like an instruction pointer.

If you want digital computers, or anything that does any sort of work in parallel, then you’ll probably also need a clock source for synchronization. Thus, you may have another hard “gate” on the timeline, because water clocks and hourglasses probably won’t cut it. Again, gears are the bare minimum.

Output may be able to go on the same medium as input. If it can, great! You can do a lot more that way, since you’d be able to feed the result of one program into another, a bit like what functional programmers call composition. That’s also the way to bring about compilers and other programs whose results are their own set of instructions. Of course, this requires a medium that can be both read and written with relative ease by machines. Punched cards and paper tape are the historical early choices there, with disks, memory, and magnetic tape all coming much later.

Thus, creating the tools looks to be the hardest part about bringing computation into the past. And it really is. The leaps of logic that Turing and Boole made were not special, not miraculous. There’s nothing saying an earlier mathematician couldn’t discover the same foundations of computer science. They’d have to have the need, that’s all. Well, the need and the framework. Algebra is a necessity, for instance, and you’d also want number theory, set theory, and a few others.

All in all, computers are a modern invention, but they’re a modern invention with enough precursors that we could plausibly shift their creation back in time a couple of centuries without stretching believability. You won’t get an iPhone in the Enlightenment, but the most basic tasks of computation are just barely possible in 1800. Or, for that matter, 1400. Even if using a computer for fun takes until our day, the more serious efforts it speeds up might be worth the comparatively massive cost in engineering and research.

But only if they had a reason to make the things in the first place. We had World War II. An alt-history could do the same with, say, the Thirty Years’ War or the American Revolution. Necessity is the mother of invention, so it’s said, so what could make someone need a computer? That’s a question best left to the creator of a setting, which is you.

Future past: steam

Let’s talk about steam. I don’t mean the malware installed on most gamers’ computers, but the real thing: hot, evaporated water. You may see it as just something given off by boiling stew or dying cars, but it’s so much more than that. For steam was the fluid that carried us into the Industrial Revolution.

And whenever we talk of the Industrial Revolution, it’s only natural to think about its timing. Did steam power really have to wait until the 18th century? Is there a way to push back its development by a hundred, or even a thousand, years? We can’t know for sure, but maybe we can make an educated guess or two.


Obviously, knowledge of steam itself dates back to the first time anybody ever cooked a pot of stew or boiled their day’s catch. Probably earlier than that, if you consider natural hot springs. However you take it, our ancestors didn’t have to wait around for a Renaissance and an Enlightenment. Steam itself is embarrassingly easy to make.

Steam is a gas; it’s the gaseous form of water, in the same way that ice is its solid form. Now, ice forms naturally if the temperature gets below 0°C (32°F), so quite a lot of places on Earth can find some way of getting to it. Steam, on the other hand, requires us to take water to its boiling point of 100°C (212°F) at sea level, slightly lower at altitude. Even the hottest parts of the world never see temperatures that high, so steam is, with a few exceptions like that hot spring I mentioned, purely artificial.

Cooking is the main way we come into contact with steam, now and in ages past. Modern times have added others, like radiators, but the general principle holds: steam is what we get when we boil water. Liquid turns to gas, and that’s where the fun begins.


The ideal gas law tells us how an ideal gas behaves. Now, that’s not entirely appropriate for gases in the real world, but it’s a good enough approximation most of the time. In algebraic form, it’s PV = nRT, and it’s the key to seeing why steam is so useful, so world-changing. Ignore R, because it’s a constant that doesn’t concern us here; the other four variables are where we get our interesting effects. In order: P is the pressure of a gas, V is its volume, n is how much of it there is (in moles), and T is its temperature.

You don’t need to know how to measure moles to see what happens. When we turn water into steam, we do so by raising its temperature. By the ideal gas law, increasing T must be balanced out by a proportional increase on the other side of the equation. We’ve got two choices there, and you’ve no doubt seen them both in action.

First, gases have a natural tendency to expand to fill their containers. That’s why smoke dissipates outdoors, and it’s why that steam rising from the pot gets everywhere. Thus, increasing V is the first choice in reaction to higher temperatures. But what if that’s not possible? What if the gas is trapped inside a solid vessel, one that won’t let it expand? Then it’s the backup option: pressure.

A trapped gas that is heated increases in pressure, and that is the power of steam. Think of a pressure cooker or a kettle, either of them placed on a hot stove. With nowhere to go, the steam builds and builds, until it finds relief one way or another. (With some gases, this can come in the more dramatic form of a rupture, but household appliances rarely get that far.)

As pressure is force per unit of area, and there’s not a lot of area in the spout of a teapot, the rising temperatures can cause a lot of force. Enough to scald, enough to push. Enough to…move?
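
The ideal gas law makes that intuition easy to quantify. Here’s a rough sketch: a fixed amount of steam sealed in a rigid one-liter vessel, heated past boiling. The amount of gas and the vessel size are illustrative numbers I’ve chosen, not a boiler design.

```python
# P = nRT / V for a sealed, rigid vessel: V and n are fixed, so
# raising T has only one outlet, pressure.

R = 8.314    # molar gas constant, J/(mol*K)
n = 2.0      # moles of steam (about 36 g of water), illustrative
V = 0.001    # volume in cubic meters (1 liter), fixed by the vessel

def pressure(temp_c):
    """Pressure in pascals at a given Celsius temperature,
    converting to kelvins for the gas law."""
    return n * R * (temp_c + 273.15) / V

for t in (100, 150, 200):
    print(f"{t} degC -> {pressure(t) / 101_325:.1f} atm")
```

Even this toy model shows why boilers are dangerous: pressure climbs in lockstep with absolute temperature, and a rigid vessel has no way to relieve it except through a valve, a spout, or a rupture.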


That is the basis for steam power and, by extension, many of the methods of power generation we still use today. A lot of steam funneled through a small area produces a great amount of force. That force is then able to run a pump, a turbine, or whatever is needed, from boats to trains. (And even cars: some of the first automobiles were steam-powered.)

Steam made the Industrial Revolution possible. It made most of what came after possible, as well. And it gave birth to the retro fad of steampunk, because many people find the elaborate contraptions needed to haul superheated water vapor around to be aesthetically pleasing. Yet there is a problem. We’ve found steam-powered automata (e.g., toys, “magic” temple doors) from the Roman era, so what happened? Why did we need over 1,500 years to get from bot to Watt?

Unlike electricity, where there’s no obvious technological roadblock standing between Antiquity and advancement, steam power might legitimately be beyond classical civilizations. Generation of steam is easy—as I’ve said, that was done with the first cooking pot at the latest. And you don’t need an ideal gas law to observe the steam in your teapot shooting a cork out of the spout. From there, it’s not too far a leap to see how else that rather violent power can be utilized.

No, generating small amounts of steam is easy, and it’s clear that the Romans (and probably the Greeks, Chinese, and others) could do it. They could even use it, as the toys and temples show. So why didn’t they take that next giant leap?

The answer here may be a combination of factors. First is fuel. Large steam installations require metaphorical and literal tons of fuel. The Victorian era thrived on coal, as we know, but coal mining at scale is a comparatively recent development, and the Romans never seriously exploited it. They could get by with charcoal, but you need a lot of that, and they had much better uses for it. It wouldn’t do to cut down a few acres of forest just to run a chariot down to Ravenna, even for an emperor. Nowadays, we can make steam by many different methods, including renewable variations like solar boilers, but that wasn’t an option back then. Without a massive fuel source, steam—pardon the pun—couldn’t get off the ground.

Second, and equally important, is the quality of the materials that were available. A boiler, in addition to eating fuel at a frantic pace, also has some pretty exacting specifications. It has to be built strong enough to withstand the intense pressures that steam can create (remember our ideal gas law); ruptures were a deadly fixture of the 19th century, and that was with steel. Imagine trying to do it all with brass, bronze, and iron! On top of that, all your valves, tubes, and other machinery must be built to the same high standard. A leak doesn’t just waste gas; it bleeds away efficiency.

The ancients couldn’t pull that off. Not for lack of trying, mind you, but they weren’t really equipped for the rigors of steam power. Steel was unknown, except in a few special cases. Rubber was an ocean away, on a continent they didn’t know existed. Welding (a requirement for sealing two metal pipes together so air can’t escape) probably wasn’t happening.

Thus, steam power may be too far into the future to plausibly fit into a distant “retro-tech” setting. It really needs improvements in a lot of different areas. That’s not to say that steam itself can’t fit—we know it can—but you’re not getting Roman railroads. On a small scale, using steam is entirely possible, but you can’t build a classical civilization around it. Probably not even a medieval one, at that.

No, it seems that steam as a major power source must wait until the rest of technology catches up. You need a fuel source, whether coal or something else. You absolutely must have ways of creating airtight seals. And you’ll need a way to create strong pressure vessels, which implies some more advanced metallurgy. On the other hand, the science isn’t entirely necessary; if your people don’t know the ideal gas law yet, they’ll probably figure it out pretty soon after the first steam engine starts up. And as for finding uses, well, they’d get to that part without much help, because that’s just what we do.

Future past: electricity

Electricity is vital to our modern world. Without it, I couldn’t write this post, and you couldn’t read it. That alone should show you just how important it is, but if not, then how about anything from this list: air conditioning, TVs, computers, phones, music players. And that’s just what I can see in the room around me! So electricity seems like a good start for this series. It’s something we can’t live without, but its discovery was relatively recent, as eras go.


The knowledge of electricity, in some form, goes back thousands of years. The phenomenon itself, of course, began in the first second of the universe, but humans didn’t really get to looking into it until they started looking into just about everything else.

First came static electricity. That’s the kind we’re most familiar with, at least when it comes to directly feeling it. It gives you a shock in the wintertime, it makes your clothes stick together when you pull them out of the dryer, and it’s what causes lightning. At its source, static electricity is nothing more than an imbalance of electrons righting itself. Sometimes, that’s visible, whether as a spark or a bolt, and it certainly doesn’t take modern convenience to produce such a thing.

The root electro-, source of electricity and probably a thousand derivatives, originally comes from Greek. There, it referred to amber, that familiar resin that occasionally has bugs embedded in it. Besides that curious property, amber also has a knack for picking up a static charge, much like wool and rubber. It doesn’t take Ben Franklin to figure that much out.

Static electricity, however, is one-and-done. Once the charge imbalance is fixed, it’s over. That can’t really power a modern machine, much less an era, so the other half of the equation is electric current. That’s the kind that runs the world today, and it’s where we have volts and ohms and all those other terms. It’s what runs through the wires in your house, your computer, your everything.


The study of current, unlike static electricity, came about comparatively late (or maybe it didn’t; see below). It wasn’t until the 18th century that it really got going, and most of the biggest discoveries had to wait until the 19th. The voltaic pile—which later evolved into the battery—electric generators, and so many more pieces that make up the whole of this electronic age, all of them were invented within the last 250 years. But did they have to be? We’ll see in a moment, but let’s take a look at the real world first.

Although static electricity is indeed interesting, and not just for demonstrations, it’s current that makes electricity useful, and there are two ways to get it: make it yourself, or extract it from existing materials. The latter is far easier, as you might expect. Most metals are good conductors of electricity, and a number of chemical reactions can produce a voltage. That’s the essence of the battery: two different metals, immersed in an acidic solution, will react in different ways, creating a potential difference. Volta figured this much out, so we measure potential in volts. (Ohm worked out how voltage and current are related by resistance, so resistance is measured in ohms. And so on, through essentially every scientist of that age.)
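Ohm’s relationship is simple enough to sketch in a couple of lines of Python. The numbers here are illustrative: a copper-zinc cell gives roughly 1.1 volts, and the 10-ohm load is just a made-up example.

```python
# Ohm's law: current (amps) equals voltage (volts) divided by resistance (ohms).
def current(voltage_v, resistance_ohm):
    return voltage_v / resistance_ohm

# One copper-zinc cell (~1.1 V) across a 10-ohm load:
print(round(current(1.1, 10), 3))  # 0.11 (amps)
```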

Using wires, we can even take this cell and connect it to another, increasing the amount of voltage and power available at any one time. Making the cells themselves larger (greater cross-section, more solution) creates a greater reserve of electricity. Put the two together, and you’ve got a way to store as much as you want, then extract it however you need.
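That arithmetic is worth making explicit. In an idealized sketch (real cells sag under load), cells wired in series add their voltages, while parallel strings add their capacity:

```python
# Idealized battery-pack math: series cells add voltage,
# parallel strings add capacity (amp-hours).
def pack(cell_volts, cell_amp_hours, in_series, in_parallel):
    return cell_volts * in_series, cell_amp_hours * in_parallel

# Six ~1.1 V cells in series, two such strings in parallel:
volts, amp_hours = pack(1.1, 0.5, in_series=6, in_parallel=2)
print(round(volts, 1), amp_hours)  # 6.6 1.0
```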

But batteries eventually run dry. What the modern age needed was a generator. To make that, you need to understand that electricity is but one part of a greater force: electromagnetism. The other half, as you might expect, is magnetism, and that’s the key to generating power. A changing magnetic field induces an electrical potential, which in turn drives a current. And one of the easiest ways to produce one is by rotating a magnet inside a coil of wire. (As an experiment, I’ve seen this done with one of those hand-cranked pencil sharpeners, so it can’t be that hard to construct.)

One problem is that the electricity this sort of generator makes isn’t constant. Its potential, assuming you’ve got a circular setup, follows a sine-wave pattern from positive to negative. (Because you can have negative volts, remember.) That’s alternating current, or AC, while batteries give you direct current, DC. The difference between the two can be very important, and it was at the heart of one of science’s greatest feuds—Edison and Tesla—but it doesn’t mean too much for our purposes here. Both are electric.
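That sine-wave behavior is easy to sketch. Here’s an idealized AC waveform in Python, using the US mains figures of 60 Hz and about 170 volts peak (which works out to the familiar 120 V RMS):

```python
import math

# Idealized AC: v(t) = peak * sin(2*pi*f*t), swinging positive and negative.
def ac_voltage(t, peak=170.0, freq_hz=60.0):
    return peak * math.sin(2 * math.pi * freq_hz * t)

# Eight samples across one 60 Hz cycle (1/60 of a second);
# the voltage peaks at +170 and dips to -170 along the way.
samples = [round(ac_voltage(n / 480), 1) for n in range(8)]
```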


What does it take to create electricity? Is there anything special about it that had to wait until 1800 or so?

As a matter of fact, not only was it possible to have something electrical before the Enlightenment, but it may have been done…depending on who you ask. The Baghdad battery is one of those curious artifacts that has multiple plausible explanations. Either it’s a common container for wine, vinegar, or something of that sort, or it’s a 2000-year-old voltaic cell. The simple fact that this second hypothesis isn’t immediately discarded answers one question: no, nothing about electricity requires advanced technology.

Building a rudimentary battery is so easy that it almost has to have been done before. Two coins (of different metals) stuck into a lemon can give you enough voltage to feel, especially if you touch the wires to your tongue, like some people do with a 9-volt. Potatoes work almost as well, but any fruit or vegetable whose interior is acidic can provide the necessary solution for the electrochemical reactions to take place. From there, it’s not too big a step to a small jar of vinegar. Metals known in ancient times can get you a volt or two from a single cell, and connecting them in series nets you even larger potentials. It won’t be pretty, but there’s absolutely nothing insurmountable about making a battery using only technology known to the Romans, Greeks, or even Egyptians.

Generators are a bit harder. First off, you need magnets. Lodestones work; they’re naturally magnetized, possibly by lightning, and their curious properties were first noticed as early as 2500 years ago. But they’re rare and hard to work with, as well as probably being full of impurities. Still, it doesn’t take a genius (or an advanced civilization) to figure out that these can be used to turn other pieces of metal (specifically iron) into magnets of their own.

Really, then, making magnets requires ironworking, so generators are beyond the Bronze Age by definition. But they aren’t beyond the Iron Age, so Roman-era AC power isn’t impossible. They may not understand how it works, but they have the means to make it. The pieces are there.

The hardest part after that would be wire, because moving current around requires it. Copper is a nice balance of cost and conductivity, which is why we use it so much today; gold is far more ductile, while silver offers better conduction properties, but both are too expensive to use for much even today. The latter two, however, have been seen in wire form since ancient times, which means that ages past knew the methods. (Drawn wire didn’t come about until the Middle Ages, but it’s not the only way to do it.) So, assuming that our distant ancestors could figure out why they needed copper wire, they could probably come up with a way to produce it. It might not have rubber or plastic insulation, but they’d find something.

In conclusion, then, even if the Baghdad battery is nothing but a jar with some leftover vinegar inside, that doesn’t mean electricity couldn’t be used by ancient peoples. Technology-wise, nothing at all prevents batteries from being created in the Bronze Age. Power generation might have to wait until the Iron Age, but you can do a lot with just a few cells. And all the pieces were certainly in place in medieval times. The biggest problem after making the things would be finding a use for them, but humans are ingenious creatures. They’d work something out.

Future past: Introduction

With the “Magic and Tech” series on hiatus right now (mostly because I can’t think of anything else to write in it), I had the idea of taking a look at a different type of “retro” technological development. In this case, I want to look at different technologies that we associate with our modern world, and see just how much—or how little—advancement they truly require. In other words, let’s see just what could be made by the ancients, or by medieval cultures, or in the Renaissance.

I’ve been fascinated by this subject for many years, ever since I read the excellent book Lost Discoveries. And it’s very much a worldbuilding pursuit, especially if you’re building a non-Earth human culture or an alternate history. (Or both, in the case of my Otherworld series.) As I’ve looked into this particular topic, I’ve found a few surprises, so this is my chance to share them with you, along with my thoughts on the matter.

The way it works

Like “Magic and Tech”, this series (“Future Past”; you get no points for guessing the reference) will consist of an open-ended set of posts, mostly coming out whenever I decide to write them. Each post will be centered on a specific invention, concept, or discovery, rather than the much broader subjects of “Magic and Tech”. For example, the first will be that favorite of alt-historians: electricity. Others will include the steam engine, various types of power generation, and so on. Maybe you can’t get computers in the Bronze Age—assuming you don’t count the Antikythera mechanism—but you won’t believe what you can get.

Every post in the series will be divided into three main parts. First will come an introduction, where I lay out the boundaries of the topic and throw in a few notes about what’s to come. Next is a “theory” section: a brief description of the technology as we know it. Last and longest is the “practice” part, where we’ll look at just how far we can turn back the clock on the invention in question.

Hopefully, this will be as fun to read as it is to write. And I will get back to “Magic and Tech” at some point, probably early next year, but that will have to wait until I’m more inspired on that front. For now, let’s forget the fantasy magic and turn our eyes to the magic of invention.

Magic and tech: privacy

Privacy is a major topic in today’s world. We hear about surveillance, privacy rights, wiretapping, and so much else that it’s hard not to have at least some knowledge of the subject. Whether it’s privacy in the real world, on the Internet, or wherever, it’s really a big deal.

Although we may talk about privacy in strictly modern terms, that doesn’t mean it’s a modern invention. Previous generations had privacy, and they had the attacks on it, the dangers to it, and the need for it. It’s only in recent times that “bad” actors (e.g., foreign—or domestic—government agents) have such a capacity for invading our privacy so effortlessly, so imperceptibly.

Private eyes

The easiest way to keep something private, of course, is to never make it public in the first place. If you’re putting every detail of your life on Facebook, then you really only have yourself to blame when it’s used against you. In general, that applies in any era, with the caveat that what’s considered “public” now might not have been so, say, a century ago. Now, this isn’t to say that not posting something guarantees it’ll never be seen in public (look at, for example, FBI-made spyware or NSA-developed cryptography algorithms), but it’s a good start.

Throughout history, privacy has also been a fight against those who are deliberately trying to invade your personal space. Today, it’s governments and corporations. Years ago, it was governments and neighborhood activist groups (is your neighbor a Communist?). In earlier times, it was governments and rival merchants. All of them would employ spies, informants, private detectives, and the like in their efforts to expose your secrets. And if you were important enough, you were almost obliged to do the same in retaliation.

Those things we need to keep private haven’t really changed, either. We still want to cover up our earlier transgressions, possibly illegal deeds, and all those things we wouldn’t be comfortable having “out there”. Yesterday’s scarlet letter is today’s racist tweet, a reminder of what happens when privacy fails. And the lengths we go to, the things we do to keep such parts of our past out of the public eye, those are becoming more important every day, because our world is getting more connected, but also less forgetful.

Today, we might use a VPN to hide our browsing history. We’ll clear cookies and block tracking scripts. Some people go even farther outside the Internet, avoiding entire city blocks because of surveillance, using burner phones, paying with cash wherever possible, and so on. Those are modern methods of protecting our privacy, but they have their roots in older ways. Hired runners, safe houses, ciphers—it’s all the same, just under a different name.

Magic-eye puzzles

Now, if you add magic, that breaks some of those methods. First off, if you’re in a D&D-style fantasy world, where any hedge wizard has access to the entire Player’s Handbook, you’ve got serious problems. A wizard who can use a scrying spell to see anywhere makes the NSA look like amateur hour. If he can pick up more senses—hearing, specifically—then privacy is essentially dead on arrival. Unless scry-blocking spells and enchantments are available, cheap, and useful, there’s nothing stopping such a setting from becoming the Panopticon.

But let’s take a step back, because the magical realm we’ve been discussing so far isn’t like that. No, it’s a bit more…down to earth. So let’s see what tools it has to protect privacy. While we’re at it, we’ll also take a look at the other side, because that’s always so much easier.

First, there aren’t any invisibility cloaks or disguise spells, unfortunately. However, we do have, thanks to the greater advances in the sciences that magic has created, a lot more options for mundane disguises. Clothing is cheaper, for example, so it’s easier to procure a sizable wardrobe. And travel is not nearly as time-consuming as in pre-modern Earth, meaning that hopping over to the next town to do your dirty work isn’t impossible; you might arouse suspicion, but not if enough people are moving around.

Privacy in our magical setting, then, is going to be mostly a matter of hiding and deflection, just like it used to be here. It’s not so much a technical problem as a way of thinking about a problem. It faces the same obstacles as in the Industrial era, and the people will most likely develop the same kinds of responses as our ancestors then. To take another example, think back to our magical pseudo-telegraph. These can’t easily be wiretapped—the telegraph (and later telephone) is where the term comes from—because there aren’t any wires. But that doesn’t mean our equivalent to the operator can’t be bought or even replaced. So, if sensitive information has to be sent over the magical lines, it’ll need to be encrypted.
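Any pre-computer cipher would do the job. As a sketch, here’s the classic Vigenère cipher, which is centuries old, needs nothing but pen and paper to operate, and works on letters-only messages by shifting each letter according to a repeating keyword:

```python
# Vigenere cipher: shift each letter of the message by the corresponding
# letter of a repeating keyword (A=0, B=1, ...). Letters-only input assumed.
def vigenere(text, key, decrypt=False):
    sign = -1 if decrypt else 1
    out = []
    for i, ch in enumerate(text.upper()):
        shift = ord(key.upper()[i % len(key)]) - ord("A")
        out.append(chr((ord(ch) - ord("A") + sign * shift) % 26 + ord("A")))
    return "".join(out)

secret = vigenere("ATTACKATDAWN", "LEMON")
print(secret)                                   # LXFOPVEFRNHR
print(vigenere(secret, "LEMON", decrypt=True))  # ATTACKATDAWN
```

A keyword cipher like this resists simple frequency analysis, which is why it held up for centuries; it only truly fell once mechanical and then electronic computation arrived.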

On the flip side, once we’ve established that there are ways of recording or transmitting images and sounds, there’s an obvious kind of surveillance that comes about naturally: the hidden camera. Although they’d be magical in nature, the principle would be the same as in any spy movie. Visiting dignitaries would be wise to bring in their own mages to inspect their lodgings. (Although our actions in real life can’t be encrypted, our communications can, and a good cipher wouldn’t get any easier to crack with magic. Not until computers come around, at least.)

Hiding in plain sight

To remain private in our low-magic setting, therefore, we have to be cautious, but not overly so. The availability of recording devices and other such subterfuge won’t be high; the devices are expensive to create, and they take mages away from other tasks. But that doesn’t mean vigilance isn’t needed. Like in today’s world, how far you need to go to ensure your privacy is directly proportional to the damage your secrets would cause if they got out. If you’re carrying around national secrets, then you’d be stupid not to use the best encryption available. You’d be a fool if you didn’t inspect every room you entered for hidden microphones, magical or mundane.

For most of us, though, it’s a matter of being careful. Don’t give out sensitive information, because you never know who might be listening. Unlike today, our magical kingdom doesn’t have government supercomputers listening to everything we say. It doesn’t have corporations scanning every word we write. But that doesn’t mean it’s easy to keep private matters private. There are always people snooping around. Magic won’t make them go away.

Magic and tech: cities

In today’s world, over half the planet’s population lives in urban areas. In other words, cities. That’s a lot, and the number is only increasing as cities grow ever larger, ever more expansive. Even on the smaller end (my local “big” city, Chattanooga, has somewhere around a quarter of a million people, and it’s not exactly considered huge), the city is a marker of human habitation, human civilization, and human culture. It’s a product of its people, its time and place.

In the city

The oldest cities are really old. Seriously. The most ancient ones we’ve found date back about 10,000 years, places like Çatalhöyük. Ever since then, the history of the world has centered on the urban. These oldest cities might have housed a few hundred or thousand people, probably as a way of ensuring mutual protection and the sharing of goods. But some eventually grew into monsters, holding tens or even hundreds of thousands of people, primarily to ensure mutual protection and the sharing of goods.

Looked at a certain way, that’s really all a city is: a centralized place where people live together. The benefits are obvious. It’s harder to conquer a city’s multitudes. There’s always somebody around if you need help. Assuming it’s there, you don’t have to go very far to find what you’re looking for. In a rural area, you don’t have any of that.

Of course, clustering all those people together has its downsides. In pre-modern times, two of those were paramount. First, every person living in a city was one not working in the fields, which meant that somebody else had to do the work of growing the city-dweller’s food and shipping it to the urban market. Great for economics, but now you’re depending on a hinterland that you don’t necessarily have access to.

The second problem is one we still struggle with today, and that is sanitation. I’m not just talking about sewage (which wasn’t nearly as big a problem in some old cities as we typically imagine), but a more general idea of public health. Cities are dirty places, mostly because they have so many people. Infections are easier to spread. Waste has to go somewhere, as does trash. Industry, even the pre-industrial sort, produces pollution of the air and water. And water itself becomes a commodity; even though most older cities were built near rivers or lakes (for obvious reasons), it might not be the cleanest source, especially in an unusually dry season.

Through the ages

The character of cities has changed throughout history. While they’ve retained their original purpose as gathering places for humanity, the other purposes they serve fall into a few different categories, some of which are more important in certain eras.

First of all, a city is an economic center. It holds the markets, the fairs, the trading houses. Sure, a village can have a weekly market pretty easily, but it takes a city to provide the infrastructure necessary for permanent shops and vendors. This includes food sellers, of course, but also craftsmen and artisans in older days, factories and department stores today. You don’t see Wal-Mart sticking a new store out in the middle of nowhere (the nearest to me are each about 10 miles away, in cities of about 10,000), and that’s for the same reason why, say, a medieval village won’t have a general shop: it’s not profitable. (The Wild West trope of the dry goods store is a special case. They provided needed materials to settlers, miners, and railroad workers, which was profitable.)

Another purpose of a city is as an administrative center. It’s a seat of government, a home to whatever the culture’s notion of justice entails. In modern times, that means a police force, a city council or mayor, a courthouse, a fire department, and so on. Cultures with cities will begin to centralize around them, and these central cities may later grow into states, city-states, nations, and even empires. Larger cities also have a way of “projecting” themselves: all roads lead to Rome, and how many Americans can name all five of New York City’s boroughs but couldn’t name five counties in their own state? With national and imperial capitals, this projection is even greater, as seen in London, Washington, Beijing, etc. This ties into both the economic reason above, as capitals of administration are very often capitals of commerce, and the one we’re about to see.

Thirdly, cities become cultural centers. While projecting force and economic power outward, they do the same for their culture. This develops naturally from the greater audiences the city provides; it’s hard for an artist to find patronage when he lives out in the country. (That’s just as true in 2017 as it was in 1453, by the way.) And since cities provide stability that rural areas can’t, this creates more incentive for creative types to move downtown. This creates a snowball effect, often spurred on by government investment—grants in modern times, patronage in eras past—until the city begins to take on a cultural character all its own. Like begets like in this case, and in a larger nation with multiple big cities, a kind of specialization arises: movies are for Los Angeles, Memphis has the blues, Vegas is where you go to gamble.

Now with magic

So that’s cities in the real world: urban centers of commerce, government, art, defense, and so many other things. What about in a magical world?

In many cases, it depends on how magic works in the setting. Magic that can be “industrialized” is easy: it effectively becomes another public service (if it requires infrastructure such as artificial “ley lines”—I have written a series based on exactly this concept) or private industry (if it instead takes skilled craftsmanship, as with enchanters in fantasy RPGs). In both of these cases, magic can almost fade into the background, becoming a part of the city’s very fabric.

For the slightly rarer and much less powerful magic we’ve been talking about in this series, it’s a bit of a different story. Yes, there will be magical industries, crafts, and arts; we’ve seen them in earlier parts. As magic in our realm is predictable, almost scientific, it will be used by those who depend on that predictability and repeatability. That includes both the private and public sectors. And enterprising mages will certainly sell the goods they create. That may be in a free market, or their prices and supplies might be tightly controlled, creating a black market for magical items.

If magic can be harnessed for public works, then that implies that cities in our magical realm are, by default, cleaner than their real-world contemporaries. They won’t be dystopian disaster areas like Victorian London or modern Flint. They’ll have clean streets and healthier, longer-lived people than their predecessors. Again, the snowball starts rolling here, because those very qualities, along with the city’s other aspects, will function as advertising, drawing immigrants from the countryside. And the automation and advancement we’ve already said will come to food production lets them do it. Thus, it’s not nearly as hard as you think to get a magical city up to, say, half a million in population.

The main thrust of this series has been that magic can effectively replace technology in certain types of worldbuilding. That’s never more true than in the city. Technology has made cities possible in every era. The first urban areas arose about the same time as farming, and there’s no denying a connection there. Iron Age advances created the conditions necessary for the first true metropolises, and industrialization, machinery, and electricity gave us our modern megacities. At each stage, magic can create a shortcut, allowing cities to grow as large as they could in the “next” technological leap forward.

Magic and tech: food and drink

The need to eat is one of our most basic survival instincts. Every living thing has to do it, and humans have, as in so many other areas, taken the processes of collecting, preparing, and eating food to a level unseen anywhere else on Earth. Many inventions have come about solely for the purpose of making our food better. Sometimes, better means more nutritious. Much more often in history, however, better food is simply food that lasts longer.

And don’t forget about drinks. There’s not an animal alive that doesn’t enjoy a drink of water, but humanity has taken water and flavored it in myriad ways to create beverages. And we use more than just water as a base for our drinks: orange juice is one of the most popular “natural” drinks around, and all we have to do is extract it.

In the world

The history of food is tied to the history of mankind. Cooking in vessels seems to have emerged about 10,000 years ago, right around the same time as so many other parts of the Neolithic Revolution, like pottery and plant domestication; before this, we have some evidence of open fires and cooking pits, but not cooking vessels. An announcement in December 2016 (the very week I wrote this post, in fact) details a pottery find in the Sahara that shows biological markers of cooked plant matter dating back about this far. The timing can’t be a coincidence: the first domesticated cereal grains, the oldest ceramics, and a technique that just happens to use both of them? If anything, that sounds like cause and effect to me. Not sure which one’s the cause, though.

Anyway, preparing food has a long history. So does producing it, whether through growing crops or raising stock. Domestication of animals for food took a bit longer than plants (animals are a bit more willful, you see), but it happened. Some would say we’re doing too well at that these days—the free-range movement is a backlash against intensive production, because the techniques we’ve developed to get those extreme yields lead to extreme suffering on the animals’ part.

Cooking was, for most of human history, something you did over a fire. You could build a box to contain the fire (an oven), put a slab on top of it (a stove or griddle), stick a pot full of water over it to boil (a cooking pot), but it was still a fire. It’s only very recently that we got rid of that, with our gas and electric ovens, our microwaves, and our coffee makers. Yet we go back to the fire even now, when we’re camping or roasting marshmallows, or when the chef breaks out the blowtorch. And gas stoves still use flames, so a lot of Americans retain that millennia-old connection to their Stone Age ancestors.

If there’s anything we have undeniably improved on in the modern era, it’s food preservation. As little as 200 years ago, that was largely limited to salting, pickling, and similar curing processes. In colder climates, you could freeze food through the winter by stuffing it in the snow; everywhere else could get a mild cooling—but not freezing—effect by digging a deep enough hole, which works well for, say, wine. But much food was eaten fresh, or near enough to it. What wasn’t usually came out in some other form: pickles, jams, etc.

Today, by contrast, preserved foods are the norm. We’ve got refrigerators, freezers, canning, vacuum-sealed plastic packaging, and an array of foods specifically designed for a long shelf life. (That’s something else olden days didn’t have. Food sitting on a shelf was food gone to waste.) We have “instant” mixes that, while they may not taste like the real deal, are close enough for people on a budget in time and money. I eat frozen dinners all the time, and they’re basically the same thing. And even when we do use older techniques, we combine them with the new, putting our pickles in the fridge.

Finally, our modern world has given us another benefit in terms of our diet. As we’ve become more connected, as the apparent distances between us have shrunk, we have expanded our palates. Any decent-sized American city will have not only American food, but Italian, Mexican, Japanese, Chinese, and many more. India and Thailand are about as far from the east coast as you can get while still on the same planet, yet immigration and modern food production have combined to let us sample their cuisines from thousands of miles away.

Now with magic

In the general timeframe of the Middle Ages, they didn’t have all that. Sure, there was a booming trade in spices, as there has always been. A few exotic foods made their way to distant locales, though rarely in fresh form. And the European climate in most places was so different from their nearest “exotic” trading partners, the Muslims of the Middle East and North Africa, that many of the food plants simply wouldn’t grow.

Magic, as we’ve provided for our magical realm, won’t change that too much. The distances will still be great, and magical transportation won’t move that much faster than sea travel, once you take into account the often winding roads, the customs checkpoints, the weather, and so on. Grain can keep for a long time, and so can a lot of other food items, but the “goes bad quickly” set won’t shrink very much, because the timing isn’t right. Thus, this part of the exercise won’t be that different from what the real world gives us.

Producing food, however, will get a big boost from magic. Indeed, there’s almost no reason why that won’t be one of the first areas of interest our mages work on. Higher crop yields, protection against crop failure, larger stock, more eggs…these all help everybody in an agrarian society. If the mages don’t focus somewhat on improving agriculture, what good are they? (Even combat-oriented RPGs get this one right. D&D 4th Edition has Bloom as a 2nd-level ritual. Pathfinder’s Plant Growth can be cast at level 5, and it only takes 6 seconds. And that’s not counting the direct “Create Food” stuff.)

So we can assume our magical kingdom will have more food produced. Next up comes harvesting it, often a labor-intensive task. Again, we’ve seen how magic can reduce the labor needed by creating industrial-like machines. All that’s stopping the mages from moving into farm machines is imagination. They may not have gotten to magic-powered combines and tractors yet, but those aren’t too far in the future.

Even a marginal mastery of heat and cold—one we’ve already said this realm has—opens up a lot of avenues for research into refrigeration and cooking. Everything from starting fires to chilling wine gets a boost, along with too many other things to name. Remember that cooking in pre-modern times is mostly about fire. Make that fire easier to work with, and the improvements naturally follow from there. The other processes of cooking, such as chopping, don’t benefit as much, but control over temperature more than makes up for that.

Last, let’s take a look at drinks. Most of those won’t be too modern, as our sodas and imitation fruit juices and “lite” beer take a lot of chemistry and machinery that is out of their league. But cold drinks will be more common, even in summer, and this magical kingdom may learn the joys of iced beverages far sooner than ours did. Fruit juices, easier to extract thanks to magical machines, will likely become popular. Distillation will allow for stronger alcoholic drinks. And then we come back to plain old water. With magic, purification gets a boost (it’s about the same as with alcohol, actually), so clean drinking water isn’t a problem, even in cities.

We’ll leave it on that note, but keep that last idea in mind, because that’s where the next post will go: into the magical cities.

Magic and tech: clothing and fashion

We humans are peculiar in a great many regards, but one of those is our clothing. Call it a cultural imperative, but we all wear clothes. Those few of us that don’t, such as nudists or those few indigenous peoples who still haven’t adopted at least a loincloth, are seen as odd by the rest of our species. (The story of Genesis is at pains to point out that, once they received the higher wisdom of the tree, Adam and Eve very specifically became “ashamed” of their nakedness.) But the big picture tells a different story: as life on this planet goes, we are the weird ones. Only humans feel the need to cover some or most of their bodies in some other substance most of the time.

This may be from an evolutionary quirk, as humans are a rarity in another way. How many other animals choose to leave their evolved habitat? Very few. That’s not just how evolution works, but why. Species adapt to their environments, and there’s a kind of “inertia” that keeps them there. It’s probably because adapting is hard, and where’s the reproductive advantage in doing it all over again?

Putting something on

The first and most obvious choices for human clothing, looking back to prehistoric times, were likely animal skins. Despite the misguided crusades of PETA and others, that’s still an attractive option today. How many of you own a leather jacket, or a fur coat, or something of that sort? Skins are a good choice for protecting us from the elements (one of the original and most important uses for clothing), because, hey, it works for the animals they belong to.

Any culture can make clothing out of animals. It’s not that hard to do, all things considered. And there’s a lot of technological progress that can be made there. Tanning, the process of transforming raw hides into leather, may have been one of the defining developments of the Neolithic, alongside agriculture and villages, if only because it’s one of our oldest examples of a “manufacturing” process.

A few other animal-derived materials see use in clothing. Wool is the big one, but the hair of a few other mammals can also work. Biblical-style sackcloth, for instance, used animal hair, as did medieval hairshirts, strangely enough. Outside of the mammals, we also find silk, which comes from the cocoon of the silkworm. Like hair, silk is a fiber, and we can spin fibers into threads, then weave threads into cloth. Simple as that.

But the best fibers, in terms of cost, ease of use, and animal ethics, come in the form of plant fibers. And it’s those that formed the basis for most day-to-day clothing in the Western world until modern times. As a matter of fact, even our synthetic world of polyester and nylon and the like still holds ample evidence of plant use. I’m wearing an awful lot of cotton right now, for example, and linen (from flax) hasn’t gone away after all these centuries.

Dressing up

Intimately related to clothing is the idea of fashion. It’s all well and good to say that humans cover themselves with animal or plant parts, but how they do so is one of the hallmarks of a culture. What parts do we cover? (That’s a more nuanced question than you might think; in America, it’s different for men and women and children.) What sorts of clothes are acceptable? What kinds of styling do we use, and when?

A lot of questions like this are highly specific to a culture, and it’s hard to draw many general conclusions. Most every culture agrees that the pelvic region should be covered, for instance—though even that is not universal. And it’s rare to find a place that doesn’t have a fashion “hierarchy”, where certain people are expected to wear “better” clothes at certain times. Think of a suit, a tuxedo, or our “Sunday best”, then compare that to what we might wear at the beach, or just around the house.

One of the more interesting—and more visible—aspects of fashion is color. At some point long ago, our ancestors discovered they could dye those materials they used for their clothing. Today, we take that for granted, but it wasn’t always thus. Purple is seen as a royal color in the West because one shade of purple (Tyrian purple) was once worn exclusively by royalty. And why did they choose that particular purple? Because it was just about the most expensive kind of dye you could find: literally worth its weight in silver.

Throughout the ages, that becomes the refrain of high fashion. And high fashion eventually trickles down to low fashion, but low fashion has made its own developments in the meantime. Some of those developments are modern, such as the boxer briefs I’m wearing as I write this. Others have a much longer history, like sandals. Sometimes, the history is longer than you’d expect; art from over 2,000 years ago shows women wearing something that looks an awful lot like a bikini.

Fashionable magic

Whatever form it takes, fashion is an integral part of a culture, and it’s also an important part of any study of clothing. Thus, as we turn to our magical realm, we’ll treat the two of them as inseparable.

First, though, we need to make the clothes. In olden days, that was a laborious, time-consuming task. It’s not a stretch to say that the whole Industrial Revolution came about as a way to simplify that task. Spinning fibers into threads took so much time that some researchers have concluded that it was effectively a constant job for medieval-era women. They’d do it while they weren’t doing anything else, and sometimes when they were. Weaving was likewise hard work. Dyers might have been respected, but only if you weren’t downwind of them. And forget about all those things we take for granted, like zippers or standard sizes.

Industry changed all that, and so can magic. We’ve already seen how magic, within the boundaries we have set, can improve the manufacturing capabilities of our realm. Applying that to clothes-making will likely be one of the first things the mages do. It’s a no-brainer. In our world, it was one of the first true cases of factory automation. That’s not going to be any different if it’s magic powering the factories. (Putting all those women out of work will have…interesting consequences.)

On the other hand, dyeing doesn’t get much of a boost from magic. It’ll benefit from the advances in chemistry made possible by magic itself and the general inquisitiveness that magic will bring, but there are fewer direct applications. Processing the materials for dyes might be automated, though, in much the same way as spinning thread. The same goes for extracting the plant fibers for clothes in the first place; every American student has heard of Eli Whitney and the cotton gin.

One thing is for certain: magic will make clothes cheaper across the board. When clothes cost less, people will have more of them. Even the poorest folks will be able to afford richly dyed fabrics instead of plain whites, browns, and grays. That’s the point when fashion becomes “mainstream”. Once a sufficient percentage of the population has access to finery, styles can develop. Fashion transforms from a noble quirk to a cultural phenomenon. What form it will take is nearly impossible to predict. And it’s a moving target, even in older times. How many people do you know in 2017 wearing bell-bottoms or tie-dyed shirts? How many have you seen in corsets and pantaloons outside of reenactments?

To end this post, let’s look at one very intriguing possibility that sprang from the development of clothes: computers. I know that sounds crazy, but bear with me. Weaving complex fabric patterns on a loom is difficult. It’s hard to make a machine that can do that, and harder still to develop one that can change its patterns. Joseph Marie Jacquard did just that about 200 years ago. He created a mechanized loom that could change its weave based on a pattern of holes punched in a series of “input” cards. Punched cards. Herman Hollerith took them for his census-counting machine at the end of the 19th century. Sixty or so years later, IBM used them to store the data for their first computers.

Now, the “programming language” of Jacquard looms isn’t Turing-complete, and nobody would claim that someone using the loom was truly programming a computer, but the seed of the idea is there. In fact, almost everything an early computer would need can be done with the magic we’ve seen in this series, some six centuries before it “should” exist. That doesn’t mean our magical realm has computers, or will get them anytime soon, but it’s definitely one of those strange paths you might want to look down. In this new year, I’ll try and find more of them for us to explore.
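To make the loom-as-proto-computer idea concrete, here’s a toy sketch of how a stack of punched cards can encode a weave. This is an illustration of the principle, not the actual Jacquard mechanism: each “card” is a row of holes, a hole lifts the corresponding warp thread, and the names and encoding here are my own invention.

```python
# Toy sketch of a punched-card-driven loom: each "card" is a row of
# holes (1 = hole, warp thread lifted; 0 = no hole, thread stays down).
# The encoding is illustrative, not historical Jacquard detail.

def weave(cards):
    """Render each card as a row of fabric: lifted warp shows '#'."""
    return ["".join("#" if hole else "." for hole in card) for card in cards]

# A simple diagonal twill pattern encoded as four cards.
cards = [
    [1, 0, 0, 1, 0, 0],
    [0, 1, 0, 0, 1, 0],
    [0, 0, 1, 0, 0, 1],
    [1, 0, 0, 1, 0, 0],
]

for row in weave(cards):
    print(row)
# → #..#..
#   .#..#.
#   ..#..#
#   #..#..
```

The key point is that the pattern lives entirely in the cards: swap the deck and the same machine produces a different fabric, which is exactly the data-versus-machine separation that punched-card computing later ran with.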

Magic and tech: economy

One of the biggest topics of the last decade has been the economy. We’re finally climbing out of the hole the banks dug for us in 2008, and it’s been long enough that most people have taken notice. Employment, income, wages, and benefits are important. So are less obvious subjects like inflation, debt and credit, or mortgages. Even esoteric phrases like “quantitative easing” make the news.

The economy isn’t a modern invention, however. It’s always been there, mostly in the background. From the first trade of goods, from the first hiring of another person to perform a service, the economy has never truly gone away. If anything, it’s only becoming bigger, both in terms of absolute wealth—the average American is “richer” than any medieval king, by some measures, and today’s billionaires would make even Croesus jealous—and sheer scope.

How would magic affect this integral part of our civilization? The answer depends on the boundaries we set for that magic, as we shall see.


Our economy, whether past, present, or foreseeable future, is based on the concept of scarcity. For the vast majority of human history, it was only possible to have one of something. One specific piece of gold, one individual horse, one of a particular acre of land or anything else you can think of. You could have more than one “instance” of each type—a man could own twenty horses, for example—but each individual “thing” was unique. (Today, we can easily spot the friction caused when this notion of scarcity meets the reality of lossless digital copying, the lashing out by those who depend on that scarcity and see it slipping away.)

Some of those things were rarer than others. Gold isn’t very common; gems can be rarer still. Common goods were relatively cheap, while the rare stuff tended to be expensive. And that remains true today. Look at any “limited edition”. They might have nothing more than a little gold-colored trim or an extra logo, but they’ll command double the price, if not more.

Supply and demand

All that only applies to something people want. It’s a natural tendency for rare, desirable goods to climb in value, while those things that become increasingly common tend to also become increasingly worthless. This is the basis of supply and demand. If there’s more of something than is needed, then prices go down; if there’s a shortage relative to demand, then they go up.

Although it’s a fairly modern statement, the concept is a truism throughout history. It’s not just a fundamental idea of capitalism. It’s more a natural function of a “scarcity economy”. And you can apply it to just about anything, assuming all else is equal. A shortage of laborers (say, due to a plague) pushes wages higher, because demand outstrips supply. That’s one of the ultimate killers of feudalism in the Middle Ages, in fact. Its converse—a glut of supply—is the reason why gas prices have been so low in America the past year or so.
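The price mechanism described above can be sketched as a tiny simulation. The linear demand and supply curves and the numbers are made up purely for illustration; the point is only that a shortage pushes the price up and a glut pushes it down until the two sides meet.

```python
# Minimal sketch of supply and demand finding a price, using
# assumed linear curves with made-up numbers.

def demand(price):      # buyers want less as the price rises
    return max(0.0, 100 - 2 * price)

def supply(price):      # sellers offer more as the price rises
    return 3 * price

price = 10.0
for _ in range(1000):
    gap = demand(price) - supply(price)   # > 0 means a shortage
    price += 0.01 * gap                   # shortage nudges the price up

# Equilibrium where 100 - 2p = 3p, i.e. p = 20.
print(round(price, 2))  # → 20.0
```

A plague-driven labor shortage is the same mechanism with “wage” swapped in for “price”: supply shifts down, the gap turns positive, and wages climb until the market clears.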


Another thing you have to understand about the economy is that it’s all connected. Today, that’s true more than ever; it’s the reason we can talk about globalism, whether we consider it a bringer of utopia or the cause of all the world’s ills. For less advanced societies, the connectivity merely shrinks in scale. There was, for example, no economic connection between Europe and the Americas until the fifteenth century, apart from whatever the Vikings were up to circa 1000. The Black Death had no effect on the economy of the Inca, nor did the collapse of the great Mayan cities cause a recession in Rome. Similarly, Australia was mostly cut off from the “global” economy until shortly before 1800.

Everything else, though, was intertwined. The Silk Road connected Europe and Asia. Arab traders visited Africa for centuries before the Portuguese showed up. Constantinople, later Istanbul, stayed alive because of its position as an economic hub. And like the “contagious” recessions of modern times, one bad event in an important place could reverberate through the known world. A bad crop, a blizzard blocking overland passes, protracted warfare…anything happening somewhere would be felt elsewhere. This was the case despite most people living a very localized lifestyle.

Making magic

In role-playing games, whether video games or the pen-and-paper type, some players make it their mission to break the economy. They find some loophole, such as an easily creatable magic item that sells for far more than its component cost, and then exploit that to make themselves filthy rich. It happens in real life, too, but government tends to be better at regulating such matters than any GM. (The connection between these two acts might make for an interesting study, come to think of it.)

We’re trying for something more general, though, so we don’t have to worry about something as fine-grained as the price of goods. Instead, we can look at the big picture of how an economy can function in the presence of magic. As it turns out, that is very dependent on the type of magic you have at your disposal.

First, let’s assume for a moment that wizards can create things out of thin air. Also, let’s say that it’s not too difficult to do, and it doesn’t require much in the way of training or raw materials. Five minutes of chanting and meditating, and voila! A sword falls at your feet! Something more complex might take more time, and living things can’t be created at all, but crafted goods are almost as easy as a Star Trek replicator.

Well, that destroys any economy based on scarcity. It’s the same problem media companies have with computers: if something can be copied ad infinitum, with no loss in quality, then its unit value quickly drops to zero. Replicating or creating magic, if it’s reasonably widespread, would be like giving everyone a free 3D printer, a full library of shape files, and an unlimited supply of feedstock. Except it’d be even better than that. Need a new sword/axe/carriage/house? Call up the local mage. No materials needed; you’re only paying for his time, the same as what would happen to books, music, and movies without licensing fees and DRM.

So that’s definitely a “broken” economy. Even a single user of such magic breaks things, as he can simply clone the most expensive or valuable items he knows, selling them whenever he needs the cash. Sure, their value will eventually start to drop—supply and demand in action—but he’ll be set for life long before he gets to that point.

It’s the economy, stupid

For our magical kingdom, let’s look at something more low-key. It doesn’t have creation magic. Instead, we have at our disposal a large amount of “automating” magic, as we’ve seen in previous parts. What effect would that have on the economy? Probably the same effect increasing automation has in our real world.

Until very recently, most work was done by hand, occasionally with help from machines that were powered by people, animals, or natural forces. The Industrial Revolution, though, changed all that. Now, thanks to the power of steam (and, later, electricity), machines could do more and more of the work, lightening the load for the actual workers. Fast-forward to today, where some studies claim as many as 40% of jobs can be done entirely automatically. (For labor, we’re actually getting fairly close to “post-scarcity” in many fields, and you can see the strain that’s beginning to cause.)

Magical force and power can easily replace steam and electricity in the above paragraph. The end result won’t change. Thus, as magic becomes more and more important in our fictional realm, its effects stretch to more and more areas of the economy. As discussed in the post about power, this is transforming the workforce. Unskilled labor is less necessary, which means it has a lower demand. Lower demand, without a corresponding decrease in supply, results in lower wages, fewer working hours, fewer jobs overall. We know how that turns out. The whole sordid story can be found in all sorts of novels set in Victorian England or Reconstruction America—Charles Dickens is a good start. Or you can look at modern examples like Detroit or Flint, Michigan, or any steel town of the Midwest.

There is an upside, though. After this initial shock, the economy will adjust. We see that today, as those displaced in their jobs by robots have begun branching out into new careers. Thus, it’s easy to imagine a magical society embracing the “gig economy” we’re seeing in Silicon Valley and other upscale regions, except they’d do it far earlier. You could even posit a planned socialist economy, if the magic works out.

But mages are human, too. They’re subject to need and greed the same as the rest of us. So they might instead become the billionaires of the world. Imagine, for instance, wizards as robber barons, hoarding their techno-magic to use as a lever to extract concessions from their rivals. Or they could simply sell their secrets to the highest bidder, creating something not much different from modern capitalism. If magic has a distinct military bent, then they could become the equivalent of defense contractors. The possibilities are endless. All you have to do is follow the chain of cause and effect.

The economy is huge. It’s probably beyond a single author to create something entirely realistic. But making something that passes the sniff test isn’t that hard. All you have to do is think about why things are the way they are, and how they would change based on the parameters you set. Oh, and you might want to find one of those munchkin-type players who likes to find loopholes; for the economic side, they’re more useful than any editor.