Review: Yesterwynde

It’s been a few years since my favorite group put out a new album, but the time has finally arrived. Yesterwynde came out last month, and I’ve got some things to say about it.

Nightwish remains at the top of my list of favorites, as they have been for almost two decades now. Their music truly has touched me in many ways. It’s emotional, and that emotion carried me through some tough times. It’s inspiring, to the point of providing me with titles for about a dozen of my stories. Most of all, it’s just good music. Considering what the media tries to push these days, that’s a rare occurrence indeed.

Yesterwynde, however, is…a bit of a conundrum. It marks another shift in lineup, as Marko Hietala left the band a few years ago. His replacement on bass doesn’t sing. That means male lead vocals fall to Troy Donockley—admittedly, he did most of the male singing on Human :II: Nature—who also plays…bagpipes. (Metal is weird, in case you’re wondering.)

Concept-wise, it’s not fully coherent, but there are definite themes that run throughout. From what I can tell, it’s envisioned as completing a trilogy that began with Endless Forms Most Beautiful. What that tells me is something I’ll save for the conclusion.

Anyway, on to the song-by-song.

Yesterwynde

The opening, and title, track sets the tone for the whole album. It starts with a very Nightwish symphonic and choral intro, tosses in some piping, and lets Floor Jansen show off. Very traditional, but you can already sense a shift in the tone of the album. There seem to be more minor chords and more drops that give the song a sense of sadness that was totally missing from the last two albums.

At the beginning, even before the strings, is another recurring theme, in the form of a film projector sound. The word "yesterwynde" is a pure neologism that, broadly speaking, refers to nostalgia, the longing for the past. That sense permeates the entire album, and the projector noise only reinforces the notion that we’re looking back. Compare this to Endless Forms, which always gave me the feeling of being looked at.

An Ocean Of Strange Islands

Now we get to the first "real" song, and it’s much more metal. It hearkens back to "Stargazers" and "Devil And The Deep Dark Ocean" in its energy and feel. In my view, that’s another way Yesterwynde invokes nostalgia: it’s as if you’re listening to a greatest hits album that doesn’t actually have any of the songs.

In terms of theme, it’s hard to tell just what these strange islands are, but I suspect that, in this instance at least, they’re worlds like Earth. Phrases like "universal mariners" and "the starbound quay" hint at that, while also reminding us that Nightwish has always been a very sea-focused band.

The Antikythera Mechanism

The object in the title, the Antikythera Mechanism, is the oldest known analog computer, a Greek invention from the Hellenistic era that functioned as an orrery. Finding it changed a lot of what we thought we knew about that era, and about technological progress as a whole.

The song gives a hint of that: "Your father’s voice, no more unheard." What it also gives is an unusual rhythm for the verses, interspersed with a rapid-fire refrain that opens up as it progresses, both filled with lofty lyrics that still come across as down-to-earth. This is also the first reference to the "weave", a theme that we’ll pick up later.

The Day Of…

No, the title didn’t get cut off. That’s really what it’s called. The day of…what, you might ask? Reckoning, I would assume, because this track definitely has an apocalyptic vibe. Or rather, it’s closer to a deconstruction of the apocalypse.

This is where Nightwish, like many metal acts, makes you think. For the whole song, Jansen is rattling off various ways the world might end. Y2K, overpopulation, global warming, and other such falsehoods. The best part is, she’s mocking them. She’s laughing at all these crazy theories humans have devised for the end times. And that goes all the way up to the present, the "mind virus" (which can only be a reference to woke progressivism) and the urge to "obey, stay away, cover up" that too many people submitted to in 2020.

"The Day Of…" thus feels like a rejection of the modern ideology, the state religion of fear impressed upon us. But it’s not a call for returning to tradition, either.

Perfume Of The Timeless

I barely know what to say about "Perfume Of The Timeless". It’s just one of those songs that gets into your mind, your very soul, and makes itself at home there. An 8-minute epic that evokes pretty much anything you could think of, if you squint hard enough. Symphony and metal intertwined. And, best of all, that chorus. The second line of it, to be precise: "We are because of a million loves."

Those seven words, in my opinion, encapsulate not only the overarching theme of Yesterwynde, but the feeling it seems to want you to feel. We’re human. We have human emotions. And this song isn’t saying that we live for love, but because of love. We’re here because our parents loved each other (or tried to), because their parents did, and so on. It’s the spiritual counterpart to what I consider the most important line of "The Greatest Show On Earth": "Not a single one of your fathers died young."

It’s humanism, plain and simple. It is the sense that we all have things in common, that there are universals among our species. And that we owe our lives to the humans who came before us.

Sway

After the humanist national anthem, we get "Sway", an airy ballad that lets both vocalists shine in harmony. Nothing too complex or even deep here, just good singing and an undercurrent of innocence. That’s another Nightwish standard, going all the way back to the 90s. There’s a bit of whimsy in here, which then stands in counterpoint to the bridge and its talk of some unknown big reveal. Death? Revelation? Whatever it is, we should greet it with the eyes of a child.

The Children Of ‘Ata

Speaking of children, next up is "The Children Of ‘Ata". Not sure what ‘Ata is; my admittedly cursory search came up with a mythical Polynesian island, sort of a Pacific Atlantis or Hyperborea. The song starts with a chant in a Polynesian language—I think it’s Tongan?—lending credence to that theory.

Besides that, this is a song that confuses me. It works in the lyrical theme of a mariner, as in "An Ocean Of Strange Islands", but also the "watchers" theme from "Edema Ruh" and the endurance theme from the climax of "The Greatest Show On Earth" (both on Endless Forms): the notion that we’re being watched and judged by someone beyond our knowledge. In this case, based on the rest of Yesterwynde, that someone is…our children. Future generations looking back, probably wondering what in the world we were thinking.

Oh, and there’s a haka. I think it’s a haka, anyway. Sounds like one. And that reminds me of Christopher Tin’s "Kia Hora Te Marino". Nothing wrong with that.

Something Whispered Follow Me

A nighttime visitor, physical or spiritual. A call of the wild, beckoning you to step into the unknown, into the land of fantasy that waits beyond the "normal". Every metal band does it, apparently. Queensryche made a hit of it. Avantasia wrote two whole albums about it. Nightwish themselves put it in a song ("Elvenpath", in case you’re wondering) when I was still in middle school.

This rendition of that timeless trope is nothing spectacular, but it’s solid. Hard. It urges us to find something real by, paradoxically enough, embracing flights of fancy. And it’s another song with humanist trappings, reminding us that our lives are works of art simply by us living them.

Spider Silk

Here’s another song where I don’t know what to say. Unlike "Perfume Of The Timeless", it’s not because I was bowled over by it. No, "Spider Silk" is simply…uninspiring. It’s literally a song about spiders, and it just isn’t a very good one.

Now, don’t get me wrong. The track has a catchy beat. The first time I listened to it, relaxing at home, I was immediately put off. But the second time, driving across the state to see the woman I love, I found myself almost singing along. So it’s the very rare case of a symphonic metal band creating a…pop song? That’s really what it feels like.

The Weave

Spiders, of course, weave webs. That’s what they’re known for. And weaving is a very important theme in Yesterwynde. Stories are woven. Tapestries are woven. And I’d say that all comes about because fate, in many older traditions, is also something woven.

As for "The Weave" as a song, there’s little to say. It’s definitely a filler track, but at least it has a gimmick: the whole thing is, with one exception, a palindrome. What that means, I have no idea.

Hiraeth

I put this out of order purely to keep "Spider Silk" and "The Weave" together. Well, also because it deserves more words. This is another ballad, one far more downcast than "Sway". For reference, hiraeth is the Welsh word for nostalgia, and thus basically a translation of Yesterwynde itself.

Even if you didn’t know that, you could probably figure it out if you listened to the lyrics. The verses are sung by Troy here, with Floor harmonizing in the refrains. And it does seem to be his song, his time in the spotlight as male lead.

He couldn’t have picked a better one. "Hiraeth" is all about looking back, about reaching out for what has passed us by. It’s sad in a bittersweet way, and it’s all too real. Life is full of hurts, of pain and loss. And that wears us down to the point where we do start to long for the days of old. Pain and sorrow and living with the thought that maybe we could’ve done something different to prevent it—that’s called being human.

And that’s why "Hiraeth" hits hard, despite being tucked away near the end of the album. It’s almost a hidden gem of a "sad" song. It makes you think. It makes you dwell on the past, and then realize what you’re doing.

Lanternlight

Last, we come to "Lanternlight". This is more of a story than a song, a bit of free verse that caps our journey through the rose-tinted world of memory. Musically speaking, it’s really nothing more than Floor Jansen’s rehearsal. If you’ve heard her cover of Heart’s "Alone", you’ll wonder if she used that to practice for this. (If you haven’t heard that, check it out. It’s amazing.)

For some reason, I can’t get through "Lanternlight" without crying, and I don’t really understand why. It’s not overly sad. Very bittersweet, yes, but not something intended to get the tears flowing. All I’ve been able to figure out is a sequence in the penultimate verse: "I hear our song now, sung by the free / For a thousand more tomorrows / Of an incomplete weave". That’s the part that gets me misty, and the only reason I can think of is that I just don’t believe I’ll even see a thousand more tomorrows.

Conclusion

Overall, Yesterwynde is a good album. It’s not the greatest, far from the worst, and very much a coherent whole. Even the filler tracks ("Spider Silk", "The Weave") contribute to the primary themes of fate, humanity, humanism, and the idea that we live not for ourselves, but for those who are yet to come. That our descendants, our children and their children and their children, will be the ones who tell our story, not us.

On top of that, it’s nostalgic in the music sense, too. At almost every point in the album, you’ll hear something that sounds enough like an older Nightwish song that it tickles your ear and makes you think back to Once or Oceanborn or whatever. It’s new, but it’s not completely fresh. Instead, it builds off what came before.

If anything, that’s the message right there, and I can see how it’s the cap to a trilogy. Endless Forms Most Beautiful set us in our place in the universe, in the chain of evolution that stretches back to the birth of our world. Human :II: Nature places us in the world, contrasting our human ingenuity with the natural wonders around us. And Yesterwynde roots us in time, reminding us that the way we look back on our ancestors is exactly the way we will be looked upon by future generations.

So maybe we should act like it.

41

I’m a year into overtime now, and I’m not sure how to feel about that. I honestly don’t feel much different from this time last year, at least regarding my position. I’m back to work, if only part-time, and that’s enough to tread water. Never enough to move forward, however, and that’s really how I see every part of my life these days. Add in the exhaustion and stress I feel most days, the parts of my body that don’t quite work as well as they used to…

I’m old. There’s no other way to put it.

While I do still prefer it to the alternative, it makes things much more difficult. I just don’t have time to do all the things I want. I’m not talking about the Boomer obsession with vacations to faraway places or interminable roadtrips. I mean putting my ideas into practice. Because I still have tons of those.

Since I have, for almost three years now, continued to believe that each new birthday will be my last, I’ve decided to focus on my legacy, what little there is. Barring a miracle (and you know I don’t believe in miracles), I probably won’t have children of my own. I’m not going to be a billionaire philanthropist. No, I create things. That’s what I do. And I want to create something that, to put it bluntly, outlives me.

All my ideas are incredibly niche. I’ll freely admit that. I’ve never been one to follow trends or try to be popular. I think outside the box, whether or not people want me to. It’s a lonely path that my mind walks.

That said, the thoughts that have taken root lately have been for things that other people might find interesting or useful. So this year is my chance to focus on those.

  • Altidisk: It’s hard to believe I’ve been creating languages for a quarter of a century. True, I didn’t do much with the craft for a few years, but I never truly stopped. (I’ve used one of them for making passwords for 20 years!) Altidisk is a little different, though. It’s the first time I’ve made an auxiliary language. More Esperanto than Elvish, in that sense. Unlike most auxlangs, mine has an ulterior motive: it’s based on Germanic roots, Germanic principles, and it’s intended to foster a renewed sense of Germanic community.

  • Pixeme: This one’s still around. The idea is simple enough. Take a picture, describe in a single sentence what’s happening in it, then translate that sentence into as many other languages as you can. It’s good for building vocabulary and grammar, the latter of which flashcard methods tend to overlook. And I’ve even tested the Pixeme method myself; even just using AI-generated images, I was able to associate the image with a Spanish sentence fairly easily, and that helped with the words, too.

  • Rakentan: I’ve been wanting to build a "fediverse" platform ever since I first saw the ActivityPub standard. It just seems like it solves so many problems with the way the modern web is designed. Originally, I wanted to create a replacement for the old PHP-based forums of decades past, and that’s still on the table, but I recently had the idea of something like a recreation of webrings (remember those?) crossed with StumbleUpon (remember that?), in a federated model. So you’d have all your own links, and you could follow others’ collections to see what they’re liking, and so on. I’m…still working out the specifics, to be honest.
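
Since that last idea is the vaguest of the three, here’s a minimal sketch of how a federated link collection might map onto ActivityPub’s existing vocabulary: the curator is an ordinary actor, the "webring" is an OrderedCollection of Link objects, and joining someone’s ring is just a Follow activity. To be clear, this is not Rakentan’s actual design (there isn’t one yet); the domains, IDs, and helper functions below are made up purely for illustration.

    import json

    AS_CONTEXT = "https://www.w3.org/ns/activitystreams"

    def actor(domain, name):
        """A minimal ActivityPub actor: one person curating a collection of links."""
        base = f"https://{domain}/users/{name}"
        return {
            "@context": AS_CONTEXT,
            "type": "Person",
            "id": base,
            "preferredUsername": name,
            "inbox": f"{base}/inbox",
            "outbox": f"{base}/outbox",
            "followers": f"{base}/followers",
        }

    def link_collection(owner, title, urls):
        """The 'webring' itself: an OrderedCollection of Link objects."""
        slug = title.lower().replace(" ", "-")
        return {
            "@context": AS_CONTEXT,
            "type": "OrderedCollection",
            "id": f"{owner['id']}/collections/{slug}",
            "attributedTo": owner["id"],
            "name": title,
            "totalItems": len(urls),
            "orderedItems": [{"type": "Link", "href": u} for u in urls],
        }

    def follow(follower, followed):
        """The activity one server would POST to another actor's inbox to 'join the ring'."""
        return {
            "@context": AS_CONTEXT,
            "type": "Follow",
            "actor": follower["id"],
            "object": followed["id"],
        }

    # Hypothetical users on two different servers.
    alice = actor("example.social", "alice")
    bob = actor("links.example", "bob")
    ring = link_collection(alice, "Retro Computing", [
        "https://example.org/ansi-art",
        "https://example.net/bbs-museum",
    ])
    print(json.dumps(ring, indent=2))
    print(json.dumps(follow(bob, alice), indent=2))

The appeal of sticking to stock ActivityStreams types is that any compliant server can already fetch and parse objects like these; the real work would be in discovery and presentation, not the plumbing.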

I have other ideas, because I always do, but these are the top ones at the moment. Other than my writing, of course. That’s what kept me going through the deepest parts of depression, so I don’t see why I shouldn’t continue it, even if basically nobody ever reads the books.

So there you have it. Another year older, somewhat crankier, as quixotic as ever, and altogether jaded. That’s me at 41.

Fork in the road

The past week or so has been an eventful one in the game development world. Unity is still backpedaling on their disastrous attempt at charging devs per install. The CCP-infested Unreal Engine has lowered its royalty fee. Ubisoft is teaching us all how best to set half a billion dollars on fire.

And then there’s Godot.

I’ve written about Godot Engine in the past. It first came out about 10 years ago, and it took the open-source world by storm. Here was a pro-level—okay, semi-pro back then—game engine that was free to use, without worrying that Unity would demand payment when you only ever opened the editor twice. (This actually happened to my brother.) Over the past decade, it grew, evolved, becoming the premier engine for budding developers on a budget.

All that changed a few days ago. "Get woke, go broke," the saying goes, and Godot’s management has chosen to go for broke. A far-left "community manager" proudly boasted on Twitter that this engine was perfectly fine being on an admittedly overzealous list of woke games. Fine. Sure. Find me a AAA studio that isn’t utterly broken to the mind virus, and I’ll gladly buy their games. Well, except I can’t actually buy their games; they won’t sell them to me. (California got one right this time, amazingly enough.)

Most people probably ignored the initial message, seeing it as just another fluorescent-haired professional victim parroting the latest narrative. And that’s probably how it was originally intended. But then came the doubling down. People who questioned the intent of the message started getting banned. Developers were kicked out. Backers were kicked out. The project head first claimed to be apolitical, then whined about being bullied off Twitter altogether, retreating to the safe space of leftist Mastodon. At every turn, those who objected to, disputed, or simply asked about Godot’s underlying political agenda were purged.

The great thing about open source is that this doesn’t mean the end. Because anyone can take the source, compile it, and release the resulting binaries, an open project can’t be shut down by progressive whim; this is most likely why so many are switching to "open core" models or demanding copyright assignments.

The end result, though, is Redot Engine. Yes, the name’s derivative, but that’s to be expected. The whole thing is derivative, but in the positive sense that only free code under a permissive license allows. Redot hasn’t even released a build yet, and they’re already overwhelmed with support, so much so that Godot’s screeching gallery has started openly attacking it. They use the usual communist methods, so familiar from Antifa, BLM, and anything to do with Trump: projection, accusations of white supremacist beliefs, attempts to clog the system with garbage, and vague allusions to unseemly images stored on the "bad guys’" computers.

All this, because someone said, "No, I don’t want my game engine to have a political agenda."

Nor should it. Tools should be apolitical, because a tool, in and of itself, is amoral. It does not think or act on its own. It simply exists. The uses of a tool are in no way indicative of any inherent moral qualities of that tool. Nuclear bombs were once considered a viable means of digging canals, after all. And even if we accept the idea that a tool can espouse an ideology, why would we want one that’s communist? Why would we want to support the single most deadly ideology in all of human history? The one responsible for the Holodomor and the One Child Policy, the one that gave the world Stalin and Mao and Castro and Chavez?

Redot, as I see it, can be a chance to show that some people are still fighting against the encroachment of anti-human ideology in software. That gives me hope, because I’ve often felt I was fighting that battle alone, as I watched project after project adopt censorious codes of conduct or otherwise wall themselves off from rational discourse.

It’s not perfect yet, so my other hope is that the Redot team understands two things. First, something founded purely on a negative basis—that is, solely to be against another—cannot endure. This was the downfall of Voat and Threads, among many others.

Second, if Redot wants to be inclusive in the true, non-bowdlerized meaning of the word, then it must be open. As yet, it is not. All discussion and development is currently hosted only in walled gardens: Discord, Github, Twitter, Youtube. There isn’t any way for a privacy-conscious developer/author to contribute, and I won’t compromise my own morals by supporting the very platforms which have spread the woke mind virus to software development in the first place.

So that’s where we stand right now. Godot has self-immolated, and I have no problem saying they deserve it. Redot is carrying the torch, but they need to prove that their words are not just wind. If they do, then we will have not only a great game engine for free, but a beacon of light in an otherwise dark time.

The Second Enlightenment

The Third Dark Age is upon us. We live in the modern equivalent of the final days of Rome, waiting for the sack that finishes off the Empire once and for all. Like the Huns and their Bronze Age counterparts, invaders run rampant in our towns and cities, not only not stopped by those who claim to lead us, but actively supported by them. Meanwhile, alleged academics want to banish all knowledge of the past, for fear of the masses recognizing our decline.

Can we halt our decline? Probably not, as far down the path as we’ve come so far. But that doesn’t mean we can’t work to ensure that it is as short as possible, and that our descendants are not left in a centuries-long era of regression and barbarism.

Awakening

Out of the Greek Dark Age came legends: Homer and Exodus, mythic tales of people overcoming the great and powerful with more than a little help from their chosen deities. By the time the dust had settled, the world had changed irrevocably, a full break with the past. Gone were the Hittites and Trojans and Canaanites, while Darius and Alexander were still hundreds of years away.

It was a long road back, but we got there in the end. Eventually, rational thought returned to the Western world, largely confined to Greece at this stage. Philosophy was born, and with it the awakening of wisdom, of reason.

Much, much later, the fall of Rome and rise of Islam brought the Medieval Dark Age to Europe, all but extinguishing that light. And while the cultural and technological and even scientific knowledge of the West rose from the mire after only a relatively few generations, the higher purpose of wisdom, of the kind of knowledge that creates civilizations and jump-starts human progress, lay dormant far longer. Instead, Europe looked to religion, to mysticism and myth, for another few centuries.

Only when science had advanced far enough to prove the fairytale stories of Jewish scripture demonstrably false could the Enlightenment begin. And only when it began was Europe able to cast off the last of the darkness.

That didn’t begin until the early 17th century, with great thinkers like Galileo and Bruno, followed later by those such as Newton and Spinoza. Eventually, the Enlightenment even began to fracture, different regions going their separate ways. The French Enlightenment, for instance, gave us the rational philosophy of Descartes and the like: ways to look at the world that didn’t invoke the supernatural. The English and even Scottish, on the other hand, contributed the wisdom of politics, economics, and the "hard" sciences. Last, but most important, was the American Enlightenment, bringing the liberal (in the classical sense) values of French thinkers together with the moral imperatives of free speech, free markets, and freedom of religion that came from their rivals across the Channel.

In that sense, then, there were still bits of darkness in the world as late as 1800. (Really, not all of them left, but I digress.) Even though Europe didn’t have wandering hordes of invaders anymore, we in the West still needed a thousand years after Charlemagne to truly return to the glory days he was trying to emulate.

And that pairs up well with the Greek Dark Age. Yes, Homer was writing his epics in the ninth or tenth century BC, but they were the beginning, not the end, of Greece’s rise. Most of the great thinkers we associate with the Greek school of philosophy came much later, in the fifth and fourth centuries BC. In other words, the better part of a millennium after the invasions of the Sea Peoples. In a very real sense, then, the path from darkness to light was longer for medieval Europe.

Learning from the past

We must do better than that. The effects of our Third Dark Age can’t last for a thousand years. We’ve come too far as a species to allow ourselves to be dragged back into the darkness, no matter what the "traditionalist" right and "inclusive" left wish for us.

So how do we do it?

First, we must keep knowledge alive. True knowledge, the wisdom passed down to us and created in our own time. Digital collections such as Library Genesis are a good thing—the fact that elites hate them so much is a pretty good indication—but they have the downside of being, well, digital. If the Third Dark Age collapse is too great, ubiquitous computing won’t be a given. In addition to distributed, censorship-resistant online libraries, then, we need open, secure libraries in the physical world. The Library of Alexandria, except there’s one in every city, with the shared goal of archiving as much knowledge as possible in forms that will endure even the harshest decline.

Second, those of us who are awake to the peril must continue to share that knowledge. Within our family, our community, and our country (in that order), we should be training others in the skills we possess, while also passing down what we have learned. And this includes the greatest lessons of all: that the world is not some divine mystery; that humanity is inherently a good and positive force; that science is not reserved to the elite, or those with the right credentials, but is something every one of us experiences every single day.

Third on the list is a greater focus on that community. The post-collapse time of the previous Dark Ages was a reversion to a heavily decentralized world. An anti-globalism, or a "localism", if you will. In modern times, I can foresee that creating a multitude of city-states; we’re already pretty close to this with New York, London, and a few others. But even rural areas will have to become more self-reliant as the Third Dark Age brings a fall of the American Empire. We can’t do that if we don’t know our neighbors. (We also can’t band together to resist invasions without that sense of brotherhood, so this strengthening of community has more than one beneficial effect.)

Fourth, reconnecting with our past, in the sense of doing useful work outside of the internet. This can be writing books, building a shed, or just about anything. The key is that it has a physical presence: a tangible manifestation of our knowledge, which matters more in a world that will come to see that knowledge as worthless in itself. I’m not saying to become a prepper—that might be more useful in other collapse circumstances—but to prepare for a major shift in what society deems important.

Above all, we need to remember, to preserve, to teach the world that the coming darkness is not eternal. There is a light beyond, and it is not the light at the end of the tunnel. It is the dawn of a new day. Working together, recognizing what we are losing and why, we have the chance to bring that new dawn faster than in our ancestors’ previous two attempts.

If knowledge is power, our job is to be a generator. And that’s what you need to keep the lights on in a disaster.

The Third Dark Age

Twice before, the West faced a crisis, a series of unfortunate events that led ultimately to the decline of the reigning powers of the civilized world, a long stretch of technological stagnation or even regression, and a loss of the cultural achievements of those who came before. In short, a Dark Age.

Our world today is on the same track. We’re following those same steps, dealing with those same crises. Unlike the past, however, we have the ability to recognize what is happening, and to stop it. But we can only do that if we acknowledge our situation. To do that, we must understand the warning signs and the parallels.

The last Dark Age

Most people in the Western world have heard of the Dark Age. (Sometimes referred to as the Dark Ages, but the singular is important here, as you’ll see shortly.) The time after the fall of the Roman Empire was a period of barbarism in Europe, a long stretch of Vikings and serfdom and general horror.

Archaeological finds show that our legends of this time are exaggerated. Alas, these finds have given ammunition to those on both sides of the political spectrum who wish to argue that the Dark Age never even happened. The left will point to algebra and the Almagest to say, "See? Muslims kept making advances where white Europeans failed." The right, meanwhile, counters with, "Look at those cathedrals! That’s proof that Christianity is what kept civilization going."

Both are wrong, of course. The few Muslim inventions—and their occasional translation of ancient knowledge—don’t make up for the ravages of the Moorish conquest of Spain. The cathedrals built in the 9th century weren’t constructed from Roman concrete, because knowledge of how to make that was lost along with so much else. There was a Dark Age, no doubt about it. The only questions are how long it lasted and just how dark it was.

By any reasonable estimation, the fall of Rome was the tipping point. Many of the Gothic tribes that settled in Italy, France, and Spain at the time still considered themselves vassals of the Empire, to some extent, and some continued to pay homage to Constantinople, the eastern capital where the flame of civilization was kept alive. But even that had ended by 540, following a volcanic winter (caused by an eruption in Central America!) and attendant famines and plagues. So we can put the start of the Dark Age around 500 AD, plus or minus about 40 years.

When did it end? Tradition has it lasting as late as 1066, with the Norman Conquest. Academics like to credit Charlemagne’s coronation as emperor in 800 as ending it. I’d say the best date comes in between those. The early not-quite-Renaissance of the late ninth and early tenth centuries marks European culture’s rise from its nadir far better than the end of the Merovingian era does. Personally, I’d use 927, the year Æthelstan became king of a united England, as a good compromise, but you could make a case for anywhere in the range of 870-940.

In between, most of the continent was a mess. Rational thought took a back seat to mysticism and monasticism. Texts, cultural contributions, and even general knowledge of the Roman Empire all fell away, until the Romans themselves almost became mythologized. The typical Hollywood portrayal of medieval peasants in dirty, tattered clothing, treading muddy streets to go back to their thatch hovels, isn’t entirely accurate, but it’s probably closer to the truth than the almost romantic notions of some traditionalists.

The European Dark Age was, to sum up, a time when the strong ruled, the weak toiled, and the wisdom of the past was forgotten. What’s worse is that it wasn’t the first time that had happened.

The one before

As far back in history from the start of the European Dark Age as it is from our present day, the lands of the Mediterranean faced a crisis. This was precipitated by invasions from what are commonly called the Sea Peoples, a later collective name given to groups who are mostly known only from a few Egyptian accounts. We can identify some of them from such accounts, however: the Achaeans, Sicilians, Sardinians, and Philistines. Possibly also the Etruscans, though this etymology is on somewhat shakier ground.

Whoever they were, the Sea Peoples invaded Egypt, most of the Levant, and Anatolia. They clashed with the major civilizations of that time—Egyptians, Hittites, and so on—and ultimately wore them down so much that they fell into their own decline. It wasn’t a conquest, but more like a war of attrition, the same way forces in Palestine and Lebanon (almost the exact same place!) are bleeding their occupiers dry as we speak.

Dates are hard to find this far back in history, but the most commonly given date for the start, or perhaps the height, of the troubles is 1177 BC, owing to the popular book of the same name. That book isn’t bad, except for the part where it pulls the usual academic trick of courting notability by minimizing the impact of known historical events.

This "Greek Dark Age", as it’s commonly called, isn’t as much known, but its effects were no less drastic than the European one that started a millennium and a half later. The Hittites fell out of history entirely, to the point where our only knowledge of them as recently as 200 years ago was a mention in the Bible. The Egyptians fared a little better, but lost most of their holdings east of the Sinai to the Philistines and Canaanites, who—in another event paralleled by modern times—later lost them to invading Hebrews. Farther north, Troy fell to an alliance that included Sea Peoples; its collapse was so total that the modern West thought the whole city was a myth until it was rediscovered.

Three thousand years is a long time, so it’s only reasonable that we have far less data about the Greek Dark Age. We don’t know a lot of details about it, but what we do know shows that it follows the same pattern as the one that befell Europe later on.

What’s to come

The biggest contributor to both of the previous Dark Ages is invasion. Rome was invaded by Goths, Huns, Vandals, and (later) Moors, all of whom picked apart the bones of the empire and left little behind for its citizens. The Sea Peoples did much the same to the powers of the Bronze Age, even when Ramesses II tried to resettle some of them.

It’s not hard to see that pattern repeating today. Our own country is being invaded as we speak, as are so many of the major Western powers. Millions of "asylum-seekers" who consume resources but refuse to assimilate, who provoke or cause violence, who care nothing for the sentiments of those who call this land home. The Haitians eating pets in Ohio, the Venezuelans capturing apartment complexes in Colorado, the rape gangs of England…these are the modern Sea Peoples, the modern Huns and Moors. And they are one tip of the trident thrust our way.

The apocalyptic conditions of the 530s contributed to the European Dark Age, as well as to the fall of a number of smaller cultures in the Middle East, where the resulting power vacuum provided fertile ground for a Muslim conquest. It’s harder to pinpoint a major ecological disaster for the Greek Dark Age; probably the closest is an impact (or airburst) event on the shores of the Dead Sea circa 1600 BC, the historical basis for the Sodom and Gomorrah myth. But that’s probably too far back. Clearly, though, the Sea Peoples had some reason to migrate south. Perhaps it’s linked to the fall of the Minoans on Crete, another total collapse in that era.

Today, we don’t seem to have as much to worry about on that front. We’re in a stable climate epoch, a period of global "greening" while temperatures remain steady and comfortable. In our case, the ecological angle of collapse might come from an overreaction on the part of—or simply led by—doomsayers who claim our relatively quiescent climate is somehow a bad thing, and that we need to go back to the days of the Little Ice Age.

More likely, ecology will be used as a way to contribute to the collapse. We already see that happening, as clean nuclear plants are shut down and replaced with toxic solar panels and bird-killing turbines. Eugenics is another possibility: the attempts by the so-called "elite" to force us to eat bugs or genetically modified plants, to take experimental drugs that are shown to have a deleterious effect on our health.

The third and final pillar that must fall to create a Dark Age is cultural continuity. In modern times, that one isn’t so much collapsing as being demolished. The entire agenda of ideologies such as progressivism and communism is to create a clean break with the past, with the traditions and customs that brought us to where we are. What little history is allowed to be learned is shown through a distorted lens, and too many who should oppose such acts instead welcome them, hoping that, in the chaos that follows, their particular ideologies will have a chance to step forward.

To be continued

Our new Dark Age, then, will come from those factors: unchecked immigration, ecological fear-mongering, and the destruction of our heritage. That’s not to say these things will start happening soon. No, they’re already happening. With every border crossing, every falsified temperature record, every statue torn down, we sink deeper into the darkness. We’re on the path of decline right now. We have been for almost the entire 21st century.

The question then becomes: what are we going to do about it? In the next post, I’ll offer my own thoughts on a solution. To combat the Third Dark Age, I believe we’ll need a Second Enlightenment.

Summer Reading List 2024: Third

Finished with about a week to spare. Good thing, too. I know next week is going to be awful for reading, and the weekend won’t be any different. Anyway…

Fantasy

Title: In the Shadow of Lightning
Author: Brian McClellan
Genre: Fantasy
Year: 2022

I got this one for Christmas two years ago. Never actually read it, mostly because so much in my life was turning upside-down around that time. So I promptly forgot about it and, when Christmas of last year rolled around, put it on my wishlist again. My brother, always willing to try for a laugh, swiped it out of the pile in my room, wrapped it back up, and gave it to me again. And I’ll admit that I was fooled.

The book itself, now that I’ve actually read it, is great. Brian McClellan is one of my favorite current authors, because he really seems to care about his worldbuilding, while also being able to tell a good story without getting bogged down in minutiae. Plus, he specializes in a style of fantasy that’s post-medieval, more 17th than 12th century, which is refreshing and fun. If anything, that’s how I feel reading his books: like I’m having fun.

With In the Shadow of Lightning, that trend continues. The pace is lively throughout the book, and there’s almost never any real downtime. Things happen, and then something else happens on the next page, and so on. No long monologues, very few digressions. The book is long, but tight. I could easily imagine it being another 200 pages if written by most other authors.

The story itself starts out almost typical, with the fantasy cliche of an imperial hero putting down the wicked rebels seeking a measure of autonomy. Things go wrong quite quickly, however, and the main story begins a decade later, with that same hero now living in exile. He gets drawn back into the political game, and there’s not a minute where things let up from there.

Along the way, there’s a lot of fighting. McClellan is definitely a military fantasy writer, and it shows here far more than it did in the Powder Mage series, which was about a war! The main event of essentially the whole book is a war between the empire and a neighboring country that has retained its independence through trade. Kind of like Venice or the Netherlands, the way I see it.

The lead-up to the war is part of the plot, so I won’t get too far into spoilers here. I will say that I guessed "false flag" about five pages into the first post-prologue chapter. Not because it’s telegraphed, but because the details simply aligned with what I know of how false flag operations are pushed. 9/11, 1/6, 10/7…anyone who really digs into terrorist attacks and assassination attempts will see that some patterns emerge, and a lot of those patterns point to the supposedly "random" shootings and "unprovoked" attacks actually being started by other agents. Agents of chaos.

Brian McClellan clearly also knows this, as he throws in every single one of the elements we now know to be signs of a false flag operation: a killer who just happens to be of a specific nationality, economic chaos at just the right time, conflicting or outright forged orders, media propaganda to hide the truth. If he isn’t making a statement, then I can’t wait to read the book in which he does.

And that brings me to what I feel is the worst part of In the Shadow of Lightning: the Ossan Empire itself. Rather than your typical fantasy autocracy, possibly with a secret cabal of ministers who are the real power behind the throne, McClellan just jumps straight to the cabal. Ossa is an oligarchical empire where the powerful families vie for dominance.

It’s more of a Renaissance Florence than a High Middle Ages England, in that sense. In the social sense, however, it’s closer to the Weimar Republic…or modern America. The nature of magic in the setting requires body piercings and tattoos, and that’s fine. It’s an interesting twist. But the empire seems to be a place that has given in wholly to decadence and hedonism. Religion has become just another form of commerce. Gender roles are completely absent.

So are sexual roles; essentially every character whose preferences are mentioned has no preferences. At one point, I joked to my girlfriend, "Everybody in this book is bi!" And it’s true. Combine that with the ever-present guild-family dynamic, and I got the same feeling I did with The Expanse: this whole place is rotten, and there’s nothing even worth redeeming.

If that is another way of representing an empire in decline, then I’m okay with it. It’s a pattern that has recurred throughout history. The failure of the family unit, and the transfer of the nurturing role to government or society at large, has happened before. It’s happening now. And the backlash from those who understand human nature has invariably been disastrous.

But this book is anything but a disaster. Read it for what it is: a fun, fast-paced ride through a dying empire, in a world where magic is flashy and deadly. You get some great battles, a lot of political intrigue, and even some Lovecraftian horror near the end. Just remember that the secret cabal of people who masquerade as normal humans to sow dissent and chaos, using degeneracy to bring an empire to its knees while controlling its government and economy from the shadows, is the least fantastic element of the story.

Summer Reading List 2024: Second

The world’s a mess, my life’s a mess, but at least I’m reading. Right?

Military History

Title: The Guns of August
Author: Barbara Tuchman
Genre: Military History
Year: 1962

World War I has always fascinated me. Ever since I was in the 6th grade, when I had to choose a topic for a social studies project (that year was world history, and I had reached the early 20th century), I was hooked by the stories and the sheer scope of the Great War. My grade on that project was terrible—I almost failed, simply because there was just so much to learn that I couldn’t narrow it down enough, and didn’t have time to rehearse—but the memory remained.

Over the past decade, the war that had languished in relative obscurity all my life finally started to get back into the public eye. Mostly, that’s because of the centennial, the 100th anniversary of Archduke Franz Ferdinand’s assassination in June 1914. That milestone brought about the desire to make new media, whether movies (1917), video games (Verdun and its sequels, Tannenberg and Isonzo), web series (Indy Neidell’s The Great War), or music (Sabaton’s…er, The Great War). Many of these are excellent, and I’ve spent the past ten years basking in the knowledge that I finally got to be a history hipster.

But I haven’t read a book about WWI in decades. And I hadn’t planned on doing so this summer, either, until Elon Musk shared a list of his must-listen audiobooks. Since I don’t really care for audiobooks—I’m a visual learner—I downloaded them in written form, and I picked the first interesting one I saw that wasn’t 11 volumes. Sorry, Will Durant.

The Guns of August is a fairly detailed narrative, drawn from diaries, newspapers, the occasional eyewitness report, and other primary or good secondary sources. Its topic is, broadly speaking, the first month of the war. In practice, it starts somewhat before that, with the death of Edward VII, King of England. His death, and his succession by George V, created a power vacuum in a geopolitical landscape that was already growing increasingly tense. Europe had four years until war finally broke out (ignore the Balkan Wars for a moment), but the buildup had already begun. Edward’s death, as Tuchman argues in a roundabout way, set Europe on the path to war.

Most of us know the broad strokes of the summer of 1914. Ferdinand was shot in Sarajevo. Austria delivered its infamous ultimatum to Serbia; recall that this was before Yugoslavia existed, much less Bosnia. Favors were called in on both sides, drawing in first Germany, France, Russia, and Great Britain, then seemingly every other country in the world. Four years and many millions of dead young men later, the original belligerents peaced out one by one, ending with Germany signing the Treaty of Versailles.

As not enough of us know, that treaty was designed to be so ruinous that the German Empire would cease to be a nation able to project power abroad. Indeed, it was the end of the empire as a whole. Instead, the Kaiser’s rule was replaced with the decadent debauchery of the Weimar Republic, which served to suck out the marrow of the German economy while leaving its society fractured, fragmented. Exactly as we’re seeing in modern America, but I digress.

Anyway, Tuchman’s book isn’t about that part of the war. In fact, it leaves off as the Battle of the Marne begins, ending with a series of what-ifs that are tantalizing to the worldbuilder in me. What if the German armies hadn’t tried to do a forced march just to stick to their predetermined schedule of battles? What if Britain’s Field Marshal French hadn’t been swayed by a rare emotional outpouring from the normally stoic General Joffre? (Now I really want to write that alt-history!)

No matter what might have happened if things had gone differently in August 1914, the author makes it clear that what occurred in the weeks immediately prior to the German advance to the outskirts of Paris was pretty much set in stone. Before Franz Ferdinand was so much as cold, Europe was going to war. It was only a matter of when.


As far as the book goes, it’s a good read. It’s nothing brilliant, and certainly not worth a Pulitzer, in my opinion. The writing can be almost too highbrow at times, as if Tuchman is trying to capture the last gasp of the Victorian Era in words. To be fair, that’s how most of the major players talked and wrote, but readers even in the 1960s wouldn’t have been exposed to it except in literature classes. Certainly not when discussing military history. There are also scores of untranslated sentences in French and German, an oddity in a book written for English speakers.

The pacing is also very uneven. The Russians get a couple of fairly long chapters, but are otherwise forgotten; Tannenberg is practically a footnote compared to Liege. Conditions on a forced march get page after page of narration, including diary excerpts from soldiers, while the battles themselves are mostly reduced to the traditional "date and body count" sort of exposition.

If there’s any real critique of The Guns of August, it has to come from its very obvious and very intentional Allied bias. While the happenings in Germany and among the Kaiser’s generals are well-represented, they’re often cast in a negative light. When the Germans demolish a village in retaliation for partisan attacks, it’s a war crime and an international outrage. When the French demolish a village because they think they might need to put up defenses, it’s a heroic effort to save their country.

This is, of course, the same kind of thinking that still permeates the discussion about the Great War’s sequel. The "bad guys" aren’t allowed to take pride in their country. Their nationalism is evil; ours is sacred. (This line of reasoning also leads otherwise sensible people to praise Communists.) The simple fact is, the Weimar Republic was far worse than it’s portrayed, and the governments to either side of it on the timeline, whether Empire or Reich, were not as bad as they’re portrayed. Barbara Tuchman, being a student of her generation, can’t get past that. Even if she tried, I imagine her publishers wouldn’t let her.

Otherwise, The Guns of August is a worthwhile read for its subject matter. It’s a good look at the backdrop to World War I, something that occasionally gets lost among the trenches. Personally, I find it a bit overrated, but I’m glad I read it.

Summer Reading List 2024: First

It’s been about a month, and I finally made time to read something. Thanks to my brother’s timely discovery of a Youtube channel called "In Deep Geek", I got a little inspired for this one. Man, I hope that guy starts posting on a site that respects its users soon.

Biography

Title: The Nature of Middle-Earth
Author: J.R.R. Tolkien (ed. Carl Hostetter)
Genre: Biography/History?
Year: 2021

I don’t really know how to classify this book. It’s basically a collection of notes and scraps that Tolkien left behind. Much like his son Christopher’s History of Middle-Earth series, a ton of editing had to be done to make something readable. And…well, that didn’t quite work. The book as a whole is very disjointed, full of footnotes and editor comments and just a mess overall.

That makes perfect sense, though. Tolkien was probably the first great worldbuilder. He worked in an era without computers, without the internet. He had to write out his notes longhand. And there were a ton of those notes, because his constructed world began all the way back in the days before World War I. 1909, or thereabouts, was when he first started sketching out the conlang that would become Quenya. By his death, those earliest notes were senior citizens. There was a lot of cruft.

This book, then, is about organizing a lot of that cruft. In that, Hostetter does a good job. His is the job of an archaeologist, in a sense, as well as a forensic scientist. Oh, and a linguist, because Tolkien’s languages were ever the most important part of his creation.

The Nature of Middle-Earth, as its name suggests, gives us notes and drafts related to some of the fundamental questions and thorny problems Tolkien had to solve to give his invented world verisimilitude while also keeping it true to his long-standing ideas and ideals. After all, Middle-Earth is intended to be our world, just a few thousand years in the past. How many, exactly? It’s never stated anywhere in his published books, but this book tells us that Tolkien saw his present day—well, in 1960—as being about 6000 years after the end of LOTR. Convenient, that number, since it’s basically the same as what creationists claim.

And that brings me to the point I want to make. Our editor here repeats his own note a couple of times, emphasizing that Tolkien saw his world as a "fundamentally Catholic" creation. He was a Catholic, so that makes sense in some regard.

Much of the book—much of Tolkien’s corpus of personal notes—is thus about harmonizing a high fantasy world at the cusp of the Dominion of Man with the low, anti-human dogma of the Catholic Church. So Tolkien writes at length, and sometimes in multiple revisions, that his Elves were strictly monogamous, and that they didn’t reincarnate into different bodies. The men of Numenor were the same (except that he didn’t have to worry about reincarnation for them) because they had grown more godly.

In a few cases, Tolkien shows glimpses of a modern scientific worldview that was probably heretical in the churches of his youth. Sure, it’s all in an explicitly theistic framework, but he even accepts evolution for the most part; he can’t quite make the logical leap that humans are subject to it, too, but he meets science halfway, which is more than most would dare.

There is also a glimpse of what I’ve previously called "hardcore" worldbuilding. Tolkien was, of course, a master of that, but The Nature of Middle-Earth shows the extremes he was willing to go to for the sake of his creation. Multiple chapters are taken up with his attempts at giving believable dates for some of the events that were considered prehistorical even in the tales of The Silmarillion. In each, he went into excruciating detail, only to discard it all when he reached a point where the numbers just wouldn’t work. I’ve been there, and now I don’t feel so bad about that. Knowing that the undisputed master of my craft had the same troubles I do is refreshing.

All in all, most of the chapters of the book are short, showing the text of Tolkien’s notes on a subject, plus the occasional editorial comment, and the copious footnotes from both authors. We get to see how the sausage is made, and it’s sometimes just as disgusting as we’d expect. Not one reader of LOTR or The Silmarillion cares about the exact population of each tribe of Elves, or what the etymology of Galadriel’s name indicates about her travels, but Tolkien isn’t writing these things for us. When worldbuilding, we authors do so much work not because we expect to show every bit of it to our audience, but so that the parts we do show are as good as they can be.

If this book has any lesson, then, it’s that. Worldbuilding is hard work. Worse, it’s work that accomplishes almost nothing in itself. Its sole value is in being a tool to better convey a story. Perfectionist and obsessive that Tolkien was, he wanted an answer to any plausible question a reader might ask. But he also wanted to create for the sake of creating. Remember that the intended goal of Middle-Earth was to become a new mythology, mostly for the British peoples. When you set your sights on something that sweeping, you’re always going to find something to do.

Tools and appliances

I was trying to sleep late last night when I had something of an epiphany. I’ve long lamented the dumbing-down of the world, and particularly the tech world. Even in programming, a field that should rely on high intelligence, common sense, and reasoning ability, you can’t get away from it. We’ve reached the point where I can only assume malice, rather than mere incompetence, is behind the push for what I’m calling the Modern Dark Age.

The revelation I had was that, at least for computers, there’s a simple way of looking at the problem and its most obvious solution. To do that, however, we need to stop and think a little more.

The old days

I’m not a millennial. I’m in that weird boundary zone between that generation and the Generation X that preceded it. In terms of attitude and worldview, that puts me in an odd place, and I really don’t "get" either side of the divide. But I grew up with computers. I was one of the first to do so from an early age. I learned BASIC at 8, taught myself C++ at 13, and so on to the dozen or so languages I’m proficient in now at 40. I’ve written web apps, shell scripts, maintenance tools, and games.

In my opinion, that’s largely because I had the chance to experience the world of 90s tech. Yes, my intelligence and boundless curiosity made me want to explore computers in ever-deeper detail, but the time period involved allowed me an exploratory freedom that is just unknown to younger people today.

The web was born in 1992, less than a decade after me. At the time, I was getting an Apple IIe to print my name in an infinite loop, then waiting for the after-recess period when I could play Oregon Trail. Yet the internet as a whole, and the technologies which still provide its underpinnings today, were already mature. When AOL, Compuserve, and Prodigy (among others) brought them to the masses, people in the know had been there for fifteen years or more. (They resented the influx of new, inexperienced rabble so much that "Eternal September" is still a phrase you’ll see thrown about on occasion, a full 30 years after it happened!)

This was very much a Wild West. I first managed to convince my mom that an internet subscription was worth it in 1996, not long before my 13th birthday. At the time, there were no unlimited plans; the services charged a few cents a minute, and I quickly racked up a bill that ran over a hundred dollars a month. But it was worth it.

Nowadays, Google gives people the illusion of all the answers. Back in the day, it wasn’t that simple. Oh, there were search engines: Altavista, Lycos, Excite, Infoseek, and a hundred other names lost to the mists of time. None of these indexed more than a small fraction of the web even in that early era, though. (Some companies tried to capitalize on that by making meta-engines that would search from as many sites as possible.)

Finding things was harder on the 90s web, but that wasn’t the only place to look. Before the dotcom bubble, the internet had multiple, independent communities, many of which were still vibrant. Yes, you had websites. In the days before CSS and a standardized DOM, they were rarely pretty, but the technical know-how necessary to create one—as well as the limited space available—meant that they tended to be more informative. When you found a new site about a topic, it often provided hours of reading.

That screeching modem gave you other options, though. Your ISP might offer a proprietary chat service; this eventually spawned AIM and MSN Messenger. AOL especially went all-in on what we now call the "walled garden": chat, news, online trivia games, and basically everything a proto-social network would have needed. On top of that, everyone had email, even if some places (CompuServe is the one I remember best) actually charged for it.

Best of all, in my rose-colored opinion, were the other protocols. These days, everything is HTTP. It’s so prevalent that even local apps are running servers for communication, because it’s all people know anymore. But the 90s had much more diversity. Usenet newsgroups served a similar purpose to what Facebook groups do now, except they did it so much better. The groups were organized into a hierarchy of topics, with no distractions in the form of shared videos or memes, and you could have long, deep discussions with total strangers. Were there spammers and other bad actors? Sure there were. But social pressure kept them in line; when it didn’t, you, the user, had the power to killfile them and never see their posts again. And if you didn’t want to go to the trouble, there were always moderated groups instead.

Beyond that, FTP "sites" were a thing, and they were some of the best places to get…certain files. Gopher was already on its way out when I joined the internet community, but I vaguely remember dipping into it on a few occasions. And while I don’t even know if my area had a local BBS, the dialer software I got with my modem had a few national ones that I checked out. (That was even worse than the AOL per-minute fees, because you were calling long-distance!)

My point here is that the internet of 30 years ago was a diverse and frankly eye-opening place. Ideas were everywhere. Most of them didn’t pan out, but not for lack of trying; experimentation was the norm. Once you found the right places, you could meet like-minded people and learn entirely new ways of looking at the world. I’m not even kidding about that. People talk about getting lost in Wikipedia, but the mid 90s could see a young man going from sports trivia to assembly tutorials to astral projection to…something not at all appropriate for a 13-year-old, and all within the span of a few hours. Yes, I’m speaking from personal experience.

Back again

In 2024, we’ve come a long way, and I’m not afraid to state that most of that way was downhill. Today’s internet is much like today’s malls: a hollowed-out, dying husk kept alive by a couple of big corporations selling their overpriced goods, and a smattering of hobbyists trying to make a living in their shadow. Compared even to 20 years ago, let alone 30, it’s an awful place. Sure, we have access to an unprecedented amount of information. It’s faster than ever. It’s properly indexed and tagged for easy searching. What we’ve lost, though, is its utility.

A computer in the 90s was still a tool. Tools are wonderful things. They let us repair, build, create. Look at a wrench or a drill, a nail gun or a chainsaw. These are useful objects. In many cases, they have a learning curve, but climbing it unlocks their true potential. The same was true for computers then. Oh, you might have to fiddle with DIP switches and IRQs to get that modem working, but look at what it unlocked. Tweaking your autoexec.bat file so you could get a big Doom WAD running? I did that. Did I learn a useful skill? Probably not. Did it give me a sense of accomplishment when I got it working? Absolutely.

Tools are just like that. They provide us with the means to do things, and the things we can do with them only increase as we gain proficiency. With the right tools, you can become a craftsman, an artisan. You can attain a level of mastery. Computers, back then, gave us that opportunity.

Now, however, computers have become appliances. Appliances are still useful things, of course. Dishwashers and microwaves are undeniably good to have. Yet two aspects set them apart from tools. First, an appliance is, at its heart, a provider of convenience. Microwaves let us cook faster. TVs are entertainment sources. That dryer in the laundry room is a great substitute for a clothesline.

Second, and more important for the distinction I’m drawing here, an appliance’s utility is bounded. It has no learning curve—beyond figuring out what the buttons do—and a fixed set of functions. That dryer is never going to be useful for anything other than drying clothes. There’s no mastery needed, because mastering an appliance offers nothing extra. (Seriously, how many people even use all those extra cooking options on their microwave?)

Modern computers are the same way. There is no indication that mastery is desirable or useful. Instead, we’re encouraged and sometimes forced into suboptimal solutions because we aren’t given the tools to do better. Even in this century, for example, you could build a decent webpage with nothing more than a text editor and view it straight off your hard drive. You can’t really do that now, because the moment a page’s script tries to load another local file, the browser refuses on security grounds. The barrier to entry is thus raised by the need to run a server just to look at your own work.
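For the record, the usual workaround is a throwaway static server. You don’t even have to install anything if Python is already on the machine; something like this is enough to make the browser cooperate:

```python
# serve.py -- a throwaway static server, only needed because browsers refuse to
# let a page loaded from file:// fetch other local files from a script.
from http.server import HTTPServer, SimpleHTTPRequestHandler

PORT = 8000  # arbitrary; any free port will do

if __name__ == "__main__":
    # Serves the current directory at http://localhost:8000/
    HTTPServer(("localhost", PORT), SimpleHTTPRequestHandler).serve_forever()
```

(Or just run python -m http.server in the directory, which amounts to the same thing.) Either way, it’s an extra moving part that simply didn’t need to exist thirty years ago.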

It only gets worse from there. Apple has become famous for the total lockdown of its software and hardware. They had to be dragged into opening up to independent app stores, and they’ve done so in the most obtuse way possible, seemingly in protest. Google is no better, and is probably far worse; they’re responsible for the browser restriction I mentioned, as well as killing off FTP in the browser, funneling mobile notifications through their own push service, and so on. Microsoft? They’re busy installing an AI keylogger into Windows.

We’ve fallen. There’s no other way to put it. The purpose of a computer has narrowed into nothing more than a way to access a curated set of services. Steam games, Facebook friends, tweets and TikToks and all the rest. That’s the internet of 2024. There’s very little information there, and it’s so spread out that it’s practically useless. There’s almost no way to participate in its creation, either.

What’s the solution? I wish I knew. To be honest, I think the best thing to do would be a clean break. Create a new internet for those who want the retro feel. Cut it off from the rest, maybe using Tor or something as the only access point. Let it be free of the corrupting influence of corporate greed, while also making it secure against the evils of progressivism. NNTP, SMTP, FTP…these all worked. Bring them back, or use them as the basis for new protocols, new applications, new tools that help us communicate and grow, instead of being ever further restrained.

Dumbing down tech

I recognize that I’m smarter than most people. I’ve known that as long as I can remember. When I was six years old, I took a standardized IQ test. You know, the kind whose results are actually usable. I apparently scored a minimum of 175; it wasn’t until a few years later, when I was studying statistics, that I understood both what that meant in relation to society at large and why it had to be a minimum. (IQ is a relative measurement, not an absolute one. Once you get to a certain point, small sample sizes make a precise evaluation impossible.)
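For the curious, the arithmetic behind that is easy to check: on the usual scale (mean 100, standard deviation 15), 175 sits five standard deviations out, which under a normal curve works out to roughly one person in three and a half million. No test is normed on a sample anywhere near large enough to place someone that far out with any precision. A quick sanity check in Python:

```python
# How rare is a 175 on the standard IQ scale (mean 100, SD 15)?
from statistics import NormalDist

tail = 1 - NormalDist(mu=100, sigma=15).cdf(175)
print(f"roughly 1 in {1 / tail:,.0f}")  # about 1 in 3.5 million
```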

There is, of course, a big difference between intelligence and wisdom, though I like to think I also have a good stock of the latter. In some fields, however, the intelligence factor is the defining one, and tech is one of those fields. Why? Because intelligence isn’t just being able to recite facts and formulas. It’s about reasoning. Logic, deduction, critical thinking, and all those other analytical skills that have been absent from most children’s curricula for decades. While some people literally do have brains that are better wired for that sort of thinking—I know, because I’m one of them—anyone can learn. Logic is a teachable skill. Deductive reasoning isn’t intuition.

Modern society, in a most unfortunate turn of events, has deemed analytical thinking to be a hindrance rather than an aid. While public schooling has always been about indoctrination first, and education a very distant second, recent years have only made the problem both more visible and more pronounced. I’ll get back to this point in a moment, but it bears some consideration: as a 40-year-old man, I grew up in a society that was indifferent to high intelligence, but I now find myself living in one that is actively hostile to it.


I’ve always enjoyed reading tech books. Even in the age of Medium tutorials, online docs, and video walkthroughs, I still find it easiest to learn a new technology from a book-length discussion of it. And these books used to be wonderful. Knuth’s The Art of Computer Programming has parts that are now approaching 60 years old, yet it’s still relevant today. The O’Reilly programming language books were often better than a language’s official documentation.

It’s been a while since I really needed to read a book for a new technology. I’ve spent the past few years working with a single stack that doesn’t have a lot of "book presence" anyway, and the solo projects I’ve started have tended to use things I already knew. Now that I’m unemployed and back to the eternal job hunt, though, I wanted to look for something new, and I was tired of looking at online resources that are, by and large, poorly written, poorly edited, and almost completely useless beyond the beginner level. So I turned to books, hoping for something better.

I didn’t find it.

One book I tried was Real World OCaml. For a few years, I’ve been telling myself that I should learn a functional language. I hate functional programming in general, because I find it useless for real-world problems—the lone exception is Pandoc, which I use to create my novels, because text transformation is one of the few actual uses for FP. But it’s become all the rage, so I looked at what I felt to be the least objectionable of the lot.

The language itself isn’t bad. It has some questionable decisions, but it’s far more palatable than Haskell or Clojure. That comes from embracing imperative programming as valid, meaning that an OCaml program can actually accomplish things without getting lost in mathematical jargon or a sea of parentheses.

But the book…well, it wasn’t bad. It just didn’t live up to its title. There wasn’t much of the real world in it, just the same examples I’d get from a quick Brave search. The whirlwind tour of the language was probably the best part, because it was informative. Tech books work best when they inform.


Okay, maybe that’s a one-off. Maybe I ran into a bad example. So I tried again. I’m working on Pixeme again, the occasional side project I’ve rewritten half a dozen times now, and I decided that this iteration would use the stack I originally intended before I got, er, distracted. As it turns out, the authors of the intriguing htmx library have also written a book about it, called Hypermedia Systems.

This was where I started getting worried. The book is less about helping you learn their library and more about advancing their agenda. Yes, that agenda has its good parts, and I agree with the core of it: a server-rendered, hypermedia-driven app can offer a better experience for both developers and users than a bloated, JavaScript-heavy SPA. The rest of it is mostly unhelpful.

As someone who has been ridiculed for pronouncing "GIF" correctly (like the peanut butter, as the format’s author said) and fighting to keep "hacker" from referring to blackhats, I have to laugh when the authors try to claim that a RESTful API isn’t really REST, and use an appeal to authority to state that the term can only apply to a server returning HTML or some reasonable facsimile.

Advocacy aside, the book was unhelpful in other ways. I can accept that you feel your technology is mostly for the front end, so you don’t want to bog down your book with the perils and pitfalls of a back-end server. But when you’re diving into a method of development that requires a symbiotic relationship between the two, using the academic "beyond the scope" cop-out to wall off any discussion of how to actually structure a back end to benefit from your library is doing your readers—your users—a great disservice. If the scope of your book doesn’t include patterns for taking advantage of a "hypermedia" API, then what does it include? A few new HTML attributes and your whining that people are ignoring a rant from three decades ago?
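To show the kind of thing I mean, here’s the sort of pattern the book leaves as an exercise, sketched in Django (which, spoiler, is where I ended up) with made-up Pixeme template names: serve the full page on a normal visit, and only the fragment when htmx asks for it.

```python
# views.py -- hypothetical sketch: one view, two templates. htmx marks its
# requests with an HX-Request header, so the server can return just a fragment.
from django.shortcuts import render

def pixeme_list(request):
    pixemes = [{"id": 1, "caption": "placeholder"}]  # stand-in for a real query
    is_htmx = request.headers.get("HX-Request") == "true"
    template = (
        "pixemes/_list_fragment.html"  # just the list markup
        if is_htmx
        else "pixemes/list.html"       # full page that includes the fragment
    )
    return render(request, template, {"pixemes": pixemes})
```

It’s not complicated, which makes the refusal to talk about it all the stranger.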


Alright, I thought after this, I’ll give it one more shot. Never let it be said that I’m not stubborn. The back end of this newest version of Pixeme is going to use Django. Mostly, that’s because I’m tired of having to build out or link together all the different parts of a server-side framework that FastAPI doesn’t include. Things like logins, upload handling, etc. I still want to use Python, because that’s become the language I’m most productive in, but I want something with batteries included.

The official documentation for Django is an excellent reference, but it’s just that: a reference. There’s a tutorial, but it ends quickly and offers almost no insight into, say, best practices. That, for me, is the key strength of a tech book: it has the space and the "weight" to explain the whys as well as the hows. So I went looking for a recent book on the topic, and I ended up with Ultimate Django for Web App Development Using Python. A bit of a mouthful, but it’s so new that it even uses the "on X, formerly Twitter" phrasing the mainstream media has adopted. (Seriously, nobody in the real world calls it X, just like nobody refers to the Google corporate entity as Alphabet.)

In this case, the book is somewhat informative, and it functions a lot like an expanded version of the official Django tutorial. If you’re new to the framework, then it’s probably not a bad guide to getting started. From something with "ultimate" in the title, I just expected…more. Outside of the tutorial bits, there’s not much there. The book has a brief overview of setting up a Docker container, but Docker deserves to be wiped off the face of the earth, so that’s not much help. And the last chapter introduced Django Ninja, a sort of FastAPI clone that would be incredible if not for the fact that its developers support child trafficking and religious persecution.
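For anyone wondering what "FastAPI clone" means in practice, the resemblance is in the decorator-plus-type-hints routing. Here’s a minimal sketch from memory of the Ninja docs, so take the exact names with a grain of salt:

```python
# api.py -- hypothetical Django Ninja endpoint; the type hints drive validation
# and the generated docs, just as they do in FastAPI.
from ninja import NinjaAPI, Schema

api = NinjaAPI()

class PixemeOut(Schema):
    id: int
    caption: str

@api.get("/pixemes/{pixeme_id}", response=PixemeOut)
def get_pixeme(request, pixeme_id: int):
    return {"id": pixeme_id, "caption": "placeholder"}

# urls.py would then mount it with: path("api/", api.urls)
```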

Beyond that, the text of the book is littered with typos and grammatical errors. Most of them have the telltale look of an ESL author or editor, which is depressingly common in tech references of all kinds nowadays. Some parts are almost unreadable, and I made sure to look very carefully over any code samples I wanted to use. It’s like dealing with ChatGPT, except here I know there was a real human involved at some point, someone who looked at the text and said, "Yeah, this is right." That’s even worse.


Three strikes, and I’m out. Maybe I’m just unlucky, or maybe these three books are representative of modern tech literature. If it’s the latter, that only reinforces my earlier point: today’s society rewards mediocrity and punishes intelligence, even in fields where intelligence is paramount.

Especially in programming, where there is no room for alternate interpretations, the culture of "good enough" is not only depressing but actively harmful. We laugh wryly at the AAA video game with a 200 GB install size and a 50 GB patch on release day, but past experience shows that it doesn’t have to be that way. We can have smart developers. As with any evolutionary endeavor, though, we have to select for them. Intelligence needs to be rewarded at all stages of life. Otherwise, we’ll be stuck where we are now: with ESL-level books that recapitulate tutorials, screeds disguised as reference texts, and a thousand dead or paywalled links that have nothing of value.

As a case in point, I was looking just yesterday for information about code generation from abstract syntax trees. This is a fundamental part of compiler design, something every programming language has to deal with at some point. Finding a good, helpful resource should be easy, right?

Searching the web netted me a few link farms and a half-finished tutorial using Lisp. Looking for books wasn’t much better, because the only decent reference is still the Dragon Book, published in 1986! Yes, the state of the art has certainly advanced in the past 38 years, but…good luck finding out how.
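To be concrete about what I was even looking for: the core idea is easy to sketch. Here’s a toy version in Python, a tiny arithmetic AST flattened into stack-machine instructions by a post-order walk. This much fits in twenty lines; it’s everything past the toy stage (instruction selection, register allocation, real targets) that nobody seems to write down anymore.

```python
# A toy example of code generation from an AST: arithmetic expressions compiled
# to instructions for an imaginary stack machine via a post-order walk.
from dataclasses import dataclass

@dataclass
class Num:
    value: int

@dataclass
class BinOp:
    op: str                  # "+", "-", "*", "/"
    left: "Num | BinOp"
    right: "Num | BinOp"

def codegen(node, out=None):
    """Emit children first, then the operator, so the operands are already on the stack."""
    if out is None:
        out = []
    if isinstance(node, Num):
        out.append(("PUSH", node.value))
    elif isinstance(node, BinOp):
        codegen(node.left, out)
        codegen(node.right, out)
        out.append(("OP", node.op))
    else:
        raise TypeError(f"unknown node: {node!r}")
    return out

# (2 + 3) * 4
tree = BinOp("*", BinOp("+", Num(2), Num(3)), Num(4))
print(codegen(tree))
# [('PUSH', 2), ('PUSH', 3), ('OP', '+'), ('PUSH', 4), ('OP', '*')]
```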

That’s what needs to change. It isn’t only the loss of access to information, and it isn’t only that the information is no longer being written down in the first place. It’s a confluence of factors, all of them hitting at once, and together they’re making us dumber as a people. Worst of all is that we accept it. Whether you consider it the "price of democracy" or simply decide there’s nothing you can do about it, accepting this rot has only let it fester.