Fork in the road

The past week or so has been an eventful one in the game development world. Unity is still backpedaling on their disastrous attempt at charging devs per-install. The CCP-infested Unreal Engine has lowered its royalty fee. Ubisoft is teaching us all how best to set half a billion dollars on fire.

And then there’s Godot.

I’ve written about Godot Engine in the past. It first came out about 10 years ago, and it took the open-source world by storm. Here was a pro-level—okay, semi-pro back then—game engine that was free to use, without worrying that Unity would demand payment when you only ever opened the editor twice. (This actually happened to my brother.) Over the past decade, it grew and evolved, becoming the premier engine for budding developers on a budget.

All that changed a few days ago. "Get woke, go broke," the saying goes, and Godot’s management has chosen to go for broke. A far-left "community manager" proudly boasted on Twitter that this engine was perfectly fine being on an admittedly overzealous list of woke games. Fine. Sure. Find me a AAA studio that isn’t utterly broken to the mind virus, and I’ll gladly buy their games. Well, except I can’t actually buy their games; they won’t sell them to me. (California got one right this time, amazingly enough.)

Most people probably ignored the initial message, seeing it as just another fluorescent-haired professional victim parroting the latest narrative. And that’s probably how it was originally intended. But then came the doubling down. People who questioned the intent of the message started getting banned. Developers were kicked out. Backers were kicked out. The project head first claimed to be apolitical, then whined about being bullied off Twitter altogether, retreating to the safe space of leftist Mastodon. At every turn, those who objected to, disputed, or simply asked about Godot’s underlying political agenda were purged.

The great thing about open source is that this doesn’t mean the end. Because anyone can take the source, compile it, and release the resulting binaries, an open project can’t be shut down by progressive whim; this is most likely why so many are switching to "open core" models or demanding copyright assignments.

The end result, though, is Redot Engine. Yes, the name’s derivative, but that’s to be expected. The whole thing is derivative, but in the positive sense that only free code under a permissive license allows. Redot hasn’t even released a build yet, and they’re already overwhelmed with support, so much so that Godot’s screeching gallery has started openly attacking it. They use the usual communist methods, so familiar from Antifa, BLM, and anything to do with Trump: projection, accusations of white supremacist beliefs, attempts to clog the system with garbage, and vague allusions to unseemly images stored on the "bad guys’" computers.

All this, because someone said, "No, I don’t want my game engine to have a political agenda."

Nor should it. Tools should be apolitical, because a tool, in and of itself, is amoral. It does not think or act on its own. It simply exists. The uses of a tool are in no way indicative of any inherent moral qualities of that tool. Nuclear bombs were once considered a viable means of digging canals, after all. And even if we accept the idea that a tool can espouse an ideology, why would we want one that’s communist? Why would we want to support the single most deadly ideology in all of human history? The one responsible for the Holodomor and the One Child Policy, the one that gave the world Stalin and Mao and Castro and Chavez?

Redot, as I see it, can be a chance to show that some people are still fighting against the encroachment of anti-human ideology in software. That gives me hope, because I’ve often felt I was fighting that battle alone, as I watched project after project adopt censorious codes of conduct or otherwise wall themselves off from rational discourse.

It’s not perfect yet, so my other hope is that the Redot team understands two things. First, something founded purely on a negative basis—that is, solely to be against another—cannot endure. This was the downfall of Voat and Threads, among many others.

Second, if Redot wants to be inclusive in the true, non-bowdlerized meaning of the word, then it must be open. As yet, it is not. All discussion and development is currently hosted only in walled gardens: Discord, GitHub, Twitter, YouTube. There isn’t any way for a privacy-conscious developer/author to contribute, and I won’t compromise my own morals by supporting the very platforms which have spread the woke mind virus to software development in the first place.

So that’s where we stand right now. Godot has self-immolated, and I have no problem saying they deserve it. Redot is carrying the torch, but they need to prove that their words are not just wind. If they do, then we will have not only a great game engine for free, but a beacon of light in an otherwise dark time.

The Second Enlightenment

The Third Dark Age is upon us. We live in the modern equivalent of the final days of Rome, waiting for the sack that finishes off the Empire once and for all. Like the Huns and their Bronze Age counterparts, invaders run rampant in our towns and cities, not only not stopped by those who claim to lead us, but actively supported by them. Meanwhile, alleged academics want to banish all knowledge of the past, for fear of the masses recognizing our decline.

Can we halt our decline? Probably not, as far down the path as we’ve already come. But that doesn’t mean we can’t work to ensure that it is as short as possible, and that our descendants are not left in a centuries-long era of regression and barbarism.

Awakening

Out of the Greek Dark Age came legends: Homer and Exodus, mythic tales of people overcoming the great and powerful with more than a little help from their chosen deities. By the time the dust had settled, the world had changed irrevocably, a full break with the past. Gone were the Hittites and Trojans and Canaanites, while Darius and Alexander were still hundreds of years away.

It was a long road back, but we got there in the end. Eventually, rational thought returned to the Western world, largely confined to Greece at this stage. Philosophy was born, and with it the awakening of wisdom, of reason.

Much, much later, the fall of Rome and rise of Islam brought the Medieval Dark Age to Europe, all but extinguishing that light. And while the cultural and technological and even scientific knowledge of the West rose from the mire after relatively few generations, the higher purpose of wisdom, of the kind of knowledge that creates civilizations and jump-starts human progress, lay dormant far longer. Instead, Europe looked to religion, to mysticism and myth, for another few centuries.

Only when science had advanced far enough to prove the fairytale stories of Jewish scripture demonstrably false could the Enlightenment begin. And only when it began was Europe able to cast off the last of the darkness.

That didn’t begin until the early 17th century, with thinkers like Galileo and Bruno at the start, followed later by the likes of Newton and Spinoza. Eventually, the Enlightenment even began to fracture, different regions going their separate ways. The French Enlightenment, for instance, gave us the rational philosophy of Descartes and the like: ways to look at the world that didn’t invoke the supernatural. The English and Scottish Enlightenments, on the other hand, contributed the wisdom of politics, economics, and the "hard" sciences. Last, but most important, was the American Enlightenment, bringing the liberal (in the classical sense) values of French thinkers together with the moral imperatives of free speech, free markets, and freedom of religion that came from their rivals across the Channel.

In that sense, then, there were still bits of darkness in the world as late as 1800. (Really, not all of them left, but I digress.) Even though Europe didn’t have wandering hordes of invaders anymore, we in the West still needed a thousand years after Charlemagne to truly return to the glory days he was trying to emulate.

And that pairs up well with the Greek Dark Age. Yes, Homer was composing his epics in the eighth or ninth century BC, but they were the beginning, not the end, of Greece’s rise. Most of the great thinkers we associate with the Greek school of philosophy came much later, in the fifth and fourth centuries BC. In other words, the better part of a millennium after the invasions of the Sea Peoples. In a very real sense, then, the path from darkness to light was longer for medieval Europe.

Learning from the past

We must do better than that. The effects of our Third Dark Age can’t last for a thousand years. We’ve come too far as a species to allow ourselves to be dragged back into the darkness, no matter what the "traditionalist" right and "inclusive" left wish for us.

So how do we do it?

First, we must keep knowledge alive. True knowledge, the wisdom passed down to us and created in our own time. Digital collections such as Library Genesis are a good thing—the fact that elites hate them so much is a pretty good indication—but they have the downside of being, well, digital. If the Third Dark Age collapse is too great, ubiquitous computing won’t be a given. In addition to distributed, censorship-resistant online libraries, then, we need open, secure libraries in the physical world. The Library of Alexandria, except there’s one in every city, and their shared goal is to archive as much knowledge as possible, in ways that will endure even the harshest decline.

Second, those of us who are awake to the peril must continue to share that knowledge. Within our family, our community, and our country (in that order), we should be training others in the skills we possess, while also passing down what we have learned. And this includes the greatest lessons of all: that the world is not some divine mystery; that humanity is inherently a good and positive force; that science is not reserved to the elite, or those with the right credentials, but is something every one of us experiences every single day.

Third on the list is a greater focus on that community. The post-collapse time of the previous Dark Ages was a reversion to a heavily decentralized world. An anti-globalism, or a "localism", if you will. In modern times, I can foresee that creating a multitude of city-states; we’re already pretty close to this with New York, London, and a few others. But even rural areas will have to become more self-reliant as the Third Dark Age brings a fall of the American Empire. We can’t do that if we don’t know our neighbors. (We also can’t band together to resist invasions without that sense of brotherhood, so this strengthening of community has more than one beneficial effect.)

Fourth, reconnecting with our past, in the sense of doing useful work outside of the internet. This can be writing books, building a shed, almost anything. The key is that it has a physical presence: a physical manifestation of our knowledge, which matters more in a world that will come to see that knowledge as worthless in itself. I’m not saying to become a prepper—that might be more useful in other collapse circumstances—but to prepare for a major shift in what society deems important.

Above all, we need to remember, to preserve, to teach the world that the coming darkness is not eternal. There is a light beyond, and it is not the light at the end of the tunnel. It is the dawn of a new day. Working together, recognizing what we are losing and why, we have the chance to bring that new dawn faster than in our ancestors’ previous two attempts.

If knowledge is power, our job is to be a generator. And that’s what you need to keep the lights on in a disaster.

The Third Dark Age

Twice before, the West faced a crisis, a series of unfortunate events that led ultimately to the decline of the reigning powers of the civilized world, a long stretch of technological stagnation or even regression, and a loss of the cultural achievements of those who came before. In short, a Dark Age.

Our world today is on the same track. We’re following those same steps, dealing with those same crises. Unlike the past, however, we have the ability to recognize what is happening, and to stop it. But we can only do that if we acknowledge our situation. To do that, we must understand the warning signs and the parallels.

The last Dark Age

Most people in the Western world have heard of the Dark Age. (Sometimes referred to as the Dark Ages, but the singular is important here, as you’ll see shortly.) The time after the fall of the Roman Empire was a period of barbarism in Europe, a long stretch of Vikings and serfdom and general horror.

Archaeological finds show that our legends of this time are exaggerated. Alas, these finds have given ammunition to those on both sides of the political spectrum who wish to argue that the Dark Age never even happened. The left will point to algebra and the Almagest to say, "See? Muslims kept making advances where white Europeans failed." The right, meanwhile, counters with, "Look at those cathedrals! That’s proof that Christianity is what kept civilization going."

Both are wrong, of course. The few Muslim inventions—and their occasional translation of ancient knowledge—don’t make up for the ravages of the Moorish conquest of Spain. The cathedrals built in the 9th century weren’t constructed from Roman concrete, because knowledge of how to make that was lost along with so much else. There was a Dark Age, no doubt about it. The only questions are how long it lasted and just how dark it was.

By any reasonable estimation, the fall of Rome was the tipping point. Many of the Gothic tribes that settled in Italy, France, and Spain at the time still considered themselves vassals of the Empire, to some extent, and some continued to pay homage to Constantinople, the eastern capital where the flame of civilization was kept alive. But even that had ended by 540, following a volcanic winter (caused by an eruption in Central America!) and attendant famines and plagues. So we can put the start of the Dark Age around 500 AD, plus or minus about 40 years.

When did it end? Tradition has it lasting as late as 1066, with the Norman Conquest. Academics like to credit Charlemagne’s accession in 800 as ending it. I’d say the best date comes in between those. The early not-quite-Renaissance of the late ninth and early tenth centuries marks European culture’s rise from its nadir far better than the end of the Merovingian era does. Personally, I’d use 927, the year of Æthelstan’s coronation as king of England, as a good compromise, but you could make a case for anywhere in the range of 870-940.

In between, most of the continent was a mess. Rational thought took a back seat to mysticism and monasticism. Texts, cultural contributions, and even general knowledge of the Roman Empire all fell away, until the Romans themselves almost became mythologized. The typical Hollywood portrayal of medieval peasants in dirty, tattered clothing, treading muddy streets to go back to their thatch hovels, isn’t entirely accurate, but it’s probably closer to the truth than the almost romantic notions of some traditionalists.

The European Dark Age was, to sum up, a time when the strong ruled, the weak toiled, and the wisdom of the past was forgotten. What’s worse is that it wasn’t the first time that had happened.

The one before

As far back in history from the start of the European Dark Age as it is from our present day, the lands of the Mediterranean faced a crisis. This was precipitated by invasions from what are commonly called the Sea Peoples, a later collective name given to groups who are mostly known only from a few Egyptian accounts. We can identify some of them from such accounts, however: the Achaeans, Sicilians, Sardinians, and Philistines. Possibly also the Etruscans, though this etymology is on somewhat shakier ground.

Whoever they were, the Sea Peoples invaded Egypt, most of the Levant, and Anatolia. They clashed with the major civilizations of that time—Egyptians, Hittites, and so on—and ultimately wore them down so much that they fell into their own decline. It wasn’t a conquest, but more like a war of attrition, the same way forces in Palestine and Lebanon (almost the exact same place!) are bleeding their occupiers dry as we speak.

Dates are hard to pin down this far back in history, but the one most commonly given for the start, or perhaps the height, of the troubles is 1177 BC, owing to the popular book of the same name. The book isn’t bad, except for the part where it pulls the usual academic trick of courting notability by minimizing the impact of known historical events.

This "Greek Dark Age", as it’s commonly called, isn’t as well known, but its effects were no less drastic than the European one that started a millennium and a half later. The Hittites fell out of history entirely, to the point where our only knowledge of them as recently as 200 years ago was a mention in the Bible. The Egyptians fared a little better, but lost most of their holdings east of the Sinai to the Philistines and Canaanites, who—in another event paralleled by modern times—later lost them to invading Hebrews. Farther north, Troy fell to an alliance that included Sea Peoples; its collapse was so total that the modern West thought the whole city was a myth until it was rediscovered.

Three thousand years is a long time, so it’s only reasonable that we have far less data about the Greek Dark Age than the European one. What we do know, though, shows that it follows the same pattern as the one that befell Europe later on.

What’s to come

The biggest contributor to both of the previous Dark Ages was invasion. Rome was invaded by Goths, Huns, Vandals, and (later) Moors, all of whom picked apart the bones of the empire and left little behind for its citizens. The Sea Peoples did much the same to the powers of the Bronze Age, even when Ramesses III tried to resettle some of them.

It’s not hard to see that pattern repeating today. Our own country is being invaded as we speak, as are so many of the major Western powers. Millions of "asylum-seekers" who consume resources but refuse to assimilate, who provoke or cause violence, who care nothing for the sentiments of those who call this land home. The Haitians eating pets in Ohio, the Venezuelans capturing apartment complexes in Colorado, the rape gangs of England…these are the modern Sea Peoples, the modern Huns and Moors. And they are one tip of the trident thrust our way.

The apocalyptic conditions of the 530s contributed to the European Dark Age, as well as to the fall of a number of smaller cultures in the Middle East, where the resulting power vacuum provided fertile ground for a Muslim conquest. It’s harder to pinpoint a major ecological disaster for the Greek Dark Age; probably the closest is an impact (or airburst) event on the shores of the Dead Sea circa 1600 BC, the historical basis for the Sodom and Gomorrah myth. But that’s almost certainly too far back. Still, the Sea Peoples must have had some reason to migrate south. Perhaps it’s linked to the fall of the Minoans on Crete, another total collapse in that era.

Today, we don’t seem to have as much to worry about on that front. We’re in a stable climate epoch, a period of global "greening" while temperatures remain steady and comfortable. In our case, the ecological angle of collapse might come from an overreaction on the part of—or simply led by—doomsayers who claim our relatively quiescent climate is somehow a bad thing, and that we need to go back to the days of the Little Ice Age.

More likely, ecology will be used as a way to contribute to the collapse. We already see that happening, as clean nuclear plants are shut down and replaced with toxic solar panels and bird-killing turbines. Eugenics is another possibility: the attempts by the so-called "elite" to force us to eat bugs or genetically modified plants, to take experimental drugs that are shown to have a deleterious effect on our health.

The third and final pillar that must fall to create a Dark Age is cultural continuity. In modern times, that one isn’t so much collapsing as being demolished. The entire agenda of ideologies such as progressivism and communism is to create a clean break with the past, with the traditions and customs that brought us to where we are. What little history is allowed to be learned is shown through a distorted lens, and too many who should oppose such acts instead welcome them, hoping that, in the chaos that follows, their particular ideologies will have a chance to step forward.

To be continued

Our new Dark Age, then, will come from those factors: unchecked immigration, ecological fear-mongering, and the destruction of our heritage. That’s not to say these things will start happening soon. No, they’re already happening. With every border crossing, every falsified temperature record, every statue torn down, we sink deeper into the darkness. We’re on the path of decline right now. We have been for almost the entire 21st century.

The question then becomes: what are we going to do about it? In the next post, I’ll offer my own thoughts on a solution. To combat the Third Dark Age, I believe we’ll need a Second Enlightenment.

Summer Reading List 2024: Third

Finished with about a week to spare. Good thing, too. I know next week is going to be awful for reading, and the weekend won’t be any different. Anyway…

Fantasy

Title: In the Shadow of Lightning
Author: Brian McClellan
Genre: Fantasy
Year: 2022

I got this one for Christmas two years ago. Never actually read it, mostly because so much in my life was turning upside-down around that time. So I promptly forgot about it and, when Christmas of last year rolled around, put it on my wishlist again. My brother, always willing to try for a laugh, swiped it out of the pile in my room, wrapped it back up, and gave it to me again. And I’ll admit that I was fooled.

The book itself, now that I’ve actually read it, is great. Brian McClellan is one of my favorite current authors, because he really seems to care about his worldbuilding, while also being able to tell a good story without getting bogged down in minutiae. Plus, he specializes in a style of fantasy that’s post-medieval, more 17th than 12th century, which is refreshing and fun. If anything, that’s how I feel reading his books: like I’m having fun.

With In the Shadow of Lightning, that trend continues. The pace is lively throughout the book, and there’s almost never any real downtime. Things happen, and then something else happens on the next page, and so on. No long monologues, very few digressions. The book is long, but tight. I could easily imagine it being another 200 pages if written by most other authors.

The story itself starts out almost typical, with the fantasy cliche of an imperial hero putting down the wicked rebels seeking a measure of autonomy. Things go wrong quite quickly, however, and the main story begins a decade later, with that same hero now living in exile. He gets drawn back into the political game, and there’s not a minute where things let up from there.

Along the way, there’s a lot of fighting. McClellan is definitely a military fantasy writer, and it shows here far more than it did in the Powder Mage series, which was about a war! The main event of essentially the whole book is a war between the empire and a neighboring country that has retained its independence through trade. Kind of like Venice or the Netherlands, is how I see it.

The lead-up to the war is part of the plot, so I won’t get too into spoilers here. I will say that I guessed "false flag" about five pages into the first post-prologue chapter. Not because it’s telegraphed, but the details simply aligned with what I know of how false flag operations are pushed. 9/11, 1/6, 10/7…anyone who really digs into terrorist attacks and assassination attempts will see that some patterns emerge, and a lot of those patterns point to the supposedly "random" shootings and "unprovoked" attacks actually being started by other agents. Agents of chaos.

Brian McClellan clearly also knows this, as he throws in every single one of the elements we now know to be signs of a false flag operation: a killer who just happens to be of a specific nationality, economic chaos at just the right time, conflicting or outright forged orders, media propaganda to hide the truth. If he isn’t making a statement, then I can’t wait to read the book in which he does.

And that brings me to what I feel is the worst part of In the Shadow of Lightning: the Ossan Empire itself. Rather than your typical fantasy autocracy, possibly with a secret cabal of ministers who are the real power behind the throne, McClellan just jumps straight to the cabal. Ossa is an oligarchical empire where the powerful families vie for dominance.

It’s more of a Renaissance Florence than a High Middle Ages England, in that sense. In the social sense, however, it’s closer to the Weimar Republic…or modern America. The nature of magic in the setting requires body piercings and tattoos, and that’s fine. It’s an interesting twist. But the empire seems to be a place that has given in wholly to decadence and hedonism. Religion has become just another form of commerce. Gender roles are completely absent.

So are sexual roles; essentially every character whose preferences are mentioned has no preferences. At one point, I joked to my girlfriend, "Everybody in this book is bi!" And it’s true. Combine that with the ever-present guild-family dynamic, and I got the same feeling as I did with The Expanse: this whole place is rotten, and there’s nothing even worth redeeming.

If that is another way of representing an empire in decline, then I’m okay with it. It’s a pattern that has recurred throughout history. The failure of the family unit and the transfer of the nurturing role to government or society at large have happened before. It’s happening now. And the backlash from those who understand human nature has invariably been disastrous.

But this book is anything but a disaster. Read it for what it is: a fun, fast-paced ride through a dying empire, in a world where magic is flashy and deadly. You get some great battles, a lot of political intrigue, and even some Lovecraftian horror near the end. Just remember that the secret cabal of people who masquerade as normal humans to sow dissent and chaos, using degeneracy to bring an empire to its knees while controlling its government and economy from the shadows, is the least fantastic element of the story.

Summer Reading List 2024: Second

The world’s a mess, my life’s a mess, but at least I’m reading. Right?

Military History

Title: The Guns of August
Author: Barbara Tuchman
Genre: Military History
Year: 1962

World War I has always fascinated me. Ever since I was in the 6th grade, when I had to choose a topic for a social studies project (that year was world history, and I had reached the early 20th century), I was hooked by the stories and the sheer scope of the Great War. My grade on that project was terrible—I almost failed, simply because there was just so much to learn that I couldn’t narrow it down enough, and didn’t have time to rehearse—but the memory remained.

Over the past decade, the war that had languished in relative obscurity all my life finally started to get back into the public eye. Mostly, that’s because of the centennial of Archduke Franz Ferdinand’s assassination in June 1914. That milestone brought about the desire to make new media, whether movies (1917), video games (Verdun and its sequels, Tannenberg and Isonzo), web series (Indy Neidell’s The Great War), or music (Sabaton’s…er, The Great War). Many of these are excellent, and I’ve spent the past ten years basking in the knowledge that I finally got to be a history hipster.

But I haven’t read a book about WWI in decades. And I hadn’t planned on doing so this summer, either, until Elon Musk shared a list of his must-listen audiobooks. Since I don’t really care for audiobooks—I’m a visual learner—I downloaded them in written form, and I picked the first interesting one I saw that wasn’t 11 volumes. Sorry, Will Durant.

The Guns of August is a fairly detailed narrative, drawn from diaries, newspapers, the occasional eyewitness report, and other primary or good secondary sources. Its topic is, broadly speaking, the first month of the war. In practice, it starts somewhat before that, with the death of Edward VII, King of England. His death, and his succession by George V, created a power vacuum in a geopolitical landscape that was already growing increasingly tense. Europe had four years until war finally broke out (ignore the Balkan Wars for a moment), but the buildup had already begun. Edward’s death, as Tuchman argues in a roundabout way, set Europe on the path to war.

Most of us know the broad strokes of the summer of 1914. Ferdinand was shot in Sarajevo. Austria issued Serbia an ultimatum designed to be rejected; recall that this was before Yugoslavia existed, much less Bosnia. Favors were called in on both sides, drawing in first Germany, France, Russia, and Great Britain, then seemingly every other country in the world. Four years and many millions of dead young men later, the original belligerents peaced out one by one, ending with Germany signing the Treaty of Versailles.

As not enough of us know, that treaty was designed to be so ruinous that the German Empire would cease to be a nation able to project power abroad. Indeed, it was the end of the empire as a whole. Instead, the Kaiser’s rule was replaced with the decadent debauchery of the Weimar Republic, which served to suck out the marrow of the German economy while leaving its society fractured, fragmented. Exactly as we’re seeing in modern America, but I digress.

Anyway, Tuchman’s book isn’t about that part of the war. In fact, it leaves off as the Battle of the Marne begins, ending with a series of what-ifs that are tantalizing to the worldbuilder in me. What if the German armies hadn’t tried to do a forced march just to stick to their predetermined schedule of battles? What if Britain’s Field Marshal French hadn’t been swayed by a rare emotional outpouring from the normally stoic General Joffre? (Now I really want to write that alt-history!)

No matter what might have happened if things had gone differently in August 1914, the author makes it clear that what occurred in the weeks immediately prior to the German advance to the outskirts of Paris was pretty much set in stone. Before Franz Ferdinand was so much as cold, Europe was going to war. It was only a matter of when.

As far as the book goes, it’s a good read. It’s nothing brilliant, and certainly not worth a Pulitzer, in my opinion. The writing can be almost too highbrow at times, as if Tuchman is trying to capture the last gasp of the Victorian Era in words. To be fair, that’s how most of the major players talked and wrote, but readers even in the 1960s wouldn’t have been exposed to it except in literature classes. Certainly not when discussing military history. There are also scores of untranslated sentences in French and German, an oddity in a book written for English speakers.

The pacing is also very uneven. The Russians get a couple of fairly long chapters, but are otherwise forgotten; Tannenberg is practically a footnote compared to Liege. Conditions on a forced march get page after page of narration, including diary excerpts from soldiers, while the battles themselves are mostly reduced to the traditional "date and body count" sort of exposition.

If there’s one real criticism to level at The Guns of August, it’s the book’s very obvious and very intentional Allied bias. While the happenings in Germany and among the Kaiser’s generals are well-represented, they’re often cast in a negative light. When the Germans demolish a village in retaliation for partisan attacks, it’s a war crime and an international outrage. When the French demolish a village because they think they might need to put up defenses, it’s a heroic effort to save their country.

This is, of course, the same kind of thinking that still permeates the discussion about the Great War’s sequel. The "bad guys" aren’t allowed to take pride in their country. Their nationalism is evil; ours is sacred. (This line of reasoning also leads otherwise sensible people to praise Communists.) The simple fact is, the Weimar Republic was far worse than it’s portrayed, and the governments to either side of it on the timeline, whether Empire or Reich, were not as bad as they’re portrayed. Barbara Tuchman, being a product of her generation, can’t get past that. Even if she tried, I imagine her publishers wouldn’t let her.

Otherwise, The Guns of August is a worthwhile read for its subject matter. It’s a good look at the backdrop to World War I, something that occasionally gets lost among the trenches. Personally, I find it a bit overrated, but I’m glad I read it.

Summer Reading List 2024: First

It’s been about a month, and I finally made time to read something. Thanks to my brother’s timely discovery of a YouTube channel called "In Deep Geek", I got a little inspired for this one. Man, I hope that guy starts posting on a site that respects its users soon.

Biography

Title: The Nature of Middle-Earth
Author: J.R.R. Tolkien (ed. Carl Hostetter)
Genre: Biography/History?
Year: 2021

I don’t really know how to classify this book. It’s basically a collection of notes and scraps that Tolkien left behind. As with his son Christopher’s History of Middle-Earth series, a ton of editing had to be done to make something readable. And…well, that didn’t quite work. The book as a whole is very disjointed, full of footnotes and editor comments, and just a mess overall.

That makes perfect sense, though. Tolkien was probably the first great worldbuilder. He worked in an era without computers, without the internet. He had to write out his notes longhand. And there were a ton of those notes, because his constructed world began all the way back in the days before World War I. 1909, or thereabouts, was when he first started sketching out the conlang that would become Quenya. By his death, those earliest notes were senior citizens. There was a lot of cruft.

This book, then, is about organizing a lot of that cruft. In that, Hostetter does a good job. His is the job of an archaeologist, in a sense, as well as a forensic scientist. Oh, and a linguist, because Tolkien’s languages were ever the most important part of his creation.

The Nature of Middle-Earth, as its name suggests, gives us notes and drafts related to some of the fundamental questions and thorny problems Tolkien had to solve to give his invented world verisimilitude while also keeping it true to his long-standing ideas and ideals. After all, Middle-Earth is intended to be our world, just a few thousand years in the past. How many, exactly? It’s never stated anywhere in his published books, but this book tells us that Tolkien saw his present day—well, in 1960—as being about 6000 years after the end of LOTR. Convenient, that number, since it’s basically the same as what creationists claim.

And that brings me to the point I want to make. Our editor here repeats his own note a couple of times, emphasizing that Tolkien saw his world as a "fundamentally Catholic" creation. He was a Catholic, so that makes sense in some regard.

Much of the book—much of Tolkien’s corpus of personal notes—is thus about harmonizing a high fantasy world at the cusp of the Dominion of Man with the low, anti-human dogma of the Catholic Church. So Tolkien writes at length, and sometimes in multiple revisions, that his Elves were strictly monogamous, and that they didn’t reincarnate into different bodies. The men of Numenor were the same (except that he didn’t have to worry about reincarnation for them) because they had grown more godly.

In a few cases, Tolkien shows glimpses of a modern scientific worldview that was probably heretical in the churches of his youth. Sure, it’s all in an explicitly theistic framework, but he even accepts evolution for the most part; he can’t quite make the logical leap that humans are subject to it, too, but he meets science halfway, which is more than most would dare.

There is also a glimpse of what I’ve previously called "hardcore" worldbuilding. Tolkien was, of course, a master of that, but The Nature of Middle-Earth shows the extremes he was willing to go to for the sake of his creation. Multiple chapters are taken up with his attempts at giving believable dates for some of the events that were considered prehistorical even in the tales of The Silmarillion. In each, he went into excruciating detail, only to discard it all when he reached a point where the numbers just wouldn’t work. I’ve been there, and now I don’t feel so bad about that. Knowing that the undisputed master of my craft had the same troubles I do is refreshing.

All in all, most of the chapters of the book are short, showing the text of Tolkien’s notes on a subject, plus the occasional editorial comment, and the copious footnotes from both authors. We get to see how the sausage is made, and it’s sometimes just as disgusting as we’d expect. Not one reader of LOTR or The Silmarillion cares about the exact population of each tribe of Elves, or what the etymology of Galadriel’s name indicates about her travels, but Tolkien wasn’t writing these things for us.
When worldbuilding, we authors do so much work not because we expect to show every bit of it to our audience, but so that the parts we do show are as good as they can be.

If this book has any lesson, then, it’s that. Worldbuilding is hard work. Worse, it’s work that accomplishes almost nothing in itself. Its sole value is in being a tool to better convey a story. Perfectionist and obsessive as Tolkien was, he wanted an answer to any plausible question a reader might ask. But he also wanted to create for the sake of creating. Remember that the intended goal of Middle-Earth was to become a new mythology, mostly for the British peoples. When you set your sights on something that sweeping, you’re always going to find something to do.

Summer Reading List Challenge 2024

Is it already that time of the year? 2024 seems like it’s just flying by, or maybe that’s because I’m old now. Whatever the case, it’s Memorial Day, and that means time to start a new Summer Reading List challenge! Take a look at the original post if you want to see how this all started. If you don’t really care that this is the 9th straight year I’m doing this challenge, then read on.

The rules are the same as always, because they just fit the challenge perfectly. As always, remember that the "rules" presented here are intended to be guidelines rather than strictures. This is all in fun. You won’t be graded, so all you have to do is be honest with yourself.

  1. The goal is to read 3 new (to you) books between Memorial Day (May 27) and Labor Day (September 2) in the US, the traditional "unofficial" bounds of summer. For those of you in the Southern Hemisphere reading this, it’s a winter reading list. If you’re in the tropics…I don’t know what to tell you.

  2. A book is anything non-periodical, so no comics, graphic novels, or manga. Anything else works. If you’re not sure, just use common sense. Audiobooks are acceptable, but only if they’re books, not something like a podcast.

  3. One of the books should be of a genre you don’t normally read. For example, I’m big on fantasy and sci-fi, so I might read a romance, or a thriller, or something like that. Nonfiction, by the way, also works as a "new" genre, unless you do read it all the time.

  4. You can’t count books you wrote, because they obviously wouldn’t be new to you. (Yes, this rule exists solely to keep me from just rereading my books.)

Social media is an awful place these days, and even my usual fediverse haunt is in flux at the moment. I’ll try to post on my alt @nocturne@bae.st, but don’t hold your breath. Instead, just wait for me to write something here. Of course, you can post wherever you like, even if that’s to Facebook, Twitter (I’m not calling it anything else), or something weird like Threads.

Have fun, and keep reading!

Tools and appliances

I was trying to sleep late last night when I had something of an epiphany. I’ve long lamented the dumbing-down of the world, and particularly the tech world. Even in programming, a field that should rely on high intelligence, common sense, and reasoning ability, you can’t get away from it. We’ve reached the point where I can only assume malice, rather than mere incompetence, is behind the push for what I’m calling the Modern Dark Age.

The revelation I had was that, at least for computers, there’s a simple way of looking at the problem and its most obvious solution. To do that, however, we need to stop and think a little more.

The old days

I’m not a millennial. I’m in that weird boundary zone between that generation and the Generation X that preceded it. In terms of attitude and worldview, that puts me in a weird place, and I really don’t "get" either side of the divide. But I grew up with computers. I was among the first to do so from an early age. I learned BASIC at 8, taught myself C++ at 13, and so on to the dozen or so languages I’m proficient in now at 40. I’ve written web apps, shell scripts, maintenance tools, and games.

In my opinion, that’s largely because I had the chance to experience the world of 90s tech. Yes, my intelligence and boundless curiosity made me want to explore computers in ever-deeper detail, but the time period involved allowed me an exploratory freedom that is just unknown to younger people today.

The web was born in 1992, less than a decade after me. At the time, I was getting an Apple IIe to print my name in an infinite loop, then waiting for the after-recess period when I could play Oregon Trail. Yet the internet as a whole, and the technologies which still provide its underpinnings today, were already mature. When AOL, Compuserve, and Prodigy (among others) brought them to the masses, people in the know had been there for fifteen years or more. (They resented the influx of new, inexperienced rabble so much that "Eternal September" is still a phrase you’ll see thrown about on occasion, a full 30 years after it happened!)

This was very much a Wild West. I first managed to convince my mom that an internet subscription was worth it in 1996, not long before my 13th birthday. At the time, there were no unlimited plans; the services charged a few cents a minute, and I quickly racked up a bill that ran over a hundred dollars a month. But it was worth it.

Nowadays, Google gives people the illusion of all the answers. Back in the day, it wasn’t that simple. Oh, there were search engines: Altavista, Lycos, Excite, Infoseek, and a hundred other names lost to the mists of time. None of these indexed more than a small fraction of the web even in that early era, though. (Some companies tried to capitalize on that by building meta-engines that searched as many sites as possible at once.)

Finding things was harder on the 90s web, but the web wasn’t the only place to look. Before the dotcom bubble, the internet had multiple, independent communities, many of which were still vibrant. Yes, you had websites. In the days before CSS and a standardized DOM, they were rarely pretty, but the technical know-how necessary to create one—as well as the limited space available—meant that they tended to be more informative. When you found a new site about a topic, it often provided hours of reading.

That screeching modem gave you other options, though. Your ISP might offer a proprietary chat service; this eventually spawned AIM and MSN. AOL especially went all-in on what we now call the "walled garden": chat, news, online trivia games, and basically everything a proto-social network would have needed. On top of that, everyone had email, even if some places (Compuserve is the one I remember best) actually charged for it.

Best of all, in my rose-colored opinion, were the other protocols. These days, everything is HTTP. It’s so prevalent that even local apps are running servers for communication, because it’s all people know anymore. But the 90s had much more diversity. Usenet newsgroups served a similar purpose to what Facebook groups do now, except they did it so much better. Organized into a hierarchy of topics, with no distractions in the form of shared videos or memes, you could have long, deep discussions with total strangers. Were there spammers and other bad actors? Sure there were. But social pressure kept them in line; when it didn’t, you, the user, had the power to block them from your feed. And if you didn’t want to go to the trouble, there were always moderated groups instead.

Beyond that, FTP "sites" were a thing, and they were some of the best places to get…certain files. Gopher was already on its way out when I joined the internet community, but I vaguely remember dipping into it on a few occasions. And while I don’t even know if my area had a local BBS, the dialer software I got with my modem had a few national ones that I checked out. (That was even worse than the AOL per-minute fees, because you were calling long-distance!)

My point here is that the internet of 30 years ago was a diverse and frankly eye-opening place. Ideas were everywhere. Most of them didn’t pan out, but not for lack of trying. Experimentation was everywhere. Once you found the right places, you could meet like-minded people and learn entirely new ways of looking at the world. I’m not even kidding about that. People talk about getting lost in Wikipedia, but the mid 90s could see a young man going from sports trivia to assembly tutorials to astral projection to…something not at all appropriate for a 13-year-old, and all within the span of a few hours. Yes, I’m speaking from personal experience.

Back again

In 2024, we’ve come a long way, and I’m not afraid to state that most of that way was downhill. Today’s internet is much like today’s malls: a hollowed-out, dying husk kept alive by a couple of big corporations selling their overpriced goods, and a smattering of hobbyists trying to make a living in their shadow. Compared even to 20 years ago, let alone 30, it’s an awful place. Sure, we have access to an unprecedented amount of information. It’s faster than ever. It’s properly indexed and tagged for easy searching. What we’ve lost, though, is its utility.

A computer in the 90s was still a tool. Tools are wonderful things. They let us fix, build, create. Look at a wrench or a drill, a nail gun or a chainsaw. These are useful objects. In many cases, they may have a learning curve, but learning unlocks their true potential. The same was true for computers then. Oh, you might have to fiddle with DIP switches and IRQs to get that modem working, but look at what it unlocks. Tweaking your autoexec.bat file so you can get a big Doom WAD running? I did that. Did I learn a useful skill? Probably not. Did it give me a sense of accomplishment when I got it working? Absolutely.
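For anyone who never had the pleasure, it looked something like this. Consider it a sketch from memory: the exact paths and settings varied from machine to machine.

```
REM CONFIG.SYS: load the memory managers so DOS can reach past 640K
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE NOEMS
DOS=HIGH,UMB

REM AUTOEXEC.BAT: shove drivers into upper memory to leave room for Doom
LH C:\MOUSE\MOUSE.COM
SET BLASTER=A220 I5 D1
```

Get one line wrong and the machine hung at boot, which is exactly why getting it right felt like an achievement.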

Tools are just like that. They provide us with the means to do things, and the things we can do with them only increase as we gain proficiency. With the right tools, you can become a craftsman, an artisan. You can attain a level of mastery. Computers, back then, gave us that opportunity.

Now, however, computers have become appliances. Appliances are still useful things, of course. Dishwashers and microwaves are undeniably good to have. Yet two aspects set them apart from tools. First, an appliance is, at its heart, a provider of convenience. Microwaves let us cook faster. TVs are entertainment sources. That dryer in the laundry room is a great substitute for a clothesline.

Second, and more important for the distinction I’m drawing here, is that an appliance’s utility is bounded. They have no learning curve—except figuring out what the buttons do—and a fixed set of functions. That dryer is never going to be useful for anything other than drying clothes. There’s no mastery needed, because there’s nothing a mastery of an appliance would offer. (Seriously, how many people even use all those extra cooking options on their microwave?)

Modern computers are the same way. There is no indication that mastery is desirable or useful. Instead, we’re encouraged and sometimes forced into suboptimal solutions because we aren’t given the tools to do better. Even in this century, for example, it was possible to create a decent webpage with nothing more than a text editor. You can’t do that now, though, because browsers won’t even let you access local files from a script. The barrier to entry is thus raised by the need to install a server.
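To be clear about what I mean: the page itself is still something you can type by hand. Everything below is a made-up minimal example.

```
<!DOCTYPE html>
<html>
<head><title>Hello</title></head>
<body>
  <h1>It still works</h1>
  <!-- browsers refuse to run module scripts loaded from file:// URLs,
       so even this one line forces you to run a local server -->
  <script type="module" src="app.js"></script>
</body>
</html>
```

But to actually run that script, you now need something like python3 -m http.server 8000 in the page’s directory, then a visit to http://localhost:8000 instead of just double-clicking the file.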

It only gets worse from there. Apple has become famous for the total lockdown of its software and hardware. They had to be dragged into opening up to independent app stores, and they’ve done so in the most obtuse way possible, seemingly in protest. Google is no better, and is probably far worse; they’re responsible for the browser restriction I mentioned, as well as killing off FTP in the browser, restricting mobile notifications to only use their paid service, and so on. Microsoft? They’re busy installing an AI keylogger into Windows.

We’ve fallen. There’s no other way to put it. The purpose of a computer has narrowed into nothing more than a way to access a curated set of services. Steam games, Facebook friends, tweets and Tiktoks and all the rest. That’s the internet of 2024. There’s very little information there, and it’s so spread out that it’s practically useless. There’s almost no way to participate in its creation, either.

What’s the solution? I wish I knew. To be honest, I think the best thing to do would be a clean break. Create a new internet for those who want the retro feel. Cut it off from the rest, maybe using Tor or something as the only access point. Let it be free of the corrupting influence of corporate greed, while also making it secure against the evils of progressivism. NNTP, SMTP, FTP…these all worked. Bring them back, or use them as the basis for new protocols, new applications, new tools that help us communicate and grow, instead of being ever further restrained.

Twine thoughts

As I mentioned a few months back, I’m writing interactive fiction now. I’ve been planning one called The Anitra Incident, which I envision as a kind of prequel to my Orphans of the Stars novel series. (The second, which I’m actually in the process of writing, is…something else that I’ll never attach my real name to.)

In the previous post, I looked at what I consider the top four tools for creating interactive fiction: Inform 7, Twine, Ren’Py, and Ink. I think I made it clear then why I felt Twine was the best choice for what I’m writing. Now that I’ve been working with it for a while, I have some thoughts to share. These are more of a ramble than even my usual posts here, so bear with me.

Ditch the editor

Twine’s biggest draw is that it has its own editor, with a nifty little drag-and-drop visual tool to organize your stories. It looks good, and it helps to get people interested in creating, rather than whining about how they don’t want to have to learn anything.

But it sucks.

Yes, the editor works just fine for small-scale constructions. Twine divides its stories into passages, which are just that: bits of text that can be anywhere from a few words to an entire chapter, with all the necessary logic for interactivity sprinkled in. A big story with a lot of branching points, arcs, and the like is going to have hundreds, if not thousands, of passages. (Case in point: my unnamed side project has 232 total passages already, and that’s not much more than a set of locations and a handful of conversation scenes.) Trying to keep all that straight will quickly become impossible.

On top of that, the editor’s structure makes it difficult to write code. There isn’t much room for "metadata" on a passage; for the most part, that’s limited to a series of tags, which you have to edit using the "chip" style of tagging that web devs love for some inexplicable reason. But that means you have to put all the code in that little box, even if you’re using a tool that expects tags. In my case, that’s TinyQBN, a library for implementing what the creators of Sunless Sea call "storylets".
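To show what I mean by working outside that little box: in plain-text Twee notation, a passage is a heading line with optional tags in brackets, followed by its text. The passage names below are invented for illustration, and the card tag is the kind of thing a library like TinyQBN keys off of (check its docs for the exact conventions).

```
:: Cargo Bay
The hold is dark except for the glow of the status panel.

[[Check the panel->Status Panel]]
[[Head for the bridge->Bridge]]

:: Rumor About the Captain [card]
You overhear two deckhands whispering about the captain's last command.
```

Editing tags like that in a text file is trivial. Doing it through the editor’s chip widget, one tag at a time, is not.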

I could rant about the editor for another few posts, but I just don’t bother using it, so I won’t bother discussing it further. Yes, setting up a custom workflow is a bit more difficult. Yes, it’s worth it in the end. After doing the work, I can now write my story in Vim and my code in, er, Code. And it all comes out the same, except that I also have better handling of external JS libraries, static analysis tools that can run automatically, and so much more that I’m used to from my life as a developer.
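The core of that workflow is just Tweego, the command-line compiler, invoked with its documented basics. The directory layout here is my own habit, not a requirement:

```
# compile all .twee files (plus any CSS/JS sources) into a playable page
tweego -f sugarcube-2 -o dist/index.html src/

# or rebuild automatically while writing
tweego -f sugarcube-2 -o dist/index.html -w src/
```

Everything else, from linters to version control, works exactly as it would on any other project, because now your story is just a directory of text files.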

People are stupid

Which brings me to my next point. The average Twine user is not a professional developer or a professional author. Worse yet, neither are the Twine power users. As far as I can tell, I’m just about the only one using Twine who does both. Believe me, it shows.

Most Twine tutorials are written for someone who has never so much as looked at code, and who barely even knows what fiction is, let alone how to write it. I don’t know why Twine’s community targets journalists as its intended audience, but that’s how it is.

For someone who knows both fields, it’s just frustrating. I’ve already read the intro material. I know what a macro is. But no one out there is creating any resources for the intermediate or advanced users. How should I structure a story in terms of source files? What are some common design patterns in interactive fiction, and how do I apply them in Twine? When should I break a scene across multiple passages, and what’s the best way to handle that?

I get that much of writing fiction is an art. I’m well aware that there’s no one-size-fits-all method for creating a novel. But to assume that everyone is forever going to be stuck at the beginner stage is doing the rest of us a disservice. I’m aware that zoomers, degenerates, and progressives (the main components of the intfiction.org "community") don’t know how to learn; people who look to Tumblr for knowledge and wisdom have shown pretty definitively that they have neither. Surely somebody out there cares about the rest of us, though.

If not, maybe I should work on that myself.

Wokeness taints everything

Allegedly, the interactive fiction community is thriving, and Twine is a big part of that. In reality, there’s not much of a community. Much like any other hobby (people don’t generally make a living off adventure stories, unless they work for Failbetter), the anti-human rot of progressivism infects every large gathering that would have the chance to become a community. Those of us who prefer free expression to censorship are, as usual, labeled extremists for the radical view that words are just words. Strange for a hobby built around words, but that’s the whole point of the woke ideology: to tear apart any gathering of like-minded individuals by setting them against one another.

So there’s an interactive fiction forum, but it’s so heavily censored that you get banned just for saying something that someone might think is "bad" in some ill-defined way. There’s a group on Reddit, but that’s…well, Reddit. It’s the Mos Eisley of the internet. Your other major option is Discord, which might be even worse!

Interactive fiction started in the days before the web. It became popular because of technologies like Usenet, where you were expected to be civil, yes, but you weren’t coddled. To have its gathering places be nothing more than wastelands of diversity, mere online versions of Portland and Detroit, is just sad.

(This isn’t specific to Twine, mind you. The Inform community goes even farther. They not only stand against freedom of speech, but also anonymity.)

Tech is tech

Beneath it all, Twine is nothing more than a very weird SPA framework. Sure, you have to compile the source, but the end result is an HTML page and a bunch of assets. It’s like Svelte in a way, except that (as far as I’m aware) the Twine authors don’t openly support child trafficking and religious persecution. As a developer, I think looking at it as a web framework has helped me better understand how to use it as an authoring tool.

This is where my earlier point about getting rid of the Twine IDE as soon as possible comes back into play. Once you abandon that crutch, you realize just how much freedom you have, with all that entails. For my current story, I’ve added the Pure CSS library to help with some layout issues. On my initial draft of The Anitra Incident, I’d used Moment.js for timekeeping; now, somebody finally made a decent native date system macro that does most of what you’d need in a story.

The output is HTML, meaning that you get to use CSS for styling, Javascript for scripting, and all that good stuff. People have managed to integrate Phaser, a 2D sprite-based game engine, into Twine stories, and I’ve been looking at how they did that. I wouldn’t be surprised if somebody even tried combining Twine with React and a full-stack framework. (Come to think of it, that’s not a bad idea. Okay, maybe not React, but Vue and Nuxt…)
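
Ditching the IDE in practice means compiling with something like Tweego, and the whole pipeline fits in one small script. Here’s a minimal sketch in Python, assuming Tweego is on your PATH, with your Twee source in src/ and third-party libraries in assets/ (that layout is my own convention, not anything official):

    # build.py: compile a Twine story with Tweego instead of the IDE.
    # Assumes tweego is on PATH; src/ and assets/ are my own layout.
    import shutil
    import subprocess
    from pathlib import Path

    dist = Path("dist")
    dist.mkdir(exist_ok=True)

    # Tweego compiles a directory of .twee files into a single HTML page.
    # -f selects the story format, -o names the output file.
    subprocess.run(
        ["tweego", "-f", "sugarcube-2", "-o", str(dist / "index.html"), "src"],
        check=True,
    )

    # Copy the CSS/JS libraries (Pure, a date macro, etc.) alongside it.
    if Path("assets").is_dir():
        shutil.copytree("assets", dist / "assets", dirs_exist_ok=True)

Run that, and you never have to open the IDE again.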

One true format

Twine comes prepackaged with a number of "story formats", which are combinations of style templates and authoring DSLs. I briefly went over them in the previous post on this topic. In short, Chapbook is new, and mostly unused. Snowman is not much more than raw Javascript with a parser.

The other two are the most popular: Harlowe and Sugarcube. Harlowe is the default format in the Twine IDE, so it’s the one most newcomers learn first, but I think that’s a horrible decision. If you want to do anything even remotely complex, you’ll quickly run into the limitations of Harlowe. Far worse, however, is the fact that those limitations are by design. The authors, much like Apple, go out of their way to break any attempt at getting outside their sandbox.

In other words, there’s really no reason not to go straight to Sugarcube and stay there. It works. It’s not difficult to pick up. Most of the libraries out there are for it. (A few are format-agnostic, I’ll admit.) And you won’t be supporting the intentional hobbling of technology.

Conclusion

To sum up, then, what I’ve learned about Twine from using it is that it’s a great tool for what it does. It has some extraneous bits, and these are unfortunately the same bits that newcomers are pushed towards. If you’re willing to take the time to set up your own dev environment, use Sugarcube and a compiler like Tweego, and live with the fact that you’ll get no help from the community beyond "here’s how to make text red" and "here’s how to let your players make up their own words to use as pronouns", you won’t have any problems.

Writing a novel is a lot of work. Writing a program is a lot of work. Trying to do both, which is all interactive fiction really is, can be a monumental undertaking. But it’s fun, too. That’s what I’ve discovered in the past few months.

Dumbing down tech

I recognize that I’m smarter than most people. I’ve known that as long as I can remember. When I was six years old, I took a standardized IQ test. You know, the kind whose results are actually usable. I apparently scored a minimum of 175; it wasn’t until a few years later, when I was studying statistics, that I understood both what that meant in relation to society at large and why it had to be a minimum. (IQ is a relative measurement, not an absolute one. Once you get far enough from the mean, there simply aren’t enough test-takers to norm the scale, so a precise evaluation is impossible.)
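
The arithmetic is easy enough to check. A quick sketch, assuming the usual mean-100, SD-15 scoring scale:

    # How rare is an IQ of 175 on a mean-100, SD-15 scale?
    from statistics import NormalDist

    iq = NormalDist(mu=100, sigma=15)
    tail = 1 - iq.cdf(175)                # fraction of people above 175
    print(f"about 1 in {1 / tail:,.0f}")  # roughly 1 in 3.5 million

No test has ever been normed on a sample anywhere near big enough to pin down scores that far out, which is why the result could only be reported as a floor.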

There is, of course, a big difference between intelligence and wisdom, though I like to think I also have a good stock of the latter. In some fields, however, the intelligence factor is the defining one, and tech is one of those fields. Why? Because intelligence isn’t just being able to recite facts and formulas. It’s about reasoning. Logic, deduction, critical thinking, and all those other analytical skills that have been absent from most children’s curricula for decades. While some people literally do have brains that are better wired for that sort of thinking—I know, because I’m one of them—anyone can learn. Logic is a teachable skill. Deductive reasoning isn’t intuition.

Modern society, in a most unfortunate turn of events, has deemed analytical thinking to be a hindrance rather than an aid. While public schooling has always been about indoctrination first, and education a very distant second, recent years have only made the problem both more visible and more pronounced. I’ll get back to this point in a moment, but it bears some consideration: as a 40-year-old man, I grew up in a society that was indifferent to high intelligence, but I now find myself living in one that is actively hostile to it.


I’ve always enjoyed reading tech books. Even in the age of Medium tutorials, online docs, and video walkthroughs, I still find it easiest to learn a new technology from a book-length discussion of it. And these books used to be wonderful. Knuth’s The Art of Computer Programming has parts that are now approaching 60 years old, yet it’s still relevant today. The O’Reilly programming language books were often better than a language’s official documentation.

It’s been a while since I really needed a book to learn a new technology. I’ve spent the past few years working with a single stack that doesn’t have much of a "book presence" anyway, and the solo projects I’ve started have tended to use things I already knew. Now that I’m unemployed and back on the eternal job hunt, though, I wanted something new, and I was tired of online resources that are, by and large, poorly written, poorly edited, and almost completely useless beyond the beginner level. So I turned to books, hoping for something better.

I didn’t find it.

One book I tried was Real World OCaml. For a few years, I’ve been telling myself that I should learn a functional language. I hate functional programming in general, because I find it useless for real-world problems—the lone exception is Pandoc, which I use to create my novels, because text transformation is one of the few actual uses for FP. But it’s become all the rage, so I looked at what I felt to be the least objectionable of the lot.

The language itself isn’t bad. It has some questionable decisions, but it’s far more palatable than Haskell or Clojure. That comes from embracing imperative programming as valid, meaning that an OCaml program can actually accomplish things without getting lost in mathematical jargon or a sea of parentheses.

But the book…well, it wasn’t bad. It just didn’t live up to its title. There wasn’t much of the real world in it, just the same examples I’d get from a quick Brave search. The whirlwind tour of the language was probably the best part, because it was informative. Tech books work best when they inform.


Okay, maybe that’s a one-off. Maybe I ran into a bad example. So I tried again. I’m working on Pixeme again, the occasional side project I’ve rewritten half a dozen times now, and I decided that this iteration would use the stack I originally intended before I got, er, distracted. As it turns out, the authors of the intriguing Htmx library have also written a book about it, called Hypermedia Systems.

This was where I started getting worried. The book is less about helping you learn the library and more about advancing its authors’ agenda. Yes, that agenda has its good parts, and I agree with its core claim: a hypermedia-driven, full-stack app can offer a better experience for both developers and users than a bloated, Javascript-heavy SPA. The rest of it is mostly unhelpful.

As someone who has been ridiculed for pronouncing "GIF" correctly (like the peanut butter, as the format’s author said) and fighting to keep "hacker" from referring to blackhats, I have to laugh when the authors try to claim that a RESTful API isn’t really REST, and use an appeal to authority to state that the term can only apply to a server returning HTML or some reasonable facsimile.

Advocacy aside, the book was unhelpful in other ways. I can accept that you feel your technology is mostly for the front end, so you don’t want to bog down your book with the perils and pitfalls of a back-end server. But when you’re diving into a method of development that requires a symbiotic relationship between the two, using the academic "beyond the scope" cop-out to wall off any discussion of how to actually structure a back end to benefit from your library is doing your readers—your users—a great disservice. If the scope of your book doesn’t include patterns for taking advantage of a "hypermedia" API, then what does it include? A few new HTML attributes and your whining that people are ignoring a rant from three decades ago?
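
And those patterns aren’t hard to show. Here’s a minimal sketch of the kind of thing I mean, written in Python with Django (my choice, not the book’s): one view serves the full page on a normal request and just the fragment when htmx asks for it, signaled by the HX-Request header. The template names and the do_search helper are hypothetical.

    # views.py: one endpoint, two renderings. A normal request gets the
    # full page; an htmx request (HX-Request header) gets a bare fragment.
    from django.shortcuts import render

    def search(request):
        query = request.GET.get("q", "")
        results = do_search(query)  # hypothetical search helper
        context = {"query": query, "results": results}
        if request.headers.get("HX-Request"):
            # htmx swaps this fragment into the page in place.
            return render(request, "results_fragment.html", context)
        return render(request, "search_page.html", context)

On the page itself, the search box carries hx-get="/search" and hx-target="#results", and that’s the whole trick. A dozen pages of patterns like this would have been worth more than all the advocacy combined.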


Alright, I thought after this, I’ll give it one more shot. Never let it be said that I’m not stubborn. The back end of this newest version of Pixeme is going to use Django. Mostly, that’s because I’m tired of having to build out or link together all the different parts of a server-side framework that FastAPI doesn’t include. Things like logins, upload handling, etc. I still want to use Python, because that’s become the language I’m most productive in, but I want something with batteries included.

The official documentation for Django is an excellent reference, but it’s just that: a reference. There’s a tutorial, but it ends very quickly and offers almost no insight on, say, best practices. That, for me, is the key strength of a tech book: it has the space and the "weight" to explain the whys as well as the hows. So I went looking for a recent book on the topic, and I ended up with Ultimate Django for Web App Development Using Python. A bit of a mouthful, but it’s so new that it even uses the "on X, formerly Twitter" phrasing the mainstream media has adopted. (Seriously, nobody in the real world calls it X, just like nobody calls the Google corporate entity Alphabet.)

In this case, the book is somewhat informative, and it functions a lot like an expanded version of the official Django tutorial. If you’re new to the framework, then it’s probably not a bad guide to getting started. From something with "ultimate" in the title, I just expected…more. Outside of the tutorial bits, there’s not much there. The book has a brief overview of setting up a Docker container, but Docker deserves to be wiped off the face of the earth, so that’s not much help. And the last chapter introduced Django Ninja, a sort of FastAPI clone that would be incredible if not for the fact that its developers support child trafficking and religious persecution.
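
Say what you will about its developers, the library itself is elegant. A minimal sketch, following the same pattern its documentation uses (the endpoint and schema names are my own):

    # api.py: a minimal Django Ninja API, FastAPI-style.
    from ninja import NinjaAPI, Schema

    api = NinjaAPI()

    class Greeting(Schema):
        message: str

    @api.get("/hello", response=Greeting)
    def hello(request, name: str = "world"):
        # Query parameters are parsed and validated from the signature.
        return {"message": f"Hello, {name}"}

    # In urls.py, path("api/", api.urls) wires the whole thing in.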

Beyond that, the text of the book is littered with typos and grammatical errors. Most of them have the telltale look of an ESL author or editor, which is depressingly common in tech references of all kinds these days. Some parts are almost unreadable, and I made sure to look over any code samples I wanted to use very carefully. It’s like dealing with ChatGPT, except that here I know a real human was involved at some point, someone who looked at the text and said, "Yeah, this is right." That’s even worse.


Three strikes, and I’m out. Maybe I’m just unlucky, or maybe these three books are representative of modern tech literature. If it’s the latter, that only reinforces my earlier point: today’s society rewards mediocrity and punishes intelligence, even in fields where intelligence is paramount.

Especially in programming, where there is no room for alternate interpretations, the culture of "good enough" is not only depressing but actively harmful. We laugh wryly at the AAA video game with a 200 GB install size and a 50 GB patch on release day, but past experience shows that it doesn’t have to be that way. We can have smart developers. As with any evolutionary endeavor, we have to select for them. Intelligence needs to be rewarded at all stages of life. Otherwise, we’ll be stuck where we are now: with ESL-level books that recapitulate tutorials, screeds disguised as reference texts, and a thousand dead or paywalled links that have nothing of value.

As a case in point, I was looking just yesterday for information about code generation from abstract syntax trees. This is a fundamental part of compiler design, something every programming language has to deal with at some point. Finding a good, helpful resource should be easy, right?

Searching the web netted me a few link farms and a half-finished tutorial using Lisp. Looking for books wasn’t much better, because the only decent reference is still the Dragon Book, published in 1986! Yes, the state of the art has certainly advanced in the past 38 years, but…good luck finding out how.
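
For the record, the core idea fits in a page: walk the tree bottom-up and emit instructions as you return. A toy sketch in Python, targeting a made-up stack machine (the node and instruction names are arbitrary):

    # Code generation from an AST: a post-order walk that emits one
    # instruction per node for an imaginary stack machine.
    from dataclasses import dataclass

    @dataclass
    class Num:
        value: int

    @dataclass
    class BinOp:
        op: str      # "+" or "*"
        left: object
        right: object

    OPCODES = {"+": "ADD", "*": "MUL"}

    def codegen(node, out):
        if isinstance(node, Num):
            out.append(f"PUSH {node.value}")
        elif isinstance(node, BinOp):
            codegen(node.left, out)   # operands first (post-order),
            codegen(node.right, out)  # then the operator
            out.append(OPCODES[node.op])
        else:
            raise TypeError(f"unknown node: {node!r}")

    # (2 + 3) * 4
    code = []
    codegen(BinOp("*", BinOp("+", Num(2), Num(3)), Num(4)), code)
    print("\n".join(code))  # PUSH 2, PUSH 3, ADD, PUSH 4, MUL

Everything past that (instruction selection, register allocation, the intermediate representations in between) is exactly the material nobody seems to be writing down anymore.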

That’s what needs to change. It isn’t only a matter of access to information, or only that the information isn’t being written down in the first place. It’s a confluence of factors, and all of them hitting at once is making us dumber as a people. Worst of all, we accept it. Whether you call it the "price of democracy" or simply decide that there’s nothing you can do, accepting this rot has only let it fester.