Dumbing down tech

I recognize that I’m smarter than most people. I’ve known that as long as I can remember. When I was six years old, I took a standardized IQ test. You know, the kind whose results are actually usable. I apparently scored a minimum of 175; it wasn’t until a few years later, when I was studying statistics, that I understood both what that meant in relation to society at large and why it had to be a minimum. (IQ is a relative measurement, not an absolute one. Once you get to a certain point, small sample sizes make a precise evaluation impossible.)

There is, of course, a big difference between intelligence and wisdom, though I like to think I also have a good stock of the latter. In some fields, however, the intelligence factor is the defining one, and tech is one of those fields. Why? Because intelligence isn’t just being able to recite facts and formulas. It’s about reasoning. Logic, deduction, critical thinking, and all those other analytical skills that have been absent from most children’s curricula for decades. While some people literally do have brains that are better wired for that sort of thinking—I know, because I’m one of them—anyone can learn. Logic is a teachable skill. Deductive reasoning isn’t intuition.

Modern society, in a most unfortunate turn of events, has deemed analytical thinking to be a hindrance rather than an aid. While public schooling has always been about indoctrination first, and education a very distant second, recent years have only made the problem both more visible and more pronounced. I’ll get back to this point in a moment, but it bears some consideration: as a 40-year-old man, I grew up in a society that was indifferent to high intelligence, but I now find myself living in one that is actively hostile to it.


I’ve always enjoyed reading tech books. Even in the age of Medium tutorials, online docs, and video walkthroughs, I still find it easiest to learn a new technology from a book-length discussion of it. And these books used to be wonderful. Knuth’s The Art of Computer Programming has parts that are now approaching 60 years old, yet it’s still relevant today. The O’Reilly programming language books were often better than a language’s official documentation.

It’s been a while since I really needed to read a book for a new technology. I’ve spent the past few years working with a single stack that doesn’t have a lot of "book presence" anyway, and the solo projects I’ve started have tended to use things I already knew. Now that I’m unemployed and back to the eternal job hunt, though, I wanted to look for something new, and I was tired of looking at online resources that are, by and large, poorly written, poorly edited, and almost completely useless beyond the beginner level. So I turned to books, hoping for something better.

I didn’t find it.

One book I tried was Real World OCaml. For a few years, I’ve been telling myself that I should learn a functional language. I hate functional programming in general, because I find it useless for real-world problems—the lone exception is Pandoc, which I use to create my novels, because text transformation is one of the few actual uses for FP. But it’s become all the rage, so I looked at what I felt to be the least objectionable of the lot.

The language itself isn’t bad. It has some questionable decisions, but it’s far more palatable than Haskell or Clojure. That comes from embracing imperative programming as valid, meaning that an OCaml program can actually accomplish things without getting lost in mathematical jargon or a sea of parentheses.

But the book…well, it wasn’t bad. It just didn’t live up to its title. There wasn’t much of the real world in it, just the same examples I’d get from a quick Brave search. The whirlwind tour of the language was probably the best part, because it was informative. Tech books work best when they inform.


Okay, maybe that’s a one-off. Maybe I ran into a bad example. So I tried again. I’m working on Pixeme again, the occasional side project I’ve rewritten half a dozen times now, and I decided that this iteration would use the stack I originally intended before I got, er, distracted. As it turns out, the authors of the intriguing Htmx library have also written a book about it, called Hypermedia Systems.

This was where I started getting worried. The book is less about helping you learn the library and more about advancing the authors’ agenda. Yes, that agenda has its good parts, and I agree with its core: a server-rendered, full-stack app can offer a better experience for both developers and users than a bloated, JavaScript-heavy SPA. The rest of it is mostly unhelpful.

As someone who has been ridiculed for pronouncing "GIF" correctly (like the peanut butter, as the format’s author said) and who has fought to keep "hacker" from referring to blackhats, I have to laugh when the authors try to claim that a RESTful API isn’t really REST, using an appeal to authority to insist that the term can only apply to a server returning HTML or some reasonable facsimile.

Advocacy aside, the book was unhelpful in other ways. I can accept that you feel your technology is mostly for the front end, so you don’t want to bog down your book with the perils and pitfalls of a back-end server. But when you’re diving into a method of development that requires a symbiotic relationship between the two, using the academic "beyond the scope" cop-out to wall off any discussion of how to actually structure a back end to benefit from your library is doing your readers—your users—a great disservice. If the scope of your book doesn’t include patterns for taking advantage of a "hypermedia" API, then what does it include? A few new HTML attributes and your whining that people are ignoring a rant from three decades ago?
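To show what I mean by structure: the one pattern the book leaves you to work out on your own is the server deciding between a bare fragment and a full page. Htmx sends an "HX-Request" header with every request it makes, so the back end can branch on that. Here’s a minimal sketch, assuming a Django-style view; the Photo model and template paths are my own placeholders, not anything from the book.

    # A sketch of the fragment-or-full-page pattern Htmx depends on.
    # The Photo model and template paths are hypothetical placeholders.
    from django.shortcuts import render
    from .models import Photo  # hypothetical model

    def photo_list(request):
        context = {"photos": Photo.objects.order_by("-created")[:20]}
        # Htmx attaches "HX-Request: true" to every request it makes,
        # letting the server choose between a partial and a full page.
        if request.headers.get("HX-Request"):
            return render(request, "partials/photo_list.html", context)
        return render(request, "photos/index.html", context)

The client side is just an hx-get and an hx-target on the element you want swapped. The part that actually needs explaining, designing the partial and the full page as a matched pair, is exactly what gets walled off as beyond the scope.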


Alright, I thought after this, I’ll give it one more shot. Never let it be said that I’m not stubborn. The back end of this newest version of Pixeme is going to use Django, mostly because I’m tired of having to build out or link together all the parts of a server-side framework that FastAPI doesn’t include: logins, upload handling, and so on. I still want to use Python, because that’s become the language I’m most productive in, but I want something with batteries included.
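To give one concrete example of those batteries: the login flow I’d have to assemble by hand in FastAPI is a single include in Django. A minimal sketch of a urls.py, using nothing beyond what ships in the box:

    # urls.py sketch: Django bundles the entire auth flow.
    from django.urls import include, path

    urlpatterns = [
        # Provides login, logout, and password-change/reset views
        # without any third-party packages.
        path("accounts/", include("django.contrib.auth.urls")),
    ]

Add the templates and you’re done. No token plumbing, no bolted-on session handling.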

The official documentation for Django is an excellent reference, but it’s just that: a reference. There’s a tutorial, but it ends very quickly and offers almost no insight into, say, best practices. That, for me, is the key strength of a tech book: it has the space and the "weight" to explain the whys as well as the hows. So I went looking for a recent book on the topic, and I ended up with Ultimate Django for Web App Development Using Python. A bit of a mouthful, but it’s so new that it even uses the "on X, formerly Twitter" phrasing the mainstream media has adopted. (Seriously, nobody in the real world calls it X, just like nobody refers to the Google corporate entity as Alphabet.)

In this case, the book is somewhat informative, and it functions a lot like an expanded version of the official Django tutorial. If you’re new to the framework, then it’s probably not a bad guide to getting started. From something with "ultimate" in the title, I just expected…more. Outside of the tutorial bits, there’s not much there. The book has a brief overview of setting up a Docker container, but Docker deserves to be wiped off the face of the earth, so that’s not much help. And the last chapter introduced Django Ninja, a sort of FastAPI clone that would be incredible if not for the fact that its developers support child trafficking and religious persecution.
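For what it’s worth, the FastAPI resemblance is real: decorated view functions, type-annotated parameters, automatic validation and docs. A minimal sketch, with a toy route and schema of my own invention, not the book’s:

    # Minimal Django Ninja sketch; the route and schema are illustrative only.
    from ninja import NinjaAPI, Schema

    api = NinjaAPI()

    class ItemOut(Schema):
        id: int
        name: str

    @api.get("/items/{item_id}", response=ItemOut)
    def get_item(request, item_id: int):
        # The path parameter is parsed and validated from the annotation,
        # just as FastAPI would do it.
        return {"id": item_id, "name": "example"}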

Beyond that, the text of the book is littered with typos and grammatical errors. Most of them have the telltale look of an ESL author or editor, which is depressingly common in tech references of all kinds nowadays. Some parts are almost unreadable, and I made sure to look over any code samples I wanted to use very carefully. It’s like dealing with ChatGPT, except here I know there was a real human involved at some point, someone who looked at the text and said, "Yeah, this is right." That’s even worse.


Three strikes, and I’m out. Maybe I’m just unlucky, or maybe these three books are representative of modern tech literature. If it’s the latter, that only reinforces my earlier point: today’s society rewards mediocrity and punishes intelligence, even in fields where intelligence is paramount.

Especially in programming, where there is no room for alternate interpretations, the culture of "good enough" is not only depressing but actively harmful. We laugh wryly at the AAA video game with a 200 GB install size and a 50 GB patch on release day, but past experience shows that it doesn’t have to be that way. We can have smart developers. As with any evolutionary endeavor, we have to select for them. Intelligence needs to be rewarded at all stages of life. Otherwise, we’ll be stuck where we are now: with ESL-level books that recapitulate tutorials, screeds disguised as reference texts, and a thousand dead or paywalled links that have nothing of value.

As a case in point, I was looking just yesterday for information about code generation from abstract syntax trees. This is a fundamental part of compiler design, something every programming language has to deal with at some point. Finding a good, helpful resource should be easy, right?

Searching the web netted me a few link farms and a half-finished tutorial using Lisp. Looking for books wasn’t much better, because the only decent reference is still the Dragon Book, published in 1986! Yes, the state of the art has certainly advanced in the past 38 years, but…good luck finding out how.
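Since I couldn’t find a decent modern writeup, here’s the core idea in miniature, as I understand it: walk the tree bottom-up and let each kind of node emit its own output. A toy sketch in Python, compiling arithmetic expressions into instructions for a made-up stack machine; everything here is invented for illustration.

    # Toy code generation from an AST: compile arithmetic expressions
    # into instructions for a simple stack machine.
    from dataclasses import dataclass

    @dataclass
    class Num:
        value: int

    @dataclass
    class BinOp:
        op: str      # one of "+", "-", "*", "/"
        left: object
        right: object

    OPCODES = {"+": "ADD", "-": "SUB", "*": "MUL", "/": "DIV"}

    def codegen(node, out):
        # Post-order walk: emit code for the children, then the operator.
        if isinstance(node, Num):
            out.append(f"PUSH {node.value}")
        elif isinstance(node, BinOp):
            codegen(node.left, out)
            codegen(node.right, out)
            out.append(OPCODES[node.op])
        else:
            raise TypeError(f"unknown node: {node!r}")
        return out

    # (2 + 3) * 4  ->  PUSH 2, PUSH 3, ADD, PUSH 4, MUL
    tree = BinOp("*", BinOp("+", Num(2), Num(3)), Num(4))
    print("\n".join(codegen(tree, [])))

Real compilers layer instruction selection, register allocation, and optimization on top of this, but the recursive walk is the skeleton underneath all of them, and it’s the part nobody seems to write about anymore.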

That’s what needs to change. It isn’t only about access to information, or that the information isn’t being written down. It’s a confluence of factors, all happening at once, and it’s making us dumber as a people. Worst of all is that we accept it. Whether you consider it the "price of democracy" or simply decide that there’s nothing you can do about it, accepting this rot has only let it fester.

Lands of the lost

Recently, I finished reading Fingerprints of the Gods. I picked it up because I found the premise interesting, and because the mainstream media made such a big deal about author Graham Hancock getting a Netflix miniseries to showcase his unorthodox theories. I went into the book hoping there would be something tangible about those theories. Unfortunately, there isn’t.

Time of ice

The basic outline of the book is this: What if an advanced civilization existed before all known historical ones, and imparted some of its wisdom to those later civilizations as a way of outliving its own demise?

Put like that, it’s an intriguing proposition, one that has cropped up in many places over the past three decades. The Stargate franchise—one of my favorites, I must admit—is based largely on Hancock’s ideas, along with those of noted crackpots like Erich von Daniken. Chrono Trigger, widely regarded as one of the greatest video games of all time, uses the concept as a major plot point. Plenty of novels, especially in fantasy genres, suppose an ancient "builder" race or culture whose fingerprints are left within the world in some fashion.

It was this last point that piqued my interest, because my Otherworld series revolves around exactly this. I even unknowingly used some of Hancock’s hypotheses for it. The timing of my ancients leaving Earth for their second world matches that of his ancients’ final collapse. Archaeoastronomy as a path to the ancients’ knowledge shows up in my books as well. Even using the prehistoric Mesoamericans as the catalyst wasn’t an original idea of mine; in my case, however, I did it so I wouldn’t have to deal with the logistics of the characters traveling to another continent.

Some of the questions Hancock asks are ones that need to be asked. It’s clear that ancient historical cultures the world over have some common themes which arise in their earliest mythology. Note, though, that these aren’t the specific ones he lists. The flood of Noah and Gilgamesh is entirely different from those of cultures beyond the Fertile Crescent and Asia Minor, for example, because it most likely stems from oral traditions of the breaking of the Bosporus, which led to a massive expansion of the Black Sea. Celts, to take one instance, would instead have a flood myth pointing to the flooding of what is now the Dogger Bank; peoples of New Guinea might have one relating to the inundation of the Sunda region; American Indian myths may have preserved echoes of the flooding of Beringia; and so on.

While the details Hancock tries to use don’t always work, the broad strokes of his supposition have merit. There are definitely astronomical alignments in many prehistoric structures, and some of them are downright anachronistic. Too many indigenous American cultures have myths about people who most definitely are not Amerind. (And now I’m wondering if Kennewick Man was a half-breed. I may need to incorporate that into a book…)

The possibility can’t yet be ruled out that cultures with technology more advanced than their direct successors did exist in the past. We know that Dark Ages happen, after all. We have historical records of two in the West (the familiar medieval Dark Age beginning c. 500 AD and the Greek Dark Age that started c. 1200 BC), and we’re very likely on the threshold of what might one day be termed the Progressive Dark Age.

With the cataclysmic end of the Ice Age and the catastrophic Younger Dryas cold snap, which now seems likely to have been caused by at least one asteroid impact, there’s a very good impetus for the "breaking the chain" effect that leads to a Dark Age, one that would erase most traces of such an advanced civilization.

Habeas corpus

Of course, the biggest problem with such a theory is the lack of evidence. Even worse, Hancock, like most unorthodox scholars, argues from an "absence of evidence is not evidence of absence" line of thought. Which is fine, but it’s not science. Science is about making testable and falsifiable predictions about the world. It’s not simply throwing out what-ifs and demanding that the establishment debunk them.

The onus is on those who propose alternative theories, and this is where Hancock fails miserably. Rarely in the book does he offer any hard evidence in favor of his conjecture. Instead, he most often uses the "beyond the scope of this book" cop-out (to give him credit, that does make him exactly like any orthodox academic) or takes a disputed data point as proof that, since the establishment can’t explain it, he must be right. It’s traditional crackpottery, and that’s unfortunate. I would’ve liked a better accounting of the actual evidence.

Probably the most disturbing aspect of the book is the author’s insistence on taking myths at face value. We know that mythology is absolutely false—the Greek gods don’t exist, for example—but that it can often hide clues to historical facts.

To me, one of the most interesting examples of this is also one of the most recent: the finding in 2020 of evidence pointing to an impact or airburst event near the shore of the Dead Sea sometime around 1600 BC. The event apparently not only destroyed a town so violently that it vaporized human flesh, but also scattered salt from the sea over such a wide region that it literally salted the earth. And the only reference, oral or written, to this disaster is as a metaphor, in the Jewish fable of Sodom and Gomorrah.

Myths, then, can be useful to historians and archaeologists, but they’re certainly not a primary source. The nameless town on the shore of the Dead Sea wasn’t wiped out by a capricious deity’s skewed sense of justice, but by a natural, if rare, disaster. Similarly, references in Egyptian texts to gods who ruled as kings don’t literally mean that those gods existed. Because they didn’t.

In the same vein, Hancock focuses too much on numerological coincidences, assuming that they must have some deeper meaning. But the simple fact is that many cultures could independently hit upon the idea of dividing the sky into 360 degrees. It’s a highly composite number, after all, and close enough to the number of days in the year that it wouldn’t be a huge leap. That the timeworn faces of the Giza pyramids currently stand in certain geometric ratios doesn’t mean that they always did, or that they were intended to, much less that they were intended as a message from ten thousand years ago.
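(To put numbers on "highly composite": 360 has more divisors than any smaller positive integer, and it’s divisible by everything from 1 through 10 except 7. The arithmetic is simple enough:

    360 = 2^3 \cdot 3^2 \cdot 5
    d(360) = (3+1)(2+1)(1+1) = 24

No lost civilization required; just people who liked round fractions.)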

Again, the burden of proof falls on the one making the more outlandish claims. Most importantly, if there did exist an ancient civilization with enough scientific and technological advancement to pose as gods around the world, there should be evidence of their existence. Direct, physical evidence. An industrial civilization puts out tons upon tons of waste. It requires natural resources, including some that are finite. The more people who supposedly lived in this Quaternary Atlantis, the more likely it is that we would have stumbled upon their remains by now.

Even more than this, the scope of Hancock’s conjecture is absurdly small. He draws most of his conclusions from three data points: Egypt, Peru, and Central America. Really, that’s more like two and a half, because there were prehistoric connections between the two halves of the Americas—potatoes and corn had to travel somehow. Rarely does he point to India, where Dravidians mangled the myths of the Yamnaya into the Vedas. China, with its own ancient literate tradition, is almost never mentioned. Did the ancients just not bother with them? What about Stonehenge, which is at least as impressive, in terms of the necessary engineering, as the Pyramids?

Conclusion

I liked the book. Don’t get me wrong on that. It had some thought-provoking moments, and it makes for good novel fodder. I’ll definitely have to make a mention of Viracocha in an Otherworld story at some point.

As a scientific endeavor, or even as an introduction to an unorthodox theory, it’s almost useless. There are too many questions, too few answers, and too much moralizing. There’s also a strain of romanticism, which is common to a lot of people who study archaeological findings but aren’t themselves archaeologists. At many points, Hancock denigrates modern society while upholding his supposed lost civilization as a Golden Age of humanity. You know, exactly like Plato and Francis Bacon did.

That said, it’s worth a read if only to see what not to do. In a time when real science is under attack, and pseudoscience is given preferential treatment in society, government, and media, it’s important to know that asking questions is the first step. Finding evidence to support your assertions is the next, and it’s necessary if you want to advance our collective knowledge.