With apologies to Monty Python, I’ve been thinking a lot about the rules that have shaped human behavior through the centuries, and world religions play an outsized role. I’m not challenging anyone’s belief system. I’m simply interested in how the rules we allow to shape our beliefs and actions function in the same way as the algorithms used in agentic AI. From the Code of Hammurabi through Greek and Roman mythology, we see history unfolding alongside religious practice. If we trace Asian religions and the three Abrahamic traditions forward to today’s practices, we find similarities in the search for divine truth. Fear of retribution is a common theme, as is some form of the Golden Rule.
In the age of agentic AI, I find it instructive to think about religious rules as algorithms. It not only helps deconstruct some of the “facts” we regard as historical truth but also serves as a cautionary tale for those of us navigating a world increasingly shaped by chatbots ready to “help” by pointing to whatever truth they are directed toward. We are in danger of erecting virtual cathedrals to our own prejudices. Case in point: many rational humans have fallen in love with their virtual companions.
Here’s my personal perspective. As a child raised in a fundamentalist Christian denomination, I wasn’t aware of how much of the “Biblical truth” I was taught was essentially an algorithm coded and enforced by fallible humans. I don’t claim malign intent. I claim that people didn’t see the algorithms for what they were: a way to impose a strict moral code on a community of believers. Joan of Arc, Galileo, and some unfortunate Salem Puritans all ventured outside the prescribed algorithms of their communities and paid the price. Indeed, nobody expects the Spanish Inquisition.
And of course, it’s more than religious practice. Back when algorithms were strictly tools for solving math problems, we still allowed these simple, problem-solving devices to permeate our behavior and our culture. Think about it. The Farmer’s Almanac our great-grandfathers relied on told them when to plant and how to think about the seasonal cycles that shaped their world. The American history book we used in middle school, which we never thought to question, shaped our world view. Even the meatloaf recipe handed down for generations is a default algorithm that helps bring order to our experience. Every algorithm assumes a bias. Hence the polarization we see within religious communities.
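To make the point concrete, here is a toy sketch of my own (not drawn from any real almanac): even the simplest rule-based “algorithm” hard-codes assumptions about the world it was written for.

```python
def planting_advice(last_frost_day_of_year: int, today: int) -> str:
    """An almanac-style rule of thumb: don't plant before the last frost.

    The bias is baked in: the rule silently assumes a temperate,
    Northern Hemisphere climate with one growing season per year.
    """
    if today < last_frost_day_of_year:
        return "wait"   # frost risk: the rule errs on the side of caution
    return "plant"

# Anyone outside the rule's assumed world (a greenhouse grower, a
# Southern Hemisphere farmer) gets advice that quietly misleads.
print(planting_advice(last_frost_day_of_year=135, today=120))  # -> wait
print(planting_advice(last_frost_day_of_year=135, today=150))  # -> plant
```

The rule works perfectly for the community it was written for, and that is exactly the trap: its assumptions are invisible until someone outside them gets burned.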
Today, in the English language alone, there are over one hundred distinct Christian Bible translations or major revisions currently in circulation. If you search Amazon for books on “Bible Study,” that algorithm lists over 70,000 results. No wonder modern Christians who seek the path of the righteous are increasingly at odds with each other; who has the right algorithm?
Here’s the lesson we can carry forward into our AI-influenced world. Demand transparency. Stay aware. Increasingly, algorithms written by someone else (most likely a self-learning agentic AI) generate invisible mazes that direct our lives in seemingly benign ways. These could affect how we shop, how we eat, and how we vote, even whether we are allowed to vote. They have the potential to do great harm by biasing the system against outliers. They could also serve as a launching pad for innovations that extend life, address climate change, and end global hunger. These realities live side by side.
Only we can determine how that unfolds. Algorithms, whether they come from mathematical equations or an innate desire to find order in the cosmos, will always be with us. Human oversight, a community demand for transparency, and individual vigilance are in order.
