Expanding the Scope of Rationality Turns it Into Religion
Consistently aligning my map with the territory led to my adopting a bunch of practices that look a lot like the intersection of most major religious faiths
I have come to believe that just trying to consistently believe what is true, aka rationality, aka 'the Scout Mindset', is so difficult that taking the endeavor seriously has brought a prior project of mine to its conclusion: trying to see religions as maps of normative territory, and attempting to integrate them to see what the true norms are.
My conclusion is that simply trying to pursue the truth likely provides a sufficient sense of what 'good' is to generate the entire territory being mapped out by the major religions.
I'm not telling you how to be a rationalist, or what rationalism means, or how you should live your life. I'm simply sharing my own experience. My experience is that the more I attempt to see what is actually true, instead of what I think is true, the more I find myself acting in ways that are motivated by a number of different readings from religious or spiritual authors, like this one:
When a prophet speaketh in the name of the LORD, if the thing follow not, nor come to pass, that is the thing which the LORD hath not spoken, but the prophet hath spoken it presumptuously: thou shalt not be afraid of him.
Deuteronomy 18:22, with modern rationality at its core.
I understand that this is obviously a deeply controversial thing to say.
My goal is not to emotionally tweak you, or verbally steamroll you into adopting my worldview as your own. I'm just a guy trying to share a perspective that I found both surprising and helpful. Hopefully it'll make more sense, and you'll be encouraged not to throw this essay away in disgust, if I point out that many religions convince people to adopt unverifiable beliefs as a kind of instrumental rationality.
I did sprinkle some pictures in here, but this is a long read for obvious reasons. No apologies there; the readers who like my Substack the most seem to enjoy the longer, more thorough arguments. If you've made it this far despite the scarcity of pictures and the clearly abstract nature of the subject at hand, congratulations! You're in the unique company of the few people who'd laugh at my comments at the Facebook Q&A with 'Zuck', or the people at Google who'd stand around when I'd go on a soliloquy at the end of a beer-and-snacks social hour, or the kind of person who understands when they are being flattered and yet doesn't immediately lose trust because hey, we're all human, everyone wants a pat on the back now and then, and the author flattering you as being part of the high-status elite group that is his audience is only evidence that this guy isn't totally nuts, and instead might be somewhat well read and thoughtful, especially given his self-aware ribbing at the obviously shameful thing he's doing by telling you how smart you are for reading.
A Synthesis of Many Seemingly Contradictory Worldviews
I have spent a number of years engaged in an epistemic experiment.
The hypothesis I was exploring was that there is, indeed, moral truth, but that it is computationally very complex and subtle; that moral truth is the same thing as making choices with the best long-term expected value; and that there really is a true value system, which our various evolutionary drives approximate.
This hypothesis leads to a prediction: past attempts to articulate moral truth would all differ from each other in many ways, and yet they would have some underlying unity. So I explored as many different religious and spiritual frameworks as I could, working under the hypothesis that they were approximations of some underlying truth.
I wanted to do the experiment with an open mind because if there is truth to the hypothesis that there is a true value system, then, by definition, that is the most valuable truth there is.
Along the way, I adopted various hypotheses that, through the experimentation of living my life by them, seemed to work well. Prior to the last few weeks, my best guess at the true value system was 'entropy maximization'. That hypothesis was at the top for a while, after knocking out 'the unspecified nature of the true value system is important so that you don't mistake it for an approximation.' But what has recently come to rule the roost, the meme that has come out on top of the competition for space in the top-level control stack in my brain, the thing which analyzes the present moment and says, adjust attitude here, adjust energy here, adjust posture here, change focus there - the thing which is living there right now says: love the truth, search for the truth, look at it fearlessly and submit to whatever you find.
Not only has this approach worked for me in the lab of my own life, i.e. as a child of dying parents, a father of three young children, a brother and a friend to people in need, a man who always wants to do more than he can, who suffers from stress and anxiety and fear but can’t help but want to do more good even though he’s not sure what exactly that means - it seems to also unify the different maps I was exploring.
To my great surprise, the unity in all these different moral maps, and in the rationality community, seems to be, “try to avoid rejecting what is actually true in favor of your own beliefs.”
I now believe that philosophy - the love of the Truth - is sufficient to generate so much of what so many different religions consider to be ethical, moral behavior, that I am now confident enough to believe that fidelity to the Truth really is the moral territory.1
I set out to synthesize the views of different Darshans of Hinduism, various schools of Buddhism (Hinayana, Mahayana, Theravada, Tantric), Sikhism, various Abrahamic faiths such as Judaism, Sufi and Shia Islam, the Baháʼí Faith, and strains of Christianity including Calvinism, Roman Catholicism, Eastern Orthodox Christianity, and Mormonism, as well as authors and speakers like Alan Watts, the Indian mystic Sadhguru, Ram Dass, Terence McKenna, Rob Burbea, Jordan Peterson, and various Christian, Jewish and Muslim authors and speakers.
The end result of studying and trying to understand the commonalities and differences between these perspectives, plus my own attempts at meditation, plus my navigation of the world as a human being acting upon various hypotheses and ideas and norms, has led me to a place where I believe that simply loving what is true, on a consistent basis, more than what I already believe to be true, is really, really, really hard.
I have come to believe that consistently striving to believe what is true, and being open to the possibility that my beliefs are wrong, is sufficient to satisfy most of the exhortations of these approaches, to the extent that they don’t directly contradict each other. Experimentally, I have found that adopting a mishmash of ‘best practices’ from all these different faiths seems to be helping me in that endeavor. It isn’t just religions that are unified by a love of Truth - I think science comes out of this as well.
At the Core: Openness, Determination, and Measured Self Trust
I am coming to believe that the scientific method, as well as successful adherence to the tenets of many faiths and practices, depends upon adopting a specific kind of attitude towards the present, which is the same attitude I try to adopt during meditation: a mixture of openness to the truth and determination to get at it, combined with trust in myself and awareness that I might very well be wrong.
These attitudes all seemed deeply contradictory, until I gradually came to see them as something like the ends of a tradeoff - or like a structure that stands up because it leans on itself. Acknowledgement that I might be wrong motivates openness. Trust in myself strengthens determination to pursue the truth, and openness to ideas that might be subtly wrong or even harmful if I adopt them. Trust in the world to be consistent, legible, and good motivates openness. Acknowledgement that I might be wrong, and that this will likely hurt or limit me, motivates determination to be less wrong.
It is very difficult to simultaneously acknowledge that I might be wrong, while still believing that I am right and lots of other people are wrong. Yet the more I practice this, the more I feel a sense of faith, both in the world to be coherent, understandable, and constrained, and in myself to safely explore that world in ways that don’t violate my intuitive sense of the kind of man I want to be.
I am starting to think that the Protestant Reformation and the Scientific Revolution shared something in common: a rejection of the teachings of authorities in favor of direct experience with reality, and the willingness to trust yourself to pursue the truth.
The Contradiction Between Top-Down and Bottom-Up
There is an obvious contradiction at play here.
Many religions explicitly say, “trust these specific authorities.” I resolve the contradiction by leaning on the original hypothesis: if there is a true value system2 which our evolutionary drives and desires approximate, then religions that approximate the true value system better should outperform those which do not.
I have come to believe that these religious ideologies work, i.e. they propagate through minds, across time, over and over, because they promote alignment with the truth, in their hosts, better than whatever else is running around in the cultural microbiome.
I think predictive processing explains this well: it's very easy to fall into a trap where your beliefs about reality overwrite what you are experiencing in the present moment. From a predictive processing perspective, top-down thinking is so much easier for me to fall into than bottom-up attentiveness to sensory information. I am now beginning to think that religions that succeed (i.e. survive and propagate) do so because they successfully convince people to consistently adhere to the truth.
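For readers who want the predictive-processing framing made concrete, here is a minimal, purely illustrative sketch (the function and the numbers are my own toy construction, not anyone's actual model of the brain): the weight assigned to the top-down prior versus the bottom-up signal decides which one gets to move your belief.

```python
# A toy precision-weighted belief update. All numbers are invented for
# illustration; this is not a claim about how cognition literally works.
def update(prior_belief, sensory_signal, prior_precision, signal_precision):
    # The gain is the share of weight given to the bottom-up signal.
    gain = signal_precision / (prior_precision + signal_precision)
    return prior_belief + gain * (sensory_signal - prior_belief)

# A heavily weighted prior barely budges toward what the senses report...
print(update(10.0, 2.0, prior_precision=9.0, signal_precision=1.0))  # 9.2
# ...while a lightly weighted prior lets the sensory signal dominate.
print(update(10.0, 2.0, prior_precision=1.0, signal_precision=9.0))  # 2.8
```

The 'trap' above is, in this toy picture, just a prior whose weight is set so high that no amount of present-moment signal can move it.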
Successful religions seem to adopt two basic strategies: hard-coded high-level priors, or conscious rejection of priors altogether.
Hard-Coded Priors
One strategy is to convince people to adopt specific high-level priors with probability 1. These strategies work to the extent that the priors are reasonably accurate guides to consistently satisfying human drives. If objective value is real, we could say that these strategies work by approximating the true value system well enough that people consistently following the strategies outperform people who do not. Abrahamic faiths, as well as Hinduism and its offshoots, seem to rely on this strategy.
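To make the 'probability 1' language concrete, here is a minimal sketch of Bayes' rule (the numbers are invented for illustration): a prior that is merely strong can still be argued down by evidence, but a prior pinned at exactly 1 is mathematically immune to any evidence whatsoever.

```python
# A minimal Bayes' rule update. The inputs are hypothetical numbers chosen
# only to show the qualitative point about probability-1 priors.
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    numerator = p_evidence_if_true * prior
    return numerator / (numerator + p_evidence_if_false * (1 - prior))

# A strong-but-ordinary prior still moves when the evidence cuts against it:
print(bayes_update(0.9, 0.1, 0.8))   # ~0.53
# A prior hard-coded at 1.0 stays at 1.0 forever; the (1 - prior) term is zero:
print(bayes_update(1.0, 0.1, 0.8))   # 1.0
```

That immunity is exactly what makes the strategy durable, and it is also what sets up the failure modes described next.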
These approaches tend to fail in two ways. One, the environment changes so much that the hard-coded priors are no longer valid. Two, linguistic slippage plus cultural evolution mean that the old priors are no longer legible: a future culture's use of language and background assumptions make it almost impossible for someone to accurately reconstruct the original priors.
These failure modes seem to compound each other; successful ideologies change their environment in ways that likely invalidate the accuracy of their priors3. In a world that is full of brutality and violence, saying 'this place is made by an all-powerful, loving God' sounds so insane that it's provocative and makes people think, i.e. consciously examine their priors. I think the modern equivalent would be arguing that sometimes being judgmental is more loving than simply striving to be loving and accepting.
Conscious Rejection of Priors
Another, seemingly contradictory strategy is to encourage people to reject priors and lean heavily into trusting their own personal experiences. Strains of Buddhism and Idealism lean heavily here, as do, to some extent, Marxism and certain strains of postmodernism.
These approaches tend to fail because you can't escape having priors, but you can seal them off from your introspection, so that you aren't able to recognize when they are being violated so heavily that it would be wise to either heavily update your probabilities or else re-write your top-down prediction model.
Interestingly enough, the most complex and subtle faiths I've studied do both: they include some hard-coded, top-down priors, while encouraging active awareness and acceptance of the present moment.
I think these religions still ‘work’ (i.e., they compete for mindshare effectively over long periods of time) because the territory is massive, and persuading people to adopt false beliefs about things beyond their ability to control or test can be instrumental in helping them adopt more consistently accurate beliefs about things they can control and experience directly.
One way to convince people to always consider expected value when choosing, instead of just listening to the loudest voice in their heads, is to construct a story that stimulates primates in all their loudest drives by pretending that the truth itself is a person, who is eventually going to judge them in the distant future based upon how accurate their worldview was.
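As an aside, here is what 'consider expected value instead of the loudest voice' looks like as plain arithmetic; this is a toy sketch with invented options and payoffs, not a model of any real decision.

```python
# A toy expected-value comparison. The options, probabilities, and payoffs
# are all invented purely to illustrate the idea.
def expected_value(outcomes):
    return sum(probability * payoff for probability, payoff in outcomes)

options = {
    "loudest drive": [(0.9, 1.0), (0.1, -20.0)],  # feels great, occasionally ruinous
    "quiet habit":   [(1.0, 0.5)],                # modest, reliable payoff
}

for name, outcomes in options.items():
    print(name, expected_value(outcomes))
# loudest drive: -1.1, quiet habit: 0.5; the quiet option wins on average.
```

The judgment story, on this reading, is a device for getting primates to run something like this calculation by default.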
Personally, I find it far more effective to pretend that the truth is a person, who loves me and wants me to approach them, but above that wants me to be free, because they wouldn’t control me any more than I would control my kids, if I could. I love my children, and therefore I want them to freely explore the world, hoping that they pursue a path of truth and therefore goodness.
Why Rationality Motivates Religion and Spirituality
I'm guessing that what triggered the various explosions of religious faith around the world was deeply charismatic people who figured out the same reality: that total devotion to Truth is both extremely demanding and extremely rewarding. I think the decline of religions comes about, inevitably, as socially constructed maps of the territory diverge from the territory, due to both linguistic drift and changes in the territory over time. After a few generations of living in a culture that adheres to an increasingly inaccurate map, people begin to doubt that rationality is as demanding as it truly is.
The demanding normative realities of convergent instrumental subgoals are easily ignored. I think that rationality demands we be patient, loving, kind, humble, sincere, and honest, but these demands are contingent on acknowledging our own weaknesses, limitations, and failure modes.
We don't like to admit, or think about, the Truth of our own limitations. A religion that tells you to 'be humble and trust God who has everything under control' is, I think, more true than one that says, "your capacity to predict the future beyond short horizons is extremely limited, and mostly what it does is cause you anxiety and fear. The best you can do in any moment is to trust the world to be somewhat intelligible, markets to be somewhat efficient, and political frameworks to be somewhat equilibrium-seeking, and then do your best to be emotionally stable and focused on taking long-term value-added actions, while accepting the stochastic nature of life as a human being. Identify more with the long-term narrative propagating itself by means of your body than with your body itself, and you achieve the feeling of memetic immortality, and with it, peace and acceptance of death, both your own and that of those you love. How you identify is your choice, and if you choose to identify as an immortal concept approximated by DNA and its epigenome more so than the epigenome itself, who's to say you're wrong?"
You can stop right here and you've gotten maybe 90% of the value of the essay. What remains are a bunch of details for people who like that sort of thing.
How Specific Values are Motivated
We are mortal, flawed, and easily tempted. These are unpleasant truths we prefer to ignore or downplay. Ignoring our flawed, weak, limited epistemic capacities leads to the delusion that we, as individuals, are capable of consistently seeing reality accurately while focusing primarily on some goal of altering the world state.
I believe the reality is that our epistemic flaws and limitations so severely limit our ability to see reality as it is, that unless faithfulness to the truth is our topmost goal, far above anything else, we are guaranteed to fall into wishful thinking and confirmation bias.
I believe that the orthogonality thesis is clearly false for embodied agents, who, in order to maintain alignment with the truth, must practice a number of disciplines and instrumental values that end up looking a lot like the intersection of most major religions.
You Must Love the Truth Before All Else
You can’t have your map aligned with the territory if that isn’t really, really important to you.
The moment anything else becomes more important than believing only that which is true, and believing as much as possible of what is true, you are going to be tempted to adopt incorrect or questionable beliefs which are instrumental to advancing whatever goals are higher priority for you than adherence to the Truth. You’ll be tempted to reject truths that make you uncomfortable or squeamish or threaten your ability to pursue your goals.
We might rephrase the first commandment from
I am the LORD your God, who brought you out of the land of Egypt, out of the house of slavery. You shall have no other gods before Me.
Into:
Only True beliefs allow you to consistently accomplish seemingly impossible feats. If anything becomes more important to you than fidelity to the truth, you will eventually weaken your capacity to accomplish any of your goals. Therefore, whatever your goals are, adherence to the Truth must be paramount among them.
Don’t Lie.
It is very easy to lie to yourself. Confirmation bias is extremely difficult to escape. When you tell lies, knowingly, it’s very difficult to avoid the trap of beginning to believe them yourself.
“The first principle is that you must not fool yourself and you are the easiest person to fool.”
I have found that the way I communicate with others ends up being the way I communicate internally, i.e. the way I think. When I am patient, gentle, open minded, curious, specific, precise, and cautious with my words, my own patterns of thinking tend to take on these attributes.
Conversely, when I try to throw rhetorical bombs at the internet, I feel that happening inside my thinking patterns. It feels great right up until it gets me severely hurt.
Be Honest With Yourself and True to Your Values
When I act in ways that are counter to my long-term values, I introduce inconsistencies within my own map, and these inconsistencies bring it out of alignment with the territory.
I have drives and instincts as a human being. I cannot deny these ‘inscrutable exhortations of my soul.’ No, I don’t need to push these values onto other people. Experimentally, I’ve found that that approach doesn’t work. This means, practically speaking, I’ve stopped doing that.
So many of the things I do that traditional religions would say are wrong, I have found, experimentally, simply make me feel worse over the long term, even though they often convey short-term rewards.
I have found that it is difficult to act in ways that consistently line up with my long term values. I have also found that the practices suggested by multiple major religions work well at helping me act in ways that better align with how I want to be.
Be Disciplined. Manage Your Emotions and Guard Against Being Owned By Your Drives.
If I get too angry, I am likely to hyper-focus on a tiny number of facts, while tuning out and ignoring all others. Anger severely restricts the scope of my attention. Ignoring aspects of reality is forgoing map-territory alignment in most of the map, in exchange for a hyper-detailed, and likely incorrect picture of some tiny piece of the territory.
Intense, unchecked anger is a hindrance to consistently accurate beliefs.
Anger can be instrumental, so long as it is controlled, limited in scope, and subject to a broader commitment to the Truth.
When I feel anger, I ask myself, “what boundary do I feel an instinct to enforce here?” Sometimes, the boundary is a healthy one. Sometimes it seems likely that all parties will be better off if I remind them, “hey, there’s a line here, I’d prefer if you didn’t cross it.” Often, the other party also prefers that they not cross that same line, but they have gotten carried away by their own intense emotions. Helping someone else remember that they would rather not behave a certain way is bringing all of our maps more in alignment with the territory simply by reducing the internal contradictions in the maps of people around me.
If I get too greedy, too desirous of material gains, I am also likely to lie to myself about the possible consequences of my actions. "I'll just have a little bit" is something I often think before engaging in behavior that has a short-term payout but long-term negative consequences. This prediction is usually false when I make it, unless I've managed to cultivate a discipline and practice around that behavior.
Engaging in destructive behavior (what some religions call ‘vices’) almost always requires distorting my understanding of the long-term consequences of my actions.
Fidelity to the Truth about the likely consequences of my actions is very difficult for me when I am faced with significant short-term payouts. The strategy I am using is to think about the truth, a lot, multiple times a day, by giving it a name that helps me think of the truth as a Person to whom I am loyal and whom I do not want to betray.
Pursue and Cultivate Healthy Relationships.
I find that I am likely to imitate the people around me, and so it is wise for me to avoid spending too much time around people who are consistently engaging in behavior that distorts my cognitive map.
If I temporarily adopt goals more important to me than “fidelity to the Truth”, I become more likely to adopt false beliefs which are instrumental to me reliably satisfying whatever it is that is more important to me, in that moment, than believing what is true.
The more I foster lots of healthy relationships, the more I am able to receive signal about the world from other people's perspectives. I can learn more about the world more broadly by integrating the different perspectives of people with different experiences. Even if I don’t share another person’s perspective, the fact that they have it is also a part of the territory.
Be Kind and Open Minded To Maximize Signal Flow.
If I am too mean or harsh to other people, they won't open up with me, and I lose access to those conduits of information. If I am too convinced that my own world view is correct and others are wrong, I won't be able to integrate their worldviews with mine.
When I judge and criticize other people, they are less likely to share their perspectives with me. Other people's beliefs, even if they are inaccurate, are still part of the territory. If I judge and criticize others habitually, at minimum the areas of my map that represent these people will consistently be wrong. If I can successfully relate to all kinds of people, they will share their perspectives, their struggles, and their values with me, and thus my map of the world will be all the richer and more accurate.
If I value accuracy and breadth of my beliefs, then I must be kind and open minded, to increase my ability to understand more of the world by opening up more avenues of information flow from other people to my worldview.
Practice Empathy and Solidarity
When other human beings prosper, my capacity to understand reality grows, because their ability to deliver valuable signals to me increases.
Wouldn’t you like more scientists, more explorers, more creators? I know I would. My experience tells me that merely having more functional, healthy adults in the world will help you better understand the world.
The perspectives that have helped me learn and grow the most are those from people who are psychologically balanced, stable, and healthy. Good examples are hard to find, but extremely valuable when I find them. When people are struggling or afraid of adverse consequences, it is often harder for them to pursue the truth. When governments are dictatorships, and people aren’t free to say what they please, they are likely to alter their thinking patterns to protect themselves.
Thus, a commitment to my own rationality motivates a desire to see all human beings flourishing materially, and free to pursue whatever beliefs or ideas they want.
Injustice anywhere is a threat to rationality everywhere.
Of course, I can't do much about most of these situations. But I can, at least, recognize that merely wanting to see the world accurately means I should value worldstates that consist primarily of thriving, healthy adults freely pursuing their own values over worldstates where many adults are afraid to speak their minds.
In a word, this means valuing liberalism. Instrumental rationality compels me to believe that I should take actions which promote the spread and flourishing of liberal values4.
Be Bold, Try New Things. Don’t be Afraid of Failure.
Confirmation bias is really, really hard to avoid.
The more time I spend in one environment, the more likely it is that my beliefs will become an optimized map of how to succeed in that one environment. One approach that helps me is to put in a ton of effort to remember and repeat that:
The territory around me is not all there is. There is more to life than just what I experience in my immediate vicinity. There is a truth which is far larger than my own limited experiences. The more I seek out its reality and accept whatever I find, the more willingly I look for errors in my own worldview, the better things will go for me over long periods of time and across higher levels of environmental variance.
Even better than this kind of prayer is failure.
Going out, trying things, failing, and getting back up is a great way to break confirmation bias.
Epistemic rationality ends up suggesting that it might be a good idea to have a bias for action, for trying new things, for creativity.
Writing articles, sharing my thoughts with the world, and seeing how the world pushes back and responds has been great at helping me develop and grow my own beliefs.
Just hiring someone else to work for me, once, helped me much better understand what it’s like to be ‘on the other side’ of the table. Starting businesses that failed was phenomenal at helping me better understand the economy, investing, and the perspectives of people ‘higher up’ in the business world, who would otherwise remain more of a mystery.
Getting laid off, being asked to leave, having partners break up with me, having co-founders lose faith in me, even losing valuable friendships have all hurt. These painful experiences have also been instrumental in my personal growth. All of these scenarios have helped me better understand where the edges are, and, ultimately, helped me place more value on understanding other people’s perspectives, and on avoiding pressuring others to adopt my own beliefs.
Seeing Reality as Good Guards Against Self Deception
Why is the truth what it is? Wouldn't it be better if the truth were different, so that, say, kids didn't die of awful diseases and people didn't suffer for no reason? Couldn't a radically different world be so much better?
There is a truth here, but it’s like a chainsaw: a powerful tool if used well, but potentially a source of great injury.
These thoughts risk being the beginning of the path of self-deception, because they involve a belief that my own imaginary constructions are superior to the Truth. That path soon leads to my own beliefs taking precedence over a willingness to consider that I might be wrong. If I don't trust the truth to be fundamentally good, I might be tempted not to look at some truths. If I think the world is a scary place, why should I believe that knowing certain ideas wouldn't hurt me? The only way I can square always wanting to pursue the truth is with a prior that says, effectively, Goodness and Truth are inexorably linked.
Another danger in imagining that some hypothetical situation is better than this one is the danger of excessive confidence in the accuracy of my understanding of the world.
In order to prevent myself from imagining that ideas which originate in my head are superior to the Truth, and thus ignoring aspects of the truth I don’t like, I have assigned an extremely high probability to the belief that there is likely some order or meaning of which I am unaware, and that often things which seem awful from my perspective are likely instrumental to good things that I don’t yet comprehend or realize.
I have already encountered plenty of evidence that this is true in my own life, and that the labels that say 'this is good, this is bad' are not supplied to me by empirical reality, but by my own internally cultivated moral system.
Simply pinning “truth = goodness” with probability 1 ends up stabilizing instrumental and epistemic rationality as the highest goals, and thus is as useful as assuming with probability 1 that there is a world outside my head that follows consistent, discoverable rules. The alternative is to spend huge amounts of time exploring unverifiable hypotheses that ultimately make me miserable.
Change What You Can, Accept What You Cannot.
Of course, sometimes ideas that originate in my head really are better than reality at present. When I write something that people like, I have altered reality by substituting some tiny piece of it with an idea from my head. When I make my family eggs for breakfast, I am altering physical reality according to a plan from my brain.
I think this is still improving map-territory alignment, in this case by altering some tiny piece of the territory to better reflect my map. This is, effectively, what a blueprint is: a map of some not-yet-real territory that becomes accurate by means of people altering the territory.
Sometimes, alternate worlds really are better than the present. Rationality warns us against taking steps that are too large in the direction of our imagination, because our imagined futures are not real. The further out we imagine changes, the less likely our imaginations are to line up with reality.
Rationality then suggests that when we set out to change the world, we should do so gradually, humbly, cautiously, and with an open mind to the possibility that we might be wrong. It suggests learning from history, to better understand how prior attempts to make similar changes have gone poorly. It suggests moving gradually and slowly, given the chaotic nature of the world and our inability to see the future clearly.
Faith as a Strategy for Consistent Rationality
A strategy that I have found effective in helping me strive to keep my map aligned with the territory is to practice a kind of relentless loyalty and devotion to the truth.
My goal is to practice this loyalty to the truth with a level of emotional intensity that exceeds my desire for anything material. Imagining that the Truth is a person facilitates the generation of emotional signals more powerful than those of my drives for food, sex, money, or status.
I love this anthropomorphized representation of Truth more than I love my own family, because I believe this approach, of seeing the Truth as a lover, a protector, a parent, a guardian, a guide, a coach, a friend, and a mentor helps me reduce the chances that I will engage in acts of self-deception in order to satisfy some temporary drive or desire.
When things go poorly for me, I imagine that the Truth and I are engaged in a kind of kinky, painful sex. The pain I experience, I interpret as an act of love to which I consented by turning away from the Truth in some respect. Whack, pay attention, my lover tells me when I move to turn the blankets in my bed and dislocate my shoulder at 3 in the morning and I'm tired and just want sleep. "She does this because she loves me," I remind myself, "and I really do want to be more mindful about how I move my body."
When I hurt in ways that feel unfair or wrong, I ask myself, “What did I do to contribute to this happening? In what ways did false beliefs lead me to this experience?”
So long as I am honest with myself, I can always find some answer.
An intrepid reader may then ask, "ok, but what about entropy maximization? Do you still think of the physical universe as being an 'entropic investor'?" The answer is: yes, I do. I now think that consistently believing what is true, especially the truth of the present moment, is likely to be the most effective means for maximizing my contribution to global entropy.
One might also argue that the territory I'm describing is 'human nature', rather than simply 'the true value system'. It's a fine objection, and I explain later why I think this is partially, but not entirely, true.
I have a longer argument I want to make about this. The short version is that priors interact with the world by throwing out certain kinds of information as illegible. An embodied agent acting on specific priors, I think, likely has the net effect of falsifying those priors. Our priors tend to be geared towards that which helps us survive and propagate, which means we pay more attention to things we can use and manipulate than we do to things like our own waste products. These waste products tend to be illegible to us, because better understanding things we can't use doesn't really serve us. Our waste products grow in proportion with the success of our ideology, however. I think this mechanism in part explains the cycle described by 'The Fourth Turning', the Hindu notion of Kali Yuga, and the 'weak men → hard times → strong men → good times' meme, all of which capture essentially the same pattern.
I think liberalism is something like a ‘minimum viable religion’ but this is a topic for another post.
Hello, Mark! I found your Substack from a comment you made on Astral Codex Ten. I think about entropy all the time, apparently just like you, and hadn't heard of Jeremy England, so grateful for that introduction. I appreciate the ambition and audacity of your writing!
This essay reminds me a bit of Leah Libresco's path. Pressed to answer where she thought moral law came from in her metaphysics, she blurted out: "I guess Morality just loves me or something." "It turns out ... I believed that the Moral Law wasn’t just a Platonic truth, abstract and distant. It turns out I actually believed it was some kind of Person, as well as Truth."
Great essay, Mark. I agree with you. As you know, I went through a similar exercise in applying evolution to morality and concluded that I had been wrong when young to dismiss evolutionary ethics as immoral; in fact, properly understood, you end up at almost the same place as traditional morality. There always seems to be convergence.