The other day Scott Alexander wrote a piece focusing on a few of the various jhana states of meditation. This falls pretty firmly into the category of “stuff I don’t know much about”, but he linked to an explainer of the general states here:
As you reach each jhanic level, your mind will be tempted to remain at the previous jhanic state. Simply keep a balanced mind with no clinging to the pleasant or unpleasant and you will progress to the higher levels. The nine levels of jhana are:
Delightful Sensations
Joy
Contentment
Utter peacefulness
Infinity of space
Infinity of consciousness
No-thingness
Neither perception nor non-perception
Cessation
Scott’s article focuses mainly on the first two because he’s mostly commenting on the claimed experience of Nick Cammarata, who describes his experience with them like this:



These are, for better or worse, pretty big claims. Being more in control of pleasure-seeking behavior would be a game-changer all by itself for people who like getting stuff done. Something being 1000-10000% better than sex has an appeal all its own, or at least would for me if it didn’t sound potentially fatal; I’m not sure if my wiring can keep up with that.
If that weren’t enough to make you want what he’s got, he reveals that it also directly quasi-cured his drug use, helps him maintain a healthy diet, and rewired his brain to be more sensitive to caffeine so he can get by with less:
Or, to inflate the promised goods in the more carnal direction, someone else mentions that jhana states might make you spiritually jizz whilst touching blankets in Target:
To be very clear, this makes all my woo-alarms go off. In a purely instinctive/reactive way, I am strongly inclined against believing any of this. To be even more clear, I also can’t disprove any of this. I haven’t adhered to a months-to-years-long daily meditation schedule after reading up on how to do it correctly. I can’t peer inside their mental state and say that I know for sure that they are reporting inaccurate information.
I also don’t think that these claims are entirely implausible. Brains are weird and do weird stuff. If optical illusions can make me see colors that don’t exist or objectively misinterpret the color of a dress, that should make me more confident that my brain wiring can be tricked. If there’s a flower with seeds that can (processed correctly) make me immune to pain, that should make me less confident that my brain runs on pure, incontrovertible reality. Whether or not I feel that it’s likely what jhana practitioners claim is happening, I can’t (fairly) say “I’m sure it’s not”.
Like everything else that’s ever happened, this makes for weird social situations.
If you came up to me and asked “RC, should I dedicate a bunch of my life to meditation in pursuit of a mystical bliss state that also makes me quit drugs, eat better, and perhaps directly cause mop-ups in aisle six?” I’d probably tell you that you shouldn’t, at least to the extent you are motivated by that reason alone. And I clearly wouldn’t do so myself - it’s a big outlay of time and effort for a payoff I’m not confident in.
For several people who are reading this, I think that’s probably mildly insulting. They have relayed a personal experience, and I’m doubting it - how is that not an accusation of lying? And why would I disbelieve it when I believe other similar subjective statements regarding unfalsifiable experiences every day?
Scott frames the same problem like this:
I consider this to be a pretty decent point in terms of, say, telling someone “I doubt it!” when they say they are hungry. Why would you do that? And why would you go straight to assuming they aren’t? It’s fair to point out that people tell you about subjective internal experiences every day - that they are happy, sad, or have a headache - and that we mostly take it as true based on faith.
I think other parts of this comment do a little less well under close scrutiny. Scott points out that thousands of people claim to have reached jhanas (in what I assume he means are the ways described here), and that this should be strong evidence that the states exist. He also seems fairly confident that it’s implausible people merely inflated how they perceived or communicated a good mood; he doesn’t seem to think that could happen.
I don’t really want to do the part of this article that’s about how it’s reasonable to doubt people in some contexts. But to get to the part I want to talk about, I sort of have to.
There is a thriving community of people pretending to have a bunch of multiple personalities on TikTok. They are (they say) composed of many quirky little somebodies, complete with different fun backstories. They get millions of views talking about how great life is when lived as multiples, and yet almost everyone who encounters these videos in the wild goes “What the hell is this? Who pretends about this kind of stuff?”
There’s an internet community of people, mostly young women, who pretend to be sick. They call themselves Spoonies; it’s a name derived from the idea that physically and mentally well people have unlimited “spoons”, or mental/physical resources they use to deal with their day. Spoonies claim to have fewer spoons, but en masse they also have undiagnosable illnesses. They trade tips on how to force their doctors to give them diagnoses:
In a TikTok video, a woman with over 30,000 followers offers advice on how to lie to your doctor. “If you have learned to eat salt and follow internet instructions and buy compression socks and squeeze your thighs before you stand up to not faint…and you would faint without those things, go into that appointment and tell them you faint.” Translation: You know your body best. And if twisting the facts (like saying you faint when you don’t) will get you what you want (a diagnosis, meds), then go for it. One commenter added, “I tell docs I'm adopted. They'll order every test under the sun”—because adoption means there may be no family history to help with diagnoses.
And doctors note being able to sort of track when particular versions of illnesses get flavor-of-the-week status:
Over the pandemic, neurologists across the globe noticed a sharp uptick in teen girls with tics, according to a report in the Wall Street Journal. Many at one clinic in Chicago were exhibiting the same tic: uncontrollably blurting out the word “beans.” It turned out the teens were taking after a popular British TikToker with over 15 million followers. The neurologist who discovered the “beans” thread, Dr. Caroline Olvera at Rush University Medical Center, declined to speak with me—because of “the negativity that can come from the TikTok community,” according to a university spokesperson.
Almost no one who encounters them assumes they are actually sick.
Are there individuals in each of these communities that are “for real”? Probably, especially in the case of the Spoonies; undiagnosed or undiagnosable illnesses are a real thing. Are most of them legitimate? The answer seems to be a pretty clear “no”.
I’m not bringing them up to bully them; I suspect that there are profiteers and villains in both communities, but there’s also going to be a lot of people driven to it as a form of coping with something else, like how we used to regard cutting and similar forms of self-harm. And, you know, a spectrum of people in between those two poles, like you’d expect with nearly anything.
But it’s relevant to bring up because there seem to be far more Spoonies and DID TikTok-fad folks than people who say they orgasm looking at blankets because they did some hard thinking (or non-thinking) earlier. So when Scott says something that boils down to “this is credible, because a lot of people say they experience this”, I have to mention that there are groups that say they experience a lot of stuff in just the same way, groups that basically nobody believes are experiencing anything close to what they say they are.
A story: when I was a kid, a new series of games called Pokémon came out. To give you an idea of how into the games I got, I just typed the acute E in the title using alt-codes from memory.
For those who never dreamed of becoming a Pokémon master, the original two games worked like this: As the protagonist, you were trying to catch little cute animals so you could fight them against other cute animals until you became the animal-fighting champion. There were 150 generally accessible pokémon, provided you had money (since there were two versions of the game, and each game had an exclusive subset of creatures). These were the ones you could get pretty easily. There was also one you could get if you manipulated a glitch.
But then there was Mew, the one hundred and fifty-first non-glitch pokémon, and the only legitimate way to get him was to go to in-person Pokémon events. Since none but the most child-dominated of parents would take their kids to one of these events, people looked for other ways to make it happen.
Thus it was that I found my way onto an Angelfire-hosted site with a small message board, and found a guy who knew how to get Mew.
Between two cities in the game, he said, there was a particular small patch of grass, seemingly exactly like the other patches of grass in which you’d normally walk to flush out animal-fighting fodder. And in that patch of grass, and no other, there was a very small chance of encountering Mew, who could then be caught.
I tried to do this forever, and it didn’t work. I was just young enough to not really question lies of that sort. It was only years later that I thought back to the claim, realized it was a lie, and then contemplated how weird it was that someone would lie like that.
But it was only years after that when I considered that, as time went on, multiple people joined the forum and eventually came to claim that they too had caught Mew using this method that didn’t work. The lie, dumb as it was, had proved contagious.
“Why would I lie about this?” is a compelling argument, because it’s really hard to confidently point to why someone would lie (to themselves or to others) about something like intense meditation-joy. And when you can’t do this - when you go “I have no idea why they would lie about this” - it seems reasonable, up front, for someone to say “See? That should lend this credibility.”
But the why-would-they-lie argument doesn’t hold water; you can point to countless groups that have collectively conveyed false information. You can see the obvious falseness mixed into Spoonieism and DID TikTok fads; you can learn about dishonesty from Pokémon forums; you can reference the approximately one zillion people who are “a little psychic, sometimes” or who can see auras.
Most notably, you can notice that you don’t actually need to know why people lie to themselves or others (or both!) to note that they sometimes do.
A different Scott, one who let us hitchhike around the galaxy, (Author’s note: An actual different Scott, Scott Lawrence, has pointed out that I’m a dumbass who can’t tell the difference between the Dilbert guy and the Hitchhiker’s guy, albeit more gently than I said it. Fixed but left in so you don’t trust me as much) Douglas Adams once wrote this:
We know, however, that the mind is capable of understanding these matters in all their complexity and in all their simplicity. A ball flying through the air is responding to the force and direction with which it was thrown, the action of gravity, the friction of the air which it must expend its energy on overcoming, the turbulence of the air around its surface, and the rate and direction of the ball's spin.
And yet, someone who might have difficulty consciously trying to work out what 3 x 4 x 5 comes to would have no trouble in doing differential calculus and a whole host of related calculations so astoundingly fast that they can actually catch a flying ball.
People who call this "instinct" are merely giving the phenomenon a name, not explaining anything.
It is not a secret that people who trend towards rationalism (or tech, with which rationalism has significant overlap) are not, on average, considered to be exceptionally socially skilled. A placement on the autism spectrum or some other form of neurodivergence is considered to be the norm rather than the exception to the rule amongst them. I don’t think this is bad; if anything, it’s where the group’s value comes from in the first place.
But with that comes a group-wide expectation that things that can’t be quantified with math are thus default-unknowable. A statement like “I could tell he was lying” isn’t quite taboo nonsense there, but it carries much less weight than in other places. People are less able or less willing to point out that someone looks less credible for socially understood reasons than in other less-enlightened-more-practical contexts.
Sometimes this is nice, but at some extremes, it ends up being a lot like if someone looked at Douglas Adams’s description of a person catching a ball above, realized that they can’t really explain how that happens, and then concluded that ball-catching was not a skill that does or could exist.
At the extremes of those extremes, you see things like the jhana thing: where it’s something that seems unlikely to most but is unfalsifiable, and because of that unfalsifiability is then assumed to be true because it was claimed at all. By Scott’s standard above, we would basically assume that any claim we couldn’t disprove was true, provided we could find at least a few thousand people who claimed it.
In the real world, people don’t usually end up making the assumption from the last section. Consider if I bet on the jhana thing being true and decided that was a thing I wanted (and, frankly, I think I would want it). The price of verifying that it was true (if it was true, mind you) to myself would be steep - I’d have to spend hundreds or thousands of hours in meditation before I got there.
But the price of verifying it false is unlimited - at any particular juncture, no matter how many hours I got into the process, the fault in my failure might lie in some limitation of my mind or spirit; I might just be “doing it wrong”.
Faced with that kind of dynamic, people make choices all the time about what they believe. Note that I’m not saying it’s about what they disbelieve. Belief, real belief, is often something that demands action - that a person do something about it, or else know they did wrong or took a loss in not doing it. I don’t believe in jhana in that active sense, so I won’t commit hundreds or thousands of hours to achieve it.
At the same time, I noted above that it’s at least possible that jhana is real; I can’t disprove it.
The wrinkle is this: to a jhana person, this sounds like “you are a liar”, full-stop. It’s impolite. And if you are sensitive to that kind of thing, if that’s a behavior you want to avoid, you can get caught pretty easily in a net where you feel like you have to treat something like jhana as true despite being unconvinced of it.
Anyway, the point is this: I’m arguing for a concept of a reasonable middle between “running up to everyone who says they have long COVID and calling them dirty, filthy liars” and “accepting every unproven claim of any sort as face-value true”.
And you might expect that there’s already something like this, but there really isn’t, or at least it’s not standardized. I flat-out guarantee you I’ve offended someone while writing this article, even in the group (jhana folks) that I treated the most gently. And I have at least one person I consider myself friendly with (Jay) who (if I’m remembering right) belongs to that group, and probably more that I don’t know about.
Note here that if the claims of jhana folks were widely believed in the active “I’ll do something about that” sense, we’d expect to see millions of people trying this and reporting back on their successes or failures - the fact that we don’t very probably means that I’m not alone in my passive disbelief. To put that another way: most people who encounter these claims conclude that they are likely enough to be false that they aren’t worth betting time and effort on.
I think people encounter this kind of doubt related to claims they are making, and are shocked by it the worst, when they come from predominantly atheist backgrounds. If you’ve spent most of your life making claims that don’t rely on internal experiences answering to most of the descriptors of spirituality, then you are used to either being believed based on nothing but trust or else being believed or disbelieved based on physical evidence.
Religious folks are often necessarily used to “listen, I get that you are claiming a certainty you feel about this topic, but I don’t buy it” in a way the average not-religious-besides-meditation type of person isn’t.
I’d like to accommodate the jhana-claimer and to be politely credulous, but consider this: if I went with believing the jhanists based on the level of evidence they provide, I’d also have to believe the Spoonies, the DID people, the astral-projecting Wiccans, the people who see auras, and John Edwards (pick one; almost any John Edwards works for this sentence).
More to the point: the kind of norm that’s being demanded here requires you to believe almost any claim, provided you can’t immediately falsify it. Or barring that to at least say you do, even if you don’t do the stuff that belief would imply you should.
I’m a proponent of honesty. It’s sort of one of my things. I’m not necessarily especially good at it, but I put in some effort toward the goal of being honest; I think about honesty kind of a lot.
There’s a line of political thinking that says “Listen, you can’t make too many things illegal, because even if they are just nominally illegal in a way that isn’t enforced, you will create more instances where more kinds of people are likely to break the law. You will eventually condition people to think of themselves as lawbreakers, or else to think of laws as something that it’s OK to break, and that’s going to be something you regret after the fact”.
I tend to agree with that, and I tend to think of “don’t create situations where lying is the norm” in the same way. This goes beyond the practical aspects of believing unproven claims; even though those exist, I don’t think most of us expect a huge upsurge in actual honest-to-god meditation from jhana claims, even if we expect more claims whether they are connected to practice or not. People will go on assessing confidence in claims and acting (or not acting) as appropriate, regardless of what they say.
But what you might expect, I think, is that honesty itself is harmed - that people conditioned to lying about small things (say, pretending to buy jhana when they don’t) end up more likely to lie about more and bigger things, especially when they can attach it to an intent to do good - by, say, believing that it’s necessary to lie to be polite and kind.
I think it’s better to acknowledge a middle position - one where someone is not actively believed, where the listener remains unconvinced, without the default assumption being that doing so is an act indistinguishable from a flat-out accusation of dishonesty. In other words: people should be able to express reasonable, honestly-felt doubt without being thought of as expressing hate or disdain.
On Unfalsifiable Internal State Claims, Politeness, and Broadly Applied Principles
> > “If you have learned to eat salt and follow internet instructions and buy compression socks and squeeze your thighs before you stand up to not faint…and you would faint without those things, go into that appointment and tell them you faint.” Translation: You know your body best. And if twisting the facts (like saying you faint when you don’t) will get you what you want (a diagnosis, meds), then go for it
Generally liked this post, but that quote always felt weird to me. Normal healthy people do not require adding extra salt to their diet and wearing compression socks and squeezing their thighs before they can stand up without fainting. If you need to start doing all of those things in order to avoid fainting when you stand up, then it's accurate to say that you faint, and I don't get how it's twisting the facts to tell your doctor that.
Somewhat related: that thing where somebody you don't know that well enthusiastically tells you "you should watch movie XYZ" or "you should read this book". I guess the correct polite response is "oh, maybe I will", but that's typically dishonest. There's no polite, short-form way to say "while I appreciate your recommendation, I have a long list of things I 'should' read and watch, and there's no chance that your rec changes my plans". So I'm stuck either pissing the person off (made that mistake once with someone who I thought would have a sense of humor about it---they didn't), or lying, or very quickly changing the topic. In practice I settle for "I'll put it on my list", which is also a lie (because the list is long and doesn't need low-quality additions), but somehow feels like a sufficiently small one.
Same flavor of "how do I politely express disbelief while being proselytized to".