Incoming links
from Heidegger on the Connection between Nihilism, Art, Technology and Politics
  • Well that is – unintuitive. That sounds like an even harsher critique of Rationalism than I typically make.
from paperclip maximizer
  • This seems entirely implausible to me. Part of this exercise is to investigate and defend that intuition and related doubts about the ironclad mathematical certainties that Rationalism produces so effortlessly.
from postrationalism
  • A difficult-to-define ideology, but the label literally means people who have moved beyond Rationalism. Meaningness and Ribbonfarm are usually taken to be postrationalist, and I'm close enough to those precincts that it probably means I am too, although in truth I was probably never Rationalist enough to qualify.
from bird's eye view vs. frog's eye view
  • It seems to me that subjectivity and objectivity need to be balanced and integrated. Too much emphasis on the objective, and you get eliminativism or gradgrindism or Rationalism. Too much emphasis on the subjective and you get the rancid aspects of postmodernism and whatever it is that seems to afflict the younger generation, a kind of toxic emotional entitlement.
from The Enigma of Reason
  • Book by Hugo Mercier and Dan Sperber that offers an interesting, non-Rationalist view of reason. Rather than striving to attain objectivity, reason is inherently purposeful and interested. This seems pretty common-sensical I suppose, but in this context it's kind of radical.
from "rationality" vs "rationalism"
  • Well that's kind of embarrassing – I wrote tons of stuff about Rationalism without realizing that it is a label eschewed by rationalists themselves:
    • To call something an “ism” suggests that it is a matter of ideology or faith, like Trotskyism or creationism... So, my suggestion is to use "rationality" consistently and to avoid using "rationalism". Via similarity to "scientist" and "physicist", "rationalist" doesn't seem to have the same problem
from Agency Made Me Do It
  • A caution: my goal is not to write a self-help book or a manual on how to acquire more agency. I guess this is something people might be looking for, given how it is basically the promise of a whole subindustry of productivity and self-help gurus, and a concern of Rationalism (see Being a Robust Agent).
from SlateStarCodex
  • Slate Star Codex is the former blog of Scott Alexander (aka Scott Siskind aka SSC), the most widely-read person in the Rationalism sphere these days (2020 or so).
from neoreaction
  • An extreme right-wing political ideology that for some reason has a serious following in a subset of the technology world, with considerable overlap with the Rationalism community. Also called NRx by those in the know. Neoreactionaries don't like to be called fascists or white nationalists, but their writings contain astounding levels of toxic racism and calls for violence, just the sort of thing you would expect from fascists or white nationalists.
    • Scott Alexander has put a lot of effort into distancing himself from neoreaction, which is good, but he's basically in trouble for being in a social position where he had a need to do that. Which might not be quite fair, but this is how things work.
from nihilism
  • The need to find some kind of value or purpose in a meaningless universe is an unacknowledged undercurrent throughout Rationalism discourse, emerging in its nightmares like the paperclip maximizer or Roko's Basilisk, or in its attempts to wax poetic about utilitarianism. This is not really meant as a criticism. I see Rationalism as a sincere attempt to build something necessary – a religion, a shared way of making meaning – on top of the unpromising nihilist foundations of the materialist worldview. I'm sympathetic to their goals and efforts but kind of dubious about their solution.
from haecceity
  • Isn't this about the same as that other advanced vocabulary term, ipsissimosity? (Note: this has some bearing on my quarrels with Rationalism and the "objective spirit")
from About
from Agency: Introduction
  • Each volume of A Map That Reflects the Territory has a short introduction to its theme. I'm going to dissect a few quotes from the introduction to Agency, because they seem to compactly and precisely embody my issues with Rationalism in general:
from Agency: Introduction
  • There's a whole lot of this that I disagree with (see Rationalism), but here I just want to point out how it leads to a distorted and arguably harmful view of human agency as somehow deficient because it is not pitiless and single-minded.
from AI Risk
  • The long-term superintelligence risk that is an obsession of Rationalism.
from gradgrindism
  • Rationalists are not really Gradgrinds; they are in fact a pretty playful and imaginative bunch in their way. But their ideology is grim, and their nightmares of the paperclip maximizer have a Gradgrindian aspect to them.
from optimizing
  • One of my gripes with Rationalism is the unquestioned assumption that intelligence is about optimizing some quantity. Closely related to the similar gripe about winning. I find this an impoverished way to think about the mind.
from antipolitics
  • Rationalism features a prominent disdain for politics. There are many good reasons of course to hate politics, but disliking something does not make it thereby unimportant. And it doesn't excuse you from participation in the actual politics of the present day.
from Meta-honesty
  • To say this is wrong is kind of an understatement; it strikes me as aggressively wrong, deliberately retro, an attempt to stick one's head in the sand to evade the postmodern condition. And it's foundational to Rationalism.
from libertarianism
  • Rationalism tends toward libertarianism, although it's not universal. And I really do think that their libertarianism is motivated more by a fondness for elegant distributed mechanisms than by a desire to slaughter leftists. Whatever the motivation, the ideas are deeply intertwined, and probably my objections to them are intertwined as well.
from Being a Robust Agent
  • Because Rationalism is about an idealized version of thinking, it doesn't have much interest in the ways that humans (so far, the only examples we have of intelligent agents) actually work. It aims to make humans more closely approximate the ideal, even though the ideal is monstrous when taken to its logical extremes.
from Being a Robust Agent
  • I think I've arrived at a compact understanding of what Rationalism is:
    • start with the natural goal-seeking and problem-solving abilities of actual humans
    • abstract this out so you have a model of goal-seeking in general
    • assume that computational technology is going to produce hyperaccelerated and hypercapable versions of this process (ignoring or confusing the relationship between abstract and actual goal-seekers)
    • notice that this is dangerous and produces monsters
from Making of a Counter-Culture
  • The quotes below really highlight for me how much Rationalism is a reactionary movement against sixties romanticism. That doesn't make it wrong – there were plenty of reasons to turn against that stuff – but it explains a bit of its cultural and political penumbra.
from illegibility
  • Rationalism seems too oriented towards legibility, for my taste at least. As I've said elsewhere, they seem intellectually retro, and haven't gotten the news about the limits of reason and representation:
    • The whole movement is kind of retro in a way that is sometimes appealing but just as often appalling. Peter Sloterdijk labelled rationalists as "the Amish of postmodernism" and it often does seem like an effort to be staunchly and cluelessly devoted to ideas that nobody really takes seriously any more.
from Review: A Map That Reflects the Territory
  • The Rationalism community has packaged up some of the best of LessWrong into book form, and when I saw that one of the five focus topics was agency I could not resist asking for a review copy, that being something of a pet subject of mine. Now I have to follow through with a review, and I'm taking the opportunity to also completely rebuild my writing and publishing stack.
from winning
  • The constant references to "winning" in Rationalism discourse really grate on my nerves. I get what work it is doing – it's suggesting that life is a kind of competitive game, in which there is some kind of scoring metric, and you are able to compare your score with others. The best, most rational ideas are those that produce the most winning.
from anti-purpose
  • Purposefulness in itself is a key value of Rationalism (see Being a Robust Agent). A good rationalist not only has goals, they have meta-goals about being more goal-oriented.
from Coherence Arguments Do Not Imply Goal Directed Behavior
  • The Rationalism counter to this, I think, is to say that humans are imperfectly rational due to the accidents of evolution, but AIs, being designed and untroubled by the complexity of biology, will be able to achieve something closer to theoretical rationality. Since this is provably better than what humans do, humans are potentially in deep trouble. Hence they have taken on the dual task of making humans more rational, and figuring out how to constrain AIs so they won't kill us.
from SlateStarCodex
  • My general opinion: he's an amazingly prolific and clever writer but there's something off about his viewpoint. This is part of what has gotten him into trouble with the mainstream, and it's quite related to my general objections to Rationalism. I've written quite a bit trying to pick apart some of his posts, and I freely admit that has generally been a very rewarding intellectual experience even if I don't end up vibing with him.
from Meaningness
  • David Chapman (aka @meaningness) has been a major influence on my own thinking. His work at the MIT AI lab with Phil Agre made a deep impression on me when I was trying to figure out my own academic path. This included a critical take on the standard cognitive science view of the mind, which is pretty much Rationalism minus the more cultish and cartoonish aspects.

Rationalism

21 Dec 2020 03:59 - 29 Sep 2021 11:18

    • [a note on terminology: "rationality" vs "rationalism"]
    • Rationalism is a movement of nerdy types (in both the best and worst senses), centered around the LessWrong website. Should be distinguished from small-r rationalism, which is just a philosophical position. Rationalism goes beyond the small-r version in that it is also a self-help movement that tries to promote what it considers better ways of thinking and being.
    • It has a very particular theory of what that means, comprising a theory of knowledge (representational objectivism) and of action (optimizing aka winning). Both of these theories seem extremely weak to me, in that they don't adequately describe the natural phenomena they are supposed to be about (human intelligence) and they don't serve as an adequate guide for building artificial versions of the same.
    • Nevertheless they manage to do a lot of interesting thinking based on this inadequate framework, and they attract smart and interestingly weird people, so I find myself paying them attention despite my disdain for their beliefs. A lot of this text is about me trying to work out this contradiction.
    • The other component of Rationalism is a belief that superintelligent AI is just around the corner and poses a grave ("existential") threat to humanity, and it is their duty to try to prevent this.
    • Rationalists have founded MIRI (the Machine Intelligence Research Institute) to deal with this problem; and CFAR (Center for Applied Rationality) to promulgate rationalist self-improvement techniques. They are also tightly connected to the Effective Altruism movement. They've attracted funding from shady Silicon Valley billionaires and allies from within the respectable parts of academia. And they constitute a significant subculture within the world of technology and science, which makes them important. They are starting to penetrate the mainstream, as evidenced by this New Yorker article about some drama on the most popular rationalist blog, SlateStarCodex.
    • [update 2/13/2021: the anticipated New York Times article finally dropped, and it seems pretty fair:]
      • SlateStarCodex was a window into the Silicon Valley psyche. There are good reasons to try and understand that psyche, because the decisions made by tech companies and the people who run them eventually affect millions.
    • Rationalists occasionally refer to their movement as a "cult" in a half-ironic way. It has a lot of the aspects of a cult: an odd belief system, charismatic founders, apocalyptic prophecies, standard texts, and a certain closed-world aspect that both draws people in and repels outsiders. But it's a cult for mathematicians, and hence its belief system is a lot stronger and more appealing than, say, that of Scientology.
    • The NYT article has a quote by Scott Aaronson (a Rationalist-adjacent computer scientist):
      • They are basically just hippies who talk a lot more about Bayes’ theorem than the original hippies.
    • Now, this is quite true in that Rationalists constitute a subculture and have established a network of group houses, have a lot of promiscuous sex (aka "polyamory"), and are into psychedelics. On the other hand in Meditations on Meditations on Moloch I find that they've inverted some key hippie attitudes, for better or worse. They embrace what the hippies rejected and want to build a world on different principles.
    • Some admirable things about Rationalists

      • They are super-smart of course. They seem to attract mathematicians who are too weird for academia, and we sure need more people like that.
      • Their ideas tend to be simple, precise, and stated with extreme clarity.
      • They want to save the world and otherwise do good.
      • They are serious and committed about putting their ideas into practice.
      • They are very reflective about their own thinking, and seek to continually improve it.
    • My major gripes

      • Assuming that the overarching goal of life is "winning"
      • Overly mathematical (confusing map with territory)
      • Occasional extreme arrogance
      • A sort of impenetrable closed-world style of self-justification.
      • Retro taste in ideas and sometimes esthetics.
        • Sloterdijk made a good crack (in You Must Change Your Life) about small-r rationalists; he called them "the Amish of postmodernism". Of course if that metaphor holds, then I should leave them alone to their quaint and deeply held beliefs, which might end up being superior to the mainstream for long-term survivability.
      • Connections (socially and intellectually) to unpleasant political movements like libertarianism, objectivism, and neoreaction, fueled by an antipolitics stance that is ultimately shallow.
      • Taking as axiomatic things that are extremely questionable at best (orthogonality thesis, Bayesianism as a theory of mind).
    • A Rationalist 2x2

      • OK, this was really just an experiment to see if I could make a 2x2 table in Roam and yes, I could and it was pretty easy!
      • The upper-left and bottom-right quadrants are pretty self-explanatory.
      • The top-right is a bit contradictory because the claim of importance is central to Rationalism. They believe they are literally saving the world from likely destruction by superintelligent AIs, and what could be more important than that? But they could be wrong about their importance while still producing intellectual value, so this represents that possibility.
      • The bottom-left represents the possibility that Rationalism is not only wrong, but harmful, in that it distracts smart people from working on real problems, and to the extent it becomes a dominant ideology in the tech world it becomes that much more harmful. Also to the extent that Rationalism is an ally of bad political ideas (considerable), it's not just a harmless nerd social club.