Monday, September 2, 2013

Online Cults

A "cult" is hard to define. The term is used very subjectively, and usually pejoratively. Typically the most fervent anti-cult movements are run by religious people, and that's a slippery place from which to argue against differing beliefs. The concept is inevitably tied to fringe - and harmful - religious beliefs, practice, and institution, but the details are very fuzzy.

Nonetheless, this is how I imagine cultists on the Internet.

We've talked before about how the Internet has given atheism a loud platform, but what about new religious movements? While survey numbers did indicate that the non-religious population is on the rise, we can't necessarily assume that the "no religion in particular" category excludes people who participate in cults, since they might not even think of their cult as a religion. We also know that the Internet can be very insular, allowing strange community quirks to develop.

Since we're talking about online religious movements, we won't include movements that started before the Internet. Even though Heaven's Gate had a (hilarious) website in the mid-90s, we wouldn't count it because the group got its start in the 1970s.

Let's explore what the Internet has come to offer by way of new religious movements.

In 2005, a YouTuber-to-be called Onision attempted to start an online cult named Sicesca. The website was shut down in 2006, but is archived here. The website's content is written very opaquely, but there seem to have been promises of enlightenment and an emphasis on environmentalism and animal rights. His attempt at a cult went forgotten for a while, but he has since referenced it in one of his YouTube videos. Now that Onision has a sizable YouTube following, he has a very prominent platform on which to promote his ideas.

However, as far as I can tell, Sicesca - or whatever may have succeeded it - doesn't have any current activity, so let's move on.

FreeDomain Radio is the online outlet for libertarian blogger and "self-described philosopher" Stefan Molyneux, who explores concepts of "freedom philosophy" in his podcasts and articles. The website ranks among the top 100,000 websites visited in the United States, and presents itself as "alternative news".

Molyneux ramps up the usual libertarian "freedom from force" garbage by describing the family structure as a coercive entity. After all, since you are forced into your family ties, true freedom must come from deliberately rejecting your family, right? The FreeDomain Radio community calls this "DeFOO" - Departing the Family of Origin. The von Mises Institute, a more mainstream libertarian outlet, thrashes Molyneux's libertarian arguments as "preposterously bad".

The Alexa data reveals that most of the website's visitors don't have a college education, suggesting that a large portion of Molyneux's audience is younger than college age and therefore already in a rebellious phase with respect to their families. There is at least one verified account of a teenager and regular visitor to FreeDomain Radio running away from home and cutting all ties with their family. Other such stories can be found on various blogs devoted to countering Molyneux's message and discussing the general destructiveness of FreeDomain Radio.

And let's do it because a cult told us to!

As far as uniquely "online" cults go, some new religious movements exist that could not possibly have existed before the Internet. Kopimism is a completely online-born religious movement, whose members - called Kopimists - believe that copying and sharing information is a virtue that should be celebrated and revered.

Kopimism has even been recognized as a legitimate religion in Sweden, and at least one Kopimist wedding has been performed. Their logo is the Kopimi symbol, a copyright alternative designed to specifically encourage the "kopimied" work to be copied as much as possible.

According to their constitution, Kopimists declare the internet "holy" and hold private ceremonies online that are forbidden from being recorded. Reading through the mix of organizational jargon and internet slang, it's hard to tell whether these people actually take themselves seriously, or whether they're just a bunch of people exchanging files in secured meeting spaces.

The group seems to be very focused on the free proliferation of information, which reads more like a political stance than a religious identity. It is possible that Kopimism is similar to Pastafarianism and Discordianism in that it exists to make a tongue-in-cheek point. On their "What is Kopimism?" page, they explicitly say that they "do not make claims regarding gods or supernatural forces", and that they registered as a religion because they "deserve the same recognition and respect" as other faiths.


"Thou mayest ctrl-c and ctrl-v, but of ctrl-x thou shalt not press of it."

How far do we want to stretch the label of "cult", anyway? Kopimism, on paper, is a new religious movement, but it doesn't seem like a cult. There's some negative connotation to the term "cult" that separates it from merely being a "new religious movement". After checking around some online sources for definitions and common traits found among cults, let's define our criteria for cults to be:
  • Having an authoritarian leader
  • Having fringe beliefs, which may be dangerous, and may be false
  • Imposing (or suggesting, since it'd be rather hard to impose anything on someone through the Internet) lifestyle changes for its adherents
  • A strong emphasis on getting money and/or recruiting members
  • "Thought-reform", "mind-altering practices", or deliberately affecting how its adherents think in some way
  • A strong distinction between those outside the group and those within the group, often incorporating insider language
An author of one of the anti-FreeDomain Radio blogs offers a very good explanation of cults, contending that when people talk about cults, they usually mean a specific kind of new religious movement that is destructive to its followers. While cults may have a lot in common with more mainstream organizations, we can make an essential distinction by talking about "destructive cults".

This makes sense, because we know that a "cult following" doesn't carry the same connotations as "following a cult". Nerdy fandoms can be notoriously devoted to their hobby of choice, obsessively following the work of famous authors, artists, and other content producers. These content producers rarely (if ever) try to command an authoritarian role among their fans. These people are not cult leaders. Their work is not designed to preach a message or to be doctrine.

Regardless of authorial intent, a fringe minority of fans will still venerate their favored artist, and actually do treat their hobby like a religion. Even very isolated individuals can connect with like-minded people on the Internet, form organized communities, and then organize conventions and other meetups to celebrate their hobby. These are people who take their consumption habits to the extreme, to the point where it becomes destructive to them. Luckily, they do not get to dictate how the functional, (relatively) well-adjusted majority enjoys the creative output of others.

Of course, some fringe audiences are more visibly crazy than others.

We also know that "cult of personality" does not have the same connotations as "cult", either. We've previously discussed the cults of personality that some administrators have with their website community. Unlike artists, administrators do have some authority over their website community because they have control over the website interface. Since administrators likely direct their website to their own tastes, the people who end up being regular visitors to the website probably have similar tastes to the administrator. We also know that a lot of online communities tend to be very cliquey, often with their own catchphrases, social dynamics, and prerequisite community lore.

However, there is never any obligation for site members to love, like, or agree with the site administrator. Most administrators are largely hands-off about community activity on their websites, imposing the bare minimum of regulations on their user base to avoid legal or readability issues. There are communities with tighter rules, but those rules never exist to venerate the administrator. If anything, those administrators receive far more anger than they do adulation.

The SomethingAwful administrator narrates a NSFW email that he received from one of his biggest fans.

Still, just because a lot of these online hubs and communities aren't really cults (let alone destructive cults) doesn't mean we couldn't have a cult-like online hub.

Let's explore this idea by looking at Less Wrong, an online community that I've previously written about. I mentioned in the previous post that they've already had accusations of being a cult levied against them, so that gives us reason to check them out a little more thoroughly.

In the older post, we already talked about the peculiar insider language that they use with one another. They also organize meet-ups for members to meet each other in real life. This isn't unusual - people who frequent Reddit, Tumblr, and other websites have organized meetups in the past as well (meetups of online communities may be laughable, cringeworthy, shameful events - but they're not unusual).

Less Wrong meetings can incorporate a little more than the usual nerdy online awkwardness. They are actively concerned with making Less Wrong "a stronger community", have written an extensive guide to running proper meetups, and even closely examine successful instances of meetups as case studies. Here is an excerpt from one of their Christmas-time meetup summaries, aptly called a "ritual report":
"The night begins with many sources of light - from candles and oil lamps to gas lanterns to florescent bulbs to lasers and lava lamps. We begin with fun songs like “It’s Beginning to Look A Lot Like Fish Men.” As the night progresses, we turn the lights off, one by one, and the songs grow darker. We occasionally read relevant snippets of Lovecraft, then abridged versions of Eliezer’s Sequences. We read the Litany of Tarski, over and over, each time facing a darker possibility that we must prepare ourselves for." ~A "Ritual Report"
You'll notice that part of their ceremony was reading from the "Sequences", which are blog posts that Less Wrong's administrator, Eliezer Yudkowsky, penned a few years ago. Yudkowsky's Sequences detail many of his pet ideas within transhumanism, "Bayesian epistemology", artificial intelligence, and other topics pertaining to "Rationality".

As far as I can tell, Yudkowsky has not had a direct hand in organizing these meetups, nor has he asked anyone to incorporate the Sequences into them. These people are just so enthusiastic about "rationality" that they consider his writings essential, and have even written alternative versions of them so that more people will read them. These devoted members call themselves "Rationalists".

The way that Less Wrong people define rationality is far more involved than the conventional definition. It is more specific, and incorporates more of their strange terminology than most might be able to stomach. This is important - the way that Yudkowsky and Less Wrong people define "rationality" is different from the way that most people define it. Bear that in mind as you read this, so as not to confuse their "rational" talk with what you think is rational.

According to Yudkowsky, rationality "is the master lifehack which distinguishes which other lifehacks to use". On his personal website, he writes about the virtues of rationality, writing in a style that borders on religious:
"How can you improve your conception of rationality? Not by saying to yourself, “It is my duty to be rational.” By this you only enshrine your mistaken conception. Perhaps your conception of rationality is that it is rational to believe the words of the Great Teacher, and the Great Teacher says, “The sky is green,” and you look up at the sky and see blue. If you think: “It may look like the sky is blue, but rationality is to believe the words of the Great Teacher,” you lose a chance to discover your mistake.
Do not ask whether it is “the Way” to do this or that. Ask whether the sky is blue or green. If you speak overmuch of the Way you will not attain it.
You may try to name the highest principle with names such as “the map that reflects the territory” or “experience of success and failure” or “Bayesian decision theory”. But perhaps you describe incorrectly the nameless virtue. How will you discover your mistake? Not by comparing your description to itself, but by comparing it to that which you did not name." -Eliezer Yudkowsky
Yudkowsky's pet ideas have been criticized as misplaced and delusional. Yudkowsky is not an academic, and allegedly never even finished high school.

That didn't stop him from founding what is now known as the Machine Intelligence Research Institute (MIRI) in 2002, an organization devoted to "Friendly AI" research. Despite being called a research institute, its "publications" only began appearing in peer-reviewed journals in 2012. Those particular 2012 papers appeared in two low-impact journals that are outranked even among journals in their own field. Other papers have been published in Singularity Hypotheses, whose website appears to be run from a Blogger site and doesn't even appear in the SJR database. They do have several conference proceedings under their belt, but presenting at a scientific conference can demand lower rigor than publishing in a scientific journal.

Of course, sometimes Yudkowsky can get his work published as chapters in other people's books, but that is a far less selective arena than peer-reviewed publication. This all goes without mentioning his Harry Potter fanfiction.

Rationality and reality are apparently not compatible.

There are other people who feature prominently on Less Wrong, some of whom are other bloggers on the Internet, others of whom are researchers affiliated with MIRI. Despite these other contributors, Yudkowsky has historically held most of the spotlight among his peers. The community contributes its own thoughts on rationality, often building on Yudkowsky's pet ideas from the Sequences. They even jokingly glorify Yudkowsky, inserting him into strange fanfiction and Chuck Norris-esque jests. While Yudkowsky may not be the one organizing all of the meetups, he certainly embraces a role as community leader.

Yudkowsky, along with some other administrators, maintains control over the content on his website. Roko's Basilisk was an on-site incident in which a contributing user wrote about a version of Pascal's wager involving a hypothetical future artificial intelligence. The post was consistent with ideas already proposed on Less Wrong, but Yudkowsky's response was to sputter, rant, and ban all discussion of it on his website. His stated reasoning was that Roko's Basilisk had caused extreme psychological distress to members of his community. This, coming from someone who supports a community that favors overcoming initial reactions of disgust toward "rational ideas" and sincerely argues in favor of torturing a man for 50 years over dust specks flying into the eyes of a sufficiently large number of people.

Less Wrong and MIRI also have a sister organization, the Center for Applied Rationality, which offers workshops on how to improve your ability to think with rationality. For the low price of $3900 USD, you can spend a week with them learning about how your brain works and how you can improve your thinking skills. They'll even keep in touch with you for six weeks afterwards to make sure that you "adapt to the rationality material". Of course, Yudkowsky is on the team as a curriculum consultant.

So, we have a leader with fringe ideas on artificial intelligence. He exercises authority over which ideas are allowed to propagate in a community that loves his ideas. A whole wing of his supporters - along with the leader himself - is openly dedicated to taking your money while striving to reform the way you think. He founded and helps run an organization that has questionable authority in the field it claims to research. And of course, let's not forget their nonsense language. There you have it, folks - the Less Wrong online community certainly fits the bill of a cult.

Quick! Someone give this thing a Snuggie!

But, is Less Wrong a destructive cult? Potentially.

There's also MetaMed, an organization stemming from MIRI that focuses on "personalized medical research". Don't confuse this with "personalized medicine" - MetaMed instead uses Bayesian networks to give you "actionable options". Of course, hiring someone to scour the medical literature in order to find obscure treatments could be valuable, but the service more strongly resembles medical consultation and should not be confused with personalized medicine or medical research. It's also medical consultation that costs about $5000, and is highly unlikely to be covered by insurance.
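For readers unfamiliar with the term, the "Bayesian" machinery underneath a service like this ultimately boils down to updating probabilities with Bayes' rule. Here is a minimal sketch in Python - the function name and all of the numbers are mine for illustration, not anything MetaMed has published:

```python
# Toy illustration of the Bayesian updating that underlies a "Bayesian
# network": given a prior probability of a condition and the accuracy of
# a test, compute the posterior probability after a positive result.
# All numbers below are made up for illustration.

def posterior(prior, sensitivity, false_positive_rate):
    """P(condition | positive test) via Bayes' rule."""
    # Total probability of testing positive: true positives + false positives.
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

# A rare condition (1% prevalence) and a fairly accurate test:
p = posterior(prior=0.01, sensitivity=0.9, false_positive_rate=0.05)
print(round(p, 3))  # → 0.154
```

The takeaway of the toy numbers: for a rare condition, even a fairly accurate test leaves most positive results as false alarms. That's the kind of reasoning a literature-combing consultancy sells - and it's only as good as the numbers fed into it.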

One blogger, closely affiliated with Less Wrong's community, has discussed MetaMed on his own blog, explaining that some rationalists advise trusting MetaMed's opinions over traditional doctors' opinions because MetaMed comes from within the rationalist community. Indeed, the pitch for MetaMed on LessWrong is awfully disparaging to traditional doctors, and promotes MetaMed as an organization with "names [that LessWrong people] will find familiar", with researchers that "have also read LessWrong".

Yudkowsky has already gone on record saying that Bayesianism is superior to the scientific method, and now MetaMed is being promoted as a superior alternative to the conventional health care system. While there are certainly valid criticisms of the health care and hospital system, making them in favor of a start-up that lacks a transparent record of success (they have testimonials, but the praise stops short of any mention of being successfully treated or cured) seems disingenuous. At its potential worst, it is self-promotion for the sake of profit, and may not actually significantly benefit its customers.

It remains to be seen whether Rationality will have any sort of greater influence or authority in the future. One could argue that it is too early to pass judgment (and indeed, most of the PhDs working at MIRI seem to be young and just beginning their careers), but a lot of Less Wrong's core members seem to have already judged that Rationality is the superior way. This could hurt people.

Rationality is happening at a very interesting time in history. It's possible that they are the first among many new religious movements to deal with their particular subject matter.

In fact, we've seen something similar happen in the 20th century with extraterrestrial life.

Yesteryear's singularity.

UFO religions spread like wildfire during the 20th century, a time when flight technology and space exploration were finally gaining widespread support and legitimacy. Sure, human flight was discussed before the 20th century, but that was a time of gliders and hot air balloons - actual flight, let alone spaceflight, was a pipe dream. As the Wright Brothers, Neil Armstrong, and Carl Sagan dominated public imagination, some individuals' imaginations ran more wildly than others'.

Some people conceived of benevolent extraterrestrial life along with the necessity of seeking, welcoming, and embracing superior life forms. Some people fabricated entire mythologies around human origins, the afterlife, and good and evil in terms of great sci-fi epics. They were fringe organizations, but they still earned reasonably large followings - Scientology, for example, has tens of thousands of American members. Raelism is allegedly the world's largest UFO religion, also with membership in the tens of thousands.

Meanwhile, here we are in the 21st century, and information technology is booming. The Internet has connected people like never before. We produce so much information on this network that previously invisible patterns come into human view, and we need new programs and automated tools to see it all. New professions are emerging in machine learning and informatics, with implications everywhere from biology to sociology to economics. The way that technology has advanced in the past few decades is truly inspirational.

So is it so surprising that we suddenly have a group of people who conceive of benevolent artificial intelligence along with the necessity of seeking, welcoming, and embracing superior ways to process information through "Rationality"?

Thanks, information superhighway!

It's obvious to say that Kopimism could not exist without the Internet. But Less Wrong and Rationality likely could not either. Sure, supercomputers and artificial intelligence have been a pillar of science fiction since the emergence of computers in the 20th century, and transhumanists have been calling themselves 'transhumanists' since the 1980s, but back then, these fields were farther removed from reality than they are today. The Internet has given us far more intuitive notions of collective behavior, mass information, automation, and other concepts essential to imagining the topics of science fiction. And of course, these technological leaps have mostly been spurred by people who don't have any visible affiliation with transhumanism at all.

Followers of Rationality - again to emphasize, different from conventional rationality - are imaginative human beings trying to make sense of exciting current trends in technology. And as the technology progresses, we'll probably see more people like the Rationalists in the future. They are a product of our times, and our times happen to be focusing on information technology right now.

If Rationality is actually similar to its UFO-religion forefathers (and let's be honest, I could be wrong), then we can expect Rationality and its ilk to remain forever fringe. Once in a while, they'll do something that reminds the world that they exist, but they will ultimately be inconsequential, dismissed by the people in industry, academia, and government who actually work in computer science and information technology.

All that said, though it may be important to point out and criticize the cults, we must take care not to disparage the cultists too heavily. We're all guilty of believing strange things once in a while, so we ought to have some humility before disparaging someone for believing strange things that merely differ from our own. If their beliefs are not harming themselves or others, then cult members can likely maintain functional lives while also happening to believe strange things.

The best things a person can do to help cult members are to recognize their agency, keep resources on the cult available, and offer support if circumstances within the cult become dangerous. I'm certainly one to do my part in that.

6 comments:

  1. Hello, I recently discovered this blog and I must say it's an excellent reference for contemporary Internet culture. I was searching for Less Wrong criticism because I like reading that sort of stuff on the 'Net and I must say that this is one of the best essays (and very documented) on this topic! Keep up the good work!

    1. Thanks for the kind words! Glad you enjoyed the read!

  2. Your bias against youths wanting to live independent is alarming. Many young adults require their parents to provide money or sustenance as long as their 30's. Society has infantilized a whole generation of kids and a man tells them to forge their own path in life, clear from their parents as humans have done fot countless generations yet he is accused of starting a cult? God help us all if this is how society feels. Truth test is an emerging online cult I wanted to discuss, but honestly im aghast at your presumptions of how young men ought to be infants for life.

  3. This really is well written, I really think damaging the integrity of science and scientific journals is one of the largest dangers of internet "rationalism". Casting aside scientific method for the use circular logic scares me.

  5. There are many errors and omissions in your understanding of LessWrong. For instance, scientific conferences actually tend to be higher quality than journals in computer science - it's unique relative to other fields of science. There is also almost no moderation of ideas on LW, with RB being a sole (and temporary) exception. The excerpts from meetups and essays which you used to make it sound 'religious' are of course not a representative sample.

    The "conventional definition" of rationality you cited is a dictionary definition, which is a terrible way to approach any philosophical topic. If you look at how real (academic, mainstream) philosophers talk about rationality, they use just as much complex heavy terminology as folks on LessWrong do. Using a dictionary as an authority on a substantive issue is just a silly trick which would equally condemn all kinds of researchers and academics from all backgrounds.

    If you think MetaMed was bad just because it differed from conventional healthcare, you simply don't know anything about healthcare. The lack of statistical training among doctors and the prevalence of poor treatment decisions is a widely known problem in American healthcare which leads to poor treatment outcomes and cost overruns. MetaMed provided consulting on evidence-based treatment, not anything regarded as problematic in the slightest.

    After several years, your comments have aged poorly. There's been no sense of rationality "hurting people"; instead there's just been a steady stream of MIRI papers being accepted into higher tier workshops and Yudkowsky's views on AI gaining more approval in the field.
