Referencing Kahneman and Tversky's "System 1 and System 2" from their bestseller Thinking, Fast and Slow without crediting it seems to be more about avoiding the scrutiny that citation would draw, and instead appearing authoritative with statements about how psychological researchers have long distinguished between these types of thought. We should even be concerned that someone who could be a "jumper" may also have schizophrenia, as our thinking may have applications to that.
Maybe it's interesting, or maybe on closer inspection and thoughtful consideration it resembles propaganda: ignore your instincts, trust the narrative, and here are some blunt tools for being suspicious of others who don't. If only most people were smart and comfortable enough to jump to conclusions and then course-correct as needed, we could avoid the consequences of groupthink driven by those who fear criticism above all.
A friend of a relative is an apparent flat earther - my relative is always saying things like "oh you wouldn't believe what this guy thinks, he says the world is actually shaped like a UN map, what a goof". In this situation, which of the two is actually the more gullible? From the situations I hear described, it's pretty clear the guy just likes to tease, and enjoys the "intellectual exercise" - for some value of intellectual - of tearing down commonly held truths by providing alternatives, even if they are silly. I've seen the same thing with the moon landing and 9/11. People like to use these as strawmen - "look at all the nuts out there, we need to control what people can read" - but this is just an excuse for shutting down opinions they don't like.
This was the default assumption for plenty of people when they first heard about real flat earthers, but unfortunately it appears to quickly devolve into actual, very firm belief for many of them.
This whole phenomenon of "LARPing" your way into a conspiracy (I'm going to start posting about the Jewish cabal controlling the world because it's "edgy," it triggers people online, and it's a fun intellectual exercise to see if I can gather as many pieces of evidence as possible) and then actually believing it seems to honestly explain how a lot of these conspiracy theories end up gaining so much momentum.
After a while, they just grow tired of keeping up appearances, or maybe a famous person talks about conspiracy X, so they are no longer embarrassed by their belief, etc.
Is it possible to quantify "many people"? Like, what percentage of people that are ostensibly flat earthers are actually genuine flat earthers vs. trolls? I would guess trolls make up the majority of internet flat earthers.
There are without question plenty of "real" flat earthers out there. People willing to spend actual money on experiments doomed to fail. People cutting ties with their families because they don't agree with them.
When we read people espousing flat earth views on YouTube or Reddit or wherever, our natural assumption is that they must be trolling, because nobody could be that stupid, but in fact there really is a subculture that believes it.
Was her point grounded in reality? Hell no. But the exact point being made above is highlighted by your comment.
Her original post: https://politizoom.com/wp-content/uploads/2021/01/Greene-las....
There's a perfectly reasonable point to be made in each case, but many of the people making it feel the need to overstate the case and make it even more absurd.
I think it's because mockery spreads faster on the internet than a simple debunking. And so the thing being debunked is turned into its most extreme form. Perhaps even a step or two beyond a valid interpretation of what's being asserted by the subject.
So with the Mockery Maximization Principle, you get a meme that can spread very quickly and discredit the target at the same time.
The problem with this technique is that it can backfire if people start repeating it as what was actually being asserted. For example, the horse paste meme backfired when it came to Rogan and the CNN doctor. When this happens, the debunkers are suddenly on the defensive, and they don't have a great way out of it since they're no longer on the side of accuracy and truth.
That's a great term.
This is the problem with tribal audiences. CNN can say to its audience (mainly older liberals) that ivermectin is horse paste and face no repercussions. Conformity is so valued by society as a whole that no one is going to stand up for the truth on CNN itself. But then in those brief moments when the outside world comes in, their bluff is called.
The right does it as well, and your mockery maximization principle (great name BTW) mandates that the meme is produced solely for their primary audience to sell clicks.
That one could write that she did not quite say that, instead of the sane universe's version that she did not say anything remotely like it, is truly shocking. And horrifying.
She put enough of her thoughts and 'research' into her post that it was a 'dog whistle' for like-minded-thinkers.
You do not always have to explicitly say something, to actually say something...
And rather than stating the incorrect belief explicitly, she was using dog-whistle rhetoric techniques.
The lie is pretending that someone's written words are not actually enough evidence to show their intent.
The lie is pretending that statements exist completely isolated from any cultural or historical context. Marjorie Taylor Greene has a well documented history of making very controversial statements.
The lie is ignoring the fact that time and time again conspiracy theorists keep bringing up the same scapegoats. Perhaps you are simply unaware that, historically, attempting to link things to "Rothschild" is a very long-standing anti-semitic conspiracy theory and a known dog whistle.
> The lie is pretending that someone's written words are not actually enough evidence to show their intent.
> The lie is pretending that statements exist completely isolated from any cultural or historical context. Marjorie Taylor Greene has a well documented history of making very controversial statements.
> The lie is ignoring the fact that time and time again conspiracy theorists keep bringing up the same scapegoats.
You're desperate to align reality with your opinion and offering nothing but character attacks and assumptions as argumentation.
This isn't a strawman intellectual exercise. If we're in the car and talking about charging it, she will excitedly bring up how electricity will be free soon and laugh when we suggest that's a load of bull. She chose to not get vaccinated because she believes the vaccine was engineered to make you more susceptible to the "next virus" that the CCP is planning to release.
I don't really understand how she can possibly believe so much drivel, but she tunes in for her update every day, and this is the only "news" she cares about.
edit: just to be clear, the "free energy" thing isn't about the government subsidizing electricity, it's about a device that Nikola Tesla invented that creates electricity from nothing but was suppressed for generations by the rich and powerful vested interests of the fossil fuel industry.
I like the idea of intellectual exercise, and I enjoyed reading the sub when it was just fun and games. I stopped reading it when they started to seriously support his election. But people for sure lose themselves in the delusions, especially if some part of it connects with them. It's like some sort of cheat code into their minds. I have friends who fell into the QAnon situation. If you had told them 5 years ago that our government officials were satanic ritualistic baby eaters, they'd have laughed you out of the room and told you to have another beer. It doesn't really work that way. No way in any serious debate could you ever convince them of this. But now, after they've fallen down the rabbit hole, some of them seriously, unironically believe these things could actually be true.
It's like the saying goes: if you open up your mind too much, your brains will fall out.
What's surprising is the speed at which this happens. In Berger's conceptualization, this happens over the course of lifetimes, while in the social media space, it happens over the course of months.
My final point is that this is the reason why "jokes" are insidious. They will attract people who earnestly believe they aren't just jokes which normalizes harassment.
Berger, Peter. The Social Construction of Reality. http://perflensburg.se/Berger%20social-construction-of-reali...
A specific kind of harassment designed to habituate abuse while taking advantage of plausible deniability by ostensibly also being a "joke".
They literally cannot see beyond their own senses, e.g., want to see for themselves the curvature of the earth, yet they believe that it is the case that every one of the billions of photos of the earth from space or high altitude is faked, but won't believe that people can go to space.
The psychological research that seems to best explain it is that these people prefer to believe that the world is full of and controlled by an evil secret cabal than live in the reality of a world that is deeply random and uncertain.
Someone seeing periodicity or even symmetry, fractal self-similarity, or isomorphic structure in something noisy could just as easily be accused of apophenia until they produce proof. That there is structure in the noise matters to the theorist, but to the hegemon it's unimportant, if not subversive and dangerous.
I prefer to look at it as having to do with how people relate to power and truth and how they perceive it.
I'm not sure it conflicts, and likely even correlates well with your view of how people relate to power and truth.
Whatever the root cause is, it's relatively unimportant until an aspiring authoritarian starts herding those who have trouble with objective reality and creates support for a regime that will constrain the rest of us.
On several key points, the conspiracy subreddit has been correct. For example, in March 2020, they accurately predicted the course of covid, namely that vaccines would be developed, that there would be large-scale hesitancy, and that this would be used to justify the keeping of covid lockdowns / masking requirements / eviction moratoria far beyond the stated 15 days. Regardless of your belief in whether this was justified or not, this happened.
Similarly, the subreddit was correct about things later revealed by Edward Snowden.
Of course the next post will be about lizard people, but frankly, that doesn't mean we ought to discount the well thought out, presented, and reasonable 'conspiracies' that get trotted out there. Some of them actually happen.
It's just a bad place to trust, because they do no moderation. On the other hand, the places that do moderate are always behind on the things that end up being true.
Why do we need that? I find this type of control suffocating, and it is just a patch for deeper-seated problems, mistrust of authority for example.
The grammar was very poor. I meant the "look at all the nuts out there, we need to control what people can read" as the false conclusion that might be drawn by someone who believes conspiracy theories are rampant.
You need to find out why that happened and fix that.
I imagine a lot of these people just want/need something to cling to, because even the conspiracy theory sometimes seems more real than reality.
It doesn't take long to find evidence of truly outlandish conspiracies perpetrated by the US government itself. Dosing citizens with drugs in an attempt to control their minds. Exposing citizens to radiation on purpose, without their knowledge or consent, in order to study the effects, measurably causing increased rates of cancer in that population. Anyone can find stuff like this just randomly browsing Wikipedia. Yet somehow we're supposed to disbelieve when "theorists" speak?
I've also concluded there's people out there who want to discredit these "theorists" in any way possible. Abusing science and medicine to do it? Why not? They must be mentally ill, right? All those insane ideas must be evidence that something is literally wrong with their cognition. Their brains must be scrambled or something. Schizophrenia? Throw in words like paranoia and suddenly people will agree that maybe we should involuntarily institutionalize these people. Put them on medication to calm them down. Yeah.
Paranoia is not a symptom when they really are after you. I don't understand how anyone can accuse people of paranoia in the 21st century when we know there are hundreds of government agencies out there spying on literally everyone on the planet at all times. We know they conspire to do all sorts of completely unacceptable things. We know they always deny the truth afterwards for as long as they can get away with it.
Some people have silly beliefs like flat earth. That's just wrong and it's safe to ignore them. The problem is when people raising perfectly legitimate concerns about COVID-19 vaccines are dismissed as anti-vaxxers. The truth is their risk-benefit profiles are not fully known at this time, scientists are still studying their effects. It's absolutely possible that some vaccine has higher risks than benefits. Yet people saying anything that isn't glowing support for vaccination get moderated on social media for spreading disinformation.
I have become more skeptical of the bashing of "conspiracy theorists", and of rapid mislabeling, due to great examples of crazy things in government like Operation Midnight Climax and numerous experiments gone awry.
>The problem is when people raising perfectly legitimate concerns about COVID-19 vaccines are dismissed as anti-vaxxers. The truth is their risk-benefit profiles are not fully known at this time, scientists are still studying their effects. It's absolutely possible that some vaccine has higher risks than benefits. Yet people saying anything that isn't glowing support for vaccination get moderated on social media for spreading disinformation.
I think this has to do with science-as-a-process vs. "scientism". When you treat science as a methodology it works great, because I truly believe science is regularly self-correcting, even if the correction period is measured in centuries. But when science becomes mixed with narrative truth, painted through a partisan lens, or turned into dogma, then we devolve into something out of Frankenstein. Or, perhaps better, The Fly.
The US is a country with documented examples of doctors injecting microorganisms into unsuspecting patients. There's a truly astounding list of unethical scientific experiments performed there on Wikipedia.
So I don't fault americans for assuming there's some kind of conspiracy behind vaccines and I would most certainly not be surprised if they turned out to be right. I don't fault them for not trusting doctors, especially psychiatrists. I've read some truly horrifying stories.
People should be able to make informed decisions with regards to these vaccines, not get manipulated into taking them because governments are desperate to contain the situation.
The vaccines have side effects, though they are regarded as a very safe method. They are very targeted, since broad corona vaccines are not deemed to be good. Medical experts know this much better.
No conspiracies needed. QAnon-type conspiracies are more visible than ever (as they get blown up on social media), and personally I've noticed a pretty big uptick in "uncle so-and-so went off the deep end so we had to cut off communication" type stories.
This increase in attention feeds news stories about conspiracy adherents (and anybody that can be lumped in with them, even if it's not warranted).
> maybe on closer inspection and thoughtful consideration it resembles propaganda to ignore your instincts, trust the narrative, and provides blunt tools for being suspicious of others who don't
This is almost all "news" nowadays. Antifa is running the left, QAnon is running the right, somebody is out to get you and ruin the American way of life and we're going to tell you who the bad people are if you just tune in for our next segment.
I wonder how long that sort of cutting off has been happening in the past. I know there are members of my extended family I don’t associate with due to their crazy beliefs, but I haven’t talked about it.
If Uncle Bob in 2017 believed that the federal reserve was run by a cabal hoarding 90% of our tax money to enrich themselves, nobody would really care unless he feels the need to bring it up in every conversation to the detriment of normal human interaction.
If Uncle Bob in 2020 believes that, he probably also believes that the cabal is using masks to control the population, which causes an immediate conflict when he insists on going unmasked everywhere, including visits to elderly grandma. And then in 2021 that belief probably gets extended to vaccines.
In other words, the pandemic offered a unique opportunity for conspiracy theories to conflict with day-to-day life in a way that wasn't true before the pandemic.
What broke pandemic policy is it was run by people who believe sincerely that they need to deceive people for their own good. It's the maternalism of noble lies. While there is a lot of uncertainty in policy circles about science and truth, there is very little uncertainty about power, and when you have that, truth is what you say it is.
The policy response is absolutely using the crisis as leverage to ensconce measures that would not have been legally or politically possible without it. The only meaningful question (I think) is whether the people behind the policies and supporting them are protagonists or antagonists. Almost nobody is asking, "wait, are we the baddies?" The reason "the banality of evil," is such a controversial idea is it places more intellectual and moral responsibility on each of us than the long tail of people are willing or able to accept, and so it's easier to attack the person with the idea than clear the bar it implies.
That's not conspiratorial, that's critical, and I'm sympathetic to people accused of conspiracy thinking because we've let the culture conflate those - to whose benefit is left as an exercise to the reader. ;)
> This increase in attention feeds news stories about conspiracy adherents (and anybody that can be lumped in with them, even if it's not warranted).
Emphasis on the last bit: whether your contrarian position is "+2 sigma" (as you state) or -2 sigma, you will definitely be lumped in with the other group by some targeted news program.
> to whose benefit is left as an exercise
To everyone's detriment really. Every accuser is also accused. Mainstream news these days is as much a target of conspiracies as they are accusers of conspiracy theorists. So too are leftists, rightists, contrarians, and of course actual conspiracy theorists. Everyone is accusing everyone else of something, to a net negative on society.
Every leftist is accused of Marxism, every conservative accused of white supremacy, and so on. It's a miserable state of affairs with no nuance or productive discussion. Even the platforms (Facebook et al.) that promulgate the inflammation of public discourse are themselves increasingly under fire by both sides of the aisle, to the point that breaking up tech companies and heavily regulating their platforms are regularly discussed and promoted by lawmakers. Facebook and family face existential threats because they are so accused of poisoning the well, which, to be fair, they have done, but that's really just the nature of social media.
Conspiracy theories are rooted in (if not defined as) the logic of uncharitable interpretations, and I'm saying the source of that is whether the subject of the theory is viewed as a protagonist or antagonist. Conflating the dumb and the smart in that stddev/sigma view is an artifact of that uncharitable thinking as well, where the average person has been trained to think the common are stupid and the exceptional are insane.
Optimistically, we can dislodge that, and I'd emphasize this uncharitable cognitive bias as the source of the divide.
The entire problem. You're not speaking to this person to figure out what they actually believe, you're forming a belief based on your own assumption.
My point is that IF uncle bob's belief system segued into beliefs about masks and vax, then it becomes more visible and causes conflict with family members who didn't give a shit about the previous belief system; not that all conspiracies will necessarily turn into beliefs about masks and vax.
Well that and the preponderance of cutting off hypothetical family members.
It seems to me there has been an uptick in people wrongly assuming someone else's attempts to fix a problem are all down to social signalling.
This is interesting. It used to be that some of the people we're discussing, let's use flat-earthers as an example, were just really, really committed to their idea. It was their whole identity and persona, and they would just never be able to accept anything different. This seems to me to be the opposite of what you describe. It's like they were terrified of being wrong, so they embraced not admitting it no matter what. They would never accept that they had been wrong. (With a few exceptions - I've engaged with A LOT of flat-earthers and some of them do give me the impression they just like playing devil's advocate, etc.)
Lately though, QAnon seems to have embraced this idea. They'll make statements that are very quickly falsifiable (for instance, I saw a post claiming Nancy Pelosi had been arrested for high crimes and Trump was now making his move, but a few days later obviously that wasn't true), and will actually say "Who cares if it's not true - we just support patriotism / freedom, etc., and what part of that do you disagree with?" That's what I call a lack of fear of the consequences of being wrong. But to me it seems a new phenomenon.
1. Most people make two trips or fewer to a dealership before buying a car
2. when picking a doctor, many individuals use recommendations from friends and family rather than consulting other health care professionals or “formal sources” such as employers, articles or Web sites
And, of course, the last person I would look to for objective advice on buying a car is a car salesman! Surely people have a good idea what they are going to buy, what they are prepared to spend and how they are expecting to finance it etc before going to a dealership for the car?
Ditto for the doctor; if you are searching for a doctor, do you go and cold call other doctors for an opinion? Or read the blurb on a website provided by or sponsored by the employer? It's much more straightforward to ask people you know which doctors they recommend.
Is the subtext of the starting examples meant to be that people should defer to car salesmen and should ask doctors, rather than friends and family, to recommend other doctors?
1. We wanted a used Honda Civic. We found a used Honda Civic online, at a good price relative to Blue Book. We went to the dealership and bought the Honda Civic.
2. We wanted a used Honda CR-V. We found a used Honda CR-V online, at a good price relative to Blue Book. We went to the dealership and bought the Honda CR-V.
Both cars have been great. No need to overcomplicate things.
Stepping back, I've noticed a certain cognitive bias that's very common, but I don't know what its name might be. Basically, the bias is the idea that spending more time gathering information and analyzing is always good. To which I respond... maybe in some limited, abstract, ceteris paribus sense, but in real life? Not really.
1. You're not taking account of the opportunity cost, the things you could be doing that you're not because you're stuck on this one thing.
2. There is such a thing as overthinking / analysis paralysis, where your thinking actually gets worse the more you obsess about something. You may be better off unplugging, taking a walk, and coming back with a clear mind. You may be able to make a quick decision that's reasonably close to optimal and ends the sinking of your time into this one particular thing.
Then they will go on a gold-digging exercise to prove that they are smarter than those pesky dealers.
In the end, when the thing they bought turns out crappy (no gold in that mine :)), they will get defensive about it and pull all kinds of mental gymnastics to confirm that what they did was worth the time invested.
So such a person won't even feel the opportunity cost, because they will make themselves believe the effort was necessary to find that good a deal, and that they 'gamed the system'.
2. There was exactly one Honda dealership within driving distance in the state where I wanted to buy the vehicle
3. I acquired this vehicle.
Later on, I was told that this was actually a Japanese Honda, and not an American Honda, and that the allegation was that although the same design, the Japanese Honda had superior reliability records and less variance around fittings, because the Japanese teams had a great deal of experience around assembly (at that time). More than 20 years later I am still driving the same vehicle, and I've had no major issues of any type.
>1. You're not taking account of the opportunity cost, the things you could be doing that you're not because you're stuck on this one thing
I do appreciate the point you made - there is an opportunity cost to your thinking, and overly focusing on one thing may be counterproductive. If I have learned anything from thinking about SV business models, it is that attention is a limited resource. Every moment we spend overanalyzing one thing is time not spent on something more productive.
> There is such a thing as overthinking / analysis paralysis, where your thinking actually gets worse the more you obsess about something. You may be better off unplugging, taking a walk, and coming back with a clear mind. You may be able to make a quick decision that's reasonably close to optimal and ends the sinking of your time into this one particular thing.
"paralysis by analysis" is definitely a real thing. Although I believe it is important to go through the cognitive exercise of analysis, it can be counter-productive. Saw this at a workplace or two where budgets to execute projects were highly limited, so staff would analyze-to-death options for fairly low dollar activities. Spent more on analysis of paper projects by a factor, than actual project execution.
(a) I hate car shopping, in particular any interactions with the salespeople
(b) I have the internet
(c) I don't believe there is some special deal I can unlock; I think that is a mistaken belief that many dealers like people to hold so they can make them feel like they're getting a special deal, when in reality the "negotiations" are just a bit of theatre
(d) I value my time, and it's not worth a few hundred dollars to drive all over looking for deals
A typical buyer negotiates for a car only a few times in their life.
Who do you think is likely to come out on top?
Now the extra costs come in on accessories, service plans, and warranties, whereas they used to make more profit on the actual sales.
We just bought a vehicle in March, and it was the best, least friction experience buying a car. I found the package I wanted, I found the color I wanted, I evaluated the price against online sources such as KBB, truecar, and those types of things, then I contacted the dealer who had it in stock online and set a date to come buy it. They didn't try any weird old-timey car sales tactics, because the price is the price. They offered extras, add-ons, the warranties, and service packages, with clear prices, and were respectful when I declined.
It's a tough problem to solve - wait time is certainly part of the solution, but so is bedside manner, the doctor's willingness to punt and refer you elsewhere for unique problems outside their area, price, location, and a hundred other things.
And that doesn't even address picking a specialist. You break your hand, how do you pick a hand surgeon on short notice? The cost of picking the worst guy could be high, but you know what they call the guy who graduated last in his class? Doctor.
Even something like a broken bone falls under the category of "normal" health issue. It happens all the time. You will be fine with a perfectly mediocre doctor dealing with your broken bone.
As for bedside manner, I'm mostly with you - it's largely irrelevant. For me, it's more of a minimum acceptable level - as long as they're above it, all is good.
You make an excellent point, and I will point out a related anecdote.
A plastic surgeon in my wife's practice fell into a glass table, severing a number of nerves and tendons in his dominant wrist and arm. Basically a potential career-ending injury. Short of an amputation, that is one of the worst injuries for a surgeon.
He did not go to any regular hand surgeon, or plastic surgeon that may have done a fellowship in hand. Nope. He got driven to Hopkins where they have one of the best teams in the Mid-Atlantic for complex, nerve involved hand & arm related injuries. A year or so later and he is completely recovered.
Some of the other leading honor-roll hospitals like Mass General, Cleveland Clinic, Mayo, etc. have leaders in their field, and the collections of gurus needed for really complex injuries where the best outcomes are critical. Regular surgeons refer the super hard cases to them, because they often provide the best care possible in the US.
>a mediocre doctor can deal with a broken bone
interprets that as
>there are zero cases of broken bones that are extremely complex requiring highly specialized doctors
I'm really curious to know if you believe that is how humans communicate.
Your "gotcha" response of a relatively rare case of a hand fracture requiring surgery with a specialist is not a meaningful response.
The best GP I have had admitted when she didn't know, but knew how to search, and searched together with me or gave pointers for further experimentation and troubleshooting. Her wait times suck because she does this. I love it.
Re: manners. This can be so different. I had an Indian colleague complain to me once about a really bad doctor. The doctor apparently told her mother that she had cancer. I was dumbfounded why that would be bad (she did actually have cancer). Apparently the doctor was supposed to tell a family member about it but not the mother.
I very clearly told her that if a doctor did tell a family member but not me I would be really really mad. Never mind doctor patient confidentiality rules.
She didn't seem to understand.
Same goes for doctors. I need a recommendation for somebody in my area that I can work with. If a close friend likes a doctor there is probably a decent chance I will like them as well. I can also google them and things like that before deciding to go. It also isn’t a lifetime commitment. If I don’t like them I can change doctors very easily.
I think what is missing from their comment is that people go to do those things without doing prior research. In the case where you go to the car salesman once, you have already done your research - same with the doctor.
I know people who would exhibit the behavior without doing the prior research. I am not sure if they believe the car salesman or not, I can't read their mind.
Some people are comfortable making hasty decisions. Just because a person visited only one car dealer does not suggest a lack of prior research. A person may know exactly what they want and the price range they are willing to pay for it with features they want. The last time I bought a car I went to one dealer and told them I want these features. They called around to various other dealers on my behalf because they wanted my business and they knew what to look for.
Time to action stresses quick decisions, which are not necessarily hasty (made with no planning or research). The goal of a well-considered time to action is to act quickly, but in balance with risk analysis. This means either having a remediation already available, or transferring the risk to someone else in order to make a decision now, consequences be damned.
In other situations it makes complete sense. If you have $1M, losing $10 isn't nearly as big a deal as when those $10 were all you had.
If you try to apply game theory to economics you apply a utility function to money to model this, and a logarithmic function maps quite well to how humans think about money.
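A minimal sketch of that idea (assuming plain log utility, which is one common modeling choice, not the only one; the numbers echo the $1M vs. $10 example above):

```python
import math

def utility(wealth: float) -> float:
    """Hypothetical logarithmic utility of wealth: u(w) = ln(w)."""
    return math.log(wealth)

# Utility lost by a $10 loss at two very different wealth levels.
rich = utility(1_000_000) - utility(1_000_000 - 10)  # tiny: barely felt
poor = utility(20) - utility(10)                     # ln(2): wealth halved
print(rich, poor)
```

The same dollar loss costs the poorer person about 70,000 times more utility under this model, which matches the intuition in the comment above.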
Not necessarily, since many people buy coffee way more often than they buy a car.
Buying a car is all about understanding that dealers are usually pretty desperate.
I wouldn't think this was an unusual pattern for many buyers in the internet age.
One thing that hits me is conspiracy theories. Although I don't buy into any of the currently circulating ones (COVID deliberately invented by the Chinese, mind-control chips and 5G, faked Moon landings), I understand the people who do. Actually, a hidden link between cell phones and cancer was a very viable idea through the mid-2000s, when cell phones became just too numerous: if the link were real, it would have become impossible to hide. (That's about when I heard the theory a lot; it gradually subsided, and I haven't heard it raised in any serious manner in the last 10 years.) After all, we know they quite deliberately denied the addictiveness of tobacco. They denied harm from leaded gasoline and lead water pipes. There is so much nasty crap authorities concealed or actively denied because they made money on it. They still deny how dangerous and addictive sugar is. Many of them still deny global warming.
I have to keep telling myself "It's probably not true," but if it ever turns out to be proven, I'd be shouting "I KNEW IT!"
This kind of logic is defensible: decisions about one's life can almost never be made on the basis of mathematically rigorous information, because you very rarely come across that kind of information in day-to-day life. To suggest that the mainstream consensus is always accurate is to deny past history and present reality. Whatever your religion, you can always point to incredibly large groups of people who believe things you perceive to be incorrect.
But the affirmers are all people in white lab-coats, and many people automatically distrust anyone in a white lab-coat. No "ordinary people" went to the moon and witnessed it themselves. As far as I know, even the biggest Earth-based telescopes can't resolve the lunar lander, and certainly no amateur stargazer could pick it out. So all the affirmative testimony comes from a group of people who were all working for the same team.
So if you don't trust that team, there's no reliable evidence against the CT. (There's no evidence for it either; but if you don't trust the people who know, then to doubt seems rational)
I imagine you could just as easily make a study "people who look at fewer partitions of datasets reach wrong conclusions more often". Just as you can sample more fish from a lake and see if there's a pattern in their color, you can try to look at more perspectives and see if there's a pattern one way or the other.
The term is often used to refer to implausible "theories" that make very strong claims, but have zero supporting evidence and make zero testable predictions.
Of course, it's true that people conspire all the time.
Implausible "theories" that make strong claims are exciting and entertaining. They're fun to play with. They become a problem when they go viral, among groups of people with limited critical thinking skills.
The internet has been responsible for this, by democratising information. Anyone who failed at school can still look up shifty research papers online, misunderstand them, and not know how to evaluate the evidence.
This is a social pandemic, and I have no idea what the cure is. But I'm certain the cure isn't more censorship.
> you can always point to incredibly large groups of people who believe some facts that you perceive to be incorrect.
Hardly any propositions about the world are "mathematically rigorous" - there's nearly always a big chunk of judgement needed to decide the truth of a proposition. That doesn't mean that all judgements have equal value; some beliefs are "wronger" than others. There seem to be large groups of people with opinions that are screamingly, obviously, completely wrong, and as close to 100% irrational as makes no difference.
Some "conspiracy theories" lack adequate supporting evidence, but are at least arguable. For example, it's a fact that in the past, groups of people have been given vaccines that were deliberately contaminated by the manufacturer. It's therefore not unreasonable to ask for evidence that this hasn't happened in the case of e.g. COVID vaccines.
I prefer to reserve the term "conspiracy theory" for that class of opinions that is obviously completely batshit.
Agreed, and to expand on this: I believe another problem is that these conspiracy theorists also believe they have secret knowledge that the man doesn't want you to have. On top of that, they get to feel smarter about something than the folks with education and success.
This makes them an easy target for real conspiracies, conspiracies by folks who merely want to take their money. See, for example, the anti-GMO movement and specifically the anti-GMO seed packs you can buy.
I'm against GMO. Not because I believe retail GMO food is likely to be harmful; I mean, harmful foods could be made, but we have testing.
I'm against GMO because (a) a lot of GMO plants have been engineered to be grown with heavy use of herbicides and pesticides; (b) GMO crops make a farm non-organic, and I'm in favour of organic farming for environmental reasons; and (c) GMO companies lobby incessantly against food labelling. If a food supplier doesn't want to label my food, then absent any other excuse, I presume there's something they don't want me to know (yes, I read food labels in supermarkets. I carry a credit-card-sized magnifier in my wallet, for that purpose).
What is an "anti-GMO seed pack"? Would that just be a pack containing seeds that are not genetically modified?
That's true whether GMOs are safe or not.
Water companies would also lobby against "Warning: heavy water" and Cisco would lobby against "gigahertz radiation" labels, but not because they have something to hide.
No it doesn't.
But without a label of the form "contains GM ingredients", I am being deprived of choice about what I eat. It's reasonable to expect people to care a lot about what they eat.
> Water companies would also lobby against "heavy water" labels
If you mean labelling ordinary water as "heavy" because it contains a tiny amount of deuterium, I'm not sure that's analogous. Even if it is, I'm not sure that saying "they would say that, wouldn't they" makes their arguments against labelling more persuasive.
You should still be able to avoid the product for environmental reasons, or just because, but of course they don't think so. (HN's favorite Sinclair quote about men and salaries and understanding.)
This is why Occam’s razor is such a useful tool. Suppose your conspiracy is correct. Now how many things do you have to explain to make this conspiracy work? If it’s too many, maybe your ideas are wrong and you need a simpler answer.
I think that it's easy to fall into a false dichotomy between 'everything conventional is true' and 'conspiracy theories are real'.
Sometimes conspiracy theorists defend themselves by asking things like 'do you trust the government/corporations/church...?', as if I must either trust everything a government has ever said or believe (e.g.) that it's harbouring secret ET bases. Or that no doctor has ever lied, or else they're all hiding the secret cancer cure.
And yeah, maybe I am missing something that's real. The issue is that there is such a deluge of crap that no one can possibly know every theory. You could spend months or years researching just one of them (9/11, crisis actors, QAnon, the moon landing, telepathy, etc.).
We may just be talking about different types of conspiracy theories though, as mentioned by the other responder.
And yet much of life is dominated by such cases.
Great ball players jump all the time.
Hunters jump or go hungry.
Soldiers jump or get jumped.
Financial traders jump all day long. It doesn't mean they don't study at night, but during the day, they don't spend time pondering trades that disappear as fast as a gap in traffic.
Maitre d's jump.
Taxi drivers jump.
Fork lift drivers jump.
Customs inspectors jump.
Not everyone jumps.
Engineers work things out. But they still have to jump sometimes before the factory explodes or the heat shield disintegrates or the oxygen runs out.
One recent insight from the political theater is to determine what your audience is skeptical of. Everyone is skeptical of something, and these biases, once pinpointed, can be illuminating.
On any topic you can split skepticism into groups. Are you more skeptical of government? The pharmaceutical industry? Corporations? Which?
They're a manifestation of biased skepticism. Conspiracy theorists tend to be skeptical of everything except their own favorite conspiracy theories.
There is also a massive assumption in this whole thing: that investing time to study or deliberate actually increases positive outcomes. You'd need to show that first, and then see whether the time spent is well allocated against the possible upside.
How many fish do you need to catch to be relatively confident you've covered the 51% possibility? 2-3? 9-10? I'm sure there's a mathematical solution, but my gut tells me I'd want more than 2-3. Maybe as many as 10 or 20, depending on how frequently fish are caught.
If you're doing multiple experiments, you can treat p(redLake) as your belief and update it by multiplying by p(redFish|redLake)/p(redFish) every time you see a red fish, and by p(greyFish|redLake)/p(greyFish) whenever you see a grey one. (Note that p(redFish) itself depends on your current belief, so it's often cleaner to track the odds instead.)
If one lake is 51% red and the other 51% grey, it's easiest to track the odds: each red fish multiplies your odds in favour of the red lake by 0.51/0.49 ≈ 1.04, and each grey fish divides them by the same factor.
Assuming you start at 50-50: after one red fish, you can be 51% confident in the red lake. If you've seen 5 more red fish than grey, about 55%; 10 more, about 60%; 30 more, about 77%. To reach 90% confidence you'd need roughly 55 more red fish than grey.
If the proportion of fish in each lake is more substantial, your bayesian update is larger and your confidence increases faster. If each lake is 90% one color, a single piece of evidence should give you 90% confidence.
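That update is easy to sanity-check in code (a rough sketch; `posterior_red_lake` and its parameter names are mine, while the 51% and 90% figures come from the example above):

```python
def posterior_red_lake(net_red: int, p_red: float = 0.51, prior: float = 0.5) -> float:
    """Posterior probability that we're at the mostly-red lake after
    seeing `net_red` more red fish than grey, via Bayes in odds form."""
    odds = (prior / (1 - prior)) * (p_red / (1 - p_red)) ** net_red
    return odds / (1 + odds)

print(round(posterior_red_lake(1), 2))              # 0.51
print(round(posterior_red_lake(10), 2))             # 0.6
print(round(posterior_red_lake(1, p_red=0.90), 2))  # 0.9
```

With nearly even lakes the evidence accumulates slowly; with 90/10 lakes, a single fish already pushes you to 90%.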
I guess the true conclusion-jumpers don't think they are.
David Dunning strikes again!
I’m curious if there’s a correlation between type 1 jumpers and people who are more entrepreneurial risk takers.
Most criminals do not have it either.
You know, this really isn't a valid experiment as described. If you do the probability calculations, then 2-3 draws is all you need. But all of that is based on the assumption that drawing one color is high probability, >90%.
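Under that >90% assumption the numbers do check out quickly (a quick odds-form Bayes calculation, assuming hypothetical 90/10 lakes and a 50-50 prior):

```python
# Each red fish multiplies the odds in favour of the red lake by 0.9/0.1 = 9
# (assuming each lake is 90% one color and a 50-50 starting belief).
for n in range(1, 4):
    odds = 9 ** n
    print(n, round(odds / (1 + odds), 3))
# 1 0.9
# 2 0.988
# 3 0.999
```

So two or three same-colored draws already give near-certainty, which is why the experiment only works when the lakes are lopsided.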
This is exactly how distributors do defect testing on incoming lots of goods.