<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>belief formation &#8211; Spencer Greenberg</title>
	<atom:link href="https://www.spencergreenberg.com/tag/belief-formation/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.spencergreenberg.com</link>
	<description></description>
	<lastBuildDate>Sat, 26 Oct 2024 03:24:01 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2024/05/cropped-icon.png?fit=32%2C32&#038;ssl=1</url>
	<title>belief formation &#8211; Spencer Greenberg</title>
	<link>https://www.spencergreenberg.com</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">23753251</site>	<item>
		<title>A major (overlooked) reason why smart people fall for stupid things</title>
		<link>https://www.spencergreenberg.com/2024/09/a-major-overlooked-reason-why-smart-people-fall-for-stupid-things/</link>
					<comments>https://www.spencergreenberg.com/2024/09/a-major-overlooked-reason-why-smart-people-fall-for-stupid-things/#comments</comments>
		
		<dc:creator><![CDATA[Admin]]></dc:creator>
		<pubDate>Fri, 13 Sep 2024 12:47:50 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[belief formation]]></category>
		<category><![CDATA[epistemics]]></category>
		<category><![CDATA[mentors]]></category>
		<category><![CDATA[mentorship]]></category>
		<category><![CDATA[naivete]]></category>
		<category><![CDATA[recommendations]]></category>
		<category><![CDATA[scams]]></category>
		<category><![CDATA[social learning]]></category>
		<category><![CDATA[social proof]]></category>
		<category><![CDATA[trusting]]></category>
		<category><![CDATA[trust]]></category>
		<guid isPermaLink="false">https://www.spencergreenberg.com/?p=4117</guid>

					<description><![CDATA[Why do smart people fall for stupid things? Here is what I think is an important part of the answer that almost never gets discussed. It&#8217;s easy to look around at the stupid seeming things that other people believe (e.g., people who join harmful cults, get scammed by a con artist, become vocal evangelists for [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Why do smart people fall for stupid things? Here is what I think is an important part of the answer that almost never gets discussed.</p>



<p>It&#8217;s easy to look around at the stupid-seeming things that other people believe (e.g., people who join harmful cults, get scammed by a con artist, become vocal evangelists for a placebo treatment, or jump on the hype train of some outrageous new bubble) and wonder: &#8220;How on earth can they be so dumb?&#8221;</p>



<p>The answer, much of the time, is simply the trust they have in someone else.<br>If such a person were to evaluate the bad idea itself &#8211; call it X &#8211; on its own merits, they may well see it as dumb, dangerous, or full of hot air.</p>



<p>Instead, someone that person sees as impressive and totally trustworthy (or someone they just really like and respect) tells them that X is the next big thing. Or that X will change their life. Or that X will make them rich. Or that X will solve a problem for them that they desperately want solved.</p>



<p>This puts their brain in a predicament. They can either believe:<br>(1) That this impressive person who they deeply trust is deceiving them<br>Or<br>(2) That this impressive person who they trust is right &#8211; and their life will be way better off because of it!</p>



<p>If their trust in the person is great enough, or, at least, greater than their level of skepticism, (2) may win them over simply for that reason.</p>



<p>But (2) may also win them over for one or more of these reasons:</p>



<ul class="wp-block-list">
<li>they so desperately want this to be real &#8211; they so want to be special, or rich, or to have their biggest problems finally solved</li>



<li>it&#8217;s difficult and painful to believe this person they trust so much is deceiving them or so wrong about something important</li>



<li>they sense it will damage the relationship if they refuse to believe, and they care deeply about the relationship</li>



<li>they have a hard time saying &#8216;no&#8217; &#8211; perhaps it makes them very anxious to do so</li>
</ul>



<p>In other words, there are a great many dumb things that even smart people end up believing in simply because people believe people. To be clear, this is not the only mechanism by which smart people fall for dumb things. Being smart is not the same as acting rationally all the time. But this trust-based force is, I think, an important mechanism.</p>



<p>While a belief in others is wonderful and admirable in many instances, it can also be a chink in our skepticism and rationality. It can lead us to believe in crazy and dangerous things that we wouldn&#8217;t be likely to believe without that trust. We see this when people get scammed by their favorite influencer or when they become true believers in quack cures because they have a friend who says it changed their life.</p>



<p>While this effect often happens when one person we trust causes us to believe in X, the effect is magnified when more people around us believe. Being recruited into a harmful cult by a trusted friend can be difficult to resist, but leaving a cult &#8211; at which point all of our close friends are believers in X &#8211; is far more difficult. And growing up in an authoritarian regime &#8211; where EVERYONE we&#8217;ve met seems to believe in X &#8211; makes X that much harder to resist.</p>



<p>When rationality is discussed, it&#8217;s often talked about at the level of the individual. But quite a bit of our thinking we necessarily outsource to others &#8211; we can&#8217;t make sense of everything ourselves. When we allow someone into our circle of trust who doesn&#8217;t deserve to be there, that can jeopardize our rationality. Hence, an important meta-skill of rationality is knowing who to trust &#8211; and not being suckered into trusting those who don&#8217;t deserve it.</p>



<p>Almost everyone is susceptible to this phenomenon of being duped because of our trust in people, but that doesn&#8217;t mean there&#8217;s nothing we can do to avoid it.</p>



<p>One thing that I think helps is to treat trust as multi-factor: I can trust a person in one way or in one domain but not another. Put another way, earning trust is multi-dimensional. I can see someone as trustworthy because:</p>



<ul class="wp-block-list">
<li>I know they wouldn&#8217;t betray me and that they care a lot about me</li>



<li>I know that they vet evidence carefully, come to their beliefs in a rigorous way, and approach new information skeptically</li>



<li>I know that they are extremely knowledgeable about a specific topic area</li>
</ul>



<p>Being strong in one of these domains doesn&#8217;t automatically make someone strong in another. So, viewing someone as trustworthy in one of these domains shouldn&#8217;t cause you to view them as trustworthy in the others. And yet, that&#8217;s what many people do.</p>



<p>If you track trust in a one-dimensional way, it puts you at a lot of risk because someone you trust may have a very bad idea that they really want you to believe in. It may be hard to reject that idea because you trust them so much &#8211; and that may mean joining a harmful cult, buying into the peak of the next bubble, putting stock in an ineffective treatment, or being scammed.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><em>This piece was first written on September 13, 2024, and first appeared on my website on September 22, 2024.</em></p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2024/09/a-major-overlooked-reason-why-smart-people-fall-for-stupid-things/feed/</wfw:commentRss>
			<slash:comments>2</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4117</post-id>	</item>
		<item>
		<title>Conducting Instantaneous Experiments</title>
		<link>https://www.spencergreenberg.com/2024/08/conducting-instantaneous-experiments/</link>
					<comments>https://www.spencergreenberg.com/2024/08/conducting-instantaneous-experiments/#comments</comments>
		
		<dc:creator><![CDATA[Admin]]></dc:creator>
		<pubDate>Sat, 24 Aug 2024 11:54:00 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[Bayesian reasoning]]></category>
		<category><![CDATA[belief formation]]></category>
		<category><![CDATA[belief updating]]></category>
		<category><![CDATA[continual learning]]></category>
		<category><![CDATA[epistemics]]></category>
		<category><![CDATA[evidence]]></category>
		<category><![CDATA[experiments]]></category>
		<category><![CDATA[incremental evidence]]></category>
		<category><![CDATA[likelihood]]></category>
		<category><![CDATA[likelihood ratios]]></category>
		<category><![CDATA[posterior]]></category>
		<category><![CDATA[proof]]></category>
		<category><![CDATA[updating]]></category>
		<guid isPermaLink="false">https://www.spencergreenberg.com/?p=4166</guid>

					<description><![CDATA[Have a hypothesis about the world, society, human nature, physics, or anything else that nobody has directly tested before? It might seem like conducting a costly experiment would be required to find out whether it&#8217;s true. But a lot of the time, you can check your hypothesis easily using what I call an &#8220;Instantaneous Experiment.&#8221; [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Have a hypothesis about the world, society, human nature, physics, or anything else that nobody has directly tested before? It might seem like conducting a costly experiment would be required to find out whether it&#8217;s true. But a lot of the time, you can check your hypothesis easily using what I call an &#8220;Instantaneous Experiment.&#8221;</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p>How to do an Instantaneous Experiment:</p>



<p><strong>Step 1:</strong> Think of anything checkable about the world that is likely to be true if your hypothesis is true, but likely to be false if your hypothesis is false.</p>



<p>Important: this checkable thing should be something that you have never investigated before &#8211; in other words, you don&#8217;t actually know if it&#8217;s true, and the only real reason you think it&#8217;s true is just because your hypothesis implies it would be. This is critical to help prevent bias from occurring during the process (for instance, this procedure doesn&#8217;t work if the fact you are checking is one that influenced your development of the hypothesis).</p>



<p><strong>Step 2:</strong> Go check whether the checkable thing is true or not by trying to look the answer up (e.g., in an article or paper)!</p>



<p>The amount of evidence that the answer provides in favor of (or against) your hypothesis depends precisely on how many times more likely you are to see that result if your hypothesis is true compared to if it&#8217;s not true. The bigger that number is, the greater the evidence!</p>



<p>Instantaneous Experiments work because, to get evidence for a theory or hypothesis, it is not necessary to directly check whether that thing is true. All you have to do is check something that is implied by that theory (that would be unlikely to be true otherwise).</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p>Here&#8217;s an example:</p>



<p>Suppose you believe that &#8220;greater intelligence causes people to worry a lot more.&#8221;</p>



<p>That&#8217;s very hard to test. But you can do an Instantaneous Experiment:</p>



<p>Step 1: if intelligence causes worry, then you might expect higher IQ people to agree more often with a statement like &#8220;I worry too much,&#8221; whereas if the theory is not true, you wouldn&#8217;t expect a positive correlation between IQ and agreement with that statement.</p>



<p>Step 2: We go check this, and we find a paper that measures both IQ and the level of agreement with the statement &#8220;I worry too much.&#8221; The correlation between them is essentially 0.</p>



<p>Result: We haven&#8217;t completely disproven the theory, but we should now reduce our confidence in it compared to what we thought before.</p>



<p>How much we reduce our confidence depends on how many times less likely we&#8217;d be to find no correlation between self-reported worry and IQ if our hypothesis &#8220;greater intelligence causes people to worry a lot more&#8221; is true, compared to if it&#8217;s false.</p>
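<p>The odds form of Bayes&#8217; rule makes this update concrete. The sketch below uses purely illustrative numbers &#8211; a 60% prior, and a judgment that a flat correlation is four times likelier if the hypothesis is false; neither figure comes from the essay:</p>

```python
def update_odds(prior_odds, likelihood_ratio):
    """Posterior odds = prior odds * likelihood ratio (Bayes' rule in odds form)."""
    return prior_odds * likelihood_ratio

def odds_to_probability(odds):
    """Convert odds back into a probability."""
    return odds / (1 + odds)

# Illustrative assumptions only:
prior = 0.60              # initial confidence that the hypothesis is true
likelihood_ratio = 1 / 4  # a near-zero correlation is 4x likelier if the hypothesis is false

posterior_odds = update_odds(prior / (1 - prior), likelihood_ratio)
posterior = odds_to_probability(posterior_odds)
print(round(posterior, 3))  # -> 0.273: confidence drops from 0.60
```

<p>The same two functions apply to any Instantaneous Experiment; the only judgment call is the likelihood ratio &#8211; how many times more (or less) likely the observed result is if the hypothesis is true than if it is false.</p>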



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><em>This piece was first written on August 24, 2024, and first appeared on my website on October 11, 2024.</em></p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2024/08/conducting-instantaneous-experiments/feed/</wfw:commentRss>
			<slash:comments>1</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4166</post-id>	</item>
		<item>
		<title>Soldier Altruists vs. Scout Altruists</title>
		<link>https://www.spencergreenberg.com/2021/04/soldier-altruists-vs-scout-altruists/</link>
					<comments>https://www.spencergreenberg.com/2021/04/soldier-altruists-vs-scout-altruists/#respond</comments>
		
		<dc:creator><![CDATA[admin]]></dc:creator>
		<pubDate>Fri, 23 Apr 2021 22:44:00 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[belief formation]]></category>
		<category><![CDATA[beliefs]]></category>
		<category><![CDATA[causal mechanisms]]></category>
		<category><![CDATA[conflict]]></category>
		<category><![CDATA[conflict theory]]></category>
		<category><![CDATA[corruption]]></category>
		<category><![CDATA[curiosity]]></category>
		<category><![CDATA[effort]]></category>
		<category><![CDATA[evidence-based action]]></category>
		<category><![CDATA[ideological blindspots]]></category>
		<category><![CDATA[ignorance]]></category>
		<category><![CDATA[inertia]]></category>
		<category><![CDATA[mistake theory]]></category>
		<category><![CDATA[political will]]></category>
		<category><![CDATA[rationality]]></category>
		<category><![CDATA[reasoning]]></category>
		<category><![CDATA[scaling]]></category>
		<category><![CDATA[science]]></category>
		<category><![CDATA[scout]]></category>
		<category><![CDATA[scout mindset]]></category>
		<category><![CDATA[selfishness]]></category>
		<category><![CDATA[soldier mindset]]></category>
		<category><![CDATA[systemic problems]]></category>
		<category><![CDATA[testing]]></category>
		<category><![CDATA[theory]]></category>
		<category><![CDATA[updating]]></category>
		<category><![CDATA[warm glow]]></category>
		<guid isPermaLink="false">https://www.spencergreenberg.com/?p=2571</guid>

					<description><![CDATA[There is an important division between people who want to improve the world that few seem to be aware of. Inspired by Julia Galef&#8217;s new book (The Scout Mindset), I&#8217;ll call this division:&#160;Soldier Altruists vs. Scout Altruists. 1. Soldier Altruists&#160;think it&#8217;s obvious how to improve the world and that we just need to execute those [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>There is an important division between people who want to improve the world that few seem to be aware of. Inspired by Julia Galef&#8217;s new book (<em>The Scout Mindset</em>), I&#8217;ll call this division:&nbsp;<strong>Soldier Altruists vs. Scout Altruists</strong>.</p>



<p><strong>1. Soldier Altruists&nbsp;</strong>think it&#8217;s obvious how to improve the world and that we just need to execute those obvious steps. They see the barriers to a better world as:</p>



<p>(i) not enough people taking action (e.g., due to ignorance, selfishness, or propaganda), and</p>



<p>(ii) bad groups blocking things (e.g., corrupt politicians or greedy corporations).</p>



<p><strong>2. Scout Altruists</strong>&nbsp;think it&#8217;s hard to figure out how to improve the world &#8211; and most attempts either don&#8217;t work, only slightly help, or make things worse. They see the barriers to a better world as:</p>



<p>(i) not enough understanding of causal mechanisms (e.g., due to a lack of high-quality evidence, not enough attention to the evidence we do have, not enough careful reasoning, etc.), and</p>



<p>(ii) too much investment in bad solutions (e.g., due to people jumping to conclusions, doing what feels good emotionally rather than what is effective, ideological blindspots, inertia, etc.).</p>



<hr class="wp-block-separator"/>



<p>Soldier Altruists say we need to DO and FIGHT more. Scout Altruists say we need to THINK and TEST more. Soldier Altruists are more likely to think that if we could just get people to be less selfish and more motivated to act, we would make a lot of progress towards a better world. Scout Altruists are more likely to think that if we could just get people to pay more attention to evidence and to have more good-faith debates with strong norms around the quality of argumentation, we would make a lot more progress.</p>



<p>Soldier Altruists may think Scout Altruists are far too reluctant to act and are wasting their time on research and debate. Scout Altruists may think Soldier Altruists are far too confident in their conclusions and are wasting their effort pushing for changes that aren&#8217;t going to help much (and which, in some cases, might even make things worse). Of course, in reality, there is a continuum between these two positions. So, on a scale from 0 (Soldier Altruist) to 10 (Scout Altruist), where do you fall? I&#8217;m probably a 7 or 8.</p>



<hr class="wp-block-separator"/>



<p>As some&nbsp;<a target="_blank" href="https://www.facebook.com/spencer.greenberg/posts/10105808551163702?__cft__[0]=AZXHoevvmvsz4tG6r-SoVZBGVxOdM6ixkZlhisrLVXQTX4VrTiFr5pCm004f4o9J6rQCOqPDSCsRwLT3miKvR3_6STsnjnpvPqH2WkzvtWHbM6eXvssfOziyDsDq1oFu1Pg&amp;__tn__=%2CO%2CP-R" rel="noreferrer noopener">commenters</a>&nbsp;have pointed out, there is a relationship between this distinction and &#8220;Conflict Theory&#8221; vs. &#8220;Mistake Theory.&#8221; I think it is related &#8211; but also distinct in important ways. Conflict theory says that there is a giant zero-sum struggle (groups fighting over fixed resources), whereas in this case we&#8217;re operating from a framework of altruism: &#8220;the world can be made a lot better &#8211; what&#8217;s the big barrier to that happening? Is it that we know what to do and we&#8217;re not doing it enough/with enough energy, or is it that we don&#8217;t really know what to do?&#8221;</p>



<p>Also, to clarify another important point brought up in the&nbsp;<a rel="noreferrer noopener" target="_blank" href="https://www.facebook.com/spencer.greenberg/posts/10105808551163702?__cft__[0]=AZXHoevvmvsz4tG6r-SoVZBGVxOdM6ixkZlhisrLVXQTX4VrTiFr5pCm004f4o9J6rQCOqPDSCsRwLT3miKvR3_6STsnjnpvPqH2WkzvtWHbM6eXvssfOziyDsDq1oFu1Pg&amp;__tn__=%2CO%2CP-R">comments</a>: I&#8217;m not asking, &#8220;do you think it&#8217;s obvious how we should improve the world if we had a magic wand that could change whatever we wanted?&#8221; &#8211; instead, the question is: &#8220;is it obvious what to do to improve the real world, given that we don&#8217;t have a magic wand?&#8221; Do we just need to put more money/time/effort/people into executing the current &#8220;obvious&#8221; strategies because they will work well if we just scale them up? Or is it pretty unclear what strategies we should even be putting more resources into (meaning that a lot of thinking, research, debate and/or evidence evaluation will typically be necessary to even figure out what is worth scaling up)?</p>



<hr class="wp-block-separator"/>



<p>Julia&#8217;s book (which I highly recommend): <a href="https://www.amazon.com/Scout-Mindset-Perils-Defensive-Thinking/dp/0735217556/ref=nodl_">https://www.amazon.com/Scout-Mindset-Perils-Defensive-Thinking/dp/0735217556/ref=nodl_</a></p>



<p><em>This piece was first written on April 23rd, 2021, and first appeared on this site on January 7th, 2022.</em></p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2021/04/soldier-altruists-vs-scout-altruists/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2571</post-id>	</item>
	</channel>
</rss>
