<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>motivated reasoning &#8211; Spencer Greenberg</title>
	<atom:link href="https://www.spencergreenberg.com/tag/motivated-reasoning/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.spencergreenberg.com</link>
	<description></description>
	<lastBuildDate>Tue, 14 Jan 2025 17:03:27 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2024/05/cropped-icon.png?fit=32%2C32&#038;ssl=1</url>
	<title>motivated reasoning &#8211; Spencer Greenberg</title>
	<link>https://www.spencergreenberg.com</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">23753251</site>	<item>
		<title>Trusting the science</title>
		<link>https://www.spencergreenberg.com/2024/11/trusting-the-science/</link>
					<comments>https://www.spencergreenberg.com/2024/11/trusting-the-science/#comments</comments>
		
		<dc:creator><![CDATA[Admin]]></dc:creator>
		<pubDate>Wed, 20 Nov 2024 15:35:00 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[antiintellectualism]]></category>
		<category><![CDATA[bias]]></category>
		<category><![CDATA[dichotomous thinking]]></category>
		<category><![CDATA[distrust]]></category>
		<category><![CDATA[fake]]></category>
		<category><![CDATA[fraud]]></category>
		<category><![CDATA[fraudulent science]]></category>
		<category><![CDATA[importance hacking]]></category>
		<category><![CDATA[motivated reasoning]]></category>
		<category><![CDATA[nuanced thinking]]></category>
		<category><![CDATA[p-hacking]]></category>
		<category><![CDATA[polarization]]></category>
		<category><![CDATA[pragmatism]]></category>
		<category><![CDATA[replication crisis]]></category>
		<category><![CDATA[science]]></category>
		<category><![CDATA[social science]]></category>
		<category><![CDATA[variability]]></category>
		<guid isPermaLink="false">https://www.spencergreenberg.com/?p=4249</guid>

					<description><![CDATA[Is it a bad idea to broadly tell people to just &#8220;trust the science&#8221;? I think so. The reason stems from my thinking that all of the following are important and true (and too often overlooked) regarding science: 1) A lot of science is real AND valuable to society. 2) A lot of &#8220;science&#8221; is [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Is it a bad idea to broadly tell people to just &#8220;trust the science&#8221;? I think so.</p>



<p>The reason stems from my thinking that all of the following are important and true (and too often overlooked) regarding science:</p>



<p>1) A lot of science is real AND valuable to society.</p>



<p>2) A lot of &#8220;science&#8221; is actually fake &#8211; see, for instance, a decent percentage of papers in psychology 15 years ago.</p>



<p>3) &#8220;Science&#8221; (as an approach to knowledge discovery) is one of humanity&#8217;s greatest inventions &#8211; but in practice, it is reasonably often misapplied, or the process is distorted due to bad incentives or poor training. Unfortunately, not all fields of science have done a good job of being self-correcting either, so sometimes, fields go in bad directions for quite a while and need reform. There are different kinds of bad science:</p>



<p>(i) Sometimes, science is &#8220;bad&#8221; because it uses unsound methods for figuring out the truth (such as when p-hacking is rampant).</p>



<p>(ii) Sometimes it is &#8220;bad&#8221; because it overclaims (e.g., &#8220;Importance Hacking,&#8221; where scientists claim they found something important or valuable when their study didn&#8217;t actually demonstrate what they claim, or cases where science is used to &#8220;prove&#8221; questions that can&#8217;t be settled by science &#8211; such as which policy is better in a particular context when the answer actually hinges on a tradeoff between different values).</p>



<p>(iii) Other times science is bad because it is biased (e.g., when people are only willing to run or publish studies that show X but not that show the opposite of X).</p>



<p>(iv) And sometimes science is bad because it&#8217;s simply fraudulent.</p>



<p>4) Promoting broad &#8220;trust the science&#8221; is misguided (and actually harmful) because a bunch of science is fake. If you tell people to always just &#8220;trust the science,&#8221; then you are going to cause them to be tricked by a bunch of bad science, or you are going to contribute to their disillusionment and loss of trust when they discover (correctly) that some of the science you&#8217;re saying is good is actually garbage.</p>



<p>5) The &#8220;distrust all science&#8221; view is probably an even worse take than &#8220;trust the science.&#8221; If you distrust all science, you are likely to miss out on incredible things (such as highly effective treatments), and you set yourself up to fall for tons of things that don&#8217;t work (e.g., widely used unscientific treatments). Those who tell people to always just &#8220;trust the science&#8221; sometimes accidentally push people into the &#8220;distrust all science&#8221; view when those people realize that some of what they are being told to trust is crap.</p>



<p>6) So, hard as it is, rather than promoting either &#8220;trust all science&#8221; or &#8220;distrust all science,&#8221; the course of action I believe in with regard to science education is to teach people that &#8220;Science&#8221; (as a method) is an incredibly powerful and useful invention, but that &#8220;science&#8221; (as actually practiced) is much like every other field: some of it is good, some of it is crap. There are good hairdressers and bad hairdressers, and there is good science and bad science (and unfortunately, some bad science ends up in the very top journals &#8211; while journals and peer review absolutely do block some bad science, they unfortunately still let through quite a lot of it).</p>



<p>Since some science is well done and some of it is poorly done, it&#8217;s very valuable to learn to tell the difference so you can make the best use of scientific results &#8211; both in applying them in your own life and in using them to form your beliefs about the world.</p>



<p>If we pretend science is all good or all bad, we do a lot of harm. We need nuance to see through the bad stuff while maintaining the tremendous benefits.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><em>This piece was first written on November 20, 2024, and first appeared on my website on January 14, 2025.</em></p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2024/11/trusting-the-science/feed/</wfw:commentRss>
			<slash:comments>1</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4249</post-id>	</item>
		<item>
		<title>Three motivations for believing</title>
		<link>https://www.spencergreenberg.com/2024/04/three-motivations-for-believing/</link>
					<comments>https://www.spencergreenberg.com/2024/04/three-motivations-for-believing/#respond</comments>
		
		<dc:creator><![CDATA[admin]]></dc:creator>
		<pubDate>Sat, 20 Apr 2024 14:04:00 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[addiction]]></category>
		<category><![CDATA[belief]]></category>
		<category><![CDATA[delusional]]></category>
		<category><![CDATA[delusions]]></category>
		<category><![CDATA[denial]]></category>
		<category><![CDATA[dopamine]]></category>
		<category><![CDATA[epistemics]]></category>
		<category><![CDATA[hedonism]]></category>
		<category><![CDATA[hope]]></category>
		<category><![CDATA[motivated reasoning]]></category>
		<category><![CDATA[optimism]]></category>
		<category><![CDATA[pragmatism]]></category>
		<category><![CDATA[present bias]]></category>
		<category><![CDATA[rationalism]]></category>
		<category><![CDATA[religion]]></category>
		<category><![CDATA[self-sabotage]]></category>
		<category><![CDATA[utility]]></category>
		<category><![CDATA[values]]></category>
		<category><![CDATA[wishful thinking]]></category>
		<guid isPermaLink="false">https://www.spencergreenberg.com/?p=3929</guid>

					<description><![CDATA[There are three different motivations for belief, and it&#8217;s important to distinguish between them.&#160; 1) Belief because you think something&#8217;s true. For instance, you may think that the evidence supports the idea that you will eventually find love, or you may feel convinced by logical arguments you&#8217;ve heard in favor of god&#8217;s existence. 2) Belief [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>There are three different motivations for belief, and it&#8217;s important to distinguish between them.&nbsp;</p>



<p><strong>1) Belief because you think something&#8217;s true.</strong></p>



<p>For instance, you may think that the evidence supports the idea that you will eventually find love, or you may feel convinced by logical arguments you&#8217;ve heard in favor of god&#8217;s existence.</p>



<p><strong>2) Belief because you think it&#8217;s useful to believe.&nbsp;</strong></p>



<p>Regardless of whether you predict something&#8217;s true, you can predict that believing it will be more helpful than harmful to you in the long term, and so be motivated to believe for that pragmatic benefit.</p>



<p>For instance, you may intuit that you&#8217;ll be better off long-term believing that you will eventually find love (because that will make love more likely) or perceive that you&#8217;ll be happier believing in god (even if it turns out there is no god).</p>



<p><strong>3) Belief because it feels good in the moment.&nbsp;</strong></p>



<p>Regardless of whether it&#8217;s true or helpful to you in the long term, you may be motivated to believe something because it feels good right now (or prevents you from feeling bad).&nbsp;</p>



<p>For instance, you may feel comforted right now by thinking you&#8217;ll eventually find love, or feel good in the moment believing a god is watching over you.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong>Rationalists&nbsp;</strong>typically recommend striving to have your beliefs be of type 1: believing based on what&#8217;s most likely to be true.</p>



<p><strong>Pragmatists</strong>&nbsp;often recommend aiming for type 2 beliefs: believing based on what&#8217;s ultimately most useful to you.</p>



<p>I favor striving to have type 1 beliefs rather than type 2 beliefs, in part because I intrinsically value truth, but also because I think that for beliefs in category 2 that are *not* actually true, there are typically some beliefs in category 1 that will help you just as much, but which&nbsp;have the advantage of&nbsp;also&nbsp;being true.&nbsp;So often (but not always), there is a low cost to replacing beliefs from 2 with beliefs from 1 that have the added benefit of being true.</p>



<p>I also think that if you allow yourself&nbsp;to indiscriminately hold type 2 beliefs, it makes it hard to suddenly switch to rigorous truth-oriented thinking when it&#8217;s important to figure out the truth (e.g.,&nbsp;when you have to make a very important decision based on evidence).</p>



<p>On the other hand, many people have lots of type 3 beliefs, and all of us, myself included, have some type 3 beliefs. Whether you think that type 1 or type 2 beliefs are ultimately preferable, I think a valuable aspiration is to replace some of our type 3 beliefs with either 1s or 2s.</p>



<p>It&#8217;s very, very easy for us humans to delude ourselves based on what it feels good to believe at the moment because the reward cycle is so fast. Type 3 beliefs are immediately rewarding, incentivizing more such beliefs. But they are like the social media addiction version of believing, where you pursue what gives the greatest instantaneous reward rather than what&#8217;s actually good for you.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><em>This piece was first written on April 20, 2024, and first appeared on my website on May 7, 2024.</em></p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2024/04/three-motivations-for-believing/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3929</post-id>	</item>
		<item>
		<title>Why is Confirmation Bias So Common?</title>
		<link>https://www.spencergreenberg.com/2021/05/why-is-confirmation-bias-so-common/</link>
					<comments>https://www.spencergreenberg.com/2021/05/why-is-confirmation-bias-so-common/#respond</comments>
		
		<dc:creator><![CDATA[Admin]]></dc:creator>
		<pubDate>Wed, 05 May 2021 14:37:00 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[confirmation bias]]></category>
		<category><![CDATA[echo chambers]]></category>
		<category><![CDATA[epistemic humility]]></category>
		<category><![CDATA[fear of errors]]></category>
		<category><![CDATA[irrationality]]></category>
		<category><![CDATA[motivated reasoning]]></category>
		<category><![CDATA[overconfidence]]></category>
		<category><![CDATA[rationality]]></category>
		<category><![CDATA[scout mindset]]></category>
		<category><![CDATA[soldier mindset]]></category>
		<guid isPermaLink="false">https://www.spencergreenberg.com/?p=2204</guid>

					<description><![CDATA[Written: May 5, 2021 &#124; Released: June 18, 2021 People often talk about what a problem &#8220;confirmation bias&#8221; is. But we rarely discuss what causes so many of us to search for information in a biased way. Let&#8217;s explore some of the forces: 1. Echo chambers:&#160;our routine sources of information tend to support our worldview. [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p><em>Written: May 5, 2021 | Released: June 18, 2021</em></p>



<p>People often talk about what a problem &#8220;confirmation bias&#8221; is. But we rarely discuss what causes so many of us to search for information in a biased way.</p>



<p>Let&#8217;s explore some of the forces:</p>



<p><strong>1. Echo chambers:&nbsp;</strong>our routine sources of information tend to support our worldview. Much of this is due to social ties (we tend to talk to people who are similar to us in age, geography, religion, etc.). We also trust news sources more if they share our basic ideology/assumptions. The authorities we look to will be the ones that agree with us on most of our fundamental assumptions (even if some of these assumptions could turn out to be wrong).</p>



<p><strong>2. Soldier Mindset:&nbsp;</strong>as&nbsp;<a href="https://juliagalef.com/" target="_blank" rel="noreferrer noopener">Julia Galef</a>&nbsp;explains in her wonderful new book (<em>The Scout Mindset</em>), we are often not even TRYING to figure out the truth. We&#8217;re just trying to beat the other side or prove a point. In these cases, of course we have a biased search process.</p>



<p><strong>3. Lack of doubt:&nbsp;</strong>when we&#8217;re really confident our basic premises are correct, we don&#8217;t see the need for a nuanced information search process. We&#8217;re going to go to whatever sources are convenient for filling in minor details, even if our beliefs have major unquestioned assumptions.</p>



<p><strong>4. Advocacy:</strong>&nbsp;others are actively trying to get us to believe certain falsehoods. They do so through ads, websites, news, and other channels. For the most part, they themselves believe these falsehoods (promoting &#8220;the truth&#8221; as they see it), but occasionally it&#8217;s pure manipulation.</p>



<p><strong>5. Fear of being wrong:&nbsp;</strong>it hurts to be wrong, especially if we&#8217;ve made the error publicly, have our identity tied up in the belief, or it challenges our understanding of the world or who to trust. We sometimes avoid finding out we are wrong the way we avoid touching a hot stove.</p>



<p>We don&#8217;t always seek out information in a biased way. For instance, when looking up driving directions or trying to figure out what paint to use to prevent water damage, we want the right answer, don&#8217;t have political biases, and usually have appropriate self-doubt and skepticism.</p>



<p>But we are liable to have a biased search for the truth when we are incentivized to have particular beliefs, such as when our social world supports just one perspective, when we&#8217;re trying to prove the other side wrong, when we have no doubt that we are right, when powerful others are devoting a lot of effort to persuade us, or when we&#8217;re too afraid of being wrong.</p>



<p>For more about this topic, you may want to check out <a href="https://clearerthinkingpodcast.com/?ep=036">my recent podcast episode with Julia Galef</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2021/05/why-is-confirmation-bias-so-common/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2204</post-id>	</item>
	</channel>
</rss>
