<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>epistemics &#8211; Spencer Greenberg</title>
	<atom:link href="https://www.spencergreenberg.com/tag/epistemics/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.spencergreenberg.com</link>
	<description></description>
	<lastBuildDate>Sat, 26 Oct 2024 03:24:01 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2024/05/cropped-icon.png?fit=32%2C32&#038;ssl=1</url>
	<title>epistemics &#8211; Spencer Greenberg</title>
	<link>https://www.spencergreenberg.com</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">23753251</site>	<item>
		<title>A major (overlooked) reason why smart people fall for stupid things</title>
		<link>https://www.spencergreenberg.com/2024/09/a-major-overlooked-reason-why-smart-people-fall-for-stupid-things/</link>
					<comments>https://www.spencergreenberg.com/2024/09/a-major-overlooked-reason-why-smart-people-fall-for-stupid-things/#comments</comments>
		
		<dc:creator><![CDATA[Admin]]></dc:creator>
		<pubDate>Fri, 13 Sep 2024 12:47:50 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[belief formation]]></category>
		<category><![CDATA[epistemics]]></category>
		<category><![CDATA[mentors]]></category>
		<category><![CDATA[mentorship]]></category>
		<category><![CDATA[naivete]]></category>
		<category><![CDATA[recommendations]]></category>
		<category><![CDATA[scams]]></category>
		<category><![CDATA[social learning]]></category>
		<category><![CDATA[social proof]]></category>
		<category><![CDATA[trusting]]></category>
		<category><![CDATA[trust]]></category>
		<guid isPermaLink="false">https://www.spencergreenberg.com/?p=4117</guid>

					<description><![CDATA[Why do smart people fall for stupid things? Here is what I think is an important part of the answer that almost never gets discussed. It&#8217;s easy to look around at the stupid seeming things that other people believe (e.g., people who join harmful cults, get scammed by a con artist, become vocal evangelists for [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Why do smart people fall for stupid things? Here is what I think is an important part of the answer that almost never gets discussed.</p>



<p>It&#8217;s easy to look around at the stupid-seeming things that other people believe (e.g., people who join harmful cults, get scammed by a con artist, become vocal evangelists for a placebo treatment, or jump on the hype train of some outrageous new bubble) and wonder: &#8220;How on earth can they be so dumb?&#8221;</p>



<p>The answer, a lot of the time, is simply the trust they have in someone else.<br>If a person were to evaluate the bad idea itself &#8211; call it X &#8211; on its own merits, they may well see it as dumb, dangerous, or full of hot air.</p>



<p>Instead, someone that person sees as impressive and totally trustworthy (or someone they just really like and respect) tells them that X is the next big thing. Or that X will change their life. Or that X will make them rich. Or that X will solve a problem for them that they desperately want solved.</p>



<p>This puts their brain in a predicament. They can either believe:<br>(1) That this impressive person who they deeply trust is deceiving them<br>or<br>(2) That this impressive person who they trust is right &#8211; and their life will be way better off because of it!</p>



<p>If their trust in the person is great enough, or, at least, greater than their level of skepticism, (2) may win them over simply for that reason.</p>



<p>But (2) may also win them over for one or more of these reasons:</p>



<ul class="wp-block-list">
<li>they so desperately want this to be real &#8211; they so want to be special, or rich, or to have their biggest problems finally solved</li>



<li>it&#8217;s difficult and painful to believe this person they trust so much is deceiving them or so wrong about something important</li>



<li>they sense it will damage the relationship if they refuse to believe, and they care deeply about the relationship</li>



<li>they have a hard time saying &#8216;no&#8217; &#8211; perhaps it makes them very anxious to do so</li>
</ul>



<p>In other words, there are a great many dumb things that even smart people end up believing in simply because people believe people. To be clear, this is not the only mechanism by which smart people fall for dumb things. Being smart is not the same as acting rationally all the time. But this trust-based force is, I think, an important mechanism.</p>



<p>While a belief in others is wonderful and admirable in many instances, it can also be a chink in our skepticism and rationality. It can lead us to believe in crazy and dangerous things that we wouldn&#8217;t be likely to believe without that trust. We see this when people get scammed by their favorite influencer or when they become true believers in quack cures because they have a friend who says it changed their life.</p>



<p>While this effect often happens when one person we trust causes us to believe in X, it is magnified when more people around us believe. Resisting recruitment into a harmful cult by a trusted friend can be difficult, but leaving a cult &#8211; at which point all of our close friends are believers in X &#8211; is far more difficult. And growing up in an authoritarian regime &#8211; where EVERYONE we&#8217;ve met seems to believe in X &#8211; makes X that much harder to resist.</p>



<p>When rationality is discussed, it&#8217;s often talked about at the level of the individual. But quite a bit of our thinking we necessarily outsource to others &#8211; we can&#8217;t make sense of everything ourselves. When we allow someone into our circle of trust who doesn&#8217;t deserve to be there, that can jeopardize our rationality. Hence, an important meta-skill of rationality is knowing who to trust &#8211; and not being suckered into trusting those who don&#8217;t deserve it.</p>



<p>Almost everyone is susceptible to this phenomenon of being duped because of our trust in people, but that doesn&#8217;t mean there&#8217;s nothing we can do to avoid it.</p>



<p>One thing that I think helps is to treat trust as being multi-factor. I can trust a person in one way or in one domain but not another. Or, put another way, earning trust is multi-dimensional. I can see someone as trustworthy because:</p>



<ul class="wp-block-list">
<li>I know they wouldn&#8217;t betray me and that they care a lot about me</li>



<li>I know that they vet evidence carefully, come to their beliefs in a rigorous way, and approach new information skeptically</li>



<li>I know that they are extremely knowledgeable about a specific topic area</li>
</ul>



<p>Being strong in one of these domains doesn&#8217;t automatically make someone strong in another. So, viewing someone as trustworthy in one of these domains shouldn&#8217;t cause you to view them as trustworthy in the other ones. And yet, that&#8217;s what many people do.</p>



<p>If you track trust in a one-dimensional way, it puts you at a lot of risk because someone you trust may have a very bad idea that they really want you to believe in. It may be hard to reject that idea because you trust them so much &#8211; and that may mean joining a harmful cult, buying into the peak of the next bubble, putting stock in an ineffective treatment, or being scammed.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><em>This piece was first written on September 13, 2024, and first appeared on my website on September 22, 2024.</em></p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2024/09/a-major-overlooked-reason-why-smart-people-fall-for-stupid-things/feed/</wfw:commentRss>
			<slash:comments>2</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4117</post-id>	</item>
		<item>
		<title>Conducting Instantaneous Experiments</title>
		<link>https://www.spencergreenberg.com/2024/08/conducting-instantaneous-experiments/</link>
					<comments>https://www.spencergreenberg.com/2024/08/conducting-instantaneous-experiments/#comments</comments>
		
		<dc:creator><![CDATA[Admin]]></dc:creator>
		<pubDate>Sat, 24 Aug 2024 11:54:00 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[Bayesian reasoning]]></category>
		<category><![CDATA[belief formation]]></category>
		<category><![CDATA[belief updating]]></category>
		<category><![CDATA[continual learning]]></category>
		<category><![CDATA[epistemics]]></category>
		<category><![CDATA[evidence]]></category>
		<category><![CDATA[experiments]]></category>
		<category><![CDATA[incremental evidence]]></category>
		<category><![CDATA[likelihood]]></category>
		<category><![CDATA[likelihood ratios]]></category>
		<category><![CDATA[posterior]]></category>
		<category><![CDATA[proof]]></category>
		<category><![CDATA[updating]]></category>
		<guid isPermaLink="false">https://www.spencergreenberg.com/?p=4166</guid>

					<description><![CDATA[Have a hypothesis about the world, society, human nature, physics, or anything else that nobody has directly tested before? It might seem like conducting a costly experiment would be required to find out whether it&#8217;s true. But a lot of the time, you can check your hypothesis easily using what I call an &#8220;Instantaneous Experiment.&#8221; [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Have a hypothesis about the world, society, human nature, physics, or anything else that nobody has directly tested before? It might seem like conducting a costly experiment would be required to find out whether it&#8217;s true. But a lot of the time, you can check your hypothesis easily using what I call an &#8220;Instantaneous Experiment.&#8221;</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p>How to do an Instantaneous Experiment:</p>



<p><strong>Step 1:</strong> Think of anything at all about the world that is checkable and that is likely to be true if your hypothesis is true, but likely to be false if your hypothesis is false.</p>



<p>Important: this checkable thing should be something that you have never investigated before &#8211; in other words, you don&#8217;t actually know if it&#8217;s true, and the only real reason you think it&#8217;s true is just because your hypothesis implies it would be. This is critical to help prevent bias from occurring during the process (for instance, this procedure doesn&#8217;t work if the fact you are checking is one that influenced your development of the hypothesis).</p>



<p><strong>Step 2:</strong> Go check whether the checkable thing is true or not by trying to look the answer up (e.g., in an article or paper)!</p>



<p>The amount of evidence the answer provides for (or against) your hypothesis depends precisely on how many times more likely you are to see that result if your hypothesis is true than if it&#8217;s false. The bigger that ratio, the stronger the evidence!</p>



<p>Instantaneous Experiments work because, to get evidence for a theory or hypothesis, it is not necessary to directly check whether that thing is true. All you have to do is check something that is implied by that theory (that would be unlikely to be true otherwise).</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p>Here&#8217;s an example:</p>



<p>Suppose you believe that &#8220;greater intelligence causes people to worry a lot more.&#8221;</p>



<p>That&#8217;s very hard to test. But you can do an Instantaneous Experiment:</p>



<p>Step 1: If intelligence causes worry, then you might expect higher IQ people to agree more often with a statement like &#8220;I worry too much,&#8221; whereas if the theory is not true, you wouldn&#8217;t expect a positive correlation between IQ and agreement with that statement.</p>



<p>Step 2: We go check this, and we find a paper that measures both IQ and the level of agreement on the statement &#8220;I worry too much.&#8221; The correlation between them is essentially 0.</p>



<p>Result: We haven&#8217;t completely disproven the theory, but we should now reduce our confidence in it compared to what we thought before.</p>



<p>How much we reduce our confidence depends on how many times less likely we&#8217;d be to find no correlation between self-reported worry and IQ if our hypothesis &#8220;greater intelligence causes people to worry a lot more&#8221; is true, compared to if it&#8217;s false.</p>
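<p>This kind of update can be made concrete with the odds form of Bayes&#8217; rule. The following sketch is not from the original essay, and the prior (60%) and likelihood ratio (1/4) are purely illustrative numbers chosen for the example:</p>

```python
# Hedged sketch of a likelihood-ratio update for an "Instantaneous
# Experiment". All numbers below are illustrative assumptions.

def update_confidence(prior, likelihood_ratio):
    """Return posterior P(H | evidence) from prior P(H) and
    LR = P(evidence | H) / P(evidence | not H)."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Suppose you started 60% confident that intelligence causes worry,
# and you judge a near-zero IQ/worry correlation to be 4x more likely
# if the hypothesis is false than if it's true (LR = 1/4).
posterior = update_confidence(0.60, 0.25)
print(round(posterior, 3))  # 0.273
```

<p>A likelihood ratio below 1 (the result is likelier under the rival hypothesis) always pushes confidence down; a ratio above 1 pushes it up, and the further from 1, the stronger the evidence.</p>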



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><em>This piece was first written on August 24, 2024, and first appeared on my website on October 11, 2024.</em></p>



]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2024/08/conducting-instantaneous-experiments/feed/</wfw:commentRss>
			<slash:comments>1</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4166</post-id>	</item>
		<item>
		<title>How to spot real expertise</title>
		<link>https://www.spencergreenberg.com/2024/04/how-to-spot-real-expertise/</link>
					<comments>https://www.spencergreenberg.com/2024/04/how-to-spot-real-expertise/#respond</comments>
		
		<dc:creator><![CDATA[admin]]></dc:creator>
		<pubDate>Tue, 23 Apr 2024 13:33:35 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[belief]]></category>
		<category><![CDATA[calibration]]></category>
		<category><![CDATA[consensus]]></category>
		<category><![CDATA[epistemic humility]]></category>
		<category><![CDATA[epistemics]]></category>
		<category><![CDATA[evaluating evidence]]></category>
		<category><![CDATA[evidence]]></category>
		<category><![CDATA[expertise]]></category>
		<category><![CDATA[honesty]]></category>
		<category><![CDATA[humility]]></category>
		<category><![CDATA[intellectual humility]]></category>
		<category><![CDATA[judgment]]></category>
		<category><![CDATA[knowledge]]></category>
		<category><![CDATA[probability]]></category>
		<category><![CDATA[rationality]]></category>
		<category><![CDATA[scout mindset]]></category>
		<category><![CDATA[steelman]]></category>
		<category><![CDATA[steelmanning]]></category>
		<category><![CDATA[strawman]]></category>
		<category><![CDATA[uncertainty]]></category>
		<category><![CDATA[updating]]></category>
		<guid isPermaLink="false">https://www.spencergreenberg.com/?p=3902</guid>

					<description><![CDATA[Thanks go to Travis (from the Clearer Thinking team) for coauthoring this with me. This is a cross-post from Clearer Thinking. How can you tell who is a valid expert, and who is full of B.S.? On almost any topic of importance you can find a mix of valid experts (who are giving you reliable [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p><em>Thanks go to Travis (from the Clearer Thinking team) for coauthoring this with me.</em> <em>This is a cross-post from <a href="https://www.clearerthinking.org/post/how-to-spot-real-expertise?utm_source=ClearerThinking.org&amp;utm_campaign=a6a0ff049e-EMAIL_CAMPAIGN_FAKE_EXPERTISE&amp;utm_medium=email&amp;utm_term=0_f2e9d15594-b71c1a1f3d-%5BLIST_EMAIL_ID%5D&amp;mc_cid=a6a0ff049e&amp;mc_eid=dea552ccde">Clearer Thinking</a>. </em></p>



<p id="viewer-6ho89124">How can you tell who is a valid expert, and who is full of B.S.?</p>



<p id="viewer-toa9l129">On almost any topic of importance you can find a mix of valid experts (who are giving you reliable information) and false but confident-seeming &#8220;experts&#8221; (who are giving you misinformation). To make matters even more confusing, sometimes the fake experts even have very impressive credentials, and every once in a while, the real, genuine experts are entirely self-taught.</p>



<p id="viewer-nh6hz132">Here are 12 signs we look for in an expert to help us determine whether they are trustworthy.&nbsp;</p>



<h2 class="wp-block-heading" id="viewer-c5pf3134">1. They have deep factual knowledge</h2>



<p id="viewer-u4tmf136">Let’s start with the obvious: for most topics, a lot of factual knowledge is required before you can have genuine expertise. This means that a genuine expert will have an impressive command of the relevant (non-debated) facts on the topic of their expertise. Thankfully, it&#8217;s a lot easier to tell if an expert has a strong command of the non-debated facts than whether they are correct about more controversial claims.&nbsp;</p>



<h2 class="wp-block-heading" id="viewer-9rkmj138">2. They communicate their confidence levels</h2>



<p id="viewer-ikf4r140">Not all knowledge is equally well-established. Even theories that are widely accepted enjoy different levels of support from the relevant evidence. When an expert regularly pretends that all their claims are equally well-established, they demonstrate they are willing to make you believe something is certain when it isn&#8217;t.</p>



<p id="viewer-99oed142">It&#8217;s a good sign that someone treats their subject with the nuance expected of genuine expertise when they indicate how confident they are (e.g., &#8220;It&#8217;s been shown in many high-quality studies that…&#8221; or &#8220;My best guess is…&#8221;) and explain the limitations of the evidence they are using (e.g., &#8220;this is unfortunately based on just one study, but that is all that currently exists&#8221;).</p>



<h2 class="wp-block-heading" id="viewer-hoas5144">3. They admit not knowing</h2>



<p id="viewer-z5138146">Genuine experts also sometimes say that they don’t know the answer to a question, or that the answer is generally not known by anyone. This is important because every topic will have some unknowns, and no expert can know everything about a topic. Telling you when they don&#8217;t know is a sign that, when they say they <em>do</em>&nbsp;know, they actually do know.</p>



<h2 class="wp-block-heading" id="viewer-3bw8y150">4. They tell you to look at sources other than themselves</h2>



<p id="viewer-868ro152">This might happen when an expert doesn&#8217;t know the answer to a question, or when they want to help you go beyond the answer they can give you. Genuine experts don&#8217;t seek to be seen as the sole arbiter of knowledge or authority on a topic (seeking that status can be an indication that ego, rather than truth-seeking, is a primary motivation for them); instead, they encourage you to look at resources other than the ones they have produced.</p>



<h2 class="wp-block-heading" id="viewer-9grt2154">5. They use logic and evidence</h2>



<p id="viewer-wqk8m156">Anyone can use rhetorical devices like emotional appeals, no matter how wrong they are, but a well-reasoned argument that uses valid logic and strong evidence will tend to point toward truth. Or, put another way, using strong logic and strong evidence is easier to do when you&#8217;re right, whereas emotional appeals are no easier when you&#8217;re right than when you&#8217;re wrong.&nbsp;&nbsp;</p>



<h2 class="wp-block-heading" id="viewer-89qm4158">6. They cite high-quality evidence</h2>



<p id="viewer-98ifh160">Some evidence is much more reliable than other evidence, and those who rely on the less reliable kinds when the more reliable kinds exist probably aren&#8217;t doing the best job they can at figuring out the truth. For this reason, genuine experts cite high-quality evidence when it exists (e.g., looking at multiple randomized controlled trials for causal claims) rather than low-quality evidence (e.g., just talking about personal anecdotes), and when high-quality evidence doesn’t exist, they cite the highest quality evidence that does exist.</p>



<h2 class="wp-block-heading" id="viewer-xi60e162">7. They acknowledge the consensus</h2>



<p id="viewer-auj7t164">Consensus views among experts are more often correct than the idiosyncratic views of just one or two experts. The consensus will not always be right, of course, but often it will be the best understanding we have available. That’s why reliable experts are transparent about the degree to which their opinion differs from the majority of experts, provide reasoned explanations for any deviations, and they are cautious not to present fringe theories as mainstream. This shows a deep engagement with the topic of their expertise and also an adherence to ethical standards of honesty and accuracy in communication.</p>



<h2 class="wp-block-heading" id="viewer-oky7e166">8. They change their mind</h2>



<p id="viewer-6w5gt168">Genuine experts will change their minds about topics within their expertise in response to evidence and arguments. It&#8217;s hard to become an expert in something without having been wrong from time to time.</p>



<p id="viewer-xy6s3170">That means that anyone claiming to be an expert who has never changed their mind probably has not found and corrected their mistakes. Relatedly, changing one&#8217;s mind in response to evidence is also a sign of the epistemic humility associated with genuine expertise.</p>



<p id="viewer-h4l3f172">Of course, if someone has a long history of being wrong, that is evidence against them being a genuine expert, not in favor of it. But, since everyone makes some mistakes, if they make mistakes from time to time and then note they were wrong and improve their beliefs, that is a sign that they are following the evidence where it leads rather than continuing to believe what they do regardless of the evidence.</p>



<h2 class="wp-block-heading" id="viewer-94wkg174">9. They Steelman</h2>



<p id="viewer-1edoz176">When you ‘straw man’ an argument, you misrepresent or oversimplify someone else&#8217;s position to make it easier to attack or refute. Instead of dealing with the actual argument, you replace it with a weaker version that distorts the original point, which you then argue against. The opposite of this is called ‘steelmanning’, and it involves presenting the strongest possible version of an argument you’re objecting to, even if it&#8217;s more robust than the one originally presented. This approach aims to strengthen the opposing case in order to facilitate a more genuine and constructive debate.&nbsp;</p>



<p id="viewer-kib65178">The most reliable experts will accurately present the strongest arguments made by those that disagree with them while pointing out flaws in those arguments, rather than focusing on just weak arguments from the other side or just mocking the other side (including ad hominem attacks rather than focusing on the substance of the claims of the other side). This is important because knocking down a weak argument from the other side of a debate does little to show the other side is wrong; you have to refute the strongest claims of the other side to actually show they are wrong. Additionally, demonstrating a knowledge of the strongest arguments against your own position shows a deeper level of expertise than only understanding the opposing point of view at a superficial level.</p>



<h2 class="wp-block-heading" id="viewer-mmo40180">10. They clearly explain their reasons for believing</h2>



<p id="viewer-yh6ch182">The philosopher Daniel Dennett <a target="_blank" href="https://books.google.co.uk/books?id=C5pUnN1-vhcC&amp;pg=PT16&amp;dq=%22if+I+can%E2%80%99t+explain+something+I%E2%80%99m+doing+to+a+group+of+bright+undergraduates,+I+don%E2%80%99t+really+understand+it+myself%22&amp;redir_esc=y#v=onepage&amp;q=%22if%20I%20can%E2%80%99t%20explain%20something%20I%E2%80%99m%20doing%20to%20a%20group%20of%20bright%20undergraduates%2C%20I%20don%E2%80%99t%20really%20understand%20it%20myself%22&amp;f=false" rel="noreferrer noopener">has said</a>: “if I can’t explain something I’m doing to a group of bright undergraduates, I don’t really understand it myself.” This sentiment is echoed by philosopher John Searle, who said “In general, I feel if you can&#8217;t say it clearly you don&#8217;t understand it yourself.”&nbsp;</p>



<p id="viewer-spi6a186">When communicating with non-experts, genuine experts are often able to give clear, easy-to-follow (and, ideally, checkable) explanations for why they believe what they believe &#8211; without dumbing down the points. They avoid unnecessary jargon and technical language (which sounds smart but makes their arguments very difficult for their audience to follow). Not every genuine expert is able to do this, but the ability to do this well is a sign of genuine expertise. This is important because an expert who cannot explain their ideas clearly will end up requiring you to believe them based on their authority rather than engaging with the arguments themselves. And sometimes, people claiming to be experts will hide behind technical expertise and jargon so that you won&#8217;t notice that their arguments are actually weak.</p>



<h2 class="wp-block-heading" id="viewer-gs759188">11. They have a track record</h2>



<p id="viewer-11d0m190">Sometimes genuine experts will have track records of predictions or successes that you can check, and this provides direct evidence of their knowledge or skill. Unfortunately, this only applies in some fields &#8211; for example, to chess masters, martial artists who fight in tournaments, or experts who make public predictions about the economy or politics.</p>



<h2 class="wp-block-heading" id="viewer-pzysj192">12. They use multiple lenses</h2>



<p id="viewer-o5ipy194">The world is complex and multi-faceted, and any one simple theory is going to fail to explain a lot of what&#8217;s really going on. For this reason, genuine experts tend to look at problems from multiple frames and perspectives; they don&#8217;t act as though one way of looking at things solves all problems, or that one solution works for all problems, or that one simple theory explains everything.</p>



<p id="viewer-b0ax0196">So the next time you hear claims from an alleged expert on a topic that is important to you, you may want to consider: how many of these signs of expertise do they exhibit? You can use this checklist, considering if they:</p>



<ol class="wp-block-list">
<li>have deep factual knowledge</li>



<li>communicate their confidence levels</li>



<li>admit not knowing</li>



<li>tell you to look at sources other than themselves</li>



<li>use logic and evidence</li>



<li>cite high-quality evidence</li>



<li>acknowledge the consensus</li>



<li>change their mind</li>



<li>steelman</li>



<li>clearly explain their reasons for believing</li>



<li>have a track record</li>



<li>use multiple lenses</li>
</ol>



<p id="viewer-6liwv235">And if you&#8217;re seeking to be an expert in something yourself, you may want to ask yourself: &#8220;to what extent do I exhibit these traits?&#8221;</p>



<p>Being able to discern genuine expertise from B.S. requires good judgment. If you&#8217;d like to improve your skills at making accurate judgments, why not try our <a href="https://www.openphilanthropy.org/calibration">Calibrate Your Judgment tool</a>, created in partnership with Open Philanthropy.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><em>This piece first appeared on Clearer Thinking.org on April 16, 2024, and first appeared on my website on April 22, 2024.</em></p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2024/04/how-to-spot-real-expertise/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3902</post-id>	</item>
		<item>
		<title>Three motivations for believing </title>
		<link>https://www.spencergreenberg.com/2024/04/three-motivations-for-believing/</link>
					<comments>https://www.spencergreenberg.com/2024/04/three-motivations-for-believing/#respond</comments>
		
		<dc:creator><![CDATA[admin]]></dc:creator>
		<pubDate>Sat, 20 Apr 2024 14:04:00 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[addiction]]></category>
		<category><![CDATA[belief]]></category>
		<category><![CDATA[delusional]]></category>
		<category><![CDATA[delusions]]></category>
		<category><![CDATA[denial]]></category>
		<category><![CDATA[dopamine]]></category>
		<category><![CDATA[epistemics]]></category>
		<category><![CDATA[hedonism]]></category>
		<category><![CDATA[hope]]></category>
		<category><![CDATA[motivated reasoning]]></category>
		<category><![CDATA[optimism]]></category>
		<category><![CDATA[pragmatism]]></category>
		<category><![CDATA[present bias]]></category>
		<category><![CDATA[rationalism]]></category>
		<category><![CDATA[religion]]></category>
		<category><![CDATA[self-sabotage]]></category>
		<category><![CDATA[utility]]></category>
		<category><![CDATA[values]]></category>
		<category><![CDATA[wishful thinking]]></category>
		<guid isPermaLink="false">https://www.spencergreenberg.com/?p=3929</guid>

					<description><![CDATA[There are three different motivations for belief, and it&#8217;s important to distinguish between them.&#160; 1) Belief because you think something&#8217;s true. For instance, you may think that the evidence supports the idea that you will eventually find love, or you may feel convinced by logical arguments you&#8217;ve heard in favor of god&#8217;s existence. 2) Belief [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>There are three different motivations for belief, and it&#8217;s important to distinguish between them.&nbsp;</p>



<p><strong>1) Belief because you think something&#8217;s true.</strong></p>



<p>For instance, you may think that the evidence supports the idea that you will eventually find love, or you may feel convinced by logical arguments you&#8217;ve heard in favor of god&#8217;s existence.</p>



<p><strong>2) Belief because you think it&#8217;s useful to believe.&nbsp;</strong></p>



<p>Regardless of whether you predict something&#8217;s true, you can predict that believing it will be more helpful than harmful to you in the long term, and so be motivated to believe for that pragmatic benefit.</p>



<p>For instance, you may intuit that you&#8217;ll be better off long-term believing that you will eventually find love (because that will make love more likely) or perceive that you&#8217;ll be happier believing in god (even if it turns out there is no god).</p>



<p><strong>3) Belief because it feels good in the moment.&nbsp;</strong></p>



<p>Regardless of whether it&#8217;s true or helpful to you in the long term, you may be motivated to believe something because it feels good right now (or prevents you from feeling bad).&nbsp;</p>



<p>For instance, you may feel comforted right now by thinking you&#8217;ll eventually find love or feel good in the moment, believing a god is watching over you.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong>Rationalists&nbsp;</strong>typically recommend striving to have your beliefs be of type 1: believing based on what&#8217;s most likely to be true.</p>



<p><strong>Pragmatists</strong>&nbsp;often recommend aiming for type 2 beliefs: believing based on what&#8217;s ultimately most useful to you.</p>



<p>I favor striving to have type 1 beliefs rather than type 2 beliefs, in part because I intrinsically value truth, but also because I think that for beliefs in category 2 that are *not* actually true, there are typically some beliefs in category 1 that will help you just as much, but which&nbsp;have the advantage of&nbsp;also&nbsp;being true.&nbsp;So often (but not always), there is a low cost to replacing beliefs from 2 with beliefs from 1 that have the added benefit of being true.</p>



<p>I also think that if you allow yourself to hold type 2 beliefs indiscriminately, it becomes hard to switch to rigorous truth-oriented thinking when it&#8217;s important to figure out the truth (e.g., when you have to make a very important decision based on evidence).</p>



<p>On the other hand, many people have lots of type 3 beliefs, and all of us, myself included, have some type 3 beliefs. Whether you think that type 1 or type 2 beliefs are ultimately preferable, I think a valuable aspiration is to replace some of our type 3 beliefs with either 1s or 2s.</p>



<p>It&#8217;s very, very easy for us humans to delude ourselves based on what it feels good to believe in the moment because the reward cycle is so fast. Type 3 beliefs are immediately rewarding, incentivizing more such beliefs. But they are like the social media addiction version of believing, where you pursue what gives the greatest instantaneous reward rather than what&#8217;s actually good for you.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><em>This piece was first written on April 20, 2024, and first appeared on my website on May 7, 2024.</em></p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2024/04/three-motivations-for-believing/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3929</post-id>	</item>
		<item>
		<title>These epistemic methods really want you to trust them</title>
		<link>https://www.spencergreenberg.com/2020/11/these-epistemic-methods-really-want-you-to-trust-them/</link>
					<comments>https://www.spencergreenberg.com/2020/11/these-epistemic-methods-really-want-you-to-trust-them/#respond</comments>
		
		<dc:creator><![CDATA[Admin]]></dc:creator>
		<pubDate>Fri, 27 Nov 2020 01:50:00 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[circular reasoning]]></category>
		<category><![CDATA[circularity]]></category>
		<category><![CDATA[epistemics]]></category>
		<category><![CDATA[induction]]></category>
		<category><![CDATA[recursion]]></category>
		<category><![CDATA[recursive]]></category>
		<category><![CDATA[tautology]]></category>
		<guid isPermaLink="false">https://www.spencergreenberg.com/?p=2851</guid>

					<description><![CDATA[These epistemic methods really want you to trust them. Each tries to prove itself to you: 1. Tautologies are true by definition, &#8217;cause tautologies are true by definition. 😎 2. Induction worked in the past, so it probably will in the future. 😉 3. If deduction solves your problem, and you want it solved, then [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>These epistemic methods really want you to trust them. Each tries to prove itself to you: </p>



<p><strong>1. </strong><em><strong>Tautologies</strong> </em>are true by definition, &#8217;cause tautologies are true by definition. <img src="https://s.w.org/images/core/emoji/17.0.2/72x72/1f60e.png" alt="😎" class="wp-smiley" style="height: 1em; max-height: 1em;" />  </p>






<p><strong>2. <em>Induction</em></strong> worked in the past, so it probably will in the future. <img src="https://s.w.org/images/core/emoji/17.0.2/72x72/1f609.png" alt="😉" class="wp-smiley" style="height: 1em; max-height: 1em;" /></p>






<p><strong>3.</strong> If <strong><em>deduction</em></strong> solves your problem, and you want it solved, then you&#8217;ll want to use deduction! <img src="https://s.w.org/images/core/emoji/17.0.2/72x72/1f60a.png" alt="😊" class="wp-smiley" style="height: 1em; max-height: 1em;" /></p>






<p><strong>4.</strong> If you thought <strong><em>Bayesianism</em></strong> had 3:1 odds, and you think this sentence is 2x more likely if Bayesianism than if not Bayesianism, now you should give it 6:1 odds. <img src="https://s.w.org/images/core/emoji/17.0.2/72x72/1f920.png" alt="🤠" class="wp-smiley" style="height: 1em; max-height: 1em;" /></p>






<p><strong>5.</strong> If <strong><em>frequentism</em></strong> weren&#8217;t true, you&#8217;d have a low probability of reading a sentence as extreme as this one. <img src="https://s.w.org/images/core/emoji/17.0.2/72x72/1f913.png" alt="🤓" class="wp-smiley" style="height: 1em; max-height: 1em;" /></p>






<p><strong>6.</strong> You should <strong><em>trust your gut</em></strong> because your gut is telling you to trust it. <img src="https://s.w.org/images/core/emoji/17.0.2/72x72/1f618.png" alt="😘" class="wp-smiley" style="height: 1em; max-height: 1em;" /></p>






<p><strong>7. </strong>The theory that <strong><em>Occam&#8217;s Razor</em></strong> is true is simpler than the theory that it is false. <img src="https://s.w.org/images/core/emoji/17.0.2/72x72/1f914.png" alt="🤔" class="wp-smiley" style="height: 1em; max-height: 1em;" /></p>






<p><strong>8.</strong> The best explanation for why humans so often use <strong><em>abduction</em></strong> is that it helps them figure out the truth. <img src="https://s.w.org/images/core/emoji/17.0.2/72x72/1f9d0.png" alt="🧐" class="wp-smiley" style="height: 1em; max-height: 1em;" /></p>
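<p>(As an aside: the arithmetic in item 4 really is Bayes&#8217; rule in odds form, posterior odds = prior odds × likelihood ratio. A minimal sketch, using the 3:1 and 2x figures from that sentence; the function name is just illustrative.)</p>

```python
# Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio.
def update_odds(prior_odds: float, likelihood_ratio: float) -> float:
    """Return posterior odds after observing the evidence."""
    return prior_odds * likelihood_ratio

# Item 4's numbers: 3:1 prior odds, evidence 2x more likely under the hypothesis.
posterior = update_odds(3.0, 2.0)
print(posterior)  # 6.0, i.e. 6:1 odds
```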



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><em>If you liked this piece, you may also like <a href="https://www.spencergreenberg.com/2021/03/twelve-recursive-explanations/">Twelve Recursive Explanations</a>.</em></p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><em>This piece was first written on November 26, 2020, and first appeared on this site on August 12, 2022.</em></p>



]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2020/11/these-epistemic-methods-really-want-you-to-trust-them/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2851</post-id>	</item>
		<item>
		<title>Idea-Inducing Questions</title>
		<link>https://www.spencergreenberg.com/2020/08/idea-inducing-questions/</link>
					<comments>https://www.spencergreenberg.com/2020/08/idea-inducing-questions/#respond</comments>
		
		<dc:creator><![CDATA[admin]]></dc:creator>
		<pubDate>Sun, 02 Aug 2020 19:14:00 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[creativity]]></category>
		<category><![CDATA[discussion]]></category>
		<category><![CDATA[epistemics]]></category>
		<category><![CDATA[ideas]]></category>
		<category><![CDATA[learning]]></category>
		<category><![CDATA[personal reflection]]></category>
		<category><![CDATA[post]]></category>
		<category><![CDATA[questions]]></category>
		<category><![CDATA[unusual ideas]]></category>
		<category><![CDATA[writing]]></category>
		<guid isPermaLink="false">https://www.spencergreenberg.com/?p=2762</guid>

					<description><![CDATA[Struggling to come up with an idea for a blog post? Want to post ideas on social media but can&#8217;t think of what to write about? Want to come up with interesting topics for an intellectual discussion or meetup? Use my lists of &#8220;Idea-Inducing Questions&#8221; to generate nearly endless ideas to write about, think about, [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Struggling to come up with an idea for a blog post? Want to post ideas on social media but can&#8217;t think of what to write about? Want to come up with interesting topics for an intellectual discussion or meetup?</p>



<p>Use my lists of &#8220;Idea-Inducing Questions&#8221; to generate nearly endless ideas to write about, think about, or discuss!</p>



<hr class="wp-block-separator is-style-default"/>



<p><strong>Questions about learning and truth-seeking</strong></p>



<p>• Recently learned: What&#8217;s a powerful idea, concept, or mental model that you&#8217;ve been learning about recently that you think is worth knowing?</p>



<p>• Changed opinions: What is a strongly held belief you used to have that you changed your mind about? What caused you to change your mind? Why do you think you were wrong before?</p>



<p>• Influential book: What is the name of one book that substantially influenced the way you think about things? What did you learn from it that you can pass on to your audience?</p>



<p>• Debates: What&#8217;s something you disagree with a certain group of people on? What do you think is the core of the disagreement?</p>



<p>• Thorny problems: Is there a complex problem, situation, idea, concept, or set of competing ideas that you&#8217;re still trying to understand or figure out your opinion on? What factors are driving your opinion in different directions or making the issue tricky to figure out? What open questions or confusions do you still have about it?</p>



<p>• Third perspectives: For any pair of opposing ideas that most people in the public sphere take either one side or the other on, can you think of a third perspective or synthesis of both ideas that could actually be better than taking either side?</p>



<p>• Misconceptions: What&#8217;s a commonly believed idea that you think is actually wrong or a misconception?</p>



<p>• Underrated or overrated ideas: What&#8217;s a powerful or useful idea that you think is significantly underrated? Or conversely, what&#8217;s an idea that is talked about a lot in a positive light that you think is overrated or that isn&#8217;t actually a good idea?</p>



<p>• Epistemics: How do you think about what to believe versus what ideas to reject? How do you approach understanding hotly contested, thorny, or highly complex topics? What mental models or approaches do you use to help you think more clearly, analyze questions, or evaluate evidence?</p>



<hr class="wp-block-separator is-style-default"/>



<p><strong>Questions about ideas that have useful applications</strong></p>



<p>• Beneficial ideas: What&#8217;s an idea that, if it became widely known and adopted/used, would greatly improve the world?</p>



<p>• Versatile ideas: What powerful idea or concept do you think has many different useful applications across many life domains?</p>



<p>• Psychology tools: What&#8217;s a powerful idea, concept, mental model, or tool from psychology that you think is useful in people&#8217;s lives?</p>



<p>• Tools for making sense: What&#8217;s a powerful idea, concept, mental model, or tool that you think can help people better understand or make sense of the world?</p>



<p>• Scientific principles: What principle from a mathematical or scientific field (e.g., economics, statistics, evolutionary biology) do you think is important or valuable to know about (because it helps you understand the world or because there are applications of it to daily life)? How can you apply this idea in life?</p>



<hr class="wp-block-separator is-style-default"/>



<p><strong>Questions about your own ideas and experiences</strong></p>



<p>• Your ideas: What&#8217;s an idea you&#8217;ve come up with that you think would be valuable for your audience to hear about?</p>



<p>• Ideas you&#8217;ve applied: What&#8217;s an idea that you&#8217;ve found to be very useful or powerful in your own life? How have you applied it successfully?</p>



<p>• Unique experiences: What&#8217;s something you have experienced that very few people have experienced (whether it&#8217;s a good thing, a bad thing, or just something strange or surprising)? What did you learn from that experience?</p>



<p>• On your mind: What&#8217;s an idea you&#8217;ve been thinking about a lot lately? What are your current thoughts about it?</p>



<p>• Area of expertise: What topic are you very knowledgeable about? What is the most valuable idea from that field that you think many people would benefit from knowing about?</p>



<hr class="wp-block-separator is-style-default"/>



<p><strong>Questions about unusual or neglected ideas</strong></p>



<p>• Powerful obscure ideas: In your opinion, what is one of the most valuable or important ideas or concepts that most people don&#8217;t know about?</p>



<p>• Contrarian ideas: What&#8217;s something you disagree with most smart, educated people about (according to your own definition of smart and educated)? Or what&#8217;s your answer to the Thiel question: &#8220;What important truth do very few people agree with you on?&#8221;</p>



<p>• Overlooked areas: What&#8217;s a topic area that very few people have an opinion on or knowledge of, that you think it&#8217;s important to have an opinion on, or that is well worth learning about?</p>



<hr class="wp-block-separator is-style-default"/>



<p><em>This piece was first written on August 2, 2020, and first appeared on this site on May 27, 2022.</em></p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2020/08/idea-inducing-questions/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2762</post-id>	</item>
	</channel>
</rss>
