<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>rationality &#8211; Spencer Greenberg</title>
	<atom:link href="https://www.spencergreenberg.com/tag/rationality/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.spencergreenberg.com</link>
	<description></description>
	<lastBuildDate>Wed, 01 Apr 2026 14:53:39 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2024/05/cropped-icon.png?fit=32%2C32&#038;ssl=1</url>
	<title>rationality &#8211; Spencer Greenberg</title>
	<link>https://www.spencergreenberg.com</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">23753251</site>	<item>
		<title>How to spot real expertise</title>
		<link>https://www.spencergreenberg.com/2024/04/how-to-spot-real-expertise/</link>
					<comments>https://www.spencergreenberg.com/2024/04/how-to-spot-real-expertise/#respond</comments>
		
		<dc:creator><![CDATA[admin]]></dc:creator>
		<pubDate>Tue, 23 Apr 2024 13:33:35 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[belief]]></category>
		<category><![CDATA[calibration]]></category>
		<category><![CDATA[consensus]]></category>
		<category><![CDATA[epistemic humility]]></category>
		<category><![CDATA[epistemics]]></category>
		<category><![CDATA[evaluating evidence]]></category>
		<category><![CDATA[evidence]]></category>
		<category><![CDATA[expertise]]></category>
		<category><![CDATA[honesty]]></category>
		<category><![CDATA[humility]]></category>
		<category><![CDATA[intellectual humility]]></category>
		<category><![CDATA[judgment]]></category>
		<category><![CDATA[knowledge]]></category>
		<category><![CDATA[probability]]></category>
		<category><![CDATA[rationality]]></category>
		<category><![CDATA[scout mindset]]></category>
		<category><![CDATA[steelman]]></category>
		<category><![CDATA[steelmanning]]></category>
		<category><![CDATA[strawman]]></category>
		<category><![CDATA[uncertainty]]></category>
		<category><![CDATA[updating]]></category>
		<guid isPermaLink="false">https://www.spencergreenberg.com/?p=3902</guid>

					<description><![CDATA[Thanks go to Travis (from the Clearer Thinking team) for coauthoring this with me. This is a cross-post from Clearer Thinking. How can you tell who is a valid expert, and who is full of B.S.? On almost any topic of importance you can find a mix of valid experts (who are giving you reliable [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p><em>Thanks go to Travis (from the Clearer Thinking team) for coauthoring this with me.</em> <em>This is a cross-post from <a href="https://www.clearerthinking.org/post/how-to-spot-real-expertise?utm_source=ClearerThinking.org&amp;utm_campaign=a6a0ff049e-EMAIL_CAMPAIGN_FAKE_EXPERTISE&amp;utm_medium=email&amp;utm_term=0_f2e9d15594-b71c1a1f3d-%5BLIST_EMAIL_ID%5D&amp;mc_cid=a6a0ff049e&amp;mc_eid=dea552ccde">Clearer Thinking</a>. </em></p>



<p id="viewer-6ho89124">How can you tell who is a valid expert, and who is full of B.S.?</p>



<p id="viewer-toa9l129">On almost any topic of importance you can find a mix of valid experts (who are giving you reliable information) and false but confident-seeming &#8220;experts&#8221; (who are giving you misinformation). To make matters even more confusing, sometimes the fake experts even have very impressive credentials, and every once in a while, the real, genuine experts are entirely self-taught.</p>



<p id="viewer-nh6hz132">Here are 12 signs we look for in an expert to help us determine whether they are trustworthy.&nbsp;</p>



<h2 class="wp-block-heading" id="viewer-c5pf3134">1. They have deep factual knowledge</h2>



<p id="viewer-u4tmf136">Let’s start with the obvious: for most topics, a lot of factual knowledge is required before you can have genuine expertise. This means that a genuine expert will have an impressive command of the relevant (non-debated) facts on the topic of their expertise. Thankfully, it&#8217;s a lot easier to tell if an expert has a strong command of the non-debated facts than whether they are correct about more controversial claims.&nbsp;</p>



<h2 class="wp-block-heading" id="viewer-9rkmj138">2. They communicate their confidence levels</h2>



<p id="viewer-ikf4r140">Not all knowledge is equally well-established. Even theories that are widely accepted enjoy different levels of support from the relevant evidence. When an expert regularly pretends that all their claims are equally well-established, they demonstrate they are willing to make you believe something is certain when it isn&#8217;t.</p>



<p id="viewer-99oed142">It’s a good sign of genuine expertise when someone treats their subject with nuance: they indicate how confident they are (e.g., “It&#8217;s been shown in many high-quality studies that…”, or “My best guess is…”), and they explain limitations in the evidence they are using (e.g., “this is unfortunately based on just one study, but that is all that currently exists”).</p>



<h2 class="wp-block-heading" id="viewer-hoas5144">3. They admit not knowing</h2>



<p id="viewer-z5138146">Genuine experts also sometimes say that they don’t know the answer to a question, or that the answer is generally not known by anyone. This is important because every topic will have some unknowns, and no expert can know everything about a topic. Telling you when they don&#8217;t know is a sign that, when they say they <em>do</em>&nbsp;know, they actually do know.</p>



<h2 class="wp-block-heading" id="viewer-3bw8y150">4. They tell you to look at sources other than themselves</h2>



<p id="viewer-868ro152">This might happen when an expert doesn’t know the answer to a question, or when they want to help you go beyond the answer they can give you. Genuine experts don&#8217;t seek to be seen as the sole arbiter of knowledge or authority on a topic (which can be an indication that ego, rather than truth-seeking, is their primary motivation), but instead encourage you to look at resources other than the ones they have produced.</p>



<h2 class="wp-block-heading" id="viewer-9grt2154">5. They use logic and evidence</h2>



<p id="viewer-wqk8m156">Anyone can use rhetorical devices like emotional appeals, no matter how wrong they are, but a well-reasoned argument that uses valid logic and strong evidence will tend to point toward truth. Or, put another way, using strong logic and strong evidence is easier to do when you&#8217;re right, whereas emotional appeals are no easier when you&#8217;re right than when you&#8217;re wrong.&nbsp;&nbsp;</p>



<h2 class="wp-block-heading" id="viewer-89qm4158">6. They cite high-quality evidence</h2>



<p id="viewer-98ifh160">Some evidence is much more reliable than other evidence, and those who rely on the less reliable kinds when the more reliable kinds exist probably aren&#8217;t doing the best job they can at figuring out the truth. For this reason, genuine experts cite high-quality evidence when it exists (e.g., looking at multiple randomized controlled trials for causal claims) rather than low-quality evidence (e.g., just talking about personal anecdotes), and when high-quality evidence doesn’t exist, they cite the highest quality evidence that does exist.</p>



<h2 class="wp-block-heading" id="viewer-xi60e162">7. They acknowledge the consensus</h2>



<p id="viewer-auj7t164">Consensus views among experts are more often correct than the idiosyncratic views of just one or two experts. The consensus will not always be right, of course, but often it will be the best understanding we have available. That’s why reliable experts are transparent about the degree to which their opinion differs from the majority of experts, provide reasoned explanations for any deviations, and are cautious not to present fringe theories as mainstream. This shows a deep engagement with the topic of their expertise and also an adherence to ethical standards of honesty and accuracy in communication.</p>



<h2 class="wp-block-heading" id="viewer-oky7e166">8. They change their mind</h2>



<p id="viewer-6w5gt168">Genuine experts will change their minds about topics within their expertise in response to evidence and arguments. It’s hard to become an expert in something without having been wrong from time to time.</p>



<p id="viewer-xy6s3170">That means that anyone claiming to be an expert who has never changed their mind probably has not found and corrected their mistakes. Relatedly, changing one&#8217;s mind in response to evidence is also a sign of the epistemic humility associated with genuine expertise.</p>



<p id="viewer-h4l3f172">Of course, if someone has a long history of being wrong, that is evidence against them being a genuine expert, not in favor of it. But, since everyone makes some mistakes, if they make mistakes from time to time and then note they were wrong and improve their beliefs, that is a sign that they are following the evidence where it leads rather than continuing to believe what they do regardless of the evidence.</p>



<h2 class="wp-block-heading" id="viewer-94wkg174">9. They Steelman</h2>



<p id="viewer-1edoz176">When you ‘straw man’ an argument, you misrepresent or oversimplify someone else&#8217;s position to make it easier to attack or refute. Instead of dealing with the actual argument, you replace it with a weaker version that distorts the original point, which you then argue against. The opposite of this is called ‘steelmanning’, and it involves presenting the strongest possible version of an argument you’re objecting to, even if it&#8217;s more robust than the one originally presented. This approach aims to strengthen the opposing case in order to facilitate a more genuine and constructive debate.&nbsp;</p>



<p id="viewer-kib65178">The most reliable experts will accurately present the strongest arguments made by those who disagree with them while pointing out flaws in those arguments, rather than focusing on just the other side&#8217;s weak arguments or simply mocking the other side (resorting to ad hominem attacks rather than engaging with the substance of its claims). This is important because knocking down a weak argument from the other side of a debate does little to show the other side is wrong; you have to refute its strongest claims to actually show it is wrong. Additionally, demonstrating a knowledge of the strongest arguments against your own position shows a deeper level of expertise than understanding the opposing point of view at only a superficial level.</p>



<h2 class="wp-block-heading" id="viewer-mmo40180">10. They clearly explain their reasons for believing</h2>



<p id="viewer-yh6ch182">The philosopher Daniel Dennett <a target="_blank" href="https://books.google.co.uk/books?id=C5pUnN1-vhcC&amp;pg=PT16&amp;dq=%22if+I+can%E2%80%99t+explain+something+I%E2%80%99m+doing+to+a+group+of+bright+undergraduates,+I+don%E2%80%99t+really+understand+it+myself%22&amp;redir_esc=y#v=onepage&amp;q=%22if%20I%20can%E2%80%99t%20explain%20something%20I%E2%80%99m%20doing%20to%20a%20group%20of%20bright%20undergraduates%2C%20I%20don%E2%80%99t%20really%20understand%20it%20myself%22&amp;f=false" rel="noreferrer noopener">has said</a>: “if I can’t explain something I’m doing to a group of bright undergraduates, I don’t really understand it myself.” This sentiment is echoed by philosopher John Searle, who said “In general, I feel if you can&#8217;t say it clearly you don&#8217;t understand it yourself.”&nbsp;</p>



<p id="viewer-spi6a186">When communicating with non-experts, genuine experts are often able to give clear, easy-to-follow (and, ideally, checkable) explanations for why they believe what they believe &#8211; without dumbing down the points. They avoid unnecessary jargon and technical language (which sounds smart but makes their arguments very difficult for their audience to follow). Not every genuine expert is able to do this, but the ability to do this well is a sign of genuine expertise. This is important because an expert who cannot explain their ideas clearly will end up requiring you to believe them based on their authority rather than engaging with the arguments themselves. And sometimes, people claiming to be experts will hide behind technical expertise and jargon so that you won&#8217;t notice that their arguments are actually weak.</p>



<h2 class="wp-block-heading" id="viewer-gs759188">11. They have a track record</h2>



<p id="viewer-11d0m190">Sometimes genuine experts will have track records of predictions or successes that you can check, and this provides direct evidence of their knowledge or skill. Unfortunately, this only applies in some fields: chess masters, martial artists who fight in tournaments, experts who make public predictions about the economy or politics, and so on.</p>



<h2 class="wp-block-heading" id="viewer-pzysj192">12. They use multiple lenses</h2>



<p id="viewer-o5ipy194">The world is complex and multi-faceted, and any one simple theory is going to fail to explain a lot of what&#8217;s really going on. For this reason, genuine experts tend to look at problems from multiple frames and perspectives; they don&#8217;t act as though one way of looking at things solves all problems, or that one solution works for all problems, or that one simple theory explains everything.</p>



<p id="viewer-b0ax0196">So the next time you hear claims from an alleged expert on a topic that is important to you, you may want to consider: how many of these signs of expertise do they exhibit? You can use this checklist, considering whether they:</p>



<ol class="wp-block-list">
<li>have deep factual knowledge</li>



<li>communicate their confidence levels</li>



<li>admit not knowing</li>



<li>tell you to look at sources other than themselves</li>



<li>use logic and evidence</li>



<li>cite high-quality evidence</li>



<li>acknowledge the consensus</li>



<li>change their mind</li>



<li>steelman</li>



<li>clearly explain their reasons for believing</li>



<li>have a track record</li>



<li>use multiple lenses</li>
</ol>



<p id="viewer-6liwv235">And if you’re seeking to be an expert in something yourself, you may want to ask yourself: “to what extent do I exhibit these traits?”</p>

<p>Being able to discern genuine expertise from B.S. requires good judgment. If you’d like to improve your skills at making accurate judgments, why not try our <a href="https://www.openphilanthropy.org/calibration">Calibrate Your Judgment tool</a>, created in partnership with Open Philanthropy.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><em>This piece first appeared on Clearer Thinking.org on April 16, 2024, and first appeared on my website on April 22, 2024.</em></p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2024/04/how-to-spot-real-expertise/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3902</post-id>	</item>
		<item>
		<title>I&#8217;m an extreme non-credentialist &#8211; what about you?</title>
		<link>https://www.spencergreenberg.com/2024/02/im-an-extreme-non-credentialist-what-about-you/</link>
					<comments>https://www.spencergreenberg.com/2024/02/im-an-extreme-non-credentialist-what-about-you/#respond</comments>
		
		<dc:creator><![CDATA[admin]]></dc:creator>
		<pubDate>Wed, 28 Feb 2024 15:34:00 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[argumentation]]></category>
		<category><![CDATA[consensus]]></category>
		<category><![CDATA[credentials]]></category>
		<category><![CDATA[crux]]></category>
		<category><![CDATA[deference]]></category>
		<category><![CDATA[deferring]]></category>
		<category><![CDATA[disagreements]]></category>
		<category><![CDATA[experts]]></category>
		<category><![CDATA[rationality]]></category>
		<category><![CDATA[resolving disagreements]]></category>
		<category><![CDATA[science]]></category>
		<category><![CDATA[substance]]></category>
		<guid isPermaLink="false">https://www.spencergreenberg.com/?p=3874</guid>

					<description><![CDATA[I&#8217;m an extreme (&#62;99th percentile) non-credentialist. Does that mean if I find out someone has a nutrition Ph.D., then I don&#8217;t think they know more about nutrition than most random people? Of course not. Credentials are evidence of what someone knows (e.g., having a nutrition Ph.D. is evidence that you have nutrition knowledge). But part [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>I&#8217;m an extreme (&gt;99th percentile) non-credentialist. Does that mean if I find out someone has a nutrition Ph.D., then I don&#8217;t think they know more about nutrition than most random people? Of course not. Credentials are evidence of what someone knows (e.g., having a nutrition Ph.D. is evidence that you have nutrition knowledge).</p>



<p>But part of what makes me an extreme non-credentialist is this: suppose I spend an hour watching someone with a nutrition Ph.D. debate a completely self-taught person, and the Ph.D. makes bad arguments and points to weak evidence, while the self-taught person makes very solid arguments, points to strong evidence, and has a very solid command of the relevant facts. At that point, the fact that the first person has a Ph.D. will be nearly completely washed out for me, and I will trust the second person&#8217;s view of nutrition far more, based on the quality of their thinking and the reasons underlying why they believe what they do.</p>



<p>So, being a non-credentialist to me isn&#8217;t about thinking that credentials are meaningless, but rather, it involves being willing to quickly update away from the evidence of a credential once you have more direct evidence about the way a person comes to conclusions and what they know.</p>



<p>Most Ph.D.s in a subject are vastly more reliable sources of information on that subject than most non-Ph.D.s on that same subject, but there are lots of exceptions, and sometimes self-taught people are absolutely world-class (and, in any human endeavor, plenty of people with fancy credentials are actually full of B.S.).</p>



<p>Another thing that makes me a non-credentialist is that I love to see highly credible, highly knowledgeable, self-taught people discussing topics and spreading their ideas (whereas some people are very much rubbed the wrong way when someone is talking publicly about a topic they lack a credential in).</p>



<p>An important note: when there is a strong scientific consensus, that is usually a strong starting point for beliefs on topics you know little about (e.g., in physics or biology), even though the consensus is not always right. But trusting the scientific consensus is not the same as trusting one person due to their credentials &#8211; a strong scientific consensus is typically more reliable than individual experts.</p>



<p>If you&#8217;d like to figure out how much of a credentialist or non-credentialist you are, you can take our credentialist test <a href="https://programs.clearerthinking.org/credentialist_test.html" target="_blank" rel="noreferrer noopener">here.</a></p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><em>This piece was first written on February 28, 2024, and first appeared on my website on March 22, 2024.</em></p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2024/02/im-an-extreme-non-credentialist-what-about-you/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3874</post-id>	</item>
		<item>
		<title>Dichotomizer (or Oversimplifiers) vs. Difference Deniers: a dynamic regarding group differences that leads to rage and confusion</title>
		<link>https://www.spencergreenberg.com/2023/12/oversimplifiers-vs-difference-deniers-a-dynamic-regarding-group-differences-that-leads-to-rage-and-confusion/</link>
					<comments>https://www.spencergreenberg.com/2023/12/oversimplifiers-vs-difference-deniers-a-dynamic-regarding-group-differences-that-leads-to-rage-and-confusion/#respond</comments>
		
		<dc:creator><![CDATA[Spencer]]></dc:creator>
		<pubDate>Sun, 17 Dec 2023 16:22:15 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[belief]]></category>
		<category><![CDATA[differences]]></category>
		<category><![CDATA[groups]]></category>
		<category><![CDATA[polarization]]></category>
		<category><![CDATA[psychology]]></category>
		<category><![CDATA[rationality]]></category>
		<category><![CDATA[truth]]></category>
		<guid isPermaLink="false">https://www.spencergreenberg.com/?p=3772</guid>

					<description><![CDATA[Here&#8217;s a misery-filled dynamic that I believe commonly plays out regarding small observed differences between groups: (1) Two groups have a small (but meaningful) difference in their average value of some trait, with heavily overlapping distributions. (2) Some people (&#8220;Dichotomizers&#8221; or &#8220;Oversimplifiers&#8221;) observe this difference (in their everyday life or media reports) and turn this [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Here&#8217;s a misery-filled dynamic that I believe commonly plays out regarding small observed differences between groups:</p>



<p>(1) Two groups have a small (but meaningful) difference in their average value of some trait, with heavily overlapping distributions.</p>



<p>(2) Some people (&#8220;Dichotomizers&#8221; or &#8220;Oversimplifiers&#8221;) observe this difference (in their everyday life or media reports) and turn this small average difference into a (sometimes very harmful) oversimplification: &#8220;A&#8217;s are like this, B&#8217;s are like that.&#8221; They also fairly often make it seem like this difference is large (or applies to almost everyone in the group), important, and fundamental (e.g., inherent and unchangeable).</p>



<p>(3) Other people (&#8220;Difference Deniers&#8221;), often acting with good intentions, criticize this oversimplification, which they correctly perceive as harmful. But instead of saying some combination of:</p>



<ul class="wp-block-list">
<li>&#8220;the difference in averages is small&#8221; </li>



<li>&#8220;the distributions are heavily overlapping&#8221; </li>



<li>&#8220;judging an individual based on a small difference in group averages is a poor way to make predictions, as well as unjust&#8221; </li>



<li>&#8220;we shouldn&#8217;t judge people for differing on that trait&#8221; (if it&#8217;s not a trait one should be judged on)</li>



<li>&#8220;if we want to remove the difference in averages, we should consider implementing policy XYZ&#8221; </li>
</ul>



<p>they say &#8220;the difference in averages does not exist.&#8221; After denying the difference and seeing those they respect deny it, some of them become convinced anyone who believes in the existence of this (actually existing) small average difference is nefarious (and lump such people in with those who harmfully oversimplify people into &#8220;A&#8217;s are like this, B&#8217;s are like that.”) Others among them know the average difference exists but pretend not to because they want to fit into the group that adamantly denies the difference, or because they feel guilty about believing it (even though they are right about it existing).</p>



<p>(4) Oversimplifiers from (2), who remain totally convinced an average difference exists (and are correct about its existence but exaggerate its magnitude), assume that the Difference Deniers from (3) must be either stupid (for not realizing there is a difference), or untrustworthy liars (for denying what they must see is true), or cruel lunatics (for getting enraged at people for believing in &#8220;the truth&#8221;).</p>



<p>Cue endless fights between the Dichotomizers and the Difference Deniers, both of whom misrepresent the actual reality of the situation and demonize each other.</p>



<div style="height:40px" aria-hidden="true" class="wp-block-spacer"></div>



<figure class="wp-block-image size-large"><img data-recalc-dims="1" fetchpriority="high" decoding="async" width="750" height="198" data-attachment-id="3790" data-permalink="https://www.spencergreenberg.com/2023/12/oversimplifiers-vs-difference-deniers-a-dynamic-regarding-group-differences-that-leads-to-rage-and-confusion/oversimplifiers-vs-reality-deniars-2/" data-orig-file="https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2023/12/oversimplifiers-vs-reality-deniars-2.png?fit=10268%2C2717&amp;ssl=1" data-orig-size="10268,2717" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}" data-image-title="oversimplifiers-vs-reality-deniars-2" data-image-description="" data-image-caption="" data-large-file="https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2023/12/oversimplifiers-vs-reality-deniars-2.png?fit=750%2C198&amp;ssl=1" src="https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2023/12/oversimplifiers-vs-reality-deniars-2.png?resize=750%2C198&#038;ssl=1" alt="" class="wp-image-3790" srcset="https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2023/12/oversimplifiers-vs-reality-deniars-2.png?resize=1024%2C271&amp;ssl=1 1024w, https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2023/12/oversimplifiers-vs-reality-deniars-2.png?resize=300%2C79&amp;ssl=1 300w, https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2023/12/oversimplifiers-vs-reality-deniars-2.png?resize=768%2C203&amp;ssl=1 768w, https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2023/12/oversimplifiers-vs-reality-deniars-2.png?resize=1536%2C406&amp;ssl=1 1536w, 
https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2023/12/oversimplifiers-vs-reality-deniars-2.png?resize=2048%2C542&amp;ssl=1 2048w, https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2023/12/oversimplifiers-vs-reality-deniars-2.png?w=2250&amp;ssl=1 2250w" sizes="(max-width: 750px) 100vw, 750px" /></figure>



<div style="height:71px" aria-hidden="true" class="wp-block-spacer"></div>



<p></p>



<p>The problem with turning small average differences into &#8220;A&#8217;s are like this, B&#8217;s are like that&#8221; is that it is an inaccurate oversimplification and often unfair to A&#8217;s or B&#8217;s or both.</p>



<p>The problem with denying the existence of average differences that, while small, really do exist is that you end up believing falsehoods, or you end up lying, or both, and you may end up unfairly misjudging people who are (without malice) reporting on real average differences.</p>



<p>To avoid the weaknesses of both the Dichotomizers and the Difference Deniers, I think the best way to handle these cases is to:</p>



<p>1) Avoid pre-judging people based on their membership in broad groups &#8211; learn about people as individuals before coming to judgments about them.</p>



<p>2) Avoid language like &#8220;A&#8217;s are like this, B&#8217;s are like that&#8221; so that you aren&#8217;t a Dichotomizer.</p>



<p>3) Avoid denying that an average difference exists when it really does exist, so that way, you aren&#8217;t a Difference Denier.</p>



<p>4) When relevant, remind people that small average differences are not a good basis for judging individuals (epistemically and morally), and point out that the distributions between the two groups are heavily overlapping (when they are) to combat people using differences in the average as a justification for stereotyping.</p>



<p>5) Point to (when relevant, helpful, and accurate) policies that may help close the gap between the two groups (keeping in mind that some gaps in averages are fine if the trait in question is merely a difference and not something &#8220;good&#8221; or &#8220;bad&#8221;).</p>



<p>6) Point out (when the difference in question is not something people should be judged for) that this attribute should not be a basis for judging people, i.e., that having different values of that trait is completely okay.</p>



<p>Another approach that can be taken when the group differences in the average are small but meaningful is well described by Guy Srinivasan in the comments on an earlier draft of this post: &#8220;Can we agree to make decisions <strong><em>as if</em></strong> there were no average difference, since usually all such decisions would turn out the same, and usually when they <strong><em>wouldn&#8217;t</em></strong> it&#8217;s perpetuating systemic problems to make the decision differently?&#8221;</p>



<p>Of course, as with any binary categories, some people will only be partial Difference Deniers or Dichotomizers &#8211; people are absurdly complex, and this model I present here is purposely simplified in order to help communicate this dynamic clearly.</p>



<p>Okay, but are there cases where the Dichotomizers or Difference Deniers are actually just right?</p>



<p>Absolutely, there are some.</p>



<p>When a group difference is SO huge that the distributions are nearly non-overlapping, then it&#8217;s reasonable to say, &#8220;A&#8217;s are like this, and B&#8217;s are like that.&#8221; For instance, it makes sense to say that &#8220;blue whales are big, mice are small.&#8221; In such cases, the Dichotomizers aren&#8217;t really oversimplifying. But when we&#8217;re talking about human groups, this kind of situation is very rare.</p>



<p>And in situations when the difference in averages between groups is so small as to be essentially insignificant for all purposes, the Difference Deniers aren&#8217;t actually denying reality. For instance, if it turns out that right-handed people are 0.001% better at school than left-handed people, that difference is so small as to not be meaningfully different from zero for all purposes, and so saying there is &#8220;no difference&#8221; is an extremely reasonable thing to do. There are, in fact, many attributes along which human groups differ so little that &#8220;no difference&#8221; is an accurate way to describe it (even though the difference is not literally zero to the 10th decimal point).</p>



<p></p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2023/12/oversimplifiers-vs-difference-deniers-a-dynamic-regarding-group-differences-that-leads-to-rage-and-confusion/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3772</post-id>	</item>
		<item>
		<title>False Beliefs Held by Intellectual Giants</title>
		<link>https://www.spencergreenberg.com/2023/07/false-beliefs-held-by-intellectual-giants/</link>
					<comments>https://www.spencergreenberg.com/2023/07/false-beliefs-held-by-intellectual-giants/#respond</comments>
		
		<dc:creator><![CDATA[admin]]></dc:creator>
		<pubDate>Sun, 16 Jul 2023 03:16:00 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[Bayesian]]></category>
		<category><![CDATA[intelligence]]></category>
		<category><![CDATA[Langan]]></category>
		<category><![CDATA[Newton]]></category>
		<category><![CDATA[rationality]]></category>
		<category><![CDATA[Turing]]></category>
		<category><![CDATA[uncertainty]]></category>
		<category><![CDATA[update]]></category>
		<category><![CDATA[updating]]></category>
		<category><![CDATA[wrong]]></category>
		<guid isPermaLink="false">https://www.spencergreenberg.com/?p=3549</guid>

					<description><![CDATA[Even many of the smartest people that have ever lived convinced themselves of false things (just like the rest of us). Here are some fun and wild examples: (1) Linus Pauling won TWO Nobel prizes &#8211; one in peace and one in chemistry. Unfortunately, he eventually became obsessed with and widely promoted the false (and [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Even many of the smartest people who have ever lived convinced themselves of false things (just like the rest of us). Here are some fun and wild examples: </p>



<p></p>



<p>(1) Linus Pauling won TWO Nobel prizes &#8211; one in peace and one in chemistry. Unfortunately, he eventually became obsessed with and widely promoted the false (and sometimes still repeated) idea that high-dose vitamin C cures many diseases, including HIV and snakebites. </p>



<p></p>



<p>(2) Isaac Newton, who co-invented calculus and discovered the laws of gravity, was also convinced the Bible contained hidden messages he could decode for prophetic purposes, and he spent a lot of time trying to create the mythical philosopher&#8217;s stone so he could turn metal into gold. </p>



<p></p>



<p>(3) Alan Turing, often considered the father of theoretical computer science and artificial intelligence, seems to have been convinced of the existence of extrasensory perception. He wrote: &#8220;the statistical evidence, at least for telepathy, is overwhelming.&#8221; </p>



<p></p>



<p>(4) C. Langan, who appears to have one of the highest IQs ever recorded, believes that &#8220;you can prove the existence of God, the soul, and an afterlife, using mathematics&#8221; and has claimed that 9/11 was an inside job staged by the Bush administration. </p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p>My point is not that these people were stupid &#8211; they are the opposite of stupid &#8211; they are far smarter than 99.9% of the people who have ever lived (by at least some reasonably common ways of thinking about intelligence). My point is that even the smartest among us hold some silly, false beliefs &#8211; intelligence is not enough to avoid error. </p>



<p></p>



<p>Rationality (in the sense of evaluating evidence in such a way as to effectively arrive at the truth on important topics) and intelligence, while related, are not the same thing. Rationality involves actively working to disprove your own beliefs &#8211; which intelligent people may or may not do. For instance, intelligence is often used to come up with clever and convincing arguments for why what you already think must indeed be correct. In other words, intelligence can be deployed for rationality but also for rationalization. </p>



<p></p>



<p>Of course, it may also be me that&#8217;s wrong. Perhaps there&#8217;s a philosopher&#8217;s stone, vitamin C cures a ton of diseases, and 9/11 was an inside job. But more likely, I&#8217;m wrong about other things (that I have no clue I&#8217;m wrong about). It&#8217;s useful to remember: we all believe false things. </p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><em>This piece was first written on July 16, 2023, and first appeared on this site on August 16, 2023.</em></p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2023/07/false-beliefs-held-by-intellectual-giants/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3549</post-id>	</item>
		<item>
		<title>Five metaphorical tools to help you climb your personal mountains</title>
		<link>https://www.spencergreenberg.com/2023/05/five-metaphorical-tools-to-climb-your-personal-mountains/</link>
					<comments>https://www.spencergreenberg.com/2023/05/five-metaphorical-tools-to-climb-your-personal-mountains/#respond</comments>
		
		<dc:creator><![CDATA[admin]]></dc:creator>
		<pubDate>Fri, 19 May 2023 04:11:00 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[asking for help]]></category>
		<category><![CDATA[exploring]]></category>
		<category><![CDATA[friends]]></category>
		<category><![CDATA[goals]]></category>
		<category><![CDATA[guidance]]></category>
		<category><![CDATA[help]]></category>
		<category><![CDATA[mindset]]></category>
		<category><![CDATA[mountains]]></category>
		<category><![CDATA[persistence]]></category>
		<category><![CDATA[planning]]></category>
		<category><![CDATA[practice]]></category>
		<category><![CDATA[rationality]]></category>
		<category><![CDATA[self-improvement]]></category>
		<category><![CDATA[social support]]></category>
		<category><![CDATA[striving]]></category>
		<guid isPermaLink="false">https://www.spencergreenberg.com/?p=3586</guid>

					<description><![CDATA[You&#8217;re on a mountain range, trying to reach the highest mountain peak you&#8217;re capable of reaching. That peak reflects the total sum of your achievements according to your intrinsic values. This may include, for instance, your happiness, the happiness of your loved ones, your positive impact on the world, living virtuously, achieving your deeply meaningful [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>You&#8217;re on a mountain range, trying to reach the highest mountain peak you&#8217;re capable of reaching.</p>



<p></p>



<p>That peak reflects the total sum of your achievements according to your intrinsic values. This may include, for instance, your happiness, the happiness of your loved ones, your positive impact on the world, living virtuously, achieving your deeply meaningful goals, and so on.</p>



<p></p>



<p>Unfortunately, the mountains you face are foggy as hell. Plus, they have dense forests, huge boulders, and brambles covering them. Your mountains are untamed, uncharted.</p>



<p></p>



<ul class="wp-block-list">
<li>The fog means that you can only see clearly for a short distance, and the further you look, the harder it is to tell what&#8217;s out there.</li>



<li>The dense forests mean that to go a considerable distance in most directions, you&#8217;ll have to whack your way through with substantial effort.</li>



<li>The huge boulders will sometimes make impassable a path that had looked promising from around the bend.</li>



<li>The brambles mean that certain paths will cause considerable pain if you take them. Even more inconveniently, beautiful grasses and flowers sometimes conceal the brambles.</li>
</ul>



<p></p>



<p>These are your personal mountains, unique in all the world. Your mountains are determined by a combination of:</p>



<p></p>



<p><strong>(1) Your intrinsic values</strong>. It is your values that determine the height of each landing and peak, including the height of wherever you&#8217;re standing right now.<br><strong>(2) The structure of the real world</strong>, which makes some paths easier to traverse than others. The locations of the forests, boulders, and brambles are metaphors for this structure.<br><strong>(3) Your current life situation.</strong> This is represented by your current latitude and longitude on the mountain range, as well as your physical and mental health, resources, and skills.</p>



<p></p>



<p>Looking out from a distance, you can see the dim outlines of many high-up peaks far away that look promising, but they are in different directions from each other. That means you&#8217;ll have to make tough choices about what direction to go, even at the beginning of your journey.</p>



<p></p>



<p>This journey will take your entire life. If you&#8217;re like most people, it will be long and hard but also too short. It will be wondrous, terrifying, joyful, and sad.</p>



<p></p>



<p>Rather than trying to travel a great distance in order to climb to great heights, it is easier to find the first comfortable spot, set up a hammock and tent, and make camp there forever. Who can blame you for making that choice (except, perhaps, yourself)?</p>



<p></p>



<p>If you decide to take the journey, you&#8217;ll need to use your tool belt, which (if you&#8217;re lucky) comes equipped with five (metaphorical) tools.</p>



<p></p>



<p>To have the greatest chance of reaching the highest peaks, you&#8217;ll want to train yourself to be a master of each tool.</p>



<p></p>



<p>Here are the tools to master.</p>



<p></p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong>Tool 1: The walking stick</strong>, which is what you use to move forward along the path you&#8217;ve chosen.</p>



<p></p>



<p>By far, the most common tool you&#8217;ll use is the walking stick. For every choice of path, you&#8217;re going to have to spend a lot of time walking.</p>



<p></p>



<p>Maybe you&#8217;ll give up and turn back at the first encounter with a snake, tiger, or tornado. Or maybe you&#8217;ll use the walking stick to keep going.</p>



<p></p>



<p>You&#8217;re using the walking stick when you create a to-do list and tick items off of it. You&#8217;re using it when you push through fear to do something valuable.</p>



<p></p>



<p>Mostly, climbing a mountain involves using your walking stick, but if you <em>only</em> use that, you&#8217;re doomed.</p>



<p></p>



<p></p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong>Tool 2: The telescope</strong>, which allows you to peer at the shape of the mountain, collecting data and facts about the world that you can use to select your path.</p>



<p></p>



<p>The questions that guide your use of the telescope are ones like:</p>



<ul class="wp-block-list">
<li>What questions could I ask that would help me choose my path?</li>



<li>What do I need to know about the mountains that I don&#8217;t yet know?</li>



<li>What important question am I confused about?</li>



<li>Where is my lack of knowledge showing?</li>
</ul>



<p></p>



<p>The scope, like each of your tools, takes many forms. In a start-up, it may look like talking to customers, running surveys, examining other products, or scrutinizing the structure of your own product, keeping a keen eye out for flaws. In your career, it may look like researching career paths, talking to others who have tried different routes, quickly trying things out, and soliciting feedback on your work from your colleagues.</p>



<p></p>



<p>What you&#8217;re looking for with the telescope are indications of which nearby paths lead quickest up the mountain, as well as hints for other (potentially faraway) parts of the mountain range that may have yet higher peaks (even if you have to go a ways back down the mountain to get there).</p>



<p></p>



<p>Being good at using the scope means being observant, impartial, curious, methodical, open to criticism, and empirical. And it means being able and willing to cope with reality.</p>



<p></p>



<p></p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong>Tool 3: The notepad</strong>, which you use to formulate your theories about how the world works, as well as to devise plans.</p>



<p></p>



<p>This tool will most dramatically increase how effectively you use the telescope because there are far too many potential things to point the scope at. Your theories on the notepad, therefore, guide your use of the scope. At the same time, the scope provides the data that feeds into your theories on the notepad.</p>



<p></p>



<p>Some of your theories will be explicit, penned in detail with full awareness, but most will be implicit, born out of the things you&#8217;ve seen, etched in your subconscious with a shadowy ink.</p>



<p></p>



<p>In your personal life, your notepad contains your understanding of yourself, your partner, your parents, your friends, and human nature. It contains your understanding of your mountains (as well as other people&#8217;s mountains) and your beliefs about where the brambles, boulders, and high peaks lie.</p>



<p></p>



<p>To use the notepad is to sit and reflect, to make predictions, to spell out your thoughts, to reduce ambiguity through precision, to derive new knowledge from other things you already know, to come to new conclusions.</p>



<p></p>



<p>Being good at using the notepad means being thoughtful, philosophical, reflective, logical, cautious, precise, and rational.</p>



<p></p>



<p></p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong>Tool 4: The jump rope</strong>, which you use to practice and improve your skills.</p>



<p></p>



<p>Sometimes, this will take the form of physical training &#8211; maybe you don&#8217;t yet have the bushwhacking skills to knock away the brambles. Oftentimes, the training and practice will be mental rather than physical. Maybe you don&#8217;t know enough about how to use the scope, or your facility with the notepad is not where it needs to be.</p>



<p></p>



<p>Early on in your journey, you&#8217;ll need to use the metaphorical jump rope a lot so that you can build the skills you&#8217;ll need for the journey. Over time, you&#8217;ll need it less often, but there will always be new skills that are useful to train as you climb higher.</p>



<p></p>



<p>The jump rope is informed by the scope and the notebook. Sometimes the scope will tell you about the sort of paths you&#8217;ll soon need to face, and the jump rope will help you prepare for them. Other times, you&#8217;ll turn the lens of the scope on yourself to see your weaknesses. You can then use the jump rope to work on these, to reshape yourself.</p>



<p></p>



<p>You&#8217;re using the jump rope when you&#8217;re reading to learn, taking a course to understand something important, practicing how to do something, asking someone to teach you, breaking a difficult skill into smaller pieces, or asking a question when you&#8217;re confused.</p>



<p></p>



<p>If your walking speed is slow, or you find another weakness that is slowing you down, use the jump rope to get that skill up to par. But just as importantly, use the jump rope to hone what you&#8217;re already good at, to sharpen it into an exceptional skill.</p>



<p></p>



<p></p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong>Tool 5: The whistle,</strong> which is how you get the help you need from others.</p>



<p></p>



<p>Most things that are worth doing can&#8217;t be done alone, and those who travel without a whistle put themselves in great peril. Sometimes you&#8217;ll need the help of others to clear a path, to show you how to use the other tools effectively, or to help you understand why you&#8217;re stuck. No matter where you&#8217;re headed on the mountains, there are those who have gone that way before and have advice to share.</p>



<p></p>



<p>Using the whistle may mean requesting assistance or a favor, but it can also mean asking for advice, asking a simple question, or getting support when you&#8217;re mentally exhausted.</p>



<p></p>



<p>Being good at using the whistle means investing time in your relationships, developing deep connections, being a good friend when others use their whistles, meeting new people when you perceive gaps, and being bold enough to call for support when you could use it.</p>



<p></p>



<p>Never travel without a whistle when you don&#8217;t have to.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p>So how, then, do you get to the top of your mountains? Well, you will never get to the very top &#8211; the mountains stretch forever. But you can climb high. To maximize your chances, use:</p>



<p></p>



<p><strong>(1) The walking stick </strong>(to keep moving forward without giving up).<br><strong>(2) The telescope </strong>(to investigate the mountains carefully and with minimal bias so that you can understand where the brambles and boulders are).<br><strong>(3) The notepad</strong> (to reflect carefully on your beliefs and formulate your plans).<br><strong>(4) The jump rope </strong>(to improve your weaknesses and enhance your strengths).<br><strong>(5) The whistle </strong>(to get help and support).</p>



<p></p>



<p>Good luck &#8211; may your climb be a joyous one!</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p></p>



<p><em>This piece was first written on May 19, 2023, and first appeared on this site on September 22, 2023.</em></p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2023/05/five-metaphorical-tools-to-climb-your-personal-mountains/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3586</post-id>	</item>
		<item>
		<title>How do we predict high levels of success?</title>
		<link>https://www.spencergreenberg.com/2021/09/how-do-we-predict-high-levels-of-success/</link>
					<comments>https://www.spencergreenberg.com/2021/09/how-do-we-predict-high-levels-of-success/#comments</comments>
		
		<dc:creator><![CDATA[admin]]></dc:creator>
		<pubDate>Sun, 12 Sep 2021 16:26:00 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[aptitude]]></category>
		<category><![CDATA[confidence]]></category>
		<category><![CDATA[conscientiousness]]></category>
		<category><![CDATA[cooperation]]></category>
		<category><![CDATA[courage]]></category>
		<category><![CDATA[creativity]]></category>
		<category><![CDATA[deliberate practice]]></category>
		<category><![CDATA[efficiency]]></category>
		<category><![CDATA[exponential]]></category>
		<category><![CDATA[focus]]></category>
		<category><![CDATA[goals]]></category>
		<category><![CDATA[luck]]></category>
		<category><![CDATA[mental health]]></category>
		<category><![CDATA[multiplicative effects]]></category>
		<category><![CDATA[obsession]]></category>
		<category><![CDATA[opportunities]]></category>
		<category><![CDATA[prioritization]]></category>
		<category><![CDATA[randomness]]></category>
		<category><![CDATA[rationality]]></category>
		<category><![CDATA[resources]]></category>
		<category><![CDATA[self-promotion]]></category>
		<category><![CDATA[social skills]]></category>
		<category><![CDATA[socioeconomic status]]></category>
		<category><![CDATA[success]]></category>
		<category><![CDATA[wealth]]></category>
		<guid isPermaLink="false">https://www.spencergreenberg.com/?p=2693</guid>

					<description><![CDATA[Below, I outline 13 approaches to predicting high levels of success with differing levels of complexity, including my own mega model at the bottom. Note: here, I use the term &#8220;success&#8221; merely in terms of achievement, career success, or high levels of expertise, NOT in terms of happiness, living a good life, morality, having strong [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Below, I outline 13 approaches to predicting high levels of success with differing levels of complexity, including my own mega model at the bottom.</p>



<p>Note: here, I use the term &#8220;success&#8221; merely in terms of achievement, career success, or high levels of expertise, NOT in terms of happiness, living a good life, morality, having strong social bonds, etc. There is nothing wrong with&nbsp;<em>not</em>&nbsp;wanting to be successful in the way this post focuses on. But if you DO want &#8220;success&#8221; in the sense in which it is used in this post (or you are interested in being able to predict it in others), you may find some of the models here useful.</p>



<p>I&#8217;m also interested to know: which model (below) do you find most useful for thinking about success, and which one of these factors (used in the models) do you think is currently most limiting your success?</p>



<hr class="wp-block-separator"/>



<p><strong>1. Noise theory:</strong></p>



<p>success = luck</p>



<hr class="wp-block-separator"/>



<p><strong>2. Genetic determinism:</strong></p>



<p>success = (innate) aptitude + luck</p>



<p>Note: whenever I use &#8220;luck,&#8221; I mean random factors not already accounted for in the other factors in the model. So in the case above, &#8220;luck&#8221; means luck other than the random chance of what your aptitude is.</p>



<hr class="wp-block-separator"/>



<p><strong>3. Traditional right:</strong></p>



<p>success = aptitude + surrounding culture + hard work</p>



<hr class="wp-block-separator"/>



<p><strong>4. Social justice left:</strong></p>



<p>success = privilege + luck</p>



<hr class="wp-block-separator"/>



<p><strong>5. Economic left:</strong></p>



<p>success = social/economic class you&#8217;re born into + luck</p>



<hr class="wp-block-separator"/>



<p><strong>6. Cynical theory:</strong></p>



<p>success = some combination of self-promotion, bullshitting, social skills, good-lookingness, starting resources, and luck</p>



<hr class="wp-block-separator"/>



<p><strong>7. Gladwell:&nbsp;</strong></p>



<p>success = whoever practiced for 10,000 hours + luck</p>



<hr class="wp-block-separator"/>



<p><strong>8. Dweck:&nbsp;</strong></p>



<p>success = aptitude + growth mindset + luck</p>



<hr class="wp-block-separator"/>



<p><strong>9. Duckworth:&nbsp;</strong></p>



<p>success = aptitude + growth mindset + grit + luck</p>



<hr class="wp-block-separator"/>



<p><strong>10. Seligman:&nbsp;</strong></p>



<p>success = skill * effort * self-promotion * luck</p>



<hr class="wp-block-separator"/>



<p><strong>11. Psychometrics:&nbsp;</strong></p>



<p>success = IQ + conscientiousness + low neuroticism + luck</p>



<hr class="wp-block-separator"/>



<p><strong>12. Ericsson:&nbsp;</strong></p>



<p>success = luck + hours spent doing &#8220;deliberate practice&#8221; (i.e., with specific goals and tight performance feedback loops, while analyzing mistakes and dividing skills into micro-skills that can be practiced independently, ideally all done under the supervision of expert coaches)</p>



<hr class="wp-block-separator"/>



<p><strong>13. My mega model:</strong></p>



<p>success at a fixed goal = luck^a<br>* (resources+opportunities)^b<br>* (community/collaborator quality and supportiveness)^c<br>* (innate aptitude at relevant skills)^d<br>* intelligence^e<br>* rationality^f<br>* (creativity and resourcefulness)^g<br>* (social skills)^h<br>* (hours of deliberate practice)^i<br>* (unitary or obsessive focus on the goal)^j<br>* (conscientiousness and self-control)^k<br>* (physical or mental health)^l<br>* confidence^m<br>* (ambition and agency/self-directedness)^n<br>* (self-promotion skill and effort)^o<br>* courage^p<br>* (goal/task-specific factors)^q<br>* (efficiency and prioritization)^r</p>



<p>Each exponent a, b, c, &#8230;, r is a different number from 0 to 1. Note that each of these traits is selected because I believe, on average, having more of them improves the chance of success &#8211; that&#8217;s why I exclude negative exponents. Furthermore, I’m claiming that these factors, on average, each have diminishing marginal returns. That’s why the exponents are each less than 1 (making a concave function).</p>



<p>The values of the exponents vary depending on the field and type of skill. For instance, in some areas, courage is a minor factor (in which case the courage exponent, p, would be close to 0), while in other fields, courage is essential (in which case p would be close to 1). So, in other words: success is a PRODUCT of roughly 18 factors, and how much each factor matters depends on what you&#8217;re trying to do.</p>



<p>Note that this is designed so that if you have literally 0 of any factor, then the level of success is automatically 0 (since 0 times any number is 0). For instance, if you have literally no physical health, you are, presumably, dead, and if you have literally no ambition, presumably you just sit around all day or do the minimum you need to eat.</p>



<p>It&#8217;s worth noting that the factors above are not completely statistically or causally independent in reality (becoming higher in one may make you higher in another, on average). But I think the enormous extra complexity of trying to account for these dependencies probably is not worth it in practice.</p>
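<p>To make the multiplicative structure concrete, here is a minimal sketch of the model in Python. All of the factor names, values, and exponents below are invented placeholders chosen purely for illustration &#8211; the model above doesn&#8217;t prescribe scales or specific exponent values.</p>

```python
def success_score(factors, exponents):
    """Multiplicative success model: the product of factor ** exponent.

    Each exponent lies between 0 and 1 (diminishing marginal returns),
    and any factor equal to 0 drives the entire score to 0.
    """
    score = 1.0
    for name, value in factors.items():
        score *= value ** exponents[name]
    return score


# Hypothetical exponents for a field where creativity matters a lot
# and courage barely matters (all numbers invented for illustration).
exponents = {"luck": 0.5, "creativity": 0.9, "courage": 0.05, "health": 0.4}

# Two invented profiles, each factor rated on a 0-to-1 scale.
generalist = {"luck": 0.5, "creativity": 0.5, "courage": 0.5, "health": 0.5}
specialist = {"luck": 0.5, "creativity": 0.9, "courage": 0.2, "health": 0.6}

print(success_score(generalist, exponents))  # moderate on everything
print(success_score(specialist, exponents))  # strong on the factor that matters
```

<p>Because the factors multiply, the specialist&#8217;s single high, heavily weighted factor outweighs the generalist&#8217;s uniform mediocrity &#8211; while a zero in any single factor would zero out either score.</p>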



<hr class="wp-block-separator"/>



<p><strong>How do you improve your odds of success?</strong></p>



<hr class="wp-block-separator"/>



<p>A lot of times, when people are extremely successful, I think it&#8217;s because they avoid being TOO low in any of the factors, and they have one or two factors where they are exceptionally high. Many factors are &#8220;bounded&#8221; ones: for instance, you can&#8217;t work more than 24 hours per day. So it&#8217;s impossible to work more than 3x the amount the average person does. But there are some &#8220;unbounded&#8221; factors where you can potentially be WAY higher than the average person (e.g., &#8220;creativity&#8221;), which can drive the success score very high (as long as no other factor is close enough to zero to drag it back down). Hence, this model leads to an approach for thinking about how to be more successful (if that&#8217;s something you care about).</p>



<p>Put simply, success often flows from not being TOO weak on really important factors and having one or two really strong (and relevant) strengths.</p>



<p>Getting into more detail, here is a process you might use to consider how to increase your odds of great success:</p>



<p>1. For the goal/task you&#8217;re trying to succeed at, figure out which of the above factors matter substantially (which maps onto trying to &#8211; very roughly &#8211; figure out the exponents for each factor).</p>



<p>2. If your strong/weak factors are not a good fit for the goal, consider changing the goal to better play to your strengths, or consider teaming up with someone (e.g., a co-founder) to compensate for your weaknesses.</p>



<p>3. Once you have settled on a goal, identify any especially low factors (relevant to that goal) that are driving your potential for success down, and think about how you can improve at those. Due to multiplicative effects, very low factors can really drag down your potential for success. For instance, if you have severe mental health challenges that interfere with your day-to-day tasks, working on that first can be a great idea (even if you&#8217;re just optimizing for success).</p>



<p>4. Identify your strongest factors (that are relevant to that goal) and think about how you might improve at them or hone them to get them VERY high. You can also figure out how to make even more use of these great strengths of yours to achieve good outcomes. Often, one of the most effective things we can focus on is leaning into our greatest strengths (for instance, by designing a path towards our goals that leverages them or working to enhance them even more). This is especially the case once we&#8217;ve gotten barriers to success out of the way (i.e., we&#8217;ve worked on improving our especially low factors).</p>



<hr class="wp-block-separator"/>



<p>A question for you: right now, which of the factors listed above is the one that is most significantly limiting your success?</p>



<hr class="wp-block-separator"/>



<p><em>This piece was first written on September 12, 2021, and first appeared on this site on March 25, 2022.</em></p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2021/09/how-do-we-predict-high-levels-of-success/feed/</wfw:commentRss>
			<slash:comments>1</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2693</post-id>	</item>
		<item>
		<title>Three big reasons we struggle to find the truth </title>
		<link>https://www.spencergreenberg.com/2021/06/three-big-reasons-we-struggle-to-find-the-truth/</link>
					<comments>https://www.spencergreenberg.com/2021/06/three-big-reasons-we-struggle-to-find-the-truth/#respond</comments>
		
		<dc:creator><![CDATA[admin]]></dc:creator>
		<pubDate>Fri, 04 Jun 2021 01:28:00 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[bias]]></category>
		<category><![CDATA[complexity]]></category>
		<category><![CDATA[incentives]]></category>
		<category><![CDATA[irrationality]]></category>
		<category><![CDATA[mimicry]]></category>
		<category><![CDATA[rationality]]></category>
		<category><![CDATA[scout mindset]]></category>
		<category><![CDATA[skeptical seeker]]></category>
		<category><![CDATA[social desirability]]></category>
		<category><![CDATA[social pressures]]></category>
		<category><![CDATA[truth-seeking]]></category>
		<guid isPermaLink="false">https://www.spencergreenberg.com/?p=2997</guid>

					<description><![CDATA[As I see it, there are three main causes for our struggles to see the truth on any particular topic: 1. Mimicry: when our in-group promotes falsity that we copy 2. Incentives: when we predict that knowing the truth would feel bad or harm our objectives 3. Complexity: when the truth is hard to figure [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>As I see it, there are three main causes for our struggles to see the truth on any particular topic:</p>



<p><strong>1. Mimicry:</strong> when our in-group promotes falsity that we copy</p>



<p><strong>2. Incentives: </strong>when we predict that knowing the truth would feel bad or harm our objectives</p>



<p><strong>3. Complexity: </strong>when the truth is hard to figure out</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong>Examples:</strong></p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong>1. Mimicry</strong></p>



<p>• Some are Christians because all their friends and family are, too; some are atheists for the same reason.</p>



<p>• Some think that it makes sense to circumcise baby boys because the people they know think it&#8217;s healthy and normal; others think it&#8217;s bizarre because the people they know think foreskin is healthy and normal.</p>



<p>• Some believe it would harm Black Americans to defund police because their friends say so; others think it would help Black Americans because their friends say it.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong>2. Incentives</strong></p>



<p>• If you make more money believing X, it&#8217;s going to be harder to stop believing it.</p>



<p>• If the idea of permanent death is terrifying to you, it&#8217;s going to be harder to stop believing in reincarnation.</p>



<p>• If it would make you feel really bad to find out you were wrong about something you posted online, your immediate reaction may be to deny being wrong (to others and to yourself) to shield yourself from the negative feelings.</p>



<p>Note that mimicry and incentives can blend together. Sometimes we mimic to fit in or to avoid being socially punished. But mimicry is even more basic than that: we seem to have a strong, in-built tendency to copy others. If all the people around us believe something, we usually will come to believe it too, without questioning whether it might be false or even being aware that we copied the belief from others.&nbsp;</p>



<p>If we see others all behave in a certain way, we&#8217;ll probably behave that way, too, unless we have strong reasons not to. This appears to be an evolutionary survival mechanism &#8211; it&#8217;s risky to (for example) eat plants that are different from the ones your tribe eats (they might be poisonous) or to avoid the behaviors everyone else does (those behaviors might be key to survival in some hidden way). In the wilderness, you can&#8217;t figure out how to survive from first principles (chances are you&#8217;ll die way too fast for that) &#8211; you need to mimic what has worked for centuries (some of which will be key to survival, some of which will be pointless, but evolutionary pressures will have weeded out most of the really harmful stuff and hung on to the most useful stuff).</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong>3. Complexity</strong></p>



<p>• It&#8217;s really not obvious how to prevent future risks from advanced artificial intelligence (though it often seems obvious to folks who&#8217;ve spent almost no time thinking about it).</p>



<p>• How best to prevent economic crashes is a fundamentally complicated question.</p>



<p>• Nobody actually seems to know how to cure cancers in general.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p>A weird thing about these three causes for believing falsehoods is that we are usually not aware of their effects.</p>



<p>1. We don&#8217;t usually realize it when we believe something just because we copied our social group.</p>



<p>2. We don&#8217;t usually realize when we believe something just because it would hurt us not to believe it.</p>



<p>3. We don&#8217;t usually realize when our analysis of a complex issue is oversimplified and misses important considerations.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p>Believing in falsehoods feels just like believing the truth &#8211; until the moment we genuinely face up to the possibility of being wrong.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong>So how can we be right more often?</strong></p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong>1. To combat mimicry: </strong>we can keep our identities smaller (or have more of them), be more willing to be viewed as having &#8220;weird&#8221; beliefs, join social groups that value diversity of thought, learn to do less social mimicry (e.g., having greater skepticism towards in-group consensus). We can recognize that every in-group will get some things wrong (including ours) and that it&#8217;s worth investigating where ours is wrong. In summary, we can combat mimicry with social resilience and skepticism.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong>2. To combat harmful incentives: </strong>we can recognize that, while there can be short-term pain from accepting the truth, being truth-seeking is usually a better long-term strategy (especially because you can&#8217;t just suddenly decide to be truth-seeking when it&#8217;s convenient &#8211; it&#8217;s best to have a habit of being truth-seeking all the time). We can consider thought experiments like: &#8220;If X were true, would I rather believe it or be wrong about it?&#8221; We can also leave &#8220;lines of retreat&#8221; so that we can decide what we&#8217;d do and how we&#8217;d move forward if we turn out to be wrong about something important. In summary, we can combat bad incentives with a Scout Mindset and a focus on seeking the truth.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><strong>3. To tackle complexity</strong>: we can use probabilistic thinking, consider multiple hypotheses, and consider the evidence for and against each one. We can train ourselves in evidence and argument evaluation, practice steel-manning arguments, talk to smart and knowledgeable people with different views and fight back against overconfidence (e.g., through calibration practice). We can also do large amounts of research when it&#8217;s important to be right. In summary, we can combat complexity with good epistemic hygiene, honed thinking skills, and self-skepticism.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><em>This piece was first written on June 3, 2021, and first appeared on this site on November 11, 2022.</em></p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2021/06/three-big-reasons-we-struggle-to-find-the-truth/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2997</post-id>	</item>
		<item>
		<title>Why is Confirmation Bias So Common?</title>
		<link>https://www.spencergreenberg.com/2021/05/why-is-confirmation-bias-so-common/</link>
					<comments>https://www.spencergreenberg.com/2021/05/why-is-confirmation-bias-so-common/#respond</comments>
		
		<dc:creator><![CDATA[Admin]]></dc:creator>
		<pubDate>Wed, 05 May 2021 14:37:00 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[confirmation bias]]></category>
		<category><![CDATA[echo chambers]]></category>
		<category><![CDATA[epistemic humility]]></category>
		<category><![CDATA[fear of errors]]></category>
		<category><![CDATA[irrationality]]></category>
		<category><![CDATA[motivated reasoning]]></category>
		<category><![CDATA[overconfidence]]></category>
		<category><![CDATA[rationality]]></category>
		<category><![CDATA[scout mindset]]></category>
		<category><![CDATA[soldier mindset]]></category>
		<guid isPermaLink="false">https://www.spencergreenberg.com/?p=2204</guid>

					<description><![CDATA[Written: May 5, 2021 &#124; Released: June 18, 2021 People often talk about what a problem &#8220;confirmation bias&#8221; is. But we rarely discuss what causes so many of us to search for information in a biased way. Let&#8217;s explore some of the forces: 1. Echo chambers:&#160;our routine sources of information tend to support our worldview. [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p><em>Written: May 5, 2021 | Released: June 18, 2021</em></p>



<p>People often talk about what a problem &#8220;confirmation bias&#8221; is. But we rarely discuss what causes so many of us to search for information in a biased way.</p>



<p>Let&#8217;s explore some of the forces:</p>



<p><strong>1. Echo chambers:&nbsp;</strong>our routine sources of information tend to support our worldview. Much of this is due to social ties (we tend to talk to people who are similar to us in age, geography, religion, etc.). We also trust news sources more if they share our basic ideology/assumptions. The authorities we look to will be the ones that agree with us on most of our fundamental assumptions (even if some of these assumptions could turn out to be wrong).</p>



<p><strong>2. Soldier Mindset:&nbsp;</strong>as&nbsp;<a href="https://juliagalef.com/" target="_blank" rel="noreferrer noopener">Julia Galef</a>&nbsp;explains in her wonderful new book (<em>The Scout Mindset</em>), we are often not even TRYING to figure out the truth. We&#8217;re just trying to beat the other side or prove a point. In these cases, of course we have a biased search process.</p>



<p><strong>3. Lack of doubt:&nbsp;</strong>when we&#8217;re really confident our basic premises are correct, we don&#8217;t see the need for a nuanced information search process. We&#8217;re going to go to whatever sources are convenient for filling in minor details, even if our beliefs have major unquestioned assumptions.</p>



<p><strong>4. Advocacy:</strong>&nbsp;others are actively trying to get us to believe certain falsehoods. They do so through ads, websites, news, and other channels. For the most part, they themselves believe these falsehoods (promoting &#8220;the truth&#8221; as they see it), but occasionally it&#8217;s pure manipulation.</p>



<p><strong>5. Fear of being wrong:&nbsp;</strong>it hurts to be wrong, especially if we&#8217;ve made the error publicly, have our identity tied up in the belief, or find that it challenges our understanding of the world or of who to trust. We sometimes avoid finding out we are wrong the way we avoid touching a hot stove.</p>



<p>We don&#8217;t always seek out information in a biased way. For instance, when looking up driving directions or trying to figure out what paint to use to prevent water damage, we want the right answer, don&#8217;t have political biases, and usually have appropriate self-doubt and skepticism.</p>



<p>But we are liable to have a biased search for the truth when we are incentivized to have particular beliefs, such as when our social world supports just one perspective, when we&#8217;re trying to prove the other side wrong, when we have no doubt that we are right, when powerful others are devoting a lot of effort to persuade us, or when we&#8217;re too afraid of being wrong.</p>



<p>For more about this topic, you may want to check out <a href="https://clearerthinkingpodcast.com/?ep=036">my recent podcast episode with Julia Galef</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2021/05/why-is-confirmation-bias-so-common/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2204</post-id>	</item>
		<item>
		<title>Soldier Altruists vs. Scout Altruists</title>
		<link>https://www.spencergreenberg.com/2021/04/soldier-altruists-vs-scout-altruists/</link>
					<comments>https://www.spencergreenberg.com/2021/04/soldier-altruists-vs-scout-altruists/#respond</comments>
		
		<dc:creator><![CDATA[admin]]></dc:creator>
		<pubDate>Fri, 23 Apr 2021 22:44:00 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[belief formation]]></category>
		<category><![CDATA[beliefs]]></category>
		<category><![CDATA[causal mechanisms]]></category>
		<category><![CDATA[conflict]]></category>
		<category><![CDATA[conflict theory]]></category>
		<category><![CDATA[corruption]]></category>
		<category><![CDATA[curiosity]]></category>
		<category><![CDATA[effort]]></category>
		<category><![CDATA[evidence-based action]]></category>
		<category><![CDATA[ideological blindspots]]></category>
		<category><![CDATA[ignorance]]></category>
		<category><![CDATA[inertia]]></category>
		<category><![CDATA[mistake theory]]></category>
		<category><![CDATA[political will]]></category>
		<category><![CDATA[rationality]]></category>
		<category><![CDATA[reasoning]]></category>
		<category><![CDATA[scaling]]></category>
		<category><![CDATA[science]]></category>
		<category><![CDATA[scout]]></category>
		<category><![CDATA[scout mindset]]></category>
		<category><![CDATA[selfishness]]></category>
		<category><![CDATA[soldier mindset]]></category>
		<category><![CDATA[systemic problems]]></category>
		<category><![CDATA[testing]]></category>
		<category><![CDATA[theory]]></category>
		<category><![CDATA[updating]]></category>
		<category><![CDATA[warm glow]]></category>
		<guid isPermaLink="false">https://www.spencergreenberg.com/?p=2571</guid>

					<description><![CDATA[There is an important division between people who want to improve the world that few seem to be aware of. Inspired by Julia Galef&#8217;s new book (The Scout Mindset), I&#8217;ll call this division:&#160;Soldier Altruists vs. Scout Altruists. 1. Soldier Altruists&#160;think it&#8217;s obvious how to improve the world and that we just need to execute those [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>There is an important division between people who want to improve the world that few seem to be aware of. Inspired by Julia Galef&#8217;s new book (<em>The Scout Mindset</em>), I&#8217;ll call this division:&nbsp;<strong>Soldier Altruists vs. Scout Altruists</strong>.</p>



<p><strong>1. Soldier Altruists&nbsp;</strong>think it&#8217;s obvious how to improve the world and that we just need to execute those obvious steps. They see the barriers to a better world as:</p>



<p>(i) not enough people taking action (e.g., due to ignorance, selfishness, or propaganda), and</p>



<p>(ii) bad groups blocking things (e.g., corrupt politicians or greedy corporations).</p>






<p><strong>2. Scout Altruists</strong>&nbsp;think it&#8217;s hard to figure out how to improve the world &#8211; and most attempts either don&#8217;t work, only slightly help, or make things worse. They see the barriers to a better world as:</p>



<p>(i) not enough understanding of causal mechanisms (e.g., due to a lack of high-quality evidence, not enough attention to the evidence we do have, not enough careful reasoning, etc.), and</p>



<p>(ii) too much investment in bad solutions (e.g., due to people jumping to conclusions, doing what feels good emotionally rather than what is effective, ideological blindspots, inertia, etc.).</p>



<hr class="wp-block-separator"/>



<p>Soldier Altruists say we need to DO and FIGHT more. Scout Altruists say we need to THINK and TEST more. Soldier Altruists are more likely to think that if we could just get people to be less selfish and more motivated to act, we would make a lot of progress towards a better world. Scout Altruists are more likely to think that if we could just get people to pay more attention to evidence and to have more good-faith debates with strong norms around the quality of argumentation, we would make a lot more progress.</p>



<p>Soldier Altruists may think Scout Altruists are far too reluctant to act and are wasting their time on research and debate. Scout Altruists may think Soldier Altruists are far too confident in their conclusions and are wasting their effort pushing for changes that aren&#8217;t going to help much (and which, in some cases, might even make things worse). Of course, in reality, there is a continuum between these two positions. So, on a scale from 0 (Soldier Altruist) to 10 (Scout Altruist), where do you fall? I&#8217;m probably a 7 or 8.</p>



<hr class="wp-block-separator"/>



<p>As some&nbsp;<a target="_blank" href="https://www.facebook.com/spencer.greenberg/posts/10105808551163702?__cft__[0]=AZXHoevvmvsz4tG6r-SoVZBGVxOdM6ixkZlhisrLVXQTX4VrTiFr5pCm004f4o9J6rQCOqPDSCsRwLT3miKvR3_6STsnjnpvPqH2WkzvtWHbM6eXvssfOziyDsDq1oFu1Pg&amp;__tn__=%2CO%2CP-R" rel="noreferrer noopener">commenters</a>&nbsp;have pointed out, there is a relationship between this distinction and &#8220;Conflict Theory&#8221; vs. &#8220;Mistake Theory.&#8221; I think it is related &#8211; but also distinct in important ways. Conflict theory says that there is a giant zero-sum struggle (groups fighting over fixed resources), whereas in this case we&#8217;re operating from a framework of altruism: &#8220;the world can be made a lot better &#8211; what&#8217;s the big barrier to that happening? Is it that we know what to do and we&#8217;re not doing it enough/with enough energy, or is it that we don&#8217;t really know what to do?&#8221;</p>



<p>Also, to clarify another important point brought up in the&nbsp;<a rel="noreferrer noopener" target="_blank" href="https://www.facebook.com/spencer.greenberg/posts/10105808551163702?__cft__[0]=AZXHoevvmvsz4tG6r-SoVZBGVxOdM6ixkZlhisrLVXQTX4VrTiFr5pCm004f4o9J6rQCOqPDSCsRwLT3miKvR3_6STsnjnpvPqH2WkzvtWHbM6eXvssfOziyDsDq1oFu1Pg&amp;__tn__=%2CO%2CP-R">comments</a>: I&#8217;m not asking, &#8220;do you think it&#8217;s obvious how we should improve the world if we had a magic wand that could change whatever we wanted?&#8221; &#8211; instead, the question is: &#8220;is it obvious what to do to improve the real world, given that we don&#8217;t have a magic wand?&#8221; Do we just need to put more money/time/effort/people into executing the current &#8220;obvious&#8221; strategies because they will work well if we just scale them up? Or is it pretty unclear what strategies we should even be putting more resources into (meaning that a lot of thinking, research, debate and/or evidence evaluation will typically be necessary to even figure out what is worth scaling up)?</p>



<hr class="wp-block-separator"/>



<p>Julia&#8217;s book (which I highly recommend): <a href="https://www.amazon.com/Scout-Mindset-Perils-Defensive-Thinking/dp/0735217556/ref=nodl_ ">https://www.amazon.com/Scout-Mindset-Perils-Defensive-Thinking/dp/0735217556/ref=nodl_ </a></p>



<p><em>This piece was first written on April 23rd, 2021, and first appeared on this site on January 7th, 2022.</em></p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2021/04/soldier-altruists-vs-scout-altruists/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2571</post-id>	</item>
		<item>
		<title>Twelve Recursive Explanations</title>
		<link>https://www.spencergreenberg.com/2021/03/twelve-recursive-explanations/</link>
					<comments>https://www.spencergreenberg.com/2021/03/twelve-recursive-explanations/#respond</comments>
		
		<dc:creator><![CDATA[Admin]]></dc:creator>
		<pubDate>Sun, 21 Mar 2021 17:37:00 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[anthropics]]></category>
		<category><![CDATA[Baader-Meinhof effect]]></category>
		<category><![CDATA[common knowledge]]></category>
		<category><![CDATA[explore-exploit tradeoff]]></category>
		<category><![CDATA[inferential distance]]></category>
		<category><![CDATA[lists of explanations]]></category>
		<category><![CDATA[meta]]></category>
		<category><![CDATA[opportunity costs]]></category>
		<category><![CDATA[Overton Window]]></category>
		<category><![CDATA[Pareto Optimality]]></category>
		<category><![CDATA[rationality]]></category>
		<category><![CDATA[recursion]]></category>
		<category><![CDATA[satire]]></category>
		<category><![CDATA[Schelling points]]></category>
		<category><![CDATA[Sturgeon&#039;s Law]]></category>
		<category><![CDATA[sunk cost fallacy]]></category>
		<guid isPermaLink="false">https://www.spencergreenberg.com/?p=2684</guid>

					<description><![CDATA[If the Overton Window were not inside of itself, you&#8217;d think I was crazy for writing this. Is it just me, or has the Baader-Meinhof effect been popping up all over the place ever since I learned about it? It&#8217;s hard to justify learning about opportunity costs when there are so many other things you [&#8230;]]]></description>
										<content:encoded><![CDATA[
<ol class="wp-block-list"><li>If the Overton Window were not inside of itself, you&#8217;d think I was crazy for writing this.</li><li>Is it just me, or has the Baader-Meinhof effect been popping up all over the place ever since I learned about it?</li><li>It&#8217;s hard to justify learning about opportunity costs when there are so many other things you could be doing with that time.</li><li>I don&#8217;t think the idea of being Pareto Optimal has made anyone better off without making at least one person worse off.</li><li>What can we infer from the fact that we find ourselves living in a world where we&#8217;ve invented the idea of &#8220;Anthropics&#8221;?</li><li>Everyone knows that everyone knows that everyone knows that everyone knows (and so on) what common knowledge is.</li><li>Ninety percent of explanations of Sturgeon&#8217;s Law are crap.</li><li>I would teach you about Inferential Distance, but it would take too long to explain it to you.</li><li>Let&#8217;s meet at the place where you think that I think that you think that I think that you think is a good place to discuss Schelling Points.</li><li>If you think this sentence is meta, you&#8217;re mistaken; it is one level higher than that.</li><li>You should use some of your time learning about new ideas, like the explore-exploit tradeoff, and the rest of your time applying ideas you already know well.</li><li>I wasn&#8217;t going to include this explanation of the sunk cost fallacy because it&#8217;s obviously bad, but at this point, I&#8217;ve already invested time into it.</li></ol>



<hr class="wp-block-separator"/>



<p>If you liked this piece, you may also like <a href="https://www.spencergreenberg.com/2020/07/50-laws-of-everything/">50 “Laws” of Everything</a>.</p>



<hr class="wp-block-separator"/>



<p><em>This piece was first written on March 21, 2021, and first appeared on this site on March 18, 2022.</em></p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2021/03/twelve-recursive-explanations/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">2684</post-id>	</item>
	</channel>
</rss>
