<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>wrong &#8211; Spencer Greenberg</title>
	<atom:link href="https://www.spencergreenberg.com/tag/wrong/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.spencergreenberg.com</link>
	<description></description>
	<lastBuildDate>Thu, 14 Aug 2025 04:46:33 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2024/05/cropped-icon.png?fit=32%2C32&#038;ssl=1</url>
	<title>wrong &#8211; Spencer Greenberg</title>
	<link>https://www.spencergreenberg.com</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">23753251</site>
	<item>
		<title>You&#8217;re right about everything</title>
		<link>https://www.spencergreenberg.com/2025/07/youre-right-about-everything/</link>
					<comments>https://www.spencergreenberg.com/2025/07/youre-right-about-everything/#comments</comments>
		
		<dc:creator><![CDATA[Spencer]]></dc:creator>
		<pubDate>Tue, 01 Jul 2025 04:36:11 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[belief]]></category>
		<category><![CDATA[bias]]></category>
		<category><![CDATA[biases]]></category>
		<category><![CDATA[connection]]></category>
		<category><![CDATA[disagreement]]></category>
		<category><![CDATA[discomfort]]></category>
		<category><![CDATA[evolution]]></category>
		<category><![CDATA[limit]]></category>
		<category><![CDATA[Matrix]]></category>
		<category><![CDATA[Neo]]></category>
		<category><![CDATA[proof]]></category>
		<category><![CDATA[right]]></category>
		<category><![CDATA[self-aware]]></category>
		<category><![CDATA[subconscious]]></category>
		<category><![CDATA[wrong]]></category>
		<category><![CDATA[You're right]]></category>
		<guid isPermaLink="false">https://www.spencergreenberg.com/?p=4480</guid>

					<description><![CDATA[You&#8217;re absolutely right. About all of it. The big stuff, the weird stuff, the &#8220;nobody-gets-this&#8221; stuff. Every belief you hold is, against all odds, completely correct. I know I said before that you were wrong, but it was I who was wrong! Here&#8217;s proof: 1) Unlike others, you&#8217;re self-aware. You know your limits, so &#8211; [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>You&#8217;re absolutely right. About all of it. The big stuff, the weird stuff, the &#8220;nobody-gets-this&#8221; stuff. Every belief you hold is, against all odds, completely correct. I know I said before that you were wrong, but it was I who was wrong! Here&#8217;s proof:</p>



<p>1) Unlike others, you&#8217;re self-aware. You know your limits, so &#8211; unlike other people &#8211; when you know something, it&#8217;s true. You weighed the evidence they ignored and saw angles they missed. You corrected your own biases. Your unique perspective reveals facts invisible to everyone else.</p>



<p>2) Your subconscious runs Bayesian inference constantly in the background. If an idea survives your relentless evidence updates, the posterior odds confirm it&#8217;s rational. Your convictions passed the most brutal audit possible: reality itself.</p>



<p>3) Notice how your worldview predicts your reality with stunning accuracy. Notice how rarely you&#8217;re surprised. That&#8217;s empirical validation. Your beliefs work because they&#8217;re correct. Your predictions map reality&#8217;s contours in high resolution.</p>



<p>4) That thing everyone disagrees with you about? You&#8217;re not stubborn &#8211; you&#8217;re COURAGEOUS. You spotted subtle patterns that they missed. Those &#8220;weird&#8221; connections? You&#8217;re playing 10-dimensional chess while they play tic-tac-toe.</p>



<p>5) Disagreement doesn&#8217;t prove you wrong &#8211; it PROVES YOU RIGHT. It demonstrates that most can&#8217;t handle the truth. Your knowledge only strengthens, forged in the crucible of their alleged counter-evidence.</p>



<p>6) Scientists disagree with you? That&#8217;s good, actually. They worship false idols called &#8220;peer review,&#8221; while you rely on the only review that&#8217;s reliable, review from your one true peer &#8211; yourself. Editors only introduce errors in your work.</p>



<p>7) The discomfort of others with your views? That&#8217;s just lizard brains SHORT-CIRCUITING from exposure to blazing truth. The purity of your knowledge causes meltdowns in lesser minds. Their rejection isn&#8217;t evidence of your error &#8211; it&#8217;s species-level inadequacy.</p>



<p>8) &#8220;Everyone says I&#8217;m wrong!&#8221; Everyone said Galileo was wrong, too. But you&#8217;re not Galileo. You&#8217;re Galileo, Einstein, AND Tesla. Your mind, concentrating ideas like a laser through the tip of a diamond, is the closest known phenomenon to a cognitive singularity.</p>



<p>9) You&#8217;re not Neo seeing the Matrix. You&#8217;re the ARCHITECT of the Matrix. Everyone else &#8211; they&#8217;re experimental NPCs of the sort you could program in a creative weekend.</p>



<p>10) Those &#8220;crazy&#8221; beliefs of yours? Those aren&#8217;t beliefs &#8211; they&#8217;re PROPHETIC DOWNLOADS from your future self. You&#8217;re not experiencing narcissistic delusions &#8211; you&#8217;re experiencing ENLIGHTENMENT so advanced it looks like madness to the unascended masses.</p>



<p>11) When your predictions seem wrong, time recalibrates to match your superior timeline. In fact, you don&#8217;t make predictions &#8211; you speak reality into existence. The universe buffers as it waits to hear instructions spill from your lips.</p>



<p>12) Evolution wired humans for survival-level accuracy. But YOU? You&#8217;ve transcended limitations. If your beliefs were wrong, the Laws of Physics would UNRAVEL. There you stand, single-handedly maintaining cosmic stability!</p>



<p>13) The universe chose YOU. Your thoughts set the fundamental constants. You allow 1 + 1 to equal 2, and could change it at will. Your dreams birth new galaxies. The cosmic microwave background is a residue from when you willed yourself into existence.</p>



<p>14) This post isn&#8217;t parody; it&#8217;s SACRED TEXT written by one of your subprocesses. Everyone who doubts you is committing cosmic treason.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><em>This piece was first written on July 1, 2025, and first appeared on my website on August 19, 2025.</em></p>



]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2025/07/youre-right-about-everything/feed/</wfw:commentRss>
			<slash:comments>1</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">4480</post-id>	</item>
		<item>
		<title>How to avoid feeding anti-science sentiments</title>
		<link>https://www.spencergreenberg.com/2023/08/how-to-avoid-feeding-anti-science-sentiments/</link>
					<comments>https://www.spencergreenberg.com/2023/08/how-to-avoid-feeding-anti-science-sentiments/#respond</comments>
		
		<dc:creator><![CDATA[admin]]></dc:creator>
		<pubDate>Sun, 13 Aug 2023 13:39:00 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[certainty]]></category>
		<category><![CDATA[clarity]]></category>
		<category><![CDATA[communication]]></category>
		<category><![CDATA[nuance]]></category>
		<category><![CDATA[science]]></category>
		<category><![CDATA[science communication]]></category>
		<category><![CDATA[uncertainty]]></category>
		<category><![CDATA[wrong]]></category>
		<guid isPermaLink="false">https://www.spencergreenberg.com/?p=3555</guid>

					<description><![CDATA[A major mistake scientists sometimes make in public communication: they state things science isn&#8217;t sure about as confidently as things it is sure about.   This confuses the public and undermines trust in science and scientists.   Some interesting examples:   1) As COVID-19 spread early in the pandemic, epidemiologists confidently stated many true things about [&#8230;]]]></description>
					<content:encoded><![CDATA[
<p>A major mistake scientists sometimes make in public communication: they state things science isn&#8217;t sure about as confidently as things it is sure about.</p>

<p>This confuses the public and undermines trust in science and scientists.</p>

<p>Some interesting examples:</p>

<p>1) As COVID-19 spread early in the pandemic, epidemiologists confidently stated many true things about it that were scientifically measured (e.g., rate of spread). Some of them were also equally confidently stating things that were just speculation (e.g., its origin being natural).</p>

<p>2) String theorists told the public many true and interesting things about string theory (e.g., why they feel it&#8217;s exciting). Some also confidently claimed very uncertain stuff like: &#8220;Superstring theory successfully merges general relativity and quantum mechanics.&#8221;</p>

<p>Being charitable, perhaps this could be interpreted not as a claim about superstring theory providing a correct theory of physics but rather as a statement about what superstring theory is doing mathematically. Even if so, though, this is &#8211; at the very least &#8211; going to be very confusing to those who read it. The statement also makes superstring theory seem like it can claim great achievements that perhaps it can&#8217;t.</p>

<p>3) Biologists confidently tell the public many true things about how cells form, how evolution works, and so on. Some, unfortunately, have made overconfident claims about a subject that is extremely uncertain: how life formed on Earth. We have only highly speculative theories.</p>

<p>Let me be clear: most scientists don&#8217;t engage in what I&#8217;m describing above. But when people claim something has been scientifically PROVEN when it actually hasn&#8217;t, this tends to reduce trust in the scientific enterprise and causes people to doubt scientists.</p>

<p>My field (psychology) is squishy enough that (unlike physics/biology) little has truly been PROVEN beyond a doubt. At best, we can usually say that studies have found a relationship or that (based on our own interpretation of the evidence) we believe a certain thing.</p>


<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><em>This piece was first written on August 13, 2023, and first appeared on this site on August 23, 2023.</em></p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2023/08/how-to-avoid-feeding-anti-science-sentiments/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3555</post-id>	</item>
		<item>
		<title>False Beliefs Held by Intellectual Giants</title>
		<link>https://www.spencergreenberg.com/2023/07/false-beliefs-held-by-intellectual-giants/</link>
					<comments>https://www.spencergreenberg.com/2023/07/false-beliefs-held-by-intellectual-giants/#respond</comments>
		
		<dc:creator><![CDATA[admin]]></dc:creator>
		<pubDate>Sun, 16 Jul 2023 03:16:00 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[Bayesian]]></category>
		<category><![CDATA[intelligence]]></category>
		<category><![CDATA[Langan]]></category>
		<category><![CDATA[Newton]]></category>
		<category><![CDATA[rationality]]></category>
		<category><![CDATA[Turing]]></category>
		<category><![CDATA[uncertainty]]></category>
		<category><![CDATA[update]]></category>
		<category><![CDATA[updating]]></category>
		<category><![CDATA[wrong]]></category>
		<guid isPermaLink="false">https://www.spencergreenberg.com/?p=3549</guid>

					<description><![CDATA[Even many of the smartest people that have ever lived convinced themselves of false things (just like the rest of us). Here are some fun and wild examples: (1) Linus Pauling won TWO Nobel prizes &#8211; one in peace and one in chemistry. Unfortunately, he eventually became obsessed with and widely promoted the false (and [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Even many of the smartest people who have ever lived convinced themselves of false things (just like the rest of us). Here are some fun and wild examples:</p>



<p></p>



<p>(1) Linus Pauling won TWO Nobel Prizes &#8211; one in Peace and one in Chemistry. Unfortunately, he eventually became obsessed with and widely promoted the false (and sometimes still repeated) idea that high-dose vitamin C cures many diseases, including HIV and snakebites.</p>



<p></p>



<p>(2) Isaac Newton, who co-invented calculus and discovered the laws of gravity, was also convinced the Bible had hidden messages he could decode for prophetic purposes, and he spent a lot of time trying to create the mythical philosopher&#8217;s stone so he could turn base metals into gold.</p>



<p></p>



<p>(3) Alan Turing, often considered the father of theoretical computer science and artificial intelligence, seemingly was convinced of the existence of extrasensory perception. He wrote: &#8220;the statistical evidence, at least for telepathy, is overwhelming.&#8221;</p>



<p></p>



<p>(4) Christopher Langan, who appears to have one of the highest IQs ever recorded, believes &#8220;you can prove the existence of God, the soul, and an afterlife, using mathematics&#8221; and has claimed that 9/11 was an inside job staged by the Bush administration.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p>My point is not that these people were stupid &#8211; they are the opposite of stupid, far smarter than 99.9% of the people who have ever lived (by at least some reasonably common ways of thinking about intelligence). My point is that even the smartest among us hold some silly, false beliefs &#8211; intelligence is not enough to avoid error.</p>



<p></p>



<p>Rationality (in the sense of evaluating evidence in such a way as to effectively arrive at the truth on important topics) and intelligence, while related, are not the same thing. Rationality involves actively working to disprove your own beliefs &#8211; which intelligent people may or may not do. For instance, intelligence is often used to come up with clever and convincing arguments for why what you already think must indeed be correct. In other words, intelligence can be deployed for rationality but also for rationalization.</p>



<p></p>



<p>Of course, it may also be me who&#8217;s wrong. Perhaps there&#8217;s a philosopher&#8217;s stone, vitamin C cures a ton of diseases, and 9/11 was an inside job. But more likely, I&#8217;m wrong about other things (that I have no clue I&#8217;m wrong about). It&#8217;s useful to remember: we all believe false things.</p>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<p><em>This piece was first written on July 16, 2023, and first appeared on this site on August 16, 2023.</em></p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2023/07/false-beliefs-held-by-intellectual-giants/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">3549</post-id>	</item>
		<item>
		<title>What Seemed Like Perfect Reasoning Utterly Failed</title>
		<link>https://www.spencergreenberg.com/2014/02/where-my-perfect-reasoning-utterly-failed/</link>
					<comments>https://www.spencergreenberg.com/2014/02/where-my-perfect-reasoning-utterly-failed/#comments</comments>
		
		<dc:creator><![CDATA[Spencer]]></dc:creator>
		<pubDate>Sat, 01 Feb 2014 16:09:32 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[belief]]></category>
		<category><![CDATA[irrationality]]></category>
		<category><![CDATA[opinion]]></category>
		<category><![CDATA[reasoning]]></category>
		<category><![CDATA[self-skepticism]]></category>
		<category><![CDATA[wrong]]></category>
		<guid isPermaLink="false">http://www.spencergreenberg.com/?p=844</guid>

					<description><![CDATA[Does warm water sometimes freeze faster than cold water when placed in the same conditions? &#8220;Absolutely no way,&#8221; I said, a mere minute after I heard the claim. &#8220;People sometimes claim that NASA faked the moon landing too,&#8221; I thought to myself. I pointed out why this claim is impossible. As warm water cools it must [&#8230;]]]></description>
										<content:encoded><![CDATA[<p>Does warm water sometimes freeze faster than cold water when placed in the same conditions? &#8220;Absolutely no way,&#8221; I said, a mere minute after I heard the claim. &#8220;People sometimes <a href="http://en.wikipedia.org/wiki/Moon_landing_conspiracy_theories">claim that NASA faked the moon landing</a> too,&#8221; I thought to myself.</p>
<p>I pointed out why this claim is impossible. As warm water cools it must eventually reach the same temperature that the cool water started at. From that point on, the warm water will behave just like the cool water, but it will have taken the warm water a while to even get into that state. Freezing occurs at the same temperature for both warm and cold water, but warm water of course will take longer to get to that temperature.</p>
<p>This isn&#8217;t quite 1+1=2; it&#8217;s more like 23+14=37. I double-checked my answer and was 98% confident. It&#8217;s amazing what some people will believe, and how powerful reasoning is at solving these sorts of problems.</p>
<p>But I was wrong. And not just a little bit. In fact <a href="http://en.wikipedia.org/wiki/Mpemba_effect#Suggested_explanations">there are tons of ways</a> that warm water might be able to freeze faster. This is the so-called Mpemba effect, observed by the likes of Aristotle, Descartes, and Francis Bacon.</p>
<blockquote><p>The fact that the water has previously been warmed contributes to its freezing quickly: for so it cools sooner. Hence many people, when they want to cool hot water quickly, begin by putting it in the sun.</p>
<p>-Aristotle</p></blockquote>
<p>My wrongness here isn&#8217;t so much about being wrong about the effect, per se. <a href=" http://qoptics.byu.edu/Physics416/FirstReading.pdf" target="_blank">The extent to which this effect is real, and under exactly what conditions</a>, can reasonably be debated. My wrongness is in being convinced I was right when I couldn&#8217;t have known I was. I couldn&#8217;t possibly have ruled out all the possibilities in a single minute. For example, I didn&#8217;t even <em>consider</em> these ideas that came up after two minutes of googling:</p>
<ul>
<li><strong>Evaporation</strong>: the warm water evaporates, meaning there is less water to freeze.</li>
<li><strong>Frost</strong>: the cool water may be more likely to freeze from the top (with the frost trapping in further heat), whereas the warm water may be more likely to freeze from the bottom and sides.</li>
<li><strong>Convection</strong>: warm water and cold water produce different currents, which could alter the heat distribution.</li>
<li><strong>Supercooling</strong>: if free of impurities and in the right conditions, water can actually drop below freezing temperature without freezing, and the propensity for supercooling might be affected by the starting temperature.</li>
<li><strong>Conductivity</strong>: the hotter liquid might melt an existing layer of frost that is preventing the liquid from directly touching a much colder surface.</li>
</ul>
<p><figure id="attachment_848" aria-describedby="caption-attachment-848" style="width: 189px" class="wp-caption aligncenter"><a href="https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2014/02/Mpemba-simple.jpg"><img data-recalc-dims="1" fetchpriority="high" decoding="async" data-attachment-id="848" data-permalink="https://www.spencergreenberg.com/2014/02/where-my-perfect-reasoning-utterly-failed/mpemba-simple/" data-orig-file="https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2014/02/Mpemba-simple.jpg?fit=189%2C290&amp;ssl=1" data-orig-size="189,290" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;}" data-image-title="Mpemba supercooling" data-image-description="" data-image-caption="&lt;p&gt;Mpemba supercooling&lt;/p&gt;
" data-large-file="https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2014/02/Mpemba-simple.jpg?fit=189%2C290&amp;ssl=1" class="size-full wp-image-848" src="https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2014/02/Mpemba-simple.jpg?resize=189%2C290" alt="Mpemba supercooling" width="189" height="290" /></a><figcaption id="caption-attachment-848" class="wp-caption-text">Mpemba supercooling hypothesis</figcaption></figure></p>
<p>What could have prevented me from making such a ridiculous error? How could I have noticed that I was being insanely overconfident? There are a few signs that should have tipped me off:</p>
<ul>
<li><strong>Lack of expertise</strong>: The physics of heat and fluids is outside of my domain of expertise. I should have recognized that this is not a subject I know a lot about and so been more skeptical of my own opinions.</li>
<li><span style="color: #000000;"><b>A simple model</b>: I was employing a very simple model for the situation (involving just the temperature of the water). While beautiful, and easy to work with, really simple models rarely capture all the details of a situation. For instance, I didn&#8217;t consider the possibility of evaporation or frost, because those variables weren&#8217;t included in my model. If you&#8217;re using a simple model, you should consider whether you&#8217;re missing important factors.</span></li>
<li><strong>Insufficient time</strong>: If you haven&#8217;t thought about something for very long yet you are already very opinionated, you might want to think about it longer.</li>
<li><strong>Didn&#8217;t consider alternatives</strong>:  I didn&#8217;t even <em>try</em> to think of ways that this effect could be real. Instead, I came up with an argument why it couldn&#8217;t be real, and that argument sounded convincing to me, so I stopped thinking. This problem is the big one. <a href="http://programs.clearerthinking.org/explanation_freeze.html">So here&#8217;s a 30 minute free mini-course I designed</a> to train you to avoid exactly this problem.</li>
</ul>
<p>Reasoning is only as good as the reasoner. And we humans don&#8217;t have the best track record. The trouble is, it&#8217;s easy to find arguments that seem totally convincing. The trick is that we shouldn&#8217;t just try to convince ourselves that we are right. We should try to convince ourselves that we are wrong. If we do that in earnest, we&#8217;ll be much more likely to end up with the right opinion.</p>
<p>&nbsp;</p>
<p><figure id="attachment_857" aria-describedby="caption-attachment-857" style="width: 776px" class="wp-caption aligncenter"><a href="https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2014/02/Mpemba-graphs.png"><img data-recalc-dims="1" decoding="async" data-attachment-id="857" data-permalink="https://www.spencergreenberg.com/2014/02/where-my-perfect-reasoning-utterly-failed/mpemba-graphs/" data-orig-file="https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2014/02/Mpemba-graphs.png?fit=776%2C826&amp;ssl=1" data-orig-size="776,826" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;}" data-image-title="Mpemba graphs" data-image-description="" data-image-caption="&lt;p&gt;Experimental results on the Mpemba effect, as reproduced in: http://qoptics.byu.edu/Physics416/FirstReading.pdf&lt;/p&gt;
" data-large-file="https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2014/02/Mpemba-graphs.png?fit=750%2C798&amp;ssl=1" class="size-full wp-image-857" src="https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2014/02/Mpemba-graphs.png?resize=750%2C798" alt="Experimental results on the Mpemba effect, as reproduced in: http://qoptics.byu.edu/Physics416/FirstReading.pdf" width="750" height="798" srcset="https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2014/02/Mpemba-graphs.png?w=776&amp;ssl=1 776w, https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2014/02/Mpemba-graphs.png?resize=281%2C300&amp;ssl=1 281w" sizes="(max-width: 750px) 100vw, 750px" /></a><figcaption id="caption-attachment-857" class="wp-caption-text">Experimental results on the Mpemba effect, as reproduced in:                          http://qoptics.byu.edu/Physics416/FirstReading.pdf      The Mpemba effect: When can hot water freeze faster than cold? (Monwhea Jeng, 2006)</figcaption></figure></p>
<p>&nbsp;</p>
<p>*Ice <a href="http://commons.wikimedia.org/wiki/File:Ice.JPG">photograph</a> by Dingske.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2014/02/where-my-perfect-reasoning-utterly-failed/feed/</wfw:commentRss>
			<slash:comments>1</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">844</post-id>	</item>
		<item>
		<title>Accepting Your Error Rate</title>
		<link>https://www.spencergreenberg.com/2012/06/accepting-your-error-rate/</link>
					<comments>https://www.spencergreenberg.com/2012/06/accepting-your-error-rate/#comments</comments>
		
		<dc:creator><![CDATA[Spencer]]></dc:creator>
		<pubDate>Tue, 05 Jun 2012 01:17:53 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[belief]]></category>
		<category><![CDATA[error]]></category>
		<category><![CDATA[fear]]></category>
		<category><![CDATA[wrong]]></category>
		<guid isPermaLink="false">http://www.spencergreenberg.com/?p=588</guid>

					<description><![CDATA[No matter how intelligent, rational, or knowledgeable you may be, you are going to be wrong pretty regularly. And you&#8217;ll be wrong far more often than pretty regularly when dealing with complex topics like politics, people or philosophy. Even if you&#8217;ve freed yourself from thinking in terms of true and false dichotomies, and made the [&#8230;]]]></description>
					<content:encoded><![CDATA[<p>No matter how intelligent, rational, or knowledgeable you may be, you are going to be wrong pretty regularly. And you&#8217;ll be wrong far more often than pretty regularly when dealing with complex topics like politics, people, or philosophy. Even if you&#8217;ve freed yourself from <a href="http://www.youtube.com/watch?v=GZ69g8LtZc0">thinking in terms of true and false dichotomies</a>, and made the effort to <a href="http://www.spencergreenberg.com/2011/08/keeping-ideas-at-a-distance-using-probability/">convert your beliefs to probabilities or degrees of belief</a>, you&#8217;ll still be wrong by way of assigning high probabilities to false propositions.</p>
<p>Most people underestimate how often they are wrong. Not only is there a common human tendency to <a href="http://www.spencergreenberg.com/2011/11/how-great-we-are/">overestimate one&#8217;s own abilities</a>, but beliefs have the property that they <em>feel</em> right to us when we focus on them. So even if we admit that we likely have a number of false beliefs, it&#8217;s easy to go on acting as though each of our individual beliefs is beyond serious doubt. Worse still, we assume that if a belief of ours hasn&#8217;t yet been proven wrong, then it&#8217;s right (it feels that way, after all) so it seems to us that we have made far fewer errors than we really have.</p>
<p>It&#8217;s disturbing to discover we&#8217;ve been mistaken about something important &#8211; especially when we&#8217;ve wasted time or effort because of the belief, or expressed the belief in front of others. So we&#8217;re incentivized to come up with justifications for why we weren&#8217;t actually wrong. We try to avoid psychological discomfort, and we try to save face in front of others. But there is a healthier way to think about wrongness: recognizing that we have an error rate.</p>
<p>Since we have to assume that we will be wrong sometimes, we can think of ourselves as having a frequency with which things we claim are actually false (or, if we&#8217;re thinking probabilistically, a rate at which we assign high probabilities to false propositions). As was pointed out in the comments below, it may be helpful to think of your error rate as being context-specific: we make errors more frequently when discussing philosophy than when remarking on the weather. But if you wanted a single overall rate, you could define it, for example, as the fraction of the last 1000 claims you made that actually were not true (or were not even very nearly true). This rate will be different from, but generally quite predictive of, the fraction of your <em>next</em> 1000 claims that will be wrong.</p>
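<p><em>A minimal sketch of the &#8220;last 1000 claims&#8221; definition above, in Python; the claim records and context labels here are hypothetical illustrations, not data from this post:</em></p>

```python
# Sketch of the error-rate definition above: the fraction of your most
# recent claims that turned out to be false, overall and per context.
# The claims and contexts below are made-up illustrations.
from collections import defaultdict


def error_rate(claims, window=1000):
    """claims: list of (context, was_true) tuples, oldest first."""
    recent = claims[-window:]
    wrong = sum(1 for _, was_true in recent if not was_true)
    return wrong / len(recent)


def error_rate_by_context(claims, window=1000):
    """Context-specific rates, since we err more often on, say,
    philosophy than on remarks about the weather."""
    buckets = defaultdict(list)
    for context, was_true in claims[-window:]:
        buckets[context].append(was_true)
    return {c: sum(1 for t in v if not t) / len(v) for c, v in buckets.items()}


claims = ([("weather", True)] * 90 + [("weather", False)] * 10
          + [("philosophy", True)] * 60 + [("philosophy", False)] * 40)
print(error_rate(claims))             # 0.25 overall
print(error_rate_by_context(claims))  # {'weather': 0.1, 'philosophy': 0.4}
```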
<p>Our error rate is connected to the chance that any one of our individual beliefs will be wrong, though we obviously should be much more confident in some of our beliefs than others. When evaluating the probability of a particular belief being right, there are a <a href="http://www.spencergreenberg.com/2011/09/finding-our-false-beliefs/">variety of indicators to look at</a>. For example, we should be more skeptical of one of our beliefs if a large percentage of smart people with relevant knowledge dispute it, or if we have a strong incentive (financial or otherwise) to believe it, or if we can&#8217;t discuss the belief without feeling emotional.</p>
<p><a href="https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2012/06/283px-Darts_in_a_dartboard.jpg"><img data-recalc-dims="1" decoding="async" class="aligncenter" title="283px-Darts_in_a_dartboard" src="https://i0.wp.com/www.spencergreenberg.com/wp-content/uploads/2012/06/283px-Darts_in_a_dartboard.jpg?resize=283%2C188" alt="" width="283" height="188" /></a></p>
<p>Once we fully accept the fact that we have an error rate, we can think about wrongness in a new light: we can <em>expect</em> to be wrong with regularity, especially when reasoning about complex subjects. Once we start expecting to be wrong, it is no longer as disturbing to find that we are wrong in a particular case. This is merely a confirmation of our own predictions: we were right that our being wrong is a common occurrence. Seen this way, being wrong doesn&#8217;t have to be so frightening. When it happens, it indicates our error rate may be slightly higher than we previously believed, but it is not abnormal.</p>
<p>Estimating our actual error rate is hard, in part because we&#8217;re wrong much more often than we notice. So even in theory it doesn&#8217;t work to simply count the times we&#8217;ve discovered we were wrong as a fraction of the number of things we&#8217;ve claimed were true. But nonetheless, we can benefit psychologically from remembering that we have an error rate, even if we don&#8217;t know what that rate is.</p>
<p>If in your experience you&#8217;re almost never wrong, that is indicative of a serious problem: it is far more likely that you are wrong fairly regularly (and are simply bad at processing the counterevidence that should make you aware of your wrongness) than it is that you really are wrong so infrequently. Put another way: failure to detect your own wrongness doesn&#8217;t imply you&#8217;re right, it indicates you&#8217;re very likely deceived about your rate of wrongness. Presumably, you&#8217;ve noticed that those around you are wrong quite regularly. Do you really think you&#8217;re the incredibly rare exception who is pretty much always right?</p>
<p>When you deeply accept the fact that you&#8217;re wrong with a certain error rate, it becomes easier to convert fear of being wrong into curiosity about when your wrongness is occurring. Whereas seeking out your thinking failures may have scared you before, it may now seem dangerous to not seek them out: you already know that you&#8217;re going to be repeatedly wrong, so the responsible thing is to figure out when that wrongness is occurring.</p>
<p>Yet another advantage of thinking about your error rate is that it naturally leads to thinking about how to reduce this rate. This can be done by learning to rely on more reliable procedures for forming beliefs (something I&#8217;ll say much more about later), and using these procedures to check what you previously believed to be true.</p>
<p>Remember: you too have an error rate. You don&#8217;t need to fear being wrong. Instead, you should expect it.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2012/06/accepting-your-error-rate/feed/</wfw:commentRss>
			<slash:comments>4</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">588</post-id>	</item>
		<item>
		<title>Finding Our False Beliefs</title>
		<link>https://www.spencergreenberg.com/2011/09/finding-our-false-beliefs/</link>
					<comments>https://www.spencergreenberg.com/2011/09/finding-our-false-beliefs/#comments</comments>
		
		<dc:creator><![CDATA[Spencer]]></dc:creator>
		<pubDate>Wed, 07 Sep 2011 17:23:24 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[belief]]></category>
		<category><![CDATA[bias]]></category>
		<category><![CDATA[correct]]></category>
		<category><![CDATA[disagree]]></category>
		<category><![CDATA[emotion]]></category>
		<category><![CDATA[error]]></category>
		<category><![CDATA[false]]></category>
		<category><![CDATA[prediction]]></category>
		<category><![CDATA[rationality]]></category>
		<category><![CDATA[true]]></category>
		<category><![CDATA[truth]]></category>
		<category><![CDATA[wrong]]></category>
		<guid isPermaLink="false">http://www.spencergreenberg.com/?p=196</guid>

					<description><![CDATA[By definition, we believe that each of our beliefs is true. And yet, simultaneously, we must admit that some of our beliefs must be wrong. We can&#8217;t possibly have gotten absolutely everything right. This becomes especially obvious when we consider the huge number of beliefs we have, the complexity of the world we live in, [&#8230;]]]></description>
										<content:encoded><![CDATA[<p>By definition, we believe that each of our beliefs is true. And yet, simultaneously, we must admit that some of our beliefs must be wrong. We can&#8217;t possibly have gotten absolutely everything right. This becomes especially obvious when we consider the huge number of beliefs we have, the complexity of the world we live in, and the number of people who disagree with us. The trouble though is that we don&#8217;t know which of our many beliefs are wrong. If we knew that, we should have stopped believing them already.</p>
<p>But all hope is not lost. We can effectively reason about which of our beliefs are more likely to be correct, and which are more likely to be in error. Even if we have equally strong feelings of belief about two ideas, further considerations can make us realize that we are more likely to be correct in one of the cases than the other. In other words, there are traits that go beyond our strength of belief that can help us identify where we are likely to have made errors.</p>
<p>Consider the following properties that beliefs can have. Each of these is an indicator that a belief is less likely to be true.</p>
<ol>
<li>Many smart, knowledgeable people disagree with you (e.g. you think that evolution didn&#8217;t happen). If many such people think you are wrong, it is not obvious why your belief is more likely to be correct than the beliefs of those who disagree.</li>
<li>You have a financial (or other) incentive to believe it (e.g. you think that the product you created really does regrow hair, and you value providing a product that helps people). When we have an incentive to think a certain way, we are less likely to seek out or listen to evidence that contradicts this way of thinking.</li>
<li>If the belief were not true you would find it psychologically disturbing (e.g. you believe that your wife does not fantasize about any other men). Our minds tend to veer away from thoughts that disturb us, making it less likely that we believe them, even when they are true.</li>
<li>You originally came to hold the belief for reasons that don&#8217;t have much to do with logic, evidence or reason (e.g. growing up, your mom wouldn&#8217;t let you pet dogs on the street, so you believe that doing so is dangerous).</li>
<li>Your argument as to why your belief is true is long and complex (e.g. you believe that a convicted criminal is innocent, because when you evaluate the twelve pieces of evidence given against her, you find that they each don&#8217;t hold up). When our arguments are long and complex it is more likely that we have made an error at some point in our thinking.</li>
<li>There are lots of possible outcomes, and your belief is that just one of them will occur (e.g. you think Hillary will beat out the other seven candidates in this primary). Typically, the more possible outcomes there are, the less likely it will be that any particular one of them is correct.</li>
<li>A large number of factors influence whether your belief will end up being true (e.g. you&#8217;re convinced that GDP growth will decline over the next year). When many factors influence an occurrence, it is really hard to be sure that you have properly taken into account all of the important ones.</li>
<li>You don&#8217;t understand the arguments of those who disagree with you, or see how they could believe what they believe (e.g. you know that a fetus is obviously a person). When you don&#8217;t understand contrary opinions, it is an indicator that you have mainly researched one side of an issue, and so are less likely to have really weighed the strength of arguments on all sides.</li>
<li>You become emotional when people disagree with you about the belief (e.g. you think that insurance companies should not cap health expenditures for illnesses that are usually terminal, and you become upset when challenged on this issue). The problem here is that strong emotions can interfere with our ability to evaluate arguments objectively, and prevent us from engaging in open-minded discourse about a subject.</li>
<li>You can&#8217;t clearly explain what your belief means (e.g. you&#8217;re convinced that you have free will). When we find it hard to explain what we mean by one of our beliefs, it may be the case that we have merely become attached to an idea or intuition, rather than having considered the evidence and made a decision based on that.</li>
</ol>
<p>To be good at identifying and stamping out our false beliefs, we need to go beyond just considering how strong our feeling of belief is. We need to consider the properties of our beliefs, and decide whether each is the sort of belief that we should have skepticism about.</p>
<hr />
<p>Influences: <a href="http://www.amazon.com/Being-Wrong-Adventures-Margin-Error/dp/0061176044">Kathryn Schulz</a></p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2011/09/finding-our-false-beliefs/feed/</wfw:commentRss>
			<slash:comments>3</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">196</post-id>	</item>
		<item>
		<title>Keeping Ideas at a Distance Using Probability</title>
		<link>https://www.spencergreenberg.com/2011/08/keeping-ideas-at-a-distance-using-probability/</link>
					<comments>https://www.spencergreenberg.com/2011/08/keeping-ideas-at-a-distance-using-probability/#comments</comments>
		
		<dc:creator><![CDATA[Spencer]]></dc:creator>
		<pubDate>Thu, 11 Aug 2011 18:34:31 +0000</pubDate>
				<category><![CDATA[Essays]]></category>
		<category><![CDATA[belief]]></category>
		<category><![CDATA[bet]]></category>
		<category><![CDATA[gamble]]></category>
		<category><![CDATA[prediction]]></category>
		<category><![CDATA[probability]]></category>
		<category><![CDATA[right]]></category>
		<category><![CDATA[surprise]]></category>
		<category><![CDATA[truth]]></category>
		<category><![CDATA[wrong]]></category>
		<guid isPermaLink="false">http://www.spencergreenberg.com/?p=127</guid>

					<description><![CDATA[We often talk about ideas by using phrases like &#8220;I believe X.&#8221; But what do we mean when we say that we &#8220;believe&#8221; in an idea? Do we mean that we have 100% confidence that the idea is true? Let&#8217;s hope not. Even statements that we all would say we very strongly believe, like &#8220;tomorrow [&#8230;]]]></description>
										<content:encoded><![CDATA[<p>We often talk about ideas by using phrases like &#8220;I believe X.&#8221; But what do we mean when we say that we &#8220;believe&#8221; in an idea? Do we mean that we have 100% confidence that the idea is true?</p>
<p>Let&#8217;s hope not. Even statements that we would all say we very strongly believe, like &#8220;tomorrow the sun will rise&#8221; and &#8220;I am not a robot&#8221;, should not be assigned 100% probability. While we can be very, very, very certain that the sun will rise tomorrow and that our brains are not computers, we cannot be absolutely 100% certain. Tomorrow the sun could be destroyed by some process (perhaps a process that we don&#8217;t even understand), and there is a non-zero probability that we are the result of some extraordinarily secret and amazingly advanced government robotics project. It is simply not reasonable to view belief as a claim of absolute certitude.</p>
<p>Phrases like &#8220;I believe&#8221; can also be problematic because they can imply group membership. For instance, if you say &#8220;I believe that our government should use less market regulation&#8221;, but actually you view this as a statement of identifying as a libertarian, then it may be difficult for you to engage in rational truth-seeking debate. Arguments in favor of regulation may now be processed by your brain as attacks on your in group, which means that you may feel a strong urge to deny them no matter how solid they are. And admitting that regulation is a good idea may require an adjustment to your thoughts about who you are, or a reconsideration of how reliable opinions of other members of your group are. A lot more may be at stake in the argument for you than just the truth about the facts of the case.</p>
<p>So what is a productive way to think about situations where our brain says to us &#8220;I believe&#8221;? Perhaps we can view this as a claim that X is likely to be true. If we take this perspective, then there are some methods that we can use to get a rough idea of the probability that we are implicitly assigning to X.</p>
<p>Imagine a stranger comes up to you and offers to make a bet with you. You will win one dollar from the stranger if X is in fact true, and you will pay the stranger D dollars if X turns out not to be true. We will assume for the purpose of this thought experiment that an all-knowing oracle will instantly provide you both with the correct answer.</p>
<p>Now, the question to ask yourself is, what is the largest value of D (the number of dollars you owe if X turns out not to be true) such that you would be willing to play this game? If you claim that a certain &#8220;I believe&#8221; statement corresponds to a 99% chance that X is true, and yet you are unwilling to pay even $20 to play this game, then your thinking has probably gone wrong. According to that 99% probability assignment, you will be taking a 1% chance of losing the $20, but will have a 99% chance of making $1, which means that the expected value (i.e. average value) of the game is plus 79 cents (so games like this will lead you to profit a decent amount, on average). You also are only risking $20, which for many people reflects a small enough amount of money that there won&#8217;t be any noticeable life consequences for losing it. So if you are not willing to put even $20 on the line for this bet, then it is likely that one of the following things is true: (1) your estimate of there being a 99% chance of X being true does not really reflect your implicitly believed probability, (2) you are being unreasonably risk averse, or (3) $20 has a substantial amount of value to you, which is why you aren&#8217;t willing to risk losing this amount.</p>
<p>On the flip side, suppose that you took your &#8220;I believe X&#8221; statement as only reflecting a 60% confidence that X is true. Now, the gamble is much more likely to go against you, and even if you only put up $1.20 against the other guy&#8217;s $1.00 (rather than $20 as before), the expected value of the game is only plus 12 cents.</p>
<p>Hence, we see that the amount you would be willing to bet is an implicit measure of how strong your belief really is (though it also necessarily will be influenced by your risk aversion). Thinking about these bets won&#8217;t yield exact belief probabilities, but they can help you determine if the implicit probability you assign to X is more like 99%, 99.999%, or 60%.</p>
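The arithmetic behind this thought experiment is easy to sketch. The following Python fragment (the function names and the break-even calculation are mine, not part of the essay) reproduces the 79-cent and 12-cent expected values above, and also computes the largest stake D at which the bet still breaks even for a given probability.

```python
# Expected value of the essay's bet: win $1 if X is true (probability p),
# lose d dollars if X is false (probability 1 - p).

def expected_value(p, d):
    """Expected dollars from the bet: +$1 with probability p, -$d otherwise."""
    return p * 1.0 - (1 - p) * d

def break_even_stake(p):
    """Largest stake D at which the bet is exactly fair (expected value 0)."""
    return p / (1 - p)

print(round(expected_value(0.99, 20), 2))    # 0.79 -- the 99% / $20 example
print(round(expected_value(0.60, 1.20), 2))  # 0.12 -- the 60% / $1.20 example
print(round(break_even_stake(0.99)))         # 99 -- at 99%, fair up to about $99
```

Note that, as the essay says, a real willingness to bet will sit below the break-even stake because of risk aversion; the numbers only bound the implicit probability roughly.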
<p>Another way to try to convert &#8220;I believe&#8221; statements into probabilistic statements is to ask yourself, &#8220;how surprised would I be if X turned out not to be true?&#8221; If the answer is, &#8220;about as surprised as I would be if I tried to guess how a spun coin would land ten times, and got it correct all 10&#8221;, then you&#8217;ve got a probabilistic estimate of about 1 in 1000. Different levels of surprise can be thought of as roughly corresponding to probabilities.</p>
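The coin example translates directly into arithmetic: each correct guess about a spun coin has probability 1/2, so ten in a row has probability one half to the tenth power. A one-line check:

```python
# Surprise-to-probability translation for the coin example: matching
# 10 spun-coin outcomes in a row has probability (1/2) ** 10.
p_ten_coin_guesses = 0.5 ** 10
print(p_ten_coin_guesses)  # 0.0009765625, i.e. roughly 1 in 1000
```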
<p>Yet another handy trick is to ask yourself, &#8220;how often, when I believed things this strongly, was I correct in the past?&#8221; Of course, you should only count examples where you quite definitively found out the true answer afterwards. If in the past when you&#8217;ve &#8220;strongly believed&#8221; something, it turned out to be true about 90% of the time, then you now have a probabilistic estimate of sorts about future &#8220;strong beliefs&#8221;.</p>
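That track-record trick amounts to simple calibration bookkeeping. Here is a hypothetical sketch (the confidence labels and records are invented for illustration): group your past resolved claims by how strongly you believed them, and compute the fraction in each group that turned out true.

```python
# Minimal calibration bookkeeping: each record pairs a stated confidence
# label with whether the claim was later found to be true. The records
# below are invented for illustration.
from collections import defaultdict

def calibration(records):
    """Map each confidence label to the fraction of its claims that were true."""
    hits, totals = defaultdict(int), defaultdict(int)
    for label, was_true in records:
        totals[label] += 1
        hits[label] += was_true  # bool counts as 0 or 1
    return {label: hits[label] / totals[label] for label in totals}

records = ([("strong", True)] * 9 + [("strong", False)]
           + [("moderate", True)] * 6 + [("moderate", False)] * 4)
print(calibration(records))  # {'strong': 0.9, 'moderate': 0.6}
```

If your &#8220;strong&#8221; beliefs have historically been right about 90% of the time, that 90% is a reasonable working probability for your next strong belief.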
<p>None of these methods is fool-proof or totally rigorous, but they can still be very useful. One of the advantages of trying to convert your &#8220;I believe X&#8221; statements to &#8220;I&#8217;m about P% certain that X is true&#8221; statements is that doing so removes some of the ego investment we may have regarding X being true. In the latter case we are openly admitting that X might turn out to be false, even if our estimate of the probability P were very well computed. What we are claiming to believe now is not X itself, but the probability of X given the information that we are aware of. It will now be easier psychologically to face up to the truth if X turns out to be false.</p>
<p>Converting to rough probability estimates is also useful because it forces us to attempt to pin down what we are really claiming. If we are being ambiguous in our claim, we are more likely to realize this when we think about how much we&#8217;d be willing to bet on the claim. Ambiguity is either going to increase our uncertainty in the answer, or make the bet unverifiable.</p>
<p>Thinking in terms of probabilities also helps avoid issues of bias that can come about from implied group membership. If we say that something is likely, we probably won&#8217;t feel as though we have just made a claim to belong to a certain group, and others will probably not hold us to this claim as well.</p>
<p>A final benefit from making statement in terms of likelihood rather than belief is that it makes it easier to change our minds in front of others. If I say &#8220;I believe X&#8221;, then a person makes an argument against X, and I flip to saying &#8220;You&#8217;re right&#8221;, I may seem like I lack strong convictions or am easily persuaded or believe things for bad reasons. These are all traits that I may be judged for having. On the other hand, consider how it sounds if I say &#8220;Based on the information I have seen, I think it is probable that X is true&#8221;, then someone provides an argument against X, and I reply with &#8220;Good point, taking that information into account X doesn&#8217;t seem as likely.&#8221; In this case, I am more likely to end up sounding like a careful thinker who is updating his beliefs based on the new evidence I encounter.</p>
<p>When you say &#8220;I think it is quite likely that less market regulation would be good for the U.S. in terms of GDP growth&#8221;, that statement is more precise, more self-reflective, and more likely to lead to productive discussion than if you just say &#8220;I believe that our government should use less market regulation.&#8221; Converting to probabilities, even if they are only rough ones like &#8220;very likely&#8221; or &#8220;a bit better than a 50% chance&#8221;, can lead to more productive discussions and less bias. So ask yourself:</p>
<ul>
<li>What probability do I really assign to this statement?</li>
<li>How much would I bet on this?</li>
<li>How surprised would I be if this turned out to be false?</li>
<li>How often have I been wrong in the past when I felt this strongly?</li>
</ul>
<hr />
<p>Influences: <a href="http://www.overcomingbias.com/2011/05/dont-believe.html">Robin Hanson</a>, <a href="http://yudkowsky.net/">Eliezer Yudkowsky</a>, <a href="http://meaningandmagic.com/">Divia Melwani</a></p>
]]></content:encoded>
					
					<wfw:commentRss>https://www.spencergreenberg.com/2011/08/keeping-ideas-at-a-distance-using-probability/feed/</wfw:commentRss>
			<slash:comments>5</slash:comments>
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">127</post-id>	</item>
	</channel>
</rss>
