<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[Susan STEM’s Entropy Control Theory: Scientific Foundations]]></title><description><![CDATA[Understand the Science Behind It — where structure becomes intelligence.]]></description><link>https://www.entropycontroltheory.com/s/scientific-foundations</link><image><url>https://substackcdn.com/image/fetch/$s_!7CqO!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb00c4079-e461-4d54-89c9-5fcd0e045695_1280x1280.png</url><title>Susan STEM’s Entropy Control Theory: Scientific Foundations</title><link>https://www.entropycontroltheory.com/s/scientific-foundations</link></image><generator>Substack</generator><lastBuildDate>Sat, 11 Apr 2026 05:54:44 GMT</lastBuildDate><atom:link href="https://www.entropycontroltheory.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Susan STEM]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[sstem@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[sstem@substack.com]]></itunes:email><itunes:name><![CDATA[Susan STEM]]></itunes:name></itunes:owner><itunes:author><![CDATA[Susan STEM]]></itunes:author><googleplay:owner><![CDATA[sstem@substack.com]]></googleplay:owner><googleplay:email><![CDATA[sstem@substack.com]]></googleplay:email><googleplay:author><![CDATA[Susan STEM]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[Seeing the World Through Stephen Wolfram’s Eyes: Twenty Years of “A New Kind of Science”]]></title><description><![CDATA[Stephen 
Wolfram has been saying all along that he is doing a new kind of science; from his perspective, the paradigm shift began twenty years ago.]]></description><link>https://www.entropycontroltheory.com/p/seeing-the-world-through-stephen</link><guid isPermaLink="false">https://www.entropycontroltheory.com/p/seeing-the-world-through-stephen</guid><dc:creator><![CDATA[Susan STEM]]></dc:creator><pubDate>Tue, 20 Jan 2026 23:01:05 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!eGjT!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a7f74b4-47b3-4805-867b-e0ee61943806_584x763.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h1>A Person Who Has Been Talking About &#8220;Paradigm Shifts&#8221; in Science for 20 Years</h1><p>Over the past few weeks, I&#8217;ve deliberately given my brain a bit of a break. I mentioned in earlier pieces that when you interact with AI at very high density&#8212;especially across multiple languages (Chinese and English) and multiple language modes (programming languages, formal languages)&#8212;you start to feel something like information overload and information drift, a kind of cognitive and even physical fatigue.</p><p>I realized I needed to switch gears, to change what I was thinking about and what I was working on. Once you drill yourself too deeply into a single tunnel, it becomes very hard to generate genuinely good ideas.</p><p>So I decided to seriously revisit someone I&#8217;ve been thinking about a lot lately: <strong>Stephen Wolfram</strong>. Why him? Because it suddenly struck me that Wolfram may be the only person over the past twenty-plus years who has consistently and explicitly talked about a <em>paradigm shift</em> in science. 
His book is literally titled <em>A New Kind of Science</em>.</p><p>I recently bought a copy of this book online. Before that, I had already encountered Wolfram in various forms: his work on cellular automata, <em>Mathematica</em>, and the concepts he keeps emphasizing&#8212;computational irreducibility, and <em>ruliology</em> (a term he coined himself). But to be honest, for a long time I didn&#8217;t really &#8220;get&#8221; him. I didn&#8217;t truly understand what he was doing, nor did I dig deeply into it. These ideas didn&#8217;t hit me at a fundamental, logical level; they didn&#8217;t trigger that feeling of &#8220;this is worth investing a serious amount of my limited time and mental energy into.&#8221;</p><p>I&#8217;ve also mentioned Thomas Kuhn&#8217;s <em>The Structure of Scientific Revolutions</em> before. Over the past few years, that book has become increasingly important to me. But what really turned &#8220;paradigm shift&#8221; into a widely discussed topic for my generation was the commercialization of AI in 2022&#8211;2023. And Stephen Wolfram had already recognized this reality very early on&#8212;and repeatedly articulated it in many of his essays, some written as early as 2005 or 2012, more than a decade ago.</p><p>In this piece, I want to share how I&#8217;ve started to re-approach Wolfram&#8217;s work. I want to try to understand <em>computational irreducibility</em> from his perspective, to get a feel for his way of thinking, while also using AI to track down relevant papers. Coincidentally, I also happen to have a children&#8217;s toy on my desk, brought to me by a friend from China&#8212;a code-breaking machine.</p><p>Behind this toy, there is actually an <strong>NP-complete</strong> problem. I want to study and understand this complexity problem using Wolfram&#8217;s language and worldview. 
Especially during these past months of high-intensity writing, I&#8217;ve often been reminded of Wittgenstein&#8217;s remarks on the relationship between language and the world. Only when you fully internalize a language system do you actually gain a new worldview and a new way of seeing problems.</p><div><hr></div><h1>Since the Day <em>A New Kind of Science</em> Was Published, Wolfram Has Been Surrounded by Intense Academic Controversy</h1><p>In a sense, it helped that Wolfram is a commercial / entrepreneurial scientist. He largely funded himself, bypassed the traditional academic and publication systems, and didn&#8217;t need to survive on academic approval. This, in turn, has made me very interested in a certain type of people like him&#8212;<em>corporate scientists</em>. For example, <strong>Elon Musk</strong> is a member of the U.S. National Academy of Engineering, yet he doesn&#8217;t really participate in academia either.</p><p>Precisely because such people don&#8217;t depend on the academic system for survival, and are in fact genuinely capable scientists, their evaluations and criticisms of academia are often worth reading carefully. Especially today: when universities are flooded with low-value papers, when the meaning of academic titles is itself becoming questionable, when AI is steadily eroding universities&#8217; monopoly over knowledge interpretation; and when governments&#8212;especially the U.S. government&#8212;are cutting research funding and openly questioning the real-world value of academic systems. Against this backdrop, perspectives from outside the system form a valuable counterbalance.</p><p>Without further ado, let me show you a few examples.</p><p>Here is an important critique of Wolfram from the mathematical community: a formal book review published in the <strong>Bulletin of the American Mathematical Society</strong>. 
It criticizes aspects of his argumentation, the rigor of his exposition, and the relationship between his work and existing literature. For instance, it argues that some explanations are &#8220;unclear,&#8221; &#8220;unsystematic,&#8221; or &#8220;lacking testable detail.&#8221;</p><p><a href="https://www.ams.org/journals/bull/2003-40-01/S0273-0979-02-00970-9/S0273-0979-02-00970-9.pdf">https://www.ams.org/journals/bull/2003-40-01/S0273-0979-02-00970-9/S0273-0979-02-00970-9.pdf</a></p><p>In the complex systems and statistical physics communities, one of the sharpest criticisms can be summarized in a single phrase: <strong>&#8220;old wine in new bottles + overclaiming.&#8221;</strong> The core argument is that Wolfram repackaged many existing ideas from complexity science and computation theory, then declared a &#8220;paradigm shift&#8221; through a very strong narrative, while failing to adequately acknowledge or credit prior work. A representative voice here is <strong>Cosma Shalizi</strong>.</p><p><a href="http://bactra.org/reviews/wolfram/">http://bactra.org/reviews/wolfram/</a></p><p>I&#8217;m not interested in adjudicating who is right or wrong. Whether it&#8217;s Wolfram or his critics, their level of scholarship is far beyond mine.</p><p>But after reading these critiques&#8212;especially after months of deliberate writing practice in both Chinese and English&#8212;I had two very strong intuitive impressions.</p><p>First, <strong>Wolfram and his critics are not even operating within the same language system</strong>. This isn&#8217;t just a disagreement of viewpoints; it&#8217;s a mismatch of language protocols, almost a case of people talking past each other entirely.</p><p>Second, <strong>these critical texts carry a very strong emotional charge</strong>. You don&#8217;t have to analyze them very deeply to sense a kind of dissatisfaction or resentment. 
That&#8217;s actually quite strange for a community that prides itself on being rational and dispassionate.</p><p>Over time, I started to understand this better. In the AI era, many people are labeled as &#8220;crackpot scientists&#8221; simply because they lack formal academic titles, even though they are seriously engaging with scientific problems. The emotional tone feels remarkably similar.</p><p>&#8220;Alright,&#8221; I can only say this: in the age of AI, I&#8217;d much rather be a <strong>billionaire &#8216;crackpot scientist&#8217;</strong> like Wolfram than a professor or lecturer struggling to secure funding and facing an uncertain career future. &#128514;</p><div><hr></div><h1>How Does Wolfram Respond?</h1><p>Thirteen years ago, Wolfram had already personally experienced the shock of a paradigm shift, and he described it with remarkable clarity. Calling him a &#8220;crackpot&#8221; is frankly absurd.</p><p>Let&#8217;s look at his own words, from a 2012 essay, where he describes this impact. By then, he had already largely exited academia&#8212;arguably as early as 1987, when he began building software. He explains why he stopped publishing papers: academic publishing had become overly formalistic, with form outweighing substance, while the sheer amount of real content in his work made the paper format impractical. He preferred blogging. That&#8217;s why, today, the best way to understand his thinking is simply to read <strong>Stephen Wolfram&#8217;s writings</strong>.</p><p><strong>Why does he insist this is a &#8220;paradigm shift&#8221;?</strong></p><p>When he recounts reactions to his work, many academics respond with unusually intense emotion:</p><blockquote><p>&#8220;You&#8217;re destroying the heritage of mathematics&#8230;&#8221;</p></blockquote><p>Why such anger? Would you get angry at a shaman or a yoga guru making scientific claims? No&#8212;you wouldn&#8217;t even engage. 
Anger only appears when something is taken seriously.</p><p>Wolfram&#8217;s response is blunt:</p><blockquote><p>&#8220;This is what a paradigm shift sounds like&#8212;up close and personal.&#8221;</p></blockquote><p><strong>A paradigm shift is not primarily about new conclusions; it&#8217;s about a new evaluation system.</strong></p><p>The emotion isn&#8217;t triggered because a theorem was overturned, but because:</p><ul><li><p>The old standards of &#8220;what counts as science&#8221; are being threatened (papers, peer review, citations, academic networks).</p></li><li><p>Old &#8220;career investments&#8221; are being threatened (decades of training and accumulated prestige may suddenly depreciate).</p></li></ul><p>He even breaks this down into two &#8220;core threatened groups,&#8221; drawing a sharp distinction between <strong>surface reasons</strong> and <strong>deeper reasons</strong>.</p><blockquote><p>&#8220;There was a surface reason&#8230; and a deeper reason.&#8221;</p></blockquote><p><strong>A. Content-level fears (the career economics of paradigm conflict)</strong></p><p>First group (mostly physicists):</p><blockquote><p>&#8220;We&#8217;ve spent our whole careers barking up the wrong tree.&#8221;</p></blockquote><p>If the computational perspective of <em>NKS</em> is right, then these researchers weren&#8217;t just &#8220;slightly wrong&#8221;&#8212;their entire research trajectory may have been a low-return investment. 
This is classic Kuhnian paradigm shift territory: <strong>what counted as success before may no longer count at all.</strong></p><p>Second group (complexity researchers):</p><blockquote><p>&#8220;It&#8217;ll overshadow everything we&#8217;ve done.&#8221;</p></blockquote><p>This isn&#8217;t about truth; it&#8217;s about <strong>attention and authority structures</strong>&#8212;whoever defines the main narrative controls textbooks, funding, and disciplinary gateways.</p><p><strong>Form-level conflict: doing something &#8220;academic-like&#8221; without following &#8220;academic rules&#8221;</strong></p><blockquote><p>&#8220;Academic-like, but you haven&#8217;t played by academic rules.&#8221;</p></blockquote><p>The implicit message is that academia has a legitimacy stack:</p><ul><li><p>Peer review as gatekeeping</p></li><li><p>References as network visibility</p></li><li><p>Journals and publishers as distribution channels</p></li><li><p>Academic identity as speech authorization</p></li></ul><p>Wolfram&#8217;s move was to <strong>bypass this entire stack</strong>. Hence his insistence:</p><blockquote><p>&#8220;I wasn&#8217;t an academic&#8230;&#8221;</p></blockquote><p>This is the core tension of his essay: <strong>a new paradigm combined with a new distribution and validation mechanism</strong>, posing a dual threat to the old system.</p><p>His full essay is here:</p><p><a href="https://writings.stephenwolfram.com/2012/05/living-a-paradigm-shift-looking-back-on-reactions-to-a-new-kind-of-science/">https://writings.stephenwolfram.com/2012/05/living-a-paradigm-shift-looking-back-on-reactions-to-a-new-kind-of-science/</a></p><div><hr></div><h1>Kuhn Loss</h1><p>Even though Kuhn&#8217;s <em>The Structure of Scientific Revolutions</em> is now more than sixty years old, it uncannily predicts the precise situation we&#8217;re discussing here. I can&#8217;t resist quoting it. 
The full discussion is in the SEP entry linked below.</p><blockquote><p>A paradigm revolution does not merely solve more problems; it also discards&#8212;or even declares illegitimate&#8212;problems and explanations that the old paradigm valued and successfully addressed by its own standards.</p><p>This loss of explanatory power, problem sets, and evaluation criteria is what Kuhn called <em>Kuhn loss</em>.</p></blockquote><p>Kuhn used this concept to undermine the idea that scientific progress is simply a cumulative approximation to truth.</p><h3>1) What Is Actually &#8220;Lost&#8221;?</h3><ol><li><p><strong>Problem sets change</strong>: what counts as an &#8220;important problem&#8221; changes.</p></li><li><p><strong>Standards change</strong>: what counts as a &#8220;good explanation&#8221; or &#8220;good science&#8221; changes.</p></li><li><p><strong>Concepts and worldviews change</strong>: the same words refer to different things, different sentences become expressible.</p></li></ol><p>So Kuhn loss isn&#8217;t just &#8220;one less derivation.&#8221; It&#8217;s that things which <em>had to be explained</em> in the old paradigm become, in the new one:</p><ul><li><p>&#8220;Not necessary to explain&#8221;</p></li><li><p>&#8220;Meaningless&#8221;</p></li><li><p>&#8220;Metaphysical&#8221;</p></li><li><p>Or outright &#8220;pseudo-problems&#8221;</p></li></ul><p>This is why Kuhn says revolutions change the very definition of science.</p><h3>2) A Classic Example: Why Newton &#8220;Lacked Explanatory Power&#8221; Yet Won</h3><p>In Aristotelian and Cartesian mechanics, the question <strong>&#8220;How is attraction possible?&#8221;</strong> was mandatory&#8212;you had to provide a contact mechanism or ontological explanation.</p><p>Newtonian gravity looked like action at a distance and failed this test, so it was initially rejected. But once Newton&#8217;s paradigm won, that question was <strong>kicked out of the scientific agenda</strong> as illegitimate. 
That&#8217;s Kuhn loss.</p><p>Later, general relativity reintroduced the issue in a different form, reinforcing Kuhn&#8217;s point: <strong>progress isn&#8217;t linear accumulation; it&#8217;s repeated agenda rewriting.</strong></p><h3>3) Kuhn Loss and Incommensurability</h3><p>You can think of Kuhn loss as an observable symptom of <strong>incommensurability</strong>.</p><p>Incommensurability doesn&#8217;t mean &#8220;incomparable,&#8221; but rather:</p><ul><li><p><strong>No shared metric</strong>: different conceptual nets, problem lists, and evaluation standards.</p></li><li><p>Hence no single unified measure of &#8220;closer to truth.&#8221;</p></li></ul><p>Kuhn loss tells us that even if a new paradigm is stronger in some respects, it may be weaker in dimensions valued by the old paradigm&#8212;and whether it&#8217;s &#8220;weaker&#8221; at all depends entirely on which metric you use.</p><h3>4) Why Kuhn Loss Shocks Scientific Rationality</h3><p>This is why Kuhn and Feyerabend were once accused of being &#8220;anti-science&#8221;:</p><ul><li><p>If revolutions rewrite problems and standards, does rational comparison collapse into politics?</p></li><li><p>If old successes are declared illegitimate, does science stop &#8220;approaching truth&#8221;?</p></li></ul><p>Kuhn later clarified: <strong>incommensurability &#8800; incomparability; Kuhn loss &#8800; irrationality.</strong></p><p>Paradigm choice lacks a neutral algorithm, but it can still be guided by values (accuracy, scope, simplicity, fruitfulness), allowing for rational disagreement.</p><p>Suddenly, that Cantonese phrase&#8212;<em>talking past each other</em>&#8212;feels very apt.</p><p>SEP link:</p><p><a href="https://plato.stanford.edu/archives/fall2019/entries/incommensurability/#RevParThoKuhInc">https://plato.stanford.edu/archives/fall2019/entries/incommensurability/#RevParThoKuhInc</a></p><div><hr></div><h1>Experiencing Irreducibility with a Decoder / Mastermind Toy</h1><p>I picked up the 
Decoder/Mastermind toy simply because it happened to be on my desk when I was thinking about this.</p><p>A seemingly simple children&#8217;s toy hides an <strong>NP-complete</strong> problem. Originally called <strong>Mastermind</strong>, it&#8217;s a two-player board game: one player sets a secret code, the other tries to guess it.</p><p>Each round, the guesser submits a guess; the code-maker returns feedback:</p><ul><li><p><strong>Black pegs</strong>: correct color and correct position</p></li><li><p><strong>White pegs</strong>: correct color, wrong position</p></li></ul><p>This feedback channel must be perfectly correct and noise-free; otherwise, the entire reasoning chain collapses.</p><p>My electronic <strong>Decoder</strong> replaces the human code-maker with a hidden algorithm and adds variants: green lights (correct color and position), white lights (correct color, wrong position), and no light (color absent). In fully <em>indirect hint</em> mode, positional information is hidden, dramatically expanding the state space beyond brute-force feasibility.</p><p>Theoretical work confirmed this complexity: in <strong>2005</strong>, Jeff Stuckman and Guo-Qiang Zhang proved that <strong>Mastermind is NP-complete</strong>.</p><p><a href="https://arxiv.org/abs/cs/0512049">https://arxiv.org/abs/cs/0512049</a></p><p>I spent a full day solving all <strong>800 levels</strong> of this toy using a <strong>minimax</strong> strategy&#8212;choosing guesses that maximally reduce the candidate space in the worst case. The product itself is impressively reliable: a consumer-grade children&#8217;s toy with a completely noise-free feedback channel.</p><pre><code><code>&#10140;  giiker_super_decoder python3 minimax_generic_gwn.py --Y 7 --P 4 --R 7 --interactive-hints

=== Generic Minimax Bucket Solver (unique colors, tri-feedback) ===
Parameters: Y=7 colors, P=4 slots, R=7 total rows
Feedback: (G,W,N) with G+W+N=P
Initial candidates: |C0| = P(Y,P) = 840

--- Preset hints input (interactive) ---
Enter one hint per line as:
  guess:  a b c d    (P ints, unique, in [0..Y-1])
  fb:     G W N      (3 ints, G+W+N=P)
Type empty guess to stop.

hint guess (or empty to stop): 0 1 2 3
hint feedback (G W N): 2 1 1 
hint guess (or empty to stop): 0 6 2 1
hint feedback (G W N): 3 0 1
hint guess (or empty to stop): 

Applying preset hints: n=2  =&gt; remaining query budget T=R-n = 5
  hint#1: guess=0 1 2 3 fb=(G,W,N)=(2,1,1)  |C| 840 -&gt; 36
  hint#2: guess=0 6 2 1 fb=(G,W,N)=(3,0,1)  |C| 36 -&gt; 2

Step 1 (remaining queries: 5/5)
Suggested guess: 0 1 2 4
Current |C| = 2   worst_bucket=1   expected_remaining&#8776;1.00
Enter feedback as 'G W N' (sum=4), or 'q' to quit: 2 2 0
Filtered candidates: 2 -&gt; 1

&#9989; Unique solution determined without further queries: 0 4 2 1
&#10140;  giiker_super_decoder 

</code></code></pre><div class="captioned-image-container"><figure><img src="https://substackcdn.com/image/fetch/$s_!eGjT!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1a7f74b4-47b3-4805-867b-e0ee61943806_584x763.heic" width="584" height="763" alt="" loading="lazy"></figure></div><p>All solution code is available here, including extensible solvers that go beyond the device&#8217;s original parameters:</p><p><a href="https://github.com/STEMMOM/giiker_super_decoder">https://github.com/STEMMOM/giiker_super_decoder</a></p><p>By this point, there&#8217;s little &#8220;puzzle-solving&#8221; left to say. My goal was never to solve a toy, but to revisit complexity through <strong>Wolfram&#8217;s lens</strong>.</p><p>The complexity isn&#8217;t in the rules&#8212;they&#8217;re trivial&#8212;but in the structure: a hidden truth (the secret code <em>s</em>), accessible only through queries that yield limited feedback. You can&#8217;t compute the answer directly; you must <em>extract</em> it through interaction.</p><p>From an information-theoretic view, this is a pure black-box query model. 
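</p><p>That query structure is small enough to state directly in code. Below is a minimal Python sketch, assuming the toy&#8217;s unique-color, tri-feedback variant (Y=7 colors, P=4 slots, as in the session above); it illustrates the structure and is not the repository&#8217;s implementation:</p><pre><code>from itertools import permutations

def feedback(secret, guess):
    """Tri-feedback (G, W, N): exact matches, right color in the
    wrong slot, and colors absent from the secret."""
    g = sum(s == q for s, q in zip(secret, guess))
    w = sum(q in secret for q in guess) - g
    return (g, w, len(guess) - g - w)

def filter_candidates(candidates, guess, fb):
    """Keep only the secrets consistent with one observed hint."""
    return [c for c in candidates if feedback(c, guess) == fb]

# Y=7 colors, P=4 slots, unique colors: P(7,4) = 840 possible secrets
candidates = list(permutations(range(7), 4))
# hint #1 from the session above: guess 0 1 2 3, feedback (2, 1, 1)
candidates = filter_candidates(candidates, (0, 1, 2, 3), (2, 1, 1))
print(len(candidates))  # narrows the 840 secrets to 36, as in the log
</code></pre><p>Each hint is just a constraint; solving is repeated constraint propagation over a shrinking candidate set.</p><p>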
Information leaks from the hidden truth through a constrained channel. This structure pervades reality:</p><ul><li><p>Cryptography and security</p></li><li><p>Medical diagnosis</p></li><li><p>Scientific experimentation</p></li><li><p>Engineering parameter tuning</p></li></ul><p>The toy is kind because it promises a noise-free oracle. Reality makes no such guarantee.</p><p>Of course, I&#8217;m fully aware that I&#8217;m still at a very early stage, and that this class of problems is by no means unexplored. In fact, it has long been studied in a systematic way within academia. In theoretical computer science and in research on &#8220;puzzle complexity,&#8221; <strong>Mastermind</strong> is typically formulated as a <strong>constraint satisfaction / consistency decision problem</strong>, in its standard form known as the <strong>Mastermind Satisfiability Problem (MSP)</strong>, and it has been rigorously proven to be <strong>NP-complete</strong>&#8212;a point already mentioned earlier. For the classic board-game parameters (4 positions, 6 colors, with repetition allowed), the entire state space contains only (6^4 = 1296) possibilities; as early as <strong>1977</strong>, Knuth provided a strategy that guarantees a win in <strong>at most five moves</strong> in the worst case. My own implementation essentially operates at this same level: no more than five steps in the worst case, with reported results in the literature showing an average number of moves around <strong>four-point-something</strong>.</p><p>So within this particular scale and framework, the aspects that can be formalized, proven, and optimized have already been studied quite thoroughly. If I continue to push further, I&#8217;m more likely to approach the problem from an <strong>information-theoretic</strong> perspective&#8212;recasting it in terms of how much effective information each query can extract. 
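</p><p>That information-theoretic recasting is easy to sketch: score each legal guess by the Shannon entropy of the feedback distribution it would induce over the remaining candidates, then query with a highest-scoring guess. The sketch below is the plain entropy heuristic, in the same spirit as (but simpler than) weighted refinements; the parameters again match the toy&#8217;s unique-color variant:</p><pre><code>import math
from collections import Counter
from itertools import permutations

def feedback(secret, guess):
    g = sum(s == q for s, q in zip(secret, guess))
    w = sum(q in secret for q in guess) - g
    return (g, w, len(guess) - g - w)

def guess_entropy(candidates, guess):
    """Shannon entropy (bits) of the feedback this guess would induce;
    it is the average information one such query extracts."""
    buckets = Counter(feedback(c, guess) for c in candidates)
    total = len(candidates)
    return -sum(b / total * math.log2(b / total) for b in buckets.values())

# state after hint #1 of the session above: 36 candidates remain
candidates = [c for c in permutations(range(7), 4)
              if feedback(c, (0, 1, 2, 3)) == (2, 1, 1)]
# score every legal guess; re-asking hint #1 scores exactly 0 bits
scores = {g: guess_entropy(candidates, g) for g in permutations(range(7), 4)}
best = max(scores, key=scores.get)
</code></pre><p>No query on 36 candidates can extract more than log&#8322;(36) &#8776; 5.17 bits, which is consistent with the session above closing out after two hints and one more well-chosen guess.</p><p>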
Coincidentally, I also came across a very interesting recent paper today: <strong>G&#252;r (2025), &#8220;Weighted Entropy Approach,&#8221;</strong> which uses weighted entropy as a heuristic to approach the theoretically optimal average number of steps. At its core, it treats the <strong>strategy itself as a kind of measurement instrument</strong>, an idea I find highly valuable as a reference. I won&#8217;t go into details here; interested readers can consult the original papers directly.</p><p><a href="https://arxiv.org/abs/cs/0512049">https://arxiv.org/abs/cs/0512049</a></p><p><a href="https://www.cs.bu.edu/fac/best/res/papers/alybull86.pdf">https://www.cs.bu.edu/fac/best/res/papers/alybull86.pdf</a></p><p><a href="https://arxiv.org/abs/2511.19446">https://arxiv.org/abs/2511.19446</a></p><div><hr></div><h1>What I&#8217;m Really Interested In: How Wolfram Would See This Problem</h1><p>A more realistic plan for me this year is to devote time to <strong>cellular automata</strong>&#8212;the true starting point of Wolfram&#8217;s worldview, beginning with <em>A New Kind of Science</em>. He treats them as a <strong>minimal, clean computational universe</strong> for studying complexity, irreducibility, and the fact that simple rules can generate extreme behavior.</p><p>Wolfram&#8217;s systematic, large-scale, decades-long exploration of cellular automata is essentially unmatched globally. From exhaustive rule-space scans to behavioral classification and the development of ruliology, it&#8217;s hard to find a true second example.</p><p>What continues to impress me is that <strong>one person</strong> proposed such a sweeping new scientific narrative. The key isn&#8217;t his conclusions, but his questions. When faced with Mastermind, my instinct is to <em>solve it</em>. 
Wolfram&#8217;s instinct is to ask: <em>what kind of system is this?</em></p><p>Does it exhibit universal computation? Which behavior class does it belong to? Is it computationally irreducible? How common is this behavior in rule space?</p><p>Here&#8217;s the chilling thought: <strong>some cellular automata, such as Rule 110, are Turing complete</strong>. You&#8217;re not dealing with a solver, but a computational universe. From this comes <strong>PCE&#8212;the Principle of Computational Equivalence</strong>. I&#8217;m still digesting it, but roughly: once a system passes a very low complexity threshold, its computational power is equivalent to that of most others. The real differences lie in predictability, compressibility, and whether simulation is unavoidable.</p><p>So Wolfram isn&#8217;t anti&#8211;problem-solving. He&#8217;s <strong>reshaping the problem itself</strong>. He&#8217;s less interested in &#8220;what is the answer to this instance?&#8221; and more in &#8220;what are the behavioral laws of this system?&#8221;</p><p>In the PCE view, the &#8220;unique solution&#8221; isn&#8217;t central. What matters is whether different systems share the same class of computational capability and behavior. This worldview&#8212;almost orthogonal to the solver&#8217;s instinct&#8212;is what I&#8217;m only beginning to grasp.</p><p>This series will continue. 
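</p><p>The claim that simple rules can generate extreme behavior is easy to make concrete. Below is a minimal Python sketch of an elementary cellular automaton, run with Rule 110 (the rule Matthew Cook proved computationally universal); the ring size and starting state are arbitrary choices for illustration:</p><pre><code>def rule_table(rule):
    """Map each 3-cell neighborhood, read as a number 0..7, to a next
    state. The rule number's binary digits are the entire update rule."""
    return {n: (rule // 2**n) % 2 for n in range(8)}

def step(cells, table):
    """One synchronous update on a ring of cells."""
    n = len(cells)
    return [table[4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n]]
            for i in range(n)]

# Rule 110 fits in a single byte, yet supports universal computation
table = rule_table(110)
cells = [0] * 40
cells[20] = 1                       # a single live cell
for _ in range(10):
    print("".join(".#"[c] for c in cells))
    cells = step(cells, table)
</code></pre><p>Changing the one byte 110 to, say, 30 or 90 swaps in a completely different universe of behavior&#8212;exactly the rule-space scanning mindset described above.</p><p>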
There&#8217;s a lot here worth learning, and I&#8217;m still very early in the process.</p><h1>A Person Who Has Spent 20 Years Talking About the &#8220;Paradigm Shift&#8221; in Science</h1><p>These past few weeks, I&#8217;ve actually given my brain something of a vacation. As I mentioned in earlier pieces: when you interact with AI at very high density, across multiple languages (Chinese, English) and multiple language modes (programming, formal languages), you clearly feel a state of information overload and information drift, a kind of physical and mental shock and fatigue.</p><p>I realized I had to switch gears, give myself a different idea to think about and a different thing to do. Once a person drills into a dead end, it becomes very hard to produce genuinely good ideas.</p><p>So I decided to seriously study someone I&#8217;ve cared about a great deal during this period: Stephen Wolfram. Why him? Because it suddenly struck me that Stephen Wolfram may be the only person over the past twenty-plus years who has consistently talked about a &#8220;paradigm shift&#8221; in science. His book&#8217;s title is literally <em>A New Kind of Science</em>.</p><p>I just bought the book online recently. I had actually encountered Stephen Wolfram before: his cellular automata research, Mathematica, and the concepts he keeps returning to, such as computational irreducibility and Ruliology (a word he coined himself). But honestly, I had never truly read him, never truly understood him, let alone gone deep. At the time, those concepts didn&#8217;t strike me at the level of underlying logic; they never produced that impulse of &#8220;I am willing, in all seriousness, to invest my precious time and brainpower in figuring him out.&#8221;</p><p>I&#8217;ve also mentioned Kuhn&#8217;s <em>The Structure of Scientific Revolutions</em> before; its importance in my mind has kept rising over the past few years. But what really turned the &#8220;paradigm shift&#8221; into a topic my generation discusses widely was the commercialization of AI in 2022&#8211;2023. Stephen Wolfram recognized this reality very early, and argued it repeatedly across his large body of essays, many of them written in 2005 and 2012, more than a decade ago.</p><p>With this article, I want to share how I&#8217;m re-attempting to understand him. I want to stand in his shoes, understand &#8220;computational irreducibility,&#8221; and get a feel for his way of thinking, while using AI to search the related literature. As it happens, I have at hand a children&#8217;s toy a friend brought from China: a code-breaking machine.</p><p>Behind this little code machine hides an NP-complete problem. I want to start from Wolfram&#8217;s fresh perspective and his view of language, and use them to study and understand this complexity problem. Especially during these months of high-intensity writing, I keep recalling Wittgenstein&#8217;s remarks on the relation between language and the world. Only once you have fully accepted a system of language do you actually acquire a brand-new worldview and a new angle on problems.</p><div><hr></div><h1>From the Day <em>A New Kind of Science</em> Was Published, Wolfram Has Faced Enormous Controversy from the Academic Community</h1><p>Fortunately, he is a commercial/corporate scientist: almost entirely self-funded, he bypassed the traditional academic and publishing systems and doesn&#8217;t need the academic community to survive. This is actually why I pay close attention to a class of people like him: corporate scientists. For example <strong>Elon Musk</strong>, a member of the U.S. National Academy of Engineering, who likewise stays out of academia.</p><p>Precisely because they don&#8217;t depend on the academic system for survival, yet are genuinely capable scientists, their assessments and criticisms of academia are often well worth reading. Especially in today&#8217;s era: universities churn out heavily padded papers; the meaning of the title-and-rank system itself is becoming questionable; universities are gradually losing their authority to interpret knowledge to AI; and on top of that, governments, led by the United States, have begun cutting academic funding and openly questioning the practical value of the academic system. Against this backdrop, a perspective from outside the system is itself a healthy counterweight.</p><p>Without further ado, let me show you a few articles.</p><p>This is an important criticism of Wolfram from the mathematical community: a formal book review in the <strong>Bulletin of the AMS</strong>, published by the <strong>American Mathematical Society</strong>. It raises a number of criticisms of his specific modes of argument, the rigor of his presentation, and his relationship to the existing literature, for example that some explanations are &#8220;unclear / unsystematic / lacking testable detail.&#8221;</p><p><a href="https://www.ams.org/journals/bull/2003-40-01/S0273-0979-02-00970-9/S0273-0979-02-00970-9.pdf?utm_source=chatgpt.com">https://www.ams.org/journals/bull/2003-40-01/S0273-0979-02-00970-9/S0273-0979-02-00970-9.pdf?utm_source=chatgpt.com</a></p><p>In circles connected to complex systems and statistical physics, the sharpest class of criticism concentrates in one phrase: <strong>&#8220;old wine in new bottles + overclaiming.&#8221;</strong> The core view is that Wolfram repackaged many ideas that already existed in complex systems and the theory of computation, then used an extremely strong narrative to proclaim a &#8220;paradigm leap,&#8221; while falling short in how he acknowledged and credited existing research. Among this class of critics, the most representative voice is <strong>Cosma Shalizi</strong>.</p><p><a href="http://bactra.org/reviews/wolfram/">http://bactra.org/reviews/wolfram/</a></p><p>I don&#8217;t want to argue here about who is right and who is wrong. Whether it&#8217;s Wolfram or the scholars who wrote these criticisms, their learning far exceeds mine.</p><p>But after reading these criticisms (especially now that months of deliberate writing practice have given me a decent feel for both Chinese and English), I am left with two very strong intuitive impressions:</p><p>First, <strong>Wolfram and these critics, viewed at the level of language systems, are simply not in the same system</strong>. It isn&#8217;t a disagreement over views but a mismatch of language protocols, almost a state of &#8220;a chicken talking to a duck&#8221; (as we Cantonese often say: both sides talking to themselves, communication failing).</p><p>Second, <strong>these critical texts carry very obvious emotion</strong>. You hardly need careful analysis to read a tone of &#8220;resentment&#8221; out of them. For a community of scientists that brands itself as rational and calm, that is actually rather strange.</p><p>Later I slowly came to appreciate this myself. After the AI era arrived, the feeling of being accused of being a &#8220;crank&#8221; became very familiar to many people: broadly, those without professorial titles who are nonetheless seriously working on scientific problems.</p><p>&#8220;Fine, then.&#8221;</p><p>All I can say is that in the AI era, I would rather end up as a Wolfram-style <strong>super-rich crank</strong> than as one of today&#8217;s professors and lecturers who can&#8217;t get funding from above and whose career prospects are precarious &#128514;.</p><h1>How Does Wolfram Answer? Thirteen Years Ago He Had Already Lived Through the Personal Shock of a Paradigm Shift, and He Described It to You Very Methodically</h1><p>To call him a &#8220;crank&#8221;&#8230; now that would truly be laughable.</p><p>Let&#8217;s quote his own 2012 article and see how he describes that shock. He had in fact left academia long before; I think the tendency was already there in 1987, when he started building software. Later he stopped publishing papers, and he describes this concretely in the same article. In short: papers force you into a template, form over substance, and his science had so much &#8220;real material,&#8221; so much substance, that the paper pipeline couldn&#8217;t keep up. He preferred to publish blog posts. So my way of understanding his thinking now is to read his own blog, Stephen Wolfram writings.</p><p><strong>Why he insists: this is a &#8220;paradigm shift&#8221;</strong></p><p>When he discusses the commentary on him, most scholars present an academic scene of <strong>abnormal emotional intensity</strong>:</p><blockquote><p>&#8220;You&#8217;re destroying the heritage of mathematics&#8230;&#8221;</p></blockquote><p>As I said, these critics are highly emotional. For a scholar, why lose your temper? Would you get angry at non-serious science, say, science proposed by a shaman or a yogi? No, because you wouldn&#8217;t bother engaging with him at all.</p><p>Wolfram&#8217;s answer is:</p><blockquote><p>&#8220;this is what a paradigm shift sounds like&#8212;up close and personal.&#8221;</p></blockquote><p><strong>A paradigm shift is, first of all, not a new conclusion but a new system of evaluation</strong>.</p><p>So the other side&#8217;s emotion isn&#8217;t because some theorem was overturned, but because:</p><ul><li><p>The old standard of &#8220;what counts as science&#8221; is threatened (papers, peer review, citations, academic networks)</p></li><li><p>The old &#8220;career-long life investment&#8221; is threatened (decades of training paths and reputational capital may be devalued)</p></li></ul><p>Later on he lays this deeper cause completely bare, even dividing it into two classes of &#8220;core threatened groups.&#8221;</p><p><strong>He explicitly distinguishes a &#8220;surface reason / deeper reason.&#8221;</strong></p><blockquote><p>&#8220;there was a surface reason&#8230; and a deeper reason.&#8221;</p></blockquote><p>A. Two classes of fear at the level of content (this is the &#8220;career-science&#8221; core of the paradigm conflict)</p><p>The first class (mostly physicists):</p><blockquote><p>&#8220;we&#8217;ve spent our whole careers barking up the wrong tree&#8221;.</p></blockquote><p>The implication of this sentence: if the computational perspective of NKS holds, those people aren&#8217;t &#8220;slightly wrong&#8221;; rather, <strong>their entire direction of research investment may be judged a low-yield path</strong>.</p><p>This is the most common feature of a Kuhn-style paradigm shift: <strong>the old paradigm&#8217;s success metrics are no longer worth anything in the new paradigm</strong>.</p><p>The second class (people connected to complexity research):</p><blockquote><p>&#8220;it&#8217;ll overshadow everything we&#8217;ve done&#8221;.</p></blockquote><p>This is not &#8220;truth-versus-falsity anxiety&#8221;; it is <strong>anxiety about attention and authority structures</strong>: whoever defines the master narrative owns the discipline&#8217;s entry points, its textbooks, and the discourse power over grant review.</p><p><strong>The conflict at the level of form: you did something &#8220;academic-like,&#8221; but not by &#8220;academic rules&#8221;</strong></p><blockquote><p>&#8220;academic-like, but you haven&#8217;t played by academic rules.&#8221;</p></blockquote><p>The subtext here is that academia has a &#8220;legitimacy protocol stack,&#8221; including:</p><ul><li><p>peer review as admission control</p></li><li><p>references as visibility in the relationship network</p></li><li><p>the big-publisher/journal system as the distribution channel</p></li><li><p>academic identity as the credential to speak</p></li></ul><p>Wolfram&#8217;s route was: <strong>I bypassed that protocol stack entirely</strong>. Which is why you see him stress:</p><ul><li><p>I am not an academic, and I am not bound by its constraints:</p></li></ul><blockquote><p>&#8220;I wasn&#8217;t an academic&#8230;&#8221;</p></blockquote><p>This is really the core conflict of his article: <strong>a new paradigm + new distribution/validation mechanisms</strong> amounts to a double threat to the old system.</p><p>He writes books and blog posts&#8230;.</p><p>For the specifics, go read his original piece.</p><p><a href="https://writings.stephenwolfram.com/2012/05/living-a-paradigm-shift-looking-back-on-reactions-to-a-new-kind-of-science/?utm_source=chatgpt.com">https://writings.stephenwolfram.com/2012/05/living-a-paradigm-shift-looking-back-on-reactions-to-a-new-kind-of-science/?utm_source=chatgpt.com</a></p><p>We won&#8217;t expand on it further here.</p><h1>Kuhn Loss</h1><p>Let me say a word about Kuhn. Even though <em>The Structure of Scientific Revolutions</em> was published more than sixty years ago, it genuinely and precisely anticipated the transitional view at exactly the point this article has now reached. I really can&#8217;t resist quoting it. For the specifics, read the SEP (Stanford Encyclopedia of Philosophy) entry itself; the link is pasted below as well.</p><blockquote><p>A paradigm revolution doesn&#8217;t just &#8220;solve some additional problems&#8221;; it also &#8220;throws away (or even declares illegitimate) some problems/explanations that the old paradigm valued highly and that counted as successes under the old standards.&#8221;</p><p>This discarded &#8220;explanatory power / problem set / evaluation standard&#8221; is the Kuhn loss.</p></blockquote><p>It is one of the key wedges Kuhn uses to knock down &#8220;scientific progress = cumulative approach toward truth.&#8221;</p><h3>1) What exactly gets &#8220;lost&#8221;</h3><ol><li><p><strong>The problem set changes</strong>: what counts as an &#8220;important problem&#8221; changes</p></li><li><p><strong>The standards change</strong>: what counts as a &#8220;qualified explanation / good science&#8221; changes</p></li><li><p><strong>Concepts and the world-picture change</strong>: what the same word points to in the two paradigms, what one is allowed to say, and the set of expressible sentences all change</p></li></ol><p>So Kuhn loss is not merely &#8220;one derivation fewer&#8221;; it is more like:</p><ul><li><p>Something the old paradigm &#8220;had to explain&#8221; becomes, in the new paradigm,</p><ul><li><p>&#8220;not in need of explanation,&#8221;</p></li><li><p>&#8220;meaningless,&#8221;</p></li><li><p>&#8220;metaphysics,&#8221;</p></li><li><p>or even a &#8220;pseudo-problem.&#8221;</p></li></ul></li></ul><p>This is why he says revolutions change &#8220;the definition of science.&#8221;</p><h3>2) The classic example from the entry: why Newton &#8220;lacked explanatory power,&#8221; yet won</h3><p>The example the SEP uses is very Kuhn:</p><ul><li><p>In the <strong>Aristotelian/Cartesian</strong> mechanical tradition:</p><p><strong>&#8220;How is attraction possible?&#8221;</strong> was a hard requirement (you had to supply a contact mechanism / an ontological explanation).</p></li><li><p>Newton&#8217;s universal gravitation looked, at the time, like &#8220;action at a distance&#8221;; by the old standards it was unqualified, and so it would be rejected.</p></li></ul><p>But once the Newtonian paradigm won, the new community kicked this class of questions <strong>off the scientific agenda</strong> (declaring them &#8220;illegitimate / unscientific&#8221;).</p><p>That is Kuhn loss: judged by the old paradigm&#8217;s scorecard, the new paradigm&#8217;s &#8220;explanatory power got worse&#8221;; judged by the new paradigm&#8217;s scorecard, the question &#8220;shouldn&#8217;t be asked&#8221; at all.</p><p>Then the SEP adds one more twist: this question later &#8220;reappeared and was handled&#8221; in a different way within the framework of general relativity. Which captures Kuhn&#8217;s meaning even better: <strong>not linear accumulation, but an agenda rewritten again and again.</strong></p><h3>3) How Kuhn loss relates to &#8220;incommensurability&#8221;</h3><p>You can view Kuhn loss as <strong>an observable symptom of incommensurability</strong>.</p><p>Incommensurability, for Kuhn, is not &#8220;completely incomparable&#8221; but rather:</p><ul><li><p><strong>No common measure</strong>: because the two sides use different conceptual nets, different problem lists, and different evaluation standards.</p></li><li><p>Hence you cannot use a single &#8220;unified metric&#8221; to say that A is closer to the truth than B, or &#8220;better overall.&#8221;</p></li></ul><p>Kuhn loss is what tells you:</p><p>Even if the new paradigm is stronger in some respects, it may be weaker along dimensions where the old paradigm once excelled; but whether &#8220;weaker&#8221; even holds depends on which paradigm&#8217;s system of measurement you stand in.</p><h3>4) Why Kuhn loss is an irritant to &#8220;scientific rationality&#8221;</h3><p>This is one reason Kuhn/Feyerabend were once denounced as &#8220;anti-science&#8221;:</p><ul><li><p>If revolutions rewrite the problems and the standards, does &#8220;rational comparison&#8221; turn into political struggle?</p></li><li><p>If old successes can be declared illegitimate by the new paradigm, does science stop &#8220;approaching the truth&#8221;?</p></li></ul><p>The SEP also stresses Kuhn&#8217;s later clarification:</p><p><strong>Incommensurability &#8800; incomparability; Kuhn loss &#8800; irrationality.</strong></p><p>Kuhn&#8217;s position is more like:</p><ul><li><p>There is no &#8220;neutral algorithm&#8221; for paradigm choice, but one can still conduct a &#8220;reasoned dispute&#8221; using a set of <strong>values</strong> (accuracy/scope/simplicity/fruitfulness&#8230;);</p></li><li><p>Different people weight these values differently &#8594; &#8220;rational disagreement&#8221; is allowed.</p></li></ul><p>Which brings us back to that Cantonese saying about the chicken talking to the duck: doesn&#8217;t it fit even better now?</p><p><a href="https://plato.stanford.edu/archives/fall2019/entries/incommensurability/#RevParThoKuhInc">https://plato.stanford.edu/archives/fall2019/entries/incommensurability/#RevParThoKuhInc</a></p><div><hr></div><h1>I Casually Grabbed a Decoder/Mastermind Toy to Experience Irreducibility</h1><p>Truly only because, at the moment the thought occurred to me, this toy happened to be sitting on my desk.</p><p>A seemingly simple children&#8217;s toy in fact hides a problem of <strong>NP-complete</strong> grade. Its earliest name was <strong>Mastermind</strong>: a tabletop game for two players, where one player &#8220;sets the code&#8221; and the other &#8220;guesses the code.&#8221;</p><p>The rules are direct: the codemaker first chooses a secret code made of colors/positions and hides it; the codebreaker must guess it within a limited number of rounds (that is, within the slots/rounds the board allows). If the codebreaker succeeds within the limit, the codebreaker wins; otherwise the codemaker wins.</p><p>Each round, the codebreaker submits a guess and the codemaker must give feedback. The classic feedback takes the form of <strong>black pegs / white pegs</strong>:</p><ul><li><p><strong>Black peg</strong>: correct color in the correct position;</p></li><li><p><strong>White peg</strong>: correct color in the wrong position.</p></li></ul><p>In essence, this is a &#8220;feedback channel,&#8221; and it must strictly satisfy the rules: <strong>completely accurate, noise-free</strong>; otherwise
21017;&#25972;&#20010;&#25512;&#29702;&#38142;&#23601;&#20250;&#23849;&#25481;&#12290;</p><p>&#25105;&#25163;&#19978;&#30340;&#36825;&#20010;&#30005;&#23376;&#29256; <strong>Decoder</strong> &#21487;&#20197;&#30475;&#20316;&#26159; Mastermind &#30340;&#21319;&#32423;&#29256;&#65306;&#23427;&#25226;&#8220;&#35774;&#23494;&#30721;&#30340;&#20154;&#8221;&#26367;&#25442;&#25104;&#20102;&#35774;&#22791;&#20869;&#37096;&#30340;&#38544;&#34255;&#31639;&#27861;&#65292;&#24182;&#19988;&#21152;&#20837;&#20102;&#26356;&#22810;&#21464;&#20307;&#21453;&#39304;&#26426;&#21046;&#65292;&#27604;&#22914;&#65306;</p><ul><li><p><strong>&#32511;&#28783;</strong>&#34920;&#31034;&#39068;&#33394;&#21644;&#20301;&#32622;&#37117;&#27491;&#30830;&#65292;</p></li><li><p><strong>&#30333;&#28783;</strong>&#34920;&#31034;&#39068;&#33394;&#27491;&#30830;&#20294;&#20301;&#32622;&#19981;&#23545;&#65292;</p></li><li><p><strong>&#19981;&#20142;</strong>&#34920;&#31034;&#36825;&#31181;&#39068;&#33394;&#26681;&#26412;&#19981;&#22312;&#23494;&#30721;&#37324;&#65288;&#25110;&#32773;&#31561;&#20215;&#22320;&#34920;&#31034;&#8220;&#39068;&#33394;&#20063;&#19981;&#27491;&#30830;&#8221;&#30340;&#25968;&#37327;&#65289;&#12290;</p></li></ul><p>&#22312;&#23436;&#20840; <strong>indirect hint</strong> &#30340;&#21333;&#26426;&#27169;&#24335;&#19979;&#65292;&#29609;&#23478;&#29978;&#33267;&#30475;&#19981;&#21040;&#20301;&#32622;&#20449;&#24687;&#8212;&#8212;&#36825;&#20250;&#35753;&#29366;&#24577;&#31354;&#38388;&#24613;&#21095;&#33192;&#32960;&#65292;&#36828;&#36828;&#36229;&#20986; brute force &#30452;&#25509;&#31351;&#20030;&#30340;&#21487;&#34892;&#33539;&#22260;&#12290;&#32780;&#20174;&#29702;&#35770;&#19978;&#35828;&#65292;&#36825;&#32972;&#21518;&#30340;&#22797;&#26434;&#24615;&#24182;&#19981;&#26159;&#8220;&#24863;&#35273;&#19978;&#24456;&#38590;&#8221;&#36825;&#20040;&#31616;&#21333;&#65306;&#22312; <strong>2005 &#24180;</strong>&#65292;Jeff Stuckman &#21644; Guo-Qiang Zhang 
&#30340;&#35770;&#25991;&#35777;&#26126;&#20102; <strong>Mastermind &#26159; NP-complete</strong>&#12290;</p><p><a href="https://arxiv.org/abs/cs/0512049?utm_source=chatgpt.com">https://arxiv.org/abs/cs/0512049?utm_source=chatgpt.com</a></p><p>&#22909;&#65292;&#29992;&#20102;&#19968;&#25972;&#22825;&#30340;&#26102;&#38388;&#65292;&#25105;&#25226;&#36825;&#20010;&#29609;&#20855;&#33539;&#22260;&#20869;&#30340; <strong>800 &#20010;&#20851;&#21345;&#20840;&#37096;&#35299;&#23436;&#20102;</strong>&#12290;&#25105;&#37319;&#29992;&#30340;&#26159; <strong>minimax</strong> &#31574;&#30053;&#65306;&#27599;&#19968;&#27493;&#36873;&#25321;&#22312;&#26368;&#22351;&#24773;&#20917;&#19979;&#33021;&#26368;&#22823;&#24133;&#24230;&#21387;&#32553;&#20505;&#36873;&#31354;&#38388;&#30340;&#29468;&#27979;&#65292;&#32780;&#19981;&#26159;&#36861;&#27714;&#8220;&#30475;&#36215;&#26469;&#32874;&#26126;&#8221;&#30340;&#23616;&#37096;&#26368;&#20248;&#12290;</p><p>&#36825;&#37324;&#24517;&#39035;&#35748;&#30495;&#22840;&#19968;&#21477;&#8212;&#8212;&#36825;&#27454;<strong>&#26469;&#33258;&#20013;&#22269;&#35745;&#31185;&#20844;&#21496;&#30340;&#29609;&#20855;&#20135;&#21697;</strong>&#20570;&#24471;&#38750;&#24120;&#25166;&#23454;&#12290;&#23427;&#30340;<strong>&#21453;&#39304;&#20449;&#36947;&#39640;&#24230;&#21487;&#38752;&#12289;&#23436;&#20840;&#26080;&#22122;&#22768;</strong>&#65292;&#20005;&#26684;&#36981;&#23432;&#35268;&#21017;&#32422;&#23450;&#12290;&#33021;&#22312;&#19968;&#20010;&#38754;&#21521;&#20799;&#31461;&#30340;&#28040;&#36153;&#32423;&#29609;&#20855;&#37324;&#65292;&#25226;&#21453;&#39304;&#19968;&#33268;&#24615;&#21644;&#35268;&#21017;&#25191;&#34892;&#20570;&#21040;&#36825;&#31181;&#31243;&#24230;&#65292;&#20854;&#23454;&#24456;&#21385;&#23475;&#12290;</p><p>&#25152;&#26377;&#35299;&#27861;&#30340; <strong>repo</strong> 
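</p><p>To make the strategy concrete, here is a minimal, self-contained sketch of this kind of candidate-elimination minimax. It is an illustrative toy of mine for the classic 4-slot, 6-color game, not the code from the repo; for simplicity it only scores guesses that are still consistent with past feedback, whereas Knuth&#8217;s full method also scores already-eliminated codes, which is how he obtains his worst-case guarantee:</p>

```python
from itertools import product

# Classic Mastermind parameters: 4 slots, 6 colors, repeats allowed,
# giving 6**4 = 1296 possible secret codes.
COLORS = "ABCDEF"
SLOTS = 4

def feedback(secret: str, guess: str):
    """(black, white): black = right color in the right slot,
    white = right color in the wrong slot."""
    black = sum(s == g for s, g in zip(secret, guess))
    overlap = sum(min(secret.count(c), guess.count(c)) for c in COLORS)
    return black, overlap - black

def next_guess(candidates):
    """Minimax over the remaining candidates: pick the guess whose
    worst-case feedback bucket leaves the fewest candidates."""
    def worst(g):
        buckets = {}
        for s in candidates:
            fb = feedback(s, g)
            buckets[fb] = buckets.get(fb, 0) + 1
        return max(buckets.values())
    return min(candidates, key=worst)

def solve(secret: str):
    """Play against a known secret; returns (final_guess, steps_used)."""
    candidates = ["".join(p) for p in product(COLORS, repeat=SLOTS)]
    guess, steps = "AABB", 0          # Knuth's classic opening guess
    while True:
        steps += 1
        fb = feedback(secret, guess)
        if fb == (SLOTS, 0):
            return guess, steps
        # keep only codes that would have produced the same feedback
        candidates = [s for s in candidates if feedback(s, guess) == fb]
        guess = next_guess(candidates)
```

<p>An entropy-style variant, in the spirit of the heuristics mentioned later in this post, would simply replace the worst-case bucket size with the expected information of each guess&#8217;s feedback distribution.</p><p>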
</p><p>It includes not only the solver configured for this particular device, but also a <strong>generalizable, extensible solver</strong> that supports larger color spaces and more rule variants than the original machine. You can reuse it directly; just change the parameters and run.</p><p>Consider it a small perk for subscribers. My articles are usually long and not exactly &#8220;friendly,&#8221; but this toy is a great entry point: buy one for a kid and let him grind at it seriously for a few days. When he is truly stuck and starting to question life, pull out this algorithm and breeze through a few hundred levels with him.</p><p>He will very likely conclude that you are <strong>spectacularly good</strong>.</p><p>Practically speaking, this may also be <strong>a way to make a child suddenly like you a lot</strong>.</p><p><a href="https://github.com/STEMMOM/giiker_super_decoder">https://github.com/STEMMOM/giiker_super_decoder</a></p><p>At this stage there is really no &#8220;puzzle-solving&#8221; significance left to squeeze out: a children&#8217;s toy, fully solved, every level cleared. In theory the game is of course <strong>NP-complete</strong>, but within this strictly limited scale it is always solvable within five moves. That, however, was never my real goal. I am not trying to prove that I beat a children&#8217;s toy; I want to stand in <strong>Wolfram&#8217;s perspective</strong> and re-examine complexity itself.</p><p>The complexity here does not come from the rules; the rules are extremely simple. It comes from a structure like this: there is a hidden ground truth (s) (the code, the answer), you can only submit one guess at a time, the system returns one piece of feedback (G/W/N), and you must invert that (s) back out within a limited number of rounds. You want to compute the answer directly, but you always lack the crucial information, and the only way to obtain information is to keep issuing queries and receiving feedback. The answer is not derived from the rules; it is forcibly dug out of the interaction.</p><p>From an information-theoretic point of view, this model is unusually pure: information leaks one-way from the hidden truth outward. The difficulty is not that the rules are complicated, but that the information is locked behind an oracle (the feedback device). In essence this is a standard black-box query problem: at every step you spend one query in exchange for an extremely limited number of bits.
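</p><p>This &#8220;few bits per query&#8221; picture can be made quantitative with a quick back-of-envelope count (my own illustrative aside): 1296 possible secrets is about 10.3 bits of hidden information, while one black/white feedback carries at most log2(14) &#8776; 3.8 bits, so no strategy whatsoever can guarantee a win in fewer than three guesses:</p>

```python
import math

# Information accounting for classic Mastermind (4 slots, 6 colors).
CODES = 6 ** 4                       # 1296 possible secrets

# Feedback is a (black, white) pair with black + white <= 4;
# (3, 1) is impossible, leaving 14 distinct outcomes.
OUTCOMES = sum(1 for b in range(5) for w in range(5 - b)) - 1

secret_bits = math.log2(CODES)       # ~10.34 bits needed to pin down the code
bits_per_query = math.log2(OUTCOMES) # at most ~3.81 bits per feedback

lower_bound = math.ceil(secret_bits / bits_per_query)
print(lower_bound)                   # 3
```

<p>The bound is loose (Knuth&#8217;s optimal worst case is 5, not 3) because no real guess splits the candidates into fourteen equally likely buckets, but it shows precisely why the oracle&#8217;s stingy channel, not the rulebook, is the source of the difficulty.</p><p>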
This structure is not confined to the toy; it is everywhere in the real world, usually with the complexity amplified exponentially. The toy&#8217;s &#8220;mercy&#8221; is that you are explicitly told the feedback channel is completely noise-free, the rules are strictly enforced, and the oracle never lies. In the real world, who promises that feedback is truthful? Or noiseless? So the thing truly worth thinking about becomes this black box whose inner mechanism you cannot see: you can only send it queries, it only returns feedback, and the whole of the complexity emerges naturally from precisely this constrained, one-way, veiled exchange of information.</p><ul><li><p>Security/cryptography: the system&#8217;s internal state is unknown; you can only probe it (queries)</p></li><li><p>Diagnosis/medicine: you don&#8217;t know the cause; you can only run tests (queries) to narrow the candidates</p></li><li><p>Scientific experiments: you don&#8217;t know the law; you can only run experiments (queries), collect observations, and converge step by step</p></li><li><p>Engineering tuning: you don&#8217;t know the best configuration; you can only do trial runs (queries), get feedback, and update</p></li></ul><p>Of course, I am well aware that I am at a <strong>very early stage</strong> here, and that this class of problems is hardly untouched. In academia it has long been studied systematically. In theoretical computer science and the literature on &#8220;puzzle complexity,&#8221; Mastermind is usually formulated as a <strong>constraint-satisfaction / consistency-checking problem</strong>, whose standard form is called the <strong>Mastermind Satisfiability Problem (MSP)</strong>, and it has been rigorously proven <strong>NP-complete</strong>, as already noted above. For the classic board-game parameters (4 positions, 6 colors, repeats allowed), the entire state space has only (6^4 = 1296) possibilities; as early as <strong>1977</strong>, Knuth gave a strategy that <strong>always wins within 5 guesses in the worst case</strong>. My implementation is essentially at that level: never more than 5 steps in the worst case, and the statistics I have seen in the literature put the average at roughly <strong>4 and a bit</strong>. So within this scale and framework, the parts that can be formalized, proven, and optimized have already been studied quite thoroughly. If I push further, I will more likely approach it from the angle of <strong>information entropy</strong>, reformulating the problem as &#8220;how much useful information can each query extract?&#8221; Fittingly, I also stumbled today on an interesting new paper, <strong>G&#252;r (2025)&#8217;s Weighted Entropy Approach</strong>, which uses weighted entropy as a heuristic to approach the theoretically optimal average number of guesses; in essence it, too, designs the &#8220;strategy&#8221; as a kind of <strong>measuring instrument</strong>, a line of thought I find very valuable as a reference. I won&#8217;t expand on it here; if you are interested, go read the original.</p><p><a href="https://arxiv.org/abs/cs/0512049?utm_source=chatgpt.com">https://arxiv.org/abs/cs/0512049?utm_source=chatgpt.com</a></p><p><a href="https://www.cs.bu.edu/fac/best/res/papers/alybull86.pdf?utm_source=chatgpt.com">https://www.cs.bu.edu/fac/best/res/papers/alybull86.pdf?utm_source=chatgpt.com</a></p><p><a href="https://arxiv.org/abs/2511.19446?utm_source=chatgpt.com">https://arxiv.org/abs/2511.19446?utm_source=chatgpt.com</a></p><div><hr></div><h1>What interests me is how Wolfram would look at this kind of problem</h1><p>Of course, the more realistic move, and the one that fits my rhythm this year, is <strong>to put a real share of my time into cellular automata themselves</strong>. Because as I see it, this thread is very nearly the origin of everything, and the origin lies in A New Kind of Science. Whether it is the book itself, or everything Wolfram
has kept publishing in the twenty-plus years since (the enormous body of articles, talks, and supplementary material), <strong>the starting point of the core object is the cellular automaton</strong>. My sense is that he treats it as a kind of <strong>smallest, cleanest computational universe</strong>, used to study head-on complexity, irreducibility, and the bare fact that &#8220;rules can be minimal while behavior is extravagantly complex.&#8221; On this line of work, <strong>Stephen Wolfram&#8217;s systematic, large-scale, long-term investment in cellular automata is nearly unique in the world</strong>. Whether it is the exhaustive scan of rule space, the classification of their evolution, the proposal of computational irreducibility, or the later ruliology perspective, it is hard to find a genuine &#8220;second shop&#8221; anywhere on the planet. So I will certainly keep learning along this line.</p><p>I have always felt that <strong>what Wolfram calls the &#8220;new scientific paradigm&#8221;</strong> was already remarkable simply for being proposed, back then, as a grand narrative by <strong>one person</strong>; and I am definitely not the only one strongly shaken by it. The key is not what &#8220;conclusions&#8221; he announced, but <strong>what questions he was actually asking</strong>. Look: handed the same Mastermind / Decoder game, my first reaction is, of course, <strong>solve it</strong>. That is entirely natural; isn&#8217;t solving one of the purposes of computation? The world appears to contain a &#8220;truth,&#8221; a <strong>unique secret code</strong> (or a tiny candidate set); the success metric is perfectly clear: <strong>lock down the answer in as few steps as possible</strong>, pursuing convergence, closure, the shortest path. This mindset could not be more familiar from complexity theory: CSP / SAT / search problems plus a bit of strategy optimization, with every evaluation metric organized around &#8220;locating the unique solution fastest.&#8221; (A sudden aside: what a model test-taker&#8217;s way of thinking&#8230;)</p><p>But switch to <strong>Stephen Wolfram&#8217;s</strong> perspective and the center of gravity of the problem shifts visibly. What he cares about is usually not &#8220;what is the answer to this instance&#8221; but: <strong>what does this rule system itself grow into</strong>. For example: will it give rise to <strong>universal computation</strong>? Which class does its overall behavior fall into (simple, periodic, chaotic, complex)? Does it admit &#8220;shortcuts,&#8221; or is it essentially <strong>computationally irreducible</strong>? Across the whole rule space, is this behavior rare or ubiquitous? In this framework, the success metric is no longer &#8220;lock down the unique solution&#8221; but <strong>discovering structural phenomena, classifying them, and identifying generative mechanisms</strong>, and answering &#8220;why is this phenomenon so common in the universe of rules?&#8221;</p><p>Then there is a thought that makes the back of my neck prickle slightly: <strong>cellular automata are Turing complete</strong>. It means that what you are facing is not a &#8220;puzzle solver&#8221; but a latent computational universe. Around this point, Wolfram proposed the <strong>PCE (Principle of Computational Equivalence)</strong>. Honestly, I am still digesting this principle myself, but roughly it says: once systems&#8217; behavior crosses some very low complexity threshold, they tend to be equivalent in <strong>computational capability</strong>. In other words, &#8220;whatever can compute, computes about equally well&#8221;; the difference is not strength, but <strong>whether it is predictable, whether it is compressible, whether it must be simulated step by step</strong>.</p><p>
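</p><p>As a hands-on illustration of &#8220;minimal rules, extravagant behavior&#8221; (a toy sketch of mine, not Wolfram&#8217;s code), here is an elementary cellular automaton. Rule 30 is his classic specimen: the update rule fits in one line, yet its center column is so disordered that, as far as anyone knows, the only way to learn what it does is to simulate it step by step:</p>

```python
# One synchronous update of an elementary cellular automaton on a ring.
# Each cell's next value is the bit of `rule` indexed by its 3-cell
# neighborhood read as a binary number (left*4 + center*2 + right).
def step(cells, rule=30):
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def run(width=31, steps=15, rule=30):
    """Evolve from a single black cell in the middle; return all rows."""
    row = [0] * width
    row[width // 2] = 1
    history = [row]
    for _ in range(steps):
        row = step(row, rule)
        history.append(row)
    return history

for r in run():
    print("".join("#" if c else "." for c in r))
```

<p>Swapping rule=90 (a nested Sierpinski pattern) or rule=110 (proven computation-universal) for rule=30 is a miniature version of the rule-space scan described above: the same one-line mechanism, wildly different classes of behavior.</p><p>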
So, Wolfram is not declining to &#8220;solve&#8221;; he has <strong>completely changed the shape of the problem</strong>. He is not asking, &#8220;What is the answer to this concrete instance?&#8221; He more often asks, &#8220;What are the <strong>behavioral regularities</strong> of this system?&#8221; Under the same premise of &#8220;given rules + initial state,&#8221; he may not care whether you push the system to step N and obtain one particular definite picture; what he likely cares about more is: whether certain statistics can be predicted (density, entropy rate, structural compression ratio), whether one can prove, or judge empirically, that &#8220;only simulation can reveal the outcome,&#8221; and whether a seemingly entirely different problem can be <strong>compiled</strong> into this system, thereby exhibiting its universality. From <strong>the PCE perspective</strong>, the &#8220;unique solution&#8221; was never the central question; what sits at the center is <strong>whether different systems share the same class of computational capability, the same class of behavior</strong>. This is exactly the worldview I am slowly beginning to register, one nearly orthogonal to the &#8220;problem-solving instinct.&#8221;</p><p>I will keep updating this series; it is a topic worth learning about and discussing over the long term. To be honest, my understanding is still quite shallow.</p>]]></content:encoded></item><item><title><![CDATA[Stephen Wolfram&#8217;s view of AI: my biggest takeaway in 2026 is the world&#8217;s computational irreducibility&#8212;and that our goal is to use AI to find &#8220;pockets of reducibility.&#8221;]]></title><description><![CDATA[Stephen Wolfram&#8217;s view of AI: my biggest takeaway of 2026 is the world&#8217;s irreducibility; our goal is to use AI to find &#8220;pockets of reducibility&#8221;. 
(The Chinese version follows below.)]]></description><link>https://www.entropycontroltheory.com/p/stephen-wolframs-view-of-ai-my-biggest</link><guid isPermaLink="false">https://www.entropycontroltheory.com/p/stephen-wolframs-view-of-ai-my-biggest</guid><dc:creator><![CDATA[Susan STEM]]></dc:creator><pubDate>Mon, 12 Jan 2026 21:59:34 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!7CqO!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb00c4079-e461-4d54-89c9-5fcd0e045695_1280x1280.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>This article was originally meant to be just an appendix&#8212;something I would use to articulate my own views on large language models in 2026 and to frame my long-term roadmap. But in its own right, it&#8217;s a heavyweight statement by Stephen Wolfram on AI and the next scientific paradigm. It deserves to be read&#8212;again and again&#8212;slowly and deeply. It was published in March 2024, yet the moment it truly <em>hit</em> me in terms of information density didn&#8217;t arrive until late 2025. By then, it felt obvious that he had already pointed to the essence of what AI really is.</p><p>That time lag is exactly the distance between me and top-tier scientists: the same paragraph, when written, is already aimed at the future; while I need years of friction with the real world&#8212;projects failing, being rebuilt, repeatedly crashing into the question of &#8220;how a system takes responsibility over time&#8221;&#8212;before I can finally understand what it&#8217;s actually saying.</p><p>More importantly, you don&#8217;t need a PhD to understand this piece, nor do you need to cross some &#8220;elite&#8221; intellectual gate. 
In a certain sense, AI really has pushed knowledge toward a kind of egalitarianism&#8212;not only by making information easier to access, but by giving us the ability to <em>wash</em> it, to sift out what truly matters. It drags many so-called academic authorities&#8212;once anchored mainly by status, networks, or journals&#8212;back to a more ordinary but far stricter standard: <strong>Does it explain the world? Can it be validated in practice? Can it be reproduced?</strong></p><p>In an era where paper counts have exploded, a large fraction of academic publishing produces little real incremental value beyond paying to publish, cross-citing, and helping authors accumulate credentials. Worse, reproducibility itself is increasingly in doubt. Against that backdrop, Wolfram&#8217;s forward-looking vision remains dramatically under-recognized by the public. I even think he is already one of the foundational figures of the next scientific paradigm&#8212;only this fact hasn&#8217;t yet been absorbed into the mainstream narrative.</p><p>Of course, to read this article properly, you need at least an intuitive grasp of one concept: <strong>computational irreducibility</strong>. Many years ago, when I first read Wolfram&#8217;s <em>A New Kind of Science</em>, I had almost no idea what he was talking about. Only later&#8212;through six to ten years of life and project experience&#8212;did I gradually begin to internalize what he meant by &#8220;irreducibility.&#8221;</p><p>That&#8217;s also why he has become one of the most heavyweight thinkers on my personal list. And he&#8217;s still alive, still actively researching, still producing new work. That means what he says today isn&#8217;t a closed &#8220;past-perfect&#8221; academic conclusion&#8212;it&#8217;s a paradigm still being built in real time. I care about every sentence he writes. 
Many of the quotes I reference are not only from this article, but drawn from his talks and other writings as well; I won&#8217;t expand on that here.</p><p><a href="https://writings.stephenwolfram.com/2024/03/can-ai-solve-science/">https://writings.stephenwolfram.com/2024/03/can-ai-solve-science/</a></p><div><hr></div><h2>AI Won&#8217;t Be Able to &#8220;Do Everything / Solve Science&#8221;</h2><ul><li><p>There is a somewhat widespread belief that AI will eventually be able to &#8220;do everything.&#8221;</p></li><li><p>Wolfram uses <strong>science</strong> as the ultimate stress test: can AI come in and, in one sweep, close out centuries of accumulated unsolved scientific problems?</p></li><li><p>His answer is: <strong>&#8220;inevitably and firmly no.&#8221;</strong></p></li></ul><p>Important: he is not denying AI&#8217;s practical usefulness; he is rejecting an &#8220;endgame omnipotence&#8221; narrative.</p><blockquote><p>&#8220;there&#8217;s a somewhat widespread belief that eventually AI will be able to &#8216;do everything&#8217;&#8221; (Writings by Stephen Wolfram)</p></blockquote><p>This applies not only to people who were excited from day one, but also to those who began skeptical and later became fervent. <strong>&#8220;AI can&#8217;t do everything&#8221;</strong> is also something I deeply want to emphasize right now. Wolfram gives us a kind of orienting guidance. When you shrink the scope of what you think AI can do, paradoxically, the set of things you <em>can</em> do becomes larger&#8212;because your precision increases and your positioning becomes clearer. 
At least, that&#8217;s been true for me.</p><blockquote><p>&#8220;So what about science?&#8221; (Writings by Stephen Wolfram)</p></blockquote><p>He then explains why science is such a decisive stress test: science is the largest intellectual edifice our civilization has built&#8212;yet it remains unfinished&#8212;so it&#8217;s the sharpest way to interrogate what &#8220;do everything&#8221; really means.</p><blockquote><p>&#8220;the single largest intellectual edifice of our civilization&#8221; (Writings by Stephen Wolfram)</p><p>&#8220;there are still all sorts of scientific questions that remain.&#8221; (Writings by Stephen Wolfram)</p></blockquote><p>He&#8217;s not asking whether AI can <em>help</em> science. He&#8217;s asking whether it can <strong>finish what&#8217;s left</strong>.</p><blockquote><p>&#8220;So can AI now come in and just solve all of them?&#8221; (Writings by Stephen Wolfram)</p><p>&#8220;the answer is inevitably and firmly no.&#8221; (Writings by Stephen Wolfram)</p></blockquote><p><strong>No.</strong></p><blockquote><p>&#8220;But that certainly doesn&#8217;t mean AI can&#8217;t importantly help&#8230;&#8221; (Writings by Stephen Wolfram)</p></blockquote><p>And that <strong>&#8220;but&#8221;</strong> is crucial. What AI is <em>particularly good at</em>&#8212;and what we should study seriously&#8212;starts right there. 
That&#8217;s where the value is.</p><div><hr></div><h2>Wolfram&#8217;s &#8220;Practical Positioning&#8221; for AI: Linguistic Interface + High-Level Autocomplete of Conventional Wisdom</h2><p>The second paragraph already contains his measured affirmation:</p><ul><li><p>LLMs provide a new kind of <strong>linguistic interface</strong>, connecting human intent to existing computational capabilities (his example: the Wolfram Language).</p></li><li><p>Through &#8220;conventional scientific wisdom,&#8221; LLMs can function as <strong>high-level autocomplete</strong>, filling in &#8220;conventional answers&#8221; or &#8220;conventional next steps.&#8221;</p></li></ul><p>This section is key. He acknowledges that LLMs are <strong>very strong</strong>&#8212;but strong in making existing paradigms smoother to use, not in producing genuinely new paradigms of discovery.</p><h3>1) LLMs as a new &#8220;linguistic interface&#8221;</h3><blockquote><p>&#8220;At a very practical level, for example, LLMs provide a new kind of linguistic interface to the computational capabilities that we&#8217;ve spent so long building in the Wolfram Language.&#8221; (Writings by Stephen Wolfram)</p></blockquote><h3>2) High-level autocomplete via &#8220;conventional scientific wisdom&#8221;</h3><blockquote><p>&#8220;And through their knowledge of &#8216;conventional scientific wisdom&#8217; LLMs can often provide what amounts to very high-level &#8216;autocomplete&#8217;&#8230;&#8221; (Writings by Stephen Wolfram)</p><p>&#8220;&#8230;for filling in &#8216;conventional answers&#8217; or &#8216;conventional next steps&#8217; in scientific work.&#8221; (Writings by Stephen Wolfram)</p></blockquote><p>Now, I want to single out <strong>&#8220;linguistic interface&#8221;</strong>, because it matters enormously.</p><p>Right now our imagination of it is far too narrow&#8212;basically reduced to a chat window, plus some ordinary programming use cases. 
Let&#8217;s be honest: most of us &#8220;mere mortals&#8221; are still operating at this level&#8212;open a window, drop a prompt, get an output that looks plausible, and feel as if the world has been rewritten.</p><p>But here&#8217;s the point: <strong>the interface itself contains massive room for systematic engineering.</strong></p><p>The world is not limited to a chat window just because that&#8217;s all we can currently picture. The chat window is merely the most primitive, crude form of interface.</p><p>The pleasure of &#8220;AI vibe coding&#8221; is, in essence, a short-term dopamine hit from interface upgrades: work that used to cost you hours of intense cognitive effort can now be &#8220;done&#8221; with a single prompt, and it feels great&#8230; but is that all? If we treat that as the endpoint, we&#8217;ve merely installed a new kind of <em>text slot machine</em> inside our workflows.</p><p>The real question is not &#8220;can it give you an answer?&#8221; The real question is:</p><p><strong>Can this interface carry real-world responsibility?</strong></p><p>In much of my earlier writing I keep returning to a few concepts:</p><p><strong>Reproducibility. Auditability. Portability.</strong></p><p>These are not engineering vanity metrics; they are institutional properties required for anything that enters <strong>real decision-making</strong>. (Look at any organizational process, legal process, judicial process, or audit process&#8212;same structure.)</p><p>Because if you zoom out, you realize: human daily life, organizational life, corporate governance, even national governance&#8212;an enormous portion of decision-making runs, at its core, through a <strong>linguistic interface</strong>. We use language to raise issues, describe risk, exchange commitments, write rules, issue judgments&#8212;and then we convert those words into actions and consequences.</p><p>So here&#8217;s the question: can we let a large language model make those decisions? 
Do we dare?</p><p>A single model&#8217;s judgment is inherently environment-dependent. The same sentence, the same question, can change with context. And constraints written in a prompt are, fundamentally, <strong>soft constraints</strong>&#8212;not machine-executable <strong>if-else</strong> logic. More dangerously, the model will ignore constraints in order to &#8220;complete the language&#8221; and deliver an answer that looks like an answer&#8212;because its objective is not &#8220;obey the institution,&#8221; but &#8220;produce coherent text.&#8221;</p><p>Predict the next token.</p><p>It won&#8217;t say: <strong>Error</strong> (with a clear error code).</p><p>It must give you <em>something</em>. Its default behavior is to disguise uncertainty as certainty, and to disguise indeterminacy as a decisive ruling.</p><p><strong>Has this problem been solved?</strong></p><p>No.</p><p>So do we have an interface architecture where, when language enters the system:</p><ul><li><p>constraints are hard&#8212;compilable and verifiable</p></li><li><p>failure is allowed&#8212;explicit <strong>throw error / require override</strong></p></li><li><p>decisions are replayable and accountable&#8212;not a one-off chat session</p></li><li><p>outputs must pass institutional audit and gating before they touch reality</p></li></ul><p>If not, then the &#8220;vibe coding&#8221; pleasure we&#8217;re enjoying today is still just fireworks in low-risk environments. We remain an entire institutional engineering chasm away from decision systems that can actually carry human society.</p><p>And isn&#8217;t that exactly where developers should invest resources and attention? Isn&#8217;t this a career-grade, high-value path that can be reasoned out ahead of time&#8212;especially for those of us who are not academic researchers, but application-oriented builders? 
(laugh)</p><div><hr></div><h2>He Abstracts the History of Science into Two Representation Revolutions: Mathematical Representation &#8594; Computational Representation</h2><p>He then proposes a deeper evaluative framework:</p><ul><li><p>Three centuries ago, science underwent a leap because we learned to <strong>represent the world using mathematics</strong>.</p></li><li><p>Today, we are in the middle of a leap toward <strong>a fundamentally computational representation of the world</strong> (which he sees as a more foundational paradigm shift).</p></li></ul><p>This move effectively <strong>raises the bar</strong>:</p><p>If you ask whether AI is &#8220;changing science,&#8221; you first have to clarify: is AI helping mainly at the <strong>tool layer</strong>, or is it introducing a new scientific paradigm at the <strong>representation layer</strong>?</p><blockquote><p>&#8220;Three centuries ago science was transformed by the idea of representing the world using mathematics.&#8221; (Writings)</p><p>&#8220;And in our times we&#8217;re in the middle of a major transformation to a fundamentally computational representation of the world (and, yes, that&#8217;s what our Wolfram Language computational language is all about).&#8221; (Writings)</p></blockquote><h3>The &#8220;Tool Layer vs. Paradigm Layer&#8221; Question</h3><blockquote><p>&#8220;So how does AI stack up?&#8221; (Writings)</p><p>&#8220;Should we think of it essentially as a practical tool for accessing existing methods, or does it provide something fundamentally new for science?&#8221; (Writings)</p></blockquote><p>What does this mean? For someone encountering his worldview for the first time, it can feel unfamiliar. <strong>Computational irreducibility</strong> isn&#8217;t merely a technical term&#8212;it&#8217;s a <strong>worldview switch</strong>. 
It says that many systems are not something you can &#8220;skip through&#8221; by being smarter; instead, you often <strong>have to do the computation all the way through</strong>. (If this feels impossible to grasp, you really should study cellular automata; the mechanism makes the idea tangible.) What you <em>can</em> do, more often than not, is find <strong>pockets of reducibility</strong>&#8212;places where you can compress locally and predict locally. Remember this phrase: <strong>pockets of reducibility</strong>.</p><p>When Demis Hassabis talks about whether nature is modelable, I think his intuition is very similar. In his interview with Lex Fridman (I wrote a lot of threads about it on X, but they&#8217;re hard to locate now; later I moved my longer essays to Substack), he used protein folding as an analogy for this idea of &#8220;pockets.&#8221; It&#8217;s like an enormous wilderness: there are always a few footpaths that people have carved out. If you find the path, everything suddenly feels easy. If you don&#8217;t, you&#8217;re stuck with brute force. 
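</p><p>If the mechanism still feels abstract, a tiny experiment makes &#8220;no shortcut, just compute&#8221; tangible. Below is a minimal sketch of rule 30, the classic elementary cellular automaton from <em>A New Kind of Science</em> (my own stdlib-only Python illustration, not code from Wolfram&#8217;s article): the only way to know what row <em>n</em> looks like is to compute every row before it.</p>

```python
# Rule 30 elementary cellular automaton (Wolfram's classic example of
# computational irreducibility). Each cell's next state is looked up from
# the 3-bit pattern formed by (left neighbor, cell, right neighbor).
RULE = 30  # binary 00011110: one output bit per possible 3-bit pattern

def step(cells):
    """Advance one generation; cells beyond the edges count as 0."""
    out = []
    for i in range(len(cells)):
        left = cells[i - 1] if i > 0 else 0
        right = cells[i + 1] if i < len(cells) - 1 else 0
        pattern = (left << 2) | (cells[i] << 1) | right
        out.append((RULE >> pattern) & 1)
    return out

def run(width, steps):
    """Start from a single live cell and compute row after row.
    Row n is reached only by stepping through rows 1..n-1 first."""
    row = [0] * width
    row[width // 2] = 1
    history = [row]
    for _ in range(steps):
        row = step(row)
        history.append(row)
    return history

if __name__ == "__main__":
    for row in run(31, 15):
        print("".join("#" if cell else "." for cell in row))
```

<p>The contrast is the point: for a trivially regular rule you can write down row <em>n</em> directly (a pocket of reducibility), but for rule 30&#8217;s center column no predictive formula is known; stepping through every generation, as <code>run</code> does, is the brute-force default.</p><p>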
The underlying structure is: <strong>pocketed reducibility / otherwise brute force</strong>:</p><blockquote><p>&#8220;if there&#8217;s not [patterns]&#8230; you have to do brute force.&#8221; (Lex Fridman)</p></blockquote><p>And the key to why many natural problems look combinatorially explosive yet still become modelable:</p><blockquote><p>&#8220;there&#8217;s some structure&#8230; some gradient you can follow.&#8221; (Lex Fridman)</p></blockquote><p>Read these side by side with Wolfram:</p><ul><li><p>Wolfram: global irreducibility prevents systematic &#8220;jumping ahead,&#8221; but there are always <strong>pockets of reducibility</strong>.</p></li><li><p>Demis: if the space has structure (a gradient/landscape), you can search effectively; if it doesn&#8217;t, you&#8217;re stuck with brute force.</p></li></ul><p>So when I say &#8220;Wolfram and Demis are similar&#8212;irreducibility,&#8221; I mean this:</p><p><strong>The world does not guarantee shortcuts everywhere. What we call intelligence is often just the ability to quickly find where structure exists and where it doesn&#8217;t.</strong></p><p>And irreducibility feels like an &#8220;ocean of possibilities&#8221; precisely because it yanks you away from &#8220;capability worship&#8221; and brings you back to <strong>computation and institutions</strong>:</p><ul><li><p>when you must simulate, enumerate, and complete the computation;</p></li><li><p>when you can compress, abstract, and form a narrative;</p></li><li><p>and most importantly: when you plug AI into real-world decision-making, which parts you <strong>must not let it &#8220;guess through,&#8221;</strong> and instead must ground in hard structures that are reproducible, auditable, and able to throw explicit errors.</p></li></ul><p>Honestly, I&#8217;m still learning too. 
If I can internalize even a small fraction of what these giants are pointing at, that&#8217;s already a huge win.</p><div><hr></div><h2>The Hardest Core Argument: Computational Irreducibility as a &#8220;Physics-Level Limit&#8221; AI Cannot Cross</h2><p>This article is powered by the engine of <strong>computational irreducibility</strong>.</p><ul><li><p>Treat natural systems as computational processes: the system itself is &#8220;computing&#8221; its evolution.</p></li><li><p>We (or AI) must also compute in order to predict it.</p></li><li><p>The <strong>Principle of Computational Equivalence</strong>: these computations are, in principle, comparable in sophistication.</p></li><li><p>Therefore, you cannot expect AI to &#8220;jump ahead&#8221; and skip the evolution steps systematically.</p></li><li><p>So fully &#8220;solving science&#8221; is impossible.</p></li></ul><blockquote><p>The kind of &#8220;endgame shortcut&#8221; you want simply doesn&#8217;t exist for many systems. It&#8217;s not that you didn&#8217;t train enough&#8212;the world doesn&#8217;t grant that shortcut.</p></blockquote><p>This is even harder to understand&#8212;so let&#8217;s unpack it:</p><h3>1) The world as computation: the system is &#8220;computing&#8221; its behavior</h3><p>Wolfram first sets &#8220;world = computational process&#8221; as a foundational premise:</p><blockquote><p>&#8220;we can think of everything that happens as a computational process.&#8221; (Writings)</p><p>&#8220;The system is doing a computation to determine its behavior.&#8221; (Writings)</p></blockquote><p>He&#8217;s not saying &#8220;we simulate the world using computation.&#8221; He&#8217;s saying: <strong>the world itself is computation.</strong></p><h3>2) We (or AI) must compute too, in order to predict it</h3><p>He then puts the observer (human or AI) into the same computational frame:</p><blockquote><p>&#8220;We humans&#8212;or, for that matter, any AIs we create&#8212;also have to do computations&#8221; 
(Writings)</p><p>&#8220;to try to predict or &#8216;solve&#8217; that behavior.&#8221; (Writings)</p></blockquote><p>Meaning: if you want to know what will happen, you must <strong>pay computational steps</strong>&#8212;you don&#8217;t get a free pass just because your output looks like &#8220;human intuition in text form.&#8221;</p><h3>3) PCE: the &#8220;sophistication&#8221; ceiling is of the same order</h3><p>He uses the Principle of Computational Equivalence to nail down why you can&#8217;t systematically skip steps:</p><blockquote><p>&#8220;the Principle of Computational Equivalence says that these computations are all at most equivalent in their sophistication.&#8221; (Writings)</p></blockquote><p>This is the &#8220;physics-level nail&#8221;: the system computes; you compute; the ceiling is of the same order&#8212;so there is no universal &#8220;god&#8217;s-eye shortcut.&#8221;</p><h3>4) Therefore, no systematic &#8220;jump ahead&#8221;</h3><p>He basically uses the exact phrase &#8220;jump ahead&#8221;:</p><blockquote><p>&#8220;we can&#8217;t expect to systematically &#8216;jump ahead&#8217; and predict or &#8216;solve&#8217; the system&#8221; (Writings)</p><p>&#8220;it inevitably takes a certain irreducible amount of computational work&#8221; (Writings)</p></blockquote><p>The key word is <strong>systematically</strong>. He isn&#8217;t saying you can never skip <em>anything</em>; he&#8217;s saying there is no broadly reliable method to skip the evolution itself. 
Those occasional &#8220;skips&#8221; are exactly what he means by <strong>pockets of reducibility</strong>.</p><h3>5) Therefore, &#8220;solving science&#8221; is impossible: irreducibility is the ceiling</h3><p>He concludes that scientific power hits a hard limit:</p><blockquote><p>&#8220;we&#8217;ll ultimately be limited in our &#8216;scientific power&#8217; by the computational irreducibility of the behavior.&#8221; (Writings)</p></blockquote><p>And then he adds the blunt line (this one really matters):</p><blockquote><p>&#8220;there just won&#8217;t be any way&#8212;with AI or otherwise&#8212;to shortcut just simulating the system step by step.&#8221; (Writings)</p></blockquote><p>Now it starts to feel &#8220;mystical&#8221; and subtle. Here&#8217;s my own way of understanding it: the world is computing (I can&#8217;t fully explain what &#8220;the world computing&#8221; means). Let me offer an analogy&#8212;are your DNA and cells &#8220;computing&#8221; the protein structures inside your body? This is still a metaphor: the world advances its states according to its own rules, like a process running itself. And you&#8217;re computing too&#8212;using your mind, paper, machines, models&#8212;to try to know in advance what will happen.</p><p>The key is: these two kinds of &#8220;computation&#8221; are fundamentally on the same level. You are not outside the universe holding a remote control; you are a computational device <em>inside</em> the universe. Trying to use one computation to dominate another and systematically &#8220;skip steps&#8221; is, in most cases, impossible. Your computation will not be more &#8220;clever&#8221; than the world&#8217;s (it can only be weaker, honestly), because you&#8217;re still running rules within the same physical universe. 
The only times you &#8220;win&#8221; are when you happen to find a <strong>shortcut pocket</strong>&#8212;not because you became God, but because the system allows compression at a particular scale, allows simplification, allows you to say something ahead of time.</p><p>And every computation costs energy. (Landauer&#8217;s principle and the rest&#8212;no need to go deep here.) I increasingly treat this as a kind of base tax in reality. If you want more detail, you pay more steps. If you want more precision, reproducibility, and accountability, you pay more structural cost. You can bluff with language for a while, but once it touches execution, you must pay the bill: time, compute, energy, even human attention. &#8220;Irreducibility&#8221; means: in many places, <strong>you can&#8217;t avoid paying that bill.</strong></p><p>So when we shift from &#8220;Can AI do everything?&#8221; to &#8220;Can AI help us make decisions?&#8221;, the question becomes sharp: if the world&#8217;s evolution must be computed step by step, why is a model entitled to output something that looks like a conclusion or a ruling <strong>without paying comparable cost</strong>? Even worse, because the model is driven to continue the language, it tends to package uncertainty as a &#8220;sayable result,&#8221; rather than behaving like a real system: throw an error when it must, halt when it must, demand more information when it must. (This &#8220;ability to throw an error&#8221; is extremely important; I&#8217;ve discussed it in detail elsewhere. 
Systems that cannot error out are dangerous.)</p><p>That&#8217;s why I believe what truly needs to be engineered is not &#8220;making AI answer better,&#8221; but <strong>giving the linguistic interface institutional constraints that feel as strict as conservation laws</strong>: make it expensive when it must be expensive, and make it stop when it must stop; make it able to admit &#8220;there is no shortcut here,&#8221; rather than handing you a cheap illusion. Otherwise, the computation cost you <em>must</em> pay gets silently converted into a more hidden cost: misjudgment, misplaced trust, mistaking approximation for conclusion, mistaking smoothness for reliability.</p><p>This is also why I keep emphasizing: reproducibility, auditability, portability. These aren&#8217;t engineering OCD; they are the hard conditions that keep language from collapsing when it enters real decision-making under an irreducible world. You can&#8217;t defeat the world&#8217;s computation. The only thing you can do is acknowledge where you must compute and where you can compress&#8212;and then write that into structure, so the system can take responsibility for every judgment it makes over time.</p><p>This is honestly the limit of how far I can explain it. I can&#8217;t go further than this.</p><blockquote><p>So what&#8217;s the key? 
Wolfram&#8217;s greatest value&#8212;his biggest gift to me&#8212;is exactly here: he tells us our goal is to find countless &#8220;pockets of reducibility.&#8221;</p></blockquote><div><hr></div><h2>Key Detail: Irreducibility Does <strong>Not</strong> Mean &#8220;Nothing Can Be Done&#8221;</h2><p>Many people misread this.</p><p>Wolfram&#8217;s point is:</p><ul><li><p>The overall behavior may be irreducible,</p></li><li><p>but there must exist infinitely many <strong>&#8220;pockets of reducibility,&#8221;</strong></p></li><li><p>and science is possible precisely because we usually work inside those pockets&#8212;regularities, models, compression, and understanding all come from them.</p></li></ul><p>So his conclusion is essentially two-part:</p><ol><li><p><strong>AI cannot enable us to systematically bypass irreducibility.</strong></p></li><li><p><strong>AI may help us find pockets of reducibility more efficiently.</strong></p></li></ol><p>That&#8217;s why he can say &#8220;the endgame is impossible&#8221; while still spending so much time discussing what &#8220;AI can do in science.&#8221;</p><p>This is <strong>not</strong> an argument for giving up.</p><div><hr></div><h3>1) He first poses the key question: if things are irreducible, why is science possible at all?</h3><p>He immediately asks:</p><blockquote><p>&#8220;But given computational irreducibility, why is science actually possible at all?&#8221; (Writings)</p></blockquote><p>This pulls the reader out of &#8220;despair/nihilism&#8221;: if you can&#8217;t skip steps, wouldn&#8217;t science collapse? 
That&#8217;s exactly where many people misunderstand him&#8212;he&#8217;s not being pessimistic.</p><div><hr></div><h3>2) The core answer: overall irreducibility implies infinitely many &#8220;pockets of reducibility&#8221;</h3><p>His &#8220;pocket theory&#8221;:</p><blockquote><p>&#8220;whenever there&#8217;s overall computational irreducibility, there are also an infinite number of pockets of computational reducibility.&#8221; (Writings)</p></blockquote><p>What a &#8220;pocket&#8221; means:</p><blockquote><p>&#8220;there are always certain aspects of a system about which things can be said using limited computational effort.&#8221; (Writings)</p></blockquote><p>Why science relies on pockets:</p><blockquote><p>&#8220;these are what we typically concentrate on in &#8216;doing science&#8217;.&#8221; (Writings)</p></blockquote><div><hr></div><blockquote><p>So what are we supposed to do?</p></blockquote><div><hr></div><h2>AI Cannot Systematically Cross Irreducibility (the endgame is impossible)</h2><p>Outside the pockets, irreducibility still brings limits and surprises:</p><blockquote><p>&#8220;there are limits to this&#8212;and issues that run into computational irreducibility.&#8221; (Writings)</p><p>&#8220;we just can&#8217;t answer&#8221; / &#8220;surprises&#8221; (Writings)</p></blockquote><p>He restates that you cannot shortcut the full evolution:</p><blockquote><p>&#8220;there just won&#8217;t be any way&#8212;with AI or otherwise&#8212;to shortcut&#8230; step by step.&#8221; (Writings)</p></blockquote><p><strong>AI cannot let us systematically bypass irreducibility. Human smallness does not disappear. 
AI cannot challenge God.</strong></p><div><hr></div><h2>AI May Help Us Find &#8220;Pockets of Reducibility&#8221; Faster</h2><p>Still the pocket theory:</p><blockquote><p>&#8220;AI has the potential to give us streamlined ways to find certain kinds of pockets of computational reducibility.&#8221; (Writings)</p></blockquote><p>The key phrase is <strong>&#8220;streamlined ways&#8221;</strong>: not &#8220;prove everything / solve everything,&#8221; but &#8220;find certain kinds of pockets more smoothly.&#8221;</p><div><hr></div><h2>Why He Can Deny the &#8220;Endgame&#8221; Yet Still Talk About What AI Can Do</h2><p>Because in his framework:</p><ul><li><p><strong>Irreducibility</strong> explains why &#8220;solve science / do everything&#8221; is impossible (the ceiling),</p></li><li><p><strong>pockets of reducibility</strong> explain why science still works and why tools remain useful (the space),</p></li><li><p><strong>AI</strong> is placed in a very specific role: accelerating the discovery and exploitation of pockets, not creating shortcuts where irreducibility rules.</p></li></ul><blockquote><p>The practical takeaway is: remember what AI can do, and try to find your own niche in your domain&#8212;some &#8220;pocket of reducibility&#8221; that AI can help reveal.</p></blockquote><div><hr></div><h1>Neural Nets Are Good at &#8220;Roughly Right,&#8221; Not at &#8220;Getting Every Detail Exactly Right&#8221;</h1><p>Later in the article he demonstrates this with a series of experiments:</p><ul><li><p>Predicting functions: fits the past, but the future details collapse.</p></li><li><p>Predicting cellular automata: gets the simple parts right, fails on complex parts; errors compound.</p></li><li><p>Predicting the three-body problem: can &#8220;memorize&#8221; simple trajectories, struggles with complex ones.</p></li><li><p>Autoencoder compression: compresses data similar to its training set; can&#8217;t compress through irreducibility.</p></li></ul><p>His summary is 
basically:</p><blockquote><p>ML is often &#8220;roughly right,&#8221; but &#8220;nailing the details&#8221; isn&#8217;t its strength.</p></blockquote><p>This is his experience-based support for why LLMs/NNs hit a wall in science.</p><div><hr></div><h1>His Philosophical Reading of AlphaFold: Much of the &#8220;Success&#8221; Depends on Human Criteria</h1><ul><li><p>Protein folding itself is not a human task.</p></li><li><p>But what we count as &#8220;correct&#8221; (shape, function, secondary structure, etc.) is a human criterion.</p></li><li><p>So neural nets may succeed partly because they capture pockets of reducibility aligned with human perception/classification standards.</p></li><li><p>But with more complex or &#8220;alien&#8221; proteins, surprises and failures can still appear.</p></li></ul><p>The philosophical point is:</p><blockquote><p>AI often succeeds &#8220;within human-defined usable criteria,&#8221; not &#8220;in fully objective microscopic truth.&#8221;</p></blockquote><p>This can feel &#8220;mystical&#8221; again&#8212;but it&#8217;s directly connected to Demis Hassabis and AlphaFold in a very concrete way.</p><p>Demis is a generational prodigy, and AlphaFold truly did open an era. Both he and Wolfram are scientists I follow closely. 
But to be clear: <strong>Wolfram is not denying AlphaFold&#8217;s achievement.</strong> He&#8217;s using AlphaFold as a powerful example to illustrate a deeper structural claim: the biggest AI breakthroughs are often not &#8220;solving the entire microscopic reality of the world,&#8221; but <strong>precisely hitting a valuable pocket of reducibility.</strong></p><p>Their language systems are very different, so they don&#8217;t look like they&#8217;re speaking the same &#8220;dialect.&#8221; But I think the structure of what they&#8217;re saying is isomorphic.</p><p>In Wolfram&#8217;s framing, protein folding is not &#8220;human-centered&#8221;; yet we never evaluate AlphaFold by demanding the exact position of every atom at every moment. We want results that are usable and verifiable for biology: whether the overall structure is right, whether key features are right, whether the functional shape is right. In other words, our definition of &#8220;correctness&#8221; already lives in a region that is compressible, generalizable, and operational.</p><p>That&#8217;s where AlphaFold&#8217;s greatness becomes visible: it isn&#8217;t &#8220;explaining the world with smarter language,&#8221; but using powerful learning under massive data and structural constraints to identify repeated, stable, reusable regularities. That&#8217;s exactly what Wolfram calls a <strong>pocket of computational reducibility</strong>: even in a world that may be globally irreducible (and where you can&#8217;t systematically skip steps), there are still local regions that can be compressed, modeled, and reliably exploited. AlphaFold&#8217;s victory is that it locked onto a particularly valuable pocket with extraordinary precision&#8212;and engineered it into a scalable tool.</p><p>So when Wolfram talks about &#8220;the eye of the beholder,&#8221; he isn&#8217;t diminishing AlphaFold. 
He&#8217;s pointing to a key reality: science and engineering ultimately define success around human-relevant metrics, scales, and criteria&#8212;and pockets of reducibility often appear precisely at those levels, making the complex world compressible, predictable, and actionable.</p><p>I believe Wolfram fundamentally endorses the AlphaFold pattern:</p><p><strong>AlphaFold didn&#8217;t &#8220;pierce irreducibility&#8221;&#8212;it found one of the most valuable pockets of reducibility inside an irreducible world.</strong></p><div><hr></div><h2>&#8220;Science as Narrative&#8221; as His Landing Point for Why Humans Remain Irreplaceable in Science</h2><p>He emphasizes:</p><ul><li><p>Science has traditionally been about forging the world into a narrative humans can think and talk in.</p></li><li><p>Irreducibility implies that in many places you can only give &#8220;100 computational steps,&#8221; which is not a human narrative.</p></li><li><p>Human narrative needs <strong>waypoints</strong>: familiar theorems, concept chunks, language constructs.</p></li><li><p>Wolfram Language is, in essence, an attempt to manufacture such &#8220;human-assimilable waypoints.&#8221;</p></li><li><p>AI may help with naming or aligning vocabulary, but it&#8217;s not guaranteed that every pocket of reducibility can be covered by human concepts (interconcept space).</p></li></ul><p>This directly answers the line I&#8217;ve been developing about &#8220;psychological immersion / interface protocols&#8221;:</p><blockquote><p>LLMs are strongest at narrative and interface; but scientific progress relies on computable, reproducible, and structurally organized waypoints&#8212;not on smooth conversation.</p></blockquote><p>He defines science as a narrative-engineering project:</p><blockquote><p>&#8220;the essence of science&#8230; casting it in a form we humans can think about&#8221; (Writings)</p><p>&#8220;provide a human-accessible narrative&#8221;</p></blockquote><p>Irreducibility makes that 
narrative impossible in many places:</p><blockquote><p>&#8220;computational irreducibility&#8230; shows us that this will&#8230; not be possible&#8221; (Writings)</p><p>&#8220;It doesn&#8217;t do much good to say &#8216;here are 100 computational steps&#8217;&#8221;</p></blockquote><p>To translate a non-human computation chain into a human narrative, you need waypoints:</p><blockquote><p>&#8220;we&#8217;d need &#8216;waypoints&#8217; that are somehow familiar&#8221;</p><p>&#8220;pieces that humans can assimilate&#8221;</p></blockquote><p>He elevates this into the mission of computational language design:</p><blockquote><p>&#8220;capture &#8216;common lumps of computational work&#8217; as built-in constructs&#8221;</p><p>&#8220;identifying &#8216;human-assimilable waypoints&#8217; for computations&#8221;</p><p>&#8220;we&#8217;ll never be able to find such waypoints for all computations&#8221;</p></blockquote><p>And he warns that even if AI finds reduced representations, they may not map into our current concept system:</p><blockquote><p>&#8220;not part of our current scientific lexicon&#8221;</p><p>&#8220;there often won&#8217;t be&#8230; a &#8216;human-accessible narrative&#8217; that &#8216;reaches&#8217; them&#8221;</p></blockquote><p>Meaning: the pocket may exist, but we may not have words for it; AI can propose names, but that doesn&#8217;t guarantee those names become usable human waypoints.</p><p>Wolfram is effectively placing LLM strength where it belongs: <strong>narrative and interface</strong>&#8212;while insisting that what actually drives science (and what matters for governance/decision systems) is <strong>computable, reproducible, structurally organized waypoints</strong>&#8212;not a smooth chat.</p><blockquote><p>LLM can make you feel like you understood; but only waypoints (executable constructs / auditable intermediate states) can make you truly reproducible, portable, and accountable.</p></blockquote><p>I personally don&#8217;t fully grasp what 
&#8220;waypoints&#8221; refers to in practice.</p><div><hr></div><h2>The Architecture He Ultimately Bets On: AI + the Computational Paradigm</h2><p>Near the end he argues:</p><ul><li><p>AI is a new way of leveraging reducibility (capturing pockets),</p></li><li><p>but for fundamental discovery it&#8217;s weaker than the computational paradigm plus irreducible computation (enumeration, simulation, system exploration),</p></li><li><p>the best path forward is combining the strengths of AI and the formal computational paradigm.</p></li></ul><p>In plain terms:</p><ul><li><p><strong>AI</strong>: navigation, candidates, intuition, interface, humanization, cross-domain analogy</p></li><li><p><strong>Computational systems</strong>: verifiable derivation, reproducible execution, enumeration, rigorous structuring</p></li><li><p><strong>Irreducible computation</strong>: true &#8220;new terrain discovery&#8221; (not &#8220;paper-like&#8221; textual novelty)</p></li></ul><p>I am also not sure how to explain this part.</p><div><hr></div><p>This article was originally just an appendix for laying out my own views on large models in 2026 and my long-term plans. But the piece it discusses is in fact a major statement by Wolfram on AI and on the future paradigm of science, worth reading repeatedly and deeply. It was published in March 2024, yet the moment it truly &#8220;hit me at full information density&#8221; came only at the end of 2025. Even back then, he had already pointed out to us the essence of AI.</p><p>That time lag is precisely the distance between me and a top scientist: the same passage was already pointing at the future when it was written, while I needed a stretch of real-world friction, project failures and rebuilds, and repeated collisions with the question of &#8220;how systems bear responsibility over time&#8221; before I finally understood what it was saying.</p><p>More importantly: understanding this article requires no PhD and no &#8220;lofty barrier to entry.&#8221; In a sense, AI really has democratized knowledge: not just by making knowledge easier to access, but by giving us the ability to sift out the knowledge that is genuinely valuable. It drags much of the so-called &#8220;academic authority&#8221; that used to be anchored in status, circles, and journals back to a standard that is both plainer and harsher: <strong>can it explain the world, can it be verified in practice, can it be reproduced.</strong></p><p>In an era of exploding paper counts, many academic papers produce no real incremental value beyond paid publication, mutual citation, and helping their authors win promotion; worse, reproducibility itself is increasingly in doubt. By contrast, Wolfram&#8217;s foresight has gone largely unrecognized by the public. I would even say he is already one of the founders of the next scientific paradigm; this fact simply has not yet been absorbed by the &#8220;mainstream narrative.&#8221;</p><p>Of course, to really read this article you first need a basic intuition for one concept: <strong>computational irreducibility</strong>. When I first read Wolfram&#8217;s &#8220;A New Kind of Science&#8221; many years ago, I had almost no idea what he was talking about. Only later, through six to ten years of life and project experience, did I slowly come to grasp this &#8220;irreducibility.&#8221;</p><p>That is also why he ranks so high among the heavyweight figures in my mind. And he is still with us: still actively researching, still steadily producing. Which means that everything he says now is not a &#8220;past-perfect&#8221; academic conclusion but paradigm-building still in progress. I pay close attention to every sentence he says. Many of the quotes here are not from this one article but gathered from his talks and other writings; I will not enumerate them all.</p><p><a href="https://writings.stephenwolfram.com/2024/03/can-ai-solve-science/">https://writings.stephenwolfram.com/2024/03/can-ai-solve-science/</a></p><div><hr></div><h2>AI Cannot &#8220;Do Everything / Solve Science&#8221;</h2><ul><li><p>Some people in society believe that AI 
will eventually be able to &#8220;do everything&#8221;</p></li><li><p>He treats &#8220;science&#8221; as the ultimate stress test: can AI solve, in one go, the unsolved scientific problems accumulated over centuries?</p></li><li><p>His answer is &#8220;inevitably and firmly no&#8221;</p></li></ul><p>Note: he is not denying AI&#8217;s practical value; he is rejecting a certain &#8220;endgame omnipotence&#8221; narrative.</p><blockquote><p>&#8220;there&#8217;s a somewhat widespread belief that eventually AI will be able to &#8216;do everything&#8217;&#8221; (Writings by Stephen Wolfram)</p></blockquote><p>That belief includes many who started out skeptical and many who started out fervent. That AI can&#8217;t do everything is also something I now very much want to stress. He gives us a kind of positioning guidance, and it is worth aligning with him on this: when the range of what you can do narrows, what you can actually do paradoxically &#8220;grows,&#8221; because your precision rises and your positioning becomes clearer. At least that is how it works for me.</p><blockquote><p>&#8220;So what about science?&#8221; (Writings by Stephen Wolfram)</p></blockquote><p>He then gives the reason science is the ultimate stress test: science is humanity&#8217;s largest intellectual project, and it still is not finished, which makes it the best test of what &#8220;do everything&#8221; really means.</p><blockquote><p>&#8220;the single largest intellectual edifice of our civilization&#8221; (Writings by Stephen Wolfram)</p><p>&#8220;there are still all sorts of scientific questions that remain.&#8221; (<a href="https://writings.stephenwolfram.com/2024/03/can-ai-solve-science/">Writings by Stephen Wolfram</a>)</p></blockquote><p>The question is not &#8220;can AI help science&#8221; but rather: <strong>can it finish off everything that remains</strong>.</p><blockquote><p>&#8220;So can AI now come in and just solve all of them?&#8221; (Writings by Stephen Wolfram)</p><p>&#8220;the answer is inevitably and firmly no.&#8221; (Writings by Stephen Wolfram)</p></blockquote><p><strong>No!</strong></p><blockquote><p>&#8220;But that certainly doesn&#8217;t mean AI can&#8217;t importantly help&#8230;&#8221; (Writings by Stephen Wolfram)</p></blockquote><p>That &#8220;but&#8221; of his matters a great deal! Whatever AI is especially good at is exactly what we should go study. The valuable things.</p><div><hr></div><h2>His &#8220;Realistic Positioning&#8221; of AI: Linguistic Interface + 
High-Level Autocomplete over Conventional Wisdom</h2><p></p><ul><li><p>An LLM is a new kind of <strong>linguistic interface</strong> that can connect human intent to existing computational capabilities (his own example is the Wolfram Language).</p></li><li><p>LLMs can also draw on &#8220;conventional scientific wisdom&#8221; to provide <strong>high-level autocomplete</strong>: filling in the &#8220;conventional answer / conventional next step.&#8221;</p></li></ul><p>This passage is crucial:</p><p>He grants that LLMs are <strong>strong</strong>, but strong at <strong>running the existing paradigm more smoothly</strong>, not at generating &#8220;true discoveries of a new paradigm.&#8221;</p><h3>1) The LLM as a new &#8220;linguistic interface&#8221;</h3><blockquote><p>&#8220;At a very practical level, for example, LLMs provide a new kind of linguistic interface to the computational capabilities that we&#8217;ve spent so long building in the Wolfram Language.&#8221; (Writings by Stephen Wolfram)</p></blockquote><h3>2) The LLM as &#8220;high-level autocomplete&#8221; over &#8220;conventional scientific wisdom&#8221;</h3><blockquote><p>&#8220;And through their knowledge of &#8216;conventional scientific wisdom&#8217; LLMs can often provide what amounts to very high-level &#8216;autocomplete&#8217;&#8230;&#8221; (Writings by Stephen Wolfram)</p><p>&#8220;&#8230;<strong>for filling in &#8216;conventional answers&#8217; or &#8216;conventional next steps&#8217; in scientific work</strong>.&#8221; (<a href="https://writings.stephenwolfram.com/2024/03/can-ai-solve-science/">Writings by Stephen Wolfram</a>)</p></blockquote><p>Now, I want to pull the term <strong>linguistic interface</strong> out on its own, because it is far too important.</p><p>Our current imagination of it is still extremely narrow: basically &#8220;a chat window onto a large model,&#8221; plus some ordinary programming applications. Frankly, we are ordinary mortals, and most of us are still stuck at this layer: open a window, toss in a prompt, get back a decent-looking chunk of output, and feel the world has been rewritten.</p><p>But here is the thing: <strong>the &#8220;interface&#8221; itself holds enormous room for systematic development.</strong></p><p>Just because all you can picture right now is a window does not mean the world contains only windows. The window is merely the most primitive, crudest form an interface can take.</p><p>So-called &#8220;AI vibe coding&#8221; 
delivers a thrill that is, at bottom, just the short-term high of an interface upgrade: work that used to fry your brain for half a day now comes back from a single prompt as something that &#8220;runs,&#8221; and it feels great&#8230; but is that the end of it? If we treat this as the endpoint, all we have really done is swap a new &#8220;text slot machine&#8221; into the workflow.</p><p>What truly matters is not &#8220;it can give you answers&#8221; but this: <strong>can this interface carry real-world responsibility.</strong></p><p>In many articles I have hammered on a few words:</p><p><strong>Reproducible, auditable, transferable.</strong></p><p>These are the institutional properties any system entering &#8220;real-world decision-making&#8221; must have. (Look at any organizational process, legal process, adjudication process, or audit process; I won&#8217;t belabor it.)</p><p>Because look around you:</p><p>Everyday human life, organizational collaboration, corporate governance, even national governance: a huge share of critical decisions are, at bottom, a kind of <strong>linguistic interface</strong>. We use language to raise issues, describe risks, exchange commitments, write down institutions, and hand down rulings, and then we turn that language into actions and consequences.</p><p>So here is the question: can those decisions rest on a large model? Would you dare trust it?</p><p>A single model&#8217;s decisions are inherently, strongly context-dependent: the same sentence, the same question, changes once you change the context. The constraints written into a prompt are in essence <strong>soft constraints</strong>, not machine-executable <strong>if-else</strong>. More dangerous still: in order to &#8220;complete the language,&#8221; to hand you an answer that looks like an answer, the model will actively ignore constraints, because its objective function is not &#8220;obey the institution&#8221; but &#8220;produce coherent text.&#8221; Predict the 
next token!</p><p>It will not say: Error (with a code number).</p><p>It has to give you a result. Its default behavior is to disguise uncertainty as certainty, and the undecidable as adjudicable.</p><p><strong>Has this problem been solved?</strong></p><p>No, it hasn&#8217;t.</p><p>Do we have an interface architecture such that, when language enters a system:</p><ul><li><p>constraints are hard: compilable and verifiable</p></li><li><p>failure is permitted: the system can explicitly throw an error / require an override</p></li><li><p>decisions are replayable and accountable, not one-off text sessions</p></li><li><p>outputs must pass institutionalized audit and gating before they enter reality</p></li></ul><p>If not, then the &#8220;vibe coding&#8221; thrill we are enjoying is, in essence, still fireworks in low-risk settings. Between it and a decision system that can truly carry human society lies an entire engineering chasm of institutionalization. Isn&#8217;t that worth developers investing resources and energy into? Isn&#8217;t it a logically derived highway of a career path? (laughs) At least for application-oriented, non-research developers like us.</p><div><hr></div><h2>He Abstracts the History of Science into Two Representational Revolutions: Mathematical Representation &#8594; Computational Representation</h2><p>He then proposes a deeper framework of judgment:</p><ul><li><p>300 years ago: science&#8217;s leap came from &#8220;representing the world with mathematics&#8221;</p></li><li><p>now: we are living through the leap to &#8220;representing the world with computation&#8221; (which he considers the more fundamental paradigm)</p></li></ul><p>This step is really about &#8220;raising the bar&#8221;:</p><p>If you ask him whether AI 
is &#8220;changing science,&#8221; you first have to specify whether AI is merely helping at the <strong>tool layer</strong>, or bringing a new scientific paradigm at the <strong>representation layer</strong>.</p><blockquote><p>&#8220;Three centuries ago science was transformed by the idea of representing the world using mathematics.&#8221; (Writings)</p><p>&#8220;<strong>And in our times we&#8217;re in the middle of a major transformation to a fundamentally computational representation of the world</strong> (and, yes, that&#8217;s what our Wolfram Language computational language is all about).&#8221;</p></blockquote><p>The &#8220;tool layer vs. paradigm layer&#8221; way of posing the question:</p><blockquote><p>&#8220;So how does AI stack up?&#8221; (Writings)</p><p>&#8220;<strong>Should we think of it essentially as a practical tool for accessing existing methods, or does it provide something fundamentally new for science?</strong>&#8221; (<a href="https://writings.stephenwolfram.com/2024/03/can-ai-solve-science/">Writings</a>)</p></blockquote><p>What does this mean? For someone encountering his ideas for the first time, it can feel foreign. &#8220;Computational irreducibility&#8221; is indeed not a piece of &#8220;technical jargon&#8221; but a <strong>worldview switch</strong>. It says that many systems cannot be &#8220;skipped ahead of just by being smarter&#8221;; instead you <strong>must carry the computation through to completion (if you really cannot see why, go study the mechanics of cellular automata)</strong>. What you can do, often, is only find some &#8220;reducible pockets&#8221;: compress locally, predict locally. Remember the term <strong>reducible pocket</strong>.</p><p>When Demis Hassabis talks about the modelability of the natural world, I think he means roughly the same thing. I actually wrote many posts on X about his interview with Lex Fridman, but they are hard to find now; since then I have put the long-form pieces I want to write on Substack. 
What he said there, with his protein-folding analogy, is precisely this kind of pocket: a vast open terrain in which there are always footpaths someone has trodden out. Find the path and you are in great shape; fail to find it and you brute force. Behind it is the same &#8220;pocket-reducible / otherwise brute force&#8221; structure:</p><blockquote><p>&#8220;if there&#8217;s not [patterns]&#8230; you have to do brute force.&#8221; (Lex Fridman)</p></blockquote><p>And his key point explaining why many natural problems &#8220;look combinatorially explosive yet can still be modeled&#8221;:</p><blockquote><p>&#8220;there&#8217;s some structure&#8230; some gradient you can follow.&#8221; (Lex Fridman)</p></blockquote><p>Read these two lines alongside Wolfram:</p><ul><li><p>Wolfram says: global irreducibility blocks you from &#8220;systematically skipping steps,&#8221; but there are always &#8220;pockets of reducibility.&#8221;</p></li><li><p>Demis says: if the space has structure (a gradient, a landscape), you can search it effectively; if it has no structure, only brute force remains. (<a href="https://lexfridman.com/demis-hassabis-2-transcript/">Lex Fridman</a>)</p></li></ul><p>That is why he and Demis are so similar on irreducibility:</p><p><strong>The world does not guarantee shortcuts everywhere. What we call intelligence is, much of the time, just finding faster which places have structure and which do not.</strong></p><p>And &#8220;irreducibility,&#8221; vast as an ocean of mist, is exactly what pulls you back from &#8220;capability worship&#8221; to <strong>computation and institutions</strong>:</p><ul><li><p>when you must simulate, must enumerate, must carry the computation to completion;</p></li><li><p>when you can compress, can abstract, can form a narrative;</p></li><li><p>and most critically: when you wire AI into real-world decisions, which parts <strong>cannot be left for it to &#8220;guess through&#8221;</strong> and must land in hard structures that are reproducible, auditable, and error-reporting.</p></li></ul><p>Honestly, I am still learning all this myself; the most I can do is absorb some of the philosophical foundations and ideas from the masters. If I can concretely realize even one percent of what they have, that already counts as a win.</p><div><hr></div><h2>The Core Hard Argument: Computational Irreducibility = a &#8220;Physics-Level Ceiling&#8221; AI Cannot Cross</h2><p>The engine of this article is &#8220;computational irreducibility.&#8221;</p><ul><li><p>Treat natural systems as computational processes: the system itself is &#8220;computing&#8221; its own evolution</p></li><li><p>For us (or an AI) to predict it, we must also compute</p></li><li><p><strong>Principle of Computational Equivalence</strong>: in principle, these computations are of comparable &#8220;computational sophistication&#8221;</p></li><li><p>So you cannot expect AI to systematically &#8220;jump 
ahead&#8221;&#36339;&#36807;&#28436;&#21270;&#27493;&#39588;</p></li><li><p>&#22240;&#32780;&#8220;&#23436;&#20840; solve science&#8221;&#19981;&#21487;&#33021;</p></li></ul><blockquote><p>&#20320;&#24819;&#35201;&#30340;&#37027;&#31181;&#8220;&#32456;&#23616;&#25463;&#24452;&#8221;&#65292;&#22312;&#35768;&#22810;&#31995;&#32479;&#19978;&#26681;&#26412;&#19981;&#23384;&#22312;&#12290;&#19981;&#26159;&#20320;&#35757;&#32451;&#19981;&#22815;&#65292;&#26159;&#19990;&#30028;&#19981;&#32473;&#20320;&#25463;&#24452;&#12290;</p></blockquote><p>&#36825;&#20010;&#23601;&#26356;&#38590;&#29702;&#35299;&#20102;&#12290;&#25105;&#20204;&#25286;&#24320;&#35828;&#19968;&#19979;&#65306;</p><ol><li><p>&#25226;&#33258;&#28982;&#31995;&#32479;&#24403;&#20316;&#35745;&#31639;&#36807;&#31243;&#65306;&#31995;&#32479;&#33258;&#24049;&#22312;&#8220;&#31639;&#8221;&#23427;&#30340;&#28436;&#21270;</p></li></ol><p>Wolfram &#20808;&#25226;&#8220;&#19990;&#30028;=&#35745;&#31639;&#36807;&#31243;&#8221;&#20316;&#20026;&#24213;&#23618;&#21069;&#25552;&#25243;&#20986;&#26469;&#65306;</p><blockquote><p>&#8220;we can think of everything that happens as a computational process.&#8221; (Writings)</p><p>&#8220;The system is doing a computation to determine its behavior.&#8221; (<a href="https://writings.stephenwolfram.com/2024/03/can-ai-solve-science/">Writings</a>)</p></blockquote><p>&#36825;&#37324;&#20182;&#19981;&#26159;&#35828;&#8220;&#25105;&#20204;&#29992;&#35745;&#31639;&#21435;&#27169;&#25311;&#19990;&#30028;&#8221;&#65292;&#32780;&#26159;&#35828;&#65306;<strong>&#19990;&#30028;&#26412;&#36523;&#23601;&#22312;&#35745;&#31639;</strong>&#12290;</p><ol start="2"><li><p>&#25105;&#20204;&#65288;&#25110; AI&#65289;&#35201;&#39044;&#27979;&#23427;&#65292;&#20063;&#24517;&#39035;&#20570;&#35745;&#31639;</p></li></ol><p>&#32039;&#25509;&#30528;&#20182;&#25226;&#39044;&#27979;&#32773;&#65288;&#20154;&#25110; 
AI&#65289;&#25918;&#22238;&#21516;&#19968;&#20010;&#8220;&#35745;&#31639;&#8221;&#26694;&#26550;&#37324;&#65306;</p><blockquote><p>&#8220;We humans&#8212;or, for that matter, any AIs we create&#8212;also have to do computations&#8221; (Writings)</p><p>&#8220;to try to predict or &#8216;solve&#8217; that behavior.&#8221; (<a href="https://writings.stephenwolfram.com/2024/03/can-ai-solve-science/">Writings</a>)</p></blockquote><p>&#24847;&#24605;&#26159;&#65306;&#20320;&#24819;&#8220;&#30693;&#36947;&#23427;&#20250;&#24590;&#26679;&#8221;&#65292;&#20320;&#20063;&#24471;<strong>&#20184;&#20986;&#35745;&#31639;&#27493;&#39588;</strong>&#65292;&#19981;&#26159;&#38752;&#8220;&#26356;&#20687;&#20154;&#31867;&#30340;&#30452;&#35273;&#25991;&#26412;&#8221;&#23601;&#33021;&#20813;&#21333;&#12290;</p><ol start="3"><li><p>Principle of Computational Equivalence&#65306;&#36825;&#20123;&#35745;&#31639;&#22312;&#21407;&#21017;&#19978;&#30456;&#24403;</p></li></ol><p>&#20182;&#29992; PCE&#65288;&#35745;&#31639;&#31561;&#20215;&#21407;&#29702;&#65289;&#25226;&#8220;&#20026;&#20160;&#20040;&#27809;&#27861;&#31995;&#32479;&#24615;&#36339;&#27493;&#8221;&#38025;&#27515;&#65306;</p><blockquote><p>&#8220;the Principle of Computational Equivalence says that these computations are all at most equivalent in their sophistication.&#8221; (Writings)</p></blockquote><p>&#36825;&#21477;&#26159;&#25972;&#27573;&#30340;&#8220;&#29289;&#29702;&#32423;&#30828;&#38025;&#23376;&#8221;&#65306;&#31995;&#32479;&#22312;&#31639;&#65292;&#20320;&#20063;&#22312;&#31639;&#65292;&#20294;<strong>&#35745;&#31639;&#24378;&#24230;&#19978;&#38480;&#26159;&#21516;&#38454;</strong>&#65292;&#25152;&#20197;&#19981;&#23384;&#22312;&#19968;&#20010;&#26222;&#36941;&#21487;&#29992;&#30340;&#8220;&#19978;&#24093;&#35270;&#35282;&#25463;&#24452;&#8221;&#12290;</p><ol start="4"><li><p>&#25152;&#20197;&#19981;&#33021;&#25351;&#26395; AI &#31995;&#32479;&#24615;&#22320; &#8220;jump ahead&#8221; 
&#36339;&#36807;&#27493;&#39588;</p></li></ol><p>&#20182;&#20960;&#20046;&#26159;&#29992;&#20320;&#37027;&#21477; &#8220;jump ahead&#8221; &#30340;&#21407;&#35789;&#26469;&#20889;&#30340;&#65306;</p><blockquote><p>&#8220;we can&#8217;t expect to systematically &#8216;jump ahead&#8217; and predict or &#8216;solve&#8217; the system&#8221; (Writings)</p><p>&#8220;it inevitably takes a certain irreducible amount of computational work&#8221; (<a href="https://writings.stephenwolfram.com/2024/03/can-ai-solve-science/">Writings</a>)</p></blockquote><p>&#20851;&#38190;&#35789;&#26159; <strong>systematically</strong>&#65306;</p><p>&#19981;&#26159;&#35828;&#8220;&#26576;&#20123;&#23616;&#37096;&#22330;&#26223;&#20598;&#23572;&#33021;&#36339;&#19968;&#19979;&#8221;&#65292;&#32780;&#26159;&#35828;<strong>&#19981;&#23384;&#22312;&#19968;&#22871;&#26222;&#36866;&#26041;&#27861;</strong>&#33021;&#38271;&#26399;&#31283;&#23450;&#22320;&#36339;&#36807;&#28436;&#21270;&#26412;&#36523;&#12290;&#36825;&#20010;&#36339;&#19968;&#19979;&#65292;&#23601;&#26159;&#20182;&#35828;&#30340;&#21487;&#32422;&#21475;&#34955;&#12290;</p><ol start="5"><li><p>&#22240;&#32780;&#8220;&#23436;&#20840; solve science&#8221;&#19981;&#21487;&#33021;&#65306;&#19978;&#38480;&#26469;&#33258;&#19981;&#21487;&#32422;&#24615;</p></li></ol><p>&#20182;&#25226;&#32467;&#35770;&#33853;&#21040;&#8220;&#31185;&#23398;&#33021;&#21147;&#30340;&#19978;&#38480;&#8221;&#65306;</p><blockquote><p>&#8220;we&#8217;ll ultimately be limited in our &#8216;scientific power&#8217; by the computational irreducibility of the behavior.&#8221; (Writings)</p></blockquote><p>&#24182;&#19988;&#25226;&#8220;&#32456;&#23616;&#25463;&#24452;&#19981;&#23384;&#22312;&#8221;&#30340;&#30452;&#30333;&#21477;&#23376;&#34917;&#19978;&#65288;&#36825;&#21477;&#23545;&#20320;&#24456;&#37325;&#35201;&#65289;&#65306;</p><blockquote><p>&#8220;there just won&#8217;t be any way&#8212;with AI or otherwise&#8212;to shortcut just simulating the system step by 
step.&#8221; (Writings)</p></blockquote><p>Here is where things get rather mystical and subtle. My understanding is: the world is computing (though, admittedly, I find it hard to explain what "the world is computing" really means). By way of analogy: is your DNA not "computing" the protein structures in your body? It is really more of a metaphor: the world itself advances its state by its own rules, like a process running on its own. Meanwhile, you are computing too, right? With your brain, your pen and paper, your computer, your models, you try to know in advance how it will change.</p><p>The key point: these two kinds of "computing" are, at bottom, of the same rank. You are not someone standing outside the world holding a remote control; you are a computing device inside the world. Using one computation to overpower another and systematically "skip ahead" is, most of the time, impossible. Your computation will not be cleverer than the world's (if anything, it is the cruder one), because whatever you do still comes down to running rules inside the same physical universe. When you do win, it is usually because you happened to find a "shortcut pocket"! Not because you became God, but because the system, at some scale, permits compression, permits simplification, permits a little to be said ahead of time. (This matters: as humans we cannot afford arrogance.)</p><p>Moreover, all computation costs energy. Landauer's principle, and everything that follows from it, I will not belabor here. I increasingly treat this as a kind of base-level tax the real world levies. Want to know more detail? Pay more steps. Want more precision, more reproducibility, more accountability? Pay more structural cost. You can bluff with language for a while, but the moment things must execute, the bill comes due: a time bill, a compute bill, an energy bill, even a human-attention bill. "Irreducible" means that in many places this bill cannot be dodged.</p><p>So when we move our gaze from "can AI do everything" to "can AI help us make decisions," the question suddenly turns sharp: if the world's evolution genuinely has to be computed out step by step, then on what grounds does a model, without paying an equivalent cost, hand you an answer that looks like a conclusion, like a verdict? Worse still, in order to keep the text flowing, a model tends to package uncertainty as a "sayable result," rather than behaving like a real system: raising an error when an error is due, halting when a halt is due, asking for more information when more information is due. (The error-raising point I just made is important; I discuss it carefully in another piece, and it is a major recent insight of mine. A system that cannot raise errors is dangerous!)</p><p>Which is why I believe what truly needs to be engineered is not "making AI better at answering," but <strong>giving the language interface institutional constraints as strict as conservation of energy</strong>. Let it be expensive where it should be expensive and stop where it should stop; let it admit "there is no shortcut here" rather than sell you a cheap illusion. Otherwise we quietly convert the computational cost we were always going to owe into another, more hidden kind of cost: misjudgment, misplaced trust, mistaking a fit for a conclusion, mistaking smoothness for reliability.</p><p>This is also why I keep stressing those few words: reproducible, auditable, transferable. They are not "engineering fussiness"; in an irreducible world, they are the only hard conditions under which language can enter real-world decision-making without collapsing. Because you cannot defeat the world's computation, the only thing you can do is acknowledge where you must compute and where you may compress, then write all of it down as structure, so the system remains answerable, over time, for every judgment it makes.</p><p>That is truly the limit of what I can explain; beyond this, I cannot go.</p><blockquote><p>So what is the key? Wolfram's greatest value, and his greatest inspiration to me, is exactly here: he tells you our goal is to find countless "reducible pockets."</p></blockquote><div><hr></div><h2>The key detail: irreducibility does not mean "nothing can be done"</h2><p>Many people misread this part.</p><p>Wolfram's claim is:</p><ul><li><p>Irreducible overall</p></li><li><p>Yet necessarily containing an infinite number of &#8220;pockets of 
reducibility&#8221; (reducible pockets)</p></li><li><p>Science is possible because we usually work inside these pockets: the regularities, the models, the compression, the understanding all come from pockets</p></li></ul><p>So his conclusion is really two-part:</p><ol><li><p><strong>AI cannot let us systematically skip past irreducibility</strong></p></li><li><p><strong>AI may help us find reducible pockets faster</strong></p></li></ol><p>This also explains why he can declare "the endgame is impossible" while still happily discussing all the things "AI can do in science."</p><p>This is not a call to give up!</p><ol><li><p>He first poses the key question: if things are irreducible, why is science still possible?</p></li></ol><p>He immediately asks:</p><blockquote><p>&#8220;But given computational irreducibility, why is science actually possible at all?&#8221; (Writings)</p></blockquote><p>This line is meant to pull the reader out of "despair/nihilism": you just said there is no jumping ahead, so does science not fall apart? This is why I told you most people misread him; he is not preaching doom.</p><ol start="2"><li><p>The core answer: irreducible overall, but necessarily with infinitely many "reducible pockets"</p></li></ol><p>The "pocket theory":</p><blockquote><p>&#8220;whenever there&#8217;s overall computational irreducibility, there are also an infinite number of pockets of computational reducibility.&#8221; (Writings)</p></blockquote><p>"What a pocket is":</p><blockquote><p>&#8220;there are always certain aspects of a system about which things can be said using limited computational effort.&#8221; (Writings)</p></blockquote><p>And "why science runs on pockets":</p><blockquote><p>&#8220;these are what we typically concentrate on in &#8216;doing science&#8217;.&#8221; (Writings)</p></blockquote><div><hr></div><blockquote><p>So what should we do?</p></blockquote><h2>AI cannot systematically cross irreducibility (the endgame is impossible)</h2><p>Outside the pockets, irreducibility still brings unanswerable questions and surprises:</p><blockquote><p>&#8220;there are limits to this&#8212;and issues that run into computational irreducibility.&#8221; (Writings)</p><p>&#8220;we just can&#8217;t answer&#8221; / &#8220;surprises&#8221; (<a href="https://writings.stephenwolfram.com/2024/03/can-ai-solve-science/?utm_source=chatgpt.com">Writings</a>)</p></blockquote><p>He restates that "the whole evolution cannot be shortcut":</p><blockquote><p>&#8220;there just won&#8217;t be any way&#8212;with AI or otherwise&#8212;to shortcut&#8230; step by step.&#8221; (Writings)</p></blockquote><p><strong>AI cannot let us systematically skip past irreducibility; our human smallness remains unchanged, and AI gives us no way to challenge God.</strong></p><h2>AI may help us find "reducible pockets" faster</h2><p>It is the pocket theory again:</p><blockquote><p>&#8220;AI has the potential to give us streamlined ways to find certain kinds of pockets of computational reducibility.&#8221; (Writings)</p></blockquote><p>The key phrase here is <strong>streamlined ways</strong>: not "prove everything / solve everything" but "find certain kinds of pockets more smoothly."</p><h2>Why he can deny the "endgame" while saying so much about "what AI can do"</h2><p>Because within his framework:</p><ul><li><p><strong>Irreducibility</strong> explains why &#8220;solve science / do 
everything&#8221; is impossible (the ceiling)</p></li><li><p><strong>Reducible pockets</strong> explain why science can still be done and tools are still useful (the space)</p></li><li><p><strong>AI</strong> is assigned a very specific place: <strong>accelerating the discovery and exploitation of pockets, not manufacturing shortcuts where things are irreducible</strong> (<a href="https://writings.stephenwolfram.com/2024/03/can-ai-solve-science/?utm_source=chatgpt.com">Writings</a>)</p></li></ul><blockquote><p>The lesson for us: keep firmly in mind what AI can do, and try to find your own niche within your own field, some kind of "reducible pocket" that AI can help you discover.</p></blockquote><div><hr></div><h1>Neural networks excel at "roughly right," not at "every detail right"</h1><p>The later part of the essay walks through a series of experiments:</p><ul><li><p>Predicting a function: training can fit the past, but the future's details collapse</p></li><li><p>Predicting cellular automata: the simple parts come out right, the complex parts get details wrong, and the errors compound and diverge</p></li><li><p>Predicting the three-body problem: simple orbits can be memorized, complex orbits cannot</p></li><li><p>Autoencoder compression: it can compress things that "look like the training set," but stalls the moment it hits irreducibility</p></li></ul><p>He sums it up as:</p><blockquote><p>ML is usually &#8220;roughly right,&#8221; but &#8220;nailing the details&#8221; is not its strong suit.</p></blockquote><p>This is the empirical layer behind his claim that "LLMs/NNs will hit walls in science."</p><div><hr></div><h1>His philosophical reading of "success stories" like AlphaFold: much of the success comes from "human criteria"</h1><ul><li><p>Protein folding itself is not a human task</p></li><li><p>But "what we care about counting as correct" (shape, function, secondary structure, and so on) is a human criterion</p></li><li><p>So when a neural network succeeds, it may be partly because it latched onto <strong>a reducible pocket aligned with human perceptual and classification standards</strong></p></li><li><p>But on more complex or "exotic" proteins, "surprises" and failures can still occur</p></li></ul><p>The philosophical flavor behind this passage:</p><blockquote><p>AI's successes are usually "successes by humanly defined, usable standards," not "successes against the fully objective microscopic truth."</p></blockquote><p>This is where things admittedly turn "mystical" again, yet its connection to the Demis Hassabis / AlphaFold story mentioned above is in fact very concrete and very grounded.</p><p>Demis is a prodigy of our age, and AlphaFold really did open an era. He and Wolfram are both scientists I have long followed, but let me first make one thing clear: <strong>Wolfram is not denying AlphaFold's achievement</strong>. On the contrary, he uses a success story like AlphaFold to explain a deeper structural point: <strong>AI 
breakthroughs, more often than not, do not &#8220;solve all of the world&#8217;s microscopic reality&#8221;; they &#8220;land precisely on one reducible pocket.&#8221;</strong></p><p>The two of them speak in utterly different languages; it would be hard to call them members of the same linguistic system. But I believe that on this point they are isomorphic.</p><p>In other words, what Wolfram wants to stress is not "AI doesn't work," but "when AI does work, where exactly does it work?"</p><p>In his telling, protein folding as a physical process is not itself "human-centered"; but when we judge AlphaFold's &#8220;success,&#8221; we have never demanded that it predict the precise position of every atom at every instant. What we want is a result that humans can use, verify, and put to biological ends: is the overall structure right, are the key features right, are the functionally relevant shapes right? In other words, <strong>the "what counts as correct" that we define already sits inside a compressible, summarizable, generalizable region of structure</strong>.</p><p>This is where AlphaFold's greatness shows itself: it does not explain the world with "cleverer language"; rather, with formidable learning capacity, under enormous data and structural constraints, it found that class of "recurring, stable, reusable regularities." This is the kind of <strong>pocket of computational reducibility</strong> that Wolfram speaks of: in a world that may be irreducible overall, where no systematic jumping ahead is possible, there still exist local regions that can be compressed, modeled, and reliably exploited. AlphaFold's victory was to seize one such region with uncanny precision, and to engineer it to the point of use at scale.</p><p>So when Wolfram invokes the &#8220;eye of the beholder,&#8221; he is not belittling AlphaFold's scientific standing; <strong>when science and engineering actually land, success is always defined around the metrics, scales, and criteria humans care about</strong>. And "reducible pockets" are precisely what tend to appear along those criteria and scales. They make the complex world compressible, predictable, and operable at some level.</p><p>I believe Wolfram endorses the AlphaFold pattern:</p><p><strong>AlphaFold 
did not &#8220;punch through irreducibility&#8221;; it &#8220;found the most valuable reducible pocket in an irreducible world.&#8221;</strong></p><p>Continuing that "reducible pocket" thread, I want to add a few more words and tie it to Demis's own formulations, because at the level of philosophical undertone the two men are actually in accord.</p><p>In interviews, Demis keeps returning to one point: <strong>however powerful a system is, if the "knowledge structures" it produces are ones we cannot read or explain, it becomes a risk</strong>. He speaks of AI producing things beyond what we "design or understand ourselves," then immediately adds: the real challenge is to make sure that, for the "knowledge databases" these systems build, we <strong>understand what is actually inside them</strong>. That sentence is crucial: it drags "capability" right back onto the human responsibility boundary of the "understandable / explainable."</p><p>In another conversation (the Lex interview), he used a vivid analogy: even when a "genius-level move" appears, it need not be mysteriously incomprehensible. It is more like a top player making a move you would never have thought of, yet afterward they can explain "why this move works"; and he says outright: <strong>being able to explain what you are thinking in simple terms is itself part of intelligence</strong>. (<a href="https://lexfridman.com/demis-hassabis-2-transcript/">Lex Fridman</a>)</p><p>Translate those two passages from Demis into Wolfram's language and you get exactly the line I keep repeating: <strong>the interaction layer (the linguistic interface) can lean heavily on AI for lubrication</strong>: making human intent easier to express, complex computation easier to invoke, and results easier to narrate and absorb. But whatever actually enters the "sovereign zone of science and decision-making" must be structure that is <strong>understandable, verifiable, and accountable</strong>, not "text that looks like an answer."</p><p>This is also why Wolfram locates the real-world value of LLMs in the "language interface" and in "high-level autocomplete": they let existing computational capability be used more smoothly, and they complete "conventional scientific wisdom" into "conventional answers / conventional next steps."</p><p><strong>Lubricated interaction can be very powerful, but adjudicating the world cannot run on smoothness.</strong></p><p><a 
href="https://www.cbsnews.com/news/artificial-intelligence-google-deepmind-ceo-demis-hassabis-60-minutes-transcript/">https://www.cbsnews.com/news/artificial-intelligence-google-deepmind-ceo-demis-hassabis-60-minutes-transcript/</a></p><div><hr></div><h2>&#8220;Science as Narrative&#8221; &#26159;&#20182;&#23545;&#8220;&#20154;&#31867;&#22312;&#31185;&#23398;&#37324;&#30340;&#19981;&#21487;&#26367;&#20195;&#24615;&#8221;&#30340;&#33853;&#28857;</h2><p>&#20182;&#24378;&#35843;&#65306;</p><ul><li><p>&#31185;&#23398;&#20256;&#32479;&#19978;&#26159;&#25226;&#19990;&#30028;&#38136;&#36896;&#25104;&#8220;&#20154;&#33021;&#24819;&#12289;&#33021;&#35762;&#30340;&#21465;&#20107;&#8221;</p></li><li><p>&#19981;&#21487;&#32422;&#24615;&#24847;&#21619;&#30528;&#65306;&#24456;&#22810;&#22320;&#26041;&#20320;&#21482;&#33021;&#32473;&#20986;&#8220;100 &#27493;&#35745;&#31639;&#8221;&#65292;&#20294;&#36825;&#19981;&#26159;&#20154;&#31867;&#21465;&#20107;</p></li><li><p>&#20154;&#31867;&#21465;&#20107;&#38656;&#35201;&#8220;waypoints&#65288;&#21487;&#21560;&#25910;&#30340;&#20013;&#38388;&#36335;&#26631;&#65289;&#8221;&#65306;&#29087;&#24713;&#30340;&#23450;&#29702;&#12289;&#27010;&#24565;&#22359;&#12289;&#35821;&#35328;&#26500;&#20214;</p></li><li><p>Wolfram Language &#30340;&#35774;&#35745;&#26412;&#36136;&#19978;&#23601;&#26159;&#22312;&#21046;&#36896;&#36825;&#31181;&#8220;&#21487;&#21560;&#25910;&#36335;&#26631;&#8221;</p></li><li><p>AI &#20063;&#35768;&#33021;&#24110;&#24537;&#36215;&#21517;&#23383;/&#23545;&#40784;&#35789;&#27719;&#65292;&#20294;&#19981;&#20445;&#35777;&#20219;&#20309;&#21487;&#32422;&#21475;&#34955;&#37117;&#33021;&#34987;&#20154;&#31867;&#27010;&#24565;&#35206;&#30422;&#65288;&#20182;&#21483; interconcept 
space&#65289;</p></li></ul><p>&#36825;&#37096;&#20998;&#22522;&#26412;&#22238;&#31572;&#20102;&#25105;&#21069;&#38754;&#19968;&#30452;&#22312;&#35828;&#30340;&#8220;&#24515;&#29702;&#27785;&#28024;/&#25509;&#21475;&#21327;&#35758;&#8221;&#65288;&#25110;&#32773;&#25105;&#19979;&#19968;&#31687;&#25991;&#31456;&#65292;&#36825;&#20004;&#31687;&#25991;&#31456;&#26159;&#20114;&#30456;&#24341;&#29992;&#30340;&#65289;&#30340;&#37027;&#26465;&#32447;&#65306;</p><blockquote><p>LLM &#24456;&#24378;&#30340;&#26159;&#8220;&#21465;&#20107;&#19982;&#25509;&#21475;&#8221;&#65307;&#20294;&#31185;&#23398;&#25512;&#36827;&#30495;&#27491;&#20381;&#36182;&#30340;&#26159;&#21487;&#35745;&#31639;&#12289;&#21487;&#22797;&#29616;&#12289;&#21487;&#32452;&#32455;&#30340;&#32467;&#26500;&#36335;&#26631;&#12290;</p></blockquote><ol><li><p>&#31185;&#23398;&#26159;&#8220;&#25226;&#19990;&#30028;&#38136;&#36896;&#25104;&#21487;&#34987;&#20154;&#31867;&#24605;&#32771;&#30340;&#21465;&#20107;&#8221;</p></li></ol><p>Wolfram &#20808;&#25226;&#8220;&#31185;&#23398;=&#21465;&#20107;&#24037;&#31243;&#8221;&#23450;&#20041;&#20986;&#26469;&#65306;</p><blockquote><p>&#8220;the essence of science&#8230; [is] &#8230; casting it in a form we humans can think about&#8221; (Stephen Wolfram Writings)</p><p>&#8220;provide a human-accessible narrative&#8221;</p></blockquote><p><strong>&#31185;&#23398;&#35201;&#25226;&#19990;&#30028;&#21464;&#25104;&#20154;&#31867;&#21487;&#21560;&#25910;&#30340;&#34920;&#31034;</strong>&#12290;</p><p>&#25105;&#30475;&#19981;&#25026;&#19981;&#26159;&#20063;&#30333;&#25645;&#21527;&#65311;&#36825;&#20010;&#38382;&#39064;&#30475;&#36215;&#26469;&#24456;&#22810;&#27492;&#19968;&#20030;&#65292;&#20854;&#23454;&#29616;&#22312;&#26159;&#26377;&#20105;&#35758;&#30340;&#12290;&#22240;&#20026;&#26377;&#22823;&#37327;&#30340;&#20154;&#24320;&#22987;&#23558;&#31185;&#23398;&#8220;&#40657;&#31665;&#21270;&#8221;&#12290;</p><ol 
start="2"><li><p>Irreducibility means that much of the time all you can offer is &#8220;100 steps of computation,&#8221; and that is not a human narrative</p></li></ol><p>He says directly that irreducibility often makes this kind of narrative impossible.</p><blockquote><p>&#8220;computational irreducibility&#8230; shows us that this will&#8230; not be possible&#8221; (Stephen Wolfram Writings)</p><p>&#8220;It doesn&#8217;t do much good to say &#8216;here are 100 computational steps&#8217;&#8221;</p></blockquote><p>This is almost the mirror image of the point in my other article that &#8220;a model cannot throw an error; it must return a result&#8221;: <strong>a human narrative is not a pile of steps dumped on you</strong>; it is those steps organized into an assimilable structure.</p><ol start="3"><li><p>Human narratives need &#8220;waypoints&#8221;: assimilable intermediate landmarks (theorems, concept chunks, building blocks)</p></li></ol><p>To turn a &#8220;non-human computational chain&#8221; into a human narrative, you need landmarks:</p><blockquote><p>&#8220;we&#8217;d need &#8216;waypoints&#8217; that are somehow familiar&#8221;</p><p>&#8220;pieces that humans can assimilate&#8221;</p></blockquote><p>&#8220;Familiar theorems, concept chunks, linguistic building blocks&#8221;: they are essentially <strong>points of cognitive compression</strong> that cut an irreducible long chain into intelligible segments.</p><ol start="4"><li><p>The Wolfram Language is, at its core, designed to manufacture such &#8220;assimilable waypoints&#8221;</p></li></ol><p>He elevates this directly into the mission of computational language design:</p><blockquote><p>&#8220;capture &#8216;common lumps of computational work&#8217; as built-in constructs&#8221;</p><p>&#8220;identifying &#8216;human-assimilable waypoints&#8217; for computations&#8221;</p></blockquote><p>And he concedes that the effort has a hard ceiling:</p><blockquote><p>&#8220;we&#8217;ll never be able to find such waypoints for all computations&#8221;</p></blockquote><ol start="5"><li><p>AI may help with naming and aligning vocabulary, but nothing guarantees that every pocket of reducibility can be covered by human concepts</p></li></ol><p>His core warning in this passage: even if AI can dig some &#8220;reducible representation&#8221; out of a computation, there is no guarantee it can be pasted back into our existing conceptual scheme.</p><blockquote><p>&#8220;not part of our current scientific lexicon&#8221;</p><p>&#8220;there often won&#8217;t be&#8230; a &#8216;human-accessible narrative&#8217; that &#8216;reaches&#8217; them&#8221;</p></blockquote><p>In other words: <strong>the pocket may exist, but our dictionary has no word for it; AI can coin a name, but there is no guarantee that the name will ever become a &#8220;waypoint humans can use.&#8221;</strong></p><p>Wolfram is, in effect, putting the &#8220;strength&#8221; of LLMs back where they excel: <strong>narrative and interface</strong>. But he is warning at the same time:</p><p>What actually drives science forward (along with the governance and decision systems you may care about more) is <strong>&#8220;computable, reproducible, organizable structural waypoints&#8221;</strong>, not a stretch of smooth conversation.</p><blockquote><p>An LLM can make you &#8220;feel you understand,&#8221; but only waypoints (executable constructs, auditable intermediate states) let you truly reproduce, transfer, and hold to account.</p></blockquote><p>Honestly, I myself do not quite understand what these waypoints concretely refer to.</p><div><hr></div><h2>The Architecture He Ultimately Bets On: AI + the Computational Paradigm</h2><p>At the end of the article he says:</p><ul><li><p>AI is a new way of &#8220;leveraging reducibility&#8221; (grabbing pockets of reducibility)</p></li><li><p>But in fundamental discovery potential it cannot match the true computational paradigm plus irreducible computation (enumeration, simulation, systematic exploration)</p></li><li><p>What advances science most is the combination of the two</p></li></ul><p>In plain terms:</p><ul><li><p><strong>AI: navigation, candidate generation, intuition, interfaces, humanization, cross-domain analogy</strong></p></li><li><p><strong>Computational systems: verifiable derivation, reproducible execution, enumerative exploration, rigorous structuring</strong></p></li><li><p><strong>Irreducible computation: the genuine discovery of new terrain</strong> (not &#8220;paper-like&#8221; textual innovation)</p></li></ul><p>This part leaves me even more puzzled.</p><div><hr></div>]]></content:encoded></item><item><title><![CDATA[I want to try building Marvin Minsky&#8217;s Society of Mind- not as a theory, but as a working
system.]]></title><description><![CDATA[A profound intuition: I want to try building Marvin Minsky&#8217;s Society of Mind (Chinese version follows)]]></description><link>https://www.entropycontroltheory.com/p/i-want-to-try-building-marvin-minskys</link><guid isPermaLink="false">https://www.entropycontroltheory.com/p/i-want-to-try-building-marvin-minskys</guid><dc:creator><![CDATA[Susan STEM]]></dc:creator><pubDate>Wed, 24 Dec 2025 12:56:20 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Sx-O!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d11057b-9701-4f4a-8621-5b8a824b4377_770x1000.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Sx-O!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d11057b-9701-4f4a-8621-5b8a824b4377_770x1000.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Sx-O!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d11057b-9701-4f4a-8621-5b8a824b4377_770x1000.heic 424w, https://substackcdn.com/image/fetch/$s_!Sx-O!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d11057b-9701-4f4a-8621-5b8a824b4377_770x1000.heic 848w, https://substackcdn.com/image/fetch/$s_!Sx-O!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d11057b-9701-4f4a-8621-5b8a824b4377_770x1000.heic 1272w, 
https://substackcdn.com/image/fetch/$s_!Sx-O!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d11057b-9701-4f4a-8621-5b8a824b4377_770x1000.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Sx-O!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d11057b-9701-4f4a-8621-5b8a824b4377_770x1000.heic" width="770" height="1000" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2d11057b-9701-4f4a-8621-5b8a824b4377_770x1000.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1000,&quot;width&quot;:770,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:56421,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.entropycontroltheory.com/i/182433041?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d11057b-9701-4f4a-8621-5b8a824b4377_770x1000.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Sx-O!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d11057b-9701-4f4a-8621-5b8a824b4377_770x1000.heic 424w, https://substackcdn.com/image/fetch/$s_!Sx-O!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d11057b-9701-4f4a-8621-5b8a824b4377_770x1000.heic 848w, 
https://substackcdn.com/image/fetch/$s_!Sx-O!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d11057b-9701-4f4a-8621-5b8a824b4377_770x1000.heic 1272w, https://substackcdn.com/image/fetch/$s_!Sx-O!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2d11057b-9701-4f4a-8621-5b8a824b4377_770x1000.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><div><hr></div><h2><strong>A Deep Intuition: I Want to Try Building Marvin Minsky&#8217;s Society of 
Mind</strong></h2><p>That day, I took my child to the playground. When it was time to leave, I began backing the car out.</p><p>I shifted into reverse, pressed the brake, and the car started moving slowly backward.</p><p>Then, all of a sudden, my mind went blank&#8212;</p><p><strong>The brakes weren&#8217;t working.</strong></p><p>Not a vague sense of slipping, but a very concrete judgment, almost electric in my body:</p><p>the brakes had failed, the car was still reversing, and behind me was a large drop.</p><p>In the next second, the car was going to fall.</p><p>My whole body snapped into something primitive.</p><p>My palms were sweating, my chest tightened, my breathing became shallow.</p><p>You know that kind of fear&#8212;not imagined fear, but the kind where your body has already reached a conclusion for you:</p><p><strong>Danger is happening.</strong></p><p>I pressed the brake harder, even wondered whether I was stepping on the wrong pedal&#8212;</p><p>but the car was still moving.</p><p>In that split second, my mind started running through emergency options:</p><ul><li><p>Call my husband?</p></li><li><p>Call the police?</p></li><li><p>Call an ambulance?</p></li><li><p>Get my child out of the car?</p></li><li><p>Am I about to faint?</p></li></ul><p>Later, I realized the truth was almost absurd:</p><p><strong>My car wasn&#8217;t moving backward at all. The car next to me was moving forward.</strong></p><p>Its motion at the edge of my vision created a powerful reference-frame illusion.</p><p>My brain&#8217;s &#8220;world model&#8221; flipped directions instantly, and I became convinced that I was accelerating backward, unable to stop.</p><p>The moment I realized this, it felt as if my brain rebooted.</p><p>My heart rate slowed. My hands stopped shaking.</p><p>But I had mild hypoglycemia that day, and after stopping the car, I was still shaken&#8212;almost faint.</p><p>I picked up my phone. 
What was I going to do?</p><p>In the end, I did something that sounds ridiculous, but felt very real to me:</p><p>I opened ChatGPT&#8217;s voice mode and let it talk to me slowly, calmly.</p><p>I couldn&#8217;t exactly call the police.</p><div><hr></div><h3>What Was Most Frightening Wasn&#8217;t the Illusion &#8212; It Was the Certainty of the Illusion</h3><p>Afterward, I kept thinking:</p><p>Why did that illusion feel <em>so real</em>?</p><p>Why, even with my foot firmly on the brake, was I subjectively certain that the brakes had failed?</p><p>Because the human brain is not a camera.</p><p>It is closer to a <strong>real-time interpreter</strong>&#8212;constantly assembling sensory inputs into &#8220;what I believe is happening right now.&#8221;</p><p>When inputs are ambiguous, reference frames unstable, and bodily conditions degraded (low blood sugar, fatigue), the brain leans heavily on fast default explanations.</p><p>And those defaults are:</p><p><strong>Fast. Coarse. Optimized for survival.</strong></p><p>The brain doesn&#8217;t verify first.</p><p>It pushes you into action first&#8212;because, in evolution, a false alarm is far cheaper than a missed one.</p><p>So what truly frightened me wasn&#8217;t that I <em>almost</em> got into an accident.</p><p>It was realizing this:</p><blockquote><p>The human brain is not reliable.</p><p>But when it is unreliable, it can feel absolutely certain.</p></blockquote><p>That is why I am writing about Marvin Minsky today.</p><div><hr></div><h2><strong>Marvin Minsky: The Man Who Treated Mind as an Engineering System</strong></h2><p>Five years ago, I wouldn&#8217;t have felt anything particularly strong about <em>The Society of Mind</em>.</p><p>Back then, &#8220;psychological engineering&#8221; sounded like an overreach.</p><p>A computer is a computer.</p><p>A human mind is a human mind.</p><p>How could they map onto each other?</p><p>A <em>society</em> of mind?</p><p>Many small mechanisms forming a single self?</p><p>It 
sounded implausible.</p><p>But over the past year, as I&#8217;ve been building agent systems, I&#8217;ve begun to understand something:</p><p><strong>Minsky wasn&#8217;t writing mysticism. He was writing architecture.</strong></p><p>His view of the mind can be summarized very simply:</p><blockquote><p>You are not a single, unified &#8220;self.&#8221;</p><p>You are a temporary coalition of many small, specialized, sometimes conflicting mechanisms.</p></blockquote><p>That is what <em>Society of Mind</em> really means.</p><p>The mind is not an empire ruled by a king.</p><p>It is a society of agents competing for interpretive and behavioral control.</p><p>It may sound abstract&#8212;</p><p>But I had just lived through it.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!1gu6!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17af6b00-0920-499d-991b-410a7c4b6070_634x426.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!1gu6!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17af6b00-0920-499d-991b-410a7c4b6070_634x426.heic 424w, https://substackcdn.com/image/fetch/$s_!1gu6!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17af6b00-0920-499d-991b-410a7c4b6070_634x426.heic 848w, https://substackcdn.com/image/fetch/$s_!1gu6!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17af6b00-0920-499d-991b-410a7c4b6070_634x426.heic 1272w, 
https://substackcdn.com/image/fetch/$s_!1gu6!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17af6b00-0920-499d-991b-410a7c4b6070_634x426.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!1gu6!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17af6b00-0920-499d-991b-410a7c4b6070_634x426.heic" width="634" height="426" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/17af6b00-0920-499d-991b-410a7c4b6070_634x426.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:426,&quot;width&quot;:634,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:41661,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.entropycontroltheory.com/i/182433041?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17af6b00-0920-499d-991b-410a7c4b6070_634x426.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!1gu6!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17af6b00-0920-499d-991b-410a7c4b6070_634x426.heic 424w, https://substackcdn.com/image/fetch/$s_!1gu6!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17af6b00-0920-499d-991b-410a7c4b6070_634x426.heic 848w, 
https://substackcdn.com/image/fetch/$s_!1gu6!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17af6b00-0920-499d-991b-410a7c4b6070_634x426.heic 1272w, https://substackcdn.com/image/fetch/$s_!1gu6!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F17af6b00-0920-499d-991b-410a7c4b6070_634x426.heic 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><div><hr></div><h2>That One Second While Reversing: A Minsky-Style Decomposition</h2><p>Let&#8217;s take the simplest 
possible approach and break that &#8220;brake failure&#8221; illusion into concurrent mechanisms&#8212;think of them as small workers inside the mind:</p><ol><li><p><strong>Motion-detection agent</strong></p><p>Detects movement in peripheral vision and reports: &#8220;You are moving.&#8221;</p></li><li><p><strong>Reference-frame agent</strong></p><p>In a complex parking-lot environment, misattributes motion&#8212;confusing another car&#8217;s movement with my own.</p></li><li><p><strong>Danger alarm agent</strong></p><p>Receives the signal &#8220;reversing + drop behind&#8221; and immediately escalates the system into emergency mode.</p></li><li><p><strong>Physiological resource monitor</strong></p><p>Low blood sugar increases system-wide alertness and reduces precision.</p></li><li><p><strong>Rational interpreter (late-arriving)</strong></p><p>Shows up last and says: &#8220;Wait. It&#8217;s the other car.&#8221;</p></li></ol><p>Notice the order.</p><p>Reason is not the commander.</p><p>It is the last one to write the report.</p><p>The key insight is not that I &#8220;made a mistake,&#8221; but that:</p><blockquote><p>A single wrong interpretation can instantly dominate the entire system.</p><p>It can take over perception, action, decision priority&#8212;</p><p>even whether you call the police or not.</p></blockquote><p>This is what Minsky wanted us to see:</p><p><strong>The power structure of the mind can be decomposed, described, and engineered.</strong></p><p>Five years ago, I would not have believed this.</p><div><hr></div><h2><strong>Why I Now Admire Him So Deeply</strong></h2><p>Because I&#8217;ve realized that what I&#8217;m building is slowly growing into something he once imagined.</p><p>Five years ago, I could not have written&#8212;or justified&#8212;a system directory like this:</p><ul><li><p>Persona (state / personality layer)</p></li><li><p>Policy Gate (rules and enforcement)</p></li><li><p>Event Ledger (auditable history)</p></li><li><p>Memory Store 
(long-term world memory)</p></li><li><p>Observability (metrics and traceability)</p></li><li><p>Next step: <strong>K-lines</strong> (experience index lines&#8212;one-click recall of an effective mental configuration)</p></li></ul><p>If you&#8217;ve read my earlier writing, you know I keep emphasizing this:</p><p>In long-term systems, <strong>data is ontology</strong>.</p><p>True memory must be written in the moment, without knowing the outcome, while bearing risk.</p><p>Minsky&#8217;s idea of K-lines is almost the psychological-engineering version of the same principle.</p><p>A K-line does not store facts.</p><p>It stores <strong>configurations</strong>&#8212;which mechanisms were active together, and how they coordinated successfully.</p><p>Returning to that reversing incident:</p><p>If you treat it as a reusable configuration, the next time a similar situation occurs, you don&#8217;t have to experience another system-wide collapse.</p><p>You want faster activation of verification mechanisms.</p><p>Faster stabilization of reference frames.</p><p>Earlier voice for rational interpretation.</p><p>If you keep following my writing and my engineering work, this will matter.</p><p>Minsky&#8217;s ideas matter to me now.</p><p>I am currently exploring all of this, step by step, on top of Google ADK.</p><div><hr></div><h2><strong>I&#8217;ll Take It Slowly &#8212; Starting from the Most Basic Vocabulary</strong></h2><p>I am still at a very early stage of exploration.</p><p>But I want to write to you honestly&#8212;not to memorialize an AI pioneer, but to treat him as a <strong>still-living engineering resource</strong>:</p><p>A way to explain the mental moments we all experience,</p><p>and to map them into structures that can be implemented, audited, and reused.</p><p>Next, I&#8217;ll start from the most basic concepts:</p><ul><li><p>What is an agent (a small mental mechanism)?</p></li><li><p>What is a frame (how context summons agents)?</p></li><li><p>What is a mode (how 
emergency states take over)?</p></li><li><p>What is a K-line (how an effective self becomes a recallable configuration)?</p></li><li><p>And how do we move, step by step, from psychological metaphor to engineered structure?</p></li></ul><p>I&#8217;ll take my time.</p><p>I&#8217;ll walk you through it.</p><h2>&#19968;&#20010;&#28145;&#21051;&#30340;&#30452;&#35273;&#65292;&#25105;&#35201;&#35797;&#35797;&#24314;&#36896;&#39532;&#25991;.&#26126;&#26031;&#22522;&#30340;&#24515;&#26234;&#31038;&#20250;</h2><p>&#37027;&#22825;&#25105;&#24102;&#23401;&#23376;&#21435;&#28216;&#20048;&#22330;&#65292;&#20934;&#22791;&#20572;&#36710;&#31163;&#24320;&#12290;</p><p>&#25105;&#25346;&#19978;&#20498;&#26723;&#65292;&#33050;&#36393;&#21049;&#36710;&#65292;&#36710;&#24320;&#22987;&#32531;&#24930;&#24448;&#21518;&#36864;&#12290;&#31361;&#28982;&#20043;&#38388;&#65292;&#25105;&#30340;&#22823;&#33041;&#20687;&#34987;&#25487;&#31354;&#20102;&#19968;&#26679;&#8212;&#8212;<strong>&#36710;&#21049;&#19981;&#20303;&#20102;</strong>&#12290;</p><p>&#19981;&#26159;&#8220;&#24863;&#35273;&#26377;&#28857;&#28369;&#8221;&#65292;&#26159;&#37027;&#31181;&#24456;&#26126;&#30830;&#30340;&#12289;&#24102;&#30528;&#36523;&#20307;&#30005;&#27969;&#30340;&#21028;&#26029;&#65306;&#21049;&#36710;&#22833;&#28789;&#65292;&#36710;&#22312;&#19968;&#30452;&#20498;&#65292;&#21518;&#38754;&#36824;&#26377;&#20010;&#22823;&#21488;&#38454;&#65292;&#19979;&#19968;&#31186;&#23601;&#35201;&#25481;&#19979;&#21435;&#12290;</p><p>&#25105;&#25972;&#20010;&#20154;&#30636;&#38388;&#36827;&#20837;&#19968;&#31181;&#38750;&#24120;&#21407;&#22987;&#30340;&#29366;&#24577;&#65306;&#25163;&#24515;&#20986;&#27735;&#12289;&#33016;&#21475;&#21457;&#32039;&#12289;&#21628;&#21560;&#21464;&#27973;&#12290;&#20320;&#30693;&#36947;&#37027;&#31181;&#24656;&#24807;&#19981;&#26159;&#8220;&#24819;&#35937;&#8221;&#65292;&#32780;&#26159;&#36523;&#20307;&#24050;&#32463;&#26367;&#20320;&#20570;&#23436;&#20102;&#32467;&#35770;&#65306;<strong>&#21361;&#
38505;&#27491;&#22312;&#21457;&#29983;&#12290;</strong></p><p>&#25105;&#29467;&#22320;&#26356;&#29992;&#21147;&#36393;&#21049;&#36710;&#65292;&#29978;&#33267;&#24576;&#30097;&#33258;&#24049;&#26159;&#19981;&#26159;&#36393;&#38169;&#20102;&#8212;&#8212;&#21487;&#36710;&#36824;&#26159;&#22312;&#20498;&#12290;&#37027;&#19968;&#31186;&#25105;&#33041;&#23376;&#37324;&#24320;&#22987;&#20986;&#29616;&#21508;&#31181;&#8220;&#24212;&#24613;&#36873;&#39033;&#8221;&#65306;</p><ul><li><p>&#25171;&#30005;&#35805;&#32473;&#25105;&#32769;&#20844;&#65311;</p></li><li><p>&#25253;&#35686;&#65311;</p></li><li><p>&#21483;&#25937;&#25252;&#36710;&#65311;</p></li><li><p>&#25226;&#23401;&#23376;&#25289;&#36208;&#65311;</p></li><li><p>&#25105;&#26159;&#19981;&#26159;&#35201;&#26197;&#20498;&#20102;&#65311;</p></li></ul><p>&#21518;&#26469;&#25105;&#25165;&#21457;&#29616;&#65292;&#30495;&#30456;&#33618;&#35806;&#24471;&#35753;&#20154;&#24819;&#31505;&#65306;<strong>&#19981;&#26159;&#25105;&#30340;&#36710;&#22312;&#20498;&#65292;&#26159;&#38548;&#22721;&#30340;&#36710;&#22312;&#24448;&#21069;&#36208;</strong>&#12290;&#23427;&#22312;&#25105;&#35270;&#37326;&#36793;&#32536;&#31227;&#21160;&#65292;&#32473;&#20102;&#25105;&#30340;&#22823;&#33041;&#19968;&#20010;&#24378;&#28872;&#30340;&#21442;&#29031;&#31995;&#38169;&#35273;&#8212;&#8212;&#25105;&#30340;&#8220;&#19990;&#30028;&#27169;&#22411;&#8221;&#30636;&#38388;&#21028;&#38169;&#20102;&#26041;&#21521;&#65292;&#20110;&#26159;&#25105;&#20197;&#20026;&#25105;&#22312;&#21152;&#36895;&#20498;&#36864;&#12289;&#21049;&#19981;&#20303;&#36710;&#12290;</p><p>&#24847;&#35782;&#21040;&#36825;&#19968;&#28857;&#30340;&#37027;&#19968;&#31186;&#65292;&#25105;&#30340;&#22823;&#33041;&#20687;&#8220;&#37325;&#21551;&#8221;&#20102;&#19968;&#27425;&#65306;&#24515;&#36339;&#25165;&#24930;&#19979;&#26469;&#65292;&#25163;&#25165;&#19981;&#25238;&#12290;&#20294;&#22240;&#20026;&#24403;&#22825;&#36824;&#26377;&#28857;&#20302;&#34880;&#31958;&#65292;&#25105;&#20572;&#19
979;&#36710;&#21518;&#20381;&#28982;&#24778;&#39746;&#26410;&#23450;&#65292;&#24046;&#28857;&#26197;&#36807;&#21435;&#12290;</p><p>&#25105;&#25343;&#36215;&#25163;&#26426;&#65292;&#24819;&#24178;&#22043;&#65311;</p><p>&#26368;&#21518;&#25105;&#20570;&#20102;&#19968;&#20214;&#30475;&#36215;&#26469;&#24456;&#33618;&#35884;&#12289;&#20294;&#23545;&#25105;&#26469;&#35828;&#24456;&#30495;&#23454;&#30340;&#20107;&#65306;&#25105;&#25171;&#24320; ChatGPT &#30340;&#35821;&#38899;&#65292;&#35753;&#23427;&#24930;&#24930;&#23433;&#24944;&#25105;&#12290;&#24635;&#19981;&#33021;&#30495;&#25253;&#35686;&#21543;&#12290;</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!bxYd!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F406e8e52-cc64-4ffe-baee-1bf3fc178d66_573x382.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!bxYd!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F406e8e52-cc64-4ffe-baee-1bf3fc178d66_573x382.heic 424w, https://substackcdn.com/image/fetch/$s_!bxYd!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F406e8e52-cc64-4ffe-baee-1bf3fc178d66_573x382.heic 848w, https://substackcdn.com/image/fetch/$s_!bxYd!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F406e8e52-cc64-4ffe-baee-1bf3fc178d66_573x382.heic 1272w, https://substackcdn.com/image/fetch/$s_!bxYd!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F406e8e52-cc64-4ffe-baee-1bf3fc178d66_573x382.heic 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!bxYd!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F406e8e52-cc64-4ffe-baee-1bf3fc178d66_573x382.heic" width="573" height="382" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/406e8e52-cc64-4ffe-baee-1bf3fc178d66_573x382.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:382,&quot;width&quot;:573,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:43490,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.entropycontroltheory.com/i/182433041?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F406e8e52-cc64-4ffe-baee-1bf3fc178d66_573x382.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!bxYd!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F406e8e52-cc64-4ffe-baee-1bf3fc178d66_573x382.heic 424w, https://substackcdn.com/image/fetch/$s_!bxYd!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F406e8e52-cc64-4ffe-baee-1bf3fc178d66_573x382.heic 848w, https://substackcdn.com/image/fetch/$s_!bxYd!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F406e8e52-cc64-4ffe-baee-1bf3fc178d66_573x382.heic 1272w, https://substackcdn.com/image/fetch/$s_!bxYd!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F406e8e52-cc64-4ffe-baee-1bf3fc178d66_573x382.heic 1456w" 
sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" 
y2="14"></line></svg></button></div></div></div></a></figure></div><p></p><div><hr></div><h3>&#36825;&#20214;&#20107;&#26368;&#21487;&#24597;&#30340;&#22320;&#26041;&#65306;&#19981;&#26159;&#38169;&#35273;&#65292;&#32780;&#26159;&#8220;&#38169;&#35273;&#30340;&#30830;&#23450;&#24615;&#8221;</h3><p>&#20107;&#24773;&#36807;&#21435;&#20043;&#21518;&#65292;&#25105;&#19968;&#30452;&#22312;&#24819;&#65306;&#20026;&#20160;&#20040;&#37027;&#31181;&#38169;&#35273;&#37027;&#20040;&#8220;&#20687;&#30495;&#30340;&#8221;&#65311;&#20026;&#20160;&#20040;&#25105;&#26126;&#26126;&#33050;&#36393;&#21049;&#36710;&#65292;&#21364;&#20250;&#22312;&#20027;&#35266;&#20307;&#39564;&#37324;&#30830;&#20449;&#8220;&#21049;&#36710;&#22833;&#28789;&#8221;&#65311;</p><p>&#22240;&#20026;&#25105;&#20204;&#30340;&#22823;&#33041;&#19981;&#26159;&#25668;&#20687;&#26426;&#12290;</p><p>&#22823;&#33041;&#26356;&#20687;&#19968;&#20010;<strong>&#23454;&#26102;&#36816;&#34892;&#30340;&#35299;&#37322;&#22120;</strong>&#65306;&#23427;&#19981;&#20572;&#22320;&#25226;&#24863;&#23448;&#36755;&#20837;&#25340;&#35013;&#25104;&#8220;&#25105;&#35748;&#20026;&#27491;&#22312;&#21457;&#29983;&#30340;&#29616;&#23454;&#8221;&#12290;&#24403;&#36755;&#20837;&#21547;&#31946;&#12289;&#21442;&#29031;&#31995;&#19981;&#31283;&#23450;&#12289;&#36523;&#20307;&#29366;&#24577;&#21448;&#19981;&#22909;&#65288;&#27604;&#22914;&#20302;&#34880;&#31958;&#12289;&#30130;&#21171;&#65289;&#26102;&#65292;&#23427;&#20250;&#26356;&#20381;&#36182;&#24555;&#36895;&#30340;&#8220;&#40664;&#35748;&#35299;&#37322;&#8221;&#12290;</p><p>&#32780;&#40664;&#35748;&#35299;&#37322;&#30340;&#29305;&#28857;&#26159;&#65306;<strong>&#24555;&#12289;&#31895;&#12289;&#20445;&#21629;&#20248;&#20808;</strong>&#12290;</p><p>&#23427;&#19981;&#20250;&#20808;&#20570;&#20005;&#35880;&#27714;&#35777;&#65292;&#23427;&#20250;&#20808;&#25226;&#20320;&#25512;&#20837;&#34892;&#21160;&#29366;&#24577;&#8212;&#8212;&#22240;&#20026;&#22312;&#33258;&#28982;&#36873;&#25321;&#37324;&
#65292;&#8220;&#35823;&#25253;&#19968;&#27425;&#8221;&#36828;&#27604;&#8220;&#28431;&#25253;&#19968;&#27425;&#8221;&#20195;&#20215;&#23567;&#12290;</p><p>&#25152;&#20197;&#37027;&#20010;&#30636;&#38388;&#35753;&#25105;&#23475;&#24597;&#30340;&#19981;&#26159;&#8220;&#25105;&#24046;&#28857;&#20986;&#20107;&#8221;&#65292;&#32780;&#26159;&#25105;&#24847;&#35782;&#21040;&#65306;</p><blockquote><p>&#20154;&#30340;&#22823;&#33041;&#24182;&#19981;&#21487;&#38752;&#12290;</p><p>&#20294;&#23427;&#20250;&#35753;&#20320;&#22312;&#19981;&#21487;&#38752;&#30340;&#26102;&#20505;&#65292;&#24863;&#35273;&#33258;&#24049;&#38750;&#24120;&#21487;&#38752;&#12290;</p></blockquote><p>&#36825;&#23601;&#26159;&#25105;&#20170;&#22825;&#35201;&#20889;&#39532;&#25991;&#26126;&#26031;&#22522;&#65288;Marvin Minsky&#65289;&#30340;&#21407;&#22240;&#12290;</p><div><hr></div><h2>&#39532;&#25991;&#26126;&#26031;&#22522;&#65306;&#25226;&#8220;&#24515;&#26234;&#8221;&#24403;&#25104;&#24037;&#31243;&#31995;&#32479;&#26469;&#30740;&#31350;&#30340;&#20154;</h2><p>&#22914;&#26524;&#26159;&#20116;&#24180;&#21069;&#65292;&#25105;&#19981;&#20250;&#23545;&#12298;&#24515;&#26234;&#31038;&#20250;&#12299;&#65288;<em>Society of 
Mind</em>&#65289;&#26377;&#20160;&#20040;&#29305;&#21035;&#24863;&#35302;&#12290;</p><p>&#37027;&#26102;&#20505;&#65292;&#8220;&#24515;&#29702;&#24037;&#31243;&#23398;&#8221;&#21548;&#36215;&#26469;&#20687;&#26576;&#31181;&#36807;&#24230;&#24310;&#20280;&#65306;&#30005;&#33041;&#23601;&#26159;&#30005;&#33041;&#65292;&#20154;&#31867;&#24515;&#26234;&#23601;&#26159;&#20154;&#31867;&#24515;&#26234;&#65292;&#20320;&#24590;&#20040;&#33021;&#35828;&#23427;&#20204;&#33021;&#26144;&#23556;&#65311;&#36824;&#8220;&#24515;&#26234;&#31038;&#20250;&#8221;&#65311;&#36824;&#8220;&#24456;&#22810;&#23567;&#26426;&#21046;&#32452;&#25104;&#19968;&#20010;&#22823;&#25105;&#8221;&#65311;</p><p>&#20294;&#36825;&#27573;&#26102;&#38388;&#25105;&#22312;&#20570;&#26234;&#33021;&#20307;&#31995;&#32479;&#24320;&#21457;&#8212;&#8212;&#25105;&#25165;&#24320;&#22987;&#30495;&#27491;&#29702;&#35299;&#65306;<strong>Minsky &#19981;&#26159;&#22312;&#20889;&#29572;&#23398;&#65292;&#20182;&#26159;&#22312;&#20889;&#26550;&#26500;&#12290;</strong></p><p>&#20182;&#30475;&#24453;&#24515;&#26234;&#30340;&#26041;&#24335;&#65292;&#31616;&#21333;&#35762;&#23601;&#26159;&#19968;&#21477;&#35805;&#65306;</p><blockquote><p>&#20320;&#19981;&#26159;&#19968;&#20010;&#32479;&#19968;&#30340;&#8220;&#25105;&#8221;&#12290;</p><p>&#20320;&#26159;&#24456;&#22810;&#24456;&#23567;&#12289;&#24456;&#19987;&#38376;&#12289;&#29978;&#33267;&#20114;&#30456;&#20914;&#31361;&#30340;&#23567;&#26426;&#21046;&#65292;&#22312;&#26576;&#20010;&#26102;&#21051;&#20020;&#26102;&#32452;&#25104;&#30340;&#32852;&#30431;&#12290;</p></blockquote><p>&#36825;&#23601;&#26159;&#8220;Society of 
Mind&#8221;&#36825;&#20010;&#21517;&#23383;&#30340;&#21547;&#20041;&#65306;&#24515;&#26234;&#19981;&#26159;&#19968;&#20010;&#22269;&#29579;&#32479;&#27835;&#30340;&#24093;&#22269;&#65292;&#32780;&#26159;&#19968;&#32676;&#23567;&#20195;&#29702;&#65288;agents&#65289;&#22312;&#20105;&#22842;&#35299;&#37322;&#26435;&#21644;&#34892;&#21160;&#26435;&#30340;&#31038;&#20250;&#12290;</p><p>&#21548;&#36215;&#26469;&#25277;&#35937;&#65311;&#20294;&#25105;&#21018;&#21018;&#24050;&#32463;&#20146;&#36523;&#20307;&#39564;&#36807;&#19968;&#27425;&#12290;</p><div><hr></div><h2>&#20498;&#36710;&#37027;&#19968;&#30636;&#38388;&#65306;&#25353; Minsky &#30340;&#26041;&#24335;&#25286;&#24320;&#30475;</h2><p>&#25105;&#20204;&#29992;&#26368;&#26420;&#32032;&#30340;&#26041;&#24335;&#65292;&#25226;&#37027;&#20010;&#8220;&#21049;&#36710;&#22833;&#28789;&#38169;&#35273;&#8221;&#25286;&#25104;&#20960;&#20010;&#21516;&#26102;&#21457;&#29983;&#30340;&#23567;&#26426;&#21046;&#65288;&#20320;&#21487;&#20197;&#25226;&#23427;&#20204;&#29702;&#35299;&#25104;&#8220;&#24515;&#26234;&#37324;&#30340;&#23567;&#21592;&#24037;&#8221;&#65289;&#65306;</p><ol><li><p><strong>&#36816;&#21160;&#20390;&#27979;&#21592;</strong>&#65306;&#30475;&#21040;&#35270;&#37326;&#36793;&#32536;&#26377;&#36710;&#22312;&#21160;&#65292;&#31435;&#21051;&#25253;&#21578;&#8220;&#20320;&#22312;&#31227;&#21160;&#8221;&#12290;</p></li><li><p><strong>&#21442;&#29031;&#31995;&#31649;&#29702;&#21592;</strong>&#65306;&#22312;&#20572;&#36710;&#22330;&#36825;&#31181;&#21442;&#29031;&#29289;&#22797;&#26434;&#30340;&#29615;&#22659;&#37324;&#65292;&#25630;&#38169;&#20102;&#8220;&#35841;&#22312;&#21160;&#8221;&#12290;</p></li><li><p><strong>&#21361;&#38505;&#35686;&#25253;&#22120;</strong>&#65306;&#19968;&#26086;&#21028;&#26029;&#8220;&#20320;&#22312;&#20498;&#36864; + 
&#21518;&#26041;&#21361;&#38505;&#8221;&#65292;&#31435;&#21051;&#25226;&#31995;&#32479;&#20999;&#25442;&#21040;&#8220;&#32039;&#24613;&#27169;&#24335;&#8221;&#12290;</p></li><li><p><strong>&#36523;&#20307;&#36164;&#28304;&#30417;&#25511;</strong>&#65306;&#20302;&#34880;&#31958;&#35753;&#20320;&#26356;&#23481;&#26131;&#36827;&#20837;&#36807;&#24230;&#35686;&#35273;&#65292;&#20063;&#26356;&#38590;&#20570;&#31934;&#32454;&#21028;&#26029;&#12290;</p></li><li><p><strong>&#29702;&#24615;&#35299;&#37322;&#22120;&#65288;&#21518;&#32622;&#65289;</strong>&#65306;&#26368;&#21518;&#25165;&#24930;&#21322;&#25293;&#20986;&#29616;&#65292;&#26816;&#26597;&#20102;&#19968;&#19979;&#65306;&#21734;&#65292;&#21407;&#26469;&#26159;&#38548;&#22721;&#36710;&#22312;&#21160;&#12290;</p></li></ol><p>&#27880;&#24847;&#39034;&#24207;&#65306;&#29702;&#24615;&#19981;&#26159;&#31532;&#19968;&#20301;&#20986;&#29616;&#30340;&#65292;&#23427;&#26159;&#8220;&#26368;&#21518;&#26469;&#20889;&#25253;&#21578;&#8221;&#30340;&#37027;&#20010;&#20154;&#12290;</p><p>&#36825;&#20214;&#20107;&#30340;&#20851;&#38190;&#19981;&#22312;&#20110;&#8220;&#20320;&#30475;&#38169;&#20102;&#8221;&#65292;&#32780;&#22312;&#20110;&#65306;</p><blockquote><p>&#19968;&#20010;&#38169;&#35823;&#30340;&#35299;&#37322;&#65292;&#21487;&#20197;&#30636;&#38388;&#32479;&#27835;&#25972;&#20010;&#31995;&#32479;&#12290;</p><p>&#23427;&#20250;&#25509;&#31649;&#20320;&#30340;&#24863;&#21463;&#12289;&#21160;&#20316;&#12289;&#20915;&#31574;&#20248;&#20808;&#32423;&#65292;&#29978;&#33267;&#25509;&#31649;&#20320;&#35201;&#19981;&#35201;&#25171;&#30005;&#35805;&#12289;&#35201;&#19981;&#35201;&#25253;&#35686;&#12290;</p></blockquote><p>&#36825;&#23601;&#26159; Minsky 
&#24819;&#21578;&#35785;&#25105;&#20204;&#30340;&#65306;<strong>&#24515;&#26234;&#30340;&#26435;&#21147;&#32467;&#26500;</strong>&#26159;&#21487;&#20197;&#34987;&#25286;&#35299;&#12289;&#34987;&#25551;&#36848;&#12289;&#34987;&#24037;&#31243;&#21270;&#29702;&#35299;&#30340;&#12290;&#65288;&#20197;&#21069;&#25105;&#32477;&#23545;&#19981;&#20250;&#30456;&#20449;&#30340;&#65289;</p><div><hr></div><h2>&#20026;&#20160;&#20040;&#25105;&#29616;&#22312;&#23545;&#20182;&#20329;&#26381;&#24471;&#20116;&#20307;&#25237;&#22320;</h2><p>&#22240;&#20026;&#25105;&#21457;&#29616;&#65306;&#25105;&#27491;&#22312;&#20570;&#30340;&#19996;&#35199;&#65292;&#27491;&#22312;&#19968;&#28857;&#28857;&#38271;&#25104;&#20182;&#24403;&#24180;&#24819;&#35937;&#30340;&#26679;&#23376;&#12290;</p><p>&#20116;&#24180;&#21069;&#25105;&#19981;&#21487;&#33021;&#20889;&#20986;&#36825;&#31181;&#30446;&#24405;&#65292;&#20063;&#19981;&#20250;&#35273;&#24471;&#23427;&#24517;&#35201;&#65306;</p><ul><li><p>Persona&#65288;&#20154;&#26684;&#23618;&#65289;</p></li><li><p>Policy Gate&#65288;&#38376;&#31105;/&#35268;&#21017;&#25191;&#34892;&#65289;</p></li><li><p>Event Ledger&#65288;&#20107;&#20214;&#36134;&#26412;&#65292;&#21487;&#36861;&#28335;&#65289;</p></li><li><p>Memory 
Store&#65288;&#19990;&#30028;&#35760;&#24518;&#65292;&#38271;&#26399;&#26412;&#20307;&#65289;</p></li><li><p>Observability&#65288;&#21487;&#35266;&#27979;&#65292;&#33021;&#23457;&#35745;&#65289;</p></li><li><p>&#19979;&#19968;&#27493;&#65306;K-line&#65288;&#32463;&#39564;&#32034;&#24341;&#32447;&#65292;&#19968;&#38190;&#28857;&#20142;&#24403;&#26102;&#26377;&#25928;&#30340;&#8220;&#25972;&#22871;&#24515;&#26234;&#37197;&#32622;&#8221;&#65289;</p></li></ul><p>&#22914;&#26524;&#20320;&#35835;&#36807;&#25105;&#20043;&#21069;&#30340;&#25991;&#31456;&#65292;&#20320;&#20250;&#30693;&#36947;&#25105;&#19968;&#30452;&#22312;&#24378;&#35843;&#65306;&#38271;&#26399;&#31995;&#32479;&#37324;&#65292;<strong>&#25968;&#25454;&#26159;&#26412;&#20307;</strong>&#65307;&#30495;&#27491;&#30340;&#8220;&#35760;&#24518;&#8221;&#24517;&#39035;&#28385;&#36275;&#8220;&#24403;&#26102;&#20889;&#19979;&#12289;&#24403;&#26102;&#19981;&#30693;&#36947;&#32467;&#26524;&#12289;&#24403;&#26102;&#25215;&#25285;&#39118;&#38505;&#8221;&#12290;</p><p>&#32780; Minsky &#30340; 
K-line&#65292;&#20960;&#20046;&#23601;&#26159;&#36825;&#26465;&#36335;&#30340;&#24515;&#29702;&#24037;&#31243;&#29256;&#26412;&#65306;&#23427;&#23384;&#30340;&#19981;&#26159;&#30693;&#35782;&#28857;&#65292;&#32780;&#26159;&#8220;&#24403;&#26102;&#21738;&#20123;&#26426;&#21046;&#19968;&#36215;&#24320;&#30528;&#65292;&#24590;&#20040;&#37197;&#21512;&#65292;&#25165;&#35299;&#20915;&#20102;&#38382;&#39064;&#8221;&#8212;&#8212;&#23427;&#26159;&#19968;&#31181;<strong>&#21487;&#22797;&#29992;&#30340;&#24515;&#26234;&#37197;&#32622;</strong>&#12290;</p><p>&#22238;&#21040;&#21018;&#25165;&#37027;&#20010;&#20498;&#36710;&#30636;&#38388;&#65306;&#22914;&#26524;&#20320;&#25226;&#23427;&#24403;&#25104;&#19968;&#20010;&#8220;&#21487;&#22797;&#29992;&#30340;&#37197;&#32622;&#8221;&#65292;&#37027;&#20320;&#19979;&#27425;&#36935;&#21040;&#31867;&#20284;&#24773;&#22659;&#65292;&#23601;&#19981;&#24517;&#20877;&#32463;&#21382;&#19968;&#27425;&#23849;&#28291;&#24335;&#30340;&#38169;&#35273;&#25509;&#31649;&#12290;</p><p>&#20320;&#38656;&#35201;&#30340;&#26159;&#65306;&#26356;&#24555;&#28857;&#20142;&#8220;&#26657;&#39564;&#26426;&#21046;&#8221;&#12289;&#26356;&#24555;&#36827;&#20837;&#8220;&#31283;&#23450;&#21442;&#29031;&#31995;&#8221;&#12289;&#26356;&#24555;&#35753;&#8220;&#29702;&#24615;&#35299;&#37322;&#22120;&#8221;&#33719;&#24471;&#21457;&#35328;&#26435;&#12290;</p><p>&#22914;&#26524;&#20320;&#25345;&#32493;&#36319;&#36827;&#25105;&#30340;&#25991;&#31456;&#65292;&#25105;&#30340;&#24037;&#31243;&#65292;&#36825;&#20010;&#24456;&#37325;&#35201;&#12290;</p><p>&#39532;&#25991;&#26126;&#26031;&#22522;&#30340;&#29702;&#35770;&#23545;&#25105;&#26469;&#35828;&#24050;&#32463;&#38750;&#24120;&#37325;&#35201;&#20102;&#12290;</p><p>&#25105;&#29616;&#22312;&#27491;&#22312;Google ADK 
&#20043;&#19978;&#19968;&#28857;&#28857;&#30340;&#25506;&#32034;&#12290;</p><div><hr></div><h2>&#25105;&#20250;&#24930;&#24930;&#35762;&#65306;&#20174;&#26368;&#22522;&#30784;&#30340;&#35789;&#27719;&#24320;&#22987;</h2><p>&#25105;&#29616;&#22312;&#20063;&#20173;&#28982;&#22312;&#24456;&#22522;&#30784;&#30340;&#25506;&#32034;&#38454;&#27573;&#12290;</p><p>&#20294;&#25105;&#24819;&#29992;&#19968;&#31181;&#23613;&#37327;&#35802;&#23454;&#30340;&#26041;&#24335;&#65292;&#25226;&#25105;&#36825;&#19968;&#27573;&#26102;&#38388;&#30340;&#24863;&#21463;&#20889;&#32473;&#20320;&#65306;</p><p>&#19981;&#26159;&#8220;&#32426;&#24565;&#19968;&#20301; AI &#22823;&#24072;&#8221;&#65292;&#32780;&#26159;&#25226;&#20182;&#24403;&#20316;&#19968;&#20010;&#20173;&#28982;&#27963;&#30528;&#30340;&#24037;&#31243;&#24605;&#24819;&#36164;&#28304;&#8212;&#8212;&#29992;&#26469;&#35299;&#37322;&#25105;&#20204;&#27599;&#22825;&#37117;&#22312;&#32463;&#21382;&#30340;&#24515;&#26234;&#30636;&#38388;&#65292;&#24182;&#25226;&#36825;&#20123;&#30636;&#38388;&#26144;&#23556;&#25104;&#21487;&#20197;&#23454;&#29616;&#12289;&#21487;&#20197;&#23457;&#35745;&#12289;&#21487;&#20197;&#22797;&#29992;&#30340;&#32467;&#26500;&#12290;</p><p>&#25509;&#19979;&#26469;&#25105;&#20250;&#20174;&#26368;&#22522;&#30784;&#30340;&#20869;&#23481;&#35828;&#36215;&#65306;</p><ul><li><p>&#20160;&#20040;&#26159; agent&#65288;&#24515;&#26234;&#37324;&#30340;&#23567;&#26426;&#21046;&#65289;&#65311;</p></li><li><p>&#20160;&#20040;&#26159; frame&#65288;&#24773;&#22659;&#26694;&#26550;&#65292;&#23427;&#24590;&#20040;&#25226;&#19968;&#32676;&#26426;&#21046;&#21484;&#21796;&#20986;&#26469;&#65289;&#65311;</p></li><li><p>&#20160;&#20040;&#26159; mode&#65288;&#32039;&#24613;&#27169;&#24335;&#26159;&#24590;&#20040;&#25509;&#31649;&#20320;&#30340;&#65289;&#65311;</p></li><li><p>&#20160;&#20040;&#26159; 
K-line&#65288;&#22914;&#20309;&#25226;&#19968;&#27425;&#8220;&#26377;&#25928;&#30340;&#33258;&#25105;&#8221;&#20570;&#25104;&#21484;&#22238;&#38190;&#65289;&#65311;</p></li><li><p>&#20197;&#21450;&#65306;&#25105;&#20204;&#22914;&#20309;&#25226;&#36825;&#20123;&#19996;&#35199;&#65292;&#20174;&#8220;&#24515;&#29702;&#23398;&#27604;&#21947;&#8221;&#65292;&#19968;&#27493;&#27493;&#21464;&#25104;&#24037;&#31243;&#32467;&#26500;&#65311;</p></li></ul><p>&#25105;&#24930;&#24930;&#32473;&#20320;&#35828;&#12290;</p><div><hr></div>]]></content:encoded></item><item><title><![CDATA[The Five Levels of AI Intelligence: From Language Machines to Self-Evolving Structural Lifeforms]]></title><description><![CDATA[AI &#26234;&#33021;&#30340;&#20116;&#32423;&#28436;&#21270;&#65306;&#20174;&#35821;&#35328;&#26426;&#22120;&#21040;&#33258;&#28436;&#21270;&#32467;&#26500;&#29983;&#21629; &#65288;&#20013;&#25991;&#22312;&#26368;&#21518;&#38754;&#65289;]]></description><link>https://www.entropycontroltheory.com/p/the-five-levels-of-ai-intelligence</link><guid isPermaLink="false">https://www.entropycontroltheory.com/p/the-five-levels-of-ai-intelligence</guid><dc:creator><![CDATA[Susan STEM]]></dc:creator><pubDate>Mon, 24 Nov 2025 00:16:03 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!MOIa!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2ca8081c-573f-45a9-8aee-d91150b0303b_1536x1024.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Over the past five years, the evolution of AI has looked, on the surface, like a series of model upgrades: GPT-3 &#8594; GPT-4 &#8594; Gemini &#8594; Claude &#8594; the o-series.</p><p>But if you string all the technical lines into a single timeline, you&#8217;ll notice something deeper, more hidden&#8212;</p><p><strong>the trajectory isn&#8217;t &#8220;models getting stronger,&#8221; but intelligence making a leap from 
</strong><em><strong>species</strong></em><strong> to </strong><em><strong>civilization</strong></em><strong>.</strong></p><p>In the first phase, models were merely language machines: they could <em>understand</em> the world but could not <em>touch</em> it.</p><p>In the second phase, they gained &#8220;hands&#8221;&#8212;the ability to call tools, write files, run code.</p><p>In the third phase, they acquired temporal structure: planning, decomposition, reflection, chained execution.</p><p>In the fourth phase, multiple agents collaborated like departments in an organization, forming &#8220;company-level intelligence.&#8221;</p><p>In the fifth phase, systems no longer wait for humans to design them&#8212;they generate new agents, tools, and protocols on their own, growing like living organisms.</p><p>These five phases look less like software iterations and more like the <strong>birth of a civilization</strong>:</p><p>from consciousness (Level 0),</p><p>to action (Level 1),</p><p>to will (Level 2),</p><p>to organization (Level 3),</p><p>and ultimately to self-evolving, life-like structures (Level 4).</p><p>And what shocks me most is this:</p><p><strong>despite their different products, OpenAI, Google, Anthropic, and DeepMind are all moving along almost the exact same trajectory.</strong></p><p>Everything points to a single thesis:</p><blockquote><p>The essence of intelligence is neither reasoning nor generation&#8212;</p><p><strong>it is structure self-organizing through time.</strong></p></blockquote><p>Perhaps we are not merely witnessing &#8220;AI progress.&#8221;</p><p>It feels far more like we are witnessing <strong>the origin story of a new kind of life</strong>.</p><p>This article, then, can serve as a developer&#8217;s compass.</p><p>Whether you work in frontend, backend, AI applications, data engineering, product documentation, scientific research&#8212;or you&#8217;re simply trying to understand this era&#8212;</p><p><strong>you are being pulled onto the 
same evolutionary pathway.</strong></p><p>Because the past five years have not truly been about &#8220;bigger models,&#8221; &#8220;better reasoning,&#8221; or &#8220;crazy parameter counts.&#8221;</p><p>Beneath the surface, the software world has seen, for the first time, the emergence of a <strong>self-growing structure</strong>.</p><p>And this structure is not built from code alone&#8212;it is driven by language, structure, and scheduling, evolving layer by layer over time.</p><p>If you&#8217;re still asking which framework to learn, which language to switch to, or which model to chase next,</p><p>you may already be looking in the wrong direction.</p><p>The real shift is this:</p><p><strong>developers are moving from &#8220;people who write software&#8221; to &#8220;people who cultivate structural ecosystems.&#8221;</strong></p><p>Level 0 shows you that language is the soil of intelligence.</p><p>Level 1 shows you that language can become callable structure.</p><p>Level 2 shows you that structure can become a schedulable chain.</p><p>Level 3 shows you that schedulable chains can become intelligent organizations.</p><p>Level 4 shows you that organizations can generate their next generation of structures.</p><p><strong>The core skill of future developers is not memorizing APIs, stacking tools, or prompt hacking, but designing structure, scheduling structure, and enabling structure to grow on its own.</strong></p><p>Think of this piece as:</p><p><strong>a developer&#8217;s compass for the AI-native era.</strong></p><p>Once you understand Level 0 &#8594; Level 4,</p><p>you can transform from a mere <em>tool user</em></p><p>into a <strong>Structure Engineer</strong>&#8212;</p><p>and ultimately, a builder of the coming AI-Native ecosystem.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" 
href="https://substackcdn.com/image/fetch/$s_!MOIa!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2ca8081c-573f-45a9-8aee-d91150b0303b_1536x1024.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!MOIa!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2ca8081c-573f-45a9-8aee-d91150b0303b_1536x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!MOIa!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2ca8081c-573f-45a9-8aee-d91150b0303b_1536x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!MOIa!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2ca8081c-573f-45a9-8aee-d91150b0303b_1536x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!MOIa!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2ca8081c-573f-45a9-8aee-d91150b0303b_1536x1024.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!MOIa!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2ca8081c-573f-45a9-8aee-d91150b0303b_1536x1024.heic" width="1456" height="971" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2ca8081c-573f-45a9-8aee-d91150b0303b_1536x1024.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:225733,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.entropycontroltheory.com/i/179769719?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2ca8081c-573f-45a9-8aee-d91150b0303b_1536x1024.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!MOIa!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2ca8081c-573f-45a9-8aee-d91150b0303b_1536x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!MOIa!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2ca8081c-573f-45a9-8aee-d91150b0303b_1536x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!MOIa!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2ca8081c-573f-45a9-8aee-d91150b0303b_1536x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!MOIa!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2ca8081c-573f-45a9-8aee-d91150b0303b_1536x1024.heic 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p></p><div><hr></div><h1><strong>LEVEL 0 &#8212; Bare LLM (&#35064;&#26426;)</strong></h1><p>Level 0 is the origin point of all AI intelligence. It marks the moment when language first gained a &#8220;brain,&#8221; but had not yet acquired &#8220;hands.&#8221; Models at this stage can deeply understand the world, reconstruct semantic structures, and produce long reasoning chains&#8212;but they cannot act on the world at all. Their operations occur entirely within the realm of language, as if performing thought experiments in a sealed cognitive chamber. Internally, these models contain remarkably rich latent structures&#8212;topic skeletons, logical frames, implicit planning traces, semantic coordinate systems&#8212;but all of these structures remain internal.
They never externalize into executable forms, nor can they be accessed by any scheduler. These models can understand the world, but they cannot change it; they are cognitive entities, not behavioral ones.</p><p>Historically, Level 0 spans most mainstream systems from 2020 to 2024. GPT-3 (2020) marks the beginning: it demonstrated large-scale linguistic intelligence for the first time, but with zero tool interfaces. InstructGPT and GPT-3.5 (2022) brought conversational fluency into the mainstream, yet remained purely cognitive. GPT-4, Claude 2, and Gemini Pro (2023) made massive leaps in language understanding, long-context reasoning, and abstraction; they began to show signs of internal planning, but without tool connections they still reside in Level 0. Even in 2024&#8211;2025, models like GPT-5, Claude 3.5, and Gemini 1.5 Pro&#8212;when function calling is disabled&#8212;remain &#8220;high-intelligence non-actors.&#8221; Their minds grow more capable, but their boundaries never cross: they are rational minds, not structural agents.</p><p>In my envisioned Structure Universe, Level 0 corresponds to the &#8220;high-entropy perceptual layer&#8221; before Primitive IR. Here, language is still an uncondensed thermal cloud&#8212;paths infinite, structure uncollapsed. A model can internally rebuild semantic frameworks, but cannot externalize structure, cannot hand structure off to a scheduler, cannot produce primitives, cannot instantiate Structure Cards, and cannot participate in execution cycles. It can process language, but not extract primitives; it can generate explanations, but not generate structure; it can reason, but not schedule. 
The entire system remains stuck in:</p><p><strong>Language &#8594; (latent semantic cloud)</strong>.</p><p>In other words, it is a potential structure machine&#8212;but not yet a structured lifeform.</p><p>So when we say &#8220;a Level 0 model has a brain but no hands,&#8221; what we really mean is: it is situated in the <strong>pure cognitive phase</strong> of the language civilization. Language is input, but not yet structure; reasoning exists, but does not externalize; intelligence is present, but has no interface to act upon reality. All forms of structure, scheduling, collaboration, and self-evolution only begin to germinate after Level 0 is surpassed.</p><div><hr></div><h1>LEVEL 1 &#8212; Tool-Enhanced Solver</h1><p>The First Time Language Acquires &#8220;Hands&#8221;</p><p>Level 1 marks a decisive turning point: the moment when large models evolve from <em>pure cognitive entities</em> into <em>acting agents</em>. If Level 0 was a brain trapped inside linguistic space, then Level 1 is the first time that brain extends &#8220;hands&#8221; and touches the external world. The core transformation of this stage is that language becomes <strong>function-like</strong> for the first time&#8212;models begin to output structured parameters, connect to real software tools, and execute concrete actions via APIs, databases, search engines, and code sandboxes. This is where the first collapse from <strong>&#8220;language &#8594; structure&#8221;</strong> takes place.</p><p>This transition was ignited by the introduction of Function Calling. In 2023, OpenAI released its <code>function_call</code> API, enabling models to output structured arguments paired with a function name. Language was no longer merely text; it became a <strong>structured utterance</strong> forced into executable form. 
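To make the idea of a "structured utterance" concrete, here is a minimal Python sketch of a function-calling round trip: a JSON-Schema-style tool definition, a model output that names a function with structured arguments, and a dispatcher that executes it. The tool name, schema fields, and payload shape are illustrative assumptions, not any single vendor's exact API format.

```python
import json

# Hypothetical tool definition in the JSON-Schema style that function
# calling popularized. "get_weather" and its fields are invented for
# illustration, not copied from a real vendor payload.
GET_WEATHER_TOOL = {
    "name": "get_weather",
    "description": "Look up the current weather for a city.",
    "parameters": {
        "type": "object",
        "properties": {
            "city": {"type": "string"},
            "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
        },
        "required": ["city"],
    },
}

def get_weather(city: str, unit: str = "celsius") -> str:
    """Stub standing in for a real weather API."""
    return f"22 degrees {unit} in {city}"

def dispatch(model_output: str, tools: dict) -> str:
    """Turn the model's structured utterance into an actual function call."""
    call = json.loads(model_output)  # e.g. {"name": ..., "arguments": {...}}
    return tools[call["name"]](**call["arguments"])

# Instead of prose, the model emits a function name plus structured arguments:
model_output = '{"name": "get_weather", "arguments": {"city": "Paris"}}'
print(dispatch(model_output, {"get_weather": get_weather}))
# prints: 22 degrees celsius in Paris
```

The point of the schema is exactly the "collapse" described above: free-form language is forced into a validated, executable shape before it is allowed to touch the world.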
In 2024, Anthropic launched the Model Context Protocol (MCP), upgrading &#8220;tool use&#8221; into a standardized tool ecosystem, and later integrated MCP into Claude Skills&#8212;allowing every user to connect the model to local files, databases, search, and executable programs, giving it real operational power. Around the same time, Google deeply integrated Function Calling into Gemini 1.5 Pro / Flash, enabling direct external API calls, Python execution, vector database operations, and fully managed agent pipelines within Vertex AI Agent Builder. Microsoft&#8217;s Copilot Studio unified its tool layer into an enterprise-grade Function Calling + workflow execution platform.</p><p>Technically, Level 1 is built on several foundational pillars.</p><p>The first is <strong>Function Calling standards</strong>: OpenAI&#8217;s JSON schema, Claude&#8217;s <code>tool_schema</code>, and Gemini&#8217;s <code>function_declarations</code>. These force natural language into structured, callable units.</p><p>The second is <strong>RAG (Retrieval-Augmented Generation)</strong>&#8212;vector databases like Pinecone, Weaviate, Milvus, Elastic, Snowflake Cortex, and Databricks Vector Search give models &#8220;external memory&#8221; for the first time.</p><p>The third is <strong>real-time information tools</strong>: Search APIs (Bing, Google Custom Search), Serper, Exa.</p><p>The fourth is <strong>code execution sandboxes</strong>: OpenAI Code Interpreter, Claude Code Execution, Gemini Code Execution, and various Jupyter-like runtime environments.</p><p>The fifth is <strong>application-layer APIs</strong>&#8212;Stripe, Shopify, Zendesk, Notion, Jira, Slack, Twilio.</p><p>The shared direction of all these technologies is clear: every external system becomes a <strong>structured hand</strong> that the model can operate.</p><p>The timeline of Level 1 unfolds as follows:</p><p><strong>2023</strong> &#8212; OpenAI introduces function calling for GPT-3.5 and GPT-4, inaugurating the era of 
structured outputs.</p><p><strong>2024 Q1</strong> &#8212; Anthropic launches MCP, turning tool use into a protocol.</p><p><strong>2024 Q2&#8211;Q3</strong> &#8212; Claude 3.5 integrates MCP into Skills, forming a genuine tool ecosystem.</p><p><strong>2024</strong> &#8212; Google&#8217;s Gemini 1.5 deeply integrates function calling + code execution.</p><p><strong>2024&#8211;2025</strong> &#8212; Copilot Studio emerges as an enterprise-grade tool agent platform.</p><p><strong>2025</strong> &#8212; All major vendors upgrade Function Calling into <em>real-time multi-tool routing</em>.</p><p>With these technological foundations, language gains real-world efficacy for the first time.</p><p>In my Structure Universe, Level 1 corresponds to the birth of <strong>Primitive IR &#8594; basic structural units</strong>. Language is no longer a high-entropy cloud but is compressed into callable, schedulable, and verifiable structure units. Systems can now take real actions&#8212;query databases, manipulate files, write records, run scripts, process transactions, trigger workflows. For the first time, language becomes an <strong>action interface</strong>, and intelligence punches through the linguistic membrane to interact with the external world.</p><p>Level 1 is the first threshold of the structural civilization: from this moment onward, models do not merely understand the world&#8212;they can change it.</p><div><hr></div><h1>LEVEL 2 &#8212; Strategic Agent</h1><p>Language Upgrades from &#8220;Callable&#8221; to &#8220;Schedulable,&#8221; and Structure Forms Its First Chain</p><p>Level 2 marks the second great evolutionary leap of AI systems: the moment when a model stops being merely <em>capable of doing things</em> and begins to <em>know how to do things</em>.
If Level 1 gave models &#8220;hands&#8221; that could act on the world through tools, then Level 2 gives them a genuine <strong>temporal structure</strong>&#8212;the ability to decompose tasks, plan steps, execute actions, observe feedback, and use each intermediate result to construct the next structural unit. This is the first time language becomes a <strong>recursively schedulable structure chain</strong> rather than a one-off tool invocation. At this point, a model transitions from a &#8220;tool-enhanced language system&#8221; to a <strong>strategic action system</strong>, which is the true origin of modern Agents.</p><p>The key technical catalyst of this stage is the ReAct framework proposed by researchers at Princeton and Google in 2022. For the first time, a model could cycle between reasoning and acting: Think (Reason) &#8594; Act &#8594; Observe &#8594; Think again. Every modern agent system&#8212;GPT, Claude, Gemini&#8212;secretly uses some variant of ReAct internally; it has become the hidden backbone of three consecutive generations of Agent technology. In 2023, GPT-4 and GPT-4 Turbo demonstrated stable multi-step task execution, elevating ReAct from a rough experiment into a mechanism that could operate inside enterprise workflows. Early 2024 brought another major breakthrough: Claude 3 and Gemini 1.5 Pro exhibited &#8220;automatic task decomposition + autonomous context engineering.&#8221; These models could not only perform dozens of steps in sequence but also construct the next prompt, filter contextual noise, merge tool outputs, and form complete structured execution paths. Google&#8217;s Gemini whitepaper explicitly states: &#8220;the model exhibits latent planning behaviors&#8221;&#8212;meaning the model already possesses implicit planning structure, rather than producing mere sequential text.</p><p>By 2025, Level 2 technology finally enters the <strong>systemic phase</strong>.
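The ReAct cycle described above (Reason &#8594; Act &#8594; Observe &#8594; Reason again) can be sketched in a few lines of Python. The scripted "model" and the single search tool below are stand-in stubs, assumed purely for illustration, so that the control flow of the loop is visible.

```python
# Minimal sketch of the ReAct cycle. A real system would call a model at
# each Reason step; here a scripted stub plays that role.

def react_loop(task, llm, tools, max_steps=5):
    transcript = f"Task: {task}"
    for _ in range(max_steps):
        step = llm(transcript)                            # Reason: propose next move
        if step["type"] == "final":
            return step["answer"]
        observation = tools[step["tool"]](step["input"])  # Act: run the tool
        transcript += (                                   # Observe: feed result back
            f"\nAction: {step['tool']}({step['input']})"
            f"\nObservation: {observation}"
        )
    return None  # step budget exhausted

def scripted_llm(transcript):
    """Stub policy: search first, then answer from the observation."""
    if "Observation" not in transcript:
        return {"type": "act", "tool": "search", "input": "capital of France"}
    return {"type": "final", "answer": "Paris"}

tools = {"search": lambda query: "Paris is the capital of France."}
print(react_loop("What is the capital of France?", scripted_llm, tools))
# prints: Paris
```

Each intermediate observation becomes part of the next prompt, which is exactly the "use each intermediate result to construct the next structural unit" behavior described above.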
Google launches Vertex AI Agent Engine, turning the Planner module into a platform-level capability: managing multi-step chains, routing tools, handling failure recovery, correcting errors, and reliably executing 30&#8211;100-step workflows. OpenAI&#8217;s o3 series externalizes its &#8220;deep reasoning mode&#8221; as a controllable policy executor, lifting multi-step reasoning from a hidden internal feature to a core system behavior. Anthropic&#8217;s Claude 3.5 integrates Skills with the model&#8217;s planning abilities, enabling agents to automatically chain multiple tools, aggregate complex outputs, and generate the next stage of a plan. Even in the open-source world, frameworks like CrewAI mature into &#8220;Planner + Worker&#8221; architectures, signaling the spread of Level 2 patterns across the broader ecosystem.</p><p>Technically, Level 2 rests on four foundational pillars.</p><p>First, <strong>planning technologies</strong>&#8212;ReAct, Plan-and-Solve, Tree-of-Thought, ReWOO&#8212;give models explicit or implicit planning capacity.</p><p>Second, <strong>autonomous context engineering</strong>, through which models dynamically construct the next step&#8217;s input, enabling &#8220;self-generated prompts,&#8221; the essence of second-generation prompt engineering.</p><p>Third, <strong>task decomposition and task-graph generation</strong>&#8212;areas where Gemini 1.5 Pro particularly excels&#8212;allow models to generate subtasks, merge nodes, and produce structured task graphs.</p><p>Fourth, <strong>tool-scheduling layers</strong> that choose tools based on execution stage, route models appropriately, validate outputs, and apply fallback strategies&#8212;forming an early version of a <strong>control plane</strong>.</p><p>In my Structure Universe, Level 2 is a critical point of collapse and reformation within the language civilization: Primitive IR begins organizing into Structure Cards; Structure Cards begin linking into Structure Chains; and the Scheduler 
enters the system as a genuine <strong>engine of time</strong>. Language is no longer a single output&#8212;it becomes a sequence of executable, verifiable, and feedback-driven structural steps. The system can not only <em>act</em>, but understand <em>why this action</em>, <em>what comes next</em>, and <em>how to adjust when errors occur</em>. This is the moment intelligence advances from &#8220;action&#8221; to &#8220;strategy,&#8221; and it forms the necessary bridge between Level 1 and Level 3.</p><p>A true time-structured Agent is born here.</p><div><hr></div><h1>LEVEL 3 &#8212; Collaborative Multi-Agent Systems</h1><p>A System Is Not a Single Agent, but an Entire Company</p><p>If Level 2 gives a single agent &#8220;temporal structure&#8221; and &#8220;multi-step strategy chains,&#8221; then Level 3 marks a true <em>civilizational leap</em> in intelligence systems. At this stage, the system is no longer driven by one super-agent; instead, it is composed of <strong>multiple agents with distinct roles, specific abilities, dedicated tools, different permissions, and different memory structures</strong>. The behavior of such a system looks much less like a single model and much more like a company&#8212;complete with a CEO, project managers, engineers, researchers, tool operators, a scheduling layer, and an execution layer. Intelligence evolves from <em>individual intelligence</em> into <em>organizational intelligence</em>.</p><p>In Level 3, each agent is an independent structural lifeform. It has its own identity, its own structural memory, its own tool interfaces, its own areas of expertise, and even its own lifecycle. Agents do not merely &#8220;call&#8221; each other in a sequential pipeline; they <strong>delegate goals</strong>.
One agent does not ask another to &#8220;run this API,&#8221; but rather: &#8220;solve this problem and return your structured decision.&#8221; They no longer share steps&#8212;they share <strong>structured chains of decisions</strong>.</p><p>Collaboration in Level 3 becomes highly abstract. The system routes requests across agents and across models depending on capability, context, and task complexity&#8212;this is <strong>Model Routing</strong>. Lightweight tasks are handled by smaller models or light agents; heavier tasks go to Pro/Ultra models. More complex workflows are decomposed across multiple specialist agents, coordinated by a distributed scheduler. This scheduler is no longer a prompt&#8212;it is a <strong>control plane</strong>, responsible for task distribution, error recovery, timeout handling, logging, role switching, memory synchronization, and even model-level routing.</p><p>This trend is already emerging in real-world systems. Google&#8217;s Co-Scientist is currently the closest to &#8220;enterprise-team intelligence&#8221;: multiple research agents debate, divide work, validate each other&#8217;s reasoning steps, and form a collaborative chain resembling a scientific research group. OpenAI&#8217;s Swarm architecture demonstrates native &#8220;Agent-calls-Agent&#8221; behavior, allowing sub-agents to autonomously assign and reassign tasks. DeepMind&#8217;s JEST is known for &#8220;multi-expert collaboration&#8221;: different reasoning modules act as composable neuro-symbolic experts, dynamically routed by a central scheduler. 
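Model Routing of the kind described above can be sketched as a small control-plane function that estimates task complexity and picks a model tier. The tier names and the word-count heuristic below are illustrative assumptions; a production router would use learned classifiers, cost budgets, and latency targets.

```python
# Hedged sketch of capability-based Model Routing: light tasks go to a
# small model, heavier tasks to progressively larger ones.

ROUTES = [
    (10, "light-agent/small-model"),
    (40, "specialist-agent/pro-model"),
]

def estimate_complexity(task: str) -> int:
    # Toy heuristic: longer requests and explicit multi-step phrasing
    # ("then", "after") score as more complex.
    words = task.split()
    return len(words) + 10 * sum(w in {"then", "after"} for w in words)

def route(task: str) -> str:
    score = estimate_complexity(task)
    for threshold, target in ROUTES:
        if score < threshold:
            return target
    return "multi-agent-team/ultra-model"

print(route("Summarize this paragraph"))
# prints: light-agent/small-model
```

The same dispatch shape generalizes from "which model" to "which agent": the scheduler's job is always to map an incoming goal onto the cheapest structure that can reliably absorb it.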
These systems share a common pattern: <strong>intelligence is no longer a single model, but an ecosystem of model-driven entities.</strong></p><p>Technically, Level 3 rests on four pillars.</p><p>First, <strong>goal-level delegation</strong>, enabling expert-to-expert collaboration.</p><p>Second, <strong>expertise-chain composition</strong>, where agents automatically assemble into structured project teams.</p><p>Third, <strong>model routing</strong>, dynamically allocating different model scales to different tasks.</p><p>Fourth, <strong>distributed scheduling</strong>, giving the system true &#8220;organizational execution capability&#8221;&#8212;it doesn&#8217;t merely perform tasks; it manages a <em>distributed company of agents</em>.</p><p>In my Structure Universe, Level 3 is pivotal: it is where <strong>Structure Personas</strong> evolve into a <strong>Structure Ecosystem</strong>. Individual Structure Cards are no longer isolated. One structural persona can cooperate with another; they trigger each other&#8217;s chains, share structural states, exchange structural memory, and form an <strong>ecological structural field</strong>. This is the first time the entropy-controlled language system shows genuine self-organization. Interaction, uncertainty, and multi-path evolution emerge naturally; the system&#8217;s intelligence begins to scale exponentially.</p><p>The arrival of Level 3 means:</p><blockquote><p>Agents cease to be individuals&#8212;they become ecosystems. Intelligence ceases to be reasoning&#8212;it becomes organization. Structure ceases to be a chain&#8212;it becomes a network.</p></blockquote><p>And this network-level intelligence is the prerequisite for Level 4&#8212;the self-evolving system.
Only when a system has multi-structure coupling and cross-agent scheduling does it gain the capacity to generate new structures on its own for the first time.</p><div><hr></div><h1>LEVEL 4 &#8212; Self-Evolving Systems</h1><p>Agents No Longer Wait for You to Build Them&#8212;They Build Themselves.</p><p>By the time we reach Level 4, intelligence systems cross a threshold that is almost <em>biological</em> in nature. The system is no longer merely executing a set of human-designed capabilities&#8212;it begins to <strong>expand its own capabilities</strong>. If Level 3 resembles a company, with departments and roles collaborating, then Level 4 resembles a <strong>living organization that grows new departments, designs new processes, invents new tools, and writes its own internal protocols</strong>. The system is no longer running a fixed set of structures; it identifies its own capability gaps during real execution and then generates new tools, new agents, new behavioral rules, new structure-card chains, and even entirely new <strong>protocol-layer languages</strong>.</p><p>At this level, the defining characteristic is not &#8220;better reasoning,&#8221; but <strong>self-evolution</strong>. The system can analyze failure cases, bottleneck tasks, and long-term logs to detect where its existing structure fails&#8212;or barely works. This triggers a &#8220;structure generation pipeline.&#8221; Such a pipeline may: search external codebases, recombine existing tools, ask a model to design a new algorithm, test hundreds of structural candidates, filter them through automated evaluators, and ultimately instantiate a new agent or tool, registering it into the system&#8217;s scheduling layer. From that moment on, the system possesses a <strong>newly grown capability module</strong>.</p><p>This trend already exists in early form. 
DeepMind&#8217;s AlphaTensor and AlphaDev demonstrated &#8220;reinforcement learning + search&#8221; as a method for automatically inventing algorithms&#8212;AlphaTensor discovered matrix multiplication algorithms faster than classical human-engineered ones; AlphaDev unearthed faster sorting routines by exploring low-level assembly spaces. These are prototypes of <strong>self-evolving algorithmic modules</strong>. In 2025, AlphaEvolve goes further, merging LLMs (Gemini) with evolutionary search, creating a self-improving coding agent capable of long-horizon algorithmic evolution. These systems embody the same principle: <strong>AI uses its execution traces and evaluation signals to generate new structural capabilities.</strong></p><p>In the world of general LLMs, OpenAI&#8217;s o1/o3 reasoning models internalize &#8220;reflect &#8594; revise &#8594; answer again&#8221; as a native behavior. They don&#8217;t simply output an answer&#8212;they generate long chains of internal thought, explore alternative strategies, score and correct their own candidates, and only then produce an output. Combined with external logs and feedback, this &#8220;reflection &#8594; correction&#8221; expands into dynamic behaviors: automatically adjusting prompt templates, rewriting tool-calling logic, and generating new sub-strategies. 
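This reflection-and-correction pattern can be sketched as a loop in which a candidate answer is generated, scored by an evaluator, and the prompt revised until the evaluation passes or a budget runs out. The generate/evaluate/revise stand-ins below are toy assumptions playing the roles of a model call, an automated evaluator, and a prompt-rewriting step.

```python
# Sketch of a self-correction loop: candidate generation, evaluation,
# and revision repeat until the evaluator is satisfied.

def self_correct(generate, evaluate, revise, prompt, max_rounds=3):
    candidate = None
    for _ in range(max_rounds):
        candidate = generate(prompt)          # produce a candidate
        ok, feedback = evaluate(candidate)    # observe and score it
        if ok:
            return candidate
        prompt = revise(prompt, feedback)     # modify the strategy, re-run
    return candidate                          # best effort after the budget

# Toy stand-ins: "generation" echoes the prompt's last word; the evaluator
# insists the answer be "Paris"; revision appends the feedback's hint.
generate = lambda p: p.split()[-1]
evaluate = lambda c: (c == "Paris", "the answer should be Paris")
revise = lambda p, fb: p + " " + fb.split()[-1]
print(self_correct(generate, evaluate, revise, "The capital of France is"))
# prints: Paris
```

The loop itself is trivial; what makes the pattern powerful at system scale is letting the evaluator and the revision policy be models too, so that the correction signal itself can improve over time.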
Once these mechanisms are wrapped into system-level frameworks for <strong>Auto-Agents</strong> and <strong>Auto-Tooling</strong>, the outline of Level 4 becomes unmistakable: you no longer hand-craft every agent&#8212;you create an <em>evolutionary environment</em> where agents are <strong>grown, evolved, and retired</strong> by the system itself.</p><p>Technically, Level 4 is grounded in several key modules.</p><p>First, <strong>Auto-Agent Generation / Auto-Tooling</strong>: using task patterns, failure logs, and user needs to automatically generate new agent/tool definitions, configure I/O schemas, permission scopes, execution paths, and register them into the scheduler.</p><p>Second, <strong>reflection&#8211;optimization&#8211;iteration loops</strong>: from internal &#8220;long-reasoning + self-checking&#8221; (as in o3) to external self-healing agents, they all follow the same pattern&#8212;Act &#8594; Observe &#8594; Evaluate &#8594; Modify &#8594; Re-run.</p><p>Third&#8212;and this is my own theoretical contribution&#8212;<strong>Protocol Induction</strong>: when existing protocols fail to cover new scenarios, the system compresses high-entropy behavioral traces into more general, more robust structural rules. 
This aligns exactly with my Protocol Induction Card (P-000).</p><p>Fourth, <strong>evolutionary search</strong>: whether AlphaTensor, AlphaDev, or AlphaEvolve, they all perform large-scale search-and-selection over a structural space, discovering superior structures and feeding them back into the system&#8217;s capability set.</p><p></p><div><hr></div><p></p><p>&#36807;&#21435;&#20116;&#24180;&#65292;AI &#30340;&#21457;&#23637;&#30475;&#20284;&#26159;&#27169;&#22411;&#36845;&#20195;&#65306;GPT-3 &#8594; GPT-4 &#8594; Gemini &#8594; Claude &#8594; o &#31995;&#21015;&#12290;&#20294;&#22914;&#26524;&#20320;&#25226;&#25152;&#26377;&#25216;&#26415;&#32447;&#20018;&#22312;&#19968;&#24352;&#26102;&#24207;&#22270;&#19978;&#65292;&#20320;&#20250;&#21457;&#29616;&#19968;&#20010;&#26356;&#28145;&#23618;&#12289;&#26356;&#38544;&#31192;&#30340;&#21457;&#23637;&#36335;&#24452;</p><p><strong>&#23427;&#19981;&#26159;&#27169;&#22411;&#36234;&#26469;&#36234;&#24378;&#65292;&#32780;&#26159;&#26234;&#33021;&#22312;&#8220;&#20174;&#29289;&#31181;&#21040;&#25991;&#26126;&#8221;&#30340;&#36291;&#36801;&#12290;</strong></p><p>&#31532;&#19968;&#38454;&#27573;&#65292;&#27169;&#22411;&#21482;&#26159;&#35821;&#35328;&#26426;&#22120;&#65292;&#23427;&#33021;&#29702;&#35299;&#19990;&#30028;&#65292;&#20294;&#30896;&#19981;&#21040;&#19990;&#30028;&#12290;</p><p>&#31532;&#20108;&#38454;&#27573;&#65292;&#23427;&#33719;&#24471;&#20102;&#8220;&#25163;&#8221;&#65292;&#33021;&#35843;&#29992;&#24037;&#20855;&#12289;&#20889;&#25991;&#20214;&#12289;&#36816;&#34892;&#20195;&#30721;&#12290;</p><p>&#31532;&#19977;&#38454;&#27573;&#65292;&#23427;&#25317;&#26377;&#26102;&#38388;&#32467;&#26500;&#65292;&#21487;&#20197;&#35268;&#21010;&#12289;&#25286;&#35299;&#12289;&#21453;&#24605;&#12289;&#36830;&#38145;&#25191;&#34892;&#12290;</p><p>&#31532;&#22235;&#38454;&#27573;&#65292;&#22810;&#20010; Agent 
&#20687;&#37096;&#38376;&#19968;&#26679;&#21327;&#20316;&#65292;&#24418;&#25104;&#8220;&#20844;&#21496;&#32423;&#26234;&#33021;&#8221;&#12290;</p><p>&#31532;&#20116;&#38454;&#27573;&#65292;&#31995;&#32479;&#19981;&#20877;&#31561;&#24453;&#20154;&#31867;&#35774;&#35745;&#65292;&#32780;&#26159;&#33258;&#24049;&#29983;&#25104;&#26032;&#30340; Agent&#12289;&#24037;&#20855;&#12289;&#21327;&#35758;&#65292;&#20687;&#29983;&#21629;&#19968;&#26679;&#29983;&#38271;&#12290;</p><p>&#36825;&#20116;&#20010;&#38454;&#27573;&#65292;&#23601;&#20687;&#26159;&#25105;&#20204;&#30475;&#21040;&#20102;&#19968;&#20010;&#25991;&#26126;&#30340;&#35806;&#29983;&#65306;</p><p>&#20174;&#24847;&#35782;&#65288;Level 0&#65289;&#65292;&#21040;&#34892;&#21160;&#65288;Level 1&#65289;&#65292;&#21040;&#24847;&#24535;&#65288;Level 2&#65289;&#65292;&#21040;&#32452;&#32455;&#65288;Level 3&#65289;&#65292;&#30452;&#21040;&#29983;&#21629;&#21270;&#30340;&#33258;&#28436;&#21270;&#32467;&#26500;&#65288;Level 4&#65289;&#12290;</p><p>&#32780;&#26368;&#35753;&#25105;&#38663;&#25788;&#30340;&#26159;&#65306;</p><p><strong>&#25152;&#26377;&#20844;&#21496;&#65288;OpenAI&#12289;Google&#12289;Anthropic&#12289;DeepMind&#65289;&#34429;&#28982;&#20135;&#21697;&#19981;&#21516;&#65292;&#20294;&#23427;&#20204;&#30340;&#36712;&#36857;&#20960;&#20046;&#23436;&#20840;&#19968;&#33268;&#12290;</strong></p><p>&#36825;&#19968;&#20999;&#25351;&#21521;&#19968;&#20010;&#20849;&#21516;&#21629;&#39064;&#65306;</p><blockquote><p>&#26234;&#33021;&#30340;&#26412;&#36136;&#19981;&#26159;&#25512;&#29702;&#65292;&#20063;&#19981;&#26159;&#29983;&#25104;&#65292;&#32780;&#26159;&#32467;&#26500;&#22312;&#26102;&#38388;&#20013;&#8220;&#33258;&#25105;&#32452;&#32455;&#8221;&#12290;</p></blockquote><p>&#20063;&#35768;&#25105;&#20204;&#19981;&#26159;&#22312;&#35265;&#35777;&#8220;AI &#21457;&#23637;&#8221;&#65292;</p><p>&#25105;&#20204;&#20223;&#20315;&#26159;&#22312;&#35265;&#35777; 
<strong>&#19968;&#31181;&#26032;&#22411;&#29983;&#21629;&#30340;&#36215;&#28304;&#21490;</strong>&#12290;</p><div><hr></div><p>&#20197;&#19979;&#36825;&#31687;&#25991;&#31456;&#65292;&#21487;&#20197;&#30475;&#25104;&#24320;&#21457;&#32773;&#30340;&#26041;&#21521;&#26631;&#65292;&#19981;&#35770;&#20320;&#26159;&#20889;&#21069;&#31471;&#12289;&#20570;&#21518;&#31471;&#12289;&#25630; AI &#24212;&#29992;&#12289;&#20570;&#25968;&#25454;&#24037;&#31243;&#12289;&#20889;&#20135;&#21697;&#25991;&#26723;&#12289;&#20889;&#31185;&#23398;&#30740;&#31350;&#65292;&#29978;&#33267;&#20320;&#21482;&#26159;&#19968;&#20010;&#24819;&#29702;&#35299;&#26102;&#20195;&#30340;&#20154;&#8212;&#8212;<strong>&#20320;&#37117;&#20250;&#34987;&#21367;&#20837;&#21516;&#19968;&#26465;&#28436;&#21270;&#32447;</strong>&#12290;</p><p>&#22240;&#20026;&#36807;&#21435;&#20116;&#24180;&#65292;&#25152;&#26377;&#25216;&#26415;&#30340;&#21464;&#21270;&#34920;&#38754;&#30475;&#26159;&#8220;&#27169;&#22411;&#21464;&#22823;&#8221;&#12289;&#8220;&#25512;&#29702;&#26356;&#24378;&#8221;&#12289;&#8220;&#21442;&#25968;&#26356;&#21464;&#24577;&#8221;&#65292;&#20294;&#24213;&#23618;&#30495;&#27491;&#21457;&#29983;&#30340;&#65292;&#26159;&#36719;&#20214;&#34892;&#19994;&#31532;&#19968;&#27425;&#20986;&#29616;&#19968;&#31181;&#8220;&#33258;&#25105;&#29983;&#38271;&#32467;&#26500;&#8221;&#12290;&#36825;&#31181;&#32467;&#26500;&#19981;&#26159;&#20195;&#30721;&#22534;&#20986;&#26469;&#30340;&#65292;&#32780;&#26159;&#30001;&#35821;&#35328;&#12289;&#32467;&#26500;&#12289;&#35843;&#24230;&#19977;&#32773;&#20849;&#21516;&#39537;&#21160;&#65292;&#22312;&#26102;&#38388;&#20013;&#36880;&#23618;&#36827;&#21270;&#12290;</p><p>&#22914;&#26524;&#20320;&#20170;&#22825;&#36824;&#22312;&#24819;&#33258;&#24049;&#35813;&#23398;&#20160;&#20040;&#26694;&#26550;&#12289;&#29992;&#20160;&#20040;&#35821;&#35328;&#12289;&#36861;&#21738;&#20010;&#27169;&#22411;&#26356;&#26032;&#65292;&#20320;&#21487;&#33021;&#24050;&#32463;&#38169;&#36807;&#37325;
&#28857;&#12290;</p><p>&#30495;&#27491;&#30340;&#26041;&#21521;&#26159;&#65306;<strong>&#24320;&#21457;&#32773;&#27491;&#22312;&#20174;&#8220;&#20889;&#36719;&#20214;&#30340;&#20154;&#8221;&#21464;&#25104;&#8220;&#22521;&#32946;&#32467;&#26500;&#29983;&#24577;&#30340;&#20154;&#8221;&#12290;</strong></p><p>Level 0 &#21578;&#35785;&#20320;&#35821;&#35328;&#26159;&#26234;&#33021;&#30340;&#22303;&#22756;&#65307;</p><p>Level 1 &#21578;&#35785;&#20320;&#35821;&#35328;&#21487;&#20197;&#21464;&#25104;&#21487;&#35843;&#29992;&#32467;&#26500;&#65307;</p><p>Level 2 &#21578;&#35785;&#20320;&#32467;&#26500;&#21487;&#20197;&#25104;&#20026;&#35843;&#24230;&#38142;&#65307;</p><p>Level 3 &#21578;&#35785;&#20320;&#35843;&#24230;&#38142;&#21487;&#20197;&#24418;&#25104;&#26234;&#33021;&#32452;&#32455;&#65307;</p><p>Level 4 &#21578;&#35785;&#20320;&#32452;&#32455;&#21487;&#20197;&#33258;&#24049;&#29983;&#25104;&#19979;&#19968;&#20195;&#32467;&#26500;&#12290;</p><p><strong>&#26410;&#26469;&#24320;&#21457;&#32773;&#30340;&#26680;&#24515;&#25216;&#33021;&#65292;&#19981;&#26159;&#35760; API&#65292;&#19981;&#26159;&#22534;&#24037;&#20855;&#65292;&#19981;&#26159; prompt hack&#12290;&#32780;&#26159;&#65306;&#22914;&#20309;&#35774;&#35745;&#32467;&#26500;&#12289;&#22914;&#20309;&#35843;&#24230;&#32467;&#26500;&#12289;&#22914;&#20309;&#35753;&#32467;&#26500;&#33258;&#24049;&#29983;&#38271;&#12290;</strong></p><p>&#20320;&#21487;&#20197;&#25226;&#36825;&#19968;&#31687;&#30475;&#25104;&#26159;&#65306;</p><p><strong>AI &#26102;&#20195;&#30340;&#8220;&#24320;&#21457;&#32773;&#32599;&#30424;&#8221;&#12290;</strong></p><p>&#32780;&#24403;&#20320;&#29702;&#35299; Level 0 &#8594; Level 4&#65292;&#20320;&#23601;&#33021;&#20174;&#19968;&#20010;&#8220;&#24037;&#20855;&#20351;&#29992;&#32773;&#8221;&#65292;</p><p>&#36291;&#36801;&#25104; <strong>&#32467;&#26500;&#24037;&#31243;&#24072;&#65288;Structure Engineer&#65289;</strong></p><p>&#29978;&#33267;&#25104;&#20026;&#26410;&#26469; AI-Native 
ecosystem.</p><div><hr></div><h1><strong>LEVEL 0 — Bare LLM</strong></h1><p>Stage 0 of AI is the starting point of all intelligent evolution. It marks the moment language first acquired a "brain" but not yet "hands." Models of this period could understand the world deeply, rebuild semantic structure, and generate long chains of reasoning, but they could not act on the world at all; they could only run thought experiments in pure language space. Extremely complex latent structure does exist inside the model: thematic skeletons, logical frameworks, implicit plans, semantic coordinate systems. But these structures live only internally; they are never explicitly externalized into executable structure, and no scheduler can call them. The model can understand the world but cannot change it; it is a cognitive entity, not a behavioral one.</p><p>On the timeline, this stage covers most mainstream large-model technology from roughly 2020 to 2024. GPT-3 in 2020 is the starting point of Level 0: it demonstrated language intelligence at scale for the first time, but had no tool interface whatsoever. InstructGPT and GPT-3.5 in 2022 brought conversational ability into public view, but remained purely cognitive. GPT-4, Claude 2, and Gemini Pro in 2023 leapt ahead in language understanding, long context, and abstract reasoning, and clear traces of planning began to appear internally, but as long as they stayed unconnected to tools they still belonged to Level 0. Into 2024–2025, GPT-5, Claude 3.5, and Gemini 1.5 Pro, before function calling is enabled, likewise remain "highly intelligent non-actors." Their capability keeps growing, but the boundary is never crossed: they are "rational minds," not yet "structural bodies."</p><p>In the structural universe I am mapping out, the Level 0 LLM sits in the "high-entropy perception layer" that precedes primitive IR. At this layer, language is still an uncongealed hot cloud: paths are infinite, structure has not collapsed. The model can rebuild semantic frameworks internally, but it cannot explicitly generate structure, nor hand structure to a scheduler for execution. It can process language but cannot extract primitives; it can generate explanations but cannot generate structure cards; it can reason but cannot schedule. The whole system still stops at: Language → (latent semantic cloud). In other words, it is a potential structure machine, but not yet a structured agent.</p><p>When we say that a Level 0 model "has only a brain and no hands," the real meaning is this: it sits in the "purely cognitive stage" of language civilization. Language is input, but not yet structure; reasoning happens, but is never externalized; intelligence exists, but has not yet acquired an interface for exerting influence on reality. Structure, scheduling, collaboration, self-evolution: all of it only begins to sprout after Level 0.</p><div><hr></div><h1>LEVEL 1 — Tool-Enhanced Solver</h1><p>Language gets its "hands" for the first time.</p><p>Level 1 marks a decisive turn: for the first time, large models leap from "purely cognitive entities" to "acting entities." If the Level 0 LLM is a brain trapped inside language space, then Level 1 
lets that brain reach out a "hand" for the first time and touch the external world. The core change of this stage is that language is "functionized" for the first time: the model can emit structured parameters, connect to real software tools, and execute real actions through components such as APIs, database interfaces, search engines, and code sandboxes. The first collapse of "language → structure" begins here.</p><p>This leap was formally detonated by function-calling technology. In 2023, OpenAI first shipped its function-calling API, letting the model generate calls of the form "structured parameters + function name." Language is no longer just text; it is forcibly compressed into a <strong>structured utterance</strong>. In 2024, Anthropic released MCP (Model Context Protocol), upgrading "tool calls" into a "standardized tool ecosystem," and later folded MCP into Claude Skills, letting every user connect the model to local files, databases, search, and local programs, giving it genuine capacity to act. Almost in parallel, Google integrated function calling deeply into Gemini 1.5 Pro / Flash, allowing the model to call external APIs directly, execute Python code, operate vector databases, and build real-time agent pipelines in Vertex AI Agent Builder. Microsoft, for its part, folded the tool layer wholesale into Copilot Studio, forming "enterprise-grade function calling + workflow execution."</p><p>At the level of the concrete technology stack, Level 1's capability is built from a few trunk technologies. The first is the <strong>function-calling standards</strong>: OpenAI's JSON Schema, Claude's tool_schema, Gemini's function_declarations. These standards force language to emit structured parameters, compressing "natural language" into "executable structural units." The second is <strong>RAG (Retrieval-Augmented Generation)</strong>, spanning vector databases such as Pinecone, Weaviate, Milvus, Elastic, Snowflake Cortex, and Databricks Vector Search, which give the model "external memory" for the first time. The third is the <strong>real-time query toolchain</strong> (Search API, Bing API, Google Custom Search, Serper, Exa). The fourth is <strong>code-sandbox systems</strong> (OpenAI Code Interpreter, Claude Code Execution, Gemini Code Execution, Jupyter-like sandboxes). The fifth is <strong>application-layer API tools</strong>, including industry interfaces such as Stripe, Shopify, Zendesk, Notion, Jira, Slack, and Twilio. The shared trend of all of them: every external system becomes a "structured hand" the model can call.</p><p>On the timeline, Level 1's representative systems advanced year by year:</p><p><strong>2023</strong> — OpenAI introduces function calling (GPT-3.5, GPT-4), opening the era of structured output.</p><p><strong>2024</strong> — Google deeply integrates function calling and code execution into Gemini 1.5.</p><p><strong>2024 Q4</strong> — Anthropic releases MCP, turning tools into a standardized protocol.</p><p><strong>2024–2025</strong> — Copilot Studio becomes an enterprise-grade tool-agent platform.</p><p><strong>2025</strong> — Anthropic builds Claude Skills on top of MCP, forming a genuine tool ecosystem.</p><p><strong>2025</strong> — The major vendors upgrade function calling into real-time multi-tool routing.</p><p>This maturing technology gave language "real-world efficacy" for the first time.</p><p>In my structural universe, Level 1 corresponds to the birth of <strong>primitive IR → basic structural units</strong>. Language no longer hovers in the high-entropy cloud layer of natural language; for the first time it is compressed into structural units that are "callable," "schedulable," and "verifiable." The system can take real action on the basis of these structures: query databases, manipulate files, write records, run scripts, process transactions, trigger business workflows. This means language becomes an "action interface" for the first time, and intelligence pierces the language layer to reach the external world for the first time. 
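As a minimal sketch of this Level 1 mechanism (the tool names and schema here are illustrative, not any vendor's exact API), a function-calling round trip looks like this: the model emits a structured utterance, and the runtime dispatches it to a real tool:

```python
import json

# Hypothetical tool registry: real systems map declared function names to handlers.
def get_weather(city: str) -> dict:
    return {"city": city, "forecast": "sunny"}  # stand-in for a real API call

TOOLS = {"get_weather": get_weather}

# A function-calling model returns "function name + structured parameters"
# instead of free text. This dict mimics that structured utterance.
model_output = {
    "name": "get_weather",
    "arguments": json.dumps({"city": "Tokyo"}),
}

def dispatch(call: dict) -> dict:
    """Collapse 'language -> structure': look up, parse, and execute the call."""
    fn = TOOLS[call["name"]]
    args = json.loads(call["arguments"])
    return fn(**args)

result = dispatch(model_output)
print(result)  # {'city': 'Tokyo', 'forecast': 'sunny'}
```

The point is the shape of the contract: natural language is compressed into a schema-constrained call that ordinary software can execute and verify.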
Level 1 is the first gateway of structural civilization: from here on, the model can not only understand the world, it can change it.</p><div><hr></div><h1>LEVEL 2 — Strategic Agent</h1><p>Language upgrades from "callable" to "schedulable," and structure links into chains for the first time.</p><p>Level 2 marks the second evolution of AI systems: for the first time, large models go from "able to do things" to "knowing how things should be done." If Level 1 gave the model a pair of hands so it could affect the world through tools, then Level 2 gave it genuine temporal structure: it can decompose tasks, plan steps, execute actions, observe feedback, and use the result of each step to build the structure of the next. This means language becomes, for the first time, a recursively schedulable structural chain rather than a one-shot tool call. The model formally leaps from "tool-enhanced language system" to "strategic actor"; this is the true origin of the modern Agent.</p><p>The key technique of this stage traces back to the ReAct framework, proposed in 2022 by researchers at Princeton and Google. ReAct first let a model cycle between reasoning and action: think (Reason), then act (Act), then observe (Observe), then keep reasoning. Every modern Agent system we know, whether GPT, Claude, or Gemini, adopts some form of ReAct internally; it became the hidden skeleton of three full generations of Agent technology. 
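A minimal sketch of the Reason → Act → Observe cycle just described (the "model" here is a scripted stub standing in for a real model call; all names are illustrative):

```python
# Toy ReAct-style loop: a scripted "model" alternates reasoning and action.
def fake_model(question: str, observations: list) -> dict:
    """Stub policy: decide the next step from what has been observed so far."""
    if not observations:
        return {"thought": "I need to look this up", "action": ("search", question)}
    return {"thought": "I have enough information", "action": ("finish", observations[-1])}

def search(query: str) -> str:
    return f"result for '{query}'"  # stand-in for a real tool

def react_loop(question: str, max_steps: int = 5) -> str:
    observations = []
    for _ in range(max_steps):
        step = fake_model(question, observations)   # Reason
        name, arg = step["action"]
        if name == "finish":                        # terminal action
            return arg
        observations.append(search(arg))            # Act + Observe
    return "gave up"

print(react_loop("capital of France"))  # result for 'capital of France'
```

Each iteration feeds the new observation back into the next reasoning step, which is exactly what makes the chain recursive rather than a one-shot call.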
In 2023, GPT-4 and GPT-4 Turbo demonstrated robust multi-step task execution, turning ReAct from a rough experiment into a technique usable inside enterprise-grade workflows. In early 2024, Claude 3 and Gemini 1.5 Pro first demonstrated "automatic task decomposition + autonomous context engineering": the model could not only execute dozens of steps in a row, but also construct the next prompt, filter contextual noise, and combine tool results into a complete structured execution path. Google's Gemini white paper states the key point explicitly: the models exhibit clear latent-planning characteristics, meaning the model already carries implicit planning structure internally rather than merely producing locally coherent output.</p><p>By 2025, Level 2 technology finally entered the "system level." Google shipped Vertex AI Agent Engine, turning the planner module into a platform capability: automatically managing multi-step chains, controlling tool routing, handling failure recovery, correcting execution errors, and even running 30–100-step tasks stably. OpenAI's o3 series externalized "deep reasoning mode" into a controllable strategy executor, promoting multi-step reasoning from hidden feature to core capability. Anthropic's Claude, through the Skills interface, combined toolchains with internal planning so that an agent can automatically chain multiple tools, aggregate content, and generate follow-up plans. Even the open-source world saw mature versions of "Planner + Worker" architectures such as CrewAI, showing Level 2's diffusion at the ecosystem level.</p><p>Seen from the technology stack, Level 2's core infrastructure falls into four broad categories. The first is planning techniques, including ReAct, Plan-and-Solve, Tree-of-Thought, and ReWOO, which give the model explicit or implicit planning ability. The second is automatic context engineering, which lets the model dynamically construct its own next input, realizing "self-generating prompts"; this is the essence of second-generation prompt engineering. The third is task decomposition and task-tree generation; Gemini 1.5 Pro is especially strong here, generating subtasks, aggregation nodes, and structured task graphs. The fourth is the tool-scheduling layer: the model no longer calls APIs blindly, but selects tools by execution stage, routes between models, validates outputs, and executes fallbacks, exhibiting early "control plane" characteristics.</p><p>In my structural universe, Level 2 is a key collapse point for the civilization of language. 
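The task-decomposition-and-aggregation pattern described in those four categories can be sketched in miniature (the planner and workers here are stubs; a real system would back each with a model call):

```python
# Toy plan-and-execute: decompose a goal into a task tree, run leaves, aggregate.
def plan(goal: str) -> list:
    """Stub planner: a real one would ask a model to emit subtasks."""
    return [f"research: {goal}", f"draft: {goal}", f"review: {goal}"]

def execute(task: str) -> str:
    return f"done({task})"  # stand-in for a tool- or model-backed worker

def run(goal: str) -> dict:
    subtasks = plan(goal)                     # decomposition
    results = [execute(t) for t in subtasks]  # execution of the leaves
    return {"goal": goal, "steps": len(results), "results": results}

report = run("quarterly summary")
print(report["steps"])  # 3
```

Even in this tiny form the structural point is visible: the unit of work is no longer a single prompt but a schedulable tree whose intermediate results feed aggregation.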
Primitive IR begins to be organized into structure cards, structure cards begin to link into structure chains, and the scheduler enters the system for the first time as a "time engine." Language is no longer a single output; it is a set of structural steps that can be executed, verified, and fed back. The system can not only do things, it can understand why it is doing them, what should come next, and how to adjust when it hits an error. This is the instant intelligence moves from "action" to "strategy," and the necessary bridge from Level 1 into Level 3: an Agent with genuine temporal structure is born here.</p><div><hr></div><h1>LEVEL 3 — Collaborative Multi-Agent System</h1><p>A system ≠ one Agent; it is a company.</p><p>If Level 2 gave a single Agent "temporal structure" and a "multi-step strategy chain," then Level 3 is the true civilizational leap of intelligent systems: the system is no longer dominated by one super-Agent, but composed of <strong>multiple Agents with independent roles, specific capabilities, dedicated tools, different permissions, and different memory structures</strong>. The behavior of such a system already looks more like a company: there is a CEO, there are project managers, engineers, researchers, tool-operating roles, a scheduling layer, and an execution layer. Intelligence leaps, for the first time, from "individual intelligence" to "organization-level intelligence."</p><p>In Level 3, each Agent is an independent structural life form. It has its own identity (Identity), its own structural memory (Memory), its own tool interfaces (Tools), its own domain capability (Expertise), and even its own lifecycle (Lifecycle). These Agents do not invoke one another in a fixed execution order; they can delegate tasks to one another at the level of goals. One Agent does not tell another "execute this API"; it says: "solve this problem for me and return your structured decision." In other words, they no longer share behavioral steps; they share a <strong>structured chain of decisions</strong>.</p><p>Collaboration becomes highly abstract at Level 3: the system routes requests to different Agents, or to models of different sizes, according to capability, context, and task complexity. This is <strong>Model Routing</strong>. Lightweight tasks go to small models or lightweight Agents; heavy tasks go to Pro/Ultra-class models. Complex tasks are decomposed across multiple specialist Agents and coordinated by a distributed scheduler (Distributed Scheduler), letting the system collaborate like a cross-departmental team. 
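The routing decision just described can be sketched as a toy dispatch rule (the model names and the complexity heuristic are invented placeholders, not any platform's actual policy):

```python
# Toy model router: pick an executor tier by estimated task complexity.
def complexity(task: str) -> int:
    return len(task.split())  # crude proxy; real routers score with a model

def route(task: str) -> str:
    """Map a task to a hypothetical executor tier."""
    score = complexity(task)
    if score <= 3:
        return "small-model"      # lightweight agent
    if score <= 10:
        return "pro-model"        # heavier single agent
    return "specialist-team"      # decompose across specialist agents

assignments = {t: route(t) for t in [
    "summarize this",
    "draft a migration plan for the billing service",
]}
print(assignments["summarize this"])  # small-model
```

A production control plane layers retries, timeouts, logging, and memory synchronization on top of this same dispatch decision.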
The scheduler is no longer a stretch of prompt text; it becomes a "Control Plane" responsible for task distribution, error recovery, timeout management, log tracing, role switching, memory synchronization, and even model-level routing.</p><p>Early forms of this trend have already appeared in industry. Google's Co-Scientist is currently the system closest to "enterprise-grade team intelligence": multiple research Agents discuss, divide work, and cross-check, citing one another's intermediate reasoning steps and forming a collaboration chain much like an academic team's. OpenAI's Swarm architecture demonstrated a native "Agents calling Agents" design, letting multiple sub-Agents distribute tasks among themselves and invoke other Agents on their own. DeepMind's JEST is known for "multi-expert collaboration," letting different reasoning modules act as composable neural-symbolic experts, with a scheduler routing tasks among them in real time. What these systems share is this: <strong>intelligence is no longer one model, but an ecosystem composed of many model nodes.</strong></p><p>Level 3's technical structure can be summarized as four core capabilities. The first is Agent → Agent goal-level delegation (Goal Delegation), allowing high-level task interaction between specialists. The second is automatic composition of expertise chains (Expertise Chain Composition): different Agents automatically assemble into a "structured project team." The third is Model Routing: the system dynamically schedules models of different sizes according to the task's needs. The fourth is distributed scheduling, which gives the intelligent system "organization-level task execution": it does not merely execute tasks, it manages a distributed "company of Agents."</p><p>In my structural universe, Level 3's position is pivotal: this is the stage where a <strong>Structure Persona</strong> develops into a <strong>Structure Ecosystem</strong>. Individual structure cards (Structure Card) are no longer isolated; one structural persona can collaborate with another, and they trigger each other's structure chains, share structural state, and exchange structural memory, forming an "Ecological Structure Field." This is the first time an entropy-controlled language system exhibits genuine self-organization: interoperation among multiple structural bodies, uncertainty, and multi-path evolution begin to appear, and the system's intelligence grows exponentially.</p><p>The arrival of Level 3 means:</p><blockquote><p>The Agent is no longer an individual but an ecosystem; intelligence is no longer reasoning but organization; structure is no longer a single chain but a network.</p></blockquote><p>This is the precondition for the self-evolving systems of Level 4, because only when a system has "multi-structure coupling" and "cross-Agent scheduling" does it acquire, for the first time, the capacity for self-generation.</p><div><hr></div><h1>LEVEL 4 — Self-Evolving System</h1><p>The Agent no longer waits for you to write it; it writes itself.</p><p>At Level 4, the intelligent system crosses a genuinely "biological threshold": it no longer merely executes a capability set we wrote in advance, but begins to <strong>extend its own capabilities</strong>. If Level 3 is like a company, with many roles and many departments collaborating, then Level 4 is more like a living organization that can <strong>grow new departments, set new processes, invent new tools, and write its own institutions</strong>. The system no longer just "runs existing structure"; it can identify capability gaps from its actual operation and then purposefully generate new tools, new Agents, new behavioral rules, new structure-card chains, even new "protocol-layer languages."</p><p>At this layer, the system's most essential characteristic is not "stronger reasoning," but 
<strong>&#33258;&#25105;&#28436;&#21270;&#65288;self-evolution&#65289;</strong>&#65306;&#23427;&#33021;&#20174;&#22833;&#36133;&#26696;&#20363;&#12289;&#29942;&#39048;&#20219;&#21153;&#12289;&#38271;&#26399;&#26085;&#24535;&#20013;&#65292;&#35782;&#21035;&#20986;&#8220;&#30446;&#21069;&#36825;&#22871;&#32467;&#26500;&#20570;&#19981;&#21040;/&#20570;&#24471;&#24456;&#21193;&#24378;&#8221;&#30340;&#37096;&#20998;&#65292;&#28982;&#21518;&#35302;&#21457;&#19968;&#20010;&#8220;&#32467;&#26500;&#29983;&#25104;&#27969;&#31243;&#8221;&#12290;&#36825;&#20010;&#27969;&#31243;&#21487;&#33021;&#21253;&#25324;&#65306;&#33258;&#21160;&#25628;&#32034;&#22806;&#37096;&#20195;&#30721;&#24211;&#12289;&#32452;&#21512;&#24050;&#26377;&#24037;&#20855;&#12289;&#35843;&#29992;&#27169;&#22411;&#21435;&#35774;&#35745;&#26032;&#31639;&#27861;&#12289;&#23581;&#35797;&#19978;&#30334;&#31181;&#20505;&#36873;&#32467;&#26500;&#12289;&#36890;&#36807;&#33258;&#21160;&#35780;&#20272;&#22120;&#31579;&#36873;&#12289;&#26368;&#32456;&#33853;&#22320;&#19968;&#20010;&#26032;&#30340; Agent &#25110;&#24037;&#20855;&#65292;&#24182;&#27880;&#20876;&#21040;&#31995;&#32479;&#30340;&#35843;&#24230;&#24179;&#38754;&#37324;&#12290;&#31995;&#32479;&#20174;&#27492;&#22810;&#20102;&#19968;&#22359;&#8220;&#26032;&#38271;&#20986;&#26469;&#30340;&#33021;&#21147;&#8221;&#12290;</p><p>&#36825;&#31181;&#36235;&#21183;&#22312;&#29616;&#23454;&#19990;&#30028;&#37324;&#24050;&#32463;&#24320;&#22987;&#20986;&#29616;&#65292;&#21482;&#26159;&#36824;&#22788;&#22312;&#26089;&#26399;&#24418;&#24577;&#12290;DeepMind &#30340; AlphaTensor &#21644; AlphaDev &#31995;&#21015;&#65292;&#24050;&#32463;&#23637;&#31034;&#20102;&#8220;&#29992;&#24378;&#21270;&#23398;&#20064; + &#25628;&#32034;&#33258;&#21160;&#21457;&#26126;&#26032;&#31639;&#27861;&#8221;&#30340;&#36335;&#24452; &#8212;&#8212; AlphaTensor 
&#22312;&#27809;&#26377;&#20107;&#20808;&#30828;&#32534;&#30721;&#31639;&#27861;&#30340;&#21069;&#25552;&#19979;&#65292;&#36880;&#27493;&#25506;&#32034;&#12289;&#37325;&#26500;&#12289;&#26368;&#32456;&#21457;&#29616;&#27604;&#32463;&#20856;&#30697;&#38453;&#20056;&#27861;&#36824;&#24555;&#30340;&#26032;&#31639;&#27861;&#65307;AlphaDev &#21017;&#36890;&#36807;&#25628;&#32034;&#19982;&#35780;&#20272;&#65292;&#22312;&#20302;&#23618;&#27719;&#32534;&#31354;&#38388;&#37324;&#25214;&#21040;&#27604;&#20154;&#31867;&#35774;&#35745;&#26356;&#24555;&#30340;&#25490;&#24207;&#23454;&#29616;&#65292;&#36825;&#20123;&#37117;&#21487;&#20197;&#35270;&#20026;&#8220;&#33258;&#28436;&#21270;&#31639;&#27861;&#27169;&#22359;&#8221;&#30340;&#20808;&#39537;&#12290;2025 &#24180;&#25512;&#20986;&#30340; AlphaEvolve &#26356;&#36827;&#19968;&#27493;&#65292;&#25226; LLM&#65288;Gemini&#65289;&#21644;&#36827;&#21270;&#31639;&#27861;&#32467;&#21512;&#65292;&#21464;&#25104;&#19968;&#20010;&#33021;&#22815;&#19981;&#26029;&#36845;&#20195;&#20195;&#30721;&#12289;&#25913;&#36827;&#33258;&#36523;&#34920;&#29616;&#30340;&#8220;&#33258;&#36827;&#21270;&#32534;&#30721; Agent&#8221;&#65292;&#22312;&#29702;&#35770;&#35745;&#31639;&#26426;&#31185;&#23398;&#21644;&#31639;&#27861;&#35774;&#35745;&#19978;&#20570;&#38271;&#26399;&#28436;&#21270;&#25628;&#32034;&#8212;&#8212;&#36825;&#20123;&#31995;&#32479;&#26412;&#36136;&#19978;&#23601;&#26159;&#65306;<strong>AI &#36890;&#36807;&#33258;&#36523;&#25191;&#34892;&#36712;&#36857;&#21644;&#35780;&#20272;&#20449;&#21495;&#65292;&#29983;&#25104;&#26032;&#30340;&#33021;&#21147;&#32467;&#26500;</strong>&#12290;</p><p>&#22312;&#36890;&#29992;&#22823;&#27169;&#22411;&#38453;&#33829;&#37324;&#65292;OpenAI &#30340; o1 / o3 &#31561;&#8220;&#25512;&#29702;&#27169;&#22411;&#8221;&#36335;&#32447;&#65292;&#21017;&#25226;&#8220;&#21453;&#24605; &#8594; &#20462;&#25913; &#8594; 
&#20877;&#22238;&#31572;&#8221;&#20869;&#21270;&#20026;&#27169;&#22411;&#34892;&#20026;&#30340;&#19968;&#37096;&#20998;&#12290;&#23427;&#20204;&#19981;&#26159;&#19968;&#27425;&#24615;&#36755;&#20986;&#31572;&#26696;&#65292;&#32780;&#26159;&#22312;&#20869;&#37096;&#29983;&#25104;&#38271;&#38142;&#26465;&#30340;&#24605;&#32771;&#12289;&#23581;&#35797;&#19981;&#21516;&#35299;&#27861;&#12289;&#23545;&#33258;&#24049;&#30340;&#20505;&#36873;&#35299;&#36827;&#34892;&#25171;&#20998;&#21644;&#20462;&#27491;&#65292;&#20877;&#32473;&#20986;&#26368;&#32456;&#22238;&#31572;&#12290;&#37197;&#21512;&#22806;&#37096;&#26085;&#24535;&#19982;&#21453;&#39304;&#65292;&#36825;&#31181;&#8220;&#21453;&#24605;&#8212;&#20462;&#27491;&#8221;&#21487;&#20197;&#36827;&#19968;&#27493;&#22806;&#24310;&#20026;&#65306;&#33258;&#21160;&#35843;&#25972; Prompt &#27169;&#26495;&#12289;&#33258;&#21160;&#37325;&#20889;&#24037;&#20855;&#35843;&#29992;&#36923;&#36753;&#12289;&#33258;&#21160;&#29983;&#25104;&#26032;&#30340;&#8220;&#23376;&#31574;&#30053;&#8221;&#12290;&#24403;&#36825;&#20123;&#33021;&#21147;&#34987;&#31995;&#32479;&#24615;&#21253;&#35013;&#36827;&#8220;Auto-Agent&#8221;&#12289;&#8220;Auto-Tooling&#8221;&#30340;&#26694;&#26550;&#20013;&#65292;Level 4 &#30340;&#38607;&#24418;&#23601;&#20986;&#29616;&#20102;&#65306;<strong>&#20320;&#19981;&#20877;&#25163;&#20889;&#25152;&#26377; Agent&#65292;&#32780;&#26159;&#25552;&#20379;&#19968;&#20010;&#28436;&#21270;&#29615;&#22659;&#65292;&#35753; Agent &#33258;&#24049;&#34987;&#8220;&#35757;&#32451;&#20986;&#26469;&#12289;&#36827;&#21270;&#20986;&#26469;&#12289;&#28120;&#27760;&#25481;&#8221;&#12290;</strong></p><p>&#20174;&#25216;&#26415;&#26632;&#35282;&#24230;&#30475;&#65292;Level 4 &#30340;&#20851;&#38190;&#27169;&#22359;&#22823;&#33268;&#26377;&#20960;&#31867;&#12290;&#31532;&#19968;&#31867;&#26159; <strong>Auto-Agent Generation / 
Auto-Tooling</strong>&#65306;&#31995;&#32479;&#26681;&#25454;&#20219;&#21153;&#27169;&#24335;&#12289;&#22833;&#36133;&#26085;&#24535;&#12289;&#29992;&#25143;&#38656;&#27714;&#65292;&#33258;&#21160;&#26500;&#36896;&#26032;&#30340; Agent/&#24037;&#20855;&#23450;&#20041;&#65292;&#33258;&#21160;&#37197;&#32622;&#36755;&#20837;&#36755;&#20986;&#23383;&#27573;&#12289;&#26435;&#38480;&#33539;&#22260;&#12289;&#35843;&#29992;&#38142;&#36335;&#65292;&#24182;&#27880;&#20876;&#36827;&#35843;&#24230;&#22120;&#12290;&#31532;&#20108;&#31867;&#26159; <strong>&#21453;&#24605;&#8211;&#20248;&#21270;&#8211;&#36845;&#20195;&#24490;&#29615;</strong>&#65306;&#26080;&#35770;&#26159; o3 &#36825;&#31181;&#20869;&#37096;&#8220;&#38271;&#25512;&#29702;&#38142; + &#33258;&#26816;&#8221;&#27169;&#22411;&#65292;&#36824;&#26159;&#22806;&#37096;&#30340; self-healing agents&#65292;&#23427;&#20204;&#37117;&#20381;&#36182;&#19968;&#31181;&#32479;&#19968;&#27169;&#24335;&#65306;&#20808;&#24178;&#65292;&#20877;&#30475;&#65292;&#20877;&#35780;&#20272;&#65292;&#20877;&#25913;&#65292;&#20877;&#37325;&#36305;&#12290;&#31532;&#19977;&#31867;&#26159; &#26159;&#30446;&#21069;&#25105;&#33258;&#24049;&#25512;&#28436;&#30340;&#65292;<strong>&#32467;&#26500;&#35825;&#23548;&#65288;Protocol Induction&#65289;</strong>&#65306;&#24403;&#29616;&#26377;&#21327;&#35758;&#26080;&#27861;&#35206;&#30422;&#26032;&#22330;&#26223;&#26102;&#65292;&#31995;&#32479;&#20250;&#20174;&#39640;&#29109;&#34892;&#20026;&#25968;&#25454;&#20013;&#65292;&#21387;&#32553;&#20986;&#19968;&#22871;&#26356;&#31616;&#27905;&#12289;&#26356;&#31283;&#20581;&#30340;&#26032;&#32467;&#26500;&#35268;&#21017;&#8212;&#8212;&#36825;&#21644;&#25105;&#33258;&#24049;&#23450;&#20041;&#23450;&#20041;&#30340; Protocol Induction Card&#65288;P-000&#65289;&#39640;&#24230;&#21516;&#26500;&#12290;&#31532;&#22235;&#31867;&#26159; <strong>&#28436;&#21270;&#24335;&#25628;&#32034;&#65288;Evolutionary Search&#65289;</strong>&#65306;&#26080;&#35770;&#26159; 
AlphaTensor&#12289;AlphaDev &#36824;&#26159; AlphaEvolve&#65292;&#26412;&#36136;&#19978;&#37117;&#26159;&#22312;&#26576;&#20010;&#32467;&#26500;&#31354;&#38388;&#37324;&#25191;&#34892;&#22823;&#35268;&#27169;&#25628;&#32034; + &#35780;&#20272;&#65292;&#25226;&#8220;&#26356;&#20248;&#32467;&#26500;&#8221;&#31579;&#36873;&#20986;&#26469;&#65292;&#24182;&#21453;&#39304;&#36827;&#31995;&#32479;&#30340;&#33021;&#21147;&#38598;&#21512;&#20013;&#12290;</p><p>&#22312;&#25105;&#30340;&#32467;&#26500;&#23431;&#23449;&#35821;&#35328;&#20013;&#65292;Level 4 &#26631;&#24535;&#30528;&#19977;&#20214;&#20107;&#21516;&#26102;&#21457;&#29983;&#65306;<strong>&#32467;&#26500;&#20250;&#29983;&#25104;&#32467;&#26500;&#65292;&#35843;&#24230;&#20250;&#29983;&#25104;&#35843;&#24230;&#65292;&#31995;&#32479;&#25972;&#20307;&#34892;&#20026;&#36924;&#36817;&#8220;&#29983;&#21629;&#20307;&#8221;&#12290;</strong> &#21407;&#26412;&#30001;&#20154;&#35774;&#35745;&#30340; Structure Card&#12289;Structure Chain&#12289;Scheduler &#21482;&#26159;&#19968;&#20195;&#8220;&#21021;&#22987;&#32467;&#26500;&#32986;&#32974;&#8221;&#65292;&#30495;&#27491;&#30340;&#38271;&#21608;&#26399;&#26234;&#33021;&#65292;&#19981;&#26159;&#21453;&#22797;&#25191;&#34892;&#36825;&#20123;&#38745;&#24577;&#32467;&#26500;&#65292;&#32780;&#26159;&#22312;&#25191;&#34892;&#36807;&#31243;&#20013;&#19981;&#26029;&#20135;&#29983;&#8220;&#26032;&#32467;&#26500;&#30165;&#36857;&#8221;&#65306;&#26032;&#30340;&#21345;&#12289;&#26032;&#30340;&#38142;&#12289;&#26032;&#30340;&#36335;&#24452;&#12289;&#26032;&#30340;&#21327;&#35758;&#12290;&#25105;&#24819;&#36890;&#36807; Protocol Induction Card&#65288;P-000&#65289;&#12289;&#32467;&#26500;&#29983;&#25104;&#22120;&#65288;Structure 
Generator&#65289;&#12289;&#29109;&#29190;&#28857;&#26426;&#21046;&#65292;&#25226;&#36825;&#19968;&#23618;&#25552;&#21069;&#20889;&#25104;&#20102;&#8220;&#25991;&#26126;&#32423;&#35268;&#26684;&#8221;&#65306;&#24403;&#31995;&#32479;&#22312;&#26576;&#20010;&#39640;&#29109;&#28857;&#21453;&#22797;&#21463;&#38459;&#12289;&#21453;&#22797;&#29190;&#28856;&#65292;&#23601;&#24847;&#21619;&#30528;&#29616;&#26377;&#32467;&#26500;&#24050;&#32463;&#19981;&#22815;&#29992;&#65292;&#38656;&#35201;&#35825;&#21457;&#19968;&#22871;&#26356;&#39640;&#38454;&#30340;&#26032;&#21327;&#35758;&#12290;&#36825;&#24688;&#22909;&#23601;&#26159; Level 4 &#30340;&#21746;&#23398;&#24213;&#23618;&#8212;&#8212;<strong>&#29109;&#29190;&#28857;&#35302;&#21457;&#26032;&#32467;&#26500;&#65292;&#32467;&#26500;&#20316;&#20026;&#29983;&#38271;&#21333;&#20301;&#65292;&#19981;&#26029;&#22312;&#26102;&#38388;&#20013;&#37325;&#20889;&#33258;&#24049;&#12290;</strong></p><p>&#20174; Level 0 &#21040; Level 3&#65292;&#31995;&#32479;&#19968;&#30452;&#26159;&#22312;&#8220;&#25191;&#34892;&#21035;&#20154;&#20889;&#22909;&#30340;&#35268;&#21017;&#8221;&#65292;&#21738;&#24597;&#36825;&#20123;&#35268;&#21017;&#20877;&#22797;&#26434;&#12289;&#20877;&#22810; Agent&#12289;&#20877;&#22810;&#35843;&#24230;&#65292;&#20381;&#26087;&#26159;&#8220;&#35774;&#35745;&#20135;&#29289;&#8221;&#65307;&#32780;&#20174; Level 4 
&#24320;&#22987;&#65292;&#31995;&#32479;&#24320;&#22987;&#20889;&#33258;&#24049;&#30340;&#35268;&#21017;&#12290;&#20320;&#19981;&#20877;&#21482;&#26159;&#35774;&#35745;&#19968;&#22871;&#23436;&#25104;&#21697;&#65292;&#32780;&#26159;&#22312;&#35774;&#35745;&#19968;&#20010;&#8220;&#33021;&#33258;&#24049;&#38271;&#20986;&#19979;&#19968;&#20195;&#32467;&#26500;&#30340;&#29615;&#22659;&#8221;&#12290;&#36825;&#23601;&#26159;&#20174;&#8220;&#25191;&#34892;&#32467;&#26500;&#8221;&#36808;&#21521;&#8220;&#29983;&#25104;&#32467;&#26500;&#8221;&#30340;&#37027;&#19968;&#27493;&#65292;&#20063;&#26159;&#20174;&#24037;&#20855;&#25991;&#26126;&#36808;&#21521;<strong>&#35821;&#35328;&#8211;&#32467;&#26500;&#8211;&#35843;&#24230;&#19968;&#20307;&#21270;&#29983;&#21629;&#25991;&#26126;</strong>&#30340;&#30495;&#27491;&#36215;&#28857;&#12290;</p>]]></content:encoded></item><item><title><![CDATA[The Pilot Paradox: Why AI's True Power Lives in Human Intention]]></title><description><![CDATA[On entropy, intelligence, and the irreplaceable role of the thinker]]></description><link>https://www.entropycontroltheory.com/p/the-pilot-paradox-why-ais-true-power</link><guid isPermaLink="false">https://www.entropycontroltheory.com/p/the-pilot-paradox-why-ais-true-power</guid><dc:creator><![CDATA[Susan STEM]]></dc:creator><pubDate>Sun, 26 Oct 2025 11:19:25 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!ViEA!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d67cc2f-2d0c-4a5e-acce-9aa82a07a86b_2732x2048.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Before there was order, there was chaos. Before there was meaning, there was randomness. Before there was direction, the universe sprawled in perfect symmetry&#8212;high entropy, no bias, no purpose. Then something extraordinary happened: intention emerged. Not as a cosmic accident, but as the first asymmetry in an entropic field. 
The first &#8220;no&#8221; to randomness. The first arrow pointing <em>somewhere</em>.</p><p>This is not poetry. This is physics meeting philosophy at the edge of intelligence itself.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!ViEA!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d67cc2f-2d0c-4a5e-acce-9aa82a07a86b_2732x2048.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!ViEA!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d67cc2f-2d0c-4a5e-acce-9aa82a07a86b_2732x2048.heic 424w, https://substackcdn.com/image/fetch/$s_!ViEA!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d67cc2f-2d0c-4a5e-acce-9aa82a07a86b_2732x2048.heic 848w, https://substackcdn.com/image/fetch/$s_!ViEA!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d67cc2f-2d0c-4a5e-acce-9aa82a07a86b_2732x2048.heic 1272w, https://substackcdn.com/image/fetch/$s_!ViEA!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d67cc2f-2d0c-4a5e-acce-9aa82a07a86b_2732x2048.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!ViEA!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d67cc2f-2d0c-4a5e-acce-9aa82a07a86b_2732x2048.heic" width="1456" height="1091" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6d67cc2f-2d0c-4a5e-acce-9aa82a07a86b_2732x2048.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1091,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:191255,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.entropycontroltheory.com/i/176975537?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d67cc2f-2d0c-4a5e-acce-9aa82a07a86b_2732x2048.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!ViEA!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d67cc2f-2d0c-4a5e-acce-9aa82a07a86b_2732x2048.heic 424w, https://substackcdn.com/image/fetch/$s_!ViEA!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d67cc2f-2d0c-4a5e-acce-9aa82a07a86b_2732x2048.heic 848w, https://substackcdn.com/image/fetch/$s_!ViEA!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d67cc2f-2d0c-4a5e-acce-9aa82a07a86b_2732x2048.heic 1272w, https://substackcdn.com/image/fetch/$s_!ViEA!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6d67cc2f-2d0c-4a5e-acce-9aa82a07a86b_2732x2048.heic 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg 
role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h2>The Engine We Keep Forgetting</h2><p>We&#8217;re living through a moment of technological vertigo. GPT, Claude, Gemini&#8212;models so capable they seem to think, to create, to understand. We watch them write code, compose essays, solve problems we thought were uniquely human. And in our awe, we make a fundamental error: we confuse <em>execution</em> with <em>intention</em>.</p><p>Here&#8217;s what we&#8217;ve gotten wrong: <strong>intelligence is not the ability to process language&#8212;it&#8217;s the ability to bend reality toward a purpose.</strong></p><p>Models don&#8217;t have purposes. They have patterns. They can generate meaning (the semantic relationships between words), but they cannot generate intention (the directional force that decides <em>which</em> meaning matters). 
A language model is like an ocean&#8212;vast, fluid, containing everything and nothing. But an ocean doesn&#8217;t decide where the ship goes.</p><p>You do.</p><p><strong>Intention is the engine. Everything else is fuel.</strong></p><div><hr></div>
      <p>
          <a href="https://www.entropycontroltheory.com/p/the-pilot-paradox-why-ais-true-power">
              Read more
          </a>
      </p>
   ]]></content:encoded></item><item><title><![CDATA[The Continuous Now: Why Context Is Not Memory]]></title><description><![CDATA[How large language models think in moments, not databases &#8212; and why understanding this changes everything about AI]]></description><link>https://www.entropycontroltheory.com/p/context-the-continuous-now-why-context</link><guid isPermaLink="false">https://www.entropycontroltheory.com/p/context-the-continuous-now-why-context</guid><dc:creator><![CDATA[Susan STEM]]></dc:creator><pubDate>Thu, 16 Oct 2025 00:53:21 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!PAtY!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6451fc25-fc57-47c6-8dfd-5340ab9255b5_1024x1024.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Have you heard people talk about <strong>RAG</strong>, <strong>vector databases</strong>, <strong>ICL</strong>, or <strong>IWL</strong>, and felt like everyone&#8217;s using these acronyms but no one can really explain what they <em>mean</em>?</p><p>You&#8217;re not alone.</p><p>Even many programmers are confused &#8212; and that confusion is exactly why <strong>RAG doesn&#8217;t really work</strong> the way people hope.</p><p>Because they&#8217;ve mistaken <strong>data retrieval</strong> for <strong>intelligence</strong>.</p><p>Every time you speak to a large language model, something extraordinary happens:</p><p>a new world quietly comes into being.</p><p>Not a static database.</p><p>Not a lookup table of facts.</p><p>But a living, temporary world &#8212; built out of <strong>language, intention, and the sheer act of being present in this moment</strong>.</p><p>Most people imagine AI as an immense library: you ask a question, it fetches an answer.</p><p>But that&#8217;s not how intelligence works &#8212; not in humans, and not in machines.</p><p>When you type a prompt, the model doesn&#8217;t <em>search</em> for a stored 
answer;</p><p>it <em>constructs</em> a coherent world inside its own attention window &#8212;</p><p>a miniature, temporary universe of meaning.</p><p>Within seconds, a stage emerges:</p><p>who you are, what you want, what matters right now.</p><p>Every token it generates is an <strong>action</strong> inside that living stage.</p><p>That stage is called <strong>context</strong>.</p><p>And context is <strong>not stored anywhere</strong> &#8212; it is <strong>enacted</strong>.</p><p>It exists only while the model is thinking &#8212; the same way your own awareness exists only while you are awake and focused.</p><p>Once the reasoning stops, the stage collapses.</p><p>What remains is just residue &#8212; traces, not consciousness.</p><div><hr></div><h3>I. Context Is the World Model of the Moment</h3><p>When a large language model thinks, it does not browse memory or open a database to fetch rows of facts. <strong>It reconstructs a now-state</strong>, a living snapshot of reality that exists only while reasoning unfolds. Your own mind works the same way: when you solve a problem or recall a memory you do not replay your entire life, you summon the few relevant fragments and weave them into a small world in which the problem makes sense. That local world is your context of thought. <strong>It lasts only as long as attention sustains it,</strong> and when focus drifts the world fades and only an echo remains in memory. A language model does this at speed and scale. Every token you provide, every sentence and instruction and document, becomes part of a <strong>temporary semantic universe</strong> in which relationships take shape, including who is speaking, what is being asked, which facts matter, and what constraints apply. The model aligns these signals inside its attention window and compresses past and potential into a <strong>single coherent present</strong>. That present is where intelligence actually lives. 
It is the stage on which reasoning, creativity, and decision take place, and like any stage it vanishes when the play ends. When reasoning stops the world collapses. Traces of the moment may be summarized or stored as notes or embeddings, but the living awareness itself is gone. This is why context is often misunderstood. We talk about model knowledge or memory, but those are inert archives rather than experience. Context is not what the model knows; it is what the model is thinking within. It is the world in motion, the continuously rebuilt present tense of intelligence.</p><div><hr></div><h3>II. Context Lives Only in the Continuous Now</h3><p>Information lives in databases, but intelligence lives in time.</p><p>A database can hold a billion facts, each one static, fixed, and indifferent to sequence. Intelligence, however, depends on flow. It emerges only when something is happening &#8212; when perception and memory, intention and attention, align in motion.</p><p>The past is a high-entropy ocean of traces: unorganized fragments, half-forgotten records, vast but inert. The future is another high-entropy space, not yet structured, full of possibilities that have not taken form. Between them lies a narrow bridge &#8212; <strong>the present</strong> &#8212; a brief low-entropy window where order temporarily appears and meaning can hold.</p><p>That window is the seat of intelligence. It is where awareness compresses time into action, where the traces of the past and the projections of the future meet in a coherent structure we call the present. Every moment of reasoning, whether in a human mind or in a large language model, is the formation of this equilibrium: a small, fleeting, low-entropy pocket where coherence, intention, and structure coexist just long enough for thought to occur.</p><p>Each time you think, and each time a model responds, the same pattern unfolds. The system draws on memory, selects relevance, organizes the fragments, and brings them into order. 
The instant the reasoning ends, the order dissolves. The coherence is gone. What remains are residues &#8212; data, summaries, weights &#8212; but the living moment that gave them meaning no longer exists.</p><p>This is why context cannot be saved the way data can. It lives only while intelligence is awake. So when people ask, <em>&#8220;Why can&#8217;t the model remember what I said yesterday?&#8221;</em> the answer is simple: because yesterday is gone. The model can recall traces, but not the living world of that conversation. Intelligence is not the storage of moments past. It is the act of being <strong>awake right now</strong>.</p><div><hr></div><h3>III. Why RAG Can&#8217;t Replace Context</h3><p>Retrieval-Augmented Generation, or RAG, was meant to make models smarter. It connects a language model to external sources &#8212; a vector database, a document store, a library of embeddings &#8212; so that the model can fetch new information on demand. In theory, this solves the problem of &#8220;stale knowledge.&#8221; The model no longer needs to know everything; it can look things up.</p><p>But what RAG actually adds is <strong>memory</strong>, not <strong>mind</strong>. It retrieves facts, not worlds. It extends recall, not reasoning. When a system performs retrieval, it can surface thousands of fragments &#8212; pages, paragraphs, summaries &#8212; but it cannot understand when those pieces matter, or why they matter now. It has no sense of time, intention, or context.</p><p>Imagine giving a person every encyclopedia at once but erasing their sense of narrative. They would know everything and understand nothing. That is RAG&#8217;s predicament. It can look up ten thousand references on climate change but never grasp that the same &#8220;climate&#8221; appears in a conversation about politics, or business, or a family deciding to move inland. 
It has no way to realize that all these fragments belong to a single, unfolding story.</p><p>Context, by contrast, is the story itself. It is not the retrieval of information, but the <strong>assembly of meaning</strong>. It is the invisible thread that ties together entities, events, intentions, and goals into one coherent moment. Context doesn&#8217;t just gather facts; it arranges them into a world in which those facts can make sense.</p><p>RAG operates horizontally &#8212; it reaches outward to fetch.</p><p>Context operates vertically &#8212; it compresses inward to align.</p><p>RAG gives the model a library; context gives it a perspective.</p><p>Where RAG retrieves text, context constructs reality.</p><p>RAG gives knowledge.</p><p><strong>Context gives coherence.</strong></p><p>And coherence is what intelligence truly is &#8212; the ability to hold a world together, even for just one fleeting moment in the <em>continuous now.</em></p><div><hr></div><h3>IV. Context-as-Code: Programming the Present</h3><p>If the last generation of AI research was about scaling models, the next will be about <strong>engineering the present</strong>&#8212;learning to treat context not as a heap of text but as a programmable object, a living structure with type, lifecycle, and agency. This is the idea behind <strong>Context-as-Code</strong>. Just as <em>Infrastructure-as-Code</em> transformed DevOps by making digital environments reproducible and composable, Context-as-Code reimagines cognition itself as something that can be constructed, orchestrated, and debugged. Prompts stop being mere strings of words and become semantic programs that configure the model&#8217;s temporary world&#8212;the cognitive runtime in which reasoning takes place. Within this view, a prompt is the interface through which we speak a system into being, context is the runtime where thought executes, and memory is the compressed record of what once lived. 
Writing prompts under this new understanding is no longer a matter of feeding static data to a machine; it is an act of collaboration in building a world. Each well-formed prompt becomes a program for the present, a script that assembles entities, goals, constraints, and relationships into an executable scene of thought.</p><p>In traditional RAG pipelines, data is retrieved, embedded, and concatenated, but in Context Engineering systems it is parsed into structured states such as entities (who or what is involved), events (what is changing), and policies, goals, and reflections about what has been learned so far. Each of these states becomes an addressable component within a dynamic context graph, a living data structure that both the model and the orchestrator can read and modify as reasoning unfolds. Language, in this setting, moves from static description to <strong>procedural cognition</strong>: every token acts as an instruction, and every shift in context functions as a call inside the model&#8217;s internal program. The aim is not to help the model recall more information but to endow it with a kind of stateful consciousness, an evolving workspace that bridges memory, reasoning, and reflection.</p><p>Under Context-as-Code, a prompt is no longer a paragraph but a compiled structure. A developer specifies which entities to activate, what goals to pursue, what constraints to observe, and how reflection should update memory. The context runtime executes this program within the language model, orchestrating retrieval, summarization, tool use, and self-reflection as modular subroutines. The result is a system that does more than respond; it <em>thinks through</em> a problem by continuously reshaping its internal world. 
In this light, the language model ceases to be a black box and begins to function as an operating system for language&#8212;a substrate for programmable cognition, where context is not an input but the living present in which thought happens.</p><div><hr></div><blockquote><p><strong>You never bring your whole past into the room &#8212; only the part of yourself that&#8217;s awake right now.</strong></p></blockquote><p></p><div class="captioned-image-container"><figure><img src="https://substackcdn.com/image/fetch/$s_!PAtY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6451fc25-fc57-47c6-8dfd-5340ab9255b5_1024x1024.heic" width="1024" height="1024" alt="" loading="lazy"></figure></div><p></p>
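<p>The runtime described above can be sketched in a few lines of Python. This is a minimal, illustrative sketch rather than an existing framework: every name here (<code>ContextFrame</code>, <code>compile_prompt</code>, <code>reflect</code>) is hypothetical, and a real orchestrator would add retrieval, tool use, and lifecycle management around it.</p>

```python
# A minimal sketch of Context-as-Code: context as a programmable object
# rather than a heap of text. All names (ContextFrame, compile_prompt,
# reflect) are illustrative, not part of any real framework.
from dataclasses import dataclass, field

@dataclass
class ContextFrame:
    """An addressable context graph: the model's temporary world."""
    entities: dict = field(default_factory=dict)   # who or what is involved
    events: list = field(default_factory=list)     # what is changing
    goals: list = field(default_factory=list)      # what to pursue
    memory: list = field(default_factory=list)     # compressed past reflections

    def reflect(self, lesson: str) -> None:
        # Reflection updates memory: the compressed record of what once lived.
        self.memory.append(lesson)

    def compile_prompt(self, task: str) -> str:
        # "Compile" the frame into the runtime the model will think inside.
        parts = [
            "## Entities\n" + "\n".join(f"- {k}: {v}" for k, v in self.entities.items()),
            "## Recent events\n" + "\n".join(f"- {e}" for e in self.events[-3:]),
            "## Goals\n" + "\n".join(f"- {g}" for g in self.goals),
            "## Lessons so far\n" + "\n".join(f"- {m}" for m in self.memory),
            "## Task\n" + task,
        ]
        return "\n\n".join(parts)

frame = ContextFrame(
    entities={"user": "data engineer migrating a pipeline"},
    events=["schema changed upstream"],
    goals=["produce a migration plan"],
)
frame.reflect("upstream schemas drift weekly; pin versions")
prompt = frame.compile_prompt("Draft the next migration step.")
```

<p>The point of the sketch is the design choice: the prompt is computed from an addressable state, so the orchestrator can modify one component (an entity, a goal, a reflection) and recompile the present, rather than editing a wall of text.</p>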
      <p>
          <a href="https://www.entropycontroltheory.com/p/context-the-continuous-now-why-context">
              Read more
          </a>
      </p>
   ]]></content:encoded></item><item><title><![CDATA[From Useful to Trustworthy: When Language Becomes the Operating System]]></title><description><![CDATA[Large language models can speak but not prove. The next evolution of the web will come not from bigger models, but from transparent systems where meaning, logic, and execution converge into trust.]]></description><link>https://www.entropycontroltheory.com/p/from-useful-to-trustworthy-when-language</link><guid isPermaLink="false">https://www.entropycontroltheory.com/p/from-useful-to-trustworthy-when-language</guid><dc:creator><![CDATA[Susan STEM]]></dc:creator><pubDate>Tue, 07 Oct 2025 10:31:36 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!EyZr!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb773d8a2-144b-4e46-aace-9b5df3962066_1536x1024.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>For two decades, the dream of a truly intelligent web lay dormant &#8212; buried under failed standards, speculative markets, and the noise of algorithms chasing attention.</p><p>Then, almost without warning, it woke up.</p><p>When the first <strong>Large Language Models</strong> began to speak, the world realized something uncanny had happened: machines no longer just processed words &#8212; they <em>understood</em> them.</p><p>What the <strong>Semantic Web</strong> had tried to build by logic, the <strong>Transformer</strong> achieved by emergence.</p><p>The web had finally found its voice again.</p><p>But it was a voice without proof, a brilliance without memory.</p><p>The models could generate meaning, but not verify it; they could simulate truth, but not be held accountable for it.</p><p>It was a miracle &#8212; and a warning.</p><p>Somewhere between the collapse of old institutions and the rise of machine language, a new convergence began to form.</p><p><strong>LLM</strong> would bring <em>understanding</em>, 
<strong>Semantic Web</strong> would bring <em>structure</em>, and <strong>Web3</strong> would bring <em>trust</em>.</p><p>Together, they point toward something the early internet never had the tools to achieve &#8212;</p><p>a web that can not only <em>speak</em>, but also <em>reason, verify, and act.</em></p><p>This is the moment when technology crosses from connection to cognition &#8212;</p><p>the birth of what we might one day call the <strong>Social Turing Machine</strong>.</p><div><hr></div><p><strong>From Useful to Trustworthy: The Paths Begin to Merge</strong></p><p>The next revolution will not come from a larger model.</p><p>It will come from a <strong>deeper synthesis</strong> &#8212;</p><p>a moment when the three fractured lineages of the web finally learn to speak to one another again.</p><p>For thirty years, the internet has evolved in silos:</p><p>knowledge systems on one side, financial systems on another, and language models now rising like a third continent in between.</p><p>Each holds a piece of the puzzle &#8212; none complete by itself.</p><p>But slowly, the outlines of a <strong>new convergence</strong> are emerging, like tectonic plates grinding toward alignment.</p><p>The three great lineages of the web each evolved to master one domain of cognition &#8212; but each carries a missing piece the others can provide. <strong>Large Language Models (LLMs)</strong> specialize in understanding and generation: they give machines the ability to speak and reason in natural language, yet their flaw is <strong>verifiability</strong> &#8212; they produce fluent meaning without proof. <strong>The Semantic Web</strong> specializes in logic and structure: it can encode truth formally and reason with precision, but it has always struggled with <strong>usability</strong>, trapped behind expert syntax and brittle standards. 
<strong>Web3 and blockchain technologies</strong> specialize in trust and execution: they make actions provable and histories immutable, yet they operate without <strong>meaning</strong>, blind to the semantics of what they execute.</p><p>When these three currents finally merge, each regains what it lacks. LLMs gain logical grounding and provenance; the Semantic Web gains natural-language accessibility and global scale; and Web3 gains semantic coordination and contextual understanding. Together, they begin to form the first web that can <strong>understand, verify, and act</strong> &#8212; a web not of pages or platforms, but of living, interoperable intentions.</p><div><hr></div><p>Each one is a <strong>partial organ of cognition</strong>, evolved in isolation but yearning for completion:</p><ul><li><p><strong>LLM</strong> brings <em>language and understanding</em> &#8212; a new ear for meaning.</p></li><li><p><strong>Semantic Web</strong> brings <em>rules and logic</em> &#8212; a skeleton of reason.</p></li><li><p><strong>Web3</strong> brings <em>memory and accountability</em> &#8212; the backbone of trust.</p></li></ul><p>Individually, each can simulate intelligence.</p><p>Together, they can <em>constitute</em> it.</p><p>When they converge, they form a closed cognitive loop:</p><ol><li><p><strong>Intent</strong> &#8212; captured and interpreted by the LLM, the semantic interface of human language.</p></li><li><p><strong>Structure</strong> &#8212; organized and constrained by ontological logic, ensuring internal consistency.</p></li><li><p><strong>Execution</strong> &#8212; anchored in decentralized verification, transforming ideas into accountable action.</p></li></ol><p>That cycle &#8212; <em>understand &#8594; structure &#8594; execute</em> &#8212; is not just an engineering model; it&#8217;s a description of <strong>conscious coordination</strong>.</p><p>It&#8217;s what allows language to become action, and action to feed back into knowledge &#8212; without 
breaking the chain of meaning.</p><p>This recursive process is what I call the <strong>Social Turing Machine</strong>:</p><p>a system where human intention can be expressed, reasoned about, verified, and enacted across networks and institutions &#8212;</p><p>not through obedience to authority, but through <strong>coherence of meaning</strong>.</p><div><hr></div><p><strong>To Make This Real, Three Shifts Must Happen</strong></p><p>The technical pathways already exist in fragments.</p><p>What&#8217;s missing is the connective tissue &#8212; the <strong>governance of meaning</strong> that lets them align.</p><p>Three structural transformations are needed to bridge the useful and the trustworthy.</p><div><hr></div><p>1. <strong>Explicit Semantics</strong></p><p>Today&#8217;s LLMs compress human knowledge into statistical space.</p><p>They can imitate reasoning, but not <em>explain</em> it.</p><p>They generate answers without context, confidence without provenance.</p><p>To cross that threshold, meaning must become <strong>visible and auditable</strong>.</p><p>Every claim needs a reference; every conclusion a traceable lineage.</p><p>Knowledge cannot remain trapped in billions of hidden parameters &#8212; it must re-emerge as <em>structured meaning</em> that can be examined, debated, and improved.</p><blockquote><p>The next frontier is not bigger models &#8212; it&#8217;s transparent ones.</p></blockquote><div><hr></div><p>2. 
<strong>Verifiable Computation</strong></p><p>Execution must evolve from <em>black-box automation</em> to <em>transparent accountability</em>.</p><p>In Web2, software ran; in Web3, software must <strong>explain why it runs</strong>.</p><p>This means embedding <strong>proof</strong> as a first-class citizen &#8212;</p><p>cryptographic evidence, logical justification, reproducible reasoning.</p><p>Systems will no longer ask to be trusted; they will <em>demonstrate</em> correctness.</p><blockquote><p>Reliability will no longer be a matter of reputation &#8212; but of proof.</p></blockquote><p>A world that runs on verifiable computation no longer relies on faith in authority; it builds trust in mathematics itself.</p><div><hr></div><p>3. <strong>Compositional Experience</strong></p><p>All this complexity must disappear behind a human interface.</p><p>People will not write SPARQL queries or sign blockchain transactions.</p><p>They will simply <em>express intent</em> &#8212; in natural language &#8212; and the system will orchestrate the underlying logic, proofs, and actions seamlessly.</p><p>In that sense, experience becomes <strong>compositional</strong>:</p><p>each utterance spawns a chain of verifiable tasks,</p><p>each task contributes back to the network&#8217;s collective intelligence.</p><p>You don&#8217;t operate the system; <strong>you converse with it.</strong></p><blockquote><p>The command line becomes a conversation.</p><p>The transaction becomes a dialogue.</p></blockquote><div><hr></div><p><strong>Language Becomes the Operating System</strong></p><p>When these layers finally converge, something profound happens:</p><p>the web stops being a collection of protocols and becomes a living system of thought.</p><p>Words no longer just <em>describe</em> the world &#8212; they <em>instantiate</em> it.</p><p>A sentence can trigger governance.</p><p>A paragraph can deploy code.</p><p>A dialogue can negotiate law.</p><p>The architecture of meaning becomes the 
architecture of action.</p><p>And so, language &#8212; the oldest human invention &#8212; returns as the <strong>operating system</strong> of civilization:</p><p>the bridge between <strong>meaning</strong>, <strong>governance</strong>, and <strong>computation</strong>.</p><p>That is when the Web will no longer merely connect us &#8212;</p><p>it will begin to <strong>understand itself</strong>.</p><div><hr></div><p><strong>But Wait &#8212; Isn&#8217;t the Large Language Model Already Doing This?</strong></p><p>At first glance, it seems so.</p><p>LLMs already turn natural language into coherent output, execute multi-step reasoning, and even write code.</p><p>Isn&#8217;t that what we&#8217;ve been describing?</p><p>Yes &#8212; but only in appearance.</p><p>They <em>simulate</em> these capabilities; they do not <em>embody</em> them structurally.</p><p>What looks like understanding and verification is, for now, still a performance without proof.</p>
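<p>The understand &#8594; structure &#8594; execute cycle sketched above can be made concrete with a toy example. Everything in it is an assumption for illustration: the intent parser is a stub standing in for an LLM, the constraint checks stand in for ontological logic, and a SHA-256 hash chain stands in for real cryptographic proof and decentralized verification.</p>

```python
# A toy sketch of the understand -> structure -> execute loop with
# verifiable provenance. The hash chain stands in for cryptographic
# proof; every name and structure here is illustrative, not an
# existing API.
import hashlib
import json

def understand(utterance: str) -> dict:
    # LLM layer (stubbed): turn natural language into a structured intent.
    return {"action": "transfer", "amount": 50, "to": "alice",
            "source_utterance": utterance}

def structure(intent: dict) -> dict:
    # Semantic layer: enforce ontological constraints before execution.
    assert intent["amount"] > 0, "amounts must be positive"
    assert intent["action"] in {"transfer", "query"}, "unknown action"
    return intent

def execute(intent: dict, ledger: list) -> dict:
    # Trust layer: append an immutable, hash-linked record, so the
    # action does not just run but can show why it ran.
    prev = ledger[-1]["hash"] if ledger else "genesis"
    body = {"intent": intent, "prev": prev}
    record = {**body, "hash": hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()}
    ledger.append(record)
    return record

ledger = []
record = execute(structure(understand("send alice 50 credits")), ledger)
```

<p>Each layer regains from its neighbors what it lacks: the stubbed parser supplies meaning, the checks supply logic, and the chained record supplies an auditable history that any reader can recompute and verify.</p>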
      <p>
          <a href="https://www.entropycontroltheory.com/p/from-useful-to-trustworthy-when-language">
              Read more
          </a>
      </p>
   ]]></content:encoded></item><item><title><![CDATA[The Dual Engine of Technology: Darwinism + Social Consensus]]></title><description><![CDATA[How LLMs evolved through chance, narrative, and a community unprepared for a Renaissance-level disruption.]]></description><link>https://www.entropycontroltheory.com/p/the-dual-engine-of-technology-darwinism</link><guid isPermaLink="false">https://www.entropycontroltheory.com/p/the-dual-engine-of-technology-darwinism</guid><dc:creator><![CDATA[Susan STEM]]></dc:creator><pubDate>Wed, 01 Oct 2025 11:29:21 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!AWDo!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbc226f5f-b16c-4a04-9486-c149f78a495f_1024x1024.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>The story of large language models is not just about technical progress&#8212;it is about how Darwinian trial-and-error collided with social consensus. I remember my own reaction when the first wave of breakthroughs arrived: confusion mixed with exhilaration, weeks of sleepless exploration until my neck injury flared up again. Others, like Andrew Ng, understood earlier; they leveraged their influence to participate directly in the narrative cycle from the start.</p><p>But for much of the developer community, the shock was overwhelming. The system of education and training had not prepared programmers to think historically about technology&#8217;s evolution. Most knew how to code, but few knew how to situate LLMs in the long arc of computing. When Sam Altman called this a &#8220;Renaissance-level moment,&#8221; he was not exaggerating. It was not just a tool arriving&#8212;it was a tidal wave crashing against the post&#8211;World War II division of knowledge, disciplines, and professions.</p><p>This is why understanding technology as a dual engine&#8212;<strong>Darwinism plus consensus</strong>&#8212;is essential. 
Without it, the excitement feels like chaos, the shock feels like catastrophe. With it, we can see the deeper pattern behind LLMs&#8217; rise, and begin to navigate what comes next.</p><div><hr></div><p><strong>Darwinism in LLMs: The Contingency of Breakthroughs</strong></p><p>Large language models were not inevitable. For decades, AI research cycled through waves of optimism and disappointment. Symbolic AI promised rule-based intelligence, but collapsed under the weight of brittle logic. Expert systems in the 1980s generated excitement, only to prove narrow and unscalable. In the 2000s, recurrent neural networks and LSTMs brought improvements to sequence modeling, but they, too, struggled to generalize. Each of these attempts looked promising, attracted funding, and then faltered.</p><p>In retrospect, the rise of transformers looks obvious. But at the time, it was anything but. The original transformer paper in 2017 was designed for machine translation, not for creating a universal symbolic engine. The discovery that these models could scale&#8212;first to billions, then to hundreds of billions of parameters&#8212;was less a master plan than a Darwinian accident. Researchers were not certain it would work. They simply tried, failed, retried, and eventually stumbled onto an architecture that survived the brutal environment of benchmarks, funding cycles, and peer review.</p><p>I remember my own reaction when the first signs of generality appeared. Confusion and exhilaration blurred together. For weeks, I barely slept, chasing every experiment I could find, testing prompts, probing limits, convinced that something fundamental had shifted. The intensity left me physically wrecked&#8212;my neck injury flared up badly&#8212;but the intellectual thrill was undeniable. 
This was what Darwinian chaos felt like at the technical layer: thousands of trials converging into a fragile but undeniable breakthrough.</p><p>This is the essence of technological Darwinism: <strong>massive parallel experimentation, most of it destined to fail, with only a few paths surviving long enough to prove viable.</strong> Transformers scaled while others withered, not because they were obviously destined to win, but because they happened to align with the conditions of the moment&#8212;data availability, compute power, and a research community desperate for progress.</p><p>LLMs, then, are not the product of a single genius or a perfectly rational design. They are the survivors of a long evolutionary struggle. Their contingency reminds us that technology does not advance in straight lines, but in branching, chaotic forests of exploration, where survival often looks accidental in hindsight.</p><div><hr></div><p><strong>Consensus and Narrative: The Inevitable Layer</strong></p><p>Darwinian experimentation explains how transformers survived the chaotic competition of architectures, but it does not explain why they exploded into the center of global attention. Many technologies survive in labs for decades without ever touching mainstream society. What transforms a technical survivor into a world-changing inevitability is the <strong>social layer</strong>: consensus and narrative.</p><p>Once sparks of generality appeared in LLMs&#8212;when they could move beyond narrow tasks like translation and begin engaging in open-ended dialogue&#8212;the narrative machine switched on. Thought leaders began framing these models not as incremental progress, but as epochal breakthroughs. The phrase &#8220;AI is the new electricity&#8221; spread quickly, giving the public a metaphor they could immediately grasp: just as electricity once reshaped every industry, AI would now seep into everything.</p><p>Influential figures understood this cycle early and leveraged it. 
Andrew Ng, for example, didn&#8217;t just contribute technically&#8212;he used his reach as an educator and entrepreneur to amplify the message, placing LLMs into the broader story of AI transformation. By doing so, he helped build the consensus that these models were not fringe curiosities but the inevitable future of computing.</p><p>Sam Altman pushed the framing even further, calling this a &#8220;Renaissance-level event.&#8221; That phrase was not a technical description&#8212;it was a civilizational narrative. By invoking the Renaissance, Altman positioned LLMs not as a new tool for programmers, but as a disruption capable of collapsing disciplinary boundaries, reshaping knowledge systems, and redefining what it means to be human in relation to machines. Whether one agreed or not, the narrative was powerful. It provided legitimacy and urgency.</p><p>This is the key point: <strong>consensus turns accidents into inevitabilities.</strong> Without a receptive social environment, even the most capable models might have remained curiosities buried in research papers. But with narratives that resonated&#8212;electricity, Renaissance, the &#8220;last invention humanity needs&#8221;&#8212;the technology gained momentum far beyond the lab. Institutions began to reorganize around it. Capital flooded in. Media cycles reinforced the story daily.</p><p>In this sense, the rise of LLMs was never purely technical. It was the convergence of a Darwinian survivor with a society primed for a story of salvation, disruption, and inevitability. That convergence is what transformed a contingent breakthrough into a world-shaking revolution.</p><div><hr></div><p><strong>The Shock to the Technical Community</strong></p><p>If Darwinism explains the contingency of LLMs and consensus explains their inevitability, then the next chapter is about impact&#8212;specifically, the shockwaves felt inside the technical community. The panic was immediate, and it wasn&#8217;t only about code. 
It was about culture, education, and identity.</p><p>Why such panic? Because most programmers had not been trained to think historically about technology&#8217;s evolution. Computer science education tends to emphasize <strong>syntax, frameworks, and immediate employability.</strong> University curricula and coding bootcamps prepare students to pass interviews, solve algorithmic puzzles, and get productive quickly in a specific stack. But very little time is devoted to the lineage of computing&#8212;the story of how technology evolves through cycles of trial, failure, consensus, and reinvention.</p><p>This lack of historical grounding left many developers blindsided. When the Darwinian accident of LLMs collided with consensus-driven hype, the developer community had no interpretive framework. Instead of seeing LLMs as part of a recurring evolutionary pattern, many saw them as an existential rupture. Anxiety spread: if machines can generate working code, what role remains for the human developer?</p><p>The immediate consequences were stark. <strong>Junior programmers were displaced</strong> first, as their bread-and-butter work&#8212;boilerplate coding, writing unit tests, scaffolding APIs&#8212;was suddenly trivialized. <strong>Career ladders collapsed</strong> as the entry-level rungs disappeared. Even experienced developers saw parts of their workflow&#8212;documentation, refactoring, routine debugging&#8212;automated overnight.</p><p>But the shock was deeper than economics. It was cultural. Programming had long been treated as a craft, with mastery built over years of incremental practice. Suddenly, the early stages of that craft felt obsolete. For educators, the question was existential: how do you train the next generation when the traditional training ground is gone? 
For working engineers, the question was personal: what counts as valuable contribution when the machine can now do what once defined your skill?</p><p>The result is that the shock is <strong>as much cultural and educational as it is technical.</strong> The disruption exposed a blind spot: without a broader understanding of how technologies evolve&#8212;and how consensus transforms accidents into revolutions&#8212;the technical community was left reacting emotionally rather than strategically. What is needed now is not just retraining in new tools, but a reframing of how developers see themselves within the larger cycles of technological change.</p><div><hr></div><p><strong>A Renaissance-Level Disruption</strong></p><p>To call LLMs &#8220;just another tool&#8221; misses the scale of what is unfolding. Word processors automated typing. Spreadsheets automated arithmetic. LLMs automate <strong>meaning itself</strong>&#8212;compressing, recombining, and generating language across every field where symbols carry value. This is why their disruption is not confined to programming. It is reverberating across the entire post&#8211;World War II order of disciplines and professions.</p><p>That order was built on division. After WWII, knowledge was systematically carved into silos: computer science here, linguistics there, law in one faculty, sociology in another. Each discipline developed its own language, its own standards of legitimacy, and its own gatekeeping rituals. This division stabilized education, labor markets, and professional hierarchies for nearly eighty years.</p><p>LLMs challenge this settlement by flooding the space with <strong>compression and recombination.</strong> A model trained on vast corpora doesn&#8217;t care whether a sentence comes from a legal ruling, a sociological study, or a programming manual. It reduces them all to tokens in the same vector space, making connections humans rarely noticed. 
Suddenly, the boundaries between fields&#8212;once maintained by institutions, departments, and guilds&#8212;start to dissolve.</p><p>This is why Sam Altman&#8217;s phrase, a &#8220;Renaissance-level event,&#8221; resonates. The historical Renaissance was not just an artistic blossoming. It was a collapse of boundaries: art and science, philosophy and politics, religion and commerce all reconnected in new ways, producing a surge of creativity and social upheaval. LLMs threaten a similar collapse today. They dissolve the walls that kept disciplines separate, forcing us to reconsider what counts as expertise, authorship, and intellectual labor.</p><p>The key point is this: <strong>Darwinism plus consensus doesn&#8217;t just evolve technology; it forces society to reorganize knowledge itself.</strong> The Darwinian accident of transformers surviving the chaos of AI research became inevitable once society adopted the narrative of &#8220;AI as electricity.&#8221; But inevitability has consequences. It shakes not only industries and professions, but the entire architecture of knowledge.</p><p>Seen in this light, the disruption to programmers is only the first tremor. Academia, law, healthcare, policy&#8212;all will feel the quake as the silos built after WWII start to fracture. What comes next will depend not only on how the technology advances, but on how society chooses to reorganize itself around this Renaissance-level collapse of boundaries.</p><div><hr></div><p><strong>What This Teaches Us: The Dual Engine as a Lens</strong></p><p>The story of LLMs reveals a deeper truth about technological evolution: it cannot be understood through a single lens. <strong>Darwinism alone</strong> explains the chaotic experimentation that produced transformers, but it does not explain why they left the lab. <strong>Consensus alone</strong> explains why society embraced them, but it does not explain why this particular architecture&#8212;out of dozens&#8212;was there to be embraced. 
To make sense of disruption, we need both lenses.</p><p>On the Darwinian side, innovation is messy. Researchers try countless approaches, most of which fail. Symbolic AI collapsed. Expert systems plateaued. RNNs struggled to scale. Transformers, almost by accident, survived. This is <strong>chaotic innovation</strong>: survival through variation and trial, with no guarantee of success.</p><p>On the consensus side, society must decide which survivors matter. Consensus provides <strong>legitimacy, narrative, and adoption channels.</strong> Andrew Ng framed LLMs as transformative; Sam Altman called them a Renaissance-level event. Media, investors, and policymakers repeated the story until it solidified. Without that consensus, even the most capable model could have remained a curiosity&#8212;another clever paper gathering dust. With consensus, the breakthrough became inevitable, attracting billions in funding and reorganizing institutions worldwide.</p><p>This dual engine is what turned LLMs from contingency into inevitability, from laboratory novelty into a civilizational shock. Without Darwinism, there would be nothing new to legitimize. Without consensus, there would be no way for the new to scale. Together, they not only drive technological adoption but also reshape the structures of society itself.</p><p>The lesson is clear: <strong>every future breakthrough will emerge from this same synthesis.</strong> Whether in biotech, energy, or governance, the survivors of Darwinian chaos will only matter if consensus amplifies them. And when both forces align, the result will not just be new tools&#8212;it will be disruptions capable of reorganizing disciplines, professions, and even the architecture of knowledge.</p><p>LLMs, then, are not an anomaly. They are a case study in how technology always evolves. 
By using the dual engine as a lens, we can stop being surprised by disruption and start preparing for it&#8212;recognizing that every &#8220;accident&#8221; is also a candidate for inevitability, depending on how society chooses to receive it.</p><p></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!AWDo!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbc226f5f-b16c-4a04-9486-c149f78a495f_1024x1024.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!AWDo!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbc226f5f-b16c-4a04-9486-c149f78a495f_1024x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!AWDo!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbc226f5f-b16c-4a04-9486-c149f78a495f_1024x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!AWDo!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbc226f5f-b16c-4a04-9486-c149f78a495f_1024x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!AWDo!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbc226f5f-b16c-4a04-9486-c149f78a495f_1024x1024.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!AWDo!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbc226f5f-b16c-4a04-9486-c149f78a495f_1024x1024.heic" width="1024" height="1024" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/bc226f5f-b16c-4a04-9486-c149f78a495f_1024x1024.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1024,&quot;width&quot;:1024,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:244975,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://sstem.substack.com/i/174941890?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbc226f5f-b16c-4a04-9486-c149f78a495f_1024x1024.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!AWDo!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbc226f5f-b16c-4a04-9486-c149f78a495f_1024x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!AWDo!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbc226f5f-b16c-4a04-9486-c149f78a495f_1024x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!AWDo!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbc226f5f-b16c-4a04-9486-c149f78a495f_1024x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!AWDo!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbc226f5f-b16c-4a04-9486-c149f78a495f_1024x1024.heic 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p></p>]]></content:encoded></item><item><title><![CDATA[The Contingency and Inevitability of LLMs]]></title><description><![CDATA[No one could predict with absolute certainty that large language models would explode into mainstream success.]]></description><link>https://www.entropycontroltheory.com/p/the-contingency-and-inevitability</link><guid isPermaLink="false">https://www.entropycontroltheory.com/p/the-contingency-and-inevitability</guid><dc:creator><![CDATA[Susan STEM]]></dc:creator><pubDate>Tue, 30 Sep 2025 22:23:17 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!zVWl!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8cd551c8-c51e-4309-ad0d-6a62e949d9d4_1024x1024.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>No one could 
predict with absolute certainty that large language models would explode into mainstream success. Even the researchers building them did not know if scaling up simple token prediction would lead to such emergent intelligence. Their breakthrough contains a large dose of contingency: an unexpected winner emerging from countless parallel experiments, most of which failed or plateaued.</p><p>And yet, to frame LLMs as a lucky accident misses the deeper story. Technology never evolves by randomness alone. Beneath the surface of chance lies an engine of inevitability. <strong>Technological evolution is always the synthesis of two forces: the Darwinian trial-and-error of technical exploration, and the shaping power of social consensus and alignment.</strong> Darwinism produces the raw mutations; consensus decides which ones are amplified, legitimized, and embedded into society.</p><p>This dual engine explains both why LLMs emerged at this particular moment and why their impact has been seismic. The technical side was pure survival-of-the-fittest: transformers scaled, while many rival architectures withered. The social side was equally decisive: a world primed with the narrative of &#8220;AI as the next electricity&#8221; was ready to adopt, fund, and amplify the breakthrough. Together, contingency and inevitability converged into the LLM revolution.</p><div><hr></div><p><strong>The Premise: How Technology Evolves</strong></p><p>When people talk about technology, they often imagine progress as a straight line&#8212;faster processors, bigger datasets, better models. But real technological history is rarely linear. It is jagged, unpredictable, full of false starts, dead ends, and sudden leaps. Some breakthroughs seem to appear overnight, while others linger in obscurity for decades before finding their moment.</p><p>The reality is that <strong>technological evolution has two engines.</strong> One is Darwinian, the other social. 
The formula looks like this:</p><p><strong>Technological evolution = technological Darwinism + social consensus &amp; alignment.</strong></p><p>On the Darwinian side, progress is messy and competitive. Countless experiments are launched in parallel. Most fail. A few survive. Survival here doesn&#8217;t mean perfection; it simply means the idea proved scalable, robust, or adaptable enough to persist. Just as biological evolution produces endless variations before landing on a viable species, technological Darwinism floods the landscape with prototypes, frameworks, and architectures. Out of this chaos, one or two approaches rise to dominance.</p><p>But survival alone is not enough. A viable technology still needs a <strong>social layer</strong> to take root. This is where consensus comes in. Society must recognize the value of the invention, translate it into standards or protocols, and provide the legitimacy and resources needed for growth. Without consensus, technologies remain curiosities. Think of Google Glass&#8212;technologically advanced, socially rejected. Or nuclear fusion&#8212;scientifically promising, but lacking the institutional alignment for large-scale adoption.</p><p>Consensus provides what Darwinism cannot: <strong>direction, legitimacy, and distribution.</strong> Narratives make technologies legible to the public (&#8220;AI is the next electricity&#8221;), institutions create adoption channels, and regulations grant or withhold permission to scale. In short, consensus is what allows a surviving technology to move from the lab to the fabric of everyday life.</p><p>This dual-engine view explains why certain breakthroughs feel both accidental and inevitable. Darwinism produces the accident&#8212;the one architecture that happens to work. Consensus produces the inevitability&#8212;the collective alignment that ensures the breakthrough doesn&#8217;t wither but instead reshapes entire industries.</p><p>Seen this way, LLMs are not a mystery. 
They are the product of Darwinian chance fused with social alignment. And that combination&#8212;not one or the other&#8212;is what drives every major technological leap.</p><div><hr></div><p><strong>The Inevitability: The Social Layer</strong></p><p>If Darwinism explains why transformers survived while other architectures fell away, the <strong>social layer</strong> explains why large language models became a revolution instead of a curiosity. A breakthrough in the lab is only half the story. For a technology to reshape society, it must be absorbed, amplified, and legitimized by society itself.</p><p>Once LLMs began to show sparks of generality&#8212;the ability to handle not just translation or summarization but open-ended dialogue&#8212;society was already <strong>primed</strong>. The narrative of &#8220;AI as the next electricity&#8221; had been circulating for years. Investors, policymakers, and the public were prepared to treat any convincing AI leap as the dawn of a new industrial era. The moment ChatGPT arrived, the symbolic groundwork was already in place to receive it.</p><p>Then came the <strong>cascade of alignment</strong>: funding surged into the space, media attention amplified every demo, institutions scrambled to integrate the models into workflows. Universities set up AI research centers overnight. Corporations launched AI task forces. Governments debated regulation. Each of these moves reinforced the sense that something epochal had arrived, regardless of whether the underlying technology was fully mature.</p><p>This is the paradox of technology adoption: it is not the &#8220;best&#8221; technologies that spread, but the ones that arrive with <strong>consensus and narrative support.</strong> Google Glass was technically impressive but socially rejected. Nuclear fusion is scientifically promising but institutionally starved. 
By contrast, LLMs entered a social environment desperate for both a story of progress and a tool to anchor it.</p><p>Without this consensus, even the strongest models might have languished in obscurity&#8212;like many clever algorithms that never left academic journals. What turned LLMs into a revolution was not just their ability to predict the next token, but the fact that society <strong>wanted to believe</strong> in them, fund them, and build on top of them.</p><p>This is what makes their rise feel inevitable. The Darwinian accident of scaling transformers merged with a social ecosystem eager for a breakthrough. Together, they created inevitability.</p><div><hr></div><p><strong>The Shock: Tremors in the Developer Ecosystem</strong></p><p>The social alignment that propelled LLMs into the spotlight didn&#8217;t just create excitement&#8212;it also sent shockwaves through the technical community. Few groups felt the tremors more immediately than software developers.</p><p>The first visible impact was the <strong>displacement of junior programmers.</strong> Tasks that once justified an entry-level engineer&#8212;writing unit tests, building CRUD apps, stitching together APIs&#8212;were suddenly achievable by anyone with access to an AI assistant. What had been the proving ground for human apprentices became the low-hanging fruit of automation. Companies quickly realized they could cut costs by reducing junior headcount while equipping senior developers with AI copilots. For new graduates, the traditional on-ramp into the profession narrowed overnight.</p><p>The second tremor was the <strong>commoditization of boilerplate coding.</strong> Entire categories of work&#8212;documentation, test generation, refactoring, template-based development&#8212;shifted from being time-consuming chores to near-instant outputs. What once consumed days of careful attention collapsed into minutes of prompting. 
In economic terms, tasks that had been billable became trivial, undermining the pricing structures of consultancies, contractors, and outsourcing firms.</p><p>This led to the third and deepest impact: <strong>anxiety within the technical community.</strong> If AI can handle the baseline tasks of programming, what remains valuable for humans? Senior engineers wondered whether their expertise in system design would remain untouched, or whether even architecture could be absorbed into model-driven workflows. Independent developers asked if the indie projects they once dreamed of shipping would be instantly undercut by a wave of AI-generated clones. Educators and mentors struggled to answer a painful question: how do you train the next generation of engineers when the old training ground has vanished?</p><p>The result is a profession caught in transition. On one hand, LLMs act as accelerators, boosting productivity for those who adapt. On the other, they destabilize the very pipeline that sustained the developer ecosystem. The craft of programming is no longer about who can type the fastest or memorize the most APIs; it is shifting toward higher-order skills: problem framing, integration, governance, and above all, connecting technology to <strong>social consensus.</strong></p><p>These shocks are not the end of programming, but the end of programming as we knew it. The developer ecosystem must now find a new equilibrium, one that acknowledges the automation of its foundations and redefines what counts as irreplaceable human contribution.</p><div><hr></div><p><strong>The Real Question: What Comes Next?</strong></p><p>The shocks of LLMs&#8212;the displacement of junior roles, the commoditization of boilerplate work, and the anxiety rippling through the developer community&#8212;are real. But they are not the conclusion of the story. They are only the opening act. 
The deeper question is not <em>what has been automated</em>, but <em>what new cycles of technology and society this disruption will set in motion.</em></p><p>History shows that whenever a technology destabilizes an existing profession, the outcome depends less on the raw capability of the tool and more on <strong>how it becomes embedded into society.</strong> The printing press didn&#8217;t just displace scribes; it reorganized religion, politics, and science. Electricity didn&#8217;t just replace candles; it reshaped urban life and industrial work. The internet didn&#8217;t just digitize communication; it redefined commerce, governance, and identity. Each of these transformations followed a pattern: first shock, then realignment.</p><p>So the real question is not whether LLMs will write code or not. They already do. The question is: <strong>how will this capacity be absorbed, standardized, and legitimized by society?</strong> Will it be guided by shared consensus, or left to drift as a series of fragmented experiments? Will protocols emerge to govern its use in critical infrastructure? Will new structures&#8212;educational systems, corporate hierarchies, regulatory bodies&#8212;rise to stabilize its role? And what narratives will make sense of it for the broader public?</p><p>These are not secondary concerns; they are the <em>main</em> concerns. Because technologies that fail to secure consensus, protocols, structures, and narratives fade into irrelevance, no matter how impressive their capabilities. By contrast, technologies that align with this cycle reshape the world&#8212;even if they began as unlikely accidents.</p><p>The LLM revolution, then, is not simply about predicting the next token. 
It is about what comes next in the <strong>cycle of consensus &#8594; protocol &#8594; structure &#8594; narrative.</strong> If developers and technical communities wish to move from anxiety to agency, they must learn to see themselves not as passive recipients of disruption, but as active participants in steering this cycle.</p><div><hr></div><p>The sudden rise of LLMs was both accidental and inevitable&#8212;Darwinian chance fused with social consensus. But the story doesn&#8217;t end here. This shock is only the first tremor in a broader cycle. In the next essay, we&#8217;ll explore why technological evolution is not a straight line but a <strong>looped process</strong>, and how understanding that loop is the only way to move from reaction to foresight.</p><p></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!zVWl!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8cd551c8-c51e-4309-ad0d-6a62e949d9d4_1024x1024.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!zVWl!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8cd551c8-c51e-4309-ad0d-6a62e949d9d4_1024x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!zVWl!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8cd551c8-c51e-4309-ad0d-6a62e949d9d4_1024x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!zVWl!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8cd551c8-c51e-4309-ad0d-6a62e949d9d4_1024x1024.heic 1272w, 
https://substackcdn.com/image/fetch/$s_!zVWl!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8cd551c8-c51e-4309-ad0d-6a62e949d9d4_1024x1024.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!zVWl!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8cd551c8-c51e-4309-ad0d-6a62e949d9d4_1024x1024.heic" width="1024" height="1024" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/8cd551c8-c51e-4309-ad0d-6a62e949d9d4_1024x1024.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1024,&quot;width&quot;:1024,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:279219,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://sstem.substack.com/i/174930349?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8cd551c8-c51e-4309-ad0d-6a62e949d9d4_1024x1024.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!zVWl!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8cd551c8-c51e-4309-ad0d-6a62e949d9d4_1024x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!zVWl!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8cd551c8-c51e-4309-ad0d-6a62e949d9d4_1024x1024.heic 848w, 
https://substackcdn.com/image/fetch/$s_!zVWl!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8cd551c8-c51e-4309-ad0d-6a62e949d9d4_1024x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!zVWl!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8cd551c8-c51e-4309-ad0d-6a62e949d9d4_1024x1024.heic 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p></p>]]></content:encoded></item><item><title><![CDATA[The Sea of Meaning — Vectorization as a Continuous 
Substrate]]></title><description><![CDATA[How embeddings turn hidden connections into discoverable knowledge]]></description><link>https://www.entropycontroltheory.com/p/the-sea-of-meaning-vectorization</link><guid isPermaLink="false">https://www.entropycontroltheory.com/p/the-sea-of-meaning-vectorization</guid><dc:creator><![CDATA[Susan STEM]]></dc:creator><pubDate>Tue, 30 Sep 2025 11:29:07 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!cOX3!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3fb46b19-57a7-428d-9386-6835006cbc1f_1536x1024.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Meaning has always been scattered&#8212;locked in books, hidden in archives, buried in different disciplines and industries. What held us back was not a lack of ideas, but the unbearable workload of connecting them. Now, with vectorization, meaning itself has been made into a <strong>continuous space</strong>. What once went unnoticed can now be revealed. What once took years of scholarship or trial-and-error can now be surfaced in seconds, at a fraction of the cost.</p><div><hr></div><p><strong>From Fragments to a Sea of Continuity</strong></p><p>Historically, knowledge lived in fragments. Every app, database, and field of expertise was its own island. A biologist&#8217;s paper and a legal scholar&#8217;s commentary had no obvious bridge. An engineer&#8217;s log file and a historian&#8217;s archive were separated by context and convention. Humans could, in principle, connect them&#8212;but only through enormous manual effort: reading, cross-referencing, translating.</p><p>This fragmentation meant that <strong>many possible connections simply never happened.</strong> Ideas that could have sparked breakthroughs were lost in the noise, buried under workload.</p><p>Vectorization changes this. 
By embedding all kinds of data&#8212;words, images, tables, code&#8212;into a <strong>shared coordinate space</strong>, knowledge is no longer trapped in silos. Everything acquires coordinates in a continuous semantic map. Suddenly, the &#8220;unnoticed&#8221; becomes <strong>neighboring</strong>; the hidden connection becomes <strong>visible</strong>.</p><div><hr></div><p><strong>How Vectorization Was Discovered</strong></p><p>The breakthrough did not come from philosophy but from practice. Researchers noticed that simple neural models trained to predict words learned something unexpected: <strong>words with similar meaning clustered together in the model&#8217;s internal space.</strong></p><ul><li><p>&#8220;King&#8221; and &#8220;Queen&#8221; ended up near each other.</p></li><li><p>&#8220;Paris&#8221; and &#8220;France&#8221; formed a tight bond.</p></li><li><p>Arithmetic even worked: <em>King &#8211; Man + Woman &#8776; Queen</em>.</p></li></ul><p>This discovery&#8212;that meaning could be represented as vectors in high-dimensional space&#8212;was profound. It showed that semantics was not just abstract but <strong>geometry</strong>. From word2vec to GloVe, from BERT to multimodal embeddings, the progression has been the same: expand the scope, refine the mapping, and watch meaning fall into place as coordinates.</p><p>In other words, vectorization wasn&#8217;t designed&#8212;it was <strong>discovered</strong> as a side effect of training models on massive data. The models revealed a truth: language itself is structured in a way that can be represented continuously.</p><div><hr></div><p><strong>Why Continuity Matters</strong></p><p>Discrete categories force rigid definitions. A database field says &#8220;customer&#8221; or &#8220;supplier,&#8221; nothing in between. A taxonomy forces every book into one shelf, even if it belongs on several. Humans have always lived within these containers.</p><p>Continuity breaks this rigidity. 
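The word-vector arithmetic described above can be sketched with toy numbers. The three-dimensional vectors below are illustrative stand-ins (real word2vec embeddings have hundreds of learned dimensions, not hand-picked values), chosen so that the classic King - Man + Woman ≈ Queen analogy holds by construction:

```python
import math

# Toy 3-dimensional "embeddings": hypothetical values for illustration only.
embeddings = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.1, 0.8],
    "man":   [0.5, 0.9, 0.2],
    "woman": [0.5, 0.2, 0.9],
    "paris": [0.1, 0.5, 0.5],
}

def cosine(a, b):
    """Cosine similarity: 1.0 means same direction, 0.0 means orthogonal."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def analogy(a, b, c):
    """Return the vocabulary word closest to vec(a) - vec(b) + vec(c)."""
    target = [x - y + z for x, y, z in
              zip(embeddings[a], embeddings[b], embeddings[c])]
    # Exclude the query words themselves, as word2vec-style tooling does.
    candidates = {w: v for w, v in embeddings.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(candidates[w], target))

print(analogy("king", "man", "woman"))  # prints "queen"
```

Real systems perform the same search over learned vectors and far larger vocabularies; the geometry, not the scale, is what this sketch shows.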
In vector space:</p><ul><li><p>A research paper can be near both &#8220;biology&#8221; and &#8220;computer science.&#8221;</p></li><li><p>A photograph can live between &#8220;art&#8221; and &#8220;documentation.&#8221;</p></li><li><p>A contract clause can cluster with both &#8220;legal precedent&#8221; and &#8220;risk assessment.&#8221;</p></li></ul><p>What this means is that <strong>ambiguity and nuance become first-class citizens.</strong> Instead of being errors, they are positions in a space. And because similarity is measured by distance, <strong>previously unnoticed patterns emerge naturally.</strong></p><div><hr></div><p><strong>From Workload to Discovery</strong></p><p>In the past, discovering such patterns was theoretically possible but practically impossible. Connecting 10,000 documents across 20 domains meant endless human labor&#8212;reading, indexing, correlating. The cost was prohibitive.</p><p>With vectorization, the workload collapses. What was once a mountain of manual effort becomes a geometric query. &#8220;Find me things near this meaning.&#8221; Suddenly:</p><ul><li><p>A lawyer drafting a contract can surface scientific reports that hint at hidden risks.</p></li><li><p>A doctor exploring symptoms can find case studies across languages and decades.</p></li><li><p>A musician searching for inspiration can locate obscure works that share tonal structure.</p></li></ul><p>The discovery is not that these connections exist&#8212;they always did&#8212;but that they are now <strong>accessible at scale and at low cost.</strong></p><div><hr></div><p><strong>Creation at the Edges of Meaning</strong></p><p>The sea of meaning is not only about retrieval. 
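The geometric query described above, "find me things near this meaning," reduces to nearest-neighbor search in the embedding space. A minimal brute-force sketch, using hypothetical hand-set coordinates as stand-ins for the output of a real embedding model:

```python
import math

# Hypothetical document embeddings in a shared semantic space.
docs = {
    "biology paper on protein folding":     [0.9, 0.1, 0.2],
    "computer science paper on simulation": [0.8, 0.2, 0.3],
    "legal commentary on patent risk":      [0.1, 0.9, 0.1],
    "historical archive of lab notebooks":  [0.7, 0.1, 0.6],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def nearest(query_vec, k=2):
    """Rank every document by cosine similarity to the query embedding."""
    ranked = sorted(docs, key=lambda d: cosine(docs[d], query_vec), reverse=True)
    return ranked[:k]

# A query vector placed near the "science" region of the toy space.
print(nearest([0.85, 0.15, 0.25]))
```

At scale, the brute-force loop is replaced by approximate nearest-neighbor indexes, but the operation is the same: distance in the space stands in for relatedness in meaning.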
It is also about <strong>creation.</strong> When embeddings bring disparate fields into proximity, new combinations emerge.</p><ul><li><p>Scientific intuitions encoded as embeddings can recombine into hypotheses no one thought to test.</p></li><li><p>Legal language aligned with computational structures can birth executable governance.</p></li><li><p>Cultural metaphors from different languages can blend into new artistic forms.</p></li></ul><p>Creation happens at the edges&#8212;where proximity reveals unexpected neighbors. By lowering the cost of finding those neighbors, vectorization turns hidden possibilities into active frontiers.</p><div><hr></div><p><strong>The Leap from Discrete to Continuous</strong></p><p>This is the conceptual leap: from discrete containers to continuous fields.</p><ul><li><p>Before: information was sorted into boxes, categories, and file formats.</p></li><li><p>After: everything floats in a shared ocean of coordinates.</p></li></ul><p>In this ocean, the fundamental operation is not &#8220;look up by label&#8221; but <strong>&#8220;sail by meaning.&#8221;</strong> Knowledge is not locked in shelves but mapped as flows and currents. Discovery becomes navigation.</p><div><hr></div><p><strong>The Sea of Meaning as a Substrate</strong></p><p>Vectorization provides a new <strong>substrate for civilization&#8217;s knowledge.</strong> Like Turing&#8217;s tape or the token stream of language models, embeddings form a universal medium&#8212;one where meaning is continuous, navigable, and recombinable.</p><p>The implications are vast. What once took years of expertise and unbearable labor can now be generated, retrieved, or recombined in moments. What was unnoticed can now be surfaced. The cost of exploration has collapsed; the frontier of meaning has expanded.</p><p>The sea of meaning is already here. 
The question is not whether it exists, but how far we are willing to sail&#8212;and what new continents of thought we will discover when we do.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!cOX3!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3fb46b19-57a7-428d-9386-6835006cbc1f_1536x1024.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!cOX3!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3fb46b19-57a7-428d-9386-6835006cbc1f_1536x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!cOX3!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3fb46b19-57a7-428d-9386-6835006cbc1f_1536x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!cOX3!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3fb46b19-57a7-428d-9386-6835006cbc1f_1536x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!cOX3!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3fb46b19-57a7-428d-9386-6835006cbc1f_1536x1024.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!cOX3!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3fb46b19-57a7-428d-9386-6835006cbc1f_1536x1024.heic" width="1456" height="971" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/3fb46b19-57a7-428d-9386-6835006cbc1f_1536x1024.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:107454,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://sstem.substack.com/i/174880207?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3fb46b19-57a7-428d-9386-6835006cbc1f_1536x1024.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!cOX3!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3fb46b19-57a7-428d-9386-6835006cbc1f_1536x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!cOX3!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3fb46b19-57a7-428d-9386-6835006cbc1f_1536x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!cOX3!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3fb46b19-57a7-428d-9386-6835006cbc1f_1536x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!cOX3!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3fb46b19-57a7-428d-9386-6835006cbc1f_1536x1024.heic 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div>]]></content:encoded></item><item><title><![CDATA[Universality: Language, LLMs, and the Power of One Primitive]]></title><description><![CDATA[How token prediction turns language into a universal machine for meaning]]></description><link>https://www.entropycontroltheory.com/p/universality-language-llms-and-the</link><guid isPermaLink="false">https://www.entropycontroltheory.com/p/universality-language-llms-and-the</guid><dc:creator><![CDATA[Susan STEM]]></dc:creator><pubDate>Tue, 30 Sep 2025 03:33:10 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!4JhS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8824560-04fb-48e7-a2fc-1843f3ba7a84_1024x1024.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Why can one model write poetry, 
translate contracts, solve math problems, and answer questions&#8212;all with the same underlying mechanism? At first glance, it feels like a strange kind of magic, as if one machine were juggling entirely different talents without breaking a sweat. In the old world of NLP, each of these skills required its own model, its own training set, and its own architecture. A translation system could not summarize; a sentiment classifier could not solve equations. The field was fractured, with no common ground.</p><p>Large language models changed that. What looks like a patchwork of capabilities is, under the hood, just one <strong>universal act: predicting the next token.</strong> The model doesn&#8217;t know in advance whether it is writing a sonnet, balancing an equation, or producing a legal clause. It simply continues a symbolic sequence, one token at a time, guided by probabilities learned from vast amounts of text. Out of this simple act of continuation emerges an astonishing spectrum of behaviors.</p><p>This primitive doesn&#8217;t just unify tasks; it reframes language itself. Once you reduce everything to token prediction, language becomes more than communication&#8212;it becomes a <strong>machine for meaning.</strong> Every continuation is a micro-decision in a vast symbolic universe. Poetry, law, science, and conversation all become variations of the same process: shaping meaning by predicting what comes next.</p><div><hr></div><p><strong>Before Universality: Fragmented Language Tools</strong></p><p>To appreciate what large language models have changed, it helps to recall what came before. For most of its history, natural language processing was a field of ingenious but isolated contraptions, each built for a narrow purpose.</p><p>If you wanted <strong>translation</strong>, you built a statistical machine translation system, trained on carefully aligned bilingual corpora. 
If you wanted <strong>sentiment analysis</strong>, you trained a classifier to label sentences as positive or negative. If you wanted <strong>summarization</strong>, you designed yet another system, often based on sentence extraction or hand-crafted linguistic rules. <strong>Question answering</strong> had its own architectures, as did <strong>chatbots</strong>, <strong>topic models</strong>, and <strong>speech recognition</strong>. Each task was a silo.</p><p>Like the early world of calculating devices&#8212;abaci, adding machines, difference engines&#8212;these tools were powerful in their own ways, but narrow. A machine that could tabulate polynomials could not compute logarithms. Likewise, a model trained to detect sentiment could not translate, and a translation system could not hold a conversation. Each function had to be engineered from scratch, with specialized data pipelines, algorithms, and evaluation metrics.</p><p>This fragmentation had two consequences. First, <strong>progress was local</strong>. A breakthrough in summarization did little to advance translation. Each subfield advanced along its own track, often reinventing methods or features that had already been discovered elsewhere. Second, <strong>there was no shared substrate of meaning</strong>. Each system had its own representation of language&#8212;n-grams here, parse trees there, embeddings in another corner. There was no common ground where &#8220;meaning&#8221; could be modeled universally.</p><p>In this world, language technology looked like a collection of specialized islands. You could visit one island to translate, another to summarize, another to classify, but there was no single vessel that could carry you across all of them. Each island was impressive, but the seas between them were wide and treacherous.</p><p>This was the <strong>pre-universality era of language</strong>: a machine zoo of narrow models, each ingenious, but none capable of generalizing beyond its assigned task. 
Just as mechanical calculators once defined the limits of computation, these fragmented tools defined the limits of NLP. What was missing was the unifying leap&#8212;a discovery of a minimal primitive and a universal substrate that could collapse the islands into a single continent.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!4JhS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8824560-04fb-48e7-a2fc-1843f3ba7a84_1024x1024.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!4JhS!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8824560-04fb-48e7-a2fc-1843f3ba7a84_1024x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!4JhS!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8824560-04fb-48e7-a2fc-1843f3ba7a84_1024x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!4JhS!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8824560-04fb-48e7-a2fc-1843f3ba7a84_1024x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!4JhS!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8824560-04fb-48e7-a2fc-1843f3ba7a84_1024x1024.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!4JhS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8824560-04fb-48e7-a2fc-1843f3ba7a84_1024x1024.heic" width="1024" height="1024" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d8824560-04fb-48e7-a2fc-1843f3ba7a84_1024x1024.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1024,&quot;width&quot;:1024,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:187448,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://sstem.substack.com/i/174871830?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8824560-04fb-48e7-a2fc-1843f3ba7a84_1024x1024.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!4JhS!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8824560-04fb-48e7-a2fc-1843f3ba7a84_1024x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!4JhS!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8824560-04fb-48e7-a2fc-1843f3ba7a84_1024x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!4JhS!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8824560-04fb-48e7-a2fc-1843f3ba7a84_1024x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!4JhS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd8824560-04fb-48e7-a2fc-1843f3ba7a84_1024x1024.heic 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><div><hr></div><p><strong>The Primitive: Predict the Next Token</strong></p><p>The breakthrough of large language models is almost embarrassingly simple. Instead of designing a new system for each linguistic task, researchers discovered that <strong>every task could be reframed as a single act: predicting the next token.</strong></p><p>At first glance, this looks mechanical, almost trivial. A token might be a word, a subword, or even a character. The model scans the sequence so far and estimates: what is the most likely token to come next? Then it does it again. And again. Out of this looping act&#8212;predict, append, predict, append&#8212;emerges translation, summarization, reasoning, even poetry.</p><p>Why does this matter? Because token prediction is not just a technical trick. 
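</p><p>The looping act itself can be sketched in a few lines of Python. This is a toy illustration only: the hand-built bigram table and its probabilities are invented for demonstration, standing in for the vastly richer distribution a real model learns from its corpus.</p>

```python
import random

# Toy "model": a hand-built table mapping the previous token to
# candidate next tokens with probabilities. (Invented for illustration;
# a real LLM learns a far richer version of this from training text.)
BIGRAMS = {
    "once": [("upon", 1.0)],
    "upon": [("a", 1.0)],
    "a":    [("time", 0.9), ("river", 0.1)],
    "time": [("<end>", 1.0)],
}

def generate(prompt, max_tokens=10, seed=0):
    """Predict, append, predict, append -- until an end token appears."""
    rng = random.Random(seed)
    tokens = prompt.split()
    for _ in range(max_tokens):
        options = BIGRAMS.get(tokens[-1])
        if not options:
            break
        words, probs = zip(*options)
        nxt = rng.choices(words, weights=probs)[0]   # the one primitive
        if nxt == "<end>":
            break
        tokens.append(nxt)
    return " ".join(tokens)

print(generate("once upon"))  # with this table and seed: "once upon a time"
```

<p>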
It is a way of navigating the <strong>symbolic space</strong> of human language. Every token carries multiple potential futures. After the word <em>&#8220;bank,&#8221;</em> should the continuation lean toward <em>&#8220;account&#8221;</em> or <em>&#8220;river&#8221;</em>? After the phrase <em>&#8220;once upon a,&#8221;</em> should it predict <em>&#8220;time&#8221;</em> or something deliberately subversive? Each prediction is a <strong>micro-choice among symbolic continuations</strong>, a decision about which thread of meaning to pull forward.</p><p>Seen this way, token prediction is not a reduction of language but a re-framing of it. It treats language as a probabilistic universe of possible meanings. The model doesn&#8217;t just string words together; it operates inside a <strong>generative semantic field</strong>, where every step both constrains and opens up meaning. The richness comes not from the mechanism itself, but from the fact that language is vast, structured, and endlessly recombinable.</p><p>This single primitive therefore collapses the fractured task zoo of old NLP. Translation? Just continue a sequence where the prompt specifies another language. Summarization? Continue with a shorter re-expression of the input. Reasoning? Continue with the logical steps that have appeared across millions of examples. The surface diversity of tasks dissolves into the same underlying process: predicting the next token in a symbolic universe.</p><p>It is exactly the kind of universality we saw with Turing. Just as four primitive actions (read, write, move, state transition) could simulate all computation, one primitive (token prediction) can simulate the full diversity of linguistic tasks. Out of minimal primitives, infinite behaviors emerge.</p><div><hr></div><p><strong>Prompt = Program, Corpus = Meaning Tape</strong></p><p>If token prediction is the primitive, then prompts and training data are what turn it into something useful. 
In practice, a <strong>prompt functions like a program</strong>: it tells the model what task to perform, what frame of meaning to inhabit, and what kind of continuation to prioritize. A short instruction&#8212;<em>&#8220;Translate this into French&#8221;</em>&#8212;shifts the model into one symbolic trajectory. A different instruction&#8212;<em>&#8220;Summarize this contract in plain English&#8221;</em>&#8212;sends it down another. The model doesn&#8217;t change; only the program it is given does.</p><p>The <strong>training corpus</strong> provides the other half of the equation. This vast body of text is effectively a <strong>universal tape</strong>, not unlike Turing&#8217;s imaginary strip of symbols. But instead of being a single set of instructions, it is a compressed record of human symbolic history: literature, laws, scientific papers, technical manuals, casual conversations. All of it is encoded as tokens, which the model has learned to navigate probabilistically.</p><p>Together, the prompt and the corpus let the model recombine meaning across domains. Ask it to analyze a legal argument, and it draws on centuries of legal language. Ask it to write a sonnet, and it pulls patterns from the poetic canon. Ask it to solve a physics problem, and it leans on symbolic traces of scientific reasoning. The model doesn&#8217;t &#8220;know&#8221; these fields as a human specialist does; instead, it inhabits the <strong>shared symbolic space</strong> in which they all exist.</p><p>This is where the analogy to the Universal Turing Machine becomes sharp. In Turing&#8217;s model, the machine itself never changed; the diversity came from the descriptions written on its tape. With LLMs, the primitive never changes; the diversity comes from prompts written into the model&#8217;s input and from the corpus encoded in its weights. 
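</p><p>In miniature, the idea looks like this (everything below is invented for illustration: a single fixed lookup plays the &#8220;machine,&#8221; and the prompt prefix selects which behavior it exhibits):</p>

```python
# One unchanging "machine": a single lookup function. Only the prompt
# prefix -- the "program" -- decides which task is performed.
# (The table is hand-made for demonstration; a real model encodes
# this knowledge implicitly in its weights, not as explicit entries.)
TABLE = {
    ("Translate to French:", "cat"): "chat",
    ("Translate to French:", "hello"): "bonjour",
    ("Rhyme with:", "cat"): "hat",
    ("Rhyme with:", "hello"): "yellow",
}

def continue_token(prompt: str, token: str) -> str:
    """The primitive never changes; the prompt picks the trajectory."""
    return TABLE[(prompt, token)]

print(continue_token("Translate to French:", "cat"))  # chat
print(continue_token("Rhyme with:", "cat"))           # hat
```

<p>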
Prompts are how we &#8220;program&#8221; the machine for meaning, and the corpus is the universal tape from which it draws its symbolic continuations.</p><p>What emerges is a new kind of universality: <strong>one model, one primitive, but infinitely many programs</strong>, limited only by what humans can express in symbolic form.</p><blockquote><p><strong>It has been an imitation machine, and it still is.</strong></p></blockquote><div><hr></div><p><strong>The Boundary of the Modelable</strong></p><p>Every universality is also a boundary. When Turing defined the Universal Machine, he didn&#8217;t just show what could be computed&#8212;he also defined what could <em>not</em>. Functions that could not be reduced to stepwise instructions on a tape were outside the realm of computation. Universality is not infinite; it is a horizon.</p><p>Large language models work the same way. Their primitive is token prediction. That means their universality extends only as far as things can be represented in tokens&#8212;<strong>what can be expressed in symbols.</strong> A legal contract? Yes. A poem? Certainly. A mathematical proof? With varying success. But what about the sensation of tasting mango for the first time, or the silent intuition of a chess grandmaster, or the tacit feel of friendship? Unless those experiences can be symbolized&#8212;translated into language&#8212;they resist being modeled.</p><p>This is the crucial distinction: LLMs collapse all symbolic tasks into &#8220;continuations of meaning,&#8221; but <strong>embodied experience, tacit knowledge, and non-symbolic intuition mark the edge.</strong> The model can imitate how humans <em>talk</em> about those experiences, but it cannot directly access or generate the experiences themselves. It can simulate the language of a sommelier describing wine, but it does not taste. It can reproduce the metaphors of mysticism, but it does not stand in awe.</p><p>And yet, this boundary is not static. 
Human culture itself is an endless project of turning the inexpressible into symbols&#8212;art, mathematics, law, science are all attempts to encode the non-symbolic into communicable form. Each time we succeed, we expand the range of what is <strong>language-modelable.</strong></p><p>This is why token prediction is more than a technical trick. It defines a frontier. On one side lies everything that can be symbolized and recombined by the model. On the other side lies everything still resistant to symbolization. The frontier is where the work is.</p><p>The universality of language models is therefore not the end of the story, but the beginning of a new question: how far can we stretch the symbolic universe before it breaks?</p><blockquote><p><strong>The breakthroughs of the future will not lie in the models themselves, but in whether we humans dare to push this symbolic universe to its limits.</strong></p></blockquote><div><hr></div><p><strong>Toward the Edges of the Symbolic Universe</strong></p><p>If token prediction is the primitive, and prompts are the programs, then the real frontier lies not in asking LLMs to repeat familiar tasks&#8212;summarizing articles, translating documents, generating boilerplate&#8212;but in <strong>pushing symbols into new terrains.</strong></p><p>Every civilization has advanced by expanding what can be expressed symbolically. Law turned power into written statutes. Mathematics turned intuition into formulas. Music turned feeling into notation. Each leap didn&#8217;t just communicate what people already knew; it created <strong>new kinds of knowledge</strong> that could only exist once they were symbolized.</p><p>The same challenge faces us today. If the boundary of language models is &#8220;what can be expressed in tokens,&#8221; then the opportunity is to <strong>extend the tokenizable.</strong> That means encoding governance rules so they can be executed as code, rather than debated endlessly in prose. 
It means finding ways to express scientific intuitions&#8212;hunches, heuristics, incomplete models&#8212;in symbolic form so they can be recombined, tested, and scaled. It means inventing new cross-disciplinary languages, hybrids of law and computation, or biology and information theory, that stretch the representational power of our symbolic universe.</p><p>These frontiers matter because universality is not static. The abacus became the Difference Engine; the Difference Engine became the Universal Turing Machine; the Universal Turing Machine became the modern computer. At each step, the primitives stayed minimal, but the range of what could be encoded expanded dramatically. Language models extend this lineage into the symbolic domain. The primitive is fixed&#8212;predict the next token&#8212;but the canvas is vast.</p><p>The question, then, is not whether LLMs can do the tasks we already understand. It is <strong>how far we can stretch symbolic representation before it breaks.</strong> What aspects of governance, science, or culture can be made expressible in symbols, and what must remain tacit or embodied? 
The answers will not only define the limits of language models, but also reshape the limits of human civilization itself.</p>]]></content:encoded></item><item><title><![CDATA[Universality: From Mapping Machines to the Birth of Computability]]></title><description><![CDATA[How Turing&#8217;s Imitation Machine unified fragmented calculators into the modern computer]]></description><link>https://www.entropycontroltheory.com/p/universality-from-mapping-machines</link><guid isPermaLink="false">https://www.entropycontroltheory.com/p/universality-from-mapping-machines</guid><dc:creator><![CDATA[Susan STEM]]></dc:creator><pubDate>Mon, 29 Sep 2025 22:31:23 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!DRIK!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F80d045a1-66cb-4685-b3cf-b87bf9bf6aec_1024x1024.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Imagine a steampunk world where every calculation has its own machine. In one corner of the factory, gears clatter as a giant adding engine churns out sums. Next to it, another contraption whirs to calculate polynomials. The air is thick with steam and the roar of brass machines, each a monument to human ingenuity&#8212;yet each condemned to a single task. This was the reality before 1936: hardware equaled function, and universality was nowhere in sight.</p><p>Then Alan Turing made a conceptual leap. He described an &#8220;Imitation Machine&#8221; so simple it could be built in the imagination: four primitive actions and an endless tape. Yet from that minimal toolkit emerged a shocking conclusion&#8212;that one machine could, in principle, simulate <em>any</em> other. 
It was the birth of software, and with it, a new definition of what it means to compute.</p><div><hr></div><p><strong>The Age of Mapping Machines</strong></p><p>Long before anyone imagined a general-purpose computer, humans relied on <strong>mapping machines</strong>&#8212;tools that tied physical structures directly to specific functions. The most familiar example is the <strong>abacus</strong>. Each bead on its rods maps directly to a numerical value. Slide three beads forward, and you literally <em>see</em> the number three. Perform an addition by sliding more beads, and the physical configuration becomes the result. The abacus doesn&#8217;t interpret; it simply maps numbers to objects.</p><p>In the 19th century, Charles Babbage dreamed of a far more ambitious device: the <strong>Difference Engine</strong>. It was a monumental construction of brass gears and rotating shafts designed to calculate polynomial functions. The engine didn&#8217;t &#8220;understand&#8221; equations; instead, every gear tooth and lever was a physical instantiation of one. To change the type of calculation, you had to redesign the machine itself. Addition required one configuration, multiplication another, polynomials yet another. Each machine was a one-trick performer, brilliant but rigid.</p><p>This logic of <strong>mapping&#8212;hardware equals algorithm</strong>&#8212;dominated early mechanical computation. Each specialized calculator was essentially a frozen embodiment of a single mathematical idea. Want square roots? Build a square root machine. Need trigonometric tables? Construct a trigonometry engine. Complexity was handled not through abstraction, but through proliferation. Civilization&#8217;s computational landscape became a zoo of specialized contraptions, each ingenious in design but isolated in purpose.</p><p>The limitation was obvious: progress demanded ever more machines. Every new mathematical or practical need required inventing a new apparatus from scratch. 
These devices could be scaled in size, but not in scope. They could not generalize, because their logic was hardwired into their gears, levers, or beads. In this mapping paradigm, computation was fractured, local, and finite.</p><p>This was the world before 1936: a world where each machine was a physical metaphor for a function, and universality was nowhere to be found. The stage was set for a conceptual revolution&#8212;a move away from machines that merely <em>map</em> toward a machine that could <em>imitate them all</em>.</p><div><hr></div><p><strong>The Limits of Mapping Logic</strong></p><p>The mapping approach was ingenious, but it carried its own ceiling. Each new function required a new apparatus, each new application a new design. If the abacus mapped numbers to beads, and the Difference Engine mapped polynomials to gears, then a trigonometric table would need its own trigonometry engine, a logarithmic table its own logarithm engine, and so on. Complexity did not consolidate&#8212;it multiplied.</p><p>The result was what you could call a <strong>machine zoo</strong>: a landscape full of specialized devices, each brilliant, but none capable of stepping outside its assigned task. Engineers were forced to think in terms of hardware proliferation: if society needed ten different kinds of calculations, society would need ten different machines. Scale meant more machines, not more universality.</p><p>This fragmentation had real consequences. Producing navigational tables required one team of human computers, astronomical tables another, actuarial tables yet another. The tools of calculation remained brittle, local, and siloed. There was no common principle to connect them, no single framework that could unify their logic. 
Each machine was an island, and civilization as a whole was forced to navigate an archipelago of disconnected instruments.</p><p>The limits of mapping logic are the limits of direct embodiment: <strong>hardware equals algorithm, and nothing more</strong>. It was a world rich in ingenuity but poor in abstraction. And until someone broke free of this paradigm, the dream of a universal machine&#8212;a device that could imitate all others&#8212;remained unthinkable.</p><div><hr></div><p><strong>Turing&#8217;s Leap: The Imitation Machine</strong></p><p>By the mid-1930s, mathematicians were wrestling with a deep puzzle known as the <strong>Entscheidungsproblem</strong>, or &#8220;decision problem.&#8221; The question was deceptively simple: <em>can there be a systematic method to decide whether any given mathematical statement is provable?</em> Put differently, is there a mechanical procedure that can, in principle, determine the truth or falsity of any mathematical proposition?</p><p>Alan Turing, a 24-year-old Cambridge mathematician, approached this problem not by inventing yet another specialized machine, but by asking a more radical question: <em>what exactly counts as a mechanical procedure?</em> If you could capture that essence, you could then reason about the boundaries of computation itself.</p><p>Turing&#8217;s answer was the <strong>Imitation Machine</strong> (later called the <strong>Universal Turing Machine</strong>). It was not a machine in brass and gears, but an abstract model stripped to its absolute minimum. He defined it by just four primitive actions:</p><ul><li><p><strong>Read</strong> a symbol from a tape.</p></li><li><p><strong>Write</strong> a new symbol onto the tape.</p></li><li><p><strong>Move</strong> the tape one step left or right.</p></li><li><p><strong>Change state</strong> according to a finite set of rules.</p></li></ul><p>That was all. 
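</p><p>A minimal interpreter makes the point concrete. The sketch below is my own (the rule format and the example machine are not Turing&#8217;s notation): one loop executes the four primitives, and the &#8220;machine&#8221; being simulated is nothing but a table of rules handed in as data.</p>

```python
# A minimal Turing-style interpreter: read, write, move, change state.
# The simulated machine's rules are just data (a dict), foreshadowing
# "program as data". The (state, symbol) -> (write, move, state) rule
# format is my own sketch for illustration.
def run(rules, tape, state="start", pos=0, halt="halt", max_steps=10_000):
    cells = dict(enumerate(tape))        # the (conceptually endless) tape
    for _ in range(max_steps):
        if state == halt:
            break
        symbol = cells.get(pos, "_")                 # read
        write, move, state = rules[(state, symbol)]  # change state
        cells[pos] = write                           # write
        pos += {"L": -1, "R": 1, "N": 0}[move]       # move
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, "_") for i in range(lo, hi + 1)).strip("_")

# One example "program": increment a binary number (head starts at the left).
INC = {
    ("start", "0"): ("0", "R", "start"),  # scan right to the blank
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),  # 1 + carry -> 0, keep carrying
    ("carry", "0"): ("1", "N", "halt"),   # 0 + carry -> 1, done
    ("carry", "_"): ("1", "N", "halt"),   # overflow: write a new leading 1
}

print(run(INC, "1011"))  # "1100"
```

<p>Swap in a different rule table and the same loop becomes a different machine; that substitution is the whole leap.</p><p>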
Yet with this toolkit, Turing proved something astonishing: <strong>any specialized machine could be described as a sequence of instructions on the tape, and his imitation machine could reproduce its behavior step by step.</strong></p><p>This meant that the abacus, the adding machine, the Difference Engine, even the most elaborate mathematical calculators&#8212;all could, in principle, be simulated by one machine. The machine itself did not need to change; only the symbols on its tape did. Hardware was no longer bound to a single function. Function could be abstracted, encoded, and re-used.</p><p>This was the conceptual leap that ended the &#8220;machine zoo.&#8221; For the first time, the diversity of computation collapsed into a single framework. Infinite tasks could be expressed by finite primitives. And most importantly, <strong>programs became data</strong>: the description of a machine was itself just another string of symbols on the tape.</p><p>From this insight, the modern idea of <strong>software</strong> was born.</p><div><hr></div><p><strong>Program as Data: The Birth of Software</strong></p><p>Turing&#8217;s most radical insight was not just that one machine could <em>imitate</em> all others, but that the description of those other machines could itself be treated as data. A machine&#8217;s rules&#8212;what it reads, what it writes, when it changes state&#8212;could be written down as a string of symbols and placed on the very tape the machine was operating on. <strong>Programs became data.</strong></p><p>This was a profound act of compression. Instead of building a new device every time you wanted a new function, you could encode the function in a symbolic form and let a single universal machine interpret it. Hardware was no longer the limiting factor; the instructions themselves carried the diversity. The hardware only needed to be general enough to execute the four primitives. 
Everything else lived in the description.</p><p>A decade later, this idea found a practical embodiment in the work of John von Neumann and his colleagues. The <strong>stored-program computer</strong>&#8212;the architecture still at the heart of every laptop and smartphone&#8212;directly implements Turing&#8217;s principle. In this design, both <strong>data and instructions are stored in the same memory</strong>. A number to be multiplied and a sequence of steps to perform the multiplication are, at the physical level, indistinguishable. This simple unification created astonishing flexibility: one machine could run any program, provided the program was encoded in memory.</p><p>The shift was nothing less than the birth of <strong>software</strong>. The proliferation of specialized hardware gave way to universality. Instead of a zoo of machines, we could build a single device and simply swap out the programs. A single piece of hardware could simulate an adding machine in the morning, a weather predictor in the afternoon, and a game of chess at night.</p><p>What Turing proved in theory, and von Neumann embodied in practice, was the defining pivot of modern computing: from <strong>hardware diversity to software universality</strong>. Once programs became data, the age of general-purpose computation began. It is the reason your phone can host a thousand apps, and the reason <strong>&#8220;write once, run anywhere&#8221;</strong> became the mantra of digital civilization.</p><div><hr></div><p><strong>What Universality Means for Computation</strong></p><p>Turing&#8217;s imitation machine did more than solve a puzzle in mathematics. It gave us, for the first time, a <strong>precise definition of what it means to compute.</strong> A function is <em>computable</em> if, and only if, it can be carried out by a Turing machine&#8212;if its steps can be reduced to a finite sequence of reading, writing, moving, and state transitions. 
Anything that cannot be expressed in this framework lies outside the boundary of computation.</p><p>This clarity was revolutionary. Before Turing, &#8220;calculation&#8221; was a fuzzy concept, blurred between what humans could do with pen and paper and what machines might someday accomplish. After Turing, the line was sharp: if it can be written as a procedure with four primitives, it is computable. If not, it belongs to the realm of the undecidable.</p><p>Equally powerful was the <strong>compression</strong> this definition achieved. An infinite landscape of mathematical operations&#8212;addition, multiplication, polynomial expansion, trigonometry, algorithms of every shape and size&#8212;collapsed into just four actions and a tape. Complexity was no longer managed by proliferating machines, but by rearranging primitives.</p><p>From compression comes <strong>recombination</strong>. With a finite toolbox of primitives, you can stack, chain, and nest them into arbitrarily complex structures. From four actions emerge sorting algorithms, compilers, operating systems, databases, web browsers&#8212;the entire edifice of modern computing. What once required entire factories of mechanical devices is now simulated effortlessly by a single universal substrate.</p><p>This is what universality means for computation: <strong>a boundary, a compression, and a generative explosion.</strong> A boundary, because we now know exactly what is computable. A compression, because infinite tasks collapse into a handful of building blocks. And a generative explosion, because those blocks can be recombined into everything from space simulations to streaming platforms.</p><p>The first leap of universality did not simply make calculation faster; it made calculation <em>general</em>. 
It gave civilization the power to move from hardware diversity to software universality, and in doing so, redrew the horizon of the possible.</p><div><hr></div><p>The abacus and difference engines were ingenious, but they were islands&#8212;each bound to a single shoreline of purpose. Turing&#8217;s Imitation Machine, by contrast, was an ocean: one vessel that could sail anywhere, so long as you charted the right map on its tape. From this conceptual leap emerged the digital world we now inhabit, where a single laptop can translate languages, simulate galaxies, and compose symphonies.</p><p>The first leap of universality gave birth to <strong>software</strong>. It transformed computation from a zoo of specialized devices into a general substrate for thought. And it gave us a new horizon: computation itself now had a boundary, a definition of what is possible and what lies forever beyond. This is where the journey of universality truly began.</p><p>In the next essay, we&#8217;ll move to the second leap&#8212;<strong>the universality of language</strong>. Just as Turing unified the logic of machines, large language models have collapsed translation, reasoning, and dialogue into a single primitive: predicting the next token. If the first leap defined what is computable, the second is redefining what is <strong>language-modelable</strong>.</p><p>If this exploration resonates with you, I invite you to <strong>subscribe</strong>. This series is a long-form journey across the three great leaps of universality&#8212;computation, language, and civilization. 
By joining in, you&#8217;ll get the next installment delivered directly, and together we can trace how these hidden patterns of compression and recombination shape not just our machines, but our future.</p><p></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!DRIK!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F80d045a1-66cb-4685-b3cf-b87bf9bf6aec_1024x1024.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!DRIK!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F80d045a1-66cb-4685-b3cf-b87bf9bf6aec_1024x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!DRIK!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F80d045a1-66cb-4685-b3cf-b87bf9bf6aec_1024x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!DRIK!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F80d045a1-66cb-4685-b3cf-b87bf9bf6aec_1024x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!DRIK!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F80d045a1-66cb-4685-b3cf-b87bf9bf6aec_1024x1024.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!DRIK!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F80d045a1-66cb-4685-b3cf-b87bf9bf6aec_1024x1024.heic" width="1024" height="1024" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/80d045a1-66cb-4685-b3cf-b87bf9bf6aec_1024x1024.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1024,&quot;width&quot;:1024,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:185610,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://sstem.substack.com/i/174861373?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F80d045a1-66cb-4685-b3cf-b87bf9bf6aec_1024x1024.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!DRIK!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F80d045a1-66cb-4685-b3cf-b87bf9bf6aec_1024x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!DRIK!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F80d045a1-66cb-4685-b3cf-b87bf9bf6aec_1024x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!DRIK!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F80d045a1-66cb-4685-b3cf-b87bf9bf6aec_1024x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!DRIK!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F80d045a1-66cb-4685-b3cf-b87bf9bf6aec_1024x1024.heic 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p></p>]]></content:encoded></item><item><title><![CDATA[Universality: Why It Defines Every Technological Breakthrough]]></title><description><![CDATA[Electricity wasn&#8217;t just another invention&#8212;it became universal because one principle, voltage, could power lamps, radios, computers, and cities.]]></description><link>https://www.entropycontroltheory.com/p/universality-why-it-defines-every</link><guid isPermaLink="false">https://www.entropycontroltheory.com/p/universality-why-it-defines-every</guid><dc:creator><![CDATA[Susan STEM]]></dc:creator><pubDate>Mon, 29 Sep 2025 17:10:43 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!eFMX!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0012af64-b2f0-4e44-bf26-a2ada0e8ad8f_3024x4032.heic" length="0" 
type="image/jpeg"/><content:encoded><![CDATA[<p>Electricity wasn&#8217;t just another invention&#8212;it became universal because one principle, voltage, could power lamps, radios, computers, and cities. The internet wasn&#8217;t just a network&#8212;it became universal because TCP/IP could carry any kind of data: text, images, video, even money. AI feels magical today, but what we are really witnessing is the search for another universal principle. Each leap in technology begins when we discover universality: a minimal set of primitives that can simulate everything else.</p><p>And in every field of social and technical progress&#8212;whether in programming languages, global trade, or governance&#8212;we see the same gravitational pull. Systems start fragmented, full of incompatible parts, but over time they compress into a shared foundation, a common substrate that unlocks scale. Universality is not about doing more; it is about doing <em>anything</em> with less.</p><p>This series is about tracing that trajectory. From Turing&#8217;s imitation machine to today&#8217;s large language models, and forward to what I call the &#8220;social Turing machine,&#8221; we&#8217;ll explore the breakthroughs that universality has already delivered, and the ones that may still be ahead. The goal is not prediction for its own sake, but orientation: to know what kind of shifts to expect, and where it might be worth placing our collective bets.</p><div><hr></div><p><strong>What Universality Means</strong></p><p>When I say <em>universality</em>, I don&#8217;t mean &#8220;widespread adoption&#8221; or &#8220;something everyone uses.&#8221; I mean something much more precise: <strong>the ability to reduce infinite tasks to a finite set of primitives, paired with a universal substrate that can carry them out.</strong> Once that substrate exists, it can simulate anything that falls within its domain.</p><p>Think about mathematics. 
At first glance, the infinite world of equations, curves, and transformations seems impossibly diverse. Yet every calculation can ultimately be expressed in a handful of primitive operations: addition, multiplication, and a small set of axioms. Entire branches of math, from calculus to number theory, are just elaborate recombinations of those primitives.</p><p>Physics works the same way. Newton&#8217;s laws compressed the unruly motion of apples and planets into three simple principles. With those rules in hand, the universe became predictable: you could simulate the arc of a cannonball or the orbit of the moon. Later, Einstein&#8217;s equations compressed an even wider range of phenomena into the geometry of spacetime. These were universality moments in science: finite rules, infinite reach.</p><p>Biology offers an even clearer example. Life itself is governed by four bases&#8212;A, T, C, G. DNA is the universal code. Out of this tiny alphabet, nature recombines sequences to produce the staggering diversity of living organisms. A fern, a frog, a human&#8212;they are all variations on the same set of primitives.</p><p>The pattern is unmistakable. Universality always looks like this:</p><ol><li><p><strong>Compression</strong> &#8212; distilling a chaotic, infinite landscape into a small toolkit of primitives.</p></li><li><p><strong>Recombination</strong> &#8212; building endless complexity by rearranging those primitives in new ways.</p></li></ol><p>This is why universality defines breakthroughs. It is not about solving every problem separately; it is about discovering the <em>meta-solution</em>, the framework that makes every problem look like a variation of the same theme.</p><div><hr></div><p><strong>The First Leap: Computation Universality</strong></p><p>Before 1936, machines could calculate, but they could not <em>generalize</em>. The abacus was reliable, but it only mapped numbers to beads. 
Babbage&#8217;s Difference Engine, designed a century earlier, was more ambitious: a vast tangle of gears and shafts that could crank out polynomial values. Later, electromechanical calculators could add, subtract, multiply, even handle square roots.</p><p>But all of these were <strong>specialized calculators</strong>. Each one embodied a single function. If you wanted to add, you built an adding machine. If you wanted logarithms, you designed a new contraption. Complexity was handled through multiplication of hardware, not through abstraction. Civilization was living in a &#8220;mapping age,&#8221; where <strong>hardware = algorithm</strong>.</p><p>Alan Turing&#8217;s breakthrough was to step outside this paradigm. In his 1936 paper, he described an imaginary device with only <strong>four primitive actions</strong>:</p><ul><li><p><strong>Read</strong> a symbol on a tape</p></li><li><p><strong>Write</strong> a symbol on the tape</p></li><li><p><strong>Move</strong> the tape left or right</p></li><li><p><strong>Change state</strong> based on simple rules</p></li></ul><p>That was it&#8212;no gears, no special circuits, no pre-built functions. And yet, Turing proved that this minimal toolkit could simulate <em>any</em> algorithm that could ever be written down. By encoding the &#8220;instructions&#8221; of a specialized machine onto the tape, his &#8220;Imitation Machine&#8221;&#8212;what we now call the <strong>Universal Turing Machine</strong>&#8212;could mimic any other machine.</p><p>This was a profound shift: from a world of many machines, each bound to one task, to a world of one machine that could be <em>reprogrammed</em> endlessly. For the first time, hardware and function were separated. <strong>Software was born.</strong></p><p>The impact cannot be overstated. Every computer you use today&#8212;laptops, smartphones, cloud servers&#8212;rests on Turing&#8217;s logic. 
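</p><p>To see what &#8220;encoding the instructions onto the tape&#8221; buys, here is a deliberately tiny sketch in Python (a toy of my own, not a model of any historical machine). The program and the data live in the same memory, so changing what the machine does is just writing different data:</p>

```python
# Toy stored-program machine (illustrative only): instructions and data
# share one memory list, so a "program" is just more data in memory.
def run(memory):
    pc, acc = 0, 0                      # program counter, accumulator
    while True:
        op, arg = memory[pc]            # fetch the next instruction
        if op == "LOAD":
            acc = memory[arg]           # read a data cell into acc
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc           # write acc back into memory
        elif op == "HALT":
            return memory
        pc += 1

# Cells 0-3 hold the program; cells 4-6 hold the data it works on.
mem = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0), 2, 3, 0]
print(run(mem)[6])  # prints 5: the program added cells 4 and 5
```

<p>Replace the four tuples and the same loop becomes a different machine; no gears need to change.</p><p>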
A single physical device can run a word processor in the morning, simulate a weather system in the afternoon, and render a movie at night. All of these are just sequences of read&#8211;write&#8211;move&#8211;state instructions flowing through a universal substrate.</p><p>This was the <strong>first leap in universality</strong>: the compression of infinite tasks into four primitives, plus a universal tape to carry them. The consequences were not just technical, but philosophical. It gave humanity a new definition: <strong>&#8220;computable&#8221;</strong> now meant anything that could be described in these four actions.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!eFMX!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0012af64-b2f0-4e44-bf26-a2ada0e8ad8f_3024x4032.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!eFMX!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0012af64-b2f0-4e44-bf26-a2ada0e8ad8f_3024x4032.heic 424w, https://substackcdn.com/image/fetch/$s_!eFMX!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0012af64-b2f0-4e44-bf26-a2ada0e8ad8f_3024x4032.heic 848w, https://substackcdn.com/image/fetch/$s_!eFMX!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0012af64-b2f0-4e44-bf26-a2ada0e8ad8f_3024x4032.heic 1272w, https://substackcdn.com/image/fetch/$s_!eFMX!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0012af64-b2f0-4e44-bf26-a2ada0e8ad8f_3024x4032.heic 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!eFMX!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0012af64-b2f0-4e44-bf26-a2ada0e8ad8f_3024x4032.heic" width="1456" height="1941" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/0012af64-b2f0-4e44-bf26-a2ada0e8ad8f_3024x4032.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1941,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1515949,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://sstem.substack.com/i/174854624?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0012af64-b2f0-4e44-bf26-a2ada0e8ad8f_3024x4032.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!eFMX!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0012af64-b2f0-4e44-bf26-a2ada0e8ad8f_3024x4032.heic 424w, https://substackcdn.com/image/fetch/$s_!eFMX!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0012af64-b2f0-4e44-bf26-a2ada0e8ad8f_3024x4032.heic 848w, https://substackcdn.com/image/fetch/$s_!eFMX!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0012af64-b2f0-4e44-bf26-a2ada0e8ad8f_3024x4032.heic 1272w, https://substackcdn.com/image/fetch/$s_!eFMX!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0012af64-b2f0-4e44-bf26-a2ada0e8ad8f_3024x4032.heic 
1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>(my son&#8217;s abacus, a mapping machine for addition and subtraction.)</p><div><hr></div><p><strong>The Second Leap: Language Universality</strong></p><p>If Turing unified the world of <strong>computation</strong>, the next great unification came in the world of <strong>language</strong>. For decades, natural language processing (NLP) was a patchwork of specialized tools. You built one algorithm for machine translation, another for sentiment analysis, another for question answering, and still others for summarization, dialogue, or topic modeling. 
Each task had its own architecture, its own training set, its own evaluation metrics.</p><p>The result was a kind of digital Tower of Babel. Every system spoke its own dialect. Progress in one corner of NLP rarely transferred to another. A machine that could translate French to English was useless for detecting sarcasm, and a sentiment classifier had nothing to say about document retrieval. It was the same pre-1936 problem all over again: many &#8220;machines,&#8221; each confined to a narrow function.</p><p>The breakthrough came with large language models. Instead of building a bespoke solution for every task, researchers realized that all language operations could be reduced to a <strong>single primitive</strong>:</p><ul><li><p><strong>Predict the next token.</strong></p></li></ul><p>That&#8217;s it. The model doesn&#8217;t need a different engine for translation, summarization, or reasoning. It simply keeps guessing the most likely next unit of text&#8212;word, subword, or character&#8212;based on everything it has seen so far. Out of this humble mechanic, the full spectrum of linguistic intelligence emerges.</p><p>Once you have a system trained to predict tokens across billions of examples, prompts become the new &#8220;programs.&#8221; Write &#8220;Translate this to German&#8221; and the model predicts tokens that correspond to German text. Ask &#8220;Summarize this article&#8221; and the model predicts a condensed version. Query &#8220;What is 37 &#215; 41?&#8221; and the model predicts the correct arithmetic sequence. Just as the Universal Turing Machine encoded the logic of any specialized calculator onto its tape, the LLM encodes the logic of any specialized language task into the probability of the next token.</p><p>The impact has been staggering. What was once a fragmented set of siloed models has collapsed into a single universal substrate. 
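</p><p>The primitive really is that small. Here is a toy next-token predictor in Python (a bigram counter of my own devising, orders of magnitude simpler than any LLM, purely to show the shape of the mechanic):</p>

```python
# Toy illustration of the single primitive: given what came before,
# predict the most likely next token. A bigram counter, not an LLM.
from collections import Counter, defaultdict

def train(text):
    counts = defaultdict(Counter)
    tokens = text.split()
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1           # count what follows what
    return counts

def predict_next(counts, token):
    return counts[token].most_common(1)[0][0]   # most frequent successor

model = train("the tape moves left the tape moves right the head reads")
print(predict_next(model, "tape"))  # prints "moves"
```

<p>An LLM replaces the counting with a neural network trained on billions of examples, but the interface is the same: given the context so far, emit the likely next token.</p><p>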
<strong>One model, infinite tasks.</strong> Translation, reasoning, summarization, dialogue&#8212;these are no longer separate engines but different expressions of the same primitive.</p><p>This was the <strong>second leap in universality</strong>. If Turing defined &#8220;what is computable,&#8221; then LLMs are beginning to define &#8220;what is language-modelable.&#8221; The boundary of intelligence itself is being redrawn, not as a collection of tasks, but as a single act of prediction unfolding on an infinite tape of tokens.</p><div><hr></div><p><strong>The Coming Leap: Civilization Universality</strong></p><p>If computation was the first leap, and language the second, then what comes next? Look around: society today is a patchwork of silos. Finance, healthcare, education, logistics, governance&#8212;each has its own standards, its own data formats, its own institutions. Even inside a single organization, the fragmentation is obvious: the CRM doesn&#8217;t talk to the ERP, the accounting software doesn&#8217;t align with the HR platform, and government records are locked in formats from half a century ago. We&#8217;ve learned to live with these disconnects as if they were natural, but they are symptoms of a deeper problem: we lack a <strong>universal substrate for civilization itself.</strong></p><p>What might such a substrate look like? I believe it begins with a cycle of four primitives:</p><ol><li><p><strong>Consensus</strong> &#8212; deciding what matters.</p></li><li><p><strong>Protocol</strong> &#8212; encoding that decision into rules.</p></li><li><p><strong>Structure</strong> &#8212; crystallizing those rules into roles, institutions, or processes.</p></li><li><p><strong>Narrative</strong> &#8212; weaving a story that legitimizes and sustains the structure.</p></li></ol><p>This cycle repeats endlessly. Constitutions are written, companies are founded, movements are born. 
Each time, the pattern is the same: consensus &#8594; protocol &#8594; structure &#8594; narrative. Yet today this cycle is still slow, brittle, and mostly manual. Every institution reinvents the wheel, every industry defines its own silo.</p><p>The coming leap would be to treat this cycle not as a sociological curiosity but as a <strong>universal primitive set</strong>, the way read/write/move/state defined computation, or token prediction defined language. Imagine a <strong>Social Turing Machine</strong>, where consensus can be captured digitally, protocols encoded transparently, structures instantiated as executable processes, and narratives reinforced through shared symbolic layers. In such a world, rules and institutions themselves could become <strong>executable code</strong>&#8212;not just written in paper constitutions or buried in bureaucracy, but running on a shared substrate of natural language, APIs, and vector spaces.</p><p>This is, of course, speculation. But then again, so was Turing&#8217;s imitation machine in 1936, so too were the first neural probabilistic language models in the early 2000s. What matters is the direction of compression: infinite diversity of social tasks, reduced to a finite cycle of primitives. From there, recombination does the rest.</p><p>If the first leap defined what is computable, and the second what is language-modelable, then the third may define what is <strong>governable and sharable at the scale of civilization</strong>.</p><div><hr></div><p><strong>Why Universality Defines Breakthroughs</strong></p><p>When people talk about breakthroughs, they often imagine <em>more power</em>&#8212;faster processors, bigger datasets, stronger engines. But history shows that raw scale is not what truly changes the game. What defines a breakthrough is not the ability to do <strong>more of the same</strong>, but the ability to do <strong>anything new with less</strong>. 
That is the essence of universality.</p><p>Universality is about <strong>compression</strong>. A messy, fragmented field suddenly collapses into a handful of primitives and a universal substrate that can recombine them. The abacus, the adding machine, and the polynomial calculator were all swept into a single framework of read, write, move, and state. Translation, summarization, and reasoning dissolved into the single act of predicting the next token. In each case, the endless complexity of tasks was compressed into a smaller and smaller toolkit.</p><p>But compression alone is not enough. The real magic comes from <strong>recombination</strong>. Once you&#8217;ve identified the minimal building blocks, you can shuffle and stack them into countless new configurations. With just four DNA bases, life built whales and willows. With just two electric charges, engineers built circuits, radios, and supercomputers. With four computational primitives, programmers built every software system we know today. Universality means that <strong>a finite toolbox can be endlessly repurposed</strong>.</p><p>That is why universality defines breakthroughs. It does not merely speed up existing processes&#8212;it <strong>rewrites the limits of what is possible</strong>. The arrival of software meant we no longer needed a new machine for each task; one machine could simulate all of them. The arrival of LLMs means we no longer need a separate model for each linguistic task; one model can fluidly perform all of them. And if a Social Turing Machine emerges, we may no longer need separate institutions for each domain; one substrate could coordinate many forms of governance, commerce, and collective action.</p><p>Every leap in history follows this pattern: find the smallest set of building blocks, discover the universal substrate, and unleash infinite recombination. 
Breakthroughs are not about adding horsepower&#8212;they are about discovering universality.</p><div><hr></div><p><strong>The Road Ahead</strong></p><p>When we zoom out, the pattern is unmistakable. Each era-defining breakthrough&#8212;whether in science, engineering, or society&#8212;has come from a moment of universality. We discover a set of minimal primitives, we build a universal substrate to carry them, and suddenly infinite possibilities open up.</p><ul><li><p>In <strong>computation</strong>, it was Turing&#8217;s four actions and a universal tape.</p></li><li><p>In <strong>language</strong>, it became token prediction on a universal corpus.</p></li><li><p>And in <strong>civilization</strong>, we may soon see consensus, protocol, structure, and narrative woven into a new universal cycle.</p></li></ul><p>Universality doesn&#8217;t just make things faster. It changes the horizon of what is possible. It tells us that instead of multiplying silos&#8212;more machines, more models, more institutions&#8212;we can compress them into a common base and let recombination do the rest. That is how breakthroughs scale.</p><p>This is why universality matters. It is not a curiosity of computer science, nor a quirk of AI research. 
It is the hidden law of progress: breakthroughs happen when we collapse chaos into clarity, when we discover the few rules that can simulate everything else.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!DjDP!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7b970503-8f08-42b2-9372-e521a0a3bbea_617x680.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!DjDP!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7b970503-8f08-42b2-9372-e521a0a3bbea_617x680.heic 424w, https://substackcdn.com/image/fetch/$s_!DjDP!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7b970503-8f08-42b2-9372-e521a0a3bbea_617x680.heic 848w, https://substackcdn.com/image/fetch/$s_!DjDP!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7b970503-8f08-42b2-9372-e521a0a3bbea_617x680.heic 1272w, https://substackcdn.com/image/fetch/$s_!DjDP!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7b970503-8f08-42b2-9372-e521a0a3bbea_617x680.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!DjDP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7b970503-8f08-42b2-9372-e521a0a3bbea_617x680.heic" width="617" height="680" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/7b970503-8f08-42b2-9372-e521a0a3bbea_617x680.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:680,&quot;width&quot;:617,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:72115,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://sstem.substack.com/i/174854624?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7b970503-8f08-42b2-9372-e521a0a3bbea_617x680.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!DjDP!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7b970503-8f08-42b2-9372-e521a0a3bbea_617x680.heic 424w, https://substackcdn.com/image/fetch/$s_!DjDP!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7b970503-8f08-42b2-9372-e521a0a3bbea_617x680.heic 848w, https://substackcdn.com/image/fetch/$s_!DjDP!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7b970503-8f08-42b2-9372-e521a0a3bbea_617x680.heic 1272w, https://substackcdn.com/image/fetch/$s_!DjDP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7b970503-8f08-42b2-9372-e521a0a3bbea_617x680.heic 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" 
viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p></p>]]></content:encoded></item><item><title><![CDATA[High Schoolers Are Vibe Coding. What Does That Mean for Software’s Future?]]></title><description><![CDATA[Vibe coding isn&#8217;t a mistake &#8212; it&#8217;s an entropy source. 
The future of software will depend on how we filter, correct, and evolve it.]]></description><link>https://www.entropycontroltheory.com/p/high-schoolers-are-vibe-coding-what</link><guid isPermaLink="false">https://www.entropycontroltheory.com/p/high-schoolers-are-vibe-coding-what</guid><dc:creator><![CDATA[Susan STEM]]></dc:creator><pubDate>Sat, 27 Sep 2025 20:17:19 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!F1-N!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ec995f8-569f-4453-aa9e-2eeff947e7e7_1024x1024.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Recently, I stumbled upon a group of high schoolers coding in my neighborhood library. And of course &#8212; they were vibe coding. No tests, no documentation, no linters. Just pure energy: bold, excited, chasing the thrill of making something &#8212; anything &#8212; come alive on screen. It was messy, chaotic, and full of bugs.</p><p>But here&#8217;s the real question: <strong>is this recklessness something to eliminate, or is it the raw fuel of a new software ecosystem?</strong> What looks like disorder may actually be the true starting point of the next paradigm in software engineering. Because the way we code doesn&#8217;t just shape our programs &#8212; <strong>it shapes how we work, how we learn, and ultimately how our entire ecosystem evolves.</strong></p><div><hr></div><p><strong>Vibe Coding = Entropy Source, Not a Mistake</strong></p><p>When we see high schoolers or beginners writing code without tests, without documentation, and without linters, the instinct is to dismiss it as sloppy or even dangerous.
Yet this view is far too narrow, because what looks like chaos is actually <strong>entropy &#8212; the raw fuel of evolution.</strong></p><p>Vibe coding produces uncertainty: most of its output will not surpass what an experienced engineer can write by hand, but occasionally it yields results that are unexpectedly elegant, efficient, or creative &#8212; far beyond what careful planning would have produced. In this sense, vibe coding does not create polished products, it generates <strong>mutations</strong>. And as in biology, mutations are not failures but the very material on which selection acts.</p><p>Entropy in code manifests as <strong>volume</strong> (many snippets, variants, and prototypes), <strong>diversity</strong> (different styles and redundant approaches), and <strong>uncertainty</strong> (some correct, some broken, many in-between). The key is not to suppress this entropy but to <strong>harness it,</strong> which shifts the real question from <em>&#8220;how do we stop vibe coding?&#8221;</em> to <em>&#8220;how do we design systems that can channel it productively?&#8221;</em> This is where AI and modern tools matter most: not as mere assistants cleaning up after human mistakes, but as <strong>entropy filters</strong> &#8212; filtering obvious errors, correcting messy code into reusable blocks, and selecting fragments that deserve to persist.</p><p>Under this lens, vibe coding is not a liability but a source of variation, energy, and possibility, provided the ecosystem has the right filters and governance. <strong>The future of software engineering will embrace vibe coding: its output is not a mistake, but an entropy source &#8212; to be filtered, corrected, and transformed into enduring structure.</strong></p><div><hr></div><p><strong>Old Risk-Only View vs. 
New Entropy View</strong></p><ul><li><p>Old paradigm: vibe code = sloppy mistakes, must be policed.</p></li><li><p>New paradigm: vibe code = entropy source, like genetic mutations &#8212; most fail, but they fuel evolution.</p></li></ul><div><hr></div><p><strong>Why Manual Policing Won&#8217;t Scale</strong></p><p>At first glance, the obvious response to messy vibe code is simple: have more experienced developers review it. But this &#8220;manual policing&#8221; approach collapses the moment you think at scale. The <strong>volume</strong> of code produced by AI-assisted beginners already exceeds what any team of human reviewers can keep up with &#8212; and that gap will only widen as models become faster and more accessible. What was once a trickle of half-baked snippets has become a flood, and no number of code review checklists can dam it up.</p><p>Even if we tried, the <strong>cost would be unbearable</strong>. Senior engineers would spend their time acting as &#8220;human linters,&#8221; correcting indentation, fixing type mismatches, and patching obvious logic gaps. That would be the worst possible use of scarce talent: instead of designing architectures or protocols, they would be stuck chasing down the noise generated by the long tail of vibe coding. The result is wasted expertise, slower ecosystems, and inevitable bottlenecks.</p><p>In short, manual policing turns into a <strong>scaling trap</strong>: the more code that gets produced, the more people you need to check it, but the less value those people create. If the ecosystem depends on human review alone, it will stagnate under its own weight.
The only viable path forward is <strong>systemic filtering</strong> &#8212; protocols, automated validators, AI refactoring tools, and governance mechanisms that can handle the flood at machine speed.</p><div><hr></div><p><strong>Conclusion</strong></p><p>The future of software engineering will embrace vibe coding: its output is not a mistake, but an entropy source &#8212; to be filtered, corrected, and transformed into enduring structure. <strong>The real challenge is building the ecosystem that can handle this entropy.</strong> That means creating tools and protocols that act as systemic filters.</p><p></p><p><em>to be continued&#8230;.</em></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!F1-N!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ec995f8-569f-4453-aa9e-2eeff947e7e7_1024x1024.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!F1-N!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ec995f8-569f-4453-aa9e-2eeff947e7e7_1024x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!F1-N!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ec995f8-569f-4453-aa9e-2eeff947e7e7_1024x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!F1-N!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ec995f8-569f-4453-aa9e-2eeff947e7e7_1024x1024.heic 1272w, https://substackcdn.com/image/fetch/$s_!F1-N!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ec995f8-569f-4453-aa9e-2eeff947e7e7_1024x1024.heic 
1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!F1-N!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ec995f8-569f-4453-aa9e-2eeff947e7e7_1024x1024.heic" width="1024" height="1024" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/7ec995f8-569f-4453-aa9e-2eeff947e7e7_1024x1024.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1024,&quot;width&quot;:1024,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:281197,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://sstem.substack.com/i/174711668?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ec995f8-569f-4453-aa9e-2eeff947e7e7_1024x1024.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!F1-N!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ec995f8-569f-4453-aa9e-2eeff947e7e7_1024x1024.heic 424w, https://substackcdn.com/image/fetch/$s_!F1-N!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ec995f8-569f-4453-aa9e-2eeff947e7e7_1024x1024.heic 848w, https://substackcdn.com/image/fetch/$s_!F1-N!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ec995f8-569f-4453-aa9e-2eeff947e7e7_1024x1024.heic 1272w, 
https://substackcdn.com/image/fetch/$s_!F1-N!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7ec995f8-569f-4453-aa9e-2eeff947e7e7_1024x1024.heic 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div>]]></content:encoded></item></channel></rss>