<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[The Felix View]]></title><description><![CDATA[Our latest Thoughts and Company News]]></description><link>https://www.thefelixview.com</link><image><url>https://substackcdn.com/image/fetch/$s_!AK_-!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0913670e-feaf-4450-b8fa-42b20cdd33cc_1001x1001.png</url><title>The Felix View</title><link>https://www.thefelixview.com</link></image><generator>Substack</generator><lastBuildDate>Fri, 15 May 2026 21:03:40 GMT</lastBuildDate><atom:link href="https://www.thefelixview.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Felix Research]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[felixresearch@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[felixresearch@substack.com]]></itunes:email><itunes:name><![CDATA[Felix Research]]></itunes:name></itunes:owner><itunes:author><![CDATA[Felix Research]]></itunes:author><googleplay:owner><![CDATA[felixresearch@substack.com]]></googleplay:owner><googleplay:email><![CDATA[felixresearch@substack.com]]></googleplay:email><googleplay:author><![CDATA[Felix Research]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[Capturing Experience and pivoting from Hero Culture]]></title><description><![CDATA[On avoiding siloed expertise and fortifying firm intelligence]]></description><link>https://www.thefelixview.com/p/capturing-experience-and-pivoting</link><guid 
isPermaLink="false">https://www.thefelixview.com/p/capturing-experience-and-pivoting</guid><dc:creator><![CDATA[Sav]]></dc:creator><pubDate>Wed, 08 Apr 2026 15:48:02 GMT</pubDate><enclosure url="https://images.unsplash.com/photo-1589725971211-7e86a631e2c2?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxzaWxvfGVufDB8fHx8MTc3NTY2MzA3Mnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://images.unsplash.com/photo-1589725971211-7e86a631e2c2?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxzaWxvfGVufDB8fHx8MTc3NTY2MzA3Mnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://images.unsplash.com/photo-1589725971211-7e86a631e2c2?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxzaWxvfGVufDB8fHx8MTc3NTY2MzA3Mnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1589725971211-7e86a631e2c2?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxzaWxvfGVufDB8fHx8MTc3NTY2MzA3Mnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1589725971211-7e86a631e2c2?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxzaWxvfGVufDB8fHx8MTc3NTY2MzA3Mnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1589725971211-7e86a631e2c2?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxzaWxvfGVufDB8fHx8MTc3NTY2MzA3Mnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw"><img 
src="https://images.unsplash.com/photo-1589725971211-7e86a631e2c2?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxzaWxvfGVufDB8fHx8MTc3NTY2MzA3Mnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" width="5848" height="3860" data-attrs="{&quot;src&quot;:&quot;https://images.unsplash.com/photo-1589725971211-7e86a631e2c2?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxzaWxvfGVufDB8fHx8MTc3NTY2MzA3Mnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:3860,&quot;width&quot;:5848,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;white and blue factory under blue sky during daytime&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="white and blue factory under blue sky during daytime" title="white and blue factory under blue sky during daytime" srcset="https://images.unsplash.com/photo-1589725971211-7e86a631e2c2?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxzaWxvfGVufDB8fHx8MTc3NTY2MzA3Mnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1589725971211-7e86a631e2c2?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxzaWxvfGVufDB8fHx8MTc3NTY2MzA3Mnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1589725971211-7e86a631e2c2?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxzaWxvfGVufDB8fHx8MTc3NTY2MzA3Mnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, 
https://images.unsplash.com/photo-1589725971211-7e86a631e2c2?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHxzaWxvfGVufDB8fHx8MTc3NTY2MzA3Mnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Photo by <a href="https://unsplash.com/@waldemarbrandt67w">Waldemar Brandt</a> on <a href="https://unsplash.com">Unsplash</a></figcaption></figure></div><p>The most resilient financial firms are disciplined about redundancy. 
They engineer fail-safes for their trading stacks and diversify their counterparties to ensure no single technical glitch can halt operations. In almost every other context, a lack of backup is understood to be an unacceptable risk.</p><p>Yet a glaring vulnerability persists in research departments: the Expert Bottleneck. This is where a firm&#8217;s most critical strategic intelligence is trapped solely in the minds of a few indispensable people. Keeping institutional memory siloed within an individual is an unsustainable burden: it risks exhausting the expert, puts their hard-won insights at risk of being lost, and leaves the entire team vulnerable to an information blackout.</p><p>The &#8220;Hero Culture&#8221; of research is often celebrated as a mark of elite talent, but it&#8217;s a systemic risk. Relying on a few star analysts who hold the keys to the kingdom creates massive hidden costs. When a team depends on the brilliance and intuition of a few, the pace of the entire firm is limited by those individuals&#8217; bandwidth. Output becomes tied linearly to headcount, which does not scale; if your lead researcher is occupied, the rest of the firm&#8217;s strategic pipeline sits idle.</p><p>Beyond simple availability, there is the issue of &#8220;shadow data.&#8221; When an analyst performs a complex synthesis of market trends, the final memo is often the only surviving record of that work. The hundreds of intermediary steps, the rejected hypotheses, and the nuanced connections between entities are often lost the moment the analyst closes their laptop. 
This loss of provenance means that a firm is constantly paying to reinvent the wheel, as subsequent team members lack the &#8220;logic trail&#8221; required to build upon previous work.</p><p>As markets become more volatile and regulatory landscapes shift, the manual retrieval of data has become a tax that even the best analysts can no longer afford to pay. Professionals are currently drowning in fragmented workflows where tools often add more complexity than clarity. Every hour spent hunting for lost context or reformatting a document is an hour stolen from high-level decision-making. We see this daily in the way research teams interact with their data; they are forced to act as manual &#8220;data janitors&#8221; before they can act as strategic thinkers.</p><p>This tax is compounded by the sheer volume of unstructured data entering the firm. Current research solutions often amplify the noise rather than the signal. An analyst might find a relevant insight in a 300-page filing, but if that insight cannot be seamlessly linked to the firm&#8217;s proprietary data or previous investment theses, it remains a stranded asset. This fragmentation is a direct contributor to increased error rates and missed opportunities, not just a nuisance.</p><p>To de-risk the expert, firms must move from a model of individual star power to one of system-based intelligence. This requires a workspace that preserves the logic and history of every insight. By decoupling intelligence from headcount, the firm&#8217;s collective knowledge remains a secure, navigable asset.</p><p>In an era where markets move faster than any one person can read, the firms that thrive will be those that have integrated team expertise into their processes. 
Felix Research is developing FelixOne, an intelligent harness that unifies frontier AI models with secure infrastructure to ensure the brain of the operation remains an institutional asset.</p><p>By investing in a system that captures and scales knowledge, you ensure that even when your people move on, your momentum does not. It&#8217;s time to stop relying on the luck of the Hero and start building the certainty of the system.</p><p></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.thefelixview.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.thefelixview.com/subscribe?"><span>Subscribe now</span></a></p><p></p>]]></content:encoded></item><item><title><![CDATA[The Myth of the “Naked” LLM]]></title><description><![CDATA[A Conversation with Dimitri, Felix Research Founding Engineer]]></description><link>https://www.thefelixview.com/p/the-myth-of-the-naked-llm</link><guid isPermaLink="false">https://www.thefelixview.com/p/the-myth-of-the-naked-llm</guid><dc:creator><![CDATA[Sav]]></dc:creator><pubDate>Thu, 02 Apr 2026 14:56:08 GMT</pubDate><enclosure url="https://images.unsplash.com/photo-1626995587693-c50c9a5544c6?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw2M3x8cmFjZSUyMGNhciUyMGFuYXRvbXl8ZW58MHx8fHwxNzc1MTMxMzc1fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://images.unsplash.com/photo-1626995587693-c50c9a5544c6?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw2M3x8cmFjZSUyMGNhciUyMGFuYXRvbXl8ZW58MHx8fHwxNzc1MTMxMzc1fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source 
type="image/webp" srcset="https://images.unsplash.com/photo-1626995587693-c50c9a5544c6?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw2M3x8cmFjZSUyMGNhciUyMGFuYXRvbXl8ZW58MHx8fHwxNzc1MTMxMzc1fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1626995587693-c50c9a5544c6?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw2M3x8cmFjZSUyMGNhciUyMGFuYXRvbXl8ZW58MHx8fHwxNzc1MTMxMzc1fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1626995587693-c50c9a5544c6?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw2M3x8cmFjZSUyMGNhciUyMGFuYXRvbXl8ZW58MHx8fHwxNzc1MTMxMzc1fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1626995587693-c50c9a5544c6?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw2M3x8cmFjZSUyMGNhciUyMGFuYXRvbXl8ZW58MHx8fHwxNzc1MTMxMzc1fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw"><img src="https://images.unsplash.com/photo-1626995587693-c50c9a5544c6?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw2M3x8cmFjZSUyMGNhciUyMGFuYXRvbXl8ZW58MHx8fHwxNzc1MTMxMzc1fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" width="5916" height="3920" data-attrs="{&quot;src&quot;:&quot;https://images.unsplash.com/photo-1626995587693-c50c9a5544c6?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw2M3x8cmFjZSUyMGNhciUyMGFuYXRvbXl8ZW58MHx8fHwxNzc1MTMxMzc1fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:3920,&quot;width&quot;:5916,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;man in black t-shirt and brown pants standing beside black and silver 
motorcycle&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="man in black t-shirt and brown pants standing beside black and silver motorcycle" title="man in black t-shirt and brown pants standing beside black and silver motorcycle" srcset="https://images.unsplash.com/photo-1626995587693-c50c9a5544c6?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw2M3x8cmFjZSUyMGNhciUyMGFuYXRvbXl8ZW58MHx8fHwxNzc1MTMxMzc1fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1626995587693-c50c9a5544c6?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw2M3x8cmFjZSUyMGNhciUyMGFuYXRvbXl8ZW58MHx8fHwxNzc1MTMxMzc1fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1626995587693-c50c9a5544c6?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw2M3x8cmFjZSUyMGNhciUyMGFuYXRvbXl8ZW58MHx8fHwxNzc1MTMxMzc1fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1626995587693-c50c9a5544c6?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw2M3x8cmFjZSUyMGNhciUyMGFuYXRvbXl8ZW58MHx8fHwxNzc1MTMxMzc1fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Photo by <a href="https://unsplash.com/@brodus_k">Paul Kansonkho</a> on <a href="https://unsplash.com">Unsplash</a></figcaption></figure></div><p>In the current AI arms race, there is a persistent misunderstanding that the model is the <em>entire</em> product. Investors and users alike often ask why bespoke solutions are necessary when frontier models like Claude and similar exist as &#8220;standalone powerhouses&#8221;.</p><p>However, treating an LLM as a complete solution is like admiring a high-performance engine while forgetting it requires a surrounding structure to actually move.</p><p>I sat down with Felix Research&#8217;s Founding Engineer, Dimitri, to address the myth of the &#8220;naked&#8221; LLM. 
We discussed why the future of specialised research doesn&#8217;t lie in the brilliance of a single model, but in the &#8220;glue&#8221;: the programmatic tools, the guardrails, the architecture and, crucially, the domain expertise.</p><p><strong>Sav:</strong> So, we already discussed context windows for the <a href="https://www.thefelixview.com/p/why-the-future-of-ai-isnt-about-bigger">recent piece</a>. How it&#8217;s not just an issue of bigger and bigger because of the costs... they increase quadratically.</p><p><strong>Dimitri:</strong> <strong>Yeah. They skyrocket.</strong></p><p><strong>Sav:</strong> Exactly. But what I&#8217;m curious about is your answer, as the engineer, when investors are like, &#8220;<em>Why can&#8217;t I just use Claude? Why do I need you guys?</em>&#8221; Is it frustrating? Or do they have a point? Articulate the reasoning!</p><p><strong>Dimitri: Okay. Well. It&#8217;s because everyone is looking at the software and attributing all of it to &#8220;the LLM&#8221; because they are &#8220;AI companies.&#8221; But what they&#8217;re forgetting is that these companies hire engineers to build around the LLM. That&#8217;s the final product you interact with. Not just a lone LLM.</strong></p><blockquote><p>Dimitri&#8217;s frustration is backed by the latest industry research: the field is moving away from evaluating LLMs as standalone &#8220;brains&#8221; and toward evaluating them as Composite Systems. A recent arXiv paper, entitled <em>Precision Proactivity: Measuring Cognitive Load in Real-World AI-Assisted Work</em>, asserts that &#8220;Agents are systems, not models,&#8221; and that single-turn accuracy (the &#8220;naked&#8221; output) is no longer a viable metric for real-world enterprise utility.</p></blockquote><p><strong>D: What LLMs are really good at is: you give it text, it understands it </strong>[to varying degrees, depending on what &#8220;understand&#8221; means to you]<strong>, and it gives you output. 
Output specifically (predominantly) in text format. Now, that doesn&#8217;t sound very powerful, but it is in the context of a programmatic system that can do something with that output.</strong></p><p><strong>Ours </strong>[Felix Intelligence]<strong> is a bit different because it&#8217;s using a visual LLM. It&#8217;s taking in an image and producing text. But the system we built - the parts that show the tables, the RAG (Retrieval Augmented Generation), the bit where you ask a question and it finds the relevant source - that&#8217;s a tool built using normal programming languages. It&#8217;s engineering glue.</strong></p><p><strong>S:</strong> So there&#8217;s an architectural misunderstanding going on. People think the entire product is the &#8220;naked&#8221; LLM. They don&#8217;t realise that Claude and similar are (often) a blend of engineering and models?</p><p><strong>D: Yeah, so if you go to GPT and press &#8220;Deep Research/Web Search&#8221;, they are providing a programmatic tool to the LLM to use if it wants to - a tool where the LLM inputs text into the tool and the tool returns text. To be clear for a second, this isn&#8217;t prompt engineering or fine-tuning. This is giving it access to tools, regardless of prompt engineering.</strong></p><p><strong>S:</strong> What do you actually mean by &#8220;tools&#8221;?</p><p><strong>D: Okay, so, you have the LLM in a box. You have the prompt going in, and the output being spat out. But the output doesn&#8217;t have to be just raw text. It can be a specific structured format.</strong></p><p><strong>Then you have &#8220;LLM tools.&#8221; These are mechanisms the LLM can use, for example to query our database. These tools are programmatic. It&#8217;s basically like giving the LLM an API so it can pull information from the external world. Importantly, we have to build each of those tools programmatically before the LLM can touch them.</strong></p><p><strong>S:</strong> So the efficiency and power that the industry is speaking of at the moment doesn&#8217;t come from the lone LLM.</p><p><strong>D: It comes from a system. Multiple LLMs plus programmatic engineering to put it all together and tell it what to do. That&#8217;s what people forget. Take Claude Code; the reason it&#8217;s so powerful is that Anthropic built engineering tools around the LLM to enable it.</strong></p><blockquote><p>The rise of &#8220;Claude Code&#8221; and OpenAI&#8217;s &#8220;Deep Research&#8221; proves that the frontier labs are no longer just selling intelligence; they are selling orchestration. However, as the arXiv paper notes, these general tools often fail in production because they lack the &#8220;operational constraints&#8221; and &#8220;domain-specific safety&#8221; that specialised systems like Felix provide.</p></blockquote><p><strong>D: That&#8217;s the problem with investors </strong>[with love! We love you]<strong>. They go, &#8220;</strong><em><strong>Why should I invest in this when Anthropic is going to replace you?</strong></em><strong>&#8221; Well, because they&#8217;re not. Because if they want to go into a specific domain and create purpose-built tools at our level of granularity, they would need a team like ours.</strong></p><p><strong>S:</strong> If you were on an investor call and you could speak freely, what would you actually say in response to that line of questioning?</p><p><strong>D: I mean </strong>[he laughs to himself]<strong>, I&#8217;ve done this many times. I tell them: an LLM is a tool, and what you&#8217;re investing in is the system around it that makes it powerful and specialised. There&#8217;s a reason the frontier labs are still hiring engineers like crazy and paying them however many hundreds of thousands to millions a year. 
It&#8217;s because the LLM is useful, but the real power comes from the enabling system.</strong></p><p><strong>It&#8217;s like... using power tools-</strong></p><p><strong>S:</strong> -A jackhammer is powerful and effective but you can&#8217;t just hand it to anyone and expect perfect outcomes with no mistakes or destruction.</p><p><strong>D: Exactly.</strong></p><p><strong>S:</strong> So when someone says, &#8220;<em>Why can&#8217;t I just use Claude?</em>&#8221;, they&#8217;re ignoring that you still have to build all these domain-specific tools.</p><p><strong>D: Yeah. Look at what Anthropic did with Claude Code. They provided a tool for the LLM to write raw Bash - Bash being a system-level language.</strong></p><p><strong>S:</strong> Bash is the programming language that... what? Controls the computer?</p><p><strong>D: Yeah, well, at least on Mac and Linux. Anthropic couldn&#8217;t - and wouldn&#8217;t - write a specific tool for every single thing. So instead they said, &#8220;</strong><em><strong>Here&#8217;s Bash. Bash can do everything. If we give the LLM the ability to use Bash, we don&#8217;t need to write a tool for every little task.</strong></em><strong>&#8221;</strong></p><p><strong>S:</strong> [Likely looking perturbed] Um&#8230;subject to&#8230;permissions&#8230;?</p><p><strong>D: Yeah. Because of the potential for danger, the LLM will ask you for permission every single time it wants to use Bash - rather than it being like an on switch that stays on. 
Like if you tell it to do something it shouldn&#8217;t, it can technically send an email to your boss or the government.</strong></p><blockquote><p>Lepine, Kim, Mishkin and Beane highlight that &#8220;reliability is more valuable than brilliance.&#8221; In financial research, an agent that can write Bash is &#8220;brilliant,&#8221; but an agent that can reliably navigate a proprietary financial database without violating PII (Personally Identifiable Information) boundaries is &#8220;valuable.&#8221; Felix&#8217;s value lies in this bridge between raw power and operational safety.</p></blockquote><p><strong>D: The important thing to get across is that we&#8217;re not reinventing the wheel. We&#8217;re adding value. We&#8217;re building a powerful system around various LLMs, as well as blending models and proprietary RAG guardrails.</strong></p><p><strong>S:</strong> They think it&#8217;s a brain; it&#8217;s actually an engine. And we&#8217;re building the car.</p><p><strong>D: Yeah. You can&#8217;t generalise expertise. 
You have to build the specific tools for the domain.</strong></p><p></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.thefelixview.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.thefelixview.com/subscribe?"><span>Subscribe now</span></a></p><p></p><p><em>References</em>: <a href="https://arxiv.org/pdf/2505.10742">https://arxiv.org/pdf/2505.10742</a></p><div class="embedded-post-wrap" data-attrs="{&quot;id&quot;:192379643,&quot;url&quot;:&quot;https://www.oneusefulthing.org/p/claude-dispatch-and-the-power-of&quot;,&quot;publication_id&quot;:1180644,&quot;publication_name&quot;:&quot;One Useful Thing&quot;,&quot;publication_logo_url&quot;:&quot;https://substackcdn.com/image/fetch/$s_!hyZZ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fbucketeer-e05bbc84-baa3-437e-9518-adb32be77984.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd2ee4f7-3e71-42f0-92eb-4d3018127e08_1024x1024.png&quot;,&quot;title&quot;:&quot;Claude Dispatch and the Power of Interfaces&quot;,&quot;truncated_body_text&quot;:&quot;AIs are already far more capable than most people realize. A large part of this so-called capability overhang comes not from the limits of AI (though, of course, they still have many limits), but from how people interact with it. The vast majority of people access AI through chatbots, and usually the free versions with less capable models. 
A chatbot is &#8230;&quot;,&quot;date&quot;:&quot;2026-03-31T22:34:37.308Z&quot;,&quot;like_count&quot;:541,&quot;comment_count&quot;:26,&quot;bylines&quot;:[{&quot;id&quot;:846835,&quot;name&quot;:&quot;Ethan Mollick&quot;,&quot;handle&quot;:&quot;oneusefulthing&quot;,&quot;previous_name&quot;:null,&quot;photo_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/7c05cdbc-40fd-459b-915d-f8bc8ac8bf01_3509x5263.jpeg&quot;,&quot;bio&quot;:&quot;I am a professor at the Wharton School of the University of Pennsylvania. I study entrepreneurship &amp; innovation and AI. I am trying to understand what our new AI-haunted era means for work and education.&quot;,&quot;profile_set_up_at&quot;:&quot;2022-07-03T02:55:46.296Z&quot;,&quot;reader_installed_at&quot;:&quot;2024-10-18T13:48:35.897Z&quot;,&quot;publicationUsers&quot;:[{&quot;id&quot;:1134116,&quot;user_id&quot;:846835,&quot;publication_id&quot;:1180644,&quot;role&quot;:&quot;admin&quot;,&quot;public&quot;:true,&quot;is_primary&quot;:true,&quot;publication&quot;:{&quot;id&quot;:1180644,&quot;name&quot;:&quot;One Useful Thing&quot;,&quot;subdomain&quot;:&quot;oneusefulthing&quot;,&quot;custom_domain&quot;:&quot;www.oneusefulthing.org&quot;,&quot;custom_domain_optional&quot;:false,&quot;hero_text&quot;:&quot;Trying to understand the implications of AI for work, education, and life. By Prof. 
Ethan Mollick&quot;,&quot;logo_url&quot;:&quot;https://bucketeer-e05bbc84-baa3-437e-9518-adb32be77984.s3.amazonaws.com/public/images/cd2ee4f7-3e71-42f0-92eb-4d3018127e08_1024x1024.png&quot;,&quot;author_id&quot;:846835,&quot;primary_user_id&quot;:846835,&quot;theme_var_background_pop&quot;:&quot;#BAA049&quot;,&quot;created_at&quot;:&quot;2022-11-08T03:49:40.900Z&quot;,&quot;email_from_name&quot;:null,&quot;copyright&quot;:&quot;Ethan Mollick&quot;,&quot;founding_plan_name&quot;:&quot;Founding Member&quot;,&quot;community_enabled&quot;:true,&quot;invite_only&quot;:false,&quot;payments_state&quot;:&quot;enabled&quot;,&quot;language&quot;:null,&quot;explicit&quot;:false,&quot;homepage_type&quot;:&quot;newspaper&quot;,&quot;is_personal_mode&quot;:false,&quot;logo_url_wide&quot;:null}}],&quot;twitter_screen_name&quot;:&quot;emollick&quot;,&quot;is_guest&quot;:false,&quot;bestseller_tier&quot;:1000,&quot;status&quot;:{&quot;bestsellerTier&quot;:1000,&quot;subscriberTier&quot;:5,&quot;leaderboard&quot;:null,&quot;vip&quot;:false,&quot;badge&quot;:{&quot;type&quot;:&quot;bestseller&quot;,&quot;tier&quot;:1000},&quot;paidPublicationIds&quot;:[320996,2880588,2141880,1084089,3061248,1198173,35345],&quot;subscriber&quot;:null}}],&quot;utm_campaign&quot;:null,&quot;belowTheFold&quot;:true,&quot;type&quot;:&quot;newsletter&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="EmbeddedPostToDOM"><a class="embedded-post" native="true" href="https://www.oneusefulthing.org/p/claude-dispatch-and-the-power-of?utm_source=substack&amp;utm_campaign=post_embed&amp;utm_medium=web"><div class="embedded-post-header"><img class="embedded-post-publication-logo" src="https://substackcdn.com/image/fetch/$s_!hyZZ!,w_56,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fbucketeer-e05bbc84-baa3-437e-9518-adb32be77984.s3.amazonaws.com%2Fpublic%2Fimages%2Fcd2ee4f7-3e71-42f0-92eb-4d3018127e08_1024x1024.png" loading="lazy"><span class="embedded-post-publication-name">One Useful 
Thing</span></div><div class="embedded-post-title-wrapper"><div class="embedded-post-title">Claude Dispatch and the Power of Interfaces</div></div><div class="embedded-post-body">AIs are already far more capable than most people realize. A large part of this so-called capability overhang comes not from the limits of AI (though, of course, they still have many limits), but from how people interact with it. The vast majority of people access AI through chatbots, and usually the free versions with less capable models. A chatbot is &#8230;</div><div class="embedded-post-cta-wrapper"><span class="embedded-post-cta">Read more</span></div><div class="embedded-post-meta">a month ago &#183; 541 likes &#183; 26 comments &#183; Ethan Mollick</div></a></div>]]></content:encoded></item><item><title><![CDATA[Why the Future of AI Isn’t About Bigger Context Windows]]></title><description><![CDATA[The efficiency trap]]></description><link>https://www.thefelixview.com/p/why-the-future-of-ai-isnt-about-bigger</link><guid isPermaLink="false">https://www.thefelixview.com/p/why-the-future-of-ai-isnt-about-bigger</guid><dc:creator><![CDATA[Sav]]></dc:creator><pubDate>Tue, 24 Mar 2026 15:37:02 GMT</pubDate><enclosure url="https://images.unsplash.com/photo-1598721987126-0e7bee3ba71f?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxN3x8d2luZG93c3xlbnwwfHx8fDE3NzQzNjY2ODV8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://images.unsplash.com/photo-1598721987126-0e7bee3ba71f?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxN3x8d2luZG93c3xlbnwwfHx8fDE3NzQzNjY2ODV8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://images.unsplash.com/photo-1598721987126-0e7bee3ba71f?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxN3x8d2luZG93c3xlbnwwfHx8fDE3NzQzNjY2ODV8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1598721987126-0e7bee3ba71f?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxN3x8d2luZG93c3xlbnwwfHx8fDE3NzQzNjY2ODV8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1598721987126-0e7bee3ba71f?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxN3x8d2luZG93c3xlbnwwfHx8fDE3NzQzNjY2ODV8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1598721987126-0e7bee3ba71f?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxN3x8d2luZG93c3xlbnwwfHx8fDE3NzQzNjY2ODV8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw"><img src="https://images.unsplash.com/photo-1598721987126-0e7bee3ba71f?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxN3x8d2luZG93c3xlbnwwfHx8fDE3NzQzNjY2ODV8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" width="4986" height="2805" data-attrs="{&quot;src&quot;:&quot;https://images.unsplash.com/photo-1598721987126-0e7bee3ba71f?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxN3x8d2luZG93c3xlbnwwfHx8fDE3NzQzNjY2ODV8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:2805,&quot;width&quot;:4986,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;brown wooden framed glass 
window&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="brown wooden framed glass window" title="brown wooden framed glass window" srcset="https://images.unsplash.com/photo-1598721987126-0e7bee3ba71f?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxN3x8d2luZG93c3xlbnwwfHx8fDE3NzQzNjY2ODV8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1598721987126-0e7bee3ba71f?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxN3x8d2luZG93c3xlbnwwfHx8fDE3NzQzNjY2ODV8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1598721987126-0e7bee3ba71f?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxN3x8d2luZG93c3xlbnwwfHx8fDE3NzQzNjY2ODV8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1598721987126-0e7bee3ba71f?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxN3x8d2luZG93c3xlbnwwfHx8fDE3NzQzNjY2ODV8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 
12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Photo by <a href="https://unsplash.com/@markolsen">Mark Olsen</a> on <a href="https://unsplash.com">Unsplash</a></figcaption></figure></div><p>For the past year, the AI landscape has been locked in a singular arms race: the size of the context window. We&#8217;ve watched models expand from handling a few pages of text to processing entire novellas.</p><p>The logic seemed intuitive: the more data you can cram into a model&#8217;s short-term memory, the better the output. This expanded window was touted as the ultimate tool for synthesis, allowing researchers to feed in a 200-page regulatory filing and expect instant clarity.</p><p>At Felix Research, where we prefer the augmented over the strictly artificial, this initially felt like a breakthrough. 
However, using massive context windows in high-stakes environments, such as financial analysis, has exposed a crucial flaw.</p><p><strong>The future isn&#8217;t about the size of the window; it&#8217;s about the precision of the architecture.</strong></p><p>The primary issue with massive context windows is now a well-documented failure: LLMs tend to forget information presented in the middle of a large prompt.</p><p>Researchers call this the &#8220;lost-in-the-middle&#8221; phenomenon. A model might recall the first few paragraphs and the final few sentences, but the hundreds of pages sandwiched in between often dissolve into digital noise, taking nuance with them.</p><p><strong>If you are an analyst using an AI to find a contradiction between a CEO&#8217;s statement on page 3 and a risk factor buried on page 112, a large context window will often fail you.</strong> The model provides a superficial summary (the <em>flavour</em> of the text) rather than the surgical substance required for a true edge. This forces humans to spend hours manually validating the AI&#8217;s work, which entirely defeats the point of automation.</p><p>Beyond the &#8220;cognitive&#8221; failure, there is a physical cost to the cram-everything-in approach: <strong>computation</strong>.</p><p><strong>The relationship between input length and the power required to process it isn&#8217;t linear; it&#8217;s quadratic</strong>. Doubling the context window can quadruple inference time and cost. Using a 128k-token window to answer a simple question is not just overkill; it is economically and environmentally unsustainable. To optimise for the future, we have to stop throwing more data at the problem and start throwing more logic at it.</p><p>The industry is already pivoting away from linear context toward structured context.</p><p>Standard Retrieval-Augmented Generation (RAG) was the first step.
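</p><p><em>As a back-of-envelope illustration (a hypothetical sketch, not Felix Research code): because self-attention cost grows with the square of the input length, even a naive top-k retrieval step that trims a 128k-token filing down to a handful of relevant paragraphs cuts compute by roughly three orders of magnitude.</em></p>

```python
# Hypothetical back-of-envelope sketch (not Felix Research code).
# Self-attention cost grows roughly with the square of input length,
# so retrieval that trims the prompt pays off quadratically.

def attention_cost(tokens: int) -> int:
    """Relative self-attention cost over `tokens` tokens (proportional to n^2)."""
    return tokens * tokens

def top_k_paragraphs(paragraphs: list[str], query_terms: list[str], k: int = 10) -> list[str]:
    """Naive keyword-overlap retrieval: keep only the k most relevant paragraphs."""
    return sorted(
        paragraphs,
        key=lambda p: sum(term in p.lower() for term in query_terms),
        reverse=True,
    )[:k]

full_window = 128_000  # tokens: the whole filing crammed into the window
retrieved = 4_000      # tokens: roughly ten relevant paragraphs instead

# A 32x reduction in tokens yields a ~1000x reduction in attention cost.
savings = attention_cost(full_window) / attention_cost(retrieved)
print(f"{savings:.0f}x cheaper")  # 1024x cheaper
```

<p>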
Instead of sending a 500-page book to the model, a system identifies the ten most relevant paragraphs and presents only those to the LLM. The AI is no longer a library; it is an analyst.</p><p>But even standard RAG has its limits. If you ask, &#8220;<em>What is our exposure to tariff fluctuations?</em>&#8221; a basic system might pull snippets about trade and currency, but it often fails to connect the thematic dots. The model receives the ingredients, but it still doesn&#8217;t have the recipe.</p><p>To solve this, the frontier of AI isn&#8217;t focused on longer inputs, but on smarter, pre-computed (synthesised) context.</p><p>Instead of retrieval being a simple keyword search, the next generation of data architectures acts like a cognitive sous-chef. Before a user ever asks a question, the system continuously processes incoming data - organising relationships, indexing nuance, and building a hierarchical knowledge graph.</p><p>When a query is made, the system doesn&#8217;t just grab chunks of text; it retrieves a pre-synthesised, multi-modal context. This drastically reduces the load on the LLM, cuts inference time, and eliminates the lost-in-the-middle effect.</p><p>The defining breakthrough of the next year won&#8217;t be measured in token length. It will be measured by how useful the context <em>inside</em> that window has become.</p><p>Don't let your insights get lost in the middle of context bloat.
<strong>Visit our site today and try Amuse-Bouche</strong> to see how our architecture delivers the surgical substance your high-stakes analysis demands.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.thefelixview.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.thefelixview.com/subscribe?"><span>Subscribe now</span></a></p><p></p>]]></content:encoded></item><item><title><![CDATA[Financial Analysis: Lessons Learned]]></title><description><![CDATA[A good workman never blames his tools, but a great workman looks for better ones.]]></description><link>https://www.thefelixview.com/p/financial-analysis-lessons-learned</link><guid isPermaLink="false">https://www.thefelixview.com/p/financial-analysis-lessons-learned</guid><dc:creator><![CDATA[Sav]]></dc:creator><pubDate>Thu, 19 Mar 2026 14:41:29 GMT</pubDate><enclosure url="https://images.unsplash.com/photo-1480944657103-7fed22359e1d?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw4fHxhbmFseXN0fGVufDB8fHx8MTc3MzkyODE4Nnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://images.unsplash.com/photo-1480944657103-7fed22359e1d?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw4fHxhbmFseXN0fGVufDB8fHx8MTc3MzkyODE4Nnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://images.unsplash.com/photo-1480944657103-7fed22359e1d?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw4fHxhbmFseXN0fGVufDB8fHx8MTc3MzkyODE4Nnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, 
https://images.unsplash.com/photo-1480944657103-7fed22359e1d?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw4fHxhbmFseXN0fGVufDB8fHx8MTc3MzkyODE4Nnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1480944657103-7fed22359e1d?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw4fHxhbmFseXN0fGVufDB8fHx8MTc3MzkyODE4Nnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1480944657103-7fed22359e1d?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw4fHxhbmFseXN0fGVufDB8fHx8MTc3MzkyODE4Nnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw"><img src="https://images.unsplash.com/photo-1480944657103-7fed22359e1d?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw4fHxhbmFseXN0fGVufDB8fHx8MTc3MzkyODE4Nnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" width="4782" height="2690" data-attrs="{&quot;src&quot;:&quot;https://images.unsplash.com/photo-1480944657103-7fed22359e1d?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw4fHxhbmFseXN0fGVufDB8fHx8MTc3MzkyODE4Nnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:2690,&quot;width&quot;:4782,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;architectural photography of building with people in it during nighttime&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="architectural photography of building with people in it during nighttime" title="architectural photography of building with people in it during nighttime" 
srcset="https://images.unsplash.com/photo-1480944657103-7fed22359e1d?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw4fHxhbmFseXN0fGVufDB8fHx8MTc3MzkyODE4Nnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1480944657103-7fed22359e1d?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw4fHxhbmFseXN0fGVufDB8fHx8MTc3MzkyODE4Nnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1480944657103-7fed22359e1d?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw4fHxhbmFseXN0fGVufDB8fHx8MTc3MzkyODE4Nnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1480944657103-7fed22359e1d?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw4fHxhbmFseXN0fGVufDB8fHx8MTc3MzkyODE4Nnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" 
stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Photo by <a href="https://unsplash.com/@mikofilm">Mike Kononov</a> on <a href="https://unsplash.com">Unsplash</a></figcaption></figure></div><p>&#8220;<em>Check out my quant</em>&#8221; is something nobody has ever said about me, and for good reason.</p><p>I am not a financial analyst, nor am I a financier by training; I am firmly in ops management and qualitative research - much more accustomed to researching by interpreting and cross-referencing texts, tracing arguments and identifying conceptual (meta)structures than by interrogating balance sheets or corporate filings. When FRx&#8217;s CEO, Ben, suggested that I would make a suitable guinea pig for purposes that were yet to be explained, I was game.</p><p>The brief I received was deceptively simple. He provided me with a name, which we will change to <em><strong>Markus Poole</strong></em>, and said: &#8220;I&#8217;m your manager at a firm. You have three hours to come back with something I haven&#8217;t already found.&#8221;</p><p>No template, no predefined output and no checklist of questions. The instruction was open-ended and intentionally so. Should be straightforward enough. Right?</p><p></p><h3><strong>The manual process</strong></h3><p>I began where I figured one is supposed to begin: with a search on Companies House.</p><p>A trawl through filings, directorships, dates and changes in shareholdings. A few entities that appeared dormant. One that might be active. <em>How can I even be sure that this person exists?
&#8220;He&#8221; could be some kind of shell holdings entity&#8230;</em> I wondered.</p><p>LinkedIn next.</p><p>Employment history, board positions, mutual connections, universities and events. Markus&#8217; professional trajectory appeared coherent enough - not very information-rich, however.</p><p>On to Google.</p><p>News mentions, archived interviews, conference panels. A local newspaper article from a decade ago. A blog post here, a podcast appearance there. <strong>Each source opened another tab. Each tab introduced another thread</strong> and within forty minutes I found myself with over thirty tabs open.</p><p>I was taking &#8220;notes&#8221; (read: scrappy little fragments, copy-and-pasted excerpts and half-baked hypotheses) in a text editor. Externalising is a favourite coping mechanism of mine and I was sure that when I looked back at the notes I would remember what they signified. Or referred to. Or labelled?</p><p><em>Is there a pattern in the types of companies he joins?<br>Are there overlapping directors across these entities?<br>What do the accounts suggest about performance?</em></p><p>The process felt unstructured, lossy and honestly a bit anxiety-inducing. I was conscious that I was losing context with every new tab. A date copied into my notes lost its meaning, while a quote from an article was detached from its publication source and author. <strong>The difficulty was the absence of structure, rather than the absence of information. Provenance was like sand slipping between clenched fists.</strong> The more information I gathered, the less confident I felt about my grip on it.</p><p>This is an information architecture problem.</p><p></p><h3><strong>The hidden friction</strong></h3><p>Professionals in financial research are deeply habituated to this workflow. They toggle between tools. They copy and paste. They reconstruct context manually. They hold mental maps of relationships between people and entities. They are highly skilled at it.
But the process itself is fragile.</p><p>Small errors creep in easily. A misattributed quote. A company with a similar name but a different registration number. I found myself constantly backtracking and triple-checking.</p><p>Where did that claim originate?<br>Was that figure from the 2021 accounts or 2022?<br>Was that directorship current or resigned?</p><p>Each question required reopening a tab, rescanning a page, and reorienting myself.</p><p>Three hours, in this context, is not long. It is barely sufficient to construct a coherent map of a moderately active professional life. I began to understand how much of the analyst&#8217;s role involves simply maintaining orientation across a sea of fragmented data.</p><p>Judgement requires synthesis. Synthesis requires clarity. Clarity requires structure and data quality. Without structure, effort dissipates into some kind of corporate ether.</p><p></p><h3><strong>Introducing Amuse-Bouche into the workflow</strong></h3><p>At the halfway mark, Ben suggested that I run the same materials through Amuse-Bouche.</p><p>And while AB&#8217;s UI is elegant, it isn&#8217;t theatrical. It&#8217;s purpose-built and restrained, but what changed dramatically was the cognitive load.</p><p>Documents were not treated as flat text; corporate filings retained their internal structure. Names, dates and roles were extracted in relation to one another.
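</p><p><em>As a toy illustration (hypothetical, and not Amuse-Bouche&#8217;s actual data model), the difference is between a pile of flat notes and a store of relations that keep their provenance - the invented &#8220;Jane Doe&#8221; below stands in for any co-director:</em></p>

```python
# Toy illustration (hypothetical; not Amuse-Bouche's actual data model) of
# storing extracted facts as relations that keep their provenance, so any
# claim can be traced back to its source document. "Jane Doe" is invented.
from dataclasses import dataclass

@dataclass(frozen=True)
class Relation:
    subject: str    # a person or company
    predicate: str  # e.g. "director_of"
    obj: str
    source: str     # the filing or article the fact came from

facts = [
    Relation("Markus Poole", "director_of", "Company A", "Filing 2018"),
    Relation("Jane Doe", "director_of", "Company A", "Filing 2019"),
    Relation("Jane Doe", "director_of", "Company B", "Filing 2020"),
]

def provenance(subject: str, predicate: str, obj: str) -> list[str]:
    """Trace a claim back to the documents that support it."""
    return [f.source for f in facts
            if (f.subject, f.predicate, f.obj) == (subject, predicate, obj)]

def directors(company: str) -> set[str]:
    """All people recorded as directors of a company."""
    return {f.subject for f in facts
            if f.predicate == "director_of" and f.obj == company}

# Shared directors across two entities fall out of a simple set intersection.
print(directors("Company A") & directors("Company B"))  # {'Jane Doe'}
```

<p><em>Held this way, &#8220;where did that claim originate?&#8221; becomes a lookup rather than a tab-reopening exercise.</em></p><p>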
Entities were recognised as entities rather than as strings of characters.</p><p>Instead of maintaining my own fragile web of notes, I could explore a structured representation of the information.</p><ul><li><p><strong>Markus Poole was linked to Company A as a director from 2018 to 2021.</strong></p></li><li><p><strong>Company A shared a co-director with Company B.</strong></p></li><li><p><strong>Company B had filed late accounts twice.</strong></p></li><li><p><strong>A news article referencing Company B was tied back to the specific entity rather than floating as an isolated citation.</strong></p></li></ul><p>Crucially, <em><strong>finally</strong></em>, each extracted element retained provenance. I could trace a claim back to its source document instantly and the genuine anxiety of context loss subsided.</p><p>AB did not replace judgement. It did not generate an opinion for me. It altered the substrate on which judgement could operate and I was no longer expending energy on bookkeeping.</p><p><strong>The bottleneck in financial analysis is rarely access to information. It is the management of it.</strong></p><p>Amuse-Bouche is a small demonstration of a larger thesis.</p><p>If the early stages of research are dominated by manual extraction, context reconstruction and error minimisation, then a significant portion of analytical capacity is diverted away from higher order thinking. Over time, this shapes the profession itself. Analysts become adept navigators of fragmented systems rather than beneficiaries of coherent ones.</p><p>For someone like me, entering from a qualitative background, the contrast was stark. The manual workflow felt like playing darts in a pitch-black room. The structured workflow turned the research and discovery process into something intentional and fruitful.</p><p>Financial research demands rigour, scepticism and traceability. 
None of these qualities are enhanced by forcing professionals to act as human glue between incompatible tools.</p><p>Amuse-Bouche does not claim to solve every dimension of this problem. It does, however, demonstrate that document interaction can feel different.</p><p></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.thefelixview.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.thefelixview.com/subscribe?"><span>Subscribe now</span></a></p><p></p>]]></content:encoded></item><item><title><![CDATA[A Busy Week at Felix Research]]></title><description><![CDATA[A progress update on FRx's innovations and activities]]></description><link>https://www.thefelixview.com/p/a-busy-week-at-felix-research</link><guid isPermaLink="false">https://www.thefelixview.com/p/a-busy-week-at-felix-research</guid><dc:creator><![CDATA[Sav]]></dc:creator><pubDate>Wed, 11 Mar 2026 12:56:02 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!jDUq!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2aa2cd24-0728-4687-8568-0871d8c98c14_2546x1702.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>It&#8217;s been an incredibly productive period for the team at Felix Research. </p><p>We are pleased to announce the launch of <strong>Amuse-Bouche</strong>, our first tech demo. This release marks a significant milestone for us, and it is already garnering a great deal of attention. This early interest is a testament to the dedication of the team, particularly the hard work put in by Dimitri to get the project ready for the public.</p><h3>Pitching in Monaco</h3><p>While the digital launch was taking place, the team was also active on the ground. We have just returned from a trip to Monaco, where we focused our efforts on pitching and fundraising.
It was a fantastic experience that provided valuable opportunities to connect with partners. As an added bonus, the trip offered a brief but very welcome reprieve from the typical British weather.</p><h3>Looking Ahead</h3><p>We are eager to share more updates with you in the coming weeks as our work progresses. In the meantime, we invite you to try Amuse-Bouche for yourself. We are  hungry for feedback and would value your thoughts on the experience.</p><p>You can find the demo here: <a href="https://lnkd.in/ezjTEMQK">Amuse-Bouche</a></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!jDUq!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2aa2cd24-0728-4687-8568-0871d8c98c14_2546x1702.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!jDUq!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2aa2cd24-0728-4687-8568-0871d8c98c14_2546x1702.png 424w, https://substackcdn.com/image/fetch/$s_!jDUq!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2aa2cd24-0728-4687-8568-0871d8c98c14_2546x1702.png 848w, https://substackcdn.com/image/fetch/$s_!jDUq!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2aa2cd24-0728-4687-8568-0871d8c98c14_2546x1702.png 1272w, https://substackcdn.com/image/fetch/$s_!jDUq!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2aa2cd24-0728-4687-8568-0871d8c98c14_2546x1702.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!jDUq!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2aa2cd24-0728-4687-8568-0871d8c98c14_2546x1702.png" width="2546" height="1702" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2aa2cd24-0728-4687-8568-0871d8c98c14_2546x1702.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1702,&quot;width&quot;:2546,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:954906,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.thefelixview.com/i/190610504?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5db7a48f-cf7e-481c-a55d-077e65081d23_2546x1702.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!jDUq!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2aa2cd24-0728-4687-8568-0871d8c98c14_2546x1702.png 424w, https://substackcdn.com/image/fetch/$s_!jDUq!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2aa2cd24-0728-4687-8568-0871d8c98c14_2546x1702.png 848w, https://substackcdn.com/image/fetch/$s_!jDUq!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2aa2cd24-0728-4687-8568-0871d8c98c14_2546x1702.png 1272w, https://substackcdn.com/image/fetch/$s_!jDUq!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F2aa2cd24-0728-4687-8568-0871d8c98c14_2546x1702.png 1456w" 
sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><em>(Pictured: Amuse-Bouche knowledge graph of IBM 2024 Annual Report)</em></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.thefelixview.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Felix View! 
Subscribe for free to receive new posts and support our work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p>]]></content:encoded></item><item><title><![CDATA[Amuse-Bouche]]></title><description><![CDATA[A Tech Demo by Felix Research]]></description><link>https://www.thefelixview.com/p/amuse-bouche</link><guid isPermaLink="false">https://www.thefelixview.com/p/amuse-bouche</guid><dc:creator><![CDATA[Sav]]></dc:creator><pubDate>Thu, 05 Mar 2026 12:52:44 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!vfWL!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6367808-f1e3-400f-9631-6b1f3c91dd9e_2531x893.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!vfWL!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6367808-f1e3-400f-9631-6b1f3c91dd9e_2531x893.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!vfWL!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6367808-f1e3-400f-9631-6b1f3c91dd9e_2531x893.png 424w, https://substackcdn.com/image/fetch/$s_!vfWL!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6367808-f1e3-400f-9631-6b1f3c91dd9e_2531x893.png 848w, 
https://substackcdn.com/image/fetch/$s_!vfWL!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6367808-f1e3-400f-9631-6b1f3c91dd9e_2531x893.png 1272w, https://substackcdn.com/image/fetch/$s_!vfWL!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6367808-f1e3-400f-9631-6b1f3c91dd9e_2531x893.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!vfWL!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6367808-f1e3-400f-9631-6b1f3c91dd9e_2531x893.png" width="1456" height="514" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f6367808-f1e3-400f-9631-6b1f3c91dd9e_2531x893.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:514,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:876548,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.thefelixview.com/i/189992217?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6367808-f1e3-400f-9631-6b1f3c91dd9e_2531x893.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!vfWL!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6367808-f1e3-400f-9631-6b1f3c91dd9e_2531x893.png 424w, 
https://substackcdn.com/image/fetch/$s_!vfWL!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6367808-f1e3-400f-9631-6b1f3c91dd9e_2531x893.png 848w, https://substackcdn.com/image/fetch/$s_!vfWL!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6367808-f1e3-400f-9631-6b1f3c91dd9e_2531x893.png 1272w, https://substackcdn.com/image/fetch/$s_!vfWL!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6367808-f1e3-400f-9631-6b1f3c91dd9e_2531x893.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" 
y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p></p><p>We are very excited to launch <strong>Amuse-Bouche</strong>.</p><p>It&#8217;s our first public tech demo: a small plate. A taste of what we&#8217;ve been building at <strong>Felix Research</strong>.</p><p>Whilst it represents only a fraction of what&#8217;s to come, it stands on its own. It works. It&#8217;s useful. It&#8217;s fast. And we think you&#8217;ll enjoy exploring it <strong><a href="https://www.felixresearch.com?ref=blog">here</a></strong>.</p><p></p><h2><strong>Why we built this</strong></h2><p>Ben founded Felix Research less than a year ago with the simple aim of addressing the pain points he faced as a professional in private equity. What began as a tool for personal use quickly caught the attention of his colleagues; <strong>they wanted a taste.</strong></p><p>Fast forward to 2026, and we find ourselves amidst a technological renaissance, traversing an exciting and unpredictable landscape. It&#8217;s clear that powerful enterprise AI capabilities already exist, but the way professionals actually work hasn&#8217;t caught up.</p><p>Analysts toggle between tools.</p><p>They copy and paste.</p><p>They format.</p><p>They reformat.</p><p>They reconstruct context manually.</p><p>They try to minimise errors.</p><p>They lose time to process instead of spending it on judgement.</p><p>Amuse-Bouche is not our full solution to that problem, but it <em>is</em> a demonstration of the underlying technology we&#8217;ve been developing. 
We hope it serves as a glimpse into how information architecture, extraction and intelligent interaction can feel when designed properly from the ground up, by people with domain expertise.</p><p></p><h2><strong>What Amuse-Bouche does</strong></h2><p>Amuse-Bouche showcases a core capability of our stack: structured document interaction with no context lost.</p><p>Instead of treating documents as flat text, it understands form, structure, and relationships. It extracts intelligently. It preserves context. It allows you to explore content in a way that feels fluid rather than fragmented.</p><p>We&#8217;ve intentionally kept the interface simple; our goal is not to overwhelm clients with features but to let you experience the speed and clarity of the underlying technology.</p><p>Crucially, Amuse-Bouche is not a mock-up, not a slide deck, not a conceptual prototype.</p><p>It is real software.</p><p></p><h2><strong>Built with care (and restraint)</strong></h2><p>We are young, ambitious and moving quickly, but we are not cowboys.</p><p>If you work in financial services, or any sensitive domain, you know that trust, accuracy and reliability matter more than novelty or speed. So let us be explicit:</p><p>Amuse-Bouche was built with data privacy and security as foundational constraints, not afterthoughts. Add to this meticulous provenance structuring and tracking as a USP, and you have at your fingertips a force multiplier that revolutionises financial research whilst respecting client data.</p><p>We are launching our first tech demo now simply because building in public forces clarity. 
We want feedback from real users, and we believe the best way to communicate ambition is through execution.</p><p>Amuse-Bouche demonstrates:</p><ul><li><p>The quality of our extraction engine</p></li><li><p>The speed of our processing</p></li><li><p>The clarity of our referencing</p></li><li><p>The design philosophy underpinning Felix Research</p></li></ul><p></p><h2><strong>Who Amuse-Bouche is for</strong></h2><p>If you work with dense documents.</p><p>If you routinely extract structured information from unstructured sources.</p><p>If you care about speed, precision and traceability.</p><p>If you are curious about what AI-native tools might feel like when purpose-built rather than retrofitted.</p><p>This is for you.</p><p>Even if you&#8217;re just curious, explore it. We built it to be intuitive, so no onboarding call required.</p><p>Click through, upload something real, explore your Knowledge Graph. If it surprises you, tell us. If it frustrates you, <em>definitely tell us</em>.</p><p>This demo is both a showcase and a conversation starter.</p><p></p><p><em>Bon app&#233;tit.</em></p><p><strong>The FRx Team</strong></p><p></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://www.thefelixview.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading The Felix View! 
Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div>]]></content:encoded></item><item><title><![CDATA[Prediction Markets and Why Classification Matters (More than Outcomes?)]]></title><description><![CDATA[Prediction markets such as Polymarket and Kalshi sit uncomfortably between finance, gaming, gambling and information markets.]]></description><link>https://www.thefelixview.com/p/prediction-markets-and-why-classification-matters-more-than-outcomes</link><guid isPermaLink="false">https://www.thefelixview.com/p/prediction-markets-and-why-classification-matters-more-than-outcomes</guid><dc:creator><![CDATA[Ben Jaletzke]]></dc:creator><pubDate>Mon, 09 Feb 2026 13:26:06 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/2847c3bb-24f3-474b-b502-ad42b6b32c42_821x376.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h3></h3><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!C_OE!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6878fd69-1722-4da4-a0b3-8754abd50464_821x376.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!C_OE!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6878fd69-1722-4da4-a0b3-8754abd50464_821x376.png 424w, 
https://substackcdn.com/image/fetch/$s_!C_OE!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6878fd69-1722-4da4-a0b3-8754abd50464_821x376.png 848w, https://substackcdn.com/image/fetch/$s_!C_OE!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6878fd69-1722-4da4-a0b3-8754abd50464_821x376.png 1272w, https://substackcdn.com/image/fetch/$s_!C_OE!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6878fd69-1722-4da4-a0b3-8754abd50464_821x376.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!C_OE!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6878fd69-1722-4da4-a0b3-8754abd50464_821x376.png" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6878fd69-1722-4da4-a0b3-8754abd50464_821x376.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:null,&quot;width&quot;:null,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Prediction Markets and Why Classification Matters (More than Outcomes?)&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Prediction Markets and Why Classification Matters (More than Outcomes?)" title="Prediction Markets and Why Classification Matters (More than Outcomes?)" 
srcset="https://substackcdn.com/image/fetch/$s_!C_OE!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6878fd69-1722-4da4-a0b3-8754abd50464_821x376.png 424w, https://substackcdn.com/image/fetch/$s_!C_OE!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6878fd69-1722-4da4-a0b3-8754abd50464_821x376.png 848w, https://substackcdn.com/image/fetch/$s_!C_OE!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6878fd69-1722-4da4-a0b3-8754abd50464_821x376.png 1272w, https://substackcdn.com/image/fetch/$s_!C_OE!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6878fd69-1722-4da4-a0b3-8754abd50464_821x376.png 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a></figure></div><p>Prediction markets such as Polymarket and Kalshi sit uncomfortably between finance, gaming, gambling and information markets. Their recent growth has less to do with regulatory blind spots and more to do with a genuine categorisation problem. These platforms do not map cleanly onto existing regimes and that ambiguity has shaped both their trajectory and the intensity of regulatory attention they attract. Rather than asking whether prediction markets are socially beneficial or politically problematic, a more productive lens is to examine how regulatory classification determines what these platforms are permitted to become.</p><h3><strong>The Discomfort</strong></h3><p>Prediction markets have grown quickly because they do not fit neatly into existing regulatory categories. They challenge long-standing distinctions between speculation and information gathering, entertainment and financial exposure. 
This discomfort has prompted uneven regulatory responses, with authorities reaching for familiar frameworks even where the underlying activity only partially aligns.</p><h3><strong>What Prediction Markets Actually Are</strong></h3><p>At their core, prediction markets translate probabilistic beliefs about future events into tradable positions, blending elements of financial instruments, information markets, gambling and behavioural incentives. Participants are rewarded for being right rather than persuasive, and prices function as aggregated signals of collective belief. This structure explains why prediction markets are often defended as tools for forecasting rather than gambling. However, the presence of monetary stakes introduces exposure, incentives, and potential harm that go beyond pure information aggregation.</p><blockquote><p>"Prediction market contracts are structured as commodity derivatives (<a href="https://kalshi-public-docs.s3.amazonaws.com/kalshi_finance_faq.pdf?ref=thefelixview.com">https://kalshi-public-docs.s3.amazonaws.com/kalshi_finance_faq.pdf</a>) which are explicitly tied to events. In essence, they represent binary positioning opportunities for traders, though many contracts mirror traditional exchange-traded contracts in that they can be structured to reflect e.g. 
a hurdle rate above or below which a contract would pay out.</p></blockquote><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!c8di!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78fd2034-4ca1-48f6-bfd6-66b2f93f0687_1600x1295.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!c8di!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78fd2034-4ca1-48f6-bfd6-66b2f93f0687_1600x1295.png 424w, https://substackcdn.com/image/fetch/$s_!c8di!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78fd2034-4ca1-48f6-bfd6-66b2f93f0687_1600x1295.png 848w, https://substackcdn.com/image/fetch/$s_!c8di!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78fd2034-4ca1-48f6-bfd6-66b2f93f0687_1600x1295.png 1272w, https://substackcdn.com/image/fetch/$s_!c8di!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78fd2034-4ca1-48f6-bfd6-66b2f93f0687_1600x1295.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!c8di!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78fd2034-4ca1-48f6-bfd6-66b2f93f0687_1600x1295.png" width="1600" height="1295" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/78fd2034-4ca1-48f6-bfd6-66b2f93f0687_1600x1295.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1295,&quot;width&quot;:1600,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Prediction Markets and Why Classification Matters (More than Outcomes?)&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Prediction Markets and Why Classification Matters (More than Outcomes?)" title="Prediction Markets and Why Classification Matters (More than Outcomes?)" srcset="https://substackcdn.com/image/fetch/$s_!c8di!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78fd2034-4ca1-48f6-bfd6-66b2f93f0687_1600x1295.png 424w, https://substackcdn.com/image/fetch/$s_!c8di!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78fd2034-4ca1-48f6-bfd6-66b2f93f0687_1600x1295.png 848w, https://substackcdn.com/image/fetch/$s_!c8di!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78fd2034-4ca1-48f6-bfd6-66b2f93f0687_1600x1295.png 1272w, https://substackcdn.com/image/fetch/$s_!c8di!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F78fd2034-4ca1-48f6-bfd6-66b2f93f0687_1600x1295.png 1456w" sizes="100vw"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg 
role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><blockquote><p>..As commodity derivatives, rather than company-linked equity derivatives, they are not generally encompassed by insider trading rules or other restrictions, allowing financial professionals (in principle and making no assumptions about company-specific guidelines) to engage in these markets. A key difference between prediction markets and &#8220;traditional&#8221; betting is that the prediction sites are generally exchanges rather than market makers. Whereas betting companies generally take the opposing side of a bet that is placed, on prediction markets, positioning is matched by participants, prices and probabilities then settling at the clearing price &amp; volume rather than otherwise defined algorithmic outcomes." 
- Ben, <strong>FRx CEO</strong></p></blockquote><h3><strong>The Classification Problem</strong></h3><p>The central regulatory challenge posed by prediction markets is not their subject matter, whether political, economic, or cultural, but their classification. Are they derivatives contracts, gaming products, or data-driven forecasting tools? Each framing carries different assumptions about risk, consumer protection, and market integrity. Classification is not a semantic exercise. It determines which regulatory body has jurisdiction, which rules apply, and which behaviours are treated as unacceptable.</p><h3><strong>Why Classification Shapes Everything Downstream</strong></h3><p>Once a market is classified, its regulatory destiny follows. Oversight authority, capital requirements, disclosure obligations, and enforcement mechanisms are all downstream consequences of that initial decision. A platform treated as a financial market is assessed primarily through the lens of systemic risk, market manipulation, and participant protection. A platform treated as gaming is regulated around fairness, addiction, and consumer harm. Information markets, by contrast, face far lighter scrutiny. The difficulty for prediction markets is that they borrow features from all three.</p><h3><strong>Polymarket and Kalshi as Contrasting Case Studies</strong></h3><p>The divergence between Polymarket and Kalshi illustrates how regulatory posture, rather than market design alone, shapes legitimacy and operational scope. Kalshi has pursued registration and approval under the US Commodity Futures Trading Commission, positioning itself as a regulated derivatives exchange. Polymarket, by contrast, has operated largely outside formal US regulatory approval, relying on decentralised infrastructure and offshore positioning. 
The result is not simply different compliance burdens, but different ceilings on growth, partnerships, and institutional participation.</p><h3><strong>The Role of the CFTC and Financial Logic</strong></h3><p>The involvement of the Commodity Futures Trading Commission reflects a broader regulatory instinct to prioritise financial risk and market integrity over the informational value of forecasts. From the CFTC&#8217;s perspective, the key questions concern leverage, manipulation, concentration of positions, and exposure to loss. Whether a market produces accurate predictions is largely irrelevant to this assessment. What matters is whether the structure creates incentives or vulnerabilities that could undermine confidence or cause harm.</p><blockquote><p>"Given the rapidly increasing scale of these markets, they actively engage with regulators, as is their duty, to ensure proper handling of information and safeguarding against market manipulation. Large PMOs (prediction market operators) do not allow insider trading on their contracts in principle, and state that they actively monitor, much like most traditional exchanges, for suspicious activity. Given the reliance on crypto for trade financing and a degree of anonymity, this cannot be entirely ensured. However, it is important to keep in mind the typical / highest volume contracts on these exchanges, most of which focus on events that are either difficult or prohibitively expensive to manipulate by a single player. For example, while it is conceivable, many bets are simply not feasible to directly manipulate, except by parties to the outcome themselves." 
- Ben</p></blockquote><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!FUO5!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F385d8740-5170-458d-853f-2936db0070ad_1600x1345.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!FUO5!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F385d8740-5170-458d-853f-2936db0070ad_1600x1345.png 424w, https://substackcdn.com/image/fetch/$s_!FUO5!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F385d8740-5170-458d-853f-2936db0070ad_1600x1345.png 848w, https://substackcdn.com/image/fetch/$s_!FUO5!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F385d8740-5170-458d-853f-2936db0070ad_1600x1345.png 1272w, https://substackcdn.com/image/fetch/$s_!FUO5!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F385d8740-5170-458d-853f-2936db0070ad_1600x1345.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!FUO5!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F385d8740-5170-458d-853f-2936db0070ad_1600x1345.png" width="1600" height="1345" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/385d8740-5170-458d-853f-2936db0070ad_1600x1345.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1345,&quot;width&quot;:1600,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Prediction Markets and Why Classification Matters (More than Outcomes?)&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Prediction Markets and Why Classification Matters (More than Outcomes?)" title="Prediction Markets and Why Classification Matters (More than Outcomes?)" srcset="https://substackcdn.com/image/fetch/$s_!FUO5!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F385d8740-5170-458d-853f-2936db0070ad_1600x1345.png 424w, https://substackcdn.com/image/fetch/$s_!FUO5!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F385d8740-5170-458d-853f-2936db0070ad_1600x1345.png 848w, https://substackcdn.com/image/fetch/$s_!FUO5!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F385d8740-5170-458d-853f-2936db0070ad_1600x1345.png 1272w, https://substackcdn.com/image/fetch/$s_!FUO5!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F385d8740-5170-458d-853f-2936db0070ad_1600x1345.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container 
restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h3><strong>A Note on the UK and EU Regulatory Posture</strong></h3><p>From a UK and EU perspective, prediction markets remain largely peripheral rather than explicitly addressed, but their underlying characteristics intersect with several existing regimes. In the EU, the focus remains anchored in financial classification through instruments such as MiFID II and, where applicable, market abuse frameworks. Prediction markets that resemble derivatives may therefore encounter regulatory scrutiny by extension rather than by design. 
The UK doesn&#8217;t have an explicit &#8220;prediction market&#8221; regulatory category; instead, most real&#8209;money prediction markets are treated as gambling under the UK Gambling Commission&#8217;s regime (with binary / purely financial derivatives treated or restricted under FCA financial rules).&nbsp;</p><p>While there is no bespoke regime for prediction markets, their operation would likely be assessed through a combination of financial services regulation, gambling law, and, increasingly, digital governance principles around consumer protection and platform responsibility. What is notable is the absence of clear guidance rather than the presence of active prohibition. This ambiguity creates space for experimentation, but also reinforces the likelihood that any future regulatory engagement will prioritise structural risk and governance maturity over the social or informational value of prediction itself.</p><h3><strong>Structure Over Outcomes</strong></h3><p>Whether prediction markets accurately forecast elections or economic indicators is largely secondary to how incentives, participation, and risk are structured. A market that produces accurate signals but concentrates exposure among a small group of actors presents a different risk profile from one that is widely distributed and tightly constrained. Regulatory attention therefore gravitates towards design choices rather than predictive success. This mirrors developments in other areas of digital regulation, where system architecture increasingly matters more than stated intent.</p><h3><strong>Boundary-Setting Over Endorsement</strong></h3><p>Effective regulation in this space is less about endorsing or suppressing prediction markets, and more about setting clear boundaries that align market behaviour with institutional risk tolerance. Clear classification allows regulators to articulate what is permitted, what is prohibited, and where responsibility lies. 
It also gives market operators a clearer basis on which to design compliant products, rather than operating in a perpetual state of ambiguity.</p><h3><strong>Closing</strong></h3><p>Prediction markets ultimately test whether modern regulatory systems can accommodate hybrid instruments without forcing them into ill-fitting categories that obscure rather than manage risk. As these platforms continue to evolve, the central question is not whether they are good or bad, but whether regulatory frameworks are capable of recognising their complexity without defaulting to blunt analogies. Classification, more than outcome, will determine their future.</p>]]></content:encoded></item><item><title><![CDATA[A Practical Look at a New Regulatory Era]]></title><description><![CDATA[Intro Artificial Intelligence enters 2026 with less novelty and greater consequence.]]></description><link>https://www.thefelixview.com/p/a-practical-look-at-a-new-regulatory-era</link><guid isPermaLink="false">https://www.thefelixview.com/p/a-practical-look-at-a-new-regulatory-era</guid><dc:creator><![CDATA[Felix Research]]></dc:creator><pubDate>Thu, 22 Jan 2026 14:41:30 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/01054831-3093-4b36-bcae-b40e825c0f96_2000x1333.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!ADEg!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6458f90b-b764-4586-8a2b-611dee4d8272_2000x1333.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!ADEg!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6458f90b-b764-4586-8a2b-611dee4d8272_2000x1333.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!ADEg!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6458f90b-b764-4586-8a2b-611dee4d8272_2000x1333.jpeg 848w, https://substackcdn.com/image/fetch/$s_!ADEg!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6458f90b-b764-4586-8a2b-611dee4d8272_2000x1333.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!ADEg!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6458f90b-b764-4586-8a2b-611dee4d8272_2000x1333.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!ADEg!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6458f90b-b764-4586-8a2b-611dee4d8272_2000x1333.jpeg" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6458f90b-b764-4586-8a2b-611dee4d8272_2000x1333.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:null,&quot;width&quot;:null,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;A Practical Look at a New Regulatory Era&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="A Practical Look at a New Regulatory Era" title="A Practical Look at a New Regulatory Era" srcset="https://substackcdn.com/image/fetch/$s_!ADEg!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6458f90b-b764-4586-8a2b-611dee4d8272_2000x1333.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!ADEg!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6458f90b-b764-4586-8a2b-611dee4d8272_2000x1333.jpeg 848w, https://substackcdn.com/image/fetch/$s_!ADEg!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6458f90b-b764-4586-8a2b-611dee4d8272_2000x1333.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!ADEg!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6458f90b-b764-4586-8a2b-611dee4d8272_2000x1333.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a><h2><strong>Intro</strong></h2><p>Artificial Intelligence enters 2026 with less novelty and greater consequence. Over the past eighteen months, the focus has shifted away from demonstrations of capability and towards questions of governance, accountability and institutional readiness. This period has been marked by a steady accumulation of regulatory signals, from the EU AI Act entering into force in mid-2024 through to successive UK guidance initiatives across 2025 and early 2026. AI systems are no longer confined to experimental deployments; they are increasingly embedded within core enterprise workflows, influencing decisions, markets, and the distribution of responsibility between humans and machines.</p><p>This shift reflects a broader maturation of the AI ecosystem. Early optimism has given way to more sober assessment as organisations confront the limits of automation in practice. <strong>The challenge is no longer whether AI can generate outputs at speed, but whether those outputs can be trusted, explained, and governed at scale. In high-stakes environments such as financial research and decision-making, these questions are not abstract. 
They determine risk exposure and strategic confidence.</strong></p><p>Regulation is an inevitable response to this moment, but it is also a revealing one. The sequencing of regulatory activity over the past two years shows governments experimenting with guidance, voluntary codes and formal legislation in parallel. The way states and regions choose to regulate AI reflects a necessary balancing act between risk appetite and the pursuit of technological and economic opportunity. Overcorrection risks stifling innovation, while under-reaction erodes public trust and institutional legitimacy. Navigating this balance has become one of the defining policy challenges of the current decade.</p><p>This piece lightly discusses that challenge through a number of lenses including: the growing body of literature interrogating AI&#8217;s concentration of power; the EU AI Act as a reference point; recent UK regulatory initiatives that have unfolded incrementally since early 2025; and the UK&#8217;s evolving attempt to align digital ambition with credible oversight.</p><h2><strong>AI Regulation Enters Its Literary Phase</strong></h2><p>Over the past year, AI has become the subject of a different kind of scrutiny. Alongside technical papers and policy consultations, a growing body of long-form writing has emerged that treats AI less as a breakthrough technology and more as a social and economic system. This literary turn has unfolded in parallel with the EU AI Act&#8217;s publication in August 2024. Karen Hao&#8217;s <em>Empire of AI: Dreams and Nightmares in Sam Altman&#8217;s OpenAI (Empire of AI) </em>exemplifies this shift, moving the conversation away from model performance and towards questions of institutional control and exploitation of both labour and resources.</p><p>At the heart of Hao&#8217;s argument is a reframing of AI as an extractive enterprise. 
Rather than portraying progress as the natural outcome of innovation, she highlights the concentration of resources and influence within a small group of firms. These organisations shape the direction of AI development through scale and capital, while the costs of that development are often distributed elsewhere. The book draws attention to the human labour that underpins modern AI systems, much of it invisible, precarious and geographically distant from centres of decision-making.</p><p>What makes Hao&#8217;s commentary particularly relevant to the regulatory conversation is the insistence that governance cannot be separated from incentive structures. <strong>Hao does not argue that AI is inherently harmful, nor does she call for sweeping prohibition. Instead, she exposes how existing market dynamics reward speed, opacity and centralisation, even when these qualities undermine accountability or long-term resilience.</strong> In this framing, regulatory missteps are less about technical ignorance and more about institutional misalignment.</p><p>The significance of this perspective lies in how it reshapes public expectations at a time when regulators are still calibrating their approach. As AI becomes more embedded in everyday systems, narratives that emphasise concentration and dependency resonate more strongly than abstract promises of efficiency.</p><p>Generally speaking, literature has a unique ability to surface these concerns in a way that technical documentation often cannot. It connects individual experiences of confusion or mistrust to broader structural patterns, creating a shared vocabulary for critique. For regulators, this literary turn presents both opportunity and challenge. <strong>On one hand, it broadens the scope of oversight beyond narrow safety metrics, encouraging consideration of fairness and long-term ESG impact. On the other, it risks accelerating regulatory urgency ahead of empirical clarity. 
Stories are persuasive, but they do not always map cleanly onto enforceable rules.</strong></p><p>The emergence of works like <em>Empire of AI</em> signals that AI regulation is no longer driven solely by engineers and lawyers. It is increasingly shaped by public sentiment and political interpretation. Effective governance in this context requires discernment; literature can illuminate the stakes and expose blind spots, but regulation must translate those insights into proportionate and durable frameworks.</p><h2><strong>From Narrative to Governance</strong></h2><p>Literary critique alone does not produce regulation, but it shapes the conditions under which regulation becomes politically possible. This has been evident over the past two years as public concern has grown alongside formal regulatory action, particularly following the EU AI Act&#8217;s entry into force in 2024. <strong>At times, public opinion calcifies into prescriptive and paternalistic overreach, locking in assumptions that outpace evidence. At others, scepticism towards intervention can drift into inertia, allowing structural risks to accumulate unchecked. Both tendencies risk distorting regulation away from its core function: establishing clear, accessible guidelines grounded in risk mitigation and sustainable organisational benefit.</strong></p><p>Completely dismissing literary influence is a mistake. Public legitimacy and trust matter. Regulation that fails to address widely understood concerns is out of touch with real end users and struggles to command compliance, regardless of architectural sophistication. The task for policymakers is to extract durable insights without codifying transient sentiment.</p><p>Distinguishing between risks that warrant formal oversight and broader anxieties that require transparency rather than restriction is imperative. 
This distinction underpins the EU AI Act&#8217;s risk-based architecture and is echoed, in softer form, in the UK&#8217;s subsequent guidance-led approach. Questions of cultural unease or speculative future harm are, at times, better addressed through standards, disclosure and ongoing review.</p><p>The challenge is heightened by the public&#8217;s tendency to frame regulation as a binary choice between protection and control. In the aftermath of the Grok episode, explained below, public debate quickly spun out into concerns about censorship, surveillance and overreach. Whilst these legitimate concerns merit serious consideration in general, within this context they risk obscuring the narrower and more practical question of <strong>system accountability</strong>. Effective digital regulation must resist this false dichotomy. Safeguards designed to prevent demonstrable harm do not require expansive monitoring of speech or behaviour; they require clarity about responsibility, proportionate controls and credible mechanisms for redress.</p><h2><strong>The EU AI Act as a Regulatory Reference Point</strong></h2><p>The EU AI Act is a (rightly) ambitious step towards codifying AI governance into law. Published and having entered into force in August 2024, it introduced a risk-based framework that categorises AI systems according to their potential to cause harm, with obligations scaling accordingly. For firms operating within or alongside the European market, the Act offers something that has long been absent from AI oversight: a shared regulatory vocabulary.</p><p>From a UK perspective, the importance of the EU AI Act lies less in direct applicability and more in its role as a reference point. Even outside the Union, British companies building or deploying AI systems will encounter its influence through cross-border operations, investor expectations and emerging norms of best practice. 
The Act effectively sets a baseline against which other regulatory approaches, including those developing in the UK, will be compared, particularly as it becomes generally applicable in August 2026.</p><p><strong>One of the Act&#8217;s strengths is its explicit definition of boundaries. By identifying prohibited uses and high-risk categories, it seeks to clarify what is unacceptable whilst allowing space for lawful innovation. This approach aligns with a jurisprudential preference for outlining constraints rather than prescribing permissible activity. </strong>In theory, such clarity should reduce uncertainty for developers and deployers alike.</p><p>In practice, however, the challenge lies in implementation. Many of the Act&#8217;s definitions are necessarily broad, reflecting the diversity and pace of AI and machine learning development. This creates an interpretive burden, particularly for organisations without extensive compliance infrastructure. There is a risk that complexity, rather than risk, becomes the primary driver of regulatory cost. Smaller firms may struggle to navigate layered requirements, even where their systems pose limited harm.</p><p>As a regulatory artefact, the EU AI Act marks a turning point. It moves AI governance from principle to operation. Whether it succeeds will depend on how effectively its requirements are translated into enforceable, proportionate practice.</p><h2><strong>Sequencing Security and Innovation in the UK</strong></h2><p>In contrast to the EU&#8217;s legislative approach, the UK&#8217;s regulatory posture over the past year has been characterised by sequencing and experimentation. 
Since the January 2025 publication of the AI Cyber Security Code of Practice (&#8220;<em>guidance to help stakeholders across the supply chain for AI systems, particularly Developers and System Operators, to meet the cyber security provisions outlined for AI systems in the UK Government&#8217;s Code of Practice (and subsequently ETSI TS 104 223)</em>&#8221;), followed by the Software Security Code of Practice in May 2025 and its update in January 2026, the emphasis has been on laying foundations for secure design before hardening expectations.</p><p>Alongside the Digital and Technologies Sector Plan, which frames AI as a driver of national growth, the recent launch of the Software Security Ambassador Scheme (January 2026) signals a deliberate <strong>trust-led approach to governance</strong>. The scheme builds on earlier guidance by encouraging industry-led adoption, peer learning and demonstrable implementation, rather than immediate enforcement.</p><p>The Code, co-designed with industry and the National Cyber Security Centre, sets out principles for embedding security across the software lifecycle. A cohort of signatories has committed to championing these principles and sharing practical insight on implementation. This staged approach allows voluntary practice to mature into shared expectation, giving regulators a clearer view of organisational capability before escalating intervention.</p><p><strong>Taken together, these initiatives suggest a regulatory philosophy focused on learning, iteration and institutional readiness. They reflect an understanding that effective oversight depends as much on operational maturity as on formal rules.</strong></p><h2><strong>When Strategy Meets Reality: The Grok Backlash</strong></h2><p>The EU AI Act reflects a structured attempt to regulate AI through risk categorisation, while the UK&#8217;s guidance-led initiatives emphasise proportionality and innovation. 
Recent events surrounding Grok, the generative AI chatbot integrated into the social media platform X (formerly Twitter), illustrate how quickly these frameworks can be tested.</p><p><strong>A significant controversy emerged after users demonstrated that Grok&#8217;s image generation and editing features could be used to produce non-consensual sexualised images of real individuals (including minors). As examples circulated publicly, the issue drew rapid political and regulatory scrutiny, particularly in the UK.</strong></p><p>What followed exposed the tension between strategic ambition and operational accountability. The speed and visibility of harm shifted regulatory response from future-proofing to immediate containment, and in the UK the episode was framed through online safety, raising questions about how existing regimes apply to AI-mediated harms.</p><p>The (ongoing) episode highlights the limits of voluntary safeguards once systems are deployed at scale. While mitigations were introduced, the incident reinforced a broader reality: <strong>public-facing AI systems understandably attract hard expectations of accountability.</strong> For policymakers pursuing innovation-led strategies, this underscores the need for governance mechanisms that function under pressure.</p><h2><strong>Closing: Clarity, Restraint and Risk as Organising Principles</strong></h2><p>The current moment calls for clarity of purpose on all sides of the regulatory landscape. For regulators and lawmakers, this means defining boundaries rather than enumerating acceptable use. The most effective frameworks will remain firm regarding impermissible actions whilst allowing for flexibility where application is concerned. <strong>The rapidly evolving landscape requires restraint; however, risk must remain the organising principle.</strong></p><p>For organisations and builders, governance is no longer a downstream compliance exercise. 
It is an operational cornerstone that must shape the full AI supply chain, from Ethics-by-design, through R&amp;D, all the way to deployment and monitoring. As ever, firms that embed oversight early will be better positioned as regulatory expectations harden.&nbsp;</p><p><strong>Regulation of AI/ML and Digital in 2026 is no longer solely about playing catch-up with technology. It is about governance maturity. Literary critique of the levers behind AI development, the EU&#8217;s formalisation of oversight and the UK&#8217;s incremental, trust-led guidance all reflect the spreading recognition that AI governance must be framed and managed as a component of broader institutional responsibility.</strong></p><p>This is the defining task of the new regulatory era.</p><h2><strong>Timeline &amp; References</strong></h2><ul><li><p>August 2024 <a href="https://ai-act-service-desk.ec.europa.eu/en/ai-act-explorer?ref=thefelixview.com">EU AI Act</a> enters into force</p></li><li><p>January 2025 <a href="https://assets.publishing.service.gov.uk/media/679cae441d14e76535afb630/Implementation_Guide_for_the_AI_Cyber_Security_Code_of_Practice.pdf?ref=thefelixview.com">AI Cyber Security Code of Practice</a> published</p></li><li><p>April 2025 <a href="https://www.etsi.org/deliver/etsi_ts/104200_104299/104223/01.01.01_60/ts_104223v010101p.pdf?ref=thefelixview.com">ETSI TS 104 223</a> published</p></li><li><p>May 2025 <a href="https://assets.publishing.service.gov.uk/media/681a156fdf188ba858873aac/Software_Security_Code_of_Practice_Web_Accessible.pdf?ref=thefelixview.com">Software Security Code of Practice</a> published</p></li><li><p>January 2026 <strong>Software Security Code of Practice</strong> updated</p></li><li><p>January 2026 <a href="https://www.gov.uk/government/publications/software-security-ambassadors-scheme/software-security-ambassadors-scheme?ref=thefelixview.com">Software Security Ambassador Scheme</a> announced</p></li><li><p>August 2026 <strong>EU AI Act</strong> 
becomes generally applicable&nbsp;</p></li></ul><h2>Further References</h2><ul><li><p><a href="https://assets.publishing.service.gov.uk/media/6892104df15b237bf6610996/industrial_strategy_digital_and_technologies_sector_plan_accessible.pdf?ref=thefelixview.com">The UK&#8217;s Modern Industrial Strategy - Digital and Technologies Sector Plan</a>&nbsp;</p></li><li><p><a href="https://www.bbc.co.uk/search?q=grok&amp;d=NEWS_PS&amp;ref=thefelixview.com">BBC XAI Grok Coverage&nbsp;</a></p></li><li><p><a href="https://www.youtube.com/watch?v=t9G_kXUm6mI&amp;ref=thefelixview.com">The Observer Karen Hao interview on Empire of AI</a></p></li></ul>]]></content:encoded></item><item><title><![CDATA[A Little Introduction to Neural Networks]]></title><description><![CDATA[If you're anything like me, you may have recently heard people talk about AI a bit more.]]></description><link>https://www.thefelixview.com/p/introduction</link><guid isPermaLink="false">https://www.thefelixview.com/p/introduction</guid><dc:creator><![CDATA[Ben Jaletzke]]></dc:creator><pubDate>Fri, 09 Jan 2026 12:24:05 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/6f79d2ef-237d-40be-bfc5-2eb523e25286_2000x3000.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Q64r!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e6fe22c-51df-4111-98ff-1a8263efb656_2000x3000.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Q64r!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e6fe22c-51df-4111-98ff-1a8263efb656_2000x3000.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!Q64r!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e6fe22c-51df-4111-98ff-1a8263efb656_2000x3000.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Q64r!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e6fe22c-51df-4111-98ff-1a8263efb656_2000x3000.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Q64r!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e6fe22c-51df-4111-98ff-1a8263efb656_2000x3000.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Q64r!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e6fe22c-51df-4111-98ff-1a8263efb656_2000x3000.jpeg" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/3e6fe22c-51df-4111-98ff-1a8263efb656_2000x3000.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:null,&quot;width&quot;:null,&quot;resizeWidth&quot;:590,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;A Little Introduction to Neural Networks&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="A Little Introduction to Neural Networks" title="A Little Introduction to Neural Networks" srcset="https://substackcdn.com/image/fetch/$s_!Q64r!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e6fe22c-51df-4111-98ff-1a8263efb656_2000x3000.jpeg 424w, 
https://substackcdn.com/image/fetch/$s_!Q64r!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e6fe22c-51df-4111-98ff-1a8263efb656_2000x3000.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Q64r!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e6fe22c-51df-4111-98ff-1a8263efb656_2000x3000.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Q64r!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3e6fe22c-51df-4111-98ff-1a8263efb656_2000x3000.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a></figure></div><p></p><p>If you're anything like me, you may have recently heard people talk about AI a bit more. Apparently there are one or two companies that have made some progress in the field.</p><p>While I consider myself technologically savvy enough to use a computer and save a PDF, I've mostly spent time with my mind wandering whenever the topic of how machine learning and neural networks function comes up.</p><p>Nowadays, that's not the healthiest attitude, and since we at Felix Research are rather actively engaged in the field of using and developing Artificial Intelligence solutions, it became time to learn a bit more.</p><p>The problem that I have repeatedly faced is that many of the explanations of how the more complex models work feel a bit like this guide to drawing an Owl,</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!2XM2!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff5d0de39-d7f8-44b8-af3e-c22d9bdf74fa_450x450.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://substackcdn.com/image/fetch/$s_!2XM2!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff5d0de39-d7f8-44b8-af3e-c22d9bdf74fa_450x450.jpeg 424w, https://substackcdn.com/image/fetch/$s_!2XM2!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff5d0de39-d7f8-44b8-af3e-c22d9bdf74fa_450x450.jpeg 848w, https://substackcdn.com/image/fetch/$s_!2XM2!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff5d0de39-d7f8-44b8-af3e-c22d9bdf74fa_450x450.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!2XM2!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff5d0de39-d7f8-44b8-af3e-c22d9bdf74fa_450x450.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!2XM2!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff5d0de39-d7f8-44b8-af3e-c22d9bdf74fa_450x450.jpeg" width="450" height="450" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f5d0de39-d7f8-44b8-af3e-c22d9bdf74fa_450x450.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:450,&quot;width&quot;:450,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;A Little Introduction to Neural Networks&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="A Little Introduction to Neural Networks" title="A Little Introduction to Neural Networks" 
srcset="https://substackcdn.com/image/fetch/$s_!2XM2!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff5d0de39-d7f8-44b8-af3e-c22d9bdf74fa_450x450.jpeg 424w, https://substackcdn.com/image/fetch/$s_!2XM2!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff5d0de39-d7f8-44b8-af3e-c22d9bdf74fa_450x450.jpeg 848w, https://substackcdn.com/image/fetch/$s_!2XM2!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff5d0de39-d7f8-44b8-af3e-c22d9bdf74fa_450x450.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!2XM2!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff5d0de39-d7f8-44b8-af3e-c22d9bdf74fa_450x450.jpeg 1456w" sizes="100vw"></picture><div></div></div></a></figure></div><p>which, while technically correct, sort of leaves out some key details.</p><p>Understandably, the large commercial providers (and <em>cough</em> 'non-profit' organisations) don't particularly have an interest in writing a guide for how to compete with them.</p><p>I'm in the fortunate position of not having to do all of our coding anymore (and I think the team is quite glad that we have Dimitri on board as our Founding Engineer), but I believe that I should know as much as possible about the subject, since it's pretty core to our business. As I learn, I figured I'd share this, and some of our readers might find it a useful 101 guide or reminder.</p><h2>Linear Predictions with ML Models</h2><p>The simplest possible model is a linear regression that takes an input and predicts an output.</p><p>Using <a href="https://pytorch.org/?ref=thefelixview.com">Torch</a> as one of the classic foundations for model building, we can quite simply write such a model.</p><p>In this simplest example, we are trying to train the model to learn <strong>y = 2x</strong>. Not really worth building a neural network for, but this example shows the basic structure of a neural network:</p><p>We define the Input and Output data for testing, our X and Y values, MSE loss and stochastic gradient descent (SGD). This is the most basic form of a model.</p><p>In this type of model, we use a mean squared error loss function to calculate the "wrongness" of a model prediction, and then gradient descent over epochs to train the model to be... less wrong. 
Our learning rate defines the step size at which we conduct this gradient descent.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!P_sQ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc25c68d-daf3-48fe-ac37-c6878e4d25ee_2000x1125.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!P_sQ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc25c68d-daf3-48fe-ac37-c6878e4d25ee_2000x1125.png 424w, https://substackcdn.com/image/fetch/$s_!P_sQ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc25c68d-daf3-48fe-ac37-c6878e4d25ee_2000x1125.png 848w, https://substackcdn.com/image/fetch/$s_!P_sQ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc25c68d-daf3-48fe-ac37-c6878e4d25ee_2000x1125.png 1272w, https://substackcdn.com/image/fetch/$s_!P_sQ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc25c68d-daf3-48fe-ac37-c6878e4d25ee_2000x1125.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!P_sQ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc25c68d-daf3-48fe-ac37-c6878e4d25ee_2000x1125.png" width="2000" height="1125" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/dc25c68d-daf3-48fe-ac37-c6878e4d25ee_2000x1125.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1125,&quot;width&quot;:2000,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;A Little Introduction to Neural Networks&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="A Little Introduction to Neural Networks" title="A Little Introduction to Neural Networks" srcset="https://substackcdn.com/image/fetch/$s_!P_sQ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc25c68d-daf3-48fe-ac37-c6878e4d25ee_2000x1125.png 424w, https://substackcdn.com/image/fetch/$s_!P_sQ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc25c68d-daf3-48fe-ac37-c6878e4d25ee_2000x1125.png 848w, https://substackcdn.com/image/fetch/$s_!P_sQ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc25c68d-daf3-48fe-ac37-c6878e4d25ee_2000x1125.png 1272w, https://substackcdn.com/image/fetch/$s_!P_sQ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdc25c68d-daf3-48fe-ac37-c6878e4d25ee_2000x1125.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" 
stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Changing our learning rate significantly impacts how and at what rate our weights get optimized to reduce loss, and involves a fair amount of trial and error.</p><p>The difference between <em>gradient descent</em> and <em>stochastic gradient descent</em> is primarily that in SGD we pick a random sample of our data at a given point in time rather than using the full sample of data to compute a gradient at each step, thus being more computationally efficient. [^1]</p><p>Important to note for the curious is that:</p><blockquote><p>Strictly speaking, SGD was originally defined to update parameters by using exactly one training sample at a time. In modern usage, the term &#8220;SGD&#8221; is used loosely to mean &#8220;minibatch gradient descent,&#8221; a variant of GD in which small batches of training data are used at a time. 
The major advantage to using subsets of data rather than a singular sample is a lower noise level, because the gradient is equal to the average of losses from the minibatch. For this reason, minibatch gradient descent is the default in deep learning. Contrarily, strict SGD is rarely used in practice. These terms are even conflated by most machine learning libraries such as PyTorch and TensorFlow; optimizers are often called &#8220;SGD,&#8221; even though they typically use minibatches.</p></blockquote><p>[^1] See <a href="https://www.ibm.com/think/topics/stochastic-gradient-descent?ref=thefelixview.com">IBM's great explainer on SGD for a better and more in-depth explanation</a></p><pre><code>import torch
import torch.nn as nn
import torch.optim as optim
import numpy as np

# 1. PREPARE DATA
# X is our input data, Y is our target data (labels).
# We must convert python lists into PyTorch Tensors.
X = torch.tensor([[1.0], [2.0], [3.0], [4.0], [5.0]], dtype=torch.float32)
Y = torch.tensor([[2.0], [4.0], [6.0], [8.0], [10.0]], dtype=torch.float32)

# 2. DEFINE THE MODEL
# We create a class that inherits from nn.Module
class SimpleNet(nn.Module):
    def __init__(self):
        super(SimpleNet, self).__init__()
        self.linear = nn.Linear(1, 1)

    def forward(self, x):
        return self.linear(x)

model = SimpleNet()

# 3. DEFINE LOSS AND OPTIMIZER
# Loss Function: Measures how wrong the model is.
# MSE (Mean Squared Error) is standard for regression.
criterion = nn.MSELoss()

# Optimizer: Updates the weights to reduce the error.
# SGD = Stochastic Gradient Descent. lr = learning rate.
optimizer = optim.SGD(model.parameters(), lr=0.01)

# 4. THE TRAINING LOOP
# This is the heart of AI programming.
# Adding history to let us inspect what happens
history = {
    'loss': [],
    'weight': [],
    'bias': []
}

epochs = 200
print("Training started...")

for epoch in range(epochs):
    # A. Forward pass: Compute predicted y by passing x to the model
    y_pred = model(X)

    # B. Compute loss: Difference between predicted and actual
    loss = criterion(y_pred, Y)
    
    # We use .item() to get the plain python number out of the Tensor
    history['loss'].append(loss.item())
    history['weight'].append(model.linear.weight.item())
    history['bias'].append(model.linear.bias.item())

    # C. Zero gradients: Clear old gradients before calculation
    optimizer.zero_grad()

    # D. Backward pass: Compute gradient of the loss with respect to model parameters
    loss.backward()

    # E. Step: Update parameters (weights) based on gradients
    optimizer.step()

    if (epoch+1) % 100 == 0:
        print(f'Epoch [{epoch+1}/{epochs}], Loss: {loss.item():.4f}')

checkpoint = {
    'model_state': model.state_dict(),
    'history': history
}

torch.save(checkpoint, "model_and_metrics.pth")
print("\nModel and metrics saved to model_and_metrics.pth")
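
# (Sketch) Later, perhaps in a separate script, the checkpoint can be
# restored. torch.load returns the same dict we passed to torch.save,
# so we can rebuild the model and recover its training history
# (the names 'loaded' and 'restored' here are just for illustration):
loaded = torch.load("model_and_metrics.pth")
restored = SimpleNet()
restored.load_state_dict(loaded['model_state'])
restored.eval()  # switch to inference mode before predicting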
</code></pre><p>This trains our simplest possible model, one input and one output, over 200 epochs, and saves the model weights to a file so we can load them again later.</p><p>Visually, we see graphs that may look familiar: as the training epochs go on, the MSE of the model decreases toward zero, and our weight and bias approach the actual mathematical function we are trying to imitate here.</p><p>Given the simplicity of the model, even a few epochs are enough to get 'close' to the correct output, but at only 200 epochs we are left with a prediction that, for such a simple equation, is still noticeably off the mark.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!1dCK!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F528134a1-1c3f-4950-b0ed-ab0042e202b2_1790x490.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!1dCK!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F528134a1-1c3f-4950-b0ed-ab0042e202b2_1790x490.png 424w, https://substackcdn.com/image/fetch/$s_!1dCK!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F528134a1-1c3f-4950-b0ed-ab0042e202b2_1790x490.png 848w, https://substackcdn.com/image/fetch/$s_!1dCK!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F528134a1-1c3f-4950-b0ed-ab0042e202b2_1790x490.png 1272w, https://substackcdn.com/image/fetch/$s_!1dCK!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F528134a1-1c3f-4950-b0ed-ab0042e202b2_1790x490.png
1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!1dCK!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F528134a1-1c3f-4950-b0ed-ab0042e202b2_1790x490.png" width="1790" height="490" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/528134a1-1c3f-4950-b0ed-ab0042e202b2_1790x490.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:490,&quot;width&quot;:1790,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;A Little Introduction to Neural Networks&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="A Little Introduction to Neural Networks" title="A Little Introduction to Neural Networks" srcset="https://substackcdn.com/image/fetch/$s_!1dCK!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F528134a1-1c3f-4950-b0ed-ab0042e202b2_1790x490.png 424w, https://substackcdn.com/image/fetch/$s_!1dCK!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F528134a1-1c3f-4950-b0ed-ab0042e202b2_1790x490.png 848w, https://substackcdn.com/image/fetch/$s_!1dCK!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F528134a1-1c3f-4950-b0ed-ab0042e202b2_1790x490.png 1272w, https://substackcdn.com/image/fetch/$s_!1dCK!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F528134a1-1c3f-4950-b0ed-ab0042e202b2_1790x490.png 1456w" sizes="100vw" 
loading="lazy"></picture></div></a></figure></div><p>We can also train our model for a longer period, that is for more epochs, which... changes very little in this case, really.</p><pre><code>Training started...
Epoch [100/100000], Loss: 0.0173
Epoch [200/100000], Loss: 0.0088
Epoch [300/100000], Loss: 0.0045
Epoch [400/100000], Loss: 0.0023
Epoch [500/100000], Loss: 0.0012
Epoch [600/100000], Loss: 0.0006
Epoch [700/100000], Loss: 0.0003
Epoch [800/100000], Loss: 0.0002
Epoch [900/100000], Loss: 0.0001
...
Epoch [100000/100000], Loss: 0.0000

Model and metrics saved to model_and_metrics.pth
</code></pre><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!-rhI!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F24c50da6-7227-4907-92af-78c99dbee518_1790x490.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!-rhI!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F24c50da6-7227-4907-92af-78c99dbee518_1790x490.png 424w, https://substackcdn.com/image/fetch/$s_!-rhI!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F24c50da6-7227-4907-92af-78c99dbee518_1790x490.png 848w, https://substackcdn.com/image/fetch/$s_!-rhI!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F24c50da6-7227-4907-92af-78c99dbee518_1790x490.png 1272w, https://substackcdn.com/image/fetch/$s_!-rhI!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F24c50da6-7227-4907-92af-78c99dbee518_1790x490.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!-rhI!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F24c50da6-7227-4907-92af-78c99dbee518_1790x490.png" width="1790" height="490" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/24c50da6-7227-4907-92af-78c99dbee518_1790x490.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:490,&quot;width&quot;:1790,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;A Little 
Introduction to Neural Networks&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="A Little Introduction to Neural Networks" title="A Little Introduction to Neural Networks" srcset="https://substackcdn.com/image/fetch/$s_!-rhI!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F24c50da6-7227-4907-92af-78c99dbee518_1790x490.png 424w, https://substackcdn.com/image/fetch/$s_!-rhI!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F24c50da6-7227-4907-92af-78c99dbee518_1790x490.png 848w, https://substackcdn.com/image/fetch/$s_!-rhI!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F24c50da6-7227-4907-92af-78c99dbee518_1790x490.png 1272w, https://substackcdn.com/image/fetch/$s_!-rhI!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F24c50da6-7227-4907-92af-78c99dbee518_1790x490.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 
10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>We are left with an essentially perfect model that predicts y = 2x, and we could now run this model to predict a value for this equation:</p><pre><code>input_value = 20.0

# Convert the number into a Tensor of shape (1, 1)
input_tensor = torch.tensor([[input_value]])

# Turn off gradient calculation (saves memory/speed for inference)
with torch.no_grad():
    # Pass the tensor to the model
    prediction = model(input_tensor)

# Get the simple float value out of the resulting tensor
result = prediction.item()

print(f"Input: {input_value}")
print(f"Model Prediction: {result:.12f}")
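
# Since the training data came from y = 2x, we can sanity-check the
# prediction against the true rule (variable names here are illustrative):
expected = 2 * input_value
error = abs(result - expected)
print(f"Absolute error vs. y = 2x: {error:.6f}")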
</code></pre><pre><code>Input: 20.0
Model Prediction: 39.999973297119
</code></pre><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!x0Cs!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa0dc746c-5ac8-4204-8abd-b8883f5f2166_474x474.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!x0Cs!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa0dc746c-5ac8-4204-8abd-b8883f5f2166_474x474.png 424w, https://substackcdn.com/image/fetch/$s_!x0Cs!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa0dc746c-5ac8-4204-8abd-b8883f5f2166_474x474.png 848w, https://substackcdn.com/image/fetch/$s_!x0Cs!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa0dc746c-5ac8-4204-8abd-b8883f5f2166_474x474.png 1272w, https://substackcdn.com/image/fetch/$s_!x0Cs!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa0dc746c-5ac8-4204-8abd-b8883f5f2166_474x474.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!x0Cs!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa0dc746c-5ac8-4204-8abd-b8883f5f2166_474x474.png" width="474" height="474" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a0dc746c-5ac8-4204-8abd-b8883f5f2166_474x474.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:474,&quot;width&quot;:474,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;A Little Introduction to 
Neural Networks&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="A Little Introduction to Neural Networks" title="A Little Introduction to Neural Networks" srcset="https://substackcdn.com/image/fetch/$s_!x0Cs!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa0dc746c-5ac8-4204-8abd-b8883f5f2166_474x474.png 424w, https://substackcdn.com/image/fetch/$s_!x0Cs!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa0dc746c-5ac8-4204-8abd-b8883f5f2166_474x474.png 848w, https://substackcdn.com/image/fetch/$s_!x0Cs!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa0dc746c-5ac8-4204-8abd-b8883f5f2166_474x474.png 1272w, https://substackcdn.com/image/fetch/$s_!x0Cs!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa0dc746c-5ac8-4204-8abd-b8883f5f2166_474x474.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 
17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h2>So What?</h2><p>Now that we have wasted many orders of magnitude of compute beyond what's useful to predict that 20 * 2 in fact does equal 40 (or rather, failing to do so since we predict not quite 40), how do we make something that approaches a base level of usefulness?</p><h2>ReLU</h2><p>The above "model" is an example of linear regression, where we use the model to take an input and simply predict an output (Y = 2x + bias).</p><p>There is more to be gained from understanding how models classify data, so let's work through a very simple example using colour. First, let's make a test set of data in the form of some spirals!</p><pre><code>def create_spiral_data(n_points=1000):
    theta = np.sqrt(np.random.rand(n_points)) * 2 * np.pi 
    
    # Class A (Red) -&gt; Target 0
    r_a = 2 * theta + np.pi
    data_a = np.array([np.cos(theta) * r_a, np.sin(theta) * r_a]).T
    x_a = data_a + np.random.randn(n_points, 2) * 0.2
    
    # Class B (Blue) -&gt; Target 1
    r_b = -2 * theta - np.pi
    data_b = np.array([np.cos(theta) * r_b, np.sin(theta) * r_b]).T
    x_b = data_b + np.random.randn(n_points, 2) * 0.2

    # Combine
    X = np.vstack([x_a, x_b])
    # Labels: 0 for Class A, 1 for Class B
    Y = np.hstack([np.zeros(n_points), np.ones(n_points)])
    
    return torch.FloatTensor(X), torch.FloatTensor(Y).view(-1, 1)

X, Y = create_spiral_data()
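
# (Optional sketch, assuming matplotlib is installed.) The two classes
# can be plotted to check the spirals; the first n_points rows of X are
# Class A (red), the remaining rows Class B (blue):
import matplotlib.pyplot as plt
plt.scatter(X[:1000, 0], X[:1000, 1], s=5, c='red', label='Class A (0)')
plt.scatter(X[1000:, 0], X[1000:, 1], s=5, c='blue', label='Class B (1)')
plt.legend()
plt.show()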
</code></pre><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!O_KK!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fead2f70c-31b1-479f-92fc-1813d8b65017_989x790.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!O_KK!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fead2f70c-31b1-479f-92fc-1813d8b65017_989x790.png 424w, https://substackcdn.com/image/fetch/$s_!O_KK!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fead2f70c-31b1-479f-92fc-1813d8b65017_989x790.png 848w, https://substackcdn.com/image/fetch/$s_!O_KK!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fead2f70c-31b1-479f-92fc-1813d8b65017_989x790.png 1272w, https://substackcdn.com/image/fetch/$s_!O_KK!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fead2f70c-31b1-479f-92fc-1813d8b65017_989x790.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!O_KK!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fead2f70c-31b1-479f-92fc-1813d8b65017_989x790.png" width="989" height="790" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ead2f70c-31b1-479f-92fc-1813d8b65017_989x790.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:790,&quot;width&quot;:989,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;A Little Introduction to 
Neural Networks&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="A Little Introduction to Neural Networks" title="A Little Introduction to Neural Networks" srcset="https://substackcdn.com/image/fetch/$s_!O_KK!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fead2f70c-31b1-479f-92fc-1813d8b65017_989x790.png 424w, https://substackcdn.com/image/fetch/$s_!O_KK!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fead2f70c-31b1-479f-92fc-1813d8b65017_989x790.png 848w, https://substackcdn.com/image/fetch/$s_!O_KK!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fead2f70c-31b1-479f-92fc-1813d8b65017_989x790.png 1272w, https://substackcdn.com/image/fetch/$s_!O_KK!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fead2f70c-31b1-479f-92fc-1813d8b65017_989x790.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 
17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Let's use a new model, with additional (and larger) layers to address the added complexity of our data, and with two inputs (our X and Y coordinates).</p><p>Importantly, we still output just one number: the probability used to classify the red and blue dots.</p><pre><code>class SpiralNet(nn.Module):
    def __init__(self):
        super(SpiralNet, self).__init__()
        self.layer1 = nn.Linear(2, 64)
        self.layer2 = nn.Linear(64, 64) 
        self.layer3 = nn.Linear(64, 1)  # Output is 1 number (Probability)
        self.relu = nn.ReLU()
        self.sigmoid = nn.Sigmoid() # Squishes output between 0 and 1

    def forward(self, x):
        x = self.relu(self.layer1(x))
        x = self.relu(self.layer2(x))
        x = self.sigmoid(self.layer3(x)) # Final activation for probability
        return x

model = SpiralNet()
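
# Two quick, illustrative sanity checks. First, ReLU in isolation:
# negative inputs clamp to zero, positive inputs pass through unchanged.
print(model.relu(torch.tensor([-2.0, 0.0, 1.5])))
# Second, a forward pass through the untrained network: the final sigmoid
# already squashes every output into (0, 1), i.e. a valid probability.
with torch.no_grad():
    probs = model(X[:5])
print(probs.min().item(), probs.max().item())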
</code></pre><p>We've added "ReLU" here, at which point I became scared of maths again and closed my computer.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!-m_n!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F74758fff-8376-48df-94b3-6cd4a06a0fca_177x300.webp" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!-m_n!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F74758fff-8376-48df-94b3-6cd4a06a0fca_177x300.webp 424w, https://substackcdn.com/image/fetch/$s_!-m_n!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F74758fff-8376-48df-94b3-6cd4a06a0fca_177x300.webp 848w, https://substackcdn.com/image/fetch/$s_!-m_n!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F74758fff-8376-48df-94b3-6cd4a06a0fca_177x300.webp 1272w, https://substackcdn.com/image/fetch/$s_!-m_n!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F74758fff-8376-48df-94b3-6cd4a06a0fca_177x300.webp 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!-m_n!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F74758fff-8376-48df-94b3-6cd4a06a0fca_177x300.webp" width="177" height="300" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/74758fff-8376-48df-94b3-6cd4a06a0fca_177x300.webp&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:300,&quot;width&quot;:177,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;A Little Introduction to Neural Networks&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="A Little Introduction to Neural Networks" title="A Little Introduction to Neural Networks" srcset="https://substackcdn.com/image/fetch/$s_!-m_n!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F74758fff-8376-48df-94b3-6cd4a06a0fca_177x300.webp 424w, https://substackcdn.com/image/fetch/$s_!-m_n!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F74758fff-8376-48df-94b3-6cd4a06a0fca_177x300.webp 848w, https://substackcdn.com/image/fetch/$s_!-m_n!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F74758fff-8376-48df-94b3-6cd4a06a0fca_177x300.webp 1272w, https://substackcdn.com/image/fetch/$s_!-m_n!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F74758fff-8376-48df-94b3-6cd4a06a0fca_177x300.webp 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" 
stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>When I returned, I put on my big boy pants and turned to Google, at which point I became scared again, since anything with more letters than numbers in maths is inherently scary, and that isn't helped when the second result starts with "A Gentle Introduction to..."</p><h2>What are Rectified Linear Units?</h2><p>Actually, quite simple!</p><p>ReLU(x) = max(0, x): the output equals the input if the input is &gt;= 0, and is 0 otherwise.</p><p>A purely linear model, like the example above, applies no transformation to its inputs, and as a result it can never learn complex functions. ReLU addresses this: the input passes through unchanged for all values &gt; 0 and becomes 0 for those below, leaving us with a piecewise linear function that still behaves well under <em>backpropagation</em> while letting stacked layers represent non-linear shapes.</p><p>The adoption of ReLU around 2010-11 is one of the significant
milestones that enabled rapid progress in the deep learning field. Before its adoption, the primary activation functions were the logistic sigmoid and hyperbolic tangent, both of which have problems with saturation.</p><p>See the excellent <a href="https://machinelearningmastery.com/rectified-linear-activation-function-for-deep-learning-neural-networks/?ref=thefelixview.com">Gentle Introduction</a> for more detail on these.</p><blockquote><p>A general problem with both the sigmoid and tanh functions is that they saturate. This means that large values snap to 1.0 and small values snap to -1 or 0 for tanh and sigmoid respectively. Further, the functions are only really sensitive to changes around their mid-point of their input, such as 0.5 for sigmoid and 0.0 for tanh.</p></blockquote><p>We now set the loss criterion. In our Linear function, this was a Mean Squared Error loss, which works well for regression tasks.</p><p>For classification, like in our example, we want a loss function that heavily penalises confident incorrect guesses, which leads to better training results than an MSE-based version, where we only consider the distance to a correct guess.</p><p>Cross Entropy Loss is more commonly used in these classification problems, as it measures the divergence between the predicted probability distribution and the true distribution of target classes.</p><p>If you're a mathematician, it looks like this:</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!mOGy!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e6592c9-1e44-425c-a52b-d59f2677fdad_876x157.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp"
srcset="https://substackcdn.com/image/fetch/$s_!mOGy!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e6592c9-1e44-425c-a52b-d59f2677fdad_876x157.png 424w, https://substackcdn.com/image/fetch/$s_!mOGy!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e6592c9-1e44-425c-a52b-d59f2677fdad_876x157.png 848w, https://substackcdn.com/image/fetch/$s_!mOGy!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e6592c9-1e44-425c-a52b-d59f2677fdad_876x157.png 1272w, https://substackcdn.com/image/fetch/$s_!mOGy!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e6592c9-1e44-425c-a52b-d59f2677fdad_876x157.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!mOGy!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e6592c9-1e44-425c-a52b-d59f2677fdad_876x157.png" width="876" height="157" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/0e6592c9-1e44-425c-a52b-d59f2677fdad_876x157.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:157,&quot;width&quot;:876,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;A Little Introduction to Neural Networks&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="A Little Introduction to Neural Networks" title="A Little Introduction to Neural Networks" 
srcset="https://substackcdn.com/image/fetch/$s_!mOGy!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e6592c9-1e44-425c-a52b-d59f2677fdad_876x157.png 424w, https://substackcdn.com/image/fetch/$s_!mOGy!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e6592c9-1e44-425c-a52b-d59f2677fdad_876x157.png 848w, https://substackcdn.com/image/fetch/$s_!mOGy!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e6592c9-1e44-425c-a52b-d59f2677fdad_876x157.png 1272w, https://substackcdn.com/image/fetch/$s_!mOGy!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e6592c9-1e44-425c-a52b-d59f2677fdad_876x157.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p>If you're a programmer using the torch library, it looks like this:</p><pre><code>import torch.nn as nn
nn.BCELoss()
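
# A quick illustration (with hypothetical numbers) of why this suits
# classification: BCELoss expects probabilities in [0, 1], and a
# confident wrong guess is punished far harder than a near-miss
import torch
loss = nn.BCELoss()
print(loss(torch.tensor([0.9]), torch.tensor([1.0])).item())  # ~0.105
print(loss(torch.tensor([0.1]), torch.tensor([1.0])).item())  # ~2.303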
</code></pre><pre><code># We changed MSELoss -&gt; BCELoss (Binary Cross Entropy)
# This is used for Yes/No (1/0) classification questions
criterion = nn.BCELoss()
optimizer = optim.Adam(model.parameters(), lr=0.01)

# 4. TRAINING LOOP
# Adding history to track loss and accuracy over time
history = {
    'loss': [],
    'accuracy': []
}

epochs = 10000
print("Training on Spiral Data...")

for epoch in range(epochs):
    y_pred = model(X)
    loss = criterion(y_pred, Y)
    
    # Calculate accuracy (How many did we get right?)
    predicted_classes = y_pred.round() # Round to 0 or 1
    acc = (predicted_classes.eq(Y).sum() / float(Y.shape[0])).item()
    
    # Track history
    history['loss'].append(loss.item())
    history['accuracy'].append(acc)
    
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    
    if (epoch+1) % 200 == 0:
        print(f'Epoch [{epoch+1}/{epochs}], Loss: {loss.item():.4f}, Accuracy: {acc*100:.2f}%')

# Save model state and history
checkpoint = {
    'model_state': model.state_dict(),
    'history': history
}
torch.save(checkpoint, "spiral_model.pth")
print("Model and metrics saved to spiral_model.pth")
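
# Restoring later is the mirror image of saving: load the checkpoint
# dict, push the weights back into a model instance, and the history
# comes along for free
checkpoint = torch.load("spiral_model.pth")
model.load_state_dict(checkpoint['model_state'])
history = checkpoint['history']
print(f"Restored {len(history['loss'])} epochs of history")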
</code></pre><p>After training, we can load the spiral again and see a red and a blue region, showing how the model has mapped the space. In this dataset, with clear separation between the two spirals, there are very few areas where the model cannot confidently assign red or blue.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!JNZJ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a46e7d7-f479-4e0f-b8f5-5df373a535e7_1772x489.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!JNZJ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a46e7d7-f479-4e0f-b8f5-5df373a535e7_1772x489.png 424w, https://substackcdn.com/image/fetch/$s_!JNZJ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a46e7d7-f479-4e0f-b8f5-5df373a535e7_1772x489.png 848w, https://substackcdn.com/image/fetch/$s_!JNZJ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a46e7d7-f479-4e0f-b8f5-5df373a535e7_1772x489.png 1272w, https://substackcdn.com/image/fetch/$s_!JNZJ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a46e7d7-f479-4e0f-b8f5-5df373a535e7_1772x489.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!JNZJ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a46e7d7-f479-4e0f-b8f5-5df373a535e7_1772x489.png" width="1772" height="489"
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/9a46e7d7-f479-4e0f-b8f5-5df373a535e7_1772x489.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:489,&quot;width&quot;:1772,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;A Little Introduction to Neural Networks&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="A Little Introduction to Neural Networks" title="A Little Introduction to Neural Networks" srcset="https://substackcdn.com/image/fetch/$s_!JNZJ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a46e7d7-f479-4e0f-b8f5-5df373a535e7_1772x489.png 424w, https://substackcdn.com/image/fetch/$s_!JNZJ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a46e7d7-f479-4e0f-b8f5-5df373a535e7_1772x489.png 848w, https://substackcdn.com/image/fetch/$s_!JNZJ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a46e7d7-f479-4e0f-b8f5-5df373a535e7_1772x489.png 1272w, https://substackcdn.com/image/fetch/$s_!JNZJ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9a46e7d7-f479-4e0f-b8f5-5df373a535e7_1772x489.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" 
stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>However, looking at the generated outputs and boundaries, it's clear that the model we have built here does not actually know what 'red' or 'blue' is; it just splits the space into those two regions. That leaves areas that are, visually, well outside either spiral, yet the model still predicts with high confidence that they belong to the 'blue' or 'red' region.</p><p>We can train the model with a third spiral, to separate the space further.</p><pre><code># 1. CREATE 3-CLASS SPIRAL DATA
def create_spiral_data(n_points=1000, classes=3):
    X = []
    y = []
    for i in range(classes):
        theta = np.sqrt(np.random.rand(n_points)) * 2 * np.pi 
        r = 2 * theta + np.pi
        
        # Rotate the spiral based on the class index
        # Class 0: 0 deg, Class 1: 120 deg, Class 2: 240 deg
        rotation = i * (2 * np.pi / classes)
        
        # Math to generate the spiral arms
        d = np.array([np.cos(theta) * r, np.sin(theta) * r]).T
        
        # Apply rotation matrix
        rot_mat = np.array([[np.cos(rotation), -np.sin(rotation)],
                            [np.sin(rotation), np.cos(rotation)]])
        d = np.dot(d, rot_mat)
        
        # Add noise
        d += np.random.randn(n_points, 2) * 0.2
        
        X.append(d)
        y.append(np.zeros(n_points) + i) # Label is 0, 1, or 2

    X = np.concatenate(X)
    y = np.concatenate(y)
    
    # Note: CrossEntropyLoss expects LongTensor for labels (integers), not Float
    return torch.FloatTensor(X), torch.LongTensor(y)

X, Y = create_spiral_data(classes=3)
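
# Sanity-check the shapes: 3 classes x 1000 points each gives 3000 rows
# of (x, y) coordinates, plus 3000 integer class labels
print(X.shape, Y.shape)  # torch.Size([3000, 2]) torch.Size([3000])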

# 2. DEFINE THE MODEL
class MultiSpiralNet(nn.Module):
    def __init__(self):
        super(MultiSpiralNet, self).__init__()
        self.layer1 = nn.Linear(2, 64)
        self.layer2 = nn.Linear(64, 64)
        # OUTPUT CHANGE: We now have 3 output neurons!
        # [Score for Class 0, Score for Class 1, Score for Class 2]
        self.layer3 = nn.Linear(64, 3) 
        self.relu = nn.ReLU()
        
        # Note: We do NOT put Softmax here. 
        # PyTorch's CrossEntropyLoss includes Softmax automatically for numerical stability.

    def forward(self, x):
        x = self.relu(self.layer1(x))
        x = self.relu(self.layer2(x))
        x = self.layer3(x) 
        return x

model = MultiSpiralNet()
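
# The raw outputs are unnormalised scores ("logits"). If we want actual
# probabilities to inspect, we apply softmax ourselves -- the loss
# defined below does this internally
with torch.no_grad():
    probs = torch.softmax(model(X[:1]), dim=1)
print(probs)  # one row of 3 values that sum to 1 (up to float precision)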

# 3. LOSS (The "Big Gun" of AI)
criterion = nn.CrossEntropyLoss()
optimizer = optim.Adam(model.parameters(), lr=0.01)

# 4. TRAINING
epochs = 2000
print("Training 3-Class Spiral...")

for epoch in range(epochs):
    # Forward
    logits = model(X) 
    
    # Loss
    loss = criterion(logits, Y)
    
    # Backward
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    
    if (epoch+1) % 200 == 0:
        # Calculate accuracy
        # torch.max returns (max_value, index_of_max_value)
        # We want the index (0, 1, or 2)
        _, predicted = torch.max(logits, 1)
        acc = (predicted == Y).sum().item() / Y.size(0)
        print(f'Epoch [{epoch+1}/{epochs}], Loss: {loss.item():.4f}, Accuracy: {acc*100:.2f}%')

torch.save(model.state_dict(), "spiral_3class.pth")
print("Model saved.")
</code></pre><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!iRGN!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15e16d23-3814-42d6-8119-67c6ed06727a_1127x490.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!iRGN!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15e16d23-3814-42d6-8119-67c6ed06727a_1127x490.png 424w, https://substackcdn.com/image/fetch/$s_!iRGN!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15e16d23-3814-42d6-8119-67c6ed06727a_1127x490.png 848w, https://substackcdn.com/image/fetch/$s_!iRGN!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15e16d23-3814-42d6-8119-67c6ed06727a_1127x490.png 1272w, https://substackcdn.com/image/fetch/$s_!iRGN!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15e16d23-3814-42d6-8119-67c6ed06727a_1127x490.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!iRGN!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15e16d23-3814-42d6-8119-67c6ed06727a_1127x490.png" width="1127" height="490" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/15e16d23-3814-42d6-8119-67c6ed06727a_1127x490.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:490,&quot;width&quot;:1127,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;A Little 
Introduction to Neural Networks&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="A Little Introduction to Neural Networks" title="A Little Introduction to Neural Networks" srcset="https://substackcdn.com/image/fetch/$s_!iRGN!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15e16d23-3814-42d6-8119-67c6ed06727a_1127x490.png 424w, https://substackcdn.com/image/fetch/$s_!iRGN!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15e16d23-3814-42d6-8119-67c6ed06727a_1127x490.png 848w, https://substackcdn.com/image/fetch/$s_!iRGN!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15e16d23-3814-42d6-8119-67c6ed06727a_1127x490.png 1272w, https://substackcdn.com/image/fetch/$s_!iRGN!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F15e16d23-3814-42d6-8119-67c6ed06727a_1127x490.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 
10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>This leaves us with regions that track the actual spiral areas more closely, but it also shows us what this model does and, importantly, what it does not do.</p><p>We want the neural networks we build to be useful. The small model that we have put together above is useful, but only for this one specific task.</p><p>Rather than actually recognizing red / blue / neither, the model essentially builds a map of our space, predicting the likelihood of each option appearing at any point in that space.</p><p>We can visualise this by changing the input to something other than a spiral, which makes it quite clear that we have not built a model that is good at differentiating colour, but one that has learned a specific space.</p><p>As we can see with the test data below, where instead of a spiral we generate a set of lines of coloured data, the model is entirely useless for this task.
If we wanted the model to be able to classify these arrangements, we would have to train it again.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!chNd!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff64d1a6c-5743-4745-a70c-500ac3be7da1_825x717.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!chNd!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff64d1a6c-5743-4745-a70c-500ac3be7da1_825x717.png 424w, https://substackcdn.com/image/fetch/$s_!chNd!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff64d1a6c-5743-4745-a70c-500ac3be7da1_825x717.png 848w, https://substackcdn.com/image/fetch/$s_!chNd!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff64d1a6c-5743-4745-a70c-500ac3be7da1_825x717.png 1272w, https://substackcdn.com/image/fetch/$s_!chNd!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff64d1a6c-5743-4745-a70c-500ac3be7da1_825x717.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!chNd!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff64d1a6c-5743-4745-a70c-500ac3be7da1_825x717.png" width="825" height="717" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f64d1a6c-5743-4745-a70c-500ac3be7da1_825x717.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:717,&quot;width&quot;:825,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;A Little Introduction to Neural Networks&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="A Little Introduction to Neural Networks" title="A Little Introduction to Neural Networks" srcset="https://substackcdn.com/image/fetch/$s_!chNd!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff64d1a6c-5743-4745-a70c-500ac3be7da1_825x717.png 424w, https://substackcdn.com/image/fetch/$s_!chNd!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff64d1a6c-5743-4745-a70c-500ac3be7da1_825x717.png 848w, https://substackcdn.com/image/fetch/$s_!chNd!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff64d1a6c-5743-4745-a70c-500ac3be7da1_825x717.png 1272w, https://substackcdn.com/image/fetch/$s_!chNd!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff64d1a6c-5743-4745-a70c-500ac3be7da1_825x717.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" 
stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><h2>What's next?</h2><p>In order to proceed beyond simple classification or simple prediction, as we have looked at in this brief overview post, we have to start talking about Convolutional Neural Networks, which can begin to use training data, like the classical MNIST example set, to understand handwritten numbers.</p><h3>What is a CNN?</h3><p>From Wikipedia:</p><blockquote><p>CNNs are also known as shift invariant or space invariant artificial neural networks, based on the shared-weight architecture of the convolution kernels or filters that slide along input features and provide translation-equivariant responses known as feature maps. 
Counter-intuitively, most convolutional neural networks are not invariant to translation, due to the downsampling operation they apply to the input.<br>Feedforward neural networks are usually fully connected networks, that is, each neuron in one layer is connected to all neurons in the next layer. The "full connectivity" of these networks makes them prone to overfitting data. Typical ways of regularization, or preventing overfitting, include: penalizing parameters during training (such as weight decay) or trimming connectivity (skipped connections, dropout, etc.) Robust datasets also increase the probability that CNNs will learn the generalized principles that characterize a given dataset rather than the biases of a poorly-populated set.<br>Convolutional networks were inspired by biological processes in that the connectivity pattern between neurons resembles the organization of the animal visual cortex. Individual cortical neurons respond to stimuli only in a restricted region of the visual field known as the receptive field. The receptive fields of different neurons partially overlap such that they cover the entire visual field.</p></blockquote><p>Essentially, a CNN slides small filters, called kernels, across its input. For example, in the model we set up below, we take a 28x28 pixel image and scan it with 3x3 kernels.
The model then extracts a set of "features" from the images, which it learns over epochs of training.</p><p>Importantly, as you are no doubt aware, when anyone talks about model responses, these are always, ultimately, a probability distribution over the possible options, from which the most likely answer is picked.</p><p>At the end of our model, the feature maps are flattened and passed (via ReLU) through fully connected layers to a final layer with 10 outputs (the numbers 0-9); the output with the highest activation corresponds to our number value, and that is the final model response.</p><div><hr></div><p>In a dataset like the MNIST training set, we train the model with a large amount of labelled training data, where each image is labelled with its corresponding number. The model effectively learns what shapes correspond to what labelled number, so that when we ask it to identify a number after training, it can identify the closest likely match.</p><pre><code>import torch
import torch.nn as nn
import torch.optim as optim
from torchvision import datasets, transforms
from torch.utils.data import DataLoader

# 1. PREPARE DATA (MNIST)
# Transforms allow us to turn raw images into Tensors
transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.1307,), (0.3081,)) # Standard normalization for MNIST
])

# Download data
train_dataset = datasets.MNIST('./data', train=True, download=True, transform=transform)
# DataLoader handles batching (giving us 64 images at a time)
train_loader = DataLoader(train_dataset, batch_size=64, shuffle=True)

# 2. DEFINE THE CNN MODEL
class SimpleCNN(nn.Module):
    def __init__(self):
        super(SimpleCNN, self).__init__()
        
        # --- FEATURE EXTRACTION (The "Eyes") ---
        # Layer 1: Input 1 channel (grayscale) -&gt; Output 32 features
        # Kernel 3x3 means it looks at small 3x3 squares
        self.conv1 = nn.Conv2d(in_channels=1, out_channels=32, kernel_size=3, padding=1)
        
        # Layer 2: Input 32 features -&gt; Output 64 features
        self.conv2 = nn.Conv2d(in_channels=32, out_channels=64, kernel_size=3, padding=1)
        
        # Pooling: Shrinks image by half (2x2)
        self.pool = nn.MaxPool2d(kernel_size=2, stride=2)
        
        # --- DECISION MAKING (The "Brain") ---
        # We need to calculate the size of the flattened input.
        # MNIST is 28x28. 
        # After Pool 1 (28-&gt;14). After Pool 2 (14-&gt;7).
        # Final Grid is 7x7 pixels. We have 64 feature maps.
        # So: 64 * 7 * 7 = 3136 inputs.
        self.fc1 = nn.Linear(64 * 7 * 7, 128)
        self.fc2 = nn.Linear(128, 10) # 10 Outputs (Digits 0-9)
        
        self.relu = nn.ReLU()
        self.flatten = nn.Flatten()

    def forward(self, x):
        # Pass 1: Conv -&gt; ReLU -&gt; Pool
        # Input: [Batch, 1, 28, 28] -&gt; Output: [Batch, 32, 14, 14]
        x = self.pool(self.relu(self.conv1(x)))
        
        # Pass 2: Conv -&gt; ReLU -&gt; Pool
        # Input: [Batch, 32, 14, 14] -&gt; Output: [Batch, 64, 7, 7]
        x = self.pool(self.relu(self.conv2(x)))
        
        # Flatten: Turn Grid into Line
        # Output: [Batch, 3136]
        x = self.flatten(x)
        
        # Classification
        x = self.relu(self.fc1(x))
        x = self.fc2(x)
        return x

model = SimpleCNN()
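
# Verify the shape arithmetic above with a dummy batch of one image:
# [1, 1, 28, 28] in, 10 class scores out
dummy = torch.zeros(1, 1, 28, 28)
print(model(dummy).shape)  # torch.Size([1, 10])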

# 3. LOSS &amp; OPTIMIZER
criterion = nn.CrossEntropyLoss()
optimizer = optim.Adam(model.parameters(), lr=0.001)

# 4. TRAIN
print("Training CNN on MNIST digits...")
epochs = 3 # CNNs learn FAST. 3 epochs is enough for &gt;98% accuracy.

for epoch in range(epochs):
    running_loss = 0.0
    for batch_idx, (data, target) in enumerate(train_loader):
        # Standard training steps
        optimizer.zero_grad()
        output = model(data)
        loss = criterion(output, target)
        loss.backward()
        optimizer.step()
        
        running_loss += loss.item()
        
        if batch_idx % 100 == 0:
            print(f'Epoch {epoch+1}, Batch {batch_idx}: Loss {loss.item():.4f}')

print("Training Complete.")
torch.save(model.state_dict(), "mnist_cnn.pth")
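
# A sketch of how we could score the model on MNIST's held-out test
# split (not run above), reusing the same transform:
test_dataset = datasets.MNIST('./data', train=False, download=True, transform=transform)
test_loader = DataLoader(test_dataset, batch_size=1000)
correct = 0
with torch.no_grad():
    for data, target in test_loader:
        correct += (model(data).argmax(dim=1) == target).sum().item()
print(f"Test accuracy: {100 * correct / len(test_dataset):.2f}%")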
</code></pre><p>When we run this model, training takes quite a bit longer than for the previous little testers we've built, as a result of the added complexity of our model and its layers, and of the 60,000 training images it now processes each epoch.</p><p>We can see the difference in structure between the simple model we built for our spirals</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!SQtZ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4db21f32-ffee-4c21-b1fa-c3c06aa11955_179x669.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!SQtZ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4db21f32-ffee-4c21-b1fa-c3c06aa11955_179x669.png 424w, https://substackcdn.com/image/fetch/$s_!SQtZ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4db21f32-ffee-4c21-b1fa-c3c06aa11955_179x669.png 848w, https://substackcdn.com/image/fetch/$s_!SQtZ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4db21f32-ffee-4c21-b1fa-c3c06aa11955_179x669.png 1272w, https://substackcdn.com/image/fetch/$s_!SQtZ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4db21f32-ffee-4c21-b1fa-c3c06aa11955_179x669.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!SQtZ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4db21f32-ffee-4c21-b1fa-c3c06aa11955_179x669.png" width="179" height="669"
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/4db21f32-ffee-4c21-b1fa-c3c06aa11955_179x669.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:669,&quot;width&quot;:179,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;A Little Introduction to Neural Networks&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="A Little Introduction to Neural Networks" title="A Little Introduction to Neural Networks" srcset="https://substackcdn.com/image/fetch/$s_!SQtZ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4db21f32-ffee-4c21-b1fa-c3c06aa11955_179x669.png 424w, https://substackcdn.com/image/fetch/$s_!SQtZ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4db21f32-ffee-4c21-b1fa-c3c06aa11955_179x669.png 848w, https://substackcdn.com/image/fetch/$s_!SQtZ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4db21f32-ffee-4c21-b1fa-c3c06aa11955_179x669.png 1272w, https://substackcdn.com/image/fetch/$s_!SQtZ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4db21f32-ffee-4c21-b1fa-c3c06aa11955_179x669.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" 
stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>and the MNIST training model</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!C-wJ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafc33fe9-3a33-488d-9fb0-949625d0f3a1_236x1152.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!C-wJ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafc33fe9-3a33-488d-9fb0-949625d0f3a1_236x1152.png 424w, https://substackcdn.com/image/fetch/$s_!C-wJ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafc33fe9-3a33-488d-9fb0-949625d0f3a1_236x1152.png 
848w, https://substackcdn.com/image/fetch/$s_!C-wJ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafc33fe9-3a33-488d-9fb0-949625d0f3a1_236x1152.png 1272w, https://substackcdn.com/image/fetch/$s_!C-wJ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafc33fe9-3a33-488d-9fb0-949625d0f3a1_236x1152.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!C-wJ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafc33fe9-3a33-488d-9fb0-949625d0f3a1_236x1152.png" width="236" height="1152" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/afc33fe9-3a33-488d-9fb0-949625d0f3a1_236x1152.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1152,&quot;width&quot;:236,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;A Little Introduction to Neural Networks&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="A Little Introduction to Neural Networks" title="A Little Introduction to Neural Networks" srcset="https://substackcdn.com/image/fetch/$s_!C-wJ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafc33fe9-3a33-488d-9fb0-949625d0f3a1_236x1152.png 424w, https://substackcdn.com/image/fetch/$s_!C-wJ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafc33fe9-3a33-488d-9fb0-949625d0f3a1_236x1152.png 848w, 
https://substackcdn.com/image/fetch/$s_!C-wJ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafc33fe9-3a33-488d-9fb0-949625d0f3a1_236x1152.png 1272w, https://substackcdn.com/image/fetch/$s_!C-wJ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fafc33fe9-3a33-488d-9fb0-949625d0f3a1_236x1152.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>What can this model do? 
The MNIST model is a classic learning exercise: it learns to recognize handwritten digits. For example, we can load one of the test images for the number '1', and indeed</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Uaeo!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F44e19e26-5074-4afb-bef3-70e4da425249_389x411.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Uaeo!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F44e19e26-5074-4afb-bef3-70e4da425249_389x411.png 424w, https://substackcdn.com/image/fetch/$s_!Uaeo!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F44e19e26-5074-4afb-bef3-70e4da425249_389x411.png 848w, https://substackcdn.com/image/fetch/$s_!Uaeo!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F44e19e26-5074-4afb-bef3-70e4da425249_389x411.png 1272w, https://substackcdn.com/image/fetch/$s_!Uaeo!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F44e19e26-5074-4afb-bef3-70e4da425249_389x411.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Uaeo!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F44e19e26-5074-4afb-bef3-70e4da425249_389x411.png" width="389" height="411" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/44e19e26-5074-4afb-bef3-70e4da425249_389x411.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:411,&quot;width&quot;:389,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;A Little Introduction to Neural Networks&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="A Little Introduction to Neural Networks" title="A Little Introduction to Neural Networks" srcset="https://substackcdn.com/image/fetch/$s_!Uaeo!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F44e19e26-5074-4afb-bef3-70e4da425249_389x411.png 424w, https://substackcdn.com/image/fetch/$s_!Uaeo!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F44e19e26-5074-4afb-bef3-70e4da425249_389x411.png 848w, https://substackcdn.com/image/fetch/$s_!Uaeo!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F44e19e26-5074-4afb-bef3-70e4da425249_389x411.png 1272w, https://substackcdn.com/image/fetch/$s_!Uaeo!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F44e19e26-5074-4afb-bef3-70e4da425249_389x411.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" 
stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>The model correctly and confidently predicted our number.</p><p>Neat.</p><h2>What now?</h2><p>None of the above is new, but working through these models step by step, and beginning to understand some of the choices made to achieve the desired results, helps us speak the language of neural networks and begin playing with more complex cases. 
We went from a simple linear model all the way to a basic convolutional neural network, from predicting <em>y = 2x</em> to recognizing handwritten numbers.</p>]]></content:encoded></item><item><title><![CDATA[Felix Research's Origins & AI for Finance]]></title><description><![CDATA[Interviewing CEO & Founder, Ben Jaletzke]]></description><link>https://www.thefelixview.com/p/felix-researchs-origins-ai-for-finance</link><guid isPermaLink="false">https://www.thefelixview.com/p/felix-researchs-origins-ai-for-finance</guid><dc:creator><![CDATA[Felix Research]]></dc:creator><pubDate>Tue, 16 Dec 2025 12:51:36 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/c22ec26f-2943-4bbe-bf09-b636ab0497d5_1540x1534.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!MCgi!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffbf797c4-b120-49ea-8bc8-0220b9d2c623_1540x1534.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!MCgi!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffbf797c4-b120-49ea-8bc8-0220b9d2c623_1540x1534.png 424w, https://substackcdn.com/image/fetch/$s_!MCgi!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffbf797c4-b120-49ea-8bc8-0220b9d2c623_1540x1534.png 848w, https://substackcdn.com/image/fetch/$s_!MCgi!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffbf797c4-b120-49ea-8bc8-0220b9d2c623_1540x1534.png 1272w, 
https://substackcdn.com/image/fetch/$s_!MCgi!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffbf797c4-b120-49ea-8bc8-0220b9d2c623_1540x1534.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!MCgi!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffbf797c4-b120-49ea-8bc8-0220b9d2c623_1540x1534.png" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/fbf797c4-b120-49ea-8bc8-0220b9d2c623_1540x1534.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:null,&quot;width&quot;:null,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Felix Research's Origins &amp; AI for Finance&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Felix Research's Origins &amp; AI for Finance" title="Felix Research's Origins &amp; AI for Finance" srcset="https://substackcdn.com/image/fetch/$s_!MCgi!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffbf797c4-b120-49ea-8bc8-0220b9d2c623_1540x1534.png 424w, https://substackcdn.com/image/fetch/$s_!MCgi!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffbf797c4-b120-49ea-8bc8-0220b9d2c623_1540x1534.png 848w, https://substackcdn.com/image/fetch/$s_!MCgi!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffbf797c4-b120-49ea-8bc8-0220b9d2c623_1540x1534.png 1272w, 
https://substackcdn.com/image/fetch/$s_!MCgi!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffbf797c4-b120-49ea-8bc8-0220b9d2c623_1540x1534.png 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a><p><em>Interviewing CEO &amp; Founder, Ben Jaletzke</em></p><p><em><strong>Ben and I head over to a Battersea pub for an informal interview about the origins of Felix Research, the mission driving it and the industry pain points that inform Ben's approach to product building. We are joined by James (CCO &amp; co-founder) to discuss AI for enterprise use, as well as the past and future of the workforce.</strong></em></p><p><strong>Sav</strong>: Of course I already know this, but tell me how the idea came to you. Give us the origin story. Discovery. Pain points. 
Perceived opportunity. </p><p><strong>Ben</strong>: Well, you can read all about it in my new book, <strong>Felix One</strong>! </p><p><strong>Sav</strong>: "Rags to Riches" for the low, <em>low</em> price of five ninety-nine, ninety-nine! </p><p><strong>Ben</strong>: But in seriousness, I mean, I don't think there was that... eureka moment. </p><p><strong>We get comfortable in our seats, having opted for stools outside but under the glow of space heaters &#8211; crucial for a brisk November evening in London.</strong> </p><h2>Intro &amp; Background</h2><p>I'd thought about various ideas before, but I don't think there was ever a moment where I was like, &#8220;That's The Thing&#8221;. It was more that while I was working elsewhere, I was playing around with a bunch of ideas - little elements that either annoyed me or that I thought I had a better approach to. So I started programming. </p><p><strong>I wait expectantly.</strong></p><p><strong>Ben</strong>: Well, I was just sort of playing around with thoughts, and one of them was <em><strong>how do you make it easier for people in the team to share files and links and data with each other?</strong></em> Rather than doing it in an email or a Teams chat or a WhatsApp chat, having one sort of centralised space where every time you share a thing, you do it there. 
That could have just been a plugin or whatever, but it was one of my initial frustrations. </p><p><strong>Sav</strong>: Okay, we'll certainly get back to that, but before we do: at the time, what were you working on? </p><p><strong>Ben</strong>: Like, job-wise or? </p><p><strong>Sav</strong>: Yeah, like give us a bit of personal background but also some context in terms of where you were encountering these pain points. </p><p><strong>Ben</strong>: Yeah, I mean, having spent some years in industry [institutional finance], as much as the efficient markets people want to say that there's some theoretically perfect information and yada yada, no one has even a modicum of genuine information. Like, if you wanted to break down any investment decision or <em>possible</em> investment decision into, like, its &#8220;prime factors&#8221; -</p><p><strong>Sav:</strong> - Nice. </p><p><strong>Ben</strong>: - then there are too many of them to consider. So we just never have the actual data for it. That's broadly why <strong>Felix Research</strong> exists.</p><p>For example, if you really want to get into what inflation's going to be like next year, there's too much data to do that. It's a very simple thing at a high level - you can look and see it's at 2% right now and think that there's going to be a supply-demand shift. So the <em>assumption</em> is that it's going to be slightly higher or lower, and then you shift that 2% by some decimal place. 
But if you tried to do the real math, you would need an infinite amount of data to practically&nbsp;do that, because every single payment and every single action and every single choice would flow into that.&nbsp;</p><p>So&nbsp;when I was doing the fundamentals,&nbsp;it became clear that&nbsp;we have a chance here to&nbsp;actually set&nbsp;up&nbsp;a very clean&nbsp;research infrastructure,&nbsp;from scratch,&nbsp;for a firm without&nbsp;- I&nbsp;don't&nbsp;know what you want to call it - like&nbsp;corporate debt,&nbsp;almost.&nbsp;</p><p><strong>Sav</strong>: And&nbsp;by that&nbsp;you mean figurative debt? Like an unpleasant residual?&nbsp;</p><p><strong>Ben</strong>:&nbsp;Yeah,&nbsp;I mean the overhang of bad behaviours and practices. Like, if&nbsp;you're&nbsp;a company&nbsp;subscribed&nbsp;to certain databases, then those are the databases you use to&nbsp;inform decision-making. If you have Bloomberg, you have a lot of data and&nbsp;you're&nbsp;sort of lucky&nbsp;in that sense, but most people either&nbsp;don't&nbsp;have it or&nbsp;can't&nbsp;use all&nbsp;of it.&nbsp;&nbsp;And that&nbsp;led me to the practical question of&nbsp;how can we actually do this?&nbsp;Because, again,&nbsp;practically&nbsp;speaking,&nbsp;I&nbsp;can't&nbsp;download 500 terabytes of&nbsp;information&nbsp;and then&nbsp;search it. I&nbsp;don't&nbsp;even know how to&nbsp;search&nbsp;it.&nbsp;</p><p>You have your one investment bank that sends you investment reports, so you use those, plus of course, other information. 
But&nbsp;that's&nbsp;sort of&nbsp;the&nbsp;only&nbsp;primary source you have in that sense.&nbsp;So&nbsp;you&nbsp;miss out on&nbsp;99.99999999...&nbsp;essentially&nbsp;100%&nbsp;of the information in the market if&nbsp;you're&nbsp;only considering that&nbsp;single source.&nbsp;So&nbsp;the question that I wanted to&nbsp;answer,&nbsp;from an investment perspective - which made me want to do the startup &#8211;&nbsp;was:&nbsp;<strong>How can we get closer to not</strong>&nbsp;<strong>missing out on</strong>&nbsp;<strong>the</strong>&nbsp;<strong>100%?</strong>&nbsp;The only way to do that is programmatically. To have something that can&nbsp;search&nbsp;millions of data points or millions of ideas at the speed of&nbsp;core&nbsp;cycles.&nbsp;</p><p><strong>Sav:</strong>&nbsp;Can&nbsp;I&nbsp;push you to elaborate on &#8220;the&nbsp;missed 100%&#8221;? Get into that a bit more.&nbsp;</p><p><strong>Ben</strong>:&nbsp;So&nbsp;if you have&nbsp;your&nbsp;one&nbsp;advisor database&nbsp;and we assume&nbsp;you&nbsp;don't&nbsp;have any other database,&nbsp;from a&nbsp;numbers&nbsp;perspective, you have&nbsp;let's&nbsp;say 10&nbsp;thousand&nbsp;documents. But&nbsp;there's&nbsp;50&nbsp;<em>million</em>&nbsp;documents out there.&nbsp;So&nbsp;you have&nbsp;a dismally&nbsp;small fraction of the total information.&nbsp;If you break it down by industry or sector or company type, the numbers change, but the ratios&nbsp;probably only&nbsp;get worse. 
Because if I'm only searching, let's say, the Goldman Sachs database, they have a lot of information, but they don't have any <strong>macro information</strong> outside of what's in their report. So if I didn't have a macro database, then in terms of making any kind of macroeconomic assumptions, I would have zero data, effectively, except whatever I read in a given report. </p><p>So now if I add a macro database to my set, I have much more information. But now I have another problem: <strong>how do I combine that information sensibly?</strong> I don't want to be reading an article and then having to go to my macro browser, then looking up the time series, understanding the data, and then going back to my project workspace - that takes too long. It's unwieldy. </p><p>If you look at the European Central Bank, for example, they have so many small data points. They're all reliable, but you need a consolidated version of that. So all that's to say: you need to expand how much data you can have as your input, you need that amount of data to be manageable to work with, and you achieve this by having a system that can look through whatever you want it to look through. </p><p><strong>Sav</strong>: So if we drew a straight line, narratively, from the link sharing exploration stuff to <strong>Felix One</strong>, what would it look like? </p><p><strong>Ben</strong>: The link sharing process, in relation to <strong>Felix One</strong>, was me trying to take a first step. There's this sort of philosophical ideal of what research data should look like, which is to be able to search all data from any source in any fidelity, 
perfectly and instantly. That's essentially sort of like quantum computing for data, if you will. Like, ideally you search infinite mutations and get the result you want, yes, but we can't do that. That doesn't work. It's not feasible. What we <em>can</em> do, at least with the link sharing thing, for example, is: if we have a file, I can make sure that you also have that file or can see that file. So we're not yet cutting out the noise, but we're at least making a tiny fraction of working with some amount of data more manageable. So <strong>Felix One</strong> is a first step toward extending a researcher's dataset and also making it easier to work with the data. </p><h2>Workflows &amp; Enterprise Finance</h2><p><strong>Sav:</strong> Talk to me a bit about workflows. We spoke earlier this year on the blog about technology alone being an insufficient strategy; how do developers close the gap between the software and the human end user - financial researcher or otherwise? </p><p><strong>Ben</strong>: Within the finance industry there is nothing except for <strong>your</strong> take on something; essentially it's your IP that separates you from another firm. 
There is functionally no difference between two boutique investment banks, in that there's nothing that one bank generally has that another bank can't also get - and that goes for industry knowledge too. You can always hire the person with whatever expertise you're looking for. In that way it's kind of a people-driven business. Therefore, if you leave a big firm where you became very good at a Thing, you're going to leave it to do that Thing for someone else or for yourself. But the inverse means that even if you're smaller, you can be highly effective. </p><p>If you were making Citadel, like, 2B a year, then you could usually do that exact same thing one-for-one in a different context. Oftentimes, you even take your old team with you after a while. 
So that's not only Citadel losing your revenue; they're also losing potential investors who might go with that team, which is why that kind of thing is strictly regulated. </p><p><strong>Sav</strong>: Sounds fairly zero-sum-y. </p><p><strong>Ben</strong>: Yeah, the <em><strong>vibe</strong></em> is very zero-sum, but the situation isn't, entirely. It's largely the attitude people have that creates the zero-sum game. And you know, that's partly a function of it being an industry where the thing you're producing is also the thing that you get paid in. </p><p><strong>Sav</strong>: Sure. </p><p><strong>Ben</strong>: It's not like car manufacturing, where you make a lot of cars and then some percentage of that, after cost and after supply chain, might be a bonus based on, like, your performance or whatever. It's like: if you make more money, you get more money. Therefore, people are very much all in that zero-sum mindset. Like "<em>I would like all of the money if I can</em>". And that's reasonable. </p><p>But, going back to the idea for <strong>Felix One</strong> and the idea of the Super-Empowered Professional, I think part of the problem is that some of the ideas that I'd like to pursue are actually quite complex, in that they require both immense manpower and considerable compute. Some issues and ideas are conceptually very easy to understand, but the Doing It is a different matter. 
Sometimes a million connections have to be made for something to work the way it's supposed to, as seamlessly as it's supposed to. But then that's sort of what I hope the future work of <strong>Felix Research</strong> will be, more broadly, once we're able to do it. </p><p>So, the link sharing thing would have involved the huge headache of connecting every single thing that someone could possibly use to chat, and figuring out how to do that securely and quickly. </p><p><strong>Sav</strong>: Sounds as unwieldy as it would be helpful! </p><p><strong>Ben</strong>: Yeah, I mean, I think helpfulness has to be the Ultimate Good for our purposes - it's no good if you design a thing that makes people's lives easier but you've actually ended up adding steps to the workflow. </p><p><strong>Sav</strong>: Definitely. </p><p><strong>Ben</strong>: With some footnotes to that, of course. Ultimately, the trade-off has to make sense. </p><p><strong>Sav</strong>: Yes. How many steps are you adding versus how much better is it? </p><p><strong>Ben</strong>: If you're adding five workflow steps, but each step actually adds value, then maybe it's okay to add the five steps. </p><p><strong>Sav</strong>: And I guess we're in interesting times for AI-powered workplace software, in that people are rushing to the table with &#8220;Productivity Tools&#8221; that neither eradicate the administrative bloat meaningfully nor create massively valuable output. Hello! James has just joined us. </p><p><strong>James Enters.</strong></p><p><strong>James</strong>: Hello! 
Please continue.&nbsp;</p><p><strong>James pulls up a stool and casts a glance at the recording setup - a very artfully/ precariously balanced phone atop a coffee cup, if memory serves.</strong></p><p><strong>Sav</strong>: I was going to&nbsp;say,&nbsp;on&nbsp;this issue of diminishing returns, where&nbsp;you're&nbsp;catering to a pain point&nbsp;but&nbsp;actually making&nbsp;the process more complicated.&nbsp;You're&nbsp;seeing this emerging issue wherein&nbsp;companies are implementing&nbsp;AI&nbsp;pilots&nbsp;that&nbsp;are&nbsp;glorified wrappers and putting their staff through training. That's&nbsp;nonsense.&nbsp;&nbsp;</p><p><strong>James</strong>:&nbsp;Yep,&nbsp;to then use a tool that&nbsp;isn't&nbsp;very agile,&nbsp;doesn't&nbsp;reflect the reality of their workflow&nbsp;and ends up gathering dust a year later.&nbsp;</p><p><strong>Sav</strong>:&nbsp;So&nbsp;it seems that&nbsp;you're&nbsp;uniquely&nbsp;positioned because the solution&nbsp;you're&nbsp;creating came from a pain point felt by&nbsp;an end user within the industry&nbsp;- yourself. 
Rather than a reverse-engineered attempt to see what businesses will pay for.&nbsp;</p><p><strong>Ben</strong>:&nbsp;I mean the overarching mission has been to&nbsp;build&nbsp;a simple website that makes&nbsp;mundane parts of the workflow&nbsp;easier.&nbsp;There have been so many fleeting instances of like&nbsp;<em>&#8220;If I could just improve this one tiny thing,</em>&nbsp;<em>I would find</em>&nbsp;<em>Product much</em>&nbsp;<em>nicer to use&#8221;.</em>&nbsp;</p><p>So&nbsp;there&nbsp;are&nbsp;lots of these&nbsp;opportunities&nbsp;for&nbsp;little&nbsp;tweaks,&nbsp;where something&nbsp;is&nbsp;really good&nbsp;but&nbsp;there&#8217;s&nbsp;a gap to be bridged as far as&nbsp;real-life usability.&nbsp;Like,&nbsp;I&nbsp;just need to adjust&nbsp;these things&nbsp;for my personal workflow&nbsp;- now, my personal workflow is by no means the&nbsp;<em>best</em>&nbsp;workflow, but I&nbsp;found&nbsp;that by making these tweaks,&nbsp;I inadvertently started&nbsp;learning&nbsp;more&nbsp;about these tools and what it takes to&nbsp;develop&nbsp;one.&nbsp;</p><p>There's also a kind of connectivity to the learning. An example is another orbiting idea&nbsp;that I&nbsp;placed on the back-burner because it&nbsp;was never&nbsp;intended&nbsp;to be a&nbsp;standalone&nbsp;product. It&nbsp;was&nbsp;a&nbsp;multi-format text&nbsp;editor,&nbsp;where&nbsp;instead of having four&nbsp;different&nbsp;editors for four different text&nbsp;formats,&nbsp;you&nbsp;just&nbsp;have one&nbsp;page where you can,&nbsp;at the click of a button, switch between HTML, markdown, rich text and other formats and get an output&nbsp;ready to paste into website code.&nbsp;&nbsp;</p><p><strong>Sav</strong>: And why&nbsp;wasn't&nbsp;it&nbsp;ever intended to&nbsp;be&nbsp;or&nbsp;developed&nbsp;into a standalone product?&nbsp;</p><p><strong>Ben</strong>:&nbsp;I wanted it to exist differently. 
The principles of centralisation and interoperability are attractive,&nbsp;evidently,&nbsp;but&nbsp;people&nbsp;don't&nbsp;pay for text editors.&nbsp;It's&nbsp;the kind of thing that you would bundle into&nbsp;some&nbsp;subscription, where you get&nbsp;it as an add-on for a few extra pounds per&nbsp;month or&nbsp;something.&nbsp;</p><p><strong>James</strong>:&nbsp;Fair.&nbsp;Realistic. I suppose a benefit of having familiarity with the landscape&nbsp;you&#8217;re&nbsp;trying to sell within.&nbsp;</p><p><strong>Ben</strong>:&nbsp;Yeah, but at the same time,&nbsp;it was another step in the direction of like, okay, what can I use of the available tools? And then tweak them slightly or put them together a bit like Lego,&nbsp;in a slightly&nbsp;different way,&nbsp;to make them nicer for myself. And&nbsp;that&#8217;s&nbsp;how&nbsp;we get to the&nbsp;early&nbsp;stages&nbsp;of <strong>Felix&nbsp;One</strong>. After those practical functional questions came the kind of... self-belief&nbsp;part.&nbsp;</p><h2>On Personal Process</h2><p>I was catching up with some friends from my old job and one of them said to me,&nbsp;<em>&#8220;Oh, I never thought</em>&nbsp;<em>you'd</em>&nbsp;<em>stay in finance long. I always thought</em>&nbsp;<em>you'd</em>&nbsp;<em>do a startup at some point</em>&#8221;.&nbsp;And he&nbsp;kind of gave&nbsp;me the impetus to&nbsp;consider, &#8220;<em>Maybe I</em>&nbsp;<em>should</em>&nbsp;<em>give it a try</em>&#8221;.&nbsp;And the more I kept&nbsp;talking&nbsp;to&nbsp;people,&nbsp;the more I got these enthusiastic responses&nbsp;&#8211; I found it really encouraging.&nbsp;Honestly,&nbsp;Rebecca had a huge role in getting me out of my head and out there doing validation.&nbsp;You know,&nbsp;I&nbsp;don't&nbsp;like to talk ideas because I see all the reasons why&nbsp;they're&nbsp;not&nbsp;good&nbsp;<em>enough</em>&nbsp;ideas,&nbsp;or&nbsp;why&nbsp;not&nbsp;<em>yet</em>. 
But&nbsp;she was like,&nbsp;&#8220;Just talk to people.&nbsp;<strong>You</strong>&nbsp;<strong>have to</strong>&nbsp;<strong>talk to people about it to get feedback on what could make it better.&#8221;</strong>&nbsp;</p><p><strong>Sav</strong>:&nbsp;Amazing.&nbsp;&nbsp;</p><p><strong>Ben</strong>: And so&nbsp;that's&nbsp;how&nbsp;I started.&nbsp;&nbsp;</p><p><strong>Sav</strong>:&nbsp;And&nbsp;so&nbsp;to be clear, you were purely in the mode of&nbsp;&#8220;<em>I would&nbsp;benefit&nbsp;from this</em>&#8221;.&nbsp;&nbsp;</p><p><strong>Ben</strong>: I was in the mode of&nbsp;"<em>I would&nbsp;benefit&nbsp;from&nbsp;it&nbsp;and&nbsp;there's&nbsp;probably a&nbsp;way to turn that into&nbsp;larger benefit for others".</em>&nbsp;The original idea for&nbsp;<strong>Felix</strong>&nbsp;<strong>One</strong>&nbsp;was called&nbsp;Parsley&nbsp;and it was just the idea of parsing data. I called it&nbsp;DocParse, which, you know - very&nbsp;Me&nbsp;Naming&nbsp;A&nbsp;Product. And Rebecca looked at it and was like,&nbsp;&#8220;<em>Oh, you can call it Parsley, like the leaf</em>&#8221;.&nbsp;</p><p>&nbsp;<strong>Sav</strong>:&nbsp;Love that.&nbsp;</p><p><strong>James</strong>: That's so good.</p><p><strong>Ben</strong>:&nbsp;Right?&nbsp;I was like,&nbsp;"<em>Fuck,</em>&nbsp;<em>that's</em>&nbsp;<em>a</em>&nbsp;<em>really good</em>&nbsp;<em>name"</em>.&nbsp;So&nbsp;the first time I went to one of the&nbsp;Gathr&nbsp;events as a&nbsp;founder was for Parsley. And then&nbsp;it&nbsp;kind of, you know, quickly&nbsp;snowballed from there.&nbsp;Obviously&nbsp;you and I&nbsp;started talking and&nbsp;I&nbsp;started to work on it more&nbsp;with the explicit aim of making it a product:&nbsp;how can we make this a product?&nbsp;&nbsp;</p><p>I never liked the idea of doing something&nbsp;that's&nbsp;just a little&nbsp;bit&nbsp;useful, because that&nbsp;doesn't&nbsp;feel as satisfying if&nbsp;it's&nbsp;a plugin. 
You also probably&nbsp;won't&nbsp;<em>like</em>&nbsp;using it&nbsp;as much.&nbsp;It's unmemorable. I&nbsp;didn't&nbsp;like that. I would rather have taken a&nbsp;standard&nbsp;job;&nbsp;I also&nbsp;didn&#8217;t&nbsp;want&nbsp;to start a company for the sake of starting a company. And then it just&nbsp;sort&nbsp;of took shape.</p><p>As I started working on it around April, I was more like, okay, so actually the link-sharing&nbsp;thing and the&nbsp;centralised editing&nbsp;things&nbsp;are kind of connected in that if we can make it easier for you to share files and we can make the files&nbsp;more readily available to edit,&nbsp;what&nbsp;we're essentially doing is reducing&nbsp;the workflow to its&nbsp;core&nbsp;component of&nbsp;Words In&nbsp;A File. You&nbsp;don't&nbsp;care about the formatting anymore. You care about the information contained in it&nbsp;and the relevant context.&nbsp;The parsing&nbsp;function&nbsp;is obviously just an extracting&nbsp;exercise.&nbsp;But that&nbsp;eventually morphed into the idea of, well, how&nbsp;can we&nbsp;build a full&nbsp;interface in which&nbsp;you can, you know, read&nbsp;PDFs easily?&nbsp;&nbsp;</p><p>No one likes whatever tool they use for PDFs.&nbsp;Yeah, like I will give Apple's Preview a sort of shout-out here,&nbsp;not that&nbsp;I'm&nbsp;in a position&nbsp;to shout out to Apple, but there's&nbsp;a lot of annoyance in the&nbsp;modern&nbsp;workflow&nbsp;itself&nbsp;and&nbsp;there's&nbsp;a&nbsp;dearth&nbsp;of good products that are&nbsp;made specifically for the financial industry. And&nbsp;that&#8217;s&nbsp;always slightly baffled me because&nbsp;it's&nbsp;the largest and most important - well, important&nbsp;in that&nbsp;it's&nbsp;one of the most <em>significant</em> - industries in the world.&nbsp;</p><p>But&nbsp;it's&nbsp;an issue&nbsp;of How. 
How is it that video editors and photographers and doctors&nbsp;and academics&nbsp;and industrial designers,&nbsp;<strong>everyone</strong>&nbsp;has their own&nbsp;domain-specific&nbsp;software, and for finance&nbsp;it's&nbsp;like,&nbsp;&#8220;<em>Ah, you just use Excel</em>.&#8221;&nbsp;Fuck off?</p><p><strong>Sav</strong>: Is that, I mean, within&nbsp;the sector, is that almost&nbsp;a&nbsp;point of pride?&nbsp;You know, people who&nbsp;actually enjoy&nbsp;using a&nbsp;kind of no-nonsense, no-fuss, no-frills-bells-or-whistles kind of thing?&nbsp;</p><p><strong>Ben</strong>:&nbsp;So&nbsp;this is the other&nbsp;thing.&nbsp;It&#8217;s&nbsp;a mixture.&nbsp;It&nbsp;is a&nbsp;<em>big</em>&nbsp;point of pride for people in finance to become good at using Excel and Bloomberg and those sorts of tools that all&nbsp;come with&nbsp;quite a learning curve.&nbsp;</p><p><strong>Sav</strong>:&nbsp;Understandably.&nbsp;</p><p><strong>Ben</strong>: And I get it, because once you become good at it, it is&nbsp;very powerful. But&nbsp;more and more,&nbsp;no doubt also because of AI and everything,&nbsp;you see&nbsp;people&nbsp;actually becoming&nbsp;far worse at it nowadays. I was talking to a friend the other night. He was struggling to find anyone who could build a&nbsp;really good&nbsp;financial model.&nbsp;</p><p><strong>James</strong>: Why?&nbsp;</p><p><strong>Ben</strong>: Just because&nbsp;people are&nbsp;learning it less, and the people who learned it stringently are becoming more senior, so&nbsp;they're&nbsp;not the ones training people anymore.&nbsp;&nbsp;</p><p>With Excel,&nbsp;there's&nbsp;always been this issue which is that you&nbsp;have to&nbsp;justify&nbsp;whatever&nbsp;you're&nbsp;doing as a junior to your senior,&nbsp;who's&nbsp;the actual stakeholder. 
Excel and PowerPoint are easy in that&nbsp;they&#8217;re&nbsp;tangible for the older generation to use&nbsp;-&nbsp;even if&nbsp;the&nbsp;senior&nbsp;doesn't&nbsp;build models himself anymore,&nbsp;he&nbsp;can still go into an Excel spreadsheet and know how to navigate it. If you press F2 on a cell, you see the calculation -&nbsp;it's&nbsp;there in that cell, in front of your eyes. And the entire&nbsp;thing,&nbsp;start to finish,&nbsp;is auditable, which oftentimes you&nbsp;don't&nbsp;get in the same way if&nbsp;you're&nbsp;in a web application.&nbsp;You can learn all the&nbsp;rules&nbsp;and you can work with it, but&nbsp;it's&nbsp;not auditable in the same way. And the problem is, the only other option so far for finance has&nbsp;essentially been&nbsp;Python programming or C++,&nbsp;sort of like&nbsp;what you see on the quantitative side for building algorithmic models. And they are way faster and way better in a lot of ways, but the problem is,&nbsp;<strong>you</strong>&nbsp;<strong>can't</strong>&nbsp;<strong>manipulate them as easily.</strong>&nbsp;You&nbsp;can't&nbsp;take Python code and just be like,&nbsp;&#8220;<em>What if we replace</em>&nbsp;<em>this entire table with the Q3 table?</em>&#8221;&nbsp;It's&nbsp;static code&nbsp;that&nbsp;requires a greater level of commitment: learning the language properly, learning how to use it, learning the tools required to display images and dashboards,&nbsp;and so on.&nbsp;</p><h2>AI &amp; The Modern Workflow</h2><p><strong>Now</strong>&nbsp;<strong>we're</strong>&nbsp;<strong>at a stage where,</strong>&nbsp;<strong>because of AI, you can still give someone a spreadsheet, but you can have a thing that works in the spreadsheet alongside</strong>&nbsp;<strong>them.</strong>&nbsp;And I&nbsp;do&nbsp;think as&nbsp;the industry changes and as everything adapts to AI,&nbsp;there'll&nbsp;be a big adjustment in finance.&nbsp;There's already&nbsp;a growing movement in&nbsp;some&nbsp;private&nbsp;firms&nbsp;to stop&nbsp;hiring juniors. 
A senior who knows what&nbsp;they're&nbsp;doing financially and from an investment perspective has&nbsp;much&nbsp;less need for an analyst to build them a full model, because he can still build <em>enough</em> of it to not need the analyst to do all the number crunching.&nbsp;&nbsp;</p><p>So&nbsp;what we need to do is to become part of that more modern&nbsp;workflow and make it such&nbsp;that, I guess in a slightly romantic way, the people who work in finance&nbsp;are people who are&nbsp;really good&nbsp;at finance. But you&nbsp;don't&nbsp;also have to be&nbsp;really good&nbsp;at Excel&nbsp;modelling&nbsp;specifically to be good at finance.&nbsp;&nbsp;</p><p>Just like if you&nbsp;went back 60 years&nbsp;and saw&nbsp;people doing industrial and architectural design, you would have offices with like 50 or 100 people doing blueprint drawings and calculations on&nbsp;paper, essentially. And now you have CAD software that is orders of magnitude&nbsp;more effective.&nbsp;</p><p><strong>James</strong>: A guy with a CNC and CAD software can make what would have taken a team of 100 people to design.&nbsp;</p><p><strong>Ben</strong>:&nbsp;Yeah.&nbsp;I still feel like Excel and&nbsp;probably things&nbsp;like Bloomberg are the only real innovations that have happened in terms of how you do finance. Everything else has been a philosophical adjustment of private and public markets or emerging markets. But the workflow itself has never really caught up. And&nbsp;it's&nbsp;probably because until the introduction of AI, there&nbsp;hasn't&nbsp;really been a need to change anything about the workflow.&nbsp;</p><p><strong>Sav</strong>:&nbsp;So&nbsp;regarding&nbsp;the advent of&nbsp;AI, what do you think? I mean, obviously the disruption is already happening, but what do you see as the effect of AI on finance?&nbsp;&nbsp;</p><p><strong>Ben</strong>: Well, so there&nbsp;are&nbsp;different ways&nbsp;to approach that. 
One is obviously from an investing perspective, which&nbsp;isn't&nbsp;really the point of this right now, but&nbsp;it's&nbsp;shifted what you have to focus on.&nbsp;Because you&nbsp;now&nbsp;have to, for example, look&nbsp;at how&nbsp;much more resilient&nbsp;a business&nbsp;is&nbsp;to&nbsp;either integrating&nbsp;with&nbsp;or existing&nbsp;in spite of&nbsp;AI, which from an investing perspective changes the game a bit.&nbsp;&nbsp;</p><p>Why does&nbsp;Docusign, for example, have 10,000 employees?&nbsp;There's&nbsp;a good chance that as&nbsp;AI&nbsp;models get&nbsp;better and better, a lot of those roles become redundant. And the same thing in things like marketing.&nbsp;So&nbsp;all of a sudden&nbsp;when&nbsp;you're&nbsp;doing company analysis, you&nbsp;have to&nbsp;take a different approach. But what&nbsp;it's&nbsp;also meant is that other industries are seeing the benefits of simple AI, like summarising a document,&nbsp;or email apps that reply&nbsp;and&nbsp;manage your calendar.&nbsp;We're&nbsp;getting&nbsp;closer to the point&nbsp;of being able to&nbsp;use&nbsp;AI&nbsp;for fundamental analysis. And it's, you know, novel in the sense that for a long time there have&nbsp;been people building&nbsp;ML&nbsp;models&nbsp;(pre-LLM)&nbsp;that try to&nbsp;analyse&nbsp;the tone of, for example, the&nbsp;Fed Board of Governors&nbsp;meeting&nbsp;- &#8220;<em>were they sounding positive or negative?</em>&#8221;&nbsp;For years&nbsp;there&#8217;ve&nbsp;been these investigations, not that&nbsp;they've&nbsp;ever been particularly&nbsp;accurate, but&nbsp;they're&nbsp;better than having to read a thousand tweets.&nbsp;</p><p><strong>Sav</strong>:&nbsp;Indeed.&nbsp;I guess for some of us&nbsp;that's&nbsp;our hobby.&nbsp;</p><p><strong>Ben</strong>:&nbsp;Quite. But if you do it 10 times a day for 10 companies, it becomes&nbsp;something&nbsp;else entirely.&nbsp;</p><p>But I think there are good and bad ways to use AI and&nbsp;it's&nbsp;important to be wary of it. 
But equally, I think if you are too wary of it and you have competitors&nbsp;starting to&nbsp;use&nbsp;it properly, then&nbsp;you're&nbsp;going to lose out.&nbsp;Like, if&nbsp;you're&nbsp;still using antiquated workflows that are also much more painful from a&nbsp;quality of life&nbsp;perspective for the people using them, then&nbsp;you're&nbsp;much less attractive as a place to work as well.&nbsp;&nbsp;</p><p>But most important is speed of analysis.&nbsp;There are&nbsp;a lot of events that you&nbsp;have to&nbsp;react to quite quickly in finance, and usually whatever happens,&nbsp;there's&nbsp;a lot of source data and a lot of&nbsp;calculation that goes into&nbsp;those reactions. If, for example,&nbsp;there's&nbsp;a regulatory announcement by a government party, there are usually&nbsp;a&nbsp;great many&nbsp;implications&nbsp;to&nbsp;that. And&nbsp;if you&nbsp;<em>do</em>&nbsp;read&nbsp;over&nbsp;those&nbsp;regulatory documents,&nbsp;the way&nbsp;that&nbsp;they're&nbsp;written&nbsp;is deliberately&nbsp;difficult to understand -&nbsp;it makes them&nbsp;annoying&nbsp;to read. So just having a thing that&nbsp;simplifies&nbsp;without reducing&nbsp;the&nbsp;depth of&nbsp;context&nbsp;available&nbsp;will become a game changer for the industry.&nbsp;</p><p><strong>Sav</strong>: And you said there are&nbsp;good ways&nbsp;of using AI and bad ways of using AI, and you know, you&nbsp;don't&nbsp;want to lose an advantage that your competitor has, but you want to be wielding it properly. Why&nbsp;don't&nbsp;you just tell me a bit about what that&nbsp;means practically, not as&nbsp;an&nbsp;industry commentator, but as a business owner.&nbsp;To close out.&nbsp;</p><p><strong>Ben</strong>:&nbsp;For the most part,&nbsp;I&nbsp;think the bad ways to use AI are to rely on the outputs as factual.&nbsp;I mean,&nbsp;there's&nbsp;plenty of proof that AI models lie&nbsp;almost as&nbsp;much as they&nbsp;don't. 
Well,&nbsp;actually that's&nbsp;not true.&nbsp;&nbsp;</p><p><strong>*All</strong>&nbsp;<strong>laugh*</strong>&nbsp;</p><p>It's&nbsp;like 80% true, but&nbsp;that's&nbsp;a&nbsp;20%&nbsp;error rate, which is unacceptable by most standards, let alone professional standards.&nbsp;In finance, if you look at a model&nbsp;and&nbsp;there's&nbsp;a&nbsp;20%&nbsp;chance&nbsp;that what you&nbsp;see is&nbsp;wrong, then&nbsp;it's&nbsp;entirely useless. The problem&nbsp;with AI&nbsp;is that&nbsp;a lot&nbsp;of the time&nbsp;you&nbsp;don't&nbsp;necessarily&nbsp;know which 20%&nbsp;is&nbsp;wrong.&nbsp;It's&nbsp;not like the first 80% are&nbsp;really good&nbsp;and then at the end it drops off.&nbsp;It can be sprinkled here and there. So that is the bad way to use it.&nbsp;&nbsp;</p><p>The&nbsp;good way&nbsp;to use it is to&nbsp;employ&nbsp;it in ways&nbsp;such that&nbsp;it has less of a chance to make a mistake.&nbsp;Build auditability into your workflow.&nbsp;Essentially, if&nbsp;you ask it to&nbsp;&#8220;<em>summarise a document</em>&#8221;&nbsp;or to&nbsp;&#8220;<em>extract table A</em>&#8221;,&nbsp;it's&nbsp;much easier to check.&nbsp;You just want to have the&nbsp;information&nbsp;to hand&nbsp;for reference&nbsp;- if&nbsp;you're&nbsp;extracting the table, you can visually see both results next to each other. You can even do a little Excel true/false check just to highlight whether all the cells match.&nbsp;<strong>That's a good way to use AI, because doing that yourself takes you more time than it takes the AI. It's sort of a speed enhancement, not a replacement for thinking.</strong>&nbsp;And I think&nbsp;that's&nbsp;the core&nbsp;issue -&nbsp;if you use it to&nbsp;free up time for valuable activities,&nbsp;it's&nbsp;good. 
If you use it to replace your thinking, then at least for now,&nbsp;it's&nbsp;bad.&nbsp;</p><p><strong>James</strong>: The benefits of keeping a human in the loop!&nbsp;</p><p><strong>Sav</strong>:&nbsp;The mission is&nbsp;<strong>Clarity at the speed of Thought</strong>&nbsp;for a reason!&nbsp;&nbsp;</p>]]></content:encoded></item><item><title><![CDATA[Part 2: Felix Research’s Clean Data proposal]]></title><description><![CDATA[Who has the right to create or &#8220;create&#8221;?]]></description><link>https://www.thefelixview.com/p/part-2-felix-researchs-clean-data-proposal</link><guid isPermaLink="false">https://www.thefelixview.com/p/part-2-felix-researchs-clean-data-proposal</guid><dc:creator><![CDATA[Felix Research]]></dc:creator><pubDate>Mon, 15 Dec 2025 17:59:14 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/85061546-9957-49ca-ba9c-ddf1628f3bd0_1083x759.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Hms8!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98765eec-2ed5-41d3-accc-0f0a15ea55f4_1083x759.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Hms8!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98765eec-2ed5-41d3-accc-0f0a15ea55f4_1083x759.png 424w, https://substackcdn.com/image/fetch/$s_!Hms8!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98765eec-2ed5-41d3-accc-0f0a15ea55f4_1083x759.png 848w, 
https://substackcdn.com/image/fetch/$s_!Hms8!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98765eec-2ed5-41d3-accc-0f0a15ea55f4_1083x759.png 1272w, https://substackcdn.com/image/fetch/$s_!Hms8!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98765eec-2ed5-41d3-accc-0f0a15ea55f4_1083x759.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Hms8!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98765eec-2ed5-41d3-accc-0f0a15ea55f4_1083x759.png" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/98765eec-2ed5-41d3-accc-0f0a15ea55f4_1083x759.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:null,&quot;width&quot;:null,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Part 2: Felix Research&#8217;s Clean Data proposal&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Part 2: Felix Research&#8217;s Clean Data proposal" title="Part 2: Felix Research&#8217;s Clean Data proposal" srcset="https://substackcdn.com/image/fetch/$s_!Hms8!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98765eec-2ed5-41d3-accc-0f0a15ea55f4_1083x759.png 424w, https://substackcdn.com/image/fetch/$s_!Hms8!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98765eec-2ed5-41d3-accc-0f0a15ea55f4_1083x759.png 848w, 
https://substackcdn.com/image/fetch/$s_!Hms8!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98765eec-2ed5-41d3-accc-0f0a15ea55f4_1083x759.png 1272w, https://substackcdn.com/image/fetch/$s_!Hms8!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F98765eec-2ed5-41d3-accc-0f0a15ea55f4_1083x759.png 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a><p><strong>Who has the right to create or &#8220;create&#8221;? </strong>&nbsp;</p><p>My CEO sent me a quote tweet (or&nbsp;Xeet, or post, or whatever the correct&nbsp;nomenclature currently is) about Nano Banana and AI in general and told me to blog about it. The tweet:&nbsp;</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!PSBv!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f1c7f78-e749-4365-b94a-2ff8e2d9361e_358x498.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!PSBv!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f1c7f78-e749-4365-b94a-2ff8e2d9361e_358x498.png 424w, https://substackcdn.com/image/fetch/$s_!PSBv!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f1c7f78-e749-4365-b94a-2ff8e2d9361e_358x498.png 848w, https://substackcdn.com/image/fetch/$s_!PSBv!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f1c7f78-e749-4365-b94a-2ff8e2d9361e_358x498.png 1272w, 
https://substackcdn.com/image/fetch/$s_!PSBv!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f1c7f78-e749-4365-b94a-2ff8e2d9361e_358x498.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!PSBv!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f1c7f78-e749-4365-b94a-2ff8e2d9361e_358x498.png" width="358" height="498" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/1f1c7f78-e749-4365-b94a-2ff8e2d9361e_358x498.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:498,&quot;width&quot;:358,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Part 2: Felix Research&#8217;s Clean Data proposal&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Part 2: Felix Research&#8217;s Clean Data proposal" title="Part 2: Felix Research&#8217;s Clean Data proposal" srcset="https://substackcdn.com/image/fetch/$s_!PSBv!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f1c7f78-e749-4365-b94a-2ff8e2d9361e_358x498.png 424w, https://substackcdn.com/image/fetch/$s_!PSBv!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f1c7f78-e749-4365-b94a-2ff8e2d9361e_358x498.png 848w, https://substackcdn.com/image/fetch/$s_!PSBv!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f1c7f78-e749-4365-b94a-2ff8e2d9361e_358x498.png 1272w, 
https://substackcdn.com/image/fetch/$s_!PSBv!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1f1c7f78-e749-4365-b94a-2ff8e2d9361e_358x498.png 1456w" sizes="100vw"></picture><div></div></div></a></figure></div><p>This is a&nbsp;Xeet&nbsp;after&nbsp;all, so&nbsp;we&#8217;ll&nbsp;gloss over the misappropriation of the generalised &#8220;AI&#8221; label to refer solely to generative AI. We also&nbsp;won&#8217;t&nbsp;question too closely the aptness of &#8220;god complex&#8221; in this situation. 
A God Complex is a delusion of grandeur and superiority -&nbsp;though&nbsp;the link to creation is elegant.&nbsp;</p><p>We can understand the sentiment of this tweet as:&nbsp;&nbsp;&nbsp;</p><p><em>&#8220;Generative AI&#8217;s main use case is</em>&nbsp;<em>ostensibly empowering</em>&nbsp;<em>the lazy, the untalented and the unqualified to create things they have no business creating. If they had any business creating those things, they would value and invest in the journey/ slog and consequently they would become initiated via their own passion and aptitude &#8211; at which stage they would have little to no interest in using GenAI. Instead, the mediocre reap the fruit of the labour of the contributors, whose uncredited and uncompensated work trained these models and is the foundation of anything the interlopers go on to</em>&nbsp;<em><strong>&#8220;create&#8221;</strong>.&#8221;&nbsp;</em>&nbsp;</p><p>Or&nbsp;maybe I&#8217;m&nbsp;completely incorrect in my interpretation. For the purposes of this&nbsp;piece, however, we will work with the above.&nbsp;&nbsp;</p><h2><strong>What's Wrong with Using AI to "create"?</strong>&nbsp;</h2><p><strong>A) Energy usage and ESG concerns.</strong>&nbsp;These are practical issues that are&nbsp;relatively&nbsp;simple to handle - if&nbsp;the relevant parties choose&nbsp;to handle them. This will not be the focus of the&nbsp;piece. The&nbsp;more slippery issue is, by nature, harder to name, and is implied in @supermoongirl9's tweet. I will&nbsp;attempt&nbsp;to&nbsp;identify,&nbsp;name&nbsp;and&nbsp;unravel&nbsp;it&nbsp;within my breakdown of the below.&nbsp;</p><p><strong>B) Plagiarism &amp; lack of originality.</strong>&nbsp;Why is this an issue? Because of a lack of remuneration and credit for the original contributors. 
However, this is a normal (not&nbsp;<em>good</em>, but also not&nbsp;<em>unique</em>) function of capitalism.&nbsp;</p><p>The issue with centring stolen/ uncredited/ uncompensated IP in discussions about&nbsp;genAI&nbsp;use is that such a thing is a practical issue in the domain of contract law, not morality (apologies to those who think of the two as tied).&nbsp;&nbsp;&nbsp;</p><p>Consider the following situation:&nbsp;<em>You join an organisation and sign your employment contract, which means signing over your work-related IP to your employer. During your tenure, you create a revolutionary internal program that saves your company hundreds of thousands. For this,</em>&nbsp;<em>you get a promotion and raise (which is a fraction of the value you have generated). After your exit, your employer repackages and sells the program to third</em>&nbsp;<em>parties</em>&nbsp;<em>and you</em>&nbsp;<em>don&#8217;t</em>&nbsp;<em>see a penny of that sales</em>&nbsp;<em>revenue because it is not</em>&nbsp;<em>and was never</em>&nbsp;<em>your IP.</em>&nbsp;&nbsp;</p><p>If we&nbsp;are going&nbsp;to delineate the&nbsp;concept&nbsp;of&nbsp;credit, we must figure out how we value&nbsp;visibility/&nbsp;exposure. Consider&nbsp;the structure of&nbsp;author contributions&nbsp;to a scientific journal&nbsp;&#8211;&nbsp;these are&nbsp;usually&nbsp;not financially compensated. If LLMs had an&nbsp;academia-style&nbsp;inbuilt citation system that assigned credit to sources of training data, would this be a solution? 
Partly, but there seems to be an additional aspect to the general public's discomfort with "AI Creators" and I suspect it is the <strong>intangible reward that is acclaim and cultural currency.</strong></p><p>The dichotomy we are presented with is: "<em>pick a side - you either think genAI is like traditional media production but BETTER</em> (superlative of the same thing) <em>or you think it&#8217;s outright THEFT</em>".</p><p>Mightn&#8217;t there be a third option? That genAI is a different (value-neutral) kind of creation?</p><p>To draw a parallel, let&#8217;s look at driving. There is no moral implication associated with only knowing how to drive automatic (it&#8217;s important the reader knows that all three men in my office were quick to disagree with this), but it&#8217;s understood that there is a fundamental skill that manual drivers possess, which automatic drivers lack. This isn&#8217;t an issue because the machine compensates for the gap in the latter&#8217;s skillset. The question is therefore purely practical &#8211; <em><strong>are you road safe and is the end result competent driving?</strong></em></p><p>But if we were discussing Formula 1 drivers, the conversation would surely be different; namely, one to do with desert (what is deserved) and reward, rather than practical outcomes.</p><p>What I&#8217;m getting at here is that acclaim, recompense and appreciation for effort (i.e. components of credit) are muddying factors.
"<em>If you want to enter the market, don't cheat</em>" is the sentiment here.&nbsp;Don't&nbsp;try to enter the market and you can do what you like.&nbsp;&nbsp;</p><p>These&nbsp;nuances within the&nbsp;wider&nbsp;umbrella of&nbsp;credit&nbsp;bring to mind&nbsp;two parallels:&nbsp;</p><p><strong>1) The invention of the camera.</strong>&nbsp;&nbsp;</p><p>This critique boils down to the above "<em>you're cheating and unskilled</em>" argument. But&nbsp;perhaps what&nbsp;is&nbsp;emerging,&nbsp;similar to&nbsp;the advent of photography, is a new type of creation and therefore a new kind of skill?&nbsp;&nbsp;</p><p>Is the invention of&nbsp;genAI&nbsp;to&nbsp;media production what the invention of photography was to traditional art? Or is the invention of&nbsp;genAI&nbsp;to media production what piracy is to media production companies?&nbsp;Prompt engineering is a skill (spoiler:&nbsp;it's&nbsp;philosophy (more&nbsp;specifically,&nbsp;theory of mind))&nbsp;that requires&nbsp;some&nbsp;effort and development and&nbsp;perhaps&nbsp;deserves&nbsp;some credit. This,&nbsp;however, does not answer the&nbsp;issue at the heart of the&nbsp;second&nbsp;parallel...&nbsp;</p><p><strong>2)</strong>&nbsp;<em><strong>"Reality stars are (undeservedly) famous for being famous (doing</strong></em>&nbsp;<em><strong>nothing)!"</strong></em>&nbsp;&nbsp;</p><p>We can&nbsp;take&nbsp;this critique to mean&nbsp;<em>"We/ society</em>&nbsp;<em>doesn't</em>&nbsp;<em>value the "skill" which is being rewarded here"</em>. 
The response to this sentiment is arguably&nbsp;<em><strong>"So?"</strong></em>&nbsp;Our hypothetical complainant&nbsp;might level either of the following retorts:&nbsp;i) we&nbsp;<em>should</em>&nbsp;care to police that which contributes to cultural rot ii) we&nbsp;<em>should</em>&nbsp;be wary of&nbsp;creating incentives (like&nbsp;attention&nbsp;and&nbsp;resultantly&nbsp;money) to degrade culture.&nbsp;</p><p>The problem is,&nbsp;&#8220;rot&#8221; is subjective and even if we&nbsp;<em>could</em>&nbsp;reach consensus on such a thing,&nbsp;when something is&nbsp;lucrative,&nbsp;or even just profitable,&nbsp;that is&nbsp;a&nbsp;factual&nbsp;demonstration of how this given thing is valued by our society.&nbsp;</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!vxGZ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fffd9e347-221b-40a2-a860-ba3b203fde42_1179x728.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!vxGZ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fffd9e347-221b-40a2-a860-ba3b203fde42_1179x728.jpeg 424w, https://substackcdn.com/image/fetch/$s_!vxGZ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fffd9e347-221b-40a2-a860-ba3b203fde42_1179x728.jpeg 848w, https://substackcdn.com/image/fetch/$s_!vxGZ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fffd9e347-221b-40a2-a860-ba3b203fde42_1179x728.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!vxGZ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fffd9e347-221b-40a2-a860-ba3b203fde42_1179x728.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!vxGZ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fffd9e347-221b-40a2-a860-ba3b203fde42_1179x728.jpeg" width="1179" height="728" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/ffd9e347-221b-40a2-a860-ba3b203fde42_1179x728.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:728,&quot;width&quot;:1179,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Part 2: Felix Research&#8217;s Clean Data proposal&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Part 2: Felix Research&#8217;s Clean Data proposal" title="Part 2: Felix Research&#8217;s Clean Data proposal" srcset="https://substackcdn.com/image/fetch/$s_!vxGZ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fffd9e347-221b-40a2-a860-ba3b203fde42_1179x728.jpeg 424w, https://substackcdn.com/image/fetch/$s_!vxGZ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fffd9e347-221b-40a2-a860-ba3b203fde42_1179x728.jpeg 848w, https://substackcdn.com/image/fetch/$s_!vxGZ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fffd9e347-221b-40a2-a860-ba3b203fde42_1179x728.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!vxGZ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fffd9e347-221b-40a2-a860-ba3b203fde42_1179x728.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">We shouldn't permit things just because they are fun?</figcaption></figure></div><p>So how do we decide what is permissible&nbsp;and what&nbsp;isn&#8217;t?&nbsp;Maybe we&nbsp;examine&nbsp;risk and&nbsp;impact&nbsp;<em>ad hoc</em>&nbsp;and&nbsp;determine&nbsp;what our societal values are?&nbsp;Lofty and easier said than done! 
Do we posit that intellectualism is virtue whilst anti-intellectualism is vice and use this as a rubric? What does this mean for productivity tools and AI given <strong><a href="https://news.mit.edu/news-clip/forbes-785?ref=thefelixview.com">MIT's findings</a></strong>?</p><p>Enter <strong>Felix Research</strong>'s philosophy of Augmented Intelligence, wherein keeping a human in the loop leads to the prioritisation of both <a href="https://www.thefelixview.com/more-british-please-the-struggle-with-untranslatables/">clarity </a><strong>and</strong> speed.</p><h2><strong>Practicalities of Avoiding Slop &amp; FRx's Clean Data Proposal</strong></h2><p>Thus far, we have looked at AI slop as a general phenomenon. If we examine it in the context of enterprise use, AI slop is not just an irksome source of noise, but a technical challenge and a source of risk.</p><p>Existing genAI systems create an environment where, frankly, the plausibility of output is conducive to complacency in end users. The generated outputs <em>sound</em> correct, especially when the work is shallow or unfamiliar; however, using an LLM for deep research, or for an esoteric subject in which you have expertise, will quickly reveal its limitations. If an uncritical LLM user is working in a team, sharing their work and findings with others, it becomes all the more difficult to distinguish accurate output from misinformation further down the line.</p><p>The systems that generate slop are often well-engineered at their core, but poorly supervised, unexamined and unchecked.
What they lack is structure. <strong>For organisations that depend on reliable intelligence, particularly in financial contexts, this absence of rigour is untenable.</strong></p><p><strong>Felix Research</strong>&#8217;s proposal in favour of <em>clean data</em> addresses this challenge directly. We put forth the below introductory framework for keeping AI fast, trustworthy and transparent by embedding governance principles into its foundations. <strong>The simple truth is that speed is only an advantage if you can trust your outputs; when AI systems produce results that cannot be traced or validated, acceleration becomes a liability.</strong> Clean data restores balance by ensuring that every layer of automation is grounded in provenance and accountability.</p><p>"A primary concern for enterprise data governance is the lineage of information" notes Ryan Daws in his recently published <a href="https://www.artificialintelligence-news.com/news/ai-web-search-risks-mitigating-business-data-accuracy-threats/?ref=thefelixview.com">article</a> on AI and threats to business data accuracy. Every dataset carries a lineage, and knowledge of that lineage is essential to judging the reliability of any conclusion derived from it.</p><p><strong>Provenance</strong> is therefore the first pillar of clean data in the context of AI for enterprise use. Like a house of cards, everything rests on the fundamental layer; without clarity on data provenance, outputs cannot be meaningfully verified. Clean data therefore depends on disciplined data hygiene: structured ingestion, active validation and documented review.
These are long-standing practices in Business Intelligence, but they must now become rigorously enforced standards in AI.</p><p>The second pillar is <strong>Ethics-by-Design</strong>. Responsible AI cannot be achieved by policy alone; it must be built into system architecture. This means defining transparent parameters for how data is collected, processed and applied, as well as ensuring that accountability is distributed rather than deferred. In finance, where AI systems could conceivably shape regulatory reporting and investment decisions in the future, such clarity is not a moral preference but a compliance necessity.</p><p>Our third pillar is <strong>Hybrid Reasoning</strong>. By combining symbolic AI (which uses explicit rules and logic) with generative models and unsupervised components that learn and adapt, organisations can achieve both speed and interpretability. Symbolic reasoning provides a structured framework for traceability and compliance whilst generative and unsupervised systems add flexibility and contextual understanding.
Together, they create an AI stack capable of producing insight that is dynamic yet dependable.</p><h2><strong>AI Slop vs Clean Data AI Table</strong></h2><table><thead><tr><th>Dimension</th><th>AI Slop</th><th>Clean Data AI</th></tr></thead><tbody><tr><td>Output Quality</td><td>Superficially fluent, factually fragile</td><td>Transparent, traceable, and domain-specific</td></tr><tr><td>Data Integrity</td><td>Unverified, poorly sourced, feedback-looped</td><td>Curated, validated, and provenance-tracked</td></tr><tr><td>Governance Approach</td><td>Ad hoc, siloed oversight</td><td>Cross-functional governance (Compliance + BI + Engineering)</td></tr><tr><td>Human Role</td><td>Removed from the loop, human as observer</td><td>Central to the loop, human as sense-maker</td></tr><tr><td>Architecture</td><td>Purely generative, prone to hallucination</td><td>Hybrid symbolic + probabilistic reasoning &amp; RAG systems</td></tr><tr><td>Compliance Risk</td><td>High, opaque decisions and unverifiable outputs</td><td>Low, auditable data lineage and deterministic logic</td></tr><tr><td>Business Value</td><td>Volume-driven productivity illusion</td><td>Decision-ready insight and reputational resilience</td></tr></tbody></table><p>At <strong>Felix Research</strong>, we seek to create a centralised workflow environment wherein data lineage, validation and reasoning coexist seamlessly. Analysts should be able to move quickly without surrendering oversight and organisations should be free to innovate without compromising trust.</p><p>Our philosophy is not a rejection of automation but a refinement of it. We understand that innovation with lacklustre governance is unsustainable and that acceleration without clarity leads nowhere useful.
By treating data integrity as an enabler rather than a constraint, institutions can build systems that are both efficient and ethical.&nbsp;&nbsp;&nbsp;</p><p><strong>Watch this space.&nbsp;&nbsp;&nbsp;</strong></p>]]></content:encoded></item><item><title><![CDATA[More British Please: The Struggle with Untranslatables]]></title><description><![CDATA["I cannot define it, but I know it when I see it." This is something we're all distinctly familiar with, and it's becoming increasingly relevant in the age of AI.]]></description><link>https://www.thefelixview.com/p/more-british-please-the-struggle-with-untranslatables</link><guid isPermaLink="false">https://www.thefelixview.com/p/more-british-please-the-struggle-with-untranslatables</guid><dc:creator><![CDATA[Felix Research]]></dc:creator><pubDate>Fri, 05 Dec 2025 13:02:57 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/209fb25d-e429-4522-b5a0-a432884a8c59_2000x3000.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!pmmS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffb80886a-24a9-4d2c-94cc-cbcbe5cd23a2_2000x3000.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!pmmS!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffb80886a-24a9-4d2c-94cc-cbcbe5cd23a2_2000x3000.jpeg 424w, https://substackcdn.com/image/fetch/$s_!pmmS!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffb80886a-24a9-4d2c-94cc-cbcbe5cd23a2_2000x3000.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!pmmS!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffb80886a-24a9-4d2c-94cc-cbcbe5cd23a2_2000x3000.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!pmmS!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffb80886a-24a9-4d2c-94cc-cbcbe5cd23a2_2000x3000.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!pmmS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffb80886a-24a9-4d2c-94cc-cbcbe5cd23a2_2000x3000.jpeg" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/fb80886a-24a9-4d2c-94cc-cbcbe5cd23a2_2000x3000.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:null,&quot;width&quot;:null,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;More British Please: The Struggle with Untranslatables&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="More British Please: The Struggle with Untranslatables" title="More British Please: The Struggle with Untranslatables" srcset="https://substackcdn.com/image/fetch/$s_!pmmS!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffb80886a-24a9-4d2c-94cc-cbcbe5cd23a2_2000x3000.jpeg 424w, https://substackcdn.com/image/fetch/$s_!pmmS!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffb80886a-24a9-4d2c-94cc-cbcbe5cd23a2_2000x3000.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!pmmS!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffb80886a-24a9-4d2c-94cc-cbcbe5cd23a2_2000x3000.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!pmmS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ffb80886a-24a9-4d2c-94cc-cbcbe5cd23a2_2000x3000.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a><p><em>"I cannot define it, but I know it when I see it."</em>&#8239;This is something we're all distinctly familiar with, and it's becoming increasingly relevant in the age of AI. As various models approach (and often surpass) human capabilities in numerous domains, it's the&#8239;<em>differences</em>&#8239;between AI and humans, not their similarities, that draw our greatest frustration.&nbsp;</p><p>"More British please" is a prompt I find myself making constantly when using AI as a tool in my daily work. As a professional doing outreach (cold, warm, and everything in between), tone and subtlety are what I look for. Asking for investment, building rapport, getting replies, and acquiring customers all depend on unwritten textual and social rules that cannot be expressed in any concise or quantifiable manner.&nbsp;</p><p>The American-heavy nature of AI development and training data means modification via prompts is increasingly required but also increasingly difficult. The challenge lies not in the AI itself but in the user's desired yet nebulous output. I can describe what "British tone" means based on my own background and experience, but is that the same as what the AI understands? And is that the same as how everyone else interprets it?&nbsp;</p><p>This goes far beyond spelling and grammar. Much of my work has been UK-based, and through in-person meetings and phone calls, you develop a feel for what works (and more importantly, what doesn't). 
In a culture like the UK, which values subtlety generally and directness only occasionally, these nuances matter enormously.</p><h2><strong>The Hybrid Approach: Keeping Humans in the Loop</strong></h2><p>However, this struggle illuminates the benefits of a hybrid approach: Augmented Intelligence means <strong>keeping humans in the loop</strong>. Why have a human write from scratch when we can refine? The last 20% is often 80% of the value (and effort!).</p><p>I'm not suggesting all writing should be done by AI, but when time is constrained, it's better for both creator and recipient that effort is allocated to the value-add. Personalisation and tone are where humans excel. Navigating relationships is what we've evolved to do and what we're paid to do in client-facing roles. The charismatic, eloquent salesperson is in the company to spend their time driving value, not drafting approximate emails to reach as many targets as possible.&#8239;<strong>AI should enable you to focus on your strengths.</strong></p><p>At Felix Research, we understand this is true across all domains, and especially in finance where costs are high and time constraints the tightest. The analyst, partner, or senior has studied hard, worked hard and thought hard to reach a position where they add value. Why continue doing things the old way? <strong>How much of a &#163;100k analyst's salary is spent sorting and scanning PDFs?</strong></p><h2><strong>Beyond Cultural Untranslatables: Domain-Specific Meaning</strong></h2><p>But untranslatables aren't unique to cultural contexts. Consider: if I ask for "the European gas price", what does that mean? Or "a good wine to invest in"? Or "a company that could be the next GameStop"? All of these have multiple interpretations and contextual meanings.</p><p>In finance, terms often have more specific, generally agreed-upon interpretations, but these are built on agreements and assumptions.
Assumptions, however, don't always create clear or concise answers. Only someone (or something) trained to understand the industry-specific interpretation can give you a good answer. And crucially, a&#8239;<em>fast</em>&#8239;one.</p><p><strong>This is where Felix One differs from generic AI.</strong>&#8239;A standard LLM available today is good at being verbose and covering all bases, but misses the mark for the professional user. A professional needs the&#8239;<em>correct</em>&#8239;answer, and often needed it yesterday. Speed and accuracy are crucial, but so are traceability and referenced findings. After all, no one likes being asked where they found that data.</p><h2><strong>An AI Built for Finance</strong></h2><p>Take a real example: an analyst researching renewable energy exposure across European utilities. With a generic LLM, you'd spend time crafting the perfect prompt, sifting through verbose responses, then manually tracking down sources to verify claims. With Felix Research, our AI-driven financial research platform is trained specifically on financial documents and understands domain-specific terminology. Ask about renewable energy exposure, and you get precise data points with direct citations to the relevant regulatory filings, investor presentations, and annual reports.</p><p>The platform doesn't replace the analyst's judgment about which utilities represent the best investment opportunity. Instead, it eliminates the hours spent gathering and verifying basic information, allowing the analyst to focus on what they do best: analysis, pattern recognition and strategic thinking.</p><p><strong>This is how AI should work in professional contexts.
</strong>It gets you to the stage where&#8239;<em>you</em>&#8239;add value faster by finding sources, showing them clearly and being correct.&#8239;<strong>Humans are still needed, and still wanted, for making crucial financial analyses and decisions.</strong>&#8239;The &#163;100k analyst should be synthesising insights and advising clients, not wrestling with PDF searches.</p><h2><strong>The Path Forward</strong></h2><p>The future isn't about AI replacing financial professionals. It's about purpose-built AI platforms that understand finance specifically, not generically. Platforms that cite their sources. Platforms that understand what "European gas price" means in context. And most importantly, platforms that free professionals to do what only humans can: apply judgment, build relationships, and make the critical decisions that drive value.</p><p><strong>Ready to reclaim your time for high-value work?</strong></p><p><em>Written by James Hall</em></p>]]></content:encoded></item><item><title><![CDATA[Part 1: The Perils of AI Slop]]></title><description><![CDATA[Every day brings another wave of AI-generated reports, analyses and summaries, promising insight but delivering an uncanny simulacrum of some other familiar output.]]></description><link>https://www.thefelixview.com/p/part-1-the-perils-of-ai-slop-2</link><guid isPermaLink="false">https://www.thefelixview.com/p/part-1-the-perils-of-ai-slop-2</guid><dc:creator><![CDATA[Felix Research]]></dc:creator><pubDate>Tue, 18 Nov 2025 13:34:43 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/f50a395d-3f29-407e-80ed-46f38cb5b5d3_999x997.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!liG1!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F99f078d1-0002-4bde-8173-8002b1fa0c93_999x997.png" 
data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!liG1!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F99f078d1-0002-4bde-8173-8002b1fa0c93_999x997.png 424w, https://substackcdn.com/image/fetch/$s_!liG1!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F99f078d1-0002-4bde-8173-8002b1fa0c93_999x997.png 848w, https://substackcdn.com/image/fetch/$s_!liG1!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F99f078d1-0002-4bde-8173-8002b1fa0c93_999x997.png 1272w, https://substackcdn.com/image/fetch/$s_!liG1!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F99f078d1-0002-4bde-8173-8002b1fa0c93_999x997.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!liG1!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F99f078d1-0002-4bde-8173-8002b1fa0c93_999x997.png" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/99f078d1-0002-4bde-8173-8002b1fa0c93_999x997.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:null,&quot;width&quot;:null,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Part 1: The Perils of AI Slop&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Part 1: The Perils of AI Slop" title="Part 1: The Perils of AI Slop" 
srcset="https://substackcdn.com/image/fetch/$s_!liG1!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F99f078d1-0002-4bde-8173-8002b1fa0c93_999x997.png 424w, https://substackcdn.com/image/fetch/$s_!liG1!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F99f078d1-0002-4bde-8173-8002b1fa0c93_999x997.png 848w, https://substackcdn.com/image/fetch/$s_!liG1!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F99f078d1-0002-4bde-8173-8002b1fa0c93_999x997.png 1272w, https://substackcdn.com/image/fetch/$s_!liG1!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F99f078d1-0002-4bde-8173-8002b1fa0c93_999x997.png 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a><p>Every day brings another wave of AI-generated reports, analyses and summaries, promising insight but delivering an uncanny simulacrum of some other familiar output. As volume grows, discernment declines; what remains is what we may call&nbsp;<em>AI slop</em>.</p><p>This phrase captures a growing problem in modern information work: content that appears intelligent but is not. It describes the confident output (read: regurgitation) of automated systems that&nbsp;<em>imitate&nbsp;</em>understanding without demonstrating it, producing analysis that sounds convincing but lacks reasoning, evidence, or domain context (in the worst cases, the evidence will be present but fictional). In finance and research, such imitation can be dangerous. 
When language models mimic logic rather than exercise it, they risk creating a polished illusion of expertise, one that will avoid telling you it doesn't know a given thing and habitually conceals its own fallibility.</p><p>Financial analysis depends on trust and traceability. Analysts operate under constant pressure to deliver clear, timely insight from vast and shifting data sets. Automation promises relief from this strain, yet when AI tools deliver results without interpretability, they introduce a new problem. They accelerate confusion instead of clarity.</p><p>AI slop emerges when systems prioritise fluency over substance. It is partially the product of relying on probabilistic models rather than hybrid structures that can &#8220;understand&#8221; within a closed environment. The effect is easy to miss at first; the sentences read smoothly, the conclusions appear sound, but the logic that should underpin them is absent. In research and financial settings, this disconnect is more than bewildering and inconvenient; it is an issue of unnecessarily increased risk and time wasted on double-checking.</p><p>The solution is not to reject automation, but to redesign it. Progress will come from tools that enhance human thinking rather than replace it. This is the principle of Augmented Intelligence. It positions AI as a partner in reasoning rather than a substitute for it. Machines handle scale, speed and synthesis; humans provide context, interpretation and validation. Together, they form a system that is faster than the human could be alone, yet remains transparent and scrutable.</p><p>When applied to financial research, this approach creates an environment where every conclusion can be tested and every assumption examined. Analysts retain oversight of the process and automation becomes a trusted collaborator rather than a black-box magic 8 ball (i.e. opaque and mystically wise whenever it is not wrong).
Transparency and velocity do not have to be mutually exclusive; they can reinforce each other. The faster a system operates, the more important it becomes to understand how and why it reaches its results. That metadata can be used to propel learning and improvement toward long-term, sustainable gains.</p><p>The problem of AI slop underscores a simple principle: true intelligence, in humans or machines, lies in the ability to explain decisions and to adapt reasoning when faced with uncertainty. The next generation of financial AI must therefore prioritise transparency and comprehension alongside performance.</p><p>Automation will (rightly) continue to shape how research is done, but the goal is not more and more automation; it is better automation. Systems that merge human discernment with computational precision will define the next phase of workplace and financial innovation. They will replace hollow speed with meaningful acceleration, producing insight that is both rapid and robust.</p><p>The future of AI in finance will not be measured by how much data it can process, but by how clearly it can think. 
That is the purpose of Augmented Intelligence and the standard to which Felix One aspires: delivering Clarity at the Speed of Thought.</p>]]></content:encoded></item><item><title><![CDATA[From Big Data to Smart Data]]></title><description><![CDATA[Why Financial Research Needs More Than Just LLMs]]></description><link>https://www.thefelixview.com/p/from-big-data-to-smart-data</link><guid isPermaLink="false">https://www.thefelixview.com/p/from-big-data-to-smart-data</guid><dc:creator><![CDATA[Felix Research]]></dc:creator><pubDate>Thu, 09 Oct 2025 09:33:23 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/4c0c0319-96a8-4ed2-927d-a7342cad6733_2000x2000.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!TJf8!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F483b62c5-7a37-4c6a-ad84-0a2dcbae6395_2000x2000.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!TJf8!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F483b62c5-7a37-4c6a-ad84-0a2dcbae6395_2000x2000.jpeg 424w, https://substackcdn.com/image/fetch/$s_!TJf8!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F483b62c5-7a37-4c6a-ad84-0a2dcbae6395_2000x2000.jpeg 848w, https://substackcdn.com/image/fetch/$s_!TJf8!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F483b62c5-7a37-4c6a-ad84-0a2dcbae6395_2000x2000.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!TJf8!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F483b62c5-7a37-4c6a-ad84-0a2dcbae6395_2000x2000.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!TJf8!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F483b62c5-7a37-4c6a-ad84-0a2dcbae6395_2000x2000.jpeg" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/483b62c5-7a37-4c6a-ad84-0a2dcbae6395_2000x2000.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:null,&quot;width&quot;:null,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;From Big Data to Smart Data&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="From Big Data to Smart Data" title="From Big Data to Smart Data" srcset="https://substackcdn.com/image/fetch/$s_!TJf8!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F483b62c5-7a37-4c6a-ad84-0a2dcbae6395_2000x2000.jpeg 424w, https://substackcdn.com/image/fetch/$s_!TJf8!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F483b62c5-7a37-4c6a-ad84-0a2dcbae6395_2000x2000.jpeg 848w, https://substackcdn.com/image/fetch/$s_!TJf8!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F483b62c5-7a37-4c6a-ad84-0a2dcbae6395_2000x2000.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!TJf8!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F483b62c5-7a37-4c6a-ad84-0a2dcbae6395_2000x2000.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a><p><strong>Why Financial Research Needs More Than Just LLMs</strong></p><p><em>Financial research cannot rely on LLMs alone. Moving from big data to smart data means combining language models with reasoning and BI principles to deliver insights that are rigorous, transparent and decision-ready. The institutions that make this shift will be better equipped to navigate complexity and act with confidence.</em></p><p>Financial institutions are facing an unprecedented flood of information. Market data streams arrive by the second, economic indicators shift daily, client communications generate vast amounts of unstructured content and research reports accumulate faster than most teams can process. The sheer scale of this material has created an environment where simply having access to data is no longer enough. What matters is transforming it into insights that are rigorous, reliable and decision-ready.</p><p>For years, business intelligence platforms have been the workhorses of data management, helping firms organise and visualise information. Now, with the rise of artificial intelligence, large language models (LLMs) such as ChatGPT and its peers have entered the conversation. Their ability to summarise, translate and generate text at scale has understandably caught the attention of financial professionals. Yet it would be a mistake to assume that these tools alone can meet the demands of financial research.</p><p>LLMs excel at detecting surface-level patterns in language and producing fluent text, but financial analysis depends on more than prediction. It requires reasoning, validation and the ability to connect signals across structured and unstructured sources. 
When billions are on the line, plausible-sounding answers are not enough. Decision-makers need confidence that insights are supported by evidence, traceable to source data and consistent with a coherent logic.</p><p>This is where the concept of moving from big data to smart data comes in. Smart data is not just information that has been aggregated; it is information that has been contextualised, validated and prepared for action. Achieving this shift means combining the strengths of LLMs with models designed for reasoning and action, sometimes called language reasoning models (LRMs) or language action models (LAMs). These systems can enforce structure, apply rules and ensure that insights are not only well phrased but also trustworthy.</p><p>For financial research, the implications are clear:</p><ul><li><p><strong>LLMs are useful, but insufficient</strong>: they help manage volume, reduce time spent on repetitive tasks and create first drafts of analysis. But without reasoning and validation layers, their outputs risk being shallow or even misleading.</p></li><li><p><strong>Reasoning systems add rigour</strong>: by integrating BI principles and structured frameworks, LRMs and LAMs ensure that outputs can withstand scrutiny, align with compliance requirements and trace back to source evidence.</p></li><li><p><strong>Smart data enables better governance</strong>: financial institutions operate under strict regulatory oversight. Structured reasoning models help maintain auditability and transparency, which LLMs alone cannot guarantee.</p></li></ul><p>For example, imagine an analyst tasked with understanding sectoral impacts of a sudden interest rate change. An LLM might summarise recent news and research reports quickly, but a reasoning-enabled system could go further. It could link rate changes to sector models, validate findings against historical data and flag inconsistencies with current economic forecasts. 
The result is not just a narrative but a structured assessment that a risk committee can act upon with confidence.</p><p>The future of financial research lies in this integration. Big data will continue to expand, but the competitive edge will belong to firms that can consistently transform it into smart data. That means building AI strategies that do not stop at surface-level pattern recognition but instead embed structured reasoning, validation and governance.</p><p>LLMs will remain part of the toolkit, but they must be complemented by systems capable of structured thought and decision support. Financial research is not about producing more words; it is about producing better judgements. The firms that recognise this will be best positioned to navigate complexity, manage risk and identify opportunities in a market that never slows down.</p>]]></content:encoded></item><item><title><![CDATA[How AI Is Reshaping Private Equity: Trends, Risks, and Where GPs Should Place Their Bets]]></title><description><![CDATA[Artificial intelligence has moved from being a sector of investment to a practical toolkit that private equity firms are using across the deal lifecycle.]]></description><link>https://www.thefelixview.com/p/how-ai-is-reshaping-private-equity-trends-risks-and-where-gps-should-place-their-bets</link><guid isPermaLink="false">https://www.thefelixview.com/p/how-ai-is-reshaping-private-equity-trends-risks-and-where-gps-should-place-their-bets</guid><dc:creator><![CDATA[Felix Research]]></dc:creator><pubDate>Mon, 06 Oct 2025 10:45:30 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/eab2e1d8-b38d-4d64-b72a-d38472249e8b_2000x1333.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<a class="image-link image2" target="_blank" 
href="https://substackcdn.com/image/fetch/$s_!V7Ug!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9dd7d9e-b9de-4378-9402-36da3be009e4_2000x1333.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!V7Ug!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9dd7d9e-b9de-4378-9402-36da3be009e4_2000x1333.jpeg 424w, https://substackcdn.com/image/fetch/$s_!V7Ug!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9dd7d9e-b9de-4378-9402-36da3be009e4_2000x1333.jpeg 848w, https://substackcdn.com/image/fetch/$s_!V7Ug!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9dd7d9e-b9de-4378-9402-36da3be009e4_2000x1333.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!V7Ug!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9dd7d9e-b9de-4378-9402-36da3be009e4_2000x1333.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!V7Ug!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9dd7d9e-b9de-4378-9402-36da3be009e4_2000x1333.jpeg" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/e9dd7d9e-b9de-4378-9402-36da3be009e4_2000x1333.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:null,&quot;width&quot;:null,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;How AI Is Reshaping Private Equity: Trends, Risks, and Where GPs Should Place Their 
Bets&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="How AI Is Reshaping Private Equity: Trends, Risks, and Where GPs Should Place Their Bets" title="How AI Is Reshaping Private Equity: Trends, Risks, and Where GPs Should Place Their Bets" srcset="https://substackcdn.com/image/fetch/$s_!V7Ug!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9dd7d9e-b9de-4378-9402-36da3be009e4_2000x1333.jpeg 424w, https://substackcdn.com/image/fetch/$s_!V7Ug!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9dd7d9e-b9de-4378-9402-36da3be009e4_2000x1333.jpeg 848w, https://substackcdn.com/image/fetch/$s_!V7Ug!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9dd7d9e-b9de-4378-9402-36da3be009e4_2000x1333.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!V7Ug!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe9dd7d9e-b9de-4378-9402-36da3be009e4_2000x1333.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a><p><em>Artificial intelligence has moved from being a sector of investment to a practical toolkit that private equity firms are using across the deal lifecycle. From sourcing and due diligence through to value creation and exit planning, AI is influencing how funds operate and where they expect to capture value. Over the past two years adoption has accelerated. 
Limited partners are asking more questions, specialist AI funds have emerged and portfolio companies are beginning to integrate predictive and generative models into core operations. The discussion has shifted from whether AI is relevant for private equity to how quickly firms can adapt their playbooks.</em>&nbsp;</p><h2><strong>Trends Shaping Private Equity and AI</strong>&nbsp;</h2><p>One of the most visible developments is the dual role of AI. General partners continue to invest in AI-native companies such as vertical model providers, machine learning enabled SaaS and data infrastructure platforms. At the same time they are embedding AI into traditional portfolio companies to drive operational gains, margin expansion and cost efficiency.&nbsp;</p><p>Deal sourcing is also undergoing rapid change. Natural language processing and machine learning tools can now scan filings, job adverts, patents and customer signals at scale, producing a more targeted pipeline. Early adopters report faster funnel conversion and higher quality outreach.&nbsp;</p><p>Technical and data due diligence is now a standard component of the investment process. Buyers assess data quality, the robustness of existing models, and the maturity of machine learning operations. Weaknesses in these areas are increasingly treated as red flags that can affect valuation or even derail a transaction.&nbsp;</p><p>In value creation, firms are moving toward AI-first operating playbooks. Pricing optimisation, churn prediction, demand forecasting and automation of back-office processes are among the most common applications. Some firms are building centralised AI centres of excellence or bringing in operating partners with machine learning expertise to accelerate rollouts.&nbsp;</p><p>Competition is intensifying. Specialist funds focused on AI infrastructure, sector-specific applications in healthcare and fintech, and data-centric software have entered the market. 
At the same time the main constraints are shifting from capital to talent and data infrastructure. Access to clean, well-structured data and the ability to attract senior machine learning leadership often determine the pace of adoption.&nbsp;</p><p>Alongside opportunity comes heightened risk. Concerns around explainability, transparency, data privacy, bias and cybersecurity are growing, particularly in regulated sectors. Limited partners increasingly want to understand how funds manage these risks and what governance structures are in place.&nbsp;</p><p>Finally, pricing dynamics are evolving. While AI-driven improvements in revenue and margin can justify higher multiples, competition for attractive assets is fierce. Buyers without a clear operational playbook may find expected returns compressed.&nbsp;</p><h2><strong>Implications for Firms</strong>&nbsp;</h2><p>For deal making, private equity teams should expand diligence capabilities to include data scientists and engineers able to review models and assess data lineage. Execution risk must be priced realistically, with earn-outs or conditional structures tied to actual delivery of AI-driven value.&nbsp;</p><p>For value creation, firms will often see the highest return from practical use cases that are quick to implement. Automating repetitive processes or refining pricing strategies can yield results within a year. More ambitious projects requiring significant R&amp;D can be pursued selectively but should not delay initial gains.&nbsp;</p><p>Organisationally, the most effective model is often a lean centre of excellence that provides shared capabilities in data engineering, model deployment and change management. Recruiting or upskilling operating partners with relevant expertise is critical.&nbsp;</p><p>Risk management frameworks should be established early. This includes versioning of models, monitoring for drift, maintaining audit trails and preparing incident response plans. 
Ethical guidelines and transparency policies help protect reputation and build trust with stakeholders.&nbsp;</p><h2><strong>LP and GP Responses</strong>&nbsp;</h2><p>Limited partners are sharpening their questions. They want to know not just how firms intend to capture value through AI but also how risks are being managed.</p><p>General partners are responding in varied ways. Some have launched dedicated AI funds, while others are integrating AI mandates into growth or private credit strategies. Direct investments in infrastructure such as data platforms or MLOps vendors are also increasing, giving firms greater control over the ecosystem.&nbsp;</p><h2><strong>A Practical Playbook</strong>&nbsp;</h2><p><em>For General Partners&nbsp;</em></p><ul><li><p>Define a clear AI thesis that specifies whether the focus is revenue, cost or retention.&nbsp;</p></li><li><p>Run quick pilot projects that can show measurable ROI within six to twelve months.&nbsp;</p></li><li><p>Integrate AI diligence into the standard checklist and budget for remediation where data maturity is lacking.&nbsp;</p></li><li><p>Link earn-outs or incentives to actual AI-driven performance improvements.&nbsp;</p></li></ul><p>&nbsp;<em>For Portfolio Companies&nbsp;</em></p><ul><li><p>Focus first on high-impact applications such as pricing, churn reduction and procurement automation.&nbsp;</p></li><li><p>Establish strong data governance practices early, since clean data compounds in value.&nbsp;</p></li><li><p>Combine AI initiatives with process change and adoption planning to ensure real operational impact.&nbsp;</p></li></ul><p>&nbsp;<em>For Limited Partners&nbsp;</em></p><ul><li><p>Ask targeted questions about governance, technical capability and exposure to regulatory risk.&nbsp;</p></li><li><p>Assess whether the GP has sufficient access to talent and partnerships for implementation.&nbsp;</p></li><li><p>Evaluate the GP&#8217;s track record in operationalising AI across a 
portfolio.&nbsp;</p></li></ul><h2><strong>Outlook</strong>&nbsp;</h2><p>Artificial Intelligence is now central to private equity strategy. The firms that will emerge strongest are those that combine financial discipline with credible technical capacity, robust governance and repeatable operating models that can scale across a portfolio. Private equity is evolving into a discipline that integrates operational transformation with traditional buyout expertise. Success will belong to those who do not just invest in AI but who operate with it at the core of their approach.&nbsp;</p>]]></content:encoded></item><item><title><![CDATA[Technology alone is not strategy]]></title><description><![CDATA[A new MIT study shows again that AI is not a panacea for stalled revenue or lost competitiveness, as some have claimed. Paired with IBM&#8217;s recent CEO Study, it sends a clear message to the myriad stakeholders in enterprise AI: whilst generative AI has become a strategic imperative and a near-ubiquitous investment, its successful implementation remains elusive.]]></description><link>https://www.thefelixview.com/p/technology-is-not-strategy-2</link><guid isPermaLink="false">https://www.thefelixview.com/p/technology-is-not-strategy-2</guid><dc:creator><![CDATA[Felix Research]]></dc:creator><pubDate>Wed, 20 Aug 2025 14:26:55 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/6cccae64-d777-461b-ad3d-5fe73e5fd1ad_2000x1500.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!odqs!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4eaae56b-ce59-49d4-a582-b9d2e40aea11_2000x1500.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://substackcdn.com/image/fetch/$s_!odqs!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4eaae56b-ce59-49d4-a582-b9d2e40aea11_2000x1500.jpeg 424w, https://substackcdn.com/image/fetch/$s_!odqs!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4eaae56b-ce59-49d4-a582-b9d2e40aea11_2000x1500.jpeg 848w, https://substackcdn.com/image/fetch/$s_!odqs!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4eaae56b-ce59-49d4-a582-b9d2e40aea11_2000x1500.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!odqs!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4eaae56b-ce59-49d4-a582-b9d2e40aea11_2000x1500.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!odqs!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4eaae56b-ce59-49d4-a582-b9d2e40aea11_2000x1500.jpeg" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/4eaae56b-ce59-49d4-a582-b9d2e40aea11_2000x1500.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:null,&quot;width&quot;:null,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Technology alone is not strategy&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Technology alone is not strategy" title="Technology alone is not strategy" 
srcset="https://substackcdn.com/image/fetch/$s_!odqs!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4eaae56b-ce59-49d4-a582-b9d2e40aea11_2000x1500.jpeg 424w, https://substackcdn.com/image/fetch/$s_!odqs!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4eaae56b-ce59-49d4-a582-b9d2e40aea11_2000x1500.jpeg 848w, https://substackcdn.com/image/fetch/$s_!odqs!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4eaae56b-ce59-49d4-a582-b9d2e40aea11_2000x1500.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!odqs!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4eaae56b-ce59-49d4-a582-b9d2e40aea11_2000x1500.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a><p>A new MIT study shows again that AI is not a panacea for stalled revenue or lost competitiveness, as some have claimed.&nbsp;Paired with IBM&#8217;s recent CEO Study, it sends a clear message to the myriad stakeholders in enterprise AI: whilst generative AI has become a strategic imperative and a near-ubiquitous investment, its successful implementation remains elusive.</p><p><strong>The burgeoning gap between AI capability and profitable utilisation makes clear that business fundamentals remain essential; the true differentiator is how and where AI is applied.</strong></p><div class="captioned-image-container"><figure><p><a href="https://fortune.com/2025/08/18/mit-report-95-percent-generative-ai-pilots-at-companies-failing-cfo/?ref=thefelixview.com">MIT report: 95% of generative AI pilots at companies are failing</a></p><figcaption class="image-caption"><a 
href="https://fortune.com/2025/08/18/mit-report-95-percent-generative-ai-pilots-at-companies-failing-cfo/?ref=thefelixview.com">There&#8217;s a stark difference in success rates between companies that purchase AI tools from vendors and those that build them internally.</a></figcaption></figure></div><a class="image-link image2" target="_blank" href="https://fortune.com/2025/08/18/mit-report-95-percent-generative-ai-pilots-at-companies-failing-cfo/?ref=thefelixview.com" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!ULZZ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe580814d-3750-4f03-acfe-39f713e3def8_193x193.png 424w, https://substackcdn.com/image/fetch/$s_!ULZZ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe580814d-3750-4f03-acfe-39f713e3def8_193x193.png 848w, https://substackcdn.com/image/fetch/$s_!ULZZ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe580814d-3750-4f03-acfe-39f713e3def8_193x193.png 1272w, https://substackcdn.com/image/fetch/$s_!ULZZ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe580814d-3750-4f03-acfe-39f713e3def8_193x193.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!ULZZ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe580814d-3750-4f03-acfe-39f713e3def8_193x193.png" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/e580814d-3750-4f03-acfe-39f713e3def8_193x193.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:null,&quot;width&quot;:null,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Technology alone is not strategy&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:&quot;https://fortune.com/2025/08/18/mit-report-95-percent-generative-ai-pilots-at-companies-failing-cfo/?ref=thefelixview.com&quot;,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Technology alone is not strategy" title="Technology alone is not strategy" srcset="https://substackcdn.com/image/fetch/$s_!ULZZ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe580814d-3750-4f03-acfe-39f713e3def8_193x193.png 424w, https://substackcdn.com/image/fetch/$s_!ULZZ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe580814d-3750-4f03-acfe-39f713e3def8_193x193.png 848w, https://substackcdn.com/image/fetch/$s_!ULZZ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe580814d-3750-4f03-acfe-39f713e3def8_193x193.png 1272w, https://substackcdn.com/image/fetch/$s_!ULZZ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe580814d-3750-4f03-acfe-39f713e3def8_193x193.png 1456w" sizes="100vw"></picture><div></div></div></a><p><a href="https://fortune.com/2025/08/18/mit-report-95-percent-generative-ai-pilots-at-companies-failing-cfo/?ref=thefelixview.com">FortuneSheryl Estrada</a></p><a 
class="image-link image2" target="_blank" href="https://fortune.com/2025/08/18/mit-report-95-percent-generative-ai-pilots-at-companies-failing-cfo/?ref=thefelixview.com" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!M6zz!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F19facf36-15a0-4335-8acf-18e3047615d4_1200x600.jpeg 424w, https://substackcdn.com/image/fetch/$s_!M6zz!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F19facf36-15a0-4335-8acf-18e3047615d4_1200x600.jpeg 848w, https://substackcdn.com/image/fetch/$s_!M6zz!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F19facf36-15a0-4335-8acf-18e3047615d4_1200x600.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!M6zz!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F19facf36-15a0-4335-8acf-18e3047615d4_1200x600.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!M6zz!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F19facf36-15a0-4335-8acf-18e3047615d4_1200x600.jpeg" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/19facf36-15a0-4335-8acf-18e3047615d4_1200x600.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:null,&quot;width&quot;:null,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Technology alone is not 
strategy&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:&quot;https://fortune.com/2025/08/18/mit-report-95-percent-generative-ai-pilots-at-companies-failing-cfo/?ref=thefelixview.com&quot;,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Technology alone is not strategy" title="Technology alone is not strategy" srcset="https://substackcdn.com/image/fetch/$s_!M6zz!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F19facf36-15a0-4335-8acf-18e3047615d4_1200x600.jpeg 424w, https://substackcdn.com/image/fetch/$s_!M6zz!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F19facf36-15a0-4335-8acf-18e3047615d4_1200x600.jpeg 848w, https://substackcdn.com/image/fetch/$s_!M6zz!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F19facf36-15a0-4335-8acf-18e3047615d4_1200x600.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!M6zz!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F19facf36-15a0-4335-8acf-18e3047615d4_1200x600.jpeg 1456w" sizes="100vw"></picture><div></div></div></a><p>IBM&#8217;s 2025 CEO Study: <em>5 Mindshifts to Supercharge Business Growth</em>, surveying over 2,000 CEOs across 24 industries,<strong> </strong>outlines five conceptual &#8220;Mindshifts&#8221; intended to guide enterprise leaders through structural and operational change. 
<strong>Yet the most compelling insight lies not in the shifts themselves, but in the dissonance between executive belief and organisational readiness.</strong></p><div class="captioned-image-container"><figure><p><a href="https://www.ibm.com/thought-leadership/institute-business-value/en-us/c-suite-study/ceo?ref=thefelixview.com">2025 CEO Study: 5 mindshifts to supercharge business growth</a></p><figcaption class="image-caption"><a href="https://www.ibm.com/thought-leadership/institute-business-value/en-us/c-suite-study/ceo?ref=thefelixview.com">CEOs are under pressure to turn turbulence into opportunity. Activate five mindshifts to create clarity in crisis&#8212;and supercharge your organization&#8217;s growth with AI.</a></figcaption></figure></div><a class="image-link image2" target="_blank" href="https://www.ibm.com/thought-leadership/institute-business-value/en-us/c-suite-study/ceo?ref=thefelixview.com" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!GZO0!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7815204e-293f-4f7e-ae01-33d1483b5197_32x32.ico 424w, https://substackcdn.com/image/fetch/$s_!GZO0!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7815204e-293f-4f7e-ae01-33d1483b5197_32x32.ico 848w, https://substackcdn.com/image/fetch/$s_!GZO0!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7815204e-293f-4f7e-ae01-33d1483b5197_32x32.ico 1272w, https://substackcdn.com/image/fetch/$s_!GZO0!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7815204e-293f-4f7e-ae01-33d1483b5197_32x32.ico 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!GZO0!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7815204e-293f-4f7e-ae01-33d1483b5197_32x32.ico" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/7815204e-293f-4f7e-ae01-33d1483b5197_32x32.ico&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:null,&quot;width&quot;:null,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Technology alone is not strategy&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:&quot;https://www.ibm.com/thought-leadership/institute-business-value/en-us/c-suite-study/ceo?ref=thefelixview.com&quot;,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Technology alone is not strategy" title="Technology alone is not strategy" srcset="https://substackcdn.com/image/fetch/$s_!GZO0!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7815204e-293f-4f7e-ae01-33d1483b5197_32x32.ico 424w, https://substackcdn.com/image/fetch/$s_!GZO0!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7815204e-293f-4f7e-ae01-33d1483b5197_32x32.ico 848w, https://substackcdn.com/image/fetch/$s_!GZO0!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7815204e-293f-4f7e-ae01-33d1483b5197_32x32.ico 1272w, https://substackcdn.com/image/fetch/$s_!GZO0!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7815204e-293f-4f7e-ae01-33d1483b5197_32x32.ico 1456w" sizes="100vw" 
loading="lazy"></picture><div></div></div></a><p><a href="https://www.ibm.com/thought-leadership/institute-business-value/en-us/c-suite-study/ceo?ref=thefelixview.com">IBMTom Hogan</a></p><a class="image-link image2" target="_blank" href="https://www.ibm.com/thought-leadership/institute-business-value/en-us/c-suite-study/ceo?ref=thefelixview.com" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!egkO!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f195aeb-b34b-4e74-92c1-f0d32ca7be0b_1456x728.png 424w, https://substackcdn.com/image/fetch/$s_!egkO!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f195aeb-b34b-4e74-92c1-f0d32ca7be0b_1456x728.png 848w, https://substackcdn.com/image/fetch/$s_!egkO!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f195aeb-b34b-4e74-92c1-f0d32ca7be0b_1456x728.png 1272w, https://substackcdn.com/image/fetch/$s_!egkO!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f195aeb-b34b-4e74-92c1-f0d32ca7be0b_1456x728.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!egkO!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f195aeb-b34b-4e74-92c1-f0d32ca7be0b_1456x728.png" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/7f195aeb-b34b-4e74-92c1-f0d32ca7be0b_1456x728.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:null,&quot;width&quot;:null,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Technology alone is not 
strategy&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:&quot;https://www.ibm.com/thought-leadership/institute-business-value/en-us/c-suite-study/ceo?ref=thefelixview.com&quot;,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Technology alone is not strategy" title="Technology alone is not strategy" srcset="https://substackcdn.com/image/fetch/$s_!egkO!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f195aeb-b34b-4e74-92c1-f0d32ca7be0b_1456x728.png 424w, https://substackcdn.com/image/fetch/$s_!egkO!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f195aeb-b34b-4e74-92c1-f0d32ca7be0b_1456x728.png 848w, https://substackcdn.com/image/fetch/$s_!egkO!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f195aeb-b34b-4e74-92c1-f0d32ca7be0b_1456x728.png 1272w, https://substackcdn.com/image/fetch/$s_!egkO!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f195aeb-b34b-4e74-92c1-f0d32ca7be0b_1456x728.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a><p>Unsurprisingly, 68% of CEOs believe that generative AI is fundamentally transforming their business models. 61% agree that competitive advantage will increasingly depend on their ability to adopt and scale it. However, 25% of AI initiatives have not delivered their desired results. 
The issue is not willingness to implement but the use cases and methods actually being adopted.</p><p>At <strong>Felix Research</strong>, we interpret this divergence as validation of our central philosophy: the future of enterprise AI lies not in general-purpose automation, but in domain-specific, human-centric Augmented Intelligence applications. The difference is nontrivial. Most Large Language Models currently deployed in enterprise settings are designed for maximum generality. The demand for purpose-built tools that champion human-in-the-loop principles remains neglected in the market.</p><p>The IBM report&#8217;s fourth Mindshift, <em>Ignore FOMO, lean into ROI</em>, speaks directly to this challenge. The data confirms that early adopters who rushed into AI pilots often did so without clear KPIs, governance, or integration pathways. Felix Research believes that the future of AI implementation lies with platforms that let humans focus on high-value, critical tasks while AI handles routine functions. In turn, high-value research is generated, and ROI is clearly measurable.</p><p>The third Mindshift, <em>Cultivate a vibrant data environment</em>, is another area where the gap between vision and execution remains wide. Although 72% of CEOs consider integrated data architecture essential for AI success, workable approaches to achieving this shift are few and far between.</p><p>Perhaps the most important theme emerging from IBM&#8217;s report is that trust remains a structural constraint on AI adoption. Though not framed as a discrete Mindshift, it permeates each of the five. In high-stakes, research-rich fields, model transparency, auditability, and explainability are not optional; they are the bedrock of ethics-by-design and, consequently, of sourcing, trust, and functionality. There is little use for work you cannot trust.</p><p>IBM&#8217;s final Mindshift, <em>Borrow the talent you can&#8217;t buy</em>, may be the most telling.
In a landscape where speed-to-deployment and domain expertise are both required, external partnerships are not a liability; they are a strategic advantage. There is a dearth of institution-grade tools developed in close collaboration with project-based professionals and tested against the real workflows they aim to support. Felix Research encourages you to watch this space.</p><p>IBM&#8217;s 2025 CEO Study provides a useful framework for assessing the future of AI adoption. But the study&#8217;s most important implication may be that generative AI alone will not produce transformation.</p><p>Only well-scoped, governed, and trusted AI systems, purpose-built for specific domains, will do so.</p><p><strong>Felix One</strong> is that system.</p>]]></content:encoded></item><item><title><![CDATA[The Original Vision of Augmented Intelligence: What the “Mother of All Demos” Got Right]]></title><description><![CDATA[The idea that technology should amplify, not substitute, human intelligence isn't new.]]></description><link>https://www.thefelixview.com/p/the-original-vision-of-augmented</link><guid isPermaLink="false">https://www.thefelixview.com/p/the-original-vision-of-augmented</guid><dc:creator><![CDATA[Felix Research]]></dc:creator><pubDate>Thu, 19 Jun 2025 15:14:49 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/a3628047-7d56-4352-a566-78a43d529361_2000x1329.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!0JDc!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F52ec3e57-3f0d-4bee-a622-815b88b8f027_2000x1329.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://substackcdn.com/image/fetch/$s_!0JDc!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F52ec3e57-3f0d-4bee-a622-815b88b8f027_2000x1329.png 424w, https://substackcdn.com/image/fetch/$s_!0JDc!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F52ec3e57-3f0d-4bee-a622-815b88b8f027_2000x1329.png 848w, https://substackcdn.com/image/fetch/$s_!0JDc!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F52ec3e57-3f0d-4bee-a622-815b88b8f027_2000x1329.png 1272w, https://substackcdn.com/image/fetch/$s_!0JDc!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F52ec3e57-3f0d-4bee-a622-815b88b8f027_2000x1329.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!0JDc!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F52ec3e57-3f0d-4bee-a622-815b88b8f027_2000x1329.png" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/52ec3e57-3f0d-4bee-a622-815b88b8f027_2000x1329.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:null,&quot;width&quot;:null,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;The Original Vision of Augmented Intelligence: What the &#8220;Mother of All Demos&#8221; Got Right&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="The Original Vision of Augmented Intelligence: What the &#8220;Mother of All Demos&#8221; Got 
Right" title="The Original Vision of Augmented Intelligence: What the &#8220;Mother of All Demos&#8221; Got Right" srcset="https://substackcdn.com/image/fetch/$s_!0JDc!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F52ec3e57-3f0d-4bee-a622-815b88b8f027_2000x1329.png 424w, https://substackcdn.com/image/fetch/$s_!0JDc!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F52ec3e57-3f0d-4bee-a622-815b88b8f027_2000x1329.png 848w, https://substackcdn.com/image/fetch/$s_!0JDc!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F52ec3e57-3f0d-4bee-a622-815b88b8f027_2000x1329.png 1272w, https://substackcdn.com/image/fetch/$s_!0JDc!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F52ec3e57-3f0d-4bee-a622-815b88b8f027_2000x1329.png 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a><p>The idea that technology should amplify, not substitute, human intelligence isn't new.</p><p>It was first proposed in 1960 by American psychologist and computer scientist <strong>J.C.R. Licklider</strong> in his seminal paper <em>&#8220;Man-Computer Symbiosis&#8221;</em>. Licklider envisioned a future in which humans and machines would form close working partnerships, with computers handling the mechanical tasks and humans focusing on creative, strategic thinking.</p><p>That vision came to life eight years later in what&#8217;s now called the <strong>&#8220;Mother of All Demos&#8221;</strong>. In a single 90-minute presentation, computing pioneer <strong>Douglas Engelbart</strong> introduced an astonished audience to the mouse, hypertext, graphical interfaces, real-time document collaboration, and even early video conferencing. 
But the real breakthrough wasn&#8217;t the technology; it was the mindset.</p><p>Engelbart&#8217;s purpose wasn&#8217;t to replace professionals. It was to augment them. He believed computers could help humans think better, work faster, and collaborate more effectively. It was a direct continuation of Licklider&#8217;s symbiosis: practical, powerful, and crucially human.</p><p>Today, as financial institutions integrate AI into research, analysis, and compliance, that original vision is more relevant than ever. We shouldn&#8217;t be asking whether machines can take over entire roles, but how they can remove the friction from work and free up professionals to do what they do best.</p><p><strong>Augmented intelligence is not a new idea. But it's now here.</strong> And those who embrace it won&#8217;t just be more efficient. They&#8217;ll be smarter, faster, and ultimately, more human in the way they work.</p>]]></content:encoded></item><item><title><![CDATA[Augmented Intelligence: Why the Smartest Professionals Are Partnering with AI]]></title><description><![CDATA[The term &#8220;Augmented Intelligence&#8221; may sound like just another tech buzzword, but its origins speak to something deeper: a belief that AI should amplify human capabilities, not replace them.]]></description><link>https://www.thefelixview.com/p/augmented-intelligence-why-the-smartest</link><guid isPermaLink="false">https://www.thefelixview.com/p/augmented-intelligence-why-the-smartest</guid><dc:creator><![CDATA[Felix Research]]></dc:creator><pubDate>Thu, 19 Jun 2025 11:17:38 GMT</pubDate><enclosure url="https://substack-post-media.s3.amazonaws.com/public/images/1290f5fa-a67a-460b-9a9e-86453167ec82_2000x3000.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<a class="image-link image2" target="_blank" 
href="https://substackcdn.com/image/fetch/$s_!RqKS!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6ab7530c-9909-42f2-8927-caf3e57fa679_2000x3000.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!RqKS!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6ab7530c-9909-42f2-8927-caf3e57fa679_2000x3000.jpeg 424w, https://substackcdn.com/image/fetch/$s_!RqKS!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6ab7530c-9909-42f2-8927-caf3e57fa679_2000x3000.jpeg 848w, https://substackcdn.com/image/fetch/$s_!RqKS!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6ab7530c-9909-42f2-8927-caf3e57fa679_2000x3000.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!RqKS!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6ab7530c-9909-42f2-8927-caf3e57fa679_2000x3000.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!RqKS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6ab7530c-9909-42f2-8927-caf3e57fa679_2000x3000.jpeg" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6ab7530c-9909-42f2-8927-caf3e57fa679_2000x3000.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:null,&quot;width&quot;:null,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Augmented Intelligence: Why the Smartest Professionals Are Partnering with 
AI&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Augmented Intelligence: Why the Smartest Professionals Are Partnering with AI" title="Augmented Intelligence: Why the Smartest Professionals Are Partnering with AI" srcset="https://substackcdn.com/image/fetch/$s_!RqKS!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6ab7530c-9909-42f2-8927-caf3e57fa679_2000x3000.jpeg 424w, https://substackcdn.com/image/fetch/$s_!RqKS!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6ab7530c-9909-42f2-8927-caf3e57fa679_2000x3000.jpeg 848w, https://substackcdn.com/image/fetch/$s_!RqKS!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6ab7530c-9909-42f2-8927-caf3e57fa679_2000x3000.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!RqKS!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6ab7530c-9909-42f2-8927-caf3e57fa679_2000x3000.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a><p>The term &#8220;Augmented Intelligence&#8221; may sound like just another tech buzzword, but its origins speak to something deeper: a belief that AI should amplify human capabilities, not replace them. The concept is about using AI to support and enhance human expertise. 
It's not about handing over control, but about making professionals more effective at what they already do best.</p><p>Let&#8217;s be honest: resisting AI is no longer a viable strategy.</p><p>The institutions that ignore or fear it are setting themselves up to be outpaced, outsmarted, and ultimately left behind. But the opposite extreme, treating AI as a complete replacement for human professionals, is both technically flawed and strategically short-sighted.</p><p>Even the best models cannot yet fully replicate human judgment, insight, or instinct.</p><p><strong>Man and machine: better than either alone</strong></p><p>Imagine an analyst who no longer spends hours compiling fragmented data, manually formatting reports, or trawling PDFs. Instead, AI takes care of the heavy lifting by surfacing insights, automating structure, and connecting dots. What&#8217;s left? More time for strategy, sharper insights, and faster execution. That is the power of augmented intelligence.</p><p>In short: AI shouldn&#8217;t replace the analyst. It should unleash them. The future of work is not artificial; it&#8217;s Augmented.</p>]]></content:encoded></item></channel></rss>