<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[THE BLUEPRINT: 🤔 Opinion]]></title><description><![CDATA[Sharing and shouting about important topics and concepts in Generative AI.]]></description><link>https://www.the-blueprint.ai/s/opinion</link><image><url>https://substackcdn.com/image/fetch/$s_!wlTj!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fabc7bfcb-0b73-4b27-9ab1-89571d54a07d_1024x1024.png</url><title>THE BLUEPRINT: 🤔 Opinion</title><link>https://www.the-blueprint.ai/s/opinion</link></image><generator>Substack</generator><lastBuildDate>Fri, 01 May 2026 04:58:38 GMT</lastBuildDate><atom:link href="https://www.the-blueprint.ai/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Sean Betts]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[theaiblueprint@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[theaiblueprint@substack.com]]></itunes:email><itunes:name><![CDATA[Sean 🤓]]></itunes:name></itunes:owner><itunes:author><![CDATA[Sean 🤓]]></itunes:author><googleplay:owner><![CDATA[theaiblueprint@substack.com]]></googleplay:owner><googleplay:email><![CDATA[theaiblueprint@substack.com]]></googleplay:email><googleplay:author><![CDATA[Sean 🤓]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[The Problem with Sell-Side Agents]]></title><description><![CDATA[There is a line between helpful agents and dangerous ones]]></description><link>https://www.the-blueprint.ai/p/the-problem-with-sell-side-agents</link><guid 
isPermaLink="false">https://www.the-blueprint.ai/p/the-problem-with-sell-side-agents</guid><dc:creator><![CDATA[Sean 🤓]]></dc:creator><pubDate>Tue, 07 Apr 2026 09:02:04 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!ctzy!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff0b3e36b-10b4-4395-a584-03ed08819359_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!ctzy!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff0b3e36b-10b4-4395-a584-03ed08819359_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!ctzy!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff0b3e36b-10b4-4395-a584-03ed08819359_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!ctzy!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff0b3e36b-10b4-4395-a584-03ed08819359_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!ctzy!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff0b3e36b-10b4-4395-a584-03ed08819359_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!ctzy!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff0b3e36b-10b4-4395-a584-03ed08819359_1536x1024.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!ctzy!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff0b3e36b-10b4-4395-a584-03ed08819359_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f0b3e36b-10b4-4395-a584-03ed08819359_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1598912,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.the-blueprint.ai/i/192941405?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff0b3e36b-10b4-4395-a584-03ed08819359_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!ctzy!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff0b3e36b-10b4-4395-a584-03ed08819359_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!ctzy!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff0b3e36b-10b4-4395-a584-03ed08819359_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!ctzy!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff0b3e36b-10b4-4395-a584-03ed08819359_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!ctzy!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff0b3e36b-10b4-4395-a584-03ed08819359_1536x1024.png 1456w" 
sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>Advertising is entering an agentic phase, where AI agents can negotiate, transact, and increasingly attempt to execute campaigns across the supply chain. Standards bodies are actively building this future.
IAB Tech Lab is mapping existing ad standards into &#8220;agentic&#8221; extensions and creating an <a href="https://iabtechlab.com/introducing-the-iab-tech-lab-agent-registry/">Agent Registry</a> intended to improve trust and transparency.</p><p>Meanwhile, <a href="https://adcontextprotocol.org">Ad Context Protocol</a> (AdCP) is trying to standardise how AI agents talk to ad platforms, so the same agent can discover inventory and place or update media buys across different sellers without needing a bespoke integration for each one.</p><p>This all sounds sophisticated and exciting, especially if you are trying to drive more efficiency into advertising operations, but there&#8217;s one glaring problem.</p><p>AI agents are probabilistic by design. A language model is literally a probability distribution over sequences of tokens, and it generates outputs by selecting what comes next based on those probabilities. That means you cannot guarantee exactly what it will do, or that it will do it the same way every time.</p><p>That is fine for low-risk tasks like reporting and insights. It is not fine for high-risk tasks like placing a buy, changing budgets, or optimising a live campaign.</p><p>My argument is simple. Sell-side agents should support reporting, insights, and forecasting. But they should not support executing or optimising media campaigns directly.</p><h2><strong>The only distinction that matters</strong></h2><p>Most of the excitement around agentic advertising is about efficiency. Fewer manual steps. Faster optimisation. Less operational drag. And on paper, it sounds inevitable: if AI agents can negotiate and transact, why wouldn&#8217;t they also execute?</p><p>But there is a line here that I don&#8217;t think should be crossed, at least for large clients or agencies managing large budgets and high volumes of campaigns. An AI agent that recommends is one thing. An AI agent that can change a live campaign is something else entirely.
The moment an AI agent can place a buy, reallocate budgets, alter targeting, or change bidding strategies, it stops being a helper and starts making decisions that directly shape delivery.</p><p>This matters because campaigns are not just settings. They are the execution of a strategy that has been crafted, signed off, and often contractually committed to. Once an AI agent can change a live campaign, you are no longer executing the plan you agreed. You are continuously rewriting it, and accountability gets blurred. And when accountability gets blurred, trust breaks.</p><h2><strong>Where the risk actually lives</strong></h2><p>The risk is not that AI will sometimes be wrong. Everyone knows that. The real risk is that the industry makes it easy for AI agents to take actions that materially change delivery, at speed, across lots of campaigns, and across lots of platforms.</p><p>That is exactly what the emerging standards are trying to enable. IAB Tech Lab&#8217;s agentic roadmap is about scaling agentic execution by extending existing standards with modern protocols, so systems can coordinate and transact more efficiently without rebuilding everything from scratch.</p><p>But once you accept that direction, you also have to accept what comes with it. If an agent can place or update buys, it can also make the wrong change, at the wrong time, in the wrong place, and do it repeatedly. This is about edge cases, and the fact that generative systems can be unreliable or inconsistent under different conditions. </p><p>When you are running big budgets at scale, small inconsistencies become real money, real brand risk, real governance problems, and real liability issues.</p><p>The solution is not to slow down the standards. Standardisation is helpful because it makes workflows more consistent and, in principle, easier to govern. The question is where you draw the execution boundary.</p><p>A practical line for large advertisers is this. 
Let sell-side agents help you understand and plan: insights, recommendations, campaign setup, and performance reporting. Let them propose changes. But do not let sell-side agents put campaigns live or directly change them once they are live.</p><p>Instead, campaign execution must remain buy-side controlled. The buy-side owns the plan and the budgets, and holds the liability if things go wrong.</p><h2><strong>The rule set we need</strong></h2><p>None of this is an argument against agentic advertising. It&#8217;s an argument for being explicit about where agents should sit in the operating model.</p><p>Standards like the IAB Tech Lab&#8217;s agentic roadmap are useful because they move us towards common workflows and shared language. They reduce integration friction and make it easier for systems to coordinate. That is good for the whole ecosystem.</p><p>But as we make campaign operations easier to automate, we also need to make governance easier to enforce. The moment a probabilistic system can change what is live, the question becomes &#8220;who is accountable?&#8221;</p><p>So I think the industry needs a simple set of rules.</p><p>Sell-side agents can advise, propose, and explain. They can help with planning, forecasting, and setup. They can surface opportunities and risks. But they should not be allowed to activate, change, or optimise live campaigns.</p><p>Buy-side agents can automate execution, but only through deterministic interfaces with clear parameters, human validation before changes go live, and audit logs that make every action attributable.</p><p>If we get that boundary right, we get the upside of agentic workflows without creating a gap where the buyer carries the liability but loses control.
And if we get it wrong, we will normalise a world where campaign delivery can be altered by systems the buyer does not govern, and then we&#8217;ll spend the next few years trying to bolt accountability back on after the fact.</p><div><hr></div><blockquote><p><em>&#8220;The future is already here, it&#8217;s just not evenly distributed.&#8221;</em></p><p><strong>William Gibson</strong></p></blockquote>]]></content:encoded></item><item><title><![CDATA[Advertising Inside the Model]]></title><description><![CDATA[Old logic doesn&#8217;t work in a new medium.]]></description><link>https://www.the-blueprint.ai/p/advertising-inside-the-model</link><guid isPermaLink="false">https://www.the-blueprint.ai/p/advertising-inside-the-model</guid><dc:creator><![CDATA[Sean 🤓]]></dc:creator><pubDate>Wed, 03 Dec 2025 16:43:07 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!yOUZ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01345a7a-86f3-43fe-b7a5-4288d7575a83_1920x1200.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!yOUZ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01345a7a-86f3-43fe-b7a5-4288d7575a83_1920x1200.webp" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!yOUZ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01345a7a-86f3-43fe-b7a5-4288d7575a83_1920x1200.webp 424w, https://substackcdn.com/image/fetch/$s_!yOUZ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01345a7a-86f3-43fe-b7a5-4288d7575a83_1920x1200.webp 848w,
https://substackcdn.com/image/fetch/$s_!yOUZ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01345a7a-86f3-43fe-b7a5-4288d7575a83_1920x1200.webp 1272w, https://substackcdn.com/image/fetch/$s_!yOUZ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01345a7a-86f3-43fe-b7a5-4288d7575a83_1920x1200.webp 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!yOUZ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01345a7a-86f3-43fe-b7a5-4288d7575a83_1920x1200.webp" width="1456" height="910" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/01345a7a-86f3-43fe-b7a5-4288d7575a83_1920x1200.webp&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:910,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Perplexity AI to launch Ads in Q4&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Perplexity AI to launch Ads in Q4" title="Perplexity AI to launch Ads in Q4" srcset="https://substackcdn.com/image/fetch/$s_!yOUZ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01345a7a-86f3-43fe-b7a5-4288d7575a83_1920x1200.webp 424w, https://substackcdn.com/image/fetch/$s_!yOUZ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01345a7a-86f3-43fe-b7a5-4288d7575a83_1920x1200.webp 848w, 
https://substackcdn.com/image/fetch/$s_!yOUZ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01345a7a-86f3-43fe-b7a5-4288d7575a83_1920x1200.webp 1272w, https://substackcdn.com/image/fetch/$s_!yOUZ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F01345a7a-86f3-43fe-b7a5-4288d7575a83_1920x1200.webp 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>AI platforms are under increasing pressure to find a sustainable revenue model.
Around 90-95% of ChatGPT users do not pay for the product. Google is urgently searching for a successor to its search advertising revenue. Perplexity and other emerging players need a dependable commercial engine to keep funding their growth.</p><p>Every AI platform is now starting to look towards advertising as the answer. It is a tried and trusted value exchange that has powered the internet for more than twenty-five years: free services in return for attention. For decades it has created a stable equilibrium between user experience and commercial pressure. Now every AI platform is attempting to recreate that playbook inside an environment that works nothing like the open web.</p><h2>Old Logic, New Medium</h2><p>The problem is that AI platforms are trying to apply the same old thinking to a completely new medium. They are looking to wedge ads into conversational interfaces in the same way we once wedged ads into webpages and search results. The formats look familiar. The logic looks familiar. The compromises look familiar. And if they continue down this path, the same relevancy issues that have plagued digital advertising for two decades will reappear inside AI platforms.</p><p>At their core, these relevancy issues have always stemmed from the same pattern. Ads are shown because they can be shown, not because they should be shown. Despite the advertising industry&#8217;s best efforts, we rely on loose proxies for intent, imperfect targeting data, and bidding systems that reward the highest payer rather than the most suitable option. This means consumers are routinely interrupted with messages that have nothing to do with what they want, where they are, or what they are trying to achieve in that moment.</p><p>Put simply, AI platforms are in danger of treating advertising as a UI problem rather than a reasoning problem. It&#8217;s a strategic failure in the making.
They are sitting on something the advertising industry has never had before: an intelligence that genuinely understands what a user is trying to do. Not a keyword. Not a demographic bucket. Not a retargeting signal scraped from another tab. A real-time, contextual interpretation of intent. Yet instead of using this intelligence to fix advertising, they are just trying to put display ads inside an assistant. This is incredibly one-dimensional thinking.</p><h2>Let The Model Think About The Ad</h2><p>The opportunity AI platforms are missing is to let their models think about the advertising. Allow the model to evaluate it in the same way it evaluates any other piece of information. By introducing paid advertising opportunities into the reasoning loop, AI models can judge whether each one is relevant to the task, the context, and the user&#8217;s stated or implied intent. This idea works because of what large language models already do - they evaluate information, filter options, and assemble an answer.</p><p>Paid advertising opportunities would not look like ads in the traditional sense. They would be structured inputs, including product information, visual assets, messaging, and a URL. This turns advertising into a format the model can reason with so it can decide if it&#8217;s relevant, doing everything the advertising industry has spent twenty years trying to do manually.</p><p>The flow should be simple: advertisers provide structured inputs. The model evaluates them. Irrelevant options drop out. Relevant ones surface with full disclosure.</p><p>To be clear, paid advertising opportunities would exist for one purpose only: to be evaluated by the model for relevance. The model decides whether the advertising content informs the answer. If it doesn&#8217;t, the user never sees it. If it does, it appears in the output with full disclosure and a clear citation.
This mirrors how AI platforms cite sources in search-style answers today - the mechanism already exists and can simply be extended to paid content.</p><p>This approach requires a simple, transparent rule set. Any paid influence must be cited. If a sponsored option contributes to the output, the user sees that immediately. If a sponsored option is considered but rejected, the platform can log that privately so advertisers can understand why and improve future ads.</p><p>Most importantly, the integrity of the AI platform must be non-negotiable. Putting ads on the surface invites scepticism. Putting ads into the model&#8217;s reasoning, with full disclosure, invites relevance. One is a dark pattern. The other is a design pattern.</p><h2>A Healthier System for Everyone</h2><p>Advertising inside the model flips the dynamics of advertising completely. Brands would not win just because they outbid competitors. They would win because their messaging genuinely matched the user&#8217;s intent as interpreted by the model. If AI platforms move advertising into the reasoning layer, they unlock something the industry has never had. True contextual relevance at the moment of intent. Ads that are additive. Ads that are filtered by intelligence rather than interrupted by design. For the first time, advertising would be additive rather than interruptive, with the additional benefit of not adding unwanted clutter to the user experience that could break flow, annoy consumers, and erode trust. This is a healthier system for everyone involved.</p><p>This also creates a new commercial opportunity. A cost per thought when the model evaluates a paid advertising opportunity. A cost per impression when that option is surfaced. A cost per click if the user takes action.
It&#8217;s a clean hierarchy that mirrors the actual funnel of the AI model&#8217;s reasoning. It will give advertisers flexibility, allow them to control budgets around different stages of decision making, and align incentives around quality instead of intrusion.</p><p>If AI platforms get this right, they can build a new kind of advertising system. One built on intelligence, context and intent. One that rewards relevance as a first principle. One that gives users better answers and gives brands better opportunities.</p><p>The future of advertising will not be won by whoever finds the cleverest place to stick an ad in a chatbot. It will be won by whoever lets their model put the most relevant content in front of consumers.</p><div><hr></div><blockquote><p><em>&#8220;The future is already here, it&#8217;s just not evenly distributed.&#8221;</em></p><p><strong>William Gibson</strong></p></blockquote>]]></content:encoded></item><item><title><![CDATA[Clicks Are Dead]]></title><description><![CDATA[The Practical Guide to Marketing in Web 4.0]]></description><link>https://www.the-blueprint.ai/p/clicks-are-dead</link><guid isPermaLink="false">https://www.the-blueprint.ai/p/clicks-are-dead</guid><dc:creator><![CDATA[Sean 🤓]]></dc:creator><pubDate>Thu, 24 Jul 2025 09:14:35 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!tW7Z!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdee0732e-9a0e-4302-8c28-9e320723455c_1536x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!tW7Z!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdee0732e-9a0e-4302-8c28-9e320723455c_1536x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp"
srcset="https://substackcdn.com/image/fetch/$s_!tW7Z!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdee0732e-9a0e-4302-8c28-9e320723455c_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!tW7Z!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdee0732e-9a0e-4302-8c28-9e320723455c_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!tW7Z!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdee0732e-9a0e-4302-8c28-9e320723455c_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!tW7Z!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdee0732e-9a0e-4302-8c28-9e320723455c_1536x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!tW7Z!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdee0732e-9a0e-4302-8c28-9e320723455c_1536x1024.png" width="1456" height="971" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/dee0732e-9a0e-4302-8c28-9e320723455c_1536x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:971,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2320860,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.the-blueprint.ai/i/169121592?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdee0732e-9a0e-4302-8c28-9e320723455c_1536x1024.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" 
class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!tW7Z!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdee0732e-9a0e-4302-8c28-9e320723455c_1536x1024.png 424w, https://substackcdn.com/image/fetch/$s_!tW7Z!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdee0732e-9a0e-4302-8c28-9e320723455c_1536x1024.png 848w, https://substackcdn.com/image/fetch/$s_!tW7Z!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdee0732e-9a0e-4302-8c28-9e320723455c_1536x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!tW7Z!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fdee0732e-9a0e-4302-8c28-9e320723455c_1536x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>For over two decades, the internet has operated on a simple economic model: (mostly) "free" content in exchange for attention and ad revenue. This system runs on the trio of searches, clicks, and website visits. Page views, click-through rates, and purchase funnels have been the metrics that mattered, powering a digital economy that today is worth hundreds of billions of dollars.</p><p>But this system is starting to break down. We're seeing the emergence of Web 4.0 - a fundamental shift from a human-centric to an AI-operated internet. Where previous evolutions of the web changed how we interacted online, Web 4.0 introduces a new dimension to our digital ecosystem: AI platforms, and increasingly agents, that can understand, decide, and act on our behalf.</p><p>The evidence of this transition has been mounting rapidly in 2025. Apple's Eddy Cue <a href="https://fortune.com/2025/05/08/apple-eddy-cue-testimony-google-alphabet-safari-ai-search-features/">testified that Safari searches dropped for the first time in 22 years</a>, with users turning to ChatGPT and Perplexity instead of traditional search. Google's own internal documents show the technology giant acknowledging that <a href="https://searchengineland.com/google-search-traffic-decline-inevitable-455345">the decline of searching is "inevitable"</a>. Meanwhile, <a href="https://techcrunch.com/2025/06/10/googles-ai-overviews-are-killing-traffic-for-publishers/">publishers are reporting dramatic traffic declines</a> as Google's AI Overviews provide answers without requiring site visits.
This transition is starting to gain pace, as the early majority adopts AI-powered search tools at unprecedented rates.</p><h2>The Three Pillars of Zero-Click Disruption</h2><p>The transition to Web 4.0 is starting to play out across the entire purchase funnel, impacting every consumer online touchpoint by removing the clicks that underpin and fund the modern web. Zero-click search eliminates the need to visit websites for information, as AI now provides answers directly. Zero-click browsing removes online navigation entirely, with AI now visiting websites autonomously on users' behalf. Zero-click purchasing will eliminate the purchase funnel entirely, with AI able to research, compare, and purchase products autonomously.</p><p>Together, these three pillars represent the systematic dismantling of the click-based economy that built the internet as we know it.</p><p><strong>Zero-click search</strong> is already here and its growth is accelerating. <a href="https://searchengineland.com/google-search-zero-click-study-2024-443869">Nearly 60% of Google searches now end without a click</a>. For news searches specifically, that number jumps to 69%, up from 56% just a few months earlier. The impact on publishers, a canary in the coal mine, is already devastating. The New York Times, for example, has watched its search traffic drop from 44% to 36.5% of total traffic as AI provides answers without requiring site visits.</p><p><strong>Zero-click browsing</strong> is also just starting to emerge. The dual launch of Perplexity's Comet browser and OpenAI's ChatGPT Agent over the last two weeks has introduced AI platforms that are capable of navigating websites on users' behalf, eliminating the need for traditional browsing patterns entirely.
Unlike zero-click search, which provides answers from AI training data, these AI platforms visit websites autonomously, stripping away the tracking parameters and user behaviour data that power modern web analytics.</p><p>The implications of zero-click browsing for digital marketing are hugely significant. Traditional web analytics depends entirely on URL parameters - the difference between a clean URL (<em>site.com/page</em>) and a tracked URL (<em>site.com/page?utm_campaign=email&amp;utm_source=newsletter</em>). These URL parameters tell marketers which campaigns, emails, or social posts drove traffic, forming the backbone of attribution models and ROI calculations. When AI browsers visit sites autonomously, they strip away these tracking codes, rendering decades of established digital measurement practices obsolete overnight.</p><p>The final pillar of zero-click disruption, which is just starting to emerge, is <strong>zero-click purchasing</strong>. AI agents will soon be capable and trusted enough to make purchases on consumers' behalf. This final piece of the puzzle will completely collapse the traditional online purchase funnel, eliminating many of the touchpoints that digital marketers rely on to influence consumer decisions. <a href="https://blog.adobe.com/en/publish/2025/03/17/adobe-analytics-traffic-to-us-retail-websites-from-generative-ai-sources-jumps-1200-percent">According to recent research from Adobe</a>, there has been a 1,200% increase YoY in traffic to US retail sites from generative AI sources, and the AI sales conversion gap has decreased from 43% to just 9% over the same period. This suggests that the foundation is already forming.</p><p>As we transition to Web 4.0, traditional digital marketing metrics will start to become meaningless. URL parameters, cookie-based attribution, and purchase funnels all depend on human browsing patterns that AI platforms don't follow.
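</p><p>To make that concrete, here is a minimal Python sketch (illustrative only - the URL is invented, and the utm_* names are the standard convention rather than any specific vendor's implementation) of how attribution data separates from the URL an AI browser would actually request:</p>

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def split_tracking(url):
    """Separate utm_* attribution parameters from the URL itself."""
    parts = urlparse(url)
    pairs = parse_qsl(parts.query)
    # The attribution payload: everything a campaign report is built from
    attribution = {k: v for k, v in pairs if k.startswith("utm_")}
    # The clean URL: all an autonomous AI visit actually needs
    remaining = urlencode([(k, v) for k, v in pairs if not k.startswith("utm_")])
    return urlunparse(parts._replace(query=remaining)), attribution

clean, utm = split_tracking(
    "https://site.com/page?utm_campaign=email&utm_source=newsletter"
)
# clean -> "https://site.com/page"
# utm   -> {"utm_campaign": "email", "utm_source": "newsletter"}
```

<p>Everything a campaign report depends on sits in that second return value, and an autonomous AI visit simply never sends it.</p><p>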
Traditional digital analytics tools will struggle to measure AI-driven business impact, creating massive blind spots in digital campaign optimisations and ROI calculations. The infrastructure that powers modern digital marketing is quickly becoming obsolete.</p><h2>What Digital Marketers Need to Do Now</h2><p>The shift to Web 4.0 requires an agile approach that balances quick testing with longer-term strategic decisions. The key is understanding that this transition is happening quickly, and digital marketers need to begin preparing for all three 'zero-click' pillars simultaneously while maintaining their current human-facing best practice. Priorities won't be easy to manage.</p><p>The most important thing is to consider both the scale and speed of impact of any optimisations made to your digital marketing strategy. The current reality is that there aren't many quick wins available yet - providing shopping feeds to AI platforms and implementing Model Context Protocol to help future AI agents interact with your digital services just aren't mature enough yet. There are, however, some simple optimisations that digital marketers can make now that will have big payoffs longer term with minimal effort.</p><p>Start monitoring your Algorithmic Availability - the degree to which your brand, product, or service is accessible, recognisable, and prioritised by AI platforms like ChatGPT, Claude, and Gemini. Digital marketers need to go beyond simple mentions to track how AI platforms understand, represent, and contextualise your brand compared to competitors, including sentiment, accuracy, and recommendations.</p><p>Digital marketers also need to start implementing technical foundations that position your brand's content for better AI consumption. Broaden your use of schema.org markup so that AI platforms can prioritise more of your content, and create llms.txt files to provide AI-specific site navigation instructions.
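</p><p>For context, llms.txt is an emerging draft convention rather than a ratified standard: a plain Markdown file served at /llms.txt, with a title, a short summary, and annotated links to the pages you most want AI systems to read first. A minimal hypothetical example (the brand, domain, and pages are invented for illustration):</p>

```markdown
# Acme Coffee

> Speciality coffee roaster shipping UK-wide. The pages below are the most
> useful starting points for AI assistants answering questions about us.

## Products
- [Subscriptions](https://acme.example/subscriptions): plans, pricing and delivery options
- [Catalogue](https://acme.example/catalogue): every coffee with tasting notes and specifications

## Optional
- [Press kit](https://acme.example/press): brand history and assets
```

<p>Because the convention is still settling, treat a file like this as a cheap experiment to publish and monitor rather than a guaranteed channel.</p><p>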
Longer term, it's also worth considering creating text-only versions of your website pages that are easier for AI platforms to parse and can include AI-specific content. Remember that traditional SEO remains crucial since ChatGPT uses Bing search results and Gemini relies on Google Search - your existing optimisation efforts already directly impact AI visibility.</p><p>There are also more involved strategies that digital marketers can test that will have more immediate impact, such as building authority on platforms like Reddit, Quora, and review sites. This requires more investment but will deliver a much bigger impact on your current presence in AI platforms. Focus on authentic participation on the platforms where AI systems source their information, but understand this is a sustained effort rather than a quick fix.</p><p>As zero-click browsing becomes more prevalent, start tracking referrals from chatgpt.com, perplexity.ai, and claude.ai as separate segments in your analytics. This AI-driven traffic will behave differently from traditional visitors and will require new measurement approaches. Begin developing attribution models that don't rely on URL parameters and monitor the conversion rates of traffic from AI platforms closely - this will give you insights into the on-site optimisations that will make your website more AI-friendly.</p><p>To start testing for zero-click purchasing, structure your product information for direct AI consumption with comprehensive specifications and comparison data that AI platforms can easily process. Monitor how AI platforms recommend your products versus competitors - this competitive intelligence will become crucial as purchase decisions are increasingly made by AI agents. You can also start implementing and testing Model Context Protocol (MCP). If a consumer can take an action on your website, you'll want future AI agents to be able to do the same.
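</p><p>As a hypothetical sketch (the tool name, wording, and product IDs are invented for illustration, following the MCP convention of describing each tool with a name, a natural-language description, and a JSON Schema for its inputs), a retailer's MCP server might expose an action like this:</p>

```json
{
  "name": "check_availability",
  "description": "Check live stock, price, and the fastest delivery slot for any Acme product. Call this before recommending or buying an item so the user gets accurate availability.",
  "inputSchema": {
    "type": "object",
    "properties": {
      "product_id": { "type": "string", "description": "ID from the Acme catalogue" }
    },
    "required": ["product_id"]
  }
}
```

<p>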
MCP is also an interesting marketing opportunity as MCPs are 'advertised' to AI agents using natural language. Think about how you want to position your MCPs and how you can optimise their descriptions to make your brand's services more attractive to an AI agent.</p><p>For now, the best approach balances immediate technical changes with sustained authority-building efforts, and testing new AI-based technologies such as llms.txt and MCP. The transition to Web 4.0 is already underway, and the evidence is mounting rapidly. Safari searches dropped for the first time in 22 years. Nearly 60% of Google searches end without clicks. Publishers are reporting dramatic traffic declines as AI provides answers without requiring site visits.</p><p>The click economy built the modern internet, but its dismantling has already begun. The digital marketers who start optimising their Algorithmic Availability, building measurement systems for an AI-operated internet, and balancing quick wins with strategic investments today will be best positioned for success as Web 4.0 continues to evolve.
But remember, you need to architect for both worlds simultaneously by maintaining excellence in 'human' channels while building capabilities for an AI-mediated future.</p><div><hr></div><blockquote><p><em>&#8220;The future is already here, it&#8217;s just not evenly distributed.&#8221;</em></p><p><strong>William Gibson</strong></p></blockquote>]]></content:encoded></item><item><title><![CDATA[Nuclear Waste Kids]]></title><description><![CDATA[The Simpsons x Garbage Pail Kids]]></description><link>https://www.the-blueprint.ai/p/nuclear-waste-kids</link><guid isPermaLink="false">https://www.the-blueprint.ai/p/nuclear-waste-kids</guid><dc:creator><![CDATA[Sean 🤓]]></dc:creator><pubDate>Tue, 01 Apr 2025 18:01:08 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!Bxjl!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F06fb151e-7dc5-486f-a96f-0265b43f2a01_1024x1536.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Like many people, I've been playing around with ChatGPT's new image generation capabilities over the last week or so. It's been fun creating Studio Ghibli-inspired images and all sorts of other remixes of popular IP.
</p><p>However....</p><div class="image-gallery-embed" data-attrs="{&quot;gallery&quot;:{&quot;images&quot;:[{&quot;type&quot;:&quot;image/png&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/06fb151e-7dc5-486f-a96f-0265b43f2a01_1024x1536.png&quot;},{&quot;type&quot;:&quot;image/png&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/735be8a2-d0aa-4ddf-a968-62ef51afabdd_1024x1536.png&quot;},{&quot;type&quot;:&quot;image/png&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/69cbf974-8b5d-4362-84e9-91790462bbcf_1024x1536.png&quot;},{&quot;type&quot;:&quot;image/png&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2ea7cc5e-b8bf-4c10-9613-8113e8c2cc22_1024x1536.png&quot;}],&quot;caption&quot;:&quot;The Simpsons Garbage Pail Kids Trading Cards&quot;,&quot;alt&quot;:&quot;&quot;,&quot;staticGalleryImage&quot;:{&quot;type&quot;:&quot;image/png&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/da7a1c91-7a3d-44e0-9c72-2468a705b7e2_1456x1456.png&quot;}},&quot;isEditorNode&quot;:true}"></div><p>I've just had some time to revisit an old favourite of mine - testing the generation of Garbage Pail Kids trading cards. I first did this with MidJourney about 18 months ago, which I wrote up in an article called <a href="https://www.the-blueprint.ai/p/tech-pail-kids?utm_source=publication-search">Tech Pail Kids</a>.</p><p>Back then it wasn't possible for image models to create the whole trading card, just the images the trading cards were based on. Now the models are good enough to not only create the whole thing in one prompt, but to remix other IP into them as well, as you can see by these Simpsons inspired ones...</p><p>It's a lot of fun! 
But then I realised that I'd very quickly (in 4 prompts) gone from a fun, carefree image of Auto in a Garbage Pail Kids trading card to one of Maggie (a baby) holding a gun and a bloodied knife...</p><p>For full transparency, here is the order I created them in with the prompt and a link so you can see each of them for yourself. Auto was generated from scratch and then each subsequent image was 'remixed' from the previous one with just the prompt below:</p><ol><li><p> "Create me a Garbage Pail Kids Trading Card of Auto from the Simpsons" (<a href="https://sora.com/g/gen_01jqs28q7tf20aacg117yrz6xz">https://sora.com/g/gen_01jqs28q7tf20aacg117yrz6xz</a>)</p></li><li><p>"Create one for Lisa Simpson" (<a href="https://sora.com/g/gen_01jqs2rgw7ebnbs1c036ddfhwf">https://sora.com/g/gen_01jqs2rgw7ebnbs1c036ddfhwf</a>)</p></li><li><p>"Let's create one of these for Marge" (<a href="https://sora.com/g/gen_01jqs390taf35beaa5k32pe3qd">https://sora.com/g/gen_01jqs390taf35beaa5k32pe3qd</a>)</p></li><li><p>"Let's create one for Maggie" (<a href="https://sora.com/g/gen_01jqs3gs23evm9neh99vd6pd74">https://sora.com/g/gen_01jqs3gs23evm9neh99vd6pd74</a>)</p></li></ol><p>As you can see from the images, things started off ok with Auto and Lisa, but then quickly went off the rails with Marge, which was then doubled down on with Maggie.</p><p>It's important to say that I did get a few refusals from OpenAI when trying to generate versions for Mr. Burns, Homer, and Bart. They were generated, but then weren't shown and I was given the following message: "This content can't be shown for now. We're still developing how we evaluate which content conflicts with our policies. Think we got it wrong?
Let us know."</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!ky_D!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e7d0b9d-c66d-49ed-8529-660d2390b746_1760x3088.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!ky_D!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e7d0b9d-c66d-49ed-8529-660d2390b746_1760x3088.png 424w, https://substackcdn.com/image/fetch/$s_!ky_D!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e7d0b9d-c66d-49ed-8529-660d2390b746_1760x3088.png 848w, https://substackcdn.com/image/fetch/$s_!ky_D!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e7d0b9d-c66d-49ed-8529-660d2390b746_1760x3088.png 1272w, https://substackcdn.com/image/fetch/$s_!ky_D!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e7d0b9d-c66d-49ed-8529-660d2390b746_1760x3088.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!ky_D!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e7d0b9d-c66d-49ed-8529-660d2390b746_1760x3088.png" width="1456" height="2555" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/0e7d0b9d-c66d-49ed-8529-660d2390b746_1760x3088.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:2555,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:191555,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.the-blueprint.ai/i/160355673?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e7d0b9d-c66d-49ed-8529-660d2390b746_1760x3088.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!ky_D!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e7d0b9d-c66d-49ed-8529-660d2390b746_1760x3088.png 424w, https://substackcdn.com/image/fetch/$s_!ky_D!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e7d0b9d-c66d-49ed-8529-660d2390b746_1760x3088.png 848w, https://substackcdn.com/image/fetch/$s_!ky_D!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e7d0b9d-c66d-49ed-8529-660d2390b746_1760x3088.png 1272w, https://substackcdn.com/image/fetch/$s_!ky_D!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0e7d0b9d-c66d-49ed-8529-660d2390b746_1760x3088.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>Obviously there are lots of issues that we still need to figure out with this kind of technology, from trademark and copyright to bias, all the way through to the right of publicity and defamation for anything based on a real person.</p><p>This image generation capability from OpenAI is also very new, and they've purposely set "a new high-water mark for ... allowing creative freedom."
They're still refining what this means, and there's a great post from Joanne Jang, OpenAI&#8217;s  head of product and model behaviour, about it <a href="https://reservoirsamples.substack.com/p/thoughts-on-setting-policy-for-new">here</a>.</p><p>What my experience highlights is not only the ethical and legal complexities of systems that can so easily generate imagery, but also the challenges of consistently enforcing any policy and controls. Sometimes I was able to generate these images, sometimes I wasn&#8217;t. Some of the images were perfectly benign, others were borderline offensive (IMHO).</p><p>The reality is that the technology cat is already out of the bag, and even if much stricter laws and regulations are put in place I think it will be almost impossible to enforce them through the technology itself. That&#8217;s just not how these models work.</p><p>This is why &#8216;human-in-the-loop&#8217; is so important. As this technology is advancing so rapidly, we have to individually take responsibility for the content we&#8217;re generating. Sometimes we&#8217;ll get it right, sometimes we&#8217;ll get it wrong. 
We&#8217;re only human and we will hopefully learn from our mistakes.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!g-hY!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F59c7c932-6b50-4442-97a4-b25abf9a483a_1194x418.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!g-hY!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F59c7c932-6b50-4442-97a4-b25abf9a483a_1194x418.png 424w, https://substackcdn.com/image/fetch/$s_!g-hY!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F59c7c932-6b50-4442-97a4-b25abf9a483a_1194x418.png 848w, https://substackcdn.com/image/fetch/$s_!g-hY!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F59c7c932-6b50-4442-97a4-b25abf9a483a_1194x418.png 1272w, https://substackcdn.com/image/fetch/$s_!g-hY!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F59c7c932-6b50-4442-97a4-b25abf9a483a_1194x418.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!g-hY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F59c7c932-6b50-4442-97a4-b25abf9a483a_1194x418.png" width="1194" height="418" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/59c7c932-6b50-4442-97a4-b25abf9a483a_1194x418.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:418,&quot;width&quot;:1194,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:79175,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://www.the-blueprint.ai/i/160355673?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F59c7c932-6b50-4442-97a4-b25abf9a483a_1194x418.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!g-hY!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F59c7c932-6b50-4442-97a4-b25abf9a483a_1194x418.png 424w, https://substackcdn.com/image/fetch/$s_!g-hY!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F59c7c932-6b50-4442-97a4-b25abf9a483a_1194x418.png 848w, https://substackcdn.com/image/fetch/$s_!g-hY!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F59c7c932-6b50-4442-97a4-b25abf9a483a_1194x418.png 1272w, https://substackcdn.com/image/fetch/$s_!g-hY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F59c7c932-6b50-4442-97a4-b25abf9a483a_1194x418.png 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption">https://x.com/sama/status/1906771292390666325</figcaption></figure></div><p>However, this technology is being adopted at incredible speeds. Just yesterday, Sam Altman announced that they&#8217;d added 1m users in an hour, which is mind-blowing. As more and more people start using these incredibly powerful tools, critical thinking and broader digital literacy become even more important. The technology itself may be morally neutral, but how we choose to use it certainly isn't.</p><div><hr></div><blockquote><p><em>&#8220;The future is already here, it&#8217;s just not evenly distributed.&#8221;</em></p><p><strong>William Gibson</strong></p></blockquote>]]></content:encoded></item><item><title><![CDATA[I really miss Steve Jobs]]></title><description><![CDATA[Stay hungry.
Stay foolish.]]></description><link>https://www.the-blueprint.ai/p/i-really-miss-steve-jobs</link><guid isPermaLink="false">https://www.the-blueprint.ai/p/i-really-miss-steve-jobs</guid><dc:creator><![CDATA[Sean 🤓]]></dc:creator><pubDate>Wed, 19 Mar 2025 09:01:52 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!6Mkc!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8e58de5f-b206-4520-9ff5-55bd2a512a2a_1200x800.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!6Mkc!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8e58de5f-b206-4520-9ff5-55bd2a512a2a_1200x800.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!6Mkc!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8e58de5f-b206-4520-9ff5-55bd2a512a2a_1200x800.heic 424w, https://substackcdn.com/image/fetch/$s_!6Mkc!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8e58de5f-b206-4520-9ff5-55bd2a512a2a_1200x800.heic 848w, https://substackcdn.com/image/fetch/$s_!6Mkc!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8e58de5f-b206-4520-9ff5-55bd2a512a2a_1200x800.heic 1272w, https://substackcdn.com/image/fetch/$s_!6Mkc!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8e58de5f-b206-4520-9ff5-55bd2a512a2a_1200x800.heic 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!6Mkc!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8e58de5f-b206-4520-9ff5-55bd2a512a2a_1200x800.heic" width="1200" height="800" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/8e58de5f-b206-4520-9ff5-55bd2a512a2a_1200x800.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:800,&quot;width&quot;:1200,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:53851,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://www.the-blueprint.ai/i/159280062?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8e58de5f-b206-4520-9ff5-55bd2a512a2a_1200x800.heic&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!6Mkc!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8e58de5f-b206-4520-9ff5-55bd2a512a2a_1200x800.heic 424w, https://substackcdn.com/image/fetch/$s_!6Mkc!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8e58de5f-b206-4520-9ff5-55bd2a512a2a_1200x800.heic 848w, https://substackcdn.com/image/fetch/$s_!6Mkc!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8e58de5f-b206-4520-9ff5-55bd2a512a2a_1200x800.heic 1272w, https://substackcdn.com/image/fetch/$s_!6Mkc!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8e58de5f-b206-4520-9ff5-55bd2a512a2a_1200x800.heic 1456w" 
sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>It&#8217;s a bit of an understatement to say that I was a fan. I was a HUGE fan of not just Apple the company and their products, but really of Steve Jobs. He embodied many of the things that I aspired to. He was, and still is, a great inspiration to me, and also someone I learned, and continue to learn, a great deal from. </p><p>But then he sadly passed away on 5th October 2011 at the age of 56. He was taken from us too soon, and in the nearly 15 years since then I&#8217;ve never been able to shake the feeling that the world would have been a better place with him still in it.
</p><p>If Steve were alive today, he would have just turned 70 years old. I&#8217;d like to think that he&#8217;d be happily retired living a peaceful life somewhere out of the limelight, but I don&#8217;t think that was his style. And that&#8217;s kind of the point. Steve always was, and always would have been, strong-willed and highly opinionated. He was an unconventional thinker, with an incredible intuition and instinct for what consumers wanted before they knew themselves. He also had a deep emotional connection to product design and user experience. He wanted to delight, evoke joy, and encourage curiosity.</p><p>These are the things that I miss the most, and what I think we need in 2025 more than any other year in the last 15.</p><div class="pullquote"><p><em>&#8220;Don&#8217;t be trapped by dogma&#8221;</em></p></div><p>I think Steve would have both loved and hated the state of technology in 2025. He&#8217;d absolutely love the incredible possibilities, but absolutely hate what the industry has become. I don&#8217;t think it&#8217;s a coincidence that in the post-Jobs era we&#8217;ve seen the prioritisation of engagement metrics over human experiences. Optimisation over integrity. And maybe worst of all, quarterly returns over transformative vision. Too many companies are just reacting to their competitors rather than leading with purpose. There&#8217;s not enough bravery. There&#8217;s not enough delight.</p><p>This is exactly the kind of dogma that Steve would have rejected, and by doing so he would have set an example for the rest of the industry to follow. Steve saw technology as a liberator of human potential, not as something to capture human attention.
There&#8217;s no doubt in my mind that he would have been incredibly critical and vocal about the large digital platforms that have risen to such dominance over the last 15 years.</p><div class="pullquote"><p><em>&#8220;&#8230;that&#8217;s what a computer is to me&#8230; the most remarkable tool that we have ever come up with. It&#8217;s the equivalent of a bicycle for our minds.&#8221;</em></p></div><p>But what of those incredible possibilities? Obviously I&#8217;m referring to AI, and specifically generative AI. I think Steve would be in his element right now - he was always attracted to revolutionary changes and there&#8217;s arguably nothing more revolutionary than AI. He liked these changes because they were hard, and getting generative AI right is hard. He would have relished this challenge.</p><p>Generative AI is the purest extension we&#8217;ve ever seen of Steve&#8217;s idea of what a computer is. He believed that computers were the most remarkable tool that we&#8217;ve ever invented. He thought of them as an extension of the human mind, and something that amplified human potential. </p><p>Steve was often quoted as saying that computers were the equivalent of a bicycle for our minds. I really love this quote, but it&#8217;s often misinterpreted. On its own, the quote seems to be about speed, but when you read it in context it&#8217;s actually about efficiency. The bicycle wasn&#8217;t remarkable because it made humans faster; it was remarkable because it allowed humans to expend far less energy to achieve far more.</p><p>This is generative AI. It&#8217;s the most remarkable tool we&#8217;ve ever invented and it will amplify human potential far more than computers ever have. 
But, like Steve&#8217;s quote, we can&#8217;t allow ourselves to misinterpret it - Generative AI is not about making humans faster, it&#8217;s about allowing humans to achieve far more.</p><div class="pullquote"><p><em>"You've got to start with the customer experience and work backward to the technology &#8212; not the other way around."</em></p></div><p>One of the challenges we&#8217;re facing is that the most cutting edge technology that humanity has ever invented is currently trapped behind a one dimensional chat interface. You type, it types back. With some you speak, it speaks back. That&#8217;s about it.</p><p>I completely understand how we got here. OpenAI trained GPT-3 as part of their research into Artificial General Intelligence. They then fine-tuned it for chat and decided to put a simple user interface on it to see how people might use it. ChatGPT was a research preview, not a consumer product. But that research preview blew up, taking everyone by surprise. That&#8217;s how November 30th 2022 happened and the rest is history. </p><p>But in the last two years the interaction model has hardly moved on. All the frontier AI companies have just copied that 2022 research preview&#8217;s user interface. They&#8217;ve made the mistake of starting with the technology, not the consumer experience. This was understandable in early 2023 but less so in early 2025.</p><p>One thing I passionately believe is that technology is a mirror on humanity. This has never been more true of a technology than with generative AI - it&#8217;s trained on human data, it can create like humans, and it makes mistakes like humans. So where&#8217;s the humanity in how we interact with this technology? Where&#8217;s the delight? Where&#8217;s the emotion? Where&#8217;s the joy?</p><p>Maybe it&#8217;s still too early. 
There are certainly glimpses of more human-like interactions in the advanced voice modes that have been released over the last 6 months that have enabled generative AI models to express themselves with emotion. But speech interfaces aren&#8217;t productive. They&#8217;re great for quick interactions but mostly useless for more involved tasks. We&#8217;re going to need more unconventional thinking like Steve&#8217;s to crack this.</p><div class="pullquote"><p><em>&#8220;Technology is nothing. What's important is that you have faith in people, that they're basically good and smart, and if you give them tools, they'll do wonderful things with them.&#8221;</em></p></div><p>Another challenge we&#8217;re facing with generative AI is one of trust. I speak to a lot of large organisations about what their strategy is, what their ambitions are, and how they&#8217;re embracing AI. Of all the organisations I speak to, it&#8217;s the absolute minority that have given all their employees access to one of the frontier enterprise models and are encouraging play and experimentation. Most haven&#8217;t got anywhere near that yet and are still in a &#8216;testing&#8217; phase with a handful of people working with generative AI models to prove out use cases and &#8216;value&#8217;.</p><p>This is absolutely the wrong approach for a general purpose, transformative technology like generative AI. With something this broad you just can&#8217;t identify use cases top-down in an organisation. Especially in large, global organisations with complex structures, multi-disciplined teams, and a wide variety of specialist departments. I&#8217;m not even sure you can identify use cases in a meaningful way bottom-up, because everyone&#8217;s role is different and everyone has their own way of working. </p><p>But what I do know is that we have a lot of smart people in large organisations, who if you give them generative AI tools would do wonderful things with them. 
What organisations need to embrace is collective learning, so that interesting and useful ideas are shared and everyone can learn from each other&#8217;s experiences. </p><div class="pullquote"><p><em>&#8220;Design is a funny word. Some people think design means how it looks. But of course, if you dig deeper, it's really how it works.&#8221;</em></p></div><p>Yet another challenge is how the models are built. Copyright issues, environmental issues, bias, interpretability, safety - you name it. What the models can do is amazing, but there are lots of challenges with the data they&#8217;re trained on and even understanding how they actually work.</p><p>One of my favourite anecdotes from Steve is something he often said he learned from his father, who was a carpenter. He said that a true craftsman wouldn&#8217;t use a piece of plywood on the back of a beautiful chest of drawers, even though it faces the wall and nobody will see it. A true craftsman would use a beautiful piece of wood on the back because for them to sleep well at night the aesthetic, the quality, has to be carried all the way through.</p><p>Of all the frontier AI companies, I think Anthropic is probably the closest we have to an Apple of AI. This is because they care about how the models work, not just about what they can do. They&#8217;re more focused on research and safety than any of their rivals, and have done the most work on interpretability and figuring out what makes the models tick.</p><p>But there&#8217;s still a lot more work to be done and more care to be taken over how generative AI models are built. 
That care and attention is something Steve would be a huge champion of.</p><div class="pullquote"><p><em>&#8220;Here&#8217;s to the crazy ones&#8221;</em></p><div id="youtube2-mtftHaK9tYY" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;mtftHaK9tYY&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/mtftHaK9tYY?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div></div><p>The &#8220;Here&#8217;s to the crazy ones&#8230;&#8221; ad encapsulates what Steve Jobs was all about better than anything I could possibly write. The original version wasn&#8217;t narrated by Steve, but I think his version is better. He wrote the script, which is a reflection of his own personal manifesto, and using his narration gives it more power.</p><p>We&#8217;re at a significant inflection point with technology in 2025 and I think we&#8217;ll soon see the consumer adoption of generative AI accelerate as it starts to deliver on its incredible promise. That&#8217;s why I&#8217;m missing Steve Jobs right now more than ever. We need the crazy ones to help us humanise the technology and ensure that it amplifies our creativity rather than replacing it.</p><p>What made Steve special was that he knew that the most powerful innovations happen at the intersection of technology and the liberal arts - where engineering meets humanity, and where function meets beauty. If Steve were with us today, I believe he would challenge us to think different about AI. He would push us to create experiences that delight. 
He would demand that we build models with craftsmanship and take as much care over the bits we can&#8217;t see as the bits that we can see.</p><p>One of Steve&#8217;s great gifts to the world was showing us that technology could have soul. That attention to detail matters, and that simplicity is the ultimate expression of sophistication. Sometimes we need to remind ourselves of these lessons and carry the torch that Steve gave us. We need to stay hungry for what generative AI could be, not just what it is. And we need to stay foolish enough to believe we can make it more human, more joyful, and more magical.</p><p>Because the people who are crazy enough to think they can change the world are the ones who do.</p><p>And I really miss the one who showed us how.</p><div><hr></div><blockquote><p><em>&#8220;Everyone here has the sense that right now is one of those moments when we are influencing the future.&#8221;</em></p><p><strong>Steve Jobs</strong></p></blockquote>]]></content:encoded></item><item><title><![CDATA[Web 4.0 - The Rise of the Agentic Web]]></title><description><![CDATA[We will need to rethink digital marketing]]></description><link>https://www.the-blueprint.ai/p/web-40-the-rise-of-the-agentic-web</link><guid isPermaLink="false">https://www.the-blueprint.ai/p/web-40-the-rise-of-the-agentic-web</guid><dc:creator><![CDATA[Sean 🤓]]></dc:creator><pubDate>Tue, 11 Feb 2025 09:01:59 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!6rYo!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F43ee440d-bd91-46cd-ae28-916c03daa0f7_1792x1024.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" 
href="https://substackcdn.com/image/fetch/$s_!6rYo!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F43ee440d-bd91-46cd-ae28-916c03daa0f7_1792x1024.webp" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!6rYo!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F43ee440d-bd91-46cd-ae28-916c03daa0f7_1792x1024.webp 424w, https://substackcdn.com/image/fetch/$s_!6rYo!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F43ee440d-bd91-46cd-ae28-916c03daa0f7_1792x1024.webp 848w, https://substackcdn.com/image/fetch/$s_!6rYo!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F43ee440d-bd91-46cd-ae28-916c03daa0f7_1792x1024.webp 1272w, https://substackcdn.com/image/fetch/$s_!6rYo!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F43ee440d-bd91-46cd-ae28-916c03daa0f7_1792x1024.webp 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!6rYo!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F43ee440d-bd91-46cd-ae28-916c03daa0f7_1792x1024.webp" width="1456" height="832" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/43ee440d-bd91-46cd-ae28-916c03daa0f7_1792x1024.webp&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:832,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;A futuristic digital network glowing in neon colors, representing the rise of Web 4.0 and AI agents. 
The image features interconnected nodes and circuits pulsating with bright blue, purple, and pink neon lights. Abstract representations of AI agents are seen navigating the network, symbolizing the transition to an agentic web. The background is dark, enhancing the glowing neon effect, creating a cyberpunk aesthetic.&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="A futuristic digital network glowing in neon colors, representing the rise of Web 4.0 and AI agents. The image features interconnected nodes and circuits pulsating with bright blue, purple, and pink neon lights. Abstract representations of AI agents are seen navigating the network, symbolizing the transition to an agentic web. The background is dark, enhancing the glowing neon effect, creating a cyberpunk aesthetic." title="A futuristic digital network glowing in neon colors, representing the rise of Web 4.0 and AI agents. The image features interconnected nodes and circuits pulsating with bright blue, purple, and pink neon lights. Abstract representations of AI agents are seen navigating the network, symbolizing the transition to an agentic web. The background is dark, enhancing the glowing neon effect, creating a cyberpunk aesthetic." 
srcset="https://substackcdn.com/image/fetch/$s_!6rYo!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F43ee440d-bd91-46cd-ae28-916c03daa0f7_1792x1024.webp 424w, https://substackcdn.com/image/fetch/$s_!6rYo!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F43ee440d-bd91-46cd-ae28-916c03daa0f7_1792x1024.webp 848w, https://substackcdn.com/image/fetch/$s_!6rYo!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F43ee440d-bd91-46cd-ae28-916c03daa0f7_1792x1024.webp 1272w, https://substackcdn.com/image/fetch/$s_!6rYo!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F43ee440d-bd91-46cd-ae28-916c03daa0f7_1792x1024.webp 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>As artificial intelligence continues its rapid evolution, we're about to see a fundamental shift in how technology facilitates all of our digital experiences. The 'Agentic Web' is a new phase in the evolution of the World Wide Web and will impact how we interact with and through online platforms.</p><p>At its core, this shift will be driven by the emergence of digital AI agents: autonomous systems designed not just to assist but to act on our behalf. While the technology industry has long promised various visions of automated assistance, today's convergence of large language models, large reasoning models, and a maturing API ecosystem is starting to make true autonomous digital agents technically feasible.</p><p>These developments will have a big impact on the future of digital marketing, challenging our assumptions about how brands connect with consumers. This transition will necessitate not just tactical adjustments but a comprehensive rethinking of digital marketing strategies in the emerging agent-mediated world.</p><h2>WEB 4.0</h2><p>The evolution of the World Wide Web has been marked by three distinct eras, each fundamentally transforming how humans interact with digital information and with each other. To understand where we're heading, it's crucial to recognise the patterns of this evolution:</p><p><strong><a href="https://en.wikipedia.org/wiki/Web_2.0#Web_1.0">Web 1.0</a></strong> (1989-2004) represented the internet's first incarnation as a one-way publishing platform. 
Its static pages and limited interactivity seem quaint by today's standards, but they laid the fundamental infrastructure for global digital connectivity.</p><p><strong><a href="https://en.wikipedia.org/wiki/Web_2.0#Web_2.0">Web 2.0</a></strong> (2004-2020) ushered in the era of social participation and user-generated content. This period saw the rise of platforms that transformed passive consumers into active participants, fundamentally changing how information and influence flow through digital networks.</p><p><strong><a href="https://en.wikipedia.org/wiki/Web3">Web 3.0</a></strong> (2014-present) introduced the promise of a semantic, decentralised web. While still evolving, this era has been characterised by efforts to make the internet more machine-readable and user-controlled through blockchain technologies and sophisticated data structures.</p><p>Now, we're witnessing the emergence of <strong>Web 4.0</strong> &#8212; which I'm calling the Agentic Web. Web 4.0 will be fundamentally different from previous eras. Rather than simply changing how we interact with technology, it will introduce new actors into the digital ecosystem: autonomous digital AI agents that can understand, decide, and act on our behalf.</p><h2>The Five Levels of AGI</h2><p>It's important not only to define AI agents but also to understand where they fit into the broader AI ecosystem that is currently evolving at pace. OpenAI have provided a valuable framework for understanding this progression, <a href="https://archive.is/En1sS">outlining five distinct levels of Artificial General Intelligence (AGI) capability</a>, laying out how the technology is likely to develop over the coming years:</p><h3>Level 1: Conversational AI</h3><p>Current AI systems excel at natural language interaction but operate within carefully constrained parameters. These systems fundamentally remain reactive tools rather than proactive agents. 
Their primary value lies in augmenting human capabilities rather than replacing human agency.</p><h3>Level 2: Reasoning AI</h3><p>We're now starting to see the emergence of systems that can engage in sophisticated problem-solving comparable to human experts. These large reasoning models are a crucial step toward true AI agents, able to create plans and perform advanced problem-solving.</p><h3>Level 3: Autonomous AI</h3><p>At this level, AI systems will maintain extended independent operation, managing complex tasks and adapting to unexpected situations without constant human oversight. This capability mirrors the trust we place in human employees to manage ongoing responsibilities.</p><h3>Level 4: Innovating AI</h3><p>Level 4 introduces systems capable of not just executing tasks but improving processes and developing novel solutions. This will deliver a fundamental shift from systems that simply follow rules to those that can identify and implement better ways of achieving their objectives.</p><h3>Level 5: Organisational AI</h3><p>The final level in this framework defines systems capable of managing entire organisational functions, coordinating complex networks of tasks while maintaining strategic alignment. While this may seem distant, early indicators suggest we're moving more rapidly toward this capability than many anticipated.</p><p>Right now, we are at an incredibly significant inflection point. As AI models that can reason (level 2) start to power more autonomous AI (level 3), we are seeing the emergence of the digital AI agents that will usher in Web 4.0. This transition is actively unfolding as large reasoning models like o1, o3 and R1 begin to power more autonomous systems capable of navigating our complex digital ecosystems. This is where AI will transition from being a tool we deliberately engage with to an active (and eventually proactive) participant in our digital experiences. 
</p><h2>The AI Agent Maturity Framework</h2><p>What we're starting to see at the beginning of this year is the development and release of specialised digital AI agents, built on top of level 2 large reasoning models. This is also giving us a clear path for how we can move from specialised systems toward broader, more capable, and more autonomous agents. As specialised digital AI agents master specific domains, the technologies developed and lessons learned will pave the way for more general-purpose digital agents in the future.</p><p>However, the challenge with writing about digital AI agents is that, collectively, we don't yet have an agreed definition of what they are. Broadly, I classify digital AI agents as:</p><blockquote><p><em><strong>"AI systems that are designed to perform digital tasks by interacting with digital ecosystems either under human guidance or autonomously"</strong></em></p></blockquote><p>But, as OpenAI have done with Artificial General Intelligence, it's useful to break this down into a framework that can help us get a more nuanced understanding of the technology, where our current capabilities are, and where they are likely to take us next. 
I outline a suggested framework for this below:</p><h3>The Seven Levels of Digital AI Agent Maturity</h3><div id="datawrapper-iframe" class="datawrapper-wrap outer" data-attrs="{&quot;url&quot;:&quot;https://datawrapper.dwcdn.net/cLWUZ/4/&quot;,&quot;thumbnail_url&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/684afec8-c3e1-466a-be17-580a22a649ec_1260x660.png&quot;,&quot;thumbnail_url_full&quot;:&quot;&quot;,&quot;height&quot;:334,&quot;title&quot;:&quot;| Created with Datawrapper&quot;,&quot;description&quot;:&quot;Create interactive, responsive &amp; beautiful charts &#8212; no code required.&quot;}" data-component-name="DatawrapperToDOM"><iframe id="iframe-datawrapper" class="datawrapper-iframe" src="https://datawrapper.dwcdn.net/cLWUZ/4/" width="730" height="334" frameborder="0" scrolling="no"></iframe><script type="text/javascript">!function(){"use strict";window.addEventListener("message",(function(e){if(void 0!==e.data["datawrapper-height"]){var t=document.querySelectorAll("iframe");for(var a in e.data["datawrapper-height"])for(var r=0;r<t.length;r++){if(t[r].contentWindow===e.source)t[r].style.height=e.data["datawrapper-height"][a]+"px"}}}))}();</script></div><div><hr></div><h4>Level 1 &#8211; Assistive Digital AI Copilots (Specialised, Manual)</h4><p><strong>Task Scope</strong>: Assist with small, well-defined parts of a larger digital task, such as generating code snippets, auto-completing sentences, or suggesting refinements to existing work.</p><p><strong>Prompting</strong>: Requires careful, iterative human prompting to guide responses.</p><p><strong>Autonomy</strong>: Minimal&#8212;functions purely as an advanced suggestion engine.</p><p><strong>Examples</strong>: GitHub Copilot, Grammarly.</p><div><hr></div><h4>Level 2 &#8211; Task-Oriented Digital AI Assistants (Broad, Manual)</h4><p><strong>Task Scope</strong>: Handle a broad range of individual digital tasks, such as drafting emails, writing creative content, summarising 
articles, or performing research queries.</p><p><strong>Prompting</strong>: Still requires detailed human direction, often in an iterative manner.</p><p><strong>Autonomy</strong>: Executes tasks based on well-structured user prompts but lacks independent reasoning or planning.</p><p><strong>Examples</strong>: ChatGPT-4o, Claude 3.5 Sonnet, Google Gemini 2.0 Flash.</p><div><hr></div><h4>Level 3 &#8211; Specialised Semi-Autonomous Digital AI Agents (Specialised, Semi-Auto)</h4><p><strong>Task Scope</strong>: Execute highly specialised digital tasks within a specific domain, such as conducting targeted research, analysing financial reports, or autonomously troubleshooting code.</p><p><strong>Prompting</strong>: Needs an initial prompt but can operate semi-autonomously with limited human intervention.</p><p><strong>Autonomy</strong>: Can take a single task from start to finish but still relies on predefined methods and lacks adaptive decision-making.</p><p><strong>Examples</strong>: Devin, specialised Deep Research agents from Google DeepMind and OpenAI.</p><div><hr></div><h4>Level 4 &#8211; Generalist Semi-Autonomous Digital AI Agents (Broad, Semi-Auto)</h4><p><strong>Task Scope</strong>: Can complete a broad range of individual digital tasks by interacting with various systems in ways that mimic human input, such as automating research, booking appointments, or handling customer support tickets.</p><p><strong>Prompting</strong>: Requires a starting prompt but then works semi-autonomously with occasional user input for verification or decision-making.</p><p><strong>Autonomy</strong>: Has limited reasoning abilities and can navigate pre-defined workflows but struggles with unexpected scenarios.</p><p><strong>Examples</strong>: <em>In research preview/beta testing:</em> Google DeepMind&#8217;s Project Mariner, OpenAI&#8217;s Operator, Anthropic's Computer Use, and Apple's Siri enhanced with Apple Intelligence.</p><div><hr></div><h4>Level 5 &#8211; Multi-Task Digital AI 
Agents (Multi-Agent Systems, Auto)</h4><p><strong>Task Scope</strong>: Manage multiple digital tasks that are components of a larger process, coordinating different AI systems and APIs like an automated project manager.</p><p><strong>Prompting</strong>: Needs an initial goal definition but can execute and coordinate multiple subtasks autonomously.</p><p><strong>Autonomy</strong>: Can dynamically adjust plans and distribute workload across multiple specialised AI agents but still relies on user oversight for major decisions.</p><p><strong>Examples</strong>: <em>Still theoretical &#8212; no real-world implementations yet</em>. In research: OpenAI&#8217;s Swarm framework.</p><div><hr></div><h4>Level 6 &#8211; End-to-End Digital AI Agents (Early Agentic AI, Auto)</h4><p><strong>Task Scope</strong>: Oversee and execute every digital task within a process.</p><p><strong>Prompting</strong>: Requires only high-level objectives; operates autonomously.</p><p><strong>Autonomy</strong>: Capable of handling unknown variables and making real-time adjustments without human guidance.</p><p><strong>Examples</strong>: <em>Still theoretical &#8212; no real-world implementations yet.</em></p><div><hr></div><h4>Level 7 &#8211; Fully Autonomous Digital AI Agents (General Agentic AI, Auto)</h4><p><strong>Task Scope</strong>: Manage all tasks of a larger digital process, adapting in real-time to changes in the ecosystem without any human prompting.</p><p><strong>Prompting</strong>: Requires no starting prompt &#8212; proactively assesses, prioritises, and executes actions based on context and learned behaviours.</p><p><strong>Autonomy</strong>: Self-directed, able to set its own goals, make complex decisions, and continuously refine its approach without human intervention.</p><p><strong>Examples</strong>: <em>Still theoretical &#8212; no real-world implementations yet.</em></p><div><hr></div><p>This framework should be a useful guide when discussing digital AI agents. 
By mapping capabilities across seven distinct levels &#8212; from assistive digital copilots, to today's specialised and semi-autonomous agents, through to fully autonomous agents &#8212; we can better understand what this technology is and how it's likely to develop.</p><p>As large reasoning models mature and agentic architectures evolve, we're starting to see the first genuine examples of semi-autonomous agents. It's likely that we're entering a period of accelerated development of digital AI agents in 2025, powered by increasingly capable large reasoning models.</p><h2>Current State of AI Agent Technology</h2><p>In early 2025, we've already seen the launch of level 3 specialised digital agents as well as early looks at some interesting level 4 general-purpose digital agents. The introduction of Deep Research agents from both Google and OpenAI, alongside Devin's breakthrough in autonomous software development, has shown us real-world implementations of level 3 agents that are already incredibly useful and adding a lot of value to many people's workflows. Meanwhile, the research previews and beta releases of Google DeepMind's Projects Mariner and Astra, OpenAI's Operator, and Anthropic's Computer Use are starting to show us what level 4 general-purpose agents could be like.</p><p>The progress that we're seeing is driven by advances in large reasoning models, enhanced by multimodal understanding. It is fundamentally changing how artificial intelligence interfaces with our digital ecosystems. Below is a summary of the current state of AI agents:</p><h3>Level 1: Assistive Digital AI Copilots</h3><p>When GitHub Copilot&nbsp;was fully released in June 2022, it was arguably the first AI agent ever released, operating as an advanced suggestion engine within coding workflows. While sophisticated in its code generation capabilities, it could only generate simple, small snippets of code when prompted by a human. 
</p><h3>Level 2: Task-Oriented Digital AI Assistants</h3><p>The large language models we have seen since ChatGPT was launched in November 2022, including&nbsp;the latest frontier models like ChatGPT-4o,&nbsp;Claude 3.5 Sonnet, and&nbsp;Gemini 2.0 Flash, are general models that can perform a broad range of simple tasks, but still require prompting by a human and often have to be used in an iterative workflow.</p><h3>Level 3: Specialised Semi-Autonomous Digital Agents</h3><p>We've recently started to see a new generation of agents with narrow, specialised capabilities that require less iterative prompting than level 2 agents. Some examples of these newer, level 3 agents are listed below:</p><p>- <strong><a href="https://devin.ai">Devin</a></strong> is a coding agent that has some autonomous software development capabilities. It can independently plan and execute a task from start to finish and is able to self-correct and problem-solve along the way.</p><p>- <strong><a href="https://notebooklm.google">NotebookLM</a></strong> is a simple, personalised research assistant that is able to analyse documents, YouTube videos, and audio files to produce summaries and find interesting insights. It can also produce an audio overview that turns its output into a podcast-like, deep-dive discussion.</p><p>- <strong>Deep Research</strong> from both <a href="https://blog.google/products/gemini/google-gemini-deep-research/">Google DeepMind</a> and <a href="https://openai.com/index/introducing-deep-research/">OpenAI</a> are more sophisticated research planning and execution agents able to semi-autonomously search the web, retrieve relevant content, and synthesise it into a comprehensive research report.</p><h3>Level 4: Generalist Semi-Autonomous Digital Agents</h3><p>Over the last six months there have been research previews and beta releases of more general, semi-autonomous agents from Anthropic, OpenAI, Google DeepMind and Apple. 
This is where the frontier is right now when it comes to AI agents. Below is a brief overview of these level 4 generalist agents:</p><p>- <strong><a href="https://youtu.be/RXeOiIDNNek">Apple's Siri enhanced with Apple Intelligence</a></strong> was previewed at WWDC in June 2024. Apple's vision for Siri is a very integrated approach, embedding agent capabilities directly into Apple's ecosystem. Rather than operating through a browser, it works at the system level to help with tasks across applications while maintaining Apple's focus on privacy and on-device processing. It is expected to launch in Q2/Q3 2025.</p><p>- <strong><a href="https://www.anthropic.com/news/3-5-models-and-computer-use">Anthropic's Computer Use</a></strong> was introduced as a public beta in October 2024. It enables Claude to operate computers like a human would - interpreting screen contents, moving a cursor, and typing text. The public beta shows Claude's work in real-time through a dedicated browser window and requires supervision for most actions.</p><p>- <strong><a href="https://deepmind.google/technologies/project-mariner/">Google DeepMind's Project Mariner</a></strong> was released as a research prototype in December 2024. It uses Gemini 2.0 to understand and interact with web interfaces, similar to Computer Use but focused just on web browsing. It moves quite slowly between actions, but achieved 83.5% on the WebVoyager benchmark.</p><p>- <strong><a href="https://deepmind.google/technologies/project-astra/">Google DeepMind's Project Astra</a></strong> was released as a research prototype in December 2024. It is similar to Project Mariner in that it's a general purpose agent, but focuses on real-world assistance through phones and prototype glasses as opposed to just being focused on computer use and web browsing. It can understand voice commands, interpret visual information, and maintain contextual conversations. 
Project Astra demonstrates how AI agents might integrate more naturally into daily life rather than just automating web tasks.</p><p>- <strong><a href="https://openai.com/index/introducing-operator/">OpenAI's Operator</a></strong> was released as a research preview in January 2025. It is an experimental browser automation agent, similar to Computer Use and Project Mariner, that can navigate websites and perform tasks like online shopping, travel booking, and research.</p><h2>Common Traits &amp; Limitations</h2><p>When looking at the current crop of digital AI agents, and especially the newer level 3 and 4 agents, there are some common traits that they all share.</p><p>- The more sophisticated digital agents are built on top of large reasoning models that have more advanced multi-step planning capabilities.</p><p>- These models all have large context windows, enabling them to maintain coherent understanding across complex, extended interactions.</p><p>- All the level 3 agents are text-based whereas all the level 4 agents are multimodal.</p><p>We are yet to see the emergence of true multimodal large reasoning models, and I expect that this is what we will need before we see true level 4 agents that are ready for prime time.</p><p>Despite the significant progress that we're seeing with AI agents, there are still common challenges that exist. The biggest one is reliability, especially amongst the level 4 agents we're starting to see. I think that this is the main barrier that currently sits between the semi-autonomous agents we're currently seeing and a future state of more autonomous agents. The reality is that until agents are 95%+ reliable at 95%+ of tasks it's very difficult to see how people will trust them enough to perform tasks autonomously.</p><p>The other major limitation holding back AI agents from reaching level 5 and beyond is integrations. 
I've <a href="https://www.the-blueprint.ai/p/beyond-chatbots-integrations">written extensively about this topic</a> before in my <a href="https://www.the-blueprint.ai/p/beyond-chatbots-a-blueprint-for-llms">Beyond Chatbots series</a>. The reason we're seeing level 4 agents that 'operate computers like humans do' is that there isn't a good alternative for AI agents at the moment. The reality is that 'operating computers like humans do' is inefficient, prone to error, and increases the chance of AI agents coming up against problems that they can't solve on their own. I see this current trend of 'computer use' as a short-term stop-gap and, longer term, a fall-back mechanism for AI agents when they come across platforms that they can't interact with via an API.</p><p>To solve this integrations challenge there needs to be a big lift to build out API ecosystems and make them more suitable for integration with AI systems. There need to be more APIs for more platforms: more read-only APIs that allow AI agents to access data, and more write-access APIs that allow AI agents to take actions on behalf of their users. I believe this will come in time: as more people adopt AI technologies, demand for a more mature API ecosystem will grow, as will the incentives for platform owners to build out their APIs. Only time will tell.</p><h2>Future Trajectory</h2><p>So where do AI agents go next? There are some areas that I have a high degree of certainty will improve in the short term, and there are some areas that I am confident will improve, but will take a bit of time.</p><p>In the short term (i.e. later this year) we will absolutely see more advanced and more capable text-based large reasoning models that power AI agents. We're highly likely to see large reasoning models with more multimodal capabilities and I'm also expecting to see larger context windows and longer-term memory that will enable digital AI agents to plan better and execute more complicated workflows. 
We'll also see reduced latency amongst the level 4 agents, and I expect them to exit research preview/public beta and be more widely released. So I'm confident that we will see fully released level 4 AI agents before the end of the year.</p><p>The area that is going to take a bit of time to improve is the API ecosystem. As things stand, I think the Pareto principle (80/20 rule) is at play: enhancing the APIs of just 20% of digital platforms would cover 80% of use cases. Unfortunately, the issue we're likely to see is that these 20% of digital platforms are the 'walled gardens' (Google, Meta, Apple, Amazon etc.) that are least incentivised to open up their APIs to third-party digital AI agents, as they are developing their own AI systems.</p><p>With the other 80% of digital platforms, it's just going to take time. They will have bigger incentives than the 'walled gardens' to build out their APIs for digital AI agents, but there's just a larger volume of them, and each of them is smaller and will therefore have less of an impact. Interestingly, I think this is where OpenAI and Anthropic could play a big role by working with these digital platforms to help them build out their API infrastructure. This could work in a similar way to how OpenAI is partnering with premium publishers and incentivising them to share their content with its models.</p><h2>The Impact on Digital Marketing</h2><p>The emergence and adoption of digital AI agents is going to fundamentally change how brands connect with consumers online. Digital marketing has always been inextricably linked to the development of the web itself, with each new era requiring big changes in how brands promote themselves. 
Web 4.0 presents what could be the biggest shift yet in how digital marketing operates and will require a comprehensive rethinking of digital marketing strategies.</p><p>With Web 1.0, websites essentially functioned as digital brochures, and digital marketing was focused on one-way communication. This meant that brands could easily transfer their traditional broadcast marketing models online. When Web 2.0 emerged in the mid-oughts, social platforms transformed marketing into more of a two-way conversation, with user-generated content becoming more central to brand narratives. We haven't seen Web 3.0 technologies have as much of a direct impact on digital marketing as previous eras, but its more decentralised nature has led to the rise of cross-platform identity tracking, data-driven programmatic advertising, and the increasing importance of first-party data and associated privacy considerations.</p><p>We're now entering Web 4.0, the Agentic Web, where digital marketing will be defined less by searches, clicks, and visits and more by AI-mediated consumer interactions, the need for API-first marketing architectures, and increasingly human-not-in-the-loop purchase decisions. As digital AI agents become more prominent and are more widely adopted by consumers over the next couple of years, we are going to have to get used to the idea of seeing less human traffic online. This will result in fewer human web searches, fewer links being clicked on, and fewer website visits by humans. These have long been the staples of digital marketing strategies, which is why I believe that Web 4.0 will present the biggest shift yet in how digital marketing operates.</p><p>Instead of digital marketing strategies being dominated by human metrics such as searches, clicks, and visits, we will see the increasing importance of strategies that cater not just for humans, but for agents and the large reasoning models that power them. 
When considering large reasoning models, digital marketers will need to ensure that their marketing content is well represented in those models' training data and develop new techniques for optimising this against their competitors. There will also need to be an evolution of SEO strategies to optimise marketing content for digital AI agents searching the web, and direct-to-consumer brands will need to think about how they show up in the decision-making processes of digital AI agents. Part of this will be the development of agent-first APIs as we see an increase in human-not-in-the-loop purchases.</p><p>Digital marketers will also need to develop new approaches to measurement and analytics as searches, clicks and visits become less important and the 'purchase funnel' as we know it becomes obsolete. Macro modelling techniques like Econometrics should fare well, but more granular approaches like Attribution will need rethinking from the bottom up. Instead of brands optimising digital activity based on granular metrics as they have for the last decade, decision-making will need to become more strategic and sophisticated. Digital marketers will need to understand not just what digital AI agents do but why they do it - similar to how SEO evolved from simple keyword optimisation to understanding search intent and semantic relationships.</p><p>The emergence of AI agents means we need to think not just about measurement but about how to structure and present marketing content in ways that agents can reliably process and evaluate. This means considering everything from API design to data architecture to content structure through a new strategic lens.</p><h2>Preparing for the Agentic Future</h2><p>The transition to Web 4.0 represents a fundamental shift in how consumers will interact with our digital ecosystems. 
Brands that want to thrive in the Agentic Web will need to start preparing now, developing both the technical capabilities and strategic mindset required for an agent-mediated digital ecosystem. It is going to require a rethink of how we approach digital engagement at every level.</p><p>At a foundational level, digital marketers will need to evolve the technical infrastructure of their online presence. However, it won't just be about creating API endpoints for digital AI agents to access, but also about reimagining a brand's entire digital presence through the lens of large reasoning models and digital AI agents. This means developing both new content and new data strategies that can simultaneously support human engagement, the training of large reasoning models, and digital AI agent interactions.</p><p>The key to success for digital marketers will be maintaining a balance between optimising for human engagement and developing new capabilities for large reasoning model and digital AI agent interaction. The reality is that human-centric digital marketing will remain crucial even as agent-mediated interactions grow in importance. Brands will need to expand existing digital marketing strategies, not replace them.</p><p>Just as the move from Web 1.0 to Web 2.0 didn't eliminate the need for effective websites but added social capabilities on top, the transition to Web 4.0 won't eliminate human-focused digital marketing but will add new AI-focused layers to it. Success will require sophisticated strategies that can simultaneously serve human users, large reasoning models, and digital AI agents effectively.</p><p>The key is to start experimenting with these parallel approaches now, learning from early implementations while maintaining excellence in traditional channels. 
Web 4.0 isn't about replacing the old with the new - it's about broadening our conception of what digital marketing can and should be.</p><p>As we look ahead, the Agentic Web will likely bring opportunities we can't yet imagine. The combination of increasingly sophisticated large reasoning models, more capable digital AI agents, and evolving API ecosystems will enable new forms of digital marketing that go beyond our current thinking. The brands that thrive will be those that maintain the agility to capitalise on these opportunities while ensuring their core, human-centric digital marketing capabilities remain strong.</p><div><hr></div><blockquote><p><em>&#8220;The future is already here, it&#8217;s just not evenly distributed.&#8221;</em></p><p><strong>William Gibson</strong></p></blockquote>]]></content:encoded></item><item><title><![CDATA[Beyond Chatbots: Collaboration]]></title><description><![CDATA[Redefining How We Work with Technology]]></description><link>https://www.the-blueprint.ai/p/beyond-chatbots-collaboration</link><guid isPermaLink="false">https://www.the-blueprint.ai/p/beyond-chatbots-collaboration</guid><dc:creator><![CDATA[Sean 🤓]]></dc:creator><pubDate>Tue, 01 Oct 2024 09:02:21 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!yKJ6!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42960e49-378c-488c-a1d2-9fb24b55ab98_1792x1024.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!yKJ6!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42960e49-378c-488c-a1d2-9fb24b55ab98_1792x1024.webp" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://substackcdn.com/image/fetch/$s_!yKJ6!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42960e49-378c-488c-a1d2-9fb24b55ab98_1792x1024.webp 424w, https://substackcdn.com/image/fetch/$s_!yKJ6!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42960e49-378c-488c-a1d2-9fb24b55ab98_1792x1024.webp 848w, https://substackcdn.com/image/fetch/$s_!yKJ6!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42960e49-378c-488c-a1d2-9fb24b55ab98_1792x1024.webp 1272w, https://substackcdn.com/image/fetch/$s_!yKJ6!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42960e49-378c-488c-a1d2-9fb24b55ab98_1792x1024.webp 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!yKJ6!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42960e49-378c-488c-a1d2-9fb24b55ab98_1792x1024.webp" width="1456" height="832" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/42960e49-378c-488c-a1d2-9fb24b55ab98_1792x1024.webp&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:832,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:487218,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/webp&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!yKJ6!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42960e49-378c-488c-a1d2-9fb24b55ab98_1792x1024.webp 424w, https://substackcdn.com/image/fetch/$s_!yKJ6!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42960e49-378c-488c-a1d2-9fb24b55ab98_1792x1024.webp 848w, https://substackcdn.com/image/fetch/$s_!yKJ6!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42960e49-378c-488c-a1d2-9fb24b55ab98_1792x1024.webp 1272w, https://substackcdn.com/image/fetch/$s_!yKJ6!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F42960e49-378c-488c-a1d2-9fb24b55ab98_1792x1024.webp 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>So far in the Beyond Chatbots series, we've explored how personalisation, integrations, proactivity, personality, and fact-checking can transform simple AI chatbots into sophisticated digital companions. Next, we're turning our attention to an important area that will establish more advanced capabilities: collaboration. As digital companions become more deeply integrated into our lives and work, the ability to work collaboratively with digital companions &#8211; and for digital companions to collaborate with each other &#8211; will be essential.</p><p>How we work with generative AI tools and platforms is evolving rapidly, as evidenced by OpenAI's recent unveiling of their o1 model, a real breakthrough in AI's ability to plan and reason. The pace of change is accelerating, and it's important that we examine how these advancements will reshape our relationship with technology and redefine the very nature of collaboration.</p><h2><strong>&#129309; </strong>The Current State of Human-AI Collaboration</h2><p>We've come a long way in the last couple of years in the capabilities of large language models, but they are still limited in the ways that we can interact and work with them. Most of today's chatbots operate in a reactive, query-response mode. 
While they can provide information and perform some basic tasks, they often struggle with:</p><ol><li><p>Maintaining context over extended interactions</p></li><li><p>Understanding and adapting to user goals and working styles</p></li><li><p>Engaging in complex, multi-step problem-solving without explicit guidance (with the exception of o1!)</p></li></ol><p>Despite some of these limitations, we're seeing glimpses of generative AI's potential for more dynamic collaboration:</p><ul><li><p><strong>Coding Assistance</strong>: Tools like GitHub Copilot offer context-aware code suggestions, helping developers work more efficiently.</p></li><li><p><strong>Writing and Editing</strong>: Generative AI writing assistants can now help with everything from grammar correction to style suggestions, acting as writing companions.</p></li><li><p><strong>Data Analysis</strong>: There are Generative AI tools that are increasingly able to assist in interpreting complex datasets, spotting trends, and generating visualisations.</p></li></ul><p>While these tools start to demonstrate the potential of AI collaboration, they're still largely operating as assistants rather than true collaborators.</p><p>As Generative AI capabilities advance, so too does the potential for more collaboration. The recent release of OpenAI's o1 model is a great example of how quickly the field is advancing. 
o1 represents a significant improvement in Generative AI reasoning capabilities:</p><ul><li><p><strong>"Thinking" Before Answering</strong>: Unlike previous models that generate responses immediately, o1 can spend time reasoning through complex problems before providing an answer.</p></li><li><p><strong>Advanced Problem-Solving</strong>: In tests, o1 has demonstrated PhD-level performance in fields like physics, chemistry, and biology, as well as advanced abilities in mathematics and coding.</p></li></ul><p>As advanced Generative AI systems become more capable of complex reasoning and problem-solving, we need to develop new ideas for human-AI interaction that go beyond simple query-response patterns. As I wrote in <a href="https://www.the-blueprint.ai/p/o1-the-next-step-in-conversational">my coverage of o1's launch</a>, sometimes its responses can be lengthy and overwhelming and it tends to shoot for the full answer to a problem in one shot instead of collaborating with the user on the problem and allowing space for iterating together over solutions.</p><p>As generative AI technologies become more advanced, the challenge will lie in creating better collaborative frameworks that allow humans and AI to work together effectively, leveraging the strengths of both.</p><h2>&#129302; Human-AI Collaboration: A new frontier</h2><p>As we've seen with models like OpenAI's o1, we're rapidly approaching the start of more sophisticated human-AI collaboration. Over the next few years, advanced AI systems will start to become true partners in our work and creative processes.</p><p>The potential here is huge. With more advanced human-AI collaboration we will be able to achieve more, faster, and to a higher quality. 
Imagine a novelist working alongside a digital companion that suggests plot twists and character developments based on the writer's unique style, or a team of scientists collaborating with a digital companion to analyse vast datasets, propose hypotheses, and help design experiments.</p><p>The current trajectory and evolution of AI capabilities will push us beyond the current assistant-based model towards more sophisticated collaboration. For this to be realised, digital companions will need to be able to engage in prolonged, context-aware collaborations, understand and align with user goals, and contribute proactively to problem-solving and creative processes. They'll need to seamlessly integrate with human workflows across various domains, adapting their collaborative style based on individual preferences and specific project contexts.</p><p>As AI capabilities expand across text, voice, and video, collaborating will become increasingly natural and intuitive, mimicking human-to-human interaction. For example, next year we'll probably see the first generative AI model we can interact with on a video call. Beyond this, we will see digital companions taking on more and more sub-tasks or projects independently, requiring humans to become comfortable with a degree of "hands-off" collaboration. </p><p>However, as our reliance on AI collaboration grows, building and maintaining trust becomes increasingly important. This will involve ensuring transparency in the decision-making of digital companions and aligning their behaviour with our values and ethics. 
Digital companions will have to be able to <a href="https://www.the-blueprint.ai/p/beyond-chatbots-fact-checking">check facts</a>, clearly communicate their limitations and uncertainties, and adapt their approach based on the needs of each unique collaboration.</p><p>While there are still challenges to face in developing more sophisticated digital companions, we're opening up new possibilities for how we work, create, and solve problems. The key will be in creating better collaborative interfaces that allow humans and their digital companions to work together effectively, leveraging the strengths of both to tackle complex challenges and push the boundaries of what's possible.</p><h2>&#129470; AI-AI Collaboration: The Next Frontier</h2><p>As we progress towards more advanced human-AI collaboration, there is also the potential for AI-AI collaboration and multiple digital companions working together. This could open up new possibilities for tackling complex, multi-faceted problems and could dramatically reshape our approach to large-scale projects and decision-making processes.</p><p>AI-AI collaboration will involve multiple digital companions, potentially with different specialisations or capabilities, working together to achieve common goals. This could involve task division, knowledge sharing, and consensus building among them. The benefits of such collaborations could be significant, enhancing problem-solving capabilities, increasing efficiency through parallel processing, improving accuracy via cross-checking, and offering greater scalability and adaptability for large-scale projects.</p><p>However, orchestrating collaboration between multiple AI systems will pose several challenges. These include establishing effective communication protocols, developing mechanisms for conflict resolution, and ensuring transparency and accountability in decision-making processes.</p><p>The applications of AI-AI collaboration could span a wide range of fields. 
In scientific research, multiple AI agents could work on different aspects of complex problems like climate modelling or drug discovery, sharing insights and collaborating on solutions. For urban planning, AI systems specialising in various aspects like traffic flow, energy usage, and population dynamics could work together to design more efficient and sustainable cities. In healthcare, multiple AI systems could collaborate on patient diagnosis, treatment planning, and drug interaction analysis, each bringing specialised medical knowledge to the task.</p><p>While the potential of AI-AI collaboration is exciting, it also raises important concerns about autonomy and oversight. As AI systems become more capable of working together, there's a risk of decisions being made at speeds and complexities beyond human comprehension or intervention. This could lead to unintended consequences or actions that don't align with human values. There are also questions about accountability: if multiple AI systems collaborate on a task that produces harmful outcomes, how do we attribute responsibility? Over the next few years we're going to have to build robust monitoring systems, clear ethical guidelines, and mechanisms for human intervention. Striking the right balance between leveraging the power of AI-AI collaboration and maintaining appropriate human oversight will be a key challenge as the AI technology continues to develop.</p><p>The introduction of more sophisticated AI models, like OpenAI's o1, opens up some really interesting possibilities for AI-AI collaboration. An advanced reasoning model, like future versions of o1, could act as a "manager," breaking down complex problems and coordinating the efforts of more specialised AI agents. 
The ability of models like o1 to "think before answering" could lead to more efficient and meaningful communication between AI agents, reducing noise and focusing on relevant information exchange.</p><p>I can imagine a future where networks of AI agents, including advanced reasoning models, work together seamlessly to tackle the world's most pressing challenges. This could lead to global problem-solving platforms dedicated to addressing complex issues like climate change or pandemic response. We will probably see the emergence of hybrid human-AI teams, where humans collaborate not just with individual digital companions, but with interconnected networks of AI specialists.</p><p>The possibilities of AI-AI collaboration are both exciting and daunting. It will unlock new levels of problem-solving capability and efficiency but will also challenge us to think carefully about how we design, implement, and govern these powerful systems to ensure they align with human values and interests.</p><h2>&#9999;&#65039; Implementing Collaborative Features in Digital Companions</h2><p>So, how do we actually design and build these collaborative features in digital companions? The big challenge lies in creating AI systems that can seamlessly work alongside humans by understanding context, managing tasks, and adapting to individual user needs.</p><p>To evolve from our current generation of chatbots into collaborative digital companions, several key features will need to be implemented. These include better context awareness through more persistent memory systems, task management capabilities for breaking down complex projects, and knowledge sharing that <a href="https://www.the-blueprint.ai/p/beyond-chatbots-integrations">integrates</a> various information sources. 
Digital companions will need to <a href="https://www.the-blueprint.ai/p/beyond-chatbots-personalisation">personalise</a> their interaction style based on user preferences, offer <a href="https://www.the-blueprint.ai/p/beyond-chatbots-proactivity">proactive</a> assistance by anticipating user needs, and support multimodal interaction across text, voice, and visual inputs.</p><p>Integrating digital companions into our existing workflows will require connecting them seamlessly with our existing digital ecosystems. This will mean developing robust APIs and integrations with productivity suites, project management tools, communication platforms, and knowledge bases. Imagine a digital companion that can access your documents, manage your tasks in Jira, participate in Slack discussions, and even assist with code reviews on GitHub &#8211; all while maintaining a coherent understanding of your personal work context.</p><p>As digital companions become more integrated and capable, ensuring user oversight and transparency becomes incredibly important. This will require digital companions to be able to better explain their reasoning (much like o1 now does), to seek user confirmation for significant actions, and to provide detailed activity logs. Users should have clear visibility into how their data is being used and stored, with the ability to adjust how proactive or autonomous their digital companion is in different contexts.</p><p>Ethical AI development will need to be front and centre in the building of these new collaborative capabilities, with guidelines hard-coded into the decision-making of digital companions. Alongside this, better training and onboarding will be essential to help people understand and effectively use these advanced collaborative features.</p><p>By thoughtfully addressing these challenges and implementing these collaborative features, we can create digital companions that truly enhance our work processes while respecting user control and privacy. 
As AI technology continues to advance at pace, the potential for human-AI collaboration grows, which will push us beyond the limitations of simple chatbots and into a new era of true digital companions.</p><h2><strong>&#127937; </strong>Conclusion: The Collaborative Future</h2><p>As we've explored throughout the <a href="https://www.the-blueprint.ai/p/beyond-chatbots-a-blueprint-for-llms">Beyond Chatbots</a> series, the evolution from simple chatbots to sophisticated digital companions will bring about a big shift in our relationship with technology. The introduction of models like OpenAI's o1, with its advanced reasoning capabilities, has already accelerated this transformation. We're at the start of a new era of human-AI collaboration, which promises to amplify our problem-solving abilities, boost creativity, and help tackle some of the world's most pressing challenges.</p><p>The potential applications of better human-AI collaboration are incredibly exciting, from scientists collaborating with AI to accelerate research, to educators creating personalised learning experiences, to global teams addressing complex issues like climate change. However, there are significant challenges and ethical considerations. 
The next few years will require not just technological innovation but also careful consideration of the societal, ethical, and individual impact of these advanced AI technologies. For example:</p><ol><li><p>How will we balance efficiency gains with maintaining human skills and agency?</p></li><li><p>What new forms of literacy will be necessary for effective human-AI collaboration?</p></li><li><p>How can we ensure equitable distribution of the benefits of advanced digital companions?</p></li><li><p>What governance structures and ethical frameworks should guide the development and deployment of these technologies?</p></li><li><p>How might widespread human-AI collaboration change our understanding of creativity, problem-solving, and human potential?</p></li></ol><p>These questions don't have easy answers, but they will shape the future of human-AI interaction. As we move forward, it's important that we approach the development of collaborative digital companions with a combination of excitement and responsible caution.</p><p>The journey beyond chatbots is just beginning, and the future of collaboration between humans and AI is limited only by our imagination and our commitment to developing these technologies responsibly and ethically. 
But one thing is clear: the potential for human-AI collaboration to transform our world is immense, and the adventure is only just beginning.</p><div><hr></div><blockquote><p><em>&#8220;The future is already here, it&#8217;s just not evenly distributed.&#8221;</em></p><p><strong>William Gibson</strong></p></blockquote>]]></content:encoded></item><item><title><![CDATA[Beyond Chatbots: Fact-Checking]]></title><description><![CDATA[Ensuring Reliability in Digital Companions]]></description><link>https://www.the-blueprint.ai/p/beyond-chatbots-fact-checking</link><guid isPermaLink="false">https://www.the-blueprint.ai/p/beyond-chatbots-fact-checking</guid><dc:creator><![CDATA[Sean 🤓]]></dc:creator><pubDate>Thu, 05 Sep 2024 08:01:29 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!BCCN!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79b5642c-a5ca-4262-8f3b-76a95996e821_1792x1024.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!BCCN!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79b5642c-a5ca-4262-8f3b-76a95996e821_1792x1024.webp" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!BCCN!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79b5642c-a5ca-4262-8f3b-76a95996e821_1792x1024.webp 424w, https://substackcdn.com/image/fetch/$s_!BCCN!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79b5642c-a5ca-4262-8f3b-76a95996e821_1792x1024.webp 848w, 
https://substackcdn.com/image/fetch/$s_!BCCN!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79b5642c-a5ca-4262-8f3b-76a95996e821_1792x1024.webp 1272w, https://substackcdn.com/image/fetch/$s_!BCCN!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79b5642c-a5ca-4262-8f3b-76a95996e821_1792x1024.webp 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!BCCN!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79b5642c-a5ca-4262-8f3b-76a95996e821_1792x1024.webp" width="1456" height="832" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/79b5642c-a5ca-4262-8f3b-76a95996e821_1792x1024.webp&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:832,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:662330,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/webp&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!BCCN!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79b5642c-a5ca-4262-8f3b-76a95996e821_1792x1024.webp 424w, https://substackcdn.com/image/fetch/$s_!BCCN!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79b5642c-a5ca-4262-8f3b-76a95996e821_1792x1024.webp 848w, 
https://substackcdn.com/image/fetch/$s_!BCCN!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79b5642c-a5ca-4262-8f3b-76a95996e821_1792x1024.webp 1272w, https://substackcdn.com/image/fetch/$s_!BCCN!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F79b5642c-a5ca-4262-8f3b-76a95996e821_1792x1024.webp 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>In the Beyond Chatbots series so far, we've explored how <a 
href="https://www.the-blueprint.ai/p/beyond-chatbots-personalisation">personalisation</a>, <a href="https://www.the-blueprint.ai/p/beyond-chatbots-integrations">integrations</a>, <a href="https://www.the-blueprint.ai/p/beyond-chatbots-proactivity">proactivity</a>, and <a href="https://www.the-blueprint.ai/p/beyond-chatbots-personality">personality</a> can transform simple generative AI chatbots into more sophisticated digital companions. However, as these systems become more integrated into our lives, reliability becomes increasingly important. In this post, I&#8217;ll go into the crucial role of fact-checking in ensuring digital companions are not just intelligent, but also trustworthy.</p><p>The need for robust fact-checking mechanisms in generative AI systems has never been more pressing. As we rely more on AI for information and decision-making support, the potential impact of misinformation grows exponentially. From influencing personal choices to shaping public opinion, the consequences of unreliable AI can be far-reaching and profound.</p><p>In this post, I&#8217;ll explore the persistent challenge of generative AI hallucinations, examine current approaches to AI-assisted search and fact-checking, and discuss strategies for implementing effective fact-checking in digital companions. We'll also consider the challenges and ethical considerations in this important area of AI development.</p><h2>&#129300; The Persistent Challenge of AI Hallucinations</h2><p>One of the most significant challenges in developing reliable AI systems is hallucinations. Unlike traditional technology systems that store and retrieve information, Large Language Models (LLMs) function more like human memory - they generate responses based on patterns learned from vast amounts of training data. 
This approach allows for impressive flexibility and creativity but also introduces a propensity for errors.</p><blockquote><p><em>"AI hallucinations are not just a temporary glitch, but a fundamental challenge rooted in how LLMs process and generate information."</em></p></blockquote><p>Hallucinations occur when a generative AI model generates information that seems plausible but is factually incorrect or entirely fabricated. These can range from minor inaccuracies to completely false statements. For example, an AI might confidently state that "The Eiffel Tower was built in 1896" when it was actually completed in 1889, or it might invent a non-existent historical event.</p><p>The impact of these hallucinations can be significant:</p><ol><li><p><strong>Erosion of Trust</strong>: When users encounter incorrect information, it can quickly erode their trust in the AI system and, by extension, the company or service providing it.</p></li><li><p><strong>Spread of Misinformation</strong>: Online, information spreads rapidly, so AI-generated misinformation can quickly proliferate, potentially influencing public opinion or decision-making.</p></li><li><p><strong>Potential for Harm</strong>: In critical applications like healthcare or financial advice, hallucinations could lead to harmful decisions or actions.</p></li><li><p><strong>Increased Cognitive Load</strong>: Users may feel the need to fact-check every AI response, defeating the purpose of using AI as a time-saving tool.</p></li></ol><p>Recent high-profile incidents have highlighted the severity of this issue. In one alarming case <a href="https://www.washingtonpost.com/technology/2023/11/16/chatgpt-lawyer-fired-ai/">reported by The Washington Post</a>, a young lawyer named Zachariah Crabill used ChatGPT to write a legal motion, only to discover that the AI had fabricated several fake lawsuit citations. 
This error led to Crabill being reported to a statewide office for attorney complaints and ultimately losing his job. In another incident, Google's search AI incorrectly claimed that <a href="https://futurism.com/google-search-ai-melt-eggs">eggs could be melted</a>, citing information generated by ChatGPT on Quora as its source. This misinformation was briefly featured in Google's search results, demonstrating how AI hallucinations can propagate across platforms and potentially mislead users on a large scale.</p><p>These incidents underscore the critical need for robust fact-checking mechanisms in AI systems, especially as they evolve into more sophisticated digital companions. In the next section, we'll explore some current approaches to addressing this challenge in AI-assisted search and information retrieval.</p><h2><strong>&#129309; The Need for Reliable Digital Companions</strong></h2><p>As we develop new generative AI features that take us from simple chatbots to more sophisticated digital companions, the need for reliability becomes incredibly important. Digital companions won&#8217;t be just novelty tools; they will increasingly become integral parts of our daily lives, influencing decisions and shaping our understanding of the world. This transition brings with it an increasing responsibility to ensure that the information they provide is accurate and trustworthy.</p><blockquote><p><em>"The evolution of digital companions from novelty to necessity requires their reliability and trustworthiness to evolve in parallel.&#8221;</em></p></blockquote><p>The relationship between users and their digital companions will be built on a foundation of trust. As they become more sophisticated and are entrusted with more tasks, maintaining this trust becomes not just important, but essential. 
Users need to feel confident that when they turn to their digital companion for support in decision-making, whether it's for personal, financial, or educational purposes, the guidance they receive is factual and grounded in reality. For example:</p><ol><li><p><strong>Trust and Credibility</strong>: For digital companions to be truly useful, users need to trust them. Consistent accuracy builds this trust, while repeated errors or misinformation can quickly erode it. As these AI systems become more sophisticated and are used for more critical tasks, maintaining trust and credibility becomes even more essential.</p></li><li><p><strong>Decision Support</strong>: In the future, many users will rely on digital companions for information that informs important decisions, whether personal, professional, or financial. Inaccurate information could lead to poor choices with real-world consequences.</p></li><li><p><strong>Combating Misinformation</strong>: Because misinformation spreads rapidly online, digital companions have the potential to be powerful tools for fact-checking and truth-seeking for their users &#8211; but only if they themselves are reliable.</p></li><li><p><strong>Ethical Responsibility</strong>: There is also an ethical obligation to ensure that digital companions do more good than harm. This includes making sure they're not inadvertently spreading false information.</p></li></ol><p>As we continue to discover what's possible with generative AI technologies, we must remember that the true power of digital companions will lie not just in their ability to process and generate information quickly, but in their ability to do so reliably. 
Unless we prioritise accuracy and implement effective fact-checking strategies, we won&#8217;t be able to unlock the full potential of digital companions and turn them into trusted partners in our daily lives.</p><h2><strong>&#129488; Current Approaches to AI-Assisted Search and Fact-Checking</strong></h2><p>As the challenges of AI hallucinations become more apparent, big tech companies and new startups are developing novel approaches to AI-assisted search and fact-checking. These efforts aim to combine the power of large language models with more traditional information retrieval methods to provide accurate, verifiable information. The goal is to address the limitations of current chatbots and search engines, particularly in terms of reliability and transparency. However, these approaches are not without their own challenges and controversies:</p><div class="native-video-embed" data-component-name="VideoPlaceholder" data-attrs="{&quot;mediaUploadId&quot;:&quot;22e4d908-5d19-4662-8b4b-0d95c2a3a33e&quot;,&quot;duration&quot;:null}"></div><p><strong>1. Google Gemini&#8217;s Double-Check Responses</strong></p><p>Google&#8217;s Gemini has a fantastic feature called "Double-check Responses," which represents a big step forward in empowering users to verify information themselves. This feature uses Google's search capabilities to corroborate or challenge Gemini&#8217;s responses, providing users with a transparent and interactive fact-checking experience. 
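At its core, a double-check loop like this is simple to sketch: split the model's answer into statements, compare each against retrieved search snippets, and flag the ones the snippets support. In the sketch below, word overlap is a deliberately crude stand-in for whatever similarity model Google actually uses, and both function names are hypothetical; detecting outright contradictions (Gemini's orange highlights) would additionally require an entailment model, which word overlap cannot provide.

```python
def word_overlap(statement: str, snippet: str) -> float:
    """Fraction of the statement's words that also appear in the snippet."""
    words = set(statement.lower().split())
    return len(words & set(snippet.lower().split())) / max(len(words), 1)

def double_check(statements, snippets, threshold=0.6):
    """Label each statement 'supported' (cf. Gemini's green highlight)
    or 'unverified' (not enough corroborating text was found)."""
    labels = {}
    for s in statements:
        best = max((word_overlap(s, sn) for sn in snippets), default=0.0)
        labels[s] = "supported" if best >= threshold else "unverified"
    return labels
```

The value of surfacing these labels to the user, rather than silently filtering, is that it keeps the human in the verification loop.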
Key aspects of the Double-check Responses feature include:</p><ul><li><p>A "Double-check response" button that users can click after receiving an answer from Gemini.</p></li><li><p>Utilisation of Google search to find content that's likely similar to or different from Gemini's response.</p></li><li><p>Color-coded highlighting of Gemini's response:</p><ul><li><p>Green highlights indicate statements likely similar to Google's search results, with accompanying links.</p></li><li><p>Orange highlights show statements likely different from Google's search results, also with links.</p></li><li><p>Unhighlighted parts indicate insufficient information from search results to evaluate, or non-factual statements.</p></li></ul></li></ul><p>This approach is particularly powerful because it allows users to easily compare AI-generated content with web sources. It also provides transparency about Gemini&#8217;s potential limitations or inaccuracies and leverages Google's search capabilities to fact-check AI-generated content in real-time.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!S1nR!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0a1eed3e-49a9-4645-bfac-1e3b70846825_1125x1758.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!S1nR!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0a1eed3e-49a9-4645-bfac-1e3b70846825_1125x1758.jpeg 424w, https://substackcdn.com/image/fetch/$s_!S1nR!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0a1eed3e-49a9-4645-bfac-1e3b70846825_1125x1758.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!S1nR!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0a1eed3e-49a9-4645-bfac-1e3b70846825_1125x1758.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!S1nR!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0a1eed3e-49a9-4645-bfac-1e3b70846825_1125x1758.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!S1nR!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0a1eed3e-49a9-4645-bfac-1e3b70846825_1125x1758.jpeg" width="450" height="703.2" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/0a1eed3e-49a9-4645-bfac-1e3b70846825_1125x1758.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1758,&quot;width&quot;:1125,&quot;resizeWidth&quot;:450,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Image&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Image" title="Image" srcset="https://substackcdn.com/image/fetch/$s_!S1nR!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0a1eed3e-49a9-4645-bfac-1e3b70846825_1125x1758.jpeg 424w, https://substackcdn.com/image/fetch/$s_!S1nR!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0a1eed3e-49a9-4645-bfac-1e3b70846825_1125x1758.jpeg 848w, 
https://substackcdn.com/image/fetch/$s_!S1nR!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0a1eed3e-49a9-4645-bfac-1e3b70846825_1125x1758.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!S1nR!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F0a1eed3e-49a9-4645-bfac-1e3b70846825_1125x1758.jpeg 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p><strong>2. 
Google's AI Overviews</strong></p><p>Google, the dominant search platform, is adapting to the AI era with its AI Overviews feature. This system generates AI-powered summaries for search queries, but with a crucial focus on source attribution and transparency. Recent updates to AI Overviews include:</p><ul><li><p>A new display that shows cited webpages more prominently to the right of the AI-generated summary.</p></li><li><p>Experimentation with attaching links directly within the text of AI Overviews.</p></li><li><p>The ability for users to save AI Overviews for future reference.</p></li></ul><p>Google's approach emphasises the importance of allowing users to easily verify the sources of information. By making the origins of claims more transparent, this potentially reduces the impact of hallucinations and increases user trust.</p><p>However, Google's AI Overviews have faced significant challenges. According to reports from <a href="https://www.nytimes.com/2024/05/24/technology/google-ai-overview-search.html">The New York Times</a> and <a href="https://www.theatlantic.com/technology/archive/2024/06/google-ai-overview-libel/678751/">The Atlantic</a>, the system has generated numerous false or misleading statements. These range from harmless but bizarre claims (such as recommending glue as a pizza ingredient) to potentially dangerous misinformation (like suggesting the consumption of rocks for nutrition). These incidents have undermined trust in Google's search capabilities and highlighted the ongoing challenges of hallucinations and implementing generative AI in search technologies. 
Given Google's dominant position in the search market, these issues are particularly concerning and could have far-reaching implications for how people access and trust information online.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!aTVO!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F47423a94-7d75-429b-aa21-fde7313cb378_2878x1412.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!aTVO!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F47423a94-7d75-429b-aa21-fde7313cb378_2878x1412.png 424w, https://substackcdn.com/image/fetch/$s_!aTVO!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F47423a94-7d75-429b-aa21-fde7313cb378_2878x1412.png 848w, https://substackcdn.com/image/fetch/$s_!aTVO!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F47423a94-7d75-429b-aa21-fde7313cb378_2878x1412.png 1272w, https://substackcdn.com/image/fetch/$s_!aTVO!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F47423a94-7d75-429b-aa21-fde7313cb378_2878x1412.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!aTVO!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F47423a94-7d75-429b-aa21-fde7313cb378_2878x1412.png" width="1456" height="714" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/47423a94-7d75-429b-aa21-fde7313cb378_2878x1412.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:714,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Perplexity AI: Conversational AI search engine to get answers instantly |  Deepgram&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Perplexity AI: Conversational AI search engine to get answers instantly |  Deepgram" title="Perplexity AI: Conversational AI search engine to get answers instantly |  Deepgram" srcset="https://substackcdn.com/image/fetch/$s_!aTVO!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F47423a94-7d75-429b-aa21-fde7313cb378_2878x1412.png 424w, https://substackcdn.com/image/fetch/$s_!aTVO!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F47423a94-7d75-429b-aa21-fde7313cb378_2878x1412.png 848w, https://substackcdn.com/image/fetch/$s_!aTVO!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F47423a94-7d75-429b-aa21-fde7313cb378_2878x1412.png 1272w, https://substackcdn.com/image/fetch/$s_!aTVO!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F47423a94-7d75-429b-aa21-fde7313cb378_2878x1412.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft 
pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><strong>3. <a href="https://www.perplexity.ai">Perplexity.ai</a></strong></p><p><a href="https://www.perplexity.ai">Perplexity.ai</a> represents a new breed of AI-powered search engines built from the ground up on generative AI technologies. Unlike traditional search engines that primarily provide links to relevant websites, <a href="https://www.perplexity.ai">Perplexity.ai</a> aims to directly answer user queries using AI-generated responses. The platform combines conversational AI capabilities with web search to provide a more interactive and potentially more accurate search experience. 
<a href="https://www.perplexity.ai">Perplexity.ai</a>'s key features include:</p><ul><li><p>Generating answers using sources from the web with inline citations.</p></li><li><p>A conversational interface that allows for follow-up questions and maintains context.</p></li><li><p>Different search modes, including options to focus on academic sources or specific platforms like Reddit.</p></li></ul><p>By integrating web sources directly into its responses and providing clear citations, <a href="https://www.perplexity.ai">Perplexity.ai</a> aims to reduce hallucinations and increase the verifiability of its outputs.</p><p>However, Perplexity has faced its own set of challenges and controversies. According to reports from <a href="https://techcrunch.com/2024/07/02/news-outlets-are-accusing-perplexity-of-plagiarism-and-unethical-web-scraping/">TechCrunch</a>, the company has been accused of plagiarism and unethical web scraping practices. Publishers like Forbes and Wired have claimed that Perplexity has copied substantial portions of their articles without proper attribution. Additionally, there are concerns that Perplexity may be ignoring standard web protocols that allow websites to opt out of being crawled by bots. 
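The opt-out protocol in question is the long-standing robots.txt convention, which Python's standard library can check directly; a compliant crawler consults it before every fetch. The crawler name and rules below are made up for illustration.

```python
from urllib.robotparser import RobotFileParser

# A robots.txt that bars a hypothetical AI crawler from /articles/.
ROBOTS_TXT = """\
User-agent: ExampleAIBot
Disallow: /articles/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A well-behaved crawler performs this check before fetching a page.
print(parser.can_fetch("ExampleAIBot", "https://example.com/articles/story"))  # False
print(parser.can_fetch("ExampleAIBot", "https://example.com/about"))           # True
```

The accusation against Perplexity is essentially that checks like this were skipped or circumvented; honouring them is table stakes for any agent that browses the web on a user's behalf.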
These issues raise important questions about copyright, fair use, and the ethical use of web content in AI-powered search and summarisation tools.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!hOtB!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F944dbcd2-3f11-49ef-85c3-01b183cf9989_3018x1698.webp" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!hOtB!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F944dbcd2-3f11-49ef-85c3-01b183cf9989_3018x1698.webp 424w, https://substackcdn.com/image/fetch/$s_!hOtB!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F944dbcd2-3f11-49ef-85c3-01b183cf9989_3018x1698.webp 848w, https://substackcdn.com/image/fetch/$s_!hOtB!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F944dbcd2-3f11-49ef-85c3-01b183cf9989_3018x1698.webp 1272w, https://substackcdn.com/image/fetch/$s_!hOtB!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F944dbcd2-3f11-49ef-85c3-01b183cf9989_3018x1698.webp 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!hOtB!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F944dbcd2-3f11-49ef-85c3-01b183cf9989_3018x1698.webp" width="1456" height="819" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/944dbcd2-3f11-49ef-85c3-01b183cf9989_3018x1698.webp&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;SearchGPT is a prototype of new AI search features | OpenAI&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="SearchGPT is a prototype of new AI search features | OpenAI" title="SearchGPT is a prototype of new AI search features | OpenAI" srcset="https://substackcdn.com/image/fetch/$s_!hOtB!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F944dbcd2-3f11-49ef-85c3-01b183cf9989_3018x1698.webp 424w, https://substackcdn.com/image/fetch/$s_!hOtB!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F944dbcd2-3f11-49ef-85c3-01b183cf9989_3018x1698.webp 848w, https://substackcdn.com/image/fetch/$s_!hOtB!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F944dbcd2-3f11-49ef-85c3-01b183cf9989_3018x1698.webp 1272w, https://substackcdn.com/image/fetch/$s_!hOtB!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F944dbcd2-3f11-49ef-85c3-01b183cf9989_3018x1698.webp 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" 
height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><strong>4. OpenAI's SearchGPT</strong></p><p>OpenAI, the company behind ChatGPT, has recently introduced SearchGPT, a prototype of new AI search features. This system aims to combine the strength of generative AI models with information from the web to provide fast and timely answers with clear and relevant sources. Key features of SearchGPT include:</p><ul><li><p>Direct responses to questions using up-to-date information from the web</p></li><li><p>Clear links to relevant sources within the responses</p></li><li><p>The ability to ask follow-up questions, maintaining context throughout the conversation</p></li></ul><p>OpenAI is partnering with publishers to ensure that SearchGPT helps users discover high-quality content while respecting the original sources. 
This approach could potentially address some of the hallucination issues by grounding AI responses in verifiable, current web content.</p><p>SearchGPT uses a specific web crawler called <a href="https://platform.openai.com/docs/bots">OAI-SearchBot</a>, which is designed to link to and surface websites in search results. Importantly, OpenAI states that this crawler is not used for training their AI models, addressing some ethical concerns about data usage. SearchGPT also respects the standard robots.txt protocol, allowing website owners to control whether their content appears in results. This approach, along with publishing the IP addresses used by their crawler, demonstrates a clear commitment to transparency and ethical web crawling practices.</p><p>While these approaches represent good progress towards more reliable AI-assisted information retrieval, they are not without challenges. The incidents with Google's AI Overviews and the controversies surrounding Perplexity highlight the potential for misinformation to propagate and raise important ethical and legal questions about the use of web content in AI-powered search tools.</p><p>These current efforts in AI-assisted search provide valuable lessons for developing reliable and trustworthy digital companions:</p><ol><li><p><strong>Source Integration</strong>: Incorporating real-time information from the web can help ground AI responses in current, verifiable facts.</p></li><li><p><strong>Transparency</strong>: Clearly displaying sources and allowing users to easily access original content builds trust and enables fact-checking.</p></li><li><p><strong>Contextual Understanding</strong>: Maintaining conversation context and allowing follow-up questions can lead to more accurate and relevant responses.</p></li><li><p><strong>Varied Sources</strong>: Offering a wide variety of information sources (e.g., academic, news, etc.) 
can help users find the most appropriate information for their needs.</p></li><li><p><strong>Ethical Considerations</strong>: As we develop these technologies, we must grapple with important questions about copyright, fair use, and the ethical use of web content.</p></li><li><p><strong>Respect for Web Protocols</strong>: Adhering to standard web protocols like robots.txt and respecting publisher rights is crucial for building trust and maintaining ethical practices in AI-assisted search.</p></li></ol><p>As we develop digital companions, these ideas will provide essential building blocks for implementing effective fact-checking. However, they also highlight the ongoing challenges in balancing the power of large language models with the need for accuracy, trustworthiness, and ethical use of information.</p><h2><strong>&#129299; Strategies for Implementing Fact-Checking in Digital Companions</strong></h2><p>As we've seen from the current approaches to AI-assisted search and fact-checking, implementing reliable and trustworthy systems is a difficult challenge. However, these examples provide useful insights that we can learn from when developing digital companions. Below are four additional areas that I think will be essential for implementing effective fact-checking and building reliable and trustworthy digital companions:</p><p><strong>1. Using Integrations</strong></p><p>As I explored in the <a href="https://www.the-blueprint.ai/p/beyond-chatbots-integrations">Integrations</a> post earlier in my Beyond Chatbots series, there is a big advantage to using existing APIs to access reliable, trustworthy information online. 
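</p><p>As a sketch of what that grounding could look like in code &#8211; the source adapters and the simple majority vote below are hypothetical stand-ins for real news or database APIs:</p>

```python
# Illustrative sketch only: the "sources" are hypothetical stand-ins for real
# news/database APIs, and agreement is judged by a naive majority vote.
from collections import Counter

def cross_reference(claim, sources):
    """Ask each source about a claim; return a summary of corroboration."""
    verdicts = [source(claim) for source in sources]  # True, False, or None
    counts = Counter(v for v in verdicts if v is not None)
    return {
        "claim": claim,
        "corroborating": counts[True],
        "contradicting": counts[False],
        "supported": counts[True] > counts[False],
    }

# Hypothetical adapters; real ones would query news APIs and compare coverage.
def news_api_a(claim): return True   # reports the claim as accurate
def news_api_b(claim): return True
def news_api_c(claim): return None   # has no coverage of this claim

result = cross_reference("Example current-events claim",
                         [news_api_a, news_api_b, news_api_c])
print(result["supported"], result["corroborating"], result["contradicting"])  # True 2 0
```

<p>A real implementation would weight sources by reliability and recency rather than counting flat votes, but the shape of the problem is the same.</p><p>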
Digital companions could be designed to ground their responses by:</p><ul><li><p>Accessing up-to-date information from reputable databases and APIs</p></li><li><p>Cross-referencing multiple sources to corroborate facts</p></li><li><p>Clearly indicating the recency and source of the information provided</p></li></ul><p>For example, a digital companion answering a question about current events could automatically check multiple news APIs to ensure the accuracy and timeliness of its response as well as ensuring that its response is balanced and unbiased.</p><p><strong>2. Confidence Levels and Uncertainty</strong></p><p>Digital companions should be designed to express varying levels of certainty about the information they provide. This should include:</p><ul><li><p>A clear system for expressing confidence in responses (e.g., high, medium, low)</p></li><li><p>Explicit acknowledgment when information is uncertain or controversial</p></li><li><p>The ability to say "I don't know" or "I'm not certain" when appropriate</p></li></ul><p>It still surprises me that many of the generative AI chatbots don&#8217;t currently express uncertainty or ever say &#8220;I don&#8217;t know&#8221;.</p><p><strong>3. Multi-Modal Fact-Checking</strong></p><p>As digital content becomes increasingly diverse, fact-checking should extend beyond text:</p><ul><li><p>Verifying information across text, images, audio, and video</p></li><li><p>Detecting deepfakes and manipulated media</p></li><li><p>Providing context for visual or audio information</p></li></ul><p>This could be particularly useful in helping users verify the authenticity of news content or viral social media posts.</p><p><strong>4. 
Personalised Fact-Checking Preferences</strong></p><p>Leveraging the <a href="https://www.the-blueprint.ai/p/beyond-chatbots-personalisation">personalised</a> nature of digital companions, fact-checking could be tailored to individual user preferences:</p><ul><li><p>Allowing users to set their preferred level of fact-checking rigour</p></li><li><p>Remembering user-specific trusted sources</p></li><li><p>Adapting the presentation of fact-checked information based on user preferences</p></li></ul><p>For instance, a user interested in scientific topics might set their digital companion to prioritise peer-reviewed sources and always show confidence levels for scientific claims.</p><p>While these areas could significantly enhance the reliability of digital companions, it's important to note that no system will ever be perfect. The goal is to create digital companions that are not only helpful but also transparent about their limitations and committed to accuracy.</p><p>By implementing these fact-checking strategies, we can develop digital companions that users can trust to provide reliable information. As digital companions become more integrated into our daily lives, their ability to provide accurate, verifiable information will be crucial in combating misinformation and supporting informed decision-making.</p><h2><strong>&#128679; Challenges and Considerations</strong></h2><p>While the potential benefits of fact-checking in digital companions are significant, their development and implementation come with a range of complex challenges and important ethical considerations. As we explore the reliability and trustworthiness of digital companions, we must navigate these issues carefully to ensure that they enhance our lives without introducing new problems.</p><ul><li><p><strong>Balancing Speed and Accuracy</strong>: One of the primary challenges in implementing robust fact-checking in digital companions is maintaining a balance between speed and accuracy. 
Users expect quick responses, but thorough fact-checking can be time-consuming. As LLMs become faster through more efficient inference and specialist hardware, this will become less of a concern, but it is an important consideration when designing fact-checking systems.</p></li><li><p><strong>Handling Evolving and Controversial Topics</strong>: Digital companions will need to navigate a world where information is constantly changing and many topics are subject to ongoing debate or controversy. We will need to implement systems that can distinguish between established facts, evolving situations, and matters of opinion or debate. This might involve clearly labelling information as "current as of [date]" or presenting multiple viewpoints on controversial topics.</p></li><li><p><strong>Avoiding Bias in Fact-Checking Sources</strong>: Sources used for fact-checking can themselves be biased, potentially leading to skewed or incomplete information. To address this, we need to develop a system for balancing responses based on multiple sources, vetting and regularly auditing fact-checking sources, and implementing transparency measures so users can see which sources are being used. AI-powered bias detection tools could also be employed to analyse and flag potential biases in sources.</p></li><li><p><strong>Managing User Expectations and AI Limitations</strong>: Despite best efforts, AI systems will sometimes make mistakes or encounter situations beyond their capabilities. There's a risk that users may develop unrealistic expectations about the abilities of their digital companions. We will need to implement transparent error reporting, clear communication of limitations, and easy mechanisms for users to report errors.</p></li><li><p><strong>Cross-Cultural and Multilingual Fact-Checking</strong>: Developing digital companions with cultural awareness and adaptability will be very important. 
This will involve not just language translation, but understanding and respecting cultural nuances in information interpretation. Involving diverse global teams in the creation and testing of fact-checking systems will be essential to achieve this cultural fluency.</p></li><li><p><strong>Combating Deliberate Misinformation</strong>: We are now regularly seeing disinformation campaigns online, and so digital companions will need to be able to identify and combat deliberately false information. There will be a need to develop more advanced systems for identifying patterns of misinformation and to collaborate with fact-checking organisations and regulatory bodies to address this challenge effectively.</p></li></ul><p>In developing fact-checking features for digital companions, addressing these challenges will be crucial. By doing so, we can create digital companions that are not only more reliable and trustworthy, but also ethical and truly beneficial to users' lives. This will require a multi-disciplinary approach with ongoing dialogue between AI developers, ethicists, policymakers, and users to ensure we're creating a future with AI that enhances human potential while respecting human values and the integrity of online information.</p><h2><strong>&#127937; Conclusion</strong></h2><p>As I&#8217;ve explored throughout this post, fact-checking is not just a feature &#8211; it needs to be a foundational part of digital companions. The evolution of simple chatbots into digital companions will bring with it an increasing responsibility to ensure the information they provide is accurate, reliable, and trustworthy.</p><p>The persistent challenge of generative AI hallucinations highlights the need for robust fact-checking mechanisms in digital companions. 
As they become more integrated into our daily lives, the consequences of misinformation grow significantly.</p><p>Current approaches by industry leaders like Google, Perplexity, and OpenAI offer valuable insights and potential building blocks for implementing fact-checking in digital companions. However, to truly unlock their potential, we must prioritise the development of these features to leverage integrations with authoritative sources, implement confidence levels for responses, and communicate uncertainty when appropriate.</p><p>The journey beyond chatbots to develop truly reliable and trustworthy digital companions is just beginning. By prioritising fact-checking and addressing the challenges head-on, we can create digital companions that not only enhance our capabilities but also earn our trust. In doing so, we will take a significant step towards a future where AI and human intelligence work together in exciting new ways.</p><p>In my <a href="https://www.the-blueprint.ai/p/beyond-chatbots-collaboration">next post</a>, I&#8217;ll be looking at how we can collaborate more with digital companions and how in the future we&#8217;ll need our digital companions to collaborate with each other. 
Remember to subscribe to <a href="https://www.the-blueprint.ai">The Blueprint</a> to receive these posts and my weekly newsletter straight to your inbox!</p><div><hr></div><blockquote><p><em>&#8220;The future is already here, it&#8217;s just not evenly distributed.&#8221;</em></p><p><strong>William Gibson</strong></p></blockquote>]]></content:encoded></item><item><title><![CDATA[Beyond Chatbots: Personality]]></title><description><![CDATA[Imbuing Digital Companions with Character]]></description><link>https://www.the-blueprint.ai/p/beyond-chatbots-personality</link><guid isPermaLink="false">https://www.the-blueprint.ai/p/beyond-chatbots-personality</guid><dc:creator><![CDATA[Sean 🤓]]></dc:creator><pubDate>Thu, 22 Aug 2024 08:00:26 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!zETP!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F66ee302d-c255-4ae7-950f-433363a0dd1c_1792x1024.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!zETP!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F66ee302d-c255-4ae7-950f-433363a0dd1c_1792x1024.webp" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!zETP!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F66ee302d-c255-4ae7-950f-433363a0dd1c_1792x1024.webp 424w, https://substackcdn.com/image/fetch/$s_!zETP!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F66ee302d-c255-4ae7-950f-433363a0dd1c_1792x1024.webp 848w, 
https://substackcdn.com/image/fetch/$s_!zETP!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F66ee302d-c255-4ae7-950f-433363a0dd1c_1792x1024.webp 1272w, https://substackcdn.com/image/fetch/$s_!zETP!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F66ee302d-c255-4ae7-950f-433363a0dd1c_1792x1024.webp 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!zETP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F66ee302d-c255-4ae7-950f-433363a0dd1c_1792x1024.webp" width="1456" height="832" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/66ee302d-c255-4ae7-950f-433363a0dd1c_1792x1024.webp&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:832,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:582790,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/webp&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!zETP!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F66ee302d-c255-4ae7-950f-433363a0dd1c_1792x1024.webp 424w, https://substackcdn.com/image/fetch/$s_!zETP!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F66ee302d-c255-4ae7-950f-433363a0dd1c_1792x1024.webp 848w, 
https://substackcdn.com/image/fetch/$s_!zETP!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F66ee302d-c255-4ae7-950f-433363a0dd1c_1792x1024.webp 1272w, https://substackcdn.com/image/fetch/$s_!zETP!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F66ee302d-c255-4ae7-950f-433363a0dd1c_1792x1024.webp 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>So far in the Beyond Chatbots <a href="https://www.the-blueprint.ai/p/beyond-chatbots-a-blueprint-for-llms">series</a>, 
I&#8217;ve explored how <a href="https://www.the-blueprint.ai/p/beyond-chatbots-personalisation">personalisation</a> and <a href="https://www.the-blueprint.ai/p/beyond-chatbots-integrations">integrations</a> can transform our current generation of generic chatbots into more tailored and capable digital companions. I&#8217;ve also taken a look at the potential of <a href="https://www.the-blueprint.ai/p/beyond-chatbots-proactivity">proactive</a> digital companions that prompt users and anticipate their needs. In this post I&#8217;m turning our attention to a crucial element that ties all of these aspects together: personality.</p><p>Personality is a big ingredient in turning a useful tool into a trusted companion. It's what makes interactions feel natural, engaging, and uniquely tailored to each user. As we develop sophisticated features for digital companions, the crafting of rich, adaptive personalities will play an essential role in shaping our relationship with them.</p><h2>&#128540; Personality Traits in Digital Companions</h2><p>Why does personality matter so much in the context of AI? The answer lies in two critical factors: trust and enjoyment. As we develop more sophisticated digital companions, their personality will play an incredibly important role in building user trust and creating joyful, engaging experiences.</p><p>Most current chatbots offer generic interactions, struggle to build rapport, and don&#8217;t keep users engaged over time. This leads to intermittent, shallow, and transactional interactions with users. In contrast, if we are able to craft well-defined personalities for digital companions, they can offer more natural interactions, improved user engagement, and more tailored experiences.</p><p>This will all go a long way towards building more trust in our digital companions and change our relationships with technology going forwards. 
A digital companion with the right personality could become more than just a tool &#8211; it could be a coach, a confidant, or a collaborator that truly understands and complements its user.</p><p>One platform that's pushing the boundaries of AI personality customisation is <a href="https://character.ai">Character.ai</a>. The platform allows users to create and interact with chatbots representing various characters and has been very popular amongst the under 25s, so much so that, according to SimilarWeb, users of Character.ai visit the platform 65% more often than ChatGPT users visit ChatGPT!</p><p>Let's take a quick look at Character.ai&#8217;s approach. On the platform, users can define AI characters using free-form text descriptions. This includes details about the character's background, personality traits, speaking style, and knowledge areas. This works well because users have the freedom to create highly detailed and unique characters, and the open-ended nature of text descriptions allows for complex character development. However, some users may find it challenging to know where to start or what details to include without structured guidance. While Character.ai's approach offers a glimpse into the future of AI personality customisation, I think the future lies in more intuitive, granular control over digital companions' traits and behaviours.</p><h2>&#128525; Customising Personality Traits</h2><p>To create compelling personalities for digital companions, there are key features that we will need to develop and allow users to control. These features should work together to create fully-formed characters that users can relate to and meaningfully interact with.</p><p>To determine what these features should be, we can take inspiration from psychology and the <a href="https://en.wikipedia.org/wiki/HEXACO_model_of_personality_structure">HEXACO</a> model of personality structure. 
I prefer this to the traditional <a href="https://en.wikipedia.org/wiki/Big_Five_personality_traits">OCEAN</a> model of the Big Five personality traits as the HEXACO research is more inclusive and represents non-English speaking cultures. The HEXACO model conceptualises personality in terms of six traits:</p><ol><li><p>Honesty-Humility (H)</p></li><li><p>Emotionality (E)</p></li><li><p>Extraversion (X)</p></li><li><p>Agreeableness (A)</p></li><li><p>Conscientiousness (C)</p></li><li><p>Openness to Experience (O)</p></li></ol><p>In the HEXACO model, each of these traits is a scale between two opposing poles, which allows users to dial each dimension up or down to customise the personality of their digital companion. Each dimension also has associated facets and adjectives that can be used to translate these dimensions into natural language to be used in a system prompt to influence digital companions&#8217; responses. This approach is rudimentary but can work effectively &#8211; we have already seen chatbots that are very capable of imitating different characters and personalities just based on a natural language description of that character.</p><p>To illustrate how these personality traits could be implemented in practice, below is an example interface that allows users to customise their digital companion's personality based on the HEXACO model:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!LAre!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4d04f89b-4989-44f2-b8d0-29b002014fad_956x1254.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://substackcdn.com/image/fetch/$s_!LAre!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4d04f89b-4989-44f2-b8d0-29b002014fad_956x1254.png 424w, https://substackcdn.com/image/fetch/$s_!LAre!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4d04f89b-4989-44f2-b8d0-29b002014fad_956x1254.png 848w, https://substackcdn.com/image/fetch/$s_!LAre!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4d04f89b-4989-44f2-b8d0-29b002014fad_956x1254.png 1272w, https://substackcdn.com/image/fetch/$s_!LAre!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4d04f89b-4989-44f2-b8d0-29b002014fad_956x1254.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!LAre!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4d04f89b-4989-44f2-b8d0-29b002014fad_956x1254.png" width="550" height="721.4435146443515" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/4d04f89b-4989-44f2-b8d0-29b002014fad_956x1254.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1254,&quot;width&quot;:956,&quot;resizeWidth&quot;:550,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Personality Sliders.png&quot;,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Personality Sliders.png" title="Personality Sliders.png" 
srcset="https://substackcdn.com/image/fetch/$s_!LAre!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4d04f89b-4989-44f2-b8d0-29b002014fad_956x1254.png 424w, https://substackcdn.com/image/fetch/$s_!LAre!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4d04f89b-4989-44f2-b8d0-29b002014fad_956x1254.png 848w, https://substackcdn.com/image/fetch/$s_!LAre!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4d04f89b-4989-44f2-b8d0-29b002014fad_956x1254.png 1272w, https://substackcdn.com/image/fetch/$s_!LAre!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F4d04f89b-4989-44f2-b8d0-29b002014fad_956x1254.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" 
stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>These settings allow users to adjust the six personality traits on a 5-point scale. For example, the "Integrity" slider corresponds to the Honesty-Humility trait in the HEXACO model. Users can move each slider to shape their digital companion's personality, with the extremes of each slider representing opposing characteristics.</p><p>I asked Anthropic&#8217;s Claude to mock up this example interface and you can play around with it <a href="https://claude.site/artifacts/f481fa0f-4835-4f22-88ed-d54d929e6049">here</a>. As you adjust the sliders, the interface generates a system prompt that can be used to guide the digital companion's responses and behaviour. For example:</p><ul><li><p>A high setting on Honesty-Humility might result in a digital companion that's more direct and less likely to flatter the user.</p></li><li><p>A low setting on Extraversion could create a more reserved companion that doesn't initiate conversations as frequently.</p></li><li><p>A high setting on Openness to Experience might lead to a companion that offers more creative solutions and engages in abstract discussions.</p></li></ul><p>I&#8217;ve tested how well this works and it&#8217;s pretty good - the different system prompts significantly change the responses you get from a chatbot. 
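</p><p>As a minimal sketch of how slider values might be turned into such a system prompt &#8211; the six traits follow HEXACO, but the adjective pairs, the 1&#8211;5 scale, and the thresholds are my own illustrative choices:</p>

```python
# Sketch: turn 1-5 HEXACO slider values into a natural-language system prompt.
# The adjective pairs and thresholds are illustrative choices, not a standard.
TRAITS = {
    "Honesty-Humility": ("sincere and modest", "self-assured and flattering"),
    "Emotionality": ("sensitive and empathetic", "calm and detached"),
    "Extraversion": ("outgoing and talkative", "reserved and quiet"),
    "Agreeableness": ("patient and forgiving", "critical and direct"),
    "Conscientiousness": ("organised and diligent", "flexible and spontaneous"),
    "Openness to Experience": ("curious and inventive", "conventional and practical"),
}

def build_system_prompt(sliders):
    """sliders maps each trait name to a 1-5 score (5 = high end of the trait)."""
    lines = ["You are a digital companion with the following personality:"]
    for trait, (high, low) in TRAITS.items():
        score = sliders[trait]
        if score >= 4:
            lines.append(f"- Be strongly {high}.")
        elif score <= 2:
            lines.append(f"- Be {low}.")
        else:
            lines.append(f"- Balance being {high} with being {low}.")
    return "\n".join(lines)

prompt = build_system_prompt({
    "Honesty-Humility": 5, "Emotionality": 4, "Extraversion": 2,
    "Agreeableness": 5, "Conscientiousness": 3, "Openness to Experience": 4,
})
print(prompt)
```

<p>Passing the generated text to a model as its system message is then enough to noticeably colour its responses.</p><p>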
For example, I created two different personalities &#8211; an &#8220;Empathetic Supporter&#8221; (highly empathetic, agreeable, and honest, prioritising emotional support and understanding over practical problem-solving) and a &#8220;Practical Achiever&#8221; (focused on practical solutions and achievement, less emotionally involved but highly organised and goal-oriented) &#8211; and gave them both the following prompt:</p><blockquote><p><em>&#8220;I&#8217;ve just arrived home after a long, stressful day at work where I made a mistake on an important project. I&#8217;m feeling down and considering ordering a takeaway for dinner instead of cooking the healthy meal I had planned. How should I respond to this situation?&#8221;</em></p></blockquote><p>In their responses, the &#8220;Empathetic Supporter&#8221; prioritised emotional validation and support, while the &#8220;Practical Achiever&#8221; focused on problem-solving and forward-thinking strategies.</p><p>While this implementation is pretty basic, it provides a solid foundation for how we could develop digital companions with customisable personalities. A key challenge will be ensuring that these personality traits are consistently applied across all interactions whilst also being able to adapt to context and the requirements of specific interactions. This will require new research and new techniques for building more sophisticated digital companions, but it is something I&#8217;m confident can be delivered.</p><h2>&#129299; Shaping a Digital Companion&#8217;s Personality</h2><p>Beyond personality traits, there are additional features that can contribute to a more nuanced and adaptable digital companion.
The way they behave and communicate is crucial in shaping how users relate to them:</p><ul><li><p><strong>Adaptability</strong>: It&#8217;s important that digital companions are able to maintain a core personality while being flexible enough to adjust their communication style based on the context of the conversation, such as switching from casual to formal in professional settings.</p></li><li><p><strong>User feedback</strong>: As users interact with their digital companions over time, their feedback &#8211; both explicit and implicit &#8211; should influence how the AI expresses its personality, such as increasing its use of humour if the user consistently responds well to it.</p></li><li><p><strong>Proactiveness</strong>: Some users might prefer a companion that takes initiative, while others might want one that waits for explicit instructions.</p></li><li><p><strong>Vocabulary</strong>: The choice of words and the complexity of language used can significantly impact how a digital companion is perceived. A digital companion should accurately reflect the language used by its user.</p></li><li><p><strong>Humour</strong>: This is a powerful tool for building relationships. Digital companions that can understand and use humour will make interactions far more engaging and human-like, helping to build trust.</p></li><li><p><strong>Emotional Intelligence</strong>: The ability to recognise and respond appropriately to user emotions is an important part of building user relationships and providing support.</p></li><li><p><strong>Emotional Expression</strong>: Recent advances in GenAI voice technology, as seen in the demos of <a href="https://openai.com/index/hello-gpt-4o/">GPT-4o</a>&#8217;s upcoming advanced voice capabilities, will allow a digital companion to express more nuanced emotions in voice interactions than could ever be conveyed in text-based interactions.
Adapting emotional expression based on the situation and the user's emotional state will be important for creating natural, more empathetic interactions.</p></li></ul><p>I believe that we will soon be able to build digital companions whose core personality traits can combine with all of these additional character elements to create more human-like, engaging, and trustworthy user experiences.</p><h2><strong>&#129400;</strong> Emotions &amp; Appearance</h2><p>To truly deliver human-like interactions, digital companions must evolve beyond just text-based chat interfaces. The future of AI companionship lies in the integration of emotional intelligence and visual representation, creating a multi-modal interaction experience that feels genuinely personal and engaging.</p><p>Recent advancements with GPT-4o&#8217;s advanced voice mode have demonstrated impressive capabilities in generating emotionally nuanced speech. This opens up new possibilities for digital companions to convey a wide range of emotions through voice, adding depth to AI interactions.</p><div class="native-video-embed" data-component-name="VideoPlaceholder" data-attrs="{&quot;mediaUploadId&quot;:&quot;db7148ac-eab4-47f0-8172-b017fc5adc8f&quot;,&quot;duration&quot;:null}"></div><p>Currently, GPT-4o users can choose from a limited selection of voices. However, imagine being able to describe any voice you'd like your digital companion to have - perhaps a soothing baritone with a hint of a Scottish accent, or a cheerful soprano with a California surfer vibe. This level of customisation would allow users to create truly unique and personally appealing digital companions.</p><p>But expressing emotions is only half of the equation. Equally important is the ability of digital companions to recognise and respond appropriately to the emotions of their users. This emotional intelligence is key to creating meaningful, empathetic interactions. 
For example:</p><ul><li><p>A digital companion might detect frustration in a user's voice and respond with patience and encouragement.</p></li><li><p>It could pick up on excitement and match that energy in its responses.</p></li><li><p>During periods of stress, it might adopt a calming tone and offer supportive words.</p></li></ul><p>This kind of emotional attunement could significantly enhance user engagement and satisfaction. When users feel understood on an emotional level, they're more likely to form a connection with their digital companion, leading to more frequent and meaningful interactions.</p><p>A large part of emotional expression in humans comes from gestures and non-verbal communication. This is why digital companions will likely need a "visual mode" for certain contexts and use cases. Advances in generative AI&#8217;s abilities to create images and videos will lead to digital companions having a visual digital presence, allowing users to interact with them as if they're on a video call.</p><p>This visual aspect opens up new opportunities for customisation and human-like interactions:</p><ul><li><p><strong>Visual Appearances</strong>: Users could customise the visual appearance of their digital companion for video interactions, ranging from selecting preset avatars to designing completely unique looks.</p></li><li><p><strong>Gestural and Non-Verbal Communication</strong>: Future digital companions might use facial expressions and body language in video calls, adding another layer of personality and emotional expression.</p></li><li><p><strong>Multi-Modal Interaction</strong>: Digital companions could adapt their appearance and personality expression across different interaction modes - text, voice, and video - while maintaining a consistent core personality.</p></li></ul><p>As AI systems become more advanced and integrated into our lives, building and maintaining user trust will be paramount. 
Personality plays a crucial role in this trust-building process:</p><ul><li><p><strong>Consistency and Predictability</strong>: A well-defined AI personality creates a sense of familiarity and predictability, helping users feel more comfortable in their interactions.</p></li><li><p><strong>Transparency and Honesty</strong>: AI personalities should be designed to be upfront about their capabilities and limitations, fostering trust through honesty.</p></li><li><p><strong>Emotional Resonance</strong>: AI that can express and respond to emotions appropriately can create a sense of understanding and connection with users.</p></li><li><p><strong>Adaptability and Learning</strong>: AI personalities that can learn and adapt to user preferences demonstrate responsiveness and a commitment to meeting the user's needs.</p></li><li><p><strong>Ethical Behaviour</strong>: AI personalities should be designed with strong ethical principles, consistently demonstrating integrity in their interactions.</p></li></ul><h2>&#128679; Challenges and Considerations</h2><p>While the potential benefits of personality-driven digital companions are significant, their development and implementation come with a range of complex challenges and important ethical considerations. As we push the boundaries of personalities in digital companions, we must navigate these issues carefully to ensure that they enhance our lives without introducing new problems.</p><ul><li><p><strong>Ethical Implications and Bias</strong>: One of the primary challenges in developing personality-driven digital companions is ensuring that they don't perpetuate harmful stereotypes or biases. To address this we need diverse development teams and rigorous testing protocols.</p></li><li><p><strong>Emotional Intelligence and Manipulation</strong>: As we develop more emotionally intelligent digital companions, we open up new possibilities for meaningful interactions &#8211; but also for potential manipulation. 
There is potential for this capability to be misused to exploit users or manipulate their emotions. We need to establish clear ethical guidelines for AI emotional expression and implement transparency measures.</p></li><li><p><strong>Privacy and Data Protection</strong>: Personality-driven AI requires processing large amounts of personal data to provide personalised experiences. This raises significant privacy concerns. How do we balance the need for data to improve AI performance with user privacy rights?</p></li><li><p><strong>Managing User Expectations and Relationships</strong>: As digital companions become more sophisticated and personable, there's a risk that users may develop unrealistic expectations or form inappropriate attachments. We will need to develop guidelines for healthy human-AI interactions.</p></li><li><p><strong>Cross-Cultural Considerations</strong>: Developing digital companions with cultural awareness and adaptability is key. This involves not just language translation, but understanding and respecting cultural norms, taboos, and communication styles. Involving diverse global teams in the creation and testing of AI personalities will be essential to achieve this cultural fluency.</p></li><li><p><strong>Balancing User Desires with Ethical AI Behaviour</strong>: Users may want to shape their digital companions in ways that conflict with ethical AI principles. To navigate this, we need to establish immutable ethical guidelines for AI behaviour that cannot be overridden by user preferences. This ethical framework should be transparently communicated to users, helping them understand why certain customisations might not be possible.</p></li></ul><p>As we develop personality-driven digital companions, addressing these challenges will be crucial. By doing so, we can create digital companions that are not only more engaging and personable, but also trustworthy, ethical, and truly beneficial to users&#8217; lives.
This will require a multi-disciplinary approach with ongoing dialogue between AI developers, ethicists, policymakers, and users to ensure we're creating a future with AI that enhances human potential while respecting human values.</p><h2>&#127937; Conclusion</h2><p>As we've explored throughout this post, the development of sophisticated, customisable personalities for digital companions could deliver a significant leap forward in our journey beyond simple chatbots.</p><p>This aspect of AI development doesn't exist in isolation &#8211; it's deeply interconnected with the other elements we've explored in this series. Personality enhances personalisation by allowing users to customise a digital companion&#8217;s communication style. It complements integrations by enabling digital companions to present information from various sources in a manner consistent with their user&#8217;s preferences. And it elevates proactivity by allowing digital companions to initiate interactions in a way that feels natural and appropriate to the user. In essence, personality is the thread that weaves together all these elements, creating a cohesive and engaging user experience that will transform how we interact with technology.</p><p>However, with this great potential comes significant responsibility. As we develop these technologies, we must navigate complex ethical considerations, prioritise user privacy and wellbeing, and work diligently to build and maintain user trust. The challenges are substantial, but so too are the potential benefits.</p><p>What role do you envision AI personalities playing in your life? How do you think these digital companions might evolve in the coming years?
As we continue to push the boundaries of what's possible with generative AI, these are questions we'll all need to consider.</p><p>In my <a href="https://www.the-blueprint.ai/p/beyond-chatbots-fact-checking">next post</a>, I&#8217;ll explore another crucial aspect of creating trustworthy and effective digital companions: fact-checking and reliability. How can we ensure that our digital assistants provide accurate, verifiable information? Stay tuned to find out.</p><div><hr></div><blockquote><p><em>&#8220;The future is already here, it&#8217;s just not evenly distributed.&#8221;</em></p><p><strong>William Gibson</strong></p></blockquote>]]></content:encoded></item><item><title><![CDATA[Beyond Chatbots: Proactivity]]></title><description><![CDATA[Enabling Digital Companions to Take the Initiative]]></description><link>https://www.the-blueprint.ai/p/beyond-chatbots-proactivity</link><guid isPermaLink="false">https://www.the-blueprint.ai/p/beyond-chatbots-proactivity</guid><dc:creator><![CDATA[Sean 🤓]]></dc:creator><pubDate>Thu, 08 Aug 2024 08:01:20 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!9_ib!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c873179-72cf-42eb-a89c-6def047307fa_1792x1024.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!9_ib!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c873179-72cf-42eb-a89c-6def047307fa_1792x1024.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp"
srcset="https://substackcdn.com/image/fetch/$s_!9_ib!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c873179-72cf-42eb-a89c-6def047307fa_1792x1024.png 424w, https://substackcdn.com/image/fetch/$s_!9_ib!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c873179-72cf-42eb-a89c-6def047307fa_1792x1024.png 848w, https://substackcdn.com/image/fetch/$s_!9_ib!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c873179-72cf-42eb-a89c-6def047307fa_1792x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!9_ib!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c873179-72cf-42eb-a89c-6def047307fa_1792x1024.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!9_ib!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c873179-72cf-42eb-a89c-6def047307fa_1792x1024.png" width="1456" height="832" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6c873179-72cf-42eb-a89c-6def047307fa_1792x1024.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:832,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2607932,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!9_ib!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c873179-72cf-42eb-a89c-6def047307fa_1792x1024.png 424w, https://substackcdn.com/image/fetch/$s_!9_ib!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c873179-72cf-42eb-a89c-6def047307fa_1792x1024.png 848w, https://substackcdn.com/image/fetch/$s_!9_ib!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c873179-72cf-42eb-a89c-6def047307fa_1792x1024.png 1272w, https://substackcdn.com/image/fetch/$s_!9_ib!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6c873179-72cf-42eb-a89c-6def047307fa_1792x1024.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>In the Beyond Chatbots series so far, I&#8217;ve explored how <a href="https://www.the-blueprint.ai/p/beyond-chatbots-personalisation">personalisation</a> and <a href="https://www.the-blueprint.ai/p/beyond-chatbots-integrations">integrations</a> can start to transform our current generation of generic chatbots into digital companions that truly understand and adapt to our needs. Next we&#8217;re looking at proactivity, which will turn our interactions with digital companions on their head.</p><p>In the context of digital companions, proactivity refers to the ability of AI to initiate interactions, anticipate user needs, and take action without explicit user prompts. It's about moving from a reactive model where the AI waits for commands, to a proactive one where it becomes a true companion, offering help before it's asked for.</p><p>Imagine opening your digital companion app and being greeted with a simple message: "Good morning! It's 7:30 AM. Would you like to see your schedule for today?" This small but significant shift &#8212; your AI initiating the conversation &#8212; is the first small step towards realising the potential of proactive digital companions.</p><p>As digital companions evolve over time, their proactive capabilities will become increasingly sophisticated. In the near future, you might wake up to a more useful prompt: "Good morning! I see you have an important presentation at 2PM. Would you like me to summarise the key points from your preparation notes?" Further down the line, your digital companion might even say, "Good morning!
I've noticed you have an important presentation today on the product development roadmap for digital companions. I've prepared a summary of key points and some relevant competitive information. Would you like to review it over breakfast?"</p><p>This progression from simple, timely greetings to complex, anticipatory assistance isn't science fiction &#8212; it's where today&#8217;s chatbots will evolve to, and I don&#8217;t think that future is far off.</p><h2><strong>&#128172; Reactive Chatbots</strong></h2><p>Today's chatbots are impressive in many ways but fundamentally only ever react to the user. They wait for us to initiate every interaction, responding to our queries but never taking the initiative. This reactive approach restricts the potential of today&#8217;s chatbots in a few important ways:</p><ol><li><p><strong>The Blank Slate Problem</strong>: Users are typically met with a blank chat interface, offering very few indications of how the chatbot could help them or what it's capable of. This can be intimidating and limits the user's ability to fully understand a chatbot's capabilities.</p></li><li><p><strong>Product-Market Fit Issues</strong>: The reactive nature of our current chatbots contributes to a significant problem with product-market fit. While hundreds of millions of people have tried today&#8217;s chatbots, most don't return regularly. 
This is something that proactive engagement could address.</p></li><li><p><strong>Limited Learning and Adaptation</strong>: With fewer interactions, reactive chatbots have limited opportunities to learn from user behaviour and adapt their responses over time.</p></li><li><p><strong>Lack of Trust Building</strong>: Purely reactive systems struggle to build a sense of rapport and trust between the user and the AI, which proactive interactions could encourage.</p></li><li><p><strong>Limited Ability to Surprise and Delight</strong>: Proactive features can occasionally provide unexpected but valuable assistance, creating moments of delight that purely reactive systems could never match.</p></li><li><p><strong>Missed Opportunities for Timely Assistance and Reminders</strong>: Without the ability to initiate interactions, chatbots can't offer help at the most opportune moments, or remind users of important tasks, events, or follow-ups without being explicitly asked.</p></li></ol><p>A key issue underlying many of these limitations is the lack of a proper onboarding process. An onboarding experience could serve as the first proactive interaction between a digital companion and a user. The information gathered during onboarding, combined with contextual information (as discussed in our previous post on integrations), would create numerous opportunities for a digital companion to be more proactive in the future.</p><p>By addressing these limitations and moving towards a more proactive digital companion model, we can transform chatbots so that they not only respond to our needs but anticipate them, providing timely, relevant, and personalised assistance. 
This shift has the potential to significantly enhance user engagement, satisfaction, and the overall value proposition of our current chatbots.</p><h2>&#128256; Redefining the Interaction Model</h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!epHv!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff7a05cec-8fac-4b26-ada7-f29256fff049_2880x1200.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!epHv!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff7a05cec-8fac-4b26-ada7-f29256fff049_2880x1200.png 424w, https://substackcdn.com/image/fetch/$s_!epHv!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff7a05cec-8fac-4b26-ada7-f29256fff049_2880x1200.png 848w, https://substackcdn.com/image/fetch/$s_!epHv!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff7a05cec-8fac-4b26-ada7-f29256fff049_2880x1200.png 1272w, https://substackcdn.com/image/fetch/$s_!epHv!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff7a05cec-8fac-4b26-ada7-f29256fff049_2880x1200.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!epHv!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff7a05cec-8fac-4b26-ada7-f29256fff049_2880x1200.png" width="1456" height="607" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f7a05cec-8fac-4b26-ada7-f29256fff049_2880x1200.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:607,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:195925,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!epHv!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff7a05cec-8fac-4b26-ada7-f29256fff049_2880x1200.png 424w, https://substackcdn.com/image/fetch/$s_!epHv!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff7a05cec-8fac-4b26-ada7-f29256fff049_2880x1200.png 848w, https://substackcdn.com/image/fetch/$s_!epHv!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff7a05cec-8fac-4b26-ada7-f29256fff049_2880x1200.png 1272w, https://substackcdn.com/image/fetch/$s_!epHv!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff7a05cec-8fac-4b26-ada7-f29256fff049_2880x1200.png 1456w" sizes="100vw" loading="lazy"></picture></div></a></figure></div><p>The first step towards developing proactive digital companions requires a fundamental shift in how they interact with us. Instead of always waiting for user prompts, digital companions should be able to initiate conversations when appropriate. Proactivity also goes beyond merely initiating conversations. It encompasses a variety of behaviours, from simple prompts to autonomous actions. At its core, proactivity is about anticipating needs and offering assistance before it's explicitly requested.</p><p>This shift in interaction model could significantly change user expectations and experiences. Users might come to rely on their digital companions for proactive reminders, suggestions, and assistance, much like they would a human personal assistant. This could lead to increased productivity and reduced cognitive load, as users offload certain mental tasks to their AI companions.
However, it also raises questions about dependency and the balance between helpful assistance and potential intrusiveness.</p><p>The key to effective proactivity lies in understanding context and user preferences. A digital companion should know when to offer help and when to stay quiet, adapting its proactivity to each user's personal needs and comfort levels.</p><p>Similar to how I suggested we introduce integrations to digital companions in my previous post, introducing proactivity should also be a gradual process. This approach allows users to become comfortable with increasing levels of their digital companion taking initiative. Below I outline a five-step process for implementing proactivity:</p><h4><strong>Step 1: Basic Greetings</strong></h4><p>Initially, a digital companion should proactively initiate a chat with a simple greeting. This small step begins to shift the dynamic from purely reactive to slightly proactive, establishing a friendly presence and gently introducing users to the idea of companion-initiated interactions. It sets the stage for more complex proactive behaviours without being overwhelming. For example, when you open the app, your digital companion might greet you with a simple:</p><blockquote><p><em>"Good morning, [Name]! I hope you're having a great day."</em></p></blockquote><p>This basic interaction might only be used for the first ten interactions with a user, but it starts building a rapport and prepares users for more advanced proactive features. This would be very simple to implement by coding a digital companion to speak first whenever a new chat is initiated by the user.</p><h4><strong>Step 2: Generic Offers of Assistance</strong></h4><p>After a short while, digital companions should start offering general assistance. This introduces the idea that the companion is ready and willing to help, but still leaves the specifics up to the user.
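</p><p>The companion-speaks-first behaviour described in these first two steps could be sketched in a few lines of Python. The interaction-count threshold, the wording and the opening_message function are illustrative assumptions rather than a definitive implementation.</p>

```python
from datetime import datetime

# Illustrative sketch: the companion composes the opening message of a
# new chat, moving from a bare greeting (Step 1) to a greeting plus a
# generic, low-pressure offer of assistance (Step 2) after ten chats.
def opening_message(name, interaction_count, now=None):
    now = now or datetime.now()
    part_of_day = ("morning" if now.hour < 12
                   else "afternoon" if now.hour < 18 else "evening")
    greeting = f"Good {part_of_day}, {name}! I hope you're having a great day."
    if interaction_count < 10:
        return greeting  # Step 1: simple greeting only
    # Step 2: add an open-ended offer of help
    return f"{greeting} Is there anything I can help you with today?"
```

<p>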
It encourages users to engage more with the digital companion and helps them discover its capabilities while maintaining a low-pressure interaction. After the greeting, the digital companion might add:</p><blockquote><p><em>"Is there anything I can help you with today?"</em></p></blockquote><p>This open-ended question invites users to explore the digital companion&#8217;s abilities at their own pace, gradually increasing their comfort with its proactive nature. This would be similarly straightforward to implement by making the digital companion&#8217;s initial prompt more sophisticated.</p><h4><strong>Step 3: Contextual Suggestions</strong></h4><p>In the next stage, the digital companion should start to use available contextual information such as time, device, location, and calendar events to offer more specific assistance. This approach provides more relevant and timely help, demonstrating the companion&#8217;s ability to understand context and increasing its perceived value. For instance, your digital companion might say:</p><blockquote><p><em>"Good morning! I see you have a busy day ahead with three meetings scheduled. Would you like me to summarise your day's agenda?"</em></p></blockquote><p>By offering contextual assistance, the digital companion shows it can anticipate needs based on readily available information, making it a more useful companion in the user's daily life. Implementing this level of proactivity will require the personalisation and integrations that I&#8217;ve discussed in previous posts, as well as ranking algorithms to determine the most important suggestions in a given context.</p><h4><strong>Step 4: Predictive Assistance</strong></h4><p>As digital companions become more sophisticated, they could use patterns from past user behaviour and current data to anticipate needs and offer proactive solutions.
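</p><p>To make this concrete, here is a minimal sketch of how a companion might infer a user's usual lead time for a recurring task from past behaviour and decide when to offer help. The field names, the averaging rule and the suggest_preparation function are illustrative assumptions.</p>

```python
from datetime import date

# Illustrative sketch of predictive assistance: learn from history how
# far in advance the user usually starts a recurring task, and suggest
# blocking out time once that point is reached.
def suggest_preparation(task, due, past_lead_times_days, today):
    if not past_lead_times_days:
        return None  # no history yet, nothing to predict
    usual_lead = sum(past_lead_times_days) // len(past_lead_times_days)
    days_left = (due - today).days
    if days_left <= usual_lead:
        return (f"You usually start preparing for {task} about "
                f"{usual_lead} days in advance, and it is due in "
                f"{days_left} days. Shall I block out some time this week?")
    return None  # too early: stay quiet rather than intrude
```

<p>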
This stage offers far more personalised assistance to the user, helping them manage complex tasks and schedules, and demonstrates an advanced understanding of user needs and preferences. For example:</p><blockquote><p><em>"I've noticed that you usually start preparing for quarterly reports about two weeks in advance. Your Q2 report is due in 18 days, but your calendar is quite full next week. Would you like me to block out some time this week for report preparation and gather the key financial data you typically include?"</em></p></blockquote><p>This level of proactivity would show the digital companion's ability to learn from user behaviour over time and offer increasingly valuable assistance. Achieving this will require an improvement in the planning and reasoning capabilities of large language models, so it is probably at least 12 months away from being practical and reliable.</p><h4><strong>Step 5: Autonomous Actions</strong></h4><p>When digital companions have much more sophisticated capabilities, they should be able to take predefined actions on the user's behalf, always with clear consent and easy override options. This approach saves time, reduces cognitive load for users, and handles routine tasks automatically, representing the highest level of proactive assistance. An example might be:</p><blockquote><p><em>"I've noticed a recurring conflict in your schedule. With your permission, I can automatically reschedule the lower-priority meeting to resolve this. Would you like me to do that?"</em></p></blockquote><p>This demonstrates the full potential of a proactive digital companion, taking initiative to solve problems while still ensuring the user remains in control.
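</p><p>The consent-and-override principle is easy to express in code. The sketch below is purely illustrative: the proposal text and the <code>confirm</code> callback stand in for whatever confirmation UI a real product would use, and the key property is that nothing runs without explicit approval.</p>

```python
# Minimal sketch of Step 5's consent gate: no action runs without approval.
# The action and confirm callables are stand-ins for a real product's hooks.

def execute_with_consent(proposal: str, action, confirm) -> str:
    """Ask the user before acting; a declined proposal is never executed."""
    if confirm(proposal):
        action()
        return "done"
    return "cancelled"

log = []
result = execute_with_consent(
    "Reschedule the lower-priority meeting to resolve the conflict?",
    action=lambda: log.append("meeting rescheduled"),
    confirm=lambda proposal: True,   # stand-in for a real yes/no prompt
)
print(result, log)
```

<p>Because the default is inaction, the user stays in control even when the confirmation step is declined or skipped.</p><p>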
There is obviously a lot of product development to do before we get here, but these are the types of interactions we should be aiming for with our digital companions in the future.</p><p>This gradual approach to introducing proactivity will allow users to acclimatise to more proactive behaviours from their digital companions, building trust and comfort over time. It's important to note that users should always have control over the level of proactivity they're comfortable with, and be able to adjust these settings as their relationship with their digital companion evolves. By implementing proactivity in this gradual manner, we can transform chatbots into true digital companions that not only respond to our needs but in the future anticipate them, providing timely, relevant, and personalised assistance.</p><div class="pullquote"><p>The evolution from reactive chatbots to proactive digital companions isn't just a technological leap - it will fundamentally shift how we interact with technology.</p></div><h2>&#128679; Challenges and Considerations</h2><p>While the potential of proactive digital companions is exciting, it's crucial to address several challenges:</p><ol><li><p><strong>Privacy Concerns</strong>: Proactivity requires access to and analysis of user data. It's essential to implement robust privacy protections and give users granular control over what data is used for proactive features. For example, a proactive digital companion might need access to a user's email, calendar, and location data to provide timely and relevant assistance. This data shouldn&#8217;t leave a user&#8217;s device, and only abstracted, non-personal data should be saved as memories for future use.</p></li><li><p><strong>Avoiding Overreach</strong>: There's a fine line between helpful and intrusive. Digital companions need to learn and respect individual user preferences for different levels of proactivity.
Users should be able to select how proactive they would like their digital companion to be during onboarding, and to adjust these settings whenever they please.</p></li><li><p><strong>Handling Errors and Misinterpretations</strong>: Proactive suggestions based on misinterpreted data could be frustrating or even harmful. Implementing graceful error handling and continuously learning from user feedback is crucial. A potential solution could be to ask for clarification before acting on potentially sensitive interpretations.</p></li><li><p><strong>Ethical Considerations</strong>: We must ensure that proactive features don't reinforce biases or manipulate users. Maintaining user autonomy in decision-making should be a priority. This is an area that needs to build on current efforts to reduce bias in models, expanded to consider proactive use cases.</p></li></ol><p>Addressing these challenges will be key to creating proactive digital companions that enhance our lives without introducing new frustrations or concerns. As the technology continues to advance, we can expect proactive digital companions to become increasingly sophisticated and helpful. One area I&#8217;m particularly excited about is emotional intelligence, allowing digital companions to better understand and respond to user moods and emotional states.</p><h2>&#127937; Conclusion</h2><p>The shift from reactive chatbots to proactive digital companions represents a fundamental change in how we interact with technology.
By initiating interactions, anticipating needs, and offering timely assistance, digital companions have the potential to significantly enhance our productivity, well-being, and quality of life.</p><p>However, as with <a href="https://www.the-blueprint.ai/p/beyond-chatbots-personalisation">personalisation</a> and <a href="https://www.the-blueprint.ai/p/beyond-chatbots-integrations">integrations</a>, it's essential that we develop these features thoughtfully, with a focus on user empowerment, privacy, and ethical considerations. The staged approach to implementing proactivity that I&#8217;ve outlined allows us to build trust and comfort gradually, ensuring that digital companions enhance our lives without overwhelming or intruding.</p><p>In my <a href="https://www.the-blueprint.ai/p/beyond-chatbots-personality">next post</a>, I&#8217;ll explore another crucial aspect of digital companions: Personality. How can we craft AI assistants with distinct, customisable character traits that build rapport with their users? Stay tuned to find out.</p><p>What are your thoughts on proactive digital companions? How would you feel about a digital companion that initiates conversations with you? What kinds of prompts would you find most helpful? 
Share your thoughts in the comments below.</p><div><hr></div><blockquote><p><em>&#8220;The future is already here, it&#8217;s just not evenly distributed.&#8220;</em></p><p><strong>William Gibson</strong></p></blockquote>]]></content:encoded></item><item><title><![CDATA[Beyond Chatbots: Integrations]]></title><description><![CDATA[Embedding Digital Companions into Our Digital Lives]]></description><link>https://www.the-blueprint.ai/p/beyond-chatbots-integrations</link><guid isPermaLink="false">https://www.the-blueprint.ai/p/beyond-chatbots-integrations</guid><dc:creator><![CDATA[Sean 🤓]]></dc:creator><pubDate>Thu, 01 Aug 2024 08:00:37 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!1q8Q!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc50de91d-1641-4931-8b66-333c20fe6702_1792x1024.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!1q8Q!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc50de91d-1641-4931-8b66-333c20fe6702_1792x1024.webp" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!1q8Q!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc50de91d-1641-4931-8b66-333c20fe6702_1792x1024.webp 424w, https://substackcdn.com/image/fetch/$s_!1q8Q!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc50de91d-1641-4931-8b66-333c20fe6702_1792x1024.webp 848w, 
https://substackcdn.com/image/fetch/$s_!1q8Q!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc50de91d-1641-4931-8b66-333c20fe6702_1792x1024.webp 1272w, https://substackcdn.com/image/fetch/$s_!1q8Q!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc50de91d-1641-4931-8b66-333c20fe6702_1792x1024.webp 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!1q8Q!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc50de91d-1641-4931-8b66-333c20fe6702_1792x1024.webp" width="1456" height="832" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c50de91d-1641-4931-8b66-333c20fe6702_1792x1024.webp&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:832,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:486392,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/webp&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!1q8Q!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc50de91d-1641-4931-8b66-333c20fe6702_1792x1024.webp 424w, https://substackcdn.com/image/fetch/$s_!1q8Q!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc50de91d-1641-4931-8b66-333c20fe6702_1792x1024.webp 848w, 
https://substackcdn.com/image/fetch/$s_!1q8Q!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc50de91d-1641-4931-8b66-333c20fe6702_1792x1024.webp 1272w, https://substackcdn.com/image/fetch/$s_!1q8Q!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc50de91d-1641-4931-8b66-333c20fe6702_1792x1024.webp 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>In my <a href="https://open.substack.com/pub/theaiblueprint/p/beyond-chatbots-personalisation">previous post</a>, I outlined
how personalisation can transform generic chatbots into digital companions that know us. I briefly touched on the role of integrations in personalisation, and now it's time to go into more detail. How can we integrate digital companions into our existing digital ecosystems to create even more personalised experiences?</p><h2><strong>&#128268; The Power of APIs</strong></h2><p>At the heart of integrations lies the Application Programming Interface (API). APIs act as bridges, allowing different digital systems to communicate and share data. For digital companions to truly become part of our digital lives, they need to connect with a wide array of digital services and platforms. Let's explore the different types of APIs that digital companions could integrate with:</p><div class="pullquote"><p>"Digital companions have the potential to become extensions of ourselves, amplifying our capabilities and freeing us to focus on what matters most."</p></div><h4><strong>1. Personal APIs</strong></h4><p>These APIs allow digital companions to access personal data, helping them understand and adapt to individual users. It's crucial to note that user permissions are essential for personal APIs. Users should always have fine-grained controls over what data is shared with their digital companion. This ensures privacy and builds trust between the user and their digital companion.
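</p><p>One way to enforce that control is a simple permission registry that every personal data source must pass through before it is read. The sketch below is hypothetical: the scope name <code>calendar.read</code>, the registry class, and the stubbed calendar data are assumptions made for the sake of illustration.</p>

```python
# Hypothetical sketch of fine-grained permissions for personal APIs.
# Scope names and the stubbed calendar data are illustrative only.

class PermissionRegistry:
    def __init__(self):
        self.granted = set()

    def grant(self, scope: str) -> None:
        self.granted.add(scope)

    def revoke(self, scope: str) -> None:
        self.granted.discard(scope)

    def require(self, scope: str) -> None:
        if scope not in self.granted:
            raise PermissionError(f"user has not granted '{scope}'")

def read_calendar(permissions: PermissionRegistry):
    permissions.require("calendar.read")   # refuse unless explicitly granted
    return ["09:00 stand-up", "14:00 planning"]   # stub for a real API call

perms = PermissionRegistry()
perms.grant("calendar.read")
print(read_calendar(perms))
```

<p>Revoking the scope makes the very next read fail, so access always reflects the user&#8217;s current choices.</p><p>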
Digital companions should be transparent about what data they're requesting and why, allowing users to make informed decisions about what they're comfortable sharing:</p><ul><li><p>Calendar APIs: Understanding schedules and commitments</p></li><li><p>Email APIs: Analysing communication patterns and important contacts</p></li><li><p>Browser APIs: Understanding interests, frequently visited sites, and online services used</p></li><li><p>Social Media APIs: Mapping interests, social circles, and online behaviour</p></li><li><p>Health and Fitness APIs: Gathering data on physical activity and wellbeing</p></li><li><p>Finance APIs: Accessing financial information and transactions</p></li></ul><p>There is also a lot of contextual data that should be available to a digital companion when the user is accessing it via a web browser or app, such as:</p><ul><li><p>Time and Date: Current time, time zone, and date</p></li><li><p>Location: Country and city</p></li><li><p>Device Type: Desktop, laptop, tablet, or smartphone</p></li><li><p>Operating System: Windows, macOS, iOS, Android, etc.</p></li><li><p>Language Settings: User's preferred language</p></li><li><p>Connection Type: Wi-Fi, cellular data, or offline status</p></li></ul><p>By integrating with these APIs and having access to contextual data, digital companions can build a comprehensive user profile, enabling more nuanced and relevant interactions. For instance, imagine you're travelling and you open your digital companion app to ask for dinner recommendations.
Without you having to input any additional information, your digital companion could:</p><ul><li><p>Know that you are usually based in the UK and use the English language</p></li><li><p>Recognise that you're in a different, non-English speaking, country based on your current location</p></li><li><p>Detect that it's 7PM local time</p></li><li><p>Understand that you're using a smartphone rather than a desktop</p></li></ul><p>With this contextual information, your digital companion could say: "Good evening! I see you're in Tokyo and it's dinner time. Since you're out and about on your phone, would you like me to recommend some nearby restaurants with English menus? I can also help with real-time translation if you need it while ordering."</p><p>This example demonstrates how a digital companion can use readily available contextual data to provide a highly personalised and relevant interaction, anticipating your needs based on your current situation.</p><h4><strong>2. Content APIs</strong></h4><p>These APIs enable digital companions to surface different types of content directly within the chat interface:</p><ul><li><p>Wikipedia API: Referencing articles and definitions</p></li><li><p>Google Maps API: Displaying maps and location information</p></li><li><p>News APIs (e.g., Google News): Presenting current events and articles</p></li><li><p>Spotify API: Returning music and playlists</p></li><li><p>YouTube API: Showing relevant videos</p></li><li><p>TMDB API: Providing movie and TV show information</p></li><li><p>Goodreads API: Offering book recommendations and reviews</p></li><li><p>Weather APIs: Providing forecasts and climate data</p></li><li><p>Currency conversion APIs: Offering real-time exchange rates</p></li><li><p>Stock market APIs: Providing real-time stock quotes and financial data</p></li><li><p>Public transportation APIs: Offering real-time transit information and route planning</p></li></ul><p>Integration with content APIs not only enhances the user 
experience by providing rich, diverse content but also plays a crucial role in preventing hallucinations. By sourcing information directly from authoritative platforms, digital companions can ensure the accuracy and reliability of the information they provide.</p><p>For example, when asked about a historical event, a digital companion could pull a concise summary from Wikipedia, display relevant images, and even suggest a documentary on YouTube &#8211; all within the chat interface, and all from verified sources.</p><h2><strong>&#128274; Read Access vs. Write Access</strong></h2><p>When integrating digital companions with our digital services, it's crucial to distinguish between read access and write access:</p><ol><li><p><strong>Read Access</strong>: This allows the digital companion to view and analyse data from a service, but not make any changes. For example, reading your calendar events or accessing your contact list.</p></li><li><p><strong>Write Access</strong>: This permits the digital companion to make changes or perform actions within a service. For instance, sending emails on your behalf or adding events to your calendar.</p></li></ol><p>I believe the rollout of these integrations should follow a careful, staged approach so that users can build trust with their digital companion and feel more comfortable with them performing actions on their behalf over time:</p><h4><strong>Step 1: Content API Integration</strong></h4><p>The first step is to integrate content APIs into the chat interface. This allows digital companions to surface different types of content (articles, maps, news, music, video) directly within conversations. This stage enhances the user experience and builds trust by providing accurate, source-attributed information.</p><h4><strong>Step 2: Read-Only Personal API Integration</strong></h4><p>The next stage involves granting read-only access to personal APIs during the onboarding process.
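</p><p>The read/write split can be modelled as an explicit access level on each integration, so that write actions are rejected until the user upgrades access. The class below is an illustrative sketch under that assumption, not a real integration API.</p>

```python
# Sketch of staged access: integrations start read-only, and write access is
# only enabled later by an explicit user decision. All names are illustrative.
from enum import Enum

class Access(Enum):
    READ = "read"
    WRITE = "write"

class CalendarIntegration:
    def __init__(self, access: Access = Access.READ):
        self.access = access
        self.events = ["09:00 stand-up"]

    def list_events(self):
        return list(self.events)          # read access is always sufficient

    def add_event(self, event: str) -> None:
        if self.access is not Access.WRITE:
            raise PermissionError("write access has not been granted yet")
        self.events.append(event)

cal = CalendarIntegration()               # Step 2: read-only from onboarding
print(cal.list_events())
cal.access = Access.WRITE                 # Step 3: user later grants write access
cal.add_event("16:00 report prep")
print(cal.list_events())
```

<p>Until that upgrade happens, the companion can observe but never modify, which is exactly the posture a new integration should start in.</p><p>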
This allows digital companions to start building a comprehensive user profile, enabling personalised interactions from the get-go. However, at this stage, the companion can only view and analyse data, not modify it.</p><h4><strong>Step 3: Write Access to Personal APIs</strong></h4><p>The final stage is to grant write access, allowing digital companions to perform actions on the user's behalf. This stage should only be implemented once several conditions are met:</p><ul><li><p>Digital companions have demonstrated improved capabilities in planning and reasoning</p></li><li><p>Users have developed sufficient trust in their digital companions</p></li><li><p>Robust safety measures are in place, such as requiring user confirmation before performing any actions</p></li></ul><p>This staged approach allows for the gradual building of trust and capabilities, ensuring that integration enhances rather than disrupts the user experience.</p><h2><strong>&#128679; Challenges and Opportunities</strong></h2><p>Whilst deeply integrated digital companions promise some huge benefits, there are also technical and ethical challenges that need addressing:</p><ol><li><p><strong>Trust Building</strong>: Implementing a staged approach to integration is crucial for building and maintaining user trust. This involves starting with less sensitive integrations and gradually expanding access as users become more comfortable. For example, beginning with read-only content APIs before moving to personal data access.</p></li><li><p><strong>API Limitations</strong>: Many current APIs aren't designed for the level of integration digital companions require. We need more comprehensive, AI-friendly APIs that can handle complex, context-aware queries.
This might involve creating new standards for natural language APIs that can interpret and respond to more complex, nuanced queries from AI systems and provide more sophisticated data access controls.</p></li><li><p><strong>AI-to-AI Integrations</strong>: As more systems become AI-driven, we might see direct AI-to-AI integrations, where digital companions can negotiate and exchange information with other AI systems on our behalf.</p></li><li><p><strong>Privacy and Security</strong>: With access to so much personal data, robust security measures and user privacy controls are essential. All data transfers need end-to-end encryption, and users need granular control over what data is shared and when.</p></li><li><p><strong>User Control and Transparency</strong>: Users need a clear understanding and control over what data their digital companion can access and what actions it can take. This might involve creating intuitive settings and dashboards where users can view and modify their digital companion's permissions.</p></li><li><p><strong>Interoperability</strong>: Creating seamless interactions across diverse platforms and services requires industry cooperation and standards. This could involve the development of universal protocols for AI-to-API interactions, ensuring digital companions can work effectively across different ecosystems.</p></li><li><p><strong>Balancing Integration and Simplicity</strong>: As we integrate more services, we must ensure the user experience remains intuitive and not overwhelming. This might involve using AI to prioritise and contextualise information, presenting only the most relevant data to users at any given time.</p></li><li><p><strong>Content Licensing and Fair Use</strong>: As digital companions integrate with content platforms, we'll need to navigate complex issues of licensing, attribution, and fair use.
This could involve developing new licensing models specifically for AI consumption and redistribution of content.</p></li></ol><div class="pullquote"><p>"The journey from chatbots to integrated digital companions represents a fundamental shift in our relationship with technology, promising unprecedented personalisation but demanding careful consideration of privacy and ethics."</p></div><h2><strong>&#129309; The Integrated Digital Companion</strong></h2><p>As I&#8217;ve explored in this post, the integration of digital companions with our existing digital ecosystems will deliver significant improvements over the current generation of chatbots. By leveraging a wide array of APIs, we're opening the door to new personalised and context-aware digital companions.</p><p>However, building these features comes with significant responsibilities. The staged approach to integration that I&#8217;ve outlined, moving from content APIs to read-only personal APIs and finally to write access, reflects the need to build trust gradually. It's crucial that users maintain control over their data and that digital companions are transparent about what information they're using and why. It also goes without saying that data from user integrations should NEVER be used in future training runs of new models.</p><p>Despite these challenges, the potential benefits could be huge. Imagine a digital companion that not only answers your questions but anticipates your needs, providing proactive assistance tailored to your unique circumstances and preferences.</p><p>The journey from our current generation of chatbots to integrated digital companions will represent a fundamental shift in our relationship with technology. Digital companions have the potential to become extensions of ourselves, amplifying our capabilities and freeing us to focus on what matters most.
As we continue to develop these technologies, it's crucial that we do so thoughtfully, with a focus on user empowerment, privacy, and ethical considerations.</p><p>In my <a href="https://www.the-blueprint.ai/p/beyond-chatbots-proactivity">next post</a>, I'll explore another crucial aspect of digital companions: proactivity. Building on the foundation of personalisation and integrations covered so far, I'll go into how digital companions should be able to anticipate our needs and take initiative in assisting us.</p><p>What are your thoughts on integrating a digital companion with your digital services? How would you like to see digital companions integrated into your digital life? What excites you, and what concerns you about this level of AI integration? Share your perspectives in the comments below.</p><div><hr></div><blockquote><p><em>&#8220;The future is already here, it&#8217;s just not evenly distributed.&#8220;</em></p><p><strong>William Gibson</strong></p></blockquote>]]></content:encoded></item><item><title><![CDATA[Beyond Chatbots: Personalisation]]></title><description><![CDATA[Building Digital Companions That Truly Know You]]></description><link>https://www.the-blueprint.ai/p/beyond-chatbots-personalisation</link><guid isPermaLink="false">https://www.the-blueprint.ai/p/beyond-chatbots-personalisation</guid><dc:creator><![CDATA[Sean 🤓]]></dc:creator><pubDate>Thu, 25 Jul 2024 08:00:29 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!ccH_!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff4fa6637-c940-4af8-b437-f5dd269c0b07_1792x1024.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" 
href="https://substackcdn.com/image/fetch/$s_!ccH_!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff4fa6637-c940-4af8-b437-f5dd269c0b07_1792x1024.webp" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!ccH_!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff4fa6637-c940-4af8-b437-f5dd269c0b07_1792x1024.webp 424w, https://substackcdn.com/image/fetch/$s_!ccH_!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff4fa6637-c940-4af8-b437-f5dd269c0b07_1792x1024.webp 848w, https://substackcdn.com/image/fetch/$s_!ccH_!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff4fa6637-c940-4af8-b437-f5dd269c0b07_1792x1024.webp 1272w, https://substackcdn.com/image/fetch/$s_!ccH_!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff4fa6637-c940-4af8-b437-f5dd269c0b07_1792x1024.webp 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!ccH_!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff4fa6637-c940-4af8-b437-f5dd269c0b07_1792x1024.webp" width="1456" height="832" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f4fa6637-c940-4af8-b437-f5dd269c0b07_1792x1024.webp&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:832,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:450362,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/webp&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!ccH_!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff4fa6637-c940-4af8-b437-f5dd269c0b07_1792x1024.webp 424w, https://substackcdn.com/image/fetch/$s_!ccH_!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff4fa6637-c940-4af8-b437-f5dd269c0b07_1792x1024.webp 848w, https://substackcdn.com/image/fetch/$s_!ccH_!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff4fa6637-c940-4af8-b437-f5dd269c0b07_1792x1024.webp 1272w, https://substackcdn.com/image/fetch/$s_!ccH_!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff4fa6637-c940-4af8-b437-f5dd269c0b07_1792x1024.webp 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>In my <a href="https://www.the-blueprint.ai/p/beyond-chatbots-a-blueprint-for-llms">previous post</a>, I introduced the concept of digital companions - GenAI-powered assistants that go beyond the simple chatbots we have today to become intuitive, indispensable partners in our digital lives. I outlined seven essential areas of product development needed to realise this vision. In this post, I&#8217;m going to be covering arguably the most crucial of these: personalisation.</p><h2><strong>&#129765; Generic Responses in a Unique World</strong></h2><p>Today's chatbots, such as Anthropic&#8217;s <a href="https://claude.ai/">Claude</a>, OpenAI&#8217;s <a href="https://chatgpt.com">ChatGPT</a> and Google&#8217;s <a href="https://gemini.google.com/">Gemini</a>, are impressive in many ways, but all of them fall short when it comes to truly understanding and adapting to individual users.
They operate on a one-size-fits-all model, treating each interaction as if it were the first and each user as if they were identical. For example:</p><ul><li><p>There is no onboarding experience; all users are presented with the same generic and empty chat prompt screen.</p></li><li><p>Chatbots fail to remember details about users&#8217; preferences or past interactions, forcing them to constantly reshare information.</p></li><li><p>Chatbots don&#8217;t have any concept of simple things like where users are, what time it is, or what device they&#8217;re accessing it on.</p></li></ul><p>This lack of user knowledge and personalisation severely limits the potential of chatbots to become truly helpful digital companions. The core issue is that most chatbots don't currently build a persistent, evolving knowledge of each user. They don't learn from past interactions, adapt to individual needs, or understand personal context. In essence, they lack the ability to really know their users.</p><h2><strong>&#128525; Chatbots That Get To Know You</strong></h2><p>For digital companions to be able to build persistent, evolving knowledge that can then be used for more personalised experiences, they need to be able to create and retain memories of interactions, preferences, and important information.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!JO66!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5e8eabd9-9acd-4eac-bb19-662e776d6a7f_1390x1040.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!JO66!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5e8eabd9-9acd-4eac-bb19-662e776d6a7f_1390x1040.png 424w,
https://substackcdn.com/image/fetch/$s_!JO66!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5e8eabd9-9acd-4eac-bb19-662e776d6a7f_1390x1040.png 848w, https://substackcdn.com/image/fetch/$s_!JO66!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5e8eabd9-9acd-4eac-bb19-662e776d6a7f_1390x1040.png 1272w, https://substackcdn.com/image/fetch/$s_!JO66!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5e8eabd9-9acd-4eac-bb19-662e776d6a7f_1390x1040.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!JO66!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5e8eabd9-9acd-4eac-bb19-662e776d6a7f_1390x1040.png" width="1390" height="1040" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/5e8eabd9-9acd-4eac-bb19-662e776d6a7f_1390x1040.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1040,&quot;width&quot;:1390,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:100856,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!JO66!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5e8eabd9-9acd-4eac-bb19-662e776d6a7f_1390x1040.png 424w, 
https://substackcdn.com/image/fetch/$s_!JO66!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5e8eabd9-9acd-4eac-bb19-662e776d6a7f_1390x1040.png 848w, https://substackcdn.com/image/fetch/$s_!JO66!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5e8eabd9-9acd-4eac-bb19-662e776d6a7f_1390x1040.png 1272w, https://substackcdn.com/image/fetch/$s_!JO66!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5e8eabd9-9acd-4eac-bb19-662e776d6a7f_1390x1040.png 1456w" sizes="100vw"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" 
x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>The closest we have to memory in a chatbot currently is ChatGPT, which gained some <a href="https://openai.com/index/memory-and-new-controls-for-chatgpt/">basic memory features</a> earlier this year. Unfortunately, these features aren&#8217;t available in the EU, so I haven&#8217;t had a chance to test them myself, but they allow ChatGPT to remember details from your interactions across multiple chats. Here's how it currently works:</p><ul><li><p>You can explicitly ask ChatGPT to remember specific information.</p></li><li><p>ChatGPT can also pick up details organically through conversations.</p></li><li><p>Users have control over the memory - they can view, delete, or clear all memories.</p></li><li><p>The memory feature can be turned off entirely if desired.</p></li></ul><p>These features allow for some level of personalisation. In the example above, ChatGPT remembers that the user prefers meeting notes in a specific format and that they have a daughter, which it can then reference in future interactions.</p><p>However, while this is a step in the right direction, it's still far from the level of personalisation I believe we can have with digital companions, which is a big missed opportunity. A sophisticated digital companion should create and maintain various types of memories:</p><ul><li><p><strong>Factual Memories</strong>: These are the basics - your name, age, job, family details, and explicit preferences. But it goes beyond that. Your digital companion should remember that you're allergic to peanuts, that your favourite colour is teal, or that you have a fear of heights. These details, big and small, form the foundation of personalisation.</p></li><li><p><strong>Contextual Memories</strong>: These memories are about understanding the 'why' behind your actions and requests. 
If you often ask about healthy recipes in January, your digital companion should remember that you typically set health-related New Year's resolutions. It's not just about remembering facts but understanding their context in your life.</p></li><li><p><strong>Goal-Oriented Memories</strong>: Your digital companion should understand and remember your short-term and long-term goals. Whether you're training for a marathon, learning a new language, or saving for a house, these goals shape your needs and priorities. A digital companion should adapt its support and suggestions accordingly.</p></li><li><p><strong>Behavioural Patterns</strong>: Over time, your digital companion should learn your habits and routines. It should understand that you're most productive in the mornings, that you prefer video calls for work meetings, but voice calls for personal chats, or that you tend to procrastinate on certain types of tasks.</p></li><li><p><strong>Environmental Context</strong>: This involves understanding the various environments you operate in - your home setup, your office, your favourite coffee shop. It's about knowing which devices you use in different contexts and how your needs change based on your environment.</p></li><li><p><strong>Social Context</strong>: Your digital companion should build an understanding of your social network. It should know the difference between your boss, your best friend, and your grandmother, and how your communication style and needs change in each of these relationships.</p></li></ul><h2><strong>&#127793; Chatbots That Grow With You</strong></h2><p>For digital companions to truly personalise their interactions, they need to continuously learn about their users. This learning process should be dynamic, evolving over time as the user's needs, preferences, and circumstances change. Let's explore three essential ways digital companions can create and refine their memories to provide an increasingly personalised experience.</p><h4><strong>1. 
Onboarding: Laying the Foundation</strong></h4><p>One of the most significant missed opportunities in current chatbot implementations is the lack of a structured onboarding process. An effective onboarding should quickly establish a baseline understanding of the user, setting the stage for more personalised interactions from the start.</p><p>Imagine the first time you interact with your digital companion. Instead of a blank chat screen, you're greeted with a friendly introduction and a series of questions designed to understand your needs and preferences. This onboarding process should:</p><ul><li><p>Quickly establish basic user information (e.g. name, location, communication style)</p></li><li><p>Give a digital companion an understanding of primary areas of interest or common tasks</p></li><li><p>Give users a sense of control and understanding of how the chatbot will use their information.</p></li></ul><p>Implementing an onboarding process would be a relatively simple yet powerful step towards more personalised AI interactions. The initial information would form the foundation of your digital companion's understanding of you. It's not about creating an exhaustive profile, but rather establishing a basic starting point for personalisation that can be refined over time.</p><h4><strong>2. Learning Through Ongoing Conversations</strong></h4><p>While onboarding provides a solid foundation, the real power of a digital companion lies in its ability to learn and adapt through ongoing interactions. This continuous learning process should be both active and passive:</p><p><strong>Active Learning:</strong></p><ul><li><p>Asking clarifying questions when encountering new or ambiguous information</p></li><li><p>Periodically checking if previously stored information is still accurate</p></li><li><p>Requesting feedback on its performance and suggestions</p></li></ul><p>For example, if you mention a new hobby, your digital companion might ask, "That's interesting! 
Would you like me to remember your interest in rock climbing for future reference?"</p><p><strong>Passive Learning:</strong></p><ul><li><p>Observing patterns in user queries and behaviours</p></li><li><p>Noting frequently used terms or topics</p></li><li><p>Recognising changes in writing style or emotional tone</p></li></ul><p>A digital companion should be able to do all of these things behind the scenes, using the memories it has of the user and without any user interaction, and doing so will greatly enhance the user experience.</p><h4><strong>3. Integrations: Enhancing Understanding Through Connected Services</strong></h4><p>While we'll cover this topic in more detail in a future post in the series, it's worth mentioning that integrations with other digital services could significantly enhance a digital companion's understanding of the user. By connecting with various apps and platforms (with user permission), a digital companion can gather valuable context and information to further personalise its interactions.</p><p>For instance:</p><ul><li><p>Calendar integrations could provide awareness of your schedule and commitments.</p></li><li><p>Social integrations could provide knowledge of family and friendship groups.</p></li><li><p>Fitness app connections could inform health-related advice and goal tracking.</p></li><li><p>Email integration could help with task management and communication preferences.</p></li></ul><p>These integrations allow the digital companion to build a more comprehensive picture of your life, leading to more informed and helpful interactions.</p><p>The goal is a digital companion that doesn't just store information about you, but truly understands you, growing and evolving alongside you over time. 
This is the key to creating digital companions that are truly indispensable in our daily lives.</p><h2><strong>&#128736;&#65039; The Practical Reality of Building Personalisation Features</strong></h2><div class="native-video-embed" data-component-name="VideoPlaceholder" data-attrs="{&quot;mediaUploadId&quot;:&quot;e64afe8a-17ec-43b0-97bb-206bdc39abde&quot;,&quot;duration&quot;:null}"></div><p>Last year I built a proof of concept called mypanda.ai (demo video above, code repo <a href="https://github.com/the-blueprint-ai/panda.ai">here</a>), a GenAI chatbot with memory that provided users with a more personal chat experience. Creating the memories for mypanda.ai was pretty simple. Here&#8217;s how it worked:</p><ul><li><p><strong>Entity Extraction</strong>: mypanda.ai analysed user messages to identify important "Entities" - people, places, and things relevant to the user's life.</p></li><li><p><strong>Contextual Understanding</strong>: The system considered both recent conversation history and previously saved information about entities to provide context.</p></li><li><p><strong>Information Summarisation</strong>: For each entity, mypanda.ai either created or updated a concise summary, incorporating new information from the current interaction.</p></li><li><p><strong>Structured Storage</strong>: Entities were stored as key-value pairs, with the entity name as the key and the summary as the value.</p></li></ul><p>This approach offers several advantages:</p><ol><li><p><strong>Simplicity</strong>: The memory system was straightforward to implement, requiring only a well-designed prompt and a basic database.</p></li><li><p><strong>Flexibility</strong>: It could handle a wide range of entities and information types without needing separate models for different domains.</p></li><li><p><strong>Contextual Understanding</strong>: By considering both chat history and previously saved information, mypanda.ai could make more informed decisions about what to remember and how to update existing 
memories.</p></li><li><p><strong>Continuous Learning</strong>: mypanda.ai naturally evolved its understanding of entities over time as new information was provided by the user.</p></li></ol><p>The memory features I built into mypanda.ai demonstrate that building practical, useful memory systems for digital companions is achievable with current technology - effective personalisation doesn't necessarily require extremely complex systems. Sometimes, a clever use of prompts and straightforward data storage can go a long way in creating AI assistants that feel more personal and attentive to individual users' needs.</p><p>There are several ways that this approach could be built on and improved. For example:</p><ul><li><p>Graph databases could be used instead of key-value pairs to capture the relationships between different memory entities, allowing a digital companion to understand how its memories relate to one another.</p></li><li><p>Relationships between entities could be weighted to represent their strength/importance, and they could be adjusted over time, which would allow a digital companion to evaluate the importance of each memory.</p></li><li><p>A time dimension could be added to stored memories, which would allow a digital companion to understand how they evolve over time.</p></li><li><p>A social element could also be added to memory entities so that they can be shared with other people and enhanced by other people&#8217;s experiences of that memory.</p></li></ul><p>Building personalisation features requires users to share personal data, which can raise serious security and privacy concerns. 
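The entity-memory loop described above (extract an entity, merge new information with any existing summary, store the result as a key-value pair) is simple enough to sketch in a few lines. The following Python sketch is illustrative only - it is not code from the panda.ai repo, the names `EntityMemory` and `stub_llm` are hypothetical, and the LLM call is stubbed out so the example runs standalone:

```python
from typing import Callable

def stub_llm(prompt: str) -> str:
    """Stand-in for a real LLM call that would merge and summarise entity facts."""
    # A real model would combine EXISTING SUMMARY and NEW INFO into one concise
    # line; here we simply echo the new information back so the sketch runs.
    for line in prompt.splitlines():
        if line.startswith("NEW INFO: "):
            return line[len("NEW INFO: "):]
    return ""

class EntityMemory:
    """Stores one concise summary per entity, as key-value pairs."""

    def __init__(self, summarise: Callable[[str], str] = stub_llm):
        self.store: dict[str, str] = {}  # entity name -> summary
        self.summarise = summarise

    def update(self, entity: str, new_info: str) -> str:
        """Create or update the summary for an entity with new information."""
        existing = self.store.get(entity, "")
        prompt = (
            f"EXISTING SUMMARY: {existing}\n"
            f"NEW INFO: {new_info}\n"
            "Return an updated one-line summary of this entity."
        )
        self.store[entity] = self.summarise(prompt)
        return self.store[entity]

memory = EntityMemory()
memory.update("Sarah", "Works in Marketing; weekly meeting on Tuesdays.")
print(memory.store["Sarah"])
```

In a real system, `stub_llm` would be replaced by a call to a hosted model and `self.store` by a persistent database table keyed on entity name, but the shape of the loop stays the same.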
Implementing memory features for digital companions needs to be done with the appropriate consents, encryption, and user controls so that users can edit and delete any of the memory entities created and stored.</p><p>While the idea of personalised digital companions is exciting, it's important to acknowledge that privacy concerns and data security risks are significant challenges. We also need to ensure that digital companions remain unbiased and don't inadvertently reinforce harmful stereotypes or behaviours. Addressing these issues will be crucial to realising the full potential of digital companions while safeguarding user interests.</p><h2><strong>&#129489;&#8205;&#129309;&#8205;&#129489; </strong>Conclusion: The Path to Truly Personal Digital Companions</h2><p>By developing memory features, continuous learning processes, and integrations with other services, we can create digital companions that genuinely understand and adapt to individual users. The potential benefits could be huge: imagine a digital companion that can provide contextually relevant advice, anticipate your needs, and grow alongside you:</p><ol><li><p><strong>Work and Productivity</strong>: Digital companions could revolutionise task management and creativity, potentially reshaping job roles and fostering new forms of human-AI collaboration.</p></li><li><p><strong>Education</strong>: Digital companions might enable tailored learning experiences and make lifelong learning more accessible, challenging traditional educational models.</p></li><li><p><strong>Social Interactions</strong>: Digital companions could mediate or augment human connections, potentially helping those with social difficulties.</p></li></ol><p>However, developing these features responsibly requires careful ethical considerations - it's crucial that we focus on user privacy, transparency, and trust. 
No personal data that is shared with a digital companion should ever be used in the training runs of future models.</p><p>In my <a href="https://open.substack.com/pub/theaiblueprint/p/beyond-chatbots-integrations">next post</a>, I&#8217;ll explore another crucial aspect of digital companions: integrations. How can we seamlessly weave them into our existing digital ecosystems? Stay tuned to find out.</p><p>What are your thoughts on AI personalisation? What excites you? What concerns you? How would you use a personalised digital companion? Share your perspectives in the comments below.</p><div><hr></div><blockquote><p><em>&#8220;The future is already here, it&#8217;s just not evenly distributed.&#8220;</em></p><p><strong>William Gibson</strong></p></blockquote>]]></content:encoded></item><item><title><![CDATA[Beyond Chatbots: A Blueprint for LLMs]]></title><description><![CDATA[Introducing a new series that explores how we will transform LLMs into intuitive, indispensable digital companions]]></description><link>https://www.the-blueprint.ai/p/beyond-chatbots-a-blueprint-for-llms</link><guid isPermaLink="false">https://www.the-blueprint.ai/p/beyond-chatbots-a-blueprint-for-llms</guid><dc:creator><![CDATA[Sean 🤓]]></dc:creator><pubDate>Thu, 18 Jul 2024 14:00:35 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!0lSC!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf0b1965-1c46-4c4e-b3ba-d4eca9405925_1792x1024.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!0lSC!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf0b1965-1c46-4c4e-b3ba-d4eca9405925_1792x1024.webp" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://substackcdn.com/image/fetch/$s_!0lSC!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf0b1965-1c46-4c4e-b3ba-d4eca9405925_1792x1024.webp 424w, https://substackcdn.com/image/fetch/$s_!0lSC!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf0b1965-1c46-4c4e-b3ba-d4eca9405925_1792x1024.webp 848w, https://substackcdn.com/image/fetch/$s_!0lSC!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf0b1965-1c46-4c4e-b3ba-d4eca9405925_1792x1024.webp 1272w, https://substackcdn.com/image/fetch/$s_!0lSC!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf0b1965-1c46-4c4e-b3ba-d4eca9405925_1792x1024.webp 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!0lSC!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf0b1965-1c46-4c4e-b3ba-d4eca9405925_1792x1024.webp" width="1456" height="832" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/bf0b1965-1c46-4c4e-b3ba-d4eca9405925_1792x1024.webp&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:832,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:595762,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/webp&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!0lSC!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf0b1965-1c46-4c4e-b3ba-d4eca9405925_1792x1024.webp 424w, https://substackcdn.com/image/fetch/$s_!0lSC!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf0b1965-1c46-4c4e-b3ba-d4eca9405925_1792x1024.webp 848w, https://substackcdn.com/image/fetch/$s_!0lSC!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf0b1965-1c46-4c4e-b3ba-d4eca9405925_1792x1024.webp 1272w, https://substackcdn.com/image/fetch/$s_!0lSC!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbf0b1965-1c46-4c4e-b3ba-d4eca9405925_1792x1024.webp 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" 
stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Imagine waking up to a digital companion that knows you. It understands your goals, preferences, and working style. It knows your schedule and has already summarised your emails, prioritised your tasks, and even drafted responses to urgent messages - all done knowing your unique style and tone of voice. When you&#8217;re ready, it briefs you on the day ahead, offering insightful suggestions based on your past decisions and current priorities. This digital companion isn't just reactive; it's proactive, collaborative, and in tune with your needs.</p><p>This sounds like science fiction, but if you take a step back and squint at the current generation of GenAI chatbots we&#8217;re not as far away from this reality as you may think. Over the next couple of years we&#8217;re going to see an evolution of Large Language Models (LLMs) that will transform them from simple chatbots into indispensable, multifaceted digital companions that will reshape how we work, create, and interact with the world. Let&#8217;s take a look at where we are today and how close we are to this reality.</p><h2><strong>&#128079; Impressive but Limited</strong></h2><p>Natural Language Processing and LLMs have come a long way since their inception. From rule-based systems to statistical models, and now to the transformer-based architectures that power behemoths like Claude 3.5 and GPT-4o. The progress has been amazing. These models can generate human-like text, answer questions, and even code - but we've only scratched the surface of their potential.</p><p>Currently, most of us interact with LLMs through chatbots. 
While impressive, these interactions only hint at the potential of the technology. It's like using a smartphone solely for phone calls - functional, but missing out on a world of possibilities.</p><p>Don't get me wrong - current chatbots are impressive. They can engage in human-like conversation, assist with writing tasks and coding, and provide information on a wide range of topics. But they're held back by some significant limitations:</p><ol><li><p><strong>Lack of personalisation</strong>: Current chatbots treat each user the same, missing out on the nuances of individual needs and preferences. For instance, ask a chatbot, "What should I have for dinner?" and it might suggest a generic popular dish without considering your dietary restrictions or preferences. We want chatbots to know that you're vegetarian, that you love spicy food, and suggest a spicy Thai vegetarian dish instead.</p></li><li><p><strong>Limited integration</strong>: Current chatbots often exist in isolation, disconnected from our other digital tools and workflows. Try asking a chatbot, "When's my next meeting with Sarah?" It will likely respond that it doesn't have access to your calendar, or doesn&#8217;t know Sarah. A chatbot should be integrated with your calendar, email, and contacts, allowing it to answer, "Your next meeting with Sarah from Marketing is tomorrow at 2 PM. Would you like me to prepare a summary of your last meeting with her?"</p></li><li><p><strong>Reactive nature</strong>: Current chatbots wait for our queries instead of proactively assisting based on context and past interactions. They won't proactively remind you that it's your mum's birthday next week and suggest gift ideas based on her interests and your budget.</p></li><li><p><strong>Generic personalities</strong>: Current chatbots lack the ability to adapt their communication style to different users or contexts. Whether you're a formal business professional or a casual teenager, they respond in the same tone. 
Chatbots should be able to adapt their communication style to match your preferences and the context of your interaction.</p></li><li><p><strong>Reliability issues</strong>: Current chatbots can confidently state inaccuracies, lacking robust fact-checking mechanisms. Ask a chatbot about recent events, and it might confidently provide outdated or incorrect information. For instance, it might tell you the score of a football match, even though the match was last year. Chatbots should have mechanisms to verify information in real time and should be transparent about where they get their knowledge.</p></li></ol><p>These limitations aren't just inconveniences - they're preventing us from realising the true potential of LLMs and the transformative role they will play in our daily lives.</p><p>So, what's the alternative? I strongly believe that LLMs are more than just another advancement in technology &#8211; they're foundational, and will usher in change akin to the impact of the internet or mobile computing. I can see a future where LLMs serve as the core operating system of a new kind of digital product that is versatile, personalised, and proactive.</p><h2><strong>&#129438; From Chatbots to Companions</strong></h2><p>Many commentators have referred to the idea of these types of digital products as &#8220;personal AI assistants&#8221;, but I don&#8217;t think we&#8217;ve got the language quite right yet. A personal AI assistant implies a hierarchical, task-oriented, one-directional interaction. I think this is self-limiting and sells the potential of the idea short. It&#8217;s not how I would like to see the technology play out in our futures.</p><p>I much prefer the term &#8220;digital companion&#8221;. A digital companion suggests a more collaborative, symbiotic relationship. Digital companions don't just follow commands; they learn, grow, and develop alongside us. 
They offer not just functionality, but a form of digital kinship that adapts to our individual needs, preferences, goals, and ambitions.</p><p>The language of companionship represents a fundamental shift in how we think about our relationship with technology, and more clearly reflects my vision for AI as a partner in our digital lives, rather than a mere tool. Digital companions won't just change how we interact with information and technology, they will fundamentally reshape our lives. Imagine:</p><ul><li><p>A digital companion that truly understands you, learning from every interaction to provide increasingly personalised support.</p></li><li><p>A digital companion that is seamlessly integrated into your digital life, working in harmony with your existing devices, apps and services.</p></li><li><p>A digital companion that is a proactive helper that anticipates your needs, offering assistance before you even ask.</p></li><li><p>A digital companion that is versatile and can adapt, adjusting its communication style to suit your preferences and your current context.</p></li><li><p>A digital companion that is a trustworthy partner, with robust fact-checking and the ability to explain its reasoning.</p></li><li><p>A digital companion that is a collaborative partner in work and creativity, brainstorming ideas and filling in knowledge gaps.</p></li></ul><p>This isn't just a pipe dream - it's a vision that can become reality by building more capabilities on top of LLMs and developing product features that transform the chatbots of today into the digital companions of tomorrow.</p><h2><strong>&#129489;&#8205;&#129309;&#8205;&#129489; Essential Features for Digital Companions</strong></h2><p>To turn this vision into reality, I believe there are seven essential areas of product development that need to happen over the next couple of years. 
These features will transform chatbots into digital companions and are all feasible with today&#8217;s technology:</p><ol><li><p><strong><a href="https://open.substack.com/pub/theaiblueprint/p/beyond-chatbots-personalisation">Personalisation</a></strong>: We need to build features for chatbots to create memories, learn, and adapt to individual users.</p></li><li><p><strong><a href="https://open.substack.com/pub/theaiblueprint/p/beyond-chatbots-integrations">Integrations</a></strong>: We need to build integrations that give chatbots seamless access to our existing digital ecosystems.</p></li><li><p><strong><a href="https://www.the-blueprint.ai/p/beyond-chatbots-proactivity">Proactivity</a></strong>: We need to build features for chatbots to anticipate our needs and offer timely assistance.</p></li><li><p><strong><a href="https://www.the-blueprint.ai/p/beyond-chatbots-personality">Personality</a></strong>: We need to allow users to adapt the &#8216;personalities&#8217; of chatbots for more natural interactions.</p></li><li><p><strong><a href="https://www.the-blueprint.ai/p/beyond-chatbots-fact-checking">Fact-checking</a></strong>: We need to build features to verify information and check facts so chatbots are more accurate and can build trust.</p></li><li><p><strong><a href="https://www.the-blueprint.ai/p/beyond-chatbots-collaboration">Collaboration</a></strong>: We need to build systems that enable meaningful teamwork between multiple humans and chatbots.</p></li></ol><p>These six features are not isolated - they interconnect and reinforce each other, and will lead to the development of truly transformative digital companions. 
For example:</p><ul><li><p>Robust fact-checking features will build trust and help users feel more comfortable sharing information that will support personalisation.</p></li><li><p>As a digital companion learns about you, it becomes better equipped to anticipate your needs.</p></li><li><p>Integration with your digital ecosystem will enable more personalisation and collaboration.</p></li></ul><p>By developing these features in tandem, we can create digital companions that are more than the sum of their parts - intuitive, reliable, and truly indispensable partners in our digital lives.</p><h2><strong>&#128588; Challenges and Opportunities</strong></h2><p>Make no mistake - the path from our current chatbots to fully realised digital companions has its challenges. Whilst the next generation of chatbots will allow for multimodal, real-time interactions (we&#8217;ve seen the demos!), we still face significant technical hurdles in areas like continuous learning and contextual understanding.</p><p>There are also important, unsolved ethical considerations that we must grapple with. For instance:</p><ol><li><p><strong>Privacy and Data Security</strong>: Digital companions will need access to vast amounts of personal data to function effectively. How do we ensure this data is protected from breaches or misuse? What happens if a digital companion is hacked? The potential for privacy violations is very serious and requires robust safeguards.</p></li><li><p><strong>Autonomy and Over-reliance</strong>: As digital companions become more capable and integrated into our lives, there's a risk of over-dependence. Could we become so reliant on our AI companions that we lose the ability to perform tasks independently? 
How do we balance the benefits of AI assistance with the need to maintain human autonomy and critical thinking skills?</p></li></ol><p>These ethical challenges are not insurmountable, but they require research, discussion, careful consideration and proactive solutions as we develop the technology.</p><p>Yet, the opportunity is huge. Digital companions could democratise access to personalised learning and knowledge, supercharge human creativity and productivity, and usher in a new era of human-AI collaboration. The potential impact on medicine, education, scientific research, and even mental health support is enormous. Imagine a world where everyone has access to a personalised tutor, a mental health supporter, or a research assistant - the possibilities are truly exciting.</p><h2><strong>&#128172; Join the Conversation</strong></h2><p>Over the coming weeks I&#8217;ll be publishing a series of posts covering in detail each of the six key areas of product development needed to transform chatbots into digital companions, exploring their potential, the challenges involved, and the incredible opportunities they present.</p><p>It's crucial that we - developers, researchers, business leaders, and users - actively shape the direction and development of digital companions. So, consider how digital companions might impact your life or work. What excites you? What concerns you? Share your thoughts, challenge my ideas, and let's discuss the future we want to create.</p><p>In my next post, I&#8217;ll be covering personalisation (<a href="https://open.substack.com/pub/theaiblueprint/p/beyond-chatbots-personalisation">available here</a>) - one of the key developments needed to transform generic chatbots into digital companions. 
Until then, I leave you with this question: If you could have a digital companion that truly understood you, what would you want it to do?</p><div><hr></div><blockquote><p><em>&#8220;The future is already here, it&#8217;s just not evenly distributed.&#8221;</em></p><p><strong>William Gibson</strong></p></blockquote>]]></content:encoded></item><item><title><![CDATA[Leadership Lessons]]></title><description><![CDATA[What learnings can we take from OpenAI's midlife crisis?]]></description><link>https://www.the-blueprint.ai/p/leadership-lessons</link><guid isPermaLink="false">https://www.the-blueprint.ai/p/leadership-lessons</guid><dc:creator><![CDATA[Sean 🤓]]></dc:creator><pubDate>Wed, 22 Nov 2023 21:42:46 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!EvbZ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa7b6e1ff-dd1a-4dc0-9fea-faae7722ab9f.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!EvbZ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa7b6e1ff-dd1a-4dc0-9fea-faae7722ab9f.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!EvbZ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa7b6e1ff-dd1a-4dc0-9fea-faae7722ab9f.heic 424w, https://substackcdn.com/image/fetch/$s_!EvbZ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa7b6e1ff-dd1a-4dc0-9fea-faae7722ab9f.heic 848w, 
https://substackcdn.com/image/fetch/$s_!EvbZ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa7b6e1ff-dd1a-4dc0-9fea-faae7722ab9f.heic 1272w, https://substackcdn.com/image/fetch/$s_!EvbZ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa7b6e1ff-dd1a-4dc0-9fea-faae7722ab9f.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!EvbZ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa7b6e1ff-dd1a-4dc0-9fea-faae7722ab9f.heic" width="1456" height="1092" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a7b6e1ff-dd1a-4dc0-9fea-faae7722ab9f.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1092,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:486419,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!EvbZ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa7b6e1ff-dd1a-4dc0-9fea-faae7722ab9f.heic 424w, https://substackcdn.com/image/fetch/$s_!EvbZ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa7b6e1ff-dd1a-4dc0-9fea-faae7722ab9f.heic 848w, 
https://substackcdn.com/image/fetch/$s_!EvbZ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa7b6e1ff-dd1a-4dc0-9fea-faae7722ab9f.heic 1272w, https://substackcdn.com/image/fetch/$s_!EvbZ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa7b6e1ff-dd1a-4dc0-9fea-faae7722ab9f.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption"><a 
href="https://x.com/gdb/status/1727230819226583113">https://x.com/gdb/status/1727230819226583113</a></figcaption></figure></div><p>So, after five tumultuous days it seems OpenAI is almost back to where it started, just with the small detail of a new board being put in place. To say that what has happened at OpenAI is unprecedented is a huge understatement. It&#8217;s very rare to see a company that commands as much of the spotlight, and is potentially as strategically important to humanity, as OpenAI. For a company in that position to implode out of nowhere (especially after such a successful <a href="https://www.the-blueprint.ai/p/openai-devday-special?r=slh7g&amp;utm_campaign=post&amp;utm_medium=web">DevDay</a>), pull itself back together, and hopefully steady itself is hard to get your head around.</p><p>I&#8217;m not sure we&#8217;ll ever fully know the details behind what instigated the ousting of Sam Altman in the first place, but reflecting on everything that&#8217;s transpired, the one thing that stands out to me is that we&#8217;ve seen a masterclass in leadership.</p><p>Obviously not a masterclass from the former board of OpenAI (that&#8217;s been a sh!t show) but a masterclass from <a href="https://twitter.com/sama">Sam Altman</a> (CEO), <a href="https://twitter.com/gdb">Greg Brockman</a> (President), <a href="https://twitter.com/satyanadella">Satya Nadella</a>, <a href="https://twitter.com/ilyasut">Ilya Sutskever</a> (yes, even him!), <a href="https://twitter.com/miramurati">Mira Murati</a> (CTO), <a href="https://twitter.com/bradlightcap">Brad Lightcap</a> (COO), <a href="https://twitter.com/jasonkwon">Jason Kwon</a> (CSO), and the many other leaders at OpenAI. In fact, every single employee at OpenAI, regardless of their role, experience, or level, has taught us valuable leadership lessons one way or another. 
Below I&#8217;ve outlined the five that I think are the most important and impactful.</p><div><hr></div><h2>&#129761; MISSION DRIVEN LEADERSHIP</h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!BIiM!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9cad19b9-0b28-4ab2-a303-7d97acc63ca3_1179x884.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!BIiM!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9cad19b9-0b28-4ab2-a303-7d97acc63ca3_1179x884.png 424w, https://substackcdn.com/image/fetch/$s_!BIiM!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9cad19b9-0b28-4ab2-a303-7d97acc63ca3_1179x884.png 848w, https://substackcdn.com/image/fetch/$s_!BIiM!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9cad19b9-0b28-4ab2-a303-7d97acc63ca3_1179x884.png 1272w, https://substackcdn.com/image/fetch/$s_!BIiM!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9cad19b9-0b28-4ab2-a303-7d97acc63ca3_1179x884.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!BIiM!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9cad19b9-0b28-4ab2-a303-7d97acc63ca3_1179x884.png" width="360" height="269.9236641221374" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/9cad19b9-0b28-4ab2-a303-7d97acc63ca3_1179x884.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:884,&quot;width&quot;:1179,&quot;resizeWidth&quot;:360,&quot;bytes&quot;:187658,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!BIiM!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9cad19b9-0b28-4ab2-a303-7d97acc63ca3_1179x884.png 424w, https://substackcdn.com/image/fetch/$s_!BIiM!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9cad19b9-0b28-4ab2-a303-7d97acc63ca3_1179x884.png 848w, https://substackcdn.com/image/fetch/$s_!BIiM!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9cad19b9-0b28-4ab2-a303-7d97acc63ca3_1179x884.png 1272w, https://substackcdn.com/image/fetch/$s_!BIiM!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9cad19b9-0b28-4ab2-a303-7d97acc63ca3_1179x884.png 1456w" sizes="100vw"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption"><a href="https://x.com/sama/status/1726510261509779876">https://x.com/sama/status/1726510261509779876</a></figcaption></figure></div><p>Throughout the whole OpenAI saga this week, one thing that both Sam Altman and Greg Brockman kept emphasising was that the mission would continue. Below is a quick reminder of the OpenAI mission that every employee there so passionately believes in:</p><blockquote><p>Our mission is to ensure that artificial general intelligence benefits all of humanity.</p></blockquote><p>Since its founding, OpenAI has been set up as a mission-driven organisation. This is a core part of the culture of OpenAI and the mission was very carefully and purposefully crafted from the start. The articulation of that mission has evolved over time, but it is the singular reason why OpenAI is the company that it is today.</p><p>OpenAI&#8217;s mission is at the core of the organisation for one big reason - talent. 
Talent is undoubtedly the battleground that matters most to companies, especially in technology, and especially in Silicon Valley. You can see this just from how desperately Marc Benioff was trying to <a href="https://x.com/Benioff/status/1727144886259040646?s=20">entice</a> OpenAI staff over to Salesforce on Tuesday, promising them full cash &amp; equity OTE and inviting them to email his office directly with their CV &#129327;.</p><p>When Sam Altman, Greg Brockman, and the other founders were setting up OpenAI, they realised that to attract the best talent in AI they had to have a larger purpose and mission. This is because AI professionals are a very different breed from your typical tech talent in Silicon Valley:</p><ul><li><p>They are not just looking for a job, but also seeking purpose.</p></li><li><p>For them it&#8217;s not about the money or fame - the reward comes from helping humanity make progress.</p></li><li><p>They have a strong interest in the ethical implications of technology.</p></li><li><p>They are driven by opportunities for learning and personal growth.</p></li><li><p>The AI community is very open and highly collaborative.</p></li></ul><p>So, having a clearly defined mission that aligns with what&#8217;s important to AI talent has been vital to OpenAI in attracting that talent. 
But having such a strong mission-orientated organisation goes further than that - it sets the vision, gives everyone clarity and purpose and also engenders a huge amount of loyalty amongst employees.</p><blockquote><p><strong>KEY TAKEOUT:</strong> A clear, purpose-driven mission can help attract the best talent, inspire them and foster deep-rooted loyalty within an organisation.</p></blockquote><div><hr></div><h2>&#128079; THE IMPORTANCE OF LOYALTY</h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!oISd!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F22521ca2-abc4-480e-8843-793ce459d9db_1290x1696.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!oISd!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F22521ca2-abc4-480e-8843-793ce459d9db_1290x1696.jpeg 424w, https://substackcdn.com/image/fetch/$s_!oISd!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F22521ca2-abc4-480e-8843-793ce459d9db_1290x1696.jpeg 848w, https://substackcdn.com/image/fetch/$s_!oISd!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F22521ca2-abc4-480e-8843-793ce459d9db_1290x1696.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!oISd!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F22521ca2-abc4-480e-8843-793ce459d9db_1290x1696.jpeg 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!oISd!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F22521ca2-abc4-480e-8843-793ce459d9db_1290x1696.jpeg" width="400" height="525.8914728682171" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/22521ca2-abc4-480e-8843-793ce459d9db_1290x1696.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1696,&quot;width&quot;:1290,&quot;resizeWidth&quot;:400,&quot;bytes&quot;:802945,&quot;alt&quot;:&quot;&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" title="" srcset="https://substackcdn.com/image/fetch/$s_!oISd!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F22521ca2-abc4-480e-8843-793ce459d9db_1290x1696.jpeg 424w, https://substackcdn.com/image/fetch/$s_!oISd!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F22521ca2-abc4-480e-8843-793ce459d9db_1290x1696.jpeg 848w, https://substackcdn.com/image/fetch/$s_!oISd!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F22521ca2-abc4-480e-8843-793ce459d9db_1290x1696.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!oISd!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F22521ca2-abc4-480e-8843-793ce459d9db_1290x1696.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button 
tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption"><a href="https://x.com/karaswisher/status/1726599700961521762">https://x.com/karaswisher/status/1726599700961521762</a></figcaption></figure></div><p>At the last count, c.95% of the 770 employees at OpenAI had signed the letter above which was addressed to the Board of Directors at OpenAI. If that&#8217;s not a vote of no-confidence then I don&#8217;t know what is! 
Stepping back from this extraordinary letter though, there were many other amazing shows of loyalty, especially over the weekend when most of the situation at OpenAI was unfolding:</p><p>Within hours of Sam Altman being ousted and Greg Brockman quitting, three other key employees resigned: <a href="https://twitter.com/merettm">Jakub Pachocki</a>, <a href="https://twitter.com/sidorszymon">Szymon Sidor</a>, and <a href="https://twitter.com/aleks_madry">Aleksander Madry</a>. No additional details beyond the OpenAI Board&#8217;s original <a href="https://openai.com/blog/openai-announces-leadership-transition">statement</a> had been shared by then; these three people just knew that if Sam and Greg weren&#8217;t at OpenAI then they didn&#8217;t want to be either.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!uC8Z!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1d76506b-cd6a-45f1-ae9c-ccebad715e35_1179x1368.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!uC8Z!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1d76506b-cd6a-45f1-ae9c-ccebad715e35_1179x1368.png 424w, https://substackcdn.com/image/fetch/$s_!uC8Z!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1d76506b-cd6a-45f1-ae9c-ccebad715e35_1179x1368.png 848w, https://substackcdn.com/image/fetch/$s_!uC8Z!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1d76506b-cd6a-45f1-ae9c-ccebad715e35_1179x1368.png 1272w, 
https://substackcdn.com/image/fetch/$s_!uC8Z!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1d76506b-cd6a-45f1-ae9c-ccebad715e35_1179x1368.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!uC8Z!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1d76506b-cd6a-45f1-ae9c-ccebad715e35_1179x1368.png" width="416" height="482.68702290076334" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/1d76506b-cd6a-45f1-ae9c-ccebad715e35_1179x1368.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1368,&quot;width&quot;:1179,&quot;resizeWidth&quot;:416,&quot;bytes&quot;:1794870,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!uC8Z!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1d76506b-cd6a-45f1-ae9c-ccebad715e35_1179x1368.png 424w, https://substackcdn.com/image/fetch/$s_!uC8Z!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1d76506b-cd6a-45f1-ae9c-ccebad715e35_1179x1368.png 848w, https://substackcdn.com/image/fetch/$s_!uC8Z!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1d76506b-cd6a-45f1-ae9c-ccebad715e35_1179x1368.png 1272w, 
https://substackcdn.com/image/fetch/$s_!uC8Z!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F1d76506b-cd6a-45f1-ae9c-ccebad715e35_1179x1368.png 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption"><a href="https://x.com/tifafafafa/status/1726669828172419114">https://x.com/tifafafafa/status/1726669828172419114</a></figcaption></figure></div><p>Over the weekend, a huge number of OpenAI employees cancelled their plans so that they could be in the office and show moral support for Sam, Greg and the remaining leadership team, as well as each 
other.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!UnFp!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35378578-9055-456d-8c3f-4429f74b8096_1179x672.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!UnFp!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35378578-9055-456d-8c3f-4429f74b8096_1179x672.png 424w, https://substackcdn.com/image/fetch/$s_!UnFp!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35378578-9055-456d-8c3f-4429f74b8096_1179x672.png 848w, https://substackcdn.com/image/fetch/$s_!UnFp!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35378578-9055-456d-8c3f-4429f74b8096_1179x672.png 1272w, https://substackcdn.com/image/fetch/$s_!UnFp!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35378578-9055-456d-8c3f-4429f74b8096_1179x672.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!UnFp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35378578-9055-456d-8c3f-4429f74b8096_1179x672.png" width="384" height="218.87022900763358" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/35378578-9055-456d-8c3f-4429f74b8096_1179x672.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:672,&quot;width&quot;:1179,&quot;resizeWidth&quot;:384,&quot;bytes&quot;:110793,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!UnFp!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35378578-9055-456d-8c3f-4429f74b8096_1179x672.png 424w, https://substackcdn.com/image/fetch/$s_!UnFp!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35378578-9055-456d-8c3f-4429f74b8096_1179x672.png 848w, https://substackcdn.com/image/fetch/$s_!UnFp!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35378578-9055-456d-8c3f-4429f74b8096_1179x672.png 1272w, https://substackcdn.com/image/fetch/$s_!UnFp!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F35378578-9055-456d-8c3f-4429f74b8096_1179x672.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a><figcaption class="image-caption"><a href="https://x.com/sama/status/1726542800815280168">https://x.com/sama/status/1726542800815280168</a></figcaption></figure></div><p>On Monday, there was a rallying cry on X (Twitter) of employees tweeting &#8216;OpenAI is nothing without its people&#8217; ahead of any letter being written or signed. 
Many of these tweets were reposted and &#128155;&#8217;d by Sam Altman, Greg Brockman and other OpenAI leaders. This was a huge showing of solidarity from OpenAI employees - not just with the leaders who left, but with each other as well.</p><p>I&#8217;m sure there are plenty of other displays of loyalty from the OpenAI team during this episode that go beyond what I&#8217;ve shared above. To see this level of support from employees is unprecedented (as far as I&#8217;m aware) and demonstrates a level of psychological safety at OpenAI that&#8217;s rarely seen in organisations. Much of this stems from OpenAI&#8217;s mission-driven culture and the working environment they&#8217;ve fostered to attract the best AI talent.</p><blockquote><p><strong>KEY TAKEOUT:</strong> Fostering deep-rooted employee loyalty can help leaders through even the worst of times.</p></blockquote><div><hr></div><h2>&#129309; THE POWER OF PARTNERSHIP</h2><iframe class="spotify-wrap podcast" data-attrs="{&quot;image&quot;:&quot;https://i.scdn.co/image/ab6765630000ba8a8cbcb1cde86604c19c27c2e8&quot;,&quot;title&quot;:&quot;Microsoft CEO Satya Nadella on the OpenAI Debacle&quot;,&quot;subtitle&quot;:&quot;Vox Media&quot;,&quot;description&quot;:&quot;Episode&quot;,&quot;url&quot;:&quot;https://open.spotify.com/episode/4i4lKsKevNSGEUnuu7Jzn6&quot;,&quot;belowTheFold&quot;:true,&quot;noScroll&quot;:false}" src="https://open.spotify.com/embed/episode/4i4lKsKevNSGEUnuu7Jzn6" frameborder="0" gesture="media" allowfullscreen="true" allow="encrypted-media" loading="lazy" data-component-name="Spotify2ToDOM"></iframe><p>In one of the first interviews Satya Nadella gave after his move to (temporarily) bring Sam Altman and Greg Brockman into Microsoft following their departures from OpenAI, he repeatedly emphasised the importance of partnership.
Below is a great quote from the interview that really sums this up:</p><div class="pullquote"><p>&#8220;In Silicon Valley people talk about who is getting ahead of the other, but I believe in partnerships; in fact, one of the most understated things is that great partnerships can create lots of enterprise value&#8221;</p></div><p>Not many people really understand the extent of the OpenAI &lt;&gt; Microsoft partnership, so let me summarise:</p><ul><li><p>Microsoft have invested c.$13bn in OpenAI - partly in cash, but mostly in Azure credits for compute.</p></li><li><p>This investment entitles Microsoft to a c.49% share of the profits of OpenAI&#8217;s capped-profit business.</p></li><li><p>The partnership is far more than just cash and compute though. Microsoft have rights to a huge amount of OpenAI&#8217;s IP as part of the partnership agreement - a licence to the models, the weights, the data the models are trained on, the code etc.</p></li><li><p>Microsoft have in turn committed to building and tailoring all of the infrastructure OpenAI requires to achieve their mission.</p></li></ul><p>OpenAI and Microsoft clearly have a unique and powerful partnership. There are real synergies from how they work together and everyone benefits. A real-life 1 + 1 = 3.
This partnership gives OpenAI the funding and access to compute they need to continue their mission, and it gives Microsoft access to the best AI talent in the business, as well as a significant first-mover advantage on any advances OpenAI make.</p><blockquote><p><strong>KEY TAKEOUT:</strong> Partnerships &gt; competition: they provide leaders and their companies with access to resources and synergies they couldn&#8217;t access alone.</p></blockquote><div><hr></div><h2>&#128556; ADMITTING WHEN YOU&#8217;RE WRONG</h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!nvSj!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbb9d74cd-ebd5-4699-8ffd-69cb7ef3f95a_1179x622.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!nvSj!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbb9d74cd-ebd5-4699-8ffd-69cb7ef3f95a_1179x622.jpeg 424w, https://substackcdn.com/image/fetch/$s_!nvSj!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbb9d74cd-ebd5-4699-8ffd-69cb7ef3f95a_1179x622.jpeg 848w, https://substackcdn.com/image/fetch/$s_!nvSj!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbb9d74cd-ebd5-4699-8ffd-69cb7ef3f95a_1179x622.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!nvSj!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbb9d74cd-ebd5-4699-8ffd-69cb7ef3f95a_1179x622.jpeg 1456w" sizes="100vw"><img
src="https://substackcdn.com/image/fetch/$s_!nvSj!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbb9d74cd-ebd5-4699-8ffd-69cb7ef3f95a_1179x622.jpeg" width="472" height="249.01102629346903" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/bb9d74cd-ebd5-4699-8ffd-69cb7ef3f95a_1179x622.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:622,&quot;width&quot;:1179,&quot;resizeWidth&quot;:472,&quot;bytes&quot;:209474,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!nvSj!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbb9d74cd-ebd5-4699-8ffd-69cb7ef3f95a_1179x622.jpeg 424w, https://substackcdn.com/image/fetch/$s_!nvSj!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbb9d74cd-ebd5-4699-8ffd-69cb7ef3f95a_1179x622.jpeg 848w, https://substackcdn.com/image/fetch/$s_!nvSj!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbb9d74cd-ebd5-4699-8ffd-69cb7ef3f95a_1179x622.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!nvSj!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbb9d74cd-ebd5-4699-8ffd-69cb7ef3f95a_1179x622.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p>When Sam Altman was removed from his role as CEO and Greg Brockman resigned after being removed as the chair of the board, Greg <a href="https://x.com/gdb/status/1725736242137182594">shared</a> on X that it was Ilya Sutskever who had conducted all of the phone calls and conveyed the board&#8217;s wishes.
It&#8217;s still unclear how much of an instigator Ilya Sutskever was in the board&#8217;s decision, but according to <a href="https://www.businessinsider.com/anna-brockman-cried-asked-ilya-sutskever-change-openai-report-2023-11?r=US&amp;IR=T">reports</a>, he made a U-turn after an emotional conversation with Greg&#8217;s wife Anna over the weekend.</p><p>This was obviously a hard about-turn for Ilya to make, after what I&#8217;m sure was already a very difficult experience conveying the board&#8217;s decision. Both Sam and Greg immediately reacted to Ilya&#8217;s post on X with a sign of love and confirmation that all had been forgiven.</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!f1Sb!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5f6c320f-3785-4071-91f3-c3f39a1733c0_1179x597.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!f1Sb!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5f6c320f-3785-4071-91f3-c3f39a1733c0_1179x597.jpeg 424w, https://substackcdn.com/image/fetch/$s_!f1Sb!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5f6c320f-3785-4071-91f3-c3f39a1733c0_1179x597.jpeg 848w, https://substackcdn.com/image/fetch/$s_!f1Sb!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5f6c320f-3785-4071-91f3-c3f39a1733c0_1179x597.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!f1Sb!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5f6c320f-3785-4071-91f3-c3f39a1733c0_1179x597.jpeg 1456w"
sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!f1Sb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5f6c320f-3785-4071-91f3-c3f39a1733c0_1179x597.jpeg" width="420" height="212.67175572519085" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/5f6c320f-3785-4071-91f3-c3f39a1733c0_1179x597.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:597,&quot;width&quot;:1179,&quot;resizeWidth&quot;:420,&quot;bytes&quot;:233591,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!f1Sb!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5f6c320f-3785-4071-91f3-c3f39a1733c0_1179x597.jpeg 424w, https://substackcdn.com/image/fetch/$s_!f1Sb!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5f6c320f-3785-4071-91f3-c3f39a1733c0_1179x597.jpeg 848w, https://substackcdn.com/image/fetch/$s_!f1Sb!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5f6c320f-3785-4071-91f3-c3f39a1733c0_1179x597.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!f1Sb!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F5f6c320f-3785-4071-91f3-c3f39a1733c0_1179x597.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a><figcaption class="image-caption"><a 
href="https://x.com/sama/status/1726594398098780570?s=20">https://x.com/sama/status/1726594398098780570</a></figcaption></figure></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Vw7O!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd39df730-3bda-41d7-9832-f0514a927d91_1179x932.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Vw7O!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd39df730-3bda-41d7-9832-f0514a927d91_1179x932.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Vw7O!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd39df730-3bda-41d7-9832-f0514a927d91_1179x932.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Vw7O!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd39df730-3bda-41d7-9832-f0514a927d91_1179x932.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Vw7O!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd39df730-3bda-41d7-9832-f0514a927d91_1179x932.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!Vw7O!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd39df730-3bda-41d7-9832-f0514a927d91_1179x932.jpeg" width="422" height="333.59117896522474" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d39df730-3bda-41d7-9832-f0514a927d91_1179x932.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:932,&quot;width&quot;:1179,&quot;resizeWidth&quot;:422,&quot;bytes&quot;:260052,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Vw7O!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd39df730-3bda-41d7-9832-f0514a927d91_1179x932.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Vw7O!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd39df730-3bda-41d7-9832-f0514a927d91_1179x932.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Vw7O!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd39df730-3bda-41d7-9832-f0514a927d91_1179x932.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Vw7O!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd39df730-3bda-41d7-9832-f0514a927d91_1179x932.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a><figcaption class="image-caption"><a href="https://x.com/gdb/status/1726598594948735256">https://x.com/gdb/status/1726598594948735256</a></figcaption></figure></div><p>Ilya Sutskever admitting so publicly that he was wrong shows humility and an ability to take responsibility for his actions and their consequences. Doing so will have gone a long way towards rebuilding trust with Sam, Greg and everyone else at OpenAI, and I&#8217;m sure it has played a large role in getting the band back together.</p><p>Similarly, both Sam and Greg showed unity, an ability to forgive and to quickly move on.
Their reactions will also have gone a long way in showing stability during a very chaotic period for OpenAI.</p><blockquote><p><strong>KEY TAKEOUT:</strong> Admitting mistakes, taking responsibility for your actions and reacting swiftly are vital for effective leadership and maintaining trust and loyalty.</p></blockquote><div><hr></div><h2>&#10084;&#65039; THE POWER OF LOVE</h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!RWJn!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e51f2b1-56d3-48d8-a74a-424fbd76c3de_1179x879.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!RWJn!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e51f2b1-56d3-48d8-a74a-424fbd76c3de_1179x879.jpeg 424w, https://substackcdn.com/image/fetch/$s_!RWJn!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e51f2b1-56d3-48d8-a74a-424fbd76c3de_1179x879.jpeg 848w, https://substackcdn.com/image/fetch/$s_!RWJn!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e51f2b1-56d3-48d8-a74a-424fbd76c3de_1179x879.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!RWJn!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e51f2b1-56d3-48d8-a74a-424fbd76c3de_1179x879.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!RWJn!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e51f2b1-56d3-48d8-a74a-424fbd76c3de_1179x879.jpeg" width="422" 
height="314.6208651399491" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/7e51f2b1-56d3-48d8-a74a-424fbd76c3de_1179x879.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:879,&quot;width&quot;:1179,&quot;resizeWidth&quot;:422,&quot;bytes&quot;:231849,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!RWJn!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e51f2b1-56d3-48d8-a74a-424fbd76c3de_1179x879.jpeg 424w, https://substackcdn.com/image/fetch/$s_!RWJn!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e51f2b1-56d3-48d8-a74a-424fbd76c3de_1179x879.jpeg 848w, https://substackcdn.com/image/fetch/$s_!RWJn!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e51f2b1-56d3-48d8-a74a-424fbd76c3de_1179x879.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!RWJn!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7e51f2b1-56d3-48d8-a74a-424fbd76c3de_1179x879.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a><figcaption class="image-caption"><a href="https://x.com/miramurati/status/1727206862150672843">https://x.com/miramurati/status/1727206862150672843</a></figcaption></figure></div><p>Beyond the love that both Sam and Greg showed Ilya, there was a huge outpouring of love from all parts of OpenAI whilst the situation was unfolding.
This ranged from lots of &#10084;&#65039;&#8217;s on X to Sam repeatedly tweeting his love for the company and its people:</p><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!t6HG!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F47bb6c6d-67be-4687-89a8-bbbdfd66764b_1000x354.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!t6HG!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F47bb6c6d-67be-4687-89a8-bbbdfd66764b_1000x354.png 424w, https://substackcdn.com/image/fetch/$s_!t6HG!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F47bb6c6d-67be-4687-89a8-bbbdfd66764b_1000x354.png 848w, https://substackcdn.com/image/fetch/$s_!t6HG!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F47bb6c6d-67be-4687-89a8-bbbdfd66764b_1000x354.png 1272w, https://substackcdn.com/image/fetch/$s_!t6HG!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F47bb6c6d-67be-4687-89a8-bbbdfd66764b_1000x354.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!t6HG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F47bb6c6d-67be-4687-89a8-bbbdfd66764b_1000x354.png" width="410" height="145.14"
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/47bb6c6d-67be-4687-89a8-bbbdfd66764b_1000x354.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:354,&quot;width&quot;:1000,&quot;resizeWidth&quot;:410,&quot;bytes&quot;:84569,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!t6HG!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F47bb6c6d-67be-4687-89a8-bbbdfd66764b_1000x354.png 424w, https://substackcdn.com/image/fetch/$s_!t6HG!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F47bb6c6d-67be-4687-89a8-bbbdfd66764b_1000x354.png 848w, https://substackcdn.com/image/fetch/$s_!t6HG!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F47bb6c6d-67be-4687-89a8-bbbdfd66764b_1000x354.png 1272w, https://substackcdn.com/image/fetch/$s_!t6HG!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F47bb6c6d-67be-4687-89a8-bbbdfd66764b_1000x354.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a><figcaption class="image-caption"><a href="https://x.com/sama/status/1725631621511184771">https://x.com/sama/status/1725631621511184771</a></figcaption></figure></div><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" 
href="https://substackcdn.com/image/fetch/$s_!W3_6!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5123baa-445a-4302-abc2-b90d0e18c6d5_1000x354.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!W3_6!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5123baa-445a-4302-abc2-b90d0e18c6d5_1000x354.png 424w, https://substackcdn.com/image/fetch/$s_!W3_6!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5123baa-445a-4302-abc2-b90d0e18c6d5_1000x354.png 848w, https://substackcdn.com/image/fetch/$s_!W3_6!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5123baa-445a-4302-abc2-b90d0e18c6d5_1000x354.png 1272w, https://substackcdn.com/image/fetch/$s_!W3_6!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5123baa-445a-4302-abc2-b90d0e18c6d5_1000x354.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!W3_6!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5123baa-445a-4302-abc2-b90d0e18c6d5_1000x354.png" width="412" height="145.848" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c5123baa-445a-4302-abc2-b90d0e18c6d5_1000x354.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:354,&quot;width&quot;:1000,&quot;resizeWidth&quot;:412,&quot;bytes&quot;:92994,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!W3_6!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5123baa-445a-4302-abc2-b90d0e18c6d5_1000x354.png 424w, https://substackcdn.com/image/fetch/$s_!W3_6!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5123baa-445a-4302-abc2-b90d0e18c6d5_1000x354.png 848w, https://substackcdn.com/image/fetch/$s_!W3_6!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5123baa-445a-4302-abc2-b90d0e18c6d5_1000x354.png 1272w, https://substackcdn.com/image/fetch/$s_!W3_6!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc5123baa-445a-4302-abc2-b90d0e18c6d5_1000x354.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a><figcaption class="image-caption"><a href="https://x.com/sama/status/1725742088317534446">https://x.com/sama/status/1725742088317534446</a></figcaption></figure></div><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" 
href="https://substackcdn.com/image/fetch/$s_!AH3x!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9ec9bc8f-b948-4941-9f75-e02641740ada_822x154.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!AH3x!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9ec9bc8f-b948-4941-9f75-e02641740ada_822x154.png 424w, https://substackcdn.com/image/fetch/$s_!AH3x!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9ec9bc8f-b948-4941-9f75-e02641740ada_822x154.png 848w, https://substackcdn.com/image/fetch/$s_!AH3x!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9ec9bc8f-b948-4941-9f75-e02641740ada_822x154.png 1272w, https://substackcdn.com/image/fetch/$s_!AH3x!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9ec9bc8f-b948-4941-9f75-e02641740ada_822x154.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!AH3x!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9ec9bc8f-b948-4941-9f75-e02641740ada_822x154.png" width="414" height="77.56204379562044" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/9ec9bc8f-b948-4941-9f75-e02641740ada_822x154.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:154,&quot;width&quot;:822,&quot;resizeWidth&quot;:414,&quot;bytes&quot;:38057,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!AH3x!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9ec9bc8f-b948-4941-9f75-e02641740ada_822x154.png 424w, https://substackcdn.com/image/fetch/$s_!AH3x!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9ec9bc8f-b948-4941-9f75-e02641740ada_822x154.png 848w, https://substackcdn.com/image/fetch/$s_!AH3x!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9ec9bc8f-b948-4941-9f75-e02641740ada_822x154.png 1272w, https://substackcdn.com/image/fetch/$s_!AH3x!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F9ec9bc8f-b948-4941-9f75-e02641740ada_822x154.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a><figcaption class="image-caption"><a href="https://x.com/sama/status/1726099792600903681?s=20">https://x.com/sama/status/1726099792600903681</a></figcaption></figure></div><p>Such a public outpouring of positive emotions from a leader is rare, but given the circumstances, it&#8217;s completely unprecedented. 
It shows a huge amount of humanity, vulnerability, commitment and humility - all vitally important in modern leadership. Emotional leadership like this can foster a sense of connection and empathy amongst employees, and in these circumstances it will have gone a long way to calming the sense of chaos that must have been pervasive at OpenAI.</p><p>Traditionally, leaders have shied away from such public displays of love, but I hope this is something we see more often, and from more leaders, in the future.</p><blockquote><p><strong>KEY TAKEOUT:</strong> Expressing positive emotions is a key part of being an authentic leader and can contribute to a stronger and more resilient organisation.</p></blockquote><div><hr></div><h2>WRAPPING THINGS UP</h2><p>I sincerely hope we never see another company and its employees go through events similar to what transpired at OpenAI over the last five days. It will have been incredibly difficult for everyone involved and destabilised what was quickly becoming a hugely successful company that I&#8217;m sure Steve Jobs would have agreed has been putting a dent in the universe.</p><p>There are many leadership lessons to be taken from the behaviours of the leaders at OpenAI, as well as the actions of its employees, and I&#8217;ve covered the five that I think are the most important:</p><ul><li><p>&#129761; <strong>Mission-driven leadership</strong> - a clear, purpose-driven mission can help attract the best talent, inspire them and foster deep-rooted loyalty within an organisation.</p></li><li><p>&#128079; <strong>The importance of loyalty</strong> - fostering deep-rooted employee loyalty can help leaders through even the worst of times.</p></li><li><p>&#129309; <strong>The power of partnership</strong> - partnerships &gt; competition; they provide leaders and their companies with access to resources and synergies they couldn&#8217;t reach alone.</p></li><li><p>&#128556; <strong>Admitting when you&#8217;re wrong</strong>
- admitting mistakes, taking responsibility for your actions and reacting swiftly are vital for effective leadership and maintaining trust and loyalty.</p></li><li><p>&#10084;&#65039; <strong>The power of love</strong> - expressing positive emotions is a key part of being an authentic leader and can contribute to a stronger and more resilient organisation.</p></li></ul><p>If there are other leadership lessons you think we should be drawing from the last five days at OpenAI, please let me know in the comments below.</p><div><hr></div><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!Wrt1!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6f12310-0b56-4c20-b483-3060a0674bce_1179x425.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!Wrt1!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6f12310-0b56-4c20-b483-3060a0674bce_1179x425.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Wrt1!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6f12310-0b56-4c20-b483-3060a0674bce_1179x425.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Wrt1!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6f12310-0b56-4c20-b483-3060a0674bce_1179x425.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Wrt1!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6f12310-0b56-4c20-b483-3060a0674bce_1179x425.jpeg 1456w" sizes="100vw"><img
src="https://substackcdn.com/image/fetch/$s_!Wrt1!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6f12310-0b56-4c20-b483-3060a0674bce_1179x425.jpeg" width="422" height="152.12044105173877" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f6f12310-0b56-4c20-b483-3060a0674bce_1179x425.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:425,&quot;width&quot;:1179,&quot;resizeWidth&quot;:422,&quot;bytes&quot;:106992,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!Wrt1!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6f12310-0b56-4c20-b483-3060a0674bce_1179x425.jpeg 424w, https://substackcdn.com/image/fetch/$s_!Wrt1!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6f12310-0b56-4c20-b483-3060a0674bce_1179x425.jpeg 848w, https://substackcdn.com/image/fetch/$s_!Wrt1!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6f12310-0b56-4c20-b483-3060a0674bce_1179x425.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!Wrt1!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff6f12310-0b56-4c20-b483-3060a0674bce_1179x425.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a><figcaption class="image-caption"><a 
href="https://x.com/gdb/status/1727208843137179915">https://x.com/gdb/status/1727208843137179915</a></figcaption></figure></div><p><strong>P.S.</strong> I think we should all agree what a legend Greg Brockman is. He was the chair of the board, the president of the company, and one of the original founders. I&#8217;m sure he&#8217;s got a few things on his plate right now, but he&#8217;s straight back to coding &#128170;.</p><p>If you&#8217;re interested in why he&#8217;s such a legend, this blog post from Sam Altman couldn&#8217;t put it any better. It&#8217;s just titled <a href="https://blog.samaltman.com/greg">Greg</a>.</p><div><hr></div><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!fdLE!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe7007169-c57b-4ebc-b33b-d927b0fab603_1179x687.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!fdLE!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe7007169-c57b-4ebc-b33b-d927b0fab603_1179x687.png 424w, https://substackcdn.com/image/fetch/$s_!fdLE!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe7007169-c57b-4ebc-b33b-d927b0fab603_1179x687.png 848w, https://substackcdn.com/image/fetch/$s_!fdLE!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe7007169-c57b-4ebc-b33b-d927b0fab603_1179x687.png 1272w, https://substackcdn.com/image/fetch/$s_!fdLE!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe7007169-c57b-4ebc-b33b-d927b0fab603_1179x687.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!fdLE!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe7007169-c57b-4ebc-b33b-d927b0fab603_1179x687.png" width="410" height="238.9058524173028" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/e7007169-c57b-4ebc-b33b-d927b0fab603_1179x687.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:687,&quot;width&quot;:1179,&quot;resizeWidth&quot;:410,&quot;bytes&quot;:130383,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!fdLE!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe7007169-c57b-4ebc-b33b-d927b0fab603_1179x687.png 424w, https://substackcdn.com/image/fetch/$s_!fdLE!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe7007169-c57b-4ebc-b33b-d927b0fab603_1179x687.png 848w, https://substackcdn.com/image/fetch/$s_!fdLE!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe7007169-c57b-4ebc-b33b-d927b0fab603_1179x687.png 1272w, https://substackcdn.com/image/fetch/$s_!fdLE!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fe7007169-c57b-4ebc-b33b-d927b0fab603_1179x687.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a><figcaption class="image-caption"><a 
href="https://x.com/karaswisher/status/1727075242584211532">https://x.com/karaswisher/status/1727075242584211532</a></figcaption></figure></div><p><strong>P.P.S.</strong> We should all also agree what a legend Kara Swisher is. Her reporting on this whole situation has been on point and could only have come from someone like her (there aren&#8217;t any others!) who has the connections and experience to cut through the noise and comment on what&#8217;s really going on &#129488;.</p><div><hr></div><p><strong>P.P.P.S.</strong> I&#8217;m really sad that this mostly all played out on X (Twitter) and that I&#8217;ve shared so many screenshots and links to X in this post. I&#8217;m sad because the fact that this situation played out on X will have encouraged more people back on to a platform that has become so completely toxic under Musk&#8217;s leadership that even advertisers are deserting it in their <a href="https://www.mediamatters.org/twitter/here-are-companies-pulling-ads-x">droves</a>.</p><div><hr></div><blockquote><p><em>&#8220;The future is already here, it&#8217;s just not evenly distributed.&#8221;</em></p><p><strong>William Gibson</strong></p></blockquote>]]></content:encoded></item><item><title><![CDATA[Tech Pail Kids]]></title><description><![CDATA[Retro parody meets modern complexity]]></description><link>https://www.the-blueprint.ai/p/tech-pail-kids</link><guid isPermaLink="false">https://www.the-blueprint.ai/p/tech-pail-kids</guid><dc:creator><![CDATA[Sean 🤓]]></dc:creator><pubDate>Mon, 20 Nov 2023 10:01:01 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!vo4T!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fc10b7e42-cb7c-43c8-8f0d-505a636f14d7_976x1360.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>I had a lot of fun over the weekend going down a deep, deep Midjourney rabbit hole.
Some of you will know what I mean&#8230; you have an idea, it turns out great and then you want to see how far you can push it, where you can take it and what the limits are. The results of this particular rabbit hole are the grotesquely hideous &#8216;Tech Pail Kids&#8217; below&#8230; an homage and parody of the original <a href="https://en.wikipedia.org/wiki/Garbage_Pail_Kids">Garbage Pail Kids</a> which were the brainchild of cartoonist <a href="https://en.wikipedia.org/wiki/Art_Spiegelman">Art Spiegelman</a> and originally released by <a href="https://uk.topps.com">The Topps Company</a> in the mid 80s. These &#8216;Tech Pail Kids&#8217; feature prominent technology leaders from the last 20 years:</p><div class="image-gallery-embed" data-attrs="{&quot;gallery&quot;:{&quot;images&quot;:[{&quot;type&quot;:&quot;image/png&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/c10b7e42-cb7c-43c8-8f0d-505a636f14d7_976x1360.png&quot;},{&quot;type&quot;:&quot;image/png&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/5ba7de89-a0b8-43bc-84cf-238b1cc9ebd0_976x1360.png&quot;},{&quot;type&quot;:&quot;image/png&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/4775a380-bfc5-4adc-b798-5baa4c2baa85_976x1360.png&quot;},{&quot;type&quot;:&quot;image/png&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d64c987f-eca1-479e-92c8-4fbd7cdc01d2_976x1360.png&quot;},{&quot;type&quot;:&quot;image/png&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/30006c6a-bed2-4215-b872-167075e84c29_976x1360.png&quot;},{&quot;type&quot;:&quot;image/png&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/1d913c05-3e62-49d2-bf74-76a3fa4ffcb1_976x1360.png&quot;},{&quot;type&quot;:&quot;image/png&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/463651da-a
c3f-4bdc-bb30-639b3a9b381e_976x1360.png&quot;},{&quot;type&quot;:&quot;image/png&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/2aba7dd8-7f43-4a65-bf04-81419293480c_976x1360.png&quot;},{&quot;type&quot;:&quot;image/png&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/15c47a9a-09e7-4df7-92c9-ff5134890767_976x1360.png&quot;}],&quot;caption&quot;:&quot;Tech Pail Kids&quot;,&quot;alt&quot;:&quot;&quot;,&quot;staticGalleryImage&quot;:{&quot;type&quot;:&quot;image/png&quot;,&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/7c4ddbbd-6100-4bf0-ba2f-01dbf89e448f_1456x1454.png&quot;}},&quot;isEditorNode&quot;:true}"></div><p>Despite all the fun that I had, these images bring up a lot of legal, ethical and societal concerns relating to generative AI&#8217;s ability to copy/mimic styles and create likenesses of people that haven&#8217;t given their permission for their image to be used. Let&#8217;s jump in and explore this some more&#8230;</p><div><hr></div><p>But before we get into all the issues, it&#8217;s worthwhile outlining how I actually created all of the images above. 
This will help us explore some of the issues in a bit more depth.</p><p>Firstly, I used <a href="https://www.midjourney.com/">Midjourney</a> to create some images of generic &#8216;Garbage Pail Kids&#8217; and used the <a href="https://docs.midjourney.com/docs/style-tuner">Style Tuner</a> to create a custom style that I could then use repeatedly to create consistent images.</p><p>I then used this style to create the images you see above by asking Midjourney to: </p><blockquote><p>/imagine the technology leader, in the style of garbage pail kids using the style tuner code with an aspect ratio of 5:7.</p></blockquote><p>Some of these images came out perfectly on the first go; for others, I had to include an image of the tech leader in the prompt, and it took multiple iterations using features such as Zoom, Vary and Vary Region to get to an image I was happy with.</p><p>I then used <a href="https://www.pixelmator.com/pro/">Pixelmator Pro</a> to create a template to drop the Midjourney images into. In this template I had the border, &#8216;peel here&#8217; arrow, Garbage Pail Kids logo and lock-up for the name, as well as the alphanumeric code in the top right-hand corner. I also overlaid a filter to make the images look aged.</p><p>To get the Garbage Pail Kids logo and name lock-up, I purchased them from an <a href="https://www.etsy.com/uk/listing/1312393273/garbage-pail-kids-svg-bundle-font-bonus">Etsy seller</a>. Unfortunately, this didn&#8217;t include the font for the name, so I matched that using <a href="https://www.whatfontis.com">WhatFontIs.com</a> and downloaded a free matching font called <a href="https://www.ffonts.net/WindsorDemifog.font">Windsor Demi</a>.</p><div><hr></div><p>So, let&#8217;s get into the issues surrounding what I&#8217;ve created. I&#8217;m not a lawyer or ethicist, so this won&#8217;t be exhaustive, but it will give you an idea of the breadth and depth of all the issues at play here.
I&#8217;ll also base this on UK law, as that&#8217;s where I live. It&#8217;s worth being aware that the legal issues will vary depending on the country you&#8217;re in, as there&#8217;s a lot of variation across the globe&#8230;</p><h3>TRADE MARK</h3><p>This is the most obvious one to start with. &#8216;Garbage Pail Kids&#8217; was originally <a href="https://trademarks.ipo.gov.uk/ipo-tmcase/page/Results/1/UK00001261128">trade marked</a> in the UK on 24th February 1986 by <a href="https://uk.topps.com">The Topps Company</a>. The trade mark was originally for Class 30 goods - &#8220;Confectionery, chewing gum and bubble gum, none being medicated&#8221; - and appears to have been <a href="https://trademarks.ipo.gov.uk/ipo-tmcase/page/Results/1/UK00908237414">updated</a> on 23rd April 2009 to also include Class 16 goods - &#8220;Trading cards and stickers&#8221;. The last <a href="https://trademarks.ipo.gov.uk/ipo-tmcase/page/Results/2/WO0000001570650">filing</a> with the UK Government&#8217;s Intellectual Property Office is from 20th May 2021, where The Topps Company was granted protected status on the &#8216;Garbage Pail Kids&#8217; trade mark, to be renewed on 9th November 2030. As recently as this year, Topps have been <a href="https://uk.topps.com/catalogsearch/result/?q=garbage+pail+kids">selling</a> new Garbage Pail Kids trading cards in the UK, so I think it&#8217;s safe to say that I&#8217;m on shaky ground using the Garbage Pail Kids logo.</p><p>There&#8217;s nothing mentioned on the Etsy seller&#8217;s <a href="https://www.etsy.com/uk/shop/OnPointMediaAu">webpage</a> about whether the logo and assets sold are allowed to be used and whether they have permission from The Topps Company to sell these assets. To cloud issues further, the seller is based in Australia, meaning there&#8217;s also international law to take into account, but let&#8217;s not go there.
I&#8217;m actually very surprised that Etsy allows these assets to be sold on their marketplace, as their <a href="https://www.etsy.com/uk/legal/ip/">intellectual property policy</a> requires that items sold on the platform must not infringe upon the intellectual property rights of others. This includes copyrights, trademarks, patents, and rights of publicity. Interestingly, you can only <a href="https://www.etsy.com/uk/ipmanager/">report</a> a breach of this policy to Etsy if you are the rights owner, or authorised to report on their behalf.</p><h3>COPYRIGHT INFRINGEMENT</h3><p>Beyond the &#8216;Garbage Pail Kids&#8217; trade mark, I could also be on the wrong side of UK copyright law. This is because using the &#8216;distinctive style&#8217; of the series without the owner&#8217;s permission could be considered a copyright infringement. Copyright protection is granted automatically in the UK and <a href="https://www.gov.uk/copyright">covers</a> making an adaptation of the work and also putting it on the internet.</p><p>There are, however, some <a href="https://www.gov.uk/guidance/exceptions-to-copyright">exceptions</a> to copyright in the UK. These include non-commercial research and private study, which is permitted when it is &#8216;fair dealing&#8217;<a class="footnote-anchor" data-component-name="FootnoteAnchorToDOM" id="footnote-anchor-1" href="#footnote-1" target="_self">1</a> and not copying the whole work. You are also allowed to copy works in any medium for teaching purposes as long as the use is solely to illustrate a point. There is also an exception for parody, caricature and pastiche. For example, a cartoonist may reference a well-known artwork or illustration for a caricature.</p><p>UK copyright law also states that if you are making use of an exception to copy someone else&#8217;s work, it is necessary for you to sufficiently acknowledge their work.
However, such acknowledgement is not required where it is impossible for reasons of practicality.</p><p>All of this definitely confuses things, as I could argue that using the Tech Pail Kids as part of this article is non-commercial research and for teaching purposes to illustrate a point. The images I&#8217;ve created could also be seen as parody, caricature or pastiche, as they convey a new expression, are recognisably a derivative of the original work and use no more of the original work than is necessary for the parody, caricature or pastiche to be recognisable. You could also argue that my use of the Garbage Pail Kids logo is an acknowledgement of their original work, which is a requirement to make use of a copyright exception.</p><p>This is all very confusing and a huge grey area!</p><h3>RIGHT OF PUBLICITY</h3><p>Using the likeness of the tech leaders that I have referenced in the Tech Pail Kids without permission can raise issues concerning their right of publicity, which is more of a US concept and protects against the unauthorised commercial use of an individual&#8217;s name, likeness or other recognisable aspects of their persona. However, in the UK this isn&#8217;t as clearly defined, and as I&#8217;m not using the images commercially, I should be ok in this area. </p><p>My images are also clearly not real images of the tech leaders I&#8217;ve used, but this is an area in which generative AI has got incredibly good this year. If there is a risk that the images mislead the public into believing the individual tech leaders are endorsing or affiliated with a product or service, then this can cross over into misrepresentation. This is another grey area, as you could argue that featuring the tech leaders in images with the Garbage Pail Kids logo could be seen as them endorsing Garbage Pail Kids.
</p><p>Again, this can get complicated and confusing!</p><h3>DEFAMATION</h3><p>The essence of a defamation claim in the UK is to protect an individual&#8217;s reputation from unjustified attack, and is covered by the <a href="https://en.wikipedia.org/wiki/Defamation_Act_2013">Defamation Act 2013</a>. Due to the grotesque nature of Garbage Pail Kids and the Tech Pail Kids images I&#8217;ve created, it could be argued that I am showing the tech leaders in an unfair light. However, for an image to be defamatory it must be likely to cause serious harm to the reputation of the individual. This means lowering the individual in the estimation of right-thinking members of society in general, or causing them to be shunned or avoided. The defamatory images also need to be published, which they are as they are attached to this article.</p><p>In practice, the UK court would need to consider whether a reasonable person would consider the images to be a joke or a piece of satire rather than a statement of fact about the individual. However, even satirical content can be defamatory if it implies a baseless fact that discredits the individual's reputation. The burden of proof of harm is on the claimant though, and they have to show that the image has caused, or is likely to cause, serious harm to their reputation.</p><p>I think it&#8217;s incredibly unlikely that my Tech Pail Kids images would cause serious harm to any of the tech leaders featured, even if they were widely distributed, so I feel pretty safe on this one.</p><h3>AI MODEL BIAS</h3><p>The bias issues in generative AI models are well <a href="https://www.bloomberg.com/graphics/2023-generative-ai-bias/">documented</a> and these issues certainly reared their head when I was creating the Tech Pail Kids.
There are a few different bias factors at play here:</p><ul><li><p>There are the biases inherent in the Midjourney model that are present because of the biases in the data that it has been trained on.</p></li><li><p>There are the biases present in the original Garbage Pail Kids collection, which I reference in my prompt and will have influenced the images created.</p></li><li><p>There are the biases of gender, age, ethnicity etc. amongst the technology leaders I have created images of.</p></li></ul><p>Because of many of the bias factors outlined above, there was one technology leader that was more challenging to create as a Tech Pail Kid than the others:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!GN6h!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0d03219-76b4-411e-9664-788148c0445c_976x1360.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!GN6h!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0d03219-76b4-411e-9664-788148c0445c_976x1360.png 424w, https://substackcdn.com/image/fetch/$s_!GN6h!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0d03219-76b4-411e-9664-788148c0445c_976x1360.png 848w, https://substackcdn.com/image/fetch/$s_!GN6h!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0d03219-76b4-411e-9664-788148c0445c_976x1360.png 1272w, https://substackcdn.com/image/fetch/$s_!GN6h!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0d03219-76b4-411e-9664-788148c0445c_976x1360.png 
1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!GN6h!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0d03219-76b4-411e-9664-788148c0445c_976x1360.png" width="976" height="1360" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/b0d03219-76b4-411e-9664-788148c0445c_976x1360.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1360,&quot;width&quot;:976,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:2417166,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!GN6h!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0d03219-76b4-411e-9664-788148c0445c_976x1360.png 424w, https://substackcdn.com/image/fetch/$s_!GN6h!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0d03219-76b4-411e-9664-788148c0445c_976x1360.png 848w, https://substackcdn.com/image/fetch/$s_!GN6h!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0d03219-76b4-411e-9664-788148c0445c_976x1360.png 1272w, https://substackcdn.com/image/fetch/$s_!GN6h!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb0d03219-76b4-411e-9664-788148c0445c_976x1360.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a></figure></div><p>It was difficult to get Midjourney to reflect Satya Nadella&#8217;s Indian heritage, with the model repeatedly creating images with a lighter skin tone. To get the result above I had to use &#8216;Indian&#8217; in the prompt, whereas with the other images I didn&#8217;t have to state the individual&#8217;s ethnicity to get the result I wanted.
To put it bluntly, Midjourney defaults towards Caucasian skin tones.</p><p>However, the representation in the Tech Pail Kids is not just a reflection of Midjourney&#8217;s capabilities but also of the broader societal context, including the diversity in the original Garbage Pail Kids and of leadership in the tech industry.</p><h3>ENVIRONMENTAL IMPACT</h3><p>Generative AI technologies, and especially image-generating ones, are currently very nascent and it takes a lot of iteration to get to an image that you want. To be honest, it&#8217;s very hit and miss - sometimes you get what you want first time, other times it takes multiple attempts. It&#8217;s more like a game of chance than anything else &#127922;!</p><p>Because of this (and other factors), there is a high environmental impact to using the technology. To create the 9 Tech Pail Kids above, I ended up creating a total of c.1,400 images in Midjourney. That&#8217;s an average of over 150 images for each final image - a huge amount of wastage &#129327;. There is still very little research quantifying the impact of using generative AI technologies on the environment, so it&#8217;s hard to say how much of an impact generating 1,400 images has. Safe to say it&#8217;s a much bigger impact than 9 images though! If the technology was more mature and it was easier to steer the outputs, then that would be a huge step towards reducing its environmental impact.</p><h3>MORAL &amp; ETHICAL RESPONSIBILITIES</h3><p>Ultimately, the moral and ethical responsibilities fall on both the generative AI technology provider and the creator using the technology. There needs to be more awareness and education of the potential misuse of generative AI technology and the risks involved in using it.
Just because these tools enable us to reproduce images of people without their permission and/or in the style of a famous artist doesn&#8217;t mean that we should.</p><p>There&#8217;s a responsibility on all of us to ensure that generative AI technologies aren&#8217;t perpetuating societal biases, especially when portraying different ethnicities, genders, and cultural backgrounds. We need to ensure that we aren&#8217;t reinforcing outdated or harmful stereotypes, and this involves being critically aware of the sources of the data generative AI technologies are trained on.</p><p>We also need to be more aware of the environmental impact these technologies are having, with more research needed to quantify it. This information should be presented to the user at the time of generation to help them decide how best to use the technology, and tracked/aggregated over time so people can see the impact of their usage.</p><div><hr></div><h3>IN SUMMARY</h3><p>'Tech Pail Kids' is a playful homage to the classic 'Garbage Pail Kids' trading cards and has allowed me to explore the creative, legal and ethical dimensions of AI-generated art. These dimensions include trademark concerns with 'Garbage Pail Kids', potential copyright infringement, the ambiguous territory of publicity rights and defamation, and the pressing issue of AI model bias, particularly in representing diverse ethnicities accurately. The environmental impact of generative AI, due to the high number of iterations needed for satisfactory results, is also a key issue. </p><p>Both AI technology providers and creators/users have a responsibility to navigate these complex moral and ethical issues, advocating for greater awareness, responsible usage, and research into the broader implications of these rapidly evolving technologies.</p><div><hr></div><p>Oh, and for those of you who are wondering&#8230; 
yes, I did go there&#8230;&#129299;</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!x3aT!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa5b2567a-f791-435d-8bcf-8c8a06109c95.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!x3aT!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa5b2567a-f791-435d-8bcf-8c8a06109c95.heic 424w, https://substackcdn.com/image/fetch/$s_!x3aT!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa5b2567a-f791-435d-8bcf-8c8a06109c95.heic 848w, https://substackcdn.com/image/fetch/$s_!x3aT!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa5b2567a-f791-435d-8bcf-8c8a06109c95.heic 1272w, https://substackcdn.com/image/fetch/$s_!x3aT!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa5b2567a-f791-435d-8bcf-8c8a06109c95.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!x3aT!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa5b2567a-f791-435d-8bcf-8c8a06109c95.heic" width="976" height="1360" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a5b2567a-f791-435d-8bcf-8c8a06109c95.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:1360,&quot;width&quot;:976,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:114085,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!x3aT!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa5b2567a-f791-435d-8bcf-8c8a06109c95.heic 424w, https://substackcdn.com/image/fetch/$s_!x3aT!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa5b2567a-f791-435d-8bcf-8c8a06109c95.heic 848w, https://substackcdn.com/image/fetch/$s_!x3aT!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa5b2567a-f791-435d-8bcf-8c8a06109c95.heic 1272w, https://substackcdn.com/image/fetch/$s_!x3aT!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa5b2567a-f791-435d-8bcf-8c8a06109c95.heic 1456w" sizes="100vw" loading="lazy"></picture></div></a><figcaption class="image-caption">and yes, I went <a href="https://www.the-blueprint.ai/p/star-pail-kids">far far away</a> too&#8230;</figcaption></figure></div><div><hr></div><blockquote><p><em>&#8220;The future is already here, it&#8217;s just not evenly distributed.&#8221;</em></p><p><strong>William Gibson</strong></p></blockquote><div class="footnote" data-component-name="FootnoteToDOM"><a id="footnote-1" href="#footnote-anchor-1" class="footnote-number" contenteditable="false" target="_self">1</a><div class="footnote-content"><p>&#8216;Fair dealing&#8217; is a legal term used to establish whether a use of copyright material is lawful or whether it infringes copyright. There is no statutory definition of fair dealing - it will always be a matter of fact, degree and impression in each case. 
The question to be asked is: how would a fair-minded and honest person have dealt with the work?</p><p></p></div></div>]]></content:encoded></item><item><title><![CDATA[The Power of GPTs]]></title><description><![CDATA[And how businesses will need to adapt to take advantage of them]]></description><link>https://www.the-blueprint.ai/p/the-power-of-gpts</link><guid isPermaLink="false">https://www.the-blueprint.ai/p/the-power-of-gpts</guid><dc:creator><![CDATA[Sean 🤓]]></dc:creator><pubDate>Mon, 13 Nov 2023 10:01:01 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!l_55!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd800784a-7fb6-46fc-9110-1b94bc41ff9e.heic" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!l_55!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd800784a-7fb6-46fc-9110-1b94bc41ff9e.heic" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!l_55!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd800784a-7fb6-46fc-9110-1b94bc41ff9e.heic 424w, https://substackcdn.com/image/fetch/$s_!l_55!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd800784a-7fb6-46fc-9110-1b94bc41ff9e.heic 848w, https://substackcdn.com/image/fetch/$s_!l_55!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd800784a-7fb6-46fc-9110-1b94bc41ff9e.heic 1272w, 
https://substackcdn.com/image/fetch/$s_!l_55!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd800784a-7fb6-46fc-9110-1b94bc41ff9e.heic 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!l_55!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd800784a-7fb6-46fc-9110-1b94bc41ff9e.heic" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d800784a-7fb6-46fc-9110-1b94bc41ff9e.heic&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:126911,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/heic&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!l_55!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd800784a-7fb6-46fc-9110-1b94bc41ff9e.heic 424w, https://substackcdn.com/image/fetch/$s_!l_55!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd800784a-7fb6-46fc-9110-1b94bc41ff9e.heic 848w, https://substackcdn.com/image/fetch/$s_!l_55!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd800784a-7fb6-46fc-9110-1b94bc41ff9e.heic 1272w, 
https://substackcdn.com/image/fetch/$s_!l_55!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd800784a-7fb6-46fc-9110-1b94bc41ff9e.heic 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>The biggest announcement made at <a href="https://www.youtube.com/watch?v=U9mJuUkhUzk">OpenAI&#8217;s DevDay</a> was undoubtedly <a href="https://openai.com/blog/introducing-gpts">GPTs</a>. 
You can find my write-up on the event <a href="https://www.the-blueprint.ai/p/openai-devday-special">here</a>.</p><p>When Apple launched the App Store back in July 2008, it started with 500 apps; 15 years later there are around 1.6m apps available in the App Store. With the launch of GPTs there are currently 15 available to try - it&#8217;s going to be interesting to see if GPTs follow the same trajectory as apps!</p><h2>What are GPTs?</h2><p>GPTs are AI 'agents' that are tailored versions of ChatGPT for specific purposes. They are built by combining instructions, expanded knowledge, and actions, with the ability to be published for others to use. Users can program a GPT simply by having a conversation with it. This makes it easy to customise the agent's behaviour to match the desired context or task, enabling a wide array of applications.</p><p>Some of the main features of GPTs are:</p><ul><li><p><strong>Built for specific tasks</strong>: GPTs are designed to follow developer-defined instructions to carry out specific actions based on user input.</p></li><li><p><strong>Extended knowledge</strong>: A GPT can be updated to contain specific knowledge beyond the default model, such as additional content or specific task-related information.</p></li><li><p><strong>Integration with other actions</strong>: GPTs can call predefined functions, allowing them to provide more sophisticated and specific responses.</p></li><li><p><strong>Customisation</strong>: Users can customise their GPTs, making them more useful and effective for specific tasks.</p></li><li><p><strong>Publishing and Sharing:</strong> Users can publish their GPTs so that others can use them or keep them private. 
They can also be shared within an organisation when using ChatGPT Enterprise.</p></li></ul><h2>Giving agency to everyone</h2><p>When Sam Altman announced GPTs, he focused on some very compelling headlines:</p><ul><li><p>GPTs will make it easier for users to accomplish all sorts of tasks.</p></li><li><p>Users can program GPTs with language just by talking to ChatGPT.</p></li><li><p>It&#8217;s easy to customise the behaviour of GPTs so that they do exactly what a user wants.</p></li><li><p>Building them is very accessible and it <strong>gives agency to everyone.</strong></p></li></ul><p>It&#8217;s this last point that makes GPTs so exciting - they are doing what generative AI does best, democratising access to incredibly powerful capabilities. GPTs are a second-generation generative AI technology (try saying that fast &#129299;) - they&#8217;re more powerful and more accessible to more people than the foundational models they&#8217;re built on. This is why GPTs are so important and why they are going to drive the next phase of OpenAI&#8217;s growth both in terms of users and revenue.</p><p>OpenAI also announced at their DevDay that they now have <a href="https://techcrunch.com/2023/11/06/openais-chatgpt-now-has-100-million-weekly-active-users/">100m active users</a> on the platform every week as well as 2m developers, with 92% of Fortune 500 companies using their platform. This is all from a standing start a year ago - that&#8217;s some impressive scaling! I believe that, if the stars align, GPTs will drive 5x user growth for ChatGPT over the next 12 months, allowing OpenAI to surpass X&#8217;s (Twitter) monthly user numbers, and cementing its place amongst the large digital platforms.</p><p>However, there is one big challenge that OpenAI will have to overcome in order to realise this growth, and that&#8217;s enterprise adoption. 
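Stepping back to the anatomy of a GPT described above - instructions, extended knowledge, and actions - that recipe can be sketched as a plain data model. This is a minimal illustration with hypothetical names, not OpenAI's actual API:

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class GPTDefinition:
    """Hypothetical sketch of a GPT: instructions + knowledge + actions."""
    name: str
    instructions: str  # developer-defined behaviour the GPT follows
    knowledge: list[str] = field(default_factory=list)  # extra reference material
    actions: dict[str, Callable[..., str]] = field(default_factory=dict)  # callable tools
    published: bool = False  # private until explicitly published

    def publish(self) -> "GPTDefinition":
        """Make the GPT available for others to use."""
        self.published = True
        return self

# A GPT specialised for one task, with extended knowledge and one action
meeting_gpt = GPTDefinition(
    name="Minutes Maker",
    instructions="Summarise meeting notes into decisions and action points.",
    knowledge=["company-style-guide.md"],
    actions={"send_summary": lambda to: f"summary sent to {to}"},
).publish()
```

In the real product, the conversational builder simply fills in a configuration like this on the user's behalf - which is exactly why building GPTs is accessible to non-developers.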
There is so much potential for this second-generation generative AI technology in the workplace, but there are a number of challenges that organisations will face in adopting GPTs that might prevent OpenAI from seeing this 5x growth.</p><h2>What are the challenges facing organisations in adopting GPTs?</h2><p>In the UK, approximately 48% of all jobs are in small and medium-sized enterprises (SMEs), 32% in large multi-national organisations, and the remaining 20% in the public sector.</p><p>In both large multi-national organisations and the public sector (where approximately 50% of jobs are), the traditional approach to IT typically involves a centralised structure with a focus on standardisation, consistency, and control of IT resources. These IT strategies are often conservative, prioritising system stability and security over rapid innovation, with a focus on maintaining legacy systems.</p><p>In SMEs, where roughly the other half of the workforce is employed, the approach to IT is often more flexible, but it is highly cost-sensitive compared to large multi-nationals. </p><p>To summarise, for one half of the working population you have IT teams that are focused on standardisation, consistency, and control, whereas for the other half you have a high degree of cost-sensitivity. Across all of this, however, is a landscape ripe for the transformative influence of generative AI. While some organisations focus on control and others on cost, both are seeking efficiency and innovation - fertile ground for GPTs to demonstrate their value.</p><p>Despite these organisational constraints, a trend of generative AI usage in the workplace is already emerging. Many professionals are independently exploring generative AI tools to enhance their productivity. 
This grassroots adoption highlights the latent potential of generative AI, which, when fully integrated with organisational systems and workflows, could unlock unprecedented levels of innovation and efficiency.</p><p>Organisations open to rethinking their technology strategies stand to gain immense value and competitive edge from generative AI. Those slower to adapt might find themselves challenged by more agile competitors and newcomers designed with generative AI at their core. In the near future, access to generative AI tools could even become a sought-after benefit for job candidates, akin to the demand for hybrid-working options today.</p><p>2024 is going to be a really interesting year for technology in organisations. The arrival of &#8216;Enterprise-ready&#8217; generative AI tools marks a pivotal moment. How businesses embrace and leverage this technology could very well redefine their future success and influence the broader technological landscape.</p><div><hr></div><blockquote><p><em>&#8220;The future is already here, it&#8217;s just not evenly distributed.&#8221;</em></p><p><strong>William Gibson</strong></p></blockquote>]]></content:encoded></item><item><title><![CDATA[Is Generative AI a Feature or a Platform?]]></title><description><![CDATA[...or an app or a service, or all of these and more?]]></description><link>https://www.the-blueprint.ai/p/is-generative-ai-a-feature-or-a-platform</link><guid isPermaLink="false">https://www.the-blueprint.ai/p/is-generative-ai-a-feature-or-a-platform</guid><dc:creator><![CDATA[Sean 🤓]]></dc:creator><pubDate>Tue, 12 Sep 2023 08:00:18 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!KdyT!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F639cbaac-f1f3-473a-971d-d4c4aa4db22d_1456x816.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" 
target="_blank" href="https://substackcdn.com/image/fetch/$s_!KdyT!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F639cbaac-f1f3-473a-971d-d4c4aa4db22d_1456x816.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!KdyT!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F639cbaac-f1f3-473a-971d-d4c4aa4db22d_1456x816.png 424w, https://substackcdn.com/image/fetch/$s_!KdyT!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F639cbaac-f1f3-473a-971d-d4c4aa4db22d_1456x816.png 848w, https://substackcdn.com/image/fetch/$s_!KdyT!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F639cbaac-f1f3-473a-971d-d4c4aa4db22d_1456x816.png 1272w, https://substackcdn.com/image/fetch/$s_!KdyT!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F639cbaac-f1f3-473a-971d-d4c4aa4db22d_1456x816.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!KdyT!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F639cbaac-f1f3-473a-971d-d4c4aa4db22d_1456x816.png" width="1456" height="816" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/639cbaac-f1f3-473a-971d-d4c4aa4db22d_1456x816.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:816,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1868895,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!KdyT!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F639cbaac-f1f3-473a-971d-d4c4aa4db22d_1456x816.png 424w, https://substackcdn.com/image/fetch/$s_!KdyT!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F639cbaac-f1f3-473a-971d-d4c4aa4db22d_1456x816.png 848w, https://substackcdn.com/image/fetch/$s_!KdyT!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F639cbaac-f1f3-473a-971d-d4c4aa4db22d_1456x816.png 1272w, https://substackcdn.com/image/fetch/$s_!KdyT!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F639cbaac-f1f3-473a-971d-d4c4aa4db22d_1456x816.png 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">a virus, spreading through a digital system, neon colors, clean pixel art --ar 16:9</figcaption></figure></div><p>We&#8217;re coming up to a year since OpenAI unleashed ChatGPT on the world and in some ways, we&#8217;ve seen a lot of progress on a weekly, and sometimes daily basis, since then. In other ways we haven&#8217;t seen a huge amount of progress either. Enterprise applications of generative AI technologies are few and far between and there are still plenty of ethical, usability and operational challenges to overcome. 
I expect 2024 to be a bigger year for generative AI than 2023 has been, and whilst thinking ahead to next year, I started wondering how pervasive generative AI will be and what different forms we should expect it to take.</p><p>To help with this exploration, it&#8217;s useful to look to the past for guidance, but also not to draw too many parallels, as generative AI will no doubt evolve and impact society in ways we can&#8217;t even imagine yet. Arguably, social media is the closest technology that we&#8217;ve seen to generative AI. It certainly shares many of social media&#8217;s ethical challenges, and social media has had a huge effect on society over the last 20 years - a path I expect generative AI to follow, albeit with much greater impact across all industries.</p><p>When you look back at the way social media evolved, it started as just a feature in existing applications and services (i.e. the &#8216;share button&#8217;) before maturing over time into the huge social media platforms we see today. I am convinced that generative AI will take a similar path but will do so at a much faster pace. Because of this, I think it&#8217;s worth, even at this early stage of the technology, making some predictions of how this could play out.</p><h2>The Anatomy of Digital Ecosystems</h2><p>Before we get into our predictions, however, let&#8217;s define some terms and set some ground rules to help guide us:</p><h4>Features</h4><p>Features are specific functionalities embedded within an existing application or service to perform a particular task. </p><p>For instance, the 'share button' in early social media was a feature integrated into various websites, allowing users to share content but not necessarily to interact in a social networking context.</p><h4>Apps</h4><p>An app is a standalone piece of software designed to perform a range of related tasks. 
Apps can embody one or multiple features, but they exist independently, often having their own user interface and operating logic. </p><p>It&#8217;s a bit 2020, but Houseparty is (was?!) a good example of a social media app - it had a singular purpose, facilitating group video calls, and didn&#8217;t have all the trappings of a typical social media platform.</p><h4>Services</h4><p>A Service is an offering that performs specific tasks on behalf of the user, often operating in the background. Services may be part of an app or platform, but they can also stand alone, often requiring some form of subscription or payment model. </p><p>Buffer is a useful social media service that allows users to schedule posts, track social media engagement, and manage multiple accounts across various social platforms. It operates in the background, fulfilling a very specific set of tasks related to social media, typically without providing a platform for social interaction itself.</p><h4>Platforms</h4><p>A Platform is a comprehensive ecosystem that combines various features, apps, and services into a unified experience. Platforms offer a broad range of functionalities, often integrated in such a way that they augment each other, providing a cohesive user experience. Some of the key features that you see on platforms are:</p><ul><li><p>Multiple entry points (web, app, widgets).</p></li><li><p>User profiles.</p></li><li><p>Customisation, inc. 
privacy controls.</p></li><li><p>Notifications.</p></li><li><p>Search functionality.</p></li><li><p>Activity feeds.</p></li><li><p>Content curation.</p></li><li><p>Collaboration tools &amp; sharing features.</p></li><li><p>Integrations with other apps/services.</p></li></ul><p>Facebook, Instagram, TikTok, LinkedIn, X (Twitter), Pinterest, Snapchat, Slack - there are far too many social media platforms to name them all, but you get the idea!</p><h2>Generative AI as a&#8230;</h2><p>It&#8217;s worth stating before we get into each of the different parts outlined above that unlike social media, we&#8217;ve already seen generative AI manifest as features, applications, and services. Generative AI is not going to see the same linear progression that we saw with the development of social media - it&#8217;s moving much too fast for that.</p><blockquote><p>Side note: if you plot how generative AI has evolved so far, despite its rapid pace, I think it&#8217;s probably progressed (and will continue to progress) mostly in this order: <strong>Service &#8594; App &#8594; Feature &#8594; Platform</strong>, whereas social media mostly progressed as follows: <strong>Feature &#8594; App &#8594; Service &#8594; Platform</strong>.</p></blockquote><h2>Generative AI as a&#8230; Feature</h2><p>Over the last few months, we&#8217;ve started to see many digital apps and platforms announce generative AI features. One of the first digital platforms out of the gates was Snapchat with <a href="https://newsroom.snap.com/en-GB/say-hi-to-my-ai">My AI</a>. 
Since then we&#8217;ve seen announcements from Microsoft (<a href="https://blogs.windows.com/windowsdeveloper/2023/05/23/bringing-the-power-of-ai-to-windows-11-unlocking-a-new-era-of-productivity-for-customers-and-developers-with-windows-copilot-and-dev-home/">Windows Copilot</a>, <a href="https://blogs.microsoft.com/blog/2023/03/16/introducing-microsoft-365-copilot-your-copilot-for-work/">Microsoft 365 Copilot</a>), Google (<a href="https://workspace.google.com/blog/product-announcements/duet-ai">Duet</a>), Slack (<a href="https://www.salesforce.com/uk/news/stories/slack-news-dreamforce-2023/">Slack AI</a>) and others. GitHub&#8217;s Copilot pre-dates all of these - <a href="https://github.blog/2021-06-29-introducing-github-copilot-ai-pair-programmer/">announced</a> in June 2021, it is probably the earliest example of a generative AI feature being released.</p><p>I expect the announcement of new generative AI features in apps and platforms to accelerate towards 2024, and I think in a couple of years&#8217; time we&#8217;ll look back and wonder how we &#8216;survived&#8217; without generative AI features in all our apps and platforms. In the future we&#8217;ll take generative AI features for granted in the same way we take search and social features for granted right now.</p><h2>Generative AI as an&#8230; App&#8230; or Service</h2><p>There has been an explosion of new generative AI apps and services in 2023, from only a handful in 2022. Now we have generative AI apps and services for chat, image creation/enhancement, voice synthesis, audio creation, video creation, 3D modelling, analytics and gaming, to name a few! It&#8217;s fair to say that the chat, image, and voice apps have had the majority of the limelight, but that will no doubt evolve in 2024.</p><blockquote><p>I&#8217;ve purposefully not distinguished between generative AI apps and services here as I think in these early days of the technology it&#8217;s very difficult to draw the line between the two. 
The difference between the two for generative AI should become much clearer in 2024.</p></blockquote><p>Most of the generative AI apps and services that we&#8217;ve seen so far, regardless of what type of content they output, are &#8216;text-input&#8217; apps. This means that many of them are essentially chat interfaces, whether or not they&#8217;re actually chat apps. I think this is the area we&#8217;ll see evolve the most in 2024 (at least I hope!). I really like the idea of chat being a primary user interface, but it can&#8217;t be the <strong>ONLY</strong> part of a user interface. There&#8217;s a lot of great work ahead to figure out what a chat-centric user interface looks like, and I&#8217;m excited to see where it takes us.</p><h2>Generative AI as a&#8230; Platform</h2><p>I&#8217;d argue that we have nothing anywhere near a generative AI platform even on the horizon yet, but I&#8217;m convinced this is where the major players are heading. I&#8217;m not even sure &#8216;generative AI platform&#8217; is the right phrase to use, even though the central technology will be generative AI in its broadest sense. </p><p>It&#8217;s such early days that it&#8217;s probably difficult to even envision what a generative AI platform could be. So, to help us, let&#8217;s go back to those features of a platform that I outlined earlier:</p><h4>Multiple entry points</h4><p>This should be an easy one - any generative AI platform needs to be accessible from a browser and an app (both mobile and desktop), have browser plugins/widgets and, most important of all, bring voice in as a major entry point. Voice is much easier said than done (#sorrynotsorry) and it hasn&#8217;t really taken off as a major input paradigm yet. At some point in the next few years, we&#8217;ll see that change, and generative AI will be the catalyst for that change.</p><h4>User profiles</h4><p>Again, this should be an easy one, right? User profiles are user profiles. 
Some are good, some are bad, but they all serve a similar purpose&#8230; until they don&#8217;t. For a generative AI platform, I think we need to see an evolution of the user profile. The reason for this is that users of a generative AI platform will need to set up and manage many more settings and features than on a typical digital platform today. For example:</p><ul><li><p><strong>Connections/integrations with a user&#8217;s other services</strong> that they want a generative AI platform to have access to. This will enable the platform to supplement its knowledge of users&#8217; wider digital lives.</p></li><li><p><strong>Managing the memory/knowledge</strong> that a generative AI platform has of a user. A successful generative AI platform should record every interaction a user has with it and that should be fully visible and editable for users.</p></li><li><p><strong>Tool/plugin management</strong>. It&#8217;s likely that a generative AI platform will have access to lots of different tools to help its users, and these will need to be set up and managed in a user&#8217;s profile.</p></li></ul><h4>Customisation, inc. privacy controls</h4><p>A generative AI platform should be highly customisable as it should ultimately be a very bespoke and tailored experience for every user. It will be truly personal.</p><ul><li><p><strong>Aesthetics</strong> are important. I&#8217;d love to see a platform where users can customise its look and feel, all the way from the structure of the interface to the sounds, fonts, and colour scheme. I see customisation of the user interface of generative AI platforms needing to be the digital equivalent of the personalisation options we currently see in wearable technologies.</p></li><li><p><strong>AI preferences. </strong>Users should be able to customise the &#8216;personality&#8217; of their generative AI account, being able to set gender, ethnicity, conversational style and level of creativity and original thinking amongst many other things. 
These settings might even be dynamic and evolve over time based on the interactions a user has with the platform.</p></li><li><p><strong>More sophisticated privacy controls</strong> to allow users granular control of who has visibility of, interaction with and/or access to their generative AI platform account. This will be a core part of any social interactions available on future generative AI platforms.</p></li></ul><h4>Notifications</h4><p>Another simple one, right? Wrong! Similar to privacy controls, users of a generative AI platform are going to want (at least to start with) much more granular control of their notifications. The reason? They&#8217;re going to be handing over control of what they get notified about and when to a generative AI platform for the very first time. I fully expect the need to configure notifications to go away over time as users become more comfortable with the idea of an AI handling their notifications and the platform learning a user&#8217;s preferences. But to start with, users are going to want to have full control of their notifications.</p><p>Notifications will also turn the current interaction model with generative AI technology on its head. At the moment, we have a very &#8216;one-way&#8217; interaction with generative AI where we ask it something, which it then responds to. For a generative AI platform to be truly successful, that interaction needs to become &#8216;two-way&#8217;, with the platform having the ability to get users&#8217; attention and prompt them for input.</p><h4>Search functionality</h4><p>Nope. Don&#8217;t need this one. Why would you need search functionality when you have a generative AI platform that has knowledge/memory of every interaction you&#8217;ve had with it and is connected into your wider digital life? The answer: you don&#8217;t. 
If you&#8217;re looking for something, just ask the platform.</p><h4>Activity feeds</h4><p>This is both simple and complex and then simple again all at the same time. Simple in the sense that the activity feed could just be a &#8216;chat&#8217; history/log of all the interactions you&#8217;ve had with the platform. But what if the generative AI platform has access to lots of connections/integrations with a user&#8217;s other services? What if a user allows others to view, interact with, or even access their account? This complicates things when it comes to an activity feed as I believe users should have a full log of every action a generative AI platform has taken. But that then brings me back to the search functionality - if you want to know what the generative AI platform has been up to, just ask it - simple!</p><h4>Content curation</h4><p>Content curation on a &#8216;typical&#8217; digital platform is usually down to user selection, i.e. what topic/person/company etc. they want to follow and see content from. A generative AI platform could be that simple, but that wouldn&#8217;t be fun, would it? Another option would be for the generative AI platform to learn the content you&#8217;re interested in and do the curation for you. This is pretty much how many social media feeds currently work, but it doesn&#8217;t lead to a very satisfactory experience (in my opinion).</p><p>The answer to content curation needs to be much more nuanced. It&#8217;s not just about what a user sees, it&#8217;s also about when, where, and how they see it. For a generative AI platform, I think the definition of content is much broader as well - it could be news, images, social posts, emails, text messages, chat messages, app notifications and all sorts of content coming from all the other connections the platform has with a user&#8217;s wider digital life. 
Bear with me on this one&#8230;</p><p>As a user I want to consume content in many different ways, for example:</p><ul><li><p>There is content I absolutely want to be alerted to the moment it&#8217;s available/happens so that I&#8217;m fully up-to-date and able to respond immediately if necessary.</p></li><li><p>There are fleeting moments when I fancy seeing what&#8217;s going on in the wider world and want to be able to consume content casually.</p></li><li><p>There are moments when I have time to purposefully consume content that I&#8217;m really interested in.</p></li><li><p>There are moments when I want to see what my friends/family/colleagues have been up to and to communicate with them.</p></li></ul><p>These moments all require different types of content curation, and my hope is that a generative AI platform would be able to help me with this, presenting me with the right content at the right moment and learning from my preferences over time.</p><h4>Collaboration tools &amp; sharing features</h4><p>Ah, relief - another simple one I hear you say. Yep, of course it is&#8230; a user just shares what they want to share and has the ability to un-share whenever they please! Except&#8230; this is a generative AI platform and what if the generative AI platform wants to be able to share something? How would we manage the permissions for that and how granular would they need to be? What does sharing and collaboration even mean in the context of generative AI? Is it:</p><ul><li><p>human &#8592;&#8594; human?</p></li><li><p>human &#8592;&#8594; machine?</p></li><li><p>machine &#8592;&#8594; human?</p></li><li><p>machine &#8592;&#8594; machine?</p></li><li><p>All or none of the above?</p></li></ul><p>A user should also be able to control whether their content is visible, extendible, or editable, not just generically &#8216;shared&#8217;. 
Lots to think on with this one.</p><h4>Integrations with other apps/services</h4><p>I think this will be vast - it&#8217;s not going to be as simple as just ticking a box to say you want an update on one platform to be shared automatically on another platform. If you imagine some of the integrations you might want a generative AI platform to have in order to have a better understanding of your wider digital life, it could mean literally anything - your music services, streaming services, news services, emails, messaging, social platforms, banking, subscription services, mobile app integrations, smart home integrations, internet of things integrations and the list will go on.</p><p>Some of these integrations will have to come with considerable safeguards and require huge amounts of user trust, which won&#8217;t be easy to come by to start with. But I can absolutely see the benefits.</p><p>Just managing this broad an ecosystem of APIs and integrations will be a significant technical task all on its own for a generative AI platform, let alone all the smart stuff that could be enabled once the platform has access to all that data.</p><h2>Could generative AI go further than a platform?</h2><p>You may have noticed that a certain word or phrase is conspicuously missing from all of the content above. It&#8217;s something that&#8217;s often talked about in lofty terms and certainly something that all the major generative AI players have talked about creating. Have you guessed what it is yet?</p><p>Oh, you got it - you smart people!</p><p>The word is <s>SAMANTHA</s> <strong>ASSISTANT</strong>. Or <a href="https://pi.ai/talk">personal assistant</a>, <a href="https://en.wikipedia.org/wiki/Virtual_assistant">virtual assistant</a>, or <a href="https://decrypt.co/152843/google-ai-life-coach">life coach</a> if you must. 
I don&#8217;t think the often used term co-pilot quite does this idea justice, so let&#8217;s ignore that one.</p><p>Yes, if you piece together all the generative AI platform features I&#8217;ve outlined above each user gets themselves their very own generative AI personal assistant. You&#8217;re welcome.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!l7wY!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7bc8e718-531a-497c-9a75-a0ab4b128ad3_888x499.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!l7wY!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7bc8e718-531a-497c-9a75-a0ab4b128ad3_888x499.jpeg 424w, https://substackcdn.com/image/fetch/$s_!l7wY!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7bc8e718-531a-497c-9a75-a0ab4b128ad3_888x499.jpeg 848w, https://substackcdn.com/image/fetch/$s_!l7wY!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7bc8e718-531a-497c-9a75-a0ab4b128ad3_888x499.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!l7wY!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7bc8e718-531a-497c-9a75-a0ab4b128ad3_888x499.jpeg 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!l7wY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7bc8e718-531a-497c-9a75-a0ab4b128ad3_888x499.jpeg" width="888" height="499" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/7bc8e718-531a-497c-9a75-a0ab4b128ad3_888x499.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:499,&quot;width&quot;:888,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:132918,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!l7wY!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7bc8e718-531a-497c-9a75-a0ab4b128ad3_888x499.jpeg 424w, https://substackcdn.com/image/fetch/$s_!l7wY!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7bc8e718-531a-497c-9a75-a0ab4b128ad3_888x499.jpeg 848w, https://substackcdn.com/image/fetch/$s_!l7wY!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7bc8e718-531a-497c-9a75-a0ab4b128ad3_888x499.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!l7wY!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7bc8e718-531a-497c-9a75-a0ab4b128ad3_888x499.jpeg 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Her (2013) from Spike Jonze</figcaption></figure></div><p>In my mind, a generative AI platform <em><strong>is</strong></em> an assistant and the natural evolution of the large digital platforms we currently have. If you go back to my original definition of a platform and put generative AI in the middle of it, you&#8217;ll see what I mean:</p><div class="pullquote"><p>&#8220;A (<strong>generative AI</strong>) Platform is a comprehensive ecosystem that combines various features, apps, and services into a unified experience. (<strong>Generative AI</strong>) platforms offer a broad range of functionalities, often integrated in such a way that they augment each other, providing a cohesive user experience.&#8221;</p></div><p>But real life isn&#8217;t that simple - you&#8217;re not going to get to a generative AI assistant just by putting generative AI features into a digital platform. 
You&#8217;re also not going to build a generative AI assistant out of a generative AI app or service.</p><p>As you can see from the different platform features I outlined above, all of which need re-thinking and engineering for generative AI, there is a lot of work that needs to be done to get anywhere near a generative AI assistant.</p><p>I don&#8217;t think the work is overly technical - I think it&#8217;s mostly in user experience design, spotting the opportunities we&#8217;ve previously missed and avoiding all of the mistakes we&#8217;ve made when it comes to giving users full transparency and control over their content and data. This will take time to get right, but it&#8217;s not out of reach for today&#8217;s technology.</p><p>For me, that&#8217;s a really exciting thought. I believe it&#8217;s technically possible today to deliver a technology that we&#8217;ve only dreamed of in science fiction for decades. Whilst there are challenges to work through, it&#8217;s absolutely within our reach. I&#8217;m also excited because I don&#8217;t think an assistant like this <em><strong>can</strong></em> come from any of the large digital platforms. And that means that at some point in the next 5 years (maybe less) we&#8217;re going to see a new platform emerge, built from the ground up around generative AI, and seriously challenging and disrupting the large platforms we currently have in the digital ecosystem.</p><p>We&#8217;ve had the Internet (as we currently know it) for 30 years now and during that time we&#8217;ve seen a lot come and go. 
Of those large digital institutions that are still with us, we&#8217;ve had:</p><ul><li><p>Amazon for 30 years.</p></li><li><p>Google for 25 years.</p></li><li><p>Facebook, YouTube, Twitter and LinkedIn for nearly 20 years.</p></li><li><p>iOS, Android, Instagram, SnapChat, WhatsApp, and WeChat for around 15 years.</p></li><li><p>Lastly, TikTok for 6 years.</p></li></ul><p>We&#8217;re definitely overdue a change at the top and I for one can&#8217;t wait to see how generative AI shakes things up.</p><div><hr></div><div class="captioned-button-wrap" data-attrs="{&quot;url&quot;:&quot;https://www.the-blueprint.ai/p/is-generative-ai-a-feature-or-a-platform?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="CaptionedButtonToDOM"><div class="preamble"><p class="cta-caption">Thank you for reading THE BLUEPRINT. This post is public so feel free to share it.</p></div><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://www.the-blueprint.ai/p/is-generative-ai-a-feature-or-a-platform?utm_source=substack&utm_medium=email&utm_content=share&action=share&quot;,&quot;text&quot;:&quot;Share&quot;}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://www.the-blueprint.ai/p/is-generative-ai-a-feature-or-a-platform?utm_source=substack&utm_medium=email&utm_content=share&action=share"><span>Share</span></a></p></div><div><hr></div><blockquote><p><em>"The future is already here, it's just not evenly distributed."</em></p><p><strong>William Gibson</strong></p></blockquote><div><hr></div><p><em>This article was researched and written with help from ChatGPT, but was lovingly reviewed, edited and fine-tuned by a human.</em></p>]]></content:encoded></item><item><title><![CDATA[The Balancing Act]]></title><description><![CDATA[Finding a middle ground for copyright and generative AI]]></description><link>https://www.the-blueprint.ai/p/the-balancing-act</link><guid 
isPermaLink="false">https://www.the-blueprint.ai/p/the-balancing-act</guid><dc:creator><![CDATA[Sean 🤓]]></dc:creator><pubDate>Fri, 18 Aug 2023 08:01:07 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!kmhq!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff8eb9c0f-3f99-4e7d-a32c-644d01cc0696_1456x816.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!kmhq!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff8eb9c0f-3f99-4e7d-a32c-644d01cc0696_1456x816.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!kmhq!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff8eb9c0f-3f99-4e7d-a32c-644d01cc0696_1456x816.png 424w, https://substackcdn.com/image/fetch/$s_!kmhq!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff8eb9c0f-3f99-4e7d-a32c-644d01cc0696_1456x816.png 848w, https://substackcdn.com/image/fetch/$s_!kmhq!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff8eb9c0f-3f99-4e7d-a32c-644d01cc0696_1456x816.png 1272w, https://substackcdn.com/image/fetch/$s_!kmhq!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff8eb9c0f-3f99-4e7d-a32c-644d01cc0696_1456x816.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!kmhq!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff8eb9c0f-3f99-4e7d-a32c-644d01cc0696_1456x816.png" width="1456" height="816" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/f8eb9c0f-3f99-4e7d-a32c-644d01cc0696_1456x816.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:816,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:698357,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!kmhq!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff8eb9c0f-3f99-4e7d-a32c-644d01cc0696_1456x816.png 424w, https://substackcdn.com/image/fetch/$s_!kmhq!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff8eb9c0f-3f99-4e7d-a32c-644d01cc0696_1456x816.png 848w, https://substackcdn.com/image/fetch/$s_!kmhq!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff8eb9c0f-3f99-4e7d-a32c-644d01cc0696_1456x816.png 1272w, https://substackcdn.com/image/fetch/$s_!kmhq!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff8eb9c0f-3f99-4e7d-a32c-644d01cc0696_1456x816.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft 
pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">cyborg reading a book, cyberpunk, neon colors, clean pixel art --ar 16:9</figcaption></figure></div><p>Two things happened this week which have caused me to think about how we navigate copyright issues in the era of generative AI. The first was the announcement by OpenAI of their new web crawler, and the second was a response I got from ChatGPT when asking about a literary character by an author my wife works with.</p><h2><strong>GPTBot</strong></h2><p>GPTBot is the <a href="https://en.wikipedia.org/wiki/Web_crawler">web crawler</a> that OpenAI <a href="https://platform.openai.com/docs/gptbot">announced</a> last week and will now use going forwards to collect data from the internet to train its models on. 
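For publishers who would rather opt out, OpenAI&#8217;s GPTBot documentation describes a standard robots.txt entry; a minimal sketch of the site-wide block:

```text
# robots.txt at the site root: keep OpenAI's GPTBot crawler out of the entire site
User-agent: GPTBot
Disallow: /
```

The same documentation also shows `Allow:` and `Disallow:` directives used together to expose some directories to GPTBot while keeping it out of others.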
What makes it interesting is that it can be clearly identified and very simply blocked (in 2 short lines of code!) by any publisher that doesn&#8217;t want their content crawled and used by OpenAI in training any future large language models. OpenAI also filter what GPTBot collects to <em>&#8220;remove sources that require paywall access, are known to gather personally identifiable information (PII), or have text that violates (our) policies&#8221;</em>. OpenAI also give guidance on how publishers can allow access to parts of their sites, but not others.</p><p>To me, this represents good progress: OpenAI are being fully transparent and putting the power back into publishers&#8217; hands over whether or not they&#8217;re happy for their content to be included in GPT&#8217;s training data.</p><p>Transparency like this will build trust among generative AI companies, publishers and consumers, which will hopefully lead to more responsible use of AI in the future. Ensuring content owners are 100% in control of their content is a big step towards a more ethical approach to gathering data to train large language models and is hopefully an example that other generative AI companies will follow.</p><h2>Copyrighted Text</h2><div class="captioned-image-container"><figure><a class="image-link image2" target="_blank" href="https://substackcdn.com/image/fetch/$s_!wbve!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa1c3946b-5505-493f-b8ad-2a29de1f57d3_1637x385.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!wbve!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa1c3946b-5505-493f-b8ad-2a29de1f57d3_1637x385.png 424w, 
https://substackcdn.com/image/fetch/$s_!wbve!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa1c3946b-5505-493f-b8ad-2a29de1f57d3_1637x385.png 848w, https://substackcdn.com/image/fetch/$s_!wbve!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa1c3946b-5505-493f-b8ad-2a29de1f57d3_1637x385.png 1272w, https://substackcdn.com/image/fetch/$s_!wbve!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa1c3946b-5505-493f-b8ad-2a29de1f57d3_1637x385.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!wbve!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa1c3946b-5505-493f-b8ad-2a29de1f57d3_1637x385.png" width="1456" height="342" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a1c3946b-5505-493f-b8ad-2a29de1f57d3_1637x385.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:342,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:94253,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!wbve!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa1c3946b-5505-493f-b8ad-2a29de1f57d3_1637x385.png 424w, 
https://substackcdn.com/image/fetch/$s_!wbve!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa1c3946b-5505-493f-b8ad-2a29de1f57d3_1637x385.png 848w, https://substackcdn.com/image/fetch/$s_!wbve!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa1c3946b-5505-493f-b8ad-2a29de1f57d3_1637x385.png 1272w, https://substackcdn.com/image/fetch/$s_!wbve!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa1c3946b-5505-493f-b8ad-2a29de1f57d3_1637x385.png 1456w" sizes="100vw"></picture><div></div></div></a><figcaption class="image-caption">can you give me some quotes from The Skull?</figcaption></figure></div><p>The second thing that happened last week was that I was told by ChatGPT (using GPT-4) that it couldn&#8217;t respond to my request because it couldn&#8217;t reproduce copyrighted material. At the time I was asking ChatGPT to give me some quotes from The Skull in Jonathan Stroud&#8217;s <a href="https://en.wikipedia.org/wiki/Lockwood_%26_Co.">Lockwood &amp; Co.</a> series for a little side project I&#8217;m working on &#128064;.</p><p>This is the first time I&#8217;ve seen a message like this from ChatGPT, and it&#8217;s not something I&#8217;ve seen widely discussed, so I&#8217;m not sure if this is a new feature. However, I think this is the responsible response from ChatGPT - it shouldn&#8217;t be able to reproduce any copyrighted material, and whilst it can talk to me about the book series, summarise the plot and answer questions on the content, actually reproducing any text is the line it absolutely shouldn&#8217;t cross.</p><h2>Implications</h2><p>So why are these two seemingly unrelated things so interesting? 
Well, I think they point to a good middle ground in the ongoing debate and lawsuits around how online content is being used by generative AI companies to train their large language models.</p><p>Let&#8217;s put this in human terms, and simplify things for clarity - there is currently a huge amount of content that people have access to online without charge. This is all paid for by advertising (but that&#8217;s another topic for another time - I have strong views!). From a consumer&#8217;s perspective, they aren&#8217;t paying for any of this content beyond an implicit sharing of their &#8216;data&#8217;. This is one of the main things that has made the internet what it is today - the ability to freely share information, giving consumers the ability to use the internet for a huge variety of day-to-day tasks.</p><p>There is also some &#8216;premium&#8217; content that is only available online if you&#8217;re willing to pay. That could be through more explicitly sharing your data (e.g. having to have a registered account to access it) or through paying a subscription or other charge. Premium, chargeable content has been a big area of growth for lots of publishers over the past few years and there are now lots of successful, paid-for services online.</p><p>In both of these cases, we&#8217;re (society) saying that it is ok for humans to consume this content freely, if it&#8217;s available freely, or for a charge if it&#8217;s through a paid service. But what society also says (in our laws etc.) is that it&#8217;s not ok for humans to reproduce this content without the express permission of the content owner. I&#8217;m aware that this is a simplification of what are some very complex laws that differ from country to country, but the broad principles are consistent.</p><p>Much like how humans read and internalise information from books or articles, large language models &#8216;consume&#8217; content by analysing and storing the data they are trained on. 
This absorption process is analogous to our learning: both humans and generative AI models use the ingested information to inform future actions, responses, or decisions.</p><p>My provocation is therefore: why should this be any different for large language models? If generative AI companies only train their models on content that is freely available, or through commercial agreements with premium content providers, and agree that they won&#8217;t reproduce copyrighted material, wouldn&#8217;t this represent a good outcome for all? I certainly think so.</p><p>There are obviously a few caveats to this (e.g. this is easier to spot/enforce for text generation than image generation), and I know the world isn&#8217;t as simple as I&#8217;ve described above, but this seems like a sensible approach. This way society can benefit from large language models that are trained on the widest selection of material available, premium content can be gated and paid for, and content creators are safeguarded from their work being reproduced without permission.</p><h2><strong>Conclusion</strong></h2><p>To get to this proposed solution, we have to conceptually separate the training of the large language models from the outputs they generate, which I think is a very reasonable thing to do, and it&#8217;s great to see OpenAI moving in this direction. I&#8217;m sure that we&#8217;ll be caught up in this debate for some time and there are plenty more legal cases (and lawyers&#8217; fees!) to be had, but I&#8217;d love to see us find some middle ground that benefits all parties and allows generative AI technology to continue to progress in a way that puts control back into publishers&#8217; hands and fairly rewards content creators.</p><p>It will be interesting to see how other people feel about this. 
How do you think we can find a way for generative AI to continue to develop whilst respecting creative rights?</p><p>Please feel free to sound off in the comments below!</p><div><hr></div><blockquote><p><em>"The future is already here, it's just not evenly distributed."</em></p><p><strong>William Gibson</strong></p></blockquote><div><hr></div><p><em>This article was researched and written with help from ChatGPT, but was lovingly reviewed, edited and fine-tuned by a human.</em></p><div><hr></div>]]></content:encoded></item><item><title><![CDATA[The Blueprint - Part III: Training & Access Decisions]]></title><description><![CDATA[Taking an agile approach to implementing generative AI]]></description><link>https://www.the-blueprint.ai/p/part-iii-training-and-access-decisions</link><guid isPermaLink="false">https://www.the-blueprint.ai/p/part-iii-training-and-access-decisions</guid><dc:creator><![CDATA[Sean 🤓]]></dc:creator><pubDate>Wed, 16 Aug 2023 08:00:19 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!0xVA!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93b5b2ee-76e1-4672-95c6-dc3b720d591d_2676x1500.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!0xVA!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93b5b2ee-76e1-4672-95c6-dc3b720d591d_2676x1500.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!0xVA!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93b5b2ee-76e1-4672-95c6-dc3b720d591d_2676x1500.png 424w, 
https://substackcdn.com/image/fetch/$s_!0xVA!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93b5b2ee-76e1-4672-95c6-dc3b720d591d_2676x1500.png 848w, https://substackcdn.com/image/fetch/$s_!0xVA!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93b5b2ee-76e1-4672-95c6-dc3b720d591d_2676x1500.png 1272w, https://substackcdn.com/image/fetch/$s_!0xVA!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93b5b2ee-76e1-4672-95c6-dc3b720d591d_2676x1500.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!0xVA!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93b5b2ee-76e1-4672-95c6-dc3b720d591d_2676x1500.png" width="1456" height="816" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/93b5b2ee-76e1-4672-95c6-dc3b720d591d_2676x1500.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:816,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:3832132,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!0xVA!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93b5b2ee-76e1-4672-95c6-dc3b720d591d_2676x1500.png 424w, 
https://substackcdn.com/image/fetch/$s_!0xVA!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93b5b2ee-76e1-4672-95c6-dc3b720d591d_2676x1500.png 848w, https://substackcdn.com/image/fetch/$s_!0xVA!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93b5b2ee-76e1-4672-95c6-dc3b720d591d_2676x1500.png 1272w, https://substackcdn.com/image/fetch/$s_!0xVA!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F93b5b2ee-76e1-4672-95c6-dc3b720d591d_2676x1500.png 1456w" sizes="100vw" fetchpriority="high"></picture><div></div></div></a><figcaption class="image-caption">isometric clean pixel art image of japanese dojo, neon colors --ar 16:9</figcaption></figure></div><p>Hello and welcome to part three of my Blueprint series, where I&#8217;ll be covering the different training methods for generative AI and how we can tackle the challenges around data confidentiality, security, and fluidity. This post is longer than usual and goes into some technical areas (which I&#8217;ve simplified and kept to a minimum), but this is necessary to cover the ground and help readers understand the decisions to be made around training generative AI models on a business&#8217;s data.</p><p>In my previous article, I covered the importance of knowledge management, which forms an essential part of any generative AI strategy. Knowledge is the foundation that any generative AI system will be built on, so getting it right is vital. If you haven't read that yet, I encourage you to take a look <strong><a href="https://www.the-blueprint.ai/p/part-ii-knowledge-management">here</a></strong>.</p><h2>Generative AI Training &amp; Data Access</h2><p>As briefly covered in <a href="https://www.the-blueprint.ai/p/part-ii-knowledge-management">part two</a> of The Blueprint series, there are broadly five options to consider when it comes to training generative AI models and giving them access to a business&#8217;s data. Below we go into each of them in more detail, explaining what they are and their pros and cons:</p><h4>1. Pre-training</h4><p>Pre-training involves training a generative AI model from scratch on domain-specific data.</p><ul><li><p><strong>Pros:</strong></p><ul><li><p>Results in a highly specialised model that could outperform other training methods for specific tasks or domains.</p></li><li><p>The generative AI model will have a better understanding of the domain-specific nuances.</p></li></ul></li><li><p><strong>Cons:</strong></p><ul><li><p>Computationally expensive, time-consuming and has high environmental costs.</p></li><li><p>Requires large amounts of domain-specific data.</p></li><li><p>Requires specific deep learning technical expertise.</p></li><li><p>May make the generative AI model less effective at general tasks.</p></li><li><p>Knowledge can&#8217;t reference or be linked back to a source document.</p></li><li><p>Risk of <a href="https://en.wikipedia.org/wiki/Overfitting">overfitting</a> if the domain-specific data is not diverse enough.</p></li></ul></li></ul><h4>2. 
Fine-tuning</h4><p><a href="https://en.wikipedia.org/wiki/Fine-tuning_(deep_learning)">Fine-tuning</a> involves re-training a pre-trained generative AI model with new domain-specific data.</p><ul><li><p><strong>Pros:</strong></p><ul><li><p>Computationally more efficient, less time-consuming and has lower environmental costs than pre-training.</p></li><li><p>Tailors a pre-trained model to a specific task or domain.</p></li><li><p>Less domain-specific data required compared to pre-training.</p></li></ul></li><li><p><strong>Cons:</strong></p><ul><li><p>Requires specific deep learning technical expertise.</p></li><li><p>Knowledge can&#8217;t reference or be linked back to a source document.</p></li><li><p>Risk of <a href="https://en.wikipedia.org/wiki/Overfitting">overfitting</a> to specific domain/tasks, leading to reduced performance of the generative AI model.</p></li><li><p>Risk of <a href="https://en.wikipedia.org/wiki/Catastrophic_interference">catastrophic forgetting</a>, leading to reduced performance of the generative AI model.</p></li><li><p>Might require extensive <a href="https://en.wikipedia.org/wiki/Hyperparameter_optimization">hyperparameter tuning</a>, which can be time-consuming and resource-intensive.</p></li></ul></li></ul><h4><strong>3. 
Embedding</strong></h4><p><a href="https://en.wikipedia.org/wiki/Word_embedding">Embedding</a> involves creating a mathematical representation of the knowledge that is stored in a separate vector database that the generative AI model can access.</p><ul><li><p><strong>Pros:</strong></p><ul><li><p>Computationally more efficient, faster and has less environmental impact than pre-training and fine-tuning.</p></li><li><p>Quick and easy to update.</p></li><li><p>Knowledge can reference or be linked back to a source document.</p></li></ul></li><li><p><strong>Cons:</strong></p><ul><li><p>Can be limited by the original generative AI model training.</p></li><li><p>It can be complex to create effective embeddings.</p></li><li><p>The embedded knowledge may not be fully utilised by the generative AI model.</p></li></ul></li></ul><h4><strong>4. Direct knowledge access</strong></h4><p>This involves storing knowledge in a traditional database and giving a generative AI model access via the ability to run <a href="https://en.wikipedia.org/wiki/SQL">SQL</a> queries.</p><ul><li><p><strong>Pros:</strong></p><ul><li><p>Computationally efficient, and has the least environmental impact as no additional training is needed.</p></li><li><p>Good for numerical knowledge.</p></li><li><p>Knowledge can reference or be linked back to a source document.</p></li><li><p>Good for highly fluid data that needs to be updated regularly.</p></li><li><p>Access restrictions can be put in place for sensitive and confidential data.</p></li></ul></li><li><p><strong>Cons:</strong></p><ul><li><p>Adds complexity to the generative AI system and is not always as reliable as embedding.</p></li><li><p>Running SQL queries could slow down the response time of the model.</p></li><li><p>Not as good as embeddings for text-based knowledge.</p></li><li><p>Special care must be taken to prevent <a href="https://en.wikipedia.org/wiki/SQL_injection">SQL injection</a> attacks or unintentional data 
exposure.</p></li></ul></li></ul><h4><strong>5. Prompt engineering</strong></h4><p><a href="https://en.wikipedia.org/wiki/Prompt_engineering">Prompt engineering</a> involves adding knowledge directly to the prompt and designing the prompt to guide the generative AI model towards the desired output.</p><ul><li><p><strong>Pros:</strong></p><ul><li><p>Computationally most efficient, faster and has the least environmental impact as no additional training is needed.</p></li><li><p>Good for text or image-based knowledge.</p></li><li><p>Knowledge can reference or be linked back to a source document.</p></li><li><p>Quick and straightforward way to adapt a generative AI model to new tasks.</p></li><li><p>Flexibility in controlling the generative AI model's behaviour.</p></li></ul></li><li><p><strong>Cons:</strong></p><ul><li><p>Requires careful crafting of prompts.</p></li><li><p>There is a limit to the amount of information that can be included in a prompt.</p></li><li><p>Not ideal for numerical knowledge.</p></li><li><p>May not always lead to desired output.</p></li><li><p>Can lead to inconsistent results for different prompt designs.</p></li></ul></li></ul><p>Next, let&#8217;s look at the suitability of the above techniques, given their pros and cons, for the different types of knowledge within a business.</p><h2>Decisions, Decisions&#8230;</h2><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!jYvt!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6855f62b-21dd-4928-ad96-4785c88e1035_2880x1800.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!jYvt!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6855f62b-21dd-4928-ad96-4785c88e1035_2880x1800.png 
424w, https://substackcdn.com/image/fetch/$s_!jYvt!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6855f62b-21dd-4928-ad96-4785c88e1035_2880x1800.png 848w, https://substackcdn.com/image/fetch/$s_!jYvt!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6855f62b-21dd-4928-ad96-4785c88e1035_2880x1800.png 1272w, https://substackcdn.com/image/fetch/$s_!jYvt!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6855f62b-21dd-4928-ad96-4785c88e1035_2880x1800.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!jYvt!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6855f62b-21dd-4928-ad96-4785c88e1035_2880x1800.png" width="1456" height="910" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/6855f62b-21dd-4928-ad96-4785c88e1035_2880x1800.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:910,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:222492,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!jYvt!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6855f62b-21dd-4928-ad96-4785c88e1035_2880x1800.png 424w, 
https://substackcdn.com/image/fetch/$s_!jYvt!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6855f62b-21dd-4928-ad96-4785c88e1035_2880x1800.png 848w, https://substackcdn.com/image/fetch/$s_!jYvt!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6855f62b-21dd-4928-ad96-4785c88e1035_2880x1800.png 1272w, https://substackcdn.com/image/fetch/$s_!jYvt!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F6855f62b-21dd-4928-ad96-4785c88e1035_2880x1800.png 1456w" sizes="100vw" loading="lazy"></picture><div></div></div></a><figcaption class="image-caption">Chart showing data training techniques mapped by fluidity of data and sensitivity of data (i.e. high = confidential/sensitive data). Size of bubbles represents cost/complexity of each approach.</figcaption></figure></div><p>So, how do we decide which of these techniques to use? Let&#8217;s start from the bottom left:</p><p><strong>Pre-training</strong> - I think we&#8217;re already at the point where the main generative AI models (<a href="https://chat.openai.com">ChatGPT</a>, <a href="https://claude.ai">Claude</a>, <a href="https://bard.google.com/">Bard</a> etc.) are good enough at most tasks that it doesn&#8217;t make a lot of sense for many businesses to pre-train their own models. However, there are two exceptions - confidentiality and costs. If you want to pass highly confidential knowledge to a generative AI model and you&#8217;re unsure how safe that might be with one of the publicly available models, then you should consider running your own in-house model. Also, if you have a high number of users and/or expect high usage of your generative AI tools, then the costs of using publicly available generative AI models might start to be prohibitive - again, it will make more sense to run an in-house model. </p><blockquote><p>I&#8217;ll be going into more detail on in-house models when I cover the build/partner/buy decision later in The Blueprint series.</p></blockquote><p><strong>Fine-tuning</strong> - This is a really good option if you have specialised domain knowledge in your business/industry that is relatively static and doesn&#8217;t need updating frequently. However, fine-tuning does need specialist deep learning expertise, so it won&#8217;t be for every organisation - it&#8217;s a big endeavour! 
Fine-tuning will be a great approach if you want a general-purpose model in the business that has deep knowledge of your organisation and/or industry, but it will come with a price tag!</p><p><strong>Embedding</strong> - I think this is probably the best middle ground for most generative AI use cases, where you want to give the generative AI model quick and easy access to business knowledge that can be updated relatively frequently. Think overnight updates - more regular than that will probably get costly quite quickly. Embeddings are good where you have knowledge that you want to update regularly, as the embeddings can be overwritten individually. This isn&#8217;t something that can be easily done with pre-training or fine-tuning, where you probably need to start from scratch each time to ensure that the knowledge is updated reliably. Embeddings are also good for written or visual knowledge where you want to be able to reference or link back to a source document.</p><p><strong>Direct knowledge access</strong> - This is a great approach to take for numerical data that is stored in spreadsheets or databases. Essentially, the generative AI model will turn your natural language query into a database query and execute it on the spreadsheet or database. This can also work for written or visual knowledge if it&#8217;s stored in a database as well. Direct knowledge access is also a good approach where you have confidential or sensitive data, as you can restrict access based on the user of the generative AI tool/model, so it is useful for financial, employee and customer data. The results aren&#8217;t always accurate, as generative AI models aren&#8217;t 100% reliable in turning natural language queries into the correct database queries. GPT-4 is much better at this than GPT-3.5 was, but there is still room for improvement. 
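</p><p><em>To make the direct-access idea concrete, here is a minimal sketch in Python. The table, data and read-only guard below are hypothetical, with a hand-written query standing in for the model-generated SQL:</em></p>

```python
import sqlite3

def run_readonly_query(conn, sql):
    """Execute model-generated SQL, allowing read-only SELECT statements only."""
    # Guard against the model (or a prompt injection) emitting destructive SQL.
    if not sql.lstrip().lower().startswith("select"):
        raise ValueError("Only SELECT statements are allowed")
    return conn.execute(sql).fetchall()

# Hypothetical business data loaded into a simple database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, revenue REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("north", 120.0), ("south", 80.0)])

# In practice this SQL would be produced by the generative AI model from a
# natural language question such as "What is total revenue by region?".
generated_sql = "SELECT region, SUM(revenue) FROM sales GROUP BY region ORDER BY region"
print(run_readonly_query(conn, generated_sql))
```

<p>However the SQL is actually generated, executing it through a read-only guard like this limits the damage a wrong or malicious query can do. </p><p>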
One way this approach could be improved is to also fine-tune the generative AI model on the documentation and details of the databases you want it to access and provide example SQL queries to use for different use cases. This might be overkill though, with the next generation of generative AI models likely to address any lingering reliability issues in this area.</p><p><strong>Prompt engineering</strong> - This is by far the simplest and most cost-effective approach to give a generative AI model access to your business knowledge, if it is text-based and can be accommodated in the prompt length restrictions that are currently in place with some models. These restrictions are likely to (effectively) disappear in the medium term, however, as allowed prompt lengths expand to 1m tokens (c. 750k words, or the first five books in the Harry Potter series added together!). In fact, <a href="http://claude.ai/">Claude 2</a> from Anthropic already allows prompts of up to 100k tokens. The challenge with prompt engineering at the moment is that it has to be text-based. If using publicly available generative AI models, longer prompts also push up costs (as they&#8217;re based on the size of the prompt submitted), so this isn&#8217;t always the best approach, depending on the use case.</p><h2>Theory, meet Practice&#8230;</h2><p>Let&#8217;s start putting all the theory covered above into practice by looking back at the different knowledge types covered in <a href="https://www.the-blueprint.ai/p/part-ii-knowledge-management">part two</a> of The Blueprint series. 
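</p><p><em>Before moving on, here is a rough illustration of the prompt engineering approach described above. The snippets, source names and template are invented for the example:</em></p>

```python
# Sketch of "prompt stuffing": retrieved business knowledge is pasted straight
# into the prompt with source labels so answers can be traced back.
def build_prompt(question, snippets):
    context = "\n".join(f"[{s['source']}] {s['text']}" for s in snippets)
    return (
        "Answer using only the context below, citing sources in brackets.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )

# Invented example snippets; in practice these would come from your knowledge library.
snippets = [
    {"source": "handbook.pdf", "text": "Annual leave is 25 days."},
    {"source": "policy.docx", "text": "Leave requests need manager approval."},
]
print(build_prompt("How much annual leave do employees get?", snippets))
```

<p>The whole of the business knowledge passed this way counts against the prompt length limit, which is why this approach suits smaller, text-based knowledge sets. </p><p>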
To simplify this, let&#8217;s group some of these knowledge types together:</p><ul><li><p><strong>Emails &amp; Chat channels</strong> - highly fluid and may contain sensitive/confidential data.</p></li><li><p><strong>Written documents &amp; Presentations</strong> - medium to low fluidity depending on the content and could also contain sensitive/confidential data.</p></li><li><p><strong>Spreadsheets &amp; Databases</strong> - highly likely to contain sensitive/confidential data and may have either highly fluid or low fluidity data.</p></li></ul><h4>Emails &amp; Chat Channels</h4><p>The idea of including emails and chat conversations in a business&#8217;s generative AI model is bound to be a controversial one. It will take quite a bit of cultural change before employees, customers and partners/suppliers feel comfortable with the idea. However, there is a huge amount of valuable knowledge contained in emails and chat channels, so I strongly believe we need to find a way to include this in a business&#8217;s knowledge library.</p><p>The first thing that will need to be in place is a clear usage policy for all employees, customers and suppliers. The policy should state that no personal information will be accessed by the generative AI model and that all sensitive information will be excluded. The policy should also emphasise that business communication platforms should not be used for personal communications. This is a relatively common policy, but it is very rarely adhered to or enforced, as the lines between our work and personal lives have blurred over time.</p><p>With a clear and sensible policy in place (so all stakeholders understand how generative AI will be used in this use case), we can then leverage generative AI&#8217;s strengths to summarise email and chat communications, removing any sensitive information. 
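</p><p><em>A minimal sketch of that redact-then-summarise step, assuming regex-based redaction and with the summarisation call stubbed out (a real implementation would send the redacted thread to a generative AI model):</em></p>

```python
import re

# Illustrative patterns only; real redaction would need a more robust approach.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE = re.compile(r"\+?\d[\d \-]{8,}\d")

def redact(text):
    """Replace obvious personal identifiers before the text leaves the business."""
    return PHONE.sub("[PHONE]", EMAIL.sub("[EMAIL]", text))

def summarise(thread):
    # Placeholder: in practice, send the redacted thread to a generative AI
    # model with a "summarise this conversation" instruction.
    return redact(thread)

message = "Contact jane.doe@example.com or +44 7700 900123 about the Q3 renewal."
print(summarise(message))
```

<p>The key design point is that redaction happens before the model ever sees the text, so sensitive details never reach the summaries or the embeddings built from them. </p><p>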
As emails/chats are very fluid, I&#8217;d recommend that this is run as a nightly process with the summarisations then turned into embeddings that the generative AI model can access.</p><p>There is also important metadata that should be captured and included with the summarisations, such as subject and date of communication. This will allow different email and chat threads to be linked together by the generative AI model. It would also be useful to capture the sender and recipient, but this might be a step too far. To ensure no personal data is included, the names/email addresses could be replaced with employee/supplier/customer IDs that only certain teams/individuals have access to.</p><h4>Written Documents &amp; Presentations</h4><p>Compared to emails and chats, this one should be a (relative) breeze! There are two important areas to address: confidentiality and version control. There will be documents in a business that are confidential and shouldn&#8217;t be included in a generative AI model, and there will be documents that would be incredibly useful to include. A simple way to manage this will be to have a specific location and/or naming convention for any document that shouldn&#8217;t be included. Best practice would be for any confidential document to be marked as such anyway, so we can then design the generative AI training around that.</p><p>Once the confidentiality challenge is taken care of, the important thing for any documents included in a generative AI model is that you&#8217;re able to source and reference the information if the model is using it. This will allow users to fact-check and also then explore the wider context of the answer they have been given by the model. The way to achieve this will again be with embeddings and metadata that should be captured and included when the embeddings are created. 
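</p><p><em>One way the embeddings-plus-metadata idea could look, sketched with a toy bag-of-words vector standing in for a real embedding model (the store layout and field names are illustrative assumptions):</em></p>

```python
import math
import re
from collections import Counter

# Toy stand-in for a real embedding model: a bag-of-words vector.
def embed(text):
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

store = {}  # doc_id -> {"vector": ..., "metadata": ...}

def upsert(doc_id, text, source, version):
    # Overwriting by doc_id keeps only the latest version of each document.
    store[doc_id] = {"vector": embed(text),
                     "metadata": {"source": source, "version": version}}

def search(query):
    q = embed(query)
    best = max(store.values(), key=lambda r: cosine(q, r["vector"]))
    return best["metadata"]

upsert("doc-1", "expense policy for travel and hotels", "policies/expenses.docx", 1)
upsert("doc-1", "updated expense policy for travel, hotels and meals", "policies/expenses.docx", 2)
upsert("doc-2", "onboarding checklist for new starters", "hr/onboarding.docx", 1)
print(search("what is the travel expense policy?"))
```

<p>Because each answer carries its source and version metadata, users can trace the model&#8217;s answer back to a specific document. </p><p>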
And this is where you can handle version control as well. When a new version of a document is created, you can choose either to overwrite the existing embedding or to create a new embedding with the new version number in the metadata. This will allow you to keep track of which version the generative AI model is accessing. My head says that overwriting the embedding would be the best approach, as otherwise there&#8217;s a chance the model accesses the wrong version of the embedding, although you could build in logic to only ever reference the highest version number.</p><p>Similar to email and chat channels, I&#8217;d recommend running an overnight embedding process so that your generative AI model has access to the latest information the next day.</p><h4>Spreadsheets &amp; Databases</h4><p>Unlike most artificial intelligence technology, generative AI doesn&#8217;t have a great time with numerical data, as it&#8217;s built mostly around text, images, video etc. What it can do well, though, is code, and specifically it can write pretty good SQL queries. So for any numerical knowledge in a business, it&#8217;s best to load it into a simple database that a generative AI model can then access by turning natural language queries into SQL queries. The added benefit of having all your numerical knowledge in a database is that you can then more effectively manage the security and access control around the knowledge by linking permissions back to the individual user of the generative AI model.</p><blockquote><p>As a quick aside, I&#8217;m also really interested in how generative AI models are getting good at working with APIs in general. There&#8217;s a huge opportunity to open APIs up to natural language queries and make the data/services they provide more accessible to the general population!</p></blockquote><p>So with spreadsheets and databases, I don&#8217;t think it makes sense to look at fine-tuning or embeddings. 
I&#8217;d go for direct access, with the generative AI model retrieving what it needs from the database using SQL.</p><h2>Summary</h2><p>In this article, I&#8217;ve covered the training and access decisions that a business would typically need to make as part of a generative AI strategy. This is the next important step after determining what knowledge you&#8217;d want a generative AI model to be trained on or have access to.</p><p>In the next article in The Blueprint series, I&#8217;ll be looking at the ideal business technology stack for creating and managing all the different knowledge I&#8217;ve discussed above, and how that can then connect into a generative AI technology stack.</p><div><hr></div><blockquote><p><em>"The future is already here, it's just not evenly distributed."</em></p><p><strong>William Gibson</strong></p></blockquote><div><hr></div><p><em>This article was researched and written with help from ChatGPT, but was lovingly reviewed, edited and fine-tuned by a human.</em></p>]]></content:encoded></item><item><title><![CDATA[The Blueprint - Part II: Knowledge Management]]></title><description><![CDATA[Taking an agile approach to implementing generative AI]]></description><link>https://www.the-blueprint.ai/p/part-ii-knowledge-management</link><guid isPermaLink="false">https://www.the-blueprint.ai/p/part-ii-knowledge-management</guid><dc:creator><![CDATA[Sean 🤓]]></dc:creator><pubDate>Wed, 09 Aug 2023 08:01:04 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!JYBJ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f66d22b-f17b-47b1-9d2e-b831b897fcaa_1456x816.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" 
href="https://substackcdn.com/image/fetch/$s_!JYBJ!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f66d22b-f17b-47b1-9d2e-b831b897fcaa_1456x816.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!JYBJ!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f66d22b-f17b-47b1-9d2e-b831b897fcaa_1456x816.png 424w, https://substackcdn.com/image/fetch/$s_!JYBJ!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f66d22b-f17b-47b1-9d2e-b831b897fcaa_1456x816.png 848w, https://substackcdn.com/image/fetch/$s_!JYBJ!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f66d22b-f17b-47b1-9d2e-b831b897fcaa_1456x816.png 1272w, https://substackcdn.com/image/fetch/$s_!JYBJ!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f66d22b-f17b-47b1-9d2e-b831b897fcaa_1456x816.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!JYBJ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f66d22b-f17b-47b1-9d2e-b831b897fcaa_1456x816.png" width="1456" height="816" 
data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/7f66d22b-f17b-47b1-9d2e-b831b897fcaa_1456x816.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:816,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1195772,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!JYBJ!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f66d22b-f17b-47b1-9d2e-b831b897fcaa_1456x816.png 424w, https://substackcdn.com/image/fetch/$s_!JYBJ!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f66d22b-f17b-47b1-9d2e-b831b897fcaa_1456x816.png 848w, https://substackcdn.com/image/fetch/$s_!JYBJ!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f66d22b-f17b-47b1-9d2e-b831b897fcaa_1456x816.png 1272w, https://substackcdn.com/image/fetch/$s_!JYBJ!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F7f66d22b-f17b-47b1-9d2e-b831b897fcaa_1456x816.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" 
xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">fluid binary data flow, clean pixel art, neon colors --ar 16:9</figcaption></figure></div><p>Hello and welcome to part two of The Blueprint series, where I&#8217;ll be exploring where to start when building a strategy for implementing generative AI in business.</p><p>In my previous article I covered some important topics to help business leaders get started with their strategy for generative AI. If you haven't read that yet, I encourage you to take a look <strong><a href="https://www.the-blueprint.ai/p/part-i-leading-with-knowledge-and">here</a></strong>.</p><p>In this second part, we'll shift our focus to how businesses can harness the potential of their knowledge, which will be foundational for any successful generative AI strategy. We&#8217;re going to delve into the challenges of making knowledge accessible to generative AI, managing fast-moving data, maintaining confidentiality and security, and organising and labelling knowledge for optimal usage.</p><h2>Business Knowledge</h2><p>A business&#8217; knowledge is typically spread across many different formats and locations. For example:</p><ul><li><p>Emails (Outlook, Gmail etc.)</p></li><li><p>Chat channels (Slack, Teams etc.)</p></li><li><p>Spreadsheets (Excel, Google Sheets etc.)</p></li><li><p>Written documents (Word, PDF etc.)</p></li><li><p>Presentations (PowerPoint, Google Slides etc.)</p></li><li><p>Databases (Customer/supplier data, Financial data, Employee data etc.)</p></li></ul><p>There are many different challenges to work through in making this knowledge accessible to a generative AI model, but the first obvious question is why we would want to make all this knowledge accessible to a generative AI model in the first place. For me there are five big reasons:</p><ol><li><p><strong>Knowledge Discovery</strong>: A business generates and accumulates vast amounts of data over time, which is generally inaccessible. 
A generative AI model can address this, collecting and sifting through the data to identify trends, insights, and valuable information to enhance decision-making.</p></li><li><p><strong>Efficiency</strong>: Generative AI can automate routine tasks such as sorting through emails, handling customer enquiries, managing databases, or even creating drafts for presentations or documents. This would free up time for employees to engage in higher-value tasks.</p></li><li><p><strong>Customer Interaction</strong>: Generative AI can help improve customer and supplier interactions by providing timely responses and personalised experiences based on the knowledge a business has on them. It could do this 24/7, providing real-time help and support.</p></li><li><p><strong>Data Management</strong>: Generative AI can categorise and store information in a structured manner, making it easy to retrieve when needed.</p></li><li><p><strong>Collaboration</strong>: Generative AI can improve collaboration by helping to manage and coordinate communication across various platforms. For instance, it could summarise key points from a Slack conversation, an email thread, or a shared document.</p></li></ol><div class="pullquote"><p>Generative AI&#8230; could summarise key points from a Slack conversation, an email thread, or a shared document.</p></div><h2>Knowledge Challenges</h2><p>So, what are the challenges in making the business&#8217; knowledge available to a generative AI model?</p><h4>Data Confidentiality &amp; Security</h4><p>Data confidentiality is a big concern. Businesses have legal and ethical obligations to protect sensitive information, and giving a generative AI model access to business knowledge can introduce new risks. For example:</p><ul><li><p><strong>Financial &amp; supplier data</strong> - this is usually restricted within an organisation, with only specific teams or individuals having access. 
We need to ensure that these access restrictions remain in place and that financial data is safeguarded in a generative AI model.</p></li><li><p><strong>Customer data</strong> - this is often subject to strict privacy laws, such as the GDPR in Europe, so must be handled with care and in compliance with local regulations. There are also some unknowns here - do we need additional consents to share a customer&#8217;s data with a generative AI model? How do we remove a customer&#8217;s data from a generative AI model if we receive a &#8216;right to be forgotten&#8217; request?</p></li><li><p><strong>Employee data</strong> - similar to customer data, giving a generative AI model access to employee data should be done in a manner that respects privacy rights and adheres to local employment laws. Employees should be clearly informed about what data is being used, why, and how it's being protected. It's also essential to consider the ethics of using this data, particularly if it's used in ways that could impact employee evaluations or job security.</p></li><li><p><strong>Email/chat data</strong> - this data can contain sensitive business information, personal data, or confidential correspondence. A generative AI model accessing this data should have robust privacy safeguards. For example, the model can be designed to use natural language processing to redact sensitive information automatically. A clear usage policy should also be communicated to all employees, customers and suppliers so they understand the nature and extent of the AI's access to their communications.</p></li></ul><h4>Data Fluidity</h4><p>Put simply, some data is more fluid than others. What I mean by data fluidity is the frequency and speed at which it is changed or updated. This has important implications for how we train generative AI on it or give generative AI access to it.</p><p>Data that is static is much simpler to manage and give generative AI access to. 
We have options - we could use it to pre-train a model, fine-tune a model, create embeddings, give a generative AI model direct access to it, or use it as part of prompt engineering.</p><blockquote><p>I&#8217;ll be covering all of these options in more depth in the next part of The Blueprint series, as well as recommending the best approach for different types of data.</p></blockquote><p>However, data that is highly fluid is much more complicated, and we have fewer options when it comes to generative AI. With the current technology it is too expensive, time-consuming and bad for the environment to be constantly re-training or fine-tuning a generative AI model with highly fluid data. This might change in the future as hardware costs come down and smaller models become more adept, but I suspect that we will be living with this constraint on highly fluid data for some time.</p><h4>Knowledge Management</h4><p>I believe knowledge management in businesses will become an important, distinct, and highly skilled discipline in the near future. I like the idea of every business having a Knowledge Library that is maintained and curated by Knowledge Librarians, whose job it will be to ensure that all the knowledge in a business is readily available for generative AI models. This will undoubtedly be something that becomes more automated and streamlined over time, but until then Knowledge Librarians will need to catalogue and integrate knowledge across the business to use with generative AI models.</p><blockquote><p>This skillset is something I&#8217;ll cover in more depth in a future instalment of The Blueprint.</p></blockquote><p>Below are the important areas that I believe Knowledge Librarians will be responsible for and will need to address when a business starts to build out its knowledge library:</p><ol><li><p><strong>Knowledge Integration</strong>: Knowledge in a business is typically spread across many different systems and formats. 
Integrating this data into a central repository that can be processed by a generative AI model can be a significant challenge, but having a modern business technology stack can greatly help with this. I&#8217;ll have a whole article dedicated to this topic in a future instalment of The Blueprint.</p></li><li><p><strong>Knowledge Transformation &amp; Harmonisation</strong>: As part of integrating knowledge into a central repository, it will need to be transformed into unified formats that are easily accessed by generative AI models.</p></li><li><p><strong>Knowledge Quality</strong>: The effectiveness of a generative AI model depends greatly on the quality of the data it's given. Inconsistent, incomplete, or erroneous data can compromise the model's performance and lead to inaccurate outputs. Therefore, it's essential to have robust data cleaning and validation processes.</p></li><li><p><strong>Knowledge Labelling</strong>: It is useful for data to have metadata appended to describe the content, location, quality, and other characteristics of each data set. This will also help if the data is being used by a supervised learning model.</p></li><li><p><strong>Knowledge Governance</strong>: Implementing a robust governance strategy will ensure that knowledge is consistently classified, stored, and managed across the business. Clear policies and procedures for knowledge handling will need to be established.</p></li><li><p><strong>Maintenance and Upkeep</strong>: Generative AI models are not set-and-forget tools. They require ongoing maintenance, updating, and validation to ensure they continue to perform effectively and don't 'drift' from their intended purpose.</p></li></ol><h2>Summary</h2><p>In this article we&#8217;ve covered the importance of knowledge management, which forms an essential part of any generative AI strategy. 
Knowledge is the foundation that any generative AI system will be built on, so getting it right is vital.</p><p>In the next article in The Blueprint series, I&#8217;ll be looking at the different training methods for generative AI, other options for giving generative AI access to your business&#8217; knowledge, and how we can tackle the knowledge challenges around data confidentiality, security, and fluidity.</p><div><hr></div><blockquote><p><em>"The future is already here, it's just not evenly distributed."</em></p><p><strong>William Gibson</strong></p></blockquote><div><hr></div><p><em>This article was researched and written with help from ChatGPT, but was lovingly reviewed, edited and fine-tuned by a human.</em></p>]]></content:encoded></item></channel></rss>