<?xml version="1.0" encoding="UTF-8"?><?xml-stylesheet href="/rss-styles.xsl" type="text/xsl"?><rss version="2.0" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>David Hoang — Notes</title><description>Digital garden notes on design, AI, and building products.</description><link>https://www.davidhoang.com/</link><language>en-us</language><lastBuildDate>Sat, 04 Apr 2026 19:31:02 GMT</lastBuildDate><atom:link href="https://www.davidhoang.com/rss/notes.xml" rel="self" type="application/rss+xml"/><item><title>Dynamic Interfaces</title><link>https://www.davidhoang.com/notes/dynamic-interfaces/</link><guid isPermaLink="true">https://www.davidhoang.com/notes/dynamic-interfaces/</guid><description>Interfaces that change with context, model output, and intent—not fixed screens</description><pubDate>Sat, 04 Apr 2026 00:00:00 GMT</pubDate><content:encoded>&lt;h2&gt;Introduction: why interfaces are the unsettled layer&lt;/h2&gt;
&lt;p&gt;The software interface layer is the least understood—and most unsettled—part of AI-era computing.&lt;/p&gt;
&lt;p&gt;Every major computing shift goes through an awkward adolescence. Early versions look primitive in hindsight, not because the ideas were wrong, but because the interaction grammar had not yet been invented. The first release of iPhone OS famously lacked copy and paste. Touch computing existed, but its affordances were incomplete. A few years later, interactions like pull-to-refresh became so natural that we forgot they were ever invented.&lt;/p&gt;
&lt;h2&gt;Dynamic Interfaces (structured outline)&lt;/h2&gt;
&lt;h3&gt;Introduction / thesis: what dynamic interfaces are&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;The software interface layer is the part of AI-era computing that remains least understood and most unsettled.&lt;/li&gt;
&lt;li&gt;Early versions of new interaction paradigms always look primitive in hindsight (e.g., iPhone OS lacking copy/paste, pull-to-refresh emerging from Tweetie).&lt;/li&gt;
&lt;li&gt;Current AI interfaces—especially chat—are transitional forms, not the final pattern.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Dynamic interface&lt;/strong&gt; describes systems that redesign themselves based on feedback, intent, and context.&lt;/li&gt;
&lt;li&gt;Unlike responsive design or personalization based on cohorts, dynamic interfaces tailor the interface itself to individual users.&lt;/li&gt;
&lt;li&gt;As computation accelerates and learning cycles shorten, the interface can evolve continuously, not in slow human-run experiments.&lt;/li&gt;
&lt;li&gt;The future interface responds to behavior, preference, environment, and task—not just display dimensions.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;A new MVC&lt;/h3&gt;
&lt;p&gt;&lt;em&gt;Summarized by Dia.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Context:&lt;/strong&gt; Traditional MVC separated data, UI, and logic; AI and decentralized protocols (like MCP) are shifting this toward dynamic, agent-orchestrated, distributed systems.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;End users control and store their own data.&lt;/li&gt;
&lt;li&gt;Data is also inferred and summarized by LLMs, not only stored directly.&lt;/li&gt;
&lt;li&gt;Context windows act as working memory, carrying that data between the model and the interface.&lt;/li&gt;
&lt;li&gt;Apps become interchangeable views over personal data stores.&lt;/li&gt;
&lt;li&gt;Interfaces are declarative queries across distributed data, not bespoke APIs.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;strong&gt;New MVC&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;LLMs as models:&lt;/strong&gt; probabilistic, continuously trained systems—“grown, not built.”&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Apps as views:&lt;/strong&gt; UI layers that render, query, and pass data; portability over storage; interoperability by design.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Agents as controllers:&lt;/strong&gt; interpret, plan, and coordinate actions across systems via protocols like MCP; real-time context switching.&lt;/li&gt;
&lt;/ul&gt;
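&lt;p&gt;A minimal sketch of how these three roles could fit together in code. All names here are hypothetical, not a real framework:&lt;/p&gt;

```python
class Model:
    """Probabilistic model: answers queries over many data sources."""
    def infer(self, prompt, context):
        # Stand-in for an LLM call; deterministic here for illustration.
        return {"summary": f"{prompt} (given {len(context)} context items)"}

class View:
    """App-as-view: renders state it does not own."""
    def render(self, state):
        return f"[card] {state['summary']}"

class AgentController:
    """Agent: interprets intent, queries the model, updates the view."""
    def __init__(self, model, view):
        self.model = model
        self.view = view

    def handle(self, intent, context):
        state = self.model.infer(intent, context)
        return self.view.render(state)

ui = AgentController(Model(), View())
print(ui.handle("summarize my notes", ["note-a", "note-b"]))
```

&lt;p&gt;The point of the sketch: the controller, not the user, decides when to query the model, and the view holds no data of its own.&lt;/p&gt;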
&lt;p&gt;&lt;strong&gt;Implications&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Shift from app-centric to system-centric product experiences.&lt;/li&gt;
&lt;li&gt;Managing endpoints, protocols, and interactions replaces monolithic codebases.&lt;/li&gt;
&lt;li&gt;Vendor lock-in becomes a liability; AI interfaces act as meta-layers across apps.&lt;/li&gt;
&lt;li&gt;Software is decoupling, enabling portability, ownership, and agent-driven orchestration.&lt;/li&gt;
&lt;/ul&gt;
&lt;h4&gt;Model&lt;/h4&gt;
&lt;p&gt;Models expand beyond a single data source into LLMs, APIs, external datasets, and federated data. Users may choose models, build their own, or bring their own dataset (BYOD). Models become negotiable components, not fixed infrastructure.&lt;/p&gt;
&lt;h4&gt;View&lt;/h4&gt;
&lt;p&gt;The “view” becomes less about fixed screens and more about interface surfaces spanning devices and contexts, influenced by Ruben Verborgh’s “apps as views” concept. Universal Control, Stage Manager, and AR environments point toward interfaces that float across display boundaries. Views adapt not only in layout but in function, depending on the task and context.&lt;/p&gt;
&lt;h4&gt;Controller&lt;/h4&gt;
&lt;p&gt;The controller evolves most. Interaction is no longer exclusively initiated by the user; agents and AI controllers act on behalf of the user. User input remains the ground truth, but AI can propose or execute adjustments. The controller mediates between human intent, model capability, and interface behavior.&lt;/p&gt;
&lt;h4&gt;Non-deterministic workflows&lt;/h4&gt;
&lt;p&gt;Many AI-era flows won’t behave like fixed scripts: paths branch, retry, or adapt based on model output, tools, and context.&lt;/p&gt;
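&lt;p&gt;A hypothetical sketch of such a flow (names and probabilities are illustrative): the next step is chosen from model output, with retries, rather than following a fixed script:&lt;/p&gt;

```python
import random

def model_step(step):
    # Stand-in for a probabilistic model/tool call.
    return random.choice(["ok", "needs_revision", "error"])

def run_flow(start="draft", max_retries=3, max_steps=50):
    trace = []
    step = start
    retries = 0
    for _ in range(max_steps):
        outcome = model_step(step)
        trace.append((step, outcome))
        if outcome == "error":
            retries += 1
            if retries > max_retries:
                break            # give up after repeated failures
            continue             # retry the same step
        if outcome == "needs_revision":
            step = "revise"      # branch chosen from output, not from a script
            retries = 0
            continue
        break                    # "ok": the flow completes
    return trace

print(run_flow())
```

&lt;p&gt;Each run can take a different path, which is exactly what fixed-script UI flows were never designed to express.&lt;/p&gt;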
&lt;h3&gt;Context and ambient experience&lt;/h3&gt;
&lt;p&gt;Dynamic interfaces interpret context beyond device state (e.g., Focus modes, presence near a device). Software anticipates needs through richer contextual signals: environment, behavior patterns, ergonomics, accessibility preferences.&lt;/p&gt;
&lt;p&gt;Multi-device experiences proliferate: phones, tablets, computers, wearables, XR displays, and invisible or ambient devices. Future environments have too many surfaces to design manually for each form factor.&lt;/p&gt;
&lt;p&gt;Interfaces become &lt;strong&gt;ambient:&lt;/strong&gt; flowing across devices, resizing themselves, reprioritizing content, and responding to user needs without being explicitly summoned.&lt;/p&gt;
&lt;h3&gt;End user experience&lt;/h3&gt;
&lt;h4&gt;Human-centered by design&lt;/h4&gt;
&lt;p&gt;Memory makes real personalization possible. Even when content was personalized to the end user, interfaces were built so rigidly that users never got the full value of it. Dynamic interfaces amplify human-centered design rather than threaten it. Users retain free will, not just “control.” Interfaces should align with natural cognition.&lt;/p&gt;
&lt;h4&gt;Cognitive efficiency&lt;/h4&gt;
&lt;p&gt;Interfaces should minimize cognitive load, in the spirit of Jef Raskin’s framework in &lt;em&gt;The Humane Interface&lt;/em&gt;.&lt;/p&gt;
&lt;h4&gt;Malleable software&lt;/h4&gt;
&lt;p&gt;Chat interfaces often hide complexity rather than reduce it. Dynamic interfaces can surface appropriate UI controls to replace or complement prompts.&lt;/p&gt;
&lt;h4&gt;UI as an abstraction layer&lt;/h4&gt;
&lt;p&gt;UI controls handle complexity for the user—e.g., style dials or sliders that modify multiple underlying properties. Products like Flair AI show how generative UI elements can help users steer outcomes without writing extensive prompt text.&lt;/p&gt;
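&lt;p&gt;A hypothetical “style dial” makes this concrete: one user-facing control that the interface expands into several underlying generation parameters, so the user steers outcomes without writing prompt text. The parameter names and mappings below are illustrative:&lt;/p&gt;

```python
def style_dial(value):
    """Map one dial value (0.0 to 1.0) onto several underlying properties."""
    value = max(0.0, min(1.0, value))       # clamp the single dial value
    return {
        "temperature": 0.2 + 0.8 * value,   # more dial, more variation
        "color_saturation": 0.5 + 0.5 * value,
        "detail_level": round(1 + 4 * value),  # 1 (minimal) to 5 (ornate)
    }

print(style_dial(0.5))
```

&lt;p&gt;The abstraction works in one direction only: the user reasons about a single intent (“more expressive”), and the system handles the fan-out.&lt;/p&gt;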
&lt;h4&gt;Personalized accessibility&lt;/h4&gt;
&lt;p&gt;Humans are historically poor at prioritizing accessibility; machines can adapt interfaces in real time. Dynamic interfaces allow per-user adjustments for color, contrast, ergonomics, reading needs, and more, based on feedback inside the app.&lt;/p&gt;
&lt;h4&gt;Feedback loops&lt;/h4&gt;
&lt;p&gt;Feedback is no longer limited to formal usability sessions. Users provide input through natural language, gestures, screenshots, or in-context signals. The system iterates immediately, adjusting layout, hierarchy, or flow.&lt;/p&gt;
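&lt;p&gt;A hypothetical in-context loop: a lightweight natural-language signal from the user is turned directly into an interface adjustment. The signal-to-change mapping below is illustrative:&lt;/p&gt;

```python
# Illustrative mapping from in-context feedback signals to layout changes.
ADJUSTMENTS = {
    "too dense": {"spacing": "increase", "font_size": "+1"},
    "hard to read": {"contrast": "increase", "font_size": "+2"},
}

def apply_feedback(signal, layout):
    change = ADJUSTMENTS.get(signal.lower().strip(), {})
    updated = dict(layout)
    updated.update(change)   # iterate immediately, no release cycle
    return updated

print(apply_feedback("Too dense", {"spacing": "normal"}))
```

&lt;p&gt;The loop that once took a usability session and a release cycle collapses into a single interaction.&lt;/p&gt;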
&lt;h4&gt;Configurable dynamism&lt;/h4&gt;
&lt;p&gt;Users choose how dynamic they want their interface to be—from fully static to highly adaptive. Choices include:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Which models or algorithms to use&lt;/li&gt;
&lt;li&gt;How views are customized&lt;/li&gt;
&lt;li&gt;What logic the system can automate&lt;/li&gt;
&lt;/ul&gt;
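&lt;p&gt;The choices above could be captured in a per-user settings object. This is a sketch with hypothetical field names, not a real API:&lt;/p&gt;

```python
from dataclasses import dataclass, field

@dataclass
class DynamismSettings:
    adaptivity: str = "static"       # "static", "suggested", or "adaptive"
    model: str = "local-default"     # which model or algorithm to use
    customized_views: dict = field(default_factory=dict)
    allowed_automations: list = field(default_factory=list)

    def may_automate(self, action):
        # A fully static interface automates nothing, whatever is listed.
        return self.adaptivity != "static" and action in self.allowed_automations

prefs = DynamismSettings(adaptivity="adaptive",
                         allowed_automations=["resize", "reorder"])
print(prefs.may_automate("reorder"))  # True
```

&lt;p&gt;The default is deliberately static: dynamism is opted into, never imposed.&lt;/p&gt;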
&lt;p&gt;Dynamic interfaces support &lt;strong&gt;personal software&lt;/strong&gt;—software that builds itself around the user.&lt;/p&gt;
&lt;h2&gt;How we build dynamic interfaces&lt;/h2&gt;
&lt;h3&gt;Design tools evolve into hybrid design–development environments&lt;/h3&gt;
&lt;p&gt;A future &lt;strong&gt;IDDE&lt;/strong&gt; (integrated design and development environment): drawing or manipulating UI elements impacts functionality directly (similar to visual development tools). Tools like Figma will likely blend vector drawing, component authoring, logic definition, debugging, and automation.&lt;/p&gt;
&lt;h3&gt;Abstraction as a spectrum&lt;/h3&gt;
&lt;p&gt;Both high-level abstractions (no-code) and low-level control (code) are needed. Tools must help users move along the abstraction spectrum depending on skill and intent. Sophisticated concepts should not be introduced before users are ready; too much abstraction limits capability.&lt;/p&gt;
&lt;h3&gt;Extensibility and interoperability&lt;/h3&gt;
&lt;p&gt;AI models and agents will pressure walled gardens to open up. Interoperability becomes mandatory. Authoring environments must be extensible to accommodate numerous data sources, models, and agent workflows.&lt;/p&gt;
&lt;h3&gt;Defining what is “designable”&lt;/h3&gt;
&lt;p&gt;Developers and designers specify which parts of a UI can be modified dynamically and which remain fixed—analogous to &lt;code&gt;IBDesignable&lt;/code&gt; in Xcode: declaring what the system may alter. Users could define constraints (e.g., what they want more or less of) that guide the system’s reconfiguration.&lt;/p&gt;
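&lt;p&gt;A sketch of what such a declaration might look like, loosely analogous to marking properties designable. The surface names and API are hypothetical:&lt;/p&gt;

```python
# Designer-declared boundaries for what the system may alter.
DESIGNABLE = {
    "card.layout": True,     # system may rearrange cards
    "card.color": True,      # and restyle them
    "checkout.flow": False,  # but never reshape the payment flow
}

def apply_change(surface, change):
    if not DESIGNABLE.get(surface, False):
        raise PermissionError(f"{surface} is fixed by the designer")
    return {"surface": surface, "applied": change}

print(apply_change("card.layout", "two-column"))
```

&lt;p&gt;Unknown surfaces default to fixed, so the system can only adapt what a human explicitly opened up.&lt;/p&gt;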
&lt;h3&gt;Developer-defined abstractions&lt;/h3&gt;
&lt;p&gt;Not all users need to understand raw code or agent orchestration. Libraries of logic, integrations, and templates will exist alongside component libraries. Collaboration requires multiple layers of comprehension and abstraction.&lt;/p&gt;
&lt;h3&gt;Structured and unstructured prompt criteria&lt;/h3&gt;
&lt;p&gt;Chat and direct manipulation can coexist. Adobe’s generative features point toward hybrid workflows where prompts trigger structured options. Prompts invoke intent; UI refines and executes it.&lt;/p&gt;
&lt;h2&gt;Conclusion&lt;/h2&gt;
&lt;p&gt;Dynamic interfaces are not a futuristic invention but a continuation of decades of software evolution. The opportunity is to connect past concepts with new capabilities and steer them toward human needs.&lt;/p&gt;
&lt;p&gt;As interfaces become dynamic, software becomes more personal, adaptive, and humane—not more opaque. The goal is not AI designing everything for us, but AI helping us spend less time wrestling with software and more time on the work—and relationships—that matter.&lt;/p&gt;
&lt;h2&gt;Open questions&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Where does &lt;strong&gt;predictability&lt;/strong&gt; (learnable affordances) trade off against &lt;strong&gt;adaptation&lt;/strong&gt; (right control at the right time)?&lt;/li&gt;
&lt;li&gt;How do we avoid &lt;strong&gt;mode chaos&lt;/strong&gt; when the interface can reshape itself?&lt;/li&gt;
&lt;li&gt;What patterns count as dynamic but still &lt;strong&gt;feel designed&lt;/strong&gt;—not arbitrary?&lt;/li&gt;
&lt;/ul&gt;
</content:encoded><category>notes</category><category>interface-design</category><category>ai</category><category>systems</category></item><item><title>AI Interface Systems</title><link>https://www.davidhoang.com/notes/ai-interface-systems/</link><guid isPermaLink="true">https://www.davidhoang.com/notes/ai-interface-systems/</guid><description>Exploring how AI is reshaping interface design and interaction patterns</description><pubDate>Wed, 15 Jan 2025 00:00:00 GMT</pubDate><content:encoded>&lt;p&gt;Early thoughts on how AI is fundamentally changing interface design...&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Terminal / Command Line&lt;/strong&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Traditional OS:&lt;/strong&gt; CLI (bash, zsh)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;AI OS:&lt;/strong&gt; LLMs (text, voice)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;What’s changing:&lt;/strong&gt;
&lt;ul&gt;
&lt;li&gt;Language becomes the universal command surface&lt;/li&gt;
&lt;li&gt;Power shifts from syntax → intent interpretation&lt;/li&gt;
&lt;li&gt;Prompting evolves into command composition and reuse&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Process Manager&lt;/strong&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Traditional OS:&lt;/strong&gt; init, schedulers, background processes&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;AI OS:&lt;/strong&gt; Agents&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;What’s changing:&lt;/strong&gt;
&lt;ul&gt;
&lt;li&gt;Execution becomes continuous, not request/response&lt;/li&gt;
&lt;li&gt;Agents decide &lt;em&gt;when&lt;/em&gt; to act, not just &lt;em&gt;how&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;Orchestration over time becomes the core capability&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Kernel&lt;/strong&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Traditional OS:&lt;/strong&gt; Resource coordination, system rules&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;AI OS:&lt;/strong&gt; Plans (and logs)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;What’s changing:&lt;/strong&gt;
&lt;ul&gt;
&lt;li&gt;Plans are no longer documents; they are coordination primitives&lt;/li&gt;
&lt;li&gt;They constrain execution across humans + machines&lt;/li&gt;
&lt;li&gt;Planning shifts from prediction to alignment and constraint&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Filesystem&lt;/strong&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Traditional OS:&lt;/strong&gt; Files, directories, permissions&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;AI OS:&lt;/strong&gt; Information in files (not apps)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;What’s changing:&lt;/strong&gt;
&lt;ul&gt;
&lt;li&gt;“File over app” becomes foundational (Kepano’s philosophy)&lt;/li&gt;
&lt;li&gt;Apps stop being the source of truth; files do&lt;/li&gt;
&lt;li&gt;Memory moves from opaque storage → inspectable structure&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;File Formats&lt;/strong&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Traditional OS:&lt;/strong&gt; Fixed formats (.doc, .jpg, .mp3)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;AI OS:&lt;/strong&gt; Multimodal, metadata-rich files that mutate based on context while retaining their original form&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;What’s changing:&lt;/strong&gt;
&lt;ul&gt;
&lt;li&gt;A “file” becomes a bundle of structured data + intent&lt;/li&gt;
&lt;li&gt;Content can transform across text, audio, image, summary, plan&lt;/li&gt;
&lt;li&gt;Format is no longer presentation-specific, but capability-specific&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;RAM / Working Memory&lt;/strong&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Traditional OS:&lt;/strong&gt; Volatile memory&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;AI OS:&lt;/strong&gt; Context windows&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;What’s changing:&lt;/strong&gt;
&lt;ul&gt;
&lt;li&gt;Context is abundant but fragile&lt;/li&gt;
&lt;li&gt;Systems confuse recall with understanding&lt;/li&gt;
&lt;li&gt;Long-term value comes from promoting context → files → plans&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Device Drivers&lt;/strong&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Traditional OS:&lt;/strong&gt; Hardware drivers&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;AI OS:&lt;/strong&gt; Tools, APIs, system actions&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;What’s changing:&lt;/strong&gt;
&lt;ul&gt;
&lt;li&gt;Models don’t act directly; drivers translate intent into execution&lt;/li&gt;
&lt;li&gt;Tool reliability becomes system stability&lt;/li&gt;
&lt;li&gt;Permissions and scope become first-order concerns&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Window Manager / Views&lt;/strong&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Traditional OS:&lt;/strong&gt; Desktop environment&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;AI OS:&lt;/strong&gt; Apps as Views&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;What’s changing:&lt;/strong&gt;
&lt;ul&gt;
&lt;li&gt;MVC decouples: models ≠ views ≠ controllers&lt;/li&gt;
&lt;li&gt;Apps stop being “systems” and become projections of state&lt;/li&gt;
&lt;li&gt;The same underlying data can render across many surfaces&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Interface Layer&lt;/strong&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Traditional OS:&lt;/strong&gt; GUI, input devices&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;AI OS:&lt;/strong&gt; Dynamic, state-aware interfaces&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;What’s changing:&lt;/strong&gt;
&lt;ul&gt;
&lt;li&gt;Interfaces respond to what the system &lt;em&gt;knows&lt;/em&gt;, not just user input&lt;/li&gt;
&lt;li&gt;Chat is a bootstrap UI, not the end state&lt;/li&gt;
&lt;li&gt;IDEs and OSes evolve toward adaptive, context-revealing surfaces&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Interoperability Layer&lt;/strong&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Traditional OS:&lt;/strong&gt; IPC, system calls&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;AI OS:&lt;/strong&gt; Interoperability (not integrations)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;What’s changing:&lt;/strong&gt;
&lt;ul&gt;
&lt;li&gt;Systems coordinate through shared primitives, not brittle glue code&lt;/li&gt;
&lt;li&gt;IDEs and OSes become peers in a larger execution environment&lt;/li&gt;
&lt;li&gt;Data, plans, and actions flow across boundaries by default&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Security &amp;amp; Permissions&lt;/strong&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Traditional OS:&lt;/strong&gt; Users, roles, sandboxing&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;AI OS:&lt;/strong&gt; Identity, consent, memory access&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;What’s changing:&lt;/strong&gt;
&lt;ul&gt;
&lt;li&gt;Personalization requires explicit permission models&lt;/li&gt;
&lt;li&gt;Memory access becomes as sensitive as file access&lt;/li&gt;
&lt;li&gt;Without this, intelligence defaults to surveillance&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Logs &amp;amp; Observability&lt;/strong&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Traditional OS:&lt;/strong&gt; Logs, stack traces, system monitors&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;AI OS:&lt;/strong&gt; Decision traces, plan diffs, tool histories&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;What’s changing:&lt;/strong&gt;
&lt;ul&gt;
&lt;li&gt;Trust comes from inspectability, not confidence scores&lt;/li&gt;
&lt;li&gt;Systems must explain &lt;em&gt;what happened&lt;/em&gt;, not just answer&lt;/li&gt;
&lt;li&gt;Debugging intelligence becomes a core UX problem&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;
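&lt;p&gt;The last layer is easiest to picture with a concrete record. A hypothetical shape for a decision trace, with enough structure that a user can inspect what happened rather than just read the final answer (the schema is illustrative, not a standard):&lt;/p&gt;

```python
import json, time

def trace_event(actor, action, inputs, result):
    return {
        "ts": time.time(),
        "actor": actor,        # which agent or tool acted
        "action": action,      # what it did
        "inputs": inputs,      # what it saw
        "result": result,      # what came back
    }

log = [
    trace_event("planner", "propose_plan", {"goal": "book travel"}, "3 steps"),
    trace_event("tool:calendar", "read", {"range": "next week"}, "2 conflicts"),
]
print(json.dumps(log, indent=2))
```

&lt;p&gt;Because each entry is plain structured data, the trace can be diffed, filtered, and shown in a UI — the “plan diff” and “tool history” views above fall out of the format.&lt;/p&gt;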
</content:encoded><category>notes</category><category>ai</category><category>interface-design</category><category>systems</category></item><item><title>A New MVC is Emerging</title><link>https://www.davidhoang.com/notes/mvc-is-decoupling/</link><guid isPermaLink="true">https://www.davidhoang.com/notes/mvc-is-decoupling/</guid><description>How AI and protocols are reshaping the classic Model-View-Controller pattern</description><pubDate>Sat, 04 Jan 2025 00:00:00 GMT</pubDate><content:encoded>&lt;p&gt;Traditional MVC separated data, UI, and logic. AI and decentralized protocols are shifting this toward dynamic, agent-orchestrated, distributed systems.&lt;/p&gt;
&lt;h2&gt;The old MVC&lt;/h2&gt;
&lt;p&gt;The pattern we&apos;ve used for decades assumed:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Models were databases you controlled&lt;/li&gt;
&lt;li&gt;Views were screens you designed&lt;/li&gt;
&lt;li&gt;Controllers were logic you wrote&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Everything was monolithic. Data lived in your app. UI was bespoke. Logic was deterministic.&lt;/p&gt;
&lt;h2&gt;The new MVC&lt;/h2&gt;
&lt;p&gt;&lt;strong&gt;LLMs as Models&lt;/strong&gt; — Probabilistic, continuously trained systems. &quot;Grown, not built.&quot; Models expand beyond a single data source into LLMs, APIs, external datasets, and federated data. Users may choose models, build their own, or bring their own dataset.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Apps as Views&lt;/strong&gt; — UI layers that render, query, and pass data. Portability over storage. Interoperability by design. The &quot;view&quot; becomes less about fixed screens and more about interface surfaces spanning devices and contexts. Views adapt not only in layout but in function.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Agents as Controllers&lt;/strong&gt; — Interpret, plan, and coordinate actions across systems via protocols like MCP. Real-time context switching. Interaction is no longer exclusively initiated by the user. The controller mediates between human intent, model capability, and interface behavior.&lt;/p&gt;
&lt;h2&gt;What this means&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Shift from app-centric to system-centric product experiences&lt;/li&gt;
&lt;li&gt;Managing endpoints, protocols, and interactions replaces monolithic codebases&lt;/li&gt;
&lt;li&gt;Vendor lock-in becomes a liability&lt;/li&gt;
&lt;li&gt;AI interfaces act as meta-layers across apps&lt;/li&gt;
&lt;li&gt;Software is decoupling, enabling portability, ownership, and agent-driven orchestration&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;The controller evolves most dramatically. User input remains ground truth, but AI can propose or execute adjustments. Non-deterministic workflows become the norm.&lt;/p&gt;
&lt;p&gt;&lt;em&gt;This note is extracted from my longer essay on Dynamic Interfaces.&lt;/em&gt;&lt;/p&gt;
</content:encoded><category>notes</category><category>software</category><category>ai</category><category>architecture</category></item><item><title>AI as Creative Partner</title><link>https://www.davidhoang.com/notes/ai-creativity-tools/</link><guid isPermaLink="true">https://www.davidhoang.com/notes/ai-creativity-tools/</guid><description>Early thoughts on how AI changes the creative process</description><pubDate>Fri, 03 Jan 2025 00:00:00 GMT</pubDate><content:encoded>&lt;p&gt;Rough notes on something I&apos;ve been thinking about...&lt;/p&gt;
&lt;h2&gt;The shift from tool to collaborator&lt;/h2&gt;
&lt;p&gt;Traditional tools are passive. You push pixels, they move. AI tools push back. They have &quot;opinions&quot; (even if those opinions are statistical patterns).&lt;/p&gt;
&lt;p&gt;This changes the creative dynamic. It&apos;s less like using Photoshop and more like working with a very fast, very literal junior designer.&lt;/p&gt;
&lt;h2&gt;What I&apos;m noticing&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;The best outputs come from iteration, not first attempts&lt;/li&gt;
&lt;li&gt;Knowing what to ask for matters as much as the AI&apos;s capabilities&lt;/li&gt;
&lt;li&gt;The &quot;taste gap&quot; still applies—you need to know good from bad&lt;/li&gt;
&lt;li&gt;Speed enables experimentation in ways we haven&apos;t fully explored&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;Questions to explore&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Does AI assistance atrophy certain skills?&lt;/li&gt;
&lt;li&gt;What new skills does it require?&lt;/li&gt;
&lt;li&gt;How does credit/authorship work?&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;This note is very early. Will revisit as I work with these tools more.&lt;/p&gt;
</content:encoded><category>notes</category><category>ai</category><category>creativity</category><category>tools</category></item><item><title>Design Systems Thinking</title><link>https://www.davidhoang.com/notes/design-systems-thinking/</link><guid isPermaLink="true">https://www.davidhoang.com/notes/design-systems-thinking/</guid><description>How systems thinking applies to design and why components are just the beginning</description><pubDate>Sun, 15 Dec 2024 00:00:00 GMT</pubDate><content:encoded>&lt;p&gt;Design systems are often reduced to component libraries, but that misses the forest for the trees. A true design system is a &lt;strong&gt;way of thinking&lt;/strong&gt;—a shared language and set of principles that guide how a team approaches problems.&lt;/p&gt;
&lt;h2&gt;Components are outputs, not inputs&lt;/h2&gt;
&lt;p&gt;When teams start with components, they&apos;re working backwards. Components should emerge from understanding:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;What problems are we solving?&lt;/li&gt;
&lt;li&gt;What patterns keep recurring?&lt;/li&gt;
&lt;li&gt;Where do we need consistency vs. flexibility?&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;The three layers&lt;/h2&gt;
&lt;p&gt;I think about design systems in three layers:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Principles&lt;/strong&gt; — The &quot;why&quot; behind decisions&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Patterns&lt;/strong&gt; — Reusable solutions to common problems&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Components&lt;/strong&gt; — The tangible building blocks&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;Most teams jump straight to layer 3 and wonder why adoption is hard.&lt;/p&gt;
&lt;h2&gt;Living systems&lt;/h2&gt;
&lt;p&gt;The best design systems evolve. They&apos;re not perfect—they&apos;re &lt;strong&gt;good enough, and improving&lt;/strong&gt;. This requires:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Regular audits and pruning&lt;/li&gt;
&lt;li&gt;Clear contribution models&lt;/li&gt;
&lt;li&gt;Measuring what matters (adoption, consistency, velocity)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;The goal isn&apos;t to build a perfect system. It&apos;s to build a system that makes the team more effective.&lt;/p&gt;
</content:encoded><category>notes</category><category>design</category><category>systems</category><category>process</category></item><item><title>Building Product Intuition</title><link>https://www.davidhoang.com/notes/product-intuition/</link><guid isPermaLink="true">https://www.davidhoang.com/notes/product-intuition/</guid><description>Notes on developing the instinct for what makes products resonate</description><pubDate>Wed, 20 Nov 2024 00:00:00 GMT</pubDate><content:encoded>&lt;p&gt;Product intuition isn&apos;t magic—it&apos;s pattern recognition built through deliberate practice. The best product thinkers I know share a few habits.&lt;/p&gt;
&lt;h2&gt;They use everything&lt;/h2&gt;
&lt;p&gt;Not just products in their domain. They notice how grocery stores arrange aisles, how airports handle wayfinding, how games onboard new players. Every designed experience is a case study.&lt;/p&gt;
&lt;h2&gt;They ask &quot;why&quot; obsessively&lt;/h2&gt;
&lt;p&gt;Not just &quot;what does this do&quot; but &quot;why did they choose this?&quot; Every decision in a product is a bet. Understanding the bet helps you understand the thinking.&lt;/p&gt;
&lt;h2&gt;They ship and learn&lt;/h2&gt;
&lt;p&gt;Intuition without feedback is just guessing. You need the tight loop of:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Form a hypothesis&lt;/li&gt;
&lt;li&gt;Build something&lt;/li&gt;
&lt;li&gt;See what happens&lt;/li&gt;
&lt;li&gt;Update your mental models&lt;/li&gt;
&lt;/ol&gt;
&lt;h2&gt;Open questions I&apos;m still exploring&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;How do you balance intuition with data?&lt;/li&gt;
&lt;li&gt;When should you trust your gut vs. test?&lt;/li&gt;
&lt;li&gt;How do you transfer intuition to a team?&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;More to come as I think through these...&lt;/p&gt;
</content:encoded><category>notes</category><category>product</category><category>craft</category><category>learning</category></item></channel></rss>