Embracing the Ouroboros
Note: Paul and Eric are managing partners in SK Ventures, an early-stage venture fund. In these essays they share their thinking out loud about the nature of venture investing, AI, and related landscapes. You can find out more about us here.
As quickly as AI-infused applications are emerging, it has already become clear that a new paradigm is required for thinking about how they will be applied. Old technology ecologies built around layers no longer make sense. While many are already applying a compute mental model, that is too narrow and limiting, even if understandably appealing. We see the next generation of AI-infused apps being built as part of an altogether new kind of operating system, not just a variant of what came before.
The Old Compute Layer Model
Before laying out our vision of where we’re going, let’s quickly walk through what we mean when we say traditional compute models. The oldest model is on the left in the following figure, where everything happens on-premises and is managed there as well. Innovation can happen at any or all of those layers, but they can be treated as fairly discrete, for practical purposes.
Over time this model has evolved, with more and more of what used to happen on-premises being turned into services managed by other vendors. A significant fraction has ended up at Amazon Web Services (AWS), but it ebbs and flows over time, with more or less of the stack being managed on-premises, depending on needs, opportunities, and risks.
What hasn’t changed is thinking in terms of discrete layers: applications, operating systems, servers, storage, and networking. These layers have been a convenient abstraction, a way of unbundling what is going on, even if only rhetorically, to better understand what is happening.
Of course, we are investors. We are less interested in how people describe things than in how thinking in these terms—insofar as they are based on current or impending reality—creates or eliminates opportunities. The rise of layers in the earliest days created opportunities at every level of the stack, from networking to applications, even if not all at the same time. The subsequent emergence of cloud services created new opportunities for provisioning, metering, and managing services that were hosted in one place and being delivered in another. Every change in compute models creates new opportunities for entrepreneurship and investment.
And that brings us to today. There is a temptation to think in layers again, given that it has worked before, and given that LLMs can be thought of as an operating system into which things plug. To that way of thinking, Windows is to OpenAI as Ubuntu is to Llama: proprietary operating systems vs. open source ones. Developers decide which risks they want to take, and what they will pay, and then build accordingly.
This is fine, as far as it goes, but it misses something important. While LLMs in particular, and AI in general, can be thought of as an operating system, it is a very different kind of operating system. Unlike, say, Windows, which insulated users and apps from hardware, creating an abstraction layer to which developers could write, AI is far more pervasive. Its effects will be felt at all layers of the old stack, from operating systems and networking, to data and apps.
To our way of thinking, LLMs (and more broadly, AI) will not so much be a layer as a kind of benign growth, a metastasizing clump of cells that, via the bloodstream, penetrates all aspects of what passes for compute. At the same time, like those clumps of cells, it will have a huge degree of autonomy, self-referentiality, and self-direction, making its own decisions about what it will do, and how. In that way, AI will jump up and down the stack constantly, inserting itself at all levels, creating pipes, portholes, interconnections, and processes. We think of it as ambient, immersive, holistic, and latent; continuously observing, optimizing, improving, and enhancing both itself and that with which it connects.
The Ouroboros Operating System
In a loose sense, this is an entirely new operating system, but one much broader, deeper, and more ubiquitous than our past notions of operating systems. The next generation of apps will be built on an ambient OS, an OOS: an Ouroboros operating system – one that is built on and consumes itself. And like the Ouroboros of myth, in consuming itself it grows and flourishes.
What does this mean? We are seeing the very beginnings now as we exit the early text-based stage of LLMs and head for what are being called LMMs (large multimodal models). The former are grammar engines, terrific at predicting what should come next—whether in legal documents, school essays, software, CSS, fiction, and so on—based on textual grammars and training data. The next generation will be far more sensorially engaged, collecting continuous data from audio, video, text, and even touch.
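To make the "grammar engine" idea concrete, here is a toy sketch of our own (a drastically simplified stand-in, not how production LLMs actually work): a bigram model that learns which word tends to follow which, then predicts the next word. Real LLMs do this over tokens with billions of learned parameters, but the core task—predict what comes next from what came before—is the same.

```python
from collections import defaultdict

def train_bigrams(text):
    """Count, for each word, how often each other word follows it."""
    words = text.split()
    counts = defaultdict(lambda: defaultdict(int))
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(counts, word):
    """Return the most frequent follower of `word`, or None if unseen."""
    followers = counts.get(word)
    if not followers:
        return None
    # A real model samples from a probability distribution; here we
    # simply take the most frequent follower (ties broken by insertion order).
    return max(followers, key=followers.get)

corpus = "the model predicts the next word and the next word follows the model"
counts = train_bigrams(corpus)
print(predict_next(counts, "the"))  # → "model"
```

A grammar engine of this kind is purely textual; the shift to LMMs replaces the single text stream with continuous audio, video, and other sensory streams feeding the same predictive machinery.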
Here are some highlights of where we think we are going (we will expand on these in future essays):
- Unprecedented Personalization: Tailoring based on unique user needs and preferences.
- Natural Interactions: AI understands context and semantics beyond explicit commands.
- Self-Optimizing Systems: AI monitors and enhances its own performance over time.
- Continuous Evolution: AI platforms adapt and evolve, outpacing static systems.
- Enhanced Threat Detection: Holistic anomaly and threat identification.
- Revolution in Development: AI modifies code based on high-level goals, reshaping software creation.
- Fluid Tech Ecosystem: Seamless communication between AI systems, devices, and platforms.
- Job Role Transformation: AI/LLMs undertake complex tasks, altering industry landscapes.
- Empowering Collaboration: Real-time insights, automation, and predictive analytics.
- Democratized Expertise: AI lets non-experts perform at expert levels.
- Human & AI Synergy: AI amplifies human capabilities for co-creative solutions.
- Paradigm Technology Shift: Transition from deterministic to adaptive, ever-evolving systems.
The Post-AI Investment Landscape
Returning to the platform figure above, what will this look like? It will look less like discrete layers than like blended ones, all connecting in that Ouroborosian sense we describe above. But it goes deeper than that. To push the metastatic idea further, one of the hallmarks of this kind of cellular change in an organism is vascularization, the appearance of new blood vessels throughout the organism. Rather than cells being fed from old sources, new supply paths emerge, often in unpredictable and evolutionary ways.
The following figure summarizes our thinking, taking the old layer model, but transforming it for a post-AI landscape. The world starts off layered, then steadily becomes blurred and interconnected. Over time, new vascularization appears, creating new pathways and flows that didn’t previously exist. The source of the blurring and the new bloodlines is AI, which no longer must abide by the old layers and ways, which were largely constraints driven by the limitations of the past. The post-AI investment Ouroboros is a very different creature altogether, and investing along the old lines will lead to nonexistent moats and a general failure to thrive.
There are two points to keep in mind in considering this process. First, some of it is formal, with explicit new connections created, as shown below by the path connecting various boxes along the right side. Second, much of the new path-making across the former layers will be informal, what we have termed “vascularization.” That is to say, new paths will emerge organically as AIs evolve to best meet their own needs in service of one or more objectives.
The Future
It seems increasingly likely that the only choice ahead will be to what extent we embrace the Ouroboros. Researchers observing employees in AI-infused workplaces already find them separating into two camps: centaurs (who divide work cleanly between human tasks and AI tasks) and cyborgs (who weave AI into everything they do, living inside it all day, every day). Whether this sounds palatable or not, those are increasingly likely to be the only choices as the AI infusion accelerates.
We are staunchly in the camp that we must focus on ways to encourage thriving in this transition. We cannot, as we have previously written, afford another Engels’ pause, with decades of stagnant worker wages even as society overall does better, however defined.
But make no mistake: the change is here. What we think people don’t yet grasp is its pervasiveness, as AI, driven by the evolution from LLMs to LMMs, becomes part of the bloodstream. The past’s thinking in layers will give way to thinking across the whole organism of tech. Everything changes when all five senses are deployed, ambient and ingesting, with people mostly hanging on for the ride.