Equilibria
Equilibria exist everywhere. The sciences are filled with them. The Earth-goes-around-the-Sun kind of thing.
Economics is tied very closely to game theory, which studies how rational economic actors interact in certain games. The key phrase there is *rational economic actors*. In these frameworks, you can very reliably predict how supply and demand will affect things like prices and human behavior. It’s almost scientific.
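To make “equilibrium” concrete, here’s a minimal sketch of the textbook supply-and-demand case (not from any specific model in this post; the linear curves and numbers are made up purely for illustration). The market clears at the price where the quantity supplied equals the quantity demanded:

```python
# Minimal sketch of a supply/demand equilibrium, assuming simple linear curves.
# All parameters are illustrative, not real market data.

def demand(price: float) -> float:
    """Quantity demanded falls as price rises."""
    return 100 - 2 * price

def supply(price: float) -> float:
    """Quantity supplied rises as price rises."""
    return 10 + 4 * price

# Equilibrium: 100 - 2p = 10 + 4p  =>  p = 15, quantity = 70
p_star = (100 - 10) / (2 + 4)
print(p_star, demand(p_star), supply(p_star))  # 15.0 70.0 70.0
```

Shock either curve (new technology, new preferences) and the intersection point, the equilibrium, moves. That’s the mechanic the rest of this section is about.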
Here’s a quick 4o chat on rational economic actors. What’s interesting is that the root of their rationality is in survivorship. When you back it all the way out, what’s “rational“ is really rooted in human needs. It’s psychological.
I have very, very little experience with psychology (and I know there are well-documented criticisms of this framework), but some of the most basic, generalized frameworks look something like Maslow’s Hierarchy:
At the bottom are our foundational needs and at the very top are social and self-fulfillment needs. The challenges to this basically say it’s too simplified: it’s not really stepwise, and it can be a mix of all or none at once. But generally, actors have this priority stack.
Just taking AGI, you start to realize that these technologies hit at a few layers of this pyramid (maybe all of them). At the very least, what we think of as a rational economic actor is about to change.
Yes, the way we think about *value* might be radically different. Soon. The equilibria might shift.
Scientific equilibria will hold, social ones, maybe not so much.
Layer on that our idea of an “actor“ will potentially change too, to the extent machines can have their own volition.
The Future
I’ve been posting about this in a few places with a lot of smart people. The resounding question I get is: what does the future look like?
In short, IDK ¯\_(ツ)_/¯
Generally, it seems pretty easy to predict the shape of the future, but very hard to predict the specifics.
When Covid started, was it easy to predict that everyone working from home would cause economic shock and social unrest? Sure. Could you have predicted the BLM riots specifically? No way.
When GPT-4 came out, could you have predicted that a robot that can parrot code would put a supply shock on engineering talent? Yeah, totally. Was it possible to predict that it would make senior engineers way more valuable, junior engineers much less valuable, and that regular people would still have a hard time coding? No; most predictions were that programmers would go away forever.
This is the same kind of thing. The general shape, though, is more extreme:
- Very unclear timelines, but near certainty of deployment
- High likelihood of social unrest as the technology is deployed
- Huge economic shock across most industries
- Initial political pushback
- A need for forms of UBI
- A meaning crisis
- A near-utopian long-term outcome
Specifics on how that plays out? IDK ¯\_(ツ)_/¯
I think possibly the bigger idea, though, is that how we define value changes too. The equilibria shift because our definition of survivorship changes. This happens all the time in smaller, but still significant, ways. The difference here will be the speed with which it could change.
The pyramid inverts. Not immediately, but there’s some transition where the economic incentives at the bottom become less important and the ones at the top become a lot more important.
The arbitrage is to build things for the top, not the bottom anymore. The value you accrue will look a lot different, but it will still be impactful.