Notes: Daniel Schmachtenberger - Jim Rutt Show
................................................................................................................................................................................
EP7 Daniel Schmachtenberger and the Evolution of Technology
Jim Rutt Show
................................................................................................................................................................................
Anything that has bottom-up coordination only, but abstraction-mediated capacities, like markets, is going to fall to multipolar traps. Multipolar traps with exponential tech will be catastrophically bad.
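The multipolar-trap claim is a game-theoretic one, and it can be sketched as a toy model. This is my illustrative assumption, not Schmachtenberger's formalism: each of N actors chooses to RESTRAIN or EXPLOIT a shared resource; exploiting always pays more individually, but universal exploitation leaves everyone worse off than universal restraint - so myopic best-response drives everyone into the trap.

```python
def payoff(my_choice: str, n_exploiters: int, n_actors: int) -> float:
    """Individual payoff given how many of the n_actors exploit."""
    commons_health = 1.0 - n_exploiters / n_actors  # degrades with exploitation
    base = 10.0 * commons_health                    # everyone shares the commons
    bonus = 5.0 if my_choice == "EXPLOIT" else 0.0  # private gain from defecting
    return base + bonus

def best_response(others_exploiting: int, n_actors: int) -> str:
    """A myopic actor compares its two options, holding the others fixed."""
    exploit = payoff("EXPLOIT", others_exploiting + 1, n_actors)
    restrain = payoff("RESTRAIN", others_exploiting, n_actors)
    return "EXPLOIT" if exploit > restrain else "RESTRAIN"

N = 10
# No matter what the others do, EXPLOIT dominates for the individual...
assert all(best_response(k, N) == "EXPLOIT" for k in range(N))
# ...yet universal exploitation is worse for everyone than universal restraint.
assert payoff("EXPLOIT", N, N) < payoff("RESTRAIN", 0, N)
```

The specific payoff numbers are arbitrary; the trap only requires that the private bonus from defecting exceed the individual's share of the harm it causes.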
43:35 - Because of the nature of evolved systems we get antifragility. Complicated systems subsuming their complex substrate increase fragility.
If I burn a forest it will regenerate itself; if I cut my body it will heal itself. If I damage my laptop it won’t heal itself.
Humans take the antifragility of the natural world and turn it into fragile stuff. We turn it into simple and complicated stuff. So we turn a tree, that’s antifragile and complex, into a two-by-four that is simple; or a house that is complicated; but both are fragile.
We have complicated systems subsume the complex systems, so we’re creating an increasingly higher fragility-to-antifragility ratio. We’re trying to run exponentially more energy through an exponentially more fragile system.
45:20 - The way humans solve problems tends to create worse problems. For the solution to solve the problem it has to be larger, faster, somehow bigger than the problem was.
The solution typically is to solve a very narrowly defined problem - [a small number of] metrics - and yet it’s going to interact with complex systems that affect lots of other metrics, where it will end up having externalized harm that is larger than the original problem.
The plough solved the problem of local famines, but ended up causing desertification and species extinction - and all these things writ large globally. The internal combustion engine solved the problem of too much horse shit in the cities and the difficulties of horses, but climate change, oil spills, wars over oil, and the destabilisation of the Middle East are the unintended externalities.
We can see the same for the value of Facebook compared to the unintended externalities it created.
I can define a problem in a narrow way, but that’s actually not the whole problem - it’s a little part of it. It’s the same with biotech - I can say the problem is one biometric that I’m trying to address, for instance LDL, and I can give something that lowers that, but it also might do a bunch of other things that are negative, which are the side effects. This is not a good approach to medicine.
The information processing that it takes to come up with a new piece of tech is orders of magnitude less than the information processing it takes to ensure that tech won’t have any externality in its long-term application. The safety analysis is going to end up being NP-hard, while the work it takes to come up with the tech is expressible in polynomial time.
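The build-versus-audit asymmetry can be made concrete with a toy analogy (mine, not from the episode): assembling a set of n components is O(n) work, but exhaustively checking every combination of components for a "bad interaction" - here modelled as a subset summing to a forbidden value, the classic NP-complete subset-sum stand-in - takes O(2^n) by brute force.

```python
from itertools import combinations

def build_tech(n: int) -> list[int]:
    """'Inventing' the tech: linear work to assemble n components."""
    return [3 * i + 1 for i in range(n)]

def has_bad_interaction(components: list[int], forbidden_sum: int) -> bool:
    """'Safety analysis': brute force over all 2^n - 1 non-empty subsets."""
    return any(
        sum(subset) == forbidden_sum
        for r in range(1, len(components) + 1)
        for subset in combinations(components, r)
    )

tech = build_tech(12)                    # 12 steps to build...
# ...but up to 2**12 - 1 = 4095 subsets to audit for one failure mode.
assert has_bad_interaction(tech, 50)     # e.g. 16 + 34 == 50
assert not has_bad_interaction(tech, 2)  # this failure mode can't occur
```

The point is only the scaling shape: each new component doubles the audit space while adding a constant to the build cost.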
51:43 - Why do we get so much concentration of sociopathy in the top of Fortune 500 companies, and politics, and especially finance?
They’re basically systems to attract, reward, incentivise and condition sociopathy.
People who are attracted to power, and people who are good at winning a bunch of Win-Lose games, get to the top of a power game. At each step as they move up the ladder, they’re winning against somebody else, usually via things like disinformation and deception. If you think about the nature of a government, or a corporation, or any top-down power system, it is basically a strange attractor for people who want to have power.
If there are forty or fifty - up to Dunbar’s number - people living in a tribe, there is an extraordinarily high degree of transparency that is forced in that scenario. Everybody pretty much sees what is going on with everybody else, everybody knows everyone, everyone has fealty relationships with everyone.
So sociopathy is not going to be advantageous - you’re not going to have an evolutionary niche in that environment for much in the way of conspiring and lying, because it will get found out and punished.
So the forced transparency creates an accounting system where you don’t get an evolutionary niche for somebody fucking the other people in the system. As soon as the system starts to get large enough that…
- there are anonymous people, so I can harm people who I don’t really know and care about
- I can do stuff that people won’t be able to see; if I can have a corruption of the accounting in the system…
…we get an evolutionary niche for internal defection, rather than participating with the system. I’m not externally defecting and leaving the system, I’m internally defecting and playing the system.
Most people inside a corporation or a government are optimising what is good for them and their direct fealty relationships, rather than what is good for the whole - and nobody can tell.
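The forced-transparency argument reduces to an expected-value calculation, sketched here with illustrative numbers of my own choosing: internal defection has an evolutionary niche only when the odds of being detected are low enough that the expected punishment no longer outweighs the gain.

```python
def defection_payoff(gain: float, punishment: float, p_detected: float) -> float:
    """Expected value of cheating the group once."""
    return gain - p_detected * punishment

# Small tribe: everybody sees everything, so detection is near-certain.
tribe = defection_payoff(gain=5.0, punishment=20.0, p_detected=0.95)

# Large anonymous institution: most actions are invisible to most members.
institution = defection_payoff(gain=5.0, punishment=20.0, p_detected=0.05)

assert tribe < 0 < institution  # cheating only pays in the anonymous system
```

Holding gain and punishment fixed, the niche for internal defection opens purely as a function of how much of the system's accounting anyone can actually see.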
We do our social science inside of a world where these systems have become ubiquitous, and we assume that the properties produced by that ubiquitous conditioning are intrinsic to human nature. We have to be careful about that, because I think a lot of them are not intrinsic to human nature - they are a result of the ubiquitous conditioning - and we could create conditioning environments in which things like sociopathy are not advantageous and don’t get up-regulated.
................................................................................................................................................................................