I’ve been wondering about the Frey categorisation1. It gives a % likelihood of a whole profession being automated, e.g. architecture 2%. That’s a low number, and people are justifiably happy about the job security that implies.
What it doesn’t tell you is the internal impact of automation on that profession. E.g. delivery drivers can be replaced by robotic cars, but until the car can get the item from the van to the final destination (into the letter box), there will still need to be a human in the van. Once that last hurdle is overcome, the whole profession will disappear rapidly.
The other way automation can manifest itself is through a gradual chipping away of the need for humans to do a job. I think we saw this in action at the end of the GFC, when GDPs recovered but employment didn’t; people had been kept on in jobs that had already been automated to some large proportion.
I’ve had this idea in my head that progress isn’t like a neat line moving forwards in time. It’s like water moving over a bumpy beach. Sometimes it rushes ahead, and other times it’s held up behind an obstacle. Eventually that obstacle falls and the pressure is released.
If we think of the surface of the sand as a metaphor for a simplified version of a technological, social and moral landscape, then sometimes a thing is just beyond a hump, and the hump is there because we haven’t discovered a business model that’ll make it viable, or society isn’t morally ready for it. Sometimes the water will rush down the free channels and then backfill the areas dammed off by unreadiness.
I realise that some people will have a problem with me talking about progress as if it’s an unqualified good, or that it’s inevitable. I follow the school of thought that thinks that if a technology is possible then it’s inevitable given infinite time2. I don’t think that it’s at all obvious that it will necessarily be good, but there is a strong historical bias towards goodness on the whole3.
I think that this is quite a valuable way of thinking about progress. I try to think of what’s holding the development of something up, and think of ways to go around it. Or I try to think about what might be over the next hump, given that I have x and y technology and z social change.
This topic is talked about in the video above too. The bias towards goodness might be a misperception, because we don’t care enough about the people for whom the progress was bad. E.g. logging hasn’t been all that good for remote tribes, and European colonisation wasn’t that good for South Americans. But these aren’t inevitable outcomes of any given technology. ↩