Limiting factors (Mental Models XI)

One interesting way of approaching a problem is to identify its limiting factors – what is it that sets the ultimate limits for progress on a particular issue? In many cases it will be time – there are some things we could do in theory, but where the time needed exceeds the time available to us, individually or cosmologically. Time, then, is a limiting factor.

Another interesting limiting factor is energy.

Assume we build an enormously powerful computer that runs an artificial general intelligence, and that it consumes 0.1% of the world’s energy – energy then becomes a limit on how many such computers we can run, and so also a limit on how much machine cognition we can produce. As it happens, research in artificial intelligence has advanced fast, and today’s most advanced systems consume perhaps as much energy as 10-20 human beings, so we may never run into that limiting factor for AI.
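The arithmetic behind this can be made explicit. As a back-of-envelope sketch – where the world power figure and the human metabolic figure are illustrative assumptions, not data from the text:

```python
# Back-of-envelope: if one hypothetical AGI computer consumed 0.1% of the
# world's energy, energy alone would cap us at about 1,000 such machines.
# All figures below are illustrative assumptions.

AGI_SHARE = 0.001               # assumed share of world energy per AGI computer (0.1%)
max_agi_computers = int(1 / AGI_SHARE)
print(max_agi_computers)        # 1000

# Contrast: a system consuming as much energy as 10-20 humans.
WORLD_POWER_W = 19e12           # assumed average world power consumption, ~19 TW
HUMAN_POWER_W = 100.0           # rough metabolic power of one human, watts
ai_power_w = 20 * HUMAN_POWER_W # upper end of the 10-20 human range: 2 kW
share_of_world = ai_power_w / WORLD_POWER_W
print(f"{share_of_world:.2e}")  # a vanishingly small fraction of world energy
```

The point of the contrast is that the first scenario hits the energy limit after a thousand machines, while the second is so far from the limit that energy is not the binding constraint at all.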

Wiener and the cybernetic notion of limits in systems give us an interesting lens for understanding technology.

But we could run into another limiting factor – and that is rare earth metals or minerals needed for producing computer chips. The amount of raw material here is another limit on how many computers we can build and will ultimately determine the computational capacity we can access.

If these limiting factors are mostly about access to a resource, there are others that are more about our knowledge. Complexity is emerging as an interesting limiting factor here – some systems are becoming so complex that they are no longer accessible to most of us, and this is not just true for computers: could you really explain how your fridge works? But individual understanding is a different thing from complexity as a limiting factor in the strict sense, where the real question is when the complexity of our global ecosystem makes it impossible for us to influence it – where our interventions will have little or no effect on what actually happens.

Complexity leads, at some point, to a loss of causal agency – and this presents a tricky limiting factor as well. We could imagine a state of our global ecosystem in which no set of actions available to us could reliably slow down climate change, for example. Are we there? Probably not. Are we closer now than 100 years ago? Absolutely.

For organizations the limiting factors are more mundane – they have to do with people, time and cash, but also with computing resources. An organization can be understood and approached as a set of limiting factors, both external and internal: the external ones include things like regulation, the internal ones the resources just mentioned.

The value of focusing on limiting factors is that it gives you a sense of what is not possible for an organization to do; it delineates the search space for solutions and possible evolutions.

When we think about technology and limiting factors, we can outline what certain technologies can do with tests – self-driving tests and Turing tests are examples of finding limits for a technology and assessing when they are broken. What makes artificial intelligence really interesting here is that it can be deployed to adjust and change its own limits, much like we can as human beings. That aspect – self-modification – was the focus of some of the early cybernetics (Wiener et al. speak of self-reproducing machines and self-organizing systems, which for them include the ability to self-modify), but it seems to have receded into the background a bit now, which is weird. Any technology that starts to be able to modify its limits or improve itself is in a very different category than most technology we know of today.
