Sunday, 25 February 2018

Thinking Ahead (by Nigel Warburton)

Søren Kierkegaard famously pointed out that the only way we can understand life is backwards – we are compelled to live moving forwards, but attempt understanding by looking at what has happened. Perhaps he had in mind Hegel’s Owl of Minerva, which flies only at dusk. For Hegel, as history unfolds we collectively come to a moment of self-consciousness, an understanding of what has transpired on a grand scale – symbolised by the Owl’s wisdom. Kierkegaard, a fierce opponent of Hegelianism, was interested in the individual’s experience, not in epochal movements of history. Yet, as he made clear, the insight that we live forwards and understand backwards doesn’t make things any easier. We are constantly moving forwards (until death), so we never reach a settled vantage point from which to look back.
The future is the stuff of planning and daydreams. We all spend many days of our lives musing on it, despite the injunctions of mindfulness gurus to live in the present. In the next hour you’ll probably start thinking about things you need to get on with, and even the words ‘in the next hour’ will probably have triggered thoughts about your immediate future. If we did just live in the moment, in the now, we would lack motivation to go forward, lack direction, and would stumble from one situation to the next. We would probably leave the house without an umbrella, miss appointments, and not have any food in the refrigerator when we got home. Some of us do live like that, but that’s hardly a good thing. Yet the paradox of planning is that we both need to do it, and haven’t really got much of an idea how things will turn out – so much of it feels like a waste of time. Even our rough predictions, based on what seems like good evidence, can be wildly inaccurate.
In his recent book What We Cannot Know, the mathematician Marcus du Sautoy demonstrates the complexity of predicting something as seemingly straightforward as the trajectory of a cube – a die – thrown from a known height. The way this object will bounce and spin is extremely difficult to predict even when we know a great deal about the physics involved. The same is true of a double pendulum (a pendulum with a hinged second arm). A minuscule difference in the angle of the throw will produce radically different results for both the die and the pendulum. Released from almost exactly the same position, the pendulum might swing smoothly, or go into a complex pattern of spinning backwards and forwards. Close analysis of the casino croupier’s angle of throw is unlikely to predict accurately the numbers that will show up on the dice; and knowing roughly the height from which a double pendulum is released does not allow a physicist to give even an approximate estimate of its trajectory. This is disconcerting for anyone brought up to believe that scientific prediction of physical systems is pretty straightforward.
Chaos Theory is the branch of mathematics that has developed to describe such situations: we are surrounded by systems that are in principle predictable if we know a great deal about their starting conditions, but where in practice accurate prediction is impossible, because tiny differences in the present produce wildly divergent outcomes. This is sometimes known as the ‘butterfly effect’ after the meteorologist Edward Lorenz, who asked in 1972: “Does the flap of a butterfly’s wings in Brazil set off a tornado in Texas?” He didn’t necessarily think that it did. His question was: given the complexity of causes affecting the weather, is it possible that significantly different meteorological conditions in the US could be caused by even something as gentle and small-scale as the flap of an insect’s wings thousands of miles away? He asked this because the weather seemed to be a case study in the difficulty of predicting outcomes from approximations.
There is no mysterious ‘chance’ operating here. As David Hume pointed out in the 18th century, we use the word ‘chance’ where we are ignorant of causes. Putting things down to ‘chance’ is just a way of saying we’re not quite sure what’s going on. But with complex systems like the one that gives rise to a tornado, approximate knowledge of the starting conditions may not be sufficient to make even approximate predictions of the outcome (in this case a tornado), since minute differences can be responsible for the system tipping over into one state or another. Lorenz put this nicely in his definition of Chaos:
“When the present determines the future, but the approximate present does not approximately determine the future.”
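Lorenz’s definition can be made concrete in a few lines of code. The sketch below uses the logistic map, a standard textbook toy model of chaos (my choice of illustration, not an example from the essay): two starting points that differ by one part in a billion – an approximation any measuring instrument would call identical – soon produce completely different trajectories.

```python
# Sensitive dependence on initial conditions, illustrated with the
# logistic map x -> r * x * (1 - x), which behaves chaotically at r = 4.0.

def logistic_trajectory(x0, r=4.0, steps=60):
    """Iterate the logistic map from x0, returning the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)          # one starting point
b = logistic_trajectory(0.2 + 1e-9)   # a 'butterfly flap' away

# Gap between the two trajectories at each step: tiny at first,
# then growing until the two futures bear no resemblance.
divergence = [abs(x - y) for x, y in zip(a, b)]
print(f"gap at step 5:  {divergence[5]:.2e}")
print(f"largest gap:    {max(divergence):.2e}")
```

The present (the exact starting value) fully determines the future here – the rule is simple and deterministic – yet the approximate present does not approximately determine it.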
In my lifetime so much that has happened wasn’t foreseen. So much could have turned out completely differently. In the year I was born, the Cuban Missile Crisis could so easily have escalated to a nuclear war between superpowers. When I was at school in the 1970s, only the most fanciful of sci-fi authors would have imagined the level of interconnection and global communication made possible by the Internet. The idea that many people would be carrying a small computer in their back pockets, or that there would be driverless cars on the road, would have struck most people as far-fetched.
Today, many people are confidently predicting that robots and computers will soon be doing most jobs. Some are worried that artificial intelligences will take over the world, and won’t have much patience with the comparatively limited intellectual capacities of human beings. Yet who knows what will really happen? Clever people make huge mistakes in their predictions. The philosopher Ludwig Wittgenstein, who had trained as an aeronautical engineer, and was cleverer than most, declared that no one would ever reach the Moon only a few decades before someone actually did.
The solution isn’t to avoid thinking about the future, nor to stagger into disaster with our eyes closed. We need to make predictions. We need to contemplate what might happen, what is likely to happen, and what might happen if things go horribly wrong. Weather forecasters carry on forecasting, aware that their predictions can occasionally be wildly off the mark for the reasons that Lorenz pinpointed. Like weather forecasters, we should recognise how chaos plays a part in life, and how difficult an activity prediction can be. And like weather forecasters, we need to keep revising our predictions in the light of new evidence in a fast-changing environment. We need to be sensitive to subtle shifts in the present that might have far-reaching implications in the future.
“What’s the worst that could happen?” It’s a question favoured by Cognitive Behavioural Therapists as a way of limiting fear of failure, and one worth asking even if we don’t know the accurate odds of the worst-case scenario coming to pass. Individually and collectively, worst-case scenario risk planning might be what saves us. At worst it will mean we prepare for some disaster that never materialises. There is still the risk, of course, that something much worse than anything we can now imagine takes place. There is the risk too that we will become gloomy and fearful of the future, a state of mind that may itself affect outcomes and limit our success. Rather than encourage that, it might be better to balance out worst-case scenario thinking with musing about the best that could happen. We should spend some time daydreaming about a utopian future to have an ideal to aim at too. There’s always the chance that a flapping butterfly wing somewhere might just tip us over into conditions that move us a step towards that future.

Originally published in: New Philosopher
