Monday, December 10, 2018

Pathfinders: The Only Way Is Ethics (Not) (2018)

Iceland's palm oil ad
The Pathfinders Column from the December 2018 issue of the Socialist Standard

Is there anything you do, eat, wear or travel in that isn’t bad for the environment? Palm oil, used in a zillion products, is now being demonised as the new plastic, and one frozen food company has taken the ‘ethical’ decision to stop buying in palm oil products, while astutely trying to capitalise on this strategy with a Christmas ad featuring a cutesy kid and a baby orang-utan, who sadly shows her his forest utopia being charred and bulldozed for the sake of her hair products. The ad went viral on YouTube after regulators banned it from TV because it was produced by Greenpeace, deemed a ‘political’ organisation, although by our definition they’re not, since they don’t stand candidates for parliament. At the time of writing there is a heart-warming online campaign to overturn the ban (‘Iceland Christmas ad: Petition to show it on TV hits 670k’, BBC Online, 13 November). No doubt the firm’s marketing director can expect a stuffed bonus in their Christmas stocking for this crafty coup.

Christmas is always the perfect time for guilt-trips which invite you to pause and reflect, during your orgiastic overspending, on your ethical profile, that thing about which you feel least certain and most guilty. But what do we understand by the word ‘ethics’ and how useful is it? Dogs, elephants and other animals are known to have a moral sense, and we humans regard the absence of a moral compass as a clinical defect. We devise moral frameworks, often hijacked by religions as the work of some deity, to codify our values, our social concepts and our politics. This is probably a form of evolutionary heuristic, or shorthand guide, since we don’t have smart enough brains to calculate good survival strategies on demand. Instead we feel them as right or wrong, through some obscure associative process nobody really understands.

But there are inevitable problems with allowing your moral compass to do the driving. What if you have the wrong information? Have you corrected for your internal biases? If your morality doesn’t square with other people’s, who is to say who’s right?

Among those currently grappling with the complexities of moral codes are the programmers trying to design the AI systems in self-driving cars. What has them perplexed is the nightmare no-win crash scenario known as the ‘trolley problem’, in which you can only avoid killing one lot of people by diverting your runaway tram (‘trolley’) down a different track and killing a different bunch of people. You can tweak this problem any way you like, by varying the characteristics of your two groups of ‘victims’, to see what difference this makes to people’s ethical choices.

Being good scientists, they approached this empirically and conducted a numbers exercise to see if they could derive a baseline consensus. What would most people want a self-driving car to do in such a situation? Unfortunately it depends who you ask. The Moral Machine survey collected 40 million decisions from 233 countries and territories, and found that while on average humans were prioritised over animals and younger people over older (unless the humans were criminals, in which case they rated lower than cats), the regional differences were strikingly hard to integrate into a viable framework. For instance, the young-over-old preference was much less apparent in Asiatic and Islamic countries, as was high-status-over-low. South America and countries with French colonial ties were less inclined to save humans over animals, unless the humans were specifically women or non-disabled people.

The problem for the programmers is that computer code relies on absolutes, and with morality there are no absolutes, only relatives. No wonder one ethicist describes the task of giving morals to motor vehicles as ‘finding the right comedic parabola, or the right colour of dance, or the right frequency for spaghetti’ (New Scientist, 27 October).
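To make the programmers’ difficulty concrete, here is a minimal, purely hypothetical sketch (in Python, with an invented priority table and function name; it is not drawn from any real vehicle’s software) of what encoding a crash decision as an absolute rule would have to look like:

# Hypothetical sketch: a hard-coded 'ethical' priority table for a crash decision.
# Every number here is an absolute the programmer must commit to in advance,
# which is exactly what the Moral Machine results suggest no global consensus exists for.

PRIORITY = {          # higher number = spared first (figures invented for illustration)
    "child": 3,
    "adult": 2,
    "elderly": 1,
    "animal": 0,
}

def choose_track(group_a, group_b):
    """Return the group of 'victims' the car should spare."""
    score = lambda group: sum(PRIORITY[v] for v in group)
    return group_a if score(group_a) >= score(group_b) else group_b

# The code must give one fixed answer, even though the survey shows the 'right'
# answer varies by region, culture and how the question is framed.
print(choose_track(["elderly", "elderly"], ["child"]))   # -> ['child']

However the table is filled in, the program commits in advance to a single ranking that the survey data suggests no two regions would agree on.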

As soon as you start asking ethical questions you get contradictory answers, and there is no objective yardstick, agreed by all, by which to judge them. So is it possible to use such a subjective approach to arrive at a consensual programme of action for the planet?

No, it isn’t. That’s why when we’re making the case for socialism we prefer to stick to the facts. If the world is going to steer its way to a sustainable future instead of destruction, it’s going to need a practical and accurate roadmap more than it needs gods or cutesy ads or an impassioned polemic.

Place Your Bets Please
If you prefer gas to electric cookers it’s probably because of the near-instant response when you adjust the heat settings. Cooking with electric involves too much thinking ahead, and an adjustment that’s fractionally too high can result in milk boiling over onto the stove. A similar problem exists in long-latency industries like oil, where adjustments today ‘feed through’ to supply or price levels years down the line. This gives rise to a volatile futures market, which speculates on future supply and prices. Today’s US sanctions against Iran have caused large producers to pump at full capacity, but fears of future oversupply are depressing the futures market, in turn causing rampant selling and falling stock prices today, which of course will have knock-on effects on industry, including food production (‘Oil rally faces tidal wave of supply’, Reuters, 4 November). So today’s activities are not determined by today’s objective and demonstrable necessities, but by some people’s guesses at what the price of these things will be in a few years’ time. If you think that sounds like reckless fast-buck gambling rather than responsible resource management, you’ve just hit on one essential difference between capitalism and socialism.
Paddy Shannon
