To decarbonize we must decomputerize: why we need a Luddite revolution


Our built environment is becoming one big computer. “Smartness” is coming to saturate our stores, workplaces, homes, cities. As we go about our daily lives, data is made, stored, analyzed and used to make algorithmic inferences about us that in turn structure our experience of the world. Computation encircles us as a layer, dense and interconnected. If our parents and our grandparents lived with computers, we live inside of them.

A growing chorus of activists, journalists and scholars is calling attention to the dangers of digital enclosure. Employers are using algorithmic tools to surveil and control workers. Cops are using algorithmic tools to surveil and control communities of color. And there is no shortage of dystopian possibilities on the horizon: landlords evicting tenants with “smart locks”, health insurers charging higher premiums because your Fitbit says you don’t exercise enough.

Digitization doesn’t just pose a risk to people, however. It also poses a risk to the planet. July was the hottest month on record. Large chunks of the Arctic are melting. In India, more than half a billion people face water shortages. Putting computation everywhere directly contributes to this crisis. Digitization is a climate disaster: if corporations and governments succeed in making vastly more of our world into data, there will be less of a world left for us to live in.

To understand the relationship between data and climate, the best place to start is machine learning (ML). Billions of dollars are being spent on researching, developing and deploying ML because major breakthroughs in the past decade have made it a powerful tool for pattern recognition, whether analyzing faces or predicting consumer preferences. ML “learns” by training on large quantities of data. Computers are stupid: babies know what a face is within the first few months of life, but for a computer to know what a face is, it must learn by looking at millions of pictures of faces.

This is a demanding process. It takes place inside the data centers we call the cloud, and much of the electricity that powers the cloud is generated by burning fossil fuels. As a result, ML has a large carbon footprint. In a recent paper that made waves in the ML community, a team at the University of Massachusetts, Amherst, found that training a model for natural-language processing – the field that helps “virtual assistants” like Alexa understand what you’re saying – can emit as much as 626,155 pounds of carbon dioxide. That’s about the same amount produced by 125 roundtrip flights between New York and Beijing.

Training models isn’t the only way ML contributes to the cooking of our planet. It has also stimulated a hunger for data that is probably the single biggest driver of the digitization of everything. Corporations and governments now have an incentive to acquire as much data as possible, because that data, with the help of ML, might yield valuable patterns. It might tell them who to fire, who to arrest, when to perform maintenance on a machine or how to promote a new product.





‘Digitization doesn’t just pose a risk to people. It also poses a risk to the planet. In India, more than half a billion people face water shortages.’ Photograph: R Parthibhan/AP

One of the best ways to make more data is to put small connected computers everywhere: Cisco predicts there will be 28.5bn networked devices by 2022. Aside from the energy required to manufacture and maintain those devices, the data they produce will live in the carbon-intensive cloud. Data centers currently consume 200 terawatt-hours of electricity per year – roughly as much as South Africa. Anders Andrae, a widely cited researcher at Huawei, tells me that number is likely to grow four to five times by 2030. This would put the cloud on a par with Japan, the fourth-biggest energy consumer on the planet.

What can be done to curb the carbon costs of data? Greenpeace has long pushed cloud providers to switch to renewable energy sources and improve efficiency. These efforts have seen some success: the use of renewables by data centers has grown substantially, while efficiency gains from better techniques and bigger economies of scale have moderated the cloud’s power consumption in recent years. When it comes to ML, a group of researchers are calling for a more energy-conscious approach, which they call “Green AI”. These are encouraging trends, and tech workers themselves are likely to play a key role in advancing them: Amazon employees have been organizing for a climate plan since late last year, and recently announced a global walkout for 20 September. Among their demands: that the company commit to zero emissions by 2030 and stop selling cloud services to fossil fuel companies.

But it’s clear that confronting the climate crisis will require something more radical than just making data greener. That’s why we should put another tactic on the table: making less data. We should reject the assumption that our built environment must become one big computer. We should erect barriers against the spread of “smartness” into all of the spaces of our lives.

To decarbonize, we need to decomputerize.

This proposal will no doubt be met with charges of Luddism. Good: Luddism is a label to embrace. The Luddites were heroic figures and acute technological thinkers. They smashed textile machinery in 19th-century England because they had the capacity to perceive technology “in the present tense”, in the words of the historian David F Noble. They didn’t wait patiently for the glorious future promised by the gospel of progress. They saw what certain machines were doing to them in the present tense – endangering their livelihoods – and dismantled them.

We are often sold a similar bill of goods: big tech companies talk incessantly about how “AI” and digitization will bring a better future. In the present tense, however, putting computers everywhere is bad for most people. It enables advertisers, employers, and cops to exercise more control over us – in addition to helping heat the planet.

Fortunately, there are latter-day Luddites working to stem the tide. Community groups like the Stop LAPD Spying Coalition are organizing to shut down algorithmic policing programs. A growing campaign to ban the government use of facial recognition software has won important victories in San Francisco and Somerville, Massachusetts, while workers at Amazon are calling for the company to stop selling such software to law enforcement. And in the streets of Hong Kong, protesters are developing techniques for evading the algorithmic gaze, using lasers to confuse facial recognition cameras and cutting down “smart” lamp-posts equipped with monitoring devices.





Teenagers and students take part in a climate protest outside the White House in Washington on 13 September 2019. Photograph: Nicholas Kamm/AFP/Getty Images

These are just a few possible sources of inspiration for a broader movement for decomputerization, one that pursues social and ecological goals simultaneously. The premise of the Green New Deal is that we can make society greener and more equitable at the same time – that we can democratize as we decarbonize. We should apply the same logic to our digital sphere. Preventing a local police department from constructing an ML-powered panopticon is a matter of algorithmic, social and climate justice. As they used to say in the 1960s: one struggle, many fronts.

For such a struggle to be successful, however, resistance is not enough. We also need a vision of the future we want. Again, the history of the Luddites can be helpful. In 1812, a group of Yorkshire Luddites sent a factory owner a letter promising continued action until “the House of Commons passes an Act to put down all Machinery hurtful to Commonality”. Following their example, we might derive a simple Luddite principle for democratizing technology: we should destroy machinery hurtful to the common good and build machinery helpful to it.

What does this mean in practice? It’s hard to think of anything more hurtful to our common life than heating large portions of the planet beyond habitable levels. Privacy advocates have long called for companies to restrict their collection of data to the minimum necessary to perform a service – a principle now enshrined in the GDPR, the EU’s omnibus data regulation. A 21st-century Luddism should embrace this principle but go further. What matters is not only how much data a service collects, but what imprint that service leaves upon the world – and thus whether it should be performed at all.

Decomputerization doesn’t mean no computers. It means that not all spheres of life should be rendered into data and computed upon. Ubiquitous “smartness” largely serves to enrich and empower the few at the expense of the many, while inflicting ecological harm that will threaten the survival and flourishing of billions of people.

Precisely which computational activities should be preserved in a less computerized world is a matter for those billions of people themselves to decide. The question of whether a particular machine hurts or helps the common good can only be answered by the commons itself. It can only be answered collectively, through the experiment and argument of democracy.

The zero-carbon commonwealth of the future must empower people to decide not just how technologies are built and implemented, but whether they’re built and implemented. Progress is an abstraction that has done a lot of damage over the centuries. Luddism urges us to consider: progress towards what and progress for whom? Sometimes a technology shouldn’t exist. Sometimes the best thing to do with a machine is to break it.



