Top 10 technology and ethics stories of 2020 – ComputerWeekly.com

Posted: January 1, 2021 at 9:18 am

The year 2020 was shaped by the global pandemic and by international outcry over institutional racism and white supremacy.

A number of technology companies, for example, came under sustained scrutiny for their ties to law enforcement and how, despite their proclamations of using tech for good, their products are used to further entrench racist policing practices.

Facial recognition was another major focus of Computer Weekly's 2020 coverage. On the one hand, police use of the technology in south Wales has been found unlawful, while on the other, both public and private sector bodies are racing to develop facial recognition that can work on people wearing masks or other face coverings, which could severely limit people's ability to protest or even exercise their basic privacy rights.

Big tech also came under fire from lawmakers around the world for anti-competitive business practices, bringing the possibility of antitrust action much closer to reality, while Amazon in particular caught flak for its poor treatment of workers throughout the pandemic.

Computer Weekly also looked at where the raw materials that technology companies rely on, such as cobalt, coltan and lithium, are sourced from, and the negative consequences this has for people living in these mineral-rich areas.

Here are Computer Weekly's top 10 technology and ethics stories of 2020:

Following a massive international backlash against police racism and brutality sparked by the killing of George Floyd in Minneapolis in May 2020, private technology companies started coming under increased scrutiny for their relationships with law enforcement.

Within a month, the protests prompted tech giants Amazon, Microsoft and IBM to halt sales of their respective facial-recognition technologies to US law enforcement agencies. However, all three remained silent on how other technologies, such as predictive algorithms and body-worn video cameras, can also be used to fuel racial injustice and discriminatory policing.

The moves, which some condemned as merely a public relations stunt, did not satisfy many privacy campaigners, who are continuing to push for a permanent ban on the technology's use.

"There should be a nation-wide ban on government use of face surveillance," said the Electronic Frontier Foundation in a blog post. "Even if the technology were highly regulated, its use by the government would continue to exacerbate a policing crisis in this nation that disproportionately harms black Americans, immigrants, the unhoused, and other vulnerable populations."

The European Union's upcoming Conflict Minerals Regulation is designed to stem the flow of 3TG minerals (tin, tantalum, tungsten and gold) from conflict zones and other high-risk areas. However, upon closer inspection, Computer Weekly found a number of loopholes in the new rules that mean multinational technology companies, which rely on these vital natural resources for their products and components, are not covered.

For example, the technology companies will not be obliged to monitor, track or otherwise act to remove the minerals from their global supply chains; a number of minerals key to the tech industry, such as cobalt and lithium, are ignored by the regulation; and companies will not even be penalised if found to be in breach of the rules.

As is the case with previous regulatory or legislative attempts to deal with conflict minerals, the regulation will also do very little for those living and working on the ground in mineral-rich conflict zones such as the Democratic Republic of Congo.

Those Computer Weekly spoke to instead suggested moving away from voluntary corporate governance and social responsibility models to focus on increasing the productive capacity of those living in conflict zones, so they can develop their own solutions to what are essentially deeply political conflicts.

In early March, it came to light that the Home Office and the Metropolitan Police Service were collaborating with UK universities on a live facial recognition (LFR) project, known as "face matching for automatic identity retrieval, recognition, verification and management", or FACER2VM, which could identify people wearing masks or other face coverings.

According to information listed by UK Research and Innovation, the project coordinators expected their research to have a substantial impact.

"The societal impact is anticipated to be multifaceted," it said. "Unconstrained face biometrics capability will significantly contribute to the government's security agenda in the framework of smart cities and national security. It can effectively facilitate access to public services."

While reports by other media outlets focused on FACER2VM's connection to Jiangnan University, which sparked fears that the project could enhance the Chinese government's ability to identify both masked protesters in Hong Kong and Uighur Muslims in Xinjiang, the use of this technology by UK police or security services is also worrying: LFR has already been used against protesters in south Wales, and officers across Britain now routinely film gatherings and demonstrations.

In mid-April, shortly after official lockdowns went into effect around the world, online retail giant Amazon, which has done very well financially during the pandemic, was hit by a wave of strikes across its European and North American warehouses as frontline logistics workers protested against unsafe working conditions and corporate inaction.

While the striking workers complained about a lack of protective latex gloves and hand sanitiser, overcrowding during shifts and high barriers to quarantine pay, the initial wave kicked off in Spain and Italy when Amazon refused to shut down facilities despite learning that a number of workers had contracted the coronavirus.

Following a similar pattern to their European counterparts, workers in the US began taking strike action after Amazon decided to keep warehouses open.

A number of Amazon employees have since been fired for either taking part in the strikes or showing public support for those who did (allegations that Amazon continues to contest).

After reporting on the initial wave of Amazon strikes, Computer Weekly got in touch with Christian Smalls, a process assistant at Amazon's Staten Island warehouse in New York, who was the first person fired for speaking out about the alleged state of its warehouses during the pandemic.

The termination of Smalls' employment remains a contentious issue, with both parties giving different versions of events.

Smalls told Computer Weekly he was just the first in a growing line of people allegedly fired by Amazon for speaking out or protesting about Covid-related issues, despite Amazon's claims that the employees were dismissed for violating various guidelines or internal policies.

This includes the firing of user experience designers Emily Cunningham and Maren Costa, organisers in the Amazon Employees for Climate Justice (AECJ) campaign group, who publicly denounced Amazon's treatment of employees such as Smalls.

It also includes Minnesota warehouse worker Bashir Mohamed, who had been advocating for better working conditions and pushing for more rigorous cleaning measures.

In May, Computer Weekly interviewed Shoshana Zuboff, author of The age of surveillance capitalism: the fight for a human future at the new frontier of power (2019), to discuss how the practice of surveillance capitalism is intersecting with the Covid-19 coronavirus pandemic and public health crisis.

As part of a growing body of work that seeks to analyse and explain the increasingly pivotal role of information and data in our economic, social and political lives, alongside texts such as Safiya Noble's Algorithms of oppression and McKenzie Wark's Capital is dead: is this something worse?, The age of surveillance capitalism argues that human experience (our experience) is captured in data, which is then repackaged into what Zuboff calls "prediction products".

These are then sold in "behavioural futures markets", making us and our experiences the raw material of products that are traded between companies in closed business-to-business markets.

Zuboff told Computer Weekly that the current health crisis presents a massive opportunity for surveillance capitalism, adding: "While it is a crisis for all of us, it is something like business as usual for surveillance capitalists, in the sense that it is an opportunity to, possibly, significantly enhance their behavioural data supply chains."

She concluded that the fight against surveillance capitalism is a problem of collective action: "We need new social movements, we need new forms of social solidarity. Lawmakers need to feel our pressure at their backs."

Although awareness of algorithms and their potential for discrimination has increased significantly over the past five years, Gemma Galdon Clavell, director of Barcelona-based algorithmic auditing consultancy Eticas, told Computer Weekly that too many in the tech sector still wrongly see technology as socially and politically neutral, creating major problems in how algorithms are developed and deployed.

On top of this, Galdon Clavell said most organisations deploying algorithms have very little awareness or understanding of how to address the challenges of bias, even if they do recognise it as a problem in the first place.

She further noted that while companies regularly submit to, and publish the results of, independent financial audits, there is no widespread equivalent for algorithms.
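To make the comparison with financial audits concrete, even a very basic algorithmic audit can start from simple statistical checks on a system's outputs. The sketch below is a minimal illustration in Python, not Eticas's actual methodology; the loan-approval scenario, function names and data are assumptions made purely for the example. It computes one common fairness measure, the demographic parity gap between groups.

```python
# Minimal sketch of one check an algorithmic audit might run:
# comparing a system's positive-outcome rate across demographic groups
# (the "demographic parity" gap). All names, data and the loan-approval
# framing are illustrative assumptions, not any vendor's method.

from collections import defaultdict

def positive_rate_by_group(predictions, groups):
    """Return the share of positive decisions for each demographic group."""
    totals = defaultdict(int)
    positives = defaultdict(int)
    for pred, group in zip(predictions, groups):
        totals[group] += 1
        positives[group] += int(pred)
    return {g: positives[g] / totals[g] for g in totals}

def demographic_parity_gap(predictions, groups):
    """Difference between the highest and lowest group-level positive rates."""
    rates = positive_rate_by_group(predictions, groups)
    return max(rates.values()) - min(rates.values())

# Example: loan-approval decisions (1 = approved) for two groups, A and B.
preds  = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
groups = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]

gap = demographic_parity_gap(preds, groups)
print(f"Demographic parity gap: {gap:.2f}")  # 0.60 - 0.40 = 0.20
```

A fuller audit would go much further, looking at error rates per group, the provenance of training data and how a system is actually used, but even a check like this makes potential bias measurable rather than anecdotal.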

"We need to change how we do technology," she said. "I think the whole technological debate has been so geared by the Silicon Valley idea of 'move fast, break things' that when you break our fundamental rights, it doesn't really matter."

"We need to start seeing technology as something that helps us solve problems. Right now, technology is like a hammer always looking for nails: let's look for problems that could be solved with blockchain, let's look for problems that we can solve with AI. Actually, no: what problem do you have? And let's look at the technologies that could help you solve that problem. But that's a completely different way of thinking about technology than what we've done in the past 20 years."

In a landmark decision, the Court of Appeal ruled in August that South Wales Police's (SWP) facial recognition deployments breached human rights and data protection laws.

The decision was made on the grounds that SWP's use of the technology was not in accordance with citizens' Article 8 privacy rights; that it did not conduct an appropriate data protection impact assessment; and that it did not comply with its public sector equality duty to consider how its policies and practices could be discriminatory.

However, speaking to Computer Weekly at the time, Matrix Chambers barrister Tim James-Matthews said the problem the Court of Appeal ultimately found was an absence of regulation around how the technology was deployed, as opposed to anything particular in the technology itself.

He added: "What they said was that, essentially, South Wales Police hadn't done the work of identifying and determining whether or not there were equalities implications in using the technology, and how they might guard against or protect from those."

In the US, following a 16-month investigation into the competitive practices of Amazon, Apple, Facebook and Google, the Democratic majority of the House Judiciary Subcommittee on Antitrust, Commercial and Administrative Law published a report detailing their recommendations on how antitrust laws and enforcement can be changed to address the rise and abuse of market power in the digital economy.

They found that although the four corporations differed in important ways, the investigation into their business practices revealed common problems.

"First, each platform now serves as a gatekeeper over a key channel of distribution," the report said. "By controlling access to markets, these giants can pick winners and losers throughout our economy. They not only wield tremendous power, but they also abuse it by charging exorbitant fees, imposing oppressive contract terms, and extracting valuable data from the people and businesses that rely on them."

This echoed the opening remarks made by David Cicilline, chairman of the antitrust subcommittee, during its questioning of the CEOs of Facebook, Amazon, Apple and Google in July.

The report suggested imposing structural separations and line-of-business restrictions on the companies, which would, respectively, prohibit a dominant intermediary from operating in markets that place the intermediary in competition with the firms dependent on its infrastructure, and generally limit the markets in which a dominant firm can engage.

At the tail end of 2019, Computer Weekly reported on a landmark legal case launched against five of the world's largest multinational technology companies, which were accused by the families of dead or maimed child cobalt miners of knowingly benefiting from human rights abuses in the Democratic Republic of Congo (DRC).

The lawsuit against Alphabet, Apple, Dell, Microsoft and Tesla marked the first legal challenge of its kind against technology companies, many of which rely on cobalt in their supply chains to power products such as electric cars, smartphones and laptops.

In August, the companies filed a joint motion to dismiss the case, largely on the grounds that they did not have the requisite knowledge of the abuses at the specific mining sites mentioned.

However, in the latest round of legal filings, the Congolese victims maintained that the companies had specific knowledge of horrific conditions facing child miners in DRC cobalt mines from a number of sources. Computer Weekly will continue to monitor the case.
