What the Tech Sector Could Learn from Anthropology

Author: Ingrid Brudvig, PhD candidate in Social Anthropology at the University of Cape Town. Her research explores the intersection of digital technologies, physical and virtual space, and belonging in the context of Somali migration in South Africa. She is particularly interested in the politicisation of mobility, and in how digital technologies and social media influence mobility, gender, identity and belonging in the digital age. An earlier version of this blog can be found on Ingrid’s Medium page.

Technology is a social tool: for it to be a driver of equality, its design must be informed by an understanding of social and cultural factors. Failing to incorporate an anthropological perspective into tech design, development and policy risks deepening the social inequalities driven by digital exclusion. It also makes it more likely that your product or service will fail.

The Trouble with Algorithms

First, internet and social media platforms are driven by particular algorithms, artificial intelligence and automated decision-making processes. These are created and managed by Silicon Valley corporations and a sector that is primarily male, white and elite. There is a widespread belief that algorithms make decision-making processes, and the information you access online, more objective, personalised and efficient, but this is not always the case. Internet systems, protocols and algorithms are designed by people who themselves hold social beliefs and biases, which often become unconsciously embedded in those systems. Algorithms are not inherently objective: they often discriminate based on how they learn about the world through existing data (read: big data), which is built primarily on male perspectives. This is what led Microsoft’s Twitter chatbot Tay to turn into a misogynist within days; it led Apple’s Siri to go blank when asked questions about sexual assault and domestic abuse; and it led Google’s online advertising system to show ads for high-paying jobs to men more frequently than to women. Large technology corporations play a tremendous role in facilitating people’s informational, communicative and associational lives, as well as in driving the political economy. These corporations, however, are coming under increasing criticism over the lack of transparency in their decision-making processes, their protection of online privacy, and their influence on the political economy (for example, the role of fake news in the 2016 US presidential election, and the rise of the platform economy driven by Uber and Airbnb).
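To make the mechanism concrete, here is a purely illustrative sketch in Python, invented for this post and not drawn from any real platform’s code: a targeting rule that simply optimises against skewed historical data will reproduce the skew, without anyone intending it.

```python
# Toy illustration (not any real system): a naive ad-targeting rule that
# "learns" only from historical click data. If the past data over-represents
# one group, the rule reproduces that imbalance automatically; no one has
# to intend the bias for it to be encoded.
from collections import Counter

# Hypothetical historical log of who clicked a high-paying job ad.
# The skew sits in the data itself, e.g. because the ad was mostly shown to men.
historical_clicks = ["man"] * 80 + ["woman"] * 20

def targeting_weights(click_log):
    """Allocate future ad impressions in proportion to past clicks."""
    counts = Counter(click_log)
    total = sum(counts.values())
    return {group: count / total for group, count in counts.items()}

print(targeting_weights(historical_clicks))
# {'man': 0.8, 'woman': 0.2} -- yesterday's imbalance becomes tomorrow's policy.
```

The point of the sketch is not the arithmetic but the feedback loop: a system described as “objective” is simply faithful to whatever its training data happens to contain.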

Your Second Self?

Tech company business models and profitability are sustained by the use of data footprints, which are used to categorise us into boxes of gender, race, income and interest groups. We are then targeted with ads telling us who we should be, how we should think and the choices we should make, fuelled by assumption and driven by bias. By categorising humans into fixed boxes and ‘indicators of being’, technology companies have created your “second self” out of your data. We understand very little about this system “despite the fact that we, as users, are providing most of the fuel, for free”, as Share Lab warns. This invites ethical concerns on several fronts: the use of personal data as currency in ‘mediascapes’, driven by the manipulation of consent; and the use of big data and artificial intelligence to fuel knowledge production with limited contextualisation. As Giulio Quaggiotto notes in his blog Big data meets thick data, “Without proper handling and contextualisation, big data risks becoming deep fried data.” Hence, “the era of big data needs even more qualitative, granular knowledge of local contexts.”
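A second toy sketch, again invented purely for illustration, shows what the “second self” looks like in practice: a data footprint collapsed into a handful of fixed boxes, with everything that does not fit the boxes simply dropping out of the picture.

```python
# Toy illustration of the "second self": a profile built by bucketing a data
# footprint into fixed categories. The category names and rules here are
# invented for illustration; real systems are far more elaborate, but the
# logic of reducing a person to a few boxes is the same.

def build_profile(footprint):
    """Collapse a browsing footprint into a small set of fixed 'indicator' boxes."""
    return {
        "parent": any(term in footprint for term in ("nappies", "school run")),
        "sports_fan": any(term in footprint for term in ("football", "cricket")),
        "income_bracket": "high" if "business class" in footprint else "standard",
    }

# The same person, represented only by what they happened to search for.
footprint = ["football", "nappies", "visa requirements", "money transfer fees"]
print(build_profile(footprint))
# {'parent': True, 'sports_fan': True, 'income_bracket': 'standard'}
# Whatever the boxes cannot hold disappears from the profile entirely.
```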

The “Tyranny of Data”

Information and communication technologies are driving the “modernisation” of governance systems (e-government), economies (the platform economy) and social norms (the attention economy), and they are doing so with considerable authority. Some have termed this the tyranny of data. The phrase builds on William Easterly’s argument that global development rests on a “tyranny of experts” whose technocratic approaches “reduce multidimensional social phenomena such as poverty or justice into a set of technical solutions that do not take into account either the political systems in which they operate or the rights of intended beneficiaries”. Many have called these new digital developments “the Fourth Industrial Revolution”, characteristic of modernity owing to their profound impact on the future of work, productivity, skills and education. The impact of this revolution, however, will be equally if not more profound on the language, beliefs, values and emotions of the future. The situation is eerily similar to anthropology’s crisis of representation in the wake of independence and postcolonial movements. Perhaps we are at the start of another such revolutionary moment, one driven by citizens’ demands for freedom of information, of association, of movement and of rights online, and for overcoming “the danger of a single story” that characterises data-driven indicators of belonging.

Anthropology’s Role

Anthropology offers important approaches through which the tech sector can better understand people, and the ethical and political contexts in which its work shapes people’s lives. Government and technology policymakers should work with anthropologists to better understand the social and cultural norms influencing internet access, particularly for underrepresented groups, in order to arrive at local, contextual and relevant solutions that drive digital equality. This requires time and money invested in long-term ethnographic studies, work with community leaders, and collaboration with people from all parts of society. Similarly, tech companies could give greater weight to anthropological studies of the political, economic and social impacts of their products, and of the ethics of the political-economic landscape and “social spaces” their products produce. This goes beyond the framework of user experience (UX) design, though it can incorporate UX into its methodological approaches and study aims. Anthropologists should also play a role in designing and facilitating ethics training and developing codes of conduct for the technology sector. Companies such as Facebook are attempting to remedy bias by providing unconscious bias training to employees; this process would benefit from an anthropological perspective. Digital connectivity and data mediate culture, systems and life today. Failing to take into account the importance of “small data” in a world of big data risks boxing people into categories of belonging that misrepresent their lives, hopes, fears and desires.

[Image sourced at Pixabay: https://pixabay.com/en/social-media-digitization-faces-2528410/]

