“Not only do we need engineers working alongside anthropologists to do good quality engineering, I also think that we need to do an anthropology of engineers… Engineers are making our world, right? And, the way that we, as engineers, think collectively, behave collectively, what we consider to be important… I think somebody should be watching that and reflecting on that and [relaying] that back to us, to society, to understand how the people who are making our world actually view the world.”
This is the eighth and FINAL episode in our STS podcast series. The aim of this series was to explore the intersection between science and anthropology, to better understand the contemporary issues that the amazing people featured in this series try to solve. We’d like to take this moment to thank everyone who has been a part of our STS series, as well as everyone who has listened along with us.
So in this episode, Ian chats with Professor Elanor Huntington, the first (and current) female Dean of Engineering and Computer Science at the Australian National University. While Elanor’s research has specialised in quantum optics (which, from my understanding, relates to the application of quantum mechanics to phenomena involving light), she is also looking to the future – a future of STEM that needs anthropologists. They talk about the problematic nature of describing human behaviour through numbers and algorithms, unpack what an anthropology of the internet would entail, discuss the importance of trust in scientific endeavours and the decline of the ‘expert’, and ponder what the future of engineering will look like, as well as what it means to take control of making your future.
QUOTES
“People try to invoke all sorts of really weird, mysterious things when they’re talking about quantum, because it is kind of a bit weird and it’s certainly outside of our personal experience – we don’t live in a quantum world. And so people try to make it sound more mysterious than it actually is.”
“We are very monocultural at the moment, so we are not I think building a world that reflects the true diversity of the folks around us. You know, engineering in Australia and engineering in America and engineering in the UK is too white, too male, and too urban.”
“How many of our politicians are slaves to their twitter feed? Our politicians’ reactions are being driven by software that was written by people on the other side of the planet, who had a very particular view about what they were building and a set of cultural and political and societal and technological assumptions that are just built into that software, and they’re unconscious. They didn’t know they were doing it, they didn’t realise that this is how it was going to go, this technology got released into the wild like cane toads, and we really need anthropologists watching this and starting to create a discourse and some active reflection about where this is all going, because I’m not sure that any of us are particularly happy with where it’s at right now. We need some people who understand people!”
Regarding Latour’s notion that facts are made objects, even though it’s not felt that way: “[Engineers] are taught the scientific method very early, it’s deeply ingrained in the way that we believe that we’re operating, and we do genuinely believe that we are … uncovering facts… but not necessarily constructing them.”
“If you look societally, one of the reasons that the discourse around innovation, around the fourth industrial revolution and all of that sort of stuff, one of the reasons that it’s falling a little bit flat at the moment is because it resonates with a small number of people who can actually see that this is achievable for them or for their kids. … And then on top of that, what you’ve got is a decline of the role of the expert in our society at the moment… because you’ve got access to 3D printers, you’ve got access to machining systems that are lightweight, small and you can go buy one and plug it into your wall outlet, most of them are driven by software that are reprogrammable, reconfigurable. If you’ve got access to that skillset, then actually you can make a whole bunch of pieces of technology these days and you can string all of that together. And so then the question is … who gets to learn how to write the code that actually reconfigures that? Who gets access to that technology? How do we know that they’re actually making…something that is useful, safe-“ “Or will they be printing guns?” “Exactly. And so, what is the role of the expert?”
LINKS AND CITATIONS
You can learn more about the 3AI Institute here: https://3ainstitute.cecs.anu.edu.au/
And listen to our podcast episode with Genevieve Bell here: https://thefamiliarstrange.com/2019/03/04/ep-32-genevieve-bell/
The book Elanor mentions reading as part of the 3AI reading list is “Beamtimes and Lifetimes: The World of High Energy Physicists” by Sharon Traweek.
https://www.amazon.com/Beamtimes-Lifetimes-World-Energy-Physicists/dp/0674063481
Ian jokingly refers to an article in The Onion called “‘What if we try this?’ asks robotics grad student about to eliminate 30% of workforce”, which you can read here if you’d like some satire in your life:
https://www.theonion.com/what-if-we-try-this-asks-robotics-grad-student-about-1819579651
For an explainer of the Trolley Problem, give How Stuff Works a peek:
https://people.howstuffworks.com/trolley-problem.htm
And if you haven’t already checked it out, head over to our Facebook group The Familiar Strange Chats. We’d like to keep our discussions going from this podcast episode, so let us know your thoughts: what was most interesting? What was most surprising? Did the episode remind you of something else you’ve read, seen, or heard lately – if so, what is it? Let’s keep talking strange, together.
Our Patreon can be found at https://www.patreon.com/thefamiliarstrange
This anthropology podcast is supported by the Australian Anthropological Society, the ANU’s College of Asia and the Pacific and College of Arts and Social Sciences, and the Australian Centre for the Public Awareness of Science, and is produced in collaboration with the American Anthropological Association.
Music by Pete Dabro: dabro1.bandcamp.com
Shownotes by Deanna Catto
Podcast edited by Ian Pollock
[Feature image ‘Science Background Fractal Physics Abstract’ from Max Pixel
https://www.maxpixel.net/Science-Background-Fractal-Physics-Abstract-1280081
Image of ribbons in Malta by Nick Fewings (2019) from Unsplash
https://unsplash.com/photos/Mh_ie8n84GI]
Ian:
Hey everyone. First off, we at The Familiar Strange want to acknowledge and celebrate the first Australians, on whose lands we're producing this podcast, and pay our respects to the elders of the Ngunnawal and Ngambri peoples, past, present and emerging. Let's go.
Ian:
Hello and welcome to The Familiar Strange, I'm Ian Pollock, your familiar stranger today. Welcome to the podcast. Brought to you with support from the Australian Anthropological Society, the Australian National University's College of Asia-Pacific and College of Arts and Social Sciences, and the Australian Centre for the Public Awareness of Science. Produced in collaboration with the American Anthropological Association and coming to you from my new home in Jakarta, the dynamic and rapidly sinking capital of Indonesia.
Ian:
Quick announcement, if you haven't joined The Familiar Strange Chats, our Facebook group, now is the time. We've started to do live events there once every two weeks where we'll read out the best comments and give prizes, $10 prizes for our favorites. Just search for The Familiar Strange Chats on Facebook.
Ian:
Now today I'm talking to Dr. Elanor Huntington, Dean of the College of Engineering and Computer Science at Australian National University. This is my third podcast talking to a non-anthropologist. I spoke to a historian in episode seven, an economist in episode 15, and Dr. Huntington's specialty is quantum optics. But she has her eye on the future of science and engineering, a future that really, really needs anthropologists. As she says, "Engineers don't understand people, they aren't trained to understand people." There's a tension between an engineer's view of people as just weird components in complex systems and a different view of how engineers would like to see themselves, where people are creative, self-directed, capable of making and remaking the world. Elanor is calling for a new discipline of engineering, focused on shaping the ways people outside the lab use technology and building what she calls trust at scale, which is a really complex idea and a fundamentally social idea, drawing on an ideology of science and power and articulating with politics, ethics and culture.
Ian:
During this interview I tried to keep drawing her attention to questions of power and politics. If I didn't ask the questions you would have asked, let me know in the comments at thefamiliarstrange.com. Or tweet at me @tfstweets.
Ian:
Elanor is reaching out here across the disciplinary divide between science and the humanities. So, listen critically and remember, we have to reach back too. Most anthropologists couldn't do the science she does and we can't demand that scientists all become anthropologists. Collaboration, cross-pollination and the push for an ethical and critical science and a rigorous and really useful anthropology. That's what's at stake here.
Ian:
Now in this episode we talk about building a new discipline of engineering with people at the center. We talk about the 3Ai Institute, a new program at ANU, and if you want to hear more on that, listen to episode 32, an interview with Genevieve Bell. And we talk about what it means to make our world.
Ian:
So, here it is, my conversation with Elanor Huntington.
E.Huntington:
So my particular area of research is quantum mechanics and quantum information and things like that, and people try to invoke all sorts of really weird, mysterious things when they're talking about quantum because it is kind of a bit weird and it's certainly outside of our personal experience. We don't live in a quantum world. And so people try to make it sound more mysterious than it actually is.
Ian:
Mysterious?
E.Huntington:
Yeah, so you know when people talk about quantum superposition or entanglement or something like that, they talk about the fact that changes in one particle somewhere in the universe can instantaneously cause changes in a particle somewhere else on the other side of the universe and that this just seems really weird and [crosstalk 00:03:55]
E.Huntington:
Yeah, all of that sort of stuff. And that is one interpretation of the mathematics, and people do it to make it sound more inspiring and mysterious. But there are really pragmatic limitations to what actually that means. People try to draw all sorts of really weird analogies. I once read a paper where somebody was talking about the behavior of nurses in hospitals and they were trying to say that it was quantum mechanical.
Ian:
What does that mean?
E.Huntington:
Exactly. My point exactly. But they just try to invoke all of these things in really weird and mysterious ways that are actually incorrect. And you've got to be an expert in order to understand, when you're trying to make simple examples, not simplistic examples, when you can actually make simplifications and when you can't. And that's a dangerous game to play unless you're an expert.
Ian:
Do you find that people often try to make kind of physical or quantum mechanical relations like that between something they see in the science and something they see in the social world? Like the way people behave in a hospital?
E.Huntington:
Oh look it's more than in the social world, they try to do it in all sorts of things.
E.Huntington:
So [crosstalk 00:04:53]
Ian:
Extend those metaphors out?
E.Huntington:
Everywhere, yeah, because it's kind of weird and magical and they think that that's a useful thing. And sometimes there are interesting things, so I recently was reading a whole book about quantum machine learning, which is kind of a new thing. And they were pointing out that physically quantum mechanical objects behave in particular ways, they follow a certain set of mathematical laws. One of those mathematical laws is that the order in which you do things is not reversible. So if you do A, then B, that's not the same as doing B, then A. And they were pointing out that in fact that might be an interesting thing to do when you're trying to numerically simulate the behavior of human beings. So they're not trying to say that the behavior of human beings is quantum mechanical, they're just saying that that might be interesting. Because it's well known in marketing theory for example that the order in which you send messages to people when you're trying to sell them things matters.
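A side note for readers, not from the episode: the "order matters" property Elanor describes, where doing A then B is not the same as doing B then A, is called non-commutativity, and it already shows up with ordinary matrices, which is how quantum operations are written down mathematically. A minimal sketch (the matrices chosen here are just two standard simple examples):

```python
import numpy as np

# Two simple 2x2 operations, written as matrices.
# X swaps a state's two components; Z flips the sign of the second one.
X = np.array([[0, 1],
              [1, 0]])
Z = np.array([[1, 0],
              [0, -1]])

# Matrix products act right-to-left on a state vector.
A_then_B = X @ Z  # apply Z first, then X
B_then_A = Z @ X  # apply X first, then Z

print(A_then_B)  # [[ 0 -1] [ 1  0]]
print(B_then_A)  # [[ 0  1] [-1  0]]
print(np.array_equal(A_then_B, B_then_A))  # False: order matters
```

The two results differ (here, by an overall sign), which is exactly the kind of order-dependence the quantum machine learning book was pointing to.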
Ian:
Okay, this is [crosstalk 00:05:43]
E.Huntington:
Kind of funky and weird.
Ian:
Yeah, it is.
E.Huntington:
Yeah.
Ian:
But the idea of numerically simulating the behavior of human beings-
E.Huntington:
Yeah.
Ian:
Again, coming from a pure qual background like I do, what does that mean? What does it assume?
E.Huntington:
It assumes first of all that you can actually write down mathematically the way that people behave.
Ian:
So like a discrete set of models and equations [crosstalk 00:05:59]
E.Huntington:
Yeah, yeah, exactly. Yeah, yeah.
Ian:
And all people everywhere?
E.Huntington:
A lot of the time they try to assume that you can model the way that people behave en masse, so the average behavior of people or the average behavior of people plus some noise around the edges and stuff like that.
Ian:
And what kind of behavior are we talking about?
E.Huntington:
Well so like for example, pretty much all of the algorithms in most social media are founded largely on marketing theory because the goal is to sell you stuff. And so the way that information now flows through most social media platforms is based on mathematical models optimizing them selling you stuff.
Ian:
Now you have talked in other places about the desire to sort of create a new discipline within engineering that includes anthropology and anthropologists. Clearly engineers and people who are writing algorithms, well that would be a kind of engineering wouldn't it?
E.Huntington:
Yeah.
Ian:
They are working on shaping, understanding, modeling, all kinds of human behavior. So where do anthropologists come in?
E.Huntington:
Well so I would argue that at the moment anthropologists don't come in enough and that is a really serious weakness. So what we've got going on at the moment is that we've got folks who are writing all sorts of algorithms that are based on machine learning, artificial intelligence, big data, trolling through massive data sets and connecting across data sets and all sorts of stuff. And for the most part, this is code that is being written by people who don't really understand how people operate, either as individuals or en masse. And their goal is essentially to find patterns in data and amplify and accentuate those patterns and they're just writing code and releasing it into the wild and-
Ian:
Into the wild.
E.Huntington:
Into the wild, so kind of like [crosstalk 00:07:30]
Ian:
So you have a sense of like the internet space as an ecosystem?
E.Huntington:
Yeah it is, and it's an ecosystem that is entirely manufactured and at the moment it's entirely manufactured by people who don't understand people particularly well. It's manufactured by people like me who are engineers and computer scientists.
Ian:
So and it sounds like there are two kinds of beings at least that are populating that ecosystem and one of them is people and one of them is algorithms?
E.Huntington:
Correct.
Ian:
What is an algorithm?
E.Huntington:
I guess the easiest way to describe what's going on there is that every time you open your Twitter feed or your Facebook page or anything like that there is software that presents your feed to you and it presents that data to you in your feed on the basis of what it remembers about what you're interested in, what you've looked at before, the things that your friends have looked at before, to say, “Okay, well so we see a pattern emerging here, so we're going to amplify that pattern.” And think about it, right? What goes into your feed and who's deciding that? And there's not really a person that's deciding that, there's a piece of computer code that's deciding that on the basis of what people like you have looked at previously.
Ian:
People like you?
E.Huntington:
That's right, it's looking for patterns, it's putting you into a pattern, and it's decided that you kind of fit into this pattern: other people like you have done things like that and looked at things like that, and so we're going to feed you this particular piece of information.
E.Huntington:
Folks these days have started talking about your social media bubble because that piece of software is looking across the whole of the Twitter-sphere and saying, “Okay, well you're talking about things that look like this, we think you will probably be interested in things that are similar to that.” And so what it does is it starts to connect people up who are talking about the same sorts of things and so you create this really narrow microcosm of people who are all essentially saying the same things and reinforcing each other and just talking to each other. And it actually becomes really difficult to get different voices and different pieces of information into that discourse because the algorithm just keeps pointing you towards people who are saying and looking at things that you have already looked at.
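To make the mechanism concrete for readers (a toy sketch, not any platform's actual algorithm; all the data and function names here are invented for illustration): a feed ranker that scores candidate posts by their overlap with topics the user has already engaged with will, by construction, keep surfacing more of the same, which is the bubble effect being described.

```python
# Toy feed ranker: score each candidate post by topic overlap with the
# user's engagement history, then surface the best matches first.
# Purely illustrative -- real platforms use far richer signals.

def rank_feed(history, candidates):
    """Order candidate posts by topic overlap with the user's history."""
    interests = set()
    for post in history:
        interests.update(post["topics"])
    # More overlap with past interests => higher in the feed.
    return sorted(
        candidates,
        key=lambda p: len(interests & set(p["topics"])),
        reverse=True,
    )

history = [
    {"id": 1, "topics": {"politics", "australia"}},
    {"id": 2, "topics": {"politics", "media"}},
]
candidates = [
    {"id": 3, "topics": {"gardening"}},              # a "different voice"
    {"id": 4, "topics": {"politics", "australia"}},  # more of the same
    {"id": 5, "topics": {"media"}},
]

ranked = rank_feed(history, candidates)
print([p["id"] for p in ranked])  # [4, 5, 3]
```

The "different voice" (post 3) lands at the bottom every time, and since whatever the user reads next feeds back into `history`, the interest set only narrows, which is the feedback loop the episode is pointing at.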
Ian:
So, at this point we have like a chain of events and a chain of sort of objects that are set up to present this outcome, right? You've got science and marketing research that suggests that an algorithm be built in a certain way, it gets built by a certain kind of person who imagines people fitting into certain kinds of patterns and boxes and then siphons them off in sometimes unexpected directions.
E.Huntington:
Yeah.
Ian:
Where do you see anthropologists-
E.Huntington:
Right.
Ian:
Intervening here?
E.Huntington:
Right. So to me one of the most important things here is that every now and then the folks who run those social media platforms, they do change their algorithms, right? They tweak them, and then what they do is they then just release those algorithms into the wild kind of like the way that Australia used to release cane toads into the wild.
E.Huntington:
And they've got no idea what actually is going to happen.
Ian:
For our non Australian listeners I just have to emphasize, cane toads were released on purpose to eat some kind of pests?
E.Huntington:
Prickly pear I think.
Ian:
And then they became some kind of horrible pest in their own right and it's a great ecological disaster that everybody regrets.
E.Huntington:
Indeed, yes, indeed.
Ian:
And now you're supposed to like gather cane toads and put them in your freezer.
E.Huntington:
Yes, and do other horrible things to them.
Ian:
As a public service.
E.Huntington:
Indeed, yes.
Ian:
So, the idea of releasing something and it being like a cane toad is really, really bad.
E.Huntington:
Indeed, yeah, and Australia actually interestingly has a cascading series of these things because we released one pest into the wild and then imported another one and released it into the wild to control the first pest and then when that went crazy we released another one into the wild to control that pest. And you can see that happening in most of the algorithms in a lot of the social media platforms right now. Now not all of them, of course, go disastrously wrong, and many of them do actually improve the situation. But after, for example, the Trump election, Facebook changed their algorithms to try to address the growing tide of concern that they were simply being manipulated by bad actors. But they released them and then everyone complained about those and they just resulted in different types of behavior collectively on that particular platform.
E.Huntington:
And if you think these days about what's going on, how many of our politicians are slaves to their Twitter feed? Our politicians' reactions are being driven by software that was written by people on the other side of the planet who had a very particular view about what they were building and a set of cultural and political and societal and technological assumptions that are just built into that software and they're unconscious, they didn't know that they were doing it. They didn't realize that this was how it's going to go. This technology got released into the wild like cane toads and we really need anthropologists watching this and starting to create a discourse and some active reflection about where this is all going. Because I'm not sure that any of us are particularly happy with where it's at right now.
E.Huntington:
We need some people who understand people, it's kind of that simple.
Ian:
So programmers and engineers feel like they don't understand people?
E.Huntington:
We're not generally trained and educated in those areas to understand people. And we need to much more effectively connect anthropologists to our engineers and computer scientists because the more that we just kind of experiment on the world live, that strikes me as being highly problematic. We need some folks who can actually do an anthropology of the internet and start to use that to help us be somewhat more informed about the way that we write our code and the way that we connect with each other these days because we're constructing a system where we've got computer code connecting these wet squidgy things called people together. And [crosstalk 00:12:30]
Ian:
That is an engineer's perspective [crosstalk 00:12:32]
E.Huntington:
That is indeed, yes, yes, yes.
E.Huntington:
And these wet squidgy things actually behave in particular ways and the way that you connect them is important and it changes the way they interact with each other. And it's actually very poor engineering practice to do something like that where you don't understand what the wet squidgy things are and how they operate.
Ian:
So you get anthropologists into engineering so that engineers better understand every component of the system that they're working in?
E.Huntington:
Correct, that is exactly right.
Ian:
Who's included in an anthropology of the internet? Where does it begin, where does it end?
E.Huntington:
Oh crikey.
E.Huntington:
Sorry, to our non Australian listeners that was a very Australian piece of phraseology right there. So, the most obvious one is we've been talking a lot about social media. So that is a place where two human beings are connected to each other anywhere around the world where they're connected entirely by a piece of software and the hardware at the end which is your phone or whatever it is. And it's that piece of software that actually mediates that connection. These days you are much more likely to be having a narrowcast conversation with a set of people who have exhibited interests that are very similar to you, anywhere in the world, you're much more likely to be talking to them than you are your neighbor three houses down in the same street who might have a different set of interests to you.
Ian:
Yes.
E.Huntington:
But there are other places where we actually very much need to start thinking about the way that people operate in other ways as well. So we're going through a wave of mass urbanization right around the globe at the moment, by 2040, six billion people are predicted to be living in an urban environment. And what that's going to mean is that the built form and the way that we structure the built form is going to very significantly guide the way that again we as individuals act and the way that we interact with each other. And if you then layer on top of that the fact that we're going to start to be dynamically changing that built form because it's going to be possible to update the way that the traffic light system works, it's going to be possible to dynamically update the way that our building operates, pedestrian flow, a whole bunch of things. We're going to be making a world that people actually want to live in.
Ian:
I want to continue on that-
E.Huntington:
Yeah.
Ian:
But first let's go back for a second. Who else is included in an anthropology of the internet? I mean does that include the engineers themselves?
E.Huntington:
Okay, well so that's an interesting one. I think that not only do we need anthropologists working beside engineers in order to do good quality engineering, I also think that we need to do an anthropology of engineers.
Ian:
So what does that mean?
E.Huntington:
I mean engineers are making our world, right? And the way that we as engineers think collectively, behave collectively, what we consider to be important, you know all of the cultural and social things that are about us as engineers. I think somebody should be watching that and reflecting on that and reflecting that back to us, to society, to understand how the people who are making our world actually view the world. Because I fear frankly that we as engineers are actually making a world that reflects what we think the world should be, and how we operate, and how we believe, and how we behave in our culture and we are very monocultural at the moment. So we're not I think building a world that reflects the true diversity of the folks around us.
E.Huntington:
You know engineering in Australia and engineering in America and engineering in the UK is too white, too male and too urban.
Ian:
I want to pick up on something you said just there, which is that engineers are making our world. Now engineers are not the only people who are making the world, politicians, especially in democratic countries, there are a lot of other different sort of institutions that are making the world as well. So where does engineering articulate with other institutional arrangements like that?
E.Huntington:
I guess what I would say is that technology evolves in complex interplay with our society and with our cultures and with our economy. And this doesn't happen in isolation. If you go looking through moments in time and history where there's been simultaneous economic, technological and societal disruption you can find them. And what you interestingly find is that if you then overlay the emergence of engineering disciplines, they do actually tend to emerge at the same time because they're about bringing technological trust at scale at a time when there is actually all of this simultaneous disruption going on. The timing of all of this depends on which part of the world you're operating in at any one time. And I guess I know it sounds kind of odd for an engineer to say this but one way of thinking about engineering is that it is about technological trust at scale. So it's much less about the technology and much more about the fact that it's not just about just making stuff, it's about making stuff in a way that operates with trust, at scale, in our society.
E.Huntington:
And the example that I often give is that most of us are comfortable to drive across the Sydney Harbour Bridge because we're comfortable that engineers designed it and designed it in such a way that we're not going to fall into the ocean.
Ian:
See I would take that a step further and say that we're comfortable that the government selected those engineers and continues to kind of assess the bridge and make sure that it's safe.
E.Huntington:
Correct.
Ian:
You know I would certainly believe in a world where engineers could design a brilliant bridge and then a corrupt government could construct it poorly.
E.Huntington:
Correct, yes.
Ian:
So there are additional layers of trust beyond just the trust in engineering or the trust in scientists themselves, is that right?
E.Huntington:
Oh absolutely. It's a very complex interplay and they do not evolve in isolation, and I guess that's my point. So take a very particular example around the first industrial revolution, the first thing that got invented was a steam engine, and then what they did was attach that to a variety of pieces of machinery. And basically we had just a bunch of enthusiastic people making stuff in the sort of metaphorical equivalent of their garage and just having a go at a bunch of things. And for example, they got attached to mining systems, they got attached to mill systems, they got attached to a whole bunch of things. And then what happened was, they were incredibly efficient, they drove the economy, but then people started getting damaged by it. And so it coincided with mass societal disruption, and you're right, that one of the things that then occurred as a consequence of that was that the politicians stepped in to start to try to stabilize that. So for example, a lot of child labor laws came about because kids were being sent in to pick cotton out of cotton mills and having their arms ripped off.
E.Huntington:
So these things evolve together and I completely agree that bringing trust at scale is more than just the engineers but there is a relationship there and they do evolve together.
Ian:
So as part of what's been going on in the school of engineering and computer science lately, has been bringing together the 3Ai-
E.Huntington:
Yeah.
Ian:
Institute.
Ian:
You said Genevieve Bell has given you a bit of a reading list.
E.Huntington:
Yes.
Ian:
It included Bourdieu, it included Judith Butler, and I wonder if you could tell us a bit about what's on that list and what from there has changed your view? What's been surprising to you?
E.Huntington:
So yes, it's been a pretty extensive reading list and it covered Bourdieu, it covered Butler, it covered Geertz, it covered Latour, so-
Ian:
Had you read any of this stuff before?
E.Huntington:
No.
Ian:
Not even Latour? Because Latour writes about science a lot?
E.Huntington:
Correct, yes. So you've got to remember my background is experimental quantum optics, so while I've done a lot of reading, so I never expected to become a dean, that was just not, I mean I didn't wake up one day thinking, “Oh, that's what I'm going to do.” So I've done a lot of reading, and a lot of reflecting and a lot of thinking about the way that people operate and the way that people operate at scale. But I've come at it from the perspective of a kind of reading list that you would get naturally as part of executive coaching and leadership development and that sort of thing. So not the fundamentals if you see what I mean, much more on the applied end. I will admit I had to dust off my dictionary and have it next to me, next to the books.
Ian:
I'm a PhD student and I still do that too, so.
E.Huntington:
Yeah, indeed, yeah. So, having the dictionary there to decode a bunch of words actually the concepts are genuinely interesting. And so it's been something of a liberal arts reading list as you might imagine, but no I'd never come across Latour. And in fact the very first book Genevieve gave me was a book by Sharon Traweek which was an ethnography of particle physicists. Which actually I will admit, particle physics, the training that people get in particle physics and the environment of particle physics is actually very similar to the world that I come from in a technical sense. And I will admit, I found it incredibly confronting to read that book and see what my world looked like from the outside. And it actually [crosstalk 00:20:26]
Ian:
What was confronting about that?
E.Huntington:
Well frankly it gave me PTSD.
Ian:
Wow.
E.Huntington:
Yeah, I mean junior people and students in the very, very experimental physics disciplines are treated as goods and chattels and they're traded on a market. That was my experience of being a junior person in physics all of those years ago. So that was actually both affirming and a little bit confronting, I will say.
E.Huntington:
So that reading journey has actually been really interesting because it's given me a framework to think about observations that I had made anyway. And so to put that into a theoretical structure I think is actually incredibly helpful. So, you know, really basic concepts like the concept of performing your identity and habitus and stuff like that, once you get your head around them, you just carry them around with you all of the time and it sort of embeds in the way that you think and do.
Ian:
And so as a dean, how has it affected the way that you're conducting business around the school?
E.Huntington:
It's certainly influenced the way that I've been thinking about what we need to do in order to achieve genuinely system wide and cultural change around the fact that, for example, we are monocultural in our engineering disciplines. And I don't just mean that in my part of my university, or my country, I just mean as a discipline we do tend to be monocultural. And that is largely a consequence of the fact that, particularly in our societies and our cultures, folks identify very early on as being either STEM or not STEM and then they start to perform that their entire lives.
Ian:
So STEM, that's science, technology, engineering, mathematics?
E.Huntington:
Indeed, yes.
Ian:
There's such a harsh divide it seems like in people's minds between STEM and the humanities.
E.Huntington:
Yeah, indeed, and it's entirely artificial and I think very counterproductive particularly given we as human beings are now deeply embedded into pretty much every engineered system that's out there. And so to make that divide is just bonkers, really bonkers. But I mean that is a sense of identity that particularly in Australia and North America and the UK is formed very early, and then it's performed and hardened and amplified as you go through your life. So to have that idea of performative identity presented to me from a theoretical perspective has actually been incredibly constructive in terms of thinking about how we might go about achieving cultural change. Again, writ large, as well as writ smaller, within my own school inside of the university.
E.Huntington:
Concepts like habitus of course are incredibly useful to think about, because of the gap between the way that people talk about how things are done, compared to the way that they actually do it, and then again compared to the way that it's written down.
Ian:
So you've noticed some gaps between those things?
E.Huntington:
Of course, yeah, yeah. I mean you know we are in the end a culture of people in the same way that everyone else is. And to see that reflected back and again to have anthropologists have that conversation I think is actually incredibly important.
Ian:
One of the kind of main elements of being a culture is to have an ideology or a set of values.
E.Huntington:
Yeah.
Ian:
And you've spoken a little bit about how engineers reproduce the world the way they believe it ought to be. Are you trying to bring more critical engagement to that ideology? I mean, what is in that ideology? [crosstalk 00:23:37] If Foucault is part of your reading, like where does power?
E.Huntington:
Yeah.
Ian:
Where does power fit into this?
E.Huntington:
Indeed, there's a bunch of stuff that sits inside of that. So have we brought much of that critical analysis to that? Not yet. There is a core ideology that interestingly is very different when you talk to academic engineers and computer scientists compared to professional engineers and computer scientists.
Ian:
Yeah.
E.Huntington:
Academic engineers and computer scientists are ideologically very close to, say, academic scientists and things like that. So a lot of the insights that come out of places like Latour and stuff like that actually apply very strongly to academic engineers and computer scientists. One of the most interesting insights that I got out of Latour was the concept that we actually construct facts, that they are made objects. And from the outside I can see how you can see that, but that is not our received wisdom. We are taught the scientific method very early, it's deeply ingrained in the way that we believe that we're operating. And we do genuinely believe that we are, if you like, uncovering facts perhaps. But not necessarily constructing them. And so I mean that's an interesting insight.
E.Huntington:
It's also true that one of the strongest dichotomies in an academic environment where we're training people who are going to go out into the world and be professionals is that there is an inherent tension between theory and practice. And that's actually what makes engineering powerful, it is a combination of theory and practice, so it's not all theory, it's also not all practice. It's a combination of both. But that tension plays out very strongly at the point of university, because what we have in universities is mostly academics who do the teaching, who these days really aren't practicing engineers as well. And so I feel like our lived experience of professional engineering is very limited. And so our capacity and our ability to actually educate people from a perspective of practical knowing and doing is actually really quite limited these days.
E.Huntington:
And that's an interesting tension that almost all universities face, and it's one of those places where I think we need to pay a lot more sophisticated and subtle attention.
Ian:
I'm trying to formulate the right question here, and it has to do with ideology and power.
E.Huntington:
Yeah.
Ian:
And we were talking earlier about the industrial revolution in the UK and how that was detrimental to some people, right? It completely transformed the way huge classes of people lived, where they lived, how they worked, and how their lives were assessed, how their lives were valued, how their lives were mapped, how their lives were counted. And I suppose what I'm looking for here is: where are the spaces where some people fall through? So I wanted to come back to this, but you spoke about urban environments, right?
E.Huntington:
Yeah.
Ian:
Urban environments are full of abandoned spaces.
E.Huntington:
Yeah.
Ian:
They're full of waste spaces.
E.Huntington:
Yeah.
Ian:
But those spaces are filled by people in various kinds of social relations. I mean whether it's sex workers, or addicts, or skateboarders.
E.Huntington:
Yeah.
Ian:
I was looking at the new light rail in Canberra, and it's just kilometers long, a straight concrete run. And as soon as they take those fences away, the skateboarders, the bikies, who's going to take advantage of that? Who's going to see that space and use it for something it was never intended for?
Ian:
And I wonder how much space there is in this new discipline of engineering really to take into account the powerless, the invisible. It goes beyond a certain kind of diversity in the lab itself.
E.Huntington:
Yeah, yeah, yeah.
Ian:
And to a kind of politics as to who is important and who matters.
E.Huntington:
Correct.
Ian:
And how do you then treat people who use those inventions in unintended ways?
E.Huntington:
Yeah. That is a really interesting train of thought. So I guess I'd suggest two things there. One is that if you look societally, one of the reasons that the discourse around innovation, around the fourth industrial revolution, all of that sort of stuff, is falling a little bit flat at the moment is because it resonates with a small number of people who can actually see that this is achievable for them or for their kids. But there's a vast majority of folks who are, you know, if you're a welder in an industrial pressing plant somewhere, and you're looking at your 12 year old kid and you're thinking, “Well, so, okay now I'm being told I need to make a start up? I mean really?”
Ian:
There was a great Onion headline recently which was, ‘Let Me Just Try This One Thing,’ Says Engineering Student About To Destroy A Quarter Of The Workforce.
E.Huntington:
Right, exactly.
E.Huntington:
And so it feels unattainable, and then on top of that, what you've also got is a decline of the role of the expert in our society at the moment. And on top of that, you've also got, to a large extent, actually a significant democratization of technology.
Ian:
A democratization of technology?
E.Huntington:
Yeah, so in the sense that as you just said, you know, “Let me just try this one thing,” says the engineer who sweeps away a quarter of the workforce. Because you've got access to 3D printers, you've got access to machining systems that are lightweight, small, you can go buy one and plug it into your wall outlet. Most of them are driven by software, they're reprogrammable, they're reconfigurable, so if you've got access to that skillset then actually you can make a whole bunch of pieces of technology these days and you can string all of that together. And so then the question is, well, who gets to learn how to write the code that actually reconfigures that? Who gets access to that technology? How do we know that they're actually making something that, as you say, I mean you've just given another example of releasing a cane toad into the wild. How do we know that what they're making is actually going to be something that is useful, safe-
Ian:
Or will they be printing guns?
E.Huntington:
Exactly.
E.Huntington:
And so what is the role of the expert in that world?
E.Huntington:
And one of the most interesting questions to me is, is that perhaps one of the places where we could address or start to think about a number of these things in a slightly different way? Which would be to say, okay, well so let's stop trying to contain and control, let's let that technology be out there and then the role of the people who bring technological trust at scale is to convene and curate the way that a whole bunch of other people engage with that technology rather than just being the person who gets to make it and control it themselves. Which is a very significant flip around compared to the way that industrial societies have previously worked where you've got to have access to the means of production in order to do that. But now the means of production is kind of everywhere and then the access is actually what you know about, how to operate it, rather than having sufficient money to buy the means of production yourself.
E.Huntington:
And it kind of changes things a little bit, not only will the experts need to be expert at the technology, but they're also going to need to be expert at engaging with the people who want to use the technology in a completely different way. And I think that actually means that we're talking about a totally different skillset, which is more than what frankly people from my world often talk about in a very dismissive way around soft skills and stuff like that. I mean [crosstalk 00:30:26]
Ian:
What do you mean? What is that?
E.Huntington:
Well so I mean we talk a lot about soft skills, and what most folks mean by that in my world at least is, you know, the ability to put on a suit, talk in complete sentences, lead a team, go pitch to a client, that sort of thing. And they're kind of just basic functioning skills to operate in the workforce these days. What I'm talking about are people who actually have cultural sensitivity, some actual education around what it means to pull people together to achieve something, some education around the fact that you're going into a different environment. We need a much more diverse group of people with a much more diverse set of experiences and interests in order to actually be able to achieve that successfully. And people with a whole bunch of different sets of aspirations. So people who are not just interested in building stuff for its own sake, but people who are interested in helping other people build stuff and bringing their expertise to that.
Ian:
Is Marx on your reading list?
E.Huntington:
Well not officially, but I mean the fact that I can talk about that sort of stuff, it's clear that it's there, and I know about it, but no, I haven't got there yet.
Ian:
When you think about that kind of vast democratization of technology-
E.Huntington:
Yeah.
Ian:
And kind of access to means of production as you put it.
E.Huntington:
Yeah.
Ian:
How do you feel about that? What kind of world is that making do you think?
E.Huntington:
A very different one. A very different one.
Ian:
Is it going to make the world a better place?
E.Huntington:
It's going to make the world a different place. And people often ask me, you know, do I have a utopian or dystopian view of where all of this is going, and my answer is, at the moment I have neither. What I'm interested in doing is doing what I can to actually get us to a place where we make a world that we want to live in. And at my heart I'm an engineer, I'm a very pragmatic person, and you know I'm going to try to have a red hot go at crafting interventions where we actually make the world we want.
Ian:
I was listening to something of yours on YouTube, preparing for this interview, and you said something about engineers being problem finders.
E.Huntington:
Yeah.
Ian:
As opposed to problem solvers, I was surprised by that formulation. What does that mean to be a problem finder?
E.Huntington:
That's an interesting one. So I came at this when I was trying to understand the nature of creativity. And in the 1970s a bunch of psychologists were interested in understanding the nature of creativity so what they did was take 30 fine art students and lock them in a small room with a pile of objects and said, “Make something.” And then they just watched them. And the students very quickly [crosstalk 00:32:49]
Ian:
Those are prime conditions for creativity [crosstalk 00:32:51]
E.Huntington:
Correct, yeah.
Ian:
Being watched by a bunch of researchers.
E.Huntington:
Indeed, yes. I don't think probes were involved, but. And the students very quickly just fell into two categories. There were those who took the objects and just made something, and they made it fairly decisively and incisively, and they called them problem solvers. Then there was another group that were much more like me, where they took the objects and sorted them into categories and then unsorted them and resorted them into different categories and made something and then didn't like it and pulled it apart and put it back together again and kind of just generally mucked around for ages and then finally just put something together very quickly. And then they got those pieces of art independently judged for creativity, and what they found was that the problem solvers, the first incisive and decisive group, were generally judged to have produced less creative work than the second group, who they called problem finders.
E.Huntington:
And I think that's actually one of the important ways of thinking about engineering these days, in the sense that if you tell an engineer to go build a bridge over there, then a problem solver will go, “Oh, okay,” and they'll figure out how to make a bridge over there. A problem finder will ask the question, “Do we even need the bridge? What is the problem you're trying to solve? And maybe just a better wifi connection would be the solution.”
Ian:
See this is very much how I think about anthropologists as well. People who are very good at critiquing, but not necessarily so good at finding solutions to the problems that they have identified.
E.Huntington:
Yeah.
Ian:
So a lot of my work has been in the development area, World Bank, stuff like that, and I came to feel that what anthropologists do in development is they tell you why nothing will ever work. Where is the productive collaboration?
E.Huntington:
Right, indeed.
Ian:
Between problem finders and problem solvers.
E.Huntington:
Right.
E.Huntington:
So I guess where I come at this is that, I'm going to circle back to the acronym STEM. So, the acronym STEM stands for science, technology, engineering and maths. And science and maths are motivated by an understanding that people just want to know. Technology is about making things, and engineering is about solving human problems by making stuff using your mastery of science and maths.
Ian:
That's a profoundly social conception of what an engineer is.
E.Huntington:
Correct, yes. And I guess my point here is that the motivations to be an engineer or do engineering are therefore vastly different to the motivations for science and maths or indeed different to being a technologist. And one of the key differences when you talk about the research and the practice of engineering versus say science and maths is that we want to know things for the purposes of crafting an intervention. And so in that sense, there is an interesting relationship, say for example, between anthropology and say engineering or computing in that if anthropology or if a particular subdiscipline of anthropology is motivated by understanding, that's not the same as then the follow on step which is to craft an intervention. And so one of the most lovable and simultaneously irritating things about engineers is that we'll always say, “Well so that's nice, but what can I do with it?” So insight for its own sake is not necessarily something that engineers are famous for being comfortable with. At some point engineers want to know for the purposes of crafting an intervention.
E.Huntington:
And so there's going to be an interesting dialogue between engineering and anthropology from the same perspective in the sense that if I get my way, engineers will work out that we need to know stuff from the world of anthropology. But we're going to always be asking the question, “Okay, so now what do I do with that insight?”
Ian:
Right, so questions of intervention are always really entangled with politics and with ethics as well. And ethics has been something of a crisis in the sort of tech community lately, hasn't it?
E.Huntington:
Indeed, and one of the interesting things there is you know I keep talking about engineering being technological trust at scale, and you keep reiterating and I completely agree with you that that is one element of a very, very much more complex interplay around as you say politics, power, economies, societies, cultures, all of that sort of stuff. And the same thing applies writ small as well. And to me actually one of the interesting things about the discussion about ethics particularly ethics in AI is that we're thinking about it in a very narrow way at the moment. It's really very transactional in the sense that it's just AI plus ethics.
Ian:
What do you mean transactional?
E.Huntington:
As computer scientists, computer engineers, and engineers, at the moment we're asking relatively simple questions around you know the trolley problem for example. So-
Ian:
That's about how a machine decides whether to kill one person or three [crosstalk 00:37:20]
E.Huntington:
Yeah, exactly.
Ian:
I'll put a link to it in the show notes.
E.Huntington:
Yeah, yeah.
E.Huntington:
And as engineers, we want to know well what's the answer?
Ian:
Right.
E.Huntington:
Right.
Ian:
And so how far does that get you?
E.Huntington:
Yeah, and not very is the short answer. And so there was a really interesting attempt, I think it was from MIT to crowdsource an answer to that. And so [crosstalk 00:37:42]
Ian:
Crowdsource an answer to it?
E.Huntington:
To the trolley problem, so they actually posed it I think it was on social media they just said, “Here's a hypothetical example, what would you do?”
Ian:
Twitter poll.
E.Huntington:
Yeah, and what they were interested in doing was, I think, understanding what the collection of people who were engaging with that particular outlet would actually treat as the trolley problem. Now the interesting thing there is that there are all sorts of cultural assumptions that get built into our computer code all of the time. So, most artificial intelligence and machine learning is based on probabilities and statistics and stuff like that, and at some point you actually have to make a decision. And the question is, where do you draw that line, and what do you even write down as the basis for the decision? What are the factors that you bring into play, and all of the rest of it? Engineers and computer scientists are not trained to understand how to pose the question about what are the factors that you think about, how do you optimize that, how do you make that decision. And so there are the beginnings of some very interesting conversations going on between philosophers, ethicists, machine learning people and all of that sort of thing.
E.Huntington:
But right now it's really quite narrowly focused on those sorts of things and it's a place to start but I think it's a much bigger issue than that.
Ian:
We're coming towards the end of our time and I guess I wanted to wrap up by just asking, what kind of impacts you'd hope to have on engineering as a field? You know comparing what it was like when you came in, what you've been realizing about it and what kind of engineering field you'd like to leave behind you? And how anthropology contributes to that.
E.Huntington:
Right.
E.Huntington:
So I'm motivated by making a difference and I want to change the world. And the way I think that we can change the world is by creating the kind of educational experience that means that we have a whole bunch of people out there in the world who are bringing technological trust at scale.
E.Huntington:
And I think there are kind of two ways that that needs to be done. The first is that pretty much all of the traditional engineering disciplines need to, I think, pull themselves up an extra layer of abstraction and start to realize that our world is just not a natural world anymore. It's a constructed environment, and what engineers are doing is putting together complicated heterogeneous systems of systems where there are lots of these wet squidgy things called people inside of all of them. And our actions and interactions are entirely driven by the things that we as engineers build, and we absolutely need to get much, much better at understanding that and building people into the center of everything that we do, and doing it incredibly precisely and consciously.
Ian:
So I feel like I'm hearing two sort of contradictory currents, two different parallel currents.
E.Huntington:
Yeah.
Ian:
In some of the things that you've said.
Ian:
One of them is the idea of humans as wet squidgy components in a system and the other one is this kind of democratization and an explosion of creativity and sort of heterogeneous purposes to which technology can be put including political purposes, ideological purposes.
Ian:
How do you kind of work with those two opposite views of what a person is?
E.Huntington:
But that's the thing, I think we need a whole bunch more people out there who are taking charge of their own environment. That's what I want, what I want is for people to have enough expertise of their own to be able to take charge of their own environment and actually not hand it to a very small number of people who are evil overlords basically.
Ian:
So not to be, because we're all embedded in these technological [crosstalk 00:41:03] structures and systems right?
E.Huntington:
Yeah, yeah.
Ian:
But to have agency within that as well?
E.Huntington:
Correct, that is absolutely right.
E.Huntington:
And so what that means is that not everybody who, say, walks through my doors, or ideally the doors of any other university in the world, and touches on some engineering education necessarily needs to go on and be the fully qualified, ticketed engineer. Because what we need is a mixture of those sorts of people, as well as people who can actually go out there, take agency and control of their own lives, and build a world around them that they want.
Ian:
So it sounds like ethical engineering, in a way, is about promoting and protecting the agency of people everywhere?
E.Huntington:
Correct, that is absolutely right.
E.Huntington:
And I think there's a lot of work to be done there just around the traditional engineering disciplines to get that better. The other thing that I really think we need to be doing is, I think we stand on the cusp of needing the next engineering discipline, and that's the one that explicitly brings in the idea that we are indeed creating systems that have people in the center of them, and that we need to be much more explicit about what it means to do that safely, bringing trust at scale, and explicitly understanding how all of that comes together. And that's the piece of work that the Autonomy, Agency and Assurance Institute is doing here at the ANU at the moment. And I'm just thrilled and delighted, they're really going great guns on that. I think they're going to change the world with it.
Ian:
Well on that I think we're going to have to wrap it up. I want to thank you so much for coming on the show.
E.Huntington:
My very great pleasure.
Ian:
It's been really great.
Ian:
That was it, me and Dr. Elanor Huntington. Today's episode was produced by me, Ian Pollock, with help from the other familiar strangers, Julia Brown, Alex D'Aloia, Jodie-Lee Trembath, Simon Theobald and Kylie Wong Dolan.
Ian:
Our executive producers are Deanna Catto and Matthew Phung.
Ian:
Subscribe to The Familiar Strange podcast, you can find us on iTunes, Spotify and all the other familiar places. And don't forget to leave us a rating or a review. It helps people find the show and helps us make the show better. And if you'd like to support us please check out our Patreon page, patreon.com/thefamiliarstrange, not The Strange Familiars which is another fun podcast, just not ours.
Ian:
You can find the show notes, including a list of all the books and papers mentioned today, plus our blog about anthropology's role in the world at thefamiliarstrange.com.
Ian:
If you want to contribute to the blog, or have anything to say to me or the other hosts of this program, email us at submissions@thefamiliarstrange.com, tweet @tfstweets, or look us up on Facebook and Instagram. Music by Pete Dabro, special thanks to Nick Farley, Will Grant, Martin Pierce and Maud Rowe. Thanks for listening. See you in two weeks, and until next time, keep talking strange.