Stephanie Hare: Technology is Not Neutral

Every new technology that improves our lives also has the potential for unintended consequences. These issues can pose very real and growing ethical problems for all of us. For example, automated facial recognition can make life easier and safer for us – but also poses huge issues with regard to privacy, ownership of data and even identity theft.

How do we develop strategies that maximise the benefits and minimise the harms? Stephanie draws on her experience as a technologist, political risk analyst and historian to discuss the ideas in her new book, ‘Technology Is Not Neutral – A Short Guide to Technology Ethics’, offering a practical approach to technology ethics for anyone creating, using or investing in technology.

Want more of these insightful talks?

At BoS we run events and publish highly-valued content for anyone building, running, or scaling a SaaS or software business.

Sign up for a weekly dose of the latest actionable and useful content.


Unsubscribe any time. We will never sell your email address. It is yours.

Transcript

Stephanie Hare
This is like a weird sort of kindergarten show and tell. So this is the book, ‘Technology Is Not Neutral – A Short Guide to Technology Ethics’, and it has a little devil on the front cover. But it has an angel on the back. And when you open it, they're actually looking at each other.

I wanted the book to be a tool, and so we designed it that way. There's a whole cast of characters who helped me get this book out of my head and into the world, and the artist is Noma Bar, who does amazing covers for The Economist and The New Yorker and Margaret Atwood, and is fantastic. He really got it; he read the book. And I was like, you know, I'm really nervous about putting a book out with a devil on the cover. I didn't imagine putting Satan as part of my brand, and I don't want to demonise tech or tech workers, because I think that happens a lot in the media anyway, and as someone who works in tech, I don't want to be demonised. But I wanted to play with that: how do you appeal to, you know, our better selves, our better angels? But also, how do we stay realistic and acknowledge that tension between good and bad? And in the book I broke it down into just a few things. It's a short guide to technology ethics, not a comprehensive guide, so I'm sure there are all sorts of people who'll think, why didn't you cover X, Y and Zed? It's because I wanted it to be readable during the pandemic; I think most of us have pretty high demands on our time.

So it's super short; it's like a minimum viable product. If you want to clue in to technology ethics and learn what it's about, why you should care, and how it relates to business or your work, this is the quickest, easiest start. And then it's a gateway: there's a full bibliography on my website if you really want to go in depth, and it's like 174 pages long. It's actually a deeply researched book, but it's written in a way that I hope anyone could understand, from a curious, bright high school student, to somebody just out of high school, at university or starting out in their career, to, frankly, people who are really deep into their career and need to start reading something fresh. Because I'm 45 now; I definitely don't know everything at my advanced age, and you sometimes have to just start at the beginning with something. So the beginning that I start with is just a question: is technology neutral?

And I set it out as a debate. So there's team 'technology is neutral' and team 'technology is not neutral', and on those teams are some of the top thinkers and doers working in technology today. They are very strong in how they take their positions. And the goal in setting it out as a debate isn't to get you to agree with me; my view is kind of irrelevant. It's more to just lay the debate out there. And by hearing all these different people, by the end of it, hopefully you will have gone through the exercise and the process and come up with your own view. Because you'll be like, oh, I agree with that, not quite with that, and you'll figure out your own position, if you will.

And then we move into the second chapter, which is like: okay, we've had this lovely philosophical debate, is it neutral or not? Well, now what? What are we supposed to do with that? And that, again, is because I've worked in teams with technologists and engineers for a really long time; I also worked in management consultancy, and I've had to advise governments and companies in my work as a political risk analyst. It's all very well and good having debates as an academic, but if you're in the doing space, you eventually have to make a call. Do we fund this or not? Do we build this or not? Do we sell to these people or not? How do we prevent this from being misused? What needs to be on our minds in terms of the lawyers and the due diligence and compliance side? All of that was really in my mind, so that chapter is called 'Where do we draw the line?'. Once you've got your ethics position articulated for yourself, it's like: well, what are you going to do about it? And that can feel like starting an essay with a blank piece of paper, which maybe you relish, or maybe you're terrified. So again, I wanted to make it really user friendly and say: let's actually look at people who've had to do this before. What can we learn from our friends in the hard sciences? The chapter goes back to the Manhattan Project and takes us right up to the present day, looking at how physicists and mathematicians and chemists and biologists looked at the question of their work being weaponised, for the Second World War and the Vietnam War. And how does this work with atomic policy today? I was literally looking at Twitter before we came on, and Chernobyl is trending, which is never a good sign; the atomic issue is alive and well. And then we're obviously talking about it with regard to artificial intelligence, which some people think is like fire or electricity, it's so transformative, and other people are like: no, it's just statistics on steroids, chill out.

So, for those of us thinking about technology today, I wanted to say: we don't have to start from scratch here. There's actually an entire scholarship. There are scientists who've had to figure out ethical positions before, and social scientists who've had to figure it out before. How do we weave that into today's conversation in a way that is actionable, so that anyone can pick this book up, read it, and start using it? So there you go. And then it's followed by two case studies, which are like a demo: once you've got the methodology down, let's apply it. First to facial recognition technology, which is, you know, increasingly impossible to ignore around the world. And then I also looked at digital health tools in the pandemic, because where I live, in the United Kingdom and particularly England, we really tried to do something quite big with that. So we have many learnings to share with the rest of the world about what we got right and what we got wrong, which I don't consider failure; I consider that learning. And it's really important that we learn from it, because we have to decide, for our pandemic preparedness toolbox going forward: do we want these digital tools in our weaponry or not? So the book gives you really practical examples. If you read it, you'll be armed with examples to apply in your own work, or just in debates, or to read newspapers and listen to the media more critically. And hopefully you'll feel a little bit less like: oh God, I'm facing the blank page. I want you to feel like: no, I've totally got something here that can at least get me started.

Mark Littlewood
Interesting. Now I've got some questions, but I would love to hear people's initial thoughts about whether technology is neutral or not neutral. So maybe we should try another little chat storm: get ready to type either 'neutral' or 'not neutral' into the chat. It shouldn't take more than about 10 seconds.

It's fascinating. Okay. Well, I may come back to you on that, because someone has said 'tech is neutral, but directed by people that aren't good'. It probably says a lot about some of the people here that you see it as not being neutral. Just talk a little bit about some of the principal protagonists in the neutral corner, because maybe that's where we should be.


Stephanie Hare
What did I say? I'm teasing. I was actually looking it over, because I was terrified about failing the exam on my own book, which is a freaking nightmare scenario!

So, for example, the big thing with the 'technology is neutral' team is this: they think that technology is neutral, and it's how we humans use it that determines whether it's good or bad. So Garry Kasparov, chess grandmaster, author of Deep Thinking, is really interesting on this. He posted on Twitter in 2019: tech is agnostic, it amplifies us; ethical AI is like ethical electricity. He was using the example that the physics of nuclear technology is neutral, and therefore it's how we use it: you could use it to make a nuclear power plant, and that could be part of your green tech revolution, or you could use it to make a weapon and wipe out lots of people and wreak mass devastation. So that's how they're thinking about it, and I've got a lot of examples here. I'll give just one more, because the facial recognition case study chapter is so big that I teed it up with a bit of foreshadowing: Amazon Vice President and Chief Technology Officer Werner Vogels.

He said that society bears responsibility for the use of facial recognition technology; it's not his decision to make. Amazon sells Rekognition, this tool that they've made, to police forces around the world. He goes:

this technology is being used for good in many places. It's in society's direction to actually decide which technology is applicable, and under which conditions. Machine learning and AI are like steel mills: sometimes steel is used to make incubators for babies, but sometimes steel is used to make guns.

And you can imagine, I've given lots of examples in the book, but you can see, across all the different technologies and tools that we all use, different people coming up with different answers. Social media is another great example, obviously. We've got Adam Mosseri, the head of Instagram, saying after the January 6 insurrection in the United States last year: we're not neutral, no platform is neutral; we all have values, and those values influence the decisions we make.

But then in August, only seven or eight months later, he goes: no, no, technology isn't good or bad, it just is; social media has amplified good things like #MeToo or Black Lives Matter, and bad things like disinformation. So you can see that even some of the people leading some of the most influential tech companies in the world are playing with this. It's just fascinating to watch. Whereas the 'technology is not neutral' camp was really interesting, because I have to say I was probably more in the 'tech is neutral' camp when I started this book in 2019. I changed my own mind by doing the research and writing, which is always tricky.

The people in the 'not neutral' camp were more like Professor Kate Crawford, who gave an absolutely amazing presentation at the Royal Society in 2018 (you can watch it online on YouTube). It was all about not being 'just an engineer'; it's about AI and power. When you're creating datasets, or choosing to use a dataset or not, when you're creating an algorithm, whether you audit that algorithm or not, you are making design choices at every stage. And there's the question of who is on the receiving end of a technology versus who is wielding it. I play with that a little bit in the book: lots of workplace surveillance technology has sprung up during the pandemic in particular (it was there before, but it's gone really big now), because lots of employers want to monitor their workers. Well, that's always top down. It's never workers monitoring execs. You can imagine how different the Enron corporate scandal would have been if we had been monitoring our executives and senior management. It's always top down. So she really articulated power flows in a way that I thought was cool. And then I also looked a lot at designers. Art is a big part of my process.

As a researcher and writer, I often think in pictures before I can write, weirdly. And I love talking to graphic designers and artists about tech and how they would take on the challenge. You can take anything, from a pen, to a tube of lipstick, to a phone, and have a fascinating conversation with them about the values embedded in the materials they're using. Is it designed for a default male hand, which is bigger than the average woman's hand and causes women lots of problems? And that's a great case where it's not about minorities being affected by a technology; women are actually the majority. But tech isn't designed for women, it's designed for men, which Caroline Criado Perez articulated so brilliantly in her book Invisible Women, which I see someone's just popped into the chat. Superb, you're absolutely right.

So, looking at design: aesthetics is a branch of philosophy. And this is a book about technology ethics, but it actually goes through the six main branches of philosophy, because you can't really talk about ethics without at least being aware of metaphysics, epistemology, logic, aesthetics and political philosophy; then your take on ethics becomes turbocharged. That's what I really wanted in the book, for people who haven't studied philosophy. And it's very easy not to, depending on what country's education system you're in: you can easily go through life and be highly educated and never knowingly take a philosophy class. You're really missing out, because it's a fascinating discipline and a thinking methodology that anyone can use. So I take everybody through a very quick tutorial on the basics of philosophy and how each branch applies to technology, and then it's like: go apply.

Mark Littlewood
Interesting. One of the most powerful quotes for me in that debate... there are a lot of technology people who are very much in the neutral corner, and I find it interesting that a lot of those big tech leaders, everyone at Google, are on that side. The person that really resonated for me is a guy famously described in a BBC interview as a web developer, a chap called Sir Tim Berners-Lee. And I suppose, technically, he is a web developer. But his quote, I think, is really fascinating:

as we're designing the system, we are designing society.

And that, for me, almost stops the conversation; I think it very much depends on your philosophical perspective. And I like this concept that you talk about in the book, of whether technology is maximising the benefit and minimising the harm for people. You can turn around in any sense and say: yes, we are maximising the benefit, minimising the harm. But let's say you're maximising the benefit and doing good for 80% of the world's population; the remaining 20% of nearly eight billion people is over one and a half billion people that you are potentially harming. And there's this whole concept of doing things at scale, which is very much the orthodoxy for growing and scaling tech businesses; investors love tech businesses that can scale, and the more people you can reach, the better. But that quite often means that you leave people behind.

So when we were talking last week, which feels like a lifetime ago, and in many ways it was, we were talking about some of the things that were going on in tech companies and the ways different tech companies were approaching the war in Ukraine. Quite a lot has changed in the last week in terms of what different tech companies can do, and I know you're very tuned in to what's going on over there, and constantly on Twitter; you're @harebrain. Just give us a quick summary of some of the things that are going on. And then I think it'd be really interesting to turn this back on some of the people that are here and ask them what they would do.

Stephanie Hare
Yes. So it's a really fascinating ethical dilemma in so many ways, and not just for tech companies. So maybe we broaden the lens, and then we'll narrow the focus down onto tech companies. All sorts of companies have to take an ethical position on all sorts of things. Companies that often love to pretend they don't have a political position on anything will still have a mission statement, which is an articulation of their values. And they put that in their annual report, and they discuss it with shareholders, and they hire armies of PR people to pump it out, particularly any time they find themselves in hot water.

So many, many years ago, during the Arab uprisings, I started working on something with the academic Professor Timothy Fort called 'corporate foreign policy', because we were really interested in how technology companies in particular were responding to the Arab uprisings. And it was led by Google first: you know, do you pull your people out? It's easier if you're a tech company like Google to pull your people out than, say, France Telecom, which had physical infrastructure and many, many employees in Egypt at the time. So again, it's that thing we've seen with the pandemic: same planet, different worlds; we're all in the same storm, but we're in different boats. So I was really intrigued by that, and started looking at the companies that would come out and take very strong political or ethical positions on democracy, or on whether or not they were aligned with the US State Department's goals. This changed after the Snowden revelations, when you saw that lots of US technology companies either had willingly been participating in American foreign policy goals, or the NSA had simply hacked in and was hoovering up their data.

So I then wrote a Harvard Business Review case study on that called 'For Your Eyes Only: US Technology Companies, Sovereign States, and the Battle over Data Protection'. And I focused on the US because, again, I'm from the US originally, and that was just my interest as a researcher at the time. So here we are now, and we have a really interesting case. Russia has, I think, more nukes than any country in the world; that's one of the factors in terms of what we can do. We have Ukraine, which is not in NATO and not in the European Union, and which already had Crimea annexed back in 2014. We've had Russia invading other countries and doing all sorts of things, for instance in Georgia. And then you've got the NATO countries, particularly the Baltics, which are looking eastward very nervously, and then looking westward to their NATO allies going: remember Article Five, an attack on one is an attack on all. Every company is therefore having to look at that political situation. I think it was Trotsky who said, 'you may not be interested in war, but war is interested in you'. To my horror, as someone who trained as a historian, I hadn't actually come across that quote until after I wrote this book, in which I say: you might not be interested in technology, but technology is interested in you. So I was inadvertently citing Trotsky. But it's relevant here, because you may just be McDonald's, and your job is to make a tonne of money selling burgers and fries, and now you are suddenly getting nailed on Twitter and you're the subject of articles in The Washington Post. Do you stay, or do you go?

Well, again, McDonald's has lots of physical shops all over Russia, and it has staff it has to decide what to do with. Would leaving hurt their people, or would staying hurt their people? Danone, the food company, and Uniqlo, the Japanese clothing company, have said they're staying, because they think their ethical responsibility, and their business responsibility, is to continue feeding and clothing people in Russia. Whereas others are taking a stand of: we are pulling out. And some companies will only take action when, for instance, a government imposes sanctions. So suddenly the United States says you can't trade with Iran: if you're American, in any way, shape or form, you just can't be dealing with Iran. Then the companies will tack to that political and international law line, because they could get sued, or it could affect their business.

Other companies don't wait for the government to act; they will pull out on their own, again based on their risk profile and how easy it is for them to do that or not. So what we've been seeing, literally just in the past 10 days or so (it's insane how fast this is going; I've almost lost track of my calendar), is not just technology companies, which we'll come to in a moment, but all companies that are in Russia or in Ukraine, or even just in Western Europe frankly, taking a look. We're also seeing countries like India, China and the United Arab Emirates getting critiqued a bit, certainly in the Western press, for having abstained in the UN Security Council vote, sponsored by the United States, to condemn the invasion.

What makes the technology companies perhaps unique is, first of all, that Ukraine's digital minister, Fedorov, is using Twitter in a really interesting way to lobby the tech companies directly, saying: you need to pull out, you need to cut off their access to Microsoft Office, you need to do all sorts of things. And he's interesting, because he was doing a little bit of the corporate foreign policy stuff even before the war; he had been building those relationships in Silicon Valley, as many government ministers do, and he had a lot of those contacts. And he's 31; he's young, he's online, he's engaged. We're already seeing that the Ukrainian government, not just President Zelensky but Fedorov and a few others, are very savvy in how they're using social media to get their story out: posting video to show that they're there every day, that they haven't fled or been hurt, that they're still alive, that they're staying in Kyiv. They're doing all of that, and they're putting pressure on these tech companies to pull out or to get involved. So Microsoft hasn't just taken a stand ethically; they're also involved in threat intelligence, because there's obviously the whole risk of cyber warfare, which is potentially something we'll be hearing more and more about, though less so at the moment.


So the tech companies have a role to play as well, because they're among the top 10 companies in the world by market capitalisation, and everything they do is discussed. Elon Musk was sending Starlink terminals to Fedorov after his request, and they were having the chats openly on Twitter, which, as a researcher of political risk and tech, had me going: what am I even seeing right now? Elon Musk is not speaking on behalf of any government; he's speaking on behalf of Elon Musk. And yet he's now a player in this, right? And frankly, is that really any different from any CEO? The reason I'm playing with this a bit (and you can see I like to play in the book a lot with 'I can argue this, but also the opposite') is that we just saw Disney's CEO, quite recently, as in the past 48 to 72 hours. They're not showing any movies and television and stuff in Russia now, so there was an ethical call there. But in terms of domestic politics within the US, which is obviously a very fraught environment politically for companies now too, he was like: we don't want to take stands on all of the different things that come up around, pick your social justice issue; we think we can best articulate our values by being silent, not putting out statements, particularly from the CEO, because nobody wants to hear from CEOs (he said; I'm paraphrasing), and you can judge our values through our work output. So I will leave it to you whether you think that is a valid position to take, or whether silence is actually a political statement; the whole thing of 'all that is necessary for evil to flourish is for good men to do nothing'. If you say nothing, are you actually complicit? Or is it only ever by taking an active role?

I guess it depends on whether you think silence, or doing nothing, is actually active; particularly if you have the capability of acting but choose not to, versus being silent because you actually can't act, which is different from choosing not to. The reason that ethics both fascinates me and infuriates me is that it's maddening, and I say this in the book: you have to draw the line every time. Each generation is having to draw the line differently. So my Generation Z nephews, talking to my baby boomer parents, have some very different views; and my siblings and I, as Generation X (our youngest is on the Generation X/millennial cusp), sometimes, not always, fall differently, I think, based on our worldviews. And obviously that changes depending on your gender, or your race, or your positionality in all areas. I don't want to go too far into that, but it can be influenced by where you stand in the world personally. And therefore, as a company, you can imagine it's really tricky. You as a CEO might have a view, or as a board you might have a view. But if you're a global company, you might have 500,000 employees based all over the world. And you might have a fiduciary responsibility as well, which has to take into account how your actions will affect your share price, or the third party contracts that you have. All of that has to come into your decision about where you fall, where you draw the line, on any number of issues. So what do you do? It's a mess, and it's exhausting.

That's why I'm talking to all of you! I've spent years during a pandemic reading and writing by myself, locked in this office. I'm dying to hear what you all think.

Audience Member
I was quite surprised, actually, because your voice was very familiar: you were on Patrick's podcast this week. My son does a podcast called Ethics and AI, so it was bizarre that you're actually here; he said yours was his favourite interview. But I had two questions. One: ethics is a classic business problem, so how do you see it as different in the technology space? And two: how do you deal with cases which are not so black and white?

There are really easy ethical questions, like: are we in favour of people starving in the third world? But then you get into what you might term culture issues. For example, if you go to America, you have vaccine versus anti-vaccine, choice versus abortion. How can you deal with those sorts of grey areas? If those two questions make sense to you.

Stephanie Hare
They absolutely make sense to me. And I'm laughing because I'm wearing grey; I've noticed I've been wearing grey a lot, so I'm living in the grey area. There's actually a section in the book called 'The grey space', and on the technology side it features an interview between Jeremy Paxman and David Bowie, where they're trying to discuss: is the internet something new, or is it just a different delivery mechanism for content? That's sort of the Jeremy Paxman view. And David Bowie is like: no, it's totally different, it's going to change everything. It's amazing to watch this interview, which again you can find on YouTube; I think it's from 1999. Bowie intuitively grasped what Paxman, who I think is probably a more linear thinker, didn't. I have a lot of sympathy with Paxman, because I think most of us probably didn't get it right away; we're not all David Bowie. And that's what it's like when you're in the grey area: things are not black and white, and you're dealing with something whose effects you possibly haven't grasped yet. I don't know how much we teach strategic thinking now, or interdisciplinary thinking, both of which help you make those calls.

So, you raised the issue of abortion, which in the United States is a particularly contentious topic. I can't believe we're still talking about it, but for my entire life it has been a lightning rod issue. And it differs not just at the federal level: there's the federal position, then the 50 states, and then positions within the 50 states. It's just this battleground; it's exhausting, and it takes up so much time. And of course it's related to all sorts of other issues about how the United States feels about women and personal freedoms, but also public health. Because it isn't just women in general; it's which women, women in which areas, women in which communities. If a woman wants to be able to guide her reproductive future, she will have different options depending on how much money she has, for instance. And that's just in one country, before you take it globally. So again, if you're a global company, it's not just your position on abortion within the United States; how are you dealing with the public health and values of your workers, which may conflict and not be in alignment, all over the world?

So my view in writing this book isn't to say: this is right, this is what is ethical, and this is what is not. That would be an absolute failure of a book, I reckon. It's more to tee up ways of thinking it through, which different people will answer differently based on, again, their position, their constraints, their goals. Maybe you're fine with being unethical, because your goal is just to make a tonne of money. Maybe your goal is social change, and profit is not what's motivating you. It's just going to depend. So yes, you're right to highlight that it is rarely black and white, although I do raise this point: it's okay to agree to disagree a lot of the time, but what do you do when agreeing to disagree isn't possible, because the other person's view is that you deserve to be wiped out? Again, take us back to Ukraine right now. What do you do when one party is saying 'we're going to de-nazify Ukraine', and the Ukrainians are like: what the hell are you talking about? You can't just say, okay, let's all have our own point of view. That's really hard. Now, luckily, most of the time we don't have to take it to the mat like that. But I think that's what's challenging so many companies; you're suddenly seeing Coca-Cola being shamed into pulling out because it was one of the laggards on that. Whose side do you want to be on? And how do you want to be judged retrospectively by history? That, I think, is a very clarifying way to think about issues that don't feel black and white. If my kids knew about this, or my grandkids knew what I did here, if it were the subject of a Financial Times Big Read investigation into how I'm leading this company, or what my team's response was, or the product that I created: am I proud? I cite Joseph Weizenbaum in the book. Are you proud of what you do?

Audience Member
How do you avoid getting trapped in the language? For example, the charity Reprieve kicks back strongly on 'using drones to kill terrorists'; their argument is that drones are killing unarmed civilians, and how you phrase that sentence affects your ethical view. And there was a striking case with Priti Patel: the Telegraph reported that 57 illegal migrants drowned in the Channel, while The Guardian reported that 57 innocent people drowned in the Channel. So how do you escape that? Because that language, to some extent, drives your ethical view as to whether you're right to kill people or not. I'm curious about that.

Stephanie Hare
I think you've hit the nail on the head. Language is a political act, and it's an ethical act, in terms of how you phrase things: the terrorist versus the freedom fighter is the classic one, but yes, illegal immigrant versus innocent person too. It's awful when you really dive into ethics, because, to go back to the Sir Tim Berners-Lee quote that Mark cited, which is in the book: as you build your system, as you design your system, every component of your life, of your society, is a design choice.

And when you flip that switch in your head, you're like: what I'm eating, what I'm wearing, where I shop, who I support with my money and who I pull my funds back from, how I vote, my views on immigration, my views on defence spending versus pandemic spending, whether I wear a mask or not. Every single thing becomes an ethical decision if you take the view that none of this is neutral. And language is a big part of that. As a writer I'm super sensitive to it, because a) I am human, so I'm going to get it wrong nine times out of ten, even with the best will in the world; you screw up, and you hope to God that your editors catch things, or that your readers will pull you up on something if you get it wrong. But you also b) have to be really mindful of your own biases, which again is really hard. It requires a lot of self reflection and honesty and humility, which is really difficult to do. So you need other people to constantly be checking you, and that's where having diversity and inclusion and equity embedded in your team, or in your peer group if you're self employed, really matters. If you're only talking to people who refer to immigrants as, you know, illegal, then over time you'll probably start drifting that way without even realising it. Whereas if you're forced to talk to other people, if you actually met some of the people who are immigrating to your country, found out their stories, and realised that they're human beings just like you, it's really hard to then demonise them with that language. And we come back to the angel and the devil: the power of language and words is massive.

So I think it's probably about being mindful and intentional, and, again, the humility of accepting that we're all probably going to get this wrong. So it's not 'oh my God, I mustn't ever get it wrong'. It's: I probably will get it wrong, so what's my plan for that? And how do I design things so I can catch getting it wrong as early as possible? Damage limitation, right? It's so hard being human.

Mark Littlewood
Tell me a little bit about your background, Stephanie, because you said that you started off from the perspective that technology is neutral. Just share a little bit about where you've come from on that journey. I'll name companies if you don't.

Stephanie Hare
We can name and shame them all, and they deserve it. You know, you don't get a quiet life if you write ethics stuff. So, the potted history, super fast:

Born and raised in the United States, in the wonderful state of Illinois, one of the most corrupt states in the country, but with some really good people also trying to fight that. So I grew up very interested in politics, with parents of whom one is very technical and one absolutely thinks technology is quite boring and is much more into politics and history and art and all of that. So it was a good mix, Republicans and Democrats in the family, so lots of debate, and debate was encouraged. I then went off to the University of Illinois at Urbana-Champaign, and I sort of joke about this in the book, but it really does have one of the best programmes for computer science and computer engineering in the States, and thus arguably in the world. And I was there just as the dot com boom was starting. So it would have been an amazing time to study tech, and I completely missed it, because I was studying French; I didn't give a damn, totally uninterested. But all of my friends were into it, so I was the lone humanities student in our group who would cross over West Green Street and go to the engineering quad. One of my friends had Marc Andreessen's old job; you know, the guy who built Mosaic there and then went on to become a Silicon Valley billionaire.


So I was constantly steeped, between my dad and my friends, in this world of engineers and technologists, people talking about data and cloud and client server, whatever. But I was studying history, which I loved, and languages; that was my jam. French was my first one, but I've studied six, and I play with them still to this day. It's how I relax; it's my version of Wordle.

So I did that, and then I moved to France and lived there for two years. Loved it. I was studying and trying to get my French as good as I could. Then I moved to the UK and did a master's in the theory and history of international relations, focusing on Cold War history, which has proved oddly useful in the past while. And then I went to work at Accenture; that was my first proper grownup job, in 2000. I was there for three years, and I did coding for two. I was not particularly gifted as a coder, but I enjoyed learning how to do it and demystifying it, and I really loved working with engineers and learning project management. But by the end of it I was more on the strategy side. I like talking to users; I absolutely love to sit down with a user and figure out why something's not working and what will make it wonderful for them and make them happy. I like tools that work. And I also love talking to, frankly, the people who have to solve the business problem, for whom technology is an instrument to solve a problem. That could be anyone from somebody running a system for a hospital to somebody saying: we need to get this product to market faster, how do we do it? So I loved that. But I had something that I loved more, which was French history. So after three years I left to do my PhD at the London School of Economics, back to the scene of the crime. And I did that looking at the career of a French war criminal who was still alive. I had actually been in France while he was on trial for collaborating with the Nazis in the Second World War, and he got let out of jail on health grounds at the age of about 93. I remember, in my last year at Accenture, thinking: somebody has to get this man's story before he dies. Because he wasn't just involved in the Holocaust; he was involved in the Algerian War, which was a colonial story, dark, grim stuff, and with no mitigating factor of a Nazi occupation. This was just a liberal democracy treating its citizens terribly on religious and ethnic grounds, and empire.

So I decided I would go and get his story, and luckily he agreed to be interviewed. So I interviewed him for three years, won a fellowship to Oxford, went to St Antony's College, and taught there for a bit as well. Then I got into political risk, because there was a company there where I was meeting lots of people who worked, just as the euro crisis was kicking off, and they needed somebody who understood economics and politics and had the languages to be their Western Europe senior analyst. So I suddenly slotted into this weird job, and it was like riding a bucking bronco for five years. I covered sovereign defaults, and obviously, being American, there was a lot of US-Western Europe political risk stuff. Planes being shot out of the sky here in Europe and having to cover that, the Syria war, Libya, the refugee crisis, all of it. It was an intense job. And it was while I was in that job that the Snowden revelations occurred. And I watched as, all of a sudden, it became very clear to people that technology was a factor in all sorts of things; that these companies were not just 'we're money factories, social media, search engines, hurrah'. It was: we're working with the US government to further foreign policy aims on behalf of this country. Which might be great; I don't know what your views are on that. But a lot of countries had a real problem with that: we're spying on our friends and our enemies.

And then obviously watching the 2016 US presidential election and the Brexit vote here, and the role of technology companies in that. I then left that job at Oxford Analytica to go work at Palantir, which is an interesting choice. But I did that in 2015, before Trump was elected. I didn't know then about Palantir's role with ICE, which actually started under Obama, became much more notorious under Trump, and continues to this day under Biden.

So that was a really good lesson for me, in real time: you might be working in the London office of a company that has global operations, and you might not even be aware of what your company is doing in another part of the world. You're focused over here on work that you're really proud of and that you think is important, and it's all going to blow up in your face anyway, and you as an employee will have to decide how you feel about that. I left that job, I think in the first quarter of 2017, to become the principal director for cybersecurity research at Accenture. And Accenture has also had its decisions to partner with the US government, and its own workers protesting over various things. Indeed, I don't know a technology company that frankly hasn't had its workers take a stance. And it's tricky, because I don't want to offend anybody here over how they voted or whatever. But you can look up the positions of Amazon, Microsoft, Google, Facebook, Palantir, Accenture, Deloitte, PwC, pick your company: workers were running tech worker disobedience movements, protesting working with the Trump administration or not, and now they're having a different dialogue about it under Biden. And that was a US discussion. It's broadened out now, I think, with the Russia-Ukraine invasion, to: oh my God, this isn't just about how these companies respond in a US political context, it's how do we respond globally. And ditto with China and the detention of over a million Uighur Muslims in concentration camps; a lot of these companies are having to make a decision about how they feel about human rights, and civil liberties, and privacy in China.

And those are just a few example countries; we could talk about this for hours, with all sorts of different countries and companies and who's doing what. And what do you do as a worker if you're like: whoa, I'm just coding here, or my job is just human resources, I don't even build stuff, I'm just helping to run it? I joke: 'Ma'am, this is a Wendy's'. And it's like: well, okay, but if you're helping to run it, and it is doing bad stuff in this part of the world, you're choosing that. Go get a job elsewhere, or protest, or whatever. It's, again, messy. And I have lived that messiness personally. I don't claim to know whether I'm the devil or the angel; I'm in the grey zone. You start out with good intentions, and you find yourself in situations you would never have thought.

Mark Littlewood
Okay, we are unfortunately running out of time. But rather than ask the audience if they have one last question, I'm going to ask a question of you: what's the question that you'd like to leave people with?

Stephanie Hare
Selfishly, I would love it if anybody has read the book, or reads it later, and gets in touch with me to say: this is where I want you to take this next. Because I think it's very easy to talk a lot about ethics, or form committees, or write academic papers that get published in journals. I really want to be in the trenches as much as possible with the people who are on the sharp end of those decisions: what do we build or not? What do we fund or not? How can I help? How can I be more useful? If I were to continue in this line of research, I feel like right now I need to hear from the people out there saying: you're barking up the wrong tree, look over here; or: this area, focus there and go harder. Selfishly, as a researcher, I would love that kind of feedback.

Mark Littlewood
All right. So yes, please do get in touch with Stephanie. We've shared some links to background about the book and various bits and pieces in the chat. Stephanie, thank you very, very much for joining us. It's a classic challenge, and it's really fascinating to see an issue like tech ethics evolving so quickly. I think someone mentioned that classic quote in the chat earlier on; someone will correct whatever mutation of it I come out with:

there are decades where nothing happens; and there are weeks where decades happen.

Yeah. So it's really interesting seeing how quickly things are going. Stephanie, thank you so much. @harebrain on Twitter, and we will speak soon.


Stephanie Hare

Author & Broadcaster, HareBrain

Stephanie is a researcher, broadcaster and author focused on technology, politics and history. Selected for the BBC Expert Women programme and the Foreign Policy Interrupted fellowship, she contributes frequently to radio and television and has published in the Financial Times, The Washington Post, the Guardian/Observer, the Harvard Business Review, and WIRED.

Previously she has worked at Accenture, Palantir, and Oxford Analytica and also held the Alistair Horne Visiting Fellowship at St Antony’s College, Oxford. She earned a PhD and MSc from the LSE and a BA from the University of Illinois at Urbana-Champaign, including a year at the Université de la Sorbonne (Paris IV).

