Morality and Ethics - Caring is Everything

Presented by Sam Warner at You Got This London 2019

Video hosting kindly provided by Mux - video for developers

Transcript

These transcripts were captured live by a captioner. As such, there may be small errors. If you spot any, please feel free to submit a pull request with amendments.

Hi, everyone. I'm Sam. I'm going to be talking today, if I can find the right buttons to get to the next slide - not that one, not that one - that one!

I'm going to be talking about ethics in software engineering, and how caring is everything when you actually start to make software. So, that's me. That's the only photo in here that is not free and open source but owned by my mum! [Laughter].

I'm Sam, I'm a consultant for Black Pepper Software. If any of the AND Digital people are around, I would say that we also have souls! That's a good start. I have a real passion for ethics, especially in software development. That's the introduction out of the way.

So, here are some things that I'm going to try and cover, and hopefully I will have ticked all the boxes by the end. If I haven't, come up to me afterwards and talk to me about these things, because I would love to talk about them.

We're going to talk about what ethical software might mean, because it might have a few different definitions to different people; I will define it my way. Methods to prevent what I call ethical debt - a bit like tech debt, except it's where you've overlooked the implications of what you've made and you have to go back and fix it. But do you? And a review of processes and roles in software development.

I'm willing to guess that a lot of people here develop software in a similar way to the way I do. I'm an Agile software developer - any other Agile people here? We'll look at roles that you might be able to include in your software development cycle. And I'll signpost other places to learn more about ethics, because I'm by no means a canonical source; it would be nice to know where to go after I'm gone to help you learn more.

On to the actual talk. So, while I can't speak for everyone in the field, personally, I like to think of myself as a problem-solver: someone who can apply algorithms, design patterns, some kind of process to a problem that's been presented to me and get to an answer - most of the time.

Sometimes, I need some help. That's fine. I just want an answer to the question I've been posed. Cool. When I can't get one, that is scary. That's the point where I go and ask other people and say, "Please help me." Most of the time they do. Other times, at least I feel slightly better about myself for having asked. That's fine. But when it gets to people, it's a lot less clear what the answer actually is. When you're dealing with fleshy human things, it goes from scary to terrifying. A sea of questions with no clear answer is so demotivating to me. I don't like working in that kind of environment.

But ethics in software development is something we're just going to have to dive into. Spoiler alert: I'm not going to give you any answers, because I don't have any. But we will try and do the best that we can. Ethical tech, ethical software - whatever it is, I've used those words a lot already.

I'm going to define it this way: at the very least, if we are setting the bar really, really low, ethical software is software that doesn't make the world any worse of a place than it already is today. Who here thinks they can manage that? Yeah! No? But if we're not setting the bar really, really low, if we want to go out there and do something good, then let's say we are striving for social good: we want to help the most people in the biggest way we can. That would be really good. A couple more definitions - these ones from the slide.

Ethics - I can blaze over these - is a set of moral principles that govern the way a person might behave or conduct an activity. Morality: the way that someone might work out right from wrong. Moral judgment: the evaluations you make to work out whether something is right or wrong, or good or bad. And good and bad is the sad part of that definition. I think good and bad are really subjective, and that could partly be to blame for this sea of grey across the board when it comes to ethics and technology.

Good and bad are maybe fuzzy terms, familiar from fairy tales from when I was, like, 15, and they're not concepts with hard borders. In those books, you have the evil witch, or the evil stepmother, or, I don't know, Prince Charming, and it's very clear in those books that you have good people and bad people. When you're dealing with huge numbers of people, and something as pervasive as technology, it's no longer that clear.

So how do we fix this problem of good and bad not really having anything to measure them against? Well, I like metrics. Who else likes metrics? Software people again, yeah? Yay, metrics! Measuring things is fun. And in ethics, I would call on "effective altruism". Anyone heard of that term before? Woo, that's good. So, I'm going to cite somebody else's talk - I only have so long, so a whistle-stop tour of everything is the best I can do.

There was an amazing talk in London last year where the speaker spoke about how we can use effective altruism in software development: QALYs - quality-adjusted life years - which weigh how much someone's life gets extended against the cost of improving that person's life, and how you can help the most people in the biggest way, or at least work out some way of creating a metric for it. It's a bit utilitarian, but it can be useful. When you're talking about people, it can feel impersonal to put numbers on them - it doesn't seem like ethics to put numbers on people - but I can kind of see why it might be a helpful way to work, and that's what I would like to do.
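As a rough illustration of that metric - this is the standard QALY formulation with hypothetical numbers, not figures from the talk:

$$\text{QALYs gained} = \text{years of life added} \times \text{quality-of-life weight}, \qquad \text{cost per QALY} = \frac{\text{cost of intervention}}{\text{QALYs gained}}$$

So an intervention costing £10,000 that adds five years of life at a quality weight of 0.8 yields 4 QALYs, or £2,500 per QALY; the effective altruism argument is to put effort where the cost per QALY is lowest.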

So, it does end up begging the question: what are we worth as people? I'm not going to get into that, because that's a whole other mess of stuff. Now we know what we're talking about and how we might measure it, we're going to try to understand why it's still a problem. Has anyone heard this phrase before - has anyone heard this said at any point, ever? "Software development is immature." I'm getting a few nods. For me, this is, well, I think it's undeniable to some extent, and I don't think it's a bad thing. Being immature is something that everyone at some point in their life has to go through: it's how you get from not existing to becoming mature. You go through this middle stage of immaturity. When you compare computer science or software development to medicine, we have 60 years and they have several hundred. And that's 60 years from when every university owned just one computer.

How many of those years have been everyone walking around with a phone in their pocket or a computer strapped to their wrist, like Brave New World? It's not a bad thing; really, we should not be ashamed of being immature, as long as we mature in a mature way. As long as we mature quickly! Some people seem to take this as a horrible insult. I hope I'm not upsetting anyone by saying it, but if I am, come and talk to me afterwards. [Laughter].

But I think it really manifests itself in a lot of things we see regularly: people jumping on whatever bandwagons are the shiniest, whichever technology is newest, and crazy solutions with maddening levels of complexity afterwards. Sometimes we will pick a card out, and it will just say "do the thing". We're immature. We're not defining our stories correctly.

Not through lack of trying, but sometimes the process doesn't line up, and we need to refine it. Far too frequently, we see people turn a blind eye to the repercussions of what they're actually going to make. It's not all terrible, though: we are still interested in making technology ethically, and especially in recent years it's become a hot topic - you're all sat here listening, not many people on their phones, that kind of thing.

You know what I mean - we're doing an okay job! I'm going to give three examples of places where ethical technology has become a big thing, to highlight that it's not just me banging on about it. There are other people as well.

Last year, ... - they're an online website, doing newsletters, things like that, and they hold conferences - had a whole-day track on tech ethics. All the videos are available to watch on the internet. If it's a choice between binge-watching two more episodes of Friends, like I do all the time, or watching a talk about ethics, go and watch the talk about ethics.

Another ethical technology event I was lucky enough to be part of was with MLH. They're on ... somewhere. I went to one of their events at the Vatican called VHacks. It was fantastic: a three-day event where you had to try and make something for social good, so you were developing with "how do I help people?" in the back of your mind instead of "how do I drive profit?". It was a nice change. It's amazing - you get 120 sweaty teenagers, 20-year-olds, whatever, in a room somewhere, and they actually come up with software that works, or half works, because it was a hackathon after all.

All of this points to the fact that, as professionals and even as hobbyists, we are interested in the ethical implications of what we make. We want to be proud of what we make. We want to make a difference. By being sat here today, I think you are all on the right track to go and do that. This is something that you can bring to your employer, to your side projects, to your friends, your colleagues. You can talk about it. You can discuss it. You can try and put yourself in a better position to do nicer things for larger numbers of people. That's all it boils down to.

I've got a question now - I'll ask you some questions. We already heard about Stack Overflow earlier, right? Every year, they hold a Developer Insight Survey, which is all about the state of software development at that point in time. Last year, they dedicated a whole section to ethics and technology. Fantastic. Stack Overflow does have some - what's the word? - issues with its demographic, so ... [Laughter].

So, these results are not, in my opinion, 100 per cent accurate - take the numbers with a pinch of salt. But given the 60,000 people that clicked on the link and filled it out: what percentage of developers said they consider the ethical implications of what they code? Give me a number. [Random shouts of numbers from the audience]. Please, have some faith in your fellow developers.

AUDIENCE: 90. 30.

SAM: The truth is somewhere closer to 90, actually. That's a big number: 79.6 per cent of the people that filled out that survey believe they have an obligation to consider the ethical implications of what they code. That to me is amazing. I think that's so, so cool. However, that is just ticking a box. How many of us believe that 80 per cent of people actually do it? Me neither. But that's what the numbers said, so let's try and hold people to their word: if you tick that box, please do the thing. Question number two is very similar, but this time: what percentage of developers believe the ultimate blame for putting unethical technology out into the wild lies with them?

AUDIENCE: 3, 8, 7. 20.

SAM: We were close with 20: 19.7 per cent of respondents believe that the ultimate responsibility for creating unethical technology lies with them. I'm not sure if I'm happy or sad with that number. What's the mood like in the room?

AUDIENCE: Sad.

SAM: There were other people you could blame on the form: management was about half of it, and I think product owners were about a third. Really, though, we want to empower our developers so that, even if it isn't true, we feel like the responsibility is ours. That would be nice, in my opinion. I would love to believe that I have the power to stand up and say: actually, I have some obligation here. I should be in control of this.

Do any of you know about Google's Project Maven last year? This was Google with - oh, I'm getting some faces in the front row! Google had a contract with the US - is it the government or military? Military - developing some artificial intelligence, machine-learning software, whatever, that was going to do some wonderful thing for the world, I'm sure. The employees at Google weren't happy about developing this software. Several just upped sticks and left, walked out the door. A lot of other people signed an internal petition saying "we don't want to do this", and it stopped. Developers had the power to stop a contract between Google and the US military. That's crazy, right? That's kind of impressive. Do we feel a little bit more empowered now we know that? Maybe? Right. Right. [Laughter].

That's fine! I want to live in a world where developers have a voice, where developers feel like they're in control. With something as universal as software, we should feel like we can do something positive in the world, or at least stop something negative happening. So, I've given some examples and talked about what it is - what do we do about it now? I've got three principles that I hold in the back of my head, which I think help guide me onto a path where I can be more aware of what I'm doing and try to do things right. Number one: people come first.

A people-first approach to making technology is probably the most important thing that you can be aware of. In recent years, we've looked towards technology to enhance our humanity, and to carry on doing that, we need to make software for people, not for profit - or not just for profit. If we talk to our users, consider the impact of our ideas, and understand who our products are made for and what human problems we're trying to solve, then we can reprogramme the values of software engineering. Starting with studies of the ethical impact of the product we want to create, we can continually make projects which make a difference beyond the interests of shareholders. And we should make technology to create special and useful services for everyone. We have these great gifts in mobile technology: mobile phones and internet-connected watches.

If we reduce the costs and energy consumption of software, we might be able to provide a wider pool of access to potentially life-changing technology, which is amazing. Who here thinks they make life-changing technology? I asked this question at a different conference and I also got no one. Believe in yourselves! You do something cool! Really, we have to understand that it's not just multi-billion-dollar corporations that are capable of considering humans and making software accessible. Everyone can do it. There are courses you can do, online training. I would really like to see small networks of people who talk about this subject build up so that, together, we can make a larger network - a global network that we can start to funnel some of our problems into and get results out of. That would be really cool.

And then number three: more diversity, to help us understand a wider range of ethical factors. To achieve any of this, we have to understand technology across nationality, gender, ethnicity, and so on. Some of those factors are only being recognised and remedied now. This isn't the end of the battle. Better technologies come from being diverse.

By developing, managing, and marketing in diverse teams, we're more likely to realise the ethical issues and implications of what we are doing, and improve the likelihood of them being flagged internally before they get flagged on the front page of the BBC. I know which one I would rather happen to me. Let's get real: we all make mistakes. Who has made a mistake in the last week? Yes? Well done to everyone that put your hand up - the rest of you aren't trying hard enough! It's not something that you can control, and you should not try to control it.

Sometimes when we make a mistake, we don't realise we've done it until it's far too late. Don't beat yourself up about making a mistake; try and go and fix it. I think when Mark Zuckerberg made Facebook in his dorm room - is that what they call them in America? - I don't think it was meant to be the data-crunching monster we know it as today. He started out with an idea that he wanted to pursue, and it got a bit out of hand. To be honest, that's the most damning thing that anyone can say about it. Why do we make mistakes? How do we mitigate against them? There are too many reasons that we make mistakes to list them all. It can be that you're having a bad day. It could be anything. We all make mistakes, basically, because we are human. How do we get better? That's a harder question. It's not just books - please don't think that you can solve human problems by diving into a pile of books, because I don't believe in that myself. Education certainly is key, but there are millions of sources you can get it from.

I've got a slide here that will signpost you to some sources, to help you learn more. You can reflect on past instances of unethical technology. You can watch videos, like I said earlier. Google any of this - it brings up some great stuff. Discuss these ideas: this isn't going to get solved by you going home and thinking about it on your own. Everyone needs to try to be on the same page. We all need to try and get better.

Check out the Ethical OS toolkit. That's a fairly new addition to this talk, so I'm going to run through a couple of slides about it. The toolkit is this amazing tool that can help us be aware of what we are doing. It's developed by the Institute for the Future and - I always fail to say this name correctly - the Omidyar Network, and it's one of my favourite tools for raising awareness in technology. It's a huge PDF, so don't try to sit down and read it in one go; I broke it up into morning coffee sessions. It highlights eight risk zones: areas where you might be exploiting your users, whether on purpose or accidentally - hopefully it's not on purpose - so you can learn about them, go away, and remedy them. That would be really nice. We've got a few of them up there.

Addiction and the dopamine economy, machine ethics, data privacy, things like that. It gives you some scenarios - some are a little bit scary, a little bit, what's the word, anxiety-inducing - but it gives you an idea of future tech and how it might be used negatively, and I like to try and turn that around into how you could take the same technology and use it positively. That's what I try to do with a lot of these. And it gives you a checklist. The checklist is really good - I print it off for every project I do. You tick off the bits where you think you comply with what the toolkit considers ethical software.

What have we got? Truth, disinformation and propaganda. What type of data do you expect to share, measure, or collect? Things like that. It just makes you a little bit more aware of where you might be exploiting your users. How do you make the most of it? Read through it. Go through the checklist. Run workshops - I ran one at work and it was really popular: people turned up, I talked for a whole hour, and lunches went uneaten. Change the way you work; make it a step in your software development. And just as I'm saying that, I will skip to this one. This is a diagram that we use at work to illustrate our Agile strategy. It is very curvy. It means all kinds of wonderful things. One thing I would like to add to it is a third ear on the Mickey Mouse hat, where you consider the ethical implications of your software: add it into feasibility when you're creating your cards. That would be really nice. That's process.
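If you wanted to fold that checklist step into your day-to-day process, here is a minimal sketch of one way it could look - the eight zone names come from the Ethical OS toolkit itself, but the code is hypothetical, not something shown in the talk:

```python
# Hypothetical sketch: tracking the Ethical OS toolkit's eight risk
# zones as a per-project checklist. Zone names follow the toolkit;
# the review logic is illustrative only.

RISK_ZONES = [
    "Truth, Disinformation & Propaganda",
    "Addiction & the Dopamine Economy",
    "Economic & Asset Inequalities",
    "Machine Ethics & Algorithmic Biases",
    "Surveillance State",
    "Data Control & Monetization",
    "Implicit Trust & User Understanding",
    "Hateful & Criminal Actors",
]

def outstanding_zones(reviewed):
    """Return the risk zones this project has not yet reviewed."""
    return [zone for zone in RISK_ZONES if zone not in reviewed]

# Example: a project that has only worked through two zones so far.
reviewed = {"Surveillance State", "Data Control & Monetization"}
for zone in outstanding_zones(reviewed):
    print("Still to review:", zone)
```

Run against a project like the one in the example, it prints the six zones still outstanding, which could serve as a simple gate in a definition of done.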

For people, I would say that, actually, I reckon the need for tech ethicists is going to increase. You're going to see more tech ethicist roles. That's my Nostradamus prediction for the future. ... if anyone can make this happen. Personally, I think, yes, ethicists are going to be needed, but they shouldn't just be like taxes on companies that have been caught out - like, you did a bad thing. We should actively go about trying to hire people like this to catch issues before they make their way out into the public.

And three things to think about before you even write the README. Unethical software, in my opinion, might win in the short term, but you're going to get caught out: you'll lose your user base, you'll lose profit, and you'll lose people that want to work with or for you. We should develop software in a way that means we don't need to hide it behind contracts. How many people have scrolled down seven pages of a contract and clicked the box saying you've read and understood everything? It would be nice if we didn't need to do that - if we weren't tricking users into software that's supposed to provide them with some value.

You can make a difference. If there's something you're not happy about, try and flag it. Find someone you're comfortable talking to about it, and see if other people agree with you. It would be a good step to believe that we're empowered to do so.

So, my final recap: ethical software is about making the world a better place, or at least no worse than it already is - please! Considered design and constant review of features help you catch issues before your software goes out the door - and that is really the important bit: before the public can get affected by them. The toolkit is a good way to arm your colleagues with good knowledge: if you want to start talking about this to your colleagues and they have no idea where to start, throw the PDF at them.

Talk to people about how they think about this, and see if there's a way you can take their worries into account when you're developing your software. So, thank you very much for listening to me ramble on.

I would like to finish by saying in software development, caring is everything. [Applause].

About the talk

In software development, caring is everything - and as the senior developers of tomorrow, we should start the revolution. Are we the fix in the fight against guns-for-hire developers? How do we prevent unethical technology spreading (and being made)?

About Sam Warner

Sam is a recent graduate of the University of Warwick. Currently, he helps make cool things happen with Black Pepper Software in his role as a software developer.


Sam Warner

@sjwarner_

You Got This is a network of community conferences focused on core, non-technical skills coordinated by Kevin Lewis.
