Episode 10: Per Axbom on Compassion and Digital Ethics

Follow the Surfacing Podcast: Apple | Spotify | Google | YouTube | Amazon Music | Stitcher | RSS | More

 

In this episode, hosts Lisa Welchman and Andy Vitale speak to Per Axbom. He is a consultant in UX, accessibility, and digital strategy, and a speaker and writer on digital ethics.



Most know Per as the co-host of the UX Podcast, but in this interview they spoke to Per about his research and work in design ethics, his first computer and his experience with the social media platform, Clubhouse.

Transcript

Announcer:

Welcome to Surfacing. In this episode, hosts Lisa Welchman and Andy Vitale speak to Per Axbom. Most know Per as the co-host of the UX Podcast, but in this interview they spoke to Per about his research and work in design ethics, his first computer and his experience with the social media platform, Clubhouse.

Lisa Welchman:

So Per Axbom, thank you for joining us. I think I just learned how to pretend to say Per's name and I'll fall back to my Americanized version of your name, Per. Thanks for joining us on Surfacing. You were one of the first people that I wanted to invite, mostly because you've interviewed me a lot on UX Podcast and also everybody knows you from UX Podcast, amongst some other things, but they mostly know you through UX Podcast, and so I thought it would be really fun to have you come on and talk about all the other amazing stuff you do, most of which really resonates with me. Digital compassion, digital ethics, and mis-usability are all terms that I've read a lot on your super-long-lived blog, and I thought it would just be a great conversation to have with you. So thanks a lot. Let's talk about all of those things, pick one out of a hat. What's your heartbeat doing right now?

Per Axbom:

Oh wow, that's fantastic. Thank you so much for having me, this is so exciting. As you say, I'm on the UX podcast but I don't get to talk a lot about my background and what I focus most on. I think to understand where I'm coming from, I actually want to back up a bit and tell everybody about where I grew up, because I was actually born in West Africa, in Liberia. I lived there for the first five years of my life and then we moved to Saudi Arabia and we lived in Riyadh for four years, and that's where I attended an international school. I think that's where I got my sort of American accent. Then after that we moved to Tanzania in East Africa. So I have this background of moving around the world and attending international schools, just immersing myself in cultures and understanding different perspectives and ways of seeing the world.

Per Axbom:

I think as a child that was extremely enlightening for me, in that we always returned to Sweden, because I am Swedish, I have Swedish parents. We'd always return to Sweden during the summers, and having people say with confidence what they thought about the people, or the way I was living in those countries, they were so certain about how that was and how the people were and about the people's intelligence. It was just so odd to me to see how they could express such confidence in something they knew nothing about. So that's where I started to understand.

Lisa Welchman:

Were those positive and negative things that they were expressing or was it just off?

Per Axbom:

I think I react... Very off, I think mostly I reacted to the negative and it made me frustrated that people would think those things about other people without knowing anything. It mostly of course came down to news reporting and what aspects of those countries were being shown in the news, and this was back in the eighties. So it was like starving children, and that's one aspect of Africa and I still come across this prejudiced notion of what Africa is, when it's such a huge continent with so much diversity. Depending on how you look at it, it's said to be the largest continent, unless you say that Asia is, depending on what you include in Asia. But it's also 53 countries as well. So I think that is where I grounded this feeling of I can become a good mediator because I understand different perspectives so I can help people understand each other. So I studied communication science, hoping to be able to express the continent in a way that people can understand it.

Per Axbom:

So writing was what I was doing the most back then. I was writing content and trying to make that content resonate with the target audience and understanding that. So I was doing personas in that sense, in trying to understand who was the person on the other side reading this. And then the internet happened and I got into usability and had this path that a lot of people in this space have had who've been in it a long time. When you want to get into it and really help people, help people use the technology because you happen to be good at it, I got my first computer in 1982. So I'd been using computers for a long time and it was, "I have this competence, I have this skill that I can use to help other people." So usability and accessibility became really important to me in understanding that I have this competence, I can help people and technology will probably solve a lot of things, and I was in that head space of being excited about what the future would bring and democratization and people getting access to information, everything that just broke down over the past 10 years, and just that disillusionment that hits you when you realize that it wasn't as good as I thought it was.

Per Axbom:

I had been pushing people, of course, because I was so excited about technology, pushing people into this space and being an advocate for it, and now it's shifted and within this disillusionment I found another way of addressing technology and realizing that when you have systems thinking there's this idea that you have this balance, like if you have a car, you need to have the gas pedal and the brake pedal. You have a pencil, you have the pencil tip, you have the eraser, so the systems always have a feedback loop. Now I'm in the space where I try to be the brake pedal and start to brake, and I try to be the steering wheel and try to help people understand there's not just one path to follow, but we have to decide what direction we want to go in.

Per Axbom:

That's where I stumbled into this space of digital ethics and it's just exciting to be in but very frustrating because it's...I'm passionate about it, it's not something that makes me any money speaking of all the things I do because I mostly work with health care products and that's a slow moving industry and this thing I do with digital ethics, it's almost mostly my hobby. I blog about it, I do talks about it, and I'm out there trying to make waves and trying to help people understand and create awareness but it's tough but you find a lot of allies and you realize that a lot of people thinking this way just don't know how to find the tools to actually integrate it into their everyday work because the companies, they're not there yet.

Andy Vitale:

I definitely want to go forward into that story, but I also want to go backwards. You said that you got your first computer in 1982.

Per Axbom:

Yes.

Andy Vitale:

I want to hear more about that. What type of computer was it, reminisce on it for us.

Per Axbom:

So I was living in Saudi Arabia. My dad came home with it. It's a tiny computer, actually I could show it on camera for you because we have it here in my studio. I can do that later on. Yeah, we can post a picture in the show notes.

Lisa Welchman:

Yeah.

Per Axbom:

It's tiny. You hook it up, and he hooked it up to the monochrome TV through the antenna. It just had one K of memory, so you couldn't really do anything with it except program. So I learnt how to program BASIC, Beginner's All-purpose Symbolic Instruction Code. My first program of course was the same one many people who had it back then wrote. I wrote 10, which was the row number of the code: 10, print, Pelle. Pelle is my nickname. And then 20, the next row: go to 10. So I created a loop which printed my name all across the screen. That was my first ever computer program, which was also very visual.
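For anyone who wants to see it written out, the two-line program Per describes would look roughly like this in the BASIC of early home computers. This is a sketch, not a claim about his exact machine; the quotation marks around the name and the exact syntax varied between BASIC dialects:

    10 PRINT "PELLE"
    20 GOTO 10

Line 20 jumps back to line 10, so the name keeps printing down the screen until you stop the program.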

Lisa Welchman:

How excited were you?

Per Axbom:

Which was also very visual. I was really excited. I typed something on a keyboard there, it was a membrane keyboard and something happened on the screen up there. It blew my mind.

Lisa Welchman:

It's addicting, isn't it?

Per Axbom:

Yes

Lisa Welchman:

It only takes one hook of running some code where you're just like... It just feels powerful.

Per Axbom:

It does. Yeah, exactly. You gain power, and that's exactly that. Then you want more power, then you actually expand the memory of the computer and you get stronger computers and you buy a printer and you have this... my first printer was a thermal printer. It actually scraped things off the paper with heat, and then it was the dot matrix printer that just had eight dots per letter. It's been amazing just being part of this journey of computing making its way into everybody's home, but then also when the internet happened, when I was getting out and starting to apply for work after college-

Lisa Welchman:

And what year did that happen for you, the internet? When you were saying that I thought, "What year?"

Per Axbom:

Yeah, for me it happened, I think... I met my wife-to-be then in '96 and it so happened that she worked weekends, and the funny thing is when I reflect back on this, I actually didn't have a computer for two years. I didn't think I was going to work in computers, I wanted to be a journalist. But she had a computer sitting in her apartment, and she worked weekends, so I was sitting in her apartment on weekends just getting online with this 14.4K modem and just right-clicking on webpages and hitting view source and just copying code and seeing what could happen. So it's basically the same type of experience as that first computer but now it's like, "I'm copying someone else's code from somewhere else, it's not just from a magazine, and I'll see what happens." Oh, and something happened, what can I do with that? And just exploring.

Lisa Welchman:

How did you shift from weekend warrior on the internet to I want this to be the rest of my life?

Per Axbom:

I think I realized at that point that it came so easy for me, because of the experience I've had of using computers since I was such a small kid. So that ease with which it came to me and people expressing they were impressed by my knowledge. I realized, "Well, this is too easy." So you mean I can actually work with this? I thought that this was just a toy, I thought it was just for fun. So then I applied for a job at Ericsson as a web designer and I got it in '98 and that's how it all started.

Andy Vitale:

It's amazing when we start to play with things and have fun with them, the amount of learning that we do in exploration. It's so different now. When I look at systems thinkers, they came from tinkering, and now I wonder if the draw to learning design, learning how to use the internet, learning development, programming, engineering is high in instant gratification and dollars, more so than it was for us when it was just a sense of something new and something to explore. It happened to turn into how we make a living. But I don't know that it's where we would've started to think about making a living back then.

Per Axbom:

Yeah, I try to think about what is the way that kids are doing that today, exploring and thinking. But I see that they're just taking something that's ready and exploring with that. So I don't see the same exploration today, which is sort of disconcerting to me, and I don't know what to do with that thought. I wish there was more awareness of how, because so many of us were geeks and just excited about this technology. That's how we came into this profession and all of a sudden we gained power, which is just crazy because we didn't have any power back then because everybody just thought we're just geeks. Now we're just trying to figure out what we're going to do with it, but we never had that period of time to reflect on what we should do with it.

Per Axbom:

We've just been exploring all along, and that's what these companies that are releasing app after app are still doing. So we're still this young industry and they're still toying with it. But now it's starting to affect so many people and nobody's stopping and applying the brake pedal, to actually stop and think about it and about what's happening.

Lisa Welchman:

I want to pull apart what you just said because I don't disagree with you, but there's also an element of we're old to it, right? So I remember when I first started in the web, so we have similar paths, maybe a year earlier or something, something like that, slightly different path but still... I remember all the old document management people. Particularly at Cisco Systems, because the person who did a lot of the real interesting multichannel work at Cisco in the mid nineties, so early, was a woman who worked in document management. Then there are folks who worked in document management like Ann Rockley in the industry, who's a genius. Everyone talks about content strategy, this is like content strategy with a capital C, capital S, deep taxonomy, metadata-driven content strategies. All of these people come out of the document management space, and I remember they were complaining about web people at the time trying to organize information online and saying, "This is just doc management. You don't understand the fundamentals of document management, building a document in components, reusing information, all the things that we talk about." So I'm always skeptical when people talk about either millennials or Gen Z or whatever you call them, I don't even keep up on all the names.

Lisa Welchman:

I agree that they don't understand how the inside of a telephone works, but they sure know how to talk on it, and they also sure know how to do stuff I would never think to do with it, because they are natives. I think for me, and I'd really love to hear what your reflection on this is, it's like, what? How can we be together on this? We made the mess that they're standing on, so we can say that they don't understand it, but we also made the mess, and how can we be together? Because they're doing some really interesting and beautiful things with the technology and they're doing some not great stuff with it, and we're doing some interesting things and we're doing... It's from a generational perspective, so it would be great to hear just your reflection on how we can come together and push this generational divide back together and proceed, because I think we need all parts.

Per Axbom:

Yeah, I also have this vision of everyone coming together, this paradise vision. I also want to say something about that concept of digital natives, because it can also come off wrong in the sense that you can be 80 years old and be a digital native, and you can also be 15, but there are certainly lots of 25-year-olds that I know who never use tech, so sometimes all these generalizations can be dangerous as well.

Lisa Welchman:

I don't know, and I'm arguing with you. Maybe I'm not supposed to argue with my guests.

Per Axbom:

You're okay.

Lisa Welchman:

Here's the thing. Yes, I agree with exactly what you said, but this issue, if you can't tell, is one that just floats in my head all of the time. What's the difference between these generations of people? I could say my parents are in their 80's, my mom's very digitally savvy, my dad not so much. He's not a Luddite but he's just... The way his brain thinks it doesn't work so much, but she's not a native. For me native means people who take for granted that the technology exists. So you might be 25 and you may not use a smartphone, but I bet you, you know how one works and you understand.

Per Axbom:

With that definition, yes.

Lisa Welchman:

Yeah.

Per Axbom:

I'd completely agree.

Lisa Welchman:

Well, that's what I'm talking about. I'm talking about people who just take it for granted, like I took the telephone for granted. My parents didn't grow up with phones in their house. For me, I can't even conceive of a world where I can't pick up a device and talk to somebody. I still think we're recording this over Zoom so I can see both Per and Andy, but it's like the Jetsons television program where we're like video talking.

Per Axbom:

Yeah.

Lisa Welchman:

It still feels futuristic to me, whereas I'm sure a five-year-old is like, "This is life." So that's what I think of when I think of native. But go ahead, I'm taking us on a sideways tour. You were talking about something else completely.

Per Axbom:

No, I think that's amazing. So what this made me think of, which is a sensitive subject in Sweden just now, is this app Clubhouse, because Clubhouse is a conversation app where people can get on and talk to each other in rooms around any topic. Some of the things I've discovered on it are exactly what you were talking about, people of different generations actually ending up in the same room and finding out stuff about each other. I find that so exciting, and why I say it's so sensitive is because that same app is also breaking a lot of EU laws, it's breaking GDPR, it's asking you to share your contact information, and people who don't have an iPhone can't get on it. So it's very much an app for privileged people, but this essence of what the app accomplishes when you're in it, and immersed in it, and immersed in a room and immersed in conversation. That has uncovered something that I haven't experienced in a long time in a digital space, where people are finding a connection with each other without screen-dumping stuff, without retweeting, without liking, but actually just hearing someone's voice and giving them space to speak and giving them space to be listened to.

Per Axbom:

There's something there that's going on. I don't know if Clubhouse will be that future app that will do that or if other platforms will just copy the functionality, but I think that's another example of a great experiment that is uncovering something that seems to be something that people need, and especially in these times. I've heard so many people now express on Clubhouse how they've needed this so much having been at home for so long over the past year during a pandemic, and just hearing someone's voice, they didn't realize and I didn't realize before it happened how much just hearing someone's voice, how much that brings a connection to another person. So I've talked about the barriers that are being broken down thanks to all the chat services and social media platforms, but somebody said in one of these rooms, "So Clubhouse is bringing social back to social media." Right now I agree with it, I have been on it for a long time. It's something different and that's exciting to me. I think we just need to sit down, again, apply the brakes and figure out what it is that is exciting here and what are still the dangers that could happen here.

Andy Vitale:

Yeah, I wanted to talk to you about Clubhouse briefly. Just reflecting on what you said, that it's for privileged people, yet it's creating, or it's solving a need of people that needed to connect in these times. So those are two... It's definitely solving a problem potentially, and it's potentially creating another problem at the same time. Now how do you think the people behind Clubhouse feel about it? Do they see it 100% like, "This is a need, I know it's on iPhone only, I don't think that's coming from a privileged space, it's easier for me to get it out the door first and I've got an Android version behind that." I'm just curious, the statement that it's for privileged people is 100% spot on, but I don't know that Clubhouse themselves, as an organization, look at it that way.

Per Axbom:

This is hugely interesting. I don't want to really put thoughts into people's heads. Obviously I don't really know what they are thinking, but I believe it's like most apps that I come across, and most digital services I come across. They're just not thinking about the inclusion aspect of it because they're trying something out. They can hide behind, "It's still a beta, so we don't have to have everything working yet." It makes business sense because, with one platform, that's all you have to contend with when you're making it, which means you only have to change that one platform as you learn stuff, and that's what they're trying to do. They're iterating and releasing new stuff constantly, and having to do that for two platforms would of course be hugely more expensive, so it makes business sense. The question is, how do you ensure that you actually do include more people in the end?

Per Axbom:

I've been spending a lot of time... so this I find really interesting, I'm so immersed in it right now. I've been spending a lot of time in rooms where there have been deaf people, blind people, and people with speech impairments. So they are able to use the app and that's what I'm trying to... I'm writing a big analysis on this right now, just this balance that you were talking about, Andy. Yes, it's for privileged people, but is it also including people that have not been included in other spaces? So it's actually creating a space that wasn't there before as well. So it's this whole thing of holding two thoughts in your head at the same time. Yes, it can be very excluding or exclusive, but it can also be including people that have a hard time on other platforms, that find it easier to talk to people here.

Per Axbom:

It can be things like if you're just nervous. You're okay to just listen in on conversations as well, I did for a week. Just listening in, and people are fine with that as well. That is inclusive for me, I don't have to feel that I have to participate, that's not the only way to participate in social media. Just listening as well is okay. There are so many aspects to the inclusive word that I think they are actually succeeding with. I've also spent time reading the community guidelines, which I know people just don't read. But I find the community guidelines really appealing, I like them. So at the same time as I'm criticizing this app a lot here in Sweden because of the GDPR, I'm also recognizing that they are trying, it feels like they're trying. They are acknowledging exactly what I was saying about you are allowed to go in and out of a room. Nobody should feel offended by that, that's okay, it's okay to just listen. If somebody invites you up on stage, as it's called in Clubhouse, to speak, you don't have to accept, that's okay. So they go through that in the community guidelines just to help people feel safe.

Per Axbom:

Now again, people don't read the community guidelines, people have heard this from me as I've explained them and of course that's also how this type of information spreads. But I get the feeling they're trying to create something that is positive, but they haven't thought of all the odds and ends of what that entails. That also actually makes me think about, so what's going to happen, they're probably going to get a fine in the EU, and they're going to pay that and maybe that's been a calculation, because they can afford the fine.

Lisa Welchman:

Yeah, I always think that's part of the business model.

Per Axbom:

Yeah.

Lisa Welchman:

It's definitely part of Google's business model. But I just want to back up again, as Andy's been backing up. You always say these really rich things. The point that's really interesting to me and one that I find myself talking about a lot is that inflection point between invention and innovation, I'm using the I word, invention and innovation, new ideas, creativity, whatever you want to call that pile of where people are pulling something out of thin air and inventing something, or changing something slightly and inventing something, and bringing it to scale. So somewhere in that mix between, "Oh, I've got a good idea, I think it might work," if it's Tim Berners-Lee, it's the World Wide Web, "I think it might work," and then bringing it to scale. There's some point in there where it's okay to be loose, and there's a point in which you really need to check out some basic stuff before you scale it. That's a scenario that Andy and I have been talking a lot about, because we were trying to figure out what does digital safety mean? You use the word safe, how do you help people feel safe?

Lisa Welchman:

So I have my opinions for the type of work that I do. I also have a philosophical point of view that sometimes people don't agree with, which is you can't govern or create standards for the most part, I think in the ethical realm this may not be true, but just from a development perspective, for a product that doesn't exist. In other words you can over-choke something, you can over-govern something if you stomp on it too early out of the gate, but at the same time there are these basic fundamental human-centered things, rules that you should never break, you know, things like don't kill people, or don't be abusive. What do you think is in that pile, for digital, or just in general? What do you think, what's in that pile of things that we should never violate when we're creating new products and services versus what can really wait until later? I know I'm really putting you on the spot, and I'm not going to hold you to this, and I'm sure you've thought about this a lot, but I'm curious about that because that's something that I consider.

Lisa Welchman:

What does it come down to for me? Inclusivity. I'm a Black woman, it's not everything that I am, but all of this information around algorithmic bias, that's huge for me. How could you even put out a product or service that just doesn't work for someone who looks like me and say, "Okay, we'll address it later." It's literally so offensive and dismissive that I just can't wrap my head around it. So that's one thing for me, it should work for all people, but then what does all people mean? I'm focused on the Black woman aspect of it, but I can hear and I speak American English and there's a bunch of other things that are like that. So what does that pile look like for you?

Per Axbom:

I'm glad you mentioned the language aspect of course, because I think that's something that a lot of us don't think about, how many people actually don't speak English and how much content actually isn't available to them just because of that. So when I really dove deep into the ethics space was when I was working with a platform for cognitive behavioral therapy. I worked on that for four years, an international platform in Sweden that's rolled out throughout the country now. Building that was when I first came across the requirement that, according to law, you had to do risk analysis at regular intervals and report back on how you were mitigating different risks, and I've never used the word suicide as much as I've done during those four years. That's when I realized anything we do here that actually means that the information between the doctor and the patient doesn't come across the way it should could potentially, worst case, lead to suicide on our platform. That's when I extrapolated on that and realized that all these other digital tools that I've been working on all these years, I have the exact same challenges with the interface I'm working on, and I haven't even thought or anticipated that those digital services could also lead to people feeling really bad about themselves and there's just no mechanism for them to report back, to actually even become aware of that.

Per Axbom:

I'm so worried that we're creating so many things where we're just not aware of the people being hurt, because it doesn't just have to be users, it could be the people outside who are not the users, who are affected by people's changed behavior because of something we created. So it's really complex, and so finding that list of things where we say, "This has to be in place or otherwise we can't launch," is extremely difficult. I think we can do a much better job of having these risk analysis sessions at regular intervals and actually thinking about what could potentially go wrong. But the only way to get those right is of course inviting more people, a more diverse set of people, to understand the risks. The people who least understand risk are white middle-aged men like myself, I've read the studies on that, because we're exposed to so little risk. So we need the specialists, the people who are always on edge, who always are the people who are exposed to it, and so that's how you try to mitigate it beforehand. But then you also need the mechanisms, you need that feedback loop again of listening, you always need to be listening and making sure that people aren't being hurt, and believing people when they say they're hurt, and not dismissing people because they are an edge case.

Per Axbom:

Because that's the thing as well, isn't it? Because we normalize certain behavior, the thing that is most common among most people, that's what we're designing for because that makes, again, the most business sense, whereas when I was talking about Clubhouse before and noting that actually people who are not privileged, in these experiments, they can actually use the app. So there are some privileged people who can't use the app either, but these people can. So it's interesting to see, actually I think, to start thinking about that: some people, you don't need to design for them, you need to lift the people that don't have much, that have the least power. I think that aspect of it is too rarely discussed and we need those tool sets inside our companies to talk about it and make people aware. Now most companies aren't going to invest, and that's one of the things that frustrated me the most about accessibility when I worked across twenty years with accessibility, realizing that I've been giving the same talk almost for 15 years and things are changing so slowly.

Lisa Welchman:

I know.

Per Axbom:

So why isn't it happening? It's because I think people are still unaware, I can actually still, I think the last time, maybe it was four years ago, the last time I heard someone actually asking in surprise, "Oh, so blind people use the internet, really?" And just that not being aware of that group of people using the internet, that was just so strange to me. So what I'm trying to do these days is actually create more awareness and give more power to the people who are on the other side, the users and helping them give feedback to the companies, showing them how they can actually reach the companies with the concerns that they have. So I've started a website community service thing where I do short videos. Like in Sweden we have an identification process where you have something called Bank ID, so I show people how to install Bank ID and show them who they can talk to if they can't get it working, so I'm trying to teach that aspect of technology to people. I'm starting to realize that, "Yes I can work with the companies, but it's happening too slowly there. So I need the people themselves to be more aware of what's happening, know their rights and understand how they can make things happen better for themselves."

Lisa Welchman:

I don't disagree with what you're saying and I also feel like we need to be careful when we talk about giving people power. So I remember growing up there was, I can't remember, at some point I got into a fight about something, race relations in the United States of America are always a big thing, and someone talked about how their parents marched in the Civil Rights movement and gave me my rights, and I pointed out to them, I always had them. You may not have recognized it or whatever, and so I think there's an element of that as well, which is I find myself having to be careful about making sure that when I look at somebody who is otherly abled than me, that that's not necessarily a downside. Yes, objectively speaking if you look at power mechanisms they may not be pushing and driving the money, but those individuals have value and they have value to me. They're going to make the system better through their intelligence and through things that we think of as other or as disabilities, can actually mean...

Lisa Welchman:

Andy and I met at a conference where a guy gave a great talk, and I'm sure those who work in accessibility know this, building things that are accessible makes them better for everyone. So I just want to point that out, just, we're not giving them anything. We're shifting ourselves to realize that we've been narrow and that we've been creating things not taking everyone into account. So I hope that sounds...

Per Axbom:

No, that's a really good point. Yeah.

Lisa Welchman:

Right, and I think if we make that shift we can really see things differently. It's not like, "Oh my God, we have to start our project and worry about those damn blind people." It's just more like they're just part of the human family, and when we're addressing the entire human family, like any family, it's going to be better when everybody's there and what they bring to the table is going to make us better and make the whole experience better for everyone.

Per Axbom:

This is a lot of the messaging that has been in my accessibility talk.

Lisa Welchman:

Yeah.

Per Axbom:

For all those years that I've been speaking about it, and the issue I have of course always is that it's happening so slowly. The companies aren't, I mean they're fine with listening to my talks and agreeing with me, but that doesn't change their behavior, that doesn't change their outlook to the outlook that we want. How do I teach them to actually care for other people, and understand that maybe growth isn't what we should be focusing on this quarter, we maybe should be focusing on well-being? I find it hard to help companies come to that conclusion, so I'm trying to fight this struggle on different sides. I'm teaching them, I'm teaching my peers in the design industry and I'm teaching the people who're exposed to all these tools all the time, who actually don't understand how to use them right now, but they aren't dumb.

Per Axbom:

My mom actually uses a lot of tech, and I realize that if I just spend three minutes with her, she gets it, but nobody has ever spent that time with her, and that's so interesting when you have a national ID, like the one that I was explaining that we have here in Sweden. So you implement the national ID but then you never create the safe space for people to talk about it, so now we have a national ID that is digital, but where's the safe space to connect around the technology, where people can discuss their fears, or their doubts or worries and hopes and wishes and whatever else comes along with actually having an ID that you use to identify yourself with different public agencies.

Andy Vitale:

That's interesting, because you mentioned growth and how companies are so focused on growth when they should really just spend time improving the experience for everyone. As designers, as someone that leads design teams, I think it's our responsibility to the companies that we work with and work for to help them understand that that is an opportunity for growth for them. That by opening up their products and platforms to a wider base of users, more people get to use it and more people eventually improve it, and that leads to growth. So it's funny because Lisa and I debate about design and the ego of designers, and you've talked about how you've felt designers drifting far away from the promise of a human-centric focus, and I want to hear more about that. Because I'm seeing a little bit of the opposite, I'm seeing designers that are coming out of school, that are new in the field, that are so focused on improving things and making products better and making the world better, design as a force for good, that I want to hear where you're seeing the opposite.

Per Axbom:

I started seeing this when Daniel Kahneman and Richard Thaler got the Nobel Prize, so here's Sweden doing its thing again.

Lisa Welchman:

Why do you all get to decide?

Per Axbom:

Exactly. We get this insight into how brains and habits work and we get all these books being written in the early 2000s, around coercing people into doing the things that we want them to do. There's behavioral psychology, all these types of different titles that people all of a sudden had. They were experts in showing companies how they could grow by placing buttons in different ways and using text in different ways. We have the whole A/B testing industry that is more concerned about how people click and where they click rather than whether they feel good about the product that they bought. So that is where I'm coming from, the whole Amazon one-click-to-buy thing, don't get me started, and now they've lost that patent and that's actually being released into every company in the world, so everybody is going to go one click to buy.

Per Axbom:

So as soon as you have this simple way of reducing friction as one of the tenets of design, and people see that as always positive, then that is the path that we're headed towards. But I couldn't agree more when it comes to the young designers coming out, because I think that more and more in design schools it's becoming a lot more common to actually have these courses. I mean, I teach ethics at a design school as well, and I can see how excited they are when it's time for my course to come along, as well as the guy who teaches accessibility. This is what they want to learn about, they're so aware, but I don't think it was the design industry that created that awareness. It was something else, I mean the awareness across the world, about social media probably as well, creating that awareness of the circumstances of people across the world, and so they want to do something about it. So I think there is an incentive now that wasn't there before perhaps. Sometimes when I give talks I talk about how back in the early 2000s we wanted a seat at the table, and we got the seat at the table, but it was at the expense of arguing for our position against numbers, so we can boost numbers, that's what design is for. But it never was really about that, really.

Andy Vitale:

One of the things that I tried to do, and it's been ingrained in me from a previous chief design officer, is we have a seat at the table as designers like anyone else in the organization does, but oftentimes the first question is, "So what is the value that you bring? Give me the justification for you having this seat at the table." At the end of the day, the seat at the table is the justification. We don't have to prove every single thing, just like the engineering team, the product team, the business folks. They don't explain why they're there. So as a design organization, as designers, we're always making the case of, "We need the seat. Why're we here? Here's why we're here. This is what we're doing." We need to stop that and focus on the work, and the work itself is the value that we provide. That's the seat, that's our justification, just having that job is the justification for us being there. So it's interesting that we focus so much on how we do things instead of what we do.

Per Axbom:

Yeah.

Andy Vitale:

You talk about that in the form of we need to have these numbers, we're promising all these results. But at the end of the day, the result is our product is more usable, our product is well received, people are solving the needs that they had, they're able to achieve their goals and to me that's the justification for what we do.

Lisa Welchman:

Well for both-

Per Axbom:

Thank you. That's so good, that was exactly... You started on the question and that was exactly the rant that I was going to go on and it was so perfect because you did it for me.

Lisa Welchman:

Wait, I want to ask both of you, designers, because I'm not one. I'm a designer of people systems, but I'm not what people think of as a designer. So you have this seat at the table, and I mean I run into that whole barrage of folks that Andy mentioned when I'm doing governance work. I'm trying to get these people to work with intent and together. So designers sitting at the table, yes they need to have a seat at the table, we could argue about that but no, I'm not going to, because yes they need it, but how do you engineer this ethical responsibility at this table? This is the big question, and so one of my challenges with designers is, or with any of these sets of people, because they all say, "We should do it." Content people say it's the content, design people say, they're usually talking about visual design, they'll say that but then they'll say, "No, it's not just visual design." Everyone is trying to get on top of the pile. The only ones that are not trying to get on top of the pile are the engineering stack people, because they know that stuff isn't going to run if they don't do their stuff. So they're just silently there, but everybody else is trying to say, "No, it's us. It's us."

Lisa Welchman:

From your perspective, either of you, where would you put that? At what point in the process, is an easy question to ask, but who is it inside of the organization that's driving this ethical conversation? Because if no one's owning that conversation, it's just floating around in the ether and something you bump into occasionally.

Per Axbom:

Yeah, that is so eloquently put. I think everybody wants to own it, somebody saw it and so they want to run with it. I think that's what we're seeing with Tristan Harris and this whole humane technology movement, in that somebody wants to take this flag and own it and say, "We own this ethics space." But of course anyone could own it. It could be customer service, it could be anyone, as long as everybody is involved in the conversation and we're creating awareness around the problems, and I totally agree with this problem of us saying, "This is what needs to be done." I think that's my biggest weakness, because I don't take the lead, I'm contemplative and try to think a lot, and somebody else is more loud and of course it's easier to follow them, and that's always the problem when you try to reason and reflect and go slowly.

Per Axbom:

But what I try to do, and this comes from my, I studied coaching for a year as well, and that's when I learned that being a consultant was completely wrong of me, because as a consultant I always give advice and as a coach I actually help people to discover what they already know. So in these meetings, because I have the privilege of sitting in a lot of these meetings, where a lot of people from a company's management are sitting around, I ask questions, and I ask questions like, "So if it turns out that somebody gets hurt by this thing we're building, what tools do we have in place to handle that?" And so I expose it by asking the question and not by telling them, because I of course already know that they don't have the tools, but they don't know that and I'm just being inquisitive.

Per Axbom:

So the more times I start asking these types of questions, the more aware they become that there are holes, there are things that can go wrong, and sometimes I go as far as to give examples from other companies, of course, and when I do that... so it's a slow process, and that's also of course what's so frustrating for a lot of people in this space, that you have to be aware that it is a slow struggle. It has to be, because people have to be with you and they aren't there yet, and so they have to go slowly there with you, and you do that by inviting them, not by telling them what to do.

Andy Vitale:

Yeah, I think that's a good point. Inviting people, so what I've seen is in a lot of organizations, the product teams are siloed by product or by lines of business. The engineering team's a little bit heads down. The design team actually has that visibility into things that are happening amongst all the different organizational silos. So they're the ones that can see an opportunity and then use design as a collaborative, pulling-people-together exercise to start to have those conversations around this joint ownership for something like ethics. At the end of the day, obviously there needs to be a single decision maker. It can't be five people owning a decision because then decisions never happen, but the value that I see that design brings isn't to stack their hand on top of the pile, but to bring everybody together so that we can have that conversation around who actually does make this decision and what is happening in these different parts of the organization that may not realize that it's happening. So to me design is that pulling-everything-together type of role in that situation.

Lisa Welchman:

See, I think that's true operationally, but I'm biased. I think it's a cultural issue in that it's really an executive function to establish a culture of safety, to establish what the ethical frame is for an organization, because there are so many people executing on it. I am biased because decision making to me is governance and nobody wants to say that because what an ugly word, it's really horrible, that stuff that big bureaucratic organizations do, but we're governing all day in crappy ways in that sort of thing. To name one of the obvious, I think part of the Facebook challenge right now is that their governing framework is such that one individual, Mark Zuckerberg, has complete control over what they do and do not do, and so his cultural values, his personal values, how he sees things drive the entire process. You can imagine, he could be a completely different person with a completely different belief set and this whole thing would've spun differently.

Lisa Welchman:

So I think you can't really underestimate some very simplistic things like that value system of leadership and what type of messaging they're giving in the ways in which they drive the team, and to what ends. So we can talk about money but there are cultural ends, are you a nice company, do you abuse people's trust, is it okay to trick people? These are things that really come out of a value system of an organization, and even when you have a really intense governing framework in place, like in a military situation, there are still governing mechanisms that are going to control all that, so I think it's all an and, it's all these people need to get together, and because it's an and for the entire organization, that makes it something that executives have to drive. They have to say, "That's not okay for us. For us to behave that way is not okay." I think that's really where it's going to end up being and that's why it's so slow, because you're shifting a culture where, particularly for organizations that existed prior to the web, that stuff is deep. It's very deep and you've got to dig it out, so hopefully things will be working better over the next 50 years.

Lisa Welchman:

So one of the things that I saw on your website this morning when I was refreshing my brain before we talked to you was digital compassion. So I'm a very serious meditator, and so I hang out with a lot of Buddhists that talk about compassion and empathy all of the time. But I was really interested in why you picked that word. So ethics, yeah, mis-usability, I read about that and I get that. But where do you think compassion sits in this frame and why did you pick that to emphasize?

Per Axbom:

I started with the mis-usability phrase, and that's when I started doing my ethics workshops. Because that was easier for people to understand, in that, yes, we are doing usability, but sometimes when you do usability, and even successfully, we actually hurt people. But I kept coming back to that problem space which I was mentioning, that I'm finding it hard to help people understand that they need to care about other people, which is what it really comes down to. So I think compassion for me, and I actually intentionally avoid the word empathy, because I think it's really hard to feel empathy for anyone and everyone and to actually say that I am able to find empathy for the people I design for. I don't think that's what's going on at all, but I can decide that I care about the people I design for and feel compassion for them, for their circumstances, even though I don't understand them. I think that being able to do that gives you so much. That gives you power as well as a designer, because all of a sudden it's not about knowing what people need, it's about having to ask people what they need, because that's the only way you're going to find out, because you don't know.

Per Axbom:

So compassion is about always being prepared to immerse yourself in spaces where you may be uncomfortable, but it's your job because you are going to be affecting these spaces with the way you are designing. I think that's something we don't acknowledge enough as designers, the power that we wield. The power that we have actually to expose people to decision making processes where we decide what is the preferred choice and the secondary choice. We even teach that in design schools. This is a primary button and this is a secondary button, so we're doing choice architecture, so we are actually influencing what you should think is the better choice and we keep doing that. And we need to always, not just before we're designing but while we're designing and after we've designed, be speaking to the people who're affected by what we're making, and that is compassion for me.

Per Axbom:

I've been using that word, I used it as the title for my handbook as well, Digital Compassion, where I try and help designers understand that there are other ways of thinking about design, because I think we've been taught for so long what design is, and what I realized when I started talking about these issues is that people came up to me afterwards and always said, "Oh thank God somebody's talking about this, finally." People had been wary of actually raising their voices and saying the same things, but it wasn't new, it was just that it was silenced, in that we're supposed to work with making all experiences frictionless, that's our job as designers. But now we've realized that when we do that it can lead to some devastating problems, and so here are some other tools, here are some other ways of working, and people actually really appreciate that. So I've seen people implement... I have this tool called the inclusive panda, I've seen people using that in workshops and it's like, "Wow, so something I invented is actually being used out there," and that's fun, but the most fun part is the conversation that is happening, because that's all it is. It's a conversation starter, and so they feel more at ease having a conversation if they have an artifact to talk around.

Andy Vitale:

I was thinking about empathy versus sympathy versus compassion, because when I was working in healthcare and I would explain something very similar to the doctors and the medical staff that I was working with there, they'd say, "You don't understand how much compassion just burns you out." And I'm like, "Well how is it really different than empathy?" I think you did a really good job of describing why we need to have compassion for the people, because we can't really put ourselves in their shoes. We don't know everything that they're experiencing, so I want to hear more about the panda. I see it behind you and you explain it as a tool being used in workshops, so I'm just curious, can you talk a little bit more about that and how that leads to designers having more compassion for the people that we're designing for?

Per Axbom:

I call it the inclusive panda, and it started with me drawing a bunch of circles trying to figure out different spaces of things to talk about when it came to inclusion, and all of a sudden it started looking like a bear, and I don't know why I made it a panda but it turned out... I wanted to make it memorable, and that actually also makes it easier to describe on a podcast. So there's a big circle that is the head of the panda, which is everybody who uses your digital service, even the ones you haven't thought of, but everybody who is a user of your digital service. Then you have the left ear, which is also a circle, but it overlaps with the head, so those are the people who are excluded for accessibility reasons, and more. An interesting thing with the overlap is that you realize, "Well, some of the people with accessibility issues actually are still using the service." But you're not aware of them, you're not talking to them. So you can do stuff that happens which throws them out or can bring more people in, but you don't know, because you're not actually paying attention to them.

Per Axbom:

Then you have the right ear of the panda, which is also a circle that overlaps on the opposite side. Those are the people you want to exclude, and this is a concept that is difficult for a lot of people, like, "Oh, so now we want to exclude people?" But the absolute easiest way to think about that is that some apps for children should exclude adults, and some apps for adults should exclude children. So you actually have to make an effort to not let people in that could harm the people that you're actually trying to help, so that one creates a lot of interesting discussions. So even around spaces such as Slack groups at companies, this is interesting, because does everybody have to be on the Slack and does everybody feel as comfortable talking on the Slack, or do some people have more power than others? So if somebody is in the group talking on Slack, does that mean other people are silenced? So those types of discussions come up. Then you have the snout of the panda, which is a smaller circle in the big one, and those are the people you're actually designing for, because those are the people participating in your usability tests and you're doing interviews with.

Per Axbom:

So that's just such a small segment of all the people that are affected by what you're doing, and that's what you have to recognize as well, that it's just a small part of the puzzle that you actually have and are usually able to keep in your head. Finally there's the nose, which is an even smaller circle inside the snout, which are the people who you are designing for but who are still getting hurt. So they're still getting hurt because either you're letting other people in that are hurting them, like we were talking about with the other ear, or you've made some bad design decisions. So by talking about all these aspects of the panda's face, structurally and in order, you can actually start to have really, really interesting discussions about exclusion, inclusion and who we haven't even thought about, and when you bring more people into that space to talk about that model, you get some interesting thoughts that you probably haven't thought of yourself.

Andy Vitale:

Nice, I actually want to explore that with my team after hearing about that, and review some of our existing products.

Per Axbom:

That makes me so happy.

Lisa Welchman:

So Per, now that I know how to pronounce your name I'm going to be saying Per all day long.

Per Axbom:

Nice.

Lisa Welchman:

Thank you, really appreciate you being here with us. It's good to talk, I think we could talk a lot more, I forgot how much I miss talking to you, so I really appreciate that and appreciate your time.

Per Axbom:

Thank you so much. This has been wonderful, awesome talking with you.

Andy Vitale:

Yeah Per, it was great to really get to talk to you for the first time. Just for anyone else that wants to reach out and have a conversation with you or find out more about you, where should they go? What can they do?

Per Axbom:

The space I'm most active in is still Twitter, so Axbom, A X B O M, on Twitter is where you'll find me, and from there you'll find everything else.

Lisa Welchman:

Wonderful, the center of your universe. Well, have a good day and thanks.

Per Axbom:

Thank you.

Announcer:

Thanks for listening. If you enjoy Surfacing, please rate, review and follow us wherever you listen to your podcasts. Also, consider supporting the podcast on Patreon at patreon.com/surfacingpodcast. On the Surfacing Patreon page you can support the podcast or become a member and gain access to exclusive ask-me-anything podcasts with Lisa and Andy. If you have suggestions for guests or a topic you'd like to hear about on Surfacing, please reach out via the contact form found at surfacingpodcast.com.