Enabling design with Ann Paradiso


Episode 70, April 3, 2019

Ann Paradiso is an interaction designer and the Principal User Experience Designer for the NExT Enable group at Microsoft Research. She’s also the epitome of a phrase she often uses to describe other people: a force of nature. Together with a diverse array of team members and collaborators, many of whom have ALS or other conditions that affect mobility and speech, Ann works on new interaction paradigms for assistive technologies hoping to make a more bespoke approach to technology solutions accessible, at scale, to the people who need it most.

On today’s podcast, Ann tells us all about life in the extreme constraint design lane, explains what a PALS is, and tells us some incredibly entertaining stories about how the eye tracking technology behind the Eye Controlled Wheelchair and the Hands-Free Music Project has made its way from Microsoft’s campus to some surprising events around the country, including South by Southwest and Mardi Gras.



FINAL TRANSCRIPT

Ann Paradiso: A giant team rallied together for this first Hackathon project and built a prototype of a wheelchair that could be driven with the eyes. And what happens with ALS is that, eventually, people will progress to the point where they will lose control over the muscles that allow them to move or speak or breathe. But very, very often they retain good control over their eyes. And so, we can use that. Anything that you can still move we can use, right? If you can move it, we can make a switch out of it. That’s our motto!

Host: You’re listening to the Microsoft Research Podcast, a show that brings you closer to the cutting-edge of technology research and the scientists behind it. I’m your host, Gretchen Huizinga.

Host: Ann Paradiso is an interaction designer and the Principal User Experience Designer for the NExT Enable group at Microsoft Research. She’s also the epitome of a phrase she often uses to describe other people: a force of nature. Together with a diverse array of team members and collaborators, many of whom have ALS or other conditions that affect mobility and speech, Ann works on new interaction paradigms for assistive technologies hoping to make a more bespoke approach to technology solutions accessible, at scale, to the people who need it most.

On today’s podcast, Ann tells us all about life in the extreme constraint design lane, explains what a PALS is, and tells us some incredibly entertaining stories about how the eye tracking technology behind the Eye Controlled Wheelchair and the Hands-Free Music Project has made its way from Microsoft’s campus to some surprising events around the country, including South by Southwest and Mardi Gras. That and much more on this episode of the Microsoft Research Podcast.

(music plays)

Host: Ann Paradiso, welcome to the podcast.

Ann Paradiso: Thank you for having me!

Host: You’re an interaction designer and a Principal User Experience Designer at Microsoft Research. We’re absolutely going to get to how you came here a bit later, because it’s a great story. But to get us going here, tell me what gets you up in the morning.

Ann Paradiso: Well, I love my job, and I love the people that I work with. And what gets me up in the morning is coming in and collaborating with my team, and my collaborators on some of my other teams, and getting to work on something that I have a personal and meaningful connection to. I love the people that I collaborate with, the people who inspire me every day, the people living with ALS and other disabilities, and I feel very blessed that I get to do that for my job.

Host: How would you describe the work that you guys are tackling? What kinds of questions are you asking? What kinds of problems are you trying to solve?

Ann Paradiso: I would describe the work that we’re tackling as extreme constraint design. We solve hard problems for people dealing with extreme constraints that are the result of disability. And everything about what we do is constrained in some way. Everything from being able to communicate directly with our user base and our collaborators to dealing with things like resourcing and funding and time. The essence of what I call extreme constraint design is that freedom can be found in very, very hard constraints. It seems counterintuitive because you think you want to have all of these options and all of these choices, but constraints can be a forcing function. And they can cause you to solve problems in unconventional ways that you wouldn’t normally think of on a linear path. But a lot of the problems we’re trying to solve can’t be solved with a conventional, linear path. You have to use unconventional methods, intuition and creative problem solving. And you have to have the resolve and the tenacity to be able to overcome some of the setbacks that come with such a constrained environment.

Host: That leads really beautifully into the next thread, which is, among your professional interests that you have listed in your bio, you include “new interaction paradigms in assistive technology.”

Ann Paradiso: Yes.

Host: And we’re going to get much more deeply into the assistive part of this in a second. But it begs the question, what’s wrong with the old paradigms and why are they left wanting?

Ann Paradiso: Well, the old paradigms work for a lot of people and there’s nothing wrong with paradigms that are already working. The problem is, for the people that we work with and that we collaborate with, there aren’t any existing paradigms for them. And the ones that do exist don’t take the whole picture into account. What we’ve learned, working with the ALS community and some of the people with similar disabilities, other motor neuron disabilities, even spinal cord injuries, or people who have other speech and motor impairments from disability, is that a lot of times, there isn’t a one-size-fits-all solution for somebody. The disease space that we started with, ALS, for example, has so much variation in onset and progression. People lose the ability to move the muscles in their body, but that happens in different ways and in different patterns. Somebody might lose speech first. Somebody might lose hand strength first or leg strength first, so it’s very hard to come up with a sort of one-size-fits-all path for them. And also, I think technology is continuing to evolve. For example, we’ve had eye-tracking technology around for decades.

Host: Right.

Ann Paradiso: But it hasn’t been available at scale and in a way that we can prototype and experiment on. Now you can get an eye tracker for a couple of hundred bucks. You can plug it into a Surface device, use Windows Eye Control, which is something amazing that has come out of our team.

Host: Yeah.

Ann Paradiso: That was led by Harish Kulkarni, one of my beloved colleagues. And we can do things now that would have been very, very expensive and purely academic maybe even five years ago, ten years ago. So, what we really want to focus on is helping people in the now- to two-year time frame.

Host: Interesting.

Ann Paradiso: And so part of it is that. Part of it is that stuff just hasn’t been invented yet. Or if it has been invented, it might have been invented by a graduate student in a very small way just to show a proof of concept, but it isn’t something that could actually be used by somebody who needs it, or supported or sustained. And so, we have to get creative.

Host: I know that ALS, or Amyotrophic Lateral Sclerosis, has been a motivating driver for some of the projects you’re working on, and people with this disease are literally, as you say, locked in their bodies. And normal interfaces don’t work for them. So, you’ve disrupted these interfaces and come up with some remarkable new technologies, and one of them is a project you worked on with Team Gleason. I don’t assume that our listeners will know what that project is, so tell us about this project that you’ve done with Team Gleason.

Ann Paradiso: Well, the way that we first got connected with the ALS community was through Steve Gleason. Steve Gleason is… it’s an understatement to say that Steve Gleason is a force of nature. He was a professional football player for the New Orleans Saints. He’s a massive celebrity in New Orleans, and he’s been living with ALS since he was in his thirties. Steve had a professional relationship with Microsoft, and when we had our first Hackathon, he wrote a letter and he stated what he wanted Microsoft to work on in terms of his technology. He’s saying, you know, hey, I’m using your technology, and here are some opportunities for improvement. And he had great vision around this. I want to be able to move my wheelchair independently. I want to be able to argue with my wife. Some of the rumors have it he wants to be able to win arguments with his wife, which is an LOL, because she’s also a force of nature, but he wanted to be able to meaningfully interact with his wife, and he wanted to play with his son and raise his son. He wanted to be a father and a husband and a community member. He wanted things that everybody wants. And a giant team rallied together for this first Hackathon project and built a prototype of a wheelchair that could be driven with the eyes. And what happens with ALS is that, eventually, people will progress to the point where they will lose control over the muscles that allow them to move or speak or breathe. But very, very often they retain good control over their eyes. And so, we can use that. Anything that you can still move we can use, right? If you can move it, we can make a switch out of it. That’s our motto! So anyway, this team got together. They won the Hackathon and they were able to, as a result of that, sit down with Satya. I think they got an hour with him to talk about where they could potentially take it. And around that time NExT was forming, the organization that I work in, and a guy called Jay Beavers went to Peter Lee, who is the Corporate Vice President and leader of NExT, and he said, I want to take this project and make it for real. Because at the time, Steve hadn’t been able to drive it. It was a proof of concept. It wasn’t made for real. And he said, I want to make this for real. And Peter funded the team. And so, we set about taking that early prototype and making it real for Steve. In fact, Jay and I went down there and took the very first wheelchair components to him, and I was out in the yard playing with Steve’s son, and Jay was collaborating with Steve. And they were working on it and working on it. We didn’t know if it was going to work. We didn’t know if it was going to work. And Jay says, come in here. And I come in, and Steve’s got this big grin on his face, and he’s driving. He’s driving himself around, and it was just this great moment. So over time, Jay and the team and I worked with Steve, and Steve tested the wheelchair, you know, used it, beta tested it, and we continued to refine it to the point where he can use it successfully. In fact, I was just at Mardi Gras and I was fortunate enough to be in the Team Gleason Krewe. And Steve eye-drove almost the entire parade route by himself. And, you know, watching that in action and watching him using our technology was a career highlight for me. You know, the other thing that we worked with Team Gleason on was working out some of our communication-based technology.
The very first thing that I noticed when we went down to visit Steve was, we were sitting across from him, as you would, like as you and I are.

Host: Mm-hmm.

Ann Paradiso: And he’s using his eyes to type out his words, and there’s a pause. And it continues, and it continues, and we don’t know what’s going on. And, you know, we kind of get up and look around or call for a caregiver. And, oh, you know, he needs to re-calibrate, or he needs assistance, and we have no way of knowing that. The other option is standing over his shoulder and watching him type without being able to make eye contact. It’s not how humans communicate. So, every time we came home from a trip or from a PALS visit or from any of our collaborative exercises, we would sit down and have a debriefing session, and we would start to brainstorm. And we came home, and I was talking to Jon Campbell, one of the engineers on my team, saying, you know, we need some kind of a status indicator. We need some way for people to know, hey, I’m typing or I’m calibrating, or wait a minute, I’m about to speak. Because that was the other thing that would happen: we speak so much more rapidly than somebody can get words out with their eyes, and we take that for granted, right? So, you’re looking at maybe five to twenty words a minute depending on the accuracy and skill of the eye typer. So, what happens is the conversation moves on more rapidly than the person who’s using their eyes to type is able to get their thoughts out. And it creates disruptions in the flow of conversation, and you can see the effect of that on the person using their eyes to type. And so, we were thinking, you know, how could you have somebody respond quickly in a conversation without having to type things out? And one of our interns at the time, who’s now a full-time engineer here, Alex Fiannaca, started to work on that exact problem, and he started looking at ways we could have conversation partners assist with typing speeds…

Host: Interesting.

Ann Paradiso: …or ways that you could set your communication preferences, maybe say via your mobile device ahead of time. My eyes go up for yes, or I blink for no. Those are my gestures. So, if you can ask me yes or no questions, you know, that will help facilitate the conversation.

Host: Right.

Ann Paradiso: Or, you know, one of the other things that he came up with was quick reactions. We had a keyboard that we’d been prototyping and working with Steve on and working with several of our PALS on. And we added these quick reactions to it where he could, without having to type anything out, have a laugh or a sigh or a groan or some way that you can be in the conversation more naturally. And we learned that from Steve because he was doing these things. He already intuitively understood that and was coming up with these strategies. And we would watch him use those, and he would have a bunch of, you know, sort of messages or lead-ins, or natural-sounding communication segues. He would have those stored, and we watched him have these remarkably natural communications that maybe some other people, who hadn’t been doing it as long, hadn’t mastered yet. And so, we took those observations and just directly translated that into features in our keyboard.
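(A note for technically inclined readers: the three ideas Ann describes here, a visible status indicator, pre-agreed yes/no gestures, and one-glance quick reactions, are easy to picture in code. Below is a minimal sketch in Python; every name and value in it is a hypothetical stand-in for illustration, not the Enable team’s actual implementation.)

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class Status(Enum):
    """States a conversation partner needs to see at a glance."""
    IDLE = auto()
    CALIBRATING = auto()
    TYPING = auto()
    ABOUT_TO_SPEAK = auto()

@dataclass
class GesturePreferences:
    """Pre-agreed answers a partner can elicit with yes/no questions."""
    yes_gesture: str = "eyes up"
    no_gesture: str = "blink"

@dataclass
class CommBoard:
    status: Status = Status.IDLE
    prefs: GesturePreferences = field(default_factory=GesturePreferences)
    # One-glance reactions that keep the user in the conversation without
    # the five-to-twenty-words-per-minute cost of eye typing.
    quick_reactions: dict[str, str] = field(default_factory=lambda: {
        "laugh": "Ha!",
        "sigh": "*sigh*",
        "groan": "Ugh...",
    })

    def set_status(self, status: Status) -> None:
        self.status = status
        print(f"[status display] {status.name}")  # e.g. drive an LED indicator

    def react(self, name: str) -> None:
        print(f"[speech output] {self.quick_reactions[name]}")

board = CommBoard()
board.set_status(Status.TYPING)  # the partner sees "typing" and waits
board.react("laugh")             # instant participation, no typing required
print(f"Yes/no questions: {board.prefs.yes_gesture} = yes, {board.prefs.no_gesture} = no")
```

The point of the sketch is the asymmetry Ann describes: a stored reaction costs one glance, while typing the same sentiment by eye costs real time at five to twenty words a minute.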

Host: That’s amazing. That’s absolutely – I mean, what it makes me think – those three little dots that kind of…

Ann Paradiso: Yes, that’s…

Host: …circulate when you’re texting, you know, they’re typing, and they’re typing something long. I’ll wait. Or an emoji that can say…

Ann Paradiso: Yes.

Host: …with one picture, right? Is that kind of…

Ann Paradiso: That’s exactly what came out of those early discussions. We had an early prototype of this device that we called the arc reactor because it looked like the arc reactor that Iron Man used.

Host: Oh, from Iron Man.

Ann Paradiso: Because, you know, we’re nerds and we’re proud of that. And it was a prototype device and we had some other interns come in and study it, and from this early work, all of our work around expressivity and communications was born, and a whole bunch of great outcomes came out of that. But what happened is we took that arc reactor and we wanted to extend it beyond status. And it was exactly what you said with the three status dots: we had this round LED display that would basically function as that. You can take a lot of cues from text messaging.

Host: Absolutely.

Ann Paradiso: Because that levels the playing field a lot, and so we took a lot of inspiration from that, and then we extended that. We wanted to be able to express emojis. Again, from text messaging, you know, this is something that everybody knows how to do and that is a common paradigm that people understand and it’s part of our social communication now. So, at the time, we didn’t have the resolution in that device to be able to create the emojis for real. You had to be able to write code. There was no preview environment. So, it was a great proof of concept, but it wasn’t tangible yet. So, Gavin Jancke, who is one of my favorite collaborators and my former boss on the Advanced Development Team in Microsoft Research Labs, came up with this amazing authoring environment. And you can open up this environment in Windows. You can create basically anything you want to in it. You can take an animated GIF from the internet. You can drop it in and it will pixelate that and turn that into an emoji. And he worked hard on the hardware. What he did was he wrote all the firmware. He created an entirely new device, and it had the resolution that we needed to be able to show those expressive emojis and things like that. Which we then used off-label – to use a pharmaceutical term – for our Hands-Free Music Project to give a visual affordance to each individual drum, which we’ll talk about.
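(Again for the technically inclined: the core trick Ann attributes to the authoring environment, dropping in an animated GIF and pixelating it down to the device’s resolution, is essentially frame-by-frame downsampling. Here is a rough sketch of that idea in Python using the Pillow imaging library; the 16x16 matrix size and the file name are assumptions for illustration, not the Expressive Pixels device’s actual specifications.)

```python
# A rough sketch of the downsampling step: each frame of an animated GIF
# is reduced to the LED matrix's resolution. Requires the Pillow library
# (pip install Pillow). The 16x16 size and the file name are illustrative.
from PIL import Image, ImageSequence

MATRIX_SIZE = (16, 16)  # assumed LED resolution, not the real device spec

def gif_to_frames(path: str) -> list[list[tuple[int, int, int]]]:
    """Return each GIF frame as a flat list of RGB pixel tuples."""
    frames = []
    with Image.open(path) as gif:
        for frame in ImageSequence.Iterator(gif):
            # BOX resampling averages source pixels, a reasonable default
            # for shrinking a large frame onto a small LED grid.
            small = frame.convert("RGB").resize(MATRIX_SIZE, Image.Resampling.BOX)
            frames.append(list(small.getdata()))
    return frames

frames = gif_to_frames("emoji.gif")  # hypothetical input file
print(f"{len(frames)} frames of {MATRIX_SIZE[0]}x{MATRIX_SIZE[1]} pixels")
```

Each downsampled frame then becomes one page of the animation the LED display plays back.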

Host: Yeah, yeah, yeah.

Ann Paradiso: But this is sort of the nature of the nonlinear path of innovation here. Gavin just jumped in because he thought it was an interesting space. And so that was the birth of Expressive Pixels and it was all from wanting to enhance expressivity between somebody living with ALS and their families or their communities.

Host: Well, you know, the other thing that I’m thinking of is when you watch sort of remote interviews on TV and the person’s got a headset…

Ann Paradiso: Yes.

Host: And there’s that gap. We all have that uncomfortable “fill in the gap” thing. And so, what I hear you saying is that it’s trying to make it more immediate and natural?

Ann Paradiso: That gap is one of the key drivers. What we have noticed in almost every interaction we’ve either participated in or witnessed is that people in modern life are severely uncomfortable with silence. Discomfort is always a great driver for growth, and that discomfort was an amazing driver for innovation in technology.

Host: You keep referring to PALS.

Ann Paradiso: Yes.

Host: Tell us what that is. I know it’s a thing. It’s not just your buddy.

Ann Paradiso: So, it’s both. It’s one of those delightful acronyms that also means exactly what it sounds like. PALS is the acronym that the ALS community uses for people living with ALS. And CALS is an extension of that; that’s a caregiver of a person living with ALS. When I’m referring to a PALS and you hear that subject-verb disagreement, it’s technically just me using the acronym for people living with ALS.

(music plays)

Host: At heart, humans are not just survival machines. We’re expressive, creative beings for whom literature, art and music fill our souls. And disabilities don’t make us any less human. So, tell us about the project you’ve worked on to empower differently-abled humans to participate – or continue to participate – in creative expression, particularly the Hands-Free Music Project.

Ann Paradiso: The Hands-Free Music Project is very near and dear to my heart for exactly the reason you stated. For me personally, music has been such an important influence in my life. And so what happened is, early on in the process, several people had been sitting in a room having a conversation. People, experts from all over Microsoft. It wasn’t just Microsoft Research. But having a conversation about how to design this keyboard. And I remember asking, has anyone here actually spent time with people living with ALS using a keyboard or not? And one woman who had been a speech language pathologist had. I don’t think anybody else in the room had. So, we had a bunch of really smart people trying to solve a problem that they did not understand. And the first thing that you do as a user experience designer is go and understand your user base so that you fully understand the challenges and how to solve certain problems. We had had a meeting early on, and there was a guy called Mike Elliot, who is Chief Neurologist at Swedish Hospital. At the time, he was running the ALS clinic at Virginia Mason. And there was a woman, another force of nature, called Annie Eichmeyer, who was the Care Service Director of the ALS Evergreen Chapter, the local wing of the ALS Association, which is a national organization. And the local chapter’s sole purpose is to support the ALS community. They had heard about the Hackathon and had come in and asked, how can we be involved? And I said, you know, we really need to find some local PALS that would be interested in spending some time with us. Do you have any way that we could start to meet people? And they were all over it. Dr. Elliot said, please come to clinical rounds and we will find some people that might be willing to let you sit in on their doctor’s appointments. And sure enough, they found us a couple of people. And so, we got to know them and found out a lot about them personally, how they used technology, where they were struggling, you know, what do they love doing? And a lot of people had a strong relationship with music. One guy, he was in a band, and he was pretty progressed at that time. Another guy was a music teacher. I met two or three drummers. I could feel how important music was to people, and other forms of expression, too – some people enjoyed painting. Some people were writers. But they really lit up when they were talking about things that came out of that creative well in them.

Host: Right, right.

Ann Paradiso: And so, I couldn’t stop thinking about that and I came back and started talking to the team. And this is one of many things I love about the Enable team. People were coming up quickly with ideas: how do we solve this? And we started kicking ideas around, and some people were saying, you know, no. We just can’t do this. The timing problem is too hard. And I’m thinking, well, what’s the alternative? You don’t have access to creative expression? No, that can’t be possible. So, we started to talk to different people, and a lot of people were actually interested in the space. And we created a small sort of virtual team. The timing was great on it, because things were aligning. We had the right group of people at the right time to work on an early prototype. We had also consulted with Steve Gleason. He had given us some great ideas on a looping-based mechanism. He showed us one of his favorite artists who goes on stage, and she plays different instruments and records them, creates a loop, and then layers on top, and then puts her vocals on top, and we thought, we can certainly do this. And it sort of started to come together. And this fantastic researcher was visiting from CU Boulder. His name is Shaun Kane, and he said, I’ve got an idea on how we might be able to do this. I need some equipment, but I think we can make this work. A proof of concept of how we could enable somebody to play the drums or percussion with their eyes. But we really wanted to do this for Jeremy Best. Jeremy Best is the guy that I was telling you about, that I had met at clinical rounds. And I went back after our rounds, and I was chatting with Annie Eichmeyer and saying, you know, what is the deal with this guy? And they’re like, oh my gosh, we love him. He’s amazing. He was a high school teacher for twenty years. And so, I went and I found his band on Bandcamp, downloaded their CD and started to listen. We passed it around and we started talking about how we might bring Jeremy back to his music. We didn’t know if we were going to be able to pull it off.

Host: Right.

Ann Paradiso: So, we didn’t tell him right away. So, we had the drum kit. It was a kid’s drum kit. We had it under a sheet. So it was just this ridiculous lump in the middle of the table, and they roll Jeremy in and he’s looking at the lump and he’s got to go through, you know, research survey questions and do some speed tests, and, you know, he’s doing a bunch of stuff that is really important stuff that we needed for other projects. But you could see he kept on looking over there. And finally, we finish the work part and so we take the sheet off the drums. Shaun starts to explain, you know, what we’re doing, and does he want to try this? And Jeremy uses his speech device to type out, f*%k yeah! We might have to beep that out. But that’s what he said. So, we calibrate him, we bring it in. Again, we don’t know if it’s going to work or not. We’re all standing around like, please let it work. And he just starts ripping on the drums. And he’s got this huge smile on his face and it was just this great moment. And then what happened on top of it is Arturo Toledo, who’s one of the designers on our team and also a deejay, had some of his equipment in. And he starts riffing on top of Jeremy’s rhythms, and they have this moment where they’re looking at each other, and they had communed. You could feel it. And it was at that moment where we were like, we’re going to keep on working on this. This is definitely something that’s worth investing in. And that’s how that project got started.

Host: So, Ann, that’s a story that actually makes me cry as I listen to it, because I get that. You know, that’s what musicians live for is that… in rowing, they call it “swing” where you’re all together… you feel like you’re one unit, right?

Ann Paradiso: Absolutely.

Host: Um, this story has another little chapter, involving South by Southwest, the music festival. Well, it’s a tech festival, a music festival, a movie festival. It’s all that now. Tell us the story of how the Hands-Free Music Project ended up at South by Southwest and what happened once it got there.

Ann Paradiso: So picking up where we left off, where we had shown the proof of concept to Jeremy and we became energized by it and decided that was validating enough to keep the work going, we were trying to figure out, like, how can we put some cycles on this? Because we had other things that we were working on. We were working on Windows Eye Control, we were working on research, and several other priorities. How could we keep this going? And so, what ended up happening is I decided to hire an intern to come in and collaborate directly with Jeremy on Hands-Free Music. And I came across this candidate named Paul Batchelor. And I knew from the second I Skyped with him that he was the one. So, you know, we were talking and he had ideas already, and I had said, you know, I don’t have a lot of direction for you, but what I need you to do is work directly with this collaborator that we have, and see what you guys can come up with. The goal is to get him to play music again. And he was a Berklee College of Music undergrad, and he had gone to Stanford for their CCRMA Program, which is a computational music program.

Host: Right.

Ann Paradiso: And I remember asking him, you know, so you’re a musician but you’re a scientist, you know, how do you balance that? And he said, well, I identify first as a musician, second as a scientist. And that was the other criterion: we wanted somebody who was a musician who would be able to think musically about this and not just computationally about this. So, what ended up happening is he created this great interface based on a clip launcher, which is an interface that’s used sometimes in professional music composition software. Ableton Live, for example, has a clip launcher. And so, he wanted to take paradigms that musicians knew and understood and use those instead of creating a brand-new paradigm. But he used them in different ways. That project ended up being called Sound Jam. So, Paul was here for the summer. He created an interface, a novel eye-gaze-enabled interface, that had, sort of, four quadrants on it. And one quadrant was riffs, one was melody, one was percussion, and the other was sort of whatever we wanted to put in. It could be harmonies, it could be effects or whatever. And it was all generated on the fly. This was not created out of recorded wave samples or anything. He had done all of this computationally. And so, you could pick it up pretty quickly and start playing some music. And then you could lay effects over it using your eyes. And he found really interesting and novel explorations of the space. And in parallel, we were building on what is now called Sound Machine, which was code-named D Music for Dwayne Lamb, who is the lead engineer on the project. And that came out of that early work that Shaun Kane and Arturo Toledo and the team had done. And so, we were working on Sound Jam in parallel. So, we had two different explorations going at the same time. And I don’t know what made me think of this, but I had said, you guys, all right, we’re going to – I wanted to create a forcing function, or a target for people, you know – we’re going to apply to South by Southwest. If we get this prototype up and working and we can get it released, because that was one of the criteria, we’ll apply to South by Southwest. And at the time, I was also thinking, well, maybe Jeremy could play at South by Southwest. That was ambitious, you know. We didn’t know where we were going to land. And I was like, and I’ll take you all. I had NO idea how I would do this. Frankly, because I didn’t think…

Host: You’d have to.

Ann Paradiso: A) I wasn’t even sure we’d get it done. Then I didn’t think we’d actually get it – like, we had to pull a lot off just to get the submission in. And I really wanted to get the submission in. We had to create a submission video. And we had a full-sized drum kit built by the team; not just the Enable team, but Chuck Needham, Irina Spiridonova, and Gavin Jancke from the Advanced Development Team were important collaborators as well. So, everybody pulled together to get this prototype going. And we brought Jeremy in to play it, and Henry filmed it for us, and it was, I think, the day the submission was due; we had to get this done. We didn’t even know if we were going to get it working. Everything was coming down to the wire. We got the whole submission done like three minutes before the deadline. In fact, several of us had stayed late to finish it and we sort of collapsed on the floor and we were just like, if that’s all that happens, that is enough.
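(One more sketch for the technically inclined: the four-quadrant interface Ann describes a few turns back relies on dwell selection, where a gaze that rests on a region long enough counts as a press. Here is a minimal, self-contained illustration of that mechanism in Python; the grid layout, dwell threshold, and clip names are assumptions, and this is not the actual Sound Jam code.)

```python
DWELL_SECONDS = 0.8          # how long gaze must rest on a cell to "press" it
GRID = {                     # four quadrants, with hypothetical clip names
    (0, 0): "riffs", (0, 1): "melody",
    (1, 0): "percussion", (1, 1): "effects",
}

def cell_for_gaze(x: float, y: float) -> tuple[int, int]:
    """Map a normalized gaze point (0..1, 0..1) to a 2x2 grid cell."""
    col = min(int(x * 2), 1)
    row = min(int(y * 2), 1)
    return (row, col)

def run(samples) -> None:
    """Consume (timestamp, x, y) gaze samples; toggle a clip when gaze dwells."""
    current, since = None, None
    playing: set[str] = set()
    for t, x, y in samples:
        cell = cell_for_gaze(x, y)
        if cell != current:
            current, since = cell, t      # gaze moved: restart the dwell timer
        elif t - since >= DWELL_SECONDS:
            clip = GRID[cell]
            playing.symmetric_difference_update({clip})  # toggle the loop on/off
            print(f"{'start' if clip in playing else 'stop'} looping: {clip}")
            since = t                     # re-arm so the same cell can toggle again

# Simulated gaze stream: rest on the percussion quadrant for about a second.
run([(0.0, 0.2, 0.8), (0.5, 0.22, 0.81), (0.9, 0.21, 0.79)])
```

The design choice that matters is the re-arm step at the end: without it, a resting gaze would retrigger the same clip continuously, the eye-gaze equivalent of a stuck key.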

Host: Right.

Ann Paradiso: Like, we released Sound Jam open source; that was important to us. We got the very first bits for Sound Machine released, so we were able to exercise our shipping muscle. We were able to get Expressive Pixels hooked in, because we wanted a way to see, visually, what was happening on the drums, and so we were able to get that all hooked up and working. I mean, it just was crazy that we were able to pull it all together. And so then, we submitted to the Innovation Awards in two categories. They were supposed to notify us, I think, in December. The time came and went, and we never heard anything, and so I just assumed, you know, I’ve got to give the speech of, you know…

Host: Sorry, guys.

Ann Paradiso: We didn’t get in, but it’s really not the point, you know? And so, I think it was January at this time, I get this note from South by Southwest saying, you know, if you don’t accept your nominations tomorrow, we have to give them to somebody else. We had gotten into both categories.

Host: How did you not know that?

Ann Paradiso: So… I get a lot of email, and my inbox has its own event horizon, just like a black hole. And I had been checking, and what happened is my Clutter algorithm had grabbed it and put it into the Clutter folder.

Host: Bad algorithm.

Ann Paradiso: And so I had checked – well, and it was bad Ann, really. It wasn’t bad algorithm.

Host: Okay.

Ann Paradiso: That thing is usually helpful except for when it’s not. So, we scramble, accept the nomination. And then we have to get a trip planned, we have to get funding to get the whole team going. And, you know, Rico Malvar, who is our leader and just a wonderful human being…

Host: He is.

Ann Paradiso: …he and Jamie Rifley, who’s our business manager, made it happen. And we got it so that everybody who had worked on the drum kit got to go. And we’re trying to get Jeremy there, and in the end, we were not able to pull that off. And so, what we decided was, you will be there, but you’re just going to be there digitally. And so, one of the things that they require you to do at this event is to create an acceptance speech, a Tweet-length acceptance speech, in case you win. And you had to make a video as well, so we had engineers and designers pulling the video together. Everybody rolled up their sleeves, jumped in, and worked on things. Chris O’Dowd, who is our chief hardware guy and builder of everything, built a tee-shirt cannon that could be launched with the eyes. I mean, we had all kinds of cool stuff. And we went down there. And I was giving them the speech the whole time. It doesn’t matter if we win. That’s not important. What’s important is now we have more people who might be able to work in this space that are going to see it. So, we go to South by Southwest… and so then, you know, we win. We win the thing, and we can’t even believe it. We win in the music and audio innovation category. And we really didn’t think we would. We had no idea. And so, we have Jeremy on FaceTime. He’s got a speech prepared. We have Shaun Kane, who is, I talked about earlier, on my phone. And we all go up on the stage. And it’s supposed to be this Tweet-length acceptance speech, and we put Jeremy up. And so they can see him. He starts out by saying, Hi, I’m Jeremy Best coming at you from my living room in Seattle with a female synthetic voice. He’s already – just his expression and his opening – he’s already got everyone, right?

Host: Right.

Ann Paradiso: And then he says, this is so much better than public speaking. I’m not even wearing pants. Everybody cracks up. Then he goes on to deliver this gorgeous speech, which was not Tweet-length. And the guy who had handed out the awards was crying, people in the audience were crying. We realize as we’re on stage that although everyone can see Jeremy, we hadn’t reversed his camera, so he’s basically looking at the top of Chuck’s head…

Host: Oh, no.

Ann Paradiso: Same with Shaun. So, they corral us, we finish the speech. Everybody else who went overtime got the music and got kicked off the stage except for Jeremy. And so, we were able to experience that with him. And at the time, we didn’t want to overrepresent what we had. You know, I think they wanted to tell this story like, oh, he’s playing music again because of this, and that wasn’t true. We wanted to stay focused so that we could get there. But in technology, this sort of stuff takes time. It takes way longer than you think. So, we focused the next year. We just kept on going to Jeremy’s and we kept on working on it. And we finally got to a point where he had created his first composition. And it was just a great moment. And so, we have so much more to do with that, but we’ve already found other applications that we can do now.

(music plays)

Host: Ann, it’s about this time in the podcast where I ask researchers the famous “what keeps you up at night?” question. And I do it because I want to know how researchers are facing and addressing the idea of consequences, intended and otherwise, that come with all these new technologies. So, given the scope of what you’re doing and the world that you’re doing it in with the Enable team, is there anything that keeps you up at night?

Ann Paradiso: Absolutely. What keeps me up at night is knowing we don’t yet have a cure for ALS. I’ve seen the suffering that ALS causes and the impact it has on families, and it is profoundly unfair. And I’ve also seen, firsthand, the effect that technology can have restoring people’s purpose in life, whether it’s a creative purpose, or whether it’s raising your children, or whether it’s working or doing your taxes or buying a car. These are all things that our collaborators wanted to be able to do after ALS had taken their speech and their movement. We have this technology that we know can help, and we still have a barrier connecting that technology to the people who need it. We’ve made great strides with Windows Eye Control, again, Harish Kulkarni and the Windows Input team have put so much heart into this, and that’s the first step. But there’s so much more that needs to happen. And that keeps me up. We started the PALS program with ten collaborators, and nine of them have passed away since we’ve been working with them, people that we love, that we’ve become very close with. And that keeps me up at night. And it will continue to keep me up until either there’s a cure for ALS or we find a way to continue this work and to take it to scale where we really need to take it, not just in the United States, but outside of the United States where people have even fewer resources and less access. So that keeps me up.

Host: Ann, I love stories with an unconventional twist. Um, they’re so much more interesting, right? And you have one of those stories in that you’re part of a relatively small group of researchers here at MSR that didn’t arrive with a PhD in a suitcase. And your education path isn’t necessarily what people would have thought, hey, this woman is going to run the joint on user experience design for Enable at Microsoft Research. How did you get here?

Ann Paradiso: That’s a great question. I feel like it was – it was an alternate path. And you’re right. As my friend James Mickens says, I did not go to the 25th grade. I had an unconventional upbringing. I struggled a lot in my early years. I was just dysregulated in general, and I didn’t have some of the skills that I have now. But what I did have was champions. I had champions every single step of the way. I was in my community college, which was really my only choice after, you know, that…

Host: Flaming high school career.

Ann Paradiso: …spectacular, yeah, spectacular high school performance. And I was struggling there, too. I had been living on my own. I had other issues that I was dealing with. And I was sitting in my philosophy class one day, and my philosophy teacher, Blair Morrissey – I still remember his name – asked me to stay after class. And he asked me, do you want to go on with your education? Yes, I do. He put me in touch with the dean. We created a plan. I started to take things more seriously and become accountable to myself. And then I went away to university, and I did fine there, I did great there. So, I got it together. I got it together late, but I got it together. I was an English major. That was something that I just loved. And I started my career as an editor. And my husband and I wanted to move out west, and we kind of came out to Seattle with two dogs and no jobs. We didn’t know anyone. I started looking for work and so on a whim, I applied for a technical writing position at Microsoft. And I got there, and I didn’t actually know what I was interviewing for. My interview was nine hours long! I didn’t understand it. And I called this, this woman that I knew who worked at Microsoft, and I said, I think it was terrible. Like, it was nine hours long. And she said, no, it went well! You went all the way through. And right around the same time, I had interviewed for a startup. It was during the big startup dot.com boom.

Host: Yeah, yeah.

Ann Paradiso: And I had gotten an offer there, and I took that job. It was a PM job and I worked at that job for eleven months, and then that whole company just tanked with most of the other startups, and we all got brought into a room and they said, we’re done, we don’t have any more money. Everyone’s laid off. We all go to the bar… And I called the Microsoft recruiter a couple days later, and she said, you know, why don’t you come back in and just meet the new director? You don’t have to go through a new loop or anything. Just come meet this group. And I did, and I was a technical writer for three weeks and they re-orged, and then I became basically a web producer. I worked in the Office group for four years, and then there was a position available at Microsoft Research for a website manager. And I came in and started that way. And we built a new web platform and, you know, that was fun and interesting work. I built up a team. I started managing designers. And then, when we shipped that platform, I started to have opportunities to work on interesting and cool research projects, and that was so great. And I knew, I was like, this is what I want to do. And I got to work on some of the best projects. I got to work on Micro Productivity with Jaime Teevan and I got to work on Pocket Skills with Mary Czerwinski, and I just got to really stretch. And I found the research work so much more interesting. And that was just a great fit for me. And so, I did that. I worked on the Advanced Development Team for eleven years and loved it and would probably never have left had it not been for this opportunity… I was seeking a more meaningful connection. And I was talking with Peter Lee, who was just forming NExT, and he mentioned this team forming around people with disabilities, where you can work directly with the users. And that was it for me. I said, that’s what I want to do.

Host: As we close, Ann, what advice or words or wisdom could you give the next generation of researchers who might be interested in making interaction design and user experiences for anyone and everyone better?

Ann Paradiso: I would say, first and foremost, follow your gut. Be honest with yourself. Learn from your mistakes. Be fearless and care about something deeply, and if you find something that you care about deeply, stay with that. Things tend to work out that way. They really do. And, you know, there are opportunities in tech for people that do come from unconventional or non-traditional backgrounds. And there’s great, interesting, meaty work. There’s great problem solving. Multi-modal interaction is such a cool and interesting space, and there’s so much room for invention there. So, stay with that. Figure out what your north star is and stay focused on that at the expense of all things, and everything else will become noise. And that is also what will give you the strength and the resilience you need to overcome the setbacks, because this kind of work comes with a lot of setbacks and you have to fight a lot of battles. So yeah, be a bulldog.

Host: Ann Paradiso, thank you for coming in today. It’s been awesome.

Ann Paradiso: Oh gosh, it’s been my pleasure. Thank you so much for having me.

(music plays)

To learn more about Ann Paradiso and how researchers are bringing innovative interaction design to people with disabilities, visit Microsoft.com/research

