Data science and ML for human well-being with Jina Suh

Episode 93, October 9, 2019

Using technology to help us improve our health is nothing new: a quick web search returns hundreds of apps and devices claiming to help us get fit, quit smoking, master anxiety or just “find our center.” What is new is a serious cohort of researchers exploring how artificial emotional intelligence, or AEI, could help us understand ourselves better and, when used in concert with human caregivers, enhance our well-being. One of those researchers is Jina Suh, a former Xbox developer who got hooked on research and is now an RSDE in the Human Understanding and Empathy group at MSR, as well as a PhD student in computer science at the University of Washington.

On today’s podcast, Jina shares her passion for creating technologies that promote emotional resilience and mental health; gives us an inside look at an innovative research collaboration that aims to improve collaborative care for cancer patients with depression; and tells us an emotional story of how, on the brink of quitting her job, she found inspiration to get back in the game and begin a new career in research for human well-being.

Transcript

Jina Suh: There’s a huge opportunity, an unmet need, of our technology being more understanding of us and us being more understanding of what the capabilities of the technology are so we can have this rich, meaningful experience, collaborating with the technologies and working with the technologies. So in that sense, making our technology more emotionally aware is a key part in making that experience meaningful.

Host: You’re listening to the Microsoft Research Podcast, a show that brings you closer to the cutting-edge of technology research and the scientists behind it. I’m your host, Gretchen Huizinga.

Host: Using technology to help us improve our health is nothing new: a quick web search returns hundreds of apps and devices claiming to help us get fit, quit smoking, master anxiety or just “find our center.” What is new is a serious cohort of researchers exploring how artificial emotional intelligence, or AEI, could help us understand ourselves better and, when used in concert with human caregivers, enhance our well-being. One of those researchers is Jina Suh, a former Xbox developer who got hooked on research and is now an RSDE in the Human Understanding and Empathy group at MSR, as well as a PhD student in computer science at the University of Washington.

On today’s podcast, Jina shares her passion for creating technologies that promote emotional resilience and mental health; gives us an inside look at an innovative research collaboration that aims to improve collaborative care for cancer patients with depression; and tells us an emotional story of how, on the brink of quitting her job, she found inspiration to get back in the game and begin a new career in research for human well-being. That and much more on this episode of the Microsoft Research Podcast.

(music plays)

Host: Jina Suh, welcome to the podcast.

Jina Suh: Thank you.

Host: You work in an interesting space, Jina. Your title is Research Software Development Engineer. You’re currently working in a group called HUE, or the Human Understanding and Empathy group at MSR, and you’re beginning your doctoral studies in computer science at the University of Washington. In broad strokes, tell us what motivates you to do the work you do and what gets you up in the morning.

Jina Suh: What gets me up in the morning is actually an alarm clock! No, but really it’s my daughter. She’s really my inspiration. I get up to do what I set out to do because I want to make the world a better place for her. I want to show that, you know, being a working mom is hard, but it’s really rewarding, and that learning never stops. So I want to show her that you can redefine yourself at any age, under any circumstances. So, I do work in technology for human understanding and well-being. So, I think in terms of mental health and human well-being, I think all of us, at some point, go through experiences where we feel like we’re not in control of our emotions or our lives, we go through very stressful moments, we go through very sad moments. I think there’s a huge opportunity for us to develop technologies that help us understand, you know, who we are, what are the feelings that we’re having, and really help us reflect and be the better version of ourselves.

Host: Where is it that we would need technology to help us be “better versions of ourselves” or to overcome obstacles in our emotional and mental health?

Jina Suh: I think, in a lot of ways, we have the capability to kind of look at what we’ve done and reflect on our past and figure out why we’re feeling certain ways, but that takes effort and motivation, and training and practice, too. So a lot of people need professional help through, you know, therapy, to develop the skills in order to reflect, but I think with technology, especially multimodal sensing technologies, we have the opportunity to bring information to the person and help them look at their behaviors in ways that human memories cannot fathom, right?

Host: In some ways, what I’m hearing you say is that, if we can set a timer or an alarm for some physical thing we have to do, why couldn’t we have technology prompt us on things that we have to do for our mental health?

Jina Suh: Right, and a lot of our mental health influences our productivity and our work and our thinking and our decision making. So I think it’s absolutely necessary for us to schedule time for ourselves to reflect, and also practice the skills that we need to reflect and be mindful, just like we practice for work and for exercise.

Host: Well, we’ll get to the specifics on that in a second, but let’s go back upstream a little bit. You have a dev background, but you say now you only code when you need to. When do you need to?

Jina Suh: Well, when I have an idea!

Host: Okay, cool, I was wondering about that! You’ve also done some work in machine learning and HCI and you say you’re a proud alumna of Patrice Simard’s Machine Teaching group. Tell us a little bit about your previous work and how it’s informed the work that you do now.

Jina Suh: So I started out as a developer in Xbox. I worked in Xbox for seven-plus years and I always admired the design team and loved working with them, and, you know, tried to understand the rationale behind their design choices, and I wanted to do more high-level thinking, you know, be closer to our customers and, you know, build things that are more meaningful and have genuine reasons why we’re building them. So, I stumbled across something called HCI, Human Computer Interaction, or HCD, Human Centered Design, and at the same time I stumbled across this Machine Teaching group, which claimed to have this focus on, like, studying the people interacting with machines. And you know, when I started working in the group, I also started my part-time master’s program at the University of Washington in Human Centered Design and Engineering. So, at first, I was just helping the group build their machine teaching tool, and then I got to help build prototypes for user studies, and then help run the studies and, eventually, design studies using the knowledge that I was gaining from my master’s and also through work, and I just fell in love with the science of it all and really observing people using the tools in the way that you hypothesize so…

Host: Yeah.

Jina Suh: … slowly but surely I started doing more independent research, forming my own projects, and contributing to research, and I’m pretty much hooked now! I am a firm believer in the machine teaching principles, you know, this idea of empowering the human teacher. And I believe I can take those learnings and apply them in other domains.

Host: Right, and some exciting ones too.

Jina Suh: Mmm-hmm.

Host: I want to talk a bit more in depth about this Human Understanding and Empathy, or HUE, group. Tell us in more detail what’s going on in HUE and who’s doing what right now.

Jina Suh: The mission of the HUE team is to really empower people by creating and inventing new technologies that promote emotional resilience and well-being. It’s really grounded in the fact that emotions are fundamental to human interactions and they influence everything that we do, starting from learning, memory, decision making and all these other aspects of our lives. So, you know, how do we bring emotional intelligence to technology is kind of the core of our research.

Host: Tell me a little bit more about this idea of emotional intelligence and emotional resilience because this isn’t something we normally associate with computer science, especially in the research areas.

Jina Suh: As humans, we actually generate a lot of data about how we’re feeling or what we’re thinking, we have body language, we have the way that we speak, kind of the faces that we make. It’s really difficult to process all of that data all at once. So we need the help from computers and technology to not only capture all of that information, but also help us make sense of the data by analyzing the information.

Host: Okay.

Jina Suh: Computers have become ubiquitous in our lives and we expect more meaningful interactions with our technologies, and we want our technologies to understand us in some sense. And I think there’s a huge opportunity, an unmet need, of our technology being more understanding of us and us being more understanding of what the capabilities of the technology are so we can have this rich, meaningful experience, collaborating with the technologies and working with the technologies. So in that sense, making our technology more emotionally aware is a key part in making that experience meaningful.

Host: So there’s a lot to unpack there in terms of how you translate human emotions into machine language, and we’ll get there in a second, but I want to go back to this idea of human centered design. To me, that implies there’s been a significant amount of design that wasn’t human centered. What can we do to change that?

Jina Suh: So I think we miss the mark in two ways. One is that we often simplify humans because it makes the problem more elegant and easy to solve. And we see this happening in machine learning research, you know, humans are mindless, label-generating oracles. You know, if you collect enough labels on whether the person in this photo is happy or not, we can build emotionally intelligent models, um, but the truth is humans are a lot more complex than that, and we tend to ignore the things that are really difficult. So, you know, I talk to a lot of data scientists about bias in machine learning, you know, everybody knows that bias is a real problem but, because it’s hard to tackle, and it’s not their area of expertise, you know, it’s ignored or deprioritized. So that’s one thing. Another thing is that, you know, we give ourselves too much credit. You know, we assume that we know the problem and jump to solutions too quickly. We say to ourselves that users don’t know what they need if you ask them. You know, I don’t know if you know the famous quote by Henry Ford: “If you asked people what they needed, they’d say a faster horse.” Umm… So I think if you do your user research right, you’ll not only find the right problems but also, you know, through this iterative design process, you’ll discover the right solution, that is, the automobile! Not only that, you will also be able to contextualize your solution in more detail in real-life situations. And, you know, the best part of my job is, actually, talking to users about their problems and hearing their frustrations and complaints. And really, the hardest part is not jumping to solutions, and it takes effort and discipline to stay true to your users’ problems, and we often ignore this step, so talk to the users!

Host: When we talked about human health and wellbeing, we talked about some of the problems human practitioners face, and you mentioned two specific ways that technology might help us attack these problems. And they were sensing and intervention. Could you unpack these concepts for us and tell us how they might play out in the human and technical worlds? How might technology augment and fill in human gaps here?

Jina Suh: I want to use productivity as an example since we are a productivity company!

Host: Right.

Jina Suh: So, you know, people go through a wide range of emotions throughout their workday and we have certain events or experiences, maybe a bad meeting, which could trigger negative emotional states like anxiety and stress, and that leads to unhealthy behaviors or loss of productivity or even poor decision making. So, you know, the first step of coping with negative emotional states like stress would be awareness, which not only involves understanding the triggers that led to this stress but also reflecting on how your stress is presented in your behavior. So that’s when sensing comes in, because it’s difficult for people to associate these triggers with their behaviors and the emotions. So multimodal sensing assists people with the collection of contextual data and it might actually help you learn salient features and patterns that actually correlate the stress and negative emotional states to your behaviors, right? So that’s the multimodal sensing part. And then there’s the intervention part where we want to help people make sense of their behaviors and actually do something about it. Perhaps interact with the sensing technology to tweak it, personalize it, you know, which is an important machine teaching moment, and work with this emotionally intelligent agent to maybe surface interventions at the right moments. You know, so you can imagine having a just-in-time cognitive reframing for some people. This could be after-the-fact reflection for some other people. Studies have shown that the best time to practice some of these interventions is when the person is at his or her best self…

Host: Hmmm!

Jina Suh: …so providing the right intervention at the right time is key for sustaining engagement in mental health care.

Host: Well, let me interrupt here. What might that look like, a just-in-time intervention?

Jina Suh: Mmm.

Host: If I’m stressed, and my device, whatever it is, that senses it, alerts me and proposes that I reflect and so on, do I have to wait till at my “better self” before I can intervene on myself, as it were?

Jina Suh: I think that’s where the personalization comes in. Everybody’s different. So imagine you have anger management problems.

Host: I… I don’t have to imagine…!

Jina Suh: And some intelligent thing tells you, hey, you’re about to have an angry bout… you would be even more angry! Or, some people want to get interventions. Some people want others to tell them what to do. We have done studies in the past where we’ve analyzed different types of people, what kinds of control they want to have. So people who are more, you know, quote-unquote control freaks…

Host: Yeah.

Jina Suh: …are less likely to want to engage in these just-in-time behaviors, and people who are less control freaks are more likely. So we really need to personalize and really understand the person and be able to give the control to the person to adjust and tweak these settings.

(music plays)

Host: I want to talk about the collaboration between the University of Washington and Microsoft Research. I know you’ve got a foot in both places. So we just talked about bridging gaps between humans and machines and how one can complement the other, but there are also gaps that exist between people. Talk about why it’s important to have both medical and technical experts working together in this hybrid space.

Jina Suh: We try to make meaningful contributions to the space of human well-being and, hopefully, make notable differences in mental health. We really need expertise in clinical psychology, but because we don’t have the expertise in clinical psychology in-house, we have to work with people from outside of our company. So, our Pocket Skills, which is a mobile app that we developed, that Mary Czerwinski, our manager, spearheaded – we collaborated with Marsha Linehan, who is the creator of and expert in Dialectical Behavior Therapy, DBT, and Chelsea Wilks, who is a clinical psychologist – we collaborated with them to translate this proven, in-session therapy technique to a digital format in a mobile app. And currently, I’m working with University of Washington Psychiatry and Behavioral Sciences faculty to develop technologies that we’re actually going to build and test for clinical efficacy. So again, without the experts in these fields, we are not able to truly validate the efficacy of these technologies. I think there is a significant community within the medical field where they develop these apps, and they actually do test for clinical efficacy, but on the technology side, I think we have an opportunity to inform the design and the implementation of some of the technologies in the medical field, and we also have an opportunity to gain the knowledge and expertise that medical specialists have and bring it to the technologists so that we actually do build things that are useful.

Host: Let’s talk about a specific project you’re working on called – are you ready? – Designing Technology Enhanced Collaborative Care for Cancer Patients with Depression. Usually long titles like that have some clever acronym but no, this one doesn’t even make a word… I don’t know, do you have a nickname for it?

Jina Suh: No. It’s really hard!

Host: Someone’s got to figure a name out…

Jina Suh: I know.

Host: …that rolls off the tongue. In any case, I want you to talk about this research in detail because our listeners may not even know much about the topic, let alone the technical approach to helping with the intervention. So what are some of the underlying problems in what you call collaborative care, and why do you think technology could help?

Jina Suh: So, I think I need to explain what collaborative care is first. So a collaborative care model is one implementation model where, basically, behavioral healthcare or mental healthcare is tightly integrated into whatever medical care setting you’re in. So it could be primary care, and in my case it’s cancer care. The idea is that, you know, cancer patients do develop mental health disorders because they’re going through tremendous stress. If the medical team discovers that you need this additional mental health support because you have depression, anxiety and other things, they engage what they call care coordinators, or care managers, to help you treat your psycho-social needs. And you have psychiatrists if you need additional, you know, medication support. Or if you develop a mental health disorder that’s complex, you involve psychiatrists who can not only consult on the cases but also provide medication prescription support. So it’s this idea of putting the patient at the center, and having the oncologist, or providers or physicians, work with your care providers who deliver the day-to-day behavioral health care and the psychiatrist who is giving guidance on medication and prescriptions. So that’s collaborative care, in a nutshell.

Host: Yeah.

Jina Suh: And these care managers, they’re often social workers, and not only are they supposed to give you this behavioral health care – whether it’s, you know, BA, behavioral activation, or CBT, cognitive behavioral therapy, or whatever these interventions and treatments are – not only are they tasked to do that, they’re also tasked to do these navigational support things. You know, financial aid, making sure you have housing, making sure you’re taking your medication, making sure you can get to the cancer center and get the care and get the infusion and treatment and whatnot… so they’re overwhelmed. They’re truly overwhelmed. So, what are some ways that we can reduce the workload of these social workers? So one of the things that we noticed is that these social workers are having to do a lot of things during their sessions with their patients, including things like administering validated mental health scales – so the PHQ-9, which measures depression, and the GAD-7, which measures anxiety – and that takes away ten minutes out of a thirty-minute session. So, one-third of your session is spent on administering this thing. So, you know, you could do it outside of the session. You could do it in the waiting room, that’s great, but, in reality, these patients come in and out of the clinics almost daily because they’re getting infusions, so they’re really burned out and they don’t want to see another person, they just want to go home!

Host: Right, and sleep!

Jina Suh: So, you know, adding this additional burden, like you have to stay in the waiting room for this much longer to fill out this form, or sometimes they cancel appointments and reschedule appointments because they’re just burned out… What are ways to help these patients engage in mental health care on their own, you know, sometimes without having to come to the clinic? And then, also, help social workers be less burdened with the workload that they have. So, one of the things that we could do is, you know, offline remote assessment, so have technology that pings the patients once in a while and measures their depression and anxiety, and when that scale goes up, you know, if you measure high depression all of a sudden, bring them in for care.
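As a rough illustration of that remote-assessment idea, the sketch below scores a PHQ-9 questionnaire (nine items, each answered 0–3, for a total of 0–27) and flags a patient for care-manager follow-up when the total crosses a cutoff or rises sharply between check-ins. The cutoff of 10 (commonly read as at least moderate depression) and the “sudden rise” rule are illustrative assumptions only; in any real system the thresholds and the response to a flag would be set by the clinical team, not by the app.

```python
from typing import Sequence

def score_phq9(responses: Sequence[int]) -> int:
    """Sum the nine PHQ-9 items (each answered 0-3); the total ranges 0-27."""
    if len(responses) != 9 or any(r not in (0, 1, 2, 3) for r in responses):
        raise ValueError("PHQ-9 expects nine responses, each scored 0-3")
    return sum(responses)

def needs_follow_up(current_total: int, previous_total: int,
                    absolute_cutoff: int = 10, jump: int = 5) -> bool:
    """Flag a patient for outreach by the care manager.

    Both the absolute cutoff and the 'sudden rise' rule are hypothetical,
    illustrative choices, not clinical guidance.
    """
    return current_total >= absolute_cutoff or (current_total - previous_total) >= jump

# Example: a remote check-in where the score rose sharply since the last visit
today = score_phq9([2, 1, 2, 1, 1, 2, 1, 1, 1])                # total = 12
print(needs_follow_up(current_total=today, previous_total=5))  # True -> bring them in for care
```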

Host: Is that self-reporting then?

Jina Suh: Mmm-hmm.

Host: Okay.

Jina Suh: Yeah. Most of these scales are done as a self-reporting tool anyway. They get handed a piece of paper in the…

Host: But sometimes it’s in a supervised setting.

Jina Suh: Mmm-hmm.

Host: And my question would be, if I’m depressed and I’m alone, am I just going to hit the snooze button?

Jina Suh: Right, and that already happens, even without technology, right? A lot of the patients don’t want to come into the clinic. Especially when they’re depressed, they don’t want to engage. So that’s already a challenge, but most of the time, these patients are sitting in their bed using their phones or watching TV. Using these opportunities to bring up, you know, hey, it’s really important to know where you are in terms of your depression and anxiety scale, like, do you want to self-assess right now? Or, hey, it looks like you’ve been sitting in your bed for a long time. You need to go out and take a walk. Let’s go do that. One of the things that we discovered that patients are missing in this traditional mental health care sense is that, because social work in cancer settings isn’t billable, there’s no formal scheduling for them, which means they don’t get to go through the standard process that other providers do. Basically, they don’t get treated as an official provider, which means there’s no after-visit summary like the one you get when you go see your primary physician, right? You get this, like, little printed-out paper that says you’re supposed to do this, go take these medications… These social workers don’t get to do that because they don’t even have a scheduler.

Host: Right.

Jina Suh: So a lot of times, what you end up discussing during the sessions is forgotten, or it’s written on a sticky note and lost. But we also know that people carry around their cellphones all the time. So if they could have the after-visit summaries and notes and their action items – things that they are supposed to do – and if we can have technology that nudges them in the right way to actually take those actions and also help them remember what they’re supposed to do, because it’s written down and it’s recorded in your app, that’s a huge area of opportunity that is not there currently.
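A hypothetical sketch of that after-visit-summary idea: action items from a session are stored with due dates so an app could surface reminders instead of leaving them on a sticky note. The data shapes, the example tasks, and the one-day nudge window are assumptions for illustration, not a description of any existing tool.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class ActionItem:
    """One task from an after-visit summary, e.g. 'practice a breathing exercise daily'."""
    description: str
    due: date
    done: bool = False

@dataclass
class AfterVisitSummary:
    visit_date: date
    notes: str
    action_items: list = field(default_factory=list)

    def items_needing_nudge(self, today: date, lead_days: int = 1):
        """Return unfinished items due within `lead_days`, so the app can remind the patient."""
        return [i for i in self.action_items
                if not i.done and i.due <= today + timedelta(days=lead_days)]

summary = AfterVisitSummary(
    visit_date=date(2019, 10, 1),
    notes="Discussed behavioral activation; schedule one pleasant activity per day.",
    action_items=[ActionItem("Take a 15-minute walk", due=date(2019, 10, 2)),
                  ActionItem("Fill out PHQ-9 check-in", due=date(2019, 10, 8))],
)
print([i.description for i in summary.items_needing_nudge(today=date(2019, 10, 2))])
# -> ['Take a 15-minute walk']
```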

(music plays)

Host: We’re at the point in the podcast where I ask what could possibly go wrong. I’m imagining, as we talk, that I’m a cancer patient, and depressed, and anyone who’s ever struggled with depression for any reason knows how difficult it is to do anything, even if you’re reminded, you know? At what point, if I have my phone telling me something, is there a handoff where there’s a human? Because I might just ignore my machine, but somebody who hugs me, or somebody who’s talking to me face-to-face and says, “Gretchen, I’m here for you. We can do this together.” Is there a bridge to the human through what you’re talking about in your interventions?

Jina Suh: Right. What we’re trying to do is not replace humans in any way, but augment the human capabilities. I think the social workers are already overloaded. So if we can reduce their workload by making their work a lot more efficient, now they have the opportunity to give a call to their patients and say, hey, how are you doing? One of the things that we discovered is, a lot of the patients go through these crisis moments and they want to talk to somebody. They’re not suicidal, so they don’t see the need to call the suicide hotline, but they’re just going through a hard time. And we ask them, you know, how can we help support that? And most of the time people said, I just want to text with my social worker. I just want to text with somebody that I know, who understands my situation. So we’re trying to brainstorm ideas around, how do we make that possible? And also understand that social workers are burdened, and they have a life, too, outside of work…

Host: They might need the app as well!

Jina Suh: …you know crises don’t happen between the nine to five, right?

Host: Right.

Jina Suh: The business hours… so how do we make that possible by balancing the needs of both individuals? It’s a really tough problem. But I think there’s a way, you know? Obviously reducing the workload of the social workers and having them have more time to engage is one thing. And giving the patients opportunity, and maybe a social group and support network, to talk about their issues would be another way.

Host: Mmm-hmm. Well let’s go in a little further on the “what keeps you up at night?” Even as we’ve talked, you’ve triggered some thoughts in my mind. Number one is the privacy issues that are involved, especially when they deal with my mental health, so that’s one area I’d like you to talk about. And the other one is the shift towards machines and devices to meet our needs rather than humans. I know that I’ve just unloaded a bunch of thoughts. Unload back on me!

Jina Suh: I think, again, I want to stress that we’re trying to take advantage of technology as a mediator to help connect people as opposed to replace the human-to-human connection. So, the metric that I would probably optimize for is not, how often does this patient use my technology, but how often does this person engage in the care and actually engage with the social worker? So, you know, what keeps me up at night is, do I have the right objectives, and am I actually meeting the goals of those objectives? In terms of the privacy, yeah, I think one thing that people assume is, you have the sensing technology, it’s creepy, you know, it’s watching me all the time, but you also have to understand, we’re not forcing you to have this. We want you to use this technology for yourself, for your own good, so if you’re not comfortable, don’t do it! But if you see the need and if you see the benefit of this technology in helping you remember what happened, helping you reflect, helping you get out of that tough situation, then by all means, why wouldn’t you use it? What I’m trying to get at is that there are some points in your therapy session where your social worker or your therapist will ask, you know, what happened. And you may realize that you can’t remember, you can’t recall. That makes the session less efficient. I think there are ways to negotiate this boundary that you have of your comfort level and that negotiation has to happen by you realizing, through the help of your social worker and therapist, where’s the line that you want to draw and what’s the need that you have?

Host: To me it all goes back to settings, you know? I’m going to pick the strength of intervention or sensing that I want.

Jina Suh: Yeah. But you as a patient and you as a user sometimes don’t know where that setting has to be and it’s really the therapist or the social worker that helps you decide, okay, this data is really important because with this data I’m going to make this clinical adjustment, and that’s a buy-in process.

Host: Sure.

Jina Suh: The social worker has to convince you that that data is important for clinical reasons.

Host: You know, that’s a super good point because there might be people that really, really need it and say, I don’t need it, and that’s where you’re looking at them going, yeah, you do. Let’s work on getting you to the point where you can embrace it. Who’s your audience for this? Is this technology really for the social worker and the therapist, not so much the oncologist or the primary care physician?

Jina Suh: Right now we’re targeting just the social workers and the therapists and the care managers because they’re the ones that know everything. They’re the go-to person for the patient…

Host: Sort of the hub…

Jina Suh: …they’re the hub of all the information.

Host: Okay. It’s story time, here on the Microsoft Research Podcast. We know about the work you’re doing but not much about you, as a person, except the fact that you’re, like, doing a lot of work and you’ve had a lot of experiences already in your fairly short life. What’s your background? How did young Jina Suh get involved in high tech and how did she end up at Microsoft Research?

Jina Suh: I studied physics as an undergrad, and astronomy. I did research in astrophysics. I got involved in particle physics and I aspired to be an experimental particle physicist. I worked at CERN for a summer… So I went on to get my PhD in physics, and felt like I was making kind of a very abstract contribution which, if anybody’s studying physics, like, by all means, go for it! But for me, personally, I felt like I needed to do something that was directly impactful to people. So I started looking around for jobs, and there are only two kinds of jobs that you could get as a physics graduate, which are quantitative trading in finance or some sort of data science or, you know, computer science job in tech.

Host: Yeah.

Jina Suh: So I got an internship at Microsoft, I fell in love with it, and I just decided, I’m going to pursue my career in coding. I don’t have a computer science background at all. The only class that I took was an intro to C, and, you know, there I learned how to manipulate some memory and arrays and things like that, but I never really knew how to code, so I picked all of that up at work! Somehow I started as an SDET. I started as a tester and then I moved my way through becoming a developer… so all of that happened while I was working in Xbox. And this was during the launch of Xbox One. And I don’t know if you understand the pressure that you feel when you’re not only shipping software, but you’re also shipping brand new hardware. We were working day and night, over the weekends, and I had a 4-year-old at the time, and I asked myself, what am I doing here? I have a 4-year-old at home! So right after the Xbox One release, I had decided to quit my job. I wanted to become a stay-at-home mom and dedicate myself to my child. And I had an opportunity to give a talk on behalf of somebody at Grace Hopper, and I was like, okay, I’m just going to go do it, and, you know, I told myself I’m going to quit my job. After this conference, I’m going to tell my manager. At the time, the keynote speaker was Sheryl Sandberg and I had heard about Lean In. I didn’t read it. But I figured, not only is she the keynote speaker – I was also invited to a breakfast with her – so I’d better know this person! So, my husband put Lean In on the Kindle. I started reading it from the airport. I read it on the plane. I read it at the hotel when I got there. I stayed up all night reading the book just crying my eyes out. Crying my eyes out. Why am I deciding to quit my job? I can’t do this! I can’t add to that number of women who gave up. So that gave me a lot of hope to, you know, pick myself back up, get back on the field. So, that’s when I started looking into other job opportunities. I needed to get out of what I was doing, I needed to do something that’s meaningful, something that has good work-life balance! But also, you know, something I can be passionate about, and I looked towards Microsoft Research because somebody else told me, oh, yeah, there are developer positions at Microsoft Research! Really? I didn’t know that! So I applied. I had great interviews. I fell in love with the people that I was talking with – with the project, the idea of helping people, empowering people, developing these technologies, studying people – I just fell in love with it all. And that’s how I got started in Microsoft Research.

Host: As we close, I always ask my guests to give us some parting thoughts and I usually ask it in terms of advice to our listeners, but I want to tweak the question and ask it less as advice, and more as, what excites you most about where you’re heading with research? What technology wave do you want to catch and ride?

Jina Suh: So, right now, there are a lot of changes that are happening in mental health. You know, we’re putting mental health as the number one epidemic, people are not afraid to talk about it… you know, we see the epidemic in teens, they’re becoming more depressed. Right now we’re seeing changes in policy, we’re seeing changes to healthcare, in terms of mental health. So I think this is the right moment to really get into the space of mental health and have that conversation. We need to have that conversation now, so that it’s not too late. Ten years from now, we’ll have this emotionally intelligent agent, you know, and we’ll have to figure out, like, what does it mean? How does it impact our lives? If we get there at the very beginning of it, I think we will evolve with the rest of the world, we will evolve with the policy changes that are happening, the insurance changes that are happening, the attitude changes and stigma changes that are happening. So I want to be part of that conversation. Now!

Host: Jina Suh, thank you so much for joining us today. It’s been delightful.

Jina Suh: Thank you for having me.

(music plays)

To learn more about Jina Suh, and the latest in technologies for human health and well-being, visit Microsoft.com/research
