In a world of natural language processing and artificially intelligent chatbot therapists, the concept of technology in the social work space can seem intimidating. However, if nearly two years of Zoom and virtual life have taught us anything, it’s that tech is here to stay. How do social workers plan to adapt?
The majority of social work may have some catching up to do as it welcomes new technologies, but this innovation has taken place on the fringe of the profession for decades. As telehealth and data science impact both micro and macro career paths, digital fluency becomes increasingly vital to ongoing success for current and future change-makers everywhere.
Luckily, GSS has faculty members embracing digital practices, and exploring all the ways technology will make social workers’ lives easier and quality of care better. And when that happens, social work will expand its seat at the table on major issues — further asserting its importance to society.
Associate Professor Lauri Goldkind and Assistant Professor Liz Matthews are two such scholars inspiring the new wave of tech-driven social work. Goldkind serves as the Editor-in-Chief of the Journal of Technology in Human Services, and was co-editor for the book Digital Social Work: Tools for Practice with Individuals, Organizations, and Communities. Matthews was recently awarded the Robert Wood Johnson Foundation Health Data for Action Award (HD4A), and is currently collaborating with NYU Silver School of Social Work Professor Victoria Stanhope on a new project using Natural Language Processing to examine Collaborative Documentation.
The three of us sat down (virtually, appropriately enough) to speak about what they’re working on in the space; the state of digital tools and innovations in social work; and if social workers will not only use technology, but actually influence Big Tech.
How did you both get interested in the technology space? Did you see it as a way to get out in front of innovation in social work, or was it a lifelong interest that just happened to coincide with the field?
Liz Matthews: For me, it kind of came out of my work as a clinical social worker. So, I started my career in an integrated care setting. I was working in a health care center in the behavioral health department. And it just became really clear to me as I was trying to use an EHR (electronic health record) system, and use technology and the information it provides to deliver and inform my care, that there was just a lot of room for improvement. The systems were really limited and a bit misaligned with how we were trying to provide care in a really interdisciplinary and data-driven way.
Lauri Goldkind: I was always the person, independent of what my job was, charged with trying to figure out the data and make sense of the data and present it in an interesting way. I also built, in my time, a lot of hacked-together databases — meaning there’s never enough money in the nonprofit sector and the tech choices are always sort of secondary because nobody wants to fund infrastructure, which is not just a tech problem; it’s an infrastructure problem. Frankly, I’ve always had a deep and abiding interest in the digital.
Early on, I was teaching a book called Digital Advocacy and received a grant that required me to have a mentor, and the person I thought was most interesting was the person who wrote this book, John McNutt. So, I picked up the phone and called him and he answered the phone! And that started a 10-year collaboration.
Do you think social work is playing catch up in the technology space, compared to other disciplines? Why is that?
LM: Behavioral health as a field has definitely lagged behind medicine in the adoption and acceptance of pretty much any kind of technology that could be used in practice. A lot of the big policies that came out of the Affordable Care Act and the HITECH Act and elsewhere that really supported the adoption of electronic health records, for example, excluded most behavioral health providers. Many of those folks are working in health agencies and so are able to adopt the technology because of their affiliation with [physicians and healthcare] providers, but I think that’s really stymied adoption in a lot of different settings and contexts — community mental health settings, for example.
But I think there are also a lot of attitudinal barriers among providers, and this is particularly true, I think, among clinical practitioners. There is a lot of anxiety and some amount of resistance around using technology, because it’s really not something that aligns with how clinical practice is typically taught in our schools right now. And I think that that’s a big problem with the way our education is structured, because in many ways, we’re really not preparing social workers for the work, context, and landscape that they’re going to enter into. So, I think that’s a big issue that schools are really trying to address, but I think it’s going quite slowly.
Speaking of technology in the clinical setting, where do you both stand on services like Woebot and other artificial intelligence “clinicians” treating humans?
LG: There’s a great article in the Wall Street Journal that’s recent-ish and asks this very question: is chat going to take over behavioral health? And social workers were not represented, but the folks who they did speak to did not think that was going to be the case. Which is interesting if you listen to the overheated rhetoric of the [tech] startup community, who do think AI is the answer. I think about this a lot. I’m concerned that we’re going to wind up with a two-tier system where people with the means will get a real person to talk to and be engaged in a human-to-human interaction, and folks with no resources will have an augmented interaction or an AI-to-human interaction.
A lot of our students and a lot of the folks who want to be private practitioners, who are not so far out of school and looking to do private practice, are heading to [telemental health] platforms — because that’s where the clients are. So, I think it’s more complicated than just, is a robot effective or not.
LM: I do think that there will be, to Lauri’s point, some new and different-looking equity issues as technology continues to evolve. We’re seeing some interesting things happen now that telehealth and telemental health became so normative with the pandemic. Actually, a lot of the growth of telehealth happened within the behavioral health field. In 2020, about half of mental health visits were actually telemental health visits, but we’re not seeing that pattern rise equally. So, we thought of telehealth or telemental health as a way of addressing shortages in rural areas, for example, but during the pandemic, we saw that individuals in rural areas were actually less likely to use telehealth than those in urban areas, because of things like infrastructure, broadband issues, the digital divide. Then you also look at vulnerable populations that really aren’t going to have the resources to have an iPad or a tablet.
So, even though we think of this move toward telehealth as making care more accessible, I think we’re going to see that that’s really not true for certain groups, and that’s going to have implications if we really embrace this system. Who is left out of care? Who’s left behind? It can create some potentially new divides.
Lauri, you’ve spoken a lot in your research about how social workers should be at the table providing an ethical lens when new technologies are created, and even stated that social work and computer science students should work together in the classroom. How would that relationship get started, professionally? Is it up to the tech companies to make room, or would the government have to step in to make that happen?
LG: Here’s the thing: Big Tech knows its world is on fire right now. And I think there’s an opportunity to position social workers as the answer to a tech ethics problem. Right now, I think that there’s some deep and abiding belief that social workers are “relational people,” and that we somehow can’t live with technology and relationships in the same place, but yet we’ve spent the last 24 months communicating and living in relational spaces online.
LM: I think some of this is also exacerbated by the way that policies are rolled out. Social workers are consistently not at the table when big conversations around technology and funding for technology and infrastructure around technology are established.
There’s recent policy that just took effect in the spring around information blocking, and it essentially requires healthcare providers to share visit notes with their patients and make the information that’s included in their medical record transparent and accessible — and that’s now a mandate for most healthcare providers. But who’s left out of that? Mental health. And so it’s really kind of exacerbating this divide, and it’s a missed opportunity. So the worry, then, for me, is that it’s going to keep widening the gap between what medical practice looks like and what social work practice or behavioral health practice looks like, and it’s going to get harder and harder to close that gap. It’s so antithetical to what health care reform has been trying to do for the past 10 to 12 years, which is really to break down those silos.
Along with the infrastructure barriers to technology in social work, do you think some of the reluctance to embrace new technologies stems from a fear of job loss?
LM: From my experience talking with providers as a researcher, it’s less fear and more skepticism that these tools are appropriate and reliable in giving us information that we can trust and that we can use, let alone replace our own clinical decisions.
LG: I think people just don’t think it could possibly happen. It’s almost as if the thinking is, well, our practices are over here, and this digital thing is happening over there, and so I don’t have to worry about it — and I think that’s a flawed approach.
What are you two working on right now in the tech space that gets you most excited?
LM: One thing I have going on is a project about using natural language processing, a form of artificial intelligence, to understand whether or not we can figure out a way to automate and create a scalable way to measure person-centered care. So, this is something that everybody in behavioral health agrees is the right way to deliver care, but it’s kind of a fuzzy concept and notoriously difficult to measure. So, we are trying to develop a mechanism that would allow, through an automated and standardized process, an algorithm that would look at documentation of behavioral health providers to try to tell whether or not the care that they’re delivering to their clients is person-centered, and so this could be really beneficial as a way to really allow providers to evaluate their own clinical care quality.
Within the same study, we’re hoping to use this algorithm to help us understand whether a particular training or intervention, which is supposed to increase the delivery of person-centered care, is actually effective. This hasn’t really been done before in social work, and I think it could potentially give providers a really valuable tool to inform their practice.
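To make the idea concrete: one very simplified way to imagine an automated check like the one Matthews describes is to scan progress notes for language associated with person-centered care. The sketch below is purely illustrative — the marker phrases, function name, and scoring scheme are all hypothetical assumptions on my part, not the study’s actual method, which would use far more sophisticated natural language processing trained on real clinical documentation.

```python
import re

# Hypothetical marker phrases; a real study would learn these from
# annotated clinical notes rather than hard-coding them.
PERSON_CENTERED_MARKERS = [
    r"\bclient('s)? goals?\b",
    r"\bshared decision[- ]making\b",
    r"\bclient (chose|prefers|identified)\b",
    r"\bstrengths?\b",
]

def person_centered_score(note: str) -> float:
    """Return the fraction of marker phrases present in a progress note."""
    text = note.lower()
    hits = sum(bool(re.search(pattern, text)) for pattern in PERSON_CENTERED_MARKERS)
    return hits / len(PERSON_CENTERED_MARKERS)

sample_note = (
    "Client identified returning to work as a priority. "
    "We used shared decision-making to set client goals "
    "and reviewed her strengths."
)
print(person_centered_score(sample_note))  # higher score = more person-centered language
```

Even this toy version shows the appeal of the approach: once the scoring is automated, an agency could run it over every note it already collects, at no extra cost to practitioners.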
Another thing that I’m working on is much more macro-focused and is essentially looking at patterns of health information exchange between physicians and behavioral health providers, and whether or not they’re sharing information for the purposes of coordinating care and providing more comprehensive services.
LG: A lot of the things I’m working on right now are centered around providers of human services who are all mandated to collect data as part of their contractual obligations. We make workers take massive amounts of case notes and I think those open notes are interesting, but I think the data mining and algorithmic strategies have a lot of promise for making the lives of these agencies easier. And if we can make the lives of those folks easier, the quality of care will go up.
What I think the exciting pieces are: what do regular people understand about what’s happening [in the tech space], and how can they get more power over their data — I think that’s really important. And then, the other piece is, how can we apply data science strategies in human services in ways that make the practitioners’ and staff’s lives easier.
Last question – do you have a favorite app right now?
LM: Probably because of what I do, what makes me so happy and satisfied are all of my patient portal health apps, that I can log on and see what is happening and take care of appointments and feel like I am taking care of myself.
LG: For productivity, I’m really into the post-it note type thing, so I now have a Padlet with my kids. I have a 12-year-old who has very different visions of what a party might look like than I do, and so we have a shared vision board.
I will also put in a plug for a utility called LastPass. It’s a password saver and you can use it on your phone and in a browser, and it’s secure, meaning that they have high-level encryption [protecting your passwords]. I can only carry around so many interesting ideas before I forget all my passwords, and so I really love it.