CHAPTER 1
Becoming a Smart Nonprofit

INTRODUCTION

Leah Post has a keen sense of other people's pain. As a program manager at a Seattle social service nonprofit, she uses her gifts to help people who are homeless, or at high risk of homelessness, enter the local support system. An integral part of the intake process is a required assessment tool with the tongue-twisting name VI-SPDAT.

Every day, Leah asked her clients questions from the VI-SPDAT and inputted their answers into the computer. And every day the results didn't match the picture of despair she saw in front of her, the results that should have made her clients top priorities for receiving emergency housing.

Leah knew the basic statistics for the homeless population in King County, home to Seattle. Black people are 6% of the general population but over a third of the homeless population. Native Americans and Alaska Natives are overrepresented by a factor of ten. Most of Leah's clients were Black, and yet time and again white applicants scored higher on the VI-SPDAT, meaning they would receive services first. Leah knew in her gut that something was wrong, and yet automated systems are supposed to be impartial, aren't they?

With over a decade of experience as a social worker, Leah knows that asking people who are scared, in pain, possibly living with mental illness, and at your mercy to self-report their personal struggles is not likely to yield accurate results. Similarly, victims of domestic violence are unlikely to self-report an abusive relationship. But that's not how the VI-SPDAT worked. For instance, one of the questions was: “Has your drinking or drug use led you to being kicked out of an apartment or program where you were staying in the past?” Single adult Black, Indigenous, and People of Color (BIPOC) were 62% less likely than white applicants to answer yes.1 In general, denial of drinking and drug use is the smarter and safer answer for people of color when applying for public benefits. Except when taking the VI-SPDAT. This assessment is intended to measure vulnerability, which means the higher the score, the more urgently a client needs housing. But, Leah says, the VI-SPDAT “just doesn't allow the space for any interpretation of answers.”2

Leah was not the only person noticing skewed results. Dozens of social workers joined her in signing a petition in Seattle asking for a review of the process. Other social workers around the country also raised concerns. Finally, researchers at C4 Innovations dug into the data from King County, as well as counties in Oregon, Virginia, and Washington, and found that BIPOC “were 32% less likely than their White counterparts to receive a high prioritization score, despite their overrepresentation in the homeless population.”

There were red flags about the VI-SPDAT from the beginning. It was evidence-informed, not evidence-based, meaning it was built on information and experiences from past efforts but neither rigorously designed nor tested. It was intended for quick triage but was most often used as an overall assessment tool by social service agencies. No training was required to use it. Oh, and it was free.3

Why was King County, or any county, using a tool with so many red flags? Some of the answer is found in its development history.

The Department of Housing and Urban Development (HUD) provides funding for homelessness services to local communities through Continuums of Care (CoCs), consortia of local agencies. This system was created in the 1990s to provide multiple access points for people who are homeless, or at risk of homelessness, through, say, food banks, homeless shelters, or mental health clinics.

In 2009, HUD began to require CoCs to use a standardized assessment tool to prioritize the most vulnerable people. This was an important switch from the traditional “first come, first served” model. The wait for emergency housing can be years long, and having an opportunity to get to the top of the list is a very big deal for clients. The choice of which tool to use was left up to each CoC.

Years earlier, Community Solutions, a New York nonprofit specializing in using data to reduce homelessness, created the Vulnerability Index (VI) based on peer-reviewed research. The goal of the VI was to lower barriers for people with physical or mental health vulnerabilities that might prevent them from seeking services. Soon afterward, OrgCode Consulting, Inc., created the Service Prioritization Decision Assistance Tool (SPDAT). Finally, in 2013, OrgCode released a combination of these tools, the VI-SPDAT.

The president of OrgCode, Iain De Jong, told us that time was of the essence in launching VI-SPDAT, which precluded more robust testing and training materials.4 By 2015 more than one thousand communities across the United States, Canada, and Australia were using the VI-SPDAT.

The VI-SPDAT was initially released as a downloadable document with a manual scoring index because, contrary to its name, OrgCode isn't a tech company. Two years after its release, multiple software companies serving homeless agencies asked to incorporate the VI-SPDAT into their products, and OrgCode consented.

Incorporating the VI-SPDAT into software programs automated it, which meant that instead of scoring the assessment by hand, administrators were now restricted to inputting data into screens and leaving the rest up to the computer. The VI-SPDAT became a smart tech tool. The power of decision-making shifted from people to computers. This gave the VI-SPDAT a patina of infallibility and impartiality. Jake Maguire of Community Solutions said, “There are people who have divorced the scoring tool from the basic infrastructure required for meaningful community problem solving. It is complex. What we need to do is to equip people with the skills and permissions that give them informed flexibility. Don't automatically surrender your better judgment and clinical judgment. We can't put our brains on autopilot when we use these tools.”5 As a result, thousands of BIPOC people didn't get the priority they deserved or the access to vital services they needed.
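To make that shift concrete, here is a minimal, entirely hypothetical sketch of rigid automated scoring, in the spirit of what Maguire warns against. The question names, weights, and scoring rule are invented for illustration; this is not the actual VI-SPDAT.

```python
# Hypothetical sketch of rigid automated scoring -- not the actual
# VI-SPDAT. Each self-reported answer maps to a fixed point value,
# and there is no input for a case worker's clinical judgment.

QUESTION_WEIGHTS = {                       # invented questions and weights
    "kicked_out_for_substance_use": 1,
    "experienced_violence": 1,
    "chronic_health_condition": 1,
}

def vulnerability_score(answers: dict) -> int:
    """Sum the fixed weights for every 'yes' answer."""
    return sum(QUESTION_WEIGHTS[q] for q, yes in answers.items() if yes)

# A client who under-reports -- for example, denying substance use
# because admitting it has historically been unsafe -- scores lower
# and drops in priority, no matter what the case worker observes.
guarded_client = {
    "kicked_out_for_substance_use": False,
    "experienced_violence": False,
    "chronic_health_condition": True,
}
print(vulnerability_score(guarded_client))  # 1 -- a low-priority score
```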

You may be waiting for some bad guy to emerge in this story: a company gathering data to sell to pharmaceutical companies or a government agency intentionally blocking access to services. There will be stories like that later in this book, but this isn't one of them.

All the actors here had good intentions. HUD wanted to ease access into the homeless system by using multiple access points and placing local organizations in charge of the assessment. OrgCode was trying to create a standard tool for social workers and disseminate it easily, freely, and quickly. Leah and her colleagues were dedicated to helping the most vulnerable people in their communities receive appropriate services quickly. And, of course, clients who were walking in off the street just wanted to be safe at least for one night.

And yet, the VI-SPDAT was so fundamentally flawed that OrgCode announced in 2021 that it would no longer recommend or support it.

UNDERSTANDING SMART TECH

We use “smart tech” as an umbrella term for advanced digital technologies that make decisions for people, instead of people making those decisions themselves. It includes artificial intelligence (AI) and its subsets and cousins such as machine learning, natural language processing, smart forms, chatbots, robots, and drones. We want to be expansive in our use and understanding of the term, for instance, by including automation technologies like the one that powered the VI-SPDAT, in order to focus on the essence of the shift in power from people to machines. We substitute the word “bot” for smart tech in many sentences in this book because, well, it's fun to say.

Smart tech is not the same as digitizing a process. For instance, direct depositing a paycheck replaces printing a check and mailing it or handing it to an employee who has to endorse it and physically deposit it in the bank. Direct deposit is efficient, but it's not automation. Automation takes the power of decision-making and turns it over to machines. Automation turns a regular car into a smart car, and an active, decision-making driver into a passenger.

Netflix is powered by smart tech. It uses our individual and collective viewing histories to suggest what we may want to watch next. The more we all watch, the more accurate Netflix's predictions become. However, there is one enormous catch: the algorithms assume we want to continue watching the kinds of shows and movies we have watched in the past, that our future behavior will mimic our past behavior. This isn't actually the way most people operate. We like to explore new things, bump into new people, and allow serendipity to take its course. This difference between the way machines think and the way people behave is a theme we will explore many times throughout this book.
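A toy sketch of that assumption at work appears below. The viewing data and genre labels are invented, and real recommendation systems are vastly more sophisticated, but they share the core premise: whatever you watched most, you will be offered more of.

```python
# Toy sketch of the "future mimics past" premise behind
# recommendations. Invented data; real systems are far more complex.

from collections import Counter

watch_history = ["thriller", "thriller", "documentary", "thriller"]

def recommend_genre(history: list) -> str:
    # The core assumption: the genre you watched most often
    # is the genre you will want next.
    return Counter(history).most_common(1)[0][0]

print(recommend_genre(watch_history))  # "thriller" -- again and again
```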

Smart tech has some similarities with social media but, more importantly, a few fundamental differences. Both are powered by digital technology, computer code, and enormous amounts of data. Both use data and algorithms to predict future behavior. For instance, Facebook shows you ads based on what you have liked and shared previously. The more accurate Facebook's predictions, the more likely you are to click on an ad, which is how Facebook makes its money. However, we can see how Facebook operates, or at least the results of Facebook's calculations. Even if Facebook doesn't want to put much work into controlling hate speech on its platform, we can see it unfolding there. And if you can find the button that shows you the most recent posts, you can see which posts you've missed because Facebook decided to show you something else.

While social media is visible, smart tech is invisible like air. It doesn't care if you're rich or poor, at home in your kitchen, in the park, or at the office; it is all around us, working all the time. It is quickly becoming embedded in organizational life, making critical decisions such as who gets services, when, and how, and assessing how staff are performing their jobs. Perhaps the biggest difference between social media and smart tech is that the former creates an enormous amount of work while the latter, if used well, can create an enormous amount of time. And that time can be used to reduce staff burnout; increase time spent with clients, volunteers, and donors; and imagine new ways to solve difficult problems.

Smart tech is best at doing rote tasks such as filling out intake forms and answering the same questions from people (for example, “Is my contribution tax-deductible?”) over and over. However, the technology is quickly moving beyond paperwork and embedding itself into the heart of nonprofit work. This is profoundly changing what we do, why we do it, and how successful we are in meeting our missions.

Nonprofits are beginning to use smart tech to:

  • Screen résumés against criteria the organization sets, often without anyone at the organization ever seeing the candidates who were screened out.
  • Determine eligibility for a host of social services such as SNAP food assistance, housing, and childcare.
  • Identify prospective donors from your fundraising data or from the web.
  • Customize stories and communications for donors based on their past behavior.
  • Stock food pantry shelves.
  • Deliver medicine and food to hard-to-reach places.
  • Direct refugees to available beds.

This isn't science fiction; it is real life, right now.

Automating systems isn't a technological evolution; it is a revolutionary shift in power and autonomy. Those who understand the new technology and how to use it will have more power. Those who report to tech systems and are at the mercy of them are at risk of losing their ability to determine their own futures.

Smart nonprofits understand the technology—what it can and cannot do, how to use it well, and how to avoid unintentional harmful consequences.

BECOMING A SMART NONPROFIT

Smart nonprofits use a disciplined approach to adopting smart tech carefully and strategically while always maintaining the highest ethical standards and responsible use. Smart nonprofits are:

  • Human-centered: A human-centered approach means finding the sweet spot between people and smart tech, while ensuring that people are always in charge of the technology. Smart nonprofits ensure that the use of smart tech always aligns with their values.
  • Prepared: Organizations need to take intentional preparation steps. They must actively reduce the bias embedded in smart tech code and systems. They must thoroughly clean and label the data to be incorporated into a smart tech system. And lastly, they must have a formal process for selecting systems, vendors, and consultants that align with their organization's values.
  • Knowledgeable and reflective: Learning about what smart tech is and does is an ongoing process for the boardroom, the C-suite, and nonprofit staff. Once automated systems are in place, organizations need to be vigilant about whether they are performing as hoped, whether unintended consequences have arisen, and how clients and end users feel about the systems. We have embedded reflection questions throughout this book to help create the habit of asking important questions over time.

Working this way with smart tech creates a gift of new time.

Staff spend about 30% of their time on administrative tasks. Smart tech is going to automate many of these tasks, thereby freeing up staff to do other things. This newly found time creates a choice for organizations. They could use it to do more of the same kinds of tasks that value efficiency over effectiveness and the number of people served over the number of problems solved. Or they can use this precious gift to become something new and better. We hope organizational leaders will choose the latter. We call this return on investment for the use of smart tech the dividend of time.6

Integrating smart tech into organizational functions could create an enormous return on investment that will allow for:

  • More time with clients: Instead of spending time checking off lists and filling out forms, case workers will spend more time with clients to understand the origins of their problems and the obstacles that get in the way of their success, and to provide real-time support when life gets really hard.
  • Crisis prevention and reduction: Smart tech can help identify people at risk of becoming homeless before crises overwhelm them. For instance, the city of London, Ontario, uses a new smart tech system to track people at risk of becoming chronically homeless (with their permission) to prioritize services for them before a crisis.7 Smart tech will help forecast environmental disasters earlier and help with rescue operations. Resources will be directed to victims faster.
  • Deeper and more meaningful relationships with donors: Transactional fundraising, treating every donor like an ATM, has become the norm for too many organizations. Smart tech frees staff from updating donor databases and researching prospects so they can spend time getting to know donors in meaningful ways: learning more about their interests and the reasons your cause is important to them, and turning these donors into ambassadors who recruit and nurture donors within their personal networks.
  • Reduced astroturfing: Advocacy organizations too often substitute marketing efforts for real grassroots organizing, for instance, online petitions used mainly to capture email addresses for future communications. Instead of “astroturfing” support for climate change, advocates could use the dividend of time to engage with supporters and get to know them, educate them on the issue, and teach them how to become advocates and build their own groups of supporters.
  • More time to think: People and organizations are so busy doing the work, and the work to support the work, that there is very little time for reflection on how the work is done and how to improve it. Imagine having time to consider other ways to do intake with clients rather than furiously responding to a barrage of inquiries every day. Imagine having time to talk to supporters about what kind of support they would need to gather their friends and help them become ambassadors. Imagine having time to just think.

A REAL-WORLD SMART NONPROFIT

TalkingPoints is a great example of a smart nonprofit.

Heejae Lim, founder of TalkingPoints, didn't need to do any research about the difficulty immigrant parents have navigating school systems; she lived it as the child of immigrants. Her family moved from Korea to London when she was eight. Heejae had an advantage that many of her immigrant friends didn't: her mother spoke English well. Heejae's mother was a fierce advocate for her at school. She also translated for the parents of her friends.8

Following business school at Stanford University, Heejae decided to do what she does best: address a difficult problem using her advanced technology know-how. She founded TalkingPoints as a nonprofit to translate messages between teachers and parents.

About a quarter of all school children in the United States speak a language other than English at home. These are families where parents often work multiple jobs, may come from cultures where parents are not supposed to engage with teachers, and most importantly, do not speak English well enough to feel comfortable speaking to teachers.

Family engagement with schools is not a minor issue when it comes to educational outcomes. It is twice as effective as socioeconomic status in predicting school success. Let that sink in for a second: for school success, it matters much less whether a child is from a high-income home than whether the adults in a child's life talk to their teachers. However, family-school engagement occurs up to 50% less often among families in under-resourced, multilingual communities.9

TalkingPoints’ app works like text messaging on mobile devices. It operates in over 100 languages, provides closed captioning for video messages for parents who may not be comfortable writing, and enables parents and guardians to engage with teachers in the cracks of their very busy days.

TalkingPoints started with a simple proof of concept and a small group of families. Using a Google spreadsheet, a text messaging app, and human translators, Heejae and her team simulated an automated process from end to end. Here's how it worked. A parent sent a text message in their native language. Heejae added it to the spreadsheet. Next, a human translator translated the message into English. Heejae texted the translated message to the teacher or administrator. And back and forth they went. Heejae told us, “We always start with a proof of concept with a small group before we build, as part of doing no harm.”

The team learned a lot from this early testing, then automated the process and launched the pilot version in 2015 with 100 families. Most importantly, they learned they couldn't rely on off-the-shelf translation tools alone because these tools often misinterpreted context and cultural norms. Volunteer translators review the machine translations to ensure that cultural context and educational terms are accurately rendered. These reviewed translations are fed back into the model to continuously improve it. The aim is to build a database of translations large and deep enough that only the difficult-to-understand conversations need to be directed to human translators.
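A minimal sketch of this kind of human-in-the-loop routing appears below. It is our illustration of the workflow just described, not TalkingPoints' actual code: the confidence threshold, the stub translator, and all of the names are invented.

```python
# Hypothetical human-in-the-loop translation pipeline, loosely modeled
# on the workflow described above -- not TalkingPoints' actual code.

from dataclasses import dataclass, field

@dataclass
class Message:
    text: str
    source_lang: str
    target_lang: str = "en"

@dataclass
class Pipeline:
    threshold: float = 0.9                  # invented cutoff for human review
    review_queue: list = field(default_factory=list)
    training_pairs: list = field(default_factory=list)

    def machine_translate(self, msg: Message):
        # Stand-in for an off-the-shelf translation API that returns
        # a draft translation plus a confidence score.
        return f"[{msg.target_lang}] {msg.text}", 0.75

    def handle(self, msg: Message) -> str:
        draft, confidence = self.machine_translate(msg)
        if confidence >= self.threshold:
            return draft                    # deliver directly to the teacher
        # Low-confidence drafts wait for a volunteer translator to check
        # cultural context and educational terminology.
        self.review_queue.append((msg, draft))
        corrected = draft                   # the reviewer's edit replaces this
        # Reviewed pairs are fed back to improve the model over time.
        self.training_pairs.append((msg.text, corrected))
        return corrected

pipeline = Pipeline()
print(pipeline.handle(Message("¿Cómo le fue a mi hija hoy?", "es")))
```

The design point is the routing decision: the machine handles the easy, high-confidence messages, and people handle the rest, with every human correction making the machine a little better.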

TalkingPoints meets our definition of a smart nonprofit: an organization that is human-centered, prepared, and knowledgeable and reflective. People are deeply engaged in the process; the team designed, and is implementing, the app carefully and thoughtfully; they have done their homework about cultural competence, education, and the needs of immigrant families; the staff includes experts in education and technology; and everyone has experience working with immigrant families and schools.

The results show the effectiveness of working this way.

By 2019, the app had facilitated over 20 million conversations for 500,000 parents and teachers. TalkingPoints is also free for users. An outside research firm was engaged to evaluate the effort. It found that:

  • 89% of the schools using the app serve low-income children.
  • 97% of teachers have found TalkingPoints helpful in meeting their family engagement goals.
  • 98% of teachers were able to reach families they had never reached before.
  • 83% of teachers believe that they are more informed about students’ needs because of their relationships with families.10

VI-SPDAT blocked access to services. TalkingPoints creates access for a woefully underserved population. These cases highlight why it is so important for organizations to step carefully into the use of smart tech.

THE DANGERS OF AUTOMATION

Nicholas Carr wrote in The Glass Cage, “Automation severs ends from means. It makes getting what we want easier, but it distances us from the work of knowing.”11

There is enormous danger and damage to be done in distancing ourselves from knowing. It means potentially cutting ourselves off from the needs of clients if they are first interacting with bots screening them for services. It could mean using automation to send out many times more fundraising appeals and not listening to the complaints from current and prospective donors. It could mean hiding behind screens instead of stepping out to build stronger relationships with constituents. And it could mean allowing an insidious form of racism and sexism to take hold unabated inside your organization.

We tend to see work done by computers and robots as incapable of being swayed by emotions and therefore incapable of being racist, sexist, or otherwise biased or unfair. However, the code that powers smart tech was at some point created by people and carries forward their opinions, assumptions, and biases. When this code makes decisions that are discriminatory, we call it embedded bias. The renowned data scientist Cathy O'Neil says, “Algorithms are opinions embedded in code.”12

Embedded biases are very difficult to undo. Programmers make thousands of choices under the hood of smart tech that the rest of us can't see. Automation is increasingly being used to make vital, life-changing decisions for people, which makes the choices that programmers (overwhelmingly white men) make, based on their own experiences and backgrounds, all the more consequential.

For instance, smart tech is increasingly used to screen applications for mortgages. It is illegal to ask questions about, say, race in these applications, so programmers create “proxies,” or substitute questions, to build a profile of an applicant. For instance, a zip code could be used as a proxy for “safe neighborhood.” Safe generally means white, particularly for white programmers drawing on their own life experiences. In addition, data is needed to train smart tech systems. An automated mortgage screening process will be trained on enormous data sets of past mortgage application decisions. Black people were historically denied mortgages at astonishing rates and therefore are woefully underrepresented among the approvals in these data sets. In this way, seemingly benign programming decisions, mundane proxies, and historic data create embedded biases against people of color that are difficult to see from the outside.
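The following toy example shows the mechanism. Nothing in it is any real lender's system; the zip codes, incomes, and decision rule are all invented. Notice that race never appears in the code, yet the historical pattern flows straight through the zip-code proxy.

```python
# Toy illustration of proxy bias -- not any real lender's system.
# Race is never mentioned, but because housing is segregated, a zip
# code can stand in for it when a screen learns from biased history.

from collections import defaultdict

# Invented historical decisions: applicants in zip 10001 were approved
# far more often than equally qualified applicants in zip 10002.
history = [
    {"zip": "10001", "income": 60, "approved": True},
    {"zip": "10001", "income": 55, "approved": True},
    {"zip": "10001", "income": 50, "approved": True},
    {"zip": "10002", "income": 60, "approved": False},
    {"zip": "10002", "income": 65, "approved": False},
    {"zip": "10002", "income": 55, "approved": True},
]

# "Training": compute the approval rate per zip code from past decisions.
rates = defaultdict(lambda: [0, 0])
for record in history:
    rates[record["zip"]][0] += record["approved"]
    rates[record["zip"]][1] += 1

def screen(applicant):
    approved, total = rates[applicant["zip"]]
    return approved / total >= 0.5   # approve if the zip "looks safe"

# Two applicants with identical incomes get different outcomes:
print(screen({"zip": "10001", "income": 58}))  # True
print(screen({"zip": "10002", "income": 58}))  # False
```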

Once bias is baked into smart tech, it stays there forever and becomes self-reinforcing over time. Nancy Smyth, former dean of the School of Social Work at the University of Buffalo, State University of New York, says, “Code is racist because society is racist.”13

In her book Race After Technology, Ruha Benjamin describes a “New Jim Code.” It is a take on the old Jim Crow laws that powered decades of institutional racism in the post-Reconstruction South. She writes, “The animating force of the New Jim Code is that tech designers encode judgments into technical systems but claim that the racist results of their designs are entirely exterior to the encoding process. Racism thus becomes doubled—magnified and buried under layers of digital denial.”14 She later writes, “… the outsourcing of human decisions is, at once, the insourcing of coded inequity.” We will explore the ethical use of smart tech throughout this book.

ABOUT THIS BOOK

The book is divided into three parts. Part I, “Understanding and Using Artificial Intelligence,” focuses on the leadership needed to use smart tech well, the history of smart tech, and the key issues organizations need to be prepared for: the need to stay deeply human-centered in the planning and use of smart tech, the enormous amounts of clean data necessary to power the systems, and the ethical concerns and considerations necessary to ensure bias is mitigated.

Part II, “The Smart Nonprofit: Use-Case Examples and Management,” focuses on the applications of smart tech within organizations. It begins with a chapter on how to carefully and thoughtfully select a specific application of smart tech. Chapters on the use of smart tech for program delivery, fundraising, back-office operations, and philanthropy follow.

Part III, “Where We Go from Here,” concludes with a look at the probable future of nonprofits and social change in an automated world.

CONCLUSION

We wrote this book to help organizations prepare to benefit from automation and avoid mistakes. Smart tech can help nonprofits become more efficient and use that dividend of time to build better relationships with stakeholders inside and outside of the organization. Smart tech can also help nonprofits leverage data to better understand program impact. We want nonprofits to harness this technology for good, which requires organizational leaders to understand the limitations of smart tech and apply it strategically, ethically, and responsibly.

Ultimately, the purpose of using smart tech shouldn't be to make organizations go faster but to make them better at solving problems and taking care of people, inside and outside the organization, in more humane ways. This will only happen when everyone in the organization has the information, tools, and opportunity to shape their own lives and futures. That's the true mark of success for a smart nonprofit, and it's what we will share in the rest of the book.

ENDNOTES

  1. Sydney Brownstone, “This data tool helps homeless people get housing. If you're white, your chances are even better,” The Seattle Times (October 29, 2019), https://www.seattletimes.com/seattle-news/homeless/this-data-tool-helps-homeless-people-get-housing-if-youre-white-your-chances-are-even-better/.
  2. Leah Post, author email interview on July 9, 2021.
  3. Heather L. McCauley and Taylor Reid, Michigan State University, “Assessing Vulnerability, Prioritizing Risk: The Limitations of the VI-SPDAT for Survivors of Domestic & Sexual Violence” (July 20, 2020), https://dcadv.org/file_download/inline/b1bb3b28-8039-4590-aa1d-daaef5fb6546.
  4. Iain De Jong, author interview on June 25, 2021.
  5. Jake Maguire, author interview on June 30, 2021.
  6. Steve MacLaughlin, “The Impact of AI on Philanthropy,” Engage Podcast Series (October 20, 2020), https://nofilternonprofit.blackbaud.com/raise-engage-podcast-series/episode-167-the-impact-of-ai-on-philanthropy.
  7. Chris Arsenault, “Using AI, Canadian city predicts who might become homeless,” Reuters (October 15, 2020), https://www.reuters.com/article/us-canada-homelessness-tech-idCAKBN27013Y.
  8. Heejae Lim, author interview on August 4, 2021.
  9. Google AI Impact Challenge, https://ai.google/social-good/impact-challenge/.
  10. TalkingPoints website, https://talkingpts.org/talkingpoints-increases-parent-engagement-for-student-success/415/.
  11. Nicholas Carr, The Glass Cage (New York: W. W. Norton & Company, 2015).
  12. Cathy O'Neil, Naked Capitalism blog (August 26, 2017), https://www.nakedcapitalism.com/2017/08/data-scientist-cathy-oneil-algorithms-opinions-embedded-code.html.
  13. Nancy Smyth, author interview on May 25, 2021.
  14. Ruha Benjamin, Race After Technology (Cambridge: Polity, 2019).