Chapter Four
Changing Technology Development Inside and for Social Impact

Technological development truly changes how people live and interact with the world. With advancements over the past decade, people can imagine sending themselves into space, learning to dance from nimble robots, and ordering whatever they want via the latest phone app. As the creators of these programs and devices, technologists “determine what can be improved in their industry and how to incorporate new technology, find new ways to resolve problems, and further develop various processes.”1 Given this high standard and supposedly universal expertise, many people expect technologists to quickly enter the social impact space and create inventions that “single‐handedly” feed the hungry around the world, solve climate change, and improve access to and outcomes from education—and then spontaneously dream up the next invention.

This, of course, is not how things work. The social impact sector tackles challenges rooted in systemic inequity and injustice, problems that can't be solved with a simple invention. The expertise technologists bring can help create a new and better world, but only if the technology is appropriately designed in the context of local communities and with deep understanding of the multifaceted elements in play, such as the root causes of the problems, interventions that can make progress possible, and the network needed to realize sustained change. This understanding must derive from how these problems are actually experienced, not from how they are interpreted from the outside. There must be a recognition that the big technological solution is sometimes a combination of many small solutions. Furthermore, technologists in future social impact organizations must strike a balance between (a) building basic technology systems for the social impact sector, such as simple databases to store information on clients, and (b) identifying ways in which new, never‐before‐used technology can be created and applied to social challenges. These complex challenges present a high bar for technologists to meet, beyond “just” deploying complex tech. They must understand the power and potential of technology, as well as its limits and constraints; they must also be able to determine when technical proposals are well‐documented pipe dreams, impossible within the constraints of reality, and when not to pursue a technical solution at all.

As we outlined in chapter 1, when people reference “technology” in relation to social impact organizations, they mean everything from Information Technology to organizational management tools to advanced techniques for improving operations; in other words, technological tools that help every department within an organization as well as program delivery and interaction outside the organization. Staff can quickly become overwhelmed by the countless technology questions they must ask themselves:

  • Do we have enough computers and cell phones?
  • Can people securely log in remotely?
  • Do we have servers in our offices or cloud storage space?
  • Do we have software to track and contact our donors?
  • Do we have software to track the care we provide to our clients?
  • Do people have a seamless experience when they visit our website?
  • Have we used data science to analyze the effectiveness of our training systems?
  • Do staff have the tools they need to be effective in their work?

Historically, many organizations in the social impact sector have treated the answers to technology questions as something for the tech department (or person) to decide in a vacuum—because, as has been argued, they know technology best and it is too complicated for the average person to make an informed decision. The reality, as we discussed in chapter 3, is that to support a functioning society, technology must extend—not replace—the social impact organization's mission. The technologists who join the social impact space, similarly, must join with the mindset of learning and supporting the organization's mission—rather than simply inventing what seems like the easiest and most obvious fix.

Technologists in the social impact sector will serve a broad variety of functions. Therefore, we consider a technologist to be the technical designer, developer, or implementer of any solution that helps advance an organization's mission. For the purposes of this chapter, “technologist” can represent either a single individual or a technology company developing tools to address social impact issues. And as we progress to creating the connected world we envision, we have different questions to ask:

  • How do we build new tools to support this world?
  • How do we ensure that multiple perspectives influence and develop tech in equitable ways?
  • How can we better share information and tools across communities?
  • How can we sustain and evolve solutions so that they change as the people relying on them change?

In chapter 3, we examined how social impact leaders must plan, budget, and integrate technology as a component of all programs and services. In this chapter, we will examine the ways technologists intentionally extend the mission and impact of social impact organizations: by expanding access to themselves, by continually checking for consistency between code and values, and by consciously deciding when to exercise technological caution versus when to push organizations into something new.

CASE STUDY: JOHN JAY COLLEGE

Dara Byrne is the Associate Provost of Undergraduate Retention and Dean of Undergraduate Studies at John Jay College of Criminal Justice, a public senior college of the City University of New York. In 2016, Dara wanted to know how she could better support the college's students all the way through graduation. The overwhelming majority of support services were geared toward freshman studies—and yet, even after completing the majority of the required coursework, some students didn't finish their senior year. Dara wondered what more the college could do to help more students graduate.

Around the same time, DataKind—a nonprofit whose mission is to harness the power of data science and artificial intelligence in service of humanity—approached John Jay College. DataKind had received funding to work in the education space, and wanted to know if John Jay College was interested in partnering. What made the outreach unique and interesting, Dara reflects, was that DataKind proposed to discuss and collaboratively define a problem to solve—whereas most tech companies approached with some version of “We know what your problem is, and we know how to solve it for you.”

Working together, the John Jay College staff and DataKind staff defined two questions to answer: (1) Could they identify students in need of support who were likely to drop out? And (2) What were some of the factors that could influence a student's decision to leave school before finishing their degree? Reflecting on this process, Dara comments on the validation she received at the time. “The DataKind team made me feel heard. They showed me what it is to be believed with the expertise you have.” The DataKind team identified and reviewed the college's data, then led discussions with the college team about both their understanding of the data and the limitations of the data from a technical perspective. The DataKind team also spent time in the college's systems to ensure that what was to be built would be able to run within the existing environment without requiring the purchase of a lot of new equipment.

The collective team decided to focus on students who had completed 75% of the credits needed to graduate—that is, 90 of the required 120 credits. The team tested more than 20 different modeling approaches, algorithms, and combinations of models. In the end, the team created two sets of models using machine learning, each designed to predict the likelihood that a student will graduate within four semesters after completing a minimum of 90 credits of coursework.2 The tool generates risk scores for students and provides insights into the factors that may lead to students dropping out. In the words of Michael Dowd, the lead DataKind data scientist on the project: “The final tool takes in data (such as grades, length of time at the school, test scores, credits taken per semester, etc.) and predicts the probability of whether or not students will graduate within a specified amount of time. The tool also shares with the user information about which variables contributed to the overall probability and why.”3
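DataKind's actual models are not reproduced here, but a minimal sketch of the general technique Dowd describes (a classifier trained on historical student records, plus per-prediction variable contributions) might look like the following. The feature names, data, and model choice are all hypothetical.

```python
# Hypothetical sketch of a graduation-likelihood predictor with
# per-prediction explanations. This is NOT DataKind's actual code;
# the features and data below are invented for illustration.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Invented historical records for students at or above 90 credits.
students = pd.DataFrame({
    "gpa":                  [3.2, 2.1, 3.8, 2.5, 3.0, 1.9],
    "credits_per_semester": [15, 9, 15, 12, 12, 6],
    "semesters_enrolled":   [8, 12, 7, 10, 9, 14],
    "graduated_in_4_sems":  [1, 0, 1, 0, 1, 0],  # known outcomes
})

features = ["gpa", "credits_per_semester", "semesters_enrolled"]
X, y = students[features], students["graduated_in_4_sems"]

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, y)

# Score a new student: probability of graduating within four semesters.
new_student = pd.DataFrame([[2.4, 9, 11]], columns=features)
prob = model.predict_proba(new_student)[0, 1]
print(f"Predicted probability of on-time graduation: {prob:.2f}")

# For a linear model, coefficient * standardized value approximates each
# variable's contribution to this student's score, which is the kind of
# "which variables contributed and why" output advisers can act on.
scaler = model.named_steps["standardscaler"]
clf = model.named_steps["logisticregression"]
contributions = clf.coef_[0] * scaler.transform(new_student)[0]
for name, value in zip(features, contributions):
    print(f"  {name}: {value:+.2f}")
```

A linear model appears here only because its contributions are easy to explain in plain language; as the chapter notes, the actual team tested more than 20 modeling approaches before selecting theirs.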

Dara conceived the Completion for Upper Division Students Program (CUSP) as “a college completion program designed to prevent drop out among undergraduate students in their final semesters and instead help propel these students across the graduation finish line.” Through CUSP, Dara's team uses the tool created with DataKind “to then provide tailored interventions to students based on these risk levels, including general text message reminders, personalized advising on how to meet remaining academic requirements, strategies and completion grants for overcoming financial barriers, and post‐graduation planning and referral resources.”4

Laura Ginns, John Jay College's Vice President for Public Affairs and Strategic Initiatives, shares:

Before implementing CUSP in Fall 2018, we projected that, without any intervention, only 54% of our seniors at or above the 90‐credit mark would graduate by the end of two years. In the two years following the Fall 2018 implementation of the CUSP program, however, 85% of students who were at or above the 90‐credit mark that term have graduated within two years. Thanks to CUSP, this means that over the two years, 900 more students than projected earned their Bachelor's degrees. CUSP has been central to John Jay's increased 4‐ and 6‐year graduation rates, which rose 8 percentage points and 4 percentage points, respectively, in just two years.

These stats alone are impressive, but Dara says the impact within John Jay College is even more widespread. “You don't expect confidence, capacity building, and culture change to come out of a technology collaboration, but that's what happened,” Dara remarked. Throughout the development process, the DataKind team explained to the college staff what data was being used and how. As a result, today the college staff who interact with the developed tool—whether the director of institutional research or the frontline staff working directly with students—can explain what information the tool uses and how it assesses the data. In addition, the success of the study resulted in new partnerships being formed within the college, as multiple departments became interested in using the tool or providing information to it. Dara's team, now confident in their ability to understand how the technology works, expects more from tech companies approaching them with their flashy, proprietary tools. She comments: “Now that we have had these experiences with DataKind, when tech companies won't explain what's in the formula, it raises a lot of suspicion.”

The DataKind team was able to document the lessons learned—and the code developed—from the partnership. This information has been shared with similar organizations to help them understand how they too could benefit.

Ultimately, the experience gave Dara a new confidence and strengthened her credibility within the institution: “For those of us who are BIPOC leaders, being listened to with a sense of curiosity is a big difference. It's unusual to encounter, especially in the land of technology. But now, I know what to ask for.”

DEVELOPING TECHNOLOGY FOR SOCIAL IMPACT ORGANIZATIONS

The DataKind and John Jay College partnership shows that good things can happen when technologists are intentional about learning from their social impact partners, thorough in considering various technical approaches, and deliberate about selecting and implementing the ideal model. It also illustrates how detailed understanding of the technology being deployed—in this case, predictive modeling—is necessary to responsibly execute a technology solution. In addition, it underscores the importance of technologists explaining the technology and the implications of technical decisions to their customers in plain language. But one of the more significant truths the DataKind and John Jay College partnership demonstrates is the power technologists have to shape outcomes.

The responsibility of distributing power lies in the hands of technologists, because technology can automate or define so many of the ways social impact organizations operate. The time and frequency of system updates can dictate when and how social impact employees work. An organization's ability to provide computers and phones to employees (rather than requiring them to supply their own devices) equalizes who can perform their work. The weights given to particular variables in algorithms determine who receives services and who doesn't. Furthermore, in a world where policies and regulations often don't address the most current technology, or how the social impact sector can use technology, the people who design technology become de facto policymakers, with the ability to turn access to services on and off with a few additional lines of code.

These implications are serious, and technologists must treat their role with the gravity it deserves. In turn, it is important for us to acknowledge and respect the responsibility that technologists have in creating a more equitable world, and to ensure they are empowered to steward this responsibility well. To follow are our guidelines for doing so.

Develop a Thorough Understanding of Both the Capabilities and Limitations of Technology

The social impact sector is not a place for technologists to experiment for the sake of experimenting, or to practice their skills until they're “ready for the private sector.” Technology work within communities is its own discipline, because the work greatly impacts people's lives and livelihoods. And so, technologists in this field need to be experts in their craft, regardless of whether that expertise is gained via formal educational settings, boot camps, apprenticeships, on‐the‐job training, or otherwise. Whether on the IT side, the software development side, or the user design side, technologists must understand the ins and outs of the technology they develop, deploy, and manage. Of particular concern is understanding the downstream effects of technology systems—and, accordingly, recognizing when a technology should not be deployed because it would not ultimately further the necessary goals.

Historically, the temptation has been to assume that tech is always inherently helpful. Surely, the assumption goes, adding tech services will help an organization increase its impact or improve efficiency. For example, as the COVID‐19 pandemic raged in 2020, Google and Apple developed and deployed contact‐tracing apps to help track and manage infections. Yet, in the summer of 2021, when the United States saw some of its highest COVID‐19 case numbers to that point, only 2% of cases were being tracked in these apps.5 This is one of many instances where technologists failed to understand the social context of the situation, and pushed for a technical solution that ultimately failed to hold people's interest. (In fact, the authors of this book are considering writing a book titled “I Can Fix Your Social Impact Problem with an App, and Other Fairy Tales.”)

The social impact sector requires technologists to constantly learn about the cultural and societal environments in which their technology is deployed. It also requires an “accessibility first” design default, suitable to all potential users, that includes rather than excludes. In many ways, newer technologists might be better off practicing their skills in the private sector until they're ready for the social impact sector.

Develop a Thorough Understanding of the Situation Being Addressed

No expert in technology, regardless of that person's lived experience, will also be an expert in all social impact areas, with knowledge of the root causes of the problems, interventions that can make progress possible, and the network needed to realize sustained change. For this reason, technologists must approach their work in the social sector with humility, acknowledging—to themselves and others—that they don't know everything, and asking questions so they can learn from the people and organizations that do.

One starting point is to develop an understanding of the social impact ecosystems beyond simply the technology. Humans have evolved to live in community with each other—to such an extent that many socially isolated people suffer from depression and anxiety.6 Accordingly, a technologist's ability to support relationships—with nontechnologists, with the people using and affected by the technology, and with the organizations that serve the community—is crucial for success. Michael Dowd, the DataKind data scientist who led the development of the tool from the preceding John Jay College case study, describes it this way. “It was very important to explain to John Jay College what we were doing because of their deep subject matter expertise with the school's data. Their insights led to changes in how we developed the models that supported the tool; in fact, the decision to build two separate models, one for students who had started at John Jay College and one for students who had transferred into the College, came from conversations between DataKind and John Jay College.”7

The DataKind team partnering with John Jay College got it right: they listened to the people most closely affected by the challenge, and they developed a solution together. This approach ensures representation at decision‐making tables—effectively redistributing power to those most affected by the decision. The technologists first brainstorm a solution alongside the social impact practitioners. They then develop the technology, test it in its intended environment with those who will be using it, and modify the technology as needed.

Within this process it is essential that technologists push for technically sound applications, but also know when not to use a particular technology. Let's say the challenge is a policy one; it won't matter how efficient a system you've built to match people with housing options if the local government hasn't allocated enough affordable housing. Or perhaps the technology can't be used without discriminating against certain populations, as with the artificial intelligence tool designed to help recruiters filter résumés that instead filtered out women.8 A variety of other reasons come down to the same determination: that using technology in a particular manner will not only fail to advance the cause, but will likely inflict harm on someone directly or indirectly touched by the technology.

Negative impacts that disproportionately affect people of color have been documented again and again—not just in the private sector, such as regarding access to housing, attacks on social media, and more, but also in the social impact sector. Although many argue these unintended consequences are unavoidable, this is not the case—and their effects can be lessened by, first, an expansion of racial literacy. As defined by Daniels, Nkonde, and Mir, racial literacy “is a new method for addressing the racially disparate impacts of technology. It is a skill that can be developed, a capacity that can be expanded.” Sasha Costanza‐Chock demonstrates in their book, Design Justice, that “when design processes do not consider inequality in design … [they] are structured in ways that make it impossible to see, engage with, account for, or attempt to remedy the unequal distribution of benefits and burdens that they reproduce.”

Inequality in design doesn't just affect people of different races. A number of other communities are often overlooked in technology design. Even though disabled people use technologies every day, accessibility is often an afterthought for technologists. Only 2% of websites are accessible, meaning that for the 1.2 billion disabled people in the world, performing a simple search online is not all that simple.9 accessFind, which claims to be the world's first search engine for accessible sites, returns only accessible websites in its results. The creation of this program highlights the need for accessibility to be designed in from the start, so that everyone has the option of accessing the tech.

We must also take note of intersectionality in these conversations. People aren't only Indigenous or only female, for example; they can be an Indigenous female, and therefore subject to compounded effects of inequality in design.

Expand the Idea of Who Technologists Can Be

The three main guidelines we've discussed are essential to the goal of ensuring equity in tech. But as for empowering communities, the greatest impact can come from expanding who we see as able to be a technologist and who has access to technologists throughout the development process. According to 2014 data from the Equal Employment Opportunity Commission, nearly 70% of the people employed in the US tech sector were white, whereas whites made up about 60% of the US population as a whole.10 Six years later, the diversity reports published by major tech companies such as Alphabet, Facebook, and Twitter revealed that their stats had remained mostly the same.11 We must expand the diversity of people who become technologists, with their lived experience, personal expertise, and varying perspectives. It is important that technologists, and the organizations they serve, be held accountable to and be in relationship with those affected by their products. Also note that technologists should intentionally bring more people into their process and into their work, even if these additional individuals are not developing code; there are many other roles that bring critical perspectives to projects of this sort.

The organization Code the Dream is living this dream. It “recognizes that people from immigrant backgrounds and communities of color have great ideas and will play a huge part in our 21st century economy.” And so it provides coding courses to students who are predominantly from immigrant, historically underserved, or poor communities.12 As cofounder Daisy Magnus‐Aryitey reflects, “There are so many people from underserved backgrounds who want training and experience, and there's a lot of work in the public interest to be done.” A team of senior developers at Code the Dream works continually on public interest projects and regularly takes on apprentices, usually individuals who have completed Code the Dream's beginning and advanced courses. These collaborations have led to many interesting tech projects that, for example, connect individuals to North Carolina government agencies that help them take advantage of debt relief programs and restore their driver's licenses—thereby also restoring their ability to work and stay connected.

Another solution—developed in part by the children of farmworkers—provides farmworker‐support organizations with better tools for understanding the actual needs of farmworkers. The programs, Conectate and Vamos, are accessible via a website or an app. Because the initial developers had such a strong sense of the farmworkers' lived experience, they were able to assemble relevant focus groups that included both farmworkers and the organizations that served them, which yielded invaluable information. The collaboration also facilitated introductions to and relationships with the organizations that adopted the software. As a result, organizations can be alerted when farmworkers are moving into their service area, and can proactively prepare to distribute food, clothing, and other services. To build on this sort of benefit, we need to ensure that a diverse range of individuals are brought into the decision‐making and design processes. This is the power that comes from expanding who can be a technologist and allowing individuals to build technical expertise while respecting and engaging their identities and communities.

Respect Communities by Using Data Responsibly

Ultimately, the development and deployment of tech systems is a way to empower individuals and empower communities. When done responsibly, this empowerment can also support organizations' abilities to serve their clients and minimize harms inflicted on systemically and historically marginalized populations. One practical concern is the use of data. As the prevalence of data in the social impact sector continues to grow, so do calls for data‐driven decision making and data collection; however, especially because this sector collects personally identifiable information, as well as data that reveals intimate details of their clients' lives, organizations and therefore technologists have an imperative to collect, store, and use data responsibly.

This effort starts with minimizing the data collected in the first place: only what's needed for the project, and nothing more. Although it can be tempting, of course, to collect additional information for a theoretical future application or a potential customization, that temptation needs to be quashed. Next, organizations must consider who needs access to the data collected—and then regularly revisit these assumptions. It is essential to restrict access to the data because doing so decreases the chances of data leaking outside the organization. Finally, technologists must think critically about what data is shared and how. Is information presented in ways that nontechnical employees can easily understand and interpret? Is data shared in ways consistent with what was communicated to the information provider—usually an individual—at the time the data was collected? Following these and other user‐centered data and privacy practices enables technical systems to properly safeguard the personal information they hold.
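One hedged illustration of the access question: an intake system can enforce an explicit allow-list of fields per role, so that a caseworker, an analyst, and a fundraiser each see only what their work requires. The roles and field names below are invented for illustration.

```python
# Hypothetical sketch: field-level data minimization by role.
# The roles and fields are invented; a real system would pair this with
# authentication, audit logging, and encrypted storage.

CLIENT_FIELDS_BY_ROLE = {
    "caseworker": {"name", "phone", "services_received", "next_appointment"},
    "analyst":    {"zip_code", "services_received"},  # no direct identifiers
    "fundraiser": {"services_received"},              # aggregate use only
}

def view_record(record: dict, role: str) -> dict:
    """Return only the fields this role is allowed to see."""
    allowed = CLIENT_FIELDS_BY_ROLE.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

client = {
    "name": "J. Doe",
    "phone": "555-0100",
    "zip_code": "10019",
    "services_received": ["housing referral"],
    "next_appointment": "2021-11-03",
}

print(view_record(client, "analyst"))
# {'zip_code': '10019', 'services_received': ['housing referral']}
```

Revisiting the assumptions then becomes a concrete task: periodically reviewing and pruning the allow-list rather than auditing an undocumented habit.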

BUILDING NEW MODELS FOR TECH DEVELOPMENT

The progress we want to see doesn't stop with changes to the design process. All technology must be built and then maintained. After a tool has been deployed, it will need to be periodically checked to make sure it still works: that underlying data hasn't changed, that access lists haven't been altered, and that supporting technologies are still relevant. If that maintenance can only be done by a select few individuals from outside a community, the solution will not empower the community. And so technologists must eventually move beyond designing with affected communities and strive to transfer ownership to those communities.
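What that periodic checking might look like in practice is a small scheduled script that compares current conditions against the assumptions the tool was built on and alerts a maintainer when they drift. Everything below (the file names, expected columns, and approved-user list) is hypothetical.

```python
# Hypothetical sketch of a scheduled post-deployment health check.
# Paths, column names, and the access list are invented for illustration.
import json
import sys

EXPECTED_COLUMNS = {"gpa", "credits_per_semester", "semesters_enrolled"}
APPROVED_USERS = {"dara", "registrar_office", "advising_team"}

def check_data_schema(path: str) -> list:
    """Flag upstream data changes that would silently break the tool."""
    with open(path) as f:
        current_columns = set(json.load(f)["columns"])
    return [f"missing column: {c}" for c in EXPECTED_COLUMNS - current_columns]

def check_access_list(path: str) -> list:
    """Flag accounts added outside the approved access process."""
    with open(path) as f:
        current_users = set(json.load(f)["users"])
    return [f"unapproved account: {u}" for u in current_users - APPROVED_USERS]

if __name__ == "__main__":
    problems = check_data_schema("data_manifest.json") + \
               check_access_list("access_list.json")
    for p in problems:
        print("ALERT:", p)
    sys.exit(1 if problems else 0)  # nonzero exit lets a scheduler flag failure
```

A check this simple is also easy to hand over: a community maintainer can read, run, and extend it without the original developers.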

But before we can transfer ownership, we first need to develop the tech to later hand over. To follow are some guidelines for doing that.

Buy, Borrow, or Build?

After relationships have been built and inclusive decision‐making processes have been formed, the technologists ultimately have to build and deploy the technical solution. When doing so within a social impact organization—whether assessing a new program to run in the cloud computing environment or building a new website—one of the first decisions technologists must make, in coordination with organizational leadership, is whether to develop a unique solution from scratch or to buy one off the shelf.

Developing a new, completely custom solution allows technologists to design something tailored to the organization's and the community's needs. It allows flexibility, provided there is enough time and funding available, for the technologist to include all of the desired features in the solution. If an organization imagines a new process and a new way of teaching or serving people, a technologist can create software that enables it to do just that. In contrast, purchasing or adapting existing technology often allows a quicker path to adoption for the social impact organization. Speed, however, often comes at a cost in flexibility of business operations; the organization often has to at least slightly alter its processes to fit the workflows and structures of the off‐the‐shelf technology. Maybe an organization used a two‐step process to check people in for services, for example. If the software it purchases requires three steps, the organization will have to update its process so that it can be handled by the technology.

To help determine which approach to use, technologists should ask themselves:

  • Does the social impact organization have a limited percentage of staff who would be comfortable using and maintaining the technology?
  • Are the staff pressed for time? Will waiting for a custom solution have a negative impact on the organization's ability to provide services?
  • Will the technology be designed “for all” and not undergo explicit checks against implicit bias?
  • Will a third party have access to or own any data used by the technology? If so, will a third party be able to share or publish the data?
  • Will a third party be able to prevent the organization's access to the technology without consulting with the social impact organization?

If the answer to many of these questions is no, an off‐the‐shelf solution might be ideal. But if the answer to a majority of the questions is yes, then it's likely the technologists should seek out products that were specifically designed for that social impact sector. If none can be found, then a custom‐made solution might be the way to go. As for the questions themselves, it is crucial that social impact organizations retain the ability to control both the access to data and who can turn on and off the solution. Decisions such as these allow for equitable treatment of people and safeguarding of information and operations.

Once a decision has been made about whether to purchase, modify, or build solutions, technologists must consider what type of development and provider to pursue. If the decision is to acquire existing technology, options include a traditional private sector company, the open source community, and the nonprofit tech development community. To follow are guidelines and some caveats on each.

Private Sector Technology

Private sector companies often enter the social impact sector with the intention of adding a “for good” component to their broader portfolio. They often assume that tools that work in the private sector can readily transfer to the social impact sector, generating many of the efficiency and productivity gains the private sector has enjoyed. However, products designed to succeed in a marketplace that values shareholder return above all else may not be appropriate for the social impact sector. In some cases, such as digital storage space, the functionality needed in the two sectors is similar enough that the technology can be easily applied to the social impact sector. In other cases, such as customer relationship management software, the technology may need modification before being applied in the social impact sector. And in still other cases, the technology should not be used in the social impact sector at all.

Research led by Joy Buolamwini and the Algorithmic Justice League, highlighted in the documentary Coded Bias, is an example of this idealized transition not working in practice. In the film, Buolamwini discusses the research with which she revealed that facial‐recognition systems are significantly less accurate for people with darker skin—especially women with darker skin. Although the overall accuracy of such systems was deemed sufficient by the institutions deploying them—such as law enforcement agencies and landlords—this technology is clearly inaccurate for traditionally marginalized populations. And because these populations deserve equal rights and equal treatment, the technology that we apply to the social impact sector must work for everyone. And so, if technologists pursue private sector products for use in the social impact sector, they must always question for whom the technology works and who is failed or excluded by it. Once those important details have been ascertained, the next step is to learn whether the project's social impact leaders consider the trade‐off acceptable to the community.

Open Source Software Can Strengthen the Ecosystem

Open source software is code that is designed to be publicly accessible—anyone can see, modify, and distribute the code as they deem fit.13 Flexibility and transparency are two of the biggest benefits of using open source software. Because the source code is freely available, technologists have the option of implementing the program as written or modifying the code for their specific needs in a fairly straightforward way. In addition, many open source software programs are regularly used and maintained by a community, so bugs can be reported and fixed, and technologists can learn from each other. These same communities are often vigilant about stopping nefarious behavior, so the software generally stays secure. Open source software is often, but not always, free. And, to some extent, you get what you pay for: even with developer communities, there is no guarantee of a quick resolution to any issues you may find.

Mala Kumar, the Director of GitHub's Tech for Social Good Program on the Social Impact Team, has seen a number of open source projects used for social impact. She shares how one particular open source software program already developed for a social impact organization was able to be quickly modified and applied to another situation. “One example that Clayton Sims, the CTO of Dimagi, told me was how important CommCare became in COVID‐19 contact tracing. A few big tech companies made an attempt at building a contact tracing app, but quickly realized how complicated it was and stopped. Meanwhile, CommCare had been developed in response to Ebola, and could be quickly and appropriately repurposed. Suddenly, CommCare was being deployed in San Francisco, not just in Central Africa. Saving that kind of time and money in emergencies is critical.”

Community Tech and Nonprofit Tech Must Work for Social Impact Workers, Too

Community tech and nonprofit tech represent development communities that explicitly design for the social impact sector in ways that prioritize equity and justice. The online Community Technology Field Guide states: “Community technology is a principled approach to technology that is grounded in the struggle for a more just digital ecosystem, placing value on equity, participation, common ownership, and sustainability.”14 The technology solutions produced by these communities tend to be designed for everyone—including those on the margins of society—in ways that prioritize care of data, access, and ownership. If the products are relevant for another organization's project, social impact technologists can bring these products into their environments with a considerable amount of trust.

Although the social impact sector has many organizations working on similar challenges across the globe, the incentives aren't always aligned for organizations to easily collaborate on implementing technology toward their mission. Also, organizations are often funded for their individual work without being allocated time or resources to share the lessons learned across the sector. But if we can figure out ways to collaborate on technology, and have technologists drive some of the knowledge sharing, we would greatly strengthen social impact programs. Technical learning communities and technical consortia can help with this. NetHope, which brings together large social impact organizations for this purpose, and NTEN, which provides resources and a community gathering space for technologist practitioners, are among the many organizations creating the space for the intentional sharing of the dos and don'ts of technology in the social impact sector.

Most of the discussion in this chapter has been focused on technology's ability to extend the social impact organization's mission in serving people. We need to remember, however, that the needs of the organization's employees are important too. The tech we develop needs to work well for employees. As an example, let's say an organization builds a website that allows its clients to easily access benefits. The clients are well served, but on the back end employees must be physically in an office during set hours, combining digital information submitted via the website with physical papers and charts. How organizations do their work is as important as what they deliver to the communities they serve. As technologists work on projects in the social impact sector, it is essential that the roles of organizations' employees are explicitly included in and enabled by the technology being designed. This calls for ensuring those employees are supplied with hardware and software, and that embedded accessibility features (such as closed captioning) guarantee equal access. As an added bonus, such efforts would advance the employees' work experience and skill sets.

Ethics, Security, and Privacy Must Be the Foundation of Technology Development

In recent years, conversations about “ethics in technology” have entered mainstream technology conferences and mainstream media. Scholars in interdisciplinary fields, such as science, technology, and society studies, have been studying these issues for decades. Many women of color have investigated why and how unchecked technology systems are detrimental to the function of society. For example, Dr. Safiya Noble, MacArthur Fellow and cofounder of the University of California at Los Angeles Center for Critical Internet Inquiry, has challenged “the harms algorithmic architectures cause and shows the necessity of addressing the civil and human rights that are violated through their technologies.” In her book Encoding Race, Encoding Class: Indian IT Workers in Berlin, University of Washington professor Dr. Sareeta Amrute examined the interplay between conceptions of race and programmers. The Founding Director of the University of Michigan's Digital Studies Institute, Dr. Lisa Nakamura, gave a TED Salon talk titled “The Internet Is a Trash Fire and Here's How to Fix It,” which reflects just some of her research.15 There are many, many others.

Still, some of the current, mainstream conversations about ethics in tech seek a quick fix. The authors of this book have attended technical conferences where someone poses the question, “What one thing do I need to do to have ethics in technology?” or “Can you please share a list of ethical algorithms for me to use?” or “How do I do a quick check before I take my product to market to make sure it doesn't cause any problems?” If only it were so easy.

Kathy Pham, Co‐Director of Mozilla's Responsible Computer Science Program and member of the World Economic Forum's Advisory Committee on Tech Ethics, approaches integrating ethics throughout technology development this way:16

When I think about technology and ethics, I think about three things: Who are the people impacted whether or not they use the service; what fields and disciplines do we deeply involve in the decision making and product development process (history, philosophy, law, policy, race and gender studies, humanistic studies, art, and more); and which parts of the product, engineering, and design cycles we can leverage or intervene to drive change. What this might look like:

  • If a team is debating using machine learning on a group of images to determine safety, is there someone around to share the Broken Windows theory or the concept of safety for whom?
  • In the early stages of development as teams design their user interface and what fields to put in a form, does someone default to how gender has been a Boolean (yes or no) value in the past, or that zip code should always be required despite the complex issues that arise with zip codes?
  • When designing a product or service that is intended for everyone, in the case of government for example, are the most vulnerable populations considered first, or is the product designed to an ideal set of circumstances like ability to access the internet, read words on a page, see colors, understand the flow of the site to be able to get through the whole page?
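Pham's second example, about default form fields, is concrete enough to sketch in code. One way a team might avoid a Boolean gender field and a required zip code is shown below; the schema is a hypothetical illustration, not a prescribed standard.

```python
# Hypothetical sketch of an intake-form schema that avoids two defaults
# Pham flags: gender as a binary value and zip code as a required field.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class IntakeForm:
    name: str
    # Free-text self-description instead of a Boolean or a fixed list,
    # and optional, because the service may not need it at all.
    gender: Optional[str] = None
    # Optional: clients without stable housing may not have one, and zip
    # codes can proxy for race and income in downstream analysis.
    zip_code: Optional[str] = None
    # Accessibility needs are asked up front, not bolted on later.
    accommodations: list = field(default_factory=list)

form = IntakeForm(name="A. Rivera", accommodations=["screen reader"])
print(form)
```

The design choice being illustrated is that every field must justify its presence and its type; the defaults encode assumptions whether or not anyone debated them.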

Ethics in tech requires ongoing, intentional reviews of the work being done. The process of innovation is iterative, and as a result, there will be negative externalities to be mitigated in each iteration. Ethics in tech, then, requires truly inclusive processes as described in this chapter. It necessitates technologists intentionally questioning potential sources of bias in the data used to drive programming and decision making. It requires considering the assumptions or historical issues embedded in the system. (For example, if you are trying to develop a tool that identifies what it takes to be a leader, and your test data is from companies that historically have hired only men, then your tool will conclude that to be a leader, you should be male.) And it requires being willing to not implement the solution if testing reveals that the technology excludes or harms people.
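The leadership-tool example in the parenthetical above can be made concrete in a few lines. In this invented dataset, “leader” labels exist only for men, so a model trained on it learns gender as the signal; running exactly this kind of check, with matched candidates who differ only in one attribute, is one way a team could catch the problem before deployment.

```python
# Hypothetical demonstration: a model trained on historically biased
# labels reproduces the bias. All data here is invented.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

# Records from a company that historically promoted only men.
history = pd.DataFrame({
    "is_male":       [1, 1, 1, 1, 0, 0, 0, 0],
    "years_exp":     [10, 4, 7, 12, 10, 4, 7, 12],
    "became_leader": [1, 0, 1, 1, 0, 0, 0, 0],
})

model = DecisionTreeClassifier().fit(
    history[["is_male", "years_exp"]], history["became_leader"])

# Two identical candidates who differ only in gender:
candidates = pd.DataFrame({"is_male": [1, 0], "years_exp": [10, 10]})
print(model.predict(candidates))  # [1 0]: same record, different outcome
```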

The choice to not implement a developed technological solution is tough to make, especially when considering the many demands on time and financial resources in the social impact sector. However, the cost of implementing unethical tech is much greater. In early 2021, DataKind moved to wrap up and transfer a data science program to a partner organization they had been working with for months. In the final reviews prior to deploying the program, DataKind realized there was a problem: the partner organization that had collected and shared the data could not confirm that consent had been provided for all the data collected. Because ownership and protection of data is so important, DataKind ultimately decided to not deploy the program. Even projects by social impact technologists in close partnership with social impact organizations, with good intentions from all involved, are not immune to the challenges of developing ethical technology; this is why checks for bias, discrimination, and ethics must be performed from the very start to the very end of the development process.

In addition to building in ethics throughout the design, development, and deployment lifecycle, technologists must also consider protections—specifically, privacy and security—from the very start of the design process. Social impact organizations hold a treasure trove of deeply personal information that makes them ready targets for nefarious actors: hackers and hacktivists. With pressures to move quickly in the social impact sector, there can be a temptation to skip over privacy and security details at the start—with the intention of adding them at the end of the process, when there's more time or greater knowledge. Unfortunately, this good intention is frequently left unfulfilled.

The Citizen Clinic “support[s] the capacity of politically targeted organizations to defend themselves against online threats”; its Cybersecurity Education Center offers many examples of how social impact organizations should think about and approach security. The Citizen Clinic has helped voting rights organizations develop a more secure account system to control who has access to which shared email and software accounts. It also helped an abortion fund defend itself from threats of data breaches, counterfeit fundraising pages, and online harassment by moving digital assets to more secure digital spaces, defining access controls, and updating systems that “had previously been too difficult to safely and efficiently use.”17 This last consideration applies to all technology development in the social impact space: it must be appropriate for the environment in which it will be deployed—neither overly complicated just for the sake of using fancy technology, nor so unsophisticated that it leaves organizations vulnerable.

Imagine a Bright Future

We know that the technology of today isn't everything we need for everyone who needs it. We also acknowledge the technology skepticism referenced in chapter 1, which derives from the many documented ways that technology can do harm. That said, as technologists work in the social impact sector, they should not be constrained by the failures of yesterday and simply fix what is currently broken. They should boldly create what's needed for the communities they serve.

Technologists can also intentionally build community by sharing with social impact leaders the specific challenges they are working to address. A 2021 report on Building Career Pathways for Diverse Public Interest Technology Entrepreneurs highlights organizations that were effective because entrepreneurs, drawing “from their own lived experiences stemming from their identities, … identified a gap that was felt closely and sought to close it.” The examples include a child of a Chinese immigrant who created a “nonpartisan platform that enables people to select the issues they care about and then receive alerts before Congress is about to vote on any of those issues.” In another, a founder who has several family members with diabetes used “tech to bring people together to make it easier to navigate the healthcare system for Black and Brown folks.”18

When interests such as these combine, technologists truly can advance a social impact organization's mission. Indeed, many private sector companies, such as banks and online retail stores, have become tech companies without calling themselves tech companies. Some academic institutions and government agencies also have large tech departments that drive their operations. A great deal of opportunity exists within the social sector to integrate technology into the core operations of organizations.

Technology within social impact organizations isn't just IT; it's a more comprehensive approach that touches staffing, programming, delivery of services, and evaluation. Organizations must use technology to make decisions, improve mission operations, and more. Appropriately designed technology means responsibly using advanced technology in the right situations. By understanding the complexity and limits of technology, including a diverse group of people throughout the design and development process, and consistently checking that ethical decision making is enshrined in code, social impact technologists can play a critical role in sustaining a transformed organization. There is a difference between simple technology and simply explained technology, and social impact technologists should be willing to provide both. The ability to translate between technical complexity and social mission is key to creating an equitable power structure in which impactful work can be done. The tech that comes next is completely up to us. Let's ensure tech teams and tech companies are inclusive and embedded in the social impact sector, working hand in hand with the communities they serve.

Of course, technologists and social impact leaders are only two pieces of the puzzle. Their visions and collaborations must be funded in ways that empower them to center equity and justice in the work they do.

QUESTIONS FOR WHAT'S NEXT

Technologists working in the social impact sector have the opportunity and responsibility to help social impact organizations expand their missions through the thoughtful application of technology. When technologists make intentional decisions about when to use and when not to use particular technologies, and bring community members along while applying ethical, security, and privacy frameworks, the technology can empower social impact organizations, just as the DataKind and John Jay College partnership empowered the college's administration.

Engaging with technologists to generate inclusive, responsible, community‐centered tools can be challenging if you don't know where to begin and the language is unfamiliar to you. The questions below can be used to guide conversations with technologists to start collaborations that can build the tech that comes next.

Social Impact Organizations

Questions for those working in and with social impact efforts to ask technologists:

  • How have you learned about our particular work and affected communities? How will our communities be included in the design and testing processes?
  • What happens if your technology fails? Who will be harmed?
  • How will you ensure that I understand how the systems are being used? How the data is being used?
  • How will you ensure that your technology will work with the technology and systems we have today? How will you ensure that the cost to maintain your technology won't be prohibitive for us?
  • How will you ensure that your work advances at a reasonable pace, that it respects the organization's time, and that it's ultimately delivered on time?

Technologists

Questions for those building technology for social impact to ask of their peers:

  • Did you partner with organizations and individuals throughout design, development, and testing?
  • How did you determine that this solution you propose is the right approach? What did you decide not to do and why?
  • What lessons do you have to share on building capacity and leadership in the community that will be maintaining the technology?
  • What specific tools and techniques were applicable in this situation and why? Do you have code to share?
  • Are there solutions that you developed here that could be used elsewhere? What organizations or individuals can we talk to, to make that happen?

Funders

Questions for those in positions to fund social impact and technology to ask technologists:

  • What are the expected long‐term maintenance needs for the technology?
  • How will you teach the organization staff how to interact with and maintain what you develop?
  • What support will the organization need to make successful maintenance happen?
  • Based on what you learn in the design process, how will you communicate if changes are needed?
  • How are you connected to the communities impacted by this project? How will they be involved?

Policymakers

Questions for those creating and enforcing policies around technology and social impact to ask technologists:

  • How have you ensured that people will be able to access your technical solution?
  • How have you mitigated bias in your technical solution's development and implementation?
  • Where does the tech solution end and the need for new policies begin? In other words, what are the limits of the technical solution?
  • Where are the components of the technology that are not currently protected or directed by policy?
  • How does this technology maintain protections for the user's ability to control their data?

Communities

Questions for members of affected communities to ask technologists:

  • How can our lived experience be prioritized in the design and development of the tech?
  • How will you invest in our training and knowledge so that we can be part of ownership in the long term?
  • How will you ensure I understand how the systems and data are being used? How will you ensure I control the way my data is being used?
  • How will you ensure our consent will be requested (now and in the future) in relation to the ways data is used to make decisions for and about us?
  • What's the plan for making sure we can continue to use this technology after you're not involved?

NOTES

  1. Zip Recruiter, “What Is the Difference Between a Technologist and a Technician,” accessed September 1, 2021, https://www.ziprecruiter.com/e/What-Is-the-Difference-Between-a-Technologist-and-a-Technician.
  2. DataKind, “Improving College Success Through Predictive Modeling,” April 2017, https://www.datakind.org/projects/improving-college-success-through-predictive-modeling.
  3. Michael Dowd, interview with Afua Bruce, October 11, 2021.
  4. Taken from CUSP Overview Document, collaboratively written by several staff and faculty at John Jay College, and shared in email with Afua Bruce, October 4, 2021.
  5. Rob Price, “Apple and Google Partnered to Develop Contact‐tracing Apps to Fight COVID‐19—But They Fizzled in the US Because People Barely Used Them,” Business Insider, August 26, 2021, https://www.businessinsider.com/apple-google-contact-tracing-apps-barely-used-in-us-investigation-2021-8.
  6. Tulane University School of Public Health and Tropical Medicine blog, “Understanding the Effects of Social Isolation on Mental Health,” December 8, 2020, https://publichealth.tulane.edu/blog/effects-of-social-isolation-on-mental-health/.
  7. Michael Dowd, interview with Afua Bruce, October 11, 2021.
  8. Isobel Asher Hamilton, “Amazon Built an AI Tool to Hire People But Had to Shut It Down Because It Was Discriminating Against Women,” Business Insider, October 10, 2018, https://www.businessinsider.com/amazon-built-ai-to-hire-people-discriminated-against-women-2018-10.
  9. Ernest Hamilton, “AccessiBe's Search Engine accessFind Is Launched to Help Those with Disabilities Find Accessible Websites,” Tech Times, June 14, 2021, https://www.techtimes.com/articles/261437/20210614/accessibes-search-engine-accessfind-is-launched-to-help-those-with-disabilities-find-accessible-websites.htm.
  10. U.S. Equal Employment Opportunity Commission, “Diversity in High Tech,” accessed September 1, 2021, https://www.eeoc.gov/special-report/diversity-high-tech.
  11. Kate Rooney and Yasmin Khorram, “Tech Companies Say They Value Diversity, But Reports Show Little Change in Last Six Years,” CNBC, June 12, 2020, https://www.cnbc.com/2020/06/12/six-years-into-diversity-reports-big-tech-has-made-little-progress.html.
  12. Daisy Magnus‐Aryitey, Zoom interview with Afua Bruce, September 7, 2021. Visit the Code the Dream website at https://codethedream.org/about/.
  13. RedHat, “What Is Open Source,” October 14, 2019, https://www.redhat.com/en/topics/open-source/what-is-open-source.
  14. “Exploring Community Technology: Who Owns Our Technology,” Community Technology Field Guide, accessed September 1, 2021, https://communitytechnology.github.io/docs/intro-ct/investigate-tech/.
  15. Lisa Nakamura, “The Internet Is a Trash Fire. Here's How to Fix It,” accessed September 1, 2021, https://www.ted.com/talks/lisa_nakamura_the_internet_is_a_trash_fire_here_s_how_to_fix_it.
  16. Kathy Pham, email to Afua Bruce, October 5, 2021.
  17. “Welcome to Public Interest Cybersecurity,” last updated November 17, 2020, https://www.citizenclinic.io, and “Case Studies,” last updated July 26, 2020, https://www.citizenclinic.io/Case_Studies/, The Citizen Clinic Cybersecurity Education Center, Center for Long‐Term Cybersecurity, University of California, Berkeley.
  18. Tayo Fabusuyi, Jessica Taketa, and Raymar Hampshire, “Building Career Pathways for Diverse Public Interest Technology Entrepreneurs,” New America, last updated June 30, 2021, https://www.newamerica.org/pit/reports/building-career-pathways-for-diverse-public-interest-technology-entrepreneurs/findings/.