CHAPTER 12
A Smarter Future

INTRODUCTION

As we've outlined in this book, smart tech is quickly changing what we do and how we do it. Leadership is required to actively and purposefully create the conditions for people and organizations to thrive. The combination of leadership and smart tech creates the possibility of a very different kind of future. Here is our vision for organizations and society, a future powered by smart tech that is healthier, kinder, and more generous.

THE FUTURE OF NONPROFIT WORK

We imagine a future where nonprofits use smart tech well and reap the dividend of time. This will allow them to usher in a new era of nonprofit organizations that no longer specialize in burnout and scarcity. Here are ways we want to see nonprofits working as smart nonprofits:

  • Clarify and honor the boundaries between work and home lives.
  • Provide more time off. Maybe four-day workweeks are possible. Certainly more vacation time is warranted. And planned sabbaticals for staff at every level would be a huge and deserved benefit for long-serving staff.
  • Focus on the real work, not the busyness. We want smart, talented people being creative, proactive problem solvers. We want organizations that nurture the talents of staff and support their growth and learning.
  • We want staff to end their days with a greater sense of satisfaction that comes from really helping to heal the world, rather than feeling numb and exhausted.

The work of organizations will change. It will hopefully include:

  • More time spent getting to know clients, board members, donors, and volunteers: learning who they are and what their needs are; solving problems before they turn into crises for clients; understanding why your cause is so important to board members, donors, and volunteers.
  • Emphasis on solving problems rather than merely servicing them. We want more organizations to become great advocates for more money and support for, say, affordable housing, and to spend less time turning away enormous numbers of people who need emergency housing.
  • Real-time information on what's working and what isn't, what services are needed, who needs help, and how to get it to them.

A BETTER PATH FORWARD

We need to do more than change the contours and practices of individual organizations. We need to reknit communities that never recovered from the precipitous drop in social capital at the beginning of the twenty-first century, which atomized citizens from one another and eroded trust in, and membership of, institutions such as religious congregations, unions, and clubs. These communities are now reeling from the social and economic impact of COVID-19.

Nonprofits are the backbones of communities. We provide healthcare and emergency services and run schools and arts organizations. We need to take the lead on reknitting communities where people are known and cared for, where loneliness is reduced and social capital is increased. Nonprofits can help neighbors get to know one another and begin to rely on each other for help and information. Nonprofits can become beacons of ethical leadership by restoring people's faith and trust in institutions in general.

We can also create real-time information systems for people in need. There are already 2-1-1 systems people can call for emergency services. Smart tech could make these systems much more robust by using the data to predict future problems (e.g., increased need for food assistance or rising homelessness) and by connecting donors to those needs in real time. For instance, Houston could have 36 homeless people in need of a bed on a Tuesday night. The smart system would not only direct social service agencies to send those people to available beds but also enable donors to give, say, $100 to house each person.1
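The logic behind a system like the Houston example is simple once the data is real-time: pair each person in need with an open bed, then convert any unmet demand into a dollar ask that donors can fund that same night. Here is a minimal sketch; the shelter names, bed counts, and $100 cost per bed are illustrative assumptions, not features of any real 2-1-1 system.

```python
# Hypothetical sketch of real-time needs matching. Shelter names,
# bed counts, and the cost per bed are illustrative assumptions.

def match_needs_to_beds(people_in_need, shelters, cost_per_bed=100):
    """Place each person in a shelter with an open bed; turn any
    unmet demand into a dollar ask donors can fund in real time."""
    placements, unplaced = [], []
    for person in people_in_need:
        shelter = next((s for s in shelters if s["open_beds"] > 0), None)
        if shelter is not None:
            shelter["open_beds"] -= 1
            placements.append((person, shelter["name"]))
        else:
            unplaced.append(person)
    donation_ask = len(unplaced) * cost_per_bed
    return placements, unplaced, donation_ask

shelters = [{"name": "Midtown", "open_beds": 20},
            {"name": "Eastside", "open_beds": 10}]
people = [f"person-{i}" for i in range(36)]  # 36 people, as in the example
placed, unplaced, ask = match_needs_to_beds(people, shelters)
print(len(placed), len(unplaced), ask)  # 30 placed, 6 unplaced, $600 needed
```

In practice the matching would run on live service data; the sketch only shows that the hard part is gathering real-time information, not the matching itself.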

Local municipalities and organizations should be active proponents of robot companions. The loneliness epidemic hits frail seniors the hardest, and most people want to stay in their homes as they age. Robot companions will be far less expensive than home health aides, of whom there is already a shortage. Robots will be able to provide basic care, such as measuring blood pressure, preparing food, monitoring vital signs, and reading and talking to people. We would prefer that people socialize with one another; however, robots are a good alternative.

Integrating causes into every aspect of our everyday lives will also help remake communities. There are apps to help us do everything imaginable, including exercise, shop, diet, and manage our money. Betterment is a money-management app. Users can pay bills, make investments, and donate to causes using the app, and can access a dashboard to see all of their financial plans and transactions in one place and track savings and other goals. This is the direction of financial management: real-time data and dashboards showing all of your investments and accounts at once, including your charitable activities. An app like Betterment could also become a home base for lifestyles that incorporate philanthropy, green purchases, volunteer opportunities, and investments in BIPOC-owned companies: basically any filter a person wants to ensure that their financial transactions reflect their values and interests. There could even be scores for living your values. The app would provide updates and reminders of volunteer activities and the ability to remind friends as well.

SOCIETAL CHANGES

Finally, we need to address changes at the societal level to create more equitable communities. There are many societal concerns, including diversity in the programming field, data sovereignty, and civil rights.

Diversifying the Field of Programmers and Data Scientists

The field of programming continues to be dominated by white men, making it more likely that unintentional bias is baked into computer code. Diversifying a field like programming is exactly the kind of problem the nonprofit sector specializes in solving. This won't be easy, of course. A recent report finds that progress on diversity in STEM (science, technology, engineering, math) fields is stubborn, slow, and uneven. For instance, the study found that, “Black workers comprise just 5% in engineering and architecture jobs. There has been no change in the share of Black workers in STEM jobs since 2016.”2

We need more funding for STEM programs and an increased awareness in tech companies and tech users of the impact that a lack of diversity has on embedded bias. We need to make the recruitment and training of diverse people as programmers and data scientists a top priority.

Data Sovereignty

One of the biggest economic questions to be addressed in the next several years is: Who owns personal data? This is not just an economic question; it is a profound moral one as well.

Researchers began collecting DNA samples from Indigenous populations in the 1990s. The original purpose was to preserve samples from vanishing tribes, and the resulting data sets were made publicly available for science. What those original researchers couldn't have anticipated was that “science” would soon include companies like Ancestry and 23andMe, which added the information to databases from which they profit. In 2018, GlaxoSmithKline invested $300 million in 23andMe to use 23andMe's database of digital sequence information for its own research.3 This is reminiscent of what happened to the cancer cells extracted without permission from Henrietta Lacks in 1951. These durable and replicable HeLa cells became vital for fundamental research and drug development. It took decades and a best-selling book before Henrietta's descendants were compensated for this gift to science.4 There is a growing movement for data sovereignty by Indigenous people: the idea that Indigenous people govern the collection, ownership, and application of their own data.5

We need a similar movement for data sovereignty for all people. We need to change the default from opt-out data collection to opt-in systems that each individual owns and regulates. There will be furious opposition to this change, since the vast data-sucking machines are the fuel of the big data robber barons. However, just because vacuuming up personal data for corporate profit has become the norm doesn't make it smart or right.

We understand why nonprofits may resist changing these standards: user data has become essential intelligence for fundraising and program delivery. But is it possible that organizations are also afraid users will opt out of data systems entirely? If so, and if you can't make a solid, persuasive case to your clients, donors, and volunteers for why collecting their data is important to your efforts and why you are a reliable steward of its ethical use, perhaps you should think twice about collecting it in the first place.

We do not have to follow commercial norms that default to “whatever we can get away with.” There are ethical options for data sovereignty that begin with every person's right to own their personal data. In this new scenario, individuals give companies permission to use their data for clearly identified purposes and a limited amount of time.

Data sovereignty is not a new idea. Kaliya Hamlin, Doc Searls, Phil Windley, Kim Cameron, and others have been advocating for personal data rights, what they call self-sovereign identity, since the early 2000s. Their appeals have so far been subsumed by the avaricious interests of corporations. Perhaps, like early suffragettes and civil rights warriors, they have laid the groundwork for the battles ahead.

We could even rent our data to companies or donate it to causes as microdonations. Rhodri Davies describes this idea. “The monetization of personal data through self-sovereign identity may … create huge new volumes of micro-transactions. We could harness this for charity by introducing mechanisms that direct a percentage of each transaction automatically to good causes … ”6
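Davies's mechanism can be sketched in a few lines: skim a fixed percentage off each data transaction and credit it to a per-cause ledger. The 1 percent rate and the cause name below are assumptions for illustration only.

```python
# Minimal sketch of micro-donations from data transactions.
# The 1% rate and the cause name are illustrative assumptions.
from decimal import Decimal

DONATION_RATE = Decimal("0.01")  # route 1% of each transaction

def route_microdonation(amount, ledger, cause):
    """Split a payment into net proceeds and a micro-donation,
    accumulating the donation in a per-cause ledger."""
    amount = Decimal(str(amount))
    donation = (amount * DONATION_RATE).quantize(Decimal("0.01"))
    ledger[cause] = ledger.get(cause, Decimal("0")) + donation
    return amount - donation

ledger = {}
net = route_microdonation(10.00, ledger, "food bank")
print(net, ledger["food bank"])  # 9.90 0.10
```

Using fixed-point `Decimal` arithmetic rather than floats matters here: micro-transactions are tiny, and rounding drift across millions of them would otherwise silently misallocate money.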

A New Era in Civil Rights

Sonia Katyal, codirector of the Berkeley Center for Law and Technology, predicted that in 2030, “Questions about privacy, speech, the right of assembly and technological construction of personhood will all re-emerge in this new AI context … Who will benefit and who will be disadvantaged in this new world depends on how broadly we analyze these questions today, for the future.” We believe that we need to wrestle with these questions now, not wait until embedded bias widens and deepens the racial, gender, and wealth gaps we already have.7

Who decides? The US Department of Defense is experimenting with automating lethal weapons for warfare. This means drones and robots would have the power to decide whether and when to kill enemy combatants. The technology to make this possible already exists, but it raises a host of moral and ethical questions. Civilians could easily be mistaken for enemy troops. And when the human cost of war is lower, decision makers are more likely to go to war. Most importantly, who will decide that autonomous killer drones are acceptable?8

Who is going to decide what the boundaries are for the use of smart tech to substitute for human decision-making? When will smart cars be safe to drive? What parameters will be created for robots providing home healthcare? Can they diagnose their clients? Can they provide drugs? Who is going to license and oversee these software and hardware entities?

William Uricchio, media scholar and professor of comparative media studies at MIT, commented, “AI and its related applications face three problems: development at the speed of Moore's Law, development in the hands of a technological and economic elite, and development without benefit of an informed or engaged public. The public is reduced to a collective of consumers awaiting the next technology.”9

We cannot afford to be that passive group of consumers. We need to be an educated, active citizenry that presses elected officials and other decision makers for the future we want, with smart tech supporting us, not the other way around. Elected officials have so far shown very little interest in reining in big tech companies. Although there are bipartisan grumbles right now, there is little indication Congress has the stomach (and the willingness to forgo donations) to actually curtail big tech's data extraction and invasions of privacy. Citizens are going to have to put the steel in the spines of elected officials and insist that the ethical use of smart tech cannot be left in the hands of corporations alone.

The public must insist on a future where smart tech companies are properly regulated and monitored to actively mitigate bias and unethical uses of their technology. This will require opening up black boxes so that we can see how they work, examining the data used to train and run these systems, insisting on proof that tools and systems aren't biased, and creating guidelines for the use of data with significant penalties for misuse. The European Union is already moving in this direction, and we can pursue a similar course.

This is where the nonprofit sector should be leading and not following. Our associations and membership groups should be working right now to set standards for our own use of smart tech that can become a model for other sectors. Perhaps there is a need and an opportunity to create a Good Housekeeping-style seal of approval for the ethical and responsible use of smart tech. Finally, we need to call out any uses of smart tech that degrade our common humanity. This includes strengthening workers’ rights not to be managed by algorithms alone, and everyone's right to control their own data.

The future of civil rights is even going to include robots. We need to begin asking and answering questions such as: “At what point might a robot, algorithm, or other autonomous system be held accountable for the decisions it makes or the actions it initiates?” The reverse also holds true: “At what point, might humans be held accountable to robots?” We appreciate that these might sound like absurd arguments right now; however, the fundamental idea is very important: How are we going to assign moral responsibility for actions taken by autonomous tech?10

CONCLUSION

While no timetable exists for becoming a smart nonprofit, it is urgent that you and your organization begin the work today. The costs of willfully ignoring the impact of automation are unacceptable. All organizations are going to need to be curious, open, thoughtful, careful, and engaged about the use of smart tech. Today's complex and hard-to-solve problems require different solutions, and smart tech gives you the opportunity to reset cultural norms of busyness and transactional work and to become smarter, better, and more successful.

The gift of time offers you, your colleagues, and constituents the opportunity to build stronger relationships, think of new approaches to do the work, and do the kind of soul-fulfilling work that initially interested you. If every nonprofit in the sector can transform itself into a smart nonprofit, we can transform the world.

ENDNOTES

  1. Catherine Clifford, “Artificial Intelligence will generate enough wealth to pay each adult $13,500 a year,” CNBC.com (March 17, 2021), https://www.cnbc.com/2021/03/17/openais-altman-ai-will-make-wealth-to-pay-all-adults-13500-a-year.html.
  2. Richard Fry, Brian Kennedy, and Cary Funk, “STEM Jobs See Uneven Progress in Increasing Gender, Racial and Ethnic Diversity,” Pew Research Center (April 1, 2021), https://www.pewresearch.org/science/2021/04/01/stem-jobs-see-uneven-progress-in-increasing-gender-racial-and-ethnic-diversity/.
  3. Sabrina Imbler, “Training the Next Generation of Indigenous Data Scientists,” New York Times (June 29, 2021), https://www.nytimes.com/2021/06/29/science/indigenous-data-microbiome-science.html.
  4. National Institutes of Health, “HeLa Cells: A Lasting Contribution to Biomedical Research,” August 30, 2021, https://osp.od.nih.gov/scientific-sharing/hela-cells-landing/.
  5. Kalen Goodluck, “Indigenous data sovereignty shakes up research,” High Country News (October 8, 2020), https://www.hcn.org/issues/52.11/indigenous-affairs-covid19-indigenous-data-sovereignty-shakes-up-research.
  6. Rhodri Davies, “Knowing Me, Knowing You: Self-sovereign Digital Identity and the Future for Charities,” CAF blog (July 27, 2017), https://www.cafonline.org/about-us/blog-home/giving-thought/the-future-of-doing-good/self-sovereign-digital-identity-and-the-future-of-charity.
  7. Janna Anderson and Lee Rainie, “Artificial Intelligence and the Future of Humans,” Pew Research Center (December 10, 2018), https://www.pewresearch.org/internet/2018/12/10/artificial-intelligence-and-the-future-of-humans/.
  8. Will Knight, “The Pentagon Inches Toward Letting AI Control Weapons,” Wired (May 10, 2021), https://www.wired.com/story/pentagon-inches-toward-letting-ai-control-weapons.
  9. Janna Anderson and Lee Rainie, “Artificial Intelligence and the Future of Humans,” Pew Research Center (December 10, 2018), https://www.pewresearch.org/internet/2018/12/10/artificial-intelligence-and-the-future-of-humans/.
  10. David Gunkel, “2020: The Year of Robot Rights,” MIT Press Reader (January 7, 2020), https://thereader.mitpress.mit.edu/2020-the-year-of-robot-rights/.