Chapter Six
Changing Laws and Policies

Policymaking is ultimately about effecting change for large groups of people; centering the communities most affected by potential policies in the policymaking process is the only way we will create a more equitable world. Engaging the public in the design of policy allows policymakers to address the issues communities face—infrastructure, access to services, and more. Technology can support the flow of information throughout communities and between communities and policymakers. But what happens if communities don't have access to the technology? Or, what happens if the technology to which communities do have access causes confusion, disenfranchises people, or fails to protect them?

The policymaker who comes next must consider all these aspects in their role. They must insist on involving community members and social impact organizations in the policymaking process. They must make sure communities can access technology. They must understand how technology intersects with the issues they govern. They must require that the technology that touches society is secure, trustworthy, and not exploitative. It's a lot to manage, but by listening to their constituents, getting advice from experts, and effectively using technology themselves, policymakers can create rules and regulations that enable a thriving society.

In a world where regulation is scarce and there are no requirements to institute social checks and balances on the work of social impact organizations, however, the people who design technology are the de facto policymakers, the creators of social paradigms, and the deciders regarding who receives services and who doesn't. It's a weighty responsibility for technologists to carry alone—and it should not be this way.

Take, for example, the seemingly straightforward act of displaying ads on a social networking site. Algorithms, developed and implemented by technologists, now perform a portion of content moderation that previously was handled entirely by humans via physical mediums—such as print newspapers or even bulletin boards. One point in favor of such algorithms is the fact that, say, an ad for housing that explicitly states certain groups are unwelcome will be quickly removed from most sites. But what if advertisers want to target their ads with a bit more nuance? In 2019, the US Department of Housing and Urban Development charged Facebook with housing discrimination. Facebook's ad practices, which allow for traditional market segmentation, easily lend themselves to discriminatory practices, since advertisers can restrict ads from being shown to groups who fall under the protections of the Fair Housing Act.1

Algorithms used on social networking sites display their best guess of what you want to see; this means you will often see what people similar to you are looking for, and you won't see content that people different from you search for. And so, in the case of the Facebook example, if you are a Hispanic woman looking for a place to live, you may not see ads for apartment buildings that are overwhelmingly occupied by white residents—whether or not you would have voluntarily included or excluded that building in your search.
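The mechanics behind this kind of exclusion can be made concrete with a small, hypothetical sketch. This is illustrative logic only, not any platform's real code; all names and fields are invented. The point it demonstrates is that an audience filter keyed on a proxy attribute, such as ZIP code, silently determines who ever sees an ad, without any protected class being named anywhere.

```python
# A deliberately simplified sketch (not Facebook's actual system) of how
# audience filters in ad targeting can reproduce discrimination.
# All classes, fields, and values below are hypothetical.

from dataclasses import dataclass


@dataclass
class User:
    name: str
    zip_code: str


@dataclass
class AdCampaign:
    title: str
    excluded_zip_codes: set  # proxy attributes can stand in for protected classes


def eligible_audience(users, campaign):
    """Return the users who can ever be shown this ad.

    Everyone else is silently filtered out -- they never learn
    the ad existed, and so cannot contest the exclusion.
    """
    return [u for u in users if u.zip_code not in campaign.excluded_zip_codes]


users = [User("A", "44113"), User("B", "44122")]

# Excluding ZIP codes that correlate with race or income acts as a digital
# form of redlining, even though no protected class appears in the code.
campaign = AdCampaign("Apartment listing", excluded_zip_codes={"44113"})

print([u.name for u in eligible_audience(users, campaign)])  # prints ['B']
```

Nothing in this sketch mentions race, religion, or national origin, yet the outcome can be discriminatory all the same, which is precisely why regulators scrutinize targeting criteria rather than just ad copy.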

Individuals who make policies and plans for local, state, and federal governments—that is, policymakers—who have often spent their careers in the policymaking world, traditionally come from nontechnical backgrounds. This would not be a significant problem if society still operated as it did a century ago—before modern technology was so intertwined in how people live their lives and how policy is implemented. For example, the disastrous 2013 launch of healthcare.gov made the policymakers who wrote the 2010 Affordable Care Act vividly aware of the importance of ensuring that the site's technical design complemented the policymaking.

But this is not to say that all policymakers should also be technology experts; that would be unreasonable. Policymaking itself is a specialized skill that takes time and experience to refine. But policymakers must consider how technology can extend the impact of policy. The conversation about policymaking for what comes next must come from two angles: policy about technology, and technology in policymaking. The common factor in improving policymaking in both cases is strengthening the connection between policymakers and community members. To this end, it is important that policymakers proactively seek out technical expertise—and not just in corporate America, but also within grassroots advocacy organizations and other social impact organizations. Similarly, it is important that technologists learn how to simply and sincerely explain to policymakers the technology they develop. Without this collaborative spirit, policymakers and technologists will continue to talk past each other, and no useful or practicable policy will be developed. For another example, consider the issue of regulating encryption. Privacy advocates argue that encryption protects consumers, whereas law enforcement argues that it thwarts lawful investigations. How can members of Congress be expected to regulate encryption when they struggle to understand its definition, let alone how it works?

Just as technology can be used to advance the missions of social impact organizations, technology can also be used to structure policymaking conversations in new ways. Technologists can inject new‐to‐policy design processes that consider how policies may be successfully technically implemented, as well as how to repeatedly learn and adjust based on testing and piloting designs.

Chapters 3, 4, and 5 discussed how individual roles within the social impact sector can embrace technology to facilitate conversations with community members and advance justice and equity missions. Each of these roles has significant power and agency to influence both their individual organization's work and the work of coalitions. However, to effect widespread systems change, policies, regulations, and laws must also change. Policymakers are uniquely positioned to adjust these levers, and they too can actively participate in building what comes next.

CASE STUDY 1: NATIONAL DIGITAL INCLUSION ALLIANCE

Angela Siefer is the Executive Director at the National Digital Inclusion Alliance. NDIA “is a unified voice for home broadband access, public broadband access, personal devices, and local technology training and support programs. NDIA is a community of digital inclusion practitioners and advocates. [They] work collaboratively to craft, identify, and disseminate financial and operational resources for digital inclusion programs while serving as a bridge to policymakers and the general public.”2 In 2019, NDIA joined conversations about the emerging Digital Equity Act—sponsored by Senator Patty Murray, Chair of the Senate Health, Education, Labor, and Pensions (HELP) Committee; Senator Rob Portman; and Senator Angus King—which “would help close the digital divide impacting communities across the nation.”3 The content of the Act built on previous digital equity policy work in Seattle, Washington. Fortunately, valuable policies can start at various levels; strong local‐ or state‐level policies can be considered and modified at the federal level, thus serving as blueprints for policies to be implemented nationally. As such, the Seattle‐area policy work to support comprehensive interventions for digital divides served to inform how a national policy could similarly address the complexity of the issues. NDIA joined ongoing conversations about what would become the Digital Equity Act at the request of staffers in Senator Murray's office, who sought input from a number of organizations and community leaders for insight into how to ensure that the policy was both balanced and representative of best practices.

NDIA, as a coalition and policy organization, has a number of social impact organizations as members. These organizations are the community leaders on the ground teaching folks to use the internet, getting devices for community members, helping people sign up for high‐speed internet or building it where none exists, and more. By collaborating with these organizations and individuals, NDIA is able to gather broad examples and understanding from communities across the United States, advocate on behalf of organizations and individuals regarding the challenges they face, and adjust advocacy efforts based on any changing circumstances on the ground. In essence, NDIA takes their coalition members' expertise and experiences to the policymakers so that they can understand the lives and challenges of their constituents.

Angela sees the Digital Equity Act as a valuable example of what policy could be: “The uniqueness of the Digital Equity Act is that it is holistic in nature. It goes after all the barriers. It's not just about devices, the internet connection, or digital literacy. Those things tend to get put into buckets, and when there has been funding they are funded separately. This is the first time we have a holistic approach—considering all those things are important.”4 How did this version of this particular bill come to reflect the lessons communities had learned? It started with information sharing among committees and policymakers.

Senator Murray's office, Angela reports, sought input and opinions from NDIA and other advocacy organizations. “They didn't need to do this, but they did”—and because they did, they were able to learn from previous policy attempts to address digital equity around the country. With prior policies, funds earmarked for digital inclusion “were separated, and therefore could only go so far.” NDIA coalition members had seen that when funds had to be spent purely for workforce or health or connectedness, access to digital technologies only marginally improved individuals' quality of life. “People don't only use the internet to find a job, then decide they don't need to use the internet to email or text family and friends,” Angela noted. A holistic funding strategy enshrined in a holistic policy would give communities a better chance to truly bridge the digital divide.

The information sharing did not only go from social impact organizations to policymakers; information also flowed back to the social impact organizations. Angela described the education process that NDIA coalition members underwent as they worked with policymakers on the Digital Equity Act. Witnessing a policymaker saying, “I would love to do that, but …” allowed NDIA members to turn the interaction into a teachable moment. NDIA members would then ask, “What are the laws and limitations of that particular government agency in which we want to effect change? What are the barriers and restrictions we need to work within?” With this understanding, NDIA members could navigate the particular laws and policies currently governing particular operations, and change their information sharing with policymakers based on this understanding.5

After years of work, in 2021, the Digital Equity Act was passed as part of the bipartisan Infrastructure Investment and Jobs Act. The bill created out of the listening and learning among policymakers and community organizations “aims to address these access gaps by encouraging the creation and implementation of comprehensive digital equity plans in all 50 states, DC, and Puerto Rico, [as well as by] supporting digital inclusion projects undertaken by groups, coalitions, and/or communities of interest. With this support, we can further our efforts to bridge the digital divide.”6 The work of NDIA and other organizations to gather and amplify the voices of communities, educate policymakers on the issues, and learn about the limitations of existing laws and authorities led policymakers to write and pass a bill that can positively affect lives and increase connectedness in new, more comprehensive ways.

CASE STUDY 2: RURAL COMMUNITY ASSISTANCE PARTNERSHIP

Nathan Ohle began his tenure as CEO of the Rural Community Assistance Partnership (RCAP) in 2017. From the start he was curious about, among many issues, how effectively RCAP used technology to support its operations and its membership. RCAP is a national nonprofit focused on securing and maintaining access to water and economic development for rural communities across the country; it is the “national network of non‐profit organizations working to provide technical assistance, training, resources, and support to rural communities across the United States, tribal lands, and US territories. [Its] federal programs and organizational interests are maintained in the national office in Washington, DC, while state and regional programs and field work are managed through six regional offices.”7

He opted to partner with New America's Public Interest Technology team to conduct a sprint—a short but intensive and focused effort to review the current state of affairs and make recommendations to strengthen structures, systems, and technology. The sprint was needed, Nathan felt, so RCAP could learn how they might use technology to drive positive change; they would use the ground truth they discovered to develop a strategy for the capacities RCAP needed to fill any revealed gaps and to prepare for future operations. A month later, the sprint team had delivered a report with recommendations about the type of technical talent the organization should hire, the specific technology products that could be acquired, and more.

Since the RCAP/New America Public Interest Tech partnership was successful, the teams looked for additional ways to collaborate. In particular, Nathan had observed how well the technologists had structured the conversations within the organization. Nathan (a former policymaker himself) and the New America Public Interest Tech team (also composed of former policymakers) wanted to test how tech tools and methods—which could improve the internal workings of organizations—could also improve the policymaking process.

Around the same time, the 2018 Farm Bill was being drafted using traditional, incremental policymaking processes. The Farm Bill is “a package of legislation passed roughly once every five years that has a tremendous impact on farming livelihoods, how food is grown, and what kinds of foods are grown.” It covers “programs ranging from crop insurance for farmers to healthy food access for low‐income families, from beginning farmer training to support for sustainable farming practices.”8 The team asked themselves: Are there new, innovative ideas that they should seed with policymakers now, as opposed to waiting to see what the House and Senate committees come back with?

The group decided to convene an Innovation Summit, a daylong event with “a coalition of rural community advocates, nonprofit leaders, and technologists to try out new methods of policy creation.”9 A number of technologists were identified to structure and facilitate the day. They would use a variety of tools common to inclusive tech development processes to bring to this policy brainstorming session the ability to include a number of diverse perspectives, to distribute power in decision making, and to develop prototypes. As Nathan recalls, “Technologists helped structure how to set up the day, how to drive thinking in new, innovative ways, and how to drive conversation after the Summit with people who were bought into the concepts discussed.”

Over the course of the Innovation Summit, the large group, with representatives from multiple communities, came up with ideas for seven programs. Following the Summit, leaders of the group held smaller conversations to determine which idea had the most support and would best serve the needs of the communities the Summit participants represented. Taking part in the policymaking process, Nathan emphasizes, is made more effective when social impact organizations can build a coalition and advocate to policymakers with one voice. Accordingly, “We asked folks to go as a coalition to the Hill, and then took a subset of the group to speak with staff members on the Senate and House Agriculture Committees.”10

During their conversations with staff members, the small, representative coalition provided an outline for a program that evolved into the Rural Innovation Stronger Economy (RISE) grant program. They answered questions from the policymakers and described how a new, innovative program would truly have an impact on the lives and livelihoods of members of rural communities across the country. In response, policymakers shared their knowledge about which levers could not be pulled—such as when a particular agency lacked authority to make a change, or when current laws or regulations would prohibit the implementation of parts of the proposal.

Policymakers finalized the RISE grant program details and continued the legislative process. In December 2018, the Farm Bill was signed into law; the RISE grant program “offers grant assistance to create and augment high‐wage jobs, accelerate the formation of new businesses, support industry clusters, and maximize the use of local productive assets in eligible low‐income rural areas.”11 What started from conversations among social impact leaders, rural community members, and technologists—using key techniques from social impact technologists—ended up as a new program managed by the federal government.

INSIDE THE PRACTICE OF CHANGE

Taken together, these two case studies describe the spectrum of work that policymakers must tackle: policy about technology and technology in policymaking. This spectrum applies to the universe of policymakers—elected officials, their aides, and their staffers—at federal, state, and local levels of government. A common factor in both the NDIA and RCAP work, reflected in the Digital Equity Act and the Farm Bill, was the creation of coalitions. In both cases, the power of including individuals with community expertise helped organizations to both learn from each other and prioritize interests. A second common factor was the importance of information sharing, communication, and accountability. Social impact organizations educated policymakers on community priorities, and policymakers educated social impact organizations on the confines of policy. A third common factor was the inclusion of technical expertise, although it looked different in each case. In the NDIA case study, policymakers benefited from hearing how the technology affected lives and how individuals needed to use the technology. In the RCAP example, the policy design process benefited from technical design thinking from the start.

At no point, however, was either group expected to have expert‐level knowledge of the other players' constraints; rather, the ongoing efforts to center community challenges led to productive policymaking. Public interest technologist and cybersecurity policy expert Maurice Turner offers this advice to social impact organizations interested in affecting policy: “Treat policymakers as though they are people. They are people. If you are comfortable talking to people, then you can be comfortable talking to policymakers.”12 The reverse is also true: policymakers must also talk to communities and social impact organizations as though they are people. The policies and laws created provide a clear path for accountability between policymakers and communities. As with social impact organizations, technologists, and funding entities, policymakers must have conversations that center the community perspective in order to build what comes next.

Creating Policy About Technology

There are many opportunities to improve policy about technology, spanning the ways technology is developed, the ways people gain access to technology, and the ways technology is regulated. While it may seem that technology is changing every day, there's no reason policymakers cannot work with social impact organizations, technologists, funders, and communities to proactively and intentionally create policies and policy frameworks that serve to provide security and protection for all users.

Increase Access for Individuals

One of the biggest challenges today is simply ensuring that people have access to the technology they want and need, whether to connect with other people, organizations, businesses, services, or government agencies. People need access to technology not only to inform their lives, but also to communicate with their respective policymakers. Unfortunately, the digital divide—the gap between the people who do and do not have access to computers and the internet—leaves many people unable to participate in basic transactions. As Dr. Nicol Turner Lee writes, “Already facing diminished life chances, people with lower incomes, people of color, the elderly, and foreign‐born migrants in rural areas run the risk of being on the wrong side of the digital divide that further exacerbates their economic, social, and political marginalization.”13 In some communities, residents resort to frequenting fast‐food restaurants to access their Wi‐Fi services. Stop‐gap measures such as these demonstrate the desire for connectivity, and the failure to provide it.

Digital redlining is the lack of broadband availability in particular communities. In 2017, a mapping analysis of Federal Communications Commission (FCC) broadband availability data revealed “that AT&T has systematically discriminated against lower‐income Cleveland neighborhoods in its deployment of home Internet and video technologies over the past decade” by not extending the necessary technical infrastructure to the majority of Cleveland Census blocks where poverty rates were above 35%.14 AT&T responded to this particular report by claiming they continuously invest in expanding services and enhancing speeds, and that they were in the process of conducting “technology trials over fixed wireless point‐to‐point millimeter wave and G.fast technologies to deliver greater speeds and efficiencies within [their] copper and fiber networks.”15 Regardless of the intent in Cleveland, the reality remains that a significant percentage of neighborhoods in the United States lack access to high‐speed internet, and policymakers have begun to take note. The proposed Anti‐Digital Redlining Act of 2021 “would require the FCC to investigate whether internet service providers have discriminatory practices based on income, race, color, religion, national origin, and other factors within a geographic area.”16 After hearing the outcry of social impact organizations and understanding the tools available, policymakers are taking action.

Much of the problem results from the lack of significant and sustained efforts to increase broadband access, which is the primary way people access the internet with enough strength to take advantage of the technology that powers individual websites and programs. Fortunately, there are policies such as the Emergency Broadband Benefit, which provides “a discount of up to $50 per month towards broadband service for eligible households and up to $75 per month for households on qualifying Tribal lands.”17 But, though such policies are useful, they, like Wi‐Fi access in the parking lots of restaurants, are still only temporary band‐aid solutions to a much bigger problem.

When broadband networks are built, it is important to maintain both the networks and access to the networks so they reliably provide the speed needed for efficient use. Funding must be provided for this, and policymakers have the authority to provide that funding.

A second major challenge of providing access to technology is ensuring that people continue to have agency over their lives. As more of our daily interactions are facilitated by technology, questions emerge around access to and ownership of our information.

It's important to recognize that lax security and privacy practices with technologies can, and often do, translate to real, even physical harm. For example, the SpyFone app, supposedly intended as a means for parents to monitor their kids' online activities and location, allowed “stalkers and domestic abusers to stealthily track the potential targets of their violence.” The resulting crimes the app enabled led the Federal Trade Commission (FTC) to ban the company and its CEO from the surveillance business in September 2021.18 We must examine how we can shift policymaking so as to better protect consumers.

We also have to explore the ways that institutions and communities can collaborate so as to design an environment that allows folks to create and thrive safely. To change systems—the very systems that so many social impact organizations fight against or try to work around—we must change policies. One way to navigate this is through advocacy organizations leading coalitions that speak directly with policymakers. Another option is the creation of public–private partnerships, which bring together resources and talents across multiple sectors to deliver services. Well‐executed public–private partnerships can advance many of the goals of social impact organizations. In 2010, for example, the Gateway Arch Park Foundation partnered with the City of St. Louis and St. Louis County to renovate the Gateway Arch and increase visitor access. This public–private partnership “raised $250 million in private funding, and Great Rivers Greenway oversaw a publicly approved sales tax … that raised an additional $86 million for the project,” which ultimately “led to a 30% increase in attendance.”19

Part of what made the Digital Equity Act, discussed in the NDIA case study, so powerful was that it translated into policy the learning that a single intervention is not sufficient to solve deep‐rooted, systemic problems. Policymakers can bring together a wide range of options to find solutions for the challenges communities articulate.

Make Proactive Policy About Technology

As the general public's understanding of how technology affects their lives increases, so does the general public's curiosity about how technology companies make decisions about marketing to or profiting from consumers—and about whether companies learn from the mistakes of their unpopular actions. Consequently, the calls for tech companies and technology to be regulated have increased over the past decade. Many consumers seek greater protection, wondering if tech companies follow them online or listen in to their conversations. Others resent being manipulated into engaging with online platforms or into purchasing decisions due to the intentional design practices of tech companies. Many wonder how privacy and security translate in a digital world. Is it possible to violate someone's sense of digital privacy and security? Should the government have the ability to gain access to our every move and communication? As Maurice Turner succinctly puts it, “Privacy and data protection are human rights.”20

Increased awareness, combined with the years of research performed in many tech‐ and society‐related academic programs, advocacy organizations, and civil society groups, has led to increased demand for regulating big tech. Articles such as “Tech Firms Need More Regulation,”21 “Big Tech Says It Wants Government to Regulate AI. Here's Why,”22 and “The World Wants More Tech Regulation”23 have dotted the headlines over the past years. Even a Harvard Business Review piece entitled “How More Regulation for U.S. Tech Could Backfire” includes the statement, “Of course, nobody thinks technology companies should be left unregulated.”24 The true question, then, is who should do the regulating, and when?

Two particular government agencies are already set up to protect consumers and are well positioned for regulating information flow through technology: the FCC and the FTC can hold companies accountable for the harms they inflict on consumers. The FCC is responsible for implementing and enforcing communications laws and regulations. Some of its activities include “promoting competition, innovation and investment in broadband services and facilities” and “supporting the nation's economy by ensuring an appropriate competitive framework for the unfolding of the communications revolution.”25 The FTC protects consumers and promotes competition; it can “conduct investigations, sue companies and people that violate the law, develop rules to ensure a vibrant marketplace, and educate consumers and businesses about their rights and responsibilities.”26

We've already seen examples of these two agencies at work—in the earlier discussions of SpyFone and the Emergency Broadband Benefit. These agencies, especially, must have sufficient technology expertise to interpret regulations around communications—including internet communications, antitrust practices, and consumer protection practices—under the lens of technology design and implementation. Policymakers can apply lessons already proven by the FCC and FTC to other policymaking bodies and government agencies.

It's no secret that most lawmakers do not have a deep understanding of the inner workings of technology. In congressional hearings, policymakers have been infamously confused and misinformed about how big tech companies function, how heavily used social media technologies actually work, and what can and cannot be done through technology. Unfortunately, that policymakers would be out of touch is somewhat by design. The traditional, siloed nature of policymaking has produced individuals who don't understand, for example, how technology dictates who can access broadband services, how social media platforms influence teenagers' self‐esteem, or how the location of tech factories can build up or destroy entire communities.

Policymakers are not technologists, by and large; they don't usually come from the tech world. But because of technology's impact on so much of people's lives, policymaking bodies must make decisions about its regulation—and so it is imperative that policymakers consult with tech advisers. They need to listen to the right people—not just to companies who can afford the time and access. Policymakers must turn to those with not only technical expertise, but ethical and societal expertise as well. The big tech companies have a number of legislative affairs and policy employees who are more than willing to educate lawmakers on their technology and on what regulation, if any, would help the tech industry. It is equally important, however, for policymakers to hear from privacy advocates, security experts, and community members who use and are subject to the decisions of tech companies. They should hear how a security breach caused a consumer to lose their life savings. Or how people of color can experience race‐based job discrimination even before getting an interview—because, as a Harvard study found, Google searches including African American–sounding names “are more likely to produce ads related to criminal activity.”27 Ready to provide some much‐needed insight are organizations such as Upturn, which sets out to “drive policy outcomes and spark debate through reports, scholarly articles, regulatory comments, [and] direct advocacy efforts together with coalition allies.”28

Policymaking also doesn't have to be just about cleaning things up or fixing things—it could be about articulating the pathways we want technology to follow in order to best serve those who use it. What's important is that communication takes place, that community members can express their concerns to their elected officials, sharing, “I need a more affordable option for this service” or “We need faster internet speeds” or “I need to be able to move my data off your system.” Policymakers can listen to community members, then work with technologists to translate those needs into policies that will enable the related technologies to be built. For example, this process would turn a community's desire for more affordable internet access into legislation that allows for funding to be unlocked for the creation of new networks. In fact, this process played out in the creation of the 2021 Digital Equity Act, and funding was made available for local communities to find and build solutions they need—even outside of the traditional providers such as Verizon and Comcast.

We can't yet know all the ways technology will evolve. We do know that we need frameworks to allow people to be securely in charge of their data and protected from harm. Policymakers can partner with technologists and community members today to create guidelines that allow for innovation and creation while also preserving individuals' rights.

Using Technology in Policymaking

There are many ways for policymakers to bring more thoughtful technology into policymaking. We can easily contact elected officials today, but imagine, for example, a tool designed to increase trust, transparency, and accountability between policymakers and the communities they serve. Policymakers, social impact organizations, and community members can all use well‐designed technology to access information and make data‐driven decisions. Technology can then be used to implement policy decisions, such as those concerning delivering benefits, keeping people safe, and interacting with government agencies.

Technology Tools and Practices Can Be Used to Facilitate Policymaking

A primary goal of using technology in policymaking is to support closer communication between policymakers and constituents. Facilitating person‐to‐person conversations, sharing of data between different technical programs and applications, and guiding decision‐making processes are all possibilities for redesigning policymaking for the twenty‐first century.

Policymakers can learn a lot from technologists, and then apply those lessons to the process of policymaking. In 2019, Nathan Ohle and Cecilia Muñoz wrote, “Technologists, including user experience designers, computer scientists, and product managers, care deeply about how their products look and feel. They aren't afraid to make imperfect first versions, test them in batches, find the flaws, and iterate with new ideas. The process for creating new tech products is agile and collaborative—the exact opposite of the way most policy is made.”29 But it is possible to create policies that meet real needs. In the creation of the RISE Program that ended up in the Farm Bill (in Case Study 2, earlier), we saw how viable policy emerged from collaborative conversations between the community and social impact organizations that focused on identifying what programming, funding, and functionality would improve their lives.

To address the current lack of widely used technical tools for policymaking, let's encourage the development of tools for gathering and assessing input in the policymaking process. Take, for example, the Mobility Data Specification (MDS). This digital tool “helps cities to better manage transportation in the public right of way. MDS standardizes communication and data‐sharing between cities and private mobility providers, such as e‐scooter and bike share companies.”30 By creating a digital platform that allows for the easy sharing of historical and real‐time transportation data, cities are able to more easily track the use and efficiency of mobility programs and service providers. They can then quickly make policy decisions that reflect the newest information. As of late 2021, more than 130 cities around the world were using MDS—from Detroit, Michigan, to Bogotá, Colombia, to Ulm, Germany.
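To make the idea of standardized data‐sharing concrete, here is a minimal sketch, in Python, of the kind of aggregation a city might run over provider trip feeds. The record fields (`provider`, `duration_minutes`) and the provider names are simplified, invented stand‐ins for illustration—not the actual MDS schema, which carries far more detail.

```python
from collections import defaultdict

def summarize_trips(trips):
    """Aggregate trip records by provider: trip count and total minutes.

    Each record is a dict with simplified, illustrative fields
    ("provider", "duration_minutes"); real MDS trip records are
    considerably richer.
    """
    summary = defaultdict(lambda: {"trips": 0, "minutes": 0})
    for trip in trips:
        entry = summary[trip["provider"]]
        entry["trips"] += 1
        entry["minutes"] += trip["duration_minutes"]
    return dict(summary)

# Example: a tiny, invented feed from two mobility providers.
feed = [
    {"provider": "ScootCo", "duration_minutes": 12},
    {"provider": "ScootCo", "duration_minutes": 8},
    {"provider": "BikeShare", "duration_minutes": 25},
]
print(summarize_trips(feed))
```

Because every provider reports trips in the same shape, a city can run one aggregation across all of them—this is the practical payoff of standardization that the MDS description above points to.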

When policies have been clearly defined but may still be difficult to access, products developed by social impact organizations can help. For example, City Tech's Civic User Testing group (CUTgroup) “is a 1,600+ member civic engagement program that invites Chicago residents to contribute to emerging technology while providing public, private, and social sector partners with feedback to improve product design and deployment.”31 The group is mobilized for testing civic and government‐related products. As we consider ways that technology can support policymaking, we must consider how to build technology services that make government more inclusive of and responsive to communities.

Technology can also be used to empower policymakers and community members to view information in easily consumable ways, serving as a basis for decisions about both policies and daily life. Broadly distributing information is a way to ensure that policymakers and community members are making decisions from a common starting point, as well as to increase communities' abilities for self‐determination. In 2020, as the COVID pandemic raged, communities and policymakers struggled to grasp the situation; information did not flow in ways that could inform conversations or decisions. Strategic use of coalitions and technology can help support decision making by policymakers—and community members. As Dr. Anjali Tripathi, creator of Los Angeles' COVID‐19 vaccination dashboard and member of the Los Angeles County Department of Public Health's COVID‐19 Data and Epidemiology Team, writes,

COVID‐19 has tested our ability to rapidly transform large volumes of data into actionable information for communities. Working with the vaccination data for a population of over 10 million (America's largest county), we created a dashboard that—like a car dashboard—distills information into at‐a‐glance statistics and visualizations needed for tracking progress, identifying disparities, and taking action in real time.

On its own, a single statistic for the number of people vaccinated or a table of geospatial data can be hard to interpret. However, a map of residents vaccinated by neighborhood quickly highlights communities that are lagging. Coupled with a time slider and graphs over time, these visualizations enable one to assess progress and the success of policy interventions, as well as provide forecasts for policy planning. It's amazing to see how providing data in context, for example showing how one city compares to others (such as neighboring cities or those of equivalent population sizes), incites self‐motivated action.

We carefully constructed our user interface, selected metrics, and disaggregated data to accessibly tell a story that would motivate action. Coupled with full open datasets available for download, our data dashboard has become a key self‐service tool for a range of users—from academics and journalists to non‐profit and government users. Our dashboard has powered other visualization and data efforts, including those for the Mayor's office of the City of Los Angeles, school districts, even other departments within the County of Los Angeles, such as the Department of Health Services. The public facing website not only broke down silos affecting data sharing across government, it has been a key driver of policy action. It has been used to deploy mobile vaccination clinics and design more effective, targeted outreach (for example body shop and church vaccination events) in communities of color, with high rates of vaccine hesitancy. By making the data accessible, disaggregated by key demographics, and offering it in a near real‐time, reliable and understandable format, our data dashboard has provided a key example of how data enables better outcomes, even in challenging quickly changing situations like a pandemic.32
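The per‐neighborhood rate calculation that underlies such a map can be sketched in a few lines of Python. The neighborhood names, counts, and the 60‐percent threshold below are invented for illustration; a real dashboard would join these figures to geospatial boundaries and demographic breakdowns.

```python
def vaccination_rates(vaccinated, population):
    """Percent of residents vaccinated per neighborhood, to one decimal."""
    return {
        hood: round(100 * vaccinated.get(hood, 0) / pop, 1)
        for hood, pop in population.items()
    }

def lagging(rates, threshold):
    """Neighborhoods whose rate falls below the threshold, lowest first."""
    return sorted(
        (hood for hood, rate in rates.items() if rate < threshold),
        key=lambda hood: rates[hood],
    )

# Invented example data.
population = {"Northside": 40_000, "Riverton": 25_000, "Elm Park": 10_000}
vaccinated = {"Northside": 30_000, "Riverton": 10_000, "Elm Park": 7_500}

rates = vaccination_rates(vaccinated, population)
print(rates)
print(lagging(rates, 60.0))
```

A raw count of 10,000 vaccinations says little on its own; expressed as a rate and compared against neighbors, it immediately flags where outreach is needed—the same shift from statistic to actionable context the dashboard team describes.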

With all of these interventions, a continually open line of communication is key. Communities and social impact organizations must be able to count on continued engagement with policymakers, whether through traditional media or through consistently used technologies.

Technical Decisions Can Affect the Adoption of and Compliance with Policy

It's also essential for policymakers to consider the technical implementation of the policies they create—especially supporting tools and systems that enable people to access the services they are legally entitled to. The importance of this approach can be seen in a stunning example.

In 2018, Arkansas became the first state to implement a Trump Administration policy that required Medicaid recipients to prove they were gainfully employed. But the only way for affected residents to report their compliance was via an online portal—despite the fact that Arkansas has the nation's lowest rate of household internet access. Six months into this experiment, nearly 20,000 people (more than 20% of those affected by the new rules) had lost health insurance coverage. Responding to public outcry, the state eventually established a telephone portal via which recipients could report their employment—except both the phone line and the website “close” at 9:00 each night, reopening at 7:00 the next morning.33

(Shockingly, this is not the only government website to have regular business hours.) These actions make it hard to imagine that the State of Arkansas actually wanted its least privileged citizens to receive their legally granted, potentially lifesaving health care benefits.

New Models for Policymaking

If we can't expect policymakers to become experts on technology, the next best thing is to bring technologists alongside policymakers. The TechCongress Fellowship aims to do just that—placing talented technologists in the policymaking halls of Capitol Hill for a limited term. This innovation had its beginnings in 2013, when founder Travis Moore was a Hill staffer. As a nontechnologist, he recognized that he needed support in advising his boss on how to vote on a particular bill. He shares: “I found there weren't staff on Capitol Hill with the necessary tech expertise to help me. As a result, I went outside the building, to a tech company lobbyist, for advice.”34

In just a few years, TechCongress has proven that technologists working with policymakers can develop better tech‐relevant policy; in a way, these technologists become policymakers themselves. Their successes include “changing defense procurement rules to allow startups to better compete for contracts and support our servicemembers; helping draft the House Judiciary Committee's Antitrust Subcommittee report on tech monopolies; issuing House Modernization Committee's recommendations to make Congress more effective, efficient and transparent; and passing the OPEN Government Data Act into law.”35 TechCongress now represents one pathway for technologists to provide impartial information on technology topics to policymakers. This model could be expanded at the federal level as well as modified and adopted for local city councils, for county commissions, for state‐level governing bodies, and more.

Another example is the United States Digital Service (USDS). Now nearly a decade old, the USDS was created to better connect technology to policy development and implementation—both outside the legislative bodies and inside the government agencies that manage resources, regulate businesses, and generally administer specific government functions. The USDS model recognizes that policymakers may not know exactly what technology is needed to support specific policies, and so it brings in advanced technologists to work alongside policymakers.

USDS partnered with the Centers for Medicare and Medicaid Services to develop applications that “help give beneficiaries and their providers a 360‐degree view of past diagnoses, procedures, and medications. Instead of forcing patients to recall and retell their entire medical history at each visit, providers can use Medicare claims information to confirm a patient's understanding of their medical history, fill in gaps in care, and improve patient safety.”36 Another USDS team partnered with the Department of Veterans Affairs to “build a tool that guides users through nine questions and provides individualized results based on their responses. The customized plain language offers clear, step‐by‐step guidance on how a Veteran could present a strong application to upgrade their discharge status.”37 This is a lesson for all of us: never underestimate the power of replacing legalese and government speak with clear language that community members can understand and navigate! The digital service model that started at the federal level in the United States has since expanded; there are now digital service agencies and practices being applied at state and local levels in the States as well as in countries around the world. Once again we see that broadening the participants in the policymaking process beyond traditional policy‐only individuals leads to the development of effective solutions. Interestingly, though, the USDS has also found, again and again, that sometimes the better choice has been to intentionally not use technology—and to instead focus on administrative practices, process redesign, and community building.

So, yes, while it is necessary to expand traditional policymaking processes to include technologists in design and implementation, that approach alone is insufficient for creating the policies that must come next. Though policymakers have a lot to do within the policymaking process, they can also encourage others to invest in it. Laura Manley, Executive Director of the Harvard Kennedy School's Shorenstein Center, says: “I often hear that policymakers just need to understand technology better. While that is true in many ways, it's not just up to non‐technical folks to understand tech better. It's also up to technologists to understand policy and societal considerations better in their initial design, testing, and deployment.”38 Thus, even as more guardrails are put in place to preserve human rights, technologists will still maintain some power over policies as they are writing code.

Laura continues by acknowledging, “There are leverage points throughout all stages of a technology's development that should be considered—instead of only thinking about training for scientists and technologists at the beginning, or [about] regulation once it's out in the world. More time should be spent evaluating places like (1) basic and applied science grant requirements for inclusion, ethics, and testing or (2) the early stage investment process, and what due diligence [has actually been] considered.” In other words, policymakers must appreciate that protections for individuals need to be built in from the start; waiting until the end of the technology development process to examine ethics and inclusion is simply waiting too long.

It's not just technologists who should educate themselves about the current realities of policymaking; traditional social impact organizations and advocacy groups must do the same, and policymakers should ask these organizations how technology affects their missions. The Leadership Conference on Civil and Human Rights is an example of a long‐standing and well‐respected civil rights organization that has accepted that it cannot fully advocate for its mission without accounting for the many ways technology can advance or hinder its communities. The Leadership Conference has embraced technology—not just by having an accessible, informative, user‐friendly website that enables people to easily contact the relevant policymakers, but also by identifying some of the technology issues and policies that disproportionately affect its areas of concern. For example, the Leadership Conference has participated in or led coalitions to push the Census Bureau to ensure the 2020 Census would be conducted in a technically secure and accessible manner—and to get various levels of government to recognize the disproportionate effects that facial recognition technology has in policing.

In previous chapters we discussed the importance of intentionally designing technology for historically overlooked and excluded communities. The same must hold true for the policymaking community. As we create new models for policymaking, we must ensure that people of different backgrounds are able to contribute—as policymakers, as technologists, as social impact organizational leaders, and as community members. Diversity, however, cannot be treated as an afterthought or a check‐the‐box exercise at the end of the process. As Arpitha Peteru and Sabrina Hersi Issa state in their first systemic inclusion principle: “How you create informs what you create.”39 When decision‐making tables are populated, it's important to ask who is missing and what perspective isn't being represented, and then add someone who can speak to the nuances of that perspective.

Of course, it's also important to design the policies themselves inclusively; designing policies or implementing technology that works for only most of the population is both insufficient and unacceptable. Arpitha Peteru and Sabrina Hersi Issa go on to point out, “There is an intrinsic relationship between online agency and offline power‐building in the fight for justice and equity. But in an era when everyday existence in democratic and public life is facilitated by technology, online and offline states are not a binary. Digital devices, tools, and platforms fall across a continuum of unequal, inconsistent, and outsized power dynamics.”40

Disinformation and misinformation are real threats to the policymaking process, and so one of policymakers' biggest challenges in the coming years will be the fight to find what is true. Case in point: the Facebook/Cambridge Analytica scandal. In the 2010s, the British political consulting firm Cambridge Analytica collected data from Facebook users without their consent; this information was then provided to the Ted Cruz and Donald Trump political campaigns, which in turn used the information to target Facebook users for political advertising. Despite Facebook's stated efforts to combat the widespread disinformation and misinformation on its platforms, users can still see and be swayed by inaccurate information on politics, global warming, the COVID‐19 vaccine, and nearly any other subject. (Note that although Facebook is frequently cited as an offender in allowing the spread of disinformation and misinformation, it is far from the only platform where this takes place; any information‐sharing platform that provides analysis based on user data is vulnerable to this phenomenon.)

Much of the conversation about combating misinformation and disinformation centers on the general public; however, policymakers themselves are equally susceptible—with much more dangerous consequences. One simply cannot make solid policies if one bases decisions on inaccurate information. Fortunately, policymakers can fight the misinformation and disinformation they will inevitably be exposed to by verifying sources, holding in‐person community meetings, and providing community members and social impact organizations with opportunities to interact with decision makers in their local government.

Data, security, and privacy laws and policies are shifting the landscape for organizations and tech companies, and this trend will continue in the coming years. As non‐tech businesses have learned, to operate in a global world they must understand and design processes around the regulations of the countries in which they work. Digital technologies are subject to the same constraints: technology regulations in one location may have ripple effects for technology companies doing business in multiple locations.

As we shift policymaking outside of its traditional silos and into the more open and connected world, relationships will be key. Relationships are how technologists effectively partner with traditional policymakers, and how communities and social impact organizations organize and build coalitions. Brandon Forester, the National Organizer for Internet Rights and Platform Accountability at MediaJustice, has wise advice about how to address this important shift: “Recognize that shifts don't happen overnight, and all coalition members don't need to be of one accord all the time.”41 Similarly, not every unknown factor can be accounted for through communication and coalitions. Tech companies, for example, may cite a need to protect proprietary information and decline to share how their technology works. This shift is toward a new process for policymaking, more than a one‐time solution or single policy itself. Policymaking that supports a more equitable world will need to be a continual process of engagement, learning, and making change together.

QUESTIONS FOR WHAT'S NEXT

The opportunities for improvement in both policymaking about technology and the use of technology in policymaking rely on better understanding of how technology reflects policy and of how security and privacy manifest in technology systems—as well as understanding how best to center communities' needs and perspectives in policy creation. Policymakers must do a better job of requesting information from the full spectrum of their constituents—individuals, organizations, and companies. They should proactively seek to be educated on a broad range of technologies by social impact organizations and technologists, alongside the more traditional big‐tech companies. To many social impact organizations, working on policy change can seem superfluous to or distracting from achieving their mission; however, policymakers should proactively ask these organizations to engage in the policymaking process so that we can change systems together. By empowering communities in the policymaking process, we can allow communities to identify tools or techniques that unnecessarily restrict their lives and livelihoods. The following questions can help start and advance conversations with policymakers to secure all these important goals.

Social Impact Organizations

Questions for those working in and with social impact efforts to ask policymakers:

  • It's important to us that you fully understand the issues we know about and the priorities of the communities who are part of our work. How do we best educate you on those points?
  • How do you help us navigate policymaking systems to ensure our efforts are successful? How will you help us know when and how to engage?
  • What have been successful strategies for organizations and communities advocating for new policies?
  • Can you describe who and what influences your policies?
  • How can we better work together on proactive policies that enable as many people and organizations as possible to be part of change‐making work?

Technologists

Questions for those building technology for social impact to ask policymakers:

  • How can we proactively understand current and forthcoming policy decisions? How can we help intentionally shape definitions within upcoming policies to ensure community adoption and protection?
  • How can we build technologies in support of collaborative and participatory policymaking?
  • How can we best share what we've learned and describe the barriers we face so as to inform and influence policymaking?
  • How can you best inform us about community priorities and needs from other issue areas?
  • How can we work together to adopt privacy and security priorities for all of our policies?

Funders

Questions for those in positions to fund social impact and technology to ask policymakers:

  • How can our resources be combined with yours to accelerate or ensure participatory processes?
  • What public–private partnerships could be established in service to the internet and technology development needs in our communities?
  • How can we share what we've learned and tried with new models and efforts so as to support emerging policies?
  • How can we surface and share successful stories from other regions or sectors in order to inform new policies?
  • Are there potential gaps in available knowledge or data that we could resource for research and evaluation?

Policymakers

Questions for those creating and enforcing policies around technology and social impact to ask their peers:

  • How do you verify and validate your information sources?
  • How are you actively working to remove bias in your policymaking process?
  • What participatory processes have been successful for you?
  • How can we best share community priorities and needs from different policy topic areas?
  • How are you reporting back on progress and challenges in policymaking initiatives?

Communities

Questions for community members to ask policymakers:

  • How can we work together to change policies so as to increase funders' annual distributions?
  • How might policy support new mechanisms for providing resources to our community?
  • How can we work together to build policy protections for communities, individuals, and users?
  • How can policies better enforce accountability among users, technology providers, and social impact service providers?
  • If technology harms arise from a lack of policy, are you prepared to be held accountable for that?

NOTES

  1. Afua Bruce and Maria Filippelli, “Tech Companies Need a History Lesson and Civil Rights Groups Can Provide It,” The Hill (March 29, 2019), https://thehill.com/opinion/technology/436464-tech-companies-need-a-history-lesson-and-civil-rights-organizations-can.
  2. “About NDIA,” National Digital Inclusion Alliance, accessed September 1, 2021, https://www.digitalinclusion.org/about-ndia/.
  3. United States Senator Murray Working for Washington State, “Senators Murray, Portman, and King Introduce Major Bipartisan Legislation to Close Digital Divide, Promote Digital Equity,” Press release (June 10, 2019), https://www.murray.senate.gov/public/index.cfm/newsreleases?ID=0EBFF33F-29C4-44CB-A4BD-6B1B1C986794.
  4. Angela Siefer, Zoom interview with Amy Sample Ward (September 28, 2021).
  5. Angela Siefer, Zoom interview with Amy Sample Ward (September 28, 2021). Visit NDIA's website at https://www.digitalinclusion.org/.
  6. “The Digital Equity Act,” #DigitalEquityNow, accessed October 8, 2021, https://www.digitalequityact.org/.
  7. “About Us,” Rural Community Assistance Partnership, accessed September 1, 2021, https://www.rcap.org/about/.
  8. “What Is the Farm Bill,” National Sustainable Agricultural Coalition, accessed September 1, 2021, https://sustainableagriculture.net/our-work/campaigns/fbcampaign/what-is-the-farm-bill/.
  9. Cecilia Muñoz and Nathan Ohle, “Want Better Policy? Bring in the Technologists,” The Hill (January 29, 2019), https://thehill.com/opinion/technology/427504-want-better-policy-bring-in-the-technologists?rl=1.
  10. Nathan Ohle, Zoom interview with Afua Bruce (September 24, 2021). Visit RCAP's website at https://www.rcap.org/.
  11. “Rural Innovation Stronger Economy (RISE) Grants,” USDA Rural Development, accessed October 11, 2021, https://www.rd.usda.gov/programs-services/business-programs/rural-innovation-stronger-economy-rise-grants.
  12. Maurice Turner, Zoom interview with Afua Bruce (September 24, 2021).
  13. Nicol Turner Lee, “Closing the Digital and Economic Divides in Rural America,” Brookings Institution, accessed September 1, 2021, https://www.brookings.edu/longform/closing-the-digital-and-economic-divides-in-rural-america/.
  14. Bill Callahan, “AT&T's Digital Redlining of Cleveland,” Blog entry (March 10, 2017), https://www.digitalinclusion.org/blog/2017/03/10/atts-digital-redlining-of-cleveland/.
  15. “Internet Access Advocates Say AT&T Is Guilty of ‘Digital Redlining’ in Some Cleveland Neighborhoods,” News5 Cleveland (March 12, 2017), https://www.news5cleveland.com/news/local-news/oh-cuyahoga/atts-digital-redlining-of-cleveland-neighborhoods.
  16. “Rep Clarke Introduces the Anti‐Digital Redlining Act of 2021 with Baltimorean Support,” Benton Institute for Broadband and Society (August 10, 2021), https://www.benton.org/headlines/rep-clarke-introduces-anti-digital-redlining-act-2021-baltimorean-support.
  17. “Emergency Broadband Benefit,” Federal Communications Commission, accessed September 1, 2021, https://www.fcc.gov/broadbandbenefit.
  18. “FTC Bans SpyFone and CEO from Surveillance Business and Orders Company to Delete All Secretly Stolen Data,” Federal Trade Commission, press release (September 1, 2021), https://www.ftc.gov/news-events/press-releases/2021/09/ftc-bans-spyfone-and-ceo-from-surveillance-business.
  19. “5 Examples of Public Private Partnerships in Implementation,” NMBL Strategies (September 12, 2019), https://www.nmblstrategies.com/blog/5-examples-of-public-private-partnerships-in-implementation.
  20. Maurice Turner, Zoom interview with Afua Bruce (September 24, 2021).
  21. Brad Smith and Carol Ann Browne, “Tech Firms Need More Regulation,” The Atlantic (September 9, 2019), https://www.theatlantic.com/ideas/archive/2019/09/please-regulate-us/597613/.
  22. Jamie Condliffe, “Big Tech Says It Wants Government to Regulate AI. Here's Why,” Protocol (February 12, 2020), https://www.protocol.com/ai-amazon-microsoft-ibm-regulation.
  23. Sintia Radu, “The World Wants More Tech Regulation,” US News & World Report (January 15, 2020), https://www.usnews.com/news/best-countries/articles/2020-01-15/the-world-wants-big-tech-companies-to-be-regulated.
  24. Larry Downes, “How More Regulation for U.S. Tech Could Backfire,” Harvard Business Review (February 9, 2018), https://hbr.org/2018/02/how-more-regulation-for-u-s-tech-could-backfire.
  25. “What We Do,” Federal Communications Commission, accessed September 1, 2021, https://www.fcc.gov/about-fcc/what-we-do.
  26. “About the FTC,” Federal Trade Commission, accessed September 1, 2021, https://www.ftc.gov/about-ftc.
  27. “Google Searches Expose Racial Bias, Says Study of Names,” BBC News Service (February 4, 2013), https://www.bbc.com/news/technology-21322183.
  28. “Our Work,” Upturn: Toward Justice and Technology, accessed September 1, 2021, https://www.upturn.org/work/.
  29. Cecilia Muñoz and Nathan Ohle, “Want Better Policy? Bring in the Technologists,” The Hill (January 29, 2019), https://thehill.com/opinion/technology/427504-want-better-policy-bring-in-the-technologists?rl=1.
  30. “About MDS,” Open Mobility Foundation, accessed September 1, 2021, https://www.openmobilityfoundation.org/about-mds/.
  31. “Civic User Testing Group,” CityTech Collaborative, accessed September 1, 2021, https://www.citytech.org/cutgroup.
  32. Anjali Tripathi, email to Afua Bruce (October 10, 2021).
  33. Justin King and Afua Bruce, “Voices from the Social Safety Net,” Slate (February 28, 2019), https://slate.com/technology/2019/02/snap-freshebt-benefits-technology-voice.html.
  34. “Travis Moore's Written Testimony Before Congress,” TechCongress (April 28, 2021), https://www.techcongress.io/blog/2021/4/28/travis-moores-written-testimony-before-the-select-committee-on-the-modernization-of-congress-united-states-house-of-representativesnbsp.
  35. “About Us,” TechCongress, accessed September 1, 2021, https://www.techcongress.io/about-us.
  36. “Medicare Data API: Blue Button and Data at the Point of Care,” U.S. Digital Service (September 1, 2021), https://www.usds.gov/projects/blue-button-2.
  37. “Discharge Status Upgrade Tool,” U.S. Digital Service, accessed September 1, 2021, https://www.usds.gov/projects/discharge-upgrade-tool.
  38. Laura Manley, email to Afua Bruce and Amy Sample Ward (September 10, 2021).
  39. Arpitha Peteru and Sabrina Hersi Issa, “Toward Ethical Technology: Framing Human Rights in the Future of Digital Innovation,” RightsXTech (October 2021).
  40. Arpitha Peteru and Sabrina Hersi Issa, “Toward Ethical Technology.”
  41. Brandon Forester, interview with Afua Bruce (September 24, 2021).