Chapter 14. Ethics

What we’ll cover:
The politics of categories and classification
Issues of intellectual and physical access to information
The ethical responsibilities of information architects

You’ve almost finished the book. You understand the concepts. You’re familiar with the methods. But before you move onward and upward, consider the following questions:

Are you aware that the practice of information architecture is riddled with powerful moral dilemmas?

Do you realize that decisions about labeling and granularity can save or destroy lives?

Will you be designing ethical information architectures?[1]

If you’ve never considered these questions, don’t worry. It’s not your fault. Blame your parents. Did they ever take the time when you were a small child to clarify that the story of Hansel and Gretel is really a metaphor for the horrors of ineffective breadcrumb navigation? Did they ever explain that Spider-Man symbolizes the virtuous hypertextual power of the Web? Without information architect superheroes and archvillains to serve as role models, how could you be expected to recognize your own potential for good or evil?

Ethical Considerations

The truth is that ethics is one of the many hidden dimensions of information architecture. As Geoffrey Bowker and Susan Leigh Star state in their book Sorting Things Out (MIT Press):

Good, usable systems disappear almost by definition. The easier they are to use, the harder they are to see.

Large information systems such as the Internet or global databases carry with them a politics of voice and value that is often invisible, embedded in layers of infrastructure.

Over the course of their book, Bowker and Star uncover the serious ethical dimensions of organizing and labeling information.

Now, don’t worry. We’re not about to stand on a soapbox and tell you how to save the world. Instead, we present a framework that illuminates six ethical dimensions faced by information architects, so you can make your own decisions. Once again, we humbly seek to make the invisible visible.

Intellectual Access

Much information architecture work is focused on helping people find information or complete tasks efficiently and effectively. We hope to reduce senseless friction, thus avoiding wasted time, money, and frustration.

But we also go beyond connecting users with the information they’re explicitly seeking, by leveraging thesauri and recommendation engines to educate them about additional products, services, or knowledge that they didn’t know existed. This work is no more ethically neutral than designing the first atomic bomb.

Recently, Amazon changed its search engine after an abortion-rights organization complained that results were skewed toward anti-abortion books.[2] Apparently, when users searched on “abortion,” Amazon’s autosuggest presented them with the question “Did you mean adoption?” Amazon explained this was an algorithmic rather than editorial suggestion, but its choice to disable that suggestion was clearly an editorial decision with ethical (as well as political and financial) implications.
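
To see how easily this can happen, consider a toy “did you mean” suggester, sketched below with invented names and data (emphatically not Amazon’s actual system). It simply proposes the most popular indexed term that closely resembles the query; because “adoption” differs from “abortion” by just two letters, a purely statistical algorithm can surface an ethically loaded suggestion with no editorial intent at all.

    # A minimal sketch of an "algorithmic" did-you-mean feature. Terms
    # resembling the query are ranked by popularity; the popularity
    # counts here are hypothetical.
    from difflib import SequenceMatcher

    def similarity(a: str, b: str) -> float:
        """Crude string-similarity score between 0 and 1."""
        return SequenceMatcher(None, a, b).ratio()

    def did_you_mean(query: str, term_popularity: dict[str, int],
                     threshold: float = 0.7) -> str | None:
        """Suggest the most popular term that closely resembles the query."""
        candidates = [(pop, term) for term, pop in term_popularity.items()
                      if term != query and similarity(query, term) >= threshold]
        return max(candidates)[1] if candidates else None

    # Hypothetical counts drawn from query logs.
    popularity = {"adoption": 90_000, "abortion": 40_000, "absorption": 15_000}
    print(did_you_mean("abortion", popularity))  # -> "adoption"

The algorithm is innocent of meaning; the decision to ship it, or to disable it, is not.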

A great information architecture can help a medical researcher discover the missing puzzle piece that results in the cure for a disease. A great information architecture can also connect an angry teenager with instructions on how to build a pipe bomb.

Whether you’re working for a business, a nonprofit organization, a university, a government, a political candidate, the military, or a nuclear power station, the ethics of the information architecture depends on the unique situation. So before you take on a new job or project, you’d do well to consider the broader ethical context.

Labeling

There are few things as quietly powerful as labels. We are completely surrounded by them, and for the most part their influence is invisible; labels tend to be noticed only by the people they hurt.

Bowker and Star provide a couple of good examples. They discuss the politics and pain involved in the transition over several years from the label “gay-related immune disorder” (GRID) through a chain of other labels to the now-accepted “acquired immune deficiency syndrome” (AIDS). In another example, they explain that “many patients feel that one of the greatest burdens of having chronic fatigue syndrome is the name of the illness.” The word “fatigue” indicates everyday tiredness, making it less likely that friends, family, employers, and coworkers will take the condition seriously.

When we develop labeling systems and controlled vocabularies, we struggle to balance literary warrant (use of authors’ terminology) with user warrant (anticipated terms to be employed by users). We strive for clarity, predictability, and conciseness. Perhaps we should also consider the potential impact our labels can have on people and perceptions.
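
A controlled vocabulary is one place where those considerations become concrete. The sketch below (illustrative structure and field names, not any particular thesaurus standard) records a preferred label alongside its user-warrant variants, and uses a scope note to make the editorial judgment visible rather than invisible.

    # A minimal sketch of a controlled-vocabulary entry balancing literary
    # warrant (the preferred label) with user warrant (variant terms).
    from dataclasses import dataclass, field

    @dataclass
    class VocabularyTerm:
        preferred: str                                    # label shown to users
        variants: set[str] = field(default_factory=set)   # terms users actually type
        scope_note: str = ""                              # why this label was chosen

    def build_lookup(terms: list[VocabularyTerm]) -> dict[str, str]:
        """Map every variant (and the preferred term itself) to the preferred label."""
        lookup = {}
        for term in terms:
            lookup[term.preferred.lower()] = term.preferred
            for variant in term.variants:
                lookup[variant.lower()] = term.preferred
        return lookup

    vocab = [VocabularyTerm(
        preferred="myalgic encephalomyelitis/chronic fatigue syndrome",
        variants={"chronic fatigue syndrome", "ME/CFS", "CFS"},
        scope_note="Label reflects patient concerns that 'fatigue' alone "
                   "trivializes the illness.",
    )]
    print(build_lookup(vocab)["cfs"])  # -> the full preferred label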

Categories and Classification

The presence or absence of categories, and the definition of what is and is not included in each category, can also have powerful consequences. Bowker and Star explain that although child abuse surely existed before the 20th century, you couldn’t tell from the literature; that “category” did not exist. The very creation of the category made it more socially and legally visible.

They also discuss the problems that occur when things don’t fit into an existing category (“monsters”) and when they fit multiple categories (“cyborgs”). They include a quote from Harriet Ritvo about the proliferation of monsters in the 18th and 19th centuries, which notes that “monsters were united not so much by physical deformity or eccentricity as by their common inability to fit or be fitted into the category of the ordinary.”

As we design classification schemes, are we responsible for our own Frankensteins? The taxonomies we build subtly influence people’s understanding and can inject undesirable bias into sensitive topics. Let’s make sure we classify with care.
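
The “monster” and “cyborg” problems are not merely philosophical; they fall straight out of the data structures we choose. Here is a minimal sketch contrasting a strict one-category-per-item taxonomy with a polyhierarchy that permits multiple parents (the class and category names are invented for illustration):

    # In a strict taxonomy, items that fit no recognized category
    # ("monsters") and items that fit several ("cyborgs") raise errors;
    # a polyhierarchy simply lets an item live in more than one place.
    class StrictTaxonomy:
        def __init__(self, categories: set[str]):
            self.categories = categories
            self.assignments: dict[str, str] = {}

        def classify(self, item: str, category: str) -> None:
            if category not in self.categories:
                raise ValueError(f"{item!r} is a 'monster': no category fits")
            if item in self.assignments:
                raise ValueError(f"{item!r} is a 'cyborg': already filed under "
                                 f"{self.assignments[item]!r}")
            self.assignments[item] = category

    class Polyhierarchy:
        def __init__(self):
            self.assignments: dict[str, set[str]] = {}

        def classify(self, item: str, category: str) -> None:
            self.assignments.setdefault(item, set()).add(category)

    strict = StrictTaxonomy({"electronics", "books & reading"})
    strict.classify("e-book reader", "electronics")
    # strict.classify("e-book reader", "books & reading")  # ValueError: a "cyborg"

    poly = Polyhierarchy()
    poly.classify("e-book reader", "electronics")
    poly.classify("e-book reader", "books & reading")       # fine: both apply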

Granularity

Bowker and Star examined the work of a group of nursing scientists who were developing the Nursing Interventions Classification (NIC). The scientists hoped that the classification would help make the work of nurses more visible and legitimate.

During the project, granularity took center stage in a balancing act between the politics of certainty and the politics of ambiguity:

The essence of this politics is walking a tightrope between increased visibility and increased surveillance; between overspecifying what a nurse should do and taking away discretion from the individual practitioner.

It’s interesting to consider the ethics of granularity in the context of web sites and intranets. What unintended consequences might result from our chunking of content? Who might suffer if we alter the balance between certainty and ambiguity? Sometimes, the devil is in the level of detail.
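
As a deliberately simplified illustration, consider the same snippet of content chunked at three levels of granularity (the splitting rules below are arbitrary). The finer the chunks, the more precisely content can be found, cited, measured, and policed, for better and for worse.

    # The granularity trade-off in miniature: coarser chunks preserve
    # ambiguity and context; finer chunks increase both findability and
    # the potential for surveillance and overspecification.
    def chunk(text: str, level: str) -> list[str]:
        if level == "document":
            return [text]                  # one coarse, ambiguous unit
        if level == "paragraph":
            return text.split("\n\n")      # mid-level chunks
        if level == "sentence":
            flat = text.replace("\n\n", " ")
            return [s.strip() + "." for s in flat.split(".") if s.strip()]
        raise ValueError(level)

    procedure = "Assess the patient.\n\nDocument findings. Notify the physician."
    for level in ("document", "paragraph", "sentence"):
        print(f"{level}: {len(chunk(procedure, level))} addressable unit(s)")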

Physical Access

From ramps and elevators to large-print and audio books, architects, librarians, and designers are familiar with issues of physical access to traditional libraries. Unfortunately, that hard-won experience has proven difficult to carry into the digital environment.

Despite the ready availability of the W3C Web Content Accessibility Guidelines[3] and Section 508 Standards,[4] even today many software applications and web sites are designed with little sensitivity to the physical capabilities and limitations of various audiences. The ACM Code of Ethics states:

In a fair society, all individuals would have equal opportunity to participate in, or benefit from, the use of computer resources regardless of race, sex, religion, age, disability, national origin or other such similar factors.

Ben Shneiderman, a leader in the field of human–computer interaction, extended this code of ethics into the notion of universal usability:

Universal Usability will be met when affordable, useful, and usable technology accommodates the vast majority of the global population: this entails addressing challenges of technology variety, user diversity, and gaps in user knowledge in ways only beginning to be acknowledged by educational, corporate, and government agencies.[5]

Surely, information architects have a role to play in creating useful, usable systems that work for diverse audiences. Have you been designing for universal usability?
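
Some of that sensitivity can even be checked mechanically. Below is a toy audit loosely inspired by the WCAG requirement that images carry text alternatives; real audits rely on dedicated accessibility tools, but the sketch shows how simple a first pass can be (the sample HTML is invented).

    # Flag <img> tags that lack an alt attribute, one of the most basic
    # accessibility checks. Uses only the standard library.
    from html.parser import HTMLParser

    class AltTextChecker(HTMLParser):
        def __init__(self):
            super().__init__()
            self.missing_alt = 0

        def handle_starttag(self, tag, attrs):
            if tag == "img" and "alt" not in dict(attrs):
                self.missing_alt += 1

    checker = AltTextChecker()
    checker.feed('<img src="chart.png"><img src="logo.png" alt="Acme logo">')
    print(f"{checker.missing_alt} image(s) missing alt text")  # -> 1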

Persistence

As we’ve mentioned before, information architecture is not about surface glamour; it’s about mission-critical infrastructure. And infrastructure has widespread and long-term impact. The ripples of our designs spread outward, affecting the work of interface designers, programmers, authors, and eventually, users. And from experience, we know that the quick-and-dirty placeholder site can become an enduring monument to the axiom, “Do it right or don’t do it at all.” As we design the legacy information architectures of tomorrow, we should consider our responsibility to the big here and the long now (we’ll discuss this topic in more detail in the section “Fast and Slow Layers” in Chapter 15). Remember the Y2K bug? Enough said.

Shaping the Future

As humans, we collectively avoid a huge percentage of ethical dilemmas by defining them out of existence. We decide that they are beyond our control and are someone else’s responsibility.

As an information architect, you can define any or all of these ethical dimensions as “not my problem.” Maybe the responsibility really belongs with the client, the business manager, the authors, the usability engineers, or the users themselves. Or, maybe we’ll all just wait for a superhero to save the day.

Speaking of which, a handful of user-experience superheroes have written books that tackle these thorny issues head on. For example, B.J. Fogg’s Persuasive Technology: Using Computers to Change What We Think and Do (Morgan Kaufmann) includes a chapter about the ethics of persuasive technology. Jeffrey Zeldman’s Designing with Web Standards (New Riders) details the ethics and economics of designing for accessibility. And Adam Greenfield’s Everyware: The Dawning Age of Ubiquitous Computing (New Riders) presents ethical guidelines for user experience design in ubiquitous computing environments. We encourage you to read these books and put their ideas into action so you can help shape a better future.



[1] This chapter is based on a Strange Connections article written by Peter Morville (http://argus-acia.com/strange_connections/strange008.html).

[2] Laurie J. Flynn, “Amazon Says Technology, Not Ideology, Skewed Results,” New York Times, March 20, 2006.

[3] Web Accessibility Initiative, http://www.w3.org/WAI.

[4] Section 508 of the Rehabilitation Act, Electronic and Information Technology, http://www.access-board.gov/508.htm.

[5] Ben Shneiderman, “Universal Usability,” Communications of the ACM 43, no. 5 (2000). See http://universalusability.org.
