Peer-to-Peer Journalism

As a special focus theme on practical case studies, p2p journalism (p2pj) combines many of the more interesting issues discussed earlier in the book, both in theory and in implementation. The "p2p" here isn't necessarily about the specific peer technologies described in this book, although these and others can certainly play an important role; rather, it's about people as peers and the related trust issues.

How would one define p2pj? Perhaps something like this:

  1. Original content is not generated by the centralized parts of the system (if there are any) but rather locally by the peers, mainly supported by, one might imagine, peer messaging and collaborative tools. This is in contrast to most modern journalism where a central editorial staff creates content from material culled from reporters, wire services, and archives.

  2. Peers automatically host and reshare the content they produce and download from other peers. In other words, they are not only the content creators, but also at the same time the distribution system. This is supported by the peer technology relevant to sharing and distribution.

  3. Peers also become the server source for disseminating content to a wider readership (or viewership), with access not just to the current material, but also the archive and reference links to related and background sources. Peer search tools or syndication digests might be common in this role.

From this point of view, each peer is a node in an automatic syndication network—both news “reporter” and access “library”. Several important issues, not least of which is trust, must be dealt with in peer journalism.

Established news media have demonstrated reputations for how well or how objectively they cover the news, and these extend by association to the reporters and journalists who work for them. Most people base their trust in what they read or see reported on this association and on perceived reputability, or the lack thereof. By a further extension, readers also trust the trust that the media and their journalists place in their news sources. How would anyone know how to evaluate stories emanating from a peer network?

Bit 12.2 Viable network information content presumes a trust infrastructure.

Just as e-commerce development critically depends on micropayment systems, further growth in the Internet's informational role depends on trust systems.


Practical Trust Systems

To summarize, practical trust systems have three distinct components in this context, which apply both to the journalist-peer and to the sources the journalist works with:

  • Authentication, which asserts that people or agents are really who they say they are, and are in fact correctly representing what they claim.

  • Reputation, which is a consensus evaluation ultimately based on perceived past behavior, possibly established by hearsay.

  • Trust, which is the result of a personal evaluation of known reputation and other factors.
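The relationship between these three components can be made concrete with a small sketch. This is a hypothetical model, not taken from any particular p2p framework; the names (`Identity`, `Ledger`, `trust_in`) and the simple averaging scheme are assumptions chosen for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Identity:
    name: str
    authenticated: bool = False   # did authentication of this entity succeed?

@dataclass
class Ledger:
    # Reputation: a consensus view built from recorded past behavior.
    history: dict = field(default_factory=dict)  # name -> list of +1/-1 outcomes

    def record(self, name: str, outcome: int) -> None:
        self.history.setdefault(name, []).append(outcome)

    def reputation_of(self, name: str) -> float:
        events = self.history.get(name, [])
        return sum(events) / len(events) if events else 0.0

def trust_in(who: Identity, ledger: Ledger, context_weight: float = 1.0) -> float:
    # Trust: a *personal* evaluation -- shared reputation moderated by
    # context-specific factors, and worthless without authentication.
    if not who.authenticated:
        return 0.0
    return ledger.reputation_of(who.name) * context_weight
```

Note how the three concerns stay separate: authentication is a property of the identity, reputation is shared history, and trust is computed locally by each evaluator with its own `context_weight`.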

These issues are relevant to most any p2p situation, not just journalism. A peer system can resolve these issues in several ways, some of which have already been outlined in the earlier explanations of various technologies, especially the encrypted variations in the later chapters.

Authentication could establish identity with a distributed PKI web of trust or through central services such as Passport. Of special interest to journalists is the authenticated anonymous identity, the trusted anonymous source. Recall that public keys to establish identity, and digital signatures to authenticate content and sender, are intimately coupled through these products of strong encryption.

Reputation is determined by actual user behavior—history, not identity except to establish that a particular history is reliably linked to an established identity. Reputation determines whether or not people in general want to have transactions with that entity. In the peer context, reputation systems are very popular fields for development at the moment, as p2p networks move from connecting with anyone, to filtering out nodes that exhibit poor or unreliable behavior.

Trust is partly about reputation on the purely individual level, but it's moderated by the many other factors that an individual applies when deciding whether to deal with the entity in question. The individual trust shown in a particular context can run counter to what public reputation would dictate. The most important factors in personal trust are value judgments and considerations such as relative risk, expectation, greed, reward, and particular context. I might trust someone well in one context, but not at all in another. Based on known private history, I might trust someone not trusted by anyone else.

Both reputation and trust are typically managed at the application level, and several implementations are common on public posting sites, such as Advogato and Slashdot's Friends and Foes lists. But they make much more sense in a decentralized network. There, trust is just an additional layer of information in your roster, as clearly seen in the key management sections of PKI clients.

Combine trust with recommendation systems, and you can create quite interesting queries, along the lines of: “Show me all content that has been rated highly by people I trust in this category.” Surely indirect relationships, web of trust style, can also be factored into the search result sorting, so that people who are trusted by people I trust are also automatically somewhat trusted by me (or my system). Such trust propagation could have quite massive effects on content propagation.

Nothing quite like it has been implemented yet, but it's something to be on the lookout for because it will be an important development for peer networks. Erik Möller, a noted German freelance journalist, is quite keen on this idea:

You can take this trusted recommendation concept even further. Form “teams” with other users who rate content, and assign teams reputations of their own. Thus, I could formulate a query: “Show me all content that has been rated highly by Team xyz recently.”

Then, the effects get even more massive.

By "massive", I mean that the network itself could create "hypes" on the scale of a "Slashdot effect", and more. If this is integrated with micropayment systems, these hypes could actually be turned into real money, for instance for an artist, an open source project, or a political action.

Some mention should therefore be made of projects to provide trust systems.

The Open Privacy Projects

One good resource for network-related trust and reputation systems is the OpenPrivacy initiative (www.openprivacy.org). The site hosts an open source collection of software frameworks, protocols, and services that can provide a cryptographically secure and distributed platform for creating, maintaining, and selectively sharing user profile information.

The vision is to enable user control over personal data, while simultaneously, at user discretion, providing marketers with access to higher-quality profile segments—an incentive for a new breed of personalized services to provide people and businesses with timely and relevant information. While that might not sound too enticing as it stands, the practical projects provide a diverse mix of trust and reputation technologies, presented here roughly in order of completeness.

  • Sierra, a reference implementation of the Reputation Management Framework (RMF), which is OpenPrivacy’s core project. It’s designed to ease the process of creating community with reputation-enhanced pseudonymous entities.

  • Talon, a simple yet powerful component system for Java. Sierra is being developed using Talon and is expected to use Sierra’s reputation manager to drive component selection.

  • Reputation Capital Exchange (RCE), a secure mechanism for mapping between exchanges that use different trust metrics. Such mapping has important interoperability implications for otherwise unrelated reputation and trust systems.

  • Reptile, an open source and free software Syndicated Content Directory Server (SCDS). It provides a personalized news and information portal with privacy and reputation accumulation, of significant interest to p2pj.

The RMF is primarily a set of four interfaces: Nym Manager, Communications Manager, Storage Manager, and Reputation Calculation Engine.

The implementation origins of OpenPrivacy are in the Broadcatch Project (www.broadcatch.com), which offers technology solutions for

  • Portal and community building

  • Infomediary agents (such as brokers)

  • Pseudonymous publishing

  • Reputation capital accrual

  • Persona, profile, and reputation management

  • Anonymous verification and authentication

GNU Privacy Guard

GNU Privacy Guard (GnuPG, www.gnupg.org) is a complete and free encryption replacement for PGP, the private encryption tool originally developed by Philip Zimmermann. Because it doesn't use the patented IDEA algorithm, GnuPG can be used without any restrictions. The application is compliant with IETF standard RFC 2440—that is to say, it's one of the OpenPGP implementations. Refer to the OpenPGP Alliance Web site (www.openpgp.org) for more information.

Table 12.1. Comparison between different types of trust certificates seen on networks

X.509
  Certification authority characteristics: Naming authority hierarchies. Cross-certification. CPS (Certification Practices Statement) required.
  Kind of identification: Key bound to person. Global by original definition, but local in practice (no single root; the X.500 Distinguished Name is chosen by, and hopefully unique to, the issuing CA). Mapping failures are a security issue.

SPKI/SDSI
  Certification authority characteristics: Single naming authority. No CPS is necessary.
  Kind of identification: Arbitrary local identity (keyholder or group), not bound to a particular person.

SPKI without names
  Certification authority characteristics: Authorization authority hierarchies; k-of-n lists; optional multiple holders.
  Kind of identification: Global and persistent (public key or derived hash, globally unique). Could be an anonymous group.

PGP
  Certification authority characteristics: Web of trust (multiple signers attest with their own signatures), or signer(s) trusted by the user.
  Kind of identification: Global due to DNS-defined unique e-mail address, but not guaranteed persistent; some publication/revocation issues.

GnuPG can be freely used, modified and distributed under the terms of the GNU General Public Licence. Further development is funded by the German Federal Ministry of Economics and Technology.

The importance of public key encryption is evident in the many secure peer solutions that use it to secure and authenticate identity or content, and in the many derivative hashing and signature strategies that have been developed worldwide. OpenPGP is the most widely used e-mail encryption standard in the world.

In reputation and trust contexts, users rely on either personal evaluations or the distributed Web of Trust strategy. The latter involves multiple paths of certification to compensate for the fact that anyone can sign PGP/GnuPG certificates. A summary comparison of the common types of trust certificates seen on networks is useful in this book, and here seems a reasonable place for it; see Table 12.1.

A web-of-trust construction is a reasonably workable solution on the whole, as evidenced by the great amount of trust invested in it by people who use digital signatures and public keys. In reality, however, people rarely understand a web of trust or use it quite as intended, instead relying only on already known and trusted signatories. An interesting design feature is that the verifier sets the level of trust in keys and in principle can demand some number of independent signatures on a PGP certificate before that binding is considered valid. The working assumption here is that the different key signatures represent independent individuals, something rarely provable even assuming that the receiving user would go to that trouble.
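The verifier-side policy just described can be sketched directly. Actual cryptographic signature verification is abstracted into a boolean here; the function name and the default of two required signers are assumptions for the example:

```python
def binding_is_valid(signatures, trusted_signers, k=2):
    """Accept a PGP-style key binding only if at least k signatures from
    *distinct* trusted signers check out.

    signatures: list of (signer_id, verified_ok) pairs, where verified_ok
    stands in for a real cryptographic check against the signer's public key.
    """
    independent = {signer for signer, ok in signatures
                   if ok and signer in trusted_signers}
    return len(independent) >= k
```

Using a set makes duplicate signatures from the same signer count only once, which models the independence requirement, though as the text notes, nothing proves that two distinct key IDs really belong to independent individuals.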

At any rate, the construction is no worse than the hierarchical systems it’s compared to, largely because as deployed, trust hierarchies are flawed (no single root directory, for example), or their mappings are vulnerable in various ways. Passport Kerberos authentication is no better in any respect. My reflection on the issue:

It’s been noted that the Internet domain name service (DNS) as a global map for Internet identities may have succeeded only because it was designed and implemented before larger political forces became aware of it or of its importance. Later turf wars surrounding the domain system, the registrar system and top-level domain (TLD) extensions tend to substantiate that view.

Security Futures

As yet, a global public key infrastructure (PKI) for easy authentication and trust management is a vision—it’s not clear what may or may not be deployed.

Microsoft wants us to sign on to the Single Sign-in Service on Passport for all our needs, and as usual, it is betting the proverbial farm on it in the current .NET paradigm. However, there are many reservations about that model. Apart from the fact that not everyone accepts the basic premise that everybody’s identity profile be tied to a single vendor’s server farm for all transactions on the Internet, there are concerns about the reliability and security of the technology offered.

A lack of global PKI means that transaction trust must be established in smaller communities whose members often collaborate, and possibly these communities in turn exchange trust information with others, as the need arises. This vision is rooted in a distributed view, closer to the basic p2p philosophy and could be deployed in terms of JXTA technology, for example. It’s perhaps no accident that Microsoft and Sun, old adversaries, represent different ends of the spectrum here.

It’s at least my view that local anchoring of webs-of-trust and transaction-trust mechanisms will be a natural and inevitable development as p2p matures. This process is likely to provide many definitions of “local” neighborhoods: physical, social, professional, and virtual. As in the physical world, our peer-based trust will be a context-driven metric with few (if any) absolutes, yet prove surprisingly robust and practical for the social situations in which we require it.

Private Key Management in .NET

Because the authentication system built into .NET is a central server model, it’s to be expected that the core is a central Authentication Server (AS). The AS utilizes a somewhat modified form of the open Kerberos standard for private encryption keys.

The AS issues “tickets” to authenticated clients. When clients need to be authorized to use other services on the network, they submit the ticket to the service, which then authenticates it by passing it back to the AS on a back channel. To avoid repetitive transactions of this nature, the service might then grant the authenticated client a special local access authorization ticket good for a specified duration with the service, perhaps a day, called a ticket granting ticket (TGT).
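The ticket flow above can be modeled in miniature. This toy uses HMAC tags in place of Kerberos's symmetric encryption, and the secret names and the one-day TGT lifetime are assumptions for illustration, not the actual .NET or Kerberos wire protocol:

```python
import hmac
import hashlib
import time

AS_SECRET = b"as-private-key"          # known only to the Authentication Server
SERVICE_SECRET = b"service-local-key"  # known only to the service

def as_issue_ticket(client_id: str) -> str:
    # The AS authenticates the client (elided here) and signs a ticket for it.
    return hmac.new(AS_SECRET, client_id.encode(), hashlib.sha256).hexdigest()

def as_validate(client_id: str, ticket: str) -> bool:
    # Back-channel check: the service passes the submitted ticket to the AS.
    expected = hmac.new(AS_SECRET, client_id.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, ticket)

def service_grant_tgt(client_id: str, ticket: str, lifetime: int = 86400):
    # If the AS vouches for the ticket, grant a local ticket granting ticket
    # so the service need not contact the AS again until it expires.
    if not as_validate(client_id, ticket):
        return None
    payload = f"{client_id}:{int(time.time()) + lifetime}"
    tag = hmac.new(SERVICE_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return payload, tag
```

The sketch makes the vulnerability discussed below visible: every fresh login funnels through `as_validate`, so the AS and the connection to it sit on the critical path.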

The biggest vulnerability here is the central AS and the requirement for reliable real-time connectivity to it—services need to check authentication with the AS when the clients want to log in. Such a system can easily bog down for various reasons (peak demand, network congestion, server bugs, and so on) with serious consequences, as was vividly demonstrated when the Microsoft Gaming Zone shifted to this form of client authentication; some users were locked out for days. Even in good conditions, the authentication login process is bound to be experienced as slow.

The good point in the AS model is that the ticket can easily be revoked in a single location: the issuing AS itself.

An advantage of public key authentication is that it’s easy and distributed. A signature is checked against one of many public keyring servers, assuming that the key holder has published it at some time on any one. Alternatively, the signature is checked against locally cached keyrings for frequent or known clients. In the latter case, no special connectivity is required. Keys can be generated individually or by an issuing authority, depending on the situation, which can be either an advantage or a problem. The real problem with distributed public keys and the associated certificates is that they are difficult to revoke. The signing key must be revoked in a published form, and everyone who might encounter the certificate advised of it.
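The lookup order just described, local cache first, keyserver as fallback, revocation list honored, can be sketched as follows. The dictionaries stand in for a real keyring file and a network query to a keyserver; all names and data are illustrative:

```python
LOCAL_KEYRING = {"alice": "alice-pubkey"}              # locally cached keys
KEYSERVER = {"alice": "alice-pubkey", "bob": "bob-pubkey"}  # simulated server
REVOKED = {"bob-pubkey"}  # revocations must be published and propagated

def lookup_key(holder: str):
    key = LOCAL_KEYRING.get(holder)      # no connectivity needed if cached
    if key is None:
        key = KEYSERVER.get(holder)      # fall back to a keyserver query
        if key is not None:
            LOCAL_KEYRING[holder] = key  # cache for next time
    if key in REVOKED:
        return None                      # certificate is no longer valid
    return key
```

The revocation check is the weak spot: it only works if the local `REVOKED` set has actually received the published revocation, which is exactly the propagation problem noted above.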

The future prospects of both solutions are relatively good, because each caters to specific requirements in rather different areas. Still, it seems likely that the public key model (or PKI) has the clear edge as far as utility in p2p solutions is concerned and a probable edge for use on the Internet in general.

We might expect the central authority model to be the system of choice for central authority (for example, government), and thus be a key feature of any use of p2p in the political applications discussed next. But one can also imagine a scenario where a CA certificate mainly attests citizen authenticity of a PKI certificate.

Peer-to-Peer Politics

Speaking of politics, the effects of peer systems on democracy are also worth noting. After all, much of politics is about both reputation and trust, or their negative counterparts when things descend to muckraking and mudslinging.

The whole question of voting systems is not discussed elsewhere in the book, but it’s rather interesting and ties in with the previous issues of secure identification, possibly authenticated but anonymous, of the citizen. Erik Möller and others have speculated in this area, and some main points are summarized here.

Can voting systems be combined with collaborative content creation and recommendation systems in such a way that “direct democracy” becomes feasible? Perhaps. Voting in such contexts becomes an active process of collaboration on concepts. Consider an interface that would show you all arguments from the pro-side, all those from the contra-side, consideration annotations by experts, moderated debates, and a (at least somewhat) editable collaborative (perhaps Wiki-style) discussion forum. Citizens could then decide whether to accept or reject a certain political proposal within a fairly asynchronous framework, until some predetermined threshold majority is reached one way or the other.

The binary yes/no vote is not the only option; what about “More information, please”, “Don’t care”, or some other fine-grained expression from the constituency? After a political action has been agreed on, it could then be implemented or lobbied for using micropayment systems. Democracy could eventually become a massive collaborative exercise, run on decentralized p2p networks, with trusted meta-peers that collect votes.
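A tally along these lines is easy to sketch. The option names and the 60 percent threshold are assumptions for the example, not a proposal for a real voting system:

```python
from collections import Counter

OPTIONS = {"yes", "no", "more info", "don't care"}
THRESHOLD = 0.6  # fraction of decided votes needed to settle the question

def tally(votes):
    """votes: iterable of option strings. Returns the winning option once a
    threshold majority of decided (yes/no) votes is reached, else None."""
    counts = Counter(v for v in votes if v in OPTIONS)
    decided = counts["yes"] + counts["no"]
    if decided == 0:
        return None
    for option in ("yes", "no"):
        if counts[option] / decided >= THRESHOLD:
            return option
    return None  # still open: keep collecting, or act on "more info" requests
```

Because only decided votes count toward the threshold, "more info" and "don't care" responses delay closure rather than distort the outcome, one way a fine-grained ballot differs from a forced binary vote.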

Much could be done here. So far, we’ve seen mainly government administration trying to Web-disseminate information and forms to the general population (server to clients) and also provide access to centralized public archives. Some cases of information collection, such as taxes and the governmental equivalent of customer care services are in experimental deployment, and might benefit from deployed aspects of p2p infrastructure in the general Internet. However, voting and peer discussion applications would be a first major step into the p2p arena.

As noted earlier, this kind of massive change might come naturally when the p2p-friendly next-generation Internet infrastructure begins to dominate public usage and the services that can then easily be set up and integrated with it.
