I've been pretty disappointed with how the media has presented P2P (peer-to-peer) so far. That is, with a lot of vagueness. Mind, I haven't read everything, but after watching the video of Clay Shirky's keynote at the O'Reilly P2P conference, I ended up writing a synopsis of how he defined P2P, and a few things that came to mind.
What is P2P?
Shirky defines it as technology that makes resources "at the edges of the network" available to the rest of the network. "Resources" refers to anything of value that a node (usually a PC) can offer to other nodes on the network, such as CPU cycles, content, or storage space.
This sounds like nothing new, and it may very well not be, but Shirky makes an important point about the difficulty of serving files. To set up a simple PC as a server, there are countless barriers to entry: getting a fixed IP address and a constant connection to the network, registering a domain, configuring the web server, configuring the domain information. All of this makes it expensive and time-consuming to simply serve files from a computer.
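The point is worth underlining: the serving code itself is trivial. Here's a minimal sketch (using only Python's standard library; the function name is my own) that serves the current directory over HTTP. Everything that makes this hard in practice is outside the code: the fixed address, the always-on connection, the DNS entry.

```python
# A minimal file server, to illustrate that serving files is the easy part.
# The barriers Shirky lists (fixed IP, constant connection, DNS) are all
# external to this code. Hypothetical sketch, not any particular P2P app.
import threading
from http.server import HTTPServer, SimpleHTTPRequestHandler

def serve_directory(port=0):
    """Serve the current directory over HTTP in a background thread.

    port=0 asks the OS for any free port; the port actually bound is
    available afterwards via server.server_address.
    """
    server = HTTPServer(("127.0.0.1", port), SimpleHTTPRequestHandler)
    thread = threading.Thread(target=server.serve_forever, daemon=True)
    thread.start()
    return server
```

Ten lines, and the machine is a "server" -- which is exactly why the surrounding barriers, not the technology, are the interesting obstacle.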
What's interesting about "P2P" apps, then, is that they bypass all the barriers to serving files (or CPU cycles, instant messages, storage space, etc.) by creating a new namespace. Making the process of sharing resources more convenient, it seems, can result in fast-growing networks outside the traditional DNS; there are already more registered Napster users than DNS entries, and roughly twice as many ICQ/AIM users. The danger, Shirky warns, is that bypassing the DNS leaves you with private databases for public networks, which is potentially the source of some serious problems. I would add that replacing anything public with something private is cause for serious concern :>
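The "new namespace" idea is simpler than it sounds: a Napster- or ICQ-style service just keeps its own lookup table mapping handles to network addresses, updated every time a peer connects. A toy sketch (method names are my own invention, not any real protocol):

```python
# A toy "private namespace" of the kind Shirky describes: instead of DNS,
# the service keeps its own database mapping handles to addresses.
# Unlike a DNS record, an entry is ephemeral -- it's rewritten on every login,
# which is how these systems cope with dynamic IPs and intermittent nodes.

class Namespace:
    """Napster-style directory: handle -> (address, port)."""

    def __init__(self):
        self._directory = {}

    def register(self, handle, address, port):
        # A node announces itself each time it comes online.
        self._directory[handle] = (address, port)

    def resolve(self, handle):
        # Returns the peer's current address, or None if it's offline.
        return self._directory.get(handle)

    def unregister(self, handle):
        # Called (ideally) when the node disconnects.
        self._directory.pop(handle, None)
```

Note that `_directory` is exactly the "private database for a public network": whoever operates it controls who can be found.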
What's interesting about this definition of P2P is that, technology-wise, it's really nothing new (as far as I can tell). What it describes is the gradual blurring of the distinction between client and server. It's really hard to tell if anything 'revolutionary' will come of this, but even if P2P only referred to making it really easy to set up a server and to handle multiple servers in different ways, it would still be pretty interesting. Load distribution and redundancy are the obvious benefits.
However, something could be considered P2P while having absolutely nothing to do with new technology. For example, a network of mirrors that automatically downloads censored websites onto local servers, making them newly available. Such an anti-censorship app could be set up with a few scripts; it would just be a new way of using the same old tools.
This raises the question: why call it P2P? It seems like the label does more to confuse the concept than to clarify it, but giving something a name can be a way to get people to think in specific terms... provided they have any idea what the name refers to.