Tuesday 11 October 2011

Individual Topical Review Paper draft

The Unstoppable Peer-to-Peer Sharing

Executive Summary

In the online community and in the ethical arena, peer-to-peer (P2P) sharing has attracted great attention over the last few years. The pioneering P2P internet service was Napster, which focused on sharing audio files in the MP3 format. Services such as Gnutella, Madster and Freenet carried the idea forward. BitTorrent alone now has more than 100 million users and, at any given instant, more active users than YouTube and Facebook combined. As the world becomes flatter in terms of hierarchical structure and internet access, P2P flourishes. Twitter and Facebook have already adopted P2P technology internally, and we can expect more companies to follow suit. Likely future uses of the technology include broadcasting, games and cloud computing.


Historical Perspective

In the past, before P2P was widespread, people who wanted music, videos, software and other digital files had to either download them from a central server or buy compact discs (CDs) and copy the content onto their digital devices. It was not uncommon to see pirated-CD shops along the streets selling cheap copies of popular music, games and software.

People who downloaded from the internet had to bear long waiting times and hope for the best with their unstable connections, where a single cut might mean an hour wasted attempting to download a video.

P2P dynamite ignited

In May 1999, an 18-year-old college dropout named Shawn Fanning changed the music industry forever from a dormitory at Boston's Northeastern University with his file-sharing program, Napster. At the time, his roommates, who were music lovers, were unhappy with the reliability of MP3 sites: the sites were frequently dead and their indexes usually out of date. His answer to their problem was to write a program that let users share their music files over the internet and download them from multiple peers. After 60 straight hours of writing code, Napster was born.

Napster instantly became the music fan's utopia: a dreamland where almost every song was instantly available for free. At its peak, some 60 million users around the world were freely exchanging songs.

In July 2001, Napster shut down its system after multiple lawsuits were filed against the company. Artists such as Metallica alleged that it encouraged piracy by enabling and allowing its users to trade copyrighted songs through its servers.

Even in this short lifespan, Napster demonstrated the power of P2P sharing: by 2003, P2P traffic had surpassed web traffic to become the single largest traffic type by volume on ISP networks. Despite causing much controversy, Napster started a revolution that could not be stopped, despite considerable efforts from artists and the media industry.

A brief explanation of how P2P works

A P2P client creates transmission control protocol (TCP) connections with multiple hosts and makes many small data requests to each. The client then combines the chunks to recreate the file. A single file host usually has limited upload capacity, but connecting to many peers simultaneously allows faster transfers and disperses the cost of serving the data among many peers. Moreover, a client mid-way through downloading a file also acts as a server, offering the chunks it has already downloaded to others. These differences from a traditional HTTP GET request allow for lower costs and higher redundancy, since many people are sharing the same file.
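To make this mechanism concrete, here is a minimal Python sketch of the chunked, multi-peer download idea described above. It is not the wire protocol of any real P2P client: the peer addresses, the "GET_CHUNK" request format and the fixed chunk size are all invented for illustration.

import socket
from concurrent.futures import ThreadPoolExecutor

PEERS = [("peer1.example.net", 6881), ("peer2.example.net", 6881)]  # hypothetical peer addresses
CHUNK_SIZE = 16 * 1024  # 16 KiB per request, one "small data request"

def fetch_chunk(peer, index):
    # Open a TCP connection to one peer and ask for chunk `index` of the file.
    # "GET_CHUNK <index> <size>" is an invented plain-text request, not a real protocol.
    host, port = peer
    with socket.create_connection((host, port), timeout=10) as sock:
        sock.sendall(f"GET_CHUNK {index} {CHUNK_SIZE}\n".encode())
        data = b""
        while len(data) < CHUNK_SIZE:
            part = sock.recv(CHUNK_SIZE - len(data))
            if not part:  # peer closed the connection (the last chunk may be short)
                break
            data += part
        return index, data

def download(num_chunks):
    # Spread the chunk requests across the peers in parallel, then stitch the
    # pieces back together in order to recreate the original file.
    with ThreadPoolExecutor(max_workers=len(PEERS)) as pool:
        futures = [pool.submit(fetch_chunk, PEERS[i % len(PEERS)], i)
                   for i in range(num_chunks)]
        chunks = dict(f.result() for f in futures)
    return b"".join(chunks[i] for i in range(num_chunks))

In a real client, each chunk received would also be offered back to other peers while the download is still in progress, which is what turns every downloader into a server.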


Current Situation
The growth in the number of P2P technology users has been exponential since the birth of Napster. The vacuum left by Napster's demise has been filled by numerous other companies and applications, with Kazaa and Gnutella dominating the market; it has been reported that these two account for between 40% and 60% of all traffic on the Internet. This growth has been accompanied by major strides in the development and understanding of P2P technologies. Unlike the centralised approach of Napster, the new applications are decentralised, making them harder to police, which has led to considerable concern about the lack of central leadership and control.
In recent years we have seen P2P technologies being embraced by large companies trying to tap their vast potential. Two of the more notable examples are Deloitte & Touche and Intel. With P2P, central databases are no longer required, which makes systems less expensive and far easier to scale. Intel has been using P2P since 1992, avoiding the need for large central servers. Traditional databases are still commonplace today, but as more companies follow the example of organisations like Intel, such databases could be overtaken and replaced by P2P.
A significant turn in the development of P2P occurred recently when Microsoft announced plans to invest $51m in a company called Groove Networks. At the forefront of this company is Ray Ozzie, the inventor of 'Lotus Notes'. Groove Networks is implementing a hybrid technique in its development of P2P. This means it utilises both centralised and decentralised techniques in order to get the best of both worlds.
The development of P2P technologies has been hindered by a number of legal issues. The newer companies, such as Morpheus, are seeking to learn from Napster's mistakes and get legal protection. They are arguing that they are not responsible for any illegal activity perpetrated using their software. There are clearly a number of legitimate legal uses for P2P and this should safeguard its future. The only problem is how to stop illegal file sharing (movies, music etc.). At the moment no one has the answer, but the search is ongoing.
Since they are most at risk, record companies are currently looking at ways of deterring people from using copyright-busting file-sharing networks. The open nature of these systems allows record companies to attack them from within, by creating their own fake users who provide bad quality or confusingly named data, or by overloading the network with queries of their own. This technique was put forward by two students from Washington University, who outlined it in a white paper. However, the legality of this process is itself being questioned. The students also suggested that users be randomly sued, or that heavy users, P2P's equivalent of dealers, be targeted. At present this seems the most likely way to deter people from illegally sharing files.

As of January 2011, BitTorrent had 100 million users and a greater share of network bandwidth than Netflix and Hulu combined.
At any given instant, BitTorrent has, on average, more active users than YouTube and Facebook combined. (This refers to the number of active users at any moment, not to the total number of unique users.)
With its BitTorrent-powered distribution system, Facebook is now able to push a few hundred megabytes of new code to tens of thousands of machines in about a minute. The internal Facebook swarm turns every server into a peer that helps distribute the new code, so each machine is updated as quickly as possible. Without BitTorrent, this process could take several hours to complete.
Facebook is not the only large web service that uses BitTorrent to keep its servers updated. Earlier this year it was reported that Twitter is doing the same. Twitter's implementation, codenamed 'Murder', is based on the BitTornado BitTorrent client; the code is open to the public and licensed under the free-software Apache License.
Besides these social networking sites, several universities have been using BitTorrent-powered systems to update their computers for quite some time. One Dutch university reported that it retired 20 of the 22 servers it used to send updates to workstations, saving both time and money.

Future Expectations
A growing number of individuals and organisations are using BitTorrent to distribute their own or licensed material. Independent adopters report that without BitTorrent technology, with its dramatically reduced demands on their own networking hardware and bandwidth, they could not afford to distribute their files.

Future
We have already seen the origins of the peer-to-peer explosion in the area of music files, and we know the position organisations are taking against piracy. In future we may see more free-to-download music, with artists forced to distribute their work freely or lose their audience. Another alternative is the iTunes pay-per-download model, in which artists charge a minimal fee for their music.
Broadcasting prime-time news to a global audience can suffer from slow streaming. With P2P sharing, however, broadcasters might follow the trend and join companies like the CBC in distributing material through BitTorrent clients.
Many software games, especially those whose large size makes them difficult to host because of bandwidth limits, extremely frequent downloads and unpredictable spikes in network traffic, may instead ship a specialised, stripped-down BitTorrent client with just enough functionality to download the game from other running clients and from the primary server (which is maintained in case not enough peers are available).
Government organisations in the UK have used BitTorrent to distribute information about how taxpayers' money was spent, and more such public information might be shared in this manner.

Before Steve Jobs passed away, he sparked people's imaginations by putting the concept of cloud computing into everyday vocabulary. Possibly, cloud computing could one day be integrated with P2P sharing.
