
Friday, March 29, 2013

Internet Basics


Internet users around the globe


Internet users around the globe are facing slowed-down service, thanks to what's being called the biggest cyberattack in history.
The prolonged denial-of-service assault is targeting The Spamhaus Project, a European spam-fighting group that has gone after CyberBunker, a data-storage company that offers to host any content "except child porn and anything related to terrorism."
The organization has been in a long-running feud with CyberBunker and claims spammers use it as a host from which to spray junk mail across the Web.
Internet security firm CloudFlare said Spamhaus contacted it last week, saying it had been hit with an attack big enough to knock its site offline.
Security experts say the attack uses more sophisticated techniques than most DDoS (distributed denial of service) attacks and targets the Web's infrastructure, which has led to other sites performing slowly.


"It's the biggest attack we've seen," Matthew Prince, CloudFlare's CEO, told CNN.
The FBI is involved in the investigation into the cyberattack on Spamhaus, though a bureau spokesman didn't provide any details on the FBI's role or the scope of the probe.
The Spamhaus Project is a nonprofit organization that patrols the Internet for spammers and publishes a list of Web servers those spammers use. According to Prince, the group may be responsible for up to 80% of all spam that gets blocked. This month, the group added CyberBunker to its blacklist.
"While we don't know who was behind this attack, Spamhaus has made plenty of enemies over the years," Prince wrote in a blog post. "Spammers aren't always the most lovable of individuals, and Spamhaus has been threatened, sued and DDoSed regularly."
In a DDoS attack, computers flood a website with requests, overwhelming its servers and causing it to crash or become inaccessible for many users.
One way to defend against those attacks, Prince said, is to deflect some of the traffic targeted at a single server onto a bunch of other servers at different locations. That's what happened in this case, and why Web users experienced some slowdowns on other sites.
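One common defense, roughly the approach Prince describes, is to spread incoming requests across many servers in different locations so that no single machine absorbs the full load. The sketch below illustrates that general idea only; the server names and hashing scheme are hypothetical and are not CloudFlare's actual system.

```python
# Minimal sketch (not CloudFlare's implementation): spread incoming
# requests across several edge servers in different locations so that
# no single machine absorbs the full traffic load.
import hashlib

# Hypothetical pool of edge servers in different data centers.
EDGE_SERVERS = ["edge-ams", "edge-sfo", "edge-nrt", "edge-fra"]

def pick_server(client_ip: str) -> str:
    """Deterministically map a client to one edge server."""
    digest = hashlib.sha256(client_ip.encode()).hexdigest()
    return EDGE_SERVERS[int(digest, 16) % len(EDGE_SERVERS)]

if __name__ == "__main__":
    for ip in ["203.0.113.5", "198.51.100.7", "192.0.2.99"]:
        print(ip, "->", pick_server(ip))
```

Because each client is mapped deterministically, a flood aimed at one target ends up divided among many machines, which is why unrelated sites sharing that infrastructure can slow down rather than any single site going dark.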
He told CNN the last big wave of the attack hit Tuesday morning, but that he doesn't "live under the illusion" that there won't be more.
For its part, CyberBunker isn't taking credit for the attack. But the Dutch company, housed in a former NATO nuclear bunker, isn't shying away, either.
"This here is the internet community puking out SpamHaus," CyberBunker founder Sven Olaf Kamphuis told CNN. "We've had it with the guys ... . What we see right here is the internet puking out a cancer."
He said the owners of various websites got together on a Skype chat and hatched the plans for the attack. He says that StopHaus, a group organized to support CyberBunker in the dispute, ceased the attack after three days but that other hackers and activists kept it up after that.
Kamphuis and other critics say that Spamhaus oversteps its bounds and has essentially destroyed innocent websites in its spam-fighting efforts.
"Spamhaus itself is a more urgent danger" than spam, Kamphuis told CNN. "Pointing at websites and saying they want it shut down and then they get it shut down without any court order. That is a significantly larger threat to internet and freedom of speech and net neutrality than anything else."
Vincent Hanna, a researcher with The Spamhaus Project, said the group's record speaks for itself. He said the project has existed for over 12 years and its data is used to protect more than 1.7 billion e-mail accounts worldwide.
"We have 1.7 billion people looking over our shoulders to make sure we do our job right," he said. "If we start blocking things they want, they won't use our data any more."
He emphasized that Spamhaus doesn't have the power to block e-mail from anyone -- it merely makes its data available for service providers and other Web companies to use.
Hanna said Spamhaus experienced its first denial-of-service attack in 2003.
"This has been the biggest for us," he said, "but certainly not the first one."
CloudFlare's Prince said denying access to a website through cyberattacks is the truest assault on Web freedom.
"Our role is to allow the internet to achieve what it aspires to -- that anyone, anywhere can publish any piece of information and make it accessible to anyone, anywhere else in the world," he said. "It's blatant censorship.
"Whether Spamhaus is a good organization or a bad organization is irrelevant to me. We protect American financial institutions, which some people think are evil, and we protect WikiLeaks, which some people think are evil."

From Gopher to the WWW


One of the most promising user interface paradigms during this period was hypertext. The technology had been inspired by Vannevar Bush's "Memex" and developed through Ted Nelson's research on Project Xanadu and Douglas Engelbart's research on NLS. Many small self-contained hypertext systems had been created before, such as Apple Computer's HyperCard (1987). Gopher became the first commonly used hypertext interface to the Internet. While Gopher menu items were examples of hypertext, they were not commonly perceived in that way.


A NeXT Computer used by Sir Tim Berners-Lee at CERN became the world's first Web server.
In 1989, while working at CERN, Tim Berners-Lee invented a network-based implementation of the hypertext concept. By releasing his invention to public use, he ensured the technology would become widespread.[86] For his work in developing the World Wide Web, Berners-Lee received the Millennium Technology Prize in 2004. One early popular web browser, modeled after HyperCard, was ViolaWWW.

A turning point for the World Wide Web began with the introduction of the Mosaic web browser[89] in 1993, a graphical browser developed by a team at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign (NCSA-UIUC), led by Marc Andreessen. Funding for Mosaic came from the High-Performance Computing and Communications Initiative, a funding program initiated by the High Performance Computing and Communication Act of 1991 also known as the Gore Bill.[90] Mosaic's graphical interface soon became more popular than Gopher, which at the time was primarily text-based, and the WWW became the preferred interface for accessing the Internet. (Gore's reference to his role in "creating the Internet", however, was ridiculed in his presidential election campaign. See the full article Al Gore and information technology).
Mosaic was eventually superseded in 1994 by Andreessen's Netscape Navigator, which replaced Mosaic as the world's most popular browser. While it held this title for some time, eventually competition from Internet Explorer and a variety of other browsers almost completely displaced it. Another important event held on January 11, 1994, was The Superhighway Summit at UCLA's Royce Hall. This was the "first public conference bringing together all of the major industry, government and academic leaders in the field [and] also began the national dialogue about the Information Superhighway and its implications."
24 Hours in Cyberspace, "the largest one-day online event" (February 8, 1996) up to that date, took place on the then-active website, cyber24.com. It was headed by photographer Rick Smolan.[94] A photographic exhibition was unveiled at the Smithsonian Institution's National Museum of American History on January 23, 1997, featuring 70 photos from the project.

E-mail, Simple Mail Transfer Protocol, and Usenet


Email is often called the killer application of the Internet. However, it actually predates the Internet and was a crucial tool in creating it. Email started in 1965 as a way for multiple users of a time-sharing mainframe computer to communicate. Although the history is unclear, among the first systems to have such a facility were SDC's Q32 and MIT's CTSS.

The ARPANET computer network made a large contribution to the evolution of email. There is one report[82] indicating experimental inter-system email transfers on it shortly after ARPANET's creation. In 1971 Ray Tomlinson created what was to become the standard Internet email address format, using the @ sign to separate user names from host names.

A number of protocols were developed to deliver messages among groups of time-sharing computers over alternative transmission systems, such as UUCP and IBM's VNET email system. Email could be passed this way between a number of networks, including ARPANET, BITNET and NSFNET, as well as to hosts connected directly to other sites via UUCP. See the history of the SMTP protocol.
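SMTP, which eventually became the standard way to move mail between Internet hosts, is a simple text-based protocol that modern languages can still speak directly. As a hedged illustration, the sketch below uses Python's standard smtplib; the relay host and addresses are placeholders, not real servers.

```python
# Illustrative only: sending a message over SMTP with Python's standard
# library. The host and addresses below are placeholders, not real servers.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "alice@example.org"   # user name @ host name, per Tomlinson's convention
msg["To"] = "bob@example.net"
msg["Subject"] = "Hello over SMTP"
msg.set_content("A short test message.")

with smtplib.SMTP("smtp.example.org", 25) as server:  # placeholder relay
    server.send_message(msg)
```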
In addition, UUCP allowed the publication of text files that could be read by many others. The News software developed by Steve Daniel and Tom Truscott in 1979 was used to distribute news and bulletin board-like messages. This quickly grew into discussion groups, known as newsgroups, on a wide range of topics. On ARPANET and NSFNET similar discussion groups would form via mailing lists, discussing both technical issues and more culturally focused topics (such as science fiction, discussed on the sflovers mailing list).
During the early years of the Internet, email and similar mechanisms were also fundamental to accessing resources that were not otherwise reachable online. UUCP was often used to distribute files using the 'alt.binary' groups. FTP e-mail gateways also allowed people who lived outside the US and Europe to download files using FTP commands written inside email messages. The file was encoded, broken into pieces and sent by email; the receiver had to reassemble and decode it later, and for people living overseas this was the only practical way to download items such as the early Linux versions over the slow dial-up connections available at the time. After the popularization of the Web and the HTTP protocol, such tools were slowly abandoned.
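The mechanics of those FTP-to-email gateways can be sketched roughly as follows: encode the file as text, split it into mail-sized pieces, and reverse the process on the receiving end. This is a simplified illustration of the idea, not the code any particular gateway actually used, and the chunk size is an arbitrary assumption.

```python
# Rough sketch of the FTP-by-email idea: encode a binary file as text,
# split it into mail-sized chunks, then reassemble and decode the chunks.
import base64

CHUNK_SIZE = 60_000  # hypothetical per-message size limit

def split_for_mail(data: bytes) -> list[str]:
    text = base64.b64encode(data).decode("ascii")
    return [text[i:i + CHUNK_SIZE] for i in range(0, len(text), CHUNK_SIZE)]

def reassemble(chunks: list[str]) -> bytes:
    return base64.b64decode("".join(chunks))

if __name__ == "__main__":
    payload = b"pretend this is a kernel tarball " * 1000
    chunks = split_for_mail(payload)
    assert reassemble(chunks) == payload
    print(f"{len(payload)} bytes sent as {len(chunks)} message(s)")
```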

Search engine (computing)


Even before the World Wide Web, there were search engines that attempted to organize the Internet. The first of these was the Archie search engine from McGill University in 1990, followed in 1991 by WAIS and Gopher. All three of those systems predated the invention of the World Wide Web but all continued to index the Web and the rest of the Internet for several years after the Web appeared. There are still Gopher servers as of 2006, although there are a great many more web servers.
As the Web grew, search engines and Web directories were created to track pages on the Web and allow people to find things. The first full-text Web search engine was WebCrawler in 1994. Before WebCrawler, only Web page titles were searched. Another early search engine, Lycos, was created in 1993 as a university project, and was the first to achieve commercial success. During the late 1990s, both Web directories and Web search engines were popular—Yahoo! (founded 1994) and AltaVista (founded 1995) were the respective industry leaders. By August 2001, the directory model had begun to give way to search engines, tracking the rise of Google (founded 1998), which had developed new approaches to relevancy ranking. Directory features, while still commonly available, became afterthoughts to search engines.
Database size, which had been a significant marketing feature through the early 2000s, was similarly displaced by emphasis on relevancy ranking, the methods by which search engines attempt to sort the best results first. Relevancy ranking first became a major issue circa 1996, when it became apparent that it was impractical to review full lists of results. Consequently, algorithms for relevancy ranking have continuously improved. Google's PageRank method for ordering the results has received the most press, but all major search engines continually refine their ranking methodologies with a view toward improving the ordering of results. As of 2006, search engine rankings are more important than ever, so much so that an industry has developed ("search engine optimizers", or "SEO") to help web-developers improve their search ranking, and an entire body of case law has developed around matters that affect search engine rankings, such as use of trademarks in metatags. The sale of search rankings by some search engines has also created controversy among librarians and consumer advocates.
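The core idea behind PageRank, that a page is important when other important pages link to it, can be illustrated with a short power-iteration sketch over a toy link graph. The graph, damping factor, and iteration count below are illustrative assumptions; this is not Google's production ranking system, which combines many additional signals.

```python
# Toy power-iteration PageRank over a tiny hypothetical link graph.
DAMPING = 0.85

links = {  # page -> pages it links to (hypothetical graph)
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):  # iterate until the ranks settle
    new_rank = {p: (1 - DAMPING) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = DAMPING * rank[page] / len(outlinks)
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

In this toy graph, page "c" ends up ranked highest because every other page links to it, which is exactly the intuition relevancy ranking tries to capture.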
On June 3, 2009, Microsoft launched its new search engine, Bing. The following month Microsoft and Yahoo! announced a deal in which Bing would power Yahoo! Search.

File sharing



File sharing, Peer-to-peer file sharing, and Timeline of file sharing
Resource or file sharing has been an important activity on computer networks from well before the Internet was established and was supported in a variety of ways including bulletin board systems (1978), Usenet (1980), Kermit (1981), and many others. The File Transfer Protocol (FTP) for use on the Internet was standardized in 1985 and is still in use today. A variety of tools were developed to aid the use of FTP by helping users discover files they might want to transfer, including the Wide Area Information Server (WAIS) in 1991, Gopher in 1991, Archie in 1991, Veronica in 1992, Jughead in 1993, Internet Relay Chat (IRC) in 1988, and eventually the World Wide Web (WWW) in 1991 with Web directories and Web search engines.
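FTP is still directly usable from most languages today. As a minimal illustration, the sketch below uses Python's standard ftplib; the host name, directory, and file name are placeholders rather than a real server.

```python
# Minimal anonymous FTP session using Python's standard ftplib.
# "ftp.example.org" and its paths are placeholders, not a real server.
from ftplib import FTP

with FTP("ftp.example.org") as ftp:
    ftp.login()                      # anonymous login
    ftp.cwd("/pub")                  # change to a (hypothetical) directory
    print(ftp.nlst())                # list file names
    with open("README", "wb") as out:
        ftp.retrbinary("RETR README", out.write)  # download one file
```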
In 1999, Napster became the first peer-to-peer file sharing system. Napster used a central server for indexing and peer discovery, but the storage and transfer of files was decentralized. A variety of peer-to-peer file sharing programs and services with different levels of decentralization and anonymity followed, including Gnutella, eDonkey2000, and Freenet in 2000; FastTrack, Kazaa, LimeWire, and BitTorrent in 2001; and Poisoned in 2003.
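The split Napster introduced, a central index with decentralized transfers, can be pictured as a server-side table mapping file names to the peers that claim to hold them, with downloads then happening directly between peers. The data structure and peer addresses below are purely illustrative.

```python
# Sketch of a Napster-style central index: the server only records which
# peers claim to have which files; transfers happen directly between peers.
from collections import defaultdict

index: dict[str, set[str]] = defaultdict(set)  # file name -> peer addresses

def announce(peer: str, files: list[str]) -> None:
    """A peer registers the files it is willing to share."""
    for name in files:
        index[name].add(peer)

def lookup(name: str) -> set[str]:
    """Return the peers that advertise a given file."""
    return index.get(name, set())

# Hypothetical peers registering and searching.
announce("10.0.0.5:6699", ["song_a.mp3", "song_b.mp3"])
announce("10.0.0.9:6699", ["song_b.mp3"])
print(lookup("song_b.mp3"))  # the requester then connects to one of these peers
```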

All of these tools are general purpose and can be used to share a wide variety of content, but sharing of music files, software, and later movies and videos became major uses. And while some of this sharing is legal, large portions are not. Lawsuits and other legal actions caused Napster in 2001, eDonkey2000 in 2005, Kazaa in 2006, and LimeWire in 2010 to shut down or refocus their efforts.

The Pirate Bay, founded in Sweden in 2003, continues despite a trial and appeal in 2009 and 2010 that resulted in jail terms and large fines for several of its founders. File sharing remains contentious and controversial with charges of theft of intellectual property on the one hand and charges of censorship on the other.

Dot-com bubble


Suddenly the low price of reaching millions worldwide, and the possibility of selling to or hearing from those people at the same moment when they were reached, promised to overturn established business dogma in advertising, mail-order sales, customer relationship management, and many more areas. The web was a new killer app—it could bring together unrelated buyers and sellers in seamless and low-cost ways. Entrepreneurs around the world developed new business models, and ran to their nearest venture capitalist. While some of the new entrepreneurs had experience in business and economics, the majority were simply people with ideas, and did not manage the capital influx prudently. Additionally, many dot-com business plans were predicated on the assumption that by using the Internet, they would bypass the distribution channels of existing businesses and therefore not have to compete with them; when the established businesses with strong existing brands developed their own Internet presence, these hopes were shattered, and the newcomers were left attempting to break into markets dominated by larger, more established businesses. Many did not have the ability to do so.
The dot-com bubble burst in March 2000, with the technology-heavy NASDAQ Composite index peaking at 5,048.62 on March 10 (5,132.52 intraday), more than double its value just a year before. By 2001, the bubble's deflation was running at full speed. A majority of the dot-coms had ceased trading after burning through their venture capital and IPO capital, often without ever making a profit. Despite this, the Internet has continued to grow, driven by commerce, ever-greater amounts of online information and knowledge, and social networking.