Paul T Kidd's eBusiness Pages

Next Generation Internet

Introduction

There are numerous initiatives underway to create the Next Generation Internet, sometimes referred to as the Grid. There are also a number of company-specific labels in the public domain, such as Microsoft's .Net, Sun's Sun ONE and IBM's WebSphere. Microsoft's vision is one of a connected web where things happen automatically, rather than users having to make their own connections and pass information on to other people manually. The World Wide Web Consortium (W3C) also regards the current version of the Internet as a prototype, and is working to produce something that it calls the Semantic Web. Underlying these developments is Extensible Markup Language (XML). XML already brings structure to the net and is particularly useful for user interfaces and for making assertions about the meaning and relevance of data.
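
To make the idea of structure concrete, here is a small illustration in Python. The element and attribute names are invented for this example and are not part of any standard: the point is simply that XML data carries its own structure, so software can query it directly rather than guessing at the layout of a page.

```python
# A minimal illustration (not part of any standard) of how XML brings
# structure to data: element and attribute names below are invented.
import xml.etree.ElementTree as ET

document = """
<order id="1042">
    <customer>Example Ltd</customer>
    <item sku="A-77" quantity="3">
        <unitPrice currency="GBP">4.50</unitPrice>
    </item>
</order>
"""

root = ET.fromstring(document)

# Because the data carries its own structure, software can extract
# meaning directly instead of guessing at the layout of a page.
for item in root.iter("item"):
    quantity = int(item.get("quantity"))
    price = float(item.find("unitPrice").text)
    print(f"Order {root.get('id')}: {quantity} x {item.get('sku')} "
          f"at {price} {item.find('unitPrice').get('currency')} each")
```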

Agent Technologies

The Internet currently provides the wiring. What is needed to provide intelligence in the Internet is software technology embedded in the Internet itself. One such technology is agents and multi-agent systems, which are capable of acting autonomously for their owners (individuals, companies and others): pursuing goals, modifying their behaviour, co-operating, negotiating, and so on. These agents would interact with other agents dynamically. Numerous efforts are being directed at agent and multi-agent systems world-wide. Open issues and challenges include how to describe and reason about services, how to handle different languages, how to enable brokerage between different agents, and how to scale such systems.
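
As a very rough sketch of what "pursuing goals" and "negotiating" might mean in code, consider the following Python fragment. All class and method names are invented for illustration; real multi-agent systems are far more elaborate.

```python
# A hypothetical sketch of agents negotiating for their owners.
# All class and method names are invented for illustration only.
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class Offer:
    seller: str
    price: float

class SellerAgent:
    def __init__(self, name: str, floor: float, asking: float):
        self.name, self.floor, self.asking = name, floor, asking

    def initial_offer(self) -> Offer:
        return Offer(self.name, self.asking)

    def respond(self, counter: float) -> Offer | None:
        if counter >= self.floor:            # counter-offer is acceptable
            return Offer(self.name, counter)
        self.asking *= 0.95                  # otherwise concede a little
        return Offer(self.name, self.asking) if self.asking >= self.floor else None

class BuyerAgent:
    """Pursues a goal (buy within budget) by issuing counter-offers."""
    def __init__(self, owner: str, budget: float):
        self.owner, self.budget = owner, budget

    def negotiate(self, seller: SellerAgent) -> Offer | None:
        offer = seller.initial_offer()
        while offer is not None and offer.price > self.budget:
            offer = seller.respond(offer.price * 0.9)   # concede gradually
        return offer

buyer = BuyerAgent("Example Ltd", budget=100.0)
print(buyer.negotiate(SellerAgent("SupplierBot", floor=80.0, asking=140.0)))
```

The two agents converge on a price acceptable to both owners, or walk away, without either owner intervening; that, in miniature, is the autonomy the research aims at.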

The World Wide Web Consortium (W3C) has created guidelines for people creating agents. W3C also has responsibility for XML, which will eventually allow agents to use semantic information to perform and steer their tasks.

Peer-to-Peer Networking

The agent-based approach attempts to put intelligence into the Internet. An alternative approach is to use the intelligence that already exists at the periphery of the Internet, that is, in the millions of personal computers connected to the Internet in offices and homes via company servers and Internet Service Providers. This field of activity is known as peer-to-peer (P2P) networking. The technology has largely emerged from the USA over the past four years. Names associated with the field are Napster, Gnutella, Flycode, Freenet, GoneSilent and so on. There are also vendors (e.g. Groove Networks) selling development tools for business applications of P2P networking.

Napster, the music download web site, is an example of P2P technology. Napster provides a directory of music files located on many personal computers throughout the world, plus a piece of software that allows people to download those files. The system works as follows. Anyone with music files on their computer can publish, on the Napster server, a list of files they are willing to share. The server matches requests for files with the list of providers, but the files themselves are transferred directly from personal computer to personal computer. The files do not pass via the Napster server, as they would in the traditional Internet model we know today. This is possible because Napster solved the technical problem of working around the structure of the Internet, allowing computers that do not have domain names to locate each other and exchange specific files.
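
The essence of this hybrid model, a central directory with direct peer transfers, can be sketched in a few lines of Python. The classes and addresses below are illustrative only and do not reflect Napster's actual protocol.

```python
# A sketch of the hybrid Napster model: central directory, direct
# transfers. Names and addresses are illustrative, not the real protocol.

class DirectoryServer:
    """Knows WHO has WHAT; never stores or relays the files themselves."""
    def __init__(self):
        self.index = {}                      # filename -> list of peer addresses

    def publish(self, peer: str, filenames: list) -> None:
        for name in filenames:
            self.index.setdefault(name, []).append(peer)

    def lookup(self, name: str) -> list:
        return self.index.get(name, [])

class Peer:
    def __init__(self, address: str, files: dict):
        self.address, self.files = address, files

    def fetch(self, name: str) -> bytes:
        return self.files[name]              # direct peer-to-peer transfer

server = DirectoryServer()
alice = Peer("alice.example:6699", {"song.mp3": b"...audio bytes..."})
server.publish(alice.address, list(alice.files))

# A downloader asks the server WHERE the file is, then contacts that
# peer directly; the bytes never pass through the server.
print(server.lookup("song.mp3"))             # ['alice.example:6699']
data = alice.fetch("song.mp3")
```

Because the server holds only metadata it scales cheaply, but it is also a single point of failure, and of legal liability, which is precisely where Gnutella differs.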

Although still in its infancy from an industry-acceptance point of view (in the case of Napster, for obvious reasons), this type of technology is already dated. A system called Gnutella leads to the same end result, obtaining music free of charge, but without the central server directory. Gnutella users just download the software needed to operate the system. When someone makes a request for a file, the request is examined by other personal computers and relayed onwards to yet other personal computers down the chain, until the requested file has been located.
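
This relaying behaviour can be illustrated with a small Python sketch of query flooding with a hop limit; the topology and names are invented for the example.

```python
# A sketch of Gnutella-style query flooding: no central directory;
# each node checks its own files and relays the request onwards until
# the file is found or a hop limit (TTL) expires. Topology is invented.
from __future__ import annotations

class Node:
    def __init__(self, name: str, files: set):
        self.name, self.files = name, files
        self.neighbours: list[Node] = []

    def search(self, filename: str, ttl: int = 4, seen: set | None = None) -> str | None:
        seen = set() if seen is None else seen
        if ttl == 0 or self.name in seen:
            return None                      # hop limit reached or already asked
        seen.add(self.name)
        if filename in self.files:
            return self.name                 # found locally
        for peer in self.neighbours:         # relay down the chain
            hit = peer.search(filename, ttl - 1, seen)
            if hit is not None:
                return hit
        return None

a, b, c = Node("a", set()), Node("b", set()), Node("c", {"song.mp3"})
a.neighbours, b.neighbours = [b], [c]
print(a.search("song.mp3"))                  # -> 'c'
```

The hop limit is what stops a flooded request from circulating forever, at the cost of possibly missing files that sit too many hops away.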

Freenet differs from Gnutella in two important respects. First, the information is encrypted, so the person who originally put it onto Freenet cannot be identified. Once the information has been posted, it moves randomly to another computer, and the user of that computer will not know what information is on the machine. Because Freenet has no central directory, a search looks through the network each time anyone seeks a file. The second difference is that Freenet tries to be efficient. Unlike Gnutella, it looks out for popular files and ensures that a number of copies exist in various places. Freenet will also move information close to where it is in demand, which helps ensure that computers holding the information are not overloaded with requests, and enables access to files even when the originating computer is off-line.
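
The replication idea, copies accumulating along the path a reply travels, can be sketched as follows. This toy Python example models only the caching behaviour, not Freenet's encryption or key-based routing, and all names are invented.

```python
# A toy model of one Freenet idea: replies are cached at every node
# they pass through, so popular files become replicated near demand
# and remain available when the originating computer goes off-line.
# Encryption and key-based routing are omitted; names are invented.
from __future__ import annotations

class CachingNode:
    def __init__(self, name: str, store: dict | None = None):
        self.name = name
        self.store = store if store is not None else {}
        self.next_hop: CachingNode | None = None

    def request(self, key: str) -> bytes | None:
        if key in self.store:
            return self.store[key]           # served from the local cache
        if self.next_hop is None:
            return None
        data = self.next_hop.request(key)
        if data is not None:
            self.store[key] = data           # replicate on the way back
        return data

origin = CachingNode("origin", {"report": b"contents"})
middle = CachingNode("middle"); middle.next_hop = origin
edge = CachingNode("edge"); edge.next_hop = middle

edge.request("report")                       # first fetch walks the whole chain
origin.store.clear()                         # the originating computer vanishes
print(edge.request("report"))                # still served: b'contents'
```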

There is also growing commercial interest in these technologies, represented by the involvement of blue-chip companies such as Siemens, Intel and IBM, and tool and technology vendors supporting reputable applications are beginning to emerge. Most of the focus in P2P technologies at the moment is on sharing copyrighted digital content (both legally and illegally). There is also growing interest in building faster search engines with these technologies, expanding the range of sources searched by adding computers on the fringes of the Internet. Industry interest also lies in more effective sharing of information and computing resources across the enterprise.

There are already two well-publicised examples of P2P networking technologies enabling scientific studies that would previously have been impossible owing to time and money constraints. Both involve the use of home personal computers, networked using P2P technologies, to carry out scientific analysis. The first was the Seti@Home initiative, which used the power of more than two million PCs to analyse radio-telescope signals for evidence of extraterrestrial intelligence. The second, a project in the health field, began in March 2001 and involves using PCs to screen 250 million chemicals for anti-cancer activity.
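
The underlying pattern, a coordinator splitting one large job into many small work units that are processed by otherwise idle machines, can be illustrated with a Python sketch. Here a local process pool stands in for the volunteers' PCs, and the analysis function is a toy stand-in.

```python
# A sketch of the volunteer-computing pattern behind such projects:
# a coordinator splits one large job into small work units and farms
# them out; here a local process pool stands in for volunteers' PCs,
# and the "analysis" function is a toy stand-in.
from concurrent.futures import ProcessPoolExecutor

def analyse(work_unit: list) -> int:
    # Stand-in for real signal analysis or chemical screening.
    return sum(x * x for x in work_unit)

def main():
    signal = list(range(1_000_000))
    # Split the job into independent work units, as a server would.
    units = [signal[i:i + 100_000] for i in range(0, len(signal), 100_000)]
    with ProcessPoolExecutor() as pool:      # each worker is a "volunteer"
        results = list(pool.map(analyse, units))
    print(sum(results))                      # combine the returned results

if __name__ == "__main__":
    main()
```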

Many developers in the P2P field have adopted the open source philosophy. The field also has reputable interests and potential applications; it is not just a vehicle for copyright-infringing activities such as free music file sharing. The technology does in fact appear to be ripe for application, providing a powerful means of both expanding the information resources available on the web and searching for those resources.

Conclusions

In connection with software technologies such as agents and peer-to-peer networking, a major issue is security, trust and privacy. This is already a big problem, and these new technologies seem set to create even bigger security, trust and privacy problems. With regard to agents and services, there is a need to answer questions such as how an agent can describe the information and resources at its disposal, and how it specifies the tasks it has to carry out. With regard to agents and brokerage, the questions are how an agent finds another agent capable of doing a task, how an agent can resist other agents ganging up on it, and how to find the best fit between goals and identified services. Scalability is also a difficult area because of problems such as feedback, which can result in instability or undesirable behaviour.

The coming years will certainly see significant developments in Internet technologies. What we have today is only a rudimentary system. The Next Generation Internet will be more sophisticated and offer greater potential to do entirely new things. It will, however, also pose even more severe problems for security, trust and privacy.

 
