The World Wide Web: Past, Present and Future
Initially, the World Wide Web was conceived as an interactive space of shared information through which people could communicate with each other and with machines. The Web is becoming a useful resource even for small groups, households and individual information providers. Further interesting trends are the growing interactivity of user interfaces and the increasing use of machine-readable information with defined semantics, which will enable more sophisticated machine processing of global information, including machine-readable statements.
Disclaimer: This document presents the author's own opinions, not those of the members of the World Wide Web Consortium or of its host institutions. It outlines the Web's past, its present state and some possible directions for the near term. The Web is most easily understood as the universe of information that is globally and uniformly accessible.
It is an abstract space with which people can interact, currently inhabited chiefly by interlinked pages of text, images and animations, with occasional sounds, three-dimensional worlds and video. Its existence marked the end of an era of frustrating and debilitating incompatibilities between computer systems. As the system's commercial potential has driven a rapid evolution of new features, maintaining the overall interoperability of the Web has become an ongoing challenge for everyone involved.
Let us begin, however, as promised, with the original goals of the project, which was conceived in response to the author's own needs as well as the perceived needs of the business, scientific and engineering communities, and of the world at large. The idea of the hypertext link can be traced to Vannevar Bush's famous 1945 Atlantic Monthly article "As We May Think", in which he proposed the "Memex" machine: by a process of binary coding, photocells and instant photography, it would allow the creation and automatic following of cross-references between microfilms.
The thread continues with Doug Engelbart's "NLS" system, which offered hypertext e-mail and document sharing, and with Ted Nelson, who coined the term "hypertext". For all these visionaries, the reality of the world in which the technologically rich field of high-energy physics found itself in 1980 was one of incompatible networks, disk formats, data formats and character encoding schemes, which made any attempt to transfer information between different kinds of systems a daunting and generally impractical task.
The aim of the Web was to be a common information space through which people (and machines) could communicate. This space was intended to span from private information systems to information of public interest, and from high-value, carefully checked and designed material to off-the-cuff ideas that matter to only a few people and may never be read.
A few basic design principles were applied to the Web. When two groups of users who have started to use the system independently wish to connect, making a link from one group's material to the other's should be an incremental effort, requiring no unscalable operation such as the merging of link databases.
Any attempt to constrain users as a whole to a particular language or operating system was always doomed to fail: information must be available on all platforms, including future ones. Any attempt to constrain the mental models users have of information into a given pattern was likewise doomed to fail. And if information within an organization is to be accurately represented in the system, entering or correcting it must be trivial for the person directly knowledgeable.
The author's experience was with a number of proprietary systems, systems home-grown by physicists, and his own Enquire program (1980), which allowed arbitrary links and was useful personally but not usable across a wide area network. Indeed, one of the goals of the Web was that, if the interaction between a person and hypertext could be so intuitive that the machine-readable information space gave an accurate representation of the state of people's thoughts, interactions and work patterns, then machine analysis could become a very powerful management tool: one that sees the patterns in our work and facilitates working together through the typical problems that beset the management of large organizations.
The World Wide Web architecture was proposed in 1989. It rested on universal identifiers and on HTTP, with negotiation of the data format. Thus the pre-existing FTP protocol could be mixed with the new HTTP protocol in the address space, and traditional text documents could be mixed with new hypertext documents.
There was no mechanism, however, to guarantee that links stayed persistent, or that links to a document were removed when the document itself was deleted. What the design did require was a universally available space of identifiers. Each identifier begins with a prefix such as "http:" that specifies the space in which the rest of the string is to be interpreted. The universality of this space means that any new space of any kind that has some form of identification, naming or addressing mechanism can be mapped into a printable form, given a prefix, and so become part of URI space.
The properties of a given URI vary with the properties of the space into which it points. Depending on those properties, some spaces have tended to be called "name" spaces and others "address" spaces; but the real properties of a space depend not only on its definition, syntax and protocol support, but also on the social structures that it supports and that govern the allocation and reallocation of identifiers.
A further interesting property of a URI is that it can identify an object (for example, a document) generically. One URI can be defined for a work that is available in several languages and several data formats; another URI for the same work in one particular language; and yet another URI for a bit stream representing one particular rendition of the work in a particular language and a particular data format.
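The prefix rule described above can be illustrated with Python's standard library, which splits a URI into its scheme and the remainder to be interpreted within that scheme's space (the example URIs are invented for illustration):

```python
from urllib.parse import urlsplit

# Each URI begins with a scheme prefix naming the space in which
# the rest of the string is to be interpreted.
for uri in ["http://example.org/report",
            "ftp://example.org/pub/report.txt",
            "mailto:webmaster@example.org"]:
    parts = urlsplit(uri)
    print(parts.scheme, "->", uri[len(parts.scheme) + 1:])
```

Any new naming or addressing mechanism can join this space simply by registering a new prefix; no existing software need change.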
This notion of the "identity" of a Web object thus allows a genericity that is unusual in object-oriented systems. When the protocols for access to remote documents were designed, the existing standard was the File Transfer Protocol (FTP). FTP was not ideal for the Web, however, being too slow and not sufficiently rich in features, so a new protocol was designed that operates at the speed necessary for traversing hypertext links: the HyperText Transfer Protocol (HTTP).
One feature of HTTP is that it allows a client to express to the server its preferences for languages and data formats. This feature, known as format negotiation, is a crucial part of the independence between the HTTP specification and the HTML specification. For the exchange of hypertext, the HyperText Markup Language (HTML) was defined as a data format to be transmitted over the wire.
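A minimal sketch of how such negotiation can work, assuming a client that ranks formats using HTTP quality values (the header and the list of available formats are illustrative):

```python
# Parse an HTTP Accept header and pick the best format the server
# can supply, honouring the client's q-values.
def parse_accept(header):
    prefs = {}
    for item in header.split(","):
        parts = [p.strip() for p in item.split(";")]
        q = 1.0
        for p in parts[1:]:
            if p.startswith("q="):
                q = float(p[2:])
        prefs[parts[0]] = q
    return prefs

def negotiate(accept_header, available):
    prefs = parse_accept(accept_header)
    ranked = sorted(available, key=lambda t: prefs.get(t, 0.0), reverse=True)
    best = ranked[0]
    return best if prefs.get(best, 0.0) > 0 else None

print(negotiate("text/html;q=0.9, text/plain;q=0.5",
                ["text/plain", "text/html"]))  # -> text/html
```

Because the negotiation happens in HTTP headers, the set of formats can grow without any change to the transfer protocol itself.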
Faced with the anticipated difficulty of persuading the world to use a new global information system, it was decided that HTML should resemble existing SGML-based formats, so as to encourage its adoption both by the documentation community, for whom SGML was a preferred syntax, and by the hypertext community, for whom SGML was the only syntax then considered a candidate.
This made it easy to build "wysiwyg" editors that could browse and, at the same time, create new links and new documents.
The first Web pages, describing the Web itself, were created with such a tool, with links to sound and graphics files, and were served by a simple HTTP server. To seed the Web with data, a second server was created to provide a gateway to an existing telephone directory on a CERN mainframe.
Not until 1994 did Navisoft Inc. produce a browser/editor reminiscent of the original World Wide Web program, able to browse and edit in the same session. Other early points were matters of convention: a page on etiquette proposed, for example, the e-mail address "webmaster" as a point of contact for enquiries about a site, and the rule that the URL consisting of the server name alone should be a sensible default entry point, whatever the internal organization of the server.
HTML, which was meant to be merely the warp and woof of a hypertext tapestry rich in varied data types, became ubiquitous. The shared standards of URIs, HTTP and HTML enabled the Web to grow, and allowed businesses and academics around the world to take part in its deployment and extension.
As a result, a large number of new data formats and protocols have been created. HTTP's ability to carry any data format makes extension to new formats simple, so that deploying, for example, the three-dimensional scene description language VRML, or Java(tm) bytecode files for the transfer of portable program code, was straightforward.
Less simple was for the server to find out which client program was actually in use, since the format negotiation system is not widely deployed. As a result, new, lesser-known clients have masqueraded as well-known ones in order to be sent sufficiently rich information.
Along with this goes an imprecision in the MIME types used to describe data: text/html is used to refer to many different levels of HTML; image/png is used to refer to any PNG-format graphic, even when it would be useful to know how many colours it encodes; and Java(tm) files are delivered without any explicit indication of the run-time support needed to execute them.
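A MIME type can in principle carry such finer distinctions as parameters. A minimal sketch of splitting a Content-Type value into its type and parameters (the `level` parameter shown is hypothetical, not a registered one):

```python
def parse_content_type(value):
    """Split a Content-Type value into its media type and parameters."""
    parts = [p.strip() for p in value.split(";")]
    params = {}
    for p in parts[1:]:
        k, _, v = p.partition("=")
        params[k.strip()] = v.strip().strip('"')
    return parts[0], params

mtype, params = parse_content_type('text/html; level=3; charset="utf-8"')
print(mtype, params)  # -> text/html {'level': '3', 'charset': 'utf-8'}
```

The imprecision described above is not a limit of the syntax, then, but of the conventions for what the parameters must convey.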
In 1994 this situation led to the founding of the World Wide Web Consortium. The Web architecture is interesting in that the same principles can be applied to further developments: although it is not yet complete, it can absorb new protocols and new data formats, and information can be delivered from a CD-ROM or from a remote machine alike, together with its type.
Since the world runs on the exchange of information and of money, and the Web enables the exchange of information, the exchange of money is a logical next step. There is no space in this paper to discuss such payment systems, or the various ways of deploying Web security.
Search engines have proved extremely useful, in that large indexes can be searched very rapidly and obscure documents found. They have also proved remarkably useless, in that their searches generally take into account only the words in a document, with little or no concept of document quality, and so produce a great deal of junk.
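The word-only indexing described above can be sketched as an inverted index; the documents and terms are invented, and the example shows why a purely lexical query cannot tell quality pages from junk:

```python
from collections import defaultdict

docs = {
    "a": "the web is a space of linked information",
    "b": "junk page web web web free offers",
    "c": "protocols for the exchange of information",
}

# Build an inverted index: word -> set of documents containing it.
index = defaultdict(set)
for name, text in docs.items():
    for word in text.split():
        index[word].add(name)

# A lexical query ranks the junk page alongside the genuine one.
print(sorted(index["web"]))  # -> ['a', 'b']
```

Any notion of document quality would have to come from information outside the words themselves, which motivates the machine-readable semantics discussed below.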
Below, we discuss how the addition to the Web of documents with well-defined semantics should make much more powerful tools possible. Thirdly, apart from being a space browsable by people, the Web should carry a wealth of information in forms that machines can understand, so that machines can take a stronger role in analysing the Web and solving our problems for us.
There are cases in which the readership of a document is so large that the load on its server becomes quite unacceptable. For the Web to be a fair reflection of reality, it must be possible for the emphasis on particular kinds of documents to shift rapidly and dramatically. The problems to be solved include: classifying users and documents so that they can be handled in groups; predicting the level of use of groups of documents by groups of users; deciding on the optimal placement of copies for fast retrieval; and finding algorithms for locating the cheapest or nearest copy of a document available under a single address. These problems must be solved in a setting in which many different kinds of copies exist under one address.
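A toy sketch of the last of these problems, choosing the cheapest replica of a document published under one address; the mirror names and cost figures are invented:

```python
# Hypothetical replicas of one document, each with a retrieval
# cost (say, estimated latency in milliseconds).
replicas = {
    "http://eu.mirror.example/doc": 40,
    "http://us.mirror.example/doc": 120,
    "http://asia.mirror.example/doc": 200,
}

def cheapest_copy(replicas):
    """Return the replica URL with the lowest retrieval cost."""
    return min(replicas, key=replicas.get)

print(cheapest_copy(replicas))  # -> http://eu.mirror.example/doc
```

The hard part in practice is not the selection itself but estimating the costs and keeping the replica list current as emphasis shifts.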
This will probably continue and, although HTML is now regarded as part of the established architecture rather than as an exciting new toy, there will always be new formats, and a more powerful and perhaps more consistent set of formats may eventually displace HTML. In the longer term, further changes to the Web will be needed before its full potential for interpersonal communication can unfold: when people work together on a hypertext of their shared understanding, they always have it at hand to refer to, so that misunderstandings can be pointed out quickly.
All the activity around a Web-based project makes machine analysis of an organization very tempting, perhaps enabling conclusions about management and reorganization that would be hard for any single person to reach. The intent was to use the Web as a personal information system, and as a group tool at every scale, from the team of two to the population of the world deciding on ecological questions.
One of the strengths of the original model, as noted above, was the ability to move and link information between these layers, to indicate clearly the relationships between them, and to preserve consistency where the boundaries blur. What is needed includes: better editors to let people interact directly with Web information; notification of interested parties when information has changed; hypertext links that capture and display the semantics of human activities such as argument, peer review, and workflow management; third-party annotation servers; verifiable authentication allowing group membership to be used for access control; and the treatment of links as first-class objects with version control, authorship and ownership.
In fact, use of the Web will not be taken for granted until global information and personal information are treated consistently. From the point of view of the user interface, this means that the underlying computer system, which typically uses a "desktop" metaphor, must be integrated with hypertext. The two are not so far apart: file systems already have analogues of Web links ("aliases", "shortcuts").
The author also believes that the significance of a file's name within the computer system will diminish, until the ubiquitous dialog box labelled "File name:" disappears. The information that matters is better expressed in a document's title and in the links in which it participates, whatever their form: the insertion of a file into a folder, the inclusion of an e-mail address in the "To:" field of a message, the relationship of a document to its author, and so on.
Once users have provided the essential information, such as the level of privacy and reliability required for a given piece of information and how readily accessible it must be, it should be up to the system to manage the details of disk storage so as to achieve the required performance.
The main change needed first is that Web information which could be useful to such programs must be available in machine-readable form with defined semantics. This is not unprecedented: a model exists in Electronic Data Interchange (EDI) [ref], in which a number of forms, such as requests for quotation, purchase orders, certificates of ownership and invoices, are defined as electronic equivalents of their paper counterparts.
In that scheme, the semantics of each form are defined by a human-readable specification document. Alternatively, general-purpose languages could be used in which new statements can be defined, their primitive terms being grounded from time to time in human-readable text. In that case, the power of a language to relate different areas of knowledge could yield a much more powerful base on which to build reasoning machines.
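A minimal sketch of what machine-readable statements with defined semantics might look like, using subject-predicate-object triples; the vocabulary and the data are invented, not drawn from any standard:

```python
# Invented assertions about documents, in triple form.
statements = [
    ("doc:report-7", "authoredBy", "person:alice"),
    ("doc:report-7", "approvedBy", "person:bob"),
    ("doc:memo-2",   "authoredBy", "person:alice"),
]

def query(statements, predicate, obj):
    """All subjects for which (subject, predicate, obj) is asserted."""
    return [s for s, p, o in statements if p == predicate and o == obj]

print(query(statements, "authoredBy", "person:alice"))
# -> ['doc:report-7', 'doc:memo-2']
```

Because each predicate has a defined meaning, a machine can answer such questions across the whole space of statements, which keyword search cannot.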
Knowledge representation (KR) languages are academically interesting but have not yet had much influence on everyday computing. But then, the same was true of hypertext before the Web gave it global reach. There is a two-way relationship between the development of machine processing of global information and the development of cryptographic technology.
Machine reasoning about a global body of assertions is viable only if machines can verify claims made on the Web: this calls for a global security infrastructure that allows digitally signed documents. Similarly, a global security infrastructure seems to require the ability to weave quite sophisticated statements into the information about cryptographic keys and trust.
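As an illustration only, attaching a verifiable tag to a statement can be sketched with a shared-secret MAC from Python's standard library; a Web-scale infrastructure would need public-key signatures instead, which the standard library does not provide, and the key shown is invented:

```python
import hmac
import hashlib

SECRET = b"demo-shared-secret"  # hypothetical key, for illustration only

def sign(document: bytes) -> str:
    """Produce a tag that only holders of the secret can compute."""
    return hmac.new(SECRET, document, hashlib.sha256).hexdigest()

def verify(document: bytes, tag: str) -> bool:
    """Check a tag in constant time, rejecting altered documents."""
    return hmac.compare_digest(sign(document), tag)

stmt = b'("doc:report-7", "approvedBy", "person:bob")'
tag = sign(stmt)
print(verify(stmt, tag))         # -> True
print(verify(b"tampered", tag))  # -> False
```

The point of the sketch is the property, not the mechanism: a machine can accept an assertion only when its tag checks out, and trust in keys must itself be expressed in statements.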
At the first International World Wide Web Conference, held in Geneva in May 1994, the author concluded that Web engineering is not merely an academic or technical field: many ethical and social questions are decided by the protocols engineers design, and engineers should not regard these questions as someone else's problem.
The PICS initiative showed that the shape of network protocols can influence the shape of the society built in the information space. Designers must master the science of system design so that end-to-end capability is assured, whatever happens in between. For the information space to be a powerful place in which to solve the problems of the next generations, its integrity, including its independence from any particular choice of hardware, software, packet routes, operating systems and application packages, is essential.
There is not room here for a bibliography of an area that involves so much work by so many. The World Wide Web has its own series of conferences, run by an independent committee; see, for example, the Fifth International World Wide Web Conference, Computer Networks and ISDN Systems, Vol. 28, Nos. 7-11, Elsevier, May 1996.