An exploration of the internet publishing revolution


The web has seen continuing growth in the number of individuals posting opinions, stories, and news on personal sites and blogs. Indeed, David Sifry, CEO of Technorati, reports on his own blog that the so-called ‘blogosphere’ has doubled in size approximately every five months since March 2003. He records that the number of blogs has risen to about eight million. With thirty to forty thousand new blogs being created every day, the expansion looks set to continue.
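A doubling period of five months compounds surprisingly fast. As a rough back-of-the-envelope sketch (the function and its default starting count are illustrative assumptions, not Technorati's figures), steady doubling can be modelled like this:

```python
def blogs_after(months, start=250_000, doubling_period=5):
    """Estimated blog count after `months` months of steady doubling.

    `start` is a purely illustrative baseline; the doubling period of
    five months is the figure reported by Sifry.
    """
    return start * 2 ** (months / doubling_period)

# March 2003 to March 2005 is 24 months: 2^(24/5), roughly a 28-fold
# increase regardless of the starting count.
growth_factor = blogs_after(24, start=1)
```

On this simple model, two years of sustained doubling multiplies the blogosphere almost thirty times over, which is consistent with the scale of growth the Technorati figures describe.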


The increase in blogging has been so great that a 2005 report on the state of journalism in the US, carried out by the Project for Excellence in Journalism (PEJ), concluded that blogs are now the driving force behind online news. And it’s not just individuals who have been jumping on the internet bandwagon; industry and academia are following suit, publishing more research and information on the web than ever before and reaching the widest audience in the shortest time.

This paper will discuss the state of new media before describing solutions to the problems introduced by instant publishing. Two prolific sources of information, news articles and research, are the focus of this paper.

The state of new media

The Southwest Educational Development Laboratory (SEDL) recognises that the internet’s increasing popularity has placed traditional media in a state of flux:

The role and influence of refereed journal publications are changing due to the emergence of the internet and the world-wide web. Research information can be posted on web sites for immediate review, printing, or downloading.

This increase in unedited information appearing online has raised concerns among traditional publishers of news and research material. Many of these concerns surround the validity and reliability of the information being provided. The PEJ report picked up on the irony of the situation faced by news agencies:

The energy is coming from the sources with a dearth of journalism essentials like verification and editing. Meanwhile the economic base supporting the most difficult and expensive journalistic undertakings is eroding.

The popularity of the internet with readers and writers alike has been accompanied by a declining readership of printed newspapers and, to a lesser extent, declining viewership of television news. The PEJ review explained the decline by pointing to changes in the ways people access information.

News-on-demand seems more appealing to the modern web generation, who enjoy the ability to get their news when and where they want. Despite this marked change in reading habits, it is feared that some news agencies have been slow to use the opportunities the web offers, perhaps due to the lack of any proven economic model on which to base the online provision of news. The review found that many sites rely on news wires and recycled content for their online presence, a fact which worried the reviewers and raised the issues of the validity and reliability of online information:

The risk of relying predominantly on wire copy is that it means entrusting the accuracy of the copy to someone else. You have made no attempt to verify independently. The growing tendency this year to run wire without any kind of staff input or editing suggests even greater risk.

The problem is highlighted further in the case of publishing research information online. SEDL pays particular attention to the publishing of non-refereed medical research. Again, the internet’s ease of access and immediacy — factors which may well be considered merits of a medium when it comes to releasing critical data effectively — find themselves in an uneasy co-existence with the traditional publishers. The respected British Medical Journal has implemented a fast-track scheme to allow the refereeing and publishing of critical medical research in as short a time as possible. The US journals The New England Journal of Medicine and The Journal of the American Medical Association have made similar efforts.

The Journal of the American Medical Association can pass articles of sufficient importance through an express review: the article goes through the regular review process with higher priority, reducing the time between stages so that submission to publication takes four weeks. Even with this effort the best-case scenario is still a month between an article appearing on a web site and completing its review — and that option is only available for a select few articles.

As the amount of information published on the web continues to increase, there must inevitably be mounting pressure on those organisations tasked with reviewing it. With more information being published without review, the capacity of review processes must increase to compensate; yet, mirroring the issues facing the news agencies, there is no solid economic model on which that increased capacity may be based.

Information reliability

[Figure: answers to the question ‘How much of the information on specific types of internet sites do you think is reliable and accurate?’ Source: The Digital Future Report, USC Annenberg School Center for the Digital Future, September 2004.]

The concern these organisations have for preserving the integrity of the information being published on the web may well be justified — and not only to prevent the circulation of misinformation. Another worrying indicator of the impact of unsupported publishing on the internet is the public perception of the reliability of the information. Figures cited in the PEJ study indicate a slight downturn in the confidence the web audience has in the information it’s consuming.

Between 2000 and 2003 there was a decrease in people who believed ‘most’ of what they read online was reliable, falling from fifty-two per cent to forty-nine per cent, with a peak of fifty-six per cent in 2001. At the same time there was a slight increase in people who believed only a ‘small proportion’ of what they read was reliable: rising from seven per cent in 2000 to eight per cent by 2003. This was true of sites read on a regular basis, not just the web in general.

So is the expanding web becoming less trustworthy in the eyes of its audience? What can be done to counter it?

Index by reputation

In the introduction to his paper Misinformation through the internet: epistemology and ethics, Dr. Anton Vedder wrote that the internet fundamentally lacks the secondary attributes people generally use when assessing the reliability of the information they’re consuming. He theorised that in the case of traditional media we make assessments of the reliability of information based on where it has been sourced. Research from a reputed scientific journal carries more weight than that taken from a magazine; likewise, a news story from a tabloid newspaper is perhaps treated with more scepticism than one from a broadsheet. Of course, these factors alone aren’t enough to prove the reliability of a piece, but they act as indicators in our day-to-day lives that allow us to form opinions. Vedder believed this factor to be lacking on the internet.


To combat this, Vedder suggested internet users need to develop a more sceptical approach to assessing information on the web than they might use for other sources. It may be that this will happen naturally as use of the web continues to grow. He also suggested there should be some mechanism for allowing the inference of credibility from traditional organisations to carry over to their online persona. This is perhaps the most important point. As the PEJ report suggested, the internet needs more credible content from reputable suppliers. For example, the BBC’s web site carries news produced by staff that is both edited and original, and is hence able to carry a certain amount of credibility over from its other broadcasting methods to its online material. This may be an unfair example, as the BBC is a large publicly-funded body and does not rely on the profitable economic model privately-owned organisations do, but the principle still applies.

Regarding locating trustworthy information on the web: search engines could be built to hold and use some assigned weighting of credibility based on the reputation of the source a piece comes from; in the future this could be used to prioritise information by its likelihood of reliability. This won’t be possible without the drive to produce online content attributed to those sources, however — something which continues to be fraught with economic difficulties.
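Such a weighting could take a very simple form. The sketch below (a hypothetical illustration, not any existing search engine’s algorithm — the domains, credibility scores, and blending parameter are all invented for the example) blends a result’s query relevance with a credibility weight assigned to its source:

```python
# Illustrative credibility weights assigned to sources; the domains
# and values here are assumptions for the sake of the example.
CREDIBILITY = {
    "bbc.co.uk": 0.9,         # established broadcaster with editing staff
    "bmj.com": 0.95,          # refereed journal
    "example-blog.net": 0.3,  # unreviewed personal site
}

def rank(results, alpha=0.5):
    """Order search results by a blend of relevance and source credibility.

    Each result is a (domain, relevance) pair with relevance in [0, 1];
    `alpha` sets how much weight credibility carries versus relevance.
    """
    def score(item):
        domain, relevance = item
        credibility = CREDIBILITY.get(domain, 0.5)  # unknown sources sit mid-scale
        return alpha * credibility + (1 - alpha) * relevance
    return sorted(results, key=score, reverse=True)

results = rank([("example-blog.net", 0.9), ("bbc.co.uk", 0.7)])
# With alpha=0.5 the BBC result outranks the nominally more relevant blog post.
```

With `alpha` at zero the ranking reduces to plain relevance; raising it lets reputation increasingly override relevance, which is the trade-off any such scheme would have to tune.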


The drive behind the internet’s growth needs to shift to preserve the reputation of the information it provides. After all, the content becomes irrelevant if the medium providing it appears disreputable in the eyes of its audience. It may be that greater investment has to go into the production of well-written and verified web content, especially by news agencies and other purveyors of news. A process of assessing credibility and reliability can then be implemented, allowing good sources to be filtered from bad. It may also be that the model for web publishing has to be rethought when it comes to the publication of research data, especially that of critical importance, where speed of delivery has to be balanced with thorough review. As the vastness of the information available to users grows, these kinds of mechanisms may have to be implemented to help separate the trustworthy from the fallacious in a repository where they are currently so hard to distinguish.



Mercurytide is a forward-thinking, dynamic, and innovative Internet applications development company. We are well established, with a proven track record within the industry of attracting blue-chip clients from around the world. We produce regular white papers on a variety of technology-orientated topics. For more details see the Mercurytide web site.