Author: Benjamin D. Thomas
Whenever an important news element hit a website, many scientists in the same organization would visit that page (how many times have you forwarded a link inside your company?). By caching that page on a local server, proxies could eliminate redundant Internet access to retrieve the same page over and over. So proxies were originally very effective at web caching. When the Web went supernova, proxies became markedly less effective at caching; the Web was now vast, web pages were frequently dynamic (expiring as soon as they'd been transmitted), and the interests of users within a single organization might range across a million web pages before the same site was hit three times. These factors presented a difficult caching problem indeed, and proxies became largely ineffective, except in extremely large organizations or in ISPs. Although support for proxy servers was built into all the standard browsers, by 1996 it was seldom used.
But the new Web also has its seedier element, and proxy servers showed a remarkably serendipitous side effect: they can hide all the real users of a network behind a single machine, they can filter URLs, and they can drop suspicious or illegal content. So although originally created as non-security caches, the primary purpose of the majority of proxy servers has now become firewalling.
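The URL-filtering role mentioned above can be sketched as a simple policy check applied before the proxy forwards a request. This is a minimal illustration, not any real product's filter; the blocklisted hosts and path keywords are hypothetical examples.

```python
from urllib.parse import urlparse

# Hypothetical policy data for the sketch -- a real proxy would load
# these from an administrator-maintained configuration or feed.
BLOCKED_HOSTS = {"ads.example.net", "malware.example.com"}
BLOCKED_PATH_WORDS = ("exploit", "phish")

def allow(url: str) -> bool:
    """Return True if the proxy should forward this request upstream."""
    parsed = urlparse(url)
    # Drop requests to hosts on the blocklist outright.
    if parsed.hostname in BLOCKED_HOSTS:
        return False
    # Drop requests whose path contains a suspicious keyword.
    path = parsed.path.lower()
    return not any(word in path for word in BLOCKED_PATH_WORDS)
```

Because every client's traffic passes through the proxy, a single check like this enforces the policy for the whole internal network.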
Proxy servers regenerate high-level service requests on an external network on behalf of their clients on a private network. This effectively hides the identity and number of clients on the internal network from examination by the external network. Because of their position between a number of internal clients and public servers, proxies can also cache frequently accessed content from the public network to reduce access to the public network through high-cost wide-area links.
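As a rough illustration of the caching behavior described above (a sketch, not any particular proxy's implementation), the core logic is a URL-keyed cache in front of an upstream fetch. The `fetch` callback and the fixed TTL expiry policy are assumptions made for the example.

```python
import time
from typing import Callable, Dict, Tuple

class CachingProxy:
    """Minimal forward-proxy cache sketch: the proxy fetches pages on
    behalf of its clients, so origin servers only ever see the proxy,
    and repeated requests for the same URL are served locally until
    the cached copy expires."""

    def __init__(self, fetch: Callable[[str], str], ttl: float = 60.0):
        self._fetch = fetch   # upstream fetch, e.g. an HTTP GET (assumed)
        self._ttl = ttl       # seconds before a cached page is considered stale
        self._cache: Dict[str, Tuple[float, str]] = {}
        self.upstream_hits = 0  # how often we had to cross the WAN link

    def get(self, url: str) -> str:
        entry = self._cache.get(url)
        now = time.monotonic()
        if entry is not None and now - entry[0] < self._ttl:
            return entry[1]   # fresh local copy: no external access needed
        body = self._fetch(url)  # regenerate the request on the external network
        self.upstream_hits += 1
        self._cache[url] = (now, body)
        return body
```

If three internal users request the same page within the TTL, only the first request crosses the high-cost wide-area link; the other two are answered from the local cache, which is exactly the redundancy the early proxies were built to eliminate.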