Cadzow Knowledgebase

Caching and Search Engine Settings

Cadzow Content Manager outputs HTML pages rendered by ASP.NET to client web browsers. These pages include settings that control how they are cached and treated by search engines.

Web Browser & Web Proxy Caching

All pages are served with the HTTP 1.1 Cache-Control header set to disable client and web proxy caching. The pages will not be saved locally by the user's web browser and will not be stored by any intermediary web proxies. This ensures that all content presented to the user is up to date, and not an older version cached somewhere between the server and the user.
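
By way of illustration, the following sketch shows how an ASP.NET page can set such directives in its code-behind. This is not necessarily the exact code used by Cadzow Content Manager; it simply uses the standard System.Web caching API:

    // Sketch only: standard ASP.NET (System.Web) calls to disable caching.
    protected void Page_Load(object sender, EventArgs e)
    {
        Response.Cache.SetCacheability(HttpCacheability.NoCache);  // emits Cache-Control: no-cache
        Response.Cache.SetNoStore();                               // adds no-store
        Response.Cache.SetExpires(DateTime.UtcNow.AddDays(-1));    // Expires header in the past
    }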

However, these settings apply only to content returned by the Cadzow 2000 Content Manager ASP.NET scripts (URLs ending in .ASPX). Other content, such as images (.GIF, .JPG), static documents (.DOC, .PDF, etc.) and other web content (.CSS, .HTML, etc.), is not subject to these settings. If you wish to stop this other content from being cached, you need to make a manual setting in the Internet Information Services (IIS) management console:

  1. Use Start, Control Panel, Administrative Tools to open Internet Services Manager.

  2. In the left pane, single-click on the server in question.

  3. In the right pane, right-click the virtual web required and choose Properties.

  4. Click the HTTP Headers tab.

  5. Under Custom HTTP Headers, click Add.

  6. Enter a Custom Header Name of Cache-Control and a Custom Header Value of no-store.

  7. Click OK.

  8. Click OK to close the virtual web properties dialog.

This will prevent all content on the virtual web from being cached by the user's browser or by any intermediary web proxies.
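
Once the header is in place, responses for static files will carry it. An illustrative response for an image might look like the following:

HTTP/1.1 200 OK
Content-Type: image/gif
Cache-Control: no-store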

If your dynamic content is hosted by your own web server and the static content is hosted by an ISP, please consult your ISP about making this change.

Search Engines

Search engines populate their indexes by “spidering” or “crawling” websites using a “searchbot”. Searchbots start at your homepage and follow every link to discover all the content on your website. They may also cache (store a copy of) each page in their database. Cadzow Content Manager encodes pages as follows (the conventional markup for these settings is sketched after this list):

  • Private Articles — are set so that search engines do not index them, do not cache them and do not follow any links on them. Normally, private articles are intended to be inaccessible from a public website and therefore invisible to searchbots; however, if you wish to place a page in a public location without having it indexed by a search engine, make it a private article.

  • Public Articles — are set so that search engines index them and follow the links in them, but do not cache the contents. This is so search engines can find your content but do not store their own, possibly outdated copies.

  • Search, Login and Administration Pages — are set so that search engines do not index them, do not cache them and do not follow any links on them. These pages are not candidates to be indexed.
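
For reference, per-page settings of this kind are conventionally expressed with the standard robots meta tag in each page's HTML head. As an illustration (the exact markup emitted by Cadzow Content Manager may differ), a private article might contain:

<meta name="robots" content="noindex, nofollow, noarchive">

whereas a public article might contain:

<meta name="robots" content="index, follow, noarchive">

The noarchive directive asks compliant search engines not to keep a cached copy of the page.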

If you do not want any of your content indexed by a search engine, you can use a robots.txt file to instruct the searchbots to ignore your site. Use Notepad to create a file called robots.txt containing the following text:

User-agent: *
Disallow: /

Place the file in the root directory of the website.
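
Compliant searchbots request this file before crawling anything else; for example, if your site is www.example.com, the file must be retrievable as http://www.example.com/robots.txt. A robots.txt file placed in a subdirectory has no effect.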

Note that in both cases, the settings rely on the user's web browser and search engines honouring the standards that apply. It is not possible to exert any control over a search engine or a web browser which deliberately ignores standard settings for indexing and cache control.


