Sense/Net caches content to optimize system performance. In high availability setups every web node manages its own cache; distributed messages ensure that the cache always contains the latest version of every content item.
Caching content after loading
Content data is always inserted into the cache when it is loaded from the database. The next request that uses the given content will read the data from the cache and will not send a request to the database. This results in higher performance for every subsequent request after the first load. Whenever a content is changed or deleted, it is invalidated in the cache, so the next request will load the most up-to-date version from the database and store it in the cache again.
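The load-and-invalidate behavior described above is the classic cache-aside pattern. The following minimal Python sketch illustrates it; the class and method names are hypothetical and do not correspond to the actual Sense/Net API:

```python
# Cache-aside sketch (illustrative only, not the Sense/Net implementation).
# Loading checks the cache first and falls back to the database;
# changes invalidate the cached entry.

class ContentRepository:
    def __init__(self, database):
        self.database = database   # maps content path -> content data
        self.cache = {}

    def load(self, content_path):
        # Serve from cache when possible; otherwise hit the database
        # and insert the result into the cache.
        if content_path in self.cache:
            return self.cache[content_path]
        data = self.database[content_path]
        self.cache[content_path] = data
        return data

    def save(self, content_path, data):
        # Persist the change and invalidate the stale cache entry,
        # so the next load fetches the fresh version.
        self.database[content_path] = data
        self.cache.pop(content_path, None)

repo = ContentRepository({"/Root/Doc1": "v1"})
repo.load("/Root/Doc1")          # database hit, result is now cached
repo.save("/Root/Doc1", "v2")    # invalidates the cached entry
repo.load("/Root/Doc1")          # fresh load from the database
```

Note that in this basic form saving only removes the stale entry; the next section describes the optional step of re-inserting content into the cache immediately after saving.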
Caching content after saving
Content data can also be cached after saving. This means that whenever a content is changed it is immediately inserted into the cache, so subsequent loads never need to send a request to the database to retrieve its data. This may result in better performance if the content being saved is likely to be loaded shortly after saving, which is quite a common scenario; however, it also results in higher memory usage. This feature is switched on by default and can be configured in the web.config:
<!-- Cache content after save. Options: None (no content is cached after save), Containers (only instances of Folder and its descendant types are cached), All (every content is cached). Default is All. -->
<add key="CacheContentAfterSaveMode" value="All" />
- None: no content is cached after saving,
- Containers: only containers (Content Types derived from Folder) are cached after saving,
- All: every type of content is cached after saving.
Note that the configuration of content caching after saving does not affect content caching after loading in any way: if a content is loaded and is not yet in the cache, it will be inserted into the cache regardless of the above setting.
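A hypothetical sketch of how such a mode switch could work (this is not the actual Sense/Net implementation; the "Containers" case is approximated here with a simple flag):

```python
# Sketch of a CacheContentAfterSaveMode-style switch (illustrative only).
# Saving always invalidates the old entry; whether the new version is
# re-inserted into the cache depends on the configured mode.

CACHE_AFTER_SAVE_MODE = "All"   # one of: "None", "Containers", "All"

cache = {}
database = {}

def save(content_path, data, is_container=False):
    database[content_path] = data
    cache.pop(content_path, None)   # invalidation happens in every mode
    if CACHE_AFTER_SAVE_MODE == "All" or (
            CACHE_AFTER_SAVE_MODE == "Containers" and is_container):
        cache[content_path] = data

save("/Root/Folder1", "folder-data", is_container=True)
save("/Root/Doc1", "doc-data")
# With mode "All", both entries sit in the cache immediately after save.
```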
Cache memory usage
For sites under heavy load the cached content can consume a significant amount of memory. By default ASP.NET allows the cache to use up to 90% of the available virtual memory. However, this might cause problems when other modules (like Lucene indexing) try to obtain memory and fail to do so, eventually resulting in an out-of-memory exception. To prevent such cases you can use the cache element in the web.config to limit the amount of memory used by the cache. The following example sets the memory limit of the cache to 50%, eliminating out-of-memory problems during processes that require more memory (like mass import):
<system.web>
  <caching>
    <cache disableMemoryCollection="false"
           disableExpiration="false"
           privateBytesLimit="0"
           percentagePhysicalMemoryUsedLimit="50"
           privateBytesPollTime="00:02:00" />
  </caching>
</system.web>
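The effect of a percentage-based limit is that the cache starts trimming entries once it approaches the configured share of physical memory. A rough, hypothetical Python sketch of that idea (ASP.NET's actual trimming logic is internal and considerably more sophisticated):

```python
# Toy model of a percentage-based cache memory limit (illustrative only).
# When an insert would push memory usage over the limit, the oldest
# entries are evicted until the new item fits.

class BoundedCache:
    def __init__(self, limit_bytes):
        self.limit_bytes = limit_bytes
        self.entries = {}   # key -> (value, size); dicts keep insertion order
        self.used = 0

    def insert(self, key, value, size):
        # Evict oldest entries until the new item fits under the limit.
        while self.entries and self.used + size > self.limit_bytes:
            oldest = next(iter(self.entries))
            _, old_size = self.entries.pop(oldest)
            self.used -= old_size
        self.entries[key] = (value, size)
        self.used += size

# 50% of an assumed 1000-byte "physical memory", mirroring
# percentagePhysicalMemoryUsedLimit="50" above.
bounded = BoundedCache(1000 * 50 // 100)
bounded.insert("a", "A", 200)
bounded.insert("b", "B", 200)
bounded.insert("c", "C", 200)   # would exceed 500 bytes: evicts "a"
```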
Content cache in high availability setups
When multiple web nodes are installed, every web node manages its own cache. If a request sent to one of the web nodes results in a content modification and therefore cache invalidation, this event is distributed via MSMQ to all other web nodes, which subsequently invalidate their own caches. This mechanism ensures that the cache instances on all web nodes always contain up-to-date content data, and stale data is invalidated regardless of which node triggered the invalidation.
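The broadcast-invalidation flow above can be modeled with a toy in-process cluster. This is purely illustrative: the real system distributes the invalidation messages over MSMQ, while this sketch simply iterates over a shared node list:

```python
# Toy model of distributed cache invalidation across web nodes
# (illustrative only; Sense/Net uses MSMQ as the real message channel).

class WebNode:
    def __init__(self, name, cluster):
        self.name = name
        self.cache = {}
        self.cluster = cluster
        cluster.append(self)

    def load(self, content_path, database):
        if content_path not in self.cache:
            self.cache[content_path] = database[content_path]
        return self.cache[content_path]

    def save(self, content_path, data, database):
        database[content_path] = data
        # Invalidate locally and "broadcast" the invalidation so every
        # other node drops its stale copy as well.
        for node in self.cluster:
            node.cache.pop(content_path, None)

cluster, db = [], {"/Root/Doc1": "v1"}
node_a = WebNode("A", cluster)
node_b = WebNode("B", cluster)
node_a.load("/Root/Doc1", db)
node_b.load("/Root/Doc1", db)        # both nodes now cache v1
node_a.save("/Root/Doc1", "v2", db)  # both caches are invalidated
node_b.load("/Root/Doc1", db)        # B reloads the fresh version
```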