Websites

Create and manage multiple websites or language layers

A website is a default building block in DynamicWeb 10 CMS under which pages are placed. All solutions are born with a default website called Standard, although some (like those based on DynamicWeb Swift) can feature others out of the box. To access the websites list, open the context menu for the content area tree and click Websites. From the websites list you can:

  • Create websites
  • Create new language layers
  • Open and edit an existing website or language layer using the list item context menu

Depending on your permission level, it is also possible to deactivate or delete a website.

To create a new website:

  1. Click + New website
  2. Provide a name and select a regional setting
  3. Set the default page template in the Layout tab
  4. Review and configure the website settings
  5. Save

To create a language layer:

  1. Click + New website language
  2. Select a website to copy from
  3. Provide a name and select a regional setting
  4. Create

Website Settings

Websites are rich objects with many settings for tweaking or configuring behaviour. The settings are organized into tabs of broadly related options; read more about each tab below.

The Advanced tab contains various, well, advanced settings.

  • The item type settings allow you to select item types which extend the standard website and page properties
  • Use the Workflow setting to select a workflow for the website
  • Using the Security settings you can set up HTTPS on the website (see the sketch after this list):
    • Default allows visitors to use both HTTP & HTTPS
    • Force SSL forces a 301 redirect to HTTPS
    • Un-force SSL forces a 301 redirect to HTTP
  • You can also select a login template; if one is selected, the whole website is placed behind a login
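
As a concrete illustration of the Force SSL option, here is a minimal sketch in plain Python of the behaviour described above - every request received over HTTP is answered with a 301 redirect to the same host and path over HTTPS. This is an illustrative sketch only, not DynamicWeb code; DynamicWeb performs the redirect internally.

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class ForceSslHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Rebuild the requested URL with the https scheme and redirect permanently
            host = self.headers.get("Host", "localhost")
            self.send_response(301)
            self.send_header("Location", f"https://{host}{self.path}")
            self.end_headers()

    if __name__ == "__main__":
        # Listen for plain HTTP on port 8080 and redirect everything to HTTPS
        HTTPServer(("", 8080), ForceSslHandler).serve_forever()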

Finally, the Robots.txt settings provide you with a degree of control over how crawlers index the website - you can:

  • Allow or disallow crawlers access to the Sitemap for your website
  • Include all products in the sitemap.xml (this requires a URL provider to be specified on a page)
  • Enter the content for the robots.txt file for the website
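
For reference, sitemap.xml follows the standard sitemaps.org protocol - a flat list of URL entries. When products are included, each product page gets an entry of this form (the URL below is hypothetical):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://shop.example.com/products/blue-widget</loc>
      </url>
    </urlset>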

The contents of the robots.txt file specified here should be considered virtual, i.e. no actual file will exist on disk. Instead, the content is served by the 404 handler in DynamicWeb (Admin/public/404.aspx), which must be set up in the IIS settings in order to work. If an actual robots.txt file exists on disk, the virtual content specified in the field is not used.
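
The precedence rule can be sketched as follows - a physical robots.txt on disk wins, otherwise the configured virtual content is served. The Python below is an illustrative sketch of that rule only, not DynamicWeb's actual 404 handler; the file path and configured content are placeholders.

    import os
    from http.server import BaseHTTPRequestHandler, HTTPServer

    VIRTUAL_ROBOTS = "User-agent: *\nDisallow: /private/\n"  # placeholder for the field content

    class RobotsHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path != "/robots.txt":
                self.send_error(404)
                return
            # A physical robots.txt on disk takes precedence over the virtual content
            if os.path.isfile("robots.txt"):
                with open("robots.txt", "rb") as f:
                    body = f.read()
            else:
                body = VIRTUAL_ROBOTS.encode()
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("", 8080), RobotsHandler).serve_forever()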

By default, Disallow: /?cartcmd= is added to robots.txt to prevent bots from indexing carts (this line is not visible in the Robots.txt input field).
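
As an example, if you entered the (hypothetical) rule Disallow: /private/ in the Robots.txt field, the file served at /robots.txt would effectively combine your content with the automatically added rule:

    User-agent: *
    Disallow: /private/
    # added automatically by DynamicWeb; not shown in the input field
    Disallow: /?cartcmd=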

The settings will not work on websites located on a *.dynamicweb.dk domain, as these domains are excluded from search engine indexing.
