Scheduling a Dynamicweb CLI Command on Windows

Get started with our command-line interface.

This section explains how to schedule a Dynamicweb CLI (Command-Line Interface) command on a Windows server using Task Scheduler and includes guidance for uploading files from a local folder into a Dynamicweb solution via the CLI.

Prerequisites

Before scheduling automation, verify:

  • The Dynamicweb CLI is installed:

    npm i @dynamicweb/cli -g
    

    or follow the CLI installation instructions.

  • The CLI can connect to the environment:

    dw login
    

    or use a pre-generated API key.

  • Your CLI command runs successfully from a normal Command Prompt.

This ensures proper authentication and configuration before scheduling the task.
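The prerequisites above can be verified with a quick pre-flight check before you schedule anything. A minimal sketch in PowerShell, assuming the npm-installed `dw` command from the prerequisites:

```powershell
# Preflight.ps1 - verify the Dynamicweb CLI is available before scheduling
$ErrorActionPreference = "Stop"

# Fails with a clear error if dw is not on PATH for this account
$cli = Get-Command dw -ErrorAction Stop
Write-Host "Found CLI at: $($cli.Source)"

# Print the installed version; a non-zero exit code means the CLI is broken
dw --version
if ($LASTEXITCODE -ne 0) {
  throw "dw --version failed (exit code $LASTEXITCODE)"
}
```

Run this once under the same account the scheduled task will use: npm global installs are per-user, so the CLI may be on your PATH but not on the service account's.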

Step 1: Create a Batch File

Wrap your CLI command in a .bat file to control working directory and capture logs.

  1. Create a folder for scripts, for example: C:\Scripts

  2. Create a file: C:\Scripts\dynamicweb-job.bat

  3. Add the following content:

    @echo off
    cd /d C:\Scripts
    
    dw files "C:\Data\Files" "/" --import --recursive >> C:\Scripts\logs\dw-job.log 2>&1
    
    • cd /d sets the working directory.
    • >> … 2>&1 appends output and errors to a log file (create the C:\Scripts\logs folder beforehand).
    • Replace the dw command and paths with the operation you want to schedule; here it is the recursive file import used later in this article, with C:\Data\Files as an example source folder.

Step 2: Windows Task Scheduler

Open Task Scheduler and create a new task:

  • General

    • Name the task (e.g., Dynamicweb Product Sync)
    • Select “Run whether user is logged on or not”
    • Enable “Run with highest privileges”
  • Triggers

    • Set a schedule (e.g., daily at 02:00)

  • Actions

    • Action: "Start a program"
    • Program/script: C:\Scripts\dynamicweb-job.bat
    • Start in: C:\Scripts
  • Conditions/Settings

    • Adjust power and network settings as required.

Provide credentials when prompted.

Note

You can also create the scheduled task from CMD or PowerShell:

schtasks /create /tn "Dynamicweb Sync" /tr "C:\Scripts\dynamicweb-job.bat" /sc daily /st 02:00 /ru "DOMAIN\User" /rp Password

Replace with appropriate credentials and schedule options.
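Alternatively, the built-in ScheduledTasks PowerShell module can create the same task. A sketch, assuming the batch file and schedule from the steps above:

```powershell
# Create the scheduled task with the built-in ScheduledTasks module
$action  = New-ScheduledTaskAction -Execute 'C:\Scripts\dynamicweb-job.bat' `
                                   -WorkingDirectory 'C:\Scripts'
$trigger = New-ScheduledTaskTrigger -Daily -At 2am

Register-ScheduledTask -TaskName 'Dynamicweb Sync' `
                       -Action $action `
                       -Trigger $trigger `
                       -User 'DOMAIN\User' `
                       -Password 'Password' `
                       -RunLevel Highest
```

-RunLevel Highest corresponds to "Run with highest privileges" in the GUI; replace the credentials and trigger as in the schtasks example.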

Best practices and troubleshooting

When you create scripts using the Dynamicweb CLI, strive to:

  • Use full paths for all CLI commands when running in scheduled contexts
  • Ensure the service account has access to local files and network resources
  • Enable logging for easier debugging when automating
  • Confirm that large file uploads are supported — some environments have restrictions on body size; for very large files consider using the Dynamicweb CLI Uploader tool

You may also encounter some of these common issues:

  • Task runs but files don’t upload — check remote path formatting and quotes in PowerShell
  • Authentication errors — use valid API key or re-login via CLI
  • Missing permissions — the account must have rights to upload to the target folder

Check logs created by your batch file and Task Scheduler history for details.
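Task Scheduler history can also be read from PowerShell, which is handy on servers where opening the GUI is inconvenient. A sketch, assuming task history is enabled ("Enable All Tasks History" in Task Scheduler) and the task name from Step 2:

```powershell
# Show recent Task Scheduler events mentioning a specific task
Get-WinEvent -LogName 'Microsoft-Windows-TaskScheduler/Operational' -MaxEvents 200 |
  Where-Object { $_.Message -like '*Dynamicweb Product Sync*' } |
  Select-Object TimeCreated, Id, LevelDisplayName, Message |
  Format-Table -AutoSize
```

Error-level events here typically point to launch failures (bad path, missing permissions), while your batch file's log covers failures inside the CLI command itself.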

Automating folder uploads with PowerShell

Dynamicweb CLI supports importing files and folders into a solution via the dw files command. This is useful for deployments where you want to push a whole folder (and optionally subfolders) into the Dynamicweb Files archive as part of an automated workflow.

Note

If you’re moving very large amounts of files (or run into request-size / hosting limits), Dynamicweb also provides a CLI Uploader tool designed for bulk transfers using the CLI under the hood.

Example: Upload a folder

In this example, we upload a folder recursively, with logging and exit-code handling:

# Upload-Files.ps1
param(
  [Parameter(Mandatory=$true)][string]$LocalDir,
  [Parameter(Mandatory=$true)][string]$RemoteDir,
  [string]$LogFile = ".\dw-files-upload.log"
)

# Fail fast on errors
$ErrorActionPreference = "Stop"

# Basic sanity checks
if (-not (Test-Path $LocalDir)) {
  throw "Local folder not found: $LocalDir"
}

"[$(Get-Date -Format o)] Upload starting: $LocalDir -> $RemoteDir" | Tee-Object -FilePath $LogFile -Append

# Run CLI (recursive import)
# Note: When invoking with & and an argument array, PowerShell quotes
# arguments containing spaces automatically - don't embed extra quote characters.
$dwArgs = @(
  "files", $LocalDir, $RemoteDir,
  "--import",
  "--recursive"
)

# If your environment supports overwrite and you want it:
# $dwArgs += "--overwrite"

& dw @dwArgs 2>&1 | Tee-Object -FilePath $LogFile -Append
$exit = $LASTEXITCODE

"[$(Get-Date -Format o)] Upload finished with exit code: $exit" | Tee-Object -FilePath $LogFile -Append

if ($exit -ne 0) {
  throw "Dynamicweb CLI upload failed (exit code $exit). See $LogFile"
}

Usage:

.\Upload-Files.ps1 -LocalDir ".\Files" -RemoteDir "/"

This pattern is CI-friendly: it writes logs, returns a non-zero exit on failure, and keeps inputs explicit.

Example: Retry on transient failures

$maxAttempts = 3
for ($i = 1; $i -le $maxAttempts; $i++) {
  try {
    .\Upload-Files.ps1 -LocalDir ".\Files" -RemoteDir "/" -LogFile ".\dw-upload.log"
    break
  } catch {
    if ($i -eq $maxAttempts) { throw }
    Start-Sleep -Seconds (10 * $i)
  }
}

Integrating folder uploads into CI/CD

Dynamicweb’s own CI/CD guidance frames the pipeline goal as building and deploying artifacts such as custom code and the Files folder. Uploading the Files folder via CLI fits naturally into that model.

Example: Azure DevOps

This example installs Node.js, installs the CLI, then imports a folder to the solution.

# azure-pipelines.yml (snippet)
trigger:
- main

pool:
  vmImage: 'windows-latest'

variables:
  LOCAL_FILES: '$(Build.SourcesDirectory)\Files'
  REMOTE_FILES: '/'

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '20.x'
  displayName: 'Install Node.js'

- powershell: |
    npm install -g @dynamicweb/cli
    dw --version
  displayName: 'Install Dynamicweb CLI'

# Auth approach depends on your setup.
# Common pattern: store host + apiKey as secret variables in the pipeline and use them in your dw command.
- powershell: |
    $ErrorActionPreference = "Stop"
    dw files "$(LOCAL_FILES)" "$(REMOTE_FILES)" --import --recursive
  displayName: 'Upload Files folder'

If your environment requires host/apiKey flags or login, keep them in secret pipeline variables and pass them in the command (or use whatever auth flow your org standardizes around). The Management API and API key concept are documented in the Dynamicweb docs.

Example: GitHub Actions

name: Upload Files to Dynamicweb
on:
  push:
    branches: [ "main" ]

jobs:
  upload:
    runs-on: windows-latest
    steps:
      - uses: actions/checkout@v4

      - uses: actions/setup-node@v4
        with:
          node-version: '20'

      - name: Install Dynamicweb CLI
        run: npm install -g @dynamicweb/cli

      - name: Upload Files folder
        shell: pwsh
        run: |
          $ErrorActionPreference = "Stop"
          dw files ".\Files" "/" --import --recursive

Practical pipeline tips

  • Log everything (pipe CLI output into the CI logs and/or a file artifact).
  • Fail fast (bubble up non-zero exit codes so deployments don’t “succeed” while silently skipping uploads).
  • Treat uploads as artifacts: build/copy into a staging folder first (e.g., out/Files) so what you deploy is exactly what you built.
  • Keep big uploads sane: if you hit size/perf limits, consider chunking by subfolder or using the CLI Uploader for bulk transfers.
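The "chunking by subfolder" idea from the last tip can be sketched by reusing the Upload-Files.ps1 script from earlier (the .\Files source folder is an example):

```powershell
# Upload each top-level subfolder separately to keep individual requests small
$source = ".\Files"

Get-ChildItem -Path $source -Directory | ForEach-Object {
  $remote = "/$($_.Name)"
  Write-Host "Uploading chunk: $($_.FullName) -> $remote"
  .\Upload-Files.ps1 -LocalDir $_.FullName -RemoteDir $remote -LogFile ".\dw-upload.log"
}
```

This assumes each local subfolder maps to a remote folder of the same name under the archive root; adjust -RemoteDir to match your solution's Files structure. Because Upload-Files.ps1 throws on failure, a bad chunk stops the loop rather than being skipped silently.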