Crawler settings
The Crawler settings control how SmartScanner explores (crawls) the target website. Use these options to limit the crawler’s reach, skip unwanted files, and handle JavaScript-heavy pages.
Where to find it: open Scan Config (gear icon), check the “Advanced Settings” box to reveal more options, and choose the “Crawler” tab.
Controls
- Maximum Depth: When enabled, limits how many link levels the crawler will follow from the start URL. A small number (1–3) keeps scans fast. Setting it to `0` means the crawler will only scan the user-provided URLs.
- Maximum Count: When enabled, prevents the crawler from fetching more than the specified number of pages. Useful to cap scan size.
- File Exclusion: Enter file name patterns (for example `*.css` or `debug*`) to skip downloading or testing specific file types or names. Patterns are tested against the file name of the URL, not the full path.
- Evaluate JavaScripts Using Chromium: Enable this when the site uses JavaScript to build pages or navigation (single-page apps). The crawler then renders pages with a headless Chromium instance so it can see dynamically generated links and content.
- Scope (Smart / Manual):
- Smart (default): the scanner will determine the scope automatically. Additional options include:
- Scan Sub Domains: include subdomains in the crawl.
- Scan above path of target URL: allow crawling to paths above the target URL’s directory.
- Manual: provide a Scope Regex to explicitly define which URLs should be crawled. Only URLs matching the regex will be visited.
- URL Exclusion: Enter one exclusion pattern per line to prevent the crawler from visiting or testing matching URLs. Empty lines are ignored.
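To see how a Manual scope regex behaves, here is a minimal sketch in Python. The regex flavor and anchoring used by SmartScanner are assumptions here; the `scope_regex` value and the candidate URLs are hypothetical examples.

```python
import re

# Hypothetical scope regex: restrict the crawl to the /app/ section of example.com.
scope_regex = re.compile(r"^https://example\.com/app/.*")

candidates = [
    "https://example.com/app/login",
    "https://example.com/blog/post-1",
    "https://sub.example.com/app/",
]

# Only URLs matching the regex would be visited.
in_scope = [u for u in candidates if scope_regex.match(u)]
print(in_scope)  # → ['https://example.com/app/login']
```

Note how the anchored regex keeps both the blog section and the subdomain out of scope, which is the kind of precision Smart scope cannot guarantee.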
Pattern language and tips
- The URL Exclusion textarea supports simple wildcard tokens (see UI tooltip). Common tokens:
  - `*` matches 0 or more characters
  - `+` matches 1 or more characters
  - `?` matches any single character
  - `%` matches 1 or more characters except `/` and `?`
  - `#` matches any digit
- Special tokens like `[md5]`, `[guid]`, `[year]`, `[base64]`, `[seo]`, `[non-english]` are supported for convenience.
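One way to reason about the wildcard tokens is to translate them into a regular expression. The sketch below is purely illustrative (it ignores the bracketed special tokens, and SmartScanner's actual matcher may differ in anchoring, case handling, or how a literal `?` in a URL is treated); the `report-####.pdf` pattern is a hypothetical example.

```python
import re

# Illustrative mapping of wildcard tokens to regex fragments (assumption:
# the scanner's real matcher may behave differently in edge cases).
TOKENS = {
    "*": ".*",      # 0 or more characters
    "+": ".+",      # 1 or more characters
    "?": ".",       # any single character
    "%": "[^/?]+",  # 1 or more characters except / and ?
    "#": r"\d",     # any digit
}

def wildcard_to_regex(pattern: str) -> re.Pattern:
    """Translate a wildcard pattern into an anchored regex."""
    parts = [TOKENS.get(ch, re.escape(ch)) for ch in pattern]
    return re.compile("^" + "".join(parts) + "$")

rx = wildcard_to_regex("report-####.pdf")
print(bool(rx.match("report-2024.pdf")))    # True: four digits
print(bool(rx.match("report-latest.pdf")))  # False: not digits
```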
- If you prefer precise control, pick Manual scope and supply a regular expression that matches only the URLs you want scanned.
Practical examples
- Quick shallow scan: Enable Maximum Depth and set it to `2`.
- Large site cap: Enable Maximum Count and set it to `1000` to avoid long-running scans.
- Skip static assets: add lines like `/assets/*`, `/images/*`, and `*.css` to URL Exclusion.
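Put together, a URL Exclusion textarea covering the static-asset example above might look like this (one pattern per line; empty lines are ignored):

```
/assets/*
/images/*
*.css
```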