Important

This feature is currently experimental and may not be feature complete, or may contain bugs. Please contact Support if you have any feedback on this feature. Multi-threaded crawling may have less impact in the GUI due to UI processing events, which currently cannot be disabled.

Multi-threading allows WebCopy to download multiple resources simultaneously rather than one at a time. By increasing the number of concurrent requests, you can significantly reduce the time required to complete a crawl, especially when downloading from servers with high latency or when crawling large websites.
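WebCopy itself is a .NET application and its downloader is not shown here; the following Python sketch only illustrates the general idea of why concurrent requests reduce total crawl time. The resource names and the simulated latency are hypothetical.

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Hypothetical resource list; a real crawl would fetch these over HTTP.
RESOURCES = [f"/page-{i}.html" for i in range(8)]

def download(path):
    time.sleep(0.05)  # simulate network latency for one request
    return path

# Sequential: one request at a time (equivalent to a setting of 1).
start = time.monotonic()
sequential = [download(p) for p in RESOURCES]
sequential_time = time.monotonic() - start

# Concurrent: up to 4 requests in flight at once.
start = time.monotonic()
with ThreadPoolExecutor(max_workers=4) as pool:
    concurrent = list(pool.map(download, RESOURCES))
concurrent_time = time.monotonic() - start

print(f"sequential: {sequential_time:.2f}s, concurrent: {concurrent_time:.2f}s")
```

Because each request spends most of its time waiting on the network, overlapping four requests finishes the same work in roughly a quarter of the wall-clock time.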

Important

When using multi-threaded crawling, it is recommended that you use deterministic file names, as resources may be downloaded in a different order on each crawl.

Important

Multi-threaded crawling is automatically disabled if any of the options for crawling via an embedded web browser are enabled.

Setting the maximum concurrent requests

The Maximum Concurrent Requests setting determines how many download requests are attempted at once whilst downloading content. The default value is 1, which processes requests sequentially. Higher values enable parallel downloading and can improve crawl speed.

To set the maximum number of requests

  1. Open the Options dialogue box and select the Advanced category
  2. In the Multi-threading group, enter the maximum number of concurrent requests the crawler will support, between 1 and 12
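The setting accepts values between 1 and 12. A minimal sketch of how such a bounded setting might be validated is shown below; the function and constant names are hypothetical and do not reflect WebCopy's internals.

```python
# Hypothetical mirror of the setting's bounds (1 to 12, default 1).
MIN_REQUESTS, MAX_REQUESTS, DEFAULT_REQUESTS = 1, 12, 1

def clamp_concurrent_requests(value):
    """Keep a requested concurrency level within the supported range."""
    try:
        value = int(value)
    except (TypeError, ValueError):
        return DEFAULT_REQUESTS  # fall back on invalid input
    return max(MIN_REQUESTS, min(MAX_REQUESTS, value))

print(clamp_concurrent_requests(20))  # clamped to 12
print(clamp_concurrent_requests(4))   # within range, unchanged
```

Clamping rather than rejecting out-of-range values keeps the crawler usable even when a configuration file has been edited by hand.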

Choosing an appropriate value

When selecting the number of concurrent requests, consider the following factors:

  • Server capacity - Most web servers can handle multiple simultaneous connections, but excessive concurrent requests may trigger rate limiting or result in connection errors
  • Network bandwidth - Your available bandwidth may limit the practical benefit of very high concurrency levels
  • System resources - Each concurrent request consumes memory and CPU resources on your local system
  • Crawl politeness - Fewer concurrent requests are more respectful to the target server and reduce the risk of being blocked
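The politeness concern above is commonly addressed by capping the number of in-flight requests. This Python sketch shows one standard technique, a semaphore acting as that cap; it is an illustration only, not WebCopy's implementation, and the limit and delay values are hypothetical.

```python
import threading
import time

MAX_IN_FLIGHT = 2   # hypothetical cap on simultaneous requests
REQUEST_TIME = 0.01  # simulated time spent per request

gate = threading.Semaphore(MAX_IN_FLIGHT)
lock = threading.Lock()
in_flight = 0
peak = 0  # highest number of requests observed running at once

def polite_download(path):
    global in_flight, peak
    with gate:  # blocks while MAX_IN_FLIGHT requests are already active
        with lock:
            in_flight += 1
            peak = max(peak, in_flight)
        time.sleep(REQUEST_TIME)  # simulate the request
        with lock:
            in_flight -= 1

threads = [threading.Thread(target=polite_download, args=(f"/r{i}",))
           for i in range(6)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(peak)  # never exceeds MAX_IN_FLIGHT
```

However many downloads are queued, the server only ever sees up to MAX_IN_FLIGHT connections at a time, which is the same effect the Maximum Concurrent Requests setting has on a crawl.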

Important

Setting this value too high may overwhelm the target server, potentially causing timeouts, connection failures, or your IP address being temporarily or permanently blocked. It may also violate the website's terms of service or robots.txt crawl-delay directives. A value between 2 and 10 is recommended for most scenarios.

© 2010-2026 Cyotek Ltd. All Rights Reserved.
Documentation version 1.10 (buildref #191.16188), last modified 2026-02-13. Generated 2026-02-13 18:44 using Cyotek HelpWrite Professional version 6.20.0