
I was mirroring a site using HTTrack. It ran for some time and then stopped, with the log containing the following:

<snip>
Too many URLs, giving up..(>100000)
To avoid that: use #L option for more links (example: -#L1000000)
14:48:58    Info:   Top index rebuilt (done)

Does it mean it didn't mirror all pages?

How do I continue the mirroring without spending unnecessary time copying the already-mirrored files?


1 Answer


Use the "continue an interrupted mirror" action after increasing the URL limit; see the example command after the option list below.

For the command line version:

Action options:
  w *mirror web sites (--mirror)
  W  mirror web sites, semi-automatic (asks questions) (--mirror-wizard)
  g  just get files (saved in the current directory) (--get-files)
  i  continue an interrupted mirror using the cache
  Y   mirror ALL links located in the first level pages (mirror links) (--mirrorlinks)
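A minimal sketch of the command-line form, assuming the original mirror was created in ~/websites/example (a hypothetical project path) so that its hts-cache directory is available; files already downloaded stay in place and only the remaining links are fetched:

  cd ~/websites/example
  httrack -i -#L1000000

Here -i is the "continue an interrupted mirror using the cache" action from the list above, and -#L1000000 raises the URL limit as suggested in the log message.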

For the GUI version:

[screenshot of the WinHTTrack action selection, not reproduced here]
