General
The General screen provides the following settings:

Suppress Login Dialog
Whenever SiteSucker encounters a page that requires authentication, it displays the Login dialog. Switch this setting on to suppress the Login dialog and skip the download of any pages that require authentication. For more information on authentication, see Password-protected Sites.

Ignore Robot Exclusions
Switch this setting on to have SiteSucker ignore robots.txt exclusions and the Robots META tag.

Warning: Ignoring robot exclusions is not recommended. Robot exclusions are usually put in place for a good reason and should be obeyed.

By default, SiteSucker honors robots.txt exclusions and the Robots META tag. The robots.txt file lets a website administrator define which parts of a site are off-limits to specific robots, such as SiteSucker. For example, administrators can disallow access to cgi, private, and temporary directories because they do not want pages in those areas downloaded. In addition to site-wide robot control using robots.txt, web page authors can also use the Robots META tag to specify that the links on a page should not be followed by robots.

Replace Files
Use this control to specify when SiteSucker should replace existing files. You can choose from the following options:
Note: SiteSucker always replaces existing HTML and CSS files regardless of the Replace Files setting.

Path Constraint
Use this control to limit downloaded files to those at a specific site, within a specific directory, or containing a specific path. This option works in conjunction with the Paths settings. SiteSucker provides the following path constraints:
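To illustrate the robot exclusions mentioned above, here is a minimal sketch of a robots.txt file; the directory names are hypothetical examples, not taken from any particular site:

```
# robots.txt — placed at the root of a website.
# A record with "User-agent: *" applies to every robot,
# including SiteSucker (unless Ignore Robot Exclusions is on).
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/
```

The Robots META tag works similarly at the page level: for example, `<meta name="robots" content="nofollow">` in a page's head asks robots not to follow the links on that page.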