
Wget for Windows?


2 replies to this topic

#1 shadowsai

  • Members
  • 20 posts

Posted 19 January 2011 - 11:25 PM

Recently, I've been trying to download a bunch of files from a remote web server. I downloaded the precompiled wget tool from the page below and copied it into my Windows directory (after confirming that directory is on the search path with the "path" command):
http://users.ugent.be/~bpuype/wget/
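For anyone following along later, a quick way to check that the copy actually worked is to open a command prompt and run:

wget --version

If wget.exe is in a directory on the path, this prints wget's version banner; otherwise Windows reports that the command is not recognized.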

I next used the following command to change to the directory I wanted the files saved in.
cd C:\Documents and Settings\User Name Here\Desktop\Folder Name
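As a side note, cmd.exe's cd happens to tolerate unquoted spaces, but quoting the path is the safer habit, and the /d switch is needed when the target folder sits on a different drive (same placeholder path as above):

cd /d "C:\Documents and Settings\User Name Here\Desktop\Folder Name"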

Then, I entered the following command to begin fetching the files from the web server.
wget -A .pdf,.jpg,.zip,.rar,.jpeg,.gif -r -l 10 -R "*index.html*" -c http://www.website-name-here.com/parent/child

However, it seems like the resume switch (-c) does not actually resume from where I last left off. The website I'm trying to download from has folders at least 3 levels deeper than the URL in my wget command. There are at least 28 folders in the "child" folder, under each of those are at least 60 folders, and each of those contains 4-5 files. I'm aware that the "-c" switch works for single files. Is there a way to resume downloads after I shut down my computer, or do I have to leave it on until everything has finished downloading?
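For reference, a sketch of what a restart could look like with this setup, assuming the same placeholder URL: simply re-run the original command with -c after booting back up. wget re-crawls the directory pages, but files that are already complete on disk are skipped, and partial files are resumed where the server supports ranged requests:

wget -c -r -l 10 -A .pdf,.jpg,.zip,.rar,.jpeg,.gif -R "*index.html*" http://www.website-name-here.com/parent/child

The HTML pages themselves are fetched again on each run, since wget needs them to rediscover the links, so a restart is not free - just far cheaper than starting over.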

Also, is there a switch that prevents wget from downloading files with certain extensions in the first place? Right now, my command downloads all the files and then deletes any file named index.html after downloading it. Is there a way to skip the index.html files entirely and fetch only the file types I need? This would save a lot of time.
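As far as the wget manual goes, this appears to be inherent to recursive mode: HTML pages matching a reject pattern are still downloaded so their links can be parsed, and are only deleted afterwards. What does help is --no-parent (-np), which keeps the crawl from climbing above the starting directory (a variant of the same placeholder command):

wget -r -np -l 10 -A .pdf,.jpg,.zip,.rar,.jpeg,.gif http://www.website-name-here.com/parent/child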
shadowsai


#2 noknojon

  • Banned
  • 10,871 posts

Posted 20 January 2011 - 08:49 AM

Hi shadowsai -
Are you using the latest version (the 2010 update) or one of the older versions -
That version seems to have much more information and help areas in it than the 2000 version -

Thank You -

EDIT - The program was updated on February 18, 2010

Edited by noknojon, 20 January 2011 - 08:51 AM.


#3 shadowsai

  • Topic Starter
  • Members
  • 20 posts

Posted 22 January 2011 - 04:00 PM

Hi noknojon,
I'm using version 1.11.4.

I've looked through a few of the switches, but they all seem to delete the files after downloading them. So I'm guessing this isn't possible with wget alone?
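For readers finding this thread later: that guess matches the wget documentation - in recursive mode the rejected HTML pages must be fetched for link extraction, so there is no switch to skip them entirely. One possible workaround, assuming a list of direct file URLs can be produced some other way (list.txt is a hypothetical file name here), is to skip recursion and feed wget the list:

wget -c -i list.txt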
shadowsai



