348

I know I can download and install the aforementioned library (wget for Windows), but my question is this:

In Windows PowerShell, is there a native alternative to wget?

I need wget simply to retrieve a file from a given URL with HTTP GET. For instance:

wget http://www.google.com/
eyecatchUp
jsalonen
  • See also http://superuser.com/questions/25538/how-to-download-files-from-command-line-in-windows-like-wget-is-doing – jiggunjer May 16 '15 at 19:05

12 Answers

318

Here's a simple PS 3.0 and later one-liner that works and doesn't involve much PS barf:

wget http://blog.stackexchange.com/ -OutFile out.html

Note that:

  • wget is an alias for Invoke-WebRequest
  • Invoke-WebRequest returns an HtmlWebResponseObject, which contains a lot of useful HTML parsing properties such as Links, Images, Forms, InputFields, etc. (see the sketch after this list), but in this case we're just using the raw Content
  • The file contents are stored in memory before writing to disk, making this approach unsuitable for downloading large files
  • On Windows Server Core installations, you'll need to write this as

    wget http://blog.stackexchange.com/ -UseBasicParsing -OutFile out.html
    
  • Prior to Sep 20 2014, I suggested

    (wget http://blog.stackexchange.com/).Content >out.html
    

    as an answer. However, this doesn't work in all cases, as the > operator (which is effectively Out-File) converts the input to Unicode.
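
For illustration, here is a rough sketch of those parsing properties in action (assuming PS 3.0+ on a desktop SKU, where the IE-based parser is available; the URL is just the one from the example above):

# Keep the response object instead of writing it straight to disk
$response = Invoke-WebRequest http://blog.stackexchange.com/

$response.StatusCode          # e.g. 200
$response.Links.Count         # anchors found by the HTML parser
$response.Links | Select-Object -First 5 -ExpandProperty href

# The raw body is still available as a string
$response.Content.Length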

If you are using Windows 7, you will need to install version 4 or newer of the Windows Management Framework.

You may find that setting $ProgressPreference = "SilentlyContinue" before Invoke-WebRequest significantly improves download speed with large files; this variable controls whether the progress UI is rendered.
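
As a small sketch of that tip (same cmdlet as above; the URL and file name are placeholders):

# Hide the progress bar for the rest of the session, then download as usual
$ProgressPreference = 'SilentlyContinue'
Invoke-WebRequest http://blog.stackexchange.com/ -OutFile out.html

# Restore the default if you want the progress UI back
$ProgressPreference = 'Continue'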

Warren Rumak
  • This is now the correct answer, and I ran into wget accidentally testing if I had the actual wget installed. Annoying that it can't get the filename easily (you have to specify it in the output redirection), but this option has a better UI than the real wget (in my opinion) so there's that. – Matthew Scharley Jan 14 '14 at 00:52
  • But [Windows 7 only comes with PowerShell 2.0](http://superuser.com/questions/650814), and the result will be "The term 'Invoke-WebRequest' is not recognized as the name of a cmdlet, ...". – Peter Mortensen Jun 06 '14 at 17:51
  • Powershell 4 is available for Windows 7 -- it's part of the Windows Management Framework. http://www.microsoft.com/en-us/download/details.aspx?id=40855 – Warren Rumak Jun 06 '14 at 18:26
  • Fair warning: This method will put the entire content of the file into memory before writing it out to the file. This is not a good solution for downloading large files. – im_nullable Jul 13 '14 at 06:35
  • @im_nullable, good call -- I've added that to the post. – Warren Rumak Sep 18 '14 at 15:47
  • @dezza, what do you mean by "encoding"? The output is a capture of the content sent in the body of the HTTP GET response, be it binary files, HTML, or whatever else you ask for. It's really hard to see how a .py file can "break" by copying its raw contents from one place to another, unless the web server is the one messing with it first..... – Warren Rumak Sep 20 '14 at 18:39
  • @Warren Try https://bootstrap.pypa.io/get-pip.py with above command and run it with Python 2.7, then try another tool like wget and it will work fine. – dza Sep 20 '14 at 19:35
  • @dezza I've updated the answer with a different approach. Try it again. – Warren Rumak Sep 20 '14 at 20:06
  • @Warren +1 :) better and shorter. – dza Sep 20 '14 at 20:07
  • p.s. To save some typing, you can type -o then hit tab to get tab completion for -OutFile – Warren Rumak Sep 20 '14 at 20:08
  • Thanks. Do you by any chance know how to clean copy/clipboard from PS console without block-selecting ? It's irritating copying from PS because almost every long command wraps on the next line. – dza Sep 20 '14 at 20:11
  • You can redirect the output of powershell to the clipboard by doing something like "Get-PSDrive | clip". Just be aware that it will rewrite the output as Unicode. – Warren Rumak Sep 22 '14 at 16:39
  • Also, the console in the Powershell ISE doesn't share the console's selection logic. I suggest using that instead of the standard PS console. – Warren Rumak Sep 22 '14 at 16:42
  • This doesn't work if you're behind an authenticating firewall. You get an error "Proxy authentication required". You can fix this by running `$wc = New-Object Net.WebClient; $wc.UseDefaultCredentials = $true; $wc.Proxy.Credentials = $wc.Credentials`. You only need to do this once per session, it seems. (I'm not 100% sure why this works, it looks like the proxy is a session-level shared object...) – Paul Moore Dec 15 '14 at 15:33
  • This uses IE, like everything in Powershell it's been done in a quick and dirty way, instead of just integrating wget or curl. But obviously if Microsoft did that it would ruin their licencing. – Chris S Apr 25 '16 at 13:08
  • @ChrisS It doesn't use IE if you provide the -UseBasicParsing parameter. (That's why this parameter is required for Server Core) – Warren Rumak May 12 '16 at 01:12
  • `converts the input to Unicode`: instead of `> out.html` you can use `| Set-Content out.html -Encoding Byte`, while not helpful for `iwr` since it has `-OutFile` now, it's good to know when writing binary data to files in other cmdlets – Hashbrown Sep 13 '19 at 06:37
197

If you just need to retrieve a file, you can use the DownloadFile method of the WebClient object:

$client = New-Object System.Net.WebClient
$client.DownloadFile($url, $path)

Where $url is a string representing the file's URL, and $path is a string representing the local path the file will be saved to.

Note that $path must include the file name; it can't just be a directory.
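
A minimal sketch of that, with the file name derived from the URL and an absolute path built under $env:TEMP (the URL here is only a placeholder):

$url  = 'http://blog.stackexchange.com/index.html'
# $path must point at a file, not a folder, so append the name taken from the URL
$path = Join-Path $env:TEMP ([System.IO.Path]::GetFileName(([Uri]$url).LocalPath))

$client = New-Object System.Net.WebClient
$client.DownloadFile($url, $path)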

Peter Mortensen
Traveling Tech Guy
  • So far this has been the best solution proposed. Also, given that it seems I can rewrite it in one-line format as `(new-object System.Net.WebClient).DownloadFile($url, $path)`, it is the best correspondence for `wget` I have seen so far. Thanks! – jsalonen Nov 28 '11 at 10:49
  • As a side-note you can also do this asynchronously using something like (new-object System.Net.WebClient).DownloadFileAsync(url,filePath) – James Apr 23 '13 at 08:49
  • Can we fetch a particular text via Webclient and outout to a notepad ? thanks – Mowgli Jun 18 '13 at 16:11
  • Yes, this works out of the box on Windows 7 ([that comes with PowerShell 2.0](http://superuser.com/questions/650814)). Sample: `$client.DownloadFile( "http://blog.stackexchange.com/", "c:/temp2/_Download.html")` – Peter Mortensen Jun 06 '14 at 17:57
  • Why does this use 100% of one of my CPUs? – Hut8 Oct 26 '15 at 19:11
  • @jsalonen and since that's .NET, it works on PS 2.0, which I am restricted to at them moment. – Nick Oct 13 '16 at 17:27
  • For just getting a url and ignoring the results (e.g., part of an IIS warmup script) use DownloadData: `(new-object System.Net.WebClient).DownloadData($url) | Out-Null` – BurnsBA May 02 '17 at 18:35
  • Error messages are very unhelpful; if `$path` is a directory or existing file, it throws a generic Exception. Ah, Microsoft. – BaseZen Mar 30 '18 at 14:24
  • Nice! Invoke-WebRequest blows up in my packer job, complaining about "Out of Memory". The WebClient.DownloadFile works a treat. – Ian Ellis Feb 09 '19 at 17:32
  • Running `client = New-Object System.Net.WebClient` gave me the error `client : The term 'client' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.` However, the one-liner worked. – Pro Q Mar 15 '22 at 02:37
  • So much faster than wget! Had issues with wget timeouts for moderately large files. – Bastion Mar 15 '23 at 06:15
101

There is Invoke-WebRequest in the upcoming PowerShell version 3:

Invoke-WebRequest http://www.google.com/ -OutFile c:\google.html
Peter Mortensen
user4514
  • all the elegance of `dd`... – gWaldo Aug 31 '12 at 15:29
  • @gWaldo you are kidding–this is a joy to use (speaking as someone just learning PS) – Oct 16 '12 at 20:41
  • I just mean that the `-Outfile` parameter seems extraneous when you could just use `>` (to overwrite) or `>>` (to append) to a file. – gWaldo Oct 17 '12 at 13:12
  • @gWaldo or even deduce the filename from the URL just like `wget` does :) – Peltier Jul 17 '13 at 10:29
  • `Invoke-WebRequest $url ($url -split "/")[-1]`. Unfortunately fails if the URL ends with a slash. Should be pretty straightforward to improve. – Peltier Jul 22 '13 at 14:24
  • And as of PS 4.0, `wget` and `curl` are aliased to `Invoke-WebRequest` (`iwr`) by default :D – Bob Mar 25 '14 at 16:12
  • @Bob Thx, Feel free to edit the answer and include these aliases! – user4514 Mar 26 '14 at 18:41
  • @gWaldo PowerShell can be quite slick. You can use shortcuts `(iwr http://www.google.com/).Content > google.html` or use arguments `Invoke-WebRequest -Uri "http://www.google.com" -OutFile google.html`. It's important to realize that in PowerShell the pipe is an object pipe, not just a character pipe so the output of `Invoke-WebRequest` isn't a stream of the file but rather an object where you'll need to use `.Content`. Try this: `$foo = Invoke-WebRequest http://www.google.com` then `$foo | Get-Member` then `$foo.StatusCode` or `$foo.Content`. – Tyler Szabo Mar 08 '17 at 00:20
  • On Windows 2016 Core / Standard I had to pass `-usebasicparsing` as otherwise it was complaining about missing internet explorer engine – Adi Roiban Jul 30 '17 at 04:23
20

It's a bit messy but there is this blog post which gives you instructions for downloading files.

Alternatively (and this is one I'd recommend) you can use BITS:

Import-Module BitsTransfer
Start-BitsTransfer -source "http://urlToDownload"

It will show progress and will download the file to the current directory.
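
If you would rather queue the download in the background, a sketch along these lines should work with the same module (the URL and destination are placeholders; the polling loop is the usual pattern for asynchronous BITS jobs):

Import-Module BitsTransfer

# Queue the transfer as a background job instead of blocking the console
$job = Start-BitsTransfer -Source "http://urlToDownload" -Destination "C:\temp\file.bin" -Asynchronous

# Wait for it to finish, then commit the downloaded file to disk
while (($job.JobState -eq 'Transferring') -or ($job.JobState -eq 'Connecting')) {
    Start-Sleep -Seconds 2
}
Complete-BitsTransfer -BitsJob $job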

Matthew Steeples
  • BITS relies on support at the server end, if available this works in the background and you can get progress updates with other cmdlets. – Richard Nov 28 '11 at 10:42
  • I tried to fetch http://www.google.com/, but all I get is `Start-BitsTransfer : Access is denied. (Exception from HRESULT: 0x80070005 (E_ACCESSDENIED))`. I'm puzzled :| – jsalonen Nov 28 '11 at 10:45
  • @jsalonen I think that BITS will only download files rather than pages. As Richard says it relies on some server side support (although I don't think it's Microsoft specific). – Matthew Steeples Nov 28 '11 at 11:09
  • I see and I think I get the point in using BITS, however, its not what I'm looking for in here. – jsalonen Nov 28 '11 at 11:23
8

If your Windows is new enough (version 1809 or later), there is a "real" curl available. curl has the command-line option -O (capital letter O; the lowercase letter does not do the same thing). The option -O, or its long form --remote-name, tells curl to save the file under the same name as the file-name part of the URL.

You need to invoke it as "curl.exe" to distinguish it from the alias "curl" for "Invoke-WebRequest". Incidentally, it also works in cmd.exe without changes.

Using the same example as in another answer here:

curl.exe -O http://demo.mediacore.tv/files/31266.mp4
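
A couple of related standard curl flags, in case they are useful (same placeholder URL, run from a PowerShell prompt):

# -L follows redirects; -o (lowercase) lets you choose the output name yourself
curl.exe -L -o video.mp4 http://demo.mediacore.tv/files/31266.mp4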

(The site won't allow me to add this as a comment, since I apparently need more "reputation" for that - so it gets a new answer)

Dweia
8

PowerShell V4 One-liner:

(iwr http://blog.stackexchange.com/).Content >index.html

or

(iwr http://demo.mediacore.tv/files/31266.mp4).Content >video.mp4

This is basically Warren's (awesome) V3 one-liner (thanks for this!) - with just a tiny change in order to make it work in a V4 PowerShell.

Warren's one-liner - which simply uses wget rather than iwr - should still work for V3 (at least I assume so; I didn't test it, though). But when you try to execute it in a V4 PowerShell (as I did), you'll see PowerShell fail to resolve wget as a valid cmdlet/program.

For those interested, that is - as I picked up from Bob's comment on the accepted answer (thanks, man!) - because as of PowerShell V4, wget and curl are aliased to Invoke-WebRequest, with iwr as the default alias. Thus, wget cannot be resolved (and curl does not work here either).
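
If you want to check what wget and curl resolve to in your own session, something like this does the trick:

# Shows whether the names are defined as aliases and what they point at
Get-Alias wget, curl -ErrorAction SilentlyContinue
Get-Command Invoke-WebRequest   # the underlying cmdlet, if it is available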

eyecatchUp
4

Here is a PowerShell function that resolves short URLs before downloading the file:

function Get-FileFromUri {
    param(
        [parameter(Mandatory=$true, Position=0, ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)]
        [string]
        [Alias('Uri')]
        $Url,
        [parameter(Mandatory=$false, Position=1)]
        [string]
        [Alias('Folder')]
        $FolderPath
    )
    process {
        try {
            # resolve short URLs
            $req = [System.Net.HttpWebRequest]::Create($Url)
            $req.Method = "HEAD"
            $response = $req.GetResponse()
            $fUri = $response.ResponseUri
            $filename = [System.IO.Path]::GetFileName($fUri.LocalPath)
            $response.Close()
            # download file
            $destination = (Get-Item -Path ".\").FullName
            if ($FolderPath) { $destination = $FolderPath }
            if ($destination.EndsWith('\')) {
                $destination += $filename
            } else {
                $destination += '\' + $filename
            }
            $webclient = New-Object System.Net.WebClient
            $webclient.DownloadFile($fUri.AbsoluteUri, $destination)
            Write-Host -ForegroundColor DarkGreen "downloaded '$($fUri.AbsoluteUri)' to '$($destination)'"
        } catch {
            Write-Host -ForegroundColor DarkRed $_.Exception.Message
        }
    }
}

Use it like this to download the file to the current folder:

Get-FileFromUri http://example.com/url/of/example/file  

Or to download the file to a specified folder:

Get-FileFromUri http://example.com/url/of/example/file  C:\example-folder  
Nathan Rice
user25986
2

The following function downloads a URL to a file.

function Get-URLContent ($url, $path) {
  if (!$path) {
      $path = Join-Path $pwd.Path ([URI]$url).Segments[-1]
  }
  $wc = New-Object Net.WebClient
  $wc.UseDefaultCredentials = $true
  $wc.Proxy.Credentials = $wc.Credentials
  $wc.DownloadFile($url, $path)
}

Some comments:

  1. The WebClient credential and proxy lines are only needed if you are behind an authenticating proxy. For simple use, (New-Object Net.WebClient).DownloadFile($url, $path) works fine.
  2. The path must be absolute, as the download is not done in your current directory, so relative paths will result in the download getting lost somewhere (see the usage sketch after this list).
  3. The if (!$path) {...} section handles the simple case where you just want to download the file to the current directory using the name given in the URL.
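
A usage sketch under those notes (the URL and target path are placeholders; note the absolute path, per point 2):

# Explicit absolute destination path, including the file name
Get-URLContent 'http://blog.stackexchange.com/index.html' 'C:\temp\index.html'

# Or omit the path and let the function name the file after the last URL segment
Get-URLContent 'http://blog.stackexchange.com/index.html'
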
Paul Moore
1

Use the Windows 10 Bash shell (Windows Subsystem for Linux), which includes wget, once the Windows feature is set up.

How to install Ubuntu bash shell on Windows:

YouTube: Running Bash on Ubuntu on Windows!

Windows Subsystem for Linux Documentation
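
Assuming the subsystem is already installed, a quick sketch of calling wget through it from a regular PowerShell or cmd prompt (the URL and file name are placeholders):

# Runs GNU wget inside the default Linux distribution
bash -c "wget http://blog.stackexchange.com/ -O out.html"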

TamusJRoyce
  • Consider adding some quoted reference to this answer supporting what you state in case the link ever dies so the answer content is still available that is currently only available via that link per your suggestion. – Vomit IT - Chunky Mess Style Sep 27 '17 at 03:36
0

Invoke-WebRequest with the -OutFile parameter expects a string, so if your filename starts with a number and is not enclosed in quotes, no output file is created.

e.g. Invoke-WebRequest -Uri "http://www.google.com/" -OutFile "2.pdf"

This does not affect filenames starting with a letter.

Zimba
  • This solution is mentioned in other answers (`wget` is an alias of `Invoke-WebRequest`, and one similar to the above) – bertieb Nov 27 '18 at 18:27
  • The point of the answer was to emphasise the note. None of the answers deal with no file being created due to the syntax error. – Zimba Nov 28 '18 at 10:28
  • That should really be a comment on the other answer[s] – bertieb Nov 28 '18 at 13:02
  • This answer is not provided in other answers nor similar to the one above. – Zimba Mar 23 '19 at 15:19
0

PowerShell Invoke-RestMethod may have fewer dependencies than other methods ... in case you have a minimal (or older) Windows Server installed.

See error reported at Running Invoke-WebRequest as System account:

Invoke-WebRequest : The response content cannot be parsed because the Internet Explorer engine is not available, or Internet Explorer's first-launch configuration is not complete. Specify the UseBasicParsing parameter and try again.


This can be an alternative to applying the -UseBasicParsing option that is, in some cases, required with wget or Invoke-WebRequest.

However, the displayed response may be in a different format, based on data parsing:

PowerShell formats the response based on the data type. For an RSS or ATOM feed, PowerShell returns the Item or Entry XML nodes. For JavaScript Object Notation (JSON) or XML, PowerShell converts, or deserializes, the content into [PSCustomObject] objects.
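
A minimal sketch in the spirit of the question (placeholder URL and file name; -OutFile writes the response body to the given file):

# Download straight to a file, without the Internet Explorer parsing engine
Invoke-RestMethod -Uri http://blog.stackexchange.com/ -OutFile out.html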

Brent Bradburn
-1

This should work to get around the "browser not initialized" errors. Note the -UseBasicParsing parameter.

Invoke-WebRequest http://localhost -UseBasicParsing
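
And if the goal is to save the page rather than just fetch it, the same parameter combines with -OutFile (the file name is a placeholder):

Invoke-WebRequest http://localhost -UseBasicParsing -OutFile out.html
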
Joe Healy