wget Command

What is the Linux wget Command?
wget is a free GNU command-line utility for downloading files from the web. The name combines “World Wide Web” and “get”. It retrieves files over the HTTP, HTTPS, and FTP protocols and is designed to work robustly over slow or unstable network connections. wget is non-interactive, meaning it can keep working in the background while you’re not logged in. This makes it ideal for downloading large files, mirroring websites, and automating download tasks through scripts. From the man page:
GNU Wget is a free utility for non-interactive download of files from the Web. It supports HTTP, HTTPS, and FTP protocols, as well as retrieval through HTTP proxies.
Key Features
- Protocol support: HTTP, HTTPS, and FTP downloads
- Resume capability: Continue interrupted downloads
- Recursive downloading: Download entire websites
- Rate limiting: Control download speed
- Authentication: Support for various authentication methods
- Proxy support: Work through HTTP and HTTPS proxies
- Background operation: Run downloads in the background
- Robust networking: Handle unstable connections gracefully
wget Syntax
wget [options] [URL]
Basic usage patterns:
- wget URL - Download a single file
- wget -O filename URL - Save with a custom name
- wget -c URL - Resume an interrupted download
- wget -r URL - Recursive download
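In the first pattern, the saved file takes its name from the URL; the shell sketch below models that default roughly (wget also strips query strings and handles collisions, which this ignores; the URL is a hypothetical placeholder):

```shell
#!/bin/sh
# When no -O is given, wget names the saved file after the last
# path component of the URL. Roughly equivalent shell logic:
URL="https://example.com/downloads/file.zip"   # hypothetical URL
DEFAULT_NAME=$(basename "$URL")
echo "$DEFAULT_NAME"    # file.zip
```

Passing -O, as in the second pattern, overrides this default entirely.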
Installation
Most Linux Distributions
wget comes pre-installed on most Linux distributions. If not available:
Ubuntu/Debian
sudo apt update && sudo apt install wget
CentOS/RHEL/Fedora
sudo yum install wget    # CentOS/RHEL
sudo dnf install wget    # Fedora
Arch Linux
sudo pacman -S wget
macOS
Using Homebrew
brew install wget
Using MacPorts
sudo port install wget
Windows
Download from GNU Wget official website or use Windows Subsystem for Linux (WSL).
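Whichever route you take, you can confirm the binary is on PATH with command -v before scripting around it. A minimal sketch (demonstrated with sh, which is always present, so it runs anywhere; substitute wget after installing):

```shell
#!/bin/sh
# Check that a command exists on PATH before using it in a script.
# "sh" is used here so the sketch runs everywhere; swap in "wget".
if command -v sh >/dev/null 2>&1; then
  echo "found"      # prints: found
else
  echo "missing"
fi
```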
Basic wget Examples
Simple File Download
Download a single file:
wget https://example.com/file.zip
Download with Custom Filename
Save file with a different name:
wget -O myfile.zip https://example.com/file.zip
Download to Specific Directory
Download file to a specific location:
wget -P /downloads/ https://example.com/file.zip
Download Multiple Files
Download multiple files by listing URLs:
wget https://example.com/file1.zip https://example.com/file2.zip
Download from a URL list in a file:
wget -i urls.txt
Essential wget Options
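The options in this section combine freely on one command line. A hedged dry-run sketch that assembles a resumable, rate-limited, retrying invocation and prints it rather than executing it (the URL is a hypothetical placeholder; --tries caps retry attempts):

```shell
#!/bin/sh
# Assemble a resumable, rate-limited download command (dry run).
URL="https://example.com/largefile.iso"   # hypothetical URL
OPTS="-c --limit-rate=200k --tries=5"
CMD="wget $OPTS $URL"
echo "$CMD"   # prints the command instead of running it
```

Drop the echo (or paste the printed line) to actually perform the download.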
Resume Interrupted Downloads (-c)
Continue a partially downloaded file:
wget -c https://example.com/largefile.iso
Background Downloads (-b)
Run download in background:
wget -b https://example.com/largefile.zip
Check background download progress:
tail -f wget-log
Limit Download Rate (--limit-rate)
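When picking a rate limit, a back-of-the-envelope time estimate helps. A rough sketch, with a hypothetical 700 MB file capped at 100 KB/s (matching --limit-rate=100k):

```shell
#!/bin/sh
# Rough download-time estimate: size in KB divided by rate in KB/s.
SIZE_MB=700    # hypothetical file size
RATE_KB=100    # --limit-rate=100k is about 100 KB/s
SECS=$(( SIZE_MB * 1024 / RATE_KB ))
echo "$SECS seconds"    # 7168 seconds, just under two hours
```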
Limit download speed to prevent network saturation:
wget --limit-rate=100k https://example.com/largefile.zip
Recursive Download (-r)
Download an entire website recursively:
wget -r https://example.com/
Accept/Reject File Types (--accept, --reject)
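--accept and --reject take comma-separated lists of file suffixes (they also support wildcard patterns). A simplified shell sketch of that suffix matching, using hypothetical filenames:

```shell
#!/bin/sh
# Simplified model of --accept=pdf,zip style suffix filtering.
matches() {
  case "$1" in
    *.pdf|*.zip) echo "accept" ;;
    *)           echo "reject" ;;
  esac
}
matches report.pdf    # prints: accept
matches photo.jpg     # prints: reject
```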
Download only specific file types:
wget -r --accept=pdf https://example.com/documents/
Reject specific file types:
wget -r --reject=jpg,png https://example.com/gallery/
User-Agent (--user-agent)
Set a custom User-Agent string:
wget --user-agent="MyCustomBrowser/1.0" https://example.com/
Ignore Certificate Warnings (--no-check-certificate)
Bypass SSL certificate validation (use with caution):
wget --no-check-certificate https://self-signed.com/
wget Command Manual / Help
We can use the man and info commands to view the manual page for the wget command. The wget command also has a --help option that lists its options.
To open the man page for wget, we can use the command below. To exit a man or info page, press q.
man wget
To open the info page for wget, we can use the command below.
info wget
To open the help page from the wget command, we can run the command below.
wget --help
wget Command Source Code
You can find the wget command source code in the following repositories:
Related Linux Commands
You can read tutorials for related Linux commands below:
Summary
In this comprehensive tutorial, we’ve covered the essential aspects of using wget for downloading files from the web. wget is a powerful and reliable tool for automated downloads, website mirroring, and file retrieval tasks.
Key takeaways:
- wget excels at downloading files over the HTTP, HTTPS, and FTP protocols
- Resume capability makes it perfect for large files and unstable connections
- Recursive downloading enables complete website mirroring
- Extensive options provide fine control over download behavior
- Built-in retry mechanisms handle network issues gracefully
- Configuration files and scripting enable automation
- Essential tool for system administrators and power users
Visit our Linux Commands guide to learn more about using the command-line interface in Linux.