How to Download Files With curl And wget
Razvan Ludosanu
Founder, learnbackend.dev
Published: 1/31/2024
Both curl and wget are command-line tools used for downloading files from the internet. More specifically, curl is a flexible, feature-rich, multi-purpose tool for transferring data, with support for user authentication, proxies, cookies, file uploads, and more. wget, on the other hand, is a simpler, more straightforward tool primarily designed for downloading files and mirroring websites efficiently.
Similarities and Differences
Similarities
- Both tools support the HTTP and FTP protocols, enabling users to download files, web pages, and other resources from web servers.
- Both tools support basic authentication, allowing users to provide usernames and passwords for protected resources.
- Both can be configured to use proxy servers for making requests, useful for scenarios where network traffic needs to be routed through a proxy.
- Both offer mechanisms to resume interrupted downloads.
- Both are available on various operating systems, making them versatile tools that work across different platforms.
Differences
- curl has a more extensive and flexible set of options, making it highly customizable for a variety of tasks, whereas wget is much simpler and more straightforward to use.
- wget saves downloaded contents to local files, whereas curl outputs the content to the terminal by default.
- wget is particularly well-suited for mirroring websites and recursively downloading content, whereas curl doesn't have built-in support for this kind of task and requires more manual configuration.
- curl supports a wide range of protocols, such as HTTP, HTTPS, FTP, FTPS, SCP, SFTP, and TFTP, whereas wget only supports HTTP, HTTPS, and FTP.
- wget automatically follows redirects, making it convenient for tasks that involve multiple URLs, whereas curl doesn't by default and requires the explicit use of the -L flag to do so (see the example after this list).
- curl is often preferred for scripting and automation due to its versatility, whereas wget is more user-friendly for basic tasks.
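For example, this command (using a placeholder URL) combines the -L and -O flags so that curl follows any redirects and saves the resulting file in the current directory:
$ curl -L -O http://example.com/latest/report.pdf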
Downloading a single file with curl
By default, the curl command writes the content of the requested file to the standard output of the terminal.
To actually save the file to disk in the current directory, you can use the -O flag (short for --remote-name) as follows:
$ curl -O <url>
For example, this command will download and save the 20231003.txt file in the current directory, under its original name:
$ curl -O http://example.com/logs/20231003.txt
Downloading and renaming files
To download a file in the current directory under a different name, you can use the -o flag (short for --output ) as follows:
$ curl -o <filename> <url>
Where:
- filename is the name of the downloaded file on your local machine.
For example, this command will download and save the 20231003.txt file in the current directory, under the log.txt name:
$ curl -o log.txt http://example.com/logs/20231003.txt
Downloading in a specific directory
To download a file in a specific directory instead of the current working directory, you can use the --output-dir flag as follows:
$ curl --output-dir <path> -O <url>
Where:
- path is the relative or absolute path of the directory you want to download the file into.
For example, this command will download and save the 20231003.txt file in the /tmp/logs directory:
$ curl --output-dir /tmp/logs -O http://example.com/logs/20231003.txt
Note that the --output-dir flag can also be combined with the -o flag.
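For example, reusing the URL from above, this command will download the 20231003.txt file into the /tmp/logs directory and save it under the log.txt name:
$ curl --output-dir /tmp/logs -o log.txt http://example.com/logs/20231003.txt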
Downloading a single file with wget
By default, the wget command automatically downloads the requested resource into a file on your local machine.
To download a file into the current directory, you can use the wget command as follows:
$ wget <url>
For example, this command will download and save the 20231003.txt file in the current directory, under its original name:
$ wget http://example.com/logs/20231003.txt
Downloading and renaming files
To download a file in the current directory under a different name, you can use the wget command with the -O flag (short for --output-document ) as follows:
$ wget -O <filename> <url>
Where:
- filename is the name of the downloaded file on your local machine.
For example, this command will download and save the 20231003.txt file in the current directory, under the log.txt name:
$ wget -O log.txt http://example.com/logs/20231003.txt
Downloading in a specific directory
To download a file in a specific directory instead of the current directory, you can use the -P flag (short for --directory-prefix ) as follows:
$ wget -P <path> <url>
Where:
- path is the relative or absolute path of the directory you want to download the file into.
For example, this command will download and save the 20231003.txt file in the /tmp/logs directory:
$ wget -P /tmp/logs http://example.com/logs/20231003.txt
Downloading multiple files with curl
To download multiple files at once with curl, you can repeat the -o or -O flag for each URL, optionally combined with a single --output-dir flag:
For example, this command will download both the 20231003.txt and 20231004.txt files into the current directory:
$ curl \
-O http://example.com/logs/20231003.txt \
-O http://example.com/logs/20231004.txt
And this command will download the same files in the current directory, while respectively renaming them to logs_03.txt and logs_04.txt :
$ curl \
-o logs_03.txt http://example.com/logs/20231003.txt \
-o logs_04.txt http://example.com/logs/20231004.txt
Downloading files using a configuration file
Alternatively, you can create a configuration file containing the command flags and URLs of the files to download:
# save each URL under its original (remote) file name
-O
url = "http://example.com/logs/20231003.txt"

-O
url = "http://example.com/logs/20231004.txt"
And use the curl command with the -K flag (short for --config ) as follows:
$ curl -K <file>
Where:
- file is the relative or absolute path to the configuration file.
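For example, assuming the options above were saved in a file named urls.txt (a hypothetical name), this command will download both log files at once:
$ curl -K urls.txt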
Downloading multiple files with wget
To download multiple files at once with wget , you can simply list the URLs as arguments of the command as follows:
$ wget <url> …
For example, this command will download both the 20231003.txt and 20231004.txt files at once into the current directory:
$ wget \
http://example.com/logs/20231003.txt \
http://example.com/logs/20231004.txt
Downloading files recursively
The wget command is often used to download an entire web page or website for mirroring, offline viewing, or archiving, including additional pages, images, stylesheets, scripts, and other associated files. It follows the links present in the HTML, XHTML, and CSS files and automatically recreates the directory structure of the target URL.
To download files recursively, you can use the wget command with the -r flag (short for --recursive) as follows:
$ wget -r <url>
Note that, by default, the wget command has a maximum recursion depth of 5 levels, which can be changed using the -l (short for --level ) flag as follows:
$ wget -r -l <depth> <url>
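For example, assuming a hypothetical http://example.com/docs/ page, this command will recursively download it and the pages it links to, up to 2 levels deep:
$ wget -r -l 2 http://example.com/docs/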
Also note that wget respects the Robots Exclusion Standard (i.e., robots.txt) and will not download the paths disallowed in that file.
Downloading files via FTP
To download files via the FTP protocol, you will typically need to provide a username and password in order to authenticate to the remote server. This authentication ensures that only authorized individuals or systems can access and transfer files, thus protecting data and server resources.
Downloading files via FTP with curl
To download files with curl via the FTP protocol, you can use the -u flag (short for --user ) as follows:
$ curl -u <username>:<password> -O <url>
Where:
- username is the FTP username.
- password is the FTP password.
For example, this command will download the logs.txt file using the username admin and the password admin:
$ curl -u admin:admin -O ftp://ftp.example.com/resources/logs.txt
Downloading files via FTP with wget
To download files with wget via the FTP protocol, you can use the --ftp-user and --ftp-password flags as follows:
$ wget --ftp-user=<username> --ftp-password=<password> <url>
Where:
- username is the FTP username.
- password is the FTP password.
For example, this command will download the logs.txt file using the username admin and the password admin:
$ wget --ftp-user=admin --ftp-password=admin ftp://ftp.example.com/resources/logs.txt
Resuming the download of incomplete files
Both curl and wget offer the possibility to resume the download of incomplete files whose downloads have been interrupted.
Resuming the download with curl
To resume the download of a file with curl, you can use the -C flag (short for --continue-at) as follows:
$ curl -C <offset> -O <url>
Where:
- offset is either the exact number of bytes to skip, counting from the beginning of the source file, or the - value to let curl automatically determine the offset.
For example, this command will resume the download of the video.mp4 file automatically:
$ curl -C - -O http://example.com/video.mp4
And this command will resume the download starting from the file’s byte offset 400:
$ curl -C 400 -O http://example.com/video.mp4
Note that if the file size is smaller than the offset , curl will automatically start the download from the beginning.
Resuming the download with wget
To resume the download of a file with wget , you can use the -c flag (short for --continue ) as follows:
$ wget -c <url>
If the file is already partially present in the current directory of the local machine, wget will ask the server to continue the retrieval from an offset equal to the length of the local file.
For example, this command will resume the download of the video.mp4 file automatically:
$ wget -c http://example.com/video.mp4
Note that, as of version 1.7, if the remote server does not support continued downloading, wget will refuse to start the download from scratch in order to avoid overwriting the existing local content. If you really want to re-download the file from the beginning, remove the local file first.