There are several ways to transfer data from the Linux terminal, and curl is one of the most versatile. curl is a command-line tool for transferring data to or from a server. You can use the curl command to download and upload data over any of its supported protocols, including FTP, HTTP, HTTPS, SFTP, and SCP. curl offers a wide range of features, including bandwidth limiting, resumable transfers, user authentication, proxy support, and much more.
Below is the list of protocols currently supported by curl:
- FTP/FTPS
- Gopher
- HTTP
- HTTP/2
- SMTP/SMTPS
- IMAP/IMAPS
- SMB
- POP3/POP3S
- RTMP
- SCP
- SFTP
- RTSP
- LDAP/LDAPS
- Telnet and TFTP
Additional features include:
- User and password authentication:
  - Basic
  - Digest
  - Plain
  - NTLM
  - CRAM-MD5
  - Kerberos
  - Negotiate
- Cookies
- Proxy tunneling
- Resume file transfer operation
- SSL certificates
- HTTP and HTTPS forms upload
wget and curl are often compared because their functionality overlaps to some degree. Both tools can retrieve content from the Internet, but wget offers features like recursive downloads (handy for mirroring or web scraping) and is a bit simpler to use. wget is generally considered the better option if all you want to do is download files from the terminal.
Learning the basics of the curl command will help you upload and download files using advanced HTTP authentication methods. Furthermore, wget only supports FTP and HTTP(S), while curl supports many more protocols.
That’s enough theory about the curl command. Now let’s jump straight to the terminal.
Installing ‘curl’ on Linux
If you do not have curl on your Linux system, install it using the following commands (shown here for Debian/Ubuntu-based distributions). Otherwise, skip the installation steps and move on to the examples.
sudo apt-get update
sudo apt-get install curl -y
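On RHEL-based distributions such as CentOS or Fedora, the equivalent command (assuming the dnf package manager is available on your system) would be:
sudo dnf install curl -y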
Now, verify that curl is available on your system by checking its version:
curl --version
Examples of using ‘curl’
We can do a lot of cool things using curl. Let’s take a look at some of them.
Getting server external IP
There’s an amazing resource on the internet, which allows you to get your internet IP address – https://ifconfig.me (named in the glory of the famous Linux network configuration utility – ifconfig
).
If you sent an HTTP request to that site using curl, it will return you external IP address in the terminal in the form of the simple string:
curl https://ifconfig.me
You can easily assign this result to a Bash variable:
MY_EXTERNAL_IP=$(curl -s https://ifconfig.me)
echo $MY_EXTERNAL_IP
The echo command prints your external IP address.
Here, the -s argument suppresses curl’s download progress output, which would otherwise look like this:
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 12 100 12 0 0 181 0 --:--:-- --:--:-- --:--:-- 181
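Because -s makes curl print nothing but the response body, the variable contains only the IP address, which is convenient for scripting. Here’s a minimal sketch building on the example above:
#!/bin/bash
# Fetch the external IP quietly; -s suppresses the progress meter.
MY_EXTERNAL_IP=$(curl -s https://ifconfig.me)

# Fail early if the request returned nothing (for example, no network connectivity).
if [ -z "$MY_EXTERNAL_IP" ]; then
    echo "Could not determine external IP" >&2
    exit 1
fi

echo "External IP: $MY_EXTERNAL_IP"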
Weather forecast
Do you want to feel yourself like a hacker and display the weather information right in the terminal? You can do it using curl
!
curl http://wttr.in/LOCATION
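For example, to get the forecast for London, replace LOCATION with the city name (multi-word names can be written with + signs, e.g. New+York):
curl http://wttr.in/London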
Saving ‘curl’ output to a text file
You can save the output of the curl command to a file of your choice.
Here’s an example of saving JSON API output (StarWars demo API) to the file:
curl https://swapi.dev/api/planets/1/ -o Tatooine.json
curl is smart enough to handle binary file downloads as well. Here’s an example of downloading Terraform, one of the most popular open-source infrastructure-as-code tools (learn more about Terraform):
curl https://releases.hashicorp.com/terraform/0.15.0/terraform_0.15.0_linux_amd64.zip -o terraform_0.15.0_linux_amd64.zip
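The same download can be written more concisely with the uppercase -O option, which saves the file under its remote name (the last part of the URL), so you don’t have to repeat the filename:
curl -O https://releases.hashicorp.com/terraform/0.15.0/terraform_0.15.0_linux_amd64.zip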
Downloading multiple files
You can use curl to download multiple files at a time. Just repeat the -o argument for every file you need:
curl -o file-1 https://example.com/files/file-1 -o file-2 https://example.com/files/file-2 -o file-3 https://example.com/files/file-3
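When the file names follow a simple numeric pattern like the hypothetical example.com URLs above, curl’s URL globbing can express the same download more compactly; with -O each file is saved under its remote name (the brackets are quoted so the shell does not interpret them):
curl -O "https://example.com/files/file-[1-3]"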
Limiting the download speed
Another useful feature of curl is the ability to limit the download speed. You can do it by using the --limit-rate argument and specifying the maximum transfer rate:
curl --limit-rate 1M -O https://releases.hashicorp.com/terraform/0.15.0/terraform_0.15.0_linux_amd64.zip
The given speed is measured in bytes per second unless a suffix is appended. Appending ‘k’ or ‘K’ counts the number as kilobytes, ‘m’ or ‘M’ makes it megabytes, and ‘g’ or ‘G’ makes it gigabytes.
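For example, limiting the same download to 500 kilobytes per second would look like this:
curl --limit-rate 500k -O https://releases.hashicorp.com/terraform/0.15.0/terraform_0.15.0_linux_amd64.zip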
Downloading URLs list
In this example, we’ll download all the files listed in a text file. To do that, you need to combine the xargs and curl commands:
xargs -n 1 curl -O < urllists.txt
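Here, urllists.txt is simply a plain-text file with one URL per line, for example (hypothetical URLs):
https://example.com/files/file-1
https://example.com/files/file-2
https://example.com/files/file-3
If you want to speed things up, xargs can also run several curl processes in parallel with the -P option:
xargs -n 1 -P 4 curl -O < urllists.txt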
Basic authentication
You can use the -u argument to provide a username and password for HTTP basic authentication:
curl -u username:password -O https://example.com/files/README
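Keep in mind that putting the password directly on the command line leaves it visible in your shell history. If you provide only the username, curl will prompt for the password interactively:
curl -u username -O https://example.com/files/README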
Getting the URL headers
HTTP headers are colon-separated key-value pairs that carry information such as the requested resource’s content type, the user agent, the encoding, and so on. Headers are exchanged between the client and the server with every request and response. To get the header information of any website, use the -I argument:
curl -I https://hands-on.cloud
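If you would rather see the headers together with the response body instead of the headers alone, use the lowercase -i option:
curl -i https://hands-on.cloud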
Using cookies
When making subsequent requests to the same website, you may need to reuse the cookies it set.
To save the cookies received from the web server, use the following command:
curl -s -o /dev/null -c google_cookies.txt 'https://www.google.com'
Here the arguments are:
- -s – silent mode; curl does not print download progress information
- -o /dev/null – do not print the web page output to the terminal
- -c google_cookies.txt – save the cookie information to the file
To use cookies received from the previous request, use the following command:
curl -b google_cookies.txt 'https://www.google.com'
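You can also combine -b and -c in one command so that curl sends the previously saved cookies and updates the cookie file with anything new the server sets:
curl -b google_cookies.txt -c google_cookies.txt 'https://www.google.com'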
Checking HTTP/2 support
Use the -I, --http2, and -s options together to verify whether the specified site supports HTTP/2:
curl -I --http2 -s https://hands-on.cloud/ | grep HTTP
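If the site supports HTTP/2, the filtered output should show an HTTP/2 status line, for example:
HTTP/2 200
Otherwise you will see an HTTP/1.1 status line instead.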
Summary
In this article, we covered the most commonly used examples of the curl command. We hope this article helps you get started with this powerful utility. If you like the article, please help us spread it to the world!
Related articles
- Top 10 SSH Features You MUST Know To Be More Productive
- How To Remove Files And Directories In Linux
- How to install Minecraft client on Ubuntu
- AWS CloudFormation. Managing VPC
I’m a passionate Cloud Infrastructure Architect with more than 15 years of experience in IT.
Any of my posts represent my personal experience and opinion about the topic.