cURL to Download a File: Command-Line Tutorial
Downloading files from the command line is a daily task for developers, sysadmins, and data professionals. cURL is a versatile, cross-platform tool supporting HTTP, HTTPS, FTP, SFTP, and more. Whether you need to grab a single document, resume a large ISO download, authenticate behind login walls, or fetch files through a proxy, this guide walks you through the steps—from installation to advanced use—so you can reliably fetch any file.

1. Installation & Cross-Platform Setup
Before you start, ensure cURL is installed and up-to-date on your system.
Linux
Debian/Ubuntu:
bash
sudo apt update
sudo apt install curl
RHEL/CentOS/Fedora:
bash
sudo dnf install curl
macOS
Pre-installed on recent versions.
Or via Homebrew:
bash
brew install curl
Windows
Windows 10 (version 1803) and later ship curl.exe natively; in Windows PowerShell, invoke it as curl.exe so the built-in Invoke-WebRequest alias doesn't intercept the command.
For older versions, download the binary from curl.se.
Verify with:
bash
curl --version
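The same command also reports the protocols and features your build supports, which is worth checking before relying on something like SFTP; a quick way to inspect it:
bash
# Show the version line and the list of supported protocols
curl --version | head -n 1
curl --version | grep -i '^protocols'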
2. Basic File Download Commands
| Command | Description |
| --- | --- |
| curl URL | Fetches to stdout (prints to screen) |
| curl -O URL | Saves with the remote filename |
| curl -o name URL | Saves to a custom filename |
Example:
bash
# Save as data.csv (remote name)
curl -O https://example.com/data.csv
# Save as report.csv (custom name)
curl -o report.csv https://example.com/data.csv
By default, cURL streams output to your terminal. Use -O (uppercase) to save under the remote name, or -o (lowercase) followed by the filename you want.
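Because the default destination is stdout, you can also pipe a download straight into another tool without saving it; for example, to preview the CSV used above:
bash
# Preview the first few lines of a remote CSV without writing a file
curl -s https://example.com/data.csv | head -n 5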
3. Handling Redirects & HTTP Status
Many URLs redirect (short links, CDNs). Without following redirects you may download an HTML stub instead of the file.
- Follow redirects: -L
- Inspect headers only: -I (performs a HEAD request)
- Print HTTP status code: -w "%{http_code}\n"
Examples:
bash
# Follow redirects and save
curl -L -O https://short.link/latest.zip

# View headers only
curl -I https://example.com/file.txt

# Save and print the status code
curl -L -o file.pdf -w "%{http_code}\n" https://example.com/report
A status code of 200 indicates success; 3xx indicates further redirects; 4xx/5xx indicate client/server errors.
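In scripts you often just want curl to fail loudly on an error response; the -f/--fail flag makes curl exit non-zero on 4xx/5xx responses instead of saving the server's error page, for example:
bash
# Exit with a non-zero status (and skip writing output) on HTTP errors such as 404 or 500
curl -fL -o file.pdf https://example.com/report || echo "Download failed"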
4. Authentication & Protected Downloads
Basic HTTP Authentication
Many internal sites or APIs require credentials:
bash
curl -u username:password -O https://intranet.local/secret.docx
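Putting the password directly on the command line leaves it in your shell history and process list. Two safer variants, reusing the same hypothetical intranet URL: give only the username so curl prompts for the password, or keep credentials in a ~/.netrc file and pass -n:
bash
# Prompt interactively for the password
curl -u username -O https://intranet.local/secret.docx

# Read credentials from ~/.netrc (format: machine intranet.local login username password ...)
curl -n -O https://intranet.local/secret.docx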
Token-Based Authentication
APIs often use bearer tokens:
bash
curl -H "Authorization: Bearer $API_TOKEN" -O https://api.example.com/data.json
Cookie-Based Sessions
1. Log in via browser and export cookies to cookies.txt.
2. Reuse cookies:
bash
curl -b cookies.txt -O https://secure.example.com/download.zip
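If the site exposes a plain form login, curl can establish the session itself: -c saves the cookies returned by the login request and -b replays them afterwards. The login path and field names below are assumptions; adjust them to the real form:
bash
# Log in (hypothetical form fields) and store the session cookies
curl -c cookies.txt -d "user=alice&pass=secret" https://secure.example.com/login

# Reuse the stored session for the download
curl -b cookies.txt -O https://secure.example.com/download.zip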
5. Advanced cURL Options: From Basics to Power-User
Whether you’re resuming a 10 GB download, throttling bandwidth on a shared server, or debugging an API call, these flags are your go-to tools.
| Category | Flag | Purpose | When to Use | Example |
| --- | --- | --- | --- | --- |
| Reliability & Resumption | -C - | Resume an interrupted download | After a dropout or cancellation | curl -C - -O https://example.com/large.iso |
| Control & Silence | --limit-rate X | Throttle speed to X (e.g., 2M, 500K) | Metered connections or shared bandwidth | curl --limit-rate 1M -O https://example.com/video.mp4 |
| Control & Silence | -s | Silent mode (no progress bar) | Embedding in scripts/logs | curl -s -O https://example.com/image.jpg |
| Control & Silence | -# | Simple text-based progress bar | Minimal but visible feedback | curl -# -O https://example.com/bigfile.zip |
| Debugging & Verbosity | -v | Verbose request/response details | Diagnosing headers, SSL, or auth issues | curl -v https://example.com |
| Proxy & Network | -x proxy:port | Route through a proxy | Bypass geo-restrictions or firewalls | curl -x http://proxy.okeyproxy.com:8000 -O https://site.com/file.zip |
| Timeouts & Retries | --max-time N | Abort if total time exceeds N seconds | Prevent hangs in CI or slow servers | curl --max-time 30 -O https://slow.example.com/file.txt |
| Timeouts & Retries | --retry N | Retry up to N times on transient failures | Unstable networks or flaky servers | curl --retry 3 -O https://unstable.example.com/data.zip |
| Configuration | -K config.txt | Load options from a config file | Centralize auth, proxy, headers, or defaults | curl -K ~/.curlrc -O https://example.com/file.txt |
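The file passed to -K is plain text with one option per line (long option names may be written without the leading dashes), which keeps proxies, retries, and credentials out of your shell history; a minimal sketch with illustrative values:
bash
# Create a reusable config file (name and values are illustrative)
cat > download.cfg <<'EOF'
# curl reads one option per line; long option names work without dashes
retry = 3
max-time = 30
location
proxy = "http://proxy.okeyproxy.com:8000"
EOF

# Apply the config to a download
curl -K download.cfg -O https://example.com/file.txt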
Combined-Use Example
Download a large file through OkeyProxy, resume on failure, limit to 2 MB/s, follow redirects, and retry 5 times if it fails:
bash
curl \
-x http://user:pass@proxy.okeyproxy.com:8000 \
-L \
-C - \
--limit-rate 2M \
--retry 5 \
-O https://example.com/huge-dataset.tar.gz
6. Downloading Multiple Files
Separate URLs
bash
curl -O https://site.com/a.txt -O https://site.com/b.txt
URL Globbing (Braces & Ranges)
bash
curl -O "https://site.com/files/{one.pdf,two.pdf,three.pdf}"
curl -O "https://site.com/images/img[1-3].jpg"
Quote the URLs so your shell doesn't expand the braces or brackets itself; curl's own globbing then applies -O to every generated URL.
Note for Windows PowerShell: Use separate -O flags instead of the globbing syntax.
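When curl performs the globbing itself, -o can reference the matched part as #1, which lets you rename files as they are downloaded; a small sketch reusing the image range above:
bash
# Save img1.jpg..img3.jpg as photo_1.jpg..photo_3.jpg using the glob variable #1
curl -o "photo_#1.jpg" "https://site.com/images/img[1-3].jpg"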
7. Using OkeyProxy for Restricted or Geo-Blocked Files
When servers restrict IP ranges or enforce geo-limits, route requests through OkeyProxy:
1. Get Credentials: Sign up with OkeyProxy and note your user, pass, and host (proxy.okeyproxy.com:8000).
2. Configure Proxy in cURL:
bash
curl -x http://user:pass@proxy.okeyproxy.com:8000 -L -O https://geo.example.com/video.mp4
3. Rotate Regions: If supported, prefix hostnames with region codes (e.g., US Proxy: us.proxy.okeyproxy.com) to fetch region-specific content.
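If many commands share the same proxy, you can skip repeating -x by exporting the standard proxy environment variables, which curl honours; the credentials and host below follow the placeholder values above:
bash
# All subsequent curl requests in this shell go through the proxy
export http_proxy="http://user:pass@proxy.okeyproxy.com:8000"
export https_proxy="http://user:pass@proxy.okeyproxy.com:8000"

curl -L -O https://geo.example.com/video.mp4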
8. Troubleshooting & Common Pitfalls
- Corrupt Downloads: Often due to a missing -L on a redirecting URL.
- SSL Errors: Avoid -k (which disables verification); use --cacert /path/to/ca.pem to trust a specific CA.
- Auth Failures: Double-check credentials; debug the exchange with -v.
- HTML Instead of File: Inspect headers with -I and confirm the URL points at the file itself (see the one-liner below).
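A quick diagnostic one-liner combines -I, -L, and -w to show what a URL really returns, without downloading the body:
bash
# Show status code, content type, and final URL after following redirects
curl -sIL -o /dev/null -w "%{http_code} %{content_type} %{url_effective}\n" https://example.com/file.txt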
9. Automation & Scripting
Sample Bash Script
bash
#!/usr/bin/env bash
set -euo pipefail

URL="https://example.com/daily-report.csv"
DEST="$HOME/reports/$(date +%F)-report.csv"

# Create the destination directory if needed, then fetch with retries and a timeout
mkdir -p "$(dirname "$DEST")"
curl -L --retry 3 --max-time 30 -o "$DEST" "$URL"
echo "✅ Downloaded report to $DEST"
- Make executable: chmod +x download_report.sh
- Cron (Linux/macOS): 0 2 * * * /path/to/download_report.sh
- Windows Task Scheduler: Wrap in a .bat or PowerShell script.
10. Best Practices & Security
1. Follow Redirects: Always include -L.
2. Rate Limit: Protect target servers and your bandwidth.
3. Validate Integrity: Use checksums (e.g., sha256sum); see the example after this list.
4. Secure Credentials: Store in environment variables or a vault.
5. Respect Policies: Check robots.txt and site Terms of Service before mass downloading.
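For the integrity check in point 3, compare the download against a checksum published by the provider; the .sha256 URL below is an assumption, as the exact location varies by project:
bash
# Fetch the archive and its published checksum, then verify (assumes a sha256sum-compatible file)
curl -L -O https://example.com/huge-dataset.tar.gz
curl -L -O https://example.com/huge-dataset.tar.gz.sha256
sha256sum -c huge-dataset.tar.gz.sha256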
Conclusion
cURL turns file downloads into a dependable building block for automation, resilient data fetching, and seamless integration with scripts and pipelines. From simple -O saves to advanced proxy routing with OkeyProxy, you now have the commands and best practices to handle any download scenario securely, efficiently, and at scale. Sign up and get a free trial today!