Writing a file-download bash script is a fundamental skill for automating backups, system updates, and data collection. While a simple one-liner works for a quick task, a robust script handles errors, retries, and directory management so your automation doesn't break.

1. Choosing Your Tool: curl vs. wget

Most Linux distributions ship with either curl or wget.

- curl: Best for transferring data to and from a server. It is highly versatile and supports nearly every protocol.
- wget: Best for simple, non-interactive downloads; it can resume interrupted transfers and retrieve files recursively.
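Neither tool is guaranteed to be installed on every system, so a portable script can detect which one is available before downloading. A minimal sketch (the DOWNLOADER variable name is just for illustration):

```shell
#!/bin/bash
# Pick whichever downloader is installed; command -v succeeds
# (exit status 0) when the named program is on the PATH.
if command -v curl >/dev/null 2>&1; then
    DOWNLOADER="curl"
elif command -v wget >/dev/null 2>&1; then
    DOWNLOADER="wget"
else
    echo "Error: neither curl nor wget is installed." >&2
    exit 1
fi

echo "Using $DOWNLOADER for downloads"
```

`command -v` is the POSIX-specified way to test for a program, so this check works in plain sh as well as bash.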
2. A Basic Download Script

The following script is a clean starting point. It uses set -e to stop the script if a command fails and mkdir -p to ensure the destination directory exists.
```bash
#!/bin/bash
# Description: Script to download a file from a URL

# Exit immediately if a command fails
set -e

URL="https://example.com"
DEST_DIR="./downloads"
FILENAME="mydata.zip"

# Create directory if it doesn't exist
mkdir -p "$DEST_DIR"

echo "Starting download: $URL"

# Using curl: -L follows redirects, -o specifies the output path
curl -L "$URL" -o "$DEST_DIR/$FILENAME"

# Or using wget: -O specifies the output document
# wget "$URL" -O "$DEST_DIR/$FILENAME"

echo "Download complete! Saved to $DEST_DIR/$FILENAME"
```

3. Handling Errors and Retries
Network flakiness is a common issue, so a robust script should confirm that each download succeeded and retry when it did not.

- Checking the exit status: Use $? to see if the previous command succeeded. 0 means success; anything else is an error.
- Automatic retries with curl: curl's built-in --retry flag re-attempts the download after transient failures, and --retry-delay sets the pause between attempts.
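Both ideas can be combined in a small sketch. The function name is illustrative, and the call itself is commented out because it needs a live URL; -f makes curl return a non-zero exit code on HTTP errors so the $? check actually fires:

```shell
#!/bin/bash
# download_with_retry URL DEST
#   -f              : fail (non-zero exit) on HTTP 4xx/5xx responses
#   -L              : follow redirects
#   --retry 3       : retry up to 3 times on transient network errors
#   --retry-delay 5 : wait 5 seconds between attempts
download_with_retry() {
    curl -fL --retry 3 --retry-delay 5 "$1" -o "$2"
}

# Usage: call the function, then inspect $? explicitly
# download_with_retry "https://example.com/mydata.zip" "mydata.zip"
# if [[ $? -ne 0 ]]; then
#     echo "Download failed after retries." >&2
#     exit 1
# fi
```

Note that --retry only covers transient problems (timeouts, connection resets); a hard 404 fails immediately, which is usually what you want.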
4. Downloading Multiple Files from a List

To fetch several files, list their URLs in a text file (one per line) and loop over it:

```bash
#!/bin/bash
URL_FILE="urls.txt"

if [[ ! -f "$URL_FILE" ]]; then
    echo "Error: $URL_FILE not found."
    exit 1
fi

# IFS= and -r make read preserve each line exactly as written
while IFS= read -r url; do
    echo "Downloading $url..."
    wget -q --show-progress "$url"
done < "$URL_FILE"
```

5. Advanced Scenarios: Authentication
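Some servers require credentials before they will serve a file. A hedged sketch of HTTP Basic authentication (the function and environment-variable names are illustrative; -u is curl's standard auth flag, --user/--password the wget equivalents):

```shell
#!/bin/bash
# fetch_with_auth URL DEST -- download with HTTP Basic authentication.
# Credentials come from environment variables so they are never
# hard-coded into the script itself.
fetch_with_auth() {
    curl -fL -u "${DL_USER:?DL_USER not set}:${DL_PASS:?DL_PASS not set}" \
        "$1" -o "$2"
}

# wget equivalent:
# wget --user="$DL_USER" --password="$DL_PASS" "$url"

# Usage (export DL_USER and DL_PASS first):
# fetch_with_auth "https://example.com/protected/file.zip" "file.zip"
```

Keeping secrets in environment variables (or a credentials file readable only by you) avoids leaking them through shell history or version control.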