Advanced Shell Scripting Techniques: Automating Complex Tasks with Bash

Use Built-in Commands

Built-in commands execute faster because they don't require loading an external process (a builtin-versus-external sketch appears at the end of this section).

Minimize Subshells

Subshells can be expensive in terms of performance.

```bash
# Inefficient: spawns cat in a subshell
output=$(cat file.txt)

# Efficient: bash reads the file itself
output=$(<file.txt)
```

Use Arrays for Bulk Data

When handling a large amount of data, arrays can be more efficient and easier to manage than multiple variables.

```bash
# Inefficient
item1="apple"
item2="banana"
item3="cherry"

# Efficient
items=("apple" "banana" "cherry")
for item in "${items[@]}"; do
  echo "$item"
done
```

Enable Noclobber

To prevent accidental overwriting of files:

```bash
set -o noclobber
```

Use Functions

Functions allow you to encapsulate and reuse code, making scripts cleaner and reducing redundancy (illustrated in a sketch at the end of this section).

Efficient File Operations

When performing file operations, use techniques that minimize resource usage and read the data exactly as stored.

```bash
# Splits on whitespace and trims leading/trailing spaces from each line
while read -r line; do
  echo "$line"
done < file.txt

# Clearing IFS preserves leading and trailing whitespace on each line
while IFS= read -r line; do
  echo "$line"
done < file.txt
```

Parallel Processing

Tools like xargs and GNU parallel can be incredibly useful (see the xargs sketch at the end of this section).

Error Handling

Robust error handling is critical for creating reliable and maintainable scripts.

```bash
# Exit on error: set -e makes the script exit immediately if any command
# fails, preventing cascading errors.
set -e

# Custom error messages: provide more context when something goes wrong.
command1 || { echo "command1 failed"; exit 1; }

# Trap signals: use trap to catch and handle errors gracefully.
cleanup() {
  # Cleanup code goes here
  :
}
trap 'echo "Error occurred"; cleanup; exit 1' ERR

# Validate inputs: always validate user inputs and script arguments
# to prevent unexpected behavior.
if [[ -z "$1" ]]; then
  echo "Usage: $0 <argument>"
  exit 1
fi

# Logging: keep track of script execution to diagnose issues.
logfile="script.log"
exec > >(tee -i "$logfile")
exec 2>&1
echo "Script started"
```

Automating Complex System Administration Tasks

Typical candidates for automation include:

- Automated Backups
- System Monitoring
- User Management
- Automated Updates
- Network Configuration
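To make the built-in-versus-external point concrete, here is a minimal sketch comparing an external call with bash's builtin arithmetic; the counter variable is illustrative, not from the original article.

```bash
#!/bin/bash
count=0

# Spawns an external expr process on every call
count=$(expr "$count" + 1)

# Builtin arithmetic does the same work with no extra process
count=$((count + 1))

echo "$count"
```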
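For the Use Functions tip, a minimal sketch of wrapping repeated logic in a function; the log_msg name and message format are illustrative assumptions.

```bash
#!/bin/bash
# log_msg is a hypothetical helper that centralizes log formatting.
log_msg() {
  local level="$1"; shift
  printf '%s [%s] %s\n' "$(date '+%F %T')" "$level" "$*"
}

log_msg INFO "Backup started"
log_msg ERROR "Backup failed"
```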
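For the Parallel Processing tip, a small sketch using the -P option of xargs to run several jobs at once; the file pattern and job count are illustrative.

```bash
# Compress every .log file in the current directory, four jobs at a time.
# Assumes an xargs that supports -P (GNU and BSD both do).
find . -maxdepth 1 -name '*.log' -print0 | xargs -0 -P 4 -n 1 gzip
```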
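As one possible shape for the Automated Backups item, a minimal sketch of an archive script; the paths, filename pattern, and retention policy are all hypothetical.

```bash
#!/bin/bash
# Hypothetical paths: /var/www is the data to back up, /backups the destination.
set -e

src="/var/www"
dest="/backups/www-$(date +%F).tar.gz"

tar -czf "$dest" "$src"

# Keep only the seven most recent archives (illustrative retention policy;
# xargs -r is a GNU extension that skips the command when input is empty).
ls -1t /backups/www-*.tar.gz | tail -n +8 | xargs -r rm -f
```

A script like this is typically scheduled with cron so backups run without manual intervention.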

You Can Use Makefiles for Web Development Too: Managing Your Environment Workflows Properly

How I stopped worrying and loved Makefiles

Note: Makefile recipes must be indented with a Tab character, otherwise you will run into syntax errors.

The main building block of a Makefile: the Target

```makefile
up:
	cp .env.example .env
	docker compose up -d workspace

stop:
	docker compose stop

zsh:
	docker compose exec workspace zsh
```

This example defines three targets: up, stop, and zsh. By default, Make treats the first target (as long as its name does not start with a dot) as the goal, the project's main workflow, which you can run with a bare make. In this example, running make and running make up give the same result.

That said, the file-copying example above is not how Make is typically used. Make's real strength is automatically deciding whether each target's recipe needs to run at all. For instance, we often keep sensitive data in .env; if .env already exists, we should not overwrite it by copying .env.example over it again. In that case, we can turn .env into a target of its own:

```makefile
up: .env
	docker compose up -d workspace

.env:
	cp .env.example .env
```

By default, a target name is treated as a file name. Make is called make precisely because it wants to "make" the specified target, and a target's recipe only runs when the required condition is met (for example, the file does not exist).

In this example, when we run the up target and .env does not exist, Make first runs the .env target to copy the file into place, and only then starts the workspace container. If .env already exists when we run the up target, the .env target is skipped and the workspace container starts directly.

By the same logic, if a file named "up" exists in the directory, the up target will never run. To handle this, we declare phony targets, telling Make which target names are not files but simply names for workflows:

```makefile
.PHONY: up stop zsh
```

Adding variables

Make also supports variables. Following the familiar Unix environment-variable convention, we usually write them in SCREAMING_SNAKE_CASE (all caps with underscores) and wrap the name in $() when referencing it, as in the sketch below.
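A minimal sketch of how variables might look in the Makefile above; the COMPOSE and SERVICE names are illustrative, not from the original article.

```makefile
# Illustrative variable names; adjust them to your own project.
COMPOSE = docker compose
SERVICE = workspace

.PHONY: up stop zsh

up: .env
	$(COMPOSE) up -d $(SERVICE)

stop:
	$(COMPOSE) stop

zsh:
	$(COMPOSE) exec $(SERVICE) zsh

.env:
	cp .env.example .env
```

Changing COMPOSE in one place now updates every recipe that uses it.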

Parse Command Line Arguments in Bash

getopts

The general syntax is:

```bash
getopts optstring opt [arg ...]
```

```bash
#!/bin/bash

while getopts 'abc:h' opt; do
  case "$opt" in
    a)
      echo "Processing option 'a'"
      ;;
    b)
      echo "Processing option 'b'"
      ;;
    c)
      arg="$OPTARG"
      echo "Processing option 'c' with '${OPTARG}' argument"
      ;;
    ?|h)
      echo "Usage: $(basename "$0") [-a] [-b] [-c arg]"
      exit 1
      ;;
  esac
done
shift "$((OPTIND - 1))"
```

optstring represents the supported options. An option expects an argument if there is a colon (:) after it. For instance, if option c expects an argument, it is represented as c: in the optstring.

When an option has an associated argument, getopts stores the argument as a string in the OPTARG shell variable. For instance, the argument passed to option c is stored in the OPTARG variable.

opt contains the parsed option.

```bash
#!/bin/bash

while getopts ':abc:h' opt; do
  case "$opt" in
    a)
      echo "Processing option 'a'"
      ;;
    b)
      echo "Processing option 'b'"
      ;;
    c)
      arg="$OPTARG"
      echo "Processing option 'c' with '${OPTARG}' argument"
      ;;
    h)
      echo "Usage: $(basename "$0") [-a] [-b] [-c arg]"
      exit 0
      ;;
    :)
      echo -e "Option requires an argument.\nUsage: $(basename "$0") [-a] [-b] [-c arg]"
      exit 1
      ;;
    ?)
      echo -e "Invalid command option.\nUsage: $(basename "$0") [-a] [-b] [-c arg]"
      exit 1
      ;;
  esac
done
shift "$((OPTIND - 1))"
```

Note that we have updated optstring as well: it now starts with the colon (:) character, which suppresses the default error messages. getopts also disables its error reporting when the OPTERR variable is set to zero.

Parsing Long Command-Line Options With getopt

```bash
#!/bin/bash

VALID_ARGS=$(getopt -o abg:d: --long alpha,beta,gamma:,delta: -- "$@")
if [[ $? -ne 0 ]]; then
  exit 1
fi

eval set -- "$VALID_ARGS"
while [ : ]; do
  case "$1" in
    -a | --alpha)
      echo "Processing 'alpha' option"
      shift
      ;;
    -b | --beta)
      echo "Processing 'beta' option"
      shift
      ;;
    -g | --gamma)
      echo "Processing 'gamma' option. Input argument is '$2'"
      shift 2
      ;;
    -d | --delta)
      echo "Processing 'delta' option. Input argument is '$2'"
      shift 2
      ;;
    --)
      shift
      break
      ;;
  esac
done
```

The -o option specifies the short command-line options, and the --long option specifies the long command-line options.
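For illustration, here is how the long-option script above might be invoked, assuming it has been saved as parse.sh (a hypothetical filename) and made executable.

```bash
# Hypothetical filename; make it executable first: chmod +x parse.sh
./parse.sh --alpha -g 42 --delta today
# Expected output:
#   Processing 'alpha' option
#   Processing 'gamma' option. Input argument is '42'
#   Processing 'delta' option. Input argument is 'today'
```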