- Resources
- Get help/documentation
- Interactive use
- Bash Invocation
- Variables
- Substitutions
- Expansions
- if statement
- Conditional: [ vs [[
- Arithmetic
- Array
- Associative array/Dictionary/Map
- Looping
- case statements
- Functions
- Variable scope
- Input / output
- String escaping
- Here documents
- Builtins
- Scripting best practices
- Quick recipes
- Read text to an array
- Read a file line by line
- Read file to an array
- Generate random numbers
- Generate random strings
- Multiple commands on a single line
- Basename and dirname
- Get the directory of a script you're running
- Subshells
- Special characters on command line
- Text file intersections
- Sum up a column of numbers
- Add content hash to file name
- Check whether a command exists
- Test connection to multiple IPs and ports
- Bash by example on IBM DeveloperWorks by Daniel Robbins
- The Art of Command Line
- 15 Examples To Master Linux Command Line History
- `man`

# search man pages
man -k crontab
# crontab (1) - maintain crontab files for individual users (Vixie Cron)
# crontab (5) - tables for driving cron

# limit search to section 1 (executables/commands)
man -k crontab -s 1
# crontab (1) - maintain crontab files for individual users (Vixie Cron)

# search using regex
man -k 'cron.*' --regex

man crontab

# man page of crontab in section 5 (file format)
man 5 crontab
Navigation in a man page (seems different from the usual vim shortcuts):

- `k` / `K`: next / previous match
- `j` / `e`: next / previous line
- `type`

Some commands are not executables but Bash builtins; you can check this with `type`:

type echo
# echo is a shell builtin
type python
# python is /home/gary/miniconda3/bin/python
- `help`

Shows info about builtin commands; `help` by itself lists all builtins.

- misc

whatis node
# node (1) - Server-side JavaScript runtime
which node
# /home/gary/.nvm/versions/node/v12.14.0/bin/node
whereis node
# node: /usr/local/bin/node /home/gary/.nvm/versions/node/v12.14.0/bin/node
Interactive line editing is handled by the readline library. It's in Emacs mode by default and can be changed to Vi mode; see `man readline` for all editing shortcuts.
- Search history: `Ctrl-R`, then type in a keyword; `Ctrl-R` again to loop through results;
- Right arrow to select the current result;
- Repeat the previous command:
  - the Up key
  - `Ctrl-P`
  - `!!`
  - `!-1`
- Execute a specific command from history

# find the number of the command
$ history | grep echo
 1437  echo $VISUAL
 1438  echo $EDITOR
 1439  echo $GIT_EDITOR
 2013  echo 'hello world'
 2014  echo 'hi'
 2016  echo 'hi'
 2020  history | grep echo

# execute the specific command
$ !2013
echo 'hello world'
hello world
Run set -o vi to change to Vi mode:

- `-`, `k`: previous history
- `+`, `j`: next history
- `C-K`: kill line
- `/`, `?`: search history
- `n`, `N`: next/previous search result
- `#`: comment out current command and keep it in the history
- `C-W`: delete word backward
- `C-U`: delete to the start of the line
- `C-[`: switch to command mode
Typically, you are using a non-login shell, unless:

- you logged in from a tty, not through a GUI;
- you logged in remotely, such as through ssh;
A login shell is one whose first character of argument zero is a -, or one started with the --login option. You can test whether your current shell is a login shell or not:
prompt> echo $0
-bash # "-" is the first character. Therefore, this is a login shell.
prompt> echo $0
bash # a non-login shell.
An interactive shell is:

- one started without non-option arguments and without the -c option, whose standard input and error are both connected to terminals (as determined by isatty(3));
- or one started with the -i option;

PS1 is set, and $- includes i if bash is interactive, allowing a shell script or a startup file to test this state.
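A quick way to test both from a running shell (a minimal sketch; shopt -q login_shell is bash-specific):

# interactive?
case $- in
  *i*) echo 'interactive' ;;
  *) echo 'non-interactive' ;;
esac

# login shell?
shopt -q login_shell && echo 'login shell' || echo 'non-login shell'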
Ref: Zsh/Bash startup files loading order (.bashrc, .zshrc etc.)
+----------------+-----------+-----------+------+
| |Interactive|Interactive|Script|
| |login |non-login | |
+----------------+-----------+-----------+------+
|/etc/profile | A | | |
+----------------+-----------+-----------+------+
|/etc/bash.bashrc| | A | |
+----------------+-----------+-----------+------+
|~/.bashrc | | B | |
+----------------+-----------+-----------+------+
|~/.bash_profile | B1 | | |
+----------------+-----------+-----------+------+
|~/.bash_login | B2 | | |
+----------------+-----------+-----------+------+
|~/.profile | B3 | | |
+----------------+-----------+-----------+------+
|BASH_ENV | | | A |
+----------------+-----------+-----------+------+
|~/.bash_logout | C | | |
+----------------+-----------+-----------+------+
Login shell (interactive or not):
- read
/etc/profile
(if exists); - read first readable:
~/.bash_profile
,~/.bash_login
,~/.profile
; - ...
- when it exits, run ~/.bash_logout (if it exists)
Interactive non-login shell:
- read both
/etc/bash.bashrc
,~/.bashrc
(if exist)
General rule: For Bash, put stuff in ~/.bashrc
, and make ~/.bash_profile
source it
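A common way to do that is a ~/.bash_profile along these lines (a minimal sketch):

# ~/.bash_profile: delegate interactive settings to ~/.bashrc
if [ -f ~/.bashrc ]; then
  . ~/.bashrc
fi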
2016-02-09: a `PS1` prompt problem: the prompt could not be changed, `PS1` settings in ~/.bashrc had no effect, and setting `PS1` on the command line didn't change it either, yet the prompt showed git branches.
Finally found the reason: `/etc/bash_completion.d/git-prompt`, which sourced `/usr/lib/git-core/git-sh-prompt`.
set, use, unset a variable
$ foo='i am a var'
$ echo "hello ${foo} world"
hello i am a var world
$ unset foo
$ echo "hello ${foo} world"
hello world
Expression in script | FOO="world" (Set and Not Null) | FOO="" (Set But Null) | FOO unset |
---|---|---|---|
${FOO:-hello} | world | hello | hello |
${FOO-hello} | world | "" | hello |
${FOO:=hello} | world | FOO=hello | FOO=hello |
${FOO=hello} | world | "" | FOO=hello |
${FOO:?hello} | world | error, exit | error, exit |
${FOO?hello} | world | "" | error, exit |
${FOO:+hello} | hello | "" | "" |
${FOO+hello} | hello | hello | "" |
Null and empty string are equivalent in Bash
Check whether a variable is set:
# check whether a var is set and not empty
# we have the '-u' flag, so simple '[ -n "$1" ]' would throw an error
set -eu
if [ -n "${1:+x}" ]; then
echo 'set'
fi
See details here: https://stackoverflow.com/a/3870055
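If you need to distinguish "unset" from "set but empty", the `${var+x}` form from the table above works (a minimal sketch):

if [ -n "${FOO+x}" ]; then
  echo 'FOO is set (possibly empty)'
else
  echo 'FOO is unset'
fi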
CAUTION: In single brackets, always quote a variable
# Wrong
[ -n $NOT_DEFINED ] && echo 'yes' || echo 'no'
# yes
# the above is equivalent to, the test returns true
[ -n ] && echo 'yes' || echo 'no'
# yes
# Correct: you NEED to quote the variable in single brackets
[ -n "$NOT_DEFINED" ] && echo 'yes' || echo 'no'
# no
# OR, use double brackets, then no need to worry about quoting
[[ -n $NOT_DEFINED ]] && echo 'yes' || echo 'no'
# no
- String length

a='hello world'
echo ${#a}
# 11
- Chopping strings: `##` and `#` chop from the beginning, `%%` and `%` chop from the end; `##` and `%%` take the longest match, `#` and `%` the shortest match

foo='hello-hello.world.jpg'
echo ${foo##*hello}
# .world.jpg
echo ${foo#*hello}
# -hello.world.jpg
echo ${foo%%.*}
# hello-hello
echo ${foo%.*}
# hello-hello.world
- String replacement

a='foo'
echo ${a/o/X}
# fXo
# double slash to replace all occurrences
echo ${a//o/X}
# fXX
- Substring

foo='hello-world.jpg'
echo ${foo:6:5}
# world
- Uppercase / Lowercase (since Bash 4)

a='test'
echo "${a^}"
# Test
echo "${a^^}"
# TEST
b='TEST'
echo "${b,}"
# tEST
echo "${b,,}"
# test
Environment variables are part of the UNIX process model. This means they are not exclusive to shell scripts, but can be used by compiled programs as well. When we export an environment variable under bash, any subsequent program we run can read it, whether it is a shell script or not.
export a variable
foo='i am a var'
export foo
or use a one-liner: export foo='i am a var'
Exported variables are copied, not shared: any modification in the child process will not affect the variable in the parent process.
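A quick way to see the copy-not-share behaviour (a minimal sketch using a child bash process):

export foo='from parent'
bash -c 'foo="changed in child"; echo "child: $foo"'
# child: changed in child
echo "parent: $foo"
# parent: from parent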
date
# Sat Apr 11 10:09:59 NZST 2020
# set environment variable for a command
TZ=Asia/Shanghai date
# Sat Apr 11 06:10:21 CST 2020
$$
current process id;$?
exit code of last command;
#!/bin/bash
echo "name of script is '$0'"
echo "first argument is '$1'"
echo "second argument is '$2'"
echo "seventeenth argument is '${17}'"
echo "number of arguments is '$#'"
echo "all arguments: "
for arg in "$@"
do
echo $arg
done
test it out:
$ ./cmd-args.sh hello world is-fun
name of script is './cmd-args.sh'
first argument is 'hello'
second argument is 'world'
seventeenth argument is ''
number of arguments is '3'
all arguments:
hello
world
is-fun
Since Bash 4.3, you can use the declare builtin to create dynamic (name reference) variables
i=20
var_20=gary
# ref is like a pointer in C
declare -n ref="var_$i"
echo "$ref"
# gary
echo "${!ref}" # which variable ref is pointing to
# var_20
ref=amy
echo "$ref"
# amy
echo "$var_20"
# amy
Use back ticks or $(...)
to get the output of a command
$ echo `date '+%Y/%m/%d %H:%M:%S'`
2013/01/10 10:43:56
$ echo $(date '+%Y/%m/%d %H:%M:%S')
2013/01/10 10:44:05
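More often you capture the output in a variable for later use (a minimal sketch):

now=$(date '+%Y/%m/%d %H:%M:%S')
echo "started at ${now}"
# started at 2013/01/10 10:44:05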
When a command expects filenames as arguments, <(CMD) makes the output of CMD available through a temporary file (or named pipe).
Example: using diff to compare file list in two directories
ls test1 test2
# test1:
# a b c
# test2:
# c d e
diff <(ls test1) <(ls test2)
# 1,2d0
# < a
# < b
# 3a2,3
# > d
# > e
ls test? # single character
# test1 test2
ls "test?" # double quote disables globbing
# ls: cannot access 'test?': No such file or directory
ls gary.{txt,jpg}
# gary.txt gary.jpg
mkdir -p test-{1..3}/sub-{a..b} # create all combinations
echo {1..5} # number sequence
# 1 2 3 4 5
echo {a..h} # character sequence
# a b c d e f g h
echo {10..1} # reversed sequence
# 10 9 8 7 6 5 4 3 2 1
echo {1..10..3} # sequence with increment interval
# 1 4 7 10
echo {0..1}{0..9} # combinations
# 00 01 02 03 04 05 06 07 08 09 10 11 12 13 14 15 16 17 18 19
Brace expansion is performed before any other expansion, so {1..$END} does not work as expected; use seq START END or a for loop instead:
END=3
for i in {1..$END}; do echo $i; done
# {1..3}
for i in `seq 1 $END`; do echo $i; done
# 1
# 2
# 3
for ((i=0; i<=$END; i++)) do echo $i; done
# 0
# 1
# 2
# 3
if [ condition ]
then
action
elif [ condition2 ]
then
action2
.
.
.
elif [ condition3 ]
then
action3
else
actionx
fi
example
if [ "${1##*.}" = "tar" ]
then
echo 'This appears to be a tarball.'
else
echo 'At first glance, this does not appear to be a tarball.'
fi
bash comparison operators:
Operator | Meaning | Example |
---|---|---|
-s | File exists and not empty | [ -s "$myvar" ] |
-z | Zero-length string | [ -z "$myvar" ] |
-n | Non-zero-length string | [ -n "$myvar" ] |
= | String equality | [ "abc" = "$myvar" ] |
== | Bash extension, same as '=' | |
!= | String inequality | [ "abc" != "$myvar" ] |
-eq | Numeric equality | [ 3 -eq "$myinteger" ] |
-ne | Numeric inequality | [ 3 -ne "$myinteger" ] |
-lt | Numeric strict less than | [ 3 -lt "$myinteger" ] |
-le | Numeric less than or equals | [ 3 -le "$myinteger" ] |
-gt | Numeric strict greater than | [ 3 -gt "$myinteger" ] |
-ge | Numeric greater than or equals | [ 3 -ge "$myinteger" ] |
-f | Exists and is regular file | [ -f "$myfile" ] |
-d | Exists and is directory | [ -d "$myfile" ] |
-nt | First file is newer than second one | [ "$myfile" -nt ~/.bashrc ] |
-ot | First file is older than second one | [ "$myfile" -ot ~/.bashrc ] |
=~ | Regular expression, NO QUOTES around the RE | [[ "$file" =~ .*\.jpg$ ]] |
example:
if [ "$myvar" -eq 3 ]
then
echo "myvar equals 3"
fi
if [ "$myvar" = "3" ]
then
echo "myvar equals 3"
fi
If $myvar
is an integer, these two comparisons do exactly the same thing, but the first uses arithmetic comparison operators, while the second uses string comparison operators. If $myvar
is not an integer, then the first comparison will fail with an error.
If $myvar is empty or has spaces in it, like 'foo bar', it will result in an error:
$ myvar="foo bar oni"
$ if [ $myvar = "foo bar oni" ]
> then
> echo "yes"
> fi
bash: [: too many arguments
$ unset myvar
$ echo $myvar
$ if [ $myvar = "foo bar oni" ]
> then
> echo "yes"
> fi
bash: [: =: unary operator expected
So, always enclose string variables and environment variables in double quotes, like this:
if [ "$myvar" = "foo bar oni" ]
then
echo "yes"
fi
- [ is a shell builtin command, similar to test, but requires a closing ]; builtin commands execute in the current process;
- there is also an executable file at /bin/[, which executes as a separate process;

type -a [
# [ is a shell builtin
# [ is /usr/bin/[
type '[['
# [[ is a shell keyword
- [[ is a Bash extension to [, with some improvements:
  - <
    - [[ a < b ]] # works
    - [ a \< b ] # \ is required, does a redirection otherwise
  - && and ||
    - [[ a = a && b = b ]] # works
    - [ a = a && b = b ] # syntax error
    - [ a = a ] && [ b = b ] # POSIX recommendation
  - (
    - [[ (a = a || a = b) && a = b ]] # false
    - [ ( a = a ) ] # syntax error, () is interpreted as a subshell
    - [ \( a = a -o a = b \) -a a = b ] # equivalent, but () is deprecated by POSIX
    - ([ a = a ] || [ a = b ]) && [ a = b ] # POSIX recommendation
  - word splitting
    - x='a b'; [[ $x = 'a b' ]] # true, quotes not needed
    - x='a b'; [ $x = 'a b' ] # syntax error, expands to [ a b = 'a b' ]
    - x='a b'; [ "$x" = 'a b' ] # equivalent
  - =
    - [[ ab = a? ]] # true, because it does pattern matching (* ? [ are magic); does not glob-expand to files in the current directory (pattern matching, not regular expression)
    - [ ab = a? ] # a? glob-expands to files in the current directory, so may be true or false depending on the files there
    - [ ab = a\? ] # false, no glob expansion
    - = and == are the same in both [ and [[, but == is a Bash extension
  - =~
    - [[ ab =~ ab? ]] # true, POSIX extended regular expression match, ? does not glob-expand
    - [ a =~ a ] # syntax error
    - printf 'ab' | grep -Eq 'ab?' # POSIX equivalent
- Enclose arithmetic expressions (integer only) in $(( and ))
$ echo $(( 100/3 ))
33
$ echo $((1+2))
3
$ a=10
$ echo $(( a+2 ))
12
$ echo $(( $a+2 ))
12
$ echo $(( 1.3 + 4 ))
bash: 1.3 + 4 : syntax error: invalid arithmetic operator (error token is ".3 + 4 ")
- Available in Bash and Zsh, not in the original Bourne shell;
arr=(apple banana cherry)
echo ${arr[@]} # all elements
# apple banana cherry
echo ${arr[*]} # same as above, get all elements
# apple banana cherry
echo ${#arr[@]} # length
# 3
echo ${arr[0]} # first element, index starts at 0
# apple
echo ${arr[@]:1} # skip the first element
# banana cherry
echo ${arr[@]: -1} # get the last, the space is needed
# cherry
echo ${arr[@]:0:2} # start from the first, get two
# apple banana
# loop through an array
for (( i=0; i<${#arr[@]}; i++ )); do
echo ${arr[$i]}
done
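You can also iterate over the elements directly, which is usually simpler (a minimal sketch):

for fruit in "${arr[@]}"; do
  echo "$fruit"
done
# apple
# banana
# cherry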
Bash 4+
declare -A animals=( ["cow"]="moo" ["dog"]="woof")
echo "${!animals[@]}" # all keys
# dog cow
echo "${animals[@]}" # all values
# woof moo
echo "${animals[cow]}" # retrive a value
# moo
for animal in "${!animals[@]}"; do
echo "$animal - ${animals[$animal]}";
done
# dog - woof
# cow - moo
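To check whether a key exists, the ${var+x} test works on individual elements too (a sketch):

if [[ -n "${animals[pig]+x}" ]]; then
  echo 'pig is defined'
else
  echo 'pig is not defined'
fi
# pig is not defined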
Standard for
loop:
$ for x in one two three four
> do
> echo number $x
> done
number one
number two
number three
number four
Use file wildcards and variables in the word list:
$ FOO='hello'
$ for i in lee_* $FOO
> do
> echo $i
> done
lee_1
lee_2
lee_3
hello
while
loop:
$ i=0
$ while [ $i -le 3 ]
> do
> echo $i
> i=$(( i+1 ))
> done
0
1
2
3
until
loop:
$ i=0
$ until [ $i -eq 2 ]
> do
> echo $i
> i=$(( i + 1 ))
> done
0
1
case
syntax:
case ${filename##*.} in
[tT][xX][tT]) # matches txt, TXT, Txt, tXt, ...
echo 'a text file'
;;
jpg | png)
echo 'an image file'
;;
*)
echo 'unknown file'
;;
esac
* means default, ;; means break (as in C's switch)
Functions can take arguments just like scripts; use $1, $2, $#, $@, etc. to access them.

Write a script func_args.sh:
#!/bin/bash
func() {
echo "this function has $# arguments"
local i
local count=1
for i in "$@"
do
echo "arg ${count}: $i"
count=$(( count + 1 ))
done
echo '.. and $0: ' $0
}
Back in bash, source it and call the function:
$ source func_args.sh
$ func a happy dog
this function has 3 arguments
arg 1: a
arg 2: happy
arg 3: dog
.. and $0: /bin/bash
But $0 in a function expands to either the shell's filename (if you run the function interactively) or the name of the script the function is called from.
return values from a function:
larger() {
if [ $1 -gt $2 ]; then
return 0 # should be zero
else
return 1
fi
}
if larger 2 1; then
echo 'hooray'
fi
You can make functions return numeric values using the return command. The usual way to make functions return strings is for the function to store the string in a variable, which can then be used after the function finishes. Alternatively, you can echo a string and catch the result, like this: RETURN_VAL=$(func var1 var2)
After calling a function, you can get its exit status using $?.
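A small sketch of both styles, reusing the larger function above (the greet function is made up for illustration):

greet() {
  echo "hello, $1"
}

msg=$(greet world)  # capture the echoed string
echo "$msg"
# hello, world

larger 2 1
echo $?
# 0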
#!/bin/bash
s='hello from global scope'
func() {
s='hello from func'
echo $s
}
func2() {
local s='hello from func2'
echo $s
}
echo 'before func():' $s
func
echo 'after func() :' $s
func2
echo 'after func2():' $s
run the script, you'll get:
before func(): hello from global scope
hello from func
after func() : hello from func
hello from func2
after func2(): hello from func
Variables defined in functions have global scope, unless you declare them as local explicitly.
By default stdout and stderr both go to the current terminal; they can be redirected:
ls > out 2> err
# append to files
ls >> out 2>> err
# redirect stderr to the stdout file 'out'
ls >> out 2>&1
# alternative syntax
ls &>> out
If out is a protected file, sudo ls /root > out won't work: sudo only applies to ls, while the redirection is done by the shell (zsh here), which is not running as the super user. So you need to pipe stdout to sudo tee:
ls -l out
# -rw-rw-r-- 1 root root 0 May 23 21:00 out
sudo ls /root > out
# zsh: permission denied: out
sudo ls /root | sudo tee out
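An alternative is to run the whole command line, including the redirection, under sudo (a sketch):

sudo sh -c 'ls /root > out'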
- Double quotes within double quotes

echo "He says \"I am good\"" # He says "I am good"

- Single quotes within single quotes
  - Replace each single quote with '\'': the quote itself is not enclosed by a pair of single quotes; because there is no whitespace, the three parts are joined together and treated as one word

echo 'It'\''s great' # It's great

  - Use the hex code

echo -e 'It\x27s great' # It's great

  - Use $'string' to enable ANSI C escape sequences

echo $'It\'s great' # It's great
- Used in place of standard input

name='Gary'

# write something to a file, variables expanded
cat > result.txt <<EOT
hello ${name}
EOT

cat result.txt
# hello Gary

# quote EOT to disable variable expansion
cat > result2.txt <<'EOT'
hello ${name}
EOT

cat result2.txt
# hello ${name}
- Assign to a variable

arr=$(cat <<EOT
line1
line2
EOT
)
- Use tee to output a here document, echo doesn't work

tee /dev/null <<EOT
hello
world
EOT
# hello
# world
- Use <<- to strip leading tabs (doesn't work with spaces)

# the first line below is indented with a tab, the second with a space
tee /dev/null <<-EOT
	a tab
 a whitespace
EOT
# a tab
#  a whitespace

- To strip all leading whitespace (tabs and spaces), use sed

tee /dev/null <<-EOT | sed -E 's/^\s*//'
	a tab
 a whitespace
EOT
# a tab
# a whitespace
- Use here documents to edit a file:

$ cat inc
foo
$ ed inc <<EOT
> 1
> s/foo/BAR/
> w
> q
> EOT
5
foo
5
$ cat inc
BAR
Here strings (<<<):

- do not use delimiters
- leading and trailing newlines are retained
tr a-z A-Z <<< '
> one
> two
> '
#
# ONE
# TWO
#
Here strings are particularly useful when the last command needs to run in the current process, as is the case with the read builtin:
$ echo 'one two three' | read -r a b c
$ echo "$a $b $c"
yields nothing, while
$ read -r a b c <<< 'one two three'
$ echo "$a $b $c"
one two three
This happens because in the previous example piping causes read
to run in a subprocess, and as such can not affect the environment of the parent process.
source ./script
# or
# . ./script
The source command runs the script in the same shell as the calling script, much like #include in C. It can be used to incorporate variable and function definitions into a script, for example to set up the environment for later commands.
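A minimal sketch of that pattern (config.sh and main.sh are made-up file names):

# config.sh
GREETING='hello'
greet() { echo "$GREETING, $1"; }

# main.sh
#!/usr/bin/env bash
source ./config.sh
greet world
# hello, world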
$ echo 'your name?'; read name
your name?
lee
$ echo $name
lee
X/Open suggests that printf should be used in preference to echo for generating formatted output; usage is similar to that in C.
$ printf '%10s\t%-10d\n' 'lee' 20
lee 20
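printf is also handy for zero-padding, and the format string is reused when there are more arguments than format specifiers (a sketch):

printf '%03d\n' 7
# 007
printf '%s=%s\n' a 1 b 2
# a=1
# b=2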
used to set shell options and positional parameters
$ set foo bar lol
$ echo $1 $3
foo lol
A trick: using set to get fields of a command's output
$ date
Tue Sep 9 09:48:17 CST 2014
$ set $(date)
$ echo $1 $2
Tue Sep
This is just an example; you should not actually use this to extract date fields, use format strings instead.
shift shifts parameters off the left; it can be used to scan parameters

$ set foo bar hah # set parameters
$ echo $@
foo bar hah
$ shift
$ echo $@
bar hah
$ echo $#
2
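A typical use is processing all arguments in a loop (a minimal sketch):

while [ $# -gt 0 ]; do
  echo "processing: $1"
  shift
done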
Always add the shebang on first line, use env
to find bash
in $PATH
:
#!/usr/bin/env bash
- set -n, same as set -o noexec: check for syntax errors only, don't execute commands;
- set -v, same as set -o verbose: echo commands before running them;
- set -x, same as set -o xtrace: print commands and their arguments as they are executed (after variable expansion);
- set -u, same as set -o nounset: exit when an undefined variable is used, otherwise it's silently ignored;
  - use a default value when necessary: NAME=${1:-gary}, if $1 is undefined or empty, NAME will be gary;
- set -e: abort on errors (non-zero exit codes), otherwise the script would continue. By default, bash doesn't exit on errors, which makes sense in an interactive shell context, but not in a script;
  - NOTE: when using &&, if a non-last command fails, the script doesn't exit even though the exit code of the whole line is not 0; this may or may not be what you expect:

set -eu
foo && echo 'after foo'
# foo fails but the script doesn't exit
echo $? # output 127
echo 'end' # runs

The && and if structures may seem the same, but their exit codes are different:

[ -f /x ] && echo 'done'
echo $? # 1
if [ -f /x ]; then echo 'done'; fi
echo $? # 0

- set -o pipefail: abort on errors within pipes

ls /point/to/nowhere | sort
# ls: cannot access '/point/to/nowhere': No such file or directory
echo $? # 0

Without the flag, ls has empty stdout and a message on stderr; sort takes the empty stdout and executes successfully, and its exit code 0 becomes the whole command's exit code.

set -o pipefail
ls /point/to/nowhere | sort
# ls: cannot access '/point/to/nowhere': No such file or directory
echo $? # 2

With the flag, sort still executes, but ls's exit code becomes the whole command's exit code; with set -e, the script exits.
IFS stands for Internal Field Separator; by default, its value is $' \n\t' ($'...' is the construct that allows escaped characters).
for arg in $@; do
echo "doing something with file: $arg"
done
./x.sh a.txt 'gary li.doc'
# doing something with file: a.txt
# doing something with file: gary
# doing something with file: li.doc
In the example above, we don't want to split by space, so we'd better set IFS=$'\n\t'
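With that change, the same invocation keeps the file name containing a space intact (a minimal sketch of the adjusted script):

#!/usr/bin/env bash
IFS=$'\n\t'
for arg in $@; do
  echo "doing something with file: $arg"
done
# ./x.sh a.txt 'gary li.doc'
# doing something with file: a.txt
# doing something with file: gary li.doc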
You can use trap to do some cleanup work on script error or exit; this makes sure the cleanup is always done even when the script exits unexpectedly:
#!/bin/bash
set -euo pipefail
set -x
function onExit {
echo 'EXIT: clean up, remove temp directories, stop a service, etc'
}
function onError {
echo 'ERROR: something is wrong'
}
trap onError ERR # do something on error
trap onExit EXIT # do something on exit
foo # triggers an error
exit 0
outputs
x.sh: line 15: foo: command not found
ERROR: something is wrong
EXIT: clean up, remove temp directories, stop a service, etc
It's good practice to start a script like this (the so-called unofficial Bash strict mode), which detects undefined variables, aborts on errors, and prints a message:
set -euo pipefail
IFS=$'\n\t'
trap "echo 'error: Script failed: see failed command above'" ERR
# your code here
Install shellcheck to lint your scripts:
sudo apt install shellcheck
shellcheck script.sh
readarray rows << EOT
gary 20
amy 30
EOT
for row in "${rows[@]}"; do
rowArr=($row)
name=${rowArr[0]}
age=${rowArr[1]}
echo "${name} - ${age}"
done
while read -r line; do
echo $line;
done < my_file.txt
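To preserve leading/trailing whitespace and backslashes in each line, the usual idiom clears IFS and uses -r (a sketch):

while IFS= read -r line; do
  printf '%s\n' "$line"
done < my_file.txt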
Read a file into an array of rows, then split each row into fields:
readarray rows < demo.txt
for row in "${rows[@]}";do
row_array=(${row})
first=${row_array[0]}
echo ${first}
done
echo $RANDOM
# 12521
echo $RANDOM
# 15828
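RANDOM gives an integer between 0 and 32767; for a number in a smaller range, the modulo trick is common (a sketch; the distribution is slightly biased):

# random integer between 0 and 99
echo $(( RANDOM % 100 ))
# random integer between 1 and 6
echo $(( RANDOM % 6 + 1 ))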
# 5 random bytes, printed as 10 hex characters
openssl rand -hex 5
# cf2a039a47
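Another common recipe reads from /dev/urandom and filters to the characters you want (a sketch; LC_ALL=C helps tr cope with the binary input on some systems):

LC_ALL=C tr -dc 'a-zA-Z0-9' < /dev/urandom | head -c 10; echo
# e.g. q3ZfG7kp1A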
All three commands will execute even if some of them fail
$ make ; make install ; make clean
Only proceed to the next command when the preceding one succeeded
$ make && make install && make clean
Stop execution after first success
$ cat file1 || cat file2 || cat file3
basename /home/lee/code/test.php
# test.php
dirname /home/lee/code/test.php
# /home/lee/code
ref: http://stackoverflow.com/questions/59895/can-a-bash-script-tell-what-directory-its-stored-in
DIR=$( cd "$( dirname "$0" )" && pwd )
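If the script may be sourced rather than executed, BASH_SOURCE is often preferred over $0 (a sketch):

DIR=$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )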
Use subshells (enclosed in parentheses) to group commands, which allows you to change to another directory temporarily
# do something in current dir
(cd /some/other/dir && other-command)
# continue in original dir
Use $''
to input special characters on command line
echo hello$'\n\t'world
# hello
# world
If you have two files, a and b, whose lines are already unique (deduplicated), you can use sort/uniq to find their common/different lines like this:
sort a b | uniq # a union b
sort a b | uniq -d # a intersect b
sort a b b | uniq -u # set difference a - b
awk '{ sum += $2 } END { print sum }' numbers.txt
# rename x.ext => x.[md5sum].ext
for f in *; do
  hash=$(md5sum "$f" | cut -d' ' -f1)
  mv "$f" "${f%%.*}.${hash}.${f##*.}"
done
if ! command -v <the_command> &> /dev/null
then
echo "<the_command> could not be found"
exit
fi
command is a builtin; it can be used to:

- invoke a command on disk even when a function with the same name exists
- display information about a command
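For example (a sketch; the ls function here is made up to show the shadowing case):

ls() { echo 'not the real ls'; }
ls             # not the real ls
command ls     # runs the real ls from $PATH
command -v git # prints the path of git; non-zero exit status if it doesn't exist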
readarray rows << EOT
10.0.0.4 AGW
10.1.1.5 VM-DB-APP
10.2.2.6 LINUX
EOT
tcp_port_is_open() {
local code
curl --telnet-option BOGUS --connect-timeout 2 -s telnet://"$1:$2" </dev/null
code=$?
case $code in
49) echo -n "Y" ;;
*) echo -n "N" ;;
esac
}
# loop through ports
test_vm_ports() {
local IP=$1
shift
PORTS=$*
echo -n -e "$IP \t| "
for p in 3389 1433 443 22; do
if [[ $PORTS =~ $p ]]; then # test this port if matching
IS_OPEN=$(tcp_port_is_open $IP $p)
[[ $IS_OPEN == "Y" ]] && echo -n -e "$p O\t" || echo -n -e "$p X\t"
else
echo -n -e "$(tr "[:digit:]" " " <<< $p) \t"
fi
echo -n "| "
done
}
## main, loop through IPs
for row in "${rows[@]}"; do
rowArr=($row)
IP=${rowArr[0]}
TYPE=${rowArr[1]}
RUNNING=${rowArr[2]}
case $TYPE in
'AGW') PORTS="443" ;;
'VM-DB-APP') PORTS="3389 1433 443" ;;
'LINUX') PORTS="22" ;;
*) echo "Wrong type"
esac
test_vm_ports $IP $PORTS
echo ""
done
Output is like:
10.0.0.4 | | | 443 O | |
10.1.1.5 | 3389 O | 1433 O | 443 X | |
10.2.2.6 | | | | 22 O |