sagittarius
Members
Hello fellow nixers,

I'm curious to know more about your automation scripts, regarding daily administration, security, workflow and so on.

Because IT guys are lazy, it is a good thing to script tasks as much as possible. Do you have any one-liners or POSIX-compliant scripts you'd be proud to share/explain? Even dirty pieces of code are welcome.

Expecting tr0lls, I know wmutils is based on scripts, and that's why it's so delightful!

The first one I could show you is a script based on vnstat that generates graphs of network activity. I usually launch it when turning the computer off. A crontab is also in charge of executing it if I don't do it myself (thinking about my VPS). It gives me an idea of the amount of network activity (and lets me show off a bit as well).
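
Roughly, the idea is something like this (a minimal sketch, not the actual script; the interface and output paths are just examples):

Code:
# render vnstat's counters as images with vnstati
vnstati -s -i eth0 -o "$HOME/graphs/vnstat_summary.png"
vnstati -d -i eth0 -o "$HOME/graphs/vnstat_daily.png"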

This is how it looks when it is properly configured:

[Image: vnstat_364865_full.png?format=jpg&width=...cale=false]

Will share the source code as soon as I get back home.

Cheers
venam
Administrators
(17-06-2016, 09:03 AM)sagittarius Wrote: I'm curious to know more about your automation scripts, regarding daily administration, security, workflow and so on.

This is a vague subject to discuss, as almost everything can be automated.
LITERALLY EVERYTHING.

Maybe it would be wiser to think of the wider aspect of it: what to take into consideration when automating tasks.
But then again, there are so many types of tasks.

What should you trigger on: hardware events, periodic runs, manual runs of an otherwise automatic process, new log entries, etc.?

Even just automatically remapping your keys on boot can be considered task automation.
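
For example, even a single line run at session start (from ~/.xinitrc, say) counts; the option here is just an example:

Code:
# example key remap: swap caps lock and escape
setxkbmap -option caps:swapescape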

I'll still contribute with a cool link:
For security, there are these automation scripts for after the fact: http://linuxforensicsbook.com/code.html

You could take those scripts as a base for what to check in your daily administration; then again, it depends on what you are administering.
josuah
Long time nixers
I have a few scripts I use every day, but none of them automate anything in the background.

This is really neither administration nor security, only the "and so on"...

The only two things I currently automate are:

Tags in the terminal window manager DVTM, just like in dwm, set from my .profile:


Code:
tag()
{
    local tag="$1"
    local name="$2"
    local cmd="$3"
    shift 3

    # ask dvtm to switch to the tag, through its command FIFO
    [ -p "$DVTM_CMD_FIFO" ] && printf 'tag%s\n' "$tag" >> "$DVTM_CMD_FIFO"
    # set the terminal title to the tag name
    printf '\033[0m\033]0;%s\007' "$name"
    # run the command with any remaining arguments
    $cmd "$@"
}

This one sets the tag, just like in dwm, by printing the name of a command ("tag#", with # being a tag number) to a named pipe (a FIFO) that dvtm reads for automation.
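
You can test the mechanism by hand (assuming DVTM_CMD_FIFO points at the right named pipe):

Code:
# switch to tag 2 manually by writing the command name to the FIFO
[ -p "$DVTM_CMD_FIFO" ] && printf 'tag2\n' >> "$DVTM_CMD_FIFO"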

The commands must be declared in dvtm's config.h with something like:

Code:
static Cmd commands[] = {
    { "create", { create, { NULL    } } },
    { "tag1",   { tag,    { tags[0] } } },
    { "tag2",   { tag,    { tags[1] } } },
    { "tag3",   { tag,    { tags[2] } } },
    { "tag4",   { tag,    { tags[3] } } },
    { "tag5",   { tag,    { tags[4] } } },
    { "tag6",   { tag,    { tags[5] } } },
    { "tag7",   { tag,    { tags[6] } } },
};

Then, I set aliases for my most-used software, to open them in a dedicated tag immediately, in my .profile (github.com/sshbio/dot/raw/master/.profile):

Code:
alias    vis='tag 2 vis    $VISUAL'
alias    web='tag 3 web    $BROWSER'
alias    man='tag 3 man    man'
alias    irc='tag 4 irc    irc'
alias   mail='tag 4 mail   mail'
alias    vol='tag 5 volume "alsamixer -c 1"'
alias   grex='tag 6 grex   ssh josuahdemangeon@grex.org'

Finally, whenever I return from a program to the shell, it is automatically tagged back to the "shell" tag by calling the tag function in my PS1:

Code:
export PS1='$(tag 1 "${PWD/$HOME/~}" "")> '

PS: DVTM has to be started with the -c option to set the name of the named pipe it listens on:
Code:
dvtm -c <whatever fifo>


Compilation. I have a few projects I prefer to compile myself, for instance when I want to change one of the files (like config.h), apply a patch (like for dwm or st), or when I made my own patch or edited the source.


Then I can keep my few "custom" programs up to date, without having to manually download the .tar.gz > extract > compile each one the way it needs > install it into my custom $PREFIX.

The way it is implemented is quite dumb:

Code:
source config dir
|_ one dir per project
|  |_ at least the build.sh script
|  |_ possibly other resources used by the build.sh script, like config.h, or patches
|_ one default build script; for most software, this just works:

build()
{
    # apply every patch shipped in the project's config dir
    for patch in $(find "$DOT/src/$1" -name '*.diff')
    do patch -p1 < "$patch"
    done

    # copy in custom config files if the project has them, then build
    [ -f "$DOT"/src/"$1"/config.mk ] && cp -f "$DOT"/src/"$1"/config.mk .
    [ -f "$DOT"/src/"$1"/config.h  ] && cp -f "$DOT"/src/"$1"/config.h  .
    [ -f ./autogen.sh              ] && ./autogen.sh
    [ -f ./configure               ] && ./configure --prefix="$PREFIX"

    make
    make prefix="$PREFIX" PREFIX="$PREFIX" install clean
    return 0
}

Each build.sh script has:

- One commented line with a description.
- One tar variable with a URL to the .tar.gz source archive.
- Possibly a build() function, if the default one does not work.
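
For example, a hypothetical build.sh (project name and URL made up) for something that only needs a local config.h and make could be as short as:

Code:
# example - a made-up project that ships with a custom config.h
tar=https://example.org/example-1.0.tar.gz

build()
{
    cp -f "$DOT"/src/example/config.h .
    make PREFIX="$PREFIX" install clean
}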

pranomostro
Long time nixers
I automated my backups, for example (yes, I wrote my own script for that. Shame on me.).

When thinking about writing a general script, I try to abstract my task and find the smallest and fastest possible solution (hint: don't spawn too many processes), and then I write it down.

Most of my commands are just saved in my command-line history, and my shell (fish) has a really good search function and autosuggestions, which make it easy to find whatever I want.

I don't really have any scripts running constantly in the background, but the scripts I wrote can be found at
https://github.com/pranomostro/script

I use shell functions for aliasing stuff; everything more complex than adding a default flag or wrapping with rlwrap deserves its own script.
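
For instance (sketched in plain sh here, the wrapped program is just an example):

Code:
# simple enough to stay a function: readline editing around a REPL;
# "command" bypasses the function so it does not call itself
ocaml() { rlwrap command ocaml "$@"; }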

For building, I use make (it's everywhere) and also shell scripts.
venam
Administrators
(17-06-2016, 11:26 PM)pranomostro Wrote: I automated my backups, for example (yes, I wrote my own script for that. Shame on me.).

Same here, just a concatenation of some rsync commands.
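
Something in that spirit, roughly (a sketch with made-up paths and host, not the actual script):

Code:
#!/bin/sh
# mirror a few directories to a backup host; -a keeps permissions and
# timestamps, --delete propagates removals (paths/host are examples)
rsync -a --delete "$HOME/documents/" backup-host:backup/documents/
rsync -a --delete "$HOME/projects/"  backup-host:backup/projects/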
josuah
Long time nixers
(18-06-2016, 06:19 AM)venam Wrote: some rsync commands

For what I need, rsync is overkill, and it needs to be installed/compiled on the target platform.

With a shell script, I can keep it along with my data on my USB key, and run it on any UNIX.

Code:
lsync(1)                        Mirroring utility                        lsync(1)

NAME
        LSync - Mirror two local dirs keeping the latest version

SYNTAX
        lsync [-v] [-e PATTERN] DIR1 DIR2

DESCRIPTION
        -e      Exclude PATTERN from the search, matching the whole path.
        -v      View the output without copying anything.
        DIR     The two directories to mirror.

USAGE
        LSync mirrors data across two directories so that both contain all the
        files from each other, keeping the timestamps.

        When a file exists in both directories but they do not have the same
        timestamp, the most recent file overwrites the other.

        There is no deletion performed.

Example of output it produces on screen:

[Image: 7SSexFz.png]

[EDIT]: That way, I can grep the output for '>>>' or ' > ' to see what is overwritten or copied from left to right, or just '>' to see everything that goes from left to right...
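
For instance:

Code:
# dry run, showing only what would go from DIR1 to DIR2
lsync -v dir1 dir2 | grep '>'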

It only contains these commands plus shell builtins: mkdir, find, cut -c, sed, sort, uniq, tee, grep -v, cp -p, cp -pf.
But I would not recommend using it in production, as I am a beginner and there may be many flaws in it.

The script: lsync
Code:
#!/bin/sh
#    /\
#   / / ____ __  /\ ____   ____
#  / / / __// / / // __ \ / ___\
# / /__\ \  \ \/ // / / // /__
# \/ \___/ __\  / \/  \/ \____\ - Sync from two local dirs
#==========\__,'================================================================

[ $# -lt 2 ] || [ $# -gt 5 ] && printf '%s\n' '
NAME
        LSync - Mirror two local dirs keeping the latest version

SYNTAX
        lsync [-v] [-e PATTERN] DIR1 DIR2

DESCRIPTION
        -e      Exclude PATTERN from the search, matching the whole path.
        -v      View the output without copying anything.
        DIR     The two directories to mirror.

USAGE
        LSync mirrors data across two directories so that both contain all the
        files from each other, keeping the timestamps.

        When a file exists in both directories but they do not have the same
        timestamp, the most recent file overwrites the other.

        There is no deletion performed.
' && exit 0

while [ $# -gt 0 ]
do case "$1" in
                '-e' ) p="$2"; shift                            ;;
                '-v' ) v=1                                      ;;
                *    ) [ -z "$dir1" ] && dir1="$1" || dir2="$1" ;;
        esac
        shift
done

[ ! -d "$dir1" ] && mkdir -p "$dir1"
[ ! -d "$dir2" ] && mkdir -p "$dir2"

copied=0; overwritten=0; directories=0; identical=0
# Generate a path list for all the files from both directories
{
        find "$dir1" | cut -c $((${#dir1} + 1))-
        find "$dir2" | cut -c $((${#dir2} + 1))-
} | sed 's/^\///' | sort | uniq | {
        [ -z "$p" ] && tee || grep -v "$p"
} | {
        while read -r path
        do
                # Create directories
                if [ -d "$dir1/$path" ] || [ -d "$dir2/$path" ]
                then [ -z "$v" ] && mkdir -p "$dir1/$path" "$dir2/$path"
                        printf '\033[1;30m      %s\033[0m\n' "$path"
                        directories=$(($directories + 1))

                # Copy files that do not exist on one side
                elif [ ! -e "$dir2/$path" ]
                then [ -z "$v" ] && cp -p  "$dir1/$path"  "$dir2/$path"
                        printf '\033[1;32m1 > 2\033[0m %s\n' "$path"
                        copied=$(($copied + 1))

                elif [ ! -e "$dir1/$path" ]
                then [ -z "$v" ] && cp -p  "$dir2/$path"  "$dir1/$path"
                        printf '\033[1;32m1 < 2\033[0m %s\n' "$path"
                        copied=$(($copied + 1))

                # Overwrite files keeping the latest version
                elif [ "$dir1/$path" -nt "$dir2/$path" ]
                then [ -z "$v" ] && cp -pf "$dir1/$path"  "$dir2/$path"
                        printf '\033[1;31m1>>>2\033[0m %s\n' "$path"
                        overwritten=$(($overwritten + 1))

                elif [ "$dir2/$path" -nt "$dir1/$path" ]
                then [ -z "$v" ] && cp -pf "$dir2/$path"  "$dir1/$path"
                        printf '\033[1;31m1<<<2\033[0m %s\n' "$path"
                        overwritten=$(($overwritten + 1))

                else printf '\033[1m1 = 2\033[0m %s\n' "$path"
                        identical=$(($identical + 1))
                fi
        done

        printf ' \   \    %-30s \033[1;32m%5s\033[0m copied
  \   \__ %-30s \033[1;31m%5s\033[0m overwritten
   \_____ %-30s \033[1;39m%5s\033[0m identical
          %-30s \033[1;30m%5s\033[0m directories\n' \
                ''      "$copied"      \
                "$dir2" "$overwritten" \
                "$dir1" "$identical"   \
                ''      "$directories"
}
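
A hypothetical invocation (paths and exclude pattern are just examples):

Code:
# mirror a local directory with its copy on a USB key, ignoring .git
lsync -e '\.git' "$HOME/doc" /mnt/usb/doc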
PS:
(17-06-2016, 11:26 PM)pranomostro Wrote: https://github.com/pranomostro/script
Oh, so you keep a fork bomb in your scripts, interesting. :)
pranomostro
Long time nixers
One can never know when a bomb may be needed.

Edit: btw, lsync looks really good.