Backups are for weaklings! - Desktop Customization & Workflow

shtols
Long time nixers
Caught your attention? Good. My attitude towards making regular, tested backups is pretty sloppy. I usually use ugly shell scripts that write my data to USB sticks via cron job, but more often than not I don't make any backups at all on new machines for weeks or even months. Yes, I know how stupid that is.

As I probably deserved, this came back to bite me in the arse and I lost some (more or less important) data a few days ago. I took that as a hint to finally find a backup strategy that doesn't consist solely of "#yolo". My question is: how do you do your backups?

I know of tools like AMANDA, Arkeia or Bacula, but they are a bit overkill for my scenario (a couple of boxes connected to the Internet, plus a notebook). On the other hand, going back to writing shell scripts is something I don't like either... halp!
Phyrne
Long time nixers
If it's ZFS, incremental snapshots and clones. For everything else, rsnapshot! :D
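The ZFS route might look something like the sketch below: snapshot, then send only the delta since the last snapshot to a backup pool. The dataset names are made up, and DRYRUN=echo just prints the commands instead of running them, so nothing is touched until you've checked it.

```shell
#!/bin/sh
# Sketch of the ZFS snapshot + incremental send idea.
# "tank/home" and "backup/home" are hypothetical datasets;
# set DRYRUN= (empty) to actually run the commands.
DRYRUN=echo
DATASET=tank/home
BACKUP=backup/home
TODAY=$(date +%Y-%m-%d)
YESTERDAY=$(date -d yesterday +%Y-%m-%d)   # GNU date syntax

# Take today's snapshot
$DRYRUN zfs snapshot "$DATASET@$TODAY"

# Incremental stream: only the changes between the two snapshots
$DRYRUN sh -c "zfs send -i $DATASET@$YESTERDAY $DATASET@$TODAY | zfs recv $BACKUP"
```

Clones then give you writable copies of any snapshot without extra space.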
jobss
Long time nixers
I back up everything that is not private to a git server, and the few private things I have I just pop in a folder and tell the OS to copy it to a USB stick.
The world is quaking from our Linux Thoughts!
venam
Administrators
I back up the dot files that aren't private on GitHub and simply copy the rest of the important files to a USB stick. I don't have any big data other than my music, so I'd rather copy it all instead of writing a script that automates it, but if I had an important DB running on my laptop, I'd make the process a cron job.
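That "copy the important stuff to a USB stick" step is easy to hand to cron. A minimal sketch, with a made-up mount point and DRYRUN=echo so the command is only printed until you've checked it:

```shell
#!/bin/sh
# Cron-friendly USB copy sketch. /media/usb is a made-up mount
# point; add something like
#   0 * * * * /home/me/bin/usb-backup.sh
# to your crontab and drop the echo once it looks right.
DRYRUN=${DRYRUN:-echo}
USB=${USB:-/media/usb}

if [ -d "$USB" ]; then
    $DRYRUN rsync -a --delete "$HOME/Documents/" "$USB/Documents/"
else
    echo "USB not mounted, skipping backup" >&2
fi
```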
crshd
Registered
Like others here, I recently started to keep my dotfiles on github. Then I got some important documents on Google Drive. That's about it. My backup strategy is seriously lacking... I need to get a NAS going with some RAID.
kirby
Long time nixers
...Backups?

I do literally nothing, I don't have anything to put them on. The only stuff that really needs saving is school stuff, and I just dump that on a USB or transfer it to my Raspberry Pi and let it sit there should anything happen.
ajac
Members
i have some (now outdated) config files on a usb stick for my laptop... but come to think of it my desktop with its music collection is not backed up.
zygotb
Long time nixers
As much as *buntu is maligned among the elitists, Ubuntu One gives 5 GB (or more) of free online storage in their silos.
With the Ubuntu One app you can keep your selected files and directories automatically uploaded and synced, if you don't mind having *buntu "phone home" all the freaking time.

The web interface works with other OS besides *buntu, and free storage of my stuff online makes the stuff available to all of my devices, as well as the NSA's!

:D
eksith
Long time nixers
While that shell script + cron solution gets rained on a lot, by and large, it's quite an effective method provided nothing else breaks. It's possible to write scripts that are quite robust with error checking, conflict detection, checksums etc... It all comes down to how important your work really is. If you can survive without it, then it's toward the 'less' important end ;)

I'm currently running a web server in the test phase and the forum script auto-locks and archives threads as html files after a certain period without replies. This is significantly easier on resources since I don't need to hit a database to serve essentially static content. Basically the URL is SHA1 hashed and that gets split a few times to make the directories where the html file goes [ don't want 8000 files in one folder ;) ]. Now this kind of thing is ideal for the cp + cksum treatment via shell script and cron. Does it fail occasionally? Of course! But that's why it will try again and if it fails one more time, it will send me an email.

Basically...
Code:
#!/bin/bash
MESSAGE="/tmp/bkpstatus2013_10_26.txt"
/bin/mail -s "Backup status today" "admin@mail" < "$MESSAGE"
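Fleshed out, the cp + cksum + retry part might look something like this sketch. The paths, retry count and mail address are all invented; only the mail-on-failure step mirrors the snippet above.

```shell
#!/bin/sh
# Hypothetical cp + cksum backup with one retry and a status file.
STATUS=$(mktemp)

backup_file() {
    src=$1; dst=$2; tries=0
    while [ "$tries" -lt 2 ]; do
        cp "$src" "$dst" 2>> "$STATUS"
        # cksum prints CRC and byte count; compare both against the source
        if [ "$(cksum < "$src")" = "$(cksum < "$dst")" ]; then
            return 0
        fi
        tries=$(( tries + 1 ))
    done
    echo "backup of $src failed after $tries tries" >> "$STATUS"
    return 1
}

src=$(mktemp); echo "important data" > "$src"
if ! backup_file "$src" "$src.bak"; then
    # Only mail when something went wrong, as in the original script
    /bin/mail -s "Backup status today" "admin@mail" < "$STATUS"
fi
```

rsync with --checksum does much the same job with less ceremony, if it's available.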

The hiccups come if you want complex versioning of sorts with your backups. This is especially true of constantly edited files like office documents or Photoshop/Gimp files. The best in that case (again, depending on how important this stuff is to you) is something like FreeNAS.
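Short of FreeNAS, the usual trick for cheap versioning is hard-link rotation, which rsnapshot automates with rsync --link-dest. A toy sketch with plain GNU coreutils (directory names invented):

```shell
#!/bin/sh
# Toy hard-link rotation: link yesterday's tree, then refresh only
# changed files, so unchanged versions share disk space.
SRC=$(mktemp -d); BKP=$(mktemp -d)

rotate() {
    rm -rf "$BKP/daily.2"
    [ -d "$BKP/daily.1" ] && mv "$BKP/daily.1" "$BKP/daily.2"
    # cp -al makes hard links: old snapshots cost almost nothing
    [ -d "$BKP/daily.0" ] && cp -al "$BKP/daily.0" "$BKP/daily.1"
    mkdir -p "$BKP/daily.0"
    # -u copies only newer files; --remove-destination unlinks first,
    # so the previous snapshot's hard link keeps the old content
    cp -au --remove-destination "$SRC/." "$BKP/daily.0/"
}

echo "v1" > "$SRC/doc.txt"
rotate                     # first snapshot holds v1
sleep 1                    # make sure the edit gets a newer mtime
echo "v2" > "$SRC/doc.txt"
rotate                     # daily.0 has v2, daily.1 still has v1
```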

I'm allergic to most things "cloud" ;) "Cloud" turns computing and backups into "magic", pretty much, and I don't believe in "magic" ;) My stance on that is simply: *You do not own what you cannot control*. It's nice to think storage solutions like Dropbox, Google Drive, Ubuntu One etc... or proper backup services like Carbonite et al. are there for you, but you don't know how or where they store your stuff. If it's just family pics and stuff, it's probably not a big deal, but more sensitive things really don't belong there. Add to that the current atmosphere where authorities basically have the gall to say they can peek at your junk because you're not the one storing it, and you have a recipe for mistrust that leaves quite a bad case of indigestion.

Of course, I'm also keeping backups for other folks who use my services (don't know what they're storing and don't care) so if I were to lose any of it, I'd need a one way ticket to Mexico :P
jobss
Long time nixers
(26-10-2013, 06:14 PM)eksith Wrote: While that shell script + cron solution gets rained on a lot, by and large, it's quite an effective method provided nothing else breaks. [...] Does it fail occasionally? Of course! But that's why it will try again and if it fails one more time, it will send me an email.

This is the smallest example I have seen that does what you wanted to do.
The world is quaking from our Linux Thoughts!
eksith
Long time nixers
lol That's just the status dump to my email :P It only runs if something went wrong and there's a status message for me.

If you're curious about the actual code that creates archives from a given path and content info, it's as follows (it doesn't have the actual forum-related script yet; maybe I'll post that in the PHP board when it's done)

*Edit* Forgot to add, the relevant "path splitting" happens in *filePath* and archive checking/insertion happens in *archive*, of course
Code:
<?php
/**
* File storage, retrieval operations including file uploading
* and caching.
*
* This class depends on the ARCHIVE, CACHE, UPLOAD and CACHE_TIME constants
* E.G. define( 'ARCHIVE', '/data/archive' );
* E.G. define( 'CACHE', '/data/cache' );
* E.G. define( 'UPLOAD', '/data/uploads' );
* E.G. define( 'CACHE_TIME', 3600 ); // Cache lifetime in seconds
*
* CAUTION: This is a super beta script!
*
* @author Eksith Rodrigo
* @version 0.1
*/
class File {
    
    /**
     * Handles file uploading via PUT
     *
     * @param string $name Unique uploading file name
     * @param string $file Location which will be hashed
     */
    public static function putUpload( $name, $file ) {
        $uploaded    = array();
        
        if ( !( $put = fopen( 'php://input', 'r' ) ) ) {
            return $uploaded;
        }
        
        $src        = self::filePath( UPLOAD, $file ) . $name;
        $src        = self::duplicateName( $src );
        
        $f        = fopen( $src, 'w' );
        
        while( $data = fread( $put, 1024 ) ) {
            fwrite( $f, $data );
        }
        
        fclose( $f );
        fclose( $put );
        
        $uploaded[] = array( $src, sha1_file( $src ) );
        return $uploaded;
    }
    
    
    /**
     * Handles file (single or multi-file) uploading via POST
     *
     * @param string $name Unique file path (usually URL)
     * @param string $file Uploading file field
     */
    public static function postUpload( $name, $file ) {
        $files        = array();
        $uploaded    = array();
        $src        = self::filePath( UPLOAD, $name );
        
        /**
         * Multi-file upload
         */
        if ( is_array( $_FILES[$file]['tmp_name'] ) ) {
            $files = self::groupProps( $_FILES[$file] );
        
        /**
         * Single file upload
         */
        } else {
            $files[] = $_FILES[$file];
        }
        
        
        foreach( $files as $f ) {
            
            $label = self::duplicateName( $src . $f['name'] );
            
            /**
             * Put each uploaded file into the upload path and append the
             * new file name and its checksum
             */
            if ( move_uploaded_file( $f['tmp_name'], $label ) ) {
                
                $uploaded[] = array( $label, sha1_file( $label ) );
            }
        }
        
        return $uploaded;
    }
    
    
    /**
     * Renames a file from filename.ext to filename(1).ext etc... to
     * ensure we don't have naming conflicts with existing files
     *
     * @param string $path File path E.G. /upload/file.jpg
     * @returns If path already exists, returns /upload/file(1).jpg
     */
    public static function duplicateName( $path ) {
        $info = pathinfo( $path );
        
        /**
         * Account for files without extensions and files like .htaccess
         */
        $ext  = isset( $info['extension'] )? '.' . $info['extension'] : '';
        $name = isset( $info['filename'] )? $info['filename'] : '';
        
        $i = 1;
        while ( file_exists( $path ) ) {
            $path = $info['dirname'] . DIRECTORY_SEPARATOR .
                $name . "($i)" . $ext;
            $i++;
        }
        
        return $path;
    }
    
    
    /**
     * Handles file cache storage and retrieval.
     * Data will be stored in json format. Expired data will be deleted.
     *
     * @param string $name File name
     * @param string $data Cache information ( array or object )
     * @returns mixed True if stored successfully, false if not.
     *             Stored item if retrieving
     */
    public static function cache( $name, $data ) {
        $src = self::filePath( CACHE, $name ) . '.json';
        
        if ( true  === $data && file_exists( $src ) ) {
            unlink( $src ); // Delete command data
            return false;
        }
        
        /**
         * Getting cache
         */
        if ( null === $data || '' === $data ) {
            
            if ( file_exists( $src ) && ( $mt = filemtime( $src ) ) ) {
                
                /**
                 * If cache has expired, delete the old file and return
                 */
                if ( $mt + CACHE_TIME < time() ) {
                    unlink( $src );
                    return false;
                } else {
                    return json_decode( file_get_contents ( $src ), true );
                }
            }
        
        // Setting cache
        } else {
            $data = json_encode( $data, JSON_UNESCAPED_UNICODE );
            
            // file_put_contents returns false on failure; it doesn't throw
            return ( false !== file_put_contents( $src, $data ) );
        }
        
        return false;
    }
    
    
    /**
     * Creates or retrieves an HTML file in the archive path
     * This function is WRITE-ONCE! It will not replace an existing archive and
     * will send the existing archive instead.
     * CAUTION: This function calls File::sendFile which will exit the script if
     *         an archive is found
     *
     * @param string $name Archive name ( URL of current request usually )
     * @param string $data Fully formatted HTML content to store
     */
    public static function archive( $name, $data ) {
        $src = self::filePath( ARCHIVE, $name ) . '.html';
        
        if ( null === $data || '' === $data ) {
            /**
             * No content given, so this is a read request.
             * If no archive was sent, the script will continue
             */
            self::sendFile( $src, 'html' );
        } else {
            if ( file_exists( $src ) ) {
                die( 'Cannot append to existing archive' );
            }
            
            // file_put_contents returns false on failure; it doesn't throw
            if ( false === file_put_contents( $src, $data ) ) {
                exit();
            }
        }
    }
    
    /**
     * Sends content directly to the client in the requested format.
     * CAUTION: This function exits the script following completion.
     *
     * @param string $src File source
     * @param string $type File type ( html, jpg, gif etc...)
     * @param string $name The filename to specify to the visitor
     *         The actual filename will be used if unspecified
     * @param boolean $attach Force file download as an attachment
     * @param object $exp File expiration in datetime format
     */
    public static function sendFile( $src, $type = 'html', $name = null, $attach = false, $exp = null ) {
        if ( true === file_exists( $src ) ) {
            switch( $type ) {
                case 'gif':
                    header( 'Content-Type: image/gif' );
                    break;
                    
                case 'jpg':
                case 'jpeg':
                    header( 'Content-Type: image/jpeg' );
                    break;
                    
                case 'png':
                    header( 'Content-Type: image/png' );
                    break;
                    
                case 'pdf':
                    header( 'Content-Type: application/pdf' );
                    break;
                    
                case 'html':
                case 'txt':
                    header( 'Content-Type: text/html' );
                    break;
                    
                default:
                    header( 'Content-Type: application/octet-stream' );
                    break;
            }
            
            if( null === $exp ) {
                $exp = new DateTime("now + 11 months");
            }
            
            header( 'Expires: ' . $exp->format( DateTime::RFC1123 ) );
            
            header( 'Content-Length: ' . ( string ) filesize( $src ) );
            if ( true === $attach ) {
                $name = empty( $name )? basename( $src ) : $name;
                header( 'Content-Disposition: attachment; filename="' . $name . '"' );
            }
            
            $f = ( 'html' === $type || 'txt' === $type )?
                fopen( $src, 'r' ) : fopen( $src, 'rb' );
            
            fpassthru( $f );
            exit();    
        }
    }
    
    /**
     * Creates and/or returns an SHA1-based file path from the given root and identifier
     * E.G. /path/d0b/e2d/c42/1be/4fc/d01/d0be2dc421be4fcd0172e5afceea3970e2f3d940
     * Where /path is whatever is specified in $root
     *
     * @example The following regex will match this file path in the default settings :
     * ^(\/path\/([0-9a-z]{3})\/([0-9a-z]{3})\/([0-9a-z]{3})\/([0-9a-z]{3})
     * \/([0-9a-z]{3})\/([0-9a-z]{3})\/([0-9a-z]{40}))$
     *
     * @param $root string Root directory
     * @param $name string File name/identifier
     * @param $create bool Create the directory path if it doesn't exist
     * @param $dlen int Stub directory length for the tree
     * @param $depth int Maximum number of directories to nest
     */
    public static function filePath( $root, $name, $create = true, $dlen = 3, $depth = 6 ) {
        $h = sha1( $name );
        $p = str_split( $h, $dlen );
        $p = array_slice( $p, 0, $depth );

        $t = $root . DIRECTORY_SEPARATOR .
            implode( DIRECTORY_SEPARATOR, $p ) . DIRECTORY_SEPARATOR;
        
        if ( $create && !is_dir( $t ) ) {
            $s = $root . DIRECTORY_SEPARATOR;
            foreach ( $p as $d ) {
                $s .= $d . DIRECTORY_SEPARATOR;
                if ( is_dir( $s ) ) { continue; }
                
                /**
                 * Read, write and traverse for the owner, nothing for
                 * everybody else ( directories need the execute bit )
                 */
                mkdir( $s, 0700 );
            }
        }
        
        return $t . $h;
    }
    
    /**
     * Rearranges file names and properties
     *
     * @example Turns :
     *    array(
     *        'name' => array( file1.txt, file2.html ),
     *        'type' => array( text/plain, text/html )
     *    )
     *    
     * Into :
     *    array(
     *        array(
     *            'name' => file1.txt,
     *            'type' => text/plain
     *        ),
     *        array(
     *            'name' => file2.html,
     *            'type' => text/html
     *        )
     *    )
     *
     * @param array $raw Will be the $_FILE global array
     * @returns array
     */
    public static function groupProps( $raw ) {
        $items = array();
        foreach( $raw as $name => $item ) {
            foreach( $item as $prop => $value ) {
                $items[$prop][$name] = $value;
            }
        }
        return $items;
    }
}
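For the curious, the path splitting in *filePath* is easy to mimic in shell; here's a rough equivalent with a made-up root directory:

```shell
#!/bin/sh
# Rough shell equivalent of File::filePath above: sha1 the name,
# split the hash into 3-character stubs and nest the first 6 as
# directories. /data/archive is just an example root.
file_path() {
    root=$1; name=$2
    dlen=3; depth=6
    h=$(printf '%s' "$name" | sha1sum | cut -d' ' -f1)
    p=$root
    rest=$h
    i=0
    while [ "$i" -lt "$depth" ]; do
        p="$p/$(printf '%s' "$rest" | cut -c1-$dlen)"
        rest=$(printf '%s' "$rest" | cut -c$(( dlen + 1 ))-)
        i=$(( i + 1 ))
    done
    printf '%s/%s\n' "$p" "$h"
}

file_path /data/archive "abc"
# /data/archive/a99/93e/364/706/816/aba/a9993e364706816aba3e25717850c26c9cd0d89d
```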
c107
Members
(26-10-2013, 06:14 PM)eksith Wrote: authorities basically have the gall to say they can peek at your junk because you're not storing it and you have a recipe for mistrust that leaves quite a bad case of indigestion.

Alas... Tell me about it.