
Best way for on-system backup?



#1 Ravenbar

Ravenbar

  • Members
  • 125 posts
  • OFFLINE
  •  
  • Gender:Male
  • Location:NY
  • Local time:08:14 PM

Posted 10 December 2016 - 08:58 PM

Just getting my system back on its feet after a major data loss. Only one file, the most important and most-used one, was lost without a current backup, and I've just finished recreating the lost data. My downfall was relying too heavily on the RAID for redundancy, and my secondary backup (done manually) wasn't current enough (6 weeks old).

 

I have three 320GB disks that I want to mirror each other, without using RAID. The obvious solution would be to copy the contents of the user drive onto the other two drives at regular intervals (automated), but that would take a lot of system resources, since the 320GB drive currently holds around 240GB of data. There has to be software (GUI preferred) that will compare one drive to the other two and copy only the changes. Long-term unused files are manually copied over to a fourth 1TB drive that is for archival only, to keep the main storage drive from filling up.

 

OS is Linux Mint 17.3 KDE.
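For the record, the compare-and-copy behaviour being asked about is essentially what rsync does; a minimal sketch, assuming the user drive is mounted at /mnt/user and the two mirrors at /mnt/backup1 and /mnt/backup2 (all hypothetical mount points):

# only changed files are transferred; --delete keeps the mirrors exact copies
rsync -aH --delete /mnt/user/ /mnt/backup1/
rsync -aH --delete /mnt/user/ /mnt/backup2/

The first run copies everything; later runs only touch files that changed, so the 240GB is not re-copied each time.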


Desktops: "John2" Custom, Gigabyte F2A88Xm-D3H, AMD 6A-5400K Trinity 3.6Ghz Dual-Core APU, 16Gb DDR3  HyperX Fury 1866Mhz RAM, 120Gb Crucial Force LS SSD OS) Linux Mint 17.3, 320Gb Raid1 array consisting of (1) Seagate ST320LT020-9YG14 & (1) Fujitsu MZH2320B

Francisco: HP pavilion p7-1080t upgraded with 16Gb ram. Windows 7. Used only for Gaming

Server.GaltsGulch: HP Elite 8300 Small Form Factor, i7-3770, 16Gb ram, Kingston SSDNow 120Gb SSD, 3Tb storage HDD, Fedora Linux/Avahi, Headless

 



 


#2 bob007

bob007

  • Members
  • 42 posts
  • OFFLINE
  •  
  • Gender:Male
  • Location:Australia
  • Local time:10:14 AM

Posted 11 December 2016 - 12:29 AM

Ravenbar,

 

I'm new to Linux but have run Windows for many years, and I doubt software exists that will do what you want in either Windows or Linux... but I could be wrong. I will never understand why people go in for RAID arrays either... get a virus (Windows) or have a HDD fail and it's goodbye RAID and all your info... not to mention having the same, or part of the same, info on all drives.

 

I also don't trust auto-backups either...I prefer to do them myself. I have found the best way to back up is to use Timeshift on system files once a month, and Macrium Reflect every few months to create a system image (everything on the HDD) and copy it to an external HDD.
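As an aside, a manual Timeshift run like that can also be started from the command line in recent versions; a rough sketch (the --comments text is just a label):

sudo timeshift --create --comments "monthly manual system snapshot"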

 

By doing this I'm covered for any disaster that may occur... I now know that it's not that big a deal to re-install, say, Mint 18... but you then still have to install software like VirtualBox and your virtual machines, Google Chrome plus the add-ons, Firefox settings and add-ons, Timeshift, etc. all over again.

 

The other alternative is to clone the OS HDD to an equal-size HDD with Macrium Reflect... I've done this in Windows many times but not in Linux... anyway, I hope this is of help to you.  :thumbup2:



#3 oregonjohn

oregonjohn

  • Members
  • 1 posts
  • OFFLINE
  •  
  • Local time:04:14 PM

Posted 11 December 2016 - 04:07 AM

Disaster recovery-- on swappable drives

Daily night backups -- on swappable drives

Periodic daily backups -- on fast system drives

Test your backups regularly

 

Disaster recovery

  • The primary thing here is to get your system backed up and off site so you can rebuild it if necessary.
  • Copy your system partition (you should have your data on a separate partition to make this easier)
  • Clonezilla is the software to use, but it requires booting into Clonezilla.
  • My experience so far is that a Clonezilla image will go onto any system with a similar architecture (32- or 64-bit), and that the bother with any other, different hardware is minimal if any (advanced graphics cards might require some adjustments). You could also use the image to clone your system, but that will also require changing your mounts (fstab) and other system settings like the server name (further research is recommended; see the sketch below).
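A rough sketch of those post-clone adjustments, assuming the clone's root partition shows up as /dev/sdb1 and is mounted at /mnt/clone (both hypothetical):

# find the UUID of the cloned partition
sudo blkid /dev/sdb1
# update /etc/fstab on the clone to point at the new UUID
sudo nano /mnt/clone/etc/fstab
# give the clone its own name if it will run alongside the original
sudo nano /mnt/clone/etc/hostname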

 

Daily night backups

  • The primary thing here is to get your data off site, or at least off the system machine and disconnected from power.
  • You could use these same drives to do your system backup.
  • I use two hard drives, one leaves at night and is swapped back the next morning.
  • I use Lucky Backup for this nightly backup.
  • Occasionally I use Lucky Backup for my system files (I exclude what they recommend, but also /media, /mnt, and /home, which is on my data partition).
  • Lucky Backup can be set to keep versions (4 versions per disc, 2 discs, 8 days of data and system backups).
  • Lucky Backup is set to send me an email each night listing any errors. If I don't get the email, something went wrong; if the attached log file is only a couple of KB, I also know something is wrong, because it didn't find any files to even check for changes (a rough cron-based equivalent is sketched after this list).
  • Development of Lucky Backup is frozen (per the developer) but it runs fine, and I have yet to find a replacement as good.
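For anyone without luckyBackup, a rough sketch of an equivalent nightly run with plain rsync, cron, and mail (the paths and address are placeholders, and the mail command needs a working MTA):

#!/bin/sh
# /usr/local/bin/nightly-backup.sh - mirror data to the swappable drive, then mail the log
LOG=/var/log/nightly-backup.log
rsync -aH --delete /home/data/ /mnt/swapdrive/data/ > "$LOG" 2>&1
mail -s "Nightly backup $(date +%F)" admin@example.com < "$LOG"

# crontab entry to run it every night at 01:00:
# 0 1 * * * /usr/local/bin/nightly-backup.sh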

 

Periodic daily backups

  • The primary thing here is to recover deleted files, or restore files from before you made changes.
  • Since this runs every 10 minutes for me I use a fast drive directly connected to the system.
  • I use Back In Time for this. It has an easier restore graphical interface (GUI) than Lucky Backup.
  • With Back In Time I get snapshots of the data. I set it to keep 10-minute increments for three days, daily increments for 10 days, and weekly increments for 12 months. It only backs up if there have been changes (the underlying snapshot trick is sketched after this list).
  • I only use this for data, since my system files don't change that often.
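Roughly speaking, these snapshot tools build hard-linked snapshots on top of rsync's --link-dest option; a minimal sketch with placeholder paths:

# unchanged files become hard links into the previous snapshot,
# so each snapshot looks complete but costs almost no extra space
rsync -a --delete --link-dest=/mnt/backup/2016-12-10 /home/data/ /mnt/backup/2016-12-11/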

 

Those last two programs are based on rsync, which is a powerful backup program with hundreds of options, so it's easier to learn the GUI programs than to use rsync from the command line.

 

Test your backups regularly

  • Open your backup program and try to find a file.
  • Try to restore a file.
  • Create a file one day, then check that it got backed up the next day.
  • Check the health of your backup drives (a quick sketch of both checks follows this list).
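One cheap way to do those last two checks, with placeholder paths (smartctl comes from the smartmontools package):

touch ~/data/backup-canary-$(date +%F).txt        # create a dated marker file today
ls /mnt/backup/data/backup-canary-*.txt           # look for it on the backup drive tomorrow
sudo smartctl -H /dev/sdb                         # quick SMART health check of the backup drive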

No backup system is foolproof, but this has gotten me out of a lot of jams. If my periodic backup missed a file or folder because I failed to include it in the routine (there are a lot of folders that don't change often, and I don't waste the CPU time checking them), it was found in my daily backup, because that one includes every folder in my data partition.

 

Good luck. John in Oregon


Edited by oregonjohn, 11 December 2016 - 04:20 AM.


#4 Ravenbar

Ravenbar
  • Topic Starter

  • Members
  • 125 posts
  • OFFLINE
  •  
  • Gender:Male
  • Location:NY
  • Local time:08:14 PM

Posted 11 December 2016 - 08:25 AM

I think I've found the program that will do what I want. MintBackup, which I already have installed on the system, seems to be able to do it, except for the scheduling part, as far as I can tell.

 

I have no data on the system that changes so often that I couldn't easily recreate up to a week's worth in less than an hour. Mostly the data consists of multimedia files which are, in function, read-only, with few being added. MintBackup compares the old files with the new and only copies changed files to the backup drive.
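If the scheduling really is missing, one workaround is to let cron drive the same compare-and-copy with rsync; a sketch with placeholder mount points:

# crontab -e : every Sunday at 03:00, mirror the user drive to both backup drives
0 3 * * 0 rsync -a --delete /mnt/user/ /mnt/backup1/ && rsync -a --delete /mnt/user/ /mnt/backup2/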


Desktops: "John2" Custom, Gigabyte F2A88Xm-D3H, AMD 6A-5400K Trinity 3.6Ghz Dual-Core APU, 16Gb DDR3  HyperX Fury 1866Mhz RAM, 120Gb Crucial Force LS SSD OS) Linux Mint 17.3, 320Gb Raid1 array consisting of (1) Seagate ST320LT020-9YG14 & (1) Fujitsu MZH2320B

Francisco: HP pavilion p7-1080t upgraded with 16Gb ram. Windows 7. Used only for Gaming

Server.GaltsGulch: HP Elite 8300 Small Form Factor, i7-3770, 16Gb ram, Kingston SSDNow 120Gb SSD, 3Tb storage HDD, Fedora Linux/Avahi, Headless

 


#5 pcpunk

pcpunk

  • Members
  • 6,008 posts
  • OFFLINE
  •  
  • Gender:Male
  • Location:Florida
  • Local time:08:14 PM

Posted 11 December 2016 - 10:41 AM

Take a look at this; one nice thing is that it's included in the repos, so just go to SM and install it. I think it also supports scheduled syncs. I use this for my file syncs, mostly just my data, then use Timeshift for system/root backups. I want to learn more but time is an issue, haha. Grsync is also cross-platform, just in case you need that for the Dark Side.

http://www.opbyte.it/grsync/

 

Some alternatives that are also based on rsync:

http://backintime.le-web.org/documentation/

LuckyBackup

http://lifehacker.com/5617415/luckybackup-makes-backups-and-syncing-easy-on-linux

 

One of those should be a solution for you. There are a lot of tools out there; it's just a matter of which GUI you like best and which format it uses. rsync seems to be the most popular.


Edited by pcpunk, 11 December 2016 - 11:00 AM.


 

KDE, Ruler of all Distro's

eps2.4_m4ster-s1ave.aes_pcpunk_leavemehere

 


#6 wizardfromoz

wizardfromoz

  • Banned
  • 2,799 posts
  • OFFLINE
  •  
  • Gender:Male
  • Local time:10:14 AM

Posted 13 December 2016 - 01:31 AM

Hi Ravenbar

 

You might want to take a look at BackupPC http://backuppc.sourceforge.net/index.html  which says that it "de-dupes" (sounds like incremental backup to me). It uses a web-based UI and is built on rsync.

 

Also have a read of Tony George's products at http://teejeetech.in  - Tony's products include Timeshift, and Aptik. Timeshift is likewise based on rsync.

 

I have a tutorial on Aptik here https://www.bleepingcomputer.com/forums/t/612761/aptik-move-your-linux-to-a-different-drive-or-computer/  which may give you a few ideas.

 

@oregonjohn:

 

:welcome: to Bleeping Computer and to the Linux & Unix section, hope you enjoy your stay here.

 

 

 

:wizardball: Wizard



#7 cat1092

cat1092

    Bleeping Cat


  • BC Advisor
  • 7,018 posts
  • OFFLINE
  •  
  • Gender:Male
  • Location:North Carolina, USA
  • Local time:08:14 PM

Posted 15 December 2016 - 05:22 AM

 

 

I also don't trust auto-backups either...I prefer to do them myself.  

 

Same here, I use a very simple method. Simply attach an external with enough free space; in the Home folder there are several main folders with content. I simply drag & drop these to an external once weekly, and I prefer to keep the last two copies and rotate externals for this task: one drive one week and another the following week, so I'm not placing all of my eggs in one basket. Plus I'm signed into Firefox & Google Chrome, so my bookmarks will be where I left them (a command-line equivalent is sketched below).
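A rough command-line equivalent of that weekly drag & drop, assuming the external mounts at /media/user/external (a placeholder path) and keeping a dated copy per week:

rsync -a ~/Documents ~/Pictures ~/Music ~/Videos /media/user/external/weekly-$(date +%F)/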

 

Thing is, the OS & most of the software & customizations can be restored in hours, making the OS disposable if broken; the user data is what needs to be protected. :)

 

 

 

The other alternative is to clone the OS HDD to an equal-size HDD with Macrium Reflect... I've done this in Windows many times but not in Linux

 

I've successfully cloned Linux Mint 17.3 & 18 with Macrium Reflect three times, once from HDD to SSD and the other times when swapping SSDs, as the 120-128GiB sizes are no longer relevant now that 250-500GiB sizes have become the norm, and as of this posting I have two unused 120GiB Samsung SSDs. These are great for Linux installs, especially if one doesn't have too large a /home partition (24GiB root, 1GiB swap & 80GiB /home, which leaves just the right amount of unformatted space at the end of the drive for overprovisioning; a rough sketch of that layout follows). Even though root doesn't use a lot of space, it's important to keep free space on an SSD; the closer to under 60% full, the better. Any more and the controller has to work hard to keep the drive clean & fast. If it gets too full (over 80%), SSD speeds will choke, and the drive may not be much faster than a fast SATA-3 HDD running at 7200 rpm with a 64MB cache.
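A rough sketch of that 120GiB layout with GNU parted; /dev/sdX is a placeholder, double-check the device before running anything destructive, and the filesystems still need to be created afterwards with mkfs/mkswap:

sudo parted -s /dev/sdX mklabel gpt
sudo parted -s /dev/sdX mkpart root ext4 1MiB 24GiB
sudo parted -s /dev/sdX mkpart swap linux-swap 24GiB 25GiB
sudo parted -s /dev/sdX mkpart home ext4 25GiB 105GiB
# the remaining ~15GiB is left unpartitioned for overprovisioning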

 

Also, while on the subject, I have successfully restored a backup image of Linux Mint 18 taken with the WinPE media of Macrium Reflect. Unlike some 'Linux only' backup solutions that require a sector-by-sector backup image, Macrium uses a method that compacts the data while keeping the partition structure in place, so a 100GiB total install uses only roughly 30GiB on the backup drive. And as an added bonus, the restore (as well as the backup) is much faster than, say, a Windows backup/recovery/clone operation. Don't ask me why, because I don't know, though I find it odd that these tools created for Windows actually work faster on Linux images & cloning. The speed of the restore is like a madman in action; it's shocking how fast these restores go (even over USB 2.0 or eSATA).

 

While I know that others have ideas that didn't work for me, maybe because on my PCs I install root on an SSD and swap & /home on a HDD, I'm not going to discount their ideas, yet some of them don't work in my type of install scenario. Everyone has a unique setup and a preferred way of doing things; what works on some installs doesn't on others. Though the drag & drop method of the main Home folders to an external works on most any install. :)

 

Cat


Performing full disc images weekly and keeping important data off of the 'C' drive as generated can be the best defence against Malware/Ransomware attacks, as well as a wide range of other issues. 


#8 rufwoof

rufwoof

  • Members
  • 129 posts
  • OFFLINE
  •  
  • Local time:01:14 AM

Posted 17 December 2016 - 07:42 PM

I use a very simple method. Simply attach an external with enough free space; in the Home folder there are several main folders with content. I simply drag & drop these to an external once weekly

 

 

I do similar, but use mksquashfs to create a single compressed file (unsquashfs to restore). I use a filename that reflects the creation date of the backup, and periodically delete older versions of those files. lzo level-1 compression is relatively quick:

 

mksquashfs /mnt/sda1 todays-date.sfs -comp lzo -Xcompression-level 1
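With the date baked into the filename, and the matching restore, that might look something like this (the restore directory is a placeholder):

mksquashfs /mnt/sda1 backup-$(date +%F).sfs -comp lzo -Xcompression-level 1
unsquashfs -d /mnt/restore backup-2016-12-17.sfs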


OpenBSD (--release) data server (that auto detects and rsshfs to)

OpenBSD (--current) desktop that sshfs mounts my android phone


#9 cat1092

cat1092

    Bleeping Cat


  • BC Advisor
  • 7,018 posts
  • OFFLINE
  •  
  • Gender:Male
  • Location:North Carolina, USA
  • Local time:08:14 PM

Posted 18 December 2016 - 06:19 AM

rufwoof, whatever works is what counts. :)

 

Any backup plan to preserve user data is better than none, and of course it's also good to have a full drive image if possible. While I perform full drive images regularly, I rarely restore them (have done so only one time); I would rather save my data & go for a clean install to have a tidy /home partition, since there's a lot more in there (hidden files, by default) than the main folders seen in Home. That takes only 3-4 hours for me, because I have done it dozens of times for myself alone & have notes on what needs to be performed post-install, the first item of course being to enable the ufw firewall as soon as I'm logged in.
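For reference, turning on ufw after a fresh install is a one-liner (the defaults on Mint/Ubuntu are deny incoming, allow outgoing):

sudo ufw enable           # turn the firewall on; the setting persists across reboots
sudo ufw status verbose   # confirm the default deny-incoming / allow-outgoing policy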

 

Being that I try lots of tricks on my spare Linux PC, the one I'm on now, there are old configuration files left behind, unseen, that need to be cleaned up with a format. While I used to recycle the same /home partition over & over (once from Linux Mint 12 through 17.1), once I saw the mess by showing hidden files, it was no wonder icons popped up after a 'fresh upgrade' (formatting root only) for things I had not used for some time. :P

 

Cat


Performing full disc images weekly and keeping important data off of the 'C' drive as generated can be the best defence against Malware/Ransomware attacks, as well as a wide range of other issues. 




