Cloning/Ghosting/(System) Imaging - Taking Clonezilla for a walk



#1 wizardfromoz

Posted 09 March 2015 - 03:19 AM

I have been asked to start a Topic whose subject may not be new to some of us, but should be of considerable interest to many, since it deals with safeguarding our computer systems - loosely described as part of that over-used word, "backup".

 

In undertaking this task, I am using my own home environment as a testing ground and example. My PC's configuration can be determined by a combination of:

  • Viewing my Profile, from my Avatar upper left  AND
  • Viewing my signature below

... so I will not repeat that here.

 

Your circumstances may, or will, vary somewhat from mine.

 

I am lucky to have a 2TB HDD and 3TB of external storage, so I have plenty of room to "play with".

 

Some readers may find it useful background to read up a little on a Topic elsewhere, started by Advisor NickAu, entitled "How to Restore Your Ubuntu Linux System to its Previous State" found here:

 

http://www.bleepingcomputer.com/forums/t/562190/how-to-restore-your-ubuntu-linux-system-to-its-previous-state/

 

In particular, in the latter pages, you can see advice given to one of our valued Members, pcpunk, and to others, with regard to Timeshift, another alternative tool. The article from How-To Geek is worth a read as well.

 

But in this Topic I am using Clonezilla, a free, cross-platform industry standard that serves both Linux and Windows users. My focus is on Linux, which is currently my total and exclusive environment.

 

To run this test, I have temporarily relocated my own backups and images to my HDD, and am using my ADATA Nobility NHO3 3 TB powered external HDD as the host of the Test environment.

 

Enclosed below is a screenshot of "how things stand" on that drive, and I will explain more with the next post.

 

CgUOFQ4.png

 

Stay tuned!

 

:wizardball: Wizard


Edited by hamluis, 18 March 2015 - 08:47 PM.
Moved from Linux to Tips/Tricks - Hamluis.



 


#2 cat1092 (BC Advisor)

Posted 14 March 2015 - 02:59 AM

Sounds good! :thumbup2:

 

Am looking for my first taste of Clonezilla! 

 

I would actually have had a deep knowledge of it by now, but just as I was about to start using Clonezilla, Macrium Reflect emerged and quickly gained a great reputation, so I took that leap instead. I still use the software to back up & clone some systems. It's excellent on a new Windows computer, taking a 1TiB HDD & cloning over to a 128GiB SSD.

 

However, we need backup apps native to Linux here, and while TimeShift is a great one, we also need a native-to-Linux full disk imaging solution. Actually, Clonezilla can back up any OS, which makes it great for those who dual-boot with Windows or with other Linux OSes, as well as for imaging single-OS drives.

 

Looking forward to Wiz returning & discussing the options.  :)

 

Cat


Performing full disc images weekly and keeping important data off of the 'C' drive as generated can be the best defence against Malware/Ransomware attacks, as well as a wide range of other issues. 


#3 wizardfromoz (Topic Starter)

Posted 15 March 2015 - 03:38 AM

Hate to disappoint - I was hoping to do a lot more with this Topic than it seems I currently can.

 

First up I will explain more from my first Post. The screenshot from GParted depicts the following:

 

  1. sdb1 I will probably re-label from ADATA to LMM17 - it is a 100GiB partition I created, then burned an LMM17 iso to it
  2. sdb3, labelled SHIFTMINT, is a 110GiB partition I created, in order to pursue experiments with TimeShift, which I then illustrated to pcpunk over at NickAu's "How to Restore Your Ubuntu Linux System to its Previous State". Page 6 #85 refers, or click here - http://www.bleepingcomputer.com/forums/t/562190/how-to-restore-your-ubuntu-linux-system-to-its-previous-state/page-6 .
  3. sdb4, labelled Zilla1 - is a 100GiB partition I created in order to conduct tests on Clonezilla. The screenshot was taken before running the test, and you can see that there is a 1.75GiB "overhead" right from the creation - possibly the hidden lost+found folder or other protocols
  4. sdb2, linux-swap, 4GiB - was just an arbitrary figure I picked at the time, as I am in the process of making the External HDD bootable. I will likely reduce that to 2GiB, or even delete it if it proves to be unnecessary.

Now, my 2TB (1.82TiB) Acer, looks like this:

 

xfi6lp8.png

 

I needn't go into the details of the other Partitions but the one that is highlighted, sda8, is my LMM17 partition, 93.13GiB (100GB).

 

This was my focus for Clonezilla.

 

Before I go on -

  • Cloning
  • Ghosting and
  • (System) Imaging

- what difference/s is/are there between them? Or are they just different names for the same process?

 

PC World has an article here, http://www.pcworld.com/article/2029832/backing-up-your-entire-drive-cloning-vs-imaging.html - which says, in part:

 

 

Backing up your entire drive: Cloning vs. imaging

Felix Luke needs to back up his entire hard drive. He asked me to explain the differences between cloning and imaging.

Both cloning and imaging create an exact record of your drive or partition. I'm not just talking about the files, but the master boot record, allocation table, and everything else needed to boot and run your operating system.

This isn't necessary for protecting your data--a simple file backup will handle that job just fine. But should your hard drive crash or Windows become hopelessly corrupt, a clone or image backup can quickly get you back to work.

 

When you clone a drive, you copy everything on it onto another drive, so that the two are effectively identical. Normally, you would clone to an internal drive made external via a SATA/USB adapter or enclosure.

But imaging a drive is more like creating a great big .zip file (without the .zip extension). Image backup software copies everything on the drive into a single, compressed, but still very large file. You would probably save the image onto an external hard drive.

So what are the advantages of each?

Should your primary hard drive crash, a clone will get you up and running quickly. All you have to do is swap the drives.

On the other hand, if your drive crashes and you've backed it up to an image, you'd have to buy and install a new internal hard drive, boot from your backup program's emergency boot disc, and restore the drive's contents from the backup.

So why image? An image backup provides greater versatility when backing up. You can save several images onto one sufficiently large external hard drive, making it easier and more economical to save multiple versions of the same disk or back up multiple computers.

 

Ghosting, on the other hand? If you google "what is ghosting?", you will find a number of different answers, but the one we are looking at is ghost imaging.

 

A small article at Tech Target, here - http://searchmobilecomputing.techtarget.com/definition/ghost-imaging has:

 

 

Ghost imaging is the copying of the contents of a computer's hard disk into a single compressed file or set of files (referred to as an image) so that the contents of the hard disk, including configuration information and applications, can be copied to the hard disk of other computers or onto an optical disc for temporary storage.

 

An example of ghost imaging software is Norton Ghost, a product from Symantec. Using this product, you can clone (copy) the entire contents of a hard disk to a portable medium such as a writeable CD or to a server. The portable image can then be used to set up each hard disk in other computers, automatically formatting and partitioning each target disk. Ghost imaging is useful where one system is to be replicated on a number of computers in a classroom or for a team of notebook computer users who all need the same system and applications. On personal computers, ghost imaging is used to back up everything on the hard disk, often while reinstalling an operating system.

 

Ghosting was made popular in the '90s and Noughties by Symantec's Norton Ghost, an acronym for general hardware-oriented system transfer. The software was developed by a company called Binary Research in New Zealand in 1995, and acquired by Symantec, then best known for anti-virus software, in 1998. It was the industry standard for many years.

 

Ghosting, as I see it, had two basic functions or capabilities. They were:

  1. As a backup solution for a single, or Home User, computer and
  2. As a vehicle, in the workplace, where a sysadmin could replicate a "custom fit" OS to any number of PCs/Workstations in the workplace.

Under Windows, the latter involved paying attention to licensing issues, which might seem to make it redundant under Linux. But there is still an argument for using a custom fit where other applications essential to an enterprise's core business are added. Norton Ghost was discontinued some years ago, though Enterprise support may still be going; I am not sure.

 

My time zone is radically different to many of yours, and I have to "scoot" for possibly 14 - 18 hours, but I will be back Monday my time to provide the final chapter/s, for now, of taking Clonezilla for a walk.

 

Stay tuned, keep safe, and spread the word on Linux.

 

:wizardball: Wizard

 

Edited typo


Edited by wizardfromoz, 15 March 2015 - 03:41 AM.


#4 cat1092 (BC Advisor)

Posted 16 March 2015 - 01:55 AM

There are those who have old versions of Norton Ghost and will never part ways with it, either; they won't consider another option.

 

It works, but on a Linux install the backup will still be a sector-to-sector one, which takes up a lot of a backup drive's space, as do many Windows backup solutions. Also, some of these solutions will split an image over 500GiB into two sections, which would increase to four for a 2TiB HDD that is mostly Linux-formatted.

 

Of course, there's nothing like cloning a drive every week or two & swapping drives, but this is a lot of trouble for some, including me. If all I had was one huge HDD (or SSD), it wouldn't be bad, but with three SSDs & a 500GiB data drive, it is a lot of trouble.

 

Hopefully Clonezilla will have a way to back up the drive's contents without the sector-to-sector approach. If that's the end result, I will be just as well off continuing my current practice.

 

Cat




#5 wizardfromoz (Topic Starter)

Posted 16 March 2015 - 02:55 AM

In the area of cloning/ghosting/imaging, there are online and offline solutions.

 

As the names suggest:

  • Online means you can perform the task whilst you are still using the OS - Operating System
  • Offline means you are not connected to a Live system

Clonezilla is offline - you download an ISO and burn it to a CD/DVD/USB stick, reboot, interrupt the boot process to boot from that medium, and go from there.
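If you go the USB stick route, `dd` can do the "burning". The sketch below copies a dummy file so it can be run safely anywhere; for the real thing you would substitute the downloaded Clonezilla ISO and your stick's device node (the `/dev/sdX` in the comment is a placeholder), after double-checking it with `lsblk`.

```shell
# Create a small dummy "ISO" so this sketch is self-contained; on a real
# system you would use the downloaded Clonezilla ISO instead.
head -c 1048576 /dev/urandom > clonezilla-test.iso

# Write it out byte-for-byte. For a real USB stick, replace target.img
# with the device node (e.g. /dev/sdX) -- and triple-check with lsblk
# first, because dd overwrites the target without asking.
dd if=clonezilla-test.iso of=target.img bs=4M conv=fsync status=none

# Verify the copy is identical before trusting it.
cmp clonezilla-test.iso target.img && echo "copy verified"
```

The `conv=fsync` flushes the data to the device before `dd` exits, so you can unplug the stick once the command returns.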

 

When you get into Clonezilla, you will see a menu somewhat like this

 

S5Gb9pt.png

 

You can see the version number there; for the record, I was using clonezilla-live-2.3.2-22-amd64.iso.

 

Once you are into Clonezilla itself, you go to a text-based format, as below:

 

2W9SHBn.png

 

... and go from there.

 

@cat1092

 

 

Hopefully Clonezilla will have a way to back up the drive's contents without the sector-to-sector approach. If that's the end result, I will be just as well off continuing my current practice.

 

Don't hold your breath ... but the jury is out

 

More to follow tomorrow

 

:wizardball: Wizard



#6 cat1092 (BC Advisor)

Posted 16 March 2015 - 03:09 AM

Which ISO do we get? 

 

A few months back, I went to the site to assist another member & there were lots of choices. So many that I didn't know what to tell him (pcpunk, in another Linux Topic). I'm not sure if he got done with Clonezilla, because there was no follow-up post.

 

I just want to ensure I am downloading the right ISO. The computers that I'd be using it on are all 64-bit (or amd64, as it is known by many of these apps).

 

Cat




#7 mremski (Member)

Posted 16 March 2015 - 07:06 AM

Backups.  CryptoXYZ malware has raised awareness of the importance, but I've often found it comes down to "what's important", the OS or the data?  Clonezilla is nice because one can get back to a booted system state rather quickly;  but you must take care that you've thought about the issue.  In a manufacturing environment, Clonezilla's PXE boot option is nice.

 

My preferred way is a smaller disk holding the OS (SSDs are great for this), user accounts, data and most installed programs on separate media.  Using Clonezilla, do a fresh install of the OS of choice, do your modifications to get it setup (firewalling, DNS config, security patches/updates, etc), then clone that.  Now you have a minimal installation that lets you get back up and running in a hurry.  Another plus to having separate media is upgrading/switching the OS (the user data disk format must be readable across them all).  I've often upgraded by getting a new disk, doing a fresh install.  I can mount the old system disk to verify any changes, and if something doesn't work and prevents other users from getting to the Internet, I can quickly switch back to a running system.

 

The data: *nix systems have lots of native tools to back up directories (tar, dump, cpio, etc.) that do a good job of preserving files. Since it's now relatively trivial to create DVDs or CD-ROMs, you can have a relatively permanent, transportable set of backups. gzip/bzip2 do a good job of compressing, so you get a lot into a little space. There are probably lots of functional GUIs for these tools in recent distributions, but with a little bit of scripting one can start hooking commands like find up to tar and such to get all kinds of incremental backups.
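The find-into-tar hookup mentioned above can be sketched like this; the demo directory and file names are made up so the whole thing can be run as-is:

```shell
# Tiny demo tree so the sketch is self-contained.
mkdir -p demo/docs
echo "hello" > demo/docs/notes.txt

# Full backup: everything under demo/, gzip-compressed.
tar czf full-backup.tar.gz demo

# Incremental flavour: archive only files modified in the last 24 hours,
# using find to build the file list (null-separated to survive odd names).
find demo -type f -mtime -1 -print0 | tar czf incremental.tar.gz --null -T -

# See what landed in the incremental archive.
tar tzf incremental.tar.gz
```

Change the `-mtime` window (or switch to `-newer some-timestamp-file`) to suit whatever backup cadence you run.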

 

Not trying to take anything away from the Wiz's discussion, just pointing out a few native tools.


FreeBSD since 3.3, only time I touch Windows is to fix my wife's computer


#8 cat1092 (BC Advisor)

Posted 16 March 2015 - 10:28 PM

 

 

My preferred way is a smaller disk holding the OS (SSDs are great for this), user accounts, data and most installed programs on separate media.

+1! :thumbup2:

 

This is how the Linux OS on my main PC is installed: I have 40GiB partitioned on the SSD for the OS, tweaks & whatever, while /home & swap are on a HDD, away from the OS.

 

In fact, all of my OSes are installed in this manner; I have Data partitions to store items of importance away from the SSD, and keep everything imaged, with these backups detached from the computer.

 

Am hoping to hear from Wiz soon to steer us in the right direction when it comes to Clonezilla. Using dedicated Windows software for full disk images takes a lot of space. 

 

Cat




#9 wizardfromoz (Topic Starter)

Posted 17 March 2015 - 12:02 AM

 

Not trying to take anything away from the Wiz's discussion, just pointing out a few native tools.


 

Not at all mremski, in fact you could have been reading my mind.

 

 

Which ISO do we get?

 

Cat, for 64-bit it is as I mentioned above, but it may have got lost between the two screenshots, so here it is again:

 

 


You can see the version number there; for the record, I was using clonezilla-live-2.3.2-22-amd64.iso

 

:wizardball: Wizard


 

I have a disclaimer to make here as follows:

 

Before last week, it was 7 or 8 years since I had used Clonezilla. The occasion was under a totally Windows environment, when my computer's motherboard and modem router had been fried during a lightning strike. I was able to salvage the two 80GB physical hard drives, and had them installed in two caddies employing USB 2.0 connectors. I hooked up the caddies to the spare computer I had, and used Clonezilla to clone the former C drive to the former D drive.

 

I would have been using a version of Clonezilla around the 1.0's to 1.1's, I guess, and I recall the process taking a number of hours. I did not have the need to restore, and so I cannot tell you how that goes.

 

Back to the present, or at least last week.

 

Having used TimeShift so recently over at "How to Restore Your Ubuntu Linux System to its Previous State", and having learned that its "cloning" facility is not, by default, a cloning process at all, but rather a high-level backup of your entire directory structure and some of those folders' contents, I had developed a mindset that using Clonezilla would be similar. That turned out not to be the case at all.

 

It was my intention to simply use Clonezilla to clone my LMM17 partition, sda8 to my newly-created sdb4 on the Adata external. I was mindful of the fact that if Clonezilla turned out to be low-level sector by sector in its operation, I would need the destination environment to be of equal size or larger. With 98.25GiB free on sdb4, and LMM17's sda8 taking up 93.13GiB, I figured I had 5GiB up my sleeve to accommodate the clone.
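One way to sanity-check that size arithmetic before starting a sector-level clone is to compare byte counts directly. This sketch uses ordinary files as stand-ins for the partitions (so it can be run anywhere); the real block-device invocation is noted in the comment:

```shell
# Stand-in "partitions" as ordinary files; on real hardware you would get
# a partition's exact byte size with: sudo blockdev --getsize64 /dev/sdXN
head -c 2048 /dev/zero > source.part
head -c 4096 /dev/zero > dest.part

src=$(stat -c %s source.part)
dst=$(stat -c %s dest.part)

if [ "$dst" -ge "$src" ]; then
    echo "destination is big enough for a sector-level clone"
else
    echo "destination too small - the clone will fail"
fi
```

GiB-vs-GB rounding in partition tools can hide a shortfall of a few hundred MiB, which is exactly why comparing raw byte counts is safer than eyeballing the GParted figures.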

 

The initial Clonezilla menus include an option to load it into RAM. It only takes up 163MiB, so I chose that option. I renamed the folder Clonezilla was about to produce to a name that reflected "save-mint", held my breath, and then started.

 

Two hours later, I shut down the process with great difficulty - why?

 

When Clonezilla commenced its operation, the speed looked pretty good. It started at around 3.95GiB per minute, and would occasionally drop to as little as 2.73GiB per minute (this while I was watching it!). On that basis, I estimated that the worst-case scenario would be to clone my LMM partition in 40 minutes. Wrong.

 

Towards the end of the two hours, I sat and watched for 15 minutes at a time. Clonezilla uses one of those "time spent so far, estimated time remaining" features that those who have watched a backup under Windows are familiar with. The time-spent figure appeared to be reasonably accurate, but the estimated time remaining seemed to be going one step forward and one step back, all the time.

 

I had missed seeing, near the beginning of the process, Clonezilla gathering data on what it was going to clone - I might have sneezed. I have seen it since, today, on briefly revisiting the start of the process and then aborting.

 

Clonezilla, by default choices recommended, was going to back up my entire computer system, every single partition, to the destination partition, sdb4, amounting to 110GiB.

 

Short break, then I am back

 

:wizardball: Wiz

 

Edited - lost and then fixed formatting for quotes


Edited by wizardfromoz, 17 March 2015 - 12:05 AM.


#10 cat1092 (BC Advisor)

Posted 17 March 2015 - 12:16 AM

Thanks for re-posting the file, Wiz, will hop over to their site & check things out.  :thumbup2:

 

EDIT: I see the caveat now, and no wonder users flocked to Macrium Reflect years back: the backup drive or partition must be equal to or larger than the one being imaged. No wonder it's called 'Clonezilla'; this is a sector-to-sector backup, no different from what many of us already have.

 

For someone who has a 1 or 2TiB drive full of Linux partitions, a drive of the same size would be needed for the image. Looks like I'll keep on using TimeShift, along with my backup software that's already paid for.

 

For those who have no other drive-imaging software & are looking for one, this may be well worthwhile, yet only if there's a backup drive available of the same size as all of the Linux partitions combined. I would need a new 2TiB backup drive for all of my Linux backups alone.

 

On the plus side, 2TiB backup drives (or 'bare' drives to be used in docking stations) are often on promo for $100 USD or less. Bare drives are the same as those in fancy retail packages & carry the same warranty, only with no screws or manuals included. BTW, I've never purchased a retail backup or OS drive; the bare ones do the trick fine when used with an enclosure or docking station. Bare 2.5" notebook-type drives can often be connected by a USB 2.0 or 3.0 to SATA adapter cable alone, since no external power is needed, saving the cost of an enclosure or docking station.

 

I still prefer a docking station best of all; however, one must watch these: if the backup takes a long time, a desktop fan may be needed to help cool the drive. Overheating not only causes premature drive failure, but can also cause corrupted backups. This I know from firsthand experience. It doesn't matter which software is used; the backup drives must remain cool. This includes retail versions with cheap ABS plastic covers, which retain lots of heat, one of several reasons why I avoid these. Another reason is that the connections within the enclosures are cheap, often cold-soldered, and as a result will drop the connection. Though with many of the older ones the case could be carefully removed & the drive used in a docking station or with a USB cable as described above, some OEMs are moving away from the SATA-3 connection to a different type, making the drive useless post-warranty if a failure occurs.

 

Am hopeful that this Topic continues to grow, with others offering suggestions. It sort of puzzles me why almost all backup software (other than TimeShift) requires a sector-to-sector approach; why not just the partition structure & the data within each? While some distros do have their own inbuilt backup software, it's not equal to TimeShift, and is often harder to use (& time-consuming for what it is). For instance, the native Linux Mint backup solution won't notify the user if it's hung, wasting a lot of time.

 

While it could well be that Clonezilla is the best cloning solution out there, not everyone has the backup space needed to manage it. 

 

Cat


Edited by cat1092, 17 March 2015 - 01:01 AM.



#11 wizardfromoz (Topic Starter)

Posted 17 March 2015 - 01:41 AM

 

Clonezilla, by default choices recommended, was going to back up my entire computer system, every single partition, to the destination partition, sdb4, amounting to 110GiB.

 

Now, I should have known this, and I should have remembered from years ago, the time it took just to clone an 80GB (73GiB) drive. But I didn't.

 

It doesn't take a Nobel Prize-winning mathematician, nor rocket science, to work out what was going to happen. It actually just took longer for Clonezilla to stop and give up than I would have expected, leading me to believe that the transfer speeds dropped considerably lower than that 2.73 - 3.95GiB per minute.

 

After two hours or so, Clonezilla came up with a message I have not copied verbatim, but it was to the effect that the (destination) space available did not meet the space needed for the operation. Surprise, surprise! Try fitting 1.82TiB into 110GiB - it just doesn't go.

 

Having failed to do what I set out to do, I thought I would at least gather some knowledge on the failed operation.

 

In Zilla1, the sdb4 partition on my external hdd, we can see the following:

 

89okFzp.png

 

This, after Ctrl-H'ing to check for Hidden files or folders, and lost+found is there, as I guessed.

 

Double-clicking the save-mint (Clonezilla) folder reveals:

 

BfiKRts.png

 

... 9 files, for all our (my) efforts. I expect that if you have a successful run of Clonezilla, these files will appear in your result. I can reveal the contents of 7 of the 9 if you are interested; they are textual. The two with the file extension .mbr are, as you would expect, related to the Master Boot Record. They are locked, and even if you use, e.g.

sudo su

... or alternatives, to gain access, you still need to know which program to use to view them, which is a stage I have yet to reach.
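A quick aside on those .mbr files: a plain hex dump is one program that can view them, and the telltale 0x55AA boot signature sits in the last two bytes of the 512-byte sector. A self-contained sketch, using a synthetic file rather than the thread's actual save-mint contents:

```shell
# Build a fake 512-byte MBR: 510 zero bytes plus the 55 AA boot signature
# (written as octal escapes, which plain sh printf understands).
head -c 510 /dev/zero > fake.mbr
printf '\125\252' >> fake.mbr   # octal 125 252 = hex 55 AA

# Inspect it with a hex dump. On a real Clonezilla save folder you would
# point this at the .mbr file instead (prefixed with sudo if root-owned).
od -A d -t x1 fake.mbr | tail -n 3
```

On a genuine MBR dump you would also see the four 16-byte partition table entries just before the signature, rather than zeroes.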

 

A right-click Properties on the 9 files reveals:

 

f7jh0qS.png

 

... which indicates we have safeguarded 1.1MB for our efforts. :hysterical:

 

Now mremski, who has contributed, may have a simple solution here, and I welcome it. Also, mralias518 is/was viewing whilst I type.

 

Since my debacle on the 11th of March, I have done a little research on Clonezilla, and I'd like to present my findings.

 

At their website, if you tunnel a bit, you can find the following:

 

At http://clonezilla.org/clonezilla-live-doc.php - I will reproduce this, as I need to point out something not immediately apparent:

 

j1FsAHc.png

 

If you note the blue and green headers (and there are more if you scroll down the website page), you will find with the green ones that "Description" is also a hyperlink. If you simply click on the numbered items, you get a one-page screen with a few items on that subject, i.e. a chapter.

 

Clicking the green "Description..." header will take you to a mini-manual for each process. To use the example of "Save...", a Print Preview reveals (US Letter) 18 pages Portrait or 36 pages Landscape (easier to read). Figures will differ for the A4 size used in Australia and perhaps elsewhere. I would expect similar of the other topics, e.g. Restore.

 

I, for one, will not be using Clonezilla again without being armed with at least a printed version of the Save section. But I have two heavy-duty colour laser printers, and for inkjet users this may not be your cup of tea. Perhaps you could print them at a library.

 

Just a few more points, and then I will wrap this up for now, but we can keep the Topic open, and I welcome input:

 

  • I was limited to using USB 2.0 - USB 3.0 would likely be considerably faster; you might expect 12 times faster?
  • If you have a large drive and USB 2.0, expect an "overnighter" - after the initial stages, it does not appear to require user input.
  • On next using it, I will be looking to choose different options in the initial stages and see what eventuates.

Later

 

:wizardball: Wiz



#12 cat1092 (BC Advisor)

Posted 17 March 2015 - 02:00 AM

 

 

  • I was limited to using USB2.0 - USB3.0 might likely be considerably faster, you might expect 12 times faster?

 

Only if you have the best USB 3.0 chipset possible, and there are no other bottlenecks in the system. Oftentimes, USB 3.0 speed won't even hit USB 2.0's maximum rate (480Mbps, or about 60MB/sec), or will do so only in short bursts. I've yet to see one reach 1Gbps (on anyone's computer/backup drive combination), let alone the advertised 5Gbps speed.

 

In fact, I have seen faster speeds over eSATA than USB 3.0. Yet they're now coming out with USB 3.1.

 

No, you're not going to have 12x faster backups. With some luck on your side, maybe 3x faster on the best days. 
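A rough way to see what a given port/drive combination actually delivers is to time a sequential write with `dd`; the output path below is a stand-in for a file on the drive you want to measure:

```shell
# Write 64MiB and let dd report the effective rate on stderr. Put
# speedtest.bin on the drive under test (the current directory here is
# just a runnable stand-in). conv=fsync forces a flush to the device,
# so the figure isn't just the OS page cache absorbing the write.
dd if=/dev/zero of=speedtest.bin bs=4M count=16 conv=fsync
```

Delete speedtest.bin afterwards; for read speed, `dd` the file back to /dev/null (dropping caches first if you want an honest number).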

 

Cat




#13 mremski (Member)

Posted 17 March 2015 - 06:15 AM

Another thing to be aware of with Clonezilla is the restore part of it. I've run into situations where the clone image came off a 120GB device that had been partitioned into two 64GB partitions, with only the first partition used. Clonezilla didn't want to restore onto a 64GB device with a single partition.

 

I'll second what cat says about USB speeds: besides the hardware, a lot depends on how well the drivers for the specific chipsets are written. Most of the specs are purely theoretical, based on calculating the number of bits of data and adding in any framing bits. In the real world (on the send side) you have to push the big data buffer from userspace down to the kernel; the driver may need to chunk it up smaller, wrap each chunk into some type of envelope, then queue that up to the hardware. The hardware then needs to suck in that chunk (hopefully via DMA, not loop reading) and push it out to the device. Then the next chunk needs to get pulled in; lots of times the upper layers can put a bunch of buffers on a queue that the hardware will automatically pull from, but at some point the queue is empty and something has to signal "give me more" (a context switch). Toss in "oh, we've layered a file system over that USB device" and you have a few more layers between userspace and the hardware. So real-world USB throughput is less than "spec" (Ethernet throughput too; try pings with different-sized packets sometime).

 

If the question is simply an image file of a partition, the "dd" command on the raw partition works pretty well. If you google "linux dump raw device" you'll find a bunch of good stuff. I ran into one from back in 2010, "mounting a raw partition file made with dd or dd_rescue in linux".
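That dd approach looks roughly like this; the partition name in the comment is the one from this thread, used purely as an illustration, while the runnable part below exercises the same byte-for-byte mechanics on an ordinary file:

```shell
# Illustrative only (needs root and a real, unmounted partition):
#   sudo dd if=/dev/sda8 of=sda8.img bs=4M status=progress
#   sudo mount -o loop,ro sda8.img /mnt/restore
#
# Runnable demo of the same mechanics on an ordinary file:
head -c 1048576 /dev/urandom > partition-standin.bin
dd if=partition-standin.bin of=partition.img bs=64K status=none

# Matching checksums prove the image is an exact copy of the "partition".
md5sum partition-standin.bin partition.img
```

The loop mount is what makes dd images so handy: you can pull individual files out of the image without restoring the whole thing.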


Edited by mremski, 17 March 2015 - 06:21 AM.



#14 heyyou325 (Member)

Posted 17 March 2015 - 01:41 PM

I've been watching this thread as I need to clone a 20GB partition with just over 16GB on it. Did I read this wrong, in that you have to do the whole hard drive and not individual partitions? I can drop this down to just under 16GB and put it on a thumb drive. I'm changing from having 5 distros to update to only 3, or maybe 2, when I finish. But I have the one set up just like I like it; I only need it bigger. So I'm wanting to save everything without having to reinstall anything. I've been meaning to learn how to do this for most of a year, so this thread came at an opportune time. But I'm not sure it is my best bet. I have all the files on a home partition, the distro on a DVD, and can reinstall all the programs I downloaded and reset all my settings. It sounds like that may be quicker.



#15 wizardfromoz (Topic Starter)

Posted 17 March 2015 - 05:57 PM

@mremski

 

 

I'll second what cat says about USB speeds: besides the hardware, a lot depends on how well the drivers for the specific chipsets are written. Most of the specs are purely theoretical, based on calculating the number of bits of data and adding in any framing bits. In the real world (on the send side) you have to push the big data buffer from userspace down to the kernel; the driver may need to chunk it up smaller, wrap each chunk into some type of envelope, then queue that up to the hardware. The hardware then needs to suck in that chunk (hopefully via DMA, not loop reading) and push it out to the device. Then the next chunk needs to get pulled in; lots of times the upper layers can put a bunch of buffers on a queue that the hardware will automatically pull from, but at some point the queue is empty and something has to signal "give me more" (a context switch). Toss in "oh, we've layered a file system over that USB device" and you have a few more layers between userspace and the hardware. So real-world USB throughput is less than "spec" (Ethernet throughput too; try pings with different-sized packets sometime).

 

 

Give me two years and I may understand what you just said :scratchhead: - but it looks interesting, and I like the way you phrased it!

 

If the question is simply an image file of a partition, the "dd" command on the raw partition works pretty well. If you google "linux dump raw device" you'll find a bunch of good stuff. I ran into one from back in 2010, "mounting a raw partition file made with dd or dd_rescue in linux".

 

I've seen references to dd previously, and NickAu or another Wise One may have mentioned it recently.

 

The article for those interested, is at https://major.io/2010/12/14/mounting-a-raw-partition-file-made-with-dd-or-dd_rescue-in-linux/ - thanks for the tip, mremski :thumbup2:

 

@heyyou325

 

"Did I read this wrong?"

 

Nope. If you follow the default choices Clonezilla provides, that is the direction it leads to - full imaging of your whole computer system, including all the empty space.

 

It may be worth repeating one of my screenshots, namely this one:

 

2W9SHBn.png

 

The default is the one highlighted in red. Next time I try it, I will choose the second option and see where it leads me. Or some intrepid souls may undertake the expedition and let us know what results?

 

:wizardball: Wiz





