
I was careless enough to delete an old backup so I could make a new one, and then I forgot about it. A few days after deleting it, my Windows installation crashed.

Now I'm trying to create a backup through Debian. I'm copying my 1 TB drive to a 1.5 TB drive, preserving the Windows directory tree (not as an image). But the contents of the 1 TB drive don't fit on the 1.5 TB drive, even though there are no other files on the 1.5 TB disk. Both drives are NTFS.

What is happening and how do I fix this?

hasnieking
  • Are you doing a partition copy or a file copy? Do you use compression on the 1TB drive? Are the file systems both NTFS and are the block sizes the same? A partition copy will always give the same size, and you can create an extra partition in the free 500GB. I recommend `gparted` for these operations, but you can alternatively use `dd` to copy the 1TB disk to a single 1TB file in a 1.5TB partition (or even to a smaller file if you compress on the fly, at the cost of a slower backup process). – AFH Apr 06 '16 at 21:51
  • They are both NTFS. The block size of both is 512. gparted doesn't work for me (Debian Jessie), and I need to clone the whole disk, not create an image of it. – hasnieking Apr 06 '16 at 22:10
  • You can use `dd` to clone the disk too. See my answer below. – John Apr 06 '16 at 22:49

2 Answers

2

OK, I deleted everything on the drive and went to its properties. There I found it was listed as empty but still showed 1 TB used. I formatted it as NTFS, and now everything fits!
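For anyone doing the same thing from Debian, something like this should reformat the destination as NTFS from the command line; `/dev/sdb1` is only an example name, so check with `lsblk` before running anything destructive:

    # Identify the destination partition first -- /dev/sdb1 below is only a guess
    lsblk -f
    # Quick-format it as NTFS (mkntfs ships with the ntfs-3g tools on Debian)
    sudo mkntfs -Q -L backup /dev/sdb1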

hasnieking
  • Remember to mark your answer if it works for you. – NetworkKingPin Apr 09 '16 at 13:36
  • If you want a usable clone that you can boot from (as opposed to just a backup of your personal files), keep in mind that copying things over using `cp` as you are doing probably will not work. For example, you won't copy the boot sector. Even if you manage to get it to boot by doing additional work, it may give you problems down the road, because `cp` may not copy certain attributes and other NTFS-specific things correctly. On the other hand, if you just want to back up your personal files, why not use the built-in backup functionality in Windows? – John Apr 10 '16 at 19:56
0

There are several possible reasons why a copied directory tree might end up taking up more space in the destination than it did in the source. Most file systems, including NTFS, support the following features that might contribute to this.

Edit: I just saw that you are trying to use `cp` for the copy rather than Windows Explorer. I'm not sure how `cp` and ntfs-3g behave with respect to these items (there is a small sketch after the list below). However, keep reading, because there is a second part to this answer.

  • symlinks / hard links: File systems can create a lightweight "copy" of a file that, rather than being a full copy of the original, simply points to the original file (so changes made to one "copy" apply to the other). I believe Windows also uses this functionality itself, for example in the WinSxS folder. When you copy a symlink or hard link with Windows Explorer, it will not copy the lightweight link; it will give you a full copy of the original file. For example, if you have a file and a symlink to that file and copy both of them to a new location, you wind up with two full copies of the original instead of one, taking up twice as much space as was originally used.
  • sparse files: NTFS lets you save space on files that contain lots of chunks that are all zeros. Instead of storing the whole file, the file system can be told to store only the nonzero chunks. When you copy a sparse file using Windows Explorer, the resulting copy is no longer sparse, so the all-zero chunks actually take up space, increasing the size of the copy.
  • compression: NTFS allows you to ask it to compress files to save space. When a file is copied using Windows Explorer, the resulting copy will not be compressed unless the directory it is being copied into is itself a compressed directory. See this Microsoft help page for details.
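Since you are copying from Debian with `cp` rather than Windows Explorer, here is a rough sketch of flags that ask `cp` or `rsync` to preserve hard links and re-create sparse regions; `/mnt/source` and `/mnt/dest` are placeholder mount points, and I'm not certain how faithfully ntfs-3g preserves every NTFS attribute:

    # cp: -a preserves as much metadata as it can (including hard links between
    # files copied together); --sparse=always re-creates holes in the copy
    cp -a --sparse=always /mnt/source/. /mnt/dest/

    # rsync: -a archive mode, -H keep hard links, -S handle sparse files
    rsync -aHS /mnt/source/ /mnt/dest/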

Unless you are personally making use of these features extensively, it seems unlikely that any of these things would cause your copy to eat up so much space. You could figure out where the extra space is coming from by running a disk space usage utility like WinDirStat on both the source and the (incomplete) copy to see which folders are taking up more space in the copy.
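If you would rather check from Debian than from Windows, `du` can give a rough picture of the same thing; the paths below are placeholders for wherever the source and the (incomplete) copy are mounted:

    # On-disk usage vs. apparent (logical) size; a big gap on the source side
    # usually points at sparse or compressed files that grew during the copy
    du -sh /mnt/source
    du -sh --apparent-size /mnt/source

    # Per-folder comparison to find where the copy grew
    du -sh /mnt/source/* | sort -h
    du -sh /mnt/dest/* | sort -h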

However, if you actually want a fully usable Windows image that you can restore from, I would not recommend using a utility like `cp`, as I'm not confident it will correctly copy metadata (e.g., Windows ACLs). Instead, I would recommend doing an image-based copy as AFH suggested. If you want to clone the disk rather than creating an image of it, you should be able to follow essentially the same instructions, with the slight modifications mentioned at the end.

  1. Boot into Linux. (I believe you said you were using Debian.)
  2. Determine the path to the hard disk you want to back up. It should start with `/dev/`. I recommend backing up the entire hard disk rather than just a partition. We'll refer to this as `<path to source>` below.
  3. Mount the hard disk that you want to back up to and find the path to its mount point. We'll refer to this as `<path to destination mount point>` below.
  4. Run `dd if=<path to source> of=<path to destination mount point>/mybackup.bak`
  5. Wait 3 hours to 1.5 days for it to complete. (The speed will probably be between 10 MB/s and 100 MB/s.) If your source or destination is a USB 3.0 external drive, you may be surprised to notice that the copy speed is much slower than under Windows. The reason is that even though Linux is supposed to support UASP, the support doesn't always get enabled when it should be. (At least, this has been my experience.)
  6. If you need to restore your backup, run `dd if=<path to destination mount point>/mybackup.bak of=<path to source>`

If you want to clone the disk rather than creating an image, you should be able to replace `<path to destination mount point>/mybackup.bak` with the path to the hard disk you want to copy to. (It should start with `/dev/`.)

Edit: As gronostaj points out, using the `bs=<size>` parameter with `dd` may speed things up. However, the best value to use depends on your system. See this answer for details. A concrete, hypothetical example is sketched below.
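For completeness, here is a rough sketch of the whole procedure with made-up device names (`/dev/sda` as the source disk, `/dev/sdb1` as the NTFS destination partition, `/dev/sdc` as a second disk for the clone variant); confirm your own devices with `lsblk` before running anything, since `dd` will happily overwrite the wrong disk:

    # 1. Find the disks -- the names below are only examples
    lsblk -o NAME,SIZE,FSTYPE,MOUNTPOINT

    # 2. Mount the 1.5TB NTFS destination with ntfs-3g
    sudo mkdir -p /mnt/backup
    sudo mount -t ntfs-3g /dev/sdb1 /mnt/backup

    # 3. Image the whole 1TB source disk to a file (bs=4M is a common starting
    #    point; add status=progress if your coreutils is new enough to support it)
    sudo dd if=/dev/sda of=/mnt/backup/mybackup.bak bs=4M conv=fsync

    # Compress-on-the-fly variant (smaller file, slower), as AFH mentioned above
    sudo sh -c 'dd if=/dev/sda bs=4M | gzip > /mnt/backup/mybackup.bak.gz'

    # Clone variant: write directly to a second disk instead of to a file
    sudo dd if=/dev/sda of=/dev/sdc bs=4M conv=fsync

    # To restore the image later, reverse if= and of=
    sudo dd if=/mnt/backup/mybackup.bak of=/dev/sda bs=4M conv=fsync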

John