I have a problem similar to the one mentioned here: Unable to free up space in df, but I cannot reboot (I have important data in memory).

The output of df -h / is:

Filesystem      Size  Used Avail Use% Mounted on
/dev/sda2        47G   45G     0 100% /

which would be fine, except that it stayed like that after I removed about 13G of data.

The cumulative size of all files on /dev/sda2 reachable from / is half of its capacity:

$ sudo du -xhd 0 /
23G     /
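
One note, in case it is relevant: a df/du mismatch like this is often caused by files that were deleted but are still held open by some process, so df still counts their blocks while du no longer sees them. Assuming lsof is installed, something along these lines should list such files:

$ sudo lsof +L1    # open files whose link count is 0, i.e. deleted but still held open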

I found https://stackoverflow.com/questions/653096/how-to-free-inode-usage#9387415 but it is definitely not a problem with inodes:

$ df -hi /
Filesystem     Inodes IUsed IFree IUse% Mounted on
/dev/sda2        3.0M  535K  2.5M   18% /

How can I free the space without restarting the computer? (I should, however, be able to hibernate it, since I have more swap space than physical RAM.)

One more comment: to save the data anywhere, I need to be able to write to /tmp/ (ask the NumPy developers why), which is on the /dev/sda2 filesystem.
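
For completeness, this is how I check that /tmp/ really sits on that filesystem (both commands are standard df/findmnt tools; only the output is machine-specific):

$ df -h /tmp        # shows the filesystem that backs /tmp
$ findmnt -T /tmp   # reports the mount containing /tmp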

abukaj
  • Take a usb drive, mount it in `/tmp`, save your data in `/tmp` and you're good to go – or did I misunderstand something here? – dessert Sep 05 '17 at 19:36
  • That may do the trick (unless there are other issues I do not know about yet). Thanks! – abukaj Sep 05 '17 at 19:42
  • @dessert sure, but I will try it as soon as: 1. I find a suitable usb drive, 2. NumPy finishes writing the main array of data into a plain `*.npy` file (I ripped it out of the data object; now disk fragmentation has kicked in :-/). – abukaj Sep 05 '17 at 19:58
  • @dessert both salvage methods worked. :-) – abukaj Sep 06 '17 at 08:52
  • Excellent! I'll write an answer for you to accept so that we can close here. – dessert Sep 06 '17 at 08:58

1 Answer

Follow these steps (a condensed command sketch follows the list):

  1. Find a drive, e.g. a USB stick, that has enough space for your data and plug it in
  2. Get its device partition name using lsblk; I use /dev/sdb1 as an example here.
  3. If your drive got automounted, first unmount it with sudo umount /dev/sdb1, then mount it in /tmp using

    sudo mount /dev/sdb1 /tmp
    
  4. Save your data to /tmp
  5. Unmount your drive using

    sudo umount /dev/sdb1
    
  6. Free space as explained in the links you provided.
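
In case a condensed overview helps, the mount/save/unmount part of the sequence boils down to something like this sketch (with /dev/sdb1 standing in for whatever partition name step 2 gives you on your machine):

    lsblk                        # step 2: identify the USB partition, e.g. /dev/sdb1
    sudo umount /dev/sdb1        # step 3: only needed if the drive was automounted
    sudo mount /dev/sdb1 /tmp    # step 3: mount the drive over /tmp
    # step 4: save your data to /tmp here
    sudo umount /dev/sdb1        # step 5: detach the drive again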
dessert
  • For the answer to be complete, it should mention that the whole problem was a "chicken and egg" one. The disk space was freed when I killed the IPython console. It seems NumPy tried to allocate a big temporary file, failed, and did not release the disk space. Then each time I tried to delete some files and store my data, the sequence repeated. A similar problem is solved here: https://stackoverflow.com/a/36349961/4879688 – abukaj Sep 06 '17 at 09:13
  • @abukaj I don't totally get it and therefore hesitate to add it to my answer, but I think your comment suffices to make it clear. Thank you! – dessert Sep 06 '17 at 09:15
  • Well, it is you who helped me to save my data. :) – abukaj Sep 06 '17 at 09:20
  • @abukaj You're very welcome! :) – dessert Sep 06 '17 at 09:29