25

Can I do something about it? I'm running out of disk space.

muru
evencoil

4 Answers

34

On one of my systems that acts as a backup server, mlocate.db hit 9GB. The solution was to exclude the backup directories from locate, since I had no need to search them.

I did this by adding the backup directory to PRUNEPATHS in /etc/updatedb.conf.

Running sudo updatedb then reduced it to 1.6MB (and saves a huge amount of time that was previously spent indexing all of those files).
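As a rough sketch of that change, using /mnt/backup as a placeholder for the actual backup directory (the entries already present in PRUNEPATHS vary between releases, so keep whatever is there):

    # /etc/updatedb.conf: append the backup directory to the space-separated list
    PRUNEPATHS="/tmp /var/spool /media /mnt/backup"

    # Rebuild the database so the exclusion takes effect
    sudo updatedb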

user76225
  • Perfect, thank you. Just went from 800MB to 100MB, but more importantly, the updatedb command can complete very quickly now, whereas before it was taking days. (Note, however, that it only runs when the computer is not on battery, and it uses low priority IO, according to the script at /etc/cron.daily/mlocate.) – mlissner Apr 19 '18 at 18:06
  • Wow 9 GB down to 1.6 MB! – WinEunuuchs2Unix May 23 '18 at 02:17
9

If you have lots and lots of files on your machine, you may want to consider pruning some paths from the database. You can do this in /etc/updatedb.conf under PRUNEPATHS. You can also prune file systems (like nfs, if you so desire).
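A minimal sketch of both options in /etc/updatedb.conf (the default entries differ between releases, so these values are illustrative only):

    # Skip entire file system types during indexing
    PRUNEFS="nfs NFS sshfs"

    # Skip specific directory trees
    PRUNEPATHS="/tmp /media"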

Zach Bethel
2

800MB sounds like a lot. My /var/lib/mlocate/mlocate.db is only about 8MB (a fresh install from the 10.04 release date). You can safely delete it; if you then run sudo updatedb, it will be recreated.
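A minimal sketch of that sequence:

    # Remove the current database
    sudo rm /var/lib/mlocate/mlocate.db

    # Recreate it from scratch and check the new size
    sudo updatedb
    du -h /var/lib/mlocate/mlocate.db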

Lekensteyn
  • thanks, that worked. After running sudo updatedb it is now 620MB. edit: oops I read that wrong. 620MB would mean it didn't work (I thought I read KB on my file output). – evencoil Jan 28 '11 at 15:17
  • How many files do you have? What is your disk size? – Lekensteyn Jan 28 '11 at 20:57
  • Is it really that big? Size in MB: 'du -m /var/lib/mlocate/mlocate.db' –  Jan 28 '11 at 21:12
  • Also, both `du` and `ls` have an `-h` flag: `-h, --human-readable` prints sizes in human readable format (e.g., 1K 234M 2G) – belacqua Jan 28 '11 at 22:02
  • It is really that big...I do have a lot of very small files (related to some data work) across several hard drives, so maybe that is why. edit: I also keep many backups...Is there a way to exclude directories from the indexer? Probably indexing these backups is the big problem. – evencoil Jan 30 '11 at 12:28
-1

It's a database of all the files under your root directory. It is used by the locate utility; if you delete this file, locate will no longer work.
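For reference, a quick sketch of how the two fit together (assuming the mlocate package is installed):

    # locate answers queries from the prebuilt database instead of scanning the disk
    locate bash.bashrc

    # updatedb (run as root) is what regenerates /var/lib/mlocate/mlocate.db
    sudo updatedb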

binW