Find Large Files in Linux
I recently needed to find large files in Linux to clear up some space. Using the find and awk commands, I was able to list files bigger than 50 MB. You can change the values to suit your needs.
To find all files over 50,000 KB (roughly 50 MB) in size and display their names along with their sizes, use the following syntax:
find {/path/to/directory/} -type f -size +{size-in-kb}k -exec ls -lh {} \; | awk '{ print $9 ": " $5 }'
As you can see, the find command uses the -type, -size and -exec options to look for regular files over a given size and run ls -lh on each match. The pipe passes that output to awk, which prints columns 9 and 5: the file name and its size.
To find files 50 MB or bigger in the current directory, enter:
$ find . -type f -size +50000k -exec ls -lh {} \; | awk '{ print $9 ": " $5 }'
To find files 50 MB or bigger in a specific directory, for instance /var/log:
$ find /var/log -type f -size +50000k -exec ls -lh {} \; | awk '{ print $9 ": " $5 }'
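If you also want the results sorted largest-first, one alternative is to let du report the sizes and sort them. This is a sketch assuming GNU find and coreutils (the +50M size suffix and sort's -h flag are GNU extensions):

```shell
# Find files over 50 MB and list them largest-first.
# "-size +50M" uses megabyte units directly instead of "+50000k";
# "du -h" prints human-readable sizes and "sort -rh" orders them descending.
find . -type f -size +50M -exec du -h {} + | sort -rh | head -n 20
```

Using `-exec ... {} +` instead of `\;` runs du once on a batch of files rather than once per file, which is noticeably faster on large trees.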
I hope this helps you to find those elusive space eaters.
The GNU version of find also has -printf, which can be handy in some cases, though I don't think it has a nice way to print human-readable sizes. A common find gotcha: files with spaces in their names can break things if you're passing the results on to xargs. In that case use -print0 on the find side and -0 on the xargs side.
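The commenter's two points can be sketched like this, assuming GNU find (the -printf directives shown are %s for size in bytes and %p for the path):

```shell
# Print size and name directly with -printf, no ls/awk needed
# (sizes are in bytes; -printf has no human-readable size format):
find . -type f -size +50000k -printf '%s bytes: %p\n'

# Filenames with spaces survive the pipe when find emits NUL-separated
# paths (-print0) and xargs reads them back with -0:
find . -type f -size +50000k -print0 | xargs -0 ls -lh
```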
Not sure what flavour of Linux you're using, but this works on our Red Hat servers just fine. The above command actually deletes all log files 3 days old or less; it should be +3 to delete older ones.
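The -mtime distinction behind that correction: -mtime -3 matches files modified within the last 3 days, while -mtime +3 matches files modified more than 3 days ago. A sketch (the *.log pattern under /var/log is illustrative):

```shell
# List *.log files modified MORE than 3 days ago ("+3" = older than 3 days):
find /var/log -name '*.log' -type f -mtime +3 -print
# By contrast, -mtime -3 matches files modified WITHIN the last 3 days.
# Once the listed files look right, replace -print with -delete to remove them.
```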
This command was run on RHEL 5.x and was designed to delete files by size, not by date, so it is possible you would see what you are describing.