Keeping Logfiles at Bay with bash

From time to time an application creates logfiles that are large but that you still need in order to fix an issue; most of the time these are trace logs. I ran into a situation like this and came up with a short, simple bash script to address it.

#!/bin/bash
####################################################################
#  This will look for trace logs, zip them, and delete compressed
#  files older than 2 days.
#  The pattern it will zip is trace_*.log
####################################################################
PATH=/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/bin:/usr/local/sbin
LOC="/path_to_logs"
pushd "${LOC}" || exit 1

for i in trace_*.log; do
    [ -e "$i" ] || continue          # no matching files; skip the literal glob
    if lsof "$i" >/dev/null 2>&1; then
        echo "File $i is in use, try later"
    else
        gzip "$i"
        chown nobody:nobody "$i.gz"
    fi
done
find . -type f -mtime +2 -name 'trace_*.gz' -exec /bin/rm {} \;
popd

Looking at the script, we see that it uses the LOC variable for the location of the logfiles, and it sets PATH explicitly so the commands are found regardless of the calling environment. Basically the script flows as such.

Set the PATH and location variables, change to the directory containing the files, loop over the matching files, and check whether each one is in use. If a file is in use, skip it and move on to the next. Otherwise, zip it up and change its ownership, and repeat until done.
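The in-use check can be exercised on its own. Here is a minimal sketch (the filename is a hypothetical placeholder) that asks lsof about one specific file rather than grepping the full process list:

```shell
# Check whether one specific file is open by any process.
# lsof exits 0 when the file is in use, non-zero otherwise.
# trace_example.log is a hypothetical filename for illustration.
f="trace_example.log"
if lsof "$f" >/dev/null 2>&1; then
    status="in use"   # some process has it open; skip for now
else
    status="free"     # safe to gzip
fi
echo "$f is $status"
```

Pointing lsof at the file directly avoids false matches that a plain `grep` against the full lsof output could produce.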

Finally, the script uses find with the -mtime switch to look for compressed files older than 2 days and deletes them.
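Before letting find delete anything, it can be worth previewing the matches with -print. A sketch, assuming the same placeholder directory as the script above:

```shell
# Preview which compressed trace logs are older than 2 days
# before actually deleting them.
# /path_to_logs is the same placeholder directory used in the script.
LOC="/path_to_logs"
find "$LOC" -type f -name 'trace_*.gz' -mtime +2 -print 2>/dev/null
# Once the list looks right, swap -print for -exec /bin/rm {} \;
```

Note that `-mtime +2` matches files whose modification time is more than two full 24-hour periods ago.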

The script is generic enough that several of the values can be changed to meet your needs.
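One way to make those tunable values explicit is to wrap the flow in a small function. This is a sketch, not the original script: the rotate_logs name and its parameters are assumptions, and the chown step is reduced to a comment since it needs root:

```shell
# Sketch: the same compress-then-expire flow with the tunable values
# passed in as parameters instead of hard-coded.
rotate_logs() {
    loc="$1"        # directory holding the logs
    pattern="$2"    # glob for the uncompressed logs, e.g. 'trace_*.log'
    keep_days="$3"  # delete compressed copies older than this many days

    cd "$loc" || return 1
    for f in $pattern; do
        [ -e "$f" ] || continue          # glob matched nothing
        if lsof "$f" >/dev/null 2>&1; then
            echo "File $f is in use, try later"
        else
            gzip "$f"
            # chown nobody:nobody "$f.gz"  # uncomment if a specific owner is needed
        fi
    done
    find . -type f -name "${pattern}.gz" -mtime "+$keep_days" -exec rm {} \;
    cd - >/dev/null
}
```

A call like `rotate_logs /path_to_logs 'trace_*.log' 2` then reproduces the behavior of the script above.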
