I had to scratch my head a little when I was supporting Oracle in a Unix/Linux/AIX environment. We kept getting tickets saying the /app/oracle partition exceeded 90% full.
So I'd log into the server and run df -h /app/oracle (or df -g /app/oracle on AIX), and sure enough it exceeded 90% full.
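If you want to script that check rather than eyeball it, something like this should pull out just the usage figure (a rough sketch assuming Linux df -h output, where Use% is the fifth column; the AIX df -g layout puts %Used in a different column):

df -h /app/oracle | awk 'NR==2 {print $5}'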
But which files should I delete or gzip?
So I'd do this...
find /app/oracle -type f -size +100000 -exec ls -l {} \;
to locate large files (with no suffix, -size counts 512-byte blocks, so +100000 means roughly anything over 50 MB), and this to locate dump files.
find /app/oracle -name '*.dmp' -exec ls -l {} \;
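To see the worst offenders first, the same output can be piped through sort. A quick sketch, assuming the usual ls -l layout where the size is the fifth column:

find /app/oracle -type f -size +100000 -exec ls -l {} \; | sort -k5,5 -rn | head -20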
But the biggest problem was large numbers of small log files clogging up directories. I couldn't find a suitable Unix command for that, so I wrote one.
File - ascr1 (this will display the count of files in directory $1, but only if it exceeds the number in $2). Give it execute privs.
#!/bin/sh
# ascr1: print "<dir> - <count>" when the file count in $1 exceeds the threshold in $2
count=`ls "$1" | wc -l`
if [ $count -gt "$2" ]; then
  echo "$1 - $count"
fi
Then use it in a find command. In this example it will only display directories containing more than 1000 files.
find /app/oracle -type d -exec ./ascr1 {} 1000 \;
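If you'd rather not keep a separate script around, the same test can be inlined with sh -c. This is just a sketch, but it should behave the same as ascr1 (the _ fills $0 so the {} directory lands in $1 and the threshold in $2):

find /app/oracle -type d -exec sh -c 'n=$(ls "$1" | wc -l); [ $n -gt "$2" ] && echo "$1 - $n"' _ {} 1000 \;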
Happyjohn.