Re: find errors in a directory of files




On 05.08.2012 00:19, Tim Dunphy wrote:
>  I'm trying to write a script that will search through a directory of trace
> logs [...] and it's not possible to know the exact
> names of the files before they are created. The purpose of this is to
> create service checks in nagios.
[...]
> The problem with this script is that it is only able to detect one error in
> the logs. If you echo more than one test phrase into a log file or into
> multiple log files it still only picks up one error message.

That is a consequence of using the variable=($(...)) + echo idiom.
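To see why that idiom loses matches, here is a minimal demonstration (the sample strings are stand-ins for grep output): command substitution inside `(...)` splits the output on whitespace into array elements, and an unquoted `$variable` expands to only the first element.

```shell
# Simulate multi-line grep output captured into an array.
matches=($(printf 'ORA-00600 found\nORA-04031 found\n'))
echo $matches          # → ORA-00600   (first element only)
echo "${matches[@]}"   # → ORA-00600 found ORA-04031 found (all elements)
```

So echoing the variable can never report more than the first word of the first match.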
If you write your script as

#!/bin/bash
# Default to 0 (OK) so the exit code is well defined when nothing matches.
status=0
log1='/u01/app/oracle/admin/ecom/udump/*'
# $log1 is left unquoted so the glob expands; any match sets status to 2.
grep -e 'ORA-00600' -e 'ORA-04031' -e 'ORA-07445' $log1 && status=2
echo "$status"
exit "$status"

so that grep writes its matches to standard output, you'll get a
complete list of them. (If you need a specific output format you can
replace "grep" with "sed".)
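For example, a sed one-liner can reformat each hit as it prints it. This is only a sketch: the sample file is a stand-in, and the `\|` alternation is a GNU sed extension (the default sed on CentOS).

```shell
# Create a throwaway trace file with one matching line.
demo=$(mktemp)
printf 'ok line\nORA-00600 internal error\n' > "$demo"

# Print only lines containing one of the error codes, reformatted.
sed -n 's/^.*\(ORA-00600\|ORA-04031\|ORA-07445\).*$/found error \1/p' "$demo"
# → found error ORA-00600
```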

But I doubt that Nagios will be able to receive more than one error
from a single plugin invocation. AFAIK it expects a single-line result.
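If that is the case, one way around it is to condense all matches into a single summary line and use the exit code to signal severity. A sketch, assuming the standard Nagios convention of exit 0 for OK and exit 2 for CRITICAL (the directory argument replaces the hard-coded path from the script above):

```shell
# check_ora_errors DIR: count lines containing critical ORA- codes in
# all files under DIR, print one Nagios-style status line, and return
# 2 (CRITICAL) if any were found, 0 (OK) otherwise.
check_ora_errors() {
    local logs="$1"/*
    local count
    # -c prints a per-file match count, -h drops the filename prefix;
    # awk sums the counts (printing 0 if grep produced no output).
    count=$(grep -h -c -e 'ORA-00600' -e 'ORA-04031' -e 'ORA-07445' \
                 $logs 2>/dev/null | awk '{ s += $1 } END { print s + 0 }')
    if [ "$count" -gt 0 ]; then
        echo "CRITICAL: $count ORA error line(s) in trace logs"
        return 2
    fi
    echo "OK: no ORA errors found"
    return 0
}
```

Invoked as e.g. `check_ora_errors /u01/app/oracle/admin/ecom/udump`, this always emits exactly one line, which a Nagios service check can consume.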

HTH
T.

-- 
Tilman Schmidt
Phoenix Software GmbH
Bonn, Germany
_______________________________________________
CentOS mailing list
CentOS@xxxxxxxxxx
http://lists.centos.org/mailman/listinfo/centos

