In bash, I have created a simple daemon to execute commands when my internet connection changes:
#!/bin/bash
doService() {
    while true; do
        checkTheInternetConnection
        sleep 15
    done
}

checkTheInternetConnection() {
    if unchanged since last check; then
        return
    else
        someCommand
    fi
}

someCommand() {
    # do something
    :
}

doService
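For context, here is one way the pseudocode above could be fleshed out. This is only a sketch of my setup: the arp-based check, the "connection changed" action, and the first-run guard are all assumptions, not the real script.

```shell
#!/bin/bash
# Hypothetical concrete version of the loop above: snapshot the ARP table
# each pass and run someCommand only when it differs from the last snapshot.

last=""

someCommand() {
    echo "connection changed"         # placeholder for the real action
}

checkTheInternetConnection() {
    local current
    current=$(arp -a 2>/dev/null)     # current ARP table snapshot
    # Skip the very first pass (last is empty), then fire on any change.
    if [ -n "$last" ] && [ "$current" != "$last" ]; then
        someCommand
    fi
    last=$current
}

doService() {
    while true; do
        checkTheInternetConnection
        sleep 15
    done
}

# doService    # uncomment to start the daemon loop
```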
And this has been working pretty well for what I need it to do.
The only problem is that as part of my “someCommand” and “checkTheInternetConnection” I use other standard utilities like arp, awk, grep, head, etc.
However, 99% of the time, I will just need arp.
First question: Do these other commands keep running after I’ve used them? Is there a way to kill a command once I’ve already processed its output?
Another question: (MOVED TO A NEW POST)
I am having a hell of a time trying to write a “kill all other daemon processes” function. I do not ever want more than one daemon running at once. Any suggestions? This is what I have:
otherprocess=$(ps ux | awk '/BashScriptName/ && !/awk/ {print $2}' | grep -vw "$$")
WriteLogLine "Checking for running daemons."
if [ -n "$otherprocess" ]; then
    WriteLogLine "There are other daemons running, killing all others."
    echo "$otherprocess" | xargs kill
else
    WriteLogLine "There are no daemons running."
fi
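A different approach than matching ps output, offered only as a sketch: take an exclusive lock on a file with flock(1) at startup, so a second copy of the daemon refuses to run. The lock file path here is made up, and the lock is released automatically when the process exits.

```shell
#!/bin/bash
# Single-instance guard using flock(1). If another copy of the daemon
# already holds the lock, this one exits immediately.

exec 9>/tmp/mydaemon.lock           # open fd 9 on the (hypothetical) lock file
if ! flock -n 9; then               # non-blocking: fail if already locked
    echo "Another instance is already running." >&2
    exit 1
fi

# ... daemon work goes here; the lock is held as long as fd 9 stays open ...
```

This avoids both the fragile pattern match on the script name and the need to kill anything: the second instance simply never starts.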
Can you give more detail on the first question? I think you are asking about running several commands piped together (cat xxx|grep yyy|tail -zzz).
Each command keeps running until its pipe reaches EOF. So in this example grep will only exit after cat has processed all the input and closed its end of the pipe. But there is a subtlety: cat can only finish writing (and close its end of the pipe) once grep has read, or at least buffered, the input, because write calls on pipes block when the pipe buffer is full. You need to keep this in mind while designing your scripts.
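A quick way to see that pipeline members don’t linger (my illustration, not part of the original answer): `yes` on its own would loop forever, but in a pipeline it dies as soon as its reader goes away. Once `head` has printed three lines it exits and closes its end of the pipe, and the next write by `yes` fails with SIGPIPE, terminating it.

```shell
# `yes` writes "y" forever, but the pipeline still finishes:
# head exits after 3 lines, closing the pipe, and the next
# write from yes is killed by SIGPIPE.
yes | head -n 3
# prints:
# y
# y
# y
```

The same mechanism cleans up the utilities in your script: when the consumer of their output exits, the producers are torn down too.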
But I don’t think you should worry about those standard utilities. They generally have a small memory footprint, if memory is the concern.