Is it OK to leave a small endless-loop script running on a server?

Perl & Unix - quick question

  • So a couple of weeks ago I explained something I was trying to do. I have a site that periodically pulls headlines off certain web sites, then indexes them for display on my site. It does this via two scripts -- one which 'pulls' the HTML of the various sites & saves it, and another which 'parses' these files to get the right links. I wanted the scripts to execute every 15 minutes so that my site would update automatically. I received an answer on here about using a crontab, so I set up the crontab to execute a little Unix script, which switched into the cgi-bin directory and executed the two Perl scripts in sequence.

    Everything seemed to work fine for about two weeks, although I did notice that, sometimes, when I executed one of the Perl scripts manually, it would stall out and I'd have to suspend it with Ctrl-Z. I know it's an intense script with lots of file transferring, and I suppose a non-responsive site could cause the script to hang. Well, apparently, last night, my little operation crashed my server (oops!). The sysop told me I had 100-150 instances of my little Unix script running simultaneously. Apparently they were not always exiting correctly -- I assume that when the Perl scripts stalled, the Unix scripts would just sit there perpetually, waiting for them to finish.

    So my question is, how do I rectify this? Is there a foolproof way of making sure that all instances are killed, even if they've somehow gotten stuck in an endless loop? Is there a crontab command I could add that would kill everything running once an hour or something? In short, how do I avoid getting myself banned from this server?? Thanks!!
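    For reference, the setup described above -- a cron entry running a small wrapper that changes into cgi-bin and runs the two Perl scripts in order -- might look roughly like this. The paths and the script names (pull.pl, parse.pl) are placeholders, not the poster's actual names:

    ```shell
    # Hypothetical crontab entry (edit with `crontab -e`):
    # run the wrapper every 15 minutes.
    */15 * * * * /home/you/update.sh

    # where update.sh is the small wrapper script:
    #   #!/bin/sh
    #   cd /home/you/cgi-bin && perl pull.pl && perl parse.pl
    ```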

  • Answer:

    First of all, in order to kill a running command, you must know its Process ID (PID). The pidof command will give you the PIDs of all running processes with a given name (try 'pidof httpd'). If your only Perl scripts are ones you want killed hourly, you can use 'pidof perl' to get a list of PIDs for all instances of perl -- yours and other users'. Then you can feed these numbers to kill, which will try to kill the processes. Killing processes owned by other users will not work, but your own running processes will be killed.

    A more elegant approach is to use 'ps ax' to list all running processes and to search (using grep and awk, using Perl, or some other way) for only the processes you want to kill. For example, if your script is run as '/usr/bin/perl foo', you could get the PIDs of only those perl processes, as opposed to *all* perl processes, and kill them instead. Once you have written a simple script to do this, set it up to run hourly using crontab, just as you did with your other two scripts.
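    A minimal sketch of such a cleanup script, following the ps/grep/awk approach above. The script names pull.pl and parse.pl are placeholders for whatever your own scripts are called:

    ```shell
    #!/bin/sh
    # kill_stale.sh -- terminate leftover instances of the fetch/parse scripts.
    # The names pull.pl and parse.pl are placeholders; substitute your own.

    for name in pull.pl parse.pl; do
        # List every process, keep the lines mentioning the script name,
        # drop the grep process itself, and take the PID from column 1.
        pids=$(ps ax | grep "$name" | grep -v grep | awk '{print $1}')
        for pid in $pids; do
            kill "$pid" 2>/dev/null   # sends TERM; errors on already-gone PIDs are ignored
        done
    done
    ```

    Run it hourly from the same crontab, e.g. '0 * * * * /home/you/kill_stale.sh'. Matching on your own script names (rather than killing every perl process) keeps it from touching other users' processes, which kill would refuse to signal anyway.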

tjsnod-ga at Google Answers
