Ed, I completely agree; however, this is what has to be done:

1. wget is used to pull an image that changes every two seconds from an HTTP server. I find wget the best tool for this, and I can't think of an equivalent tool on the Windows side (synching does not work as well).
2. That same image needs to be FTP'd to a different server as often as it changes (every 2 seconds). True, SMB would be a way to go, but SMB is shut off on that server.
3. That image is then accessed via HTTP on the second server (basically, HTTP in the last step is an Internet resource; HTTP in the first step is an internal resource).

Facts:

1. I cannot open the firewall (for various reasons beyond my control).
2. FTP is the only way to get files onto the HTTP server in step three, and it's ours.
3. Rude and a waste of resources it is, but the good thing is that it's only needed for two hours and never again.

As far as Perl scripting goes, I cannot do it to save my life (true, I never attempted to), and I need this in the next couple of hours. So if you can spare that Perl script, I'd appreciate it very much.

Sebastijan

-----Original Message-----
From: redhat-list-bounces@xxxxxxxxxx [mailto:redhat-list-bounces@xxxxxxxxxx] On Behalf Of Ed Wilts
Sent: Thursday, May 13, 2004 10:22 AM
To: General Red Hat Linux discussion list
Subject: Re: Scheduling of tasks/what's the maximum frequency of the task being executed?

On Thu, May 13, 2004 at 09:23:48AM -0400, Sebastijan Petrovic wrote:
> I have several tasks that need to be executed every two seconds. Using
> the RH9 built-in task scheduler, the greatest frequency of a task is every 5
> minutes. The at command also does not allow for such a frequent schedule.
>
> Do any of you know of a way to accomplish this? Details of what I need
> done are as follows:
>
> 1. Run wget to download an image (that changes every two seconds)
>    using HTTP, keep overwriting that image, and
> 2.
> Upload that image every two seconds to an FTP share.

Against my better judgement, since I have some doubts that you really want to do this, here's how I would tackle it:

1. Do not use any scheduler whatsoever.
2. I'm assuming the FTP server is yours. Trying to log into somebody else's FTP server every 2 seconds goes beyond rude - it's offensive.
3. Write a Perl script to do the get and put. Don't use wget; use the callable routines to fetch the file. Open a single FTP session and put the file at regular intervals. Better yet would be to write a custom app to copy the file over, or better still, to not copy the file at all unless it's needed - via NFS, SMB, or something similar. Copying the same image every 2 seconds is a large waste of resources unless users are actually pulling that image off your FTP server every 2 seconds.
4. Do *NOT* log into the FTP server every 2 seconds to put a file. FTP isn't that efficient, and doing a new login every 2 seconds is going to kill the FTP server.

-- 
Ed Wilts, Mounds View, MN, USA
mailto:ewilts@xxxxxxxxxx
Member #1, Red Hat Community Ambassador Program

-- 
redhat-list mailing list
unsubscribe mailto:redhat-list-request@xxxxxxxxxx?subject=unsubscribe
https://www.redhat.com/mailman/listinfo/redhat-list
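The approach Ed describes - one long-running process, one persistent FTP login, no scheduler - could be sketched in Perl roughly as follows. This is a minimal sketch, assuming the Net::FTP module (part of core Perl's libnet) and LWP::Simple (from libwww-perl, commonly installed on RH9) are available. The URL, hostname, directory, and credentials below are placeholders to replace with your own:

```perl
#!/usr/bin/perl
# Sketch: fetch an image over HTTP every 2 seconds and upload it over a
# single persistent FTP session. All hosts/paths/credentials are placeholders.
use strict;
use warnings;
use LWP::Simple qw(getstore is_success);
use Net::FTP;

my $image_url = 'http://internal.example.com/camera.jpg';  # internal HTTP source (placeholder)
my $local     = '/tmp/camera.jpg';                         # local scratch copy
my $ftp_host  = 'ftp.example.com';                         # your FTP/HTTP server (placeholder)

# Log in ONCE and keep the session open - never a fresh login per transfer.
my $ftp = Net::FTP->new($ftp_host, Timeout => 30)
    or die "Cannot connect to $ftp_host: $@";
$ftp->login('user', 'password') or die "Login failed: ", $ftp->message;
$ftp->binary;                                   # images must go in binary mode
$ftp->cwd('/htdocs') or die "cwd failed: ", $ftp->message;

while (1) {
    # wget equivalent: fetch the image, overwriting the local copy each time.
    my $rc = getstore($image_url, $local);
    if (is_success($rc)) {
        # Re-upload over the already-open session.
        $ftp->put($local, 'camera.jpg')
            or warn "put failed: ", $ftp->message;
    } else {
        warn "GET failed with HTTP status $rc\n";
    }
    sleep 2;                                    # the 2-second cadence
}
```

Since this is only needed for a two-hour window, you could start it with `nohup ./mirror-image.pl &` and kill it when done. Note that some FTP servers drop sessions held open that long, so in practice you might wrap the `put` in a check that reconnects on failure.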