From further investigation I have found that the issue is exactly how long certain instances of the task run for. Most runs complete successfully in a few seconds, yet a few consecutive tasks run for up to an hour (and are then stopped by the manager). At this point, any information about the Windows scheduled task life cycle would be appreciated.
I have a php file which needs to be called every 10 minutes.
The PHP file deals with new database entries, which are updated on each call.
The file is executed via a scheduled task that calls php.exe with the -f argument (followed by the file path).
The task is performed as expected, and runs without fail.
However, I have noticed two problems since its initial run.
First, I had a file_put_contents call that appended a 'Schedule Task()' line plus the current time to a log file every time the task ran. For the first few days this worked fine, but one evening the command failed to execute because the log file had filled up (a .txt at just under 1GB).
I checked, and the logfile read as:
Schedule Task() 2014-10-10 18:00:00.000
Schedule Task() 2014-10-10 18:00:00.000
Schedule Task() 2014-10-10 18:00:00.000
(repeated for thousands of lines, but with the milliseconds incrementing every 500 lines or so)
Where I'd expect it to display as:
Scheduled Task () 2014-10-10 18:00:00.000
Scheduled Task () 2014-10-10 18:10:00.000
Scheduled Task () 2014-10-10 18:20:00.000
This suggests that the process got stuck looping on that line, as the milliseconds were identical for roughly 500 lines at a time.
To resolve this, I moved the file_put_contents line behind an IF statement, to ensure the line would only be written to the log file when new records were in the database.
The problem never happened again.
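Roughly what the fix looks like (a minimal sketch only; the connection details, table name and log path here are placeholders, not my real code):

<?php
// Sketch: only write to the log when there are new records to process.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

$new = (int) $pdo->query('SELECT COUNT(*) FROM entries WHERE processed = 0')->fetchColumn();

if ($new > 0) {
    file_put_contents(
        'C:\\logs\\schedule_task.log',
        'Schedule Task() ' . date('Y-m-d H:i:s') . "\r\n",
        FILE_APPEND
    );
    // ... process the new entries here ...
}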
Second, three weeks after the first scheduled task (today), I noticed that the scheduled task's history is filled with errors, at a 10-minute interval.
The error code given is 'Duplicate instances running' (which is expected, as I have set the option not to start a new instance if one is already in progress).
This suggests that the Task Scheduler is starting the action multiple times (multiple processes).
At this stage, I'm not sure whether it is a fault in the scheduled task, or a code issue where I am not correctly ending the script. Has anyone had a similar issue?
Update
From looking into the Task Scheduler logs, I can see that there is an issue with the task running longer than 1 hour (see image).
The call is made every 10 minutes; however, the task run (red line) is not exiting correctly.
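In case it is a code issue, I am also considering putting a hard cap on the run time so a stuck run can never block the following ones for an hour; something like this (a sketch only, and the 5-minute limit is an arbitrary value I picked):

<?php
// Sketch: cap the run time so a stuck run cannot keep going for an hour.
// On the CLI, max_execution_time defaults to 0 (unlimited), so set it explicitly.
set_time_limit(300); // arbitrary 5-minute cap (placeholder value)

// ... normal processing of the new database entries ...

exit(0); // end the process explicitly so Task Scheduler sees the run as finished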
Related
Currently, I have done the following:
I created one scheduled task which runs daily to get the scheduled times from the MySQL DB for the current date and store them in a .txt file:
SELECT workflow_id, DATE_FORMAT(schedule_datetime, '%H:%i') AS TIMEONLY
FROM scheduling_event
WHERE DATE(schedule_datetime) = CURDATE()
I created one more scheduled task that runs every 5 minutes to check whether a scheduled time in the .txt file matches the current time; if it does, it calls the scheduled_program.php file.
The issue here is that this is not efficient if nothing is scheduled for the current date. So is there any way to create/update a dynamic scheduled task instead of running every 5 minutes? i.e. the first scheduled task would run, take the scheduled times for the current date, and then create a task for each of those times; when the day ends, all of that day's scheduled tasks would be deleted. (A rough sketch of what I have in mind is below.)
Note: the number of scheduled tasks is not fixed. I'm using Windows 10 and PHP 7.
What I am trying to achieve: run the scheduled_program.php file on a scheduled date and time.
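Roughly what I have in mind for the dynamic version (a sketch only; the connection details, task names and paths are placeholders, and I have not got this working):

<?php
// Sketch: a daily task reads today's schedule times and creates a one-shot
// Windows task for each of them via schtasks. All names/paths are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

$times = $pdo->query(
    "SELECT workflow_id, DATE_FORMAT(schedule_datetime, '%H:%i') AS TIMEONLY
     FROM scheduling_event
     WHERE DATE(schedule_datetime) = CURDATE()"
);

foreach ($times as $row) {
    $taskName = 'workflow_' . $row['workflow_id'] . '_' . date('Ymd');
    $cmd = sprintf(
        'schtasks /Create /F /SC ONCE /ST %s /TN "%s" /TR "C:\\php\\php.exe -f C:\\scripts\\scheduled_program.php %d"',
        $row['TIMEONLY'],   // e.g. 14:30
        $taskName,
        $row['workflow_id']
    );
    exec($cmd, $output, $exitCode);
}

// A task at the end of the day could then remove them again:
// schtasks /Delete /F /TN "workflow_<id>_<date>"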
It looks like you're trying to solve a problem that isn't really a problem. I guess you don't want to waste your computer's time running a cron job every five minutes if it has nothing to do.
But here's the thing:
a cron job
that runs a php program
that does a single query
to retrieve a list of workflows
and run them
has negligible cost if you run the cron job every five minutes, or even every single minute, and there are no workflows to run.
On the other hand, debugging and troubleshooting cron jobs is hard, especially in production.
So, I respectfully suggest you keep this system as simple as you possibly can. You will have to explain it over the telephone to somebody in the middle of the night at least once. That's the unfortunate truth of scheduled tasks. The dynamic scheduled task system you propose is not simple.
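To make the point concrete, the every-few-minutes job can be as small as this (a sketch only; the table, columns and the runWorkflow() helper are assumptions, not your actual schema):

<?php
// Sketch of the simple polling job: find workflows that are due and run them.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

$due = $pdo->query(
    "SELECT workflow_id FROM scheduling_event
     WHERE schedule_datetime <= NOW() AND status = 'pending'"
);

foreach ($due as $row) {
    runWorkflow((int) $row['workflow_id']);
    $pdo->prepare("UPDATE scheduling_event SET status = 'done' WHERE workflow_id = ?")
        ->execute([$row['workflow_id']]);
}
// If nothing is due, the script does one cheap query and exits: negligible cost.

function runWorkflow(int $workflowId): void
{
    // hypothetical helper: whatever running a workflow means in your application
}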
I have created a task to open a website every x minutes.
This is what I have.
program: "C:\Program Files (x86)\Google\Chrome\Application\chrome.exe"
argument: https://phpfile on my server
start in: C:\Program Files (x86)\Google\Chrome\Application\
It starts manually but never repeats automatically.
It shows the repeat time correctly but never repeats; the repeat time just keeps updating.
I basically want to run a PHP script on my website every few minutes.
Please help.
I solved my problem on Windows 10 by setting multiple triggers on my task in Task Scheduler.
After that I rebooted my computer and it is finally working fine.
This is the link I refer to: https://superuser.com/questions/865067/task-scheduler-repeat-task-not-triggering.
Below is the screenshot of my task's properties in Task Scheduler.
You can also create a scheduled task and set the trigger to a time after the present (not in the past, as it won't start then), say 10 minutes from now. Then don't run the task manually; wait 10 minutes and Windows will start scheduling the task every hour or whatever interval you set. Basically, the Windows scheduler needs at least one automatic trigger, rather than a manual Run by a user, to start running the task automatically.
I had the same issue and solved it by removing spaces from the name of the task and from the executed path. Task Scheduler doesn't like spaces.
I am using CakePHP.
I want to run scheduled jobs.
Here's the situation:
The user sets a time for some task in the UI (say, 1 week).
When the time is over, I want to execute some specific task. Meanwhile, the user can also change the time, and the task should then be executed at the updated time.
What is the best way (and reliable) to achieve this?
PS: Complexity is not the issue, but the task must always run after the specified time, under all circumstances.
Store the execution date in a field in your table
Set a status (pending) as well
Run a cron job that runs a CakePHP shell every X seconds or minutes, whatever you need, OR create a shell that keeps running all the time and checks the records every X seconds in a loop.
The shell will process tasks that are configured with an execution date earlier than the current date
Set the status to success or failed depending on the outcome
It's up to you how you want to handle failed tasks, and whether it's OK for a task to execute 10 seconds or 10 minutes later than configured. There are multiple factors that play into this: the interval of your cron job / query against the table, whether tasks have to be processed in parallel, and whether it's OK to process them one after another. Your information is too vague.
The way to do that in CakePHP is to create a shell and run it with cron jobs.
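A minimal CakePHP 3-style shell sketch (assuming a tasks table with execution_date and status columns; all names here are assumptions to adjust to your schema):

<?php
// src/Shell/ProcessTasksShell.php -- minimal sketch, names are assumptions.
namespace App\Shell;

use Cake\Console\Shell;
use Cake\I18n\FrozenTime;

class ProcessTasksShell extends Shell
{
    public function main()
    {
        $this->loadModel('Tasks');

        // Pending tasks whose execution date has passed.
        $pending = $this->Tasks->find()
            ->where(['status' => 'pending', 'execution_date <=' => FrozenTime::now()]);

        foreach ($pending as $task) {
            try {
                // ... do the actual work for $task here ...
                $task->status = 'success';
            } catch (\Exception $e) {
                $task->status = 'failed';
            }
            $this->Tasks->save($task);
        }
    }
}

A crontab entry to run it every minute could look like:
* * * * * cd /path/to/app && bin/cake process_tasks >> logs/tasks.log 2>&1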
I have several cron jobs executing php scripts. The php scripts sometimes do some heavy jobs like updating hundreds of records at a time in a mysql table.
The problem is that the job should run every minute. However, it randomly misses runs and as a result does not execute every minute. Sometimes it executes every 4-6 minutes, then goes back to every minute, misses 2-3 more times, and then is normal again.
I am on centos 6.5
Please note that the PHP runs correctly and there is no problem whatsoever with the PHP scripts themselves, since when the job does run I get the expected results. There are also about 10 other similar scripts running at the same time (every minute, or every 5 minutes for the other scripts).
Job:
/usr/bin/php "/var/www/html/phpScriptToExecute.php" >> /var/www/html/log/phpScriptLog.log 2>&1
My take is that it may be a problem with too many scripts running concurrently and accessing the database at the same time.
Last information: No error in the /var/log/cron file or in the phpScriptLog.log file.
The reason could be that your cron job takes more than 1 minute to execute; print out the start time and end time at the end of the script to validate this.
If the cron job is still running, Linux won't execute it again.
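For example (a sketch; it reuses the log file from the question):

<?php
// Sketch: log start and end times to see whether the job exceeds one minute.
$start = microtime(true);
file_put_contents('/var/www/html/log/phpScriptLog.log',
    'START ' . date('Y-m-d H:i:s') . "\n", FILE_APPEND);

// ... the existing heavy work (hundreds of MySQL updates) ...

$elapsed = microtime(true) - $start;
file_put_contents('/var/www/html/log/phpScriptLog.log',
    sprintf("END   %s (%.1f seconds)\n", date('Y-m-d H:i:s'), $elapsed), FILE_APPEND);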
My guess is it's caused by a PHP fatal error, but your PHP probably isn't configured to send error messages to stderr, which is why your phpScriptLog.log is empty. You could check your php.ini (or just use ini_set()) for the following:
display_errors: set it to On (or 'stderr') if you want errors to show in the script's output
log_errors: set it to true if you want to send the error messages to a file
error_log: point it to a file you want the errors to be stored in
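For example, at the top of the script (a sketch, assuming you'd rather not touch php.ini; the log path is a placeholder):

<?php
// Turn on error reporting for this CLI script only.
error_reporting(E_ALL);
ini_set('display_errors', '1');                             // show errors in the script output
ini_set('log_errors', '1');                                 // also log them
ini_set('error_log', '/var/www/html/log/php_errors.log');   // assumed path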
Or, if you want a solution to avoid overlapping cron jobs, there are plenty here in SO.
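One common pattern is a file lock at the top of the script, roughly like this (a sketch; the lock file path is arbitrary):

<?php
// Sketch: skip this run if a previous run is still holding the lock.
$lock = fopen('/tmp/phpScriptToExecute.lock', 'c');   // arbitrary lock file path

if (!flock($lock, LOCK_EX | LOCK_NB)) {
    // A previous instance is still running; exit quietly instead of overlapping.
    exit(0);
}

// ... heavy MySQL work here ...

flock($lock, LOCK_UN);
fclose($lock);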
I use Windows Task Scheduler to start up a PHP script, and that works perfectly fine. Basically: C:\php.exe -f C:\myscript.php
In my script some work happens that sometimes makes me want to run the task script again in 5 minutes.
I tried to implement this by changing the settings of the task to restart every 5 minutes if the task fails and having my php code exit(1). The task scheduler seems to know that I exited with an error code of 1, but it does not run the script again.
Does anyone know what I can do to make it so that Task Scheduler will try again in 5 minutes if I signal it from my code somehow?
Not an answer to the question as phrased, but might serve as a fallback if you can't get it working: make your job run every 5 minutes, regardless, and then track "last success"/"last failure" yourself, in a database or file.
Before doing anything else, the script can check the logged status, and if there was a failure last time, try again (up to a limited number of tries, presumably). If there was a success last time, exit immediately, unless it's time for the next job anyway (e.g. if the original schedule was daily, then check for $last_success being longer ago than 24 hours).
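A rough sketch of that idea (the state file name, the retry limit and the 24-hour window are all placeholders):

<?php
// Sketch: run every 5 minutes, but only do real work when the last run failed
// or when the regular (e.g. daily) schedule is due again.
$stateFile = 'C:\\scripts\\job_state.json';
$state = file_exists($stateFile)
    ? json_decode(file_get_contents($stateFile), true)
    : ['last_success' => 0, 'failures' => 0];

$dueAgain    = time() - $state['last_success'] >= 24 * 3600;     // normal daily schedule
$retryNeeded = $state['failures'] > 0 && $state['failures'] < 5; // cap retries at 5

if (!$dueAgain && !$retryNeeded) {
    exit(0); // nothing to do this time
}

$ok = doTheRealWork();

if ($ok) {
    $state = ['last_success' => time(), 'failures' => 0];
} else {
    $state['failures']++;
}
file_put_contents($stateFile, json_encode($state));

function doTheRealWork(): bool
{
    // placeholder for the actual job; return true on success, false on failure
    return true;
}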