I have a script that runs as part of my module in Drupal.
In this module, I use cURL functions to request an HTML page from a remote website, then I parse the HTML content and write it to my database.
I wanted a delay between every request I make to the remote server, so I added sleep() to my PHP script.
function get_me_data()
{
    for ($i = 0; $i < 15; $i++) {
        $html = file_get_contents($search_url);
        // parse html contents and write them to the database
        $delay = mt_rand($interval1, $interval2); // $interval1 = 0, $interval2 = 30
        $retval = sleep($delay);
    }
}
The function get_me_data() is initiated by a cron job (hook_cron implemented in the .module file).
I have also tried to flush the streams before calling sleep().
I used flush(); ob_flush(); fflush($filepointer); ob_implicit_flush(true);
It so happens that after 4-5 iterations, sleep() returns a non-zero value.
The script then aborts in the next iteration when sleep() is called.
Let me know if I am overlooking any aspect of sleep() here.
I am using PHP 5.3 and Drupal 6.
According to the docs (http://php.net/manual/en/function.sleep.php), sleep() can return non-zero values:
If the call was interrupted by a signal, sleep() returns a non-zero value. On Windows, this value will always be 192 (the value of the WAIT_IO_COMPLETION constant within the Windows API). On other platforms, the return value will be the number of seconds left to sleep.
A comment on the sleep() manual page uses time_sleep_until() instead, because it doesn't get interrupted.
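For reference, a minimal sketch of the same loop using time_sleep_until() instead of sleep(); the function and variable names are taken from the question, and the 0-30 second interval is the one mentioned there:

function get_me_data()
{
    for ($i = 0; $i < 15; $i++) {
        $html = file_get_contents($search_url);
        // parse html contents and write them to the database

        // time_sleep_until() expects an absolute timestamp, so pass "now + delay"
        $delay = mt_rand(0, 30);
        time_sleep_until(time() + $delay);
    }
}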
What I have to do is a bit complicated.
I have a Python script and I want to run it from PHP, but in the background. I read somewhere that to run a Python script in the background I have to use PHP's exec() command (exec('script.py')) so it runs without waiting for a return value, and so far that has caused no problem.
First question:
I have to stop this looping script with another PHP command; how can I do this?
Second question:
I have to implement a server-side timer that stops the script when the time is up.
I found this code:
<?php
$timer = 60*5; // seconds
$timestamp_file = 'end_timestamp.txt';
if(!file_exists($timestamp_file))
{
file_put_contents($timestamp_file, time()+$timer);
}
$end_timestamp = file_get_contents($timestamp_file);
$current_timestamp = time();
$difference = $end_timestamp - $current_timestamp;
if($difference <= 0)
{
echo 'time is up, BOOOOOOM';
// execute your function here
// reset timer by writing new timestamp into file
file_put_contents($timestamp_file, time()+$timer);
}
else
{
echo $difference.'s left...';
}
?>
From this answer.
Now, is there a way to implement this in a MySQL database? (Integrating it with the script stop is not a problem.)
That's actually pretty simple. You can use a memory object caching system; I would recommend memcached. Memory objects in memcached can be accessed from literally anywhere in your system. The only requirement is that you can connect to the memcached backend server (PHP can, Python can, etc.).
Answer to your first question:
Create a variable called stopme with the value 0 in the memcached database.
Connect to memcached from your Python script and read the variable stopme repeatedly; the Python script keeps running as long as stopme has the value 0.
To stop your script from PHP, make a connection from your PHP script to the memcached server and set stopme to 1.
The Python script sees the updated value on its next read and exits.
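A minimal sketch of the PHP side, assuming the memcached PECL extension, a local memcached server and the key name used above (host, port and key name are assumptions):

$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

// once, when the python script is launched:
$mc->set('stopme', 0);

// later, from any PHP script, to tell the python script to exit:
$mc->set('stopme', 1);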
Answer to your second question:
It could be done as explained in my answer above, by reading shared variables, but additionally I would like to mention that you could also use a cronjob to kill a running script.
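As a sketch only, the file-based timer from the question could also be moved into the same memcached store; the key names and the five-minute interval are assumptions carried over from the example above:

$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

$timer = 60 * 5; // seconds
if ($mc->get('end_timestamp') === false) {
    $mc->set('end_timestamp', time() + $timer); // start the timer
}

if ($mc->get('end_timestamp') - time() <= 0) {
    $mc->set('stopme', 1);                      // time is up: tell the python script to exit
    $mc->set('end_timestamp', time() + $timer); // reset the timer
}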
First, I want to say that I am aware of this thread: PHP - kill exec on client disconnect, but the solution marked there as the answer doesn't work!
Problem:
I have a script that executes another script that does numeric operations... but I would like to force it to stop on client disconnect, as those operations may take even a few minutes and use a lot of server resources. Hitting F5 a few times only makes those run times longer and longer...
What have I tried? Every solution I found, especially the one given above, but none of them seems to work. The scripts keep running in the background until they finish.
Any ideas?
PHP5 + Apache on Debian.
Have you tried producing some output inside your numerical-operations logic using flush()? For example, assume you have a recursive algorithm that does some calculations; write to the output every 100 recursive calls as follows and see if it aborts:
function recursive($input) {
    if ($input % 100 == 0) // check every 100 recursive calls
    {
        echo " ";
        flush(); // attempt to write to the output stream
        if (connection_aborted()) // true if the client disconnected and the write failed
            exit;
    }
    // do something
    // your code logic...
    $input = $input + 1;
    // and call recursive() again
    recursive($input);
}
In PHP, I want to put a delay of a number of seconds on each iteration of the loop.
for ($i=0; $i <= 10; $i++) {
$file_exists=file_exists($location.$filename);
if($file_exists) {
break;
}
//sleep for 3 seconds
}
How can I do this?
Use the PHP sleep() function: http://php.net/manual/en/function.sleep.php
It pauses execution for the given number of seconds before the next loop iteration. So something like this:
for ($i=0; $i <= 10; $i++) {
$file_exists=file_exists($location.$filename);
if($file_exists) {
break;
}
sleep(3); // this should halt for 3 seconds for every loop
}
I see what you are doing... you're delaying a script to constantly check for a file on the filesystem (one that is being uploaded or written by another script, I assume). This is a BAD way to do it.
Your script will run slowly, choking the server if several users are running it.
Your server may time out for some users.
HDD access is a costly resource.
There are better ways to do this.
You could use Ajax, with a timeout that calls your PHP script every few seconds. This avoids the slow script load, and you can keep polling indefinitely (the current for loop will only run for about 33 seconds and then stop).
You could use a database. In some cases database access is faster than HDD access, especially with views and caching. The script creating/uploading the file can set a flag in a table (e.g. file_exists), and you can have a script that checks that field in your database (see the sketch below).
You can use sleep(3), which sleeps the thread for 3 seconds.
(Correction: the sleep() function in PHP takes seconds.)
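As a rough sketch of the database option, assuming a PDO connection and a hypothetical uploads table with filename and file_exists columns (the table name, column names and credentials are all assumptions):

$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

// the uploading script sets file_exists = 1 when it finishes writing the file;
// this script just checks the flag instead of hitting the filesystem
$stmt = $pdo->prepare('SELECT file_exists FROM uploads WHERE filename = ?');
$stmt->execute(array($filename));

if ((int) $stmt->fetchColumn() === 1) {
    // the file is ready, continue processing
}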
Here are two ways to make a PHP script sleep for some period of time. When you want to pause your script for a while, use these functions.
In these examples, the first part of the code runs as soon as the script starts, and the second part runs after the time delay.
With the sleep() function you define the sleep time in seconds.
Example:
echo "Message 1";
// The first part of code.
$timeInSeconds = 3;
sleep($timeInSeconds);
// The second part of code.
echo "Message 2";
This way the PHP script sleeps for 3 seconds. With this function you can only sleep for a whole number (integer) of seconds.
With the usleep() function you define the sleep time in microseconds. This is convenient for intervals that need finer precision than one second.
Example:
echo "Message 1";
// The first part of code.
$timeInMicroSeconds = 2487147;
usleep($timeInMicroSeconds);
// The second part of code.
echo "Message 2";
You can use this function if you want to sleep PHP for a fractional number of seconds. In this example I have put the script to sleep for 2.487147 seconds.
Have you considered using a PHP daemon script with supervisord? I use it for multiple tasks that need to be running all the time.
The catch is making sure that each time your script runs it checks its memory usage. If it's too high, stop the process and let supervisord start it up again (see the sketch below).
I have successfully used this approach to continuously check database records for tasks to process.
It might be overkill but worth considering.
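A minimal sketch of that memory check inside a daemon worker loop; processNextTask() and the 100 MB limit are hypothetical placeholders:

$memoryLimitBytes = 100 * 1024 * 1024; // 100 MB, an arbitrary example limit

while (true) {
    processNextTask(); // hypothetical: poll the database and handle one task

    // if memory use climbs too high, exit cleanly and let supervisord
    // start a fresh process with a clean memory footprint
    if (memory_get_usage(true) > $memoryLimitBytes) {
        exit(0);
    }

    sleep(1); // avoid hammering the database between tasks
}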
I have this PHP code:
<?php
include_once("connect_to_mysql.php");
$max=300;
while($max--)
{
sleep(1);
doMyThings();
}
?>
It is supposed to repeat a MySQL query 300 times with a gap of 1 second between each. But the problem is that after a minute or so the browser shows this message: "No Data Received. Unable to load the webpage because the server sent no data."
The problem is the following: your code will run for at least 300 seconds (not even counting the time needed by doMyThings()). Most PHP environments set the default maximum script execution time to about 60 seconds; when that is reached, the script stops and nothing is printed out.
Next (if the execution time limit is set high enough to allow long-running scripts), the script has to run until it is finished (that is, ~300 seconds), and only after that is the buffered data written to the output stream. Until then, you won't see any output.
To circumvent those two problems, see this code:
<?php
// If allowed, unlimited script execution time
set_time_limit(0);
// End output buffering
ob_end_flush();
include_once("connect_to_mysql.php");
$max=300;
// IE and Safari workaround:
// They will only display the webpage if it's completely loaded or
// at least 5000 bytes have been "printed".
for($i=0;$i<5000;$i++)
{
echo ' ';
}
while($max > 0)
{
sleep(1);
doMyThings();
$max--;
// Manual output buffering
ob_flush();
flush();
}
?>
Maybe this post is also of interest to you: Outputting exec() ping result progressively
The browser will not wait a whole 5 minutes for you to complete your queries.
You need to find a different solution. Consider executing the PHP script via the CLI.
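For instance, a hedged sketch of starting the long job from the command line in the background so the browser gets a response immediately; the script path is a placeholder:

// from a small web-facing script: start the long-running job in the
// background (on a typical Linux host) and return to the browser right away
exec('php /path/to/run_queries.php > /dev/null 2>&1 &');
echo 'Job started; results will be written to the database.';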
It seems that you hit a timeout while executing doMyThings() 300 times.
You can try set_time_limit(0);
Set the number of seconds a script is allowed to run. If this is reached, the script returns a fatal error. The default limit is 30 seconds or, if it exists, the max_execution_time value defined in the php.ini.
When you execute long-running PHP code on the server side, you need to change the max_execution_time directive in php.ini. But the browser will not wait as long as you want, so you need to use an asynchronous technique like AJAX.
I'm trying to create a script that creates unique codes and writes them to a text file.
I've managed to generate the codes and write them to the file.
Now my problem is that my loop keeps running, resulting in over 92,000 codes being written to the file before the server times out.
I've done some logging, and it seems that everything works fine; it's just that after a certain number of seconds all my variables are reset and everything starts from scratch. The time interval after which this happens varies from run to run.
I've already set ini_set('memory_limit', '200M'); and ini_set('max_execution_time', 0); at the top of my script. Maybe there's a PHP time-out setting I'm missing?
The script is a function in a controller; I call ini_set() at the beginning of this function. This is the loop I'm going through:
public function generateAction() {
ini_set('memory_limit', '200M');
ini_set('max_execution_time',0);
$codeArray = array();
$numberOfCodes = 78000;
$codeLength = 8;
$totaalAantal = 0;
$file = fopen("codes.txt","a+");
while(count($codeArray)<$numberOfCodes){
$code = self::newCode($codeLength);
if(!in_array($code,$codeArray))
{
$totaalAantal++;
$codeArray[] = $code;
fwrite($file,'total: '.$totaalAantal."\r\n");
}
}
fclose($file);
}
In the file this would give something like this:
total: 1
total: 2
total: ...
total: 41999
total: 42000
total: 1
total: 2
total: ...
total: 41999
total: 42000
Thanks.
Edit: so far we've established that generateAction() is called 2 or 3 times before the end of the script, when it should only be called once.
I already found the solution for this problem.
The host's script limit was set to 90 seconds, and because this script had to run for longer, I had to run it via the command line.
Taking into account the test with uniqid(), we can say that the variables are not reset, but that the method generateAction() is called several times.
Since your code is probably synchronous, we may say that generateAction() is called several times because the main script itself is called several times.
What happens in detail?
Because of the nature of your algorithm, each pass through the loop is slower than the previous one (the in_array() check scans an ever-growing array), so executing generateAction() can take quite a long time.
You probably don't wait for the end, and you stop the process or even start it again from a new page. Nevertheless, the process doesn't really stop that soon; it keeps running in the background. I've observed such behavior on my local WAMP/LAMP installation: the script is not actually stopped even if I stop the page, close the page, close the browser, or restart Apache.
So what happens is that several script processes are writing to the codes.txt file simultaneously.
To avoid this, you can for example lock the file for the duration of the loop using flock().
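A minimal sketch of that locking, built around the fopen()/fwrite() calls from the question; using a non-blocking lock so that a second process gives up instead of waiting is an assumption about the desired behavior:

$file = fopen("codes.txt", "a+");

// try to take an exclusive, non-blocking lock on the file
if (flock($file, LOCK_EX | LOCK_NB)) {
    // ... generate the codes and fwrite() them here ...
    flock($file, LOCK_UN); // release the lock when finished
} else {
    // another instance of the script already holds the lock, so stop here
    echo 'codes.txt is already being written by another process';
}

fclose($file);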