Why are PHP threads sequential, not parallel, on my localhost? - php

I want to use pthreads in PHP to run some tasks in parallel. I have installed pthreads on XAMPP on Windows 10, and I just copied some examples from websites, but the result of all of them is sequential instead of parallel!
One example from http://www.smddzcy.com/2016/01/tutorial-multi-threading-in-php7-pthreads/:
<?php
class SomeThreadedClass extends Thread
{
    private $tID;
    public $data;

    public function __construct(int $tID)
    {
        $this->tID = $tID;
        $this->data = $tID . ":" . date('H:i:s');
    }

    public function run()
    {
        echo $this->tID . " started.\n";
        sleep($this->tID);
        echo $this->tID . " ended. " . date('H:i:s') . "\n";
    }
}

$threads = [];
for ($i = 1; $i < 5; $i++) {
    $threads[$i] = new SomeThreadedClass($i);
    $threads[$i]->start(); // start the job in the background
}

for ($i = 1; $i < 5; $i++) {
    $threads[$i]->join();           // wait until the job is finished,
    echo $threads[$i]->data . "\n"; // then we can access the data
}
The result on the website is:
1 started.
2 started.
3 started.
4 started.
1 ended. 18:18:52
1:18:18:51
2 ended. 18:18:53
2:18:18:51
3 ended. 18:18:54
3:18:18:51
4 ended. 18:18:55
4:18:18:51
When I run the code on my localhost, I get this result:
1 started.
1 ended. 13:11:24
2 started.
2 ended. 13:11:25
3 started. 3 ended. 13:11:26
4 started. 4 ended. 13:11:27
1:15:41:23
2:15:41:23
3:15:41:23
4:15:41:23
Why are the threads running sequentially instead of in parallel on my localhost?

The threads are executing in parallel. You can tell by looking at the times in the output: all four threads start at the same time, and the joined threads finish just one second apart. Since the sleep time increases with each thread that is spawned, the gaps between finish times would grow incrementally if the threads executed sequentially.
For example, changing the thread spawning and joining part of your script to the following (to force sequential execution):
for ($i = 1; $i < 5; $i++) {
    $threads[$i] = new SomeThreadedClass($i);
    $threads[$i]->start();          // start the job in the background
    $threads[$i]->join();           // wait until the job is finished,
    echo $threads[$i]->data . "\n"; // then we can access the data
}
Would output something similar to:
1 started.
1 ended. 15:14:06
1:15:14:05
2 started.
2 ended. 15:14:08
2:15:14:06
3 started.
3 ended. 15:14:11
3:15:14:08
4 started.
4 ended. 15:14:15
4:15:14:11
Notice the different start times and the incremental gaps between the finish times.
As for why you are receiving the output in that sequence, it is simply how the output buffer is handling the output from multiple threads. My output is different, but then I'm using OS X.

Related

run php script exactly every 2 second

I have one PHP file that I want to run every 2 seconds for 1 minute, because on my server the minimum cron interval is 1 minute. So I made this script:
<?php
$start = microtime(true);
set_time_limit(60);
for ($i = 0; $i < 59; ++$i) {
    shell_exec('/usr/local/bin/php /usr/local/www/my_file.php');
    time_sleep_until($start + $i + 2);
}
?>
And the second option is:
<?php
for ($i = 0; $i <= 59; $i += 2) {
    shell_exec('/usr/local/bin/php /usr/local/www/my_file.php');
    sleep(2);
}
?>
But neither of them works, because my script's execution time is 50 to 60 seconds. So it's not running every 2 seconds but every 50 to 60 seconds. Is there any solution to start a new script execution every 2 seconds? I don't have any idea, please help me.
You can write a bash script that executes your PHP script every 2 seconds a defined number of times, and run that bash script via a cron job. The trailing & puts each PHP invocation in the background, so the loop doesn't wait for it to finish.
Example:
#!/bin/bash
count=10
for i in `seq 1 $count`; do
    /bin/php /path/to/script.php &
    sleep 2
done
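If you would rather stay in PHP for the wrapper, the same idea can be sketched like this (the binary and worker paths are placeholders; the trailing & detaches each run on POSIX shells, so the 50-60 second jobs don't block the 2-second loop):

```php
<?php
// Launch the worker every 2 seconds without waiting for it to finish.
$start = microtime(true);
for ($i = 0; $i < 30; $i++) { // 30 launches = 1 minute at 2-second intervals
    shell_exec('/usr/local/bin/php /usr/local/www/my_file.php > /dev/null 2>&1 &');
    time_sleep_until($start + ($i + 1) * 2); // wake on exact 2-second boundaries
}
```

time_sleep_until keeps the launches on a fixed schedule even if shell_exec itself takes a moment to return.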

Recursive directory iterator with offset

Is it possible to start the loop from a certain point?
$iterator = new \RecursiveIteratorIterator(new \RecursiveDirectoryIterator($path, $flags));
$startTime = microtime(true);
foreach ($iterator as $pathName => $file) {
    // file processing here
    // after 5 seconds stop and continue in the next request
    $elapsedSecs = microtime(true) - $startTime;
    if ($elapsedSecs > 5) {
        break;
    }
}
But how do I resume from my break point in the next request?
a) Pull the time calculation out of the foreach. You have a start time and want a runtime of 5 seconds, so you can calculate the end time beforehand (start time + 5s). Inside the foreach, simply check whether the current time is greater than or equal to the end time, then break.
b) Q: Is it possible to start the loop from a certain point? How do I resume from my break point in the next request?
Two approaches come to mind.
First, you could store the last processed position and resume at that position + 1: save the last position of the iteration, and fast-forward to it on the next request by calling $iterator->next() until you reach the next item to process ($lastPosition + 1). Store the last position between requests and repeat until it equals the total number of elements in the iterator.
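A minimal sketch of that fast-forward approach, assuming a session for persistence and a hypothetical processFile() helper (note the directory order must stay stable between requests for the saved position to be meaningful):

```php
<?php
session_start();
$lastPosition = (int) ($_SESSION['lastPosition'] ?? 0);

$iterator = new \RecursiveIteratorIterator(new \RecursiveDirectoryIterator($path, $flags));

// fast-forward past the items processed in earlier requests
for ($i = 0; $i < $lastPosition && $iterator->valid(); $i++) {
    $iterator->next();
}

$endTime = microtime(true) + 5; // 5-second budget per request
while ($iterator->valid() && microtime(true) < $endTime) {
    processFile($iterator->current()); // hypothetical worker function
    $iterator->next();
    $lastPosition++;
}

$_SESSION['lastPosition'] = $lastPosition; // resume point for the next request
```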
Or, you could turn the iterator into an array on the first run: $array = iterator_to_array($iterator); and then use a reduce array approach.
(Maybe someone else knows how to reduce an iterator object.)
With this approach you would only store the data, which decreases request by request until 0.
The code is untested. It's just a quick draft.
$starttime = time();
$endtime = $starttime + 5; // 5-second budget
$totalElements = count($array);

for ($i = 0; $i < $totalElements; $i++) {
    if (time() >= $endtime) {
        break;
    }
    doStuffWith($array[$i]);
}

echo 'Processed ' . $i . ' elements in 5 seconds';

// exit condition is "totalElements to process = 0";
// 1 or more means there is more work to do
if (($totalElements - $i) >= 1) {
    // chop off all the processed items from the initial array
    // and build the array for the next processing request
    $reduced_array = array_slice($array, $i);
    // save the reduced array to cache, session, or disk
    store($reduced_array);
} else {
    echo 'Done.';
}
// on the next request, load the array and resume the steps above...
All in all, this is batch processing, and it might be done more efficiently by a worker/job queue, like:
Gearman (the PHP manual has some Gearman examples), or
RabbitMQ / AMQP, or
the PHP libs listed here: https://github.com/ziadoz/awesome-php#queue.

PHP Infinite Loop Not Terminated after 60 seconds

I created an infinite PHP while loop which incremented a variable then echoed:
$num = 1;
while ($num > 0) {
    echo $num . "<br/>";
    $num++;
}
I was expecting this to be killed/terminated after 60 seconds as the setting in php.ini are as follows:
max_execution_time 60 60
max_input_time 60
Sorry if I'm wrong but I expected to see the job killed in the browser (no new echos!)...
Can anyone give me more information on infinite PHP jobs running and when they are actually killed on the server?
You are confusing execution time with wall-clock time. They are not the same thing. The processor spends very little execution time on each iteration of your loop. PHP will eventually time you out, but it's going to take a lot longer than a minute.
Think of it this way: your processor may be running at 2 GHz. How many instructions do you think one iteration of your loop takes? The time spent in echo is big (i.e. slow), and it doesn't count toward processor execution time.
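The distinction can be made visible with getrusage() (available on Linux; on Windows only since PHP 7), which reports the CPU time the process has actually consumed. A rough sketch:

```php
<?php
// Wall-clock time vs. CPU time: sleep() burns wall time but almost no CPU,
// and on Linux max_execution_time only counts the CPU side.
$wallStart = microtime(true);
$ru = getrusage();
$cpuStart = $ru['ru_utime.tv_sec'] + $ru['ru_utime.tv_usec'] / 1e6;

sleep(2); // two seconds of wall time, near-zero execution time

$ru = getrusage();
$cpuEnd = $ru['ru_utime.tv_sec'] + $ru['ru_utime.tv_usec'] / 1e6;
printf("wall: %.2f s, cpu: %.4f s\n",
    microtime(true) - $wallStart, $cpuEnd - $cpuStart);
```

The wall figure will be at least 2 seconds while the CPU figure stays close to zero, which is why an I/O-heavy infinite loop outlives max_execution_time on Linux.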
// using set_time_limit
// starting php code here
echo "starting...\n";

// set_time_limit(10);              // either would work
ini_set("max_execution_time", 10);  // either would work

function doSomeExpensiveWork($currentTime)
{
    for ($r = 0; $r < 100000; $r++) {
        $x = tan(M_LNPI + log(ceil(date("s") * M_PI * M_LNPI + 100)));
    }
}

try {
    while (true) {
        $currentTime = date("H:i:s");
        echo $currentTime, "\n";
        doSomeExpensiveWork($currentTime);
    }
} catch (Exception $e) {
    // never reached: exceeding max_execution_time raises a fatal error,
    // not a catchable exception
    // echo 'Caught exception: ', $e->getMessage(), "\n";
}
echo "this will not be executed!";
// code end

PHP interval timing

I have a problem: I'm running a script in which a PHP line is repeated however many times $num_newlines indicates. This is what I am currently using:
for ($i = 1; $i <= ($num_newlines - 1); $i++) {
    $tweetcpitems->post('statuses/update', array('status' => wordFilter("The item $array[$i] has been released on Club Penguin.")));
}
What I want is a 90-second interval between the tweets, however many duplicates are made, so I'm not tweeting 50 times within 10 seconds. Please help!
Use the sleep() function:
for ($i = 1; $i <= $num_newlines - 1; $i++) {
    $tweetcpitems->post('statuses/update', array('status' => wordFilter('The item ' . $array[$i] . ' has been released on Club Penguin.')));
    sleep(90);
}
This snippet sleeps after every tweet, including the last one. To avoid sleeping unnecessarily after the last tweet, use this:
for ($i = 1; $i <= $num_newlines - 1; $i++) {
    $tweetcpitems->post('statuses/update', array('status' => wordFilter('The item ' . $array[$i] . ' has been released on Club Penguin.')));
    if ($i < $num_newlines - 1) { // skip the sleep after the final tweet
        sleep(90);
    }
}
Two options:
If you can set up CRON jobs - create a queue of messages to post (in a database or file) and let a script run every 90 seconds that takes and removes one message from the queue and sends it.
Use the sleep function in between sending messages. Note that you may need to increase the time limit (from the comments: under Linux, sleeping time is ignored by max_execution_time, but under Windows it counts as execution time).
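A sketch of that second option with the limit lifted, which matters on Windows setups where sleep() counts toward the execution time ($messages and postTweet() are placeholders, not real API names):

```php
<?php
set_time_limit(0); // lift the cap; on Windows, sleep() would otherwise eat into it

foreach ($messages as $message) { // placeholder queue of pending tweets
    postTweet($message);          // hypothetical posting helper
    sleep(90);                    // 90-second gap between tweets
}
```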

What's better at freeing memory with PHP: unset() or $var = null

I realise the second one avoids the overhead of a function call (update: it is actually a language construct), but it would be interesting to know if one is better than the other. I have been using unset() for most of my coding, but I've recently looked through a few respectable classes found on the net that use $var = null instead.
Is there a preferred one, and what is the reasoning?
It was mentioned on the unset manual page in 2009:
unset() does just what its name says - unset a variable. It does not force immediate memory freeing. PHP's garbage collector will do it when it sees fit - by intention as soon as those CPU cycles aren't needed anyway, or as late as before the script would run out of memory, whichever occurs first.
If you are doing $whatever = null; then you are rewriting the variable's data. You might get memory freed/shrunk faster, but it may steal CPU cycles from the code that truly needs them sooner, resulting in a longer overall execution time.
(Since 2013, that unset man page doesn't include that section anymore.)
Note that until PHP 5.3, if you had two objects in a circular reference, such as a parent-child relationship, calling unset() on the parent object would not free the memory used for the parent reference in the child object. (Nor would the memory be freed when the parent object was garbage-collected.) (bug 33595)
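A minimal sketch of such a cycle (class and property names are made up for illustration):

```php
<?php
class Node
{
    public $parent;
    public $child;
}

$parent = new Node();
$parent->child = new Node();
$parent->child->parent = $parent; // circular reference back to the parent

unset($parent); // before PHP 5.3 the back-reference kept both objects alive;
                // the cycle collector (gc_collect_cycles()) now reclaims them
```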
The question "difference between unset and = null" details some differences:
unset($a) also removes $a from the symbol table; for example:
$a = str_repeat('hello world ', 100);
unset($a);
var_dump($a);
Outputs:
Notice: Undefined variable: a in xxx
NULL
But when $a = null is used:
$a = str_repeat('hello world ', 100);
$a = null;
var_dump($a);
Outputs:
NULL
It seems that $a = null is a bit faster than its unset() counterpart: updating a symbol table entry appears to be faster than removing it.
when you try to use a non-existent (unset) variable, an error will be triggered and the value for the variable expression will be null. (Because, what else should PHP do? Every expression needs to result in some value.)
A variable with null assigned to it is still a perfectly normal variable though.
unset is not actually a function, but a language construct. It is no more a function call than a return or an include.
Aside from performance issues, using unset makes your code's intent much clearer.
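Being a language construct also means unset cannot be used the way a real function can, e.g. through a variable function name or inside an expression:

```php
<?php
$a = 1;
unset($a);               // fine: a statement on its own

// $f = 'unset'; $f($a); // fatal error: Call to undefined function unset()
// $b = unset($a);       // parse error: unset() is not an expression
```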
By doing an unset() on a variable, you've essentially marked the variable for 'garbage collection' (PHP doesn't really have one, but for example's sake) so the memory isn't immediately available. The variable no longer houses the data, but the stack remains at the larger size. Doing the null method drops the data and shrinks the stack memory almost immediately.
This has been from personal experience and others as well. See the comments of the unset() function here.
I personally use unset() between iterations in a loop so that I don't have to have the delay of the stack being yo-yo'd in size. The data is gone, but the footprint remains. On the next iteration, the memory is already being taken by php and thus, quicker to initialize the next variable.
<?php
$start = microtime(true);
for ($i = 0; $i < 10000000; $i++) {
    $a = 'a';
    $a = NULL;
}
$elapsed = microtime(true) - $start;
echo "took $elapsed seconds\r\n";

$start = microtime(true);
for ($i = 0; $i < 10000000; $i++) {
    $a = 'a';
    unset($a);
}
$elapsed = microtime(true) - $start;
echo "took $elapsed seconds\r\n";
?>
Per that it seems like "= null" is faster.
PHP 5.4 results:
took 0.88389301300049 seconds
took 2.1757180690765 seconds
PHP 5.3 results:
took 1.7235369682312 seconds
took 2.9490959644318 seconds
PHP 5.2 results:
took 3.0069220066071 seconds
took 4.7002630233765 seconds
PHP 5.1 results:
took 2.6272349357605 seconds
took 5.0403649806976 seconds
Things start to look different with PHP 5.0 and 4.4.
5.0:
took 10.038941144943 seconds
took 7.0874409675598 seconds
4.4:
took 7.5352551937103 seconds
took 6.6245851516724 seconds
Keep in mind microtime(true) doesn't work in PHP 4.4 so I had to use the microtime_float example given in php.net/microtime / Example #1.
It works in a different way for variables copied by reference:
$a = 5;
$b = &$a;
unset($b); // just says $b should no longer point to any variable
print $a;  // 5

$a = 5;
$b = &$a;
$b = null; // rewrites the value of $b (and $a)
print $a;  // nothing, because $a is null
It makes a difference with array elements.
Consider this example
$a = array('test' => 1);
$a['test'] = NULL;
echo "Key test ", array_key_exists('test', $a)? "exists": "does not exist";
Here, the key 'test' still exists. However, in this example
$a = array('test' => 1);
unset($a['test']);
echo "Key test ", array_key_exists('test', $a)? "exists": "does not exist";
the key no longer exists.
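A closely related gotcha, sketched as an aside: isset() treats a key holding null as if it were unset, while array_key_exists() still sees it:

```php
<?php
$a = ['test' => null];

var_dump(isset($a['test']));            // bool(false) - null looks unset to isset()
var_dump(array_key_exists('test', $a)); // bool(true)  - the key is still present
```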
Regarding objects, especially in lazy-load scenarios, one should consider that the garbage collector runs in idle CPU cycles. So, presuming you're getting into trouble because a lot of objects are being loaded, a small time penalty will solve the memory freeing: use time_nanosleep to enable the GC to collect memory. Setting the variable to null is also desirable.
Tested on a production server: originally the job consumed 50 MB and then was halted. After nanosleep was used, 14 MB was the constant memory consumption.
One should say this depends on GC behaviour, which may change from PHP version to version.
But it works fine on PHP 5.3.
e.g. this sample (code taken from the VirtueMart 2 Google feed):
for ($n = 0; $n < count($ids); $n++) {
    //unset($product); // useful for arrays
    $product = null;
    if ($n % 50 == 0) {
        // let the GC do the memory job
        //echo "<mem>" . memory_get_usage() . "</mem>"; //$ids[$n];
        time_nanosleep(0, 10000000);
    }
    $product = $productModel->getProductSingle((int)$ids[$n], true, true, true);
    ...
For the record, and excluding the time that it takes:
<?php
echo "<hr>First:<br>";
$x = str_repeat('x', 80000);
echo memory_get_usage() . "<br>\n";
echo memory_get_peak_usage() . "<br>\n";

echo "<hr>Unset:<br>";
unset($x);
$x = str_repeat('x', 80000);
echo memory_get_usage() . "<br>\n";
echo memory_get_peak_usage() . "<br>\n";

echo "<hr>Null:<br>";
$x = null;
$x = str_repeat('x', 80000);
echo memory_get_usage() . "<br>\n";
echo memory_get_peak_usage() . "<br>\n";

echo "<hr>function:<br>";
function test() {
    $x = str_repeat('x', 80000);
}
echo memory_get_usage() . "<br>\n";
echo memory_get_peak_usage() . "<br>\n";

echo "<hr>Reassign:<br>";
$x = str_repeat('x', 80000);
echo memory_get_usage() . "<br>\n";
echo memory_get_peak_usage() . "<br>\n";
It returns
First:
438296
438352
Unset:
438296
438352
Null:
438296
438352
function:
438296
438352
Reassign:
438296
520216 <-- double usage
Conclusion: both null and unset free memory as expected (not only at the end of the execution). Also, reassigning a variable holds the value twice at some point (520216 versus 438352).
PHP 7 has already worked on such memory-management issues, and usage has been reduced to a minimum.
<?php
$start = microtime(true);
for ($i = 0; $i < 10000000; $i++) {
    $a = 'a';
    $a = NULL;
}
$elapsed = microtime(true) - $start;
echo "took $elapsed seconds\r\n";

$start = microtime(true);
for ($i = 0; $i < 10000000; $i++) {
    $a = 'a';
    unset($a);
}
$elapsed = microtime(true) - $start;
echo "took $elapsed seconds\r\n";
?>
PHP 7.1 output:
took 0.16778993606567 seconds
took 0.16630101203918 seconds
Code example from a comment:
echo "PHP Version: " . phpversion() . PHP_EOL . PHP_EOL;

$start = microtime(true);
for ($i = 0; $i < 10000000; $i++) {
    $a = 'a';
    $a = NULL;
}
$elapsed = microtime(true) - $start;
echo "took $elapsed seconds" . PHP_EOL;

$start = microtime(true);
for ($i = 0; $i < 10000000; $i++) {
    $a = 'a';
    unset($a);
}
$elapsed = microtime(true) - $start;
echo "took $elapsed seconds" . PHP_EOL;
Running in Docker containers from the image php:7.4-fpm and others:
PHP Version: 7.4.8
took 0.22569918632507 seconds null
took 0.11705803871155 seconds unset
took 0.20791196823121 seconds null
took 0.11697316169739 seconds unset
PHP Version: 7.3.20
took 0.22086310386658 seconds null
took 0.11882591247559 seconds unset
took 0.21383500099182 seconds null
took 0.11916995048523 seconds unset
PHP Version: 7.2.32
took 0.24728178977966 seconds null
took 0.12719893455505 seconds unset
took 0.23839902877808 seconds null
took 0.12744522094727 seconds unset
PHP Version: 7.1.33
took 0.51380109786987 seconds null
took 0.50135898590088 seconds unset
took 0.50358104705811 seconds null
took 0.50115609169006 seconds unset
PHP Version: 7.0.33
took 0.50918698310852 seconds null
took 0.50490307807922 seconds unset
took 0.50227618217468 seconds null
took 0.50514912605286 seconds unset
PHP Version: 5.6.40
took 1.0063569545746 seconds null
took 1.6303179264069 seconds unset
took 1.0689589977264 seconds null
took 1.6382601261139 seconds unset
PHP Version: 5.4.45
took 1.0791940689087 seconds null
took 1.6308979988098 seconds unset
took 1.0029168128967 seconds null
took 1.6320278644562 seconds unset
But with another example:
<?php
ini_set("memory_limit", "512M");
echo "PHP Version: " . phpversion() . PHP_EOL . PHP_EOL;

$start = microtime(true);
$arr = [];
for ($i = 0; $i < 1000000; $i++) {
    $arr[] = 'a';
}
$arr = null;
$elapsed = microtime(true) - $start;
echo "took $elapsed seconds" . PHP_EOL;

$start = microtime(true);
$arr = [];
for ($i = 0; $i < 1000000; $i++) {
    $arr[] = 'a';
}
unset($arr);
$elapsed = microtime(true) - $start;
echo "took $elapsed seconds" . PHP_EOL;
Results:
PHP Version: 7.4.8
took 0.053696155548096 seconds
took 0.053897857666016 seconds
PHP Version: 7.3.20
took 0.054572820663452 seconds
took 0.054342031478882 seconds
PHP Version: 7.2.32
took 0.05678391456604 seconds
took 0.057311058044434 seconds
PHP Version: 7.1.33
took 0.097366094589233 seconds
took 0.073100090026855 seconds
PHP Version: 7.0.33
took 0.076443910598755 seconds
took 0.077098846435547 seconds
PHP Version: 7.0.33
took 0.075634002685547 seconds
took 0.075317859649658 seconds
PHP Version: 5.6.40
took 0.29681086540222 seconds
took 0.28199100494385 seconds
PHP Version: 5.4.45
took 0.30513095855713 seconds
took 0.29265689849854 seconds
I still had doubts about this, but I tried it in my script, using Xdebug to see how it affects my app's memory usage.
The script is set in my function like this:
function gen_table_data($serv, $coorp, $type, $showSql = FALSE, $table = 'ireg_idnts') {
    $sql = "SELECT COUNT(`operator`) `operator` FROM $table WHERE $serv = '$coorp'";
    if ($showSql === FALSE) {
        $sql = mysql_query($sql) or die(mysql_error());
        $data = mysql_fetch_array($sql);
        return $data[0];
    } else {
        echo $sql;
    }
}
I added unset just before the return statement and it gave me 160200; then I changed it to $sql = NULL and it gave me 160224. :)
But there is something unique about this comparison: when I use neither unset() nor NULL, Xdebug reports 160144 as memory usage.
So, I think adding a line to call unset() or assign NULL adds processing to your application, and it may be better to keep your code as it is and use your variables as effectively as you can.
Correct me if I'm wrong, thanks.
I created a new performance test for unset and = null, because, as mentioned in the comments, the one written here has an error (the recreation of the elements). I used arrays; as you can see, it doesn't matter now.
<?php
$arr1 = array();
$arr2 = array();
for ($i = 0; $i < 10000000; $i++) {
    $arr1[$i] = 'a';
    $arr2[$i] = 'a';
}

$start = microtime(true);
for ($i = 0; $i < 10000000; $i++) {
    $arr1[$i] = null;
}
$elapsed = microtime(true) - $start;
echo 'took ' . $elapsed . ' seconds<br>';

$start = microtime(true);
for ($i = 0; $i < 10000000; $i++) {
    unset($arr2[$i]);
}
$elapsed = microtime(true) - $start;
echo 'took ' . $elapsed . ' seconds<br>';
But I could only test it on a PHP 5.5.9 server; here are the results:
- took 4.4571571350098 seconds
- took 4.4425978660583 seconds
I prefer unset for readability reasons.
Even if unset() does not free memory immediately, it is still very helpful, and it is good practice to call it on variables we are done with before we exit a method. Its point is not immediate memory freeing; it is about preventing memory leaks.
Please see this link:
http://www.hackingwithphp.com/18/1/11/be-wary-of-garbage-collection-part-2
I have been using unset for a long time now. A practice I like is to collect variables that have already been used into an array:
$data['test'] = '';
$data['test2'] = 'asdadsa';
...
and then a single unset($data); frees them all at once.
Please see this related topic:
How important is it to unset variables in PHP?
