Is setting set_time_limit(0) a bad idea? [closed] - php

Closed. This question is opinion-based. It is not currently accepting answers.
Want to improve this question? Update the question so it can be answered with facts and citations by editing this post.
Closed 9 years ago.
Improve this question
I am making changes to a script a freelancer wrote and noticed that at the top of his function he sets the timeout limit to 0 so it won't time out. Are there any implications to doing this? Or should I properly test his script, work out how long it takes, and check whether it will time out on the server?
Edit - Adding more detail.
So it's a script that creates a temporary DB table and populates it by pulling data from a number of different tables; it then outputs the result to a CSV using MySQL's OUTFILE.

From the PHP docs:
set_time_limit — Limits the maximum execution time
seconds: The maximum execution time, in seconds. If set to zero, no time limit is
imposed.
http://nl3.php.net/manual/en/function.set-time-limit.php
If you set the time limit to zero, the script can run forever without being stopped by the PHP interpreter. Most of the time this is a bad idea, because if you have a bug in your code that results in an infinite loop, the script will run forever and eat your CPU.
A lot of the time I've seen this in code without a good reason, so I assume the developer couldn't think of an appropriate timeout and just set 0.
There are a few valid reasons to set a timeout of 0, for example a daemon script.
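A minimal sketch of the alternative: instead of disabling the limit entirely, set a generous but finite ceiling and log how long the export actually takes. The 15-minute figure is an assumption for illustration, not a measured value.

```php
<?php
// Assumption: 15 minutes is comfortably above the export's real runtime.
// If the script hangs in an infinite loop, PHP still aborts it eventually.
set_time_limit(15 * 60);

$start = microtime(true);

// ... run the export here (temp table + SELECT ... INTO OUTFILE) ...

$elapsed = microtime(true) - $start;
error_log(sprintf('Export finished in %.1f seconds', $elapsed));
```

Measuring the real runtime this way also answers the original question: once you know the export takes, say, two minutes, you can pick a sensible limit instead of 0.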

Related

Processing 10 million datasets - php and sql [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 5 years ago.
We're using PHP 7 and have a MySQL DB running on a web server with only 128 MB RAM.
We have a problem with processing tons of datasets.
Simple description: We have 40,000 products, and we want to collect data about these products to find out whether they need to be updated or not. The query that collects the specific data from another table with 10 million datasets takes 1.2 seconds, because it contains some SUM functions. We need to run the query for every product individually, because the time range relevant for the SUM differs. Because of the mass of queries, the function that should iterate over all the products hits a timeout (after 5 min). That's why we decided to implement a cronjob that calls the function, and the function continues with the product where it stopped the last time. We call the cronjob every 5 min.
But still, with our 40,000 products, it takes us ~30 hours until all the products are processed. Per cronjob run, our function processes about 100 products...
How is it possible to deal with such a mass of data - is there a way to parallelize it with e.g. pthreads or does somebody have another idea? Could a server update be a solution?
Thanks a lot!
Nadine
Parallel processing will require resources as well, so on 128 MB it will not help.
Monitor your system to see where the bottleneck is. It is most probably the memory, since it is so low. Once you find the bottleneck resource, you will have to increase it. No amount of tuning and tinkering will solve an overloaded server issue.
If you can see that it is not a server resource issue (!), it could be at the query level (too many joins, missing indexes, ...). And your 5 min timeout could be increased.
But start with the server.
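For reference, the resumable-batch pattern described in the question can be sketched like this: each cron run reads where the previous run stopped, processes a batch, and records the new position. The table and column names (progress, last_product_id) are assumptions for illustration; the per-product SUM query is left out.

```php
<?php
// Hypothetical schema: progress(last_product_id) holds the resume point,
// products(id) is the work list. Connection setup is omitted.
function processBatch(PDO $db, int $batchSize = 100): int
{
    $lastId = (int) $db->query('SELECT last_product_id FROM progress')->fetchColumn();

    $stmt = $db->prepare('SELECT id FROM products WHERE id > ? ORDER BY id LIMIT ?');
    $stmt->bindValue(1, $lastId, PDO::PARAM_INT);
    $stmt->bindValue(2, $batchSize, PDO::PARAM_INT);
    $stmt->execute();

    $processed = 0;
    foreach ($stmt->fetchAll(PDO::FETCH_COLUMN) as $id) {
        // ... run the per-product SUM query and update the product here ...
        $lastId = (int) $id;
        $processed++;
    }

    // Remember where we stopped so the next cron run can continue.
    $db->prepare('UPDATE progress SET last_product_id = ?')->execute([$lastId]);
    return $processed;
}
```

With this in place, raising the batch size or the 5-minute timeout only changes how much one run does; the resume logic stays the same.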

How to accelerate PHP execution by using cached result? [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 6 years ago.
I'm using PHP-FPM 5.6 version.
php -v shows there's OPcache in place.
I have a PHP script that's accepting parameters and giving me same 2.2k HTML output all the time.
The script execution does not involve any connectivity to MySQL.
In Chrome Developer Tools, I'm seeing an execution time of 900ms.
I find that this is relatively slow.
I would like to shorten this execution time.
Given that OPcache is in place with this version of PHP, can I use it to cache the result of my PHP script execution for a faster response time?
Or if there's an alternative approach?
Any configuration to be tweaked in php.ini, /etc/php.d/10-opcache.ini or /etc/php-fpm.d/www.conf?
And how do I purge the cached result when needed?
I know this doesn't directly answer your question, but it might be useful. The way to measure execution time could involve these two functions:
function getMicroTime()
{
    // return wall-clock time in seconds, with microsecond precision
    return microtime(true);
}
function getUserTime()
{
    // This clock only runs when the process gets CPU cycles, so it runs
    // slower than the wall clock. Use this to accurately time the execution
    // of scripts instead of the function above.
    $usage = getrusage();
    return $usage["ru_utime.tv_sec"] + 0.000001 * $usage["ru_utime.tv_usec"];
}
At the beginning of your script you store the start times:
$startMicroTime = getMicroTime();
$startUserTime = getUserTime();
So that at the end you can echo the execution time:
echo 'Full time = '.round(1000*(getMicroTime()-$startMicroTime),2).' ms<br>';
echo 'User time = '.round(1000*(getUserTime()-$startUserTime),2).' ms<br>';
Again, this doesn't answer your question, but it could be useful. Ok, to make this a valid answer, have a read here:
https://www.addedbytes.com/articles/for-beginners/output-caching-for-beginners/
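One point worth adding to the question itself: OPcache only caches compiled bytecode, not the HTML a script produces, so it cannot serve the 2.2k output by itself. Caching the output needs to be done in the script. A minimal file-based sketch, assuming a TTL-based cache in the system temp directory (names and TTL are illustrative):

```php
<?php
// Serve a cached copy of the script's output if it is fresh enough,
// otherwise regenerate it and store the result.
function cachedOutput(string $key, int $ttl, callable $generate): string
{
    $file = sys_get_temp_dir() . '/cache_' . md5($key) . '.html';

    if (is_file($file) && (time() - filemtime($file)) < $ttl) {
        return file_get_contents($file);   // cache hit: skip the slow work
    }

    $html = $generate();                   // cache miss: run the slow part
    file_put_contents($file, $html);
    return $html;
}

// Purge the cached result when needed:
function purgeCache(string $key): void
{
    $file = sys_get_temp_dir() . '/cache_' . md5($key) . '.html';
    if (is_file($file)) {
        unlink($file);
    }
}
```

The $key would be built from the accepted parameters, so each distinct parameter combination gets its own cached copy, and purgeCache() answers the "how do I purge" part of the question.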

Same function needs to be called simultaneously for all users in php [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 7 years ago.
I have a start_processing("set 1") function which takes 6 hrs to complete set 1.
I want to process set 2, set 3, ... How can I process all of them at the same time? When I write
<?php
start_processing("set1");
start_processing("set2");
start_processing("set3");
?>
it takes 18 hrs.
I want to complete the processing of all sets in 6 hrs.
Finally I got a solution.
I have taken curl_multi - it is far better. Save the handshakes - they are not needed every time!
Use curl_multi_init to run the processes in parallel. This can have a tremendous effect.
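A sketch of what that looks like: fire several HTTP requests at once instead of one after another. The URLs are placeholders for whatever endpoint triggers start_processing() for each set.

```php
<?php
// Run all requests concurrently and return their bodies, keyed like $urls.
function fetchAll(array $urls): array
{
    $mh = curl_multi_init();
    $handles = [];

    foreach ($urls as $i => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $handles[$i] = $ch;
    }

    // Drive all transfers until every one has finished.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh);   // wait for activity instead of busy-looping
        }
    } while ($running && $status === CURLM_OK);

    $results = [];
    foreach ($handles as $i => $ch) {
        $results[$i] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
```

Usage would be something like fetchAll(['...?set=1', '...?set=2', '...?set=3']); all three sets run concurrently, so the total time is bounded by the slowest set rather than the sum.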
Unless you are using PHP as an Apache module, you can use pcntl_fork to create several processes, each of which processes one function call.
// pcntl_fork() returns the child's PID in the parent (truthy) and 0 in the
// child, so the parent handles set1 while the children take the other sets.
if (pcntl_fork()) {
    start_processing("set1");
} elseif (pcntl_fork()) {
    start_processing("set2");
} else {
    start_processing("set3");
}
If you have a varying number of working sets, just put them in an array and loop through it. Just bear in mind that too many processes could overload your system!
Another, more lightweight option is PHP pthreads, which AFAIK works with Apache, but requires installing the corresponding PHP extension first.
A third possibility is, as mentioned by sandeep_kosta and Niranjan N Raju, to create one Cronjob for each working set.

Why use sleep in loop(while or for var i) [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 7 years ago.
I have long polling in PHP; is it recommended to use sleep(x seconds)? If I don't use it, will the PC be slow (lagging, Apache stops, etc.)? Does it make a difference?
sleep - of any duration, even "0 seconds" - is a quick way to get the Operating System's scheduler to 'pause' the current task and allow another process to continue work.
This context switch prevents visible 'lagging' because the other processes get a chance to do what they need to do. Even if there is no other process that needs to do work, a sleep still causes the current process execution to halt until it is rescheduled. The rescheduling alone goes a long way toward keeping the CPU from becoming a toaster, because the effective/relevant time the process is given to work is greatly reduced.
Without the sleep (or another blocking IO task) it becomes a 'hot busy loop'; this loop is executed as fast as it can be and, even though the process will eventually be preempted without a sleep, the 'busy loop' will consume significantly more CPU resources before it is rescheduled. (This also implies that the same amount of work will take longer to complete when sleeping often.)
Thus: sleep can be advantageous for selectively yielding work in a CPU-bound application, but at the same time it can reduce the CPU/processing throughput available if called too often or for too long. Sleeping in a loop has much less impact on an IO-bound application; in that case its primary purpose is to impose longer delays before continuing a certain action.
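A minimal long-polling loop along these lines, with the sleep doing the yielding. The checkForNewData() callback is a placeholder assumption for whatever check the poll performs (a DB query, a file check, ...).

```php
<?php
// Poll until data arrives or the deadline passes, yielding the CPU
// between checks. Without the usleep() this would be a hot busy loop.
function longPoll(callable $checkForNewData, int $timeoutSeconds = 30)
{
    $deadline = time() + $timeoutSeconds;

    while (time() < $deadline) {
        $data = $checkForNewData();
        if ($data !== null) {
            return $data;          // something arrived: respond at once
        }
        usleep(250000);            // sleep 250 ms: yield to the scheduler
    }
    return null;                   // timed out: the client re-polls
}
```

The 250 ms interval is a trade-off: shorter means faster responses but more wakeups; longer means less CPU but more latency.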

ways of print large number pages of report [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 9 years ago.
I need to print a large number of pages of an aging report using PHP.
The number of pages may go up to a thousand.
What's the best way to print this report?
- direct send to printer
- css
- pdf
- others?
Any suggestion?
Thank you!
If there are thousands of pages I would recommend using a background task to generate a PDF file.
To do this, your frontend can write, for example, a database entry that says "todo". Then configure a cron job that calls a PHP script to check, e.g. every minute, for a PDF-generator todo.
In your PHP configuration, check that you are allowed to use the set_time_limit() function. This function resets the maximum execution time of a script. You must call this function in your generator script before the maximum execution time has been exceeded; otherwise your PHP script will be stopped before the work is done.
Another problem may be the PHP memory limit. You can increase it by setting "memory_limit" to a value that matches your use case. Try different values to find a memory limit that is high enough to run your script. Also, don't keep too much data in your PHP variables, to avoid high memory usage.
While the background script is running, you can write a file or a database entry that can be read by a frontend showing the PDF generator's progress. When the generator has finished, you can offer the file for download in the frontend.
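The generator loop described above can be sketched as follows: reset the execution limit before each page and record progress where a frontend can read it. The file name, page count, and 60-second per-page allowance are assumptions; the actual PDF rendering is left out.

```php
<?php
// Cron-driven worker: renders one page at a time, restarting the
// execution-time clock before each page and publishing progress.
function generateReport(int $totalPages, string $progressFile): void
{
    foreach (range(1, $totalPages) as $page) {
        set_time_limit(60);        // restart the clock: 60 s per page

        // ... render page $page into the PDF here ...

        // Progress in "current/total" form for the frontend to display.
        file_put_contents($progressFile, "$page/$totalPages");
    }
}
```

Because set_time_limit() restarts the timer each time it is called, the script as a whole can run for hours as long as no single page takes longer than the per-page allowance.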
Here are some links for further reading:
http://www.php.net/manual/en/book.pdf.php
How to create cron job using PHP?
http://www.php.net/manual/en/info.configuration.php
http://www.php.net/manual/en/function.set-time-limit.php
