Closed. This question needs to be more focused. It is not currently accepting answers. Closed 6 years ago.
I'm using PHP-FPM 5.6 version.
php -v shows there's OPcache in place.
I have a PHP script that accepts parameters and returns the same 2.2 kB HTML output every time.
The script execution does not involve any connectivity to MySQL.
In Chrome Developer Tools, I'm seeing an execution time of 900ms.
I find that this is relatively slow.
I would like to shorten this execution time.
Given that OPcache is in place with this version of PHP, can I use it to cache the result of my PHP script execution for a faster response time?
Or is there an alternative approach?
Any configuration to be tweaked in php.ini, /etc/php.d/10-opcache.ini or /etc/php-fpm.d/www.conf?
And how do I purge the cached result when needed?
I know this doesn't directly answer your question, but it might be useful. One way to measure execution time is with these two functions:
function getMicroTime()
{
    // return the wall-clock time in seconds, with microsecond precision
    return microtime(true);
}
function getUserTime()
{
    // this clock only advances while the script consumes CPU cycles, so it runs
    // slower than the normal clock; use it to accurately measure the CPU time
    // of a script instead of the function above
    $usage = getrusage();
    return $usage["ru_utime.tv_sec"] + 0.000001 * $usage["ru_utime.tv_usec"];
}
At the beginning of your script you store the start times:
$startMicroTime = getMicroTime();
$startUserTime = getUserTime();
So that at the end you can echo the execution time:
echo 'Full time = '.round(1000*(getMicroTime()-$startMicroTime),2).' ms<br>';
echo 'User time = '.round(1000*(getUserTime()-$startUserTime),2).' ms<br>';
Again, this doesn't answer your question, but it could be useful. OK, to make this a valid answer, have a read here:
https://www.addedbytes.com/articles/for-beginners/output-caching-for-beginners/
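The linked article is about output caching, which is what actually fits here: OPcache only caches compiled bytecode, not your script's output, so a small output cache has to be written by hand. A minimal sketch, assuming the output depends only on the query string (the cache directory, lifetime, and placeholder output are made up for illustration):

```php
<?php
// Hypothetical cache location and lifetime; adjust for your setup.
$cacheDir  = sys_get_temp_dir() . '/html-cache';
$cacheLife = 300; // seconds before a cached copy is considered stale
$query     = isset($_SERVER['QUERY_STRING']) ? $_SERVER['QUERY_STRING'] : '';
$cacheFile = $cacheDir . '/' . md5($query) . '.html';

if (is_file($cacheFile) && time() - filemtime($cacheFile) < $cacheLife) {
    readfile($cacheFile); // serve the cached copy and skip all the work below
    exit;
}

ob_start(); // buffer everything the script prints

// ... the expensive part of the script runs here and echoes its HTML ...
echo '<html><body>expensive output</body></html>';

if (!is_dir($cacheDir)) {
    mkdir($cacheDir, 0755, true);
}
file_put_contents($cacheFile, ob_get_contents()); // save a copy for next time
ob_end_flush(); // and still send the output to the client
```

To purge the cached result, simply delete the cache file(s): unlink() the file for one parameter set, or remove the whole cache directory.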
Closed. This question needs to be more focused. It is not currently accepting answers. Closed 7 years ago.
I have a start_processing("set1") function which takes 6 hours to complete set 1. I want to process set 2, set 3, and so on at the same time. How can I process all of them in parallel? When I run them sequentially:
<?php
start_processing("set1");
start_processing("set2");
start_processing("set3");
?>
it takes 18 hours. I want all of the processing to complete in 6 hours.
Finally I got a solution: I ended up using curl_multi, and it is far better. It saves the handshakes; they are not needed every time!
Use curl_multi_init to run the processes in parallel. This can have a tremendous effect.
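A minimal curl_multi sketch, assuming each set is processed by requesting a URL (the endpoint URLs are made up for illustration):

```php
<?php
// Hypothetical endpoints, one per working set.
$urls = [
    'http://example.com/process.php?set=1',
    'http://example.com/process.php?set=2',
    'http://example.com/process.php?set=3',
];

$mh = curl_multi_init();
$handles = [];
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Drive all transfers until every one has finished.
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh); // wait for activity instead of busy-looping
} while ($running > 0);

foreach ($handles as $ch) {
    $result = curl_multi_getcontent($ch);
    // ... use $result ...
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```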
Unless you are using PHP as an Apache module, you can use pcntl_fork to create several processes, each of which handles one function call.
if (pcntl_fork()) {
    start_processing("set1"); // parent
} elseif (pcntl_fork()) {
    start_processing("set2"); // first child
} else {
    start_processing("set3"); // grandchild
}
If you have a varying number of working sets, just put them in an array and loop through it. Just bear in mind that too many processes could overload your system!
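Along those lines, a sketch that forks one child per working set and then waits for them all; it assumes the pcntl extension and the CLI, and uses a stub in place of the question's 6-hour start_processing():

```php
<?php
// Stub standing in for the question's long-running job.
function start_processing($set) {
    echo "processing $set in pid " . getmypid() . "\n";
}

$sets = ['set1', 'set2', 'set3']; // any number of working sets

$children = [];
foreach ($sets as $set) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die("fork failed\n");
    } elseif ($pid === 0) {
        start_processing($set); // child: do one set, then quit
        exit(0);
    }
    $children[] = $pid; // parent: remember the child and keep forking
}

// Parent: wait for every child so none are left as zombies.
foreach ($children as $pid) {
    pcntl_waitpid($pid, $status);
}
```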
Another, more lightweight option is PHP pthreads, which AFAIK works with Apache but requires installing the corresponding PHP extension first.
A third possibility, as mentioned by sandeep_kosta and Niranjan N Raju, is to create one cron job for each working set.
Closed. This question needs to be more focused. It is not currently accepting answers. Closed 8 years ago.
I have a LAMP website up and running. It is fairly fast, but I would like to improve its speed. I recognize you do not have the specific PHP scripts, SQL queries, caching configuration, server configuration, etc., and I am not asking questions at that level. Instead, I am asking whether there is a general best-practice approach, and an order of steps, for identifying bottlenecks so that website speed can be improved.
I don't know if there is a general way to optimise for speed, but personally, every time I write a new section of code, I always know how long that code takes to execute, simply by measuring it.
If you are retrospectively looking at a lot of code that's already been written, just break it down into sections and measure the time taken by each section.
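For example, a throwaway way to time each section; the two stub functions here are invented stand-ins for real sections of code:

```php
<?php
// Stubs standing in for two real sections of code (hypothetical names).
function load_config() { usleep(20000); } // pretend this takes ~20 ms
function run_queries() { usleep(50000); } // pretend this takes ~50 ms

$t = microtime(true);
load_config();
$configMs = round(1000 * (microtime(true) - $t), 2);
echo "config:  {$configMs} ms\n";

$t = microtime(true);
run_queries();
$queriesMs = round(1000 * (microtime(true) - $t), 2);
echo "queries: {$queriesMs} ms\n";
```

Whichever section dominates the total is the one worth optimising first.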
I permanently have these lines wrapped around the main body of my code:
<?php
$start_time = microtime(true); // Note the start time.
$html = get_html(); // Main body of code.
// Calculate peak memory usage and time taken.
$mem = number_format(memory_get_peak_usage(true) / 1024 / 1024, 1).'mb';
$tmr = number_format(microtime(true) - $start_time, 2).'s';
echo "{$html}\n<!-- Peak memory usage: {$mem}. Response time: {$tmr} -->";
?>
That will show how long get_html() is taking. If you compare that to the waiting time reported by the browser, you can tell how much of the overall response time is consumed by your code and how much is consumed by things other than your code.
Hope that helps.
Closed. This question is opinion-based. It is not currently accepting answers. Closed 9 years ago.
I need to print an aging report with a large number of pages using PHP. The number of pages may be up to a thousand. What's the best way to print this report?
- direct send to printer
- css
- pdf
- others?
Any suggestion?
Thank you!
If there are thousands of pages I would recommend using a background task to generate a PDF file.
To do this, your frontend can write, for example, a database entry that says "todo". Then configure a cron job or similar that calls a PHP script, e.g. every minute, to check for a PDF-generator todo.
In your PHP configuration, check that you are allowed to use the set_time_limit() function. This function resets the maximum execution time of a script. You must call it in your generator script before the maximum execution time has been exceeded; otherwise your PHP script will be stopped before the work is done.
Another problem may be the PHP memory limit. You can increase it by setting memory_limit to a value that matches your use case. Try different values to find a limit that is high enough to run your script, and don't keep too much data in memory at once, to avoid high memory usage.
While the background script is running, it can write a file or a database entry that a frontend reads to show the PDF generator's progress. When the generator has finished, you can offer the file for download in the frontend.
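Putting those pieces together, a runnable sketch of the generator script; an in-memory SQLite database stands in for your real database, and the table and column names are invented:

```php
<?php
set_time_limit(0);               // allow the generator to run past max_execution_time
ini_set('memory_limit', '512M'); // pick a value that matches your use case

// SQLite in memory stands in for the real job queue so the sketch runs anywhere.
$pdo = new PDO('sqlite::memory:');
$pdo->exec("CREATE TABLE pdf_jobs (id INTEGER PRIMARY KEY, status TEXT, progress INTEGER)");
$pdo->exec("INSERT INTO pdf_jobs (status, progress) VALUES ('todo', 0)"); // written by the frontend

// Claim one "todo" job, if any; otherwise exit and let cron call us again.
$job = $pdo->query("SELECT id FROM pdf_jobs WHERE status = 'todo' LIMIT 1")->fetch(PDO::FETCH_ASSOC);
if ($job) {
    $id = (int)$job['id'];
    $pdo->exec("UPDATE pdf_jobs SET status = 'running' WHERE id = $id");

    // ... generate the PDF page by page, updating progress so a frontend can show it ...
    for ($page = 1; $page <= 10; $page++) {
        $pdo->exec("UPDATE pdf_jobs SET progress = $page WHERE id = $id");
    }

    // Mark the job done so the frontend can offer the file for download.
    $pdo->exec("UPDATE pdf_jobs SET status = 'done' WHERE id = $id");
}
```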
Here are some links for further reading:
http://www.php.net/manual/en/book.pdf.php
How to create cron job using PHP?
http://www.php.net/manual/en/info.configuration.php
http://www.php.net/manual/en/function.set-time-limit.php
Closed. This question is opinion-based. It is not currently accepting answers. Closed 9 years ago.
I need a script to work forever; that is, it will generate information in the background without stopping, downloading stuff and storing information in the database, as well as performing calculations.
It seems possible in PHP like here:
<?php
ignore_user_abort(); // run script in background
set_time_limit(0); // run script forever
$interval=60*15; // do every 15 minutes...
do{
// add the script that has to be ran every 15 minutes here
// ...
sleep($interval); // wait 15 minutes
} while(true);
?>
but it's accepted generally that PHP was not designed with this in mind.
1) Are there any drawbacks to the PHP way (for instance, how do you stop this script?), or is there a better language for this, like C++?
2) What do companies like Google, which indexes the web, do?
Long-running scripts are generally a bad idea; they're scripts for a reason. I'm not sure there is a practical way to stop a PHP script once it is running. I'd recommend writing a program that runs on a server, using a language designed for that kind of work rather than a scripting language.
A simple C# or Java program can run forever, as long as you don't close it. You can manipulate databases using the corresponding language's database support.
What you're doing is typically accomplished in PHP using cron jobs.
http://www.serverwatch.com/server-tutorials/a-primer-for-scheduling-cron-jobs-in-linux.html
(or google "cron job" + your OS of choice)
A cron job can be scheduled to execute an arbitrary script every 15 minutes pretty easily.
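For example, a crontab entry like the following runs the job every 15 minutes; the script path and log file are hypothetical:

```
*/15 * * * * /usr/bin/php /path/to/script.php >> /var/log/script.log 2>&1
```

Stopping the job is then as simple as removing the crontab line, which the forever-loop approach cannot offer.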
Designing PHP scripts to run forever is considered bad practice because PHP was never designed with that in mind.
Closed. This question is opinion-based. It is not currently accepting answers. Closed 9 years ago.
I am making changes to a script a freelancer made and noticed that at the top of his function he has set the timeout limit to 0 so it won't time out. Are there any implications to doing this? Or should I properly test his script, work out how long it takes, and check whether it will time out on the server?
Edit: adding more detail.
It's a script that creates a temporary DB table and populates it by pulling data from a number of different tables; it then outputs the result to a CSV using MySQL's OUTFILE.
From the PHP docs:
set_time_limit — Limits the maximum execution time
seconds: The maximum execution time, in seconds. If set to zero, no time limit is
imposed.
http://nl3.php.net/manual/en/function.set-time-limit.php
If you set the time limit to zero, the script can run forever without being stopped by the PHP interpreter. Most of the time this is a bad idea: if you have a bug in your code that results in an endless loop, the script will run forever and eat your CPU.
A lot of the time I've seen this in code without a good reason, so I assume the developer couldn't think of an appropriate timeout and just set 0.
There are a few valid reasons to set a timeout of 0, for example a daemon script.
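One middle ground for a long export like this: instead of disabling the limit entirely, reset a modest limit before each chunk of work, since set_time_limit() restarts the timer each time it is called. A sketch with an invented chunk loop:

```php
<?php
// Instead of set_time_limit(0), give each chunk of work a fresh, finite budget.
// A runaway loop then still gets killed within 60 seconds of its last good chunk.
$chunks = range(1, 5); // stands in for batches of rows being written to the CSV

foreach ($chunks as $chunk) {
    set_time_limit(60); // resets the timer: this chunk gets 60 seconds
    // ... pull this chunk's rows and append them to the CSV file ...
    usleep(1000); // placeholder for the real work
}
echo "exported " . count($chunks) . " chunks\n";
```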