My website runs PHP code from various programmers whose project scripts I have bought. Some use sessions (session_start() etc.),
some do their work in external include files and return or echo results, and some run only on demand, like the search script.
Is there an easy way for me to temporarily monitor the delays of all these scripts, in milliseconds, so that I can see what's going on under the hood?
I once saw a programmer's page that printed long lists of sentences with various ms numbers at the bottom.
Q1. Is there a default php function for this? How do I call/toggle this?
Q2. What are the various methods with which such calculations are made?
Q3. How reliable are they? Are those milliseconds theoretical, or an actual real-world result?
Thanks for your insight!
Sam
No default method I can think of, but it's easy. At the start of your script simply place this:
$s = microtime(true);
and at the end
$e = microtime(true);
echo round($e - $s, 2) . " Sec";
Normally you would leave the second parameter of round() as it is, but if you find that your script reports the time as '0 Sec', increase the number until you get an answer. Check the microtime() documentation for more.
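If you want the per-step listing described in the question, a small checkpoint helper is a common pattern. This is just a sketch; checkpoint() and the labels are made-up names, and usleep() stands in for real work:

```php
<?php
// Minimal checkpoint timer: each call prints a label and the
// milliseconds elapsed since the script started.
$GLOBALS['__start'] = microtime(true);

function checkpoint(string $label): void
{
    $ms = (microtime(true) - $GLOBALS['__start']) * 1000;
    echo $label . ': ' . round($ms, 2) . " ms\n";
}

checkpoint('start');
usleep(50000);               // stand-in for an include or a DB query
checkpoint('after search');  // prints something like "after search: 50.2 ms"
```

Sprinkle checkpoint() calls around the includes you bought, and you get exactly the kind of labelled ms list you saw, which you can remove again once you've found the slow spots.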
If you're running an Apache web server, you should have the Apache benchmarking tool (ab), which can give very accurate information about script timings and can even simulate a number of concurrent users.
From a web browser, the Firebug extension of Firefox can also be useful as a tool for seeing how long your own requests take.
Neither of these methods is purely a timer for the PHP code, though.
The easiest/fastest way is to install a debugging extension that supports profiling, like XDebug. You can then run a profiling tool (e.g.: KCachegrind) to profile your scripts, graph the results, and figure out what uses the most memory, execution time, etc.
It also provides various other functionalities like stack tracing, etc.
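For reference, turning the profiler on is a php.ini change. This sketch uses Xdebug 3 directive names (Xdebug 2 used xdebug.profiler_enable instead), and the output directory is an assumption:

```ini
; Xdebug 3: enable profiling; cachegrind.out.* files are written to
; the output directory and can be opened in KCachegrind.
zend_extension=xdebug
xdebug.mode=profile
xdebug.output_dir=/tmp/xdebug   ; assumed path, adjust to taste
```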
Related
I am working with action script 3 and often I see server calls that link to php files.
var serverCall:ServerCall = new ServerCall("getDeviceFirmwareLog", getDeviceFirmwareLogResponse, getDeviceFirmwareLogResponse, false);
This line calls some PHP functions that cannot be searched in my IDE, so I usually start here: I try to grep for the string "getDeviceFirmwareLog", and then I run into some PHP that makes other odd calls that somehow reach code on the embedded hardware we run. In general, when I grep for that string I don't even get any results, and I'm confused as to how it might be connected.
I am much more used to regular code calls and includes that are easier to follow. I've asked some people at work but it seems to get glossed over and I don't want to ask the same question a third time until I've exhausted my other options. I am wondering if there are any general debugging / code following tips for this kind of a setup that could help me understand what is going on in my codebase.
Thanks in advance.
Without intimate knowledge of your environment, I'd say it appears ServerCall is a custom socket class that calls external functions, with n number of arguments.
getDeviceFirmwareLog would therefore be the function being called, and would be a native function to the API of the hardware (not PHP); this is why you wouldn't be able to find it with a grep search.
Consequently, unless it's rigged with event listeners, ServerCall would populate with the requested data asynchronously (which would likely still fire an event when the request completed).
As you're working with both Flash and PHP, it appears as though you might be testing this through a browser. If so, you could always try the native debugging tools in your browser (F12).
The PHP portion is harder since it's server-side scripting; however, take a look at the Eclipse PDT plugin, which offers debugging facilities for PHP code.
I have a very troubling problem at hand. I am using a web-socket server that runs in PHP. The issue is I need to be able to use a setInterval/setTimeout function similar to JavaScript, but within my php socket server.
I do not have the time or resources to convert my entire project over to Node.js/JavaScript. It would take forever. I love PHP so much that I do not want to make the switch. Everything else works fine, and I feel it's not worth re-writing everything just because I cannot use a setInterval-like function inside PHP.
Since the php socket server runs through the shell, I can use a setInterval type function using a loop:
http://pastebin.com/nzcvXRph
This code does work as intended, but it seems wasteful, and I suspect that while loop will consume a lot of CPU.
Is there any way I can re-compile PHP from source and include a "while2" loop that only iterates every 500 milliseconds instead of instantly?
You don't need to recompile PHP for this.
If you want to delay each iteration of the loop, you can use the sleep() function (or usleep() for sub-second delays), which exists precisely for delaying execution.
For example, if I want to print the numbers 1 to 10, one every 2 seconds, the code below does the job:
for ($i = 1; $i <= 10; $i++)
{
    print($i);
    sleep(2);
}
Check the PHP sleep() docs.
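For the 500-millisecond interval from the question specifically, usleep() (which takes microseconds) is a closer fit than sleep() (whole seconds). A minimal sketch; doWork() is a made-up placeholder for the socket server's per-tick logic:

```php
<?php
// Tick roughly every 500 ms without burning CPU in a tight loop.
// usleep() takes microseconds, so 500 ms = 500000.
function doWork(int $tick): void
{
    echo "tick $tick\n";   // placeholder for the real work
}

for ($tick = 1; $tick <= 3; $tick++) {
    doWork($tick);
    usleep(500000);        // pause ~500 ms between iterations
}
```

In a socket server you would more likely combine this with a timeout on socket_select()/stream_select(), so the loop wakes up either on incoming data or after the interval, whichever comes first.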
EDIT
Following up on what I mentioned in the replies: if you want each user to have their own instance of the runtime, then threads would be an option. There are very few examples of multi-threaded applications in PHP, so I'd recommend checking out some examples in Java; they shouldn't be hard to understand. Here is a good video tutorial.
For PHP
php.net/threads
Check out the contributor notes, sometimes people write good examples.
Initial condition: I have code written in a PHP file. Initially, executing it took 30 seconds; within this file the code was called 5 times.
What happens next: if I need to execute this code 50 times, it will take 300 seconds in one browser request; for 500 times, 3000 seconds. So it is serial execution of the code.
What I need: I need to execute this code in parallel, as several instances, to minimize the execution time so the user doesn't have to wait so long.
What I did: I used PHP cURL to execute this code in parallel, calling the file several times to reduce the execution time.
So I want to know whether this method is correct, how many cURL calls I can run, and how many resources they require. I'd appreciate a better method for executing this code in parallel, ideally with a tutorial.
Any help will be gratefully received.
Probably the simplest option without changing your code (too much) would be to call PHP through the command line rather than cURL. This cuts the overhead of Apache (both in memory and speed), networking, etc. Also, cURL is not a portable option, as some servers can't reach themselves (in network terms).
$process1 = popen('php myfile.php [parameters]', 'r');
$process2 = popen('php myfile.php [parameters]', 'r');
// get the response from each child: you can loop until all have completed
$response1 = stream_get_contents($process1);
$response2 = stream_get_contents($process2);
pclose($process1);
pclose($process2);
You'll need to remove any reference to Apache-added variables in $_SERVER, and replace $_GET with argv/argc references. But otherwise it should just work.
But the best solution will probably be pThreads (http://php.net/manual/en/book.pthreads.php) that allow you to do what you want. Will require some editing of code (and installing, possibly) but does what you're asking.
PHP cURL has low enough overhead that you don't have to worry about it. If you can make loopback calls to a server farm through a load balancer, that's a good use case for cURL. I've also used pcntl_fork() for same-host parallelism, but it's harder to set up. I've written classes built on both; see my PHP lib at https://github.com/andrasq/quicklib for ideas (or just borrow code, it's open source).
Consider using Gearman. Documentation :
http://php.net/manual/en/book.gearman.php
I'm working on a hardware project that (currently) uses the Beaglebone Black (eventually custom hardware will be used), and am attempting to set up a webpage on its web server that returns the status of all four leds, live, in a loop. And I want this webpage to be able to be accessed by multiple people simultaneously.
I've got the webpage updating the values of the LEDs live, but the problem is that if two instances of the webpage are open, they start to behave weirdly and eventually crash.
The webpage uses a jquery timer that executes every 10ms.
<script>
var Timer=setInterval(function(){GetLed()}, 10);
function GetLed()
{
$("#div1").load("getled3.php");
}
</script>
getled3.php uses PHP to execute four Linux commands (though these might eventually be C++ programs as we expand the capability of the webpage), and prints the results:
<?php
exec("cat /sys/class/leds/beaglebone:green:usr0/brightness", $Led0);
exec("cat /sys/class/leds/beaglebone:green:usr1/brightness", $Led1);
exec("cat /sys/class/leds/beaglebone:green:usr2/brightness", $Led2);
exec("cat /sys/class/leds/beaglebone:green:usr3/brightness", $Led3);
print($Led0[0] . ", " . $Led1[0] . ", " . $Led2[0] . ", " . $Led3[0]);
?>
I'm willing to do my own research into how to create a web application that plays nicely with multiple users, but so far, my searches haven't turned up any useful results, which could just be my wording.
If anyone could just post some links that would point me in the right direction for creating a web application that controls hardware, that would be most helpful, as the eventual implementation will be extremely complex, and might include the hardware running a "master" application on a loop, with the web page providing the user the ability to alter hardware set up.
I guess the best example of something like what we're doing would be a dynamic router set up page.
Thanks in advance. I'm going to keep searching in the meantime.
Edit: (with responses to comments)
I was looking to see if there was a standard way of doing this, or at least a best-practice. The end product will eventually allow the user to change settings on the hardware, force the hardware to send information to other hardware, read information about the hardware and other hardware attached to the beaglebone black. It'll eventually get quite expansive, and so what I really need is a resource (perhaps a book) where I can read about how this sort of thing is usually done.
The whole thing will eventually incorporate PWMs, GPIOs, ADCs, etc.
As for the method of accessing the leds, I understand "exec cat" isn't the best way to get that information.
I have since changed the entire set up so that now, when the beaglebone black boots, it loads a c++ program that runs in a loop, and writes files with hardware information.
Then, the webpage calls that were originally running "exec cat" are now just loading the program's output file into the browser. This solved the crashing problem, but just doesn't feel like the correct method of doing this project, because there would be a ton of files with information in them about the ADC values, the PWM values, etc. To further convolute things, the file accesses would really need a mutex to prevent the c++ program from writing a file while the web program was reading it, and a mutex to prevent the web program from reading a file if the c++ program were currently writing it. That would make 3 files per process.
One approach is to build a shell script (or small daemon) that stores the LED states in a local variable or file. Note that cron's minimum granularity is one minute, so for 10 ms updates you would need a looping background process rather than a cron job.
The web page then only needs to read the stored value, rather than querying the hardware on every request.
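As a side note on the exec("cat ...") calls: PHP can read those sysfs files directly with file_get_contents(), which avoids spawning a shell and cat four times per request. A sketch; the function name and the $base parameter are mine (on the board, $base would be /sys/class/leds):

```php
<?php
// Read the four LED brightness values by opening the sysfs files
// directly instead of shelling out to `cat` via exec().
function readLedStates(string $base): array
{
    $states = [];
    for ($i = 0; $i < 4; $i++) {
        $path = "$base/beaglebone:green:usr$i/brightness";
        // @ suppresses the warning if a path is missing; the value
        // becomes an empty string in that case.
        $states[] = trim((string) @file_get_contents($path));
    }
    return $states;
}

echo implode(', ', readLedStates('/sys/class/leds'));
```

Whether the values come from sysfs or from the files your C++ loop writes, reading them in-process is cheaper and less fragile under concurrent requests than four exec() calls per page load.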
I'm building a feature of a site that will generate a PDF (using TCPDF) into a booklet of 500+ pages. The layout is very simple but just due to the number of records I think it qualifies as a "long running php process". This will only need to be done a handful of times per year and if I could just have it run in the background and email the admin when done, that would be perfect. Considered Cron but it is a user-generated type of feature.
What can I do to keep my PDF rendering for as long as it takes? I am "good" with PHP but not so much with *nix. Even a tutorial link would be helpful.
Honestly, you should avoid doing this in-request entirely, from a scalability perspective. I'd use a database table to "schedule" the job with its parameters, and have a script continuously checking this table. Then use JavaScript to poll your application until the file is "ready", and let the JavaScript pull the file down to the client.
It will be incredibly hard to maintain and troubleshoot this process while you're wondering why your web server is suddenly so slow. Apache doesn't make it easy to determine which process is eating up which CPU.
Also by using a database you can do things like limit the number of concurrent threads, or even provide faster rendering time by letting multiple processes render each PDF page and then re-assemble them together with yet another process... etc.
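The scheduling idea above can be sketched with a small jobs table. The table layout and the use of SQLite here are my choices for illustration, and renderPdf() is a made-up stand-in for the real TCPDF work:

```php
<?php
// Sketch: the web request inserts a row; a separate worker
// (run from the shell, a daemon, etc.) claims pending jobs and
// marks them done when the render finishes.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec("CREATE TABLE jobs (
    id     INTEGER PRIMARY KEY,
    params TEXT,
    status TEXT NOT NULL DEFAULT 'pending'
)");

// Web request: just schedule the work and return immediately.
$db->prepare('INSERT INTO jobs (params) VALUES (?)')
   ->execute(['booklet-500-pages']);

// Worker loop body: claim one pending job and process it.
$job = $db->query("SELECT id, params FROM jobs
                   WHERE status = 'pending' LIMIT 1")
          ->fetch(PDO::FETCH_ASSOC);
if ($job !== false) {
    // renderPdf($job['params']);   // the long-running TCPDF work
    $db->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")
       ->execute([$job['id']]);
}

echo $db->query('SELECT status FROM jobs')->fetchColumn(); // done
```

The JavaScript side then only has to poll a cheap "is job N done yet?" endpoint, and the heavy rendering never ties up an Apache worker.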
Good luck!
What you need is to change the maximum execution time allowed for PHP scripts. You can do that by several means: from the script itself (prefer this if it works) or by changing php.ini.
BEWARE: raising the execution time might seriously lower the performance of your server. A script is allowed to run only a certain time (30 seconds by default) before the parser terminates it; this helps prevent poorly written scripts from tying up the server. Know exactly what you are doing before you change it.
You can find some more info about:
setting max-execution-time in php.ini here http://www.php.net/manual/en/info.configuration.php#ini.max-execution-time
limiting the maximum execution time by set_time_limit() here http://php.net/manual/en/function.set-time-limit.php
PS: This should work if you use PHP itself to generate the PDF. It will not work if you use something outside the script (called via exec(), system(), and similar).
This question is already answered, but as a result of other questions / answers here, here is what I did and it worked great: (I did the same thing using pdftk, but on a smaller scale!)
I put the following code in an iframe:
set_time_limit(0);            // ignore PHP's execution timeout
//ignore_user_abort(true);    // optional: keep going even if the user pulls the plug
while (ob_get_level()) {      // remove any output buffers
    ob_end_clean();
}
ob_implicit_flush(true);
This avoided the page load timeout. You might want to put a countdown or progress bar on the parent page. I originally had the iframe issuing progress updates back to the parent, but browser updates broke that.