Estimate server load for an API - PHP

I want to know how many resources (MySQL time, CPU usage, bandwidth, etc.) my system will use with a certain number of API calls per second.
One API call reads its parameters in PHP, performs 1 to 5 SQL queries, and returns an XML file.
How can I do that? Any idea of a formula or something?

Try using the xhprof PHP extension to do some profiling. It will tell you things such as how much time is spent in each function call, memory used, CPU used, etc. Do a few runs with it and you should have a good idea about the resources you are using. I've found it to be one of the most useful tools available in PHP. For more info, see this documentation.
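If the extension is installed, a run can be wrapped like this. This is just a minimal sketch; handle_api_call() is a placeholder for your own entry point:

<?php
// Profile one API call end-to-end with xhprof.
xhprof_enable(XHPROF_FLAGS_CPU | XHPROF_FLAGS_MEMORY);

handle_api_call(); // placeholder for the real work

$data = xhprof_disable();
// 'main()' holds the totals: wt = wall time (µs), cpu = CPU time (µs),
// mu = memory (bytes), pmu = peak memory (bytes).
print_r($data['main()']);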

The only real way to do this is by benchmarking it. It depends entirely on what your code is doing - I could write an SQL query that takes minutes to run, and I could write an SQL query that takes milliseconds.
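For a first rough number you can also just time the call yourself before reaching for a profiler. A sketch, where do_api_call() stands in for your own code:

<?php
// Rough wall-clock benchmark, repeated to average out noise.
$runs = 1000;
$start = microtime(true);
for ($i = 0; $i < $runs; $i++) {
    do_api_call(); // placeholder for one full API call
}
$elapsed = microtime(true) - $start;
printf("avg: %.3f ms per call\n", ($elapsed / $runs) * 1000);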

Related

Determine how much resource is used by a PHP script (CPU percentage and memory)

I'm currently using the xhprof library forked by tideways.io for profiling the execution of myscript.php. From xhprof, I can get the wall time, CPU time, memory usage, and peak memory usage. I'm trying to benchmark a Symfony console command, so I call TIDEWAYS_ENABLE() on its ConsoleCommandEvent and TIDEWAYS_DISABLE() on its ConsoleTerminateEvent.
Questions:
How can I determine what percentage of CPU myscript.php consumes? Can I just compute the percentage as cpuusage = cputime / realtime, as stated here on Server Fault?
Given the memory usage (mu) and peak memory usage (pmu) from the xhprof profiler, how do I convert or calculate its RAM usage? (top shows a value much higher than the memory usage but somewhere near the peak memory usage, so can I say that the RAM usage is the value of peakmemoryusage?)
Notes:
It's different from this How can I get the CPU and Memory usage, because what I want is not the system load. P.S. there might be multiple scripts.php running at the same time.
I am not sure what you want to determine using the data, but if you're trying to do any comparative analysis (e.g. "when I remove this loop, it's 25% faster and uses 10% less memory") I'd suggest that you use https://blackfire.io/.
It's easy to set up and easy to use. It also offers automatic comparative analysis between requests, and it does multiple passes internally to make sure that external factors are averaged out.
I can't really say whether the CPU and memory usage data is accurate in absolute terms. It is definitely accurate relative to other data in Blackfire, so it works great if you just need to compare runs. You should take it with a grain of salt if you need absolute numbers (how many scripts like this is it safe to run on a 4GB server?), but I'd say it's pretty accurate, and definitely much better than Xdebug's profiler.
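As for the ratio from the question itself, the totals xhprof reports are enough to compute it. A sketch, assuming $data holds the array returned by TIDEWAYS_DISABLE() and that your build reports wt and cpu in microseconds:

<?php
$totals = $data['main()'];
$cpuPercent = $totals['cpu'] / $totals['wt'] * 100; // cpuusage = cputime / realtime
$peakMb     = $totals['pmu'] / 1024 / 1024;         // bytes -> MB
printf("CPU: %.1f%%  peak memory: %.2f MB\n", $cpuPercent, $peakMb);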

How to track MySQL query performance with PHP?

Is using the simple get_memory_usage() a reliable method to track the memory usage of a specific MySQL query?
Does get_memory_usage() also count the SQL process, or is that separate?
No, if you are asking about memory_get_usage. It counts memory used by the PHP script only. MySQL is a separate process, running independently of your PHP script.
I found Apache ab (http://httpd.apache.org/docs/2.0/programs/ab.html). It doesn't tell you memory usage, but it does tell you the time needed to complete a request. With 10000 requests you then get the average time per request. This was a fine solution for me because it clearly showed whether a specific query took too many resources or not.
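For reference, a run like the one described might look like this (the URL is a placeholder; -n is the total number of requests, -c the concurrency):

ab -n 10000 -c 10 "http://localhost/api.php?id=1"

The summary it prints includes the mean time per request and the requests per second the server sustained.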

Increase the execution time in PHP

My web site page contains a lot of queries, so it takes a lot of time to execute and ends in an error. Could you please tell me how to increase the execution time (maybe the query execution time) through PHP code?
Thanks in advance
Use max_execution_time:
ini_set('max_execution_time', 14000); // adjust value
Also make sure that your code is correct, and try to see if there is any room for improvement in its speed. For example, you should have a look at:
Query optimization techniques
PHP Optimization Tricks
PHP Micro Optimizations
If this website is supposed to be for normal users, you will have to optimize your queries, no way around it. No user will want to wait more than a few seconds for a page to load, and if you're already surpassing the execution time limit, it's already way too slow! Extending the time limit is not a solution.
If, OTOH, this page is supposed to be more of a maintenance script, you should run it from the CLI or as a cron job where execution time limits don't exist.
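For the cron route, a crontab entry like this one (the paths are placeholders) runs the script every five minutes through the PHP CLI, where max_execution_time defaults to unlimited:

*/5 * * * * /usr/bin/php /path/to/maintenance.php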
You can use the set_time_limit function to set the number of seconds a script is allowed to run.
But my advice would be to rethink/rewrite your queries, because if you hit the PHP execution limit you may be doing too much work in the database, or doing the work inefficiently.
There is always room for improvement. Raising the PHP execution time is only a temporary fix, and "temporary fix" is one of the most hated phrases in computer programming. So again, look at your database schema, analyze your queries, check for indexes, etc.
For high-concurrency I/O you may also consider a key-value cache such as APC or Memcached.
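To find the queries worth rewriting, MySQL's EXPLAIN shows whether an index is being used. A sketch with placeholder credentials and a placeholder query:

<?php
$db = new mysqli('localhost', 'user', 'pass', 'mydb');
$result = $db->query('EXPLAIN SELECT * FROM articles WHERE author_id = 42');
print_r($result->fetch_all(MYSQLI_ASSOC)); // check the key and rows columns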

PHP memory: how much is too much?

I'm currently rewriting my site using my own framework (it's very simple and does exactly what I need; I've no need for something like Zend or CakePHP). I've done a lot of work making sure everything is cached properly, caching pages in files to avoid SQL queries and generally limiting the number of SQL queries.
Overall it looks like it's very speedy. The average time taken for the front page (taken over 100 times) is 0.046152 microseconds.
But one thing I'm not sure about is whether I've done enough to reduce PHP memory usage. The only time I've ever encountered problems with it is when uploading large files.
Using memory_get_peak_usage(TRUE), which I THINK returns the highest amount of memory used whilst the script has been running, the average (taken over 100 times) is 1572864 bytes.
Is that good?
I realise you don't know what it is I'm doing (it's rather simple: get the 10 latest articles, the comment count for each, the user controls, popular tags in the sidebar, etc.). But would you be at all worried by a script using that sort of memory getting hit 50,000 times a day? Or once every second at peak times?
I realise that this is a very open-ended question. Hopefully you can understand that it's a bit of a stab in the dark, and I'm really just looking for some reassurance that it's not going to die horribly come re-launch day.
EDIT: Just a mini experiment I did for myself. I downloaded and installed WordPress; a default installation with no extra add-ons, just one user and just one post, used 10.5 megabytes of memory, or "11010048 bytes". Quite pleased with my 1.5MB now.
Memory usage values can vary heavily and are subject to fluctuation, but as you already say in your update, a regular WordPress instance is much, much fatter than that. I have had great trouble getting the WordPress backend running with a memory_limit of sixteen megabytes - let alone when plug-ins come into play. So from that, I'd say a peak of 1.5 megabytes performing normal tasks is quite okay.
Generation time obviously depends heavily on the hardware your site runs on. However, a generation time of 0.046152 seconds (I assume you mean seconds here) sounds very okay to me under normal circumstances.
It is a subjective question. PHP has a lot of overhead, and when calling the function with TRUE, that overhead will be included. You'll see what I mean when you call the function in a simple Hello World script. Also keep in mind that results can differ greatly depending on whether PHP runs as an Apache module or via FastCGI.
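You can measure that baseline overhead yourself with a script that does essentially nothing. A minimal sketch:

<?php
// Everything reported here is PHP's own overhead,
// since the script does no real work.
echo "Hello World\n";
printf("peak: %d bytes\n", memory_get_peak_usage(true));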
Unfortunately, no one can provide assurances. There will always be unforeseen variables that can bring down a site. Perform load testing, and use a code profiler to narrow down the location of any bottlenecks to see if there are ways to make those code blocks more efficient.
Encyclopaedia Britannica thought they were prepared when they launched their ad-supported encyclopedia ten years ago. The developers didn't know they would be announcing it on Good Morning America the day of the launch. The whole thing came crashing down for days.
As long as your systems aren't swapping, your memory usage is reasonable. Any additional concern is just premature optimization.

Execute a PHP script every 40 milliseconds?

Is there some way to execute a PHP script every 40 milliseconds?
I don't know if a cron job is the right way, because 25 times per second requires a lot of CPU.
Well, if PHP isn't the correct language, what language should I use?
I am making an online game, and I need something to process what is happening in the game: to move the characters, to calculate projectile paths, etc.
If you try to invoke a PHP script every 40 milliseconds, that will involve:
Create a process
Load PHP
Load and compile the script
Run the compiled script
Remove the process and all of the memory
You're much better off putting your work into the body of a loop, and then using time_sleep_until at the end of the loop to finish out the rest of your 40 milliseconds. Then you run your PHP program once.
Keep in mind, this needs to be a standalone PHP program; running it out of a web page will cause the web server to time out on that page, and then end your script prematurely.
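A sketch of that loop, where update_game_state() is a placeholder for your own logic:

<?php
// Standalone long-running script: start it once from the CLI.
set_time_limit(0);
$next = microtime(true);
while (true) {
    update_game_state();          // placeholder for the real work
    $next += 0.040;               // schedule the next tick, 40 ms later
    if ($next > microtime(true)) {
        time_sleep_until($next);  // sleep out the rest of the 40 ms
    } else {
        $next = microtime(true);  // a tick overran; don't try to catch up
    }
}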
Every 40 milliseconds would be impressive. It's not really suited for cron, which runs on 1-minute boundaries.
Perhaps if you explained why you need that level of performance, we could make some better suggestions.
Another thing you have to understand is that it takes time to create processes under UNIX - this may be better suited to a long-running task started once that just does the desired activity every 40ms.
Update: For an online game with that sort of performance, I think you seriously need to consider having a fat client running on the desktop.
By that I mean a language compiled to machine language (not interpreted) and where the bulk of the code runs on the client, using the network only for transmitting information that needs to be shared.
I don't doubt that the interpreted languages are suitable for less performance intensive games but I don't think, from personal experience, you'll be able to get away with them for this purpose.
PHP is a slow, interpreted language. For it to open a file takes almost that amount of time. Executing a PHP script every 40 milliseconds would lead to a huge queue, and a crash very quickly. This definitely sounds like a task you don't want to use PHP for, but rather a daemon or other fast, compiled binary. What are you looking to do?
As far as I know, a cron job can only be executed every minute. That's the smallest interval possible. I'm left wondering why you need such a short execution interval.
If you really want it to be PHP, I guess you should keep the process running through a shell, as some kind of daemon, instead of opening/closing it all the time.
I do not know how to do it but I guess you can at least get some inspiration from this post:
http://kevin.vanzonneveld.net/techblog/article/create_daemons_in_php/
As everyone else is saying, starting a new process every 40ms doesn't sound like a good idea. It would be interesting to know what you're trying to do. What do you want to do if one execution for some reason takes more than 40ms? If you're not careful you might get lots of processes running simultaneously, stepping on each other's toes.
What language to use will depend a lot on what you're trying to do, but you should choose a language with thread support so you don't have to fork a new process all the time. Java or Python might be suitable.
I'm not so sure every 40 ms is realistic if the back-end job has to deal with things like database queries. You'd probably do better working out a way to adapt to system conditions and trying hard to run N times per second, rather than every 40 ms like clockwork. Again, this depends on the complexity of what you need to accomplish behind the curtain.
PHP is probably not the best language to write this with. This is for several reasons:
Depending on the version of PHP, garbage collection may be broken. If you daemonize, you run a risk of leaking memory N times a second.
Other reasons detailed in this answer.
Try using C or Python and keep track of how long each iteration takes. This lets you make a 'best effort' to run N times a second, or every 40 ms, whichever is greater. It avoids your process running perpetually behind, where every time it finishes it's already late to get started again.
Again, I'm not sure how long these tasks would take under 'worst case' system load, so my answer may or may not apply in full. Regardless, I advise you not to write a standalone daemon in PHP.
PHP is the wrong language for this job. If you want to do something that updates that fast in a browser, you need to use JavaScript. PHP is only for the backend, which means everything PHP does has to be sent from your server to the browser and then rendered.
