Is using a simple get_memory_usage() call a reliable method to track the memory usage of a specific MySQL query?
Does get_memory_usage() also count the MySQL process, or is that separate?
No, if you are asking about memory_get_usage() (there is no get_memory_usage() in PHP). It counts memory used by the PHP script only. MySQL is a separate process, running independently from your PHP script.
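As a minimal sketch of what memory_get_usage() actually measures, assuming a mysqli connection and a placeholder query, you only see the memory the PHP-side result set occupies, not what the MySQL server used:

<?php
// Placeholder credentials and query; adjust to your own setup.
$db = new mysqli('localhost', 'user', 'pass', 'mydb');

$before = memory_get_usage();

$result = $db->query('SELECT * FROM some_table');   // hypothetical query
$rows   = $result->fetch_all(MYSQLI_ASSOC);         // pull all rows into PHP

$after = memory_get_usage();

// Memory held by the PHP script for the result set -- the MySQL server's
// own memory use is not visible from here.
echo ($after - $before) . " bytes used by PHP for the result\n";
?>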
I found Apache ab (http://httpd.apache.org/docs/2.0/programs/ab.html). It doesn't tell you memory usage, but the time needed to complete a request. With 10000 requests you then get the average time per request. This was a fine solution for me because it clearly showed whether a specific query took too many resources or not.
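For example (the URL and concurrency level are placeholders), a run like this prints the mean time per request and the requests per second the page can sustain:

ab -n 10000 -c 10 http://www.example.com/page.php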
I'm puzzled; I assume it's a slow query.
Note: all my queries are tested and run great when there are fewer people using my app/website (less than 0.01 s each).
So I'm seeing high CPU usage with my current setup and I was wondering why. Is it possible it's an index issue?
Our possible solution: we thought we could use an XML cache file to store the information, and then reduce the load on our MySQL queries (update the files each hour).
Will it be good for us to do such things, since we have an SSD drive? Or will it be slower than before?
Currently, at high-traffic times, our website/app can take up to 30 seconds before returning the first byte. My website is running on a Plesk 12 server.
UPDATE
Here's more information about my MySQL setup:
http://pastebin.com/KqvFYy8y
Is it possible it's an index issue?
Perhaps, but not necessarily. You first need to identify which query is slow; you find that in the slow query log. Then analyze the query. This is explained in the literature, or you can contact a consultant/tutor for that.
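As a sketch (the one-second threshold is just an example), you can switch the slow query log on at runtime and then run EXPLAIN on whatever shows up in it:

-- Log every statement that takes longer than 1 second (example threshold).
SET GLOBAL slow_query_log = 'ON';
SET GLOBAL long_query_time = 1;

-- Later, for each query found in the log:
EXPLAIN SELECT ...;   -- paste the slow query here and check the key and rows columns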
We thought we could use an XML cache file to store the information each hour, and then reduce the load on our MySQL queries?
Well, cache invalidation is not the easiest thing to do, but with a fixed rhythm of once an hour it seems easy enough. Just be aware that it will only help if the query you cache is actually the slow one. MySQL normally has a query cache built in; check first whether it is enabled.
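A quick way to check, as a sketch (variable names depend on your MySQL version, and the query cache was removed entirely in MySQL 8.0):

SHOW VARIABLES LIKE 'query_cache%';   -- is it enabled and how big is it?
SHOW STATUS LIKE 'Qcache%';           -- hit and miss counters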
Will it be good for us to do such things?
Normally, if the things you do are good, the results will be good too. Sometimes even bad things lead to good results, so such a general question is hard to answer. Instead, I suggest you gather more concrete information before you continue asking around. Right now this sounds like guessing. Stop guessing. Really, guessing is only for the first two minutes; after that, just stop guessing.
Since we have an SSD drive? Or will it be slower than before?
You can try to throw hardware at it. Again, the literature and a consultant/tutor can help you greatly with that. But just stop guessing. Really.
I assume the query is not slow all the time. If that is true, the query is not very likely to be the problem.
You need to know what is using the CPU. It is likely a runaway script with an infinite loop.
Try this:
<?php
header('Content-Type: text/plain; charset=utf-8');
// system() already echoes the command's output, so no extra echo is needed.
system('ps auxww');
?>
This should return a list in this format:
USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
Scan down the %CPU column and look for your user name in the USER column.
If you see a process taking 100% CPU, you may want to get its PID number and run:
system('kill 1234');
where 1234 is the PID.
The MySQL processes running at 441% and 218% seem very problematic.
Assuming this is a shared server, there may be another user running queries that are hogging the CPU. You may need to take that up with your provider.
I've been watching one of my shared servers, and the CPU for the MySQL process has not gone over 16%.
MySQLTuner
From the link it appears you have heavy traffic.
The Tuner had been running for 23.5 minutes:
Joins performed without indexes: 69863
69863 in 23.5 minutes comes out to almost 50 queries per second.
Does that sound right? Running a query with an unindexed JOIN about 50 times per second?
Index the JOINed table
You have a query with a JOIN.
The tables are joined on one or more columns.
On the joined table, add an index to the column that joins the two tables together.
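A sketch with made-up table and column names (say orders joined to customers on customer_id); adjust them to your own schema:

-- Hypothetical schema: orders.customer_id references customers.id.
ALTER TABLE orders ADD INDEX idx_customer_id (customer_id);

-- Verify that the JOIN now uses the index:
EXPLAIN SELECT * FROM customers c JOIN orders o ON o.customer_id = c.id;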
I'm parsing data from a text file into a MySQL database. The problem is that after parsing a certain number of records (anywhere from 250,000 to 380,000) I get 'Fatal error: Maximum execution time of 30 seconds exceeded'. I can work around this by splitting the file into smaller files, but that is a pain and I'd like to find a way to get PHP to process the whole file.
Is there a way to convince PHP to run lengthy processes, even though I don't have access to php.ini and can't change my maximum execution time?
By the way, here's my parsing script. Maybe I'm doing something wrong in my PHP code.
You may find you can improve performance by inserting several rows at a time. Try this syntax:
INSERT INTO
tbl_name (a,b,c)
VALUES(1,2,3),(4,5,6),(7,8,9)
;
The number of rows you should group together is best found by experimentation. It might be that the more rows you add in one statement, the faster it gets, but equally it might be slow if you don't have auto-commit turned on.
As someone mentioned in the comments, putting too many rows in at once may max out the CPU and raise the eyebrows of the server admin. If this is something you need to be careful with on your host, try 200 rows at a time and a small usleep() between iterations, as in the sketch below.
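Here is a minimal sketch of that batching idea, assuming a mysqli connection in $db and a hypothetical two-column table called log; the batch size of 200 and the 50 ms pause are just starting points:

$batch     = [];
$batchSize = 200;                                    // tune by experiment

foreach ($lines as $line) {                          // $lines = records parsed from your file
    list($a, $b) = explode("\t", trim($line));
    $batch[] = sprintf("('%s','%s')",
        $db->real_escape_string($a),
        $db->real_escape_string($b));

    if (count($batch) >= $batchSize) {
        $db->query('INSERT INTO log (col_a, col_b) VALUES ' . implode(',', $batch));
        $batch = [];
        usleep(50000);                               // 50 ms pause between batches
    }
}

if ($batch) {                                        // flush whatever is left over
    $db->query('INSERT INTO log (col_a, col_b) VALUES ' . implode(',', $batch));
}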
It's worth looking at the connection to your database too: is it on the same server, or on another server? The connection may be slow. Add some timing for, say, how long 5,000 rows take to insert, and then play around to see how to reduce it.
Here's the manual reference for INSERT; note that not every database engine supports this multi-row syntax.
I've heard about cron jobs and don't think actually creating one will be that hard, but I have some concerns about how this will work with a large script.
Without going too far off-topic on my project, I will stick to the basics of my situation. I need a script that, every day, performs a cURL fetch of data from a remote website and updates a database record for each featured member on my website. In short, the script currently needs to run approximately 1000 times, and that number will grow over time.
As you can guess, this will take a long time to perform, so I'm worried about how the execution will work without crashing in the middle of it.
My first thought was to split the users into groups and run the job on a small number of users at a time, but I don't know how manageable that is (I will read further on the topic once I get some form of confirmation on this).
So, to my question: do you think there is any way to make this happen, and do you have any suggestions on how to make it work efficiently? Any help is appreciated. Thank you for your time.
Bigger cron jobs with PHP and MySQL need to be fragmented, since there is no way for you to 'nice' them (reduce their OS priority). Even if you nice the script, the MySQL queries will still run without that lowered priority.
From what you're describing, there are two aspects to consider:
Congestion of network bandwidth
Congestion of database throughput
I'd recommend a fragmented solution where you call your script from cron more often and let each run execute only a small part of the total job, as in the sketch below. The job should also be cancelled (postponed to the next run) if I/O bandwidth or CPU usage is above a limit that might affect response time for visitors.
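A rough sketch of that fragmentation, assuming a hypothetical members table with a last_synced column and a cron entry that runs this script every few minutes (credentials, names, and batch size are placeholders):

<?php
// Process only a small slice of members per cron run.
$db = new mysqli('localhost', 'user', 'pass', 'mydb');   // placeholder credentials

$batchSize = 25;                                          // example slice size
$result = $db->query(
    "SELECT id, remote_url FROM members
     ORDER BY last_synced ASC
     LIMIT $batchSize"
);

while ($member = $result->fetch_assoc()) {
    $ch = curl_init($member['remote_url']);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);                // don't hang on one member
    $data = curl_exec($ch);
    curl_close($ch);

    if ($data !== false) {
        $stmt = $db->prepare(
            'UPDATE members SET profile_data = ?, last_synced = NOW() WHERE id = ?'
        );
        $stmt->bind_param('si', $data, $member['id']);
        $stmt->execute();
    }
}
?>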
One Way:
I'm usually against putting logic in the database, but in this case a stored procedure might help. It will run your job faster (since it's a large one), and you also want to lock the tables while you do it. That way, if the script that calls the stored procedure gets triggered by cron before the previous run has finished, it won't edit your database while the first run is still going.
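A related way to keep two runs from stepping on each other, offered here only as an alternative sketch (the lock name is arbitrary, and $db is assumed to be a mysqli connection), is a MySQL advisory lock:

// Try to take a named lock; a 0-second timeout means give up immediately.
$row = $db->query("SELECT GET_LOCK('member_sync', 0) AS got")->fetch_assoc();
if (!$row['got']) {
    exit("Previous run still in progress\n");
}

// ... do the long-running work here ...

$db->query("SELECT RELEASE_LOCK('member_sync')");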
I can't give a straight answer on the actual time, but based on previous experience it will take longer than the max execution time.
So solve that problem. There's a reason you can have a different php.ini for the command line interface. Then you can simply focus on processing all users in one script.
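For instance (the schedule, the path to the PHP binary, and the script path are all hypothetical), the CLI normally has no execution time limit at all, and you can make that explicit in the cron entry:

# crontab entry: run nightly at 03:00 with no PHP time limit
0 3 * * * /usr/bin/php -d max_execution_time=0 /path/to/process_users.php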
I solved this problem by splitting the work across several small cron jobs. If you are using PHP, you can point one cron job at domain/cronjob1.php and another at domain/cronjob2.php, each limited to a slice of the rows, say 10, with
$sql="SELECT * FROM table LIMIT 10";
for cronjob1, and the rest in cronjob2.
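A sketch of how the two jobs could split the rows using an offset (the LIMIT values are just examples):

-- cronjob1.php
SELECT * FROM table LIMIT 0, 10;

-- cronjob2.php: skip the first 10 rows, take the next 10
SELECT * FROM table LIMIT 10, 10;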
I want to know how many resources (MySQL time, CPU usage, bandwidth, etc.) my system will use with a certain number of API calls per second.
One API call gets its parameters in PHP, runs from 1 to 5 SQL queries, and returns an XML file.
How can I do that? Any idea of a formula or something?
Try using the xhprof PHP extension to do some profiling. It will tell you things such as how much time is spent in each function call, memory used, CPU used, etc. Do a few runs with it and you should have a good idea about the resources you are using. I've found it to be one of the most useful tools available for PHP. For more info, see the documentation.
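A minimal sketch of wrapping one API call with xhprof, assuming the extension is installed; handle_api_call() and the output path are placeholders:

<?php
// Collect CPU time and memory on top of wall-clock time.
xhprof_enable(XHPROF_FLAGS_CPU + XHPROF_FLAGS_MEMORY);

handle_api_call();                        // placeholder for your API endpoint logic

$profile = xhprof_disable();              // per-function wall time, CPU, memory, call counts
file_put_contents('/tmp/api_profile.json', json_encode($profile));
?>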
The only real way to do this is by benchmarking it. It depends entirely on what your code is doing: I could write an SQL query that takes minutes to run, and I could write one that takes milliseconds.
My website's page runs a lot of queries, so it takes a long time to execute and ends in an error. Could you please tell me how to increase the execution time (maybe the query execution time) through PHP code?
Thanks in advance
Use max_execution_time:
ini_set('max_execution_time', 14000); // adjust value
Also make sure that your code is correct, and see whether there is any room to improve its speed. For example, have a look at:
Query optimization techniques
PHP Optimization Tricks
PHP Micro Optimizations
If this website is meant for normal users, you will have to optimize your queries; there is no way around it. No user will want to wait more than a few seconds for a page to load, and if you're already exceeding the execution time limit, it's already way too slow. Extending the time limit is not a solution.
If, OTOH, this page is supposed to be more of a maintenance script, you should run it from the CLI or as a cron job where execution time limits don't exist.
You can use the set_time_limit() function to set the number of seconds a script is allowed to run.
But my advice would be to rethink/rewrite your queries, because if you hit the PHP execution limit you are probably doing too much work at the database, or doing that work inefficiently.
There is always room for improvement. Raising the PHP execution time is only a temporary fix, and 'temporary fix' is one of the most hated phrases in computer programming. So again, look at your database schema, analyze your queries, check for indexes, etc.
For high-concurrency I/O you may also consider an in-memory cache such as APC or Memcached.
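As a sketch of that caching idea, assuming the Memcached extension with a server on localhost (the key, the 5-minute TTL, and run_expensive_query() are placeholders):

<?php
$cache = new Memcached();
$cache->addServer('127.0.0.1', 11211);

$key  = 'api:result:42';                  // placeholder cache key
$data = $cache->get($key);

if ($data === false) {
    // Cache miss: run the expensive query once, then keep the result for 5 minutes.
    $data = run_expensive_query();        // placeholder for your database call
    $cache->set($key, $data, 300);
}
?>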