Timeout on Large MySQL Query - php

I have this code:
$theQuery = mysql_query("SELECT phrase, date FROM wordList WHERE `group`='nouns'");
while ($getWords = mysql_fetch_array($theQuery)) {
    echo "$getWords[phrase] created on $getWords[date]<br>";
}
The query has 75,000 results, and every time I run the code I get an error.

Several issues could be at play here, all of which come down to settings in your php.ini. Your script could be timing out, since PHP defaults to a 30-second maximum for script execution. The other reason (and just as likely) is that you're hitting the script memory limit, which defaults to 8MB per script execution.
Open php.ini and search for "Resource Limits" and make the appropriate modifications.
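If you just want to test the effect first, a rough sketch like the following checks the current limits and raises them at runtime; the values are only examples, and some hosts ignore or forbid these overrides:
// Check the current limits...
echo ini_get('max_execution_time'); // default is often 30 (seconds)
echo ini_get('memory_limit');       // e.g. 8M on older installs

// ...then raise them for this script only.
set_time_limit(300);            // allow up to 5 minutes
ini_set('memory_limit', '64M'); // raise the per-script memory cap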

As a guess (going by your title), I'd say you're timing out on the MySQL query rather than on echoing out the results.
Have you tried setting an index on the group column of your MySQL table? This will slow your inserts slightly, but should make a huge difference on the select.
See http://dev.mysql.com/doc/refman/5.0/en/mysql-indexes.html
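As a sketch, using the table and column names from the original question (the index name is made up), adding the index is a one-off statement run once against the database:
// One-off: index the column used in the WHERE clause.
// Note that `group` is a reserved word in MySQL, so it needs backticks.
mysql_query("ALTER TABLE wordList ADD INDEX idx_group (`group`)")
    or die(mysql_error());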

php.ini is a good first step but you might find yourself on a server where you can't change those settings to something that will work. In that case it might make sense to break up what you're doing in chunks.
For example, if this is for output, you could run 25,000 results at a time using LIMIT. Stick the output in a file and then just use the file for your output. You can update your file once a night/hour/whatever with a cron job if you need the output to stay fresh.
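A rough sketch of that idea, reusing the query from the original question (the cache file name and chunk size are placeholders); the cron job runs this script, and the page itself only reads the file:
// Rebuild the cached output in chunks of 25,000 rows (run from cron).
$chunk  = 25000;
$offset = 0;
$handle = fopen('wordlist_cache.html', 'w');
do {
    $result = mysql_query("SELECT phrase, date FROM wordList
                           WHERE `group`='nouns'
                           LIMIT $offset, $chunk");
    $rows = 0;
    while ($row = mysql_fetch_array($result)) {
        fwrite($handle, "{$row['phrase']} created on {$row['date']}<br>\n");
        $rows++;
    }
    $offset += $chunk;
} while ($rows == $chunk);
fclose($handle);

// The page then just does: readfile('wordlist_cache.html');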

Related

How to increase the execution time to get multiple records from the MySQL database

While I am using PHP code to get all the records from a MySQL table with nearly 10,000 records, it executes slowly.
Can anyone tell me how to get the records with the minimum execution time?
1 for a particular script, at runtime:
set_time_limit(number_of_seconds);
or
ini_set('max_execution_time', 300); // 300 seconds = 5 minutes
2 in php.ini, for all scripts:
max_execution_time = 300
3 where php.ini is not accessible (like on a shared server), write it into a .htaccess file:
<IfModule mod_php5.c>
php_value max_execution_time 259200
</IfModule>
The first thing you should do is EXPLAIN your query and see why it is so slow.
1) If your select does not use an index, you should definitely create one (if it is not present) or force your query to use a particular index manually (FORCE INDEX(primary) etc.).
2) If your query creates a temporary table (Using temporary in your EXPLAIN output), you need to increase the memory allocated to your MySQL server so that it can create the temporary table in memory instead of writing it to disk (which takes a lot of time). For this optimization, check tmp_table_size and max_heap_table_size in the MySQL config.
3) Only if you have none of the problems described above, and you simply have some complicated logic in your script that processes the data, should you call set_time_limit(0) (meaning unlimited) directly from your PHP code, or edit php.ini and set max_execution_time to some higher value. Alternatively, you can install OPcache or another opcode caching system, which can speed up your code somewhat (usually 2-10 times).
(The problem may not be in your server at all, but in your connection speed and the amount of data that has to be downloaded - maybe the select is fast but the data transfer is slow, so check the size of the response with Firebug and compare it with your bandwidth.)
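As a sketch, using the query from the original question, you can run EXPLAIN from PHP and inspect the key and Extra columns:
// Prefix the query with EXPLAIN and look at the execution plan.
$plan = mysql_query("EXPLAIN SELECT phrase, date FROM wordList WHERE `group`='nouns'");
while ($row = mysql_fetch_assoc($plan)) {
    print_r($row); // 'key' shows the index used (if any);
                   // 'Extra' reports 'Using temporary' or 'Using filesort'.
}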
We need to see the DB structure and your query to give a better answer.
However, the options below will simply give your script more time to run the query.
You can do this in two ways.
Option 1: at the start of your PHP file, add set_time_limit(number of seconds);
Option 2: edit your php.ini file and set max_execution_time to a higher value, e.g. max_execution_time = 3000.
Option 1 will increase the execution time only for that script.
Option 2 will increase the execution time for all scripts.
Try to add an index.
But if you want all the records, with this many rows it's normal that it takes some time.
Try to increase allowed memory.
<?php ini_set('memory_limit','512M'); ?>
For the execution time, take a look at:
Increase max execution time for php

How can I do lengthy tasks in php while the max execution time is 30 seconds?

I'm parsing data from a text file to a mysql database. The problem is, after parsing a certain number of records (anywhere from 250,000 to 380,000) I get Fatal error: Maximum execution time of 30 seconds exceeded. I can work around this by splitting the file into smaller files, but this is a pain and I'd like to find a way to trick PHP into processing the whole file.
Is there a way to convince PHP to run lengthy processes, even though I don't have access to php.ini and can't change my maximum execution time?
Btw, here's my parsing script. Maybe I'm doing something wrong with my php code.
You may find you can improve performance by inserting rows several at a time. Try this syntax:
INSERT INTO tbl_name (a, b, c)
VALUES (1, 2, 3), (4, 5, 6), (7, 8, 9);
The number of rows you should group together will be best found by experimentation. It might be that the more you add in the one statement, the faster it will be, but equally that might be slow if you don't have auto-commit turned on.
As someone mentions in the comments, putting too many in at once may max out the CPU, and raise the eyebrows of the server admin. If this is something you need to be careful with on your host, try 200 at a time and then a small usleep between iterations.
It's worth looking at the connection to your database too - is this on the same server, or is it on another server? The connection may be slow. Add some timing for, say, how long 5,000 rows take to insert, and then play around to see how to reduce it.
Here's the manual reference for INSERT; note that not every database engine supports this multi-row syntax, so it won't necessarily port to other databases.
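A minimal sketch of that batching approach; $parsedRows (the rows parsed from the text file), the table, the columns, and the batch size of 200 are all assumptions taken from the discussion above, not tested figures:
// Insert the parsed rows 200 at a time, pausing briefly between batches.
foreach (array_chunk($parsedRows, 200) as $batch) {
    $values = array();
    foreach ($batch as $row) {
        $values[] = sprintf("('%s','%s','%s')",
            mysql_real_escape_string($row['a']),
            mysql_real_escape_string($row['b']),
            mysql_real_escape_string($row['c']));
    }
    mysql_query("INSERT INTO tbl_name (a,b,c) VALUES " . implode(',', $values))
        or die(mysql_error());
    usleep(50000); // 50 ms breather to keep CPU load down
}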

Fatal error: Maximum execution time of 30 seconds exceeded in joomla solution without changing ini file

I've created a Joomla extension in which I'm storing records from table A into table B. My script works fine if table A contains only a small amount of data.
If table A contains a large amount of data, the maximum execution time is exceeded while inserting and I get this error: 'Fatal error: Maximum execution time of 30 seconds exceeded in
/mysite/libraries/joomla/database/database/mysqli.php on line 382'.
I could get around this problem by changing the ini file, but it's a Joomla extension that people are going to use on their own sites, so I can't tell them to change their ini file; in fact, I don't want to have to.
Take a look at this:
http://davidwalsh.name/increase-php-script-execution-time-limit-ini_set
ini_set('max_execution_time', 300);
Use it this way, or:
set_time_limit(0);
Use the code below at the start of the page where you wrote the query code:
set_time_limit(0);
Technically, you can increase the maximum execution time using set_time_limit. Personally, I wouldn't mess with limits other people set on their servers, assuming they put them in for a reason (performance, security - especially in a shared hosting context, where software like Joomla! is often found). Also, set_time_limit won't work if PHP is run in safe mode.
So what you're left with is splitting the task into multiple steps. For example, if your table has 100000 records and you measure that you can process about 5000 records in a reasonable amount of time, then do the operation in 20 individual steps.
Execution time for each step should be a good deal less than 30 seconds on an average system. Note that the number of steps is dynamic, you programmatically divide the number of records by a constant (figure out a useful value during testing) to get the number of steps during runtime.
You need to split your script into two parts, one that finds out the number of steps required, displays them to the user and sequentially runs one step after another, by sending AJAX requests to the second script (like: "process records 5001 to 10000"), and marking steps as done (for the user to see) when the appropriate server respone arrives (i.e. request complete).
The second part is entirely server-sided and accepts AJAX requests. This script does the actual work on the server. It must receive some kind of parameters (the "process records 5001 to 10000" request) to understand which step it's supposed to process. When it's done with its step, it returns a "success" (or possibly "failure") code to the client script, so that it can notify the user.
There are variations on this theme, for instance you can build a script which redirects the user to itself, but with different parameters, so it's aware where it left off and can pick up from there with the next step. In general, you'd want the solution that gives the user the most information and control possible.
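A bare-bones sketch of the second, server-side script; the parameter name, step size, JSON response shape, and the moveRecords() helper are all hypothetical:
// step.php - handles one AJAX request, e.g. step.php?step=3
$stepSize = 5000;                 // rows per step, found by testing
$step     = (int) $_GET['step'];
$offset   = $step * $stepSize;

// moveRecords() is a placeholder for "copy rows $offset .. $offset+$stepSize
// from table A to table B" using Joomla's database API.
$ok = moveRecords($offset, $stepSize);

echo json_encode(array('step' => $step, 'status' => $ok ? 'success' : 'failure'));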

Run a SQL query (count) every 30 sec, and then save the output to some file

I am developing a website with a database into which people can insert data (votes). I want to keep a counter in the header like "x" votes have been cast. But the website may get a lot of traffic soon. Right now I can do it with the query
SELECT COUNT(*) FROM `tblvotes`
and then display the number in the header, but every time a user changes page the query is rerun. So I am thinking it may be better to run the query only once every 30 seconds (much less load on the MySQL server), but then I need to save its output somewhere (this shouldn't be so hard; I can write it to a text file?). But how can I make my website run the query automatically every 30 seconds and put the number in the file? I have no SSH access to the server, so I can't crontab it.
If there is something you might not understand feel free to ask!
Simplest approach: write the result into a local text file. On every request, check whether that file is older than 30 seconds; if it is, update it. To update, you should lock the file. While the file is being updated, other users should simply read the currently existing file, to avoid race conditions.
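A minimal sketch of that approach; the file name and query are placeholders:
$cacheFile = 'votecount.txt';

// Regenerate the cache if it is missing or older than 30 seconds.
if (!file_exists($cacheFile) || filemtime($cacheFile) < time() - 30) {
    $result = mysql_query("SELECT COUNT(*) FROM `tblvotes`");
    $count  = mysql_result($result, 0);

    // LOCK_EX stops two requests writing at once; everyone else just
    // keeps reading the existing file in the meantime.
    file_put_contents($cacheFile, $count, LOCK_EX);
}

echo 'Votes cast so far: ' . file_get_contents($cacheFile);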
Hope that helps,
Stefan
Cron jobs can only run every minute, at the fastest.
I think there is a better solution to this. You should make an aggregate table in which the statistical information is stored.
With a trigger on the votes_table, you can do 'something' every time the table receives an INSERT statement.
The aggregate table will then store the most accurate information, which you then can query to display the count.
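A hedged sketch of that idea; the vote_stats table, its column, and the trigger name are made up, and you need the TRIGGER privilege to create the trigger:
// One-off setup: seed an aggregate table with the current count,
// then let a trigger keep it in sync on every insert.
mysql_query("CREATE TABLE IF NOT EXISTS vote_stats (total_votes INT NOT NULL)");
mysql_query("INSERT INTO vote_stats (total_votes) SELECT COUNT(*) FROM tblvotes");
mysql_query("CREATE TRIGGER trg_vote_count AFTER INSERT ON tblvotes
             FOR EACH ROW UPDATE vote_stats SET total_votes = total_votes + 1");

// In the header this is now a cheap single-row read:
echo mysql_result(mysql_query("SELECT total_votes FROM vote_stats"), 0);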
A better solution would be to use a caching mechanism (e.g. APC) instead of files, if your server allows it.
If you can, you may want to look into using memcached. It allows you to set an expiry time for any data you add to it.
When you first run the query, store the result in memcached keyed by the md5 of the query text. On subsequent requests, look for the data in memcached; if it has expired, rerun the SQL query and write the result back to memcached.
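A small sketch using the pecl Memcached extension; the server address and the 30-second expiry are assumptions:
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

$sql = 'SELECT COUNT(*) FROM `tblvotes`';
$key = md5($sql);

$count = $mc->get($key);
if ($count === false) {                          // not cached yet, or expired
    $count = mysql_result(mysql_query($sql), 0);
    $mc->set($key, $count, 30);                  // keep it for 30 seconds
}
echo $count;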
Okay, so the first part of your question is basically about caching the total-votes result to be included in the header of your page. It's a very good idea - here is one way to implement it...
If you cannot set up a crontab (even without SSH access you might be able to do this through your hosting control panel), you might be able to get away with using an external third-party cron job service (Google has many results for this)...
Every time your cron job runs, you can create/update a file that simply contains some PHP arrays -
$fileOutput = "<"."?php\n\n";
$fileOutput .= '$'.$arrayName.'=';
$fileOutput .= var_export($yourData,true);
$fileOutput .= ";\n\n?".">";
$handle = fopen(_path_to_cache_file,'w+');
fwrite($handle,$fileOutput);
fclose($handle);
That will give you a PHP file that you can simply include() into your header markup and then you'll have access to the $yourData variable under the name $arrayName.
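Then the header is just something like this (the file name and the exported variable name are whatever you used above):
// Header markup: include the generated file and use the exported data.
include '/path/to/vote_cache.php'; // file written by the cron job
echo 'Total votes: ' . $voteCount; // i.e. whatever $arrayName was set to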

PHP Cron script efficiency using CURL to load files

I'm pulling in search query results using cURL, then iterating through a database to load additional queries, and storing the results back in the database. I'm running into trouble with PHP's maximum execution time and have tried setting it higher, which I don't think is working on my host, using this:
ini_set('max_execution_time', 600);
in the file that is run by cron, so it only changes the max time for the importing process.
The question is: would it be more efficient to store the result of each cURL connection in the database and have a secondary function that pulls the database results and sorts them into the relevant tables (run, say, every 10 minutes), or is it more efficient to pull the file and insert the sorted records in one go?
You can always find out whether your host allows you to change the setting by calling ini_get('max_execution_time') right after your call to ini_set().
Instead of storing the results in the database, I would put them into a directory. Name the files by using the function microtime(true) (which makes it easy to pull the most or least recently written file). Then have a separate script that checks to see if there are files in the directory, and if so, processes one of them. Have the scripts run on a 1 minute interval.
Note that there is a possible race condition on processing a file if processing takes more than one minute; however, even if it does take longer than a minute, the race is unlikely to ever occur in practice.
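A rough sketch of that drop-folder pattern; the directory name, the fetch_search_results() call, and the process_and_store() helper are placeholders:
// fetcher.php - run by cron: save each cURL result as its own file.
$raw = fetch_search_results();                       // placeholder for the cURL call
file_put_contents('queue/' . microtime(true) . '.dat', $raw);

// processor.php - run by cron every minute: handle one queued file.
$files = glob('queue/*.dat');
if ($files) {
    sort($files);                // oldest (smallest microtime) first
    $file = array_shift($files);
    process_and_store($file);    // placeholder: parse it and insert into the DB
    unlink($file);
}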
