I am not familiar with servers, so please forgive me if I say something stupid.
The problem is that I work with a very old database, and its queries take very long to execute (about 1.5 minutes). When I use one of these queries in my website, the server can't handle it and gives me a 503 error.
I tried checking how long the server is allowed to run with this:
echo $maxlifetime = ini_get("session.gc_maxlifetime");
But after a bit of reading, I heard this is not the way to do it.
So my question is: how do I see how much time the server has to load the page before it gives me the 503 error, and how do I lengthen it?
Thank you for helping.
EDIT
OK, this is the query that gives me the error:
SELECT FD_DATUM_INGEVOERD || ' ' || FT_TIJD_INGEVOERD
FROM BANDZENDINGEN
WHERE FB_AFGESLOTEN = 'F'
  AND FB_AKTIEF = 'T'
  AND FI_AFVOERKANAAL = 11
  AND FI_GEBRUIKER1 = '175'
  AND FI_VERRIJKINGID < 1
It only takes 17 seconds to execute directly against the Firebird database.
EDIT 2
I just found out that my queries only take a long time when the database is busy with something else.
OK, I just found the answer: I will lengthen the server's execution time with this:
ini_set('max_execution_time', 0);
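// Note: 0 removes the limit entirely; a finite value such as 300 would still guard against a runaway query.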
This is the first time we are using GSS (Google Site Search) in our application.
If we run a single query through GSS, we get good results depending on the websites we have added to be searched. But if we give GSS around a hundred queries one by one in a for loop, like this:
for ($i = 0, $count = count($arr1); $i < $count; $i++)
{
    print $arr1[$i] . "\r\n\r\n";
    sleep(5);
    $in = $arr1[$i];
    $in = str_replace(' ', '+', $in); // a space becomes a +
    // Google Site Search starts here
    $result = httpGet("https://www.google.com/cse?cx=003255331468891731323:xyxyxyxyxyyx&client=google-csbe&output=xml_no_dtd&q='$in'");
    echo $result;
}
Here we have a long string of a few pages, which we have broken into small arrays of, say, 30 words each. We pass these arrays through the for loop to get the results (various links), which we print with echo. We also added a sleep of 5 seconds so that the server has time to fetch and print each result before the next query is sent.
But when we run this for loop, we do not get results; instead, our application hangs and gives us the error below:
Internal Server Error
The server encountered an internal error or misconfiguration and was unable to complete your request.
Please contact the server administrator at webmaster@checkforplag.com to inform them of the time this error occurred, and the actions you performed just before this error.
More information about this error may be available in the server error log.
Additionally, a 500 Internal Server Error error was encountered while trying to use an ErrorDocument to handle the request.
Kindly suggest what we need to do to get GSS working fully in our application.
Thank You!
I got a solution for that by sending my PHP script into a background process, and for that I used shell_exec.
This is the code I used:
$status = shell_exec("nohup /usr/bin/php test.php > /dev/null 2>&1 &");
And now I am not getting this kind of error, even when I am running a large file.
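For anyone wondering about the pieces: nohup keeps test.php running after the web request returns, > /dev/null 2>&1 discards its output, and the trailing & puts the process in the background so the page can respond immediately.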
The weirdest thing happened today. When I run this query:
DELETE FROM some_table WHERE id IN(5)
I get a 30 second timeout error in PHP. The same query runs without issues on my local development server, but when I move it to the production server, I get the timeout.
No SQLite error or anything like that, just "Fatal error: Maximum execution time of 30 seconds exceeded" :|
What could be the problem? Is there any way I could debug this at least?
At the top of all my new code I put this function:
ini_set('max_execution_time', 60);
Reference: the PHP manual entry for max_execution_time.
To debug my scripts' execution time I use this:
$start = microtime(true);
function execute() {
    global $start;
    $end = microtime(true);
    return number_format($end - $start, 5);
}

// ..... your code here

echo '<br><b>Page Loaded In ' . execute() . ' Seconds</b>';
I am facing this problem; first, I would like to say that I am very new to PHP and MySQL.
Fatal error: Maximum execution time of 30 seconds exceeded in
.........................\cdn\wished.php on line 3
I don't know what is wrong on line 3; it gives the error only sometimes. Here's my code:
<?php
// wished.php
$CheckQuery = mysql_query("SELECT * FROM users WHERE ownerid='$user->id'");
$wished = 0;
while ($row = mysql_fetch_assoc($CheckQuery)) {
    // echo $row['fname']."<br/>";
    $wished++;
}
echo $wished;
?>
It was perfect when I ran this on localhost with XAMPP. As soon as I hosted my app on my domain and used their database, it started giving the error.
Thanks everyone :)
The issue is that the SQL query is taking too long, and your production PHP configuration has the PHP script time limit set too low.
At the beginning of the script you can add more time to the PHP time limit:
http://php.net/manual/en/function.set-time-limit.php
set_time_limit(60);
This, for example, allows 60 seconds in total, 30 more than the default (or use 0 to let the PHP script keep running with no time limit).
If your production database is different from your development DB (and I'm assuming production has way more data), then it might be a really expensive call to get everything from the users table.
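Also, since the loop in wished.php only counts rows, you could let MySQL do the counting instead of fetching the whole table; a minimal sketch using the same legacy mysql_* API as the question (mysqli or PDO would be the better choice today):

// Ask MySQL for the count directly instead of pulling back every row.
$id = mysql_real_escape_string($user->id);
$result = mysql_query("SELECT COUNT(*) AS wished FROM users WHERE ownerid='$id'");
$row = mysql_fetch_assoc($result);
echo $row['wished']; // same number, without transferring the whole result set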
I wrote a script that downloads a list of pages from a website. From time to time I receive the following error (the number of seconds varies):
The bwshare module will refuse your requests for the next 7 seconds.
You have downloaded data too rapidly.
I found that when using sleep(2) in the loop it works much better; however, the time delay is too expensive.
What's the best way to deal with this module? Should I scrape with no delay and, if the response looks like the message above, simply sleep for the requested number of seconds?
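Concretely, I imagine the retry strategy would look something like this (a rough sketch; I am assuming the throttle message always follows the wording quoted above):

// Download with no delay; when the throttle message appears,
// sleep for the number of seconds it asks for, then retry.
function download_with_backoff($url) {
    while (true) {
        $body = file_get_contents($url);
        if (preg_match('/refuse your requests for the next (\d+) seconds/', $body, $m)) {
            sleep((int)$m[1] + 1); // wait as long as requested, plus a 1-second margin
            continue;
        }
        return $body;
    }
}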
It all depends on how many pages you can get before the error message.
Try to measure how many pages, on average, you can get.
Four pages before the bwshare message is the break-even point: four downloads with sleep(2) cost about 8 seconds, roughly the same as one ~7-second lockout.
If you are getting the error message before reaching 4 page downloads, it would be faster to sleep(2) after each download.
Try this way... it might help you.
$requestTime = 0.1; // seconds per request

foreach ($urls as $url) { // your list of pages
    $start = microtime(true);
    // Do your work here: file_get_contents($url) and other processing...
    $timeTaken = microtime(true) - $start;
    if ($timeTaken < $requestTime) {
        // Pad the iteration so requests are spaced at least $requestTime apart.
        usleep(($requestTime - $timeTaken) * 1000000);
    }
}
If your problem is solved, please post your answer so that other people may also benefit.
I want to run a cron job that does cleanup and takes a lot of CPU and MySQL resources. I want it to run only when the server is not relatively busy.
What is the simplest way to determine that from PHP? (For example, is there a query that returns how many queries were executed in the last minute?)
if (function_exists('sys_getloadavg')) {
    $load = sys_getloadavg();
    if ($load[0] > 80) {
        header('HTTP/1.1 503 Too busy, try again later');
        die('Server too busy. Please try again later.');
    }
}
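Note that $load[0] is the 1-minute load average, so a fixed threshold of 80 is extreme for most machines; comparing against the number of CPU cores is usually more meaningful. A sketch, assuming a Linux box where nproc is available:

$cores = (int) trim(shell_exec('nproc')); // number of CPU cores
$load = sys_getloadavg();
if ($load[0] > $cores) {
    // On average, every core is fully occupied.
    exit('Server too busy. Please try again later.');
}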
Try to adapt this snippet to your needs.
If this is a Unix system, parse the output of uptime. This shows the CPU load, which is generally considered a good measure of "busy." Anything near or over 1.0 per CPU core means "completely busy."
There are three load figures in that output, giving the load average over the last 1, 5, and 15 minutes. Pick whichever makes sense for your script.
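For instance, a rough sketch of that approach in PHP (the exact wording of uptime's output differs between systems, so treat the regex as an assumption):

// Parse the 1-minute load average out of `uptime` and bail out if the box is busy.
$uptime = shell_exec('uptime');
if (preg_match('/load averages?:\s+([\d.]+)/', $uptime, $m)) {
    $load1 = (float) $m[1];
    if ($load1 > 1.0) { // "completely busy" on a single-core machine
        exit("Load is $load1; skipping cleanup.\n");
    }
}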
On Linux you can get the load from the /proc/loadavg file:
$load = explode(' ', file_get_contents('/proc/loadavg')); // split() was removed in PHP 7; use explode()
$loadAvg = (float) $load[0];
You can probably use the info from MySQL's SHOW PROCESSLIST (mysql_list_processes() in the old mysql extension) to see how many connections are active vs. sleeping, a decent indicator of the load on the DB.
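For instance, a quick sketch that counts the non-sleeping threads with mysqli (the connection details are placeholders):

// Count MySQL threads that are actually doing work (not in the "Sleep" state).
$db = new mysqli('localhost', 'user', 'pass');
$active = 0;
foreach ($db->query('SHOW PROCESSLIST') as $row) {
    if ($row['Command'] !== 'Sleep') {
        $active++;
    }
}
echo "Active MySQL threads: $active\n";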
If you're on Linux, you can use the sys_getloadavg() function to see the overall system load.