Sitemap Generator timed out, maximum execution time of 60 seconds exceeded - php

I need to generate a sitemap.
I am using spatie/laravel-sitemap. I have installed and published it, but when I run the generator, Symfony throws a fatal error: Maximum execution time of 60 seconds exceeded.
I have a huge list of links, and even after leaving just one link in for testing, I still get the same error.
How do I fix that? Here is my web.php code:
<?php
use Spatie\Sitemap\SitemapGenerator;
// this link throws a fatal error: Maximum execution time of 60 seconds exceeded
Route::get('sitemap', function () {
    SitemapGenerator::create('127.0.0.1:8000')->writeToFile('sitemap.xml');
    return 'sitemap created';
});
// this link is tested and fully working
Route::get('', function () {
    return view('home');
});

This is a common problem when working with long-running scripts.
Have you tried the PHP function set_time_limit()?
Try putting this at the beginning of your script:
set_time_limit(300);
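If raising the limit this way works for you, here is a minimal sketch of how it could look in the route from the question; the 300-second value, the http:// prefix on the URL and the public_path() destination are assumptions to adjust for your setup:
<?php
use Spatie\Sitemap\SitemapGenerator;

Route::get('sitemap', function () {
    // raise the limit for this request only; 300 seconds is an arbitrary choice
    set_time_limit(300);

    SitemapGenerator::create('http://127.0.0.1:8000')
        ->writeToFile(public_path('sitemap.xml'));

    return 'sitemap created';
});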

Related

Timeout not working in PHP Httpful request

I am making an HTTP request and I am using "Httpful Request" to send it in PHP.
I am also setting a timeout of 20 seconds in the request, as follows:
$req = Request::get($Url);
$response = $req->timeoutIn(20)->expectsHtml()->send();
I was expecting to get an exception after the timeout happens so that I could handle it, but instead I am getting the following PHP fatal error. Why is that?
PHP Fatal error: Maximum execution time of 30 seconds exceeded in
phar://C:/CapPortal/cpPortal/source/wordpress/httpful.phar/Httpful/Request.php
on line 202
You can use set_time_limit($seconds) to set that limit higher if you need more execution time. You can also set it to 0, which means no limit. Warning: Apache (if you're running PHP under it) may also limit PHP's execution time.
The Httpful module itself also has a method to set the timeout, named timeoutIn().
So you can use that method in your code and set the timeout, for example, to 50 seconds:
$response = $req->timeoutIn(50)->expectsHtml()->send();
It works fine for me.
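For completeness, here is a minimal sketch combining both suggestions: keep PHP's own limit above the cURL timeout and catch the resulting exception. The 60-second value and the plain \Exception catch are assumptions; Httpful's concrete exception class may differ in your version.
<?php
use Httpful\Request;

// keep PHP's execution limit comfortably above the HTTP timeout
set_time_limit(60);

$req = Request::get($Url);

try {
    $response = $req->timeoutIn(20)->expectsHtml()->send();
} catch (\Exception $e) {
    // reached when the 20-second cURL timeout fires before PHP's own limit
    echo 'Request timed out: ' . $e->getMessage();
}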

Timeout when trying to delete a record

The weirdest thing happened today. When I run this query:
DELETE FROM some_table WHERE id IN(5)
I get a 30 second timeout error in PHP. The same query runs without issues on my local development server, but when I move it to the production server, I get the timeout.
No sqlite error or anything like that, just "Fatal error: Maximum execution time of 30 seconds exceeded " :|
What could be the problem? Is there any way I could debug this at least?
At the top of all my new code I put this function:
ini_set('max_execution_time', 60);
(reference: the max_execution_time directive in the PHP manual)
To debug my script's execution time I use this:
$start = microtime(true);
function execute() {
    global $start;
    return number_format((microtime(true) - $start), 5);
}
// ..... your code here
echo '<br><b>Page loaded in ' . execute() . ' seconds</b>';
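Applied to the DELETE from the question, a minimal sketch of that timing idea, assuming a PDO connection in $pdo (the variable name is hypothetical):
<?php
ini_set('max_execution_time', 60);

$start = microtime(true);
$pdo->exec('DELETE FROM some_table WHERE id IN (5)');
$elapsed = number_format(microtime(true) - $start, 5);

echo 'DELETE finished in ' . $elapsed . ' seconds';
// if this eats nearly the whole limit, the database side (e.g. a lock held by
// another connection) is the bottleneck, not the PHP code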

php- set a script timeout and call a function to cleanup when time limit reached

I am writing a PHP script that will do some processing, but I only want it to run if the script isn't already running. I also only want the script to run for a maximum of 5 minutes.
I know I can set the timeout using set_time_limit, but doing so will not remove the lock files that I create during the execution.
Is there a way to call a function when the time limit is reached, so I can perform a cleanup?
Tried this out in a test environment and got it working with the following (error reporting off + shutdown function):
<?php
// runs when the script shuts down, including after the time limit is exceeded
$test = function () {
    print("\n\nOOPS!\n\n");
};

error_reporting(0);   // hide the fatal "maximum execution time exceeded" message
set_time_limit(1);
register_shutdown_function($test);

// busy loop to force the timeout
while (true) {
}
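Applied to the lock-file scenario from the question, a minimal sketch; the /tmp/myscript.lock path and the 5-minute limit are assumptions:
<?php
$lockFile = '/tmp/myscript.lock';

// only run if no other instance holds the lock
if (file_exists($lockFile)) {
    exit('Already running');
}
touch($lockFile);

// runs on normal completion and after "maximum execution time exceeded"
register_shutdown_function(function () use ($lockFile) {
    if (file_exists($lockFile)) {
        unlink($lockFile);
    }
});

set_time_limit(300); // 5-minute cap

// ... long-running processing here ...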

Getting "Maximum execution time" error intermittently

I am facing this problem; first, I would like to say that I am very new to PHP and MySQL.
Fatal error: Maximum execution time of 30 seconds exceeded in
.........................\cdn\wished.php on line 3
I don't know what is wrong on line 3; it's giving the error only sometimes. Here's my code:
<?php
//wished.php
$CheckQuery = mysql_query("SELECT * FROM users WHERE ownerid='$user->id'");
$wished = 0;
while ($row = mysql_fetch_assoc($CheckQuery))
{
    // echo $row['fname']."<br/>";
    $wished++;
}
echo $wished;
?>
It worked perfectly when I ran this on localhost with XAMPP. As soon as I hosted my app on my domain and used their database, it started giving the error.
Thanks, everyone :)
The issue is that the SQL query is taking too long, and your production PHP configuration has the PHP script time limit set too low.
At the beginning of the script you can raise the PHP time limit:
http://php.net/manual/en/function.set-time-limit.php
set_time_limit(60);
for example, to allow 30 seconds more than the default (or use 0 to let the PHP script run with no time limit).
If your production database is different than your development DB (and I'm assuming production has way more data) then it might be a really expensive call to get everything from the user table.
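If the table really is much bigger in production, one way to make the call cheaper is to let the database count the rows instead of fetching everything; a minimal sketch, keeping the deprecated mysql_* API only because the question uses it:
<?php
// wished.php
set_time_limit(60);

$ownerId = mysql_real_escape_string($user->id);
$result  = mysql_query("SELECT COUNT(*) AS wished FROM users WHERE ownerid = '$ownerId'");
$row     = mysql_fetch_assoc($result);

echo $row['wished'];
?>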

Zend Lucene: Fatal Error, Maximum Execution Time

I've written a basic indexing script for my site and it seems to be working... somewhat. It gets through about 3/4 of the pages it needs to index and then gives this error:
Fatal error: Maximum execution time of 0 seconds exceeded in
/Zend/Search/Lucene/Analysis/Analyzer.php on line 166
It seems to hang up in a different spot each time, too. I ran it a minute later and got this:
Fatal error: Maximum execution time of 0 seconds exceeded in
/Zend/Search/Lucene/Storage/Directory/Filesystem.php on line 349
Here's the script:
foreach ($all_items as $item) {
    $doc = new Zend_Search_Lucene_Document();
    $doc->addField(Zend_Search_Lucene_Field::Text('title', $item['pagetitle']));
    $doc->addField(Zend_Search_Lucene_Field::Text('url', $item['url']));
    $doc->addField(Zend_Search_Lucene_Field::Text('country', $item['country']));
    // Add document to the index
    $index->addDocument($doc);
}
Maybe your task is simply time-consuming?
Then increase the time limit with set_time_limit:
set_time_limit(0);   // no time limit
set_time_limit(500); // 500-second limit
Try increasing max_execution_time:
ini_set('max_execution_time', 5000);
There is also max_input_time:
ini_set('max_input_time', 5000);
If it still does not work, you will need to track down which part of the code is executing forever.
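One way to track that down is to raise the limit and log how long each item takes inside the indexing loop from the question; a minimal sketch (the error_log() destination depends on your PHP configuration):
<?php
set_time_limit(0); // or ini_set('max_execution_time', 5000);

foreach ($all_items as $item) {
    $t = microtime(true);

    $doc = new Zend_Search_Lucene_Document();
    $doc->addField(Zend_Search_Lucene_Field::Text('title', $item['pagetitle']));
    $doc->addField(Zend_Search_Lucene_Field::Text('url', $item['url']));
    $doc->addField(Zend_Search_Lucene_Field::Text('country', $item['country']));
    $index->addDocument($doc);

    // a slow item will show up immediately in this log
    error_log($item['url'] . ' indexed in ' . number_format(microtime(true) - $t, 3) . 's');
}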
