I have written a script that searches for values in XML files. I retrieve these files online via the following code:
# Read the entire file into a string
$result = file_get_contents($xml_link);
# Parse the XML string into an object
$xml = simplexml_load_string($result);
But the XML files are sometimes so big that I get the following error: Fatal error: Maximum execution time of 30 seconds exceeded.
I have set max_execution_time to 360 seconds in the php.ini file, but I still get the same error.
I have two options in mind.
If this error occurs, run the line again. But I couldn't find anything about this online (I am probably searching with the wrong search terms). Is there a way to re-run the line where the error occurred?
Save the XML files temporarily on the local disk, search them there, and remove them at the end of the process. Here I have no idea how to remove them after retrieving all the data. And would this actually solve the problem? My script still needs to search through the XML file, so won't it take the same amount of time?
When I added these two lines to my script, the problem was solved:
ini_set('max_execution_time', 300);
set_time_limit(0);
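For anyone wondering about option 2, here is a minimal sketch of the save-then-delete flow (it assumes only that $xml_link holds the feed URL). Note that the parse itself takes roughly the same time either way, so raising the time limit as above is still what fixes the fatal error:
set_time_limit(0); // as above: no time limit for this run
$tmp = tempnam(sys_get_temp_dir(), 'xml_'); // temporary local copy
file_put_contents($tmp, file_get_contents($xml_link));
$xml = simplexml_load_file($tmp); // parse straight from the file
// ... search the document here ...
unlink($tmp); // remove the temporary file once all data is retrieved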
Related
I am using a Raspberry Pi to read real-time RS232 data via a USB port using a Prolific 2303 adaptor.
I want to parse this and display it in a web page.
My first approach was to use a Python script. This could read and parse the data, but passing it to a PHP web page proved a problem. Either I could use GET to send it, or save it to a file and read it with PHP. The latter is a non-starter with an SD card (it won't last long under constant writes), and the former might involve file storage anyway.
I then tried a PHP approach:
$device = "/dev/ttyUSB0";
$fp = fopen($device, "r") or die("cannot open $device");
while (true) {
    $xml = fread($fp, 400); // length should be an int, not the string "400"
    print_r($xml);
}
fclose($fp); // never reached: the loop above has no exit condition
print_r() produces:
CC128-v0.110011909:27:3016.70000951000660008500024 CC128-v0.110011909:27:3616.70000951000670008600024 CC128-v0.110011909:27:4216.70000951000680008700027 CC128-v0.110011909:27:4816.70000951000680008600024 CC128-v0.110011909:27:5516.70000951000680008800024
This is five bursts of XML stripped of all its tags. The complete XML message is 340 characters long, sent at 57600-N-8-1 every 6 seconds.
Then it crashes with "Fatal error: Maximum execution time of 30 seconds exceeded in ~/meter.php on line 79". Sometimes there is missing or corrupted data, too.
If I use $xml = fgets($fp); I get no data.
The stream I am expecting, as read using the Python script, is:
['<msg><src>CC128-v0.11</src><dsb>00114</dsb><time>17:45:02</time><tmpr>16.8</tmpr><sensor>0</sensor><id>00095</id><type>1</type><ch1><watts>00065</watts></ch1><ch2><watts>00093</watts></ch2><ch3><watts>00024</watts></ch3></msg>\r\n']
I tried to use the PECL DIO extension but could not locate all the dependencies. Apparently it is deprecated for the current version of Debian and will be removed from the next; I don't know if that refers to Buster, Bullseye or Bookworm. I also tried using php_serial.class, which I found on GitHub, but could not get a complete XML file output or even the stripped-down data-only stream.
What am I missing to get a PHP variable updated every 6 seconds?
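One direction that might work, as a rough and untested sketch: configure the line with stty before opening it (the adaptor is assumed to stay at /dev/ttyUSB0), drop the PHP time limit, and buffer fread() output until a complete <msg>...</msg> frame has arrived before parsing it:
set_time_limit(0); // a serial reader runs indefinitely; the 30-second default is fatal here
exec('stty -F /dev/ttyUSB0 57600 cs8 -cstopb -parenb raw'); // 57600-N-8-1, raw mode
$fp = fopen('/dev/ttyUSB0', 'r') or die('cannot open port');
$buffer = '';
while (true) {
    $buffer .= fread($fp, 400); // note: the length argument is an int
    while (($end = strpos($buffer, '</msg>')) !== false) {
        $frame = substr($buffer, 0, $end + 6); // one complete message
        $buffer = substr($buffer, $end + 6);
        $msg = simplexml_load_string(trim($frame));
        if ($msg !== false) {
            echo (string) $msg->ch1->watts, "\n"; // a fresh reading every ~6 seconds
        }
    }
}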
I have a PHP script whose execution time used to be a few milliseconds.
But yesterday I got complaints that it loads forever. I looked into it and narrowed the problem down to a line that uses require_once() -- the execution time is about two minutes for that line alone!
The included file contains a bunch of function definitions and doesn't do anything by itself at that point. It is also only about 35 KB in size.
I went through the script logging microtime() and this is the output (the second value is the Unix timestamp; note it jumps 126 seconds across the include):
0.10887400 1442934181 // start of script
0.13321200 1442934181 // line before require_once()
0.16033800 1442934307 // logged again on the first line of the included functions file
0.16048000 1442934307 // back in the original script, line after require_once()
0.16054300 1442934307 // end of script
Just out of curiosity I tried replacing require_once() with require() -- no change.
I don't know what the cause could be or where to start debugging. It worked fine before, and I haven't changed anything.
require_once and require need to access the file, read it, and execute it. So there are several possible problems you'll have to check:
Maybe your hard disk is going bad?
Maybe the file is locked by some other process?
Maybe the file now contains a big function doing too many things?
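One rough way to tell those cases apart, as a sketch ('functions.php' stands in for the real include path), is to time the raw disk read separately from the include itself:
$t0 = microtime(true);
$source = file_get_contents('functions.php'); // pure disk read
$t1 = microtime(true);
require_once 'functions.php'; // read + compile + execute
$t2 = microtime(true);
printf("read: %.3fs, include: %.3fs\n", $t1 - $t0, $t2 - $t1);
If the raw read alone is slow, suspect the disk or a file lock; if only the include is slow, look at what the file does when it is loaded.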
I'm using phpseclib 0.3.1 for working with a remote SFTP server. I have a script which downloads cover images from the SFTP server, saves them on my server, and updates the database.
I ran this script for 7000 images, and after nearly 10-12 minutes it looked like the script had stopped (but eventually I found out that it had entered an endless loop).
After some investigation, I found the following details:
The function get($remote_file, $local_file = false) in SFTP.php is called to download an image file.
In this function, _get_sftp_packet() is called in a while(true) loop.
In _get_sftp_packet() there is a call to _get_channel_packet(NET_SFTP_CHANNEL).
And in _get_channel_packet() there is a call to $response = $this->_get_binary_packet();
My problem is that this $response is an empty string. In _get_sftp_packet() the length of the response is used as a decrement, so if the function keeps returning an empty string (length 0), the loop in _get_sftp_packet() never terminates.
Has anyone faced this problem? What does an empty response from _get_binary_packet() mean?
I would appreciate any help.
It's probably an issue with the window size handling, an issue that has been fixed for a while now.
You're running 0.3.1? The latest version is 0.3.10, so you're several releases behind.
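For reference, a minimal sketch of a guarded download with a newer 0.3.x release (host, credentials, and paths are placeholders, and it assumes setTimeout() is available in that release):
require_once 'Net/SFTP.php'; // phpseclib 0.3.10 on the include path

$sftp = new Net_SFTP('sftp.example.com');
$sftp->setTimeout(30); // give up on a stalled transfer instead of looping forever
if (!$sftp->login('username', 'password')) {
    exit('login failed');
}
if ($sftp->get('/remote/cover.jpg', '/local/cover.jpg') === false) {
    error_log('download failed for cover.jpg');
}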
I'm working with PHP and MySQL along with PHP Simple HTML DOM Parser. I have to parse a website's pages and fetch some content. For that I used the homepage of the website as the initial URL and fetched all the anchor tags available on that page.
I have to filter those URLs, as not every link is useful to me, so I used a regular expression. The required links must be saved into my MySQL database.
My questions are:
If I extract all the links (around 120,000) and try to save them into the MySQL DB, I get the following error:
Fatal error: Maximum execution time of 60 seconds exceeded in C:\xampp\htdocs\search-engine\index.php on line 12
I can't store the data in the database.
I couldn't filter the links.
include('mysql_connection.php');
include('simplehtmldom_1_5/simple_html_dom.php');
$website_name = "xyz.html/";
$html = file_get_html("xyz.html/");
foreach($html->find('div') as $div)
{
    // note: this rescans every anchor on the whole page once per div
    foreach($html->find('a') as $a_burrp)
    {
        $a1 = $a_burrp->href; // keep the '<br>' out of the stored URL
        echo $a1 . '<br>';
        if(preg_match('/.+?event.+/', $a1, $a_match))
        {
            mysql_query("INSERT INTO scrap_urls(url, website_name, date_added) VALUES('$a1', '$website_name', now())");
        }
    }
}
You are receiving Fatal error: Maximum execution time of 60 seconds exceeded because of a configuration limit in PHP. You can raise this limit by adding a line like this at the top of your code:
set_time_limit(320);
More info: http://www.php.net/manual/en/function.set-time-limit.php
You can also just raise the value in your php.ini file in XAMPP.
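For the php.ini route, the directive (in XAMPP it typically lives at xampp\php\php.ini) would look like this:
max_execution_time = 320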
Actually, PHP may not be the best solution here. A PHP script is intended to perform quick operations and return a response, and in your case the script can run for quite a long time. Although you can increase max_execution_time, I encourage you to use another technology that is more flexible than standard PHP for long-running jobs, such as Python or JavaScript (Node.js).
I also usually work with PHP scripts that need "some time" to finish.
I always run those scripts either as a cronjob or directly from the shell / command line using:
php script.php parameters
That way I don't have to worry about the execution time.
There is a reason max_execution_time is usually set to <= 60 seconds for web requests.
Regards.
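For example, a hypothetical crontab entry that runs the scraper nightly (path and schedule are placeholders); note that the PHP CLI has no execution time limit by default:
0 2 * * * php /path/to/scrap_urls.php >> /var/log/scrap_urls.log 2>&1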
I have been trying to fix this weird PHP session issue for some time now.
Setup: running on IIS 6.0, PHP for Windows 5.2.6.
Issue:
At totally random times, the PHP line right after session_start() times out.
I have a file, auth.php, that gets included on every page of an extranet site (to check for valid logins).
auth.php
session_start();
if (isset($_SESSION['auth']) && $_SESSION['auth'] == 1) { <---- times out here
do something ...
}
...
When using the site, I get random "maximum execution time of 30 seconds exceeded" errors at line 2: if (isset($_SESSION['auth']) && $_SESSION['auth'] == 1) {
If I modify this script to
session_start();
echo 'testing'; <---- times out here
if (isset($_SESSION['auth']) && $_SESSION['auth'] == 1) {
do something ...
}
...
The random error now happens on line 2 as well (echo 'testing'), which is a simple echo statement. Strange.
It looks like session_start() is randomly causing issues, making the line of code right after it throw a timeout error (even for a simple echo statement)...
This is happening on all sorts of pages on the site (DB-intensive, relatively static, ...), which makes it difficult to troubleshoot. I have been tweaking the session variables and timeouts in php.ini without any luck.
Has anyone encountered something like this, or can anyone suggest places to look?
Thanks!
A quick search suggests that you should use session_write_close() to close the session when you are done with it if you are on an NTFS file system. Starting a session locks the session file so no other request can access it while your code is running. For some reason the lock sometimes isn't released reliably on Windows/NTFS, so you should close the session manually as soon as you no longer need it.
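A minimal sketch of that pattern in auth.php (assuming nothing later in the request needs to write to the session):
session_start();
$is_authed = isset($_SESSION['auth']) && $_SESSION['auth'] == 1; // read what you need first
session_write_close(); // then release the session file lock immediately
if ($is_authed) {
    // do something ...
}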