I am using a Raspberry Pi to read real-time RS232 data via a USB port using a Prolific 2303 adaptor.
I want to parse this and display it in a web page.
My first approach was a Python script. It could read and parse the data, but passing the result to a PHP web page proved a problem: I could either send it with GET or save it to a file and read it with PHP. The latter is a non-starter on an SD card, which won't survive the constant writes for long, and the former might involve file storage anyway.
I then tried a PHP approach:
$device = "/dev/ttyUSB0";
$fp = fopen($device, "r") or die("cannot open $device");

while (true) {
    $xml = fread($fp, 400); // length is an int; the string "400" only worked by coercion
    print_r($xml);
}

fclose($fp); // never reached: the loop above runs forever
print_r() produces:
CC128-v0.110011909:27:3016.70000951000660008500024 CC128-v0.110011909:27:3616.70000951000670008600024 CC128-v0.110011909:27:4216.70000951000680008700027 CC128-v0.110011909:27:4816.70000951000680008600024 CC128-v0.110011909:27:5516.70000951000680008800024
This is five bursts of XML stripped of all its tags. The complete XML message is 340 characters long, sent at 57600-N-8-1 every 6 seconds.
Then it crashes with "Fatal error: Maximum execution time of 30 seconds exceeded in ~/meter.php on line 79". Sometimes there is missing or corrupted data, too.
If I use $xml = fgets($fp); I get no data.
The stream I am expecting, as read using a Python script, is:
['<msg><src>CC128-v0.11</src><dsb>00114</dsb><time>17:45:02</time><tmpr>16.8</tmpr><sensor>0</sensor><id>00095</id><type>1</type><ch1><watts>00065</watts></ch1><ch2><watts>00093</watts></ch2><ch3><watts>00024</watts></ch3></msg>\r\n']
I tried to use PECL DIO but could not locate all the dependencies. Apparently it is deprecated for the current version of Debian and will be removed from the next; I don't know whether that refers to Buster, Bullseye or Bookworm. I also tried php_serial.class, which I found on GitHub, but could not get complete XML output or even the stripped-down, data-only stream.
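For reference, my understanding is that php_serial.class mostly shells out to stty on Linux to set the line parameters before reading, so a minimal sketch of that idea would be something like this (untested; it assumes the adaptor really is on /dev/ttyUSB0 and that stty is available):
// Sketch only: configure 57600-N-8-1 in raw mode, then read line by line.
exec('stty -F /dev/ttyUSB0 57600 cs8 -cstopb -parenb raw');

$fp = fopen('/dev/ttyUSB0', 'r') or die('cannot open port');
stream_set_blocking($fp, true);          // wait for data instead of returning early

while (($line = fgets($fp)) !== false) { // each <msg>...</msg> ends in \r\n
    echo trim($line), PHP_EOL;
    flush();
}
fclose($fp);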
What am I missing to get a PHP variable updated every 6 seconds?
I have some concerns about my PHP file, which takes too long to process.
I'm using XAMPP.
The problem is that when I call too many methods of my classes, the PHP file loads or executes too slowly.
Here's my example code:
class Sample {
    public function show() {
        echo 'test';
    }
}
$classObject = new Sample();
$classObject->show();
When I run the PHP code above, it takes, I think, only about 1.5 s, but if I add more method calls like this:
$classObject = new Sample();
$classObject->show();
$classObject->show();
the PHP code takes almost 3 s to execute.
Is there a way to solve this problem?
I already found out what the problem is: when I fetch data from my database using OOP-style PHP, it takes several seconds to execute, but when I use procedural-style PHP, it executes faster.
Firstly, you should take a look at some PHP debugging techniques, because I don't see any attempt at debugging in your code sample.
To your problem:
What you've described is an issue that may or may not be PHP's fault. The thing is, you have to understand how request processing works; in your case (XAMPP + PHP) it goes like this:
WebServer start:
XAMPP starts Apache (or Tomcat, or whatever web server you're using)
Apache starts the PHP binary
PHP loads the configuration (php.ini)
The request:
You'll start a web browser (doesn't matter which one)
You'll enter a web address (with or without a specific path - URI)
Then some DNS things happen - that's NOT important for you now
The browser sends a plain-text request to the server via the HTTP protocol, which runs over TCP - also not so important for you at this time
The WebServer receives the HTTP request and begins processing it
The WebServer runs your PHP script in the already running PHP binary and awaits the output
The response:
Then the WebServer takes the output of your script and sends it back to the browser
The browser decides (by the headers) how (and if) the content will be displayed to you
Then the browser begins displaying the response content - in your case an HTML or plain-text document
While drawing the document, the browser processes all JavaScript and CSS it encounters on the way (top to bottom)
With this knowledge, you need to find out WHERE along the way the delay occurs.
The first thing you should do is take a look at how long your script takes to process, so:
<?php
$start_time = microtime(true);

$classObject = new Sample(); // "new" was missing here
$classObject->show();
$classObject->show();

echo 'Processing took: ' . number_format(microtime(true) - $start_time, 6, '.', '') . ' seconds';
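If the time printed there is small, the delay is happening somewhere else in the chain described above, not in your PHP code.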
Then you should look into a developer tool your browser provides, for example DevTools in Google Chrome (very good for debugging, though not the best) - hit F12 to open it.
In its Network panel you'll see timings like these:
Time is the duration between the moment a request was fully sent and the moment its response was fully received
Load is the sum of the durations of all the requests made up to that moment
Finish is the duration between the moment the first request was fully sent and the moment the last response was fully received
DOMContentLoaded is the duration of rendering the entire document (browser/client side)
When you have all of these specific durations, you can decide where the problem probably lies :)
I have written a script that searches for values in XML files, which I retrieve online via the following code:
# Reads entire file into a string
$result = file_get_contents($xml_link);
# Returns the xml string into an object
$xml = simplexml_load_string($result);
But the XML files are sometimes big, with the consequence that I get the following error: Fatal error: Maximum execution time of 30 seconds exceeded.
I have adapted php.ini to set max_execution_time to 360 seconds, but I still get the same error.
I have two options in mind.
If this error occurs, run the line again. But I couldn't find anything about this online (I am probably searching with the wrong search terms). Is there a way to re-run the line where the error occurred?
Save the XML files temporarily to local storage, search for the information there, and remove them at the end of the process. Here I have no idea how to remove them after retrieving all the data. And would this actually solve the problem? My script still needs to search through the XML file, so won't it take the same amount of time?
When I used these two lines in my script, the problem was solved.
ini_set('max_execution_time', 300); // allow up to 300 seconds per run
set_time_limit(0);                  // 0 removes the execution time limit entirely
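As for your second option, the save-and-delete part is simple; a rough sketch, assuming $xml_link still holds the URL (the temp-file handling here is just an example):
# Sketch of option 2: save locally, parse, then delete.
$tmp_file = tempnam(sys_get_temp_dir(), 'xml_');
file_put_contents($tmp_file, file_get_contents($xml_link));

$xml = simplexml_load_file($tmp_file);
# ... search through $xml here ...

unlink($tmp_file); # remove the file when done
Note that this only answers the cleanup question; whether it avoids the timeout depends on where the time is actually being spent.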
I'm using phpseclib 0.3.1 to work with a remote SFTP server. I have a script which downloads cover images from the SFTP server, saves them on my server and makes updates in the database.
I ran this script for 7000 images, and after nearly 10-12 minutes it looked like the script had stopped (but eventually I found out that it had entered an endless loop).
After some investigation, I have found the following details:
The function get($remote_file, $local_file = false) from SFTP.php is called to download an image file.
In this function, _get_sftp_packet() is called in a while(true) loop.
In _get_sftp_packet() there is a call to _get_channel_packet(NET_SFTP_CHANNEL);
And in _get_channel_packet() there is a call to $response = $this->_get_binary_packet();
My problem is that this $response is an empty string. In _get_sftp_packet() the length of this response is used as a decrement, and if the function returns an empty string (length 0), I never get out of the loop in _get_sftp_packet().
Did anyone face this problem? What does an empty response mean for the _get_binary_packet() function?
I would appreciate any help.
It's probably an issue with the window-size handling - an issue that has been fixed for a while now.
You're running 0.3.1? The latest version is 0.3.10, so you're many releases behind.
I'm working with PHP and MySQL along with PHP Simple HTML DOM Parser. I have to parse a website's pages and fetch some content. For that, I use the website's homepage as the initial URL and fetch all the anchor tags available on that page.
I have to filter those URLs, as not every link is useful to me, so I used a regular expression. The required links must be saved into my MySQL database.
My questions are:
If I extract all the links (around 120,000) and try to save them into the MySQL DB, I get the following error:
Fatal error: Maximum execution time of 60 seconds exceeded in C:\xampp\htdocs\search-engine\index.php on line 12
I can't store the data in the database.
I couldn't filter the links.
include('mysql_connection.php');
include('simplehtmldom_1_5/simple_html_dom.php');

$website_name = "xyz.html/";
$html = file_get_html("xyz.html/");

foreach ($html->find('div') as $div)
{
    // Search within the current div; the original searched the whole
    // document again on every iteration of the outer loop.
    foreach ($div->find('a') as $a_burrp)
    {
        $a1 = $a_burrp->href; // assign first, so '<br>' is not stored in $a1
        echo $a1 . '<br>';
        if (preg_match('/.+?event.+/', $a1, $a_match))
        {
            // Values need quotes, not backticks, and the original call was
            // missing its closing parenthesis.
            mysql_query("INSERT INTO scrap_urls (url, website_name, date_added)
                         VALUES ('" . mysql_real_escape_string($a1) . "', '$website_name', now())");
        }
    }
}
You are receiving Fatal error: Maximum execution time of 60 seconds exceeded because of a configuration limit in PHP. You can raise this limit by adding a line like this at the top of your code:
set_time_limit(320);
More info: http://www.php.net/manual/en/function.set-time-limit.php
You can also just raise the value in your php.ini file in XAMPP.
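The relevant directive is max_execution_time; the value here simply mirrors the example above:
max_execution_time = 320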
Actually, PHP is not the best solution here. A PHP script is intended to perform quick operations and return a response, while in your case the script may run for quite a long time. Although you can increase max_execution_time, I encourage you to use a technology that is more flexible for this than standard PHP, such as Python or JavaScript (Node.js).
I also usually work with PHP scripts that need "some time" to finish.
I always run those scripts either as a cron job or directly from the shell or command line using:
php script.php parameters
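A cron entry for such a script might look like this (the schedule and path are placeholders):
*/10 * * * * /usr/bin/php /path/to/script.php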
That way I don't have to worry about the execution time limit.
There is a reason max_execution_time is usually set to 60 seconds or less.
Regards.
I'm trying to find a way to echo the output of an exec call and flush it to the screen while the process is running. I have written a simple PHP script which accepts a file upload and then converts the file with FFmpeg if it is not the appropriate file type. I am doing this on a Windows machine. Currently my command looks like this:
$cmd = "ffmpeg.exe -i ..\..\uploads\\".$filename." ..\..\uploads\\".$filename.".m4v 2>&1";
exec( $cmd, $output);
I need something like this:
while( $output ) {
print_r( $output);
ob_flush(); flush();
}
I've read about using ob_flush() and flush() to clear the output buffer, but I only get output once the process has completed. The command works perfectly; it just doesn't update the page while converting. I'd like to have some output so the person knows what's going on.
I've set the time out
set_time_limit(10 * 60); // 10 minute time out
and would be very grateful if someone could point me in the right direction. I've looked at a number of solutions on Stack Overflow which come close, but none seem to have worked.
Since exec() is a blocking call, you have no way of using output buffers to get status.
Instead, you could redirect the output of the system call to a log file, then let the client query the server for progress updates; the server can parse the last lines of the log file to get information about the current progress and send it back to the client.
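A rough sketch of that idea on Windows (your platform); the file names, the start /B detach trick, and the convert.php/progress.php split are illustrative assumptions, not a drop-in solution:
// convert.php - start ffmpeg detached, with all output redirected to a log file
$log = 'C:\\uploads\\convert.log'; // example log location
$cmd = "ffmpeg.exe -i input.avi output.m4v > \"$log\" 2>&1";
pclose(popen('start /B ' . $cmd, 'r')); // "start /B" detaches, so PHP returns at once

// progress.php - polled by the client every few seconds
$lines = @file($log, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
echo $lines ? end($lines) : 'starting...'; // the last log line approximates current progress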
exec() is a blocking call and will NOT return control to PHP until the external program has terminated. That means you cannot dump the output on a line-by-line basis, because PHP is suspended while the external app is running.
For what you want, you need to use proc_open(), which gives you pipes you can read from in a loop, e.g.:
// proc_open() needs a descriptor spec; the pipes come back via the third argument.
$descriptors = array(1 => array('pipe', 'w')); // capture the child's stdout
$proc = proc_open('.....', $descriptors, $pipes);

while ($line = fgets($pipes[1])) { // read the child's output as it appears
    print($line);
    flush();
}
proc_close($proc);
There are two problems with this approach:
The first is that, as @Marc B notes, exec will block until it's finished, so you'll have to devise some way of measuring progress.
The second is that using ob_flush() in this way amounts to holding the connection between server and client open and dribbling the data out a little at a time. This is not something the HTTP protocol was designed for, and while it might work sometimes, it's not going to work consistently - different browsers and different servers will time out differently. The better way to do it is via AJAX calls: using JavaScript's setTimeout() (or setInterval()), make a call to the server periodically and have the server send back a progress report.