Send continuous data from 4G module to server - php

I have a Telit LE910 4G LTE module connected to a Teensy board (an Arduino will also do). While I am able to send data to my PHP server using HTTP requests (POST and GET), I am not able to send continuous data because of the delays needed to wait for the server to respond:
[...]
// SOCKET DIAL
LTESerial.print("AT#SD=1,0,80,\"SERVER IP\"\r\n");
delay(5000);
// POST
LTESerial.print("POST /server/index.php?data=");
LTESerial.print(random(1000));
LTESerial.print(" HTTP/1.1\r\n");
LTESerial.print("Host: SERVER IP\r\n\r\n");
delay(5000);
while (getResponse() > 0);
This is simply an example (written here), but it roughly illustrates what I am doing. The above code is meant to sit inside a while loop, so that once a data point is uploaded to a .txt file on the server, the module reconnects to the server and POSTs the next one.
Obviously, I want to avoid these delays and push data to the server as fast as possible (as soon as the data is available). This is why I opted for the 4G LTE version.
Tweaking the delays might buy me an extra second or so, but my project involves plotting a lot of data points in "real time", so it is very time sensitive.
Any idea how to send a continuous data stream to the server over 4G? I am considering buffering some data points and using FTP to upload them, but I assume uploading files to the server might take even more time than now.
Any help is much appreciated!

It sounds like your use case might be better suited to a dedicated IoT (Internet of Things) protocol rather than a connection-oriented client-server protocol like HTTP.
There are several protocols in use in the IoT world, but some of the most common are:
MQTT - http://mqtt.org
CoAP - http://coap.technology
XMPP - https://xmpp.org
These should not only address your latency concerns but are also generally designed to minimise data overhead and processing/battery use.
You should be able to find PHP examples for these as well - for example, one for MQTT:
https://www.cloudmqtt.com/docs-php.html
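As a rough illustration, here is a minimal publish sketch assuming the php-mqtt/client Composer package; the broker host and topic are placeholders, not part of the original answer:
<?php
// Minimal MQTT publish sketch, assuming the php-mqtt/client package
// (composer require php-mqtt/client). Broker host and topic are placeholders.
require 'vendor/autoload.php';

use PhpMqtt\Client\MqttClient;

$client = new MqttClient('broker.example.com', 1883, 'sensor-bridge');
$client->connect();
// QoS 0 is fire-and-forget: lowest latency, a reasonable fit for frequent samples
$client->publish('sensors/teensy1/data', (string) mt_rand(0, 999), 0);
$client->disconnect();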

I somewhat got it to work using some of the existing code above, but it is still not optimal. This might be useful for others.
This is what I did:
1) I socket dial only once (during initialization).
2) The POST section runs inside an infinite loop. The 5 second delay is now reduced to 200 ms, and I added some headers, like so:
//unsigned long data = random(1000000000000000, 9999999999999999);
LTESerial.print("POST /index.php?data=");
LTESerial.print(data);
LTESerial.print(" HTTP/1.1\r\n");
LTESerial.print("Host: ADDRESS\r\n");
LTESerial.print("Connection: keep-alive\r\n\r\n");
delay(200);
while (getResponse() > 0);
3) It turns out my WAMP server (PHP) had default limits on the maximum number of HTTP requests, timeouts and the like. I had to increase these numbers (I set them to unlimited) inside php.ini.
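For what it's worth, the per-connection request cap usually lives in Apache's httpd.conf rather than php.ini; a hedged example of the relevant keep-alive directives on a WAMP stack:
# httpd.conf - keep-alive limits; 0 means unlimited requests per connection
KeepAlive On
MaxKeepAliveRequests 0
KeepAliveTimeout 15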
However, while I am able to "continuously" send data to my server, a delay of 200 ms is still a lot. I would like to see something close to serial communication, if possible.
Also, when looking at the serial monitor, I get:
[...]
408295030
4238727231
3091191349
2815507344
----------->(THEN SUDDENLY)<------------
HTTP/1.1 200 OK
Date: Thu, 02 Jun 2
2900442411
016 19:29:41 GMT
Server: Apache/2.4.17 (Win32) PHP/5.6.15
X-P16
3817418772
Keep-Alive: timeout=5
Connection: Keep-Alive
Content-Type: te
86026031
HTTP/1.1 200 OK
Date: Thu, 02 Jun 2016 19:29:4
3139838298
75272508
[...]
----------->(After 330 iterations/POSTs, I get)<------------
NO CARRIER
NO CARRIER
NO CARRIER
NO CARRIER
So my questions are:
1) How do I eliminate the 200 ms delay as well?
2) If my data points have different sizes, the delay will have to change as well. How do I handle this dynamically?
3) Why does it stop at around 330 iterations? This doesn't happen if the data is only 4 digits.
4) Why do I suddenly get responses from the server?
I hope someone can use this for their own project; however, it does not suffice for mine. Any ideas?

Related

Stale JSON response being served

I have an IoT device that fetches a JSON file from a web server. The value inside the JSON file is modified by a PHP/HTML-based webpage. The file on the server is set to 777 permissions and its value is changing correctly. However, when my IoT device connects to the server to parse the JSON, it is served a copy of the file that is at least 8 days old. This header is being returned:
HTTP/1.1 200 OK.
Content-Length: 15.
Content-Type: application/json.
Server: Apache.
Last-Modified: Mon, 30 Nov 2015 21:28:39 GMT.
Connection: keep-alive.
Date: Tue, 08 Dec 2015 08:22:36 GMT.
.
{"light": "on"}LED ON
closing connection.
What am I missing here? One possibility that comes to mind is that the server is sending back a cached response instead of actually looking at the fresh version of the file and serving that.
So I guess you have some caching either on the server or in your IoT device:
1) Check whether your IoT device does any caching.
2) Check your server configuration for caching.
If both of those fail, the only remaining option is to add some logic that always appends a unique parameter to the end of the URL in your IoT device (e.g. file.json?_=1449562956) and retrieve the JSON with that link.
Thanks to help from @Armen and a lot of reading online, it turned out that the problem was the following:
I discovered that, in PHP, simply writing to a file does not modify its 'last-modified' attribute, and this was causing the server to send the wrong version of the JSON file. To remedy the problem, I added touch("path/to/file.json"); after each time the code wrote to the file and had closed the file pointer. This updates the file's last-modified attribute, which fixed the problem. I realize this is probably something very obvious, but being a beginner myself, I think it is something a lot of beginners might run into, so I thought I should share.
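A minimal sketch of the fix described above (the path is illustrative):
<?php
// Write the new value, close the handle, then refresh the timestamp so
// anything keying on Last-Modified revalidates the file.
$file = 'path/to/file.json';
$fp = fopen($file, 'w');
fwrite($fp, json_encode(array('light' => 'on')));
fclose($fp);
touch($file); // update the file's last-modified attribute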

Get around execution time limit in PHP

I've got a function that takes a very long time to execute (downloading some external images, in my case) and I want to avoid the "execution time exceeded" error.
Is there any way to do this (for example, by dividing the downloading of single images into separate PHP 'threads' or something like that)?
I cannot change the execution time limit or any of the ini settings.
I'm not able to use cron jobs, as this is for a WordPress theme and I can't control the end user's platform.
One of the possibilities is to make a PHP script that downloads one external image, and have that script called via Ajax. You can then build a user interface with JavaScript which calls this PHP script for each image, one by one. It could show a progress bar depending on how many images have been downloaded already.
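A minimal sketch of that idea, assuming a hypothetical download_one.php endpoint and an images.json list of URLs (all names are illustrative):
<?php
// download_one.php - fetches exactly one remote image per request, so each
// Ajax call stays well under the execution time limit.
$urls  = json_decode(file_get_contents('images.json'), true); // list of image URLs
$index = isset($_GET['index']) ? (int) $_GET['index'] : 0;

if (!isset($urls[$index])) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

file_put_contents('downloads/' . basename($urls[$index]), file_get_contents($urls[$index]));
echo json_encode(array('done' => $index, 'total' => count($urls)));
The JavaScript then calls download_one.php?index=0, ?index=1, and so on, updating the progress bar from each response.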
Yes, you can. But you will have to host or proxy the images you want to download in chunks if the remote server does not support partial (ranged) transfers.
Then you will have to make your PHP script request the image from the server chunk by chunk:
Request
GET /proxy/?url=http://example2.com/myimage.jpg HTTP/1.1
Host: www.example.com
Range: bytes=200-1000
Response
HTTP/1.1 206 Partial Content
Date: Tue, 17 Feb 2015 10:50:59 GMT
Accept-ranges: bytes
Content-range: bytes 200-1000/6401
Content-type: image/jpeg
Content-length: 800
You then have many choices for calling your PHP script enough times to fetch all the chunks: automatically refreshing the page, Ajax requests, ...
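A sketch of the client side of this scheme, using a stream context to send the Range header (the URL and byte offsets mirror the example above):
<?php
// Fetch one chunk of the image through the proxy and append it locally.
// Repeat with the next byte range on the next invocation.
$context = stream_context_create(array(
    'http' => array('header' => "Range: bytes=200-1000\r\n"),
));
$chunk = file_get_contents(
    'http://www.example.com/proxy/?url=http://example2.com/myimage.jpg',
    false,
    $context
);
file_put_contents('myimage.jpg', $chunk, FILE_APPEND);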

HTTP / Twilio requests via PHP in Google App Engine being repeated

I am trying to migrate a PHP application that uses Twilio to Google App Engine and have run into a bit of a snag. As a simple test, I sent a single text message to my cell phone from within the Google App that I created. It sends fine, but I receive the message twice; to confirm it was actually executing twice, I sent the epoch time - the two messages were about 1 second apart.
I checked the logs and saw this - "This request caused a new process to be started for your application, and thus caused your application code to be loaded for the first time. This request may thus take longer and use more CPU than a typical request for your application." I tried removing the Twilio usage entirely and replaced it with a simple "Hello World" echo, same message appeared in the log for that request.
How can I avoid this sort of behavior?
UPDATE
Here are the headers from my Requestb.in test using the following code. The bin was hit twice from the same IP address - I only went to the App's page one time.
<?php
$result = file_get_contents('http://requestb.in/BINID');
echo $result;
Headers -
First Request:
User-Agent: AppEngine-Google; (+http://code.google.com/appengine; appid: s~MYAPP)
Connection: close
Accept-Encoding: gzip
X-Request-Id: e7583bda-dfeb-4431-92a5-aa4af0bf06e8
Host: requestb.in
Second request:
User-Agent: AppEngine-Google; (+http://code.google.com/appengine; appid: s~MYAPP)
X-Request-Id: e766375b-bea8-4b79-a869-e2603309bec7
Accept-Encoding: gzip
Host: requestb.in
Connection: close
SECOND UPDATE
I added the epoch time as a GET variable to the requestb.in address; the bin was hit twice with the exact same epoch, from two different IP addresses, one second apart. So this tells me the code was executed one time but the bin was somehow accessed twice from two IP addresses. Sometimes it seems to be only one IP address. Really puzzled here... I even tried from scratch with a new app, same result.
I think you will find that the message "This request caused a new process to be started for your application" is unrelated.
Unless you use warmup requests, you will always see this message when an instance is started to serve a user-facing request.
I would look at your code and see how the message-sending code could be executed twice.
Try adding some logging around the sending code and see if you get that log message twice in the same request.

Browser shows time out while Server process is still running

I am having the following problem:
I am running a BIG memory process, but I have divided the memory load into smaller chunks, so there is no CPU timeout issue.
On the server I am creating .xml files of around 100 KB each, and 100+ of them will be created.
The main problem is that the browser shows a response timeout, and IE (just above the status bar) shows a .php file download message.
Meanwhile, on the server side, the process is still running and continuously creating .xml files in incremental order. So there is no issue there.
I have following php.ini configuration.
max_execution_time = 10000 ; Maximum execution time of each script, in seconds
max_input_time = 10000 ; Maximum amount of time each script may spend parsing request data
memory_limit = 2000M ; Maximum amount of memory a script may consume (128MB)
; Maximum allowed size for uploaded files.
upload_max_filesize = 2000M
I am viewing my site in IE, and I am using ZSCE with PHP 5.3.
Can anybody point me in the right direction on this issue?
Edit:
(I uploaded an image here showing the timeout and the prompt to download the .php file.)
Edit 2:
Let me briefly explain my execution flow:
I have one PHP file with objects of class hierarchies, which starts executing Function1() from each class hierarchy.
I have a class file.
First, say, Function1() is executed, which contains the logic for creating the XML files in chunks.
Second, say, Function2() is executed, which displays the output generated by Function1().
All of this is done in a class-hierarchy manner, so I can't terminate the execution of Function1() partway through; only after it has finished is Function2() called.
Edit 3:
This is specifically for @hakre.
You asked some probing questions, and I agree with some of your points, but let me describe the issue in more detail.
First, I was loading 100+ MB of XML files at a time, which is why the memory on my local setup was filling up, stalling everything on the machine, and the CPU was maxing out.
I then divided these big XML files into smaller ones (meaning I now load a single XML file at a time and unload it after use). This saved me from the memory overload and CPU issues on my local setup.
Now my backend process runs with no CPU or memory issues, but the problem is the browser timeout. I even tried cURL, but with my current structure it doesn't seem to fit because of my class hierarchy: I have a set of classes in a hierarchy, and they all execute their Process functions first, and only then their Output functions. So until the Process functions have executed, the Output functions don't come into the picture, and that's why the browser times out.
I also followed the instructions suggested by @vortex and had a little success, but not what I am looking for. The reason I could not make cURL work is that my Process function creates all the required XML files in one go, so it takes too much time before anything is output to the browser. Since the Process function takes that long, no output can be sent to the client until it has completed.
cURL Output:
URL....: myurl
Code...: 200 (0 redirect(s) in 0 secs)
Content: text/html Size: -1 (Own: 433) Filetime: -1
Time...: 60.437 Start # 60.437 (DNS: 0 Connect: 0.016 Request: 0.016)
Speed..: Down: 7 (avg.) Up: 0 (avg.)
Curl...: v7.20.0
Contents of test.txt file
* About to connect() to mylocalhost port 80 (#0)
* Trying 127.0.0.1... * connected
* Connected to mylocalhost (127.0.0.1) port 80 (#0)
> GET myurl HTTP/1.1
Host: mylocalhost
Accept: */*
< HTTP/1.1 200 OK
< Date: Tue, 06 Aug 2013 10:01:36 GMT
< Server: Apache/2.2.21 (Win32) mod_ssl/2.2.21 OpenSSL/0.9.8o
< X-Powered-By: PHP/5.3.9-ZS5.6.0 ZendServer
< Set-Cookie: ZDEDebuggerPresent=php,phtml,php3; path=/
< Cache-Control: private
< Transfer-Encoding: chunked
< Content-Type: text/html
<
* Connection #0 to host mylocalhost left intact
* Closing connection #0
Disclaimer: The accepted answer was chosen based on the first small success it produced. No answer completely fixed my problem, only partially. The solution from @hakre is also feasible when this type of question occurs, and his answer goes into more detail for anyone looking for background on this type of issue.
Assuming you have made all the server-side modifications to dodge a server timeout (pretty much everything is explained above), then in order to dodge the browser timeout it is crucial that you do something like this:
<?php
set_time_limit(0);        // no PHP execution time limit
error_reporting(E_ALL);
ob_implicit_flush(TRUE);  // flush output automatically after every output call
ob_end_flush();           // close the default output buffer so output reaches the client
I can tell you from experience that Internet Explorer doesn't have any issues as long as you output some content to it every now and then. I run a 30 GB database update every day (which takes around 2-4 hours), and Opera seems to be the only browser that ignores the content output.
If you don't set ob_implicit_flush, you need to call ob_flush() after every piece of content.
References
ob_implicit_flush
ob_flush
If you don't use ob_implicit_flush at the top of your script as I wrote earlier, you need to do something like this within your execution loop:
<?php
echo 'dummy text or execution stats';
ob_flush();
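Putting the snippets together, a sketch of such a loop (create_xml_chunk() and $chunks are placeholders for your own work units):
<?php
set_time_limit(0);
ob_implicit_flush(true);
ob_end_flush();

foreach ($chunks as $i => $chunk) {
    create_xml_chunk($chunk);     // placeholder for the heavy per-step work
    echo "Processed chunk $i\n";  // any output keeps the browser waiting
    flush();                      // push it past any web-server buffer
}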
1. I am running BIG memory process but have divided memory load into smaller chunks so no CPU time out issue.
Now that's a wild guess. How did you find out it was a CPU timeout issue in the first place? Did you even? If yes, what does your test show now? If not, how do you test now that this is not a timeout issue?
Although you state there won't be a certain issue, you don't prove it, and many questions remain open. That invites guessing, which is counter-productive for troubleshooting (which is what you are doing here).
What you write here just means that you wrote code to chunk memory; however, that is not a test for CPU timeout issues. Writing code is one part; testing is the other. Don't mix the two, and don't draw wild assumptions. Issues are established by tests; otherwise they didn't happen.
So much for your first point, just to show you that when troubleshooting, you should look for facts (monitor, test, profile, step-debug), not run on assumptions. This is crucial; otherwise you look in the wrong places and ask the wrong questions.
From what you describe about how the client (browser) behaves, this is not a timeout issue per se. The problem you've got is that the gap between the header response and the body response is taking too long for your browser's taste. One browser assumes a timeout (a boundary value has been triggered, which looks correct to me), and the other assumes something is still coming and keeps waiting.
So you merely have a processing issue here. Please consult the manual of your web browsers (HTTP clients) for the configuration values you can change to alter this behavior. E.g. measure with a curl request on the command line how long the request actually takes, then configure your browser not to time out when connecting to that server within the amount of time you just measured. For example, if you're using Internet Explorer: http://www.ehow.com/how_6186601_change-internet-timeout-options.html or if you're using Mozilla Firefox: http://forums.mozillazine.org/viewtopic.php?f=7&t=102322&start=0
As you didn't show any code from the server side, I assume you want to solve this problem with client settings. curl will help you measure the number of seconds such a request takes. Use the -v (verbose) switch to obtain detailed information about the request.
In case you don't want to solve this on the client, curl will still help you measure important data and easily reproduce any underlying server-related timing issue. So you should go for curl on the command line in any case, especially as looking into the response headers might reveal what triggers the (again) esoteric Internet Explorer behavior. Again, the -v switch reveals the request and response headers.
If you would like to automate such tests with a PHP script, that is also possible with the PHP cURL extension. This is outlined in:
Php - Debugging Curl
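For instance, a small sketch that measures the total request time with the cURL extension (the URL is a placeholder):
<?php
// Time one request; CURLINFO_TOTAL_TIME is the full transfer time in seconds.
$ch = curl_init('http://mylocalhost/yourscript.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
echo 'Total time: ' . curl_getinfo($ch, CURLINFO_TOTAL_TIME) . " s\n";
curl_close($ch);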
The problem is with your web server, not the browser.
If you're using Apache, you need to adjust the Timeout value in httpd.conf or your virtual host config.
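For example, in httpd.conf (the value is in seconds; 600 here is illustrative):
# Allow long-running requests before Apache closes the connection
Timeout 600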
You have 3 pages:
Process - creates the XML files and then updates a database value saying that the process is done
A PHP page that returns {true} or {false} based on the process-completion value in the database (a sketch follows after this list)
An Ajax front end, polling page 2 every few seconds to check whether the process is done or not
Long Polling
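A sketch of page 2, assuming a jobs table with an is_done flag (all names are illustrative):
<?php
// status.php - polled by the Ajax front end every few seconds.
$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$done = $pdo->query('SELECT is_done FROM jobs WHERE id = 1')->fetchColumn();
header('Content-Type: application/json');
echo $done ? '{"done":true}' : '{"done":false}';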
I have had this issue several times while reading a large CSV file and putting it into a database. I solved it by dividing the reading and database-insertion process into smaller parts: I created a new table to log how much data had been read and inserted, and the page then reloads itself and continues from that position. You could do the same by creating one XML file per attempt, then reloading the page and starting on the next one. This way the memory used by the browser is refreshed.
Hope it helps.
Is it possible to send some output to the browser from the script while it's still processing, even white space? If so, do it; it should reset the timeout counter.
If that's not possible, you have to increase the timeout of IE in the registry:
HKEY_CURRENT_USER\SOFTWARE\Microsoft\Windows\CurrentVersion\Internet Settings
You need ReceiveTimeout; if it's not there, create it as a DWORD and set the value in milliseconds.
What is a "CPU time out issue"?
The right way to solve the problem is to run the heavy stuff asynchronously, in a separate session group (not in the web server's process tree).
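A sketch of that approach on Linux, where worker.php is a placeholder for the heavy script (setsid starts it in its own session, outside the web server's process tree):
<?php
// Kick off the heavy work detached from this request, then return at once.
exec('setsid php worker.php > /dev/null 2>&1 &');
echo 'Job started; poll a status endpoint to watch its progress.';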
Try including set_time_limit(0); in your PHP script.
The following links might help you:
http://php.net/manual/en/function.set-time-limit.php
http://php.net/manual/en/function.ignore-user-abort.php
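A minimal combination of the two functions linked above:
<?php
set_time_limit(0);       // lift the execution time cap for this script
ignore_user_abort(true); // keep running even if the browser gives up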

Why does XSendfile emit intermittent garbled responses when used with Symfony Components BinaryFileResponse class?

Background
Part of my application's responsibility is handling requests for static resources (CSS, JavaScript, images) in a controlled manner. Based on some application logic, it will return one from a selection of different files that might be served on that URL at different times and to different users. These are therefore static files, but delivered in a dynamic way.
The application is based on Symfony Components and the serving of these static-ish files is handled by the BinaryFileResponse class.
The bootstrap code calls the trustXSendfileTypeHeader method:
\Symfony\Component\HttpFoundation\BinaryFileResponse::trustXSendfileTypeHeader();
The application uses some internal logic based on configuration and the detection and use of apache_get_modules() to determine availability. If XSendfile is available and the configuration says to use it, it sets the X-Sendfile-Type header:
if ($useHeader === true) {
    $request->headers->set('X-Sendfile-Type', $header);
}
$response = new BinaryFileResponse($filename);
Problem
When I run this with the configuration set to never use XSendfile, or through the PHP built-in web server, which obviously does not support XSendfile, everything is perfect.
When I utilise XSendfile, it also works -- most of the time.
Every so often, typically if I press the f5 key 3-4 times in quick succession, "something" wigs out and I get a garbled response. For example, this is supposed to be a JavaScript file (copied from "Response" tab under "Net" in Firebug):
hxYîãx��HTTP/1.1 200 OK Date: Tue, 05 Feb 2013 14:49:10 GMT Server:
Apache/2.2.22 (Ubuntu) X-Powered-By: PHP/5.4.6-1ubuntu1.1
Cache-Control: public Last-Modified: Tue, 29 Jan 2013 13:33:23 GMT
Accept-Ranges: bytes Content-Transfer-Encoding: binary ETag:
"10426f-9f6-0" Vary: Accept-Encoding Content-Encoding: gzip
Content-Length: 1011 Keep-Alive: timeout=5, max=98 Connection:
Keep-Alive Content-Type: application/javascript
������­VmoÛ6þ,ÿkÀ²ãIý°~q [Üt]
XÑt¶H¤#Rv¼Àÿ}w(YSÀØ2yïå¹*¾Á>¯¥¥,è) Æ^Ât¸BaÆ\éjgäjí
Î&ð*¸Åí¸tY!³Ç$Óe"jÞ![#,n®®oï®A¨þ¸þù××Þ©¼¼ôÇêÚd¹49mv°ÔrtBÖ^;WÍÓÔg´Y¥´FéôÁR9o°35Îà^º­´N=UÐè­Eµ¢XE¸íÒ%ª°¨Úò7¬KñT¾{;£ÈrTnß³étUè{QÀçÍn·:'üJëQÍÄËZeNjOàyÕÁ:#3wö~4Òét1ù$µeN)RD|
¶FTØJ·ß½¥¨¸õGç >9TyÜxzgl-J:) b«9ûAQ½KXÉ!yÐÓ]
óÆÎ#W¡?¢vún­·7j©ÿ¢ðõÖGEÁy\ºp¤÷cKxf?ï*¼Éç0^ïîÌÇ°ñDQ¸mYJ|4t¾ñæËÛ¯Å
¨6:çøp(}þÑò|LÂ;Õ(#v¹* /[¨U|xª
æ]ÍyìjµòÛ¯p?4sI¥"v÷ôp|uQ4ò4&Ï·$eÒc¸ xo%7Ôi´2ñx;TuÙj23 áÊ%ħ¿¹lÌwÀS.&ÏØß7¸}ó
ZXzå k2'Zdùè
�¦ºû-Ù[Ó²ÿU(¯¤¥=pÃjô¾ç]]Øhhô²×ÙãÚÍ4¨[!Õ}'Òþ^Ð�ûxÿ#+ÚVÞ~áÌáy?d
aíD¹·U×ÃÚ]­ õ5íÃø¨o÷ÂAvUÆmÍaày`¦ä©A?mL[-}®(ÿË
d°öò¬}Ç¢³Çp1À^6%0 hTô^ts´ÞíWô
fO¶ö¢ÎNÜæ·HîUôÔ¶±ÌCµsxh.9åçi Û·_ÈÞØ_ÄãY_Ö}G<ì°ý2wÔ¿aw8/þù\ã±þ"0C
oÂh'tE¶À¤¥7I½éßRt.s?á^d|k/Æ)wRw÷cG¿<Þ
¼´°/^ø*ʤAVZ×y¿zÅΪ¥[²Õ1ò_Vµæï_YXÁÕö ��YXÁÕö ��
Note the presence of the headers in the response body, and the rest of it, which is clearly not JavaScript. There are also some spurious characters at the start, which is possibly what pushes the headers into the body. I have tried to determine whether this content is the result of gzipping, but I can't confirm that yet. (See also the update below.)
Question
Firstly, is BinaryFileResponse even the correct class to use for serving text (non-binary) files? The documentation for the class only says "BinaryFileResponse represents an HTTP response delivering a file." That isn't very detailed, but it doesn't say anything about being exclusively for "binary" files. The name has its own implications, though; why didn't Fabien just call the class FileResponse?
Secondly, and more importantly, what could be causing this? I don't believe it is a browser issue, because it is reproducible in both Firefox and Chrome. Is this a bug in the XSendfile module, or in the BinaryFileResponse class perhaps? (I am inclined to think it is not the former, because I have used it before in a more "raw" way, not via Symfony Components, with no such issues.)
Has anyone else experienced this? Any idea where I should even start looking to track this down? I've looked at the BinaryFileResponse source code, but it doesn't really do much with XSendfile: it just sets the relevant header and prevents content in the response body, from what I can see.
Update
I've just noticed a couple of things about these garbled responses:
There are no actual response headers being sent at all; i.e. on the "Headers" tab in Firebug, the garbled responses only list request headers and don't even show the heading for response headers.
Even if I set a custom header on the Response in PHP, that header does not appear anywhere in the garbled responses (neither as a header nor in the response body), whereas the custom headers appear correctly in the responses that aren't broken.
First, let me say that I don't have any experience with this Apache module, but I'll try to guide you through some general error deduction:
You should check whether you can reproduce it more reliably. While a web browser might be OK for trying it out, you should go for something like curl and repeat the request multiple times, for example using a bash for-loop:
for i in `seq 1 5`; do curl -v http://localhost/xsendfile-url; done
The fact that the Connection: Keep-Alive header is set, and that there are some weird characters before the actual HTTP header, leads me to believe that you won't be able to reproduce this problem with separate curl calls, because each will open a fresh connection. So try this to check whether it produces the weird behavior (curl reuses the connection for multiple URLs given in one invocation):
curl -v http://localhost/xsendfile-url http://localhost/xsendfile-url http://localhost/xsendfile-url
Using this, you can go to the project's GitHub issues page and report your findings. Most probably they will be able to help you there, either by explaining why mod_xsendfile behaves the way it does or by confirming that you have found a bug.
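If you would rather script the reproduction in PHP, here is a sketch with the cURL extension that reuses one handle, so the connection stays alive across requests as it does in the browser (the URL is a placeholder):
<?php
// Reusing a single cURL handle keeps the TCP connection open between
// requests, mimicking the browser's keep-alive behavior.
$ch = curl_init('http://localhost/xsendfile-url');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, true); // include response headers in the output
for ($i = 0; $i < 5; $i++) {
    echo curl_exec($ch), "\n---\n";
}
curl_close($ch);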
