I have an IoT device that fetches a JSON file from a web server. The value inside the JSON file is modified by a PHP/HTML-based web page. The file on the server is set to 777 permissions and its value changes correctly. However, when my IoT device connects to the server to parse the JSON, it is served a copy of the file that is at least 8 days old. This header is being returned:
HTTP/1.1 200 OK
Content-Length: 15
Content-Type: application/json
Server: Apache
Last-Modified: Mon, 30 Nov 2015 21:28:39 GMT
Connection: keep-alive
Date: Tue, 08 Dec 2015 08:22:36 GMT

{"light": "on"}
LED ON
closing connection.
What am I missing here? One possibility that comes to mind is that the server is sending back a cached response instead of actually reading the fresh version of the file and serving that.
So I guess you have some caching either at the server or in your IoT device.
1) Check whether your IoT device does any caching of its own.
2) Check the server configuration for caching.
If both checks fail, the only remaining option is to add some logic that always appends a unique parameter to the end of the link in your IoT device and retrieves the JSON with that link.
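The unique-parameter trick can be sketched in PHP for illustration (the device would just build the same URL string; the base URL and the `nocache` parameter name are examples, not anything from the question):

```php
<?php
// Cache-busting sketch: append a value that changes on every request so
// any cache along the way treats each fetch as a brand-new resource.
// The base URL and the 'nocache' parameter name are illustrative.
function cacheBustedUrl(string $base): string
{
    $sep = (strpos($base, '?') === false) ? '?' : '&';
    return $base . $sep . 'nocache=' . bin2hex(random_bytes(8));
}

$a = cacheBustedUrl('http://example.com/state.json');
$b = cacheBustedUrl('http://example.com/state.json');
echo ($a !== $b) ? "unique\n" : "collision\n";
```

Because the query string differs on every fetch, neither the server nor any intermediary can serve a previously cached copy.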
Thanks to help from @Armen and a lot of reading online, it turned out that there were two issues causing this:
I discovered that, in PHP, simply writing to a file did not update its Last-Modified attribute, and this was causing the server to send the wrong version of the JSON file. To remedy the problem, I added touch("path/to/file.json"); after each time the code wrote to the file and had closed the file pointer. This updated the Last-Modified attribute of the file, fixing the problem. I realize this is probably something very obvious, but being a beginner myself, I think it is something that a lot of beginners might run into, so I thought I should share.
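The fix described above can be sketched as follows (the file path is an example, not the asker's actual path):

```php
<?php
// Write the JSON state file, close it, then touch() it so its
// Last-Modified timestamp is refreshed to "now".
$file = sys_get_temp_dir() . '/light_state.json';

$fp = fopen($file, 'w');
fwrite($fp, json_encode(['light' => 'on']));
fclose($fp);

touch($file);          // force mtime to the current time
clearstatcache();      // drop PHP's cached stat info before re-checking

echo file_get_contents($file), "\n";
echo (time() - filemtime($file) <= 5) ? "fresh\n" : "stale\n";
```

With a fresh mtime, the server's Last-Modified header changes and conditional requests (If-Modified-Since) no longer match the stale copy.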
I have the Telit LE910 4G LTE module connected to a Teensy board (an Arduino would also work). While I am able to send data to my PHP server using HTTP requests (POST and GET), I am not able to send continuous data because of the delays needed for the server to respond:
[...]
// SOCKET DIAL
LTESerial.print("AT#SD=1,0,80,\"SERVER IP\"\r\n");
delay(5000);
// POST
LTESerial.print("POST /server/index.php?data=");
LTESerial.print(random(1000));
LTESerial.print(" HTTP/1.1\r\n");
LTESerial.print("Host: SERVER IP\r\n\r\n");
delay(5000);
while (getResponse() > 0);
This is simply an example (written here), but it illustrates what I am doing. The above code is meant to sit inside a while loop, so that once the data is uploaded to a .txt file on the server, the module reconnects to the server and POSTs another data point.
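For reference, the server side of this can be as small as a script that appends each ?data= value to the .txt file; the script name, file name, and helper below are assumptions for illustration, not the asker's actual code:

```php
<?php
// Hypothetical index.php: take the ?data= value from the request and
// append it as one line to a text file, then acknowledge.
function storeDataPoint(string $file, string $value): bool
{
    // LOCK_EX guards against interleaved writes from concurrent POSTs
    return file_put_contents($file, $value . "\n", FILE_APPEND | LOCK_EX) !== false;
}

$file = sys_get_temp_dir() . '/datapoints.txt';
if (storeDataPoint($file, (string)($_GET['data'] ?? ''))) {
    echo "OK";
}
```

Keeping the response body tiny ("OK") also keeps the module's receive time per iteration to a minimum.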
Obviously, I want to avoid these delays and push data to the server as fast as possible (as soon as the data is available). This is why I opted for the 4G LTE version.
Tweaking the delays might give me an extra second or so, but my project includes plotting a lot of data points in "real time", so it is very time sensitive.
Any idea on how to send a continuous data stream to the server on 4G? I am thinking about buffering some data points and use FTP to upload the data, but I assume uploading files to the server might even take more time than now.
Any help is much appreciated!
It sounds like your use case might be better suited to a dedicated IoT (Internet of Things) protocol rather than a more connection-oriented client-server protocol like HTTP.
There are several protocols in use in the IoT world but some of the most common are:
MQTT - http://mqtt.org
CoAP - http://coap.technology
XMPP - https://xmpp.org
These should not only address your latency concerns; they are generally also designed to minimise data overhead and processing/battery use.
You should be able to find PHP examples for these as well -- for example, one for MQTT:
https://www.cloudmqtt.com/docs-php.html
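To give a feel for how lightweight MQTT is on the wire, here is a sketch (not taken from the linked docs) that builds a minimal MQTT 3.1.1 QoS 0 PUBLISH packet by hand; in practice a client library would do this for you, and the topic name here is invented:

```php
<?php
// Minimal MQTT 3.1.1 QoS 0 PUBLISH frame: 1-byte fixed header, 1-byte
// remaining length, 2-byte topic length, topic, then raw payload.
// Assumes topic + payload stay small enough for a single-byte
// remaining-length encoding (under ~125 bytes).
function mqtt_publish_packet(string $topic, string $payload): string
{
    $variableHeader = pack('n', strlen($topic)) . $topic;
    $remaining = strlen($variableHeader) + strlen($payload);
    return chr(0x30)        // packet type PUBLISH, QoS 0, no retain
         . chr($remaining)  // remaining length (single-byte encoding)
         . $variableHeader
         . $payload;
}

// A 4-digit reading on topic "sensors/teensy" costs only 22 bytes in
// total, versus a full set of HTTP request headers per data point.
echo bin2hex(mqtt_publish_packet('sensors/teensy', '1234')), "\n";
```

Over a single long-lived TCP connection to a broker, each data point is one such frame with no per-request response round-trip, which is exactly the latency profile the question is after.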
I somewhat got it to work using some of the existing code above, but it is still not optimal. It might be useful for others.
This is what I did:
1) I socket dial only once (during initialization)
2) The POST section now runs inside an infinite loop. The 5-second delay is reduced to 200 ms, and I added some headers, like so:
//unsigned long data = random(1000000000000000, 9999999999999999);
LTESerial.print("POST /index.php?data=");
LTESerial.print(data);
LTESerial.print(" HTTP/1.1\r\n");
LTESerial.print("Host: ADDRESS\r\n");
LTESerial.print("Connection: keep-alive\r\n\r\n");
delay(200);
while (getResponse() > 0);
3) Turns out my WAMP server (PHP) had default limits on the maximum number of HTTP requests, timeouts and the like. I had to increase these numbers (I changed them to unlimited) in php.ini.
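For anyone hunting for the same limits: on a stock Apache setup the caps on one persistent connection usually live in httpd.conf rather than php.ini. A sketch of the relevant directives (values illustrative):

```apacheconf
# httpd.conf -- keep-alive limits on one persistent connection.
# MaxKeepAliveRequests 0 means unlimited requests per connection;
# KeepAliveTimeout is how many seconds Apache waits for the next request.
KeepAlive On
MaxKeepAliveRequests 0
KeepAliveTimeout 15
```

With the default MaxKeepAliveRequests of 100 in some builds, Apache closes the connection after a fixed number of requests, which would look exactly like a sudden NO CARRIER from the module's side.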
However, while I am able to "continuously" send data to my server, a delay of 200 ms is still a lot. I would like to see something close to serial communication, if possible.
Also, when looking at the serial monitor, I get:
[...]
408295030
4238727231
3091191349
2815507344
----------->(THEN SUDDENLY)<------------
HTTP/1.1 200 OK
Date: Thu, 02 Jun 2
2900442411
016 19:29:41 GMT
Server: Apache/2.4.17 (Win32) PHP/5.6.15
X-P16
3817418772
Keep-Alive: timeout=5
Connection: Keep-Alive
Content-Type: te
86026031
HTTP/1.1 200 OK
Date: Thu, 02 Jun 2016 19:29:4
3139838298
75272508
[...]
----------->(After 330 iterations/POSTs, I get)<------------
NO CARRIER
NO CARRIER
NO CARRIER
NO CARRIER
So my question is:
1) How do I eliminate the 200 ms delay as well?
2) If my data-points have different sizes, the delay will have to change as well. How to do this dynamically?
3) Why does it stop at 330-ish iterations? This doesn't happen if data is only 4 digits.
4) Why do I suddenly get responses from the server?
I hope someone can use this for their own project, however this does not suffice for mine. Any ideas?
I've got a function that takes a very long time to execute (downloading some external images, in my case) and I want to avoid an execution-time-exceeded error.
Is there any way to avoid this (for example, by splitting the download of individual images into separate PHP 'threads' or something like that)?
I cannot change the execution time limit or any of the ini settings.
I'm not able to use cron jobs either, as this will be used in a WordPress theme and I can't control the end user's platform.
One possibility is to make a PHP script that downloads one external image, and call that script via Ajax. You can then build a JavaScript user interface that calls this PHP script for each image, one by one. It could show a progress bar based on how many images have already been downloaded.
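A sketch of such a per-image script (the function name and parameters are illustrative; in a WordPress theme you would typically route the Ajax call through admin-ajax.php):

```php
<?php
// Hypothetical download-one-image endpoint: each Ajax call fetches a
// single image, so no single request approaches the execution time limit.
function downloadOne(string $url, string $destDir): string
{
    $name = basename((string)parse_url($url, PHP_URL_PATH));
    $data = file_get_contents($url);   // one image per request keeps runtime short
    if ($data === false || $name === '') {
        throw new RuntimeException("fetch failed: $url");
    }
    $dest = $destDir . '/' . $name;
    file_put_contents($dest, $data);
    return $dest;
}
```

The JavaScript side then simply calls this endpoint once per image URL and advances the progress bar after each response.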
Yes, you can. But you will have to host or proxy the images you want to download in chunks if the remote server does not support partial (Range) downloads.
Then make your PHP script request the image from the server chunk by chunk:
Request
GET /proxy/?url=http://example2.com/myimage.jpg HTTP/1.1
Host: www.example.com
Range: bytes=200-1000
Response
HTTP/1.1 206 Partial Content
Date: Tue, 17 Feb 2015 10:50:59 GMT
Accept-ranges: bytes
Content-range: bytes 200-1000/6401
Content-type: image/jpeg
Content-length: 800
You then have many ways to call your PHP script enough times to fetch all the chunks: automatically refreshing the page, Ajax requests, and so on.
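The chunking itself is just arithmetic over the Range header; a sketch using the sizes from the example above (a 6401-byte image fetched in 800-byte chunks):

```php
<?php
// Compute the sequence of Range headers needed to fetch a resource of
// $size bytes in $chunk-byte pieces. Byte ranges are inclusive on both
// ends, per RFC 7233.
function rangeHeaders(int $size, int $chunk): array
{
    $ranges = [];
    for ($start = 0; $start < $size; $start += $chunk) {
        $end = min($start + $chunk - 1, $size - 1);
        $ranges[] = "bytes=$start-$end";
    }
    return $ranges;
}

// 6401 bytes in 800-byte chunks -> 9 requests, the last one a single byte
echo implode("\n", rangeHeaders(6401, 800)), "\n";
```

Each element becomes the Range header of one request, and the proxy's 206 Partial Content responses are concatenated client-side to rebuild the file.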
Background
Part of my application's responsibility is handling requests for static resources (CSS, JavaScript, images) in a controlled manner. Based on some application logic, it will return one from a selection of different files that might be served on that URL at different times and to different users. These are therefore static files, but delivered in a dynamic way.
The application is based on Symfony Components and the serving of these static-ish files is handled by the BinaryFileResponse class.
The bootstrap code calls the trustXSendfileTypeHeader method:
\Symfony\Component\HttpFoundation\BinaryFileResponse::trustXSendfileTypeHeader();
The application uses some internal logic based on configuration and the detection and use of apache_get_modules() to determine availability. If XSendfile is available and the configuration says to use it, it sets the X-Sendfile-Type header:
if ($useHeader === true) {
$request->headers->set('X-Sendfile-Type', $header);
}
$response = new BinaryFileResponse($filename);
Problem
When I run this with the configuration set to never use XSendfile, or through the PHP built-in web server, which obviously does not support XSendfile, everything is perfect.
When I utilise XSendfile, it also works -- most of the time.
Every so often, typically if I press F5 3-4 times in quick succession, "something" wigs out and I get a garbled response. For example, this is supposed to be a JavaScript file (copied from the "Response" tab under "Net" in Firebug):
hxYîãx��HTTP/1.1 200 OK Date: Tue, 05 Feb 2013 14:49:10 GMT Server:
Apache/2.2.22 (Ubuntu) X-Powered-By: PHP/5.4.6-1ubuntu1.1
Cache-Control: public Last-Modified: Tue, 29 Jan 2013 13:33:23 GMT
Accept-Ranges: bytes Content-Transfer-Encoding: binary ETag:
"10426f-9f6-0" Vary: Accept-Encoding Content-Encoding: gzip
Content-Length: 1011 Keep-Alive: timeout=5, max=98 Connection:
Keep-Alive Content-Type: application/javascript
������VmoÛ6þ,ÿkÀ²ãIý°~q [Üt]
XÑt¶H¤#Rv¼Àÿ}w(YSÀØ2yïå¹*¾Á>¯¥¥,è) Æ^Ât¸BaÆ\éjgäjí
Î&ð*¸Åí¸tY!³Ç$Óe"jÞ![#,n®®oï®A¨þ¸þù××Þ©¼¼ôÇêÚd¹49mv°ÔrtBÖ^;WÍÓÔg´Y¥´FéôÁR9o°35Îà^º´N=UÐèEµ¢XE¸íÒ%ª°¨Úò7¬KñT¾{;£ÈrTnß³étUè{QÀçÍn·:'üJëQÍÄËZeNjOàyÕÁ:#3wö~4Òét1ù$µeN)RD|
¶FTØJ·ß½¥¨¸õGç >9TyÜxzgl-J:) b«9ûAQ½KXÉ!yÐÓ]
óÆÎ#W¡?¢vún·7j©ÿ¢ðõÖGEÁy\ºp¤÷cKxf?ï*¼Éç0^ïîÌÇ°ñDQ¸mYJ|4t¾ñæËÛ¯Å
¨6:çøp(}þÑò|LÂ;Õ(#v¹* /[¨U|xª
æ]ÍyìjµòÛ¯p?4sI¥"v÷ôp|uQ4ò4&Ï·$eÒc¸ xo%7Ôi´2ñx;TuÙj23 áÊ%ħ¿¹lÌwÀS.&ÏØß7¸}ó
ZXzå k2'Zdùè
�¦ºû-Ù[Ó²ÿU(¯¤¥=pÃjô¾ç]]Øhhô²×ÙãÚÍ4¨[!Õ}'Òþ^Ð�ûxÿ#+ÚVÞ~áÌáy?d
aíD¹·U×ÃÚ] õ5íÃø¨o÷ÂAvUÆmÍaày`¦ä©A?mL[-}®(ÿË
d°öò¬}Ç¢³Çp1À^6%0 hTô^ts´ÞíWô
fO¶ö¢ÎNÜæ·HîUôÔ¶±ÌCµsxh.9åçi Û·_ÈÞØ_ÄãY_Ö}G<ì°ý2wÔ¿aw8/þù\ã±þ"0C
oÂh'tE¶À¤¥7I½éßRt.s?á^d|k/Æ)wRw÷cG¿<Þ
¼´°/^ø*ʤAVZ×y¿zÅΪ¥[²Õ1ò_Vµæï_YXÁÕö ��YXÁÕö ��
Note the presence of the headers in the response body, and the rest of it, which is clearly not JavaScript. There are also some spurious characters at the start, which is possibly what leads to the headers being pushed into the body. I have tried to determine whether this content is the result of gzipping, but I can't confirm that yet. (See also the update below.)
Question
Firstly, is BinaryFileResponse even the correct class to use for serving text (non-binary) files? The documentation for the class only says "BinaryFileResponse represents an HTTP response delivering a file." This isn't very detailed but it doesn't say anything about it being exclusively for "binary" files. However the name has its own implications, why didn't Fabien just call this class FileResponse?
Secondly, and more importantly, what could be causing this? I don't believe it is a browser issue because it is repeatable in both Firefox and Chrome. Is this a bug in the XSendfile module, or in the BinaryFileResponse class perhaps? (I am inclined to think it is not the former, because I have used it before in a more "raw" way, not via Symfony Components, with no such issues.)
Has anyone else experienced this? Any idea where I should even start looking to track this down? I've looked at the BinaryFileResponse source code, but it doesn't really do much with XSendfile: it just sets the relevant header and prevents content in the response body, from what I can see.
Update
I've just noticed a couple of things about these garbled responses:
There are no actual headers being sent at all, i.e. on the "Headers" tab in Firebug, for the garbled responses, it only lists Request headers and doesn't even show the heading for Response headers.
Even if I set some custom header on the Response in PHP, that header does not appear at all in the garbled responses (as a header or in the response body), but the custom headers appear correctly for the responses that aren't broken.
First, let me say that I don't have any experience with this Apache module, but I'll try to guide you through a general error deduction:
You should check whether you can reproduce it more reliably. While a web browser might be OK for trying it out, you should go for something like curl and make the request multiple times, for example using a bash for loop:
for i in `seq 1 5`; do curl -v http://localhost/xsendfile-url; done
The fact that the Connection: Keep-Alive header is set and that there are some weird characters before the actual HTTP header lead me to believe that you won't be able to reproduce this problem with separated curl calls, because it will open a fresh connection each time. So try this to check if that gives you the weird behavior (curl has keep alive on by default):
curl -v http://localhost/xsendfile-url http://localhost/xsendfile-url http://localhost/xsendfile-url
Using this, you could go to the project's GitHub issue page and report your findings. Most probably they will be able to tell you there why mod_xsendfile is behaving the way it is, or confirm that you have found a bug.
Our web application has version numbers that are served out to the client on each request so we can detect an update to the code (i.e. rolling updates) and display a popup informing them to reload to take advantage of the latest update.
But I'm experiencing some weird behaviour after the version number is updated on the server: some requests return the new version number and some return the old, so the popup keeps popping up until you have reloaded the page a few times.
Originally I suspected that apache was caching files it read off disk via file_get_contents, so instead of storing the version number in a plain text file, I now store it in a PHP file that gets included with each request; but I'm experiencing the exact same issue!
Does anyone have any ideas what might be causing apache or PHP itself to serve out old information after I have done an update?
EDIT: I have confirmed it's not browser caching, as I can have the client generate unique URLs to the server (which it can deal with via rewrite) and I still see the same issue where some requests return the old version number and some the new, and clearing the browser cache doesn't help.
EDIT 2: The response headers as requested
HTTP/1.1 200 OK
Date: Mon, 23 Jul 2012 16:50:53 GMT
Server: Apache/2.2.14 (Ubuntu)
X-Powered-By: PHP/5.3.2-1ubuntu4.7
Cache-Control: no-cache, must-revalidate
Pragma: no-cache
Expires: Sat, 26 Jul 1997 05:00:00 GMT
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 500
Connection: close
Content-Type: text/html
EDIT 3: Trying to reproduce this to capture the response headers, I found I could only make it happen by going through our full deploy process, which involves creating versioned folders storing the code and symlinking the relevant folder into the webroot. Just changing the version number wasn't enough to trigger it! So it seems to be somehow related to the symlinks I create!
I have the same problem when there is a change in the symlink. Have a look at https://bugs.php.net/bug.php?id=36555; it may be what you are looking for.
Try (as said in that bug report) setting realpath_cache_size to 0.
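A sketch of the relevant php.ini lines, per that bug report (disabling the cache trades a little performance for always re-resolving the symlink):

```ini
; php.ini -- make PHP re-resolve paths (and thus symlinks) on every request
realpath_cache_size = 0
realpath_cache_ttl = 0
```

PHP normally caches the result of resolving a path to its real location, so after the deploy script re-points the symlink, some worker processes keep serving files from the old target until their cache entries expire; that is exactly the mixed old/new behaviour described above.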
I am developing an application using CodeIgniter/MySQL. Last night I stored a title into my database
"HTML5′s placeholder Attribute".
After storing when I retrieve from database for display it shows some strange characters like this:
"HTML5â?²s placeholder Attribute".
How I can avoid these strange characters?
You probably just need to make sure both that the database table you are storing the data in is set to store UTF-8, and that the HTML page that displays the data is explicitly set to UTF-8 encoding.
Your example application URL (seekphp.com/look/phpquery-jquery-port-to-php/1758) shows (via firebug for firefox):
Response Headers
Date Sat, 14 Jan 2012 06:26:31 GMT
Server Apache/2.2.19 (Unix) mod_ssl/2.2.19 OpenSSL/0.9.8e-fips-rhel5 mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/5.0.2.2635
X-Powered-By PHP/5.2.17
Keep-Alive timeout=5, max=100
Connection Keep-Alive
Transfer-Encoding chunked
Content-Type text/html
but properly UTF-8 encoded output would show the last line as
Content-Type text/html; charset=UTF-8
You can encode your HTML through outputting a meta tag in the document HEAD section:
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
or you can have PHP set this in a header:
header('Content-Type: text/html; charset=utf-8');
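To see why the wrong charset produces exactly this kind of garbling, here is a small sketch (assuming the mbstring extension): the three UTF-8 bytes of the prime character (U+2032, the ′ in "HTML5′s") re-read as Latin-1 turn into a three-character sequence much like the â?² from the question:

```php
<?php
// The title as correct UTF-8: "HTML5" + U+2032 (prime) + "s ..."
$title = "HTML5\u{2032}s placeholder Attribute";

// Misinterpret those UTF-8 bytes as Latin-1 (what a page or database
// connection without an explicit UTF-8 charset effectively does),
// then re-encode the result:
$garbled = mb_convert_encoding($title, 'UTF-8', 'ISO-8859-1');

echo $garbled, "\n";  // the single prime becomes a 3-character mojibake run
```

The middle byte of the sequence is a control character, which many viewers render as ?, giving exactly the "â?²" seen in the question.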
I had a similar problem when I was copying data from a text editor and pasting it into phpMyAdmin. If you're doing this, your text editor might be using a different encoding. I suggest you copy the data into a simple text editor like Notepad and manually replace the apostrophes: replace ‘ with ' and it should work fine.
I have an issue related to the UTF-8 charset. I've been all around the web (well, not entirely) for quite a while now, and the best advice was, and still is, to set the header charset to "UTF-8".
However, I was developing my web application locally on my machine using XAMPP (and sometimes WAMP, so as to get a second opinion when it came to debugging my code). Everything was working great =). But as soon as I uploaded it online, the result was not all that jazzy (the kind of errors you would get if you had set the headers to a different charset like "iso-8859-1").
Every header in my code has UTF-8 as the default charset, but I still got the same "hieroglyphic thingies". Then you guys gave me the idea that the issue wasn't my code but the PHP setup that was running it. Turns out my local machine was running PHP 5.5 and the cPanel where I had uploaded my web application was running native PHP 5.3.
Well, when I changed the version of PHP that my cPanel used by default to PHP 5.5, believe you me guys =) it worked like a charm, just as if I were right there at the localhost of my machine.
NOTE: Please, if you've got the same problem as I did, just make sure your PHP is version 5.5. I'm posting this because I feel you guys. Cheers!