For the last two days we've been going over this problem for hours at a time, and we can't find any clues.
Here's what's happening: we have a Flash application that allows people to place orders. Users configure a product, and an image of that product is generated by Flash on the fly and presented to the user. When satisfied, they can send the order to the server. A byte array of the image and some other variables are sent to the server, which processes the order and generates a PDF with a summary of the order and the image of the product. The order script then sends everything back to the browser.
This all works really well, except in Safari on OS X 10.4. Occasionally the order comes through, but most of the time Safari hangs. The Activity window in Safari states that it's waiting for the order script and that it's "0 bytes of ?".
We thought there was something wrong with the server so we've tried several other servers but the problem persists.
Initially we used a simple POST to process the order but, in an effort to solve this problem, we resorted to more sophisticated methods such as Flash remoting via AMFPHP. This didn't solve the problem either.
We use Charles to monitor the HTTP traffic and figure out whether the requests are leaving the browser at all, but the strange thing is that when Charles is running, we can't reproduce the problem.
I hope someone has a clue about what's happening, because we can't figure it out.
Just a wild guess:
Is getting the PDF back the result of one HTTP request that both sends all the needed data to the server and gets the PDF as a result? Otherwise this could be a timing issue: are you sure all the data is available on the server at the moment the PDF is requested? The number of allowed parallel connections to a website is not the same for all browser brands and versions, and maybe that could influence the likelihood of a 'clash' happening.
An easy test: introduce a delay between sending the data to the server and retrieving the PDF, and see if that has any effect.
Problem
When you load a page on my site, often one (or several) images (JPG files that I have saved from Lightroom and/or Photoshop) will not appear. Instead it looks like a broken link (the ALT description appears) but no image. A hard reload of the browser solves the problem (i.e. all images load properly after a hard reload).
Error Message
Chrome displays an "ERR_CONTENT_LENGTH_MISMATCH" warning for every image it does not load. (Sometimes the image will flash quickly before turning into what looks like a dead image.)
Setup
Running the latest version of WordPress (4.2.2) on a shared host. The site is SSL (HTTPS), if that matters. Images are located in an image upload folder (nothing complex like ImageMagick, etc.) on the host.
My Troubleshooting
I have replicated the issue from multiple locations using various ISPs on various machines (both Mac and PC) and with various browsers (Chrome and Safari), some of which are not using any ad blockers.
What I've tried is the following:
I asked the host if there was an issue on the server side. They claim no.
I've tried resetting the functions.php file. No impact.
I've disabled all plug-ins. No impact.
I've hardkeyed in the meta charset as UTF-8. No impact.
Checked whether I'm using gzip. I am not.
Enabled a WordPress cache plugin. No impact.
Cleared .htaccess of all non-necessary redirects & commands. No impact.
Replaced the wp-admin and wp-includes folders from a fresh install. No impact.
Deleted WordPress & reinstalled from a backup. No dice.
I've put the source code from pages that have this issue into a test.html file, and the images seem to load fine that way.
My Thoughts & Questions
The images are 100-200 KB each, and sometimes there are a fair number of them on a page. Is something timing out, and then once I hard reload, everything shows up because the timeout isn't tripped? That is the best guess I can come up with without understanding the issue.
Any ideas of things I can try? Should I delete the whole database and start again? Everything I know about computers is self-taught and server issues are not a strong point for me. Even if you don't know what it might be, could someone explain what a content length mismatch is in general terms?
Thanks much!
When you request data from a web server, it responds first with some information about the data (HTTP headers) and then with the data itself. One of these headers is called Content-Length. It tells the client how much data it should expect to receive from the server. When your browser gets an image, the server's response (very simplified) looks like:
Content-Length: 100000
< the image, 100000 bytes of data >
The client knows the request is complete when it has received the amount of data announced by Content-Length. Until it has received, in this case, 100000 bytes (about 100 KB), it considers the image not done loading.
If the server closes the connection before the client has received the announced amount of data, or if the client receives more data than was announced, the client will throw some sort of error, assume the data is corrupted or unusable, and dispose of it. How this is handled varies between browsers.
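As a rough sketch of that exchange (buildResponse() is a made-up helper, not real server code):

```php
<?php
// Minimal sketch of the exchange described above. It just shows that
// Content-Length must equal the byte length of the body that follows.
function buildResponse(string $body): string
{
    // strlen() counts bytes in PHP, which is exactly what
    // Content-Length is defined over.
    return "HTTP/1.1 200 OK\r\n"
         . "Content-Type: image/jpeg\r\n"
         . "Content-Length: " . strlen($body) . "\r\n"
         . "\r\n"
         . $body;
}

$image = str_repeat('x', 100000);   // stand-in for 100000 image bytes
$response = buildResponse($image);
// If the connection drops before all 100000 body bytes arrive, the
// browser reports something like ERR_CONTENT_LENGTH_MISMATCH.
```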
How did you upload the images to your website? I have encountered this problem myself in a situation where the file's supposed size was stored in a database, and that value was used to set the Content-Length header; the size in the DB wasn't correct for the file. HOWEVER, as far as I know, WordPress does not store file sizes in the database; media uploads are simply represented by a URL.
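If you wanted to rule that cause out, a check along these lines would do it (contentLengthMatches() is a hypothetical helper, not WordPress code):

```php
<?php
// Hypothetical check for the cause described above: a size stored in a
// database being used for the Content-Length header. If it disagrees
// with the file on disk, every download of that file will mismatch.
function contentLengthMatches(int $storedSize, string $path): bool
{
    clearstatcache(true, $path);   // don't trust a cached stat result
    return filesize($path) === $storedSize;
}

// The safer pattern is to ignore any stored value and measure the file:
//     header('Content-Length: ' . filesize($path));
```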
This could also happen if the web server runs out of resources and can no longer fulfill your requests; you said you have lots of images per page. If you are on a really lousy shared hosting plan, the host may impose limits, or the server may simply not be able to handle the traffic of all the sites it hosts.
I wanted to circle back on this in case someone else is experiencing the problem. It appears that some kind of glitch between HTTPS and image retrieval was causing it. While I don't understand WHY that is, I converted my site from SSL/HTTPS to plain HTTP (which I was able to do, as it doesn't require encryption), and it appears all images now load as they should.
If someone understands the "why", I'd love to know what the issue actually is. Luckily, I was able to come up with a workaround. So, while this doesn't answer my question, it does provide context on what is causing the problem and my common-sense workaround.
You might see this problem with a shared hosting service. Free bandwidth is like free speech, not free beer: shared hosts invoke resource outage policies during traffic spikes.
A distributed system architecture solves this by inserting a front-end CDN tier (eg. CloudFlare). CDNs cache your static resources and can vastly reduce the load on your host. In fact, for completely static sites the host can be shut down.
There are other advantages to CDNs, like attack detection, free SSL (not beer) and overall improved performance and security compared to shared hosting alone.
Many CDNs are free (as in speech). You could also upgrade to private hosting, but that costs money, and you still might want a front-end tier.
I am currently writing an iOS application that uses push messaging. When I generate the proper URL to pass in the proper parameters from the iOS device, nothing happens, but if I manually type the exact same URL into the address bar of my browser, the script performs normally. It has to open connections with the Apple server and then send the proper credentials, and the timing for this seems proper.
So, going from my browser to the PHP server and then to Apple, everything works; but when sending the same URL from my iOS device to the PHP server and then to Apple, nothing happens. Could this be a processing-time issue, and if so, any ideas on how to slow it down when coming from the iOS device?
I think I found an answer to my issue. The format here is in response to HALFNER, who downgraded my question because he thought it was not submitted in an acceptable way, even though it was fully accepted by Stack Overflow. Doing this simply wasted my time by making it less likely to get responses.
Here is the original:
=======================
How to slow down PHP script process?
Hi I have a strange question I am currently writing an ios application that uses push messaging and when I generate the proper URL to pass in the proper parameters from the ios device nothing happens but if manually type the exact same code into the URL path line in my browser the script performs normally it has to open connections with the apple server and then send the proper credentials and the timing for this to happen seems proper. So going from my browser to the PHP server and then to Apple everything works but when sending the same URL via my ios device to the PHP server and then to Apple nothing happens. Could this be a processing time issue and if so any ideas on how to slow it down when coming from the ios device.
=====================
I think that overall it was readable and understandable to people, and that was the goal, after all.
The less-than-perfect syntax did not detract from that.
I agree that the syntax could have been improved, but the downgrading was not needed.
I was not going for a Pulitzer Prize, after all, but trying to get a strange issue resolved.
This could be the resolution, as it did at least allow the PHP script to process properly.
What I found is that, when communicating anything back to the iOS device, any whitespace characters sent to what would normally be the screen of a web browser (but in this case were sent to the phone) would cause it to fail. I thought I had removed them all, but I found an echo command echoing a carriage return (easily missed, as it is invisible); once that was eliminated, the script processed fine from the device.
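A sketch of that fix (respond() is a made-up helper that trims defensively before anything is echoed):

```php
<?php
// Sketch of the fix described above. Any byte echoed outside the
// intended payload reaches the device, and an invisible carriage
// return is enough to break a client expecting an exact response.
function respond(string $payload): string
{
    return trim($payload);   // strips stray spaces, \r and \n
}

echo respond("true\r\n");    // the device receives exactly "true"

// Also worth doing: omit the closing ?> tag in pure-PHP files, so
// trailing whitespace after it can never be sent by accident.
```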
I'm offering large downloads from my server of around 6.5 GB, using PHP and X-Sendfile.
This has been working fine for months; however, recently the downloads have been ending prematurely for users, at anywhere from 3 GB to 5.5 GB.
No errors are shown in the browsers (I tried Chrome, Firefox and IE); they behave as if the downloads have completed, but the file size is smaller than it should be.
From my limited understanding this usually happens because no data is sent for a certain period of time and the browser mistakes this for the download having completed.
However I am struggling to find any specific information about how browsers behave in this regard.
I have contacted my host to see if they can help identify the problem at their end (perhaps the connection is being interrupted or something), but they insist it must be an issue with my server's configuration, even though nothing changed in the server's setup around the time this began.
Now I am at a complete loss as to how to investigate this. The point at which the download stops is completely random: different file sizes, different amounts of time taken, and no errors in the server logs or in the browser.
Can anyone offer any suggestions on how to investigate this or has anyone had any similar problems before and found a fix?
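One thing worth checking is whether the responses carry an explicit Content-Length and advertise range support, so a short download is at least detectable and resumable by the client. A hedged sketch (path, filename and the buildDownloadHeaders() helper are placeholders; with mod_xsendfile, Apache normally fills in the length itself, so treat this as a diagnostic, not a fix):

```php
<?php
// Sketch: building the headers for an X-Sendfile download so the
// browser knows the exact expected size and may resume with a Range
// request if the connection drops.
function buildDownloadHeaders(string $path, int $size, string $name): array
{
    return [
        'X-Sendfile: ' . $path,          // Apache/mod_xsendfile handoff
        'Content-Type: application/octet-stream',
        'Content-Length: ' . $size,      // lets clients detect truncation
        'Accept-Ranges: bytes',          // lets clients resume
        'Content-Disposition: attachment; filename="' . $name . '"',
    ];
}

// foreach (buildDownloadHeaders('/data/big.bin', 6979321856, 'big.bin') as $h) {
//     header($h);
// }
```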
I have a really weird behavior going on.
I'm hosting tracking software for users that mainly logs mobile traffic. The path is as follows:
1. My client gets a PHP code snippet to put on his website.
2. This code sends a cURL POST (based on predefined POST fields like visitor IP, user agent, host, etc.) to my server.
3. My server logs the data and decides what the risk level is.
4. It then responds to the client's server with the status; that is, it sends "true" or "false" back to the client's server.
5. The client's server gets that response and decides what to do (load different HTML content, redirect, block the visitor, etc.).
The problem I'm facing is that, for some reason, all the requests made from my clients' servers to my server are recorded and stored in a log file, yet my clients report click loss, as if my server sends back the response but their server fails to receive it.
I should note that there are tons of requests every minute, from different clients' servers and from each client himself.
Could the reason be related to CURLOPT_RETURNTRANSFER not getting any response? Or maybe the problem is cURL overload?
I really have no idea. My server is pretty fast and uses only 10% of its resources.
Thanks in advance for your thoughts.
You've touched on a very problematic domain: high-load servers. Your problem can be in so many places that you will really have to spend time to fix it, or at least partially fix it.
First of all, you should understand what is really going on; check out this simplified scheme:
The client's PHP code tries to open a connection to your server; to do this it sends some data over the network.
Your server (I suppose Apache) tries to accept it, if it has the resources; check the max-connections settings in your Apache config.
If the server can accept the connection, it tries to create a new thread (or take one from a thread pool).
Once the thread is started, it runs your PHP script.
Your PHP script does some work, connects to the DB, and sends the response back over the network.
The client waits for the answer from the previous step, or closes the connection because of a timeout.
So, at each point you can have bottleneck:
Network bandwidth
Max opened connections
Thread pool size
Script execution time
Max database connections, table locks, I/O wait times
Client timeouts
And this is not a full list of places where the problem can occur and finally lead to an empty cURL response.
From the very start, I suggest you add logging to both sets of PHP code (the client's and the server's) and store all curl_error() output in a text file; at the very least you will see which problems occur often.
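On the client side, that logging could look something like this (the tracker URL, timeouts, log path and the fail-safe return value are all placeholders for your own setup):

```php
<?php
// Sketch of the client-side logging suggested above. Timeouts keep a
// slow tracker from stalling the client's page, and every cURL failure
// is recorded so recurring failure modes become visible.
function trackVisitor(array $fields): string
{
    $ch = curl_init('https://tracker.example.com/log.php');
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($fields));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return body, don't echo it
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 2);    // fail fast on connect
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);           // and on the whole request

    $response = curl_exec($ch);
    if ($response === false) {
        // Record the exact failure mode so patterns become visible.
        error_log(date('c') . ' curl: ' . curl_error($ch) . PHP_EOL,
                  3, sys_get_temp_dir() . '/tracker_client.log');
        $response = 'false';   // what to do when the tracker is unreachable is your call
    }
    curl_close($ch);
    return $response;
}
```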
I'm trying to stream MP4 files through Apache / Nginx using a PHP proxy for authentication. I've implemented byte ranges to stream for iOS as outlined here: http://mobiforge.com/developing/story/content-delivery-mobile-devices. This works perfectly fine in Chrome and Safari, but... the really odd thing is that if I monitor the server requests to the PHP page, three of them occur per page load in a browser. Here's a screenshot of Chrome's inspector (going directly to the PHP proxy page):
As you can see, the first one gets canceled, the second remains pending, and the third works. Again, the file plays in the browser. I've tried alternate methods of reading the file (readfile, fgets, fread, etc.) with the same results. What is causing these three requests, and how can I get a single working request?
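For reference, the Range negotiation behind those repeated requests can be sketched like this (parseRange() is a made-up helper covering only the two most common header forms, not the full spec):

```php
<?php
// Sketch of the byte-range negotiation behind those requests: browsers
// probe a streaming endpoint with Range headers before playing.
// Handles only the "bytes=a-b" and "bytes=a-" forms.
function parseRange(string $header, int $fileSize): ?array
{
    if (!preg_match('/^bytes=(\d+)-(\d*)$/', $header, $m)) {
        return null;                       // unsupported or malformed
    }
    $start = (int) $m[1];
    $end   = ($m[2] === '') ? $fileSize - 1 : (int) $m[2];
    if ($start > $end || $end >= $fileSize) {
        return null;                       // unsatisfiable: answer 416
    }
    return [$start, $end];                 // answer 206 with this slice
}
```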
The first request is for the first range of bytes, preloading the file. The browser cancels the request once it has downloaded the specified amount.
The second one I'm not sure about...
The third one is when you actually start playing the media file, it requests and downloads the full thing.
I'm not sure whether this answers your question, but serving large binary files with PHP isn't the right thing to do.
It's better to let PHP handle authentication only and pass the file reference to the web server to serve, freeing up resources.
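That handoff can be sketched as follows, assuming Nginx with an internal location (the location name, auth check and streamDecision() helper are placeholders; the Apache equivalent header is X-Sendfile):

```php
<?php
// Sketch: PHP only decides yes/no and emits a redirect header; the web
// server streams the bytes from an internal location, so no
// readfile()/fread() loop ties up a PHP process.
function streamDecision(bool $authorized, string $internalPath): array
{
    if (!$authorized) {
        return ['status' => 403];              // deny: no file handoff
    }
    return [
        'status'           => 200,
        'Content-Type'     => 'video/mp4',
        'X-Accel-Redirect' => $internalPath,   // Apache equivalent: X-Sendfile
    ];
}

// Usage: check the session, emit these headers, and exit; Nginx needs a
// matching "location /protected/ { internal; ... }" block.
// $d = streamDecision(!empty($_SESSION['user_id']), '/protected/movie.mp4');
```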
See also: Caching HTTP responses when they are dynamically created by PHP
It describes in more detail what I would recommend doing.