I am developing a Laravel website and I ran into a very stressful error.
Yesterday my website was running smoothly, but today, after several form submissions, it returns:
Bad Request
Your browser sent a request that this server could not understand.
Size of a request header field exceeds server limit.
I noticed that there are lots of cookies generated from localhost.
If I delete one of those cookies, I can open my website again.
I have done several checks on the JavaScript and all my other code but couldn't find anything wrong.
What could possibly be wrong? Is it my code? My web server? Something else?
It turns out that I was accidentally applying the 'web' middleware twice.
Just delete one of them and it finally works perfectly.
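For context, a sketch of what that mistake can look like in a recent Laravel app (route paths and controller names are illustrative). Routes in routes/web.php already receive the 'web' middleware group from the RouteServiceProvider, so wrapping them in the group again runs the session/cookie middleware twice per request, and the duplicated cookies can eventually push the request headers over the server's limit:

```php
<?php
// routes/web.php — everything in this file ALREADY runs through the
// 'web' middleware group (applied by the RouteServiceProvider), so
// this wrapper applies the group a second time:
Route::middleware(['web'])->group(function () {  // <- delete this wrapper
    Route::get('/', [HomeController::class, 'index']);
    Route::post('/submit', [FormController::class, 'store']);
});
```

With the wrapper removed, the routes still get the 'web' group exactly once, and the cookie pile-up stops.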
Related
I'm maintaining an old CodeIgniter web app.
Lately, when certain forms are saved, the browser shows a permanent "Pending" status for the request. The request never completes (unless I reboot the web server, in which case it errors out immediately).
There aren't any client-side errors in Dev Tools.
There aren't any server-side errors in the logs.
This happens in different browsers.
It happens in a Chrome incognito window with no extensions loaded.
All I have to go on is: the submit button is hit, the browser says "waiting for <servername>", and nothing else happens.
Does this look like a network problem? There was no problem loading any of the application's other pages, all of which came out of the same database and web server.
How do I debug a "pending" HTTP request in a browser, when neither the client nor the server shows an error?
TIA
You can rule out a network problem because the other pages are working. If the same site sometimes worked and sometimes didn't, a network problem would be possible, but that isn't very likely here.
The fact that the request is pending and there aren't any errors means that the server accepted the request and is handling it. With no error, the server keeps processing until either an error occurs or the PHP processor returns a response.
That brings me to the assumption that the problem is an endless loop or a long-running function in your server-side code.
Error detection procedure:
First, check that error reporting is enabled in index.php.
Now you could wait until the code hits a timeout and throws an error,
and/or
you could go to config/routes.php, check which controller and function handle the request, and break the function down to test which part takes this long.
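For the first step, a minimal sketch of what to check in a CodeIgniter 3-style index.php (the 30-second limit is just an assumption for debugging, not a production setting):

```php
<?php
// index.php (front controller) — make sure the environment enables errors
define('ENVIRONMENT', 'development');

if (ENVIRONMENT === 'development') {
    error_reporting(E_ALL);
    ini_set('display_errors', '1');
}

// Optional while debugging: shorten the execution limit so a hung
// request dies with a visible timeout error instead of pending forever.
set_time_limit(30);
```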
If you need further help please paste some code.
This is something weird. I am developing an app that checks via AJAX whether the session has timed out, at every page load and on every user action. On top of this, I make other AJAX calls for other user actions.
This works well on my local XAMPP server. However, when I test it on my client's server, a shared host running cPanel, the first AJAX call works and then all subsequent calls fail with error code 500.
Is this familiar to anyone?
PHP has a session lock. Check that your session is not locked after the first request starts it. I once faced this problem in my own PHP MVC framework: I was requesting the URL multiple times with jQuery, and after the first request my session was locked. Check whether that might be your case.
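A sketch of how such an endpoint can release the session lock early with session_write_close(), so concurrent AJAX calls on the same session are not serialized behind (or killed by) the first one. The 30-minute timeout and the last_activity key are assumptions, not part of the original post:

```php
<?php
// Pure helper so the timeout rule is easy to test in isolation.
// $lastActivity is null when the session has no recorded activity yet.
function isTimedOut(?int $lastActivity, int $now, int $limit = 1800): bool
{
    return $lastActivity === null || ($now - $lastActivity) > $limit;
}

session_start();
$timedOut = isTimedOut($_SESSION['last_activity'] ?? null, time());
$_SESSION['last_activity'] = time();
session_write_close(); // releases the session lock; later AJAX requests
                       // no longer have to wait for this one to finish

header('Content-Type: application/json');
echo json_encode(['timed_out' => $timedOut]);
```

The key line is session_write_close(): PHP's default file-based session handler holds an exclusive lock from session_start() until the script ends, which is exactly the behavior the answer describes.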
I have a form that collects data and may take several hours to complete, since it has an editor where users can add creative elements. If the form is submitted after an hour or so, the site redirects to a 404 Not Found ("the URL was not found on this server"). I tested this using a very simple form-processing script that just prints out the POST data and still got the error. This seems to happen only on our Linux server; I have a local WAMP server running the same script, and the POST data comes through fine no matter how long the form sits idle. Any clues as to what I could try changing in the Apache config or on the PHP side? Thanks
I have been running a web service off of a PHP file. For no apparent reason (possibly some settings changes?), it now suddenly does not "notice" posted requests.
When I echo var_dump($_POST), I get a zero-length array. If I post the same request the same way to a different script, it works fine. That script is hosted on a different server.
Do you have any idea what configuration may have been accidentally changed such that my PHP script does not receive the POST data?
Thanks!
I'd start with the rewrites. Rewriting to a full URL (http://...), for example, triggers an external redirect, which causes the POST information to get lost.
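To illustrate the difference, a sketch of two mod_rewrite rules (paths and hostname are hypothetical). A rule whose target is a full URL makes Apache issue an external redirect, and browsers follow the redirect with a GET, dropping the POST body; an internal rewrite keeps the method and body intact:

```apache
# External redirect: the browser re-requests the new URL with GET,
# so the POST body is lost.
RewriteRule ^api/(.*)$ http://example.com/handler.php?q=$1 [L]

# Internal rewrite: same server-side script, but the request method
# and POST body are preserved.
RewriteRule ^api/(.*)$ handler.php?q=$1 [L]
```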
For the last two days we've been going over this problem for several hours, trying to figure out what's going on, and we can't find any clues.
Here's what's happening; We have a Flash application that allows people to place orders. Users configure a product and an image of that product is generated by Flash on the fly and presented to the user. When satisfied, they can send an order to the server. A byte array of the image and some other variables are sent to the server which processes the order and generates a PDF with a summary of the order and the image of the product. The order script then sends everything back to the browser.
This is all going really well, except for Safari on OSX 10.4. Occasionally the order comes through but most of the time Safari hangs. When looking at the Activity window in Safari it states that it's waiting for the order script and that it's "0 bytes of ?".
We thought there was something wrong with the server so we've tried several other servers but the problem persists.
Initially we used a simple POST to process the order but, in an effort to solve this problem, we resorted to more sophisticated methods such as Flash remoting via AMFPHP. This didn't solve the problem either.
We used Charles to monitor the HTTP traffic to figure out whether the requests were leaving the browser at all, but the strange thing is that when Charles is running, we can't reproduce the problem.
I hope someone has a clue about what's happening, because we can't figure it out.
Just a wild guess:
Is getting the PDF back the result of one HTTP request that both sends all the needed data to the server and receives the PDF as the response? Otherwise this could be a timing issue: are you sure all the data is available on the server the moment the PDF is requested? The number of allowed parallel connections to a website is not the same across browser brands/versions, and maybe that could influence the likelihood of a 'clash' happening.
Easy test: introduce a delay between sending the data to the server and retrieving the pdf and see if that has any effect.
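If the two-request design does turn out to be the culprit, one way around it is to handle the order and return the PDF in the same request, so there is no second request to race the first. A sketch, where buildPdf() is a hypothetical stand-in for whatever PDF library actually generates the document:

```php
<?php
// Hypothetical stand-in for the real PDF generator (e.g. FPDF/TCPDF);
// it would render the order summary plus the product image.
function buildPdf(string $imageBytes, array $orderFields): string
{
    return "%PDF-1.4 ..."; // placeholder PDF bytes
}

// Single request: receive the image and order data, build the PDF,
// and stream it straight back in the response.
$image = base64_decode($_POST['image'] ?? '');
$pdf   = buildPdf($image, $_POST);

header('Content-Type: application/pdf');
header('Content-Length: ' . strlen($pdf));
echo $pdf;
```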