The Problem
I'm using jQuery to POST a (relatively) large amount of data to a web system I am migrating from Ubuntu to CentOS (a painful process). The problem is that the data being received is truncated. Sending the same data from the server to the client results in no truncation.
The amount of data being 'sent' (that is, what I'm seeing while debugging the JavaScript) is 116,902 bytes (the correct amount), whereas the amount of data being received is approximately 115,668 bytes. This number seems to vary, making me believe the problem may be time-related. The transaction completes (receive, response) in approximately 3.1 seconds, not a huge amount of time. Are there any settings I should examine?
That idea aside, my PHP installation is configured to accept 8M of POST data and use up to 128M of physical memory, which seems like plenty.
The jQuery code is below. I'm quite sure this isn't the problem, but I've included it as requested.
Receiving:
function synchronise_down()
{
    $.ajax({
        url: "scripts/get_data.php",
        context: document.body,
        dataType: "json",
        type: "POST",
        success: function(result)
        {
            // Fix the state up.
            update_data(result);
            // Execute on synchronise.
            execute_on_synchronise();
        },
        error: function(what, huh)
        {
            IS_WAITING = false;
        }
    });
}
Sending:
function synchronise_up()
{
    var serialised = MIRM_MODEL.serialise();
    LAST_SERIALISED = new Date().getTime();
    $.ajax({
        url: "scripts/save_model.php",
        context: document.body,
        dataType: "json",
        data: {"model": serialised},
        type: "POST",
        success: function(result)
        {
            // Fix the state up.
            update_data(result, true);
            // Execute on synchronise.
            execute_on_synchronise();
        },
        error: function(what, huh)
        {
            IS_WAITING = false;
        }
    });
}
Workaround (Wouldn't call this a solution)
Edit: I've 'fixed' this, but I haven't necessarily found out what the problem is or how to solve it properly. It's an interesting problem, so I'll describe my workaround and leave the question open.
Rather than letting jQuery handle the serialisation of my large data, I'm doing it myself first, essentially serialising twice. The code for this is as follows:
function synchronise_up()
{
    var serialised = JSON.stringify(MIRM_MODEL.serialise());
    LAST_SERIALISED = new Date().getTime();
    $.ajax({
        url: "scripts/save_model.php",
        context: document.body,
        dataType: "json",
        data: {"model": serialised},
        type: "POST",
        success: function(result)
        {
            // Fix the state up.
            update_data(result, true);
            // Execute on synchronise.
            execute_on_synchronise();
        },
        error: function(what, huh)
        {
            IS_WAITING = false;
        }
    });
}
The important line is of course:
var serialised = JSON.stringify(MIRM_MODEL.serialise());
Now, when it hits the server, I need to decode this data because it's been serialised twice. There are added costs with this 'solution': sending more data and doing more work. The question still remains: what's the problem, and what's the real solution?
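On the PHP side, the decode step might look like this minimal sketch (assuming the handler is scripts/save_model.php and the double-encoded payload arrives as $_POST['model'], matching the jQuery call above):

// save_model.php (sketch): the client ran JSON.stringify first, so
// $_POST['model'] arrives as one JSON string rather than a nested array.
$raw = isset($_POST['model']) ? $_POST['model'] : '';
$model = json_decode($raw, true); // undo the second layer of serialisation
if ($model === null) {
    // Either the payload was truncated in transit or it was never valid JSON.
    die(json_encode(array('error' => 'Malformed model payload')));
}

(On PHP installs of that era with magic_quotes_gpc enabled, you may need stripslashes($raw) before decoding.)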
Check the following php.ini variables:
post_max_size
max_input_vars - This might actually be the culprit because it truncates data
Try setting jQuery's ajax timeout parameter to a high number (note, it's in milliseconds, so you'll probably want 10000 which is 10 seconds).
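For example, a sketch based on the save_model.php call above:

$.ajax({
    url: "scripts/save_model.php",
    type: "POST",
    dataType: "json",
    timeout: 10000, // milliseconds, i.e. 10 seconds
    data: {"model": serialised}
});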
Some other options to try:
1. Check that your PHP max execution time is a decent amount. I doubt this would be related, but it's possible.
2. In jQuery's error function, run console.log(xhr) on the XHR result (you'd have to do this in Chrome, or find another method of seeing the result). xhr is an XHR object that contains debug information on what happened with the connection, e.g. status codes, timeout info, etc.
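For example, in the error callback from the question (renaming its parameters for clarity):

error: function(xhr, textStatus)
{
    // Dump the full XHR object: status, readyState, responseText, etc.
    console.log(xhr);
    IS_WAITING = false;
}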
EDIT: Also, have you checked the max size of the field in your database? It's quite possible that the database is automatically truncating the information.
My gut feeling is that it's PHP timeout related; I've never heard of a JavaScript timeout. I have jQuery requests running for 3 or 4 hours that continue to post little updates (like a _SESSION progress bar in PHP... but I digress). Anyway, you HAVE to use Firefox for this; IE doesn't "believe" you when you know what you are doing and times out after about 40 minutes. (Chrome wasn't used by me at the time.)
Actually, come to think of it, you say you are migrating to CentOS; that sounds to me like it HAS to be server related. You are simply looking in the wrong place.
BTW, congrats on CentOS, it's AMAZING! I would do it the easy way and have an entire LAMP CentOS VM just for this app (try not to faff with the vhosts for this, it's very messy) and simply set the whole Apache/PHP install's limits insanely high.
The correct php.ini settings are:
max_input_time (not max_execution_time!)
upload_max_filesize
post_max_size
...and also try:
memory_limit
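For illustration, that section of php.ini might look like this (the values are examples, not recommendations; size them to your actual payload):

; php.ini (example values)
max_input_time = 60        ; seconds allowed for parsing input data
upload_max_filesize = 16M
post_max_size = 16M        ; must be >= upload_max_filesize
memory_limit = 128M        ; should be >= post_max_size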
PHP POST/GET/COOKIE input is limited to 1000 entries by default. Everything above that is ignored. It is the number of entries that counts, not the actual amount of data.
I suggest you edit your php.ini and set the max_input_vars setting to a greater value.
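For example, in php.ini (5000 is an arbitrary illustrative value; size it to your largest form):

max_input_vars = 5000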
Regards.
Related
I make an AJAX call to check the DB for new notifs every 3 or 10 seconds, with the same query, from 4 different browsers at the same time. But at some point, after loop 100+, the server returns Error 508 (Loop Detected). This is just a simple site, so I don't think I need a VPS server.
I added a timestamp in the SELECT as a query differentiator and tried unset, flush, mysqli_free_result, pause, mysqli_kill, and mysqli_close, but the error still occurs. Entry Processes hit 20/20.
Script
var counter = 1;
var notiftimer;

$(document).ready(function() {
    ajax_loadnotifs();
});

function ajax_loadnotifs() {
    $.ajax({
        type: "post",
        url: "service.php",
        dataType: "json",
        data: { action: 'loadnotifs' },
        success: function(data, textStatus, jqXHR) {
            $("div").append($("<p>").text(counter++ + ": succeeded"));
            notiftimer = setTimeout(function() {
                ajax_loadnotifs();
            }, 3000);
        },
        error: function(jqXHR, textStatus, errorThrown) {
            console.log(jqXHR.responseText);
        }
    });
}
service.php
$link = mysqli_connect('localhost', 'root', 'root', 'testdb');
$notifs = array();
$query = "SELECT id, message FROM notifs LIMIT 20";
if (!$temp_notifs = mysqli_query($link, $query)) {
    die(json_encode(array("errmsg" => "Selecting notifs.")));
}
while ($notif = mysqli_fetch_assoc($temp_notifs)) {
    $notifs[] = $notif;
}
mysqli_close($link);
echo json_encode($notifs);
cPanel - Resource Usage Overview
When Entry Processes hits 20/20, I get Error 508. How do I keep the server's Entry Processes low? (Tested with 4 different browsers, running them all until loop 100+ on shared hosting. No issue on my local computer.)
What is considered an Entry Process?
An "Entry Process" is how many PHP scripts you have running at a single time.
Source: https://billing.stablehost.com/knowledgebase/186/What-is-considered-an-Entry-Processes.html
So the underlying problem, as you've found out, is that you are eventually running too many processes at the same time. There are a few things you can do to solve the issue.
Option 1
Find a new web host. This is perhaps the simplest but also the most costly depending on what sort of financial arrangement you have with your current host. Find one that does not have this restriction.
Option 2
Increase the time between ajax requests. Why do you need to request every 3 seconds? That is a very, very short amount of time. What about 15 seconds? Or 30 seconds? Or heck, even 1 minute? Your users probably don't need their data refreshed as often as you think.
Option 3
Only perform the ajax call if the current tab/window is in focus. There's no reason to keep polling for notifications if the user isn't even looking at your page.
Check out Document.hasFocus():
https://developer.mozilla.org/en-US/docs/Web/API/Document/hasFocus
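A minimal sketch of that idea, reusing ajax_loadnotifs and notiftimer from the question (the success handler would call schedule_next_poll() instead of calling setTimeout directly):

function schedule_next_poll() {
    notiftimer = setTimeout(function() {
        if (document.hasFocus()) {
            ajax_loadnotifs();    // visible tab: poll as usual
        } else {
            schedule_next_poll(); // background tab: skip this round
        }
    }, 3000);
}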
Option 4
Implement a caching layer. If you feel like you still need to request data very, very often then improve how quickly you retrieve this data. How you implement caching is up to you but in some cases even using a file write/read can reduce the amount of time and resources needed to fulfill the request.
After you get the notifications from the database, simply save the JSON into a text file and have subsequent requests served from there until the database data changes. See if this improves performance.
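A minimal file-based sketch of that idea, applied to service.php from the question (the cache path and the simple 10-second expiry are illustrative assumptions; invalidating exactly when the data changes needs extra bookkeeping):

$cache_file = '/tmp/notifs_cache.json'; // hypothetical location
$cache_ttl = 10;                        // seconds

if (file_exists($cache_file) && (time() - filemtime($cache_file)) < $cache_ttl) {
    // Cache hit: answer without opening a database connection at all.
    echo file_get_contents($cache_file);
    exit;
}

$link = mysqli_connect('localhost', 'root', 'root', 'testdb');
$notifs = array();
if ($temp_notifs = mysqli_query($link, "SELECT id, message FROM notifs LIMIT 20")) {
    while ($notif = mysqli_fetch_assoc($temp_notifs)) {
        $notifs[] = $notif;
    }
}
mysqli_close($link);

$json = json_encode($notifs);
file_put_contents($cache_file, $json); // refresh the cache for later requests
echo $json;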
If you want to get even more focused on caching you can look at options like Memcached (https://en.wikipedia.org/wiki/Memcached) or Redis (https://en.wikipedia.org/wiki/Redis).
Try combining multiple options for even better performance!
Turns out that using https instead of http, and the AJAX 'get' method instead of 'post', prevented this error.
I am posting data with a size of approximately 200KB to my server running PHP 5.3.3.7.
The data is actually a JavaScript object with nested properties; in the request it looks something like this: myObject[prop1][subprop1][key] = 5.
However, not all of the data is received on the server; the last part of the posted data is cut off. post_max_size is set to 80MB, so that shouldn't be the issue. I have compared the request form data with the data that is accessible through $_POST, and there is a lot of data missing.
What could be causing this?
You said you use PHP 5.3.3, but maybe this is not quite right? Since PHP 5.3.9 there is a new setting, max_input_vars, that limits the number of POST (and GET, and COOKIE, and so on) variables one can pass to a script.
If I am right, then it is enough to adjust it in php.ini, in a VirtualHost definition, or in .htaccess (ini_set will not work, since the POST data has already been trimmed by the time your script starts).
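For example (10000 is an arbitrary illustrative value; pick one that comfortably covers your largest request):

; php.ini
max_input_vars = 10000

# .htaccess (when PHP runs as mod_php) - the per-directory equivalent
php_value max_input_vars 10000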
This setting was introduced for security reasons, so be cautious:
http://www.phpclasses.org/blog/post/171-PHP-Vulnerability-May-Halt-Millions-of-Servers.html
From the client side, try using jQuery and converting your data to JSON before sending the POST to the server:
$.ajax({
    method: 'POST',
    url: 'http://someurl.com',
    data: JSON.stringify(youJsObject),
    success: function(data) {
        // processing data from server
    }
});
I am sending form data with 2k+ parameters, but the server only receives less than half of them.
$.ajax({
    url: 'inner.php?option=param',
    type: 'POST',
    data: $('#form').serialize(),
    dataType: "text",
    success: function(data) {
        //success action
    },
    error: function(xhr, ajaxOptions) {
        //error action
    }
});
Some of the parameters posted by Ajax are:
1190583_1306134[] 1
1226739_1343934[]
My application is written in PHP. Thanks in advance.
I think you don't need to post the empty elements.
Replace data: $('#form').serialize()
with data: $("#form :input[value!='']").serialize().
Hopefully it will work for you.
Just wasted two hours on exactly the same thing: partial $_POST data in the PHP backend (~1000 variables out of ~3500).
Found this in the Apache logs:
Warning: Unknown: Input variables exceeded 1000. To increase the limit change max_input_vars in php.ini. in Unknown on line 0, referer: http://app1.local/full_path
That was more than sufficient to uncomment max_input_vars in php.ini, change it to 10000, and restart Apache. All was working again after that ;)
I had the same problem. I don't know why, but $.ajax truncates POST data passed as a string.
To solve this, use object data instead.
For example
$data = $('form').serialize();
$data = JSON.parse('{"' + decodeURI($data.replace(/&/g, "\",\"").replace(/=/g, "\":\"")) + '"}');
$.ajax({
    url: $url,
    data: $data,
    ...
});
Hope this will help ;)
For anyone finding the error Request has been truncated when using direct form pickup via data: new FormData(this) in the Firefox debugger console: the whole payload may actually have been posted, and the error seems to be false. I had to spend several hours to realize that Google Chrome does not report the error, and on actually checking the image being posted, it was being uploaded.
By the way, direct form pickup like this does not require serialization and can upload even images.
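For reference, a minimal sketch of that direct form pickup (the form id and the upload.php endpoint are hypothetical):

$('#myform').on('submit', function(e) {
    e.preventDefault();
    $.ajax({
        type: 'POST',
        url: 'upload.php',        // hypothetical endpoint
        data: new FormData(this), // picks up every field, files included
        processData: false,       // don't let jQuery serialise the body
        contentType: false,       // let the browser set the multipart boundary
        success: function(result) {
            console.log(result);
        }
    });
});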
This works great at retrieving the PHP data for, say, 15 passes, BUT when the JSON file is, say, 100 items, it chokes the PHP script and creates random errors. My guess is that because the requests are all made from this single ajax request (faster than the PHP script can work), the PHP script is getting confused?
$(document).ready(function() {
    var ajax_load = "<div class='loadwrap'><img class='load' src='/img/load.gif' style='width:12px;' alt='' /> fetching list...</div>";
    $("#status").html(ajax_load);
    $.getJSON('/fsbo/get_urls_24_hours', function(data) {
        $("#alias").fadeOut('slow');
        var ajax_load = "<div class='loadwrap'><img class='load' src='/img/load.gif' style='width:12px;' alt='' /> fetching property...</div>";
        $('#props').html('');
        $.each(data, function(key, val) {
            $.ajax({
                type: "POST",
                url: base_url + "/fsbo/get_property",
                data: "url=" + val,
                cache: false,
                success: function(data) {
                    $("<div></div>").html(data).appendTo('#props');
                }
            });
        });
    });
});
As a side note, where do I put the code to hide the loading gif? Putting it at the end of the loop does no good; it just opens and closes without waiting for the data to return.
It's generally a bad idea to make AJAX requests in a loop. Why not just modify the original call to return all of the data you want in your JSON, rather than making 100 calls?
If for some reason you can't avoid this, constrain the number of pending requests, as in the sketch below. Send, for example, the first 5 requests, then only send the 6th once you get a response from one of the first 5. This way only 5 requests are pending at any time and your server isn't hit with 100 all at once.
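A sketch of that throttling idea, reusing base_url, get_property, and the data array from the question (MAX_PENDING is an illustrative choice):

var urls = [];   // filled from the $.getJSON response
var next = 0;
var MAX_PENDING = 5;

function send_next() {
    if (next >= urls.length) return; // nothing left to send
    $.ajax({
        type: "POST",
        url: base_url + "/fsbo/get_property",
        data: "url=" + urls[next++],
        success: function(data) {
            $("<div></div>").html(data).appendTo('#props');
        },
        complete: function() {
            send_next(); // a slot has freed up: launch the next request
        }
    });
}

$.each(data, function(key, val) { urls.push(val); });
for (var i = 0; i < MAX_PENDING; i++) {
    send_next();
}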
This code snippet is the killer:
$.each(data, function(key, val) {
$.ajax({
If the length of data is 100, there will be 100 HTTP connections to your server. This will obviously choke your server. Besides, your browser will become slow. It's like opening 100 tabs in Firefox in one shot.
Pass all the data in a single Ajax request. If the size is huge, send it chunk by chunk: when you receive the first response, send the request for the next chunk, but don't send them simultaneously. A sketch of that follows.
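A sketch of the chunk-by-chunk variant (the get_properties batch endpoint and the chunk size are assumptions; the original script only has a per-item endpoint):

function post_chunks(urls, chunkSize) {
    if (urls.length === 0) return; // all chunks sent
    $.ajax({
        type: "POST",
        url: base_url + "/fsbo/get_properties", // hypothetical batch endpoint
        data: { urls: urls.slice(0, chunkSize) },
        success: function(data) {
            $("<div></div>").html(data).appendTo('#props');
            // Only send the next chunk once this one has come back.
            post_chunks(urls.slice(chunkSize), chunkSize);
        }
    });
}

post_chunks(all_urls, 10); // e.g. 10 items per request; all_urls comes from the JSON response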
I do think that you have the right answer: it is because of the calls.
The way these PHP calls work is by creating a separate request to your system, each also invoking various other libraries, which behaves like a sort of memory leak, exponentially increasing the time and resources required.
What I suggest is: pass all of your variables to PHP and let it do the work, then receive a JSON object back and parse it.
It may be a bit slower for the end user, but should help avoid this from happening.
P.S.
I've had similar issues, when these kinds of calls were making so many requests for one user that the whole webserver crashed.
I can't for the life of me figure out why this is happening.
This is kind of a repost, so forgive me, but I have new data.
I am running a JavaScript log out function called logOut() that makes a jQuery ajax call to a PHP script...
function logOut() {
    var data = new Object;
    data.log_out = true;
    $.ajax({
        type: 'POST',
        url: 'http://www.mydomain.com/functions.php',
        data: data,
        success: function() {
            alert('done');
        }
    });
}
The PHP function it calls is here:
if (isset($_POST['log_out'])) {
    $query = "INSERT INTO `token_manager` (`ip_address`) VALUES('logOutSuccess')";
    $connection->runQuery($query); // <-- my own database class...
    // omitted code that clears session etc...
    die();
}
Now, 18 hours out of the day this works, but for some reason, every once in a while, the POST data will not trigger my query. (This will last about an hour or so.)
I figured out that the POST data is not being set by adding this at the end of my script...
$query = "INSERT INTO `token_manager` (`ip_address`) VALUES('POST FAIL')";
$connection->runQuery($query);
So, now I know for certain my log out function is being skipped, because the following data is in my database:
(screenshot: http://img535.imageshack.us/img535/2025/screenshot20100519at125h.png)
If it were NOT being skipped, my data would show up like this:
(screenshot: http://img25.imageshack.us/img25/8104/screenshot20100519at125.png)
I know it is being skipped for two reasons: one, the die() at the end of my first function, and two, if it were a success, a "logOutSuccess" would be registered in the table.
Any thoughts? One friend says it's a janky hosting company (hostgator.com). I personally like them because they are cheap and I'm a fan of cPanel. But what if that's the case?
Thanks in advance.
-J
Ok, for those interested.
I removed the full URL http://www.mydomain.com/functions.php and replaced it with the local path functions.php, and that did the trick.
Apparently AJAX has issues with cross-domain calls, and I'm not on a dedicated server, so I imagine what's happening is that every couple of hours (or minutes) I am somehow hitting my script from a different location, causing AJAX to drop the POST data.
-J
Try enabling error reporting on the jQuery $.ajax function; your code would look something like:
function logOut() {
    var data = new Object;
    data.log_out = true;
    $.ajax({
        type: 'POST',
        url: 'http://www.mydomain.com/functions.php',
        data: data,
        success: function() {
            alert('done');
        },
        error: function(XMLHttpRequest, textStatus, errorThrown) {
            alert(textStatus + " - " + errorThrown);
        }
    });
}
See if that sheds light on your situation.
I have a strong feeling that it's more of a server side issue rather than the client's.
The odd thing is that you see the problem for a period of time. If the client works at all, then at minimum, refreshing the page or restarting the browser should fix it.
The die() at the end of the function is suspicious, but I am not quite sure how it will affect it.
Btw, you can see HTTP headers in Firebug's Net tab to check whether those parameters have been sent properly.