I have a fairly large multidimensional JavaScript array which I'm passing to my PHP file with an AJAX POST, as such:
$.ajax({
    type: "POST",
    dataType: 'json',
    url: "forms/scripts/testArray.php",
    data: {
        header: header,
        gridData: gridData
    }
});
gridData is my multidimensional array; it has 1000 elements, and each of their child arrays contains 3 elements. When I debug and look at the data in my PHP file, only about 350 elements of the array arrived from the POST.
Using Firebug, I checked the File Size Sent and File Size Received for the request: the little pop-up tells me the AJAX call sent 462.8 KB but only 4.9 KB was actually received. Is this true? Is that why I'm only getting part of the array I'm attempting to POST?
Also, is this the best way to move a multidimensional array from JavaScript to PHP via an AJAX POST?
I appreciate any help you can offer!
Probably something in the toolchain is limiting the maximum amount of data that can be sent/received. This can be either in your web server (probably Apache) or, more likely, in your PHP settings. PHP has settings like post_max_size, max_input_time, max_input_nesting_level, max_input_vars, etc. You should check out these settings!
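A quick way to confirm which limit is biting is to dump the effective values from inside the receiving script (a minimal sketch for testArray.php; adjust the key to your payload):

// log the limits in effect and how much of the POST survived
var_dump(ini_get('post_max_size'), ini_get('max_input_vars'));
var_dump(isset($_POST['gridData']) ? count($_POST['gridData']) : 0);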
There is probably a weird character in the data throwing off the JSON encoding/decoding on or just after the last element that is being 'received'.
Remove the semicolon (;) that comments out max_input_vars in your php.ini file and increase its value; then it works:
max_input_vars = 10000
Related
I'm submitting data through an AJAX POST; the POST data will be called list in this example. The backend uses PHP CodeIgniter.
I make the AJAX POST call; list is an array with 190 objects in it. If I console.log it, it correctly shows 190.
$.ajax({
    url: submitUrl,
    type: 'post',
    data: { list },
    success: function(response) {
        console.log(response);
    },
    error: function(response) {
        console.log(response);
    }
});
I post the data, then pull it out of the POST object:
$data = $this->input->post();
Then I send a response back to the front end:
print count($data['list']);
I receive it in the success callback, and the response shows the number 40.
I know for a fact the data is coming in correctly (if I just return the array, it shows a chopped version of it; I made it a number for simplicity). It returns the first 40 items in the array if I do that.
Why is the back end trimming my array?
Things I've tried:
adjusting php.ini:
upload_max_filesize = 100M
post_max_size = 100M
adjusting .htaccess:
LimitRequestBody
php_value upload_max_filesize
php_value post_max_size
Does anyone have any other ideas?
Update:
Never solved the core of the issue. I ended up using JSON.stringify(list) on the front end and json_decode($data['list']) on the back end to get around it.
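For anyone else landing here, a minimal sketch of that workaround (same list and controller as above):

$.ajax({
    url: submitUrl,
    type: 'post',
    data: { list: JSON.stringify(list) }, // one POST variable instead of hundreds
    success: function(response) {
        console.log(response);
    }
});

and on the CodeIgniter side:

$data = $this->input->post();
$list = json_decode($data['list'], true); // true returns associative arrays
print count($list);

Because the whole array travels as a single string, it counts as one POST variable and never hits the max_input_vars ceiling.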
I am posting data with a size of approximately 200 KB to my server running PHP 5.3.3-7.
The data is actually a JavaScript object with nested properties; in the request it looks something like this: myObject[prop1][subprop1][key] = 5.
However, not all of the data is received on the server; the last part of the posted data is cut off. post_max_size is set to 80M, so that shouldn't be the issue. I have compared the request form data with the data that is accessible through $_POST, and there is a lot of data missing.
What could be causing this?
You said you use PHP 5.3.3, but maybe that is not quite right? Since PHP 5.3.9 there is a new setting, max_input_vars, that limits the number of POST (and GET, and COOKIE, and so on) variables one can pass to a script.
If I am right, then it is enough to adjust it in php.ini, in a VirtualHost definition, or in .htaccess (ini_set will not work, since the POST has already been trimmed by the time your script starts).
This setting was introduced for security reasons, so be cautious:
http://www.phpclasses.org/blog/post/171-PHP-Vulnerability-May-Halt-Millions-of-Servers.html
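For example (the value is illustrative; size it to your data):

; php.ini
max_input_vars = 10000

# .htaccess or VirtualHost definition (mod_php)
php_value max_input_vars 10000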
From the client side, try using jQuery to convert your data to JSON before sending the POST to the server:
$.ajax({
    method: 'POST',
    url: 'http://someurl.com',
    data: JSON.stringify(yourJsObject),
    success: function(data) {
        // processing data from server
    }
});
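Note that when the POST body is one raw JSON string rather than key/value pairs, PHP will not parse it into $_POST; on the server you would read the request body directly instead (a sketch):

$payload = json_decode(file_get_contents('php://input'), true);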
I am sending form data with 2k+ parameters, but the server only receives less than half of it.
$.ajax({
    url: 'inner.php?option=param',
    type: 'POST',
    data: $('#form').serialize(),
    dataType: "text",
    success: function(data) {
        //success action
    },
    error: function(xhr, ajaxOptions) {
        //error action
    }
});
Some of the parameters posted by Ajax are:
1190583_1306134[] 1
1226739_1343934[]
My application is written in PHP. Thanks in advance.
I think you need not post the empty elements.
Replace data: $('#form').serialize()
with data: $("#form :input[value!='']").serialize().
Hopefully it will work for you.
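To see how much this buys you, compare the input counts in the browser console before sending (a quick sketch):

console.log($('#form :input').length, $("#form :input[value!='']").length);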
Just wasted 2 hours on exactly the same thing: partial $_POST data in the PHP backend (~1000 out of ~3500 variables arriving).
Found this in the Apache logs:
Warning: Unknown: Input variables exceeded 1000. To increase the limit change max_input_vars in php.ini. in Unknown on line 0, referer: http://app1.local/full_path
That was more than sufficient reason to uncomment max_input_vars in php.ini, change it to 10000, and restart Apache. All was working again after that ;)
I had the same problem. I don't know why, but $.ajax truncates POST data passed as a string.
To solve this, use object data instead.
For example
$data = $('form').serialize();
// Convert the query string into a plain object. Note: this quick hack assumes
// no encoded '&', '=' or '"' characters survive inside the values.
$data = JSON.parse('{"' + decodeURI($data.replace(/&/g, "\",\"").replace(/=/g, "\":\"")) + '"}');
$.ajax({
    url: $url,
    data: $data,
    ...
});
Hope this will help ;)
For anyone finding the error Request has been truncated when using direct form pickup via data: new FormData(this) in the Firefox debugger console: the whole data may actually have been posted, and the error seems to be false. I had to spend several hours only to realize that Google Chrome does not report the error, and on actually checking the image being posted, it was being uploaded fine.
By the way, direct form pickup like this does not require serialization and can upload even images.
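For completeness, a minimal sketch of such a direct form pickup (the form id and endpoint are hypothetical); the two false flags stop jQuery from reprocessing the FormData:

$('#myForm').on('submit', function(e) {
    e.preventDefault();
    $.ajax({
        url: 'upload.php',        // hypothetical endpoint
        type: 'POST',
        data: new FormData(this), // picks up every field, files included
        processData: false,       // don't convert the FormData to a query string
        contentType: false,       // let the browser set the multipart boundary
        success: function(response) {
            console.log(response);
        }
    });
});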
The Problem
I'm using jQuery to post a (relatively) large amount of data to a web system I am migrating from Ubuntu to CentOS (a painful process). The problem is that the data being received is truncated. Sending the same data from the server to the client results in no truncation.
The amount of data being 'sent' (that is, what I'm seeing while debugging the JavaScript) is 116,902 bytes, the correct amount, whereas the amount of data being received is approximately 115,668 bytes. This number seems to vary, making me believe the problem may be time related. The transaction completes (receive, response) in approximately 3.1 seconds, not a huge amount of time. Are there any settings I should examine?
That idea aside, my PHP installation is configured to accept 8M of POST data and use up to 128M of physical memory, which seems plenty.
The jQuery code is below. I'm quite sure this isn't the problem, but I've included it as requested.
Receiving:
function synchronise_down()
{
    $.ajax({
        url: "scripts/get_data.php",
        context: document.body,
        dataType: "json",
        type: "POST",
        success: function(result)
        {
            // Fix the state up.
            update_data(result);
            // Execute on synchronise.
            execute_on_synchronise();
        },
        error: function(what, huh)
        {
            IS_WAITING = false;
        }
    });
}
Sending:
function synchronise_up()
{
    var serialised = MIRM_MODEL.serialise();
    LAST_SERIALISED = new Date().getTime();
    $.ajax({
        url: "scripts/save_model.php",
        context: document.body,
        dataType: "json",
        data: { "model": serialised },
        type: "POST",
        success: function(result)
        {
            // Fix the state up.
            update_data(result, true);
            // Execute on synchronise.
            execute_on_synchronise();
        },
        error: function(what, huh)
        {
            IS_WAITING = false;
        }
    });
}
Workaround (Wouldn't call this a solution)
Edit: I've 'fixed' this, but not necessarily found out what the problem is and how to solve it. It's an interesting problem, so I'll describe my workaround and leave the question open.
What I'm doing is, rather than letting jQuery handle the serialisation of my large data, doing it myself first, essentially serialising twice. The code for this is as follows:
function synchronise_up()
{
    var serialised = JSON.stringify(MIRM_MODEL.serialise());
    LAST_SERIALISED = new Date().getTime();
    $.ajax({
        url: "scripts/save_model.php",
        context: document.body,
        dataType: "json",
        data: { "model": serialised },
        type: "POST",
        success: function(result)
        {
            // Fix the state up.
            update_data(result, true);
            // Execute on synchronise.
            execute_on_synchronise();
        },
        error: function(what, huh)
        {
            IS_WAITING = false;
        }
    });
}
The important line is of course:
var serialised = JSON.stringify(MIRM_MODEL.serialise());
Now, when it hits the server, I need to decode this data because it's been serialised twice. There are added costs with this 'solution': sending more data and doing more work. The question still remains: what's the problem, and what's the real solution?
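For reference, the server-side half of the workaround is one extra decode (a sketch; save_model.php is the script being posted to above):

// save_model.php: the model arrives as a JSON string inside the form data
$model = json_decode($_POST['model'], true);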
Check the following php.ini variables:
post_max_size
max_input_vars - This might actually be the culprit because it truncates data
Try setting jQuery's ajax timeout parameter to a high number (note that it's in milliseconds, so you'll probably want 10000, which is 10 seconds).
Some other options to try:
1. Check that your PHP max execution time is a decent amount. I doubt this is related, but it's possible.
2. In jQuery's error function, run console.log(xhr) on the XHR result (you'd have to do this in Chrome or find another method of seeing the result). The XHR object contains debug information about what happened with the connection, e.g. status codes, timeout info, etc.
EDIT: Also, have you checked the max size of the field in your Database? It's quite possible that the Database is automatically truncating the information.
My gut feeling is that it's PHP timeout related; I've never heard of a JavaScript timeout. I have jQuery requests running for 3 or 4 hours that continue to post little updates (like a _SESSION progress bar in PHP... but I digress). Anyway, you HAVE to use Firefox for this; IE doesn't "believe" you when you know what you are doing and times out after about 40 minutes. Chrome wasn't used by me at the time.
Actually, come to think of it, you say you are migrating to CentOS; that sounds to me like it HAS to be server related. You are simply looking in the wrong place.
BTW, congrats on CentOS, it's AMAZING! I would do it the easy way and have an entire LAMP CentOS VM just for this app (try not to faff with the vhosts for this; it's very messy) and simply set the limits for the whole Apache/PHP install insanely high.
The correct php.ini settings are
max_input_time //(not max_execution_time!)
upload_max_filesize
post_max_size
// .. and try
memory_limit
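For example, in php.ini (all values illustrative; tune them to your payload):

max_input_time = 300
upload_max_filesize = 100M
post_max_size = 100M
memory_limit = 256M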
PHP POST/GET/COOKIE variables are limited to 1000 entries by default. Everything above that is ignored. It is the number of entries that counts, not the actual amount of data.
I suggest you edit your php.ini and set the max_input_vars setting to a greater value.
Regards.
I'm trying to send POST data that is 2 million characters big (a non-binary string) via AJAX (jQuery), and it always comes up blank on the PHP side. Here is my code:
var string = "<data string that is 2M chars long>";
$.ajax({
    cache: false,
    type: 'POST',
    url: 'data.php',
    data: {'data_string': string}
});
On the PHP side, I get the following error message (when trying to retrieve data from $_POST['data_string']):
Notice: Undefined index: data_string in data.php on line ...
I've checked post_max_size in php.ini, and it's set at 256M, which should be more than enough? I'm stumped and not sure what I'm doing wrong...
EDIT: If I make string a small amount of data (e.g. var string = 'test'), then $_POST['data_string'] returns test, as expected. So I'm wondering if there's some sort of data limit I'm reaching in Apache2, PHP, or the browser itself? I'm using Google Chrome 17.0.963.79.
EDIT2: memory_limit = 256M in php.ini
EDIT3: max_input_time = -1 in php.ini
EDIT4: var_dump($_POST) returns Array(0)
EDIT5: running the latest stable version of PHP 5 on Debian Squeeze: PHP 5.3.3-7+squeeze8 with Suhosin-Patch (cli) (built: Feb 10 2012 14:12:26)
You'll have to check the limit parameters on everything between you and the server. That's quite hard for proxy servers, if there are any, but at least you can check:
Apache:
LimitRequestBody, around 2 GB by default, maybe greater on 64-bit; check the error logs for details.
PHP:
post_max_size, which is directly related to the POST size
upload_max_filesize, which may be unrelated, not sure
max_input_time, if the POST takes too long
max_input_nesting_level, if your data is an array with a lot of sublevels
max_execution_time, but I'm quite sure it's not that
memory_limit, as you may reach a size exceeding the memory allowed to the subprocess
max_input_vars, if your data array has many elements
If you have reached the compiled-in limit for Apache, your only solution is to avoid a direct POST of such a big chunk of data; you'll have to break it into pieces.
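Breaking it into pieces could look roughly like this (the chunk size is arbitrary, and the server would need to reassemble the chunks by offset):

var CHUNK = 500000; // characters per request
for (var i = 0; i < string.length; i += CHUNK) {
    $.ajax({
        type: 'POST',
        url: 'data.php',
        data: {
            data_chunk: string.substr(i, CHUNK),
            offset: i // lets the server stitch the chunks back together in order
        }
    });
}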
You may also check the suhosin.ini settings, e.g.:
suhosin.post.max_value_length = 65000
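Since your build carries the Suhosin patch (see your EDIT5), you can dump the live values to confirm what it enforces (a sketch; ini_get returns false for settings that don't exist in your build):

var_dump(ini_get('suhosin.post.max_value_length'), ini_get('suhosin.post.max_vars'));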
You may also want to set set_time_limit(0) and raise your memory limit.
EDIT: You may also want to console.log(string); or console.log(string.length); before your request to verify it's being set properly, and check your request in Firebug or Chrome's developer tools to verify the data is being sent.
// Adding a respond box after the selected field
$('[maxlength]').bind('click keyup', function() {
    var RespondBox = '<div class="StPgRp" id="UpdateRespond"></div>';
    $('#UpdateRespond').remove();
    $(this).after(RespondBox);
});

// Counting the maximum characters allowed in the selected field
$('[maxlength]').bind('click keyup', function() {
    var MaxLength = $(this).attr('maxlength');
    var CurrentLength = $(this).val().length;
    var Remainder = MaxLength - CurrentLength;
    $('#UpdateRespond').html('You have ' + Remainder + ' characters remaining.');
});

// Checking the PHP function; if YES then send a message to the user
$('.Check').bind('click keyup', function() {
    var Check = $(this).parent().children('.Check').val();
    // (Check is read but unused here; presumably it would be posted to the server)
});
Add this to the .js file linked to your page and you're sorted!