I'm submitting data through an Ajax POST; the POST data will be called list in this example. The backend uses PHP CodeIgniter.
I make the Ajax POST call; list is an array with 190 objects in it. If I console.log it, it correctly shows 190.
$.ajax({
    url: submitUrl,
    type: 'post',
    data: { list },
    success: function(response) {
        console.log(response);
    },
    error: function(response) {
        console.log(response);
    }
});
I post the data and pull it out of the POST object:
$data = $this->input->post();
Then I send a response back to the front end:
print count($data['list']);
I receive it in the success callback, and the response shows the number 40.
I know for a fact the data is coming in correctly (if I just return the array instead, it shows a chopped version of it: the first 40 items. I made it a number here for simplicity).
Why is the back end truncating my array?
Things I've tried:
adjusting php.ini:
upload_max_filesize = 100M
post_max_size = 100M
adjusting .htaccess:
LimitRequestBody
php_value upload_max_filesize
php_value post_max_size
Does anyone have any other ideas?
Update:
I never solved the core of the issue. I ended up using JSON.stringify(list) on the front end and json_decode($data['list']) on the back end to get around it.
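A minimal sketch of that workaround, reusing the call from above (the only change from the original request is the stringify; submitUrl and the controller are as before):

// Serialise the whole array into one string so PHP receives a single
// POST field instead of one input variable per array element.
$.ajax({
    url: submitUrl,
    type: 'post',
    data: { list: JSON.stringify(list) },
    success: function(response) {
        console.log(response);
    },
    error: function(response) {
        console.log(response);
    }
});

On the back end, json_decode($data['list'], true) restores the array (passing true as the second argument returns associative arrays instead of objects).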
Related
I'm reading a .csv (Excel) file using JavaScript. When I console.log the CSV data, I get it in this format:
http://prntscr.com/ivdn1h
I have to import this data into my database. I use Ajax to send the data via a POST request to my REST endpoint, where I call the function that does the import.
AJAX CODE:
function pushToDatabase(lines) {
    var array = lines;
    $.blockUI();
    $.ajax({
        url: "rest/v1/upload",
        method: "POST",
        dataType: "json",
        data: {
            file: array
        },
        success: function() {
            alert("Data has been uploaded successfully.");
        },
        complete: function() {
            // Unblock only once the request finishes; calling $.unblockUI()
            // directly after $.ajax() would run before the asynchronous
            // request completes.
            $.unblockUI();
        }
    });
}
ENDPOINT (Flight PHP):
Flight::route('POST /upload', function () {
    $file = Flight::request()->data->file;
    $status = Flight::pm()->upload_records($file);
    Flight::json($status);
});
But when I send the request, I get this response:
<b>Warning</b>: Unknown: Input variables exceeded 1000. To increase the limit change max_input_vars in php.ini. in <b>Unknown</b> on line <b>0</b><br />
Here is the HTTP request header:
http://prntscr.com/ivdrmz
I'm not interested in changing my max_input_vars. I hope somebody can help me fix my data format.
Thanks in advance. :)
Check whether changing dataType to "text/plain" in your Ajax call solves your problem.
However, as Niels suggested in his post, you should actually convert your data to JSON and then post it as a single string variable.
To summarise: the issue is the Content-Type header being sent. You are trying to send form data, not JSON, to the PHP file.
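A sketch of that suggestion applied to the upload call above; the single file field carrying the stringified payload is the assumption here:

function pushToDatabase(lines) {
    $.blockUI();
    $.ajax({
        url: "rest/v1/upload",
        method: "POST",
        dataType: "json",
        // One POST variable holding the whole payload, so the
        // 1000-variable max_input_vars limit never applies.
        data: { file: JSON.stringify(lines) },
        success: function() {
            alert("Data has been uploaded successfully.");
        },
        complete: function() {
            $.unblockUI();
        }
    });
}

The endpoint would then json_decode the file string before handing it to upload_records, since the data now arrives as one POST variable rather than thousands.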
I spent a whole day discovering the solution for a similar issue.
I have a controller which consumes Ajax requests and responds with the result of the database operation. The Ajax call is formed with a set of IDs (an array of ints) and a message.
When the set contains a normal amount of IDs (I tested with 20), the Ajax call returns normally. When trying to send 3,000 IDs, I get an almost instant response with a 403 Forbidden Access error.
This is probably a CodeIgniter or Apache server error. I looked for it, but didn't find any answer.
Thanks in advance.
In your Ajax code, are you sending it via GET? If so, change it to POST:
$.ajax({
    type: "POST",
    url: 'url',
    data: data,
    dataType: "json",
    cache: false,
    contentType: false, // these two settings suit FormData payloads;
    processData: false, // drop them if data is a plain object or string
    success: function (data) {
        //some code
    }
});
By default, php.ini limits max_input_vars to 1000.
The docs describe how to change it.
Change that and you will fix the issue; I had the same problem before.
For example, place the following lines into .htaccess:
php_value max_input_vars 3000
php_value suhosin.get.max_vars 3000
php_value suhosin.post.max_vars 3000
php_value suhosin.request.max_vars 3000
I am sending form data with 2k+ parameters, but the server only receives fewer than half of them.
$.ajax({
    url: 'inner.php?option=param',
    type: 'POST',
    data: $('#form').serialize(),
    dataType: "text",
    success: function(data) {
        //success action
    },
    error: function (xhr, ajaxOptions) {
        //error action
    }
});
Some of the parameters posted by Ajax are:
1190583_1306134[] 1
1226739_1343934[]
My application is written in PHP. Thanks in advance.
I think you need not post the empty elements.
Replace data: $('#form').serialize()
with data: $("#form :input[value!='']").serialize().
Hopefully it will work for you.
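Note that the [value!=''] attribute selector tests the value attribute rather than the field's current value, so a sketch using .filter() against live values may be more reliable (same form and endpoint as above):

$.ajax({
    url: 'inner.php?option=param',
    type: 'POST',
    // Serialise only the fields whose current value is non-empty.
    data: $('#form :input').filter(function () {
        return $(this).val() !== '';
    }).serialize(),
    dataType: 'text'
});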
I just wasted two hours on exactly the same thing: partial $_POST data in the PHP backend (~1000 out of ~3500 fields).
Found this in the Apache logs:
Warning: Unknown: Input variables exceeded 1000. To increase the limit change max_input_vars in php.ini. in Unknown on line 0, referer: http://app1.local/full_path
That was more than sufficient to uncomment max_input_vars in php.ini, change it to 10000, and restart Apache. All was working again after that ;)
I had the same problem. I don't know why, but $.ajax truncates POST data passed as a string.
To solve this, use object data instead.
For example:
$data = $('form').serialize();
$data = JSON.parse('{"' + decodeURI($data.replace(/&/g, "\",\"").replace(/=/g, "\":\"")) + '"}');
$.ajax({
    url: $url,
    data: $data,
    ...
});
Hope this will help ;)
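Be aware that the regex trick above breaks on values containing &, =, or quotes. A safer sketch builds the object from serializeArray() instead (duplicate field names overwrite one another in this simple version):

var $data = {};
$.each($('form').serializeArray(), function (i, field) {
    // serializeArray() decodes names and values for us.
    $data[field.name] = field.value;
});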
For anyone finding the error Request has been truncated when using direct form pickup via data: new FormData(this) in the Firefox debugger console: the whole data may actually have been posted, and the error seems to be false. I had to spend several hours before realising that Google Chrome does not report the error, and on actually checking the image being posted, it had been uploaded.
By the way, direct form pickup like this does not require serialization and can upload even images.
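A sketch of that direct pickup (the form id and endpoint URL are placeholders):

$('#uploadForm').on('submit', function (e) {
    e.preventDefault();
    $.ajax({
        url: 'upload.php',        // placeholder endpoint
        type: 'POST',
        data: new FormData(this), // picks up every field, files included
        processData: false,       // don't let jQuery serialise the FormData
        contentType: false,       // let the browser set the multipart boundary
        success: function (response) {
            console.log(response);
        }
    });
});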
I have a fairly large multidimensional JavaScript array which I'm passing to my PHP file with an Ajax POST like so:
$.ajax({
    type: "POST",
    dataType: 'json',
    url: "forms/scripts/testArray.php",
    data: {
        header: header,
        gridData: gridData
    }
});
gridData is my multidimensional array: it has 1000 elements, and each child array contains 3 elements. When I debug and look at the data in my PHP file, only about 350 of the array's elements arrived from the POST.
Using Firebug, I get this info about File Size Sent and File Size Received:
As you can see, the little pop-up is telling me the AJAX call sent 462.8 KB but only 4.9 KB was actually received. Is this true? Is that why I'm only getting part of the array I'm attempting to POST?
Also, is this the best way to move a multidimensional array from JavaScript to PHP via an Ajax POST?
I appreciate any help you can offer!
Probably something in the toolchain is limiting the maximum amount of data that can be sent or received. This can be either in your web server (probably Apache) or, more likely, in your PHP settings. PHP has settings like post_max_size, max_input_time, max_input_nesting_level, max_input_vars, and so on. You should check out these settings!
There is probably a weird character in the data throwing off the JSON encoding/decoding on or just after the last element that is being 'received'.
Remove the semicolon (;) that comments out max_input_vars in php.ini and increase its value; then it works:
max_input_vars = 10000
The Problem
I'm using jQuery to post a (relatively) large amount of data to a web system I am migrating from Ubuntu to CentOS (a painful process). The problem is that the data being received is truncated. Sending the same data from the server to the client results in no truncation.
The amount of data being 'sent' (that is, what I'm seeing while debugging JavaScript) is 116,902 bytes (the correct amount of data), whereas the amount of data being received is approximately 115,668 bytes; this number seems to vary, making me believe the problem may be time related. The transaction completes (receive, response) in approximately 3.1 seconds, not a huge amount of time. Are there any settings I should examine?
That idea aside, my PHP installation is configured to accept 8M of POST data and use up to 128M of physical memory, which seems like plenty.
The jQuery code is below. I'm quite sure this isn't the problem, but I've included it as requested.
Receiving:
function synchronise_down()
{
    $.ajax({
        url: "scripts/get_data.php",
        context: document.body,
        dataType: "json",
        type: "POST",
        success: function(result)
        {
            // Fix the state up.
            update_data(result);
            // Execute on synchronise.
            execute_on_synchronise();
        },
        error: function(what, huh)
        {
            IS_WAITING = false;
        }
    });
}
Sending:
function synchronise_up()
{
    var serialised = MIRM_MODEL.serialise();
    LAST_SERIALISED = new Date().getTime();
    $.ajax({
        url: "scripts/save_model.php",
        context: document.body,
        dataType: "json",
        data: {"model": serialised},
        type: "POST",
        success: function(result)
        {
            // Fix the state up.
            update_data(result, true);
            // Execute on synchronise.
            execute_on_synchronise();
        },
        error: function(what, huh)
        {
            IS_WAITING = false;
        }
    });
}
Workaround (Wouldn't call this a solution)
Edit: I've 'fixed' this, but I haven't necessarily found out what the problem is or how to solve it. It's an interesting problem, so I'll describe my workaround and leave the question open.
Rather than letting jQuery handle the serialisation of my large data, I'm doing it myself first, essentially serialising twice. The code for this is as follows:
function synchronise_up()
{
    var serialised = JSON.stringify(MIRM_MODEL.serialise());
    LAST_SERIALISED = new Date().getTime();
    $.ajax({
        url: "scripts/save_model.php",
        context: document.body,
        dataType: "json",
        data: {"model": serialised},
        type: "POST",
        success: function(result)
        {
            // Fix the state up.
            update_data(result, true);
            // Execute on synchronise.
            execute_on_synchronise();
        },
        error: function(what, huh)
        {
            IS_WAITING = false;
        }
    });
}
The important line is of course:
var serialised = JSON.stringify(MIRM_MODEL.serialise());
Now, when it hits the server, I need to decode this data because it's been serialised twice. There are added costs with this 'solution': sending more data, doing more work. The question still remains: what's the problem, and what's the real solution?
Check the following php.ini variables:
post_max_size
max_input_vars - This might actually be the culprit because it truncates data
Try setting jQuery's ajax timeout parameter to a high number (note, it's in milliseconds, so you'll probably want 10000 which is 10 seconds).
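For example, a sketch against the synchronise_up call above (the 10-second value is illustrative):

$.ajax({
    url: "scripts/save_model.php",
    type: "POST",
    dataType: "json",
    data: { model: serialised },
    timeout: 10000, // milliseconds; jQuery aborts the request after this
    success: function (result) {
        update_data(result, true);
    },
    error: function (xhr, textStatus) {
        // textStatus is "timeout" when the limit was hit
        IS_WAITING = false;
    }
});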
Some other options to try:
1. Check that your PHP max execution time is a decent amount. I doubt this would be related but it's possible.
2. On jQuery's error function, run console.log(xhr) on the XHR result (you'd have to do this in Chrome or find another method of seeing the result). xhr is an XHR object that contains debug information on what happened with the connection, e.g. status codes, timeout info, etc.
EDIT: Also, have you checked the max size of the field in your database? It's quite possible that the database is automatically truncating the information.
My gut feeling is that it's PHP timeout related; I've never heard of a JavaScript timeout. I have jQuery requests that run for 3 or 4 hours, but then they continue to post little updates (like a _SESSION progress bar in PHP... but I digress). Anyway, you HAVE to use Firefox for this; IE doesn't "believe" you when you know what you are doing and times out after about 40 minutes. I wasn't using Chrome at the time.
Actually, come to think of it, you say you are migrating to CentOS; that sounds to me like it HAS to be server related. You are simply looking in the wrong place.
BTW, congrats on CentOS, it's AMAZING! I would do it the easy way: have an entire LAMP CentOS VM just for this app (try not to faff with the vhosts for this, it's very messy) and simply set the whole Apache/PHP install's limits insanely high.
The correct php.ini settings are:
max_input_time //(not max_execution_time!)
upload_max_filesize
post_max_size
// .. and try
memory_limit
PHP POST/GET/COOKIE are limited to 1000 entries by default. Everything above that is ignored. It is the number of entries that counts, not the actual amount of data.
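A quick console sketch of why the entries explode: jQuery expands nested data into one POST variable per leaf value.

// Two leaf values in one object become two POST variables:
console.log($.param({ list: [{ a: 1, b: 2 }] }));
// => "list%5B0%5D%5Ba%5D=1&list%5B0%5D%5Bb%5D=2"
// (decoded: list[0][a]=1&list[0][b]=2)
// So 1000 rows of 3 fields each yield 3000 input variables.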
I suggest you edit your php.ini and set max_input_vars to a greater value.
Regards.