I'm reading a .CSV (Excel) file using JavaScript. When I console.log the .CSV data I get it in this format:
http://prntscr.com/ivdn1h
I have to import this data into my database. I use AJAX to send the data via a POST request to my REST endpoint, where I call the function that does the import.
AJAX CODE:
function pushToDatabase(lines){
    var array = lines;
    $.blockUI();
    $.ajax({
        url: "rest/v1/upload",
        method: "POST",
        dataType: "json",
        data: {
            file: array
        },
        success: function() {
            alert("Data has been uploaded successfully.");
        }
    });
    $.unblockUI();
}
ENDPOINT (flightPHP):
Flight::route('POST /upload', function () {
    $file = Flight::request()->data->file;
    $status = Flight::pm()->upload_records($file);
    Flight::json($status);
});
But when I send the request, I get this response:
<b>Warning</b>: Unknown: Input variables exceeded 1000. To increase the limit change max_input_vars in php.ini. in <b>Unknown</b> on line <b>0</b><br />
Here is the http request header:
http://prntscr.com/ivdrmz
I'm not interested in changing my max_input_vars. I hope somebody can help me fix my data format.
Thanks in advance. :)
Check whether changing dataType to "text/plain" in your AJAX call solves your problem.
However, as Niels suggested in his post, you should actually convert your data to JSON and then post it as a single string variable.
To summarise, the problem is the Content-Type header: it is not set for JSON because you are sending form data, not JSON, to the PHP file.
I spent a whole day to discover the solution for a similar issue.
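A minimal sketch of what "post it as a single string variable" means for the question's code (the field name file and endpoint follow the question; the helper name is mine):

```javascript
// Send the parsed CSV rows as ONE string field instead of thousands of
// form variables, so PHP sees a single input variable and never hits
// the max_input_vars limit.
function buildUploadPayload(lines) {
    return { file: JSON.stringify(lines) };
}

// jQuery usage (sketch):
// $.ajax({
//     url: "rest/v1/upload",
//     method: "POST",
//     data: buildUploadPayload(lines)
// });
// PHP side: $rows = json_decode(Flight::request()->data->file, true);

var payload = buildUploadPayload([["a", "1"], ["b", "2"]]);
```

The PHP endpoint then decodes one string back into the full array, instead of receiving one input variable per CSV cell.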
I am trying to get a JSON response from a third-party website. It provides JSON data when I call it with AJAX, which works fine. I want to pass that same JSON data to my PHP file, so I can decode it and store it in a MySQL database. So far I am trying this:
<script>
$("#diary").click(function(){
    $.ajax({
        type: 'get',
        url: 'https://example.com',
        data: {},
        dataType: 'json',
        success: function(result){
            console.log(result);
            $.ajax({
                type: "POST",
                url: "dd.php",
                data: result,
                dataType: "json",
                success: function (msg) {
                    console.log(msg);
                }
            });
        }
    });
});
</script>
It works for getting data from the third-party site, but my PHP page does not receive the data properly, so the json_decode function is not working. Maybe I am not posting correct JSON data to the PHP page. I am not able to use cURL in PHP because it gives me a connection error; it's possible the third-party site has some security measure that blocks cURL connections, so AJAX is the only method I have for getting the data.
My PHP file is like below
<?php
$ajax = json_decode(file_get_contents('php://input'), true);
print_r($ajax);
?>
Let me know if anyone here can help me solve my issue.
Thanks!
You need to call JSON.stringify() on the result and pass that through data in your AJAX request, so you are sending a single string that happens to be JSON data. In PHP you can then call json_decode() on $_POST['data'] and you should have your data.
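A sketch of the round trip this answer describes (the field name data is an assumption, matching the $_POST['data'] suggestion above):

```javascript
// Wrap the fetched result in one named field holding a JSON string,
// so the server receives a single decodable parameter.
function wrapForPost(result) {
    return { data: JSON.stringify(result) };
}

// jQuery side (sketch):
// $.ajax({ type: "POST", url: "dd.php", data: wrapForPost(result) });
// PHP side: $decoded = json_decode($_POST['data'], true);

var wrapped = wrapForPost({ id: 7, name: "diary" });
```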
I am using a jQuery ajax function and PHP file to change a JSON file but the file will not change even though I get the "Thanks!" alert. My jQuery, PHP, and JSON code is as follows.
jQuery file:
$.ajax({
    url: 'savePoll.php',
    type: 'POST',
    async: false,
    data: 0,
    success: function () { alert("Thanks!"); },
    failure: function() { alert("Error!"); }
})
PHP File:
$jsonString = file_get_contents('poll.json');
$data = json_decode($jsonString, true);
$data["answers"][$_POST]++;
$newJsonString = json_encode($data);
file_put_contents('poll.json', $newJsonString);
JSON File:
{
"answers":[0,0,0,0,0]
}
My JSON file never changes, yet I still get my success alert. Thanks for any help.
1. This: $data["answers"][$_POST]++; doesn't look valid. $_POST is an array, not a key.
2. The failure: function will only trigger on a non-200 HTTP response.
3. The error in #1 is not serious enough to trigger an HTTP error. You would have to implement logic to make the page return a 400-level HTTP response code in order for failure: to ever trigger.
Also, make sure the file you want to change has 755 permissions.
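A sketch of point 1 from the client side: send a named parameter so PHP can read a specific key instead of indexing with the whole $_POST array (the field name answer is my assumption, not from the question):

```javascript
// Build the POST body with a NAMED field for the chosen answer index.
function buildVote(answerIndex) {
    return { answer: answerIndex };
}

// jQuery usage (sketch):
// $.ajax({ url: 'savePoll.php', type: 'POST', data: buildVote(0), ... });
// PHP side would then do: $data["answers"][$_POST["answer"]]++;

var vote = buildVote(2);
```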
I am sending form data with 2k+ parameters, but the server only receives less than half of it.
$.ajax({
    url: 'inner.php?option=param',
    type: 'POST',
    data: $('#form').serialize(),
    dataType: "text",
    success: function(data) {
        //success action
    },
    error: function (xhr, ajaxOptions){
        //error action
    }
});
Some of the parameters posted by Ajax are:
1190583_1306134[] 1
1226739_1343934[]
My application is written in PHP. Thanks in advance.
I think you need not post the empty elements.
Replace data: $('#form').serialize()
with data: $('#form :input[value!=""]').serialize().
Hopefully it will work for you.
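The idea in plain JavaScript, as a sketch: drop empty fields before serializing. Here fields stands in for the output of $('#form').serializeArray(), so the filtering logic can be shown without a DOM:

```javascript
// Keep only the name/value pairs that actually carry a value,
// mimicking what the :input[value!=""] selector is meant to achieve.
function dropEmpty(fields) {
    return fields.filter(function (f) { return f.value !== ""; });
}

var fields = [
    { name: "1190583_1306134[]", value: "1" },
    { name: "1226739_1343934[]", value: "" }   // would be dropped
];
var kept = dropEmpty(fields);
// jQuery usage (sketch): $.ajax({ ..., data: $.param(kept), ... });
```

Fewer posted variables also means you are less likely to hit PHP's max_input_vars limit mentioned below.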
Just wasted 2h on exactly the same thing - partial $_POST data in php backend (~1000 out of ~3500).
Found this in apache logs:
Warning: Unknown: Input variables exceeded 1000. To increase the limit change max_input_vars in php.ini. in Unknown on line 0, referer: http://app1.local/full_path
That was more than sufficient to uncomment max_input_vars in php.ini, change it to 10000 and restart apache. All was working again after that ;)
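For reference, this is the kind of php.ini change that warning points at (a sketch; 10000 is just a value comfortably above the number of fields the form posts):

```ini
; php.ini - the default is 1000; raise it above the number of
; variables your form actually posts, then restart the web server.
max_input_vars = 10000
```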
I had the same problem. I don't know why, but $.ajax truncates POST data passed as a string.
To solve this, use object data instead.
For example:
$data = $('form').serialize();
$data = JSON.parse('{"' + decodeURI($data.replace(/&/g, "\",\"").replace(/=/g, "\":\"")) + '"}');
$.ajax({
    url: $url,
    data: $data,
    ...
});
Hope this will help ;)
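Note that the regex trick above breaks on values containing &, = or percent-encoded characters. A safer sketch builds the object from name/value pairs directly; here pairs stands in for $('form').serializeArray():

```javascript
// Convert serializeArray()-style pairs into a plain object,
// without any string splitting that could corrupt values.
function pairsToObject(pairs) {
    var obj = {};
    pairs.forEach(function (p) { obj[p.name] = p.value; });
    return obj;
}

var pairs = [
    { name: "title", value: "a=b&c" },  // would break the regex approach
    { name: "count", value: "3" }
];
var data = pairsToObject(pairs);
// jQuery usage (sketch): $.ajax({ url: $url, data: data, ... });
```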
For anyone finding the error Request has been truncated when using direct form pickup via data: new FormData(this) in the Firefox debugger console: the whole data may actually have been posted, and the error appears to be false. I spent several hours before realizing that Google Chrome does not report the error, and on actually checking the image being posted, it was being uploaded.
By the way, a direct form pickup like this does not require serialization and can upload even images.
I have one PHP page which requests data from another page using JSON.
I have this AJAX call:
$.ajax({
    type: "POST",
    url: "getdata.php",
    cache: false,
    data: "list_id=" + encodeURIComponent(cont_list),
    dataType: 'json',
    success: function(json)
    {
        var foo = json.foo;
        $(foo).addClass('innertxt');
        $('#all_users').append(foo);
    }
});
After the data is processed, the second PHP file sends it back with the syntax below:
$return["foo"] =$val;
print stripslashes(json_encode($return));
$val is a variable with the data. It works fine for a small amount of data, but if the records number in the thousands, like 5,000 to 50,000 or more, it doesn't work and shows the error below in Firebug:
script stack space quota is exhausted
How can I process and get the result for big data?
Thanks
I think you could compress your JSON response to gzip format.
The JSON string is text, so after compression you will get a much smaller response.
For how to compress your response data in PHP, please check here.
I am trying to send an XML file from a text field in my HTML, via AJAX, to a PHP file. This is the almighty PHP file:
<?php
$data = urldecode($_POST["xml"]);
echo $data;
?>
Data is sent to this file as such:
$("#btn_save").click(function() {
    var data = escape($("#textfield").text());
    alert(data);
    $.ajax({
        url: "validate.php",
        method: "POST",
        data: "xml=" + data,
        complete: function(e) { alert(e.responseText); }
    });
});
Now, as long as I don't send more than a few lines of code, it works as it should. When I paste in a 60-line XML file however, validate.php returns
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>403 Forbidden</title>
</head><body>
<h1>Forbidden</h1>
<p>You don't have permission to access /xml_stylist/form/validate.php
on this server.</p>
<p>Additionally, a 404 Not Found
error was encountered while trying to use an ErrorDocument to handle the request.</p>
<hr>
<address>Apache mod_fcgid/2.3.5 mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/5.0.2.2635 Server at dahwan.info Port 80</address>
</body></html>
What am I doing wrong?
Thanks
Change
method: "POST"
to
type: "POST"
that may do the trick.
BenSho is correct, the argument is called type. In addition:
$("#textfield").text()
I'm guessing that's a <textarea>. You shouldn't use text() or html() to read content from an input field, it doesn't do what you think. Use val().
var data = escape($("#textfield").text());
Don't ever use escape(). It is a weirdo JavaScript-specific function that looks like URL-encoding but isn't. If you use it for URL-encoding you will corrupt plus signs and all non-ASCII characters.
The correct JavaScript function for URL-encoding is encodeURIComponent(). However, since you are using jQuery, much better to let it work out URL-encoding for you by passing an object in:
data: {xml: $("#textfield").text()},
Finally:
$data = urldecode($_POST["xml"]);
You don't have to (and shouldn't) URL-decode anything manually. PHP URL-decodes parameters in a request body into raw strings for you.
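A quick demonstration of why escape() corrupts data: it uses a non-standard encoding for non-ASCII characters and leaves + untouched, which a server decoding application/x-www-form-urlencoded will turn into a space:

```javascript
// encodeURIComponent() produces standard UTF-8 percent-encoding;
// the deprecated escape() uses Latin-1 %XX / %uXXXX and skips '+'.
var enc  = encodeURIComponent("é");   // UTF-8 percent-encoding
var esc  = escape("é");               // non-standard Latin-1 encoding
var plus = encodeURIComponent("1+1"); // '+' properly escaped as %2B
var escPlus = escape("1+1");          // '+' left alone - decodes to a space
```

With jQuery's object form of data, none of this matters, since jQuery handles the encoding itself.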
Most browsers have a practical maximum of about 2,083 characters in the URL; there is no such limit on a POST request body. A GET request passes the parameters in the URL, whereas a POST does not. Which you should use depends on how much you're actually sending, and whether the data is sensitive (use POST).
Setting the data option on AJAX calls means jQuery will add these to the query string in a GET request. Most browsers have a limit on the length of a GET request. If your XML data is too big, you should switch to POST.
Optimize your php.ini:
post_max_size
you may have to set your memory_limit to a higher value, depending on the memory usage of your script
max_execution_time could be a problem
try this:
$("#btn_save").click(function() {
    var data = $("#textfield").text();
    $.ajax({
        url: "validate.php",
        type: "POST",
        data: {"xml": data},
        complete: function(e) { alert(e.responseText); }
    });
});