In the existing SO literature, I have seen examples that use jQuery and PHP to proxy data:
jQuery
function loadTheUrl(x) {
    $.ajax({
        url: 'loader.php',
        data: {url: x},
        type: 'get',
        success: function(output) {
            $('.loading-space').html(output);
        }
    });
}
PHP
<?php
$doc = new DOMDocument();
$doc->loadHTML(file_get_contents($_GET['https://www.google.com/finance/getprices?q=.NSEI&x=NSE&i=600&p=1d&f=d,o']));
echo $doc->saveHTML();
Here is what the first few lines of the data look like at the URL seen in the PHP above. It is a page of plain text only, like this:
MARKET_OPEN_MINUTE=570
MARKET_CLOSE_MINUTE=960
INTERVAL=300
COLUMNS=DATE,OPEN
DATA=
TIMEZONE_OFFSET=-240
a1521120900,555.45
1,554.53
2,554.07
3,553.9
4,552.67
...
As far as I know, the PHP is correct. For my use case, I need to replicate the above jQuery by means of d3. I was hoping that d3 would have something I could use to interface with the data my PHP file is spitting out.
If you are wondering why I am going to such lengths, it's because my browser will not let me run requests like d3.text('https://www.google.com/finance...') (i.e. d3.text(), d3.csv(), et al.) directly, due to the infamous access-control-origin header error. So my plan is to mirror the data from the Google backfill via a local PHP file that I'm serving from my server. This way everybody is happy (or at least me).
When I tried calling d3.text() on my PHP file, the data was not loaded correctly. In other words, I tried d3.text('my_loader.php'), but that resulted in lots of NaN errors, which I have usually noticed are symptoms of a parsing error of some sort. Checking back through my code, though, things seem fine: I have unary parsing in place, so the strings should be cast to numbers. In fact, everything was working fine offline; I could load and parse the data directly when in my IDE. It was only when I published my d3 graph to the web that I realized I couldn't parse data from different origins. This is the point where I added the PHP component. My hunch was that d3 was actually trying to parse my PHP page and not the URL the PHP was pointing to. I later confirmed this by printing the data returned by d3.text() to the console; it was indeed the PHP page itself.
Question: In light of my cross-origin data situation, what can I do from the d3 side or the PHP side to make the two interface with each other correctly? I wonder if d3 is only suited for same origin data, or if there actually is a method to read/parse cross-origin data using a technique I'm not aware of (PHP or otherwise).
The URL you are fetching does not exist within the $_GET variable.
The parameters you are submitting are an array:
$_GET = ['url' => 'some_url'];
Which means this:
$_GET['https://www.google.com/finance/getprices?q=.NSEI&x=NSE&i=600&p=1d&f=d,o']
is wrong.
It should be $_GET['url']
With no validation:
<?php
header('Content-Type: text/plain');
echo file_get_contents($_GET['url']);
But that's neither here nor there.
The issue, I think, is with the URL being passed. It contains a question mark and multiple ampersands (? and &). I think this is bjorking up the $_GET parameter, so all you're getting is https://www.google.com/finance/getprices?q=.NSEI. You need to wrap the URL in encodeURIComponent:
var url = encodeURIComponent('https://www.google.com/finance/getprices?q=.NSEI&x=NSE&i=600&p=1d&f=d,o');
d3.text('/path/to/myscript.php?url=' + url);
Cross-origin restrictions apply to all Ajax requests. Instead of requesting d3.text('https://www.google.com/finance...'), why not try d3.text('mymethod.php') and make sure the script returns plain text rather than HTML via the headers:
<?php
header('Content-Type: text/plain');
$file = file_get_contents('https://www.google.com/finance/getprices?q=.NSEI&x=NSE&i=600&p=1d&f=d,o');
echo $file;
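For the client side, here is a minimal sketch of how the proxied text could be read and parsed with d3 (this assumes d3 v4's callback-style API; the line filtering matches the DATE,OPEN sample shown in the question and is only illustrative):
d3.text('mymethod.php', function(error, text) {
    if (error) throw error;
    // keep only the data rows, skipping the MARKET_*/COLUMNS/... header block
    var rows = text.split('\n').filter(function(line) {
        return /^a?\d/.test(line); // data rows start with a digit, or 'a' + a timestamp
    });
    var data = rows.map(function(line) {
        var cols = line.split(',');
        return {
            date: cols[0].replace(/^a/, ''), // strip the 'a' marker on the anchor row
            open: +cols[1]                   // unary plus casts the string to a number
        };
    });
    console.log(data);
});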
I have noticed a strange phenomenon in my LAMP environment.
Over the frontend I execute an AJAX post request with jQuery like this:
$.post('save.php', {data1: d1, data2: d2, [...], dataN: dN})
The variables d1 to dN are collected from the website (e.g. from text inputs, textareas, checkboxes, etc.) with jQuery beforehand.
The file save.php takes the post parameters data1 to dataN and saves them in the database in one query.
The request takes about 500ms and works without problems unless I change pages (e.g. by clicking a link) during the request.
Normally, I would expect the request to be aborted and ignored (which would be fine) but (and this is the strange behaviour) the request seems to be completed but only with part of the data transmitted and thus saved.
That means for example, that the php script saves only data1 to data5 and sets data6 to dataN to empty.
The problem seems to be caused by the AJAX request already (not the php script) since fields $_POST['data6'] to $_POST['dataN'] are not set in php in this scenario.
So my questions:
Why does this happen (is this expected behaviour)?
How can I avoid it?
Update
The problem is neither jQuery nor PHP alone. jQuery collects the values correctly and tries to post them to PHP. I just validated it - it works.
The PHP script, on the other hand, handles everything it gets as expected - it just does not receive the whole request.
So the problem must be the interrupted request itself. Contrary to what I'd expect, it does not abort or fail; it still transmits all the data up to the cut-off.
Then PHP gets this POST data and starts handling it - obviously missing some information.
Update 2
I fixed the problem by adding a parameter eof after dataN and checking whether it was set in PHP. This way I can be sure the whole request was transmitted.
Nevertheless, this does not fix the source of the problem, which I still don't understand.
Any help anyone?
Try the following actions to debug the problem:
Check post_max_size in your PHP settings and compare it with the data size you are posting.
Use an HTTP request builder, e.g. Fiddler, to make an HTTP request and check what it returns.
Use print_r($_POST); at the top of save.php to check what you are getting in it.
Use a tool like Firebug to check what jQuery has posted.
You should also verify the JSON object you are posting on the client side, i.e. JSON.stringify(some_object);
Try posting some basic sample data { "data1":1, "data2":2, "data3":3, "data4":4, "data5":5 , "data6":6 }
Most probably you are sending too much data, or the data is invalid!
Edits:
A crude check, but let's say you posted a count parameter as well; then you can directly check isset($_POST['data'.$_POST['count']])
I think we can rule out problems at the server side (unless it's some exotic or self-crafted server daemon), because nobody ever sends "end-of-data" parameters with an HTTP POST request to make sure all data is really sent. This is handled by HTTP itself (see e.g. Detect end of HTTP request body). Moreover, I don't think that you have to check the Content-Length header when POSTing data to your server, simply because of the fact that nobody does this, ever. At least not in totally common circumstances like you describe them (sending an Ajax POST through jQuery).
So I suppose that jQuery sends a syntactically correct POST, but it's cut off. My guess is that if you interrupt this data collecting by navigating to another page, jQuery builds an Ajax request out of the data which it was able to gather and sends a syntactically correct POST to your server, but with cut-off data.
Since you're using Firebug, please go to its net tab and activate persist, so traffic data is not lost when navigating to another page. Then trigger your Ajax POST, navigate to another page (and thereby "interrupt" the Ajax call) and check in Firebug's net tab what data has actually been sent to the server by opening ALL the POST requests and checking the Headers tab (and inside this, the Request Headers tab).
My guess is that one of two things might happen:
You will see that the data sent to the server is cut off already in the headers being presented to you in Firebug's net tab and the Content-Length is calculated correctly according to the actual (cut off) length of the POST data. Otherwise, I'm sure the server would reject the request as Bad Request as a whole.
You will see that there are multiple POST requests, some of them (perhaps with the full, non-cut-off data) actually interrupted and therefore never reaching the server, but at least one POST request (again, with the cut-off data) that is triggered by some other mechanism in your Javascript, i.e. not the trigger you thought; by navigating to another page, more and other Ajax requests might be triggered (just a guess since I don't know your source code).
In either case, I think you'll find out that this problem is client-related and the server just processes the (incomplete, but (in terms of HTTP) syntactically valid) data the client sent to it.
From that point on, you could debug your Javascript and implement some mechanism that prevents sending incomplete data to your server. Again, it's hard to tell what to do exactly since I don't know the rest of your source code, but maybe there's some heavy action going on in collecting the data, and you could possibly make sure that the POST only happens if all the data is really collected. Or, perhaps you could prevent navigation until the Ajax request is completed or such things.
What might be interesting, if all of this doesn't make sense, would be to have a look at more of your source code, especially how the Ajax POST is triggered and if there are any other events and such if you navigate to another page. Sample data you're sending could also be interesting.
EDIT: I'd also like to point out that outputting data with console.log() might be misleading, since it's in no way guaranteed that this is the data actually being sent, it's just a logline which evaluates to the given output at the exact time when console.log() is called. That's why I suggested sniffing the network traffic, because then (and only then) you can be sure what is really being sent (and received).
Nonetheless, this is a little tricky if you're not used to it (and impossible if you use encrypted traffic e.g. by using HTTPS), so the Firebug net tab might be a good compromise.
You can verify the value of the Content-Length header being received by the PHP.
This value ought to have been calculated client side when running the POST query. If it does not match, that's your error then and there. And that's all the diagnostics you need - if the Content-Length does not match the POST data, reject the POST as invalid; no need of extra parameters (computing the POST data length might be a hassle, though). Also, you might want to investigate why does PHP, while decoding the POST and therefore being able to verify its length, nonetheless seems to accept a wrong length (maybe the information needed to detect the error is somewhere among the $_SERVER variables?).
If it does match though, and still data isn't arriving (i.e., the Content-Length is smaller, and correctly describes the cut-off POST), then it is proof that the POST was assembled after the cut-off, and therefore either the error is in the browser (or, unlikely, in jQuery) or there is something between the browser and the server (a proxy?) that is receiving an incomplete query (with Content-Length > actual length) and is incorrectly rewriting it, making it appear "correct" to the server, instead of rejecting it out of hand.
Some testing of both the theory and the workaround
Executive summary: I got the former wrong, but the latter apparently right. See code below for a sample that works on my test system (Linux OpenSuSE 12.3, Apache).
I believed that a request with wrong Content-Length would be refused with a 400 Bad Request. I was wrong. It seems that at least my Apache is much more lenient.
I used this simple PHP code to access the key variables of interest to me
<?php
$f = file_get_contents("php://input");
print $_SERVER['CONTENT_LENGTH'];
print "\nLen: " . strlen($f) . "\n";
?>
and then I prepared a request with a wrong Content-Length, sending it out using nc:
POST /p.php HTTP/1.0
Host: localhost
Content-Length: 666
answer=42
...and lo and behold, nc localhost 80 < request yields no 400 error:
HTTP/1.1 200 OK
Date: Fri, 14 Jun 2013 20:56:07 GMT
Server: Apache/2.2.22 (Linux/SUSE)
X-Powered-By: PHP/5.3.17
Vary: Accept-Encoding
Content-Length: 12
Content-Type: text/html
666
Len: 10
It occurred to me then that the Content-Length might well be off by one or two in case the request ended with a line terminator - and which one: LF? CRLF? However, when I added simple HTML to be able to POST it from a browser
<form method="post" action="?"><input type="text" name="key" /><input type="submit" value="go" /></form>
I was able to verify that in Firefox (latest), IE8, Chrome (latest), all running on XP Pro SP3, the value of the Content-Length is the same as the strlen of php://input.
Except when the request is cut off, that is.
The only problem is that php://input is not always available even for POST data.
This leaves us still in a quandary:
IF THE ERROR IS AT THE NETWORK LEVEL, i.e., the POST is prepared and supplied with a correct Content-Length, but the interruption makes it so that the whole data is cut off as this comment by Horen seems to indicate:
So only the first couple of post parameters arrived, sometimes the value of one parameter was even interrupted in the middle
then really checking Content-Length will prevent PHP from handling an incomplete request:
<?php
if ('POST' == $_SERVER['REQUEST_METHOD'])
{
    if (!isset($_SERVER['CONTENT_LENGTH']))
    {
        header($_SERVER['SERVER_PROTOCOL'] . ' 400 Bad Request', True, 400);
        die();
    }
    if (strlen(file_get_contents('php://input')) != (int)$_SERVER['CONTENT_LENGTH'])
    {
        header($_SERVER['SERVER_PROTOCOL'] . ' 400 Bad Request', True, 400);
        die();
    }
}
// ... go on
?>
ON THE OTHER HAND if the problem is in jQuery, i.e. somehow the interruption prevents jQuery from assembling the full POST, and yet the POST is made up, the Content-Length calculated from the incomplete data, and the packet sent off -- then my workaround can't possibly work, and the "telltale" extra field must be used, or... perhaps the .post function in jQuery might be extended to include a CRC field?
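A rough sketch of such a telltale field (hypothetical field names, and a simple length check standing in for a real CRC; note that JavaScript string lengths count UTF-16 code units, so this naive comparison assumes ASCII data):
jQuery
var payload = JSON.stringify(collectedData); // serialize everything in one field
$.post('save.php', {payload: payload, len: payload.length});
PHP
<?php
if (!isset($_POST['payload'], $_POST['len'])
    || strlen($_POST['payload']) != (int)$_POST['len'])
{
    header($_SERVER['SERVER_PROTOCOL'] . ' 400 Bad Request', True, 400);
    die();
}
$data = json_decode($_POST['payload'], true); // the full payload arrived intact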
POST data looks just like a GET request, with the body after the headers:
Header1: somedata1\r\n
Header2: somedata2\r\n
...
HeaderN: somedataN\r\n
\r\n
data1=1&data2=2&...&dataN=N
When the request is aborted, in some cases the last line may be passed only partially. So, here are some possible solutions:
Compare Content-Length and strlen($HTTP_RAW_POST_DATA) (or strlen(file_get_contents('php://input')) on newer PHP versions)
Validate input data
Pass not so much data at one time
I have tried to recreate this problem using triggers, manually, changing server settings, doing my very best to #$%& things up, using different data sizes, but I never ever got only half a request in PHP - simply because Apache will not invoke PHP until the request is completely done. See this question about Reading "chunked" POST data in PHP.
So the only thing that can go wrong is that jQuery only gathers part of the data and then makes a POST request. Using just $.post('save.php', data) as you mentioned, that won't happen. It's either still gathering the data or it's waiting for a response from the server.
If you switch sites during the gathering, there won't be a request. And if you switch after the request has been made and you move away quickly, before all data has been transmitted, the Apache server will see it as half a request and won't invoke PHP.
Some suggestions:
Is it possible that you are using separate pages for successful and partial requests? PHP only adds the first 1000 elements to $_POST; perhaps the failed requests have more elements than that, so there won't be an EOF param.
Is it possible that you are gathering the data in a global var in JavaScript and have an onbeforeunload method that sends data as well? Because then there might only be half the data in the POST.
Can you share some information on the data you are sending? Are there a lot of small elements (like data1 through data10000) or a few large ones?
Is it always the same element that you receive last, like always data6 as you mention? Because if it is, the chances of a failed attempt always cutting off at the exact same dataN field would be very slim.
My problem was there were too many variables in one of my post objects.
PHP has a max_input_vars variable which is set to 1000 by default.
I added this line to my .htaccess file (since I don't have access to the php.ini file):
php_value max_input_vars 5000
Problem solved!
Can you check your host log at
/var/log/messages
Last time I had "missing" POST variables in PHP, I found that I was sending NUL ASCII chars and the server (CentOS) was considering it an attack, then dropping those specific variables... Took me a week to figure it out! This was the server log response:
suhosin[1173]: ALERT - ASCII-NUL chars not allowed within request variables - dropped variable 'data3' (attacker '192.168.0.37', file '/var/www/upload_reader.php')
If that is your problem, try to compress your variables with JS and encode them with Base64. Post them with Ajax, then receive them in PHP, Base64-decode, then uncompress! That solved it for me ;)
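A minimal sketch of the encoding leg of that idea (the compression step is skipped, since it would need an extra library; the payload field name and the values object are made up):
jQuery
// UTF-8-safe Base64: encodeURIComponent -> raw bytes -> btoa
var payload = btoa(unescape(encodeURIComponent(JSON.stringify(values))));
$.post('save.php', {payload: payload});
PHP
<?php
// decode the Base64 payload back into an array; NUL bytes survive the transport
$values = json_decode(base64_decode($_POST['payload']), true);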
Solved the problem by increasing max_input_vars in my server's php.ini file
Since I had more than 1000 variables in the array, only part of them was received by the server!
Hope this helps someone!
In order to find out more about what is happening, why not code your Ajax request properly using jQuery's ajax function, and use all the callback functions to track what happened to your call and what came back? The type option is set to POST and the data option carries whatever object structure { ... } you like.
$.ajax({
    url : "save.php",
    type : "POST",
    data : {
        "ajax_call" : "SOME_CUSTOM_AJAX_REQUEST_REFERENCE",
        "data1" : data1,
        "data2" : data2,
        "data3" : data3,
        "dataN" : dataN
    },
    //dataType : "html", contentType: "text/html; charset=utf-8",
    dataType : "json", contentType: "application/json; charset=utf-8",
    beforeSend: function () {
        //alert('before send...');
    },
    dataFilter: function () {
        //alert('data filter...');
    },
    success: function(data, textStatus, jqXHR) {
        //alert('success...');
        var response = JSON.parse(jqXHR.responseText);
        if (undefined == response.data) {
            my_error_function();
            return;
        }
        my_response_function(response.data);
    },
    error: function(jqXHR, textStatus, errorThrown) {
        //alert('error...');
    },
    complete: function (xhr, status) {
        //alert('end of call...');
        my_continuation_function();
    }
});
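On the server side, a minimal sketch of a save.php that answers in the JSON shape the success callback above expects (the data field name simply mirrors that callback's convention):
<?php
// sketch: reply with JSON so the client can JSON.parse the responseText
header('Content-Type: application/json');
// ... save the $_POST fields to the database here ...
echo json_encode(array('data' => 'saved'));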
Before sending the request, set an onbeforeunload handler on the window (to prevent the transition to another page), and unbind it after success.
For example:
var warn = function(e){
    // Returning a string makes the browser ask before leaving the page
    return 'Please wait until saving has finished.';
};
$(window).on('beforeunload', warn);
// ... and in the Ajax success callback:
// $(window).off('beforeunload', warn);
A guess - the Suhosin extension on your Ubuntu/Debian server causes the fields to be cut off?
I have the same problem. The POST size is about 650 KB, and it gets corrupted if the browser window is closed or the page refreshed before the POST has completed. ajax.success() is not fired, but the partial data is posted to "save.php" with a 200 OK status. $_SERVER's Content-Length is of no use, as suggested elsewhere, since it matches the actual content length of the partial data.
I figured out 2 ways of overcoming this:
As you propose, append a POST variable at the end. Seems to work, although it feels a bit risky.
Make your "save.php" script save to a temporary DB column, and then use ajax.success() to call another PHP script, say savefinal.php, without passing any data, which transfers the data from the temp column to the final column (or just flags the data as valid), as sketched below. That way, if the POST is interrupted, the data will only reside in the temp column of the database (or will never be flagged as valid).
The .success() callback is not called if the POST is interrupted, so this should work.
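A minimal sketch of that two-step idea (the table, columns, and the $pdo connection are all made-up assumptions; real code would also track which row belongs to which session):
save.php
<?php
// store the submission as pending (valid = 0)
$stmt = $pdo->prepare('INSERT INTO submissions (payload, valid) VALUES (?, 0)');
$stmt->execute(array(json_encode($_POST)));
savefinal.php
<?php
// called from ajax.success(): promote the pending row to valid
$pdo->exec('UPDATE submissions SET valid = 1 WHERE valid = 0');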
I presume this is a jQuery bug sending a wrong (very small) Content-Length to Apache, so that Apache is forced to assume that the POST request has completed, but I'm not really sure.
Why does this happen (is this expected behaviour)?
I was also looking for the cause of this strange behaviour, where some variables were missing from the POST, and came across a question where very similar behaviour is explained by a web client on a slow connection sending a POST request with multipart/form-data.
Here goes the mentioned question:
I am facing a problem when a remote web client with slow connection
fails to send complete POST request with multipart/form-data content
but PHP still uses partially received data to populate $_POST array.
As a result one value in $_POST array can be incomplete and more
values can be missing.
See How to check for incomplete POST request in PHP
How can I avoid it?
There you can find the recommended solution as well; it is essentially the same one you already proposed:
You can add a field <input type="hidden" name="complete"> (for example)
as the last parameter. In PHP, check first whether this parameter was
sent from the client. If this parameter was sent, you can be sure that
you got the entire data.
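A minimal sketch of that check (assuming the hidden complete field really is the last one in the request):
<?php
if (!isset($_POST['complete'])) {
    // the request was cut off before the last field arrived
    header($_SERVER['SERVER_PROTOCOL'] . ' 400 Bad Request', True, 400);
    die();
}
// ... safe to process the rest of $_POST here ...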
I have a fairly large multidimensional Javascript array which I'm passing to my PHP file with an AJAX POST as such:
$.ajax({
type: "POST",
dataType: 'json',
url: "forms/scripts/testArray.php",
data: {
header: header,
gridData: gridData
}
});
gridData is my multidimensional array which has 1000 elements with each of their child arrays containing 3 elements. When I debug and look at the data in my PHP file, there are only about 350 elements from the array that arrived from the POST.
Using Firebug, I get this info about File Size Sent and File Size Received (screenshot omitted):
The little pop-up is telling me the AJAX call sent 462.8 KB but only 4.9 KB was actually received. Is this true? Is that why I'm only getting part of the array I'm attempting to POST?
Also, is this the best way to move a multidimensional array from Javascript to PHP via an AJAX POST?
I appreciate any help you can offer!
Probably, something in the toolchain is limiting the maximum amount of stuff that can be sent/received. This can either be in your webserver (Apache probably), or (more likely) in your PHP settings. PHP has some settings like post_max_size, max_input_time, max_input_nesting_level, max_input_vars, etcetera. You should check out these settings!
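For reference, the corresponding php.ini directives look like this (the values shown are just illustrative defaults, not recommendations):
; php.ini - limits that can silently truncate large POST requests
post_max_size = 8M
max_input_time = 60
max_input_vars = 1000
max_input_nesting_level = 64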
There is probably a weird character in the data throwing off the json encoding/decoding on or just after the last element that is being 'received'.
Remove the ; (semicolon) that comments out the line in the php.ini file and increase max_input_vars; then it works:
max_input_vars = 10000
I am having problems with this topic: Access-Control-Allow-Origin.
I read about it and I found that it is possible to get the response using PHP, here.
But I don't know how to adapt that code to my JavaScript; I still have the same problem.
I tried this in javascript:
var url ='http://localhost:8080/com.webserver/rest/manage/order?parameter=parameter';
req=Ajax("getResponse.php/?" + url)
if (req.status=200)
alert("hi");
And on php file:
<?php
echo file_get_contents($_GET['url']);
?>
And nothing happens. I tried with Ajax, something like:
$.ajax({
url: "http://localhost:8080/com.webserver/rest/manage/order?parameter=parameter",
async: false,
dataType: 'html',
success: function (text) {
alert(text);
}
});
But always same problem....
I have read about a lot of people on the internet having the same problem, but no one gets an answer. I just found two ways: using Chrome with an option that is only recommended for developers, and adding headers on the server, but I don't know where to add them. I am using Apache Tomcat (Catalina) for that localhost. I have two servers: the webpage (in XAMPP) and the REST service (in Tomcat).
Change
req=Ajax("getResponse.php/?" + url)
to
req=Ajax("getResponse.php/?url=" + url)
Bear in mind this is insecure: I could pass anything into the url parameter and your PHP script would use it, allowing people to read files from your local system as well as getting your PHP script to download malicious files from elsewhere.
Edit:
The best way to secure it is to use an actions list. This means that the user never sees the URL and can only modify an action word. For example:
req=Ajax("getResponse.php/?do=getOrders")
then in php
$actions = array();
$actions['getOrders'] = "http://localhost:8080/com.webserver/rest/manage/order?parameter=parameter";
if(array_key_exists($_GET['do'], $actions))
echo file_get_contents($actions[$_GET['do']]);
Usually you'd want to do more than just translate an action to a URL; you may want to pass additional parameters. In this case you could use a switch or a bunch of ifs to check whether $_GET['do'] is equal to something and then process it, as in the sketch below. It would take hours to give an example of every possible implementation method, so you may want to use Google.
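A minimal sketch of that switch-based variant (the action name and the parameter handling are made up for illustration):
<?php
switch ($_GET['do']) {
    case 'getOrders':
        // only a server-built, encoded query string ever reaches the backend
        $param = urlencode($_GET['parameter']);
        echo file_get_contents('http://localhost:8080/com.webserver/rest/manage/order?parameter=' . $param);
        break;
    default:
        header($_SERVER['SERVER_PROTOCOL'] . ' 400 Bad Request', True, 400);
}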
Please note: whilst the method suggested here adds 100x more security to your script, it's not infallible, especially if you start passing through parameters from users too. Once again, use Google.
file_get_contents("php://input") or $HTTP_RAW_POST_DATA - which one is better to get the body of JSON request?
And which request type (GET or POST) should I use to send JSON data when using client side XmlHTTPRequest?
My question was inspired from this answer:
How to post JSON to PHP with curl
Quote from that answer:
From a protocol perspective file_get_contents("php://input") is actually more correct, since you're not really processing http multipart form data anyway.
Actually php://input allows you to read raw request body.
It is a less memory intensive alternative to $HTTP_RAW_POST_DATA and does not need any special php.ini directives.
From Reference
php://input is not available with enctype="multipart/form-data".
php://input is a read-only stream that allows you to read raw data
from the request body. In the case of POST requests, it is preferable
to use php://input instead of $HTTP_RAW_POST_DATA as it does not
depend on special php.ini directives. Moreover, for those cases
where $HTTP_RAW_POST_DATA is not populated by default, it is a
potentially less memory intensive alternative to activating
always_populate_raw_post_data.
Source: http://php.net/manual/en/wrappers.php.php.
file_get_contents("php://input") - gets the raw POST data; you need to use this when you write APIs and need XML/JSON/... input that cannot be decoded to $_POST by PHP.
An example:
sending a JSON string by POST:
<input type="button" value="click" onclick="fn()">
<script>
function fn(){
    var js_obj = {plugin: 'jquery-json', version: 2.3};
    var encoded = JSON.stringify( js_obj );
    var data = encoded;
    $.ajax({
        type: "POST",
        url: '1.php',
        data: data,
        success: function(data){
            console.log(data);
        }
    });
}
</script>
1.php
<?php
//print_r($_POST); // empty!!! doesn't work ...
var_dump(file_get_contents('php://input'));
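To actually use that JSON on the server side, a decoding step like this would follow (a sketch based on the example payload above):
<?php
// decode the raw JSON body into an associative array
$obj = json_decode(file_get_contents('php://input'), true);
echo $obj['plugin']; // "jquery-json" for the payload sent above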
For JSON data, it's much easier to POST it with the "application/json" content type. If you use GET, you have to URL-encode the JSON in a parameter, and it's kind of messy. Also, there is no size limit when you do a POST, while GET's size is very limited (4K at most).
The usual rules should apply for how you send the request. If the request is to retrieve information (e.g. a partial search 'hint' result, or a new page to be displayed, etc...) you can use GET. If the data being sent is part of a request to change something (update a database, delete a record, etc..) then use POST.
Server-side, there's no reason to use the raw input, unless you want to grab the entire post/get data block in a single go. You can retrieve the specific information you want via the _GET/_POST arrays as usual. AJAX libraries such as MooTools/jQuery will handle the hard part of doing the actual AJAX calls and encoding form data into appropriate formats for you.
Your second question is easy: GET has a size limitation of 1-2 kilobytes on both the server and browser side, so any larger amount of data you'd have to send through POST.