I am trying to list the files in a Google Drive folder.
If I use jQuery I can successfully get my results:
var url = "https://www.googleapis.com/drive/v3/files?q='" + FOLDER_ID + "'+in+parents&key=" + API_KEY;
$.ajax({
    url: url,
    dataType: "jsonp"
}).done(function(response) {
    // I get my results successfully
});
However, I would like to get these results with PHP, but when I run this:
$url = 'https://www.googleapis.com/drive/v3/files?q='.$FOLDER_ID.'+in+parents&key='.$API_KEY;
$content = file_get_contents($url);
$response = json_decode($content, true);
echo json_encode($response);
exit;
I get an error:
file_get_contents(...): failed to open stream: HTTP request failed! HTTP/1.0 403 Forbidden
If I run this in browser:
https://www.googleapis.com/drive/v3/files?q={FOLDER_ID}+in+parents&key={API_KEY}
I get:
The request did not specify any referer. Please ensure that the client is sending referer or use the API Console to remove the referer restrictions.
I have set up referrers for my website and localhost in the Google Developers Console.
Can someone explain the difference between the jQuery call and the PHP call, and why the PHP call fails?
It's either the headers or the cookies.
When you make the request with jQuery, the user's user agent, IP and other headers are sent to Google, along with the user's cookies (which allow the user to stay logged in). When you do it with PHP, that data is missing, because you, the server, become the one who sends the request, not the user or the user's browser.
It might be that Google blocks requests with invalid user-agents as a first line of defense, or that you need to be logged in.
Try making the same jQuery AJAX request while you're logged out. If it doesn't work, you know your problem.
Otherwise, you need to alter the headers. Take a look at this: PHP file_get_contents() and setting request headers. Of course, you'll need to do some trial-and-error to figure out which missing header allows the request to go through.
Regarding the referrer: the jQuery request works because the Referer header is set to the page you're currently on. When you open the URL directly there is no Referer header, and PHP requests made with file_get_contents() send no Referer by default either, because it doesn't make much sense for them to have one.
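As a starting point for that trial-and-error, here is a minimal sketch of setting headers such as Referer and User-Agent for file_get_contents() via a stream context (the header values below are placeholders, not something Google documents as required):

<?php
// Placeholder values – use your own folder ID, API key and site URL.
$url = 'https://www.googleapis.com/drive/v3/files?q=' . $FOLDER_ID . '+in+parents&key=' . $API_KEY;

$context = stream_context_create(array(
    'http' => array(
        'method' => 'GET',
        // Mimic the headers a browser request would carry.
        'header' => "Referer: https://www.example.com/\r\n" .
                    "User-Agent: Mozilla/5.0 (compatible; MyApp/1.0)\r\n"
    )
));

$content = file_get_contents($url, false, $context);
$response = json_decode($content, true);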
Related
I've just started using the fetch API to send my data, as ReactJS recommends. Prior to this I used AngularJS to send API requests to my server. I am currently encountering a most annoying bug: my HTTP requests fail to load the correct session.
Firstly, my web application uses a PHP API to expose itself and is (basically, for now) a bunch of .php files that return parts of the app. One of them is authenticateUser.php, which authenticates the user and stores the session. However, I noticed that my HTTP requests with fetch weren't working as expected: other endpoints like getCurrentUser.php returned NotLoggedIn despite my having just authenticated. I thought it might have something to do with the sessions, so in authenticateUser.php I made the script output the current session_id after authenticating, via: echo json_encode(array('session_id' => session_id()));
I have noticed that when I load the URL myself in the browser, myapp.com/php/http/user/authenticateUser.php, the session_id I get is completely different from the one I get when using fetch:
return fetch("/php/http/user/authenticateUser.php", {
    method: "post",
    headers: {
        'Content-Type': 'application/x-www-form-urlencoded'
    }
}).then(function(successResponse) {
    return successResponse.json();
}, function(errorResponse) {
    console.log(errorResponse);
}).then(function(data) {
    console.log(data);
});
I checked further and thought maybe it was because fetch was querying the HTTP version of my site instead of the HTTPS version. So I changed the logic in authenticateUser.php so that if it detects an existing session it prints that session_id, otherwise it creates and prints a new one. Every time I sent my HTTP request with fetch I got a different session_id, meaning my session was changing with each request. Here is the code I used:
header("Content-Type: application/json");
if (session_status() == PHP_SESSION_NONE) {
session_start();
}
http_response_code(200);
echo json_encode(array('session_id' => session_id()),JSON_PRETTY_PRINT);
Am I missing something about the fetch API, or the headers? Does anyone have any experience with this that could enlighten me?
The answer is that fetch does not send cookies by default; you need to opt in with the credentials option on the request:
credentials: 'same-origin'
With that option set, the browser sends the PHP session cookie along, so session_id() stays the same across requests.
I am trying to build a post server similar to posttestserver.com and have been running into lots of trouble.
The following returns nothing -
do {
    $data = file_get_contents('php://input');
} while (empty($data));

header('HTTP/1.0 200 OK');
header('Content-Type: text/html');
var_dump($data);
I have also had a look into using sockets, but the client should be directed to a URL rather than an IP/port for the client's ease. I suspect that this is what I need to use, but I am not sure where to start.
For what it's worth, the client expects an HTTP 2XX response code from its HTTP POST request, and the client will not attempt the next HTTP POST request while a previous request is still in flight.
Any ideas?
It would seem that you cannot capture and view the POST data in a single browser window.
For what it's worth, here is the code that worked in the end:
$data = file_get_contents('php://input');
//do something with the data such as write to file or database
Then you could use the data in another PHP script.
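A minimal sketch of that idea, using hypothetical file names (receive.php is the URL the client POSTs to and stores each body; view.php is opened in the browser to inspect what was captured):

<?php
// receive.php – the endpoint the client POSTs to.
$data = file_get_contents('php://input');          // raw POST body

// Append the body to a log file (hypothetical path).
file_put_contents(__DIR__ . '/post-log.txt', $data . PHP_EOL, FILE_APPEND);

// The client only needs a 2XX response back.
http_response_code(200);
header('Content-Type: text/plain');
echo 'OK';

<?php
// view.php – open this URL in the browser to see the captured data.
header('Content-Type: text/plain');
readfile(__DIR__ . '/post-log.txt');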
In JavaScript I used to be able to have an HTTP request trigger a callback when AJAX sent a response back for the data I had sent to the server. I'm now experimenting with the OAuth2 gem for Ruby, and I'm finding callbacks don't work the same way.
I have a web server and a Facebook app set up, and I have a small PHP script that writes the current URL (including the auth code, for example) to a file, no problem. All the settings in the Facebook app are configured, and if I put this URL in the browser:
http://graph.facebook.com/oauth/authorize?client_id=[my_client_id]&redirect_uri=http://localhost/oauth/callback/index.php
It redirects successfully to that script, which then writes the authorization code to a file that I can use to get the access token. The problem is that I can only do this process manually; calling Net::HTTP.get(URI(address)) in Ruby doesn't seem to trigger the PHP script.
Anyone have any ideas?
I have no idea why you posted your history with JavaScript AJAX requests, as it has no bearing on your Ruby script, which by the way doesn't even use a callback method/function. Using a callback function just means you are calling some function and passing it another function as an argument. When I started programming, the term callback function was very confusing to me, and in my opinion the term should be dropped from the lingo.
As for your Ruby script, you need to use something like Firebug to look at the request headers that your browser sends to the server when you manually enter the URL. If you use those same headers in your Ruby script, then it should work, e.g.:
req['header1'] = 'hello'
req['header2'] = '10'
or:
headers = {
  'header1' => 'hello',   # replace with the real header names/values from Firebug
  'header2' => '10',
  ...
}
req = Net::HTTP::Get.new(uri.request_uri, headers)  # uri = URI(<the authorize URL>)
http = Net::HTTP.new(uri.host, uri.port)
resp = http.request(req)
It's possible that you have a cookie set in your browser, which your browser automatically adds to the request headers when it sends the request to the server. Your browser probably adds quite a few headers to the request, many of which will have no bearing on your problem. If you have the patience, you can try to figure out which header is causing your Ruby script's request to malfunction.
Another option is to use the mechanize gem, which will automatically handle cookies and redirects for requests sent by ruby scripts:
http://docs.seattlerb.org/mechanize/GUIDE_rdoc.html
(Read the section Let's Fetch a Page; don't use the line require 'rubygems' if you are using Ruby 1.9+.)
I'm writing a very basic Facebook app, but I'm encountering an issue with cross-domain AJAX requests (using jQuery).
I've written a proxy page that makes requests to the Graph API via cURL, and I'm calling it via AJAX. I can visit the page in the browser and see that it has the correct output, but requesting the page via AJAX always causes jQuery to fire the error handler callback.
So I have two files:
Proxy, which does the cURL request
<?php
//Do some cURL requests, manipulate some data
//return it as JSON
print json_encode($data);
?>
The Facebook canvas page, which contains this AJAX call
$.getJSON("http://myDomain.com/proxy.php?get=stuff",
function(JSON)
{
alert("success");
})
.error(function(err)
{
alert("err");
});
Inspecting the call with Firebug shows it returns with HTTP code 200 OK, but the error handler is always fired, and no content is returned. This happens whether I set Content-Type: application/json or not.
I have written JSON-returning APIs in PHP before using AJAX and never had this trouble.
What could be causing the request to always trigger the error handler?
Recently I experienced the same issue, and my problem turned out to be an origin difference between the web page and the API, caused by SSL.
The web page had an HTTP address (http://myDomain.com) and the content I was requesting with jQuery was on the same domain but over HTTPS (https://myDomain.com). The browser (Chrome in this case) considered the two origins different (one HTTP, the other HTTPS) purely because of the protocol, and because the response type was "application/json", the browser did not allow it.
Basically, the request worked fine, but your browser did not allow the response content to be read.
I had to add an "Access-Control-Allow-Origin" header to make it work. If you're in the same situation, have a look here: https://developer.mozilla.org/en/http_access_control.
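For example, a minimal sketch of what that could look like at the top of proxy.php (the allowed origin below is a placeholder; use the origin your canvas page is actually served from, or * while testing):

<?php
// Let the calling page's origin read this response; replace the placeholder origin with your own.
header('Access-Control-Allow-Origin: https://apps.facebook.com');
header('Content-Type: application/json');

// ...do the cURL requests and build $data as before...
print json_encode($data);
?>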
I hope that'll help you, I got a headache myself.
I have this code:
$url="https://graph.facebook.com/me/friends?access_token=".$access_token."&fields=id,first_name,last_name&limit=10";
$content=file_get_contents($url);
Whenever I use this for a non-authenticated user I should get an OAuthException back, but it doesn't show up in PHP; $content is empty. If I copy the URL into the browser, I get the result and can see the exception.
I want to detect if the user is logged in and the session data is valid.
What might be wrong?
Maybe Facebook decides whether to respond with the exception details or with no body at all depending on the contents of the Accept HTTP header(s) you are sending (file_get_contents() sends different HTTP headers than your browser does).
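A minimal sketch of that idea, using a stream context to control the headers (the Accept value is just a guess at what might matter; ignore_errors additionally makes file_get_contents() return the response body even on a non-2xx status, which is usually why $content comes back empty in the error case):

<?php
$url = 'https://graph.facebook.com/me/friends?access_token=' . $access_token
     . '&fields=id,first_name,last_name&limit=10';

$context = stream_context_create(array(
    'http' => array(
        // Send an Accept header similar to what a browser sends.
        'header'        => "Accept: application/json\r\n",
        // Return the body even on 4xx/5xx responses instead of failing.
        'ignore_errors' => true
    )
));

$content  = file_get_contents($url, false, $context);
$response = json_decode($content, true);

if (isset($response['error'])) {
    // The OAuthException details show up here for a non-authenticated user.
    var_dump($response['error']);
}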