Callback function - php

In JavaScript I'm used to having an HTTP request fire a callback function when the AJAX response to the data I sent comes back from the server. I'm now experimenting with the OAuth2 gem for Ruby, and I'm finding that callbacks don't work the same way there.
I have a web server and Facebook app set up, and I have a small PHP script that writes the current URL (including the auth code, for example) to a file, no problem. All the settings in the Facebook app are configured, and if I put this URL in the browser:
http://graph.facebook.com/oauth/authorize?client_id=[my_client_id]&redirect_uri=http://localhost/oauth/callback/index.php
It redirects successfully to that script, which writes the authorization code to a file that I can then use to get the access token. The problem is that I can only do this manually; calling Net::HTTP.get(URI(address)) in Ruby doesn't seem to trigger the PHP script.
Anyone have any ideas?

I have no idea why you posted your history with JavaScript AJAX requests, as it has no bearing on your Ruby script, which by the way doesn't even use a callback method/function. Using a callback function just means you are calling some function and passing it another function as an argument. When I started programming, the term 'callback function' was very confusing to me, and in my opinion the term should be dropped from the lingo.
As for your Ruby script, you need to use something like Firebug to look at the request headers that are being sent by your browser to the server when you manually enter the URL. If you use those same headers in your Ruby script, then it should work, e.g.:
require 'net/http'

uri = URI(address)

# Either set the headers individually on the request object:
req = Net::HTTP::Get.new(uri.request_uri)
req['header1'] = 'hello'
req['header2'] = '10'
or:
# ...or pass a headers hash when creating the request:
headers = {
  'header1' => 'hello',
  'header2' => '10',
  # ... any other headers your browser sends
}
req = Net::HTTP::Get.new(uri.request_uri, headers)

http = Net::HTTP.new(uri.host, uri.port)
resp = http.request(req)
It's possible that you have a cookie set in your browser, which your browser automatically adds to the request headers when it sends the request to the server. Your browser also adds a number of headers of its own--many of which will have no bearing on your problem. If you have the patience, you can try to figure out which header is causing your Ruby script's request to malfunction.
Another option is to use the mechanize gem, which will automatically handle cookies and redirects for requests sent by Ruby scripts:
http://docs.seattlerb.org/mechanize/GUIDE_rdoc.html
(Read the section "Let's Fetch a Page"; don't use the line require 'rubygems' if you are using Ruby 1.9+.)

Related

Detect in PHP if page is accessed with cURL or Wget

I have a simple PHP script which shows some information to a user. I want to shorten this information as much as possible if the same page is requested with cURL or saved with Wget.
I saw several similar questions on Stack Overflow, but they have some extras like “I want to block cURL” or “redirect a form request if…”. The answers usually say that it is not possible to detect a cURL request reliably, since cURL lets the user change all request parameters and pretend to be a browser. That's okay for me; I don't want to block cURL, I want to offer an extra service for a generic cURL (and Wget) request.
If not configured otherwise, cURL and Wget send their own default »User-Agent« string with their requests.
For example curl/7.47.0 or Wget/1.17.1 (linux-gnu). You can test this easily on https://requestb.in.
Server-side applications can read the User-Agent string from the request headers. In PHP it's available in the $_SERVER['HTTP_USER_AGENT'] variable.
So to detect a cURL or Wget request and offer different content, you may use:
<?php
// Catch cURL/Wget requests
if (isset($_SERVER['HTTP_USER_AGENT']) && preg_match('/^(curl|wget)/i', $_SERVER['HTTP_USER_AGENT'])) {
    echo 'Hi curl user!';
} else {
    echo 'Hello browser user!';
}
?>
In my app I detect the cURL request and then let the process die() inside the if block (see the sketch below). So if it's just a browser, the condition doesn't match and all the following PHP code executes.
As said before, both cURL and Wget allow the user to set an arbitrary User Agent. But for the requested service, this solution is sufficient.
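For illustration, a minimal sketch of that die() variant (the shortened message is just a placeholder):
<?php
// Short-circuit for cURL/Wget: print the shortened output and stop right here.
if (isset($_SERVER['HTTP_USER_AGENT']) && preg_match('/^(curl|wget)/i', $_SERVER['HTTP_USER_AGENT'])) {
    die('Shortened output for cURL/Wget');
}
// Browsers fall through and all the following PHP code runs as usual.
echo 'Hello browser user!';
?>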

Cross-domain AJAX request error on HTTP 200

I'm writing a very basic Facebook app, but I'm encountering an issue with cross-domain AJAX requests (using jQuery).
I've written a proxy page to make requests to the graph via cURL that I'm calling via AJAX. I can visit the page in the browser and see it has the correct output, but requesting the page via AJAX always causes jQuery to fire the error handler callback.
So I have two files:
Proxy, which does the cURL request
<?php
//Do some cURL requests, manipulate some data
//return it as JSON
print json_encode($data);
?>
The Facebook canvas, which contains this AJAX call:
$.getJSON("http://myDomain.com/proxy.php?get=stuff",
    function(JSON) {
        alert("success");
    })
    .error(function(err) {
        alert("err");
    });
Inspecting the call with Firebug shows it returns with HTTP code 200 OK, but the error handler is always fired, and no content is returned. This happens whether I set Content-Type: application/json or not.
I have written JSON-returning APIs in PHP before using AJAX and never had this trouble.
What could be causing the request to always trigger the error handler?
Recently I experienced the same issue, and my problem was a domain difference between the web page and the API caused by SSL.
The web page was served over plain HTTP (http://myDomain.com), and the content I was requesting with jQuery was on the same domain but over HTTPS (https://myDomain.com). The browser (Chrome in this case) treated the two origins as different just because of the protocol, and since the response type was "application/json", the browser did not allow it.
Basically, the request worked fine, but your browser did not allow the page to read the response content.
I had to add an "Access-Control-Allow-Origin" header to make it work. If you're in the same case, have a look there: https://developer.mozilla.org/en/http_access_control.
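For a PHP proxy like the one in the question, that could look roughly like this (the allowed origin and the placeholder $data are assumptions; adjust them to your setup):
<?php
// proxy.php - let the page on the other origin read this response.
header('Access-Control-Allow-Origin: http://myDomain.com'); // origin of the page making the AJAX call
header('Content-Type: application/json');

// Placeholder data; in the real proxy this would come from the cURL requests.
$data = array('status' => 'ok');

print json_encode($data);
?>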
I hope that'll help you, I got a headache myself.

Slow HTTP POST request in php

I'm trying to POST some data (a JSON string) from a PHP script to a Java server (both written by myself) and get the response back.
I tried the following code:
$url = "http://localhost:8000/hashmap";
$opts = array('http' => array(
    'method'  => 'POST',
    'content' => $JSONDATA,
    'header'  => "Content-Type: application/x-www-form-urlencoded"
));
$st = stream_context_create($opts);
echo file_get_contents($url, false, $st);
Now, this code actually works (I get the right answer back as the result), but file_get_contents hangs for about 20 seconds every time it is executed (I printed the time before and after the call). The operations performed by the server only take a small amount of time, and I'm sure it's not normal to wait this long for the response.
Am I missing something?
Maybe a badly misconfigured server that doesn't send the right Content-Length while using HTTP/1.1, so the client keeps waiting for more data until the connection times out.
Either fix the server or request the data as HTTP/1.0.
Try adding Connection: close and Content-Length: strlen($JSONDATA) headers to the $opts.
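A rough sketch of the snippet from the question with those suggestions applied (the payload is a placeholder):
<?php
$url = "http://localhost:8000/hashmap";
$JSONDATA = json_encode(array('key' => 'value')); // placeholder payload

$opts = array('http' => array(
    'method'           => 'POST',
    'content'          => $JSONDATA,
    // Use HTTP/1.0 and state the body length explicitly so file_get_contents
    // doesn't sit waiting for the connection to time out.
    'protocol_version' => 1.0,
    'header'           => "Content-Type: application/x-www-form-urlencoded\r\n" .
                          "Content-Length: " . strlen($JSONDATA) . "\r\n" .
                          "Connection: close",
));

$st = stream_context_create($opts);
echo file_get_contents($url, false, $st);
?>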
Also, if you want to avoid using extensions, have a look at this class I wrote some time ago to perform HTTP requests using PHP core only. It works on PHP4 (which is why I wrote it) and PHP5, and the only extension it ever requires is OpenSSL, and you only need that if you want to do an HTTPS request. Documented(ish) in comments at the top.
Supports all sorts of stuff - GET, POST, PUT and more, including file uploads, cookies, automatic redirect handling. I have used it quite a lot on a platform I work with regularly that is stuck with PHP/4.3.10 and it works beautifully... Even if I do say so myself...

PHP cURL with XmlHttpRequest

The scenario: there's a voting form (vote.php) with 3 fields, one of which is a hidden field containing a hash. Once you submit the form, it makes a GET request to a separate script (process.php) via XHR. I am trying to simulate this via cURL but have only gotten as far as retrieving the hash and preserving the PHP session ID using a cookie jar.
The problem: the processing script (process.php) seems to be able to detect whether a request came in via XHR, and returns an error if I just submit its required parameters with a regular cURL GET.
So how do I simulate XHR in cURL? I may even be wrong in saying there is such a thing as XHR in cURL, so please advise me on any methods to achieve this.
There is a good chance that process.php checks for an XHR request by looking at the HTTP_X_REQUESTED_WITH header, for example:
if (isset($_SERVER['HTTP_X_REQUESTED_WITH']) && $_SERVER['HTTP_X_REQUESTED_WITH']=="XMLHttpRequest") { /* Do stuff here */ }
So you could try setting that header in your cURL request. Note that the header you actually send on the wire is called X-Requested-With; PHP exposes it as $_SERVER['HTTP_X_REQUESTED_WITH']:
curl -H "X-Requested-With: XMLHttpRequest"
That has a good chance of working. Good luck!
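If you are using PHP's cURL extension rather than the command line, the equivalent would be something along these lines (the URL, parameters and cookie-jar path are placeholders):
<?php
// Sketch: GET process.php while presenting the request as XHR.
$ch = curl_init('http://example.com/process.php?field1=foo&hash=abc123');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// Reuse the cookie jar so the PHP session from vote.php is kept.
curl_setopt($ch, CURLOPT_COOKIEJAR, '/tmp/cookies.txt');
curl_setopt($ch, CURLOPT_COOKIEFILE, '/tmp/cookies.txt');
// This is the header jQuery & co. send; PHP sees it as $_SERVER['HTTP_X_REQUESTED_WITH'].
curl_setopt($ch, CURLOPT_HTTPHEADER, array('X-Requested-With: XMLHttpRequest'));
$response = curl_exec($ch);
curl_close($ch);
echo $response;
?>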
Add the following header: X-Requested-With: XMLHttpRequest. It is added by all major JS libraries to make identifying such requests on the server easier.

is it possible to send referer information with php?

is it possible to send referer information with php?
If you are, for example, fetching the contents of a URL in PHP using cURL, you can send any additional headers you want, including a Referer header.
You cannot force the user's browser to send a Referer header by any means, and certainly not with a server-side language.
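For example, a minimal cURL sketch (the URL and referrer value are placeholders):
<?php
// Sketch: fetch a page while sending a Referer header of your choosing.
$ch = curl_init('http://example.com/page.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_REFERER, 'http://example.com/some-referring-page');
// Equivalent: curl_setopt($ch, CURLOPT_HTTPHEADER, array('Referer: http://example.com/some-referring-page'));
$html = curl_exec($ch);
curl_close($ch);
?>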
It's not possible to get the client browser to send a different Referer header.
However, it is in theory possible for you to do this when making an HTTP request from PHP (either using cURL or the native URL wrappers) by including a custom request header in that request.
Yes, when making the request yourself, just write the Referer header into the outgoing request.
Referer is a 'request' header, meaning it is sent by the client, i.e. the browser. From the server side, i.e. using PHP, you can only control 'response' headers.
If you are planning to make HTTP requests with PHP, that is different of course.
Edit: ...and requests made from the server to other servers are a pretty common scenario, actually. It seems like you should be able to set the headers you want while creating the HttpRequest (from the pecl_http extension):
$options = array('headers'  => $header_array,
                 'httpauth' => $credentials);
$r = new HttpRequest($url, HTTP_METH_POST, $options);
Or you can use the addHeaders method:
$r->addHeaders(array('Referer' => 'http://example.com'));
