PHP cross domain requests

I am a green programmer and I was originally trying to make cross domain requests in JS. I quickly learned that this is not allowed. Unlike similar questions posted on here, I would like to see if I can use PHP to make those requests for me instead of using JSONP. Is this possible?
Simple workflow...
BROWSER: POST to my PHP the request-payload & request-headers
PHP: POST to Other Domain's URL the request-payload & request-headers
Other Domain: Process Request and send response
PHP: Send the Response-Content and Response-Header Info back to the browser
Here is what I am trying to work with: http://msdn.microsoft.com/en-us/library/bb969500%28v=office.12%29.aspx
My goal is to make a Communicator Web Access Client that is web based and mobile friendly.
A link to a working example would be awesome!

cURL would be your option in this case; something as simple as:
<?php
// Fetch the remote page; RETURNTRANSFER makes curl_exec() return the
// body as a string instead of printing it directly
$ch = curl_init('http://otherdomain.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, false); // leave response headers out of the result
$result = curl_exec($ch);
curl_close($ch);
var_dump($result);
?>
In this case, $result will contain the HTML code of the site. Be aware that cURL is not going to execute any JavaScript the way a browser would.
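To cover the asker's full workflow (forwarding the browser's POST payload and headers, then returning the response), a minimal proxy sketch could look like the following. The endpoint URL and the choice of forwarded headers are assumptions for illustration, not part of the original answer:
<?php
// Minimal proxy sketch: forward the browser's POST to the other
// domain and echo the response back. URL and headers are assumptions.
$ch = curl_init('http://otherdomain.com/endpoint');
curl_setopt($ch, CURLOPT_POST, true);
// Re-send the raw request body the browser posted to this script
curl_setopt($ch, CURLOPT_POSTFIELDS, file_get_contents('php://input'));
// Forward the original Content-Type (forwarding all headers blindly is unsafe)
$type = isset($_SERVER['CONTENT_TYPE']) ? $_SERVER['CONTENT_TYPE'] : 'application/x-www-form-urlencoded';
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: ' . $type));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);

// Pass the upstream Content-Type and body back to the browser
$upstreamType = curl_getinfo($ch, CURLINFO_CONTENT_TYPE);
if ($upstreamType) {
    header('Content-Type: ' . $upstreamType);
}
curl_close($ch);
echo $response;
?>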

You are talking about web services, and it seems the goal is to process payments. Every major payment gateway has APIs prepared for that. In any case, you can study this on your own. Here is a good starting point: http://ajaxonomy.com/2008/xml/web-services-part-1-soap-vs-rest

Related

PHP & Facebook: facebook-debug a URL using CURL and Facebook debugger

Facts: I run a simple website that contains articles, articles dynamically acquired by scraping third-party websites/blogs etc (new articles arrive to my website every half an hour or so), articles which I wish to post on my facebook page. Each article typically includes an image, a title and some text.
Problem: Most (almost all) of the articles that I post on Facebook are not posted correctly - their images are missing.
Inefficient Solution: Using Facebook's debugger, I submit an article's URL to it (a URL from my website, not the original source's URL) and Facebook then scans/scrapes the URL and correctly extracts the needed information (image, title, text, etc.). After this action, the article can be posted on Facebook correctly, with no missing images or anything.
Goal: What I am after is a way to create a process which will submit a URL to Facebook's debugger, thus forcing Facebook to scan/scrape the URL so that it can then be posted correctly. I believe that what I need to do is create an HTTP POST request containing the URL and submit it to Facebook's debugger. Is this the correct way to go? And if yes, as I have no previous experience with cURL, what is the correct way to do it using cURL in PHP?
Side Notes: I should mention that I am using short URLs for my articles, although I do not think that this is the cause of the problem, because it persists even when I use the canonical URLs.
Also, the Open Graph meta tags are correctly set (og:image, og:description, etc).
You can debug a graph object using the Facebook Graph API with PHP cURL, by doing a POST to
https://graph.facebook.com/v1.0/?id={Object_URL}&scrape=1
To make things easier, we can wrap our debugger in a function:
function facebookDebugger($url) {
    $ch = curl_init();
    // POST to the Graph API with scrape=1 to force a fresh scrape of the URL
    curl_setopt($ch, CURLOPT_URL, 'https://graph.facebook.com/v1.0/?id=' . urlencode($url) . '&scrape=1');
    curl_setopt($ch, CURLOPT_POST, 1);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    // Disabling peer verification avoids certificate errors but is insecure;
    // prefer a proper CA bundle in production
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    $r = curl_exec($ch);
    curl_close($ch);
    return $r;
}
Though this will update and clear Facebook's cache for the passed URL, it's a bit hard to print out each key and its content while avoiding errors at the same time, so I recommend using var_dump(), print_r(), or PHP-ref.
Usage with PHP-ref:
r( facebookDebugger('http://retrogramexplore.tumblr.com/') );
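Since the Graph API replies with JSON, a plain json_decode() works as well if you'd rather not pull in PHP-ref; a minimal sketch using the function above:
<?php
// Decode the Graph API's JSON reply and inspect it with print_r()
$raw = facebookDebugger('http://retrogramexplore.tumblr.com/');
$data = json_decode($raw, true); // true => associative array
if ($data === null) {
    die('Could not decode response: ' . $raw);
}
print_r($data);
?>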

Page redirect not working in php when accessed without a browser

I am developing an application in which the input I receive comes through an SMS gateway (and not a browser). I need to process the data obtained through SMS and pass it on to another PHP file, which will finish the processing and send an SMS back to the SMS gateway.
However, when I try to redirect from page1.php to page2.php, it is not working with the following code:
page1.php:
$url = "location:http://www.iweavesolutions.com/$extra?sms=".$msg."&keyword=".$key."&num=".$msg_num."&src=".$source;
header($url);
page2.php:
$msg = $_GET['sms'];
$msg_num = $_GET['num'];
$keyword = $_GET['keyword'];
$src = $_GET['src'];
send_sms($msg,$msg_num);
However, the header call in the first page doesn't seem to work. The PHP documentation says that header() is used for browser-related activities. In my application there is no browser at all. So do I need to change my mechanism for passing values across files? Please help.
Please refer to cURL:
// POST the values to the target script instead of redirecting the client
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://www.iweavesolutions.com");
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 2);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, 'variable1=abc&variable2=123');
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects issued by the target
curl_setopt($ch, CURLOPT_MAXREDIRS, 1);
$buffer = curl_exec($ch);
curl_close($ch);
Something like this.
Sending a location:[someUrl] header as an answer to a request just tells the requesting client to make another request to that location. It is up to the client whether to follow this redirect or not. Browsers usually do; other clients may not.
If the client you're dealing with (the SMS gateway) does not follow Location header redirects, you need to check the client's documentation for some mechanism to make it do so. If there is no way to redirect the client, you need to change your server-side logic to remove the need for the redirect, i.e. you need to call the processing logic in your 'page2.php' directly from 'page1.php' without the indirection of the redirect (or bundle the whole logic in one file, etc.); see the sketch below.
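A minimal sketch of that last option (the file and function layout is made up for illustration; it assumes send_sms() has been moved into an includable file):
<?php
// page1.php: skip the redirect and run page2's logic directly
require_once 'sms_functions.php'; // hypothetical file containing send_sms()

$msg     = $_GET['sms'];
$msg_num = $_GET['num'];
send_sms($msg, $msg_num); // process immediately instead of redirecting
?>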
The SMS gateway probably does not implement HTTP properly. IME this is not uncommon.
As a side note, your first script (assuming it is complete) is written assuming register_globals is enabled, which has been deprecated for a long time, and it does not url-encode the values, which may be the cause of the issue here (see the sketch after this list). If not, you'll need to either:
fix the SMS gateway
change the end point registered on the SMS gateway to eliminate the need for redirection
include the code from the redirected script into the current endpoint script
proxy the request from the gateway in the endpoint script.
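For the url-encoding issue specifically, http_build_query() is a clean fix. A sketch of what page1.php could look like ($msg, $key, $msg_num, $source and $extra are assumed to be populated earlier in the script):
<?php
// page1.php: build a properly url-encoded redirect target
$query = http_build_query(array(
    'sms'     => $msg,
    'keyword' => $key,
    'num'     => $msg_num,
    'src'     => $source,
));
header('Location: http://www.iweavesolutions.com/' . $extra . '?' . $query);
?>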

PHP: Remote Function Call and returning the result?

I'm not very experienced with PHP. I want to know how to communicate between two web servers. To clarify: from the 1st server, run a function (a query) on the remote server, and return the result to the 1st server.
Actually the theme will be:
Web Server (1) ----------------> Web Server (2) ---------------> Database Server
Web Server (1) <---------------- Web Server (2) <--------------- Database Server
The query function() will only be located on Web Server (2). I need to run that query function() remotely from Web Server (1).
What is this called? And is it possible?
Yes.
A nice way of doing this would be to send a request to the 2nd server via a URL. In the GET (or POST) parameters, specify which method you'd like to call, and (for security) some sort of hash that changes with time. The hash is there to ensure no third party can run the function arbitrarily on the 2nd server.
To send the request, you could use cURL:
function get_url($request_url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $request_url);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10); // give up connecting after 10 seconds
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);  // return the body as a string
    $response = curl_exec($ch);
    curl_close($ch);
    return $response;
}
This sends a GET request. You can then use:
$request_url = 'http://second-server-address/listening_page.php?function=somefunction&securityhash=HASH';
$response = get_url($request_url);
On your second server, set up listening_page.php (with whatever filename you like, of course) so that it checks for GET requests and verifies the integrity of the request (i.e. the hash and correct, valid params).
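A bare-bones sketch of such a listening page (the whitelist and the HMAC scheme are placeholders, not a prescribed design; somefunction() is assumed to be defined elsewhere):
<?php
// listening_page.php (hypothetical): verify the request, then dispatch
// to a whitelisted function and return its result as JSON.
$secret = 'shared-secret'; // same value on both servers

$function = isset($_GET['function']) ? $_GET['function'] : '';
$hash     = isset($_GET['securityhash']) ? $_GET['securityhash'] : '';

// Reject anything whose hash doesn't match (placeholder scheme;
// a real one should also include a timestamp to prevent replays)
if (!hash_equals(hash_hmac('sha256', $function, $secret), $hash)) {
    http_response_code(403);
    die('Invalid request');
}

// Only allow explicitly whitelisted functions to be called
$allowed = array('somefunction');
if (in_array($function, $allowed, true)) {
    echo json_encode($function()); // run the query, return the result
} else {
    http_response_code(400);
    die('Unknown function');
}
?>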
You can do so by using an API. Create a page on the second server that takes variables and communicates with the server using those vars (depending on what you need), and the standard reply from that page should be either JSON or XML. Then read that from server 1 by requesting that file and getting the reply from the 2nd server.
Note: if it's a private file, make sure you use an authentication method to prevent users from accessing it.
What you are aiming to do is definitely possible. You will need to set up some sort of API in order for server 1 to make requests to server 2.
I suggest you read up on SOAP and REST APIs:
http://www.netmagazine.com/tutorials/make-your-own-soap-api
Generally you will use something like cURL to contact server 2 from server 1.
Google cURL and you should quickly get the idea.
It's not going to be easy to give you a complete solution, so I hope this nudge in the right direction is helpful.

In any language, can I capture a webpage and save it as an image file? (no install, no ActiveX)

I heard it is possible to capture webpages using PHP (maybe 6.0 and above) on a Windows server.
I got some sample code and tested it, but none of the code worked correctly.
Do you know a right way to capture a webpage and save it as an image file in a web application?
Please teach me.
You could use the Browsershots API: http://browsershots.org/
With the XML-RPC interface, you can use almost any language to access it.
http://api.browsershots.org/xmlrpc/
Though you have asked for a PHP solution, I would like to share another solution using Perl. WWW::Mechanize, along with LWP::UserAgent and HTML::Parser, can help with screen scraping.
Some documents for reference:
Web scraping with WWW::Mechanize
Screen-scraping with WWW::Mechanize
Downloading the HTML of a web page is commonly known as screen scraping. This can be useful if you want a program to extract data from a given page. The easiest way to request HTTP resources is to use a tool called cURL. cURL comes as a standalone Unix tool, but there are libraries for using it in about every programming language. To capture this page from the Unix command line, type:
curl http://stackoverflow.com/questions/1077970/in-any-languages-can-i-capture-a-webpageno-install-no-activex-if-i-can-plz
In PHP, you can do the same:
<?php
// Initialize a cURL session; abort if initialization fails
$ch = curl_init() or die('could not initialize cURL');
curl_setopt($ch, CURLOPT_URL, "http://stackoverflow.com/questions/1077970/in-any-languages-can-i-capture-a-webpageno-install-no-activex-if-i-can-plz");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // return the body instead of printing it
$data1 = curl_exec($ch) or die(curl_error($ch));
echo $data1;
echo curl_error($ch);
curl_close($ch);
?>
Now, before copying an entire website, you should check its robots.txt file to see whether it allows robots to spider the site, and you may want to check whether there is an API available that lets you retrieve the data without the HTML.
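As a rough illustration of that robots.txt check (a sketch only; a real crawler should use a proper robots.txt parser rather than a substring test):
<?php
// Fetch the site's robots.txt before scraping
$ch = curl_init('http://stackoverflow.com/robots.txt');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$rules = curl_exec($ch);
curl_close($ch);

// Crude check: a blanket "Disallow: /" means spidering is unwelcome
if ($rules !== false && stripos($rules, "Disallow: /\n") !== false) {
    die('This site does not want to be spidered.');
}
?>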

Make cURL behave exactly like a form

I have a form on my site which sends data to some remote site - simple html form.
What I want to do is to use data user enters into form for statistical purposes.
So instead of sending the data to the remote page, I send it first to my script, which resends it to the remote site.
The thing is, I need it to behave exactly the way the usual form would, taking the user to the remote site and displaying its resources.
When I use this code it kinda works but not in the way I want it to:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $action);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, $fields);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
$result = curl_exec($ch);
curl_close($ch);
The problem is that it displays the response in the same script. For example, if $action is
somesite.com/processform.php and my script's name is mysqcript.php, it displays the response of "somesite.com/processform.php" inside "mysqcript.php", so all the relative links are broken.
How do I make it send the user to "somesite.com/processform.php", the same thing pressing the submit button would do?
Leonti
I think you will have to do this on your end, as translating relative paths is the client's job. It should be simple: just take the base directory of the request you made
http://otherdomain.com/my/request/path.php
and add it in front of every outgoing link that does not begin with "/" or a protocol ("http://", "ftp://").
Detecting all the outgoing links is hard, but I am sure there are ready-made PHP classes that do it. Check, for example, this article and the getLinks() function in the user comments. I am not 100% sure whether this is what you need, but it certainly goes in the right direction.
Here are a couple of possible solutions, which I post separately so they don't get mixed up with the one I recommend:
1 - Keep using cURL, parse the response, and add a <base/> tag to it (see the sketch after this list). It should work for pretty much everything on that page.
<base href="http://realsite.com/form_url.php" />
2 - Do not alter the submit URL. Submit the form to the real URL, but capture its content using some JavaScript library (YUI does that) and send it to your script via XHR. It's still kind of hacky, though.
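A rough sketch of option 1, reusing the cURL call from the question and the <base/> tag from above (string replacement on <head> is quick and dirty; a real implementation should parse the HTML):
<?php
// Fetch the remote form handler's response...
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $action); // the real form URL, as in the question
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, $fields);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
$html = curl_exec($ch);
curl_close($ch);

// ...and inject a <base> tag so relative links resolve against the real site
echo str_ireplace('<head>', '<head><base href="http://realsite.com/form_url.php" />', $html);
?>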
There are several ways to do that. Here's one of the easiest: just use a 307 redirect.
header('Location: http://realsite.com/form_url.php', true, 307);
You can do your logging and stuff either before or after calling header(), but if you do it after, you will need to start your script with
ignore_user_abort(true);
Note that browsers are supposed to notify the user that their form is being redirected.
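Put together, mysqcript.php could look roughly like this (log_submission() is a placeholder for whatever statistics logging you use):
<?php
// mysqcript.php: hand the browser off to the remote site with a 307,
// which makes it repeat the POST there, then do the logging
ignore_user_abort(true); // so logging still completes if the client disconnects

header('Location: http://realsite.com/form_url.php', true, 307);

log_submission($_POST); // hypothetical statistics logger
exit;
?>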
