I am working on a Chrome Extension that will run on a 3rd party website. On page load it will make an API call to the 3rd party API and display some of the API information on the page.
Important bits of info
The extension will be running on https://app.example.com.
The API endpoints are under https://api.example.com.
I cannot modify the code or server configurations for either of the above.
Since my browser extension runs as if it were part of the page, it introduces cross-domain AJAX issues. To get around this I have put together a PHP script on https://forward.example.com, and I can modify the code and server configuration for that host.
https://forward.example.com
Purpose: To accept requests from any domain, and forward the request to the domain specified in the X-Destination-Url header. It should also respond back to the original request with the headers and content of the X-Destination-Url request.
forward.php
I know this is messy, but I just threw it together in a couple of minutes.
$headers = getallheaders();

$destinationKey = 'X-Destination-Url';
$destination = null;
$removeHeaders = array('Origin', 'X-Destination-Url', 'Referer');

if (array_key_exists($destinationKey, $headers)) {
    $destination = $headers[$destinationKey];
    unset($headers[$destinationKey]);
}

foreach ($removeHeaders as $h) {
    unset($headers[$h]);
}

$authTokenKey = 'X-Destination-Auth-Token';
$authToken = null;

if (array_key_exists($authTokenKey, $headers)) {
    $authToken = $headers[$authTokenKey];
    unset($headers[$authTokenKey]);
    $headers['X-Auth-Token'] = $authToken;
}

if ($destination === null) {
    die('Invalid X-Destination-Url');
}

$preparedHeaders = array();
foreach ($headers as $k => $v) {
    $preparedHeaders[] = "{$k}: {$v}";
}
$responseHeaders = array();

function handleHeaderLine($ch, $headerLine)
{
    global $responseHeaders;
    $responseHeaders[] = $headerLine;
    return strlen($headerLine);
}

$ch = curl_init();

// Note that at this point $destination == 'https://api.example.com'
$opts = array(
    CURLOPT_URL            => $destination,
    CURLOPT_RETURNTRANSFER => 1,
    CURLOPT_HEADERFUNCTION => 'handleHeaderLine',
    CURLOPT_CUSTOMREQUEST  => $_SERVER['REQUEST_METHOD'],
    CURLOPT_HTTPHEADER     => $preparedHeaders,
    CURLOPT_SSL_VERIFYPEER => false,
);
curl_setopt_array($ch, $opts);

$resp = curl_exec($ch);

foreach ($responseHeaders as $header) {
    header($header);
}

echo $resp;
die;
.htaccess
Header add Access-Control-Allow-Origin "*"
Header add Access-Control-Allow-Headers "Access-Control-Allow-Headers, Origin,Accept, X-Requested-With, Content-Type, Access-Control-Request-Method, Access-Control-Request-Headers, X-Auth-Token, X-Destination-Url, X-Destination-Auth-Token"
Header add Access-Control-Allow-Methods "PUT, GET, POST, DELETE, OPTIONS"
Chrome Extension
The Chrome Extension just runs as javascript on-page.
This is the bit of code that interacts with the above.
$.get({
    async: true,
    url: 'https://forward.example.com/forward.php',
    beforeSend: function (xhr) {
        xhr.setRequestHeader('X-Destination-Auth-Token', 'api-key ' + self.apiKey);
        xhr.setRequestHeader('X-Destination-Url', 'https://api.example.com');
    },
    success: function (d) {
        console.log(d);
    }
});
When the above runs, Chrome spits out the following error...
Mixed Content: The page at 'https://app.example.com' was loaded over HTTPS, but requested an insecure XMLHttpRequest endpoint 'http://api.example.com'. This request has been blocked; the content must be served over HTTPS.
Finally...
How on earth has https://api.example.com been changed to http://api.example.com?
At the point that forward.php makes the cURL request to the X-Destination-Url, it is using the correct https URL, and it is set up to accept any SSL certificate, so I'm quite confused as to how this is happening.
Can anyone give me some insight into what's going on here?
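One thing worth checking (my suggestion, not part of the original post): forward.php replays every upstream header verbatim, so if api.example.com answers with a redirect, its Location header, possibly pointing at a plain-http URL, is passed straight through to the browser, which then follows it. A small helper can spot that before the headers are re-sent:

```php
// Hypothetical helper: scan the captured upstream headers for a redirect
// to a plain-http URL before blindly replaying them with header().
function findInsecureLocation(array $responseHeaders)
{
    foreach ($responseHeaders as $line) {
        if (preg_match('#^Location:\s*(http://\S+)#i', trim($line), $m)) {
            return $m[1]; // the insecure URL the browser would be redirected to
        }
    }
    return null;
}

$sample = array(
    'HTTP/1.1 301 Moved Permanently',
    'Location: http://api.example.com/',
);
echo findInsecureLocation($sample); // http://api.example.com/
```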
Related
I'm trying to make an HTTP POST request to my PHP script, but it doesn't seem to be retrieving the request data.
To simulate a POST request I used Request Maker, sending it the URL http://php-agkh1995.rhcloud.com/lta.php with request data var1=65059.
Using the default URL in the else branch works perfectly fine, but the other branch does not.
I suspect the request headers are at fault, unless there's a major flaw in my code.
lta.php
$stopid = $_POST['var1'];
$defurl = ""; // Default url

if (!empty($stopid)) {
    $defurl = 'http://datamall2.mytransport.sg/ltaodataservice/BusArrival?BusStopID=$stopid';
} else {
    $defurl = 'http://datamall2.mytransport.sg/ltaodataservice/BusArrival?BusStopID=83139';
}
$curl = curl_init();
curl_setopt_array($curl, array(
    CURLOPT_RETURNTRANSFER => 1,
    CURLOPT_URL => $defurl,
    CURLOPT_USERAGENT => 'Ashton',
    CURLOPT_HTTPHEADER => array('AccountKey: ********', 'UniqueUserId: ******', 'accept: application/json')
));

$resp = curl_exec($curl);
curl_close($curl);

echo($resp); // To test if data is displayed or not
return $resp;
Request headers sent
POST /lta.php HTTP/1.1
Host: php-agkh1995.rhcloud.com
Accept: */*
Content-Length: 10
Content-Type: application/x-www-form-urlencoded
You could use array_key_exists to test for the existence of the POST variable (note the argument order: key first, then array):

if (array_key_exists('var1', $_POST)) {
    $stopid = $_POST['var1'];
    $defurl = "http://datamall2.mytransport.sg/ltaodataservice/BusArrival?BusStopID=$stopid";
} else {
    // ...
}
PS: if your $defurl is set to the else-case value by default, you don't even need the else clause.
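One more observation (mine, not in the original answer): the working/non-working split may also come down to quoting. The question builds the non-default URL with single quotes, and PHP only interpolates variables inside double-quoted strings, so the literal text $stopid is sent to the API:

```php
$stopid = 83139;

// Single quotes: no interpolation, "$stopid" stays literal text
$single = 'BusArrival?BusStopID=$stopid';

// Double quotes: the variable is expanded into the string
$double = "BusArrival?BusStopID=$stopid";

echo $single; // BusArrival?BusStopID=$stopid
echo "\n";
echo $double; // BusArrival?BusStopID=83139
```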
I am trying to send a cURL request to a remote API server with this code:
$ch = curl_init();
$options = array(
    CURLOPT_URL => 'http://minecms.info/update/index.php',
    CURLOPT_POST => true,
    CURLOPT_RETURNTRANSFER => 1,
    CURLOPT_POSTFIELDS => $data,
    CURLOPT_HTTPHEADER => array('Content-type: application/json'),
    CURLOPT_SSL_VERIFYPEER => false
);
curl_setopt_array($ch, $options);
$response = curl_exec($ch);
I don't want users accessing the updates page in their browsers, so I set a content type header on the request. The problem is that I don't know how to detect this content type on the remote server. Basically, I want to check whether the client request has Content-Type: application/json set; if it does, the rest of the code executes, and if not the script just calls exit;.
Thank you to anyone who would help in advance.
You can try using getallheaders() and check whether the Content-Type is in place.
Take a look at the manual at http://www.php.net/manual/it/function.getallheaders.php for details.
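A minimal sketch of that check (my wording; the header name and the application/json requirement come from the question — getallheaders() itself assumes Apache, or PHP 5.4+ on FastCGI):

```php
// Return true only when the request declares a JSON body.
function isJsonRequest(array $headers)
{
    return isset($headers['Content-Type'])
        && stripos(trim($headers['Content-Type']), 'application/json') === 0;
}

// In the endpoint itself this would be:
//   if (!isJsonRequest(getallheaders())) { exit; }
var_dump(isJsonRequest(array('Content-Type' => 'application/json'))); // bool(true)
var_dump(isJsonRequest(array('Accept' => 'application/json')));       // bool(false)
```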
---- EDIT INSIGHTS ----
And what about this one? (Which I'm currently using)
public function getAllHeaders()
{
    if (function_exists('getallheaders')) {
        return getallheaders();
    }

    $headers = array();
    foreach ($this->parameters as $key => $value) {
        if (substr($key, 0, 5) == 'HTTP_') {
            $headers[str_replace(" ", "-", ucwords(strtolower(str_replace("_", " ", substr($key, 5)))))] = $value;
        }
        if ($key == "CONTENT_TYPE") {
            $headers["Content-Type"] = $value;
        }
    }
    return $headers;
}
I am building a simple PHP proxy that caches response headers and objects.
My problem is that if I log in to youtube.com through the proxy, YouTube keeps treating me as signed out, but if I stop my script and open youtube.com directly, I am signed in. I think it is a cookies issue. Is it?
My script just grabs response headers and sends them back to the browser. When I use fopen() to download the object, some websites such as the Google Play Store and the Apple App Store keep responding with HTTP 403 (Forbidden), even though I capture the client's user agent via $_SERVER['HTTP_USER_AGENT'] and attach it to the context from stream_context_create(). Still no luck!
I have seen a Set-Cookie header in a response, so I thought that if I sent it back to the browser using header() the problem would be solved. Still no luck.
This is how I grab headers and send them back to the browser:
This is how I get the cookies sent by the client:
$requested_cookie = $_COOKIE;
$ua = $_SERVER['HTTP_USER_AGENT'];
ini_set('user_agent', $ua);

$md5_fname = md5($fname);
$filehead = $file_path."/HTTP_HEADER/".$md5_fname.".txt";

if (file_exists($filehead)) {
    $handle = fopen($filehead, "r");
    $contents = fread($handle, filesize($filehead));
    $getCachedH = unserialize($contents);
    fclose($handle);

    foreach ($getCachedH as $cHead) {
        $sendHead = $cHead;
        $getHead = strtoupper($cHead);
        //file_put_contents("$file_path/ghassan.txt", "\n $sendHead \n", FILE_APPEND);

        if (preg_match('/CONTENT-LENGTH: (\d+)/i', $getHead, $clm)) $content_length = $clm[1];
        if (preg_match('/CONTENT-TYPE: ([\/\w\-\d]+)/i', $getHead, $ctm)) $content_type = $ctm[1];

        if (preg_match('/LOCATION: (.*)/i', $cHead, $lm)) {
            $header_location = $lm[1];
            header("Location: ".$header_location);
            exit;
        }

        if (preg_match('/^HTTP\/1\.[01] (\d\d\d)/i', $getHead, $hcm)) {
            $http_code = $hcm[1];
        }

        header($sendHead);
    }
} else {
    $opts = array(
        'http' => array(
            'ignore_errors' => "true",
            'method' => "GET",
            'header' => "Cookie: foo=bar\r\n"
        )
    );
    $context = stream_context_create($opts);
    $urlptr = fopen($_GET['url'], 'rb', false, $context);

    $headers = $http_response_header;
    $http_code = $headers[0];
    if ($http_code == "200") {
        // We grab the Response Headers and save them
        file_put_contents($filehead, serialize($headers));
    }

    foreach ($headers as $response_header) {
        $sendHead = $response_header;
        $getHead = strtoupper($response_header);
        header($sendHead);

        if (preg_match('/CONTENT-LENGTH: (\d+)/i', $getHead, $clm)) $content_length = $clm[1];
        if (preg_match('/CONTENT-TYPE: ([\/\w\-\d]+)/i', $getHead, $ctm)) $content_type = $ctm[1];

        if (preg_match('#Set-Cookie: (([^=]+)=[^;]+)#i', $sendHead, $cookm)) {
            $sCookies = $cookm[1];
            //file_put_contents("$file_path/cookies.txt", "\n $cookm[0], $url \n", FILE_APPEND);
        }

        if (preg_match('/LOCATION: (.*)/i', $sendHead, $lm)) {
            $header_location = $lm[1];
            header("Location: ".$header_location);
            exit;
        }

        if (preg_match('/ACCEPT-RANGES: ([\w\d\-]+)/i', $getHead, $arm)) $accept_ranges = $arm[1];

        if (preg_match('/^HTTP\/1\.[01] (\d\d\d)/i', $getHead, $hcm)) {
            $http_code = $hcm[1];
        }
    }
}
Is there a function that grabs the response headers and the object in one call and stores them together in one file? I don't want to call fopen() at the top of the script, because PHP connects to the remote URL as soon as the handle is opened, even if I haven't read from it yet, which adds latency.
Is there a way to access the client's cookies from my PHP script, and how can I solve the 403 Forbidden response if I am already sending the Android device's user agent when I download a file from the Play Store through my script?
Thank you
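As an aside (my suggestion, not from the original thread), cURL can return headers and body in a single call when CURLOPT_HEADER is enabled; CURLINFO_HEADER_SIZE then gives the exact split point, so both parts can be cached together in one file:

```php
// Split a raw response (headers + body in one string) at the header size,
// as reported by curl_getinfo($ch, CURLINFO_HEADER_SIZE).
function splitResponse($raw, $headerSize)
{
    return array(
        'headers' => explode("\r\n", rtrim(substr($raw, 0, $headerSize))),
        'body'    => substr($raw, $headerSize),
    );
}

// Normally $raw and $headerSize come from a request like:
//   curl_setopt($ch, CURLOPT_HEADER, true);         // with CURLOPT_RETURNTRANSFER
//   $raw = curl_exec($ch);
//   $headerSize = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
$raw = "HTTP/1.1 200 OK\r\nContent-Type: text/html\r\n\r\n<html></html>";
$parts = splitResponse($raw, strpos($raw, "\r\n\r\n") + 4);
echo $parts['body']; // <html></html>
```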
I have already solved my question.
The remote headers I grab for each website already include Set-Cookie headers, so I resend those to the browser, capture the cookies back from $_COOKIE, and attach them to the fopen() request, which seems to solve all my problems above.
Thank you
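A sketch of that fix (the helper names are mine): pick the Set-Cookie lines out of the upstream response headers so each can be replayed with header(), and fold the browser's $_COOKIE values back into a single Cookie request header for the next upstream call:

```php
// Collect upstream Set-Cookie headers so each can be replayed to the
// browser with header($line, false) — false keeps duplicates intact.
function extractSetCookies(array $responseHeaders)
{
    $out = array();
    foreach ($responseHeaders as $line) {
        if (stripos($line, 'Set-Cookie:') === 0) {
            $out[] = $line;
        }
    }
    return $out;
}

// Turn the browser's cookies (e.g. $_COOKIE) back into one request header
// for the stream context used with fopen().
function buildCookieHeader(array $cookies)
{
    $pairs = array();
    foreach ($cookies as $name => $value) {
        $pairs[] = $name . '=' . $value;
    }
    return 'Cookie: ' . implode('; ', $pairs);
}

echo buildCookieHeader(array('sid' => 'abc', 'lang' => 'en')); // Cookie: sid=abc; lang=en
```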
I'm setting up a site where users will be able to post links; curl (in PHP) will crawl each URL and format something based on the metadata, Open Graph tags, etc. I have it set up to run simultaneous requests with curl_multi_init and curl_multi_exec. I created a gist for the class here. What it's supposed to do is:
get metadata from multiple URLs
return a single JSON string, but only for pages with content type 'text/html' (so don't bother with direct links to images, JS, executables, etc.)
The problem seems to be the callback for CURLOPT_HEADERFUNCTION. I thought that having it return -1 when a Content-Type header exists but isn't an HTML one would abort the download, but it doesn't seem to do anything (although the check appears correct and it does seem to return -1). It still lets every content type through.
Here specifically is the callback:
CURLOPT_HEADERFUNCTION => function ($ch, $header) {
    // if they're sending a content-type header, it must be text/html
    if (stripos(trim($header), "Content-Type") === 0) {
        list($key, $val) = explode(":", $header);
        if (stripos(trim($val), "text/html") === 0) {
            return strlen($header);
        } else {
            return -1;
        }
    } else {
        return strlen($header);
    }
}
I tried curl_close but got an error about closing curl in a callback. Any suggestions?
Use the callback to set a (global) flag, then skip the full curl_exec() when the flag is false. Two things to note: the header callback only runs during curl_exec(), so you need a headers-only request first, and the callback must return the length of each header line or cURL aborts the transfer with an error.

$htmlheader = true;

function header_callback($ch, $header)
{
    // Flag the transfer as non-HTML when a Content-Type header says so
    if (stripos(trim($header), 'Content-Type:') === 0) {
        list(, $val) = explode(':', $header, 2);
        if (stripos(trim($val), 'text/html') !== 0) {
            $GLOBALS['htmlheader'] = false;
        }
    }
    return strlen($header);
}

$ch = curl_init('http://www.example.com/');
curl_setopt($ch, CURLOPT_NOBODY, true);          // first pass: headers only
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADERFUNCTION, 'header_callback');
curl_exec($ch);

if ($htmlheader) {
    curl_setopt($ch, CURLOPT_NOBODY, false);     // second pass: fetch the body
    $result = curl_exec($ch);
}
curl_close($ch);
I am creating an analytics project. My goal is to give the owner of x-domain a small JavaScript include from my site, which gives me the ability to trace their visitors' mouse movements. I have the tracing down; all I need to do is send the data back to my server so it can be stored in my DB. My problem is that the data is too large to send through getJSON.
Remember...
I can't use $.post or any kind of XMLHttpRequest, because my domain and x-domain are remote and browsers don't permit that. I can only use getJSON.
Since that doesn't work, I was told to set up a proxy. From what I've learned, though, a proxy only works for the server that has the proxy set up, not for the server that is trying to send me the data.
I know this is possible, because I've seen it done. Does anyone have any ideas? Are iframes good for what I am trying to do? Does anyone have any resources to share?
Thanks a lot
You can have your JavaScript create an iframe and a form, then have the form post into the iframe. You can position the iframe off screen to keep it hidden. For instance:

function post_my_data(json) {
    var f = document.createElement('form');
    f.action = 'http://www.my-domain.com/receive.php';
    f.method = 'post';
    f.target = 'posttarget';

    var i = document.createElement('input');
    i.type = 'hidden';
    i.name = 'json';
    i.value = json;
    f.appendChild(i);

    var ifr = document.createElement('iframe');
    ifr.name = 'posttarget';
    ifr.style.position = 'absolute';
    ifr.style.left = '-1000px';
    document.body.appendChild(ifr);

    document.body.appendChild(f);
    f.submit();
}
Have you considered using something like JSONP?
Split your data into chunks so that getJSON would work. You could implement data queuing: the producer keeps filling a queue with data, and the consumer on your domain fetches it in smaller chunks with getJSON. It won't be real-time, but you could try it and see if you're OK with the performance.
You can use JavaScript to talk to Flash and have Flash do the cross-domain part; see http://www.xml.com/pub/a/2006/06/28/flashxmlhttprequest-proxy-to-the-rescue.html
I'm not clear on why you can't use a proxy. The JavaScript in the browser posts to a script running on x-domain, and that script then posts the exact same info to your domain using cURL or similar.
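A minimal sketch of such a relay script (my illustration; the collector URL is hypothetical):

```php
// Forward the browser's POST body on to the analytics server with cURL
// and hand the collector's response back to the page.
function relay($url, array $fields)
{
    $ch = curl_init($url);
    curl_setopt_array($ch, array(
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => http_build_query($fields),
        CURLOPT_RETURNTRANSFER => true,
    ));
    $response = curl_exec($ch);
    curl_close($ch);
    return $response;
}

// On x-domain this script would simply do:
//   echo relay('https://analytics.example.com/receive.php', $_POST);
```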
Perhaps you can understand it better than I can. I am good with PHP, but not too great with sending out different headers and such. How it works is that you send an AJAX POST to the proxy file; the proxy file is located on an outside server.
$.ajax({
    type: "POST",
    url: "http://mywebsite.com/ajax-proxy.php",
    data: 'csurl=www.google.com',
    error: function (e) { console.log(e); },
    success: function (msg) { console.log(msg); }
});
You also have to pass csurl, which is the URL the proxy forwards you to. In this example I used Google, but what I would normally use as the csurl is the directory where I will store the AJAX data.
In the proxy file, there is a
$valid_requests = array()
In that array, you list all the URLs you want the proxy to approve. In this example you put www.google.com (note: it has to be exactly the same as the csurl parameter or it won't work).
Below is the proxy file
<?php
/**
 * AJAX Cross Domain (PHP) Proxy 0.6
 * by Iacovos Constantinou (http://www.iacons.net)
 *
 * Released under CC-GNU GPL
 */

/**
 * Enables or disables filtering for cross domain requests.
 * Recommended value: true, for security reasons
 */
define('CSAJAX_FILTERS', true);

/**
 * A set of valid cross domain requests
 */
$valid_requests = array(
    'www.google.com'
);

/*** STOP EDITING HERE UNLESS YOU KNOW WHAT YOU ARE DOING ***/

// identify request headers
$request_headers = array();
foreach ($_SERVER as $key => $value) {
    if (substr($key, 0, 5) == 'HTTP_') {
        $headername = str_replace('_', ' ', substr($key, 5));
        $headername = str_replace(' ', '-', ucwords(strtolower($headername)));
        $request_headers[$headername] = $value;
    }
}

// identify request method, url and params
$request_method = $_SERVER['REQUEST_METHOD'];
$request_params = ($request_method == 'GET') ? $_GET : $_POST;
$request_url = urldecode($request_params['csurl']);
$p_request_url = parse_url($request_url);
unset($request_params['csurl']);

// ignore requests for proxy :)
if (preg_match('!'. $_SERVER['SCRIPT_NAME'] .'!', $request_url) || empty($request_url)) {
    exit;
}

// check against valid requests
if (CSAJAX_FILTERS) {
    $parsed = $p_request_url;
    $check_url  = isset($parsed['scheme']) ? $parsed['scheme'] .'://' : '';
    $check_url .= isset($parsed['user']) ? $parsed['user'] . ($parsed['pass'] ? ':'. $parsed['pass'] : '') .'#' : '';
    $check_url .= isset($parsed['host']) ? $parsed['host'] : '';
    $check_url .= isset($parsed['port']) ? ':'. $parsed['port'] : '';
    $check_url .= isset($parsed['path']) ? $parsed['path'] : '';
    if (!in_array($check_url, $valid_requests)) {
        exit;
    }
}

// append query string for GET requests
if ($request_method == 'GET' && count($request_params) > 0 && (!array_key_exists('query', $p_request_url) || empty($p_request_url['query']))) {
    $request_url .= '?'. http_build_query($request_params);
}

// let the request begin
$ch = curl_init($request_url);
curl_setopt($ch, CURLOPT_HTTPHEADER, $request_headers); // (re-)send headers
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);         // return response
curl_setopt($ch, CURLOPT_HEADER, true);                 // enable response headers

// add post data for POST requests
if ($request_method == 'POST') {
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($request_params));
}

// retrieve response (headers and content)
$response = curl_exec($ch);
curl_close($ch);

// split response to header and content
list($response_headers, $response_content) = preg_split('/(\r\n){2}/', $response, 2);

// (re-)send the headers
$response_headers = preg_split('/(\r\n){1}/', $response_headers);
foreach ($response_headers as $key => $response_header) {
    if (!preg_match('/^(Transfer-Encoding):/', $response_header)) {
        header($response_header);
    }
}

// finally, output the content
print($response_content);
?>
Again, if I open http://mywebsite.com/ajax-proxy.php?csurl=www.google.com from within my website (or even type it straight into the URL bar), it works fine. But if you call it from an outside server using an AJAX POST, it doesn't work.