Flash REST woes: passing headers through php proxy - php

I'm writing an ActionScript 3 Flash game, which needs to access REST services specified by the sponsor (the buyer of the game) for things like highscores. I know how to use URLLoader and URLRequest and how to set a URLRequestHeader.
Unfortunately the sponsor is on shared hosting and can't put a crossdomain.xml in the server root, so I can't connect to it from my localhost Flash game (Same Origin Policy). I learned that there is a way to reach the REST API by proxying calls through a PHP file on a different server.
So I have a proxy.php file on my private server, and am calling it like:
www.myserver.pl/scripts/proxy.php?url=http%3A%2F%2Fsponsorserver.hosting.com/api/init.json
(url=URLEncoded address)
It connects (returns HTTP 200), but I don't know yet how to pass custom headers on to the init.json script. I tried every combination of sending them through GET and POST, and of calling the script itself via GET and POST (using the RESTClient FF extension).
This is the proxy.php:
<?php
// proxy.php -- forwards the request to the URL given in ?url=
$getvars = 'myparam1=3&myparam2=data'; // for test purposes
$url = $_GET['url'] . '?' . $getvars;
$session = curl_init($url);
curl_setopt($session, CURLOPT_HEADER, true);
curl_setopt($session, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec_follow($session); // curl_exec replacement, see below
echo $response;
curl_close($session);
Unfortunately my server has a non-empty open_basedir, so I can't set CURLOPT_FOLLOWLOCATION and use plain curl_exec; hence curl_exec_follow (taken from this SO answer: https://stackoverflow.com/a/10835824/2492808). Could that be why the GET variables are not picked up by init.json? It says it doesn't see myparam1 and myparam2. Unfortunately I can't change php.ini on my server, and I need a way to make this work so that I can integrate and test the sponsor's REST API from my IDE. Thanks!
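(For reference, curl_exec_follow follows redirects by hand, since CURLOPT_FOLLOWLOCATION is blocked under open_basedir. A minimal sketch of the idea, not the exact code from that answer, with an arbitrary 10-hop cap:)
function follow_redirects($session, $maxHops = 10)
{
    // CURLOPT_HEADER must be on so the Location header can be read.
    curl_setopt($session, CURLOPT_HEADER, true);
    curl_setopt($session, CURLOPT_RETURNTRANSFER, true);
    for ($hop = 0; $hop < $maxHops; $hop++) {
        $response = curl_exec($session);
        $code = curl_getinfo($session, CURLINFO_HTTP_CODE);
        if ($code < 300 || $code >= 400) {
            return $response; // not a redirect, we are done
        }
        if (!preg_match('/^Location:\s*(.+)$/mi', $response, $m)) {
            return $response; // redirect without a Location header
        }
        curl_setopt($session, CURLOPT_URL, trim($m[1]));
    }
    return false; // gave up: too many redirects
}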
EDIT: that was stupid; it's not GET variables that I want to send to the sponsor's script, but HTTP headers. So I've removed $getvars and added:
curl_setopt($session, CURLOPT_HTTPHEADER, array('myparam1=3', 'myparam2=data'));
before curl_exec_follow (and inside it, before every new curl_exec call), which in theory should set the headers and pass them through. Unfortunately, the script still doesn't see the headers :(
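For what it's worth, CURLOPT_HTTPHEADER expects each entry in "Name: value" form rather than "name=value", and PHP exposes custom request headers to the receiving script through $_SERVER with an HTTP_ prefix. A minimal sketch of both sides (assuming the sponsor's script reads headers this way):
// Proxy side: each header entry is "Name: value", not "name=value".
curl_setopt($session, CURLOPT_HTTPHEADER, array(
    'myparam1: 3',
    'myparam2: data',
));

// Receiving side (e.g. init.json): custom headers arrive in $_SERVER,
// uppercased, dashes turned to underscores, prefixed with HTTP_.
$myparam1 = isset($_SERVER['HTTP_MYPARAM1']) ? $_SERVER['HTTP_MYPARAM1'] : null;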
Also, following the PHP manual, I tried lifting the open_basedir restriction by putting an .htaccess file in www.myserver.pl/scripts/:
<IfModule mod_php5.c>
php_value open_basedir ""
</IfModule>
But it's not changing anything; I probably don't have "AllowOverride Options" or "AllowOverride All" privileges, so CURLOPT_FOLLOWLOCATION is still off-limits.

I can't change php.ini on my server, and need a way to make this work, so that I can integrate and test the sponsor's REST API from my IDE. Thanks!
Useless excuse these days. Buy a $15-a-month server hosted on the Amazon cloud. It's easier to spend $15 than to spend an hour coding around an easily solvable problem, and you were probably spending $10 on the shared hosting plan anyway.

Related

PHP curl request results in white page when the url contains dots or colons

This only happens on my webserver, not on the local system.
I have a curl request like this:
ini_set('display_errors', 1);
error_reporting(E_ALL);
$url = 'http://***.***.***.***:8080/api_v1/oauth/token';
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_URL,$url);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, FALSE);
$response = curl_exec($ch);
This makes the page load for a while and then just returns a white screen. It is really impossible to show errors, output, or anything else.
Whenever I change the url to another url (existing or not), I get proper errors, or output if the url makes sense, as long as the url does not contain any dots or colons...
Is there any restriction on the usage, or a curlopt I am missing?
I have no control over the target URL; I need to consume the API in the ip:port form.
UPDATE
The problem is not related to the target URL or to the data coming back: the same problem occurs when I enter a url that makes no sense at all, as long as it contains a . or :
I guess it is a setting on the webserver, since all my tests work fine on localhost (MAMP).
Unfortunately I have no access to any logs or files except the ones I upload myself (one.com webhosting)
UPDATE 2
Turns out my hoster is blocking all outgoing traffic to explicit IPs and to ports other than 80 and 443.
Cancelled my subscription and am moving to a decent provider now.
Thanks for the help
As @Quasimodo suggests, I'd take a look in the log file if I were you. If you're on an Ubuntu server using Apache, look at /var/log/apache2/error.log. A neat trick is to open a terminal and write:
tail -f /var/log/apache2/error.log
This will open a running stream in the terminal. Then you can make your curl request crash (in your browser), go back to the terminal, and see what new and juicy errors you have received.
It's most likely some configuration file on your server, so it would be helpful if you post a couple of specs for that server, such as:
- Which web server you're using (Apache, Nginx, other)
- PHP version
... You can find all of this information easily using phpinfo().
My best guess is that you need to enable PHP cURL in your server configuration, but that is a buck-wild cowboy shot from the hip.
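If you want to rule that out quickly, a one-line check from PHP itself (just a sketch):
// Check whether the cURL extension is loaded at all.
if (!extension_loaded('curl')) {
    die('The cURL extension is not available on this server.');
}
var_dump(function_exists('curl_init')); // should print bool(true)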
Addition 1
I can see that you've just edited the question (it thinks for a while and then gives a blank screen). I'd say that your curl request might be trying to load a large amount of data, and that your PHP configuration has a cap at 128 MB (or something like it).
I'd check the phpinfo() output for these two values:
max_input_vars
memory_limit
to see if either of them is suspiciously low.
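Both values can also be read at runtime, without hunting through the phpinfo() page (a quick sketch):
// Print the current limits at runtime.
echo 'max_input_vars: ', ini_get('max_input_vars'), "\n";
echo 'memory_limit: ', ini_get('memory_limit'), "\n";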

PHP file_get_contents behavior across versions and operating systems

I am using file_get_contents in PHP to get information from a client's collections on contentDM. CDM has an API, so you can get that info by making PHP queries like, say:
http://servername:port/webutilities/index.php?q=function/arguments
It has worked pretty well thus far, across computers and operating systems. However, this time things work a little differently. The query looks like:
http://servername/utils/collection/mycollectionname/id/myid/filename/myname
For this query I fill in mycollectionname, myid, and myname with relevant values. myid and mycollectionname have to exist in the system, obviously; myname, however, can be anything you want. When you run the query, it doesn't return a web page or anything to your browser. It just downloads a file, with myname as the name of the file, into your local /Downloads folder.
I DON'T WISH TO DOWNLOAD THIS FILE. I just want to read the contents of the file it returns directly into PHP as a string. The file I am trying to get just contains xml data.
file_get_contents works to get the data in that file if I use it with PHP 7 and Apache on my laptop running Ubuntu. But on my desktop, which runs Windows 10 and XAMPP (Apache and PHP 5), I get this error (I've replaced sensitive data with ###):
Warning:
file_get_contents(###/utils/collection/###/id/1110/filename/1111.cpd):
failed to open stream: No such file or directory in
D:\Titus\Documents\GitHub\NativeAmericanSCArchive\NASCA-site\api\update.php
on line 18
My coworkers have been unable to help me, so I am curious whether anyone here can confirm or deny that this is an operating-system issue or a PHP-version issue, and whether there's a solid alternative method that is likely to work in PHP 5 on both Windows and Ubuntu.
file_get_contents() is a simple screwdriver. It's very good for getting data with simple GET requests where the headers, HTTP request method, timeout, cookie jar, redirects, and other important things do not matter.
fopen() with a stream context, or cURL with setopt, are power drills with every bit and option you can think of.
In addition to this, due to some recent website hacks, we had to secure our sites further. In doing so, we discovered that file_get_contents failed to work, while curl still did.
I'm not 100% sure, but I believe this php.ini setting may have been blocking the file_get_contents request:
; Disable allow_url_fopen for security reasons
allow_url_fopen = 0
Either way, our code now works with curl.
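If you need code that copes with either configuration, you can detect the setting at runtime and fall back to cURL. A sketch (fetch_url is a hypothetical helper name, not from our actual code):
// Hypothetical helper: use file_get_contents when allow_url_fopen is
// enabled, otherwise fall back to cURL.
function fetch_url($url)
{
    if (filter_var(ini_get('allow_url_fopen'), FILTER_VALIDATE_BOOLEAN)) {
        return file_get_contents($url);
    }
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $data = curl_exec($ch);
    curl_close($ch);
    return $data;
}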
References:
http://25labs.com/alternative-for-file_get_contents-using-curl/
http://phpsec.org/projects/phpsecinfo/tests/allow_url_fopen.html
So, you can solve this problem by using the PHP cURL extension. Here is an example that does the same thing you were trying:
// Minimal cURL GET helper: fetches $url and returns the response body.
function curl($url)
{
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // return instead of printing
    $data = curl_exec($ch);
    curl_close($ch);
    return $data;
}
$url = 'your_api_url';
$data = curl($url);
Finally, you can check your data with print_r($data). Hope it works for you and makes things clear.
Reference : http://php.net/manual/en/book.curl.php

file_get_contents() vs. curl for invoking APIs with PHP

According to the description of the Google Custom Search API, you can invoke it using the GET verb of the REST interface, as in the example:
GET https://www.googleapis.com/customsearch/v1?key=INSERT-YOUR-KEY&cx=017576662512468239146:omuauf_lfve&q=lectures
I set up my API key and custom search engine, and when I pasted my test query directly into my browser it worked fine; I got the JSON displayed to me.
Then I tried to invoke the API from my PHP code by using:
$json = file_get_contents("$url") or die("failed");
where $url was the same one that worked in the browser, but my PHP code was dying when trying to open it.
After that I tried with curl, and it worked. The code was this:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$body = curl_exec($ch);
Questions:
How come file_get_contents() didn't work and curl did?
Could I use fsocket for this as well?
Question 1:
First of all, check the ini setting allow_url_fopen; AFAIK that is the main reason why file_get_contents() wouldn't work. The deprecated safe_mode setting may also cause this.
Based on your comment: you have to add http:// to the URL when using it with filesystem functions. It's a wrapper that tells PHP you need to make an HTTP request; without it, the function thinks you want to open a local file called ./google.com (the same way it would open google.txt).
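A tiny illustration of the difference:
// With the scheme, PHP uses the HTTP stream wrapper:
$html = file_get_contents('http://www.google.com/'); // makes an HTTP request
// Without it, PHP treats the argument as a local path:
$html = file_get_contents('google.com'); // tries to open ./google.com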
Question 2:
Yes, you can build almost any cURL request with sockets.
My personal opinion is that you should stick with cURL because:
timeout settings
it handles all possible HTTP statuses
easy and detailed configuration (no deep knowledge of HTTP headers required)
file_get_contents will probably rewrite your request after resolving the IP, so that you end up requesting the same thing as:
file_get_contents("xxx.yyy.www.zzz/app1",...)
Many servers will deny you access if you address them by IP in the request. With cURL this problem doesn't exist: it resolves the hostname but leaves the request as you set it, so the server is not rude in response. This could be the "cause", too.
1) Why are you using the quotes when calling file_get_contents?
2) As mentioned in the comments, file_get_contents requires allow_url_fopen to be enabled in your php.ini.
3) You could use fsockopen, but you would have to handle the HTTP requests and responses manually, which would be reinventing the wheel when you have cURL. The same goes for socket_create.
4) Regarding the title of this question: cURL is more customizable and better suited to complex HTTP transactions than file_get_contents. It should be mentioned, though, that working with stream contexts lets you apply a lot of settings to your file_get_contents calls, as sketched below. Still, I think cURL is more complete, since it gives you, for instance, the possibility of working with multiple parallel handles.
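For instance, a stream-context version of such a call might look like this (a sketch; the header and timeout values are arbitrary):
// file_get_contents with a stream context that sets the method,
// a custom header, and a timeout.
$context = stream_context_create(array(
    'http' => array(
        'method'  => 'GET',
        'header'  => "Accept: application/json\r\n",
        'timeout' => 10, // seconds
    ),
));
$json = file_get_contents($url, false, $context);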

PHP CURL Cookie not kept when cookie called a second time

I have a PHP page that uses cURL to log in to another page, get the cookies, and then use them to call a further page. On the new page the PHP can be called again to request the same page with different parameters. This code all works on my free web-hosting site. However, when I moved it to my client's site, it works for the first call (the cookie is created and used fine) but not when I call the page again with a new parameter (the cookie is not reused). The code runs in WordPress, and all details are near identical (I copied the themes, plugins, and DB from one site to the other). What could cause the difference, and how would I go about fixing it?
The only difference I can see at the moment, looking at the responses from the web pages, is that the site that is not working has Cache-Control set to no caching and age=0. Could this be the reason, and if so, how can I change it?
Try manually assigning a cookie jar / cookie file to your curl operations:
$cookie_file = "/tmp/cookie/cookie1.txt";
curl_setopt($ch, CURLOPT_COOKIEJAR, $cookie_file);
curl_setopt($ch, CURLOPT_COOKIEFILE, $cookie_file);
curl will then read cookies from the cookie file (CURLOPT_COOKIEFILE) before starting the request, and will write the received cookies into the cookie jar (CURLOPT_COOKIEJAR) when the handle is closed.
The path must be accessible and readable/writable by the user PHP runs as. You should use a full path, not a relative one.
Edit: Marc B writes in PHP, Curl, curl_exec(), curl_close() and cookies that cookies are bound to the curl handle. So as long as you don't close the handle, curl should take care of the cookies.
So you might not need the cookie jar/file at all if both requests share the same curl handle.
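A minimal sketch of that variant ($login_url and $data_url are placeholder names):
// Reuse one handle for both requests; an empty CURLOPT_COOKIEFILE
// enables the in-memory cookie engine without reading any file.
$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIEFILE, ''); // turn on the cookie engine
curl_setopt($ch, CURLOPT_URL, $login_url);
curl_exec($ch); // the login response sets the session cookie
curl_setopt($ch, CURLOPT_URL, $data_url);
$data = curl_exec($ch); // the cookie is sent back automatically
curl_close($ch);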

Can I do a CURL request to the same server?

I need to implement a way to make POST calls to pages located on the same server or on another server. We cannot use include because the files we are calling usually connect to different databases or have functions with the same names.
I've been trying to implement this using curl, and while it works perfectly when calling files on another server, I get absolutely nothing when making a call to the same server the calling file is on.
EDIT TO ADD SOME CODE:
A simplified version of what I'm doing:
File1.php
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "www.myserver.com/File2.php");
curl_setopt($ch, CURLOPT_FRESH_CONNECT, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_VERBOSE, true);
$result = curl_exec($ch);
curl_close($ch);
echo $result;
?>
File2.php
<?php
echo "I'M IN!!";
?>
After calling File1.php, I get nothing, but if File2.php is in another server then I get a result.
Any help?
I tried using both the server URL (http...) and the full path of the files (/home/wwww....)
Be aware that if you're issuing the CURL request to your own site, you're using the default session handler, and the page you're requesting via CURL uses the same session as the page that's generating the request, you'll run into a deadlock situation.
The default session handler locks the session file for the duration of the page request. When you try to request another page using the same session, that subsequent request will hang until the request times out or the session file becomes available. Since you're doing an internal CURL, the script running CURL will hold a lock on the session file, and the CURL request can never complete as the target page can never load the session.
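If both pages really must share the session, the usual workaround is to release the session lock before making the internal request. A sketch, using the File2.php URL from the question:
// Release the session lock before the internal cURL call so the
// target page can open the session itself.
session_start();
// ... read whatever you need from $_SESSION ...
session_write_close(); // unlock the session file

$ch = curl_init('http://www.myserver.com/File2.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$result = curl_exec($ch); // File2.php can now start the session
curl_close($ch);

session_start(); // reacquire the session afterwards if needed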
This is because when you request the local server through its public IP, Apache can't resolve it to the local domain. You have to check which local IP Apache is using for that domain, then edit the /etc/hosts file and add a new row with that local IP plus your domain. For example:
My local IP for that domain in Apache's virtual host is 172.190.1.120, and my domain is mydomain.com.
So I will add:
172.190.1.120 mydomain.com
Then your curl request will work properly.
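If you can't edit /etc/hosts on that machine, a possible alternative (assuming PHP 5.5+ with libcurl 7.21.3 or newer) is to pin the hostname on the cURL handle itself:
// Pin mydomain.com to the internal IP for this handle only,
// instead of editing /etc/hosts (entry format is "host:port:address").
curl_setopt($ch, CURLOPT_RESOLVE, array('mydomain.com:80:172.190.1.120'));
curl_setopt($ch, CURLOPT_URL, 'http://mydomain.com/File2.php');
$result = curl_exec($ch);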
You should refactor your code. In addition to what Marc B mentioned, this approach will unnecessarily slow down your script (potentially by a large margin) and cause lots of confusion. No offense, but this is just an incredibly hacky fix for bad logic.
