Can I do a CURL request to the same server? - php

I need to implement a way to make POST calls to pages located on the same server or on another server. We cannot use include because the files that we are calling usually connect to different databases or have functions with the same name.
I've been trying to implement this using curl, and while it works perfectly when calling files from another server, I get absolutely nothing when making a call to the same server where the file is.
EDIT TO ADD SOME CODE:
A simplified version of what I'm doing:
File1.php
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "www.myserver.com/File2.php");
curl_setopt($ch, CURLOPT_FRESH_CONNECT, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_VERBOSE, true);
$result = curl_exec($ch);
curl_close($ch);
echo $result;
?>
File2.php
<?php
echo "I'M IN!!";
?>
After calling File1.php, I get nothing, but if File2.php is on another server then I get a result.
Any help?
I tried using both the server URL (http...) and the full filesystem path of the files (/home/wwww....)

Be aware that you'll run into a deadlock if you're issuing the cURL request to your own site, you're using the default session handler, and the page you're requesting via cURL uses the same session as the page that's generating the request.
The default session handler locks the session file for the duration of the page request. When you try to request another page using the same session, that subsequent request will hang until the request times out or the session file becomes available. Since you're doing an internal CURL, the script running CURL will hold a lock on the session file, and the CURL request can never complete as the target page can never load the session.
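If that's the situation, one way out is to release the session lock before issuing the internal cURL request. A minimal sketch, assuming File1.php uses the default file-based session handler (the original question doesn't show any session code, so this is purely illustrative):
<?php
// Release the session lock before the internal cURL call so that
// File2.php can open the same session without blocking.
session_start();
// ... any session work you need ...
session_write_close();

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://www.myserver.com/File2.php");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$result = curl_exec($ch);
curl_close($ch);

// Reopen the session afterwards if you still need it.
session_start();
echo $result;
?>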

This happens when you request the local server by its public IP or domain and Apache can't map it back to the local virtual host. So you have to check which local IP Apache is using for that domain, then edit the /etc/hosts file and add a new row with the local IP plus your domain. For example:
My local IP for that domain in Apache's virtual host is 172.190.1.120 and my domain is mydomain.com.
So I will add:
172.190.1.120 mydomain.com
Then your curl will work properly.
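If editing /etc/hosts isn't an option, a similar effect is possible from PHP itself with CURLOPT_RESOLVE (needs PHP 5.5+ and cURL 7.21.3+). A minimal sketch, reusing the example IP/domain above and the File2.php from the question:
<?php
// Pin mydomain.com to the local IP for this one request,
// instead of editing /etc/hosts.
$ch = curl_init("http://mydomain.com/File2.php");
curl_setopt($ch, CURLOPT_RESOLVE, array("mydomain.com:80:172.190.1.120"));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$result = curl_exec($ch);
curl_close($ch);
echo $result;
?>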

You should refactor your code. In addition to what Marc B mentioned, this approach will unnecessarily slow down your script (potentially by a large margin) and cause lots of confusion. No offense, but this is just an incredibly hacky fix for bad logic.

Related

Run a cron job from within a php file

I have a problem where my hosting company won't let me run a cron job in this format from my control panel:
/usr/bin/php /home/sites/MYDOMAIN.com/index.php?option=com_community&task=cron
Or:
www.MYDOMAINNAME.com/index.php?option=com_community&task=cron
Now if i run the second job in a browser i.e.:
www.MYDOMAINNAME.com/index.php?option=com_community&task=cron
this works fine in a browser
My support says I have to create a file to run the URL. The only problem is I don't know how to run a URL in PHP; I have asked on a few sites but got nothing. My file is called bump.php and has the following code:
lynx -dump http://www.MYDOMAIN.com/index.php?option=com_community&task=cron
This is what I have in the file:
<?php
echo file_get_contents('DOMAIN.com/index.php?option=com_community&task=cron');
?>
You have to access the file in question via your web server, not directly through the filesystem. If you access it as a local file, it will just return the PHP code instead of executing it.
There are several ways to access files via the web server. One is the method you show with file_get_contents. You will need to add http:// in front of the URL to tell PHP that you want it fetched remotely and not read as a local file.
file_get_contents is not always configured to allow opening remote URLs; in those cases it will not work. You can check this link for the configuration setting that controls remote access (allow_url_fopen):
http://www.php.net/manual/en/filesystem.configuration.php#ini.allow-url-fopen
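So a corrected bump.php could look like this (a sketch, keeping the domain placeholder from the question and assuming allow_url_fopen is enabled):
<?php
// Note the http:// prefix: without it PHP looks for a local file
// instead of making an HTTP request.
echo file_get_contents('http://www.MYDOMAIN.com/index.php?option=com_community&task=cron');
?>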
Another solution is to use the curl extension (if available)
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://www.example.com/");
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
http://www.php.net/manual/en/function.curl-exec.php
There are other extensions if curl is also not available...

Flash REST woes: passing headers through php proxy

I'm writing an ActionScript 3 Flash game which needs to access REST services specified by the sponsor (buyer of the game) - things like highscores etc. I know how to use URLLoader, URLRequest, and set URLRequestHeader.
Unfortunately the sponsor is on shared hosting and can't put crossdomain.xml in the server root, so I can't connect to it from my localhost Flash game (Same Origin Policy). I learned that there is a way to connect to the REST API by proxying calls through a PHP file on a different server.
So I have a proxy.php file on my private server, and am calling it like:
www.myserver.pl/scripts/proxy.php?url=http%3A%2F%2Fsponsorserver.hosting.com/api/init.json
(url=URLEncoded address)
It connects (returns HTTP 200), however I don't know yet how to pass custom headers through to the init.json script; I tried all combinations of sending them through GET and POST, as well as calling the script through GET and POST (using the RESTClient FF extension).
This is the proxy.php:
$getvars = 'myparam1=3;&myparam2=data;'; // for test purposes
$url = $_GET['url'] . '?' . $getvars;
$session = curl_init($url);
curl_setopt($session, CURLOPT_HEADER, true);
curl_setopt($session, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec_follow($session);
echo $response;
curl_close($session);
Unfortunately my server has a non-empty open_basedir, so I can't set CURLOPT_FOLLOWLOCATION and use plain curl_exec, hence curl_exec_follow (taken from this SO answer: https://stackoverflow.com/a/10835824/2492808). Could that be why the GET variables are not picked up by init.json? It says it doesn't see myparam1 and myparam2. Unfortunately I can't change php.ini on my server and need a way to make this work, so that I can integrate and test the sponsor's REST API from my IDE. Thanks!
EDIT: that was stupid; it's not GET variables that I want to send to the sponsor script, but HTTP headers. So I've removed $getvars and added:
curl_setopt ($session, CURLOPT_HTTPHEADER, Array('myparam1=3','myparam2=data'));
before curl_exec_follow (and inside it, before every new curl_exec call), which in theory should set the headers and pass them through. Unfortunately, the script still doesn't see the headers :(
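One thing worth checking (this is an assumption about the cause, not something confirmed in the question): CURLOPT_HTTPHEADER expects raw header lines in "Name: value" form, not "name=value" pairs, so the array above may never arrive as real headers. A sketch, with made-up header names for illustration:
// HTTP headers passed to CURLOPT_HTTPHEADER must be "Name: value" strings.
// "X-Myparam1" and "X-Myparam2" are hypothetical header names.
curl_setopt($session, CURLOPT_HTTPHEADER, array(
    'X-Myparam1: 3',
    'X-Myparam2: data',
));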
Also, according to the PHP manual, I tried removing the open_basedir restriction by putting a .htaccess in www.myserver.pl/scripts/:
<IfModule mod_php5.c>
php_value open_basedir ""
</IfModule>
But it's not changing anything; I probably don't have "AllowOverride Options" or "AllowOverride All" privileges, so CURLOPT_FOLLOWLOCATION is still illegal.
Unfortunately I can't change php.ini on my server and need a way to make this work, so that I can integrate and test the sponsor's REST API from my IDE. Thanks!
Useless excuse these days. Buy a $15-per-month server hosted on the Amazon cloud. It's easier to spend $15 than to spend an hour writing code around an easily solvable problem. You were probably spending $10 anyway on a shared hosting plan.

PHP CURL Cookie not kept when cookie called a second time

I have a PHP page that uses cURL to log in to another page, get the cookies, and then use them to call another page. On the new page the PHP can be called again to call the same page but with different parameters. This all works on my free web hosting site. However, when I moved it to my client's site it works for the first call (i.e. the cookie was created and used fine) but not when I call the page again with a new parameter (i.e. the cookie is not reused). The code is in WordPress and all details are near identical (I copied the themes, plugins and DB from one site to the other). What would be the reason for the difference and how would I go about changing it?
The only difference I can see at the moment, looking at the responses from the web pages, is that the site that is not working has Cache-Control set to no caching and age=0. Would this be the reason, and if so how can I change it?
Try to manually assign a cookiejar / file to your curl operations:
$cookie_file = "/tmp/cookie/cookie1.txt";
curl_setopt($ch, CURLOPT_COOKIEJAR, $cookie_file);
curl_setopt($ch, CURLOPT_COOKIEFILE, $cookie_file);
curl will then read cookies from the cookie file before starting the request and write any cookies received in the response into the cookie jar when the handle is closed (here both options point to the same file).
The path must exist and be readable and writable by the user PHP runs as. You should use a full path, not a relative one.
Edit: Marc B writes in PHP, Curl, curl_exec(), curl_close() and cookies that cookies are bound to the curl handle, so as long as you don't close the handle, curl should take care of the cookies.
So you might not need the cookie jar/file if both requests share the same curl handle.
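A minimal sketch of that approach (URLs and POST fields are placeholders; setting CURLOPT_COOKIEFILE to an empty string enables curl's in-memory cookie engine without touching disk):
// Two requests on the same handle: cookies received by the first request
// are sent back automatically on the second one.
$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIEFILE, '');   // enable in-memory cookie handling

// First request: log in; the session cookie is remembered by the handle.
curl_setopt($ch, CURLOPT_URL, 'http://example.com/login.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, 'user=foo&pass=bar');
curl_exec($ch);

// Second request on the same handle, now with a different parameter.
curl_setopt($ch, CURLOPT_URL, 'http://example.com/page.php?param=2');
curl_setopt($ch, CURLOPT_HTTPGET, true);
$page = curl_exec($ch);
curl_close($ch);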

PHP curl post to login to wordpress

I am using PHP cURL to log in to WordPress behind the scenes as described here:
Wordpress autologin using CURL or fsockopen in PHP
However my script is not setting the cookies necessary to retain the WordPress session. Instead they are being sent back to my script and stored in cookies.txt.
Both the curl script and the wordpress login are on the same server in different directories.
Do I need to write another curl script to manually set the wordpress cookies? Is that possible?
If you're just using the posted code as is, it won't work because the subsequent requests won't send the cookies back. Adding curl_setopt($ch, CURLOPT_COOKIEFILE, $cookie); should help (at least if the cookies actually get saved now; otherwise look into file permissions), depending on your usage scenario.
You could also check out 10 awesome things to do with cURL for some neat examples on how to use curl (example 4 might just be what you are looking for).
BTW, if this script is intended for multiple (concurrent) users, you shouldn't use a static filename; create a temporary file for each user instead, for example:
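A minimal sketch (the prefix is arbitrary; sys_get_temp_dir() just picks the system temp directory):
// Per-user/per-run cookie file instead of one shared static name.
$cookie = tempnam(sys_get_temp_dir(), 'wpcookie_');
curl_setopt($ch, CURLOPT_COOKIEJAR, $cookie);
curl_setopt($ch, CURLOPT_COOKIEFILE, $cookie);
// ... perform the requests ...
unlink($cookie);   // clean up when done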
I needed the cookies to be sent to the browser not back to my curl script. The curl script was triggered by a php script running in the browser.
I solved the problem as follows:
Added these params to the curl object:
curl_setopt($ch, CURLOPT_HEADER, 1);
curl_setopt($ch, CURLOPT_HEADERFUNCTION, 'read_header');
Added a PHP function called read_header which parsed out the cookie data
Used setcookie() to manually set the cookies
If anyone wants the details please comment.
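For reference, a rough sketch of what such a header callback could look like; this is a reconstruction, not the poster's actual code, and the Set-Cookie parsing is deliberately naive:
// CURLOPT_HEADERFUNCTION callback: relay Set-Cookie headers from the
// cURL response to the visitor's browser. The callback must return the
// number of bytes it handled.
function read_header($ch, $header)
{
    if (stripos($header, 'Set-Cookie:') === 0) {
        // e.g. "Set-Cookie: wordpress_logged_in=abc; path=/; httponly"
        $parts = explode(';', trim(substr($header, strlen('Set-Cookie:'))));
        list($name, $value) = explode('=', trim($parts[0]), 2);
        setcookie($name, $value, 0, '/');   // must run before any output
    }
    return strlen($header);
}
curl_setopt($ch, CURLOPT_HEADER, 1);
curl_setopt($ch, CURLOPT_HEADERFUNCTION, 'read_header');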

How do I transfer data to next server

I am trying to pass some information from the server the script is running on to another URL or server.
I tried using cURL, but ran into the following two problems:
if it cannot locate the file, it reports "file not found"
it waits until the remote file has finished executing.
How can I overcome both of these, either with cURL or with other commands?
Edit:
Now I would like to suppress the "file not found" error message displayed by cURL even if the file really doesn't exist.
I do not want output from the destination page, so I don't want to wait until its execution is finished. I just want to trigger the code and continue with the remaining scripts.
Example:
I am trying to build a logging system that lives entirely on another web server. The client website that implements the logging system will send some of the data required by the system by calling a file on my web server.
Code I am using:
// create a new cURL resource
$ch = curl_init();
// set URL and other appropriate options
curl_setopt($ch, CURLOPT_URL, "http://example.com/log.php?data=data");
curl_setopt($ch, CURLOPT_HEADER, 0);
// grab URL and pass it to the browser
curl_exec($ch);
// close cURL resource, and free up system resources
curl_close($ch);
Why don't you want to use standard PHP features?
<?php
$logpage = file_get_contents('http://example.com/log.php?data=data');
echo $logpage;
?>
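If the goal really is just to trigger log.php without waiting for it or showing its errors, one rough option is a short cURL timeout with errors suppressed. This is a sketch of that trick, not a guarantee: whether log.php keeps running after the connection is dropped depends on its ignore_user_abort setting.
// Fire-and-forget style trigger: abort after ~200 ms and discard any error output.
$ch = curl_init('http://example.com/log.php?data=data');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // don't echo the response
curl_setopt($ch, CURLOPT_FAILONERROR, true);     // don't output a 404 error page
curl_setopt($ch, CURLOPT_NOSIGNAL, true);        // needed for sub-second timeouts on some builds
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 200);       // give up quickly instead of waiting
curl_exec($ch);
curl_close($ch);
// ...continue with the rest of the script...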
