Including a PHP File: cURL, Pathing, and Security Risks

So I have heard a lot about cURL having security risks. I need to know whether what I am doing carries security risks and, if so, how I can prevent them.
I have my own VPS (Virtual Private Server) through HostGator.
Basically I am trying to include a file from a different domain. Both domains are on the same server. I tried to use the absolute path to include the file, but I keep getting a permission denied error.
This is my code to include the file using cURL:
$url = "http://mydomain.com/include.php";
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_HEADER, 0); // omit response headers from the output
curl_exec($ch); // without CURLOPT_RETURNTRANSFER, the response is echoed directly
curl_close($ch);
Is there a security risk doing this? If so, how can I prevent this security risk?
In addition, how can I set permissions on the domain so I can include a PHP file from another domain on the same server?

If you're loading a file from a different domain on the same server, you can simply use include with a filesystem path. You cannot grab the PHP source over a URL anyway (the remote server executes the script and returns only its output), and doing it that way is not recommended.
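A minimal sketch of the include-by-path approach; the directory layout below is hypothetical, so substitute the real document root of the other domain:
<?php
// Both domains live on the same VPS, so include the file from disk.
// The path is a placeholder for the other site's document root.
include '/home/user/otherdomain.com/public_html/include.php';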

Related

Connect to a site presenting an expired root certificate in the certificate bundle with curl in PHP

Over the weekend, the Sectigo AddTrust External CA Root expired. For modern browsers, this should not have made any difference for users of affected sites.
Our PHP application connects to a site which we don't control, which includes this expired root in its certificate bundle. We connect using curl, and verify the certificates. But since this root is now expired, curl is now refusing to connect, with an error that the certificate is expired.
There is a sample site which exhibits the same behaviour at https://addtrustchain.test.certificatetest.com/
And sample code which exhibits the same behaviour is
$ch = curl_init();
$url = 'https://addtrustchain.test.certificatetest.com/';
//$url = 'https://google.com';
$caPath = '/path/to/cacert.pem';
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, true); // verify the peer's certificate
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);    // verify the certificate matches the hostname
curl_setopt($ch, CURLOPT_CAINFO, $caPath);      // use our own CA bundle
$output = curl_exec($ch);
var_dump($output);
var_dump(curl_getinfo($ch));
var_dump(curl_errno($ch));
var_dump(curl_error($ch));
curl_close($ch);
Is there a workaround from the php side where we can ignore the expired root certificate provided in the bundle? We're trying to work with the parties on the other side to remove/update the expired root from their bundle, but it would be great to have a solution from our side for the next time this comes up.
I have tried updating our local cacert.pem to include the actual certificate itself, and the provided intermediate certificates, but neither of those seems to fix the issue.
You need to remove AddTrust External Root from your cacert.pem.
For those who are wondering, you can take the cacert.pem from Mozilla here: https://curl.haxx.se/docs/caextract.html
You then need to remove AddTrust External Root.
Removing AddTrust External Root forces software to use the correct certification path (when multiple paths are available).
For example, twinoid.com has 3 paths. Two of them are valid; the third contains AddTrust External Root. https://www.ssllabs.com/ssltest/analyze.html?d=twinoid.com&hideResults=on (you can check the 3 paths there)
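If you want to automate the removal, here is a minimal sketch assuming the Mozilla-format bundle linked above, where each certificate block is preceded by a line naming it; the file paths are placeholders:
$bundle = file_get_contents('/path/to/cacert.pem');
// Split the bundle into per-certificate entries and drop the one whose
// label contains "AddTrust External Root".
$entries = explode('-----END CERTIFICATE-----', $bundle);
$kept = array_filter($entries, function ($entry) {
    return strpos($entry, 'AddTrust External Root') === false;
});
// Rejoin the remaining entries; the delimiter restores each END marker.
file_put_contents('/path/to/cacert-fixed.pem',
    implode('-----END CERTIFICATE-----', $kept));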

Selective 302 redirect

I am making cURL calls from PHP scripts on one domain (mac2cash.com) to another (thebookyard.com), both hosted on the same Apache server and the same IP address. This has been working fine, but I needed to add some new functionality, so I created a new PHP script at the root level of the same target domain as the working cURL call. When I call this new script using the same code I used for the working script, it returns the message "Found: the document has moved here".
The target scripts for the working and failing cURL calls are at the root level of the same domain, and I have checked that they have the same Unix permissions. But if I simply change the PHP file name in the working script to the name of the target script in the failing call, it now fails too, with the same 302 redirect message.
I even duplicated the 'working' target script (byasd_api.php) on the target domain to a new file (byasd_api_copy.php), and I get the 302 message if I make a cURL call to that from the calling script that was working, even though the code is exactly the same!
I cannot see what the difference is between the two files. Is there some kind of caching going on where newly created files are not treated the same?
For reference, here is the calling code:
$header = array("Host: thebookyard.com");
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, HTTP_SERVER_IP."/byasd_api.php");
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_HTTPHEADER, $header);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_REFERER, 'http://www.mac2cash.com');
curl_setopt($ch, CURLOPT_POST, true); // CURLOPT_POST expects a boolean, not a field count
curl_setopt($ch, CURLOPT_POSTFIELDS, $post_data);
$output = curl_exec($ch);
curl_close($ch);
The 'byasd_api.php' script name is the only thing I am changing.
I've spent some hours googling for a solution so would appreciate any suggestions.
Your Apache is configured to look for favicon.ico on each call; the 302 is returned because it cannot find the icon.
GET http://theboo....com/favicon.ico [HTTP/1.1 302 Found 151ms]
Change the configuration or add a favicon.ico file.
Maybe the configuration only looks for the icon file in the root.
It turns out the reason for the difference in behaviour was that the working script's name was included as a rewrite condition in the .htaccess file, which was redirecting HTTP to HTTPS. Changing the cURL URL to "https://".HTTP_SERVER_IP."/byasd_api.php" stopped the "Found: the document has moved here" error, but the call then failed because cURL was trying to validate the SSL certificate against the IP address rather than the domain.
The solution to that was to add the following:
curl_setopt($ch, CURLOPT_RESOLVE, array("www.thebookyard.com:443:".HTTP_SERVER_IP));
This still allows the call to go to the IP address (which is much faster than via the domain name), but cURL validates the SSL certificate against the domain name.
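Putting it together, a minimal sketch of the usual CURLOPT_RESOLVE pattern, assuming HTTP_SERVER_IP is defined as in the question: the URL names the domain so TLS validation checks the right host, while the resolve entry pins that domain to the known IP and skips the DNS lookup:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "https://www.thebookyard.com/byasd_api.php");
// Pin the hostname to the known IP; the certificate is still checked
// against www.thebookyard.com, not the IP address.
curl_setopt($ch, CURLOPT_RESOLVE, array("www.thebookyard.com:443:".HTTP_SERVER_IP));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$output = curl_exec($ch);
curl_close($ch);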

file_get_contents,CURL settings on a server

I am in the process of providing an option similar to Facebook's share feature, where content from an external web page can be displayed on my site. I am coding this using PHP and Ajax, but when I hosted my page on a free server site like www/0009.ws I got the error below:
Warning: file_get_contents() [function.file-get-contents]: URL file-access is disabled in the server configuration in /www/0009.ws
I will surely be moving this to a paid server later.
Is there a workaround if a paid service provider also does not allow me to use these options?
Do I have to set up my own server?
You can use the curl set of functions:
<?php
// create a new cURL resource
$ch = curl_init();
// set URL and other appropriate options
curl_setopt($ch, CURLOPT_URL, "http://www.example.com/");
curl_setopt($ch, CURLOPT_HEADER, 0);
// grab URL and pass it to the browser
curl_exec($ch);
// close cURL resource, and free up system resources
curl_close($ch);
?>
assuming cURL support is built in, which is a bit of an assumption. Honestly, file_get_contents having URL file access disabled is something you should check beforehand when choosing a provider.
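If you need the page contents in a variable rather than echoed to the browser (the usual drop-in replacement for file_get_contents), a minimal sketch:
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://www.example.com/");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // make curl_exec() return the body
$contents = curl_exec($ch);
curl_close($ch);
// $contents now holds what file_get_contents() would have returned.
?>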

FTP download file from server directly into client

I am trying to download a file from an FTP server to the client. If I use ftp_get, the file is downloaded to the PHP server, which can then write the output to the browser. So the download process is
FTP server -> PHP server -> client
This doubles the traffic, which is bad when downloading big files. There is a way to write the file directly to the browser, described here: Stream FTP download to output - but the data flows through the PHP server anyway, am I right?
Is there any way to establish this download directly (if yes, how?), or is it impossible in principle?
FTP server -> client
Edit: it should work also with non-anonymous FTP servers in secure way.
Download the file ;-)
If the client can directly access the file in question (i.e. no secret usernames or passwords necessary), just redirect him to it:
header('Location: ftp://example.com/foobar');
This will cause the client to access the URL directly. You can't control what the client will do though. The browser may simply start to download the file, but it may also launch an FTP client or do other things which you may or may not care about.
Try the code below for that.
$curl = curl_init();
$file = fopen("ls-lR.gz", 'w');
curl_setopt($curl, CURLOPT_URL, "ftp://ftp.sunet.se/ls-lR.gz"); # input
curl_setopt($curl, CURLOPT_FILE, $file);                        # output
curl_setopt($curl, CURLOPT_USERPWD, "{$_FTP['username']}:{$_FTP['password']}");
// Note: CURLOPT_RETURNTRANSFER is deliberately not set; setting it after
// CURLOPT_FILE would make curl_exec() return the data instead of writing
// it to the file handle.
curl_exec($curl);
curl_close($curl);
fclose($file);
Thanks.

How to copy a remote image to my website directory?

I post pictures from other websites, and I would rather have them on my own servers in case their server dies all of a sudden. Say the file is located at "www.www.www/image.gif"; how would I copy it to my "images" directory safely?
I write in PHP.
Thanks!
The following should work:
// requires allow_url_fopen
$image = file_get_contents('http://www.url.com/image.jpg');
file_put_contents('/images/image.jpg', $image);
or the cURL route:
$ch = curl_init('http://www.url.com/image.jpg');
$fp = fopen('/images/image.jpg', 'wb'); // open the local destination for binary writing
curl_setopt($ch, CURLOPT_FILE, $fp);    // write the response straight to the file
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
fclose($fp);
If your server is configured to support http:// file paths, you can use file_get_contents.
If that doesn't work, the next simplest way is to use cURL, for which you will certainly find full-fledged download scripts.
Some servers you pull images from may require a User-Agent header that shows you are a regular browser. There is a ready-made class in the User Contributed Notes for curl that handles that and provides a simple DownloadFile() function.
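A minimal sketch of setting such a User-Agent yourself; the UA string below is just an example browser identifier:
$ch = curl_init('http://www.url.com/image.jpg');
// Some hosts reject requests that lack a browser-like User-Agent.
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows NT 10.0; Win64; x64)');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$image = curl_exec($ch);
curl_close($ch);
file_put_contents('/images/image.jpg', $image);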
