I installed a local web server stack (XAMPP, WampServer) on a VDS. When I try to fetch a file using PHP cURL or file_get_contents, the download speed is very low: a 100 MB file takes about 10 minutes. If I download the same file with a browser, it takes only about 3 seconds. What could be the reason?
Thanks for your interest.
Downloading content from a given URL is common practice on the internet, especially with the growing use of web services and APIs offered by Amazon, Alexa, Digg, etc. PHP's cURL library, which often ships with default shared hosting configurations, lets web developers handle this task.
You can try:
/* gets the data from a URL */
function get_data($url) {
    $ch = curl_init();
    $timeout = 5;
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
    $data = curl_exec($ch);
    curl_close($ch);
    return $data;
}
Usage:
$returned_content = get_data('http://davidwalsh.name'); //something like this
Alternatively, you can use the file_get_contents function remotely, but many hosts don't allow this.
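If allow_url_fopen is enabled, a minimal sketch of the file_get_contents approach could look like the following; the 5-second timeout and the example URL are illustrative values, not part of the original answer:
// Sketch: fetch a remote URL with file_get_contents, assuming allow_url_fopen is On.
$context = stream_context_create(array(
    'http' => array('timeout' => 5)
));
$returned_content = file_get_contents('http://davidwalsh.name', false, $context);
if ($returned_content === false) {
    echo 'file_get_contents failed';
}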
Related
I have audio files on a remote server that are streamed / chunked to the user. This all works great in the client's browser.
But when I try to download and save the files locally from another server using cURL, it only seems to download small files (less than 10 MB) successfully; anything larger and it seems to only download the header.
I assume this is because of the chunking, so my question is: how do I make cURL download the larger (chunked) files?
With wget on the Linux command line this is as simple as:
wget -cO - https://example.com/track?id=460 > mytrack.mp3
This is the function I have written using cURL in PHP, but as I say, it only downloads the headers of large files:
private function downloadAudio($url, $fn) {
    $ch = curl_init($url);
    $path = TEMP_DIR . $fn;
    $fp = fopen($path, 'wb');
    curl_setopt($ch, CURLOPT_FILE, $fp);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_AUTOREFERER, 1);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
    curl_exec($ch);
    curl_close($ch);
    fclose($fp);

    if (file_exists($path)) {
        return true;
    }
    return false;
}
In my case it was failing because I had forgotten to increase the default PHP memory_limit on the origin server.
It turned out after posting this question that it was actually downloading any file below roughly the 100 MB mark, not 10 MB as I had stated in the question. As soon as I realised this I checked the memory_limit and, lo and behold, it was set to the default 128M.
I hadn't noticed any problems client side because the response was chunked, but when the server tried to grab an entire 300 MB file in less than a second the memory limit must have been reached.
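The actual fix here was simply raising memory_limit on the origin server. As an alternative, the serving side can also be written so the whole file never has to fit in memory; the following is only a rough sketch of that idea, where the content type, file path and the 1 MB chunk size are assumptions and not the original code:
// Sketch: stream a large file to the client in fixed-size chunks so the
// whole file never has to fit inside PHP's memory_limit.
function streamFile($path) {
    $fp = fopen($path, 'rb');
    if ($fp === false) {
        return false;
    }
    header('Content-Type: audio/mpeg');
    header('Content-Length: ' . filesize($path));
    while (!feof($fp)) {
        echo fread($fp, 1024 * 1024); // send 1 MB at a time
        flush();                      // push the chunk out to the client
    }
    fclose($fp);
    return true;
}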
I have a local website set up via MAMP at: http://127.0.0.1/wordpress/
Is it possible to cURL this local site from my online site? For example, in https://example.com/file.php, I test:
$url ='http://127.0.0.1/wordpress/';
$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, $url);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
curl_setopt($curl, CURLOPT_HEADER, false);
curl_setopt($curl, CURLOPT_SSLVERSION, 3);
$content = curl_exec($curl);
curl_close($curl);
print_r($content);
But it doesn't work. Is it even possible for the client (website visitor) to access their localhost like this?
There are a couple of good free services that let you do this. They are ideal for quickly exposing a local site for testing:
http://localtunnel.me/
https://ngrok.com/
http://localhost.run/
You can use a tool like ngrok to get a live URL for your local projects. Keep in mind that ngrok is not meant to run as a continuous service; it is just a quick way to put a local site online for testing purposes.
Download ngrok (see the link above), place the downloaded file in your project folder, and simply run ./ngrok http 80
(here 80 is the port your project is assumed to be running on; change it as required).
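Once the tunnel is running, the online site requests the public URL that ngrok prints instead of 127.0.0.1. A minimal sketch, where the subdomain abc123.ngrok.io is a made-up placeholder:
// Sketch: fetch the local WordPress site through the ngrok tunnel.
// 'abc123.ngrok.io' is a hypothetical subdomain; use whatever ngrok prints when it starts.
$url = 'https://abc123.ngrok.io/wordpress/';

$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, $url);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
curl_setopt($curl, CURLOPT_FOLLOWLOCATION, true); // ngrok may redirect http to https
$content = curl_exec($curl);
curl_close($curl);

print_r($content);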
I have a site that uses the new version of Google reCAPTCHA (the "I am not a robot" version), and I am having trouble getting the challenge response on my shared server.
I can't use cURL or file_get_contents due to restrictions on the server. Are there any other ways to get the response?
The code I was using locally, which does not work on the live site, is:
CURL
function get_data($url) {
    $ch = curl_init();
    $timeout = 5;
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
    $data = curl_exec($ch);
    curl_close($ch);
    return $data;
}
$response=$this->get_curl_response("https://www.google.com/recaptcha/api/siteverify?secret=SECRET&response=".$_POST['g-recaptcha-response']."&remoteip=".$_SERVER['REMOTE_ADDR']);
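Whichever way the request is made, the siteverify endpoint returns a JSON body; a minimal sketch of checking the result, assuming $response holds that raw body:
// Sketch: the siteverify response is JSON with a boolean "success" field.
$result = json_decode($response, true);
if ($result !== null && !empty($result['success'])) {
    // the captcha was solved correctly
} else {
    // verification failed or the response could not be decoded
}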
File Get Contents
$response=file_get_contents("https://www.google.com/recaptcha/api/siteverify?secret=SECRET&response=".$_POST['g-recaptcha-response']."&remoteip=".$_SERVER['REMOTE_ADDR']);
This turned out to be a problem with the hosting company that could not be resolved: the challenge response from the captcha was being blocked by the hosting company's firewall, and therefore the captcha always failed.
Because I am on a shared server they could not enable file_get_contents(), as it would be a security risk for all the sites on that server.
I installed PHP Captcha (https://www.phpcaptcha.org/) as an alternative, which works fine as all the logic is done locally.
I have some code to convert a PHP page to HTML:
$dynamic = "http://website.net/home.php";
$out = "home.html" ;
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL,"$dynamic");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$file = curl_exec($ch);
file_put_contents($out, $file);
This works perfectly on localhost, but it takes too much time / doesn't work on the live site.
I've tried file_get_contents, but that also doesn't work.
Note:
The http://website.net/home.php page is on the same site where the code is hosted.
cURL is enabled and allow_url_fopen is On according to phpinfo(), both on localhost and on the server.
EDIT:
It works fine when using another website's page instead of my own.
The target page loads perfectly in my browser.
Pages on my website load as fast as usual, but when I use cURL or file_get_contents the request is very slow and sometimes returns no output at all.
I think you have a DNS resolution problem.
There are two ways to use your website's localhost instead of the external domain name.
1. If you have your own server / VPS / dedicated server,
add an entry to /etc/hosts mapping 127.0.0.1 to website.net, or try to fetch the content using localhost (127.0.0.1).
2. If you are using shared hosting, then try the following URL(s) in your code:
http://localhost.mywebsite.net/~username/home.php.
OR
Try to call http://localhost.mywebsite.net/home.php
Try this to fetch the URL content:
$dynamic = TRY_ABOVE_URL(S);
$out = "home.html";

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $dynamic);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$file = curl_exec($ch);
if ($file === false) {
    echo 'Curl error: ' . curl_error($ch);
}
curl_close($ch);
file_put_contents($out, $file);
To investigate why it is not working on the live server, try this:
$dynamic = "http://website.net/home.php";
$out = "home.html" ;
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $dynamic);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$file = curl_exec($ch);
if($file === false)
{
echo 'Curl error: ' . curl_error($ch);
}
curl_close($ch);
file_put_contents($out, $file);
I think it depends on the provider's SSL configuration. Try setting:
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
curl_setopt($ch, CURLOPT_SSLVERSION, 3);
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);
Check what you can find out about your provider's SSL configuration.
It looks like a network routing issue: basically the server can't find a route to website.net (itself). This is a server issue and not a PHP issue. The quickest solution is to edit the hosts file on the server and point website.net to 127.0.0.1.
On Linux servers you will need to add the following line to the bottom of /etc/hosts:
127.0.0.1 website.net
Alternatively, you can try fetching http://127.0.0.1/home.php, but that will not work if you have multiple virtual hosts on the server.
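If editing /etc/hosts is not an option, another approach (not part of the original answer) is cURL's CURLOPT_RESOLVE option, which pins a hostname to an IP address for a single handle. A minimal sketch, assuming the site is served on port 80:
// Sketch: force cURL to resolve website.net to 127.0.0.1 for this request only,
// without touching /etc/hosts. Requires PHP 5.5+ for CURLOPT_RESOLVE.
$ch = curl_init('http://website.net/home.php');
curl_setopt($ch, CURLOPT_RESOLVE, array('website.net:80:127.0.0.1'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$file = curl_exec($ch);
curl_close($ch);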
Try this one:
file_put_contents("home.html", fopen("http://website.net/home.php", 'r'));
But cURL should work as well. If you have SSH access to the server, try to resolve your domain name using ping or a similar tool. It is possible you have a local DNS issue, which would explain why you can download from external domains.
For my example, you'll need allow_url_fopen to be On, so please verify that with phpinfo() first.
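A quick way to check this in code rather than via phpinfo() is ini_get(); a minimal sketch:
// Sketch: check whether remote URLs can be opened with fopen()/file_get_contents().
if (filter_var(ini_get('allow_url_fopen'), FILTER_VALIDATE_BOOLEAN)) {
    file_put_contents("home.html", fopen("http://website.net/home.php", 'r'));
} else {
    echo 'allow_url_fopen is Off; use cURL instead.';
}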
I have a virtual web server that I installed using VirtualBox; it runs Ubuntu Server. Recently I needed to get data from the Google Maps Geocoding service. First, I tried the following code:
file_get_contents(URL);
After getting a timeout error, I also tried cURL:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://maps.google.com/maps/api/geocode/json?address=" . $gm_address . "&sensor=false");
$result = curl_exec($ch);
curl_close($ch);
Yet again, I got a timeout error.
I suspect that Ubuntu does not allow PHP to make calls to other websites. I am not a Linux or Ubuntu expert, so I don't know how to adjust the firewall, or whatever other settings would allow PHP to make those calls.
In short, how do I change the settings so that PHP can get data from other websites?
Try this cURL code:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://maps.google.com/maps/api/geocode/json?address=" . $gm_address . "&sensor=false");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$result = curl_exec($ch);
curl_close($ch);
Does the text now appear in $result?
You may want to check your firewall settings; it may be blocking requests to the other sites.
Maybe your php.ini has cURL disabled.
Look for disable_functions in /etc/php.ini.
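A small sketch of checking this from PHP itself rather than reading php.ini by hand:
// Sketch: check whether the cURL extension is loaded and whether curl_exec
// has been blacklisted via the disable_functions directive.
if (!extension_loaded('curl')) {
    echo "The cURL extension is not loaded.\n";
}
$disabled = array_map('trim', explode(',', (string) ini_get('disable_functions')));
if (in_array('curl_exec', $disabled, true)) {
    echo "curl_exec is listed in disable_functions.\n";
}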