Running repeated tasks in PHP

I am using the following code to get some info from a website and save it locally.
$ch = curl_init("http://test.com/test.txt");
$fp = fopen("test.txt", "w");
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
fclose($fp);
Now the test.txt needs to be updated periodically. How can I trigger this at specific time intervals?

If you're on *nix or a Mac (or even Cygwin), you're probably better off using wget:
@hourly wget http://test.com/test.txt
That will do everything that cURL call will do, and do it on an hourly basis. Here's a good cron intro: https://help.ubuntu.com/community/CronHowto
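If you want to keep the existing PHP snippet instead of switching to wget, you can save it as a standalone script and let cron invoke the PHP CLI. A minimal crontab sketch (the script path below is only a placeholder):
# run the download script at the top of every hour (path is hypothetical)
@hourly php /path/to/fetch-test.php
# equivalent five-field form: minute 0 of every hour
0 * * * * php /path/to/fetch-test.php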

Related

PHP curl download remote stream / chunked data

I have audio files on a remote server that are streamed / chunked to the user. This all works great in the client's browser.
But when I try to download and save the files locally from another server using cURL, it only seems able to download small files (less than 10 MB) successfully; anything larger and it seems to download only the header.
I assume this is because of the chunking, so my question is: how do I make cURL download the larger (chunked) files?
With wget on the CLI on Linux this is as simple as:
wget -cO - https://example.com/track?id=460 > mytrack.mp3
This is the function I have written using cURL in PHP, but as I said it only downloads the headers on large files:
private function downloadAudio($url, $fn){
    $ch   = curl_init($url);
    $path = TEMP_DIR . $fn;
    $fp   = fopen($path, 'wb');
    // Stream the response body straight into the file instead of into memory.
    curl_setopt($ch, CURLOPT_FILE, $fp);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_AUTOREFERER, 1);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0);
    curl_exec($ch);
    curl_close($ch);
    fclose($fp);
    if (file_exists($path)) {
        return true;
    }
    return false;
}
In my case it was failing because I had forgotten to increase the default PHP memory_limit on the origin server.
It turned out after posting this question that it was actually downloading any file below roughly the 100 MB mark successfully, not 10 MB as I had stated in the question. As soon as I realised this I checked the memory_limit and, lo and behold, it was set to the default 128M.
I hadn't noticed any problems client side because the response was chunked, but when the server tried to grab an entire 300 MB file in less than a second the memory limit must have been reached.
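For reference, a minimal sketch of raising the limit (the 512M value is only an example; streaming straight to disk with CURLOPT_FILE, as the function above does, is usually the better fix because the whole file never needs to fit in memory):
// Raise the limit for this request only (example value, pick one that fits your files)
ini_set('memory_limit', '512M');
// or set it permanently in php.ini:
// memory_limit = 512M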

Custom script for importing posts to WordPress works on the test server, but not on the live server

A developer wrote a custom PHP script for me for importing posts into WordPress from a CSV file. The script worked fine on a staging site, which was on a different server, but since we moved it to my server it can't download the CSV file, and even if I manually copy the file into the folder, it won't import it. It doesn't show any errors, just a blank page.
It's shared hosting, so the provider has set max_execution_time to 120, which should be enough for the script to run, but it times out after 30 seconds.
The script uses curl_setopt to get the file. The PHP version is 5.5.
function fake_user_agent_http_get($address) {
    $userAgent = 'FreeRock.Eu/2.0 (http://www.freerock.eu/share.php)';
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    curl_setopt($ch, CURLOPT_USERAGENT, $userAgent);
    curl_setopt($ch, CURLOPT_URL, $address);
    curl_setopt($ch, CURLOPT_FAILONERROR, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_AUTOREFERER, true);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 400);
    $html = curl_exec($ch);
    if (!$html) {
        echo "<br /> error number:" . curl_errno($ch);
        echo "<br /> error:" . curl_error($ch);
        exit;
    }
    return $html;
}
Then I have:
$z_html = fake_user_agent_http_get('https://www.myfilelocation.com');
$myfile = fopen("promotions.csv", "w") or die("Unable to open file!");
fwrite($myfile, $z_html);
fclose($myfile);
Would appreciate any help here.
Thanks
Alexis
There can be numerous reasons for this. Let's try a few things:
1) Add this at the top of the script so you can see the errors:
ini_set('display_errors',true);
error_reporting(E_ALL);
2) Try adding the following at the top of the code to avoid the 30-second max execution time:
set_time_limit(0);
3) The hosting provider probably has PHP error logs available. Try downloading them to figure out what the error is.
Update: the timeout was caused by an nginx setting, and because it's shared hosting, the 30-second timeout cannot be changed. The solution was to run the script via SSH.
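Running it from an SSH session means the web server (and its 30-second proxy timeout) is never involved; only PHP's own CLI limits apply. Something along these lines, with a placeholder path:
php /path/to/import-posts.php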

cURL not able to download image file from server running Varnish Cache

I have the following PHP script that works perfectly 99% of the time, but it will not download an image from this one server, which I think is running Varnish Cache.
<?php
$imglink = 'http://www.dirtrider.com/wp-content/uploads/2014/10/WD-10_1_14-001.jpg';
$ch = curl_init($imglink);
$fp = fopen('/home/path/www/tmp/test.jpg', "w");
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_TIMEOUT, 5);
curl_exec($ch);
fclose($fp);
You get a 403 Forbidden error if you use cURL to load that image. You can work around this error very easily: just set an alternative user agent for your cURL request:
curl_setopt($ch, CURLOPT_USERAGENT, $_SERVER['HTTP_USER_AGENT']);
Et voilà! It works like a charm. It seems Varnish Cache blocks cURL requests that use the default cURL user agent.
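Put together with the original snippet, the whole download might look like this. Note that $_SERVER['HTTP_USER_AGENT'] only exists when the script is served through a browser; the fallback string below is just an example for CLI runs:
<?php
$imglink = 'http://www.dirtrider.com/wp-content/uploads/2014/10/WD-10_1_14-001.jpg';
// Fall back to an example UA string when there is no browser request (e.g. CLI).
$ua = isset($_SERVER['HTTP_USER_AGENT'])
    ? $_SERVER['HTTP_USER_AGENT']
    : 'Mozilla/5.0 (compatible; example-fetcher/1.0)';
$ch = curl_init($imglink);
$fp = fopen('/home/path/www/tmp/test.jpg', 'w');
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_TIMEOUT, 5);
curl_setopt($ch, CURLOPT_USERAGENT, $ua);
curl_exec($ch);
curl_close($ch);
fclose($fp);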

php curl download stops at 95%

This is very weird: my PHP cURL download stops at 95% every time. It's driving me crazy.
Here is the code I'm using, nothing fancy:
$fp = fopen($file, 'w');
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://www.domain.com/");
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, array("ETag: $rddash"));
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_exec($ch);
curl_close($ch);
fclose($fp);
Something I noticed: the remote website is using ETag, so I sent it along, but it's still not working.
What could be the reason the download stops before it completes?
Maybe it's a timeout issue in your php.ini settings. Use set_time_limit(0); in your code.
See the manual for more details.
Also check the PHP error log.
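As a sketch of what lifting the limits could look like in the snippet above (the values are only examples; check your own limits before copying):
// Let the script run as long as the transfer needs (0 = no limit).
set_time_limit(0);
// If a cURL timeout was set elsewhere, make sure it is long enough or disabled.
curl_setopt($ch, CURLOPT_TIMEOUT, 0);         // 0 = never time out
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30); // but still cap the connect phase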

How to write verbose information into a file?

I'm using cURL in my PHP script and I want to write the information from CURLOPT_VERBOSE into a file. I tried CURLOPT_STDERR but without luck; it still prints everything in the cmd window. I'm not sure whether the parameter in curl_setopt($ch, CURLOPT_STDERR, $errFileHandle); should be a file handle or a file name (neither works).
Running on a Unix-based OS? Use the command below to run your script and capture stderr, which is where the verbose output goes by default (on Windows, run the same command under Cygwin):
php yourPhp.php 2> yourFile.txt
Or you can just run it in verbose mode with a file handle
<?php
$verbosePath = __DIR__ . DIRECTORY_SEPARATOR . 'verboseOut.txt';
echo "Saving verbose output to: $verbosePath\n";
$f = fopen($verbosePath, 'w+');
$handle = curl_init('http://www.google.com/');
curl_setopt($handle, CURLOPT_VERBOSE, true);
curl_setopt($handle, CURLOPT_RETURNTRANSFER, true);
curl_setopt($handle, CURLOPT_STDERR, $f);
curl_exec($handle);
curl_close($handle); // close the cURL handle first so all verbose output is flushed to the file
fclose($f);
?>
