I have a script that downloads images from an external server and saves them in a folder at the root of the website. The script file is also in the root of the website.
Folder for images: /public_html/images/
Script: /public_html/script.php
When I run the file manually (example.com/script.php), all downloaded image files are saved in the folder correctly. But when the file is executed by the cron job, all the images are saved with a size of 0 bytes.
I've tried emptying the folder before the cron job runs and changing the folder permissions to 777. The log file looks the same whether the script runs manually or via cron.
The cron job is set up in the cPanel crontab.
Please help me figure out what is going on.
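For reference, a minimal diagnostic sketch (the log path and the inspected settings are illustrative assumptions) that can be run both from the browser and from cron, so the two PHP environments can be compared side by side:
<?php
// Hypothetical helper: dump key environment details to a per-SAPI log file,
// then diff the cron log (e.g. env_cli.log) against the browser log.
file_put_contents('/home/example/public_html/env_' . php_sapi_name() . '.log',
    print_r(array(
        'sapi'            => php_sapi_name(),
        'php_version'     => PHP_VERSION,
        'allow_url_fopen' => ini_get('allow_url_fopen'),
        'loaded_ini'      => php_ini_loaded_file(),
        'cwd'             => getcwd(),
    ), true)
);
If the two logs differ on allow_url_fopen or on the loaded php.ini, that would explain the symptom: file_get_contents() on an ftp:// URL returns false when allow_url_fopen is off, and file_put_contents() then writes a 0-byte file.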
$dir = "/home/example/public_html/images/";
foreach (scandir($dir) as $item) {
if ($item == '.' || $item == '..') continue;
unlink($dir.DIRECTORY_SEPARATOR.$item);
}
$ftp_server = 'www2.housescape.org.uk';
$ftp_conn = ftp_connect($ftp_server);
$ftp_user = 'user';
$ftp_pass = 'password';
ftp_set_option($ftp_conn, FTP_TIMEOUT_SEC, 3600);
if(ftp_login($ftp_conn, $ftp_user, $ftp_pass)){
ftp_pasv($ftp_conn, true);
$images = ftp_nlist($ftp_conn, '/images/');
$c = 0;
foreach($images as $image){
$c = $c + 1;
echo "ftp://user:password#www2.housescape.org.uk:21".$image." / ";
$urltoget="ftp://user:password#www2.housescape.org.uk:21".$image;
echo $thefile=basename($image);
echo "<br>";
$content = file_get_contents("ftp://user:password#www2.housescape.org.uk:21".$image);
file_put_contents("/home/example/public_html/images/".$thefile, $content);
}
if ($count1>0) { echo "No File Change"; }
ftp_close($ftp_conn);
}
else{
echo 'Failed Login!';
}
This is only a suggestion to help track down the error, not an answer:
Did you check whether the cron PHP environment has allow_url_fopen enabled?
<?php
if (!ini_get('allow_url_fopen')) {
    die("'allow_url_fopen' is not enabled in the php.ini");
}
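If allow_url_fopen does turn out to be disabled for the cron PHP, a minimal alternative sketch (untested, reusing the connection variables from the question) is to download over the already-open FTP connection with ftp_get(), which does not depend on allow_url_fopen at all:
// Hypothetical replacement for the file_get_contents() loop above
foreach ($images as $image) {
    $local = "/home/example/public_html/images/" . basename($image);
    // ftp_get() writes the remote file straight to a local path
    if (!ftp_get($ftp_conn, $local, $image, FTP_BINARY)) {
        echo "Failed to download $image<br>";
    }
}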
When moving our site to a new host we went from PHP 5 (I think) to PHP 7. We also added SSL to the site for the first time. Ever since moving the site, a function that copies image files to an FTP server has been failing randomly.
After doing some research I learned that there is no way to get an error message more detailed than "ftp_put has failed".
$dir = 'path/to/folder';
$a = scandir($dir);

$ftp_server = "ftp.server.com";
$ftp_conn = ftp_connect($ftp_server);
$ftp_username = 'myuser';
$ftp_userpass = 'mypass';
$login = ftp_login($ftp_conn, $ftp_username, $ftp_userpass);
ftp_pasv($ftp_conn, true);

foreach ($a as $value) {
    if (strlen($value) > 4) {
        $file = $dir . $value;
        $name = $value;
        if (ftp_put($ftp_conn, $name, $file, FTP_BINARY)) {
            echo "<br><br><span style='color: green'>Successfully uploaded $file.</span><br><br>";
        } else {
            echo "<br><br><span style='color: red'>Error uploading $file.</span><br><br>";
        }
    }
}
The output from the code above is:
Successfully uploaded ../../img/bil/AAA123/AAA123-1.jpg.
Successfully uploaded ../../img/bil/AAA123/AAA123-2.jpg.
Successfully uploaded ../../img/bil/AAA123/AAA123-3.jpg.
Error uploading ../../img/bil/AAA123/AAA123-4.jpg.
Error uploading ../../img/bil/AAA123/AAA123-5.jpg.
Successfully uploaded ../../img/bil/AAA123/AAA123-6.jpg.
Error uploading ../../img/bil/AAA123/AAA123-7.jpg.
Successfully uploaded ../../img/bil/AAA123/AAA123-8.jpg.
The output differs between runs; running it again will successfully upload some other images and fail on others.
I have tried stripping down the code, removing the scandir and foreach parts and using a direct path to one image file as $file with the same result.
I have no idea what could be wrong. I suspect moving to PHP 7, and possibly SSL, is the problem, since this all started then. Not being able to get a detailed error message about why ftp_put fails leaves me completely stuck.
Is there anything I can do to find out what's wrong?
Edit: adding error_reporting(-1) and printing out error_get_last() gives me this:
Array
(
    [type] => 2
    [message] => ftp_put(): Type set to I
    [file] => path/to/file.php
    [line] => 51
)
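For reference, a minimal sketch (assuming the $ftp_conn, $name and $file variables from the question) that captures every warning raised while ftp_put() runs; the last captured warning usually contains the server's actual response rather than just "Type set to I":
<?php
// Hypothetical diagnostic: collect all warnings emitted during the transfer
$ftpWarnings = array();
set_error_handler(function ($errno, $errstr) use (&$ftpWarnings) {
    $ftpWarnings[] = $errstr;
    return true; // swallow the default warning output
});
$ok = ftp_put($ftp_conn, $name, $file, FTP_BINARY);
restore_error_handler();
if (!$ok) {
    echo "ftp_put failed: " . implode(' | ', $ftpWarnings);
}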
Any network communication can fail.
Uploading a large number of files via FTP without zipping them up first is a good way to ensure some of your uploads will fail. If you don't want to zip them into a single transfer, a good alternative is to retry failed uploads, adding a delay between retries, but don't make it an infinite loop. Retry 3 times; if it still fails then your problem is bigger than minor network issues or FTP server bugs, and you are better off skipping the file and trying the next. Also, don't forget to close any connection you open.
$dir = 'path/to/folder';
$a = scandir($dir);

foreach ($a as $value) {
    if (strlen($value) > 4) {
        for ($retry = 0; $retry < 3; $retry++) {
            // reconnect on every attempt so a dropped session doesn't poison the retry
            $ftp_server = "ftp.server.com";
            $ftp_conn = ftp_connect($ftp_server);
            $ftp_username = 'myuser';
            $ftp_userpass = 'mypass';
            $login = ftp_login($ftp_conn, $ftp_username, $ftp_userpass);
            ftp_pasv($ftp_conn, true);

            $file = $dir . $value;
            $name = $value;
            $uploaded = ftp_put($ftp_conn, $name, $file, FTP_BINARY);
            ftp_close($ftp_conn); // always close, even on the attempt that succeeds

            if ($uploaded) {
                echo "<br><br><span style='color: green'>Successfully uploaded $file.</span><br><br>";
                break;
            } elseif ($retry < 2) {
                echo "<br><br><span style='color: red'>Error uploading $file. Will retry...</span><br><br>";
                sleep(2);
            } else {
                echo "<br><br><span style='color: red'>Error uploading $file.</span><br><br>";
            }
        }
    }
}
I am uploading files to a server using PHP, and while the move_uploaded_file function returns no errors, the file is not in the destination folder. As you can see, I am using the exact path from root, and the files being uploaded are smaller than the max size.
$target = "/data/array1/users/ultimate/public_html/Uploads/2010/";
//Write the info to the bioHold xml file.
$xml = new DOMDocument();
$xml->load('bioHold.xml');
$xml->formatOutput = true;
$root = $xml->firstChild;
$player = $xml->createElement("player");
$image = $xml->createElement("image");
$image->setAttribute("loc", $target.basename($_FILES['image']['name']));
$player->appendChild($image);
$name = $xml->createElement("name", $_POST['name']);
$player->appendChild($name);
$number = $xml->createElement("number", $_POST['number']);
$player->appendChild($number);
$ghettoYear = $xml->createElement("ghettoYear", $_POST['ghetto']);
$player->appendChild($ghettoYear);
$schoolYear = $xml->createElement("schoolYear", $_POST['school']);
$player->appendChild($schoolYear);
$bio = $xml->createElement("bio", $_POST['bio']);
$player->appendChild($bio);
$root->appendChild($player);
$xml->save("bioHold.xml");
//Save the image to the server.
$target = $target.basename($_FILES['image']['name']);
if(is_uploaded_file($_FILES['image']['tmp_name']))
echo 'It is a file <br />';
if(!(move_uploaded_file($_FILES['image']['tmp_name'], $target))) {
echo $_FILES['image']['error']."<br />";
}
else {
echo $_FILES['image']['error']."<br />";
echo $target;
}
Any help is appreciated.
Eric R.
Most likely this is a permissions issue. I'm going to assume you don't have any kind of direct shell access to check this stuff directly, so here's how to do it from within the script:
Check if the $target directory exists:
$target = '/data/etc....';
if (!is_dir($target)) {
    die("Directory $target is not a directory");
}
Check if it's writeable:
if (!is_writable($target)) {
    die("Directory $target is not writeable");
}
Check if the full target filename is writable - maybe the file already exists but can't be overwritten. (Note that is_writable() alone returns false for a file that doesn't exist yet, so check for existence first:)
$target = $target . basename($_FILES['image']['name']);
if (file_exists($target) && !is_writable($target)) {
    die("File $target isn't writeable");
}
Beyond that:
if (!(move_uploaded_file($_FILES['image']['tmp_name'], $target))) {
    echo $_FILES['image']['error'] . "<br />";
}
Echoing out the error parameter here is of no use; it refers purely to the upload process. If the file was uploaded correctly but could not be moved, this will still only echo out a 0 (i.e. the UPLOAD_ERR_OK constant). The proper way of checking for errors goes something like this:
if ($_FILES['image']['error'] === UPLOAD_ERR_OK) {
    // file was properly uploaded
    if (!is_uploaded_file(...)) {
        die("Something done goofed - not uploaded file");
    }
    if (!move_uploaded_file(...)) {
        echo "Couldn't move file, possible diagnostic information:";
        print_r(error_get_last());
        die();
    }
} else {
    die("Upload failed with error {$_FILES['image']['error']}");
}
You need to make sure that whoever is hosting your pages has the settings configured to allow you to upload and move files. Many hosts will disable these functions as they're a security risk.
Just email them and ask whether they are enabled.
Hope this helps.
Your calls to is_uploaded_file and move_uploaded_file vary: for is_uploaded_file you are checking 'name', while for move_uploaded_file you are passing in 'tmp_name'. Try changing your call to move_uploaded_file to use 'name'.
I have a user folder on a remote server (a different server from the one with the page files). I need to check the size of the whole "example" folder, not just one file. I think I should do it over FTP, but I can't get it to work.
I have something like this, but it's not working:
function dirFTPSize($ftpStream, $dir) {
    $size = 0;
    $files = ftp_nlist($ftpStream, $dir);
    foreach ($files as $remoteFile) {
        // skip the . and .. entries
        if (preg_match('/.*\/\.\.$/', $remoteFile) || preg_match('/.*\/\.$/', $remoteFile)) {
            continue;
        }
        $sizeTemp = ftp_size($ftpStream, $remoteFile);
        if ($sizeTemp > 0) {
            $size += $sizeTemp;
        } elseif ($sizeTemp == -1) { // directory
            $size += dirFTPSize($ftpStream, $remoteFile);
        }
    }
    return $size;
}
$hostname = '127.0.0.1';
$username = 'username';
$password = 'password';
$startdir = '/public_html'; // absolute path

$ftpStream = ftp_connect($hostname);
if (!$ftpStream) {
    echo 'Wrong server!';
    exit;
}

$login = ftp_login($ftpStream, $username, $password);
if (!$login) {
    echo 'Wrong username/password!';
    exit;
}

$size = dirFTPSize($ftpStream, $startdir);

echo number_format(($size / 1024 / 1024), 2, '.', '') . ' MB';
ftp_close($ftpStream);
The script displays 0.00 MB every time; what can I do to fix it?
In your comments you indicated you have SSH access on the remote server. Great!
Here is a way to use SSH:
//connect to remote server (hostname, port)
$connection = ssh2_connect('www.example.com', 22);
//authenticate
ssh2_auth_password($connection, 'username', 'password');
//execute remote command (replace /path/to/directory with absolute path)
$stream = ssh2_exec($connection, 'du -s /path/to/directory');
stream_set_blocking($stream, true);
//get the output
$dirSize = stream_get_contents($stream);
//show the output and close the connection
echo $dirSize;
fclose($stream);
This will echo 123456 /path/to/directory, where 123456 is the calculated size of the directory's contents. If you need it human-readable, you could use 'du -ch /path/to/directory | grep total' as the command; this will output a formatted size (k, M or G).
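A small follow-up sketch (assuming plain 'du -s', which on most systems reports sizes in 1 KiB blocks) for turning the raw output into megabytes on the PHP side:
// Hypothetical: $dirSize holds e.g. "123456   /path/to/directory"
list($kib) = sscanf($dirSize, '%d');
echo number_format($kib / 1024, 2) . ' MB';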
If you get an error "undefined function ssh2_connect()", you need to install/enable the PHP ssh2 module on your local machine.
Another way, without SSH, is to run the command on the remote machine itself.
Create a new file on the remote server, e.g. called 'dirsize.php', with the following code:
<?php
$path = '/path/to/directory';
$output = exec('du -s ' . $path);
echo trim(str_replace($path, '', $output));
(or any other PHP code that can determine the size of a local directory's contents)
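As one example of "any other PHP code", a minimal pure-PHP sketch that needs no shell access (the path is an illustrative assumption):
<?php
// Hypothetical alternative body for dirsize.php: sum file sizes recursively
$bytes = 0;
$it = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator('/path/to/directory', FilesystemIterator::SKIP_DOTS)
);
foreach ($it as $fileInfo) {
    if ($fileInfo->isFile()) {
        $bytes += $fileInfo->getSize();
    }
}
echo $bytes;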
And on your local machine include in your code:
$dirsize = file_get_contents('http://www.example.com/dirsize.php');
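Note that anyone who discovers dirsize.php can make your server run du, so it may be worth guarding it with a shared secret, along the lines of this hypothetical sketch (the token value is an assumption):
// At the top of dirsize.php
if (!isset($_GET['token']) || $_GET['token'] !== 'somethingSecret') {
    http_response_code(403);
    exit('Forbidden');
}
The local call then becomes file_get_contents('http://www.example.com/dirsize.php?token=somethingSecret').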
I currently have a program that connects to an FTP directory. If it finds CSV files, it runs a script on them; after the script has run, it creates a backup folder named with the date and moves the CSV files into this newly created backup folder in the FTP directory.
However, if there are no CSV files in the root directory, I do not want a backup folder to be created, as there are no files to move. I know the solution is probably really simple, but I cannot seem to figure it out!
logMessage("Creating backups");
$ftp_connection = #ftp_connect($ftp_url, $ftp_port, 6000);
if(!#ftp_login($ftp_connection, $ftp_username, $ftp_password )) {
logMessage("Could not connect to FTP: [$ftp_url], with Username: [$ftp_username], and Password: [$ftp_password]");
die();
}
$date = date('Y_m_d_(His)');
$newBackup = $ftp_root."/".$ftp_backup."backup_$date";
if (ftp_mkdir($ftp_connection, $newBackup)) {
logMessage ("Successfully created [$newBackup\n]");
foreach($filesToProcess as $file){
$pathData = pathinfo($file);
if(isset($pathData['extension']) && $pathData['extension'] == 'csv'){
if(!#ftp_rename($ftp_connection,
$ftp_root.'/'.$file,
$newBackup."/".$file)
){
logMessage("Unable to move file: $file")
}
}
}
}
You have used foreach ($filesToProcess as $file), so $filesToProcess is an array of files. You can count the files with count($filesToProcess) first, then only execute the backup code if the count is greater than 0, as sketched below.
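A minimal sketch of that idea (reusing the variables from the question; the CSV filter duplicates the extension check from the move loop):
// Only create the backup folder when there is at least one .csv to move
$csvFiles = array_filter($filesToProcess, function ($file) {
    return pathinfo($file, PATHINFO_EXTENSION) === 'csv';
});
if (count($csvFiles) > 0) {
    // ftp_mkdir() and the ftp_rename() loop from the question go here
}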
// $csv = your check for a filename ending in .csv
$ftp_files = ftp_nlist($ftp_connection, ".");
foreach ($ftp_files as $file) {
    if (substr($file, -4) === '.csv') {
        // make the backup directory here
    }
}
Maybe something like this, in very basic syntax. ftp_nlist returns an array of all files in a particular directory.
In PHP, what is the best way to run a script after another one has finished? For example, I have these two files.
My backup file.
<?php
require 'back-up.php';
#!/bin/php -d max_execution_time = 3600
$backup_dirs = array('../bu/');
$backup = new backupclass($backup_dirs);
$backup->backup('filesys','files/');
?>
Then I have my Amazon S3 upload file.
<?php
require_once('S3.php');

$s3 = new S3('S3KEY', 'S3SECRETKEY');
$baseurl = "/home/mlcreative/public_html/bu/files";

if ($handle = opendir('./files/')) {
    while (false !== ($file = readdir($handle))) {
        if ($file != "." && $file != "..") {
            if ($s3->putObjectFile("files/$file", "testingmlc333", "lighthouse/$file", S3::ACL_PUBLIC_READ)) {
                echo "<strong>We successfully uploaded your file.</strong>";
                if (file_exists($baseurl . '/' . $file)) {
                    unlink($baseurl . '/' . $file);
                }
            } else {
                echo "<strong>Something went wrong while uploading your file... sorry.</strong>";
            }
        }
    }
    closedir($handle);
}
?>
OK, so at the moment I have a cron job set up to run the backup PHP file and then, an hour later, run the Amazon upload PHP file.
What I am asking is how I could combine these two scripts so I only have to run one cron job instead of two.
Any help please.
You could make the backup (#1) script call the upload (#2) script via a URL when it is finished.
# Script Number 1 - Backup
/* After all the content */
@file_get_contents('http://www.yourserver.com/uploadToS3.php?protector=somethingSecret');

# Script Number 2 - Uploader (available at the URL above)
/* Before all the content */
if ($_GET['protector'] != 'somethingSecret') {
    die('Access Blocked');
}
This means that, when the Backup Script finishes, it triggers the Upload Script. The protector=somethingSecret is just there to prevent it from being accidentally triggered.
Modify the cron job to run the second script after the first script, so the command field of the cron job would look something like this:
php -f firstscript.php ; php -f secondscript.php
I use this method on many of my cronjobs.
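A side note (standard shell behaviour, not specific to cron): using && instead of ; , as in php -f firstscript.php && php -f secondscript.php, only runs the upload script when the backup script exits successfully.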