Amazon S3 Backup - PHP

In PHP, what is the best way to run a script after another one has finished? For example, I have these two files.
My backup file:
#!/bin/php -d max_execution_time=3600
<?php
// Note: the shebang above must be the very first line of the file
// for the CLI options to take effect.
require 'back-up.php';

$backup_dirs = array('../bu/');
$backup = new backupclass($backup_dirs);
$backup->backup('filesys', 'files/');
?>
Then I have my Amazon S3 upload file:
<?php
require_once('S3.php');

$s3 = new S3('S3KEY', 'S3SECRETKEY');
$baseurl = "/home/mlcreative/public_html/bu/files";

if ($handle = opendir('./files/')) {
    while (false !== ($file = readdir($handle))) {
        if ($file != "." && $file != "..") {
            // Upload the file to the bucket, then remove the local copy on success.
            if ($s3->putObjectFile("files/$file", "testingmlc333", "lighthouse/$file", S3::ACL_PUBLIC_READ)) {
                echo "<strong>We successfully uploaded your file.</strong>";
                if (file_exists($baseurl . '/' . $file)) {
                    unlink($baseurl . '/' . $file);
                }
            } else {
                echo "<strong>Something went wrong while uploading your file... sorry.</strong>";
            }
        }
    }
    closedir($handle);
}
?>
At the moment I have a cron job set up to run the backup PHP file, then an hour later run the Amazon upload PHP file.
What I am asking is how I could combine these two scripts so I only have to run one cron job instead of two.
Any help please.

You could make the backup (#1) script call the upload (#2) script via a URL when it is finished.
# Script Number 1 - Backup
/* After all the content */
file_get_contents('http://www.yourserver.com/uploadToS3.php?protector=somethingSecret');

# Script Number 2 - Uploader (available at the URL above)
/* Before all the content */
if ($_GET['protector'] != 'somethingSecret') {
    die('Access Blocked');
}
This means that, when the Backup Script finishes, it triggers the Upload Script. The protector=somethingSecret parameter is just there to prevent the uploader from being triggered accidentally.

Modify the cron job to run the second script after the first, so the command field of the cron job would look something like this:
php -f firstscript.php ; php -f secondscript.php
I use this method on many of my cronjobs.
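Note that the ; separator runs the second script regardless of whether the first one succeeded. If the upload should only run after a successful backup (exit code 0), chain the two commands with && instead:
php -f firstscript.php && php -f secondscript.php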

Related

PHP script saves empty image files when run as cron job

I have a script that downloads images from the external server and saves them in a folder at the root of the website. The script file is also in the root of the website.
folder for images: /public_html/images/
script: /public_html/script.php
When I run the file manually (example.com/script.php), all downloaded image files are saved in the folder correctly. But when the file is executed by the cron job, all the images are saved with sizes of 0 bytes.
I've tried emptying the folder before the cron job runs. I've changed the permissions to 777. The log file looks the same whether the script runs manually or as cron.
The cron job is set up in the cPanel crontab.
Please help me figure out what is going on.
$dir = "/home/example/public_html/images/";
foreach (scandir($dir) as $item) {
if ($item == '.' || $item == '..') continue;
unlink($dir.DIRECTORY_SEPARATOR.$item);
}
$ftp_server = 'www2.housescape.org.uk';
$ftp_conn = ftp_connect($ftp_server);
$ftp_user = 'user';
$ftp_pass = 'password';
ftp_set_option($ftp_conn, FTP_TIMEOUT_SEC, 3600);
if(ftp_login($ftp_conn, $ftp_user, $ftp_pass)){
ftp_pasv($ftp_conn, true);
$images = ftp_nlist($ftp_conn, '/images/');
$c = 0;
foreach($images as $image){
$c = $c + 1;
echo "ftp://user:password#www2.housescape.org.uk:21".$image." / ";
$urltoget="ftp://user:password#www2.housescape.org.uk:21".$image;
echo $thefile=basename($image);
echo "<br>";
$content = file_get_contents("ftp://user:password#www2.housescape.org.uk:21".$image);
file_put_contents("/home/example/public_html/images/".$thefile, $content);
}
if ($count1>0) { echo "No File Change"; }
ftp_close($ftp_conn);
}
else{
echo 'Failed Login!';
}
This is only a suggestion to help track down the error, not an answer:
Did you check whether the cron PHP environment has allow_url_fopen enabled?
<?php
if (!ini_get('allow_url_fopen')) {
    die("'allow_url_fopen' is not enabled in the php.ini");
}
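If allow_url_fopen does turn out to be disabled for the cron environment, a workaround sketch is to download over the FTP connection the script already opens, since ftp_get() does not depend on that setting (variable names taken from the question):
$images = ftp_nlist($ftp_conn, '/images/');
foreach ($images as $image) {
    $local = "/home/example/public_html/images/" . basename($image);
    // ftp_get() streams over the existing connection; no URL wrapper needed.
    if (!ftp_get($ftp_conn, $local, $image, FTP_BINARY)) {
        echo "Failed to download $image<br>";
    }
}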

phpseclib producing strange output

I have code which generates a text file on my server. I then need this file uploaded to another server using SFTP. To start things off, I do
if (performLdapOperations()) {
    sleep(10);
    performFtpOperation();
}
performLdapOperations produces the text file and places it on my server; performFtpOperation takes this text file and uploads it to another server. This is my function:
function performFtpOperation() {
    global $config;

    $local_directory = getcwd() . '/outputs/';
    $remote_directory = '/home/newfolder/';

    $sftp = new SFTP($config::FTP_SERVER, 22, 10);
    if (!$sftp->login($config::FTP_USER, $config::FTP_PASSWORD)) {
        exit('Login Failed');
    }

    $files_to_upload = array();

    /* Open the local directory from where you want to upload the files */
    if ($handle = opendir($local_directory)) {
        /* This is the correct way to loop over the directory. */
        while (false !== ($file = readdir($handle))) {
            if ($file != "." && $file != "..") {
                $files_to_upload[] = $file;
            }
        }
        closedir($handle);
    }

    if (!empty($files_to_upload)) {
        /* Now upload all the files to the remote server */
        foreach ($files_to_upload as $file) {
            $success = $sftp->put($remote_directory . $file,
                                  $local_directory . $file,
                                  NET_SFTP_LOCAL_FILE);
        }
    }
}
So the text file that is produced is in my outputs folder. I then want to take this file and upload it to a new server at the location /home/newfolder/.
Everything seems to work, and the file seems to get uploaded to the new server. However, when I open the uploaded file, all it contains is the path of where the file is, nothing else. The file on my server in the outputs folder contains everything; for some reason something is going wrong when sending it over SFTP.
Is there anything in my code that may be causing this?
Thanks
It looks like you're using the 2.0 version of phpseclib, which is namespaced. If that's the case then the problem is with this line:
$success = $sftp->put($remote_directory . $file,
                      $local_directory . $file,
                      NET_SFTP_LOCAL_FILE);
Try this:
$success = $sftp->put($remote_directory . $file,
                      $local_directory . $file,
                      SFTP::SOURCE_LOCAL_FILE);
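A likely explanation for the symptom: in the namespaced 2.0 release the old global constant NET_SFTP_LOCAL_FILE no longer exists, so PHP coerces the bare name to a string, put() falls back to its default string-data mode, and the remote file ends up containing the literal path text. Using the namespaced constant requires importing the class at the top of the script:
use phpseclib\Net\SFTP;  // phpseclib 2.0; brings SFTP::SOURCE_LOCAL_FILE into scope

$sftp = new SFTP($config::FTP_SERVER, 22, 10);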

Create directory (ftp_mkdir) only if files exist

I currently have a program that connects to an FTP directory; if it finds CSV files, it runs a script, and after the script has run on the files, it creates a backup folder with the date and moves the CSV files into this newly created backup folder in the FTP directory.
However, if there are no CSV files in the root directory, I do not want a backup folder to be created, as there are no files to move. I know the solution is probably really simple, but I cannot seem to figure it out!
logMessage("Creating backups");
$ftp_connection = #ftp_connect($ftp_url, $ftp_port, 6000);
if(!#ftp_login($ftp_connection, $ftp_username, $ftp_password )) {
logMessage("Could not connect to FTP: [$ftp_url], with Username: [$ftp_username], and Password: [$ftp_password]");
die();
}
$date = date('Y_m_d_(His)');
$newBackup = $ftp_root."/".$ftp_backup."backup_$date";
if (ftp_mkdir($ftp_connection, $newBackup)) {
logMessage ("Successfully created [$newBackup\n]");
foreach($filesToProcess as $file){
$pathData = pathinfo($file);
if(isset($pathData['extension']) && $pathData['extension'] == 'csv'){
if(!#ftp_rename($ftp_connection,
$ftp_root.'/'.$file,
$newBackup."/".$file)
){
logMessage("Unable to move file: $file")
}
}
}
}
You have used foreach($filesToProcess as $file), so $filesToProcess is an array of files. You can count the files first with count($filesToProcess) and only execute the code when the count is greater than 0.
// ftp_nlist returns an array of all the files in a particular directory.
$ftp_files = ftp_nlist($ftp_connection, ".");
foreach ($ftp_files as $file) {
    if (substr($file, -4) == '.csv') {
        // a CSV is present, so it is safe to create the backup directory here
    }
}
Maybe something like this, in very basic syntax.
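Putting the two ideas together against the variables from the question, a sketch (assuming $filesToProcess holds the candidate file names, as above) that filters the CSVs first and only calls ftp_mkdir when there is something to move:
// Keep only the CSV files; skip creating the backup directory if none exist.
$csvFiles = array_filter($filesToProcess, function ($file) {
    return pathinfo($file, PATHINFO_EXTENSION) == 'csv';
});

if (count($csvFiles) > 0 && ftp_mkdir($ftp_connection, $newBackup)) {
    logMessage("Successfully created [$newBackup]");
    foreach ($csvFiles as $file) {
        if (!@ftp_rename($ftp_connection, $ftp_root . '/' . $file, $newBackup . "/" . $file)) {
            logMessage("Unable to move file: $file");
        }
    }
}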

Recursively browse all server directories and list the newest created files with PHP

Very common question, but I still have not found the right solution for it. I need a cron job that starts a PHP script each morning to list all new files created on the web server during the night. This is very useful for seeing what visitors have uploaded overnight, and of course those files could be harmful ones that would hurt other visitors' computers. So far I have this:
$dir = "../root/";
$pattern = '\.*$'; // check only file with these ext.
$newstamp = 0;
$newname = "";
if ($handle = opendir($dir)) {
while (false !== ($fname = readdir($handle))) {
// Eliminate current directory, parent directory
if (ereg('^\.{1,2}$',$fname)) continue;
// Eliminate other pages not in pattern
if (! ereg($pattern,$fname)) continue;
$timedat = filemtime("$dir/$fname");
if ($timedat > $newstamp) {
$newstamp = $timedat;
$newname = $fname;
}
}
}
closedir ($handle);
// $newstamp is the time for the latest file
// $newname is the name of the latest file
// print last mod.file - format date as you like
print $newname . " - " . date( "Y/m/d", $newstamp);
This prints the newest file, but only in the one directory root/, and it doesn't check, for example, root/folder/ and so on. How can I do it recursively?
If I add a new file within root/folder, the script shows me the folder with a date, but not which file in root/folder was created. I hope you understand what I mean, thanks.
Quick script that does what you want (tested under Windows 7 with Cygwin and under Ubuntu 12.10 with PHP 5.3.10):
<?php
$path = $argv[1];
$since = strtotime('-10 second'); // use this for the previous day: '-1 day'
$ite = new RecursiveDirectoryIterator($path, FilesystemIterator::SKIP_DOTS);
foreach (new RecursiveIteratorIterator($ite) as $filename => $object) {
    if (filemtime($filename) > $since) {
        echo "$filename recently created\n";
    }
}
My quick test:
$> mkdir -p d1/d2
$> touch d1/d2/foo
$> php test.php .
./d1/d2/foo recently created
$> php test.php . # 10secs later
$>
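If the extension filter from the original $pattern is still wanted, a RegexIterator can wrap the recursive iterator; a sketch (the extension list is just an example):
$ite = new RecursiveDirectoryIterator($path, FilesystemIterator::SKIP_DOTS);
$all = new RecursiveIteratorIterator($ite);
// Filter on the pathname (the iterator key) with a regular expression.
$files = new RegexIterator($all, '/\.(php|jpg|png)$/i', RegexIterator::MATCH, RegexIterator::USE_KEY);
foreach ($files as $filename => $object) {
    if (filemtime($filename) > $since) {
        echo "$filename recently created\n";
    }
}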

PHP - check if file is finished encoding

I have a .bat file that encodes an MP3 file on the server side, and I also have a PHP function that checks whether the file exists and then adds it as an HTML list item. The problem I'm running into: sometimes the MP3 file isn't done encoding on the server side. If somebody tries downloading the file while it's still encoding, it will crash the browser.
Can I check to make sure the filesize is finished increasing before listing the item?
Here's the function that checks if the file exists:
function ListDir($dir_handle, $path) {
    global $listing;
    echo "<ul>";
    while (false !== ($file = readdir($dir_handle))) {
        $dir = $path . $file;
        if (is_dir($dir) && $file != '.' && $file != '..' && filesize($file)) {
            $handle = @opendir($dir) or die("Unable to open file $file");
            echo "<li>" . $dir;
            ListDir($handle, $dir);
            echo "</li>";
        } elseif ($file != '.' && $file != '..' && $file != '.htaccess') {
            // ereg_replace is removed in PHP 7; use preg_replace instead
            $new_string = preg_replace("/[^A-Za-z.]/", "", $file);
            echo '<li>' . str_replace('wav', 'mp3', $new_string) . '</li>';
        }
    }
    echo "</ul>";
    closedir($dir_handle);
}
Have the bat encode, then move the file to a final location for a "finished" state; if it doesn't exist there, it's not done. This is similar to drew010's answer, except it utilizes the same file, moved from a working directory to a production directory.
This also prevents the file from being accessible by any resources until it's ready, which could otherwise cause problems.
You can't really know the final filesize, so have your bat file create a marker file like mp3filename.work, and then have the bat file delete it when the encoding finishes. If the .work file doesn't exist, the encoding is done.
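On the PHP side, that check would slot into the elseif branch of the ListDir() function above; a sketch, assuming the .bat creates a "<name>.work" marker next to each file while encoding:
// Only list the track once its ".work" marker is gone (encoding finished).
// The marker name is an assumption; match whatever the .bat actually creates.
$mp3name = str_replace('wav', 'mp3', $new_string);
if (!file_exists($path . $mp3name . '.work')) {
    echo '<li>' . $mp3name . '</li>';
}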
