Queuing and image processing - PHP

Hi all, I have a quick question regarding best practices, and I could use some help with queuing and image manipulation.
I'm currently working on a website that allows the user to upload in excess of 10 files at one time. In my experience I've only really handled single uploads, or 2-3 at most; this site lets the user upload as many files as they like and then performs image manipulation to create 3 versions of each image at different sizes.
My thought process, and how I've implemented this, goes as follows.
The user goes to the upload form and selects multiple files; these are all uploaded inline, and when they have finished, the form auto-submits. The uploaded files go directly to a temporary folder in S3. I did it this way because there are multiple servers in the live environment behind a load balancer, so I was worried that if I uploaded the files to one server and then fired a queue job, it might be handled by the wrong server and not find the files. It would be great if there were a nicer way of doing this.
When the form is submitted, it fires a notification to the queue on iron.io with the data from the form submit, which basically calls the server and starts the image processing. The code for this is below:
public function fire($job, $data)
{
    set_time_limit(0);
    try {
        if (is_array($data)) {
            // loop through each file uploaded and save them
            foreach ($data['file'] as $x => $file) {
                if ($this->images->doesMediaExistInTemporaryFolder($file)) {
                    if ($new_file = $this->images->getMediaFromTemporaryS3Folder($file)) {
                        file_put_contents(app_path() . '/storage/bulk-upload/' . $file, (string) $new_file['Body']);
                        $record_date = false;
                        if ($data['date'][$x] != 'no-date') {
                            if ($new_file['ContentType'] == 'image/jpeg') {
                                $exif_data = @exif_read_data(app_path() . '/storage/bulk-upload/' . $file, 'FILE');
                            }
                            if (!empty($exif_data) && array_key_exists('DateTime', $exif_data)) {
                                $record_date = $exif_data['DateTime'];
                            } else {
                                $record_date = $data['date'][$x];
                            }
                        }
                        $created_file = new \Symfony\Component\HttpFoundation\File\UploadedFile(app_path() . '/storage/bulk-upload/' . $file, $file, $new_file['ContentType']);
                        $input = array('vehicle_objectId' => $data['vehicle_objectId'], 'type' => $data['type'], 'privacy' => $data['privacy'], 'date' => $record_date);
                        if (file_exists(app_path() . '/storage/bulk-upload/' . $file)) {
                            if ($record = $this->record_repository->save($input, $created_file)) {
                                unlink(app_path() . '/storage/bulk-upload/' . $file);
                                $this->images->deleteMediaFromTemporaryS3(array(array('Key' => $file)));
                            } else {
                                $data['filename'] = $file;
                                \Mail::send('emails.bulk-upload', $data, function($message) {
                                    $message->to('email', 'Daniel Newns')->subject('Bulk upload save issue');
                                });
                            }
                        }
                    }
                }
            }
            $parse = new \ParseRestClient();
            $user = $parse->retrieveCurrentUser($data['pid']);
            if (isset($user->email)) {
                $vehicle_url = \URL::route('vehicles.show', $data['vehicle_objectId']);
                $body = "<p>Hi " . $user->username . "</p><p>Your records have all been created. View them all as part of your vehicle record <a href='" . $vehicle_url . "'>here</a></p>";
                $message = array(
                    'to' => array(array('email' => $user->email)),
                    'from_email' => 'xxxxx',
                    'from_name' => 'xxxxx'
                );
                $template_content = array(array("name" => "share", "content" => $body));
                $response = \Email::messages()->sendTemplate('Bulk_Upload', $template_content, $message);
            }
        }
    } catch (\Exception $e) {
        $message = array(
            'to' => array(array('email' => 'email')),
            'from_email' => 'email',
            'from_name' => 'xxxxxx'
        );
        $content = '<p>' . $e->getMessage() . '</p>';
        $content .= '<p>' . $e->getTraceAsString() . '</p>';
        $template_content = array(array("name" => "share", "content" => $content));
        $response = \Email::messages()->sendTemplate('Content_Share', $template_content, $message);
    }
}
As you can see, it loops through the data returned from the queue and through each of the files. For each one it pulls the image from S3 and stores it locally, then it checks whether a date was set and works out the created date from either that or the EXIF data. It then creates the file and saves the record; the save function performs all the resizing required.
My question is really whether anyone has suggestions on how I can improve this, as I'm occasionally getting emails from the exception handler saying it can't find a certain image, as if it hasn't been created locally. Is creating the image locally with file_put_contents the approach I should be using, or is there a better way to pull the data from S3 and work with it? I've put a number of if statements in to stop things falling through the gaps.
It would be great to hear other people's thoughts on where I have gone wrong here and what I could do to improve it. Perhaps I could store an array of the files that don't exist on the first loop and then try them again afterwards, as I was thinking it might be a case of the code executing before the image exists locally. Would that be the case?
Any help would be much appreciated.
Thanks

I'm curious how you are implementing the actual queue for processing the images?
When I have needed a processing queue in the past, I created a server daemon in PHP that would check a DB for new images. Each time an image was uploaded I copied the original to a temp location and stored the name and status of the image in the DB. Status was new, processing, or complete. As soon as a server grabbed a file to process from the DB, I updated the status to processing.
I also mounted an S3 bucket on each of my machines and symlinked it to a local folder, so all files were accessible without having to download them first. The code behaves as if the file is local, even though in the background the image is being downloaded.
However, another solution that lives within AWS is SQS (Simple Queue Service). Use the S3 API together with the SQS API inside your application and you can accomplish what you're trying to do without having to build a server daemon.
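To make that concrete, here is a minimal worker sketch using the AWS SDK for PHP. The bucket name, queue URL, region, the JSON shape of the message body, and the resize() helper are all assumptions for the example, not anything from the code above.
<?php
// Sketch: poll SQS for "image uploaded" messages, pull the object from S3
// to a local temp file, process it, then delete the message.
// Bucket, queue URL, region, message format and resize() are assumptions.
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\Sqs\SqsClient;

$s3  = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);
$sqs = new SqsClient(['region' => 'us-east-1', 'version' => 'latest']);
$queueUrl = 'https://sqs.us-east-1.amazonaws.com/123456789/image-jobs';

while (true) {
    $result = $sqs->receiveMessage(['QueueUrl' => $queueUrl, 'WaitTimeSeconds' => 20]);
    foreach ((array) $result->get('Messages') as $msg) {
        $key   = json_decode($msg['Body'], true)['key']; // assumes a JSON body with a "key" field
        $local = sys_get_temp_dir() . '/' . basename($key);

        // Download the original before doing any manipulation
        $s3->getObject(['Bucket' => 'my-temp-bucket', 'Key' => $key, 'SaveAs' => $local]);

        resize($local); // your image manipulation here

        // Only remove the message once processing has succeeded
        $sqs->deleteMessage(['QueueUrl' => $queueUrl, 'ReceiptHandle' => $msg['ReceiptHandle']]);
    }
}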
I would check out this link: http://aws.amazon.com/articles/1602?_encoding=UTF8&jiveRedirect=1
They have a pretty good guide on how to do exactly what you're wanting to do using the services above. They recommend DynamoDB, but you can probably swap in any DB you are already using.
Whichever route you go, you need a DB to track files and processing status, and to keep track of your files in general. If you're worried that you sometimes run into errors because the file isn't downloaded yet, check that the file exists first; if it does, check its file size against the DB and then determine whether the file is ready to be processed. You could also run a script in Laravel by hitting a specific URL with a cron job.
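As a rough illustration of that readiness check (the $localPath variable and the size column recorded at upload time are hypothetical names for the example):
// Sketch: only process a file once it exists locally and its size
// matches what was recorded in the DB when it was uploaded.
function isReadyToProcess($path, $expectedSize)
{
    if (!file_exists($path)) {
        return false;
    }
    clearstatcache(true, $path); // make sure we read a fresh size
    return filesize($path) === (int) $expectedSize;
}

if (isReadyToProcess($localPath, $row['size_bytes'])) {
    // mark as "processing" in the DB, then resize
} else {
    // leave the status as "new" and let the next run pick it up
}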
Hope this helps!

Related

What is the proper way to monitor PHP execution in the frontend?

I will use an example to demonstrate this.
Assume I have a MySQL DB where I place paths to files to be uploaded to S3, plus a status column where each file is given either a pending or an uploaded value.
I have a PHP script, upload.php, which I can run with php upload.php and get the output logged to my terminal as the script progresses. I would like to set up a cron job that runs the script at certain intervals, say every 30 minutes, where each time the DB is queried and the files with a pending status are processed for upload.
Now, I want to be able to track the progress of the script in the frontend, regardless of its current state (even if there are currently no pending items in the DB).
While I would appreciate any specific suggestion on how to do this, my question is also about best practice - meaning, what is the proper way to do this?
Here's an example of such a script (it uses Joshcam's MysqliDb):
// Get items with a pending status
function get_items_queue() {
    global $db;
    $cols = array('id', 'filename');
    $db->where('status', 'pending');
    return $db->get('files', null, $cols);
}

// Upload items to S3
function UploadToS3($filename) {
    if (empty($filename)) {
        return false;
    }
    include_once('/s3/aws-autoloader.php');
    $s3 = new S3Client($somearray); // Some S3 credentials here
    // Print status
    echo $filename . ' is uploading';
    $uploaded = $s3->putObject($somearray); // Uploading to S3
    if ($s3->doesObjectExist($s3_bucket, $filename)) {
        // Print status
        echo $filename . ' was uploaded';
        return true; // report success so the caller can update the DB
    }
    // Print status
    echo 'There has been an issue while uploading ' . $filename;
    return false;
}

// Run the script
$queue_items = get_items_queue();
foreach ($queue_items as $key => $item) {
    $upload = UploadToS3($item['filename']);
    // Some function here that changes the status column for the uploaded item to 'uploaded'
    if ($upload) {
        set_item_queue_status($item['id']);
    }
}
I ended up setting up an installation of Cronicle by jhuckaby.
It's essentially a cron manager, but what matters most for my case is the live log viewer. It lets me run the script with a cron job at the intervals I defined and watch it execute via the log viewer, while being able to leave and come back at any point to view the currently running task (or any of the previous tasks that ran while I was away).

Upload multiple files at the same time without a duplicate in Laravel

I use this code to upload multiple files together in Laravel. However, all the file names end up duplicated. Please guide me.
if (is_array($files) || is_object($files)) {
    foreach ($files as $file) {
        $name = time() . '.' . $file->getClientOriginalExtension();
        $file->move(public_path('uploadmusic'), $name);
        PostPhoto::create([
            'post_id'  => $post->id,
            'filename' => $name
        ]);
    }
}
1568314601.png
1568314601.png
1568314601.png
The precision of time() is only one second - not enough for time() to report a different value for each file when assigning $name in your loop.
Append something else, such as a call to \Illuminate\Support\Str::random(), to make each name unique.
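For example (just a sketch; the 8-character suffix length is arbitrary):
// Combine the timestamp with a random suffix so files uploaded
// within the same second no longer collide
$name = time() . '_' . \Illuminate\Support\Str::random(8) . '.' . $file->getClientOriginalExtension();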
Depending on requirements, you might consider omitting the timestamp from the filename altogether and using something like md5_file() instead.
$name = implode('.', [
    md5_file($file->getPathname()),
    $file->getClientOriginalExtension()
]);
This can also help keep duplicate files off of your storage device.
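As a small sketch of that side effect, assuming $name was built from md5_file() as above: two byte-identical uploads hash to the same name, so the second move can be skipped while the association is still recorded.
// If an identical file was uploaded before, it already exists under this
// hash-based name, so there is no need to move it again
$destination = public_path('uploadmusic') . '/' . $name;
if (!file_exists($destination)) {
    $file->move(public_path('uploadmusic'), $name);
}

// still record the photo against the post either way
PostPhoto::create([
    'post_id'  => $post->id,
    'filename' => $name
]);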

Progress bar for Symfony Process

I have a long-running task which I have incorporated as a Process in my Symfony project. This is how I call the process:
$rootDir = $this->get('kernel')->getRootDir();
$adk_process = new Process(
    'php ../bin/console app:adkaction ' . $numcampaigns . ', ' . $timezone . ',' . $depdate);
$adk_process->setWorkingDirectory($rootDir);
$adk_process->setOptions(['suppress_errors' => false]);
$adk_process->setTimeout(null);
$adk_process->start();

while ($adk_process->isRunning()) {
    $currprogress = $adk_process->getIncrementalOutput();
    return $this->render('BackEnd/user.html.twig', [
        'form' => $form->createView(),
        'currprogress' => $currprogress
    ]);
}
My process currently does not produce any output (it parses an XML file and pushes data to the DB). When done, the currprogress variable should be pushed into my .twig template, where it will populate a progress bar.
I need to show the progress of the file parsing (i.e. how many items were processed), as the file can be up to 100k lines and the process can run for 2-3 hours.
At the moment I cannot get the incremental output from my process to push it over to my template. What would be the best way to do this?

Upload Multiple Files with FuelPHP

I currently have a script that allows me to upload 1 file to my server. It works great.
Here is a portion of the code I am using to do this:
// Custom configuration for this upload
$config = array(
    'path' => DOCROOT.DS.'foldername/tomove/your/images',
    'randomize' => true,
    'ext_whitelist' => array('img', 'jpg', 'jpeg', 'gif', 'png'),
);

Upload::process($config);

// if a valid file is passed then the function will save, or if it's not empty
if (Upload::is_valid())
{
    // save them according to the config
    Upload::save();

    // if you want to save to the database, let's grab the file name
    $value = Upload::get_files();
    $article->filename = $value[0]['saved_as'];
}
I was now wondering, how do I loop through multiple files and upload these to my server?
I'm guessing I need a foreach loop, but I'm a little out of my depth with this, I'm afraid.
Ultimately, I plan to store these filenames in a separate table on my database.
Many thanks for any help with this.
You already have the result in your code; you already store it with:
$value = Upload::get_files();
So:
$value = Upload::get_files();
foreach ($value as $files) {
    print_r($files);
}
And you will get everything you need.
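To tie that back to storing the filenames in a separate table, here is a rough sketch; Model_ArticleFile and its columns are hypothetical stand-ins for whatever FuelPHP ORM model and table you create:
// Sketch: save every uploaded file's stored name against the article.
// Model_ArticleFile and its columns are assumptions for the example.
Upload::process($config);

if (Upload::is_valid())
{
    Upload::save();

    foreach (Upload::get_files() as $file)
    {
        Model_ArticleFile::forge(array(
            'article_id' => $article->id,
            'filename'   => $file['saved_as'],
        ))->save();
    }
}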

Removing lines in PHP? Is this possible?

I have been struggling to create a simple (really simple) chat system for my website, as my knowledge of JavaScript/AJAX is limited. After gathering resources and help from many kind people, I was able to create my simple chat system, but I am left with one problem.
The messages are posted to a file called "msg.html" in this format:
<p><span id="name">$name</span><span id="Msg">$message</span></p>
And then, using PHP and AJAX, I retrieve the messages instantly from the file using the file() function and a foreach(){} loop within PHP. Here is the code:
<?php
$file = 'msg.html';
$data = file($file);
$max_lines = 20;

if (count($data) > $max_lines) {
    // here I want the data to be deleted from oldest until I only have 20 messages left
}

foreach ($data as $line_num => $line) {
    echo $line_num . " . " . $line;
}
?>
My question is: how can I delete the oldest messages so that I am left with only the latest 20 messages?
How does something like this seem to you? It drops the oldest lines so that only the newest $max_lines remain, then writes the file back:
$file = 'msg.html';
$data = file($file);
$max_lines = 20;
$total = count($data);

foreach ($data as $line_num => $line)
{
    if ($line_num < $total - $max_lines)
    {
        // older than the newest 20 messages, so drop it
        unset($data[$line_num]);
    }
    else
    {
        echo $line_num . " . " . $line;
    }
}

// file_put_contents() accepts an array and joins it back into the file
file_put_contents($file, $data);
http://www.php.net/manual/en/function.file-put-contents.php for more info :)
I suppose you can read the file, explode it into an array, chop off everything but the last 20 entries and write it back to the file, overwriting the old one. Perhaps not the best solution, but one that comes to mind if you really can't use a database as Delan suggested.
That's called round-robin, if I recall correctly.
As far as I know, you can't remove arbitrary portions of a file. You need to overwrite the file with the new contents (or create a new file and remove the old one). You could also store messages in individual files, but of course that implies up to $max_lines files to read.
You should also use flock() to avoid data corruption. Depending on the platform it's not 100% reliable, but it's better than nothing.
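As a rough illustration of the flock() idea (the 20-line limit mirrors $max_lines above; the rest is just one possible way to do it):
// Sketch: rewrite msg.html under an exclusive lock so a concurrent
// writer can't corrupt the file while we truncate and rewrite it.
$fp = fopen('msg.html', 'c+');
if ($fp !== false && flock($fp, LOCK_EX)) {
    $lines = array();
    while (($line = fgets($fp)) !== false) {
        $lines[] = $line;
    }
    $lines = array_slice($lines, -20); // keep only the newest 20 messages

    ftruncate($fp, 0);
    rewind($fp);
    fwrite($fp, implode('', $lines));
    fflush($fp);
    flock($fp, LOCK_UN);
    fclose($fp);
}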