Queuing multiple file uploads in PHP

I'm sorry if this has been asked before; I couldn't find the right keywords to search for on Google, so I decided to ask here on SO.
Right now I have an upload function built in Laravel that can upload up to 20 images at a time, but the client would like it to handle 50 images at a time. I don't want to edit php.ini every time the client wants to raise the maximum number of simultaneous uploads, because it might break the server.
Is it possible in PHP to upload in a queue, e.g. 10 images in one batch, then the next 10, and so on until all the uploads are done, so the server won't break?
foreach ($request->file('image') as $key => $file)
{
    $filename = md5(time() . uniqid()) . '.' . $file->getClientOriginalExtension();

    // Only accept 533x800 or 800x533 images
    $imagesize = getimagesize($file);
    if (($imagesize[0] == 533 && $imagesize[1] == 800) == false &&
        ($imagesize[0] == 800 && $imagesize[1] == 533) == false
    ) {
        $error++;
        continue;
    }

    $file->move('uploads', $filename);
    $data['image'] = url('uploads/' . $filename);

    $order_id = 1;
    $count = PropertyImages::where('property_id', $id)->count();
    if ($count > 23)
    {
        return response()->json(['success' => false, 'msg' => 'Images must not exceed 24']);
    }

    // Append after the current highest order_id
    $image = PropertyImages::where('property_id', $id)->orderBy('order_id', 'DESC')->first();
    if ($image)
    {
        $order_id = $image->order_id + 1;
    }

    $item = PropertyImages::create([
        'property_id' => $id,
        'filename'    => $filename,
        'order_id'    => $order_id
    ]);

    $items[$key]['id']       = $item->id;
    $items[$key]['filename'] = url('uploads/' . $item->filename);
}

I am not sure what setup you are using for uploading images, so I assume you have a plain, simple form with 20 file input fields.
php.ini limits the size and shape of a request (e.g. post_max_size = 20M caps the request body at 20 MB, max_input_vars = 20 caps the number of input variables, and max_file_uploads caps the number of uploaded files per request), so it all depends on the limits set in your php.ini.
So if your limits allow 20 images per POST request, sending 50 images to the server will take 3 requests (20 per request at most, and 10 in the last one).
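To see what limits you are actually working with, you can dump the relevant directives at runtime; a small sketch (plain PHP, nothing Laravel-specific):
// Inspect the limits your php.ini currently imposes on an upload request
$limits = array(
    'max_file_uploads'    => ini_get('max_file_uploads'),    // max number of files per request
    'max_input_vars'      => ini_get('max_input_vars'),      // max number of input variables per request
    'post_max_size'       => ini_get('post_max_size'),       // max size of the whole POST body
    'upload_max_filesize' => ini_get('upload_max_filesize'), // max size of a single uploaded file
);
var_dump($limits);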
Now, a plain form on its own can't do the job, because clicking the submit button submits the form only once. You will have to submit the form using JavaScript (or jQuery) instead; with JS you can split the images across multiple AJAX requests.
Here is a jQuery plugin that can do the job very efficiently
https://github.com/zimt28/laravel-jquery-file-upload
Little help for using it with Laravel
https://laracasts.com/discuss/channels/general-discussion/jquery-file-upload-with-laravel
Happy Coding!!

Related

php: time limit exceeded `Success' # fatal/cache.c/GetImagePixelCache/2042

I'm importing data from a CRM server via JSON into WordPress.
I know the load may take several minutes, so the script runs outside WordPress and I execute it with "php load_data.php".
But when the script reaches the part where it uploads the images, it throws an error:
php: time limit exceeded `Success' # fatal/cache.c/GetImagePixelCache/2042.
and it stops.
This is my code to upload image to media:
<?php
function upload_image_to_media($postid, $image_url, $set_featured = 0) {
    $tmp = download_url( $image_url );

    // fix filename for query strings
    preg_match( '/[^\?]+\.(jpg|jpe|jpeg|gif|png)/i', $image_url, $matches );
    $before_name = $postid == 0 ? 'upload' : $postid;
    $file_array = array(
        'name'     => $before_name . '_' . basename( $matches[0] ),
        'tmp_name' => $tmp
    );

    // Check for download errors
    if ( is_wp_error( $tmp ) ) {
        #unlink( $file_array['tmp_name'] );
        return false;
    }

    $media_id = media_handle_sideload( $file_array, $postid );

    // Check for handle sideload errors.
    if ( is_wp_error( $media_id ) ) {
        #unlink( $file_array['tmp_name'] );
        return false;
    }

    if ( $postid != 0 && $set_featured == 1 )
        set_post_thumbnail( $postid, $media_id );

    return $media_id;
}
?>
There are about 50 posts and each one has 10 large images.
Regards
The default execution time is 30 seconds, so it looks like you are exceeding that. We have a similar script that downloads up to a couple of thousand photos per run; adding set_time_limit(60) to reset the timer on each loop iteration fixed the timeout issues. In your case you can probably just add it at the beginning of the function. Just be very careful you don't create any infinite loops, as they will run forever (or until the next reboot).
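For illustration, a minimal sketch of that per-iteration reset, assuming a hypothetical outer loop over the imported posts and their image URLs (the variable names here are made up, not from your script):
foreach ($posts as $post) {                          // $posts: hypothetical array built from the CRM JSON
    foreach ($post['image_urls'] as $image_url) {
        set_time_limit(60);                          // reset the timer before each download
        upload_image_to_media($post['id'], $image_url, 0);
    }
}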
To make sure it works, you can add the line below as the first line inside your upload function:
set_time_limit(0);
This will allow the script to run until it's finished, but watch it, as it also lets a runaway script run forever, which WILL hurt your server's available memory. To see whether the script works, put that in there, then adjust it to a sensible time limit if need be.
If you get another error, or the same one, it will at least verify that it's not a time issue (error messages are not always accurate).
The other possibility is that you are on a shared server and are exceeding its time allotment for your account (continuous processor use for more than 30 seconds, for example).

Rename uploaded image in numeric sequence (i + 1)

I have a JSON post that uploads photos. It works just fine. I want to change the name of the uploaded file as it is uploaded. I know how to do that. I want to add a number to the end of the file name before the image type extension. I know how to do that. I want the number to increment by one for each new file when I upload multiple images at once. I can't do that :-( Here is what I'm working with:
if (!file_exists($vendimagepath)) {
    mkdir($vendimagepath, 0777, TRUE);
}
$valid_extensions = array('gif', 'png', 'jpeg', 'jpg');
$uploader = new FileUpload('uploadfile');

// Handle the upload
$result = $uploader->handleUpload($vendimagepath);
if (!$result) {
    exit(json_encode(array('success' => false, 'msg' => $uploader->getErrorMsg())));
} else {
    echo json_encode(array('success' => true));
    $_SESSION['success'] = true;
    $path = $uploader->getFileName();
    $vendimagepath = $vendimagepath . $path;
    $result = $db->query("INSERT into vendimages (vendregid, vendimagepath) VALUES ('$vendredig', '$path')");
    $result = $db->update("UPDATE registervendors SET images='1' WHERE regid = '$vendredig' AND username='$vendusername' ");
}
I inserted a variable under the if statement at the top ($x = 1;), and under $uploader = new FileUpload(...) I added a while loop with $x++; and put its closing brace at the end of the script. It didn't work: it uploaded the files, but they all ended up with the same number (1). I know why. The script is called once for each uploaded file, so $x = 1 restarts each time and therefore $x++ is 1 each time.
Since you want to count across page loads, you should use your $_SESSION. Before the uploads start set:
$_SESSION['upload_index'] = 1;
Then do $_SESSION['upload_index']++ each time you get a new uploaded file.
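As a rough sketch (reusing the $uploader from your code; the rename step and the other variable names are assumptions, not your actual logic):
session_start();
if (!isset($_SESSION['upload_index'])) {
    $_SESSION['upload_index'] = 1;                   // set this before the uploads start
}

$x       = $_SESSION['upload_index']++;              // number for this file; the next request gets $x + 1
$info    = pathinfo($uploader->getFileName());       // $uploader comes from the question's code
$newname = $info['filename'] . '_' . $x . '.' . $info['extension'];
// hypothetical: move/rename the stored file to $newname and insert $newname into vendimages instead of $path
You would also want to reset $_SESSION['upload_index'] = 1 at the start of each new batch, e.g. on the page that renders the upload form, otherwise the numbering continues from the previous batch.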

Queuing and image processing

Hi all, I just have a quick question regarding best practices, and perhaps some help, on queuing and image manipulation.
I'm currently working on a website that allows the user to upload in excess of 10 files at one time. In my experience I've only really handled single uploads, or 2-3 max; this site lets the user upload as many as they like and then performs image manipulation to create 3 versions of each image at different sizes.
My thought process, and how I've implemented this, goes as follows.
The user goes to the upload form and selects multiple files; these are all uploaded inline, and when they have finished the form auto-submits. The uploaded files go directly to a temporary folder in S3. I did this because there are multiple servers in the live environment behind a load balancer, so I was worried that if I uploaded them all to one server and then fired a queue job, it might go to the wrong server and not find the files. It would be great if there were a nicer way of doing this.
When the form is submitted it fires a notification to the queue on iron.io with the data from the form submit, which calls back to the server and starts the processing of the images. The code for this is below:
public function fire($job, $data)
{
    set_time_limit(0);

    try {
        if (is_array($data)) {
            foreach ($data['file'] as $x => $file) { // loop through each file uploaded and now save them
                if ($this->images->doesMediaExistInTemporaryFolder($file)) {
                    if ($new_file = $this->images->getMediaFromTemporaryS3Folder($file)) {
                        file_put_contents(app_path() . '/storage/bulk-upload/' . $file, (string) $new_file['Body']);

                        $record_date = false;
                        if ($data['date'][$x] != 'no-date') {
                            if ($new_file['ContentType'] == 'image/jpeg') {
                                $exif_data = @exif_read_data(app_path() . '/storage/bulk-upload/' . $file, 'FILE');
                            }
                            if (!empty($exif_data) && @array_key_exists('DateTime', $exif_data)) {
                                $record_date = $exif_data['DateTime'];
                            } else {
                                $record_date = $data['date'][$x];
                            }
                        }

                        $created_file = new \Symfony\Component\HttpFoundation\File\UploadedFile(app_path() . '/storage/bulk-upload/' . $file, $file, $new_file['ContentType']);
                        $input = array('vehicle_objectId' => $data['vehicle_objectId'], 'type' => $data['type'], 'privacy' => $data['privacy'], 'date' => $record_date);

                        if (file_exists(app_path() . '/storage/bulk-upload/' . $file)) {
                            if ($record = $this->record_repository->save($input, $created_file)) {
                                unlink(app_path() . '/storage/bulk-upload/' . $file);
                                $this->images->deleteMediaFromTemporaryS3(array(array('Key' => $file)));
                            } else {
                                $data['filename'] = $file;
                                \Mail::send('emails.bulk-upload', $data, function($message) {
                                    $message->to('email', 'Daniel Newns')->subject('Bulk upload save issue');
                                });
                            }
                        }
                    }
                }
            }

            $parse = new \ParseRestClient();
            $user = $parse->retrieveCurrentUser($data['pid']);
            if (isset($user->email)) {
                $vehicle_url = \URL::route('vehicles.show', $data['vehicle_objectId']);
                $body = "<p>Hi " . $user->username . "</p><p>Your records have all been created. View them all as part of your vehicle record <a href='" . $vehicle_url . "'>here</a></p>";
                $message = array(
                    'to' => array(array('email' => $user->email)),
                    'from_email' => 'xxxxx',
                    'from_name' => 'xxxxx'
                );
                $template_content = array(array("name" => "share", "content" => $body));
                $response = \Email::messages()->sendTemplate('Bulk_Upload', $template_content, $message);
            }
        }
    } catch (\Exception $e) {
        $message = array(
            'to' => array(array('email' => 'email')),
            'from_email' => 'email',
            'from_name' => 'xxxxxx'
        );
        $content = '<p>' . $e->getMessage() . '</p>';
        $content .= '<p>' . $e->getTraceAsString() . '</p>';
        $template_content = array(array("name" => "share", "content" => $content));
        $response = \Email::messages()->sendTemplate('Content_Share', $template_content, $message);
    }
}
As you can see, it loops through the data returned from the queue and through the files. From there it pulls each image from S3 and stores it locally, then checks whether a date is set and works out the created date from either that or the EXIF data. It then creates the file and saves the record; the save function performs all the resizing required.
My question is really whether anyone has suggestions on how I can improve this, as I'm occasionally getting emails from the exception handler saying that it can't find a certain image, as if it hasn't been created locally. Is my method of creating the image locally using file_put_contents the one I should be using, or is there a better way to pull the data from S3 and work with it? I've put a number of if statements in to stop things falling through the gaps, etc.
It would be great to hear other people's thoughts on where I have gone wrong here and what I could do to improve this. Perhaps I could store an array of the files that don't exist on the first loop and then try them again afterwards? I was thinking it might be a case of the code executing before the image exists; would that be the case?
Any help would be much appreciated.
Thanks
I'm curious how you are implementing the actual queue for processing the images.
When I have needed a processing queue in the past, I created a server daemon in PHP that would check a DB for new images. Each time an image was uploaded, I copied the original to a temp location and stored the name and status of the image in the DB. The status was one of new, processing, or complete. As soon as a server grabbed a file to process from the DB, I updated its status to processing, as sketched below.
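A minimal sketch of that claim step, assuming a hypothetical images table with id, filename and status columns (this is not the poster's actual schema):
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');   // hypothetical connection details

// Grab the oldest "new" image and mark it as "processing" so no other worker picks it up
$pdo->beginTransaction();
$stmt = $pdo->query("SELECT id, filename FROM images WHERE status = 'new' ORDER BY id LIMIT 1 FOR UPDATE");
$row  = $stmt ? $stmt->fetch(PDO::FETCH_ASSOC) : false;
if ($row) {
    $claim = $pdo->prepare("UPDATE images SET status = 'processing' WHERE id = ?");
    $claim->execute(array($row['id']));
}
$pdo->commit();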
I also mounted the S3 bucket on each of my machines and symlinked it to a local folder, so all files were accessible without having to download them first. The code behaves as if the file is local, even though in the background the image is being fetched.
However, another solution that lives inside AWS is SQS (Simple Queue Service). Use the S3 API together with the SQS API inside your application and you can accomplish what you're trying to do without building a server daemon; a rough worker sketch follows.
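For illustration only (not the poster's code), a minimal worker loop with the AWS SDK for PHP, assuming the queue messages carry the S3 key of an uploaded file; the queue URL and region are made up:
require 'vendor/autoload.php';

use Aws\Sqs\SqsClient;

$sqs = new SqsClient(array('region' => 'us-east-1', 'version' => '2012-11-05'));
$queueUrl = 'https://sqs.us-east-1.amazonaws.com/123456789012/image-jobs'; // hypothetical queue

while (true) {
    $result = $sqs->receiveMessage(array(
        'QueueUrl'            => $queueUrl,
        'MaxNumberOfMessages' => 1,
        'WaitTimeSeconds'     => 20, // long polling
    ));

    foreach ((array) $result->get('Messages') as $message) {
        $job = json_decode($message['Body'], true);   // e.g. array('key' => 'temp/abc.jpg')
        // ... fetch the object from S3, resize it and save the record here ...

        // Delete the message only after the image has been processed successfully
        $sqs->deleteMessage(array(
            'QueueUrl'      => $queueUrl,
            'ReceiptHandle' => $message['ReceiptHandle'],
        ));
    }
}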
I would check out this link: http://aws.amazon.com/articles/1602?_encoding=UTF8&jiveRedirect=1
They have a pretty good guide on how to do exactly what you're wanting to do using the services above. They recommend DynamoDB, but you can probably swap in whatever DB you are already using.
Whichever route you go, you need a DB to track the files and their processing status, and to keep track of your files in general. If you are worried that you sometimes hit errors because a file isn't fully downloaded yet, I would check that the file exists first; if it does, check its file size against the DB, and only then decide whether it is ready to be processed (see the sketch below). You could also run a script in Laravel by hitting a specific URL with a cron job.
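For example, a rough sketch of that readiness check inside the existing loop (the stored $fileSizes lookup and the retry list are assumptions, not part of the poster's code):
clearstatcache();                                            // make sure filesize() isn't using a stale value
$localPath    = app_path() . '/storage/bulk-upload/' . $file;
$expectedSize = $fileSizes[$file];                           // hypothetical: size recorded when the upload finished

if (!file_exists($localPath) || filesize($localPath) < $expectedSize) {
    $retry[] = $file;                                        // not fully written yet, try this one again later
    continue;
}

// Safe to process: the whole file is on disk
$record = $this->record_repository->save($input, $created_file);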
Hope this helps!

get_headers takes too much time for checking size of image

Hi, I am using the code below to check the size of a remote image. It works, but it takes a lot of time to check the size of the image. Is there a better way to do it?
<?php
$url = 'http://testfile.com/test/sddkssk.jpg';
$head = array_change_key_case(get_headers($url, TRUE));
$filesize = $head['content-length'];
if ($filesize >= 131000) {
    echo 'good image';
}
But it takes 2-3 minutes each time. Is there any better way that can do the same work much faster?
$size = getimagesize("http://www.example.com/gifs/logo.gif");
// if the file name has space in it, encode it properly
$size = getimagesize("http://www.example.com/gifs/lo%20go.gif");
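If that is still slow, another option (not part of the answer above) is a HEAD-style cURL request with a hard timeout, so only the headers are fetched rather than any image data; a rough sketch:
$ch = curl_init('http://www.example.com/gifs/logo.gif');
curl_setopt($ch, CURLOPT_NOBODY, true);           // HEAD-style request: no body is downloaded
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 5);             // give up after 5 seconds instead of hanging
curl_exec($ch);
$filesize = curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD);  // -1 if no Content-Length was sent
curl_close($ch);

if ($filesize >= 131000) {
    echo 'good image';
}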

Cache multiple pages/images from Instagram

I'm working on a small project where users can see images tagged with, in this case, "kitties". Instagram only allows 5000 requests/hour; I don't think it will reach this, but I'm choosing to cache anyway, also because I can't figure out how to get the back-link to work.
I can only get the link for the next page; the link for the recent page then becomes the current page, i.e. a link to itself.
Also, the API can return an odd number of images, sometimes 14, sometimes 20 and so on. I want it to always show 20 images per page and only have 5 pages (100 images), and then update this file every 5-10 minutes or so.
So my plan is to store around 100 images in a file. I got it working, but it's incredibly slow.
The code looks like this:
$cachefile = "instagram_cache/" . TAG . ".cache";
$num_requests = 0; // Just for developing and to check how many requests it does

// If the file does not exist or is older than *UPDATE_CACHE_TIME* seconds
if (!file_exists($cachefile) || time() - filemtime($cachefile) > UPDATE_CACHE_TIME)
{
    $images = array();
    $current_file = "https://api.instagram.com/v1/tags/" . TAG . "/media/recent?client_id=" . INSTAGRAM_CLIENT_ID;
    $current_image_index = 0;

    for ($i = 0; $i >= 0; $i++)
    {
        // Get data from API
        $contents = file_get_contents($current_file);
        $num_requests++;

        // Decode it!
        $json = json_decode($contents, true);

        // Get what we want!
        foreach ($json["data"] as $x => $value)
        {
            array_push($images, array(
                'img_nr'       => $current_image_index,
                'thumb'        => $value["images"]["thumbnail"]["url"],
                'fullsize'     => $value["images"]["standard_resolution"]["url"],
                'link'         => $value["link"],
                'time'         => date("d M", $value["created_time"]),
                'nick'         => $value["user"]["username"],
                'avatar'       => $value["user"]["profile_picture"],
                'text'         => $value['caption']['text'],
                'likes'        => $value['likes']['count'],
                'comments'     => $value['comments']['data'],
                'num_comments' => $value['comments']['count'],
            ));

            // Check if the requested amount of images is equal or more...
            if ($current_image_index > MAXIMUM_IMAGES_TO_GET)
                break;

            $current_image_index++;
        }

        // Check if the requested amount of images is equal or more, even in this loop...
        if ($current_image_index > MAXIMUM_IMAGES_TO_GET)
            break;

        if ($json['pagination']['next_url'])
            $current_file = $json['pagination']['next_url'];
        else
            break; // No more files to get!
    }

    file_put_contents($cachefile, json_encode($images));
}
This feels like a very ugly hack; any ideas for how to make this work better?
Or can someone tell me how to make that "back-link" work like it should? (Yes, I could use JS and go -1 in history, but no!)
Any ideas, suggestions, help, comments etc are appreciated.
Why not subscribe to the real-time API and store the images in the DB? Then, when they are rendered, you can check whether the image URL is still valid (i.e. whether the photo has been deleted). Getting the data from your own DB will be much faster than getting it from Instagram.
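Either way, once the ~100 images live in your own cache (the JSON file above or a DB), paging becomes a simple slice; a rough sketch against the $cachefile from the question:
$images  = json_decode(file_get_contents($cachefile), true);
$perPage = 20;
$page    = isset($_GET['page']) ? max(1, min(5, (int) $_GET['page'])) : 1;  // clamp to pages 1-5

$pageImages = array_slice($images, ($page - 1) * $perPage, $perPage);
foreach ($pageImages as $img) {
    echo '<img src="' . htmlspecialchars($img['thumb']) . '" alt="">';
}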
