Can I download a resized image? - php

I have a cron process that gathers information about movies and presents finished information to the user. Periodically, however, the image behind the "movie poster" link provided by the API is massive, and it's crashing the process with memory issues.
As an example - this file http://content8.flixster.com/movie/11/16/86/11168662_ori.jpg is huge. I believe it's around 17MB, and the image itself clocks in at 8175px x 12075px.
The maximum size my end users will see is around 360px wide. So downloading such a massive image is kind of ridiculous. And the next size down is 180 x 266 - I'm worried that it might not look right if I scale it up.
Is there a way that PHP will allow me to download a resized version? I know that if I download the file, I can do a million things to it. I'm trying to change the size of the download itself.
If not - any other solutions anyone can think of?

Unfortunately there is no way to do this, and logically there cannot be: the remote server decides what bytes it sends. There is a workaround, however: you can use an online utility to do the job,
e.g. http://quickthumbnail.com, which generates a thumbnail from a given URL.
Hope this works for you.

This has nothing to do with PHP. Some remote service offers a file. You can take it or leave it. If they don't offer some API to download a different size or different version of the file, you can't get it.
All you can do is download the file to your server, resize it there and offer it to your users.
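As a sketch of that approach with PHP's built-in GD: check the remote file's size first so you can skip the monsters, stream the acceptable ones to disk, and resize locally. The size cap and output path below are placeholder assumptions, and this relies on allow_url_fopen being enabled:
$url      = 'http://content8.flixster.com/movie/11/16/86/11168662_ori.jpg';
$maxBytes = 2 * 1024 * 1024; // refuse anything over 2 MB (arbitrary cap)

// Ask only for the headers (HEAD) so nothing is downloaded yet.
$ctx     = stream_context_create(['http' => ['method' => 'HEAD']]);
$headers = get_headers($url, true, $ctx); // context argument needs PHP 7.1+
$length  = isset($headers['Content-Length']) ? (int) $headers['Content-Length'] : 0;

if ($length === 0 || $length > $maxBytes) {
    exit("Poster too large or size unknown ($length bytes), skipping.\n");
}

// Stream to disk instead of holding the whole file in a PHP string.
$tmp = tempnam(sys_get_temp_dir(), 'poster_');
copy($url, $tmp);

// Resize to the 360px the UI actually needs; GD computes the height.
$src = imagecreatefromjpeg($tmp);
$dst = imagescale($src, 360);
imagejpeg($dst, '/path/to/posters/11168662_360.jpg', 85);

imagedestroy($src);
imagedestroy($dst);
unlink($tmp);
Note that decoding a huge JPEG still needs a lot of memory, which is exactly the original problem; the Content-Length check is what lets you avoid it.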

Related


What is a good way to upload multiple large files in PHP?
Note: in my case I don't really need very large files; I need something like 40-50 or maybe 100 MB uploads. My purpose and goal is mainly document upload on websites, and these documents (pdf, doc, etc.) can sometimes be 20-30 MB and rarely 100+ MB, where normally PHP's upload_max_filesize is 10-20 MB. I think that having a large upload_max_filesize is very bad, and allowing really large file uploads in PHP (like 1 GB+) is a bad idea anyway.
I have been reading, even here on SO, different solutions like plupload and some others (HTTP Upload, bigUpload, etc.) and I am not sure which way is better to consider.
As a principle, I'd like to find something with maintained code (not an abandoned library) and possibly following coding standards (PSR).
I think that writing all from scratch would be a huge work, but maybe I am wrong, if someone did that I would like to hear your experience. Of course if I can't find something that gives me what I need, I'll have to edit existing libraries or write one on my own.
I will give plupload a try, but my biggest concern is that it might not be well maintained. There is a v3 release, but the stable version is still v2. If I understand correctly, the last updates of this library on GitHub were made in 2017, and this doesn't really reassure me. PHP, and everything in general, changes really quickly.
Problem 1: Server knows the size of the file only when the upload is finished
I think this is the first problem. I can't be sure about the file size until the upload is completed, so if I have to reject the file in PHP because it is too big, the file would need to be uploaded first. I could guess the size in JavaScript, but I think that would be too easy to tamper with. The same goes for HTML5. All client-side checks can be "hacked" or manipulated; even if that requires work, it is still doable.
Problem 2: File chunking, is it the best solution?
I have read about file chunking, which is very interesting, but is that only client-side? Because to chunk a file on the server side, you are back to problem 1: you need to upload the whole file before chunking it. Is client-side chunking safe? Are there known vulnerabilities around it (for example, could someone exploit something while the file is being uploaded)? I will read more about this later.
Also, what about the time required to upload? Let's say you are on your phone on 4G uploading two PDFs or JPGs through a form: that's 20-30 MB, and with a bad signal it might take 30 seconds or a minute. Will the server stop listening after a while?
In any case, with file chunking, do I need a large upload_max_filesize? Is the request usually sent (as a POST) with the entire file, which could then fail because the size exceeds the PHP limit? I would like to keep a normal PHP file size limit.
Problem 3: stopping file upload and deleting interrupted upload
What happens if I stop or I want to stop my upload? I guess that the server will find itself with a temporary file which is only partially uploaded.
Then, in general, is the best approach to have a cron job checking the /tmp (or whatever) folder and deleting incomplete files?
Problem 4: overwriting interrupted uploads
What if the user uploads a new file that should overwrite the old one, but the old one was not completed? Let's say, user uploads a 10MB document but realizes it's the wrong one. He will probably reload the page or maybe click "Browse" again and upload the new one. So, all the temporary files should have a unique name. If I remember correctly, PHP already gives them a random name in /tmp/. Is this enough? Would it be better to manually give them a random name, maybe based on a timestamp?
In code, something like:
$fileName = time() . '-' . uniqid() . '.tmp';
Problem 5: what about multiple files upload?
Let's say the user uploads 3 documents, 10 MB each. Should the server receive them all at the same time or one by one? Or is it the same either way? At first glance, I'd think that multiple simultaneous uploads could put more load on the server. Maybe that is not so important, though.
Problem 6: accessibility
Would all this be easily accessible? Do you think that a user using a screen-reader would be able to upload multiple (and possibly large) files without problems?
Conclusion
In conclusion I will take a look at existing libraries and test a bit if I can find a good solution for my needs. In general I would like to read comments or experiences that could help me understand possible difficulties and problems that I could encounter.
I will try to help with your questions:
Problem 1: Server knows the size of the file only when the upload is finished
You need to configure upload_max_filesize and post_max_size to allow the maximum file upload you want to be able to receive in a single request.
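For instance, in php.ini (the numbers are illustrative, not recommendations):
; illustrative limits - pick values that fit your own use case
upload_max_filesize = 100M   ; largest single uploaded file
post_max_size = 110M         ; whole request body; keep it above upload_max_filesize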
And yes, you will know the size once the file has been uploaded, as your script will be executed once the file is completely uploaded.
You can add some checks in your JavaScript to improve the UI for the customer.
Also if the file exceeds the maximum size, the file will not be available in your server script.
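On the server you can tell these cases apart from the error code PHP reports; a sketch, where the field name "document" is just an example:
$err = $_FILES['document']['error'] ?? UPLOAD_ERR_NO_FILE;

if ($err === UPLOAD_ERR_INI_SIZE || $err === UPLOAD_ERR_FORM_SIZE) {
    http_response_code(413); // a configured size limit was exceeded
    exit('File too large.');
}
if ($err !== UPLOAD_ERR_OK) {
    http_response_code(400);
    exit('Upload failed with error code ' . $err);
}

$size = $_FILES['document']['size']; // only now is the size trustworthy
Note that if post_max_size itself is exceeded, $_FILES comes back empty, which the ?? fallback above catches.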
Problem 2: File chunking, is it the best solution?
I don't think this would be any better than uploading the whole file.
The speed will not increase: although you can parallelize the upload, the total throughput is determined by the client's connection, so you end up uploading one file at 10 Mb/s or ten chunks at 1 Mb/s; in the end it is the same.
You will also end up with very complex code on both the client and the server to handle it, and you will have to handle new error scenarios.
It is not worth it.
Problem 3: stopping file upload and deleting interrupted upload
If the client stops the upload, your server code will not be executed, as the request has not been completed.
You also won't have any trouble with duplicated tmp files: the file is removed from the tmp folder when you move it with move_uploaded_file(), and if you don't move it, it is deleted automatically when your script finishes.
https://www.php.net/manual/en/features.file-upload.post-method.php
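A minimal sketch of that behaviour (the field name and destination directory are assumptions):
$tmp  = $_FILES['document']['tmp_name'];
$dest = '/var/uploads/' . time() . '-' . uniqid() . '.pdf';

// Persist the upload; is_uploaded_file() guards against forged paths.
if (is_uploaded_file($tmp) && move_uploaded_file($tmp, $dest)) {
    echo "Stored as $dest\n";
}
// If move_uploaded_file() is never called, $tmp is deleted at script end,
// so interrupted or abandoned uploads never accumulate.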
Problem 5: what about multiple files upload?
With multiple file uploads, I prefer to use a different request via JavaScript for each file; then you can parallelize or upload files one by one and show the user the upload progress.
With javascript you can see the upload percent of each request, and it is very useful when uploading big files to give the user the feedback that the application is working properly.
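If you do send several files in one request instead, the server side is just a loop over the arrays PHP builds for a multiple input; a sketch assuming a field named docs[]:
// <input type="file" name="docs[]" multiple> arrives grouped by attribute,
// so re-index per file.
foreach ($_FILES['docs']['error'] as $i => $err) {
    if ($err !== UPLOAD_ERR_OK) {
        continue; // skip failed entries, report them as needed
    }
    $tmp  = $_FILES['docs']['tmp_name'][$i];
    $name = time() . '-' . uniqid() . '.dat';
    move_uploaded_file($tmp, '/var/uploads/' . $name);
}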
Problem 6: accessibility
You will face the same accessibility issues as when uploading a small file.
Conclusion
The server side coding is independent of the file size you are trying to upload, as it just does some checks and moves the file to the proper location.
You will have to write some JavaScript to improve the UI and show the progress to the customer, and if you want the ability to cancel the upload, that is easy too, as it is just an HTTP request with a listener that updates the progress (https://stackoverflow.com/a/47638378/1445024).

Common practice to compress image before sending to mobile device?

My application requires downloading many images from the server (each image about 10 KB). I'm currently downloading each of them with an independent AsyncTask, without any optimization.
Now I'm wondering what's the common practice to transfer these images. For example, I'm thinking about saving zipped images at server, then send zipped file for user's mobile to unzip. In this case, is it better to combine the zip files into one big zip file for user to download?
Or there's better solution? Thanks in advance!
EDIT:
It seems combining images into zip files is a good idea, but I feel it may take too long for the user to wait while downloading and unzipping all of them. So I may put ten or twenty images in each zip file, so the user can see some downloaded images while waiting for more to come. Having multiple AsyncTasks fire together should be faster, right? But they won't finish at the same time, even given the same file size and the same download address?
Since latency is often the largest problem with mobile connections, reducing the number of connections you have to open is a great way to optimize the loading times. Sending a zip file with all the images sounds like a very good idea, and is probably worth the time implementing.
The images are probably already compressed (gif, jpg, png), so you will not reduce the file size, but you will reduce the number of connections, which is a good idea for mobile. If it is always the same set of images, you can use sprite techniques (sending one bigger image file containing all the images at different x/y offsets; in HTML you can use the background with an offset to show the right image).
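Assuming the server side runs PHP, bundling a batch of images into one archive is straightforward with ZipArchive; paths are placeholders, and the JPEGs are stored rather than deflated since they won't shrink further:
$zip = new ZipArchive();
$zip->open('/var/www/bundles/batch-001.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE);

foreach (glob('/var/www/images/batch-001/*.jpg') as $path) {
    $zip->addFile($path, basename($path));
    // STORE skips deflate: already-compressed images barely shrink,
    // and it saves CPU on both ends (needs PHP 7.0+).
    $zip->setCompressionName(basename($path), ZipArchive::CM_STORE);
}
$zip->close();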
I found this topic from the sidebar, but judging by the comments you are really asking about patching in resources after install.
The main thing is making sure the user knows what is happening: they download file X and get output Y for a particular purpose. It appears to be common practice to download chunks of resources that are not native to the Android app and don't fit in the APK.
A comparable example is JDIC apps, which use a popular Japanese dictionary resource in tandem with English translations. JDIC apps like WWWJDIC download their extremely large reference files online, since bundling them would otherwise mean the bad latency (mentioned before) on Google's servers. It is also bad rep to ship more than 200 MB in a Google Play app unless it is 3D, which is justifiable. If your images cannot be compressed without extremely long loading times in the app itself, you may need to consider this option. The only downside is that it requires an online connection (also mentioned before).
Also, you could use 7zip and program Android to self-extract it to a location. http://www.wikihow.com/Use-7Zip-to-Create-Self-Extracting-excutables
On another note, it would be optimal to do a one-time download on initial startup and have the app perform routine checks afterwards. You can then optionally use an AsyncTask so that your files are downloaded and used after a restart, or however you want it, so you really only need one AsyncTask. The benefit is that the user syncs once and may only need to check once. The downside is that the user may not always be able to update and may need to use 4G or LTE, but that is a minor concern if he can use WiFi whenever he wants.

How can I count the size of downloads from my users?

Is it possible to calculate the size of downloads made by users?
For example, I have a music file of 10 MB, and the download link is resumable.
If my user gets 50% of the file, can I know the size of his download?
It'd be better if this can be done with PHP.
To know the size of the actual download, you need to tell us which method/process delivers the downloads; that process is what can know.
For example, if you serve the files via your server, you can normally configure the server logs to store that information. You can then parse log-files to obtain the download size(s).
"It'd be better if this can be done with PHP."
However you like to do it. I cannot say whether it's better done with PHP, but if it's better for you, then do it with PHP.
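If you do serve the file through PHP rather than letting the web server deliver it directly, you can count the bytes as you stream them. A rough sketch, with placeholder paths and the Range/resume handling omitted for brevity:
$path = '/var/media/song.mp3';

header('Content-Type: audio/mpeg');
header('Content-Length: ' . filesize($path));

$sent = 0;
$fh   = fopen($path, 'rb');
while (!feof($fh)) {
    $chunk = fread($fh, 8192);
    echo $chunk;
    flush();
    if (connection_aborted()) {
        break; // the client stopped; $sent holds what was delivered so far
    }
    $sent += strlen($chunk);
}
fclose($fh);

// Record how much this user actually downloaded.
file_put_contents('/var/log/downloads.log', $sent . " bytes\n", FILE_APPEND);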

ImageMagick server requirements

I'm in the process of building a simple website, or to be more precise, a simple component for a website that adds a watermark to an image, creates a few different-sized versions, and overlays them onto a few products. These edits will be made every time someone queries an image in a certain directory on the server.
I know this can all be done with ImageMagick; my only concern is that the whole website will grind to a halt every time someone views their image for the first time (after the edit has been made once, the database is updated so the edited version is served every time a user accesses it).
The website isn't hosted yet, for the time being I'm testing on XAMPP, but I figured for this I'm going to need a virtual or dedicated server, I just need some advice on what sort of hardware specs I ought to be looking at. I doubt more than 2 or 3 people will be viewing photos at any one time, but at a guess I need to be sure that the server can handle up to 10 or so and still be functional.
Hope someone can advise on this, cheers!
Image processing needs some CPU power, and if the images are large, they will also consume memory. But in any case you should work with caching. I don't know your application, but there are certainly ways to cache images to the filesystem once they have been rendered.
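A sketch of that render-once-then-cache pattern with the Imagick extension; the paths, the 360px size, and the watermark file are all placeholders:
$source = '/var/www/originals/product.jpg';
$cached = '/var/www/cache/product_360.jpg';

if (!file_exists($cached)) {
    // First request only: do the expensive ImageMagick work once.
    $img = new Imagick($source);
    $img->thumbnailImage(360, 0); // height 0 keeps the aspect ratio

    $mark = new Imagick('/var/www/assets/watermark.png');
    $img->compositeImage($mark, Imagick::COMPOSITE_OVER, 10, 10);

    $img->writeImage($cached);
    $img->destroy();
    $mark->destroy();
}

// Every later request is a plain file read.
header('Content-Type: image/jpeg');
readfile($cached);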

DVD to FLV File Conversion Using PHP -- Is this Doable? Insane? What are alternatives?

So I have users who have told me they are interested in being able to upload videos to my site straight from DVD's (for which they own the rights, of course).
I've never encountered this before, but I would imagine this would take an enormous amount of resources and would clog up the servers, which I would like to avoid.
A basic Google search returns numerous DVD to FLV converters, but they all appear to be desktop applications that would be used to convert the files before uploading.
So, if this isn't a horrible idea, how would I go about implementing it using PHP or any Linux command line tool?
Or, if this is insane, why is it a bad idea, and what are other possible alternatives?
As an example, I could see an alternative being:
showing information about how to convert the files to a valid upload format before uploading
Look at ffmpeg. I don't know whether it reads DVD files directly, but it handles most video formats; see:
http://en.wikipedia.org/wiki/Libavcodec
It's a command line program, which can convert between many video formats.
You can't avoid a huge load on the server, because converting video simply requires a lot of computation. Maybe there is a way to restrict the resources the program takes and slow it down, but that will cost execution time. On a multi-core server, only one core will be loaded when converting video, so maybe this is not a problem?
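As a sketch, PHP can queue the conversion by shelling out to ffmpeg under nice, so the encode yields CPU to the web server; the paths and flags here are illustrative, not a tuned recipe:
$in  = escapeshellarg('/var/uploads/movie.vob');
$out = escapeshellarg('/var/www/videos/movie.flv');

// Low priority (nice 19), backgrounded so the HTTP request is not
// held open for the whole conversion; -n refuses to overwrite.
exec("nice -n 19 ffmpeg -n -i $in $out > /dev/null 2>&1 &");

echo "Conversion queued.\n";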
Remember that uploading large files (like DVD video) can also be a problem, and you should look at a nice uploader with a progress bar (for example, a Flash uploader).
