I am developing a content management system in PHP where users can upload their content, including images and other assets.
Some users have uploaded oversized images, for example a 1 MB image as a page logo.
I want to compress all the images in the folder that holds every user's images, recursively.
How can this be done using PHP or any other server-side scripting?
Note: I don't have shell access to the server to run any script. The only way to do it is with something in PHP.
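Since there is no shell access, one option is a plain PHP script (triggered through a browser request) that walks the upload folder with the SPL directory iterators and re-saves each image with GD. A minimal sketch, assuming the GD extension is enabled and with "uploads/" as a placeholder path:

<?php
// Minimal sketch: recursively re-compress JPEG/PNG files with GD.
// Assumes the GD extension is enabled; "uploads/" is a placeholder path.
$dir = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator('uploads/', FilesystemIterator::SKIP_DOTS)
);

foreach ($dir as $file) {
    $path = $file->getPathname();
    $ext  = strtolower($file->getExtension());

    if ($ext === 'jpg' || $ext === 'jpeg') {
        $img = imagecreatefromjpeg($path);
        imagejpeg($img, $path, 75);   // re-save at ~75% quality
    } elseif ($ext === 'png') {
        $img = imagecreatefrompng($path);
        imagepng($img, $path, 9);     // maximum zlib compression
    } else {
        continue;
    }
    imagedestroy($img);
}

Note that re-saving overwrites the originals in place, so testing on a copy of the folder first is a good idea.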
Related
I want to create a video from images on the server. I have a CSS animation on the webpage, and I've already written code that takes screenshots (as base64 data URLs) and saves them in a folder on my server (using html2canvas/AJAX/PHP). The images are created as "image001.png, image002.png", etc. Once I have all the files in my folder I would like to use ffmpeg to grab all the images and convert them into a video. But I want this step automated and I'm not sure how to do it. Should I use PHP code with a condition (for example, checking that all the files are in the folder) and then shell_exec()? I am a bit confused about how to combine JavaScript, PHP, and ffmpeg.
Thanks
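One possible shape for the automation, assuming ffmpeg is installed on the server and shell_exec() is not disabled: the AJAX call that saves the last screenshot could also hit a PHP endpoint like the sketch below, which checks that all frames exist and only then runs ffmpeg. $frameDir and $expected are placeholders for your own values.

<?php
// Minimal sketch: once the expected number of frames exists, call ffmpeg.
// Assumes ffmpeg is on the server's PATH and shell_exec() is allowed.
$frameDir = __DIR__ . '/frames';
$expected = 120; // number of screenshots the front end is supposed to send

$frames = glob($frameDir . '/image*.png');

if (count($frames) >= $expected) {
    $pattern = $frameDir . '/image%03d.png';
    $outFile = $frameDir . '/out.mp4';

    $cmd = 'ffmpeg -y -framerate 30 -i ' . escapeshellarg($pattern)
         . ' -c:v libx264 -pix_fmt yuv420p ' . escapeshellarg($outFile) . ' 2>&1';

    $output = shell_exec($cmd); // ffmpeg's log, useful for debugging
}

The JavaScript side only needs to call this endpoint (via the same AJAX mechanism used for the uploads) after it has sent the final frame; the condition on the frame count keeps an early call from producing a partial video.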
I've recently pushed a video uploader for a website I've been working on for months. We have had two separate instances where the video a user uploaded wasn't completely transferred, resulting in a broken file that won't play. We aren't sure why this is occurring, but we have a couple of hunches: the uploaded files are too big to handle, or the user's upload speed is too poor. I was wondering what I can do to handle situations such as these, where the upload conditions are less than optimal.
I upload first to the tmp directory on my Linux server and then move the file to S3. Our system is for end users to upload, and we limit files to 40 GB.
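One small guard against silently broken files is to have the client send the file's byte size in a form field and compare it server-side before anything moves to S3. A rough sketch, where "expected_size" is a made-up field name (for 40 GB files, a chunked or resumable upload that can retry individual pieces is the more robust long-term fix):

<?php
// Minimal sketch: reject obviously truncated uploads before moving to S3.
// Assumes the client posts the original file size in "expected_size"
// (a placeholder field name, not an existing API).
if (!isset($_FILES['video']) || $_FILES['video']['error'] !== UPLOAD_ERR_OK) {
    http_response_code(400);
    exit('Upload failed or was interrupted.');
}

$received = filesize($_FILES['video']['tmp_name']);
$expected = (int) ($_POST['expected_size'] ?? 0);

if ($expected > 0 && $received !== $expected) {
    http_response_code(400);
    exit('File arrived truncated; please retry the upload.');
}
// Only after these checks move the temp file on towards S3.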
For SEO reasons I am sending some images using PHP readfile() and URL rewriting. I save images in a folder under a numerical id (id.jpg) but serve each image under a keyword-based name (some-seo-word-id.jpg). The script runs smoothly and shows images effectively, but on pages with multiple images (5 to 6) it fails to send all of them correctly.
Refreshing the page two or three times sometimes shows all the images and sometimes does not. Since the largest image delivered through readfile() is 70-140 KB, I highly doubt that memory is the issue. My questions are:
1. Is there a better approach to this using only Apache rewrites?
2. How does Facebook deliver all its images effectively with PHP? Their URLs look like this: https://www.facebook.com/photo.php?fbid=XXXXXXXXX27&set=a.4533609xxxx.xxxx099.xxxx210426&type=1&theater
3. Will server-side caching in binary format help? To me this should not make much difference, since readfile() of an image or of a cached file still reads a particular file (the cached file instead of the image).
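For reference, a minimal readfile() handler with explicit headers might look like the sketch below. The ?id= parameter and paths are assumptions, and session_write_close() only matters if the script starts a session, since a held session lock can make parallel image requests queue up behind each other.

<?php
// Minimal sketch of a readfile() image handler with explicit headers.
// Assumes the rewrite passes the numeric id as ?id=...; names are placeholders.
session_write_close(); // don't hold a session lock while streaming images

$id   = (int) ($_GET['id'] ?? 0);
$path = __DIR__ . '/images/' . $id . '.jpg';

if (!is_file($path)) {
    http_response_code(404);
    exit;
}

header('Content-Type: image/jpeg');
header('Content-Length: ' . filesize($path));
header('Cache-Control: public, max-age=86400'); // let the browser cache it
readfile($path);
exit;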
I have a Laravel PHP app where a user uploads an image. The image is converted into a number of different sizes required around the application, and each version is then uploaded to AWS S3.
When the user uploads the image, PHP places it in /tmp until the request has completed, unless it has been moved. I am planning to push the job of converting and uploading the versions onto a queue. What is the best way to ensure that the image stays in /tmp long enough to be converted and then uploaded to S3?
Secondly, where should I save the different versions so that I can access them to upload them to S3 and then remove them from the server (preferably automatically)?
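For illustration, one way to sidestep the /tmp lifetime entirely is to move the upload into app storage during the request and hand the queue job that path. A minimal Laravel-flavoured sketch, where ProcessImageVersions is a hypothetical job class and the route/field names are placeholders:

<?php
// routes/web.php -- minimal sketch, not a drop-in implementation.
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Route;

Route::post('/images', function (Request $request) {
    // store() copies the upload out of /tmp into storage/app/pending/...
    // and returns a relative path that outlives the request.
    $path = $request->file('image')->store('pending');

    // Hypothetical queued job that resizes the file and pushes each version to S3.
    ProcessImageVersions::dispatch($path);

    return response()->json(['queued' => $path]);
});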
I would create a new directory and work in it; the tmp folder is flushed every now and then, depending on your system.
As for the different sizes, I would create separate buckets for each size, which you can access with whatever constant you use to store the image (e.g. email, user id, etc.).
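A rough sketch of that per-size upload step, assuming Laravel's Storage facade and placeholder disk names that would need matching entries in config/filesystems.php:

<?php
// Minimal sketch: upload each generated size to its own S3 location,
// keyed by whatever constant identifies the image (user id here).
use Illuminate\Support\Facades\Storage;

$userId = 42; // the constant used to key the image (could be an email instead)

$versions = [
    's3-thumbs' => "/working/{$userId}_thumb.jpg",
    's3-large'  => "/working/{$userId}_large.jpg",
];

foreach ($versions as $disk => $localPath) {
    Storage::disk($disk)->put($userId . '.jpg', file_get_contents($localPath));

    unlink($localPath); // remove the working copy once it is safely on S3
}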
On my site, admins can upload ZIP files that contain a set of JPGs. Users can then download the ZIPs. I also want users to be able to preview images from the ZIP file. But I don't want to extract the images into a temp dir on the server and delete them later.
Is there a way to do this? Perhaps with the img "data:" URI scheme?
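ZipArchive can read an entry into memory without extracting anything to disk, and the bytes can then be base64-encoded into a data: URI. A minimal sketch with placeholder file names:

<?php
// Minimal sketch: read one JPG straight out of the ZIP (no temp files)
// and emit it as a data: URI for an <img> tag. Paths are placeholders.
$zip = new ZipArchive();
if ($zip->open('/uploads/album.zip') === true) {
    $bytes = $zip->getFromName('photo1.jpg'); // entry contents as a string
    $zip->close();

    if ($bytes !== false) {
        echo '<img src="data:image/jpeg;base64,' . base64_encode($bytes) . '">';
    }
}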
One way or another, people would have to download parts of that ZIP file, so you might as well do some server-side processing and generate thumbnails of the images ONCE. Then just serve those thumbnails repeatedly. Otherwise you're going to waste a lot of CPU time repeatedly extracting images, encoding them as data URIs, and sending those data URIs out.
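A rough sketch of that one-off thumbnail pass, using ZipArchive plus GD, with placeholder paths and a 200 px width picked arbitrarily:

<?php
// Minimal sketch: generate thumbnails once, right after the ZIP is uploaded.
$zip = new ZipArchive();
if ($zip->open('/uploads/album.zip') === true) {
    for ($i = 0; $i < $zip->numFiles; $i++) {
        $name = $zip->getNameIndex($i);
        if (strtolower(pathinfo($name, PATHINFO_EXTENSION)) !== 'jpg') {
            continue;
        }

        $src = imagecreatefromstring($zip->getFromIndex($i));
        if ($src === false) {
            continue;
        }

        // Scale to a 200px-wide thumbnail, keeping the aspect ratio.
        $thumb = imagescale($src, 200);
        imagejpeg($thumb, '/uploads/thumbs/' . basename($name), 80);

        imagedestroy($src);
        imagedestroy($thumb);
    }
    $zip->close();
}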