Hi, I want to show an image gallery on my site and then have a link that allows the user to download the whole gallery. Is it OK to place the images in a ZIP archive and provide a link to it? I had the feeling that it was not a good idea to zip JPEGs.
For the purpose of bundling files it is perfectly acceptable, but since JPEGs are already compressed it won't reduce the download size by a significant amount.
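If it helps, here is a minimal PHP sketch of building such an archive on the server. The folder and file names are placeholders, and storing the JPEGs uncompressed avoids wasting CPU on recompression (setCompressionName needs PHP 7+):

<?php
// Sketch: bundle a gallery folder into a ZIP for download.
// "gallery/" and "gallery.zip" are placeholder names.
$zip = new ZipArchive();
if ($zip->open('gallery.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
    die('Could not create archive');
}
foreach (glob('gallery/*.jpg') as $file) {
    $zip->addFile($file, basename($file));
    // JPEGs are already compressed, so just store them as-is.
    $zip->setCompressionName(basename($file), ZipArchive::CM_STORE);
}
$zip->close();
// Then simply link to gallery.zip from the page.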
I've recently pushed a video uploader for a website I've been working on for months. Recently we have had two separate instances where the video the user uploaded wasn't completely transferred, resulting in a broken file that won't play. We are not sure why this is occurring, but we have a couple of hunches: the files being uploaded are too big to handle, or the user's upload speed is very poor. I was wondering what I can do to handle situations like these, where the upload conditions are less than optimal.
The file is first uploaded to the tmp directory on my Linux server and then moved to S3. Our system is for end users to upload, and we limit files to 40 GB.
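Not an answer to why the uploads break, but one way to at least detect a truncated file before it reaches S3 is to have the client send the expected size and checksum along with the upload and verify them server-side. A rough PHP sketch, where the expected_size and expected_md5 fields are hypothetical:

<?php
// Sketch: reject incomplete uploads before they are moved to S3.
// $_POST['expected_size'] and $_POST['expected_md5'] are hypothetical
// fields the uploading client would have to send with the file.
$tmpPath = $_FILES['video']['tmp_name'];

if ($_FILES['video']['error'] !== UPLOAD_ERR_OK) {
    http_response_code(400);
    exit('Upload failed, please retry');
}
if (filesize($tmpPath) != (int) $_POST['expected_size']) {
    http_response_code(400);
    exit('File arrived truncated, please retry');
}
if (isset($_POST['expected_md5']) && hash_file('md5', $tmpPath) !== $_POST['expected_md5']) {
    http_response_code(400);
    exit('Checksum mismatch, please retry');
}
// Only now hand the file off to S3.

For 40 GB files, a chunked or resumable upload scheme is probably the longer-term fix, since it lets a failed transfer restart from where it broke off rather than from zero.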
I have the URL of a gallery. This gallery is structured like www.url.com/folder/image01.php
Here, www.url.com/folder shows the thumbnails, and clicking on each thumbnail redirects to the PHP page mentioned above.
I wish to download the full-size images, i.e. the image from the PHP page, rather than the images in the thumbnails folder.
I use the following line: wget -r -L -A .jpg www.url.com/folder
However, this only downloads the thumbnails. Each thumbnail points to a larger image, and I want to follow each of those links and download the full images, rather than the images in the thumbnail directory.
I've been stuck with this problem for quite a while, is there a solution for this?
I'm not very experienced with wget, but my understanding is that the command you are using will recursively download all .jpg files at the relative URLs appearing in the provided web page.
It would help to know which web page you are trying to access and to analyse how the full-sized images are reached, but my guess is that only the thumbnail images are directly accessible through relative links from the page.
You should probably check whether any JavaScript is used to provide access to the full-size image. If so, wget will not help, since (as far as I am aware) it only works on the plain HTML of the initial web page and cannot follow links created dynamically through JavaScript.
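If the full-size image URL does appear in the HTML of each image01.php page (i.e. it is not injected by JavaScript), a small script can do the two-step crawl that wget misses. A rough PHP sketch, where both regular expressions are assumptions about how the gallery is marked up:

<?php
// Sketch: fetch the index page, follow each imageNN.php link,
// then download the <img> it embeds. The regexes assume plain HTML
// with relative links; adjust them to the real markup.
$base  = 'http://www.url.com/folder/';
$index = file_get_contents($base);

preg_match_all('/href="(image\d+\.php)"/i', $index, $pages);
foreach (array_unique($pages[1]) as $page) {
    $html = file_get_contents($base . $page);
    if (preg_match('/<img[^>]+src="([^"]+\.jpg)"/i', $html, $m)) {
        // Save the full-size image under its own file name.
        file_put_contents(basename($m[1]), file_get_contents($base . $m[1]));
    }
}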
Hi, I'm making a site in WordPress using a theme called wptube2, which has the whole YouTube look. What I need is that when I upload a video, it is automatically converted to a format like MP4 and published in my own player, such as JW Player. I don't know where to start, so any suggestions would be greatly appreciated. I just want to upload any video format and have it automatically converted to MP4 and published with JW Player.
Here is a screenshot of where I publish my videos from.
I used ffmpeg to do all my conversions.
You could make all uploaded video files go to one central folder, then have a batch job that runs ffmpeg on those files and moves each one to another folder once it is processed (see the sketch below).
You could let the user upload the file in whatever format, give it a placeholder .mp4 file name, and just show a "file being processed" message in the meantime.
File conversion takes time and a lot of processing power; be sure to try it locally before even considering making your server do all the work.
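A rough PHP sketch of that central-folder idea, meant to be run periodically from cron. The folder names and ffmpeg flags are assumptions and would need tuning for your content:

<?php
// Sketch: convert everything in uploads/ to MP4 and move it to published/.
// Assumes the files in uploads/ are fully uploaded before this runs.
foreach (glob('uploads/*') as $src) {
    $dest = 'published/' . pathinfo($src, PATHINFO_FILENAME) . '.mp4';
    // H.264 video + AAC audio is the kind of MP4 JW Player expects.
    $cmd = sprintf(
        'ffmpeg -i %s -c:v libx264 -c:a aac -movflags +faststart %s',
        escapeshellarg($src),
        escapeshellarg($dest)
    );
    exec($cmd, $output, $status);
    if ($status === 0) {
        unlink($src);                              // conversion done, drop the original
    } else {
        rename($src, 'failed/' . basename($src));  // keep failures for inspection
    }
}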
I'm writing an arcade game script, and I want to capture an image of SWF files to use as a thumbnail.
Does PHP have a function for this?
If not, how difficult is it? The steps would be:
Upload an SWF file
Take a screenshot of the uploaded file
Move it to a directory
Add it to MySQL
That's not possible with PHP alone. The best solution in my eyes is to use a browser screenshot tool on a page that embeds the flash.
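For example, a command-line page renderer such as wkhtmltoimage or CutyCapt could be pointed at a small page that embeds the SWF, assuming the renderer has the Flash plugin available. A hypothetical sketch (embed.php and the paths are made up):

<?php
// Sketch: render a page that embeds the uploaded SWF and save it as a PNG.
// Assumes wkhtmltoimage (or a similar headless renderer with Flash support)
// is installed; embed.php is a hypothetical page that wraps the SWF.
$swfId   = 123; // id of the uploaded file, hypothetical
$pageUrl = 'http://example.com/embed.php?id=' . $swfId;
$thumb   = '/var/www/thumbs/' . $swfId . '.png';

exec(sprintf('wkhtmltoimage %s %s', escapeshellarg($pageUrl), escapeshellarg($thumb)), $out, $status);

if ($status === 0) {
    // Optionally shrink the capture with GD before storing the path in MySQL.
    $img   = imagecreatefrompng($thumb);
    $small = imagescale($img, 200); // 200px wide thumbnail
    imagepng($small, $thumb);
}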
On my site, admins can upload ZIP files that contain a set of JPGs. Users can then download the ZIPs. I also want users to be able to preview images from the ZIP file. But I don't want to extract the images into a temp dir on the server and delete them later.
Is there a way to do this? Perhaps with the img "data:" URI scheme?
One way or another, people would have to download parts of that ZIP file, so you might as well do some server-side processing and generate thumbnails of the images once, then just serve up those thumbs repeatedly. Otherwise you're going to waste a lot of CPU time repeatedly extracting images, encoding them for data URIs, and sending those data URIs out.
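A sketch of that generate-once approach using PHP's zip:// stream wrapper and GD, with the paths and thumbnail size as placeholders:

<?php
// Sketch: serve a cached thumbnail for one image inside an uploaded ZIP.
// Generate it on first request, then reuse the file on disk afterwards.
$zipPath  = 'uploads/gallery.zip';          // placeholder path to the uploaded ZIP
$entry    = basename($_GET['img']);         // e.g. photo01.jpg (crudely sanitised)
$cacheDir = 'cache/thumbs/';                // must exist and be writable
$thumb    = $cacheDir . md5($zipPath . $entry) . '.jpg';

if (!file_exists($thumb)) {
    // Read the JPEG straight out of the archive, no temp extraction needed.
    $data = file_get_contents('zip://' . $zipPath . '#' . $entry);
    $img  = imagecreatefromstring($data);
    imagejpeg(imagescale($img, 200), $thumb);   // 200px wide thumbnail
}

header('Content-Type: image/jpeg');
readfile($thumb);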