I have the URL of a gallery. This gallery is structured like www.url.com/folder/image01.php
Here, www.url.com/folder shows the thumbnails, and clicking on each thumbnail redirects to the php page as mentioned.
I wish to download the FULL-size images, i.e. the images from the PHP pages, rather than the contents of the thumbnails folder.
I use the following line: wget -r -L -A .jpg www.url.com/folder
However, this only downloads the thumbnails. Each thumbnail points to a larger image, and I want to follow each of those links and download the full-size images, rather than the images in the thumbnail directory.
I've been stuck on this problem for quite a while. Is there a solution?
I'm not very experienced with wget, but my understanding is that the command you are using recursively downloads all .jpg files from the relative URLs appearing on the provided web page.
It would help to know which web page you are trying to access so we could analyse how the full-sized images are reached, but my guess is that only the thumbnail images are directly reachable through relative links from that page.
You should also check whether any JavaScript is used to provide access to the full-size image. If so, wget will not be helpful, since (as far as I am aware) it only works on the plain HTML text of the initial web page and cannot follow links created dynamically through JavaScript.
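If the full-size image is linked in plain HTML from each detail page, another workaround is to script the two hops yourself instead of relying on wget's recursion. A minimal PHP sketch, assuming each thumbnail links to a *.php detail page whose HTML contains an <img> tag pointing at the full-size JPEG; the URL and regex patterns below are placeholders you would adapt to the real markup:

    <?php
    // Hedged sketch: follow each detail page and grab the full-size image.
    // Assumes allow_url_fopen is enabled and the image src is relative.
    $base = 'http://www.url.com/folder/';
    $gallery = file_get_contents($base);

    // Collect links to the detail pages (image01.php, image02.php, ...).
    preg_match_all('/href="([^"]+\.php)"/i', $gallery, $m);

    foreach (array_unique($m[1]) as $page) {
        $html = file_get_contents($base . basename($page));

        // Grab the first .jpg/.jpeg referenced on the detail page.
        if (preg_match('/src="([^"]+\.jpe?g)"/i', $html, $img)) {
            $src = $img[1];
            if (!preg_match('#^https?://#i', $src)) {
                $src = $base . ltrim($src, '/'); // naive relative-URL handling
            }
            file_put_contents(basename($src), file_get_contents($src));
        }
    }

A scrape like this is fragile; view the source of one detail page first to see what the real link and image patterns look like, and adjust the expressions accordingly.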
Related
I want to create a video from images on the server. I have a CSS animation on the webpage. I've already written code that takes screenshots (as base64 data URLs) and saves them in a folder on my server (using html2canvas/AJAX/PHP). The images are all named "image001.png, image002.png", etc. Once I have all the files in my folder, I would like to use ffmpeg to grab the images and convert them into a video. But I want this step automated, and I am not sure how to do it. Should I use PHP code with a condition (for example, checking that all the files are in the folder) that calls shell_exec()? I am a bit confused about how to mix JavaScript, PHP, and ffmpeg.
Thanks
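One hedged sketch of that shell_exec() idea, assuming the screenshots land in a frames/ folder next to the script and that you know how many frames the animation produces; the frame count, framerate, and output name below are assumptions, not from the thread:

    <?php
    // Run ffmpeg only once all expected frames have been uploaded.
    $dir      = __DIR__ . '/frames';
    $expected = 120; // however many screenshots the animation produces

    if (count(glob($dir . '/image*.png')) < $expected) {
        exit('Still waiting for frames');
    }

    // image%03d.png matches image001.png, image002.png, ...
    $cmd = 'ffmpeg -y -framerate 25 -start_number 1 -i '
         . escapeshellarg($dir . '/image%03d.png')
         . ' -c:v libx264 -pix_fmt yuv420p '
         . escapeshellarg($dir . '/out.mp4')
         . ' 2>&1';

    echo shell_exec($cmd);

This could be the last AJAX call the page makes after html2canvas has uploaded the final screenshot, or a cron job that simply exits until the expected count is reached.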
I have a very large database of all items in a massive online game on my website, and so do my competitors. I, however, am the only site with pictures of all these items. All these pictures are on my server, e.g. from 1.png to 99999.png (all in the same directory).
It's very easy for my competitors to write a simple file_get_contents/file_put_contents script to copy all of these images to their own server and redistribute them their own way. Is there anything I can do about this?
Is there a way to limit (for example) everyone to only see/load 100 images per minute (I'm sure those scripts would scrape all of the images very quickly)? Or, even better, only allow real users to visit the URLs? I'm sure those scripts won't listen to a robots.txt file, so what would be a better solution? Does anybody have an idea?
Place a watermark in your images that states that the images are copyrighted by you or your company. Your competitors would have to remove the watermark and make the image look like there never was one, so that would definitely be a good measure to take.
If you're using the Apache web server, put the images in their own folder and add an .htaccess file that tells the server that only you and the server itself are allowed to read the files. This helps hide the images from scraping bots, because Apache will refuse direct requests for the folder's contents. You then need PHP to load and output the images (not just emit img tags pointing at the raw files), so that as far as the permission system is concerned it is the server accessing the raw files; a sketch of such a pass-through script follows the CAPTCHA suggestion below.
On your PHP page itself, use a CAPTCHA device or some other robot detection method.
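A minimal sketch of that pass-through idea, with a crude per-session request limit bolted on. The paths, the 100-per-minute figure, and the session-based counter are assumptions; a bot that discards cookies gets a fresh session every time, so an IP-keyed store (database or cache) would be stronger:

    <?php
    // image.php (hypothetical name): serves files from a folder that is
    // denied to direct HTTP requests, and throttles per session.
    session_start();

    $window = 60;        // seconds
    $maxPerWindow = 100; // images per window, as suggested in the question

    if (!isset($_SESSION['img_window']) || time() - $_SESSION['img_window'] > $window) {
        $_SESSION['img_window'] = time();
        $_SESSION['img_count']  = 0;
    }
    if (++$_SESSION['img_count'] > $maxPerWindow) {
        http_response_code(429);
        exit('Too many requests');
    }

    // Only accept a plain numeric id, so nobody can request arbitrary paths.
    $id = (int) ($_GET['id'] ?? 0);
    $file = '/var/www/protected_images/' . $id . '.png';
    if ($id < 1 || !is_file($file)) {
        http_response_code(404);
        exit;
    }

    header('Content-Type: image/png');
    header('Content-Length: ' . filesize($file));
    readfile($file);

Your pages would then embed <img src="image.php?id=123"> instead of linking to the .png files directly.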
I am working on an RSS reader using SimplePie.
I want to pull in RSS feeds and display articles with thumbnails. I am using RSS enclosures and will later parse the RSS <item> elements for <img> tags.
Is there a safe way to download these remote images to my server so I can cache them and resize them?
I have some ideas about how it could be done with cURL, for instance, but I'm interested in the security side. I know I can restrict by file extension and I could probably look at the MIME type, but I just want to know whether this is at all feasible/safe, and if so, what steps should be taken to ensure there are no vulnerabilities.
Resize the images with GD or ImageMagick.
Possible security issues:
Execution of a remote script: somehow you download a PHP script instead of an image, and the server executes it on a later request for that "image".
Solution: in any web server you can configure the upload directory so that no script can be executed there. And don't forget to give the stored images a proper extension (better to pick it from a whitelist; do not trust MIME types).
XSS on the client reading the RSS: you download an "image", but it turns out to be JavaScript.
Solution: make sure you only ever reference the file from an <img src="..."> tag; a script is not executed from an img tag. Moreover, resizing the image with GD or ImageMagick will certainly destroy any JS hidden inside, if there was any.
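A minimal sketch of that download-and-re-encode step with cURL and GD; the URL, size limit, thumbnail width, and cache path are placeholder assumptions:

    <?php
    // Fetch a remote image and re-encode it before it ever touches the web root.
    $url = 'http://example.com/feeds/thumb.jpg';

    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_FOLLOWLOCATION => true,
        CURLOPT_MAXREDIRS      => 3,
        CURLOPT_TIMEOUT        => 10,
    ]);
    $raw = curl_exec($ch);
    curl_close($ch);

    if ($raw === false || strlen($raw) > 5 * 1024 * 1024) {
        exit('Download failed or file too large');
    }

    // imagecreatefromstring() fails unless the bytes really are an image,
    // and re-encoding strips anything that is not pixel data.
    $img = @imagecreatefromstring($raw);
    if ($img === false) {
        exit('Not a valid image');
    }

    // Resize to a 200px-wide thumbnail.
    $w = imagesx($img);
    $h = imagesy($img);
    $newW = 200;
    $newH = (int) round($h * $newW / $w);
    $thumb = imagecreatetruecolor($newW, $newH);
    imagecopyresampled($thumb, $img, 0, 0, 0, 0, $newW, $newH, $w, $h);

    // Always write with an extension you chose, never one taken from the URL.
    imagejpeg($thumb, '/var/www/cache/thumb_' . md5($url) . '.jpg', 85);
    imagedestroy($img);
    imagedestroy($thumb);

Because imagecreatefromstring() only succeeds on real image data and the output is written by imagejpeg() under a name you chose, neither a disguised PHP script nor embedded JavaScript survives the round trip.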
On my site, admins can upload ZIP files that contain a set of JPGs. Users can then download the ZIPs. I also want users to be able to preview images from the ZIP file. But I don't want to extract the images into a temp dir on the server and delete them later.
Is there a way to do this? Perhaps with the img "data:" URI scheme?
One way or another, people would have to download parts of that ZIP file, so you might as well do some server-side processing and generate thumbnails of the images ONCE. Then just serve up those thumbs repeatedly. Otherwise you're going to waste a lot of CPU time repeatedly extracting images, encoding them as data URIs, and sending those data URIs out.
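A minimal sketch of that generate-once idea with ZipArchive and GD; the archive path, entry name, and thumbnail width are assumptions:

    <?php
    // Read one JPG out of the ZIP, cache a thumbnail, and serve the cached copy.
    $zipPath   = '/var/www/uploads/gallery.zip';
    $entry     = 'photos/001.jpg';
    $thumbPath = '/var/www/cache/' . md5($zipPath . $entry) . '.jpg';

    if (!is_file($thumbPath)) {
        $zip = new ZipArchive();
        if ($zip->open($zipPath) !== true) {
            exit('Cannot open archive');
        }
        $data = $zip->getFromName($entry); // one entry, straight into memory
        $zip->close();

        $img = imagecreatefromstring($data);
        $w = imagesx($img);
        $h = imagesy($img);
        $newW = 150;
        $newH = (int) round($h * $newW / $w);
        $thumb = imagecreatetruecolor($newW, $newH);
        imagecopyresampled($thumb, $img, 0, 0, 0, 0, $newW, $newH, $w, $h);
        imagejpeg($thumb, $thumbPath, 80);
        imagedestroy($img);
        imagedestroy($thumb);
    }

    header('Content-Type: image/jpeg');
    readfile($thumbPath);

Enumerating every image in the archive works the same way via $zip->numFiles and $zip->statIndex().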
Hi, I want to show an image gallery on my site and then have a link that allows the user to download the whole gallery. Is it OK to place the images in a zipped folder and provide a link to that file? I had the feeling that it was not a good idea to zip JPEGs.
For the purpose of bundling the files it is perfectly acceptable, but it won't reduce the download size by a significant amount, since JPEGs are already compressed and deflating them again gains almost nothing.
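If you build the archive server-side, a minimal ZipArchive sketch; the paths are placeholders, and setCompressionName() needs PHP 7.0 or later. Storing the JPEGs uncompressed just skips a pointless deflate pass, since they are already compressed:

    <?php
    // Bundle the gallery into one ZIP for the download link.
    $zip = new ZipArchive();
    $zip->open('/var/www/downloads/gallery.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE);

    foreach (glob('/var/www/gallery/*.jpg') as $file) {
        $name = basename($file);
        $zip->addFile($file, $name);
        $zip->setCompressionName($name, ZipArchive::CM_STORE);
    }
    $zip->close();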