I have created a program which allows me to upload images to my server. They are given a random file name when uploaded. I want to be able to download all the images from a folder on the server so I can display them in my application. The only example I have seen requires that I know the file names of the images, which I don't. How could I download all the images in a given directory (and store the downloads in an NSArray)? If there is no native way to do it, does anyone know how it could be done by calling a PHP script? (I use a PHP script which the iPhone calls to upload the images.)
Thanks.
To list all images in a directory using PHP and return a JSON encoded string:
$path = '/full/path/to/images';

// find all files with extension jpg, jpeg, png
// note: will not descend into subdirectories
$files = glob("{$path}/{*.jpg,*.jpeg,*.png}", GLOB_BRACE);

// output as JSON
echo json_encode($files);
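Note that glob() returns filesystem paths rather than URLs, which the app will probably want instead. A minimal sketch of that variation, assuming the directory is publicly served; the base URL below is a placeholder for your own:

<?php
// placeholders: adjust to your own server layout
$path    = '/full/path/to/images';
$baseUrl = 'http://yoursite.com/images';

// collect jpg/jpeg/png files (does not descend into subdirectories)
$files = glob("{$path}/{*.jpg,*.jpeg,*.png}", GLOB_BRACE);

// turn each filesystem path into a public URL
$urls = array_map(function ($file) use ($baseUrl) {
    return $baseUrl . '/' . basename($file);
}, $files);

header('Content-Type: application/json');
echo json_encode($urls);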
You can call a PHP script that returns the URLs of all your images, something like:
yoursite.com/image123.jpg;yoursite.com/image213.jpg;yoursite.com/imageabc.jpg
Then you parse the result, split it by ";", and get the array of URLs you need to download.
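A minimal sketch of such a script, joining the URLs with ";"; the directory and site URL are placeholders:

<?php
$path    = '/full/path/to/images';   // placeholder path
$baseUrl = 'http://yoursite.com';    // placeholder site URL

$urls = array();
foreach (glob("{$path}/{*.jpg,*.jpeg,*.png}", GLOB_BRACE) as $file) {
    $urls[] = $baseUrl . '/' . basename($file);
}

// e.g. http://yoursite.com/image123.jpg;http://yoursite.com/image213.jpg
echo implode(';', $urls);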
I am trying to save to disk an image that is served to me via a JSON result. The returned JSON result property that I am interested in is this:
https://i.scdn.co/image/6cd03f58ddf30a1393f06d6469973ba16ac908df
Which is the correct image. The problem is that, while the above URL does display the image, my code does not manage to download it, although I can download it manually by right-clicking on it.
What I need to be able to do is, using my PHP code, save it to disk.
I have no issues saving results from other sites whose URLs link to a direct image extension (.jpg, .gif or .png), but I have not been able to figure out how to programmatically download the image from the above URL.
Is it possible?
This is the code that I use, which works correctly on results that give a URL that has a correct image extension. The URL returned is loaded into the $largeimg variable.
$input = $largeimg;
$output = 'image.jpg';
file_put_contents($output, file_get_contents($input));
How do I achieve this?
file_get_contents() accepts URLs directly. Your code works perfectly for me if modified like this:
$input = 'https://i.scdn.co/image/6cd03f58ddf30a1393f06d6469973ba16ac908df';
So file_get_contents() can download the image directly. I think the problem is your $largeimg variable.
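A minimal sketch with some basic error checking, assuming allow_url_fopen is enabled; the variable names are illustrative:

<?php
$input  = trim($largeimg); // make sure there is no stray whitespace or quoting in the URL
$output = 'image.jpg';

$data = file_get_contents($input);
if ($data === false) {
    die("Could not download {$input}");
}

if (file_put_contents($output, $data) === false) {
    die("Could not write {$output}");
}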
So I have an XML file that has a base64 encoded data string for a pdf file, which just has an image taken from an iPad.
This pdf file can be excessively large, as much as 14MB with dimensions of 57"x38".
These images are taken from an iPad through a DocuSign session, thus I have no way at the moment of controlling their size or format before they get to my php listener script.
However, my script cannot work with such large files as my CRM's API file size max is 10MB, and I need a way of reducing the file size before I can upload it through my CRM's API.
Now if it were just a JPG it would be OK, as there are plenty of ways to reduce file size in PHP, but it is a PDF. I have found plenty of PHP extensions for creating PDFs, but I haven't found any for reading a PDF and extracting an image from it.
So is there a way to extract the image from the PDF through PHP, or perhaps compress the pdf file?
UPDATE
I didn't think about the possibility of converting a PDF into a JPG, which is apparently easier to do with Imagick. I'm having my server admin install it and will see if I can make it work with my script.
UPDATE 2
So I was able to get Imagick working, and locally I can convert PDF files into JPGs and reduce the file size dramatically.
However, I am running into an issue using it with my application. I get the following error from my CRM's API:
Failed to parse XML-RPC request: Invalid byte 1 of 1-byte UTF-8 sequence.
So the process is the following:
1. The XML file has a base64 encoded data stream of the PDF file.
2. I decode this data.
3. I convert it with Imagick and reduce the file size.
4. I base64 encode the result and prep it for upload.
CODE
$imageBlob = base64_decode((string)$pdf->PDFBytes);
$imagick.$x = new Imagick();
$imagick.$x->readImageBlob($imageBlob);
$imagick.$x->setImageFormat('jpeg');
$imagick.$x->setImageCompressionQuality(60);
$imagick.$x->adaptiveResizeImage(1024,768,true);
$imageBlob = $imagick.$x->getImageBlob();
$PDFdata[] = base64_encode($imageBlob);
I can test the data by outputting it with the proper header and I can see the new JPEG fine, so I assume the data is properly formatted.
What am I missing?
Ok, so I figured it out.
Imagick was the way to go, and my use of it was fine. I just goofed up on the file name because I wasn't using a proper dynamic variable name. The code should have looked like this:
CODE
$imageBlob = base64_decode((string)$pdf->PDFBytes);

// ${'imagick'.$x} is a dynamic variable name, e.g. $imagick0, $imagick1, ...
${'imagick'.$x} = new Imagick();
${'imagick'.$x}->readImageBlob($imageBlob);
${'imagick'.$x}->setImageFormat('jpeg');
${'imagick'.$x}->setImageCompressionQuality(60);
${'imagick'.$x}->adaptiveResizeImage(1024,768,true);
$imageBlob = ${'imagick'.$x}->getImageBlob();

$PDFdata[] = base64_encode($imageBlob);
$PDFfile[] = $FormCustomField . $x . '.jpg';
So the error I was getting was caused by an invalid file name: the $x variable in the previous code was picking up junk values. Now everything works fine.
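As a side note, a plain local variable (or an array keyed by $x, if the objects need to be kept around) is usually simpler than dynamic variable names. A sketch of that alternative, keeping the same Imagick calls and assuming it runs inside the same loop over $x:

$imageBlob = base64_decode((string)$pdf->PDFBytes);

// one Imagick instance per iteration, no dynamic variable name needed
$imagick = new Imagick();
$imagick->readImageBlob($imageBlob);
$imagick->setImageFormat('jpeg');
$imagick->setImageCompressionQuality(60);
$imagick->adaptiveResizeImage(1024, 768, true);

$PDFdata[] = base64_encode($imagick->getImageBlob());
$PDFfile[] = $FormCustomField . $x . '.jpg';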
I'm using the AWS PHP SDK to save images on S3. The files are saved privately. I then show the image thumbnails in my web application using the S3 file URL, but since the files are private, the images are displayed as corrupt.
When the user clicks on the name of a file, a modal opens to show the file at a larger size, but the file is displayed as corrupt there as well, due to the same issue.
Now, I know that there are two ways to make this work: 1. Make the files public. 2. Generate pre-signed URLs for the files. But I cannot go with either of these options due to the requirements of my project.
My question is: is there any third way to resolve this issue?
I'd highly advise against this, but you could create a script on your own server that pulls the image via the API, caches it and serves it. You can then restrict access however you like without making the images public.
Example pass through script:
// $realpath is wherever the file really is, e.g. a URL your server
// itself is allowed to read (get_headers() expects a URL)
$headers = get_headers($realpath);
foreach ($headers as $header) {
    header($header);
}

$filename = $version->getFilename(); // $version comes from your own application code

// These lines if it's a download you want to force
// header('Content-Description: File Transfer');
// header("Content-Disposition: attachment; filename={$filename}");

$file = fopen($realpath, 'r');
fpassthru($file);
fclose($file);
exit;
This will barely "touch the sides" and shouldn't delay the appearance of your files too much, but it's still going to take some resources and bandwidth.
You will need to access the files through a script on your server. That script will do some kind of authentication to make sure the request is valid and you want the requester to see the file. Then fetch the file from S3 using a valid IAM profile that can access the private files, and output the file.
Instead of requesting the file from S3, request it from
http://www.yourdomain.com/fetchimages.php?key=8498439834
Then here is some pseudocode for fetchimages.php:
<?php
//if authorized to get this image
$key=$_GET['key'];
//validate key is the proper format
//get s3 url from a database based on the $key
//connect to s3 securely and read the file from s3
//output the file
?>
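A minimal sketch of what fetchimages.php could look like using the AWS SDK for PHP; the bucket name, region, and lookup function are placeholders, and credentials are assumed to come from the server's IAM profile:

<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$key = $_GET['key'];
// placeholder for your own validation and database lookup of the S3 object key
$objectKey = lookUpObjectKey($key);

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',          // adjust to your bucket's region
]);

$result = $s3->getObject([
    'Bucket' => 'your-private-bucket', // placeholder bucket name
    'Key'    => $objectKey,
]);

header('Content-Type: ' . $result['ContentType']);
echo $result['Body'];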
As far as I know, you could try to make your S3 bucket a "web server" like this, but then you would probably have to "make the files public". If you then have some kind of logic to restrict access, you could create a bucket policy.
I am using the API of an image editing website (pixlr.com) for use by members of my site. I open pixlr.com in an iframe where they can create an image, and upon SAVE, Pixlr sends the image file to my server by way of parameters.
I want to save these image files (unique for each member) in a folder on my server (or on Amazon's S3 image server) using PHP. How do I receive the image file from their "image" parameter and store it on my/Amazon's image server?
If the image is sent to your PHP script via POST, then you should be able to do something like this:
$handle = fopen($imageName, "wb");
fwrite($handle, $_POST["image"]);
fclose($handle);
Where $imageName is the absolute path and filename of the image where you want to save it (make sure your Apache user has write permissions to that directory). Depending on the picture's encoding you may need to figure out which extension to save it with (i.e. .jpg, .bmp, .png, etc.).
EDIT:
Looks like they are sending the image via $_FILES. Try this:
move_uploaded_file($_FILES["image"]["tmp_name"], "/home/path/domain.com/upload/". time() .".png");
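A slightly fuller sketch with basic checks around the upload; the destination directory is a placeholder:

<?php
// placeholder destination directory
$uploadDir = '/home/path/domain.com/upload/';

if (!isset($_FILES['image']) || $_FILES['image']['error'] !== UPLOAD_ERR_OK) {
    die('Upload failed');
}

$target = $uploadDir . time() . '.png';

if (!move_uploaded_file($_FILES['image']['tmp_name'], $target)) {
    die('Could not move uploaded file');
}

echo "Saved to {$target}";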
I am editing a photo gallery script to allow TIFF files to be uploaded and saved, but I must also keep the files in JPG format for web viewing.
What I have done is install ImageMagick to convert TIF to JPEG. Once I have the file converted, I want the script to continue with making thumbnails, zoom images, etc. It makes them from
$_FILES['image']['tmp_name']
Is there a way to set my newly created file as $_FILES['image']['tmp_name']? My new JPEG file path is stored in $nw.
Basically I need
$nw='path/to/newfile.jpg';
$_FILES['image']['tmp_name']=$nw;
but it does not work. Any ideas?
If you need to work on the same file across multiple page requests, move it somewhere safe using move_uploaded_file.
If the functions that you wrote require access to $_FILES['image']['tmp_name'], rewrite them to accept the name of the file as a parameter and call them using the new location of the file as argument.
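A minimal sketch of that refactor; the function, sizes and paths here are illustrative, not from the original gallery script:

<?php
// before: the thumbnail code read $_FILES['image']['tmp_name'] directly;
// after: it takes a file path, so it works for uploads and converted files alike
function makeThumbnail($sourcePath, $thumbPath, $width = 150)
{
    $img = new Imagick($sourcePath);
    $img->thumbnailImage($width, 0); // 0 keeps the aspect ratio
    $img->writeImage($thumbPath);
    $img->clear();
}

// original upload
makeThumbnail($_FILES['image']['tmp_name'], 'path/to/thumb.jpg');

// converted JPEG produced from the TIFF
$nw = 'path/to/newfile.jpg';
makeThumbnail($nw, 'path/to/newfile_thumb.jpg');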