Image serving from two different servers - PHP

I hope you are all doing great. My question is:
I am using the Cantaloupe image server, which serves images according to user-specified parameters in the URL via the IIIF Image API v2.0.
Here are example URLs:
https://iiif.blavatnikarchive.org/iiif/2/baf__be12495f1d825e832cd7b66f0ee30c8adda804cd6c19e627537107b714b95356/full/!1000,1000/0/default.jpg (1000x1000 image)
https://iiif.blavatnikarchive.org/iiif/2/baf__be12495f1d825e832cd7b66f0ee30c8adda804cd6c19e627537107b714b95356/full/!512,512/0/default.jpg (512x512 image)
The image server takes around 4 s to process an image for user-defined dimensions and region extraction. So what I am doing is pre-generating some thumbnails with the image server and storing them on Amazon S3; if a user requests the same thumbnail again and again, I serve the pre-generated one. Two benefits:
1- The image server does not compute the thumbnail every time, so the load on the server stays low.
2- Serving a pre-generated static thumbnail is faster.
The problem is that two servers are now involved:
1- The image server, for dynamic content creation: https://iiif.blavatnikarchive.org
2- Amazon S3 buckets, for the pre-generated static thumbnails: assets.s3.amazonaws.com/image-name
I want to serve images through one URL, so the end user is not redirected to different locations for the same image at different sizes. So I decided to serve images through my API:
https://api.blavatnikarchive.org/baf__be12495f1d825e832cd7b66f0ee30c8adda804cd6c19e627537107b714b95356/full/!512,512/0/default.jpg (Apache with PHP)
In my API I know which request is for a static thumbnail (to be fetched from the S3 bucket) and which is for a dynamic size (to be fetched from the image server). Currently the API fetches the image with file_get_contents("url of image whether its amazon s3 url or image server url"), so the request first downloads the image to my API server and then serves it to the client, whose browser downloads it again. This is time-consuming, around 2 s per image, which is not acceptable; image serving time should be under a second. I want to know whether there is a way to map my API URL directly to the image server and the Amazon server. (A rough sketch of the current approach is shown below.)
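For context, here is a minimal sketch of the current buffering approach. The routing check and the ID-to-key mapping are assumptions based on the example URLs above, not the real rules:

    <?php
    // index.php -- current approach: buffer the whole image on the API
    // server with file_get_contents(), then re-send it to the client,
    // so the image is effectively downloaded twice (hence the ~2 s).

    $path = ltrim($_SERVER['REQUEST_URI'], '/');
    $id   = strtok($path, '/'); // e.g. baf__be12495f...

    // Assumed routing rule: the pre-generated 512x512 thumbnail lives
    // on S3 under the ID minus its "baf__" prefix; everything else
    // goes to the Cantaloupe server.
    if (strpos($path, '/full/!512,512/') !== false) {
        $source = 'https://baf-iiif-assets.s3.amazonaws.com/' . substr($id, strlen('baf__'));
    } else {
        $source = 'https://iiif.blavatnikarchive.org/iiif/2/' . $path;
    }

    // Blocks until the entire image has arrived on the API server
    // before the first byte is sent on to the client.
    $image = file_get_contents($source);

    header('Content-Type: image/jpeg');
    header('Content-Length: ' . strlen($image));
    echo $image;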
For example, if the user requests
https://api.blavatnikarchive.org/baf__be12495f1d825e832cd7b66f0ee30c8adda804cd6c19e627537107b714b95356/full/!1000,1000/0/default.jpg
it should map to
https://iiif.blavatnikarchive.org/iiif/2/baf__be12495f1d825e832cd7b66f0ee30c8adda804cd6c19e627537107b714b95356/full/!1000,1000/0/default.jpg (1000x1000 image)
while
https://api.blavatnikarchive.org/baf__be12495f1d825e832cd7b66f0ee30c8adda804cd6c19e627537107b714b95356/full/!512,512/0/default.jpg (512x512 image)
should map directly to the static thumbnail
https://baf-iiif-assets.s3.amazonaws.com/be12495f1d825e832cd7b66f0ee30c8adda804cd6c19e627537107b714b95356
Or can you suggest a solution for how I can gather everything under one URL? I want to keep the user on my API URL and don't want to use any redirection. How can I achieve this?
Many thanks.

Related

How to access private S3 images in the browser using PHP

I have thousands of images from a private S3 bucket to display in the browser. What is the best way to get these private files into the browser?
I have found multiple solutions for getting a private image from S3, listed below:
1- Make the files public. (Can't use this, per the requirement.)
2- Generate pre-signed URLs for the files.
3- Pull the image from S3 via the API, cache it, and serve it.
4- Change the bucket policy.
Currently I am using signed URLs, but I have to generate a signed URL for every image, which will take a lot of processing time.
My question is: what is the best way, and how do I achieve it?
Your method of using pre-signed URLs is correct.
You should generate these URLs when serving the HTML page that contains the images. This can be done with a couple of lines of code, via the createPresignedRequest PHP call. (I'm not familiar with Laravel, but you tagged your question with PHP.)
Thus the page will contain dynamic content, created for the user on the fly.
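For reference, a minimal sketch using createPresignedRequest from the AWS SDK for PHP v3; bucket, key, and region are placeholders:

    <?php
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $s3 = new S3Client([
        'region'  => 'us-east-1',  // placeholder: your bucket's region
        'version' => 'latest',
    ]);

    $cmd = $s3->getCommand('GetObject', [
        'Bucket' => 'my-private-bucket',      // placeholder
        'Key'    => 'images/photo-0001.jpg',  // placeholder
    ]);

    // The resulting URL stays valid for 20 minutes.
    $request = $s3->createPresignedRequest($cmd, '+20 minutes');
    $url     = (string) $request->getUri();

    echo '<img src="' . htmlspecialchars($url) . '">';

Note that signing a URL is a local computation (no call to S3 is made), so generating one per image is much cheaper than it sounds.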

How to download images from a server

I have an app that downloads some images from a server. Currently the app makes an HTTP POST request to a PHP script, which retrieves the images (4 per request) and sends them to the app in a JSON response (encoded in base64). I put the images into a ScrollView, and when the user reaches the end of the list, a new group of images is downloaded.
This is not the most performant way, so I would like to use one of the lazy-load libraries found on GitHub, but all of them require a link to the image, and I don't want to send image links to the app in any way.
So how can I retrieve images with lazy loading?
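For context, a minimal sketch of the kind of endpoint described above; the image directory and the paging parameter are hypothetical:

    <?php
    // images.php -- returns 4 base64-encoded images per request.
    header('Content-Type: application/json');

    $page  = max(0, (int) ($_POST['page'] ?? 0));
    $files = glob('/var/www/app-images/*.jpg') ?: [];  // hypothetical location
    $batch = array_slice($files, $page * 4, 4);

    $out = [];
    foreach ($batch as $file) {
        $out[] = [
            'name' => basename($file),
            'data' => base64_encode(file_get_contents($file)),
        ];
    }

    echo json_encode($out);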
Requesting images one by one or four by four does not make much of a difference. In both cases you would first put a placeholder image in the ImageView(s) and, once the download and extraction are complete, put the images in the respective ImageViews if they still exist. I assume you store the downloaded and extracted images on the device so you can reuse them when the user scrolls back or restarts the app?

Dynamically Resize Images with PHP and Protect them from Direct Original Size Access

Think stock images. You have a full-size original that can only be downloaded after purchase. You only want to have that image once on your server, at full-size. But you want to display that image in various places in smaller sizes (responsively reduced for mobile devices as well).
I've got various pieces of this puzzle, but have not been able to make them all work together:
With TimThumb or CImage I can resize the images server-side, and using jQuery I can dynamically change the image source based on the user's screen size.
With PHP and .htaccess I can place the image files outside of the webroot and route a spoof URL containing image name to the PHP file that will read the actual image and send a header with the image data. Fine.
But TimThumb and CImage work with the real image URLs and they test to make sure that the given URL is the actual file path of the image.
If I could send the actual image data to the image resizing script rather than the URL to the image, it should work out from there. Since the PHP script would be reading the image data, I could check to see that the user has been given the proper credentials before doing the read.
Are there any other solutions you can think of besides hacking TimThumb or CImage or writing my own custom image resizing script?
Thank you
The answer just came to me.
With .htaccess, route all images through the image processing script. On the first line of the image processing script, include my own custom script. The custom script checks the GET parameters against the actual image to determine whether the user has the credentials to be served the requested image at the requested size.
If so, it lets the image processing script continue; if not, it exits or changes the GET parameters so that the image processing script serves a placeholder image.
Voilà!
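A minimal sketch of that gatekeeper idea; the is_authorized() check and the parameter names are hypothetical:

    <?php
    // gate.php -- included on the first line of the image processing
    // script. An .htaccess rule routing images there might look like:
    //   RewriteRule ^images/(.*)$ /resize.php?src=$1 [QSA,L]

    session_start();

    // Hypothetical credential check against the requested image/size,
    // e.g. a purchase recorded in the session or a database.
    function is_authorized(string $image, int $width): bool
    {
        return isset($_SESSION['purchased'][$image]) || $width <= 400;
    }

    $image = $_GET['src'] ?? '';
    $width = (int) ($_GET['w'] ?? 0);

    if (!is_authorized($image, $width)) {
        // Swap the GET parameters so the resizer serves a placeholder
        // instead of the protected original.
        $_GET['src'] = 'placeholder.jpg';
    }
    // ...the image processing script continues from here.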

FineUploader: Possible to upload to different locations based on file type?

Happy Friday!
I was curious if the following was possible:
I'd like to be able to upload media to different locations based on its file type. I really appreciate that Fine Uploader can upload directly to Amazon S3, but I don't want everything to go straight to S3. I'm using WordPress and need to generate different image sizes for uploaded media, so I'd like to upload images to my server for processing and then over to S3 (via this plugin). Any other media (audio, video, etc.) I'd like to upload to S3. What do you think? Is this possible?
Are you OK with having two separate Fine Uploader instances? That is what would be required. Essentially, you would have to set up one button tied to a traditional-endpoint Fine Uploader instance, and another tied to a Fine Uploader S3 instance. The buttons could have specific validation restrictions tied to them to prevent users from accidentally submitting an image file to the S3 uploader.
Another option is to provide your own file input element, check the submitted files, and then pass the appropriate file(s) to the appropriate Fine Uploader instance via Fine Uploader's API (addBlobs or addFiles).
Another possibility: just allow your users to upload all files to S3, then pull each image file back down to your server (temporarily) after it has reached the bucket, modify it, and send it back to S3.
Note that I am working on a feature for Fine Uploader 4.4 that will allow you to specify image sizes via options; Fine Uploader will scale the images and send each scaled image separately to whatever endpoint you choose. See issue #1061 for details/progress updates.

Webcam image stream for website implementation

I am setting up a live image stream on a website, using images from a webcam, and am trying to work out the implementation. The webcam takes a picture that requires a crop, resize, and upload (not necessarily in that order) before it is displayed to the user, with a new image every minute. Currently I have a PHP script that does the cropping and resizing, while a webcam program automates the picture-taking and uploading. However...
Uploading directly over the existing image causes an issue if the user reloads the page while the upload is taking place, resulting in a missing image.
Uploading with a different filename, then renaming it causes an issue if the user reloads the page during the renaming, resulting in a combination of both images.
Using a sequential filename system gets tricky, since the webpage needs to know the name of the upcoming file every minute, and a backlog of images can build up.
Any suggestions are appreciated. Hopefully I'm missing something simple.
Thanks.
Just upload your image with a different name each time, store the current image name somewhere (either in a config file or in MySQL), and switch it over once the upload is complete.
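A minimal sketch of that pointer idea, using a pointer file updated with an atomic rename(); all paths and the naming scheme are hypothetical:

    <?php
    // current-image.php -- the page always requests this one URL.
    // The uploader saves each new frame under a unique name, so the
    // image being served is never partially overwritten.

    $dir     = '/var/www/webcam/';  // hypothetical location
    $pointer = $dir . 'current.txt';

    // --- uploader side, after a new frame has fully uploaded ---
    // file_put_contents($pointer . '.tmp', 'cam-1718120400.jpg');
    // rename($pointer . '.tmp', $pointer); // atomic on one filesystem

    // --- page side: serve whichever frame the pointer names ---
    $frame = basename(trim(file_get_contents($pointer)));
    header('Content-Type: image/jpeg');
    readfile($dir . $frame);

Because rename() on the same filesystem is atomic, a reader sees either the old pointer or the new one, never a half-written file or a mix of two images.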
