I have uploaded thousands of files to a Google Cloud Storage bucket. When a user rotates an image on my website, I internally update the uploaded version (re-upload the rotated version to the bucket via PHP). But Google keeps displaying the same old image again and again.
However, if I append ?r=[number] to the URL, where [number] is some random number, the updated image is displayed correctly. And if I then remove the ?r=[number] part from the URL, the old image is displayed again.
Is there a way to tell google to update the displayed version?
You can set the Cache-Control header to private on the uploaded files to ensure that the image is always served fresh, or set max-age to whatever your maximum bound for staleness is. See Cache Control in the Google Cloud Storage documentation for details.
Note that after you do this, you'll still need to wait for browser caches containing the original objects to expire, but all newly served images will use the updated cache control settings.
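On App Engine's PHP runtime, for example, per-object headers can be set through a stream context when writing to the bucket. This is only a sketch: the bucket and object names are placeholders, and the 'Cache-Control' context key is an assumption to verify against the App Engine GCS documentation.

```php
<?php
// Sketch: re-upload the rotated image with an explicit Cache-Control header.
// Bucket/object names are placeholders; the 'Cache-Control' context key is
// an assumption to check against the App Engine GCS stream wrapper docs.
$options = [
    'gs' => [
        'Content-Type'  => 'image/jpeg',
        // A short max-age keeps rotated images from being served stale.
        'Cache-Control' => 'private, max-age=0',
    ],
];
$ctx = stream_context_create($options);

// On App Engine the write would go through the gs:// wrapper, e.g.:
// file_put_contents('gs://my_bucket/images/photo.jpg', $rotatedJpegBytes, 0, $ctx);
```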
In my app, I want users to be able to upload a picture to set for their profile. I only want them to be able to have one picture set and in storage (Google Cloud Storage) at a time.
When users upload the picture, it gets renamed the same thing every time. I do this so I don't have to search and delete the old file. The new file just replaces it.
The problem I'm running into is that once an image is uploaded and replaces the old image, it remains the old image even though the actual file in Google Cloud Storage has changed to the new image. I have verified the file has been successfully replaced by looking at the actual file in the storage browser. There is no trace of the old file.
To serve the image I am using this method:
https://developers.google.com/appengine/docs/php/googlestorage/images
getImageServingUrl() should be pulling the correct file as it is named exactly the same, but it's not. How is it still hanging on to the old image? Is the old file being cached or something?
The result of getImageServingUrl() is being echoed into the src attribute of an img tag.
Any insight would be appreciated! Thanks!
Update: I tried calling deleteImageServingUrl() before getImageServingUrl(), however the problem still persists.
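One workaround, in the spirit of the ?r=[number] trick from the question above, is to append a version token to the serving URL so browsers and caches treat each replacement as a new resource. A sketch; the helper name and version scheme are my own, and the serving URL itself would still come from getImageServingUrl():

```php
<?php
// Sketch: cache-bust a serving URL with a version token that changes on
// every re-upload (e.g. the upload timestamp stored alongside the file).
function versionedUrl(string $servingUrl, int $uploadedAt): string
{
    $sep = (strpos($servingUrl, '?') === false) ? '?' : '&';
    return $servingUrl . $sep . 'v=' . $uploadedAt;
}

// Usage: echo versionedUrl($servingUrl, $uploadedAt); into the img src
// attribute instead of the bare serving URL.
```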
I am currently using the PHP SDK for Rackspace to upload files to a container called testcontainer, via a library that wraps the SDK. I think I'll need to call the SDK directly to accomplish what I'm looking for, but before I do, I haven't had much luck finding out how to clear Rackspace's cache.
The problem I run into (in case it's not a caching issue for whatever reason) is:
1. Upload a file called test.jpg.
2. Visit the CDN endpoint /test.jpg and see my image.
3. Locally change the image to something else but keep the name test.jpg.
4. Upload the file to the same CDN container, replacing the other test.jpg.
5. Visit the CDN endpoint /test.jpg; the image is still the original test.jpg, not my new image.
This looks like a caching issue. I know you can clear and reset the cache from within your account, but I haven't been able to find any reference to it in the documentation.
Any ideas? Thanks.
If you have set your Cloud Files containers as CDN-enabled, what you are seeing is indeed a caching issue. Unfortunately, there's no practical way to flush the caches at scale, and even for a single object it can take minutes for the purge to propagate globally. For more info on how to flush CDN caches for individual objects, and the limits thereof, please see: here.
And of special attention:
You can use the DELETE operation against a maximum of 25 objects per day using the API
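For the single-object case, the purge is an HTTP DELETE against the object's CDN management URL with your auth token. A hedged sketch of building that request in PHP; the management URL and token are placeholders for values taken from your Rackspace authentication response:

```php
<?php
// Sketch: build the purge (DELETE) request for one Cloud Files CDN object.
// The CDN management URL and auth token are placeholders; the real values
// come from the Rackspace authentication response.
function buildCdnPurgeRequest(string $cdnManagementUrl, string $authToken,
                              string $container, string $object): array
{
    return [
        'method'  => 'DELETE',
        'url'     => rtrim($cdnManagementUrl, '/') . '/'
                     . rawurlencode($container) . '/' . rawurlencode($object),
        'headers' => ['X-Auth-Token: ' . $authToken],
    ];
}

// Executing it with PHP's curl extension would look like:
// $req = buildCdnPurgeRequest($cdnUrl, $token, 'testcontainer', 'test.jpg');
// $ch = curl_init($req['url']);
// curl_setopt($ch, CURLOPT_CUSTOMREQUEST, $req['method']);
// curl_setopt($ch, CURLOPT_HTTPHEADER, $req['headers']);
// curl_exec($ch);  // a 2xx status means the purge was accepted
```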
Say I store PDF files in the database (not important). Users of the application periodically visit a page to grab a stored PDF and print it (for adhesive labels, by the way).
I am annoyed at the thought of their downloads directory filling up with duplicates of the same document over time, since they will download the PDF every time they need to print the labels.
Is there a way to instruct the browser to cache this file? Or any method of linking to the user's file system, possibly? All users will be on Chrome/Firefox on Windows 7 Pro.
ETags will help you do this. If the file hasn't been updated since the client last downloaded it, the server will send a 304 "Not Modified" response instead of the file.
If your files are dynamically generated, you will need to implement ETag generation manually in PHP rather than relying on the web server.
http://www.php.net/manual/en/function.http-cache-etag.php
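For example, a minimal hand-rolled version; hashing the body with md5 is just one reasonable choice of ETag:

```php
<?php
// Sketch: manual ETag handling for a dynamically generated file.
// Returns true if a 304 was sent and the body should be skipped.
function handleEtag(string $content): bool
{
    $etag = '"' . md5($content) . '"';
    header('ETag: ' . $etag);

    $clientEtag = $_SERVER['HTTP_IF_NONE_MATCH'] ?? '';
    if (trim($clientEtag) === $etag) {
        // Client already has this exact version: 304, no body.
        http_response_code(304);
        return true;
    }
    return false;
}

// Usage:
// $pdf = generateLabelsPdf(); // hypothetical generator
// if (!handleEtag($pdf)) {
//     header('Content-Type: application/pdf');
//     echo $pdf;
// }
```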
I've found a useful solution to my problem.
From the comments on my question, we concluded it would work best to use the browser's built-in PDF/DOC renderer and download anything else that isn't recognized.
I read this standard: https://www.rfc-editor.org/rfc/rfc6266
This is the solution (header):
Content-Disposition: inline; filename=something.pdf
Instead of attachment, I've used "inline" in order to use the browser's built-in viewer where possible.
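In PHP, that header can be emitted before streaming the stored bytes. A sketch, where the PDF content and filename are placeholders for whatever comes out of the database:

```php
<?php
// Sketch: serve stored PDF bytes inline so browsers with a built-in viewer
// render them instead of saving a new copy to the downloads directory.
$pdf = '%PDF-1.4 ...'; // placeholder for the PDF bytes from the database

header('Content-Type: application/pdf');
header('Content-Disposition: inline; filename="labels.pdf"');
header('Content-Length: ' . strlen($pdf));
echo $pdf;
```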
Most browsers will do this automatically based on the URL. If the URL for a particular PDF blob is constant, the browser will not re-download it unless the server responds that it has changed (by way of HTTP headers).
You should therefore design your site to have "permalinks" for each resource. This could be achieved by having a resource-ID of some sort in the URL string.
As others have said in comments, a server cannot guarantee that a client does ANYTHING in particular; all you can offer are suggestions that you hope most browsers will treat similarly.
I have a site where each time you upload an image it gets rendered in various frame sizes. A cron job runs every 10 minutes which looks to see if any new images have been uploaded during that time and if so it generates all the needed frames.
Since this cron runs every 10 minutes there is some time between the content (such as an article) goes live and the time the images are made available. So during that meantime a generic placeholder image with the site's logo is shown.
Since Akamai caches images, when a site user loads a page containing an image that hasn't been rendered by the cron yet, the static placeholder is served for that image path and Akamai caches it. Even when the image is later rendered and available, users will still get the cached placeholder from Akamai.
One solution is to bust the cache for these images once the cron has rendered them, but it takes Akamai about 8 minutes to come back for the new ones.
Is there any other solution where I can tell Akamai perhaps through cache expiration headers to come back every 10 seconds until a new image is received and once that's done don't come back again and keep showing the cached version?
Yes, in a way, if you combine a few steps on the server side with the Akamai configuration settings.
Here's the concept: the Edge Server delivers the content that it has. If you use cache-control headers (from PHP, for example), the TTL settings in the Akamai configuration for the respective digital property blow them away, and Akamai uses those instead; you tell it how often to come to your origin server by path, file type, extension, or whatever. On the client side, the Edge Servers deliver whatever files they have to the end user, and it doesn't really matter how often they are requested for the content, unless you don't cache at that level, in which case requests roll back up to your origin.
Using those configuration settings you can specify that a specific file has an exact expiration - or to not cache it at all.
So if on the server side you reference placeholder.jpg on your page and tell Akamai not to cache that image at all, it will come back to the origin each time an Edge Server gets a request for it. Once you have the real image in place, placeholder.jpg no longer appears on your page; instead there is sizeA.jpg, which obeys your regular image caching times.
This may not be exactly ideal, but it's about the best you can do short of manually clearing the page, and as far as I know there is no API call you can fire off to clear a page (plus it takes 7-10 minutes for a cache clear to propagate through their network anyway).
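On the origin side, and assuming your Akamai configuration is set to honor origin cache-control headers for this path, the "don't cache the placeholder" half can be sketched like this (the paths and helper name are illustrative, not from the original setup):

```php
<?php
// Sketch: pick cache headers depending on whether the cron has produced
// the rendered frame yet. Paths and helper name are illustrative.
function cacheHeadersFor(string $renderedPath): array
{
    if (is_file($renderedPath)) {
        // Real frame exists: normal long-lived image caching.
        return ['Cache-Control: public, max-age=86400'];
    }
    // Placeholder: mark uncacheable so the edge keeps re-checking origin.
    return ['Cache-Control: no-store, no-cache, must-revalidate'];
}

// Usage, in the script that serves the image path:
// foreach (cacheHeadersFor($renderedPath) as $h) { header($h); }
// readfile(is_file($renderedPath) ? $renderedPath : $placeholderPath);
```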
I have a PHP-based website and would like browsers to cache its images for 30 days. I am using a shared hosting solution where I do not have access to the Apache config to enable mod_headers or other modules, so I cannot use .htaccess mechanisms for this.
My site is a regular PHP app with both HTML content and images. I would like the browser to cache images only. I've seen PHP's header function, but couldn't find a way to force caching for images only. How do I go about it?
Thanks
As far as I know, if you can't get access to Apache to set the headers, your only other option is to serve the images from a PHP script, so you can use PHP's header function to set the headers.
In this case, you'd need to write a PHP image handler and replace all your image tags with calls to this handler (e.g. http://mysite.com/imagehandler.php?image=logo.png). Your imagehandler.php script would then retrieve the image from the file system, set the MIME type and cache control headers, and stream the image back to the client.
You could write your own, or if you google around you will find existing image handler PHP scripts. Either way, make sure you focus on security: don't allow the client to retrieve arbitrary files from your web server, because that would be a fairly major security hole.
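A minimal sketch of such a handler; the directory layout and parameter name are assumptions, and the basename() call strips directory components, which blocks ../ path traversal:

```php
<?php
// imagehandler.php -- sketch: resolve a requested image to a safe path
// plus long-lived cache headers. Directory/parameter names are illustrative.
function serveImage(string $requested, string $imageDir): ?array
{
    // basename() strips directory components, blocking ../ traversal.
    $name = basename($requested);
    $allowed = [
        'jpg' => 'image/jpeg', 'jpeg' => 'image/jpeg',
        'png' => 'image/png',  'gif'  => 'image/gif',
    ];
    $ext  = strtolower(pathinfo($name, PATHINFO_EXTENSION));
    $path = rtrim($imageDir, '/') . '/' . $name;

    if ($name === '' || !isset($allowed[$ext]) || !is_file($path)) {
        return null; // caller sends 404
    }
    return [
        'path'    => $path,
        'headers' => [
            'Content-Type: ' . $allowed[$ext],
            'Cache-Control: public, max-age=' . (30 * 24 * 60 * 60), // 30 days
        ],
    ];
}

// Usage (in imagehandler.php):
// $img = serveImage($_GET['image'] ?? '', __DIR__ . '/images');
// if ($img === null) { http_response_code(404); exit; }
// foreach ($img['headers'] as $h) { header($h); }
// readfile($img['path']);
```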