Amazon CloudFront private distribution - links to images inside CSS - PHP

I created a private distribution in CloudFront to prevent hotlinking. I managed to create links to my objects with signed URLs, which is working fine now.
My only concern is that image links inside my CSS stylesheets are not working, because they are not signed. So if I have, for instance:
background-image: url('../img/bg.png');
The background image is not going to show up, since the stylesheet does not include a signed URL, and CloudFront therefore refuses to serve the content.
Is there anything I can do to prevent this?

Let me step back and ask a fundamental question: are you really worried about people hotlinking your images? Really? And if someone does, what realistic impact will it have on you? Really?
If you have a legitimate reason for preventing people from hotlinking, then I'm not really sure that any CDN service (in this case, CloudFront) is the right solution for you.
Hey, I'm just being honest…

There are a few ways, each with drawbacks.
Instead of a static CSS file, generate it from a template (or some other smart way to map resources to CloudFront locations). You can still get some degree of caching here by using Last-Modified and max-age cache headers. The hardest solution, but arguably the best protection.
Set up a redirection path for all your CSS resources: basically a small script that rewrites the path to CloudFront (take care to only rewrite CSS resources and nothing else; a sketch appears after this list). This lets you keep your current static CSS, but opens up potential hotlinking of your redirection script.
Something in between could be a cron script that generates static CSS files with links that expire in, say, 1.5 days, to reduce server load.
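A minimal sketch of the second option (the redirection script), assuming a hypothetical sign_cloudfront_url() helper and a placeholder distribution host, neither of which is from the original answer:

<?php
// redirect.php -- CSS would reference e.g. url('/redirect.php?f=bg.png'),
// and this script 302s to a short-lived signed CloudFront URL.
// sign_cloudfront_url() is a hypothetical helper; dxxxx.cloudfront.net is a placeholder.
$file = basename($_GET['f'] ?? '');                 // drop any path component

if (!preg_match('/\.(png|jpe?g|gif)$/i', $file)) {  // only serve image resources
    http_response_code(403);
    exit;
}

$signed = sign_cloudfront_url("https://dxxxx.cloudfront.net/img/{$file}");
header('Location: ' . $signed, true, 302);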

It's generally a really bad idea to use private distributions for your CSS files, because adding the (presumably constantly changing) expiration time and signature to CSS URLs prevents browsers from caching them, greatly reducing the usefulness of using CloudFront in the first place.
And even with long expiration times, someone who desperately wants to hotlink your CSS background images will simply set up a script that scrapes your CSS file and extracts the image URLs from it.

You could rotate the entire CDN host name every day or two; then you don't have to change anything in the CSS (assuming the CSS is on the CDN, which it looks like from the example):
http://www.explainthatstuff.com/blocking-cloudfront-hotlinks.html
Then your CSS doesn't need signed URLs and you can still block hotlinking just as effectively.

I just recently heard about base64-encoding images into HTML/CSS. There are pros and cons, but it might be what you're looking for:
Is embedding background image data into CSS as Base64 good or bad practice?

How about using the <base> tag in your HTML? I haven't tried it, but it might help.
Another option is to use PHP to generate your stylesheet: for example, you get the signed URLs for all the images, then push them as variables into the style rules.
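For instance, a sketch of that approach using the AWS SDK for PHP; the distribution domain, key path, and key-pair ID below are placeholders:

<?php
// style.php -- serve the stylesheet itself from PHP, signing each image URL.
require 'vendor/autoload.php';

use Aws\CloudFront\CloudFrontClient;

$cf = new CloudFrontClient(['region' => 'us-east-1', 'version' => 'latest']);

$signed = $cf->getSignedUrl([
    'url'         => 'https://d111111abcdef8.cloudfront.net/img/bg.png',
    'expires'     => time() + 3600,                // valid for one hour
    'private_key' => '/path/to/private_key.pem',   // placeholder
    'key_pair_id' => 'APKAEXAMPLE',                // placeholder
]);

header('Content-Type: text/css');
echo "body { background-image: url('{$signed}'); }";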


Concept: Why not use data:image for all src besides feeds/API/utility? Is there a PHP script to do it?

Theorizing here on how to get lightning-fast media and prevent hotlinking: <img src="data:image-kj134332k4" /> comes to mind, among other things. Scrapers don't get our src, and real clients need instant load (especially on cell networks). Considering the recent Google HTTPS-everywhere move, this would drastically decrease handshakes as well.
What disadvantages are there to crafting lists such as e-commerce categories/widgets/slideshows using data:image?
Are there any implications to the extra KB of actual source code, compared with serving a vastly larger total page size?
Do y'all prefer any PHP data:image generation script over another for encoding images as data at certain controller levels (leaving standard src images in other areas)?
Are there caching/CDN concerns? Would the parsing break caching somehow? It seems not, but I'm no cache expert.
Any guidance or case thoughts are much appreciated. Thank you!
Generally, the idea is worth considering, but in most cases the problems outweigh the benefits.
It is true that these images won't be cached on the client side anymore, and Expires-based caching in particular normally saves you tons of bandwidth.
As a rule of thumb I'd say: if these are small images that change frequently, embedding is a good idea. If images are larger and clients load the same image more than once in subsequent requests, then by all means deliver the images separately and put some effort into caching.
As for the other points:
Most browsers support this; however, some old IEs don't … so think of a fallback solution, or be prepared to get bug reports (this may be negligible, depending on your user base).
The number of SSL handshakes is negligible if you're using HTTP keep-alive, which is standard. Follow-up requests do indeed require a new handshake, but if you cache properly (see the next point) and maybe put static files on a CDN, this is no problem.
Read about caching, especially the Expires/Cache-Control headers and their friends.
If you decide to embed, you don't really need a generator script; embedded images are just base64-encoded image files, so this shouldn't take more than three lines of code (see the sketch after these points).
However, if you process/convert your images in PHP, there's yet another disadvantage: instead of being served statically (maybe even from a different machine or CDN), the images have to live on the same machine and go through the PHP engine, increasing the memory use of each process that serves a page with these images.
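For illustration, a minimal sketch of the embedding step; the file path and selector are just examples:

<?php
// Read an image and emit it as a CSS data URI -- roughly the promised three lines.
$data = base64_encode(file_get_contents('img/bg.png'));
header('Content-Type: text/css');
echo "body { background-image: url('data:image/png;base64,{$data}'); }";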

Create thumbnails on the fly with cache or on upload?

I'm currently rewriting a website that needs a lot of different sizes for each image. In the past I handled this by creating all the thumbnail sizes on upload. But now I have doubts about that approach, because I have to change my design and half of my images are no longer the right size. So I'm considering two solutions:
Keep doing this, and add a button to the back end to regenerate all the images. The problem is that I would always need to know every size used by every part of the site.
Only upload the full-size image and, when displaying it, put something like src="thumbs.php?img=my-image-path/image.jpg&width=120&height=120" in the src attribute. The script would create the thumb and display it; it would also check whether the thumb already exists, and if it does, just display it without recreating it. Every 5 days, a cron task would launch a script to delete all the thumbs (to be sure only the useful ones are kept).
I think the second solution is better, but I'm a little concerned by the fact that I'd be calling PHP every time an image is shown: even when the thumb has already been created, it's PHP that serves it.
Thanks for your advice.
Based on the original question and subsequent comments, it sounds like on-demand generation would be suitable for you, as it doesn't sound like you have a demanding environment in terms of absolutely minimizing download time for the end client.
You already seem to have a grasp of the option of giving your <img> tags a src value pointing at a PHP script, with that script either serving a cached thumbnail if it exists, or generating it on the fly, caching it, and then serving it. So let me give you another option.
Generally speaking, using PHP to serve static resources is not a great idea as you begin to scale your site, because:
It adds the overhead of invoking PHP to serve requests that a basic web server like Apache, Nginx, etc. handles in a much more optimized way. Your site will be able to handle less traffic per server, because it is spending extra memory, CPU, etc. on serving static content.
It makes it hard to move those static resources into a single repository outside the web servers (such as a CDN); you would have to duplicate the files on each and every web server powering the site.
As such, my suggestion would be to still serve the images as static files via the web server, but generate the thumbnails on the fly when they are missing. To achieve this you can create a custom rewrite rule or 404 handler on the web server, so that requests in your thumbnail directory which don't match an existing thumbnail image are redirected to a PHP script that generates the thumbnail and serves up the image (without the browser even knowing). Future requests for that thumbnail are then served as a static image.
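A rough sketch of such a 404 handler, assuming GD, JPEG-only thumbnails, and a /thumbs/WxH/file.jpg URL scheme (all of which are assumptions, not part of the original answer):

<?php
// thumb404.php -- wired up as the 404 handler for /thumbs/
// (e.g. "ErrorDocument 404 /thumb404.php" in Apache).
$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);  // e.g. /thumbs/120x120/photo.jpg

if (preg_match('#^/thumbs/(\d+)x(\d+)/([\w.-]+\.jpg)$#', $path, $m)) {
    list(, $w, $h, $file) = $m;
    $src = __DIR__ . '/images/' . $file;
    $dst = __DIR__ . "/thumbs/{$w}x{$h}/" . $file;

    if (is_file($src)) {
        list($origW, $origH) = getimagesize($src);
        $thumb = imagecreatetruecolor((int)$w, (int)$h);
        imagecopyresampled($thumb, imagecreatefromjpeg($src),
                           0, 0, 0, 0, (int)$w, (int)$h, $origW, $origH);

        @mkdir(dirname($dst), 0755, true);   // cache to disk:
        imagejpeg($thumb, $dst, 85);         // the next request is served statically

        header('Content-Type: image/jpeg');
        readfile($dst);                      // serve this first request ourselves
        exit;
    }
}
http_response_code(404);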
This scales quite nicely: if in the future you need to move your static images to a single server (or CDN), you can use an origin-pull mechanism that falls back to your main servers, which will auto-generate the thumbnails via the same mechanism just described.
Use the second option if you don't have too much storage, and the first if you don't have too much CPU.
Or you can combine them: generate and store the image the first time the PHP thumbnail generator is called, and just give back the cached image on subsequent requests.
With this solution you'll only have the necessary images, and you can delete the older ones from time to time if you want.

What are the benefits of using .css over .php with content-type: text/css?

What are the benefits of using an external CSS stylesheet over a PHP file with its content type set to text/css? If you place this header at the top of a PHP file, I feel like you have so much more potential:
<?php
header("Content-type: text/css");
// Needs a comparison (a bare "=" would always assign), a real variable,
// and an echo -- raw CSS inside a PHP block is a parse error.
if ($_SERVER['QUERY_STRING'] === "contact_us") {
    echo "#nav { /* contact-page-only rules */ }";
}
?>
(^ that's a .php file). http://shapeshed.com/using_php_to_enhance_css/
If there are no downfalls (and I checked how both are cached in Chrome's Network panel, and I believe it's the same), isn't it kind of like choosing whether to use .html or .php?
Thanks for any feedback.
Here are some differences:
PHP creates extra CPU/memory overhead compared with serving purely static CSS. Probably not all that much, but it's still a consideration; the more processing you do, the bigger a deal it is, obviously.
You can't push PHP files to a CDN as static assets; the generated CSS would have to be served from (or origin-pulled through to) your own servers.
SASS and LESS were developed to handle most of the dynamic features you might need, so PHP likely isn't necessary.
It sounds like you're worried about serving more CSS than certain pages need. Realistically that's not an issue, since browsers cache the CSS after the first download.
Additional thoughts:
I wrote a UI template engine that isolates both JS and CSS code to the specific views on which they are used. If a CSS or JS snippet is used more than once, it is pushed to the "kitchen sink" level and included globally. This limits selector conflicts and also strikes the best balance between the number of HTTP requests and total download size per request. Keeping related code (i.e. button event listeners, or page- or element-specific styles) close together also makes for more rapid programming, especially for non-expert teams/developers.
That could have been nice before CSS preprocessors such as SASS or LESS.
Dynamic CSS isn't even as useful as dynamic JavaScript, much less dynamic HTML. Having one large file with all the rules in it is more effective than having a file with rules that change since you can more easily cache the former in the client than the latter.
The reality is that there isn't much benefit to this: dynamic CSS doesn't have many valid use cases, and the one you're describing certainly isn't one of them. If you just combine and minify all of your CSS into a single file, it'll be cached on the client and downloaded again only when you bust the cache.
In a modular CMS, this could be useful. As long as your application generates a .php URL that consistently produces the exact same CSS for caching purposes, you could dramatically reduce the amount of CSS to download. For example, a page that uses 1 theme and 5 modules (each providing CSS) could get back the CSS for exactly that combination, instead of the CSS for 1 theme and all 50 modules. That could be the difference between 50KB of CSS and 500KB: a huge saving on slower connections.
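A sketch of what that could look like; the module names and file layout here are made up:

<?php
// css.php?m=theme,forum,gallery -- the same module list always produces the
// same URL and the same output, so the response stays cacheable.
header('Content-Type: text/css');
header('Cache-Control: public, max-age=86400');

$available = array('theme', 'forum', 'gallery', 'profile');  // whitelist of known modules
$requested = explode(',', $_GET['m'] ?? '');

foreach (array_intersect($requested, $available) as $module) {
    readfile(__DIR__ . "/css/{$module}.css");                // concatenate in order
}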
If your website is hand-made, as in a site whose specific goals are known ahead of time, then there's really no good reason to do this, as others have answered.

Best way to serve third party html on your site?

I'm building a web app where users can build custom web pages that pull content from other web pages. I know of a few options for doing this, and I'm not sure which is best, or whether there are better solutions out there. Right now, I could:
Use iframes, which will (sort of) accomplish what I want, but will force the client to download and render all the web content, which seems slow. I've also heard a lot of people say iframes are passé and shouldn't be used.
Use a library like wkhtmltopdf, which renders the HTML on the server side and generates a PDF image of it. This would work nicely, but the result is just an image, so text won't be selectable, links won't be clickable, etc. Also, I've heard that you can get into legal trouble for hosting other people's web content on your site without permission.
Use something like phpQuery to literally scrape content off other sites. This option could have the same legal issues as the one above.
Has anyone done anything like this, or does anyone have any thoughts?
The cleanest solution would be to send off an HTTP request server-side, then render the HTML into your page as you require. This will also require changing all the URLs of content and links to be absolute,
e.g.:
<img src="/images/banner.png">
will work on the remote server, but once inside your page, the image will not exist. The most workable solution would be to limit the functionality to images and links, then do a find/replace with a regex to match relative URLs and prepend the source address to them.
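A rough sketch of that rewrite; the host is a placeholder, and in practice a DOM parser like DOMDocument is safer than regexes:

<?php
// Fetch a remote page and make relative src/href URLs absolute.
$base = 'http://example.com';                    // placeholder remote host
$html = file_get_contents($base . '/some/page.html');

// Root-relative URLs: src="/images/banner.png" -> src="http://example.com/images/banner.png"
// (the (?!/) avoids touching protocol-relative "//cdn..." URLs).
$html = preg_replace('#(src|href)=(["\'])/(?!/)#i', '$1=$2' . $base . '/', $html);

// Plain relative URLs: src="images/banner.png" -> src="http://example.com/images/banner.png"
// (skips absolute, protocol-relative, root-relative, anchor, and data: values).
$html = preg_replace('#(src|href)=(["\'])(?!https?:|//|/|\#|data:)#i', '$1=$2' . $base . '/', $html);

echo $html;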
You will, however, run into legal issues if you re-serve other people's content from your server, even just the HTML.
Using an iframe would be the quick and dirty solution, and probably has the fewest legal ramifications, since the browser sends a normal request to the site for the content.
I'd recommend DocRaptor for generating PDF files from HTML. It works in a similar fashion to wkhtmltopdf, but produces fully functional PDF files.
Here's a link to its homepage:
http://docraptor.com/
And a link to its API documentation:
http://docraptor.com/documentation

Using PHP for "fluid" design (using viewport resolution)

I need some opinions on using PHP to make completely "scalable" websites: for instance, using the viewport resolution to resize images and apply dynamic CSS styles. In my mind, doing this just adds complexity and should not be done; layouts should be fixed or fluid using strictly CSS, with no server-side language generating them based on device size.
I need some input, and maybe some philosophy on why this approach is not used at all.
Manipulating a web page in this way is the domain of CSS, controlled by JavaScript (or a library such as jQuery; see the CSS docs). You shouldn't be wasting your server's processor cycles when client-side implementations will be far more responsive for the user and allow all the flexibility you require. Changing font size etc. can be done almost instantly in the browser, without the user having to request another page from your (remote) server, which would result in a slower user experience.
Really, really DON'T
As Andy says it is the domain of CSS.
Trying to adapt a design with PHP will make your code unmaintainable. You should really learn to use CSS efficiently to avoid this kind of hack.
The only reason you might use PHP to detect the browser and adapt content is mobile browsers.
But given the number of existing User-Agent tokens, it's almost impossible to make that approach scale.
