GZip Compression For JQuery Without Server Access - php

Hi all. I am required to build a website with each page under 130 KB. I know that jQuery 1.4.4 is ~28 KB gzipped, but it's 77 KB minified, which is just too much for this particular assignment. I have already built the entire site using jQuery in one form or another on each page, so scrapping it would mean days of wasted time.
With that in mind,
1) Can I add content headers to a javascript file to add "Content-Encoding: gzip" without modifying config files on the server end? I'm uploading them to the university server, but I don't have access to the configuration. From the response header, the server is: Apache/1.3.26 (UnitedLinux) mod_ssl/2.8.10 OpenSSL/0.9.6g PHP/4.2.2 mod_perl/1.27
2) From the phpinfo file, I know that ZLIB compression is enabled, but "zlib.output_compression" is not.
3) I realize this can be done using .htaccess. However, I'd like to do it any other way, if possible, since I don't want the school thinking I'm trying to modify their server configuration.
4) Will XHR's setRequestHeader method work here, or is that only useful for files requested asynchronously?
I know this is short notice and all, but my Final presentation is tomorrow, and I'll lose a ton of points if my site is over the size limit. Any help would be much appreciated!

You have two options.
Use jQuery hosted somewhere else that supports gzip. The file you need to include is here:
http://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.js
This gives you a lot more advantages, such as parallel downloading of files and quicker page load times.
The other option is to use PHP code to gzip the jQuery JS and return it. Here is an example:
http://www.lateralcode.com/gzip-files-with-htaccess-and-php/
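A rough sketch of that PHP approach, assuming you upload a pre-gzipped copy of jQuery next to a small wrapper script (the names gzip-jquery.php and jquery.min.js.gz are just illustrative):

<?php
// gzip-jquery.php - serves a pre-compressed copy of jQuery with the right
// headers, without touching the server configuration.
// "jquery.min.js.gz" is an assumed file name; create it locally with gzip.
$file = dirname(__FILE__) . '/jquery.min.js.gz';

header('Content-Type: application/javascript');

if (isset($_SERVER['HTTP_ACCEPT_ENCODING']) &&
    strpos($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip') !== false) {
    // The browser accepts gzip, so send the compressed bytes as-is.
    header('Content-Encoding: gzip');
    header('Content-Length: ' . filesize($file));
    readfile($file);
} else {
    // Fallback: decompress on the fly for clients without gzip support.
    readgzfile($file);
}
?>

You would then point the script tag at gzip-jquery.php instead of the plain .js file.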

Use http://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.min.js, which is served gzipped.

How to check what's slowing down a page?

I'm working on a website that can be found here:
http://odesktestanswers2013.com/Metareviewer
The index appears to be unusually slow (slowing down the browser as it loads) even though YSlow doesn't seem to see anything particularly wrong with it and my PHP microtime returns a fine value.
What other things should I be looking into?
Using Chrome Developer Tools, the network tab shows this:
That's a timeline of what's loading in your page.
There are also plenty of good practices that aren't being followed here. Some of these can also be flagged up by using Google Chrome's Audit tool (F12 menu), but in my opinion the most important are:
Use a CDN for serving common library code. Do you really need to host jQuery yourself? (Side rant: do you really need jQuery at all?)
Your JavaScript files are taking a long time to load because they are all served as separate HTTP calls. You can combine them into a single JavaScript file, and also minify them to save a lot of bandwidth.
Foundation.css is very large - not that there's a problem with large CSS files, but it looks like there are over 2000 rules in the CSS file that aren't being used on your site. Do you need this file?
CACHE ALL THE THINGS - there are 26 uncached HTTP requests being made, meaning that everyone who visits your site has to download everything on every request.
The whole bandwidth could be reduced by about two thirds if you enabled gzip compression on your server (or even better, implemented SPDY, but that's a newer technology with less of a community); a PHP-level sketch follows this list.
Take a look at http://caniuse.com - there are a lot of CSS features that are supported in modern browsers without the need for -webkit or -moz prefixes, which could save a lot of kilobytes.
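On the gzip point above: server-level compression (mod_deflate or mod_gzip) is the cleanest fix, but if you can only change application code, a minimal PHP-level sketch looks like this (ob_gzhandler ships with PHP's zlib extension and quietly falls back to plain output when the client doesn't support gzip):

<?php
// Compress this page's output when the client advertises gzip support.
// Must be called before any output is sent.
ob_start('ob_gzhandler');

// ... render the page as usual ...
echo '<h1>Hello</h1>';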
If I could change one thing on your site...
Having said all of that, each point above will make a very small (but cumulative) difference to the speed of your site, but it's probably a good idea to attack the worst offender first.
Look at the network graph. While all that JavaScript is downloading, it blocks the rest of the page from downloading.
If you're lazy, just move it all to the end of the document body. That way, the rest of the page will download before the JavaScript has to, but this could harm the execution of your scripts if they are programmed in particular styles.
Hope this helps.
You should also consider using http://www.webpagetest.org/
It's one of the best tools when it comes to benchmarking your site's performance.
You can use this site (http://gtmetrix.com/) to analyze the causes and fix them. The site provides the reasons as well as solutions, such as JS and CSS in optimized formats.
As per this site's report, you need to optimize images and minify the JS and CSS files. The optimized images, JS and CSS files can be downloaded from the site.
Use Google Chrome -> F12 -> Network and check the connect, send, receive, etc. times for each resource used in your page.
It looks like your CSS and JS scripts have very long connect and wait times.
You can use the best add-on available for both Chrome and Firefox: YSlow.
YSlow analyzes web pages and suggests ways to improve their performance based on a set of rules for high-performance web pages.
The above is the link for the Firefox add-on; you can also search for the Chrome version, and it is freely available.
YSlow gives you details about your website's front end. Most likely you have a script that is looping one too many times in the background.
If you suspect that a sequence of code is hanging server side then you need to do a stack trace to pinpoint exactly where the overhead is taking place.
I recommend using New Relic.
Try using Opera: right-click -> Inspect Element -> Profiler.
Also look at Inspect Element -> Errors.

Handling Very Large Uploads [duplicate]

I want to allow uploads of very large files into our PHP application (hundreds of megabytes up to 8 gigabytes). There are a couple of problems with this, however.
Browser:
HTML uploads have crappy feedback; we need to either poll for progress (which is a bit silly) or show no feedback at all
The Flash uploader puts the entire file into memory before starting the upload
Server:
PHP forces us to set post_max_size, which could result in an easily exploitable DoS attack. I'd prefer not to set this globally.
The server also requires some other variables to be present in the POST vars, such as a secret key. We'd like to be able to refuse the request right away, instead of after the entire file has been uploaded.
Requirements:
HTTP is a must.
I'm flexible with client-side technology, as long as it works in a browser.
PHP is not a requirement; if there's some other technology that will work well in a Linux environment, that's perfectly cool.
upload_max_filesize can be set on a per-directory basis; the same goes for post_max_size
e.g.:
<Directory /uploadpath/>
    php_value upload_max_filesize 10G
    php_value post_max_size 10G
</Directory>
Python Handler?
Use a Python POST handler instead of PHP. Generate a unique identifier from your PHP app that the client can put in the HTTP headers, and use mod_python to reject or accept the large upload before the entire POST body is transmitted.
I think
http://www.modpython.org/live/current/doc-html/dir-handlers-hph.html
allows you to check headers and decline the rest of the POST input. I haven't tried it, but it might be the right path?
Looking at the source of mod_python, the buffering of the input via read() seems to allow bit-at-a-time evaluation of the HTTP input. Headers are first.
https://svn.apache.org/repos/asf/quetzalcoatl/mod_python/trunk/src/filterobject.c
I know it's old, but maybe someone has this problem nowadays, too.
Now you can do this with only JavaScript and, say, PHP. No Flash or Java is required on the client side.
demo: http://dnduploader.filkor.org/
The idea is to slice the files with Javascript's Blob slice() method...
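For context, a minimal sketch of what the receiving end could look like in PHP, assuming the client posts each slice as a regular file upload together with a file identifier (the field names 'chunk' and 'fileId' are invented for this sketch and must match whatever the client-side code sends):

<?php
// receive-chunk.php - appends each uploaded slice to a temporary file.
if (!isset($_POST['fileId'], $_FILES['chunk'])
        || $_FILES['chunk']['error'] !== UPLOAD_ERR_OK) {
    header('HTTP/1.1 400 Bad Request');
    exit('Missing or invalid chunk');
}

// Sanitise the identifier so it can't escape the temp directory.
$fileId = preg_replace('/[^A-Za-z0-9_-]/', '', $_POST['fileId']);
$target = sys_get_temp_dir() . '/upload_' . $fileId;

// Append this slice; this simple version assumes chunks arrive in order.
$in  = fopen($_FILES['chunk']['tmp_name'], 'rb');
$out = fopen($target, 'ab');
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);

echo 'OK';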
How about a Java applet? That's how we had to do it at a company I previously worked for. I know applets suck, especially in this day and age with all our options available, but they really are the most versatile solution to desktop-like problems encountered in web development. Just something to consider.
You can set post_max_size for scripts in just one directory. Place your upload script there, and allow only that script to handle large sizes. It's still possible for that script to be attacked with large/useless files, but it avoids setting the limit globally.
Use that with APC and you might be able to work out something good:
IBM developerWorks article on APC
Tried all of this... this is by far the best I have used yet...
http://www.uploadify.com/
Take a look at jumploader.com
A good Java applet for uploading.
I've used it for uploading images and it works fine. I haven't tried it with files bigger than 10 MB, but it should work for really big files too.
Have you looked into using APC to check the progress and total file size? Here is a good blog post about it. It might help.
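For reference, reading the progress with APC usually looks something like the sketch below, assuming apc.rfc1867 is enabled and the upload form contains a hidden APC_UPLOAD_PROGRESS field placed before the file input (the script name progress.php is illustrative):

<?php
// progress.php - polled via AJAX while the upload is running.
// $_GET['key'] is the value that was put in the form's hidden
// APC_UPLOAD_PROGRESS field.
$key    = $_GET['key'];
$status = apc_fetch('upload_' . $key);

header('Content-Type: application/json');
if ($status === false) {
    // APC hasn't seen the upload yet.
    echo json_encode(array('percent' => 0, 'done' => false));
} else {
    $percent = $status['total'] > 0
        ? round($status['current'] / $status['total'] * 100)
        : 0;
    echo json_encode(array('percent' => $percent, 'done' => (bool) $status['done']));
}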
Maybe you could use WebDAV and JavaScript in the browser:
AJAX Big file upload, with progress, to WebDAV
http://www.webdavsystem.com/ajax/programming/upload_progress
A simple library
http://debris.demon.nl/projects/davclient.js/doc/README.html
You can then get the JS to redirect the user to a success page. Secret keys and whatnot can be handled in a PHP prelude before handing off to the JS client -> WebDAV transfer.
I would look into FTP, SSH or SCP. This allows you to upload a large file and still have access control over the file as well. It might take a little longer to implement, but it's probably the most secure way I can think of.
I know it sucks to add another dependency, but in my experience most websites that are doing something like this use Flash on the client side and upload the large file in chunks.
Adobe has a how-to on Flash file uploads.
I also found this tutorial on codeproject:
Multiple File Upload With Progress Bar Using Flash and ASP.NET
PS - I know you're using PHP and not .NET; I figured the important part was the Flash ;)
I've had success with Uploadify, and I would recommend it. It's a jQuery/Flash script that handles large uploads, and you can pass extra parameters to it (like the secret key). To address the server-side issues, use something like the following code; the changes take effect only for the script they're called in:
// Check to see if the key is there and valid before doing anything else.
if (!isset($_POST['secret_key']) || !isValid($_POST['secret_key'])) {
    exit("Invalid request");
}

function isValid($key)
{
    // Put your validation code here and return true/false.
    return $key === 'YOUR_SECRET_KEY'; // placeholder check
}

// This line changes the timeout.
// Give it a value in seconds (3600 = 1 hour).
set_time_limit(3600);

// Set these amounts to whatever you need.
// Note: in current PHP versions, post_max_size and upload_max_filesize are
// PHP_INI_PERDIR settings, so ini_set() cannot change them at runtime; set
// them in php.ini or via php_value in .htaccess / a <Directory> block instead.
ini_set("post_max_size", "8192M");
ini_set("upload_max_filesize", "8192M");

// Generally speaking, the memory_limit should be higher
// than your post size, so make sure that's right too.
ini_set("memory_limit", "8200M");
EDIT: In response to your comment:
Given what you've said, I'm afraid you may not be able to meet your requirements over HTTP. All of the solutions out there are code that adds features to HTTP that it was never designed for.
Like you said yourself, it's a simple protocol. Apart from writing your own client software that runs outside the browser, using a Java applet, or using a different protocol (like FTP, which was designed for this), you might not get what you want.
I've done the best I could within the given constraints. Sorry I couldn't do better.
Try this: http://www.simple2ftp.com uses a Java based FTP applet from within a clever PHP application wrapper.

How to load php file through jQuery without advertising your technology

Nowadays, developers and professionals tend to use PHP templates for two reasons: they are manageable, and we don't need to advertise our technology, since there are no question marks or .php extensions within the URL.
But how do you keep from advertising your technology while sending a jQuery Ajax request to load a PHP file into a div? I mean, we would have to write $.get('phpfile.php') within the script, and someone could say, "voilà, he is using PHP".
Simply put, I want to ask: is there any way of loading a PHP file through such a request without advertising your technology, as described above?
Some example code would be appreciated.
But how do you keep from advertising your technology while sending a jQuery Ajax request to load a PHP file into a div? I mean, we would have to write $.load('phpfile.php') within the script, and someone could say, "voilà, he is using PHP".
I don't get it. jQuery doesn't know anything about PHP files. If your website has two "public pages", www.example.com and www.example.com/foo, then you can access the /foo page from the homepage with something like $.get("/foo"). Here I use Ajax and jQuery, and nobody knows whether my page uses PHP or anything else.
Then, you should look into mod_rewrite, as explained by verisimilitude, but rewriting URLs is not the only solution. Have a look at this site http://builtwith.com/ and enter a random URL. Web servers send, by default, a lot of data about themselves, so you should avoid that behavior too if you want to "hide" the technology used. Also have a look here: http://xianshield.org/guides/apache2.0guide.html. It's "a guide to installing and hardening an Apache 2.0 web server to common security standards." You may find useful information in there.
Edit
Also, "PHP templates" are not related to page URLs at all. For example, you could have multiple URLs that use the same "PHP template".
mod_rewrite is the best answer for all your predicaments. Why not use it? The URL phpfile.php in your code above could be rewritten to achieve the obfuscation...
@pomeh, good point.
See, a few things can be done here.
1) Disable the Apache signature. In the default configuration of Apache, any page served through it will contain a full signature of the server. Server signatures contain valuable information about installed software and can be read (and exploited), so it is safer to turn this behavior off. To do so, open Apache's configuration file (httpd.conf or apache2.conf), search for ServerSignature and set it to 'Off', then search for ServerTokens and set it to 'Prod'.
2) Set "expose_php" to Off in php.ini: when enabled, it exposes to the world that PHP is installed on the server by including the PHP version in the HTTP headers. (If you cannot edit php.ini, see the sketch after this list.)
3) There are some PHP obfuscators available which may also be used. I will not recommend any of them, since I have not personally tried them.
There are ways and means beyond these to hide the "technology". By default, a PHP-enabled Apache web server processes and interprets all files with the .php extension, but you can bind any arbitrary extension to be processed by the server instead.
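If php.ini itself is off limits, a small runtime workaround is sketched below; note that it only removes the X-Powered-By header that expose_php adds (it does not change expose_php itself), and header_remove() requires PHP 5.3+:

<?php
// Drop the X-Powered-By header for this response only, so it no longer
// advertises the PHP version to visitors.
header_remove('X-Powered-By');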
I guess verisimilitude and pomeh have already answered this question.
All web servers send information about themselves over the internet. You can't hide that.
If you want to hide file extensions like 'aspx, php, asp, html', then you will need to use mod_rewrite under Apache or something like URL Rewrite under IIS7.
You can also set default documents under IIS7. This really only works once per web folder. For example, you can set default.htm as one of the default documents. When a visitor goes to your website, they type www.domain.com and they get a web page; that visitor is actually looking at www.domain.com/default.htm.

Would it be better to include() resources (css,js) or to let the browser do another request?

Would it be faster to include a JavaScript file and output its contents in the HTML inside a <script> tag, or to just use the src attribute and let the browser make another request?
Simply outputting it instead of letting the browser make another request would obviously mean fewer requests and possibly less server load, but does it make the page faster? Including the files and outputting them doesn't let the browser cache them.
If you include it, every different page will have the overhead of downloading the script again.
If you link to it externally, send far-future expiry headers, and use versioning with a cache buster (for changes), your file will be downloaded only as often as required. On the topic of performance, be sure to minify or pack your production JavaScript.
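As a small illustration of the cache-buster idea, a PHP sketch that stamps the script URL with the file's modification time (the path /js/app.js is made up):

<?php
// Emit a versioned script tag: the query string changes whenever the file
// changes, so the browser can cache it with far-future expiry headers and
// still pick up new versions immediately.
$src = '/js/app.js';
$ver = filemtime($_SERVER['DOCUMENT_ROOT'] . $src);
echo '<script src="' . $src . '?v=' . $ver . '"></script>';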
Of course, this is very relevant to your JavaScript. If it is a few lines and likely not to change, maybe you could save that one HTTP request and place it inline.
99% of the time, however, an external file is best practice.
It is quite a complex answer. Obviously the techniques differ between a production environment and a development one.
The latter is quite simple: just include your scripts as they are.
For a production environment: you should concatenate the JS files you need into one file, then minify and compress it (a rough PHP sketch of the concatenation step follows below). You can retrieve libraries from a public CDN to improve download performance and relieve your server load.
The extra HTTP overhead should be balanced out by caching.
To improve user-perceived performance, you should link your JS file at the bottom of the page instead of in the head section.
You should be aware of deferred execution too; it lets the browser download other resources while downloading JavaScript files (by default, browsers download one JavaScript file at a time, because they don't know whether the script they are downloading will change the DOM during its execution).
Finally, if your script is quite short, you will get better performance by including it right in the web page.
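A rough sketch of the concatenation step mentioned above, done in PHP at request time (the file names are illustrative, and a real production build would also run a minifier over the result):

<?php
// combined-js.php - concatenates several JS files into one response so the
// browser makes a single request instead of many.
header('Content-Type: application/javascript');

$files = array('js/jquery.js', 'js/plugins.js', 'js/app.js');
foreach ($files as $file) {
    readfile($file);
    echo ";\n"; // guard against a file that lacks a trailing semicolon
}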
And if you have similar questions, you should enjoy reading this:
http://developer.yahoo.com/performance/rules.html
I agree with #alex. Also, linking allows the script files to be downloaded in parallel as the page is being parsed. Most browsers use multiple threads to download content while parsing the main page's content.

Large file uploads from web pages

I code primarily in PHP and Perl. I have a client who is insisting on seeking video submissions (any encoding) from the public via one of their pages rather than letting YouTube do its job.
Server in question is a virtual machine and I can adjust ini settings for max post, max upload size etc as needed.
My initial thought is to use a Flash-based uploader with PHP on the back end, but I wondered if someone might have useful advice and experience on the subject?
Doing large file transfers over HTTP is not usually fun, but sometimes it's necessary.
For large files, you'll definitely want to provide some kind of progress gauge for end-users.
There are flash-based tools that do this (swfUpload comes to mind).
If you want to avoid Flash and do it with pretty HTML/JavaScript/CSS, you can leverage PHP's APC extension, which for some reason provides support for getting upload status from the server, as explained here.
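The form side of the APC approach typically looks like the sketch below; the hidden APC_UPLOAD_PROGRESS field must appear before the file input, apc.rfc1867 has to be enabled, and a separate script can then call apc_fetch('upload_' . $key) to report progress (field and file names here are illustrative):

<?php
// Upload form with the hidden field that APC's upload-progress hook watches.
$key = md5(uniqid(mt_rand(), true));
?>
<form enctype="multipart/form-data" method="post" action="upload.php">
    <!-- must come before the file input for APC to track this upload -->
    <input type="hidden" name="APC_UPLOAD_PROGRESS" value="<?php echo $key; ?>">
    <input type="file" name="video">
    <input type="submit" value="Upload">
</form>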
You can adjust the post size and use a normal HTML form. The big problem is not Apache, it's HTTP. If anything goes wrong in the transmission, you will have no way to detect the error. Furthermore, there is no way to resume the transfer. This is exactly why BitTorrent is so popular.
I don't know how set your client is against YouTube, but you can use their API to do the uploads from a page on your site.
http://code.google.com/apis/youtube/2.0/developers_guide_protocol.html#Uploading_Videos
See: browser based uploading.
For web-based uploads, there aren't many options. Regardless of the web platform, web server, etc., you're still transferring over HTTP. The transfer is all or nothing.
Your best option might be to find a Flash, Java, or other client-side component that can chunk files and upload them piecemeal, then do a checksum to verify. That would allow for resuming uploads. Unfortunately, I don't know of any such open-source component.
Try to convince your client to change their point of view.
Using HTTP (and the browser, hell, the browser!) for this kind of thing is rarely a good deal; will his users wait 40 minutes with the computer and the browser running until the upload is complete?
I don't think so.
Maybe you could set up a public FTP account, where users can upload but not download or see other users' files. Then whoever wants to use FTP software can, and whoever prefers to do it via the browser can too.
The big problem with using a browser is that, if something goes wrong, you can't resume; you have to restart from zero again.
Last year I had the same issue. I took a look at ZUpload, but I didn't use it, so I can only suggest taking a look. (We wrote a small Python script that we send to our customer; the script creates a torrent of the folder our customer needs to send to us, and we download it via uTorrent.)
P.S.: again, sorry for my bad English ;)
I used JUpload. Yes, it looks horrible, but it just works.
With that said, it's still a better idea to convince the client that doing so is stupid.
I would agree with others stating that using HTML is a poor option. I believe there is a size limitation when using Flash as well. I know of a script that uses a Java applet to perform an actual FTP transfer. It is called Simple2FTP and can be found at http://www.simple2ftp.com
Not sure but perhaps worth a try?
