WordPress & Enquire.js - conditional loading philosophy - PHP

It's not a matter of code; I just don't get how things work in this specific case.
I'm working on a WordPress theme which should dynamically load portions of PHP code through JavaScript.
It actually works combined with UA sniffing, because if the device is mobile the mobile template should be used anyway, but that's not the point.
If the screen width is < 1080px, the mobile version of my WordPress theme's menu should be loaded.
Otherwise, if the screen width is bigger, the desktop menu version should be loaded.
A solution based purely on CSS media queries is not acceptable, since the desktop version is too heavy.
So I decided to conditionally load content through enquire.js.
The first part of my WP-homepage-template.php is:
<script type="text/javascript">
enquire.register("screen and (max-width:1080px)", {
    match : function() {
        $("body").load("/path/headernav-phone.php");
    },
    unmatch : function() {
        $("body").load("/path/headernav-desktop.php");
    }
});
</script>
So, when the screen size hits 1080px, WordPress should load headernav-phone.php,
which contains just a wp_nav_menu(MOBILE) call.
If the screen size is bigger, WordPress loads headernav-desktop.php, which includes wp_nav_menu(DESKTOP).
This doesn't work:
Fatal error: Call to undefined function wp_nav_menu() in /XX/YYY/ZZZ/webseiten/XXX.YY/wp-content/themes/MyTheme/templates/headernav-phone.php on line 14
It looks like my code is run before the CMS is loaded, so its functions are undefined.

IMO, the logic is flawed.
For one, there's no way to detect the screen width with PHP.
Secondly, if you were to generate a PHP -> JS file (my-script.js.php), you'd have to load the WordPress engine twice.
The answer in this link is super-pro; you'll find easier methods that use wp-load.php and WP_USE_THEMES.
Thirdly, the way you're trying to load a theme template would require the "load twice" approach, or solving it with AJAX and wp_localize_script.
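If you do go the standalone-template route, a minimal sketch (the relative path and the menu location name are assumptions that depend on your install) would bootstrap WordPress at the top of headernav-phone.php:
<?php
// Load WordPress core without running the normal template, so that
// wp_nav_menu() and friends are defined in this standalone file.
define( 'WP_USE_THEMES', false );
require_once dirname( __FILE__ ) . '/../../../../wp-load.php'; // adjust the depth to your install
wp_nav_menu( array( 'theme_location' => 'mobile' ) ); // 'mobile' is an example location
?>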
If I understood the question correctly, I think your best bet would be to use WordPress user-agent detection and enqueue your scripts and styles accordingly. Study the core function to adapt it to your needs.
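For example, a rough sketch in functions.php (handles and file names here are made up) that swaps the asset set based on the user agent:
<?php
// Enqueue a lighter script/style pair for mobile user agents, the full set otherwise.
function mytheme_enqueue_assets() {
    if ( wp_is_mobile() ) {
        wp_enqueue_style( 'mytheme-mobile', get_template_directory_uri() . '/css/mobile.css' );
        wp_enqueue_script( 'mytheme-mobile', get_template_directory_uri() . '/js/mobile.js', array( 'jquery' ), null, true );
    } else {
        wp_enqueue_style( 'mytheme-desktop', get_template_directory_uri() . '/css/desktop.css' );
        wp_enqueue_script( 'mytheme-desktop', get_template_directory_uri() . '/js/desktop.js', array( 'jquery' ), null, true );
    }
}
add_action( 'wp_enqueue_scripts', 'mytheme_enqueue_assets' );
?>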
[update]
Still not sure if all this complexity is really needed. Would something like this work in a generic header.php?
if ( wp_is_mobile() )
    wp_nav_menu( MOBILE );
else
    wp_nav_menu( DESKTOP );
But if you really need these changes to happen dynamically, then AJAX would be the way to go.
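For completeness, a sketch of that AJAX route (action and handle names are hypothetical): register a handler that prints the requested menu, and hand the admin-ajax.php URL to your script with wp_localize_script.
<?php
// functions.php - expose a menu endpoint for both logged-in and anonymous visitors.
function mytheme_ajax_nav_menu() {
    $location = ( isset( $_GET['menu'] ) && 'mobile' === $_GET['menu'] ) ? 'mobile' : 'desktop';
    wp_nav_menu( array( 'theme_location' => $location ) ); // prints the menu markup
    wp_die(); // end the AJAX request cleanly
}
add_action( 'wp_ajax_mytheme_nav_menu', 'mytheme_ajax_nav_menu' );
add_action( 'wp_ajax_nopriv_mytheme_nav_menu', 'mytheme_ajax_nav_menu' );

function mytheme_localize_nav() {
    wp_enqueue_script( 'mytheme-nav', get_template_directory_uri() . '/js/nav.js', array( 'jquery' ), null, true );
    wp_localize_script( 'mytheme-nav', 'myThemeNav', array(
        'ajaxUrl' => admin_url( 'admin-ajax.php' ),
    ) );
}
add_action( 'wp_enqueue_scripts', 'mytheme_localize_nav' );
?>
The enquire.js match/unmatch callbacks can then load myThemeNav.ajaxUrl + '?action=mytheme_nav_menu&menu=mobile' (or desktop) into the header, instead of hitting the template file directly.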


How do I minify my JS in my framework

I have built a customized PHP framework.
I have views, controllers and models.
In the view, there is a variable which I can set:
$js = array('custom/testimonial.js','jquery.js');
In the footer view I have the following code:
<?php if(isset($js)){
    foreach ($js as $jsname) { ?>
        <script src="<?php echo MAIN_URL; ?>/resources/js/<?php echo $jsname; ?>"></script>
<?php
    }
} ?>
Basically, the code in the footer will load all the scripts required for that page. My other pages have other (different) JavaScript files to include, which means each page will have different JavaScript.
I would like to minify the JS for each of the pages and wonder which is the best way. I tried using Assetic (PHP asset management) with no luck.
Another question is: should I minify the JavaScript on the fly? Meaning that when I load my page called testimonial.php, it checks which JavaScript files are required for that page and minifies them before displaying. Will that pose performance issues even if I cache it?
I am looking for a method with high maintainability, so that I do not have to minify all the JavaScript manually; I have 40 to 50 plus pages (each page uses different JavaScript files, plugins, and libraries).
Can Assetic do the job for me? Currently I have problems generating a static file for the JavaScript.
Any help is appreciated.
Yahoo (YUI) used to have a compressor/minifier, but they're now moving to UglifyJS. See:
https://github.com/yui/yuglify
https://github.com/yui/yuicompressor/
There are also several online minification services, based on the YUI compressor.
http://refresh-sf.com/yui/
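To address the "minify on the fly, but cache it" part of the question: a rough, framework-agnostic sketch (helper name, directories and cache policy are all made up for illustration) that your footer view could call instead of looping over $js directly. Rebuilding only when a source file changes keeps the per-request cost at a few filemtime() calls.
<?php
// Combine the page's JS files into one cached file and return its URL path.
function combined_js(array $files, $jsDir = 'resources/js', $cacheDir = 'resources/cache')
{
    $key   = md5(implode('|', $files));
    $cache = $cacheDir . '/' . $key . '.js';
    // Rebuild if the cache is missing or any source file is newer than it.
    $stale = !is_file($cache);
    foreach ($files as $f) {
        if (!$stale && filemtime($jsDir . '/' . $f) > filemtime($cache)) {
            $stale = true;
        }
    }
    if ($stale) {
        $js = '';
        foreach ($files as $f) {
            $js .= file_get_contents($jsDir . '/' . $f) . ";\n";
        }
        // $js = SomeMinifier::minify($js); // plug in yuglify/UglifyJS/JSMin here
        file_put_contents($cache, $js);
    }
    return '/' . $cache;
}
?>
The footer then emits a single script tag pointing at MAIN_URL plus the returned path, and each page still gets exactly the files listed in its own $js array.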

SEO and ajax loaded content link

I need some help to better understand SEO with AJAX-loaded content.
Here's the context:
I have a single.php where content is dynamically generated (with PHP and an XML database) for each single post.
I load a container from this single.php inside my index.php page via AJAX.
Here's the working script:
$.ajaxSetup({ cache: false });
$(".phplink").click(function () {
    var post_link = $(this).attr("href");
    window.location.hash = "!" + post_link;
    $("#ajaxify_container").html("loading...");
    $("#ajaxify_container").load('single.php?blog_no=' + post_link + ' #container');
    return false;
});
$(window).hashchange(function () {
    var hash = location.hash.replace("#!", "");
    if (hash != '') {
        var post_link = hash;
        $("#ajaxify_container").html("loading...");
        $("#ajaxify_container").load('single.php?blog_no=' + post_link + ' #container');
    } else {
        $.get(hash, function (data) {
            $("#ajaxify_container").html('');
        });
    }
});
$(window).hashchange();
An example of a link in index.php (when I click a link I get the URL website.com/#!12):
<a class="phplink" href="12">Post 12</a>
And in my .htaccess file I added these lines to rewrite the URL properly:
Options +FollowSymLinks
RewriteEngine on
RewriteRule /([0-9]+)$ /single.php?blog_no=$1
Everything works fine... (by the way, my single.php is SEO-friendly on its own and works without JavaScript).
However, by using AJAX like this, with a dynamic PHP page, is it still SEO-friendly?
I know that AJAX content is difficult to crawl. What is the best way to have good (not the best, just something correct) SEO with AJAX content?
Regarding the structure of the link, I don't fully understand what the Google bot will crawl,
because the href is "12", so the dynamic href is "/single.php?blog_no=12".
In the web browser:
website.com/single.php?blog_no=12 and website.com/12 load only my single.php page
website.com/#!12 loads my index.php page with a container loaded from website.com/single.php?blog_no=12
Of course, I only want Google to crawl the hashbang URL...
(EDIT: if I open the link in a new tab with a right click, it loads single.php, which I don't want. It seems to be normal behavior, but I want to prevent it.)
Sorry for my English, I'm French.
Dynamically loaded content is generally hard to get right from an SEO perspective. Your description is a little confusing, but I think I have an idea of what you're looking for.
First of all, there are mainly two ways with which Google finds out about pages on your site:
A Sitemap (Google likes XML sitemaps) - A file that tells Google every page on your site to index
Links - Google will follow any internal link on pages it tries to index unless they are marked with rel="nofollow"
There are also inbound links and some other things, but for the purposes of this explanation, let's ignore those.
Anyway, unless you're explicitly telling Google that website.com/single.php?blog_no=12 exists, it's going to have a hard time finding it. To be honest, I'm not sure how Google will handle something like href="12"; it may try to follow that link to website.com/12, which may affect your ranking if there is nothing there. So in the end, you might want to add rel="nofollow" to your AJAX trigger links.
A good way to handle AJAX and dynamically loaded content is to make sure fallbacks are in place. For example, if you have something like href="single/12" set up to load some content with AJAX, you should also have a fallback page that doesn't use JS/AJAX. This ensures that both search engine bots and users without JavaScript can see that content if it otherwise wouldn't have been visible anywhere else.
Last small tidbit: if you test your links on something like http://www.dnsqueries.com/en/googlebot_simulator.php and they turn up with errors or blank pages (search engine bots don't use JavaScript), you should nofollow those links or set up fallback pages.
One last thing: you should go a couple of steps further with your .htaccess rewrite to make your URLs completely clean of query strings. For example, website.com/single/blog/12 is better than website.com/single.php?blog_no=12 for both SEO and users.

Simulate PHP Include Without PHP

I want to include the same navigation menu on multiple pages; however, I do not have PHP support, nor can I affect my server in any other way.
I want to avoid simply copying and pasting the html onto all the pages as this would make updating the menu a pain.
The two options I can think of are as follows:
1) Have all the content exist on one page, then determine which content to show based on a keyword appended to the url:
example.com/index?home
example.com/index?news
2) Include a javascript file that has a function that writes the menu out and call the function on each page
function setupMenu() {
    $("#nav").html("<ul class='nav'><li>home</li><li>news</li></ul>");
}
With Option 1, the updating process would consist of editing the one nav menu on that one page.
With Option 2, updating would mean changing the function in the JavaScript file.
My concern with Option 1 is that the page would have to load a lot of content that it wouldn't need to display. My concern with Option 2 may seem trivial, but it is that the code can get messy.
Are there any reasons doing it one way would be better than the other? Or is there a third superior option that I'm missing?
You have a few options, each with its own advantages and drawbacks:
Server Side Includes, or SSI. If you don't have PHP there's a good chance you don't have SSI either, and this option requires some irritating mucking-about with your .htaccess file. Check Dominic P.'s answer for a writeup of SSI. The benefit of SSI over JavaScript or Frames is that it doesn't require the user to have JS enabled - which a lot of users don't - and it also doesn't present any navigational difficulties.
Frames. You could either use standard frames to put the navigation in its own separate file, and with the correct styling it would be seamless. You could also use an iframe to place your navigation in an arbitrary part of the site, like a sidebar or whatever. The downside to frames, particularly standard frames, is that they tend to make bookmarking, links and the forward/back buttons behave oddly. On the upside, frames don't need browser compliance or server support.
JavaScript. You can refer to any of the other answers for excellent explanations of the JS solution, particularly if you're using jQuery. However, if your site isn't otherwise dynamic enough that your users will want to have JavaScript enabled, this will mean that a large number of your viewers will not see the menu at all - bad, definitely.
Yes, use the jQuery AJAX .load() function:
$('#result').load('ajax/menu.html');
That way your code stays clean, and you can just edit the includes in separate HTML files, just like with PHP.
You should consider AJAX for this task. Include a third party library like jQuery and load the separate HTML files inside placeholders, targeting them by ID.
E.g, in your main HTML page:
<div id="mymenu"></div>
Also, in your main HTML, but in the HEAD section (wrapped in a script tag and a document-ready handler, so the div exists before you target it):
$(function(){ $('#mymenu').load('navigation.html'); });
But your best bet would be to switch to hosting that supports PHP or some other kind of server-side include. This will make your life a lot easier.
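For comparison, this is roughly all the include amounts to once PHP is available (the file name is just an example), which is why switching hosts is worth considering:
<?php
// Every page simply pulls in the shared menu; edit navigation.html once
// and all pages pick up the change.
include 'navigation.html';
?>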
Check out Server Side Includes. I don't have a whole lot of experience with them, but from what I recall, they are designed to be a solution to just your problem.
Server-side includes: http://www.freewebmasterhelp.com/tutorials/ssi/
You can use HTML Imports http://w3c.github.io/webcomponents/spec/imports/
Here is an example from http://www.html5rocks.com/en/tutorials/webcomponents/imports/
warnings.html contains
<div class="warning">
<style scoped>
h3 {
color: red;
}
</style>
<h3>Warning!</h3>
<p>This page is under construction</p>
</div>
<div class="outdated">
<h3>Heads up!</h3>
<p>This content may be out of date</p>
</div>
Then index.html could contain
<head>
<link rel="import" href="warnings.html">
</head>
<body>
...
<script>
var link = document.querySelector('link[rel="import"]');
var content = link.import;
// Grab DOM from warnings.html's document.
var el = content.querySelector('.warning');
document.body.appendChild(el.cloneNode(true));
</script>
</body>

AJAX, PHP Copied Files not Found on Return to Browser

I am using jQuery, AJAX and PHP to do some .jpg file copying (approx. 330 KB per file). I copy the files to a new directory location.
When I return to the HTML and use jQuery to add an IMG tag to a table element, some of the files I have copied are shown as Not Found with 404 errors, but they are there.
I am wondering if it is a speed issue. I tried to slow down the return from the PHP by reading the directory where the files had been copied to, but that did not seem to help.
Am I right in thinking it is a speed problem, and does anyone have an idea as to how I may overcome it? Only by displaying the copied file can I be certain it has been copied.
Sometimes I have the same problem with images not loading. If you are going to use jQuery, I recommend that you put your script (which loads the images) in
$(document).ready(function() {
// put all your jQuery goodness in here.
});
The point is that the DOM is not ready yet when you try to show or operate on it.
Don't forget to include
<script type="text/javascript"
src="http://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.min.js"></script>
in the head of your HTML.
Having tried various options suggested here and some others I researched, I decided to try putting the display of the images in a different function from the AJAX/PHP call. In other words, instead of processing the images in the result handler of the AJAX call, I just passed the results from the success function on to another function.
This seems to have cured my not-found displays.
It may be a coincidence, with something else going on, because my knowledge of the flow of the DOM is very poor.
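A way to make the "has it really been copied?" check explicit is to let the PHP side report back only files it can verify on disk, and have the success callback build the IMG tags from that response. A rough sketch (directories and the posted parameter name are made up):
<?php
// copy-images.php - copy the requested files and return, as JSON, only the
// ones that verifiably exist at their new location.
header('Content-Type: application/json');
$copied = array();
$names  = isset($_POST['files']) ? (array) $_POST['files'] : array();
foreach ($names as $name) {
    $name = basename($name);        // never trust a client-supplied path
    $src  = 'uploads/' . $name;
    $dst  = 'gallery/' . $name;
    if (is_file($src) && copy($src, $dst)) {
        clearstatcache(true, $dst); // avoid a stale stat result for the new file
        if (is_file($dst) && filesize($dst) > 0) {
            $copied[] = $dst;
        }
    }
}
echo json_encode($copied);
?>
The jQuery success handler then loops over the returned array and appends one IMG per path, so a 404 can only mean the file disappeared after the copy, not that the markup got ahead of the filesystem.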

Multiple JavaScript/CSS files: best practices?

I have about 7 JavaScript files now (thanks to various jQuery plugins) and 4-5 CSS files. I'm curious as to the best practice for dealing with these, including where in the document they should be loaded. YSlow tells me that JavaScript files should be included at the end where possible. The end of the body? It mentions that the deciding factor seems to be whether they write content. All my JavaScript files are functions and jQuery code (all run on ready()), so that should be OK.
So should I include one CSS and one JavaScript file and have those include the rest? Should I concatenate all my files into one? Should I put my JavaScript tags at the very end of my document?
Edit: FWIW yes this is PHP.
I would suggest using PHP Minify, which lets you create a single HTTP request for a group of JS or CSS files. Minify also handles GZipping, Compression, and HTTP Headers for client side caching.
Edit: Minify will also allow you to setup the request so that for different pages you can include different files. For example a core set of JS files along with custom JS code on certain pages or just the core JS files on other pages.
While in development include all the files as you normally would and then when you get closer to switching to production run minify and join all the CSS and JS files into a single HTTP request. It's really easy to setup and get working with.
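If memory serves (this is the Minify 2.x layout; check the docs for the release you install), the per-page sets from the Edit above are configured as named groups in min/groupsConfig.php, something like:
<?php
// min/groupsConfig.php - the leading '//' means "relative to the document root".
return array(
    'js'  => array('//js/jquery.js', '//js/plugin1.js', '//js/site.js'),
    'css' => array('//css/layout.css', '//css/typography.css'),
);
?>
Each group is then requested with a single tag such as <script src="/min/?g=js"></script>, so a page that needs extra scripts just references an additional group.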
Also yes, CSS files should be set in the head, and JS files served at the bottom, since JS files can write to your page and can cause massive time-out issues.
Here's how you should include your JS files:
</div> <!-- Closing Footer Div -->
<script type="application/javascript" src="http://jqueryjs.googlecode.com/files/jquery-1.3.1.min.js"></script>
</body>
</html>
Edit: You can also use Cuzillion to see how your page should be set up.
Here's what I do: I use up to two JavaScript files and generally one CSS file for each page. I figure out which JS files will be common across all of my pages (or enough of them so it's close - the file containing jQuery would be a good candidate) and then I concatenate them and minify them using jsmin-php and then I cache the combined file. If there are any JS files left over that are specific to that one page only, I concatenate, minify, and cache them into a single file as well. The first JS file will be called over a number of pages, the second only on that one or maybe a few.
You can use the same concept with CSS if you like with css-min, though I find I usually only use one file for CSS. One thing extra, when I create the cache file, I put in a little PHP code in the beginning of the file to serve it as a GZipped file, which is actually where you'll get most of your savings anyways. You'll also want to set your expiration header so that the user's browser will have a better chance of caching the file as well. I believe you can also enable GZipping through Apache.
For the caching, I check to see if the file creation time is older than the amount of time that I set. If it is, I recreate the cache file and serve it, otherwise I just get the existing cached file.
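The "little PHP code in the beginning of the file" described above can be as small as this sketch (expiry length and content type are just examples), prepended to the cached file, which then has to be served through PHP (e.g. as combined.js.php) so the preamble actually runs before the concatenated script:
<?php
header('Content-Type: application/javascript');
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 604800) . ' GMT'); // one week
ob_start('ob_gzhandler'); // gzip the rest of the output if the browser accepts it
?>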
You haven't explicitly said that you've got access to a server-side solution, but assuming you do, I've always gone with a method involving using PHP to do the following:
jquery.js.php:
<?php
$jquery = isset($_GET['r']) ? explode(',', $_GET['r']) : array('core', 'effects', 'browser', 'cookies', 'center', 'shuffle', 'filestyle', 'metadata');
foreach ($jquery as $file)
{
    // Note: $file comes straight from the query string, so whitelist or
    // basename() it before building the file path on a production site.
    echo file_get_contents('jquery.' . $file . '.js');
}
?>
With the snippet above in place, I then call the file just like I normally would:
<script type="text/javascript" src="jquery.js.php"></script>
and then if I'm ever aware of the precise functionality I'm going to need, I just pass in my requirements as a query string (jquery.js.php?r=core,effects). I do the exact same for my CSS requirements if they're ever as branched.
I would not recommend using a JavaScript-based solution (like PHP Minify) to include your CSS, as your page will become unusable if the visitor has JavaScript disabled.
The idea of minifying and combining the files is great.
I do something similar on my sites but to ease development I suggest some code which looks like this:
<?php if ($environment == 'production') {
    echo '<style>@import url(/Styles/Combined.css);</style>';
} else {
    echo '<style>@import url(/Styles/File1.css);</style>';
    echo '<style>@import url(/Styles/File2.css);</style>';
} ?>
This should let you keep your files separate during development for easy management and use the combined file during deployment for quicker page loads. It assumes you have the ability to combine the files and change variables as part of your deploy process.
Definitely look into including your JS at the bottom and the CSS at the top, as per the YUI recommendations; keeping the JS low has a tangible effect on the appearance of the rest of the page and makes it feel much faster.
I also tend to copy and paste all of my jQuery plugins into a single file, jquery.plugins.js, and then link to
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.3/jquery.min.js"></script>
for the actual jQuery library.
