Clean links to PHP-generated JavaScript and CSS - php

Background: When generating HTML content with PHP (or any similar server-side language), you can reference JavaScript and CSS through script and link tags without actually having to include the CSS and JavaScript inline with the rest of the content. All you have to do is create a link to the file.
Example:
<script type="text/javascript" src="./js/fooscript.js"></script>
Question: The above approach does not work, however, if your PHP needs to dynamically generate some or all of your JavaScript code. Is there a way to have a clean "one-line" link as above, but still use dynamically-generated JavaScript?
Obviously, one way to do it is to have PHP generate the JavaScript and write it to a file; however, that approach is undesirable for various reasons. I am wondering if there is an alternative trick for doing this that I have not thought of yet.

Put an .htaccess file in your /js/ folder and map the .js extension to PHP, like this:
AddHandler application/x-httpd-php .js
In other words, have PHP parse all .js files as PHP files. So your scripts would really be PHP files on the server-side that output JavaScript. Do the same for stylesheets, only use the .css extension, obviously.
Note: I've never tried doing this in a separate .htaccess file. If it doesn't work, just put it into your global Apache config.
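As a rough sketch of what such a PHP-parsed .js file might look like (the file name and the $username variable are placeholders of mine, not anything from the question):

<?php
// js/fooscript.js -- parsed as PHP because of the AddHandler line above.
// Send the right content type so browsers treat the output as JavaScript.
header('Content-Type: application/javascript');

$username = 'example_user';  // placeholder for whatever dynamic data you need
?>
var currentUser = <?php echo json_encode($username); ?>;

function greet() {
    alert("Hello, " + currentUser);
}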

From my experience, you rarely need to (and rarely should) generate an entire script dynamically. For example, you may need to get some piece of data (like user info or settings) into JavaScript dynamically, but the rest of the script (classes/functions/DOM manipulation) is static across all users.
Typically in that case you just want to put the dynamic stuff "inline", output directly from PHP, and then include the JS (the 95% that doesn't need to be dynamically generated) as an external script. The most obvious reason for this is caching of the JS/CSS.
Consider how reddit.com does it; look at their source code for how they get user data into JavaScript:
var reddit = {
    /* is the user logged in */ logged: 'username',
    /* the subreddit's name (for posts) */ post_site: "",
    /* are we in an iframe */ cnameframe: false,
    /* this page's referer */ referer: "",
    /* the user's voting hash */ modhash: 'lzbcszj9nl521385b7e075e9750ee4339547befc6a47fa01e6',
    /* current domain */ cur_domain: "reddit.com", ...
}
The rest of their js is found in external files.
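In PHP terms, a minimal sketch of that pattern might look like this (the variable names and values are just illustrative):

<?php
// Only the per-user data is emitted inline; the bulk of the script stays in a
// static, cacheable external file.
$username = 'example_user';  // placeholder for real user data
$modhash  = 'abc123';        // placeholder
?>
<script type="text/javascript">
var config = {
    logged:  <?php echo json_encode($username); ?>,
    modhash: <?php echo json_encode($modhash); ?>
};
</script>
<script type="text/javascript" src="./js/fooscript.js"></script>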

You could just use mod_rewrite to make certain PHP files be served as CSS/JS,
e.g. /css/screen-style.css points to css.php?friendly_id=screen-style
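A rough sketch of what css.php could look like under that scheme (the stylesheet directory, the parameter handling, and the rewrite rule in the comment are all assumptions of mine):

<?php
// css.php -- assumed to sit behind a rewrite rule along the lines of:
//   RewriteRule ^css/([a-z0-9-]+)\.css$ css.php?friendly_id=$1 [L]
header('Content-Type: text/css');

// Restrict the id to a safe character set before touching the filesystem.
$id = isset($_GET['friendly_id']) ? $_GET['friendly_id'] : '';
$id = preg_replace('/[^a-z0-9-]/i', '', $id);

$file = __DIR__ . '/stylesheets/' . $id . '.css.php';  // assumed storage location
if ($id !== '' && file_exists($file)) {
    include $file;  // the included file can use PHP to emit dynamic CSS
}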

You can use .php files in JavaScript and CSS calls. It's not pretty, and anyone looking at your source will know it's a script, but it saves the hassle of configuring the server. Also, if you're generating dynamic JavaScript, I would suggest adding a timestamp to the end of the URL so the browser doesn't cache it.
Example:
<script src="myjavascript.php?a=20090611-021213"></script>

Related

Application Entrypoint

I'm building an application and using index.php as an entry point to different modules. I noticed SugarCRM does this, and it seems like a good idea.
The URL looks like this:
http://www.mypage.com/index.php?mod=log&pag=login
where mod is the module and pag is the page.
The index.php looks like this:
<?php
define('INCLUDE_CHECK', true);

// Class loader
require('app/inc/app_autoload.php');

// HTML header with JS and CSS links
require('header.php');

// Content page
$url_module = $_GET["mod"];
$url_page   = $_GET["pag"];
$content    = $url_module . "/" . $url_page . ".php";

// For the above URL, $content = log/login.php
if (file_exists($content)) {
    require($content);
} else {
    // Handle error
}

// Footer
require('footer.php');
?>
Is this safe?
If it is safe, is it in line with best practices?
This can be potentially unsafe. It depends on what the other PHP files that PHP can open actually do. If all of them are class files that don't execute anything on their own, then you're safe. However, if any of them executes something automatically...maybe not.
Let's say that you have PHP files inside a folder:
/secured/file.php
And let's say that the folder has an .htaccess file that prohibits anyone from navigating to the page directly. Or better, let's say it's above your root directory. However, an attacker sends "../secured" as the value of mod and "file" as the value of pag. In that case, PHP may allow them to include that file, and if it self-executes, it may have unintended consequences.
This is why Zend Framework requires explicit configuration of all MVC paths. Other frameworks allow for some dynamic inclusion, but they often do something like append "Controller.php" to the end of the string, which ensures that the included file must be a controller...and thus intended to be included in such a way.
When it comes to security, the best thing you can do is make sure that YOU...with all the knowledge of the entire server...can't open up any file that you don't want to be opened by someone else. If you can't get the code to do it, knowing what files are there, then you have implemented some decent (though likely still not flawless) security.
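A minimal sketch of that suffix idea applied to the dispatcher above (the ".page.php" suffix is a placeholder convention of mine, not something from the question):

<?php
// Strip any directory components, so "../secured" collapses to "secured" and
// can no longer escape the module directories.
$url_module = basename(isset($_GET['mod']) ? $_GET['mod'] : 'log');
$url_page   = basename(isset($_GET['pag']) ? $_GET['pag'] : 'login');

// Appending a fixed suffix means only files deliberately named as pages can
// ever be included.
$content = $url_module . '/' . $url_page . '.page.php';

if (file_exists($content)) {
    require($content);
} else {
    // Handle error
}
?>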

Constant set using define() not working in included PHP file

I have this code inside of my header
<?php
define('RELPATH','http://www.saint57records.com/');
include_once(RELPATH.'sidebar.php');
?>
and an example line of code in the sidebar
<img style="margin:10px;" src="<?php print RELPATH;?>images/logo.png" width="60px"/>
but when the page loads, it includes the file correctly, yet all the links inside the file just print RELPATH instead of the web URL, like this:
<img style="margin:10px;" src="RELPATHimages/logo.png" width="60px"/>
It works fine on the other pages of my website, just not inside WordPress. Does anyone know what might be causing this issue?
The short answer is to provide a filesystem path to RELPATH, not a web URL.
The long answer is that when you use a web URL to include a PHP file, the PHP file is treated like an external resource. It is requested remotely, executed in a process of its own, and only the results are returned. A constant defined beforehand cannot have any effect in that remote resource.
If http://www.saint57records.com/ is on a different server, you'll have to pass RELPATH to it some other way, e.g. through a GET variable (which you'd have to sanitize with htmlentities() prior to use.) However, including content from a remote server in this way isn't good practice. It'll slow down your page as it'll make an expensive web request. If the target server is down, your page will time out.
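A minimal sketch of the short answer, assuming sidebar.php sits next to the including file on the same server (that location is an assumption):

<?php
// RELPATH stays a URL, because it is only ever printed into the markup.
define('RELPATH', 'http://www.saint57records.com/');

// The include itself uses a filesystem path, so sidebar.php runs in the same
// PHP process and can see the constant.
include_once(__DIR__ . '/sidebar.php');
?>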

PHP include not working with IE 7, 8 and 9

I use the following code to include page content in an index.php file (template):
if (isset($_GET['page']))
{
    include($_GET['page'] . '.php');
}
if (isset($_GET['special']))
{
    include($_GET['special'] . '.php3');
}
The url could look like this: http://www.example.com/?page={PageToShow}
This works fine in Chrome, Firefox and Safari, but the content is not shown in IE 7, 8 and 9. Any idea why?
The server-side PHP scripts wouldn't be affected by the browser you use to view the page, so this looks like a rendering issue - check that the included code produces valid HTML, and that you haven't got <html> tags being included within other <html> tags.
You might want to rethink the way you're including page content - doing this via a GET variable is potentially insecure: for a start, it doesn't limit the files to those within the document root of your website.
At the very least I'd recommend doing some sanity checks on the input files (i.e. are they in the webroot?), but a more modern method is to use .htaccess rewriting to send all requests to index.php, where you can then choose which files to include depending on the request (take a look at this post for more information).
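A minimal sketch of such a sanity check, assuming the includable pages live in a pages/ directory next to index.php (that directory name is an assumption):

<?php
if (isset($_GET['page'])) {
    $base   = realpath(__DIR__ . '/pages');
    $target = realpath($base . '/' . $_GET['page'] . '.php');

    // Only include files that actually resolve to somewhere inside $base.
    if ($target !== false && strpos($target, $base . DIRECTORY_SEPARATOR) === 0) {
        include($target);
    } else {
        // Unknown page: show a 404 or a default page instead
    }
}
?>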
The server-side script you posted above should return the same result in all browsers. Try debugging with $_SERVER["REQUEST_URI"] and see if you get the same results.
Also, I would advise against this kind of include for security reasons.

Setting new absolute path of resources within Javascript depending on environment

I have an external JavaScript file that uses the getScript() function to load another JS file.
I have those all on static.mydomain.com. (I'm new to setting up CDNs)
getScript() doesn't seem to allow cross-domain requests because my HTML is on domain.com. But then I tried using relative paths according to this post: Dynamic URLs in CSS/JS
It works for CSS but does not work for JS (specifically within the getScript() function). What's going on here? What are some ways to mitigate this problem when dealing with CDNs?
The getScript method actually makes an Ajax call, which is why it's not working. Unless you need access to things like 'was the script successfully found' and the like, it's better to just write up a quick method like...
function addScript(source, domain) {
    $("head").append("<script src='" + (domain ? domain + source : source) + "'></script>");
}
That will just add scripts to the head of the page, and let you add an optional domain to point to in case you want to change it up.

Showing only top-level for website

I have a website, say accessible under http://example.com.
For this, I have several PHP-scripts like index.php, intro.php, faq.php, contact.php etc.
So a typical use-case would look like so:
A user goes to http://example.com (which serves http://example.com/index.php), then clicks on "Introduction" and is redirected to http://example.com/intro.php.
While all this is working nicely, I wondered if there is a way to hide the names of the PHP scripts completely, so the URL always reads http://example.com/, regardless of whether the user is on index.php, intro.php, faq.php, etc.
Using RewriteRules does not seem to be the way to go, as it basically works in the other direction: making it easier for the user to type a specific URL (e.g. making the ".php" optional).
However, I only want the URL of the site itself to be visible, not the individual scripts along the way.
Is something similar actually possible with individual scripts or would this require all the individual scripts to be combined into one and then to use constructs such as:
if ($_POST['destination'] == "intro")
{
    // DO ALL THE Introduction MARKUP
}
Thank you.
Best.
You could use a full-page iframe, and load intro.php in the iframe. This way, the user stays on the same page, but the page in the iframe changes.
One way (working, but not very good) is to include all your scripts in index.php and call functions that draw specific pages from those scripts. These calls must depend on some POST variable.
You could use AJAX calls to load the new contents when a user clicks on a link. Then you could create your website like usual, but add a script similar to this one (using jQuery):
$(function() {
    $('a').click(function() {
        $.get($(this).attr('href'), function(data) {
            document.write(data);
        });
        return false;
    });
});
I haven't tried this code, but something along these lines should work.
This would of course not work in browsers that do not support JavaScript, and you would need to take care of forms in another way, so a full-page iframe might be an easier solution.
Given your further explanation, I'd go with a single clean index.php, and the other scripts included as needed (I'd even put them outside your document root so they can't be accessed directly, either by accident or on purpose):
index.php:
<?php
$action = isset($_POST['action']) ? $_POST['action'] : 'index';

switch ($action) {
    case 'intro':
        require '../pages/intro.php';
        break;
    case 'somethingelse':
        require '../pages/somethingelse.php';
        break;
    case 'index':
    default:
        require '../pages/index.php';
}
?>
Possibly even somewhat optimized with a whitelisted array of possible pages. This keeps your original index.php small and tidy, while still leaving the possibility to do all the more complex stuff in dedicated files. There is no actual need for JavaScript (it's not needed for the functionality, but can of course be used as desired) or for pseudo-hidden URLs via frames (which most of the time doesn't fool a search indexer or anyone who wants to use direct URLs and has the smallest amount of knowledge about HTML).
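For example, the whitelist variant could look roughly like this (same assumed ../pages/ layout as above):

<?php
// Whitelist of allowed pages; anything not in this map falls back to the index.
$pages = array(
    'intro'         => '../pages/intro.php',
    'somethingelse' => '../pages/somethingelse.php',
    'index'         => '../pages/index.php',
);

$action = isset($_POST['action']) ? $_POST['action'] : 'index';
if (!isset($pages[$action])) {
    $action = 'index';
}

require $pages[$action];
?>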
