Best practices for maintaining assets across (sub)domains? - php

My webserver filesystem is equivalent to the following:
/master-domain.tld
/master-domain.tld/assets/js/foo.js
/master-domain.tld/assets/css/bar.css
/sub1.master-domain.tld
/sub1.master-domain.tld/assets/js/foo.js
/sub2.master-domain.tld
/sub2.master-domain.tld/assets/css/bar.css
...
What I'd like to know is how to serve my common static assets (hosted on the master domain) to my subdomains. The reason is that I then only have to maintain a single set of 'master' assets, rather than update/copy files to every subdomain each time I edit them. Obviously this can be achieved using absolute URLs, but my aim is to avoid those so that my local/dev system needn't make any remote calls while I'm designing/debugging.
Currently, I have a mod_rewrite + symbolic link + PHP script combo set up for each subdomain, linking any request for a non-existent local asset to the master asset. E.g. requesting "/sub1.master-domain.tld/assets/css/bar.css" will retrieve the bar.css file hosted on the master domain, since a local version of bar.css does not exist. Conversely, requesting "/sub1.master-domain.tld/assets/js/foo.js" serves the local version of foo.js, since it does exist.
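For reference, a minimal sketch of that kind of fallback rule in a per-subdomain .htaccess (the target URL and paths are illustrative, not my exact setup):
RewriteEngine On
# If the requested asset does not exist locally...
RewriteCond %{REQUEST_FILENAME} !-f
# ...fall back to the master copy (an absolute URL triggers a redirect)
RewriteRule ^assets/(.*)$ http://master-domain.tld/assets/$1 [L]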
But my current method seems to be hindering the performance of my page loads, and I wonder if there is a better (still modular) approach to solving this problem.
Any tips? Am I going about this in completely the wrong way?

Symbolic links should be all that's required if they're all on the same server. This should cause no performance hit at all. E.g.
/sub1.master-domain.tld/assets -> /master-domain.tld/assets
If your subdomains are served from multiple servers, I would set up a mod_rewrite rule with 302 redirects to the complete URL on the master domain. On the master-domain server I would set up mod_expires so the assets are cached, avoiding most subsequent requests for the same "missing" assets on the subdomains. A sketch of both pieces follows.
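Something like this (paths and cache lifetimes are illustrative):
# On each subdomain: redirect missing assets to the master domain
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^assets/(.*)$ http://master-domain.tld/assets/$1 [R=302,L]
# On the master domain: let clients cache the assets
ExpiresActive On
ExpiresByType text/css "access plus 1 week"
ExpiresByType application/javascript "access plus 1 week"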

If they are on the same machine, why not point different virtual host aliases to the same document root?
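For example (a sketch; the host names and document root are illustrative):
<VirtualHost *:80>
    ServerName master-domain.tld
    ServerAlias sub1.master-domain.tld sub2.master-domain.tld
    DocumentRoot /var/www/master-domain.tld
</VirtualHost>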

I'd reserve a static. subdomain for this purpose and point all asset requests at that one.
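A sketch of the idea, with a hypothetical base-URL constant so a local/dev build can point it at localhost instead:
<?php
// Hypothetical constant; set it per environment
define('STATIC_BASE', 'http://static.master-domain.tld');
?>
<link rel="stylesheet" href="<?php echo STATIC_BASE; ?>/assets/css/bar.css">
<script src="<?php echo STATIC_BASE; ?>/assets/js/foo.js"></script>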

Related

Load balancer with Laravel Forge gets 404 for CSS

I have two servers which host two identical Laravel apps; let's say Server One and Server Two. And there is a load balancer; let's call this the LB Server.
I set this up on Laravel Forge. But when I point the domain to the LB, I get random 404s for CSS. I use Laravel Mix and compile the assets when I do the deployment. Since the two servers got different deployments, the CSS versioning is different, and so is the JS.
What happens is that if I call the domain and keep refreshing, I get 404s for CSS, since the LB is doing round-robin load balancing.
The problem is that when I call the domain name, the LB serves me Server One. After I keep refreshing, the LB serves me Server Two. At that point, the page is still requesting Server One's CSS.
How can I fix this?
Notes: I know I should put my CSS/JS/images on S3 or a CDN.
I can't use those options for now. I also don't want to put my compiled CSS under git versioning.
You should change your deployment: generate the files only once and sync them to the servers, instead of generating them on each production server separately (with rsync, for example; sketch below).
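A minimal sketch of that kind of deployment step (host names and paths are placeholders):
# Compile the assets once with Laravel Mix
npm run production
# Push the same build, including the Mix manifest, to every server
for host in server-one server-two; do
    rsync -az public/css public/js public/mix-manifest.json \
        "deploy@$host:/var/www/app/public/"
done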
Another (not elegant) way could be to use sticky sessions: the LB will set a cookie and always route a user to the same backend afterwards (see your LB's docs).

Symfony: disable caching of environment variables

I am running a Symfony 2.8.6 application on nginx/php-fpm.
There are multiple domains that resolve to this server, and basically what I want to do is change the RDB configuration according to which domain was used for access.
So my nginx.conf has lines like fastcgi_param SYMFONY__SOME__PARAM $host, but I have a problem.
This parameter injection is cached and not working as intended.
For example, there are two domains, a.example.com and b.example.com, and they both point to my server.
I want it to connect to a different MySQL server when accessed through a different domain, but it ignores the domain and always connects to the same server.
What I've confirmed:
Nginx passes the variable correctly.
The output of var_dump($_SERVER['SYMFONY__SOME__PARAM']) changes as expected.
The parameter is stored in app/cache/prod/appProdProjectContainer.php
AFAICS there are two options: disabling the configuration cache entirely, or disabling the caching of environment variables.
I think the latter option is preferable, but I don't even know how to disable the cache, whether totally or partially.
Using dynamic environment variables in service definitions is not possible in Symfony (see symfony/symfony#16403 (comment) for why). You can try Incenteev/DynamicParametersBundle, but I have no experience with it.
How about changing the cache directory for each environment?
fastcgi_param SYMFONY__CACHE_DIR /path/to/cache
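Expanded into a minimal nginx sketch (the paths are illustrative, and this assumes your AppKernel honours the SYMFONY__CACHE_DIR parameter); each host then gets its own compiled container:
location ~ \.php$ {
    include fastcgi_params;
    fastcgi_param SYMFONY__SOME__PARAM $host;
    fastcgi_param SYMFONY__CACHE_DIR /var/cache/symfony/$host;
    fastcgi_pass unix:/var/run/php-fpm.sock;
}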

How to make many websites that use the same files, or different ones if they change on some domains, using PHP

I built a website and I need to deploy it to about 30 domains or sub-subdomains (these 30 websites are copied from a base copy, which means they have almost the same code and the same design). But at some point I may need to change some page content in one of these 30 copies and make another change to a different one, and if I didn't change a file in one of the copies, I need it to be taken from the base copy.
In short, I am searching for an easy way to edit and manage these websites.
If you are using the same server for all 30 websites, then you can have a virtual host for each site and keep the DocumentRoot the same for all the domains.
What this will do is: your domain names will be different, but they all take the same code from a common place (the base website). For the per-domain changes, see the sketch below.
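A sketch of how per-domain overrides could work on top of that shared DocumentRoot (the directory layout and page resolution are hypothetical):
<?php
// Prefer a per-domain override of a page, otherwise fall back
// to the shared base copy
$domain   = $_SERVER['HTTP_HOST'];
$page     = 'index.php'; // resolve this from the request in a real setup
$override = __DIR__ . "/sites/$domain/$page";
$file     = file_exists($override) ? $override : __DIR__ . "/base/$page";
require $file;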
This question is not related to PHP.
My recommendation is to use Addon Domains.
In this way you will have a single hosting package but you can host multiple domains on the same server. So let's say these are your domains:
/public_html/domain1.com
/public_html/domain2.com
You can create a base directory: /public_html/base
Now in the domains you can include the PHP files from this base directory like this:
require_once("/public_html/base/include.php");
require_once("/public_html/base/mysql.php");
Note that when you use this approach you may need to set the php.ini directive open_basedir:
open_basedir string Limit the files that can be accessed by PHP to the
specified directory-tree, including the file itself. This directive is
NOT affected by whether Safe Mode is turned On or Off.
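A sketch of such a setting (per-vhost; the paths are illustrative), allowing the domain's own docroot plus the shared base directory:
; in php.ini, or via php_admin_value open_basedir ... in the vhost
open_basedir = /public_html/domain1.com:/public_html/base:/tmp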

How can I load files from httpdocs on the https side of the website

On my webserver I have two directories which are publicly accessible. One of them is httpdocs/ and the other is httpsdocs/. You can probably guess what they're for.
The problem occurred when I wanted to create a CMS for myself. I want it to load the pages over HTTPS for security reasons, but the problem is that my files are inside the httpdocs directory. I could move just the files for the system to httpsdocs, but my next thought was: what if I want a login on the normal website, so people can log in and view content hidden from unregistered users?
Now my question is: is there a way to combine those two directories, so all the files live inside httpdocs and I can access them with the security HTTPS provides?
Things that you may need to know:
The webserver runs PHP 5.2.9.
I only have FTP access to the webserver and to these directories.
The server runs apache, so .htaccess is possible.
There are a ton of avenues, but most are blocked if you can't alter the httpd.conf or certain restrictions are in place.
Your best bet would probably be to simply abandon http: and stick to https:, creating a redirecting .htaccess in the http: folder (sketch below). That way everything runs from a single directory.
Unless, that is, there is a drawback that would keep your users from being able to use HTTPS instead of HTTP.
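A minimal sketch of that redirecting .htaccess in httpdocs/ (since the two docroots are separate, it can redirect unconditionally):
RewriteEngine On
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]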
Getting your ISP to make the https folder a symlink to the http folder would be the easiest solution here.
It sounds like you are likely using a hosting service that keeps these things separated. If that is the case, then the answer is no, you can't combine these two directories into one. You can ask your provider if they can make that arrangement, but it would require changes to the Apache configuration that you can't make yourself.
That being said, barring the ability to have that configuration modified, the only other answer you are going to find is to move everything under the httpsdocs directory. Unless you can get another account set up, or your host offers the ability to set up subdomains with HTTPS connections; that would be an alternative solution.

Dynamic Apache settings using PHP

I'm starting a new project where users can register, and the system will automatically install a CMS for each user's registered domain; I need to solve the dynamic configuration of the server (Apache).
Registration information, and the associations between domains and the actual paths to the CMS installations on the server, will be stored in a MySQL database.
Is there an easy way to configure Apache to route all unknown domains to a specific PHP script, which will look into the database and provide the actual path to the relevant CMS, so that Apache can then use this info to handle the request correctly?
I think an "easier" solution might be to use PHP to write the domains/paths/config to a file and force Apache to use this file to handle requests. However, I expect that the number of domains might be high, and cases where a domain is deleted will not be rare, so the file might soon become full of unwanted rules and hard to optimize; an Apache restart would also be needed in order to use the changed file, etc. Hence the question about a dynamic solution, which might be much easier to manage (for me and for the admin system itself).
Yes: use a wildcard vhost in Apache, and mod_rewrite to direct all URLs to your front controller (or use the 404 document handler). Something like the sketch below.
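A sketch of the catch-all (server names and paths are illustrative):
<VirtualHost *:80>
    ServerName catchall.example.com
    ServerAlias *
    DocumentRoot /var/www/router
    RewriteEngine On
    # Anything that isn't a real file goes to the front controller,
    # which looks the domain up in MySQL and serves the matching CMS
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteRule ^ /index.php [L]
</VirtualHost>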
C.
