A quick question.
I'm looking at building a hit counter that spans many different domains, preferably in PHP.
What would be the best way to track each hit?
I was thinking of storing the count in a central database and updating it every time a page on any domain is loaded - but wouldn't that have major performance issues?
I was also thinking about the 'basic number stored in a text file' option - but is it even possible to edit a file from different servers/domains?
Any advice would be great!
If I get you right, you have different websites that sit on different servers?
In that case I'm not sure about editing a file on a different server, and I wouldn't go there.
Instead of editing a remote file, just update a remote DB (see the sketch below for an example).
The best solution is using a non-blocking server (like Node.js) which will update a DB on every page load (you can easily access remote DBs on other servers, or send a cURL call to a designated file on a master server). By using a non-blocking web server you will not slow down the page's load time.
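A rough sketch of the 'cURL a designated file on a master server' idea in plain PHP. The file name hit.php, the hits table, the credentials and the timeout are all placeholders for illustration, not anything from the original setup:

    <?php
    // hit.php - lives on the master server and records one hit per request.
    // Assumed schema:
    //   CREATE TABLE hits (domain VARCHAR(255) PRIMARY KEY, count INT UNSIGNED NOT NULL DEFAULT 0);

    $pdo = new PDO('mysql:host=localhost;dbname=stats', 'user', 'secret', array(
        PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
    ));

    $domain = isset($_GET['domain']) ? $_GET['domain'] : $_SERVER['HTTP_HOST'];

    // One atomic statement per hit: create the row if it is new, otherwise increment it.
    $stmt = $pdo->prepare(
        'INSERT INTO hits (domain, count) VALUES (:domain, 1)
         ON DUPLICATE KEY UPDATE count = count + 1'
    );
    $stmt->execute(array(':domain' => $domain));

    header('Content-Type: text/plain');
    echo 'ok';

And on each tracked site, the call to the master server can be kept cheap with a very short timeout, so a slow master never holds the page hostage for long:

    <?php
    // Fire the tracking request with a tight timeout (not truly non-blocking,
    // but the worst-case delay stays small).
    $ch = curl_init('http://master.example.com/hit.php?domain=' . urlencode($_SERVER['HTTP_HOST']));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT_MS, 200);
    curl_exec($ch);
    curl_close($ch);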
Google Analytics works a bit differently - it loads a script from google-analytics.com and that script gathers all the info. The problem is that this only happens after the DOM has loaded.
If you are going for a solution like this, just put an AJAX call at the top of every page that you want to monitor.
I want to implement a simple long polling system in PHP. A simple scenario:
The project is based on two websites, Website A and Website B. There are two users: one on Website A (UserA) and one on Website B (UserB). On Website A there is a button. If UserA pushes the button, the color of Website B should change instantly.
Of course I could do this with a MySQL database, but that seems way too big, because I just want to transfer one bit.
Are there any other options to store one bit on the server and access it from all PHP pages hosted on that server?
I thought I could use a simple .txt file, but I am not sure whether the server would crash if two different websites try to access the same file at once. Is this a problem?
Or do you have any other ideas for how to solve it?
I would not recommend using a text file, since file I/O operations are pretty slow compared to the other methods.
You would have to read the file on every page load/refresh, or even worse, with an AJAX request to make it instant. I would recommend something like Redis or Memcached, plus some sort of AJAX call to read from it (if you want it to be instant).
If you don't have access to the server to install that kind of software, I would use a MySQL database.
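With Redis (via the phpredis extension) the 'one bit' can be a single key, as in this rough sketch. The key name color_flag, the 20-second wait and the connection details are assumptions, not part of the original question:

    <?php
    // set_flag.php - called (e.g. via AJAX) when UserA pushes the button on Website A.
    $redis = new Redis();
    $redis->connect('127.0.0.1', 6379);
    $redis->set('color_flag', '1');

    // poll_flag.php - long-polled by Website B: wait up to ~20 seconds for the flag.
    $redis = new Redis();
    $redis->connect('127.0.0.1', 6379);

    $deadline = time() + 20;
    while (time() < $deadline) {
        if ($redis->get('color_flag') === '1') {
            $redis->del('color_flag');      // consume the bit
            echo json_encode(array('changed' => true));
            exit;
        }
        usleep(200000);                     // check five times per second
    }
    echo json_encode(array('changed' => false));

Website B's JavaScript would call poll_flag.php in a loop and change the color as soon as it gets changed: true back.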
Hope it helps
I'm using the following code to manage downloads from my site (the files are behind a CAPTCHA): http://www.richnetapps.com/php-download-script-with-resume-option/
Trouble is, when a file is being downloaded, it locks up the rest of the site, and it's not possible to download another file simultaneously. ('Locks up' as in: trying to go to, say, the homepage while a download is in progress results in a long wait; the homepage appears only when the download is finished or cancelled. This is a problem because some of the files are several hundred MB.)
I'd like two things to happen: 1- To be able to browse the site while a file is being downloaded, and 2- to be able to download another file (or two, or three, or ten...) simultaneously.
My gut feeling is I need to fork the process, create a new one, or open another socket. But I'm way out of my depth, and even if this was the right approach, I don't know how to do it. Any ideas guys?
Many thanks in advance....
EDIT----
I found it! I added session_write_close() right before setting the headers in the download script. Apparently this behaviour is due to PHP session handling - further info here: php simultaneous file downloads from the same browser and same php script (I searched and searched before asking, but obviously missed this post).
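For anyone hitting the same thing, the shape of the fix is roughly this (the path and headers are placeholders; the point is where session_write_close() sits relative to the output):

    <?php
    session_start();

    // ... CAPTCHA / permission checks that need $_SESSION go here ...

    // Release the session lock BEFORE streaming the file. PHP's default file-based
    // session handler keeps the session file locked for the whole request, so
    // without this every other request from the same browser waits until the
    // download finishes or is cancelled.
    session_write_close();

    $path = '/path/to/file.zip';    // placeholder
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . basename($path) . '"');
    header('Content-Length: ' . filesize($path));
    readfile($path);
    exit;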
Many thanks....
A Content Delivery Network (CDN) will both offload the downloads from your server, letting it keep serving the homepage (and other) page requests, and allow many, many simultaneous downloads. It should be cheaper for bandwidth and perhaps faster for most users as well.
The key will be configuring the CDN to serve the files only after your CAPTCHA has been passed, instead of leaving them freely available like most CDN setups.
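One hedged way to do that gate, assuming the CDN (or an edge rule such as nginx's secure_link) can verify an HMAC token: after the CAPTCHA succeeds, hand the browser a short-lived signed URL. The hostname, parameter names, secret and 10-minute TTL below are made up for illustration:

    <?php
    // Build a time-limited signed URL once the CAPTCHA has been solved.
    function signedDownloadUrl($file, $secret)
    {
        $expires = time() + 600; // valid for 10 minutes
        $token   = hash_hmac('sha256', $file . '|' . $expires, $secret);
        return 'https://cdn.example.com/dl/' . rawurlencode($file)
             . '?expires=' . $expires . '&token=' . $token;
    }

    // The matching check, if the edge forwards requests back to PHP for validation
    // instead of verifying the signature itself.
    function isValidDownload($file, $expires, $token, $secret)
    {
        if ($expires < time()) {
            return false; // link has expired
        }
        $expected = hash_hmac('sha256', $file . '|' . (int) $expires, $secret);
        return hash_equals($expected, $token);
    }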
Here it gets a little complicated. I'm in the last few months of finishing a larger web-based project, and since I'm trying to keep the budget low (and learn some things myself) I'm now touching an issue I've never dealt with before: load balancing with NGINX, and scalability for the future.
The setup is the following:
1 Web server
1 Database server
1 File server (also used to store backups)
Using PHP 5.4+ over FastCGI
Now, all those servers should be 'scalable' - in the sense that I can add a new file server if free disk space is getting low, or a new web server if I need to handle more requests than expected.
Another thing: I would like to do everything over one domain, so that access to the different backend servers isn't really noticeable in the frontend (some backend servers are basically called via a subdomain - for example the file server, over 'http://file.myserver.com/...', where load balancing happens only between the file servers).
Do I need an additional, separate server for load balancing, or can I just use one of the web servers? If it's a separate one:
How much power (CPU/RAM) do I need for such a load-balancing server? Does it have to match the web servers, or is a 'lighter' machine enough for that?
Does the load-balancing server have to be scalable too? Will I need more than one if there are too many requests?
How exactly does the whole load balancing work anyway? What I mean:
I've seen many posts stating that there are problems like session handling/synchronisation on load-balanced systems. I could find two solutions that might fit my needs: either the user is always directed to the same machine, or the session data is stored in a database. But with the second, wouldn't I basically have to rebuild parts of the $_SESSION functionality PHP already has? (How do I know which user gets which session - are cookies really enough?)
What problems should I expect, apart from the unsynchronised sessions?
'Write scalable code' - that's a sentence I read a lot. But in terms of PHP, for example, what does it really mean? Usually all the work for one request happens on one server only (the one NGINX directed the user to) - so how can the PHP code itself be 'scalable', since it isn't what NGINX redirects?
Are different 'load balancing' pools possible? What I mean is that all file servers are in one pool and all web servers are in another pool, and if you request an image from a file server that has too much to do, the request is redirected to a less busy file server.
SSL - I'll only need one certificate, for the load-balancing server, right? Since the traffic always goes back through the load-balancing server - or how exactly does that work?
I know it's a huge question - basically I'm just looking for some advice and a bit of a helping hand; I'm rather lost in the whole thing. I can find snippets that partially answer the questions above, but actually 'doing' it is a completely different thing. So I already know there won't be a clear, definitive answer, but maybe some experiences.
The end goal is to be easily scalable in the future and to plan ahead for it (and even buy things like the load-balancer server) in time.
You can use one of the web servers for load balancing, but it will be more reliable to set up the balancing on a separate machine. If your web servers don't respond very quickly and you're getting many requests, the load balancer will put the requests in a queue; for a big queue you need a sufficient amount of RAM.
You don't generally need to scale a load balancer.
Alternatively, you can create two or more A (address) records for your domain, each pointing to a different web server's address. That gives you 'DNS load balancing' without a balancing server. Consider this option.
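On the session question above: you don't have to rebuild $_SESSION. PHP (5.4+) lets you swap the storage backend with SessionHandlerInterface, so the session cookie keeps identifying the user as usual and only the reads/writes go to a database every web server can reach. A minimal sketch, with a made-up table name and credentials:

    <?php
    // Database-backed sessions, so every web server behind the balancer sees the
    // same session data. Assumed schema:
    //   CREATE TABLE sessions (id VARCHAR(128) PRIMARY KEY, data BLOB, updated_at INT);
    class DbSessionHandler implements SessionHandlerInterface
    {
        private $pdo;

        public function __construct(PDO $pdo) { $this->pdo = $pdo; }

        public function open($savePath, $sessionName) { return true; }
        public function close() { return true; }

        public function read($id)
        {
            $stmt = $this->pdo->prepare('SELECT data FROM sessions WHERE id = :id');
            $stmt->execute(array(':id' => $id));
            $row = $stmt->fetch(PDO::FETCH_ASSOC);
            return $row ? $row['data'] : '';
        }

        public function write($id, $data)
        {
            $stmt = $this->pdo->prepare(
                'REPLACE INTO sessions (id, data, updated_at) VALUES (:id, :data, :t)'
            );
            return $stmt->execute(array(':id' => $id, ':data' => $data, ':t' => time()));
        }

        public function destroy($id)
        {
            $stmt = $this->pdo->prepare('DELETE FROM sessions WHERE id = :id');
            return $stmt->execute(array(':id' => $id));
        }

        public function gc($maxLifetime)
        {
            $stmt = $this->pdo->prepare('DELETE FROM sessions WHERE updated_at < :t');
            return $stmt->execute(array(':t' => time() - $maxLifetime));
        }
    }

    $pdo = new PDO('mysql:host=db.internal;dbname=app', 'user', 'secret');
    session_set_save_handler(new DbSessionHandler($pdo), true);
    session_start();    // $_SESSION works as usual, now backed by the shared DB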
This has been frustrating me for a while now. I started developing a site for a friend using AJAX to load content, but whenever the image galleries are loaded it takes so long. The annoying thing is that it wasn't like that when I tested it on my own server.
The test site is here: http://www.europeanbob.co.uk/phil/index.html
And the actual site is here: http://www.philmarsdenphotography.co.uk
The test site is hosted on Dreamhost and the actual one on Krystal, if that makes any difference.
You are doing some weird things, but Stack Overflow is not a debugging service. What I've seen from a cursory look at the network pane, when you click on Gallery > People, there are two simultaneous requests to /inc/people.php — one takes 4-4.5s and the other 8-9s, thus my guess is one gets locked waiting on the other to finish, which might be either due to session or database. These scripts return a bit under 4k, so the long delay cannot be explained by network latency — it is something you're doing server-side. Good luck.
Part of the problem is that you have a lot of JavaScript and CSS files which all have to load before the image loads.
Every external file that you link to (i.e. every non-inline file) is a separate request, plus a DNS lookup for each new hostname. The problem is that the browser will only fetch a few at a time (browsers cap the number of concurrent connections per host) and makes the remaining requests wait for those to return before starting the next ones.
The solution is to combine some of these JavaScript and CSS files - from 8 or so down to 2-3. This should shave off about 2 seconds.
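A very rough way to do the combining on the PHP side (the file list, cache lifetime and filename are placeholders; a build step that concatenates the files once is usually cleaner than doing it per request):

    <?php
    // combine-css.php - serve several stylesheets as one response, so the browser
    // makes a single request instead of eight. The same idea works for the JS files.
    $files = array('css/reset.css', 'css/layout.css', 'css/gallery.css'); // placeholder list

    header('Content-Type: text/css');
    header('Cache-Control: public, max-age=86400'); // let browsers cache the bundle for a day

    foreach ($files as $file) {
        echo "/* --- $file --- */\n";
        readfile($file);
        echo "\n";
    }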
I have a site that gets around 1000 page views a minute when traffic is high, and the page has a JS snippet which stores the visitor's browser details in MySQL, using a PHP file to connect to the DB.
For example, browser.js calls storebrowserdata.php.
Is there a way for me to cache the PHP file and the JS file without affecting the stats data that is stored in the DB?
When traffic is high during the day the site slows down and CPU utilization also goes up.
PHP is interpreted. If you want to grab each visitor's browser/user-agent information on each page load, then you pretty much need to run that script each time; you can't cache it.
If this functionality is slowing down your site, either use an alternative solution like Google Analytics, or investigate a NoSQL solution like MongoDB, which offers atomic updates and for this kind of write-heavy logging generally runs faster than MySQL.
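If the MongoDB route is taken, the write itself stays small. This sketch assumes the mongodb extension plus the mongodb/mongodb Composer library; the database and collection names are made up:

    <?php
    require 'vendor/autoload.php';   // mongodb/mongodb via Composer

    $client     = new MongoDB\Client('mongodb://localhost:27017');
    $collection = $client->stats->browser_hits;   // database "stats", collection "browser_hits"

    // One small document per page view; no schema changes needed as you add fields.
    $collection->insertOne(array(
        'user_agent' => isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '',
        'page'       => isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '',
        'ts'         => new MongoDB\BSON\UTCDateTime(),
    ));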
You can cache the JS file as well as the PHP one, but it will do you no good: it's the database update that most likely slows down your site.
Which operation consumes the most CPU? Did you profile your application? Is this the only page on your site that uses PHP and MySQL?
If you need to control the flow of the work being executed, you can use a queue and control the rate of execution.
For this, RabbitMQ is a great open-source project: http://www.rabbitmq.com/
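A rough sketch of that idea with php-amqplib (queue name, credentials and the separate worker are assumptions): the tracking endpoint only enqueues the data, and a single background consumer writes to MySQL at a rate the database can keep up with.

    <?php
    // storebrowserdata.php (producer side) - push the hit onto a queue instead of
    // writing to MySQL directly, so page views stay fast under load.
    use PhpAmqpLib\Connection\AMQPStreamConnection;
    use PhpAmqpLib\Message\AMQPMessage;

    require 'vendor/autoload.php';

    $connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
    $channel    = $connection->channel();
    $channel->queue_declare('browser_stats', false, true, false, false); // durable queue

    $payload = json_encode(array(
        'user_agent' => isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '',
        'ts'         => time(),
    ));
    $channel->basic_publish(new AMQPMessage($payload), '', 'browser_stats');

    $channel->close();
    $connection->close();

A separate long-running worker (one process, not one per visitor) would then consume 'browser_stats' with basic_consume() and do the actual MySQL inserts, batching them if needed.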