Will there be any problem if my website is visited by more than one person simultaneously? If the answer is yes, can you tell me how I can overcome that?
Do I have to use sessions? Will that alone work? Please explain with a small example.
Going off the lack of information:
Your website will not have any problems if multiple people visit at the same time. More than likely, the software you are using is built specifically for this purpose.
There will generally be no problem. A normal PHP/Apache server setup is designed to handle multiple requests at once. Sessions are always on a per-client basis.
For more specific info, you would have to provide more information about your setup, but if you're just starting out building a web site, it is safe to say that you don't need to worry about this for the time being.
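Since the question asks for a small example: here is a minimal sketch of a per-visitor session counter. Each visitor gets their own session cookie, so two people loading the page at the same time never see each other's data.

```php
<?php
// counter.php - the session is per-client, so this counter is
// independent for every visitor, no matter how many are online.
session_start(); // reads or creates this visitor's own session

if (!isset($_SESSION['visits'])) {
    $_SESSION['visits'] = 0;
}
$_SESSION['visits']++;

echo "You have loaded this page {$_SESSION['visits']} time(s).";
```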
Sessions are not necessarily needed, BUT you should really keep in mind that your data can be read and modified by multiple users concurrently.
Look at the definition of ACID for the engineering part of it :)
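A minimal sketch of what that means in practice, assuming a MySQL/InnoDB table and PDO (the `products` table and row id are made up for illustration): two requests arriving at once cannot both decrement the last item, because the transaction locks the row.

```php
<?php
// Two visitors may run this code at the same time; wrapping the
// read-modify-write in a transaction keeps it atomic (the A in ACID).
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->beginTransaction();
try {
    // SELECT ... FOR UPDATE locks the row until commit, so a
    // concurrent request cannot read a stale stock value.
    $stmt = $pdo->prepare('SELECT stock FROM products WHERE id = ? FOR UPDATE');
    $stmt->execute([42]);
    $stock = (int) $stmt->fetchColumn();

    if ($stock > 0) {
        $pdo->prepare('UPDATE products SET stock = stock - 1 WHERE id = ?')
            ->execute([42]);
    }
    $pdo->commit();
} catch (Exception $e) {
    $pdo->rollBack(); // undo everything if anything failed
    throw $e;
}
```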
I am the webmaster of a dynamic website, and because of the many complicated queries that I have to use on the front page and some other pages, the server sometimes suffers from overload when the number of visitors is high.
So, I got the idea to periodically (every 2 minutes) generate a static HTML snapshot of these pages. This would load the server just once every 2 minutes, from just one user.
My question is: is this a good idea? Because I plan to generalize it to many other pages, and I don't want to be surprised and have to backtrack later.
If it isn't, are there any good ideas for avoiding this load?
Thank you in advance
PS: I would maybe publish the method I use to do this, to see if there is a better way.
I don't think it's a bad idea, but you should use an existing caching solution rather than implementing your own. Why not use memcached? I think that's what you are looking for; just use it for those parts of your code that take a long time.
Caching is a good idea to protect your server from overload. Many CMSs (content management systems) use this technique.
Sure, it's called caching :)
However, most sites cache just parts of their content. You can't cache a whole page if you are using user-specific content, for example the name of a logged-in user. But you can cache the heavy parts of your site and combine them with a dynamic page.
Your idea is really good, and many big websites use this concept. You can also use caching techniques: if you want to avoid database hits, caching will do that for you. For example, you can use Memcached: http://memcached.org/
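To tie the answers above together, here is a rough sketch of fragment caching with the PECL memcached extension. `buildFrontPageHtml()` is a hypothetical function standing in for the heavy queries; the shared fragment is rebuilt at most once every 120 seconds, while the user-specific part stays dynamic.

```php
<?php
session_start();

$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

// Heavy, shared part of the page: rebuilt at most once per 2 minutes.
$html = $mc->get('frontpage_heavy');
if ($html === false) {                        // cache miss: rebuild once...
    $html = buildFrontPageHtml();             // hypothetical slow-query renderer
    $mc->set('frontpage_heavy', $html, 120);  // ...and reuse for 120 seconds
}
echo $html;

// User-specific part is rendered on every request.
$name = isset($_SESSION['username']) ? $_SESSION['username'] : 'guest';
echo 'Logged in as ' . htmlspecialchars($name);
```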
Again, to reiterate: this is not a request to program anything for me. I am looking for more experienced web developers to tell me whether my idea is really doable, as it involves some pretty tough issues (at least, I think so). Please, if this post is to be closed, could I at least get a little advice on where I should be posting instead?
Imagine: you visit a website (say malonssite.com). You sign in, and you get a double-paned window. The left side is a chat list (think FB buddy list). The right side is a "browser".
The chat list is populated by other people who have signed into malonssite.com AND are visiting the same page as you in the "embedded" browser.
Each user has the ability to "allow followers", at which point whatever site they visit, all their followers "follow".
Image sketch:
My abilities:
PHP
MySQL
Javascript (node.js included, but that's more serverish I guess)
I've done long polling and AJAX, but this gets complicated. I am thinking something like this might be best done in Flash? Or maybe an old-school Java applet? I am just not sure.
I am pretty confident I can make this thing on my own; I am just not sure what technology to use. I usually hit stumbling blocks in each area, normally along the lines of the same-origin policy. I know that JSONP can get around the SOP, but is it powerful enough to do what I want? I am not familiar enough with it.
Sockets in general (WebSockets, Flash sockets, etc.) and node.js are pretty new to me, and I think they somehow hold the answer; I am just looking for some verification.
Thanks!
As I see it, you'll just need an iframe plus some JavaScript that reads its src and sends it to the server. So basically the user will stay on your own domain, browsing other websites in the iframe, and you will have no cross-origin-request issues.
You could use the APE engine for the server side, which is meant for exactly this sort of thing.
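A minimal sketch of the iframe approach, with all names (browse.php, visits.log) made up for illustration. Navigation is routed through your own form, so the visited URL is always known server-side without reading anything cross-origin:

```php
<?php
// browse.php - the user browses other sites inside an iframe on your
// domain; every navigation is submitted here first, so it can be logged.
session_start();

$url = isset($_GET['url']) ? $_GET['url'] : 'https://example.com/';
if (!filter_var($url, FILTER_VALIDATE_URL)) {
    $url = 'https://example.com/';
}

// Record who is looking at what (flat file for the sketch; this is
// the data a chat list could be built from).
file_put_contents('visits.log', session_id() . ' ' . $url . PHP_EOL, FILE_APPEND);
?>
<form method="get" action="browse.php">
    <input type="text" name="url" value="<?= htmlspecialchars($url) ?>">
    <button type="submit">Go</button>
</form>
<iframe src="<?= htmlspecialchars($url) ?>" style="width:100%;height:90vh"></iframe>
```

Two caveats: many sites send an X-Frame-Options header and will refuse to render inside a frame, and links clicked inside the iframe navigate away without passing back through this script, which is the genuinely hard part of the idea.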
It is very possible.
Simple? No. But possible.
HTML/CSS/JS will easily take care of the front-end layout; that should be elementary.
Node.js is a good option, and would be best suited if you know that traffic will be heavy.
If traffic won't be heavy, I guess PHP is OK.
And you will also need a backend database... again, it depends on how many users you think you'll have. NoSQL options would suit this well, although Oracle just claimed they 'exponentially' improved MySQL performance.
But think about this idea carefully. The concept of allowing users to communicate if they're on the same page is neat, but they'd have to browse a site within your site... furthermore, you have to account for when the user presses the next/back button in the browser.
Perhaps you could fork Firefox and implement this as standalone software.
Did you mean something like Talkita?
Or any other solution from a Google search for "chat with others on same page"?
Some of them also allow followers (subscribers), etc.
Have a look; maybe you'll get an idea.
Please forget about Flash and Java applets...
I think this is a great idea and I hope you can get it to work.
I would really use Node.js + (Socket.IO | SockJS) for the server side and realtime communication; all your SOP problems will be gone.
As for the client side, just take care of cross-browser compatibility in the JavaScript and CSS.
For data persistence, use some kind of NoSQL implementation: MongoDB or CouchDB, for example.
My client has a host of Facebook pages that have become very successful. In order to move away from big brother Facebook my client wishes to create a large dynamic site that incorporates the more successful parts of the Facebook empire.
One of my client's spin-off sites has been created and is getting a lot of traffic. I'm not sure exactly how much, but it hit 90 gigs in a month, as the allocated space needed to be increased.
In any case, my client has dreamed up a massive website with its own community, looking to bring the whole community under one banner. However, I am concerned that it will get thrashed: bottlenecks, long load times, etc.
My questions:
Will a managed dedicated server be able to handle a potentially large amount of traffic?
Is it going to be better to create the various parts of the empire on their own separate hosting and domains (normal hosting or VPS), or is it better to have them all under one hood (i.e. using sub-domains)?
If they were all together, would it be better for SEO and easier to manage? Or, if they are separate, they might be quicker, but would that need some sort of Passport-style user system so people can log into any of the websites with the same user details?
What's the best way to implement a Passport-style user system? Do you connect remotely to the databases? Or run a regular cron job that updates each user's details on each domain? Maybe send a cURL request to the other sites with any new data?
Any other pros/cons to keeping all the sections together or separating them?
Large sites like Facebook manage to have everything under one root. Then sites like eBay have separate domain names, but you can use the same login across all of them.
I'm not sure what the best option is and would appreciate any guidance.
It is a very general question, but to give some hints:
1. Measure, measure and measure again. Know which parts are used heavily and which are not.
2. Fix things and go back to 1.
Really: without knowing what takes lots of time, what is used most heavily, etc., you cannot say anything useful.
VPS or dedicated server is not the right question. You start with: what do I have to do for the users? Then: how am I going to do that (for example: in the database, in scripts, in a message queue)? And then, finally, you see how much hardware you need.
One domain or multiple domains doesn't really matter, with one exception: if you have lots of static content, it might be worthwhile to serve it through a CDN like Amazon's. Read for example: http://highscalability.com/blog/2011/12/27/plentyoffish-update-6-billion-pageviews-and-32-billion-image.html where you can read some things about the possibilities with a CDN.
In general, serving static content from a separate static domain is useful; most other things don't really need it. So there you could just consider keeping it all on one domain.
I created a website where there will be many 1v1 live chats.
How can I measure whether I have built an optimal system? Is that even possible? Can I somehow "fake" many users and find out whether my system will eat lots of resources?
Thanks in advance.
You're referring to something called load testing.
You could check out one of the many commercial offerings (see the link below), or you could simply roll your own using cURL or sockets, depending on how you implemented the chat system. Combining this with timers, verbose logging, and system commands that check resource usage, you can get a good idea of how your system performs under stress.
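A rough sketch of the roll-your-own route using PHP's curl_multi to fire concurrent requests and time them; the target URL and concurrency level are made-up placeholders for the chat endpoint:

```php
<?php
// loadtest.php - fire $concurrency simultaneous requests and report timing.
$target      = 'http://localhost/chat/poll.php'; // hypothetical endpoint
$concurrency = 50;

$mh = curl_multi_init();
$handles = [];
for ($i = 0; $i < $concurrency; $i++) {
    $ch = curl_init($target);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

$start = microtime(true);
do {                               // drive all transfers to completion
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);        // wait for activity instead of spinning
} while ($running > 0);
$elapsed = microtime(true) - $start;

foreach ($handles as $ch) {
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);

printf("%d concurrent requests completed in %.2fs\n", $concurrency, $elapsed);
```

Run it while watching top (or your database's process list) to see where the resources go.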
If you want to load test your site, here's a link:
http://www.loadtestingtools.org/?opensource
If you want to know about optimizing PHP, I'd recommend
http://phplens.com/lens/php-book/optimizing-debugging-php.php
If you want to test specific actions a user would complete in a browser, we've had good experiences working with Selenium.
You may just be able to simulate this in your own code much more easily.
I've been asked to create a custom 'tracker' in PHP, to know where users are coming from and where they are going on the site.
I'm thinking of writing a simple script which connects to a database, writes the IP, browser, and time of the visit, then closes the DB link.
Is this the right way to do it?
I've found a few similar questions on Stack Overflow, but none mentioned performance.
Is there a reason you can't use a solution such as Google Analytics? It's free and has some nice features, such as heat maps that show traffic flow.
The main disadvantage is that it requires you to embed some JavaScript on all the pages - which means that it's client-side.
I suppose it's another question of the kind "I want superior performance, but I have no particular reason for it".
In fact, any solution will be fast enough, as writing logs is not a heavy operation.
The only thing to keep in mind is not to use any indexes, in case an SQL database is used.
That's all.
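For concreteness, a minimal sketch of the script the question describes, assuming a `visits` table with no indexes (per the advice above); table and column names are made up:

```php
<?php
// track.php - include this at the top of each page to log the visit.
$pdo = new PDO('mysql:host=localhost;dbname=stats', 'user', 'pass');

// A single plain INSERT into an index-free table is about as cheap
// as SQL logging gets: no index maintenance on every write.
$stmt = $pdo->prepare(
    'INSERT INTO visits (ip, user_agent, uri, visited_at)
     VALUES (?, ?, ?, NOW())'
);
$stmt->execute([
    $_SERVER['REMOTE_ADDR'],
    isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '',
    $_SERVER['REQUEST_URI'],
]);
```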
So, let's put that performance stuff aside.
The only complete solution would be analyzing the web-server logs.
Any other method will not give you the complete picture. Say some image of yours is hotlinked on other sites and creates heavy load because of that; you'd never notice it if you log only requests to PHP scripts.
So, you can run a crontab-based script every night that parses the access logs and produces comprehensive information on all user and bot activity.
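A rough sketch of such a nightly cron job; the log path is an assumption, and the regex targets Apache's common/combined log format:

```php
<?php
// parse_logs.php - run from cron, e.g.: 0 3 * * * php /path/to/parse_logs.php
$log = '/var/log/apache2/access.log';    // assumed log location

$hits = [];
foreach (new SplFileObject($log) as $line) {
    // Combined format: IP - - [date] "METHOD /path HTTP/1.1" status size ...
    if (preg_match('/^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+)/', $line, $m)) {
        $key = $m[1] . ' ' . $m[2];       // count hits per client + resource
        $hits[$key] = isset($hits[$key]) ? $hits[$key] + 1 : 1;
    }
}

arsort($hits);
foreach (array_slice($hits, 0, 20, true) as $key => $count) {
    echo "$count  $key\n";                // top 20 client/resource pairs
}
```

Unlike a PHP-only tracker, this also counts images and other static files, which is exactly the hotlinking case mentioned above.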
Check out Piwik or New Relic. If you need more customization, you should take a look at Webalizer and Visitors.
N.B.: You can customize Piwik by creating plugins: http://geekmonkey.org/articles/34-how-to-write-a-piwik-plugin
Perhaps you need some special software like Webalizer? (It's free and quite powerful.)
"Performance" is easy to say but much harder to define. It depends on a zillion circumstances, and while I may say "this is the best performance I can get", you might say "hey, what's this?".
Personally, I recommend Google Analytics. It does almost everything you need (and plenty of things you don't). You might get a small performance boost by hosting its source locally, but there's a good chance it's already cached in the user's browser anyway.
Or, if you prefer open-source solutions, give Piwik a shot.
Piwik does just that, and it does it very well. There is also a Tracking API that you can use to track a lot of things about your visitors, using PHP or any other language (REST API). See more information at http://piwik.org/docs/tracking-api/
Also, it is very modular and fast. Don't reinvent the wheel :)