I have a web-based CRM written in PHP, running off a MySQL database. The server is hosted in the same city as the company HQ, but the company's internet connection is average (10Mbps down, 2Mbps up, 30ms ping to the server, all on a good day). The boss is happy with the results but now wants it to 'run super fast in the office', while we still need it to be viewable on the internet.
Short of moving the web server from our host onto the local office network, which isn't a great option because then it would be super slow for everyone outside the office, does anyone know a way to achieve this? I was thinking of setting up a local copy of the site and having the MySQL databases synchronise, but this sounds like a logistical nightmare.
Any ideas would be much appreciated! Happy to provide more info if needed.
You can set up dual-master replication with MySQL to accomplish this.
I would not attempt it without a fast, reliable line (which it seems you have). I would certainly set up and load-test temporary servers to prove the configuration works; a quick status check like the sketch below helps with that.
For more information:
http://devel.reinikainen.net/40
http://www.neocodesoftware.com/replication/
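During that testing, something along these lines (a rough sketch; the hostnames and credentials are placeholders) can confirm both masters are applying each other's changes:

```php
<?php
// Sketch: poll SHOW SLAVE STATUS on both masters to confirm replication
// is flowing in both directions. Hostnames/credentials are placeholders.
foreach (array('db1.example.com', 'db2.example.com') as $host) {
    $db = new mysqli($host, 'monitor', 'password');
    $status = $db->query('SHOW SLAVE STATUS')->fetch_assoc();
    printf(
        "%s: IO=%s SQL=%s lag=%ss\n",
        $host,
        $status['Slave_IO_Running'],   // should be "Yes"
        $status['Slave_SQL_Running'],  // should be "Yes"
        $status['Seconds_Behind_Master']
    );
}
```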
I am not joking around here.
Step 1) Have your boss define in written format what 'super fast' means. This could (should?) include page load times for specific pages.
Step 2) Determine where there is a deficiency in speed. You think you know, but you don't. Measure it and record results. Use Firebug in Firefox to check page load and transfer times on the client side; a crude server-side timer (see the sketch below) tells you how long PHP itself takes.
Step 3) Identify how you can speed up the app based on the SPECIFIC measurements you took.
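For the server side, something like this (a sketch; the log path is an arbitrary example) records how long PHP spends building each page, which you can compare against Firebug's network numbers:

```php
<?php
// Crude server-side timer: include this at the top of a page and it logs
// how long PHP took to generate the response.
$start = microtime(true);
register_shutdown_function(function () use ($start) {
    $ms = (microtime(true) - $start) * 1000;
    // message_type 3 appends to the given file instead of the system log
    error_log(sprintf("%s %.1fms\n", $_SERVER['REQUEST_URI'], $ms), 3, '/var/log/app-timing.log');
});
// ... existing page code runs below ...
```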
My website is reqsbook.com.
It is basically a job portal website; it currently has around one lakh (100,000) job listings.
When I search for jobs on my website it performs very slowly. I am using a HostGator cloud server and I have hosted a single domain there.
I have compressed my website as much as possible.
I read on the internet that I should either move to a dedicated server or run a local server, meaning I keep the server at my own location and maintain the website from there.
I am thinking that even if I took a dedicated server the same problem may be repeated, because my website's database is growing day by day.
As for a local server, I don't have any knowledge of this.
Please, someone help me and give me a better idea.
Thank you
Yes, it's really slow. Only better code structure and proper use of your RDBMS can help you.
Restructure your code so that the same code does not have to execute many times.
Make proper use of your relational database management system: use indexing for fast searching, and views, temporary tables, and transactions can help you manage data more easily. The sketch below shows the indexing idea.
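For example, a sketch of the indexing idea, assuming a hypothetical `jobs` table with `title` and `description` columns (adjust all names and credentials to your schema):

```php
<?php
// Placeholder credentials; a hypothetical `jobs` table is assumed.
$db = new mysqli('localhost', 'user', 'password', 'jobportal');

// One-time setup: a FULLTEXT index makes keyword search far faster than
// LIKE '%term%', which scans every row.
// $db->query("ALTER TABLE jobs ADD FULLTEXT INDEX ft_jobs (title, description)");

// Search through the index instead of with LIKE.
$stmt = $db->prepare(
    "SELECT id, title FROM jobs
     WHERE MATCH(title, description) AGAINST (? IN NATURAL LANGUAGE MODE)
     LIMIT 50"
);
$search = isset($_GET['q']) ? $_GET['q'] : '';
$stmt->bind_param('s', $search);
$stmt->execute();
$result = $stmt->get_result();
while ($row = $result->fetch_assoc()) {
    echo htmlspecialchars($row['title']), "\n";
}
```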
We run a fairly busy website, and currently it runs on a traditional one-server LAMP stack.
The site has lots of legacy code, and the database is very big (approx. 50GB when gzipped, so probably 4 or 5 times that uncompressed).
Unfortunately, the site is very fragile, and although I'm quite confident load-balancing web servers with just one database backend, I'm at a bit of a loss with replication and that sort of thing.
There's a lot of data written to and read from the database at all times. I think we can probably fail over to a slave MySQL database fairly easily, but I'm confused about what needs to happen when the master comes back online (if a master/slave setup is suitable at all). Does the master pick up any rows written to the slave when it comes back up, or does something else have to happen?
Is there a standard way of making PHP decide whether to use a master or slave database? The rough sketch below is the only pattern I've come across, but I've never used it.
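(Untested, and the hostnames/credentials are placeholders; it's just to show the kind of thing I mean.)

```php
<?php
// Route writes to the master and reads to a slave, sharing one
// connection of each kind per request.
function db($forWrite)
{
    static $master = null, $slave = null;
    if ($forWrite) {
        if ($master === null) {
            $master = new mysqli('db-master.example.com', 'user', 'password', 'app');
        }
        return $master;
    }
    if ($slave === null) {
        $slave = new mysqli('db-slave.example.com', 'user', 'password', 'app');
    }
    return $slave;
}

// Reads can hit the slave...
$rows = db(false)->query('SELECT id, name FROM customers LIMIT 10');

// ...writes must go to the master.
db(true)->query("UPDATE customers SET name = 'Acme Ltd' WHERE id = 42");
```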
Perhaps someone can point me in the way of a good blog post that can guide me?
Thanks,
John
If you are trying to create a failover solution for your entire website, I found this article interesting. It talks about creating a clone of the MySQL database and keeping the two in sync with rsync.
A simpler solution would be to just back up your database periodically with a script that runs from a cron job (a minimal sketch is below). Also set up a static web page failover solution. This website has an article on setting that up. That's the way we do it. This way, if your database has issues, you can restore it using one of your backups while you fail over to your temporary static page.
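For the cron approach, a minimal sketch (paths, credentials, and the database name are placeholders; assumes mysqldump and gzip are installed):

```php
<?php
// Run from cron, e.g.:  0 3 * * * php /var/scripts/backup.php
// Dumps the database to a timestamped, compressed file.
$file = sprintf('/var/backups/db-%s.sql.gz', date('Y-m-d-His'));
$cmd  = 'mysqldump --single-transaction -u backup_user -pPASSWORD app_db'
      . ' | gzip > ' . escapeshellarg($file);
exec($cmd, $output, $status);
if ($status !== 0) {
    // Cron mails any output to the owner by default, so make failures loud.
    fwrite(STDERR, "Backup failed with status $status\n");
    exit(1);
}
```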
This is going to be a very stupid question, but I'm new to PHP and programming altogether, and I'm about to launch my first-ever database-powered website. I'm still a student in college, so please take it easy on me.
I was using phpMyAdmin for my local development. I'm ready now to launch the website and wanted to ask: should I upload phpMyAdmin together with my website, or not?
Thanks a million. This is an easy question for most of you, but I really don't know.
I say no, it can become a security vulnerability if not properly secured - but that's completely up to you. If you get yourself a good GUI MySQL client, you'll save yourself a lot of time anyway.
No, you should not. If you are using shared web hosting, the host normally has an internal URL where you can manage your databases using phpMyAdmin. Or if you are using your own private server, install phpMyAdmin in another location under another hostname/port, and only allow connections from your IP (the sketch below shows the general idea in plain PHP).
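The general idea looks like this (the address is a made-up example; rules in your web server config are the more usual way to do it):

```php
<?php
// Drop-in guard: deny everyone except the allowlisted address(es).
$allowed = array('203.0.113.25'); // replace with your own IP
if (!in_array($_SERVER['REMOTE_ADDR'], $allowed, true)) {
    header('HTTP/1.1 403 Forbidden');
    exit('Forbidden');
}
```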
Apologies if this particular problem has been answered already (a search didn't turn up anything directly relevant).
We are developers of a web app that provides community commenting and "social" features to our partners' websites. Our app uses JavaScript and HTML on the front end, PHP and MySQL on the back.
Currently we are running everything through our own servers, which is getting very expensive.
We would like to ask our partners if we can host the app through their servers, with them getting a discount on our monthly charge for the bandwidth/CPU load they would help us share.
My question is: is there a way to host our app on our partners' web servers such that we can offload most of the CPU time and bandwidth without exposing our source code?
I would greatly appreciate any ideas/help!!
Thank you very much all!
If you also serve static or rarely changing content, your clients could run a caching reverse proxy to remove some load from your servers without you giving them any source code at all. But you need to implement caching headers for this to work properly (a sketch is below).
You may want to look into nginx.
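A minimal sketch of those caching headers, sent from PHP before any other output (the five-minute TTL and the render_widget() helper are made-up examples):

```php
<?php
$ttl = 300; // let proxies cache the response for five minutes
header('Cache-Control: public, max-age=' . $ttl);
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $ttl) . ' GMT');

// An ETag lets the proxy revalidate cheaply once the TTL expires.
$body = render_widget(); // hypothetical function that builds the HTML
$etag = '"' . md5($body) . '"';
header('ETag: ' . $etag);
if (isset($_SERVER['HTTP_IF_NONE_MATCH']) && $_SERVER['HTTP_IF_NONE_MATCH'] === $etag) {
    header('HTTP/1.1 304 Not Modified');
    exit;
}
echo $body;
```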
On second thought: did you try to compile your scripts using Facebook's HipHop for PHP? First of all, the scripts should perform way better; second of all, if you still had to outsource the hosting, you would deploy a compiled program, with no source code involved.
If you put the code on their server, they can get at it, so that won't be 100% secure. You can make it difficult, but it's still not great.
The most doable solution might be to separate parts of the application and share only some of them. You give away one process (so source and other needed data), but it's only part of the total. That way no partner has your total solution, yet you still outsource parts of the load. The partner-hosted part can then call back to an API you keep on your own servers, as in the sketch below.
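For instance (a sketch; the URL, API key, and response format are made up), the partner-hosted piece can be a thin renderer that fetches everything from your API:

```php
<?php
// Thin partner-hosted front end: all the real logic stays behind your API.
$threadId = isset($_GET['thread']) ? $_GET['thread'] : '';
$ch = curl_init('https://api.example.com/v1/comments?thread=' . urlencode($threadId));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Authorization: Bearer PARTNER_API_KEY'));
$json = curl_exec($ch);
curl_close($ch);

// The partner machine only ever sees rendered data, never your core code.
$data = json_decode($json, true);
$comments = isset($data['comments']) ? $data['comments'] : array();
foreach ($comments as $comment) {
    echo '<p>', htmlspecialchars($comment['text']), '</p>';
}
```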
I have a simple CRM system that allows sales to put in customer info and upload appropriate files to create a project.
The system is already hosted in the cloud, but the office internet upload speed is horrendous. One file may take 15 minutes or more to finish uploading, causing a bottleneck in the sales process.
Upgrading our office internet is not an option; what other good solutions are out there?
I propose splitting the project submission form into two parts. The project info fields are posted directly to our cloud server web app and stored in the appropriate DB table, while the file itself is submitted to a LAN server with a simple DB and an API that the cloud-hosted web app can use to retrieve the file, via a download link, if it is ever needed again. Details need to be worked out, but this is what I want to do in general (a rough sketch of the LAN endpoint is below).
Is this a good approach to solving this slow upload problem? I've never done this before, so are there also any obstacles to this implementation? Cross-domain restrictions come to mind, but I believe that can be worked around with an iframe.
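(Untested; the table, field, and path names below are made up, just to make the idea concrete.)

```php
<?php
// LAN-side endpoint: the form posts the file here while the other fields
// go straight to the cloud app.
$dir  = '/srv/uploads/';
$name = uniqid('proj_', true) . '_' . basename($_FILES['document']['name']);

if (!move_uploaded_file($_FILES['document']['tmp_name'], $dir . $name)) {
    header('HTTP/1.1 500 Internal Server Error');
    exit('Upload failed');
}

// Record the file so the cloud app can request it later by ID.
$db = new mysqli('localhost', 'user', 'password', 'uploads');
$stmt = $db->prepare('INSERT INTO files (project_id, path) VALUES (?, ?)');
$stmt->bind_param('ss', $_POST['project_id'], $name);
$stmt->execute();

echo json_encode(array('file_id' => $db->insert_id));
```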
If bandwidth is the bottleneck, then you need a solution that doesn't chew up all your bandwidth. You mentioned that you can't upgrade your bandwidth - what about putting in a second connection?
If not, the files need to stay on the LAN a little longer. It sounds like your plan would be to keep the files on the LAN forever, but you can store them locally initially and then push them later.
When you do copy the files out to the cloud, be sure to compress them and also set up rate limiting, so they take up maybe 10% of your available bandwidth during business hours (a sketch of such a push script is below).
Also put some monitoring in place to make sure the files are being sent in a timely manner.
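A sketch of that push script (run from cron overnight; the paths, URL, and queue table are hypothetical, and the cap assumes the 2Mbps uplink mentioned above):

```php
<?php
// Compress each queued file and upload it to the cloud with a bandwidth cap.
$db = new mysqli('localhost', 'user', 'password', 'uploads');
$pending = $db->query('SELECT id, path FROM files WHERE pushed = 0');

while ($file = $pending->fetch_assoc()) {
    $local = '/srv/uploads/' . $file['path'];
    $gz    = $local . '.gz';
    // gzip -c keeps the original file and writes the compressed copy
    exec('gzip -c ' . escapeshellarg($local) . ' > ' . escapeshellarg($gz));

    $ch = curl_init('https://crm.example.com/api/files/' . (int)$file['id']);
    curl_setopt_array($ch, array(
        CURLOPT_UPLOAD               => true, // curl sends an HTTP PUT
        CURLOPT_INFILE               => fopen($gz, 'rb'),
        CURLOPT_INFILESIZE           => filesize($gz),
        CURLOPT_RETURNTRANSFER       => true,
        // ~25KB/s, roughly 10% of a 2Mbps (250KB/s) uplink
        CURLOPT_MAX_SEND_SPEED_LARGE => 25 * 1024,
    ));
    if (curl_exec($ch) !== false) {
        $db->query('UPDATE files SET pushed = 1 WHERE id = ' . (int)$file['id']);
    }
    curl_close($ch);
}
```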
I hope nobody needs to download those files! :(