Closed. This question is not about programming or software development. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 3 days ago.
I am planning to install Moodle on Amazon EC2 with ELB. The approach I am thinking of is a couple of Moodle instances and a couple of DB instances: each Moodle instance points to the DB instances through a load balancer, and each DB syncs automatically.
Please advise: will this work?
I don't think AWS offers multiple DB instances that are synchronized with each other while all accepting both reads and writes. It seems they can only have read replicas; see http://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_ReadRepl.html and https://aws.amazon.com/rds/mysql/ (the Replication section).
Also, synchronizing a high number of DB instances would be a big overhead. Would the replication be synchronous? In that case it would incur a performance penalty.
The best option would be to have a number of Moodle instances behind a load balancer, all of them pointing to the same DB. I do not think the bottleneck sits in the DB. If you also tune the DB and add some performant storage (SSDs; see the link above for details), everything should be fine.
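If read replicas do fit the design later (e.g. to offload Moodle's read-heavy queries), one can be created with the AWS CLI. A minimal sketch — the instance identifiers and class below are placeholders, not values from the question:

```shell
# Create a read replica of an existing RDS instance.
aws rds create-db-instance-read-replica \
  --db-instance-identifier moodle-db-replica \
  --source-db-instance-identifier moodle-db \
  --db-instance-class db.t3.medium
```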
Closed 2 years ago.
There is an application built on Laravel, and it should be able to handle a load of 1000 requests per second.
I have done the tasks below:
1. The Composer autoload has been dumped (optimized)
2. Query results are cached
3. All views have been minified
What else should I consider?
(The app runs in a Docker container)
How are you measuring whether you reach the TPS target? I would first establish a baseline in order to know how far off you are, and based on that start looking into which part of your application stack is the limit (this includes the web server, the database server, and any other services used). Tools available for this include JMeter and Apache Bench.
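As a sketch, a baseline could be taken with Apache Bench; the URL and numbers below are placeholders, not values from the question:

```shell
# 10,000 requests at a concurrency of 100 against the app's front page.
ab -n 10000 -c 100 http://localhost/
```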
In order to reach 1000 TPS you'll need to tweak the web server to allow for this type of load. How to approach this depends on the web server used, so it is difficult to give you specifics.
With regard to your DB server, there are tools available to benchmark it as well, such as pgBadger (Postgres), or you can inspect the slow-query log.
Ultimately you would also want to be on one of the latest PHP versions, as there are significant performance gains in every new release. At the time of writing, the latest released PHP version is 7.4.
In my opinion these tweaks would yield a greater performance gain than tweaking the PHP code (assuming there is no misuse of PHP). But this of course depends on the specifics of your application.
Finally, you should also be able to scale horizontally (as opposed to vertically), increasing the total TPS with every application server you add.
Tips to improve Laravel performance:
Config caching.
Route caching.
Remove unused services.
Classmap optimization.
Optimize the Composer autoload.
Limit the use of plugins.
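As a rough sketch, the caching and autoload items above map to the following artisan and Composer commands (run at deploy time, not during development):

```shell
php artisan config:cache           # cache the merged configuration files
php artisan route:cache            # cache the compiled route table
composer dump-autoload --optimize  # generate a classmap-optimized autoloader
```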
Closed 5 years ago.
Let's say I want to implement a storage server to be used as a place to store files, images, etc. from different websites. Something like S3, but only for my projects.
I thought about some API/gateway in PHP that would save files from those websites to the appropriate server, but is that a good way? And should I use WebDAV, or maybe NFS/SMB? Which protocol is more secure and fast?
Can you please give me advice on how to create my own storage server? In particular I would like to hear about an appropriate stack for that, thank you.
There are a number of projects for building your own NAS or SAN (I think that is what you're looking for). Look at the FreeNAS project, for example. It does require quite a bit of memory, though (depending on the size of your storage and the demands you put on it).
When you build your own NAS you will not need very powerful CPUs, unless you want to run apps on the NAS (FreeNAS provides a system for running containerized applications on the NAS, using its storage), but you will need memory and of course plenty of disks, again depending on your exact requirements.
However, if you're simply looking for a place to store your own files and they are not extremely large or extremely numerous, you could simply build a Linux server and push (or pull) the files using SFTP. It only uses OpenSSH and a single port, is fully encrypted, and has minimal overhead.
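As an illustration of the SFTP approach, a file can be pushed non-interactively in batch mode; the host name and paths below are placeholders:

```shell
# Push an uploaded file to the storage server over SFTP (batch mode
# reads commands from stdin when the batch file is "-").
sftp -b - storage@files.example.com <<'EOF'
put /var/www/uploads/image.jpg /srv/storage/image.jpg
EOF
```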
Closed 8 years ago.
I want to implement a chat system on my website where users can interact with each other in rooms. This is my first time implementing a chat system.
While searching, I found that phpFreeChat is a good option, but going through its introduction I found that it doesn't use a DB at all. So I am wondering how good its performance is, and how flexible it is compared to a DB-based approach.
Could anyone who has used it give a viewpoint on whether I should go for phpFreeChat, so that I can then start learning more about it? The website has huge traffic of around 3 million visits per month.
Any pull-based chat system (in which the clients have to actively contact the server to ask for updates) is hugely resource-intensive. Every client makes a request every so many seconds; multiply that by the number of clients and you're very soon DDoSing your own server.
A proper system should be push-based: every client holds a persistent connection to the server, and the server pushes messages to all relevant parties in real time. This is perfectly possible using WebSockets, with long polling as a fallback. A pub/sub protocol like WAMP is perfect for this use, as are more specialised protocols like XMPP.
Writing to a file or database is entirely unnecessary and would only be a secondary feature for the purpose of data persistence. The server just needs to be a message broker; storage is not required.
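The broker idea is language-agnostic; below is a minimal in-memory room broker sketched in Python, where each subscriber callback stands in for a client's WebSocket connection. All names are illustrative — this is not phpFreeChat's implementation:

```python
from collections import defaultdict

class ChatBroker:
    """Minimal pub/sub message broker: rooms map to subscriber callbacks.

    In a real deployment each callback would push a frame down a
    persistent WebSocket connection; no message is ever stored.
    """

    def __init__(self):
        self._rooms = defaultdict(list)  # room name -> list of callbacks

    def subscribe(self, room, callback):
        self._rooms[room].append(callback)

    def unsubscribe(self, room, callback):
        self._rooms[room].remove(callback)

    def publish(self, room, message):
        # Push the message to every client currently in the room.
        for callback in self._rooms[room]:
            callback(message)

# Usage: two clients join a room; one published message reaches both.
received = []
broker = ChatBroker()
broker.subscribe("lobby", lambda m: received.append(("alice", m)))
broker.subscribe("lobby", lambda m: received.append(("bob", m)))
broker.publish("lobby", "hello")
print(received)  # [('alice', 'hello'), ('bob', 'hello')]
```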
It depends on what you need. My first chat application was also file-based, and it was (and still is) pretty quick, but customizing it and adding new functions is a pain. If your only need is a quick chat without complex functions, go for file-based. If you need user rights and other complex things, go for a database-based system.
Closed 9 years ago.
I am creating a website and am expecting fairly normal usage. For now I am setting up the system with 1 Apache server and 2 DB servers. I want any DB operation to be reflected in both DB servers so that I can keep one server as a backup. How do I do that?
The ways I can think of are:
Perform the same operations on both DBs from PHP. This seems like a terrible idea.
Update one DB and sync both DB servers periodically. This seems better.
Is there a better way to achieve this? How is it done in enterprises?
If you're using MySQL, there is quite powerful built-in replication.
Check out the docs
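As a sketch, the core my.cnf settings for classic MySQL source/replica replication look roughly like this; server IDs and log names are illustrative, and the full setup (replication user, connecting the replica to the source) is in the MySQL docs:

```ini
# my.cnf on the primary (source) server
[mysqld]
server-id = 1
log-bin   = mysql-bin

# my.cnf on the replica server (separate machine)
[mysqld]
server-id = 2
relay-log = mysql-relay-bin
read-only = 1
```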
A terrible idea is to take a backup each time a new operation happens. No application, modern or old, works this way. Even Windows System Restore makes backups at scheduled times, not on each operation.
I'd suggest you make an SQL dump script and schedule a cron job which runs it once or twice a day. If you really need the data on the second server immediately (assuming that if one of the DB servers crashes, your app should continue working immediately with the backup server), you can make an import script which runs once the dump finishes.
If you are not in that special case, where a second server must take over as soon as the first DB server shuts down, you can just store the dumped SQL files on the machine and not load them into a real database until they are needed.
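A minimal sketch of such a dump script plus cron entry — the database name, credentials and paths are placeholders:

```shell
# crontab entry: run the dump script once a day at 02:00
#   0 2 * * * /usr/local/bin/db-dump.sh

# /usr/local/bin/db-dump.sh
# Consistent dump (InnoDB) compressed with the date in the file name.
mysqldump --single-transaction -u backupuser -p"$DB_PASS" mydb \
  | gzip > /var/backups/mydb-$(date +%F).sql.gz
```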
Closed 11 years ago.
I'm in need of some software-based way to reserve the use of a couple of machines. There are about 5 different machines in a lab that are shared among everybody, but people need to schedule the days/times they want to use them. This is currently handled with pen and paper, and you need to physically walk from place to place to see when the machines are free and available for sign-up. I've been tasked with moving this system to a private web server that currently runs an installation of MediaWiki.
I've looked for extensions for MediaWiki itself, but I couldn't find any premade scheduler/planner/queue system that allows users to reserve a time frame/day to use a machine. Additionally, it would be nice if anyone could sign up but users were restricted from removing others from the queue (which is why traditional calendar software with the honor system wouldn't exactly work).
The solution doesn't need to be embedded within MediaWiki itself, but it must be hostable on a web server. Do you have any suggestions on how I can approach this? The best I can come up with is to buckle down and write my own PHP/Django-based site to handle this (I'm not very experienced with either). While I do have time, I want to make sure there isn't something available that I missed before dedicating my time to writing a custom application, and I would appreciate any help.
While I've not used this:
phpscheduleit