Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 4 days ago.
I'm facing some problems with quizzes in Moodle. I run the server on a machine with 4 GB of RAM, and whenever users start taking quizzes (around 50-60 users at the same time) it becomes very slow. I changed the quiz auto-save interval from 1 minute to 30 minutes, which helped a little, but it is still very slow. While searching for other approaches I came across the PHP memory limit; my Moodle installation reports a PHP memory limit of 512 MB. I'd like to know how far I can reduce the PHP memory limit without harming the other processes running on the server, and whether there is another way to improve Moodle's overall performance.
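One thing worth knowing before lowering the value: `memory_limit` in php.ini is a per-request *ceiling*, not memory reserved up front, so reducing it only helps if individual requests actually approach it. What matters for a 4 GB box is the ceiling multiplied by the number of PHP workers that can run concurrently. A minimal sketch (the concrete numbers are illustrative assumptions, not a recommendation for every setup):

```ini
; php.ini -- illustrative values only.
; With ~60 concurrent quiz takers, worst case is roughly
; (number of PHP workers) x (peak memory per request).
; Moodle's documented requirements generally call for keeping
; memory_limit at 256M or higher, so there is limited room to cut.
memory_limit = 256M
```

In practice, capping the number of simultaneous PHP workers (e.g. Apache's `MaxRequestWorkers` or PHP-FPM's `pm.max_children`) usually does more for a memory-starved server than shrinking `memory_limit` itself.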
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 6 years ago.
I'm currently optimizing the caching of an application's login process, and I'm considering a file-based approach, but I'm not sure it's the best option for speed. Among the following approaches, which would most improve my application?
PHP_SESSION
File-based (physical file)
PDO_Database
FTP
Anything that touches files is less optimal for caching than staying in memory, and anything that goes over a network is generally slower still. Of the options listed, PHP_SESSION is usually your best bet for speed, with one caveat: PHP's default session handler itself writes session data to files, so for a truly in-memory session store you would point session.save_handler at something like memcached or Redis.
Do note, however, that with a memory-based store you lose the cache when the application or the server is restarted. If that is undesirable, you should probably go for a file-based solution.
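To make the trade-off concrete, here is a minimal sketch of session-backed caching for a login lookup. `fetch_user_profile()` is a hypothetical stand-in for the expensive work (e.g. a database query); the store is passed by reference so that, in a real application, you would call `session_start()` and pass `$_SESSION` as `$store`.

```php
<?php
// Hypothetical expensive call being cached (e.g. a DB query).
function fetch_user_profile(string $username): array
{
    return ['name' => $username, 'loaded_at' => time()];
}

// Return a cached value from $store if present and fresh,
// otherwise do the expensive work and cache it with a TTL.
function cached_profile(array &$store, string $username, int $ttl = 300): array
{
    $key = "profile_$username";
    $now = time();
    if (isset($store[$key]) && $store[$key]['expires'] > $now) {
        return $store[$key]['value'];          // cache hit: no expensive call
    }
    $value = fetch_user_profile($username);    // cache miss: do the work
    $store[$key] = ['value' => $value, 'expires' => $now + $ttl];
    return $value;
}
```

Typical usage after `session_start()` would be `cached_profile($_SESSION, 'alice')`; swapping the session handler (files, memcached, Redis) then changes the storage backend without touching this code.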
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 8 years ago.
I have a website run by PHP, and about 80 users will be signing up within a one-hour period.
My question is: are there any problems that could occur when lots of people access my database all at once?
80 users is a lot for me, and if those of you who handle far more than that are laughing, how many people would it actually take to overwhelm a database?
Thanks in advance!
You could use a tool like ApacheBench (ab) to check. Then you could use tools like memcached (or MemcacheDB if you need persistence) to reduce the number of MySQL queries. I would also enable MySQL's slow query log.
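A minimal sketch of enabling the slow query log in my.cnf (the log path and threshold are illustrative assumptions; variable names are the MySQL 5.6+ spellings):

```ini
# my.cnf -- log statements that take longer than 1 second.
[mysqld]
slow_query_log      = 1
slow_query_log_file = /var/log/mysql/slow.log   # path is an assumption
long_query_time     = 1
```

For the load test itself, something like `ab -n 1000 -c 80 http://yoursite.example/signup.php` would simulate 80 concurrent users against the signup page.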
Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 8 years ago.
Why doesn't PHP's script execution time limit stop the output of an endless var_dump? On my system (DragonFlyBSD) this crashes Firefox because it eats up gigabytes of memory; on Ubuntu it crashes Apache.
Edit: I did a var_dump of an object with cyclic references, without Xdebug. This was in a framework I don't know very well.
Edit 2: Use this option to limit Apache output: http://httpd.apache.org/docs/2.2/mod/core.html#limitrequestbody
var_dumping a large or cyclic object graph produces an enormous amount of output and can take an extremely long time to print to the screen.
The script time limit does not necessarily stop it: on Unix-like systems, max_execution_time counts only the CPU time the script itself consumes, and time spent streaming output to the client is not counted against it, so the dump can keep running far past the configured limit.
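One practical way around this is to bound the dump yourself rather than handing var_dump an arbitrarily deep structure. A minimal sketch (the function name and depth cutoff are my own, not a standard PHP facility):

```php
<?php
// Depth-limited alternative to var_dump for huge or cyclic structures:
// descend at most $depth levels, printing '...' past the cutoff.
function bounded_dump($value, int $depth = 3): string
{
    if ($depth <= 0) {
        return '...';                        // stop descending past the cutoff
    }
    if (is_array($value)) {
        $parts = [];
        foreach ($value as $k => $v) {
            $parts[] = "$k => " . bounded_dump($v, $depth - 1);
        }
        return '[' . implode(', ', $parts) . ']';
    }
    if (is_object($value)) {
        // Cyclic references are harmless here: recursion is bounded by $depth.
        return get_class($value) . '{'
            . bounded_dump(get_object_vars($value), $depth - 1) . '}';
    }
    return var_export($value, true);         // scalars print normally
}
```

Since the recursion is cut off by `$depth`, even an object that references itself produces a short, finite string instead of gigabytes of output.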
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 9 years ago.
What is the maximum size in MB that can be loaded per SQL request into a webpage?
(PHP is the only way I know.)
I think the data that comes back from the server is held in RAM, so maybe the limit is the amount of RAM the machine has.
Is it better to save the data in a file if I have a lot of it?
In MySQL, the maximum size of a SQL statement is limited by the max_allowed_packet variable.
SHOW VARIABLES LIKE 'max_allowed_packet'
The default setting (I believe) is 1MB, but it's possible to increase that significantly.
That's the limit MySQL sets on the size of a single SQL call.
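A short sketch of checking and raising the limit (the 64 MB figure is an illustrative value, and changing it at runtime requires sufficient privileges):

```sql
-- Check the current limit (value is reported in bytes):
SHOW VARIABLES LIKE 'max_allowed_packet';

-- Raise it for the running server; new connections pick up the new value.
SET GLOBAL max_allowed_packet = 64 * 1024 * 1024;
```

To make the change permanent, set `max_allowed_packet` in the `[mysqld]` section of my.cnf as well, since a `SET GLOBAL` is lost on server restart.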
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 9 years ago.
I have an application developed in PHP + MySQL using CodeIgniter. Currently about 3,000 active users use the application daily, and we have around 130,000 pageviews per day. The system does not use many resources; the heaviest load is HTTP traffic and reading/writing in MySQL. My question is: what is the best instance to rent at Amazon, and what configuration would you suggest? Importantly, we aim to reach 50,000 users in the next six months.
The best instance is the one you can afford that isn't overloaded when busy. You should be monitoring CPU load and memory usage using a tool like Munin to be sure you're not running out of headroom.
If you're using Amazon RDS to handle your database, you can add multiple front-end web servers to handle the load.
There's no magical size that works "best", but the c1.medium is pretty good value.