I was wondering if someone could give a high-level answer about how to track down the functions that are causing a slowdown.
We have a site with 6 thousand lines of code and at times there's a significant slowdown.
I was wondering what would be the best way to track the source of these occasional slowdowns? Should we attach a time execution tracker on each function or would you recommend something else?
It's a standard LAMP stack setup with PHP 5.2.9 (no frameworks).
The only way to properly track down why and where a script is slowing down is by using a profiler.
There are a few of these available for PHP. Some of them require you to install a module on the server, some use a PHP-only library, and others are stand-alone.
My preferred profiler is Zend Studio, mainly because I use it as my IDE. It has the benefit of working both stand-alone and in conjunction with server-side modules (or the Zend Server package), allowing you to profile both locally and on production systems.
One of the easiest things to look for, however, is SELECT queries inside loops. They are notorious for causing slowdowns, especially once the table being queried holds more than a few hundred records.
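For illustration, here is a minimal sketch of that pattern and its fix, assuming a hypothetical `orders` table and an open mysqli connection in `$db`:

    <?php
    // Anti-pattern: one SELECT per loop iteration ("N+1 queries").
    // Table name `orders` and connection $db are assumptions for this sketch.
    foreach ($userIds as $id) {
        $res = $db->query("SELECT total FROM orders WHERE user_id = " . (int)$id);
        // ... one round trip to the database per row ...
    }

    // Better: fetch everything in a single query.
    $ids = implode(',', array_map('intval', $userIds));
    $res = $db->query("SELECT user_id, total FROM orders WHERE user_id IN ($ids)");
    while ($row = $res->fetch_assoc()) {
        $totals[$row['user_id']] = $row['total'];
    }

The second form issues one round trip regardless of how many IDs you have, which is usually the difference between milliseconds and seconds once the loop grows.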
Another is if you have multiple AJAX calls in rapid succession while using the default PHP session handler (flat files). This can increase loading times significantly because the file I/O operations lock the session: only one request that uses the session can be handled at a time, even though AJAX is by its very nature asynchronous.
The best way to combat this is to use or write a custom session handler that stores sessions in a database. Just make sure you don't saturate the database's connection limit.
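A rough sketch of such a handler, using the callback form of session_set_save_handler() (which works on PHP 5.2) with PDO; the `sessions` table, its columns, and the connection details are assumptions, not a standard schema:

    <?php
    // Hedged sketch: a database-backed session handler.
    // Assumed table: sessions (id VARCHAR primary key, data TEXT, updated_at INT).
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

    function sess_open($savePath, $name) { return true; }
    function sess_close() { return true; }

    function sess_read($id) {
        global $pdo;
        $stmt = $pdo->prepare('SELECT data FROM sessions WHERE id = ?');
        $stmt->execute(array($id));
        $row = $stmt->fetch(PDO::FETCH_ASSOC);
        return $row ? $row['data'] : '';
    }

    function sess_write($id, $data) {
        global $pdo;
        $stmt = $pdo->prepare('REPLACE INTO sessions (id, data, updated_at) VALUES (?, ?, ?)');
        return $stmt->execute(array($id, $data, time()));
    }

    function sess_destroy($id) {
        global $pdo;
        $stmt = $pdo->prepare('DELETE FROM sessions WHERE id = ?');
        return $stmt->execute(array($id));
    }

    function sess_gc($maxLifetime) {
        global $pdo;
        $stmt = $pdo->prepare('DELETE FROM sessions WHERE updated_at < ?');
        return $stmt->execute(array(time() - $maxLifetime));
    }

    session_set_save_handler('sess_open', 'sess_close', 'sess_read',
                             'sess_write', 'sess_destroy', 'sess_gc');
    session_start();

If some of those rapid-fire AJAX endpoints only read the session, calling session_write_close() as soon as you are done with it also releases the lock early, whichever handler you use.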
First and foremost though: Get yourself a proper profiler. ;)
I want to implement real-time notifications like Facebook's. There may be a huge number of notifications to send to different users, depending on server load and how efficient the code is. Which is the best approach?
1. using normal AJAX?
2. with node.js and socket programming?
3. something else?
Thanks in advance.
The choice of the proper platform greatly depends on your current architecture, knowledge, and budget.
Your question suggests that it is web based, for which there are only two basic options:
WebSocket: Many WebSocket server solutions exist, including compiled executables, PHP-based servers, and Node.js. This approach is rapidly gaining popularity, but it isn't necessarily accessible to every budget since it usually requires a dedicated server; the limits of a typical VPS are usually too restrictive for a system that needs to sustain so many simultaneous connections.
AJAX: The use of AJAX and its variants is still a very popular solution and, when well implemented, can be almost as efficient as WebSocket without the need to keep connections open constantly. A one-second delay often doesn't matter, and Facebook chat is usually much slower than that.
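As an illustration of the AJAX option, a polling endpoint can be as small as this (a sketch only; the `notifications` table, its columns, and the connection details are assumptions):

    <?php
    // notifications.php - hedged sketch of a simple AJAX polling endpoint.
    // The client requests this URL every few seconds with the last ID it has seen.
    session_start();
    $userId = isset($_SESSION['user_id']) ? (int)$_SESSION['user_id'] : 0;
    $since  = isset($_GET['since']) ? (int)$_GET['since'] : 0;

    $pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
    $stmt = $pdo->prepare(
        'SELECT id, message FROM notifications WHERE user_id = ? AND id > ? ORDER BY id'
    );
    $stmt->execute(array($userId, $since));

    header('Content-Type: application/json');
    echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));

On the client, a setInterval() call fetches this URL every few seconds and renders any new rows; the polling interval is the knob that trades freshness against server load.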
For a non-web-based solution, anything is possible. If you develop a client-server application, you can have real-time connections similar to WebSocket, which can be even easier to maintain.
AJAX requests are not real time, but you can poll on a timer, and in this case that works.
However, constant AJAX polling keeps your server busy and opens many connections. It is simple, though!
Using sockets is better, but you will need more time to develop it.
Reading the following link can help you:
Ajax vs Socket.io
Nowadays we have two possible solutions: WebSockets and Comet. WebSockets are probably the better solution, but they have two major problems:
Not all browsers support them.
Not all proxy servers allow communication over WebSockets.
Because of that I prefer to use Comet (at least for now). It's not as good as WebSockets, but it's pretty straightforward and it works (even on IE).
Real-time notifications – more details: to learn more, refer to the link above.
I need to create an application in PHP with a background thread containing a timer that keeps updating a database (by collecting data from different sites), separately and without any user interaction. What I mean is: without anybody visiting the site, the thread has to keep updating the database. Is this possible in PHP, and how can I realise it?
I think the best way is to create a PHP script that does whatever you want, and then set up a cron job to run that script at specific times.
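As a hedged sketch (the paths, table, and feed URL are placeholders, not recommendations), that combination could look like this:

    <?php
    // update_data.php - sketch of a CLI script intended to be run by cron.
    // Example crontab entry (every 5 minutes; adjust paths to your setup):
    //   */5 * * * * /usr/bin/php /path/to/update_data.php >> /var/log/update_data.log 2>&1

    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

    // Collect data from an external site (URL and table name are placeholders).
    $data = file_get_contents('http://example.com/feed.json');
    if ($data === false) {
        exit(1); // non-zero exit so cron's logging/mail shows the failure
    }

    $stmt = $pdo->prepare(
        'REPLACE INTO feed_cache (source, payload, fetched_at) VALUES (?, ?, NOW())'
    );
    $stmt->execute(array('example.com', $data));
    echo date('c') . " updated\n";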
There are several options for this:
A scheduled task in your operating system, such as cron on *nix or Windows Scheduler for the Windows platform.
A permanently running script. This is not ideal for PHP though, as memory is sometimes not released correctly and the script can run out of memory. It is common for such scripts to be set to die and respawn to prevent this from happening.
A scheduled task in your database server. MySQL supports this via its event scheduler. If your purpose is to run database updates, this might be a good option, provided you are running MySQL and your version is sufficiently recent.
A queue, where some processing is done in the background upon a request signal. See Gearman, Resque and many others. It is useful where a user requests something in a web application, but that request is too lengthy to carry out immediately. If you need something to run permanently then this may not be ideal - I add it for completeness.
Having a PHP process run for a long time isn't really a good idea because PHP isn't a very memory efficient language and PHP processes consume a lot of memory.
It would be better to use a job manager. Take a look at Gearman.
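For a rough idea of what that looks like with the PHP Gearman extension (the function name and payload here are illustrative only, not part of Gearman itself):

    <?php
    // worker.php - hedged sketch of a long-lived Gearman worker
    // (run it under something like supervisord so it respawns if it dies).
    function update_database(GearmanJob $job)
    {
        $payload = json_decode($job->workload(), true);
        // ... fetch remote data and write it to the database here ...
        return 'done';
    }

    $worker = new GearmanWorker();
    $worker->addServer('127.0.0.1', 4730);
    $worker->addFunction('update_database', 'update_database');
    while ($worker->work());

    // From your web code (or a small cron trigger), queue a background job:
    //   $client = new GearmanClient();
    //   $client->addServer('127.0.0.1', 4730);
    //   $client->doBackground('update_database', json_encode(array('site' => 'example.com')));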
I have a leased VPS with 2 GB of memory.
The problem is that I have a few Joomla installations, and the server becomes very slow to respond when more than 30-50 users are connected at the same time.
Do you have any tips, books, tutorials, or suggestions on how to improve response time in this situation?
Please give me only very concrete and useful URLs; I would be very grateful.
I have attached part of the htop view of that VPS.
The easiest and cheapest thing you can do is install a bytecode cache, e.g. APC. That way PHP does not need to parse and compile every file again and again.
If you're on Debian or Ubuntu this is as easy as apt-get install php-apc.
I'm going to guess that most of your issues will come from Joomla - I'd start by looking through this list: https://stackoverflow.com/search?q=joomla+performance
Other than that, you might want to investigate a php accelerator: http://en.wikipedia.org/wiki/List_of_PHP_accelerators
If you have any custom SQL, you might want to check that your queries are making good use of indexes.
A quick look at your config suggests you're using Apache's prefork MPM - you might want to try the threaded worker MPM, though always benchmark each config change you make (Apache comes with a benchmarking tool, ab) to ensure the change has a positive effect.
Some other links..
http://www.brandonturner.net/blog/2009/07/fastcgi_with_php_opcode_cache/
Though this is for WordPress, the principles should still apply.
http://blog.mydream.com.hk/howto/linux/performance-tuning-on-apache-php-mysql-wordpress
A couple of things to pay close attention to.
You never want your server to run out of memory. Ensure your Apache config limits the number of children to within your available memory.
Running SHOW PROCESSLIST on MySQL and looking for long-running queries can highlight some easy wins, as nothing kills performance like a slow SQL query.
Okay, so I've been running some pretty large queries on my site and it's been running up the MySQL resources. My admin asked whether I've tried different PHP accelerators, but I've never installed one before. So I did some research on it, and I'm curious: do I need to make any modifications to my actual PHP code, or do I just install an accelerator and let it take effect? I need ways to optimize my load and reduce the amount of resources being used on the server.
"PHP accelerators" are opcode caches; they save the server from having to re-interpret PHP files on every request. The savings is somewhere in the realm of 1% of CPU load, and it won't help you one bit if your problem is with the database's resource usage.
Most PHP accelerators work by caching the compiled bytecode of PHP scripts to avoid the overhead of parsing and compiling source code on each request (some or all of which may never even be executed). To further improve performance, the cached code is stored in shared memory and directly executed from there, minimizing the amount of slow disk reads and memory copying at runtime.
Source: http://en.wikipedia.org/wiki/PHP_accelerator
Sounds to me like you need to accelerate your SQL queries, not your PHP code.
Here is a list of PHP accelerators that you can evaluate and install:
http://en.wikipedia.org/wiki/List_of_PHP_accelerators
I've used APC, which I believe is one of the most popular PHP accelerators. Besides the opcode cache, it also provides a user cache (apc_store / apc_fetch) that lets you cache the return value of an expensive call, so subsequent calls with the same arguments can read the cached value instead of recomputing everything.
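For example, a small memoization wrapper around APC's user cache might look like this (the function and key names are placeholders):

    <?php
    // Hedged sketch of explicit result caching with apc_fetch()/apc_store().
    // expensive_report() stands in for whatever heavy query or computation
    // you want to avoid repeating on every request.
    function cached_report($month)
    {
        $key    = 'report_' . $month;
        $cached = apc_fetch($key);
        if ($cached !== false) {
            return $cached;                 // cache hit: skip the expensive work
        }
        $result = expensive_report($month); // the slow SQL / computation
        apc_store($key, $result, 300);      // keep it for 5 minutes
        return $result;
    }

Note that this part is explicit: the opcode cache kicks in automatically once the extension is enabled, but result caching only happens where you add calls like these.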
I'm working on a rather intensive rewrite and given a choice of the following options:
JSP / Java running on Tomcat
PHP running under Apache
Ruby (running under... I'm not sure; RoR?)
A couple of basic questions I would like to know about the above.
Speed is a concern. We have a MongoDB-backed database, so we shouldn't need to wait on the database for information, but the frontend needs to be as fast as possible. The common saying that speed isn't a concern doesn't really apply here: if you're processing 500k+ objects in one request, it needs to be fast.
Scalability is another concern. Suppose our database blossoms out of control. Which (of the above) would provide the easiest method of handling this?
What are common downsides of Tomcat / JSP and Ruby? Is parallel processing easy to do with PHP or Ruby?
The goal is not to save money but to build a solid, fast, scalable system to continue development on for years to come.
I'll be honest: I'm a former Java developer (not JSP) turned PHP developer. My preference for anything is PHP, but I'm also a big believer in using the right tool for the job. The team is competent enough to write this in anything we choose.
Seems like any of them would be acceptable based on the limited info so far. The important things I've begun to consider when launching new projects are more about the ORM and framework than about speed. For every extra 40 hours of developer time I have to spend on a project I can provision and operate a new server for 1 year.
If you have developers that are better versed in the APIs for a particular language, that alone could (potentially) make your decision. If you can parallelize 500k things across 10 servers, and choosing language (and API/libraries) A over B will save you 10 weeks, then that is your breakeven point. Similarly, if one set of things is 2x as slow, and having 2 servers instead of 1 could double your processing speed, then it will only take 1 week of extra fighting in the "faster" language before all your performance gains are wiped out due to longer development time...
Ended up going with Play!
Reasons:
Quick startup
No redeploying / packaging
Straightforward MVC pattern
Groovy template / inherited views
Drop-in support for dependencies as JAR files
Development was never hindered by it. No one had to learn anything new besides where to put the controllers / models / views.