PHP server timeout on script

I have a function in a WordPress plugin I'm developing that takes a lot of time.
It connects to TMDb (the movie database), retrieves every movie by id one at a time (from 0 to 8000), and builds an XML document that is saved on the local server.
Of course this takes a long time, and PHP responds with "504 Gateway Time-out: The server didn't respond in time."
What can I do? Any suggestions?

Assuming this is a one-time run that's bombing out on you, you can call set_time_limit(0) and allow it to execute.
<?php
set_time_limit(0); // impose no limit
?>
However, I would make sure this is not in production and that it only runs when you intend it to; otherwise it will place (and keep placing) a large load on the server.
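For a job this size it can also help to run the import from the command line (e.g. php import.php) rather than through the web server, so the gateway's 504 never comes into play, and to write the XML out incrementally. Below is a rough sketch, assuming one JSON document per movie id; the endpoint URL and response fields are placeholders, not the real TMDb API.
<?php
set_time_limit(0);                      // no PHP execution limit for this run
ini_set('memory_limit', '256M');

$xml = new XMLWriter();
$xml->openUri(__DIR__ . '/movies.xml'); // stream the XML straight to disk
$xml->startDocument('1.0', 'UTF-8');
$xml->startElement('movies');

for ($id = 0; $id <= 8000; $id++) {
    $json = @file_get_contents("https://api.example.com/movie/{$id}"); // placeholder URL
    if ($json === false) {
        continue;                       // skip ids that don't exist
    }
    $movie = json_decode($json, true);

    $xml->startElement('movie');
    $xml->writeAttribute('id', (string) $id);
    $xml->writeElement('title', isset($movie['title']) ? $movie['title'] : '');
    $xml->endElement();

    if ($id % 100 === 0) {
        $xml->flush();                  // push the buffered XML to disk periodically
    }
}

$xml->endElement();
$xml->endDocument();
$xml->flush();
?>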

Try setting:
set_time_limit(0);
at the top of the script. But I think the real problem is on the server side: the read simply takes too long. Try reading in a threaded/parallel mode instead of one long sequential pass.

I don't think this is related to the script timeout.
A 504 Gateway Timeout is caused by slow communication between back-end machines, possibly including the web server itself, rather than by the PHP script's own execution limit.
Fix:
Either use proxies or increase your cache size limit (search for "cache" in your php.ini and experiment with it).

Related

How to process multiple parallel requests from one client to one PHP script

I have a webpage that when users go to it, multiple (10-20) Ajax requests are instantly made to a single PHP script, which depending on the parameters in the request, returns a different report with highly aggregated data.
The problem is that a lot of the reports require heavy SQL calls to get the necessary data, and in some cases, a report can take several seconds to load.
As a result, because one client is sending multiple requests to the same PHP script, you end up watching the reports load on the page slowly, one at a time. In other words, the reports are not generated in parallel, which causes the page to take a while to fully load.
Is there any way to get around this in PHP and make it possible for all the requests from a single client to a single PHP script to be processed in parallel so that the page and all its reports can be loaded faster?
Thank you.
As far as I know, it is possible to do multi-threading in PHP.
Have a look at the pthreads extension.
What you could do is run the report-generation part of the script in parallel. Each report is then generated in a thread of its own, and the results come back much sooner. Also cap the number of concurrent threads at 10 or fewer so it doesn't become a resource hog.
Here is a basic tutorial to get you started with pthreads.
And a few more examples that could be of help (notably the SQLWorker example, in your case).
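To give an idea of the shape this takes, here is a minimal sketch using plain Thread objects (pthreads requires a thread-safe (ZTS) PHP build, so it is rarely available on shared hosting). ReportTask and the stored procedure it calls are illustrative names, not part of pthreads or of any real schema.
<?php
class ReportTask extends Thread
{
    public $result;
    private $reportName;

    public function __construct($reportName)
    {
        $this->reportName = $reportName;
    }

    public function run()
    {
        // Each thread needs its own DB connection; connections cannot be shared across threads.
        $db  = new mysqli('localhost', 'user', 'pass', 'reports');
        $res = $db->query("CALL build_report('{$this->reportName}')"); // hypothetical procedure
        $rows = array();
        if ($res) {
            while ($row = $res->fetch_assoc()) {
                $rows[] = $row;
            }
        }
        $this->result = json_encode($rows);
        $db->close();
    }
}

$tasks = array();
foreach (array('sales', 'traffic', 'conversions') as $name) {
    $task = new ReportTask($name);
    $task->start();          // runs ReportTask::run() in its own thread
    $tasks[] = $task;
}

foreach ($tasks as $task) {
    $task->join();           // wait for each thread to finish
    echo $task->result, PHP_EOL;
}
?>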
Server setup
This is more of a server configuration issue and depends on how PHP is installed on your system: if you use php-fpm, you have to increase the pm.max_children option. If you use PHP via (F)CGI, you have to configure the web server itself to spawn more children.
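For php-fpm that usually means editing the pool configuration (commonly something like /etc/php/7.x/fpm/pool.d/www.conf, though the path varies by distro) and restarting the service; the values below are purely illustrative, not recommendations:
pm = dynamic
pm.max_children = 20        ; upper bound on concurrent PHP worker processes
pm.start_servers = 5
pm.min_spare_servers = 3
pm.max_spare_servers = 8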
Database
You also have to make sure that your database server allows that many concurrent processes to run. It won’t do any good if you have enough PHP processes running but half of them have to wait for the database to notice them.
In MySQL, for example, the setting for that is max_connections.
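You can check the current limit from PHP itself and then raise it in my.cnf if needed; a small sketch (the credentials are placeholders):
<?php
$db  = new mysqli('localhost', 'user', 'pass');
$row = $db->query("SHOW VARIABLES LIKE 'max_connections'")->fetch_assoc();
echo $row['Variable_name'], ' = ', $row['Value'], PHP_EOL;   // e.g. max_connections = 151
$db->close();
?>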
Browser limitations
Another problem you’re facing is that browsers won’t make 10-20 parallel requests to the same host. It depends on the browser, but to my knowledge modern browsers will only open 2-6 connections to the same host (domain) simultaneously. Any further requests are simply queued, regardless of server configuration.
Alternatives
If you use MySQL, you could try to merge all your calls into one request and use parallel SQL queries using mysqli::poll().
If that’s not possible you could try calling child processes or forking within your PHP script.
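A sketch of the mysqli::poll() route mentioned above, assuming mysqlnd is available; the queries and credentials are placeholders:
<?php
$queries = array(
    'SELECT COUNT(*) AS c FROM orders',
    'SELECT COUNT(*) AS c FROM visits',
);

$links = array();
foreach ($queries as $sql) {
    $link = new mysqli('localhost', 'user', 'pass', 'reports');
    $link->query($sql, MYSQLI_ASYNC);          // returns immediately, query runs server-side
    $links[] = $link;
}

$pending = $links;
while ($pending) {
    $read = $error = $reject = $pending;
    if (mysqli::poll($read, $error, $reject, 1) < 1) {
        continue;                              // nothing ready yet, poll again
    }
    foreach ($read as $link) {
        if ($result = $link->reap_async_query()) {
            print_r($result->fetch_assoc());   // collect this report's data
            $result->free();
        }
        $key = array_search($link, $pending, true);
        unset($pending[$key]);
    }
}
?>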
Of course PHP can execute multiple requests in parallel if it runs behind a web server like Apache or Nginx. PHP's built-in dev server is single-threaded, but it should only be used for development anyway. If you are using PHP's file-based sessions, however, access to the session is serialized, i.e. only one script can have the session file open at any time. Solution: fetch what you need from the session at the start of the script, then close the session.
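In code that looks roughly like this (build_report() stands in for whatever generates the report; it is not a real function):
<?php
session_start();
$userId = isset($_SESSION['user_id']) ? $_SESSION['user_id'] : null;  // copy what you need
session_write_close();               // releases the session file lock immediately

// The slow part now runs without blocking the client's other AJAX requests.
$report = build_report($userId, isset($_GET['report']) ? $_GET['report'] : 'default');
header('Content-Type: application/json');
echo json_encode($report);
?>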

PHP Script Crashes and Max Execution Time

I currently have a website that has twice been suspended by my hosting provider for "overusing system resources". In each case, there were 300 - 400 crashed copies of one of my PHP scripts left running on the server.
The scripts themselves pull an image from a web camera at home and copy it to the server. They use file locks to ensure only one can write at a time. The scripts are called every 3 seconds by any client viewing the page.
Initially I was confused, as I had understood that a PHP script either completes (returning the result), or crashes (returning the internal server error page). I am, however, informed that "defunct scripts" are a very common occurrence.
Would anyone be able to educate me? I have Googled this to death but I cannot see how a script can end up in a crashed state. Would it not time out when it reaches the max execution time?
My hosting provider is using PHP set up as CGI on a Linux platform. I believe that I have actually identified the problem with my script in that I did not realise that flock was a blocking function (and I am not using the LOCK_NB mask). I am assuming that somehow hundreds of copies of my script end up blocked waiting for a resource to become available and this leads to a crash? Does this sound plausible? I am reluctant to re-enable the site for fear of it being suspended again.
Any insights greatly appreciated.
The approach I would probably recommend is to use tempnam() first and write the contents into the temporary file (which may take a while). Once that's done, you do the file locking, etc. (a rough sketch follows below).
Not sure if this happens when a PUT request is being done; typically PHP will handle file uploads first before handing over the execution to your script.
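Roughly, the tempnam()-first idea looks like this; it also uses flock() with LOCK_NB so a second copy of the script bails out instead of blocking behind the lock. The camera URL and filenames are placeholders.
<?php
$image = file_get_contents('http://camera.example.com/snapshot.jpg');  // placeholder URL
if ($image === false) {
    exit;                                  // camera unreachable, nothing to do
}

$target = __DIR__ . '/latest.jpg';
$tmp    = tempnam(dirname($target), 'cam');
file_put_contents($tmp, $image);           // the slow write happens with no lock held

$lock = fopen(__DIR__ . '/camera.lock', 'c');
if (!$lock || !flock($lock, LOCK_EX | LOCK_NB)) {
    unlink($tmp);                          // another copy is already swapping; give up quietly
    exit;
}
rename($tmp, $target);                     // atomic swap on the same filesystem
flock($lock, LOCK_UN);
fclose($lock);
?>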
The script could crash by hitting either of these two limits:
max_execution_time
memory_limit
especially while working with resources. That is, unless there are other errors in the script; check for notice-level errors too.
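A quick way to see which limit you are actually hitting is to log the effective settings and any errors, since a CGI setup often swallows them; a small sketch (the log path is arbitrary):
<?php
error_reporting(E_ALL);
ini_set('log_errors', '1');
ini_set('error_log', __DIR__ . '/camera-script.log');   // somewhere you can read it

error_log('max_execution_time = ' . ini_get('max_execution_time'));
error_log('memory_limit       = ' . ini_get('memory_limit'));
error_log('peak memory so far = ' . memory_get_peak_usage(true) . ' bytes');
?>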

PHP file_get_contents() Timeout?

I am in the early stages of building a PHP application, part of which involves using file_get_contents() to fetch large files from a remote server and transfer them to a user. Let's say, for example, that the file being fetched is 200 MB.
Will this process time out if downloading to the server takes too long?
If so, is there a way to extend this timeout?
Can this file that is being downloaded also be transferred to a user simultaneously, or does the file have to be saved on the server then manually fetched by the user once download has completed?
I am just trying to make sure that I know what my options and limitations are before I go much further.
Thank you for your time.
Yes, you can use set_time_limit(0) or the max_execution_time directive to remove the time limit imposed by PHP.
You can open a stream of the file, and transfer it to the user seamlessly.
Read about fopen()
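A minimal sketch of that approach, streaming in small chunks so the whole 200 MB never has to fit inside memory_limit; the URL and filename are placeholders:
<?php
set_time_limit(0);                         // the transfer may take a long time

$src = fopen('http://files.example.com/big-archive.zip', 'rb');   // placeholder URL
if ($src === false) {
    header('HTTP/1.1 502 Bad Gateway');
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="big-archive.zip"');

while (!feof($src)) {
    echo fread($src, 8192);                // forward 8 KB at a time
    flush();                               // push it out to the client as we go
}
fclose($src);
?>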
If not a timeout, you may well run into memory issues, depending on how your PHP is configured. You can adjust a lot of these settings manually in code without much difficulty.
http://php.net/manual/en/function.ini-set.php
ini_set('memory_limit', '256M');

Does Apache / PHP block a single file from being processed concurrently by the same client?

I have everything set up and running with KalturaCE & Drupal, and the server (Ubuntu 8.02 + Apache2 + PHP5 + MySQL) is working fine.
The issue I am having is one I can't classify.
When I play two videos together from my site, the second video (the one started later) doesn't start until the first completes its buffering. I watched the HTTP traffic and found that both entries request the file with a URL like this:
/kalturace/p/1/sp/100/flvclipper/entry_id/xxxxxx/flavor/1/version/100000
The first video I played then receives a 302 redirect to a URL like this:
/kalturace/p/1/sp/100/flvclipper/entry_id/xxxxxxx/flavor/1/version/100000/a.flv?novar=0
and starts buffering and playing, while the second video just waits for a response until the first finishes buffering; only then does it receive its 302 redirect and start buffering.
My question is: why can't both videos buffer concurrently? That is obviously what I need.
Any help is much appreciated.
PHP file-based sessions will lock the session file while a request is active. If you intend to use parallel requests like this, you'll have to make sure each script closes the session as soon as possible (ie: after writing out any changes) with session_write_close() to keep the lock time to a minimum.
I found the solution to my problem. It was essentially what Marc suggested in the first place, but there was a problem with the MySQL timeout as well.
Putting session_write_close() in both files, at the right places, solved my problem.
For a complete overview, please see the suggested thread at http://www.kaltura.org/videos-not-playing-simultaneously-0
I posted several suggestions at http://drupal.org/node/1002144 for the same question, this is one of them:
In Apache space, a possible cause might be MaxClients (that is, the capacity of your Apache server to respond to multiple requests). If that Apache setting or the server capacity is low, the server may be dropping your connections into its backlog until the first connection is completed. You can test this using ordinary large files which take some time to download from the server; that will establish whether it's KalturaCE or Apache causing the issue. To see this you'd either need a site which is already sitting at its MaxClients limit (i.e. not an unused dev site), or a very low MaxClients setting (like 1!).
Another suggestion posted at http://drupal.org/node/1002144 for the same question -
It's possible that this behaviour might be due to a mysql query lock. If KalturaCE's SQL was locking a table until the first request is completed (I have no reason to believe this is the case, just floating possible causes), then a second request might hang like this.
I'm not familiar enough with CE to suggest that this is the case, but you could easily debug using mtop on the server during the request to see if this is what's actually happening.

How do I tell how much memory / resources my PHP script is using up?

I am debugging my application here. In a nutshell, the application is dying on my online server, or maybe it's my server that is dying. But I have checked this application on three different servers and all exhibited similar results: the application would run for a while, but once I opened more and more requests I'd suddenly get a network error or the site would fail to load.
I suspect it's my code, so I need to find out how to make it less resource-intensive; in fact, I don't know why it's doing this in the first place. It runs OK on my localhost machine, though.
Or is it because I'm hosting it on a shared host? Should I look for specialised hosting for this application? There are a lot of complex database queries and Ajax requests in my application.
As far as checking how much memory your script is using, you can periodically call memory_get_usage(true) at points in your code to identify which parts of the script are using the memory. memory_get_peak_usage(true) returns the maximum amount of memory that has been used so far.
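For example, something along these lines (run_heavy_queries() and aggregate() stand in for your own code):
<?php
function log_memory($label)
{
    error_log(sprintf(
        '%s: current %.1f MB, peak %.1f MB',
        $label,
        memory_get_usage(true) / 1048576,
        memory_get_peak_usage(true) / 1048576
    ));
}

log_memory('start');
$rows = run_heavy_queries();       // placeholder for your database layer
log_memory('after queries');
$report = aggregate($rows);        // placeholder for the aggregation step
log_memory('after aggregation');
?>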
You say your application runs OK for a while. Is this a single script which is running all this time, or many different page requests / visitors? There is usually a max_execution_time for each script (often default to 30 seconds). This can be changed in code on a per script basis by calling set_time_limit().
There is also an inherent memory_limit as set in php.ini. This could be 64M or lower on a shared host.
"...once I'd be opening more and more requests..." - There is a limit to the number of simultaneous (ajax) requests a client can make with the server. Browsers could be set at 8 or even less (this can be altered in Firefox via about:config). This is to prevent a single client from swamping the server with requests. A server could be configured to ban clients that open too many requests!
A shared host could be restrictive. However, providing the host isn't hosting too many sites then they can be quite powerful servers, giving you access to a lot of power for a short time. Emphasis on short time - it's in the interests of the host to control scripts that consume too many resources on a shared server as other customers would be affected.
Should I look for specialised hosting for hosting an application?
You'll have to be more specific. Most websites these days are 'applications'. If you are doing more than simply serving webpages and are constantly running intensive scripts that run for a period of time then you may need to go for dedicated hosting. Not just for your benefit, but for the benefit of others on the shared server!
The answer is probably the fact that your hosting company has a fairly restrictive php.ini configuration. They could, for example, limit the amount of time that a script can run for, or limit the amount of memory that a script could use.
What does your code attempt to do?
You might consider making use of memory_get_usage and/or memory_get_peak_usage.
