PHP page stops loading in the browser, but still processes on the server - php

So I am using a form with a file input to submit an excel file and upload the data into a MySQL database. The script imports everything just fine.
The only problem is that the request seems to time out in the client browser, even though it continues processing on the server. I have already added all the common fixes, i.e.:
ini_set('max_execution_time', 0);
set_time_limit(0);
ini_set('memory_limit', '-1');
ignore_user_abort(true);
I am currently using flush(); and ob_end_flush(); to output a message every 1000 records that get scanned for upload.
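A minimal sketch of that pattern, as described above; the spreadsheet parsing and the INSERT logic are placeholders (parseUploadedSpreadsheet and insertRecord are hypothetical names), not the actual import code:

<?php
ini_set('max_execution_time', 0);
set_time_limit(0);
ini_set('memory_limit', '-1');
ignore_user_abort(true);

// Drop any output buffers so progress messages reach the browser immediately.
while (ob_get_level() > 0) {
    ob_end_flush();
}

// Placeholder: however the uploaded Excel file is actually parsed into rows.
$rows = parseUploadedSpreadsheet($_FILES['file']['tmp_name']);

$count = 0;
foreach ($rows as $row) {
    insertRecord($pdo, $row);   // placeholder for the actual MySQL INSERT
    $count++;
    if ($count % 1000 === 0) {
        echo "Processed $count records...<br>\n";
        flush();                // push the progress message to the client
    }
}
echo "Done: $count records.";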
Unfortunately, after exactly 2 minutes, the page stops loading.
On the flip side, all records still get scanned and uploaded regardless of what the page shows. This leads me to believe that the request continues to process on the web server, but the browser stops receiving data from the server after a certain amount of time. Based on my calculations, the browser stops receiving the response after exactly 2 minutes.
I have tried this on both Internet Explorer and Chrome and I get the same result. I have not made any changes to the php.ini file, for security reasons.
Using an old Zend Server with Apache 2.

Related

PHP Internal Server Error, Long Big Script on shared Godaddy Hosting

I have a JSON file with more than 100,000 members that I am trying to import into a database. I coded the script and it runs fine, and the data does get inserted, but not all of it: after some time I get an internal server error, and only 100 or so members from the JSON file have been inserted.
I have already tried setting the max execution time to unlimited, but still no luck:
error_reporting(E_ALL);
ini_set('display_errors',true);
set_time_limit(0);
ini_set('memory_limit', '-1');
ini_set('max_execution_time', 0);
It only gives a server timeout after some time. If it were possible to run the script even for an hour, that would be great.
I can't run the script on my local WAMP: my internet speed is very slow, and the script also downloads and resizes members' profile images. Plus, if my internet disconnected, that would also interrupt the script.
Is there any way I can bypass the timeout?
I have heard about doing it in a cron job, but I'm not sure whether a cron job has the same time limit.
The JSON file I am trying to import was 122 MB, but I divided it into 34 parts, which took some time.
The file I am trying to run now is 4 MB in size, but I still get an internal server error after a minute or two.
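One point worth noting about the cron idea: when PHP runs from the command line (which is how cron invokes it), max_execution_time defaults to 0 and the web server's gateway timeouts do not apply, so a cron job would not hit the same limit. A hedged sketch, assuming the import logic lives in a hypothetical import_members.php that takes one of the split JSON files as an argument (all paths are placeholders):

# crontab entry: run the import once at 03:00, appending output to a log
0 3 * * * /usr/bin/php /home/user/scripts/import_members.php /home/user/data/members_part01.json >> /home/user/logs/import.log 2>&1

<?php
// import_members.php - run from the CLI/cron, so no browser or gateway timeout is involved.
set_time_limit(0);   // harmless; the CLI already defaults to no execution limit
$jsonFile = isset($argv[1]) ? $argv[1] : null;
if ($jsonFile === null || !is_file($jsonFile)) {
    fwrite(STDERR, "Usage: php import_members.php <file.json>\n");
    exit(1);
}
$members = json_decode(file_get_contents($jsonFile), true);
foreach ($members as $member) {
    // placeholder: insert the member, download and resize the profile image, etc.
}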

PHP MySQL Query; Max Execution Time and Memory Limit don't work on browser output


Script still runs after stopped browser

I made a fetching script that gets data from web pages and saves some of it to a MySQL database.
When I hit stop or close the browser, I can see through phpMyAdmin that it keeps adding records indefinitely.
I have killed the process many times, but I can't figure out what is wrong or how to fix it.
Is it the script's fault or the server's?
EDIT:
I close the browser, I shut down my PC, and the script still runs on the server.
It's not the code or the script.
Try setting ignore_user_abort:
ignore_user_abort(true);
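For context: PHP only notices that the client has gone away when it tries to send output, and ignore_user_abort() controls whether the script is then terminated. A hedged sketch of a fetcher that stops when the browser disconnects, leaving ignore_user_abort at its default of false and emitting a small heartbeat; fetchPage, saveToDatabase and $urls are hypothetical placeholders for the actual scraping logic:

<?php
ignore_user_abort(false);   // default: let PHP stop the script once it detects the client is gone

foreach ($urls as $url) {
    saveToDatabase(fetchPage($url));   // placeholders for the actual fetch/insert work

    // PHP only detects a disconnect while sending output, so emit a tiny heartbeat...
    echo " ";
    flush();

    // ...and bail out explicitly if the client has disconnected.
    if (connection_aborted()) {
        exit;
    }
}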

Downloading large file with cURL in PHP - Page hangs

I have a PHP script that downloads videos from various locations.
The video files can be anywhere from 20 MB to 100 MB+.
I've got PHP currently saving the video file in a directory using CURLOPT_FILE. This is working fine with no problems.
Because of the large files that are being downloaded, I've set the cURL timeout period to 45 minutes to allow the file to download. I have also set set_time_limit(0) so that the PHP page should continue processing after the download has completed. I've also set ini_set("memory_limit", "500M");
When the download completes it should echo "Downloaded" and then update a mysql record stating the file has been downloaded.
What is happening, though, is that the video file is downloaded correctly by cURL and MySQL is updated, BUT "Downloaded" is never displayed in the browser.
Why is this? I've tried to come up with a solution myself, but I cannot work out what the issue here is...
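A minimal sketch of the setup described above; the URL, file path, table and column names, and $pdo/$videoId are placeholders, not the actual application code:

<?php
set_time_limit(0);
ini_set('memory_limit', '500M');

$fp = fopen('/path/to/videos/video.mp4', 'w');    // placeholder destination path
$ch = curl_init('http://example.com/video.mp4');  // placeholder source URL
curl_setopt($ch, CURLOPT_FILE, $fp);              // stream the download straight into the file
curl_setopt($ch, CURLOPT_TIMEOUT, 45 * 60);       // 45-minute cURL timeout
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$ok = curl_exec($ch);
curl_close($ch);
fclose($fp);

if ($ok) {
    echo "Downloaded";
    // placeholder: mark the record as downloaded ($pdo and $videoId come from the app)
    $pdo->prepare('UPDATE videos SET downloaded = 1 WHERE id = ?')->execute(array($videoId));
}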
If you're in a browser environment, the browser will time out after a certain amount of time and stop listening for output from the script, even though the script will continue to run. It varies across browsers, but the number I've seen is 30 seconds.
To overcome this problem, you should send some output (even something meaningless, e.g. echo "<!--empty comment-->";) every so often.
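In the cURL case, one way to emit output while the transfer is still in progress is a progress callback. A hedged sketch, assuming $ch is the cURL handle from the setup above and PHP 5.5+ (which is when the callback gained the leading handle argument):

<?php
$lastPing = time();
curl_setopt($ch, CURLOPT_NOPROGRESS, false);   // required for the progress callback to fire
curl_setopt($ch, CURLOPT_PROGRESSFUNCTION, function ($ch, $dlTotal, $dlNow, $ulTotal, $ulNow) use (&$lastPing) {
    // Emit a keep-alive comment roughly every 5 seconds so the browser keeps receiving data.
    if (time() - $lastPing >= 5) {
        echo "<!-- still downloading: $dlNow bytes -->\n";
        flush();
        $lastPing = time();
    }
    return 0;   // returning a non-zero value would abort the transfer
});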
I recently had a similar problem, and I dealt with it by not outputting any content from the script, and instead polling from the browser every so often using AJAX to see if it was done.
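A hedged sketch of that polling approach: the long-running script writes its progress to a small status file, and a separate lightweight endpoint (polled from the browser via AJAX) just reads it back. The file names and paths here are hypothetical:

<?php
// In the long-running download script: record progress somewhere cheap to read.
file_put_contents('/tmp/download_status.json', json_encode(array('done' => false)));
// ... perform the download ...
file_put_contents('/tmp/download_status.json', json_encode(array('done' => true)));

<?php
// status.php (hypothetical) - the endpoint the browser polls every few seconds.
header('Content-Type: application/json');
$status = @file_get_contents('/tmp/download_status.json');
echo $status !== false ? $status : json_encode(array('done' => false));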
Or, don't use the browser environment (as it's not ideally suited for this problem), and instead use a command line prompt, as it does not have (to my knowledge) these timeouts.

Why would file uploads stop on busy LAMP server?

We have a LAMP server that is fairly busy; CPU usage hovers around 90% at peak times. We are having an intermittent problem where file uploads from web forms fail. It only seems to happen with larger files (over 1 MB), and it seems to affect some users more than others. We've gone through and checked the obvious stuff like the PHP ini max upload sizes, execution times, and folder write permissions. Also, the site worked for a year without trouble like this before it suddenly began (we don't think any of our application PHP would cause this).
We've watched what happens in Charles Proxy, and it shows the upload happening (the sent file size increases regularly) until it just stops sending data. The browser shows its spinning progress indicator as if it were proceeding, but you can wait 20 minutes and nothing will happen, or it reports a timeout.
Does anyone know why an upload might fail intermittently? My only guess is that it has to do with server traffic, like Apache closing the connection prematurely.
If the load on the server is high, then your scripts may be timing out while trying to process the upload. Can you give any more specifics on your problem? PHP scripts have a 30-second execution timeout by default, meaning that if the script has not completed, i.e. finished handling the uploaded file, within that time frame, then it will time out and the upload will fail.
It seems like the site worked for over a year and traffic has now grown to the point where it strains the server, so it is quite possible that scripts are timing out under the increased load.
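For reference, these are the php.ini directives that typically cap large uploads; the values below are illustrative, not recommendations:

; php.ini - settings that commonly limit large uploads
upload_max_filesize = 20M   ; per-file size limit
post_max_size = 25M         ; total POST body; must be >= upload_max_filesize
max_input_time = 300        ; seconds PHP will spend receiving the request body
max_execution_time = 300    ; seconds the script itself may run (default is 30)
memory_limit = 256M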
