Okay, so I'm trying to retrieve 130 XML files from a feed and then insert values into a database. The problem is that it crashes at around 40-60 entries without giving an error. I timed it, and the script runs for around 13 seconds each time before it stops.
I checked my php.ini settings and they are:
Memory Limit = 128M
Time Limit = 30 seconds
So what is causing this error? When I run the script in Firefox it just displays a white screen.
EDIT - The error I'm getting in Chrome is
"Error 324 (net::ERR_EMPTY_RESPONSE): The server closed the
connection without sending any data"
Have you checked the memory consumption? Also, could you output something to the screen to confirm that it's reading in the files? Add error handling around the read statement to see whether it's failing while parsing the XML.
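Something along these lines (a rough sketch; the $feedUrls variable and the database step are placeholders for whatever your script actually does) would show how far the loop gets, how much memory is in use, and whether a particular document fails to parse:

    <?php
    // Collect XML parse errors instead of letting them surface as warnings.
    libxml_use_internal_errors(true);

    foreach ($feedUrls as $i => $url) {        // $feedUrls is assumed to hold the 130 feed URLs
        echo "Entry $i, memory: " . memory_get_usage(true) . " bytes\n";
        flush();                               // push progress output to the browser as you go

        $raw = file_get_contents($url);
        if ($raw === false) {
            echo "Failed to fetch $url\n";
            continue;
        }

        $xml = simplexml_load_string($raw);
        if ($xml === false) {
            echo "Failed to parse $url:\n";
            foreach (libxml_get_errors() as $err) {
                echo "  " . trim($err->message) . "\n";
            }
            libxml_clear_errors();
            continue;
        }

        // ... insert values from $xml into the database here ...
    }

If the memory figure climbs steadily towards 128M, it's the memory limit; if it dies at roughly 13 seconds regardless, look at the web server's own timeout rather than PHP's.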
I am fetching data from ClearDB MySQL. It takes around 10 minutes to return the result.
But after 230 seconds Azure gives the error: "500 - The request timed out.
The web server failed to respond within the specified time."
I have tried setting max_execution_time to unlimited and changed other config variables in .user.ini.
I have also tried setting it manually in the first line of the PHP script file with set_time_limit(0); and ini_set('max_execution_time', 6000000);.
But no luck.
I don't want to use WebJobs.
Is there any way to resolve the Azure "500 - The request timed out" issue?
Won't work. You'll hit the in-flight request timeout long before the 10-min wait.
Here's a better approach. Call a stored procedure that produces the result and make a second call 10 minutes later to retrieve the data.
1. Call the stored procedure from your code.
2. Return a Location: header in the response.
3. Follow that URL to grab the results: 200 OK means you have them, 417 Expectation Failed means not yet.
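A minimal sketch of that flow, assuming two small endpoints (start.php and result.php), a queue_report stored procedure and a report_results table, all of which are hypothetical names:

    <?php
    // start.php -- kick off the work and tell the client where to poll.
    // The stored procedure should only queue the job (or otherwise return quickly);
    // the heavy query itself runs on the database side.
    $db = new PDO('mysql:host=example.cleardb.com;dbname=mydb', 'user', 'pass');
    $jobId = uniqid('job', true);
    $db->prepare('CALL queue_report(:job)')->execute([':job' => $jobId]);

    header('Location: /result.php?job=' . urlencode($jobId), true, 202);   // "come back later"

The client then follows that Location URL until the data is there:

    <?php
    // result.php -- 200 with the data when it's ready, 417 when it isn't.
    $db = new PDO('mysql:host=example.cleardb.com;dbname=mydb', 'user', 'pass');
    $stmt = $db->prepare('SELECT payload FROM report_results WHERE job_id = ?');
    $stmt->execute([isset($_GET['job']) ? $_GET['job'] : '']);
    $row = $stmt->fetch(PDO::FETCH_ASSOC);

    if ($row === false) {
        http_response_code(417);              // not ready yet, keep polling
        exit;
    }
    header('Content-Type: application/json'); // ready: 200 OK with the result
    echo $row['payload'];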
I am reading 10,000 CSV files, each with 1,000 rows, removing the duplicates, and creating a new file.
For that:
I read each file line by line and store the data in an array.
When I store new data in the array, I check whether it is a duplicate (whether the array already has the data).
Then I recreate the CSV from the array.
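A minimal sketch of that approach (the paths are placeholders; using the row text itself as an array key keeps the duplicate check cheap, and writing rows out as you go avoids rebuilding the whole array at the end):

    <?php
    // Collapse duplicate rows from many CSV files into one output file.
    $seen = array();                               // row text => true
    $out  = fopen('deduplicated.csv', 'w');        // placeholder output file

    foreach (glob('/path/to/csv/*.csv') as $file) {    // placeholder input directory
        $in = fopen($file, 'r');
        while (($line = fgets($in)) !== false) {
            $row = rtrim($line, "\r\n");
            if ($row !== '' && !isset($seen[$row])) {  // skip rows already written
                $seen[$row] = true;
                fwrite($out, $row . "\n");
            }
        }
        fclose($in);
    }
    fclose($out);

Even so, 10,000 files of 1,000 rows means the $seen array can get very large, which is worth keeping in mind alongside the memory_limit setting.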
Additionally I changed following in the php.ini
max_execution_time = 30000
max_input_time = 60000
memory_limit = -1
error_log = error_log
But I am getting the following error, and there is no error log. Is there any other configuration to change in php.ini? Please help me with it.
Internal Server Error
The server encountered an internal error or misconfiguration and was
unable to complete your request.
Please contact the server administrator, webmaster#xxxxxx.com and
inform them of the time the error occurred, and anything you might
have done that may have caused the error.
More information about this error may be available in the server error
log.
Additionally, a 404 Not Found error was encountered while trying to
use an ErrorDocument to handle the request.
Apart from checking the Apache timeout, you should try breaking this script up and running it in batches of, say, 2,000 CSV files.
It is also possible that some data in one of the CSV files is causing this error, which breaking the run into batches of 2,000 files would help you identify.
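A rough sketch of the batching idea (the path, batch size and logging are assumptions): process the files in groups and log each group, so a timeout or a bad file can be narrowed down to a specific batch.

    <?php
    // Process the CSV files in batches so a failure can be traced to a group of files.
    $files   = glob('/path/to/csv/*.csv');     // placeholder path
    $batches = array_chunk($files, 2000);      // 2,000 files per batch, as suggested above

    foreach ($batches as $i => $batch) {
        error_log("Starting batch $i (" . count($batch) . " files)");
        foreach ($batch as $file) {
            // ... read this file and collect / de-duplicate its rows here ...
        }
        error_log("Finished batch $i, memory: " . memory_get_usage(true));
    }

You could also run each batch as a separate request or CLI invocation (for example, passing the batch number as an argument), so no single run has to stay within the server's time limit.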
I have a PHP page that runs a number of loops, queries, database updates, etc. It can take some time to run, and after a minute I get the 500 Internal Server Error. I don't have access to the logs, but my hosting service has forwarded a copy, and it seems that it is a timeout-related error:
mod_fcgid: read data timeout in 60 seconds
I have included:
ini_set('max_execution_time', 9000);
set_time_limit(0);
in the PHP page, but it still causes the 500 error. I can't access any of the config files. Is there any other way I can increase the timeout for this page?
I have also tried putting
set_time_limit(59);
at the start of each pass through the loop. If this is meant to reset the clock, then I can't see why I should have a problem, but the error persists.
NOTE: I am 99% sure that it is not an error in the script itself, as sometimes it goes through and other times it doesn't, with exactly the same data.
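For reference, the pattern described above would look roughly like this; the keep-alive output is an assumption, based on the idea that mod_fcgid's read data timeout is a server-side limit on how long Apache waits without receiving any output from the script, which set_time_limit() on its own does not change:

    <?php
    // Sketch of the per-iteration reset described above, plus trickling output.
    foreach ($records as $record) {            // $records stands in for the page's loops
        set_time_limit(59);                    // reset PHP's own execution clock

        // ... queries and database updates for this record ...

        echo ' ';                              // trickle a byte of output to Apache
        if (ob_get_level() > 0) {
            ob_flush();
        }
        flush();
    }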
I know how to download a file from a server using FTP with PHP.
I have a list of files to download from the FTP server to internal storage.
I use ftp_get() to download the list of files;
the first file, 126 MB in size, downloads successfully to my internal storage.
However, the PHP function then throws a 500 error and dies without continuing.
The error I get:
Internal Server Error
The server encountered an internal error or misconfiguration and was
unable to complete your request.
Please contact the server administrator, webmaster#zzz.com and inform
them of the time the error occurred, and anything you might have done
that may have caused the error.
More information about this error may be available in the server error
log.
Additionally, a 404 Not Found error was encountered while trying to
use an ErrorDocument to handle the request.
Any idea what I should do in order for the function to complete its run successfully?
You need to increase the timeout then. 180 is in seconds, which is 3 minutes. Try setting it to 600, i.e. FTP_TIMEOUT_SEC, 600, or higher, depending on how much more time is needed. You could probably even try FTP_TIMEOUT_SEC, 0, which I think means no time limit.
This was already suggested in a comment on another question similar to this one. Please try it; it should work.
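Roughly like this; the host, credentials and paths are placeholders:

    <?php
    // Raise the FTP timeout before downloading large files.
    $conn = ftp_connect('ftp.example.com', 21, 600);        // 600-second timeout at connect time
    ftp_login($conn, 'user', 'pass');
    ftp_set_option($conn, FTP_TIMEOUT_SEC, 600);            // or raise it later on the live connection
    ftp_pasv($conn, true);                                  // passive mode often helps behind firewalls

    foreach ($filesToDownload as $remote) {                 // your list of remote files
        $local = '/internal/storage/' . basename($remote);  // placeholder target directory
        if (!ftp_get($conn, $local, $remote, FTP_BINARY)) {
            error_log("ftp_get failed for $remote");
        }
    }
    ftp_close($conn);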
Maybe you exceeded the maximum execution time.
Try to increase it:
https://www.php.net/manual/en/function.set-time-limit.php
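For example, at the very top of the script (note that on some hosts these calls are ignored or capped by a server-side limit, so this is only a starting point):

    <?php
    set_time_limit(0);                        // 0 = no PHP execution time limit
    ini_set('max_execution_time', 0);         // the same setting via ini_set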
Did anyone encounter a problem with excel_reader2 where the script gets aborted by a large number of rows, for example over 60k rows in the Excel file? I just get a message in the error log: Aborted. That's all. I have more files on my server and the script takes them one by one, but when I get to the second one the message appears and the script stops. It's PHP 5.4.7, by the way.
I am not sure about the problem, but PHPExcel does not really handle large Excel files well, so that may be the issue. Please try to limit the data by reading only chunks; it may help.
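If you can use PHPExcel for these files, its read filters can load the sheet one chunk of rows at a time instead of all at once. A rough sketch, with the file name, reader type and chunk size as assumptions:

    <?php
    require_once 'PHPExcel/IOFactory.php';

    // Read filter that only accepts rows inside the current chunk.
    class ChunkReadFilter implements PHPExcel_Reader_IReadFilter {
        private $startRow = 0;
        private $endRow   = 0;
        public function setRows($startRow, $chunkSize) {
            $this->startRow = $startRow;
            $this->endRow   = $startRow + $chunkSize;
        }
        public function readCell($column, $row, $worksheetName = '') {
            // Keep the header row plus the rows in the current chunk.
            return ($row == 1) || ($row >= $this->startRow && $row < $this->endRow);
        }
    }

    $inputFile = 'big.xls';                    // placeholder file name
    $chunkSize = 2000;
    $filter    = new ChunkReadFilter();

    for ($startRow = 2; $startRow <= 65000; $startRow += $chunkSize) {
        $reader = PHPExcel_IOFactory::createReader('Excel5');   // 'Excel5' = .xls files
        $filter->setRows($startRow, $chunkSize);
        $reader->setReadFilter($filter);

        $workbook = $reader->load($inputFile);
        // ... process the rows of this chunk from $workbook->getActiveSheet() ...

        $workbook->disconnectWorksheets();     // free memory before the next chunk
        unset($workbook);
    }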