PHP file throwing 404 in the middle of executing (after ~45 seconds)

I have a script in a directory that has no relevant execution time limits (I regularly run hour-long scripts from it). The file is called from an HTML form on a CodeIgniter view; it uploads a CSV file, reads it, and updates database entries with the information it contains.
With a smaller CSV file (1-1,000 entries) it works fine. Anything larger processes about 1,000 entries (including echoing debug text onto the screen) and THEN throws a 404 error.
I've since tried to circumvent the error by having the POST page make an AJAX call instead, but it does exactly the same thing.
"NetworkError: 404 Not Found - xxxx/xxxx/xxxxx.php"
I have confirmed that the script runs prior to the 404 by checking the changes in the database and seeing the echoed debug output on the screen.
Code that is executing when it throws the 404...
(In short: for each row of the CSV file, set the value from the first column on the record whose id is in the second column.)
if (file_exists($targetdir))
{
    $file = fopen($targetdir, 'r');
    $count = 0;
    while (!feof($file))
    {
        $x = fgetcsv($file);
        if (is_numeric($x[1]) && $x[1] > 9999 && $x[0] != '-')
        {
            $count++;
            // echo "UPDATE xxxx SET xxxxx ='".$x[0]."' WHERE id=".$x[1];
            $db->query("UPDATE xxxx SET xxxxx ='".$x[0]."' WHERE id=".$x[1]);
            // echo $x[1].' - '.$x[0]."<br/>";
        }
    }
    fclose($file);
}
Edit
It looks like jQuery is reporting the error. I googled around and saw people say it's a server setting (but not which one), so for the heck of it I set the max execution time to 300s, with the same results.
xxxxx.php 404 xhr jquery-1.12.4.min.js:4 1.6 KB 40.07
[Image: the 404 rendered after the debug text is echoed]

I was able to find the setting that was being tripped.
Rather than the standard PHP max execution time, the timeout being hit was in the FastCGI settings, either from the AJAX post or from the read operation.
Either way, I increased FcgidIOTimeout in my Apache settings on the server and was able to execute the full script.
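For anyone hitting the same limit, the change was along these lines in the Apache/mod_fcgid configuration; the 600 is just an example value, so pick something that comfortably exceeds your longest request (the default FcgidIOTimeout is 40 seconds, which lines up with the ~45 second cutoff):
<IfModule mod_fcgid.c>
    FcgidIOTimeout 600
</IfModule>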

Related

PHP 500 error only occurs on the server (production side) but runs fine locally

I have a PHP application; when I run it on my local machine, it runs perfectly, with no errors.
But when I run it on our remote production server, it spits out a 500 error.
The logic of the application is:
The user declares a folder, and the app loops through the folders inside it.
The user also declares the number of files the app will generate/create per folder,
and the character count for the description of each file created.
Basically:
for ($x = 0; $x < $foldercount; $x++) {
    for ($y = 0; $y < $filecount; $y++) {
        // creates file
        for ($z = 0; $z < $charcount; $z++) {
            // process here for details in file
        }
    }
}
Basically, the 500 error occurs during file creation, and it happens somewhere in the middle: say I ask for 10 files to be created per folder, the error might happen on the 4th, 5th or 6th iteration.
Thanks for your answers.
Usually this happens when there is no try/catch around the failing code.
for ($x = 0; $x < $foldercount; $x++) {
    for ($y = 0; $y < $filecount; $y++) {
        // creates file
        for ($z = 0; $z < $charcount; $z++) {
            // process here for details in file
        }
    }
}
Echo each variable ($foldercount, $filecount and $charcount) before using it in the for loop. One more thing: if there is any database SELECT query error, it will occur again, so make sure there is no DB error.
If you don't know when it occurs, try running the loop from 0 to 3, 0 to 4, 0 to 5, 0 to 6, etc.
That will give you the specific position where you are getting this error.
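A rough sketch of that kind of instrumentation (create_file() here is only a placeholder for whatever the real file-creation code is):
echo "folders: $foldercount, files: $filecount, chars: $charcount<br/>";

for ($x = 0; $x < $foldercount; $x++) {
    for ($y = 0; $y < $filecount; $y++) {
        try {
            echo "creating folder $x, file $y<br/>"; // shows exactly where it dies
            create_file($x, $y, $charcount);         // placeholder for the real file-creation code
        } catch (Exception $e) {
            echo 'failed at folder '.$x.', file '.$y.': '.$e->getMessage();
            break 2;
        }
    }
}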

PHP hits a limit and stops running when run from a browser, but works from cron

This is my code:
ini_set('display_errors', 1);
ini_set('display_startup_errors', 1);
error_reporting(E_ALL);
$fullcontent = file_get_contents("https://website.co.uk/feeds/google-feed-all-options.php");
file_put_contents("direct/google-feed.xml", $fullcontent);
In my browser it times out, however.
In the file google-feed-all-options.php I have this:
ignore_user_abort(true);
set_time_limit(0);
The code in https://website.co.uk/feeds/google-feed-all-options.php outputs to the file google-feed.xml. If I put it in a crontab it works correctly and the file is complete; if I go to the URL https://website.co.uk/feeds/google-feed-all-options.php in my browser, the data gets cut off and only reaches about 20% of the list of items.
That means that if I echo out $fullcontent onto the page, the content doesn't get through the whole list. So the code does work; the problem only appears when I access it through the browser instead of running it internally on the server.
I have set ignore_user_abort, which keeps the script running even when the browser times out as it is doing, so the file does complete - it just doesn't get output. Ideally, though, I need the output to come back to the browser.
This is not a duplicate: as mentioned, set_time_limit is unlimited, the php.ini limit is set to 600, and the script barely reaches 20 seconds before being cut off, which matches no PHP default even if the other two settings weren't taking effect.
Any ideas? I'm at a loss.

PHP: Detect and show "Server overloaded" message

I have noticed a few websites, such as hypem.com, show a "You didn't get served" error message when the site is busy, rather than just letting people wait, time out or refresh and aggravate what is probably a server load issue.
We are too loaded to process your request. Please click "back" in your
browser and try what you were doing again.
How is this achieved before the server becomes overloaded? It sounds like a really neat way to manage user expectations if a site happens to get overloaded, whilst also giving the site time to recover.
Another option is this:
$load = sys_getloadavg();
if ($load[0] > 80) {
    header('HTTP/1.1 503 Too busy, try again later');
    die('Server too busy. Please try again later.');
}
I got it from PHP's site, http://php.net/sys_getloadavg, although I'm not sure what the values returned by sys_getloadavg represent.
You could simply create a 500.html file and have your webserver use that whenever a 50x error is thrown.
I.e. in your apache config:
ErrorDocument 500 /errors/500.html
Or use a PHP shutdown function to check whether the request timeout (which defaults to 30s) has been reached and, if so, redirect to or render something static (so that rendering the error itself cannot cause problems).
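A minimal sketch of the shutdown-function approach, assuming the timeout surfaces as PHP's stock "Maximum execution time ... exceeded" fatal error and that the static page lives at /errors/500.html as above:
register_shutdown_function(function () {
    $err = error_get_last();
    if ($err !== null && strpos($err['message'], 'Maximum execution time') !== false) {
        // keep this branch as simple as possible so rendering the error cannot itself fail
        readfile($_SERVER['DOCUMENT_ROOT'] . '/errors/500.html');
    }
});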
Note that most sites where you'll see a "This site is taking too long to respond" message are effectively generating that message with javascript.
This may be to do with the database connection timing out, but that assumes that your server has a bigger DB load than CPU load when times get tough. If this is the case, you can make your DB connector show the message if no connection happens for 1 second.
You could also use a quick query to the logs table to find out how many hits/second there are and automatically not respond to any more after a certain point in order to preserve QOS for the others. In this case, you would have to set that level manually, based on server logs. An alternative method can be seen here in the Drupal throttle module.
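A rough sketch of that hits-per-second check, assuming a MySQL logs table with a created_at timestamp column and a PDO connection in $pdo (all of these are placeholder names):
$stmt = $pdo->query("SELECT COUNT(*) FROM logs WHERE created_at >= NOW() - INTERVAL 1 SECOND");
if ((int) $stmt->fetchColumn() > 200) { // threshold set manually, based on your server logs
    header('HTTP/1.1 503 Service Unavailable');
    die('Server too busy. Please try again later.');
}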
Another alternative would be to use the Apache status page to get information on how many child processes are free and to throttle if there are none left, as per #giltotherescue's answer to this question.
You can restrict the maximum number of connections in the Apache configuration too...
Refer to:
http://httpd.apache.org/docs/2.2/mod/mpm_common.html#maxclients
http://www.howtoforge.com/configuring_apache_for_maximum_performance
This is not a strictly PHP solution, but you could do what Twitter does, i.e.:
serve a mostly static HTML and JavaScript app from a CDN or another server of yours;
make the calls to the heavy server-side functions/APIs (PHP in your case) via AJAX from one of your static JS files;
that way you can set a timeout on your AJAX calls and show a notice like "Seems like loading tweets may take longer than expected".
You can use the PHP tick function to detect when the server hasn't responded for a specified amount of time, then display an error message. Basic usage:
<?php
$connection = false;

function checkConnection($connectionWaitingTime = 3)
{
    // check connection & time
    global $time, $connection;
    if (($t = (time() - $time)) >= $connectionWaitingTime && !$connection) {
        echo "<p> Server not responding for <strong>$t</strong> seconds !! </p>";
        die("Connection aborted");
    }
}

register_tick_function("checkConnection");
$time = time();

declare (ticks=1)
{
    require 'yourapp.php'; // load your main app logic
    $connection = true;
}
The while(true) is just there to simulate a loaded server.
To implement the script in your site, remove the while statement and add your page logic, e.g. a dispatch event or front controller action.
The $connectionWaitingTime in the checkConnection function is set to time out after 3 seconds, but you can change that to whatever you want.

PHP script keeps working when it should fail

I know people usually complain about scripts not working, but here is a case where a script keeps working even when I want it to stop.
I have a CSV parser that analyzes lines and inserts entries into a DB table. I am using PDO and Zend Framework for the project. The code works fine... too fine, in fact.
public function save()
{
    $memory_limit = ini_get('memory_limit');
    ini_set('memory_limit', '512M');

    $sql = "
        INSERT INTO my_table (
            date_start,
            timeframe,
            type,
            country_to,
            country_from,
            code,
            weight,
            value
        ) VALUES (?,?,?,?,?,?,?,?)
        ON DUPLICATE KEY UPDATE
            weight = VALUES(weight),
            value = VALUES(value)
    ";

    if ($this->test_mode) {
        echo $sql;
        return;
    }

    $stmt = new Zend_Db_Statement_Pdo($this->_db, $sql);

    foreach ($this->parsed_data as $entry) {
        $stmt->execute(array_values($entry));
        $affected_rows = $stmt->rowCount();
        if ($affected_rows) {
            $this->_success = true;
        }
    }

    unset($this->parsed_data, $stmt, $sql);
    ini_set('memory_limit', $memory_limit);
}
The script takes several seconds to complete as I am parsing a big file. The problem appears when I try to stop the script, with ESC or even by closing the page: the script does not stop until it has finished inserting all entries. Not even an Apache reload fixes this; a full restart probably would.
I think this is not normal behaviour, and maybe I am doing something wrong, so I am asking for suggestions.
Thanks.
UPDATE
ignore_user_abort is off (the default behaviour), so a user abort should be taken into account.
I'm pretty sure that's standard PHP behaviour - just because the browser goes away doesn't mean PHP will stop processing the script. (Although restarting Apache, etc. will achieve this goal.)
To change this behaviour, you can use ignore_user_abort.
That said, "PHP will not detect that the user has aborted the connection until an attempt is made to send information to the client", which I suspect may be the issue you're experiencing.
See the above link and the PHP runtime configuration information for more info.
It is not wrong. Your attempts don't work because:
Escape: it is totally unrelated to the workings of a page; most browsers don't actually react to it.
Closing (or refreshing) the page: again, not related. The SERVER is doing something, and PHP will NOT stop when the client side stops; the server can't actually know whether the client closed or refreshed the page.
Apache reload: won't kill the forked PHP process.
A restart WOULD do it, as this kills the PHP processes, although it is kind of troublesome.
The way to handle this (if the long execution is undesirable) is to set an execution time limit with PHP's set_time_limit() function, or to make the parsing more efficient (if it is not already).
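If you actually want the script to stop when the browser goes away, the usual trick (given that PHP only notices the abort when it tries to send output) is to emit something and check connection_aborted() on each iteration. A rough sketch adapted from the save() loop in the question:
foreach ($this->parsed_data as $entry) {
    $stmt->execute(array_values($entry));

    echo ' ';  // PHP only detects the abort while attempting to send output
    flush();

    if (connection_aborted()) {
        break; // stop inserting once the client has disconnected
    }
}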

PHP Memory differing server to server

I have a hefty PHP script.
So much so that I have had to do
ini_set('memory_limit', '3000M');
set_time_limit (0);
It runs fine on one server, but on another I get: Out of memory (allocated 1653342208) (tried to allocate 71 bytes) in /home/writeabo/public_html/propturk/feedgenerator/simple_html_dom.php on line 848
Both are on the same package from the same host, but different servers.
Above problem solved; new problem below for the bounty.
Update: The script is so big because it crawls a site and parses data from 252 pages, including over 60,000 images, of which it makes two copies. I have since broken it down into parts.
I have another problem now, though, when I am writing the images from the outside site to my server like this:
try {
    $imgcont = file_get_contents($va); // $va is an img src from an array of thousands of srcs
    $h = fopen($writeTo, 'w');
    fwrite($h, $imgcont);
    fclose($h);
} catch (Exception $e) {
    $error .= (!isset($error)) ? "error with <img src='" . $va . "' />" : "<br/>And <img src='" . $va . "' />";
}
All of a sudden it goes to a 500 Internal Server Error page and I have to run it again, at which point it works, because files are only copied if they don't already exist. Is there any way I can catch the 500 response code and send the request to the URL again, as this is all meant to be an automated process?
If this is memory related, I would personally use copy() rather than file_get_contents(). It supports the file wrappers the same way, and I don't see any advantage in loading the whole file in memory just to write it back on the filesystem.
Otherwise, your error_log might give you more information as to why the 500 happens.
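In other words, something along these lines (same $va and $writeTo as in the question; copy() honours allow_url_fopen just like file_get_contents()):
if (!copy($va, $writeTo)) {
    $error .= (!isset($error)) ? "error with <img src='" . $va . "' />" : "<br/>And <img src='" . $va . "' />";
}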
There are three parties involved here:
Remote - The server(s) that contain the images you're after
Server - The computer that is running your php script
Client - Your home computer if you are running the script from a web browser, or the same computer as the server if you are running it from Cron.
Is the 500 error you are seeing generated by 'Remote' and seen by 'Server' (i.e. the images are temporarily unavailable), or is it generated by 'Server' and seen by 'Client' (i.e. there is a problem with your script)?
If it is being generated by 'Remote', then see Ali's answer for how to retry.
If it is being generated by your script on 'Server', then you need to identify exactly what the error is - the php error logs should give you more information. I can think of two likely causes:
Reaching PHP's time limit. PHP will only spend a certain amount of time working before returning a 500 error. You can set this to a higher value, or regularly re-set the timer with a call to set_time_limit(), but that won't work if your server is configured in safe mode.
Reaching PHP's memory limit. You seem to have encountered this already, but it's worth making sure your script still isn't eating lots of memory. Consider outputting debug data (possibly only if you set $config['debug_mode'] = true or something). I'd suggest:
try {
    echo 'Getting '.$va.'...';
    $imgcont = file_get_contents($va); // $va is an img src from an array of thousands of srcs
    $h = fopen($writeTo, 'w');
    fwrite($h, $imgcont);
    fclose($h);
    echo 'saved. Memory usage: '.(memory_get_usage() / (1024 * 1024)).' <br />';
    unset($imgcont);
} catch (Exception $e) {
    $error .= (!isset($error)) ? "error with <img src='" . $va . "' />" : "<br/>And <img src='" . $va . "' />";
}
I've also added a line to remove the image from memory, in case PHP isn't doing this correctly itself (in theory that line shouldn't be necessary).
You can avoid both problems by making your script process fewer images at a time and calling it regularly - either using Cron on the server (the ideal solution, although not all shared webhosts allow this), or some software on your desktop computer. If you do this, make sure you consider what will happen if there are two copies of the script running at the same time - will they both fetch the same image at the same time?
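A rough sketch of that batching, plus a simple lock file to prevent two copies running at once (the batch size, file paths and local_path_for() helper are all placeholders):
$lock = fopen('/tmp/image-fetch.lock', 'c');
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit('Another copy is already running');
}

$batch = array_slice($pending_images, 0, 100); // $pending_images holds the srcs still to fetch
foreach ($batch as $va) {
    copy($va, local_path_for($va));            // local_path_for() is a placeholder
}

flock($lock, LOCK_UN);
fclose($lock);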
So it sounds like you're running this process via a web browser. I'm guessing that you may be getting the 500 error from Apache timing out after a certain period, or the process dying, or something funky. I would suggest one of the following:
A) Move the image downloading to a background process: run the crawl script in the browser, have it write the URLs of the images to be downloaded to the DB, and have another script fire up via cron and fetch all the images. You could also have this script work in batches of 100 or so at a time to keep memory consumption down.
B) Call the script directly from the command line (this is really the preferred method for something like this anyway, and you should probably still separate the image fetching into another script).
C) If the command line is not an option for some reason, have your browser-loaded script touch a file, and have a cron job run every minute looking for that file. When it exists, the cron job fires up your script; you can have the output written to a file to check later, or have an email sent when it's completed.
Is there any way I can catch the 500 response code and send the request to the URL again, as this is all meant to be an automated process?
Here's the simple version of how I would do it:
function getImage($va, $writeTo, $retries = 3)
{
    while ($retries > 0) {
        if ($imgcont = file_get_contents($va)) {
            file_put_contents($writeTo, $imgcont);
            return true;
        }
        $retries--;
    }
    return false;
}
This doesn't create the file unless we successfully get our image file, and it will retry three times by default. You will of course need to add any required exception handling, error checking, etc.
I would definitely stop using file_get_contents() and write the files in chunks, like this:
$read  = fopen($url, 'rb');
$write = fopen($local, 'wb');
$chunk = 8096;
while (!feof($read)) {
    fwrite($write, fread($read, $chunk));
}
fclose($read);
fclose($write);
This will be nicer to your server, and should hopefully solve your 500 problems. As for "catching" a 500 error, this is simply not possible. It is an irretrievable error thrown by your script and written to the client by the web server.
I'm with Swish, this is not really the kind of task that PHP is intended for - you'd be much better off using some sort of server-side scripting.
Is there any way I can catch the 500 response code and send the request to the URL again?
Have you considered using another library? Fetching files from an external server seems to me more like a job for curl or FTP than file_get_contents, etc. If the error is external and you're using curl, you can detect the 500 return code and handle it appropriately without crashing. If not, then maybe you should split your program into two files - one that fetches a single file/image, and another that uses curl to repeatedly call the first one. Unless the 500 error means that all PHP execution crashes, you would be able to detect the failure and handle it.
Something like this pseudocode:
file1.php:
foreach (list_of_files as filename) {
    do {
        x = call_curl('file2.php', filename);
    } while (x == 500);
}
file2.php:
filename = $_GET['filename'];
results = use_curl_to_get_page(filename);
echo results;
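With curl, the retry-on-500 part looks roughly like this (a sketch, not the poster's code; $va and $writeTo as in the question, with a retry cap added so a persistent 500 can't loop forever):
$ch = curl_init($va);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 30);

$tries = 0;
do {
    $imgcont = curl_exec($ch);
    $status  = curl_getinfo($ch, CURLINFO_HTTP_CODE);
} while ($status == 500 && ++$tries < 5); // retry on 500, but give up eventually

curl_close($ch);

if ($status == 200 && $imgcont !== false) {
    file_put_contents($writeTo, $imgcont);
}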
Thanks for all your input. I had separated everything by the time I wrote this question, so the crawler fired the image grabber, etc.
I took on board the solution to split the number of images, and that also helped.
I also added a try/catch around the file read.
This was only being called from the browser during testing, but now that it is all up and running it is going to be a cron job.
Thanks Swish and Benubird for your particularly detailed and educational answers. Unfortunately I had no cooperation with the developers on the backend where the images are coming from (long and complicated story).
Anyway, all good now, so thanks. (Swish, how do you call a script from the command line? My knowledge of this area is severely lacking.)
