CodeIgniter - Blank screen when trying to retrieve 8500 records - PHP

I am trying to display a table which contains 8500 records. I am using the same controller/model functions that I have used throughout the site, and they all work fine.
On this page, however, I just see a blank screen. Is this a known issue with CodeIgniter? Is there a workaround? I am totally stumped; my only option, I guess, is to split the table into sub-tables?
I can show you my code if needed.
Thanks
Dan

When you get a blank screen, it usually means you've hit a PHP error.
To see what that error is, check the PHP error log. I suspect that you've exceeded the maximum allowed memory limit.
php_value memory_limit 256M
php_value display_errors 1
php_flag log_errors on
php_value error_log /some/path/on/the/box/you/have/access/to.log
The lines above are .htaccess directives you can set that will kick in for your whole app; below are the equivalent hard-coded PHP ways to enable the settings.
To make sure error reporting is turned on and you're displaying errors, you can do:
ini_set('display_errors', 'On');
error_reporting(E_ALL);
To find out where your error log is, make a test script to tell you:
die(ini_get('error_log'));
Make sure the log_errors ini setting is enabled in your php.ini file too.
If you are indeed exceeding the maximum allowed memory limit, you can increase it by doing:
ini_set('memory_limit', '256M'); // 256 megabytes
I recommend updating this in your php.ini file and restarting Apache for the changes to kick in.
If your script deals with large amounts of data and can take a while to run, you might also exceed the max_execution_time ini setting.
To see what it's currently set to, you can do:
die(ini_get('max_execution_time'));
There is a nice PHP helper function, set_time_limit, to set this for you:
set_time_limit(300); // Input in seconds, this is 5 minutes.
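If raising the limits only masks the problem, you can also cut the memory usage itself by streaming the rows instead of loading all 8500 at once. A minimal sketch, assuming CodeIgniter 3's unbuffered_row() and placeholder table/column names:
// In the model: fetch one row at a time so PHP never holds
// the entire result set in memory.
$query = $this->db->query('SELECT * FROM records'); // 'records' is a placeholder
while ($row = $query->unbuffered_row('array')) {
    // render or process a single row, then let it go out of scope
    echo $row['name'] . "\n"; // 'name' is a placeholder column
}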
Hope this helps you get somewhere.
Your best bet is looking at your error log though.
Good luck.

Related

How can I use fineuploader with Amazon S3 in my directory structure? [duplicate]

I'm implementing an AJAX file upload for my PHP application (using CodeIgniter).
I detect if the uploaded POST data is too large (>post_max_size) according to http://andrewcurioso.com/2010/06/detecting-file-size-overflow-in-php/ and try to send an appropriate JSON-encoded error response.
But the corresponding PHP warning included in the output completely destroys my JSON response!
<br />
<b>Warning</b>: POST Content-Length of 105906405 bytes exceeds the limit of 8388608 bytes in <b>Unknown</b> on line <b>0</b><br />
[{"error":"Posted data is too large. 105906405 bytes exceeds the maximum size of 8388608 bytes."}]
I don't want to parse and filter the warning out on the client side; that seems ugly.
And disabling all PHP warnings globally seems inappropriate.
Can I disable specific PHP warnings in the context of a PHP function? Or wrap it inside a valid JSON response?
On a production server: always disable PHP warnings.
On a development machine:
Disable all warnings for that page for the purpose of testing, and re-enable them after testing is complete and the functionality is confirmed as working.
Hint: I prefer to have a globally set variable and, depending on it, enable or disable all PHP and database warning notifications. You could set that in .htaccess, for example, if you use Apache.
I have this in .htaccess:
production: SetEnv DEVELOPMENT Off
development: SetEnv DEVELOPMENT On
And in the php code:
if (strtolower(getenv('DEVELOPMENT')) == 'on') {
    ini_set('display_errors', 1);
    ini_set('error_reporting', E_ALL | E_STRICT);
}
To disable all error output from .htaccess, use:
php_flag display_startup_errors off
php_flag display_errors off
php_flag html_errors off
php_value docref_root 0
php_value docref_ext 0
If you want to get really tricky you could add a check like:
if(error_get_last() !== null)
and set an HTTP header, then check for it from JavaScript, both for JSON output and when loading your page. Then take a look at the error log, or display the error on screen if you prefer. Still, I strongly recommend disabling the error-showing code in the production environment.
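A minimal sketch of that header trick, assuming a made-up X-PHP-Error header name (pick your own):
// Just before sending the JSON body, flag any PHP error via a
// response header so the JavaScript side can check it cheaply.
if (error_get_last() !== null) {
    header('X-PHP-Error: 1'); // hypothetical header name
}
echo json_encode($response); // $response is your normal payload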
For more in-depth talk about this take a look at this article.
I just had the same problem implementing an upload script which needs to communicate back to the calling script. Some PHP warnings were corrupting the output response.
I resorted to PHP's output buffering functions. This is how I got rid of the warning messages:
ob_start(); // start output buffering (we are at the beginning of the script)
[ ... ] // the script actual functionality is unchanged
ob_end_clean(); // this clears any potential unwanted output
exit($my_json_encoded_response); // and this just outputs my intended response
This solved my problem.
You can set a user-defined error handler: http://www.php.net/manual/en/function.set-error-handler.php
You can also wrap the output with the output buffering functions: ob_start, ob_end_flush and others.
However, it is always best to have display_errors turned off on production systems, then catch errors yourself and wrap them in your JSON.
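For the error-handler route, a minimal sketch that turns warnings into a clean JSON error instead of HTML noise (the payload shape is just an example):
// Convert PHP warnings into part of the JSON response.
set_error_handler(function ($errno, $errstr) {
    echo json_encode(array('error' => $errstr));
    exit; // stop before any HTML-formatted warning is printed
}, E_WARNING);
Note that the post_max_size warning shown in the question is raised during request startup, before your script runs, so a handler set in the script cannot intercept that particular one.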
Or you can just change post_max_size in your php.ini file:
post_max_size = 10M ; change this line to whatever fits your needs
But depending on what your users should be uploading, you should also check the size of the files they upload.

PHP: Check for dropped/cropped POST-data

PHP has several limits on POST data (an overall data size limit as well as per-field limits). I currently get no warning when I run into these limits - just missing data.
Is there a way to check whether PHP ran into one of these limits?
You need to change your error reporting level so that it displays warnings, as a warning is output when the POST size limit is reached.
error_reporting(E_ALL);
Setting error_reporting to E_ALL includes E_WARNING; however, you could use just E_WARNING if you didn't want other errors to be reported.
If you want the errors to be displayed, ensure you have display_errors set to On or 1. If you don't, you can set it at runtime with ini_set('display_errors', 1);
Regardless of whether you display errors or not, they will still be logged to your PHP error log file.
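One practical check, based on the fact that PHP discards the entire body when post_max_size is exceeded, so $_POST and $_FILES come back empty even though the request announced a large Content-Length. A sketch:
// If the request claimed a body but PHP kept none of it,
// the POST almost certainly exceeded post_max_size.
if ($_SERVER['REQUEST_METHOD'] === 'POST'
        && empty($_POST) && empty($_FILES)
        && isset($_SERVER['CONTENT_LENGTH'])
        && (int) $_SERVER['CONTENT_LENGTH'] > 0) {
    // handle the oversized request here
}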
You can check or change the value in php.ini or .htaccess:
post_max_size
There are also other related limits in that file:
max_input_nesting_level
memory_limit
upload_max_filesize
max_input_time
max_input_vars
You can control these using .htaccess, like so:
php_value post_max_size 100M
Also check for errors during processing using:
error_reporting(E_ALL);
Your application can check these settings. Use ini_get() for this.
For example:
ini_get('post_max_size'); // returns the raw ini string, e.g. "8M"
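Since ini_get() hands back the shorthand string, a small conversion helper is useful for comparisons. A sketch (ini_shorthand_to_bytes is just an illustrative name):
// Convert a php.ini shorthand value like "8M" or "2G" into bytes.
function ini_shorthand_to_bytes($value) {
    $value = trim($value);
    $unit  = strtolower(substr($value, -1));
    $bytes = (int) $value;
    switch ($unit) {
        case 'g': $bytes *= 1024; // fall through
        case 'm': $bytes *= 1024; // fall through
        case 'k': $bytes *= 1024;
    }
    return $bytes;
}
$limit = ini_shorthand_to_bytes(ini_get('post_max_size')); // e.g. 8388608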
Be aware that this is a development and deployment issue: your PHP script (or even the whole Apache server) may sit behind a reverse proxy, and any number of intermediate agents can impose their own limit on the request and thus truncate it.
So the first step is ensuring that PHP's limit on POST request size is the smallest in the chain of server software that handles the request.
Then, I suggest you read this question (and its answers and comments) to get an idea of what you can do, what the platform gives you, and especially how cumbersome and somewhat unreliable it is to detect such an error.
Anyway, I think you should not be too worried about not giving users feedback about the data entered: after all, everything has a limit (think of database column sizes, disk space and so on), and adequate sizing is more important than issuing an error message. I can't even remember the last time a service complained about request size. If you are especially interested in file upload limits, though, that's quite a different thing, because it's easier to know in the PHP script whether the user hit the given limits.
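For the file upload case specifically, PHP records a per-file error code, so the check is straightforward. A sketch assuming a form field named 'upload':
// UPLOAD_ERR_INI_SIZE means the file exceeded upload_max_filesize.
if (isset($_FILES['upload'])
        && $_FILES['upload']['error'] === UPLOAD_ERR_INI_SIZE) {
    echo 'The uploaded file is larger than upload_max_filesize allows.';
}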

PHP script stops after a long time and no error can be found in the error_log

I'm running a long PHP script which handles large amounts of data.
The problem is that the script suddenly stops; no exception is thrown and nothing shows up in the error_log.
I have set display_errors and error logging to 1 in the .ini config file.
A few more details:
1) The script executes the file_get_contents function many times.
2) The script recurses when file_get_contents fails.
Any help would be appreciated.
It might have hit the max execution time.
set_time_limit(0); // 0 removes the time limit entirely
Error logging configs differ depending on your hosting environment. I'd first verify that you're editing the right php.ini file. Take a look at your phpinfo() output and make sure those params are indeed set, and check the path/file where errors are being logged. Sometimes errors go to the Apache error log; other times they are sent to a dedicated PHP log. Are you able to get any error output if you purposely create a syntax error? You might also consider looking in your syslog to see if there's anything there.
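As a throwaway diagnostic, a quick script to confirm which configuration is actually in effect:
// Confirm which php.ini is loaded and where errors should end up.
echo 'Loaded ini: ' . php_ini_loaded_file() . "\n";
echo 'error_log:  ' . ini_get('error_log') . "\n";
echo 'log_errors: ' . ini_get('log_errors') . "\n";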

PHP Configuration - Memory

In my php.ini file I have
memory=40M
What does this do? (It solves some problems that I have been having.) Note that this is NOT:
memory_limit=128M
I know memory_limit sets the maximum amount of memory a PHP script can use, but what does memory do?
EDIT
I recognize this is not a standard directive, but it fixes my problem. Without it my pages randomly produce 500 errors; with this line in place they go away.
This is where I got the fix from:
http://www.archtopia.com/2010/01/30/wordpress-internal-server-error-500-with-1and1-webhosting/
memory is not a valid php.ini directive. It may be "solving" your problem precisely because it is not recognized: PHP ignores it and falls back to a default value that happens to work. Also note that the megabyte shorthand should be M, not MB.
The proper way to set the value is:
memory_limit = 40M
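You can confirm this behaviour yourself: ini_get() returns false for directives PHP doesn't know about, while memory_limit reports the value that actually took effect. A quick check:
var_dump(ini_get('memory_limit')); // e.g. string(3) "40M"
var_dump(ini_get('memory'));       // bool(false): unknown directive, silently ignored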
I can't find the memory directive on http://php.net/manual/en/ini.core.php. Are you sure it's correct and not a typo?

Frequent "Connection Timeout" errors on a shared server using PHP/MYSQL

I have a Drupal site on a shared web host, and it's getting a lot of connection errors. This is the first time I have seen so many connection timeout errors on a server, so I'm thinking it's something in the configuration settings. The non-Drupal parts of the site are not giving as many connection errors.
Since this hosting provider doesn't give me access to the php.ini file, I put one in my docroot to modify the lines that I thought would be causing this:
memory_limit = 128M
max_execution_time = 259200
set_time_limit = 30000
But it didn't work; there is no improvement in the frequency of the timeout errors. Does anyone have any other ideas about this type of error?
Thanks.
You can control the time limit on a script while it is running. Add a call to set_time_limit near the top of your PHP pages to see if it helps.
Ideally, you need to figure out what your actual limits are as defined by your host. A call to phpinfo() somewhere will let you see all the config settings your server has in place.
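Putting both suggestions together, a minimal sketch for the top of a slow page (the 300-second value is just an example):
// Lift the execution limit for this request only, if the host allows it.
set_time_limit(300); // 5 minutes; some shared hosts disable this function
// Dump the live configuration to see which limits the host enforces.
phpinfo(INFO_CONFIGURATION);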
