ob_start() doesn't work on large outputs - php

I'm wondering if this is a php.ini config issue. I find that, with large data sets, I can't echo incremented values.
<?php
$count = 0;
ob_start();
// ...SQL query runs here, echoing rows and incrementing $count...
$count++;
$search_results = ob_get_clean();
echo $count;
echo $search_results;
?>
The code itself works, but with large data sets the output doesn't appear at all. I'm wondering if this is a caching issue?

This was not a memory issue; it was related to php.ini settings. In particular, max_execution_time was not high enough. I changed it to:
max_execution_time = 600
Solved the issue.
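If editing php.ini isn't an option, the same limit can be raised from inside the script (a minimal sketch; set_time_limit() restarts the timeout counter with the given number of seconds):

```php
<?php
// Raise the execution-time limit for this request only; equivalent in effect
// to max_execution_time = 600 for this one script.
set_time_limit(600);

ob_start();
// ...the long-running SQL query would echo its rows here...
$count = 0;
$count++;
$search_results = ob_get_clean();

echo $count;          // the incremented value
echo $search_results; // the captured output
```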


Running out of memory on PHP-ML

I am trying to implement a sentiment analysis with PHP-ML. I have a training data set of roughly 15000 entries. I have the code working, however, I have to reduce the data set down to 100 entries for it to work. When I try to run the full data set I get this error:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 917504 bytes) in C:\Users\<username>\Documents\Github\phpml\vendor\php-ai\php-ml\src\Phpml\FeatureExtraction\TokenCountVectorizer.php on line 95
The two files I have are index.php:
<?php
declare(strict_types=1);
namespace PhpmlExercise;
include 'vendor/autoload.php';
include 'SentimentAnalysis.php';
use PhpmlExercise\Classification\SentimentAnalysis;
use Phpml\Dataset\CsvDataset;
use Phpml\Dataset\ArrayDataset;
use Phpml\FeatureExtraction\TokenCountVectorizer;
use Phpml\Tokenization\WordTokenizer;
use Phpml\CrossValidation\StratifiedRandomSplit;
use Phpml\FeatureExtraction\TfIdfTransformer;
use Phpml\Metric\Accuracy;
use Phpml\Classification\SVC;
use Phpml\SupportVectorMachine\Kernel;
$dataset = new CsvDataset('clean_tweets2.csv', 1, true);
$vectorizer = new TokenCountVectorizer(new WordTokenizer());
$tfIdfTransformer = new TfIdfTransformer();
$samples = [];
foreach ($dataset->getSamples() as $sample) {
    $samples[] = $sample[0];
}
$vectorizer->fit($samples);
$vectorizer->transform($samples);
$tfIdfTransformer->fit($samples);
$tfIdfTransformer->transform($samples);
$dataset = new ArrayDataset($samples, $dataset->getTargets());
$randomSplit = new StratifiedRandomSplit($dataset, 0.1);
$trainingSamples = $randomSplit->getTrainSamples();
$trainingLabels = $randomSplit->getTrainLabels();
$testSamples = $randomSplit->getTestSamples();
$testLabels = $randomSplit->getTestLabels();
$classifier = new SentimentAnalysis();
$classifier->train($randomSplit->getTrainSamples(), $randomSplit->getTrainLabels());
$predictedLabels = $classifier->predict($randomSplit->getTestSamples());
echo 'Accuracy: '.Accuracy::score($randomSplit->getTestLabels(), $predictedLabels);
And SentimentAnalysis.php:
<?php
namespace PhpmlExercise\Classification;

use Phpml\Classification\NaiveBayes;

class SentimentAnalysis
{
    protected $classifier;

    public function __construct()
    {
        $this->classifier = new NaiveBayes();
    }

    public function train($samples, $labels)
    {
        $this->classifier->train($samples, $labels);
    }

    public function predict($samples)
    {
        return $this->classifier->predict($samples);
    }
}
I am pretty new to machine learning and php-ml, so I am not really sure how to deduce where the issue is, or whether there is even a way to fix this without a ton of memory. The most I can tell is that the error is triggered by TokenCountVectorizer, called on line 22 of the index file. Does anyone have any idea what may be causing this issue, or has anyone run into this before?
The link to PHP-ML is here: http://php-ml.readthedocs.io/en/latest/
Thank you
This error comes from loading more into memory than what PHP is set up to handle in one process. There are other causes, but these are much less common.
In your case, your PHP instance seems configured to allow a maximum of 128MB of memory to be used. In machine learning, that is not very much and if you use large datasets you will most definitely hit that limit.
To alter the amount of memory you allow PHP to use to 1GB you can edit your php.ini file and set
memory_limit = 1024M
If you don't have access to your php.ini file but still have the permissions to change the setting you can do it at runtime using
<?php
ini_set('memory_limit', '1024M');
Alternatively, if you run Apache you can try to set the memory limit using a .htaccess file directive
php_value memory_limit 1024M
Do note that most shared hosting solutions etc have a hard, and often low, limit on the amount of memory you are allowed to use.
Other things you can do to help are
If you load data from files, look at fgets and SplFileObject::fgets to read files line-by-line instead of reading the complete file into memory at once.
Make sure you are running as up-to-date a version of PHP as possible
Make sure PHP extensions are up to date
Disable PHP extensions you don't use
unset data or large objects that you are done with and don't need in memory anymore. Note that PHP's garbage collector will not necessarily free the memory right away. Instead, by design, it frees it when it decides the CPU cycles are available or when the script is about to run out of memory, whichever occurs first.
You can use something like echo memory_get_usage() / 1024.0 . ' kb' . PHP_EOL; to print memory usage at a given place in your program to try and profile how much memory different parts use.
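For the first point, here is a self-contained sketch of line-by-line reading with SplFileObject (the temp file and its contents are made up for illustration; for the real dataset you would open clean_tweets2.csv instead):

```php
<?php
// Sketch: stream a file line-by-line with SplFileObject instead of loading
// it whole into memory. A small demo file is created first so this runs
// anywhere.
$path = tempnam(sys_get_temp_dir(), 'demo');
file_put_contents($path, "first line\nsecond line\nthird line\n");

$lines = 0;
$file = new SplFileObject($path);
$file->setFlags(SplFileObject::DROP_NEW_LINE | SplFileObject::SKIP_EMPTY);
foreach ($file as $line) {
    $lines++; // process one line at a time; only this line is held in memory
}
unset($file); // release the file handle before deleting the file
unlink($path);

echo $lines . " lines, peak memory: " . memory_get_peak_usage() . " bytes\n";
```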

PHP output buffering issues

For two days I have been trying to stop the buffering on my server. I disabled output_buffering in php.ini and confirmed with phpinfo() that it is disabled.
Under XAMPP (localhost) it works like a charm with the same testing code (below): the output appears without waiting for everything to finish, no buffer, a dream =)
On my server, output_buffering shows no value in phpinfo(), so I think it is disabled, but it still doesn't work: I have to wait until the loop finishes. Any way to make this work like on my XAMPP config? Thanks!
testing code here :
for ($i = 1; $i <= 5000; $i++) {
    echo $i."<br>";
    flush();
    usleep(1000);
}
PS: I tested with PHP 5.6 and PHP 7 on Debian and Ubuntu; my XAMPP naturally runs on Windows 10.
You need to use ob_flush() and flush()
What's the difference you may ask? That's a good question.
Modern browsers don't display anything until the body of the response contains a certain amount of data (about 1024 bytes). The following may look a bit hacky - but like this it works as expected:
<?php
// Browsers typically wait for roughly 1024 bytes of body before rendering
// anything, so pad the start of the response:
echo '<!-- ' . str_repeat('x', 1024) . ' -->';
flush();
for ($i = 1; $i <= 5000; $i++) {
    echo $i."<br>";
    flush();
    usleep(1000);
}
?>

best setting of php apcu for 2000000 tags

In my app, I want to store 2,000,000 (two million) tags in PHP APCu for performance.
The x……x entries are placeholder keys and values; $ar holds about 2000k keys and is about 250MB in size. I want to store the tags for 1 day.
<?php
$ar = array("x……x" => "x……x", …………);
foreach ($ar as $k => $v) {
    apcu_store($k, $v);
}
echo "ok";
?>
This is store.php. Running it produces no error and outputs "ok".
<?php
echo apcu_fetch("x……x");
?>
It returns nothing.
Why?
I set memory_limit to 700M in php.ini.
Please suggest the best APCu settings. Thanks.
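One likely culprit worth checking: APCu keeps its cache in shared memory sized by the apc.shm_size setting (default 32M), which memory_limit does not affect, so 250MB of tags needs something like apc.shm_size = 512M in php.ini (and apc.enable_cli = 1 when testing on the command line). A hedged sketch, using a made-up key, that stores with a TTL and checks the return value to catch silent failures:

```php
<?php
// Sketch (assumes the apcu extension is loaded). apcu_store() accepts a TTL
// in seconds and returns false on failure, which helps diagnose entries that
// silently never make it into the cache.
if (function_exists('apcu_store')) {
    $ok = apcu_store('some_key', 'some_value', 86400); // keep for one day
    if ($ok === false) {
        echo "store failed - check apc.shm_size and that APCu is enabled\n";
    }
    var_dump(apcu_fetch('some_key'));
} else {
    echo "apcu extension not loaded\n";
}
```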

CodeIgniter with PHPABTest

I'm building a CodeIgniter site, and attempting to use php ABTest in the controller.
I saved the phpabtest.php file as phpabtest_helper.php in the "helpers" folder, and loaded it in the controller. It's initialized in the PHP logic as such:
public function view($name)
{
    $this->load->helper('phpab');
    $testpopup = new phpab('test_popup');
    $testpopup->add_variation("popup");

    $type = $this->Types->getTypeBySlug($name);
    $data['type'] = $type;
    $data['items'] = $this->Items->getItemsByType($type->id);

    $alltypes = $this->Types->getAll();
    $headerdata['alltypes'] = $alltypes;
    $headerdata['current'] = $type->id;

    $this->load->view('header');
    $this->load->view('typeheader', $headerdata);
    if ($testpopup->get_user_segment() == 'popup') {
        $this->load->view('type_new', $data);
    } else {
        $this->load->view('type', $data);
    }
    $this->load->view('footer');
}
It works fine on my localhost, but when I upload it to the server, it breaks, just displaying a blank page. I've isolated the problem to the initialization of the new phpab object. In the helper, it does ob_start(array($this, 'execute')); and this line seems to be what is breaking the code.
What server settings should I be looking at to get it to work? I'm assuming it's a server setting issue because it works fine on my localhost. If I'm wrong and it's some other issue, how do I fix this?
You might want to check your PHP settings. You'll find a setting called output_buffering in your php.ini file that might be set to Off.
Excerpt from php.ini:
; Output buffering allows you to send header lines (including cookies) even
; after you send body content, at the price of slowing PHP's output layer a
; bit. You can enable output buffering during runtime by calling the output
; buffering functions. You can also enable output buffering for all files by
; setting this directive to On. If you wish to limit the size of the buffer
; to a certain size - you can use a maximum number of bytes instead of 'On', as
; a value for this directive (e.g., output_buffering=4096).
output_buffering = 4096
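A quick runtime check (a diagnostic sketch, not a fix) can confirm what the failing server actually uses, since the value phpinfo() shows and the effective per-SAPI value can differ:

```php
<?php
// Print the effective output_buffering setting and the number of
// currently active output buffers on the server showing the blank page.
var_dump(ini_get('output_buffering'));
var_dump(ob_get_level());
```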

What would cause a php script to just stop without reason?

I'm using class.upload.php to upload images to a page. It seems a script that previously worked is now basically stopping partway through without any errors. At each line in the script, I'm echoing a number so I can get an idea of where it stops, i.e.:
// save uploaded image with a new name
$foo->file_new_name_body = $image_name;
$foo->Process($save_directory);
if ($foo->processed) {
    echo 'original image copied 1';
} else {
    echo 'error 1: ' . $foo->error;
}

echo '1'; $foo->file_new_name_body = $image_name . '_medium';
echo '2'; $foo->image_resize = true;
echo '3'; $foo->image_convert = 'jpg';
echo '4'; $foo->image_x = 300;
echo '5'; $foo->image_ratio_y = true;
echo '6'; $foo->file_overwrite = true;
echo '7'; $foo->Process($save_directory);
echo '8';
if ($foo->processed) {
    echo 'original image copied';
} else {
    echo 'error 2: ' . $foo->error;
}
Output:
original image copied 1
1
2
3
4
5
6
7
I have no way to explain it. No errors. No feedback. No further code after echo '7'; is executed. What is going on? How can I debug?
What would be causing this kind of behaviour?
Edit
I was able to fix the issue simply by changing the file being uploaded. It was 2000px wide and when I saved it as a smaller 800px image it completed the process.
I changed the memory as suggested to 40M; however, this still doesn't let me process a 2000px-wide image, which doesn't really make much sense to me. How much memory is needed to deal with files of these sizes?
Check your memory limit; I find that if I hit it, PHP can (does?) just quit without saying anything.
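As a rough illustration of why a 2000px-wide image can blow past a 40M limit: GD holds the decoded bitmap in memory at roughly 4-5 bytes per pixel (an approximation, and estimateImageMemory below is a made-up helper, not part of class.upload.php):

```php
<?php
// Back-of-the-envelope estimate of GD's working memory for a decoded image.
// The 5 bytes/pixel factor is a common rule of thumb, not an exact figure.
function estimateImageMemory(int $width, int $height, int $bytesPerPixel = 5): int
{
    return $width * $height * $bytesPerPixel;
}

// A 2000x1500 source plus the 300px-wide resized copy live in memory at the
// same time during Process(), on top of the script's own baseline usage.
printf("source:  %.1f MB\n", estimateImageMemory(2000, 1500) / 1048576);
printf("resized: %.1f MB\n", estimateImageMemory(300, 225) / 1048576);
```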
Well, I'm no rocket scientist (a) but it seems to me that the most likely cause is that you're not exiting from the call to $foo->Process($save_directory).
You probably need to debug that little beast to find out the reason why.
First step would be to output $save_directory and $image_name to see what they contain.
Second, I would comment out all those flag setting statements and see if it works with just the copy. Then add them back in one at a time until it starts failing again.
It may be that there's either a bug in the conversion code or something wrong with the uploaded image file which causes it to hang. If the latter, perhaps another (simpler) image file may work - it's worth checking.
It's no surprise that it would hang at that point, since all the other statements are simply setting up flags for the Process call.
And, as a last resort, you have access to the source. You can simply make your own copy of it (if you don't already have one) and insert debug statements into that to see where it's stopping.
The code for process is actually quite logically laid out. It appears to be a succession of blocks of the form:
if ($this->processed) {
# Attempt to do next stage, setting
# $this->processed to false if there's a problem.
}
So you could just insert an echo statement before each block to find out which block is causing the problem, then start inserting more debug statements within that block until you narrow it down to the actual line.
You may also want to look at the FAQ, especially the bit that states upping the PHP memory limits may assist with larger images. There are also the forums, but I see you've already started to go down that route :-)
(a) : What do rocket scientists say in this situation?
What is error_reporting set to in your php.ini? This, for example, will show almost any error:
error_reporting = E_ALL & ~E_NOTICE
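The runtime equivalent, for when php.ini can't be edited (a sketch; display_errors is for debugging only and should stay off in production, where logging is preferable):

```php
<?php
// Turn on full error reporting for this script at runtime.
error_reporting(E_ALL);
ini_set('display_errors', '1');
// If the output must stay clean, send errors to the error log instead:
ini_set('log_errors', '1');
```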
