Merge cells and apply style PHPExcel - PHP

When merging cells and applying a style inside a loop, the script gives the fatal error mentioned below.
Fatal error: Maximum execution time of 30 seconds exceeded in D:\xampp\htdocs\bookings\vendor\phpoffice\phpexcel\Classes\PHPExcel\Cell.php on line 892
NOTE: if we use it outside the loop it works fine. The code I use to apply the style and merge the cells is below.
$range = chr($alpha).'1:'.chr($alpha+1).'1';
$spreadsheetWriter->mergeCell($range);
$spreadsheetWriter->setCellStyle($range);
What is wrong here, or is there a way I can handle the fatal error?
Thanks
Amit
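For reference, mergeCell() and setCellStyle() look like wrappers specific to this project; a minimal sketch of the underlying PHPExcel calls, with an execution-time guard, might look like this (the bold font is only a placeholder style):

set_time_limit(0); // lift the 30-second limit; a last resort while the loop is optimized
$sheet = $objPHPExcel->getActiveSheet();
$range = chr($alpha) . '1:' . chr($alpha + 1) . '1'; // e.g. 'A1:B1'
$sheet->mergeCells($range);
$sheet->getStyle($range)->applyFromArray(array(
    'font' => array('bold' => true),
));

Applying the style once to the whole range, rather than cell by cell inside the loop, is usually what keeps PHPExcel under the time limit.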

Memory limit while index solrfal in TYPO3

I use TYPO3 7.6 with solr 6.1.3 and solrfal 4.1.0. Now I get a PHP memory limit error every time I try to run the solrfal scheduler task. It is still stuck at 57%. I debugged and deleted the last file it tried to index, but the error was also thrown with the next file.
I got the error:
Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 464255211 bytes) in /var/www/web1/htdocs/leipzig.ihk.de/typo3conf/ext/solr/Classes/ExtractingQuery.php on line 104
The error is thrown by file_get_contents() on that line. The file is only 90 KB. Does anybody have an idea?
I'd recommend not uploading such a large file.
I would try reducing the number of items per run, or increasing the memory limit.
You need to increase memory_limit in your php.ini file.
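For example, in php.ini (the 1024M value is illustrative; size it to your server):

memory_limit = 1024M

or at runtime, before the indexing starts:

ini_set('memory_limit', '1024M');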
Found the error. In /typo3conf/ext/solrfal/Classes/Queue/ItemRepository.php on line 154, the first "merge_id" was empty. However that happened, I wrapped the line in an if statement and now it works again:
if ($mergeId['merge_id']) {
    $rows[] = $mergeId['merge_id'];
}
Another solution would be to add merge_id > 0 to the WHERE statement, as in the sketch below.
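A sketch of that idea with the TYPO3 7.6 database API (the table name is illustrative, not solrfal's real one):

$rows = $GLOBALS['TYPO3_DB']->exec_SELECTgetRows(
    'merge_id',
    'tx_solrfal_queue_item', // illustrative table name
    'merge_id > 0'           // the condition proposed above; skips empty merge_ids
);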
The previously proposed solution does not fix the problem. To me it looks like the merge_id is not set in the database, and therefore all items get merged.
By default the merge_id has the following format:
'type/fileUid/languageUid/rootPageId'
If there are items in the file index queue without a merge_id, you should clear the file index queue and fill it again.

Is there a maximum number of lines readable by PHP functions

I have this file of 10 million words, one word on every line. I'm trying to open the file, read every line, put each one in an array, and count the number of occurrences of each word.
wartek
mei_atnz
sommerray
swaggyfeed
yo_bada
ronnieradke
… and so on (10M+ lines)
I can open the file, read its size, even parse it line by line and echo each line to the browser (it takes very long, of course), but when I try to perform any other operation, the script just refuses to execute. No error, no warning, no die(…), nothing.
Accessing the file is always OK; it's the other operations that are not performed with the same success. I tried this and it worked…
while (!feof($pointer)) {
    $row = fgets($pointer);
    print_r($row);
}
… but this didn't:
while (!feof($pointer)) {
    $row = fgets($pointer);
    array_push($dest, $row);
}
I also tried SplFileObject and file($source, FILE_IGNORE_NEW_LINES), with the same result every time (not okay with the big file, okay with a small one).
Guessing that the issue is not the size (150 KB) but the length (10M+ lines), I chunked the file down to ~20k lines without any improvement, then reduced it again to ~8k lines, and it worked.
I also removed the time limit with set_time_limit(0), and removed (almost) any memory limit both in php.ini and in my script with ini_set('memory_limit', '8192M'). To catch any errors, I set error_reporting(E_ALL); at the top of my script.
So the questions are:
is there a maximum number of lines that can be read by PHP built-in functions?
why can I echo or print_r the lines, but not perform any other operations on them?
I think you might be running into a long execution time:
How to increase the execution timeout in php?
Different operations take different amounts of time. Printing a line is much cheaper than pushing 10M items into an array one by one. It's strange that you don't get any error messages, though; you should receive a "maximum execution time exceeded" error somewhere.
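For the stated goal of counting occurrences per word, a streaming count sidesteps the giant array entirely; a minimal sketch, assuming one word per line in a hypothetical words.txt:

<?php
set_time_limit(0); // 10M lines can exceed the default 30-second limit

$counts = array();
$pointer = fopen('words.txt', 'r'); // hypothetical file name

while (($row = fgets($pointer)) !== false) {
    $word = trim($row);
    if ($word === '') {
        continue;
    }
    if (!isset($counts[$word])) {
        $counts[$word] = 0;
    }
    $counts[$word]++;
}
fclose($pointer);

arsort($counts); // highest counts first
print_r(array_slice($counts, 0, 10, true)); // top ten words

As long as words repeat, $counts stays far smaller than a 10M-element line array, and fgets() never holds more than one line in memory.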

Drupal - Changing decimal field in database causes errors

I needed to change the data_scale of a decimal field implemented by the "computed field" module.
I changed the field_data_MYFIELD table directly in the database (from 10,2 to 10,4).
I also modified the field_revision_MYFIELD table the same way.
As a third step, I modified the data of the field_config table by changing:
FROM
s:14:"data_precision";s:2:"10";s:10:"data_scale";s:1:"2";
TO
s:14:"data_precision";s:2:"10";s:10:"data_scale";s:1:"4";
When I try to clear caches with drush cc all, I get the following error:
PHP Fatal error: Unsupported operand types in
DRUPAL_SITE/modules/field/field.info.class.inc on line 495
The line 495 is:
// Make sure all expected field settings are present.
$field['settings'] += field_info_field_settings($field['type']);
I enabled the error log in index.php and got the following errors:
unserialize(): Error at offset 330 of 1314 bytes in DRUPAL_SITE/modules/field/field.crud.inc on line 374
Notice: Undefined index: settings in DRUPAL_SITE/sites/all/modules/computed_field/computed_field.install on line 13
Undefined index: settings in DRUPAL_SITE/modules/field/field.info.class.inc on line 495
Fatal error: Unsupported operand types in DRUPAL_SITE/modules/field/field.info.class.inc on line 495
What am I doing wrong?
It's never a good idea to alter settings via MySQL directly; take a look here to do it from code: https://drupal.stackexchange.com/questions/79378/changing-a-field-type-from-integer-to-decimal/151367#151367
Instead, you could have used a hook_update_N() implementation to alter the field, as in the sketch below.
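A minimal sketch of such an update hook, assuming the usual computed_field storage layout (mymodule and field_MYFIELD are placeholders; the _value column name and the data_precision/data_scale keys follow the question):

function mymodule_update_7100() {
  $spec = array(
    'type' => 'numeric',
    'precision' => 10,
    'scale' => 4,
    'not null' => FALSE,
  );
  // Change the column in both the data and the revision tables.
  db_change_field('field_data_field_MYFIELD', 'field_MYFIELD_value', 'field_MYFIELD_value', $spec);
  db_change_field('field_revision_field_MYFIELD', 'field_MYFIELD_value', 'field_MYFIELD_value', $spec);
  // Rewrite the serialized config through PHP so the s:N lengths stay consistent.
  $data = db_query('SELECT data FROM {field_config} WHERE field_name = :name',
    array(':name' => 'field_MYFIELD'))->fetchField();
  $data = unserialize($data);
  $data['settings']['data_precision'] = '10';
  $data['settings']['data_scale'] = '4';
  db_update('field_config')
    ->fields(array('data' => serialize($data)))
    ->condition('field_name', 'field_MYFIELD')
    ->execute();
}

Because unserialize()/serialize() recompute the string lengths, this avoids exactly the corrupted-blob problem described below.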
Thank you Clive. I didn't paste the whole data, apologies. The data_precision etc. keys are indeed inside a data key within that string. I tested the data content with http://blog.tanist.co.uk/files/unserialize/ (as suggested) and the string is not valid; you were right. I fixed it by changing the various s:N lengths to match the content of each string. After some tests, the problem is now resolved. It appears that editing the blob data (for computed fields with PHP code inside) directly from phpMyAdmin is not a good idea, as it adds many extra characters that don't match the declared string lengths. Thanks again for your help.

PHP microseconds in relative templates

I have a DateTime object whose microseconds I would like to modify. I'm using the following code:
$datesArray[$alreadyDoneIndices+$countIndices]->modify("+$microsecondIncrementAmount mseconds");
and I'm getting the following error message:
DateTime::modify() [datetime.modify]: Failed to parse time string (+38462 microseconds) at position 3 (4): Unexpected character in C:\xampp\htdocs\DynamicUpdating\timelineUpdater.php on line 147
What's not working here? Is there a reference for which relative time formats work in PHP, in case that's the problem?
Just check "mseconds" in modify("+$microsecondIncrementAmount mseconds"); the error message tells you that the time string is wrong, and "mseconds" is not a valid unit in PHP's relative formats.
$datesArray[$alreadyDoneIndices+$countIndices]->modify("+{$microsecondIncrementAmount} mseconds");
Notice the {}
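Note that the braces alone won't fix the unit: "mseconds" is not a relative-format unit, and the error message shows that "microseconds" was rejected too on this PHP version. One workaround, precision caveats aside, is to rebuild the object from a fractional Unix timestamp; a sketch, with variable names following the question:

$dt = $datesArray[$alreadyDoneIndices + $countIndices];
// Shift the "seconds.microseconds" timestamp by the increment, then rebuild.
$shifted = (float) $dt->format('U.u') + $microsecondIncrementAmount / 1000000;
$datesArray[$alreadyDoneIndices + $countIndices] = DateTime::createFromFormat('U.u', sprintf('%.6F', $shifted));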

keep getting MAX timeout error on urlencode

I keep getting an error on this code:
<?php
function encode_object(&$obj) {
    foreach ($obj as &$current) {
        if (is_string($current) or is_object($current) or is_array($current)) {
            if (is_object($current) or is_array($current)) {
                encode_object($current);
            }
            if (is_string($current)) {
                $current = urlencode($current);
            }
        }
    }
}
?>
This code has worked before, but for some reason every time I run it now I get:
Fatal error: Maximum execution time of 30 seconds exceeded in * on line 9
What I'm trying to do is give it an object, walk through it, and encode all of the strings.
I have tried multiple times but keep getting the same error.
I am using:
Apache 2.2.15.0
PHP 5.3.3
Windows 7 Ultimate build 7600
EDIT:
The input I'm entering is an error that, after going through this function, is meant to be converted into JSON and read by JavaScript through AJAX.
The input in this case would be:
array("error"=>
array(
"error"=>"error",
"number"=>2,
"message=>"json_encode() [<a href='function.json-encode'>function.json-encode<\/a>]: recursion detected",
"line"=>22))
That is another error that I will worry about later, but it seems that when I put
$obj['error']['message'] = 'blah';
on the object before I send it, the code works fine. So there is something about
json_encode() [<a href='function.json-encode'>function.json-encode<\/a>]: recursion detected
that urlencode seems to be having a problem with.
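The "recursion detected" message hints that the structure may reference itself, in which case encode_object() recurses forever and trips the 30-second limit. A minimal sketch of the same walker with a depth guard (the limit of 64 is an arbitrary assumption):

<?php
function encode_object_guarded(&$obj, $depth = 0) {
    if ($depth > 64) {
        return; // bail out instead of recursing forever on self-referencing data
    }
    foreach ($obj as &$current) {
        if (is_object($current) or is_array($current)) {
            encode_object_guarded($current, $depth + 1);
        } elseif (is_string($current)) {
            $current = urlencode($current);
        }
    }
}
?>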
If it has worked before, then there's probably nothing wrong with the code itself; the objects you're sending it are just large and take longer to process than the default execution time set in PHP.
The quick and dirty way to handle this is to use the ini_set() function:
ini_set("max_execution_time",840); (in this case, 840 is 840/60 or 14 minutes)
I've used this before on a query with a particularly large result set, one which took at least five minutes to load and build the HTML for the page.
Note that this will not work if your host has "Safe Mode" enabled; in that case you actually have to change the setting in php.ini. Otherwise, I use this quick and dirty workaround fairly often for ridiculously huge parsing/processing tasks.
