I keep getting an error on this code:
<?php
function encode_object(&$obj) {
    foreach ($obj as &$current) {
        if (is_string($current) or is_object($current) or is_array($current)) {
            if (is_object($current) or is_array($current)) {
                encode_object($current);
            }
            if (is_string($current)) {
                $current = urlencode($current);
            }
        }
    }
}
?>
This code has worked before, but for some reason every time I run it I get:
Fatal error: Maximum execution time of 30 seconds exceeded in * on line 9
What I'm trying to do is give it an object, search through it, and URL-encode all of the strings.
I have tried multiple times but keep getting the same error.
I am using:
Apache 2.2.15.0
PHP 5.3.3
Windows 7 Ultimate build 7600
EDIT:
The input I'm entering is an error that, after going through this function, is meant to be converted into JSON and read by JavaScript through AJAX.
The input in this case would be:
array("error"=>
array(
"error"=>"error",
"number"=>2,
"message=>"json_encode() [<a href='function.json-encode'>function.json-encode<\/a>]: recursion detected",
"line"=>22))
That is another error that I will worry about later, but it seems that when I put
$obj['error']['message'] = 'blah';
on the object before I send it, the code works fine. So there is something about
json_encode() [<a href='function.json-encode'>function.json-encode<\/a>]: recursion detected
that urlencode seems to be having a problem with.
If it has worked before, then it seems there's nothing wrong with the code, just that the objects you're sending it are large and take longer to process than the default execution time set in PHP.
The quick and dirty way to handle this is to use the ini_set() function:
ini_set("max_execution_time", 840); (in this case, 840 seconds is 840/60, or 14 minutes)
I've used this before on a query with a particularly large result set, one that took at minimum five minutes to load and build the HTML for the page.
Note that this will not work if your host has "Safe Mode" enabled. In that case you actually have to change the setting in php.ini. Otherwise, I use this quick and dirty workaround fairly often for ridiculously huge parsing/processing tasks.
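For completeness, here is a minimal sketch of where such a call would sit relative to the function from the question ($obj stands in for whatever structure you pass in, and the 840-second value simply mirrors the example above):

<?php
// Raise the limit for this request only, before the heavy work starts.
ini_set('max_execution_time', 840);
// set_time_limit(840); would also work and restarts the timer from this point.

encode_object($obj);   // the recursive encoding function from the question
?>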
I read somewhere that PHP parses the whole .php file every time it is executed. Some solution was proposed there (that was not OPcache), but I lost the website and couldn't find it again.
Now I have an enormous PHP website with many long functions that are often used individually, and execution needs to be fast.
To avoid having PHP parse all the other functions that won't be used, I was thinking of a modular design in which the functions, stored in independent PHP files, would only be included if they will actually be used. But I haven't been able to confirm that PHP will not parse an include inside a function or inside a conditional statement unless it is required. Does PHP parse those includes?
Example:
<?php
$func_to_execute = $_GET['func'];
$parameter       = $_GET['parameter'];

switch ($func_to_execute) {
    case 'a':
        include 'func_a.php';
        $output = func_a($parameter);
        break;
    case 'b':
        include 'func_b.php';
        $output = func_b($parameter);
        break;
    case 'c':
        include 'func_c.php';
        $output = func_c($parameter);
        break;
}

echo $output;
?>
In this example, I would like PHP to parse only func_a if I am requesting a, only func_b if I am requesting b, et cetera. There are in practice more than just 3 functions, and each is a very long algorithm that also contains very long strings and arrays.
As an alternative to includes, I was thinking of making independent PHP files and executing them, retrieving their output only when required, with shell_exec. But that would bring other complexities, like formatting the parameters (I have no idea how I would pass a very long string with special characters, or JSON, as a parameter in the shell) and calling the function to execute in the shell. Would those complexities make it slower than just letting PHP parse the whole file?
I know about OPcache. Would it be enough, even if the opcodes of all the functions are checked each time?
Are there other ways to make a PHP website modular, without having PHP parse the whole of the PHP files every time?
Thank you.
Since PHP uses many optimizations and caching (APCu, for example), you don't need to worry about this.
An include won't be parsed at load time; it behaves more like file_get_contents plus execution in the same context, and these will be optimized by PHP's internal cache.
http://php.net/manual/en/intro.apc.php
I ran a benchmarking experiment, and it seems that PHP indeed does not parse conditional includes. I made the test using the example script above, defining each function as follows:
func_a: it only declares that the value of the variable $x is the sentence 'war and peace'.
$x = 'war and peace';
func_b: it only declares that the value of the variable $x is the whole text of the novel War and Peace, which is approximately 3.2 MB long (the whole text was pasted into the PHP file). This would be a very long file to parse.
$x = 'War and Peace, by Leo Tolstoy...(the whole novel...)...';
func_c: it contained invalid syntax that should immediately raise an error from PHP. This was done to guarantee that PHP was not actually parsing what was not included.
I measured the execution time from another PHP script using shell_exec() (roughly as sketched after the results below). The results were (in seconds):
func_a ≈ 0.122
func_b ≈ 0.152
func_c ≈ 0.119
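For reference, the timing harness looked roughly like this (the file name test_switch.php and the command-line argument handling are assumptions; the switch example above would need to read $argv[1] instead of $_GET['func'] when invoked from the CLI):

<?php
// Rough sketch: run one variant via the shell and measure the wall-clock time
// of the whole PHP process, parse time included.
function time_request($func) {
    $start = microtime(true);
    shell_exec('php test_switch.php ' . escapeshellarg($func));
    return microtime(true) - $start;
}

foreach (array('a', 'b', 'c') as $func) {
    printf("func_%s ≈ %.3f s\n", $func, time_request($func));
}
?>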
Therefore I conclude that:
- Includes in a switch statement are not parsed unless they are actually required.
- A syntax mistake in an include (inside a switch statement) will not raise any error if it is not actually required, because it is not parsed.
- Anyway, the difference in processing time is very small (about 0.03 extra seconds for an extra 3.3 MB of text, or roughly 0.01 extra seconds per 1 MB of text to parse). However, this might be important if many users request the website at the same time, so it might be useful to split into modules (includes) if the script really is that big. Also, the fact that a badly written include is not parsed when it is not required helps avoid raising errors when they aren't relevant.
This therefore seems to me a good way to design a modular PHP application whose modules are extremely big.
I am using Laravel and https://csv.thephpleague.com/ to parse CSV files.
My function is something like:
$path = $request->file('import_file')->getRealPath();

$csv = Reader::createFromPath($path, 'r');
$csv->setHeaderOffset(0);

$csv_header  = $csv->getHeader();
$sample_data = $csv->fetchOne();
$sample_data = array_values($sample_data);

$records = $csv->getRecords();

$csv_file_id = Csv_data::create([
    'csv_filename' => $request->file('import_file')->getClientOriginalName(),
    'csv_header'   => json_encode($csv_header),
    'csv_data'     => json_encode($records),
]);
How can I parse large data sets while dealing with the execution time limit?
Well, I'm pretty new to these things, so please don't just comment "use this and that". Up to now time has just been passing by while I try this and that package, so a solution with code snippets would be better.
I also tried:
$stmt = (new Statement())
    ->offset($offset)
    ->limit($limit);
But with no success. First, even when limiting with an offset and increasing it in a loop, it shows the same execution time error. Second, it's a little difficult for me to end the loop with good logic.
Looking for some help. I will be available for instant reply.
Are you using a console command for this?
https://laravel.com/docs/5.6/artisan
If you run into memory limits when doing this through the console, you can first try to increase the PHP memory limit.
If that is still not enough, the last option is to cut the .csv up into parts.
But this should not be necessary unless you are dealing with a very, very large .csv (unlikely).
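If you do move this into a console command, a minimal sketch might look like the following (the class name, signature, and the 512M value are assumptions, not code from the question). A CLI command is not bound by the web server's request timeout, and the memory limit can be raised for just this process:

<?php
namespace App\Console\Commands;

use Illuminate\Console\Command;
use League\Csv\Reader;

class ImportCsv extends Command
{
    // run with: php artisan csv:import /path/to/file.csv
    protected $signature = 'csv:import {path}';
    protected $description = 'Import a large CSV outside the HTTP request cycle';

    public function handle()
    {
        ini_set('memory_limit', '512M');   // raise the limit for this process only, if needed

        $csv = Reader::createFromPath($this->argument('path'), 'r');
        $csv->setHeaderOffset(0);

        foreach ($csv->getRecords() as $record) {
            // persist each row here instead of json_encoding the whole file at once
        }
    }
}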
By default, PHP is usually set to execute an HTTP request for 30 seconds before timing out -- to prevent a runaway script or infinite loop from processing forever. It sounds like this is what you're running into.
The quick and dirty method is to add ini_set('max_execution_time', 300); at the top of your script. This will tell PHP to run for 300 seconds (5 minutes) before timing out.
You can adjust that time as needed, but if it regularly takes longer than that, you may want to look at other options -- such as creating a console command (https://laravel.com/docs/5.6/artisan) or running it on a schedule.
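Whichever way you run it, the offset/limit approach from the question can be made to terminate cleanly by stopping when a chunk comes back shorter than requested. A hedged sketch ($path is the uploaded file path from the question; the chunk size and per-row handling are assumptions):

use League\Csv\Reader;
use League\Csv\Statement;

$csv = Reader::createFromPath($path, 'r');
$csv->setHeaderOffset(0);

$offset = 0;
$limit  = 1000;   // rows per chunk

do {
    $records = (new Statement())->offset($offset)->limit($limit)->process($csv);
    $count   = count($records);

    foreach ($records as $record) {
        // insert/queue each row here instead of building one huge JSON string
    }

    $offset += $limit;
} while ($count === $limit);   // a short (or empty) final chunk means the file is exhausted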
I have this file of 10 million words, one word on every line. I'm trying to open that file, read every line, put it in an array and count the number of occurrences for each word.
wartek
mei_atnz
sommerray
swaggyfeed
yo_bada
ronnieradke
… and so on (10M+ lines)
I can open the file, read its size, even parse it line by line and echo the line in the browser (it's very long, of course), but when I try to perform any other operation, the script just refuses to execute. No error, no warning, no die(…), nothing.
Accessing the file is always OK, but it's the operations on it that don't succeed in the same way. I tried this and it worked…
while (!feof($pointer)) {
    $row = fgets($pointer);
    print_r($row);
}
… but this didn't:
while (!feof($pointer)) {
    $row = fgets($pointer);
    array_push($dest, $row);
}
I also tried with SplFileObject or file($source, FILE_IGNORE_NEW_LINES), with the same result every time (not okay with the big file, okay with a small file).
Guessing that the issue is not the size (150 KB) but probably the length (10M+ lines), I chunked the file down to ~20k lines without any improvement, then reduced it again to ~8k lines, and it worked.
I also removed the time limit with set_time_limit(0); and removed (almost) any memory limit, both in php.ini and in my script with ini_set('memory_limit', '8192M');. Regarding the errors I could get, I set error_reporting(E_ALL); at the top of my script.
So the questions are :
Is there a maximum number of lines that can be read by PHP built-in functions?
Why can I echo or print_r but not perform any other operations?
I think you might be running into a long execution time:
How to increase the execution timeout in php?
Different operations take different amounts of time. Printing might be a lot cheaper than pushing 10M new entries into an array one by one. It's strange that you don't get any error messages; you should receive a "maximum execution time exceeded" error somewhere.
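As a hedged sketch of what the counting step could look like while streaming the file (the file name words.txt is an assumption), incrementing a per-word counter keeps memory proportional to the number of distinct words rather than the number of lines:

$counts  = array();
$pointer = fopen('words.txt', 'r');

while (($row = fgets($pointer)) !== false) {
    $word = trim($row);
    if ($word === '') {
        continue;
    }
    // count occurrences directly instead of array_push-ing 10M rows first
    $counts[$word] = isset($counts[$word]) ? $counts[$word] + 1 : 1;
}

fclose($pointer);
arsort($counts);                                // most frequent words first
print_r(array_slice($counts, 0, 20, true));     // peek at the top 20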
This question already has answers here:
making print_r use PHP_EOL
(5 answers)
Closed 6 years ago.
I've been coding in PHP for a long time (15+ years now), and I usually do so on a Windows OS, though most of the time it's for execution on Linux servers. Over the years I've run up against an annoyance that, while not important, has proved to be a bit irritating, and I've gotten to the point where I want to see if I can address it somehow. Here's the problem:
When coding, I often find it useful to output the contents of an array to a text file so that I can view its contents. For example:
$fileArray = file('path/to/file');
$faString = print_r($fileArray, true);
$save = file_put_contents('fileArray.txt', $faString);
Now when I open the file fileArray.txt in Notepad, the contents of the file are all displayed on a single line, rather than the nice, pretty structure seen if the file were opened in Wordpad. This is because, regardless of OS, PHP's print_r function uses \n for newlines, rather than \r\n. I can certainly perform such replacement myself by simply adding one line of code to make the necessary replacements, and therein lies the problem. That one, single line of extra code translates back through my years into literally hundreds of extra steps that should not be necessary. I'm a lazy coder, and this has become unacceptable.
Currently, on my dev machine, I've got a different sort of work-around in place (shown below), but this has its own set of problems, so I'd like to find a way to "coerce" PHP into putting in the "proper" newline characters without all that extra code. I doubt that this is likely to be possible, but I'll never find out if I never ask, so...
Anyway, my current work-around goes like this. I have, in my PHP include path, a file (print_w.php) which includes the following code:
<?php
function print_w($in, $saveToString = false) {
    $out = print_r($in, true);
    $out = str_replace("\n", "\r\n", $out);
    switch ($saveToString) {
        case true: return $out;
        default: echo $out;
    }
}
?>
I also have auto_prepend_file set to this same file in php.ini, so that it automatically includes it every time PHP executes a script on my dev machine. I then use the function print_w instead of print_r while testing my scripts. This works well, so long as when I upload a script to a remote server I make sure that all references to the function print_w are removed or commented out. If I miss one, I (of course) get a fatal error, which can prove more frustrating than the original problem, but I make it a point to carefully proofread my code prior to uploading, so it's not often an issue.
So after all that rambling, my question is, Is there a way to change the behavior of print_r (or similar PHP functions) to use Windows newlines, rather than Linux newlines on a Windows machine?
Thanks for your time.
Ok, after further research, I've found a better work-around that suits my needs and eliminates the need to call a custom function instead of print_r. This new work-around goes like this:
I still have to have an included file (I've kept the same name so as not to have to mess with php.ini), and php.ini still has the auto_prepend_file setting in place, but the code in print_w.php changes a bit:
<?php
rename_function('print_r', 'print_rw');
function print_r($in, $saveToString = false) {
    $out = print_rw($in, true);
    $out = str_replace("\n", "\r\n", $out);
    switch ($saveToString) {
        case true: return $out;
        default: echo $out;
    }
}
?>
This effectively alters the behavior of the print_r function on my local machine, without my having to call custom functions or make sure that all references to a custom function are neutralized. By using PHP's rename_function I was able to effectively rewrite how print_r behaves, making it possible to address my problem.
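With that file auto-prepended (and assuming rename_function is available in your PHP build, as the snippet above requires), the original snippet from the question works unchanged; for example:

// print_r now goes through the wrapper, so the dump written to disk
// already uses \r\n line endings and displays correctly in Notepad.
$fileArray = file('path/to/file');
$faString  = print_r($fileArray, true);
$save      = file_put_contents('fileArray.txt', $faString);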
I need to get information from the server (the CPU load at this moment) and write it into a variable. sys_getloadavg() returns an array with three samples (the last 1, 5 and 15 minutes), but I need the CPU load right now.
There is no direct way of doing it. My suggested workaround is to use a function like exec or its equivalents. With exec you can invoke a Unix command like top to get the desired stats. Then you have to parse the return value of exec and extract the information you are looking for. There is also the file /proc/stat, which you can read to get the information you want.
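As a hedged sketch of the /proc/stat route mentioned above (Linux only; the one-second sampling interval is an assumption), you can read the aggregate cpu line twice and compute how busy the CPU was over that interval:

function cpu_usage_percent($interval = 1) {
    $sample = function () {
        // first line of /proc/stat: "cpu  user nice system idle iowait irq softirq ..."
        $fields = preg_split('/\s+/', trim(strtok(file_get_contents('/proc/stat'), "\n")));
        array_shift($fields);                                          // drop the "cpu" label
        $fields = array_map('intval', $fields);
        $idle   = $fields[3] + (isset($fields[4]) ? $fields[4] : 0);   // idle + iowait
        return array(array_sum($fields), $idle);
    };

    list($totalA, $idleA) = $sample();
    sleep($interval);
    list($totalB, $idleB) = $sample();

    $total = $totalB - $totalA;
    $idle  = $idleB  - $idleA;

    return $total > 0 ? (1 - $idle / $total) * 100 : 0.0;
}

$cpu_loading = cpu_usage_percent();   // e.g. 7.3 — percent of CPU busy over the last second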
It was easy; I just had to take the first element of the array:
$cpu_loading_array = sys_getloadavg();
$cpu_loading = $cpu_loading_array[0];