require_once() takes two minutes? - php

I have a PHP script whose execution time used to be a few milliseconds.
But yesterday I got complaints that it loads forever. I checked into it and nailed the problem down to a line that uses require_once(); -- the execution time is about two minutes just for that line!
The file to be included contains a bunch of functions and in itself doesn't do anything at that point. It is also only about 35 kB in size.
I went through the script logging the microtime() and this is the output:
0.10887400 1442934181 // start of script
0.13321200 1442934181 // line before require_once()
0.16033800 1442934307 // logged again in the first line of the included functions file
0.16048000 1442934307 // back in the original script, line after require_once()
0.16054300 1442934307 // end of script
Out of curiosity I've tried replacing require_once() with require() -- no change.
I don't know what the cause could be or where I should start to debug. It worked before without problems, and I haven't changed anything.

require_once and require need to access the file, read it, and execute it. So you may have one of several problems; you'll have to check (a timing sketch to help narrow it down follows this list):
Maybe your hard disk is going bad?
Maybe the file is locked by some other process?
Maybe the file now contains a big function doing too many things?
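One way to narrow this down is to time the raw disk read separately from the include itself. A minimal sketch, assuming the included file is called functions.php (a placeholder name): a slow read points at the disk or a lock, while a fast read followed by a slow require_once points at what the file executes:
$t0 = microtime(true);
$raw = file_get_contents('functions.php'); // raw disk read (also warms the OS file cache)
$t1 = microtime(true);
require_once 'functions.php';              // read + compile + execute
$t2 = microtime(true);
printf("read: %.4fs, require_once: %.4fs\n", $t1 - $t0, $t2 - $t1);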

Related

How to rerun the script when an error occurs?

I have written a script that searches for values in an XML file. I retrieve these files online via the following code:
# Reads entire file into a string
$result = file_get_contents($xml_link);
# Returns the xml string into an object
$xml = simplexml_load_string($result);
But the XML files are sometimes big, with the consequence that I get the following error: Fatal error: Maximum execution time of 30 seconds exceeded.
I have adapted php.ini to set max_execution_time to 360 seconds, but I still get the same error.
I have two options in mind.
If this error occurs, run the line again. But I couldn't find anything online (I am probably searching with the wrong search terms). Is there a possibility to run the line where the error occurs again?
Save the XML files temporarily on local disk, search for the information there, and remove them at the end of the process (see the sketch after the answer below). Here I have no idea how to remove them after retrieving all the data. And would this actually solve the problem? My script still needs to search through the XML file; won't it take the same amount of time?
When I used these two lines in my script the problem was solved.
ini_set('max_execution_time', 300);
set_time_limit(0);
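Regarding option 2 in the question: here is a minimal sketch (the URL is a placeholder, and $tmp is a hypothetical name) of downloading the XML to a local temporary file, parsing that copy, and removing it afterwards with unlink(). Note it won't make the search itself faster; it only shows the save-and-clean-up part:
$xml_link = 'http://example.com/data.xml'; // placeholder URL
$tmp = tempnam(sys_get_temp_dir(), 'xml'); // temporary local file
file_put_contents($tmp, file_get_contents($xml_link)); // download once
$xml = simplexml_load_file($tmp); // parse the local copy
// ... search the XML here ...
unlink($tmp); // remove the temporary file when done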

is_dir fails when executed in parallel with the script creating the dir

EDIT: My problem came from the "intelligent" behaviour of Firefox. If you call the same page in two different tabs, it automatically starts the second request only after the first is done. If you want parallel execution you must add a different parameter to each call.
I was trying to create a mutex using a directory. For example:
$dir = 'test';
echo is_dir($dir);
mkdir($dir);
wait(30);
rmdir($dir);
In a browser I call the script, and in another tab a few seconds later I call the same script.
is_dir returns false and there is no error on mkdir on the second call.
On disk, the dir is created by the first script and remains until the second script ends.
If I call the two scripts one after the other on the command line, I get the expected result: is_dir is true and mkdir fails with a "dir already exists" error.
The web server is Apache 2.
I can't explain such behavior.
When you use stat(), lstat(), or any of the other functions listed in the affected functions list (below), PHP caches the information those functions return in order to provide faster performance. However, in certain cases, you may want to clear the cached information. For instance, if the same file is being checked multiple times within a single script, and that file is in danger of being removed or changed during that script's operation, you may elect to clear the status cache. In these cases, you can use the clearstatcache() function to clear the information that PHP caches about a file.
This function caches information about specific filenames, so you only need to call clearstatcache() if you are performing multiple operations on the same filename and require the information about that particular file to not be cached.
Affected functions include stat(), lstat(), file_exists(), is_writable(), is_readable(), is_executable(), is_file(), is_dir(), is_link(), filectime(), fileatime(), filemtime(), fileinode(), filegroup(), fileowner(), filesize(), filetype(), and fileperms().
TL;DR: add clearstatcache(); before any checks.
Source: http://php.net/manual/en/function.clearstatcache.php
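Applied to the question's example, that would look something like this (a minimal sketch):
$dir = 'test';
clearstatcache();    // drop PHP's cached stat() results for this path
if (!is_dir($dir)) { // now reflects the actual state on disk
    mkdir($dir);
}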
You might want to explain a bit better, and paste a better code example...
Meanwhile, here is a better way to handle your mkdir/rmdir:
$myDir = 'my/dir/';
if (!is_dir($myDir)) {
    mkdir($myDir, 0755, true);
    wait(30);
    rmdir($myDir);
}
You might need to find out how to recursively delete dirs and files; it might help... ;)
Also, is wait() a PHP function you made?!
I do know sleep() but not wait()...
The code could be prettier and more realistic; I was just trying to be concise. I had thought of an APC or XCache caching problem...
Wandering the interweb for a hint, I read that when calling the same script in two tabs, Firefox is so "intelligent" (f... him) that it waits for the first request to be done before executing the second.
Adding a different param to each call (?t=1 and ?t=2), or using Chrome for one call and Firefox for the other, makes it work flawlessly... What a waste of time...

PHP resets variables

I'm trying to create a script that creates unique codes and writes them to a textfile.
I've managed to generate the codes, and write them to the file.
Now my problem is the fact that my loop keeps running, resulting in over 92,000 codes being written to the file before the server times out.
I've done some logging, and it seems that everything works fine; it's just that after a certain number of seconds, all my variables are reset and everything starts from scratch. The time interval after which this happens varies from run to run.
I've already set ini_set('memory_limit', '200M'); and ini_set('max_execution_time', 0); at the top of my script. Maybe there's a PHP time-out setting I'm missing?
The script is a function in a controller. I set the ini_set at the beginning of this function. This is the loop I'm going through:
public function generateAction() {
    ini_set('memory_limit', '200M');
    ini_set('max_execution_time', 0);
    $codeArray = array();
    $numberOfCodes = 78000;
    $codeLength = 8;
    $totaalAantal = 0;
    $file = fopen("codes.txt", "a+");
    // keep generating until we have enough unique codes
    while (count($codeArray) < $numberOfCodes) {
        $code = self::newCode($codeLength);
        if (!in_array($code, $codeArray)) {
            $totaalAantal++;
            $codeArray[] = $code;
            fwrite($file, 'total: ' . $totaalAantal . "\r\n");
        }
    }
    fclose($file);
}
In the file this would give something like this:
total: 1
total: 2
total: ...
total: 41999
total: 42000
total: 1
total: 2
total: ...
total: 41999
total: 42000
Thanks.
Edit: so far we've established that generateAction() is called 2 or 3 times before the end of the script, when it should only be called once.
I already found the solution for this problem.
The host's script limit was set to 90 seconds, and because this script had to run for longer, I had to run it via the command line.
Taking into account the test with uniqid(), we can say that the variables are not reset; rather, the method generateAction() is called several times.
Since your code is probably synchronous, we may say that generateAction() is called several times because the main script is called several times.
What happens in detail?
Because of the nature of your algorithm, each pass through the loop is slower than the previous one: in_array() scans the whole array, so each uniqueness check costs more as the array grows. So executing generateAction() may take quite a long time.
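As an aside, here is a minimal sketch of sidestepping that slowdown by using each code as an array key, which makes the uniqueness check a constant-time isset() instead of a linear in_array() scan. The md5() line is a stand-in for the question's newCode() helper:
$numberOfCodes = 78000;
$codes = array();
while (count($codes) < $numberOfCodes) {
    $code = substr(md5(mt_rand()), 0, 8); // stand-in for self::newCode()
    if (!isset($codes[$code])) {          // O(1) instead of in_array()'s O(n)
        $codes[$code] = true;
    }
}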
You probably don't wait for the end, and you stop the process or even start it again from a new page. Nevertheless, the process doesn't really stop that soon; it keeps running in the background. I've observed such behavior on my local WAMP/LAMP installation: the script is not actually stopped even if I stop the page, close the page, close the browser, or restart Apache.
So what happens to you is that several script processes are writing simultaneously to the codes.txt file.
In order to avoid this, you can, for example, lock the file during the loop using the function flock(), as sketched below.
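A minimal sketch of that flock() approach: take an exclusive lock on the output file so a second process blocks on the lock instead of interleaving its writes:
$file = fopen("codes.txt", "a+");
if (flock($file, LOCK_EX)) { // exclusive lock; a second process waits here
    // ... the generation loop writes to $file here ...
    fflush($file);           // push buffered output before unlocking
    flock($file, LOCK_UN);   // release the lock
}
fclose($file);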

totally random timeouts after session_start() is called

I have been trying to fix this weird PHP session issue for some time now.
Setup: running on IIS v6.0, PHP for Windows v5.2.6
Issue:
At totally random times, the next PHP line after session_start() times out.
I have a file, auth.php, that gets included on every page of an extranet site (to check for valid logins).
auth.php
session_start();
if (isset($_SESSION['auth']) && $_SESSION['auth'] == 1) { <---- times out here
do something ...
}
...
When using the site, I get random "maximum execution time of 30 seconds exceeded" errors at line 2: if (isset($_SESSION['auth']) && $_SESSION['auth'] == 1) {
If I modify this script to
session_start();
echo 'testing'; <---- times out here
if (isset($_SESSION['auth']) && $_SESSION['auth'] == 1) {
do something ...
}
...
The random error now happens on line 2 as well (echo 'testing'), which is just a simple echo statement. Strange.
It looks like session_start() is randomly causing issues, making the line of code right after it (even a simple echo statement) hit the timeout...
This is happening on all sorts of pages on the site (db-intensive, relatively static, ...), which makes it difficult to troubleshoot. I have been tweaking the session variables and timeouts in php.ini without any luck.
Has anyone encountered something like this, or could you suggest possible places to look?
Thanks!
A quick search suggests that you should be using session_write_close() to close the session as soon as you are done using it if you are on an NTFS file system. Starting a session locks the session file so no other request can access it while the code is running. For some reason, the lock sometimes doesn't release reliably on Windows/NTFS, so you should manually close the session when you are done with it.
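A minimal sketch of that pattern, applied to the auth.php above:
session_start();
// Copy what you need out of the session...
$auth = isset($_SESSION['auth']) ? $_SESSION['auth'] : null;
// ...then release the session file lock right away, so other
// requests from the same user are not stuck waiting behind this one.
session_write_close();
if ($auth == 1) {
    // do something ...
}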

PHP - how to assure correct command execution sequence

Given simple code like:
$file = 'hugefile.jpg';
$bckp_file = 'hugeimage-backup.jpg';
copy($file, $bckp_file);
// here comes some manipulation on $bckp_file.
The assumed problem is that if the file is big or huge -- let's say a JPG -- one would think it will take the server some time to copy it (by time I mean even a few milliseconds), but one would also assume that the execution of the next line would be much faster.
So in theory I could end up with a "no such file or directory" error when trying to manipulate a file that has not yet been created -- or worse, start to manipulate a TRUNCATED file.
My question is how I can ensure that $bckp_file was created (or in this case, copied) successfully before the NEXT line, which manipulates it.
What are my options to "pause" or "delay" the next line's execution until the file creation/copy has completed?
right now I can only think of something like
if (!copy($file, $bckp_file)) {
    echo "failed to copy $file...\n";
}
which will only alert me but will not resolve anything (the same as just having the PHP error)
or
if (copy($file, $bckp_file)) {
    // move the manipulation to here ..
}
But this is also not quite valid, because let's say the copy was not executed: I will just skip the block without achieving my goal and without any errors.
Is that even a problem, or am I over-thinking it?
Or does PHP have a built-in mechanism to ensure that?
Any recommended practices?
Any thoughts on the issue?
What are my options to "pause" or "delay" the next line's execution until the file creation/copy has completed?
copy() is a synchronous function meaning that code will not continue after the call to copy() until copy() either completely finishes or fails.
In other words, there's no magic involved.
if (copy(...)) { echo 'success!'; } else { echo 'failure!'; }
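To address the question's worry about silently skipping the block when the copy fails, one option (a sketch, not the only way) is to make the failure loud:
$file = 'hugefile.jpg';
$bckp_file = 'hugeimage-backup.jpg';
if (!copy($file, $bckp_file)) {
    // A failed copy is now impossible to miss instead of being silently skipped.
    throw new RuntimeException("failed to copy $file to $bckp_file");
}
// If we get here, the copy has fully completed.
// ... manipulate $bckp_file safely ...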
Along with synchronous IO, there is also asynchronous IO. It's a bit complicated to explain in technical detail, but the general idea is that it runs in the background and your code hooks into it. Then, whenever a significant event happens, the background execution alerts your code. For example, if you were going to async-copy a file, you would register a listener on the copy that would be notified when progress was made. That way, your code could do other things in the meantime, but you could also know what progress was being made on the file.
PHP handles file uploads by saving the whole file in a temporary directory on the server before executing any of the script (so you can use $_FILES from the beginning), and it's safe to assume all these functions are synchronous -- that is, PHP will wait for each line to finish executing before moving to the next line.
