I have an extremely oversimplified logger class:
<?php
class Logger {
    private $logFile;

    public function __construct($logFile) {
        $this->logFile = $logFile;
    }

    public function log($message) {
        $message = date('c') . $message;
        file_put_contents($this->logFile, $message, FILE_APPEND | LOCK_EX);
        echo "it ran ";
    }
}
Calling it like this:
$logger = new Logger('log.txt');
$logger->log("message");
echo "called the method";
causes the message to be written to the file exactly three times instead of once.
The code is not inside any loop, which the echo statements confirm: they are printed only once.
Also, if I simply call file_put_contents() in the place where I'd call the log method, it works fine and writes the content just once. So it might have something to do with my class, but I have no clue what.
EDIT: @Tommy: here is the log file content:
2014-09-26T07:24:51-04:00message2014-09-26T07:24:54-04:00message2014-09-26T07:24:54-04:00message
EDIT 2: I tried calling the die() function right after calling the method, and then it did write the message just once. So I kept moving the die() further down through the code, and the message starts being written 3 times once the die() sits below this exact line:
if (isset($_POST['create_account'])) {
die;
Since there's a die right below it, it shouldn't even matter what's in the rest of the code, right?
I wonder if it might be some sort of PHP bug; this is very strange. If I put the die() above this line, it works fine and writes the message just once.
There's a fairly good chance that your code does a redirect or reload somewhere. This causes a new request to start, which wipes away the output of the original echo but does not remove what was written to the file. As a result it looks like the message was echoed once and written three times. But really, it was echoed three times as well; the earlier copies were simply discarded by the new requests.
If you want to see what's going on, write part of the stack trace into the log file along with the message. Then you can see exactly on which line, and during which function call, each message was created.
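For example, here is a minimal sketch of a log() method that appends a condensed backtrace to each entry (it reuses the Logger class from the question; the exact trace formatting is just one possible choice):
<?php
class Logger {
    private $logFile;

    public function __construct($logFile) {
        $this->logFile = $logFile;
    }

    public function log($message) {
        // Collect "file:line function()" entries for every frame on the call stack.
        $frames = array();
        foreach (debug_backtrace(DEBUG_BACKTRACE_IGNORE_ARGS) as $frame) {
            $frames[] = sprintf(
                '%s:%s %s()',
                isset($frame['file']) ? $frame['file'] : '?',
                isset($frame['line']) ? $frame['line'] : '?',
                $frame['function']
            );
        }

        $line = date('c') . ' ' . $message . ' [' . implode(' <- ', $frames) . ']' . PHP_EOL;
        file_put_contents($this->logFile, $line, FILE_APPEND | LOCK_EX);
    }
}
Each log entry now ends with the chain of calls that produced it, so two entries written by two different requests (or from two different code paths) become easy to tell apart.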
In my experience, the main cause of this is that index.php is being requested twice, typically because the browser's request for a missing favicon.ico gets routed back to index.php by the rewrite rules. To fix it:
change the file name, or
make sure favicon.ico actually exists, so that its request no longer falls through to index.php (the sketch below shows how to confirm this is what's happening).
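One way to confirm the double request is to log every URI that actually reaches index.php; this is only a diagnostic sketch, and the log path is arbitrary:
<?php
// Put this at the very top of index.php. If a second line appears for
// /favicon.ico (or for the same page), the script really is being run twice
// per page view.
file_put_contents(
    '/tmp/requests.log',
    date('c') . ' ' . $_SERVER['REQUEST_METHOD'] . ' ' . $_SERVER['REQUEST_URI'] . PHP_EOL,
    FILE_APPEND | LOCK_EX
);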
I use phpseclib to SSH into my remote server from a web browser and execute a php file. Below is the code I use:
$ssh->exec('cd myfolder/; php main.php ' . $file, 'packet_handler');
function packet_handler() {
    echo "Completed";
    header("Location: exec_completed.php");
}
The main.php file gets executed without any issue. The problem is with returning the data after execution. I have the following questions:
I do a lot of processing in the main.php file and I need to show real-time progress of what the script does. When I execute the file through exec, only the first echo in main.php is printed and the execution stops. Is there any way to get real-time data from the executing script?
I followed this example from phpseclib for callbacks, but my callback function packet_handler doesn't run after exec is executed. I want to redirect to another page once the main.php I execute through SSH has completed its execution. Right now, if I redirect to that page I get only partial results, because main.php has not completed its execution. I tried using sleep(10), but main.php may sometimes take longer to execute, so that didn't work. Please suggest any ideas.
Callback example from phpseclib:
<?php
include('Net/SSH2.php');
$ssh = new Net_SSH2('www.domain.tld');
if (!$ssh->login('username', 'password')) {
exit('Login Failed');
}
function packet_handler($str)
{
echo $str;
}
$ssh->exec('ping 127.0.0.1', 'packet_handler');
?>
For #1... doing $ssh->exec without the callback should work. If it doesn't, I'd need to see the logs, which you can get by doing define('NET_SSH2_LOGGING', 2) at the top and then echo $ssh->getLog() afterwards. Posting the log at pastebin.com and then posting a link would be good. But that said, that won't get you real-time output either.
For #2... the callback function is mainly intended for real-time updates, and odds are very high that what you'll get with each call of the callback function will be an incomplete piece of the output. So having your callback output "Completed" and redirect the user to another location is, in all likelihood, incorrect.
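If you do use the callback, one option is to treat each chunk as a fragment of progress output and only redirect after exec() has returned. A rough sketch, assuming the $ssh connection and $file variable from the question; the progress-file path is made up:
<?php
// Each invocation receives whatever output has arrived so far, usually a fragment.
function packet_handler($str)
{
    // Append the fragment to a progress file that another page or AJAX call can poll.
    file_put_contents('/tmp/exec_progress.log', $str, FILE_APPEND);
}

$ssh->exec('cd myfolder/; php main.php ' . $file, 'packet_handler');

// Only here, after exec() returns, has main.php actually finished running.
header('Location: exec_completed.php');
exit;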
Another approach that may work for you: use the interactive shell. Example:
http://phpseclib.sourceforge.net/ssh/examples.html#sudo
I don't know what your output is like. Maybe you could read() until you get to certain parts of the output that are guaranteed to appear. Or maybe you could use $ssh->setTimeout(5) and fetch updated output every five seconds or so.
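Here is a rough sketch of that idea (write(), read() and setTimeout() are the standard Net_SSH2 interactive-shell calls; the command and the __DONE__ marker are placeholders you would adapt):
<?php
include('Net/SSH2.php');

$ssh = new Net_SSH2('www.domain.tld');
if (!$ssh->login('username', 'password')) {
    exit('Login Failed');
}

// read() will return after at most 5 seconds even if the expected string
// hasn't shown up yet, so partial output can be streamed as it accumulates.
$ssh->setTimeout(5);

// Run the job and print a marker we can wait for.
$ssh->write("cd myfolder/ && php main.php somefile; echo __DONE__\n");

do {
    $chunk = $ssh->read('__DONE__');   // returns on the marker or on timeout
    if ($chunk === false) {
        break;                          // connection problem
    }
    echo $chunk;
    flush();                            // push the partial output to the browser
} while (strpos($chunk, '__DONE__') === false);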
I am fairly new to PHP, and even more so to the ob_ functions, so help me understand this, as the manual somehow does not provide a very simple example or reference.
I am assuming that "output buffering" is what delays PHP from sending output (and therefore headers) until the full content is ready, and that this may be why the header() function does not issue an error if ob_start() was called above it. If so, my question is: how do I "buffer" only some of the content, instead of just calling ob_start() at the top of my script, which is greatly slowing down my application?
Example:
<?php
namespace App\Controller;

class Home extends Controller {
    public function showHomePage()
    {
        $students = $pdo->query('SELECT id FROM students');
        $view->showContent($students); // includes content.php
    }
}

//content.php
<p> showing student by id </p>
<?php
function showContent($students)
{
    if (!$students) {
        header('Location: /404');
    } else {
        // show students
    }
}
Now you can see in the above example that, as soon as content.php is loaded, it will issue a "headers already sent" error (if $students evaluates to false/null), so to hide this error I placed ob_start() inside my showHomePage method, as seen here:
public function showHomePage()
{
    $students = $pdo->query('SELECT id FROM students');
    ob_start();
    $view->showContent($students); // includes content.php
}
Now, with the above approach, I get no header errors, but I would like to close that buffer as soon as the showContent() method has executed. In other words, I want the ob_start() to apply only to that one call. I tried to do something like this:
public function showHomePage()
{
    $students = $pdo->query('SELECT id FROM students');
    ob_start();
    $view->showContent($students); // includes content.php
    ob_end_flush();
}
but now the contents of showContent() are not being shown (content.php is unchanged from above).
This is a terrible way to code. You've already got your output baked in, which, as you've noted, prevents you from changing the header(). This is a major driver behind MVC, which holds that you need to segment your code and separate your view (HTML) from your controller (PHP). In this case, you've put a function inline with your HTML.
There are a couple of ways to work around this without having to resort to output buffering:
Do the check on $students earlier in the page (like when you get/build the data set) and issue the 404 there (a sketch of this follows below).
Move your HTML into a separate template file (maybe check out Smarty to help with that) and then do your rendering there.
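A minimal sketch of the first option, reusing the names from the question ($pdo and $view are assumed to exist, as in the original snippet):
<?php
namespace App\Controller;

class Home extends Controller
{
    public function showHomePage()
    {
        $students = $pdo->query('SELECT id FROM students');

        // Decide about the redirect before any HTML has been produced.
        if (!$students) {
            header('Location: /404');
            exit;
        }

        // Only now render the template; it never needs to call header() itself.
        $view->showContent($students);
    }
}
Because the redirect decision happens before any output, there is nothing for ob_start() to hide in the first place.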
I had the same issue and solved it by adding:
ob_implicit_flush(true);
to the beginning of the php file. This outputs everything right away and you can take your other flush commands out.
http://php.net/manual/en/function.ob-implicit-flush.php
ob_implicit_flush() will turn implicit flushing on or off. Implicit
flushing will result in a flush operation after every output call, so
that explicit calls to flush() will no longer be needed.
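A minimal sketch of what that looks like in practice (the loop is just a stand-in for whatever produces output incrementally):
<?php
// Flush automatically after every piece of output; no manual flush() calls needed.
ob_implicit_flush(true);

for ($i = 1; $i <= 5; $i++) {
    echo "step $i done\n";   // reaches the client immediately
    sleep(1);
}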
I'm quite new to CakePHP and am trying to debug code from someone else.
The problem is that I get a never-ending request, despite the fact that both the view and the controller seem to run properly. I even tried adding an exit; in both of them, and even introducing a syntax error in the controller; the request never ends and the browser keeps trying to load the page endlessly.
Here is the code of the controller:
public function categories()
{
    file_put_contents("/tmp/logfile.log", time() . " categories bla\n", FILE_APPEND);
    $catData = $this->SpecificKeywordCategorie->find('all');
    $modelnameLessValues = array();
    foreach ($catData as $singleCat) {
        $modelnameLessValues[] = $singleCat['SpecificKeywordCategorie'];
    }
    $this->set('categories', $modelnameLessValues);
    file_put_contents("/tmp/logfile.log", time() . " categories end blu\n", FILE_APPEND);
}
and the view code, categories.ctp:
<?php
file_put_contents("/tmp/logfile.log","view json ".json_encode($categories),FILE_APPEND);
print(json_encode($categories));
file_put_contents("/tmp/logfile.log","view json before exit",FILE_APPEND);
exit;
?>
All the file_put_contents entries are written to the logfile, but the exit seems to be ignored, and if I make the request in a browser it never ends...
The same thing happens if I add a syntax error to the controller or the view (of course, in this case, the log entries are not written).
I know nothing about CakePHP internals, but PHP scripts running outside it run fine on the same Apache instance.
Any idea where to look to find where this infinite request comes from?
We are running CakePHP 2.2.3.
Basically, I have a script that is included at the top of a page that does a bunch of things, the most important being an ob_start(). Then in the body of the page I have a variety of tags that will be replaced, such as {hello_word}. Then at the very end, I include another script that ends the output buffer, and makes the tag replacements with other code, then prints.
Is there any possible way to do this without having to include my second file at the end? Is there some simple way I can automatically execute a function or include a file at the very end?
You can register a function to be executed at the very end of the script using register_shutdown_function().
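A rough sketch of how that could replace the footer include (the tag name is the one from the question; the replacement logic is just a stand-in for your real code):
<?php
// header.php - included at the top of every page
ob_start();

register_shutdown_function(function () {
    // Runs once the script finishes, with the output buffer still available.
    $html = ob_get_clean();
    $html = str_replace('{hello_word}', 'Hello, world!', $html);
    echo $html;
});
With this in the top include, the page body can contain the tags as before and no second include is needed at the end.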
Any objects that you have remaining will be destroyed at the end of the script, and their destructors will be called (manual). You can put code that you want executed at the end in the destructor.
For example:
class Waitforme {
    function __destruct() {
        echo "I'm here!";
    }
}
$hello = new Waitforme();
This will do nothing until $hello is destroyed, at which time we'll see "I'm here!"
You can use the auto_append_file setting in php.ini, but you'll sacrifice portability. If you don't plan on distributing your application, this is a good option.
I have a PHP file that is fired by a cronjob every minute.
When the PHP file is fired, it updates the database, sleeps, etc.
It is programmed like this:
$start = microtime(true);
set_time_limit(10);
for ($i = 0; $i < 5; $i++) {
    updateDB();
    time_sleep_until($start + $i + 1);
}
If this piece of code is run, I don't see any changes happening in the database. Another thing I noticed is that when I echo something, it is printed all in one piece when the loop has ended, not line by line.
[edit] I tried using flush and ob_flush, but it still didn't print line by line. [/edit]
What can I do to avoid these problems? The database needs to be updated.
Another thing I was wondering is what the best way is to log this kind of thing. Can I log the results to a log file?
The loop itself looks fine. If it isn't updating your database, the error must be in your updateDB() function.
As to the echo thing: the output of scripts is often buffered. To force PHP to print it right away, you can either call flush() whenever you want the output flushed, or just call ob_implicit_flush() at the top of the script and it will flush automatically every time you print something.
Also, if you are calling the script via a browser, the browser itself may further buffer the response before showing it to you.
And as to the logging, the simplest way is to pick a file somewhere and just use file_put_contents() to print whatever you want logged. Note the FILE_APPEND flag for the third parameter.
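Putting the two suggestions together, here is a sketch of the loop with explicit flushing and simple file logging (it assumes, purely for illustration, that updateDB() returns something truthy on success; the log path is arbitrary):
<?php
$start = microtime(true);
set_time_limit(10);

ob_implicit_flush(true);   // print output as it happens instead of all at the end

for ($i = 0; $i < 5; $i++) {
    $ok = updateDB();

    // Log each iteration so the cron run leaves a trace even without a terminal.
    file_put_contents(
        '/tmp/cron.log',
        date('c') . " iteration $i " . ($ok ? 'ok' : 'FAILED') . PHP_EOL,
        FILE_APPEND
    );

    echo "iteration $i done\n";
    time_sleep_until($start + $i + 1);
}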
It looks like you are running from the command line; in this case you may want to write to stderr so that there is no buffering: $stderr = fopen('php://stderr', 'w');
For the logging, just open a file, write to it, and close it (fopen, fwrite, fclose).
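A short sketch of both suggestions (the log path is arbitrary):
<?php
// Unbuffered progress output when the script is run from cron or the CLI.
$stderr = fopen('php://stderr', 'w');
fwrite($stderr, date('c') . " starting update\n");

// Plain file logging with explicit open/write/close.
$log = fopen('/tmp/cron.log', 'a');
fwrite($log, date('c') . " iteration finished\n");
fclose($log);

fclose($stderr);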