I'm attempting to send a test email in Laravel 4.2 (Windows 7, IIS 6.1), and I've hit a silent termination: the request just fails, without returning my view, an error, or an Exception. I've brute-forced my way through the Laravel codebase and located the termination within Swift\Transport\AbstractSmtpTransport::_getFullResponse(), specifically the line $line = $this->_buffer->readLine($seq);:
protected function _getFullResponse($seq)
{
    $response = '';
    try {
        do {
            $line = $this->_buffer->readLine($seq);
            $response .= $line;
        } while (null !== $line && false !== $line && ' ' != $line{3});
    } catch (Swift_IoException $e) {
        $this->_throwException(
            new Swift_TransportException($e->getMessage())
        );
    } catch (Swift_TransportException $e) {
        $this->_throwException($e);
    }

    return $response;
}
That do loop executes twice. The first time, $line is assigned the value * OK The Microsoft Exchange IMAP4 service is ready., which is great, as I'm clearly reaching the server. Unfortunately, the second iteration fails in Swift\Transport\StreamBuffer::readLine() at the line $line = fgets($this->_out);:
public function readLine($sequence)
{
    if (isset($this->_out) && !feof($this->_out)) {
        $line = fgets($this->_out);
        if (strlen($line) == 0) {
            $metas = stream_get_meta_data($this->_out);
            if ($metas['timed_out']) {
                throw new Swift_IoException(
                    'Connection to ' .
                    $this->_getReadConnectionDescription() .
                    ' Timed Out'
                );
            }
        }

        return $line;
    }
}
I've tried wrapping that line in a try/catch, and nothing happens; the code just halts with no information on the second iteration of the do loop. So, any advice as to a) how to squeeze more information out of the halt, or b) what could cause fgets() to halt this way?
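For (a), one way to get more information is to put a read timeout on the socket yourself: with stream_set_timeout(), a blocked fgets() returns after the timeout instead of hanging indefinitely, and stream_get_meta_data() reports that it timed out. A minimal self-contained sketch, using a local throwaway socket in place of the Exchange server:

```php
<?php
// Sketch: show how a read timeout turns an indefinite fgets() block into a
// detectable condition. The local listening socket stands in for the mail server.
$server = stream_socket_server('tcp://127.0.0.1:0', $errno, $errstr);
$addr   = stream_socket_get_name($server, false);
$client = stream_socket_client('tcp://' . $addr, $errno, $errstr, 2);

stream_set_timeout($client, 1);        // cap blocking reads at 1 second
$line = fgets($client);                // the "server" never writes, so this times out
$meta = stream_get_meta_data($client);

var_dump($line);                       // bool(false)
var_dump($meta['timed_out']);          // bool(true)
```

This is the same mechanism StreamBuffer::readLine() relies on; without a timeout on the stream, fgets() simply blocks until the script's max_execution_time kills it.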
Ok, progress: after broadening my search to Swiftmailer in general, I was able to find mention of a timeout occurring at that particular line in Swiftmailer. By extending the max_execution_time in php.ini I was able to get an actual Exception:
Expected response code 220 but got code "", with message "* OK The Microsoft Exchange IMAP4 service is ready. * BYE Connection is closed. 13 "
I think under the circumstances I'll close this question and move on to figuring out why I'm not getting a 220 (the banner comes from Exchange's IMAP4 service, which suggests the SMTP transport is pointed at the IMAP port rather than the SMTP port).
Hi, I'm trying server-sent events (SSE) using PHP. I have an HTTPS URL from which I get live streaming data. Below is my script, where I read it in an infinite loop.
PHP:
<?php
while (1) {
    $get_stream_data = fopen('https://api.xyz.com:8100/update-stream/connect', 'r');
    if ($get_stream_data) {
        $stream_data = stream_get_contents($get_stream_data);
        $save_stream_data = getStreamingData($stream_data);
        if ($save_stream_data == true) {
            continue;
        }
    } else {
        sleep(1);
        continue;
    }
}

function getStreamingData($stream_data)
{
    $to = "accd@xyz.com";
    $subject = "Stream Details";
    $msg = "Stream Details : " . $stream_data;
    $headers = "From:streamdetail@xyz.com";
    $mailsent = mail($to, $subject, $msg, $headers);
    if ($mailsent) {
        return true;
    } else {
        return false;
    }
}
?>
Error:
Warning: fopen(https://api.xyz.com:8100/update-stream/connect): failed to open stream: Connection timed out in /home/public_html/get_stream_data/index.php on line 4
I can't get the data on my end, even though the server is sending live updates.
I verified the live stream from a command prompt using the command below.
CURL
curl --get 'https://api.xyz.com:8100/update-stream/connect' --verbose
First, this is best done with PHP's curl functions. See the various answers to PHP file_get_contents() returns "failed to open stream: HTTP request failed!"
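To illustrate the curl route, here is a minimal sketch using CURLOPT_WRITEFUNCTION so each chunk is processed as it arrives. The question's URL is a placeholder, so a local file:// URL stands in for the stream to keep the sketch runnable:

```php
<?php
// Sketch: stream a response chunk-by-chunk with curl's write callback.
// A temp file stands in for the question's https://api.xyz.com:8100/... stream.
$tmp = tempnam(sys_get_temp_dir(), 'sse');
file_put_contents($tmp, "data: hello\n\ndata: world\n\n");

$received = '';
$ch = curl_init('file://' . $tmp);
curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $chunk) use (&$received) {
    $received .= $chunk;       // hand each chunk to your handler (e.g. getStreamingData)
    return strlen($chunk);     // must return the number of bytes consumed
});
curl_exec($ch);
curl_close($ch);
unlink($tmp);

echo $received;
```

Against the real HTTPS endpoint you would also set CURLOPT_CONNECTTIMEOUT, and leave CURLOPT_TIMEOUT at 0 for a long-lived stream.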
If you stick with fopen() you probably need to set up the context for SSL, and it may involve installing some certificates. See file_get_contents(): SSL operation failed with code 1. And more (and note the security warning about the accepted answer)
Finally, your while(1) loop is around the fopen() (which is okay for restarts after relatively rare failures), but you actually want one inside as well. Here is your code with just the minimal changes to show that:
<?php
while (1) {
    $get_stream_data = fopen('https://api.xyz.com:8100/update-stream/connect', 'r');
    if ($get_stream_data) {
        while (1) {
            $stream_data = stream_get_contents($get_stream_data);
            $save_stream_data = getStreamingData($stream_data);
            if ($save_stream_data == true) {
                continue;
            }
            sleep(1);
        }
    } else {
        sleep(1);
        continue;
    }
}
UPDATE: The above code still nags at me: I think you want to be using fread() instead of stream_get_contents(), and rely on blocking reads instead of the sleep(1) in the inner loop.
BTW, I'd suggest changing the outer-loop sleep(1) to be sleep(3) or sleep(5) which are typical defaults in Chrome/Firefox/etc. (Really, you should be looking for the SSE server sending a "retry" header, and using that number as the sleep.)
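To sketch the fread()-plus-blocking variant mentioned in the update (the helper name and chunk size are my own, not from the question; an in-memory stream stands in for the feed so the sketch is runnable):

```php
<?php
// Sketch: read a stream in chunks with blocking fread() instead of
// stream_get_contents() + sleep().
function readStreamChunks($fp, callable $onChunk, int $chunkSize = 8192): void
{
    stream_set_blocking($fp, true);        // fread() waits for data instead of spinning
    while (!feof($fp)) {
        $chunk = fread($fp, $chunkSize);
        if ($chunk !== false && $chunk !== '') {
            $onChunk($chunk);              // e.g. hand off to getStreamingData()
        }
    }
}

// demo with an in-memory stream standing in for the HTTPS event stream
$fp = fopen('php://memory', 'r+');
fwrite($fp, "event-1\nevent-2\n");
rewind($fp);

$received = '';
readStreamChunks($fp, function ($chunk) use (&$received) { $received .= $chunk; });
fclose($fp);

echo $received;
```

With a blocking socket, the inner loop simply sleeps inside fread() until the server sends the next event, so no manual sleep() is needed there.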
I'm not an expert with PHP. I have a function which uses exec() to run WINRS, which then runs commands on remote servers. The problem is that this function is placed into a loop which calls the getservicestatus function dozens of times. Sometimes the WINRS command can get stuck or take longer than expected, causing the PHP script to time out and throw a 500 error.
Temporarily, I've lowered the timeout value in php.ini and created a custom 500 page in IIS: if the referring page equals the script name, the page reloads (otherwise it throws an error). But this is messy, and since the timeout is global it doesn't apply per function call; it only avoids the page stopping at the HTTP 500 error.
What I'd really like to do is set a timeout of 5 seconds on the function itself. I've searched quite a bit, including on Stack Overflow; there are similar questions, but none I could relate to my function. Perhaps there's a way to do this when executing the command, such as an alternative to exec()? Ideally, the function would time out after 5 seconds and return $servicestate as 0.
Code is commented to explain my spaghetti mess. And I'm sorry you have to see it...
function getservicestatus($servername, $servicename, $username, $password)
{
    //Define start so that if an invalid result is reached the function can be restarted using goto.
    start:

    //Note: exec() appends to $output, so clear it before each attempt or the retry will re-read stale lines.
    unset($output);

    //Define the command used to get the service status.
    $command = 'winrs /r:' . $servername . ' /u:' . $username . ' /p:' . $password . ' sc query ' . $servicename . ' 2>&1';
    exec($command, $output);

    //The service status is stored in the fourth element of the output array ($output[3]).
    //The string "STATE", colons, numbers, and whitespace are then stripped, leaving only the status of the service (e.g. RUNNING or STOPPED).
    $servicestate = $output[3];
    $strremove = array('/STATE/', '/:/', '/[0-9]+/', '/\s+/');
    $servicestate = preg_replace($strremove, '', $servicestate);

    //Sometimes the output array is invalid. Typically this can be caught when the string "SERVICE_NAME" is found in $output[3].
    //Catch this issue and restart the function to get valid output.
    $badservicestate = "SERVICE_NAME" . $servicename;
    if ($servicestate == $badservicestate) {
        goto start;
    }

    //The service status (e.g. RUNNING or STOPPED) is returned as $servicestate.
    return $servicestate;
}
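One caveat with the goto retry: if the output never becomes valid, the function loops forever. A bounded-retry sketch of the same parsing logic (the helper name and attempt count are my own illustration, with hard-coded sample output in place of the winrs call):

```php
<?php
// Sketch: bounded retry instead of goto. parseServiceState() returns null when
// the output looks invalid, and the caller retries a limited number of times.
function parseServiceState(array $output, string $servicename): ?string
{
    $state = preg_replace(array('/STATE/', '/:/', '/[0-9]+/', '/\s+/'), '', $output[3] ?? '');
    // invalid output typically surfaces as "SERVICE_NAME<name>" after stripping
    return ($state === 'SERVICE_NAME' . $servicename) ? null : $state;
}

$state = null;
for ($attempt = 0; $attempt < 3 && $state === null; $attempt++) {
    $output = array();                 // exec() appends, so reset between attempts
    // exec($command, $output);        // real call; sample data used here instead
    $output = array('SERVICE_NAME: Spooler', 'TYPE : 10', '', 'STATE : 4 RUNNING');
    $state = parseServiceState($output, 'Spooler');
}

echo $state;    // RUNNING
```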
The most straightforward solution, since you are calling an external process, and you actually need its output in your script, is to rewrite exec in terms of proc_open and non-blocking I/O:
function exec_timeout($cmd, $timeout, &$output = '') {
    $fdSpec = [
        0 => ['file', '/dev/null', 'r'], //nothing to send to child process
        1 => ['pipe', 'w'], //child process's stdout
        2 => ['file', '/dev/null', 'a'], //don't care about child process stderr
    ];
    $pipes = [];
    $proc = proc_open($cmd, $fdSpec, $pipes);
    stream_set_blocking($pipes[1], false);
    $stop = time() + $timeout;
    while (1) {
        $in = [$pipes[1]];
        $out = [];
        $err = [];
        //wait for output, at most a second at a time, and never with a negative timeout
        stream_select($in, $out, $err, max(0, min(1, $stop - time())));
        if ($in) {
            $output .= stream_get_contents($in[0]); //non-blocking, see stream_set_blocking() above
            if (feof($in[0])) {
                break;
            }
        } elseif ($stop <= time()) {
            break;
        }
    }
    fclose($pipes[1]); //close process's stdout, since we're done with it
    $status = proc_get_status($proc);
    if ($status['running']) {
        proc_terminate($proc); //terminate, since proc_close() would block until the process exits on its own
        return -1;
    } else {
        proc_close($proc);
        return $status['exitcode'];
    }
}
$returnValue = exec_timeout('YOUR COMMAND HERE', $timeout, $output);
This code:
uses proc_open to open a child process. We only specify the pipe for the child's stdout, since we have nothing to send to it, and don't care about its stderr output. if you do, you'll have to adjust the following code accordingly.
Loops on stream_select(), which will block for a period up to the $timeout set ($stop - time()).
If there is input, it appends the contents of the input buffer to $output. This won't block, because we have stream_set_blocking($pipes[1], false) on the pipe.
When we have read the entire file, or we have exceeded our timeout, stop.
Cleanup by closing the process we have opened.
Output is stored in the pass-by-reference string $output. The process's exit code is returned, or -1 in the case of a timeout.
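As a self-contained illustration of the terminate-on-timeout behavior, here is a stripped-down polling variant of the same pattern (not the original function; the command names are Unix examples, whereas the question targets Windows/winrs):

```php
<?php
// Sketch: run a command, but give up and terminate it after $timeout seconds.
function runWithTimeout(string $cmd, int $timeout): int
{
    $spec = array(
        0 => array('file', '/dev/null', 'r'),  // nothing to send to the child
        1 => array('pipe', 'w'),               // child's stdout (ignored here)
        2 => array('file', '/dev/null', 'a'),  // discard stderr
    );
    $proc = proc_open($cmd, $spec, $pipes);
    $stop = time() + $timeout;
    do {
        usleep(100000);                        // poll every 0.1s
        $status = proc_get_status($proc);
    } while ($status['running'] && time() < $stop);
    fclose($pipes[1]);
    if ($status['running']) {
        proc_terminate($proc);                 // still running: kill it
        proc_close($proc);
        return -1;                             // same timeout convention as above
    }
    proc_close($proc);
    return $status['exitcode'];
}

var_dump(runWithTimeout('sleep 5', 1));   // int(-1): timed out
var_dump(runWithTimeout('true', 5));      // int(0): finished normally
```

In the question's loop, a -1 return would be the place to set $servicestate to 0, as the asker wanted.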
I use PDO with try and catch so I can pass the error to a user-defined function. I want to know if this method is acceptable.
The Try and Catch:
try {
    ...
} catch (Exception $e) {
    // Process the error
    $msg = $e->getMessage();
    $timestamp = date("Y-m-d H:i:s");
    $line = $e->getLine();
    $code = $e->getCode();
    handle_error($msg, $timestamp, $line, $code);
    die("Oops! It looks like we got an error here! Try again!");
}
The error handling function:
function handle_error($msg, $timestamp, $line, $code) {
    $file = 'errorlog.txt';
    $data = "$timestamp // Error Message: $msg | Error Code: $code | Error Line: $line \n";
    file_put_contents($file, $data, FILE_APPEND);
    return;
}
Thanks
You could simplify the log writing by changing your code similar to this
catch (PDOException $e) {
    $my_error = "Failed: " . $e;
    error_log(print_r($my_error, TRUE));
}
The only thing I could think of that could go wrong is that if, by some chance, a user finds a way to trigger PDO errors, he can basically DDoS the system by triggering them over and over, filling up your hard drive with the same error. Not to mention it can use up all your disk I/O bandwidth while being exploited.
Then again, Apache does similar logging and I haven't heard of HDDs being filled up even while under DDoS attacks.
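To address that disk-filling concern, the logging function can be made to stop appending once the file reaches a cap (the one-megabyte default and function name here are my own illustration):

```php
<?php
// Sketch: append to the error log only while it stays under a size cap.
function handle_error_capped(string $file, string $msg, int $maxBytes = 1048576): void
{
    clearstatcache(true, $file);
    if (is_file($file) && filesize($file) >= $maxBytes) {
        return;                        // cap reached: stop growing the log
    }
    $data = date('Y-m-d H:i:s') . " // Error Message: $msg\n";
    file_put_contents($file, $data, FILE_APPEND);
}

$log = tempnam(sys_get_temp_dir(), 'errlog');
handle_error_capped($log, 'first error');
handle_error_capped($log, 'second error', 10);   // cap of 10 bytes already exceeded
echo file_get_contents($log);                     // only the first entry survives
```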
I have a gateway script that returns JSON back to the client.
In the script I use set_error_handler to catch errors and still have a formatted return.
It is subject to 'Allowed memory size exhausted' errors, but rather than increase the memory limit with something like ini_set('memory_limit', '19T'), I just want to return that the user should try something else, because the request used too much memory.
Are there any good ways to catch fatal errors?
As this answer suggests, you can use register_shutdown_function() to register a callback that'll check error_get_last().
You'll still have to manage the output generated from the offending code, whether by the @ (shut-up) operator or ini_set('display_errors', false).
ini_set('display_errors', false);
error_reporting(-1);

set_error_handler(function($code, $string, $file, $line){
    throw new ErrorException($string, null, $code, $file, $line);
});

register_shutdown_function(function(){
    $error = error_get_last();
    if (null !== $error) {
        echo 'Caught at shutdown';
    }
});

try {
    $data = '';
    while (true) {
        $data .= str_repeat('#', PHP_INT_MAX);
    }
} catch (\Exception $exception) {
    echo 'Caught in try/catch';
}
When run, this outputs Caught at shutdown. Unfortunately, no ErrorException is thrown, because the fatal error terminates the script immediately; the error is caught only in the shutdown function.
You can check the $error array in the shutdown function for details on the cause, and respond accordingly. One suggestion could be reissuing the request back against your web application (at a different address, or with different parameters of course) and return the captured response.
I recommend keeping error_reporting() high (a value of -1) though, and using (as others have suggested) error handling for everything else with set_error_handler() and ErrorException.
If you need to execute business code when this error happens (logging, backing up the context for future debugging, emailing, etc.), registering a shutdown function is not enough: you have to free memory somehow.
One solution is to allocate some emergency memory somewhere:
public function initErrorHandler()
{
    // This storage is freed on error (the case of allowed memory exhausted)
    $this->memory = str_repeat('*', 1024 * 1024);
    register_shutdown_function(function () {
        $this->memory = null;
        if ((!is_null($err = error_get_last())) && (!in_array($err['type'], array(E_NOTICE, E_WARNING)))) {
            // $this->emergencyMethod($err);
        }
    });

    return $this;
}
You can get the peak memory consumed by the process using memory_get_peak_usage(); the documentation is at http://www.php.net/manual/en/function.memory-get-peak-usage.php. I think it would be easier to add a condition that redirects or stops the process just before the memory limit is reached. :)
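A sketch of that idea: compute the remaining headroom against memory_limit and bail out before attempting a big allocation. The parsing below handles only the common K/M/G suffixes, and the 50 MB threshold is an arbitrary illustration:

```php
<?php
// Sketch: estimate remaining memory headroom before a large allocation.
function memoryHeadroom(): int
{
    $limit = ini_get('memory_limit');
    if ($limit === '-1') {
        return PHP_INT_MAX;            // no limit configured
    }
    $units = array('K' => 1024, 'M' => 1048576, 'G' => 1073741824);
    $suffix = strtoupper(substr($limit, -1));
    $bytes = (int) $limit * ($units[$suffix] ?? 1);
    return $bytes - memory_get_usage();
}

$needed = 50 * 1048576;                // say the next step needs ~50 MB
if (memoryHeadroom() < $needed) {
    // stop or redirect here instead of hitting the fatal "allowed memory exhausted"
    echo 'Not enough memory for this request, please try something smaller.';
}
```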
While @alain-tiemblo's solution works perfectly, this script shows how you can reserve some memory in a PHP script outside of an object's scope.
Short Version
// memory is an object and it is passed by reference
function shutdown($memory) {
    // unsetting $memory does not free up memory
    // I also tried unsetting a global variable, which did not free up the memory either
    unset($memory->reserve);
}

$memory = new stdClass();
// reserve 3 megabytes (each '❤' is 3 bytes in UTF-8)
$memory->reserve = str_repeat('❤', 1024 * 1024);
register_shutdown_function('shutdown', $memory);
Full Sample Script
<?php
function getMemory() {
    return ((int) (memory_get_usage() / 1024)) . 'KB';
}

// memory is an object and it is passed by reference
function shutdown($memory) {
    echo 'Start Shut Down: ' . getMemory() . PHP_EOL;
    // unsetting $memory does not free up memory
    // I also tried unsetting a global variable, which did not free up the memory either
    unset($memory->reserve);
    echo 'End Shut Down: ' . getMemory() . PHP_EOL;
}

echo 'Start: ' . getMemory() . PHP_EOL;

$memory = new stdClass();
// reserve 3 megabytes
$memory->reserve = str_repeat('❤', 1024 * 1024);
echo 'After Reserving: ' . getMemory() . PHP_EOL;

unset($memory);
echo 'After Unsetting: ' . getMemory() . PHP_EOL;

$memory = new stdClass();
// reserve 3 megabytes
$memory->reserve = str_repeat('❤', 1024 * 1024);
echo 'After Reserving again: ' . getMemory() . PHP_EOL;

// pass the $memory object to the shutdown function
register_shutdown_function('shutdown', $memory);
And the output would be:
Start: 349KB
After Reserving: 3426KB
After Unsetting: 349KB
After Reserving again: 3426KB
Start Shut Down: 3420KB
End Shut Down: 344KB
So:
@fopen($file, 'r');
Suppresses any error and continues.
fopen($file, 'r') or die("Unable to retrieve file");
Kills the program on failure and prints a custom message (the warning itself still appears unless suppressed).
Is there an easy way to ignore errors from a function, print a custom error message and not kill the program?
Typically:
if (!($fp = @fopen($file, 'r'))) echo "Unable to retrieve file";
or using your way (which discards the file handle):
@fopen($file, 'r') or printf("Unable to retrieve file");
Use Exceptions:
try {
    fopen($file);
} catch (Exception $e) {
    /* whatever you want to do in case of an error */
}
More information at http://php.net/manual/language.exceptions.php
slosd's way won't work: fopen() doesn't throw an exception, so you have to throw it manually.
I'll modify your second example and combine it with slosd's:
try {
    if (!$f = fopen(...)) {
        throw new Exception('Error opening file!');
    }
} catch (Exception $e) {
    echo $e->getMessage() . ' ' . $e->getFile() . ' at line ' . $e->getLine();
}

echo ' ... and the code continues ...';
Here's my own solution. Note that it needs either a script-level global or a static class variable for easy reference. I wrote it class-style, but as long as the code can find the array it's fine.
class Controller {
    static $errors = array();
}

$handle = fopen($file, 'r') or array_push(Controller::$errors,
    "File \"{$file}\" could not be opened.");
// ...print the errors in your view
Instead of dying, you could throw an exception and do the error handling centrally, in whichever fashion you see fit :-)