Is there a clean way to parse WordPress logs? - php

I am writing a plugin that checks the log for errors every day and sends them over UDP.
My idea was to open debug.log and check, line by line, whether each error predates the last time I checked, and then whether it is a critical error or a warning.
That part is pretty easy, but the problem is that a single error can span more than one line!
And sometimes there is a stack trace (well, I could just skip that, because its lines always begin with #).
Here is my current code, but it doesn't work when an error spans multiple lines:
$path = fs_get_wp_config_path();
$path = $path . "/wp-content/debug.log";
$logs = file($path);
$date = get_option('last_date');
if ($date == false) {
    add_option('last_date', '27-Aug-2015 09:43:55 UTC');
    $date = get_option('last_date');
}
$last_date = new DateTime($date); // use the stored option, not a hard-coded string
for ($i = 0; $i < count($logs); $i++) {
    var_dump($logs[$i]);
    if (substr($logs[$i], 0, 12) == "Stack trace:") {
        $i++;
        // skip the numbered stack frames (check the bound before reading the line)
        while ($i < count($logs) && substr($logs[$i], 0, 1) == '#') {
            $i++;
        }
        $i++; // skip the trailing "thrown in ..." line
    } else {
        $log_date = substr($logs[$i], 1, 24);
        $new_date = new DateTime($log_date);
        //var_dump($new_date);
    }
}
Do you know how I can do that?
Thanks!
EDIT: This is a sample of my log file:
[27-Aug-2015 12:49:14 UTC] PHP Fatal error: Uncaught exception 'Exception' with message 'DateTime::__construct(): \
Failed to parse time string (tack trace:
) at position 0 (t): The timezone could not be found in the database' in /Users/opsone/Sites/wordpress/wp-content/p\
lugins/opsonemonitoring/log.php:51
Stack trace:
#0 /Users/opsone/Sites/wordpress/wp-content/plugins/opsonemonitoring/log.php(51): DateTime->__construct('tack trace\
:\n')
#1 /Users/opsone/Sites/wordpress/wp-content/plugins/opsonemonitoring/opsonemonitoring.php(30): get_log()
#2 /Users/opsone/Sites/wordpress/wp-content/plugins/opsonemonitoring/opsonemonitoring.php(142): opsoneMonitoring()
#3 [internal function]: mysettings_page('')
#4 /Users/opsone/Sites/wordpress/wp-includes/plugin.php(496): call_user_func_array('mysettings_page', Array)
#5 /Users/opsone/Sites/wordpress/wp-admin/admin.php(212): do_action('settings_page_m...')
#6 /Users/opsone/Sites/wordpress/wp-admin/options-general.php(10): require_once('/Users/opsone/S...')
#7 {main}
thrown in /Users/opsone/Sites/wordpress/wp-content/plugins/opsonemonitoring/log.php on line 51
[27-Aug-2015 12:52:04 UTC] PHP Fatal error: Uncaught exception 'Exception' with message 'DateTime::__construct(): \
Failed to parse time string (tack trace:
) at position 0 (t): The timezone could not be found in the database' in /Users/opsone/Sites/wordpress/wp-content/p\
lugins/opsonemonitoring/log.php:51
Stack trace:
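For the multi-line problem, here is a minimal sketch (not the plugin's exact logic; the regex and the usage below are my assumptions): treat any line that starts with a bracketed timestamp as the beginning of a new entry, and append every other line (wrapped messages, "Stack trace:", "#N ..." frames) to the entry before it. You can then compare each whole entry's timestamp against the last-checked date in one place.

```php
<?php
// Sketch: group debug.log lines into whole entries. A new entry starts with
// a bracketed timestamp like "[27-Aug-2015 12:49:14 UTC]"; every other line
// (wrapped messages, "Stack trace:", "#0 ..." frames) is a continuation of
// the previous entry.
function group_log_entries(array $lines): array
{
    $entries = [];
    foreach ($lines as $line) {
        if (preg_match('/^\[(\d{2}-[A-Za-z]{3}-\d{4} \d{2}:\d{2}:\d{2} [A-Za-z0-9\/_+\-]+)\]/', $line, $m)) {
            // Start of a new entry: keep its timestamp and first line.
            $entries[] = ['date' => new DateTime($m[1]), 'text' => $line];
        } elseif (!empty($entries)) {
            // Continuation line: attach it to the current entry.
            $entries[count($entries) - 1]['text'] .= $line;
        }
    }
    return $entries;
}

// Usage idea (path and option handling as in your plugin):
// $entries = group_log_entries(file($path));
// foreach ($entries as $entry) {
//     if ($entry['date'] > $last_date) {
//         // classify fatal vs. warning, then send $entry['text'] over UDP
//     }
// }
```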

Uncaught TypeError: DOMDocument::importNode()

I get the following error when running PHP locally:
Fri, 25 Mar 2022 03:11:55 +0000---Starting f_contracts with query 1
Fri, 25 Mar 2022 03:12:01 +0000---Starting XML -> JSON conversion
Warning: XMLReader::expand(): /private/tmp/redshift-dump.xml:1109: parser error : Extra content at the end of the document in /Users/hm/repo/f_contract_update/data.redshift.sync.hsdp/scripts/primary.php on line 1605
Warning: XMLReader::expand(): </table_data> in /Users/hm/repo/f_contract_update/data.redshift.sync.hsdp/scripts/primary.php on line 1605
Warning: XMLReader::expand(): ^ in /Users/hm/repo/f_contract_update/data.redshift.sync.hsdp/scripts/primary.php on line 1605
Warning: XMLReader::expand(): An Error Occurred while expanding in /Users/hm/repo/f_contract_update/data.redshift.sync.hsdp/scripts/primary.php on line 1605
Fatal error: Uncaught TypeError: DOMDocument::importNode(): Argument #1 ($node) must be of type DOMNode, bool given in /Users/hm/repo/f_contract_update/data.redshift.sync.hsdp/scripts/primary.php:1605
Stack trace:
#0 /Users/hm/repo/f_contract_update/data.redshift.sync.hsdp/scripts/primary.php(1605): DOMDocument->importNode(false, true)
#1 /Users/hm/repo/f_contract_update/data.redshift.sync.hsdp/scripts/primary.php(1582): Primary->mysqlDumpXmlToJson()
#2 /Users/hm/repo/f_contract_update/data.redshift.sync.hsdp/scripts/primary.php(1509): Primary->dumpData('f_contrac...', 1)
#3 /Users/hm/repo/f_contract_update/data.redshift.sync.hsdp/scripts/primary.php(1460): Primary->process('f_contrac...', 1)
#4 /Users/hm/repo/f_contract_update/data.redshift.sync.hsdp/scripts/primary.php(46): Primary->processFContracts()
#5 /Users/hm/repo/f_contract_update/data.redshift.sync.hsdp/scripts/primary.php(1665): Primary->__construct()
#6 {main}
thrown in /Users/hm/repo/f_contract_update/data.redshift.sync.hsdp/scripts/primary.php on line 1605
The exact same script runs perfectly on the server, but gives me the above error when run locally.
When I look at the local XML dump and compare it to the dump on the server, I notice that the local XML doesn't close off properly:
Compared to the server:
This is the script it is complaining about:
private function mysqlDumpXmlToJson()
{
    if (file_exists($this->json_file)) {
        unlink($this->json_file);
    }
    $z = new \XMLReader();
    $z->open($this->xml_file);
    $doc = new \DOMDocument();
    while ($z->read() && $z->name !== 'row');
    $f = fopen($this->json_file, 'a+');
    while ($z->name === 'row') {
        $data = [];
        $node = simplexml_import_dom($doc->importNode($z->expand(), true));
        foreach ($node as $col) {
            $value = (string)$col;
            $value = str_replace('0000-00-00 00:00:00', '', $value);
            $data[(string)$col['name']] = $value;
        }
        fwrite($f, json_encode($data));
        $z->next('row');
    }
    fclose($f);
}
Could it be that mysqldump is limiting the output size, and if so, where are those configuration settings?
EDIT:
The following steps get executed:
private function process($table, $query)
{
    $this->info('Starting ' . $table . ' with query ' . $query);
    $this->dumpData($table, $query);
    $this->info('Compressing JSON');
    $this->compressJson();
    $this->info('Moving to S3');
    $this->moveToS3();
    $this->info('Moving data to staging table');
    $this->copyData($table);
    $this->info('Finished ' . $table . ' with query ' . $query);
}
The failure occurs during dumpData:
private function dumpData($table, $query)
{
    if (file_exists($this->xml_file)) {
        unlink($this->xml_file);
    }
    if ($table == 'c') {
        system(sprintf(
            'mysql -h rds.sdp.com -u redshift -****** --xml --database onnet --execute "select columns from c" >%s',
            $this->xml_file
        ));
    } else {
        system(sprintf(
            'mysqldump --single-transaction --no-tablespaces -h rds -u redshift -***** --xml onnet %s --where="%s"> %s',
            $table,
            $query,
            $this->xml_file
        ));
    }
    if ($table == 's_m') {
        system(sprintf('sed -i "s/<..>/__#__/g" %s', $this->xml_file));
        system(sprintf('iconv -c -f utf8 -t ascii < %s > %s', $this->xml_file, $this->xml_file . '.tmp'));
        system(sprintf('strings %s > %s', $this->xml_file . '.tmp', $this->xml_file));
    }
    $this->info('Starting XML -> JSON conversion');
    $this->mysqlDumpXmlToJson();
    $this->info('Finished XML -> JSON conversion');
    return (filesize($this->json_file) > 1000);
}
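One thing worth adding while debugging this (a hedged sketch, not a drop-in fix): system() reports the shell exit code through its second argument, so a mysqldump that dies part-way (out of disk, killed, auth failure) can be caught before mysqlDumpXmlToJson() runs on a truncated file. A small wrapper such as:

```php
<?php
// Sketch: run a shell command and fail loudly on a non-zero exit code, so a
// partially written XML dump is caught before the XML -> JSON conversion.
function run_or_fail(string $cmd): void
{
    system($cmd, $exit_code);
    if ($exit_code !== 0) {
        throw new RuntimeException("command failed with exit code $exit_code: $cmd");
    }
}

// Usage idea, with the dump command from dumpData() (credentials masked):
// run_or_fail(sprintf(
//     'mysqldump --single-transaction --no-tablespaces -h rds -u redshift -***** --xml onnet %s --where="%s"> %s',
//     $table, $query, $this->xml_file
// ));
```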
Hope that clarifies.
The file paths are:
private $xml_file = '/tmp/rsh-dmp.xml';
private $json_file = '/tmp/rsh-dmp.json';
private $json_file_compressed = '/tmp/rsh-dmp.json.gz';
private $json_file_s3 = 's3://sdp-ew/rsh-dmp.json.gz';
You have stated that the exact same script runs correctly on prod while it fails locally, so the source code you have is capable of running correctly. The problem must be something else.
Now, we know that $this->xml_file is a file on the machine (prod or local), and it is XMLReader::expand() that complains first; the result is then a bool instead of a DOMNode, which strongly suggests the XML could not be parsed.
Since you have shown that the actual file ends unexpectedly, it is highly probable that the file is not well-formed.
First of all, you need a root node. From your example we can see that your root node is not being closed, so you either have some unclosed nodes, or you do not have a root node in the first place.
If you download the file from prod and test your local code with that file, you will see that it executes correctly. So the bug is at the place where the file is created/generated. If you have a function/module which generates the file, you will need to debug how it is generated and find out why the nodes are not being closed, or why the root node is not wrapped around your structure.
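To make that failure easier to spot, the conversion loop can check what XMLReader::expand() returns before handing it to importNode(). This is a hedged sketch of the method above as a standalone function (the file paths are parameters, and the 0000-00-00 cleanup is omitted for brevity):

```php
<?php
// Defensive sketch of the conversion: XMLReader::expand() returns false on
// malformed XML, and feeding that bool to DOMDocument::importNode() is what
// raises the TypeError above. Checking the return value fails fast with a
// message that points at the broken dump instead of at importNode().
function rows_to_json(string $xml_file, string $json_file): void
{
    if (file_exists($json_file)) {
        unlink($json_file);
    }
    $z = new XMLReader();
    $z->open($xml_file);
    $doc = new DOMDocument();
    while ($z->read() && $z->name !== 'row');   // skip ahead to the first <row>
    $f = fopen($json_file, 'a+');
    while ($z->name === 'row') {
        $expanded = $z->expand();
        if ($expanded === false) {
            fclose($f);
            throw new RuntimeException('XMLReader::expand() failed: the XML dump is malformed or truncated');
        }
        $node = simplexml_import_dom($doc->importNode($expanded, true));
        $data = [];
        foreach ($node as $col) {
            $data[(string)$col['name']] = (string)$col;
        }
        fwrite($f, json_encode($data));
        $z->next('row');
    }
    fclose($f);
}
```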

Php-fpm still logging my error even when using catch

The php-fpm error log file is still logging my error even though I use try/catch:
$NUM_OF_ATTEMPTS = 100;
$attempts = 0;
do {
    try {
        $db = new SQLite3('proxies/socks5.db');
        $results = $db->query('SELECT proxy FROM socks5proxies WHERE timeout <= ' . $settimeout . $countryq . ';');
        while ($row = $results->fetchArray()) {
            echo $row['proxy'] . "\r\n";
        }
    } catch (Exception $e) {
        $attempts++;
        sleep(1);
        continue;
    }
    break;
} while ($attempts < $NUM_OF_ATTEMPTS);
Expected result:
Retry on error, and don't log the error
Actual results:
Logs the error in the php-fpm error log file:
thrown in /var/www/html/api.php on line 200
[10-Jan-2019 14:00:49 UTC] PHP Warning: SQLite3::query(): Unable to prepare statement: 11, database disk image is malformed in /var/www/html/api.php on line 140
[10-Jan-2019 14:00:49 UTC] PHP Fatal error: Uncaught Error: Call to a member function fetchArray() on boolean in /var/www/html/api.php:141
Stack trace:
#0 {main}
thrown in /var/www/html/api.php on line 141
Call SQLite3::enableExceptions() to tell PHP to throw exceptions instead of raising standard errors:
try {
    $db = new SQLite3('proxies/socks5.db');
    $db->enableExceptions(true);
    $results = $db->query('...');
} catch (\Exception $e) {
}
In any case, if you need to do 100 attempts to get this to work, then this really isn't the angle you should be taking to fix it.
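Putting the two together, the retry loop with exceptions enabled might look like this (a sketch: the function and parameter names are mine, not from the question, and the back-off is simplified):

```php
<?php
// Sketch: retry loop with SQLite3 exceptions enabled, so failures are caught
// by the catch block instead of ending up in the php-fpm error log.
function fetch_proxies(string $dbPath, string $sql, int $maxAttempts = 100): array
{
    $attempts = 0;
    do {
        try {
            $db = new SQLite3($dbPath);
            $db->enableExceptions(true);   // throw instead of emitting warnings
            $results = $db->query($sql);
            $proxies = [];
            while ($row = $results->fetchArray(SQLITE3_ASSOC)) {
                $proxies[] = $row['proxy'];
            }
            return $proxies;               // success: stop retrying
        } catch (\Exception $e) {
            $attempts++;
            sleep(1);                      // brief back-off before the retry
        }
    } while ($attempts < $maxAttempts);
    throw new RuntimeException("query failed after $maxAttempts attempts");
}
```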

PHP Advanced HTMLDOM error

I had PHP AdvancedHTMLDOM working just fine for a long time. However, about a week ago I noticed that the data I am scraping is no longer being updated.
I ran the script manually and got the following error:
root@telemetry:/home/telemetry/scripts/pressure# php -f get_pressure_nodes.php
PHP Fatal error: Uncaught Error: Class 'DOMDocument' not found in /home/telemetry/scripts/pressure/advanced_html_dom-master/advanced_html_dom.php:171
Stack trace:
#0 /home/telemetry/scripts/pressure/advanced_html_dom-master/advanced_html_dom.php(167): AdvancedHtmlDom->load('<html>\n<head>\n<...', false)
#1 /home/telemetry/scripts/pressure/advanced_html_dom-master/advanced_html_dom.php(747): AdvancedHtmlDom->__construct('<html>\n<head>\n<...')
#2 /home/telemetry/scripts/pressure/advanced_html_dom-master/advanced_html_dom.php(748): str_get_html('<html>\n<head>\n<...')
#3 /home/telemetry/scripts/pressure/get_pressure_nodes.php(17): file_get_html('get_pressure_no...')
#4 {main}
thrown in /home/telemetry/scripts/pressure/advanced_html_dom-master/advanced_html_dom.php on line 171
root@telemetry:/home/telemetry/scripts/pressure#
Here is my script (I kept just the basics for simplicity; also, I got this code from somewhere I cannot recall, so if it is yours, please let me know so I can give credit where it is due):
<?php
require('advanced_html_dom-master/advanced_html_dom.php');

$html = file_get_html('get_pressure_nodes.html');
$table = $html->find('table', 1);

$rowData = array();
foreach ($table->find('tr') as $row) {
    // initialize array to store the cell data from each row
    $temp = array();
    foreach ($row->find('td') as $cell) {
        // push the cell's text to the array
        $temp[] = $cell->plaintext;
    }
    $rowData[] = $temp;
}

foreach ($rowData as $cell_contents) {
    print($cell_contents);
}
?>
If AdvancedHTMLDOM has stopped working suddenly, this SO Post could be of help in fixing it.
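"Class 'DOMDocument' not found" almost always means the DOM extension is missing or disabled for the CLI SAPI (on Debian/Ubuntu it ships in the php-xml package, and a PHP upgrade can silently drop it). A quick way to confirm, assuming nothing about your setup:

```php
<?php
// Diagnostic: check whether the current PHP SAPI actually has the DOM
// extension that AdvancedHtmlDom needs. Run it with the same binary that
// runs the scraper (e.g. `php -f check_dom.php`).
echo 'SAPI: ' . php_sapi_name() . PHP_EOL;
echo 'dom extension loaded: ' . (extension_loaded('dom') ? 'yes' : 'no') . PHP_EOL;
echo 'DOMDocument available: ' . (class_exists('DOMDocument') ? 'yes' : 'no') . PHP_EOL;
// If the answer is "no", reinstall/enable the extension (on Debian/Ubuntu:
// `sudo apt install php-xml`) and run the script again.
```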

I want to redirect to another controller's action

Error: exception 'Zend_Controller_Dispatcher_Exception' with message 'Invalid controller specified (phonenumber)' in /usr/share/php5/Zend/Controller/Dispatcher/Standard.php:248
Stack trace:
#0 /usr/share/php5/Zend/Controller/Front.php(954): Zend_Controller_Dispatcher_Standard->dispatch(Object(Zend_Controller_Request_Http), Object(Zend_Controller_Response_Http))
#1 /usr/share/php5/Zend/Application/Bootstrap/Bootstrap.php(97): Zend_Controller_Front->dispatch()
#2 /usr/share/php5/Zend/Application.php(366): Zend_Application_Bootstrap_Bootstrap->run()
#3 /home/bina/public_html/telco-portal-testing/public/index.php(22): Zend_Application->run()
#4 {main}
I want to forward to the phonenumber controller's addNew action, and I want to pass it the data from a CSV file:
if (!empty($files)) {
    $name = $files['csvfile']['tmp_name'];
    $row = 1;
    $handle = fopen($name, "r");
    if ($handle !== FALSE) {
        while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
            $num = count($data);
            // echo " $num fields in line $row: \n";
            $row++;
            for ($c = 0; $c < $num; $c++) {
                echo $data[$c];
            }
        }
        fclose($handle);
    }
    // $this->_forward('addNew','PhoneNumber');
    $this->_forward('addnew', 'phonenumber', null, array($data));
}
Try this:
$this->_redirect('/module/controller/action/');
You can try it like this:
$this->_redirect(array('controller' => 'controller_name', 'action' => 'action_name'));
I am not familiar with Zend Framework; I am saying this based on CakePHP, but I think it may work fine.
You can use the redirector helper like this, though you didn't mention which version of the framework you are using:
$this->_helper->redirector->gotoRoute(array('controller' => 'controllerName', 'action' => 'actionName'));
exit();
Hope that helps.
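For completeness, a sketch of the forward-with-data pattern in Zend Framework 1 (the class and parameter names here are my assumptions, not from the question). The dispatcher derives the controller class from the URL segment, so 'phonenumber' must correspond to a PhonenumberController.php in the controllers directory; "Invalid controller specified" means that lookup failed. Data passed as the fourth argument of _forward() should be a key => value array so the target action can read it back off the request:

```php
// Zend Framework 1 sketch (class and parameter names assumed, not runnable
// outside a ZF1 application).

class UploadController extends Zend_Controller_Action
{
    public function importAction()
    {
        $rows = array();
        // ... fill $rows from the CSV with fgetcsv(), as in the question ...

        // Forward to phonenumber/addnew, passing the data as a named param:
        $this->_forward('addnew', 'phonenumber', null, array('csvRows' => $rows));
    }
}

class PhonenumberController extends Zend_Controller_Action
{
    public function addnewAction()
    {
        // Read the forwarded parameter back off the request:
        $rows = $this->getRequest()->getParam('csvRows', array());
        foreach ($rows as $row) {
            // process one CSV row
        }
    }
}
```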

error with fetching tweets after refresh

First of all, I want to apologize if this is the most basic question ever! I'm not that good with PHP, but I'm learning.
I can't find a solution, or even understand why it keeps going wrong, and I do want to know why this is happening.
I'm trying to get the two latest tweets from a Twitter account. I don't want to use a massive (existing, I know) class or library that I don't understand, so I tried the following myself:
$timeline = "http://twitter.com/statuses/user_timeline.xml?screen_name=Mau_ries";
$data = file_get_contents($timeline);
$tweets = new SimpleXMLElement($data);
$i = 0;
foreach ($tweets as $tweet) {
    echo($tweet->text . " - " . $tweet->created_at);
    if (++$i == 2) break;
}
When I first ran this code I got the text of my tweets, but since refreshing the page I sometimes get the following error:
Warning: file_get_contents(http://twitter.com/statuses/user_timeline.xml?screen_name=Mau_ries) [function.file-get-contents]: failed to open stream: HTTP request failed! HTTP/1.0 400 Bad Request in /path/to/file on line 88
Fatal error: Uncaught exception 'Exception' with message 'String could not be parsed as XML' in /public/sites/www.singledays.nl/tmp/index.php:89
Stack trace:
#0 /public/sites/www.singledays.nl/tmp/index.php(89): SimpleXMLElement->__construct('')
#1 {main}
thrown in /path/to/file on line 89
Lines 88 & 89 are these:
$data = file_get_contents($timeline);
$tweets = new SimpleXMLElement($data);
Really weird: sometimes it works, sometimes not.
Does anybody know this issue and/or a solution? And why does the error seem to occur randomly (although it has now been erroring for a while already)?
Thanks!
$timeline = "http://twitter.com/statuses/user_timeline.xml?screen_name=Mau_ries";
$data = @file_get_contents($timeline);
if ($data) {
    $fh = fopen("cache/" . sha1($timeline), "w");
    fwrite($fh, $data);
    fclose($fh);
} else {
    $fh = @fopen("cache/" . sha1($timeline), "r");
    $data = "";
    while (!feof($fh)) {
        $data .= fread($fh, 1024); // append, don't overwrite
    }
    fclose($fh);
}
if (!$data) die("could not open url or find a cache of url locally");
$tweets = new SimpleXMLElement($data);
$i = 0;
foreach ($tweets as $tweet) {
    echo($tweet->text . " - " . $tweet->created_at);
    if (++$i == 2) break;
}
As everyone has said, when the download fails you should really fall back to a cached copy of the results; the code above caches the feed to a file and reads from that cache whenever the live request fails.
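The cache-fallback idea generalizes into a small helper. This sketch is mine, not from the answer above, and works for any URL (the intermittent 400s were typical of the old unauthenticated Twitter XML endpoint being rate-limited per IP):

```php
<?php
// Sketch of the fetch-with-cache pattern: try the live URL, store a copy on
// success, and fall back to the stored copy when the request fails (e.g. a
// rate-limited endpoint returning HTTP 400).
function fetch_with_cache(string $url, string $cacheDir): string
{
    $cacheFile = rtrim($cacheDir, '/') . '/' . sha1($url);
    $data = @file_get_contents($url);           // suppress the warning on failure
    if ($data !== false) {
        file_put_contents($cacheFile, $data);   // refresh the cache
        return $data;
    }
    if (is_readable($cacheFile)) {
        return file_get_contents($cacheFile);   // serve the last good copy
    }
    throw new RuntimeException("could not fetch $url and no cache exists");
}
```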
