Show CoreTemp Remote Monitoring Temperature on Webpage - php

I would like to show what CoreTemp is monitoring on my personal website. As far as I can tell, CoreTemp reports in JSON format. However, when I run a cron job or make a GET request, nothing ever comes back from the CoreTemp page. I think this may be because it has to collect 5 results before displaying them on the local webpage at http://localhost:5200/.
Example output from CoreTemp Monitoring page:
{"CpuInfo":
{"uiLoad":[2,3],
"uiTjMax":[100],
"uiCoreCnt":2,
"uiCPUCnt":1,
"fTemp":[49,48],
"fVID":1.065918,
"fCPUSpeed":3292.04028,
"fFSBSpeed":99.7588,
"fMultiplier":33,
"CPUName":"Intel Core i5 4310M (Haswell) ",
"ucFahrenheit":0,
"ucDeltaToTjMax":0,
"ucTdpSupported":1,
"ucPowerSupported":1,
"uiStructVersion":2,
"uiTdp":[37],
"fPower":[5.889584],
"fMultipliers":[33,33]},
"MemoryInfo":{
"TotalPhys":16282,
"FreePhys":8473,
"TotalPage":17304,
"FreePage":8473,
"TotalVirtual":8388608,
"FreeVirtual":8388003,
"FreeExtendedVirtual":1,
"MemoryLoad":47}}
Again, I'm stuck getting any data to show up at all from a simple GET or cURL request in PHP.
Simple PHP code:
<?php
$exe = curl_init();
curl_setopt($exe, CURLOPT_URL, "http://localhost:5200/");
curl_setopt($exe, CURLOPT_HEADER, 0);         // exclude response headers from the output
curl_setopt($exe, CURLOPT_RETURNTRANSFER, 1); // return the body instead of printing it
$raw = curl_exec($exe);
curl_close($exe);
echo $raw;
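To narrow down whether the request itself fails or just returns an empty body, here is a variant with basic error reporting plus a json_decode of the payload; a minimal sketch, where only the URL and the fTemp field are taken from the output above:
<?php
$exe = curl_init();
curl_setopt($exe, CURLOPT_URL, "http://localhost:5200/");
curl_setopt($exe, CURLOPT_RETURNTRANSFER, 1);
$raw = curl_exec($exe);
if ($raw === false) {
    // Transport-level failure: connection refused, timeout, etc.
    die("cURL error: " . curl_error($exe));
}
$status = curl_getinfo($exe, CURLINFO_HTTP_CODE);
curl_close($exe);
$data = json_decode($raw);
if ($data === null) {
    die("HTTP $status, but the body is not valid JSON: " . $raw);
}
// fTemp holds one reading per core, per the sample output above
echo "Core temps: " . implode(", ", $data->CpuInfo->fTemp);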
What am I missing, or is this a problem with the monitoring plug-in itself?

Related

How can I make this PHP script run faster/asynchronously?

I have a pastebin scraper script, which is designed to find leaked emails and passwords, to make a website like HaveIBeenPwned.
Here is what my script is doing:
- Scraping Pastebin links from https://psbdmp.ws/dumps
- Getting a random proxy using this Random Proxy API (because Pastebin bans your IP if you hammer too many requests): https://api.getproxylist.com/proxy
- Doing a CURL request to the Pastebin links, then doing a preg_match_all to find all the email addresses and passwords in the format email:password.
The actual script seems to be working all right, but it isn't optimized enough, and it gives me a 524 timeout error after some time, which I suspect is because of all those cURL requests. Here is my code:
api.php
function comboScrape_CURL($url) {
    // Get a random proxy
    $json = file_get_contents("https://api.getproxylist.com/proxy");
    $decoded = json_decode($json);
    $proxy = $decoded->ip . ':' . $decoded->port;
    // Crawl through the proxy
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_PROXY, $proxy);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_HEADER, 1);
    $curl_scraped_page = curl_exec($ch);
    curl_close($ch);
    comboScrape('email:pass', $curl_scraped_page);
}
index.php
require('api.php');
$expression = "/(?:https:\/\/pastebin\.com\/\w+)/";
// Page suffixes for https://psbdmp.ws/dumps/ ('' is the first page)
$extension = ['','1','2','3','4','5','6','7','8','9','10','11','12','13','14','15','16','17','18','19','20'];
foreach ($extension as $pge_number) {
    $dumps = file_get_contents("https://psbdmp.ws/dumps/" . $pge_number);
    preg_match_all($expression, $dumps, $urls);
    $codes = str_replace('https://pastebin.com/', '', $urls[0]);
    foreach ($codes as $code) {
        comboScrape_CURL("https://pastebin.com/raw/" . $code);
    }
}
524 timeout error - err, it seems you're running PHP behind a web server (Apache? nginx? lighttpd? IIS?). Don't do that; run your code from php-cli instead. php-cli can run indefinitely and never times out.
because Pastebin bans your IP if you hammer too many requests - buy a pastebin.com pro account instead (https://pastebin.com/pro). It costs about $50 (or $20 around Christmas & Black Friday), is a lifetime account with a one-time payment, and gives you access to the scraping API (https://pastebin.com/doc_scraping_api). With the scraping API you can fetch about 1 paste per second, or 86,400 pastes per day, without getting IP banned.
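For reference, a rough sketch of polling the scraping API; the endpoint names are taken from the linked docs (treat them as assumptions to verify there), it requires the pro account's whitelisted IP, and comboScrape is the matcher from api.php above:
<?php
// Sketch only: endpoints per https://pastebin.com/doc_scraping_api
$list = json_decode(file_get_contents("https://scrape.pastebin.com/api_scraping.php?limit=100"));
foreach ($list as $paste) {
    $raw = file_get_contents("https://scrape.pastebin.com/api_scrape_item.php?i=" . $paste->key);
    comboScrape('email:pass', $raw); // reuse the matcher from api.php
    sleep(1);                        // stay near the ~1 paste/second limit
}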
And because of pastebin.com's rate limits, there is no need to do this asynchronously with multiple connections (it's possible, but not worth the hassle; if you actually needed to do that, you'd have to use the curl_multi API).

How can I decrease CURLOPT php load times on my page?

My website is pausing for a second when running my PHP code. What can I do to speed it up?
I can't put the PHP code at the bottom of the page and use CSS "position: absolute" to move it back to the top because of responsive web design issues with iPhones.
I don't really want to remove the 2 timeouts because if the 3rd party website (blockchain.info) goes offline the page won't load.
The code is for an advertisement to be displayed after reading a bitcoin wallet balance. The code you see below is copied 7 times, once for each of the 7 ads on the page (ad1, ad2, ad3, etc.).
I know HTML and CSS but don't really know much about PHP or javascript/jQuery etc. (but I can copy and paste).
$ch = curl_init('http://whateverlink.com' . $address);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 1);
curl_setopt($ch, CURLOPT_TIMEOUT, 1);
$ad1 = (float) curl_exec($ch) / 100000000; // response is in satoshi; convert to BTC
curl_close($ch);
if ($ad1 > 0.001) {
    echo 'Display Ad1';
} elseif ($ad1 > -0.0001) {
    echo 'No ad yet';
}
Multi-curl can speed up your script.
You should have a look at the answer to this question: PHP Parallel curl requests
and the following PHP class, which provides an easy interface for running multiple concurrent cURL requests:
https://github.com/petewarden/ParallelCurl
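As a concrete illustration, a minimal curl_multi sketch that fires all 7 balance requests at once, so the total wait is roughly the slowest response rather than the sum of all 7. The $address1..$address7 variables are hypothetical stand-ins for the 7 ad addresses from the question:
<?php
// Hypothetical stand-ins for the 7 ad addresses.
$addresses = [$address1, $address2, $address3, $address4, $address5, $address6, $address7];
$mh = curl_multi_init();
$handles = [];
foreach ($addresses as $i => $addr) {
    $ch = curl_init('http://whateverlink.com' . $addr);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT, 1);
    curl_multi_add_handle($mh, $ch);
    $handles[$i] = $ch;
}
// Drive all transfers to completion.
do {
    curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh); // wait for activity instead of busy-looping
    }
} while ($running > 0);
$balances = [];
foreach ($handles as $i => $ch) {
    $balances[$i] = (float) curl_multi_getcontent($ch) / 100000000; // satoshi -> BTC
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);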

php cURL request to Instagram API creates page lag

I am using cURL to access Instagram's API on a webpage I am building. The functionality works great; however, page load is sacrificed. For instance, consider this DOM structure:
Header
Article
Instagram Photos (retrieved via cURL)
Footer
When loading the page, the footer will not load until the Instagram photos have been fully retrieved with cURL. Below is the cURL function being called:
function fetchData($url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_TIMEOUT, 20);
    $result = curl_exec($ch);
    curl_close($ch);
    return $result;
}
$result = fetchData("https://api.instagram.com/v1/media/search?lat={$lat}&lng={$lng}&distance={$distance}&access_token={$accessToken}");
$result = json_decode($result);
Only after this function has run is the rest of the DOM displayed. If I move the function call below the footer, it does not work.
Is there anything I can do to load the entire webpage first and have the cURL request run on top of the loaded site (not cause a lag or holdup)?
UPDATE: Is the best solution to load it after the footer, and then append it to another area with js?
You can cache the result JSON in a file that is saved locally, and set up a cronjob that runs every minute to update the local cache file. This makes your page load much faster. The downside is that your cache is updated even when you have no visitors, and the Instagram data can be up to a minute stale.
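A minimal sketch of that caching approach, assuming a hypothetical cache path; here the refresh happens lazily on request rather than via cron, which also avoids updating when there are no visitors:
<?php
$cacheFile = '/tmp/instagram.json'; // hypothetical cache location
// Serve the cached copy if it is less than a minute old.
if (file_exists($cacheFile) && time() - filemtime($cacheFile) < 60) {
    $raw = file_get_contents($cacheFile);
} else {
    $raw = fetchData("https://api.instagram.com/v1/media/search?lat={$lat}&lng={$lng}&distance={$distance}&access_token={$accessToken}");
    file_put_contents($cacheFile, $raw);
}
$result = json_decode($raw);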

How to process XML returned after submitting a form?

I have just started a project that involves sending data using POST in HTML forms to another company's server. This returns XML, which I need to process to display certain information on a web page.
I am using PHP and have no idea where to start with accessing the XML. Once I know how to get at it, I can read it using XPath.
Any tips on how to get started, or links to sites with information on this, would be very useful.
You should check out the DOMDocument() class; it comes as part of the standard PHP installation on most systems.
http://us3.php.net/manual/en/class.domdocument.php
Ohhh, I see. You should set up a PHP script that the user's form posts to. To process the XML response, that script should then pass those fields on to the remote server using cURL.
http://us.php.net/manual/en/book.curl.php
A simple example would be something like:
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://the_remote_server");
curl_setopt($ch, CURLOPT_POST, TRUE);
curl_setopt($ch, CURLOPT_POSTFIELDS, $_POST);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE); // so curl_exec() returns the XML body
$YourXMLResponse = curl_exec($ch);
curl_close($ch);
?>
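From there, a minimal sketch of reading the response with DOMDocument and XPath, as suggested above; the //status element name is a placeholder for whatever the remote server actually returns:
<?php
$doc = new DOMDocument();
$doc->loadXML($YourXMLResponse); // $YourXMLResponse from the cURL call above
$xpath = new DOMXPath($doc);
foreach ($xpath->query('//status') as $node) { // '//status' is a placeholder element
    echo htmlspecialchars($node->nodeValue);
}
?>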

cURL PHP: Connecting to Invoicing System

I am attempting to use cURL to connect to a page like this: https://clients.mindbodyonline.com/asp/home.asp?studioid=851 with the following code:
<?php
$curl_handle = curl_init();
curl_setopt($curl_handle, CURLOPT_URL, 'https://clients.mindbodyonline.com/asp/home.asp?studioid=851');
//curl_setopt($curl_handle, CURLOPT_CONNECTTIMEOUT, 2);
curl_setopt($curl_handle, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($curl_handle, CURLOPT_HTTPAUTH, CURLAUTH_ANY);
curl_setopt($curl_handle, CURLOPT_COOKIEJAR, '/tmp/cookies.txt');
curl_setopt($curl_handle, CURLOPT_COOKIEFILE, '/tmp/cookies.txt');
//curl_setopt($curl_handle, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($curl_handle, CURLOPT_SSL_VERIFYHOST, false);
curl_setopt($curl_handle, CURLOPT_FOLLOWLOCATION, 1);
//curl_setopt($curl_handle, CURLOPT_HEADER, 1);
//curl_setopt($curl_handle, CURLOPT_RETURNTRANSFER, 1);
//curl_setopt($curl_handle, CURLOPT_POST, 1);
$buffer = curl_exec($curl_handle);
curl_close($curl_handle);
if (empty($buffer)) {
    print "Sorry, the booking system appears to be unavailable at this time.<p>";
} else {
    print $buffer;
}
?>
I've fiddled with the settings, and the only three responses I get are:
1. Nothing is loaded and the error message is printed
2. A redirect to /asp/home... locally
3. Returns '1' and that's all
Thanks for your time!
1. Nothing is loaded: Your code works fine, but they're ignoring you. It's possibly some anti-hammering functionality on their end. You can always change your code to sleep for a while and retry a few times.
2. A redirect to /asp/home... locally: Your code is working, but the page returns JavaScript, which is executed by your browser and redirects you to a page that doesn't exist. To see this code without running it, do:
print htmlspecialchars($buffer);
3. Returns '1' and that's all: That happens if you don't use
curl_setopt($curl_handle, CURLOPT_RETURNTRANSFER, true);
Uncomment it.
Unfortunately, even then you won't be able to read this particular site as-is, because it's drawn by JavaScript, which cURL can't run.
Perhaps the cookies you are using are invalid (i.e., the session has expired).
If you load that page without cookies, you get a POST form. Perhaps you should send that POST data first, get the cookies and then use those cookies for the rest of the session.
Also, you need to set CURLOPT_RETURNTRANSFER to 1 in order to use $buffer like that. If not, cURL will output the page it gets, which probably explains #2.
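Putting those suggestions together, a rough sketch of that flow with a shared cookie jar; the form field names are placeholders, so check the actual form the page serves:
<?php
$jar = '/tmp/cookies.txt';
// Step 1: submit the POST form first so the server issues session cookies.
$ch = curl_init('https://clients.mindbodyonline.com/asp/home.asp?studioid=851');
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, ['username' => '...', 'password' => '...']); // placeholder field names
curl_setopt($ch, CURLOPT_COOKIEJAR, $jar);  // write received cookies here
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_exec($ch);
curl_close($ch);
// Step 2: reuse those cookies for the rest of the session.
$ch = curl_init('https://clients.mindbodyonline.com/asp/home.asp?studioid=851');
curl_setopt($ch, CURLOPT_COOKIEFILE, $jar); // read the cookies from step 1
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
$buffer = curl_exec($ch);
curl_close($ch);
?>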
