I have this code:
<?php
$filePath = $folder . $_GET['name'] . '.json';

if (!file_exists($filePath)) {
    // Create the cache file
    $myJson = fopen($filePath, "w");

    // Fetch the contents of the URL
    $ch = curl_init($myUrl);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    if (($text = curl_exec($ch)) === false) {
        die('fail');
    }
    // Close handle
    curl_close($ch);

    // Copy the response into the JSON file
    $result = fwrite($myJson, $text);
    fclose($myJson);

    // Add the update date to the database
    $t = time();
    $name = mysqli_real_escape_string($con, $_GET['name']);
    if (!mysqli_query($con, "INSERT INTO test (name, updated) VALUES ('$name', '$t')")) { // 'updated' column name assumed; adjust to your schema
        echo $text;
        mysqli_close($con);
        die('');
    }

    // Show the JSON file
    echo $text;
}
?>
And I have this problem: if users request this file at the same time, or with less than a 500 ms delay between them, all of them think that the file does not exist. So how can I prevent users from writing to the file while the first one is still writing it?
Use an exclusive lock when writing to the file, so that other processes can't interfere until the first one is finished. You also don't need fopen / fwrite at all.
Fetch your data using cURL and use the following snippet:
file_put_contents($filename, $contents, LOCK_EX);
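Put together, the whole block could collapse to something like this (a minimal sketch: $folder and $myUrl are taken from your code, and basename() is an extra assumption on my part to keep the request inside the folder):
<?php
// Minimal sketch: fetch with cURL, then write atomically under an exclusive lock.
// $folder and $myUrl are assumed to be defined as in the question.
$filePath = $folder . basename($_GET['name']) . '.json';

if (!file_exists($filePath)) {
    $ch = curl_init($myUrl);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $text = curl_exec($ch);
    curl_close($ch);

    if ($text === false) {
        die('fail');
    }

    // LOCK_EX makes concurrent writers wait instead of clobbering each other.
    file_put_contents($filePath, $text, LOCK_EX);
    echo $text;
} else {
    // Another request already created the file; just serve it.
    echo file_get_contents($filePath);
}
?>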
I am using the output of a PHP file on a remote server to show content on my own website. I do not have access to modify files on the remote server.
The remote PHP file outputs JavaScript like this:
document.write('<p>some text</p>');
If I enter the URL in a browser I get the correct output, e.g.:
https://www.remote_server.com/files/the.php?param1=12
I can show the output of the remote file on my website like this:
<script type="text/javascript" src="https://www.remote_server.com/files/the.php?param1=12"></script>
But I would like to filter the output a bit before showing it.
Therefore I implemented a php file with this code:
function getRemoteOutput(){
$file = fopen("https://www.remote_server.com/files/the.php?param1=12","r");
$output = fread($file,1024);
fclose($file);
return $output;
}
When I call this function fopen() returns a valid handle, but fread() returns an empty string.
I have tried using file_get_contents() instead, but I get the same result.
Is what I am trying to do possible?
Is it possible for the remote server to allow me to read the file via the browser, but block access from a php file?
Your variable $output only holds the first 1024 bytes of the response, because fread() reads at most the number of bytes you ask for (1024) in one call.
You will need a loop that runs while the end of the file has not been reached, concatenating the chunks until you have the entire remote file.
PHP reference: feof
You can learn a lot more in the PHP description for the fread function.
PHP reference: fread.
<?php
echo getRemoteOutput();

function getRemoteOutput(){
    $file = fopen("http://php.net/manual/en/function.fread.php", "r");
    $output = "";
    while (!feof($file)) {             // while not the end of the file
        $output .= fread($file, 1024); // read 1024 bytes at a time and append them to the string
    }
    fclose($file);                     // close the handle before returning (the original fclose() after return was never reached)
    return $output;
}
?>
In regards to your questions:
Is what I am trying to do possible?
Yes this is possible.
Is it possible for the remote server to allow me to read the file via
the browser, but block access from a php file?
I doubt it.
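If a server did want to do that, the usual mechanism is checking the request's User-Agent header, since PHP's HTTP stream wrapper sends none by default. Whether that is what this server does is only an assumption, but it's easy to test by sending a browser-like User-Agent through a stream context:
<?php
// Sketch: send an explicit User-Agent with file_get_contents().
// Whether the remote server actually filters on User-Agent is an assumption.
$context = stream_context_create(array(
    'http' => array(
        'user_agent' => 'Mozilla/5.0 (compatible; MySiteFetcher/1.0)',
    ),
));

$output = file_get_contents(
    'https://www.remote_server.com/files/the.php?param1=12',
    false,
    $context
);
?>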
I contacted the support team for the site I was trying to connect to. They told me that they do prevent access from php files.
So that seems to be the reason for my problems, and apparently I just cannot do what I tried to do.
For what it's worth, here is the code I used to test the various methods to read file output:
<?php
//$remotefile = 'http://www.xencomsoftware.net/configurator/tracker/ip.php';
$remotefile = "http://php.net/manual/en/function.fread.php";
function getList1(){
global $remotefile;
$output = file_get_contents($remotefile);
return htmlentities($output);
}
function getList2(){
global $remotefile;
$file = fopen($remotefile,"r");
$output = "";
while (!feof($file)){ // while not the End Of File
$output.= fread($file,1024); //reads 1024 bytes at a time and appends to the variable as a string.
}
fclose($file);
return htmlentities($output);
}
function getList3(){
global $remotefile;
$ch = curl_init(); // create curl resource
curl_setopt($ch, CURLOPT_URL, $remotefile); // set url
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); //return the transfer as a string
$output = curl_exec($ch); // $output contains the output string
curl_close($ch); // close curl resource to free up system resources
return htmlentities($output);
}
function getList4(){
    global $remotefile;
    $output = ''; // make sure we always return a string, even if the request fails
    $r = new HttpRequest($remotefile, HttpRequest::METH_GET);
    try {
        $r->send();
        if ($r->getResponseCode() == 200) {
            $output = $r->getResponseBody();
        }
    } catch (Exception $e) {
        echo 'Caught exception: ', $e->getMessage(), "\n";
    }
    return htmlentities($output);
}
function dumpList($ix, $list){
$len = strlen($list);
echo "<p><b>--- getList$ix() ---</b></p>";
echo "<div>Length: $len</div>";
for ($i = 0 ; $i < 10 ; $i++) {
echo "$i: $list[$i] <br>";
}
// echo "<p>$list</p>";
}
dumpList(1, getList1()); // doesn't work! You cannot include/require a remote file.
dumpList(2, getList2());
dumpList(3, getList3());
dumpList(4, getList4());
?>
When building a web application I needed to save Google search queries taken from the referrer to a file and echo them later.
What I have tried so far is:
function write_to_file($q)
{
    $filename = 'sitemap.dat';
    $fh = fopen($filename, "a");
    if (flock($fh, LOCK_EX)) {
        fwrite($fh, $q);
        flock($fh, LOCK_UN);
    }
    fclose($fh);
}

$ref = $_SERVER['HTTP_REFERER'];
if (strstr($ref, "http://")) {
    if (strstr($ref, "google.com")) {
        //echo $ref;
        $regex = '/q=(.+?)&/';
        preg_match($regex, $ref, $query);
        $user_query = '' . $query[1] . '';
        write_to_file($user_query);
    }
}
To summarize the code above: first it defines a function that saves text to a file, then it reads the referrer, checks that it contains http://, and then checks that it comes from google.com.
With that sorted, it grabs the query part and writes it to the file. However, I can't get it working: the page after the script is not displayed and nothing is saved in sitemap.dat.
Also, if I remove the file-writing call it echoes search%20query, which is what I want.
Check this out.
function write_to_file($q=''){
if($q==='') return false;
$fh = fopen('sitemap.dat', "a");
if(flock($fh, LOCK_EX)){
fwrite($fh, $q."\n");
flock($fh, LOCK_UN);
}
fclose($fh);
echo 'done';
}
$ref = (isset($_SERVER['HTTP_REFERER'])) ? $_SERVER['HTTP_REFERER'] : '';
if(strstr($ref, "http://www.google.co")){
preg_match('/q=(.+)&?/', $ref, $query);
if(count($query)>1){
write_to_file($query[1]);
}
}
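To echo the saved queries later (the other half of your goal), reading the file back could look like the sketch below; the urldecode() call is an assumption on my part so that search%20query displays as search query:
<?php
// Sketch: print every query stored in sitemap.dat, one per line.
// urldecode() is assumed to be desirable so "search%20query" displays as "search query".
$lines = file('sitemap.dat', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
foreach ($lines as $line) {
    echo htmlspecialchars(urldecode($line)) . "<br>\n";
}
?>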
I'm using a PHP file-grabber script: I put the URL of a remote file into a form field and the file is then downloaded directly to my server. The code looks like this:
<?php
ini_set("memory_limit", "2000M");
ini_set('max_execution_time', "2500");

foreach ($_POST['store'] as $value) {
    if ($value != "") {
        echo("Attempting: " . $value . "<br />");
        // escapeshellarg() keeps a user-supplied URL from injecting extra shell commands
        system("cd files && wget " . escapeshellarg($value));
        echo("<b>Success: " . $value . "</b><br />");
    }
}
echo("Finished all file uploading.");
?>
After downloading a file I would like to display the direct URL to it, for example:
Finished all file uploading, direct URL:
http://site.com/files/grabbedfile.zip
Could you help me determine the file name of the last downloaded file within this code?
Thanks in advance
You can use wget's log file. Just add -o logfilename.
Here is a small function, get_filename( $wget_logfile ):
ini_set("memory_limit","2000M");
ini_set('max_execution_time',"2500");
function get_filename( $wget_logfile )
{
$log = explode("\n", file_get_contents( $wget_logfile ));
foreach ( $log as $line )
{
preg_match ("/^.*Saving to: .{1}(.*).{1}/", $line, $find);
if ( count($find) )
return $find[1];
}
return "";
}
$tmplog = tempnam("/tmp", "wgetlog");
$filename = "";
foreach ($_POST['store'] as $value){
if ($value!=""){
echo("Attempting: ".$value."<br />");
system("cd files && wget -o $tmplog ".$value); // -o logfile
$filename = get_filename( $tmplog ); // current filename
unlink ( $tmplog ); // remove logfile
echo("<b>Success: ".$value."</b><br />");
}
}
echo("Finished all file uploading.");
echo "Last file: ".$filename;
Instead of using wget like that, you could do it all using cURL, if that is available.
<?php
set_time_limit(0);
$lastDownloadFile = null;
foreach ($_POST['store'] as $value) {
if ($value !== '' && downloadFile($value)) {
$lastDownloadFile = $value;
}
}
if ($lastDownloadFile !== null) {
// Print out info
$onlyfilename = pathinfo($lastDownloadFile, PATHINFO_BASENAME);
} else {
// No files were successfully downloaded
}
function downloadFile($filetodownload) {
$fp = fopen(pathinfo($filetodownload, PATHINFO_BASENAME), 'w+');
$ch = curl_init($filetodownload);
curl_setopt($ch, CURLOPT_TIMEOUT, 50);
curl_setopt($ch, CURLOPT_FILE, $fp); // We're writing to our file pointer we created earlier
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // Just in case the server throws us around
$success = curl_exec($ch); // gogo!
// clean up
curl_close($ch);
fclose($fp);
return $success;
}
Some words of caution, however: letting people download arbitrary files onto your server might not be the best idea. What are you trying to accomplish with this?
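If the script has to stay reachable by others, some basic validation of the submitted URL before fetching is worth considering. A minimal sketch; restricting to http/https and keeping only the base file name are assumptions on my part:
<?php
// Sketch: basic validation of a user-supplied URL before downloading it.
// Restricting to http/https and stripping path components are assumptions.
function isAllowedUrl($url) {
    if (filter_var($url, FILTER_VALIDATE_URL) === false) {
        return false; // not a syntactically valid URL
    }
    $scheme = (string) parse_url($url, PHP_URL_SCHEME);
    return in_array(strtolower($scheme), array('http', 'https'), true);
}

$value = 'http://example.com/some/archive.zip'; // example input
if (isAllowedUrl($value)) {
    // basename() keeps only the file name, so "../../etc/passwd" can't escape files/
    $target = 'files/' . basename(parse_url($value, PHP_URL_PATH));
}
?>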
I'm writing an application that uses a PHP script to get tweets using the Twitter search API.
See the code below:
<?php
$hashtag = 'hashtag'; // We search Twitter for the hashtag
$show = 25; // And we want to get 25 tweets
// Local path
$cacheFile = '../../_data/tweets.json.cache'; // A cachefile will be placed in _data/
$json = file_get_contents("http://search.twitter.com/search.json?result_type=recent&rpp=$show&q=%23" . $hashtag. "%20-RT") or die("Could not get tweets");
$fp = fopen($cacheFile, 'w');
fwrite($fp, $json);
fclose($fp);
?>
My problem is that I want to make sure this script runs without failing, or, if it does fail, that it at least doesn't keep looping.
The script is going to be run automatically every minute.
Would anyone know a good way to handle errors here?
TL;DR: How do I handle errors in my code?
Put simply, prefix the call with the '@' error control operator (the code below uses @file_get_contents). It suppresses errors from being displayed; see the PHP manual on error control operators for more.
<?php
$hashtag = 'hashtag'; // We search Twitter for the hashtag
$show = 25; // And we want to get 25 tweets
$cacheFile = '../../_data/tweets.json.cache'; // A cachefile will be placed in _data/
$json = #file_get_contents("http://search.twitter.com/search.json?result_type=recent&rpp=$show&q=%23" . $hashtag . "%20-RT");
if (!empty($json)) {
$fp = fopen($cacheFile, 'w');
fwrite($fp, $json);
fclose($fp);
} else {
echo "Could not get tweets";
exit;
}
?>
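Since the script runs every minute, it may also be worth bounding how long the request can take so a hung call doesn't stack up cron runs. A sketch using a stream context timeout; the 10-second value and the reuse of $show, $hashtag and $cacheFile from the snippet above are assumptions:
<?php
// Sketch: bound how long the request may take so a hung call can't stall the cron job.
// The 10-second timeout and the reuse of $show, $hashtag and $cacheFile are assumptions.
$context = stream_context_create(array(
    'http' => array('timeout' => 10),
));

$url  = "http://search.twitter.com/search.json?result_type=recent&rpp=$show&q=%23" . $hashtag . "%20-RT";
$json = @file_get_contents($url, false, $context);

if ($json === false) {
    error_log('Could not get tweets'); // log and bail out, keeping the previous cache intact
    exit;
}
file_put_contents($cacheFile, $json, LOCK_EX);
?>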
My code:
<?php
$url = 'http://w1.weather.gov/xml/current_obs/KGJT.xml';
$xml = simplexml_load_file($url);
?>
<?php
echo $xml->weather, " ";
echo $xml->temperature_string;
?>
This works great, but I read that caching external data is a must for page speed. How can I cache this for, let's say, 5 hours?
I looked into ob_start(), is this what I should use?
The ob system is for in-script output buffering. It's not useful for persistent caching across invocations.
To do this properly, you'd write the resulting XML out to a file. Every time the script runs, you'd check the last-modified time of that file; if it's more than 5 hours old, you fetch and save a fresh copy.
e.g.
$file = 'weather.xml';

// Refresh the cache if the file is missing or more than 5 hours old
if (!file_exists($file) || filemtime($file) < (time() - 5*60*60)) {
    $xml = file_get_contents('http://w1.weather.gov/xml/current_obs/KGJT.xml');
    file_put_contents($file, $xml);
}

$xml = simplexml_load_file($file);
echo $xml->weather, " ";
echo $xml->temperature_string;
ob_start() would not be a great solution. It only applies when you need to modify or flush the output buffer, and your XML data is not being sent to the output buffer, so there is no need for those calls.
Here's one solution, which I've used in the past. It does not require MySQL or any other database, as the data is stored in a flat file.
$last_cache = @filemtime('weather_cache.txt');      // Last-modified timestamp of the cache file

if ($last_cache === false) {                        // No cache file yet, so force a refresh
    $since_last_cache = time();
} else {
    $since_last_cache = time() - $last_cache;       // Seconds since the cache was last written
}

if ($since_last_cache >= (3600 * 5)) {              // If it's been 5 hours or more since we last cached...
    $url = 'http://w1.weather.gov/xml/current_obs/KGJT.xml'; // Pull in the weather
    $xml = simplexml_load_file($url);
    $weather = $xml->weather . " " . $xml->temperature_string;

    $fp = fopen('weather_cache.txt', 'a+');         // Write the weather data to the cache file
    if ($fp) {
        if (flock($fp, LOCK_EX)) {
            ftruncate($fp, 0);
            fwrite($fp, "\r\n" . $weather);
            flock($fp, LOCK_UN);
        }
        fclose($fp);
    }
}

include_once('weather_cache.txt'); // Include (output) the cached weather data