I want to run a single webpage that displays files stored in Azure File Storage. It must be Azure File Storage because the files come from a Docker container whose volume is mounted to that file storage.
I have an Azure Container Instance which stores PDF files in an Azure File Storage share. Now I run a WebApp (PHP) that shall read all the PDFs.
I've installed the Microsoft Azure Storage File PHP SDK but have no clue how to use it. Even the sample.php did not help me move forward. It would be very helpful if someone could help me with a simple snippet.
I see you want to directly display a PDF file from Azure File Storage in a web page. Generally, the best practice is to generate the URL of a file in Azure File Storage with a SAS token.
So I followed the GitHub repo Azure/azure-storage-php to install the Azure File Storage SDK for PHP in my sample project; here is my sample code with its dependencies.
The file structure of my sample project consists of composer.json, the vendor directory generated by Composer, and demo.php (original figure omitted).
The content of my composer.json file is as below.
{
    "require": {
        "microsoft/azure-storage-file": "*"
    }
}
My sample PHP file demo.php is as below; it's inspired by the function generateFileDownloadLinkWithSAS in the official sample azure-storage-php/samples/FileSamples.php.
<?php
require_once "vendor/autoload.php";

use MicrosoftAzure\Storage\Common\Internal\Resources;
use MicrosoftAzure\Storage\File\FileSharedAccessSignatureHelper;

$accountName = "<your account name>";
$accountKey  = "<your account key>";
$shareName   = "<your share name>";
$fileName    = "<your pdf file name>";

// Build an expiry time one hour from now, in ISO 8601 format.
$now = date(DATE_ISO8601);
$date = date_create($now);
date_add($date, date_interval_create_from_date_string("1 hour"));
$expiry = str_replace("+0000", "Z", date_format($date, DATE_ISO8601));

// Generate a read-only SAS token for the file.
$helper = new FileSharedAccessSignatureHelper($accountName, $accountKey);
$sas = $helper->generateFileServiceSharedAccessSignatureToken(
    Resources::RESOURCE_TYPE_FILE,
    "$shareName/$fileName",
    'r',     // Read permission
    $expiry  // A valid ISO 8601 expiry time, such as '2020-01-01T08:30:00Z'
);

$fileUrlWithSAS = "https://$accountName.file.core.windows.net/$shareName/$fileName?$sas";

echo "<h1>Demo to display PDF from Azure File Storage</h1>";
echo "<iframe src='$fileUrlWithSAS' width='800' height='500' allowfullscreen webkitallowfullscreen></iframe>";
?>
The web page renders the PDF inline in both Chrome and Firefox (result screenshots omitted).
Update: here is the code to list the files in a file share.
<?php
require_once "vendor/autoload.php";

use MicrosoftAzure\Storage\File\FileRestProxy;

$accountName = "<your account name>";
$accountKey  = "<your account key>";
$shareName   = "<your share name>";

$connectionString = "DefaultEndpointsProtocol=https;AccountName=$accountName;AccountKey=$accountKey";
$fileClient = FileRestProxy::createFileService($connectionString);

$list = $fileClient->listDirectoriesAndFiles($shareName);

// True if $str ends with $sub.
function endsWith($str, $sub) {
    return substr($str, strlen($str) - strlen($sub)) === $sub;
}

// Print the name of every PDF in the share.
foreach ($list->getFiles() as $file) {
    $fileName = $file->getName();
    if (endsWith($fileName, ".pdf")) {
        echo $fileName . "\n";
    }
}
?>
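If you want to display every PDF rather than just list the names, the two snippets can be combined. Here is a minimal sketch, untested end-to-end but assembled only from the calls shown above, with the same placeholder account values:
<?php
require_once "vendor/autoload.php";

use MicrosoftAzure\Storage\Common\Internal\Resources;
use MicrosoftAzure\Storage\File\FileRestProxy;
use MicrosoftAzure\Storage\File\FileSharedAccessSignatureHelper;

$accountName = "<your account name>";
$accountKey  = "<your account key>";
$shareName   = "<your share name>";

$connectionString = "DefaultEndpointsProtocol=https;AccountName=$accountName;AccountKey=$accountKey";
$fileClient = FileRestProxy::createFileService($connectionString);
$helper     = new FileSharedAccessSignatureHelper($accountName, $accountKey);

// One SAS expiry for all files: one hour from now, in ISO 8601 format.
$expiry = gmdate("Y-m-d\TH:i:s\Z", time() + 3600);

// Embed each PDF in the share root via its own SAS URL.
foreach ($fileClient->listDirectoriesAndFiles($shareName)->getFiles() as $file) {
    $fileName = $file->getName();
    if (substr($fileName, -4) !== ".pdf") {
        continue;
    }
    $sas = $helper->generateFileServiceSharedAccessSignatureToken(
        Resources::RESOURCE_TYPE_FILE,
        "$shareName/$fileName",
        'r',
        $expiry
    );
    $url = "https://$accountName.file.core.windows.net/$shareName/$fileName?$sas";
    echo "<iframe src='$url' width='800' height='500'></iframe>";
}
?>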
Related
I'm trying to get a video from my Google Drive account and publish it on my website.
The idea is to authorize access to the file using a service account, so the video will be publicly accessible without the user needing his Google credentials.
Right now, for images, I download them and then serve them from my server, but due to storage space I would prefer not to do the same for videos.
Here's my code:
$client = getGoogleClient(); // Specify the CLIENT_ID of the app that accesses the backend
$service = new Google_Service_Drive($client);
$customData = "";  // Defaults, so the echo below always has values
$customclass = "";
switch ($type) {
    case 1: // video
        $startPos = strrpos($url['URL'], "file/d") + 7;
        if ($startPos > 7) {
            $endPos = strrpos($url['URL'], "/");
            $url = substr($url['URL'], $startPos, $endPos - $startPos); // it's the file id
        }
        // Fetch the file's webContentLink
        $file = $service->files->get($url, array("fields" => "webContentLink"));
        $customData = $file->webContentLink;
        $customclass = "hasVideo";
        break;
    case 3: // img
        if (is_null($img)) {
            // We have to download the file and store it temporarily.
            // Find the image id in the URL.
            $startPos = strrpos($url['URL'], "file/d") + 7;
            if ($startPos > 7) {
                $endPos = strrpos($url['URL'], "/");
                $url = substr($url['URL'], $startPos, $endPos - $startPos);
                $content = $service->files->get($url, array("alt" => "media"));
                // Open a file handle for output.
                $filePath = "./cachedFiles/" . uniqid() . ".jpg";
                $outHandle = fopen($filePath, "w+");
                // Until we reach EOF, read 1024 bytes at a time and write them to the output file handle.
                while (!$content->getBody()->eof()) {
                    fwrite($outHandle, $content->getBody()->read(1024));
                }
                // Close the output file handle.
                fclose($outHandle);
                // Note: 'H:i:s' (minutes as i), not 'H:m:s' (m is the month)
                $connection->runQuery("UPDATE File_Elemento SET cachedFile='" . $filePath . "', lastCached='" . date('Y-m-d H:i:s') . "' WHERE ID=" . $ID);
            } else {
                $type = 0;
            }
        } else {
            $filePath = $img;
        }
        require_once('./ImageCache/ImageCache.php');
        $imagecache = new ImageCache\ImageCache();
        $imagecache->cached_image_directory = dirname(__FILE__) . '/cachedImg';
        $filePath = $imagecache->cache($filePath);
        break;
    default:
        break;
}
echo '<a onclick="showDetail(this,\'' . $customData . '\')" class="grid-item ' . ($subject ? $subject : "Generico") . ' ' . ($customclass != "" ? $customclass : "") . '"><div class="card newsCard">' . ($type == 3 ? '<img class="lazy-load imgPrev" data-src="' . $filePath . '">' : "") . '<h3>' . $school . '</h3><h1>' . $name . '</h1>';
echo '<div class="prev">' . $subject . '</div><span class="goin mainColor">Visualizza</span></div></a>';
Right now I have tried to get the webContentLink and then use the URL I get as the source for a video tag, but I get a 403 error, so I still haven't managed to authorize the access using the service account.
Any help would be appreciated.
Embedding the webContentLink in your website won't make the video publicly available. The webContentLink is as restricted as the file itself: it can only be accessed by users with whom the file has been shared.
So you should do one of these:
Make the video public (via Permissions: create, or through the UI itself) with role: reader and type: anyone; a sketch follows this list.
Download it and serve it from your server, as you do with your images.
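Here is a minimal sketch of the first option, assuming the Drive API v3 PHP client you appear to be using and a hypothetical $fileId variable holding the id extracted from the URL:
// Hypothetical sketch: grant public read access so webContentLink works
// for viewers without Google credentials ($fileId is an assumption).
$permission = new Google_Service_Drive_Permission();
$permission->setRole('reader');
$permission->setType('anyone');
$service->permissions->create($fileId, $permission);
Note that the service account needs edit access to the file in order to change its permissions.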
Related:
Generating a downloadable link for a private file uploaded in the Google drive
I have a form that allows a user to fill in several aspects and then choose a file to upload.
When the form is submitted, I want to write some code that saves the file to a Dropbox account, gets a direct download link for it, and stores that link in a database I am hosting.
If anyone has done this, is there a specific section of the API to look at? Or any examples?
I can't seem to find this in the API.
Thanks.
From what I see in the API, it is possible to do this. You need to download the Dropbox Core API SDK. Inside the zip file you will find an examples folder with example code for authentication, upload, download, direct links, and so on. Just see direct-link.php and change it to your needs. Here is a tested, working example of uploading a file and generating a direct link for download:
<?php
require_once "dropbox-php-sdk-1.1.2/lib/Dropbox/autoload.php";

use \Dropbox as dbx;

$dropbox_config = array(
    'key'    => 'your_key',
    'secret' => 'your_secret'
);

$appInfo = dbx\AppInfo::loadFromJson($dropbox_config);
$webAuth = new dbx\WebAuthNoRedirect($appInfo, "PHP-Example/1.0");
$authorizeUrl = $webAuth->start();

echo "1. Go to: " . $authorizeUrl . "<br>";
echo "2. Click \"Allow\" (you might have to log in first).<br>";
echo "3. Copy the authorization code and insert it into \$authCode.<br>";

$authCode = trim('DjsR-iGv4PAAAAAAAAAAAbn9snrWyk9Sqrr2vsdAOm0');
list($accessToken, $dropboxUserId) = $webAuth->finish($authCode);
echo "Access Token: " . $accessToken . "<br>";

$dbxClient = new dbx\Client($accessToken, "PHP-Example/1.0");

// Uploading the file
$f = fopen("working-draft.txt", "rb");
$result = $dbxClient->uploadFile("/working-draft.txt", dbx\WriteMode::add(), $f);
fclose($f);
print_r($result);

// Get file info
$file = $dbxClient->getMetadata('/working-draft.txt');

// Sending the direct link:
$dropboxPath = $file['path'];
$pathError = dbx\Path::findError($dropboxPath);
if ($pathError !== null) {
    fwrite(STDERR, "Invalid <dropbox-path>: $pathError\n");
    die;
}

// The $link is an array!
$link = $dbxClient->createTemporaryDirectLink($dropboxPath);

// Adding ?dl=1 to the link forces the file to be downloaded by the client.
$dw_link = $link[0] . "?dl=1";
echo "Download link: " . $dw_link . "<br>";
?>
I made this really fast just to get it working; eventually you may need to tweak it a bit so it suits your needs.
There is a section on uploading in the Core API manual; see this link.
So you can use the upload part like this:
// uploadFile() expects an open file handle, not the file's contents.
$f = fopen('data.txt', 'rb');
$result = $dbxClient->uploadFile("/data.txt", dbx\WriteMode::add(), $f);
fclose($f);
echo 'file uploaded';
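If you already have the contents in a string, the 1.x SDK also provides uploadFileFromString(), which takes the data directly:
$result = $dbxClient->uploadFileFromString("/data.txt", dbx\WriteMode::add(), file_get_contents('data.txt'));
echo 'file uploaded';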
I am trying to get the URL of a file that I have uploaded to Moodle. I want to access the file via the browser over HTTP. The path name of the file is hashed in the Moodle database. Is there a way of getting the real URL of uploaded files in Moodle? This is the code I was trying out, using the Moodle File API.
<?php
require_once("../config.php");

$course_name = $_GET["course"];
$table_files = "files";
$results = $DB->get_records($table_files, array('filename' => $course_name));

// Get the file details here.
foreach ($results as $obj) {
    $contextid = $obj->contextid;
    $component = $obj->component;
    $filearea  = $obj->filearea;
    $itemid    = $obj->itemid;
}

// Build the URL as a string (the original bare concatenation was invalid PHP).
$url = "$CFG->wwwroot/pluginfile.php/$contextid/$component/$filearea/$itemid/$course_name";
echo $url;
?>
Will appreciate the help.
This works for Moodle version 2.3
<?php
require_once("../config.php");

$filename = "my_image.jpg";
$table_files = "files";
$results = $DB->get_record($table_files, array('filename' => $filename, 'sortorder' => 1));

$baseurl = "$CFG->wwwroot/pluginfile.php/$results->contextid/$results->component/$results->filearea/$results->itemid/$filename";
echo $baseurl;
The result is the URL of the image "my_image.jpg".
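A cleaner alternative, sketched here under the assumption of Moodle 2.x's File API, is to let moodle_url assemble the pluginfile URL instead of concatenating the string yourself (the $record fields mirror the columns used above):
<?php
require_once("../config.php");

$filename = "my_image.jpg";
$record = $DB->get_record('files', array('filename' => $filename, 'sortorder' => 1));

// moodle_url::make_pluginfile_url() builds wwwroot/pluginfile.php/... for you.
$url = moodle_url::make_pluginfile_url(
    $record->contextid,
    $record->component,
    $record->filearea,
    $record->itemid,
    $record->filepath,  // usually '/'
    $record->filename
);
echo $url->out();
?>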
Just wondering if anyone knows of a prewritten PHP script to transfer files from Rackspace Cloud Sites to Rackspace Cloud Files.
Rackspace does provide a script to back up files to the cloud, but not to transfer them (and I only realized that after spending a couple of hours and finally getting the script working).
http://www.rackspace.com/knowledge_center/article/how-to-use-cron-to-backup-cloudsites-to-cloudfiles
I don't know PHP (which is required for Rackspace cron jobs), so if there's a script that would help me with this it would be great.
Thanks.
Below is the script I use when I back up to Rackspace. It uses the php-cloudfiles master library from Racker Labs. I set it up as a simple cron job; simply replace the placeholder values with your own, then run it as:
php 'path/to/backup.php'
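For instance, a crontab entry along these lines (path assumed) would run the backup nightly at 2 AM:
0 2 * * * php /path/to/backup.php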
<?php
require_once('php-cloudfiles-master/cloudfiles.php');

$server_name = 'YOUR_SERVER_NAME';      // Name of the current server
$curr_date_time = date("Y-m-d--H:i:s"); // DON'T CHANGE - date
$curr_day = date("Yd");                 // DON'T CHANGE - date
$sites = array("ENTER SITES HERE IN ARRAY FORMAT"); // Array of sites to be backed up
$site_root = "/var/www/";
$temp_dir = "/bak/mysql/";      // temp directory
$site_temp_dir = "/bak/site/";  // temp directory
$rscfUsername = 'YOUR RACKSPACE USERNAME'; // the username for your Rackspace account
$rscfApiKey = 'YOUR RACKSPACE API KEY';    // the API key for your account
$rscfContainer = 'YOUR RACKSPACE CONTAINER'; // Rackspace container

foreach ($sites as $site) {
    try {
        exec("tar -czpf {$site_temp_dir}{$site}{$curr_date_time}_site.tar.gz {$site_root}{$site}");
        // Check that the archive was created.
        if (file_exists("{$site_temp_dir}{$site}{$curr_date_time}_site.tar.gz")) {
            $auth = new CF_Authentication($rscfUsername, $rscfApiKey);
            $auth->ssl_use_cabundle();
            $auth->authenticate();
            $conn = new CF_Connection($auth);
            $conn->ssl_use_cabundle();
            // I already created a container with this name; otherwise use create_container.
            $container = $conn->get_container($rscfContainer);
            // Make a unique name with the date in it so you know when it was created.
            $objectName = $site . $curr_date_time . '.tar.gz';
            $store = $container->create_object($objectName);
            $store->load_from_filename("{$site_temp_dir}{$site}{$curr_date_time}_site.tar.gz"); // send to Rackspace
        }
    } catch (Exception $e) {
        print_r($e);
    }
}
?>
I am trying to write some code that will search a specified folder on the server for the latest file with a specific extension (in my case .zip) and transfer that file to Rackspace's Cloud Files. The code below is as far as I got, and I keep getting the error:
Fatal error: Uncaught exception 'IOException' with message 'Could not open file for reading: Resource id #8' in /home/test/public_html/cloudapi/cloudfiles.php:1952 Stack trace: #0 /home/test/public_html/final.php(60): CF_Object->load_from_filename(Resource id #8) #1 {main} thrown in /home/test/public_html/cloudapi/cloudfiles.php on line 1952
The code I'm using below was originally written for uploading content via an HTML upload form, and I'm trying to adapt it to use a local server file instead of an uploaded file. You will see commented-out code that was previously part of the upload script, to show you how it worked.
<?php
// Include the Cloud API.
require('cloudapi/cloudfiles.php');

// START - Script to find the most recent file with the extension .zip in the current folder
$show = 2;  // Leave as 0 for all
$dir = '';  // Leave blank for current
if ($dir) chdir($dir);
$files = glob('*.zip');
usort($files, 'filemtime_compare');

function filemtime_compare($a, $b)
{
    return filemtime($b) - filemtime($a);
}

$i = 0;
foreach ($files as $file) {
    ++$i;
    if ($i == $show) break;
    $value = $file; // Variable $value contains the filename of the most recent file, to be used in the Cloud Files API
}
// END - Script to find the most recent .zip file in the current folder

// START - Rackspace API code to upload content to a Cloud Files container
// Rackspace connection details
$username = "randomusername"; // put username here
$key = "234887347r93289f28h3ru283h2fuh23093402398"; // api key

// Connect to Rackspace
$auth = new CF_Authentication($username, $key);
$auth->authenticate();
$conn = new CF_Connection($auth);

// Set the container you want to use
$container = $conn->get_container('Backups');

// Temp store the file
//$localfile = $_FILES['uploadfile']['tmp_name'];
//$filename = $_FILES['uploadfile']['name'];
$localfile = fopen($value, "r");
$filename = $value;

// Uploading to Rackspace Cloud
$object = $container->create_object($filename);
$object->load_from_filename($localfile);
echo "Your file has been uploaded";
// END - Rackspace API code to upload content to a Cloud Files container
?>
I know this is an old thread, but for the benefit of people who may land on this page while searching for a solution...
The problem with your code is this statement, which opens the file and stores a file handle (a PHP resource):
$localfile = fopen($value, "r");
However, load_from_filename() expects a file path as a string, not a handle. When you pass it the resource, it gets cast to the string "Resource id #8" (exactly what your error message shows), which is not a valid path, so the routine fails when it tries to open the file itself.
So remove the fopen() call and pass the filename instead, and you should be able to run the script successfully.
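A minimal corrected version of the upload portion, under that reading of the error, would look like this (the content_type line is optional; see the note about content types below):
// Pass the file path (a string) rather than a file handle.
$filename = $value; // $value holds the newest .zip found earlier
$object = $container->create_object($filename);
$object->content_type = 'application/zip'; // optional: set the type explicitly
$object->load_from_filename($filename);
echo "Your file has been uploaded";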
The error can also be thrown when the content type cannot be determined from the file being read; setting $object->content_type explicitly before the upload avoids this.