How to use Nginx secure_link to stream from remote server - php

I need to get nginx secure_link working when the data lives on another server.
I'm currently using a single-server setup with nginx secure_link as follows:
location ~ \.mp4$ {
    secure_link $arg_md5,$arg_expires;
    secure_link_md5 "$secure_link_expires$uri$remote_addr secretkey";

    if ($secure_link = "") {
        return 403;
    }
    if ($secure_link = "0") {
        return 410;
    }
}
I use PHP to build the URL:
function buildSecureLink($baseUrl, $path, $secret, $ttl, $userIp)
{
    $expires = time() + $ttl;
    $md5 = md5("$expires$path$userIp $secret", true);
    $md5 = base64_encode($md5);
    $md5 = strtr($md5, '+/', '-_');
    $md5 = str_replace('=', '', $md5);
    return $baseUrl . $path . '?md5=' . $md5 . '&expires=' . $expires;
}
$secret = 'secretkey';
$baseUrl = 'domain here';
$path = '/videos' . $video->id . '.mp4';
$ttl = 3600;
$userIp = $_SERVER["HTTP_CF_CONNECTING_IP"]; // since behind cloudflare
$vidurl = buildSecureLink($baseUrl, $path, $secret, $ttl, $userIp);
This works just fine when the data is on the same server. But when I point the same nginx secure_link setup at the remote server (naturally with the PHP part updated with the correct $baseUrl and path), it simply doesn't work.
Is there any way to use secure_link where the actual data (video in this case) is on another server?

My mistake. Got it working.
I had forgotten to add the nginx configuration needed to restore the real visitor IP behind Cloudflare: https://support.cloudflare.com/hc/en-us/articles/200170706-How-do-I-restore-original-visitor-IP-with-Nginx-
Comparing the nginx headers, I noticed the IP being passed to secure_link_md5 was wrong.
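For reference, a minimal sketch of that real-IP configuration (the CIDR ranges below are illustrative; use the current list Cloudflare publishes, and note that $remote_addr is what secure_link_md5 hashes):

```nginx
# Restore the original visitor IP behind Cloudflare so that $remote_addr
# matches the IP the PHP side hashed. Illustrative ranges only -- use the
# full, current list published by Cloudflare.
set_real_ip_from 173.245.48.0/20;
set_real_ip_from 103.21.244.0/22;
real_ip_header CF-Connecting-IP;
```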

Related

Get Azure File from snapshot with php

Is there any documentation on getting a SAS URL to download a file from a snapshot of an Azure File Share?
Using GenerateFileDownloadLinkWithSAS (https://github.com/Azure/azure-storage-php/blob/master/samples/FileSamples.php) it's easy to download an Azure File directly with SAS, but not a snapshot.
Here is my code:
use MicrosoftAzure\Storage\Common\Exceptions\ServiceException;
use MicrosoftAzure\Storage\Common\Internal\Resources;
use MicrosoftAzure\Storage\Common\Internal\StorageServiceSettings;
use MicrosoftAzure\Storage\Common\Models\Range;
use MicrosoftAzure\Storage\Common\Models\Metrics;
use MicrosoftAzure\Storage\Common\Models\RetentionPolicy;
use MicrosoftAzure\Storage\Common\Models\ServiceProperties;
use MicrosoftAzure\Storage\File\FileRestProxy;
use MicrosoftAzure\Storage\File\FileSharedAccessSignatureHelper;
use MicrosoftAzure\Storage\File\Models\CreateShareOptions;
use MicrosoftAzure\Storage\File\Models\ListSharesOptions;
use MicrosoftAzure\Storage\File\Models\ListDirectoriesAndFilesOptions;

function MapFileURL($shareName, $filePath)
{
    global $fileRestProxy;
    global $mapConString;

    $prepareFilePath = implode('/', array_map(function ($v) {
        return rawurlencode($v);
    }, explode('/', $filePath)));

    // Create a SharedAccessSignatureHelper
    $settings = StorageServiceSettings::createFromConnectionString($mapConString);
    $accountName = $settings->getName();
    $accountKey = $settings->getKey();
    $helper = new FileSharedAccessSignatureHelper($accountName, $accountKey);

    $endDate = MapIsoDate(time() + 13300);

    // Generate a read-only file SAS token.
    // Refer to the following link for the full candidate values to construct a service-level SAS:
    // https://learn.microsoft.com/en-us/rest/api/storageservices/constructing-a-service-sas
    $sas = $helper->generateFileServiceSharedAccessSignatureToken(
        Resources::RESOURCE_TYPE_FILE,
        $shareName . "/" . $prepareFilePath,
        'r', // Read
        $endDate
    );

    $connectionStringWithSAS = Resources::FILE_ENDPOINT_NAME . '=' . 'https://' . $accountName . '.' . Resources::FILE_BASE_DNS_NAME . ';' . Resources::SAS_TOKEN_NAME . '=' . $sas;
    $fileClientWithSAS = FileRestProxy::createFileService($connectionStringWithSAS);

    // Get a downloadable file URL
    $fileUrlWithSAS = sprintf(
        '%s%s?%s',
        (string) $fileClientWithSAS->getPsrPrimaryUri(),
        $shareName . "/" . $prepareFilePath,
        $sas
    );

    return $fileUrlWithSAS;
}
What is missing to be able to download a file from an Azure File snapshot?
What you need to do is append the share's snapshot date/time to your SAS URL. Something like:
https://account.file.core.windows.net/share/file.png?sastoken&snapshot=2021-05-01T13:49:56.0000000Z
Here is the code that works:
function MapSnapshotFileURL($shareName, $filePath, $snapshotTime)
{
    global $fileRestProxy;
    global $mapConString;

    // Prepare the path to send to the Azure function.
    $prepareFilePath = implode('/', array_map(function ($v) {
        return rawurlencode($v);
    }, explode('/', $filePath)));

    // Create a SharedAccessSignatureHelper
    $settings = StorageServiceSettings::createFromConnectionString($mapConString);
    $accountName = $settings->getName();
    $accountKey = $settings->getKey();
    $helper = new FileSharedAccessSignatureHelper($accountName, $accountKey);

    $endDate = MapIsoDate(time() + 13300);

    // Generate a read-only file SAS token.
    // Refer to the following link for the full candidate values to construct a service-level SAS:
    // https://learn.microsoft.com/en-us/rest/api/storageservices/constructing-a-service-sas
    $sas = $helper->generateFileServiceSharedAccessSignatureToken(
        Resources::RESOURCE_TYPE_FILE,
        $shareName . "/" . $prepareFilePath,
        'r', // Read
        $endDate // A valid ISO 8601 expiry time, e.g. '2020-01-01T08:30:00Z'
    );

    $connectionStringWithSAS = Resources::FILE_ENDPOINT_NAME . '=' . 'https://' . $accountName . '.' . Resources::FILE_BASE_DNS_NAME . ';' . Resources::SAS_TOKEN_NAME . '=' . $sas;
    $fileClientWithSAS = FileRestProxy::createFileService($connectionStringWithSAS);

    // Get a downloadable file URL, with the snapshot timestamp appended.
    $fileUrlWithSAS = sprintf(
        '%s%s?%s&%s',
        (string) $fileClientWithSAS->getPsrPrimaryUri(),
        $shareName . "/" . $prepareFilePath,
        $sas,
        "snapshot=" . $snapshotTime
    );

    return $fileUrlWithSAS;
}
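A quick sketch of fetching such a file with curl, assuming you already have a valid SAS URL (the URL and timestamp below are placeholders):

```shell
# Placeholder SAS URL and snapshot timestamp -- substitute real values.
SAS_URL="https://account.file.core.windows.net/share/file.png?sv=2020-08-04&sig=abc123"
SNAPSHOT="2021-05-01T13:49:56.0000000Z"
# The snapshot timestamp rides along as one more query parameter:
SNAPSHOT_URL="${SAS_URL}&snapshot=${SNAPSHOT}"
echo "$SNAPSHOT_URL"
# curl -fSL -o file.png "$SNAPSHOT_URL"
```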

How to download a github repository or files with curl

I have successfully downloaded a repository from GitHub via file_get_contents, but what is the approach with curl?
The advantage of file_get_contents is that it works anonymously; inside a back office, for example, it makes it possible to download plugins. But GitHub restricts this usage to about 10 requests, after which the user must wait a long time before retrying. It seems that if curl is used, GitHub allows more requests to download a repository.
Do you have an example using curl (still anonymously)?
Below is an example that I made. It works fine, but it cannot be used more than about 10 times.
public function __construct() {
    $this->githubUrl = 'https://github.com';
    $this->githubApi = 'https://api.github.com';
    $this->githubRepo = 'repos';
    $this->context = stream_context_create(array('http' => array('header' => 'User-Agent: ClicShopping')));
    $this->githubRepoClicShoppingCore = 'CoreofApplication';
    $this->githubRepoName = 'addOnNameOfApplication';
}

private function getGithubRepo() {
    return $this->githubApi . '/' . $this->githubRepo . '/' . $this->githubRepoName;
}

private function getGithubCoreRepo() {
    return $this->githubApi . '/' . $this->githubRepo . '/' . $this->coreName . '/' . $this->githubRepoClicShoppingCore;
}

private function setContext() {
    return $this->context;
}

private function getGithubApiClicShoppingCoreArchive() {
    return $this->githubUrl . '/' . $this->coreName . '/' . $this->githubRepoClicShoppingCore . '/archive/master.zip';
}
Thank you.
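Since no answer is recorded here, a minimal sketch with plain curl, using hypothetical owner/repository names: GitHub serves repository archives at a predictable URL, and curl follows the codeload redirect with -L.

```shell
# Hypothetical owner and repository -- substitute your own.
OWNER="ClicShopping"
REPO="CoreofApplication"
ARCHIVE_URL="https://github.com/${OWNER}/${REPO}/archive/master.zip"
echo "$ARCHIVE_URL"
# Download anonymously; -L follows the redirect to codeload.github.com,
# -A sets a User-Agent like the stream-context example above.
# curl -fSL -A "ClicShopping" -o "${REPO}.zip" "$ARCHIVE_URL"
```

Note that anonymous requests to the GitHub API are rate-limited per source IP regardless of client, so curl alone may not raise the limit; authenticating with a token is the documented way to get a higher quota.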

How to generate Signed URL for google cloud storage objects using PHP

The method I tried uses openssl:
$fp = fopen($key, 'r'); // open the PEM file
$priv_key = fread($fp, 8192);
fclose($fp);
$pkeyid = openssl_get_privatekey($priv_key, "password");
openssl_sign($response["data_to_sign"], $signature, $pkeyid, 'sha256');
$sign = base64_encode($signature);
Is this the correct method to generate the signature for signed URLs on Google?
You can try the Google Cloud Storage PHP SDK; it's a good choice for keeping your code clean.
cloud-storage PHP SDK
Install the package into your project by following this page on Packagist, then:
use Google\Cloud\Storage\StorageClient;

function getSignedGcsUrl($bucketName, $objPath /* your target object path */, $duration = 50)
{
    $storageClient = new StorageClient([
        'projectId'   => 'your-gcp-project-id',
        'keyFilePath' => '/path/to/your/keyfile.json',
    ]);
    // bucket() takes the bucket name; object() takes the object path within it
    $bucket = $storageClient->bucket($bucketName);
    $object = $bucket->object($objPath);

    return $object->signedUrl(new \DateTime('+ ' . $duration . ' seconds'));
}
laravel-google-cloud-storage (for Laravel)
Install and configure superbalist/laravel-google-cloud-storage by following this page on GitHub, then:
public static function getSignedGcsUrl($objPath, $duration = 50)
{
    return Storage::disk('gcs' /* following your filesystem configuration */)
        ->getAdapter()
        ->getBucket()
        ->object($objPath)
        ->signedUrl(new \DateTime('+ ' . $duration . ' seconds'));
}
I put all the answers together; this should work in an out-of-the-box project. If you have spaces in the paths, you will need to rawurlencode the individual components, not urlencode them.
function signedGoogleStorageURL($bucketName, $resourcePath, $duration = 10, $method = 'GET')
{
    $expires = time() + $duration;
    $content_type = ($method == 'PUT') ? 'application/x-www-form-urlencoded' : '';
    $to_sign = ($method . "\n" .
        /* Content-MD5 */ "\n" .
        $content_type . "\n" .
        $expires . "\n" .
        "/" . $bucketName . $resourcePath);
    $sign_result = AppIdentityService::signForApp($to_sign);
    $signature = urlencode(base64_encode($sign_result['signature']));
    $email = AppIdentityService::getServiceAccountName();

    return ('https://storage.googleapis.com/' . $bucketName . $resourcePath .
        '?GoogleAccessId=' . $email .
        '&Expires=' . $expires .
        '&Signature=' . $signature);
}

$signedPath = signedGoogleStorageURL(AppIdentityService::getDefaultVersionHostname(), "/my_folder/my_file", 60);
One thing to note that I spent about two hours on:
The GoogleAccessId you pass into the URL is the Email Address in the "Certificate" section of the Google Cloud Console. It's not the OAuth Client ID with a string replacement as Google suggests in their documentation.
There's an example here that signs a URL for Google Cloud Storage using PHP:
https://groups.google.com/forum/#!msg/google-api-php-client/jaRYDWdpteQ/xbNTLfDhUggJ
However, I note this is tagged with Google App Engine. If your code is running inside Google App Engine, you should use the built-in App Identity service (note: this will only work once your application is deployed in production, not while running locally), which means you will not need to download or handle any private keys:
require_once 'google/appengine/api/app_identity/AppIdentityService.php';
$sign_result = AppIdentityService::signForApp( $message );
You will need to make sure that the service account associated with the App Engine application is added to the team for the project that owns the Cloud Storage bucket.

How to set a metadata key in PHP for the Rackspace Cloud API

I am using the Cloud Files API of the Rackspace Cloud server in PHP. I want to generate a temp URL to download files directly from my server. For this I am using the API's get_temp_url() method, but before using that method I have to set a metadata key for my container. How would I do this?
public function get_temp_url($key, $expires, $method)
{
    $expires += time();
    $url = $this->container->cfs_http->getStorageUrl() . '/' . $this->container->name . '/' . $this->name;

    return $url . '?temp_url_sig=' . hash_hmac('sha1', strtoupper($method) .
        "\n" . $expires . "\n" . parse_url($url, PHP_URL_PATH), $key) .
        '&temp_url_expires=' . $expires;
}
The comments on this page include an example of how to set this:
http://docs.rackspace.com/files/api/v1/cf-devguide/content/Set_Account_Metadata-d1a4460.html
Also, if you use the new Cloud Files API...
https://github.com/rackspace/php-opencloud
...it includes a SetTempUrlSecret method in the ObjectStore class that will do this for you.
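A sketch of setting the key by hand: Cloud Files (OpenStack Swift) accepts it as the X-Account-Meta-Temp-Url-Key header on a POST to the storage account URL (the endpoint, token, and key below are placeholders):

```shell
# Placeholder endpoint, token, and key -- substitute values from your auth response.
STORAGE_URL="https://storage101.ord1.clouddrive.com/v1/MossoCloudFS_abc123"
AUTH_TOKEN="your-auth-token"
TEMP_URL_KEY="mysecretkey"
# Build the header that sets the account-level temp URL key:
KEY_HEADER="X-Account-Meta-Temp-Url-Key: ${TEMP_URL_KEY}"
echo "$KEY_HEADER"
# curl -X POST -H "X-Auth-Token: ${AUTH_TOKEN}" -H "$KEY_HEADER" "$STORAGE_URL"
```

The $key you then pass to get_temp_url() must be this same value, since the server recomputes the HMAC with the stored key.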

All of a sudden, my Amazon S3 HTTP requests aren't working. Signature Does Not Match error. Help?

My code was working just fine a couple days ago. Then, all of a sudden, BAM: it stopped working. My PUTs stopped going through, with a SignatureDoesNotMatch error. Help?
require_once 'Crypt/HMAC.php';
require_once 'HTTP/Request.php';

function uploadFile($path_to_file, $store_file_as, $bucket, $debugmode = false) {
    $S3_URL = "http://s3.amazonaws.com/";
    $filePath = $path_to_file;
    $contentType = 'audio/mpeg';
    $keyId = 'THISISMYKEY, YES I DOUBLE CHECKED IT';
    $secretKey = 'THIS IS MYSECRET, YES I DOUBLED CHECKED IT';
    $key = $store_file_as;
    $resource = $bucket . "/" . $key;
    $acl = "public-read";
    $verb = "PUT";
    $httpDate = gmdate("D, d M Y H:i:s T");
    $stringToSign = "PUT\n\naudio/mpeg\n$httpDate\nx-amz-acl:$acl\n/$resource";

    $hasher = new Crypt_HMAC($secretKey, "sha1");
    $str = $hasher->hash($stringToSign);

    // Convert the hex digest to raw bytes before base64-encoding
    $raw = '';
    for ($i = 0; $i < strlen($str); $i += 2) {
        $raw .= chr(hexdec(substr($str, $i, 2)));
    }
    $signature = base64_encode($raw);

    $req = new HTTP_Request($S3_URL . $resource);
    $req->setMethod('PUT');
    $req->addHeader("content-type", $contentType);
    $req->addHeader("Date", $httpDate);
    $req->addHeader("x-amz-acl", $acl);
    $req->addHeader("Authorization", "AWS " . $keyId . ":" . $signature);
    $req->setBody(file_get_contents($filePath));
    $req->sendRequest();

    echo $req->getResponseBody();
}
Run your signature against the Amazon S3 JavaScript signature tester.
If the two signatures do not match, you know something is wrong with the keys or signing procedure.
If the two match, your keys and signing process are correct and the problem is somewhere else.
The JS tester is invaluable for troubleshooting signature generation problems.
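Another way to troubleshoot outside PHP: recompute the HMAC-SHA1 with openssl and compare it against what Crypt_HMAC produces for the same string-to-sign (all values below are placeholders, not real credentials):

```shell
# Placeholder secret and string-to-sign; they must match your PHP values byte for byte.
SECRET_KEY="THIS IS MYSECRET"
STRING_TO_SIGN='PUT\n\naudio/mpeg\nTue, 27 Mar 2007 19:36:42 +0000\nx-amz-acl:public-read\n/bucket/file.mp3'
# -binary keeps the raw digest, matching the PHP chr(hexdec(...)) loop before base64_encode.
SIGNATURE=$(printf '%b' "$STRING_TO_SIGN" | openssl dgst -sha1 -hmac "$SECRET_KEY" -binary | base64)
echo "$SIGNATURE"
```

If the two base64 signatures differ for identical inputs, the bug is in the hex-to-raw conversion; if they match, look at the string-to-sign itself (the Date header and resource path are the usual culprits).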
