Backblaze Storage B2 PHP Download with Authorization from Private Bucket - php

I'm struggling to understand the documentation for Backblaze B2. I want to download files from a private bucket using PHP: I know the file ID, and I want to get a short-lived authorization (say 30 seconds) and then generate a download link (presigned?).
b2_authorize_account
b2_get_download_authorization
b2_download_file_by_name
I'm just not quite sure how to put the example codes all together, pardon my ignorance.
https://www.backblaze.com/b2/docs/downloading.html

I spent a while looking for the answer to this as well. Apparently it is possible.
Essentially the steps are as follows:
On the server, make a GET request to the b2_authorize_account endpoint, to get an auth token (https://www.backblaze.com/b2/docs/b2_authorize_account.html)
Also on the server, make a POST request to the b2_get_download_authorization endpoint to get a more specific auth token used just for downloading (https://www.backblaze.com/b2/docs/b2_get_download_authorization.html)
Form the download URL by combining the downloadUrl from step 1 with the path to the specific file, then appending the token from step 2 as a query param (this is the part that's very difficult to find in their documentation).
Send the download URL out to the browser, and the user can click it to access the file.
So you'll get something like
https://f001.backblazeb2.com/file/bucket-name/path/to/file?Authorization={token}
You can restrict the length of time that the auth token is valid, and restrict it so that only files which have a particular prefix can be accessed. So if you want to make a particular private file available to a particular user that you've already authenticated, you can use these steps to generate a token with a short lifetime that only works for the specific file (by using the full file name as the prefix).
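Putting those steps together, a rough PHP/cURL sketch might look like the following. The key ID, application key, bucket ID and file name are placeholders, and this assumes the v2 API endpoints:

<?php
// Sketch only: placeholder credentials, bucket ID and file name; assumes the B2 v2 endpoints.
$keyId  = "your_application_key_id";
$appKey = "your_application_key";

// Step 1: b2_authorize_account (GET with HTTP Basic auth)
$ch = curl_init("https://api.backblazeb2.com/b2api/v2/b2_authorize_account");
curl_setopt($ch, CURLOPT_HTTPHEADER, array("Authorization: Basic " . base64_encode("$keyId:$appKey")));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$account = json_decode(curl_exec($ch), true);
curl_close($ch);

// Step 2: b2_get_download_authorization (POST), scoped to one file for 30 seconds
$body = json_encode(array(
    "bucketId"               => "your_bucket_id",
    "fileNamePrefix"         => "path/to/file",   // full file name => token only works for this file
    "validDurationInSeconds" => 30,
));
$ch = curl_init($account["apiUrl"] . "/b2api/v2/b2_get_download_authorization");
curl_setopt($ch, CURLOPT_HTTPHEADER, array("Authorization: " . $account["authorizationToken"]));
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $body);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$download = json_decode(curl_exec($ch), true);
curl_close($ch);

// Step 3: build the link the browser can use for the next 30 seconds
$url = $account["downloadUrl"] . "/file/bucket-name/path/to/file"
     . "?Authorization=" . $download["authorizationToken"];
echo $url;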

Related

restrict access to amazon s3 file to only allow logged in users access

When I go to the URL of my bucket file it downloads straight away. However, I only want users that are logged into my application to have access to these files.
I have been searching for hours but cannot find out how to do this in PHP from my app. I am using Laravel, so the code may not look familiar, but essentially it just generates the URL to my bucket file and then redirects to that link, which downloads it:
$url = Storage::url('Shoots/2016/06/first video shoot/videos/high.mp4');
return redirect($url);
How can I make this file accessible only to users logged into my application?
We ran into a similar issue for an application I'm working on. The solution we ended up with is generating S3 signed URLs that have short expiration times on them. This allows us to generate a new signed link with every request to the web server and pass that link to our known authenticated user, who then has access for a very limited amount of time (a few seconds). In the case of images we wanted to display in the DOM, we had our API respond with an HTTP 303 (See Other) redirect to the signed URL, which expired within a couple of seconds. This allowed the browser time to download the image and display it before the link expired.
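In Laravel specifically, the s3 disk exposes temporaryUrl() for exactly this (on framework versions that support it). A minimal sketch using the path from the question, with a hypothetical route, might look like:

use Illuminate\Support\Facades\Route;
use Illuminate\Support\Facades\Storage;
use Carbon\Carbon;

Route::get('/video/high', function () {
    // Behind the 'auth' middleware, so only logged-in users ever reach this point
    $url = Storage::disk('s3')->temporaryUrl(
        'Shoots/2016/06/first video shoot/videos/high.mp4',
        Carbon::now()->addSeconds(30) // the signed link dies shortly after we hand it out
    );

    return redirect($url);
})->middleware('auth');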
A couple of risks around this solution: a user could programmatically request a signed URL and share it with another service before it expires, or an unauthenticated user intercepting network traffic could capture the request and make it themselves. We felt these were enough of an edge case that we were comfortable with our solution.

Moodle: get user picture from webservice

By using the Moodle Web Service (REST) call core_user_get_users_by_field I successfully get my user details. One of the returned fields is profileimageurl, which links to my user profile picture and looks like this:
http://my_moodle_server/pluginfile.php/89/user/icon/f1
Unfortunately that link works only in a browser where I have already logged in, otherwise it will redirect to a standard user icon (the grey anonymous face).
So in order to get the actual picture from a client app that is using Moodle web services, I think I have to call core_files_get_files and pass the correct values. So I tried to remap that link to the call's parameters like this:
contextid: 89
component: "user"
filearea: "icon"
itemid: 0
filepath: "/"
filename: "f1.png" (also tried without .png)
and of course my valid token
but all I get is:
{"parents":[],"files":[]}
The parameters seem to be formally correct (otherwise I would get an exception), but I only get an empty response, which tells me that some values are not correct.
OK, I found the solution to my problem. I'm posting the answer here also because there's not much information about Moodle web services around...
First of all, core_files_get_files is not the way... it will only show file information; it won't give you the actual file contents (binary).
Fortunately, there's an equivalent URL to be used when calling from external client app:
http://my_moodle_server/webservice/pluginfile.php
It accepts the same parameters/format as http://my_moodle_server/pluginfile.php and in addition you can also pass your token for web service authentication.
So the profileimageurl field returned by core_user_get_users_by_field, which looks like this:
http://my_moodle_server/pluginfile.php/89/user/icon/f1
can be turned into
http://my_moodle_server/webservice/pluginfile.php/89/user/icon/f1?token=my_web_service_token
Also note that appending the ?token= parameter is required.
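So, from a PHP client, a minimal sketch of fetching the picture (host, token, and output file name are placeholders for your own installation) would be:

<?php
// profileimageurl as returned by core_user_get_users_by_field
$profileimageurl = "http://my_moodle_server/pluginfile.php/89/user/icon/f1";
$wstoken = "my_web_service_token";

// Swap /pluginfile.php for /webservice/pluginfile.php and append the token
$wsurl = str_replace("/pluginfile.php", "/webservice/pluginfile.php", $profileimageurl)
       . "?token=" . $wstoken;

$ch = curl_init($wsurl);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$imageData = curl_exec($ch);
curl_close($ch);

file_put_contents("profile_f1.png", $imageData); // raw image bytes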

Google Cloud Storage - Knowing who uploaded

I'm currently porting a webservice I've built to work with Google App Engine. One of the main functions in the webservice is to upload an image (a profile picture, for example). Currently what I do is:
Authenticate the user who wants to upload the image using his unique API KEY and other data.
Upload the file, give it a unique name, and store the name of the file inside the user's row in the MySQL database.
Now in order to port the file upload to App Engine I'm using Google Cloud Storage and following this tutorial:
https://cloud.google.com/appengine/docs/php/googlestorage/user_upload
I'm trying to get the file upload to work with my Android app the following way:
The user makes a request, the webservice authenticates him and responds with the upload URL created by CloudStorageTools::createUploadUrl.
The user uploads the image to this URL
Now here's the problem: after the upload is done, a POST is made with the uploaded file to the PHP file given to createUploadUrl (I quote from Google's docs). But how can this script know who uploaded the file it got? I can't pass any parameters indicating who uploaded the file to createUploadUrl, so I can't associate the file name with a user in the Cloud SQL database, and now there's only a file in Cloud Storage not associated with anything.
Any hints? Am I missing something?
I'm posting this as a separate answer because a) it's a different approach than the first answer, and b) I'd advocate for my other answer over this one because I think it's best to let GAE handle auth. However, I think you can do what you're trying to do this way:
Instead of routing a singular URL to your upload handler, use a regex match like this in your app.yaml to route any matching URLs to your handler:
handlers:
- url: /upload_handler/(.*)
  script: my-php-script-that-uploads-stuff.php
Then when invoking createUploadUrl, simply pass in your API_KEY after the 'upload_handler/' as a query argument, e.g.
$upload_url = CloudStorageTools::createUploadUrl(sprintf('/upload_handler/?API_KEY=%s', $API_KEY), $options);
Then in my-php-script-that-uploads-stuff.php:
parse_str(parse_url($_SERVER['REQUEST_URI'], PHP_URL_QUERY), $query);
$api_key = $query['API_KEY'];
This parses the request URL to get the query string, then parses the query string into $query, so $api_key ends up holding the value passed in the URL.
I just tested this pattern of extracting stuff from the request URL in a PHP script with dev_appserver and it worked.
I think you can do this by using the Google App Engine Users service in your PHP script.
From the Overview page of the Users service, here's how you get the current user:
use google\appengine\api\users\User;
use google\appengine\api\users\UserService;
$user = UserService::getCurrentUser();
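A rough sketch of the upload handler that createUploadUrl posts to could then identify the uploader and tie the object to them. The form field name and the user_files table are hypothetical; with createUploadUrl the uploaded file is available through $_FILES:

<?php
use google\appengine\api\users\UserService;

$user = UserService::getCurrentUser();
if ($user === null) {
    header('HTTP/1.1 401 Unauthorized');
    exit;
}

// 'uploaded_file' is whatever your upload form field was called
$gsName = $_FILES['uploaded_file']['tmp_name'];

// Store the association, e.g. in Cloud SQL (connection details omitted):
// INSERT INTO user_files (user_email, gs_object) VALUES (:email, :object)
$email = $user->getEmail();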

Generating encoded URLs for file sharing stored on Amazon S3

I am working on file sharing for objects stored on Amazon S3. The path of an object stored on S3 is by default something like https://s3.amazonaws.com/bucket_name/path_to_file/file_name.jpg (or .docx, etc.). Now I want to share these file URLs via email through my app.
Currently when I share, I see the entire URL as-is in the email. I want it to be sent in an encoded form so that it's hard to guess the exact location of the file.
I am using PHP and I was planning to use the base64_encode/decode functions or to md5 the URLs, but I'm not sure if that's the right way to go.
So, I am looking for some tool or API (by Amazon or a 3rd party) that can do it for me.
I would also like to shorten the URLs while sharing.
I would like to seek advice and guidance from someone who has implemented something similar.
Not sure if it comes under URL rewriting, but I'm tagging it under it.
Thank you
You have two options to do this:
You can map your own URL to S3's URL on your server, and give your URL to the user. When the user makes a request to your URL, redirect it to S3's. See http://en.wikipedia.org/wiki/URL_redirection
You can call an API provided by a URL-redirect/shortener service, e.g. tiny.cc/api-docs
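A minimal sketch of option 1 in PHP (the share_links table, connection details and domain are all hypothetical): hand out an opaque random token instead of the S3 URL, store the mapping, and let a small redirect script resolve it:

<?php
// When sharing: create a hard-to-guess token and store the mapping (random_bytes needs PHP 7+)
$pdo   = new PDO("mysql:host=localhost;dbname=app", "user", "pass");
$s3Url = "https://s3.amazonaws.com/bucket_name/path_to_file/file_name.jpg";
$token = bin2hex(random_bytes(16));
$pdo->prepare("INSERT INTO share_links (token, s3_url) VALUES (?, ?)")
    ->execute(array($token, $s3Url));
$shareUrl = "https://example.com/f.php?t=" . $token; // put this in the email instead of the S3 URL

// In f.php: look the token up and redirect to the real object
$stmt = $pdo->prepare("SELECT s3_url FROM share_links WHERE token = ?");
$stmt->execute(array($_GET['t']));
if ($row = $stmt->fetch()) {
    header("Location: " . $row['s3_url']);
} else {
    http_response_code(404);
}

Unlike base64_encode (reversible) or an md5 of the URL (deterministic), a random token reveals nothing about the underlying object location.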

1 simultaneous download + force waiting time + Amazon S3

I must say this is the first time I've asked anything here, and I'm not a developer, so please be patient with my lack of knowledge. This requirement is for a website I am creating with some friends, so it's not that I'm making money with this.
This is the problem: I want to implement some kind of restriction to downloads, very much in the same way Rapidshare or any other file sharing service does:
The user should be able to download only 1 file simultaneously
The user should wait before being able to download another file, let's say 2 hours.
However, I am not trying to create a file sharing website. I am going to upload all the files to Amazon S3, and the only thing I need is to be able to restrict the downloads. I will create the links to the files. I don't care if users are registered or not, they should be able to download anyway.
The website is built in Joomla!, which uses Apache + MySQL. The files would be located at Amazon's servers.
My question is the following. Is there any way to implement this in a not-so-extremely-complicated way? Do you know some script or web-service that could help me get this done?
I have looked around, but the only thing I've found are Payment gateways, and we don't plan to charge for downloads.
Any help will be much appreciated.
Thanks!
UPDATE: I solved this problem using this script: http://www.vibralogix.com/linklokurl/features.php
As far as I know, there is no way to check on the current status of a download from S3. Having said that, S3 really does have plenty of bandwidth available, so I wouldn't worry too much about overloading their servers :) Just last week, Amazon announced that S3 is now serving an average of 650,000 objects / second.
If you want to implement something like @Pushpesh's solution in PHP, one solution would be to use the Amazon SDK for PHP and do something like this:
<?php
# Generate a presigned S3 URL to download the S3 object from

# Include the (legacy 1.x) AWS SDK for PHP and create the S3 client
require_once("./aws-sdk/sdk.class.php");
$s3 = new AmazonS3();

# Let S3 know which file we want to be downloading
$s3_bucket_name  = "yours3bucketname";
$s3_object_path  = "folder1/object1.zip";
$s3_url_lifetime = "10 minutes";
$filename        = "download.zip";

# Check whether the user has already downloaded a file in the last two hours
# (your own IP/session tracking logic goes here)
$user_can_download = true;

if ($user_can_download) {
    $s3_url = $s3->get_object_url($s3_bucket_name, $s3_object_path, $s3_url_lifetime, array(
        'response' => array(
            'content-type'        => 'application/force-download',
            'content-disposition' => "attachment; filename={$filename}",
        ),
    ));
    header("Location: {$s3_url}");
} else {
    echo "Sorry, you need to wait a bit longer before you can download a file again...";
}
?>
This uses the get_object_url function, which generates pre-signed URLs that allow you to let others download files you've set to private in S3 without making these files publicly available.
As you can see, the link this generates will only be valid for 10 minutes, and it's a unique link. So you can safely let people download from this link without having to worry about people spreading the link: the link will have expired. The only way people can get a new, valid link is to go through your download script, which will refuse to generate a new link if the IP/user that is trying to initiate a download has already exceeded their usage limit. It's important that you set these files to private in S3, though: if you make them publicly available, this won't do much good. You probably also want to take a look at the docs for the S3 API that generates these pre-signed URLs.
Only two ways come to my mind: you either copy the file to a location with a unique hash and let Apache serve it, in which case you don't have any control over when the user actually starts or ends the download (useful for big files), or you pass it through PHP. Even then, you would need to kill the download session in case the user stops the download.
If there is no plugin for that, you won't be able to do it the easy way by adding a script or copy-pasting "some" code.
So either hire somebody, or you'll need to learn what's required to approach the task on your own, in other words learn how to program. Your question already contains the steps you need to implement: record who is downloading what and when, and keep track of the status of the download.
I have not tried to track a download's status before, and I'm not sure if it is possible to get the status directly from the server that is sending the file. But you can get it from the client: Download Status with PHP and JavaScript
I'm also not sure whether this will work properly in your scenario, because the file will come from S3.
S3 itself has a so-called "query string authentication" feature:
With query string authentication, you have the ability to share Amazon S3 objects through URLs that are valid for a predefined expiration time.
So you need to look up the S3 API to figure out how to implement this.
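With the current AWS SDK for PHP (v3), a sketch of generating such a time-limited URL looks roughly like this (bucket, key and region are placeholders, and credentials are assumed to come from the environment):

<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client(array(
    'version' => 'latest',
    'region'  => 'us-east-1', // your bucket's region
));

$cmd = $s3->getCommand('GetObject', array(
    'Bucket' => 'your-bucket',
    'Key'    => 'path/to/file.zip',
));

// The URL stops working after 10 minutes
$request = $s3->createPresignedRequest($cmd, '+10 minutes');
$presignedUrl = (string) $request->getUri();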
What you could try is to send an AJAX request to your server when the user clicks the download link, send back the Amazon S3 link your server generated as the response, and have the client-side JavaScript trigger the file download.
You can monitor user downloads by their IP address, store it in a database along with the time of the download and the session ID (hashed, of course), and check this before each download request. If less than 2 hours have passed within the same session, block the request; otherwise allow the download.
Table Structure:
Id  Ip_Addr        Session_Hash  Timestamp
1   10.123.13.67   sdhiehh_09#   1978478656
2   78.86.56.11    mkdhfBH&^#    1282973867
3   54.112.192.10  _)*knbh       1445465565
This is a very basic implementation. I'm sure more robust methods exist. Hope this helps.
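A rough PHP sketch of that check, using the table above (the PDO connection details and exact column names are assumptions):

<?php
session_start();
$pdo = new PDO("mysql:host=localhost;dbname=app", "user", "pass");

$ip          = $_SERVER['REMOTE_ADDR'];
$sessionHash = hash('sha256', session_id());

// Most recent download for this IP or session
$stmt = $pdo->prepare("SELECT MAX(Timestamp) FROM downloads WHERE Ip_Addr = ? OR Session_Hash = ?");
$stmt->execute(array($ip, $sessionHash));
$lastDownload = (int) $stmt->fetchColumn();

if (time() - $lastDownload < 2 * 3600) {
    exit("Please wait before starting another download.");
}

// Record this download, then hand out the short-lived S3 link (see the presigned URL examples above)
$pdo->prepare("INSERT INTO downloads (Ip_Addr, Session_Hash, Timestamp) VALUES (?, ?, ?)")
    ->execute(array($ip, $sessionHash, time()));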
