I'm able to upload a file to my Firebase Storage bucket via Node.js using the firebase-admin SDK, but when I go to the Firebase UI I cannot open the file. I noticed that files uploaded via the Firebase UI have an access token generated automatically, but not files uploaded via Node.js.
I have already tried several things, like setting metadata with downloadtokens and making the file public after it is uploaded. None of them worked.
How can I generate the access token via an API call rather than having to go to the UI and click "generate token" for each uploaded file?
Full answer as of July 2020:
It is possible to generate a UUID and use it as a token when uploading a file to Firebase Cloud Storage (aka Google Cloud Storage).
First import this package:
const uuidv4 = require('uuid/v4');
or
const { v4: uuidv4 } = require('uuid');
Then, generate a unique id before uploading the file:
const uuid = uuidv4();
Finally, upload your file:
The important part is to embed
metadata: { firebaseStorageDownloadTokens: uuid, }
in the metadata field of the upload function:
await admin.storage().bucket().upload(filePath, {
  destination: thumbFilePathForUpload,
  metadata: {
    metadata: {
      firebaseStorageDownloadTokens: uuid,
    },
  },
});
To check that it worked, click on the newly uploaded file directly from the Firebase Console; you should see a blue, clickable link next to your file. You can also find the access token under File Location, right under the preview.
For instance:
I use the code below and it works perfectly:
const uuidv4 = require('uuid/v4');
const uuid = uuidv4();
metadata: { firebaseStorageDownloadTokens: uuid }
To clarify @Rawan-25's answer further, what you want is:
bucket.upload(filename, {
  destination,
  metadata: {
    metadata: {
      firebaseStorageDownloadTokens: uuidv4(),
    },
  },
})
This is per this GitHub issue.
It's currently not possible to generate an access token with the Firebase Admin SDK. You'll have to do it using one of the client SDKs with the getDownloadUrl method on the StorageReference object. The token is only really intended for use with Firebase client apps.
However, the fact that you can't load a preview in the Firebase console for files uploaded with the Admin SDK is a known issue, and not the way the console is intended to work. The Firebase team knows about this, but you should still file a bug report with Firebase support to let them know you are affected by the issue.
After an extensive search I managed to find the answer through a Reddit post that referred to another Stack Overflow post, lol.
Please take a look at answer #2: Get Download URL from file uploaded with Cloud Functions for Firebase
I successfully tested uploading to a local server using traditional PHP.
However, I am having problems uploading to Amazon S3.
I wrote the PHP using the GitHub examples as a reference. Please tell me what I am doing wrong.
All the referenced scripts are in the proper location on my local system, and I am not making any CORS requests either.
Here are the specific code sections:
// UI instance
var s3uploader = new qq.s3.FineUploader({
  request: {
    endpoint: "bucket.s3.amazonaws.com",
    accessKey: "given key"
  },
  signature: {
    endpoint: "endpoint.php"
  },
  uploadSuccess: {
    endpoint: "endpoint.php?success"
  }
});
In endpoint.php I have assigned clientPrivateKey, bucketName and hostName, and I am assuming that the rest is best left untouched (including the composer.json file).
Errors:
1. Error attempting to parse signature
2. Received an empty or invalid server response
3.Policy signing failed
Further:
Are policy documents to be authored explicitly by ourselves?
How do I know if my bucket supports only version 4 signature?
You must include values for the following variables:
$clientPrivateKey = $_ENV['AWS_CLIENT_SECRET_KEY'];
$serverPublicKey = $_ENV['AWS_SERVER_PUBLIC_KEY'];
$serverPrivateKey = $_ENV['AWS_SERVER_PRIVATE_KEY'];
$expectedBucketName = $_ENV['S3_BUCKET_NAME'];
Additionally, if you are utilizing v4 signatures, you must also include a value for:
$expectedHostName = $_ENV['S3_HOST_NAME'];
If you are seeing signature errors, then either you have not set all of these values, or the AWS keys are incorrect.
Regarding your other two questions:
Are policy documents to be authored explicitly by ourselves?
No, Fine Uploader S3 creates these. Note that policy documents are only used for non-chunked uploads. For chunked uploads, the S3 multipart upload API is used, and your signature server is asked to sign a string of identifying headers instead.
How do I know if my bucket supports only version 4 signature?
http://docs.aws.amazon.com/general/latest/gr/rande.html#s3_region
I'm trying to get the URL of the MP4 file for a public video, as a PRO user, using Vimeo's API and PHP library.
$lib = new Vimeo($apiKey, $apiSecret, $accessToken);
$response = $lib->request('/videos/' . $video_id);
var_dump($response['body']);
This successfully gives me a response from the API, but it is missing the files array that should contain the link to the MP4 file (according to this forum response).
My access token has the private, public and interact scopes. Is there any other reason the files array could be missing?
For anyone else experiencing this, it may be caused by a missing 'Video Files' scope on the access token (starting from version 3.3+).
More info: https://github.com/vimeo/vimeo.php/issues/194
PRO users only have access to their own videos. If the access token is authenticated as the owner of $video_id you should be able to see the files key.
If you are unable to see the files key, contact us at https://vimeo.com/help/contact
I have a problem; any help is welcome.
I am developing a solution that needs to use Google APIs like Cloud Storage, Drive, etc.
On the classic profile page I have to upload a profile picture, then I use AngularJS to post the data to my App Engine / PHP / Yii backend.
The form PHP:

<?php
require_once 'google/appengine/api/cloud_storage/CloudStorageTools.php';
use google\appengine\api\cloud_storage\CloudStorageTools;

$options = ['gs_bucket_name' => 'seektrabajo'];
$upload_url = CloudStorageTools::createUploadUrl('/perfiles/subirFotoPerfil', $options);
?>
<input type="file" name="file"
       nv-file-select="uploader.uploadAll()"
       uploader="uploader" />
On submit, the data is sent to Google, and Google forwards it to the Yii ajax service:

public function actionSubirFotoPerfil() {
    $answer = array('answer' => 'Incompleted', 'serverFile' => 'sinfoto.png');
    if (!empty($_FILES)) {
        $filename = $_FILES['file']['name'];
        $gs_name = $_FILES['file']['tmp_name'];
        move_uploaded_file($gs_name, 'gs://seektrabajo/' . $filename);
        $answer = array('answer' => 'File transfer completed', 'serverFile' => $filename);
    }
    echo json_encode($answer);
    Yii::app()->end();
}
The problem is that the uploaded file is never saved to my bucket on Cloud Storage.
This does not work in development/local mode, nor when deployed to App Engine.
On App Engine I get the error shown in the browser console screenshot below. I already granted permission on the bucket:
gsutil acl ch -u user@gmail.com:FULL_CONTROL gs://seektrabajo
The user@gmail.com address is the one shown in my Google Console under APIs & authentication / Email.
Web browser console error screenshot: http://i.stack.imgur.com/noKnh.png
Does somebody have an idea?
Thanks.
Why are you uploading the files to a static handler and then uploading them to GCS? Use createUploadUrl() to template the <form> you serve: provide a callback URL on your own app to that function, and once the file is uploaded a request will go to your app with the rest of the form parameters, the file's metadata, etc.
With the pattern you are using, there is not only the headache of trying to figure out how to stream the data correctly; there's also the simple fact that you're handling all that read/write on your instance, which is surely not a good idea if you intend to use this in production code, where you pay for the resources used.
In short, there's no reason to use this pattern. In a traditional web app you would post the file form to a route on your own server and do as in your example code (minus forwarding to GCS), but with App Engine you need to think a little differently than in the traditional way of web development, where you purchase a single VM and host a PHP runtime on it.
I tried to retrieve the latest 10 photos from my Picasa account, but it doesn't work.
$file = file_get_contents("http://picasaweb.google.com/data/feed/api/user/firdawsfm?kind=photo&max-results=10&alt=json&access=public&thumbsize=".$tSize);
print_r($file);
the result :
{"$t":"http://picasaweb.google.com/data/feed/api/user/firdawsfm"},"updated":{"$t":"2013-09-08T19:27:11.010Z"},"category":[{"scheme":"http://schemas.google.com/g/2005#kind",
"term":"http://schemas.google.com/photos/2007#user"}],
"title":{"$t":"108451527358440546192","type":"text"},
"subtitle":{"$t":"","type":"text"},
"icon":{"$t":"http://lh3.ggpht.com/-Srl88atqmQE/AAAAAAAAAAI/AAAAAAAAAAA/AhcCTIASEAM/s64-c/108451527358440546192.jpg"},
"link":[{"rel":"http://schemas.google.com/g/2005#feed","type":"application/atom+xml",
"href":"http://picasaweb.google.com/data/feed/api/user/108451527358440546192?alt=json"},{"rel":"alternate",
"type":"text/html",
"href":"https://picasaweb.google.com/108451527358440546192"},{"rel":"http://schemas.google.com/photos/2007#slideshow",
"type":"application/x-shockwave-flash",
"href":"https://static.googleusercontent.com/external_content/picasaweb.googleusercontent.com/slideshow.swf?host=picasaweb.google.com&RGB=0x000000&feed=http://picasaweb.google.com/data/feed/api/user/108451527358440546192?alt%3Drss"},{"rel":"self","type":"application/atom+xml",
"href":"http://picasaweb.google.com/data/feed/api/user/108451527358440546192?alt=json&q=&start-index=1&max-results=10&kind=photo&thumbsize=180c&access=public"}],
"author":[{"name":{"$t":"Firdaws Haskell"},"uri":{"$t":"https://picasaweb.google.com/108451527358440546192"}}],
"generator":{"$t":"Picasaweb",
"version":"1.00",
"uri":"http://picasaweb.google.com/"},
"openSearch$totalResults":{"$t":0},
"openSearch$startIndex":{"$t":1},"openSearch$itemsPerPage":{"$t":10},
"gphoto$user":{"$t":"108451527358440546192"},"gphoto$nickname":{"$t":"Firdaws Haskell"},"gphoto$thumbnail":{"$t":"http://lh3.ggpht.com/-Srl88atqmQE/AAAAAAAAAAI/AAAAAAAAAAA/AhcCTIASEAM/s64-c/108451527358440546192.jpg"}}}
There is no data about the photos. When I tried this example with another account it worked. I verified that the photos are public.
I tried your URL and everything works fine; I can access the gphoto$id and media$group values.
So far all seems OK ;) Try again!
Maybe you didn't have public photos there at that time...
This answer is not directly relevant to the question, but covers the case where the server requests authorization:
For all Picasa Web Albums API queries with alt=json or alt=json-in-code and /userid/default/, you must provide an access_token parameter.
You can get an access token using the OAuth2 authorization work-flow, as described here:
http://code.google.com/p/google-api-php-client/wiki/OAuth2 (using the google-api-php-client SDK, for example)
using this value in the scopes: "http://picasaweb.google.com/data/".
More on how to do OAuth2 and get an access token from https://accounts.google.com/o/oauth2/token after requesting user login at https://accounts.google.com/o/oauth2/auth can be found on the official website:
https://developers.google.com/accounts/docs/OAuth2Login
In the end you should have:
$file = file_get_contents("http://picasaweb.google.com/data/feed/api/user/firdawsfm?kind=photo&max-results=10&alt=json&access=public&thumbsize=".$tSize."&access_token=".$access_token);
print_r($file);
I wish to build an application with which I can record video (along with audio) and also audio alone (preferably in MP3 format).
From some research I did, I found I need a client app in Flash or Flex and an RTMP server (Red5 preferred, as it's free).
This is the code I used to get the cam working in Flash:
var camera:Camera = Camera.getCamera();
var video:Video = new Video();
video.attachCamera(camera);
addChild(video);
The problem is, I don't know how to send the stream to Red5.
Also, what do I need to do so that I can store the video per user? The website I am creating is in PHP/MySQL and needs to have each user's own videos and audio recordings. I love the way Facebook has integrated video recording.
Check this: http://www.actionscript.org/resources/articles/615/2/Getting-started-with-red5-server/Page2.html
It explains how to connect and use RED5 and gives you an example.
Here's the exact AS3 code for publishing video from Flash to a media server like Red5, Wowza or AMS:

// init vars
public var nc:NetConnection;
public var ns:NetStream;

// net connection to the media server
nc = new NetConnection();
nc.connect("rtmp://yourmediaserver/oflaDemo/instance");

// net stream through which the recording data is sent
ns = new NetStream(nc);

// attach cam and mic to the net stream
ns.attachCamera(Camera.getCamera());
ns.attachAudio(Microphone.getMicrophone());

// send the data to the media server
ns.publish("streamName", "record");
For audio only, comment out the ns.attachCamera line.
Flash Player can't encode MP3 sound (it can only decode it). You'll get sound encoded with NellyMoser ASAO; Speex is also an option. See this answer for more details.
oflaDemo is a Red5 app that supports video recording that's shipped with Red5.
For a (commercial) Flash/HTML video recording solution that supports Red5 and PHP you should check out https://hdfvr.com.
Also, what do I need to do so that I can store the video according to the user.
Just execute a PHP script (from the Flash client) that saves the info in the database. You can use POST or GET to send the video data, and sessions or cookies to retrieve the user data.
var video:Video;
var camera:Camera = Camera.getCamera();
camera.addEventListener(ActivityEvent.ACTIVITY, active);
video = new Video();
video.attachCamera(camera);
function active(event:Event):void
{
    addChild(video);
    camera.removeEventListener(ActivityEvent.ACTIVITY, active);
}