I'm trying to download files from a Google Drive link directly from Google's server to my web server, to work around the 100 MB maximum size of a PHP POST.
<?php
// Read the remote file and write it to the uploads directory under
// a randomized name, then redirect back to the index page.
$link = $_POST["linkToUpload"];
$upload = file_put_contents("uploads/test".rand().".txt", fopen($link, 'r'));
header("Location: index.php");
?>
With a normal link like http://example.com/text.txt it works fine. The problem comes with Google Drive links such as https://drive.google.com/uc?authuser=0&id=000&export=download. This is a direct link from Google Drive, but it doesn't work. So I tried the link I obtained by downloading the file locally, https://doc-08-bo-docs.googleusercontent.com/docs/securesc/000/000/000/000/000/000?e=download, and it still doesn't work. Do you think Google is trying to prevent server-to-server copies? Or is there another way to do it?
If you want to fetch files with your own application, you should use the Google Drive API.
Have a look at the file download documentation for Google Drive.
Example download snippet in PHP:
// Fetch the file's content directly by requesting alt=media.
$fileId = '0BwwA4oUTeiV1UVNwOHItT0xfa2M';
$response = $driveService->files->get($fileId, array(
    'alt' => 'media'));
$content = $response->getBody()->getContents();
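The snippet assumes $driveService has already been constructed. A minimal sketch of that setup, assuming the google/apiclient (v2) package and a service-account key file (credentials.json is a placeholder path):
<?php
// Minimal sketch: build the $driveService used above with a
// service-account credential; credentials.json is a placeholder.
require_once 'vendor/autoload.php';

$client = new Google_Client();
$client->setAuthConfig('credentials.json');
$client->addScope(Google_Service_Drive::DRIVE_READONLY);

$driveService = new Google_Service_Drive($client);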
Related
I'm trying to create a web application using PHP (Laravel 5.8) where users can paste a link to any video found on the Internet (YouTube, Dailymotion, etc.) and then cut the video.
Cutting the video in both the front end and the back end is easy to do; I'm using PHP-FFMpeg for the server side.
My problem is that I couldn't find a way to open a remote video from my PHP script, i.e. if a user pastes the link "https://www.dailymotion.com/video/x6rh0", I would open it and then make a clip.
This is my code:
$ffmpeg = FFMpeg\FFMpeg::create();
$url = "https://www.dailymotion.com/video/x6rh0";
$video = $ffmpeg->open($url);
// Extract a 15-second clip starting at the 30-second mark.
$clip = $video->clip(FFMpeg\Coordinate\TimeCode::fromSeconds(30), FFMpeg\Coordinate\TimeCode::fromSeconds(15));
$clip->save(new FFMpeg\Format\Video\X264('libmp3lame', 'libx264'), public_path().'/videos/video2.avi');
I'm using the Laravel PHP framework.
My question: how can I open a video from a URL using PHP-FFMpeg?
I have an app made using App Inventor.
From there, users select an image from their phone.
The app sends the file via the PostFile method to a PHP file.
The PHP file normally gets the file contents using:
$data = php_compat_file_get_contents('php://input');
But my host is GoDaddy, so I can't upload using this method.
So I want to use Cloudinary, but I can't get it to work. Is this due to the same GoDaddy shared-server restrictions?
Here's the Cloudinary upload code in the PHP file:
\Cloudinary::config(array(
    "cloud_name" => "rafsystems",
    "api_key" => "94993346XXXXXX",
    "api_secret" => "bIgBADFROG-aU1GFLfHEzeQjWs"
));
// Upload the raw request body straight from php://input.
$result = \Cloudinary\Uploader::upload("php://input", array("public_id" => $file1));
So what options do I have? I need to sort this out ASAP for myself and a client.
Thanks.
Please note that the api_secret must not be revealed to the public. Please generate a new pair of api_key and api_secret from your account's settings page.
We warmly recommend uploading the photo directly to Cloudinary from the browser (client-side upload), either signed or unsigned, and then sending the info, as returned by the upload response, to your PHP server.
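For the signed variant, your server only needs to produce a signature. Here is a rough sketch of Cloudinary's documented signing scheme (SHA-1 over the alphabetically sorted parameters with the API secret appended); the function and variable names are placeholders:
// Rough sketch of Cloudinary's documented upload-signing scheme;
// $apiSecret must come from server-side config, never client code.
function signUploadParams(array $params, $apiSecret) {
    ksort($params);                         // sort parameters alphabetically
    $pairs = [];
    foreach ($params as $key => $value) {
        $pairs[] = $key . '=' . $value;     // serialize as key=value
    }
    return sha1(implode('&', $pairs) . $apiSecret);
}

$params = ['public_id' => 'sample_id', 'timestamp' => time()];
$signature = signUploadParams($params, $apiSecret);
// Send $signature, your api_key, and the timestamp to the client,
// which includes them in its direct upload request to Cloudinary.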
I'm trying to allow clients to download large files from the Google Storage API behind an authenticated PHP script.
I was able to read/download small files using the following code:
// Fetch the object's metadata, then download via its mediaLink.
$object = $storage->objects->get($bucket, 'filename');
$request = new GuzzleHttp\Psr7\Request('GET', $object['mediaLink']);
// Authorize the request before sending.
$http = $client->authorize();
$response = $http->send($request);
$body = $response->getBody()->read($object->getSize());
$body will contain the entire content of the file, but some of these files might be 1 GB in size.
I tried using:
$stream = Psr7\stream_for($response->getBody());
but it doesn't work.
How would I be able to stream the download of the file to the client without loading it in memory?
Thanks.
Consider sending the client a signed URL so that the content is served directly from Google Cloud Storage, rather than trying to proxy the entire file download yourself.
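If you use the google/cloud-storage client library (a different package than the google-api-php-client shown above), a minimal sketch might look like this; the project, bucket, and object names are placeholders:
// Minimal sketch, assuming the google/cloud-storage package:
// issue a short-lived signed URL and redirect the client to it,
// so GCS serves the bytes and PHP never buffers the file.
require 'vendor/autoload.php';

use Google\Cloud\Storage\StorageClient;

$storage = new StorageClient(['projectId' => 'my-project']); // placeholder
$object = $storage->bucket('my-bucket')->object('large-file.bin');

$url = $object->signedUrl(new \DateTime('+15 minutes'));
header('Location: ' . $url);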
I have a website on an Ubuntu web server (not an app, and not hosted on App Engine) and I want to use Google Cloud Storage for uploading/downloading large files. I am trying to upload a file directly to Google Cloud Storage, which isn't working (maybe because I made some basic errors).
I have installed the Google Cloud SDK and downloaded and unzipped Google App Engine. If I now include CloudStorageTools.php, I get the error:
Class 'google\appengine\CreateUploadURLRequest' not found
My script looks like this:
require_once 'google/appengine/api/cloud_storage/CloudStorageTools.php';
use google\appengine\api\cloud_storage\CloudStorageTools;
$options = [ 'gs_bucket_name' => 'test' ];
$upload_url = CloudStorageTools::createUploadUrl('/test.php', $options);
If you want to use the functionality of Google App Engine (GAE), you will need to host on GAE, which will likely have a larger impact on your app architecture: it uses a custom Google-compiled PHP runtime with limited libraries and no local file handling, so all of that functionality has to go through the blobstore or GCS (Google Cloud Storage).
With a PHP app running on Ubuntu, your best bet is to use the google-api-php-client to connect to the Storage JSON API.
Unfortunately, the documentation for PHP is not very good. You can check my answer to "How to rename or move a file in Google Cloud Storage (PHP API)" to see how to GET/COPY/DELETE an object.
To upload, I would suggest retrieving a pre-signed upload URL, like so:
//get google client and auth token for request
$gc = \Google::getClient();
if ($gc->isAccessTokenExpired()) {
    $gc->getAuth()->refreshTokenWithAssertion();
}
$googleAccessToken = json_decode($gc->getAccessToken(), true)['access_token'];

//compose url for the resumable upload url request
$initUploadURL = "https://www.googleapis.com/upload/storage/v1/b/"
    .$bucket."/o?uploadType=resumable&name="
    .urlencode($file_dest);

//compose headers
$initUploadHeaders = [
    "Authorization"           => "Bearer ".$googleAccessToken,
    "X-Upload-Content-Type"   => $mimetype,
    "X-Upload-Content-Length" => $filesize,
    "Content-Length"          => 0,
    "Origin"                  => env('APP_ADDRESS')
];

//send request to retrieve the upload url
$req = $gc->getIo()->makeRequest(new \Google_Http_Request($initUploadURL, 'POST', $initUploadHeaders));

// pre-signed upload url that allows client-side upload
$presigned_upload_URL = $req->getResponseHeader('location');
With that URL sent to your client side, you can use it to PUT the file directly into your bucket with an upload script that generates an appropriate PUT request. Here's an example in AngularJS with ng-file-upload:
file.upload = Upload.http({
    url: uploadurl.url,
    skipAuthorization: true,
    method: 'PUT',
    filename: file.name,
    headers: {
        "Content-Type": file.type !== '' ? file.type : 'application/octet-stream'
    },
    data: file
});
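If you want to exercise the pre-signed URL without a browser, here is a rough PHP sketch of the same PUT; $presigned_upload_URL comes from the request above, and $file_src is a placeholder path to a local file:
// Rough sketch: PUT a local file to the pre-signed upload URL with
// curl, e.g. for testing; $file_src is a placeholder local path.
$fh = fopen($file_src, 'rb');
$ch = curl_init($presigned_upload_URL);
curl_setopt_array($ch, [
    CURLOPT_PUT            => true,
    CURLOPT_INFILE         => $fh,
    CURLOPT_INFILESIZE     => filesize($file_src),
    CURLOPT_HTTPHEADER     => ['Content-Type: application/octet-stream'],
    CURLOPT_RETURNTRANSFER => true,
]);
$result = curl_exec($ch);
curl_close($ch);
fclose($fh);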
Good luck - GCS is a tough one if you don't want to go all Google with App Engine!
The Google API PHP Client allows you to connect to any Google API, including the Cloud Storage API. Here's an example, and here's a getting-started guide.
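As a taste of what that looks like, a minimal sketch that lists a bucket's objects through the Storage JSON API (assuming the google/apiclient package; the bucket name and credentials are placeholders):
// Minimal sketch with google/apiclient: list a bucket's objects
// through the Cloud Storage JSON API; 'my-bucket' is a placeholder.
require_once 'vendor/autoload.php';

$client = new Google_Client();
$client->useApplicationDefaultCredentials();
$client->addScope(Google_Service_Storage::DEVSTORAGE_READ_ONLY);

$storage = new Google_Service_Storage($client);
$objects = $storage->objects->listObjects('my-bucket');
foreach ($objects->getItems() as $object) {
    echo $object->getName(), "\n";
}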
The Google docs state:
The GCS stream wrapper is built in to the run time, and is used when you supply a file name starting with gs://.
When I look into app.yaml, I see where the runtime is selected; I have selected the PHP runtime. However, when I try to write to my bucket, I get an error saying the wrapper is not found for gs://. But I tried writing to my bucket using the helloworld.php script that Google provides at https://cloud.google.com/appengine/docs/php/gettingstarted/helloworld, modified so that it says:
<?php
file_put_contents('gs://<app_id>.appspot.com/hello.txt', 'Hello');
I have to deploy the app in order for the write to be successful. I don't understand why I have to deploy the app every time to get the wrapper I need to write to my bucket. Why can't I write to my bucket from an arbitrary PHP script?
Google says:
"In the Development Server, when a Google Cloud Storage URI is specified we emulate this functionality by reading and writing to temporary files on the user's local filesystem"
So, "gs://" is simulated locally - to actually write to GCS buckets using the stream wrapper, it has to run from App Engine itself.
Try something like this:
use google\appengine\api\cloud_storage\CloudStorageTools;

$object_url = "gs://bucket/file.png";
// The stream context carries the object's ACL and Content-Type metadata.
$options = stream_context_create(['gs' => ['acl' => 'private', 'Content-Type' => 'image/png']]);
$my_file = fopen($object_url, 'w', false, $options);
fwrite($my_file, $file_data);
fclose($my_file);