Upload a video file to Akamai NetStorage using PHP

I started working on uploading files to Akamai NetStorage using PHP and referred to a few APIs on GitHub. I couldn't upload a video file, though I can create text files and write contents to them.
<?php
require 'Akamai.php';
$service = new Akamai_Netstorage_Service('******.akamaihd.net');
$service->authorize('key','keyname','version');
$service->upload('/dir-name/test/test.txt','sample text');
?>
I referred to this API, and a few others, but couldn't find the right way to upload a video or image file. The code above works perfectly. Now I need to upload a video file instead of writing contents to a text file.

There is a more modern library for Akamai's NetStorage, akamai-open/netstorage, which is built as an adapter for Flysystem.
Once you have it installed, you need to set up the authentication and the HTTP client (based on Guzzle):
$signer = new \Akamai\NetStorage\Authentication();
$signer->setKey($key, $keyName);

$handler = new \Akamai\NetStorage\Handler\Authentication();
$handler->setSigner($signer);

$stack = \GuzzleHttp\HandlerStack::create();
$stack->push($handler, 'netstorage-handler');

$client = new \Akamai\Open\EdgeGrid\Client([
    'base_uri' => $host,
    'handler' => $stack
]);

$adapter = new \Akamai\NetStorage\FileStoreAdapter($client, $cpCode);
And then you can create the filesystem object, and upload the file:
$fs = new \League\Flysystem\Filesystem($adapter);
// Upload a file:
// cpCode, action, content signature, and request signature are added transparently
// Additionally, all required sub-directories are created transparently
$fs->write('/path/to/write/file/to', $fileContents);
However, because you're uploading a video file, I would suggest using a stream rather than reading the contents into memory. To do this, use writeStream() instead:
$fs = new \League\Flysystem\Filesystem($adapter);
$stream = fopen('/path/to/local/file', 'r+');
$fs->writeStream('/path/to/write/file/to', $stream);
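One caveat worth adding: depending on the Flysystem version and adapter, the stream may or may not be closed for you after the write, so it is safest to guard the fclose(). A minimal sketch (the local and remote paths are placeholders):

$stream = fopen('/path/to/local/video.mp4', 'rb');
if ($stream === false) {
    throw new \RuntimeException('Could not open local file for reading');
}

$fs->writeStream('/path/to/write/file/to', $stream);

// Some adapters close the stream themselves, hence the guard.
if (is_resource($stream)) {
    fclose($stream);
}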

Related

Convert Local Excel.xlsx File to Google spreadsheet Google DRIVE API

I am trying to convert a local Excel .xlsx file, keeping all the design, formatting and formulas in it. How can I do that using the Google API with PHP?
What I was doing, which was not working, was:
$client = new \Google_Client();
$client->setApplicationName('Name');
$client->setScopes([\Google_Service_Drive::DRIVE]);
$client->setAccessType('offline');
$client->setAuthConfig($_SERVER['DOCUMENT_ROOT'] . '/credentials.json');
$service = new Google_Service_Drive($client);

$fileID = '';
$path = $_SERVER['DOCUMENT_ROOT'] . '/includes/files/';
$fileName = 'MAIN.xlsx'; // this is the file I want to convert to a Google sheet
$filePathName = $path.$fileName;
$mimeType = 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet';

$file->setMimeType($mimeType);
$createdFile = $service->files->copy($file, array(
    'data' => $filePathName,
    'mimeType' => $mimeType,
    'convert' => true,
));
But that is not working. How should I correct it?
I believe your goal is as follows.
You want to upload an XLSX file from your local PC to Google Drive.
When the XLSX file is uploaded, you want to convert it to a Google Spreadsheet.
You want to achieve this using the googleapis client for PHP.
Modification points:
In your script, the method "Files: copy" is used. This method copies a file that already exists on Google Drive, so it cannot be used to achieve your goal. I think this is the reason for your issue.
From your error message, I thought that you might be using Drive API v3. I guess that this might also contribute to your issue.
When your script is modified in light of the above points, it becomes as follows.
Modified script:
$service = new Google_Service_Drive($client); // Please use your $client here.

$path = $_SERVER['DOCUMENT_ROOT'] . '/includes/files/';
$fileName = 'MAIN.xlsx'; // this is the file I want to convert to a Google sheet
$filePathName = $path.$fileName;

$file = new Google_Service_Drive_DriveFile();
$file->setName('MAIN.xlsx');
$file->setMimeType('application/vnd.google-apps.spreadsheet');
$data = file_get_contents($filePathName);
$createdFile = $service->files->create($file, array(
    'data' => $data,
    'uploadType' => 'multipart'
));
printf("%s\n", $createdFile->getId());
Note:
This modified script supposes that you have already been able to get and put values for Google Drive using the Drive API. Please be careful of this.
With this method, the maximum file size is 5 MB. Please be careful of this. When you want to upload a larger file, please use a resumable upload (Ref); a rough sketch follows.
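For reference, here is a rough sketch of the resumable approach using the client library's documented Google_Http_MediaFileUpload pattern. The chunk size is an assumption (it must be a multiple of 256 KB); $client, $service, $file and $filePathName are as in the modified script above:

$chunkSize = 1 * 1024 * 1024; // 1 MB; must be a multiple of 256 KB

$client->setDefer(true); // defer execution so the upload can be chunked
$request = $service->files->create($file);

$media = new Google_Http_MediaFileUpload(
    $client,
    $request,
    'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
    null,
    true, // resumable
    $chunkSize
);
$media->setFileSize(filesize($filePathName));

// Upload the file in chunks; nextChunk() returns false until the
// final chunk, when it returns the created file resource.
$handle = fopen($filePathName, 'rb');
$status = false;
while (!$status && !feof($handle)) {
    $status = $media->nextChunk(fread($handle, $chunkSize));
}
fclose($handle);
$client->setDefer(false);

printf("%s\n", $status->getId());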
References:
Upload file data
Create files

Using ZipStream in Symfony: streamed zip download will not decompress using Archive Utility on Mac OSX

I have a Symfony 2.8 application, and when the user clicks the "download" button, I use the keys of several large files (images, video) on S3 to stream them to the browser as a zip file using ZipStream (https://github.com/maennchen/ZipStream-PHP).
The streaming of the files and the download as a zip (stored, not compressed) succeed in the browser, and the zip lands in Downloads. However, when attempting to open the zip on Mac OSX El Capitan using Archive Utility (the native archive software on OSX), it errors out. The error:
Unable to expand "test.zip" into "Downloads". (Error 2 - No such file or directory.)
I have seen older, identical issues on SO and attempted those fixes, specifically this post: https://stackoverflow.com/a/5574305/136151. I also followed up on the related issues & PRs in ZipStream and the upstream fixes in Guzzle etc.
The problem is, those fixes date back to 2011 and things have moved on since then, so applying them did not give me a working result.
Specific fixes I have tried:
1. Setting "version needed to extract" to 0x000A as suggested, and also to '20' as recommended in another post. I've set the same for "version made by".
2. Forcing the compression method to 'deflate' instead of 'stored' to see if I got a working result. A stored result is all I need, and it is suitable for a zip used as a container file for images & video.
I am able to extract the zip using a third-party archive app called The Unarchiver. However, users won't know about, and can't be expected to install, an alternative archive app to suit my web app. That's not an effective solution.
Does anyone have knowledge or experience of solving this issue and can help me out with how to resolve it?
NB: A zip streamed to the browser is the required solution. Downloading the assets from S3 to the server to create a zip, and then streaming the resulting zip to the browser, is not a solution, given the time and overhead of such an approach.
Added info if required:
Code & Setup:
- Files are stored on S3.
- Web app is Symfony 2.8 on PHP 7.0 running on EC2 with Ubuntu.
- Using aws-sdk-php, I create an S3Client with valid credentials and register the stream wrapper ($s3Client->registerStreamWrapper()) on it. This is to fetch files from S3 via fopen() for streaming into the ZipStream library:
$this->s3Client = $s3Client;
$this->s3Client->registerStreamWrapper();

// Initialize the ZipStream object and pass in the file name which
// will be what is sent in the content-disposition header.
// This is the name of the file which will be sent to the client.
// Define suitable options for the ZipStream archive.
$opt = array(
    'comment' => 'test zip file.',
    'content_type' => 'application/octet-stream'
);
$zip = new ZipStream\ZipStream($filename, $opt);

$keys = array(
    "zipsimpletestfolder/file1.txt"
);

foreach ($keys as $key) {
    // Get the file name in the S3 key so we can save it to the zip file
    // using the same name.
    $fileName = basename($key);
    $bucket = 'mg-test';
    $s3path = "s3://" . $bucket . "/" . $key;

    if ($streamRead = fopen($s3path, 'r')) {
        $zip->addFileFromStream($fileName, $streamRead);
    } else {
        die('Could not open stream for reading');
    }
}

$zip->finish();
Zip output results:
- Extraction on Mac via Archive Utility fails with error 2.
- Extraction on Mac with The Unarchiver works.
- Extraction on Windows with 7-Zip works.
- Extraction on Windows with WinRAR fails - says the zip is corrupted.
Response headers: (screenshot not reproduced here)
Edit
I'm open to using another method of streaming files to the browser as a zip that can be opened natively on Mac, Windows and Linux, without ZipStream if need be. Just not creating a zip file server-side and streaming it thereafter.
The issue with the downloaded zip was that it contained HTML from the response of the Symfony controller that was calling ZipStream->addFileFromStream(). Basically, while ZipStream was streaming data to build the zip download in the client's browser, a controller action response was also being sent and, best guess, the two were getting mixed up in the client's browser. Opening the zip file in a hex editor and seeing the HTML in there made the issue obvious.
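Incidentally, you can confirm this kind of pollution without a hex editor: a valid zip starts with the four-byte local-file-header signature "PK\x03\x04", so checking the first bytes of the downloaded file shows immediately whether something else leaked into the response. A quick sketch (the file name is just an example):

// Read the first four bytes of the downloaded archive.
$magic = file_get_contents('test.zip', false, null, 0, 4);

// A clean zip begins with "PK\x03\x04"; leaked HTML (e.g. "<!DO") shows up here instead.
var_dump($magic === "PK\x03\x04");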
To get this working in Symfony 2.8 with ZipStream, I used Symfony's StreamedResponse in the controller action and ran ZipStream inside the StreamedResponse's callback. To stream the S3 files, I passed an array of S3 keys and the S3 client into that callback. So:
use Symfony\Component\HttpFoundation\StreamedResponse;
use Aws\S3\S3Client;
use ZipStream;

//...

/**
 * @Route("/zipstream", name="zipstream")
 */
public function zipStreamAction()
{
    // test file on s3
    $s3keys = array(
        "ziptestfolder/file1.txt"
    );

    $s3Client = $this->get('app.amazon.s3'); // s3client service
    $s3Client->registerStreamWrapper(); // required

    $response = new StreamedResponse(function () use ($s3keys, $s3Client) {
        // Define suitable options for the ZipStream archive.
        $opt = array(
            'comment' => 'test zip file.',
            'content_type' => 'application/octet-stream'
        );

        // Initialise ZipStream with the output zip filename and options.
        $zip = new ZipStream\ZipStream('test.zip', $opt);

        // Loop over the keys - useful for multiple files.
        foreach ($s3keys as $key) {
            // Get the file name in the S3 key so we can save it to the
            // zip file using the same name.
            $fileName = basename($key);

            // Concatenate the s3:// path.
            $bucket = 'bucketname';
            $s3path = "s3://" . $bucket . "/" . $key;

            if ($streamRead = fopen($s3path, 'r')) {
                $zip->addFileFromStream($fileName, $streamRead);
            } else {
                die('Could not open stream for reading');
            }
        }

        $zip->finish();
    });

    return $response;
}
Solved. Maybe this helps someone else use ZipStream in Symfony.

Rename uploaded user image Google Cloud

I have images uploading to Google Cloud; however, I am not able to rename them to a random string before the image is uploaded.
I am wondering if anyone has a solution to this, please.
Code:
<?php
# Includes the autoloader for libraries installed with composer
require '../vendor/autoload.php';

use Google\Cloud\Storage\StorageClient;

putenv('GOOGLE_APPLICATION_CREDENTIALS=/PATH TO JSON');

$client = new Google_Client();
$client->useApplicationDefaultCredentials();
$client->setSubject($user_to_impersonate);

$storage = new StorageClient([
    'projectId' => 'XXXXXXXXXX'
]);
$bucket = $storage->bucket('abcd');

// Upload a file to the bucket.
$bucket->upload(
    fopen('./test.jpg', 'r') // I WANT TO RENAME THIS FILE BEING UPLOADED, AND YES I UNDERSTAND IT WILL BE A $_FILE TYPE WHEN I FINISH THE CODE.
);

// Download and store an object from the bucket locally.
$object = $bucket->object('test.jpg');
$object->downloadToFile('/cache/test2.jpg');
?>
OK, the answer is really easy. First, you need to read the file that you want to upload. Second, you will want to generate your new filename; for this I used two user properties (these will most likely stay the same or change very rarely) plus the timestamp of the request. My theory is that no two people should have the same personal information (IP address & user agent). [I will add the user's UID once I have finished the prototype.]
If anyone else would like the code, please see below - and please improve it & share the changes you would make (it would be great to see what we all come up with).
THIS IS FREE TO USE AND NO NEED TO GIVE CREDIT
$new_file_name = md5(time().md5($_SERVER['REMOTE_ADDR']).md5($_SERVER['HTTP_USER_AGENT']));
$path = "./test.jpg";
$ext = pathinfo($path, PATHINFO_EXTENSION);
$current = file_get_contents('./test.jpg');
$rand = $new_file_name.".".$ext;
echo $rand;

file_put_contents("./cache/".$rand, $current);

// Upload a file to the bucket.
$bucket->upload(
    fopen('./cache/'.$rand, 'r')
);
The only problem with it is that you are left with the new files in the cache folder, but you can add a delete step after the file has been uploaded - or skip the cache copy entirely, as sketched below.
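As a side note, the Storage client can rename during upload, which avoids the cache copy entirely: Bucket::upload() accepts an options array with a 'name' key. A minimal sketch, reusing $path and $rand from the code above:

// Upload straight from the original file, storing it under the new name.
$bucket->upload(
    fopen($path, 'r'),
    ['name' => $rand]
);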
Full code on GitHub: OFFICIAL GITHUB REPO

Files uploaded with Google Drive SDK not viewable

I have created a simple PHP script to play around with Google's Drive SDK. The plan is that eventually we will use Google Drive as a form of CDN for some of our web content (our company has already upgraded to 1 TB).
The code works to a degree, in that it successfully authenticates and uploads a file. The problem is, the file is always broken and cannot be viewed either within Drive itself or by downloading it.
The code is relatively simple, and just fetches an image from Wikipedia and attempts an upload:
<?php
require_once 'Google/Client.php';
require_once 'Google/Service/Drive.php';
require_once 'Google/Service/Oauth2.php';

$client = new Google_Client();
//$client->setUseObjects(true);
//$client->setAuthClass('apiOAuth2');
$client->setScopes(array('https://www.googleapis.com/auth/drive.file'));
$client->setClientId('***');
$client->setClientSecret('***');
$client->setRedirectUri('***');
$client->setAccessToken(authenticate($client));

// initialise the Google Drive service
$service = new Google_Service_Drive($client);

$data = file_get_contents('http://upload.wikimedia.org/wikipedia/commons/3/38/JPEG_example_JPG_RIP_010.jpg');

// create and upload a new Google Drive file, including the data
try
{
    // Insert a file
    $file = new Google_Service_Drive_DriveFile($client);
    $file->setTitle(uniqid().'.jpg');
    $file->setMimeType('image/jpeg');

    $createdFile = $service->files->insert($file, array(
        'data' => $data,
        'mimeType' => 'image/jpeg',
    ));
}
catch (Exception $e)
{
    print $e->getMessage();
}

print_r($createdFile);
?>
The print_r statement executes and we get information about the file. However, as I mentioned, the file is not viewable, and appears to be corrupt. Can anyone shed any light on what the issue may be?
After doing some more digging around in the docs (the current public docs are seriously out of date), I found that it's necessary to send another parameter as part of the insert() function's body parameter (the second argument in the function call).
Using the following:
$createdFile = $service->files->insert($doc, array(
    'data' => $content,
    'mimeType' => 'image/jpeg',
    'uploadType' => 'media', // this is the new info
));
I was able to get everything working. I'll leave the question here, as I think it will be very useful until such a time that Google actually updates the documentation for the PHP API.
Source info here
Your code seems to be OK.
Have you tried downloading this file from Google Drive afterwards and looking at it?
Also, I know it's a bit stupid, but have you tried writing the file to disk right after the file_get_contents() call? You know, just to establish with 100% certainty the point where it goes bad.
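To make that suggestion concrete, the check can be as small as this (the path is just an example):

// Persist the fetched bytes locally; if this JPEG opens fine, the source
// data is intact and the problem is in the upload call, not the download.
file_put_contents('/tmp/source_check.jpg', $data);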

Streaming private videos from Amazon S3

I need to display video/image files uploaded with ACL: PRIVATE to my Amazon S3 account on my WordPress blog.
I am a newbie to PHP OOP-based coding. Any script help, link references, free plugins or even a logical algorithm would be of great help :)
Thanks in advance.
This issue can be solved by implementing the following steps:
Download the latest stable version of the SDK from here
Extract the .zip file & place it in the wamp/www folder
Rename the config-sample.inc.php file to config.inc.php
Add the access key & secret key (retrieved from your Amazon S3 account) into the above file, then save & exit
Create a sample file to display public/private objects from Amazon S3
The content of the file should look as follows:
require('sdk.class.php');
require('services/s3.class.php');
$s3 = new AmazonS3();
$bucket = "bucketname";
$temp_link = $s3->get_object_url($bucket, 'your/folder/path/img.jpg', '5 minute');
echo $temp_link;
In the above code, the URL you receive as output is a signed URL for your private object, and is thus valid only for 5 minutes.
You may grant access until a future date and allow only authorized users to access your private content or media on Amazon S3.
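For example, assuming the same 1.x SDK as above, get_object_url() accepts a Unix timestamp or any string strtotime() understands, so a longer-lived link would look like this:

// A pre-signed URL valid for 24 hours instead of 5 minutes.
$temp_link = $s3->get_object_url($bucket, 'your/folder/path/img.jpg', '+24 hours');
echo $temp_link;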
This question is a little bit old, but I'm posting this anyway. I had a similar issue today and found out there's a simple answer.
The AWS docs explain it clearly and include an example as well:
http://docs.aws.amazon.com/aws-sdk-php-2/guide/latest/service-s3.html#amazon-s3-stream-wrapper
Basically, you need to register AWS' stream wrapper and use the s3:// protocol.
Here's my code sample.
use Aws\Common\Aws;
use Aws\S3\Enum\CannedAcl;
use Aws\S3\Exception\S3Exception;

$s3 = Aws::factory(array(
    'key' => Config::get('aws.key'),
    'secret' => Config::get('aws.secret'),
    'region' => Config::get('aws.region')
))->get('s3');

$s3->registerStreamWrapper();

// now read the file from s3
// from the doc:
// Open a stream in read-only mode
if ($stream = fopen('s3://bucket/key', 'r')) {
    // While the stream is still open
    while (!feof($stream)) {
        // Read 1024 bytes from the stream
        echo fread($stream, 1024);
    }
    // Be sure to close the stream resource when you're done with it
    fclose($stream);
}
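One thing the snippet leaves out is response headers: when proxying a private video through PHP like this, the browser needs at least a Content-Type before the first echo. A minimal sketch, assuming an MP4 (the filename is a placeholder):

// Must be sent before any body output.
header('Content-Type: video/mp4');
header('Content-Disposition: inline; filename="video.mp4"');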
