I am trying to save images from an image URL to Amazon S3. The image is created in the bucket, but it is not shown in the browser; it displays the message "The image cannot be displayed because it contains errors."
This is my code:
require_once("aws/aws-autoloader.php");
// Amazon S3
use Aws\S3\S3Client;
// Create an Amazon S3 client object
$s3Client = S3Client::factory(array(
'key' => 'XXXXXXXXXXXX',
'secret' => 'XXXXXXXXX'
));
// Register the stream wrapper from a client object
$s3Client->registerStreamWrapper();
// Save Thumbnail
$s3Path = "s3://smmrescueimages/";
$s3Stream = fopen($s3Path . 'gt.jpg', 'w');
fwrite($s3Stream, 'http://sippy.in/gt.jpg');
#fclose($s3Stream);
echo "done";
This is the image path generated https://s3.amazonaws.com/smmrescueimages/gt.jpg
Change this line:
fwrite($s3Stream, 'http://sippy.in/gt.jpg');
to:
fwrite($s3Stream, file_get_contents('http://sippy.in/gt.jpg'));
Otherwise you save the URL string instead of the image binary into your file.
Don't use # to suppress error messages from PHP functions!
Instead, just check whether a valid handle was returned:
$s3Stream = fopen($s3Path . 'gt.jpg', 'w');
if( $s3Stream ) {
fwrite($s3Stream, file_get_contents('http://sippy.in/gt.jpg'));
fclose($s3Stream);
}
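In the same spirit, it is also worth checking that the download itself succeeded before writing anything to the bucket. A minimal sketch using the same bucket path and URL as above:
$imageData = file_get_contents('http://sippy.in/gt.jpg');
if ($imageData !== false) {
    $s3Stream = fopen($s3Path . 'gt.jpg', 'w');
    if ($s3Stream) {
        fwrite($s3Stream, $imageData); // write the image binary, not the URL
        fclose($s3Stream);
    }
}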
I want to read the CSV file content using PHP and the Google Drive API v3.
I got the file ID and file name, but I am not sure how I can read the file content.
$service = new Drive($client);
$results = $service->files->listFiles();
$fileId="1I****************";
$file = $service->files->get($fileId);
The Google Drive API is a file storage API. It allows you to upload, download, and manage the storage of files.
It does not give you access to the contents of the file.
To do this you would need to download the file and open it locally.
Alternatively, since it's a CSV file, you may want to consider converting it to a Google Sheet; then you could use the Google Sheets API to access the data within the file programmatically.
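If you go the Google Sheets route, the conversion itself can be done through the Drive API by copying the CSV file with a Google Sheets MIME type. A rough sketch, reusing $service and $fileId from above (the exact DriveFile class name and the target sheet name depend on your client library version and are assumptions here):
$copyMetadata = new Drive\DriveFile();
$copyMetadata->setMimeType('application/vnd.google-apps.spreadsheet');
$copyMetadata->setName('my-csv-as-a-sheet'); // hypothetical target name
$sheetFile = $service->files->copy($fileId, $copyMetadata);
// $sheetFile->id can then be used with the Google Sheets API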
Code for downloading a file from the Google Drive API would look something like this.
The full sample can be found here: large-file-download.php
// If this is a POST, download the file
if ($_SERVER['REQUEST_METHOD'] == 'POST') {
// Determine the file's size and ID
$fileId = $files[0]->id;
$fileSize = intval($files[0]->size);
// Get the authorized Guzzle HTTP client
$http = $client->authorize();
// Open a file for writing
$fp = fopen('Big File (downloaded)', 'w');
// Download in 1 MB chunks
$chunkSizeBytes = 1 * 1024 * 1024;
$chunkStart = 0;
// Iterate over each chunk and write it to our file
while ($chunkStart < $fileSize) {
$chunkEnd = $chunkStart + $chunkSizeBytes;
$response = $http->request(
'GET',
sprintf('/drive/v3/files/%s', $fileId),
[
'query' => ['alt' => 'media'],
'headers' => [
'Range' => sprintf('bytes=%s-%s', $chunkStart, $chunkEnd)
]
]
);
fwrite($fp, $response->getBody()->getContents());
$chunkStart = $chunkEnd + 1;
}
// close the file pointer
fclose($fp);
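Once the file has been downloaded, reading the CSV content is plain PHP. A minimal sketch, assuming the file was saved as 'Big File (downloaded)' as in the snippet above:
$handle = fopen('Big File (downloaded)', 'r');
if ($handle) {
    while (($row = fgetcsv($handle)) !== false) {
        // $row is an array of the values from one CSV line
        print_r($row);
    }
    fclose($handle);
}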
I want to create a PHP function, triggered in JS, that can:
Retrieve all AWS S3 bucket files from a specific folder (I can also provide the path of each file)
Create a zip containing all the S3 files
Download the zip when the trigger is hit (CTA)
I'm able to download a single file with the getObject method from this example. However, I can't find any information on how to download multiple files and zip them.
I tried the downloadBucket method; however, it downloads all the files into my project directory and not as a zip file.
Here is my code:
<?php
// AWS Info + Connection
$IAM_KEY = 'XXXXX';
$IAM_SECRET = 'XXXXX';
$region = 'XXXXX';
require '../../vendor/autoload.php';
use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;
$FolderToDownload="FolderToDownload";
// Connection OK
$s3 = S3Client::factory(
array(
'credentials' => array(
'key' => $IAM_KEY,
'secret' => $IAM_SECRET
),
'version' => 'latest',
'region' => $region
)
);
try {
$bucketName = 'XXXXX';
$destination = 'NeedAZipFileNotAFolderPath';
$options = array('debug'=>true);
// ADD All Files into the folder: NeedAZipFileNotAFolderPath/Files1.png...Files12.png...
$s3->downloadBucket($destination,$bucketName,$FolderToDownload,$options);
// Looking for a zip file that can be downloaded
// Can I use downloadBucket? Is there a better way to do it?
// if needed I can create an array of all files (paths) that needs to be added to the zip file & dl
} catch (S3Exception $e) {
echo $e->getMessage() . PHP_EOL;
}
?>
If someone can help, that would be nice.
Thanks
You can use ZipArchive and list the objects from a prefix (the bucket's "folder"). In this case, I also use registerStreamWrapper to read each key and add its contents to the zip file being created.
Something like this:
<?php
require '/path/to/sdk-or-autoloader';
$s3 = Aws\S3\S3Client::factory(/* sdk config */);
$s3->registerStreamWrapper();
$zip = new ZipArchive;
$zip->open('/path/to/zip-you-are-creating.zip', ZipArchive::CREATE);
$bucket = 'your-bucket';
$prefix = 'your-prefix-folder'; // ex.: 'image/test/folder/'
$objects = $s3->getIterator('ListObjects', array(
'Bucket' => $bucket,
'Prefix' => $prefix
));
foreach ($objects as $object) {
$contents = file_get_contents("s3://{$bucket}/{$object['Key']}"); // get file
$zip->addFromString($object['Key'], $contents); // add file contents in zip
}
$zip->close();
// Download the zip file
header("Content-Description: File Transfer");
header("Content-Type: application/octet-stream");
header("Content-Disposition: attachment; filename=\"zip-you-are-creating.zip\"");
readfile('/path/to/zip-you-are-creating.zip');
?>
You can also delete the zip file after the download, if you prefer.
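For example, right after the readfile() call:
unlink('/path/to/zip-you-are-creating.zip'); // remove the temporary zip from the server once it has been sent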
This must work in most cases, but I don't want to save the file on the server, and I don't see how I can download multiple objects directly from the AWS S3 bucket through the browser without saving a file on my server. If anyone knows, please share with us.
I have this code to get files from Rackspace Cloud Files:
$username = getenv('RS_USERNAME');
$apikey = getenv('RS_APIKEY');
$containerName = 'imagenesfc';
$region = 'DFW';
$client = new Rackspace(Rackspace::US_IDENTITY_ENDPOINT, array(
'username' => $username,
'apiKey' => $apikey,
));
$filename = "myfile.ext";
$service = $client->objectStoreService(null, $region);
$container = $service->getContainer($containerName);
$object = $container->getObject($filename);
$account = $service->getAccount();
$account->setTempUrlSecret();
$tempUrl = $object->getTemporaryUrl(15, 'GET');
The php-opencloud documentation says you can change 'GET' to 'PUT', which I imagine means being able to put a file instead of getting it. The problem is that the file doesn't exist yet, and apparently the only way to create a file is by uploading it first? PHP SDK Container
In Amazon S3 I can do the following to get what I want:
$keyname = "myfile.ext";
$arr = array(
'Bucket' => 'bucket',
'Key' => $keyname
);
$cmd = $s3Client->getCommand('PutObject', $arr);
$request = $s3Client->createPresignedRequest($cmd, '+10 minutes');
$presignedUrl = (string) $request->getUri();
Then I can write to the presignedUrl any way I prefer, for example with Java:
url = new URL(jObj.get("presignedUrl").toString().replace("\\", ""));
connection=(HttpURLConnection) url.openConnection();
connection.setDoOutput(true);
connection.setRequestMethod("PUT");
out = new DataOutputStream(connection.getOutputStream());
...(write,flush,close)
So, basically, what I want to do is get the upload URL from Rackspace and write to it like I do with Amazon S3.
Is it possible? And if it is, how can you do it?
I need to do it this way because my API will provide only download and upload links, so no traffic goes directly through it. I can't have it saving the files to my API server and then uploading them to the cloud.
Yes, you can simulate a file upload without actually uploading content to the API - all you need to do is determine what the filename will be. The code you will need is:
$object = $container->getObject();
$object->setName('blah.txt');
$account = $service->getAccount();
$account->setTempUrlSecret();
echo $object->getTemporaryUrl(100, 'PUT');
The getObject method returns an empty object, and all you're doing is setting the remote name on that object. When a temp URL is created, it uses the name you just set and presents a temporary URL for you to use - regardless of whether the object exists remotely. The temp URL can then be used to create the object.
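For completeness, a minimal sketch of writing to that temporary URL from PHP with cURL (the local file path is a placeholder; any HTTP client that can issue a PUT will do):
$tempUrl = $object->getTemporaryUrl(100, 'PUT');
$localPath = '/path/to/local/blah.txt'; // hypothetical local file to upload
$fp = fopen($localPath, 'r');
$ch = curl_init($tempUrl);
curl_setopt($ch, CURLOPT_PUT, true);             // issue an HTTP PUT
curl_setopt($ch, CURLOPT_INFILE, $fp);           // stream the file as the request body
curl_setopt($ch, CURLOPT_INFILESIZE, filesize($localPath));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
curl_close($ch);
fclose($fp);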
So, I've been trying to get this to work for the past couple of hours, but I can't figure it out. The goal is to pull the converted mp4 file from gfycat and upload that file to the Amazon S3 bucket.
gfycat is returning a JSON object properly, and $result->mp4Url is returning a correct URL to the mp4 file. I keep getting errors such as "object expected, string given". Any ideas? Thanks.
// convert the gif to video format using gfycat
$response = file_get_contents("http://upload.gfycat.com/transcode/" . $key .
"?fetchUrl=" . urlencode($upload->getUrl('g')));
$result = json_decode($response);
$result_mp4 = file_get_contents($result->mp4Url);
// save the converted file to AWS S3 /i
$s3->putObject(array(
'Bucket' => getenv('S3_BUCKET'),
'Key' => 'i/' . $upload->id64 . '.mp4',
'SourceFile' => $result_mp4,
));
var_dump($response) yields:
string '{
"gfyId":"vigorousspeedyinexpectatumpleco",
"gfyName":"VigorousSpeedyInexpectatumpleco",
"gfyNumber":"884853904",
"userName":"anonymous",
"width":250,
"height":250,
"frameRate":11,
"numFrames":67,
"mp4Url":"http:\/\/zippy.gfycat.com\/VigorousSpeedyInexpectatumpleco.mp4",
"webmUrl":"http:\/\/zippy.gfycat.com\/VigorousSpeedyInexpectatumpleco.webm",
"gifUrl":"http:\/\/fat.gfycat.com\/VigorousSpeedyInexpectatumpleco.gif",
"gifSize":1364050,
"mp4Size":240833,
"webmSize":220389,
"createDate":"1388777040",
"views":"205",
"title":'... (length=851)
Using json_decode() on it also yields similar results.
You are mixing up the 'SourceFile' parameter (which accepts a file path) with the 'Body' parameter (which accepts raw data). See Uploading Objects in the AWS SDK for PHP User Guide for more examples.
Here are 2 options that should work:
Option 1 (Using SourceFile)
// convert the gif to video format using gfycat
$response = file_get_contents("http://upload.gfycat.com/transcode/" . $key .
"?fetchUrl=" . urlencode($upload->getUrl('g')));
$result = json_decode($response);
// save the converted file to AWS S3 /i
$s3->putObject(array(
'Bucket' => getenv('S3_BUCKET'),
'Key' => 'i/' . $upload->id64 . '.mp4',
'SourceFile' => $result->mp4Url,
));
Option 2 (Using Body)
// convert the gif to video format using gfycat
$response = file_get_contents("http://upload.gfycat.com/transcode/" . $key .
"?fetchUrl=" . urlencode($upload->getUrl('g')));
$result = json_decode($response);
$result_mp4 = file_get_contents($result->mp4Url);
// save the converted file to AWS S3 /i
$s3->putObject(array(
'Bucket' => getenv('S3_BUCKET'),
'Key' => 'i/' . $upload->id64 . '.mp4',
'Body' => $result_mp4,
));
Option 1 is better though, because the SDK will use a file handle to the mp4 file instead of loading the entire thing into memory (like file_get_contents does).
I have a problem with the Amazon S3 service and a web service written in PHP.
This web service receives a base64-encoded file from $_POST. I need to take this "string" and save it to an Amazon S3 bucket.
I couldn't find the right solution to do that, and after a week of work I'm looking for help here.
//$file = 'kdK9IWUAAAAdaVRYdENvbW1lbnQAAAAAAENyZWF0ZWQgd'
$file = $_POST['some_file'];
$opt = array(
'fileUpload' => base64_decode($file),
'acl' => AmazonS3::ACL_PUBLIC
);
$s3 = new AmazonS3(AWS_KEY, AWS_SECRET_KEY);
$response = $s3->create_object($bucket, $filename, $opt);
Thanks
Per the docs, fileUpload expects a URL or path, or an fopen resource.
fileUpload - string|resource - Required; Conditional - The URL/path for the file to upload, or an open resource. Either this parameter or body is required.
You should pass the decoded file data via the body parameter instead:
body - string - Required; Conditional - The data to be stored in the object. Either this parameter or fileUpload must be specified.
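In other words, keeping the rest of your code as-is, something like this should work with the same SDK:
$opt = array(
    'body' => base64_decode($file), // pass the decoded data directly
    'acl'  => AmazonS3::ACL_PUBLIC
);
$s3 = new AmazonS3(AWS_KEY, AWS_SECRET_KEY);
$response = $s3->create_object($bucket, $filename, $opt);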
The $imagestring is the base64-encoded string that was passed in the POST data. You can decode this image and write a file anywhere - in my case /photos. I used this on a school server, but if the Amazon server is similar enough, it could also work there.
$values = array();
foreach($_POST as $k => $v){
$values[$k] = $v;
}
$imagestring = $values['stringEncodedImage'];
$img = imagecreatefromstring(base64_decode($imagestring)); // decode into a GD image resource
if($img != false)
{
echo 'making the file!';
imagejpeg($img, 'photos/'.$file_name); // $file_name is assumed to be defined elsewhere
}