Cannot upload files to an existing bucket with the S3 PHP SDK

I'm trying to upload files using the PHP SDK for S3. Uploading a file to an existing bucket produces an error.
<?php
error_reporting(-1);

// Set plain text headers
header("Content-type: text/plain; charset=utf-8");

// Include the SDK
require_once '../sdk.class.php';

/*%******************************************%*/
// UPLOAD FILES TO S3

// Instantiate the AmazonS3 class
$s3 = new AmazonS3();
$s3->path_style = true;

$bucket = 'photossss1.abc.com';
$name   = "picture.jpg";

$response = $s3->create_object($bucket, 'picture2.jpg', array(
    'fileUpload' => 'picture.jpg'
));

if ($response->isOk()) {
    echo "Done";
} else {
    //var_dump($response);
    echo "error: create_object error.";
}
What is the error in the above code?
Debug: print_r($response->body) outputs:
CFSimpleXML Object
(
[Code] => PermanentRedirect
[Message] => The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint.
[RequestId] => DACD5C54BC4BD82
[Bucket] => photossss1.abc.com
[HostId] => QUBlZEZKh0Ujzk6UyGG7LjC0vMCWDlOPszTZru/+OpWidBH84VXor1
[Endpoint] => photossss1.abc.com.s3.amazonaws.com
)

You need to set the region after class initialization:
// Initialize the class
$s3 = new AmazonS3();
$s3->set_region( AmazonS3::REGION_EU_W1 );
...
See the related page in the SDK documentation.
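For completeness, here is a minimal sketch of the original upload with the region set, assuming the bucket was created in EU (Ireland); use whichever REGION_* constant matches where your bucket actually lives:
require_once '../sdk.class.php';

// Instantiate the client and point it at the bucket's region
$s3 = new AmazonS3();
$s3->set_region(AmazonS3::REGION_EU_W1); // must match the bucket's region

$response = $s3->create_object('photossss1.abc.com', 'picture2.jpg', array(
    'fileUpload' => 'picture.jpg'
));

var_dump($response->isOK());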

What is the result of:
<?php
print_r($response->body);
This should give you a proper error message from S3.

In services/s3.class.php, replace this line:
const REGION_US_E1 = 's3.amazonaws.com';
with:
const REGION_US_E1 = 's3-eu-west-1.amazonaws.com';
so that the SDK's default endpoint points at eu-west-1.

Related

PHP S3 download: echo huge text variable to browser with header to download

I echo a huge text variable to the browser with download headers, but the page times out and stops responding.
Here is my code to download a huge CSV file from S3:
$result = $s3->getObject($_REQUEST['bucket'], $_REQUEST['path'], false);

if ($result) {
    header('Content-type: ' . $result->headers['type']);
    header('Content-Length: ' . $result->headers['size']);
    header('Content-Disposition: inline; filename="' . $_REQUEST['filename'] . '"');
    echo $result->body;
} else {
    echo 'File not found';
}
I have also increased the memory limit and execution time:
ini_set('memory_limit', '1024M' );
ini_set('max_execution_time', 0);
I found an alternative solution to the above problem: instead of fetching the contents from AWS and buffering a large amount of data to the browser, I create a "public" URL for the file and redirect the user to that link.
// Getting the URL to an object
$url = $s3->getObjectUrl($targetBucket, $keyname);
// redirect user to the download link
header('Location: '.$url); exit;
If "Private" bucket and files, we can create a temporary bucket and copy the file to that bucket on request, like something below, and then again create link like above.
// temporary bucket for public files
$sourcebucket = '........';
$targetBucket = '........';
$keyname      = '........';

// Copy the object into the temporary bucket.
$s3->copyObject([
    'Bucket'     => $targetBucket,
    'Key'        => "{$keyname}",
    'CopySource' => "{$sourcebucket}/{$keyname}",
]);

// Make the copy publicly readable.
$s3->putObjectAcl(array(
    'Bucket' => $targetBucket,
    'Key'    => $keyname,
    'ACL'    => 'public-read'
));
I'm posting this here in case it helps others, since I spent a long time looking for a solution. :)
An alternative to your own answer is to create a pre-signed request, as described in the AWS documentation.
This means you can do something like the following without having to copy the object to another bucket (and you also don't have to worry about cleaning up the unsecured bucket/object afterwards):
$s3Client = new S3Client([
    'region'  => 'us-east-1',
    'version' => 'latest'
]);

$cmd = $s3Client->getCommand('GetObject', [
    'Bucket' => $mybucket,
    'Key'    => $key,
]);

$request = $s3Client->createPresignedRequest($cmd, '+20 minutes');

// Get the actual presigned URL
$presignedUrl = (string)$request->getUri();

header("Location: " . $presignedUrl);
exit;
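If you also need the browser to save the file under a particular name (as the original Content-Disposition header did), the GetObject command accepts response-header overrides that are baked into the signed URL. A small sketch, assuming $filename holds the desired download name:
$cmd = $s3Client->getCommand('GetObject', [
    'Bucket' => $mybucket,
    'Key'    => $key,
    // Ask S3 to send this Content-Disposition when the signed URL is fetched
    'ResponseContentDisposition' => 'attachment; filename="' . $filename . '"',
]);

$presignedUrl = (string)$s3Client->createPresignedRequest($cmd, '+20 minutes')->getUri();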

php-mailparse composer not loading on server

I have an Ubuntu PHP 7.0 server and I am trying to use a PHP mailparse script I found here.
I have confirmed that Composer is installed on the server and that mailparse is installed as well. However, the script below returns a 500 error; I have tracked it down to the top two lines of code and am not sure how to resolve it.
What I mean is that when I comment out these two lines:
require_once __DIR__.'/vendor/autoload.php';
$Parser = new PhpMimeMailParser\Parser();
the script loads, but of course the mail parsing won't work.
// load the php mime parse library
require_once __DIR__.'/vendor/autoload.php';
$Parser = new PhpMimeMailParser\Parser();

// Include the AWS SDK using the Composer autoloader.
require 'awssdk/aws-autoloader.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

// AWS Info
$bucketName = 'pipedemail';
$IAM_KEY = '******';
$IAM_SECRET = '******';

// Connect to AWS
try {
    // You may need to change the region. It will say in the URL when the bucket is open
    // and on creation. us-east-2 is Ohio, us-east-1 is Northern Virginia.
    $s3 = S3Client::factory(
        array(
            'credentials' => array(
                'key'    => $IAM_KEY,
                'secret' => $IAM_SECRET
            ),
            'version' => 'latest',
            'region'  => 'us-east-1'
        )
    );
} catch (Exception $e) {
    // We use a die, so if this fails it stops here. Typically this is a REST call, so this
    // would return a JSON object.
    die("Error: " . $e->getMessage());
}

// Use the high-level iterators (returns ALL of your objects).
$objects = $s3->getIterator('ListObjects', array('Bucket' => $bucketName));

foreach ($objects as $object) {
    $objectkey = $object['Key'];
    $path = "https://s3.amazonaws.com/pipedemail/$objectkey";

    // let's get the raw email file to parse it
    $Parser->setText(file_get_contents($path));

    // Once we've indicated where to find the mail, we can parse out the data
    $to = $Parser->getHeader('to');                 // "test" <test@example.com>, "test2" <test2@example.com>
    $addressesTo = $Parser->getAddresses('to');     // Returns an array: [[test, test@example.com, false], [test2, test2@example.com, false]]
    $from = $Parser->getHeader('from');             // "test" <test@example.com>
    $addressesFrom = $Parser->getAddresses('from'); // Returns an array: test, test@example.com, false
    $subject = $Parser->getHeader('subject');
}

Laravel and AWS PHP SDK - Unable to delete a local file after it was uploaded to S3

I am trying to delete a file from a local directory right after I have uploaded it to AWS S3. When I run it on Vagrant I get the error "Text file busy", and when I run it on XAMPP I get "permission denied". For some reason the AWS S3 putObject method is not releasing the file handle. I have tried unsetting the S3 object, but that didn't work.
Here is the code:
$tempName = public_path().'/path/to/file';

// Initialize AWS
$s3 = AWS::createClient('s3');

// Upload image to AWS
try {
    $response = $s3->putObject(array(
        'Bucket'     => 'zotamoda',
        'Key'        => $productImage->image_folder."/".$productImage->image_name,
        'SourceFile' => $tempName,
        'ACL'        => 'public-read',
    ));
} catch (S3Exception $e) {
    // The AWS error code (e.g., )
    echo $e->getAwsErrorCode() . "\n";
    // The object couldn't be uploaded
    echo $e->getMessage() . "\n";
}

// Delete image from temporary location
unlink($tempName);
You could try:
Storage::disk('s3')->put($productImage->image_folder."/".$productImage->image_name, file_get_contents($tempName), 'public');
unlink($tempName);
or, assuming that $tempName is relative to your project root:
Storage::disk('local')->delete($tempName)
I think you should try calling:
gc_collect_cycles();
before deleting the file.
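In other words, something like this (a sketch of the suggestion above, reusing $tempName from the question):
// Force PHP to release any lingering references/handles held by the SDK
gc_collect_cycles();

// Now the temporary file should no longer be reported as busy
unlink($tempName);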

PHP API Youtube Uploads, exception: Failed to start the resumable upload, upload must be sent to the upload URL

I'm trying to build a server application for uploading videos to YouTube. In my server application, the user can upload a video directly to my YouTube channel to make it public.
The client part of my application acquires the video and uploads it to my server.
My server then uses the YouTube API to upload the video to my YouTube channel.
To make this work, I created a dummy web application to capture the generated refresh token, which I have stored in a key.txt file:
{"access_token":"MYTOKEN","token_type":"Bearer","expires_in":3600,"created":1435654774}
The upload_video.php script automatically updates the key.txt file if the access_token is out of date. This is the code from upload_video.php:
$key = file_get_contents('key.txt');

$application_name = 'YouTube_Upload';
$client_secret = 'MY_CLIENT_SECRET';
$client_id = 'MY_CLIENT_ID';
$scope = array(
    'https://www.googleapis.com/auth/youtube.upload',
    'https://www.googleapis.com/auth/youtube',
    'https://www.googleapis.com/auth/youtubepartner'
);

$videoPath = "Test.f4v";
$videoTitle = "A tutorial video";
$videoDescription = "A video tutorial on how to upload to YouTube";
$videoCategory = "22";
$videoTags = array("youtube", "tutorial");

try {
    // Client init
    $client = new Google_Client();
    $client->setApplicationName($application_name);
    $client->setClientId($client_id);
    $client->setAccessType('offline');
    $client->setAccessToken($key);
    $client->setScopes($scope);
    $client->setClientSecret($client_secret);

    if ($client->getAccessToken()) {
        /**
         * Check to see if the access token has expired. If so, get a new one and save it to file for future use.
         */
        if ($client->isAccessTokenExpired()) {
            $newToken = json_decode($client->getAccessToken());
            $client->refreshToken($newToken->refresh_token);
            file_put_contents('key.txt', $client->getAccessToken());
        }

        $youtube = new Google_Service_YouTube($client);

        // Create a snippet with title, description, tags and category id
        $snippet = new Google_Service_YouTube_VideoSnippet();
        $snippet->setTitle($videoTitle);
        $snippet->setDescription($videoDescription);
        $snippet->setCategoryId($videoCategory);
        $snippet->setTags($videoTags);

        // Create a video status with privacy status. Options are "public", "private" and "unlisted".
        $status = new Google_Service_YouTube_VideoStatus();
        $status->setPrivacyStatus('unlisted');

        // Create a YouTube video with snippet and status
        $video = new Google_Service_YouTube_Video();
        $video->setSnippet($snippet);
        $video->setStatus($status);

        // Size of each chunk of data in bytes. Setting it higher leads to faster uploads (fewer chunks,
        // for reliable connections). Setting it lower leads to better recovery (fine-grained chunks).
        $chunkSizeBytes = 1 * 1024 * 1024;

        // Setting the defer flag to true tells the client to return a request which can be called
        // with ->execute(); instead of making the API call immediately.
        $client->setDefer(true);

        // Create a request for the API's videos.insert method to create and upload the video.
        $insertRequest = $youtube->videos->insert("status,snippet", $video);

        // Create a MediaFileUpload object for resumable uploads.
        $media = new Google_Http_MediaFileUpload(
            $client,
            $insertRequest,
            'video/*',
            null,
            true,
            $chunkSizeBytes
        );
        $media->setFileSize(filesize($videoPath));

        // Read the media file and upload it chunk by chunk.
        $status = false;
        $handle = fopen($videoPath, "rb");
        while (!$status && !feof($handle)) {
            $chunk = fread($handle, $chunkSizeBytes);
            $status = $media->nextChunk($chunk);
        }
        fclose($handle);

        /**
         * Video has successfully been uploaded
         */
        if ($status->status['uploadStatus'] == 'uploaded') {
            // Actions to perform for a successful upload
            // ......
        }

        // If you want to make other calls after the file upload, set defer back to false
        $client->setDefer(false);
    } else {
        // #TODO Log error
        echo 'Problems creating the client';
    }
} catch (Google_Service_Exception $e) {
    print "Google_Service_Exception " . $e->getCode() . " message is " . $e->getMessage();
    print "Stack trace is " . $e->getTraceAsString();
} catch (Exception $e) {
    print "Exception " . $e->getCode() . " message is " . $e->getMessage();
    print "Stack trace is " . $e->getTraceAsString();
}
When the script runs, it raises this exception:
Exception 0 message is Failed to start the resume-able upload (HTTP 400: global, Uploads must be sent to the upload URL. Re-send this request to https://www.googleapis.com/upload/youtube/v3/videos?part=status,snippet&uploadType=resumable)Stack trace is
#0 D:\xampp\htdocs\youtube\src\Google\Http\MediaFileUpload.php(136): Google_Http_MediaFileUpload->getResumeUri()
#1 D:\xampp\htdocs\youtube\resumable_upload.php(100): Google_Http_MediaFileUpload->nextChunk('\x00\x00\x00\x1Cftypf4v \x00\x00\x00...')
#2 {main}
The exception is raised in getResumeUri() (line 281) of Google_Http_MediaFileUpload. I have var_dumped the request and response from Google:
Google_Http_Request Object
(
[batchHeaders:Google_Http_Request:private] => Array
(
[Content-Type] => application/http
[Content-Transfer-Encoding] => binary
[MIME-Version] => 1.0
)
[queryParams:protected] => Array
(
[part] => status,snippet
[uploadType] => resumable
)
[requestMethod:protected] => POST
[requestHeaders:protected] => Array
(
[content-type] => application/json; charset=UTF-8
[authorization] => Bearer XXXXXXXXXXXXXXXX
[content-length] => 187
[x-upload-content-type] => video/*
[x-upload-content-length] => 10201286
[expect] =>
)
[baseComponent:protected] => https://www.googleapis.com//upload
[path:protected] => /youtube/v3/videos
[postBody:protected] => {"snippet":{"categoryId":"22","description":"A video tutorial on how to upload to YouTube","tags":["youtube","tutorial"],"title":"A tutorial video"},"status":{"privacyStatus":"unlisted"}}
[userAgent:protected] =>
[canGzip:protected] =>
[responseHttpCode:protected] => 400
[responseHeaders:protected] => Array
(
[x-guploader-uploadid] => XXXXXXXXXXXXXXXXXXXXXXXXXX
[location] => https://www.googleapis.com/upload/youtube/v3/videos?part=status,snippet&uploadType=resumable
[vary] => Origin
X-Origin
[content-type] => application/json; charset=UTF-8
[content-length] => 468
[date] => Fri, 10 Jul 2015 09:54:30 GMT
[server] => UploadServer
[alternate-protocol] => 443:quic,p=1
)
[responseBody:protected] => {
"error": {
"errors": [
{
"domain": "global",
"reason": "wrongUrlForUpload",
"message": "Uploads must be sent to the upload URL. Re-send this request to https://www.googleapis.com/upload/youtube/v3/videos?part=status,snippet&uploadType=resumable"
}
],
"code": 400,
"message": "Uploads must be sent to the upload URL. Re-send this request to https://www.googleapis.com/upload/youtube/v3/videos?part=status,snippet&uploadType=resumable"
}
}
[expectedClass:protected] => Google_Service_YouTube_Video
[expectedRaw:protected] =>
[accessKey] =>
)
What is wrong?
Thanks for any help and sorry for bad English.
This seems to be a problem with the PHP client library for the Google API. Go to GOOGLE_LIB_PATH/Http/MediaFileUpload.php and replace this line:
$this->request->setBaseComponent($base . '/upload');
with this one:
$this->request->setBaseComponent($base . 'upload');
Try again and share the results. I faced a similar problem with the Google Pub/Sub API, where the API path set by the library was incorrect.
I assume you need resumable uploads (https://developers.google.com/youtube/v3/guides/using_resumable_upload_protocol).
Then use that guide without any modification.
Also check that the video format is supported: https://support.google.com/youtube/troubleshooter/2888402?hl=en
There is a Google client method to get the refresh token, so instead of using json_decode:
$newToken = json_decode($client->getAccessToken());
$client->refreshToken($newToken->refresh_token);
you can do:
$client->refreshToken( $client->getRefreshToken() );
This is what I have tested so far, and it works fine:
$key = trim(file_get_contents('key.txt'));
$scope = 'https://www.googleapis.com/auth/youtube';
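Putting the two together, here is a sketch of the expiry check using getRefreshToken(), assuming key.txt contains the full token JSON including the refresh_token:
$client->setAccessToken($key);

if ($client->isAccessTokenExpired()) {
    // Ask the client for the stored refresh token instead of decoding the JSON by hand
    $client->refreshToken($client->getRefreshToken());
    // Note: on newer client versions getAccessToken() returns an array and may need json_encode()
    file_put_contents('key.txt', $client->getAccessToken());
}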

Using PHP to upload to Amazon S3

I've spent the last few hours following tutorials for implementing file uploads to Amazon S3 using PHP. I uploaded the most recent version of Donovan Schönknecht's S3 class to my server (as S3.php) and I am trying to use the following code to test upload capability. I know this code should work because I've seen numerous examples of it in action.
<?php
require('S3.php');

$s3 = new S3('KEY', 'SECRET KEY');

// insert into s3
$new_name = time() . '.txt';

S3::putObject(
    'upload-me.txt',
    'bucketName',
    $new_name,
    S3::ACL_PUBLIC_READ,
    array(),
    array(),
    S3::STORAGE_CLASS_RRS
);
?>
I get a 500 server error when I attempt to load this page. Additionally, every other reputable tutorial of this nature has given me the same 500 error.
I verified that my key and secret key are valid by connecting to S3 with Cyberduck.
Does anyone have a clue as to what I could be doing incorrectly?
Thanks,
Sean
As it turns out, I was missing the cURL extension for PHP, which the S3 class I was using requires. All is working now.
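If you hit the same 500 error, a quick way to confirm whether cURL is the culprit is to check for the extension before loading the class (a small diagnostic sketch, not part of the original code):
<?php
// Dies with a readable message instead of a bare 500 if the extension is missing
if (!extension_loaded('curl')) {
    die('The PHP cURL extension is not installed or enabled.');
}
require('S3.php');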
You should also consider using the official AWS SDK for PHP. Examples for using S3 with the SDK can be found in their S3 user guide.
You can install the most recent version of the AWS SDK for PHP by running the following Composer command:
composer require aws/aws-sdk-php
The configuration to upload a file to Amazon S3 is as follows:
// Include the SDK using the Composer autoloader
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

// Set Amazon S3 credentials
$client = S3Client::factory(
    array(
        'key'    => "your-key",
        'secret' => "your secret key"
    )
);

try {
    $client->putObject(array(
        'Bucket'       => 'your-bucket-name',
        'Key'          => 'your-filepath-in-bucket',
        'SourceFile'   => 'source-filename-with-path',
        'StorageClass' => 'REDUCED_REDUNDANCY'
    ));
} catch (S3Exception $e) {
    // Catch an S3 specific exception.
    echo $e->getMessage();
}
Get step-by-step details here: Amazon S3 File Upload Using PHP.
The following example worked for me:
<?php
use Aws\S3\S3Client;
use Aws\S3\Exception\S3Exception;

$client = S3Client::factory([
    'version'     => 'latest',
    'region'      => 'us-west-1',
    'credentials' => [
        'key'    => "<secret-key>",
        'secret' => "<my-secret>"
    ]
]);

try {
    $client->putObject([
        'Bucket'     => '<my-bucket-name>',
        'Key'        => '<file-name>',
        'SourceFile' => '<file-path-on-server>', // e.g. /var/www/vhosts/mysite/file.csv
        'ACL'        => 'public-read',
    ]);
} catch (S3Exception $e) {
    // Catch an S3 specific exception.
    echo $e->getMessage();
}
Getting security credentials:
https://aws.amazon.com/blogs/security/wheres-my-secret-access-key/
https://console.aws.amazon.com/iam/home?#/security_credential
Getting the region code:
https://docs.aws.amazon.com/general/latest/gr/rande.html
Use this one to upload images using a form; it works fine for me. You may try using it with your code.
$name = $_FILES['photo']['name'];
$size = $_FILES['photo']['size'];
$tmp  = $_FILES['photo']['tmp_name'];

////// Upload Process

// Bucket Name
$bucket = 'bucket-name';

require_once('S3.php');

// AWS access info
$awsAccessKey = 'awsAccessKey';
$awsSecretKey = 'awsSecretKey';

// instantiate the class
$s3 = new S3($awsAccessKey, $awsSecretKey);
$s3->putBucket($bucket, S3::ACL_PUBLIC_READ);

// Rename image name.
$actual_image_name = time();

// Upload to S3
if ($s3->putObjectFile($tmp, $bucket, $actual_image_name, S3::ACL_PUBLIC_READ)) {
    $image = 'http://' . $bucket . '.s3.amazonaws.com/' . $actual_image_name;
} else {
    echo 'error uploading to S3 Amazon';
}
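For reference, the snippet reads the upload from $_FILES['photo'], so it assumes an HTML form roughly along these lines (the action target is hypothetical):
<form action="upload.php" method="post" enctype="multipart/form-data">
    <input type="file" name="photo">
    <input type="submit" value="Upload">
</form>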
I never found an updated script using Amazon's latest SDK, so I made one myself. It works as a PHP command-line script. Give it a try:
https://github.com/arizawan/aiss3clientphp
I'm not familiar with the S3 API directly, but I used it as storage via https://github.com/KnpLabs/Gaufrette. Gaufrette is a library that provides a nice abstraction layer over S3 and other file services/systems.
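A minimal sketch of the idea (the class names are Gaufrette's AwsS3 adapter and Filesystem, with placeholder credentials and bucket; adjust to the adapter and SDK versions you actually install):
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Gaufrette\Adapter\AwsS3 as AwsS3Adapter;
use Gaufrette\Filesystem;

$s3 = S3Client::factory([
    'key'    => 'your-key',
    'secret' => 'your-secret',
]);

$adapter    = new AwsS3Adapter($s3, 'your-bucket-name');
$filesystem = new Filesystem($adapter);

// Write the local file's contents to S3 under the given key (third argument allows overwriting)
$filesystem->write('uploads/picture.jpg', file_get_contents('picture.jpg'), true);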
Here is sample code to upload images to Amazon S3.
// Bucket Name
$bucket = "BucketName";

if (!class_exists('S3')) require_once('S3.php');

// AWS access info
if (!defined('awsAccessKey')) define('awsAccessKey', 'ACCESS_KEY');
if (!defined('awsSecretKey')) define('awsSecretKey', 'ACCESS_Secret_KEY');

$s3 = new S3(awsAccessKey, awsSecretKey);
$s3->putBucket($bucket, S3::ACL_PUBLIC_READ);

// $tmp is the uploaded file's temporary path, $actual_image_name the key to store it under
if ($s3->putObjectFile($tmp, $bucket, $actual_image_name, S3::ACL_PUBLIC_READ)) {
    $message = "S3 Upload Successful.";
    $s3file = 'http://' . $bucket . '.s3.amazonaws.com/' . $actual_image_name;
    echo "<img src='$s3file'/>";
    echo 'S3 File URL:' . $s3file;
} else {
    $message = "S3 Upload Fail.";
}
Below is the best solution. It uses a multipart upload. Make sure to install the AWS SDK for PHP before using it.
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\Exception\AwsException;
use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;

try {
    $s3Client = new S3Client([
        'version'     => 'latest',
        'region'      => 'us-east-1',
        'credentials' => [
            'key'    => 'your-access-key',
            'secret' => 'your-secret-key',
        ],
    ]);

    // Use multipart upload
    $source = 'https://b8q9h6y2.stackpathcdn.com/wp-content/uploads/2016/08/banner-for-website-4.png';

    $uploader = new MultipartUploader($s3Client, $source, [
        'bucket' => 'videofilessandeep',
        'key'    => 'my-file.png',
        'ACL'    => 'public-read',
    ]);

    try {
        $result = $uploader->upload();
        echo "Upload complete: {$result['ObjectURL']}\n";
    } catch (MultipartUploadException $e) {
        echo $e->getMessage() . "\n";
    }
} catch (Exception $e) {
    print_r($e);
}
