I'm trying to write a PHP script that creates a Lambda function from some code I zip up on our server. When I upload the file manually through the Lambda console, it works fine, but when I try to create the function with the AWS SDK, it fails with an error message. Does anyone have a clue?
Code:
private function createLambdaFunction() {
    $result = $this->lambdaConn->createFunction(array(
        'FunctionName' => $this->lambdaFunctionName,
        'Runtime'      => $this->runtime,
        'Role'         => $this->role,
        'Handler'      => $this->lambdaFunctionName . "." . $this->handler,
        'Description'  => $this->description,
        'Timeout'      => $this->timeout,
        'MemorySize'   => $this->memorySize,
        'Code'         => array(
            'ZipFile' => 'fileb://test.zip'
        )
    ));
    return $result;
}
Error:
PHP Fatal error: Uncaught Aws\Lambda\Exception\LambdaException: AWS
Error Code: InvalidParameterValueException,
Status Code: 400, AWS Request ID: asdf, AWS Error Type: user,
AWS Error Message: Could not unzip uploaded file. Please check
your file, then try to upload again., User-Agent:
aws-sdk-php2/2.8.10 Guzzle/3.9.3 curl/7.35.0 PHP/5.5.9-1ubuntu4.9
I can't seem to find a good example on Google, and the documentation is...less than ideal. I created the zip file with PHP, so I've tried passing that variable, the full path to the file, the relative path, etc. I finally learned you have to use the fileb:// prefix, but that didn't end up fixing anything.
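For reference, the zip itself is built with PHP's ZipArchive, roughly like the sketch below (file names here are just illustrative):

$zip = new ZipArchive();
$zip->open('test.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE);
// Lambda expects the handler file at the root of the archive, not inside a folder
$zip->addFile('/path/to/myFunction.js', 'myFunction.js');
$zip->close();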
Okay, I'm not sure why this is the case, but you need to base64 encode your zip file, like so:
$result = $this->lambdaConn->createFunction(array(
    'FunctionName' => $this->lambdaFunctionName,
    'Runtime'      => $this->runtime,
    'Role'         => $this->role,
    'Handler'      => $this->lambdaFunctionName . "." . $this->handler,
    'Description'  => $this->description,
    'Timeout'      => $this->timeout,
    'MemorySize'   => $this->memorySize,
    'Code'         => array(
        'ZipFile' => 'fileb://' . base64_encode(file_get_contents('test.zip'))
    )
));
I'm not sure why this is required, as according to the documentation and a post by an AWS employee, you don't need base64 encoding for createFunction. They must have mixed something up somewhere.
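For what it's worth, the newer AWS SDK for PHP v3 documents ZipFile as taking the raw contents of the zip (the SDK base64-encodes the blob on the wire itself), so on v3 something like this sketch should work (the client and role variables are assumed):

// AWS SDK for PHP v3, for comparison: pass the raw zip bytes and
// let the SDK handle the base64 encoding of the blob.
$result = $lambdaClient->createFunction([
    'FunctionName' => 'myFunction',
    'Runtime'      => 'nodejs12.x',
    'Role'         => $roleArn,
    'Handler'      => 'myFunction.handler',
    'Code'         => ['ZipFile' => file_get_contents('test.zip')],
]);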
I'm trying to use the google/cloud-translate library (v ^1.5) in Laravel (v ^6.0).
In GoogleController.php:
public function translate(Request $request) {
    $request->validate([
        'source' => 'required|string|min:2|max:5',
        'target' => 'required|string|min:2|max:5',
        'q' => 'required|string',
    ]);

    $translate = new TranslateClient([
        'keyFile' => base_path(config('services.google.json_path')),
        'projectId' => config('services.google.project_id'),
        'suppressKeyFileNotice' => true,
    ]);

    // Translate text, e.g. from English to French.
    $result = $translate->translate($request->q, [
        'target' => explode('-', $request->target)[0],
        'source' => explode('-', $request->source)[0],
    ]);

    return $result;
}
But calling the route in Postman gives me the error:
Argument 2 passed to Google\Auth\CredentialsLoader::makeCredentials() must be of the type array, string given, called in /[...]/vendor/google/cloud-core/src/RequestWrapperTrait.php on line 155
I've checked that the projectId and the path to the keyFile are correct. Can anyone shed some light on how to get past this error?
You're specifying the path to the key file, so you should use the keyFilePath parameter instead.
Try this:
$translate = new TranslateClient([
    'keyFilePath' => base_path(config('services.google.json_path')),
    ...
]);
From the TranslateClient.__construct docs:
keyFile: The contents of the service account credentials .json file retrieved from the Google Developer's Console. Ex: json_decode(file_get_contents($path), true).
keyFilePath: The full path to your service account credentials .json file retrieved from the Google Developers Console.
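Alternatively, going by the doc excerpt above, you could keep the keyFile parameter and pass it the decoded contents of the file rather than the path, along these lines:

// Equivalent alternative: pass the decoded JSON contents via 'keyFile',
// as the docs describe (json_decode(..., true) yields an array).
$translate = new TranslateClient([
    'keyFile'   => json_decode(file_get_contents(base_path(config('services.google.json_path'))), true),
    'projectId' => config('services.google.project_id'),
]);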
I am running into an error attempting to authenticate and create a new Route53 record set using Amazon's PHP SDK and changeResourceRecordSets. Here's what I have attempted so far:
Installed the AWS SDK for Laravel
Used Amazon's IAM to create a new user and group and applied the FullAdministrator policy to the group.
Stored the new user credentials and other AWS variables in my .env file like so:
Code below:
AWS_REGION=us-east-1
AWS_ACCESS_KEY_ID=XXYYZZ
AWS_SECRET_ACCESS_KEY=112233
AWS_ZONE_ID=UHUHUHUH
Confirmed that my Laravel environment is configured correctly and my controller works by testing the following:
Code below:
$s3 = AWS::createClient('s3');
$s3->putObject(array(
    'Bucket'     => 'mydomain.com',
    'Key'        => 'new.pdf',
    'SourceFile' => storage_path('app/old.pdf'),
));
Once I confirmed that my credentials worked against S3, I closely followed this SO answer and code to create a new Route53 client and create a new Record Set in my Route53 Hosted Zone. Here's my slightly modified code:
Code below:
$client = AWS::createClient('Route53');
//dd($client); $client object returned, this works
$result = $client->changeResourceRecordSets(array(
    'HostedZoneId' => env('AWS_ZONE_ID'),
    'ChangeBatch' => array(
        'Comment' => 'just testing',
        'Changes' => array(
            array(
                'Action' => 'CREATE',
                'ResourceRecordSet' => array(
                    'Name' => 'test.mydomain.com.',
                    'Type' => 'A',
                    'TTL' => 600,
                    'ResourceRecords' => array(
                        array(
                            'Value' => '52.52.52.52', // my AWS IP address
                        ),
                    ),
                ),
            ),
        ),
    ),
));
The resulting error is as follows:
Client error: POST
https://route53.amazonaws.com/2013-04-01/hostedzone/MYZONE/rrset/
resulted in a 403 Forbidden response:
Sender
And more from the error...
SignatureDoesNotMatch (client): Signature expired: 20160225T215502Z is now earlier than 20160225T220842Z (20160225T221342Z - 5 min.)
Any suggestions are appreciated.
I should have added that I'm running in a Homestead/VirtualBox environment, and the real problem was that the date service on my VM was woefully off.
Simply running sudo ntpdate -s time.nist.gov fixed the problem.
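If you want to confirm clock skew is the culprit first, one quick, purely illustrative check from PHP is to compare your local clock against the Date header AWS sends back (the request signature is only valid within a window of about five minutes):

// Illustrative skew check: compare local UTC time with the server's Date header.
$headers = get_headers('https://route53.amazonaws.com/', 1);
echo 'Local: ' . gmdate('D, d M Y H:i:s') . " GMT\n";
echo 'AWS:   ' . $headers['Date'] . "\n";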
I'm having problems setting the "Metadata" option when uploading files to Amazon S3 using the AWS SDK for PHP v2. The documentation that I'm reading for the upload() method states that the 5th parameter is an array of options:
$options: Custom options used when executing commands:
- params: Custom parameters to use with the upload. The parameters must map to a PutObject or InitiateMultipartUpload operation's parameters.
- min_part_size: Minimum size to allow for each uploaded part when performing a multipart upload.
- concurrency: Maximum number of concurrent multipart uploads.
- before_upload: Callback to invoke before each multipart upload. The callback will receive a Guzzle\Common\Event object with context.
My upload() code looks like this:
$upload = $client->upload(
    '<BUCKETNAME>',
    'metadatatest.upload.jpg',
    fopen('metadatatest.jpg', 'r'),
    'public-read',
    array('Metadata' => array(
        'SomeKeyString' => 'SomeValueString'
    ))
);
...and no metadata is set after the upload.
If, however, I use putObject() as documented here, which I assume is a "lower-level" method compared to upload()...
$putObject = $client->putObject(
    array(
        'Bucket'   => '<BUCKETNAME>',
        'Key'      => 'metadatatest.putobject.jpg',
        'Body'     => file_get_contents('metadatatest.jpg'),
        'ACL'      => 'public-read',
        'Metadata' => array(
            'SomeKeyString' => 'SomeValueString'
        )
    )
);
The metadata is successfully returned when I call getObject(), or when I view the file directly in my browser, for files uploaded with putObject():
$getObject = $client->getObject(
    array(
        'Bucket' => '<BUCKETNAME>',
        'Key'    => 'metadatatest.putobject.jpg'
    )
);
I would prefer to use the $client->upload() method as the documentation states
Upload a file, stream, or string to a bucket. If the upload size exceeds the specified threshold, the upload will be performed using
parallel multipart uploads.
I'm not sure what I've missed.
There's really no difference between using upload() and putObject() if you don't do multipart uploads. You can have a look at the AWS PHP SDK source code, but basically the upload() method just calls putObject() like this:
// Perform a simple PutObject operation
return $this->putObject(array(
    'Bucket' => $bucket,
    'Key'    => $key,
    'Body'   => $body,
    'ACL'    => $acl
) + $options['params']);
This isn't very clear in the SDK documentation, but you need to pass the last parameter as an array with the key params, whose value is a second array containing the Metadata key and value, like this:
$upload = $client->upload(
    '<BUCKETNAME>',
    'metadatatest.upload.jpg',
    fopen('metadatatest.jpg', 'r'),
    'public-read',
    array('params' => array(
        'Metadata' => array(
            'SomeKeyString' => 'SomeValueString'
        )
    ))
);
However, you could just use the putObject() call to achieve the same thing.
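One side note that isn't in the SDK docs quoted above: S3 stores user metadata as x-amz-meta-* headers and treats the keys case-insensitively, so expect them to come back lowercased when you read the object:

// User metadata round-trips as x-amz-meta-* headers; keys come back lowercased.
$object = $client->getObject(array(
    'Bucket' => '<BUCKETNAME>',
    'Key'    => 'metadatatest.upload.jpg',
));
print_r($object['Metadata']); // e.g. array('somekeystring' => 'SomeValueString')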
I'm trying to create an Open Graph object with the Facebook PHP SDK:

$response = $facebook->api(
    'me/objects/namespace:result',
    'POST',
    array(
        'app_id'      => $app_id,
        'type'        => "namespace:result",
        'url'         => "http://samples.ogp.me/370740823026684",
        'title'       => "Sample Result",
        'image'       => "https://fbstatic-a.akamaihd.net/images/devsite/attachment_blank.png",
        'description' => ""
    )
);
I get the following error:
The parameter object is required
I don't know where to add the object parameter.
I have been facing the same issue for the past couple of days. In the end I stopped trying to use the Facebook PHP SDK to send the request and resorted to sending a plain HTTP request instead.
I am creating an object rather than updating one, but the implementation shouldn't differ much from what you need. I was getting the same error (The parameter object is required), and the implementation below resolved it.
I used the Vinelab HTTP client Composer package and built a request like this:
$request = [
    'url' => 'https://graph.facebook.com/me/objects/namespace:object',
    'params' => [
        'access_token' => $fbAccessToken,
        'method' => 'POST',
        'object' => json_encode([
            'fb:app_id' => 1234567890,
            'og:url' => 'http://samples.ogp.me/1234567890',
            'og:title' => 'Sample Object',
            'og:image' => 'https://fbstatic-a.akamaihd.net/images/devsite/attachment_blank.png',
            'og:description' => 'Sample Object'
        ])
    ]
];
// I'm using Laravel, so this bit might look different for you
// (check the Vinelab docs if you use that specific HTTP client)
$response = HttpClient::post($request);

// raw content
$response->content();

// JSON
$response->json();
As I said, I used the Vinelab HTTP package in my implementation; you should be able to use any similar package, or use curl in PHP directly instead.
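For completeness, here's a rough sketch of the same request with plain curl (same endpoint and fields as above; the access token variable is assumed):

// Roughly equivalent request using PHP's curl extension directly.
$ch = curl_init('https://graph.facebook.com/me/objects/namespace:object');
curl_setopt_array($ch, array(
    CURLOPT_POST           => true,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POSTFIELDS     => http_build_query(array(
        'access_token' => $fbAccessToken,
        'object'       => json_encode(array(
            'fb:app_id' => 1234567890,
            'og:url'    => 'http://samples.ogp.me/1234567890',
            'og:title'  => 'Sample Object',
        )),
    )),
));
$response = curl_exec($ch);
curl_close($ch);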
So the AWS PHP SDK 2.x library has been put out recently, and I've taken a turkey-day plunge into upgrading from 1.5.x. My first task was to upgrade my S3 backup class. I quickly ran into an error:
Fatal error: Class 'EntityBody' not found in /usr/share/php/....my file here
when trying to upload a zipped file to an S3 bucket. I wrote a class to abstract the writing a bit and allow for multi-region backup, so the references to $this in the code below refer to that class.
$response1 = $s3->create_object(
    $this->bucket_standard,
    $this->filename,
    array(
        'fileUpload' => $this->filename,
        'encryption' => 'AES256',
        //'acl' => AmazonS3::ACL_PRIVATE,
        'contentType' => 'text/plain',
        'storage' => AmazonS3::STORAGE_REDUCED,
        'headers' => array( // raw headers
            'Cache-Control' => 'max-age',
            //'Content-Encoding' => 'gzip',
            'Content-Language' => 'en-US'
            //'Expires' => 'Thu, 01 Nov 2012 16:00:00 GMT'
        ),
        'meta' => array(
            'param1' => $this->backupDateTime->format('Y-m-d H:i:s'), // put some info on the file in meta tags
            'param2' => $this->hostOrigin
        )
    )
);
The above worked fine on 1.5.x.
Now, in 2.x, I'm looking into their docs, and they've changed just about everything (great... maximum sarcasm):
$s3opts = array('key' => $this->accessKey, 'secret' => $this->secretKey, 'region' => 'us-east-1');
$s3 = Aws\S3\S3Client::factory($s3opts);
So now I've got a new S3 object, and here is my 2.x syntax to do the same exact thing. My problem arises where they've (sinisterly) changed the old fileUpload to Body and made it more abstract how to actually attach a file! I've tried both EntityBody and Stream, and I'm thinking it has to do with the dependencies (Guzzle or Symfony, etc.), but I get the error above (or its Stream equivalent) whenever I try to execute this.
I've tried using Composer with composer.json, and the aws.phar but before I get into that, is there something dumb I'm missing?
$response1 = $s3->putObject(array(
    'Bucket' => $this->bucket_standard,
    'Key' => $this->filename,
    'ServerSideEncryption' => 'AES256',
    'StorageClass' => 'REDUCED_REDUNDANCY',
    'Body' => EntityBody::factory(fopen($this->filename, 'r')),
    //'Body' => new Stream(fopen($fullPath, 'r')),
    'Metadata' => array(
        'BackupTime' => $this->backupDateTime->format('Y-m-d H:i:s'), // put some info on the file in meta tags
        'HostOrigin' => $this->hostOrigin
    )
));
Thanks as always,
R
Did you import the EntityBody into your namespace?
use Guzzle\Http\EntityBody;
Otherwise, you'd have to fully qualify it:
'Body' => \Guzzle\Http\EntityBody::factory(fopen($this->filename, 'r')),
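As an aside (not part of the fix above): if you'd rather not construct the body yourself at all, the v2 putObject also accepts a SourceFile path and will open and stream the file for you, e.g. a sketch reusing the fields from the question:

// Alternative: let the SDK open the file itself via 'SourceFile'.
$response1 = $s3->putObject(array(
    'Bucket'               => $this->bucket_standard,
    'Key'                  => $this->filename,
    'SourceFile'           => $this->filename,
    'ServerSideEncryption' => 'AES256',
    'StorageClass'         => 'REDUCED_REDUNDANCY',
));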