File from $_POST and Amazon S3 - php

I have a problem with the Amazon S3 service and a web service written in PHP. The web service receives a base64-encoded file from $_POST. I need to take this "string" and save it to an Amazon S3 bucket.
I haven't found a working solution for this, and after a week of work I'm looking for help here.
//$file = 'kdK9IWUAAAAdaVRYdENvbW1lbnQAAAAAAENyZWF0ZWQgd'
$file = $_POST['some_file'];
$opt = array(
    'fileUpload' => base64_decode($file),
    'acl' => AmazonS3::ACL_PUBLIC
);
$s3 = new AmazonS3(AWS_KEY, AWS_SECRET_KEY);
$response = $s3->create_object($bucket, $filename, $opt);
Thanks

Per the docs, fileUpload expects a URL or path, or an fopen resource.
fileUpload - string|resource - Required; Conditional - The URL/path for the file to upload, or an open resource. Either this parameter or body is required.
You should pass the decoded file data via the body parameter instead:
body - string - Required; Conditional - The data to be stored in the object. Either this parameter or fileUpload must be specified.
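Putting the two together, a minimal sketch of the corrected call (same $bucket and $filename as in the question) would be:
$file = $_POST['some_file'];
$opt = array(
    'body' => base64_decode($file), // decoded bytes go in 'body', not 'fileUpload'
    'acl' => AmazonS3::ACL_PUBLIC
);
$s3 = new AmazonS3(AWS_KEY, AWS_SECRET_KEY);
$response = $s3->create_object($bucket, $filename, $opt);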

$imagestring here is the base64-encoded string passed in the POST data. You can decode it and write the image to a file anywhere - in my case /photos. I used this on a school server, but if the Amazon server behaves similarly enough, it could also work there.
$values = array();
foreach ($_POST as $k => $v) {
    $values[$k] = $v;
}
$imagestring = $values['stringEncodedImage'];
$img = imagecreatefromstring(base64_decode($imagestring));
if ($img !== false) {
    echo 'making the file!';
    // $file_name is assumed to be defined elsewhere (e.g. taken from the POST data)
    imagejpeg($img, 'photos/'.$file_name);
}

Related

Read the csv file content

I want to read the content of a CSV file using PHP and the Google Drive API v3.
I have the file ID and file name, but I am not sure how to read the file's content:
$service = new Drive($client);
$results = $service->files->listFiles();
$fileId="1I****************";
$file = $service->files->get($fileId);
The Google Drive API is a file storage API. It allows you to upload, download, and manage the storage of files.
It does not give you access to the contents of the file.
To do this you would need to download the file and open it locally.
Alternatively, since it's a CSV file, you may want to consider converting it to a Google Sheet; then you could use the Google Sheets API to access the data within the file programmatically.
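If the file is small enough to hold in memory, a minimal sketch of the download (assuming the v3 PHP client and the $service and $fileId from the question) is to request the media instead of the metadata:
// Passing alt=media returns the file's bytes rather than its metadata
$response = $service->files->get($fileId, array('alt' => 'media'));
$content = $response->getBody()->getContents();
// Parse the CSV into rows
$rows = array_map('str_getcsv', explode("\n", trim($content)));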
For larger files, code for downloading in chunks from the Google Drive API would look something like this; the full sample can be found here: large-file-download.php
// If this is a POST, download the file
if ($_SERVER['REQUEST_METHOD'] == 'POST') {
    // Determine the file's size and ID ($files comes from an earlier
    // files->listFiles() call; see the full sample)
    $fileId = $files[0]->id;
    $fileSize = intval($files[0]->size);
    // Get the authorized Guzzle HTTP client
    $http = $client->authorize();
    // Open a file for writing
    $fp = fopen('Big File (downloaded)', 'w');
    // Download in 1 MB chunks
    $chunkSizeBytes = 1 * 1024 * 1024;
    $chunkStart = 0;
    // Iterate over each chunk and write it to our file
    while ($chunkStart < $fileSize) {
        $chunkEnd = $chunkStart + $chunkSizeBytes;
        $response = $http->request(
            'GET',
            sprintf('/drive/v3/files/%s', $fileId),
            [
                'query' => ['alt' => 'media'],
                'headers' => [
                    'Range' => sprintf('bytes=%s-%s', $chunkStart, $chunkEnd)
                ]
            ]
        );
        $chunkStart = $chunkEnd + 1;
        fwrite($fp, $response->getBody()->getContents());
    }
    // Close the file pointer
    fclose($fp);
}

How to create Temporary URL to upload file to RackSpace Cloud Files using PHP?

I have this code to get files from Rackspace Cloud Files:
$username = getenv('RS_USERNAME');
$apikey = getenv('RS_APIKEY');
$containerName = 'imagenesfc';
$region = 'DFW';
$client = new Rackspace(Rackspace::US_IDENTITY_ENDPOINT, array(
    'username' => $username,
    'apiKey' => $apikey,
));
$filename = "myfile.ext";
$service = $client->objectStoreService(null, $region);
$container = $service->getContainer($containerName);
$object = $container->getObject($filename);
$account = $service->getAccount();
$account->setTempUrlSecret();
$tempUrl = $object->getTemporaryUrl(15, 'GET');
The php-opencloud documentation says you can change 'GET' to 'PUT', which I imagine lets you put a file instead of getting one. The problem is that the file doesn't exist yet, and apparently the only way to create a file is to upload it first? PHP SDK Container
In Amazon S3 I can do the following to get what I want:
$keyname = "myfile.ext";
$arr = array(
    'Bucket' => 'bucket',
    'Key' => $keyname
);
$cmd = $s3Client->getCommand('PutObject', $arr);
$request = $s3Client->createPresignedRequest($cmd, '+10 minutes');
$presignedUrl = (string) $request->getUri();
Then I can write to the presigned URL any way I prefer, for example with Java:
url = new URL(jObj.get("presignedUrl").toString().replace("\\", ""));
connection=(HttpURLConnection) url.openConnection();
connection.setDoOutput(true);
connection.setRequestMethod("PUT");
out = new DataOutputStream(connection.getOutputStream());
...(write,flush,close)
So, basically, what I want to do is get the upload URL from Rackspace and write to it like I do with Amazon S3.
Is it possible? And if it is, how can you do it?
I need to do it this way because my API will provide only download and upload links, so no traffic goes directly through it. I can't have it saving files to my API server and then uploading them to the cloud.
Yes, you can simulate a file upload without actually uploading any content to the API - all you need to do is decide what the filename will be. The code you will need is:
$object = $container->getObject();
$object->setName('blah.txt');
$account = $service->getAccount();
$account->setTempUrlSecret();
echo $object->getTemporaryUrl(100, 'PUT');
The getObject method returns an empty object, and all you're doing is setting the remote name on that object. When a temp URL is created, it uses the name you just set and presents a temporary URL for you to use - regardless of whether the object exists remotely. The temp URL can then be used to create the object.
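On the client side, a minimal sketch of writing to that URL from PHP (assuming the result of getTemporaryUrl() is in $tempUrl; the local path is hypothetical) could use cURL with a PUT request, much like the Java snippet above:
$localFile = '/tmp/blah.txt'; // hypothetical file to upload
$fp = fopen($localFile, 'r');
$ch = curl_init($tempUrl);
curl_setopt($ch, CURLOPT_PUT, true); // send a PUT request
curl_setopt($ch, CURLOPT_INFILE, $fp); // stream the file as the request body
curl_setopt($ch, CURLOPT_INFILESIZE, filesize($localFile));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$result = curl_exec($ch);
curl_close($ch);
fclose($fp);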

phpleague flysystem read and write to large file on server

I am using Flysystem with the Iron.io queue, and I am attempting to run a DB query that will return ~1.8 million records, processing 5,000 at a time. Here is the error message I receive once the file reaches 50+ MB:
PHP Fatal error: Allowed memory size of ########## bytes exhausted
Here are the steps I would like to take:
1) Get the data
2) Turn it into a CSV-appropriate string (i.e. implode(',', $dataArray) . "\r\n")
3) Get the file from the server (in this case S3)
4) Read that file's contents, append the new string, and re-write that content to the S3 file
Here is a brief rundown of the code I have:
public function fire($job, $data)
{
    // First set the headers and write the initial file to the server
    $this->filesystem->write($this->filename, implode(',', $this->setHeaders($parameters)) . "\r\n", [
        'visibility' => 'public',
        'mimetype' => 'text/csv',
    ]);
    // Loop to get new sets of data
    $offset = 0;
    while ($this->exportResult) {
        $this->exportResult = $this->getData($parameters, $offset);
        if ($this->exportResult) {
            $this->writeToFile($this->exportResult);
            $offset += 5000;
        }
    }
}

private function writeToFile($contentToBeAdded = '')
{
    $content = $this->filesystem->read($this->filename);
    // Append new data
    $content .= $contentToBeAdded;
    $this->filesystem->update($this->filename, $content, [
        'visibility' => 'public'
    ]);
}
I'm assuming this is NOT the most efficient approach? I am going off of these docs:
PHPLeague Flysystem
If anyone can point me in a more appropriate direction, that would be awesome!
Flysystem supports read/write/update streams.
Please check the latest API docs: https://flysystem.thephpleague.com/api/
$stream = fopen('/path/to/database.backup', 'r+');
$filesystem->writeStream('backups/'.strftime('%G-%m-%d').'.backup', $stream);

// Using writeStream you can also directly set the visibility
$filesystem->writeStream('backups/'.strftime('%G-%m-%d').'.backup', $stream, [
    'visibility' => AdapterInterface::VISIBILITY_PRIVATE
]);

if (is_resource($stream)) {
    fclose($stream);
}

// Or update a file with stream contents
$filesystem->updateStream('backups/'.strftime('%G-%m-%d').'.backup', $stream);

// Retrieve a read-stream
$stream = $filesystem->readStream('something/is/here.ext');
$contents = stream_get_contents($stream);
fclose($stream);

// Create or overwrite using a stream
$putStream = tmpfile();
fwrite($putStream, $contents);
rewind($putStream);
$filesystem->putStream('somewhere/here.txt', $putStream);

if (is_resource($putStream)) {
    fclose($putStream);
}
If you are working with S3, I would use the AWS SDK for PHP directly to solve this particular problem. Appending to a file is actually very easy using the SDK's S3 stream wrapper, and doesn't force you to read the entire file into memory.
$s3 = \Aws\S3\S3Client::factory($clientConfig);
$s3->registerStreamWrapper();
$appendHandle = fopen("s3://{$bucket}/{$key}", 'a');
fwrite($appendHandle, $data);
fclose($appendHandle);
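Applied to the job above, writeToFile() could then append directly instead of doing a full read + update - a sketch, assuming the bucket name is available as $this->bucket:
private function writeToFile($contentToBeAdded = '')
{
    // Appending via the s3:// wrapper avoids loading the whole file into memory
    $handle = fopen("s3://{$this->bucket}/{$this->filename}", 'a');
    if ($handle) {
        fwrite($handle, $contentToBeAdded);
        fclose($handle);
    }
}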

trying to save image in amazon s3 but image not getting saved

I am trying to save an image from an image URL to Amazon S3. The image gets created in the bucket, but it is not shown in the browser; instead it displays the message "The image cannot be displayed because it contains errors."
This is my code:
require_once("aws/aws-autoloader.php");

// Amazon S3
use Aws\S3\S3Client;

// Create an Amazon S3 client object
$s3Client = S3Client::factory(array(
    'key' => 'XXXXXXXXXXXX',
    'secret' => 'XXXXXXXXX'
));

// Register the stream wrapper from a client object
$s3Client->registerStreamWrapper();

// Save Thumbnail
$s3Path = "s3://smmrescueimages/";
$s3Stream = fopen($s3Path . 'gt.jpg', 'w');
fwrite($s3Stream, 'http://sippy.in/gt.jpg');
#fclose($s3Stream);

echo "done";
This is the image path generated https://s3.amazonaws.com/smmrescueimages/gt.jpg
Change this line:
fwrite($s3Stream, 'http://sippy.in/gt.jpg');
to:
fwrite($s3Stream, file_get_contents('http://sippy.in/gt.jpg'));
Otherwise you save the URL string instead of the image binary into your file.
Also, don't use # to silence PHP functions - it comments the call out entirely (and @ would merely hide errors). Just check that a valid handle is present:
$s3Stream = fopen($s3Path . 'gt.jpg', 'w');
if ($s3Stream) {
    fwrite($s3Stream, file_get_contents('http://sippy.in/gt.jpg'));
    fclose($s3Stream);
}
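As an aside, once the stream wrapper is registered, the whole fetch-and-store can be collapsed into a single copy() call (a sketch, assuming allow_url_fopen is enabled):
// Stream the remote image straight into the bucket
copy('http://sippy.in/gt.jpg', 's3://smmrescueimages/gt.jpg');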

Can't set Content-type to Google Storage with the Google PHP Client

I'm using google-api-php-client
Here's the bit where I upload a .jpg image.
$postbody = array("data" => $imgData);
$gso = new Google_StorageObject();
$gso->setName($imageName);
$contentType = 'image/jpg';
$gso->setContentType($contentType);
$resp = $objects->insert('bucket-name', $gso, $postbody);
Inspecting $gso shows that ContentType is being set, but in the Cloud Console the object is created with the default application/octet-stream type.
Is there another way to set the content type?
Try this, it worked for me:
$postbody = array('mimeType' => 'image/jpeg', "data" => $imgData);
$gso = new Google_StorageObject();
$gso->setName($imageName);
$resp = $objects->insert('bucket-name', $gso, $postbody);
It seems 'mimeType' is a parameter.
Take a look at line 39 in https://code.google.com/p/google-api-php-client/source/browse/tags/0.6.7/src/service/Google_ServiceResource.php
That sounds like a bug; I'd suggest filing it with google-api-php-client.
In the meantime, there is a workaround. Google Cloud Storage treats the Content-Type like any other metadata, so you can change the content type of an object after you upload it by using the $objects->update() method.
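A sketch of that workaround, re-using the $gso and $objects from the question (the exact update() signature in this client version is an assumption - check the generated service class in your copy of the library):
// Re-set the metadata after the upload has completed
$gso->setContentType('image/jpeg');
$resp = $objects->update('bucket-name', $imageName, $gso);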
