I developed an API in PHP, hosted on Google App Engine, to use with my Android app.
Basically, when a user wants to change his profile picture, the Android app sends a request to the server containing the user ID, the session key and the picture to upload. I want to store this picture in Google Cloud Storage, but since Google App Engine requires you to create a public upload URL before a file can be uploaded, how can I generate the upload URL and then use it to upload the profile picture to my bucket in a single request?
I tried a basic PHP redirect after generating the upload URL, but I get an HTTP 405.
I am really stuck, and I honestly don't understand why you NEED to create a URL before uploading a file to Google App Engine...
Source: https://cloud.google.com/appengine/docs/php/googlestorage/user_upload
It seems that a photo you upload to Google Cloud Storage is stored as a blob, which is why you are asked to create an upload URL first. Follow the document [1]; it covers the Blobstore API in Python, but the concept of a blob is the same for any language, so it is worth reading. Below is sample code which might help you upload a photo to GCS.
<?php
use google\appengine\api\cloud_storage\CloudStorageTools;

$options = ['gs_bucket_name' => 'bucket-name'];
$upload_url = CloudStorageTools::createUploadUrl('/upload', $options);
?>
<form action="<?php echo $upload_url; ?>" method="post" enctype="multipart/form-data">
    <input type="file" name="fileupload" id="fileupload">
    <input type="submit" value="Upload" name="submit">
</form>
Upload Handler:
<?php
use google\appengine\api\cloud_storage\CloudStorageTools;

$gs_name = $_FILES['fileupload'];
$buffer = file_get_contents($gs_name['tmp_name']);
$final_file_path = $gs_name['name'];
$bucket = CloudStorageTools::getDefaultGoogleStorageBucketName();
$user_pic_url = 'gs://' . $bucket
    . (substr($final_file_path, 0, 1) != '/' ? '/' : '')
    . $final_file_path;
// set file options on Google Cloud Storage
$options = stream_context_create(['gs' => ['acl' => 'public-read',
                                           'Content-Type' => 'image/jpeg']]);
$my_file = fopen($user_pic_url, 'w', false, $options);
fwrite($my_file, $buffer);
fclose($my_file);
[1] BlobStore API: https://cloud.google.com/appengine/docs/python/blobstore/
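For completeness, here is a minimal sketch of an even simpler handler, based on the approach described in the linked App Engine documentation: the uploaded temp file can be moved straight into Cloud Storage and a public URL derived for the app to display. The object path and the user_id field are illustrative assumptions, not part of the original code.
<?php
// Sketch only: assumes the App Engine PHP runtime and its gs:// stream wrapper.
use google\appengine\api\cloud_storage\CloudStorageTools;

$bucket  = CloudStorageTools::getDefaultGoogleStorageBucketName();
$gs_name = 'gs://' . $bucket . '/profile-pics/' . $_POST['user_id'] . '.jpg'; // hypothetical object name

// Move the uploaded file directly into the bucket.
move_uploaded_file($_FILES['fileupload']['tmp_name'], $gs_name);

// Public URL the Android app can use to display the picture
// (assuming the object is readable, e.g. written with a public-read ACL).
$public_url = CloudStorageTools::getPublicUrl($gs_name, true);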
// Another sample along the same lines: writing a generated PDF (TCPDF/FPDF) straight
// to a GCS bucket via the gs:// stream wrapper, then reading it back over its public URL.
$pdf->Output('example_025.pdf', 'I');                   // send the PDF inline to the browser
$DynamicNameofPic = rand(1000, 10000) . "_Invoice.pdf"; // random object name
$FileNameDynamic = "gs://#######/" . $DynamicNameofPic; // bucket name elided
$pdf->Output($FileNameDynamic, 'F');                    // save the PDF into the bucket
$image_dataURL = "https://storage.googleapis.com/##########/" . $DynamicNameofPic;
$image_data = file_get_contents($image_dataURL);        // read the stored file back
$fileName = $DynamicNameofPic;
I want to read the content of a CSV file using PHP and the Google Drive API v3.
I got the file ID and file name, but I am not sure how I can read the file content.
use Google\Service\Drive;

$service = new Drive($client);
$results = $service->files->listFiles();
$fileId = "1I****************";
$file = $service->files->get($fileId);
The Google Drive API is a file storage API. It allows you to upload, download and manage the storage of files.
It does not parse the contents of a file for you.
To read the data, you need to download the file and open it locally.
Alternatively, since it's a CSV file, you may want to consider converting it to a Google Sheet; then you could use the Google Sheets API to access the data within the file programmatically (see the sketch just below).
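A rough sketch of that alternative, assuming the google/apiclient v2 class names and that $service and $fileId are the ones from the question; the copy call asks Drive to convert the CSV into a native Google Sheet, and the name and range are placeholders:
use Google\Service\Drive\DriveFile;
use Google\Service\Sheets;

// Copy the CSV and ask Drive to convert the copy to a Google Sheet.
$copy = new DriveFile([
    'name'     => 'my-data',                                 // hypothetical name
    'mimeType' => 'application/vnd.google-apps.spreadsheet', // triggers the conversion
]);
$sheetFile = $service->files->copy($fileId, $copy);

// Read the converted data with the Sheets API.
$sheets = new Sheets($client);
$values = $sheets->spreadsheets_values->get($sheetFile->getId(), 'Sheet1')->getValues();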
Code for downloading a file from the Google Drive API would look something like this.
The full sample can be found here: large-file-download.php
// If this is a POST, download the file
if ($_SERVER['REQUEST_METHOD'] == 'POST') {
    // Determine the file's size and ID
    $fileId = $files[0]->id;
    $fileSize = intval($files[0]->size);

    // Get the authorized Guzzle HTTP client
    $http = $client->authorize();

    // Open a file for writing
    $fp = fopen('Big File (downloaded)', 'w');

    // Download in 1 MB chunks
    $chunkSizeBytes = 1 * 1024 * 1024;
    $chunkStart = 0;

    // Iterate over each chunk and write it to our file
    while ($chunkStart < $fileSize) {
        $chunkEnd = $chunkStart + $chunkSizeBytes;
        $response = $http->request(
            'GET',
            sprintf('/drive/v3/files/%s', $fileId),
            [
                'query' => ['alt' => 'media'],
                'headers' => [
                    'Range' => sprintf('bytes=%s-%s', $chunkStart, $chunkEnd)
                ]
            ]
        );
        $chunkStart = $chunkEnd + 1;
        fwrite($fp, $response->getBody()->getContents());
    }

    // close the file pointer
    fclose($fp);
}
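For a small CSV that fits in memory, a simpler approach (a sketch, again assuming the $service and $fileId from the question) is to fetch the content in a single alt=media request and parse it directly:
// Download the whole file in one call and parse the CSV rows.
$response = $service->files->get($fileId, ['alt' => 'media']);
$csvText  = (string) $response->getBody();

$rows = array_map('str_getcsv', preg_split('/\R/', trim($csvText)));
foreach ($rows as $row) {
    print_r($row); // each $row is an array of column values
}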
I have an S3 bucket and am using a GitHub S3 class. I can successfully upload a file that is less than 5 GB, but I want to upload more than 5 GB of data, e.g. 100 GB, in a single transfer.
S3::setAuth(awsAccessKey, awsSecretKey);
$bucket = "upload-bucket";
$path = "myfiles/"; // Can be empty ""
$lifetime = 3600; // Period for which the parameters are valid
$maxFileSize = (1024 * 1024 * 1024 * 5);
$metaHeaders = array("uid" => 123);
$requestHeaders = array(
    "Content-Type" => "application/octet-stream",
    "Content-Disposition" => 'attachment; filename=${filename}'
);
$params = S3::getHttpUploadPostParams(
    $bucket,
    $path,
    S3::ACL_PUBLIC_READ,
    $lifetime,
    $maxFileSize,
    201, // Or a URL to redirect to on success
    $metaHeaders,
    $requestHeaders,
    false // False since we're not using flash
);
$uploadURL = "https://{$bucket}.s3.amazonaws.com/";
<form method="post" action="<?php echo $uploadURL; ?>" enctype="multipart/form-data">
<input type="file" name="file" /> <input type="submit" value="Upload" />
</form>
I want to upload more than 100 GB of data through the file browser. Using the above code I am able to upload 5 GB of data, but if I try to upload more than 5 GB, e.g. 10 GB or 100 GB, I get an error.
Read the FAQ:
The largest object that can be uploaded in a single PUT is 5
gigabytes. For objects larger than 100 megabytes, customers should
consider using the Multipart Upload capability.
You cannot upload a file larger than 5 GB in one request, so you need to develop a solution for chunked ("multipart") upload.
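If you can do the upload server-side, the official AWS SDK for PHP (not the GitHub S3 class from the question) already wraps multipart upload. A minimal sketch, with example bucket, key, region and part size:
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

// 100 MB parts keep a 100 GB object well under S3's 10,000-part limit.
$uploader = new MultipartUploader($s3, '/path/to/100gb-file.bin', [
    'bucket'    => 'upload-bucket',
    'key'       => 'myfiles/100gb-file.bin',
    'part_size' => 100 * 1024 * 1024,
]);

try {
    $result = $uploader->upload();
    echo "Uploaded to {$result['ObjectURL']}\n";
} catch (MultipartUploadException $e) {
    echo $e->getMessage() . "\n";
}
For browser-based uploads of that size you would still need to split the file client-side (or hand out presigned URLs per part), since a single HTML form POST to S3 is capped at 5 GB.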
I have this code to get files from Rackspace Cloud Files:
$username = getenv('RS_USERNAME');
$apikey = getenv('RS_APIKEY');
$containerName = 'imagenesfc';
$region = 'DFW';
use OpenCloud\Rackspace;

$client = new Rackspace(Rackspace::US_IDENTITY_ENDPOINT, array(
    'username' => $username,
    'apiKey'   => $apikey,
));
$filename = "myfile.ext";
$service = $client->objectStoreService(null, $region);
$container = $service->getContainer($containerName);
$object = $container->getObject($filename);
$account = $service->getAccount();
$account->setTempUrlSecret();
$tempUrl = $object->getTemporaryUrl(15, 'GET');
The php-opencloud documentation says you can change 'GET' to 'PUT', which I imagine lets you put a file instead of getting it. The problem is that the file doesn't exist yet, and apparently the only way to create a file is to upload it first? PHP SDK Container
In Amazon s3 I can do the following to get what I want:
$keyname = "myfile.ext";
$arr = array(
'Bucket' => 'bucket',
'Key' => $keyname
);
$cmd = $s3Client->getCommand('PutObject', $arr);
$request = $s3Client->createPresignedRequest($cmd, '+10 minutes');
$presignedUrl = (string) $request->getUri();
Then I can write to the presigned URL any way I prefer, for example with Java:
url = new URL(jObj.get("presignedUrl").toString().replace("\\", ""));
connection=(HttpURLConnection) url.openConnection();
connection.setDoOutput(true);
connection.setRequestMethod("PUT");
out = new DataOutputStream(connection.getOutputStream());
...(write,flush,close)
So basically what I want to do is get the upload URL from Rackspace and write to it like I do with Amazon S3.
Is it possible? And if it is, how can you do it?
I need to do it this way because my API will provide only download and upload links, so no traffic goes directly through it. I can't have it saving the files to my API server and then uploading them to the cloud.
Yes, you can simulate a file upload without actually uploading content to the API - all you need to do is determine what the filename will be. The code you will need is:
$object = $container->getObject();
$object->setName('blah.txt');
$account = $service->getAccount();
$account->setTempUrlSecret();
echo $object->getTemporaryUrl(100, 'PUT');
The getObject method returns an empty object, and all you're doing is setting the remote name on that object. When a temp URL is created, it uses the name you just set and presents a temporary URL for you to use, regardless of whether the object exists remotely. The temp URL can then be used to create the object.
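To then write to that temporary URL (your Java snippet does the equivalent with HttpURLConnection), a plain HTTP PUT is enough. A minimal sketch in PHP with cURL, assuming $tempUrl holds the URL echoed above and 'blah.txt' is the local file to send:
$ch = curl_init($tempUrl);
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'PUT');
curl_setopt($ch, CURLOPT_POSTFIELDS, file_get_contents('blah.txt'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
$status   = curl_getinfo($ch, CURLINFO_HTTP_CODE); // expect 201 Created on success
curl_close($ch);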
I am trying to save images from an image URL to Amazon S3. The image is created in the bucket, but it is not shown in the browser; it displays the message "the image cannot be displayed because it contains errors".
This is my code:
require_once("aws/aws-autoloader.php");
// Amazon S3
use Aws\S3\S3Client;
// Create an Amazon S3 client object
$s3Client = S3Client::factory(array(
'key' => 'XXXXXXXXXXXX',
'secret' => 'XXXXXXXXX'
));
// Register the stream wrapper from a client object
$s3Client->registerStreamWrapper();
// Save Thumbnail
$s3Path = "s3://smmrescueimages/";
$s3Stream = fopen($s3Path . 'gt.jpg', 'w');
fwrite($s3Stream, 'http://sippy.in/gt.jpg');
#fclose($s3Stream);
echo "done";
This is the generated image path: https://s3.amazonaws.com/smmrescueimages/gt.jpg
change this line
fwrite($s3Stream, 'http://sippy.in/gt.jpg');
to
fwrite($s3Stream, file_get_contents('http://sippy.in/gt.jpg'));
Otherwise you save the URL string instead of the image binary into your file.
Also, don't use # to comment out fclose() (or @ to suppress errors from PHP functions)!
Just check whether a valid handle is present:
$s3Stream = fopen($s3Path . 'gt.jpg', 'w');
if ($s3Stream) {
    fwrite($s3Stream, file_get_contents('http://sippy.in/gt.jpg'));
    fclose($s3Stream);
}
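One more thing worth checking (an assumption on my side, not something reported in the question): the stream wrapper won't set an image Content-Type for you, so the browser may still refuse to render the object inline. A sketch using putObject on the same client, which lets you set the type and ACL explicitly:
$s3Client->putObject(array(
    'Bucket'      => 'smmrescueimages',
    'Key'         => 'gt.jpg',
    'Body'        => file_get_contents('http://sippy.in/gt.jpg'),
    'ContentType' => 'image/jpeg',
    'ACL'         => 'public-read',
));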
I have developed a Facebook application that contains the following files:
1) index.php
2) cap3.php
where cap3.php generates an image using GD in PHP.
index.php displays this image using the following code:
<img src="cap3.php">
Now I want this generated image to be posted on the user's timeline.
I have tried to do so using the following code:
$facebook->setFileUploadSupport(true);
$img = 'cap3.php';
$args=array( 'source' => '#' .$current , 'message' => 'Photo uploaded via!' );
$photo = $facebook->api('/me/photos', 'POST', $args);
But the image is not posted on the user's timeline.
Please help.
One way to do that is to put your cap3.php file on your remote host to get a URL for it, for example http://www.example.com/cap3.php. Then you download that image from your app and send it to Facebook.
Let's see an example:
$local_img = 'cap3.png'; // I assume your cap3.php generates a PNG file
$remote_img = 'http://www.example.com/cap3.php';
// sample: check for errors in your production app!
$img_content = file_get_contents($remote_img);
file_put_contents($local_img, $img_content);
$facebook->setFileUploadSupport(true);
// the '@' prefix tells the old PHP SDK to upload the local file
$args = array('source' => '@' . realpath($local_img), 'message' => 'Photo uploaded via!');
$photo = $facebook->api('/me/photos', 'POST', $args);