Right now I'm working on allowing user image uploads to my site using Google Cloud Storage. Uploading regular image files such as jpg, png, gif, and webp works fine. However, SVG images do not work. They get uploaded OK, but when I have the PHP code echo the URL as an image source, all browsers just display the missing-image icon. However, the image does appear to download in the network tab of the code inspector. Not only that, pasting the link into its own tab causes the file to download. This makes me think that the server is telling the browser to download the file rather than serve it as an image. Here is the code that I am using:
include 'GDS/GDS.php';
//create datastore
$obj_store = new GDS\Store('HomeImages');
$bucket = CloudStorageTools::getDefaultGoogleStorageBucketName();
$root_path = 'gs://' . $bucket . '/' . $_SERVER["REQUEST_ID_HASH"] . '/';
$public_urls = [];
//loop through all files that are images
foreach ($_FILES['images']['name'] as $idx => $name) {
    if ($_FILES['images']['type'][$idx] === 'image/jpeg' || $_FILES['images']['type'][$idx] === 'image/png' ||
        $_FILES['images']['type'][$idx] === 'image/gif' || $_FILES['images']['type'][$idx] === 'image/webp' ||
        $_FILES['images']['type'][$idx] === 'image/svg+xml') {
        //path where the file should be moved to
        $original = $root_path . 'original/' . $name;
        //move the file
        move_uploaded_file($_FILES['images']['tmp_name'][$idx], $original);
        //don't use the getImageServingUrl function on SVG files because they aren't really images
        if ($_FILES['images']['type'][$idx] === 'image/svg+xml') {
            $public_urls[] = [
                'name' => $name,
                'original' => CloudStorageTools::getPublicUrl($original, true),
                'thumb' => CloudStorageTools::getPublicUrl($original, true),
                'location' => $original
            ];
        } else {
            $public_urls[] = [
                'name' => $name,
                'original' => CloudStorageTools::getImageServingUrl($original, ['size' => 1263, 'secure_url' => true]),
                'thumb' => CloudStorageTools::getImageServingUrl($original, ['size' => 150, 'secure_url' => true]),
                'location' => $original
            ];
        }
    }
}
//store image location and name in the datastore
foreach($public_urls as $urls){
$image = new GDS\Entity();
$image->URL = $urls['original'];
$image->thumbURL = $urls['thumb'];
$image->name = $urls['name'];
$image->location = $urls['location'];
$obj_store->upsert($image);
}
//redirect back to the admin page
header('Location: /admin/homeimages');
Having run into this issue just now, I found a solution. It turns out that every file in a bucket has metadata attached, stored as key-value pairs. The key we're after is 'Content-Type', and for SVG files its value isn't always correct; it needs to be 'image/svg+xml'. I don't know how to set that programmatically, but if you only have a few objects, it's easy to do in the file's ellipses menu in the online interface for the bucket.
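If you do want to set it programmatically, one option is the google/cloud-storage Composer package, which is separate from the App Engine CloudStorageTools used in the question. A minimal sketch, assuming that library is installed and noting that the bucket and object names here are illustrative:

```php
<?php
// Illustrative sketch using the google/cloud-storage package, not the
// App Engine CloudStorageTools from the question above.
require 'vendor/autoload.php';

use Google\Cloud\Storage\StorageClient;

$storage = new StorageClient();
$bucket  = $storage->bucket('my-bucket');          // hypothetical bucket name
$object  = $bucket->object('original/logo.svg');   // hypothetical object name

// Patch only the Content-Type metadata; the object data is untouched.
$object->update(['contentType' => 'image/svg+xml']);
```

From the command line, `gsutil setmeta -h "Content-Type:image/svg+xml" gs://my-bucket/original/logo.svg` accomplishes the same thing.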
I have a form with some inputs, including thumbnail and images fields, and some validation conditions to check the data from those inputs. The data entered in the images column is saved successfully, but the thumbnail column ends up containing C:\xampp\tmp\phpACA2.tmp. How do I make the validation so that the data is saved properly?
public function store(Request $request)
{
    if ($request->file('thumbnail')) {
        $request->file('thumbnail')->store('post-images');
    }

    $image = array();
    if ($files = $request->file('images')) {
        foreach ($files as $file) {
            $image_name = md5(rand(1000, 10000));
            $ext = strtolower($file->getClientOriginalExtension());
            $image_full_name = $image_name.'.'.$ext;
            $upload_path = 'public/storage/post-images/';
            $image_url = $upload_path.$image_full_name;
            $file->move($upload_path, $image_full_name);
            $image[] = $image_url;
        }
    }

    Product::create([
        'title' => $request->title,
        'subtitle' => $request->subtitle,
        'description' => $request->description,
        'features' => $request->features,
        'categories_id' => $request->categories_id,
        'thumbnail' => $request->thumbnail,
        'file' => $request->file,
        'images' => implode('|', $image),
    ]);

    return redirect('/dashboard/products');
}
(Screenshots of the thumbnail and images columns as stored in the database omitted.)
You should not assign thumbnail as $request->thumbnail (that gives you its local temporary path). You will need to assign it like this (taking into account that the 'post-images' disk's path is public/storage/post-images/):
'thumbnail' => 'public/storage/post-images/'.$request->file('thumbnail')->getClientOriginalName()
My recommendation is to apply the same logic to the thumbnail that you applied to each image:
Generate a unique name.
Store the file under the generated name with its extension.
Then save its path and name in the database.
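A minimal sketch of those three steps, reusing the naming scheme from the images loop in the question (the upload path and the way the value is passed to Product::create are assumptions):

```php
// Sketch: handle the thumbnail exactly like the gallery images above.
$thumbnail_url = null;
if ($thumb = $request->file('thumbnail')) {
    // 1. generate a unique name, keeping the original extension
    $thumb_name  = md5(rand(1000, 10000)).'.'.strtolower($thumb->getClientOriginalExtension());
    $upload_path = 'public/storage/post-images/';

    // 2. store the file under the generated name
    $thumb->move($upload_path, $thumb_name);

    // 3. keep the path so it can be saved in the database
    $thumbnail_url = $upload_path.$thumb_name;
}

// ...then in Product::create([...]):
// 'thumbnail' => $thumbnail_url,
```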
I have a non-versioned S3 bucket (VersionId is null for all files), files have different names.
My current code is:
$path = $this->key.'/primary/pdfs/'.$id.'/';
$result = $this->s3->listObjects(['Bucket' => $this->bucket,"Prefix" => $path])->toArray();
//get the last object from s3
$object = end($result['Contents']);
$key = $object['Key'];
$file = $this->s3->getObject([
'Bucket' => $this->bucket,
'Key' => $key
]);
//download the file
header('Content-Type: application/pdf');
echo $file['Body'];
The above is incorrect, as end() simply returns the last element of the listing, which is not the latest file.
Do I need to use the API call below? If so, how do I use it?
$result = $this->s3->listObjectVersions(['Bucket' => $this->bucket,"Prefix" => $path])->toArray();
Since the VersionId of all files is null, there is only one version of each file in the bucket, so listObjectVersions won't help here.
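What listObjects does not give you is date ordering (keys come back sorted by name), so one approach is to sort the listing by LastModified yourself. A sketch, assuming the same $this->s3 client and $path prefix as in the question:

```php
$result = $this->s3->listObjects([
    'Bucket' => $this->bucket,
    'Prefix' => $path,
])->toArray();

// Sort ascending by modification time; the last element is then
// genuinely the most recently modified object under the prefix.
$objects = $result['Contents'];
usort($objects, function ($a, $b) {
    // LastModified may be a string or a DateTime-like object depending
    // on the SDK version; casting to string covers both cases.
    return strtotime((string) $a['LastModified']) - strtotime((string) $b['LastModified']);
});

$latest = end($objects);
$key = $latest['Key'];
```

Note that this requires the full listing; if the prefix holds more than 1000 objects, the listing is paginated and each page has to be walked.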
So, I've been trying to get this to work for the past couple of hours, but I can't figure it out. The goal is to pull the converted mp4 file from gfycat and upload that file to the Amazon S3 bucket.
gfycat is returning a JSON object properly, and $result->mp4Url contains a correct URL to the mp4 file. I keep getting errors such as "object expected, string given". Any ideas? Thanks.
// convert the gif to video format using gfycat
$response = file_get_contents("http://upload.gfycat.com/transcode/" . $key .
"?fetchUrl=" . urlencode($upload->getUrl('g')));
$result = json_decode($response);
$result_mp4 = file_get_contents($result->mp4Url);
// save the converted file to AWS S3 /i
$s3->putObject(array(
'Bucket' => getenv('S3_BUCKET'),
'Key' => 'i/' . $upload->id64 . '.mp4',
'SourceFile' => $result_mp4,
));
var_dump($response) yields:
string '{
"gfyId":"vigorousspeedyinexpectatumpleco",
"gfyName":"VigorousSpeedyInexpectatumpleco",
"gfyNumber":"884853904",
"userName":"anonymous",
"width":250,
"height":250,
"frameRate":11,
"numFrames":67,
"mp4Url":"http:\/\/zippy.gfycat.com\/VigorousSpeedyInexpectatumpleco.mp4",
"webmUrl":"http:\/\/zippy.gfycat.com\/VigorousSpeedyInexpectatumpleco.webm",
"gifUrl":"http:\/\/fat.gfycat.com\/VigorousSpeedyInexpectatumpleco.gif",
"gifSize":1364050,
"mp4Size":240833,
"webmSize":220389,
"createDate":"1388777040",
"views":"205",
"title":'... (length=851)
Using json_decode() on it also yields similar results.
You are mixing up the 'SourceFile' parameter (which accepts a file path) with the 'Body' parameter (which accepts raw data). See Uploading Objects in the AWS SDK for PHP User Guide for more examples.
Here are 2 options that should work:
Option 1 (Using SourceFile)
// convert the gif to video format using gfycat
$response = file_get_contents("http://upload.gfycat.com/transcode/" . $key .
"?fetchUrl=" . urlencode($upload->getUrl('g')));
$result = json_decode($response);
// save the converted file to AWS S3 /i
$s3->putObject(array(
'Bucket' => getenv('S3_BUCKET'),
'Key' => 'i/' . $upload->id64 . '.mp4',
'SourceFile' => $result->mp4Url,
));
Option 2 (Using Body)
// convert the gif to video format using gfycat
$response = file_get_contents("http://upload.gfycat.com/transcode/" . $key .
"?fetchUrl=" . urlencode($upload->getUrl('g')));
$result = json_decode($response);
$result_mp4 = file_get_contents($result->mp4Url);
// save the converted file to AWS S3 /i
$s3->putObject(array(
'Bucket' => getenv('S3_BUCKET'),
'Key' => 'i/' . $upload->id64 . '.mp4',
'Body' => $result_mp4,
));
Option 1 is better, though, because the SDK will use a file handle to the mp4 file instead of loading the entire thing into memory (as file_get_contents does).
I have tried many ways to upload an image to WordPress using XML-RPC, and I get a perfect response with an array of file name, path, and file type. Still, if I look at the image in WordPress, it is a 0-byte corrupted image file.
I have made a class to handle all operations such as create post, edit post, delete post, etc. All of them work fine; only wp.uploadFile is not working well.
Here is my function for image upload.
function upload_pic($url, $pic, $type = 'image/jpg')
{
    $fs = filesize($url);
    $file = fopen($url, 'rb');
    $filedata = fread($file, $fs);
    fclose($file);
    $content = array(
        'name' => $pic,
        'type' => $type,
        'bits' => new IXR_Base64($filedata),
        'overwrite' => false
    );
    $params = array(1, $this->UserName, $this->PassWord, $content, true);
    return $this->send_request('wp.uploadFile', $params);
}
I am getting the following response:
Array
(
[id] => 190
[file] => P_1364799102.jpg
[url] => http://localhost/wordpress/wp-content/uploads/2013/04/P_13647991025.jpg
[type] => image/jpg
)
The response looks good, but the image file is still corrupted at 0 bytes.
Please help me with this. I have also tried 'metaWeblog.newMediaObject', but the problem is the same.
I have found a fix; it is working fine now.
function upload_pic($postid, $myFile, $name, $type = 'image/jpeg')
{
    $rpcurl = $this->XMLRPCURL;
    $username = $this->UserName;
    $password = $this->PassWord;
    $file = file_get_contents($myFile);
    $filetype = $type;
    $filename = $name;
    xmlrpc_set_type($file, 'base64'); // <-- required!
    $params = array($postid, $username, $password, array(
        'name' => $filename,
        'type' => $filetype,
        'bits' => $file,
        'overwrite' => false
    ));
    $request = xmlrpc_encode_request('wp.uploadFile', $params);
    $result = xmlrpc_decode($this->go($request, $rpcurl));
    return $result;
}
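A hypothetical call to the fixed function; the $wp instance, post ID, and file path below are illustrative, not part of the original class:

```php
// Illustrative usage of the fixed upload_pic(); the variable names and
// paths here are assumptions.
$result = $wp->upload_pic(190, '/tmp/photo.jpg', 'photo.jpg', 'image/jpeg');

if (isset($result['url'])) {
    echo 'Uploaded to: ' . $result['url'];
}
```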
Thanks
In our application a user is allowed to upload an image of dimensions 1024 x 768 (around 150 KB).
When the user uploads an image, the following things happen:
1) The image is uploaded to a temporary directory.
2) The image is cropped into four different sizes.
3) The original image and its cropped versions are uploaded to the Amazon S3 server.
The above process proves to be time-consuming for the user.
After profiling with Xdebug, it seems that 90% of the time is consumed by uploading images to Amazon S3.
I am using the method below to save an image in the Amazon S3 bucket:
public function saveInBucket($sourceLoc, $bucketName = '', $destinationLoc = '')
{
    if ($bucketName <> '' && $destinationLoc <> '' && $sourceLoc <> '') {
        $s3 = new AmazonS3();
        $response = $s3->create_object($bucketName.'.xyz.com', $destinationLoc, array(
            'contentType' => 'application/force-download',
            'acl' => AmazonS3::ACL_PUBLIC,
            'fileUpload' => $sourceLoc
        ));
        if ((int) $response->isOK()) {
            return TRUE;
        }
        $this->ErrorMessage = 'File upload operation failed,Please try again later';
        return FALSE;
    }
    return FALSE;
}
I also thought of uploading the image directly to Amazon S3, but I cannot do that since I also have to crop the image into 4 different sizes.
How can I speed up or improve the image management process?
This happened to me before. What you can do is:
When you resize your image, convert it to a string.
I was using the WideImage class.
Example:
$image = WideImage::load($_FILES["file"]['tmp_name']);
$resized = $image->resize(1024);
$data = $resized->asString('jpg');
And then when you're uploading to Amazon, you have to use the param 'body' instead of 'fileUpload'.
Example:
$response = $s3->create_object($bucketName.'.xyz.com', $destinationLoc, array(
    'contentType' => 'application/force-download',
    'acl' => AmazonS3::ACL_PUBLIC,
    'body' => $data
));
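Putting the two pieces together, a hedged end-to-end sketch (the crop widths, the 'images/...' key layout, and the output file name are illustrative; it still uses the old AmazonS3 SDK class from the question):

```php
// Illustrative sketch: resize in memory with WideImage, then upload the
// raw bytes via 'body', so no temporary crop files are written to disk.
$image = WideImage::load($_FILES['file']['tmp_name']);
$s3 = new AmazonS3();

foreach (array(1024, 640, 320, 150) as $width) {   // example crop sizes
    $data = $image->resize($width)->asString('jpg');
    $s3->create_object($bucketName.'.xyz.com', 'images/'.$width.'/photo.jpg', array(
        'contentType' => 'image/jpeg',
        'acl'         => AmazonS3::ACL_PUBLIC,
        'body'        => $data,
    ));
}
```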
I hope that helps.