Issue with file edit in PHP using file_put_contents (Laravel) - php

Good afternoon everyone. I have run into a weird issue which I can't put my finger on, and I am hoping somebody can help me figure out what is causing it.
For some context: I allow the user to store an array of images on a product. The images are stored using the laravel-stapler package, which is configured as follows:
public function __construct(array $attributes = [])
{
    parent::__construct($attributes);
    $this->hasAttachedFile('image', [
        'styles' => [
            'thumbnail' => '500x500#',
            'large' => '800x800#'
        ],
        'url' => '/media/image/:id/:style/:filename',
        'default_url' => '/img/category-placeholder-greyscale.jpg',
        'convert_options' => [
            'jpeg_quality' => 60
        ]
    ]);
}
The images are saved in three folders:
../path-to-image/original/file-name
../path-to-image/thumbnail/file-name
../path-to-image/large/file-name
After saving these images, I use croppie.js to rotate and edit them. Once the user crops and submits, the image is sent as a base64 string to a controller, shortened below to only the relevant parts:
// Strip the "data:image/...;base64," prefix, then decode the payload
$imageData = $request->get('imagebase64');
list(, $imageData) = explode(';', $imageData);
list(, $imageData) = explode(',', $imageData);
$imageData = base64_decode($imageData);

// $image is loaded up through dependency injection
$path = public_path('/path' . '/large/' . $image->image_file_name);
$path2 = public_path('/path' . '/original/' . $image->image_file_name);
file_put_contents($path, $imageData);
file_put_contents($path2, $imageData);
This works fine on my local machine: the image is saved in both folders and I get a new cropped and edited image. On my server, however, the first file_put_contents fails and does not store a new image in the /large folder, while the second file_put_contents works and stores a new image in the /original folder.
I am not sure why this happens and would appreciate any help. I don't think it is a permissions issue, because I gave the image folders the right permissions, but I can't be certain. The code doesn't crash either; it just executes without saving the first image.

You might have to edit the .htaccess file to give permissions to these files.
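Before touching server configuration, it can help to check directory writability and the return value of `file_put_contents()` explicitly rather than assuming the write succeeded (it returns `false` on failure). A standalone sketch of that diagnostic, where the temp-dir base path, file name, and payload are placeholders for the real image paths and decoded base64 data:

```php
<?php
// Diagnostic sketch: verify each target directory is writable and check the
// return value of file_put_contents() instead of assuming it succeeded.
$baseDir   = sys_get_temp_dir() . '/image-debug'; // stand-in for public_path('/path')
$fileName  = 'cropped.jpg';                       // stand-in for $image->image_file_name
$imageData = 'fake-binary-data';                  // stand-in for the decoded base64 payload

foreach (['large', 'original'] as $style) {
    $dir  = $baseDir . '/' . $style;
    $path = $dir . '/' . $fileName;

    if (!is_dir($dir)) {
        mkdir($dir, 0775, true); // create missing style folders recursively
    }

    if (!is_writable($dir)) {
        error_log("Directory is not writable: {$dir}");
        continue;
    }

    if (file_put_contents($path, $imageData) === false) {
        error_log("Failed to write image to: {$path}");
    }
}
```

Logging which of the two writes actually fails, and why, narrows the problem down to either the path or the directory permissions on the server.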

Related

Can't write image data to path (upload/brand/1710068361905783.png)

My project works perfectly on my local server, but after I uploaded it to a public server, all of the add-image functions give me the same error:
Can't write image data to path (upload/....
I am using Laravel 8. Here is my code for adding a brand image:
$image = $request->file('brand_image');
$name_gen = hexdec(uniqid()) . '.' . $image->getClientOriginalExtension();
Image::make($image)->resize(300, 300)->save('upload/brand/' . $name_gen);
$save_url = 'upload/brand/' . $name_gen;
Brand::insert([
    'brand_name_en' => $request->brand_name_en,
    'brand_name_ar' => $request->brand_name_ar,
    'brand_slug_en' => strtolower(str_replace(' ', '-', $request->brand_name_en)),
    'brand_slug_ar' => str_replace(' ', '-', $request->brand_name_ar),
    'brand_image' => $save_url,
]);
If you haven't created the folder in your path, you will get that error. You need to create it first.
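A sketch of that fix, creating the directory before writing into it. The temp-dir base path below is a stand-in for the project's real `upload/brand` path, and a plain file write stands in for the Intervention `Image::make(...)->save(...)` call:

```php
<?php
// Ensure the upload directory exists before saving the image into it.
$uploadDir = sys_get_temp_dir() . '/upload/brand'; // stand-in for the server's 'upload/brand'

if (!is_dir($uploadDir)) {
    mkdir($uploadDir, 0755, true); // third argument: create parent folders recursively
}

// With the directory in place, a write such as
// Image::make($image)->resize(300, 300)->save($uploadDir . '/' . $name_gen)
// will no longer fail. A plain write stands in for it here:
file_put_contents($uploadDir . '/example.png', 'png-bytes');
```

On a fresh server deploy, uploaded directories are often missing because empty folders are not tracked by git, which is why this only shows up in production.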

Upload image in specific folder php Cloudinary

I'm trying to upload images to folders in my Cloudinary account, but I can't get it to work; it always saves to the default home folder.
I have two folders in there called categories and products.
This is what I have right now
\Cloudinary\Uploader::upload(
    $request->file('image'),
    $options = array('public_id' => '/home/categories/' . $request->name . '.' . $extension)
);
I also tried '/categories/' . $request->name . '.' . $extension.
I get this
Cloudinary\Error
public_id (/home/categories/asdasdf.jpg) is invalid
The docs are here; I can't see what I'm doing wrong:
https://cloudinary.com/documentation/image_upload_api_reference#upload_method
How can I fix this?
The correct way is like this:
\Cloudinary\Uploader::upload(
    $request->file('image'),
    $options = array('public_id' => 'categories/' . $request->name)
);
This saves the file at /home/categories/file

503 Unavailable Service on Form Submit

I created a project with PHP Laravel and Vue JS and configured it on Amazon AWS's basic plan. In the beginning it worked well without any issues, but now, when I try to add a new blog post or edit an existing blog post with heavy content, I get a 503 Service Unavailable error. I designed the blog algorithm as shown below.
Tables:
blogs - This table stores the lightweight data of the post, like title, featured image, URL, etc.
posts - This table stores the actual content of the post. It contains three columns: blog id, content, and order. I am using the text datatype for content, which accepts nearly 65k characters.
Algorithm:
When I submit a blog post, it first creates a row in the blogs table with the lightweight data. The actual post content is then split into an array whose items each contain 65k characters, and every item is stored in the posts table as a new row keyed to the blog id created in the blogs table. When retrieving the post, the rows in the posts table are joined back together to display the full post.
Note: The above process is working fine without any issues.
Problem:
The actual problem is that it suddenly started showing a 503 error when I try to add a new post or edit an existing post that contains images (which produce a heavy amount of characters). The post is still created in the blogs table and a partial amount of content is added to the posts table, but while adding the rest of the content it shows the 503 error.
Note: It works fine on my localhost and on another Bluehost server.
I tried reducing the content split to 25k characters, but the result is the same.
if ($request->hasFile('image')) {
    $filenameWithExt = $request->file('image')->getClientOriginalName();
    $filename = pathinfo($filenameWithExt, PATHINFO_FILENAME);
    $extension = $request->file('image')->getClientOriginalExtension();
    $fileNameToStore = 'post_' . time() . '.' . $extension;
    $path = public_path('uploads/posts/' . $fileNameToStore);
    if (!file_exists(public_path('uploads/posts/'))) {
        mkdir(public_path('uploads/posts/'), 0777);
    }
    Image::make($request->file('image')->getRealPath())->resize(900, NULL)->save($path);
} else {
    $fileNameToStore = NULL;
}
if ($request->hasFile('author_image')) {
    $filenameWithExt = $request->file('author_image')->getClientOriginalName();
    $filename = pathinfo($filenameWithExt, PATHINFO_FILENAME);
    $extension = $request->file('author_image')->getClientOriginalExtension();
    $authorImage = 'author_' . time() . '.' . $extension;
    $path = public_path('uploads/authors/' . $authorImage);
    if (!file_exists(public_path('uploads/authors/'))) {
        mkdir(public_path('uploads/authors/'), 0777);
    }
    Image::make($request->file('author_image')->getRealPath())->resize(50, 50)->save($path);
} else {
    $authorImage = NULL;
}
$blog = Blog::create([
    'title' => $request->title,
    'image' => $fileNameToStore,
    'category' => $request->category,
    'meta_description' => $request->meta_description,
    'meta_keywords' => $request->meta_keywords,
    'url' => $request->url,
    'meta_title' => $request->meta_title,
    'author_name' => $request->author_name,
    'author_image' => $authorImage,
    'author_linkedin' => $request->author_linkedin,
    'popular' => $request->popular
]);
$contents = str_split($request->post, 65000);
$i = 1;
foreach ($contents as $key => $item) {
    Post::create([
        'post' => $blog->id,
        'content' => $item,
        'order' => $i
    ]);
    $i++;
}
I expect to be redirected back to the blogs page with the success message "Post has been created successfully", but the actual result is:
Service Unavailable
The server is temporarily unable to service your request due to maintenance downtime or capacity problems. Please try again later.
It looks like you need to increase the values of post_max_size and upload_max_filesize in php.ini.
Guide for AWS: https://aws.amazon.com/ru/premiumsupport/knowledge-center/wordpress-themes-2mb/
I also recommend using a transaction in your case. This will help avoid partially created posts. https://laravel.com/docs/5.8/database#database-transactions
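Laravel's `DB::transaction()` wraps the same begin/commit/rollback pattern that plain PDO exposes. A self-contained sketch of that pattern using an in-memory SQLite database, where the table layout and chunk sizes are illustrative stand-ins for the question's `posts` table and 65k-character split:

```php
<?php
// Transaction pattern: either every content-chunk row is written, or none are,
// so a 503 mid-request can no longer leave a partially created post behind.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE posts (blog_id INTEGER, content TEXT, "order" INTEGER)');

// Stand-in for str_split($request->post, 65000): 10 chars split into 4-char chunks
$chunks = str_split(str_repeat('x', 10), 4);

try {
    $pdo->beginTransaction();
    $stmt = $pdo->prepare('INSERT INTO posts (blog_id, content, "order") VALUES (?, ?, ?)');
    foreach ($chunks as $i => $chunk) {
        $stmt->execute([1, $chunk, $i + 1]);
    }
    $pdo->commit(); // all chunk rows become visible together
} catch (Exception $e) {
    $pdo->rollBack(); // on any failure, no partial post is left behind
    throw $e;
}

$count = $pdo->query('SELECT COUNT(*) FROM posts')->fetchColumn();
```

In the Laravel controller itself, wrapping the `Blog::create` and the chunk loop in one `DB::transaction(function () { ... })` closure achieves the same all-or-nothing behavior.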

Podio API image upload on item

Hoping I can get some help here. I am able to upload image files via the API and receive a file_id as a response, but every time I try to update an image field on an item using the returned id, I get:
"File with mimetype application/octet-stream is not allowed, must match one of (MimeTypeMatcher('image', ('png', 'x-png', 'jpeg', 'pjpeg', 'gif', 'bmp', 'x-ms-bmp')),)
I've even added a line of code to my PHP script to pull the jpeg from file and rewrite it as a jpeg (imagejpeg()) before uploading, just to be certain. Still, when I get to the point of updating the image field on the item, I get the same error. It seems all images uploaded via the API are converted to octet-stream. How do I get around this?
I'm using the Podio PHP library.
The PHP code is as follows:
$fileName = "testUpload.jpeg";
imagejpeg(imagecreatefromstring(file_get_contents($fileName)), $fileName);
$goFile = PodioFile::upload($fileName, $itemID);
$fileID = $goFile->file_id;
PodioItem::update((int)$itemID, array(
    'fields' => array(
        "logo" => (int)$fileID,
    )
), array(
    "hook" => 0
));
Please try replacing:
$goFile = PodioFile::upload($fileName,$itemID);
with something like:
$goFile = PodioFile::upload('/path/to/example/file/example.jpg', 'example.jpg');
$fileID = $goFile->file_id;
PodioItem::update((int)$itemID, array(
    'fields' => array(
        "logo" => array((int)$fileID),
    )
));
as described in https://developers.podio.com/examples/files#subsection_uploading.
Then use $fileID as you did before. And yes, the filename should have a file extension as well, so it will not work with just 123123123, but it should work with 123123123.jpg.

How to Zip Multiple Files Using the Zend Compress Filter?

I have multiple files to be compressed into one zip file, but it seems that a new archive is created every time the loop restarts. How do I fix that? There isn't enough documentation about this. Here is my code:
$file1 = "test1.xls";
$files2 = array(
    "test2.xls", "test3.xls"
);
$filter = new Zend_Filter_Compress(array(
    'adapter' => 'Zip',
    'options' => array(
        'archive' => BASE_PATH . '/public/' . $this->configuracoes->get('media') . '/temp/zipped' . date('d-m-Y-H-i-s') . '.zip'
    )
));
$compress = $filter->filter($file1);
foreach ($files2 as $file) {
    $compress = $filter->filter($file);
}
This only results in a zip with test3.xls inside.
Worth looking at this link; it seems to be the same answer, where you open the zip, add the files, then close it outside the loop:
Create zip with PHP adding files from url
An old question, but I found this while looking for something similar.
You can pass a directory to the filter function and it will recursively add all files within that directory, i.e.:
$compress = $filter->filter('/path/to/directory/');
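If switching away from the Zend filter is an option, PHP's built-in ZipArchive follows exactly the open/add/close pattern the linked answer describes: one archive is opened, every file is added inside the loop, and the archive is closed once at the end. A runnable sketch, with temp files standing in for the spreadsheets:

```php
<?php
// Open one archive, add every file inside the loop, close once at the end.
$dir = sys_get_temp_dir() . '/zip-demo'; // stand-in for the real media/temp path
if (!is_dir($dir)) {
    mkdir($dir, 0755, true);
}

$files = ['test1.xls', 'test2.xls', 'test3.xls'];
foreach ($files as $f) {
    file_put_contents($dir . '/' . $f, 'dummy contents'); // placeholder spreadsheet data
}

$zipPath = $dir . '/zipped.zip';
$zip = new ZipArchive();
$zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE);
foreach ($files as $f) {
    $zip->addFile($dir . '/' . $f, $f); // second arg: entry name inside the archive
}
$zip->close(); // the archive is written here, with all three files inside
```

Because the archive is only closed once, all entries end up in a single zip instead of each call overwriting the previous one.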
