Convert image from png to webp then upload with ftp in Laravel - php

Preface
Environment:
- OS: Ubuntu
- PHP: 7.4
- Laravel: ^8.12
I'm writing a scraper page for a web app that I'm developing and these are the steps that I'm trying to achieve:
1. Scrape the image from the target website.
2. Convert the image from png to webp.
3. Upload the converted webp to my server via FTP.
The core of the program is one line:
Storage::disk('ftp')->put("/FILE_UPLOAD_LOCATION", file_get_contents("scraped-image.png"));
This confirmed that my FTP environment was configured correctly.
Attempts and Errors
I should just be able to do the following, should I not?
$image = imagecreatefromstring(file_get_contents("remote_url.png"));
imagepalettetotruecolor($image); // needed for webp
Storage::disk('ftp')->put('remote_destination', imagewebp($image));
The answer is no. This does not work: it just creates an "image" file whose entire contents are 1.
I haven't tried much outside of this, but hopefully someone has an answer or can point me in a new direction.

This is happening because imagewebp() doesn't return the image but a boolean indicating whether it worked or not.
You must create a handle to hold the image in memory, then hand that handle to the FTP disk:
$handle = fopen("php://memory", "w+b");
$image = imagecreatefromstring(file_get_contents("remote_url.png"));
imagepalettetotruecolor($image); // needed for webp
imagewebp($image, $handle);
imagedestroy($image); // free up memory
rewind($handle); // rewind so the stream is read from the beginning
Storage::disk('ftp')->put('remote_destination', $handle);
fclose($handle); // close the handle and free up memory
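A minor variant, assuming GD was built with webp support: since Storage::put() also accepts a plain string, you can capture the encoder's output with output buffering instead of a memory stream. The URL and destination path below are placeholders from the question:

```php
<?php
// Sketch: capture imagewebp()'s output in a buffer instead of a stream.
// Assumes GD has webp support and that the Laravel Storage facade is
// available; "remote_url.png" and 'remote_destination' are placeholders.
$image = imagecreatefromstring(file_get_contents("remote_url.png"));
imagepalettetotruecolor($image); // webp needs a truecolor image

ob_start();                 // start capturing output
imagewebp($image);          // with no second argument, imagewebp() writes to output
$webpData = ob_get_clean(); // grab the encoded bytes as a string
imagedestroy($image);       // free up memory

Storage::disk('ftp')->put('remote_destination', $webpData);
```

Both approaches avoid writing a temporary file to disk; the buffered version is just a little shorter when you want the bytes as a string.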

Related

Getting error message on Cpanel using Imagick

I am using Imagick to edit and save SVG images.
I get an error when I save an SVG image after having cropped it.
Here is my code to operate svg images:
$image = new Imagick();
$image->readImage($path1);
$image->cropImage($rw * 11.2, $sw * 7.7, 0, 0);
$image->writeImage('Cvilogpdfbackend/core/images/tempimage/'.$targetfile.'.svg');
When I call this function, I get the error below:
PHP Fatal error: Uncaught ImagickException: delegate failed
`'potrace' --svg --output '%o' '%i'' #
error/delegate.c/InvokeDelegate/1897 in
/home/civilogc/public_html/resizeimage.php:19 Stack trace:
#0 /home/civilogc/public_html/resizeimage.php(19): Imagick->writeimage('Cvilogpdfbacken...')
#1 {main} thrown in /home/civilogc/public_html/resizeimage.php on line 19
How to solve this issue?
Why would you want to use a mainly bitmap/raster-based image processing library (ImageMagick) to modify vector SVG images?
Let me first point you to the right way:
ImageMagick should mostly be used for reading and writing bitmap images, and for creating/writing SVG images (and other vector-based formats if required, e.g. EPS, PDF etc.).
Instead of using ImageMagick/Imagick for your conversion (changing a vector image to a bitmap and back again, losing quality), just read your image, for example with this code:
$image = file_get_contents($path1);
$svg_dom = new DOMDocument();
$svg_dom->loadXML($image); // parse the SVG markup (this line was missing)
$tmp_obj = $svg_dom->getElementsByTagName("svg")->item(0);
$svg_width = floatval($tmp_obj->getAttribute("width"));
$svg_height = floatval($tmp_obj->getAttribute("height"));
$svg_viewbox = $tmp_obj->getAttribute("viewBox"); // a "min-x min-y width height" string, not a float
and then I suggest you just calculate the new viewBox with your cropping adjustment (create a function for that) and set it back, optionally set new width and height as well:
$svg_viewBox = calculateCroppedView($svg_viewbox, $new_width, $new_height, $top, $left);
$tmp_obj->setAttribute("viewBox", $svg_viewBox);
$tmp_obj->setAttribute("width", $new_width."px");
$tmp_obj->setAttribute("height", $new_height."px");
file_put_contents('Cvilogpdfbackend/core/images/tempimage/'.$targetfile.'.svg', $svg_dom->saveXML());
Note: you are not really cropping the SVG image (SVG elements), just changing the view (e.g. zoom) to it. Converting all cut paths, shapes, text etc is possible but would require a lot of coding. Either way the goal is to keep the (vector) image quality.
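The calculateCroppedView() helper above is hypothetical and left to the reader. A minimal sketch, assuming the viewBox is the usual "min-x min-y width height" string and that $top/$left are crop offsets in user units:

```php
<?php
// Hypothetical helper: shift and shrink an SVG viewBox to "crop" the view.
// Assumes $viewbox is a "min-x min-y width height" string and $top/$left
// are offsets from the current origin in user units.
function calculateCroppedView($viewbox, $new_width, $new_height, $top, $left)
{
    // split "min-x min-y width height" into numbers
    $parts = array_map('floatval', preg_split('/[\s,]+/', trim($viewbox)));
    $minX = $parts[0] + $left; // move the origin right by the left crop
    $minY = $parts[1] + $top;  // move the origin down by the top crop
    return "$minX $minY $new_width $new_height";
}
```

For example, cropping a "0 0 600 400" view to a 200x100 window starting 20 units from the left and 10 from the top yields "20 10 200 100".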
For an excellent explanation of the SVG viewBox attribute read this section:
https://css-tricks.com/scale-svg/#the-viewbox-attribute
To answer your question:
This error states that the ImageMagick on your web server cannot find, or does not have installed, the (free) potrace software that does the actual vectorization of a bitmap image.
Note: you need ImageMagick version 7, since older versions just embedded the bitmap image as a base64-encoded data string in the SVG image tag, or maybe used a weird conversion (depending on the ImageMagick version) of dots to 1px SVG circle tags. And the installed ImageMagick needs to be the same version as Imagick, for the mentioned reasons.
The potrace command line is set in delegates.xml of the ImageMagick/Imagick installation, and ImageMagick or Imagick needs to find potrace in the PATH (environment variable).
I suggest you contact your web hosting provider to install Potrace, which can be downloaded here:
http://potrace.sourceforge.net/#downloading
Then you'll have to test your script; most likely your web hosting provider will solve the issue just by installing Potrace. Maybe a path to it will need to be added to a user/system PATH variable, and that should practically solve your issue.
If not, you can try running ImageMagick (convert, now renamed to magick, the main program) and potrace yourself using PHP's exec() command, but your web host must allow that, since it's usually forbidden in PHP configuration.
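A rough sketch of that manual pipeline, assuming exec() is allowed, magick (ImageMagick 7) and potrace are on the PATH, and the filenames are placeholders:

```php
<?php
// Sketch: replicate the ImageMagick -> potrace delegate manually via exec().
// Assumes `magick` and `potrace` are installed and on PATH; input.png,
// input.pgm and output.svg are placeholder filenames.
// potrace wants a bitmap format such as PGM/PBM/BMP as input.
exec('magick input.png input.pgm', $output, $code);
if ($code === 0) {
    exec('potrace --svg --output output.svg input.pgm', $output, $code);
}
if ($code !== 0) {
    // one of the two steps failed; inspect $output for details
}
```

This is essentially what the failing delegate line in the error message was trying to do for you.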
To get further help:
Ask a question here on SO, or on the ImageMagick support forum, which has moved from their main site (where you cannot register anymore) to:
https://github.com/ImageMagick/ImageMagick/discussions

S3 creates blank transparent webp image

I have uploaded a webp image to Amazon S3, but sometimes it generates a blank transparent webp image. The Content-Type (image/webp) and file size are correct. Also note that the image displays in the mobile application but not in any browser. Everything works fine on the local server; the problem occurs only on the live server. I've used the following code:
$disk = Storage::disk('s3');
$original_targetFile = "videoflyer/webp_original/" . $image;
$disk->put($original_targetFile, file_get_contents($original_sourceFile), 'public');
S3 generates a blank image like this:
https://imgur.com/a/CMx6tsU
Sorry, it was my misunderstanding; there was an issue with the GD driver.
Now I am using https://developers.google.com/speed/webp/ and it works perfectly.
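The tool behind that link is the cwebp command-line encoder from libwebp. A minimal sketch of calling it from PHP, assuming cwebp is installed and on the PATH and that the filenames are placeholders:

```php
<?php
// Sketch: encode an image to webp with Google's cwebp binary via exec().
// Assumes cwebp (from libwebp) is on PATH; filenames are placeholders.
// -q sets the encoding quality (0-100).
$sourceFile = 'input.png';   // placeholder
$webpFile   = 'output.webp'; // placeholder

exec('cwebp -q 80 ' . escapeshellarg($sourceFile) . ' -o ' . escapeshellarg($webpFile), $output, $code);
if ($code !== 0) {
    throw new RuntimeException('cwebp failed: ' . implode("\n", $output));
}
```

Encoding with cwebp sidesteps GD entirely, which avoids the blank-image issue when the server's GD build has broken or missing webp support.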

Replace image on server and show previous image until the new one is fully uploaded

I'm uploading an image taken from a Raspberry Pi camera every minute to a lightspeed server through "ncftpput".
I want to show the updated image, and I know how to force the browser to use the latest version instead of the cached one.
So everything works properly, except that if I refresh the image (e.g. with shift-F5) during the upload, the browser reports that the image contains errors (or shows it only partially).
Is there any way to ensure the image is fully uploaded before serving the new one?
I'm not sure whether I should operate on ncftp or use PHP to make sure the swap happens only after the upload completes.
The image is a progressive JPG, but that doesn't help...
Any suggestion?
Thanks
I ended up NOT using FTP because, as Viney mentioned, the web server doesn't know when the upload is complete.
I'm using curl, which has the advantage of being preinstalled on the Raspberry Pi distro, together with a PHP upload page.
PHP makes the new image visible only once it is fully uploaded, which avoids the issue of serving a partially uploaded image.
So, to recap:
raspberry (webcam), after having taken the image:
curl -F "somepostparam=abcd" -F "operation=upload" -F "file=@filename.jpg" https://www.myserver.com/upload.php
PHP server code:
$uploadfile = '/home/domain/myserver.com/' . basename($_FILES['file']['name']);
move_uploaded_file($_FILES['file']['tmp_name'], $uploadfile);
$content = file_get_contents($uploadfile);
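One reason this approach can be made robust is that the final step can be atomic: if you move the upload into place with rename() on the same filesystem, readers see either the old file or the complete new one, never a partial write. A minimal sketch, assuming the directory layout from the snippet above (paths are placeholders):

```php
<?php
// Sketch: accept the upload under a temporary name, then atomically swap
// it into place. rename() within the same filesystem is atomic, so a
// concurrent reader never sees a half-written image. Paths are placeholders.
$dir       = '/home/domain/myserver.com/';
$tmpFile   = $dir . 'image.jpg.part';
$finalFile = $dir . 'image.jpg';

move_uploaded_file($_FILES['file']['tmp_name'], $tmpFile);
rename($tmpFile, $finalFile); // atomic swap on the same filesystem
```

With this pattern the web server can keep serving the old image.jpg right up until the instant the complete new file replaces it.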
The problem is this: you open the browser (at 07:00:10 AM) and image.jpg gets rendered. Now say it's 07:01:00 and you hit refresh, but the Raspberry Pi has already started uploading image.jpg. Say it takes 3 seconds to complete the upload: the server doesn't know about the FTP transfer, so it reads whatever bytes are currently in image.jpg and flushes them to your browser. Had it been a baseline JPG, it would have shown an image cropped in height, but since it's a progressive JPG it gets messed up. I am not aware of whether it's possible, but try looking into whether your FTP server supports file locking.
How to solve it?
The best way is to let your server know that the file it's accessing is in the middle of a write. If your FTP server supports advisory locks, you can use them: when the web server tries to access the file (via a system call), the kernel tells it the file is currently locked, so the web server waits until FTP releases the lock.
In vsftpd there is an option, lock_upload_files, in vsftpd.conf; setting it to YES enables this feature.
If you are unable to work out the above solution, you can use a trick: check the file's last-modified time, and if it's almost the same as the current time, make PHP wait for a guessed average upload time. For this method you should serve the image through PHP instead of directly from the server, i.e. just change the src of your image from '/path/to/image.jpg' to 'gen-image.php'. That script will read the image and flush it to the browser:
gen-image.php
$guessedUploadTime = 3; // guessed time that ncftpput takes to finish
$file = '/path/to/image.jpg';
$currTime = time();
$modTime = filemtime($file);
if (($currTime - $modTime) < $guessedUploadTime)
{
    sleep($guessedUploadTime);
}
$type = 'image/jpeg';
header('Content-Type: ' . $type);
readfile($file);
Note that the above solution is not ideal: if the file has just finished uploading, it won't be modified for another 57 seconds, yet a browser request at, say, 07:02:04 has to wait unnecessarily for 3 seconds, because mtime would be 07:02:03 and the browser would get the file only after 07:02:06. I recommend searching for some way (probably command-line based) to make the server and FTP work hand in hand; each should know the status of the other, because that is the root cause of this problem.

getimagesize unable to get size of remote webp files with PHP version 5.3.29

Following is the code snippet -
list($campaign_image_width, $campaign_image_height, $campaign_image_type, $campaign_image_attr)=getimagesize($campaign_image);
Wherein $campaign_image contains the url of third party images.
Problem
$campaign_image_width comes out empty for this url -
https://lh3.googleusercontent.com/VRY0O_3L8VH2wxJSTiKPr72PeM5uhPPFEsHzzYdxenddpTI150M0TYpljnZisQaROR0=h256-rw
I am not sure whether this is a limitation of getimagesize() because of an unsupported format, or an accessibility issue with the image.
Note -
- The =h256-rw appended at the end seems to tell the server to return a differently sized version of the image.
- I found that if I try to open the file in Firefox, it does not display the image but asks to download a webp file (an image format by Google, it seems).
Google Chrome opens the file and displays the image normally.
Since your server is already downloading the file, you might as well process it yourself (if the problem is that getimagesize() can't handle webp correctly). You can easily do this using the GD functions imagecreatefromwebp() with imagesx() and imagesy():
<?php
$url = 'https://lh3.googleusercontent.com/VRY0O_3L8VH2wxJSTiKPr72PeM5uhPPFEsHzzYdxenddpTI150M0TYpljnZisQaROR0=h256-rw';
$img = imagecreatefromwebp($url);
$width = imagesx($img);
$height = imagesy($img);
var_dump($width, $height);
Note:
imagecreatefromwebp() was first introduced in PHP 5.5, so make sure your minimum version is 5.5 with the GD extension installed.
If possible you can install Google's own webp converter as a binary on your server:
https://developers.google.com/speed/webp/docs/compiling#building
In this instance you're running Amazon Linux, which is based on Fedora and hence uses yum as the package manager, so you should be able to run the following command:
sudo yum install libwebp
Once you have installed this, you can make sure that your safe mode supports the binary by using safe_mode_exec_dir and one of the following execution methods:
exec
passthru
system
popen
Once you've run the conversion to eg. JPG, you can run the usual PHP tools to get the image dimensions:
$hnd = imagecreatefromjpeg('convertedImage.jpg');
$width = imagesx($hnd);
$height = imagesy($hnd);
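The conversion step itself is left implicit above; with libwebp installed it would use the dwebp decoder from the same package. A sketch with placeholder filenames (note that dwebp writes PNG by default, so the GD call changes accordingly):

```php
<?php
// Sketch: decode a webp file with libwebp's dwebp binary, then read its
// dimensions. Assumes dwebp is on PATH; filenames are placeholders.
// dwebp outputs PNG by default, hence imagecreatefrompng here.
exec('dwebp convertedImage.webp -o convertedImage.png', $output, $code);
if ($code === 0) {
    $hnd = imagecreatefrompng('convertedImage.png');
    $width = imagesx($hnd);
    $height = imagesy($hnd);
}
```

This keeps the whole pipeline workable on PHP 5.3, since only the final dimension read relies on GD and no webp support is needed in PHP itself.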
I think it's because of an unsupported format. Try imagetypes() to find out what is supported:
$bits = imagetypes();
Check out this post; it can be helpful. After installing, you'll be able to do:
$image = new Imagick($originalFilepath);
$origImageDimens = $image->getImageGeometry();
$origImgWidth = $origImageDimens['width'];
$origImgHeight = $origImageDimens['height'];

File upload with breaks

I'd like to upload large files to my server, but I would like to be able to take breaks in the upload process (for example, the user must be able to shut down his computer and continue after reboot).
I think I can handle the client-side upload, but I don't know how to build the server side. What is the best way to do it on the server side? Can PHP do that? Is PHP the most efficient option?
Thanks a lot
If you manage to make the client side post the file in chunks, you could do something like this on the server side:
// set the path of the file you upload
$path = $_GET['path'];
// set the `append` parameter to 1 to append to an existing file when uploading a new chunk of data
$append = intval($_GET['append']);
// convert the path sent with the request to a physical filename on the server
$filename = $this->convertToPhysicalPath($path);
// get the temporary uploaded file
$tmp_file = $_FILES['file']['tmp_name'];
if ($append == 0) {
    // first chunk: just copy the uploaded file
    copy($tmp_file, $filename);
} else {
    // later chunks: append the uploaded contents
    $write_handle = fopen($filename, "ab");
    $read_handle = fopen($tmp_file, "rb");
    $contents = fread($read_handle, filesize($tmp_file));
    fwrite($write_handle, $contents);
    fclose($write_handle);
    fclose($read_handle);
}
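For completeness, the client side that feeds this script could split the file and post each chunk with PHP's curl extension. A rough sketch; the URL, the 2 MB chunk size, and the path/append query parameters are assumptions matching the snippet above:

```php
<?php
// Sketch: post a large file in chunks to the server-side script above.
// The upload URL, the 2 MB chunk size and the path/append parameters are
// assumptions matching that snippet.
$source    = 'bigfile.bin';     // placeholder local file
$chunkSize = 2 * 1024 * 1024;   // 2 MB per request
$in = fopen($source, 'rb');
$append = 0;

while (!feof($in)) {
    // write the next chunk to a temp file so curl can attach it
    $chunk = fread($in, $chunkSize);
    $tmp = tempnam(sys_get_temp_dir(), 'chunk');
    file_put_contents($tmp, $chunk);

    $ch = curl_init('https://example.com/upload.php?path=' . urlencode($source) . '&append=' . $append);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, ['file' => new CURLFile($tmp)]);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch);
    curl_close($ch);
    unlink($tmp);

    $append = 1; // every chunk after the first appends
}
fclose($in);
```

To support resuming after a reboot, the client would also need to persist how many bytes were already sent and fseek() to that offset before continuing, which is the part the question is really about.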
If you are trying to design a web interface that allows anyone to upload a large file and resume the upload partway through, I don't know how to help you. But if all you want is to get files from your computer to a server in a resumable fashion, you may be able to use a tool like rsync. Rsync compares the files on the source and destination, and then only copies the differences between the two. This way, if you have 50 GB of files that you've uploaded to your server and then change one, rsync will very quickly check that all the other files are the same and then send only your one changed file. This also means that if a transfer is interrupted partway through, rsync will pick up where it left off.
Traditionally rsync is run from the command line (terminal), and it is installed by default on most Linux distributions and on Mac OS X.
rsync -avz /home/user/data server:src/data
This would transfer all files from /home/user/data to the src/data on the server. If you then change any file in /home/user/data you can run the command again to resync it.
If you use Windows, the easiest solution is probably DeltaCopy, which is a GUI around rsync.
