Uploading bigger files using PHP fails

I'm trying to upload a file of e.g. 244 KB via a POST request with PHP, but it causes a 403 Forbidden error. When I upload a small file (e.g. 3 B) it works fine.
I've made sure my php.ini is configured to support bigger files, as seen in the cPanel screenshot below:
My PHP script is the following:
<?php
// the target file name comes from the query string;
// the file contents are read from the raw POST body
$filename = $_GET['filename'];
$fileData = file_get_contents('php://input');
file_put_contents($filename, $fileData);
Error handling and such removed for clarity.
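For reference, the request is sent roughly like the sketch below, using PHP's curl extension; the host, path, and file name are placeholders rather than my real setup.
$ch = curl_init('https://example.com/upload.php?filename=test.bin');
curl_setopt($ch, CURLOPT_POST, true);
// send the file contents as the raw request body, which the script reads from php://input
curl_setopt($ch, CURLOPT_POSTFIELDS, file_get_contents('test.bin'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);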
Adding a user agent to the POST request does not help.
What could cause bigger files to be refused while small ones go through?

Related

Crash while uploading file to S3 using Laravel file storage

I'm getting a random crash while uploading a file to S3 using the Laravel file storage system. The crash is not reproducible in the local/dev environment, and in production it is also very random. All the files still get uploaded to S3. The issue occurs randomly for any file type (pdf, png, jpg). File size is usually 1 MB to 3 MB.
Aws\Exception\CouldNotCreateChecksumException
A sha256 checksum could not be calculated for the provided upload body, because it was not seekable. To prevent this error you can either 1) include the ContentMD5 or ContentSHA256 parameters with your request, 2) use a seekable stream for the body, or 3) wrap the non-seekable stream in a GuzzleHttp\Psr7\CachingStream object. You should be careful though and remember that the CachingStream utilizes PHP temp streams. This means that the stream will be temporarily stored on the local disk.
Crashed in non-app: /vendor/aws/aws-sdk-php/src/Signature/SignatureV4.php in Aws\Signature\SignatureV4::getPayload
/app/Http/Controllers/ApiController.php in App\Http\Controllers\ApiController::__invoke at line 432
$filename = $request->file('file')->getClientOriginalName();
$user_file_id = $request->input('file_id');
$path = Storage::putFileAs(
    'fileo',
    $request->file('file'),
    $user_file_id
);
return $path;
I had the same error message, but files were not being saved to S3, so it may be a different issue.
I followed the answer StackOverflow - update php.ini to increase upload limits, and this error stopped.
I had the same issue with Laravel and MinIO object storage. The problem was my /etc/php.ini configuration: I had messed up some values. Just make sure that you did not change these, or if you did, make sure they are correct.
upload_max_filesize = 1024M
max_file_uploads = 25
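If the checksum exception itself keeps coming back, option 3 from the message quoted above (wrapping a non-seekable body in a CachingStream) can be sketched as follows. This is a minimal sketch for direct use of the AWS SDK rather than the Laravel Storage facade; the region, bucket, key, and stream source are assumptions.
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use GuzzleHttp\Psr7\CachingStream;
use GuzzleHttp\Psr7\Utils;

$s3 = new S3Client([
    'region'  => 'us-east-1', // assumed region
    'version' => 'latest',
]);

// php://input is not seekable; CachingStream buffers it to a temp stream so
// SignatureV4 can seek back and compute the sha256 checksum.
$body = new CachingStream(Utils::streamFor(fopen('php://input', 'rb')));

$s3->putObject([
    'Bucket' => 'my-bucket',         // assumed bucket
    'Key'    => 'fileo/example.pdf', // assumed key
    'Body'   => $body,
]);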

Opening/Reading a local file from PHP Web Application

I feel like this should be a pretty straightforward process.
I have the following code:
<?php
$filename = "c:/TestFolder/config.txt";
echo "Attempting to read: ".$filename."<br/>";
$fh = fopen($filename, 'r') or die("file doesn't exist");
$readtext = fread($fh, filesize($filename));
fclose($fh);
echo "The Text is: ".$readtext;
?>
I have checked that I do indeed have "config.txt" in a folder called "TestFolder" on my C:/ drive... but I keep getting an error that the file doesn't exist.
I checked my PHPInfo to ensure that "allow_url_fopen" is turned on.
I have also tried different file path variations, such as:
C:\\TestFolder\\config.txt
C:\/TestFolder\/config.txt
This doesn't seem to make a difference.
Any idea what might be preventing me from opening this file?
Edits:
It should be noted that this file is uploaded to my web host, while I am attempting to access a file on my local machine. Not sure if this changes things at all.
This is not possible. "Local files" are files on the server where PHP is running, not on the client machine running the web browser. While they might be the same machine when you're testing locally, once you upload the script to a web host, PHP tries to read files on the web host, not on your machine.
The only way for PHP to access a file on the client machine is for the application to upload it.
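As a minimal sketch of that approach (the field name, form markup, and output are assumptions, not taken from the original setup):
<?php
// If the form below was submitted, read the uploaded copy that PHP placed
// in its temporary directory on the server.
if (!empty($_FILES['config'])) {
    $tmp = $_FILES['config']['tmp_name'];
    echo "The Text is: " . htmlspecialchars(file_get_contents($tmp));
    exit;
}
?>
<form method="post" enctype="multipart/form-data">
    <input type="file" name="config">
    <button type="submit">Upload config.txt</button>
</form>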

Replace image on server and show previous image until the new one is fully uploaded

I'm uploading an image taken from a Raspberry Pi camera to a LiteSpeed server through "ncftpput" every minute.
I want to be able to show the updated image, and I know how to force the browser to use the latest version instead of the cached image.
So everything works properly, except that if I refresh the image (e.g. with Shift-F5) during the upload, the browser reports that the image contains errors (or shows it only partially).
Is there any way to keep serving the previous image until the new one is fully uploaded?
I'm not sure if I should work on the ncftp side or use PHP to ensure the swap happens only after the upload is complete.
The image is a progressive JPEG, but that doesn't help...
Any suggestions?
Thanks
I ended up NOT using FTP because, as Viney mentioned, the web server doesn't know whether the upload is complete.
I'm using "curl", which has the advantage of being preinstalled on the Raspberry Pi distro, together with a PHP upload page.
It seems that PHP only makes the new image available once it is fully uploaded, which avoids the problem of serving a partially uploaded image.
So, to recap:
raspberry (webcam), after having taken the image:
curl -F "somepostparam=abcd" -F "operation=upload" -F "file=@filename.jpg" https://www.myserver.com/upload.php
PHP server code:
$uploadfile = '/home/domain/myserver.com/' . basename($_FILES['file']['name']);
// the upload is received into a temporary file first; it only appears at its
// final path once move_uploaded_file() runs, i.e. after the transfer has completed
move_uploaded_file($_FILES['file']['tmp_name'], $uploadfile);
$content = file_get_contents($uploadfile);
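If you want to be extra sure that a reader can never catch a half-written file, a common variation (a sketch, not part of the original answer; paths are assumptions) is to receive into a temporary name in the destination directory and then rename(), which is atomic on the same filesystem:
$dir   = '/home/domain/myserver.com/';
$final = $dir . basename($_FILES['file']['name']);
$tmp   = $dir . '.incoming_' . basename($_FILES['file']['name']);

move_uploaded_file($_FILES['file']['tmp_name'], $tmp);
rename($tmp, $final); // the old image stays visible until this single call swaps it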
The problem is this: you open the browser (at 07:00:10 AM) and image.jpg gets rendered. Now say it's 07:01:00 and you hit refresh, but the Raspberry Pi has already started uploading image.jpg; say it takes 3 seconds to complete the full upload. The server knows nothing about the FTP transfer, so it reads whatever bytes are currently in image.jpg and flushes them to your browser. Had it been a baseline JPG it would have shown an image cropped in height, but since it's a progressive JPG it comes out messed up. I am not sure whether it's possible, but try looking into whether your FTP server supports locking the file.
How to solve it?
The best way is to let your server know that the file it's accessing is in the middle of a write. If your FTP server supports advisory locks, you can use them: when the web server tries to access the file (via a system call) the kernel will tell it that the file is currently locked, so the server waits until the FTP daemon releases the lock.
In vsftpd there is an option, lock_upload_files, in vsftpd.conf; setting it to YES enables this feature.
If you can't get the above working, you can use a trick: check the file's last modified time, and if it's almost the same as the current time, make PHP wait for some guessed average upload time. For this method you should let PHP send the image to the browser instead of the web server serving it directly, i.e. change the src of your image from '/path/to/image.jpg' to 'gen-image.php'. That script reads the image and flushes it to the browser.
gen-image.php:
<?php
$guessedUploadTime = 3; // guessed time that ncftpput takes to finish
$currTime = time();
$modTime = filemtime('image.jpg');
if( ($currTime - $modTime) < $guessedUploadTime)
{
sleep($guessedUploadTime);
}
$file = '/path/to/image.jpg';
$type = 'image/jpeg';
header('Content-Type:'.$type);
readfile($file);
Note that the above solution is not ideal: if the file has only just finished uploading, it won't be modified again for another 57 seconds, yet a browser request at, say, 07:02:04 still has to wait 3 seconds unnecessarily, because the mtime would be 07:02:03 and the browser would only get the file at 07:02:06. I would recommend searching for some way (probably command based) to make the web server and the FTP transfer aware of each other, because that lack of coordination is the root cause of this problem.
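A hedged refinement of the same idea (not from the original answer; the path, polling interval, and iteration count are assumptions) is to serve the image only once its modification time has stopped changing, instead of always sleeping a fixed guess:
<?php
$file = '/path/to/image.jpg';

clearstatcache();
$previous = filemtime($file);
// Poll a few times; while the mtime keeps changing, the upload is still in progress.
for ($i = 0; $i < 10; $i++) {
    sleep(1);
    clearstatcache();
    $current = filemtime($file);
    if ($current === $previous) {
        break; // unchanged for one full polling interval, assume the upload finished
    }
    $previous = $current;
}

header('Content-Type: image/jpeg');
readfile($file);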

File upload with breaks

I'd like to upload large files to my server, but I would like to be able to take breaks in the upload process (for example, the user must be able to shut down his computer and continue after a reboot).
I think I can handle the client-side upload, but I don't know how to build the server side. What is the best way to do it on the server side? Is PHP able to do that? Is PHP the most efficient option?
Thanks a lot
If you manage to make the client side post the file in chunks, you could do something like this on the server side:
// set the path of the file you upload
$path = $_GET['path'];
// set the `append` parameter to 1 if you want to append to an existing file, if you are uploading a new chunk of data
$append = intval($_GET['append']);
// convert the path you sent via post to a physical filename on the server
$filename = $this->convertToPhysicalPath($path);
// get the temporary file
$tmp_file = $_FILES['file']['tmp_name'];
// if this is not appending
if ($append == 0) {
    // just copy the uploaded file
    copy($tmp_file, $filename);
} else {
    // append file contents
    $write_handle = fopen($filename, "ab");
    $read_handle = fopen($tmp_file, "rb");
    $contents = fread($read_handle, filesize($tmp_file));
    fwrite($write_handle, $contents);
    fclose($write_handle);
    fclose($read_handle);
}
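For the client side, a hedged sketch of posting a file in chunks to the script above might look like this; the endpoint URL, chunk size, and file names are assumptions, and the 'path'/'append' parameters simply mirror the server code:
<?php
$endpoint  = 'https://example.com/upload.php'; // assumed location of the script above
$source    = 'large-file.bin';
$chunkSize = 1024 * 1024; // 1 MB per request

$in    = fopen($source, 'rb');
$first = true;

while (!feof($in)) {
    $chunk = fread($in, $chunkSize);
    if ($chunk === '' || $chunk === false) {
        break;
    }

    // Write the chunk to a temporary file so it can be sent as a normal
    // multipart upload, which is what $_FILES['file'] expects on the server.
    $tmp = tempnam(sys_get_temp_dir(), 'chunk');
    file_put_contents($tmp, $chunk);

    $query = http_build_query([
        'path'   => $source,
        'append' => $first ? 0 : 1, // first chunk creates the file, later chunks append
    ]);

    $ch = curl_init($endpoint . '?' . $query);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, ['file' => new CURLFile($tmp)]);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch);
    curl_close($ch);

    unlink($tmp);
    $first = false;
}

fclose($in);
To actually resume after a reboot, the client would also need to remember how many bytes were already sent (or ask the server for the current file size) and start reading from that offset.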
If you are trying to design a web interface that allows anyone to upload a large file and resume the upload partway through, I don't know how to help you. But if all you want to do is get files from your computer to a server in a resumable fashion, you may be able to use a tool like rsync. Rsync compares the files on the source and destination and then only copies the differences between the two. This way, if you have 50 GB of files that you upload to your server and then change one, rsync will very quickly check that all the other files are the same and only send your one changed file. This also means that if a transfer is interrupted partway through, rsync will pick up where it left off.
Traditionally rsync is run from the command line (terminal), and it is installed by default on most Linux distributions and on Mac OS X.
rsync -avz /home/user/data server:src/data
This would transfer all files from /home/user/data to src/data on the server. If you then change any file in /home/user/data, you can run the command again to resync it.
If you use Windows, the easiest solution is probably DeltaCopy, which is a GUI around rsync.

Using PHP to save remote image files to the local server, but not completely successfully

I know some people have already asked this, but my problem is with downloading remote image files (each smaller than 200 KB). Some files won't be completely saved, some cannot be saved at all, and some are saved but not 100%: I see a gray shadow in the image. The worst part is that the error output is different every time. (Is it an internet problem?)
I tried the following methods to save the files:
file_get_contents
curl/GD
copy
They all can work, but I can't find a reliable method to save the whole file every time.
The following are the error messages:
failed to open stream: HTTP request failed! HTTP/1.0 408 Request Time-out (at the line with "copy")
Maximum execution time of 60 seconds exceeded (I increased the time)
My PHP program:
<?php
set_time_limit(60);
$imageArray = array(/* image URLs ... */);
for ($k = 0; $k < count($imageArray); $k++) {
    echo '<img src="' . $imageArray[$k] . '"><br/>';
    $isok = copy($imageArray[$k], dirname(__FILE__) . '/photo/item_' . ($k + 1) . '.jpg');
    if ($isok == true) {
        echo ' success!';
    } else {
        echo ' Fail';
    }
}
Most probably it's an internet problem. Do the images load fine in the browser when you try? If they do, you can try running the code on your own machine and see if that helps.
But the most likely culprit is the remote site you are downloading from: it may throttle you based on connections per time interval. Try sleeping between images, for example 5-6 seconds, and see if that helps.
Also try downloading smaller batches of images, 1-2 at a time, to see if that works.
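A minimal sketch of the sleep-and-retry suggestion above (the URL list, pause length, and retry count are assumptions):
<?php
$imageArray = array(/* image URLs ... */);
$pause      = 5; // seconds between downloads
$retries    = 3; // attempts per image

foreach ($imageArray as $k => $url) {
    $target = dirname(__FILE__) . '/photo/item_' . ($k + 1) . '.jpg';
    $ok = false;
    for ($attempt = 1; $attempt <= $retries && !$ok; $attempt++) {
        $ok = @copy($url, $target);
        if (!$ok) {
            sleep($pause); // back off before retrying
        }
    }
    echo $ok ? ' success!' : ' Fail';
    sleep($pause); // throttle requests so the remote site is less likely to block you
}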
I noticed in your copy() that you have hardcoded .jpg into the destination path. Are you always downloading .jpg files? If you are downloading a .png or .gif and forcing it to .jpg, you might be causing issues there. Just a thought, to be honest.
