I have been trying to solve an extremely trivial issue for a long time, but with no luck.
I want to delete a file immediately after uploading it to AWS S3 from a PHP web server. These are the steps:
//Upload file to S3 using the PHP SDK's S3Client::putObject method:
$result = $s3_client->putObject(array(
    'Bucket'     => AWS_BUCKET_NAME,
    'Key'        => $file_name,
    'SourceFile' => $file_path,
    'Metadata'   => array(
        'metadata_field' => 'metadata_value'
    )
));
//Poll the object until it is accessible
$s3_client->waitUntil('ObjectExists', array(
    'Bucket' => AWS_BUCKET_NAME,
    'Key'    => $file_name
));
//Delete the file
unlink( $file_path );
These steps work perfectly when I upload a small file (~500 KB).
However, if I upload a larger file (5-10 MB), I get the following error:
Warning: unlink(<Complete Path to File>): Permission denied in <Complete path to uploader.php> on line N
I am working on Windows and have tried elevating user permissions for the directory and file (using chmod and chown from PHP, and I made sure that the directory is writable and accessible).
It seems that the AWS S3 putObject method is not releasing the file handle (only in the case of large files). I have also tried adding sleep(), but no luck.
Moreover, if I skip uploading the file to S3 (just to test my delete workflow), the file gets deleted without any issue.
Please help!
The issue has been raised at https://github.com/aws/aws-sdk-php/issues/841
Try using the gc_collect_cycles() function; it solved the problem for me. See the page above for further reference.
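For reference, a minimal sketch of that workaround applied to the question's code might look like this (variable names taken from the question; whether the unset() call is also needed is an assumption):
// Drop our own reference to the result and let PHP's cycle collector release
// the handle the SDK may still hold on $file_path.
$result = $s3_client->putObject(array(
    'Bucket'     => AWS_BUCKET_NAME,
    'Key'        => $file_name,
    'SourceFile' => $file_path
));

unset($result);       // assumption: releasing our reference helps the collector
gc_collect_cycles();  // force collection of cyclic references holding the file open

unlink($file_path);   // the file should now be deletable on Windows as well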
Maybe you need to set the value of upload_max_filesize and post_max_size in your php.ini:
; Maximum allowed size for uploaded files.
upload_max_filesize = 40M
; Must be greater than or equal to upload_max_filesize
post_max_size = 40M
After modifying the php.ini file(s), you need to restart your HTTP server for the new configuration to take effect.
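As a quick sanity check, you can have the web server itself report which limits and which php.ini it is actually using (a small sketch; where you send the output is up to you):
// Print the effective limits as seen by the web server (the CLI may load a different php.ini).
echo 'upload_max_filesize: ' . ini_get('upload_max_filesize') . PHP_EOL;
echo 'post_max_size: ' . ini_get('post_max_size') . PHP_EOL;
echo 'Loaded php.ini: ' . php_ini_loaded_file() . PHP_EOL;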
In case anybody else is stuck on this: I moved the nginx server deployment to CentOS and the issue no longer occurred.
The waitUntil 'ObjectExists' waiter has a default interval and a maximum number of attempts.
You can change them like this:
$s3Client->waitUntil('ObjectExists', array(
    'Bucket' => AWS_BUCKET_NAME,
    'Key'    => $file_name,
    'waiter.interval'     => 10,
    'waiter.max_attempts' => 6
));
Related
I've been searching the web quite a bit now and have found several possible solutions, but none of them worked. Some say it's due to php.ini settings, some say it's due to the method I am using from the SDK. I'm a bit stuck here. I've tested it quite thoroughly, and with the current code I have I am able to download a file from my S3 bucket without problems or corruption, but the download is ALWAYS limited to 64 megabytes.
Is there some way to raise this limit, or to download in increments?
When I try to download a file over 64 megabytes, the page cannot be reached. Sometimes it might actually download the file anyway (while still saying the page cannot be reached), but only exactly 64 megabytes of it.
try {
    $result = $s3->getObject([
        'Bucket' => $bucket,
        'Key'    => $keyname
    ]);
    set_time_limit(0);
    header('Content-Description: File Transfer');
    header("Content-Type: {$result['ContentType']}; charset=utf-8");
    header("Content-Disposition: attachment; filename=" . $filename);
    echo $result['Body'];
} catch (S3Exception $e) {
    echo $e->getMessage() . PHP_EOL;
}
I've set my memory_limit to no limit (0). I've also tried setting the memory limit to about 64 megabytes, but still no dice.
I've tried tinkering with post_max_size etc., but still nothing. I'm not sure if the problem lies in my Apache/PHP setup, the EC2 instance I'm running, or an S3 SDK limitation.
The EC2 instance I'm running is a t2.xlarge, running the Ubuntu 20.04 LTS (Focal Fossa) LAMP stack (Linux, Apache, MySQL/MariaDB, PHP).
Some of the things I've found with similar issues (I tried the first one without luck):
Download large files from s3 via php <-- This link has solution
Max execution time out error when tried to download large object(2gb) from s3 bucket to window server using php
I don't really understand the bottom link; apparently the solution is to download the file in increments (according to Online Thomas), but I'm not sure how that would work. How would I combine the data, and how would I keep downloading from where I left off? I'm missing an example of how to make that solution work. The OP of that post asked the same question.
The presigned request solution from the OP of the first link did solve my problem. However, it's a different way of downloading: before, my code used the SDK to download the file directly; now it uses the SDK to create what is basically a download link. The 64 megabyte issue still persists when downloading through the SDK itself. I can use this, but if anyone has a solution for downloading more than 64 megabytes via the SDK, please let me know!
My solution:
$cmd = $s3->getCommand('GetObject', [
    'Bucket' => $bucket,
    'Key'    => $keyname,
    'ResponseContentDisposition' => 'attachment; filename="' . $filename . '"'
]);
$request = $s3->createPresignedRequest($cmd, '+15 min');
$presignedUrl = (string) $request->getUri();
And then basically just open that URL anywhere in HTML, JS etc. and it will begin to download the file.
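If you still want to download through the SDK itself rather than hand out a presigned URL, one thing worth trying is reading the Body stream in chunks instead of echoing it all at once. This is only a sketch under the assumption that $s3, $bucket, $keyname and $filename are the same variables as above; I have not verified that it gets past the 64 megabyte limit described here:
$result = $s3->getObject([
    'Bucket' => $bucket,
    'Key'    => $keyname,
    '@http'  => ['stream' => true] // ask the SDK not to buffer the whole body up front
]);

set_time_limit(0);
header('Content-Description: File Transfer');
header("Content-Type: {$result['ContentType']}");
header('Content-Disposition: attachment; filename="' . $filename . '"');
header('Content-Length: ' . $result['ContentLength']);

$body = $result['Body']; // the SDK returns the body as a PSR-7 stream
while (!$body->eof()) {
    echo $body->read(1024 * 1024); // send roughly 1 MB at a time
    flush();                       // push each chunk to the client immediately
}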
I'm trying to upload a GIF file to my server using an upload tool (ShareX).
It works very well with "small" files, but when I try to upload an 8 MB one (which takes about a minute with my connection), it just doesn't work.
So I did some quick debugging, and it seems that file_exists and is_uploaded_file both return false, exactly as if nothing had been uploaded, which isn't the case.
if (!file_exists($_FILES[$fileFormName]["tmp_name"]) || !is_uploaded_file($_FILES[$fileFormName]["tmp_name"]))
{
    error([
        "error"            => "No file uploaded",
        "file_exists"      => file_exists($_FILES[$fileFormName]["tmp_name"]),
        "is_uploaded_file" => is_uploaded_file($_FILES[$fileFormName]["tmp_name"])
    ], "400 Bad Request");
}
Why would that happen?
My apache2 php.ini upload_max_filesize is set to 128M, so it shouldn't be a file size issue.
My apache2 php.ini max_execution_time is set to 0, so it shouldn't be a timeout issue.
I wasn't able to find anything similar to my problem using Google.
Fixed: I also had to adjust post_max_size in my php.ini file.
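For anyone debugging a similar case, the upload error information usually reveals which limit was hit; here is a small diagnostic sketch (assuming $fileFormName is set as in the question):
// When post_max_size is exceeded, $_FILES (and $_POST) can be completely empty,
// so check for that case before looking at the per-file error code.
if (empty($_FILES)) {
    error_log('No upload data at all - the request body probably exceeded post_max_size (' . ini_get('post_max_size') . ')');
} elseif ($_FILES[$fileFormName]['error'] !== UPLOAD_ERR_OK) {
    // UPLOAD_ERR_INI_SIZE means upload_max_filesize was exceeded,
    // UPLOAD_ERR_FORM_SIZE means the form's MAX_FILE_SIZE field was exceeded.
    error_log('Upload failed with error code ' . $_FILES[$fileFormName]['error']);
}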
I am forced to use version 2 of the AWS S3 SDK, because I cannot update PHP to 5.5 on this server in order to use version 3.
I wrote this PHP script to download files from AWS, and it works well:
//http://docs.aws.amazon.com/aws-sdk-php/v2/api/class-Aws.S3.S3Client.html#_createPresignedUrl
// Get a command object from the client and pass in any options
// available in the GetObject command (e.g. ResponseContentDisposition)
$command = $s3Client->getCommand('GetObject', array(
    'Bucket' => $bucket,
    'Key'    => $objectKey,
    'ResponseContentDisposition' => 'attachment; filename="' . $originFilename . '"'
));
// Create a signed URL from the command object that will last for
// 10 minutes from the current time
$signedUrl = $command->createPresignedUrl('+1000 minutes');
$file = file_get_contents($signedUrl);
The problem is that I want to be sure that file_get_contents() downloads the entire file, and to detect and handle any error (like the server going offline during a download, etc.), so I thought of the following flow:
I ask AWS the file size
I download the file
I check the size. If it's not equal, I re-download the file.
So, how can I get the file size from AWS? I found this, but it doesn't work for my version.
You can use the HEAD Object REST API to determine the size of an object stored on S3.
HEAD Object returns the metadata associated with the stored S3 object, including its size in bytes in the Content-Length header.
http://docs.aws.amazon.com/aws-sdk-php/v2/api/class-Aws.S3.S3Client.html#_headObject
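In the PHP SDK v2 this corresponds to the headObject() method; a short sketch using the same variables as in the question (the size comparison at the end is only an illustration):
// Ask S3 for the object's metadata without downloading the body.
$head = $s3Client->headObject(array(
    'Bucket' => $bucket,
    'Key'    => $objectKey
));

$expectedSize = (int) $head['ContentLength']; // object size in bytes, from the Content-Length header

// Compare against what was actually downloaded and retry if they differ.
$file = file_get_contents($signedUrl);
if (strlen($file) !== $expectedSize) {
    // re-download or report an error here
}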
I have written a Symfony2 (PHP MVC framework) script to download a zip file from the server, but the download stops midway. I have increased max_execution_time in the Apache configuration, but the problem persists.
Does anyone have a quick fix for this?
Thanks in advance.
It seems like you may have an issue with a large file (downloading an archive of videos). You should use a StreamedResponse. This way, you don't have to hold the entire contents of the file in memory; it simply streams to the client. The way you are currently doing it loads the whole file into memory before the download can start, and you can see why that could be a problem. Here is a simple example of how you can stream a file to the client:
$path = "//usr/www/users/jjdqlo/Wellness/web/yoga_videos/archive.zip";
return new StreamedResponse(
function () use ($path) { // first param is a callback, where you do the readfile()
readfile($path);
},
200, // second param is the http status code
array( // third param is an array of header settings
'Content-Disposition' => 'attachment;filename="archive.zip"',
'Content-Type' => 'application/zip'
)
);
Give this a shot. Assuming the problem is because of file size, this should solve the issue.
I am using the SabreDAV PHP library to connect to a WebDAV server and download some files, but it is taking forever to download a 1 MB file, and I have to download files of up to 1 GB from that server. I looked at this link http://code.google.com/p/sabredav/wiki/WorkingWithLargeFiles but it is not helpful, because it says I will get a stream when I do a GET, and that is not the case for me.
Here is my code:
$settings = array(
    'baseUri'  => 'file url',
    'userName' => 'user',
    'password' => 'pwd'
);

$client = new \Sabre\DAV\Client($settings);
$response = $client->request('GET');
The response is an array with a 'body' key that contains the content of the file. What am I doing wrong? I only need the file read-only. How can I read through the file line by line as quickly as possible?
Thanks in advance.
If it's taking that long just to download a 1 MB file, then I think it's not a SabreDAV problem but a problem with your server or network, or perhaps with the remote server.
The Google Code link you mentioned just describes a way to transfer very large files; for that you would have to use the stream and fopen approach they mention. However, I think I was able to transfer 1 GB files without it, just with a normal request, when I last used SabreDAV with ownCloud.
If you have a VPS/dedicated server, open SSH and use the wget command to test the speed and the time it takes to download that remote file from WebDAV. If it's the same as with SabreDAV, then it's a server/network problem and not SabreDAV; otherwise it's a problem with Sabre or your code.
Sorry, but I don't have any code to post to help you, since the problem itself is not clear and there can be more than ten things causing it.
PS: You may also need to increase the PHP limits for execution time, maximum file upload size, and maximum post size accordingly.
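For completeness, a rough sketch of what the 'stream and fopen' approach mentioned above might look like (not SabreDAV-specific; 'file url', 'user' and 'pwd' are the placeholders from the question, and it assumes the server accepts a plain HTTP Basic authenticated GET):
// Open the remote file as a stream so it is never held in memory in one piece.
$context = stream_context_create(array(
    'http' => array(
        'method' => 'GET',
        'header' => 'Authorization: Basic ' . base64_encode('user:pwd')
    )
));

$handle = fopen('file url', 'r', false, $context);
if ($handle === false) {
    throw new RuntimeException('Could not open the remote file');
}

while (($line = fgets($handle)) !== false) {
    // process one line at a time
}

fclose($handle);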