Laravel/AWS - Does not delete file after upload to S3 bucket with MultipartUploader - PHP

I'm trying to upload a large video file to my Amazon S3 bucket with the AWS API:
$uploader = new MultipartUploader($s3->getDriver()->getAdapter()->getClient(), $localFullFilePath, [
    'bucket' => env('S3_BUCKET'),
    'key'    => $s3fullFullFilePath,
]);
try {
    $result = $uploader->upload();
    Log::info("Upload complete");
} catch (MultipartUploadException $e) {
    Log::info($e->getMessage());
}
Then I delete the uploaded videos with the code below:
foreach ($oldVideos as $oneVideo) {
    // $localFullFilePath = $localFilePath . $oneVideo;
    unlink($localFullFilePath);
}
The videos upload successfully, but when I try to delete the local file I get a 'permission denied' error.
I am sure it is not a file permission problem, because it only happens when I have uploaded the file to S3.
I think the API does not fclose() the file after reading it.
Do you have any tips or workarounds to suggest?

Did you give yourself permission on the folder structure? That's also what happened to me last time :P
You must give yourself permission to both upload and delete.

You're right, the API does not close the file. You can do it manually:
// open the file ourselves
$source = fopen($localFullFilePath, 'rb');

// pass a resource, not a path
$uploader = new MultipartUploader($s3->getDriver()->getAdapter()->getClient(), $source, [
    'bucket' => env('S3_BUCKET'),
    'key'    => $s3fullFullFilePath,
]);

// upload
try {
    $result = $uploader->upload();
    Log::info("Upload complete");
} catch (MultipartUploadException $e) {
    Log::info($e->getMessage());
}

// close the handle ourselves
fclose($source);

// now we can remove the local file
unlink($localFullFilePath);
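If you want to be certain the handle is released even when the upload throws, a try/finally works too. A minimal sketch reusing the same variables and client as above (the is_resource() guard is my addition, in case the SDK already closed the stream):
$source = fopen($localFullFilePath, 'rb');

try {
    $uploader = new MultipartUploader($s3->getDriver()->getAdapter()->getClient(), $source, [
        'bucket' => env('S3_BUCKET'),
        'key'    => $s3fullFullFilePath,
    ]);
    $result = $uploader->upload();
    Log::info("Upload complete");
} catch (MultipartUploadException $e) {
    Log::info($e->getMessage());
} finally {
    // always release the handle, even if upload() throws,
    // so a later unlink() does not hit "permission denied"
    if (is_resource($source)) {
        fclose($source);
    }
}

unlink($localFullFilePath);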

Related

How to download an object from an AWS S3 bucket and save it to the local computer, not the server, using PHP?

The problem is that when I run the code below, it downloads the files onto the server where my website is running. How can I download them to my local computer instead?
public function download($bucketName, $folderName)
{
    try {
        $source = $bucketName . '/' . $folderName;
        $dest = 'C:/sync/download/' . $folderName;
        $manager = new \Aws\S3\Transfer($this->s3, $source, $dest);
        $manager->transfer();
        return $manager;
        //return true;
    } catch (\Aws\S3\Exception\S3Exception $e) {
        //echo "There was an error downloading the file.";
        return $e;
    }
}
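A hedged sketch of one common approach: the PHP code runs on the server, so to get an object onto the visitor's own machine the script has to stream it back in the HTTP response (or hand out a pre-signed URL). The method name and headers below are my assumptions; $this->s3 is the same S3 client used in the question.
public function downloadToBrowser($bucketName, $key)
{
    // fetch the object from S3 ($this->s3 is an Aws\S3\S3Client)
    $result = $this->s3->getObject([
        'Bucket' => $bucketName,
        'Key'    => $key,
    ]);

    // send it to the visitor's browser as a file download
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . basename($key) . '"');
    header('Content-Length: ' . $result['ContentLength']);
    echo $result['Body'];
}
Note that this covers one object per response; for a whole folder you would either loop over the keys or zip them first.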

ftp_get(): Opening BINARY mode data connection when downloading a large file

I am trying to download two files from two different FTP servers. One file is 10 MB in size while the other is 3.3 GB. The 10 MB file downloads every time without a problem. The 3.3 GB file always encounters an error:
Code Error: [2] ftp_get(): Opening BINARY mode data connection for bigfile.gz (3232089332 bytes). Error/Warning on line 55 in file script.php. Please go over the collector code.
The size of the file is exactly 3232089332 bytes, so the issue appears only after the file has finished downloading completely.
Both files are .gz files (so I know they are binary).
There is enough space on the hard drive (currently 47 GB free).
It is worth noting that I am able to download the file without any issues using FileZilla.
Any help would be highly appreciated.
The code is as follows:
function ftpDownload($server, $username, $password, $filename) {
    // strip a scheme prefix such as "ftp://" if present
    if (strpos($server, '://') !== false) {
        $server = substr($server, strpos($server, '://') + 3);
    }

    # set up basic connection
    echo "Connecting to $server\n";
    $connectionId = ftp_connect($server);
    if (!$connectionId) {
        return ['success' => false, 'error' => "FTP Connection has Failed"];
    }

    # login with username and password
    echo "Logging in\n";
    $loginResult = ftp_login($connectionId, $username, $password);
    ftp_pasv($connectionId, true);

    # check connection
    if (!$loginResult) {
        return ['success' => false, 'error' => "Failed to login"];
    }

    # verify the file exists
    echo "Locating $filename\n";
    $result = ftp_size($connectionId, $filename);
    if ($result == -1) {
        return ['success' => false, 'error' => "Unable to locate $filename in server", 'filename' => false];
    }
    echo "File size " . number_format($result / 1024 / 1024) . " MBs\n";

    # download the file
    echo "Downloading $filename, this may take a while\n";
    if (file_exists(__DIR__ . '/files/' . $filename)) {
        unlink(__DIR__ . '/files/' . $filename);
    }
    ftp_pasv($connectionId, true);
    $success = ftp_get($connectionId, __DIR__ . '/files/' . $filename, $filename, FTP_BINARY);
    ftp_close($connectionId);
    if ($success == false) {
        return ['success' => false, 'error' => "Failed to download file $filename from server, received error from FTP"];
    }
    if (!file_exists(__DIR__ . '/files/' . $filename)) {
        return ['success' => false, 'error' => "Unable to locate $filename in server, filename was not properly stored locally"];
    }

    // all checks passed
    return ['success' => true, 'filename' => __DIR__ . '/files/' . $filename];
}
I reached a solution. As the timeout occurred after the file had already been fully downloaded, instead of using ftp_get() I used the following:
$fp = fopen($filename, 'w');
$success = ftp_fget($connectionId, $fp, $filename, FTP_BINARY);
Then, to verify whether the transfer was successful, I simply compared the downloaded file's size (via filesize()) with the remote file's size obtained beforehand with ftp_size() (as seen in the question's code).
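Putting that together, a minimal sketch of the replacement download step. The local path, the warning suppression, and the size-check wiring are my assumptions, not part of the original answer:
// download through a file handle instead of ftp_get()
$localPath = __DIR__ . '/files/' . $filename;
$fp = fopen($localPath, 'w');

// suppress the spurious warning that fires after the transfer has
// actually completed, and rely on the size check below instead
@ftp_fget($connectionId, $fp, $filename, FTP_BINARY);
fclose($fp);

// $result holds the remote size from ftp_size() earlier in the question's code
clearstatcache();
if (filesize($localPath) !== $result) {
    return ['success' => false, 'error' => "Size mismatch after downloading $filename"];
}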

What would cause unlink to return 'Resource Temporarily Unavailable'?

I'd like to create a .zip archive, upload it to Amazon S3, and then delete the created .zip from the server. Steps 1 and 2 work great, but the delete step returns:
unlink(temp/file.zip): Resource temporarily unavailable
I've tried to unset all the related variables and resources, but I'm still getting the error.
Here's the code:
$zipFile = 'temp/file.zip';

// create the zip archive
$z = new \ZipArchive();
$z->open($zipFile, \ZipArchive::CREATE);
$z->addEmptyDir('testdirectory');

// add a file
$filename = 'fileName.txt';
$content = 'Hello World';
$z->addFromString('testdirectory/' . $filename, $content);
$z->close();

// upload to S3
$s3 = AWS::createClient('s3');
$result = $s3->putObject(array(
    'Bucket'     => 'my-bucket-name',
    'Key'        => basename($zipFile),
    'SourceFile' => $zipFile
));

// check to see if the file was uploaded
$uploaded = false;
if ($result['@metadata']['statusCode'] == "200") {
    $uploaded = true;
}

// delete the temp file
if ($uploaded) {
    unset($result);
    unset($s3);
    unset($z);
    if (file_exists($zipFile)) {
        unlink($zipFile);
    }
}
Some additional details: I'm using Lumen 5.4 and the aws-sdk-php-laravel package.
Any insight would be much appreciated! Thanks.
The S3 client is still holding on to the file, so the garbage collector has to be run explicitly.
Just call gc_collect_cycles() before deleting the file.
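In the question's snippet that would look something like this (same variables as above):
// release any lingering references the SDK may still hold, then collect
unset($result);
unset($s3);
gc_collect_cycles();

if (file_exists($zipFile)) {
    unlink($zipFile);
}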

Zend file rename after uploading via a Zend form

My research isn't turning up the information I'm looking for. I'd like to rename the file after it has been uploaded, and I need the original filename as well as the new one. Here is what I have so far:
$form = new Sam_Form_Database($this->resource);
$form->setMethod(Zend_Form::METHOD_POST);

if ($this->getRequest()->isPost()) {
    if ($form->isValid($this->getRequest()->getPost())) {
        $data = $form->getValues();
        try {
            $form->fileelement->receive();
            $originalFilename = pathinfo($form->fileelement->getFileName());
            $newFilename = Sam_Util::generateHash() . '.' . $originalFilename['extension'];
            $filterFileRename = new Zend_Filter_File_Rename(array(
                'target'        => $newFilename,
                'overwrite'     => true,
                'keepExtension' => true
            ));
            $filterFileRename->filter($form->fileelement->getFileName());
        } catch (Exception $e) {
            Sam::exception("Cannot upload file");
        }
        Sam_Util::insertDataIntoDatabase($data, $this->resource);
        Sam_Util::redirectSimple('list');
    }
}
The problems:
nothing seems to be uploading
before, when it was uploading, it wasn't renaming the file in the destination
What I need is a fluent way to handle the upload, retrieve the original filename, and rename the target file using Zend.
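A hedged sketch based on the usual Zend Framework 1 pattern (not a confirmed answer): attach the Rename filter to the file element before calling receive(), and read the original name first, so both names stay available. Sam_Util::generateHash() and Sam::exception() are taken from the question's code.
// original filename, available before receive()
$originalFilename = pathinfo($form->fileelement->getFileName());

// new name built the same way as in the question
$newFilename = Sam_Util::generateHash() . '.' . $originalFilename['extension'];

// attach the Rename filter to the element itself *before* receiving,
// so the file lands in the destination directory already renamed
$form->fileelement->addFilter('Rename', array(
    'target'    => $form->fileelement->getDestination() . '/' . $newFilename,
    'overwrite' => true,
));

if (!$form->fileelement->receive()) {
    Sam::exception("Cannot upload file");
}

// $originalFilename['basename'] holds the original name,
// $newFilename holds the renamed one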

ftp_fget() returns Transfer Complete error

I'm trying to download images from a remote FTP server and upload them to our own Rackspace account. This works fine for about 3,000 images, but then it crashes with the following error:
exception with message 'ftp_fget(): Transfer complete.'
We've tried changing the code to use ftp_get() and to skip the temp file, but it always resulted in the same error. It always fails on the same files; if I delete a couple of files that were already downloaded and run the script again, it downloads them without a problem... it just fails again once it hits those specific images on the FTP server. I've tried downloading those images manually from the server and that worked, so nothing seems to be wrong with the files themselves.
This is basically the code that does it:
$this->handle = ftp_connect($ftpurl);
if ($this->handle === false) {
    throw new Exception('Could not connect to the given url');
}

$loggedIn = ftp_login($this->handle, $ftpusername, $ftppassword);
if ($loggedIn === false) {
    throw new Exception('Can\'t login to FTP server');
}

ftp_pasv($this->handle, true);

$fileList = ftp_nlist($this->handle, '.');
if (count($fileList) === 0) {
    throw new Exception('No files found on FTP-server');
}

foreach ($fileList as $filename) {
    try {
        $container->getObject($filename);
        // Image already exists; rackspace has no convenient hasImage() function
    } catch (Exception $ex) {
        $temp = tmpfile();
        ftp_fget($this->handle, $temp, $filename, FTP_BINARY);
        // upload $temp to rackspace
    }
}
Any ideas what could be the issue here?
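Not a confirmed fix, but one thing worth checking in the loop above (a sketch; the skip/re-pasv details are my assumptions): ftp_fget()'s return value is currently ignored and the temp handle is never rewound, so a failed or partial transfer would go unnoticed before the upload.
foreach ($fileList as $filename) {
    try {
        $container->getObject($filename);
        // image already exists on rackspace, skip it
    } catch (Exception $ex) {
        $temp = tmpfile();

        // re-assert passive mode in case the server dropped it between transfers
        ftp_pasv($this->handle, true);

        if (!ftp_fget($this->handle, $temp, $filename, FTP_BINARY)) {
            // skip (or retry) instead of uploading a partial file
            fclose($temp);
            continue;
        }

        // rewind so the upload reads from the start of the temp file
        rewind($temp);

        // upload $temp to rackspace here
        fclose($temp);
    }
}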
