AWS S3 copy and replicate a folder in Laravel - PHP

I am trying to copy a folder that is already on S3 and save it under a different name on S3, in Laravel 5.4. What I have found so far is that I can copy an image, not a folder. I have tried to copy the folder like this:
$disk->copy("admin/form/$old_form", "admin/form/$new_form");
But it does not work like that; it gives me an error. Do I need to loop and get each folder item separately? Like:
$images = $disk->allFiles("admin/form/$id");
Or is there any workaround available in Laravel or the S3 API itself?
Please help, it's driving me crazy.
Thanks in advance.

I'm in the middle of doing this same thing. Based on what I've read so far, copying a directory itself doesn't seem to be possible in Laravel. The suggestions I've seen all loop through and copy each file; however, I'm not at all satisfied with the speed (since I'm doing this on lots of images several times a day).
Note that I'm only using the FilesystemManager directly like this so I can more easily access the methods in PhpStorm; $s3 = \Storage::disk('s3'); would accomplish the same thing as my first two lines. I'll update this answer if I find anything that works more quickly.
$filesystem = new FilesystemManager(app());
$s3 = $filesystem->disk('s3');

$images = $s3->allFiles('old-folder');

// copy() throws an exception if the target already exists, so in my case
// I delete the entire destination folder first to simplify things.
$s3->deleteDirectory('new-folder');

foreach ($images as $image) {
    $new_loc = str_replace('old-folder', 'new-folder', $image);
    $s3->copy($image, $new_loc);
}

Another option:
$files = Storage::disk('s3')->allFiles('old/location');

foreach ($files as $file) {
    $copied_file = str_replace('old/location', 'new/location', $file);
    if (!Storage::disk('s3')->exists($copied_file)) {
        Storage::disk('s3')->copy($file, $copied_file);
    }
}

This can all be done from CLI:
Install and configure s3cmd:
sudo apt install s3cmd
s3cmd --configure
Then you can:
s3cmd sync --delete-removed path/to/folder s3://bucket-name/path/to/folder/
To make the files and folders public:
s3cmd setacl s3://bucket-name/path/to/folder/ --acl-public --recursive
Further reading: https://s3tools.org/s3cmd-sync

I've found that a faster way to do this is by using the AWS command line tools, specifically the aws s3 sync command.
Once installed on your system, you can invoke it from within your Laravel project using shell_exec. Example:
$source = 's3://abc';
$destination = 's3://xyz';
shell_exec('aws s3 sync ' . $source . ' ' . $destination);
If you set your AWS_KEY and AWS_SECRET in your .env file, the aws command will refer to these values when invoked from within Laravel.
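If the bucket paths ever come from variables or user input, it's worth escaping them before building the shell command. A minimal sketch, assuming the same hypothetical bucket names as above and that the aws binary is on the PATH:
// Escape arguments before passing them to the shell.
$source = escapeshellarg('s3://abc');
$destination = escapeshellarg('s3://xyz');

// 2>&1 folds stderr into the returned output so failures are visible.
$output = shell_exec('aws s3 sync ' . $source . ' ' . $destination . ' 2>&1');
echo $output;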

Related

Laravel storage:link does not work on Heroku?

So I've been playing around with Heroku and I really like it. It's fast and it just works. However, I have encountered a problem with my gallery app: https://miko-gallery.herokuapp.com . Create a free account, create an album, and try uploading a photo; it will not display. I have run the php artisan storage:link command, but it does not work. What am I missing here?
EDIT
I've just tried a new thing: I ran heroku run bash and cd'ed into the storage/app/public folder, and it does not contain the images folder that was supposed to be there.
My code for saving the photo is below (it works on localhost):
public function store(Request $request)
{
    $ext = $request->file('items')->getClientOriginalExtension();
    $filename = str_random(32).'.'.$ext;
    $file = $request->file('items');
    $path = Storage::disk('local')->putFileAs('public/images/photos', $file, $filename);

    $photo = new Photo();
    $photo->album_id = $request->album_id;
    $photo->caption = $request->caption;
    $photo->extension = $ext;
    $photo->path = $path.'.'.$photo->extension;
    $photo->mime = $request->file('items')->getMimeType();
    $photo->file_name = $filename;
    $photo->save();

    return response()->json($photo, 200);
}
Heroku's filesystem is dyno-local and ephemeral: any changes you make to it will be lost the next time each dyno restarts, which happens frequently (at least once per day).
As a result, you can't store uploads on the local filesystem. Heroku's official recommendation is to use something like Amazon S3 to store uploads. Laravel supports this out of the box:
Laravel provides a powerful filesystem abstraction thanks to the wonderful Flysystem PHP package by Frank de Jonge. The Laravel Flysystem integration provides simple to use drivers for working with local filesystems, Amazon S3, and Rackspace Cloud Storage. Even better, it's amazingly simple to switch between these storage options as the API remains the same for each system.
Simply add league/flysystem-aws-s3-v3 ~1.0 to your dependencies and then configure it in config/filesystems.php.
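For reference, the s3 disk entry in config/filesystems.php looks roughly like this; a sketch using Laravel's default environment variable names:
// config/filesystems.php
'disks' => [
    's3' => [
        'driver' => 's3',
        'key'    => env('AWS_ACCESS_KEY_ID'),
        'secret' => env('AWS_SECRET_ACCESS_KEY'),
        'region' => env('AWS_DEFAULT_REGION'),
        'bucket' => env('AWS_BUCKET'),
    ],
],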
If you don't have SSH access, simply create a route so you can run this command by hitting a URL:
Route::get('/artisan/storage', function () {
    $command = 'storage:link';
    $result = Artisan::call($command);
    return Artisan::output();
});
First, unlink the existing link from storage if there is one.
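If a stale link is in the way, the same route can remove it before re-linking. A minimal sketch, assuming Laravel's default public/storage link target:
Route::get('/artisan/storage', function () {
    // Remove a stale symlink before recreating it.
    $link = public_path('storage');
    if (is_link($link)) {
        unlink($link);
    }
    Artisan::call('storage:link');
    return Artisan::output();
});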

How do I list all files within a Bucket Subdirectory in PHP?

After reviewing the PHP docs for the GCP Storage API and the Bookshelf tutorial (https://cloud.google.com/php/getting-started/using-cloud-storage), I'm lost on how to list the files located in a bucket subdirectory.
I've viewed Listing files in Google Cloud Storage (nearline) - missing files; however, that code is adapted to Python. If it really is as simple as using an ls command, how would I run this command from PHP? I've dug through Google's repos on GitHub and I'm not sure which to use in this case.
I have both of these libraries included via Composer. Just to clarify, I'm running these remotely from a DigitalOcean Droplet, not from App Engine.
"google/appengine-php-sdk": "^1.9",
"google/cloud": "^0.39.2",
There's an "objects" method that'll do this for you.
use Google\Cloud\Storage\StorageClient;
$storage = new StorageClient();
$bucket = $storage->bucket('my-bucket');
$objects = $bucket->objects([
'fields' => 'items/name,nextPageToken'
]);
foreach ($objects as $object) {
echo $object->name() . PHP_EOL;
}
The documentation for the PHP storage client is over here: https://googlecloudplatform.github.io/google-cloud-php/#/docs/google-cloud/v0.39.2/storage/storageclient
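To limit the listing to a subdirectory, objects() also accepts a prefix option. A sketch, where 'my-subdir/' is a placeholder for your folder name:
// List only objects whose names begin with the given "directory" prefix.
$objects = $bucket->objects([
    'prefix' => 'my-subdir/',
    'fields' => 'items/name,nextPageToken'
]);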

How to use composer/composer PHP classes to update individual packages

I want to use the composer/composer PHP classes to update individual plugin packages.
I do not want to use command-line solutions like exec("php composer.phar update");
I am unable to get it to work. I have tried several different options, much like the following code; it just returns a blank screen.
use Composer\Console\Application;
use Symfony\Component\Console\Input\ArrayInput;
use Symfony\Component\Console\Output\BufferedOutput;

$input = new ArrayInput(array('command' => 'require vendor/package dev-master'));
$output = new BufferedOutput();

$application = new Application();
$application->run($input, $output);

dd($output->fetch());
Things I would like to achieve:
Download/update individual packages
Get result output to verify success
Dump the autoloader
Remove/require packages
A bit of context:
I am creating a plugin updater for my PHP application (in the admin panel).
Every plugin is a Composer package and resides on my own Satis repository.
The plugins get installed into a custom dir using my Composer plugin.
I can read composer.lock locally and packages.json on the Satis server to figure out
which packages require updates.
UPDATE
I've managed to at least get it to work. The no-output issue was due to $application->setAutoExit(), which needed to be set to false before running. The next issue was that the required package would download itself into the same directory as the class I called it from; I solved that by using putenv and chdir. Result:
root/comp.php
putenv('COMPOSER_HOME=' . __DIR__ . '/vendor/bin/composer');
chdir(__DIR__);
root/workbench/sumvend/sumpack/src/PackageManager.php
include(base_path() . '/comp.php');
$input = new ArrayInput(array('command' => 'require', 'packages' => ['vend/pak dev-master']));
$output = new BufferedOutput();

$application = new Application();
$application->setAutoExit(false);
$application->run($input, $output);

dd($output->fetch());
This works, but it's far from ideal.
The full solution to this would be pretty long-winded, but I will try to get you on the right track.
php composer.phar require composer/composer dev-master
You can load the source of composer into your project vendors. You might have already done this.
The code you are looking for is at: Composer\Command\RequireCommand.
$install = Installer::create($io, $composer);

$install
    ->setVerbose($input->getOption('verbose'))
    ->setPreferSource($input->getOption('prefer-source'))
    ->setPreferDist($input->getOption('prefer-dist'))
    ->setDevMode($updateDevMode)
    ->setUpdate(true)
    ->setUpdateWhitelist(array_keys($requirements))
    ->setWhitelistDependencies($input->getOption('update-with-dependencies'));

$status = $install->run();
Most of the command relates to the reading and writing of the composer.json file.
However, the installer itself is independent of where the configuration actually came from; you could, in theory, store the configuration in a database.
This is the static create method for the installer:
public static function create(IOInterface $io, Composer $composer)
{
    return new static(
        $io,
        $composer->getConfig(),
        $composer->getPackage(),
        $composer->getDownloadManager(),
        $composer->getRepositoryManager(),
        $composer->getLocker(),
        $composer->getInstallationManager(),
        $composer->getEventDispatcher(),
        $composer->getAutoloadGenerator()
    );
}
You will need to pay special attention to the Package class, and implement your own.
Although your current attempt to run it via the console application would work, I do not recommend it, because Composer is primarily a development and deployment utility, not an application utility.
In order to use it smoothly for loading plugins in a production environment, you will need to tightly integrate its internals with your own application, not just use it on the side.
This is something I am interested in as well, and I think this has inspired me to look into it myself, so I'll let you know what I come up with. This is the best I can advise for now on what I consider to be the correct approach.

PHP Subversion Setup FTP

I work at a small PHP shop, and I recently proposed that we move away from using our NAS as a shared code base and start using Subversion for source control.
I've figured out how to make sure our dev server gets updated with every commit to our development branch, and I know how to merge into trunk and have that update our staging server, because we have direct access to both of these. My biggest question is how to write a script that will update the production server, to which we often have only FTP access. I don't want to upload the entire site every time. Is there any way to write a script that is smart enough to upload only what has changed to the web server when we execute it? (We don't want it uploading to the production environment automatically; we want to run it manually.)
Does my question even make sense?
Basically, your issue is that you can't use Subversion on the production server. What you need to do is keep a copy of your production checkout on a separate (ideally identically configured) server, and copy that to the production server through whatever method works. You could think of this as your staging server, actually, since it will also be useful for doing final tests on releases before rolling them out.
As far as the copy goes, if your provider supports rsync, you're set. If you have only FTP, you'll have to find some method of doing the equivalent of rsync via FTP. This is not the first time anybody's had that need; a web search will help you out there. But if you can't find anything, drop me a note and I'll look around myself a little further.
EDIT: I hope the author doesn't mind me adding this, but I think it belongs here. To do something approximately similar to rsync over FTP, look at weex (http://weex.sourceforge.net/). It's a wrapper around command-line FTP that uses a local mirror to keep track of what's on the remote server, so that it can send only changed files. Works great for me.
It doesn't sound like SVN plays well with FTP, but if you have HTTP access, that may prove sufficient to push changes using svnsync. That's how we push changes to our production servers: we use svnsync to keep a read-only mirror of the repository available.
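Setting up such a mirror looks roughly like this; a sketch with placeholder URLs, assuming the mirror repository already exists and its pre-revprop-change hook permits revision property changes (svnsync requires that):
# One-time setup of the read-only mirror, then repeatable syncs.
svnsync initialize http://production-server/svn-mirror http://dev-server/svn-repo
svnsync synchronize http://production-server/svn-mirror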
I use the following solution: just install the SVN client on your webserver, and attach this to a privately accessible URL:
<?php
// Make sure you have a robot account that can't commit ;)
$username = Settings::Load()->Get('svn', 'username');
$password = Settings::Load()->Get('svn', 'password');
$repos = Settings::Load()->Get('svn', 'repository');

echo '<h1>updating from svn</h1><pre>';

// For security, define an array of folders that you do want to be synced
// from SVN. The rest will be skipped.
$svnfolders = array('includes/', 'plugins/', 'images/', 'templates/', 'index.php' => 'index.php');

$svnfiles = $svnfolders; // sync everything by default
if (!empty($_GET['justthisone']) && array_search($_GET['justthisone'], $svnfolders) !== false) {
    // You can also update just one of the above by passing it via $_GET.
    $svnfiles = array($_GET['justthisone']);
}

foreach ($svnfiles as $targetlocation) {
    echo system("svn export --username={$username} --password={$password} {$repos}{$targetlocation} " . dirname(__FILE__) . "/../{$targetlocation} --force");
}

die("</pre><h1>Done!</h1>");
I'm going to make an assumption here and say you are using a post-commit hook to do the merging and updating of your staging server. This may work, but I would strongly recommend you look into a continuous integration solution. The following are some that I am aware of:
Xinc - http://code.google.com/p/xinc/ (PHP-specific)
CruiseControl - http://cruisecontrol.sourceforge.net/ (wildly popular; PHP integration made possible with http://phpundercontrol.org/about.html)
Hudson - https://hudson.dev.java.net/ (Java-based, but allows for plugins/extensions)
LFTP is capable of synchronizing directories over FTP.
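For example, a one-shot upload mirror might look like this; a sketch where the host, credentials, and paths are placeholders:
# --reverse uploads (local -> remote); --only-newer skips unchanged files.
lftp -u user,password ftp.example.com -e "mirror --reverse --only-newer /local/site /public_html; quit"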
Just an idea:
You could hold a working copy of your project on a host you have access to and where Subversion is installed. This single revision reflects the production server's version.
You could then write a PHP script that updates this working copy over SVN and finds all files that have changed since the update started. These files you can upload.
Such a script could look like this:
$path = realpath('/path/to/production/mirror');
chdir($path);

$start = time();
shell_exec('svn update');

$list = array();
$i = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($path), RecursiveIteratorIterator::SELF_FIRST);
foreach ($i as $node) {
    if ($node->isFile() && $node->getCTime() > $start) {
        $list[] = $node->getPathname();
    }
    // directories should also be handled
}

$conn = ftp_connect( ... );
// and so on
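The upload step could then be a simple loop over $list; a sketch, with hypothetical host and credentials:
// Upload each changed file, preserving its path relative to the mirror root.
$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'user', 'password');

foreach ($list as $local) {
    $remote = substr($local, strlen($path)); // e.g. /includes/foo.php
    ftp_put($conn, $remote, $local, FTP_BINARY);
}

ftp_close($conn);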
Just written down as it came to mind.
I think this will help you:
https://github.com/midhundevasia/deploy
It works well on Windows.

Can't get SFTP to work in PHP

I am writing a simple SFTP client in PHP because we have the need to programmatically retrieve files from n remote servers. I am using the PECL SSH2 extension.
I have run up against a roadblock, though. The documentation on php.net suggests that you can do this:
$stream = fopen("ssh2.sftp://$sftp/path/to/file", 'r');
However, I have an ls method that attempts something similar:
public function ls($dir)
{
    $rd = "ssh2.sftp://{$this->sftp}/$dir";
    $handle = opendir($rd);

    if (!is_resource($handle)) {
        throw new SFTPException("Could not open directory.");
    }

    while (false !== ($file = readdir($handle))) {
        if (substr($file, 0, 1) != '.') {
            print $file . "\n";
        }
    }

    closedir($handle);
}
I get the following error:
PHP Warning: opendir(): Unable to open ssh2.sftp://Resource id #5/outgoing on remote host
This makes perfect sense, because that's what happens when you cast a resource to a string. Is the documentation wrong? I tried replacing the resource with host, username, and password, and that didn't work either. I know the path is correct because I can run SFTP from the command line and it works fine.
Has anyone else tried to use the SSH2 extension with SFTP? Am I missing something obvious here?
UPDATE:
I set up SFTP on another machine in-house and it works just fine, so there must be something about the server I am trying to connect to that isn't working.
When connecting to an SFTP server and you need to access the root folder (for instance, to read the folder's contents), you would still get the error when using just "/" as the path.
The solution that I found was to use the path "/./", a valid path that refers to the root folder. This is useful when the user you are logging in with has access only to its own root folder and no full path is available.
So the request to the server when trying to read the contents of the root folder should look something like this:
$rd = "ssh2.sftp://{$this->sftp}/./";
For PHP versions > 5.6.27, use intval():
$sftpConnection = ssh2_connect($host);
$sftp = ssh2_sftp($sftpConnection);

$fp = fopen("ssh2.sftp://" . intval($sftp) . $remoteFile, "r");
https://bugs.php.net/bug.php?id=73597
I'm having a similar issue. I assume you are doing something like this:
$dir = "ssh2.sftp://{$sftp}{$remote_dir}";
$handle = opendir($dir);
When $remote_dir is the full path from the root, opendir works. If $remote_dir is just '/' or '', then I get the same 'unable to open' error as you did.
In my case, it seems SSH connects at the root folder instead of the home directory, as FTP does. You mentioned that it worked on a different server, so I wonder if it is just a config issue.
The easiest way to get SFTP working within PHP (even on Windows), without installing any extensions, is phpseclib: http://phpseclib.sourceforge.net/ . The SSH functionality is implemented entirely in PHP classes.
You use it like this:
<?php
include('Net/SFTP.php');

$sftp = new Net_SFTP('www.domain.tld');
if (!$sftp->login('username', 'password')) {
    exit('Login Failed');
}

echo $sftp->pwd();
?>
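Retrieving a file from there is a one-liner; a sketch with placeholder file names:
// Download remote.txt into local.txt; nlist() lists a directory's entries.
$sftp->get('remote.txt', 'local.txt');
print_r($sftp->nlist('.'));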
The documentation on that page contains an error. Take a look at the example here instead: http://php.net/ssh2_sftp . What you actually need to do is open a special SFTP resource using ssh2_sftp() prior to using it with fopen(). And yes, it looks just like that when converted to a string, e.g. "Resource #24"; a bit weird, but apparently it works.
Another caveat is that SFTP starts in the root directory rather than in the remote user's home directory, so the remote path in the URI should always be an absolute one.
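Put together, the whole flow looks roughly like this; a sketch where the host, credentials, and path are placeholders:
// Connect, authenticate, then request the SFTP subsystem.
$connection = ssh2_connect('sftp.example.com', 22);
ssh2_auth_password($connection, 'username', 'password');
$sftp = ssh2_sftp($connection);

// Use the SFTP resource (not the SSH connection) in the stream URL, with an
// absolute remote path. On PHP > 5.6.27, use intval($sftp) as noted above.
$stream = fopen("ssh2.sftp://{$sftp}/absolute/path/to/file", 'r');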
I just had the same issue, but I figured out the problem.
In my case, when connecting to the server I landed at the root of the account, and due to server config I wasn't able to write there.
I connected to the account using FireFTP so I could see where the root of the account was... it was the root of the server.
I had to include the whole path up to the folder where I am allowed to write, and that solved the issue.
So my advice is to get the path using a graphical client (I used FireFTP) and add the whole path to your code:
$pathFromAccountRootFolderToMyDestinationFolder = '/Account/Root/Folder/To/My/Folder';
$stream = fopen("ssh2.sftp://".$sftp."/".$pathFromAccountRootFolderToMyDestinationFolder."/myFile.ext", 'r');
Hope this helps you and other people with the same issue!
Cheers!
I recently tried to get SFTP working in PHP and found that phpseclib was a lot easier to use:
http://phpseclib.sourceforge.net/
If you have the luxury of not being on a shared host and can install whatever extensions you want, maybe the PECL extension would be better, but not all of us are so lucky. Plus, phpseclib's API looks a bit more intuitive, being OOP and all.
My issue was that I was connecting inside a function and returning a string URL with the resource embedded in it. Unfortunately, the resource is then created in the function's context, and the garbage collector disconnects it when the function ends. Solution: return the resource by reference and unset it manually in the broader context.
I solved my issue by enabling SFTP support on the (PowerShell) server.
This is a bug in the ssh2 package that I found years ago; I posted a patch to php.net. It fixes this issue but requires a rebuild of the ssh2 PECL package. You can read more here: https://bugs.php.net/bug.php?id=69981 . I included a patch there to the ssh2_fopen_wrappers.c file in the package to fix the issue. Here is a comment I included:
Here is the line of code from ssh2_fopen_wrappers.c that causes this bug (comment included):
/*
Find resource->path in the path string, then copy the entire string from the original path.
This includes ?query#fragment in the path string
*/
resource->path = estrdup(strstr(path, resource->path));
This line of code, and therefore this bug, was introduced as a fix for bug #59794. That line attempts to get a string containing the path, query, and fragment from the path variable.
Consider this value for the path variable:
ssh2.sftp://Resource id #5/topdir?a=something#heading1
When resource->path is "/topdir", the result is that "/topdir?a=something#heading1" gets assigned to resource->path, just like the comment says.
Now consider the case when resource->path is "/". After the line of code is executed, resource->path becomes "//Resource id#5/topdir#heading1", which is clearly not what you want. Here's a line of code that does what you want:
resource->path = estrdup(strstr(strstr(path, "//") + 2, "/"));
You may also need to apply the patch for bug #73597, which removes "Resource id #" from the path string before calling php_url_parse().
