Run Composer from a browser with a PHP script

This post is related to this one:
Run composer with a PHP script in browser
My problem is how to install a library without a terminal, taking into consideration that some hosts do not allow the exec command.
In short, the user should be able to just click a button to install apps that use the library.
Thank you.
I tried two solutions:
use Composer\Console\Application;
use Symfony\Component\Console\Input\ArrayInput;
use Symfony\Component\Console\Output\StreamOutput;
putenv('COMPOSER_HOME=' . self::$root); // /www/mywebsite/shop/
putenv('COMPOSER_CACHE_DIR=' . CORE::BASE_DIRECTORY . '/Work/Cache/Composer/');
putenv('COMPOSER_HTACCESS_PROTECT=0');
First, with this code:
$stream = fopen('php://temp', 'w+');
$output = new StreamOutput($stream);
$application = new Application();
$application->setAutoExit(false);
$code = $application->run(new ArrayInput(array('command' => 'install tinify/tinify')), $output);
$result = stream_get_contents($stream);
var_dump($code);
var_dump($result);
The result (it does not work and nothing is installed):
int(1) string(0) ""
The second approach:
$input = new ArrayInput(array('command' => 'install tinify/tinify'));
$application = new Application();
$application->setAutoExit(false); // prevent the `$application->run` method from exiting the script
$result = $application->run($input);
var_dump($result);
The result (it does not work and nothing is installed):
int(1)
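A likely cause of the int(1) exit code in both attempts: with Symfony Console's ArrayInput, the value of 'command' must be the command name alone, and arguments such as package names go in their own keys. Also, a php://temp stream has to be rewound before reading it back. A minimal sketch of a corrected call, assuming the same environment setup as above:
$stream = fopen('php://temp', 'w+');
$output = new StreamOutput($stream);
$application = new Application();
$application->setAutoExit(false);
// 'require' fetches a package; the package list is a separate 'packages' argument.
$input = new ArrayInput(array(
    'command'  => 'require',
    'packages' => array('tinify/tinify'),
));
$code = $application->run($input, $output);
rewind($stream); // without this, stream_get_contents() reads from the end and returns ""
$result = stream_get_contents($stream);
var_dump($code);   // 0 on success
var_dump($result); // Composer's console output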

You do not need to run composer on the production server.
You can run composer in the project on a computer with similar software (OS, PHP version), then copy the already-built project directory, vendor folder included, to the production server.
Usually you can zip/gzip it first, transfer it, then unzip it. For follow-up updates, something like rsync can be a fast solution, since it transfers only those parts of the project folder that changed.
You may want a script that copies your development directory, cleans out any personal credentials (e.g. passwords for SaaS services you use during development), and then runs composer and the transfer automatically; a sketch of such a build step follows.
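A minimal PHP sketch of that build step, assuming a hypothetical project path and a hypothetical .env credentials file; the transfer itself (rsync, SFTP, plain FTP) is left to whatever your host supports:
$src = '/home/me/projects/shop'; // assumed development directory
$zip = new ZipArchive();
$zip->open('/tmp/shop-build.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE);
$files = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($src, FilesystemIterator::SKIP_DOTS)
);
foreach ($files as $file) {
    $rel = substr($file->getPathname(), strlen($src) + 1);
    if ($rel === '.env') {
        continue; // keep personal credentials out of the build
    }
    $zip->addFile($file->getPathname(), $rel);
}
$zip->close();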

Related

Laravel storage:link does not work on Heroku?

So I've been playing around with Heroku and I really like it. It's fast and it just works. However, I have encountered a problem with my gallery app: https://miko-gallery.herokuapp.com . Create a free account, create an album and try uploading a photo. It will not display. I have run the php artisan storage:link command, but it does not work. What am I missing here?
EDIT
I've just tried a new thing: I ran heroku run bash and cd'ed into the storage/app/public folder, and it does not contain the images folder that was supposed to be there.
My code for saving the photo is here (works on localhost):
public function store(Request $request)
{
    $ext = $request->file('items')->getClientOriginalExtension();
    $filename = str_random(32).'.'.$ext;
    $file = $request->file('items');
    $path = Storage::disk('local')->putFileAs('public/images/photos', $file, $filename);
    $photo = new Photo();
    $photo->album_id = $request->album_id;
    $photo->caption = $request->caption;
    $photo->extension = $request->file('items')->getClientOriginalExtension();
    $photo->path = $path.'.'.$photo->extension;
    $photo->mime = $request->file('items')->getMimeType();
    $photo->file_name = $filename;
    $photo->save();
    return response()->json($photo, 200);
}
Heroku's filesystem is dyno-local and ephemeral. Any changes you make to it will be lost the next time each dyno restarts. This happens frequently (at least once per day).
As a result, you can't store uploads on the local filesystem. Heroku's official recommendation is to use something like Amazon S3 to store uploads. Laravel supports this out of the box:
Laravel provides a powerful filesystem abstraction thanks to the wonderful Flysystem PHP package by Frank de Jonge. The Laravel Flysystem integration provides simple to use drivers for working with local filesystems, Amazon S3, and Rackspace Cloud Storage. Even better, it's amazingly simple to switch between these storage options as the API remains the same for each system.
Simply add league/flysystem-aws-s3-v3 ~1.0 to your dependencies and then configure it in config/filesystems.php.
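For the code in the question, the switch can be small: write the upload to the s3 disk instead of the dyno-local one. A hedged sketch, assuming an s3 disk is configured in config/filesystems.php and the AWS credentials are set as config vars:
// In store(): persist the upload to S3 instead of the ephemeral local filesystem.
$path = Storage::disk('s3')->putFileAs('images/photos', $file, $filename);
$url = Storage::disk('s3')->url($path); // public URL to display the photo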
If you don't have SSH access, you can simply create a route, so you can run the command by hitting a URL. Unlink any existing symlink at public/storage first:
Route::get('/artisan/storage', function () {
    $command = 'storage:link';
    $result = Artisan::call($command);
    return Artisan::output();
});
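Since this exposes an Artisan command over HTTP, consider guarding the route; a minimal sketch using a hypothetical ARTISAN_ROUTE_TOKEN environment variable:
Route::get('/artisan/storage', function (\Illuminate\Http\Request $request) {
    // Reject requests that don't carry the expected secret token.
    abort_unless($request->query('token') === env('ARTISAN_ROUTE_TOKEN'), 403);
    Artisan::call('storage:link');
    return Artisan::output();
});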

How to handle secret files on Azure App Services with Git

We have a PHP app where, to encrypt the connection to the database, we need to provide 3 files that shouldn't be publicly accessible but must be present on the server to make the DB connection (https://www.cleardb.com/developers/ssl_connections).
Obviously we don't want to store them in SCM with the app code, so the only idea that comes to mind is to use a post-deploy action hook to fetch those files from a storage account (with keys and URIs provided in the app settings).
Is there a nicer/cleaner way to achieve this? :)
Thank you,
You can try to use a custom deployment script to execute additional scripts or commands during the deployment task. You can create a PHP script that downloads the certificate files from Blob Storage to a location on the server's file system; the DB connection in your PHP application can then use these files.
Following are the general steps:
Enable the composer extension in your portal.
Install the azure-cli module via npm; refer to https://learn.microsoft.com/en-us/azure/xplat-cli-install for more info.
Create a deployment script for PHP via the command azure site deploymentscript --php.
Execute the command composer require microsoft/windowsazure, and make sure you have a composer.json with the storage SDK dependency.
Create a PHP script in your root directory to download the files from Blob Storage (e.g. named run.php):
require_once 'vendor/autoload.php';
use WindowsAzure\Common\ServicesBuilder;
use MicrosoftAzure\Storage\Common\ServiceException;

$connectionString = "<connection_string>";
$blobRestProxy = ServicesBuilder::getInstance()->createBlobService($connectionString);
$container = 'certificate';
$blobs = ['client-key.pem', 'client-cert.pem', 'cleardb-ca.pem'];
foreach ($blobs as $b) {
    // Fetch each certificate blob and write it next to the script.
    $blobresult = $blobRestProxy->getBlob($container, $b);
    $source = stream_get_contents($blobresult->getContentStream());
    $result = file_put_contents($b, $source);
}
Modify the deploy.cmd script: add the line php run.php under the KuduSync step.
Deploy your application to Azure Web App via Git.
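Optionally, you can harden run.php so that a failed download fails the deployment visibly instead of silently; a sketch (the catch is deliberately generic rather than guessing the SDK's exception class):
foreach ($blobs as $b) {
    try {
        $blobresult = $blobRestProxy->getBlob($container, $b);
        $source = stream_get_contents($blobresult->getContentStream());
        if (file_put_contents($b, $source) === false) {
            fwrite(STDERR, "failed to write $b\n");
            exit(1); // a non-zero exit code makes the deployment step fail
        }
    } catch (\Exception $e) {
        fwrite(STDERR, "failed to download $b: " . $e->getMessage() . "\n");
        exit(1);
    }
}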
Any further concern, please feel free to let me know.

Installing PHP library for Supervisor using composer

I want to use Supervisor to manage my processes. I have it installed on my Amazon Linux machine and the basic setup runs fine as per the config file.
Now, I want to change the processes dynamically. Since that requires changing the config file and restarting every time, using a PHP library to do it seems like a good option.
Specifically, I am going through SupervisorPHP config to change the configuration dynamically and SupervisorPHP to manage Supervisor through PHP.
Following the README for SupervisorPHP config, I installed it via composer:
composer require supervisorphp/configuration
I copied the sample code
<?php
use Supervisor\Configuration\Configuration;
use Supervisor\Configuration\Section\Supervisord;
use Supervisor\Configuration\Section\Program;
use Indigophp\Ini\Rendere;
$config = new Configuration;
$renderer = new Renderer;
$section = new Supervisord(['identifier' => 'supervisor']);
$config->addSection($section);
$section = new Program('test', ['command' => 'cat']);
$config->addSection($section);
echo $renderer->render($config->toArray());
When I run this code, I get the following error:
PHP Fatal error: Class 'Supervisor\Configuration\Configuration' not found in test.php on line 7
I also tried to clone the repo and include the files individually; however, it shows errors for other dependencies. It would be great if I could use this.
There are 2 mistakes in the above code.
The first mistake is that you do not use the autoloader provided by composer, so PHP cannot find the necessary classes. To fix it, just add require __DIR__ . '/vendor/autoload.php'; (if the vendor folder is at a different path relative to the sample script, adjust accordingly).
The second mistake is in the use statement for Indigophp. Apart from the obvious typo in the word Renderer, if you check the source of Indigo you will see that it must be use Indigo\Ini\Renderer;
So the correct code to test your installation is:
<?php
require __DIR__ . '/vendor/autoload.php';
use Supervisor\Configuration\Configuration;
use Supervisor\Configuration\Section\Supervisord;
use Supervisor\Configuration\Section\Program;
use Indigo\Ini\Renderer;
$config = new Configuration;
$renderer = new Renderer;
$section = new Supervisord(['identifier' => 'supervisor']);
$config->addSection($section);
$section = new Program('test', ['command' => 'cat']);
$config->addSection($section);
echo $renderer->render($config->toArray());
Running the above code, you should get the following output:
[supervisord]
identifier = supervisor
[program:test]
command = cat
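To then apply the configuration dynamically, as the question intends, you would write the rendered INI to the file Supervisor reads and trigger a restart/reread; a minimal sketch, assuming a hypothetical /etc/supervisord.conf path that the PHP process can write to:
// Persist the rendered INI so supervisord picks it up on its next reread/restart.
$ini = $renderer->render($config->toArray());
if (file_put_contents('/etc/supervisord.conf', $ini) === false) {
    throw new RuntimeException('could not write the supervisor config');
}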

How to use composer/composer PHP classes to update individual packages

I want to use the composer/composer PHP classes to update individual plugin packages.
I do not want to use command-line solutions like exec("php composer.phar update");
I am unable to get it to work. I have tried several different options, much like the following code.
It just returns a blank screen.
use Composer\Console\Application;
use Symfony\Component\Console\Input\ArrayInput;
use Symfony\Component\Console\Output\BufferedOutput;
$input = new ArrayInput(array('command' => 'require vendor/packkage dev-master'));
$output = new BufferedOutput();
$application = new Application();
$application->run($input, $output);
dd($output->fetch());
Things I would like to achieve:
Download/Update individual packages
Get result output to verify success
Dump autoload
Remove/require packages
A bit of context:
I am creating a plugin updater for my PHP application (in the admin panel).
Every plugin is a composer package and resides on my own Satis repository.
The plugins get installed into a custom dir using my composer plugin.
I can read composer.lock locally and packages.json on the satis server to figure out which packages require updates.
UPDATE
I've managed to at least get it to work. The no-output issue was due to $application->setAutoExit, which needed to be set to false before running. The next issue was that the required package would download itself into the same directory as the class from which I called it. I solved that by using putenv and chdir. Result:
root/comp.php
putenv('COMPOSER_HOME=' . __DIR__ . '/vendor/bin/composer');
chdir(__DIR__);
root/workbench/sumvend/sumpack/src/PackageManager.php
include(base_path() . '/comp.php');
$input = new ArrayInput(array('command' => 'require', 'packages' => ['vend/pak dev-master']));
$output = new BufferedOutput();
$application = new Application();
$application->setAutoExit(false);
$application->run($input, $output);
dd($output->fetch());
This works, but it's far from ideal.
The full solution to this would be pretty long-winded, but I will try to get you on the right track.
php composer.phar require composer/composer dev-master
You can load the source of composer into your project vendors. You might have already done this.
The code you are looking for is at: Composer\Command\RequireCommand.
$install = Installer::create($io, $composer);
$install
    ->setVerbose($input->getOption('verbose'))
    ->setPreferSource($input->getOption('prefer-source'))
    ->setPreferDist($input->getOption('prefer-dist'))
    ->setDevMode($updateDevMode)
    ->setUpdate(true)
    ->setUpdateWhitelist(array_keys($requirements))
    ->setWhitelistDependencies($input->getOption('update-with-dependencies'))
;
$status = $install->run();
Most of the command relates to reading and writing the composer.json file.
However the installer itself is independent of where the configuration actually came from. You could in theory store the configuration in a database.
This is the static create method for the installer:
public static function create(IOInterface $io, Composer $composer)
{
    return new static(
        $io,
        $composer->getConfig(),
        $composer->getPackage(),
        $composer->getDownloadManager(),
        $composer->getRepositoryManager(),
        $composer->getLocker(),
        $composer->getInstallationManager(),
        $composer->getEventDispatcher(),
        $composer->getAutoloadGenerator()
    );
}
You will need to pay special attention to the Package, and implement your own.
Although your current attempt to run it on the command line will work, I do not recommend it because Composer is primarily a development and deployment utility, not an application utility.
In order to smoothly use it to assist with loading plugins on a production environment, you will need to tightly integrate its internals with your own application, not just use it on the side.
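As a starting point, here is a minimal sketch of driving the Installer directly, against the Composer 1.x API the excerpt above comes from; NullIO and the composer.json path are assumptions for illustration:
use Composer\Factory;
use Composer\Installer;
use Composer\IO\NullIO;

$io = new NullIO(); // swap in Composer\IO\BufferIO to capture the output
$composer = Factory::create($io, '/path/to/composer.json');
$install = Installer::create($io, $composer);
$install
    ->setUpdate(true)
    ->setUpdateWhitelist(array('vendor/package')) // update only this package
;
$status = $install->run(); // 0 on success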
This is something I am interested in as well, and I think this has inspired me to look into it myself. So I'll let you know what I come up with, but this is the best I can advise you for now on what I consider to be the correct approach.

Deploy a PHP project from Git to a server that does not have Git installed

I need to find a method of deploying a PHP project stored in a Git repo to staging and production servers that do not have Git installed. The scripts I've found so far (e.g. Capistrano) require Git on the target server.
Unfortunately, my host does not allow this, and the only way so far is via standard FTP, with which I keep missing files. This makes for an unprofessional look.
I would like to be able to deploy from my local Git repo, which would check the .git folder on the target to see which version is there, then have the target server back up the current version and overwrite it with only the changed files being pushed.
Preferably something in PHP with a web interface.
Not asking much am I ;)
Anyone out there got/seen anything like this?
There are three git-ftp scripts which allow you to "push" a git repository to an FTP server.
git-ftp (bash)
git-ftp (python)
PHPloy (php)
You might be able to use something like FUSE to "mount" the production server as a local drive, and then as far as your copy of Git is concerned it's a local operation. Alternatively, rsync.
There's a tool called Dandelion that also does this. From what I can see, it's quite similar to git-ftp, BUT it also supports SFTP and Amazon S3, which is handy if you don't want to change deploy tools just because you change servers. It comes as a Ruby gem, so it's really easy to install and get going.
I have done something like that using ssh2 and PHP.
First you need to clone the repo on the server. Once cloned, you can do git pull, checkout, etc. from PHP using ssh2. The most practical way I found was running:
git fetch;
git reset --hard commit_hash;
in order to set the working tree to the expected commit.
To execute a command from PHP over ssh2 (supposing you have the ssh2 extension installed), you can use this method:
public static function SSHCommand($command, $user, $ip) {
    $port = 22;
    if (!function_exists('ssh2_connect')) {
        die("function ssh2_connect doesn't exist.");
    }
    if (!($con = ssh2_connect($ip, $port, array('hostkey' => 'ssh-rsa')))) {
        die("unable to establish connection.");
    } else {
        // try to authenticate with the user's deploy key pair
        if (!ssh2_auth_pubkey_file($con, $user, '/home/' . $user . '/.ssh/deploy_rsa.pub', '/home/' . $user . '/.ssh/deploy_rsa'/* , 'secret' */)) {
            die("fail: unable to authenticate.");
        } else {
            // all right, we're in! execute the command
            if (!($stream = ssh2_exec($con, $command))) {
                die("fail: unable to execute command.");
            } else {
                // collect the data returned by the command
                stream_set_blocking($stream, true);
                $data = '';
                while ($buf = fread($stream, 4096)) {
                    $data .= $buf;
                }
                fclose($stream);
                return $data;
            }
        }
    }
}
I'm using an ssh-rsa key; the auth method might differ for you. I'm also supposing that the keys are at '/home/' . $user . '/.ssh/deploy_rsa.pub' and '/home/' . $user . '/.ssh/deploy_rsa'.
The other thing to take into account is that to execute a remote git command, the command should look like:
_GIT_PATH.' --git-dir='.$path.'/.git --work-tree='.$path.' '.$command;
where $path is the top level of the working tree.
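Putting the two together, a hypothetical usage; the Deploy class name, path, host and commit hash are placeholders for illustration:
// Hard-reset the remote working tree to a known commit over SSH.
$path = '/var/www/myapp'; // top level of the working tree on the server
$git  = '/usr/bin/git --git-dir=' . $path . '/.git --work-tree=' . $path;
Deploy::SSHCommand($git . ' fetch', 'deployuser', '203.0.113.10');
echo Deploy::SSHCommand($git . ' reset --hard a1b2c3d', 'deployuser', '203.0.113.10');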
By using this and the Amazon API, I've been able to deploy new code to several servers automatically and simultaneously.
I use Beanstalkapp.com, which is great. You can deploy via FTP or SFTP.
