I have an application built in 2007 that makes extensive use of PEAR's Auth and DB packages. It had been mothballed, but it is back in use now. Since those packages are no longer available and PEAR has changed completely, it no longer works on my system.
Short of rewriting the entire application, is there any way to get the previous functionality of the DB and Auth packages?
Thanks.
If you don't mind doing some investigative work yourself, you could look at the release pages for both of those packages - http://pear.php.net/package/Auth/download/All and http://pear.php.net/package/DB/download/All - to determine which versions you had installed and used when you developed your application.
Once you've confirmed and installed the specific versions you need, you might want to consider writing what's called a "PEAR meta package" and committing it to your version control system, so that those exact packages can easily be installed again (on other servers, for example) with minimal hassle.
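For example, the PEAR installer lets you pin exact release numbers. Assuming the 2007-era releases were something like Auth 1.6.1 and DB 1.7.13 (check the download pages above for the real numbers), the install would look like this:
$ pear install Auth-1.6.1
$ pear install DB-1.7.13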
I'm still new to coding and I'm learning everything on my own. This may be a silly question, but after reading a dozen articles I am still confused.
I have a PHP-based website on a shared host. After reading various articles on the benefits of using repositories and Composer, I decided to give it a try. These are my difficulties so far:
Which operating system's version of Composer should I download to be able to install/update packages on my cPanel-based shared hosting?
If I install the Windows version, how do I connect to my shared hosting to install/update the packages?
My apologies for the silly questions, but any help would be much appreciated.
If you are using shared hosting, you are unlikely to be able to run Composer on the host itself. Furthermore, running Composer "on production" is discouraged anyway.
I would recommend you use Composer locally (on your own machine) to compose your project and install your dependent packages. Once it's all working and tested with your own code, you upload your entire development directory tree, including the resulting vendor directory, as one big FTP/SCP upload of "flat files".
Once you get more advanced you could venture into automated deployment techniques, but for now I feel you would be best served by sticking to Composer as a local development tool to manage your codebase.
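A minimal sketch of that local workflow (the package name is just an example, not a recommendation):
$ composer require monolog/monolog
$ composer install --no-dev --optimize-autoloader
After that, the vendor/ directory sits next to your own code and simply gets uploaded with everything else.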
Update, further details:
Composer is really a tool to help you manage your codebase in development; it's not intended as a "deployment" tool. Previously you would find a library you liked, download it, unzip it into your codebase somewhere random like "lib/stuff", link to it, and commit it to your version control system (VCS). Fine, but then a year later you want to update it, and you have to download it again, figure out where you saved it and how to overwrite the files or delete old ones... it gets hard. Your VCS repository also fills up with third-party components - even duplicates of the same one! Composer solves this by bringing order to that long-term dependency-management chaos.
The reason you don't want to run Composer "on production" (i.e. your live website) is that during the download/update/composition process your website will probably be broken. Even if the Composer run works, that could mean several minutes of a broken site. And after the update has finished, you have a completely new set of third-party packages: how do you know they are compatible with your codebase?
So you only run Composer updates locally, test everything, amend your code to work with the shiny new updates, and only then decide to upload the whole new site to the server - just as if you'd cobbled it all together manually. The deployment is independent.
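For completeness: the only glue the uploaded site needs is Composer's generated autoloader, which travels inside the vendor/ directory you upload, so nothing Composer-related has to run on the server at all. A typical front controller just does:
<?php
// index.php on the shared host; vendor/ was generated locally and uploaded as flat files
require __DIR__ . '/vendor/autoload.php';
// ... the rest of your application can now use any of the installed packages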
I'm building an SDK for a new startup; developers will use it to build modules for ecommerce platforms that consume our API.
Obviously it would be ideal to use Composer, which is what I'm doing right now. But as I examine the ecommerce platforms out there, or at least the most popular ones, most of them don't use Composer.
So I'm wondering what the best way is to gather all the dependencies my current packages need and build them into a freestanding SDK.
This way I can have a version that will work for both composer and non-composer enabled platforms.
Is there a standardized way to do this in terms of a design pattern? How would I lay out all the dependency packages in an organized way?
Just because those e-commerce platforms don't use Composer doesn't mean you have to exclude Composer from the equation. You can't rely on Composer to distribute your package as a plugin/module/whatever for that particular e-commerce platform, but you can still use Composer's autoloader in production.
You could prepare the package for deployment on your machine or on a build server, archive the result and distribute the archive.
For the sake of simplicity, my example will assume that you will prepare your package on your local machine:
Create a temporary working directory:
$ mkdir -p ~/.tmp && cd ~/.tmp
Clone your package:
$ git clone <package>
Install the dependencies (see the note below about --optimize-autoloader):
$ cd ~/.tmp/<package> && composer.phar install --no-dev --optimize-autoloader
or if you do this from an automated tool:
$ cd ~/.tmp/<package> && composer.phar install --no-ansi --no-dev --no-interaction --no-progress --no-scripts --optimize-autoloader
Remove the .git directory.
Create a zip/tar archive from ~/.tmp/<package>.
Distribute the archive.
Assuming that your package is already a plugin/module for that e-commerce platform, it can be installed as usual from that zip/tar archive.
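Inside the distributed module you then pull in the bundled autoloader yourself; the exact path is illustrative and depends on where the archive gets unpacked:
<?php
// e.g. in the module's bootstrap/entry file shipped inside the archive
require_once __DIR__ . '/vendor/autoload.php';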
Note: Regarding --optimize-autoloader, please read this answer from Sven, which explains why in some cases it doesn't help your application become faster.
Don't have dependencies!
Yes, seriously. If you were to develop an API client that uses Guzzle as the HTTP client, you'd have to make a choice: use Guzzle version 3, 4, 5 or 6?
Guzzle 3 is out of maintenance and abandoned. You wouldn't want to use it.
Guzzle 4 is also considered end-of-life, because version 5 came out very quickly after it. Nobody really uses this version.
This boils down to using either version 5 or 6. Guzzle uses the same namespace and largely the same class names in both versions, yet the two are incompatible with each other. No matter which version you choose, your customer may have made the opposite choice - and now you have a codebase where two versions of Guzzle would have to run at the same time. This will not work.
If you don't have dependencies but deliver everything within your own codebase, you have all of your code under your control, and you remove the need for Composer as a tool to easily install your dependencies. Your package already includes everything, and it's unlikely there will be any namespace conflicts.
You'd be able to offer a ZIP file for download. And if you additionally offer a composer.json so developers can include your package that way too, everyone will be happy.
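Such a dependency-free package only needs a composer.json describing its own autoloading. A rough sketch (the vendor/package name, namespace, and PHP constraint are placeholders):
{
    "name": "acme/api-sdk",
    "description": "Self-contained API client with no external runtime dependencies",
    "require": {
        "php": ">=5.5"
    },
    "autoload": {
        "psr-4": {
            "Acme\\Sdk\\": "src/"
        }
    }
}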
Update
Now that I've found out everyone thinks I'm crazy for proposing not to use stuff invented elsewhere, I challenge you to think about the situation once again: you have to produce code that will likely be included in a codebase that is NOT managed with Composer. That means you have no idea what kind of software has been put together there.
It may simply be that a version of Guzzle is already in the existing codebase - undetectable, because there is no composer.json. Now you provide your own package with a bundled copy of Guzzle (however it ended up there). This will likely break the entire application at some point, because the autoloading will of course be merged eventually, and then some part of the code will request a Guzzle class that is included twice, from two different versions of Guzzle.
WHAT SHOULD HAPPEN IN THIS CASE? THINGS WILL CRASH!
And it is unavoidable that this will happen. Even in the lucky case where Composer can be used, the versions will conflict - the software won't crash, but your package won't be installed at all. The good thing is: you will notice this immediately.
If the primary goal is to deliver an API client anyone can use in every situation, without using a dependency manager: Don't have dependencies!
Alternatively, be completely sure that you know which software is already being used, and create a package that will not conflict in any case. However, this is still an effort, because there might be other addons also being installed, which might include conflicting software.
My central point is: if you don't have a dependency manager like Composer to manage the dependencies, you are better off NOT having dependencies in your own code, so that it is super easy to include your code in someone else's codebase.
And the question above clearly states that Composer is not an option in the general case.
Now there is one light at the end of the tunnel: when it comes to general tasks, the PHP-FIG has started to standardize interfaces that foster interoperability. For HTTP messages, the standard is PSR-7.
You COULD provide an API SDK that depends on (and ships with) the PSR-7 interfaces and requires the user of the SDK to provide an HTTP client that implements them.
The problem I see with this approach is that you will still run into trouble if you yourself try to use, for example, Guzzle, for the same reason: the only valid choice for the SDK is then Guzzle 6 - but what if Guzzle 5 is already used elsewhere? Conflict! The good thing is: you can avoid Guzzle 6 when Guzzle 5 is already in use by picking any other PSR-7 capable HTTP client.
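A rough sketch of the PSR-7 approach (the interface and class names are hypothetical, not from any real SDK): the SDK type-hints only the PSR-7 message interfaces, and the integrator injects whatever PSR-7 capable client they already have.
<?php
use Psr\Http\Message\RequestInterface;
use Psr\Http\Message\ResponseInterface;

// Hypothetical adapter contract: the SDK consumer implements this with Guzzle 6
// or any other PSR-7 capable client (or a thin wrapper around whatever they use).
interface HttpClientAdapter
{
    public function send(RequestInterface $request): ResponseInterface;
}

class ApiClient
{
    private $http;

    public function __construct(HttpClientAdapter $http)
    {
        $this->http = $http;
    }

    // SDK methods build PSR-7 requests and hand them to $this->http->send()
}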
When you set up a stack in OpsWorks, does it lock in the current built-in cookbooks version or will it use the most up-to-date version each time a lifecycle event is triggered?
For custom cookbooks, I understand that OpsWorks caches the provided recipes when they are provided rather than fetching the newest version each time, but I wonder if the same is true for the built-in cookbooks.
I'm concerned about this for a few reasons. What if the cookbooks are updated to install a different version of Apache or PHP or slightly vary their default configuration? What if I then setup a new instance in a layer in which the old recipe was used and end up with multiple servers with slightly different configurations?
Also, there doesn't appear to be a way to customize which PHP 5.x version gets installed, so am I just at the mercy of the Ubuntu package maintainers' decision to use the latest stable version?
I do want to continue using the latest and greatest software versions, but I would like to deploy them on my own time after I have been able to test that my application works in the new version.
When you set up a stack in OpsWorks, does it lock in the current built-in cookbooks version or will it use the most up-to-date version each time a lifecycle event is triggered?
When you provision a new machine, the built-in cookbooks and your custom cookbooks are fetched onto the server at the same time. They are updated only when a custom cookbook update is requested. This is why the recommendation is NOT to copy the entire AWS cookbook into your custom cookbook - override only the things you are modifying, so you can still benefit from updates to the standard community cookbooks.
I'm concerned about this for a few reasons. What if the cookbooks are updated to install a different version of Apache or PHP or slightly vary their default configuration? What if I then setup a new instance in a layer in which the old recipe was used and end up with multiple servers with slightly different configurations?
This is not just a bane but a benefit too; it depends on how you perceive it. Updates may bring performance improvements, but they can also introduce bugs. This needs to be kept in sync by your operations people.
You can override parts of the built-in cookbooks by simply placing duplicate (or customised) versions in the same place in your custom cookbook - the "converge" option.
Or there is the more complex but surer way:
Implement a custom cookbook.
Import the OpsWorks cookbooks as a git submodule into a folder inside your cookbooks folder.
Symlink the cookbooks you need version-controlled into the main folder.
Evaluate and update specific ones as needed.
For example:
$ cd cookbook
$ git submodule add https://github.com/aws/opsworks-cookbooks external-cookbooks/opsworks-cookbooks
$ ln -s external-cookbooks/opsworks-cookbooks/rails rails
This way you can keep your infrastructure code under version control: make and evaluate changes, and only pull in upstream updates after you've tested them.
I still recommend using the converge approach with only the minor changes you require hard-coded. It will mean LESS duplication, and you will benefit from updates that may land in the community version of the cookbooks.
If you're using a cookbook that installs PHP the Ubuntu way (from the distribution packages), then you will be at the mercy of whatever is in the repo. If you use one that compiles PHP from source, then you can build specific versions. You may have to either write your own or find one that builds PHP and lets you specify the version via a cookbook attribute.
I am working on a PHP application that uses many features from PEAR. The app is meant to be distributable, kind of like WordPress but really scaled down.
The problem I've run into is that PEAR needs to be installed and configured alongside the PHP server, without which my app simply will not function - unless users go through all the painful steps of installing PEAR on their server. Users may well be newbies or non-technical, so that's not an option for them.
Therefore I need to somehow package everything PEAR-related into the application itself. As far as I know, that may not be possible.
Is there any alternative solution to this? Any solution at all will help. Thanks.
PEAR installs system-wide dependencies, which makes things like what you describe hard. Composer, on the other hand, is exactly what you need, because it's a per-project dependency manager with much better support for resolving and installing dependencies. Basically, compared to Composer, PEAR sucks... it always did; Composer, on the other hand, rocks!
The first thing I would do for each package you need is to check whether it is also provided on https://packagist.org/. If it is, problem solved: include the installation in your build process with Composer. If you end up with only a few packages that exist solely on PEAR, you have several options:
Encourage the author to publish it on Packagist.
Publish your own mirror on Packagist (not recommended, but sometimes necessary).
Check whether the project is on GitHub and install it directly from git with Composer.
Install the PEAR package via Composer anyway - it's possible (see the sketch below).
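For that last option: Composer 1.x could treat a PEAR channel as a repository (this support was removed in Composer 2, so treat it as a legacy bridge). A sketch, using DB as the example package and Composer's pear-{channel} vendor-prefix convention:
{
    "repositories": [
        { "type": "pear", "url": "https://pear.php.net" }
    ],
    "require": {
        "pear-pear.php.net/DB": "*"
    }
}
These days many former PEAR packages are also published directly on Packagist, which is the simpler route when available.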
Short answer: switch to composer!
If you are talking about the PEAR packages/class files themselves, you can put them anywhere you want. Just copy the ones you use into a directory within your app's directory structure and add that directory to the include path.
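For example, assuming you copy the PEAR class files into a lib/pear directory inside your application (the directory name is just an example):
<?php
// Make the bundled PEAR classes loadable without a system-wide PEAR installation.
set_include_path(get_include_path() . PATH_SEPARATOR . __DIR__ . '/lib/pear');

require_once 'DB.php'; // now resolved from lib/pear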
I want to start using more libraries in my CodeIgniter project, but I am wary of having them all in my VCS or having to manage library versions manually. I've recently found this PEAR guide, but I don't see anything about existing CodeIgniter libraries being available through this system. Does a CI-specific PEAR repo exist, and if not, how difficult is it to get packages into PEAR?
You can set up your own PEAR channel server (e.g. via Pirum, PEARHub or Pearfarm) and put your own packages there. It's not that hard to build your own packages - you can even automate it with a Phing script.
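Once the channel is up, consumers install from it with the standard PEAR client; the channel and package names below are made up for illustration:
$ pear channel-discover pear.example.com
$ pear install example/My_CI_Library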
See also a similar question here on Stack Overflow with a more complete list of channel servers.