On OpenShift there is an outdated version of PHP, so I want to make a cartridge that can run the latest PHP version, in order to run applications that require it.
I also need to have Composer installed, and to be able to connect to either MySQL/MariaDB or PostgreSQL.
I have seen https://blog.openshift.com/new-openshift-cartridge-format-part-1/ and https://blog.openshift.com/new-openshift-cartridge-format-part-2/ from the documentation, and so far I have only made the directory in which to do the compiling.
Also, the example above does not show HOW to make a cartridge when you need to custom-compile something such as PHP.
So what I am asking is: if I need to compile some application that my application depends on, such as PHP 7, how do I do it? Also, PHP 7 needs to be compiled only once, and I am not sure whether putting my scripts in ./openshift/build is a good idea.
I also need to be able to manually add Composer and PECL support, and to be able to use the existing PostgreSQL and MySQL cartridges.
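To make the question more concrete, here is the kind of install step I have in mind, based on the v2 cartridge format from those two posts (every path, environment variable, version number and configure flag below is a placeholder, not something I have tested):

    #!/bin/bash -e
    # Hypothetical bin/install for a custom PHP cartridge.
    # Compiles PHP once into the cartridge's own directory and skips the
    # build on later runs if the php binary already exists.

    PHP_VERSION=7.0.0                      # placeholder version
    PREFIX="$OPENSHIFT_PHP_DIR/usr"        # assumed cartridge env variable

    if [ ! -x "$PREFIX/bin/php" ]; then
        cd "$OPENSHIFT_TMP_DIR"
        curl -L -o php.tar.gz "https://www.php.net/distributions/php-$PHP_VERSION.tar.gz"
        tar xzf php.tar.gz && cd "php-$PHP_VERSION"
        ./configure --prefix="$PREFIX" \
                    --with-pdo-mysql --with-pdo-pgsql   # to talk to the db cartridges
        make && make install
    fi

    # Composer is a single phar, so it can sit next to the php binary.
    curl -L -o "$PREFIX/bin/composer" https://getcomposer.org/composer.phar
    chmod +x "$PREFIX/bin/composer"

Is something along these lines the intended way to do it, or should the compile go somewhere else?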
I built PHP on RHEL a while ago using the infamous configure/make/make install procedure, instead of using yum. Now I plan to include SNMP in my PHP build, but I can't find the old configure command, so I have to reconstruct it just to add the SNMP component. While doing so, I wondered whether there is a way to automatically regenerate the configure command with all of its options and parameters. I searched high and low on Google but couldn't find anything, so I am posting here to ask whether such a method exists. Please share if such a step is possible. Thanks.
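Edit: one thing I plan to try, assuming the old binary and the original build tree are still around:

    # PHP records the configure command it was built with; either of
    # these should print it back from the installed binary:
    php -i | grep "Configure Command"
    php-config --configure-options

    # The original build directory, if it still exists, also keeps the
    # exact flags in a regenerated script:
    cat /path/to/old/php-src/config.nice

Is that the right approach, or is there a more reliable way?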
I am using a shared host. PHP is compiled with --disable-sysvshm. I get the following error while running a script:
Fatal error: Call to undefined function shm_attach() in ...
Is there any way to enable it without recompiling PHP?
There is, but as a regular user, you can't do it. You'll need admin access.
If you have root access, then your package manager should have the extension available if it doesn't come built into PHP. For SuSE, it looks like a php-sysvshm package would do it. If there's no package, you'll still need to rebuild, but it's doable.
If you don't have the access you'd need to build PHP or install packages, you won't be able to build, install, or load extensions (which are pretty much the only way to add functionality without replacing your existing PHP). In that case, you'll need to talk to your web host and see if they will install it for you. If they won't, then that's pretty much it.
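If there is no package and you do get root (or can hand this to the host's admin), the extension can usually be built on its own as a shared module rather than rebuilding all of PHP. A rough sketch, assuming a PHP source tree that matches the installed version and a typical php.ini setup:

    # Build only ext/sysvshm from the matching PHP source tree.
    cd php-src/ext/sysvshm
    phpize                       # generates the extension's build system
    ./configure
    make && sudo make install    # installs sysvshm.so into the extension dir

    # Then enable it in php.ini (location varies per system):
    #     extension=sysvshm.so
    # and restart the web server or PHP-FPM.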
I have been compiling PHP for years with the configuration options I want, and I compile the extensions I use from source. Is there an advantage to doing this versus installing it from a package manager like apt-get or yum? I assumed it would also give me a leaner binary. I noticed that there are PHP modules in the repos, such as "php53-gd". What if there wasn't a package available for something I wanted, such as cURL for PHP?
I understand the disadvantages of compiling such as needing to download/install dependencies based on my configuration options. I'm not really concerned with that.
So the question is:
Compile PHP on Linux or just use apt-get / yum? Can I get all the things I need from the repos? Does anyone out there still compile it from source?
Any insight is appreciated! Thanks.
I compile from source every time. It's not hard to corral the mentioned issues with manual compilation. For example, my ./configure settings are saved to a file which is version-controlled, so when a new version of PHP is stable and I am ready to make the switch, I download and extract the new source, then run this command:
./configure `sh /path/to/my/configure/php.sh`
Not too difficult. And because it's in version control, I can add notes as to why a module was added or removed.
Another benefit of manual compilation is that it allows me to keep the PHP footprint as minimal as possible: I pass the --disable-all flag, then add the modules I need. However, there is a downside to this minimalist approach: recently I needed to install Magento, so I had to recompile with the --enable-hash and --with-mcrypt flags. Even though I needed new flags, it wasn't difficult to add them to the configure file and recompile.
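For illustration, the configure file is nothing more than a small script that echoes the flags; the contents below are an invented example rather than my actual list:

    #!/bin/sh
    # php.sh - prints the configure flags for this server's PHP build.
    # Kept in version control so each added/removed module can be annotated.
    echo "--prefix=/usr/local/php" \
         "--disable-all" \
         "--enable-hash" \
         "--with-mcrypt" \
         "--with-curl" \
         "--with-mysqli"

The ./configure line above then picks these flags up through the backticks and word splitting.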
Compiling from source has a few quirks:
There are hundreds of configure parameters and flags, and you might not know the optimal ones to use.
If you rely on apt-get's PHP, you can be assured that you will get the latest patches and security updates, provided you set up auto-upgrades on your server.
The php.ini configuration varies a lot; sometimes your OS decides some defaults for you that may work better with the rest of the system.
Installing extensions like xdebug or other packages is a lot easier with apt.
However, it's worth compiling PHP from scratch if you want to learn. Also, if you don't use some portions of it, you can always disable them at configure time - but then again, it might not make much difference to performance.
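As a concrete example of the extension point, installing PHP plus an extension through apt is usually a one-liner; the package names below are typical for Debian/Ubuntu of that era and may differ on your distribution:

    # Package names vary by distro and PHP version.
    sudo apt-get update
    sudo apt-get install php5 php5-gd php5-curl php5-xdebug

    # Confirm the extension was picked up:
    php -m | grep -i xdebug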
I compiled PHP for specific needs only, when one or more of the following applied:
very small hard disk space, so a minimalist PHP build was required;
I needed only a few specific modules or extensions;
it was needed for one specific application;
I needed to optimize performance: compiling on the machine where PHP is actually used allows some performance improvements, if you use compile options to get a version really tuned for your system;
I needed multiple, different PHP versions on the same machine (see the sketch below);
I had a stripped-down Linux distro, such as one with only a BusyBox, so there was no option other than compiling.
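For the multiple-versions case, the usual trick is to give each build its own prefix so the installations never collide; a rough sketch with made-up version numbers and paths:

    # Each version gets its own prefix, so several builds can coexist.
    cd php-5.6.x-src
    ./configure --prefix=/opt/php-5.6 && make && sudo make install

    cd ../php-7.0.x-src
    ./configure --prefix=/opt/php-7.0 && make && sudo make install

    # Pick the version per script, cron job or vhost by full path:
    /opt/php-5.6/bin/php -v
    /opt/php-7.0/bin/php -v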
But for common usage, say 80% of cases, it's not worth spending time compiling; you are better off using the repository version. That said, I learned a lot by compiling.
Personally, I think it's a matter of opinion. If you are in a hurry, apt-get it; if you have time to learn and possibly need to reinstall 20 times... compile it.
There are tons of guides out there for compiling PHP. It has a ton of configuration flags, especially for GD and other libraries. Personally, if this is for learning and development, just get a LAMP stack or use apt-get... especially if you need to use Apache.
I feel the primary reason for compiling is to have the latest binary (stable or nightly); package managers in most distros are often annoyingly slow in this respect.
The other reason is that it is a very common problem that production systems are not wholesale-upgraded through the package manager, even when that would be easy: package managers create dependency chains, and you may not want to upgrade those other items. So, to pick up just one item, compiling is an option; it keeps everything else as it is. You of course always have to study the upgrade issues and make sure nothing else will fail.
After compiling PHP from source, are the devel libraries still needed?
For example, I am building a newer version of PHP from source than the one on our dev servers. I installed a lot of [extension, i.e. mysql, postgresql, curl, etc.]-devel packages in order for the configure step from the dev server setup to work. Do I still need these after PHP has compiled? For example, could I make a build and then distribute that PHP build to another server without needing these devel dependencies?
I am a bit of a noob at this.
You don't need to ship the devel libraries.
But my advice is to take some time and learn how the package build system of your Linux distribution works, and then build a new PHP package that can be installed by the package manager.
Take a look at how the "original" PHP packages were built for the distribution. Most likely you can simply copy and edit the existing rule file(s) and then make a new version of that package. This way you take advantage of the dependency mechanisms, and the package manager will not remove or overwrite your version so easily when an update shows up in the "official" repositories.
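On a Debian/Ubuntu-style system, for instance, that workflow looks roughly like this (the php5 package name is just an example; RPM-based distros have an equivalent rpmbuild/spec-file workflow):

    # Fetch the distro's own packaging for PHP plus its build dependencies.
    apt-get source php5
    sudo apt-get build-dep php5

    # Edit debian/rules (configure flags) and debian/changelog (bump the
    # version), then rebuild the packages:
    cd php5-*/
    dpkg-buildpackage -us -uc

    # The resulting .deb files end up in the parent directory and can be
    # installed with dpkg -i on the target server.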