In my project I have two databases, and propel-build-model is already set up to work with both of them (see Multiple databases support in Symfony).
If I make changes to either of the databases, I need the propel-build-schema command to rebuild the schemas for both.
I know I can do this manually by amending my settings per schema, but is it possible to create both at the same time? If so, how can I adjust my propel.ini file to have both connections?
I am currently using Symfony 1.0
The propel-build-schema command uses the settings in the propel.ini file, which can store the configuration of only a single connection. However, you can probably do what you want by keeping a copy of propel.ini under a different name for each connection, then writing a simple script that swaps the right copy into place as propel.ini, invokes propel-build-schema, and repeats for the second connection (the script may also need to rename the generated schema.yml files). It should be simple to do; then, whenever you want the schema files regenerated, just run the script.
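For illustration, here is a rough sketch of such a helper written as a small PHP CLI script (a plain shell script works just as well). The file names and the assumption that the per-connection settings live under config/ are mine; adjust them to your project.

    <?php
    // build_schemas.php -- regenerate schema files for both connections.
    // Assumes you keep one propel.*.ini per connection next to propel.ini.

    $configDir = __DIR__ . '/config';

    $connections = array(
        'db1' => array('ini' => 'propel.db1.ini', 'schema' => 'schema.db1.yml'),
        'db2' => array('ini' => 'propel.db2.ini', 'schema' => 'schema.db2.yml'),
    );

    foreach ($connections as $name => $files) {
        // Put the right settings in place, then let the task read them.
        copy("$configDir/{$files['ini']}", "$configDir/propel.ini");
        passthru('php symfony propel-build-schema');

        // Keep the generated schema under a connection-specific name.
        rename("$configDir/schema.yml", "$configDir/{$files['schema']}");
    }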
I'm using PhpStorm to manage many of my client Laravel PHP projects, however as these projects have become larger it's become important to maintain and also show version information inside the system I'm developing.
To do this I'm using the .env file to hold a config called APP_VERSION.
What I would like to do is whenever a file is saved inside PhpStorm - any file - I want to automatically update the value (by incrementing) for this configuration variable.
However I can't seem to find a way to do this, or if it's even possible.
I have looked at File Watchers but this doesn't seem to solve the problem I'm having in that it only allows certain file types and even then it wants to run a compiler (such as a Sass or Less compiler, etc.) which is certainly not what I'm after.
Is there any way to do this, without me having to manually increment the minor version number with each file I update?
Update
So by using an adaptation of the answer by #LazyOne I created a small script which executes the necessary text update to the file in question.
I used a custom File Watcher which was set up to watch changes to any file type, with a custom scope created to exclude certain files and folders that I didn't want watched.
End result: when I save a file that is within a valid scope, the script I wrote runs and updates my file with the necessary change.
PhpStorm (like the other IDEA-based IDEs) does not have such functionality. There are some tickets asking for something similar, but even if implemented it would definitely not cover your scenario (the use case there is different).
You will have to do it "manually". By this I mean:
either actual manually editing the file (as in your question)...
or automate it by writing some script (e.g. in PHP) that would do it for you and then execute it either on demand (e.g. via External Tools or manually) or via a File Watcher (a File Watcher is, simply speaking, an External Tool that runs on file modification -- you can run any program there, even your own shell/batch script).
Such a script will open your .env file, find the right line, and edit the value there -- a little bit of file parsing, nothing super heavy.
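For illustration, a minimal sketch of such a script, assuming .env contains a line like APP_VERSION=1.0.42 and you want to bump the last number on every run:

    <?php
    // bump_version.php -- increment the patch part of APP_VERSION in .env.
    $envPath  = __DIR__ . '/.env';               // adjust to your project layout
    $contents = file_get_contents($envPath);

    $contents = preg_replace_callback(
        '/^APP_VERSION=(\d+)\.(\d+)\.(\d+)$/m',
        function ($m) {
            return sprintf('APP_VERSION=%d.%d.%d', $m[1], $m[2], $m[3] + 1);
        },
        $contents
    );

    file_put_contents($envPath, $contents);

Hook it up as the program of your File Watcher (or as an External Tool) and it will run on every save.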
Hint: Laravel has artisan, and one of its commands is key:generate (which alters the .env file -- it is run at least once, when you create a new Laravel app). You may do it in a similar fashion -- write your script as an artisan command (so that you have the full power of Laravel behind it) and just call it when needed (from a File Watcher or whatever).
P.S. Instead of editing the .env file (which is meant for environment-specific settings, while version info is more of a global one), why not store this in a custom config file (e.g. config/version.php) that contains only that info? It would be much easier to alter, or even regenerate from scratch from a template, since it has a much simpler structure and there is no other existing data to preserve (which you have to do with .env files).
At the end of the day -- there is no big difference for you/programmers between calling env('APP_VERSION') and config('version.app_version') to get that info in your app.
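For illustration, config/version.php could be as small as this (the key name is just an example):

    <?php
    // config/version.php -- single-purpose config file holding version info only.
    return [
        'app_version' => '1.4.23',
    ];

and then config('version.app_version') returns that value anywhere in the app.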
I'm not sure if that's possible directly within PHPStorm. You could, however, write yourself a small PHP script which monitors the specified directory for file changes and when one happens your script can make the necessary changes in the file you want.
Try looking at inotify for more info.
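For illustration, a rough sketch of such a watcher using the PECL inotify extension (an assumption: the extension must be installed, and this would run as a standalone CLI script outside PhpStorm). It watches a single directory; watching subdirectories needs one watch per directory.

    <?php
    // watch_and_bump.php -- react to file saves in a project directory.
    $projectDir = '/path/to/project';            // illustrative path

    $fd = inotify_init();
    inotify_add_watch($fd, $projectDir, IN_CLOSE_WRITE);   // a file was saved

    while (true) {
        // By default this blocks until at least one event is available.
        foreach (inotify_read($fd) as $event) {
            // ... bump APP_VERSION in .env here (see the script idea above) ...
        }
    }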
I'm trying to query environment variables that I have set in my .bashrc file (I'm running Ubuntu 14.04) from a PHP script that runs under an Apache server.
When I call getenv('MY_VAR_NAME') or read $_ENV['MY_VAR_NAME'] while accessing the page, it seems that those variables are missing.
My guess is: when the script is being executed the user is www-data, so 'MY_VAR_NAME' is not accessible.
Are there any procedures / best practices for this kind of problem?
Thanks
My guess is: when the script is being executed the user is www-data, so 'MY_VAR_NAME' is not accessible.
Your guess is correct. :)
Are there any procedures / best practices for this kind of problem?
What most programmers would do is have a configuration file containing these variables. You would store the configuration file somewhere the PHP scripts can get to it -- whether that's in a home directory somewhere, in /etc, or in some other place such as the web root or a directory near it.
Different frameworks take different approaches to the format of the configuration files -- some are PHP scripts, some are YAML files, some are windows/DOS format INI files, some are XML, some are JSON, etc.
Personally I like the idea of doing this (a rough sketch follows the list):
Store in a simple INI file only the configuration required to access the database.
Store all of the remaining configuration in a database table, and build an editor for the parts that you need to be able to edit.
Cache the contents of the configuration database table in memcached.
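For illustration, a rough sketch of that setup. The INI layout, table name and cache key are all made up, and it assumes the PDO and Memcached extensions are available.

    <?php
    // config.ini holds only the database credentials, e.g.:
    //   [database]
    //   host = "localhost"
    //   name = "myapp"
    //   user = "myapp"
    //   pass = "secret"
    $db  = parse_ini_file(__DIR__ . '/config.ini', true);
    $db  = $db['database'];
    $pdo = new PDO("mysql:host={$db['host']};dbname={$db['name']}", $db['user'], $db['pass']);

    // Everything else lives in a `settings` table, cached in memcached.
    $memcached = new Memcached();
    $memcached->addServer('127.0.0.1', 11211);

    $settings = $memcached->get('app_settings');
    if ($settings === false) {
        $settings = $pdo->query('SELECT name, value FROM settings')
                        ->fetchAll(PDO::FETCH_KEY_PAIR);
        $memcached->set('app_settings', $settings, 300);   // cache for 5 minutes
    }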
Whichever method you use is up to you, however, and will vary depending on the needs of your application.
My project is a collection of PHP scripts using MySQL as a database and needs to be installed locally using WAMP/LAMP/MAMP.
Previously I've been sending the users a link to a zipped archive and having them overwrite their installation, but since I took the plunge to GitHub, I've realized that there are far better ways, namely Service Hooks in GitHub. However, that only works as long as I don't alter the database in any way, which is a real possibility.
I've been toying with the idea of how I would implement this, but I can't find a clear solution. So far I've concluded that I need a directory (say update/) which contains .sql files for each update. The PHP script will then check said directory for a file corresponding with the new version number (not sure how I will define a version number; I was thinking of using the commit ID, but that won't be available until after the commit, so...).
I would love some input on this!
Here's how I would tackle this (not the most elegant or performant approach); a rough sketch follows the steps:
Add a flag in the DB with a version number
Add a min-version number in your DB layer PHP file
Check that the DB version is greater than the min-version
If it is: continue about your business
Else: Run the PHP file in update/ which would have a series of ALTER TABLE commands to be run on the DB server
Update the min-version number in the DB to the latest number
All done
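For illustration, a rough sketch of those steps (table, column and file names are made up):

    <?php
    // db_version_check.php -- run early in your DB layer.
    define('MIN_DB_VERSION', 5);   // the schema version this release expects

    $pdo = new PDO('mysql:host=localhost;dbname=myapp', 'user', 'pass');

    $current = (int) $pdo->query('SELECT version FROM db_version')->fetchColumn();

    if ($current < MIN_DB_VERSION) {
        for ($v = $current + 1; $v <= MIN_DB_VERSION; $v++) {
            // Each update/<n>.php runs its ALTER TABLE statements against $pdo.
            require __DIR__ . "/update/{$v}.php";
            $pdo->exec("UPDATE db_version SET version = {$v}");
        }
    }
    // Otherwise: continue about your business.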
Alternatively, instead of querying the DB, you can have a file which is generated by your DB interface PHP file (and ignored with .gitignore) and which you can check just as above.
I would really recommend checking out Doctrine and its migration feature.
This does exactly what you are looking for, plus you get a very nice tool for working with all other aspects of your database handling.
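For illustration, a Doctrine migration is roughly a class like this (namespaces and the base class differ between doctrine/migrations versions; this follows the current layout, and the table/column are made up):

    <?php
    declare(strict_types=1);

    namespace DoctrineMigrations;

    use Doctrine\DBAL\Schema\Schema;
    use Doctrine\Migrations\AbstractMigration;

    final class Version20240101120000 extends AbstractMigration
    {
        public function up(Schema $schema): void
        {
            $this->addSql('ALTER TABLE users ADD last_login DATETIME DEFAULT NULL');
        }

        public function down(Schema $schema): void
        {
            $this->addSql('ALTER TABLE users DROP COLUMN last_login');
        }
    }

The migration tool records which versions have already been applied, so running it brings any copy of the database up to date.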
OK, this might seem like a bad idea or an obvious one. But let's imagine a CMS like PHPBB, and let's imagine you'd build one. I'd create just one file called PHPBB.install.php; running it would create all the needed folders and files with PHP. I mean, the user runs it just once and every file and folder of the app is created via that PHP file.
Why to do this?
Well, mostly because it's cleaner and you can be pretty sure it creates everything as you wish (obviously checking everything about the server first). Also, having all the files backed up inside one file would let you restore the app very easily by deleting everything and reinstalling it by running PHPBB.install.php again. Backing up files like this would also allow you to prevent errors: how? When an error occurs in a file, that file is restored as it was and automatically re-run.
It would be too heavy!
The installation would happen only once, and you'd be sure the user will not forget to place the files correctly. The error prevention would be worth the cost, and it too would happen only once.
Now the questions:
Does this technique exist? If so, what's its name?
Why would you discourage it?
As others have said, an installer.
It requires the web server to have permission to write to the filesystem, and ends up having the files owned by the user the web server runs as. Even when one has the ability to change filesystem permissions, it's usually a longer process than just extracting an archive and having the initial setup verify permissions.
Does this technique exist? If so, what's its name?
I'd advise reading about __halt_compiler(). It allows you to mix PHP code with non-PHP data that is not parsed, so you can have PHP code (the "installer") and binary data (e.g. the compressed contents of all the files) in a single PHP file.
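For illustration, a minimal sketch of a self-extracting installer built this way (the payload file name is made up; the archive bytes would be appended after the __halt_compiler(); call):

    <?php
    // PHPBB.install.php -- everything after __halt_compiler(); is raw data.
    $fp = fopen(__FILE__, 'rb');
    fseek($fp, __COMPILER_HALT_OFFSET__);            // jump past the PHP code
    file_put_contents('payload.tar.gz', stream_get_contents($fp));
    fclose($fp);

    // From here: verify the server environment, extract the archive,
    // create folders, set permissions, and so on.
    __halt_compiler();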
1 - Yes, there is a single install file in PHPBB. You run through an online wizard defining your settings and then it installs automatically.
http://www.phpbb.com/support/documents.php?mode=install&version=3&sid=908f5766fc04868ccb985c1b1e6dee4b#quickinstall
2 - The only reason to discourage it would be if you want the user to understand exactly how the system works. Automatically installing it means the user has no need to understand the nitty gritty of it all - of course, many see this as a good thing.
I'm writing a CMS in PHP+MySQL. I want it to be self-updatable (through one click in the admin panel). What are the best practices?
How do I compare the current version of the CMS with the version of an update (for both the application itself and the database)? Should it just download a zip archive, unzip it and overwrite the files? (But then what to do with files that are no longer used?) How do I check whether an update was downloaded correctly? It also supports modules, and I want these modules to be downloadable from the admin panel of the CMS.
And how should I update MySQL tables?
Keep your code in a separate location from configuration and otherwise variable files (uploaded images, cache files, etc.)
Keep the modules separate from the main code as well.
Make sure your code has file system permissions to change itself (use SuPHP for example).
If you do these, simplest would be to completely download the new version (no incremental patches), and unzip it to a directory adjacent to the one containing the current version. Because there won't be variable files inside the code directory, you can just remove or rename the old one and rename the new one to replace it.
You can keep the version number in a global constant in the code.
As for MySQL, there's no other way than making an upgrade script for every version that changes the DB layout. Even automatic solutions to change the table definition can't know how to update the existing data.
A slightly more experimental solution could be to use something like the phpsvnclient library.
With features:
List all files in a given SVN repository directory
Retrieve a given revision of a file
Retrieve the log of changes made in a repository or in a given file between two revisions
Get the repository latest revision
This way you can see if there are new files, removed files or updated files and only change those in your local application.
I reckon this will be a little harder to implement, but the benefit would probably be that it is easier and quicker to add updates to your CMS.
You have two scenarios to deal with:
The web server can write to files.
The web server can not write to files.
This just dictates whether you will be decompressing a ZIP file or using FTP to update the files. In either case, your first step is to take a dump of the database and a backup of the existing files, so that the user can roll back if something goes horribly wrong. As others have said, it's important to keep anything that the user is likely to customize out of the scope of the update. Wordpress does this nicely. If a user has made changes to core logic code, they are likely smart enough to resolve any merge conflicts on their own (and smart enough to know that a one-click upgrade is probably going to lose their modifications).
Your second step is to make sure that your script doesn't die if the browser is closed. This is a process that really should not be interrupted. You could accomplish this via ignore_user_abort(true);, or some other means. Or, if you like, allow the user to check a box that says "Keep going even if I get disconnected". I'm assuming that you'll be handling errors internally.
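For illustration, that step is only a couple of lines (the upgrade work itself is elided):

    <?php
    ignore_user_abort(true);   // keep running even if the client disconnects
    set_time_limit(0);         // a long upgrade should not hit the time limit

    // ... back up, download, unpack, apply SQL changes, etc. ...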
Now, depending on permissions, you can either:
Compress the files to be updated to the system /tmp directory
Compress the files to be updated to a temporary file in the home directory
Then you are ready to:
Download and decompress the update in situ, i.e. in place
Download and decompress the update to the system's /tmp directory and use FTP to update the files in the web root
You can then:
Apply any SQL changes as needed
Ask the user if everything went OK
Roll back if things went badly
Clean up your temp directory in the system /tmp directory, or any staging files in the user's web root / home directory.
The most important aspect is making sure you can roll back changes if things went bad. The other thing to ensure is that if you use /tmp, be sure to check permissions of your staging area. 0600 should do nicely.
Take a look at how Wordpress and others do it. If your choice of licenses and theirs agree, you might even be able to re-use some of that code.
Good luck with your project.
There is a SQL library called SQLOO (that I created) that attempts to solve this problem. It's a little rough still, but the basic idea is that you setup the SQL schema in PHP code and then SQLOO changes the current database schema to match the code. This allows for the SQL schema and attached PHP code to be changed together and in much smaller chunks.
http://code.google.com/p/sqloo/
http://code.google.com/p/sqloo/source/browse/#svn/trunk/example <- examples
Based on experience with a number of applications, CMS and otherwise, this is a common pattern:
Upgrades are generally one-way. It's possible to take a snapshot of full system state for a restore upon failure, but to restore usually entails losing any data/content/logs added to the system since the upgrade. Performing an incremental rollback can put data at risk if something were not converted properly (e.g. database table changes, content conversions, foreign key constraints, index creation, etc.) This is especially true if you've made customizations that rollback scripts couldn't possibly account for.
Upgrade files are packaged with some means of authentication/verification, such as MD5 or SHA-1 hashes and/or a digital signature, to ensure they came from a trusted source and were not tampered with. This is particularly important for automated upgrade processes; suppose a hacker exploited a vulnerability and pointed the updater at a rogue source. (A minimal verification sketch follows this list.)
Application should be in an offline mode during the upgrade.
Application should perform a self-check after an upgrade.
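For illustration, a minimal sketch of the verification point above; the file names are made up, and the expected hash would come from a trusted source (e.g. signed metadata from your update server):

    <?php
    $package      = '/tmp/cms-upgrade.zip';
    $expectedHash = trim(file_get_contents('/tmp/cms-upgrade.zip.sha256'));

    if (!hash_equals($expectedHash, hash_file('sha256', $package))) {
        exit('Upgrade package failed verification -- refusing to install.');
    }

    // Safe to unpack and apply the upgrade from here.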
I agree with Bart van Heukelom's answer, it's the most usual way of doing it.
The only other option would be to turn your CMS into a bunch of remote Web Services/scripts and external CSS/JS files that you host in one location only.
Then everyone using your CMS would connect to your central "CMS server" and all that would be on their (calling) server is a bunch of scripts to call your Web Services/scripts that do all the processing and output. If you went down this route you'd need to identify/authenticate each request so that you returned the corresponding data for the given CMS user.