Anyone who works with cached client systems knows that sometimes you have to update both server and client files. So far I've partially solved the problem by making one call every time the software is opened, asking PHP which version the server is on. I compare the result with the version the Flex client is on, and voilà. The problem is that whenever I need to push an emergency update during business hours, it's impossible to know how many clients already have the old Flex version open.
So to sum up: the cache problem I solved by checking the version at start-up time; if the browser has cached an old build, its version won't match the server's.
The only solution I can think of for the 'already opened app' problem is to put a gateway between the PHP services and the Flex calls, where I would pass the Flex version and compare it inside the gateway before the service is actually called, although I don't like this solution.
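If I did go that route, the gateway would be something like the sketch below. The endpoint name, the version constant and the "clientVersion"/"service" parameters are just placeholders for whatever the Flex client would actually send:

```php
<?php
// gateway.php - hypothetical entry point that every Flex service call goes through.
// The client is assumed to send its build number in the "clientVersion" parameter.

const CURRENT_VERSION = '2.4.1'; // bumped with every deploy

$clientVersion = $_POST['clientVersion'] ?? '';

if ($clientVersion !== CURRENT_VERSION) {
    // Tell the already-open client that it is stale and must reload.
    http_response_code(409);
    echo json_encode([
        'error'         => 'version_mismatch',
        'serverVersion' => CURRENT_VERSION,
        'clientVersion' => $clientVersion,
    ]);
    exit;
}

// Versions match: forward the request to the real service.
$service = basename($_POST['service'] ?? ''); // e.g. "orders" -> services/orders.php
require __DIR__ . '/services/' . $service . '.php';
```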
Any ideas?
Thanks.
You can download this application from the Adobe website: http://labs.adobe.com/technologies/airlaunchpad/ It will let you build a new test app, and you need to select the "auto update" property in the menu. That will generate all the necessary files for you, both for the server and the client.
The end result is a server-based XML file, plus code in each client app that checks on a recurring basis whether the XML file offers a newer version of the application and, if so, automatically downloads and installs it. You can change the "check for update" frequency to your liking in the source code; by default it is tied to the application open event.
Because this check also runs while the app is open, it should solve your problem.
We're currently developing a 'sort of' e-commerce platform for our customers that are using our POS system.
This mainly consists of:
An Angular client-side
A PHP API as back-end
A MySQL database
Before I distribute the application to clients, I want to have a 'manageable' system for deploying and updating their platforms in case of code changes etc.
The initial setup would be:
Create database
Copy PHP files
Run composer
Run migrations
Modify the configuration file for database credentials, salts, domain, etc.
Copy client side files
I was looking at Deployer for PHP, but I'm not sure how the whole database creation and config file modification would work. I originally had the database creation in one of my migrations, but this would require a root DB user (or one with CREATE permissions), and that user would need to be created as well.
The initial setup part could be done manually (it's not like there will be more than 5 installations per week or so, but I would like to make it as simple as possible so that our support team can do it instead of me every time).
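From what I understand of Deployer so far, something like the sketch below is roughly what it would look like; the host name, repository, paths and file names are guesses on my part, not a finished recipe:

```php
<?php
// deploy.php - rough Deployer sketch; host names, paths and file names are placeholders.
namespace Deployer;

require 'recipe/composer.php';

set('repository', 'git@git.example.com:acme/platform.git');
// Kept outside the release directories, so a new deploy never overwrites credentials.
set('shared_files', ['config/config.php']);
set('shared_dirs', ['storage']);

host('client1.example.com')
    ->set('deploy_path', '/var/www/platform');

// One-off install task, run once per new client: copies the config template into the
// shared dir so support only has to fill in the credentials afterwards.
// Database creation could be a similar one-off task, run with a DB user that has CREATE rights.
task('install:config', function () {
    run('cp {{current_path}}/config/config.example.php {{deploy_path}}/shared/config/config.php');
});

// Run migrations on every deploy, with whichever migration runner the project uses.
task('deploy:migrate', function () {
    run('cd {{release_path}} && php migrate.php');
});
after('deploy:vendors', 'deploy:migrate');
```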
The next part would be Updates.
I don't want to FTP to every server and apply changes. Updates can be both server side and client side. What would be the best way to do this:
Have a central system at our end with all versions and registered websites, and let each client server check daily for a new version. If there is a new version, download all the files from our server and run the migrations.
Push the new version to all clients via Deployer. But wouldn't this overwrite or move the original config file with the DB credentials etc. when deploying the new version?
And what if I need to add a new config setting? (Application settings are stored in the database, but things like the 'API' settings live in a config file.)
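For that last point, one pattern I'm considering is to ship defaults with every release and keep only the per-client values in a local file that is never deployed, merging the two at runtime. A sketch, with made-up file names:

```php
<?php
// config.php loader sketch: merge the shipped defaults with the per-client overrides,
// so adding a new setting to config.defaults.php never requires touching client servers.
$defaults = require __DIR__ . '/config.defaults.php'; // shipped with every release
$local    = file_exists(__DIR__ . '/config.local.php')
    ? require __DIR__ . '/config.local.php'            // per-client credentials, never overwritten
    : [];

return array_replace_recursive($defaults, $local);
```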
There is a chance that all these client servers will be hosted by our hosting provider, so we'll have access to all of them and they'll all be configured the same way.
I've only written web applications that run in one (server) location, so updating those was easy, for example via Deploybot, and the database setup was done manually. Now I'm stepping up my game and I want to make sure I don't give myself more work than necessary.
Here's our case on developing an e-commerce platform - maybe you'll find answers to your questions there.
Codenetix specializes in custom development, mostly web apps, so if you need help - let us know.
Good luck with your project!
I'm developing a web app using Laravel (a PHP framework). The app is going to be used by about 30 of my co-workers on their Windows laptops.
My co-workers interview people on a regular basis. They will use the web app to add a new profile to a database once they interview somebody for the first time and they will append notes to these profiles on subsequent visits. Profiles and notes are stored using MySQL, but since I'm using Laravel, I could easily switch to another database.
Sometimes, my co-workers have to interview people when they're offline. They might visit a group of interviewees, add a few profiles and add some notes to existing ones during a session without any internet access.
How should I approach this?
1. With a local web server on every laptop. I've seen applications ship with some kind of installer including a LAMP stack, but I can't find any documentation on this.
2. I could install the app and something like XAMPP on every laptop myself. That would be possible, but in the future more people might use the app and not all of them might be located nearby.
3. I could use Service Workers, maybe in connection with a library such as UpUp. This seems to be the most elegant approach.
I'd like to give option (3) a try, but my app is database driven and I'm not sure whether I could realize this approach:
Would it be possible to write all the (relevant) data from the DB to - let's say - a JSON file which could be accessed instead of the DB when in offline mode? We don't have to handle much data (less than 100 small data records should be available during an interview session).
When my co-workers add profiles or notes in offline mode, is there any "web service" way to insert the data they entered into the DB once they're back online?
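For the export side, something like the sketch below is roughly what I have in mind; "Profile", its "notes" relation and the route are made-up names for whatever the real schema would use:

```php
<?php
// routes/web.php (rough sketch) - model, relation and route names are placeholders.

use App\Profile;
use Illuminate\Support\Facades\Route;

Route::get('/offline-export', function () {
    // Fewer than 100 small records are needed per session, so a single JSON dump is fine.
    $profiles = Profile::with('notes')->get();

    // Write a snapshot the service worker (UpUp etc.) can cache for offline use...
    file_put_contents(public_path('offline/profiles.json'), $profiles->toJson());

    // ...and also return it directly for the online case.
    return response()->json($profiles);
});
```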
Thanks
Pida
I would think of it as building the app in "two parts".
First, the front end makes ajax calls to the back end (which is nothing more than a REST API). If there isn't any network connection, store the data in the browser using local storage.
When the user later has a network connection, you can send the data held in local storage to the back end and then clear local storage.
If you add web servers on the laptops, the databases and data will live only on each individual laptop and won't be synced.
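Since you're on Laravel anyway, the receiving end could be a simple sync endpoint along these lines. The model, route and field names here are placeholders; the client-generated UUID is just one way to make re-sending the same queued record safe:

```php
<?php
// routes/api.php (sketch) - endpoint that receives the records queued in local storage.

use App\Note;
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Route;

Route::post('/sync/notes', function (Request $request) {
    $saved = [];

    foreach ($request->input('notes', []) as $note) {
        // A client-generated UUID lets the same queued note be re-sent without duplicates.
        $saved[] = Note::updateOrCreate(
            ['client_uuid' => $note['client_uuid']],
            ['profile_id' => $note['profile_id'], 'body' => $note['body']]
        )->id;
    }

    return response()->json(['synced' => $saved]);
});
```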
You can build what you describe using service workers to cache your site's static content to make it available offline, and a specific fetch handler in the service worker to detect a failed PUT or POST and queue the data in IndexedDB. You'd then periodically check IndexedDB for any queued data when your web app is loaded, and attempt to resend it.
I've described this approach in more detail at https://developers.google.com/web/showcase/case-study/service-workers-iowa#updates-to-users-schedules
That article assumes the use of the sw-precache library for caching your site's static assets and the sw-toolbox library to provide runtime fetch handlers that check for failed business-logic requests. It also uses a promise-based IndexedDB wrapper called simpleDB, although I'd probably go with the more recent idb library nowadays.
I am working on a PHP product application which will be deployed on several client servers (say 500+). My client's requirement is that when a new feature is released in this product, we need to push all the changes (files & DB) to the clients' servers using FTP/SFTP. I am a bit concerned about transferring files over FTP to 500+ servers at a time, and I am not sure how to handle this kind of situation. I have come up with some ideas, like:
When the user (product admin) clicks update, we send an ajax request which updates the first 10 servers and returns the count of remaining servers. Based on the ajax response, we send another request for the next 10, and so on.
OR
Create a cron job which runs every minute and checks whether any update is active, then updates the first 10 active servers. Once it completes the transfer for a server, it sets that server's status to 0.
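For the cron approach, roughly what I have in mind is the sketch below; the table and column names and the rsync target are placeholders:

```php
<?php
// cron_push.php (sketch) - run every minute from cron; pushes the pending release to at
// most 10 client servers per run. Table/column names and the rsync target are placeholders.

$pdo = new PDO('mysql:host=localhost;dbname=product', 'deploy', 'secret');

$servers = $pdo->query(
    "SELECT id, host FROM client_servers WHERE update_pending = 1 LIMIT 10"
)->fetchAll(PDO::FETCH_ASSOC);

foreach ($servers as $server) {
    // Any transfer mechanism works here (SFTP, rsync over SSH, triggering a git pull, ...).
    $target = escapeshellarg('deploy@' . $server['host'] . ':/var/www/product/');
    exec("rsync -az ./release/ $target", $output, $exitCode);

    if ($exitCode === 0) {
        // Mark this server as done so the next cron run picks up the following batch.
        $pdo->prepare("UPDATE client_servers SET update_pending = 0 WHERE id = ?")
            ->execute([$server['id']]);
    }
}
```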
I just want to know: is there any other method for doing this kind of task?
Add the whole codebase to a version control system like Git and push all the current files to the repository you created. Then write a cron job that automatically pulls the repository onto the server, and install that cron job on every server.
In the future, if you want to add a new feature, just pull the repository, add the feature and push the code again; it will then be pulled automatically by the cron job on every server where you installed it.
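A minimal sketch of such a cron-driven pull script; the paths, branch and the migrate.php step are assumptions to adapt to your setup:

```php
<?php
// update.php (sketch) - called from cron on every client server, e.g.:
//   */10 * * * * php /var/www/app/update.php >> /var/log/app-update.log 2>&1
// Paths, branch name and the migration step are assumptions.

chdir('/var/www/app');

exec('git pull --ff-only origin master 2>&1', $output, $exitCode);
echo date('c') . ' git pull: ' . implode("\n", $output) . "\n";

if ($exitCode === 0) {
    // Run any database migrations shipped with the new code.
    exec('php migrate.php 2>&1', $migrateOutput, $migrateCode);
    echo implode("\n", $migrateOutput) . "\n";
}
```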
First, I would like to provide some suggestions and insight into your suggested methods:
In both methods, you'll have to keep a list of all the servers where your application has been deployed and keep track of whether the update has been applied to each one or not. That can become difficult if in the future you want to scale from 500+ to, say, 50,000+.
You are also not considering the case where the target server might not be reachable at the time you send the update request.
I suggest that instead of pushing the update from your end to the target server, you do it in the opposite direction. Since, as you said, you are developing an entire PHP application to be deployed on client servers, I suggest you build an update module into it. The target server can send a request to your servers at a designated time to check whether an update is available. If there is, I suggest one of the following two ways to proceed:
You send an update list providing the names and paths of the files to be updated, along with any DB changes, and the client side processes it accordingly.
You just send a response saying an update is available, and a separate process on the client server then downloads all the files and DB changes from your server.
For maintaining concurrency of updates you can implement a Token System, or can rely on the Time-stamp at which the update happened.
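A rough sketch of what the check on the client server could look like, using the timestamp idea to avoid applying the same update twice. The URL, response fields and the local state file are assumptions, not a real API:

```php
<?php
// check_update.php (sketch) - runs on each client server at a designated time via cron.

$stateFile = __DIR__ . '/last_update.json';
$state = is_file($stateFile)
    ? json_decode(file_get_contents($stateFile), true)
    : ['applied_at' => 0];

// Ask the vendor server whether anything newer than the last applied update exists.
$response = json_decode(file_get_contents(
    'https://updates.example.com/api/latest?since=' . $state['applied_at']
), true);

if (!empty($response['update_available'])) {
    // First option above: the response lists the files (and DB changes) to fetch.
    foreach ($response['files'] as $file) {
        copy($file['url'], __DIR__ . '/' . $file['path']);
    }

    // Record the server-side timestamp so the same update is never applied twice.
    file_put_contents($stateFile, json_encode(['applied_at' => $response['released_at']]));
}
```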
I am working on a mobile application that communicates with an IIS server to synchronize data among application users.
The server is implemented in PHP and MySQL. The final product will consist of the server and the application. In other words, every client (company) is going to run its own server, and the employees of each company will be the users of the mobile application. As soon as the application is released, bugs are expected to come up, so each synchronization server will require updates. The DB schema and the PHP code will probably need to be altered. Using git to have clients fetch the most recent version of the server is not an option, since the clients are not able to handle issues such as merge conflicts.
I need to automate the update process as much as possible. Is there any tool or piece of advice that would help me do so?
Thank you in advance for your assistance.
For the MySQL part, I would suggest writing your own migrations (PHP scripts), which, if carefully tested, should handle the DB migrations correctly. The customers MUST be forbidden to modify the database, or you'll never be able to handle migrations correctly.
As for the PHP sync part, I really don't see what the problem with using git is - I think that's the right way to go. I don't understand your concern about conflicts, because the customers won't have to deal with them. When you merge branches you resolve the conflicts yourself, and after you push to the git server the clients only have to "pull" the new version.
So, to finish up, you should create a script that, when a new version is available, does a git pull and then executes the DB migration script (if any).
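A minimal sketch of such a hand-rolled migration runner; the DB credentials, table name and migrations directory are placeholders:

```php
<?php
// migrate.php (sketch) - applies any .sql files in ./migrations that have not run yet,
// in filename order, and records them so the customers' databases stay in a known state.

$pdo = new PDO('mysql:host=localhost;dbname=sync', 'app', 'secret');
$pdo->exec("CREATE TABLE IF NOT EXISTS schema_migrations (
    filename VARCHAR(255) PRIMARY KEY,
    applied_at DATETIME NOT NULL
)");

$applied = $pdo->query("SELECT filename FROM schema_migrations")->fetchAll(PDO::FETCH_COLUMN);

foreach (glob(__DIR__ . '/migrations/*.sql') as $path) {
    $name = basename($path);
    if (in_array($name, $applied, true)) {
        continue; // already applied on this customer's server
    }

    $pdo->exec(file_get_contents($path));
    $pdo->prepare("INSERT INTO schema_migrations (filename, applied_at) VALUES (?, NOW())")
        ->execute([$name]);
    echo "applied $name\n";
}
```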
I am in the final stages of completing my project (vizulium - an open-source photography CMS). I have one final stumbling block remaining: updating the software.
The idea I want to implement is this:
1. Check the newest version on the Vizulium website (the page just displays the current stable version).
2. If a newer version exists, and the user requests it:
a. Zip the updated files on Vizulium server
b. Download the files to the user's server
c. Unzip contents
I already have a tracking system in place that keeps track of the updates (datetime) that I push. I have not begun step 2. Everything is in PHP and MySQL.
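For the client side of step 2 (the zipping in 2a would happen on the Vizulium server), this is roughly what I have in mind; the URLs and version constant are placeholders:

```php
<?php
// update.php (rough sketch of step 2) - URLs and the version constant are placeholders.

const LOCAL_VERSION = '1.2.0'; // bumped on every successful update

// step 1: ask the Vizulium site for the current stable version
$latest = trim(file_get_contents('https://vizulium.example.com/latest-version.txt'));

if (version_compare($latest, LOCAL_VERSION, '>')) {
    // step 2b: download the zipped release to a temporary file
    $tmp = tempnam(sys_get_temp_dir(), 'vz');
    copy("https://vizulium.example.com/releases/vizulium-$latest.zip", $tmp);

    // step 2c: unzip it over the install directory
    $zip = new ZipArchive();
    if ($zip->open($tmp) === true) {
        $zip->extractTo(__DIR__);
        $zip->close();
    }
    unlink($tmp);
}
```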
Is this a typical implementation of the problem? Do I need to clarify anything?
I am not using FTP since it is a self-install and I assume the user is programming-illiterate.
Your solution is valid, but needs a few extra considerations.
You should connect to your server via HTTPS and with certificate verification to query and fetch any available updates.
You should sign your updates with a private key and have the client verify the updates as authentic before applying them.
If you need to remove an obsolete file from an install, unzipping will not do this, perhaps have an "upgrade.php" script in each upgrade that is executed to perform any extra necessary steps.
Your upgrade script should backup the web directory and database before performing the upgrade, and retain the backup until the user requests to remove it.
Make your upgrades incremental, so to upgrade from 1 -> 3, you need to upgrade to version 2 first. This would of course be transparent to the user, but it would ensure that the upgrades between versions are complete and all database updates/modifications are applied in the correct order.
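For the signing point above, a minimal verification sketch on the client side. It assumes each release ships as update.zip plus a detached signature (update.zip.sig) made with your private key, and that the public key is bundled with every install; all file names are placeholders:

```php
<?php
// verify_update.php (sketch) - check the package signature before applying an update.

$package   = file_get_contents('update.zip');
$signature = file_get_contents('update.zip.sig');
$publicKey = openssl_pkey_get_public(file_get_contents(__DIR__ . '/vendor-public.pem'));

$ok = openssl_verify($package, $signature, $publicKey, OPENSSL_ALGO_SHA256);

if ($ok === 1) {
    echo "Signature valid - safe to extract and run upgrade.php\n";
} else {
    echo "Signature invalid or corrupt - refusing to apply the update\n";
    exit(1);
}
```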