I am working on a mobile application that communicates with an IIS server to synchronize data among application users.
The server is implemented in PHP and MySQL. The final product will consist of the server and the application; in other words, every client (company) will use a different server, and the employees of each company will be the users of the mobile application. As soon as the application is released, bugs are expected to come up, so each synchronization server will require updates: the DB schema and the PHP code will probably need to be altered. Using git to have clients fetch the most recent version of the server is not an option, since the clients are not able to handle issues such as merge conflicts.
I need to automate the update process as much as possible. Is there any tool or piece of advice that would help me do so?
Thank you in advance for your assistance.
For the MySQL part, I would suggest writing your own migrations (PHP scripts), which, if carefully tested, should perform the DB migrations correctly. The customers MUST be forbidden from modifying the database, or you'll never be able to handle migrations correctly.
For the second part, the PHP sync, I really don't understand what the problem is with using git - I think that's the right way to go. I don't understand your concerns about conflicts, because the customers won't have to deal with them. When you merge branches, you deal with the conflicts yourself, and after you push to the git server, the clients will only have to "pull" the new version.
So, to finalize: you should create a script that, when a new version is available, does a git pull and then executes the DB migration script (if any).
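As a rough sketch of what such a script could look like (all names, paths and the migration format here are hypothetical, and SQLite stands in for MySQL so the idea is easy to try out):

```python
# Hypothetical client-side updater: pull the new code, then apply any DB
# migrations that have not run yet. SQLite stands in for MySQL here;
# in production you would use your MySQL driver instead.
import sqlite3
import subprocess

def applied_migrations(conn):
    conn.execute("CREATE TABLE IF NOT EXISTS schema_migrations (name TEXT PRIMARY KEY)")
    return {row[0] for row in conn.execute("SELECT name FROM schema_migrations")}

def run_pending(conn, migrations):
    """migrations: ordered list of (name, sql) pairs shipped with the code."""
    done = applied_migrations(conn)
    for name, sql in migrations:
        if name in done:
            continue  # already applied on a previous update
        conn.execute(sql)
        conn.execute("INSERT INTO schema_migrations (name) VALUES (?)", (name,))
        conn.commit()

def update(repo_dir, conn, migrations):
    # --ff-only: clients never merge; conflicts are resolved upstream.
    subprocess.run(["git", "-C", repo_dir, "pull", "--ff-only"], check=True)
    run_pending(conn, migrations)
```

Tracking applied migrations in a table is what makes the script safe to run on every update: migrations that have already run are skipped, so the client can simply execute it after every pull.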
We're currently developing a 'sort of' e-commerce platform for our customers that are using our POS system.
This mainly consists of:
An Angular client-side
A PHP API as back-end
A MySQL database
Before I distribute the application to clients, I want to have a 'manageable' system for deploying and updating their platforms in case of code changes etc.
The initial setup would be:
Create database
Copy PHP files
Run composer
Run migrations
Modify configuration file for database credentials, salts, domain,..
Copy client side files
I was looking at Deployer for PHP, but I'm not sure how the database creation and config-file modifications would work. I originally had the database creation in one of my migrations, but this would require a root DB user (or one with CREATE permissions), and that user would need to be created as well.
The initial setup could be done manually (it's not like there will be more than 5 installations per week or so), but I would like to make it as simple as possible so that our support staff can do this instead of me every time.
The next part would be Updates.
I don't want to FTP to every server and apply changes manually. Updates can be both server-side and client-side. What would be the best way to do this:
Have a central system at our end with all versions and registered websites, and let each client server check daily for a new version. If there is a new version, download all files from our server and run the migrations.
Push the new version to all clients via Deployer. But wouldn't this overwrite or move the original config file with the DB credentials etc.?
What if I need to add a new config setting? (Application settings are stored in the database, but things like the 'API' settings are in a config file.)
There is a chance that all these client servers will be hosted by our hosting provider, so we'll have access to all of them and they'll all be configured the same way.
I've only written web applications that run in a single (server) location, so updating those was easy, for example via Deploybot, and the database setup was done manually. But now I'm stepping up my game, and I want to make sure that I don't give myself more work than necessary.
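One way to handle the "new config setting" question is to ship each release's defaults separately from the client's local overrides and merge them at deploy time, so credentials are never overwritten. A minimal sketch, with made-up keys and file names:

```python
# Hypothetical sketch: merge the defaults shipped with a new release into the
# client's existing config without clobbering client-specific values
# (DB credentials, salts, domain). Keys and file names are made up.
import json

def merge_config(shipped_defaults, local_overrides):
    """New keys from the release appear automatically; existing local values win."""
    merged = dict(shipped_defaults)
    merged.update(local_overrides)
    return merged

def write_config(path, config):
    # Write the merged result where the application expects its config.
    with open(path, "w") as f:
        json.dump(config, f, indent=2)
```

With this layout, the deploy step can always regenerate the live config file from the two sources instead of trying to patch it in place.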
Here's our case study on developing an e-commerce platform - maybe you'll find answers to your questions there.
Codenetix specializes in custom development, mostly web apps, so if you need help, let us know.
Good luck with your project!
I have a web application that stores data in an online MySQL database. It also retrieves data using PHP code, performs calculations on the server and sends the result back to the user.
The data is quite simple: names, descriptions, prices, VAT and hourly charges that are read from the database and manipulated on the server side.
Clients often work in environments where the internet connection is poor or unavailable. In that case I would like the client to be able to work offline - enter new names, descriptions and prices, and use the last known VAT to perform calculations - and then synchronise all data as soon as a connection is available.
Now the problem is that I do not know the best way or technologies to achieve this. Don't worry, I am not asking you to write code for me. Can you just explain the correct way to build such a system?
Is there a simple way to use my online MySQL and PHP code locally?
Should I save the data I need in a local file, rebuild the calculations in JavaScript, perform them locally and then synchronise the data when the database is available?
Should I use two MySQL databases, one local and one online, and synchronise between the two when a connection is available? If so, which technology (language) should I use to perform this operation?
If possible, I would like an answer from PHP coders who have worked on a similar project and can give me detailed information on framework structure and technologies to use. Please remember that I am new to this way of writing applications, and I would appreciate it if you could spare a few minutes and explain everything to me as if I were six years old or stupid (which I am!).
I really appreciate any help and suggestion.
Ciao,
Donato
There are essentially 3 ways to go:
Version 1: "Old school": PHP-Gtk+ and bcompiler
First, if you have not done so already, you need to separate your business logic from your presentation layer (HTML, templating engines, ...) and your database layer.
Then adapt your database layer so that it can live with an alternative DB (local SQLite comes to mind) and perform synchronisation when online again.
Finally, use PHP-Gtk+ to create a new UI and pack all of this with bcompiler.
Version 2: "Standard": Take your server with you
Look at Server2Go, WampOnCD and friends to create a "double clickable webserver" (Start at Z-WAMP)
You still need to adapt your DB layer as in Version 1
Version 3: "Web 2.x": Move application from server to browser
Move your application logic from the server side (PHP) to the client side (JS)
Make your server part (PHP) only a data access or sync layer
Use the HTML5 offline features to replace your data access with local data if you are offline and to resync if online
Which one is best?
This depends on what you have and what you want. If most of your business logic is in PHP, then moving it into the browser might be prohibitively expensive - and be aware that this also creates a whole new class of security nightmares. I personally do not recommend porting an existing app this way, but I do recommend it for new apps, if the backing DB is not too big.
If you choose to keep your PHP business logic, then the decision between 1 and 2 is often a question of how much UI your app has - if it's only a few CRUD forms, 1 might be a good idea; it is definitely the most portable (in the sense of taking it with you). If not, go with 2.
I have worked with a similar system for ships. Internet access is expensive in the middle of the ocean, so they have local web servers installed, with database synchronization done via e-mail.
We also created simple .exe packages so that people with no experience can install or update the system...
Anyone who works with cached client systems knows that sometimes you have to update both server and client files. So far I've managed to solve the problem partially by making one call every time the software is opened, asking PHP which version of the software it is on. I compare the result with the version Flex is on, and voilà. The problem is that whenever I need to make an emergency update during business hours, it's impossible to know how many clients already have the Flex version open.
So, to sum up: the cache problem I solved by checking the version at start-up; if your browser cached the app, its version won't match the server's.
The only solution I can think of for the 'already opened app' problem is to put a gateway between the PHP services and the Flex calls, where I would pass the Flex version and compare it inside the gateway before the service is actually called, although I don't like this solution.
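For what it's worth, the gateway idea can be quite small. A sketch, with hypothetical names, in which every service call carries the client version and stale clients are rejected before the service runs:

```python
# Hypothetical sketch of the gateway: every client call carries its version,
# and the gateway refuses to dispatch calls from stale clients.
SERVER_VERSION = "1.4.2"

class StaleClientError(Exception):
    """Raised so the client can be told to reload instead of being served."""

def gateway(client_version, service, *args, **kwargs):
    if client_version != SERVER_VERSION:
        raise StaleClientError("please reload to get version " + SERVER_VERSION)
    return service(*args, **kwargs)
```

The cost is one comparison per call, and already-open clients find out about an emergency update the moment they make their next request.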
Any ideas?
Thanks.
You can download this application from the Adobe website: http://labs.adobe.com/technologies/airlaunchpad/ It will allow you to build a new test app; you need to select the "auto update" property in the menu. That will generate all the necessary files for you, both for the server and the client.
The end result is a server-based XML file, plus setup in each client app to check on a recurring basis whether the XML file offers a newer version of the application and, if so, automatically download and install it. You can change the "check for update" frequency to your liking in the source code; by default it is tied to the application-open event.
Because this check also runs while the app is open, it should solve your problem.
There have been many questions along these lines, but I'm struggling to apply them to my scenario. Any help would be greatly appreciated!
We currently have a functioning MySQL database hosted on a website; data is entered from the website and inserted into the database via PHP.
At the same time, we now want to create a Python application that works offline. It should carry out all the same functions as the web version and run entirely locally. This means it needs a local copy of the entire database, and when changes are made to that local database, they must be synced the next time an internet connection is available.
First off, I have no idea what the best method would be to run such a database offline. I was considering just setting up a localhost server, but this needs to be distributable to many machines, so setting up localhost via an installer of some sort may be impractical, no?
Secondly, synchronization? I haven't a clue how to go about that!
Any help would be very very very appreciated.
Thank you!
For binding Python to MySQL you could use HTSQL:
http://htsql.org
You can then also query your MySQL DB via HTTP requests, either from AJAX calls or server-side, e.g. with cURL (and of course you still have the option of writing standard SQL queries).
There is a jQuery plugin called HTRAF that handles the client-side AJAX calls to the HTSQL server.
The HTSQL server runs on localhost as well.
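Since an HTSQL query is just a URL, building one needs only the standard library; the host, table and filter below are made up for illustration:

```python
# An HTSQL query is expressed as a URL path, so plain Python is enough to
# construct one. Host, table and filter here are hypothetical examples.
from urllib.parse import quote

def htsql_url(host, query, fmt="json"):
    """query is an HTSQL expression such as "product?price>10";
    the trailing /:json asks the server for JSON output."""
    return "http://%s/%s/:%s" % (host, quote(query, safe="?=>'&"), fmt)
```

The resulting URL can then be fetched with urllib.request.urlopen, cURL, or an AJAX call.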
What OS would you be using?
How high-performance does your local application need to be? Also, how reliable is the locally available internet connection? If you don't need extremely high performance, why not just leave the data on the remote MySQL server?
If you're sure you need access to local data, I'd look at MySQL's built-in replication for synchronization. It's really simple to set up and use, and you could use it to maintain a local read-only copy of the remote database for quick data access. You'd simply build into your application the ability to perform write queries against the remote server and read queries against the local DB. The lag between the two servers is generally very low - on the order of milliseconds - but you do still have to contend with network congestion preventing the local slave database from being perfectly in sync with the master at every instant.
As for the Python side of things, search for mysql-python, because you'll need a Python MySQL binding to work with a MySQL database. Finally, I'd highly recommend SQLAlchemy as an ORM, because it'll make your life a heck of a lot easier.
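The read-local/write-remote split described above can be sketched like this (two SQLite connections stand in for the local replica and the remote master; in production these would be your two MySQL connections):

```python
# Sketch of the read-local / write-remote split. Two SQLite connections stand
# in for the local read-only replica and the remote master; in production
# these would be two MySQL connections.
import sqlite3

class SplitDB:
    def __init__(self, local_conn, remote_conn):
        self.local = local_conn    # read-only replica kept fresh by replication
        self.remote = remote_conn  # master, the only place writes go

    def read(self, sql, params=()):
        return self.local.execute(sql, params).fetchall()

    def write(self, sql, params=()):
        cur = self.remote.execute(sql, params)
        self.remote.commit()
        return cur.rowcount
```

Remember that the replica can lag the master slightly, so a read issued immediately after a write may not see it yet.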
I would say an ideal solution, however, would be to set up a remote REST API web service and use that in place of directly accessing the database. Of course, you may not have the in-house capabilities, the time or the inclination to do that ... which is also okay :)
Are you planning to run MySQL for your local offline Python apps? I would suggest something like SQLite instead. As for keeping things in sync, it also depends on the type of data that needs to be synchronized. One question needs to be answered:
Is the data generated by these Python apps opaque? If yes (i.e. it doesn't have any relations to other entities), then you can queue the data locally and push it up to the centrally hosted website.
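A minimal sketch of that queue-and-push idea, using SQLite for the local queue (the push function is a hypothetical stand-in for an HTTP POST to your hosted PHP endpoint):

```python
# Queue opaque records locally in SQLite, then push them to the central site
# when a connection is available. push() is a hypothetical stand-in for an
# HTTP POST to the hosted endpoint.
import json
import sqlite3

def init_queue(conn):
    conn.execute("CREATE TABLE IF NOT EXISTS outbox (id INTEGER PRIMARY KEY, payload TEXT)")

def enqueue(conn, record):
    conn.execute("INSERT INTO outbox (payload) VALUES (?)", (json.dumps(record),))
    conn.commit()

def flush(conn, push):
    """push(record) should upload one record and raise on failure."""
    sent = 0
    for row_id, payload in conn.execute("SELECT id, payload FROM outbox ORDER BY id").fetchall():
        push(json.loads(payload))
        # Delete only after a successful upload, so a failure mid-flush
        # leaves the remaining records queued for the next attempt.
        conn.execute("DELETE FROM outbox WHERE id = ?", (row_id,))
        conn.commit()
        sent += 1
    return sent
```

Because each record is deleted only after its upload succeeds, the app can call flush whenever it detects a connection and never lose or duplicate queued data on its side.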
We are developing a system in PHP with SQL Server 2008. It must work with invoices stored in another SQL Server instance, which I have linked to my database using sp_addlinkedserver.
The problem is that I think I need the data loaded locally (because of performance). So I'm thinking of creating my own "invoices" table and, twice a day, somehow bringing the data from the linked table into the locally stored one.
How can I program SQL to do this every X amount of time?
What approach I should use to program the importing?
At first I thought of writing my own script to do this, but I would prefer to have SQL Server handle it - though that depends on your opinion :)
Thank you!
Guillermo
NOTE: Replication sounds like overkill to me... I don't need real-time synchronization, and I don't need to update the data, just read it.
One option is to use replication to copy the data. However, it may take more administration than you're planning. Replication is great for managing a consistent and timely copy of the data.
Another option is to set up a SQL Server job that runs a SQL script to insert into your target table using a select from your linked server.
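The job-step script for that approach could look roughly like this (server, database, table and column names are hypothetical):

```sql
-- Hypothetical job-step script: refresh the local copy from the linked
-- server. Server, database, table and column names are made up.
TRUNCATE TABLE dbo.InvoicesLocal;

INSERT INTO dbo.InvoicesLocal (InvoiceId, CustomerId, Total, CreatedAt)
SELECT InvoiceId, CustomerId, Total, CreatedAt
FROM [LINKEDSRV].[InvoiceDb].dbo.Invoices;
```

Scheduling it as a SQL Server Agent job with two daily runs matches the twice-per-day import you described.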
You could also use SQL Server Integration Services (SSIS). You would create an SSIS package in which you build a data flow that transfers the data from the source table to the target table. You wouldn't need a linked server for this approach, because the data sources are defined within the SSIS package. And you can use a SQL Server job to schedule the package run times.