How to build a PHP/MySQL application that works offline

I have a web application that stores data in a MySQL database online. It also retrieves data using PHP code, performs calculations on the server, and sends the result back to the user.
The data is quite simple: names, descriptions, prices, VAT rates, and hourly charges that are read from the database and manipulated on the server side.
Clients often work in environments where the internet connection is poor or unavailable. In this case I would like the client to be able to work offline: enter new names, descriptions, and prices, use the last known VAT rate to perform calculations, and then synchronise all data as soon as a connection is available.
Now the problem is that I do not know the best way or technologies for achieving this. Don't worry, I am not asking you to write code for me. Can you just explain to me the correct way to build such a system?
Is there a simple way to use my online MySQL and PHP code locally?
Should I save the data I need in a local file, rebuild the calculations in JavaScript, perform them locally, and then synchronise the data once the database is available?
Should I use two MySQL databases, one local and one online, and synchronise between the two when a connection is available? If so, which technology (language) should I use to perform this operation?
If possible, I would like an answer from PHP coders who have worked on a similar project in the past and can give me detailed information on the framework structure and technology to use. Please remember that I am new to this way of writing applications, and I would appreciate it if you could spare a few minutes and explain everything to me as if I were a six-year-old or stupid (which I am!)
I really appreciate any help and suggestions.
Ciao,
Donato

There are essentially 3 ways to go:
Version 1: "Old school": PHP-Gtk+ and bcompiler
First, if you have not done so already, you need to separate your business logic from your presentation layer (HTML, templating engines, ...) and your database layer
Then adapt your database layer so that it can live with an alternative DB (a local SQLite database comes to mind) and perform synchronisation when online again - see the sketch after this list
Finally, use PHP-Gtk+ to create a new UI and pack all of this with bcompiler
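A minimal sketch of such a swappable database layer, assuming PDO is available and a local SQLite file serves as the offline store (connection details and names are illustrative):

    <?php
    // Swappable DB layer: the business logic only ever sees a PDO handle,
    // so switching between MySQL (online) and SQLite (offline) is
    // contained in this one place.
    function openDatabase($online)
    {
        $pdo = $online
            ? new PDO('mysql:host=db.example.com;dbname=app', 'user', 'pass')
            : new PDO('sqlite:' . __DIR__ . '/offline.sqlite');
        $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
        return $pdo;
    }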
Version 2: "Standard": Take your server with you
Look at Server2Go, WampOnCD and friends to create a "double-clickable webserver" (start at Z-WAMP)
You still need to adapt your DB layer as in Version 1
Version 3: "Web 2.x": Move application from server to browser
Move your application logic from the server side (PHP) to the client side (JS)
Make your server part (PHP) only a data access or sync layer
Use the HTML5 offline features to replace your data access with local data when you are offline, and to resync when online - a sketch of the sync layer follows
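What that thin PHP sync layer could look like, assuming the browser queues records offline (for example in localStorage) and POSTs them as JSON when connectivity returns; the table, the field names, and the client-generated UUID are all illustrative:

    <?php
    // sync.php - receives a JSON array of records queued while offline.
    // Assumes a UNIQUE key on client_id so replayed uploads stay idempotent.
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
    $records = json_decode(file_get_contents('php://input'), true) ?: array();

    $stmt = $pdo->prepare(
        'INSERT INTO items (client_id, name, price) VALUES (?, ?, ?)
         ON DUPLICATE KEY UPDATE name = VALUES(name), price = VALUES(price)'
    );
    foreach ($records as $r) {
        $stmt->execute(array($r['client_id'], $r['name'], $r['price']));
    }
    echo json_encode(array('synced' => count($records)));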
Which one is best?
This depends on what you have and what you want. If most of your business logic is in PHP, then moving it into the browser might be prohibitively expensive - be aware that this also generates a whole new class of security nightmares. I personally do not recommend porting this way, but I do recommend it for new apps, if the backing DB is not too big.
If you choose to keep your PHP business logic, then the decision between 1 and 2 is often a question of how much UI your app has - if it's only a few CRUD forms, 1 might be a good idea - it is definitely the most portable (in the sense of taking it with you). If not, go with 2.

I have worked with a similar system for ships. Internet is expensive in the middle of the ocean, so they have local web servers installed, with database synchronisation via e-mail.
We have also created simple .exe packages so that people with no experience can install or update the system...

Related

Single point of truth for database shared by multiple applications

I have three applications which share the same database. All three applications are built with PHP (Laravel framework). At the beginning there was only one app, so I managed my database with the built-in migration feature from Laravel (modifying/creating/deleting tables and rolling these changes back). This way it was possible to track all changes in my version control.
Now, as I have three apps, I wonder where I should manage my database. Where is my point of truth? As I see it, I have multiple options:
Using one app as "database-master". Only this app must change the structure of the database. All other apps only have access to read/write data.
Every app makes its own changes. This would result in huge chaos, as rolling back / rebuilding from zero would be nearly impossible.
Using a third-party tool (e.g. MySQL Workbench) to edit my DB. This seems like a valid option to me, as the tool is built to handle databases and their structure/data. But is there some kind of version control? Can I roll back changes I've made?
Using a fourth application (in my example, another Laravel app). This app would only be responsible for the database. This way it would be possible to track all changes in git (see the sketch after this list).
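To illustrate that last option, a hypothetical migration that would live only in the dedicated schema app, so that every structural change is a commit in git (table and column names are made up):

    <?php
    use Illuminate\Database\Migrations\Migration;
    use Illuminate\Database\Schema\Blueprint;
    use Illuminate\Support\Facades\Schema;

    // Lives only in the dedicated "schema" app; the three consumer
    // apps never define migrations of their own.
    class CreateProductsTable extends Migration
    {
        public function up()
        {
            Schema::create('products', function (Blueprint $table) {
                $table->increments('id');
                $table->string('name');
                $table->decimal('price', 8, 2);
                $table->timestamps();
            });
        }

        public function down()
        {
            Schema::drop('products');
        }
    }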
It's the first time I have come across this problem, and I would be happy to gather some information about best practices.
Edit
The applications share the same tables.

PHP WebSockets MySQL Pub/Sub

I have this "crazy" project starting, the idea behind it is quite clear:
There is some software that writes to a MySQL database. The interval between queries is one second.
Now I need a web interface which loads those database records and continues to show new records as they happen.
The technologies I am supposed to use are PHP and HTML5 WebSockets. I've found this nice library, Ratchet, which I think fits my needs. However, there's one problem: I am not sure how to notify the PHP script, or send a message to the running PHP WebSockets server, when a MySQL query occurs.
Now, I could use Comet and send a request for database records every second, but that defeats the WebSockets I am supposed to use.
So what I really need is a MySQL pub/sub system.
I've read about MySQL triggers, but I see that they pose some security risks. And though security in this case isn't a real concern, since the system will be isolated in a VPN and only a few specific people will be using it, I would still like to address every possible issue and do everything the right way.
Then there is MySQL Proxy, of which I have no knowledge, but if it could help me achieve my goal, I would very much consider using it.
So, in short, the question is: how can I notify or run a PHP script when a MySQL query occurs?
I would separate the issues a bit.
You definitely need some sort of pub/sub system. It's my understanding that Redis can act as one, but I've never used it for this and can't make a specific recommendation as to which system to use. In any case, I wouldn't attach it directly to your database. There are certainly operations you need to run on your database for maintenance purposes, and you don't want it flushing a million rows out to your clients because a trigger fired. Pick your pub/sub system independently of what you're doing client-side and what you're doing with your database. The deciding factor should be how it interacts with your server-side language (PHP in this case).
Now that your pub/sub is out of the way, I would build an API server or ingest system of sorts that takes data in from wherever these transactions are coming from. It will handle them, publishing messages as well as inserting them into the database at the same time (see the sketch below).
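Purely as an illustration (the choice of pub/sub system is deliberately left open above), here is what that ingest step could look like in PHP with Redis as the pub/sub, via the phpredis extension; the table and channel names are made up:

    <?php
    // Ingest: insert the record, then tell subscribers about it.
    $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
    $redis = new Redis();                 // phpredis extension
    $redis->connect('127.0.0.1', 6379);

    function ingest(PDO $pdo, Redis $redis, array $record)
    {
        $stmt = $pdo->prepare('INSERT INTO readings (sensor, value) VALUES (?, ?)');
        $stmt->execute(array($record['sensor'], $record['value']));
        // Publish only after a successful insert, so subscribers never
        // hear about rows that are not actually in the database.
        $redis->publish('readings', json_encode($record));
    }

    ingest($pdo, $redis, array('sensor' => 'temp1', 'value' => 21.4));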
Next, you need a way to get that to clients. Web Sockets are a good choice, as you have discovered. You can do this in PHP or anything, really. I prefer Node.js with Socket.IO, which provides a nice fallback to long-polling JSON (among others) for clients that don't support Web Sockets. Whatever you choose here needs to listen for messages on your pub/sub and send the right data to the client (likely stripping out some of the published information that isn't immediately needed client-side). A minimal listener is sketched below.
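For completeness, the listening side against the same assumed Redis channel, as a minimal blocking subscriber using the phpredis extension (a real websocket server, whether Ratchet or Socket.IO, would use its event loop's non-blocking client instead):

    <?php
    $redis = new Redis();
    $redis->connect('127.0.0.1', 6379);
    // Blocks and invokes the callback once per published message.
    $redis->subscribe(array('readings'), function ($redis, $channel, $message) {
        // Forward $message to the connected websocket clients here.
        echo "new record on {$channel}: {$message}\n";
    });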

Accessing a database located on another computer using PHP

My client has an offline product database for a high street shop, which they update fairly frequently for their own purposes. They are now creating an online store, which needs to use product information from this database.
Migrating the database to a hosted server and abandoning the offline database is not an option due to their current legacy software set up.
So my question is: how can I get the information from their offline database to an online database? Their local server is always connected to the internet, so is it possible to create a script on the website that somehow grabs the data from their server and imports it into the online database? If this ran every 24 hours, it would be perfect. But is it even possible? And if so, how would I do it?
The only other option I can think of is to manually upload the database after every update, but this isn't really a viable idea.
I did something like this with QuickBooks using an ODBC connection. Using that, I synced data to MySQL. This synchronisation, however, was one-way only. Unless you have keys in the data that indicate when something was changed (an updated date), you will end up syncing a lot of extra data.
Using SQLyog, I set up a scheduled job that connected to the ODBC data source and pushed the changes since the last sync to the MySQL database I was using to generate reports. If you can get the data replicated into MySQL, it should be easy at that point to make use of it in your online store.
The downside is that it won't be real-time. Inventory could become a problem.
In an ideal world I would look at creating a RESTful API that would run on the same server, or at least on the same network, as your offline database. This RESTful API would run as a web server over HTTP and return JSON or even XML structures of data from the offline database. Clients running on the internet would be able to connect and fetch any data they need, at any time. A RESTful API like this has a number of advantages.
Firstly, it's secure. You don't have to open up an attack vector by making connections to your offline database public. The only thing you have to expose is the RESTful API itself. In your API's logic you might not even include functionality to write to the database, so even if your API's security is compromised, at the very worst all attackers can do is read your data, not corrupt it.
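A minimal sketch of such a read-only endpoint, assuming PDO access to the offline database; the products table and all names are illustrative:

    <?php
    // products.php - exposes only a SELECT, so a compromised client
    // can at worst read data, never modify it.
    header('Content-Type: application/json');

    $pdo = new PDO('mysql:host=localhost;dbname=shop', 'reader', 'secret');
    $stmt = $pdo->query('SELECT id, name, price, stock FROM products');

    echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));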
Having a RESTful API in this situation also represents a good separation of concerns. Your client code should not know anything about the database, nor about any internal systems the offline database uses. What happens when your client wants to update their offline system, or even replace it? In that situation, all you would have to do is update the RESTful API. The code consuming the data no longer cares about anything but the API, so changing databases would be easy.
Another reason to consider an API is concurrency. I hinted at this before, but an API would be great if you ever need more than one client accessing the offline database's data. With the API sitting in a web server waiting for requests, there is no reason why you could not have more than one client connecting to it at the same time. HTTP is really good at this!
You talked about having to place old data in a new database. Something like this could be done easily with a RESTful API, as you would just have to map the endpoints of your API to tables in the new database and run that when needed. You could even forgo the new database and use the API as your backend. This solution would require some caching, but it would cut down on duplicating the database if you don't feel that's needed.
The drawback to all of this is that writing an API is more complex than a script. So in this situation I believe in horses for courses. If this database is the backbone of a long-term project that will be expanding in the future, an API is the way to go. If it's a small part of your project, then maybe you can swing it with a script that runs every 24 hours; however, I have done this before, and the second I have to change or edit the solution, things start getting a little "hairy". Hope this helps, and good luck with it.

Cheapest way (platform/language) to implement a RESTful web API for an iPhone app?

I am developing an iPhone app and would like to create some sort of RESTful API so that different users of the app can share information/data - to create a community of sorts.
Say my app is some sort of game, and I want the user to be able to post their high score to a global leaderboard, as well as maintain a list of friends and see their scores. My app is nothing like this, but it shows the kind of collective information access I need to implement.
The way I could implement this is to set up a PHP and MySQL server and have a PHP script that interacts with the database and mediates requests between the DB and each user on the iPhone, taking a GET request and returning a JSON string.
Is this a good way to do it? It seems to me that using PHP is a slow way to implement this, as opposed to, say, a compiled language. I could be very wrong, though. I am trying to keep my hosting bills down because I plan to release the app for free. I do recognise that an implementation that performs better in terms of CPU cycles and RAM usage (e.g. something compiled, written in, say, C#) might require a more expensive hosting solution than a LAMP server, so it might actually end up costing more in terms of $/request.
I also want my implementation to be scalable in the (rare) case that a lot of people start using the app. Does the usage volume shift the performance/$ ratio towards a different implementation? I.e. with 1k requests/day it might be cheaper to use PHP+MySQL, but 1M requests/day might make something else cheaper?
To summarise, how would you implement a (fairly simple) remote database that would be accessed remotely using HTTP(S) in order to minimise hosting bills? What kind of hosting solution and what kind of platform/language?
UPDATE: per Karl's suggestion I tried Ruby (language) + Sinatra (framework) + Heroku (app hosting) + Amazon S3 (static file hosting). To anyone reading this who might have the same dilemma I had: this setup is amazing - effortlessly scalable (to "infinity"), affordable, and easy to use. Thanks Karl!
I can't comment on DB specifics yet because I haven't implemented that part, although for my simple query requirements, CouchDB and MongoDB seem like good choices, and they are integrated with Heroku.
Have you considered using Sinatra and hosting it on Heroku? This is exactly what Sinatra excels at (REST services). And hosting with Heroku may be free, depending on the amount of data you need to store. Just keep all your supporting files (images, JavaScript, CSS) on S3. You'll be in the cloud and flying in no time.
This may not fit with your PHP desires, but honestly, it doesn't get any easier than Sinatra.
It comes down to a tradeoff between cost and experience.
If you have the expertise, I would definitely look into some form of cloud-based infrastructure, something like Google App Engine. Which cloud platform you go with depends on what experience you have with different languages (App Engine only works with Python/Java, for example). Generally though, scalable cloud-based platforms have more "gotchas" and need more know-how, because they are specifically tuned for high-end scalability (and thus require knowledge of enterprise-level concepts in some cases).
If you want to be up and running as quickly and simply as possible, I would personally go for a CakePHP install. Set up the model data to represent the basic entities you are managing, then use CakePHP's wonderful convention-loving magic to expose CRUD updates on these models with ease! (See the sketch below.)
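As a taste of that convention-driven magic, CakePHP's scaffolding feature generates the CRUD actions and views from little more than this (the controller name is illustrative and assumes a matching `scores` table):

    <?php
    // With a `scores` table following CakePHP's naming conventions,
    // scaffolding provides index/view/add/edit/delete out of the box.
    class ScoresController extends AppController {
        public $scaffold;
    }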
The technology you use to implement the REST services will have a far less significant impact on performance and hosting costs than the way you use HTTP. Learning to take advantage of HTTP is far more than simply learning how to use GET, PUT, POST and DELETE.
Use whatever server-side technology you already know and spend some quality time reading RFC 2616. You'll save yourself a ton of time and money. (A small example of what that buys you follows.)
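For instance, one of the cheapest wins RFC 2616 describes is the conditional GET; a PHP sketch, where the ETag derivation and payload are illustrative:

    <?php
    $payload = json_encode(array('leaderboard' => array('alice' => 9001, 'bob' => 4200)));
    $etag = '"' . md5($payload) . '"';

    header('ETag: ' . $etag);
    header('Cache-Control: max-age=60');

    // If the client already holds the current version, skip the body entirely.
    if (isset($_SERVER['HTTP_IF_NONE_MATCH']) && $_SERVER['HTTP_IF_NONE_MATCH'] === $etag) {
        header('HTTP/1.1 304 Not Modified');
        exit;
    }
    echo $payload;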
In your case it's the database server that's accessed on each request, so even if you use a compiled language (say C# or Java) it won't matter much (unless you are doing some data transformation or processing).
So the DB server has to scale well, and your choice of language and DB should be configured properly for the host OS.
In short, PHP+MySQL is good if you are sending/receiving JSON strings and storing/retrieving them in the DB with minimal data processing.
Later, if your app gets popular and doesn't require frequent updates to existing data, you can move that data to a highly scalable database like MongoDB (which is JSON-friendly).

Access + MySQL converting to web platform = (PHP + ASP.NET + MySQL)?

I have a database that is written in Access. The Access .mdb file connects via ODBC to a local MySQL database. I have a bunch of SQL and VBA code in the Access file. I don't expect the database to surpass 100 MB; currently it is around 10 MB. I will need multiple-user access (no more than 10 users at a time).
I need to convert this database from being a local one to running on a web server, and I need to make a web interface for it.
How do I get the current local instance of the MySQL database to run off a web server? I am currently running it off WampServer 2.0. I don't have experience putting a database on a web server.
I have an OK VB.NET background. I have never done any web applications. Here's a picture of the Access form that I may need to replicate to work off a website:
(screenshot: http://img42.imageshack.us/img42/1025/83882488.jpg)
Which platform should I use as the front end for this thing?
Would it be possible to just run this Access file off a web server instead of programming a new front end for it? Or is that not a smart idea?
Thank you for your help!
If your web server has TCP connectivity to your existing database server, and it's hosted in a suitable place (e.g. don't have your web server in a datacenter connecting to a database server over your office DSL connection), then no move is required.
If you do need to move it, it's as easy as creating a backup/dump, and restoring it elsewhere.
As far as the frontend goes, there are MANY technologies that will do what you need (ASP.NET, PHP, Python, Ruby, Perl and Java being the most popular ones, not necessarily in that order).
Use something you are comfortable with, or that you are interested in learning (provided you have the time to do so)
Use something that runs properly on your target webserver. Really, ASP.NET is the only one that has any major issue here, as it's limited to Windows.
Access itself has no direct web-accessible version. A Google search finds some apps that claim to convert Access forms to web-based ones, but I will not link to any because I don't know how well they work. I'm certainly leery of anything like that, because web apps are a different breed from Windows apps. If you are going to go that route, be sure they actually generate HTML output, produce sane, clean source, and offer a free trial so you can verify that it actually works.
Really though, a form like that is reasonably easy to reproduce with some basic knowledge of server-side programming and some HTML, along the lines of the sketch below.
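As a rough sketch only (the table, fields, and connection details are made up), a single PHP page can both render such a form and handle its own submission:

    <?php
    // Handle the submission, then fall through and redisplay the form.
    if ($_SERVER['REQUEST_METHOD'] === 'POST') {
        $pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');
        $stmt = $pdo->prepare('INSERT INTO products (name, price) VALUES (?, ?)');
        $stmt->execute(array($_POST['name'], $_POST['price']));
    }
    ?>
    <form method="post">
        <input name="name" placeholder="Name">
        <input name="price" placeholder="Price">
        <button>Save</button>
    </form>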
I don't have any experience migrating Access to a web-based interface, although I have heard of people going straight from Access to a web page. MySQL is exceptionally easy to migrate. The standard install of MySQL comes with a program called mysqldump that allows you to export your database straight to a text file, which can then be imported on another server using the mysql command-line client. I don't believe the WAMP server comes with the command-line tools, although they can be downloaded from mysql.com. However, if it has phpMyAdmin, there is also an export feature there that will generate a .sql file which can be imported on the web server using phpMyAdmin. One thing to keep in mind, though, is that I have had very little success mixing and matching these methods: i.e., I've never been able to get a mysqldump-created file to work with phpMyAdmin, and vice versa. The command-line round trip looks like the sketch below.
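A sketch of that round trip (user, database, and file names are illustrative; the target database must already exist before the import):

    # export on the local machine
    mysqldump -u root -p shopdb > shopdb.sql

    # import on the web server
    mysql -u root -p shopdb < shopdb.sql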
Good luck!
The link will help you to export and import a MySQL database.
Maybe on a Windows web server there is a way to run Access files; you can check. But in any case, if you have some programming skills, I would say that it is not difficult to create a PHP script which will query your database and edit its contents.
Migrating an Access application to the web is quite difficult, because you can't translate an Access form 1:1 into a web page. Web apps are stateless, whereas Access is built around the concept of bound controls and bound datasets.
Secondly, it is impossible to easily replicate an Access subform.
Third, you lose tons of events that Access forms and controls are built around.
In general, a web page that performs the same task as an Access form will bear little or no resemblance to the Access form, simply because the methods for accomplishing the same tasks and the UI widgets available to you are so completely different.
One thing to consider is whether your users need a web application or if they just need to use your existing Access application over the Internet. If the latter is the case, Windows Terminal Server/Citrix can do the job for a lot less money, since there's no conversion needed. You do need to provision a Windows Terminal Server, set up a VPN and purchase CALs for the users, but the costs of those are going to be much less than the cost of rebuilding the app for web deployment.
It may not be an appropriate solution, but it's one that you should consider, I think.
