I have a web application in PHP & MySQL.
It has three local systems connected with one database server.
Each local system has a primary computer which has the source code connected directly to database server. Other computers connect to each primary computer to use the application.
The problem is:
When the connection drops between one of these systems and the database server, the application can store data in the local system's database. When the connection comes back up again, I want the local database changes to be synchronized with the remote database.
How can I do this?
Related
For my project, there are web apps installed on local servers. Whenever an internet connection is available, a press of a sync button on the local server app should sync the database and the user-uploaded files with a central server. The central server should know where the data is coming from. I am building the local server apps using Laravel 4.2. Is there a simple way I can achieve this?
I solved it using an extra `synced` column in all the tables on both the local and cloud servers, and wrote code to send the data through JSON APIs. Then I updated the `synced` column with a timestamp on both ends. On every sync button press, I compared the `updated` and `synced` columns and sent the data to the server accordingly.
I don't know if this is the perfect solution, but it worked for me.
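The compare-and-send step above could be sketched like this. This is a minimal sketch, not the exact code from my project: the `updated_at`/`synced_at` column names and the `source` identifier are illustrative assumptions.

```php
<?php
// Sketch of the compare-and-send step: keep only the rows whose last
// update is newer than their last successful sync, then wrap them in a
// JSON payload that tells the central server where the data came from.
// Column names `updated_at`/`synced_at` are illustrative.

/** Rows that changed since they were last synced (never-synced rows too). */
function rowsToSync(array $rows): array
{
    return array_values(array_filter($rows, function (array $row) {
        return $row['synced_at'] === null
            || strtotime($row['updated_at']) > strtotime($row['synced_at']);
    }));
}

/** JSON payload for the central server, tagged with the local server's id. */
function buildSyncPayload(string $sourceId, array $rows): string
{
    return json_encode(['source' => $sourceId, 'rows' => $rows]);
}

// After a successful POST, the local side would stamp synced_at = NOW()
// on each row it just sent, and the central side would do the same.
```

On the Laravel side this would sit behind the sync button's controller action; the central server reads `source` from the payload to know which local installation pushed the rows.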
I am trying to migrate an existing MySQL + PHP project (with phpMyAdmin) to server-side javascript (Node.js and Express). I am wondering how best to establish a MySQL connection with Node, given that the current setup is on phpMyAdmin? That is, is it possible to configure the node-mysql connection such that it finds the host on phpMyAdmin?
In case the above is not possible, I would like to allow collaborators access to this database. Setting up remote MySQL connections to my localhost MySQL database seems to be one solution, but I have run into difficulty following this post, How to allow remote connection to MySQL, for two reasons:
An /etc/mysql folder does not exist on my system (I cannot find my.cnf).
Upon granting remote access, I'm unsure how to specify the host and/or IP address for the other users that will be connecting remotely to my localhost MySQL db.
Alright, I've got a processor-intensive program running on a locally hosted server. I recently bought a web hosting service and want to frequently update the website's DB with the local server's DB information, but I cannot figure out how to go about doing this.
I'm using PDO; the local server is a Debian distro, and the remote server is hosted on JustHost. I'm hoping there's a way to update the remote without having to just dump the local SQL file and upload it to the remote.
I have a hotel-monitoring web application developed in CodeIgniter.
It monitors each room of a hotel for activity and sends live data about room occupancy, the status of electrical switches, etc.
It is hosted on a cPanel web server. I have the same copy of it on my local computer.
I want to synchronize the two databases (i.e. on localhost and on the remote server).
Every change in either database should be automatically propagated to the other.
First I tried to access my remote database from localhost by setting the hostname to my cPanel shared IP address, as follows:
$db['default']['hostname'] = 'xx.xx.xx.xx';
where xx.xx.xx.xx is the shared IP of cPanel, but with no luck.
How should I proceed?
Thanks in advance
Option 1: The Codeigniter Way
You can set up a remote access MySQL server:
Tutorial for cPanel here: https://www.liquidweb.com/kb/enable-remote-mysql-connections-in-cpanel/
Then just set up a new connection in your database.php, changing the hostname to $db['remote']['hostname'] = 'xxx.xxx.xxx.xxx';
Do note that you'll have to write a method to pull in this new data and replace it in your local database, i.e. a script that can be run via cron job.
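Such a cron script could look roughly like this. It's only a sketch: the `rooms` table, its columns, and the credentials are illustrative assumptions, and the live part is left commented out so you can fill in your own connection details.

```php
<?php
// Sketch of a cron-run script that pulls recent changes from the remote
// (cPanel) database into the local one. Table name `rooms` and column
// `updated_at` are illustrative, not from the original app.

/** Build a REPLACE INTO statement with one placeholder per column. */
function buildReplaceSql(string $table, array $columns): string
{
    $cols   = implode(', ', $columns);
    $params = implode(', ', array_fill(0, count($columns), '?'));
    return "REPLACE INTO {$table} ({$cols}) VALUES ({$params})";
}

// The live part, roughly (fill in real credentials before use):
//
// $remote = new PDO('mysql:host=xxx.xxx.xxx.xxx;dbname=hotel', $user, $pass);
// $local  = new PDO('mysql:host=localhost;dbname=hotel', $user, $pass);
//
// // Pull rows the remote changed since the last run (here: 5 minutes,
// // matching a `*/5 * * * *` cron schedule).
// $rows = $remote->query(
//     "SELECT id, status, updated_at FROM rooms
//      WHERE updated_at > NOW() - INTERVAL 5 MINUTE"
// )->fetchAll(PDO::FETCH_ASSOC);
//
// $stmt = $local->prepare(buildReplaceSql('rooms', ['id', 'status', 'updated_at']));
// foreach ($rows as $row) {
//     $stmt->execute([$row['id'], $row['status'], $row['updated_at']]);
// }
```

REPLACE INTO keeps the script idempotent: re-pulling a row that was already copied simply overwrites it rather than failing on a duplicate key.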
Option 2: The Master/Slave Way
You can set up a master database, being the hotel's, and a slave, being your local database. I don't know of a method to do this straight from cPanel, but you can do it in phpMyAdmin; there's a video on how to do it here: https://www.youtube.com/watch?v=nfsmnx24gxU
This method would be the most comprehensive and automatic way to do it. Note, though, that it will overwrite any changes made directly in the slave database.
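Very roughly, classic MySQL replication looks like this (the database name, replication user, and log coordinates below are placeholders; note that on shared cPanel hosting you usually cannot edit my.cnf yourself, which is why the phpMyAdmin route above exists):

```ini
# my.cnf on the master (the hotel/cPanel side)
[mysqld]
server-id    = 1
log_bin      = mysql-bin
binlog_do_db = hotel   ; illustrative database name

# my.cnf on the slave (the local copy)
[mysqld]
server-id = 2
```

```sql
-- On the slave, point it at the master. Take MASTER_LOG_FILE and
-- MASTER_LOG_POS from the output of SHOW MASTER STATUS on the master.
CHANGE MASTER TO
    MASTER_HOST = 'xx.xx.xx.xx',
    MASTER_USER = 'repl',
    MASTER_PASSWORD = 'secret',
    MASTER_LOG_FILE = 'mysql-bin.000001',
    MASTER_LOG_POS  = 4;
START SLAVE;
```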
Note that any outside access to a database is potentially insecure. Be aware of this when choosing.
I have a PHP/MySQL application running with WAMP on a local server. This application contains sensitive and confidential data that can only be accessed from devices on the network in the office.
However, this application generates reports that clients should be able to access from the web from an entirely separate application running on a LAMP stack.
Currently, I have the reports transferring via SFTP from the local server to the web based server.
My question is: how can I update the remote database from the local application securely, so that the remote MySQL db can only be modified by the remote application's own localhost and by the server running the local application?
I'm thinking about creating some kind of API that only accepts data from the IP of the local app, but I do not know the best practices for this, nor do I know how to start going about it.
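That IP-plus-API idea could be sketched like this. It's only a sketch of one possible design: the header name, addresses, and secret are placeholder assumptions, and the endpoint should only ever be served over HTTPS so the secret is not sent in the clear.

```php
<?php
// Sketch of the receiving endpoint on the LAMP server: accept report
// data only from the office server's IP, plus a shared secret as a
// second factor. The IP and secret values are placeholders.

function isAuthorized(string $remoteIp, string $providedSecret,
                      string $allowedIp, string $sharedSecret): bool
{
    // hash_equals avoids leaking the secret through timing differences.
    return $remoteIp === $allowedIp
        && hash_equals($sharedSecret, $providedSecret);
}

// In the endpoint itself (HTTPS only):
//
// if (!isAuthorized($_SERVER['REMOTE_ADDR'],
//                   $_SERVER['HTTP_X_API_KEY'] ?? '',
//                   '203.0.113.10',            // the office server's IP
//                   $secretFromConfig)) {
//     http_response_code(403);
//     exit;
// }
// $report = json_decode(file_get_contents('php://input'), true);
// ...insert into the reports table with a prepared statement...
```

The IP check alone is weak (addresses can change or be spoofed at some layers), which is why the shared secret rides along as a second factor.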
MySQL provides a USER > FROM HOST > PER DATABASE > PER TABLE > PER COLUMN grant system.
Meaning that you can specify which user can connect from which host, to which database, and so on. Make use of the FROM HOST feature.
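For example, a user that may only connect from the local application's server and may only write to one database (the user, database, password, and IP below are placeholders):

```sql
-- The account is identified by user AND host: connections from any
-- other IP are refused by MySQL itself, before any query runs.
CREATE USER 'report_writer'@'203.0.113.10' IDENTIFIED BY 'strong-password';
GRANT INSERT, UPDATE ON reports_db.* TO 'report_writer'@'203.0.113.10';
FLUSH PRIVILEGES;
```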