Alrighty, I've got a processor-intensive program running on a locally hosted server. I recently bought a web hosting service and want to frequently update the website's DB with the local server's DB information, but I cannot figure out how to go about doing this.
I'm using PDO, the local server is a Debian distro, and the remote server is hosted on JustHost. I'm hoping there's a way to update the remote without having to just dump the local SQL file and upload it to the remote.
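Ideally I'd push only the rows that changed since the last run, something like this (table, columns, and credentials are made up; I assume JustHost would also need my IP whitelisted under Remote MySQL for this to work):

```php
<?php
// Sketch: push rows changed since the last sync from the local DB to the
// remote DB over two PDO connections. The "jobs" table and its
// "updated_at" column are placeholders for my real schema.

$local  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$remote = new PDO('mysql:host=remote.example.com;dbname=app', 'user', 'pass');
$local->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$remote->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$lastSync = '2024-01-01 00:00:00'; // would be persisted between runs

// Select only what changed locally since the last push.
$rows = $local->prepare('SELECT id, status, updated_at FROM jobs WHERE updated_at > ?');
$rows->execute([$lastSync]);

// Upsert each changed row on the remote side.
$upsert = $remote->prepare(
    'INSERT INTO jobs (id, status, updated_at) VALUES (?, ?, ?)
     ON DUPLICATE KEY UPDATE status = VALUES(status), updated_at = VALUES(updated_at)'
);

foreach ($rows as $row) {
    $upsert->execute([$row['id'], $row['status'], $row['updated_at']]);
}
```

Is something along those lines reasonable, or is there a better mechanism for this?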
I have a WordPress site running on a DigitalOcean droplet with the MySQL database on the same server. I am trying to migrate the database to a remote database. I used DigitalOcean's managed SQL service in the same region for it, but after changing the database configuration in wp-config.php, the site now takes 30+ seconds to load.
I also tried GCP's Cloud SQL but faced the same issue.
I would suggest first confirming that you can reach the remote SQL server on your own, and checking what the response time from the other server is.
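For example, run something like this from the droplet (the hostname, port, and credentials are placeholders; DigitalOcean's managed MySQL usually listens on port 25060):

```php
<?php
// Rough latency check: time the connection handshake and a trivial query
// against the managed database, from the droplet itself.

$start = microtime(true);
$pdo = new PDO(
    'mysql:host=db-mysql-fra1-12345.ondigitalocean.com;port=25060;dbname=wordpress',
    'user', 'pass'
);
printf("connect: %.1f ms\n", (microtime(true) - $start) * 1000);

$start = microtime(true);
$pdo->query('SELECT 1')->fetch();
printf("query:   %.1f ms\n", (microtime(true) - $start) * 1000);
```

Keep in mind that WordPress can issue dozens of queries per page load, so even a few extra milliseconds of round-trip latency per query adds up quickly.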
If everything seems fine and both servers are in the same region, ask DigitalOcean whether the two servers can communicate over the private network instead of connecting through public IPs.
Finally, if you are still experiencing the issue, I would suggest contacting their support and explaining it. They should be able to help if it's a "managed" service.
I have created a web app locally in WAMP with PHP and a MySQL database. I am about to launch a demo online on a Linode server.
My question is: once I have it live on Linode and I want to add a new MySQL table locally on WAMP, how do I push the MySQL changes to the live version online? I am not sure what this is called or how to do it. If anyone can share ideas/videos, that would be awesome.
I would suggest using a MySQL replica, in which the replicated database is only read from.
Alternatively, you can create some kind of CI/CD pipeline that runs the same queries on the remote as on your local, but that's a bit more complex.
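A very bare-bones version of that second option looks like this: keep numbered .sql files in version control and run a small script on each server that applies whatever hasn't run yet. The migrations table and file layout here are my own convention (tools like Phinx do this properly):

```php
<?php
// Minimal migration runner sketch. Assumes one SQL statement per file,
// named so they sort in order, e.g. migrations/001_add_users.sql.

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE IF NOT EXISTS migrations (name VARCHAR(255) PRIMARY KEY)');

$applied = $pdo->query('SELECT name FROM migrations')->fetchAll(PDO::FETCH_COLUMN);

foreach (glob(__DIR__ . '/migrations/*.sql') as $file) {
    $name = basename($file);
    if (in_array($name, $applied)) {
        continue; // already ran on this server
    }
    $pdo->exec(file_get_contents($file));
    $pdo->prepare('INSERT INTO migrations (name) VALUES (?)')->execute([$name]);
    echo "applied $name\n";
}
```

Run the same script locally and on Linode after pulling your code, and both databases end up with the same schema.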
I have a web app running with PHP and MySQL.
I need to develop a desktop application which will sync data with the cloud DB whenever the client's computer connects to the internet. If the client's computer is not connected to the internet, the desktop application will continue to work offline, using the local DB. The local DB is obviously a replica of the cloud DB.
I don't want to use Microsoft's C# to create the desktop application. The desktop application needs to be cross-platform and should run on Windows, Mac, and Linux.
I have used XAMPP to create a local MySQL DB and have gotten the local app to sync with the cloud app. However, there are multiple problems with that approach.
-- Whenever my clients need to install the local app, they have to call me, and I have to install XAMPP on their computer, set up the server, set up the local database, and prepare it to sync with the cloud database under their account. They obviously aren't tech-savvy, so they don't know how to do it themselves.
-- If a client formats their computer, they call me again, and I have to set it all up for them every time, which isn't scalable in the long haul.
-- XAMPP doesn't work when other processes are running and using common ports. For example, Skype, Quick Heal, and other antivirus software will prevent the SQL server from starting. Sometimes, even after I have installed the local app, the client installs an antivirus or some other tool and my local app stops working on their computer.
Hence, I need to do away with XAMPP and switch to something else.
SQLite is out of the question since it is serverless. I don't want to use .NET either. What I am looking for is this:
I want to develop the database-driven local application and package it somehow. I want to provide an installer file which will automatically install the database server, set up the database, and everything else. The client will only log in to the system in the local app; he shouldn't have to set up any server. All the work that he does will be synced with the cloud server whenever the internet connection resumes.
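To make the sync step concrete, this is roughly what I have in mind (the "outbox" table and all names are placeholders of mine; conflict handling is omitted):

```php
<?php
// Sketch of "sync when the connection resumes": offline writes are queued
// in a local outbox table and flushed to the cloud DB once it's reachable.

function connectCloud(): ?PDO {
    try {
        return new PDO('mysql:host=cloud.example.com;dbname=app', 'user', 'pass',
                       [PDO::ATTR_TIMEOUT => 3, PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);
    } catch (PDOException $e) {
        return null; // still offline, keep working locally
    }
}

$local = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

if ($cloud = connectCloud()) {
    // Replay queued statements in order, removing each one once it lands.
    $pending = $local->query('SELECT id, sql_text, params FROM outbox ORDER BY id');
    foreach ($pending as $row) {
        $cloud->prepare($row['sql_text'])->execute(json_decode($row['params'], true));
        $local->prepare('DELETE FROM outbox WHERE id = ?')->execute([$row['id']]);
    }
}
```

What I don't know is how to package the database server itself so the installer does all of this without the client touching anything.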
Please note that there is a master-slave setup involved. The client will have multiple terminal computers using the master system, and all these terminal computers will use the local database installed on the master computer.
I have tried to illustrate this with a diagram below.
What's the best way to go about it?
I have developed a table-based view.php, which combines a local MySQL database with a remote LEFT JOIN on external Oracle databases (read-only) from our partner company.
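In simplified form (real table names, columns, and credentials differ), the relevant part of view.php does this:

```php
<?php
// Read local MySQL rows, then attach matching rows fetched from the
// partner's read-only Oracle DB over PDO_OCI -- a LEFT JOIN done in PHP.

$mysql  = new PDO('mysql:host=localhost;dbname=local_db', 'user', 'pass');
$oracle = new PDO('oci:dbname=//partner.example.com:1521/ORCL', 'ro_user', 'pass');

$localRows = $mysql->query('SELECT id, partner_id, label FROM items')
                   ->fetchAll(PDO::FETCH_ASSOC);

// Index the remote rows by key for the join.
$remote = [];
foreach ($oracle->query('SELECT partner_id, detail FROM partner_data') as $row) {
    $remote[$row['PARTNER_ID']] = $row['DETAIL']; // Oracle returns upper-case keys by default
}

foreach ($localRows as &$row) {
    $row['detail'] = $remote[$row['partner_id']] ?? null; // NULL when no match, like LEFT JOIN
}
```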
Everything is working fine on my local machine so far. Now to my problem:
I need to somehow distribute the view.php to every Windows 7 computer in our office network.
We have absolutely no chance to install or execute any files (or local webservers) on our computers/network. Anything that requires administrator rights will not work.
Simply getting the administrator password, or asking an administrator to make any changes, is out of the question.
The Oracle databases (read-only!) are secured with a firewall which only permits connection requests from the static IP of our work machines.
Considering the limitations mentioned above: how can I make the PHP file accessible to the other machines in our office, with a working external Oracle database connection from the office IP, without the possibility of installing a webserver or, in general, without execute rights? Is there anything else I can try?
Is there a way to publish view.php on a remote webserver and somehow connect to the Oracle databases from our local IP, instead of failing by connecting from the webserver's IP?
Any help is highly appreciated.
Thanks in advance.
I have a PHP/MySQL application running with WAMP on a local server. This application contains sensitive and confidential data that can only be accessed from devices on the network in the office.
However, this application generates reports that clients should be able to access from the web from an entirely separate application running on a LAMP stack.
Currently, I have the reports transferred via SFTP from the local server to the web-based server.
My question is: how can I update the remote database from the local application securely, so that the MySQL DB can only be modified by the remote application's localhost and by the server running the local application?
I'm thinking about creating some kind of API that only accepts data from the IP of the local app, but I do not know the best practices for this, nor do I know how to start going about it.
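Roughly what I'm imagining on the remote (LAMP) side is something like this, with the IP check backed by a shared-secret HMAC so a spoofed address alone isn't enough. All names, IPs, and secrets below are placeholders:

```php
<?php
// report_import.php on the remote server -- sketch of the endpoint I'm
// considering. Only accepts signed payloads from the office server's IP.

$allowedIp = '203.0.113.10';          // static IP of the office server
$secret    = getenv('IMPORT_SECRET'); // shared secret, kept out of the code

if ($_SERVER['REMOTE_ADDR'] !== $allowedIp) {
    http_response_code(403);
    exit;
}

$body = file_get_contents('php://input');
$sig  = $_SERVER['HTTP_X_SIGNATURE'] ?? '';

// Verify an HMAC of the body so a spoofed IP alone is not enough.
if (!hash_equals(hash_hmac('sha256', $body, $secret), $sig)) {
    http_response_code(403);
    exit;
}

$data = json_decode($body, true);
$pdo  = new PDO('mysql:host=localhost;dbname=reports', 'web_user', 'pass');
$pdo->prepare('INSERT INTO reports (client_id, payload) VALUES (?, ?)')
    ->execute([$data['client_id'], $body]);
http_response_code(204);
```

Is this the right direction, or is there a more standard approach?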
MySQL provides a USER > FROM HOST > PER DATABASE > PER TABLE > PER COLUMN grant system.
Meaning that you can specify which user can connect from which host to which database, and so on. Make use of the FROM HOST feature.
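For example (IPs, names, and passwords are placeholders):

```sql
-- A user that can only connect from the office server's static IP
-- and only write to the reports table.
CREATE USER 'report_writer'@'203.0.113.10' IDENTIFIED BY 'strong-password';
GRANT INSERT, UPDATE ON reports_db.reports TO 'report_writer'@'203.0.113.10';

-- A user for the remote application itself, restricted to localhost.
CREATE USER 'web_app'@'localhost' IDENTIFIED BY 'another-password';
GRANT SELECT, INSERT, UPDATE, DELETE ON reports_db.* TO 'web_app'@'localhost';
```

With that in place, MySQL itself rejects connections from any other host, regardless of what your API layer does.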