I am a newbie and hence need some advice.
My problem is similar to this question, but I haven't been able to resolve it yet.
Problem:
I am processing data internally and generating 8-10 tables. I want to replicate those tables (in 2 different schemas) to a remote server automatically, every 15 minutes.
I went down the AWS route (EC2, DMS, RDS) but got stuck there and couldn't resolve the problem after spending two days (here is my other question if it helps to understand the background).
Proposed Solution:
By doing research and reading this post, this post, this post and this post, I have come up with a different solution:
Automatically dump and FTP the CSV file(s) to the remote web server/cPanel every 15 minutes using PHP and Windows Task Scheduler.
Automatically read those CSV file(s) and update records in the remote DB using a PHP script and some sort of task scheduler on the web server (is that even possible?).
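To make step 1 concrete, here is roughly what I have in mind; this is only a sketch, and the table name, file paths and FTP credentials are made-up placeholders:

<?php
// dump_and_ftp.php - intended to be run by Windows Task Scheduler
// every 15 minutes. Table name, paths and FTP credentials below
// are placeholders, not real values.
$pdo = new PDO('mysql:host=localhost;dbname=localdb', 'user', 'pass',
               [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);

// Dump one table to a local CSV file.
$csvPath = 'C:\\dumps\\orders.csv';
$fp = fopen($csvPath, 'w');
foreach ($pdo->query('SELECT * FROM orders', PDO::FETCH_NUM) as $row) {
    fputcsv($fp, $row); // one CSV line per record
}
fclose($fp);

// Upload the CSV to the remote cPanel host over FTP.
$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'ftpuser', 'ftppass');
ftp_pasv($conn, true); // passive mode is usually friendlier to firewalls
ftp_put($conn, 'uploads/orders.csv', $csvPath, FTP_ASCII);
ftp_close($conn);

The same pattern would be repeated (or looped) for the other 8-10 tables.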
Question/Advice:
Is my approach above correct, or do I need to find another, better solution? If this approach is correct, then any kind of related help would be highly appreciated.
Please note:
I couldn't find any solution after spending hours on research on and off S.O.
I'm no natural-born coder; I find solutions to what I need to achieve.
I think you should do the first part as you mentioned:
Automatically dump and FTP the CSV file(s) to the remote web server/cPanel every 15 minutes using PHP and Windows Task Scheduler.
Automatically read those CSV file(s) and update records in the remote DB using a PHP script and some sort of task scheduler on the web server (is that even possible?).
After that, since you mentioned it is cPanel, you can set up a cron job to run the PHP file from your step 2. Set up that PHP file to send an email once the database is updated, for your records. The email should be set up for two outcomes: one message if there was an error updating the database, and one if the database was updated successfully.
Cron jobs are quite a useful tool on cPanel.
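For illustration, here is a minimal sketch of that step-2 import script, assuming a CSV of orders uploaded to an uploads/ folder; the database, table, columns, paths and notification address are all placeholders, not anything from the question:

<?php
// import_csv.php - run by a cPanel cron job, e.g. every 15 minutes:
// */15 * * * * /usr/local/bin/php /home/user/import_csv.php
// All names below (DB, table, file path, notify address) are placeholders.
$notify = 'you@example.com';
try {
    $pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4',
                   'dbuser', 'dbpass',
                   [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);
    $handle = fopen('/home/user/uploads/orders.csv', 'r');
    if ($handle === false) {
        throw new RuntimeException('CSV file not found or unreadable');
    }
    // Upsert each row; assumes column 0 is the primary key.
    $stmt = $pdo->prepare(
        'INSERT INTO orders (id, item, qty) VALUES (?, ?, ?)
         ON DUPLICATE KEY UPDATE item = VALUES(item), qty = VALUES(qty)'
    );
    $rows = 0;
    while (($data = fgetcsv($handle)) !== false) {
        $stmt->execute($data);
        $rows++;
    }
    fclose($handle);
    mail($notify, 'DB update OK', "Imported $rows rows successfully.");
} catch (Exception $e) {
    mail($notify, 'DB update FAILED', $e->getMessage());
}

The cron entry in the first comment is the part you would add through cPanel's Cron Jobs screen.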
Related
OK, so I'm in the starting stage of a new project where I have an Apache web server with PHP included and a MySQL database.
The main aim of this project is to show the data in this MySQL database in real time on the web page. The problem I have is that I am not allowed to install any new software on the server, so I cannot use Node.js or Socket.IO.
I've been looking at the PHP long-polling possibility, but I'm curious whether anyone out there has managed to pull off something similar without grinding their server to a halt due to too many threads being used.
I've heard about Comet, but I'm not sure how that would work, as from what I've read it seems to just look at flat files, not databases.
Thanks for any help.
This is easily achievable with jQuery and PHP: create a PHP file and echo JSON-encoded data in return. Usage can be found here: jquery post
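As a rough sketch (the table and column names are invented for illustration), the PHP side could look like this:

<?php
// data.php - polled by the page via jQuery.
// DB name, table and columns are placeholders.
header('Content-Type: application/json');
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass',
               [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);
// Return only rows newer than the id the client saw last,
// so each poll stays small.
$lastId = isset($_POST['last_id']) ? (int)$_POST['last_id'] : 0;
$stmt = $pdo->prepare('SELECT id, message, created_at FROM events WHERE id > ? ORDER BY id');
$stmt->execute([$lastId]);
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));

On the page, something like setInterval(function(){ $.post('data.php', {last_id: lastId}, handler, 'json'); }, 5000); polls it every few seconds, which keeps you within plain Apache/PHP with no new server software required.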
Two years ago, I had the need to make a tool which automatically uploads a txt/csv file via POST to my web server, where it is later parsed by PHP via a cron job.
This had to happen automatically at midnight every day. Although this worked, I cannot say that it was a flawless approach, as it was really stateless.
I am currently brainstorming and sketching on paper a new approach.
What will you suggest that I should do best? Any ready made solutions or ideas?
Additional info for the patient: so far I am seriously considering that, instead of using cron jobs, I trigger the parsing via GET/REST, as that leaves me in a more known state.
Thanks a billion!
Please note that cron jobs, POST/GET and REST are three different things, and they do three different jobs.
A simple approach would be to use inotify to monitor the upload folder. When a new file is added, it sends a trigger to a PHP file that uploads the file to your server. This way the file is uploaded as soon as it is created or modified.
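A minimal sketch of such a watcher, assuming the PECL inotify extension is available (it is not part of core PHP) and using made-up paths:

<?php
// watch_uploads.php - long-running watcher built on the PECL inotify
// extension. The watched folder and handler script are placeholders.
$fd = inotify_init();
// Fire when a file in the folder is fully written or moved in.
inotify_add_watch($fd, '/home/user/uploads', IN_CLOSE_WRITE | IN_MOVED_TO);
while (true) {
    // inotify_read() blocks until at least one event arrives.
    $events = inotify_read($fd);
    foreach ($events as $event) {
        $file = '/home/user/uploads/' . $event['name'];
        // Hand the new file straight to the existing parser instead
        // of waiting for the midnight cron run.
        shell_exec('php /home/user/parse.php ' . escapeshellarg($file));
    }
}

Watching for IN_CLOSE_WRITE rather than IN_CREATE avoids processing a file that is still being written.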
This solved all my hassles... although I am still open to suggestions regarding the above post.
PHP smbclient
Is it possible to back up an Access database? I have done research on how to back up an Access database through PHP, but I wasn't able to get a good answer. Most of the results that came up are about backing up MySQL databases. Can anyone help me :) thanks
re: actually performing the backup
Backing up a native Access database is simply a matter of copying the entire database file (.mdb for Access 2003 and earlier, .accdb for Access 2007 and later). You could use PHP for that, but any scripting language would work, even a simple Windows batch file that does something like
copy /Y d:\apps\databases\mydatabase.accdb z:\backups\databases\*.*
If you're really set on using PHP then you'll likely end up using the copy() function.
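If you go that way, a minimal PHP sketch (the paths are the same made-up ones as in the batch example, plus a timestamp):

<?php
// backup_access.php - copies the Access database file to a backup
// folder, adding a timestamp so earlier backups aren't overwritten.
$source = 'D:\\apps\\databases\\mydatabase.accdb';
$target = 'Z:\\backups\\databases\\mydatabase_' . date('Y-m-d_His') . '.accdb';
if (copy($source, $target)) {
    echo "Backup written to $target\n";
} else {
    echo "Backup failed\n";
}

One caveat: copying the file while users have it open can capture an inconsistent state, so schedule the backup for a quiet period if you can.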
re: automatic scheduling of the backup
The Task Scheduler in Windows could take care of that for you. Once you've created your script to copy the database file(s) you can create a scheduled task to run it periodically. See the MSDN article Using the Task Scheduler (Windows) for more information.
So the scenario is this:
I have a MySQL database on a local server running on Windows Server 2008. The server is only meant to be accessible to users on our network and contains our company's production schedule information. I have what is essentially the same database running on a hosted server running Linux, which is meant to be accessible online so our customers can connect to it and update their orders.
What I want to do is a two-way sync of two tables in the database so that the orders are current in both databases, and a one-way sync from our server to the hosted one for the data in the other tables. The front end to the database is written in PHP. I will say what I am working with so far, and I would appreciate it if people could let me know whether I am on the right track or barking up the wrong tree, and hopefully point me in the right direction.
My first idea is to make (at the end of the PHP scripts that generate changes to the orders tables) an export of the changes that have been made, perhaps using SELECT ... INTO OUTFILE with a WHERE clause on the account, or something similar. This would keep the size of the file small rather than exporting the entire orders table. What I am hung up on is how to (A) export this as an SQL file rather than a CSV, (B) include information about what has been deleted as well as what has been inserted, and (C) fetch this file on the other server and execute the SQL statements.
I am looking into SSH and PowerShell currently but can't seem to formulate a solid vision of exactly how this will work. I am looking into cron jobs and Windows scheduled tasks as well. However, it would be best if the updates simply occurred whenever there was a change, rather than on a schedule, to keep the databases synced in real time, but I can't quite figure that one out. I'd want to run the scheduled task/cron job at least once every few minutes, though I guess all it would need to do is check whether there were any dump files that needed to be put onto the opposing server, not necessarily sync anything if nothing had changed.
Has anyone ever done something like this? We are talking about changing/adding/removing from 1 (min) to 160 (max) rows in the tables at a time. I'd love to hear people's thoughts about this whole thing as I continue researching my options. Thanks.
Also, just to clarify, I'm not sure that either of these is really a master or a slave. There isn't one that always has the accurate data; it's more that the most recent data needs to be in both.
One more note:
Another thing I am thinking about now is to add, at the end of the order-updating script on one side, another config/connect script pointing to the other server's database, and then rerun the exact same queries, since the two have identical structures. Now that just sounds too easy... Thoughts?
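To make that concrete, I mean something like this (connection details invented):

<?php
// Replay the same order-update statement against both servers.
// Hostnames and credentials are placeholders.
$local  = new PDO('mysql:host=localhost;dbname=orders', 'user', 'pass',
                  [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);
$remote = new PDO('mysql:host=hosted.example.com;dbname=orders', 'user', 'pass',
                  [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);
$sql    = 'UPDATE orders SET qty = ? WHERE id = ?';
$params = [5, 42];
foreach ([$local, $remote] as $db) {
    $db->prepare($sql)->execute($params);
}

Though I suppose the catch is what happens when the remote connection fails halfway through.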
You may not be aware that MySQL itself can be configured with databases on separate servers that opportunistically sync to each other. See here for some details; also, search around for MySQL ring replication. The setup is slightly brittle and will require you to learn a bit about MySQL replication. Or you can build a cluster; much higher learning curve but less brittle.
If you really want to roll it yourself, you have quite an adventure in design ahead of you. The biggest problem you have to solve is not how to make it work, it's how to make it work correctly after one of the servers goes down for an hour or your DSL modem melts or a hard drive fills up or...
Running a query on both a local and a remote server can be a problem if the connection breaks. It is better to store each query locally in a file, such as YYYY-MM-DD-HH.sql, and then send the data every hour, once the hour has expired. The update period can be reduced to 5 minutes, for example.
This way, if the connection breaks, the re-established connection can take over all the leftover files.
At the end of the file, insert a CRC for checking the content.
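A rough sketch of that idea (paths invented); each statement is appended to an hourly file, and a CRC line is added just before the file is sent:

<?php
// Append each executed statement to an hourly file so it can be
// replayed on the remote server later. Paths are placeholders.
function queueQuery(string $sql): void
{
    $file = '/var/spool/sync/' . date('Y-m-d-H') . '.sql';
    file_put_contents($file, $sql . ";\n", FILE_APPEND | LOCK_EX);
}

// Run when the hour expires, just before the transfer: append a
// checksum the receiver can verify (after stripping this last line).
function sealFile(string $file): void
{
    $crc = crc32(file_get_contents($file));
    file_put_contents($file, "-- CRC32: $crc\n", FILE_APPEND);
}

The receiver executes each leftover file in name order and deletes it only once the CRC matches, so a broken connection simply leaves files to be picked up next time.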
Hey folks, this question can't be too complicated. Please provide a solution, or at least a way to figure out the ultimate root cause of the problem.
I am currently writing an application which controls Excel through COM: the app creates a COM-based Excel instance, opens some XLS files and reads their contents.
Scenario I
On Windows 7, I start Apache and MySQL using xampp-control with system administrator rights. All works as expected. The PHP-based controller script interacts with Excel as expected.
Scenario II
A problem appears if I start Apache and MySQL as 'background jobs'. Here is how:
I created two jobs using the Windows 7 Task Scheduler. One runs apache_start.bat, the other runs mysql_start.bat.
Both tasks run as SYSTEM with elevated privileges when Windows 7 boots.
Apache and MySQL work as expected. Specifically, Apache serves HTTP requests from clients and PHP is able to talk to MySQL.
When I call the PHP controller, which calls and interacts with Excel using COM, I receive an error.
The error message comes from Excel [not from COM itself] and reads like this:
Excel can't read the specified Excel file
Excel failed to save the file due to an ill-named worksheet
Interestingly, during the first run of the PHP-based controller script it takes a few seconds to render the error message. Each subsequent run renders the error message immediately.
Windows system logs didn't show a single problem report entry.
Note that the PHP program and the Apache instance didn't change - except for the way Apache was started.
At least the PHP controller script is perfectly able to read the file system, since it obtains the paths to the XLS files through scandir() of a certain directory.
Concurrency issues can't be the cause of the problem. A single instance of the specific PHP controller interacts with Excel.
Question
Could someone explain why this happens, or provide ways to isolate the ultimate cause of the problem (e.g. by means of a PowerShell 2 script)?
UPDATE-1 :: 2011-11-29
As proposed, I switched the Task Scheduler job from SYSTEM to a conventional user. It works: Apache and MySQL get started and process requests.
Unfortunately, the situation regarding Excel didn't change a bit. I still see the error.
As assumed earlier, the Excel COM server starts. I'm able to change various settings (e.g. suppress dialogs) through the COM instance without a problem.
The problem happens while calling this:
$excelComObject->Workbooks->Open( 'PathToXLSFile' );
UPDATE-2 :: 2011-11-30
Added the accounts USER, GUEST and EVERYONE with the READ right to the access control list of the XLS file. No change.
Modified the app in such a way that the PHP part creates a copy of the XLS file as a temporary file and moves the contents of the original file into it, just to ensure that the problem isn't caused by odd file/path names.
Still, the problem persists.
UPDATE-3 :: 2011-12-05
I'm going to drive the Excel COM server in such a way that Excel creates a blank file and saves it to /tmp. Let's see whether Excel is even able to read that file.
Go into the Task Scheduler and let everything run as a local user. This will probably require you to enter a password, so create one if you don't have one already.
Excel is a user-level application that shouldn't run as SYSTEM. I'm sure there are ways around it, but you should simply let everything run at the correct level.
Having Apache run on the user level isn't a problem.
Try creating the following directories:
C:\Windows\SysWOW64\Config\Systemprofile\Desktop
C:\Windows\System32\Config\Systemprofile\Desktop
it worked for me :-)
http://social.msdn.microsoft.com/Forums/en-US/innovateonoffice/thread/b81a3c4e-62db-488b-af06-44421818ef91
In the past (read: pre-Vista), services had an option called "Allow service to interact with desktop" which allowed services to spawn windows etc. Starting with Vista, this is no longer allowed.
I suspect Excel is failing because it can't function under this restriction. Thus, any attempt to run it as a service in your Win7 installation will fail.
You could go back to Windows XP and allow desktop interaction for your Apache process, which I don't really recommend for obvious reasons.
Another approach I would take is to create a PHP script that runs as a regular process and listens on a socket in an infinite loop. Your PHP script that runs under Apache would communicate with this secondary script through the local socket and have the secondary script spawn Excel.
This may sound complicated, but in fact it's not a lot of code, and it fixes a problem you will soon have anyway: you should only have one instance of Excel running, or you may run into problems. The secondary script could queue requests, handing them to Excel one by one.
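A minimal sketch of that secondary script; the port and the one-path-per-line protocol are just assumptions for illustration:

<?php
// excel_worker.php - run as a regular logged-in user, NOT as a
// service. It is the only process that ever talks to Excel; the
// Apache-side scripts send it one XLS path per line over a local
// socket. Port and protocol are illustrative assumptions.
$server = stream_socket_server('tcp://127.0.0.1:9100', $errno, $errstr);
if ($server === false) {
    die("Cannot listen: $errstr\n");
}
// One Excel instance, reused for every request (serialises access).
$excel = new COM('Excel.Application');
$excel->Visible = false;
$excel->DisplayAlerts = false;
while ($conn = stream_socket_accept($server, -1)) {
    $path = trim(fgets($conn));
    try {
        $workbook = $excel->Workbooks->Open($path);
        $value    = $workbook->Worksheets(1)->Cells(1, 1)->Value;
        $workbook->Close(false);
        fwrite($conn, "OK $value\n");
    } catch (Exception $e) {
        fwrite($conn, 'ERR ' . $e->getMessage() . "\n");
    }
    fclose($conn);
}

The script under Apache then only needs to open stream_socket_client('tcp://127.0.0.1:9100'), write the path, and read the one-line reply; Excel itself always runs in a normal interactive session.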