I'm generating a new CSV file approximately every 2 minutes on my machine through a local application that I've written, and I need this file to update my database each time it is generated. I've successfully done this locally via a scheduled, repeating bat file, and now I need to move this process online so my website has access to this data in as close to the same time frame as possible.
I'm totally green on MySQL and am learning it as I go, so I was wondering if there is anything I should be concerned about or any best practices I should follow for this task.
Can I connect directly to my server-side database from my cmd window (bat file) and send this data once the process has run and generated the CSV file? Or do I need to upload the file via FTP/PHP to my web server and import it into the database once it is online?
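To make the direct option concrete, here is roughly what I mean, using PHP on the command line since that's what I'd use on the server side anyway; the hostname, credentials, table, and CSV column layout below are placeholders, not my real setup:

    <?php
    // Rough sketch: push the freshly generated CSV straight into the remote
    // database from the local machine. Host, credentials, table name, and
    // column layout are placeholders.
    $db = new mysqli('db.example.com', 'import_user', 'secret', 'mydb');
    if ($db->connect_error) {
        die('Connect failed: ' . $db->connect_error);
    }

    $csv = 'C:/data/latest.csv';              // the file my local app just wrote
    $handle = fopen($csv, 'r');
    if ($handle === false) {
        die("Cannot open $csv");
    }

    // REPLACE keeps re-runs harmless if a CSV overlaps a previous batch
    $stmt = $db->prepare('REPLACE INTO readings (recorded_at, value) VALUES (?, ?)');
    while (($row = fgetcsv($handle)) !== false) {
        $stmt->bind_param('sd', $row[0], $row[1]);  // adjust types/columns to the CSV
        $stmt->execute();
    }

    fclose($handle);
    $db->close();

The scheduled bat file would then just run php -f push_csv.php after the CSV is written; whether this is even an option depends on the web host allowing remote MySQL connections, which many shared hosts block by default.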
Any help/thoughts would be greatly appreciated.
I am a newbie and hence need some advice.
My problem is similar to this question, but I haven't been able to resolve it yet.
Problem:
I am processing data internally and generating 8-10 tables. I want to replicate those tables (across 2 different schemas) to a remote server automatically, every 15 minutes.
I went toward an AWS solution using EC2, DMS, and RDS, but got stuck there and couldn't resolve the problem after spending two days (here is my other question if it helps to understand the background).
Proposed Solution:
After doing some research and reading this post, this post, this post, and this post, I have come up with a different solution.
Automatically dump and FTP the CSV file(s) to the remote web server/cPanel every 15 minutes using PHP and Windows Task Scheduler (a rough sketch of this step follows below).
Automatically read those CSV file(s) and update records in the remote DB using a PHP script and some sort of task scheduler on the web server (is this possible?).
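For step 1, what I have in mind is roughly the following; the server names, credentials, and table names are placeholders for illustration:

    <?php
    // Sketch of step 1: export each local table to CSV and FTP the files to
    // the remote cPanel host. Credentials, hosts, and table names are placeholders.
    $pdo = new PDO('mysql:host=localhost;dbname=local_db', 'root', 'secret');
    $tables = ['table_a', 'table_b'];           // the 8-10 tables to replicate

    $ftp = ftp_connect('ftp.example.com');
    ftp_login($ftp, 'ftp_user', 'ftp_pass');
    ftp_pasv($ftp, true);

    foreach ($tables as $table) {
        $csvPath = __DIR__ . "/$table.csv";
        $out = fopen($csvPath, 'w');
        foreach ($pdo->query("SELECT * FROM `$table`", PDO::FETCH_NUM) as $row) {
            fputcsv($out, $row);                // one CSV line per table row
        }
        fclose($out);
        ftp_put($ftp, "uploads/$table.csv", $csvPath, FTP_ASCII);
    }
    ftp_close($ftp);

Windows Task Scheduler would then run php -f export_and_ftp.php every 15 minutes.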
Question/Advice:
Is my above approach correct, or do I need to find a different or better solution? If this approach is correct, then any kind of related help would be highly appreciated.
Please note:
I couldn't find any solution after spending hours of research on and off Stack Overflow.
I'm no natural-born coder; I find solutions to whatever I need to achieve.
I think you should do the first part just as you described:
Automatically dump and FTP the CSV file(s) to the remote web server/cPanel every 15 minutes using PHP and Windows Task Scheduler.
Automatically read those CSV file(s) and update records in the remote DB using a PHP script and some sort of task scheduler on the web server.
After that, since you mentioned it is cPanel, you can set up a cron job to run the PHP file from your step 2. Set up that PHP file to send an email once the database is updated, for your records. The email should cover two outcomes: one message if there was an error updating the database, and one if the database was updated successfully.
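A minimal sketch of that cron-run import script, assuming the CSV files land in an uploads folder; the credentials, table, paths, and email address are placeholders:

    <?php
    // Sketch of the cron-run importer: read any uploaded CSV files, update
    // the database, then email the outcome. All names below are placeholders.
    $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'db_user', 'secret',
                   [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);

    $report = '';
    try {
        $stmt = $pdo->prepare('REPLACE INTO customers (id, name, email) VALUES (?, ?, ?)');
        foreach (glob('/home/youraccount/uploads/*.csv') ?: [] as $file) {
            $fh = fopen($file, 'r');
            while (($row = fgetcsv($fh)) !== false) {
                $stmt->execute($row);           // columns must match the CSV layout
            }
            fclose($fh);
            unlink($file);                      // remove the file once imported
            $report .= "Imported $file\n";
        }
        mail('you@example.com', 'CSV import OK', $report ?: 'Nothing to import');
    } catch (Exception $e) {
        mail('you@example.com', 'CSV import FAILED', $e->getMessage());
    }

The cron entry in cPanel then just calls php /home/youraccount/import.php every 15 minutes.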
Cron jobs are quite a useful tool on cPanel.
This is a very general question, but I'm not sure of the best way to go about things here.
I have two applications that want to interface. One is a Windows-based app that has a database and can send cURL requests. The other is a very simple website with a MySQL database.
The main feature is that these two apps can swap database data between each other. The Windows app is currently using SQLAnywhere but could be converted to MySQL.
Anyway: on the web app there is a JS function to dump all requested data into a .txt file, essentially a MySQL dump. This function will be called by the Windows app via cURL. It will say:
"Hey, dump the data for this table in to a txt file, then let me download it."
What I am unsure of is: once the request to dump the data is complete, the Windows app will want the file right away. How do I say back to it, "Wait until the file is complete, and then you can download it"?
I was thinking of writing to a dummy file first and then renaming it to .txt, so the Windows app essentially waits in a loop (with a timeout) until the .txt file appears. Is this a good way to approach this?
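To illustrate, the dump script I have in mind would write to a temporary name and only rename it to .txt once the dump is finished, roughly like this (paths and the table name are placeholders):

    <?php
    // Sketch of the dump endpoint: write the export to a temporary file and
    // rename it to .txt only when complete, so a half-written file is never
    // visible under the final name. Paths and table names are placeholders.
    $pdo = new PDO('mysql:host=localhost;dbname=webdb', 'db_user', 'secret');

    $table   = 'customers';                  // table requested by the Windows app
    $tmpPath = "/var/www/exports/$table.tmp";
    $txtPath = "/var/www/exports/$table.txt";

    $out = fopen($tmpPath, 'w');
    foreach ($pdo->query("SELECT * FROM `$table`", PDO::FETCH_NUM) as $row) {
        fputcsv($out, $row, "\t");           // tab-separated lines in the .txt
    }
    fclose($out);

    rename($tmpPath, $txtPath);              // atomic on the same filesystem

The Windows app would then poll for the .txt file (with its timeout) and should never see a partially written dump, since the rename only happens after the write has finished.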
Thank you.
I have a simple PHP/HTML application (hosted on Heroku) which runs a survey. I would like to save the responses to this survey (only 4-5 questions) to a CSV file for later analysis. These responses are currently held in PHP variables, so a form submission is not possible at the moment.
What is the best way to do this? I'm assuming this would be easiest using JavaScript, but I am open to suggestions. Also, should I put the CSV file on a different server?
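Since the responses are already in PHP variables, one direction I've been considering is appending them to a CSV straight from PHP; the file path and example answers below are made up:

    <?php
    // Sketch: append the survey answers (already held in PHP variables) to a
    // CSV file. The path and the example answers are placeholders.
    $answers = ['yes', 'blue', '4', 'weekly', 'no'];   // the real PHP variables go here

    $fh = fopen('/app/storage/responses.csv', 'a');    // open in append mode
    if ($fh !== false) {
        flock($fh, LOCK_EX);                           // avoid interleaved writes
        fputcsv($fh, array_merge([date('c')], $answers));  // timestamp + answers
        flock($fh, LOCK_UN);
        fclose($fh);
    }

One caveat: Heroku's dyno filesystem is ephemeral, so a CSV written this way will not survive a restart or redeploy, which is probably the strongest argument for writing the rows to a different server or a database instead.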
We're working on a feature to allow our users to import their own customer/marketing database into our system from a CSV file they upload to our servers.
We're using PHP on Ubuntu 10.04 on Amazon EC2 backed by MySQL on Amazon RDS.
What we've currently got is a script that uses LOAD DATA LOCAL INFILE but it's somewhat slow, and will be very slow when real users start uploading CSV files with 100,000+ rows.
We do have an automation server that runs several tasks in the background to support our application, so maybe this is something that's handed over to that server (or group of servers)?
So a user would upload a CSV file, we'd stick it in an S3 bucket, and either drop a row in a database somewhere linking that file to the user, or use SQS or something to let the automation server know to import it. Then we'd just tell the user their records are importing and will show up gradually over the next few minutes/hours?
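To make the "drop a row in a database" option concrete, the upload handler we're picturing is little more than this; the table, columns, and S3 key layout are invented for the example:

    <?php
    // Sketch of the upload side: record the uploaded CSV in a pending_imports
    // table so the automation server can pick it up later. Table, column, and
    // path names are invented for illustration.
    $pdo = new PDO('mysql:host=rds.example.com;dbname=app', 'app_user', 'secret');

    $userId = 42;                                        // the logged-in user
    $s3Key  = "imports/$userId/" . basename($_FILES['csv']['name']);

    // ... upload $_FILES['csv']['tmp_name'] to S3 under $s3Key here ...

    $stmt = $pdo->prepare(
        "INSERT INTO pending_imports (user_id, s3_key, status, created_at)
         VALUES (?, ?, 'pending', NOW())"
    );
    $stmt->execute([$userId, $s3Key]);

    echo 'Your records are importing and will appear over the next few minutes.';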
Has anybody else had any experience with this? Is my logic right, or should we be looking in an entirely different direction?
Thanks in advance.
My company does exactly that, via cron.
We allow the user to upload a CSV, which is then sent to a directory to wait. A cron running every 5 minutes checks a database entry that is made on upload, which records the user, file, date/time, etc. If a file that has not been parsed is found in the DB, it accesses the file based on the filename, checks to ensure the data is valid, runs USPS address verification, and finally puts it in the main user database.
We have similar functions set up to send large batches of emails, model abstractions of user cross-sections, etc. All in all, it works quite well. Three servers can adequately handle millions of records, with tens of thousands being loaded per import.
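A stripped-down sketch of that kind of worker; the table, column, and directory names here are illustrative rather than our actual schema:

    <?php
    // Sketch of the cron worker: find uploads that have not been parsed yet,
    // import their rows, and mark them done. Names are illustrative only.
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'app_user', 'secret',
                   [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);

    $pending = $pdo->query("SELECT id, filename, user_id FROM uploads WHERE parsed = 0");
    $insert  = $pdo->prepare(
        'INSERT INTO customers (user_id, name, email, address) VALUES (?, ?, ?, ?)'
    );
    $done    = $pdo->prepare('UPDATE uploads SET parsed = 1 WHERE id = ?');

    foreach ($pending->fetchAll(PDO::FETCH_ASSOC) as $upload) {
        $fh = fopen('/var/uploads/' . $upload['filename'], 'r');
        while (($row = fgetcsv($fh)) !== false) {
            // validation and address verification would happen here
            $insert->execute([$upload['user_id'], $row[0], $row[1], $row[2]]);
        }
        fclose($fh);
        $done->execute([$upload['id']]);
    }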
I have an iOS app that allows users to update the cover charges at local bars. The data is then displayed in the app for other users to see. The updates are made by sending a request to a PHP script, and the script then updates an XML file. What will happen if a user tries to read the XML while another user is updating it, i.e. while the file is being rewritten with a new update?
Thanks!
Depending on timing, server speed, and their connection speed, a reader may get the old version, the new version, or even a partially written file. I agree with AMayer: once the file gets big, it's going to be hard on your server to download and upload the ENTIRE XML file again and again. I would just set up a MySQL database now and use it instead of the XML.
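If you do switch, the whole thing boils down to something like the sketch below; the table and column names are assumptions, and the update and read would live in separate endpoints:

    <?php
    // Sketch: store cover charges in MySQL instead of a shared XML file.
    // Table/column names are assumptions; update and read are shown together
    // here only for illustration.
    $pdo = new PDO('mysql:host=localhost;dbname=bars', 'app_user', 'secret',
                   [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);

    // update.php -- a user posts a new cover charge for a bar
    $stmt = $pdo->prepare('UPDATE bars SET cover_charge = ?, updated_at = NOW() WHERE id = ?');
    $stmt->execute([(float) ($_POST['cover'] ?? 0), (int) ($_POST['bar_id'] ?? 0)]);

    // charges.php -- the app fetches the current list for display
    $rows = $pdo->query('SELECT id, name, cover_charge FROM bars')
                ->fetchAll(PDO::FETCH_ASSOC);
    header('Content-Type: application/json');
    echo json_encode($rows);

Each UPDATE is applied as a whole, so a reader never sees a half-written value, which is exactly the guarantee the shared XML file can't give you.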