Hello, I'm trying to set up an automated cron import to PrestaShop using an Excel Power Query table. The problem is that the table is edited on my computer and fetches data from an external source. I need to write a PHP script that refreshes the xlsx file before sending it to the import script.
Currently the steps are: open Excel, press Refresh All, save, upload to FTP, run the import.
I want to automate these steps with PHP.
Thank you.
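PHP on the server can't press Refresh All for you; the refresh has to happen in Excel on the Windows machine where the workbook lives, and the rest of the chain can then run unattended from Task Scheduler. A rough sketch of the whole pipeline in Python, splitting command building from execution so it can be inspected before running; the paths, FTP URL, and import URL are placeholders, and the same COM calls could equally be made from a PHP script with the COM extension:

```python
import subprocess

def build_pipeline(xlsx_path, ftp_url, import_url):
    """Return the three commands: refresh+save in Excel, upload, trigger import."""
    # Steps 1-2: drive Excel via COM from PowerShell: open, RefreshAll,
    # wait for the async Power Query refresh to finish, save, quit.
    refresh = [
        "powershell", "-Command",
        f"$xl = New-Object -ComObject Excel.Application; "
        f"$wb = $xl.Workbooks.Open('{xlsx_path}'); "
        "$wb.RefreshAll(); "
        "$xl.CalculateUntilAsyncQueriesDone(); "
        "$wb.Save(); $xl.Quit()",
    ]
    # Step 3: upload the refreshed workbook to the shop's FTP folder.
    upload = ["curl", "-T", xlsx_path, ftp_url]
    # Step 4: hit the import script so the shop processes the new file.
    trigger = ["curl", import_url]
    return [refresh, upload, trigger]

def run_pipeline(commands):
    for cmd in commands:
        subprocess.run(cmd, check=True)  # stop the chain if any step fails
```

Scheduling `run_pipeline(build_pipeline(...))` every few minutes on the Windows box replaces the manual open/refresh/save/upload routine.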
I see that we can import JSON files into Firebase.
What I would like to know is whether there is a way to import CSV files (I have files that could have 50K or more records, with about 10 columns).
Does it even make sense to store such files in Firebase?
I can't answer whether it makes sense to have such files in Firebase; only you can answer that.
I also had to upload CSV files to Firebase, and I ended up transforming my CSV into JSON and using firebase-import to add the JSON to Firebase.
There are a lot of CSV-to-JSON converters (even online ones). You can pick the one you like most (I personally used node-csvtojson).
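The conversion itself is tiny in any language; for instance, a sketch in Python using only the standard library (the sample columns are made up):

```python
import csv
import io
import json

def csv_to_json(csv_text):
    """Parse CSV text (first row = headers) into a JSON array string."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return json.dumps(rows)

sample = "name,age\nAda,36\nAlan,41\n"
print(csv_to_json(sample))
```

The resulting JSON file can then be fed to firebase-import as-is.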
I've uploaded many tab-separated files (40 MB each) into Firebase.
Here are the steps:
I wrote Java code to translate each TSV into a JSON file.
I used firebase-import to upload them. To install it, just type in cmd:
npm install firebase-import
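The translation step doesn't have to be Java. A sketch of the same idea in Python, keying each record by one column so the output is a single JSON object of the shape firebase-import pushes under a database path (the field names used in the example are hypothetical):

```python
import csv
import io
import json

def tsv_to_firebase_json(tsv_text, key_field):
    """Turn TSV rows (first row = headers) into one JSON object
    keyed by key_field, suitable for firebase-import."""
    reader = csv.DictReader(io.StringIO(tsv_text), delimiter="\t")
    return json.dumps({row[key_field]: row for row in reader})
```

Writing that output to a file and running `firebase-import --database_url ... --path /records --json records.json` then loads it in chunks.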
One trick I used on top of all the ones already mentioned is to synchronize a Google spreadsheet with Firebase.
You create a script that uploads directly to the Firebase DB based on the rows/columns. It worked quite well, and it can be more visual for fine-tuning the raw data compared to working with the CSV/JSON format directly.
Ref: https://www.sohamkamani.com/blog/2017/03/09/sync-data-between-google-sheets-and-firebase/
Here is the fastest way to import your CSV to Firestore:
Create an account in Jet Admin
Connect Firebase as a DataSource
Import CSV to Firestore
Ref:
https://blog.jetadmin.io/how-to-import-csv-to-firestore-database-without-code/
I'm generating a new CSV file approximately every 2 minutes on my machine through a local application that I've written, and I need each file to update my database as soon as it is generated. I've successfully done this locally via a scheduled, repeating bat file, and now I need to move this process online so my website has access to the data in as similar a time frame as possible.
I'm totally green on MySQL and learning it as I go, so I was wondering if there is anything I should be concerned about, or any best practices I should follow, for this task.
Can I connect directly to my server-side database from my cmd window (bat file) and send the data once the process has run and generated the CSV file? Or do I need to upload the file via FTP/PHP to my webserver and import it into the database there?
Any help/thoughts would be greatly appreciated.
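If the host allows remote MySQL connections, the bat file can indeed talk to the server-side database directly: the mysql command-line client will run LOAD DATA LOCAL INFILE against the remote host, so the scheduled task only has to shell out to it. A sketch in Python of composing that call (host, credentials, and table name are placeholders; the resulting command can be pasted straight into the bat file):

```python
def build_import_command(csv_path, host, user, db, table):
    """Compose a mysql CLI call that bulk-loads one CSV into a table."""
    sql = (
        f"LOAD DATA LOCAL INFILE '{csv_path}' "
        f"INTO TABLE {table} "
        "FIELDS TERMINATED BY ',' "
        "LINES TERMINATED BY '\\n' "
        "IGNORE 1 LINES"  # skip the CSV header row
    )
    # --local-infile must be enabled on both the client and the server.
    return ["mysql", "--local-infile=1", "-h", host, "-u", user, db, "-e", sql]
```

If remote connections are blocked, the FTP-then-PHP route described in the question is the fallback.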
I am trying to set up the following interface between websites:
1) Other websites will upload XML files to a specific folder on my webserver via FTP
2) My website will scan the folder for new files and import them immediately after they have been uploaded
2a) Ideally the scan is triggered right after a new file is uploaded via FTP, but I do not know how to do this. Is this possible?
2b) Alternatively, I could use cron to scan the folder every second, but I think this would use a lot of resources.
3) After an XML file has been detected, it will be automatically imported into the MySQL database.
I just can't find any help on triggering a scan when a new file arrives. I am using PHP, phpMyAdmin, and Drupal. Are these tools sufficient, or do I need something else?
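FTP has no built-in upload notification, so polling is the usual answer, and a scan every minute or so is cheap if it only looks at files it hasn't processed yet and skips files that may still be mid-upload. A sketch of that scan in Python (the .xml filter and the 30-second age threshold are arbitrary choices; the same logic ports directly to PHP):

```python
import os
import time

def find_new_files(folder, seen, min_age_seconds=30):
    """Return .xml files not yet processed and not modified recently
    (a recent mtime may mean the FTP upload is still in progress)."""
    now = time.time()
    ready = []
    for name in sorted(os.listdir(folder)):
        path = os.path.join(folder, name)
        if not name.endswith(".xml") or name in seen:
            continue
        if now - os.path.getmtime(path) < min_age_seconds:
            continue  # possibly still uploading; pick it up next run
        ready.append(path)
        seen.add(name)
    return ready
```

Each file returned would then be parsed and inserted into MySQL, and the `seen` set (or a moved-to-processed-folder convention) keeps a file from being imported twice.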
You could instead use their feeds; the import and update will then be done at every cron run!
I'm working on a site that is required to read Excel cell data using PHP, and I've successfully done that using PHPExcel. As the project progressed, my client asked me to make a macro button in the Excel file triggerable from the site, so he can run it without signing in to the server.
Is this possible? Can anyone give me an idea regarding this task?
Form data (such as buttons) and macros are not supported by PHPExcel.
To access buttons and execute macros, you'd need a solution based on MS Excel itself (I don't believe Open/LibreOffice supports MS Excel macros or VBA scripts), which limits you to COM and a Windows platform.
I know there are similar questions to this, but I never found a solution that matches my case.
My problem is that I have a large file, about 130 MB, with a .txt extension.
I want to load this file into a MySQL database.
The problem is that the upload times out when I use phpMyAdmin.
Is there a good way to upload this file using PHP?
Or is there any other way besides those?
Access your server via a console (SSH, Telnet, etc.) and import the file using the native CLI client's LOAD DATA syntax:
http://dev.mysql.com/doc/refman/5.1/en/load-data.html
Edit: updating answer based on comments.
Since you can't access MySQL via a CLI, I would suggest uploading the text file via FTP, then writing a quick PHP script that imports the file via a simple DB connect + INSERT statements.
Also use set_time_limit(0) to ensure the script doesn't time out while executing the queries.
You'll also need to make sure you have enough RAM available to load the file.
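The RAM concern disappears if the import script streams the file instead of reading it whole: read a batch of lines, run one multi-row INSERT, and repeat. A sketch of the batching half of that in Python (the batch size is arbitrary, and the INSERT itself is left out; the same loop works with PHP's fgets):

```python
def batched_lines(path, batch_size=5000):
    """Yield lists of stripped lines so the whole file is never in memory;
    each batch would become one multi-row INSERT."""
    batch = []
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            line = line.rstrip("\n")
            if not line:
                continue  # skip blank lines
            batch.append(line)
            if len(batch) >= batch_size:
                yield batch
                batch = []
    if batch:
        yield batch  # flush the final partial batch
```

Batching also keeps each query short, which matters less once set_time_limit(0) is in place but is gentler on the server than 130 MB of single-row INSERTs.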