Can Excel read and write to a phpMyAdmin database? - php

I am developing an application in Excel in which data can be updated remotely, i.e., data is read from an online database and loaded into Excel on startup. I know this can be done by reading HTML tables from a website, but I was wondering whether it could be done through phpMyAdmin directly.

Yes, it can (surprisingly).
Though I strongly recommend exposing your MySQL database as an ODBC data source instead.
In Excel, create a new spreadsheet, go to the Data tab, and under "External Data" choose "From Web Query". Follow the instructions: enter the address of your phpMyAdmin installation, log in, navigate to the data, then select the table. Excel will remember your login details and cookies for future data refreshes, though it can be a bit wonky since the feature hasn't really been updated in over a decade (source: I asked some members of the Excel dev team a few weeks ago myself).

phpMyAdmin is just a GUI built on PHP functions that connect to a MySQL database. You would have to communicate with the database directly, or via a PHP script that does so for you.
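As a rough illustration of that second option, here is a minimal sketch of a PHP endpoint that dumps a table as CSV, which Excel could then fetch as a web query or a downloaded file. The host, credentials, and the customers table are all made-up placeholders:

```php
<?php
// Hypothetical PHP endpoint that dumps a MySQL table as CSV for Excel
// to fetch. Host, credentials, and the 'customers' table are placeholders.
$db = new mysqli('localhost', 'user', 'password', 'mydb');
if ($db->connect_error) {
    die('Connection failed: ' . $db->connect_error);
}

header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="export.csv"');

$out    = fopen('php://output', 'w');
$result = $db->query('SELECT id, name, email FROM customers');
while ($row = $result->fetch_assoc()) {
    fputcsv($out, $row); // one CSV line per database row
}
fclose($out);
$db->close();
```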

You can do that. It's quite simple.
Say you have a table with the following fields:
1) ID 2) Name 3) Email
Now create a CSV with three columns containing the data, e.g.
1,Rushit,Rushitdawda@gmail.com
2,Test,test@gmail.com
Now go to phpMyAdmin, create the table, then go to Import, upload the file, and you are done.
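If you'd rather script the import than click through phpMyAdmin, here is a hedged PHP sketch of the same idea using LOAD DATA LOCAL INFILE; the credentials, table name, and CSV path are assumptions:

```php
<?php
// Hypothetical script doing the same import without the phpMyAdmin UI.
// Credentials, table name, and CSV path are assumptions; LOAD DATA LOCAL
// INFILE also requires local_infile to be enabled on client and server.
$db = new mysqli('localhost', 'user', 'password', 'mydb');

$db->query('CREATE TABLE IF NOT EXISTS users (
    id INT PRIMARY KEY,
    name VARCHAR(100),
    email VARCHAR(100)
)');

$db->query("LOAD DATA LOCAL INFILE '/path/to/users.csv'
            INTO TABLE users
            FIELDS TERMINATED BY ','
            LINES TERMINATED BY '\\n'
            (id, name, email)");

$db->close();
```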

Related

Best method to accept multiple external data feeds

I'm in the middle of a project that I am working on: a classified website for lighting distributors, built in PHP. I would like to accept a CSV data feed from each distributor and import the data about three times a week. Each data feed would be hosted on the distributor's own website, and I would import the data into the classified site's MySQL database from the external link the distributor provides.
What would be the best method to import multiple data feeds from multiple distributors? I'm sorry to have to post this question, but I am desperate; I have searched the net for answers and came up empty.
Would it be best to create a cron job that calls a script to import each feed? Obviously I would first run each data feed against a test database to make sure all the data in the CSV file lands in the correct location.
Would I have to use the test database each and every time I import the data? And what would be the best way to protect my database if, for some reason, a distributor changes its feed?
Any information would be greatly appreciated. Thank you in advance for your help.
Welcome to the wonderful world of ETL. While this question is a little too broad for SO, here's how I would go about it (from a high level):
1. Create a script to import the CSV to your local file system.
2. Import the data from your local file system into a "stage" table in your database.
3. Check whatever you want to check (did it load without error, does the stage table look correct, etc.).
4. Assuming everything checks out, drop and reload (or upsert, or whatever) from your stage table to the live table. Consider adding a new field to your live tables that holds the timestamp of when each record was last loaded.
5. Consider archiving the flat file on your local system for preservation's sake.
6. Make a cron job to run the script that does the above steps (a rough sketch follows).
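Here is a rough PHP sketch of steps 1-5 for a single feed; the feed URL, table names, and column list are all invented for illustration:

```php
<?php
// Rough sketch of the steps above for one feed. Feed URL, tables, and
// columns are assumptions, not the asker's actual schema.
$feedUrl  = 'http://distributor.example.com/feed.csv';
$localCsv = '/var/feeds/distributor1-' . date('Ymd-His') . '.csv';

// Step 1: copy the CSV to the local file system (doubles as the archive
// copy from step 5).
file_put_contents($localCsv, file_get_contents($feedUrl));

$db = new mysqli('localhost', 'user', 'password', 'classifieds');

// Step 2: load into a stage table, never straight into the live table.
$db->query('TRUNCATE TABLE listings_stage');
$handle = fopen($localCsv, 'r');
$stmt = $db->prepare(
    'INSERT INTO listings_stage (sku, title, price) VALUES (?, ?, ?)'
);
while (($row = fgetcsv($handle)) !== false) {
    $stmt->bind_param('ssd', $row[0], $row[1], $row[2]);
    $stmt->execute();
}
fclose($handle);

// Step 3: basic sanity check before touching live data.
$count = $db->query('SELECT COUNT(*) AS c FROM listings_stage')
            ->fetch_assoc()['c'];
if ($count == 0) {
    die("Stage table is empty - aborting, live data left untouched.\n");
}

// Step 4: drop-and-reload the live table, stamping each record.
$db->query('TRUNCATE TABLE listings');
$db->query('INSERT INTO listings (sku, title, price, loaded_at)
            SELECT sku, title, price, NOW() FROM listings_stage');
$db->close();
```

For step 6, the cron entry could be something like `0 6 * * 1,3,5 php /path/to/import_feed.php` to run the script three times a week.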

Difference between MySQL and SQLite in simple terms - Database Solution

I've seen many questions posted, but not these two compared exactly. I also tried reading the Wikipedia definitions, but I don't have enough technical understanding to know exactly what they are saying.
I am looking for a database solution where the user of a website can enter customized data into the columns of an Order.tbl and link it to their account in the Account.tbl.
Scenario:
I make customized shirts. The user would need to log in to their account and order customized details, such as pictures and personalized text, which could differ from user to user. It would need to be dynamic in this way, versus a static selection of shirts.
The software I use to make the shirts can connect to an MS Access database, an SQLite database, a MySQL database, and a few other flat-file formats like Excel and CSV.
Currently my software subscription covers Access and SQLite, and I was informed that Access would not work in my case because I am using web forms. That leaves me with SQLite, or upgrading to handle MySQL.
The only thing I really understand is that MySQL is more robust than SQLite, but I am unsure whether I actually need it or not.
Any suggestions for a beginner?

How do I export my MongoDB collection into a table on my website?

I want to create a very simple table that lists all the data in a MongoDB database.
The database is hosted locally and updated every minute with information scraped by Scrapy.
There are two pieces of data that will populate this table; apart from the "_id" element, they are the only fields in the database.
Because new data will be added frequently but irregularly, I was thinking the data should be pulled only when the website is loaded.
Currently the webpage is nothing more than an HTML file on my computer, and I'm still in the process of learning how to host it. I'd like to have the database accessible before making the website available, since making this information available is its primary function.
Should I write a php script to pull the data?
Is there a program that already does this?
Do you know of any good tutorials that would be able to break the process down step-by-step?
If you are just looking to export the data into a file (like a CSV), you could try this:
http://blogs.lessthandot.com/index.php/datamgmt/dbprogramming/mongodb-exporting-data-into-files/
The CSV may be more useful if you are planning to analyze the data offline.
Otherwise, you could write a script in PHP or Node.js that connects to the database, finds all the records, and displays them.
The Node function you would want is called find:
http://mongodb.github.io/node-mongodb-native/api-generated/collection.html#find
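For the PHP route, a minimal sketch using the legacy Mongo PHP driver (ext/mongo, the usual choice at the time) might look like the following; the database, collection, and field names are assumptions:

```php
<?php
// Hypothetical page rendering a Mongo collection as an HTML table.
// 'scrapydb', 'items', and the two field names are placeholders.
$client     = new MongoClient('mongodb://localhost:27017');
$collection = $client->scrapydb->items;

echo "<table border='1'>\n";
echo "<tr><th>Field One</th><th>Field Two</th></tr>\n";

// find() with no arguments returns a cursor over every document.
foreach ($collection->find() as $doc) {
    echo '<tr><td>' . htmlspecialchars($doc['field_one']) . '</td>'
       . '<td>' . htmlspecialchars($doc['field_two']) . '</td></tr>' . "\n";
}
echo "</table>\n";
```

Since the data only changes about once a minute, regenerating the table on each page load (as sketched here) is perfectly adequate.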

How to update a whole database without losing data

I am using a MySQL database for my site.
I built the site using the CodeIgniter PHP framework and MySQL.
Now, after a few months, I have updated the site and also the database.
I have added some new columns to my database tables; I have not deleted or altered any existing ones.
Now I want to update my live site, but I don't want to lose the data I already have.
Is there any way I can update the database without losing the data present in it?
If your database is created/updated automatically by some modeling tool, I think the best thing you can do is to understand those changes and write the ALTER TABLE statements yourself, then run them during your deployment.
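For example, if the update only added columns, the deployment script can be as small as this sketch (the table and column names are made up for illustration):

```php
<?php
// Tiny hypothetical deployment script applying hand-written schema
// changes. Table and column names are placeholders, not the real schema.
$db = new mysqli('localhost', 'user', 'password', 'mysite');

// ALTER TABLE ... ADD COLUMN never touches existing rows,
// so no data is lost.
$db->query("ALTER TABLE users ADD COLUMN twitter_handle VARCHAR(64) NULL");
$db->query("ALTER TABLE orders ADD COLUMN tracking_code VARCHAR(32) NULL
            AFTER status");

$db->close();
```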
As user1281385 pointed out in https://stackoverflow.com/a/22012478/1033539, there may be tools that can help you generate those statements.
SQLyog has a great feature called "Schema Synchronization Tool" which will do it.
They also have a blog post comparing other methods of doing it:
http://blog.webyog.com/2012/10/16/so-how-do-you-sync-your-database-schema/
I'm sure other similar tools can do it too.
Edit:
MySQL Workbench also has this feature:
http://www.mysql.com/products/workbench/features.html

SQL/PHP: How to upload a big database to a server when I have an import file size limit? And how to update it afterwards?

I'm building a big database locally using MySQL and phpMyAdmin, and I'm constantly adding a lot of data to it. I now have more than 10MB of data and want to export the database to the server, but my web host has a 10MB file size limit in the Import section of its phpMyAdmin.
So, the first question is: how can I split the data, or do something similar, to be able to import it?
BUT, because I'm constantly adding new data locally, I also need to push that new data to the web host's database.
So the second question is: how do I update the remote database when the newly added data is interleaved with the 'old/already uploaded' data?
Don't use phpMyAdmin to import large files. You'll be way better off using the mysql CLI to import a dump of your DB. Importing is very easy: transfer the SQL file to the server, then execute the following on the server (you can launch this command from a PHP script using shell_exec or system if needed): mysql --user=user --password=password database < database_dump.sql. Of course the database has to exist, and the user you provide should have the necessary privileges to update it.
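If you do need to kick the import off from PHP, a minimal sketch might look like this; the credentials, database name, and dump path are placeholders:

```php
<?php
// Hypothetical wrapper launching the mysql CLI import from PHP.
// All values passed to sprintf below are placeholders.
$cmd = sprintf(
    'mysql --user=%s --password=%s %s < %s 2>&1',
    escapeshellarg('user'),
    escapeshellarg('password'),
    escapeshellarg('database'),
    escapeshellarg('/path/to/database_dump.sql')
);
$output = shell_exec($cmd); // capture stdout/stderr for troubleshooting
echo $output;
```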
As for syncing changes: that can be very difficult, and it depends on a lot of factors. Are you the only party providing new information, or are others adding new records as well? Are you going to modify the table structure over time as well?
If you're the only one adding data and the table structure doesn't vary, then you could use a boolean flag or a timestamp to determine which records need to be transferred. Based on that field you could create partial dumps with phpMyAdmin (by writing a SQL command and clicking Export at the bottom, making sure you only export the data) and import them as described above.
BTW, you could also look into setting up a master-slave scenario with MySQL, where your data is transferred automatically to the other server (just another option, which might be better depending on your specific needs). For more information, refer to the Replication chapter in the MySQL manual.
What I would do, in 3 steps:
Step 1:
Export your DB structure, without content. This is easy to manage from phpMyAdmin's Export page. After that, I'd insert it into the new DB.
Step 2:
Add a new BOOL column to every table in your local DB. Its purpose is to record whether a row is new or not, so set its default to true.
Step 3:
Create a PHP script which connects to both databases. The script needs to get the data from your local database and put it into the new one.
I would do this with the following MySQL statements: http://dev.mysql.com/doc/refman/5.0/en/show-tables.html, http://dev.mysql.com/doc/refman/5.0/en/describe.html, plus SELECT, UPDATE and INSERT.
Then you have to run the script every time you want to sync your local PC with the server. A sketch of such a script follows.
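Here is a hedged sketch of that sync script, using the BOOL flag from step 2; the hosts, credentials, and the articles table are placeholders:

```php
<?php
// Hypothetical sync script: copy rows flagged as new from the local DB
// to the server DB, then clear the flag. Hosts, credentials, and the
// 'articles' table/columns are assumptions.
$local  = new mysqli('localhost', 'user', 'password', 'mydb_local');
$remote = new mysqli('remote.example.com', 'user', 'password', 'mydb');

// Fetch only the rows the BOOL column marks as not yet uploaded.
$rows = $local->query('SELECT id, title, body FROM articles WHERE is_new = 1');

$insert = $remote->prepare(
    'INSERT INTO articles (id, title, body) VALUES (?, ?, ?)'
);
$clear = $local->prepare('UPDATE articles SET is_new = 0 WHERE id = ?');

while ($row = $rows->fetch_assoc()) {
    $insert->bind_param('iss', $row['id'], $row['title'], $row['body']);
    if ($insert->execute()) {
        // Only clear the flag once the row has made it to the server.
        $clear->bind_param('i', $row['id']);
        $clear->execute();
    }
}

$local->close();
$remote->close();
```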
