I have an application with a local SQLite database which stores records on the phone. Now I want to add functionality, and I have concluded that I need a remote database and server-side logic to make my application more interactive. I also don't want my users to lose data in this transition. After researching online, I learned that I can achieve this by writing a web service which will fetch records row by row and feed them into my MySQL database. However, I am confused about how to achieve this: do I convert the local database into XML file(s) and POST them to my PHP script, where I parse them and insert them into the MySQL database, or is there a better way, or perhaps an API for this? I just want a starting point, a heading. Thanks!
As you say, convert the local DB to XML or JSON and POST it to the server. Don't forget to gzip the payload during transfer.
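To make the idea concrete, here is a minimal sketch of the export-and-POST step. It is in Python rather than your app's own language, and the table name and URL are placeholders, not anything from your project; the server side would be the PHP script that un-gzips and parses the payload.

```python
import gzip
import json
import sqlite3
import urllib.request

def export_table_as_gzipped_json(conn, table):
    """Dump every row of `table` as a gzipped JSON payload.

    `table` must be a trusted name (it is interpolated into the SQL)."""
    conn.row_factory = sqlite3.Row
    rows = [dict(r) for r in conn.execute(f"SELECT * FROM {table}")]
    return gzip.compress(json.dumps(rows).encode("utf-8"))

def post_payload(url, payload):
    """POST the compressed payload; the server un-gzips, parses, and
    inserts the rows into MySQL."""
    req = urllib.request.Request(url, data=payload, headers={
        "Content-Type": "application/json",
        "Content-Encoding": "gzip",
    })
    return urllib.request.urlopen(req)
```

The JSON shape (a list of row objects) keeps the PHP side trivial: json_decode the un-gzipped body and loop over the rows.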
Alternatively, you can upload the DB file directly to the server. On Android, your DB file lives in /data/data/app-package-name/databases. You would then parse this DB file on the server.
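A sketch of the server-side half of that approach: open the uploaded SQLite file read-only and bulk-copy its rows into the destination connection. This is Python for illustration (your server is PHP, which has its own SQLite drivers); the table and column layout are hypothetical, and note that MySQL DB-API drivers use %s placeholders where SQLite uses ?.

```python
import sqlite3

def copy_table(src_db_path, table, dest_conn):
    """Read `table` from an uploaded SQLite file and bulk-insert its rows
    into `dest_conn` (any DB-API connection, e.g. a MySQL one).
    `table` must be a trusted name."""
    src = sqlite3.connect(f"file:{src_db_path}?mode=ro", uri=True)
    try:
        rows = src.execute(f"SELECT * FROM {table}").fetchall()
        if rows:
            placeholders = ",".join("?" for _ in rows[0])  # use %s for MySQL drivers
            dest_conn.executemany(
                f"INSERT INTO {table} VALUES ({placeholders})", rows)
            dest_conn.commit()
        return len(rows)
    finally:
        src.close()
```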
I need to run a process that will perform about 10,000 MySQL inserts into a GoogleSQL instance. Normally, I would use a LOAD DATA LOCAL INFILE query for this to avoid the script timing out, but my app runs in Google App Engine, which has a read-only filesystem. Normally, when I need my GAE app to write to the filesystem, I can just use file names prefixed with gs:// and the PHP code will read/write to/from Google Storage transparently.
However, I doubt that MySQL will understand a file path of gs://path/to/my/file.
Is there another way that I can make a dynamically generated local file available in a Google App Engine environment so that I can load it into my GoogleSQL instance?
Otherwise, I feel like I'm going to need to build a looping AJAX system to insert X rows at a time until it has worked through however many I need (10,000... 20,000, etc.).
I know that I can put multiple value sets into a single INSERT to speed things up, and I'm planning to do that, but with datasets as large as the ones I'm dealing with, that still won't be fast enough to avoid the timeouts consistently.
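The multi-value-set batching mentioned above can be sketched as follows. This is Python rather than the app's PHP, all names are hypothetical, and the placeholder style (? here, %s for MySQL drivers) depends on the driver; the point is that each statement stays bounded in size while each round-trip carries many rows.

```python
import sqlite3  # stand-in here for a MySQL DB-API connection

def insert_in_chunks(conn, table, columns, rows, chunk_size=500):
    """Insert `rows` using multi-row VALUES statements, `chunk_size` rows
    per statement, committing after each chunk so no single statement or
    transaction grows unbounded. `table`/`columns` must be trusted names."""
    col_list = ",".join(columns)
    row_ph = "(" + ",".join("?" for _ in columns) + ")"  # "%s" for MySQL drivers
    for start in range(0, len(rows), chunk_size):
        chunk = rows[start:start + chunk_size]
        sql = (f"INSERT INTO {table} ({col_list}) VALUES "
               + ",".join([row_ph] * len(chunk)))
        flat = [value for row in chunk for value in row]
        conn.execute(sql, flat)
        conn.commit()
```

Tuning chunk_size trades statement size against the number of round-trips; a few hundred rows per statement is usually a reasonable starting point.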
Sorry in advance for sounding like a novice but I'm very new to app development and hope someone can help me!
I'm trying to create an iOS app that will store data locally when offline (i.e. an email address) and once connectivity to the internet is available will persist that data across to a server.
First I created an SQLite database, which I did using the DB Browser tool, and everything for the most part is working the way it should. The app uses Core Data to persist to a SQLite DB.
Next I created a PHP file that would check for an internet connection, then select the SQLite DB and its data, and insert that data into a MySQL DB.
This is where I got stuck. Am I right in thinking that the data saved in the SQLite db when running on a device is saved in the device's document directory?
If this is true then how do I gain access to it via the php file for it to be persisted over to MySQL server?
Am I missing something? What is the correct way of persisting this data across from the SQLite db on the device to my MySQL server?
Many thanks in advance for any help.
It sounds like you want to store your data (the email address) in a cookie and post it to the server when online. You would not need an offline client DB for data as small as an email address.
You could do this with AJAX (asynchronous JavaScript), posting to a waiting PHP file.
PHP communicates with MySQL very well on a LAMP/WAMP (Linux/Windows-Apache-MySQL-PHP) type server (or the iOS version: IAMP?). With AJAX, you would $_GET or $_POST to the PHP file, and then write the data (after sanitizing) to the MySQL database with either MySQLi or PDO, preferably with a prepared statement.
To get the data back from the server, you would have another PHP file that could be loaded (again by AJAX) to query MySQL with a simple SELECT statement.
I am suggesting AJAX here because it would not require user interaction to post; you could set a timer to continually check for an internet connection and post when it comes online. You could also have a similar timer to sync with the server database. Keep in mind that timer-based AJAX polling will put a constant load on your server, so this may not be very scalable.
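The queue-locally-and-flush-when-online pattern behind this answer can be sketched as below. It is Python for illustration (the real client would be Swift posting to the PHP endpoint); every name is hypothetical, and `send` stands for whatever call actually POSTs one record to the server.

```python
import sqlite3

class OutboxQueue:
    """Queue records locally while offline; flush them once a connection
    is available. `send` is any callable that posts one record to the
    server and returns True on success."""

    def __init__(self, path=":memory:"):
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS outbox "
            "(id INTEGER PRIMARY KEY, payload TEXT)")

    def enqueue(self, payload):
        self.conn.execute("INSERT INTO outbox (payload) VALUES (?)", (payload,))
        self.conn.commit()

    def flush(self, send):
        """Try to send queued records in order; stop at the first failure
        so the remainder is retried on the next timer tick."""
        sent = 0
        for row_id, payload in self.conn.execute(
                "SELECT id, payload FROM outbox ORDER BY id").fetchall():
            if not send(payload):
                break
            self.conn.execute("DELETE FROM outbox WHERE id = ?", (row_id,))
            self.conn.commit()
            sent += 1
        return sent
```

Deleting a record only after a successful send is what makes the sync safe against the connection dropping mid-flush.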
My goal is to create a way to fill a PDF form from a database. I have an inventory of items that I want to track, and I would like to do so by outputting to an existing PDF form. I would like to use a database that can easily be edited through a GUI like Access (which is installed on the computers at work). I am on work computers and am not sure if I will be able to run executables on them (like trying to use Java with iText).
That sums up my goals and issues. I am very new to programming and was thinking I could do this with a webpage: store the database and the PDF form in the same location, and use the webpage to make queries that fill the PDF form, while using Access to edit the inventory database.
It appears PHP would be the best way to do this, but PHP requires a server to run its code. So I was thinking I could host the PHP code on a server and access the database at work via the DB's shared drive location. Since I am new to databases, I am not sure if that is possible or if I would need to have the database hosted on a server.
Does anyone have any recommendations on how I could accomplish this? I was thinking about just making it all server-based, but I signed up for 1and1 hosting to experiment with, and its MySQL database cannot be accessed remotely. I guess this is not a big problem, but it would require me to develop my own GUI for DB editing.
I am not necessarily looking for code examples, just big picture ideas on how to accomplish this.
I need to weekly sync a large (3GB+ / 40+ tables) local MySQL database to a server database.
The two databases are exactly the same. The local DB is constantly updated, and every week or so the server DB needs to be updated with the local data. You could call it a 'mirrored DB' or 'master/master', but I'm not sure if that is the correct term.
Right now the DB only exist locally. So:
1) First I need to copy the DB from local to server. Export/import with phpMyAdmin is impossible because of the DB size and phpMyAdmin's limits. Exporting the DB to a gzipped file and uploading it through FTP will probably break in the middle of the transfer because of connection problems or the server's file-size limit. Exporting each table separately would be a pain, and each table will also be very big. So, what is the better solution for this?
2) After the local DB is fully uploaded to the server, I need to update the server DB weekly. What is the best way to do it?
I have never worked with this kind of scenario, I don't know the different ways of achieving it, and I'm not particularly strong with SQL, so please explain as clearly as possible.
Thank you very much.
This article should get you started.
Basically, get Maatkit and use the sync tools in there to perform a master-master synchronization:
mk-table-sync --synctomaster h=serverName,D=databaseName,t=tableName
You can use a data-comparison tool for MySQL.
Create a synchronization template that specifies which tables and which data to synchronize.
Schedule a weekly run of that template.
I have two servers synchronized daily with dbForge Data Comparer via the command line.
I want to connect Excel to my web site as an external data source, and thereby run reports on the data fetched from my website. I have heard this should be possible, and Excel seems to support it, but I have little knowledge of how I should actually build the backend on my PHP server to serve the data. How do I do it?
I am well aware of the fact of being able to create and read Excel files on a PHP server, but that's not what I am after.
Excel supports IQY (Internet Query) files.
You may define one in Excel pointing at your web server and get the data right into Excel.
You may serve formatted HTML tables from your server; colors will be preserved.
The IQY format is described at:
http://support.microsoft.com/kb/157482
It supports both POST/GET and parameters.
regards
//t
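For reference, a minimal .iqy file is just a few lines of plain text: a type line, a version line, and the URL to fetch. The URL below is a placeholder, and the bracketed token is the IQY dynamic-parameter syntax (Excel prompts the user for the value); the server endpoint name is hypothetical.

```text
WEB
1
http://example.com/report.php?dept=["dept","Which department?"]
```

Save that with an .iqy extension and open it in Excel; Excel will request the URL and import any HTML tables in the response into the worksheet.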