Ok, so I'm in the starting stage of a new project where I have an Apache web server with PHP and a MySQL database.
The main aim of this project is to show the data in this MySQL database in real time on the web page. The problem I have is that I am not allowed to install any new software on the server, so I cannot use Node.js or socket.io.
I've been looking at the PHP long polling possibility, but I'm curious if anyone out there has managed to pull off something similar without grinding their server to a halt due to too many threads being used.
I've heard about Comet, but I'm not sure how that would work; from what I've read it seems to just watch flat files, not databases.
Thanks for any help.
This is easily achievable with jQuery and PHP: create a PHP file that echoes JSON-encoded data in return. Usage can be found here: jQuery post
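A minimal sketch of that PHP side, assuming a hypothetical readings table and placeholder credentials; the page would then poll it with jQuery's $.post() every few seconds:

<?php
// data.php - returns recent rows as JSON so the page can poll it.
// Table name, columns and credentials below are assumptions for illustration.
header('Content-Type: application/json');

$db = new mysqli('localhost', 'dbuser', 'dbpass', 'mydb');
if ($db->connect_error) {
    http_response_code(500);
    echo json_encode(array('error' => 'database connection failed'));
    exit;
}

// Only return rows newer than the last id the client already has.
$sinceId = isset($_POST['since_id']) ? (int) $_POST['since_id'] : 0;

$stmt = $db->prepare('SELECT id, label, value, updated_at FROM readings WHERE id > ? ORDER BY id');
$stmt->bind_param('i', $sinceId);
$stmt->execute();

echo json_encode($stmt->get_result()->fetch_all(MYSQLI_ASSOC));

On the client, $.post('data.php', {since_id: lastId}, callback, 'json') inside a setInterval gives simple short polling. A long-polling variant would instead loop server-side with sleep() until new rows appear, but that ties up an Apache worker for every waiting client, which is exactly the "grinding to a halt" concern mentioned above.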
I am a newbie and hence need some advice.
My problem is similar to this question, but I haven't been able to resolve it yet.
Problem:
I am processing data internally and generating 8-10 tables. I want to replicate those tables (in 2 different schemas) to a remote server automatically and continuously, every 15 minutes.
I went toward an AWS solution using EC2, DMS and RDS, but got stuck there and couldn't resolve the problem after spending two days (here is my other question if it helps to understand the background).
Proposed Solution:
By doing research and reading this post, this post, this post and this post, I have come up with a different solution:
1. Automatically dump and FTP the CSV file(s) to the remote web server/cPanel every 15 minutes using PHP and Windows Task Scheduler.
2. Automatically read those CSV file(s) and update records in the remote DB using a PHP script and some sort of task scheduler on the web server (is that possible?).
Question/Advice:
Is my above approach correct or do I need to find another or better solution to do this? If this approach is correct then any kind of related help would be highly appreciated.
Please note:
I couldn't find any solution after spending hours on research on and off S.O.
I'm no natural-born coder; I find solutions to what I need to achieve.
I think you should do the first part as you mentioned:
1. Automatically dump and FTP the CSV file(s) to the remote web server/cPanel every 15 minutes using PHP and Windows Task Scheduler.
2. Automatically read those CSV file(s) and update records in the remote DB using a PHP script and some sort of task scheduler on the web server.
After that, since you mentioned it is cPanel, you can set up a cron job to run the PHP file from your step 2. Set up the PHP file to send an email once the database is updated, for your records. The email should cover two outcomes: one message if there was an error updating the database, and one if the database was updated successfully.
Cron jobs are quite a useful tool on cPanel.
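As a rough sketch of what that cron-driven import script might look like (file paths, table names, credentials and the notification address are assumptions, and real code would want validation and duplicate handling):

<?php
// import_csv.php - run by cron every 15 minutes on the cPanel host.
// Paths, table names and credentials below are illustrative assumptions.
$pdo = new PDO('mysql:host=localhost;dbname=remote_db;charset=utf8mb4', 'dbuser', 'dbpass',
               array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION));

$errors = array();
foreach (glob('/home/account/uploads/*.csv') as $file) {
    try {
        $handle = fopen($file, 'r');
        $columns = fgetcsv($handle);                       // first row holds the column names
        $placeholders = implode(',', array_fill(0, count($columns), '?'));
        $table = basename($file, '.csv');                  // one CSV file per table, by convention
        $stmt = $pdo->prepare('REPLACE INTO ' . $table . ' (' . implode(',', $columns) . ') VALUES (' . $placeholders . ')');
        while (($row = fgetcsv($handle)) !== false) {
            $stmt->execute($row);
        }
        fclose($handle);
        unlink($file);                                     // remove the file once it has been imported
    } catch (Exception $e) {
        $errors[] = basename($file) . ': ' . $e->getMessage();
    }
}

// One email per run, covering the two outcomes described above.
$subject = $errors ? 'CSV import FAILED' : 'CSV import OK';
$body    = $errors ? implode("\n", $errors) : 'All CSV files were imported successfully.';
mail('you@example.com', $subject, $body);

The cPanel cron entry would then be something like: */15 * * * * php /home/account/import_csv.php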
I've built an AngularJS application over the last several months that uses a MySQL database for its data. The data is fetched by Angular making calls to PHP, which returns JSON strings, etc.
The issue is that once this application is running inside node-webkit, none of the PHP works, so all of the content areas are empty. I assume (though the documentation on this issue is nonexistent, so I have no confirmation) this happens because node-webkit is a client-side application framework and therefore won't run server-side languages like PHP. Is there a way to extend node-webkit to run PHP and other server-side languages?
I have done my best to find an answer to this question before posting, but documentation for this is nonexistent, and all of the information I have found about node-webkit talks about installing Node on your server, installing npm packages for MySQL, and having Angular make calls to Node. This defeats the purpose of the application entirely, as it is designed so that the exe/deb/rpm/dmg can run and you can set up a database with any cloud database provider and be ready to go. Not ideal if you have to buy a VPS just to run this one thing.
I have to assume this is possible in some way. I refuse to believe that everyone with an NW.js application hard-codes all their data.
Thanks in advance
I know of four methods to accomplish this. Some of them you would prefer not to use, but I am going to offer them in the hope that they help you or someone else.
1. Look for an npm package that can do this for you. You should be able to implement this functionality within Node.js: https://www.npmjs.com/search?q=mysql
2. You can host your PHP remotely. Using node-remote you can give this server the appropriate access to your NW.js project.
3. You can code a RESTful PHP application that your JavaScript can pass information to (see the sketch after this list).
4. You can use my boilerplate code to run PHP within an NW.js project. It fires up an Express.js web server internally to accomplish this, but the server is restricted to the machine and does not accept outside connections: https://github.com/baconface/php-webkit
Options 1 and 4 both carry a risk in your case: your project can be reverse engineered to reveal the source code, and the connection information can be retrieved rather easily. So those should only be used in an application on trusted machines; 2 and 3 are the ideal solutions.
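For option 3, a very small sketch of what such a REST-style PHP bridge could look like; the table, columns and credentials are illustrative assumptions, and the Angular code inside NW.js would simply call it with $http.get / $http.post:

<?php
// api.php - tiny REST-style bridge between the NW.js front end and MySQL.
// Table name, columns and credentials below are assumptions for illustration.
header('Content-Type: application/json');

$pdo = new PDO('mysql:host=your.db.host;dbname=appdb;charset=utf8mb4', 'dbuser', 'dbpass',
               array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION));

if ($_SERVER['REQUEST_METHOD'] === 'GET') {
    // GET api.php -> list of articles as JSON
    $stmt = $pdo->query('SELECT id, title, body FROM articles ORDER BY id DESC LIMIT 50');
    echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));
} elseif ($_SERVER['REQUEST_METHOD'] === 'POST') {
    // POST api.php with a JSON body such as {"title": "...", "body": "..."}
    $input = json_decode(file_get_contents('php://input'), true);
    $stmt  = $pdo->prepare('INSERT INTO articles (title, body) VALUES (?, ?)');
    $stmt->execute(array($input['title'], $input['body']));
    echo json_encode(array('id' => $pdo->lastInsertId()));
} else {
    http_response_code(405);
    echo json_encode(array('error' => 'method not allowed'));
}

Because the PHP lives on a normal web host rather than inside the packaged app, no database credentials ship with the exe/deb/rpm/dmg, which is what makes this (and option 2) safer than bundling a direct MySQL connection.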
I have a very simple web page in PHP that uses a MySQL database to randomly feed a simple quiz that goes on forever as long as the user wants to keep answering questions.
I want to move this to my Android phone to be able to use it offline. I used jQuery Mobile to adapt the layouts to the smartphone. So now I want to move the database to a local database and remove the need for an internet connection entirely. I have absolutely no experience with Android development, so if anyone can help me with a few questions, I'd appreciate it:
How can I store my existing database in the smartphone?
The only actions that the app performs on the database are selects. The information in the database will grow with time (not by much, I just want to be able to add more records over time). Is there any tool I can use to manage the local database and add information as I need (as I do now with phpMyAdmin)?
The web page exists online right now; will I be able to run it locally as it is (i.e., a PHP page with CSS and JS files)?
Thanks in advance.
1) Android databases are done in SQLite. I'm unsure exactly what the syntax differences between MySQL and SQLite are, but if a straight dump/import doesn't work, you could export to CSV and import that way. For info on getting an external database packaged with an app, check Using your own SQLite database with Android applications. It's a very helpful guide to getting it set up.
2) I use a Firefox add-on, SQLiteManager. I hate doing it, as it's the only reason I have Firefox installed any more, but on *nix it's the best option I've found. It's either that, command line, or SQLiteMan, which I found feature-lacking. On other platforms, I can't comment.
3) I don't know offhand how good PHP's SQLite support is, but you'll most likely need to do some modification to work with that instead of MySQL (see the sketch below). If you can get that running smoothly, you should be able to drop it into a WebView. The other option is to redo it in Java.
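On the PHP side of point 3: most PHP builds do ship with SQLite support (the pdo_sqlite driver and the SQLite3 class), so the change is mostly in the connection code. A rough sketch, assuming the quiz data has been imported into a local quiz.db file with a questions table:

<?php
// quiz.php - same idea as the MySQL version, but reading a local SQLite file.
// File name, table and column names below are assumptions for illustration.
$pdo = new PDO('sqlite:' . __DIR__ . '/quiz.db');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// SQLite uses RANDOM() where MySQL uses RAND() - one of the small syntax differences.
$stmt = $pdo->query('SELECT id, question, answer FROM questions ORDER BY RANDOM() LIMIT 1');
$row  = $stmt->fetch(PDO::FETCH_ASSOC);

header('Content-Type: application/json');
echo json_encode($row);

The MySQL-to-SQLite differences mentioned in point 1 are mostly of this kind: function names, AUTO_INCREMENT versus INTEGER PRIMARY KEY AUTOINCREMENT, and the absence of types like ENUM.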
Good luck!
There have been many questions along these lines, but I'm struggling to apply them to my scenario. Any help would be greatly appreciated!
We currently have a functioning MySQL database hosted on a website; data is entered from the website and put into the database via PHP.
At the same time, we now want to create a Python application that works offline. It should carry out all the same functions as the web version and run totally locally. This means it needs a copy of the entire database locally, and when changes are made to that local database they are synced the next time an internet connection is available.
First off, I have no idea what the best method would be to run such a database offline. I was considering just setting up a localhost, however this needs to be distributable to many machines, so setting up a localhost via an installer of some sort may be impractical, no?
Secondly, synchronization? Not a clue how to go about this!
Any help would be very much appreciated.
Thank you!
For binding Python to MySQL you could use HTSQL:
http://htsql.org
You can then also query your MySQL DB via HTTP requests, either from AJAX calls or server-side, e.g. with cURL (and of course you still have the option of writing standard SQL queries).
There is a jQuery plugin called HTRAF that handles the client-side AJAX calls to the HTSQL server.
The HTSQL server runs on localhost as well.
What OS would you be using?
How high-performance does your local application need to be? Also, how reliable is the locally available internet connection? If you don't need extremely high performance, why not just leave the data in the remote MySQL server?
If you're sure you need access to local data, I'd look at MySQL's built-in replication for synchronization. It's really simple to set up and use, and you could use it to maintain a local read-only copy of the remote database for quick data access. You'd simply build into your application the ability to perform write queries on the remote server and do read queries against the local DB. The lag time between the two servers is generally very low, on the order of milliseconds, but you do still have to contend with network congestion preventing a local slave database from being perfectly in sync with the master instantaneously.
As for the Python side of things, google mysql-python, because you'll need a Python MySQL binding to work with a MySQL database. Finally, I'd highly recommend SQLAlchemy as an ORM with Python, because it'll make your life a heck of a lot easier.
I would say an ideal solution, however, would be to set up a remote REST API web service and use that in place of directly accessing the database. Of course, you may not have the in-house capabilities, the time or the inclination to do that ... which is also okay :)
Are you planning to run MySQL for your local offline Python apps? I would suggest something like SQLite. As for keeping things in sync, it also depends on the type of data that needs to be synchronized. One question needs to be answered:
Is the data generated by these Python apps opaque? If yes (i.e. it doesn't have any relations to other entities), then you can queue the data locally and push it up to the centrally hosted website.
This is not a codefix question but please help me where possible.
I am developing an application which needs to store information in a database. This information needs to be stored off-device for security reasons (patient data).
I have explored using PHP as a bridge to an external MySQL database hosted locally via WAMP; however, I have recently been informed of servlets and also of SQLite.
As I am learning these technologies under a limited time frame I need to know which to invest my time into to get the job done as easily as possible. I have no experience with any query language but I did get a simple login screen to work on Android using PHP and MySQL on WAMP using HTTP post/fetch within android.
Also, is it possible to store the information in SQLite within Android and write that data to a server, which can then load the SQLite database again on re-launch?
Many thanks for your time!
From our experience, these kinds of services need to work offline if possible. Both 3G and especially Wi-Fi have been unreliable on several occasions.
You do well to save the data locally in, e.g., an SQLite database, but be warned that the data can get wiped on an OS update or software update.
Also try to write your changes/additions in batches, so the server updater can send all the batches that are not yet on the server. This way it's more reliable if you miss updates.
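As a rough sketch of that batch idea on the receiving end, assuming the device posts JSON batches with an incrementing batch number so the server can safely ignore ones it has already applied (table names, columns and the batch-number scheme are all illustrative assumptions):

<?php
// receive_batch.php - accepts a JSON batch of records posted by the device.
// Table names, columns and the batch-number scheme are assumptions for illustration.
header('Content-Type: application/json');

$batch = json_decode(file_get_contents('php://input'), true);
if (!$batch || !isset($batch['device_id'], $batch['batch_no'], $batch['records'])) {
    http_response_code(400);
    echo json_encode(array('error' => 'malformed batch'));
    exit;
}

$pdo = new PDO('mysql:host=localhost;dbname=clinic;charset=utf8mb4', 'dbuser', 'dbpass',
               array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION));

// Skip batches that were already applied, so resends after a dropped connection are harmless.
$seen = $pdo->prepare('SELECT 1 FROM applied_batches WHERE device_id = ? AND batch_no = ?');
$seen->execute(array($batch['device_id'], $batch['batch_no']));
if ($seen->fetch()) {
    echo json_encode(array('status' => 'already applied'));
    exit;
}

$pdo->beginTransaction();
$insert = $pdo->prepare('INSERT INTO observations (patient_id, taken_at, value) VALUES (?, ?, ?)');
foreach ($batch['records'] as $r) {
    $insert->execute(array($r['patient_id'], $r['taken_at'], $r['value']));
}
$pdo->prepare('INSERT INTO applied_batches (device_id, batch_no) VALUES (?, ?)')
    ->execute(array($batch['device_id'], $batch['batch_no']));
$pdo->commit();

echo json_encode(array('status' => 'ok', 'applied' => count($batch['records'])));

The device then only needs to remember the highest batch number the server has acknowledged and resend anything newer the next time it has a connection.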
These kinds of interfaces to other systems take a lot of time to test. Consider asking for a new timeframe in advance.
You will find more about connecting Android to MySQL here.