I have 2 servers. On #1 remote DB access is disabled. The database is huge (~1GB), so it's not possible to dump it with phpMyAdmin as it crashes and hangs the connection. I have no SSH access. I need to copy the entire DB to #2 (where I can set up virtually everything).
My idea is to use some kind of HTTP access layer over #1.
For example, a simple PHP script that accepts a query as a _GET/_POST argument and returns the result as the HTTP body.
On #2 (or my desktop) I could set up some kind of server application that would ask sequentially for every row in every table, even one at a time.
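Something along these lines is what I have in mind, just a rough, untested sketch (the credentials are placeholders, and it would obviously need some authentication so that not just anyone can run queries):

<?php
// bridge.php on server #1 -- accepts a query via POST and streams each row as one JSON line.
// Rough sketch only: the credentials are placeholders and there is no authentication yet.
$db = new mysqli('localhost', 'db_user', 'db_pass', 'db_name');
if ($db->connect_error) {
    die('Connection failed');
}

$query = isset($_POST['query']) ? $_POST['query'] : '';
$result = $db->query($query);
if ($result === false) {
    die('Query failed: ' . $db->error);
}

header('Content-Type: text/plain; charset=utf-8');
while ($row = $result->fetch_assoc()) {
    echo json_encode($row), "\n";   // one row per line, easy to parse on server #2
}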
And my question is: do you know of a ready-to-use app with such a flow?
BTW: #1 is PHP only; #2 can be PHP, Python, etc.
I can't run anything on #1: fopen, curl, sockets, system, etc. are all disabled. I can only access the DB from PHP; no remote connections are allowed.
Can you connect to a remote MySQL server from PHP on Server #1?
I know you said "no remote connections allowed", but you haven't specifically mentioned this scenario.
If this is possible, you could SELECT from your old database and directly INSERT into MySQL running on Server #2.
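A quick way to check is to drop a tiny test script on server #1 and see whether it can reach MySQL on #2 (the hostname and credentials below are placeholders):

<?php
// test_remote.php on server #1 -- checks whether an outbound MySQL connection to #2 works.
$remote = new mysqli('server2.example.com', 'remote_user', 'remote_pass', 'target_db');
if ($remote->connect_error) {
    die('Remote connection failed: ' . $remote->connect_error);
}
echo 'Connected to server #2 (MySQL ' . $remote->server_info . ')';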
A long time ago I used Sypex Dumper for this; I just left the browser open for the night and the next morning the whole DB dump was available on FTP.
It is not clear what is available on server #1.
Assuming you can run only PHP, you still might be able to:
run scp from PHP and connect to server #2, sending the files that way
maybe you can use PHP to run local commands on server #1? In that case, something like running rsync through PHP from server #1 could work
It sounds to me like you should contact the owner of the host and ask if you can get your data out somehow. It is rather absurd to have to stream your entire database and reinsert it on the new machine; it will eat a lot of resources on the PHP server you get the data from. (And if the hosting provider is already that restrictive, you might have a limit on how many SQL operations you are allowed in a given time span as well.)
Though if you are forced to do it, you could do a SELECT * FROM the table and, for each row, convert it to a JSON object that you echo on its own line. You can store this to disk on your end and use it to do the inserts afterward.
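On your end, reading the stored lines back and inserting them could look roughly like this (just a sketch: the file name, table and credentials are placeholders, it assumes PHP 5.6+ for argument unpacking, and everything is bound as strings for simplicity):

<?php
// import_jsonl.php on server #2 -- re-inserts rows saved as one JSON object per line.
$db = new mysqli('localhost', 'user', 'pass', 'target_db');
if ($db->connect_error) {
    die('Connection failed: ' . $db->connect_error);
}

$handle = fopen('dump/users.jsonl', 'r');
while (($line = fgets($handle)) !== false) {
    $row = json_decode(trim($line), true);
    if (!$row) {
        continue;                                       // skip blank or corrupt lines
    }
    $columns = '`' . implode('`,`', array_keys($row)) . '`';
    $placeholders = implode(',', array_fill(0, count($row), '?'));
    $stmt = $db->prepare("INSERT INTO `users` ($columns) VALUES ($placeholders)");
    $types = str_repeat('s', count($row));              // bind every column as a string
    $values = array_values($row);
    $stmt->bind_param($types, ...$values);              // PHP 5.6+ argument unpacking
    $stmt->execute();
}
fclose($handle);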
I suspect you can access both servers using FTP.
How do you get your PHP files onto it otherwise?
Perhaps you can just copy the MySQL database files if you can access them using FTP.
This doesn't work in all cases; check out http://dev.mysql.com/doc/refman/5.0/en/copying-databases.html for more info.
Or you can try the options found on this link:
http://www.php-mysql-tutorial.com/wikis/mysql-tutorials/using-php-to-backup-mysql-databases.aspx
I don't know how your PHP is set up, though, as I can imagine it can take some time to process the entire database
(max execution time setting, etc.).
There is the replication option, if at least one of the hosts is reachable remotely via TCP. It would be slow to sync a virgin DB this way, but it would have the advantage of preserving all the table metadata that you would otherwise lose by doing select/insert sequences. More details here: http://dev.mysql.com/doc/refman/5.0/en/replication.html
I'm not sure how you managed to get into this situation. The hosting provider is clearly being unreasonable, and you should contact them. You should also stop using their services as soon as possible (although this sounds like what you're trying to do anyway).
Unfortunately, phpMyAdmin is not suitable for database operations where the data is critical, as it has too many bugs and limitations; you certainly shouldn't rely on it to dump or restore.
mysqldump is the only way to RELIABLY dump a database and bring it back and have a good chance for the data to be complete and correct. If you cannot run it (locally or remotely), I'm afraid all bets are off.
I'm making a program which will always be gathering some data and putting it into a MySQL database. Now I have two ways to accomplish that, and I wanted to know which one is better:
1- As I'm using Qt, I can use the QtSql module to connect directly to the DB and insert the data.
2- I can also write a PHP script using GET or POST variables to insert data into the DB, and just call the URL from my program with the suitable data.
I'm most worried about performance, as there will be a lot of insertions all the time (about 100 inserts/second). But the data size of each insertion is not that big; it will not exceed 10 characters.
I would point out that the web server (where the PHP script will be stored), the DB server and the server where the program will be running are all on the same local network.
It depends on your environment. Is the MySQL DB running on your local PC? Then, QtSql should be perfectly fine.
If the DB is publicly accessible (e.g. via the internet), then this would be a bad idea. You generally should avoid exposing MySQL DB servers directly to a public network and instead provide access via locked-down interfaces such as your PHP one.
edit: From a pure performance-centric point of view, the QtSql solution should definitely be faster (though the only way to be sure is to benchmark it!). The PHP solution would have at least the following elements that generate overhead:
Data transfer over a network socket
Startup of the PHP interpreter (is it still the case that PHP is started once for each request? If so, this is going to be significant.)
Well, I'm looking for a way to transfer selected MySQL data from one server to another every minute, or at least every few minutes. Here's an example:
(Connect to the source SQL server and select the needed data)
SELECT name, email, online, session FROM example_table WHERE session!=0
(Process the data, connect to the external target SQL server and INSERT/REPLACE the data)
I want to transfer ONLY the output of the query to the target server, which of course has a fitting table structure.
I have already made a simple PHP script which is executed every minute by a cronjob on Linux, but I guess there are better ways performance-wise, and it doesn't support arrays right now.
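For reference, the current script does roughly this (simplified; the hosts, credentials and bind types are placeholders):

<?php
// transfer.php -- executed every minute by cron, e.g.: * * * * * php /path/to/transfer.php
$source = new mysqli('source.example.com', 'src_user', 'src_pass', 'src_db');
$target = new mysqli('target.example.com', 'dst_user', 'dst_pass', 'dst_db');
if ($source->connect_error || $target->connect_error) {
    die('Connection failed');
}

$result = $source->query('SELECT name, email, online, session FROM example_table WHERE session != 0');

// REPLACE so that existing rows on the target are overwritten instead of duplicated.
$stmt = $target->prepare('REPLACE INTO example_table (name, email, online, session) VALUES (?, ?, ?, ?)');
while ($row = $result->fetch_assoc()) {
    $stmt->bind_param('ssii', $row['name'], $row['email'], $row['online'], $row['session']);
    $stmt->execute();
}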
Any suggestions / code examples which are Linux-compatible are welcome.
I'm not entirely sure what data it is you're trying to transfer, but luckily MySQL supports replication between different servers. If you save the data on the local source server and set up the target server to fetch all updates from the source server, you'll have two identical databases. This way, you won't need any scripts or cronjobs.
You can find more information at http://dev.mysql.com/doc/refman/5.0/en/replication-howto.html.
Here is a good open source replication engine:
http://code.google.com/p/tungsten-replicator/
There have been many questions along these lines, but I'm struggling to apply them to my scenario. Any help would be greatly appreciated!
We currently have a functioning MySQL database hosted on a website; data is entered from the website and put into the database via PHP.
At the same time we now want to create a Python application that works offline. It should carry out all the same functions as the web version and run totally locally. This means it needs a copy of the entire database to run locally, and when changes are made to that local database they are synced the next time an internet connection is available.
First off, I have no idea what the best method would be to run such a database offline. I was considering just setting up a localhost server; however, this needs to be distributable to many machines, so setting it up via an installer of some sort may be impractical, no?
Secondly, synchronization? Not a clue how to go about this!
Any help would be very, very much appreciated.
Thank you!
For binding Python to MySQL you could use HTSQL:
http://htsql.org
You can then also query your MySQL DB via HTTP requests, either from AJAX calls or server-side, e.g. with cURL (and of course still have the option of writing standard SQL queries).
There is a jQuery plugin called HTRAF that handles the client-side AJAX calls to the HTSQL server.
The HTSQL server runs on localhost as well.
What OS would you be using?
How high-performance does your local application need to be? Also, how reliable is the locally available internet connection? If you don't need extremely high performance, why not just leave the data in the remote MySQL server?
If you're sure you need access to local data, I'd look at MySQL's built-in replication for synchronization. It's really simple to set up and use, and you could use it to maintain a local read-only copy of the remote database for quick data access. You'd simply build into your application the ability to perform write queries on the remote server and do read queries against the local DB. The lag time between the two servers is generally very low ... like on the order of milliseconds ... but you do still have to contend with network congestion preventing a local slave database from being perfectly in sync with the master instantaneously.
As for the Python side of things, google mysql-python, because you'll need a Python MySQL binding to work with a MySQL database. Finally, I'd highly recommend SQLAlchemy as an ORM for Python, because it'll make your life a heck of a lot easier.
I would say an ideal solution, however, would be to set up a remote REST API web service and use that in place of directly accessing the database. Of course, you may not have the in-house capabilities, the time or the inclination to do that ... which is also okay :)
Are you planning to run MySQL for your local offline Python apps? I would suggest something like SQLite. As for keeping things in sync, it also depends on the type of data that needs to be synchronized. One question that needs to be answered:
Is the data generated by these Python apps opaque? If yes (i.e. it doesn't have any relations to other entities), then you can queue the data locally and push it up to the centrally hosted website.
I've got a situation where I have lots of system configurations/logs from which I have to generate a quick review of the system that is useful for troubleshooting.
To start with, I'd like to build a kind of web interface (most probably a PHP site) that gives me a rough snapshot of the system configuration using the available information from the support logs. The support logs reside on mirrored servers (call them the log server), and the server on which I'll be hosting the site (call it the web server) will have to use ssh/sftp to access them.
My rough sketch:
The PHP script on the web server will make some kind of connection to the log server and go to the support logs location.
It'll then trigger a Perl script on the log server, which will collect the relevant material from all the config/log files into some useful XML (there'd be multiple of those).
Somehow these XML files are transferred to the web server, and PHP will use them to create the HTML.
I'm very new to PHP and would like to know if this is feasible, or if there's any alternative/better way of doing this?
It would be great if someone could provide more details for the same.
Thanks in advance.
EDIT:
Sorry, I forgot to mention that the logs aren't the ones generated on a live machine. I'm dealing with sustaining activities for a NAS storage device, and there'll be plenty of support logs coming from different end customers which folks on my team would like to have a look at.
Security is not a big concern here (I'm OK with using plain-text authentication to the log servers), as these servers can only be accessed through the company's VPN.
Yes, PHP can process XML. A simple way is to use SimpleXML: http://php.net/manual/en/book.simplexml.php
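For example (the file name and the element names below are made up):

<?php
// Minimal SimpleXML usage; 'report.xml' and its <host>/<name>/<status> structure are invented.
$xml = simplexml_load_file('report.xml');
if ($xml === false) {
    die('Could not parse report.xml');
}
foreach ($xml->host as $host) {
    // Child elements are available as object properties.
    echo $host->name, ': ', $host->status, "\n";
}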
While you can do this using something like Expect (I think there is something for PHP too), I would recommend doing it in two separate steps:
A script, running via cron, retrieves the data from the servers and stores it locally.
The PHP script reads only from the locally stored data in order to generate the reports.
This way, you have these benefits:
You don't have to worry about how to make your PHP script connect to the servers via SSH.
You avoid the security risks of allowing your web server user to log in to other servers (a high risk in case your script gets hacked).
In case of slow/absent connectivity to the servers, a long time to retrieve the logs, etc., your PHP script will still be able to quickly show the data -- maybe along with an error message explaining what went wrong during the latest update.
In any case, your PHP script will terminate much more quickly, since it only has to retrieve data from local storage.
Update: SSH client via PHP
OK, from your latest comment I understand that what you need is more of a "front-end browser" to display the files than a report-generation tool or similar; in this case you can use Expect (as I stated before) to connect to the remote machines.
There is a PECL extension for PHP providing expect functionality. Have a look at the PHP Expect manual and in particular at the usage examples, showing how to use it to make SSH connections.
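Something along these lines (an untested sketch: the host, user, password and remote command are placeholders, and it requires the PECL expect extension to be installed):

<?php
// Fetch a remote file over SSH using the PECL expect extension.
$stream = expect_popen('ssh loguser@logserver.example.com cat /nas/support/summary.xml');

// Wait for the password prompt and answer it (plain-text auth, as you said is acceptable).
$match = expect_expectl($stream, array(
    array('password:', 1),
));
if ($match === 1) {
    fwrite($stream, "secret\n");
}

// Whatever the remote command prints can now be read from the stream.
while (!feof($stream)) {
    echo fgets($stream);
}
fclose($stream);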
Alternative way: getting files from an NFS/Samba share
Another way, which avoids using SSH, is to browse the files on the remote machines via a locally mounted share.
This is especially useful if the interesting files are already shared by a NAS, while I wouldn't recommend it if that would mean sharing the whole root filesystem or huge parts of it.
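For example, if the share is mounted at /mnt/logserver (a made-up mount point), the PHP side reduces to ordinary file access:

<?php
// List and display support logs from a locally mounted NFS/Samba share.
foreach (glob('/mnt/logserver/support/*.log') as $path) {
    echo '<h2>', htmlspecialchars(basename($path)), '</h2>';
    echo '<pre>', htmlspecialchars(file_get_contents($path)), '</pre>';
}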
I want to run a PHP script from the command line that is always running and constantly updating a variable.
I then want any php script that is run in the meantime (probably but not necessarily from the web) to be able to read that variable at any time.
Anyone know how I can do this?
Thanks.
Here, you want some kind of inter-process communication mechanism.
You cannot use a PHP variable for that: these are local to the script they're in.
Which means you'll have to use some "external" tool to store your data, such as (to name only a few):
a file
a database (SQLite, MySQL, ...)
some shared-memory segment
In each case, you'll have:
One script that writes to the data-storage space -- i.e. your first, always-running script
One or more other scripts that will read from the data store
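For instance, a minimal sketch of the shared-memory option (the key 0xCAFE and the 64-byte size are arbitrary choices, and it requires the shmop extension):

<?php
// writer.php -- the always-running CLI script keeps the value in a shared-memory segment.
$shm = shmop_open(0xCAFE, 'c', 0644, 64);    // create (or attach to) the segment
while (true) {
    $value = str_pad(date('H:i:s'), 64);     // pad so old data is fully overwritten
    shmop_write($shm, $value, 0);
    sleep(1);
}

<?php
// reader.php -- any other script attaches to the same segment and reads the value.
$shm = shmop_open(0xCAFE, 'a', 0, 0);        // attach read-only
echo trim(shmop_read($shm, 0, 64));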
You should write the variable to a file with the CLI script and read from that with the other script.
Be sure to use flock to prevent race conditions.
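A minimal sketch of both sides (the file path is arbitrary):

<?php
// writer.php -- the long-running CLI script updates the shared value in a loop.
$file = '/tmp/shared_value.txt';
while (true) {
    $value = date('H:i:s');                  // whatever value you are constantly updating
    $fp = fopen($file, 'c');
    if (flock($fp, LOCK_EX)) {               // exclusive lock while writing
        ftruncate($fp, 0);
        fwrite($fp, $value);
        fflush($fp);
        flock($fp, LOCK_UN);
    }
    fclose($fp);
    sleep(1);
}

<?php
// reader.php -- any other script (web or CLI) reads the current value.
$value = null;
$fp = fopen('/tmp/shared_value.txt', 'r');
if ($fp) {
    if (flock($fp, LOCK_SH)) {               // shared lock: don't read mid-write
        $value = stream_get_contents($fp);
        flock($fp, LOCK_UN);
    }
    fclose($fp);
}
echo $value;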
You can write a PHP socket-based server script which will listen on the desired port.
Then your client PHP script can connect to it, either locally or from the web, and retrieve any data, including variables.
You can use any simple protocol to transfer the variables, either one you design yourself or a well-known one like XML.
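A rough sketch of such a server and client (the port and the one-value "protocol" are arbitrary choices):

<?php
// server.php -- long-running CLI script: keeps updating $value and serves it on a TCP port.
$server = stream_socket_server('tcp://127.0.0.1:9000', $errno, $errstr);
if (!$server) {
    die("Could not start server: $errstr");
}
$value = 0;
while (true) {
    $value++;                                    // the variable being constantly updated
    $conn = @stream_socket_accept($server, 1);   // wait up to 1 second for a client
    if ($conn) {
        fwrite($conn, (string) $value);          // trivial protocol: just send the current value
        fclose($conn);
    }
}

<?php
// client.php -- any other script fetches the current value.
$conn = stream_socket_client('tcp://127.0.0.1:9000', $errno, $errstr, 2);
if ($conn) {
    echo stream_get_contents($conn);
    fclose($conn);
}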
Lots of ideas:
1. At set intervals it appends/writes to a file.
2. You use SQLite and write your data to it.
3. You use a small memcached service as your intermediary.
4. You go somewhat crazy and write a socket class, listen on a set port, then make non-blocking calls to check.
1-2 are probably the simplest
3 would work great if you need to query the value a lot
4 would be fun, but might not be worth the effort.