MySQL Databases with Ghost Data? - php

I wrote a PHP tool a while back that lets us automatically set up free MySQL databases for our customers in an internal standalone installation. It works fine, but we were recently forced to migrate the databases due to a hardware failure.
After doing so, the existing MySQL databases seem to have some kind of ghost data in them. We cannot see these tables in phpMyAdmin, but they are detected by our customers' plugins and some of our own tools.
I have never heard of MySQL ghost data like this before and was wondering if anyone has any idea where it comes from or how to fix it?

Ghost data... interesting!
I am not sure, but I think you are talking about INFORMATION_SCHEMA, which provides access to database metadata. Check the MySQL documentation on it.
And wherever you migrate to, most databases have these metadata tables.
For example, SQL Server has sysobjects, sysusers, and many more.
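To see everything the server itself knows about, you can query INFORMATION_SCHEMA directly instead of relying on a GUI. A minimal sketch, where 'your_database' is a placeholder schema name:

```sql
-- List every table the server tracks for the given schema,
-- including any that a GUI client might be filtering out.
SELECT TABLE_NAME, TABLE_TYPE, ENGINE
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_SCHEMA = 'your_database';
```

Comparing this list with what phpMyAdmin shows should reveal whether the "ghost" tables really exist on the server or are leftovers from the migration.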

Related

MariaDB multiple DB split between multiple servers - can I query as if they were all on the same instance?

I have this scenario: a PHP 5.6 application that connects to a MariaDB (edit: I stated MySQL before, but that was not correct) instance.
In this instance I have a "master" DB and a DB for each firm using the application. Each new firm -> a new DB.
The number of DBs is growing quickly, and we are considering a way to split these client DBs between multiple MariaDB servers.
The problem is that many queries join a client's DB with the master DB, so we cannot blindly connect to another host.
Is it possible to set up a MariaDB instance that has databases on other hosts but still "virtually sees" them as if they were on the same instance (so that cross-DB queries still work)?
I tried to google this without success.
Thank you in advance!
EDIT: I've found out about federated tables, which might be the solution.
The thing is, we want to split DBs between servers because we might have 50,000-100,000 DBs, and we are worried about performance.
If I create all these DBs locally with federated tables, will this solve my issue, or will we still face performance issues?
Sounds like you need FederatedX to reach the common database from the client DBs.
And "sharding" means putting several clients on each of several servers, not one client on each of 50K servers.
Or is there some requirement you have not provided yet?
How big are these "clients"? How big is the largest? Are there security concerns if multiple clients live on the same server? In the same MariaDB instance? In the same table?
Have you already set up some form of proxy to route a client request to the server for that client? Or are you expecting each client to own its server?
Etc., etc.
Keep in mind that reaching into a 'federated' table is slow, in some cases orders of magnitude slower. So work hard on minimizing the need for federation.
Another approach would put less burden on the individual queries but take more admin effort: duplicate the "common" database across all of the shards. This could probably be done via replication, with the "common" server as a Master and all the shards as Slaves.
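As a rough sketch of what such a federated link looks like (the host, credentials, database, and table definition below are all made-up placeholders), each shard could map a table from the common database like this:

```sql
-- Define the remote "common" server once per shard
-- (connection details are placeholders).
CREATE SERVER common_srv
  FOREIGN DATA WRAPPER mysql
  OPTIONS (HOST '10.0.0.5', DATABASE 'master_db',
           USER 'fed_user', PASSWORD 'change_me');

-- Local stand-in for a table that physically lives on the common server.
CREATE TABLE firm_lookup (
  id INT NOT NULL,
  name VARCHAR(100),
  PRIMARY KEY (id)
) ENGINE=FEDERATED CONNECTION='common_srv/firm_lookup';
```

Every read of firm_lookup now crosses the network, which is exactly why the advice above is to minimize federation or replicate the common data to each shard instead.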
You can use Firebase; it's faster and easier than MySQL.
But if you need to stick with MySQL, you can build a simple API on the other server that hosts the MySQL instance; this API will run your query on that server and return the results.

Table got dropped from phpMyAdmin

I have a PHP application on a production server that is meant to register users for some services. It has two forms, each of which registers a user in a different table of my database.
The problem is that today one of the tables disappeared. I was able to restore it from a backup, but that doesn't get rid of the problem.
How do I investigate this in order to determine how that table got lost, and most likely dropped, by some bot or something?
How would you proceed in a situation like this?
There are two ways:
Have a working backup of the system, and restore the files from it.
An undelete tool might help, if you deleted the DB very recently (and ideally, if you unplugged the computer right afterward).
As for doing it with MySQL, though... on all systems I'm aware of, no. MySQL tables are files in the server's data directory, and dropping a table deletes those files. Once they're gone, they're gone, and only the methods above can get them back. A database is a directory of those files, and dropping it deletes the whole directory.
Check this free software
http://www.majorgeeks.com/Restoration_d4474.html
More information here - http://emaillenin.blogspot.com/2010/11/recover-accidentally-deleted-mysql.html
If your table got dropped, find out which MySQL users have the privilege to drop a table (there shouldn't be many) and which services log in with those credentials.
Maybe you have a web form with a PHP backend that doesn't clean up (escape) input, so you may have been open to SQL injection.
In that case, you should check your web server's access log.
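A quick way to start that audit, assuming you can connect as an administrative user ('webapp'@'localhost' below is a hypothetical account to inspect):

```sql
-- Which accounts hold DROP on any schema?
SELECT GRANTEE, TABLE_SCHEMA, PRIVILEGE_TYPE
FROM INFORMATION_SCHEMA.SCHEMA_PRIVILEGES
WHERE PRIVILEGE_TYPE = 'DROP';

-- Inspect one suspect account in detail.
SHOW GRANTS FOR 'webapp'@'localhost';
```

Cross-referencing the accounts that could drop tables with the services that use them narrows down where to look in the access logs.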

Import data from an external SQL Server database into a local database using PHP

I have a website where I query one of our external SQL Server databases and insert the records into the website's local server.
I simply connect to the external database, query the table, truncate the local table, and run a foreach loop to insert the data into the local table.
The process works fine; the problem is that it takes a long time.
I just want to see if you could give me some hints on how to speed up the process. If there is another way to do this, please let me know.
There are many factors that determine the best approach. Is the database you are copying to supposed to always be identical to the source, or will it have entries that are not in the source? If you just want them to be identical, and the website database is simply a read-only clone, SQL Server gives you several options: replication, log shipping, mirroring, SSIS packages. It all depends on how frequently you want to synchronize the databases, among other factors.
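One common cause of the slowness is that a foreach loop issues one INSERT statement per row. Batching many rows into a single INSERT inside one transaction cuts the round-trips dramatically; a sketch with a placeholder table and columns:

```sql
START TRANSACTION;

-- One statement carrying many rows instead of one statement per row.
INSERT INTO local_table (id, name, amount) VALUES
  (1, 'alpha', 10.00),
  (2, 'beta',  20.50),
  (3, 'gamma',  7.25);

COMMIT;
```

In PHP you would build the VALUES list in chunks (say, 500 rows per statement) rather than executing the insert once per loop iteration.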

Connect a database with external tables

I have never done anything like this before.
I have a system, and I need to relate my data with external data (in another database).
My preference is to fetch the data and create my own tables, but in that case, when the other DBs are updated, my own tables will become obsolete.
So basically I need to synchronize my tables with the external tables, or just read the external data values.
I don't have any idea how to connect to and relate data from ten external databases.
Basically, I need to check whether a user is registered on other websites.
Any help?
I am currently doing something similar.
The easiest way I found is to pull the data in. I do bi-directional synchronisation in my project, but you haven't mentioned that, so I imagine a data pull is what you are aiming for.
You need user accounts on the other servers, and each account needs to be created with an IP instead of 'localhost'. You will connect from your end through the MySQL client using the IP of the distant host instead of the usual localhost.
See this page for a bit more info.
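Creating such an account on the remote server might look like this (the IP address, database name, and password are placeholders; granting only SELECT keeps the pull read-only):

```sql
-- Allow connections only from your application server's IP.
CREATE USER 'sync_user'@'203.0.113.10' IDENTIFIED BY 'change_me';
GRANT SELECT ON remote_db.* TO 'sync_user'@'203.0.113.10';
FLUSH PRIVILEGES;
```
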
If, like me, you have to interface with different DB server types, I recommend using a database abstraction library to manage the data seamlessly across different SQL servers. I chose Zend_Db components, used standalone with Zend_Config, as they support MySQL and MSSQL.
UPDATE - Using a proxy DB to access mission-critical data
Depending on the scope of your project, if the data is not accessible straight from the remote database, there are different possibilities. To answer your comment, I will tell you how we resolved the same issues on the current project I am tied to. The client has a big MSSQL database behind a business-critical application: accounting, invoicing, inventory, everything is handled by one big app tied to MSSQL. My mandate is to install a CRM and synchronise the customers of this MSSQL mission-critical app into the CRM, which runs on MySQL, by the way.
I did not want to access this data straight from my CRM; the CRM should never touch their main MSSQL DB. I am certainly not willing to take responsibility for something going wrong down the line; even though in theory this should not happen, in practice theory is often worthless. The recommendation I gave (and which was implemented) was to set up a proxy database on their end. A task on the same MSSQL instance copies the data into this second database nightly, and that one I am free to access remotely. A user was created on MSSQL with access to just the proxy, and connections are accepted from just one IP.
My script has to sync both ways, so in my case I do a nightly 'push' of the modified records from MSSQL to the CRM and a 'pull' of the added CRM records into the proxy DB. The intern gets notified by email of new records in the proxy to update in their MSSQL app. I hope this was clear enough; I realize it's hard to convey in a few lines. If you have other questions, feel free to ask.
Good luck!
You have to download a backup (gzip, zip) of the wanted part (or all) of the database and upload it to the other database.
By the way, cron jobs won't help you at this point, because you can't access any DB from outside.
Does the other website have an API for accessing such information? Are they capable of constructing one? If so, that would be the best way.
Otherwise, I presume your way of getting data from their database is by directly querying it. That can work too; just make a mysql_connect to their location and query it just like your own database. Note: their DB will have to be set up to accept outside connections for this method to work.

Locking a SQL Server Database with PHP

I want extra security for a particular point in my web app, so I want to lock the database (SQL Server 2005). Any suggestions, or is this even necessary with SQL Server?
Edit on question:
The query is failing silently with no error messages logged, and it does not occur inside a transaction.
Final Solution:
I was never able to solve the problem; what I wound up doing instead was switching to MySQL and using a transaction-level query here. This was not the main, or even a primary, reason to switch. I had been having problems with SQL Server, and the move let me run our CMS and various other tools on the same database. Previously we ran both a SQL Server and a MySQL database for our site. The port was a bit time-consuming, but in the long run I feel it will work much better for the site and the business.
I suppose you have three options.
Set user permissions so that user x can only read from the database.
Set the database into single-user mode so only one connection can access it:
sp_dboption 'myDataBaseName', 'single user', 'TRUE'
Set the database to read-only:
sp_dboption 'myDataBaseName', 'read only', 'TRUE'
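Note that on SQL Server 2005, sp_dboption is deprecated; ALTER DATABASE achieves the same effect (myDataBaseName is a placeholder):

```sql
-- Restrict the database to a single connection,
-- rolling back any other sessions immediately.
ALTER DATABASE myDataBaseName SET SINGLE_USER WITH ROLLBACK IMMEDIATE;

-- Or make the whole database read-only.
ALTER DATABASE myDataBaseName SET READ_ONLY;
```
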
