How to Create a Schema in MySQL? - php

How do I create a schema from existing tables? I have 100 tables and want to generate the schema for them. Can you tell me the procedure, or suggest any tools?
I tried TOAD, but it shows the result in HTML format; I want the result in SQL.
How can I create a schema using PHP?

If you want the schema only and no data, a single SHOW CREATE TABLE <tablename> statement in the mysql command-line tool will suffice. The result is the full CREATE TABLE DDL.

Take a dump of all the tables so you get the schema; see mysqldump:
mysqldump -u root -p[root_password] [database_name] > dumpfilename.sql

mysqldump --user MYSQL_USER --password=MYSQL_PASSWORD --add-locks --flush-privileges \
--add-drop-table --complete-insert --extended-insert --single-transaction \
--database DB_TO_BACKUP > FILENAME_FOR_SQL_STATEMENTS
This will give a file full of SQL CREATE and INSERT statements that can be used to recreate the DB. The backslashes (\) continue the command onto the next line; you can omit them and write the whole command on one line.
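The continuation behaviour works for any command, so it can be seen with a trivial stand-in for the mysqldump line above:

```shell
# a trailing backslash splits one command across two lines;
# the shell joins them back into a single command before running it
echo one \
  two
# prints: one two
```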
Now, I am not sure if you wanted the whole DB with data or just the tables; you may want to use --no-data to drop the INSERT statements and keep just the CREATE, DROP, and LOCK statements, e.g.
mysqldump --user MYSQL_USER --password=MYSQL_PASSWORD --add-locks --flush-privileges \
--add-drop-table --no-data --single-transaction \
--database DB_TO_BACKUP > FILENAME_FOR_SQL_STATEMENTS

http://dev.mysql.com/doc/refman/5.5/en/mysqldump.html#option_mysqldump_no-data
That is:
mysqldump --no-data -u <username> -p --database <dbname> <further options> > <dbname>.sql
from the command line, on the server (perhaps via ssh).

...can you tell me the procedure, or suggest any tools
You can use the 'Generate Schema Script' or 'Backup' GUI tools in dbForge Studio for MySQL.
A limited Backup feature is available in the free Express edition; backup files are capped at 1 MB, which is enough for 100 table schemas.

You can use MySQL Workbench: connect to the existing database there, and you can import and export the schema in SQL format.

Related

How do I use PHP to execute a SQL file in batch that contains thousands of statements?

I have a .sql file that contains a handful of TRUNCATE, DROP TABLE IF EXISTS and CREATE TABLE statements as well as thousands of INSERT and UPDATE statements.
I'm trying to use PHP to execute all the statements in the file in batch as follows:
$sql = file_get_contents('path/to/sql/file.sql');
$link->query($sql);
Where $link is a valid MySQLi DB link. I've also tried swapping out $link->query($sql) with $link->multi_query($sql), but neither seems to work.
If it makes any difference, I'm attempting to do this in an XAMPP localhost environment on Windows 10 with PHP 7.1 and MariaDB 15.1.
Does anyone know why this would cause problems and not work?
What's the best way in PHP to execute a SQL file with thousands of statements in it?
Thanks.
Ryan, thanks again for the advice. As you suggested, running the mysql command directly on the SQL file is the way to go.
The following worked for me:
shell_exec('C:\\xampp\\mysql\\bin\\mysql.exe -h localhost -u root -ppassword -D db-name -e "SELECT * FROM table-name;"');
Please note that you don't want a space between the -p option and the password itself. If you don't have a password, then just remove the -p option altogether.
Edit: The sample command above is a very arbitrary one to demonstrate running the mysql command. If you want to read and execute a bunch of SQL statements in a file, the following is more what you need:
shell_exec('C:\\xampp\\mysql\\bin\\mysql.exe -h localhost -u root -ppassword -D db-name < C:\\xampp\\htdocs\\path\\to\\file.sql');
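The `< file.sql` part is ordinary shell input redirection. A server-free sketch (with cat standing in for mysql.exe, and a made-up file name) shows what the mysql client receives on standard input:

```shell
# write two stand-in statements to a file (file name is hypothetical)
printf 'SELECT 1;\nSELECT 2;\n' > statements.sql
# "< statements.sql" feeds the file to the command's standard input;
# mysql reads and executes stdin the same way cat echoes it here
cat < statements.sql
# prints:
# SELECT 1;
# SELECT 2;
```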

How to fetch & store over 100k records to DB from another DB

I have a school database with more than 80,000 records, and I want to update it and insert into my newSchool database using PHP. Whenever I try to run the query, it updates or inserts almost 2,000 records and after some time stops automatically. Please help.
You could (should) do a full dump and import that dump later. I'm not sure how to do it with PHP; I think you'd be better off doing this with these commands on the CLI:
mysqldump -u <username> -p -A -R -E --triggers --single-transaction > backup.sql
And on your localhost:
mysql -u <username> -p < backup.sql
The meanings of the backup command's flags, from the docs:
-u
DB_USERNAME
-p
DB_PASSWORD
Don't paste your password here, but enter it after mysql asks for it. Using a password on the command line interface can be insecure.
-A
Dump all tables in all databases. This is the same as using the --databases option and naming all the databases on the command line.
-E
Include Event Scheduler events for the dumped databases in the output.
This option requires the EVENT privileges for those databases.
The output generated by using --events contains CREATE EVENT
statements to create the events. However, these statements do not
include attributes such as the event creation and modification
timestamps, so when the events are reloaded, they are created with
timestamps equal to the reload time.
If you require events to be created with their original timestamp
attributes, do not use --events. Instead, dump and reload the contents
of the mysql.event table directly, using a MySQL account that has
appropriate privileges for the mysql database.
-R
Include stored routines (procedures and functions) for the dumped
databases in the output. Use of this option requires the SELECT
privilege for the mysql.proc table.
The output generated by using --routines contains CREATE PROCEDURE and
CREATE FUNCTION statements to create the routines. However, these
statements do not include attributes such as the routine creation and
modification timestamps, so when the routines are reloaded, they are
created with timestamps equal to the reload time.
If you require routines to be created with their original timestamp
attributes, do not use --routines. Instead, dump and reload the
contents of the mysql.proc table directly, using a MySQL account that
has appropriate privileges for the mysql database.
--single-transaction
This option sets the transaction isolation mode to REPEATABLE READ and
sends a START TRANSACTION SQL statement to the server before dumping
data. It is useful only with transactional tables such as InnoDB,
because then it dumps the consistent state of the database at the time
when START TRANSACTION was issued without blocking any applications.
If you only need the data and don't need routines or events, just skip those flags.
Be sure to commit after every few commands, for example after 500 rows. That saves memory, but has the drawback that a rollback can only go back to the last commit.
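If the import keeps dying partway through, one crude shell-level way to get commit-like checkpoints is to split the dump into fixed-size chunks and feed each chunk to mysql separately. This assumes one statement per line, and all file names here are made up:

```shell
# stand-in for a dump containing 2000 one-line statements
seq 1 2000 > big.sql
# split into chunks of 500 lines each: chunk_aa, chunk_ab, chunk_ac, chunk_ad
split -l 500 big.sql chunk_
# each chunk could then be imported on its own, e.g.:
#   for f in chunk_*; do mysql -u user -p dbname < "$f"; done
ls chunk_* | wc -l
# prints: 4
```

If a chunk fails, only that chunk needs to be re-run, rather than the whole dump.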

Exporting data from database to xml

I need to save data from a relational database record set in MySQL (composed of multiple tables and multiple sub-records for each master record) to an xml file. My questions are:
Is there an xml standard format for defining the relation between the database structure and the xml to output?
Are there recommended open-source libraries in PHP that would do the job?
It is possible to export and import XML data using the mysql monitor.
$ mysql -uroot -p --xml -e 'SELECT * FROM tablename' databasename > /tmp/tablename.xml
The mysql monitor has an --xml option, which enables us to dump data in XML format. The -e option executes a statement and quits the monitor.
For the whole database you can use the mysqldump tool:
mysqldump --xml databasename > databasename.xml
or
mysqldump -X databasename > databasename.xml
Hope it helps!!

Error importing MySQL database using phpMyAdmin

I am importing a MySQL database from another server to my own server using phpMyAdmin, but there is a problem:
I go to Import and choose a file with an extension like .sql or .xml. After that I click the OK button, but it doesn't give any response or do anything; the page just stays as it is.
I also tried with the MySQL command prompt, using mysqldump:
mysqldump -u username -p databse > database name
but this also gives an error.
Can any one please help me in solving this?
This command is used to export:
mysqldump -h hostname -u username -p database > sql_file_name
To import, you should use:
mysql -h hostname -u username -p database < full_file_path
First you need to export the DB from the first server. On your own server, first create a new DB and then do the import; the data you imported will go into the DB you just created.

Use php to backup a specific database table and compress it

Using PHP, is it possible to create a backup of a specific MySQL database table?
Additionally, I would like it gzipped.
I would like the file to be ready to load directly using the standard mysql import command, i.e.:
mysql -uuser -ppass mydb < mybackupfile.sql
Current solutions I am looking at involve iterating over each row and appending to a file; however, this doesn't seem right.
You could use PHP to execute a mysqldump command, like so...
exec("mysqldump -uuser -ppass --compact mydb mytable > filename.sql");
Then run another command to compress it however you like.
The best tool for this job is mysqldump and gzip combined:
mysqldump ...options... | gzip -9 > backup.sql.gz
As a note, you'll need to reverse this process before restoring:
gunzip -dc backup.sql.gz | mysql ...options...
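A server-free sketch of the same compress/decompress round trip (the SQL line here is just a stand-in for real mysqldump output):

```shell
# stand-in for mysqldump output
printf 'CREATE TABLE demo (id INT);\n' > backup.sql
# compress at maximum level, writing to a new file and keeping the original
gzip -9 -c backup.sql > backup.sql.gz
# decompress to stdout, as you would before piping into mysql
gunzip -c backup.sql.gz
# prints: CREATE TABLE demo (id INT);
```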
PHP itself might be able to initiate this process but it's not an exceptionally good tool on its own. You could try phpMyAdmin and see if that works for you.
