Run mysql.sql using PHP

I have a MySQL dump file with a .sql extension. How can I run that .sql file from a PHP script, and what should the script look like?
The .sql file is already on the server, and I want to run it from a PHP file located in the same directory as the .sql file.

You could do this:
$query = file_get_contents($file);
$mysqli->multi_query($query);
mysqli::multi_query can do the job, since a dump file usually contains many statements separated by semicolons.
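A fuller sketch of that approach (connection details, database and file names here are placeholders you would replace):
<?php
// Sketch: run a whole .sql dump with mysqli::multi_query().
// Host, credentials, database and file name are placeholders.
$mysqli = new mysqli('localhost', 'root', 'password', 'mydb');
if ($mysqli->connect_errno) {
    die('Connect failed: ' . $mysqli->connect_error);
}

$sql = file_get_contents(__DIR__ . '/dump.sql');

if ($mysqli->multi_query($sql)) {
    // Flush every result set so the connection is free for the next statement.
    do {
        if ($result = $mysqli->store_result()) {
            $result->free();
        }
    } while ($mysqli->more_results() && $mysqli->next_result());
}

echo $mysqli->errno ? 'Import failed: ' . $mysqli->error : 'Import finished.';
$mysqli->close();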

the mysql file is already on the server
Then don't use PHP for this at all if you can avoid it.
On Linux: install the mysql client and run a shell command like
mysql -u root -ppassword -P port_number < any.sql
On Windows: you still need to install the mysql client, then repeat the command above.
If you don't have SSH access to the server and don't have the right to use exec(), read the .sql file in PHP, split it into its individual queries, and execute them one by one, as in the sketch below.
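A minimal sketch of that split-and-execute fallback, assuming the dump only contains simple statements that end with a semicolon at the end of a line (it will not handle stored procedures or semicolons inside string literals):
<?php
// Sketch: execute an .sql dump one statement at a time.
// Connection details and file name are placeholders.
$mysqli = new mysqli('localhost', 'root', 'password', 'mydb');

$statement = '';
foreach (file(__DIR__ . '/any.sql') as $line) {
    $trimmed = trim($line);
    // Skip blank lines and comments.
    if ($trimmed === '' || strpos($trimmed, '--') === 0 || strpos($trimmed, '/*') === 0) {
        continue;
    }
    $statement .= $line;
    // A trailing semicolon marks the end of a statement.
    if (substr($trimmed, -1) === ';') {
        if (!$mysqli->query($statement)) {
            echo 'Error: ' . $mysqli->error . "\n";
        }
        $statement = '';
    }
}
$mysqli->close();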

Related

Apache and MySQL using Ubuntu MATE

I'm new to Ubuntu. I tried to run a .php file and connect it to a database. Everything is set up. I already imported the database in phpMyAdmin, but every time I access my database it returns an error:
This page isn't working. localhost is currently unable to handle this request. HTTP ERROR 500
It turns out my database isn't running at all. On Windows I just open XAMPP and click the Apache and MySQL buttons, but on Ubuntu I have no idea how to start or run MySQL and Apache. I already tried running commands in the terminal, but it didn't help. Someone else installed everything on this computer; I just don't know how to run it or which web-server platform it is running.
How do I do it, and how would I know that my database is running and accessible?
Try to connect to your database from the command line:
mysql -u [username] -p
Replace [username] with your actual MySQL username, for example root.
It will prompt for the password, so type yours.
The prompt will change to
mysql>
Now list all the databases to see whether yours exists:
show databases;
This lists every database, so you can verify whether yours is there.
Then select your database with
use databasename;
and run
show tables;
It will show all the tables.
That way you can verify that MySQL is working, the database exists, and the tables are there.
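Since the HTTP 500 comes from a PHP page, it can also help to test the connection from PHP itself with a tiny throwaway script; this is only a sketch, with placeholder credentials:
<?php
// Sketch: quick connectivity check (replace the placeholder credentials).
$mysqli = @new mysqli('localhost', 'root', 'your_password', 'your_database');

if ($mysqli->connect_errno) {
    echo 'Connection failed: ' . $mysqli->connect_error;
} else {
    echo 'Connected to MySQL server ' . $mysqli->server_info;
    $mysqli->close();
}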
Use this command; it will start the database if you have it installed:
sudo systemctl start mysql
This should do the trick, provided MySQL or MariaDB is installed.
Check by running this command in the terminal after the first one:
mysql
You can also add arguments for the host and login:
mysql -h (your host, default is localhost) -u (user, default is root) -p (prompts for the password, default is none)
Check your file access permissions:
sudo chmod -R 777 "location of your file"

How to dump a MySQL db from the terminal on a MacBook

I'm a new MacBook Pro user.
Because the file is so large, I need to import it from the terminal.
I already know how to import and export MySQL data using the terminal on Linux,
but since I'm a newbie in the macOS environment, I'm lost.
I think I'm missing something, maybe the path, I just don't know what.
I'm using XAMPP. I access my htdocs folder from the terminal with
cd /Applications/xampp/xamppfiles/htdocs/abcFolder
and then I try to import my db with
mysql -u root mir_existing < mirdb_21_november_2016\ \(1\).sql
Since I have no password, I leave out the -p option.
But when I press Enter to run the command, the result is "command not found".
Many of you referred me to this page:
How can I access the mysql command line tool when using XAMPP in OS X?
I already did that, but I don't know what path to use to reach my .sql file for the import. It's different.
For example, I need to run this command to import the db, right?
mysql -u mysql_user -p DATABASE < backup.sql
My backup.sql is at htdocs/abcFolder/backup.sql.
How can I access it?
Should I try this?
mysql -u mysql_user -p DATABASE < htdocs/abcFolder/backup.sql
I already tried that.
Nothing happens. Sigh.
How do I import my db?
If doing
which mysql
doesn't yield any results, you'll need to add the /path/to/mysql to your PATH variable and put it in your .bash_profile or .bashrc for future use so you don't have to keep adding it. After adding it to one of these files, just do
source .bash_profile
or
source .bashrc
depending on where you put it.
e.g. on one of my Macs I use MAMP and need access to the binaries it provides (mysql among them), so this is in my .bash_profile:
export PATH="$PATH:/Applications/MAMP/Library/bin"

How would I perform an SQLite backup through PHP

I am attempting to use the SQLite backup command through PHP.
This requires three steps:
1) open the db: sqlite3 testing.sqlite
2) back up the DB: .backup testing_backup.sqlite
3) close sqlite: .exit
The exec command doesn't like this, as it hangs while the sqlite3 process stays open. I have tried running all three commands together joined with &&, but that doesn't work either.
Can anyone help me run the three commands through PHP?
This is an attempt to create a backup file to solve a database locking issue.
The sqlite3 command-line shell can also receive command(s) as parameters.
Just execute the following command:
sqlite3 testing.sqlite ".backup testing_backup.sqlite"
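Wrapped in PHP, that one-liner avoids the interactive session entirely, which is what was making exec() hang. A sketch, assuming the database file sits next to the script and sqlite3 is callable by the web server user:
<?php
// Sketch: back up an SQLite database by invoking the sqlite3 CLI once.
// File names are placeholders.
$db     = __DIR__ . '/testing.sqlite';
$backup = __DIR__ . '/testing_backup.sqlite';

$cmd = sprintf('sqlite3 %s %s 2>&1',
    escapeshellarg($db),
    escapeshellarg('.backup ' . $backup));

exec($cmd, $output, $status);

echo $status === 0
    ? 'Backup written to ' . $backup
    : 'Backup failed: ' . implode("\n", $output);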

Cannot import a too-large SQL file into MySQL

I have a 28 MB .sql file that I need to import into MySQL.
First I tried importing through XAMPP, and it failed, so I changed max_file_uploads and post_max_size (something like that) in php.ini-development and php.ini-production to 40 MB, but it still shows "max: 2048kb" and the import fails again.
From research I learned that I can import with mysql.exe, so I open mysql.exe and type the command line (MS-DOS) below:
-u root -p dbname < C:\xxx\xxx\sqlfile.sql
but it still fails again and again.
What is the problem? XAMPP, or my MySQL settings?
Try this:
mysql -uroot -p --max_allowed_packet=24M dbname
Once you log into the database:
source C:\xxx\xxx\sqlfile.sql
I think you should then be able to load your file.
How large is your file? You might as well do it from a console:
mysql -u##USER## -p ##YOUR_DATABASE## < ##PATH_TO_YOUR_FILE##
Do that without launching mysql.exe first: just "cd" into the right directory and run the command.
It will ask for your password and start importing right away. Don't forget to create the database first, or drop all its tables if it already exists.
I always found this approach quick, painless and easier than fiddling with PHP directives, phpMyAdmin configuration or external applications. It's right there, built into the MySQL client.
You should increase max_allowed_packet in MySQL.
Just execute this command before importing your file:
set global max_allowed_packet=1000000000;
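If you are driving the import from PHP rather than the mysql prompt, you can check (and, with sufficient privileges, raise) that variable through mysqli as well; a sketch with placeholder credentials:
<?php
// Sketch: inspect max_allowed_packet before a large import (placeholder credentials).
$mysqli = new mysqli('localhost', 'root', 'password');

$row = $mysqli->query("SHOW VARIABLES LIKE 'max_allowed_packet'")->fetch_assoc();
echo 'max_allowed_packet is currently ' . $row['Value'] . " bytes\n";

// Raising it globally needs the SUPER (or SYSTEM_VARIABLES_ADMIN) privilege.
$mysqli->query('SET GLOBAL max_allowed_packet=1000000000');
$mysqli->close();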
I also ran into a similar problem and concluded that the large .sql file would never import that way; it always gave a timeout error.
Then I found a solution.
There is a tool called HeidiSQL.
Follow these steps:
1) download the software
2) install it
3) create a new session in HeidiSQL and open the session
4) go to Tools -> Load SQL File -> Browse
That's it. This solution worked best for me.
I found the only solution was to log in to MySQL from the command line and use the 'source' command:
1) cd to the directory containing the SQL file to import, then log into MySQL:
#> mysql -u YOURUSERNAME -p -h localhost
2) use MySQL commands to import the data:
mysql> use NAMEOFYOURDB;
mysql> source NAMEOFFILETOIMPORT.sql
This also feeds progress information back to your terminal, which is reassuring.
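If the import has to happen from PHP (no shell access at all), the upload and timeout limits can also be sidestepped by streaming the file and sending one statement at a time, much like the split-and-execute sketch in the first answer at the top of this page, but reading line by line so a 28 MB dump never has to fit in memory at once. Credentials and the file path are placeholders, and it assumes plain semicolon-terminated statements:
<?php
// Sketch: stream a large .sql dump into MySQL without loading it all into memory.
$mysqli = new mysqli('localhost', 'root', 'password', 'dbname');
set_time_limit(0); // a big import easily exceeds the default 30-second limit

$handle = fopen('C:/xxx/xxx/sqlfile.sql', 'r');
$statement = '';

while (($line = fgets($handle)) !== false) {
    $trimmed = trim($line);
    if ($trimmed === '' || strpos($trimmed, '--') === 0) {
        continue; // skip blank lines and comments
    }
    $statement .= $line;
    if (substr($trimmed, -1) === ';') {
        if (!$mysqli->query($statement)) {
            echo 'Error: ' . $mysqli->error . "\n";
        }
        $statement = '';
    }
}

fclose($handle);
$mysqli->close();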

PHP: running SSH from Windows to log in to a Linux machine and run a script

Here's my goal :
I have a Windows XP PC with all the source code in it and a development database.
Let's call it "pc.dev.XP".
I have a destination computer that runs Linux.
Let's call it "pc.demo.Linux".
Here's what I've done on "pc.dev.XP" (just so you get the context):
installed all the Cygwin stuff
created a valid RSA key and put it on the destination/backup computer so that ssh doesn't ask for a password
rsync works pretty well this way
If I try to do this on "pc.dev.XP" from a command line:
cd \cygwin\bin
ssh Fred@pc.demo.Linux "cd /var/www && ls -al"
it works perfectly without asking for a password.
Now here's what I want to do on "pc.dev.XP":
launch a PHP script that extracts the dev database into a .sql file
zip this file
transfer it via FTP to "pc.demo.Linux"
log in to "pc.demo.Linux" and execute unzip, then mysql -e "source unzipped file"
If I run this manually on "pc.dev.XP":
putty -load "myconf" -l Fred -pw XXX -m script.file.that.unzip.and.integrates.sql
it works perfectly.
Same for:
cd \cygwin\bin
ssh Fred@dest "cd /var/www && ls -al"
If I try to run those scripts through exec() in PHP (WAMP installed on "pc.dev.XP"), they hang. I'm pretty sure this is because the user is "SYSTEM" and not "Fred", and putty or ssh asks for a password, but maybe I'm wrong.
Anyway, I'm looking for a way to automate the 4 tasks I've described, and I'm stuck because exec() hangs. There's no problem with the safe_exec_mode or safe_exec_dir directives; they're disabled on the development machine, so exec() works fine for basic things like exec("dir").
Any idea what I could do / check / correct ?
I'm not sure if this is what you need, but I typically use a construct like this to sync databases across machines:
php extractFromDb.php | ssh user@remote.com "mysql remoteDatabaseName"
This executes the PHP script locally and pipes the SQL statements the script prints out through SSH straight into the remote mysql process, which executes them against the remote database.
If you need compression, you can either use SSH's -C switch or integrate the compression program of your choice like this:
php extractFromDb.php | gzip -9 | ssh user@remote.com "gunzip | mysql remoteDatabaseName"
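extractFromDb.php here is simply whatever script prints your SQL to stdout; as an illustration only (table name and credentials are hypothetical), it could look like this:
<?php
// Hypothetical extractFromDb.php: print rows of one table as INSERT statements
// on stdout so the output can be piped into ssh/mysql.
$mysqli = new mysqli('localhost', 'root', 'password', 'devDatabaseName');

$table  = 'some_table';
$result = $mysqli->query("SELECT * FROM `$table`");

while ($row = $result->fetch_assoc()) {
    $values = array_map(function ($v) use ($mysqli) {
        return $v === null ? 'NULL' : "'" . $mysqli->real_escape_string($v) . "'";
    }, array_values($row));
    echo "INSERT INTO `$table` VALUES (" . implode(', ', $values) . ");\n";
}

$mysqli->close();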
Do you want to do this from PHP running under Apache, as in "I go to http://myWebserver.com/crazyScript.php and all of this happens"? Or do you just want to write your scripts in PHP and invoke them from the command line?
If you want the first option, try running your Apache/IIS under a different user that has the credentials to perform all those tasks.
"if I run on the development PC manually this works perfectly."
Why not do it like that? When you run that script, I assume you're connecting to the local SSH server on the dev machine. When you do this, you are using the credentials of Fred, so everything works. When you run the PHP script, you are right that it is probably running as SYSTEM.
Try either changing the user Apache runs as, or use PHP to connect to the local SSH server so that you can supply alternate credentials.
Here's what I did: a batch file that:
1) calls a PHP file via "php.exe my_extract_then_compress_then_ftp.php"
2) calls rsync to synchronize the source folder
3) calls putty -l user -pw password -m file_with_ssh_commands_to_execute
It works like a charm.
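If you would rather stay inside PHP than use a batch file, one way to keep exec() from hanging is to call plink with its -batch switch, so it fails immediately instead of waiting for input when credentials or the host key are missing. A sketch; the plink path, host and command file are placeholders:
<?php
// Sketch: run a remote command file over SSH from PHP via plink in batch mode.
// Paths, host and credentials are placeholders.
$cmd = '"C:\\PuTTY\\plink.exe" -batch -ssh -l Fred -pw XXX pc.demo.Linux'
     . ' -m "C:\\scripts\\ssh_commands.txt" 2>&1';

exec($cmd, $output, $status);

echo $status === 0
    ? "Remote script finished.\n"
    : "Failed:\n" . implode("\n", $output);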
