Old MySQL to new MySQL? - php

I was learning some web design about 18+ months ago, got caught up in some other things, and had to stop for a while. Now I'm getting back into web design. I used XAMPP back then and I'm using XAMPP now as well. I found my old zip of the entire XAMPP folder and extracted it, thinking I might get lucky and recover my databases from MySQL. I went into xampp\mysql\data and copied all the folders for the databases I wanted to put into the new MySQL. When I loaded up phpMyAdmin, the databases were there, but they were all empty.
The main database I want to save has these files inside of it...
db.opt
pcms_categories.frm
pcms_meta.frm
pcms_posts.frm
pcms_users.frm
So I can see the table files there, but MySQL isn't loading them. Maybe my old version's files are obsolete and the new MySQL won't read them; I'm not sure. Just to mention it: after transferring the files, I stopped MySQL and reloaded it.
It was so long ago that it doesn't exactly matter whether I can recover this data, but I was working on a blog, and it would be cool to get that back and finish what I started.

Try to dump it to .sql, perhaps? In other words, export/import.
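For example, assuming the old MySQL server will still start and the database is named blog (a placeholder name here), you could run mysqldump -u root -p blog > blog.sql against the old server and then mysql -u root -p blog < blog.sql against the new one. mysqldump writes the schema and data out as plain SQL statements, so the result survives version changes that the raw data files don't.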

First, I'd check for permission issues. MySQL usually requires special permissions on its data files; check whether MySQL has created its own user/group. Next, I'd try to find out what version of MySQL you previously had, create an installation of that version, and use it to dump the data files to SQL. Note that .frm files only hold the table definitions; for InnoDB tables the actual rows live in ibdata1 (or in per-table .ibd files), so those need to be copied across as well.

Related

Moodle reports from Database

I have a Moodle database which I exported a few months ago before our server went down. Now I want to generate reports from my old database. I have tried to import it into a new Moodle site, but the moodledata folder is missing, so I'm looking for another way to generate reports from my database. I have tried writing MySQL queries, but I think that would take a lot of time right now. I need help: is there any tool or API I can use to generate reports from my database? I have tried Seal Report to tackle this, but I found there is a lot of manual work involved; I don't mean this tool can't do it, I'm just looking for another tool that can simplify my task.
NB: I know some will say this is not a programming question. Please feel free to suggest the best way to query the database using any language.
You should be able to set up a local copy of a Moodle site with a copy of the database and with a blank Moodle data folder (I've done this regularly in order to investigate issues on a customer's site).
Once you've done that, you will have access to any reporting tools you would normally have inside Moodle.
You may find it easiest to set up a fresh install of Moodle pointed at a blank database, then, once the install is finished, edit the config.php file to point at the restored copy of the original database. You may have to purge caches (php admin/cli/purge_caches.php) and you may have to reset the admin password (php admin/cli/reset_password.php). It is also wise to turn off email (edit config.php and add $CFG->noemailever = true;).
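As a rough sketch, the relevant config.php edits would look something like the following; the database name and credentials are placeholders, not values from any real site:

$CFG->dbname = 'moodle_restored';  // point at the restored copy of the original database
$CFG->dbuser = 'moodleuser';       // placeholder credentials
$CFG->dbpass = 'secret';
$CFG->noemailever = true;          // stop the local copy from emailing real users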

Install MediaWiki locally with a large DB: "LocalSettings.php" couldn't be generated

I'm trying to install MediaWiki (1.29.1 or 1.27.3) locally with a large Wiktionary dump (3GB).
After converting the XML dump into an SQL file and importing it into my DB, which I created with this script, I followed the MediaWiki installation instructions in the browser to generate my site-specific "LocalSettings.php". I get the message:
There are MediaWiki tables in this database. To upgrade them to MediaWiki 1.29.1, click Continue.
After clicking the Continue button, the browser stays in a loading state forever.
My understanding is that my DB containing the Wiktionary dump has some tables that are not compatible with the version of MediaWiki that I'm using, so an update of the DB is required.
I tried to run install.php from the command line to avoid a timeout in the browser. The command didn't return anything, even after waiting more than 2 hours.
I tried as well a workaround:
Create my DB with empty Tables
Generate "LocalSettings.php" from the browser (that was fast since the DB is small)
Import the wiki sql dump to my DB
Refresh the index.php page
I then got a blank page with this message:
Exception caught inside exception handler. Set $wgShowExceptionDetails = true; and $wgShowDBErrorBacktrace = true; at the bottom of LocalSettings.php to show detailed debugging information.
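For reference, those two switches (quoted from the error message itself) go at the bottom of LocalSettings.php:

$wgShowExceptionDetails = true;   // show the full exception message instead of the generic one
$wgShowDBErrorBacktrace = true;   // include a backtrace for database errors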
All the examples and tutorials I found online about this assume a small or newly created DB.
Any idea what's wrong? Has anyone actually tried to take an existing Wiktionary dump and run it locally? Why is there no such advanced example?
You wrote "I'm trying to install Wikimedia (1.29.1 or 1.27.3)". I suppose you are talking about MediaWiki, not Wikimedia. Am I right?
1) You can try a parsed version of Wiktionary. It is a little bit old (2014): http://whinger.krc.karelia.ru/soft/wikokit/index.html
2) You can try my tutorial on downloading a Wiktionary dump, uploading it to MySQL, and converting and parsing it into something more useful to work with: Getting started Wiktionary parser.
See: MySQL import
The issue, at the first level, originates from mwdumper, which seems to be outdated. The SQL DB I generated using mwdumper was missing some tables that should, in theory, have been created by running update.php. It was not possible for me to run any PHP file, either from the shell or from the browser, and I suspect the size of the dump was the cause.
The workaround which, by some magic, helped to overcome this issue was:
Run update.php from the shell with the DB credentials missing. This somehow enables logging and makes the execution of index.php possible through the browser.
Manually add the missing columns claimed in the error messages to the tables (the column types should be respected here).
Place a LocalSettings.php file, generated easily from a Wiktionary DB with empty tables, in the right directory of the MediaWiki installation.
Run index.php from the browser.
Et voilà! The huge Wiktionary MySQL dump is now queryable through the MediaWiki interface. I'm not sure such a trick can be called a solution, but it solved the problem in my case. An explanation of what could have happened in the background would definitely be helpful.
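To illustrate the column-fixing step (the column here is just an example; whether it is missing depends on your dump): if an error complained about page.page_lang, you would add it with ALTER TABLE page ADD COLUMN page_lang varbinary(35) DEFAULT NULL;, copying the column type from the schema of the MediaWiki version you are running.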

Moving XAMPP after formatting my computer and finding all my databases empty

I recently had to format my computer, but I made sure I backed up all my files first. I develop lots of local sites using XAMPP on my Windows 7 machine.
When I moved the files back onto the same drive (C:) and then looked in phpMyAdmin, the tables were there but all empty.
I have all the XAMPP files, including the .frm files, but phpMyAdmin shows the DBs as empty, i.e. with no tables inside.
I really hope someone can help, as I have potentially lost a LOT of work. Thanks in advance.
When you said "I made sure I backed up all my files", which files are you talking about? There are the MySQL data files, the XAMPP program files, and then any exported .sql files you may have backed up. The best way to create a backup is to dump an SQL file from the database ("Export" from within phpMyAdmin, though I generally use the mysqldump command-line tool). If you didn't do that but still have the MySQL data directory, the links provided by Peter Michael will guide you (although relying on the MySQL data directory isn't generally a good backup method, because of the various inconsistencies or losses that can occur with your data; whether it works at all also depends on your table type). I don't know where those data files are stored when using XAMPP, and a cursory Google search gives a couple of different options, so good luck.

Why do my MySQL database tables only exist in my Workbench?

I am developing my first PHP/MySQL system but have minimal database experience. I have successfully created a schema with approximately 10 tables, and the tests have so far worked fine.
My problems started when I created a dump of the database as a copy for my laptop, so I could continue to test new web pages and new queries even when I am not at home and still use a locally hosted database.
I have successfully imported the dump and created the schema in my Workbench, and it is visible from the command line via "show databases".
However, both my web application and the command-line prompt are unable to locate the underlying tables. The problem is at least not directly in my PHP code, as it works perfectly on another computer with another, presumably identical, locally hosted database. I have researched the matter online for what feels like a decade, but without any relevant results.
The command line is able to find tables from every other schema, like the sakila examples. I have tried to compare the settings of the two, but they seem identical; for instance, they are both InnoDB.
I realise that I am not giving enough information for a direct solution here, but I would greatly appreciate it if anyone could guide me toward the right questions to ask, so I can solve this matter and get on with my PHP learning.
(SOLVED)
I discovered that the list of open tables had references to an additional schema. I then realised that I had installed the XAMPP bundle before I installed MySQL Workbench. In other words, I already had an existing database instance, with all of the regular examples included, running on port 3306. It also happened to contain an identical but empty schema with the name I was trying to access.
My newly added MySQL installation was on port 3307, and so I am now able to access it. This issue was just one of several I have had to work around, but they were all caused by the same mistake. Thanks everyone for trying to help.
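For anyone hitting the same thing: the port is easy to miss, because PHP's mysqli defaults to 3306. A minimal sketch of connecting to the second instance explicitly (credentials and schema name are placeholders):

// Connect to the Workbench-managed instance on 3307,
// not the XAMPP instance that owns the default port 3306.
$db = new mysqli('127.0.0.1', 'root', 'secret', 'my_schema', 3307);
if ($db->connect_error) {
    die('Connect failed: ' . $db->connect_error);
}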

How would you implement an auto-update for a standalone PHP project using GitHub?

My project is a collection of PHP scripts using MySQL as a database and needs to be installed locally using WAMP/LAMP/MAMP.
Previously I had been sending users a link to a zipped archive and having them overwrite their installation, but since I took the plunge with GitHub, I've realized that there are far better ways, namely Service Hooks in GitHub. However, this only works as long as I don't alter the database in any way, which is a real possibility.
I've been toying with how I would implement this, but I can't find a clear solution. So far I've concluded that I need a directory (say update/) which contains .sql files for each update. The PHP script will then check that directory for a file corresponding to the new version number (I'm not sure how I will define a version number; I was thinking of using the commit ID, but that won't be available until after the commit, so...).
I would love some input on this!
Here's how I would tackle this (not the most elegant or performant approach; see the sketch after the list):
Add a flag in the DB with a version number
Add a min-version number in your DB layer PHP file
Check that the DB version is at least the min-version
If it is: continue about your business
Else: run the PHP file in update/, which would run a series of ALTER TABLE commands on the DB server
Then update the version number in the DB to the latest number
All done
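A minimal sketch of that flow, assuming a one-row settings table that stores the schema version and per-version scripts named update/2.sql, update/3.sql, and so on (all names and credentials here are invented for illustration):

// The schema version this code expects; bump it whenever you add an update/N.sql file.
$minVersion = 3;

$db = new mysqli('127.0.0.1', 'root', 'secret', 'myapp');  // placeholder credentials

// Read the schema version currently recorded in the DB.
$current = (int) $db->query('SELECT version FROM settings LIMIT 1')->fetch_assoc()['version'];

// Apply each missing update script in order, recording progress as we go.
for ($v = $current + 1; $v <= $minVersion; $v++) {
    $db->multi_query(file_get_contents(__DIR__ . "/update/$v.sql"));
    while ($db->more_results() && $db->next_result()) { /* flush multi-query results */ }
    $db->query("UPDATE settings SET version = $v");
}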
Alternatively, instead of querying the DB, you can have a file which is generated by your DB interface PHP file (and ignored with .gitignore) which you can check just as above.
I would really recommend checking out Doctrine and its migration feature.
This does exactly what you are looking for, plus you get a very nice tool for working with all other aspects of your database handling.
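Roughly, a Doctrine migration is one PHP class per schema change, along these lines (sketched from the doctrine/migrations documentation; the class name and SQL are invented):

use Doctrine\DBAL\Schema\Schema;
use Doctrine\Migrations\AbstractMigration;

final class Version20240101000000 extends AbstractMigration
{
    public function up(Schema $schema): void
    {
        // Applied when migrating forward to this version.
        $this->addSql('ALTER TABLE users ADD last_login DATETIME DEFAULT NULL');
    }

    public function down(Schema $schema): void
    {
        // Reverses the change when rolling back.
        $this->addSql('ALTER TABLE users DROP COLUMN last_login');
    }
}

The migrations tool tracks which versions have been applied in a table of its own, which is essentially the version-number flag from the previous answer, maintained for you.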
