I've been doing some googling but I don't know if my search terms could be better.
I am making a database that will keep customer information. I have a few reps who will have access to that data through the PHP website front end, and I would like to put something in place that lets me undo a mistake if one occurs.
My initial thought was to create a trigger for all UPDATE and/or INSERT queries that would save the query used and back up the contents of the affected table before the query executes.
I wasn't sure if that would be considered best practice or if there is a better option I'm overlooking.
I would appreciate any input on whether this route seems best or whether I should be looking at a different option.
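Roughly what I had in mind, as a sketch: an audit table plus a BEFORE UPDATE trigger that copies the old row away before it is changed. The table and column names (customers, id, name, email) are just placeholders for illustration, and the DDL could equally be run once from the mysql client instead of PHP.

    <?php
    // One-time setup: an audit table and a trigger that saves the old row
    // before every UPDATE. Names are placeholders for illustration only.
    $db = new mysqli('localhost', 'user', 'pass', 'mydb');

    $db->query("
        CREATE TABLE IF NOT EXISTS customers_audit (
            audit_id    INT AUTO_INCREMENT PRIMARY KEY,
            customer_id INT,
            name        VARCHAR(255),
            email       VARCHAR(255),
            action      VARCHAR(10),
            changed_at  DATETIME
        )
    ");

    $db->query("
        CREATE TRIGGER customers_before_update BEFORE UPDATE ON customers
        FOR EACH ROW
            INSERT INTO customers_audit (customer_id, name, email, action, changed_at)
            VALUES (OLD.id, OLD.name, OLD.email, 'UPDATE', NOW())
    ");

Undoing a mistake would then just mean copying the relevant row back out of customers_audit.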
Thanks!
Related
I use PHP to fetch data from a MySQL database for a web application that I am working on. I have over a hundred PHP files, each with a number of SQL statements. Currently I run the risk of forgetting to update queries in other files when a change is made.
Is there a way to better manage all these SQL statements/queries?
I was thinking along the lines of having a file with a list of functions, each running the relevant SQL statement against the database. A particular statement may be used in multiple files, so instead of writing out the statement in each file, I could just call the relevant function. Then if I have to update the statement for any reason (e.g. additional fields added to the table), I only have to update the query in one place; I don't have to search through multiple files, and I don't have to worry about missing a file.
I'm just not entirely sure whether functions are the best way to go about this, because of scope. The config file has the DB connection credentials, so I'll have to think about how to pass variables to the functions... it's a rabbit hole I don't want to go down yet unless I know I'm on the right track.
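For example, something along these lines is what I was imagining, passing the connection in as a parameter to sidestep the scope problem. The function names, table names, and columns here are made up purely for illustration.

    <?php
    // queries.php - hypothetical central home for all SQL used by the app.
    // $db is the mysqli connection created from the config file's credentials.

    function getCustomerById(mysqli $db, int $id): ?array
    {
        $stmt = $db->prepare('SELECT id, name, email FROM customers WHERE id = ?');
        $stmt->bind_param('i', $id);
        $stmt->execute();
        return $stmt->get_result()->fetch_assoc();   // null when there is no match
    }

    function getOpenOrders(mysqli $db): array
    {
        $result = $db->query("SELECT id, customer_id, total FROM orders WHERE status = 'open'");
        return $result->fetch_all(MYSQLI_ASSOC);
    }

Each page would then just require 'queries.php' once and call getCustomerById($db, $id), so a change to the SELECT only ever has to be made in one place.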
Looking forward to any help and advice, as well as any examples or resources to assist.
Many thanks.
I have built a web app using MySQL and PHP, and at the moment I'm trying to figure out the best approach for a script that automatically updates the customers' databases with the new changes from my development environment, without affecting the customer data.
So far my first attempt was to check the app version and, if there is a new one, download the zip with the new and changed files; then I do a mysqldump of the customer's data (skipping triggers, etc.), drop all tables in the customer's database, load the new schema, and reload the dumped file.
The problem I am facing is that this only works if the change to the schema is a minor one; if I decide to add a couple of columns with new values, remove unused tables, or remove unused rows, the reload fails.
So my question is: what's the best approach to safely update the different databases based on my development database changes?
I guess the best way is to include an upgrade script with all the needed queries in it?
But is this the right way?
Is there some automatic way to handle this, to avoid having to write each change by hand and to avoid missing some change in the script and breaking everything?
I really appreciate your opinions, as looking around I didn't find any clear approach to this procedure.
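To make it concrete, this is the kind of upgrade script I was imagining: each change is applied in order, driven by a version number stored in each customer's database. The schema_version table, the version numbers, and the ALTER statements below are only placeholders.

    <?php
    // upgrade.php - sketch: apply schema changes in order, based on a
    // schema_version table kept in each customer's database.
    $db = new mysqli('localhost', 'user', 'pass', 'customer_db');

    // version number => queries that bring the schema up to that version
    $migrations = [
        2 => ["ALTER TABLE customers ADD COLUMN phone VARCHAR(20) NULL"],
        3 => ["DROP TABLE IF EXISTS unused_stuff",
              "DELETE FROM settings WHERE name = 'obsolete_option'"],
    ];

    $current = (int) $db->query("SELECT version FROM schema_version")->fetch_row()[0];

    foreach ($migrations as $version => $queries) {
        if ($version <= $current) {
            continue;                         // this customer already has this change
        }
        foreach ($queries as $sql) {
            $db->query($sql);
        }
        $db->query("UPDATE schema_version SET version = $version");
    }

Each customer could then be brought from whatever version they are on up to the latest one, without touching their data except where a migration says so.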
So I have an old website which was coded over an extended period of time but has been inactive for 3 or so years. I have the full PHP source to the site, but the problem is I no longer have a backup of the database. I'm wondering what the best solution for recreating the database would be? It is a large site, so manually going through each PHP file and trying to keep track of which tables are referenced is no small task. I've tried googling for the answer but have had no luck. Does anyone know of any tools available to help extract this information from the PHP and at least give me the basis of a database skeleton? Otherwise, has anyone ever had to do this? Any tips to help me along and possibly speed up the process? It is a MySQL database that I'm trying to rebuild.
The way I would do it:
Write a subset of mysqli (or whatever interface was used to access the DB) that intercepts all DB accesses.
Replace all DB accesses with your dummy version.
The basic idea is to emulate the DB so that the PHP code runs long enough to exercise the various DB accesses, which in turn lets you analyze the way the DB is built and used (a rough sketch of such a stub follows the list below).
From within these dummy functions:
print the SQL code used
regenerate just enough dummy results to let the rest of the code run, based on the tables and fields mentioned in the query parameters and the PHP code that retrieves them (you won't learn much from a SELECT *, but you can see what fields the PHP code expects to get from it)
once you have understood enough of the DB structure, recreate the tables and let the original code work on them little by little
have the previous designer flogged to death for not having provided a way to recreate the DB programmatically
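A minimal sketch of that kind of stub, assuming the old code can be pointed at a mysqli-style object (the class names, log file, and dummy field are invented for illustration):

    <?php
    // A fake "database" that records every query instead of executing it.
    // Drop it in wherever the real connection used to be created.

    class FakeResult
    {
        private $returned = false;

        public function fetch_assoc()
        {
            // Hand back one dummy row so loops in the old code execute once;
            // the "undefined index" notices that follow tell you which field
            // names the PHP code expects each query to return.
            if ($this->returned) {
                return null;
            }
            $this->returned = true;
            return ['dummy_field' => 'dummy_value'];
        }
    }

    class FakeDb
    {
        public function query($sql)
        {
            // Log every statement the application tries to run.
            file_put_contents('queries.log', $sql . "\n", FILE_APPEND);
            return new FakeResult();
        }
    }

    $db = new FakeDb();   // instead of: $db = new mysqli(...);

Clicking around the site with this in place gradually fills queries.log with the table and column names the site actually uses.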
There are currently two answers based on the information you provided.
1) You can't do this
PHP is a typeless language. You could scan your SQL statements for field and table names, but the result will not be complete. If there is a SELECT * FROM table, you can't see the fields, so you would then need to check where the PHP code accesses the fields, either by name or by index. You can count yourself lucky if it is done by name, because then you can extract the field names. Even so, the data types will be missing, and so will the rest: which columns are indexed, what the primary keys are, constraints, etc.
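For example, a crude first pass over the source that just pulls table names out of literal SQL strings (the path and the regex are simplistic placeholders; dynamically built queries will slip through):

    <?php
    // Rough sketch: collect table names that follow FROM / INTO / UPDATE / JOIN
    // in every PHP file. Adjust the path (and recurse) to match your source tree.
    $tables = [];
    foreach (glob('src/*.php') as $file) {
        $code = file_get_contents($file);
        if (preg_match_all('/\b(?:FROM|INTO|UPDATE|JOIN)\s+`?(\w+)`?/i', $code, $matches)) {
            foreach ($matches[1] as $name) {
                $tables[$name] = true;
            }
        }
    }
    print_r(array_keys($tables));   // a first guess at the list of tables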
2) Easy, yes you can!
Because your PHP uses a modern framework that contains an ORM, and the ORM created the database for you. All the meta information is included in the PHP classes/design.
Just check the manual for how to recreate the database.
I am working on a database in MySQL and I need to make a user-facing page that allows me to enter text for each field and then submit the record. I have been able to accomplish this easily; however, it is getting quite annoying having to update input.html and save.php every time I decide to add/remove a field.
It really seems like there should be some sort of program that can auto-maintain the code for me and allow me to just focus on the database structure. Does anyone know of something that does this? I feel like I am doing it all wrong.
Thanks in advance.
P.S. I realize that I could just use phpMyAdmin, but I do not want to give full DB access to my data entry people; plus, they are not technical types and I don't want to intimidate them.
In the end, I decided to make my own script, using INFORMATION_SCHEMA to get the field names, and then made a recursive loop to add each record. It wasn't hard, but it seems like there should still be a better way.
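In case it helps anyone else, roughly what the script boils down to, with the connection details and table name as placeholders:

    <?php
    // Sketch: build the input form straight from the table definition, so
    // adding or removing a column never means editing the HTML by hand.
    $db    = new mysqli('localhost', 'user', 'pass', 'mydb');
    $table = 'customers';   // placeholder table name

    $stmt = $db->prepare(
        "SELECT COLUMN_NAME FROM INFORMATION_SCHEMA.COLUMNS
         WHERE TABLE_SCHEMA = DATABASE() AND TABLE_NAME = ?
         ORDER BY ORDINAL_POSITION"
    );
    $stmt->bind_param('s', $table);
    $stmt->execute();
    $columns = array_column($stmt->get_result()->fetch_all(MYSQLI_ASSOC), 'COLUMN_NAME');

    echo "<form method='post' action='save.php'>";
    foreach ($columns as $col) {
        if ($col === 'id') {
            continue;                     // skip the auto-increment key
        }
        echo "<label>$col <input type='text' name='$col'></label><br>";
    }
    echo "<input type='submit' value='Save'></form>";

save.php can loop over the same column list to build its INSERT, so both sides stay in sync with the table.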
I am designing a web application using PHP and MySQL. I have a small doubt about the database.
The application works like this:
Users get themselves registered.
Users input workload (after login, of course :) ).
Users log out.
Now there are multiple types of inputs which I accept on the same form. Say there are 3 types of inputs, and they are stored in 7 different tables (client requirement :( ).
Now my question is: what is the best way to fire the queries once the inputs are submitted?
For now I can think of the following ways:
Fire 7 different queries from PHP
Write a trigger to propagate the inputs into the appropriate tables
Just guide me on which approach is more efficient, performance-wise?
Thanks :)
Generally you want to stay away from triggers because you will be penalized later if you have to load a lot of data. Stored procedures are the way to go. You can have different conditions set to propagate inputs into different tables if needed.
I think you need to re-think your situation. You already know how awesome it would be to have fewer tables to deal with? Well, why not simulate that situation with a properly constructed view? Then the client (are you sure it is the client? Sometimes ops says "client" when they mean "report which we need to provide later") can have as many tables as your database can handle. And, by the way, you can still fire inserts and updates on a view.
Because it seems like your database does not have a clear relationship with PHP data structures, my instinct is to separate the two more, not less. This would mean actually favoring stored procedures and triggers (assuming the above is not workable), which can be harder to debug, but it also means that PHP only has to think about
"I am inserting into this thing called <thing name>"
Instead of
"OMG, so this is like, totally intense first I have to talk to <table 1>, but I can't forget <table 2>, especially since those two might have... wait, did I miss my turn?"
OK, PHP isn't a ditz (I actually like the language), but it should also be acting as dumb as possible when it comes to actually storing things -- that's not its business.
You probably want to write a stored procedure that runs the seven queries. Think hard about how many transactions you need to run those seven queries.
How often do you think you will have to change which queries to run?
Do you have access to the database server?
Do you know which circumstance should trigger your triggers?
Are there other processes/applications writing data to the database?
If your queries change very often, I would go for code in PHP to just run the queries for you.
If you don't have access to the database server you may actually have to go for that method! You need permissions to write stored procedures and triggers.
If other processes are writing to the same database you have to discuss your requirements with the respective process owners! Otherwise data may appear/change in your database that was unwanted.
I personally tend to stay away from triggers unless they call very simple stored procedures and I'm 100% certain that nobody else is going to be bothered by the trigger!
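If the PHP route wins, here is a minimal sketch of the transaction part, so the seven inserts either all land or none of them do (the table and column names are invented; the same statements could just as well live inside a stored procedure):

    <?php
    // Sketch: run the seven inserts as one atomic unit from PHP.
    mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT);   // make mysqli throw on errors
    $db = new mysqli('localhost', 'user', 'pass', 'mydb');

    $userId = (int) $_POST['user_id'];        // example inputs from the submitted form
    $hours  = (float) $_POST['hours'];
    $notes  = $_POST['notes'];

    $db->begin_transaction();
    try {
        $stmt = $db->prepare('INSERT INTO workload_a (user_id, hours) VALUES (?, ?)');
        $stmt->bind_param('id', $userId, $hours);
        $stmt->execute();

        $stmt = $db->prepare('INSERT INTO workload_b (user_id, notes) VALUES (?, ?)');
        $stmt->bind_param('is', $userId, $notes);
        $stmt->execute();

        // ... the remaining five inserts go here ...

        $db->commit();        // all of them succeed together
    } catch (mysqli_sql_exception $e) {
        $db->rollback();      // or none of them are kept
        throw $e;
    }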