I'm working with PHP. I have a form, and each time a user submits it I want to save the data to the database first (to save the user time) and then deal with it later. But I don't know how to process the data automatically. Is there any way to have code run on the server that processes the data automatically every time a new record appears in the database?
Any suggestions? Thanks in advance!
You could use 'Stored Routines' that get triggered when the data changes, see:
http://dev.mysql.com/doc/refman/5.0/en/stored-routines.html
or, slightly less elegant, use a cron job that runs at a set interval, together with a "processed" flag on the records, so the script knows which rows it has already handled and which it hasn't, and can act accordingly.
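A minimal sketch of the cron approach, assuming a hypothetical submissions table with a processed flag column (adjust the names and the processing step to your schema):

<?php
// process_submissions.php -- run from cron, e.g. every five minutes:
//   */5 * * * * php /path/to/process_submissions.php
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

// Fetch the rows that earlier runs have not handled yet
$rows = $pdo->query('SELECT * FROM submissions WHERE processed = 0');
$mark = $pdo->prepare('UPDATE submissions SET processed = 1 WHERE id = ?');

foreach ($rows as $row) {
    // ... process the submitted data here ...
    $mark->execute(array($row['id'])); // flag it so the next run skips it
}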
You might be looking for a trigger: http://dev.mysql.com/doc/refman/5.0/en/create-trigger.html
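For example, a trigger could queue every new row for later processing; a sketch, with invented table names:

-- Runs inside MySQL after every insert into submissions
CREATE TRIGGER after_submission_insert
AFTER INSERT ON submissions
FOR EACH ROW
  INSERT INTO processing_queue (submission_id, created_at)
  VALUES (NEW.id, NOW());

Keep in mind that a trigger runs inside MySQL and cannot call your PHP code directly, so it is mainly useful for in-database bookkeeping; anything that needs PHP still wants the cron approach above.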
First of all, sorry for my English; I'll try my best!
I have a web application that displays info about data stored in my MySQL database. I would like to make it more dynamic: when new information appears in the DB, update the web content without a refresh. I was thinking about Ajax, sending a request every minute asking for new data, but that's probably not a good idea because it could put a lot of stress on my server.
What's the best way to do it?
Thanks a lot for your time!
You could use something like Firebase, which automatically pushes data to the client when changes occur in the backend database, but that means that you would have to store all your data in Firebase instead of in your MySQL database.
Apart from a service like that, one option is to use an ajax call to fetch new data, but only when new data is available. I don't know how your data is structured, or what kind of data it is, but one solution to minimize the load on your server is to have a database table with a timestamp column that's updated every time relevant data is changed, added or deleted in your database. Use ajax to fetch only this timestamp every minute and if the timestamp has changed since you last checked it, make another ajax call to fetch the actual data.
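A rough sketch of that cheap polling endpoint, assuming a made-up single-row content_version table whose timestamp is touched whenever relevant data changes:

<?php
// check_version.php -- the page polls this tiny endpoint every minute
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

$stamp = $pdo->query('SELECT last_modified FROM content_version LIMIT 1')
             ->fetchColumn();

header('Content-Type: application/json');
echo json_encode(array('last_modified' => $stamp));

On the client, compare the returned value with the one from the previous poll and only fire the second, heavier Ajax call when they differ.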
I have two web pages: page1.php and page2.php.
In page1.php
There is a mySQL query executed and saved into variable A.
In page2.php
I want the same results of the same query.
Should I reconnect to MySQL and execute the query again, or send the variable A from page1.php via POST or SESSION? Which is the faster solution? Is there any other solution?
EDIT: There are no other queries, so the data will not change (I think).
I certainly wouldn't pass it via POST, this allows for the data to be tampered with by a malicious client.
SESSION will be a bit faster assuming the dataset is not too large, and if the data is unlikely to change in the space of a user's session it's a reasonable choice.
It would be useful to know how long your query actually takes normally, to see whether the time difference would be significant. You did mention that the result set contains 14000 rows. If it's just a select from one table I can't imagine it takes very long. 14000 rows is really quite a lot to be storing in the session. Remember that it stores the dataset once for every user using the site at any one moment (since a session is specific to a single user). If you have a lot of users, you would have to keep an eye on how much memory on your server it is consuming.
Another design note: if you want exactly the same results on multiple pages, consider a 3rd script you can include in the others which does nothing but produce that output.
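A minimal sketch of that, with invented file and table names:

<?php
// results.php -- runs the query once and renders the rows.
// page1.php and page2.php then simply do: include 'results.php';
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

foreach ($pdo->query('SELECT name FROM mytable') as $row) {
    echo '<li>' . htmlspecialchars($row['name']) . '</li>';
}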
If you are not sure whether the data may have changed, run a new query! It may take a little more time, but you can be sure the data is still up to date.
What if the user restarts page1.php in another browser while still processing page2.php?
So reuse the data only if you are 100% sure nothing can be manipulated, or if loading it fresh takes too much time or too many resources.
Never trust data which may have been changed by the user, and ALWAYS make sure it is still what you expect it to be. Maybe you should consider some kind of validation of the data.
I have a table called playlist, and I display its contents using a display_playlist.php file.
[screenshot of display_playlist.php]
Every time the user clicks the 'up' or 'down' button to rearrange the song order, I just update the table. But I feel that updating the DB this often is not recommended, so is there a more efficient way to accomplish this task?
I am still a newbie to AJAX, so if AJAX is the only way to do it, can you please explain it in detail? Thank you in advance.
In relative terms, yes, hitting the database is an expensive operation. However, if the playlist state is meant to be persistent then you have to hit the database at some point, it's just a question of when/how often.
One simple optimization you might try is instead of sending each change the user makes to the server right away, allow them to make however many changes they want (using some client-side javascript to keep the UI in the correct state) and provide a "Save Playlist" button that they can press to submit all of their changes to the server at once. That will reduce database hits, and also the number of round-trips made to the server (in terms of what a user experiences, a round-trip to the server is far more expensive than a database hit).
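A rough sketch of the server side of that idea, assuming the client posts the full ordering as an array of song ids (the names here are hypothetical):

<?php
// save_playlist.php -- receives the whole new ordering in one request
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

$ids = $_POST['song_ids']; // e.g. song_ids[]=7&song_ids[]=3&song_ids[]=9

$pdo->beginTransaction();
$stmt = $pdo->prepare('UPDATE playlist SET position = ? WHERE id = ?');
foreach ($ids as $position => $id) {
    $stmt->execute(array($position, (int) $id)); // one row per song
}
$pdo->commit();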
More broadly though, you shouldn't get hung up over hypothetical performance concerns. Is your application too slow to handle its current load (and if so, have you done any profiling to verify that it is indeed this database query that is causing the issue)? If not, then you don't need to worry too much about changing it just yet.
You can have a save button, so instead of updating on each move there will be only one update, where every row is updated at once. This also lets you have a cancel button so people can restore the order to the way it was.
You can do it so users can change locally all they wish; defer writing the final result to the database until they choose to move on from the page.
If you really want to avoid updating the database, you can try one of the JavaScript-based MP3 players, which let you pass in the paths to the *.mp3 files.
Then I suggest you use jQuery UI - Sortable, and use it to update the song list that you pass to the player.
I have created an office scheduling program that uses jQuery to post to a PHP file, which then inserts an appointment into a pgSQL database. It hasn't happened yet, but I can foresee this problem in the future: two office workers try to schedule an appointment in the same slot at the same time, creating a race condition, and one set of customer data would be lost, or at least I'd have to dig it out of a log. I was wondering whether there is a flag I could set in the database, whether I need to create some kind of gatekeeper program to control server connections, or whether there is some kind of mutex/lock/semaphore I can use with JavaScript/PHP/SQL to keep this race condition from occurring.
You can lock it with a database flag, but a better strategy is to detect collisions, since they only happen in rare cases.
To detect the problem, you can save a timestamp from the database containing the last updated time. Send this along with the form, and compare the timestamp before you update the record. If the timestamp has changed, then present the user with all the data and ask them what they want to do. This offers a way for the second saving user to modify their changes based on the previously saved data if they wish.
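A minimal sketch of that check in PHP, with invented table and column names; the form carries the updated_at value it originally read in a hidden field:

<?php
// save_appointment.php -- only write if nobody saved since we loaded the row
$pdo = new PDO('pgsql:host=localhost;dbname=scheduler', 'user', 'pass');

$stmt = $pdo->prepare(
    'UPDATE appointments
        SET customer = ?, updated_at = NOW()
      WHERE id = ? AND updated_at = ?'
);
$stmt->execute(array($_POST['customer'], $_POST['id'], $_POST['loaded_at']));

if ($stmt->rowCount() === 0) {
    // Someone else saved first: re-read the record and show the user both versions
}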
There are other ways to solve this problem, and the proper solution depends on the nature of the specific problem.
I am having a few issues when several people access a MySQL database and try to update tables with the same information.
I have a webpage written in PHP. In this webpage is a query to check whether certain data has been entered into the database. If it hasn't, I proceed to insert it. The trouble is that if two people try at the same time, the check might say the data has not been entered yet, but by the time the insert takes place it has been, by the other person.
What is the best way to handle this scenario? Can I lock the database so that my queries are processed first and then another's?
Read up on database transactions. That's probably a better way to handle what you need than running LOCK TABLES.
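For a check-then-insert race specifically, one alternative to manual locks is to let a unique index make the check atomic; a sketch, assuming a UNIQUE index exists on the column in question:

<?php
// With a UNIQUE index on items.name, the check and the insert collapse
// into one atomic statement, so the race disappears.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

$stmt = $pdo->prepare('INSERT IGNORE INTO items (name, label) VALUES (?, ?)');
$stmt->execute(array('foo', 'bar'));

if ($stmt->rowCount() === 0) {
    // The row already existed -- someone else got there first
}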
Manually locking tables is the worst thing you could ever do. What happens if the code to unlock them never runs (because the PHP fails, or the user clicks on to the next step, walks away from the PC, etc.)?
A common mistake devs make in a web app is to have a datagrid full of text boxes of data to edit, with a save button per row or for the whole table. Obviously, if a person opens this on Friday and comes back on Monday, the data could be stale and they could be saving over newer data. One easy way to fix this is to instead have EDIT buttons on each row; clicking the button loads an editing form, so they are hopefully loading fresh data and can only submit one row change at a time.
But even more importantly, you should include a datetime field as a hidden input box; when they try to submit the data, look at the date, decide how old the data is and how old is too old, and warn the user or deny the action accordingly.
You're looking for LOCK.
http://dev.mysql.com/doc/refman/5.0/en/lock-tables.html
This can be run as a simple mysql_query (or MySQLi::query/prepare).
I'd say it's better to lock the specific tables that need to be locked rather than the whole database, as otherwise nothing can be read at all (there is no LOCK TABLES * wildcard; you name each table along with a lock type). I think you're looking for something like (note that START TRANSACTION would implicitly release the table lock, so it's left out):
LOCK TABLES items WRITE;
INSERT INTO items (name, label) VALUES ('foo', 'bar');
UNLOCK TABLES;
Or in PHP:
mysql_query('LOCK TABLES items WRITE');
mysql_query("INSERT INTO items (name, label) VALUES ('foo', 'bar')");
mysql_query('UNLOCK TABLES');
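The mysql_* functions are deprecated (and removed in PHP 7), so on newer PHP the rough equivalent with mysqli would be:

<?php
$db = new mysqli('localhost', 'user', 'pass', 'mydb');
$db->query('LOCK TABLES items WRITE');
$db->query("INSERT INTO items (name, label) VALUES ('foo', 'bar')");
$db->query('UNLOCK TABLES');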
You could check whether the data has been changed before you save your edit. That way, if someone edited the data while the other person was working on it, they will be informed about it.
Kind of like how Stack Overflow handles commenting.