Rollback Database Update After Refreshing a Page (PHP)

I have a PHP application where the user can make changes to an Oracle database through ADOdb.
After the request is executed, the page is refreshed and the user can see the result.
How would I add an undo option for this UPDATE after the page has been refreshed?
I've tried beginTrans(), but it seems the transaction is automatically rolled back once the PHP script finishes.

Database transactions are tied to a single connection. Connections are normally closed when the PHP script finishes, and trying to keep a connection alive for the same user across multiple requests would be very problematic.
As much as possible, it's best to treat HTTP requests as stateless. Meaning, changes should be committed to the database at the end of every request, and an "undo" over HTTP should not be implemented by rolling back a previous transaction, but by committing a new change that restores the previous state.
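One way to do that is a compensating update: record the old value when you commit the change, then let the undo action write it back in a new, committed transaction. Here is a minimal sketch using PDO (the question uses ADOdb, but the idea carries over); the `items` and `audit_log` tables are hypothetical:

```php
<?php
// Sketch: "undo" as a compensating update, committed like any other change.
// Assumes hypothetical `items` (id, name) and `audit_log` (log_id, item_id,
// old_name) tables on Oracle; adapt the SQL and driver to your setup.

function updateWithUndo(PDO $db, int $id, string $newName): void
{
    $db->beginTransaction();

    // Capture the old value so the undo action can restore it later.
    $stmt = $db->prepare('SELECT name FROM items WHERE id = :id FOR UPDATE');
    $stmt->execute([':id' => $id]);
    $oldName = $stmt->fetchColumn();

    $db->prepare('UPDATE items SET name = :name WHERE id = :id')
       ->execute([':name' => $newName, ':id' => $id]);

    $db->prepare('INSERT INTO audit_log (item_id, old_name) VALUES (:id, :old)')
       ->execute([':id' => $id, ':old' => $oldName]);

    $db->commit(); // the change is final; "undo" is a new change, not a rollback
}

function undoLastChange(PDO $db, int $id): void
{
    $db->beginTransaction();

    // Oracle 12c+ row-limiting syntax; use ROWNUM on older versions.
    $stmt = $db->prepare('SELECT log_id, old_name FROM audit_log
                           WHERE item_id = :id
                           ORDER BY log_id DESC
                           FETCH FIRST 1 ROW ONLY');
    $stmt->execute([':id' => $id]);

    if ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        $db->prepare('UPDATE items SET name = :old WHERE id = :id')
           ->execute([':old' => $row['old_name'], ':id' => $id]);
        $db->prepare('DELETE FROM audit_log WHERE log_id = :log')
           ->execute([':log' => $row['log_id']]);
    }

    $db->commit();
}
```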

Related

Double requests causing double inserts

We have a SPA with heavy traffic, and occasionally duplicate rows get inserted in several parts of the application.
For example, user registration. Normally the validation mechanism does the trick by checking whether the email address already exists; however, I was able to reproduce the problem by dispatching the same request twice using axios, resulting in a duplicated user in the database.
I initially thought that the second request should throw a validation error, but apparently it arrives too quickly and checks for the user before the first request has stored it.
So I put a 500ms delay between those requests, and yes, it worked: the second request threw a validation error.
My question is: what are the techniques to prevent double inserts IF TWO REQUESTS ARE ALREADY DISPATCHED IN THE SAME FRACTION OF A SECOND?
Of course we have blocked the submit button after the first request (we have since the beginning), yet people somehow manage to dispatch requests twice.
One option I've utilized in the past is database locking. I'm a bit rusty on this, but in your case:
Request a WRITE LOCK on the table.
Run a SELECT on the table to find the user.
Run the INSERT on the table.
Release the WRITE LOCK.
This post on DB locking should give you a better idea of which locks have what effect. Note: some database systems may implement locks differently.
Edit: I should also note that there will be an additional performance cost to using database locks.
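A rough sketch of that sequence, assuming MySQL and PDO (the table and column names are illustrative):

```php
<?php
// Sketch of the lock -> check -> insert sequence described above.
// Assumes MySQL; LOCK TABLES blocks other writers until we release it.
function insertUserOnce(PDO $db, string $email, string $name): void
{
    $db->exec('LOCK TABLES users WRITE');
    try {
        // With the write lock held, no other request can insert between
        // this SELECT and our INSERT.
        $stmt = $db->prepare('SELECT id FROM users WHERE email = :email');
        $stmt->execute([':email' => $email]);

        if ($stmt->fetchColumn() === false) {
            $db->prepare('INSERT INTO users (email, name) VALUES (:email, :name)')
               ->execute([':email' => $email, ':name' => $name]);
        }
    } finally {
        $db->exec('UNLOCK TABLES'); // always release, even on error
    }
}
```

A UNIQUE index on `email` gives the same guarantee at the storage level and is worth having as a backstop either way.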

PHP: Is there a relationship between PDO transactions and Sessions?

I am currently working on a project that uses Yii and stumbled across something that made me scratch my head. I started a DB transaction using Yii (which just calls PDO::beginTransaction), did some database work, and at the end stored a flash message for the user and did a redirect. I forgot, though, to commit the transaction, so nothing got stored in my database; but what caught my attention is that my flash message also did not appear. Doing a commit or rollback makes the flash message appear just fine.
Basically, I noticed that I could not store any session-related data and have it stick after a redirect if I started a transaction and didn't commit/rollback. I normally don't leave transactions hanging, so I had never noticed this behavior before.
So is there a relationship between the two that would prevent sessions from working properly?
Session data is written to the database at the end of the request. If you make an explicit rollback, it still gets written to the DB outside of the transaction. If you don't, the rollback happens implicitly AFTER the session-saving queries are run, so they are rolled back along with everything else.
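So the fix is simply to resolve the transaction explicitly before the request ends. A sketch in Yii 1.x style (as in the question, beginTransaction() here wraps PDO::beginTransaction):

```php
<?php
// Sketch: always commit or roll back before the redirect, so the
// (database-backed) session write is not swept into the implicit rollback.
$transaction = Yii::app()->db->beginTransaction();
try {
    // ... database work ...
    $transaction->commit();
} catch (Exception $e) {
    $transaction->rollback(); // session data is then written outside the txn
    throw $e;
}

Yii::app()->user->setFlash('success', 'Saved.');
$this->redirect(array('site/index')); // the flash message now survives the redirect
```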

Real-Time Event Listener for PHP and MySQL

I am thinking of implementing a real-time event listener for a personal project. Is there a way that, say, when an INSERT, UPDATE, or DELETE SQL query has been issued, MySQL will trigger a PHP file which will in turn process it, like refreshing a page automatically when a new record is found, or when a record has been edited or deleted?
I have been reading through MySQL triggers but I do not know how to implement this. Thanks!
If what you want is to refresh the page in an appropriately lazy manner, I suggest you look less at triggering out of MySQL and more at triggering it with AJAX. Waygood has a good link for the latter, but consider the former for simply updating data.
You can update the information on your site by way of long polling. That way you keep a persistent connection open back to your server and can receive updates whenever the server pushes one through. When done, simply start the connection again and wait for another update. The most commonly used LP technique is probably an AJAX-based one. Alternatively, for things like cross-domain support, you could go a bit more exotic with script-tag long polling.
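A minimal long-polling endpoint in PHP might look like this; the `messages` table and its columns are made up for illustration, and the client is expected to reconnect as soon as a response (or timeout) comes back:

```php
<?php
// Sketch of a long-polling endpoint: hold the request open until there is
// something newer than the id the client last saw, or until a deadline.
$pdo   = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$since = isset($_GET['since']) ? (int)$_GET['since'] : 0;

$deadline = time() + 25; // respond before typical proxy/browser timeouts

do {
    $stmt = $pdo->prepare('SELECT id, body FROM messages WHERE id > :since ORDER BY id');
    $stmt->execute([':since' => $since]);
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

    if ($rows) {
        header('Content-Type: application/json');
        echo json_encode($rows);
        exit; // the client reconnects immediately with the new highest id
    }

    sleep(1); // nothing new yet; check again shortly
} while (time() < $deadline);

header('Content-Type: application/json');
echo json_encode([]); // timed out empty; the client simply reconnects
```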

Dealing with long server-side operations using ajax?

I have a particularly long operation that is going to run when a user presses a button on an interface, and I'm wondering what would be the best way to indicate this back to the client.
The operation is populating a fact table for a number of years' worth of data, which will take roughly 20 minutes, so I'm not intending the interface to be synchronous. Even though it is generating large quantities of data server side, I'd still like everything to remain responsive, since the data for the month the user is currently viewing will be updated fairly quickly, which isn't a problem.
I thought about setting a session variable after the operation has completed and polling for that session variable. Is this a feasible way to do such a thing? However, I'm particularly concerned about the user navigating away or closing their browser, at which point all status about the long-running job is lost.
Would it be better to insert a record somewhere, logging the job when it starts and when it finishes? I could then create some other sort of interface so the user (or users) can monitor the jobs that are currently executing/finished/failed.
Does anyone have any resources I could look at?
How would you do it?
The server side portion of code should spawn or communicate with a process that lives outside the web server. Using web page code to run tasks that should be handled by a daemon is just sloppy work.
You can't expect them to hang around for 20 minutes. Even the most cooperative users in the world are bound to go off and do something else, forget, and close the window. Allowing such long connection times screws up any chance of a sensible HTTP timeout and leaves you open to trivial DoS too.
As Spencer suggests, use the first request to start a process which is independent of the http request, pass an id back in the AJAX response, store the id in the session or in a DB against that user, or whatever you want. The user can then do whatever they want and it won't interrupt the task. The id can be used to poll for status. If you save it to a DB, the user can log off, clear their cookies, and when they log back in you will still be able to retrieve the status of the task.
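A sketch of that first request, assuming a Linux host, a hypothetical `jobs` table, and a CLI worker script (`run_fact_load.php`) that does the actual 20-minute load and updates the job row as it goes:

```php
<?php
// Sketch: start the long task as a detached process and hand the client
// an id it can poll with. All names here are illustrative.
session_start();
$userId = (int)$_SESSION['user_id']; // whoever kicked the job off

$pdo   = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$jobId = bin2hex(random_bytes(8));

$pdo->prepare("INSERT INTO jobs (job_id, user_id, status) VALUES (:id, :uid, 'running')")
    ->execute([':id' => $jobId, ':uid' => $userId]);

// Redirecting output and backgrounding with & lets exec() return at once,
// so the worker outlives this HTTP request.
exec(sprintf('php run_fact_load.php %s > /dev/null 2>&1 &', escapeshellarg($jobId)));

header('Content-Type: application/json');
echo json_encode(['job_id' => $jobId]); // the AJAX caller stores and polls this
```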
Sessions are not that reliable; I would probably design some sort of task list, so I can keep a record of tasks per user. With this design I would also be able to show "done" tasks, to keep the user aware.
I would also move the long operation out of the web server's worker process. This is required because web servers can be restarted.
And, yes, I would request the status from the server every few dozen seconds with AJAX calls.
You can have a JS timer that periodically pings your server to see if any jobs are done. If the user goes away and comes back, you restart the timer. When a job is done, you indicate that to the user so they can click a link and open the report (I would not recommend forcefully loading something, though it can be done).
From my experience, the best way to do this is to save on the server side which reports are running for each user, along with their statuses. The client then polls this status periodically.
Basically, instead of checkStatusOf(int session), have the client ask the server for getRunningJobsFor(int userId), returning all running jobs and their statuses.
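A sketch of such a status endpoint, assuming the same hypothetical `jobs` table (job_id, user_id, status, progress) that the background worker keeps updated:

```php
<?php
// Sketch: return all of the current user's jobs and their statuses,
// i.e. the getRunningJobsFor(userId) idea from above.
session_start();
$userId = (int)$_SESSION['user_id'];
session_write_close(); // don't hold the session lock while this is polled

$pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('SELECT job_id, status, progress FROM jobs WHERE user_id = :uid');
$stmt->execute([':uid' => $userId]);

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));
```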

AJAX (Prototype/PHP): running 2 AJAX processes hangs until the first one is finished

This question is a follow-up to my previous one: Previous Questions.
So I set up my page to initiate an AJAX call to start processing some records, and after each record it updates a row in another table to keep track of the status of this process. After that first AJAX call is made, I have another one start up: an Ajax.PeriodicalUpdater, set to hit a file which simply queries that row in the DB and returns the status of the original process.
This works perfectly fine... as long as the file that provides the status updates is outside my current app. If I put the file inside my app, it doesn't work right. Watching Firebug, the PeriodicalUpdater call doesn't get anything back until the original AJAX call finishes; it just hangs there, as if the file is hung and not returning anything.
This whole app runs inside just a basic framework we are using. Nothing crazy; it just handles routing, basic template aspects, etc. So all of these functions/files are inside this app, and all these AJAX calls are routed through it.
What could be causing something like this? Could this be due to the limit of concurrent connections a browser supports to a particular domain?
This is caused by PHP session serialization. The session data is locked until the PHP process for each request has finished writing to it, so further requests in the same session will queue until the lock is released.
If your AJAX requests need access to session state, read out the information you need and then use session_write_close() as early in your code as possible to release those locks.
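For example, a request that only needs to know who the user is can release the lock almost immediately:

```php
<?php
// Sketch: read what you need from the session, then release the lock so
// parallel AJAX requests in the same session are not serialized behind us.
session_start();
$userId = isset($_SESSION['user_id']) ? (int)$_SESSION['user_id'] : null;
session_write_close(); // other requests in this session can proceed now

// ... long-running work continues without holding the session lock ...
```

Note that after session_write_close() you can still read the values you copied out, but further writes to $_SESSION will no longer be persisted.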
