I'm in a weird situation. I just found out that nearly 50% of my users don't have JavaScript enabled. Most of my website is built around a lengthy search function. Once users click submit, the function executes, and we send them an email when it's finished; this can sometimes take up to 10 minutes.
My issue is that I don't know how to tell users without JavaScript that the button was clicked successfully and the long function has started. Any ideas?
Once the form is submitted, spawn a PHP process that does the search, and return immediately with a page saying the job was submitted.
The PHP manual talks about this; a comment on the exec() page states that you should launch the process with nohup so it doesn't die when the caller returns.
php.net exec function
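A minimal sketch of that approach, assuming a hypothetical search_worker.php CLI script and a 'query' form field:

<?php
// Form handler: start the long-running search, then respond immediately.
// search_worker.php (hypothetical) does the real work and emails the
// user when it finishes.
if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    $query = escapeshellarg($_POST['query'] ?? '');
    // nohup detaches the worker from this request; redirecting output
    // and backgrounding with & lets exec() return at once.
    exec("nohup php search_worker.php $query > /dev/null 2>&1 &");
    echo 'Your search has been submitted. We will email you when it is done.';
}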
As an idea: if you can detect that some users don't have JavaScript enabled, maybe for those users you could serve a page without JavaScript, plain HTML with a simple 'type="submit"' button?
You can check it on the server side: when the form is sent to the server, just check whether the submit button's value is set.
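For example (a sketch; 'submit' is whatever name you give the button):

<form method="post" action="search.php">
    <input type="text" name="query">
    <input type="submit" name="submit" value="Search">
</form>

<?php
// search.php: the button's name/value pair arrives with the POST,
// so no JavaScript is needed to confirm the click on the server.
if (isset($_POST['submit'])) {
    // kick off the long search here, then confirm immediately
    echo 'Your search was received and is now running.';
}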
I have a form that processes a payment, but approximately one in every 300 payments comes through twice; my logs show two requests for each of these occurrences.
I implemented some JavaScript that disables the submit button after it's clicked, and it seems to work fine for me, but I'm still getting double submissions every now and then.
Does anyone know anything else that could be causing the form to be submitted twice?
As Dagon says, server-side checks are your friend here. Give each instance of the form a randomly generated key (guid would be nice), and store that in the database. Don't accept forms that contain a key that already exists in the db.
Also, are you simply displaying HTML after your form processing logic executes? If so, try redirecting after you process the form.
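A sketch of both ideas together, using the session instead of a database for brevity (process_payment() and the redirect target are placeholders):

<?php
session_start();

if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    // Reject any POST whose one-time token is missing or already used.
    if (!isset($_POST['token'], $_SESSION['form_token'])
            || !hash_equals($_SESSION['form_token'], $_POST['token'])) {
        exit('Duplicate or invalid submission.');
    }
    unset($_SESSION['form_token']);   // consume the token
    process_payment($_POST);          // hypothetical payment handler
    header('Location: /thanks.php');  // redirect instead of printing HTML
    exit;
}

// Issue a fresh token each time the form is rendered.
$_SESSION['form_token'] = bin2hex(random_bytes(16));
?>
<form method="post" action="">
    <input type="hidden" name="token" value="<?= $_SESSION['form_token'] ?>">
    <!-- payment fields ... -->
    <input type="submit" value="Pay">
</form>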
One way: after the user presses the form's submit button, disable it with JavaScript. That should prevent the user from accidentally double-submitting the form. It has worked on every site where we've implemented it. Sometimes a user gets impatient and clicks the submit button several times; that's why you get double requests. What you need to do is make sure it works in all the browsers (the disabling of the button, I mean).
You could also do it on the server side; it's just harder.
We use something similar to the upvoted answer in "Enable/disable submit button with jQuery", combined with ColdFusion server-side code.
I have a page that allows for user input. When the user enters his/her specifications and selects submit, the algorithm runs; this can take a few minutes depending on the input. The user is then directed to a page which will show their results. I want to show a block of PHP code that lets the user enter their email, so an email containing the URL for the results is sent to them automatically when the results are ready. However, once the results are ready, I want that block of code (acting like a loading page) to 'disappear' and the results visualisation code to run, i.e. show the results. Is this possible? I'm fairly new to programming, so I'm not sure how to go about this. Thanks for the help.
The thing about a PHP script is that it will keep running until:
1) The server kills the script.
2) The script finishes or kills itself.
So whatever you are doing will keep running in the PHP script until it is finished, and at the end you can place the code that emails the user. To have the results placed back on the page when they are ready, you could use Ajax. See the following tutorial for Ajax with jQuery: http://www.devirtuoso.com/2009/07/beginners-guide-to-using-ajax-with-jquery/
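For the 'disappear when ready' part, the results page can poll a small status script via Ajax. A sketch, assuming the background job writes its output to a per-job file:

<?php
// status.php (hypothetical): the loading page polls this and swaps in
// the results visualisation once ready is true.
header('Content-Type: application/json');

$jobId = basename($_GET['job'] ?? '');          // sanitise the job id
$resultFile = "/var/results/{$jobId}.json";     // assumed output location

echo json_encode(is_file($resultFile)
    ? array('ready' => true, 'url' => "/results.php?job={$jobId}")
    : array('ready' => false));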
I have a page where users enter their email address, click "Send", and the webserver will email them a ~10mb attachment. Right now, I have the page just display "Sending..." and the user waits about 20 seconds on this page.
Instead, I want to have it say "Your attachment will be emailed in a few minutes," sending that process somewhere else and allowing the user to continue browsing without having to open up a new tab.
I really just need someone to point me in the right direction because I don't even know what to Google at this point.
You could call another PHP file that processes the email sending, and make sure to put this call in it:
ignore_user_abort(true);
What this does is allow the PHP process to finish even though the browser has left. So you could initiate the process via Ajax and then go to another page saying the attachment has been sent.
For more info:
http://www.php.net/manual/en/function.ignore-user-abort.php
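A sketch of what that background sender might look like (send_report_email() is a placeholder for your actual mailing code):

<?php
// send_attachment.php: called via Ajax from the page the user is on.
// ignore_user_abort() keeps the script running after the browser
// disconnects; set_time_limit(0) lifts the execution cap for the
// ~20-second send.
ignore_user_abort(true);
set_time_limit(0);

$to = filter_var($_POST['email'] ?? '', FILTER_VALIDATE_EMAIL);
if ($to !== false) {
    // hypothetical helper: builds the MIME message with the ~10 MB
    // attachment and mails it
    send_report_email($to);
}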
I recommend checking out this question I posted a while ago.
How can I run a PHP script in the background after a form is submitted?
The answer explains how you can run a PHP script in the background after a form is submitted. This, of course, is just one way to accomplish it. Another would be to build up a list of addresses and set up a cron job to run a script at a timed interval. Both can accomplish the same thing; it just depends on how you want to tackle the issue and what you can do on your server.
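A sketch of the cron variant, assuming a mail_queue table with id, address, and sent_at columns:

<?php
// queue_worker.php, run from crontab, e.g.:
//   * * * * * php /path/to/queue_worker.php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Grab a batch of addresses that haven't been sent yet.
$rows = $pdo->query(
    'SELECT id, address FROM mail_queue WHERE sent_at IS NULL LIMIT 20'
)->fetchAll(PDO::FETCH_ASSOC);

$mark = $pdo->prepare('UPDATE mail_queue SET sent_at = NOW() WHERE id = ?');
foreach ($rows as $row) {
    send_report_email($row['address']);   // hypothetical sender
    $mark->execute(array($row['id']));
}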
I have been using Uploadify in my PHP application for the last couple months, and I've been trying to track down an elusive bug. I receive emails when fatal errors occur, and they provide me a good amount of details. I've received dozens of them. I have not, however, been able to reproduce the problem myself. Some users (like myself) experience no problem, while others do.
Before I give details of the problem, here is the flow.
User visits edit screen for a page in the CMS I am using.
Record id for the page is put into a form as a hidden value.
User clicks the Uploadify browse button and selects a file (only single file selection is allowed).
User clicks Submit button for my form.
jQuery intercepts the form submit action, triggers Uploadify to start uploading, and returns false for the submit action (manually cancelling the form submit event so that Uploadify can take over).
Uploadify uploads to a custom process script.
Uploadify finishes uploading and triggers the Javascript completion callback.
The Javascript callback calls $('#myForm').submit() to submit the form.
Now that's what SHOULD happen. I've received reports of the upload freezing at 100% and also others where "I/O Error" is displayed.
What's happening is, the form is submitted by the completion callback, but some POST parameters present in the form are simply missing from the post data. The id for the page, which as I said earlier is added to the form as a hidden field, is simply not there: there is no item for 'id' in the $_POST array. The strange thing is that the POST data DOES contain values for some fields. For instance, I have an input of type text called "name", which is for the record name, and it does show up in the post data.
Here is what I've gathered:
This has been happening on Mac OSX 10.5 and 10.6, Windows XP, and Windows 7. I can post exact user agent strings if that helps.
Users must use Flash 10.0.12 or later. We've made it so the form reverts to using a normal "file" field if they have < 10.0.12.
Does anyone have ANY ideas at all what the cause of this could be?
IOError: Client read error (Timeout?)
I got the same error a lot, although my server side is Python/Django. I assumed it was the client timing out, but looking back through the logs for you now, there seems to be a coincidence between this ceasing and a change I made in the authentication routines. Is it possible that the server is receiving the file but then refusing to write it to storage?
Also, are you aware that several Flash clients do not send cookies? You have to work around it by injecting the session keys into Uploadify's 'scriptData' variable.
Edit: this Python/Django code starts off the routine to which Uploadify submits itself:

# Adobe Flash doesn't always send the cookies, esp. from Apple Macs,
# so we've told Flash to send the session keys via POST. So recreate the
# session now. This also facilitates testing via curl.
cookie_name = settings.SESSION_COOKIE_NAME
if request.member.is_anonymous() and request.POST.has_key(cookie_name):
    # Convert the posted session key into a session and fetch the session
    request.COOKIES[cookie_name] = request.POST[cookie_name]
    SessionMiddleware().process_request(request)

# boot anyone who is still anonymous
if request.member.is_anonymous():
    response['message'] = "Your session is invalid. Please login."
    return HttpResponse(simplejson.dumps(response), mimetype='application/json')
Uploadify might alter the form. Take a look at the HTML/DOM tree of the form at the time when Uploadify has finished and is calling your callback.
Have you tried using Live HTTP Headers in Firefox to see if there is some kind of rewrite happening that is causing the post data to be lost?
A while back, online apps used to say, "do not click submit more than once." That's gone now, right? How do you guard against that in, say, PHP?
One solution I'm using involves putting a variable in the Session, so you cannot submit to a page more than once every 10 seconds. That way the database work will have completed so the normal checks can take place. Obviously, this feels like a hack and probably is.
Edit: Thanks everybody for the Javascript solution. That's fine, but it is a bit of work. 1) It's an input type=image and 2) The submit has to keep firing until the Spry stuff says it's okay. This edit is just me complaining, basically, since I imagine that after looking at the Spry stuff I'll be able to figure it out.
Edit: Not that anyone will be integrating with the Spry stuff, but here's my final code, using Prototype's $ shortcut for document.getElementById. Comments welcome!
function onSubmitClick() {
    var allValid = true;
    var queue = Spry.Widget.Form.onSubmitWidgetQueue;
    for (var i = 0; i < queue.length; i++) {
        if (!queue[i].validate()) {
            allValid = false;
            break;
        }
    }
    if (allValid) {
        $("theSubmitButton").disabled = true;
        $("form").submit();
        $("form").submit(); // see the note below
    }
}
For some reason, the second form submit was necessary...
You should do both client- and server-side protections.
Client side - disable button e.g. by jquery as cletus has described.
Server side - put a token in the form. If there are two submissions with the same token, ignore the latter. Using this approach, you are protected against CSRF.
This is an excellent example of what jQuery is useful for (you can do it in plain JavaScript too). Add this code:

$("form").submit(function() {
    $(":submit", this).attr("disabled", "disabled");
});

And it disables the submit buttons once they have been clicked.
As others have noted, you can disable the button. I like server-side checks better, though - JS may be disabled, user might hit refresh (although if you're properly using POST that will generate a warning), etc.
You can add a timestamp to the form and track it in the session - require the POSTed timestamp be greater than the tracked one. That will prevent most double-posts without noticeably affecting the UI.
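A sketch of that timestamp check, session-backed, with illustrative field names:

<?php
session_start();

if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    // Only accept a POST whose timestamp is newer than the last one
    // processed in this session; a double-post carries the same stamp
    // and is ignored.
    $stamp = (int) ($_POST['stamp'] ?? 0);
    if ($stamp <= ($_SESSION['last_stamp'] ?? 0)) {
        exit('Duplicate submission ignored.');
    }
    $_SESSION['last_stamp'] = $stamp;
    // ... process the form ...
}
?>
<form method="post">
    <input type="hidden" name="stamp" value="<?= time() ?>">
    <!-- fields ... -->
</form>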
It's also important to note that PHP's default behaviour if it detects the user has "cancelled" the request (by closing the browser, pressing "stop", or perhaps pressing Submit a second time) is to stop executing the script. This is undesirable if you're doing some sort of lengthy transaction. More details here.
I would think the best option is to let the PHP script first set a flag in the session array to indicate that it is processing a form. That way a second request can be set to wait until the original request has completed (use a sleep call server side waiting for the flag to clear).
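A sketch of that flag-and-sleep idea (process_form() is a placeholder). One PHP-specific wrinkle: the session file is locked per request, so you must session_write_close() before sleeping or the second request blocks on the lock itself:

<?php
session_start();

// If another request is already processing this form, wait (up to 30s)
// for its flag to clear before proceeding.
$waited = 0;
while (!empty($_SESSION['processing']) && $waited < 30) {
    session_write_close();   // release the session lock while waiting
    sleep(1);
    $waited++;
    session_start();         // re-read the session to check the flag
}

$_SESSION['processing'] = true;
session_write_close();

process_form($_POST);        // hypothetical long-running handler

session_start();
unset($_SESSION['processing']);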
It is important not to interfere too much with the submit button and process, because the nature of the web is one of uncertainty; the user may want to click a second time if no answer has arrived, and you don't want to lose their data. That could occur if the first request is lost and you have disabled their only way to submit (and therefore store) the data.
One cleaner solution is to submit the request, and use javascript to display a message saying 'Processing', so that the user can see that something is happening, but they are not prevented from re-submitting the data.
One solution is to immediately disable the button on click using JavaScript. This obviously relies on JavaScript being enabled in the client's browser.
The server side trick is better since it will catch race conditions between multiple browser windows if the user is editing the same record twice.
I've done a simple version of this with javascript when I was working with ASP.NET AJAX but it should work in any case where your button has an actual ID.
I take the following steps in the button onclick event:
Disable the button that triggered the onclick event
Store the button id in a magic variable closureId that I can reference later via closure
Use the setTimeout function to execute a dynamically defined callback after the specified number of milliseconds (5000ms = 5 seconds)
In the callback function I can reference the magic closureId and re-enable the button after the timeout
Below is a simple HTML button you can throw into a test.html file by itself to play with:
<input id="btnTest" type="button" value="test" onclick="var closureId=this.id; this.disabled = true; setTimeout(function(){document.getElementById(closureId).disabled = false;}, 5000);>
You should still be doing server-side checks as others have suggested but in my own instance the submit also did a server side validation of business logic. The user would submit their form and then have to wait to see if the business logic on the server-side would actually allow that entry or not. It also helped out during development when I could take my server down and still have the form submission fail decently.
Note that I call the closureId variable 'magic' because I didn't have a firm grasp of how it works at first.
I just realized that calling this.id inside the callback doesn't work because, by the time the timeout fires, this no longer refers to the button (in a plain setTimeout callback it is the global window object), which has no DOM id.
I couldn't find any other way to get a reference to the original element from inside the callback (this.this.id doesn't work), but lexical scoping still lets me access the closureId variable as it was defined at the time of the original button click; that's the closure at work.
Feel free to fix/comment if you know of a better way!
I usually disable the submit button after it's pressed. Session throttling is good against direct attacks, but I do not want to mix interface logic where it doesn't belong.