When working in PHP, to avoid duplicate form submissions, I used to generate a unique id of some sort, store it into a session variable, and have the id in the form, so on submission I can compare the values, regenerating the session value at that point. I never considered it a great solution, but I was never able to think of/find a better solution.
Now I'm building an Angular front end with a PHP backend (Lumen), and I'm struggling to think of a solution that doesn't involve writing to a database. Unless I'm misunderstanding something, I can't share sessions between Angular and PHP, right? So that solution won't work. The only other thing I can think of is to have a key/value pair in a DB, but I never quite understood how that prevents duplicates on something like an accidental double click, where the session/database may not update its key before the second submission starts processing. And as I learn more about stateless systems, it feels like a session isn't the best place for this sort of thing anyway?
Overall, I'm having trouble with creating a secure, backend system to avoid duplicate forms. With angular, I can always prevent duplicate submissions through preventing the button from being clicked, the API call from firing, etc, but I'd like to add backend protection too, and I'd love to hear how the experts do it.
In most cases this should be handled on the front end. The backend protection depends on your definition of a unique request: you can create a composite key from the attributes that together make the request unique, e.g. email address, request ID, or any set of request parameters.
The DB will not accept duplicate keys, so you can catch the exception and handle it gracefully on the front end.
E.g. for a newsletter subscription application, the uniqueness of a request is determined by the email address and the newsletter type.
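A sketch of that pattern with PDO, using the newsletter example. It assumes a UNIQUE (email, newsletter_type) index on the table and a connection configured with PDO::ERRMODE_EXCEPTION; the table and function names are illustrative:

```php
<?php
// Sketch: rely on a UNIQUE index to reject duplicate submissions.
// Assumes a table like:
//   CREATE TABLE subscriptions (
//     email TEXT NOT NULL,
//     newsletter_type TEXT NOT NULL,
//     UNIQUE (email, newsletter_type)
//   );
function subscribe(PDO $db, string $email, string $type): bool
{
    $stmt = $db->prepare(
        'INSERT INTO subscriptions (email, newsletter_type) VALUES (?, ?)'
    );
    try {
        $stmt->execute([$email, $type]);
        return true;                // first submission: row inserted
    } catch (PDOException $e) {
        // SQLSTATE 23000 = integrity constraint violation (duplicate key)
        if ((string) $e->getCode() === '23000') {
            return false;           // duplicate: report gracefully, e.g. HTTP 409
        }
        throw $e;                   // anything else is a real error
    }
}
```

A duplicate submit simply gets a "false" here, which the API can turn into a friendly "already subscribed" response for the front end.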
I'm sure there are ways to hack it, but I think the short answer is that you don't want to hack it. When you transition to a split back/front end, one thing you are doing is specifically making your API calls stateless. This is a good thing! The statelessness, lack of sessions, etc, can dramatically simplify your back-end application. In short, statelessness is half the reason why you do something like this.
Preventing double submits as you are used to doing is decidedly something that you need a stateful application to do. As a result, it is now the job of the front-end application exclusively.
Your best bet is to think about your application in a whole new way. Your PHP backend handles stateless REST requests, and as such it is not PHP's problem if it gets duplicate submissions. In practice, Angular should have no problem making sure duplicates don't get submitted (it is really easy to prevent on the front end). Your PHP backend does need to make sure it always returns appropriate responses. For instance, on a registration page, back-to-back duplicate registration requests would result in a successful registration followed by a failed one with a message like "That email already exists". Otherwise, your PHP backend doesn't care: it can still do its job. It's the client's job to make sure the double submit doesn't happen, or to make sense of the conflicting answers if it does.
Statelessness is a desirable quality in API calls: you'll make your life much more difficult if you muck that up.
The logic is simple: on form submit, disable the submit button and show a loader; when you get a response from the first API call, enable the button again and hide the loader (you can also empty the form fields). Then, if the user submits the form again with the same details, you can compare against the previously saved data and warn the user that those fields have already been submitted.
If you are using Angular 2 or 4:
[disabled], *ngIf or [hidden], and Observables will help.
To reset the form: formRef.reset()
https://codecraft.tv/courses/angular/forms/submitting-and-resetting/
And in Angular 1:
ng-disabled, ng-if, and the HTTP promise will help.
And for resetting the form:
angularjs form reset error
While looking for a way to speed up form validation on a large number of fields, I built my own library of PHP validation functions to be reused across multiple websites. I am now trying to avoid duplicating these rules in JavaScript without sacrificing user-friendliness.
I am thinking of storing attempted inputs in a $_SESSION['attempted_inputs']
Upon failed server-side validation, user would be redirected back to the original form where an error message will be printed and all fields will be prefilled with attempted inputs, thus eliminating the need for JS validation.
Assuming attempted inputs are properly sanitized when saved, and server resources are not a concern on my clients' small-scale applications, what would be the downsides of using this method instead of a classic JS client-side approach?
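For concreteness, here is a rough sketch of the pattern I have in mind; validate_all() stands in for my own validation library, and the function names are only illustrative:

```php
<?php
// Sketch of the $_SESSION['attempted_inputs'] idea; validate_all()
// stands in for the reusable PHP validation library.
if (session_status() === PHP_SESSION_NONE) {
    session_start();
}

function handle_submission(array $post): void
{
    $errors = validate_all($post);   // returns [] when everything passes
    if ($errors !== []) {
        // Keep the raw attempt so the form can be prefilled after redirect.
        $_SESSION['attempted_inputs'] = $post;
        $_SESSION['errors'] = $errors;
        header('Location: /form.php');
        exit;
    }
    unset($_SESSION['attempted_inputs'], $_SESSION['errors']);
    // ... proceed with the valid data ...
}

// Prefill helper for the form template (always escape on output):
function old(string $field): string
{
    return htmlspecialchars($_SESSION['attempted_inputs'][$field] ?? '', ENT_QUOTES);
}
// Template usage: echo '<input name="email" value="' . old('email') . '">';
```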
Thanks
Downsides:
Unnecessary start/save/delete of the session. The GC process could do a lot of unnecessary work. You could use a POST back style instead of session saving. I know you said you don't care about performance, but this could bite you back someday.
Slower validation. Going back and forth with the form isn't a nice UX. If you can say what's wrong before posting it, you should do it.
If I can think of more downsides, I'll edit my answer.
While this is not the right place to ask for concepts, I just wanted to give you a quick heads-up.
You should always validate your inputs on the server side. Manually altering requests is easy these days with tools such as the developer console, and relying on client-side checks alone makes your application really vulnerable to many kinds of attacks.
I've built a MVC 'framework' for learning purposes, and I'm struggling with this problem:
I am working on a CRUD application and I don't know how I should delete records from my database. Right now I'm doing it through the URL:
example.com/controller/delete/id is how I delete a record from the database. I don't really like this approach, because anyone could unintentionally or intentionally delete database records.
So my question is: How should I implement this feature?
You've got a number of issues here:
First of all, you need to know who is performing the operation, then you need to decide if they're allowed to do it.
For the first, you need a login system which issues a session id to the client (usually via cookie). You then use the session id on the server to look up who the user is and check if they're allowed to do the delete. This is usually handled through granting roles to users and then allowing roles to perform certain actions
Incidentally, GET requests are used for requests that do not modify the server state and can be repeated with no side-effects. POST, (or PUT/DELETE) should be used for any action that makes changes. Browsers will not send a POST twice without prompting the user explicitly.
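A rough sketch of those checks combined; the "editor" role, the table name, and the status-code return are assumptions for illustration, not a fixed recipe:

```php
<?php
// Sketch: require POST, a logged-in session, and a role before deleting.
function handle_delete(PDO $db, array $session, string $method, int $id): int
{
    if ($method !== 'POST') {
        return 405;                       // GET must not modify server state
    }
    if (!isset($session['user_id'])) {
        return 401;                       // no session: not logged in
    }
    if (($session['role'] ?? '') !== 'editor') {
        return 403;                       // logged in, but role not allowed
    }
    $stmt = $db->prepare('DELETE FROM records WHERE id = ?');
    $stmt->execute([$id]);
    return 200;
}
```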
You need to send the data with POST.
You can also use GET with a CSRF token.
I think either way works, though POST is the more conventional choice for a state-changing request.
You need to include a security layer to your MVC in order to define who can access specific resources of your API.
The simplest way is to require a key parameter in the URL that must match a key predefined on the server side. But be aware that although this will prevent random users from updating your data, it might not be suitable depending on the security level you want to achieve for your application.
I have looked through many threads and/or questions on the internet looking for a clear answer, but so far haven't found one.
A part of the question was already answered (unless you tell me it is wrong): data validation should be done client side AND server side. Client side to notify the user of invalid data and to offload the server, and server side to prevent any kind of attack.
Validating on both sides can be a tedious task though and I wondered if there was some way to do it so that you don't have so much duplicated code.
There is also something else I was wondering. I have a table whose rows contain the id (from the database) of that row. At the end of each row I have a delete button that removes it from the HTML and from my JSON object, which holds an array of my values and is sent via an Ajax call to be deleted from the database (a link between 2 tables).
This isn't safe (well, in an unsafe environment like the internet) and I know it. I can always check client side that the id is only numbers, and if so then check server side that it exists. But who's to say the user didn't go into the debugger, swap 2 ids, and end up deleting rows that should not be deleted? What would be the best way to handle my ids and be safe from people swapping them?
Any suggestions appreciated
Data validation should be done client side AND server side.
It is not really needed client side, IMHO, but it is nice to be able to notify the user before submitting the data. So I would call it a pre-check.
Validating on both sides can be a tedious task though and I wondered if there was some way to do it so that you don't have so much duplicated code.
Because PHP is server-side and JavaScript is client-side, it isn't really easy to avoid duplicating code. There is one way that I know of, however: using XHR requests. This way you can reuse the PHP validation from JavaScript (client-side).
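A minimal sketch of that approach: a small endpoint exposes the server-side rules as JSON so an XHR call can reuse them before the real submit. The rule set here is a stand-in for your own validation library:

```php
<?php
// Sketch: one validation function, reused both by the normal form
// handler and by a validate.php endpoint that the client polls via XHR.
function validate(array $input): array
{
    $errors = [];
    if (!filter_var($input['email'] ?? '', FILTER_VALIDATE_EMAIL)) {
        $errors['email'] = 'Invalid email address';
    }
    if (strlen($input['name'] ?? '') < 2) {
        $errors['name'] = 'Name is too short';
    }
    return $errors;
}

// validate.php: the front end POSTs the fields and displays the
// returned messages without a full page round trip.
function respond(array $input): string
{
    return json_encode(['errors' => validate($input)]);
}
```

On the client, an XHR POST of the current field values to validate.php returns the same error messages the server would produce on final submission, so the rules live in one place.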
At the end of each row I have a delete button to delete it from the html
Please tell me you are using CSRF protection. :-)
I can always check client side to see if the id is only numbers and if so then check server side if it exists.
To prevent malicious SQL injection you should use prepared statements and parameterized queries.
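A sketch with PDO: the query is parameterized, and scoping the delete to the owning user also defeats the "swapped ids in the debugger" problem from the question. Table and column names here are assumptions:

```php
<?php
// Sketch: a parameterized delete that is also scoped to the owner,
// so a tampered id can never remove another user's row.
function delete_row(PDO $db, int $rowId, int $userId): bool
{
    $stmt = $db->prepare('DELETE FROM items WHERE id = :id AND user_id = :uid');
    $stmt->execute([':id' => $rowId, ':uid' => $userId]);
    return $stmt->rowCount() === 1;   // false: no such row, or not this user's
}
```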
Validation should always be done on server. Client-side validation is for user convenience only and is not strictly necessary from the security point of view (except when it opens a XSS security hole).
Apart from that, you should sanitize every input you receive from user. After the input has passed a sanity check (for example if there is text where there should be a number), you should do a permission test (Does the user have the appropriate permission to perform an action that is requested?)
After you have done this, you can actually execute the transaction.
Since other answers have already been posted, I would recommend that you create a dedicated utility class (call it Utility) where you store the many useful functions that check data type and content on the server side. That way, even if you end up doing thorough verification on both the client and the server, at least your code will be readable and manageable.
Another benefit of creating and managing your own utility classes is that you can easily reuse them across all your projects.
For example, a utility class can contain methods ranging from verifying that a certain field is within a specific range of values, to checking that it doesn't contain special characters (use regular expressions), and so on. Your more advanced utility methods would focus on preventing script insertion and SQL injection, as mentioned before.
All in all, you'll do the work once and benefit from it all the time; I'm sure you don't want to start checking whether an age field is negative (or whatever) on every project you do from scratch. Using Ajax and XHR can be a bridge between your server-side and client-side code and would help you unify your validation and avoid duplication.
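A sketch of what such a Utility class might contain; the method names and rules are illustrative, not a prescribed API:

```php
<?php
// Sketch of a reusable server-side validation utility class.
final class Utility
{
    // Inclusive numeric range check.
    public static function inRange(int $value, int $min, int $max): bool
    {
        return $value >= $min && $value <= $max;
    }

    // Letters, digits, spaces, and hyphens only (Unicode-aware).
    public static function hasNoSpecialChars(string $value): bool
    {
        return preg_match('/^[\p{L}\p{N} \-]*$/u', $value) === 1;
    }

    // Example of a project-agnostic rule built on the primitives above.
    public static function isValidAge(int $age): bool
    {
        return self::inRange($age, 0, 130);
    }
}
```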
Getting this out of the way; I don't frequently do much in php or database programming, so I'm just trying to be extra careful in this current project.
So basically, I have a site powered by a database, and I have some jQuery code that uses the "post" method to insert rows into that database. I had my doubts about security from the start, since people can't see my PHP but can see my JavaScript. As soon as I finished setting up the system I ran a test, and the jQuery inserts information even when called from a remote source... which can obviously lead to a major security problem. So, being as unrefined as I am, I just want to know how I should prevent this and secure the system better.
And... Please don't insult me or be rude... I understand that I may be stupid or whatever, but I'm just trying to learn!
This is what Cross Site Request Forgery (CSRF) and the same origin policy sandbox are designed to prevent.
From another point of view, how do you stop someone from writing a form on their site and submitting it as a POST request to your script to add junk records?
Typically CSRF protection is provided by your framework. If you want to implement it directly in PHP, it is quite simple.
Generate a unique token in your server's code, and add it to the session.
Embed it as a hidden field in the form.
On the script that accepts the form, check for the presence of this hidden field.
If the field exists, match it against what is in the session, if they match - then it is a legitimate request.
If they don't match (or the field is missing), then its a remote request and you can reject it.
This page provides some PHP code to implement the above.
Once you have done that, you need to make sure that the jquery script can be authenticated the same way. This snippet details steps to implement the same technique in jquery.
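A minimal sketch of the five steps above; the function names are mine, not a framework API:

```php
<?php
// Sketch: classic session-backed CSRF token protection.
if (session_status() === PHP_SESSION_NONE) {
    session_start();
}

// Steps 1-2: generate a token, store it in the session, and embed it
// as a hidden field in the form.
function csrf_field(): string
{
    if (empty($_SESSION['csrf_token'])) {
        $_SESSION['csrf_token'] = bin2hex(random_bytes(32));
    }
    return '<input type="hidden" name="csrf_token" value="'
        . $_SESSION['csrf_token'] . '">';
}

// Steps 3-5: the submitted field must exist and match the session copy
// (hash_equals avoids timing side channels on the comparison).
function csrf_check(array $post): bool
{
    return isset($post['csrf_token'], $_SESSION['csrf_token'])
        && hash_equals($_SESSION['csrf_token'], $post['csrf_token']);
}
```

For the jQuery side, include the same token in the AJAX POST data (or a request header) so the server can run csrf_check() identically for both kinds of submission.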
This is a technical design question more than a syntax question.
I have a large page with 9 forms corresponding to different tables. I've wrestled with the design and there's no way out: I have to send all the forms via Ajax to the server for processing. There are lots of interrelations, and I can't combine the forms into one large one either. So the question is: what's the best way to send a lot of forms via Ajax? To further complicate the issue, there are dynamic forms with fields that share the same names.
I'm trying a technique of:
1. serializing each form,
2. prepending the form name to each field name
3. combining the serialized version of the forms into one
4. posting that combined serialized form to the server as one
5. breaking them apart on the server side into separate arrays and then finally doing the application logic
I just don't know if there's a tried-and-true, easier solution, or whether I'm making a mountain out of a molehill.
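For reference, step 5 (splitting the combined payload back apart on the server) could be sketched like this, assuming every field name carries a formname_ prefix as in steps 2-3:

```php
<?php
// Sketch: split "formname_field" keys back into per-form arrays.
// Assumes every key contains at least one underscore separating the
// form name from the field name.
function split_forms(array $post): array
{
    $forms = [];
    foreach ($post as $key => $value) {
        [$form, $field] = explode('_', $key, 2);
        $forms[$form][$field] = $value;
    }
    return $forms;
}
```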
If there's genuinely no way to redesign the page, then your solution seems simple and straightforward to me, not at all "mountain"-ish. That said, your description of the page screams "redesign", though of course I don't have enough information to work with.
One such redesign would be to send field changes to the server as they happen, rather than waiting and submitting the entire thing. The server side can hold them as "pending" if you need the user to explicitly commit the whole thing when they're done. But that depends on expense of server resources, etc.
You should be able to send 9 separate AJAX requests without hassle (assuming that a: each doesn't rely on the response of another, and b: this isn't something which happens all the time).
Using your JavaScript library (you are using one, right??), just loop through your forms and AJAX-submit each one. The browser will probably only process about 2 at a time, but if that's not a problem for your design, then all should be sweet.
It would certainly keep the PHP/Server-Side part of the equation much much simpler.
If you were working with a high-traffic site, then you'd probably want to reduce the number of requests being made, but chances are your current setup will work sufficiently well.
I'd prepare a javascript dispatcher which would smartly do the job of posting the data. So when the submit button is pressed it would collect all the data needed, and then send the data to the appropriate controllers on the server side.
It could block the form, in the meanwhile, or display a "Processing..." popup.