How to handle concurrency control in PHP & MySQL - php

I am using core PHP & MySQL to build a project. I have a registration form that is divided into multiple forms (say 4 to 5), each of which collects details from new users, and I want to save the users' data to the database tables only at the final submission. So my question is: where should I save the data collected from the earlier forms?
Situation: if I use session variables to store the form data, it will consume too much server memory, or worse, if many users are registering at the same time, sessions could slow down the server.
Please suggest a solution for that.

Store all the data in a database and just match it against the user's session id.

You can collect all the fields in JavaScript and send them to the server with AJAX. Another solution is to submit each form individually and use SQL transactions.
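To illustrate the session-plus-transaction approach, here is a minimal sketch: each step's fields are accumulated under one session key, and a single INSERT runs inside a transaction at the end. The `users` table, its columns, and the `step` field are assumptions for illustration.

```php
<?php
// Accumulate each form step's fields, then merge them into one row.

function storeStep(array &$store, int $step, array $fields): void
{
    $store['registration'][$step] = $fields; // one sub-array per form step
}

function mergeSteps(array $store): array
{
    $steps = $store['registration'] ?? [];
    ksort($steps);                           // keep step order deterministic
    return array_merge([], ...array_values($steps));
}

// In each step handler (after session_start()):
//   storeStep($_SESSION, (int)$_POST['step'], $_POST);
//
// On the final submit, write one row inside a transaction:
//   $pdo->beginTransaction();
//   $row = mergeSteps($_SESSION);
//   $pdo->prepare('INSERT INTO users (name, email, phone) VALUES (?, ?, ?)')
//       ->execute([$row['name'], $row['email'], $row['phone']]);
//   $pdo->commit();
//   unset($_SESSION['registration']);
```

Only a few kilobytes per user live in the session this way, and the database is touched exactly once per registration.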

Related

Temporary tables that last through browser session

In my web application, I store some data on the local storage of browsers. This data forms a part of the following SQL. Users may request the SQL many times in a session.
SELECT * from table WHERE id NOT IN (...local storage data) ORDER BY RAND() LIMIT 1
To avoid posting data from local storage each time the request is made, can I create a temporary table to store the data instead, that can last until the user leaves my site?
For me, the easiest way to store the data is:
<?php
$_SESSION['variable'] = "data";
?>
Save it via the POST method, with some additional JS code to make it real-time.
Then use the $_SESSION value to insert the data into a database or a CSV file.
Yes, this requires another PHP page, because a PHP page cannot be updated with real-time data after it loads: it runs once and performs no further work once the page is 100% loaded.
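Applying that to the question above, here is a minimal sketch that keeps the excluded ids in the session instead of a temporary table, so the browser only has to post each id once. The helper name `buildExclusionQuery` is hypothetical; the table name `table` comes from the question's query.

```php
<?php
// Build the NOT IN (...) query from a server-side list of excluded ids.

function buildExclusionQuery(array $ids): array
{
    $ids = array_values(array_map('intval', $ids)) ?: [0]; // NOT IN () is invalid SQL
    $placeholders = implode(',', array_fill(0, count($ids), '?'));
    $sql = "SELECT * FROM `table` WHERE id NOT IN ($placeholders) "
         . "ORDER BY RAND() LIMIT 1";
    return [$sql, $ids];
}

// In the request handler (after session_start()):
//   $_SESSION['seen_ids'] = array_unique(array_merge(
//       $_SESSION['seen_ids'] ?? [],
//       $_POST['ids'] ?? []           // only newly seen ids need posting
//   ));
//   [$sql, $params] = buildExclusionQuery($_SESSION['seen_ids']);
//   $stmt = $pdo->prepare($sql);
//   $stmt->execute($params);
//   $row = $stmt->fetch();
```

The list survives for the whole session, which is exactly the lifetime a MySQL temporary table tied to a browser session would need but cannot have, since a temporary table lives only as long as one database connection.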

Best method to store form values WITHOUT sending them to a MySQL Database for Payment Gateway Processing

I have been scratching my head for days now and wonder if you can help.
I am busy developing a ticket booking system BUT don't want to store data in the database UNTIL payment is successful, or in other words until the visitor is returned to my site after payment.
So what I have is a form that posts to a confirmation form. On the confirmation form I catch all the values via POST for the visitor to review. I then POST this off to the payment provider, and once the transaction is successful the visitor is routed back to my site to a confirmation page.
It's THIS confirmation page that I want to use to store the values in the MySQL database.
Now I could use sessions to keep the values in an array, or I could write the values into a TEMP database table, but I don't want to write to and read back from the database too many times, as I don't have load balancing in place and I want it to be as lightweight as possible.
Any ideas?
The answer is fairly simple: important data has to be stored in reliable storage, and a session is not one by design.
So, guided by reason rather than whim, you'll end up storing this data in the database, keyed to the user id.
The solution, as you said, is to use the SESSION. If you want the data to be more persistent, you can also use cookies.
So before sending your data to the payment gateway, do something like this:
$_SESSION['posted_data'] = $_POST;
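On the return page, the saved values can then be consumed once and inserted. A minimal sketch: `takePendingBooking` is a hypothetical helper, and the `bookings` table and its columns are assumptions.

```php
<?php
// Pull the pending booking out of the session exactly once.

function takePendingBooking(array &$session): ?array
{
    $data = $session['posted_data'] ?? null;
    unset($session['posted_data']);  // consume once, so a refresh can't re-insert
    return $data;
}

// On the confirmation page the gateway returns the visitor to
// (after session_start() and after verifying the gateway's success response):
//   $data = takePendingBooking($_SESSION);
//   if ($data !== null) {
//       $pdo->prepare('INSERT INTO bookings (name, email, seats) VALUES (?, ?, ?)')
//           ->execute([$data['name'], $data['email'], $data['seats']]);
//   }
```

Unsetting the key on first read means the database is written exactly once, even if the visitor refreshes the confirmation page.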

How to secure JSON call without using a captcha

We are building a website and have set up the basic plumbing to send logins to our database. We are having trouble disallowing requests that simply plug their own data in.
E.g.
http://testing.site.com/php/interfaces/User.php?Action=1&Email=test#gmail.com&FirstName=herp%20derp
By replacing the email and first name, they are able to add multiple users to the database, potentially thousands with a script. Is there any way to prevent this without using a captcha? We are trying to keep the site's design very minimal and open, so we would love some input if this is possible.
One option we have considered is moving our PHP offline and only allowing our API to access it; however, that still leaves the problem of users adding unauthorised data (and overloading our database with thousands of requests).
Here is a sample option: create a table with two fields, an auto-increment id and a random code; let's name them ID and CODE.
When rendering the page that will send the request, create one record in that table and pass the ID and CODE along with the request. When receiving the request, check whether a record with that ID and CODE exists in the database; if it does, process the request and delete the record; if it doesn't, just ignore the request.
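A minimal sketch of that ID/CODE scheme, assuming a `request_tokens` table (table and column names are assumptions):

```php
<?php
// Generate an unguessable one-time code for each legitimate page render.

function makeCode(): string
{
    return bin2hex(random_bytes(16)); // 32 hex chars, not guessable
}

// Issuing, when rendering the page that will make the call:
//   $code = makeCode();
//   $pdo->prepare('INSERT INTO request_tokens (code) VALUES (?)')->execute([$code]);
//   $id = $pdo->lastInsertId();      // embed $id and $code in the page/request
//
// Verifying, at the top of User.php:
//   $stmt = $pdo->prepare('DELETE FROM request_tokens WHERE id = ? AND code = ?');
//   $stmt->execute([$_GET['id'] ?? 0, $_GET['code'] ?? '']);
//   if ($stmt->rowCount() !== 1) {
//       exit;                        // unknown or already-used token: ignore
//   }
```

Using a single DELETE and checking its row count makes verify-and-invalidate one atomic statement, so two concurrent requests cannot both spend the same token.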

saving form data to database

I am new to PHP, so I am looking for some input on how to make my project a little simpler.
I have a form in which a user can create a list of songs. The Submit button sends it to an intermediate page that saves it to a MySQL database for later use; the intermediate page then forwards them on to the final page, which shows them the lyrics of the songs in the order they chose.
Originally I had the intermediate page and the final page combined, but every time a user refreshed the page it would resubmit the data to the DB. What I have works, but it seems like there should be an easier way to accomplish this.
@micahmills: An easier way of stopping duplicate data from being added to the database? Well, it depends on what you'd consider "easier": less code? Fewer steps? Something else?
What you could do is generate a unique hash or token that submits with the form. This token is then stored in a session after successfully inserting to the database. Attempts to repost the form will then fail because the token sent with the form will be the same as the one stored in the session.
Redirecting to another page after posting to the database is still one of the best and simplest ways of preventing duplicate data being posted though.
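A minimal sketch of one common variant of that token idea, assuming the form carries a hidden `token` field (the helper names are hypothetical):

```php
<?php
// One-shot form token: issued when the form renders, spent on first POST.

function issueToken(array &$session): string
{
    $session['form_token'] = bin2hex(random_bytes(16));
    return $session['form_token'];
}

function consumeToken(array &$session, ?string $posted): bool
{
    $ok = $posted !== null
        && isset($session['form_token'])
        && hash_equals($session['form_token'], $posted);
    unset($session['form_token']);   // one-shot: reposting the form fails the check
    return $ok;
}

// When rendering the form (after session_start()):
//   echo '<input type="hidden" name="token" value="' . issueToken($_SESSION) . '">';
// When handling the POST:
//   if (consumeToken($_SESSION, $_POST['token'] ?? null)) {
//       // safe to INSERT; a refresh or repost lands in the else branch
//   }
```

`hash_equals` is used for the comparison so the check is constant-time; the unset on every attempt is what makes a browser refresh harmless.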
Best practice is to redirect to a success or failure page after the database operation.
You can keep the form and the intermediate page combined and add a final success page; on failure you return the user to the form.

php: pass large arrays of data through pages

I am trying to solve a problem where I need to pass large arrays of data to another page. This is my scenario:
The user inputs his/her Gmail login information into a form. I then send this information to an AJAX page where I authenticate and fetch all the contacts. If the login is invalid they can try again, but if it authenticated I need to send them to the next page, where I parse all the emails and check whether they match any users on the site.
Method 1 (didn't work):
Store all the data inside a session; this only works if the array is small, as there is a size limit for sessions.
Method 2 (didn't work):
Add a hidden input with JavaScript and then submit the form (also with JavaScript).
As it turns out, you can't submit the form and return true (change page) unless the user triggers the event.
So how should I proceed? Should I just skip the AJAX authentication and send them back to the previous page if it didn't work, or is there some workaround to my problem?
Why don't you store the data in a database: MySQL, or SQLite if MySQL is not available to you? There you would store a serialized version of your array, linked to the user's session id.
The MySQL table I'm thinking of:
id | session_id | data
See http://php.net/manual/en/function.serialize.php for how to serialize your array.
If you are able to fetch the data again on the next page, you could do that instead of passing it between pages.
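A minimal sketch of that serialize-into-a-row idea; the `user_data` table is an assumption, and the session id is the lookup key:

```php
<?php
// Round-trip a contact array through a serialized string column.

function packContacts(array $contacts): string
{
    return serialize($contacts);
}

function unpackContacts(string $blob): array
{
    // allowed_classes guards against object injection when unserializing.
    return unserialize($blob, ['allowed_classes' => false]);
}

// Saving, right after fetching the Gmail contacts:
//   $pdo->prepare('REPLACE INTO user_data (session_id, data) VALUES (?, ?)')
//       ->execute([session_id(), packContacts($contacts)]);
// Loading on the next page:
//   $stmt = $pdo->prepare('SELECT data FROM user_data WHERE session_id = ?');
//   $stmt->execute([session_id()]);
//   $contacts = unpackContacts($stmt->fetchColumn());
```

Since the data here is just arrays of strings, json_encode/json_decode would work equally well and keeps the column readable from other languages.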
Since you are using jQuery, you can submit the data directly, or as a hidden element on the form, without a user click. Assuming the second submission is via AJAX, you can:
$("#mydiv").load("secondpage.php", {email1: 'blah'}, function() {
    alert("Submitted everything nicely");
});
Depending on your web server, session variables do not typically have a size restriction; with Apache+PHP you can handle extremely large sessions. What you should actually watch is PHP's memory_limit setting: http://ca.php.net/manual/en/ini.core.php#ini.memory-limit. I am not sure how it did not work for you; you did use $_SESSION, right?
Finally, for a fast (faster than a database) yet persistent volatile store, I would recommend Danga's memcached. It is very stable, widely used, and has wrappers for every possible language.
