Database storage vs Cookies - To store form data - PHP

Good day to all,
I have a form with around 90 to 100 fields, divided into sub forms, which are loaded using AJAX each time a form has to be displayed. I would like to retain the data in the form fields every time a sub form is loaded (say, on an accidental page refresh, or when the user switches between sub forms). What is the best way to do this?
I was thinking that I could store it in cookies, or in the database. But storing it in the database would mean querying for the field data every time a sub form is loaded; and with cookies, the data stored in the cookie has to be read each time. I need some help deciding which is the most efficient way, mainly in terms of speed.
What is the best option among these, or is there any other way to retain the field data in the sub forms every time they are loaded (via AJAX)?
I am working with PHP and the CodeIgniter framework.
Thanks!!

A form like that needs to be durably stored. I would consider session state to smooth out the sub form loads, with writes to the database whenever the user updates something of consequence. Personally, I would start with a database-only solution, and then add session state if performance is an issue.
Cookies aren't meant to store large amounts of data. Even if it were possible, they bloat the request considerably (imagine 100 form fields all being transmitted with every request).
Local storage in the browser is also an option, though I would consider other options first.
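As a rough sketch of that session-plus-database approach in CodeIgniter (the form_drafts table and its columns are assumptions, not part of your app), the AJAX handler that saves a sub form might look something like this:
// Controller method hit by the AJAX call that saves a sub form.
// Assumes a form_drafts table (user_id, field_name, field_value) with a unique
// key on (user_id, field_name) so REPLACE acts as an upsert - adjust to your schema.
public function save_subform()
{
    $fields = $_POST;                                            // all posted sub form fields
    $draft  = $this->session->userdata('draft') ?: array();
    $this->session->set_userdata('draft', array_merge($draft, $fields));  // fast reads on the next load

    foreach ($fields as $name => $value) {                       // durable copy in the database
        $this->db->replace('form_drafts', array(
            'user_id'     => $this->session->userdata('user_id'),
            'field_name'  => $name,
            'field_value' => $value,
        ));
    }
}
When a sub form is loaded, read the saved values from $this->session->userdata('draft') first, and fall back to a query on form_drafts only if the session has nothing.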

I would first simplify it by using serialize:
$data = serialize(array_merge($_POST,$olddata));
That alone may be enough for you, and since it is now just a string it is easy to store anywhere. To restore it to its original state:
$data = unserialize($data);
...wherever you end up pulling it from - database, session, etc.
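For instance, a minimal sketch of the round trip through the session ('draft' is just an assumed key name):
session_start();
// Merge the newly posted fields into whatever was saved before, then store one string.
$olddata = isset($_SESSION['draft']) ? unserialize($_SESSION['draft']) : array();
$_SESSION['draft'] = serialize(array_merge($olddata, $_POST));

// Later, when a sub form is loaded again, pull it back out the same way.
$data = unserialize($_SESSION['draft']);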

Pros of database
It can also be accessed from another computer.
You can store far, far more data than in a cookie.
Cons
If you retrieve the data by AJAX it could put more load on the server.
Cookie
Faster than the database: no query, fetch, or any of that processing.
Cons
Limited amount of space.
However, you could also use local storage.
So my answer is database storage.

Related

PHP - Reconnect to MySQL again or send previous result via POST?

I have two web pages: page1.php and page2.php.
In page1.php
A MySQL query is executed and the result is saved into variable A.
In page2.php
I want the results of the same query.
Should I reconnect to MySQL and execute the query again, or send variable A from page1.php via POST or SESSION? Which is the faster solution? Is there any other solution?
EDIT: There are no other queries, so the data will not change (I think).
I certainly wouldn't pass it via POST; that allows the data to be tampered with by a malicious client.
SESSION will be a bit faster assuming the dataset is not too large, and if the data is unlikely to change in the space of a user's session it's a reasonable choice.
It would be useful to know how long your query actually takes normally, to see whether the time difference would be significant. You did mention that the result set contains 14000 rows. If it's just a select from one table I can't imagine it takes very long. 14000 rows is really quite a lot to be storing in the session. Remember that it stores the dataset once for every user using the site at any one moment (since a session is specific to a single user). If you have a lot of users, you would have to keep an eye on how much memory on your server it is consuming.
Another design note: if you want exactly the same results on multiple pages, consider a 3rd script you can include in the others which does nothing but produce that output.
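A minimal sketch of that shared-script idea (the connection details, file name, table and column names are placeholders):
// customer_report.php - the only place the query runs; page1.php and page2.php just include it.
$mysqli = new mysqli('localhost', 'user', 'password', 'mydb');
$result = $mysqli->query('SELECT id, name FROM customers');
while ($row = $result->fetch_assoc()) {
    echo '<li>' . htmlspecialchars($row['name']) . '</li>';
}
page1.php and page2.php then just do include 'customer_report.php'; wherever the output belongs, so the query logic lives in exactly one place.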
If you are not sure whether the data may have changed, run a new query! It may take a little more time, but you can be sure the data is still up to date.
What if the user restarts page1.php in another browser while still processing page2.php?
So reuse the data only if you are 100% sure nothing can be manipulated, or if loading it fresh takes too much time/resources.
Never trust data which may have been changed by the user, and ALWAYS make sure it is still what you expect it to be. Maybe you should consider adding some kind of validation of the data.

Should I store data in a SESSION or use multiple Ajax calls?

If I am retrieving large amounts of data from a database to be used on multiple sequential pages what is the best method for this?
For example, at the moment I have a page with a form that calls a PHP script via its action attribute.
This script searches a database full of customers, stores them in a session array and redirects back to the original page. The page then tests if the session is set and loops through the session array displaying each customer in a combo box.
The user moves to the next page where the array is used again to display the customers.
There will be multiple users (only around 10 or so) accessing the database for this information sequentially.
Would I be better off keeping it stored in the database and retrieving it from there every time I need it, rather than the SESSION?
With only ten users to worry about you aren't going to notice a significant improvement or decline in performance either way. That being said, if this data is dynamic, you would benefit from directly accessing it in case it changes in between pages.
Normally this would be done with caching (memcache, etc) but your situation is controlled enough not to require anything more than an SQL query.
This of course assumes your server was made in the past decade and can handle ten simultaneous users. If your server performance is not up to that challenge, I will revisit my answer.

PHP One time database query on application startup

I have an AJAX-based PHP app (without any frameworks, etc.).
I need to retrieve some records from the database (HTML select element items) ONCE, and once only, during application startup, store them in a PHP array, and have this array available for future use, to avoid further database calls, for ALL future users.
I could do this easily in Spring with initializing beans. And this bean would have the application scope (context) so that it could be used for ALL future user threads needing the data. That means the database retrieval would be once, only during app boot, and then some bean would hold the dropdown data permanently.
I can't understand how to replicate the usecase in PHP.
There's no "application" bootstrapping as such, not until the first user actually does something to invoke my php files.
Moreover, there is no application context - records retrieved for the first user will not be available to another user.
How do I solve this problem? (Note: I don't want to use any library like memcache or whatever.)
If you truly need to get the data only the first time the app is loaded by any user, then you could write something that gets the data from your database and then rewrites the HTML page that you want those values in. That way, when the next user comes along, they are viewing a static page that has already been written by a program.
I'm not so sure that one call to the database every time a user hits your app is going to kill you, though. Maybe you've got a good reason, but avoiding the database all but one time seems ridiculous IMO.
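As a rough sketch of that write-it-once idea (the script name, output file name, connection details, and query are all assumptions):
// build_options.php - run once, or whenever the list changes, to regenerate a static include.
$mysqli = new mysqli('localhost', 'user', 'password', 'mydb');
$result = $mysqli->query('SELECT id, label FROM select_items');
$html   = '';
while ($row = $result->fetch_assoc()) {
    $html .= '<option value="' . (int) $row['id'] . '">' . htmlspecialchars($row['label']) . "</option>\n";
}
file_put_contents('select_options.html', $html);   // later requests just include this file
Each page then includes select_options.html inside its <select> element, with no database call at all.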
If you need to hit the database one time per visitor, you could use $_SESSION. At the beginning of your script(s) you would start up a session and check to see if there are values in it from the database. If not, it's the user's first visit and you need to query the database. Store the database values in the $_SESSION superglobal and carry on. If the data is in the session, use it and don't query the database.
Would that cover you?
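Roughly, that per-visitor pattern looks like this (the connection details and the 'select_items' names are made up; fetch_all() needs the mysqlnd driver):
session_start();
if (!isset($_SESSION['select_items'])) {
    // First request from this visitor: hit the database once and cache the rows.
    $mysqli = new mysqli('localhost', 'user', 'password', 'mydb');
    $result = $mysqli->query('SELECT id, label FROM select_items');
    $_SESSION['select_items'] = $result->fetch_all(MYSQLI_ASSOC);
}
$items = $_SESSION['select_items'];   // every later request reads this instead of the database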

PHP parse a CSV file - should it be stored in the database

I have a php/mysql application, part of which allows the user to upload a csv file. The steps in the process are as follows:
User uploads a file, and it gets parsed to check for validity
If the file is valid, the parsed information is displayed, along with options for the user to match columns from the csv file to database columns
Import the data - final stage where the csv data is actually imported into the database
So, the problem that I have at this point is that the same csv file gets parsed in each of the above 3 steps - so that means 3 parses for each import.
Given that there can be up to 500 rows per csv file, then this doesn't strike me as particularly efficient.
Should I rather temporarily store the imported information in a database table after step 1? If so, I would obviously run clear up routines to keep the table as clean as possible. The one downside is that the csv imports can contain between 2 and 10 columns - so I'd have to make a database table of at least 11 columns (with an ID field)...which would be somewhat redundant in most cases.
Or should I just stick with the csv parsing? Up to 500 rows is quite small...
Or perhaps there is another better alternative?
In PHP, you can store data into the Session memory for later use. This allows you to parse the CSV file only once, save it in the Session memory and use this object in all of the later steps.
See http://www.tizag.com/phpT/phpsessions.php for a small tutorial.
Let me explain a bit more.
Every time a web browser requests a page from the server, PHP executes the PHP script associated with the web page. It then sends the output to the user. This is inherently stateless: the user requests something, you give something back -> end of transaction.
Sometimes, you may want to remember something you calculated in your PHP script and use it the next time the page is requested. This is stateful, you want to save state across different web requests.
One way is to save this result in the database or in a flat file. You could even add an identifier for the currently connected user, so you use a per-user file or save the current user in your database.
You could also use a hidden form and save all of the data as hidden input fields. When the user presses "Next", the hidden input fields are sent back to the PHP script.
This is all very clumsy. There is a better way: session memory. This is a piece of memory that you can access, which is saved across different PHP calls. It is perfect for saving temporary state information like this. The session memory can be indexed per application user.
Please note that there are frameworks that take this a lot further. Java SEAM has an APPLICATION memory, a SESSION memory, a CONVERSATION memory, a PAGE memory and even a single EVENT memory.
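A minimal sketch of that session-memory idea applied to the CSV steps here (the 'csv' upload field name and the 'csv_rows' key are assumptions):
session_start();
// Step 1: parse the uploaded file once and keep the rows for the later steps.
if (($handle = fopen($_FILES['csv']['tmp_name'], 'r')) !== false) {
    $rows = array();
    while (($row = fgetcsv($handle)) !== false) {
        $rows[] = $row;
    }
    fclose($handle);
    $_SESSION['csv_rows'] = $rows;
}
// Steps 2 and 3 then read $_SESSION['csv_rows'] instead of parsing the file again.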
I had to do a similar thing for importing users into a database. What I ended up doing was this:
Import and parse CSV
Assign data to an array
Next page had a bunch of hidden form fields each with the data (ex. <input type="hidden" name="data[]" value="thedata" />)
Post it over and add the data to the database
It ended up working well for me. You can also save the data to session variables.
I'd just stick with parsing it 3 times. PHP is slow anyway, as are the network latencies for using the database or sending information to the client. What's most important is that your code is maintainable and extensible. The slowest part of a computer program is the developer.
See http://en.wikipedia.org/wiki/Program_optimization#When_to_optimize
http://ubiquity.acm.org/article.cfm?id=1147993
http://benchmarksgame.alioth.debian.org/u32/which-programs-are-fastest.html
Hope that helps...

Tracking information over many pages for a website

I have a sort of vague question that you guys are perfect for answering. Many times I've had a form for my users to fill out that consists of many different pages. So far, I've been saving the data in a session, but I'm a little worried about that practice, since a session could expire and it seems a rather volatile way of doing it.
I could see, for example, having a table for temporary forms in SQL that you save to at the end of each page. I could see posting all the data taken so far to the next page. Things along those lines. How do you guys do it? What's good practice for these situations?
Yes, you can definitely save the intermediate data in the database, and then flip some bit to indicate that the record is finished when the user submits the final result. Depending on how you are splitting up the data collection, each page may be creating a row in a different table (with some key tying them together).
You may also want to consider saving the data in a more free-form manner, such as XML in a single column. This will allow you to maintain complex data structures in a simple data schema, but it will make querying the data difficult (unless your database supports xml column types, which most modern enterprisey databases do).
Another advantage to storing the interim data in the database is that the user can return to it later if he wishes. Just send the user an email when he starts, with a link to his work item. Of course, you may need to add whatever security layers on top of that to make sure someone else doesn't return to his work item.
Storing the interim data in the DB also allows the user to skip around from one page to another, and revisit past pages.
Hidden fields are also a good approach, but they will not allow the user to return later.
I would avoid storing large data structures in session, since if the user doesn't invalidate the session explicitly, and if you don't have a good mechanism for cleaning up old sessions, these expired sessions may stick around for a long time.
In the end, it really depends on your specific business needs, but hopefully this gives you something to think about.
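As a rough sketch of that interim-table idea (the form_progress table, its columns, and the $userId / $isLastPage variables are assumptions; JSON is used here instead of XML purely for brevity):
// Run at the end of each wizard page; the row is flagged complete on the final submit.
// Assumes form_progress (user_id, form_data, completed) with a unique key on user_id.
$pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'password');
$stmt = $pdo->prepare(
    'INSERT INTO form_progress (user_id, form_data, completed)
     VALUES (:uid, :data, :done)
     ON DUPLICATE KEY UPDATE form_data = VALUES(form_data), completed = VALUES(completed)'
);
$stmt->execute(array(
    ':uid'  => $userId,                 // the logged-in user, or a token from the emailed link
    ':data' => json_encode($_POST),     // free-form storage of this page's fields
    ':done' => $isLastPage ? 1 : 0,
));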
I would stick with keeping the data in the session, as it is more or less temporary at this stage: what would you do if a user does not complete the forms? You would have to check the SQL table for uncompleted data regularly, making your whole application more complex.
By the way, there is a reason for sessions expiring, namely security. And you can define yourself when a session expires.
Why not just pass things along in hidden parameters?
Ahh, good question.
I've found a great way to handle this (if the flow is linear): hidden fields. The following works especially well if you are including different content (pages) into one PHP page (MVC, for example). However, if you need to go from URL to URL it can be difficult, because you cannot POST across a redirect (well, you can, but no browsers support it).
You can fill in the details.
<?php
// Restore any previously submitted data, or start fresh on the first page.
$data = isset($_POST['data']) ? unserialize(base64_decode($_POST['data'])) : array();
// ...add this page's fields to $data here...
// Serialize the accumulated data so it can travel along in a hidden field.
$encoded = base64_encode(serialize($data));
?>
<input type="hidden" name="data" value="<?= htmlspecialchars($encoded, ENT_QUOTES); ?>" />
