I have two web pages: page1.php and page2.php.
In page1.php
A MySQL query is executed and its result is saved into variable A.
In page2.php
I want the same results of the same query.
Should I reconnect to MySQL and execute the query again, or send the variable A from page1.php via POST or SESSION? What is the fastest solution? Is there any other solution?
EDIT: There are no other queries, so the data will not change (I think).
I certainly wouldn't pass it via POST, this allows for the data to be tampered with by a malicious client.
SESSION will be a bit faster assuming the dataset is not too large, and if the data is unlikely to change over the course of a user's session it's a reasonable choice.
It would be useful to know how long your query actually takes normally, to see whether the time difference would be significant. You did mention that the result set contains 14000 rows. If it's just a select from one table I can't imagine it takes very long. 14000 rows is really quite a lot to be storing in the session. Remember that it stores the dataset once for every user using the site at any one moment (since a session is specific to a single user). If you have a lot of users, you would have to keep an eye on how much memory on your server it is consuming.
Another design note: if you want exactly the same results on multiple pages, consider a 3rd script you can include in the others which does nothing but produce that output.
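If you do go the SESSION route, a minimal sketch might look like this (the query and the 'query_results' key are just placeholders, not your actual code):
<?php
// page1.php - run the query once and keep the rows in the session
session_start();
$result = $mysqli->query('SELECT id, title FROM posts');       // $mysqli is an existing connection
$_SESSION['query_results'] = $result->fetch_all(MYSQLI_ASSOC); // requires mysqlnd

// page2.php - reuse the rows without touching MySQL again
session_start();
$rows = $_SESSION['query_results'] ?? [];                      // re-query here if the key is missing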
If you are not sure whether the data may have changed, make a new query! It may take some more time, but you can be sure the data is still up to date.
What if the user restarts page1.php in another browser while still processing page2.php?
So only if you are 100% sure nothing can be manipulated or if loading the data takes too much time/resources reuse the data.
Never trust data which may have been changed by the user, and ALWAYS make sure it is still what you expect it to be. Maybe you should consider adding some kind of validation of the data.
Good day to all,
I have a form with around 90 to 100 fields, divided into sub forms, which are loaded using AJAX each time a form has to be displayed. But I would like to retain the data in the form fields every time a sub form is loaded (let's say on an accidental page refresh, or even if the user is switching between sub forms). What is the best way this can be done?
I was thinking that I could store it in cookies or, let's say, in the database. But storing it in the database would mean that I would have to query for the fields' data every time a sub form is loaded. And again, if it were cookies, it would have to read the data stored in the cookie files. I need some help deciding what is the most efficient way, more in terms of speed.
What is the best way among these, or is there any other possibility to retain the field data in the sub forms every time they are loaded (which happens via AJAX each time)?
I am working with PHP and Codeigniter framework.
Thanks!!
A form like that needs to be durably stored. I would consider session state to smooth out the sub form loads, with writes to the database whenever the user updates something of consequence. Personally, I would start with a database-only solution, and then add session state if performance is an issue.
Cookies aren't meant to store large amounts of data. Even if it were possible, they bloat the request considerably (imagine 100 form fields all being transmitted with every request).
Local storage in the browser is also an option, though I would consider other options first.
I would first simplify it by using serialize:
$data = serialize(array_merge($_POST,$olddata));
That may be enough for you, but it's now super easy to store anywhere, since it is just a string. To restore it to its original state:
$data = unserialize($data);
... wherever you end up pulling it from: database, session, etc.
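Since you're on CodeIgniter, a minimal sketch of parking that string in the session between AJAX loads could look like this (CI 2/3 session syntax assumed; 'subform_data' is just an example key):
// When a sub form is submitted or switched away from
$data = serialize(array_merge($_POST, $olddata));
$this->session->set_userdata('subform_data', $data);

// When a sub form is loaded again, repopulate the fields
$saved = $this->session->userdata('subform_data');
$olddata = $saved ? unserialize($saved) : array();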
Pros of database
- It can be accessed from other computers too.
- You can store far, far more data than in a cookie.
Cons of database
- If you retrieve the data by AJAX it could cause more load on the server.
Pros of cookie
- Faster than the database: no query, fetch, or any of that processing.
Cons of cookie
- Limited amount of space.
However, you can also use local storage.
So my answer is database storage.
Well, this is kind of a question of how to design a website that uses fewer resources than normal websites. Mobile-optimized as well.
Here it goes: I was about to display an overview of, e.g., 5 posts (from, e.g., a blog). Then if I clicked, for example, on the first post, I'd load this post in a new window. But instead of connecting to the database again and getting this specific post by its specific id, I'd just look up that post (in PHP) in the array of 5 posts that I created earlier, when I fetched the website for the first time.
Would it save data that has to be downloaded? PHP works server-side as well, so that's why I'm not sure.
Ok, I'll explain again:
Method 1:
User connects to my website
5 posts are displayed & saved to an array (with all their data)
User clicks on the first Post and expects more Information about this post.
My program looks up the post in my array and displays it.
Method 2:
User connects to my website
5 posts are displayed
User clicks on the first Post and expects more Information about this post.
My program connects to MySQL again and fetches the post from the server.
First off, this sounds like a case of premature optimization. I would not start caching anything outside of the database until measurements prove that it's a wise thing to do. Caching takes your focus away from the core task at hand, and introduces complexity.
If you do want to keep DB results in memory, just using an array allocated in a PHP-processed HTTP request will not be sufficient. Once the page is processed, memory allocated at that scope is no longer available.
You could certainly put the results in SESSION scope. The advantage of saving some DB results in the SESSION is that you avoid DB round trips. Disadvantages include the increased complexity of programming the solution, use of memory on the web server for data that may never be accessed, and increased initial load on the DB to retrieve the extra pages that may or may not ever be requested by the user.
If DB performance, after measurement, really is causing you to miss your performance objectives you can use a well-proven caching system such as memcached to keep frequently accessed data in the web server's (or dedicated cache server's) memory.
Final note: You say
PHP works server-side as well
That's not accurate. PHP works server-side only.
Have you thought about saving the posts in divs, and only making them visible when the user clicks somewhere? Here is how to do that.
Put some sort of cache between your code and the database.
So your code will look like
if (isPostInCache()) {
    loadPostFromCache();
} else {
    loadPostFromDatabase();
}
Go for some caching system, the web is full of them. You can use memcached or a static caching you can made by yourself (i.e. save post in txt files on the server)
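For example, a minimal memcached sketch (server address, key name and the 5-minute TTL are assumptions; loadPostFromDatabase() stands in for your existing DB fetch):
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

$key  = 'post_' . $postId;              // $postId comes from the request
$post = $mc->get($key);

if ($post === false) {                  // cache miss: fall back to the database
    $post = loadPostFromDatabase($postId);
    $mc->set($key, $post, 300);         // keep it cached for 5 minutes
}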
To me, this is a little more inefficient than making a 2nd call to the database and here is why.
The first query should only pull the fields you want, like title, author, and date. The content of the post may be a heavy query, so I'd exclude that (you can pull a teaser if you'd like).
Then, if the user wants the details of the post, I would query for the content using an indexed key column.
That way you're not pulling content for 5 posts that may never be seen.
If your PHP code is constantly re-connecting to the database you've configured it wrong and aren't using connection pooling properly. The execution time of a query should be a few milliseconds at most if you've got your stack properly tuned. Do not cache unless you absolutely have to.
What you're advocating here is side-stepping a serious problem. Database queries should be effortless provided your database is properly configured. Fix that issue and you won't need to go down the caching road.
Saving data from one request to the other is a broken design and if not done perfectly could lead to embarrassing data bleed situations where one user is seeing content intended for another. This is why caching is an option usually pursued after all other avenues have been exhausted.
I certainly can't solve this problem by myself, even after many days of trying already. This is the problem:
We need to display information on the screen (HTML) that is being generated in real time inside a PHP file.
The PHP is performing very active crawling, returning huge arrays of URLs. Each URL needs to be displayed in real time in the HTML as soon as the PHP captures it; that's why we are using the ob_flush() and flush() methods to echo and print the arrays as soon as we get them.
Meanwhile we need to display this information somehow so the users can see it while it works (since it could take more than one hour to finish).
As far as I understand, it's not possible to do this with AJAX, since we need to make only 1 request and read the information inside the array. I'm not totally sure whether Comet can do something like this either, since it would interrupt the connection as soon as it gets new information, and the array is increasing in size really rapidly.
Additionally, and just to make things more complex, there's no real need to print or echo the information (URLs) inside the array, since the HTML file is being included as the user interface of the same file that is processing and generating the array that we need to display.
Long story short; we need to place here:
<ul>
<li></li>
<li></li>
<li></li>
<li></li>
<li></li>
...
</ul>
A never-ending, real-time-updated list of URLs being generated and pushed into an array, 1,000 lines below, in a PHP loop.
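Roughly, the loop looks like this (simplified; $crawler->nextUrls() is just a stand-in for our real crawling code):
echo '<ul>';
while ($urls = $crawler->nextUrls()) {        // stand-in for the real crawl step
    foreach ($urls as $url) {
        echo '<li>' . htmlspecialchars($url) . '</li>';
    }
    ob_flush();                               // push PHP's output buffer...
    flush();                                  // ...and the web server's buffer to the browser
}
echo '</ul>';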
Any help would be really more than appreciated.
Thanks in advance!
Try web sockets.
They offer real-time communication between the client and server, and using socket.io provides cross-browser compatibility. It's basically giving you the same results as long-polling / Comet, but there is less overhead between requests, so it's faster.
In this case you would use web sockets to send updates to the client about the current status of the processing (or whatever it was doing).
See this: Using PHP with Socket.io
Suppose you used a scheme where PHP was writing to a Memcached server.
Each key you write is named rec1, rec2, rec3, and so on.
You also store a current_min and a current_max.
You have the user constantly polling with AJAX. For each request they include the last key they saw; call this k. The server then returns all the records from k to max.
If no records are immediately available, the server goes into a wait loop for a max of, say, 3 seconds, checking whether there are new records every 100 ms.
If records become available, they are sent immediately.
Whenever the client receives updates or the connection is terminated, it immediately starts a new request...
Writing a new record is just a matter of inserting at max+1 and incrementing min and max, where max-min is the number of records you want to keep available...
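A rough sketch of that polling endpoint in PHP (the key names and the 3-second / 100 ms figures follow the description above; everything else, including the 'last' request parameter, is an assumption):
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

$lastSeen = (int) ($_GET['last'] ?? 0);    // last record index the client has seen (k)
$deadline = microtime(true) + 3;           // wait for at most ~3 seconds
$records  = [];

do {
    $max = (int) $mc->get('current_max');
    if ($max > $lastSeen) {
        for ($i = $lastSeen + 1; $i <= $max; $i++) {
            $rec = $mc->get('rec' . $i);
            if ($rec !== false) {
                $records[] = $rec;
            }
        }
        $lastSeen = $max;
        break;
    }
    usleep(100000);                        // nothing new yet, check again in 100 ms
} while (microtime(true) < $deadline);

header('Content-Type: application/json');
echo json_encode(['last' => $lastSeen, 'records' => $records]);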
An alternative to web sockets is COMET
I wrote an article about this, along with a follow-up describing my experiences.
COMET in my experience is fast. Web sockets are definitely the future, but if you're in a situation where you just need to get it done, you can have COMET up and running in under an hour.
Definitely some sort of shared memory structure is needed here - perhaps an in-memory temp table in your database, or Memcached as stephen already suggested.
I think the best way to do this would be to have the first PHP script save each record to a database (MySQL or SQLite perhaps), and then have a second PHP script which reads from the database and outputs the newest records. Then use AJAX to call this script every so often and add the records it sends to your table. You will have to find a way of triggering the first script.
The JavaScript should record the id of the last URL it already has and send it in the AJAX request; then PHP can select all rows with ids greater than that.
If the number of URLs is so huge that you can't store a database that large on your server (one might ask how a browser is going to cope with a table as large as that!) then you could always have the PHP script which outputs the most recent records delete them from the database as well.
Edit: When doing a lot of MySQL inserts there are several things you can do to speed them up. There is an excellent answer here detailing them. In short: use MyISAM, and insert as many rows as you can in a single query (keep a buffer array in PHP which you add URLs to, and when it is full insert the whole buffer in one query).
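A hedged sketch of that buffering idea (the crawled_urls table, its url column and the buffer size of 500 are all assumptions):
$buffer = [];

function flushBuffer(mysqli $db, array &$buffer) {
    if (empty($buffer)) {
        return;
    }
    $values = [];
    foreach ($buffer as $url) {
        $values[] = "('" . $db->real_escape_string($url) . "')";
    }
    $db->query('INSERT INTO crawled_urls (url) VALUES ' . implode(',', $values));
    $buffer = [];
}

// Inside the crawling loop:
foreach ($foundUrls as $url) {           // $foundUrls is whatever batch the crawler just produced
    $buffer[] = $url;
    if (count($buffer) >= 500) {
        flushBuffer($db, $buffer);       // $db is an existing mysqli connection
    }
}
flushBuffer($db, $buffer);               // write whatever is left at the end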
If I were you, I would try to solve this in two ways.
First of all, I would encode the output array as JSON, and with JavaScript's setTimeout function I would decode it and append it to <ul id="appendHere"></ul>, so that
when the list is updated, the page will automatically update itself, like a cron job in JS.
The second way: if you say you can't produce output while processing, then
inserting the data into MySQL seems pointless to me; use MongoDB or similar to increase speed.
By the way, you'll reach what you need with your key and never duplicate the inserted value.
I'm designing my own session handler for my web app; the PHP sessions are too limited when trying to control the time the session should last.
Anyway, my first tests were like this: a session_id stored in a MySQL row and also in a cookie, with the rest of my session vars in the same MySQL row.
On every request to the server I make a query, get these vars, and put them into an array to use the necessary ones at runtime.
Last night I was thinking that I could write the vars to a server file once, at the login stage, and later just include that file instead of making a MySQL query on every request.
So, my question is: which is less resource-consuming, doing this in MySQL or in a file?
I know, I know, I already read several threads on Stack Overflow about this issue, but I have something different from all those cases (I hope I didn't miss something):
I need to keep track of the time that has passed since the last time the user used the app, so on every call to the server not only do I request the entire database row, I also update a timestamp on that same row.
So, in both cases I need to write to the session on every request...
FYI: the entire app runs on one server, so the multiple-servers scenario when using files does not apply.
It's easier to work with when it's done in a database, and I've been using sessions in the database mostly for scalability.
You may use MySQL, since on well-configured servers it can store sessions in its temporary memory; you can even use MEMORY tables to speed things up if you can fit all the sessions within memory. If you get near your memory limit, it's easy to switch to a normal table.
I'd say MySQL wins over files for performance on medium to large sites, and also for customization/options. For smaller websites I don't think it makes that much of a difference, but you will use more of the hard drive when using files.
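If you do end up rolling your own handler on top of MySQL, a rough sketch using SessionHandlerInterface might look like this (PHP 8.1+ syntax; the sessions table with id, data and last_activity columns is an assumption, and the write path refreshes the timestamp on every request, as the question requires):
class MysqlSessionHandler implements SessionHandlerInterface
{
    public function __construct(private mysqli $db) {}

    public function open(string $path, string $name): bool { return true; }
    public function close(): bool { return true; }

    public function read(string $id): string
    {
        $stmt = $this->db->prepare('SELECT data FROM sessions WHERE id = ?');
        $stmt->bind_param('s', $id);
        $stmt->execute();
        $row = $stmt->get_result()->fetch_assoc();
        return $row['data'] ?? '';
    }

    public function write(string $id, string $data): bool
    {
        // REPLACE also refreshes last_activity, so "time since last use" is tracked on every request.
        $stmt = $this->db->prepare('REPLACE INTO sessions (id, data, last_activity) VALUES (?, ?, NOW())');
        $stmt->bind_param('ss', $id, $data);
        return $stmt->execute();
    }

    public function destroy(string $id): bool
    {
        $stmt = $this->db->prepare('DELETE FROM sessions WHERE id = ?');
        $stmt->bind_param('s', $id);
        return $stmt->execute();
    }

    public function gc(int $max_lifetime): int|false
    {
        $this->db->query("DELETE FROM sessions WHERE last_activity < NOW() - INTERVAL $max_lifetime SECOND");
        return $this->db->affected_rows;
    }
}

session_set_save_handler(new MysqlSessionHandler($db), true); // $db is an existing mysqli connection
session_start();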
I have an AJAX-based PHP app (without any frameworks, etc.).
I need to retrieve some records from the database (items for an HTML select element) ONCE, and once only, during application startup, store them in a PHP array, and have this array available for future use to prevent further database calls, for ALL future users.
I could do this easily in Spring by initializing beans. Such a bean would have application scope (context) so that it could be used for ALL future user threads needing the data. That means the database retrieval would happen once, only during app boot, and then some bean would hold the dropdown data permanently.
I can't understand how to replicate this use case in PHP.
There's no "application" bootstrapping as such, not until the first user actually does something to invoke my php files.
Moreover, there is no application context - records retrieved for the first user will not be available to another user.
How do I solve this problem? (Note: I don't want to use any library like memcache or whatever.)
If you truly need to get the data only the first time the app is loaded by any user, then you could write something that gets the data from your database and then rewrites the HTML page that you want those values in. That way, when the next user comes along, they are viewing a static page that has been written by a program.
I'm not so sure that one call to the database every time a user hits your app is going to kill you, though. Maybe you've got a good reason, but avoiding the database all but one time seems ridiculous, IMO.
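A minimal sketch of that "write it once, then serve it statically" idea, using a generated PHP file instead of rewriting the HTML page directly (the file name, query and table are assumptions):
$cacheFile = __DIR__ . '/cache/select_items.php';

if (!file_exists($cacheFile)) {
    $db = new mysqli('localhost', 'user', 'pass', 'mydb');   // connection details assumed
    $rows = $db->query('SELECT id, label FROM select_items')->fetch_all(MYSQLI_ASSOC);
    // Dump the array as plain PHP, so every later request just includes a static file.
    file_put_contents($cacheFile, '<?php return ' . var_export($rows, true) . ';');
}

$items = require $cacheFile;   // no database hit after the very first request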
If you only need to hit the database one time per visitor, you could use $_SESSION. At the beginning of your script(s), start a session and check whether there are values in it from the database. If not, it's the user's first visit and you need to query the database; store the database values in the $_SESSION superglobal and carry on. If the data is in the session, use it and don't query the database.
Would that cover you?
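In code, that per-user check might look roughly like this (the 'dropdown_items' key, the query and the connection details are assumptions):
session_start();

if (!isset($_SESSION['dropdown_items'])) {
    $db = new mysqli('localhost', 'user', 'pass', 'mydb');          // connection details assumed
    $result = $db->query('SELECT id, label FROM select_items');
    $_SESSION['dropdown_items'] = $result->fetch_all(MYSQLI_ASSOC); // first visit: query once
}

$items = $_SESSION['dropdown_items'];   // every later request in this session skips the database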