I am using PHP with MySQL.
On form submission I first run a query against MySQL, get the row IDs, and store them in a session array.
To display the results I modified a PHP array-pagination script to paginate that session array with first, prev, next, last, and jump-to-page functionality. I used this page as a reference:
http://lotsofcode.com/php/php-array-pagination.htm
The script is working fine.
But I have two questions.
Question 1: Is it fine to store a big result set in a session array? If not, what would be a good alternative? (I am considering keeping the first 500 results in the session array and, if there are more than 500 results, creating an XML file.)
Question 2: Is it possible to use AJAX for pagination that works with a session array and still has first, prev, next, last, and jump-to-page functionality?
If there is any solution, please let me know.
Thanks
Ravindra.
I wouldn't use sessions to store large amounts of data, and I wouldn't use sessions to store data returned from a table (MySQL or otherwise).
Session data is stored on the server and held in memory. What's going to happen when you get multiple users hitting the same table?
The database (MySQL or otherwise), with the correct indexes in place, should be sufficient to handle the queries for displaying data.
I have a MySQL table with 120 million records and can extract data quickly using date ranges, with no speed issues whatsoever.
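The usual alternative to a session array is to paginate directly in SQL with LIMIT/OFFSET, so each request fetches only one page of rows. A minimal sketch using PDO, where the table and column names are placeholders rather than anything from the question:

```php
<?php
// Compute the zero-based row offset for a 1-based page number.
function pageOffset(int $page, int $perPage): int
{
    return (max(1, $page) - 1) * $perPage;
}

// Fetch one page of rows; table and column names are placeholders.
// With an index on the ORDER BY column, this stays fast even for large tables.
function fetchPage(PDO $pdo, int $page, int $perPage): array
{
    $stmt = $pdo->prepare(
        'SELECT id, name FROM results ORDER BY id LIMIT :lim OFFSET :off'
    );
    $stmt->bindValue(':lim', $perPage, PDO::PARAM_INT);
    $stmt->bindValue(':off', pageOffset($page, $perPage), PDO::PARAM_INT);
    $stmt->execute();
    return $stmt->fetchAll(PDO::FETCH_ASSOC);
}

// Total page count, for building first/prev/next/last and jump-to-page links.
function pageCount(PDO $pdo, int $perPage): int
{
    $total = (int) $pdo->query('SELECT COUNT(*) FROM results')->fetchColumn();
    return max(1, (int) ceil($total / $perPage));
}
```

The first/prev/next/last links then just change the page number passed to fetchPage(); nothing needs to live in the session.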
I recently started saving data that I use several times into a PHP session.
For example, I need the list of active countries on every page load. Instead of opening a database connection every time, I now save this data in the session. I do that with all the data I use repeatedly. I also built my banners into a multi-level array and saved that array in the session.
On the first load of the website, all the data I need is collected and stored in several session entries.
Now I have about 10 different session entries at the start of the website but practically no database queries anymore.
Is this approach workable?
Do you see a risk? For me it's all about the performance of the site; I try to make as few database queries as possible for recurring tasks.
Am I making a reasoning mistake?
There is no sensitive data in the session. For example, the countries for a select field are stored as a key => country array, and I build the select field from it with a foreach loop.
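The pattern described above can be sketched as a small cache-in-session helper. The function names and the country list here are placeholders for illustration; in the real app loadCountries() would run the database query once:

```php
<?php
session_start();

// Return a cached value from the session, computing it once via $loader.
function sessionCached(string $key, callable $loader)
{
    if (!isset($_SESSION[$key])) {
        $_SESSION[$key] = $loader();
    }
    return $_SESSION[$key];
}

// Placeholder loader; in the real app this would query the database once.
function loadCountries(): array
{
    return ['de' => 'Germany', 'fr' => 'France', 'it' => 'Italy'];
}

$countries = sessionCached('countries', 'loadCountries');

// Build the select field from the cached key => country array.
echo '<select name="country">';
foreach ($countries as $code => $name) {
    echo sprintf('<option value="%s">%s</option>', $code, $name);
}
echo '</select>';
```

The main risk is staleness: the cached copy lives until the session ends, so if the country list changes you need to unset($_SESSION['countries']) to force a reload.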
I have an AngularJS v1 app that connects to an Oracle DB and reads values from it.
At the beginning, each page had just a handful of such values, so optimization was not a concern. But now I have pages that contain dozens of elements with values taken from the DB.
Currently, for each element, the element name is passed to a PHP file that opens a connection to the DB, reads the last value (using rownum and ordering by time), and returns this value to AngularJS.
So, as you can imagine, it takes quite a while to display all those values on a page (AngularJS first loads all the values and then displays them together).
I would like to somehow optimize this connectivity.
The example of a page can be seen at: NP04 Photon Detectors
And the code can be found here: Github
The PHP file in question is: app/php-db-conn/elementName.conn.php
Thanks!
UPDATE
I have updated the code to retrieve part of the data using an array, but it seems to me that I'm forming the JSON incorrectly. Could someone please help me out?
UPDATE 2
Managed to make it all work, but now it takes even longer to load than before, despite transferring less data.
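For reference, the batching idea boils down to opening one connection and returning all requested values in a single JSON response, instead of one request per element. A minimal sketch (this is not the actual elementName.conn.php code; the lookup is injected as a callable so the batching logic is visible on its own, while in the real script it would run oci_parse/oci_execute queries over one shared connection):

```php
<?php
// Build one JSON response for a batch of element names.
// $lookup maps an element name to its latest value; in the real app it
// would query Oracle over a single shared connection rather than opening
// a new connection per element.
function buildBatchResponse(array $names, callable $lookup): string
{
    $result = [];
    foreach ($names as $name) {
        $result[$name] = $lookup($name);
    }
    return json_encode($result);
}

// Example: the client posts all element names in one request, and gets
// back a single {"name": value, ...} object to bind in AngularJS.
$json = buildBatchResponse(['a', 'b'], fn($n) => strtoupper($n));
```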
If I am retrieving large amounts of data from a database to be used on multiple sequential pages, what is the best method for this?
For example, at the moment I have a page with a form that calls a PHP script via its action.
This script searches a database of customers, stores them in a session array, and redirects back to the original page. The page then tests whether the session is set and loops through the session array, displaying each customer in a combo box.
The user moves to the next page, where the array is used again to display the customers.
There will be multiple users (only around 10 or so) accessing the database for this information sequentially.
Would I be better off keeping the data in the database and retrieving it from there every time I need it, rather than from the SESSION?
With only ten users to worry about you aren't going to notice a significant improvement or decline in performance either way. That being said, if this data is dynamic, you would benefit from querying it directly in case it changes between pages.
Normally this would be done with caching (memcache, etc.), but your situation is controlled enough not to require anything more than an SQL query.
This of course assumes your server was made in the past decade and can handle ten simultaneous users. If your server performance is not up to the challenge, I will revisit my answer.
I've got a JSON string stored in a MySQL DB and now I'm trying to find a way to check whether a specific key contains a value at all.
I've been googling around, but most solutions point to finding a specific value, whereas I want to check whether there's any value there at all.
This will be part of a PHP check function, and if there's a way to get all the results from MySQL in one query instead of doing multiple queries, that'd be great.
Example:
row 1 {"Name":"Jane","Group":"","customernumber":"12345"}
row 2 {"Name":"Mike","Group":"Sales","customernumber":"23456"}
row 3 {"Name":"Steve","Group":"","customernumber":"34567"}
The resulting array would contain Mike with details and so on.
A little help, please?
EDIT
I didn't choose to store the data like this; the CMS I'm working with stores custom form data this way.
I've got about 400 DB entries, and I thought of letting MySQL do the processing, since I don't know whether storing that many results in a PHP array would hurt performance: several users will view pages that use these results, causing fairly frequent requests.
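On MySQL 5.7+ this check can be pushed into SQL with the JSON functions, e.g. WHERE JSON_UNQUOTE(JSON_EXTRACT(data, '$.Group')) <> '' (the table and column names are assumptions). On older versions you can fetch the rows and filter in PHP instead; a minimal sketch of that approach, using the example rows from the question:

```php
<?php
// Given rows of JSON strings (as fetched from the DB), keep only those
// where the given key exists and holds a non-empty value.
function rowsWithValue(array $jsonRows, string $key): array
{
    $matches = [];
    foreach ($jsonRows as $json) {
        $row = json_decode($json, true);
        if (is_array($row) && isset($row[$key]) && $row[$key] !== '') {
            $matches[] = $row;
        }
    }
    return $matches;
}

// Example rows from the question:
$rows = [
    '{"Name":"Jane","Group":"","customernumber":"12345"}',
    '{"Name":"Mike","Group":"Sales","customernumber":"23456"}',
    '{"Name":"Steve","Group":"","customernumber":"34567"}',
];

$result = rowsWithValue($rows, 'Group'); // only Mike's row remains
```

At 400 entries either approach is cheap; the SQL-side version just avoids shipping rows you will discard anyway.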
I have an area that gets populated with about 100 records from a PHP/MySQL query.
I need to add AJAX pagination, meaning load 10 records first, then on a user click (say on a "+" character) populate the next 20.
So instead of displaying everything at once, the area will display 20, then on click the next 20, and so on, with no refresh.
Should I dump the 100 records into a session variable?
Should I run the MySQL query every time the user clicks the next-page trigger?
I'm unsure what the best approach would be.
My concern is that the database will grow from 100 to 10,000 records.
Any help or direction is greatly appreciated.
If you have a large record set that will be viewed often (but not often updated), look at APC to cache the data and share it among sessions. You can also create a static file that is rewritten when the data changes.
If the data needs to be sorted/manipulated on the page, you will want to limit the number of records loaded to keep the JavaScript from running too long. ExtJS has some nice widgets that do this; just provide them with JSON data (use PHP's json_encode on your record set). We made one talk to Oracle and do 20-record paging fairly easily.
If your large record set is frequently updated and the data must be "real-time" accurate, you have some significant challenges ahead. I would look at Comet, APE, or web workers for a polling/push solution, and build your API to only deal in updates to the "core" data, again probably cached on the server rather than pulled from the DB every time.
Your AJAX call should hit a page that pulls only the exact number of rows it needs from the database: select the first 20 rows of the query for the first page, and so on. The AJAX call can take a parameter called pagenum, and that determines which records you actually pull from the database. There is no need for session variables.
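As a sketch of that approach (the endpoint name, table, and page size are assumptions): the pagenum parameter maps straight to a LIMIT/OFFSET clause, so each AJAX request fetches exactly one page, and this keeps working unchanged as the table grows from 100 to 10,000 rows.

```php
<?php
// records.php — hypothetical endpoint called by the AJAX pager.
// Maps a 1-based pagenum to the OFFSET used in the SQL query.
function offsetForPage(int $pagenum, int $pageSize): int
{
    return (max(1, $pagenum) - 1) * $pageSize;
}

// In the endpoint itself (requires a live DB, shown for shape only):
//
//   $pdo      = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
//   $pageSize = 20;
//   $pagenum  = (int) ($_GET['pagenum'] ?? 1);
//   $stmt = $pdo->prepare('SELECT id, name FROM records ORDER BY id LIMIT :n OFFSET :o');
//   $stmt->bindValue(':n', $pageSize, PDO::PARAM_INT);
//   $stmt->bindValue(':o', offsetForPage($pagenum, $pageSize), PDO::PARAM_INT);
//   $stmt->execute();
//   header('Content-Type: application/json');
//   echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));
```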