In my web application, I store some data in the browser's local storage. This data forms part of the following SQL query, which users may run many times in a session:
SELECT * FROM table WHERE id NOT IN (...local storage data) ORDER BY RAND() LIMIT 1
To avoid posting the local storage data each time the request is made, can I instead create a temporary table to hold it, one that lasts until the user leaves my site?
For me, it is much easier to store the data like this:
<?php
$_SESSION['variable'] = "data";
?>
Save the data to the server simply via the POST method; some additional JavaScript can make this feel real-time.
Then use the $_SESSION value to insert the data into the database or a CSV file.
Yes, this requires another PHP page to handle that request, because a PHP page cannot be updated with real-time data: once the page is completely loaded, the script is done and never performs another task.
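A minimal sketch of that idea, assuming the excluded IDs are POSTed once per session in a field named excluded_ids (the field name and database credentials are hypothetical):
<?php
session_start();

// Store the exclusion list once, when the client first POSTs it
if (isset($_POST['excluded_ids'])) {
    // Cast to integers so the list is safe to embed in SQL
    $_SESSION['excluded_ids'] = array_map('intval', (array) $_POST['excluded_ids']);
}

// Every later request reuses the list from the session instead of reposting it
$excluded = $_SESSION['excluded_ids'] ?? [0];
$idList = implode(',', $excluded);

$mysqli = new mysqli('localhost', 'user', 'pass', 'db'); // assumed credentials
$result = $mysqli->query(
    "SELECT * FROM `table` WHERE id NOT IN ($idList) ORDER BY RAND() LIMIT 1"
);
$row = $result->fetch_assoc();
?>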
I am using core PHP & MySQL to build a project. I have a registration form that is divided into multiple forms (say 4 to 5), each collecting details from new users, and I want to save the users' data to the database tables only at the final submission. So my question is: where should I keep the data collected from the earlier forms?
Situation: if I use session variables to store the form data, it will take too many server resources, or worse, many users registering at the same time could slow the server down.
Please suggest a solution.
Store all the data in a database and just match it against the user's session ID.
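A minimal sketch of that approach, assuming a hypothetical registration_drafts table with columns session_id, step, and data, and a unique key on (session_id, step):
<?php
session_start();
$mysqli = new mysqli('localhost', 'user', 'pass', 'db'); // assumed credentials

// Save the fields of the current step, keyed by the user's session ID
$stmt = $mysqli->prepare(
    "INSERT INTO registration_drafts (session_id, step, data)
     VALUES (?, ?, ?)
     ON DUPLICATE KEY UPDATE data = VALUES(data)"
);
$sid  = session_id();
$step = (int) ($_POST['step'] ?? 1);
$data = json_encode($_POST); // one JSON blob per form step
$stmt->bind_param('sis', $sid, $step, $data);
$stmt->execute();

// At the final submission, read all the steps back by session_id and
// insert the combined data into the real tables.
?>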
You can also collect all the fields in JavaScript and send them to the server in one AJAX request. Another solution is to submit each form individually and use SQL transactions.
Hello all. I have a website with a side panel that displays the same thing on every page;
I mean the data it fetches from the database doesn't change often.
For example, I have a panel on the left side of my site displaying recent birthdays, which only change once a day. The problem is that every time a page is refreshed, the same data has to be fetched from the database again and again, which increases the database load and also the page load time.
I was wondering if there is any way to render the sidebar once the query is executed and, after that, fetch the result from the session, a cache, or whatever the solution may be.
I think it can be done with a cache like memcache, but I don't know how.
I also tried storing the results in the session and in cookies, but I get an error saying objects cannot be stored in the session, and I also have to loop over the results obtained by the query. The query looks like this:
$birthday_query = mysqli_query($connection, "SELECT name, email, id, date FROM members WHERE dob 'some code here' ");
// this code fetches about 30-40 rows, and I use a loop to display the results.
Yes, you can use memcache to store the information. This is your best option, as the data stays in the server's memory.
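A minimal sketch using the Memcached extension (the server address, cache key, credentials, and the WHERE clause placeholder are assumptions):
<?php
$memcached = new Memcached();
$memcached->addServer('127.0.0.1', 11211);

$rows = $memcached->get('recent_birthdays');
if ($rows === false) {
    // Cache miss: run the query once, then keep the rows for a day
    $connection = mysqli_connect('localhost', 'user', 'pass', 'db');
    $result = mysqli_query($connection, "SELECT name, email, id, date FROM members WHERE dob /* some code here */");
    $rows = mysqli_fetch_all($result, MYSQLI_ASSOC);
    $memcached->set('recent_birthdays', $rows, 86400);
}

foreach ($rows as $row) {
    echo htmlspecialchars($row['name']); // render the sidebar entries
}
?>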
You can also use something very simple, like a temp file in which to keep the recent-birthdays information as ready-made HTML. Update the information in this file once a day and read it from there. Alternatively, for every user you can fetch the information once (when they first come to the website), then store it in the session and read it from there.
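A minimal sketch of the temp-file variant (the cache path and the WHERE clause placeholder are assumptions):
<?php
$cacheFile = __DIR__ . '/cache/birthdays.html';

if (!file_exists($cacheFile) || time() - filemtime($cacheFile) > 86400) {
    // Rebuild the sidebar HTML at most once a day
    $connection = mysqli_connect('localhost', 'user', 'pass', 'db');
    $result = mysqli_query($connection, "SELECT name, email, id, date FROM members WHERE dob /* some code here */");
    $html = '<ul>';
    while ($row = mysqli_fetch_assoc($result)) {
        $html .= '<li>' . htmlspecialchars($row['name']) . '</li>';
    }
    $html .= '</ul>';
    file_put_contents($cacheFile, $html, LOCK_EX);
}

echo file_get_contents($cacheFile); // serve the cached sidebar
?>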
For just 30-40 results you don't need caching; it will not add great value. A good approach is to store the data in the session as soon as the user browses your first page and then always fetch it from the session. Using a file to cache the HTML is also a good bet here.
As for the error: check how to store an object in the session; a mysqli result object cannot be serialized, so fetch the rows into a plain array first.
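A minimal sketch of that session approach (credentials and the WHERE clause placeholder are assumptions):
<?php
session_start();

if (!isset($_SESSION['birthdays'])) {
    $connection = mysqli_connect('localhost', 'user', 'pass', 'db');
    $result = mysqli_query($connection, "SELECT name, email, id, date FROM members WHERE dob /* some code here */");
    // Plain arrays serialize fine; mysqli_result objects do not
    $_SESSION['birthdays'] = mysqli_fetch_all($result, MYSQLI_ASSOC);
}

foreach ($_SESSION['birthdays'] as $row) {
    echo htmlspecialchars($row['name']);
}
?>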
I am using WordPress for this site. I have order data from users that is retrieved by get-order.php.
So on the order page, the data is retrieved by requiring the get-order.php file. Users can upload a file on this page. When they upload the file and submit, I use AJAX to send the file to upload-file.php for validation and database changes.
I am wondering whether it would be more efficient to add the already-received data on the order page to the data set sent by AJAX, or to require get-order.php inside upload-file.php. I am guessing posting is more efficient, since you won't have to run the query again; I just want to make sure.
I am not sure how sensitive the "order data" is, but I wouldn't trust the posted data, since it can be easily manipulated.
Either get the data again by including get-order.php, or put the order data in $_SESSION, where it can be accessed across the script.
I would personally prefer the session route.
Using sessions in PHP is very easy:
session_start(); //Initializes the session
$order_data = get_data(); //function or method that gets the order data
$_SESSION['order_data'] = $order_data;
//In some other place
$order_data = $_SESSION['order_data'];
I want to temporarily store a series of arrays which will be used by the next request. The stored information contains some sensitive data, which is used for navigating around that page with AJAX calls. The data differs from page to page, so I just need to store it temporarily while the user is on that page.
First, I tried to do it with the cache: Cache::put($dynamickey, $multiArray, 20);. But this results in a huge amount of "junk" cache entries inside the storage folder, even after they have expired.
So I tried session flash: Session::flash($dynamickey, $multiArray);. This works when the user has only one tab of the website open, but if the user has multiple tabs open, it breaks.
For example:
1. The user browses the website in tab 1.
2. Then the user browses the website in tab 2. As soon as they do, the session data for tab 1 is removed.
3. The user comes back and navigates the tab 1 content. The system breaks and stops working.
How can I store temporary data that is deleted once it is no longer required, but that also works well with multiple tabs?
Thank you.
So, on the page that actually sets the session data, you will need to generate a dynamic key which you can also recover when the AJAX call is made. So:
Session::put($dynamicKey, $data);
Since the server doesn't know whether you have multiple tabs open (it just processes more requests), we need to distinguish AJAX requests from standard ones. This can be achieved via:
if (Request::ajax())
{
if (Session::has($dynamicKey)) {
Session::forget($dynamicKey);
// Do your application logic
}
}
So the session entry will not be removed until an AJAX request arrives, at which point you can regenerate the key. If you cannot regenerate the key from the data provided, then you cannot tell two different requests apart, so you will need to get the key to the client side somehow, such as by echoing it into a bit of JavaScript.
The AJAX call can then include this key in the request, where your server can pick it up and find the correct session entry for that tab, as sketched below.
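A minimal sketch of that flow, in Laravel 4 facade syntax (the key prefix, view name, and input field are hypothetical):
// When rendering the page: make a key unique to this tab and hand it to the view
$tabKey = 'page_data_' . Str::random(16);
Session::put($tabKey, $multiArray);
return View::make('page', ['tabKey' => $tabKey]); // echo $tabKey into this tab's JavaScript

// In the AJAX route: the tab sends its own key back, so tabs no longer collide
if (Request::ajax()) {
    $tabKey = Input::get('tab_key');
    if (Session::has($tabKey)) {
        $data = Session::get($tabKey);
        // ... application logic using $data ...
        Session::forget($tabKey); // discard once no longer required
    }
}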
Hope you understand this.
So we are building a website and wrote the basic code to send logins to our database. We are having trouble disallowing requests that just plug their own data in.
E.g.
http://testing.site.com/php/interfaces/User.php?Action=1&Email=test#gmail.com&FirstName=herp%20derp
By replacing the email and first name, someone can add multiple users to the database, and with a script potentially thousands. Is there any way to prevent this without using a captcha? We are trying to keep the site's design very minimal and open, so we would love some input on whether this is possible.
One option we have considered is moving our PHP offline and only allowing our API to access it; however, this still leaves the problem of users injecting their own data (and overloading our database with thousands of requests).
Here is a sample option: create a table with two fields, an auto-increment id and a random code; let's name them ID and CODE.
Before sending the request, create one record in that table and pass its ID and CODE along with the request. When receiving the request, check whether a record with that ID and CODE exists in the database; if it does, process the request and delete the record too, and if there is no such record, just ignore the request. A sketch follows.
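A minimal sketch of that idea (the table name, columns, and credentials are assumptions):
<?php
$mysqli = new mysqli('localhost', 'user', 'pass', 'db'); // assumed credentials

// 1. When rendering the form: create a token row and embed ID and CODE in the page
$code = bin2hex(random_bytes(16));
$stmt = $mysqli->prepare("INSERT INTO request_tokens (code) VALUES (?)");
$stmt->bind_param('s', $code);
$stmt->execute();
$id = $mysqli->insert_id; // send $id and $code along with the request

// 2. When the request comes back: delete the matching row, and only proceed
//    if a row was actually deleted; a forged or replayed token matches nothing
$stmt = $mysqli->prepare("DELETE FROM request_tokens WHERE id = ? AND code = ?");
$reqId   = (int) ($_GET['ID'] ?? 0);
$reqCode = (string) ($_GET['CODE'] ?? '');
$stmt->bind_param('is', $reqId, $reqCode);
$stmt->execute();
if ($stmt->affected_rows === 1) {
    // valid one-time token: process the request (e.g. create the user)
} else {
    // unknown or already-used token: ignore the request
}
?>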