In one file I run a query and get a result.
index.php (outputs the links to the browser):
foreach ($conn->query($sql) as $info) {
    $output_html = '<a href="addsearch.php?id=' . $info['id'] . '" target="_blank">' . $info['title'] . '</a>';
    print($output_html . '<br>');
}
$_SESSION['info'] = $info; // this is how it works now
In the other file I'd like to get a copy of the result without using the GET, POST, or SESSION mechanisms. (I don't need GET/POST, as the data already stays on the server, in nearby RAM. I'd also rather not use the SESSION variable, as it uses the HDD.)
addsearch.php (runs only when the user clicks a link):
session_start();
print_r($_SESSION['info']); // works now
...
Are there other methods to get the data? Any global RAM variable, cache, or shared resource common to both files?
I tried the first example from the PHP manual:
<?php
$a = 1;
include 'b.inc';
?>
but it doesn't work :-) because I launch the files separately, so they run in different processes.
PHP isn't restricting anything, it doesn't know about that data in the first place. There are two parts to understanding this:
PHP is a shared-nothing architecture.
HTTP is stateless.
The first is a design decision of PHP: every request receives a completely new environment, so data isn't held in memory between HTTP requests from users. This makes the language much more predictable, because actions on one page have very few side-effects on another.
The second, however, is more fundamental: even if PHP stored data between requests, it would be storing them in one pot for every user that accessed your site. That's because HTTP doesn't have any native tracking of "state", so two requests from the same user look fundamentally the same as two requests from different users.
That's where cookies and sessions come in: you send a cookie to the user's browser with an ID, and you tie some data to that ID, stored somewhere on your server. That somewhere doesn't need to be on disk - it could be in a memory store like memcache or Redis, in a database, etc - but because of PHP's "shared nothing" model, it can't just be in a PHP variable.
Another relevant concept is caching: storing (again, on disk, in a memory store, etc) the results of slow computations, so that when asked to do the same computation again you can just look up the answer. Whereas a session is good for remembering what the customer puts in their shopping cart, caching is good for displaying the same set of search results to every customer that enters the same search.
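The caching idea above can be sketched with a minimal file-based cache. This is an illustration, not a standard API: the function names and the temp-directory key scheme are assumptions; a memory store like Redis or memcached works the same way conceptually.

```php
<?php
// Minimal file-based cache sketch: reuse a stored result
// if it is younger than $ttl seconds, otherwise recompute.
function cache_get(string $key, int $ttl): ?string
{
    $file = sys_get_temp_dir() . '/cache_' . md5($key);
    if (is_file($file) && (time() - filemtime($file)) < $ttl) {
        return file_get_contents($file);
    }
    return null; // miss: absent or expired
}

function cache_set(string $key, string $value): void
{
    file_put_contents(sys_get_temp_dir() . '/cache_' . md5($key), $value);
}

// Usage: only do the expensive work on a cache miss.
$query   = 'red shoes';
$results = cache_get('search_' . $query, 300);
if ($results === null) {
    $results = 'expensive search for ' . $query; // stand-in for the real work
    cache_set('search_' . $query, $results);
}
```

Every customer who searches for the same term within five minutes gets the stored result instead of triggering the computation again.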
Related
I use the simple file_get_contents function to grab data from another site and place it on mine.
<?php
$mic1link = "https://example.com/yyy.html";
$mic2link = "https://example.com/zzz.html";
$mic3link...
$mic4link...
$mic5link...
$mic6link...
?>
<?php
$content = file_get_contents($mic1link);
preg_match('#<span id="our_price_displays" class="price" itemprop="price" content=".*">(.*)</span>#Uis', $content, $mic1);
$mic1 = $mic1[1];
?>
<?php
$content = file_get_contents($mic2link);
preg_match('#<span id="our_price_displays" class="price" itemprop="price" content=".*">(.*)</span>#Uis', $content, $mic2);
$mic2 = $mic2[1];
?>
And the values are output with
<?php echo $mic1; ?> and <?php echo $mic2; ?>
It works, but it impacts performance (delay).
Is there any way to optimize this script or maybe another way to achieve this?
Firstly, as others have said, use the Guzzle library for this instead of file_get_contents(). This will help, although ultimately you will always be constrained by the performance of the remote sites.
If at all possible, try to reduce the number of HTTP requests you have to make: can the remote site aggregate the data from all the requests into a single one? Or are you able to obtain the data via other means (e.g. direct requests to a remote database)? The answers here will depend on what the data is and where you're getting it from, but look for ways to achieve this, as those requests are going to be a bottleneck to your system no matter what.
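If the requests can't be aggregated, you can at least issue them in parallel instead of one after another, so the total wait is roughly the slowest request rather than the sum of all of them. A sketch using PHP's built-in curl_multi API (the array keys and function name are illustrative):

```php
<?php
// Fetch several URLs concurrently with curl_multi instead of
// sequential file_get_contents() calls.
function fetch_all(array $urls): array
{
    $mh = curl_multi_init();
    $handles = [];
    foreach ($urls as $key => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        curl_multi_add_handle($mh, $ch);
        $handles[$key] = $ch;
    }

    // Drive all transfers until none are still active.
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh); // wait for activity instead of spinning
        }
    } while ($active && $status === CURLM_OK);

    $results = [];
    foreach ($handles as $key => $ch) {
        $results[$key] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}

// e.g. $pages = fetch_all(['mic1' => $mic1link, 'mic2' => $mic2link]);
```

Guzzle's async requests achieve the same effect with a nicer interface; curl_multi is shown here because it needs no extra dependency.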
If the resources are static (ie they don't change from one request to another), then you should cache them locally and read the local content rather than the remote content on every page load.
Caching can be done either the first time the page loads (in which case that first page load will still have the performance hit, but subsequent loads won't), or done by a separate background task (in which case your page needs to take account of the possibility of the content not being available in the cache if the page is loaded before the task runs). Either way, once the cache is populated, your page loads will be much faster.
If the resources are dynamic then you could still cache them as above, but you'll need to expire the cache more often, depending on how often the data are updated.
Finally, if the resources are specific to the individual page load (eg time-based data, or session- or user-specific) then you'll need to use different tactics to avoid the performance hit. Caching still has its place, but won't be anything like as useful in this scenario.
In this scenario, your best approach is to limit the amount of data being loaded in a single page load. You can do this in a number of ways. Maybe by giving the user a tabbed user interface, where he has to click between tabs to see each bit of data. Each tab would be a different page load, so you'd be splitting the performance hit between multiple pages, making it less noticeable to the user, especially if you've used caching to make it seamless when he flips back to a tab he previously loaded.
Alternatively, if it all needs to be on the same page, you could use Ajax techniques to populate the different bits of data directly into the page. You might even be able to call the remote resources directly from the JavaScript in the browser rather than loading them in your back-end PHP code. This would remove the dog-leg of the data having to go via your server to get to the end user.
Lots to think about there. You'll probably want to mix and match bits of this with other ideas. I hope I've given you some useful tips though.
I have a question about PHP and how it works on the server side.
I'm thinking about creating an array of users in PHP and then accessing it. When a user visits the website, my script will push the username onto the array. Can I access this array on another page without using a database to store the usernames?
Without a database, use sessions. Add session_start() at the top of every page where you want to access your users array, and then $_SESSION["session_variable_name"] = $users_array to assign your array to a session variable. Then if you want to use it, just access it like a usual variable: echo $_SESSION["session_variable_name"][0].
But using a database would be much more appropriate. Session variables are accessible only within their own session, so if a new user is added to the array, only the client who added it will see it. Each user has his own session, where the user array may be completely different from what others see.
What I'd do: after a successful login to the system, assign the username to a session variable, and then if I want to perform a task for that user, say, get his email address from the database, I'd make an SQL query selecting the email from users where the username equals the username stored in the session.
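That lookup can be sketched as follows. SQLite stands in for your real MySQL connection purely so the example is self-contained, and the table and column names are assumptions:

```php
<?php
// Keep only the username in the session; query the database for
// anything else you need about that user.
// session_start(); // required at the top of a real page

$pdo = new PDO('sqlite::memory:'); // stand-in for your MySQL connection
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE users (username TEXT PRIMARY KEY, email TEXT)');
$pdo->exec("INSERT INTO users VALUES ('alice', 'alice@example.com')");

$_SESSION['username'] = 'alice'; // normally set once, at login time

// Prepared statement: never interpolate session data into SQL directly.
$stmt = $pdo->prepare('SELECT email FROM users WHERE username = ?');
$stmt->execute([$_SESSION['username']]);
$email = $stmt->fetchColumn();
```

The session carries only the small, stable identifier; everything else stays in the database where every request (and every server) can see it.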
No. This won't work as you described. You can either use a database (typical with PHP is MySQL but there are many other options) or a file (JSON or any of a number of other formats). But the key is understanding why it won't work:
I think you are looking at PHP on the server as one system serving many users. However, from the perspective of PHP, each user is seeing a fresh and separate instance of the PHP system. Depending on how the server is actually configured, there may or may not be multiple PHP processes running at the operating system level. But at the application level, each instance (run) of a PHP program is totally independent, with its own copy of all local (RAM) data storage - including arrays. That is NOT necessarily the case in all languages, but for practical purposes you can treat a PHP server process as if it were a separate server brought online to serve the one current user, then rebooted to serve the next user.
With that setup, it should be clear that the only way for the server process serving the 2nd user to see what the 1st user did is if the process serving the 1st user stored the information in non-volatile storage - e.g., MySQL or a file on disk.
The advantages of a database over a simple file are that it allows fast & easy queries of the information and hides most of the little details of cooperation between the different server processes - e.g., if there are 10 users all accessing the server at basically the same time, the individual PHP server processes don't have to worry about who can write the file "now".
In addition to the problem of each server process seeing (or not seeing) the data collected from other users, even within a single user the data is not available. Each PHP page request, including regular POST or GET and AJAX calls too, starts a fresh PHP instance (again, this is logically speaking - the server code may actually be resident and not reloaded, but it runs "fresh" each time), so with the exception of session data (as others have rightly suggested) there is no data carried over between pages even for an individual user. Session data is normally limited in size and often is just used as a reference into server-side data stored in files or a database.
First you need to prepare the user details array, as below:
$userDetailArr = array(
    'username' => 'foobar',
    'id'       => 1,
    'email'    => 'foobar@foo.com'
);
Now call session_start(); and store the above array in the $_SESSION variable:
$_SESSION['userdetail'] = $userDetailArr;
You can use $_SESSION['userdetail'] anywhere you want, but make sure to call session_start() at the top of the page before using any session variable.
I've been looking into the problems of having persistent data available between pages in PHP. This particularly applies to objects that have been set up in one page that need to be accessed later. It seems this is more difficult than I assumed it would be, but there are several ways this could be done, although they all seem a bit awkward to use especially when the data gets quite complex:
Passing the data via $_GET or $_POST to the next page
Copying the data to a database and retrieving it in the next page
Putting the data in a session or cookie
Serializing the object and recreating it with the same parameters and values
These all seem quite laborious as they mostly rely on having to deconstruct your existing data structure and then rebuild it again on the next page. I assume this is to reduce memory requirements of the PHP server by purging data from one page as soon as it's closed and starting with a 'clean slate'.
Is there a more direct way of passing larger data structures between pages in PHP?
Many thanks,
Kw
I assume this is to reduce memory requirements of the PHP server by purging data from one page as soon as it's closed
Nope, this is not because of a memory-efficiency concern. It is because the HTTP protocol is stateless. Each request must carry all the information necessary to fulfill it.
Counter-example to your proposed scenario:
1. Let's suppose Alice visits page A; some objects are created, and you want them to be available on page B.
2. You see a visit to page B.
2.1. But it's not Alice, it's Bob. How do you determine which objects to show, and where do you get them from?
2.2. It is Alice again, but the request arrived at another machine in your 1000-server farm. Naturally, that machine doesn't have the original PHP objects. What do you do now?
If you use $_GET or $_POST you are limited to non-sensitive data and you expose your objects to any user. You don't want that.
Cookies are limited in size: usually to 4096 bytes per cookie, and you can't store more than 20 cookies per site.
The best way to persist objects between requests (for the same user) is to use Sessions. There are already session save handlers for memcached, redis, mysql etc. You can also write your own if you need something custom.
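For instance, with the phpredis extension installed, pointing PHP's built-in session machinery at Redis instead of disk files is just configuration (the host and port here are assumptions for a local Redis):

```ini
; php.ini — store session data in Redis instead of files on disk
session.save_handler = redis
session.save_path = "tcp://127.0.0.1:6379"
```

Your code keeps using $_SESSION exactly as before; only the storage backend changes, which also makes sessions shareable across multiple web servers.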
My website sends curl requests to an external service and gets XML responses.
The requests are user-specific and the responses are rather heavy (with several requests on the same page), so the page takes time to load and uses too much of the server's traffic.
How I tried to solve the problem:
Sending the requests from the client side (JS). Unluckily for me, it becomes rather messy to parse the received data and integrate it into the page's objects.
Putting the responses in the session (as they are user-specific). The session files on the server grow too large too fast. I implemented a counter that erases all the responses from the session if their number gets too big (using this now).
Memcache? Too much data to save.
Do you think I should use one of the solutions or is there another way to do it?
Use a combination of
cache
database
You push things into your "data store" (the cache plus the database). When you need the data, you look it up in your data store: it checks the cache first and returns the value if it's there; if not, it looks in the database; and only if both fail do you fetch the information from the source again.
You could also increase the size of the cache (but that is not a good solution).
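The lookup order described above can be sketched like this. Plain arrays stand in for memcache and MySQL, and $fetch stands in for the slow remote request; the function name is illustrative:

```php
<?php
// Tiered lookup: cache -> database -> origin, writing back on the way out
// so the next request finds the value at the fastest level.
function datastore_get(string $key, array &$cache, array &$db, callable $fetch)
{
    if (array_key_exists($key, $cache)) {
        return $cache[$key];                 // fastest: in-memory cache
    }
    if (array_key_exists($key, $db)) {
        return $cache[$key] = $db[$key];     // promote to cache for next time
    }
    $value = $fetch($key);                   // last resort: the real source
    $db[$key] = $cache[$key] = $value;       // store at both levels
    return $value;
}
```

With real backends you'd swap the array accesses for memcached/Redis calls and SQL queries, but the control flow is the same.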
Try something like this:
$key = "User_id_" . $user_id . "_category_" . $category_id;
Then store each piece of data under this key:
$memcache->set($key, $data, 0, 3600);
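The other half of the pattern is checking the cache before doing the expensive work. A sketch of a get-or-compute helper (the name remember() is illustrative; any cache object with Memcache-style get/set works):

```php
<?php
// Get-or-compute: only do the expensive work on a cache miss,
// then store the result for the next hour of requests.
function remember($cache, string $key, int $ttl, callable $produce)
{
    $value = $cache->get($key);
    if ($value === false) {                 // Memcache::get returns false on a miss
        $value = $produce();
        $cache->set($key, $value, 0, $ttl); // (key, value, flags, expiry)
    }
    return $value;
}

// Usage with the key scheme above:
// $data = remember($memcache, $key, 3600, fn () => load_user_category($user_id, $category_id));
```

Note the false-on-miss convention means you can't reliably cache the literal value false; wrap such values in an array if you need to.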
I'm building an online quiz script in PHP. The user needs to answer 50 questions in 45 minutes.
After that time it should close the page or submit the answers to the next page.
Is it better to use cookies or sessions? How can I do that?
I'm a novice with sessions, so could you suggest suitable code?
Awaiting the earliest reply.
I assume, as this is a quiz, you'll count points, record ranks, etc. So your users will eventually try to cheat.
Therefore, I would recommend sessions, which are server-side only. $_SESSION is an array, like $_GET and $_POST, unique to every user using your website. You can put and retrieve anything whenever you want.
The only thing on the client side is a special cookie, called PHPSESSID, which is your visitor's ID, used by PHP to retrieve his $_SESSION array.
The only thing you have to do is begin every page with session_start();, before any output (except if you use buffering like ob_start()).
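For the 45-minute limit specifically, a common approach (a sketch; the session key and helper name are illustrative) is to record the start time in the session and check the remaining time on every request. Any countdown shown in the browser is cosmetic only, since the user can tamper with it:

```php
<?php
// session_start(); // must be the first thing on every quiz page

// Record the start of the quiz once, server-side.
if (!isset($_SESSION['quiz_started_at'])) {
    $_SESSION['quiz_started_at'] = time();
}

// Pure helper: seconds remaining out of a 45-minute (2700 s) limit.
function quiz_seconds_left(int $startedAt, int $now, int $limit = 2700): int
{
    return max(0, $limit - ($now - $startedAt));
}

// On every later request, trust only the server-side clock:
if (quiz_seconds_left($_SESSION['quiz_started_at'], time()) === 0) {
    // Time is up: grade what was already submitted, ignore further answers.
}
```

Because the start timestamp never leaves the server, restarting the browser or editing cookies doesn't buy the user more time.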
The main difference between cookies and sessions is where the data is stored.
With cookies, you send the data to the browser, and the browser keeps sending it back to you with every request thereafter.
With sessions, you're storing the data on the server, and then just setting one cookie that has an ID to identify the chunk of space on the server where the data is stored.
The crucial difference is that when the data is stored in cookies:
it can be edited by the user
it can be seen on the network as requests are made
it adds to the weight of each request in additional bandwidth required
it takes up less server memory
When data is stored in the session:
it can't be accessed by the user without going through you
it's not sent back and forth with each request (only the session ID cookie is)
but it takes up memory on the server
it can cause issues on larger sites when needing to move to multiple web servers
I would say it depends on scale. For a lot of questions, those cookies will get heavy and make each request very large. If your quiz is running in an environment that is spread across multiple front-end web servers, sessions might be out of the question.
I suspect the deciding factor is going to be the integrity of the quiz though. If it's crucial that the user can't change the data (such as previous answers, a running score or a timestamp for the start of the quiz) then you'll need to store the data out of their reach, which means using sessions.
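If some state does have to live in a cookie despite all that, you can at least sign it so tampering is detectable. A sketch using an HMAC (the function names are illustrative, and managing the secret key is up to you):

```php
<?php
// Sign a cookie value so the user can read it but not silently edit it.
function sign_value(string $value, string $secret): string
{
    return $value . '.' . hash_hmac('sha256', $value, $secret);
}

// Returns the original value, or null if the signature doesn't match.
function verify_value(string $signed, string $secret): ?string
{
    $pos = strrpos($signed, '.');
    if ($pos === false) {
        return null;
    }
    $value = substr($signed, 0, $pos);
    $mac   = substr($signed, $pos + 1);
    return hash_equals(hash_hmac('sha256', $value, $secret), $mac) ? $value : null;
}

// e.g. setcookie('quiz_state', sign_value('score=10', $secret));
```

This prevents modification, not reading; anything the user must not even see still belongs in the session.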