This question already has answers here:
What is the difference between client-side and server-side programming?
(3 answers)
Can Firefox's "view source" be set to not make a new GET request?
(12 answers)
Closed 5 years ago.
I have three web servers running identical code: one main, one failover, and one development server. Let's call them server1, server2 and server3.
If I load a page from server1, then view the source, the browser reloads the source from the server.
If I load the same page from server2 or server3 they both show the page source without reloading. This is true for both Chromium and Firefox.
The servers are running the same PHP scripts. They have, as far as I know, identical nginx/php-fpm installations.
It must be a server-side issue, because I am using the same browsers in all cases, just changing the IP of the domain from one physical server to another.
What directive could the server be sending to the browser to tell it not to reload? Not something set in the PHP code, because there is no difference in the PHP from one server to another. I am thinking it is something at the web server level?
If I reload the page it actually reloads. Only View Source avoids reloading on server2 and server3, which is the behaviour I want. server1 is the one where I want to force the browser not to reload on View Source.
Edit: I don't believe this is a duplicate, because the linked question is written by a newbie wanting to know why they can't execute a PHP script on the client side. My question is why an identical page gets reloaded on View Source when it is served by one server but not when it is served by a different server. Same browser; page served by the same PHP code on both servers.
Edit: As to the second "duplicate", thank you for pointing out the post on Firefox. I did read that post when I began researching, but the problem affects Chromium as well.
Related
Hoping someone has some ideas around what to do with this.
We have an application that's PHP-based, hosted in IIS.
There are a number of functions that need to run which can take 10+ minutes. The problem I have is that if I run one of these functions in my web browser and then open another tab and try to access the site while that is happening, the new tab just sits loading until the long process finishes, and only then does it load the page.
I guess this is more of a multi-session thing with my browser? Is there some easy option in IIS I can change that will let the other pages load as normal, or is this a browser thing?
It seems that if I open an InPrivate window at the same time, that will load normally.
The issue is related to the session. By default, PHP sessions use the file system, and each request has to wait for the session file to be released before it can open it. Therefore, subsequent requests for the same session wait on prior requests.
To resolve the issue you could try the following:
- Close the session as soon as you are done with it by calling session_write_close() (see the sketch below).
- Implement a custom session handler which uses the database instead of the file system.
Reference:
FastCGI on IIS7... multiple concurrent requests from same user session?
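For illustration, here is a minimal sketch of the first suggestion, releasing the session lock early in a long-running script; the function called at the end is a placeholder, not a real API:

<?php
// Hypothetical long-running script (sketch only).
session_start();

// Read whatever session data the job needs up front.
$userId = $_SESSION['user_id'] ?? null;

// Release the session file lock so other requests for the same
// session are no longer blocked while this job runs.
session_write_close();

// Long-running work goes here. Do not write to $_SESSION afterwards
// unless session_start() is called again.
do_long_running_job($userId); // placeholder for the 10+ minute task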
This question already has answers here:
How does PHP interact with HTML and vice versa?
(4 answers)
Closed 7 years ago.
When PHP is embedded in HTML what is happening?
Isn't it the case that the HTML response is interpreted in the browser? So how does the browser handle the PHP? Does it make a separate request?
Is PHP a server-side language that can be embedded in client-side languages?
Here is what happens:
Someone goes to your site in their browser. This triggers an HTTP request to your server.
Your server decides how it wants to handle the request. Let's say you're using Apache: by default, that means serving the index page within your DocumentRoot.
Let's assume your index page is index.php. On the server, all PHP code within index.php is executed once. After it has executed, the HTML result of that page is served to the client.
Once served to the client, the only thing that can modify the page is JavaScript. PHP only runs on the server. No PHP code will be sent to the client.
If your JavaScript wants to dynamically edit the page with information from the server without a reload, it can perform an AJAX request to the server. This entails the JavaScript making a network request to an endpoint (let's say, getNames.php). getNames.php runs on the server and returns its result (usually in the form of echo <something>) back to the JavaScript, which can then edit the page based on the received data.
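As a rough sketch only, an endpoint like the hypothetical getNames.php above might look like this on the server; the data is hard-coded here purely for illustration, where a real application would query a database:

<?php
// Hypothetical getNames.php (sketch). Returns data for the AJAX call.
header('Content-Type: application/json');

$names = ['Alice', 'Bob', 'Carol']; // placeholder data

echo json_encode($names); // this output is all the JavaScript ever sees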
Questions?
The browser issues an HTTP request to the server
The server reads the URL and resolves it (usually to a file with the same name)
The server recognises (usually by matching file extensions) the file as containing a PHP program
The server passes the program to the PHP interpreter, which compiles and executes it
The server sends the output of the program (usually along with some extra HTTP response headers) to the browser
The PHP source code is never sent to the browser. Only one request is made (unless the output of the PHP is, for instance, an HTML document that tells the browser to load (for example) images).
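As a small illustration (not part of the original answer), a minimal index.php could look like the sketch below; only the HTML it produces is ever sent to the browser:

<?php
// Hypothetical index.php (sketch). This PHP block runs on the server
// and never leaves it; the browser only receives the HTML below.
$now = date('Y-m-d H:i:s');
?>
<!DOCTYPE html>
<html>
<body>
<p>Page generated at <?php echo $now; ?></p>
</body>
</html>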
Is it possible for the same web page to be viewed by two different, remotely separated web browsers such that when input is entered into one browser the same data is displayed in the other browser? Think Google Docs (I know this works) or perhaps a document in SharePoint (I'm told this works).
PHP is a server-side scripting language. This means that when the client asks to view the page, the PHP script renders an HTML page that the client then receives. Any changes that are then made on the client side will not affect the file on the server.
The way of doing this would be (as already suggested by VWGolf2) to use JavaScript and upload the changes to the server once the client has made any changes. These can then be downloaded (using JavaScript) to the other client and then updated on their webpage.
You can of course write all of this in PHP on the server, but it will not be PHP that performs the actual logic on the client side; it will most likely be JavaScript.
You cannot solve this problem in PHP. You will need to use JavaScript and AJAX.
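If it helps, here is a minimal sketch of the server-side half of that approach: a single PHP endpoint the editing client POSTs changes to and other clients poll with AJAX. The file name state.php, the storage file and the content field are all made up for illustration:

<?php
// Hypothetical state.php (sketch): stores and serves the shared content.
$file = __DIR__ . '/shared-state.txt';

if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    // The editing client sends its changes here via AJAX.
    file_put_contents($file, $_POST['content'] ?? '', LOCK_EX);
    echo 'ok';
} else {
    // Other clients poll this branch and redraw their page with the result.
    header('Content-Type: text/plain');
    echo file_exists($file) ? file_get_contents($file) : '';
}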
In my localhost PHP file test.php, I use the PHP function file_get_contents to grab a forum index page.
echo file_get_contents('http://www.XX.com/forum.php');
When the forum data changes, such as new posts or member changes, and I refresh test.php, the content hasn't changed. I want to know why.
There are a number of possible reasons:
- You are behind a caching proxy server and are receiving a cached copy of the page (this can exist at the network or server level)
- The target site detects such requests and provides cached versions for performance or security reasons
- Your browser has cached the output of your script.
You will need to examine your configuration, talk to your network administrator, or check your own browser's cache to find the source of the problem.
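For what it's worth, here is a rough sketch of two common workarounds, assuming the stale copy comes from an HTTP-level cache; neither is guaranteed to help if the target site itself decides to serve cached pages:

<?php
// test.php (sketch): try to bypass cached copies of the remote page.

// 1. Ask intermediate caches not to serve a stored copy.
$context = stream_context_create([
    'http' => ['header' => "Cache-Control: no-cache\r\n"],
]);

// 2. Add a throwaway query parameter so the URL looks new each time.
$url = 'http://www.XX.com/forum.php?nocache=' . time();

echo file_get_contents($url, false, $context);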
Is there any way to save an external web page to a folder on my server, with all of the page's elements (JS files, images, CSS, etc.)? Like you can do in a browser with the save-complete-page option, but I need to do the saving with PHP into a folder on my server, and when I include this folder it should show the page as the original. Maybe with cURL or some PHP function?
How can I do that? Please help!
P.S. I'm doing this for a good reason, not for stealing content!
And when I'm finished with the operation and the function, I will empty the folder.
Are you saying that you want to visit a PHP site as a client (whether in a browser or via wget/curl/etc.) and then save all related server-side files?
Without access to the server, that is not possible. The server-side content (e.g. the PHP pages and possibly some other parts of the site) is interpreted by the server before you as a client see any part of the site. You need server-side access to the files in order to see what is there before the code is interpreted.
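What is possible from the client side is saving the rendered output, i.e. the HTML the server actually sends. A rough sketch with cURL, where the URL and the target folder are placeholders:

<?php
// Sketch: fetch the rendered page (what a browser would receive).
$ch = curl_init('http://example.com/page.php'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$html = curl_exec($ch);
curl_close($ch);

// Only the generated HTML is saved; the PHP source that produced it
// never reaches the client. Images, CSS and JS would each need their
// own requests to be saved alongside it.
file_put_contents(__DIR__ . '/saved/page.html', $html); // placeholder folder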