Situation: a user clicks a link, the server gets the request and starts processing it. In the meantime, the user clicks another link, and the server receives the new request while it is still processing the first one. What happens? On the client side we only see the webpage from the second request, but is the process for the first request killed on the server when the second one arrives? And is that managed by the server or by the language (Apache or PHP)?
That depends. If the browser does not drop the connection to the server, the server will have absolutely no idea that the client has navigated elsewhere. If the browser does drop the connection, it is up to the web server whether to detect that and abort the processing thread or not.
In either case, this is the stateless nature of HTTP. You shouldn't rely on anything in this regard.
Both requests get served (if the browser did send the second one).
You would only see the second page, but if you look into the access_log you will surely notice two requests.
That's how HTTP works.
You can use ignore_user_abort() to tell a script to continue (or not) after the connection has been terminated.
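As a rough sketch of ignore_user_abort() in that situation (long_running_job() is just a made-up placeholder for whatever the first request was doing):

    <?php
    // Keep processing even if the client aborts the request, e.g. by clicking another link.
    ignore_user_abort(true);

    // Hypothetical stand-in for the real work of the first request.
    function long_running_job() {
        sleep(10);
    }
    long_running_job();

    // PHP usually only notices a dropped connection when it tries to send output;
    // connection_aborted() then reports whether the client has gone away.
    if (connection_aborted()) {
        error_log('Client went away, but the job finished anyway.');
    }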
Related
I have to solve this problem:
Explain what the following code snippets do, and for each specify
whether it triggers an interaction (of the client/browser) with the server or
it is executed at the client side.
(iv) <a href="page.html">click here</a>
(v) <a href="page.php">click here</a>
Now, these call page.html and page.php respectively from the web server. My instinct says this counts as an interaction (request-response) of the client with the server in both cases, but the way the question is laid out seems to suggest that perhaps only the PHP link truly counts as an interaction.
Could anyone confirm the correct interpretation and perhaps clarify what counts as an interaction with the server?
Both trigger an interaction with the server, as your instinct said.
Without knowing the server setup, you don't even know if the HTML-file is served unmodified, or if the request is being rewritten and processed by any server side scripting language.
The question explicitly says the other option is being executed at the client side, which is definitely wrong for both. The only exception would be when the click event is intercepted, but that requires more code on the client side and that would have to be part of the question.
The code itself does no server interaction. The browser uses this code to render a hyperlink; still no server interaction. As soon as the user clicks either link, the interaction between browser and server is triggered (no matter whether it is a .html or a .php file, the server receives the request and decides how to handle it).
If the page is opened locally as file://C:/.../xxx.html, the browser loads the file straight from the local filesystem without going through any HTTP server; whether that still counts as a server interaction is debatable. The question seems a bit unclear to me.
I have a Python script that listens on a specific port on a server and waits for an HTTP request (from a browser or a program) before it runs.
I want that script to run when the admin on my Joomla site clicks the save button.
That means I need a way to hit the URL the server listens on without showing it to the user (a way to do it in the background).
Any ideas?
Thanks!
PHP is server side, so anything implemented there would not be visible to your user, unless it gets rendered as HTML output.
Does the request need to come from the client machine? Or can it come from your web server?
If it needs to come from the client, I would recommend using PHP headers.
Script 1:

    header("Location: exec_something_private.php");

exec_something_private.php:

    exec("do something");
    header("Location: success.php");
    exit; // stop so nothing runs after the redirect
The above implementation would call a PHP file, do what you need to do, and then immediately redirect the user to a new page.
If the request can come from the web server itself, simply use exec() to run a curl command, or something similar.
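For the "come from your web server" case, here is a minimal sketch that talks to the listener with PHP's cURL functions rather than shelling out to curl; the URL http://127.0.0.1:9999/run is purely a placeholder for wherever your Python script actually listens, and you would call this from whatever hook runs on save (for instance a small Joomla plugin):

    <?php
    // Ping the Python listener from the web server; the admin never sees this request.
    $ch = curl_init('http://127.0.0.1:9999/run');    // placeholder address of the listener
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // keep the response off the page
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 2);     // fail fast so saving stays snappy
    curl_setopt($ch, CURLOPT_TIMEOUT, 2);
    curl_exec($ch);                                  // we don't care about the body here
    curl_close($ch);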
I have a really weird behavior going on.
I'm hosting tracking software for users that mainly logs mobile traffic. Now, the flow is as follows:
1. My client gets a PHP code snippet to put on his website.
2. This code sends a cURL POST (with predefined post fields like visitor IP, user agent, host etc.) to my server.
3. My server logs the data and decides what the risk level is.
4. It then responds to the client's server with the status; that is, it sends "true" or "false" back to the client's server.
5. The client's server gets that response and decides what to do (load different HTML content, redirect, block the visitor etc.).
The problem I'm facing is that, for some reason, all the requests made from my clients' servers to my server are recorded and stored in the log file, yet my clients report click loss: it is as if my server sends back the response but their servers fail to receive it.
I should note that there are tons of requests every minute, from different clients' servers and from each individual client.
Could the reason be related to CURLOPT_RETURNTRANSFER not getting any response? Or maybe the problem is cURL overload?
I really have no idea. My server is pretty fast, and uses only 10% of its resources.
Thanks in advance for your thoughts.
You have touched a very problematic domain - high-load servers. Your problem can be in so many places that you will really have to spend time to fix it, or at least partially fix it.
First of all, you should understand what is really going on. Check out this simplified scheme:
1. The client's PHP code tries to open a connection to your server; to do this it sends some data over the network to your server.
2. Your server (I suppose Apache) tries to accept it, if it has the resources - check the max connections settings in your Apache config.
3. If the server can accept the connection, it tries to create a new thread (or use one from a thread pool).
4. Once the thread is started, it runs your PHP script.
5. Your PHP script does some work, connects to the DB and sends the response back over the network.
6. The client waits for the answer from step 5, or closes the connection because of a timeout.
So, at each point you can have a bottleneck:
Network bandwidth
Max opened connections
Thread pool size
Script execution time
Max database connections, table locks, io wait times
Clients timeouts
And that is not a full list of the possible places where a problem can occur and finally lead to an empty cURL response.
From the very start, I suggest you add logging to both PHP codebases (the clients' and your server's) and store all curl_error() problems in a text file; at the very least you will see which problems occur most often.
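As a rough illustration of that logging on the client side (the URL, post fields and log path are placeholders, not your actual setup):

    <?php
    // Tracking call with basic error logging so timeouts, DNS failures etc. become visible.
    $ch = curl_init('https://tracker.example.com/check');       // placeholder endpoint
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, [
        'ip'        => $_SERVER['REMOTE_ADDR'] ?? '',
        'useragent' => $_SERVER['HTTP_USER_AGENT'] ?? '',
        'host'      => $_SERVER['HTTP_HOST'] ?? '',
    ]);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);              // receive "true"/"false" as a string
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 3);                 // fail fast instead of hanging the page
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);

    $response = curl_exec($ch);
    if ($response === false) {
        // Log the exact cURL failure for later analysis.
        file_put_contents('/var/log/tracker_client.log',
            date('c') . ' curl error ' . curl_errno($ch) . ': ' . curl_error($ch) . "\n",
            FILE_APPEND);
    }
    curl_close($ch);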
I have looked at a few topics (here & on Google) regarding detecting browser exit in PHP and I'm not really any clearer on how to do it.
I tried register_shutdown_function in PHP but that executed even when I just refreshed the page.
Can anyone help?
Thanks in advance
PHP is a server-side language; browser events are handled by DOM events. On the "onunload" or "onbeforeunload" events, send an AJAX call to a PHP file.
And in this other question there is a fuller explanation of what I'm saying.
Please explain what you want to do when the browser closes, to see if there perhaps is another way to do so.
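For reference, the PHP side of such an onunload AJAX call can be tiny; logout.php is just an assumed name here, and destroying the session is only one example of what "on close" might mean for you:

    <?php
    // logout.php (assumed name): hit by the browser's onunload/onbeforeunload AJAX request.
    session_start();
    session_destroy();   // or: flag the user as offline, write an audit record, etc.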
A web server sends its response to the browser, and then (usually) closes the connection. You'd have to do something in Javascript, but that won't catch all conditions. You certainly can't detect it serverside.
It can be detected using the Javascript onbeforeunload or onunload events, but that is absolutely not accurate, since it won't reliably detect or distinguish:
a browser crash
a browser exit
a computer shutdown
a link click on the page
when going Back in the browser
Also see this answer.
So for example when you want to log out users when they close the browser, you'd better use a "keepalive" mechanism instead of a "say goodbye" one. You can then log those users off on the server (e.g. using cron) whose sessions have not been active (i.e. who haven't sent a "keepalive") for more than X minutes.
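A bare-bones sketch of that keepalive idea, assuming (purely for the example) a sessions table with user_id and last_seen columns and placeholder database credentials:

    <?php
    // keepalive.php: the open page pings this every minute or so via AJAX.
    session_start();
    $pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');  // placeholders
    $stmt = $pdo->prepare('UPDATE sessions SET last_seen = NOW() WHERE user_id = ?');
    $stmt->execute([$_SESSION['user_id'] ?? 0]);

    // cleanup.php, run from cron: log off everyone who has been silent for over 10 minutes.
    // $pdo->exec("DELETE FROM sessions WHERE last_seen < NOW() - INTERVAL 10 MINUTE");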
I don't think there is any foolproof way to detect a browser close button in PHP or Javascript.
It is much safer and better to handle it via timer based polling OR just simple session timeouts on server side.
One solution that is at least fool resistant is to send a heartbeat to the server using Javascript and Ajax, then assuming that the browser window has been closed when the signal stops pulsing.
Another would be to use web sockets to maintain a constant connection until the browser window closes.
In any case it would take quite a bit of work on your part to set it up.
Not just with PHP.
PHP runs server-side, and is long done processing your page by the time the user has a chance to close their browser. With a specific configuration you could technically detect that PHP was still processing the page when the user closed it, but that is not ideal. See connection_aborted().
What you need to do is set up a long-polling connection with JavaScript and monitor it server-side. You will then get an idea of when that window is closed. That connection could be made to your PHP script, allowing PHP to check connection_aborted(). Note that you will need to call ignore_user_abort() for this to work, or configure php.ini accordingly.
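A minimal sketch of such a long-polling endpoint, assuming ignore_user_abort() is allowed and no output buffering gets in the way:

    <?php
    // Held open by JavaScript; notices when the window goes away.
    ignore_user_abort(true);   // keep running so we can react to the abort ourselves
    set_time_limit(0);

    while (true) {
        echo " ";              // PHP only detects a dead connection when it tries to send output
        flush();
        if (connection_aborted()) {
            error_log('Client closed the window (or navigated away).');
            break;
        }
        sleep(5);
    }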
I have a web page that contains an (external) SWF object that loads random content by making HTTP requests to its server. Is there any way I can implement a sort of observer for the page that stores all the HTTP requests that were made once the page has loaded?
I'd appreciate any help on the topic, I just need a point to start from; I don't even know if this is possible.
HTTP requests from a Flash file occur on the client, so if the request goes off to a different server (not yours) your server won't even know that it's occurred. You'd need to install something on the client to track it.