I'm trying to force a (second) page to reload when a criterion is met in PHP.
But if the criterion is met, I want the page to force-reload everywhere, even if 10 people have it open at once, for example.
For simplicity, let's say the code is like this:
in /filelocation/script.php:
if ($data == "ok") {
    // reload/refresh "reload.php" if it's open somewhere
}
I came across a piece of software that basically does this, and I want to understand how it is done.
(It works across devices somehow, so I assume it's done through PHP somehow.)
Well, in your PHP code you cannot simply reload/refresh something for all the connected users. This is because PHP code is only executed when a browser requests a page from the server: it runs just long enough to build the HTML response and then stops executing. Once the browser has the HTML response, it renders the page and then waits for an action from the user before doing anything else (such as clicking on a link or submitting a form).
I imagine that when a specific user does something, like posting a comment or buying a product, you would like all the other visitors to be notified that a new comment has been posted or that the number of available products has been reduced.
To do that, you need to implement some JavaScript which is executed in the browser of each visitor. The idea is to keep a connection open with the server with the help of WebSockets. This way, the server can inform the browser that something has changed.
You can google for examples of PHP apps using WebSockets. The first example I found:
https://www.twilio.com/blog/create-php-websocket-server-build-real-time-even-driven-application
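For a rough idea of the browser side, here is a minimal sketch. The ws://example.com/ws endpoint and the "reload" message format are assumptions; the server side would come from a PHP WebSocket library such as Ratchet.

var socket = new WebSocket("ws://example.com/ws");

socket.addEventListener("message", function (event) {
    // If the server broadcasts "reload" to every connected client,
    // each open page can refresh itself immediately.
    if (event.data === "reload") {
        window.location.reload();
    }
});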
Another solution could be to have some JavaScript do polling, meaning that every N seconds it executes an Ajax request to the server to ask whether something has changed. This can be done with setInterval(yourFunction, 10000), which calls a JavaScript function every 10 seconds. That function does the Ajax request and then updates the part of your page that needs to change. Just be careful: if you get a lot of users on your site, this produces quite a lot of load on your server. So it wouldn't be a good solution at scale, but it is a simpler alternative to WebSockets.
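As a rough sketch of that polling loop (check.php is a hypothetical endpoint that returns {"reload": true} once your criterion is met):

setInterval(function () {
    fetch("check.php")
        .then(function (response) { return response.json(); })
        .then(function (data) {
            if (data.reload) {
                window.location.reload(); // every open page reloads itself
            }
        });
}, 10000); // ask the server every 10 seconds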
Related
I am currently creating a stock market simulation and am working on the moment when the user logs into the simulation. I have a PHP script that generates a certain price for a company four times, updating it in my MySQL database as it runs. I currently have the following code:
PHP:
if (isset($_SESSION['userId']))
{
    $isPlaying = 0;
    while ($isPlaying <= 3)
    {
        // Generate a random price between 33.00 and 37.00
        $priceTemp = (rand(3300, 3700) / 100);
        $sql = "UPDATE pricestemp SET price = $priceTemp WHERE companyName = 'Bawden';";
        mysqli_query($conn, $sql);
        sleep(1); // blocks the response for 1 second on every iteration
        $isPlaying++;
    }
    echo '<h1>Welcome to the simulation</h1>';
}
I am aiming for these updates to happen in the background once the user has logged into the simulation. When refreshing my database every second, the updated prices show up, which meets one of my objectives. However, I would also like the HTML to load onto the page (saying "Welcome to the simulation") while the database is updated with a new price every second.
So far, when I log in, I have to wait 4 seconds before the HTML loads. In the future, I hope to have it consistently updating until a certain condition is met, but when I set up an infinite loop earlier, the HTML never loaded.
What do I have to do to let the HTML load as soon as the user logs in, while the prices are generated and updated in the MySQL database in the background, with no delay in either task?
You have a fundamental misunderstanding of how web-based requests work.
What you need to understand is that PHP is a server-side language. PHP generates any combination of HTML, CSS, JavaScript, JSON, or any other form of data you want and sends it to the web browser when it is finished. While it is doing that, it can also manage data in a database or perform any number of other actions, but it will never send anything the web browser can make use of until it finishes everything. So if you are inside an infinite loop, it will never finish, and therefore nothing will ever be sent back to the web browser.
To remedy this, you need to use something called "asynchronous JavaScript", more commonly referred to as "ajax". Specifically, you first send some initial HTML to the web browser in one request and let the request end immediately. This allows the user to see something without waiting around for an indefinite period of time. Then, on the web browser end, you can use JavaScript to automatically send a second request to the server. During this second request to the server, you can perform your data processing and send back some data when you're finished to display to the user.
If you want to periodically update what you show the user, then you would repeat that second request to refresh what is shown on the user's webpage.
Any time you see some kind of "real-time" updating on a website, it is usually not coming from a single, persistently open connection to the web server; it is a series of repeated, separate requests that periodically refresh what you see.
Broken down, a standard web request workflow looks something like this:
1. Web browser asks the web server for the webpage. Web browser waits for a reply.
2. Web server generates the webpage and sends it to the web browser. Web server is done.
3. Web browser receives the webpage and shows it to the user. Web browser stops waiting for a reply.
4. Web browser runs any JavaScript it needs to run and requests data from the web server. Web browser waits for a reply.
5. Web server processes the request and sends the requested data back to the web browser. Web server is done.
6. Web browser receives the requested data and updates the HTML on the webpage so the user can see it. Web browser stops waiting for a reply.
As you can see, each exchange is 1) initiated by the web browser, 2) processed by the web server, and 3) any reply from the web server is handled by the web browser once the web server has finished up. So each and every request goes browser -> server -> browser. If we added steps 7, 8, and 9 to the above, they would repeat the exact same pattern.
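As a minimal sketch of that two-request pattern applied to the question (update_prices.php is a hypothetical second endpoint that would hold the slow price-update loop):

<?php
// page.php -- first request: send the HTML right away, do no slow work here.
session_start();
echo '<h1>Welcome to the simulation</h1>';
?>
<script>
// Second request: fired by the browser after the page is shown.
// "update_prices.php" is a hypothetical endpoint containing the
// price-update loop from the question.
fetch('update_prices.php')
    .then(function (response) { return response.text(); })
    .then(function (text) {
        // Optionally update the page once the slow work is finished.
        console.log('Price updates finished:', text);
    });
</script>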
If you want to avoid adding JavaScript into the mix, preferring to refresh the entire page every time, then keep your data processing short. Optimize your database calls, fix your infrastructure (make sure your server and database have a LAN connection, that your hardware is good enough, etc.), make your code more efficient... do whatever you need to do to keep the processing time to a minimum.
This is all incredibly simplified and not 100% accurate, but should hopefully help you with your specific problem. The short version of all of this is: you can't show your HTML and process your data at the same time the way you're doing things now. You need to fundamentally change your workflow.
You have to do this in two network calls. The first call fetches the HTML. Then you use JavaScript to fire another call that updates your data; once that API call returns, it updates the HTML.
Building a scheduling model to manage the frequency of a background operation based on the frequency of requests at the front end is a very difficult problem. It is also a problem you don't need to solve. The data doesn't need to change while nobody is looking at it. You just need to store when the data was last looked at and apply greater deltas to older data.
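A minimal sketch of that idea, reusing the pricestemp table from the question and assuming an added last_updated column ($conn is an open mysqli connection, as in the question):

<?php
$result = mysqli_query($conn,
    "SELECT price, last_updated FROM pricestemp WHERE companyName = 'Bawden'");
$row = mysqli_fetch_assoc($result);

// One price tick was meant to happen per second; see how many were missed.
$elapsedTicks = time() - strtotime($row['last_updated']);

if ($elapsedTicks > 0) {
    // Apply a delta scaled to how long the data sat untouched (here a
    // single re-roll stands in for the accumulated ticks).
    $price = rand(3300, 3700) / 100;
    mysqli_query($conn,
        "UPDATE pricestemp SET price = $price, last_updated = NOW()
         WHERE companyName = 'Bawden'");
}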
I'm new to Ajax and I'm trying to build a real-time (Facebook-like) messaging system with PHP and jQuery. Currently, when the user loads the page, messages are loaded from the database. Then the script sends a GET request every 2 seconds to fetch the latest messages from the database and displays them on the page. However, after a while the connection times out, and if I try to reload the website, it does not load. It does load when I use a proxy, so I suspect the server is blocking my IP.
Here is my current code (message refresh):
$(function() {
    var m = $('.messages');
    // Scroll to the bottom of the message list on load
    m.scrollTop(m.prop("scrollHeight"));

    // Poll the server every 2 seconds for the latest messages
    setInterval(function() {
        $.get("get_messages.php", function(result) {
            $('.messages').html(result);
        });
    }, 2000);
});
How can I make this more efficient/better?
Even though the comments on your question are good, I would rather answer this in my own way.
First off, polling the server every 2 seconds should in general work, so I would simply guess you are being blocked by the server, especially if it is a public/shared server, since those have to restrict traffic and high loads so you don't weigh them down for others.
First, I would check your browser's inspector/network tab. You might see that the requests are coming back with an error code instead of a 200 status, which should give you some indication. It depends on the server and how it responds; some return a 200/OK status but still stop the script.
Another thing to consider is caching. With your GET request, also send a timestamp; this forces a new request every time. jQuery can handle this in the background, but certain browsers still fail here, so append a JavaScript timestamp to the URL as well.
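For example, a small tweak to the polling code above (Date.now() makes every URL unique so it cannot be answered from cache):

setInterval(function() {
    // Appending the current timestamp defeats any caching layer in between
    $.get("get_messages.php?t=" + Date.now(), function(result) {
        $('.messages').html(result);
    });
}, 2000);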
Otherwise, also consider adding a small script that logs to a text file and see if that stops logging as well; you might find some interesting results there.
And if all else fails, install a local web server on your computer and test it that way. This will also help you check whether the problem is your server or your script.
If you try all of the above, you should find the cause of the issue, or at least get some indication as to where or why it stops working. The proxy detail suggests to me that it is a server block, BUT remember that a proxy can also change the way your page is requested, so I would start with the browser inspector first; it is my first point of reference when something does not work right.
First/main question: is this a public/shared server, or your own server that you set up? If you have access to the various logs on the server, check them as well.
I would like to find a way for my user not to have to wait for the output of a PHP script, and instead be redirected to a page while the script is running on the server.
Basically, the user submits a form which takes quite long to process, and I would like to redirect the user to a page notifying him that the form is being processed and that its output will be available later (I thought about opening a tab when the output is ready).
Basically I would like something like this, which I tried without success:
if ($form_valid) {
    // this would need to run outside the current request, so the user
    // doesn't have to wait for it to finish (timeout problems)
    process_form();
    header('Location: http://example.com/form_submitted_output_coming_soon.html');
}
I hope that it is not too vague.
Thank you in advance for any help / advice on how I could do that.
It really depends on the time the script takes to execute. If it is a matter of seconds, under 10 or so, I would do an Ajax request and show a modal progress message.
If it takes an extended amount of time, my approach would be to create (or use an existing) task scheduler/report generator:
- a single system scheduled task entry calling a central management script (probably not optimal)
- you mark a task/report for execution
- concurrency: count and limit the number currently executing (so you don't overload the server)
- users poll via Ajax for their tasks/reports, or results are pushed to the clients with WebSockets
Example of how to fork PHP to the background.
Update
I think you would get better performance out of a bot (a background worker) that continuously checks a database or file for work to do and submits the results back to the database, alerting users via Ajax, WebSockets, and/or email when the work they need is done or updated.
Here is a good introduction on how to build a web crawler in PHP.
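To illustrate the worker idea, a minimal sketch. The tasks table, its columns, and the connection details are assumptions; the script is started outside the web server (e.g. php worker.php):

<?php
// worker.php -- long-running CLI script that polls a tasks table.
$conn = mysqli_connect('localhost', 'user', 'pass', 'mydb');

while (true) {
    $result = mysqli_query($conn,
        "SELECT id FROM tasks WHERE status = 'pending' LIMIT 1");

    if ($task = mysqli_fetch_assoc($result)) {
        $id = (int) $task['id'];
        mysqli_query($conn, "UPDATE tasks SET status = 'running' WHERE id = $id");

        // ... do the slow work for this task here ...

        mysqli_query($conn, "UPDATE tasks SET status = 'done' WHERE id = $id");
    } else {
        sleep(5); // nothing to do; wait before checking again
    }
}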
The best approach for solving this kind of problem is to use AJAX to make the request to the server in the background and then update the user once it has finished processing.
You may submit the form with an asynchronous request (Ajax) and handle the page forward with JavaScript as well. This way your form is handled asynchronously, and you can even wait for the response to notify the user once you have an answer. The asynchronous request will not block the UI.
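A minimal browser-side sketch of that, assuming a form with id myForm and a hypothetical process_form.php endpoint (the keepalive flag lets the request outlive the navigation, within its roughly 64 KB body limit):

document.querySelector('#myForm').addEventListener('submit', function (e) {
    e.preventDefault(); // stop the normal, blocking form submission

    // Fire the slow request in the background; keepalive lets it
    // survive the navigation below.
    fetch('process_form.php', {
        method: 'POST',
        body: new FormData(this),
        keepalive: true
    });

    // Move the user on immediately.
    window.location.href = 'form_submitted_output_coming_soon.html';
});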
Just for completeness, if you really, really want to use PHP only for this:
Run PHP Task Asynchronously
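If you are running under PHP-FPM, one common PHP-only trick is fastcgi_finish_request(), which flushes the response (including the redirect) to the browser and then lets the script keep working; a rough sketch based on the code in the question:

<?php
if ($form_valid) {
    header('Location: http://example.com/form_submitted_output_coming_soon.html');

    if (function_exists('fastcgi_finish_request')) {
        fastcgi_finish_request(); // the user is redirected at this point
    }

    process_form(); // keeps running server-side after the redirect
}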
Is there a way to detect whether a user has disconnected from the internet, the way Stack Overflow does when you want to post a question? I can't think of an approach for this. Could someone shed some light on the subject?
You can send an AJAX request to a PHP script when the window is closed:
window.onbeforeunload = function(){
    // Request goes here
};
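One caveat: a request fired during unload is often cancelled by the browser before it completes. navigator.sendBeacon exists for exactly this case; a minimal sketch (offline.php is a hypothetical endpoint, with the user identified server-side via the session cookie):

window.onbeforeunload = function () {
    // sendBeacon queues a small request that survives the page closing
    navigator.sendBeacon('offline.php');
};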
Alternatively, you can use WebSocket technology (for example phpDaemon) to stay connected to the server permanently, so you will know when the user disconnects from the internet or from your site, or you can periodically ping your server (using the setInterval function).
I guess Stack Overflow uses AJAX, a JavaScript-driven mechanism executed on the client side inside your browser. This Ajax setup is responsible for notifying the user when, for example, a new answer is posted, and for giving them the opportunity to load said new answer without reloading the whole page.
This construct has a way to detect errors in the communication with the server, which is interpreted as the user being disconnected, resulting in a warning.
However, this requires that the user still has the browser open. There are also various other functions in JavaScript and AJAX to execute something when the user closes the page, but none of them is guaranteed to always work. There are no silver bullets, after all.
From the server's side, one can monitor the constant ping-pong of the user client's AJAX and execute something when this ping fades away. For example: the user has pinged us every 5 seconds for the past two minutes, but now the ping is missing.
The main problem with this lies in the principles of PHP: every page basically lives on its own. When the request comes in, the page is loaded and created, but at the end of the request the current page instance ceases to exist, just as every variable is lost unless it is saved elsewhere (a cookie, the session, a database).
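As a rough sketch of the server side of that heartbeat (the users table, its columns, and $conn are assumptions; the client would call ping.php every few seconds with setInterval):

<?php
// ping.php -- record the time of the latest heartbeat for this user.
session_start();
$userId = (int) $_SESSION['userId'];
mysqli_query($conn, "UPDATE users SET last_seen = NOW() WHERE id = $userId");

// Elsewhere, anyone whose last ping is too old counts as disconnected:
//   SELECT id FROM users WHERE last_seen < NOW() - INTERVAL 15 SECOND;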
There might be cases where your request takes a long time because of problems with the client's internet connection or with your server. Since the client doesn't want to wait, he clicks the Ajax link again, which sends the request to the server again and messes up the following:
1. Rendering of our website in the browser, because we are putting extra load on the browser.
2. The second request may be processed correctly and shown to the user, and then the error message from the first request (saying the request timed out) arrives, loads on top of the correct content, and disturbs the user while reading it.
I want to stop the first Ajax response if the Ajax function is called twice. How do I do this?
so I want to stop the 1st Ajax response if the Ajax function is called twice
What you actually want is to prevent a second request when a first request is in progress.
For example, you may change the Save button to Saving..., disable it, and add a little progress wheel to give live feedback to the user. (Facebook does this.)
The key is live feedback to the user. If the user is clueless about what is going on, they are going to think nothing is happening.
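A minimal jQuery sketch of that pattern (the #save button, #myForm, and save.php are assumptions):

var saving = false;

$('#save').on('click', function () {
    if (saving) return; // ignore clicks while a request is running

    saving = true;
    $('#save').prop('disabled', true).text('Saving...');

    $.post('save.php', $('#myForm').serialize())
        .always(function () { // re-enable whether it succeeded or failed
            saving = false;
            $('#save').prop('disabled', false).text('Save');
        });
});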
You might also want to check why the operation is taking so long:
If it is a complex/time-consuming operation, like, say, report generation or a file upload, a progress bar should do.
If it is because of the client's internet connection, say it up front, like Gmail does: "You have a slow Internet connection and this site may be slow." Better still, provide a fallback option with less or no Ajax.
You say we are giving extra load to the browser: this is kind of fishy. You will not be giving extra load to the browser unless you are giving it tons of HTML to render. Use Ajax only for small updates in the browser; reload the whole page if you expect a large change.
Since you are already using some form of JavaScript, how about keeping that link hidden or disabled, in a manner of speaking, until the page's request has gone through? You could, for example, have the Ajax request wait for a second variable that enables the link so a user can click it. Until that variable comes back from the original AJAX request, the link stays disabled; if it is clicked, it disables again and waits for the same variable to come back once more.
Think of how some developers disable a submit button on a form so a user can't double-submit their form. Same concept here.