I have a request to provide the following solution:
Two web pages, each with one form on them. The form is submitted and the input inserted into a database. Another web page is used to display the results of the form inputs as a tag cloud. My question is: what exact workflow would people use for this? My thoughts were like so:
1 MySQL database, 1 HTML page running AJAX and jQuery for the tag cloud/polling, and 1 PHP processing script which grabs new data from the database and serves it out to the HTML page. Now, what is the effect of this running for, say, 24 hours, constantly updating via AJAX (e.g. every 10 seconds), or should I use a different method? The results from the form need to be saved for offline viewing after the 24-hour period, so I can't just stream the form results straight to the page.
All advice welcome using any technologies...
I should add that there's a possibility it may be on a LAN with no internet access, so I'm thinking of a local XAMPP installation...
Given that there are 3 clients and they poll every 10 seconds, that means
approx 25k requests per day to keep the page updated, which is hardly any load for a XAMPP install, especially on a local network.
I would suggest testing the polling and seeing whether it still works after 24 hours:
are there any limitations in the browser that stop the script from working after a certain amount of time? (I have never tried running it that long before, sorry.)
An alternative approach is:
does it need to poll continuously, or can you just have a refresh button?
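For the PHP processing script mentioned in the question, a minimal sketch of the polled endpoint could look like the following; the database name, credentials and the form_entries/tag column names are assumptions made up for illustration:

<?php
// poll.php - hypothetical endpoint the AJAX poller would hit every 10 seconds.
// Table/column names (form_entries, tag) are assumptions for this sketch.
$pdo = new PDO('mysql:host=localhost;dbname=tagcloud', 'user', 'password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Count how often each tag was submitted; the client renders this as the tag cloud.
$stmt = $pdo->query('SELECT tag, COUNT(*) AS weight FROM form_entries GROUP BY tag');

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));

The HTML page would request this script every 10 seconds (for example with jQuery's $.getJSON) and redraw the tag cloud from the returned weights.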
I'm trying to force a (second) page to reload if a criterion is met in PHP.
But if the criterion is met, I want the page to force a reload everywhere, even if 10 people have it open at once, for example.
For simplicity, let's say the code is like this:
in /filelocation/script.php:
if {$data == "ok"}{
reload/refresh "reload.php" if it's open somewhere;
}
I came across some software that basically does this, and I want to understand how it is done.
(It works cross-device somehow, so I assume it's done through PHP somehow.)
Well, in your PHP code, you cannot simply reload/refresh something for all the connected users. This is simply because the PHP code is only executed when a browser requests a page from the server, so it only runs to build the HTML response and then it stops executing. Once the browser has the HTML response it renders the page and then waits for an action from the user to do something else (such as clicking on a link or posting a form).
I imagine that when a specific user does something, like posting a comment or buying a product, you would like all the other visitors to be notified that a new comment has been posted or that the number of products available has been reduced.
To do that, you need to implement some JavaScript which is executed in the browser of each visitor. The idea is to keep a connection open with the server with the help of web sockets. This way, the server can inform the browser that something has changed.
You could google to find some examples of PHP apps using web sockets. The first example I found:
https://www.twilio.com/blog/create-php-websocket-server-build-real-time-even-driven-application
Another solution could be to have some JavaScript doing some polling, meaning that every N seconds it executes an Ajax request to the server to ask if something has changed. This can be done with setInterval(yourFunction, 10000), or by re-arming setTimeout(yourFunction, 10000) at the end of each run, to call a JavaScript function every 10 seconds. This function will do the Ajax request and then update the part of your page that needs to change. Just be careful that if you get a lot of users on your site then you'll produce quite a lot of load on your server. So this wouldn't be a good solution, but it could be an alternative to the web sockets.
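Applied to the reload.php example from the question, the polled check could be as small as the sketch below; check_criteria() and the status file are made up to stand in for however $data is really determined, and the page-side script would call location.reload() whenever the flag is true:

<?php
// check.php - hypothetical endpoint that reload.php polls via Ajax.
// check_criteria() is a stand-in for however $data is really computed.
function check_criteria(): bool
{
    $data = file_get_contents('/filelocation/status.txt'); // assumed storage
    return trim($data) === 'ok';
}

header('Content-Type: application/json');
// The browser-side script calls location.reload() when "reload" is true.
echo json_encode(['reload' => check_criteria()]);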
A website has 250 products and I need to get the quantity for those products. The problem is that the quantity is not displayed unless I submit a form. Now I am able to submit that form, but the problem is that after 10-20 products scraped there is a bottleneck where the website returns this message:
Sorry, we have too many customers, please come back later.
So basically it's clear that I'm sending too many requests. But if I'm using a usleep between requests, the time for the scrape is like 15 minutes... I guess the server is delaying the answers to my requests.
So basically my question is: what can I do to submit that form without getting stopped or delayed?
Have your own local cached copy of all the products, and have a daemon or cron job that is constantly (but slowly) updating your own local cache, which should keep your cache as close to up-to-date as possible without hitting the rate limit. Whenever you need to quickly inspect all 250 products, use your own local cache, not the live version.

PS: the rate limit is probably on a per-IP basis; if the update speed of 1 IP is insufficient, you can probably just keep adding more IPs to the cache updater until the cache update speed is acceptable. (And if you're looking for a cheap place to get more IPs, I can recommend https://cloudatcost.com/developer-cloud - or if you're looking for something free, you can try the Tor project https://www.torproject.org/ - but many websites block Tor exit nodes.)
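A rough sketch of such a cache updater, run from a cron job; the product list, form URL, field names and the cache file are all placeholders:

<?php
// update_cache.php - hypothetical cron job that slowly refreshes a local cache
// of product quantities. URLs, field names and the cache file are assumptions.
$productIds = range(1, 250);          // placeholder for the real product list
$cache = [];

foreach ($productIds as $id) {
    // Submit the quantity form for one product (endpoint/fields are made up).
    $ch = curl_init('https://example.com/product/quantity');
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => http_build_query(['product_id' => $id]),
        CURLOPT_RETURNTRANSFER => true,
    ]);
    $cache[$id] = curl_exec($ch);
    curl_close($ch);

    sleep(5); // stay well under the site's rate limit
}

// Read this file whenever you need all 250 quantities quickly.
file_put_contents(__DIR__ . '/quantities.json', json_encode($cache));

Pointing your own tooling at quantities.json then gives you all 250 quantities instantly, while the cron job trickles requests to the live site at a rate it will accept.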
My page is visited by multiple users at the same time.
User 1: visits the page and changes the title.
User 2: user 2 was already on that page but still sees the old title; the title has to be updated automatically to the new one.
I know I can simply make an AJAX call every 5 minutes, but I'm trying to see if there is any other way: something that fires an event to all instances of the page opened by different users, so that if one of them is updated, all the other pages automatically get the latest data without waiting for the 5-minute AJAX call. AJAX seems inefficient since it will make many calls, and what happens if user 1 updates the title while user 2 also updates it before user 2's page has been refreshed by the 5-minute AJAX call?
I'm not asking for code, just advice on whether I should keep using AJAX calls every 5 minutes and be happy with it, or whether there is a better solution.
Try investigating web sockets for real-time, two-way communication between the server and the browser.
http://socketo.me/
I'm in the early stages of working with it myself but it seems like a solution that would fit your requirements.
Also, maybe look at push notifications
e.g. http://www.pubnub.com/blog/php-push-api-walkthrough
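For the socketo.me suggestion (the Ratchet library), a very rough outline of a broadcast server might look like the sketch below; this assumes composer require cboden/ratchet, and the class name is made up:

<?php
// title_server.php - rough Ratchet sketch; run with: php title_server.php
require __DIR__ . '/vendor/autoload.php';

use Ratchet\MessageComponentInterface;
use Ratchet\ConnectionInterface;
use Ratchet\Server\IoServer;
use Ratchet\Http\HttpServer;
use Ratchet\WebSocket\WsServer;

class TitleBroadcaster implements MessageComponentInterface
{
    private $clients;

    public function __construct()
    {
        $this->clients = new \SplObjectStorage();
    }

    public function onOpen(ConnectionInterface $conn)
    {
        $this->clients->attach($conn);
    }

    public function onMessage(ConnectionInterface $from, $msg)
    {
        // One user changed the title: push the new value to everyone else.
        foreach ($this->clients as $client) {
            if ($client !== $from) {
                $client->send($msg);
            }
        }
    }

    public function onClose(ConnectionInterface $conn)
    {
        $this->clients->detach($conn);
    }

    public function onError(ConnectionInterface $conn, \Exception $e)
    {
        $conn->close();
    }
}

IoServer::factory(new HttpServer(new WsServer(new TitleBroadcaster())), 8080)->run();

Each open page would connect to ws://yourserver:8080 and update its displayed title whenever a message arrives.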
I am simulating push notifications using PHP the following way:
A jQuery AJAX call calls a script on the server.
The script is delayed using a for loop with a sleep after each iteration.
If something happens, the loop is broken and the information is returned to jQuery.
If the script runs for more than one minute, it returns an empty value.
When jQuery receives an answer from the server, it parses the information and starts the same procedure again.
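The server-side half of that procedure might look roughly like this sketch; has_new_data() is a placeholder for whatever check the real script performs:

<?php
// longpoll.php - rough sketch of the delayed script described above.
// has_new_data() stands in for the real check against the database.
function has_new_data(): ?array
{
    // e.g. query the database for rows newer than the client's last seen id
    return null; // nothing new in this stub
}

header('Content-Type: application/json');

// Hold the request open for up to ~60 seconds, checking once per second.
for ($i = 0; $i < 60; $i++) {
    $payload = has_new_data();
    if ($payload !== null) {
        // Something happened: break out and return it to the jQuery caller.
        echo json_encode($payload);
        exit;
    }
    sleep(1);
}

// A minute passed with nothing to report: return an empty value,
// and the jQuery success handler starts the same request again.
echo json_encode([]);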
That's the procedure used by Facebook, and it works like a charm on their website. On my server I have the following problem:
For example, if the script is being delayed for 60 seconds and I click on another link on my website at the 30th second, we still have 30 seconds left to generate the output. So my web server waits those 30 seconds and my click request is processed only after them, making my website almost impossible to use. I double-checked Facebook and found that when you click on a link, the page is never refreshed and the requests keep flowing on the same page. The address bar, however, changes. What is the way to achieve the same thing on my website? Is there a way to force my server to process more than one PHP request at a time, or do we have to do this using JavaScript only? What am I missing?
P.S. As far as I know there is no way to change the address bar using JavaScript.
Actually there is, in HTML5. It's called the History API.
You could find a quick demo with the source code on GitHub here:
http://html5demos.com/history
The complete chapter about it in Dive Into HTML5:
http://diveintohtml5.info/history.html
I want to create a live news ticker similar in functionality to Facebook's.
I have already created the page that will have the news ticker... The database is also ready to hold the data... What I don't know how to do is rotate the ticker as soon as new news items have been added to the database... I can do it when the page refreshes... but I can't have it done live... I hope somebody can help me.
I have everything ready to be published, but I'm waiting on the news ticker...
NOTE: I only have pseudo-code, no finished PHP yet.
You probably want to use timed AJAX calls. For example, every 1 minute send a request to the server to see if anything new is added. If yes, then display the new piece of news to replace the currently displayed one.
You'll need some kind of AJAX mechanism.
You can either poll a URL every x seconds to see if there are any updates, or, if you really want real-time, you'll need something like Node.js on the server (or anything that's not as CGI-like as PHP).
But if you want to keep things simple, I'd start with a simple Ajax solution and polling every x seconds.
I would suggest doing it with AJAX: poll the database once every 30 seconds for new entries.
AJAX lets you communicate asynchronously with the server; this way you can get new data without a page refresh.
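Following those polling suggestions, the PHP endpoint the ticker page would hit could look roughly like this sketch; the news table, column names and the since parameter are assumptions for illustration:

<?php
// ticker.php - hypothetical endpoint the page polls every 30-60 seconds.
// Table/column names (news, id, headline) are assumptions for this sketch.
$since = isset($_GET['since']) ? (int) $_GET['since'] : 0;

$pdo = new PDO('mysql:host=localhost;dbname=site', 'user', 'password');
$stmt = $pdo->prepare('SELECT id, headline FROM news WHERE id > ? ORDER BY id');
$stmt->execute([$since]);

header('Content-Type: application/json');
// An empty array means nothing new; otherwise the client rotates in the new items.
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));

The page keeps the highest id it has seen and passes it back as since on the next poll, so an empty response simply means there is nothing new to rotate in.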