I am new to PHP web dev and I have been working on a website recently... It was working great on my local server, but when I put it up on an actual server it started running really slow until it finally gave up, and now I can't even access the website... or cPanel... I have been doing a little research, thinking maybe it had to do with an overload of SQL queries, but I can't seem to find any...
My site contains a lot of JavaScript and AJAX calls to PHP scripts which fetch data from the database (new notifications, messages), and I have stuff like:
// QUESTION RETRIEVAL FOR HOME FEED
$(function(){
    // kick off the first request 100 ms after the page is ready
    $r = setTimeout(alive_retrieval, 100);
});

function alive_retrieval(){
    $.ajax({
        type: "GET",
        url: "php/alive_questions.php",
        success: function(data){
            $("#alive_question.content").html(data);
        }
    });
    // schedule the next request 100 ms later, without waiting for the previous one to finish
    $r = setTimeout(alive_retrieval, 100);
}

$(function(){
    // question_retrieval (not shown here) follows the same pattern
    $t = setTimeout(question_retrieval, 100);
});
This just keeps calling the alive_retrieval function over and over, which I am guessing could cause my site to be slow...
I don't know what to post in order to help (code, the link to my website, or whatever)... Please tell me what I should give you guys so you might be able to see what is happening...
When I go to my site address, this error pops up from my hosting provider:
Website you were trying to visit was disabled for 5 minutes, because it received over 20% of total server requests.
It means that this website was using over 20% of processor resources, which is above allowed limit.
Website was temporary disabled to protect server from overloading and other websites on server.
It looks as though the JavaScript you were running was firing an AJAX request at the server 10 times per second (by setting timeouts of 100 milliseconds), and with two such loops that is 20 requests per second per visitor. This is totally unreasonable -- changing that to once every 10 seconds (10000 milliseconds) might be more acceptable.
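As a rough sketch, here is the loop from the question with that change applied, and with the next request scheduled only once the previous one has completed, so the calls can never pile up (the URL and selector are taken from the question):

function alive_retrieval(){
    $.ajax({
        type: "GET",
        url: "php/alive_questions.php",
        success: function(data){
            $("#alive_question.content").html(data);
        },
        complete: function(){
            // schedule the next request only after this one has finished
            $r = setTimeout(alive_retrieval, 10000);
        }
    });
}

$(function(){
    $r = setTimeout(alive_retrieval, 10000);
});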
I'm trying to force a (second) page to reload if a criterion is met in PHP.
But if the criterion is met, I want the page to force-reload everywhere, even if 10 people have it open at once, for example.
For simplicity, let's say the code is like this:
in /filelocation/script.php:
if ($data == "ok") {
    // reload/refresh "reload.php" if it's open somewhere
}
I came across some software that basically does this, and I want to understand how it is done.
(It works across devices somehow, so I assume it's done through PHP somehow.)
Well, in your PHP code you cannot simply reload/refresh something for all the connected users. This is because PHP code is only executed when a browser requests a page on the server: it runs just long enough to build the HTML response and then it stops executing. Once the browser has the HTML response it renders the page and then waits for an action from the user to do something else (such as clicking on a link or posting a form).
I imagine that when a specific user does something, like posting a comment or buying a product, you would like all the other visitors to be notified that a new comment has been posted or that the number of products available has been reduced.
To do that, you need to implement some JavaScript which is executed in the browser of each visitor. The idea is to keep a connection with the server with the help of web sockets. This way, you can inform the browser that something has changed.
You could google to find some examples of PHP apps using web sockets. The first example I found:
https://www.twilio.com/blog/create-php-websocket-server-build-real-time-even-driven-application
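To give a rough idea of the browser side (the address and the "reload" message below are assumptions, not something taken from that article), the page you want to refresh everywhere could listen like this:

// Runs in the page that should refresh everywhere; ws://example.com:8080
// is a hypothetical address where your WebSocket server would listen.
var socket = new WebSocket("ws://example.com:8080");

socket.onmessage = function (event) {
    // Assumed protocol: the server pushes the text "reload" when the PHP
    // side decides that every open copy of the page must refresh.
    if (event.data === "reload") {
        window.location.reload();
    }
};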
Another solution could be to have some JavaScript doing some polling, meaning that every N seconds it executes an Ajax request to the server to ask if something has changed. This can be done with the help of setTimeout(yourFunction, 10000) (re-armed after each run, or setInterval) to call a JavaScript function every 10 seconds. This function will do the Ajax request and then update the part of your page that needs to change. Just be careful: if you get a lot of users on your site, then you'll produce quite a lot of load on your server. So this wouldn't be as good a solution, but it could be an alternative to the web sockets.
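A minimal sketch of that polling idea for your reload.php page, assuming a hypothetical check_status.php endpoint that echoes "ok" once your condition is met on the server side:

// Every 10 seconds, ask the server whether the condition has been met;
// check_status.php is a hypothetical endpoint that would echo "ok"
// once $data == "ok" on the server.
function checkForReload() {
    $.get("check_status.php", function (response) {
        if (response === "ok") {
            window.location.reload();
        } else {
            setTimeout(checkForReload, 10000);
        }
    });
}

setTimeout(checkForReload, 10000);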
A website has 250 products and I need to get the quantity for those products. The problem is that the quantity is not displayed unless I submit a form. Now, I am able to submit that form, but after 10-20 products are scraped I hit a bottleneck where the website returns this message:
Sorry, we have too many customers, please come back later.
So basically it's clear that I'm sending too many requests. But if I use a usleep() between requests, the scrape takes something like 15 minutes... I guess the server is delaying the answers to my requests.
So basically my question is: what can I do to submit that form without getting stopped or delayed?
Have your own local cached copy of all the products, and have a daemon or cron job that is constantly (but slowly) updating your own local cache, which should make your cache as close to up-to-date as possible without hitting the rate limit. Whenever you need to quickly inspect all 250 products, use your own local cache, not the live version.
PS: the rate limit is probably on a per-IP basis; if the update speed of 1 IP is insufficient, you can probably just keep adding more IPs for the cache updater until the cache update speed is acceptable. (And if you're looking for a cheap place to get more IPs, I can recommend https://cloudatcost.com/developer-cloud - or if you're looking for something free, you can try the Tor project https://www.torproject.org/ - but many websites block Tor exit nodes.)
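A rough sketch of that cache-updater loop, written in Node.js purely to illustrate the pattern (in a PHP setup the same loop could live in a PHP script run from cron); the URL, file name, and product numbering are assumptions:

// Each run refreshes a single product and remembers where it left off,
// so a cron entry such as
//   * * * * * node update_cache.js
// works through all 250 products over a few hours without tripping the rate limit.
const fs = require("fs");

const CACHE_FILE = "cache.json";
const PRODUCT_COUNT = 250;

async function updateOneProduct() {
    const cache = fs.existsSync(CACHE_FILE)
        ? JSON.parse(fs.readFileSync(CACHE_FILE, "utf8"))
        : { next: 0, products: {} };

    const id = cache.next % PRODUCT_COUNT;
    // Hypothetical request; in practice this is the form submission that
    // returns the quantity for one product.
    const res = await fetch("https://example.com/quantity?product=" + id);
    cache.products[id] = await res.text();

    cache.next = id + 1;
    fs.writeFileSync(CACHE_FILE, JSON.stringify(cache));
}

updateOneProduct();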
I'm new to AJAX and I'm trying to build a real-time (Facebook-like) messaging system with PHP and jQuery. Currently, when the user loads the page, messages are loaded from the database. Then, the script sends a GET request every 2 seconds to get the latest messages from the database and displays them on the page. However, after a while the connection times out, and if I try to reload the website, it does not load. It does load when I use a proxy. I feel it has something to do with the server blocking my IP.
Here is my current code (message refresh):
$(function() {
    var m = $('.messages');
    // scroll to the bottom of the message list on load
    m.scrollTop(m.prop("scrollHeight"));
    // re-fetch the full message list every 2 seconds
    setInterval(function() {
        $.get("get_messages.php", function(result) {
            $('.messages').html(result);
        });
    }, 2000);
});
How can I make this more efficient/better?
Even though the comments on your question are good, I would rather answer this in my own way.
First off, polling the server every 2 seconds should in general work; I would simply guess you are being blocked by the server, especially if it is a public/shared server, as they have to restrict traffic/high loads so you don't weigh it down for others.
First I would check your browser's inspector/network tab. You might see that the network requests are coming back with an error code instead of a 200 status, which should give you some indication, depending on the server and how it responds; some servers return a 200/OK status but still stop the script.
Another thing to consider is caching. With your GET request, also send a timestamp with it; this will force a new request every time. jQuery can handle this in the background, but certain browsers still fail in this area, so append a JavaScript timestamp to the URL as well.
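For example, a minimal sketch using the endpoint from the question:

// Append the current time so the browser (or a proxy) can never serve a cached copy.
$.get("get_messages.php?_ts=" + Date.now(), function(result) {
    $('.messages').html(result);
});

jQuery can do the same thing for you if you use $.ajax() with the cache: false option, which appends a timestamp parameter to the URL automatically.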
Otherwise, also consider adding a small server-side script that logs to a text file and see whether that stops logging as well; you might find some interesting results there.
And if all else fails, install a local web server on your computer and test it that way. This will also help you check whether it is your server or your script.
If you try all of the above, you should find an answer to what is causing the issue, or at least get some indication as to where or why it stops working. The fact that the site loads through a proxy suggests to me that it is a server block, BUT also remember a proxy can change the way it requests your page... So I would start with the browser inspector first; it is my first point of reference when something does not work right.
First/main question: is this a public/shared server, or your own server that you have set up? If you have access to various logs on the server, check them as well.
I have a request to provide the following solution:
Two web pages with 1 form on them. This form is submitted and inserted into a database. Another web page is used to display the results of the form inputs in a tag cloud. My question is: what exact workflow would people use for this? My thoughts were like so:
1 MySQL database, 1 HTML page running AJAX and jQuery for the tag cloud/polling, and 1 PHP processing script which grabs new data from the database and serves it out to the HTML page. Now, what is the effect of this being run for, say, 24 hours, constantly updating via AJAX (i.e., every 10 seconds), or should I use a different method? The results from the form need to be saved for offline viewing after the 24-hour period, so I can't just stream the form results straight to the page.
All advice welcome using any technologies...
I should add, there's a possibility that it may be on a LAN with no internet access, so I'm possibly thinking of a local XAMPP installation...
Given that there are 3 clients and they poll every 10 seconds, that means roughly 3 × (86,400 / 10) ≈ 26,000 requests per day to keep the page updated, which is hardly any load for a XAMPP install, especially on a local network.
I would suggest testing the polling and seeing if it still works after 24 hours:
are there any limitations in the browser that stop the script from working after x amount of time? (I have never tried running it that long before, sorry.)
An alternative approach: does it need to poll continuously, or can you just have a refresh button?
I hope you guys know about webim, a.k.a. Mibew Messenger. I know only Java and JSP, and have no idea about PHP except for some basics. Anyway, I ran this app on my Apache 2.2 local server and everything works superbly! But if I move my DB to a virtual machine and give its address in config.php (previously I had used localhost), on the visitors page I get "timeout, reconnecting". Login has no problem, so my guess is the DB connection is fine. I even changed the default page refresh time from 2 to 10. Nothing happens, still the same thing. Do you guys have any idea?
In users.php, you may get "timeout, reconnecting" if the number of in-chat sessions is so great that it takes too long for the JavaScript to retrieve and update the page.
select istate,count(*) from chatthread group by istate;
Look at the number of chat threads with a status of 2.
If it's big (mine was over 1000), then you can update the chatthread table:
update chatthread set istate=3 where istate=2 and dtmcreated < date(now()-interval 1 day) ;
The above query updates threads older than yesterday where the consumer simply closed their window and went away.
Why does this happen? In my case, it's because the customer service department did not log into Mibew for over a week and there were too many message threads for users.php/JavaScript to retrieve and display.
You can also replace users.php with update.php in the address bar and load it. There should be valid XML output; you'll see if there are any errors.