IP Connection limit exceeded - php

I used a PHP script to parse a remote XML file and print the output into a div on a web page. Since the output has to stay synchronized with the currently playing track, I used JavaScript to reload the div's content every 20 seconds. While testing the page I ran into an issue with my hosting and got the message "IP Connection limit exceeded"; the site was not accessible. I changed my IP to get around this. Is there a workaround to parse the metadata without hammering the server and running into web hosting issues?
<script>
// Reload the "now playing" div every 20 seconds (requires jQuery).
setInterval(function () {
    $('#reload').load('current.php');
}, 20000);
</script>

Since a web page is a client-side entity, it is by nature unable to receive any data it hasn't requested. That being said, there are a few options you may consider.
First, I don't know what web host you are using, but they should let you refresh the page (or make a request like you are doing) more often than once every 20 seconds, so I would contact them about that. A denial-of-service attack looks more like 2 or 3 requests per second per connection. There could be a better answer here that I'm just not seeing, but at first glance that's my take on it.
One option you may want to consider is a WebSocket, a new feature of HTML5 that lets the web server maintain an open connection to the visitor's browser and send packets of data back and forth. This removes the need for the browser to poll the server every 20 seconds. Granted, WebSockets are new, and I believe they currently only work in Safari and Chrome. I haven't experimented with them but plan to in the future.
In conclusion, I don't know of a better way than polling the server every so often to check for changes; based on my browser's XMLHttpRequest tab, this is how Gmail checks for new messages. If your host won't allow you more requests per time interval, decrease the polling frequency or switch to a different host.
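Whatever interval you settle on, you can also make each poll as cheap as possible by caching the parsed result on your own server, so the remote XML is fetched at most once per interval no matter how many visitors are polling. Here is a minimal sketch of a cached current.php; the cache path, remote URL, and track element name are all placeholders:

<?php
// current.php - serve the last parsed result from a local cache so the
// remote XML is fetched at most once per 20-second window.
$cacheFile = '/tmp/current_track.cache';  // hypothetical cache path
$cacheTtl  = 20;                          // seconds, matches the JS poll interval

if (!file_exists($cacheFile) || time() - filemtime($cacheFile) > $cacheTtl) {
    // Cache is stale: fetch and parse the remote metadata once.
    $xml   = @simplexml_load_file('http://example.com/stream/metadata.xml'); // hypothetical URL
    $track = $xml ? (string)$xml->track : 'unknown';                         // hypothetical element
    file_put_contents($cacheFile, $track);
}

echo htmlspecialchars(file_get_contents($cacheFile));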

Related

is it too much to check database every half a second

So I'm making a chat app. I use Ajax to check for a new entry (message) in the database every 0.5 s, and if there is one, I display it. Is that too much to ask of the server? I'm using a cheap shared hosting service. In my experience so far, half the time messaging is fast and smooth; the other half, especially during peak times, messages disappear and the Ajax requests fail half the time. Sometimes even the connection to the database itself fails. I want to know whether I'm asking too much of the server or my server is bad and I should consider changing it (or both).
You have two simple options.
The first is to use a cache mechanism such as Redis to store the last state of every user chat and the id/date of the last received message, so you don't have to check the database on every request (a sketch follows this answer).
The second is to use WebSockets instead of client-to-server polling. With this method you establish a connection between client and server, and once a new message arrives you push it to the client directly.
If you want to solve your problem without any external tool, my recommendation is the second way.
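As an illustration of the first option, here is a minimal sketch using the phpredis extension; the key naming scheme and the fetch_messages_since() helper are hypothetical:

<?php
// Consult Redis for the latest message id before touching the database.
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

$chatId   = (int)$_GET['chat_id'];
$lastSeen = (int)$_GET['last_id'];                    // id the client already has
$latest   = (int)$redis->get("chat:$chatId:last_id"); // updated on every insert

if ($latest <= $lastSeen) {
    echo json_encode(array()); // nothing new: answered without a DB query
    exit;
}

// Only now hit the database for the actual rows.
echo json_encode(fetch_messages_since($chatId, $lastSeen)); // hypothetical helper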
In this case I would probably recommend going with WebSockets; it seems to fit your use case.
Look into http://socketo.me/
That being said, I would look into upgrading from a shared hosting account if you want your site to remain performant under a larger load. Again, it really depends on how many users you are trying to cater for.
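For reference, the hello-world pattern from the Ratchet (socketo.me) documentation looks roughly like this; a sketch, assuming Ratchet has been installed via Composer:

<?php
// A minimal Ratchet chat server: broadcasts each message to the other clients.
use Ratchet\MessageComponentInterface;
use Ratchet\ConnectionInterface;
use Ratchet\Server\IoServer;
use Ratchet\Http\HttpServer;
use Ratchet\WebSocket\WsServer;

require 'vendor/autoload.php';

class Chat implements MessageComponentInterface {
    protected $clients;

    public function __construct() {
        $this->clients = new \SplObjectStorage;
    }

    public function onOpen(ConnectionInterface $conn) {
        $this->clients->attach($conn);
    }

    public function onMessage(ConnectionInterface $from, $msg) {
        foreach ($this->clients as $client) {
            if ($client !== $from) {
                $client->send($msg);
            }
        }
    }

    public function onClose(ConnectionInterface $conn) {
        $this->clients->detach($conn);
    }

    public function onError(ConnectionInterface $conn, \Exception $e) {
        $conn->close();
    }
}

// Listen for WebSocket connections on port 8080.
$server = IoServer::factory(new HttpServer(new WsServer(new Chat())), 8080);
$server->run();

Note that this needs a long-running process and an open port, which is exactly what most shared hosts won't give you; another reason to consider upgrading.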

Issues with PHP cURL between servers, return transfer probably won't catch the response

I have a really weird behavior going on.
I'm hosting tracking software for users, which mainly logs mobile traffic. The path is as follows:
1. My client gets a PHP code snippet to put on his website.
2. This code sends a cURL POST (with predefined post fields such as visitor IP, user agent, host, etc.) to my server.
3. My server logs the data and decides what the risk level is.
4. It then responds to the client's server with the status, i.e. it sends "true" or "false" back.
5. The client's server gets that response and decides what to do (load different HTML content, redirect, block the visitor, etc.).
The problem I'm facing is that, for some reason, all the requests made from my clients' servers to my server are recorded and stored in a log file, yet my clients report click loss, as if my server sends back the response but their servers fail to receive it.
I should note that there are tons of requests every minute, from different clients' servers and from each client himself.
Could the reason be related to CURLOPT_RETURNTRANSFER not catching the response? Or maybe the problem is cURL overload?
I really have no idea. My server is pretty fast and uses only 10% of its resources.
Thanks in advance for your thoughts.
You have touched a very problematic domain, high-load servers; your problem could be in so many places that you will really have to spend time to fix it, or at least partially fix it.
First of all, you should understand what is really going on. Check out this simplified scheme:
1. The client's PHP code tries to open a connection to your server; to do this it sends some data over the network.
2. Your server (I suppose Apache) tries to accept it, if it has the resources; check the max connections settings in your Apache config.
3. If the server can accept the connection, it tries to create a new thread (or take one from the thread pool).
4. Once the thread is started, it runs your PHP script.
5. Your PHP script does some work, connects to the DB, and sends the response back over the network.
6. The client waits for the answer from step 5, or closes the connection because of a timeout.
So, at each point you can have a bottleneck:
Network bandwidth
Max opened connections
Thread pool size
Script execution time
Max database connections, table locks, I/O wait times
Client timeouts
And that is not a full list of the places where a problem can occur and finally lead to an empty cURL response.
Right from the start, I suggest you add logging to both PHP codebases (the client's and the server's) and store all curl_error messages in a text file; at least you will see which problems occur most often.
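As a sketch of what the client-side snippet might look like with logging added; the endpoint, log path, and timeout values are placeholders:

<?php
// Client-side tracking call with explicit timeouts and error logging.
$ch = curl_init('http://tracker.example.com/check.php'); // hypothetical endpoint
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array(
    'ip'        => $_SERVER['REMOTE_ADDR'],
    'useragent' => $_SERVER['HTTP_USER_AGENT'],
    'host'      => $_SERVER['HTTP_HOST'],
));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the "true"/"false" body
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 2);    // fail fast instead of hanging
curl_setopt($ch, CURLOPT_TIMEOUT, 5);

$response = curl_exec($ch);

if ($response === false) {
    // Record exactly why the call failed: timeout, DNS failure, refused connection...
    file_put_contents(
        '/var/log/tracker_client.log', // hypothetical path
        date('c') . ' curl_errno=' . curl_errno($ch) . ' ' . curl_error($ch) . "\n",
        FILE_APPEND
    );
}
curl_close($ch);

If the log fills up with operation timeouts, the bottleneck list above tells you where to look next.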

Apache2 not responding

I am trying to make Comet requests via Prototype/PHP, like here: http://www.zeitoun.net/articles/comet_and_php/start
But while the connection is open, other pages from my project will not load in the same browser.
What can I do to get normal behaviour?
Thanks very much!
Comet works by keeping a connection open between the server and the client. Browsers have a maximum number of connections they will allow a page to make (something like 2 max for IE), and I think all requests to the same domain may also be grouped together. That is why your connections are not going through.
I believe it is not the server that is at fault here but the browser; using an iframe is the correct solution, as you mentioned, but it's not the server's fault.
[Edit]
The simplest solution for you is to monitor focus. When the page has focus, open a connection; when focus is lost (i.e. the user switches tabs), close the connection and wait for focus again before updating the page. That way you get the appearance of multiple pages updating while only needing one Comet connection at any time.

How to automatically reflect new database updates on users' pages

I am trying to create a new notice alert using PHP and JavaScript. Suppose that 100 users are currently logged in to the online notice board application and any one user posts a new notice. I want an immediate alert on all the users' screens.
I know that the simplest way of doing it is to constantly ping the server, but I don't want to do that, as it will slow down the server.
Moreover, I am on a shared host, so I don't have access to any socket port. That means I cannot establish a direct socket communication channel from the server to the user's machine.
Can anyone suggest some other solution to this kind of problem?
This is a COMET application. Google for COMET and you should find lots of information. Basically there are two techniques for retrieving asynchronous notifications over HTTP. The first is to ping the server, which you've already said you don't want to do. The other technique is to send a request to the server and have the server respond only when there is some data. In other words, instead of pinging once a second and only getting a message after 50 pings and 50 seconds, the server simply holds the first request for up to 50 seconds, until there is something to send, and then responds. There are tools that will do all this for you.
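A bare-bones long-polling endpoint in PHP might look like the sketch below; the notice-lookup helpers are hypothetical, and the hold time should stay below the host's script timeout:

<?php
// poll.php - hold the request open until a new notice appears or we time out.
$lastId  = isset($_GET['last_id']) ? (int)$_GET['last_id'] : 0;
$holdFor = 25; // seconds; keep this below max_execution_time
$start   = time();

while (time() - $start < $holdFor) {
    $latestId = get_latest_notice_id(); // hypothetical DB helper
    if ($latestId > $lastId) {
        // Something new: answer immediately; the client re-polls right away.
        echo json_encode(get_notices_since($lastId)); // hypothetical helper
        exit;
    }
    sleep(1); // check once a second while holding the connection open
}

echo json_encode(array()); // nothing new; the client simply polls again

Be aware that each held request still occupies a server process for its whole duration, so a shared host may frown on this too.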

failsafe for networked robot

I have a robot that I'm controlling via a browser. A page with buttons to go forward, reverse, etc. is written in PHP and hosted on an onboard computer. The PHP just sends ASCII characters over a serial connection to a microcontroller. Anyway, I need to implement a failsafe so that when the person driving it gets disconnected, the robot will stop. The only thing I can think of is to ping the person's browser or something, but I'm sure there is a better way. The robot is connected either via an ad hoc network or a regular wireless network that is connected to the internet. Obviously if I go with the ping method there will be a delay between the actual disconnection and the moment the robot realizes it has been disconnected; I'd like this delay to be as small as possible, whatever the method used. I'd appreciate any ideas on how to do this.
Pinging a web client is somewhat unreliable, for you have to take into account that the client's IP might change.
On the other hand, you could emulate a "dead man's button" via Ajax: let the web page send a defined command every now and then (maybe every 5 to 10 seconds), and if the robot doesn't receive the message for some time, it stops. The Ajax script can run in the background, so the controlling user won't even notice it.
This would of course mean that your robot needs a counter which is incremented every second and reset whenever the message is received. The moment the timer variable is too high: FULL STOP.
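On the PHP side this could be as simple as a heartbeat endpoint plus a watchdog loop; a sketch, where the heartbeat file, serial device, and stop character are all assumptions:

<?php
// heartbeat.php - hit by the control page via Ajax every few seconds.
file_put_contents('/tmp/robot_heartbeat', time());

and the watchdog that enforces the full stop:

<?php
// watchdog.php - run continuously on the onboard computer.
$limit  = 10;             // seconds without a heartbeat before full stop
$device = '/dev/ttyUSB0'; // hypothetical serial device

while (true) {
    $last = (int)@file_get_contents('/tmp/robot_heartbeat');
    if (time() - $last > $limit) {
        // Heartbeat is stale: send the (hypothetical) stop character.
        $serial = fopen($device, 'w');
        if ($serial) {
            fwrite($serial, 'S');
            fclose($serial);
        }
    }
    sleep(1);
}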
May I suggest you embed a simple Flash object in the web page to open a socket connection to a server on the robot? The server can be written in any suitable language, even PHP (cough).
Then it is a simple matter to detect immediately when the connection goes down and to implement your fail-safe approach.
HTTP is not an ideal protocol for robot control.
Good luck!
All I can think of is to include Ajax code in your HTML that "pings" your server every X seconds. I believe that's what Facebook Chat does to know whether or not you are still online.
HTML5 WebSockets might be the solution you are looking for, but you have to consider that they won't be supported by most of your users' browsers.
You might find this article interesting: http://www.infoq.com/news/2008/12/websockets-vs-comet-ajax.
