Apache2 not responding - php

I am trying to make Comet requests via Prototype/PHP as described here: http://www.zeitoun.net/articles/comet_and_php/start
But while the Comet connection is open, other pages from my project do not load in the same browser.
What can I do to get normal behaviour?
Thanks!

Comet works by keeping a connection open between the server and the client. Browsers have a maximum number of connections that they will allow a page to make (around 2 for IE), and I think they may also group all requests to the same domain together. That is why your other connections are not going through.
I believe it is not the server that is at fault here but the browser; using an iframe, as you mentioned, is the correct solution, but it's not the server's fault.
[Edit]
The simplest solution for you is to monitor focus. When the page has focus, open a connection; when focus is lost (i.e. the user switches tabs), close the connection and wait for focus again before updating the page. That way you will have the appearance of multiple pages updating while only needing one Comet connection at any time.
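A minimal sketch of that idea in plain JavaScript. Here createFocusManager and openCometConnection are hypothetical names standing in for however your Comet library starts and stops its long-lived request; only the open-on-focus / close-on-blur logic is the point:

```javascript
// Sketch: keep at most one Comet connection open, and only while the
// page has focus. openCometConnection() is a placeholder for whatever
// your Comet transport provides; it must return an object with close().
function createFocusManager(openCometConnection) {
  let connection = null;
  return {
    onFocus() {
      if (!connection) connection = openCometConnection();
    },
    onBlur() {
      if (connection) {
        connection.close();
        connection = null;
      }
    },
    isConnected() {
      return connection !== null;
    }
  };
}

// In the browser you would wire it up roughly like this:
// const mgr = createFocusManager(openCometConnection);
// window.addEventListener('focus', () => mgr.onFocus());
// window.addEventListener('blur',  () => mgr.onBlur());
```

The manager is deliberately idempotent: a second focus event while connected does nothing, so stray events cannot open duplicate connections.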

Related

Ratchet Server vs Long Polling

I am developing a website which has a chat feature and requires instant notifications when a user receives a new message. I am trying to decide whether to use a Ratchet server or implement a long polling system with AJAX.
I have currently implemented a basic Pub/Sub Ratchet server, which works fine for delivering notifications while a user stays on the "Chat" page of my site. My problem is that the connection is closed whenever the user moves to any other page, and I need to re-create the connection to the server. I am aware of possible workarounds, such as keeping the websocket connection inside an iframe that is always displayed, but I do not want to go down that path if I do not have to.
Would a better approach be to implement long polling with AJAX? I am concerned that if I continually re-create a user's connection to the Ratchet server whenever they change pages within the site, it will add too much overhead when the site is under heavy usage (thousands of users at a time). Does anyone have experience with Ratchet servers in this area?
Thanks.
As a disclaimer, I'm not very knowledgeable in this area, but you should not use long polling. You say you need "instant notifications", which means that with AJAX you would have to make very frequent requests, whereas with websockets you simply establish the connection and wait for data.
I haven't done any sort of testing, but it seems like establishing and maintaining one websocket connection per browser tab would be less overhead than constantly making AJAX requests, not to mention that you can't truly have "instant notifications" with AJAX because you'd have to make requests at an unsustainable rate to achieve that.
So open a websocket connection for each browser tab the user has open. If they open a new tab or browse to a different page, simply open the connection again.
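To make the reconnect-on-every-page-load painless, one common pattern is to wrap the WebSocket in a small helper that reopens it with a capped exponential backoff whenever it drops. This is a sketch, not Ratchet-specific code; the endpoint URL and the backoff numbers are arbitrary placeholders:

```javascript
// Compute a capped exponential reconnect delay:
// 500 ms, 1 s, 2 s, 4 s, ... up to maxMs.
function reconnectDelay(attempt, baseMs = 500, maxMs = 10000) {
  return Math.min(baseMs * Math.pow(2, attempt), maxMs);
}

// Sketch: open a WebSocket and transparently reopen it if it closes.
// 'url' would be your Ratchet endpoint, e.g. ws://example.com:8080/chat
// (placeholder, not from the question).
function connectWithRetry(url, onMessage) {
  let attempt = 0;
  function open() {
    const ws = new WebSocket(url);
    ws.onopen = () => { attempt = 0; };          // reset backoff on success
    ws.onmessage = (e) => onMessage(e.data);
    ws.onclose = () => {
      setTimeout(open, reconnectDelay(attempt++)); // retry after a delay
    };
  }
  open();
}

// Browser usage:
// connectWithRetry('ws://example.com:8080/chat', msg => console.log(msg));
```

The backoff matters under heavy load: if the server restarts, thousands of clients reconnecting on a spread-out schedule is far kinder than all of them hammering it at once.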

PHP and Multiple MySQL Connections

I have a webapp written in PHP that currently creates a DB connection (using mysqli_connect) on every page to pull data from the database.
Recently, my website has slowed down (along with some traffic increase), and I was wondering whether the creation of so many connections - one for every user on any page - is causing the slowdown?
Is there any fix for this?
Is it possible to create one shareable connection for the server? I know this is possible in Python, but I do not know how I would implement such a model in PHP.
Note: My site is on BlueHost... I don't know if that also makes a difference.
Well, two things to do:
1. Set up the slow query log in MySQL and see if there are queries that are slow. Optimize these slow queries by using the EXPLAIN command to identify missing indexes etc.
2. Set up connection pooling to eliminate opening a connection every time. See this link, Connection Pooling In PHP, for more information.
You can prefix the host with "p:" when passing it to mysqli_connect to open a persistent connection.
You can read about Persistent Connections from the links below:
http://us2.php.net/manual/en/features.persistent-connections.php
http://php.net/manual/en/mysqli.persistconns.php
http://us2.php.net/manual/en/function.mysql-pconnect.php
But I would strongly suggest NOT using persistent connections.
You will need to find out what is actually slowing your website down.
Ping a website that loads fast for you, e.g. google.com.
Then ping your web server; if the ping difference is large, ask BlueHost to resolve it.
Check whether there is any lag between your Web Server and Database Server. They could probably be:
on the same machine (mostly no lag)
on different machines in the same local network (mostly no lag unless there is a LAN problem, ask BlueHost)
on different machines in different networks (there could be issues, ask BlueHost if they could shift the Database Server within the same Local Network)
If everything above is fine and the pages are still loading slowly:
You could try explicitly calling mysqli_close() once the DB work in the page is done. This will free up your connection immediately instead of waiting for the page to finish executing. You will then need to figure out what is slowing the page after your DB work is over.
You can use microtime() on your slow pages to see what code is slowing them down.
You could use this link: Accurate way to measure execution times of php scripts

using server push methods without browser

I have a client device that requests a web page.
I am trying to send data to the client when a database table entry changes.
Problem: the client is not a "browser", i.e. client-side scripting won't do me any good here (it's a microcontroller).
Attempts: at first I was thinking of using PHP and the flush command. I could every so often output "waiting" to the client while still in a loop that checks the database for changes. This seems like a stretch of a method to me, as I don't think my server supports the function, and I don't really like it; it feels "dirty" :) ...
Next thought: have PHP constantly poll the database for changes in a loop. The client would wait until the server finishes, and thus I would have a stable connection for "as long as it takes for a change to happen" (optimistic, I know). If the connection does time out, I can have the client reconnect.
A bit of a silly stretch: is server-side JavaScript a thing? Yes, I asked... maybe there is something I don't know about.
I'm hoping someone here can help on this quest for knowledge.
Thanks JT
My client is currently:
Opening a socket using a TCP connection on port 8090, then opening a connection to my web site using that socket, the server address, and the server port (80). I'm not sure how to relate this type of socket to the kind I would need to stream data, very sparingly, to the client.
If you need to stick with the HTTP protocol (see comments on other possible methods), read about the meta refresh tag. This does what you asked without client-side scripting.
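For reference, the meta refresh tag is a one-liner in the page's head; the 10-second interval below is an arbitrary example value, and the client must of course parse the tag and re-request the page itself:

```html
<!-- Tells any HTTP client that honours it to re-request this page
     after 10 seconds; no client-side scripting involved. -->
<meta http-equiv="refresh" content="10">
```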
Another possible thing would be to setup your db updates as an RSS feed.
It does feel like better design not to use HTTP.
Non HTTP based notes:
1) Could you just do your current HTTP request, sleep a bit, then repeat the process? Nothing special or original, but this is fundamentally what the other options are doing.
2) Using the same socket, blocking reads would allow you to get more data as soon as it is available; the web server may need its config adjusted to act as a streaming media server.
As discussion, not a solution, have a look at streaming media

IP Connection limit exceeded

I used a PHP script to parse a remote XML file and print the output into a div on a web page. Since the output has to be synchronized with the currently playing track, I used JavaScript to reload the div content every 20 seconds. While testing the page I ran into an issue with my hosting and got the message "IP Connection limit exceeded"; the site was not accessible. I changed the IP to solve this. Is there a workaround to parse the metadata without hammering the server and running into web hosting issues?
<script>
setInterval(function() {
    $('#reload').load('current.php');
}, 20000);
</script>
Since a web page is a client-based entity, it is by nature unable to receive any data that it hasn't requested. That being said, there are a few options you may want to consider.
First, I don't know which web host you are using, but they should let you refresh the page (or make a request like you are doing) more often than once every 20 seconds, so I would contact them about that. Rate limiting aimed at denial-of-service attacks should only kick in at something more like 2 or 3 requests per second per connection. There could be a better answer for this that I'm just not seeing, but at first glance that's my take on it.
One option you may want to consider is a WebSocket, a new feature of HTML5 that lets the web server maintain an open connection with the visitor's browser and send packets of data back and forth. This removes the need for the browser to poll the server every 20 seconds. Granted, WebSockets are new, and I believe they only work in Safari and Chrome. I haven't experimented with them but plan to in the future.
In conclusion, I don't know of a better way than polling the server every so often to check for changes. Based on my browser's XMLHttpRequest tab, this is how gmail looks for new messages. If your host won't allow you more requests per time interval, perhaps decrease the frequency you are polling the server or switch to a different host.
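If polling is what the host allows, one cheap way to cut the request rate is to slow down while the tab is hidden, using the Page Visibility API (document.hidden). The helper below is a sketch; the 20 s / 120 s intervals are arbitrary choices, with only the 20 s figure taken from the question:

```javascript
// Sketch: choose the polling interval based on tab visibility.
// Visible tabs poll every 20 s (as in the question); hidden tabs
// back off to 120 s (arbitrary) to spare the server.
function pollDelay(hidden, visibleMs = 20000, hiddenMs = 120000) {
  return hidden ? hiddenMs : visibleMs;
}

// Browser wiring, replacing the fixed setInterval in the question:
// function tick() {
//   $('#reload').load('current.php');
//   setTimeout(tick, pollDelay(document.hidden));
// }
// tick();
```

Using a self-rescheduling setTimeout instead of setInterval also means a slow response can never cause requests to pile up, since the next request is only scheduled after the previous reload was issued.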

Jquery PHP push?

I need to implement real-time page data updates with PHP and jQuery.
(I found www.ape-project.org/ but the site seems to be down.)
Are there any other solutions?
Thanks!
You might want to check out Comet:
Comet is a web application model in which a long-held HTTP request allows a web server to push data to a browser, without the browser explicitly requesting it.[1][2] Comet is an umbrella term, encompassing multiple techniques for achieving this interaction. All these methods rely on features included by default in browsers, such as JavaScript, rather than on non-default plugins. The Comet approach differs from the original model of the web, in which a browser requests a complete web page at a time.
http://en.wikipedia.org/wiki/Comet_%28programming%29
If you want to do streaming (sending multiple messages over a single long-lived, low-latency connection), you probably need a Comet server. Check out http://cometdaily.com/maturity.html for details on a variety of server implementations (I am the maintainer of one of them - Meteor).
If you are happy to reconnect after each message is received, you can do without complicated servers and transports and just use long polling - where you make an ajax request and the server simply sleeps until it has something to send back. But you will end up with LOTS of connections hanging off your web server, so if you're using a conventional web server like Apache, make sure it's configured to handle that. By default Apache doesn't like having more than a few hundred concurrent connections.
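The long-polling client side of that can be as simple as re-issuing the request the moment the previous one returns. This sketch injects the fetch function as a parameter so the loop can be exercised without a real server; '/poll' is a placeholder URL, and the 1-second error pause is an arbitrary choice:

```javascript
// Sketch: long-polling client loop. fetchFn(url) should resolve with
// the server's (possibly long-delayed) response body. maxRequests
// exists mainly so the loop can be bounded for testing.
async function longPoll(url, fetchFn, onMessage, maxRequests = Infinity) {
  let count = 0;
  while (count < maxRequests) {
    try {
      // The server holds this request open (sleeps) until it has data.
      const data = await fetchFn(url);
      onMessage(data);
    } catch (err) {
      // Brief pause before retrying after a network error,
      // so a dead server isn't hammered in a tight loop.
      await new Promise(resolve => setTimeout(resolve, 1000));
    }
    count++;
  }
}

// Browser usage:
// longPoll('/poll', u => fetch(u).then(r => r.text()),
//          msg => console.log(msg));
```

Each completed response immediately triggers the next request, which is exactly what produces the many simultaneously open connections the answer warns about on the Apache side.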
There are lots of solutions for doing this...
It depends on what your data is, and on how your data is organized and stored (MySQL?).
Your question is too open-ended to have one real answer.
