I am using Google Chrome for my testing because, in the future, the Comet page will be loaded in an embedded Chrome.
After about 12 hours, I guess the Comet file gets too big and Chrome shows its official crash page.
How do I prevent Chrome from (as it seems) crashing after the page has been up for that long?
Do I have to refresh the iframe?
What I tried: every 2 minutes I run $('script').remove() on the Comet scripts, which I guess removes them from the DOM, but the file is still getting bigger...
Can anyone help? ^_^
I will provide as much code as needed if asked (JS or PHP).
I'd try to restart the Comet after a while. One hour, or less.
Delete the iframe you were using and appendChild() a new one.
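A minimal sketch of that restart (assuming the Comet stream lives in a hidden iframe; the id and endpoint URL are illustrative, not from your code):

// Tear down the Comet iframe and attach a fresh one so the browser
// can reclaim the memory held by the ever-growing forever-frame.
function restartComet() {
    var old = document.getElementById('comet-frame');
    if (old) {
        old.parentNode.removeChild(old); // drop the old iframe and its document
    }
    var frame = document.createElement('iframe');
    frame.id = 'comet-frame';
    frame.style.display = 'none';
    frame.src = '/comet.php'; // hypothetical Comet endpoint
    document.body.appendChild(frame);
}

// Restart once an hour, or less, as suggested above.
setInterval(restartComet, 60 * 60 * 1000);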
I've just finished a CakePHP website which works like a charm locally. So I put it online.
The problem is that, although pages are generated pretty quickly (around 400 ms), my browser shows the loading symbol for 5 long seconds.
My Firefox dev console shows that the browser is in the "receiving" state for EXACTLY 5 seconds, every time. The page is actually displayed well before that, but for some reason the connection stays open for precisely 5 seconds.
It's really annoying because, quite apart from the loading sign appearing even though the page is fully loaded and usable, AJAX calls last 5 seconds too. So AJAX content can't be displayed before the 5 seconds are up, which is obviously unacceptable for users.
I've tested the problem on several computers, browsers, and internet connections. I've also tested a vanilla CakePHP install on the same host and encountered the same problem.
So, do you have any idea which host setting could cause that? I believe this can only happen because the server is keeping the connection open, not the client. But I can't figure out the reason. I hope you will!
I had the same problem.
It's not the ideal solution, but you can force the PHP script to close the HTTP connection at the end of the AJAX action:
function AjaxEdit() {
    // ... do the work for the action ...

    // Ask the client to close the connection instead of waiting
    // on keep-alive.
    header("Connection: Close");
}
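On many setups the browser will keep waiting until the keep-alive timeout anyway, unless it also knows the response length. A fuller variant (my own sketch, not part of the original answer) buffers the output so the length can be announced before the body is flushed:

// Buffer the response so its exact length can be announced, then
// flush it; the client can disconnect as soon as the body arrives.
ob_start();
// ... echo the AJAX response here ...
$body = ob_get_clean();
header("Connection: close");
header("Content-Length: " . strlen($body));
echo $body;
flush();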
I run a PHP script that uses the Wikipedia API to locate Wikipedia pages about certain movies, based on a long list of titles and years of release. This takes 1-2 seconds per query on average, and I do about 5 queries per minute. It has been working well for years, but since February 11 it has suddenly become very slow: 30 seconds per query now seems to be the norm.
This is an example for a random movie in my list, and the link my script loads with file_get_contents():
http://en.wikipedia.org/w/api.php?action=query&prop=revisions&rvprop=content&format=yaml&rvsection=0&titles=Deceiver_(film)
I can put this link in my browser directly and it takes no more than a few seconds to load. So I don't think the Wikipedia API servers have suddenly become slow. When I load my PHP script from my web server in a browser, it takes between 20 and 40 seconds before the page is loaded and the result of one query is shown. When I run the script from the command line on my web server, I get the same slow loading times. My script still manages to save some results to the database now and then, so I'm probably not blocked either.
Other parts of my PHP scripts have not slowed down. There is a whole bunch of calculation done with the results of the Wikipedia API, and all of that still runs at regular speed. So my web server is still healthy; the load is always pretty low, and I'm not even using this server for anything else. I have restarted Apache, but found no difference in loading times.
My questions:
Has something changed in the Wikipedia API recently? Perhaps my way of using it is outdated and I need to use something new?
Where could I look for the cause of this slow loading? I've dug through error logs and tested what I could, but I don't even know where things go wrong. If I knew what to look for, I might be able to fix it easily.
git log --until=2013-02-11 --merges in operations/puppet.git shows no changes on that day, so this sounds like a DNS issue or something similar on your end.
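If you want to confirm that, here is a rough diagnostic sketch (the script is my suggestion; the URL is the one from the question) that times the phases of a single request with cURL. A large name-lookup time would point at DNS:

// Time the phases of one API request: a big NAMELOOKUP time points
// at DNS, a big CONNECT time at routing or firewall problems.
$url = "http://en.wikipedia.org/w/api.php?action=query&prop=revisions"
     . "&rvprop=content&format=yaml&rvsection=0&titles=Deceiver_(film)";

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);

printf("DNS lookup: %.3fs\n", curl_getinfo($ch, CURLINFO_NAMELOOKUP_TIME));
printf("Connect:    %.3fs\n", curl_getinfo($ch, CURLINFO_CONNECT_TIME));
printf("Total:      %.3fs\n", curl_getinfo($ch, CURLINFO_TOTAL_TIME));
curl_close($ch);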
I am trying to use AJAX to detect when a file or database has changed (i.e., has had an extra post added to it by another user) and to display the newest post (kind of like SO does).
And it worked, but the thing is, the host I'm using only allows a certain amount of "resource usage per hour", and once the limit is hit, the site is locked out for an hour. This is a free host I'm mainly using for testing and learning.
So before, I had the AJAX on a setInterval, checking every 2-4 seconds against a file that just echoed the last post made in the system. Which, I'm guessing, is what shut the site down for an hour in a matter of minutes.
So I'm wondering if there's any way to make it retrieve the newest post ONLY when the result has changed from what it last found. It sounds like that can't be done, since it would still have to check every time, invoking the PHP regardless of what gets sent back.
Any ideas how I can do this or something similar?
You could use WebSockets (http://en.wikipedia.org/wiki/WebSocket), but I guess not on your host, since there is an Apache extension you would have to install, or you can use long polling (http://en.wikipedia.org/wiki/Push_technology#Long_polling).
With long polling, you send one request to your PHP script, and the script loops until a new post is found, then sends the response.
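A minimal server-side sketch of that loop (the table, column, and parameter names are illustrative assumptions, not your actual schema):

// Long-polling endpoint: hold the connection open until a post newer
// than the client's last-seen id exists, then answer immediately.
$lastId = (int) $_GET["last_id"];
$pdo = new PDO("mysql:host=localhost;dbname=app", "user", "pass");

set_time_limit(70); // safety margin over the ~60s loop below
for ($i = 0; $i < 30; $i++) {
    $stmt = $pdo->prepare("SELECT id, body FROM posts WHERE id > ? ORDER BY id DESC LIMIT 1");
    $stmt->execute(array($lastId));
    $row = $stmt->fetch(PDO::FETCH_ASSOC);
    if ($row) {
        header("Content-Type: application/json");
        echo json_encode($row); // a new post appeared: respond right away
        exit;
    }
    sleep(2); // nothing new yet; wait before checking again
}
header("Content-Type: application/json");
echo json_encode(null); // timed out with nothing new; the client reconnects

The client still triggers one PHP request per connection, but one held connection per minute is far fewer hits than polling every 2-4 seconds.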
But you really should consider changing hosts, because real-time web applications need more resources. And why not test and learn locally, on your own machine?
I'm trying to convince a guy in IT that the PHP installation he gave me is corrupt (somehow). He upgraded to the latest PHP, and since then PHP pages have been slow, in a weird way.
If I visit a PHP page, the page displays instantly but the activity monitor in Safari shows activity for ~5 more seconds. When I use Safari's inspector, it looks like lots of things load, then there's a huge pause, then more things load. If I load a blank PHP page that doesn't reference CSS files or anything, it still sits for ~5 seconds.
Any ideas what could be causing this? It's hard for me to Google it because "PHP page loads slow" involves lots of different scenarios.
Is there a way I could pinpoint why PHP page loads take longer than they did before the upgrade?
If you set up a sample page with only the following in it, do you get the same behavior? This will show whether PHP itself works. phpinfo(); should run and display in the browser without much delay; if it does, the slowdown is happening somewhere in content delivery, and you can work outward from there to determine what is causing it.
<?php phpinfo(); ?>
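To narrow it down further, here is a small sketch of my own (assuming PHP 5.4+, which provides $_SERVER["REQUEST_TIME_FLOAT"]) that logs how long PHP itself spends on the request. If that number is tiny but the browser still waits ~5 seconds, the pause is happening after PHP (keep-alive, output buffering, server config) rather than inside it:

<?php
// Log the time PHP spent handling this request, from the moment
// the request started to the end of the script.
register_shutdown_function(function () {
    $elapsed = microtime(true) - $_SERVER["REQUEST_TIME_FLOAT"];
    error_log(sprintf("PHP handled request in %.4fs", $elapsed));
});
phpinfo();
?>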
I have a WordPress blog that is having serious performance issues (around 10 s to load each page). I installed WP Super Cache to try to solve the problem, but the first user to visit a page after its cache expires still has to wait 10 s for it to load. Once it is cached, the site speed is normal.
So, to try to fix this, I configured preload mode to run every 30 minutes, but something is not working, because once the cache expires the first user still waits 10 s for each page...
I configured the cache to last 1 hour (3600 s) and the preload to run every 30 minutes, so there should always be a cached version of any page users request... but no :(
I would REALLY appreciate some help with this, as I don't know what else to do.
Thanks in advance!
Juan
Sometimes plugins are poorly written and hog resources. Disable every plugin and see if the site runs okay, then re-enable them one by one until you find the source of the problem; get rid of the offending plugin and find a replacement.
Install Firebug and use its "Net" tab to see what is taking long to load. It can be anything: scripts, external scripts, images from external sites, the DB connection, etc.
Identify the issue and it will be easy for you to solve.
If caching fixes the problem, then your likely culprit is poorly written code (lots of error suppression, etc.).
An alternative issue is the server the code is hosted on (not as likely, but a possibility). If the server is having issues, or is running out of memory, it may respond slower in delivering content.
Do what the others say.
Then also consider adding multi-stage caching at different rates: cache DB results at one rate, large page fragments at another, and the whole page at a third. That way no one visitor has to regenerate it all in one shot. In theory.
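For the middle layer, a minimal sketch using the WordPress Transients API (the key name, query, and 10-minute lifetime are illustrative assumptions):

// Cache an expensive query's HTML fragment at its own rate,
// independently of the full-page cache.
function get_popular_posts_html() {
    $html = get_transient("popular_posts_html");
    if ($html === false) {
        $query = new WP_Query(array(
            "posts_per_page" => 5,
            "orderby"        => "comment_count",
        ));
        $html = "<ul>";
        while ($query->have_posts()) {
            $query->the_post();
            $html .= "<li>" . esc_html(get_the_title()) . "</li>";
        }
        $html .= "</ul>";
        wp_reset_postdata();
        set_transient("popular_posts_html", $html, 600); // keep for 10 minutes
    }
    return $html;
}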
The behaviour described is completely normal.
Cache misses will be slow; this is expected. Set a cache without an expiry if you want to hit the cache 100% of the time (this is far from recommended).
Use an opcode cache if you can, such as APC.