I am trying to use AJAX to detect when a file or database has changed (i.e. another user has added a post) and display the newest post (kind of like SO).
And it worked, but the thing is the host I'm using only allows a certain amount of "resource usage per hour" and once the limit is hit, the site is locked out for an hour. This is a free host I'm mainly using for testing and learning.
So before, I had the AJAX set to a setInterval check every 2-4 seconds against a file that just echoed the last post made in the system, which I'm guessing is what shut down the site for an hour in a matter of minutes.
So I'm wondering if there's any way to make it retrieve the newest post ONLY when the result has changed from what it last found. It sounds like that can't be done, because it would still have to check every time, activating the PHP, regardless of what it sends back.
Any ideas how I can do this or something similar?
You could use WebSockets (http://en.wikipedia.org/wiki/WebSocket), though probably not on your host, since there is an Apache extension you have to install, or you could use long polling (http://en.wikipedia.org/wiki/Push_technology#Long_polling).
With long polling you send one request to your PHP script, and the script loops until a new post is found, then sends the response.
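For illustration, a minimal long-polling endpoint might look something like this (the table name, columns, and credentials are all made up, and note the sleep() in the loop so you don't hammer the database, which is likely what ate your resource quota):

<?php
// long-poll.php - a minimal long-polling sketch (assumed schema: posts(id, body))
$lastId = (int)($_GET['last_id'] ?? 0);   // ID of the newest post the client already has
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');

$deadline = time() + 25;                  // give up before the server/proxy timeout
while (time() < $deadline) {
    $stmt = $pdo->prepare('SELECT id, body FROM posts WHERE id > ? ORDER BY id');
    $stmt->execute([$lastId]);
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
    if ($rows) {                          // something new: answer immediately
        header('Content-Type: application/json');
        echo json_encode($rows);
        exit;
    }
    sleep(1);                             // crucial: don't hammer the database
}
header('Content-Type: application/json');
echo json_encode([]);                     // timed out with no news; the client just re-polls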
But you should really consider changing hosts, because real-time web applications need more resources. Why not test and learn locally on your own machine?
Related
I developed a site using Zend Framework 2. It is basically a price comparison site that integrates with many of the top affiliate networks out there. I wrote a script that checks prices from each affiliate network, and then updates my local DB with that price. Depending on which affiliate network I am contacting, I may be making an API call (Amazon or CJ.com), or I may be looking at an XML product feed (Pepperjam or LinkShare). The XML product feed would be hosted locally.
At present, there are around 3,500 SKUs that I am checking with this script. The vast majority of them (95%+) are targeting an XML product feed. I would estimate that this script should take in the neighborhood of 10 minutes to complete. Some of the XML files I am looking at are around 8 MB in size.
I have tested this script thoroughly in my local environment and gone to great lengths to make sure that there is no memory leak or anything of that nature which would cause performance issues. As an example, I made sure to use data streams where possible to avoid loading the XML file into memory over and over, etc. Suffice it to say, the script runs locally without issue.
This script is intended to be run as a cron job, however I do have a way to trigger it via the secure admin interface ad-hoc. Locally, this is how I initiate the script to run, and everything goes rather smoothly.
When I deploy my code to the shared hosting account, I am having all sorts of problems. In order to troubleshoot, I attached logging to various stages of this script to track when it starts, how it progresses, and when each step completes, etc. All of this is being logged to a MySQL database.
Problem #1: If I run the script ad-hoc via an HTTP request, I find that it will run for a couple of minutes, and then the script starts again (so there are now two instances apparently running). Wait another couple of minutes, and a third one will start, etc. Here is an example from when I triggered the script to run at 10:09pm via an HTTP request.
Screenshot of process manager
Needless to say, I DO NOT run it via an HTTP request because it only serves to get me in trouble with my web hosting provider :)
Problem #2: When the script runs on the server, triggered via a cron job, it fails to complete. I have taken the production copy of the database locally along with the XML files, and it runs fine, so it should not be a problem with bad data exposing bad code. My observation is that the script runs for nearly the exact same amount of time before it aborts, or is terminated, or whatever. The last record updated is generally timestamped around 4 minutes and 30 seconds or so (if memory serves) after the script is triggered. The SKU list is constantly changing, so the record it ends on differs, but the time of the last update is nearly the same each time. Nothing is being logged in the error logs. I monitored server resources via the SSH top command and there is nothing out of the ordinary: CPU usage is in check and memory usage does not go up.
I have a shared hosting account through Bluehost. My thoughts were that perhaps it was a script max execution time issue. I extended the max execution time in the script itself and via php.ini. Made no difference.
So I guess what I am looking for is some fresh ideas of where to go next. What questions should I be asking my hosting company so they can help me get to the bottom of this. They are only somewhat helpful to say the least. Could it be some limitation on my hosting account? Triggering some sort of automatic monitor that is killing the script? What types of Apache settings could be problematic for a script of this nature? PHP.ini settings? Absolutely any input you can provide would be helpful.
And why, when triggered via HTTP, would it keep spinning up new instances? I guess I could live without running it manually and only run it via a cron job, but that isn't working either. So... interested in hearing the community's thoughts on this. Thanks!
I haven't seen your script, nor have I worked with your host, so everything below is just a guess - and a suggestion.
Given your description, I would say you're right that your script might have been killed by a timeout when run from cron. I'm not sure why it keeps spawning new instances of your script when you execute it manually via an HTTP request, but that may also be related to a timeout (e.g. if they have logic that restarts a script that has not produced any output within a certain time, or something like that).
You can follow up with your hosting provider about running long-running (or memory-consuming) scripts in their environment; they might already have an FAQ or document that covers this topic.
Let me suggest an option in case your provider is unable to help.
From what you said, I expect your script runs an SQL query to get a list of SKUs, and then slowly iterates over this list, performing some job on every item (and eventually dies for whatever reason, as we learned).
How about creating a temporary table (or a file - just any kind of persistent storage on the server) that saves the last processed record ID, or NULL if the script completed successfully? That way you can make your script start with the last processed record (if the last processed record had id = 1000, add ... WHERE id > 1000 to the main query that fetches SKUs), and you won't really care whether the script completed its first attempt or not (if not, it will pick up processing from the very point where it was killed on its second try).
Alternatively, to extend this approach, you can limit each invocation to a certain number of records to process (e.g. 100 or 1000), again saving the last processed record ID in the database or somewhere else.
The main idea is: if the script fails to process all SKUs at once, just make it restartable so that it does not lose its progress.
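As a rough sketch of that idea (all table, column, and function names below are made up for illustration):

<?php
// batch.php - restartable SKU processor
// Assumed schema: skus(id, ...) and a one-row table checkpoint(last_id)
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');

// Resume from wherever the previous (possibly killed) run stopped
$lastId = (int)$pdo->query('SELECT last_id FROM checkpoint')->fetchColumn();

$stmt = $pdo->prepare('SELECT id FROM skus WHERE id > ? ORDER BY id LIMIT 1000');
$stmt->execute([$lastId]);
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

foreach ($rows as $row) {
    update_price($row['id']);  // hypothetical helper: your per-SKU price check
    // Record progress after every item, so a killed run loses nothing
    $pdo->prepare('UPDATE checkpoint SET last_id = ?')->execute([$row['id']]);
}

if (count($rows) < 1000) {
    // Fewer than a full batch means we reached the end; start over on the next run
    $pdo->exec('UPDATE checkpoint SET last_id = 0');
}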
I'm making an app for my website for visitors to look through the camera.
I own a PT (pan-tilt) camera which can be operated using URLs.
I want the camera to move randomly at fixed intervals (e.g. a different position every 5 seconds) in the background, so it moves without any operator, but I can't seem to figure out how to make it move automatically.
The manufacturer works with CGI commands like:
myip:myport/decoder_control.cgi?command=39&user=user&pwd=password
(this code makes it go to preset 1).
How can I make the camera move with this command using server-side PHP, making it move every 5 seconds?
Running the CGI script from PHP.
You can perform an HTTP request from PHP, that would load the URL corresponding to the command, causing the camera to change position. Some ways of achieving this:
Using the function http_get: PHP: http_get – Manual.
Using cURL.
Using file_get_contents for very basic requests: question on SO.
If you just need to perform a GET request, and the response is empty (e.g. you just need to check for the 200 OK code) or contains some very simple data (e.g. a string), then file_get_contents is more than enough.
If you don't have any background on how HTTP requests work, Wikipedia could be a good introduction, especially if later on you have more complex CGI commands to send to your PT cam.
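For example, a one-off command via file_get_contents could look like this (host, port, and credentials are the placeholders from the question; this assumes allow_url_fopen is enabled in php.ini):

<?php
// Send one CGI command to the camera (goes to preset 1, per the manufacturer's docs)
$url = 'http://myip:myport/decoder_control.cgi?command=39&user=user&pwd=password';
$response = file_get_contents($url);
if ($response === false) {
    // The request failed (camera unreachable, wrong credentials, ...)
    error_log('Camera command failed');
}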
Make the camera move every 5 sec.
This is a completely different matter. The problem here is running PHP code periodically and automatically.
You can schedule the PHP script to be executed using a cron job (Cron, crontab), and this question explains how. BUT cron's minimal time resolution is one minute; also, moving a camera every 5 seconds doesn't really sound like scheduling a job, it sounds more like something that should be handled by a system service.
What you could do is move the camera from the PHP script users use to watch: store the last update time in a file/database, and if the elapsed time is >5s, run the CGI command.
This would keep your camera still unless someone is actually watching. Other problems might arise; for example, what if many users are visiting the same page and your server serves the requests simultaneously? You might get several consecutive commands sent to the camera. Moreover, while the users are watching, staying on your PHP page, you must again find a way of moving the camera every 5s.
A possible solution.
Create a PHP script that, when loaded, runs the CGI command only if at least 5s have passed since the last call (by storing the time of the last call); see the sketch below.
Create a client page for your users that, via JavaScript, loads that PHP script every 5s. Search for "JavaScript GET request"; you will find enough information to fill a book.
Again, this would generate a lot of traffic on your webserver, just for those five seconds of panning. My suggestion is that the movement should be handled by a server-side program, not a script.
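A sketch of that throttling script (the stamp-file path is arbitrary, and the command URL is the one from the question):

<?php
// move-camera.php - runs the CGI command at most once every 5 seconds
$stamp = __DIR__ . '/last_move.txt';  // plain file whose mtime records the last call

if (!file_exists($stamp) || time() - filemtime($stamp) >= 5) {
    touch($stamp);  // update the timestamp first, to limit race conditions
    // The command from the question (preset 1); vary the command number to randomize
    file_get_contents('http://myip:myport/decoder_control.cgi?command=39&user=user&pwd=password');
}
echo 'ok';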
I've finally made a simple chat page that I had wanted to make for a while now, but I'm running into problems with my servers.
I'm not sure if long polling is the correct term, but from what I understand, I think it is. I have an AJAX call to a PHP page that checks a MySQL database for messages with times newer than the time sent in the AJAX request. If there isn't a newer message, it keeps looping and checking until there is; otherwise, it just returns the new messages, and the client script sends another AJAX request as soon as it gets them.
Everything is working fine, except for the part where the server on 000webhost stops responding after a few chat messages, and the server on x10 hosting gives me a message about hitting a resource limit.
Maybe this is a dumb way to do a chat system, but it's all I know how to do. If there is a better way please let me know.
edit: Holy hell, it's just occurred to me that I didn't put any sleep time in the while loop on the server.
You can find a lot of reading on this, but I doubt that free web hosting is going to allow you to do what you're thinking of doing. PHP was also not really designed for building chat systems.
I would recommend using WebSockets with, for example, Node.js and Socket.IO, or Tornado with Python. There are a lot of solutions out there, but most of them would require you to run your own server, since they involve running a long-lived program that interacts with many connections at once, instead of simple scripts that just start and finish with a single connection.
What about using the same strategy whether or not there are newer messages on the server? The server would always return a list of newer messages; this list could be empty when there are none. The empty list could also be encoded as a special data token.
The client then proceeds in both cases the same way: it processes the received data and requests new messages after a time period.
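A sketch of what that server-side script could look like (schema and credentials are made up):

<?php
// messages.php - always returns a JSON list, possibly empty
$since = (int)($_GET['since'] ?? 0);  // timestamp of the newest message the client has
$pdo = new PDO('mysql:host=localhost;dbname=chat', 'user', 'pass');

$stmt = $pdo->prepare('SELECT author, body, created FROM messages WHERE created > ?');
$stmt->execute([$since]);

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));  // [] when there is nothing new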
Make sure you sleep(1) on each pass of the loop; otherwise the code will go through the loop several times per second, stressing your database/server.
But still, Node.js or WebSockets are better technologies for dealing with real-time chat.
As my server does not support cron jobs, I want a file on my server to trigger an action at a particular time every day.
Please let me know whether it is possible to run a script at a particular time from the server side itself, without any external action.
I agree with Kel's answer.
You could try out one of the free cronjob services available, if your server doesn't support it.
Online Cronjobs
Set Cronjob
Just the first two found on Google, there's likely to be more if you search a little.
You cannot start a script without ANY external action.
If your file server has an SSH or HTTP server or something like that, you can configure a cron job on another server to start your script via SSH / HTTP / whatever it supports.
Also, you can create a PHP script that sleeps in a loop all the time, waking up to do some work only when the current time is near some specific value. You will have to raise the maximum execution time for the PHP script (see here for details), and you will have to start your script on server startup. BTW, this does not look like a good solution.
As mentioned before, this is not possible literally "without external act".
A nice solution I found in the ThinkUp software (I don't know where else it is used) is to use an RSS feed reader. From the point of view of simplicity, this is probably the best option.
The idea is that you use your feed reader to automatically call a script on your site every XX hours (or whatever interval you want). When called, this script executes the maintenance tasks or whatever it is that you want to do.
To make sure that not everybody can run that script and cause your server to break down (I suppose this is a somewhat heavy task), you can use a unique, long identifier string appended as a URL parameter to make sure that the script only gets called by you.
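The check itself is only a couple of lines at the top of the maintenance script (the token value is obviously a placeholder you'd replace with your own random string):

<?php
// maintenance.php - only runs when called with the correct secret token,
// e.g. http://example.com/maintenance.php?token=long-random-secret-string
if (!isset($_GET['token']) || $_GET['token'] !== 'long-random-secret-string') {
    http_response_code(403);
    exit('Forbidden');
}
// ... maintenance tasks go here ...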
Other than that, you can use one of the "poor man's" web cron job services that have been suggested in other answers.
if (rand(0, 100) == 0) {
    $tf = 'tmp/job.crontime';
    if (!file_exists($tf) || (time() - filemtime($tf)) > (60 * 60 * 24)) {
        ... # your tasks
        touch($tf);
    }
}
This simple & stupid script uses a file to store the time of the last job execution. If more than 60*60*24 seconds have passed, it launches the job code. The rand(0,100) check lowers the overhead of checking for jobs on every request: there is roughly a 1-in-100 chance of checking for jobs on any given request.
Put it at the end of your 'index.php'. Don't use this in projects with moderate to high load :))
The Great Disadvantage: it won't run if you don't have any visitors.
UPD: Write a script that runs indefinitely and does touch('tmp/job.crontime') every 30s to report that it's still alive. It should also check the current time and perform the scheduled actions.
In index.php, if more than 30s has passed since the file was touched, re-launch the daemon with an HTTP request. Ugly, but fully functional. You'll also have to deal with execution time limits, so be careful!
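A sketch of such a daemon (run_daily_job() is a made-up placeholder for your actual tasks):

<?php
// daemon.php - keeps itself alive and runs a job once per day
set_time_limit(0);               // we intend to run until killed
$lastRun = null;

while (true) {
    touch('tmp/job.crontime');   // heartbeat: index.php checks this file's mtime
    $today = date('Y-m-d');
    if (date('H') >= 3 && $lastRun !== $today) {  // run once per day, after 03:00
        run_daily_job();         // hypothetical function holding your tasks
        $lastRun = $today;
    }
    sleep(30);
}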
Well, if this is on a public web server and you have enough visits, you could always use those to run code that checks for a given value, say the hour of day or the number of times a file has been accessed (or store your counter in a file). Just put your PHP code at the top of a web page.
I'm currently running a Linux based VPS, with 768MB of Ram.
I have an application which collects details of domains and then connect to a service via cURL to retrieve details of the pagerank of these domains.
When I run a check on about 50 domains, it takes the remote page about 3 minutes to load with all the results before the script can parse the details and return them to my script. This causes a problem, as nothing else seems to function until the script has finished executing, so users on the site just get a timer / 'ball of death' while waiting for pages to load.
(The remote page retrieves the domain details and updates itself via AJAX, but the cURL request doesn't (rightfully) return the page until loading is complete.)
Can anyone tell me if I'm doing anything obviously wrong, or if there is a better way of doing it. (There can be anything between 10 and 10,000 domains queued, so I need a process that can run in the background without affecting the rest of the site)
Thanks
A more sensible approach would be to "batch process" the domain data via the use of a cron triggered PHP cli script.
As such, once you'd inserted the relevant domains into a database table with a "processed" flag set as false, the background script would then:
Scan the database for domains that aren't marked as processed.
Carry out the CURL lookup, etc.
Update the database record accordingly and mark it as processed.
...
To ensure no overlap with an existing executing batch processing script, you should only invoke the PHP script every five minutes from cron and (within the PHP script itself) check how long the script has been running at the start of the "scan" stage, exiting if it's been running for four minutes or longer. (You might want to adjust these figures, but hopefully you can see where I'm going with this.)
By using this approach, you'll be able to leave the background script running indefinitely (as it's invoked via cron, it'll automatically start after reboots, etc.) and simply add domains to the database/review the results of processing, etc. via a separate web front end.
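A skeleton of such a cron-triggered script might look like this (table, column, and function names are all illustrative):

<?php
// process_domains.php - invoked from cron every five minutes, e.g.:
// */5 * * * * php /path/to/process_domains.php
$started = time();
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$stmt = $pdo->query('SELECT id, domain FROM domains WHERE processed = 0');
foreach ($stmt as $row) {
    if (time() - $started > 240) {  // stop after 4 minutes to avoid overlapping runs
        exit;
    }
    $rank = fetch_pagerank($row['domain']);  // hypothetical: your cURL lookup
    $upd = $pdo->prepare('UPDATE domains SET pagerank = ?, processed = 1 WHERE id = ?');
    $upd->execute([$rank, $row['id']]);
}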
This isn't the ideal solution, but if you need to trigger this process based on a user request, you can add the following at the end of your script.
ignore_user_abort(true); // assumption: without this, PHP may stop when the client disconnects
set_time_limit(0);       // remove the execution time limit
flush();                 // push buffered output to the browser
This will allow the PHP script to continue running while still returning output to the user. But seriously, you should use batch processing; it will give you much more control over what's going on.
Firstly, I'm sorry, but I'm an idiot! :)
I've loaded the site in another browser (FF) and it loads fine.
It seems Chrome puts some sort of lock on a domain when it's waiting for a server response, and I was testing the script manually through a browser.
Thanks for all your help and sorry for wasting your time.
CJ
While I agree with others that you should consider processing these tasks outside of your webserver, in a more controlled manner, I'll offer an explanation for the "server standstill".
If you're using native PHP sessions, PHP uses an exclusive locking scheme, so only a single PHP process can deal with a given session ID at a time. Having a long-running PHP script which uses sessions can certainly cause this.
You can search for combinations of terms like:
php session concurrency lock session_write_close()
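If session locking is indeed the culprit, releasing the lock before the long-running work usually fixes it; a minimal sketch:

<?php
session_start();
// ... read whatever you need from $_SESSION here ...
session_write_close();  // release the lock so other requests for this session can proceed
// ... the long-running cURL work goes here; $_SESSION is effectively read-only now ...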
I'm sure it's been discussed many times here. I'm too lazy to search for you. Maybe someone else will come along and make an answer with bulleted lists and pretty hyperlinks in exchange for Stack Overflow reputation :) But not me :)
good luck.
I'm not sure how your code is structured, but you could try using sleep(). That's what I use when batch processing.