So I have a websocket server set up and running in PHP (CodeIgniter, to be exact, though that shouldn't matter). What I would like to do is have the server run "clean up" functions every n seconds, without the use of cron jobs if at all possible. Basically, I want the websocket server function that is already running to check for users that haven't done anything in x amount of time and "kick" them automatically by closing their socket.
From what I've seen all over the web, the only way to perform a server action is once user input is received; there doesn't seem to be a way to run a server function automatically.
Ideas?
Thanks!
Probably the best way to do this would be to implement a daemon.
Here is a tutorial and a great class to get you started: http://kevin.vanzonneveld.net/techblog/article/create_daemons_in_php/
I use this class on a few daemons that do similar tasks. They run for weeks, and I don't have any trouble. You do need to be mindful though of your memory usage. Make sure you don't have any anonymous functions that never get killed off by the garbage collector, for example.
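For the websocket case, the daemon's main loop can do the idle check itself. Here is a minimal sketch, assuming a hypothetical $clients array that stores each connection's socket and a lastActivity timestamp (those names and the timeout value are illustrative, not part of the class above):
<?php
// Illustrative clean-up loop for a websocket daemon.
// $clients, 'lastActivity' and the timeout are assumptions; adapt to your server's own bookkeeping.
$idleTimeout = 300; // kick users idle for more than 5 minutes

while (true) {
    foreach ($clients as $id => $client) {
        if (time() - $client['lastActivity'] > $idleTimeout) {
            socket_close($client['socket']); // "kick" the idle user
            unset($clients[$id]);
        }
    }
    sleep(10); // run the clean-up pass every 10 seconds
}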
I don't know anything about your "socketserver", but it sounds like a server written in Java or something similar, and I'd assume that you have either written it yourself (or at least have the source). What you could do is start a thread running something like the following (this is written in PHP, because I don't know Java very well):
while (true) {
    sleep(10); // wait 10 seconds between passes (a busy-wait loop comparing time() would do the same but peg the CPU)
    clean();
    up();
    functions();
}
I have a code structure like this:
echo "First Operation";
sleep(3);
echo "Second Operation";
I want to simulate multi-threading, but I could not make it work; the sleep function just blocks there, and I could not find another option.
I have to agree that PHP is not built for multithreading. As such, the script you create will always run in one thread. There are some exceptions to that: when you call a shell command, there is the possibility to fire-and-forget the command when you don't need the output. It runs in parallel, but you cannot access the result.
PHP is designed to execute one instruction after the other.
To create something like a parallel working script, you need to take advantage of some external systems. All of those tricks depend on what you actually want to accomplish. Let me give you an example.
You could use a message queueing system like RabbitMQ to act as a separation between one script and the other. Let's say script1 is permanently listening on the queue for work to do. Another script, let's say script2, pushes (on request) a job into the queue and continues with other stuff while script1 picks it up, does something and returns the answer. When script2 is done with the other stuff, it can read the result from the queue and finish whatever you want to do.
This is actually a very tricky and theoretical concept, but it might give you an idea. But without any external component like the queue, php will not work multithreaded.
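A minimal sketch of that pattern, assuming the php-amqplib client and a queue named work (both are illustrative choices, not requirements):
<?php
// script1.php - permanently listens on the queue for work to do (php-amqplib assumed)
require __DIR__ . '/vendor/autoload.php';

use PhpAmqpLib\Connection\AMQPStreamConnection;

$connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
$channel    = $connection->channel();
$channel->queue_declare('work', false, true, false, false);

$channel->basic_consume('work', '', false, true, false, false, function ($msg) {
    // do the actual work here, then store/publish the result where script2 can read it
    echo "Working on: " . $msg->body . "\n";
});

while ($channel->is_consuming()) {
    $channel->wait();
}

<?php
// script2.php - pushes a job into the queue and carries on with other stuff
require __DIR__ . '/vendor/autoload.php';

use PhpAmqpLib\Connection\AMQPStreamConnection;
use PhpAmqpLib\Message\AMQPMessage;

$connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
$channel    = $connection->channel();
$channel->queue_declare('work', false, true, false, false);

$channel->basic_publish(new AMQPMessage('job payload'), '', 'work');
// ... continue with other stuff while script1 does the work ...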
I hope that is helping you!
Is there a way to execute a function at a regular interval?
I have a database table and I need to know when an entry is added or removed. The logic I am trying to use is: Ajax makes a call to the server, but instead of responding immediately, the server keeps checking for up to 30 seconds whether the database has been updated; if it has, only then does it respond, otherwise it responds after the 30 seconds are up. This way I am trying to minimize the load on the server compared to firing an Ajax request every second.
How do I do this? Does using a while loop make sense? Something like this, maybe:
while (SomeCondition)
{
if (CheckIfDatabaseChanged())
{
echo "System Updated";
break;
}
}
If this is a sensible solution, then how can I make sure that the loop runs for only 30 seconds and then breaks? Or is there a better solution?
What you are thinking of is called long-polling, and it does not scale well in PHP, especially when you use blocking IO.
See https://stackoverflow.com/a/6488569/11926 for some more information.
But your code could look something like this
set_time_limit(31); // note: the function is set_time_limit(), not set_timeout_limit()
$i = 0;
while ($i < 30) {
    // poll the database for changes here, which is a bad idea
    $i++;
    sleep(1); // sleep 1 second
}
I bet you cannot run many of these concurrently. My advice would be to use something like Redis pub/sub to get notified of database changes, and some kind of long-polling/websocket solution instead.
If possible you should spawn a background process that subscribes to database changes and then publishes them to Pusher, for example, because having multiple long-running web processes is really bad for performance.
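A rough sketch of that split, assuming the phpredis extension and a channel named db-changes (both illustrative):
<?php
// publisher side: the background process announces a change it has detected
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);
$redis->publish('db-changes', json_encode(['table' => 'messages', 'id' => 42]));

// subscriber side: a long-running process that forwards updates to waiting clients
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);
$redis->subscribe(['db-changes'], function ($redis, $channel, $message) {
    // hand $message to your long-polling/websocket layer here
    echo "Change on $channel: $message\n";
});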
You could host both of them yourself or use hosted services like for example:
Redis:
http://redistogo.com
Long-polling / Websocket:
http://pusher.com
They both have small free plans which could get you started and when you get too big for these plans you could think about hosting these solutions for yourself.
P.S.: I have also found a non-blocking solution in PHP called React (ReactPHP). This solution might scale better in PHP.
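For reference, a tiny sketch of a periodic, non-blocking check with ReactPHP's event loop (the Factory-based API; treat the details as illustrative):
<?php
require __DIR__ . '/vendor/autoload.php';

$loop = React\EventLoop\Factory::create();

// non-blocking: the loop keeps handling other events between checks
$loop->addPeriodicTimer(1.0, function () {
    // check for changes here and push them to waiting clients
});

$loop->run();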
Use this:
set_time_limit(0)
http://se.php.net/manual/ru/function.set-time-limit.php
How can I make a scheduler in PHP without writing a cron script? Is there any standard solution?
Feature (for example): send a reminder to all subscribers 24 hours before their subscription expires.
The standard solution is to use cron on Unix-like operating systems and Scheduled Tasks on Windows.
If you don't want to use cron, I suppose you could try to rig something up using at. But it is difficult to imagine a situation where cron is a problem but at is A-OK.
The solution I see is a loop (for or while) and a sleep(3600*24);
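A bare-bones sketch of that approach, where sendReminders() is a hypothetical function that notifies subscribers expiring within the next 24 hours:
<?php
set_time_limit(0);    // let the script run indefinitely
while (true) {
    sendReminders();  // hypothetical: notify subscribers whose plan expires in the next 24 hours
    sleep(3600 * 24); // wait a day before the next pass
}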
Execute it by sending an Ajax call from JavaScript at whatever interval you choose.
Please read my final opinion at the bottom before rushing to implement.
Cron really is the best way to schedule things. It's simple, effective and widely available.
Having said that, if cron is not available or you absolutely don't want to use it, two general approaches for a non-cron, Apache/PHP pseudo-cron running on a traditional web server are as follows.
Check using a loadable resource
Embed an image/script/stylesheet/other somewhere on each web page. Images are probably the best supported by browsers (if JavaScript is turned off there's no guarantee that the browser will even load .js source files). This page will send headers and empty data back to the browser (a 1x1 clear .gif is fine; look at fpassthru).
From the PHP manual notes:
<?php
ignore_user_abort(true); // keep running even after the client disconnects
header("Content-Length: 0");
header("Connection: close");
flush();
// browser should be disconnected at this point
// and you can do your "cron" work here
?>
Check on each page load
For each task you want to automate, you would create some sort of callable API - static OOP, function calls - whatever. On each request you check to see if there is any work to do for a given task. This is similar to the above except you don't use a separate URL for the script. This could mean that the page takes a long time to load while the work is being performed.
This would involve a select query to your database on either a task table that records the last time a task has run, or simply directly on the data in question, in your example, perhaps on a subscription table.
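As a sketch of that check, assuming a hypothetical tasks table with name, last_run and interval_seconds columns (the schema and PDO usage are illustrative):
<?php
// Run on each request: execute any task whose interval has elapsed since last_run.
function runDueTasks(PDO $db)
{
    $tasks = $db->query("SELECT name, last_run, interval_seconds FROM tasks")->fetchAll(PDO::FETCH_ASSOC);
    foreach ($tasks as $task) {
        if (time() - strtotime($task['last_run']) >= $task['interval_seconds']) {
            // ... do the actual work for $task['name'] here ...
            $stmt = $db->prepare("UPDATE tasks SET last_run = NOW() WHERE name = ?");
            $stmt->execute([$task['name']]);
        }
    }
}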
Final opinion
You really shouldn't reinvent the wheel on this if possible. Cron is very easy to set up.
However, even if you decide that, in your opinion, cron is not easy to set up, consider this: for each and every page load on your site, you will be incurring the overhead of checking to see what needs to be done. True cron, on the other hand, will execute command line PHP on the schedule you set up (hourly, etc) which means your server is running the task checking code much less frequently.
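For comparison, the true-cron version is a single crontab entry (the path and hourly schedule are just examples); it runs the task-checking script regardless of site traffic:
0 * * * * /usr/bin/php /path/to/check_tasks.php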
Biggest potential problem without true cron
You run the risk of not having enough traffic to your site to actually get updates happening frequently enough.
Create a cronjob table that stores the date on which each job should run. Then add a condition: if today's date equals a date in the cronjob table, call the method that executes the job. This works much like a CRON job.
Is it possible to run a PHP script every 100 ms? This script will check a database for changes, and the changes will then be reflected into another database. I am already doing it using triggers, but I want to know if there is any other way to do it without using cron or triggers. I will be using Linux for this purpose.
Thanks
Running something every 100 ms pretty much means that it runs all the time, so you might as well create a daemon that continuously loops and executes,
or use triggers. Essentially, on every database change it will copy the change to another table/DB.
http://codespatter.com/2008/05/06/how-to-use-triggers-to-track-changes-in-mysql/
It is not possible to do this with cron (it has a maximum frequency of one minute), and it is a really bad idea anyway. You would be starting a whole new PHP interpreter ten times per second, not to mention opening a new database connection each time.
Far better would be to run one program that reuses its connection and checks every second or so.
Sounds a little like you are trying to make your own database replication or sync between two databases.
You could write a daemon to do it, essentially a script which continually runs in memory somewhere to then run whatever code you want to.
So that daemon would then do the database processing for you, and you wouldn't have to call a script over and over again.
Use your favorite programming language and set up a permanent loop to run it every 100ms, then put the script into inittab with 'respawn' (man inittab for complete syntax). Finally, init q to reload init.
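For example (the id field and the script path are placeholders), the inittab entry might look like this:
# /etc/inittab - respawn the loop script whenever it exits
p100:2345:respawn:/usr/bin/php /path/to/loop.php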
It's best if you write a little daemon for that. Use the pcntl functions to do so. In your case you might get away with:
<?php
// Fork a short-lived child every 100 ms to run the actual job.
while (1) {
    usleep(100000);              // 100 ms
    if (pcntl_fork() == 0) {
        // child process: run the job and exit
        include("/lib/100ms-script.php");
        exit;
    }
    // parent process: reap finished children so they don't linger as zombies
    pcntl_wait($status, WNOHANG);
}
I'm assuming that this is in reference to some type of web page being created. If so, this sounds like a job for Ajax, not PHP. As you may already know, PHP processing is done on the server side; once processing is complete, the page is served up to the client.
With Ajax/JavaScript, processing can continue via the browser. You can set up a timer that is then used to communicate with the server. Depending on the response, the page may be updated to reflect the necessary changes.
What is the best way to break up a recursive function that is using a ton of resources?
For example:
function do_a_lot(){
    //a lot of code and processing is done here
    //it takes a lot of execution time
    if($true){
        //if true we have to do all of that processing again
        do_a_lot();
    }
}
Is there any way to make the server only have to take the brunt of the first execution and then break up the recursion into separate processes? Or am I dreaming?
Honestly, if your function is using up that much of your system's resources, I'd most likely refactor the code. It's not true multithreading, but you could perhaps look at using popen to fork off a separate process.
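A rough sketch of that fire-and-forget approach with popen, assuming a separate worker.php that does the heavy part (the name and path are illustrative):
<?php
// Launch the heavy work in a separate PHP process and return immediately.
// Redirecting output and backgrounding with & keeps popen() from blocking.
$handle = popen('php /path/to/worker.php > /dev/null 2>&1 &', 'r');
if ($handle !== false) {
    pclose($handle);
}
// The current request can finish while worker.php keeps running.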
One of the rules of PHP is "share nothing". That means every PHP process is independent and shares nothing with the others. So if you want to split your execution across several PHP processes, you'll have to store the data somewhere. It can be memcached storage, or a database, or the session, as you like.
Then you'll need to 'fork' your PHP process. There are solutions available to get this done on the server side. IMHO these are all hacks: dangerous and not in the spirit of the PHP/web way, with the exception of 'work queue' tools.
I think the nicest way is to break your task up with Ajax. This will give you a clean user interface and will avoid any long response timeouts in the web process. That is, show a 'working' zone to your user, then request the first step of the job via Ajax, get the response (storing the intermediate state on the server side), then request the next step, store the new state and respond, and so on, step by step. You can even add a 'stop that stuff' function on the client side.
You can also search Google for 'php work queue'.
If it's a long-running task, divide and conquer with Gearman.
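A minimal divide-and-conquer sketch with the pecl/gearman extension, using an illustrative job name of long_task:
<?php
// worker.php - register the heavy function and wait for jobs
$worker = new GearmanWorker();
$worker->addServer(); // defaults to 127.0.0.1:4730
$worker->addFunction('long_task', function (GearmanJob $job) {
    $chunk = json_decode($job->workload(), true);
    // ... process one chunk of the big task here ...
});
while ($worker->work());

// client.php - split the work into chunks and queue them as background jobs
$client = new GearmanClient();
$client->addServer();
foreach ($chunks as $chunk) {                    // $chunks: however you split the task
    $client->doBackground('long_task', json_encode($chunk));
}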