I have a code structure like this:
echo "First Operation";
sleep(3);
echo "Second Operation";
I want to try to simulate multi-threading, but I could not make it work; the sleep() call just blocks there, and I could not find another option.
I have to agree that PHP is not built for multithreading, so the script you create will always run in one thread. There are some exceptions to that: when you call a shell command, there is the possibility to fire-and-forget the command when you don't need the output. It runs in parallel, but you cannot access the result.
PHP is designed to execute one instruction after the other.
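A minimal sketch of the fire-and-forget pattern mentioned above (the `sleep 2` command is just a stand-in for real work):

```php
<?php
// Fire-and-forget: redirect output and background the command so
// exec() returns immediately instead of waiting for completion.
$start = microtime(true);
exec('sleep 2 > /dev/null 2>&1 &');
$elapsed = microtime(true) - $start;
// exec() returned right away; the sleep still runs in the background,
// but we have no way to read its result from here.
```

Note that redirecting output and appending `&` is what makes the shell return control immediately; without it, exec() would block for the full two seconds.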
To create something like a parallel working script, you need to take advantage of some external system. Which trick fits depends on what you actually want to accomplish. Let me give you an example.
You could use a message queueing system like RabbitMQ to work as a separation between one script and the other. Let's say script1 is permanently listening on the queue for work to do. Another script, let's say script2, pushes a piece of work onto the queue (on request) and continues with other stuff while script1 picks it up, does something, and returns the answer. When script2 is done with the other stuff, it can read the result from the queue and finish whatever you want to do.
This is admittedly a tricky and theoretical concept, but it might give you an idea. Without an external component like the queue, though, PHP will not work multithreaded.
I hope that is helping you!
Related
I'm trying to do a Bash script like:
n=1; While(n<=500) php /Path/script.php?v=$n ++n;
I would like these scripts to run in parallel. Could it use too much CPU? Or maybe parallel wouldn't be good?
A simple bash solution is to use a for loop and send each instance of the script to the background, like this:
for n in {1..500}; do
  # CLI PHP takes arguments via $argv, not a ?v= query string
  php /Path/script.php "$n" &
done
This will make it so that the script is run 500 times in parallel; whether it uses up all of your resources depends on what resources are available and what the script does.
I, for one, doubt that this is an ideal solution, but it depends on what you're trying to achieve and why you want to do it this way.
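If 500 concurrent processes is too many, one option is to run them in fixed-size batches. A sketch of that idea (batch size 4 and 8 iterations are just illustrative; the `echo` line stands in for the php invocation):

```shell
# Run jobs in batches of 4; wait for each batch before starting the next.
out=$(mktemp)
for n in $(seq 1 8); do
  ( echo "job $n" >> "$out" ) &   # stand-in for: php /Path/script.php "$n"
  if [ $((n % 4)) -eq 0 ]; then
    wait   # block until the current batch of background jobs finishes
  fi
done
wait
```

The trailing `wait` catches any leftover jobs from a final partial batch.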
How can I make a scheduler in PHP without writing a cron script? Is there any standard solution?
Feature (for example): send a reminder to all subscribers 24 hours before the subscription expires.
The standard solution is to use cron on Unix-like operating systems and Scheduled Tasks on Windows.
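For the reminder example, a daily crontab entry might look like this (the script path is illustrative):

```shell
# m h dom mon dow   command
0 0 * * * php /path/to/send_reminders.php
```

This runs the script at midnight every day; the script itself then finds subscriptions expiring within the next 24 hours.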
If you don't want to use cron, I suppose you could try to rig something up using at. But it is difficult to imagine a situation where cron is a problem but at is A-OK.
The solution I see is a loop (for or while) with a sleep(3600*24); in it.
Execute it by sending an ajax call from javascript at a set interval of yours.
Please read my final opinion at the bottom before rushing to implement.
Cron really is the best way to schedule things. It's simple, effective and widely available.
Having said that, if cron is not available or you absolutely don't want to use it, here are two general approaches for a non-cron, Apache/PHP pseudo-cron running on a traditional web server.
Check using a loadable resource
Embed an image/script/stylesheet/other somewhere on each web page. Images are probably the best supported by browsers (if javascript is turned off there's no guarantee that the browser will even load .js source files). This page sends headers and empty data back to the browser (a 1x1 clear .gif is fine; look at fpassthru).
from the php manual notes
<?php
// let the script keep running after the client disconnects
ignore_user_abort(true);
set_time_limit(0);
header("Content-Length: 0");
header("Connection: close");
flush();
// browser should be disconnected at this point
// and you can do your "cron" work here
?>
Check on each page load
For each task you want to automate, you would create some sort of callable API - static OOP, function calls - whatever. On each request you check to see if there is any work to do for a given task. This is similar to the above except you don't use a separate URL for the script. This could mean that the page takes a long time to load while the work is being performed.
This would involve a select query to your database on either a task table that records the last time a task has run, or simply directly on the data in question, in your example, perhaps on a subscription table.
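A sketch of such a last-run check, using a marker file's mtime in place of the task table described above (the 24-hour interval and file path are illustrative; a database column would work the same way):

```php
<?php
// Run $task at most once per $interval seconds, using a marker file's
// mtime as the "last run" record.
function run_if_due(string $markerFile, int $interval, callable $task): bool
{
    clearstatcache();
    $lastRun = file_exists($markerFile) ? filemtime($markerFile) : 0;
    if (time() - $lastRun < $interval) {
        return false; // not due yet
    }
    touch($markerFile); // record this run before doing the work
    $task();
    return true;
}

$marker = sys_get_temp_dir() . '/pseudo_cron.marker';
@unlink($marker); // demo only: start from a clean state
$first  = run_if_due($marker, 86400, function () { /* send reminders... */ });
$second = run_if_due($marker, 86400, function () { /* send reminders... */ });
```

On a real site this function would be called on every page load; the early return keeps the per-request overhead to a single stat call.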
Final opinion
You really shouldn't reinvent the wheel on this if possible. Cron is very easy to set up.
However, even if you decide that, in your opinion, cron is not easy to set up, consider this: for each and every page load on your site, you will be incurring the overhead of checking to see what needs to be done. True cron, on the other hand, will execute command line PHP on the schedule you set up (hourly, etc) which means your server is running the task checking code much less frequently.
Biggest potential problem without true cron
You run the risk of not having enough traffic to your site to actually get updates happening frequently enough.
Create a table of cron jobs in which you keep the dates the jobs should run. Then add a condition: if today's date equals a date in the cron job table, call the method to execute. This works much like a cron job.
Is it possible to run a PHP script every 100 ms? This script will check a database for changes, and the changes will then be reflected into some other database. I am already doing it using triggers. But I want to know if there is any other way to do it without using cron or triggers. I will be using Linux for this purpose.
Thanks
Running something every 100 ms almost means that it runs all the time, so you might as well create a daemon that continuously loops and executes,
or use triggers. Essentially, on every database change it will copy to another table/db.
http://codespatter.com/2008/05/06/how-to-use-triggers-to-track-changes-in-mysql/
It is not possible to do this with cron (it has a maximum frequency of one minute), and it is a really bad idea anyway: you would be starting a whole new PHP interpreter ten times per second, not to mention opening a database connection each time.
Far better would be to run one program that re-uses its connection and checks every second or so.
Sounds a little like you are trying to build your own database replication or sync between two databases.
You could write a daemon to do it: essentially a script that runs continually in memory and executes whatever code you want.
That daemon would then do the database processing for you, and you wouldn't have to call a script over and over again.
Use your favorite programming language to set up a permanent loop that runs the task every 100 ms, then put the script into inittab with 'respawn' (see man inittab for the complete syntax). Finally, run init q to reload init.
It's best if you write a little daemon for that. Use the pcntl functions to do so. In your case you might get away with:
<?php
while (1) {
    usleep(100000); // 100 ms
    if (pcntl_fork() == 0) {
        // child: do the work, then exit
        include("/lib/100ms-script.php");
        exit;
    }
    // parent: reap finished children so they don't become zombies
    pcntl_wait($status, WNOHANG);
}
I'm assuming that this is in reference to some type of web page. If so, this sounds like a job for Ajax, not PHP. As you may already know, PHP processing is done on the server side; once processing is complete, the page is served up to the client.
With Ajax/JavaScript, processing can continue in the browser. You can set up a timer that is then used to communicate with the server; depending on the response, the page may be updated to reflect the necessary changes.
What is the best way to break up a recursive function that is using a ton of resources?
For example:
function do_a_lot(){
    //a lot of code and processing is done here
    //it takes a lot of execution time
    if($true){
        //if true we have to do all of that processing again
        do_a_lot();
    }
}
Is there any way to make the server only take the brunt of the first execution and then break the recursion up into separate processes? Or am I dreaming?
Honestly, if your function is using up that much of your system's resources, I'd most likely refactor the code. It's not true multithreading, but you could perhaps look at using popen to fork your process.
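As one possible refactor (a sketch only; the items and conditions stand in for the asker's actual workload): replacing the recursion with an explicit work list lets you process items in bounded chunks, or later hand each chunk to a separate process.

```php
<?php
// Instead of do_a_lot() calling itself, keep pending work in a queue
// and loop until it is drained.
$queue = [1, 2, 3];   // initial work items (illustrative)
$done  = [];

while ($queue !== []) {
    $item   = array_shift($queue);
    $done[] = $item * 10;      // stand-in for the heavy processing
    if ($item < 5) {           // stand-in for the if($true) condition:
        $queue[] = $item + 3;  // ...enqueue follow-up work instead of recursing
    }
}
```

Because the pending work now lives in a data structure rather than the call stack, it can be persisted (to a database or queue server) between requests or handed to worker processes.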
One of the rules of PHP is "share nothing". That means every PHP process is independent and shares nothing with the others. So if you want to break your execution across several PHP processes, you'll have to store the data somewhere: in memcached, a database, or the session, as you like.
Then you'll need to 'fork' your PHP process. There are solutions available to get this done on the server side. IMHO these are all hacks: dangerous, and not in the spirit of PHP and the web, with the exception of 'work queue' tools.
I think the nicest way is to break your task up with ajax. This gives you a clean user interface and avoids any long response timeout in the web process. I.e., show a 'working zone' to your user, then ask via ajax for the first step of the job, store the response on the server side, then ask for the next step, store the new response, and so on. You can even add a 'stop that stuff' button on the client side.
You can search for 'php work queue' on Google as well.
If it's a long running task, divide and conquer with gearman
What is the correct way to run Symfony tasks in a separate process. My first guess would be to use fork/exec, but according to this, you can't do it with anything that keeps open file descriptors or connections (like MySQL). So that doesn't sound like its an option. Another alternative is to do exec('symfony taskname &'), but that seems like a hack. Is that the best I can do? Is there a third way?
The way this is generally handled is to use a task queue. When you want to do a background process, add it to a queue of some kind (you could use your database, or you could use an actual queue daemon like beanstalkd). You then have some daemonized procces(es) whose job is to pull work out of the queue and perform it.
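A minimal sketch of the database-as-queue idea (the table layout and task names are made up; a real setup would more likely use beanstalkd or similar, with the worker running as a daemon):

```php
<?php
// A toy work queue backed by SQLite: the web process INSERTs jobs,
// and a worker loop claims and runs them.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE jobs (id INTEGER PRIMARY KEY, task TEXT, done INTEGER DEFAULT 0)');

// Producer side: enqueue work instead of running it inline.
$db->prepare('INSERT INTO jobs (task) VALUES (?)')->execute(['send-report']);
$db->prepare('INSERT INTO jobs (task) VALUES (?)')->execute(['resize-images']);

// Worker side: pull jobs until the queue is empty.
$ran = [];
while ($row = $db->query('SELECT id, task FROM jobs WHERE done = 0 LIMIT 1')->fetch(PDO::FETCH_ASSOC)) {
    $ran[] = $row['task'];  // stand-in for actually performing the task
    $db->prepare('UPDATE jobs SET done = 1 WHERE id = ?')->execute([$row['id']]);
}
```

In a real deployment the producer and worker are separate processes sharing a persistent database, and the worker sleeps briefly when the queue is empty instead of exiting.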
Here's how I ended up doing it:
exec('nohup ' . sfConfig::get('sf_root_dir') . '/symfony TASKNAME >/dev/null &');
You have to redirect STDOUT, or else it won't run in the background (though you don't have to use /dev/null if you want the actual output). In my case I set up all my tasks to use Symfony's file logger, so it wasn't an issue.
I'm still looking for a better solution though. This seems like a hack.
PHP knows no multithreading.
And yes, that is a big flaw in PHP, IMO.
There is a way to do multithreading, but it is not recommended: it's complex, it's ugly, and it is asking, no, calling out for problems.
So I think the best you can do is something like exec, or maybe something like calling a web service.