Achieving multithreading in PHP

I am writing a kind of test system in PHP that tests my database records. I have separate PHP files for every test case. One master file is given the test number and the input parameters for that test in the form of a URL string. That file determines the test number and calls the appropriate test case based on it. Now I have a bunch of URL strings; I want them to be passed to that master file so that every test case starts working independently after receiving its parameters.

PHP is single-threaded; no multithreading currently exists for it. However, there are a few things you can do to achieve similar (but not identical) results for the use cases people usually have in mind when they ask me about multithreading. Again, there is no multithreading in PHP, but some of the options below may help you build something with characteristics that match your requirement.
libevent: you could use this to create an event loop for PHP, which would make blocking less of an issue. See http://www.php.net/manual/en/ref.libevent.php
curl_multi: another useful extension that can fire off GET/POST requests to several services in parallel.
Process Control (pcntl): I haven't used this myself, but it may be of value if process control is one aspect of your issue. http://uk.php.net/pcntl
Gearman: now this I've used, and it's pretty good. It allows you to create workers and spin off processes into a queue. You may also want to look at RabbitMQ or ZeroMQ.
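For the dispatch scenario in the question, curl_multi is probably the closest fit: the master script can fire all test-case URLs concurrently and collect the results as they finish. A minimal sketch, assuming hypothetical master-script URLs (substitute your real test URLs):

```php
<?php
// Placeholder URLs; replace with your real master-script URLs.
$urls = [
    'http://localhost/master.php?test=1&input=foo',
    'http://localhost/master.php?test=2&input=bar',
];

$mh = curl_multi_init();
$handles = [];
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10); // don't hang forever on one test
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Drive all transfers until every handle has finished.
do {
    $status = curl_multi_exec($mh, $active);
    if ($active) {
        curl_multi_select($mh); // wait for activity instead of busy-looping
    }
} while ($active && $status === CURLM_OK);

foreach ($handles as $ch) {
    $result = curl_multi_getcontent($ch); // response body ('' on failure)
    // ... inspect $result for this test case ...
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```

The requests run concurrently on the server side, so each test case effectively works independently even though the dispatching script itself is single-threaded.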

PHP is not multithreaded; it's single-threaded. You cannot start new threads within PHP. Your best bet would be a file_get_contents (or cURL) call to another PHP script to "mimic" threads. True multithreading isn't available in PHP.
You could also have a look at John's post at http://phplens.com/phpeverywhere/?q=node/view/254.

What you can do is use cURL to send the requests back to the server. The request will be handled and the results will be returned.
An example would be:
$c = curl_init("http://servername/".$script_name.$params);
curl_setopt($c, CURLOPT_RETURNTRANSFER, true);
$result = curl_exec($c);
curl_close($c);
Although this is not considered multithreading, it can be used to achieve your goal.

Related

Break php code execution and continue with next request | multi-request serverside code execution

Is it possible to make PHP code execution break at a certain command, end the HTTP connection, then "wait" for the client to request the page again and continue from that point but with new $_POST or $_GET content?
I don't think PHP offers such a thing, because it's hard to make sure the second request comes from the same client.
The usage would be for controlling a simple device over the internet by a program running on the server. The device just receives some simple commands from the server, like forward or turn_left, and responds with statements like success or turning_failed. But to be able to run the complex code on the server, I need the ability to get feedback from the device.
And because the code is complex, I can't have simple solution like indexing the requests or something like that. The "break" command could be inside another function and/or inside a loop...
So the only solution I came up with is to make my own interpreter written in php, which would be able to break execution like required and save all states to a file and later on continue from that point.
So my question should be: do you know such a sophisticated interpreter? (The language doesn't matter: C#, Lua, JS, ...) It would take a long time to make my own, so I want to know all the options.
Btw, PHP generators look really nice to me, but they are iterable only in a foreach loop, not in a forsomerequest loop :D
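For what it's worth, generators can be resumed with fresh input via send(), not only consumed by foreach, which is close to the suspend-and-continue shape described above. A sketch (the command and reply strings are taken from the question; note you would still need to persist the generator's state between real HTTP requests, which PHP cannot do natively):

```php
<?php
// Each yield hands a command to the device and suspends execution
// until the next "request" supplies the device's reply via send().
function controller(): Generator {
    $reply = yield 'forward';          // suspend until a reply arrives
    if ($reply === 'success') {
        $reply = yield 'turn_left';    // suspend again
    }
    return $reply;                     // final device status
}

$g = controller();
$first  = $g->current();         // first command: 'forward'
$second = $g->send('success');   // resume with the reply; yields 'turn_left'
$g->send('turning_failed');      // resume; generator runs to completion
$final  = $g->getReturn();       // 'turning_failed'
```

The "break" can sit inside nested functions or loops of the generator, which is exactly what a plain request-counter approach can't handle.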

PHP multithreaded cURL alternative proposed, but is it good?

Because PHP doesn't have multithreading capabilities I am trying to find a workaround to speed up a simple process.
The process: I am posting data to a webpage, with various permutations of the post data on each request. In a foreach loop I check each response to see if a string exists, using strpos. When it is found, the loop breaks and returns the page. There are around 1000 requests, and it takes about 1 minute or longer to complete.
As I don't want to use additional libraries, my idea was to exec standalone scripts, passing each permutation of post data (say 1000 processes). Each process will write to a file only if the string is found. The main script will run a loop checking whether the file exists; when it does, it can read the file for the post data that was correct.
It seems sound in theory, but I wanted to check if this is a ridiculous solution for a problem that has much simpler solutions!
Thanks.
One solution is to use the process control (pcntl) library:
http://php.net/manual/en/book.pcntl.php
I don't know if you have support for it installed.
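To make that concrete, here is a sketch of fanning the permutations out across child processes with pcntl_fork (CLI-only; requires the pcntl extension). check_permutation() is a hypothetical stand-in for your request-and-check code:

```php
<?php
// Stand-in for the real list of ~1000 post-data permutations.
$permutations = ['aaa', 'aab', 'aac'];
$children = [];

foreach ($permutations as $data) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        exit("fork failed\n");
    } elseif ($pid === 0) {
        // Child process: perform one request, write a result file only
        // if the target string is found, then exit.
        // check_permutation($data); // hypothetical helper
        exit(0);
    }
    $children[] = $pid; // parent tracks the child PIDs
}

// Parent: reap all children so none are left as zombies.
foreach ($children as $pid) {
    pcntl_waitpid($pid, $status);
}
```

In practice you would fork in batches (say 20 at a time) rather than 1000 at once, to avoid exhausting process limits.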

How can I make a scheduler in PHP without the help of cron

How can I make a scheduler in PHP without writing a cron script? Is there any standard solution?
Feature (for example): send a reminder to all subscribers 24 hours before their subscription expires.
The standard solution is to use cron on Unix-like operating systems and Scheduled Tasks on Windows.
If you don't want to use cron, I suppose you could try to rig something up using at. But it is difficult to imagine a situation where cron is a problem but at is A-OK.
The solution I see is a loop (for or while) and a sleep(3600*24);
Execute it by having JavaScript send an AJAX call at a set interval of yours.
Please read my final opinion at the bottom before rushing to implement.
Cron really is the best way to schedule things. It's simple, effective and widely available.
Having said that, if cron is not available or you absolutely don't want to use it, two general approaches for a non-cron, Apache/PHP pseudo-cron running on a traditional web server are as follows.
Check using a loadable resource
Embed an image/script/stylesheet/other somewhere on each web page. Images are probably the best supported by browsers (if javascript is turned off there's no guarantee that the browser will even load .js source files). This page will send headers and empty data back to the browser (a 1x1 clear .gif is fine - look at fpassthru)
from the php manual notes
<?php
header("Content-Length: 0");
header("Connection: close");
flush();
// browser should be disconnected at this point
// and you can do your "cron" work here
?>
Check on each page load
For each task you want to automate, you would create some sort of callable API - static OOP, function calls - whatever. On each request you check to see if there is any work to do for a given task. This is similar to the above except you don't use a separate URL for the script. This could mean that the page takes a long time to load while the work is being performed.
This would involve a select query to your database on either a task table that records the last time a task has run, or simply directly on the data in question, in your example, perhaps on a subscription table.
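A minimal sketch of that per-request check, using a timestamp file in place of a task table (the file name and interval here are arbitrary; a real setup would query the subscription table inside the callback):

```php
<?php
// Run $task at most once per $interval seconds, checked on page load.
function run_if_due(string $stampFile, int $interval, callable $task): bool {
    $last = is_file($stampFile) ? (int) file_get_contents($stampFile) : 0;
    if (time() - $last < $interval) {
        return false; // ran recently enough; skip this request
    }
    file_put_contents($stampFile, (string) time()); // record this run
    $task();
    return true;
}

// Usage on each request:
run_if_due(sys_get_temp_dir() . '/reminders.stamp', 3600, function () {
    // e.g. SELECT subscribers whose subscription expires within 24h
    // and send them reminder emails.
});
```

Note this naive version has a race if two requests arrive at once; a real task table would use a locked row or an atomic UPDATE for the same purpose.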
Final opinion
You really shouldn't reinvent the wheel on this if possible. Cron is very easy to set up.
However, even if you decide that, in your opinion, cron is not easy to set up, consider this: for each and every page load on your site, you will be incurring the overhead of checking to see what needs to be done. True cron, on the other hand, will execute command line PHP on the schedule you set up (hourly, etc) which means your server is running the task checking code much less frequently.
Biggest potential problem without true cron
You run the risk of not having enough traffic to your site to actually get updates happening frequently enough.
Create a cron-job table that stores the dates on which each job should run. On each request, check whether today's date matches a date in that table, and if so, call the method to execute. This works much like a cron job.

Using cURL Handle as Array Key

I'm using curl_multi functions to request multiple URLs and process them as they complete. As one connection completes all I really have is the cURL handle (and associated data) from curl_multi_info_read().
The URLs come from a job queue, and once processed I need to remove the job from the queue. I don't want to rely on the URL to identify the job (there shouldn't be duplicate URLs, but what if there is).
The solution I've worked up so far is to use the cURL handle as an array key pointing to the job ID. From what I can tell, when treated as a string the handle is something like:
"Resource id #1"
That seems reasonably unique to me. The basic code is:
$ch = curl_init($job->getUrl());
$handles[$ch] = $job;
//then later
$done = curl_multi_info_read($master);
$handles[$done['handle']]->delete();
curl_multi_remove_handle($master, $done['handle']);
Is the cURL handle safe to use in this way?
Or is there a better way to map the cURL handles to the job that created them?
Store private data inside the cURL easy handle, e.g. some job ID:
curl_setopt($ch, CURLOPT_PRIVATE, $job->getId());
// then later
$id = curl_getinfo($done['handle'], CURLINFO_PRIVATE);
This "private data" feature is not (yet) documented in the PHP manual. It was introduced already in PHP 5.2.4. It allows you to store and retrieve a string of your choice inside the cURL handle. Use it for a key that uniquely identifies the job.
Edit: Feature is now documented in the PHP manual (search for CURLOPT_PRIVATE within the page).
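Putting it together with curl_multi, a sketch of the whole round trip (the job IDs and URLs are invented for illustration):

```php
<?php
// Tag each easy handle with its job ID via CURLOPT_PRIVATE, then
// recover the ID when curl_multi reports the transfer finished.
$jobs = [101 => 'http://localhost/a', 102 => 'http://localhost/b'];

$mh = curl_multi_init();
foreach ($jobs as $id => $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    curl_setopt($ch, CURLOPT_PRIVATE, (string) $id); // stash the job ID
    curl_multi_add_handle($mh, $ch);
}

$done_ids = [];
do {
    $status = curl_multi_exec($mh, $active);
    while ($info = curl_multi_info_read($mh)) {
        // Recover the job ID stored on this particular handle.
        $id = curl_getinfo($info['handle'], CURLINFO_PRIVATE);
        $done_ids[] = $id;
        // ... delete job $id from the queue here ...
        curl_multi_remove_handle($mh, $info['handle']);
        curl_close($info['handle']);
    }
    if ($active) {
        curl_multi_select($mh);
    }
} while ($active && $status === CURLM_OK);
curl_multi_close($mh);
```

No array keyed by handle is needed at all: the ID travels with the handle itself.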
It will probably work thanks to some implicit type cast, but it doesn't feel right to me at all. I think it's begging for trouble somewhere down the line: future versions that treat resources differently, different platforms...
I personally wouldn't do it, but use numeric indexes.
I have to agree with Pekka... it will probably work, but it smells bad. I'd use straight-up integers as Pekka suggests, or wrap the handles in a simple class and then use spl_object_hash, or have the constructor generate a uniqid when it's set up.

Alternative to header(location: ) php

I have a while loop that constructs a url for an SMS api.
This loop will eventually be sending hundreds of messages, thus being hundreds of urls.
How would i go about doing this?
I know you can use header('Location: ...') to change the location of the browser, but this isn't going to work, as the PHP page needs to remain running.
Hope this is clear. Thank you.
You have a few options:
file_get_contents as Trevor noted
curl_* - use the cURL family of functions to make the requests.
fsock* - handle the connection at a somewhat lower level, making and managing the socket connection yourself.
All will probably work just fine and you should pick one depending on your overall needs.
After you construct each $url, use file_get_contents($url)
If it is just the case that during the construction of all these URLs you get the error "Maximum Execution Time Exceeded", then just add set_time_limit(10); after the URL generation to give your script an extra 10 seconds to generate the next URL.
I'm not quite sure what you are actually asking in this question. Do you want the user to visit the URLs (if so, does the end user's web browser support JavaScript?), just to be shown the URLs, for the URLs to be generated and stored, or for the PHP script to fetch each URL (and do you care about the user seeing the result)? If you clarify the question, the community may be able to provide you with a perfect answer!
Applying a huge amount of guesswork, I infer from your post that you need to dynamically create a URL, and that invoking that URL causes an SMS message to be sent.
If this is the case, then you should not be trying to invoke the URL from the client but from the server side, using the URL wrappers (e.g. file_get_contents) or cURL.
You should also consider running the loop in a separate process and reporting back to the browser using (e.g.) AJAX.
Have a google for spawning long running processes in PHP - but be warned there is a lot of bad advice on the topic published out there.
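One common pattern for that separate process, sketched under the assumption of a Unix-like host ("send_sms_batch.php" is a made-up worker script name; substitute your own):

```php
<?php
// Launch a hypothetical worker script in the background so the HTTP
// request can return immediately; the worker runs the SMS loop itself.
$cmd = 'nohup php send_sms_batch.php > /dev/null 2>&1 & echo $!';
$pid = (int) shell_exec($cmd); // PID of the detached background process
// The page can now respond at once and, for example, poll progress
// that the worker writes to a file or database.
```

The redirection and nohup matter: without them, Apache/PHP may keep the request open waiting for the child's output, which is one of the pitfalls the bad advice tends to omit.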
C.
