I'm trying to build a web interface for some Python scripts. The thing is, I have to use PHP (not CGI), and some of the scripts I execute take quite some time to finish: 5-10 minutes. Is it possible for PHP to communicate with the scripts and display some sort of progress status? This should allow the user to keep using the webpage while the task runs, seeing some status in the meantime, or just a message when it's done.
Currently I'm using exec() and process the output on completion. The server is running on a Windows machine, so pcntl_fork will not work.
LATER EDIT:
Using another PHP script to feed the main page information via AJAX doesn't seem to work, because the server kills it (it hits the max execution time, and I don't really want to increase that unless necessary).
I was thinking about socket-based communication, but I don't see how that is useful in my case (some hints, maybe?).
Thank you
You want inter-process communication. Sockets are the first thing that comes to mind; you'd need to set up a socket listening for connections (on the same machine) in PHP, and a socket in Python that connects to the listening socket and sends its status.
Have a look at this socket programming overview from the Python documentation and the Python socket module's documentation (especially the examples at the end). I'm sure PHP has similar resources.
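To make that concrete, here's a minimal sketch of the PHP side, assuming the Python script connects to port 9999 on localhost and sends a short status string (the port and message format are illustrative, not anything standard):

<?php
// Minimal PHP status listener (sketch; port and message format are
// assumptions, not from the question).
$server = stream_socket_server('tcp://127.0.0.1:9999', $errno, $errstr);
if ($server === false) {
    die("Could not listen: $errstr ($errno)");
}

// Block for up to 30 seconds waiting for the Python script to connect.
$conn = stream_socket_accept($server, 30);
if ($conn !== false) {
    // Read one short status line sent by the Python side.
    $status = fgets($conn, 1024);
    echo 'Python reports: ' . trim($status);
    fclose($conn);
}
fclose($server);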
Once you've got a more specific idea of what you want to build and need help, feel free to ask a new question on Stack Overflow (if it isn't already answered).
I think you would have to use a meta refresh, and maybe have the Python script write its status to a file and then have the PHP script read from it.
You could use AJAX as well to make it more dynamic.
Also, you probably shouldn't use exec(); it opens up a world of vulnerabilities if any user input can reach the command line.
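For the file approach, the PHP side polled by the page can be as simple as this sketch (the file name status.txt and its one-line format are assumptions; the Python job would overwrite it with its progress):

<?php
// status.php - polled via AJAX (sketch; the Python job is assumed to
// write a line like "42" or "done" to status.txt as it progresses).
$file = __DIR__ . '/status.txt';

header('Content-Type: application/json');

if (!is_readable($file)) {
    echo json_encode(['status' => 'not started']);
    exit;
}

// The file is tiny, so a race with the writer at worst returns a
// slightly stale value.
$status = trim(file_get_contents($file));
echo json_encode(['status' => $status]);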
You could use a queuing service like Gearman, with a client in PHP and a worker in Python or vice versa.
Someone has created an example setup here.
https://github.com/dbaltas/gearman-python-worker
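For a rough idea of the PHP client side, here's a sketch using the pecl Gearman extension; the function name "long_task" and the payload are illustrative, and the job handle would need to be stored (e.g. in the session) between requests:

<?php
// Sketch of the PHP client, assuming the Gearman extension is installed
// and a worker (Python or PHP) has registered "long_task".
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);

// Queue the job without blocking; we get a handle back immediately.
$handle = $client->doBackground('long_task', json_encode(['input' => 'data.csv']));

// Later (e.g. from an AJAX-polled script) ask the job server for status.
// jobStatus() returns [known, running, numerator, denominator].
list($known, $running, $num, $den) = $client->jobStatus($handle);
echo $running ? "progress: $num/$den" : 'finished (or unknown)';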
Unfortunately my friend, I do believe you'll need to use sockets as you suspected. :( I have little experience working with them, but this Python tutorial on sockets/network programming may help you get the Python socket interaction you need. (Beau Martinez's links seem promising as well.)
You'd also need to set up socket connections on the PHP side, so it can request the status.
Continuing on that, my thought is that your Python script is likely going to run in a loop, so I'd put a "check for a status request" step at the beginning of that loop. It would reply with the current status on one pass, and with an increased status on later passes, and so on.
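Sketched in PHP terms (the worker in the question is Python, but the pattern is identical), the loop could poll the listening socket without blocking on each pass:

<?php
// Worker-loop sketch: check for a status request on every pass without
// blocking (port and status format are illustrative).
$server = stream_socket_server('tcp://127.0.0.1:9999', $errno, $errstr);
stream_set_blocking($server, false);

$done = 0;
$total = 100;
while ($done < $total) {
    // Non-blocking check: is anyone asking for status right now?
    $read = [$server];
    $write = $except = null;
    if (stream_select($read, $write, $except, 0) > 0) {
        $conn = stream_socket_accept($server, 0);
        if ($conn !== false) {
            fwrite($conn, "$done/$total\n");
            fclose($conn);
        }
    }

    // ... do one unit of the real work here ...
    $done++;
}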
Good luck!
Edit: I think the file-writing recommendation from Thomas Schultz is probably the easiest to implement. The only downside is waiting for the file to be available: you'll need to make sure your PHP and Python scripts don't hang or return failure without retrying.
Why, in Chrome, when I hit the STOP button while a PHP script is executing, does it not stop execution? Even closing the tab doesn't stop it. ignore_user_abort() is false. Any ideas on how to force PHP to stop? I've got a large script that generates some big files and runs for about 10 minutes...
There is no way for a user to stop a PHP script. Since PHP runs on the server when a page is called, you cannot stop it unless your script contains a condition that makes it quit to your liking.
As I mentioned in a comment, this answer to a previous question has a lot of info on this.
But one takeaway that may be your issue: PHP may not know it's disconnected until it tries to send data and is refused, see this bit in the docs:
the next time your script tries to output something PHP will detect that the connection has been aborted
So depending on your required output, you may be able to send some sort of "heartbeat" data to the browser that will trigger the abort if the user disconnects. If it doesn't seem to be sending, try doing some explicit buffer-clearing with flush(). If you (or your framework) are using output buffering, you may have to work around it.
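A heartbeat sketch, assuming the work can be sliced into chunks (do_next_chunk() is a hypothetical stand-in for one slice of the real job):

<?php
// Emit a byte after each chunk of work so PHP notices an aborted
// connection on the next write.
ignore_user_abort(false); // default: let PHP kill the script on abort

while (do_next_chunk()) {  // hypothetical: one slice of the real work
    echo ' ';  // harmless whitespace heartbeat
    flush();   // push it past PHP's buffer toward the client
    if (connection_aborted()) {
        exit;  // belt and braces: stop explicitly too
    }
}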
If you can't do that (if you're generating an output file or some such), you'll likely have to rearrange things, since there's no way PHP can know the connection is closed. My suggestion is to use a queueing system to offload the generation to a separate script that you can then cancel or kill manually. Here's a good overview of queueing systems; I personally use beanstalkd with PHP. It's simple, easy, works splendidly, and has some good PHP libraries (I've used pheanstalk and davidpersson's beanstalk). Any time you're generating large files like that, you should probably be using a queueing system anyway.
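For a flavour of that approach, here's a sketch with pheanstalk; the tube name and payload are illustrative, and the exact constructor varies between pheanstalk versions (this is the v4-style API):

<?php
// Composer autoloader, assuming pheanstalk was installed via composer.
require 'vendor/autoload.php';

use Pheanstalk\Pheanstalk;

$queue = Pheanstalk::create('127.0.0.1');

// Web request: enqueue the job and return to the user immediately.
$queue->useTube('file-generation')->put(json_encode(['user' => 42]));

// A separate long-running CLI worker picks jobs up, and can be
// killed/cancelled independently of any web request:
//   $job = $queue->watch('file-generation')->reserve();
//   ... generate the file ...
//   $queue->delete($job);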
Short version: I want to connect a client to a PHP server, but I have a limitation on the server of 10 PHP scripts running at the same time.
Question is: What is the best way to connect a client with PHP script, while staying under the limitation?
Long version:
My previous question shows what I am really after, but here it is again:
I want to develop a webchat using a Java applet as the client side and PHP as the back end. Under normal circumstances I wouldn't ask a question like this and would just use the first thing Google pops up for my search, but right now I'm not under normal circumstances: my hosting is a shared account with a limit of 10 entry processes (i.e. the number of PHP scripts running at the same time). I need to build my chat server with that in mind, keeping the load as low as I can.
I did develop a client/server connection over TCP in Delphi, but that was long ago and I've forgotten much of it. Now that I try to bring it back, I realize I didn't know that much to begin with.
So I have several questions, based on my research:
What is a socket?
I did Google this, but I didn't find a really clear answer. A socket is the standard way for two programs to communicate with each other, right? This is maybe where my knowledge first goes wrong...
Do the TCP/UDP protocols work through sockets?
I don't even know how to explain this question of mine...
What is a stream, exactly?
What I know from C++ is that it's the ability to open files in binary form and read from them at any point. I might be wrong, because my C++ knowledge is old too.
I also read about PHP sockets, and found that PHP is capable of listening on a port with socket_create_listen, but my concern is: does this script run actively, like an infinite loop? I'm asking because of the 10-process limitation.
And if I initiate a TCP connection with a client, does the script run in an infinite loop again? Does it count against the active processes?
I know UDP doesn't need an active connection: it just sends the data and forgets about it, and the script terminates when it's done. But I don't know about TCP.
Sorry for the long post, and the many questions, and thank you for any help you can offer.
EDIT: I forgot about the GET/POST methods!
As I said, I'm planning a webchat, and the clients need to communicate. Aside from a direct connection there are the GET/POST methods as well, where the script does its work quickly and terminates. But again, with the 10-process limit: what happens when an 11th process tries to run at the same time?
Also, is there a way to limit the number of simultaneously running processes, or to put requests into a queue and have them wait until the others finish?
If your server is limited to 10 concurrent processes, that's a hard limit and you can't do much about it. What you can do is make each request as small as possible and have as little as possible resolved by PHP, so the chance of many scripts running concurrently stays very small.
Ideally, all your PHP scripts will start and exit very quickly, often redirecting the user to static content (HTML, JS, image and CSS files).
Maybe you can build your whole webapp out of static HTML files and have a single ajax.php file for the server communication...
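Something like this sketch of ajax.php, which holds a process slot for as short a time as possible (the flat-file message store and field names are assumptions):

<?php
// ajax.php - do the minimum and exit, so one of the 10 process slots is
// occupied as briefly as possible (sketch; storage format is assumed).
header('Content-Type: application/json');

$store = __DIR__ . '/messages.txt';

if ($_SERVER['REQUEST_METHOD'] === 'POST' && isset($_POST['msg'])) {
    // Append one chat line; LOCK_EX avoids interleaved writes.
    file_put_contents($store, trim($_POST['msg']) . "\n", FILE_APPEND | LOCK_EX);
    echo json_encode(['ok' => true]);
    exit;
}

// GET: return everything after the line offset the client already has.
$lines = is_readable($store) ? file($store, FILE_IGNORE_NEW_LINES) : [];
$since = isset($_GET['since']) ? (int)$_GET['since'] : 0;
echo json_encode(['lines' => array_slice($lines, $since), 'count' => count($lines)]);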
I'm looking for ideas on the following. I need a PHP script to perform a certain action for quite a long time. This is an extension for a CMS, so it can't be anything but PHP. It also can't be a command-line script, because it will be used by ordinary people who only have the standard means of the CMS. One option is a cron job (most simple hostings have one) that triggers the script often, so that instead of working for a long time in one go it performs the action step by step, preserving its state from one launch to the next. This is not perfect, but I can't see any other solution. If the script keeps redirecting to itself, the server will interrupt it. What other options could work?
Thanks everyone in advance!
What you're talking about is a daemon: a long-running program that waits for calls from client programs, performs an action, provides a response, then keeps waiting for more calls.
You might be familiar with these in the form of Apache & MySQL ;) Anyway, PHP is generally OK in this regard; it can work over raw sockets as well as fork sub-processes to handle multiple requests simultaneously.
Having said that, PHP daemons are a tool where YMMV. Some folks will say they work great; other folks like me will say they have issues with inter-process communication and leak memory even amidst a plethora of unset() calls.
Anyway, you likely won't be able to deploy a daemon of any type in a shared hosting environment. You'll need to get a better server package or stick with a cron-based solution.
Here's a link about writing a PHP daemon.
Also, one more note: daemons do crash from time to time, so you may still need to store state about what's going on, just in case someone trips over the power cord to your shared server :)
I would also suggest that you think about making it a daemon, but if not, you can simply use
set_time_limit(0);
ignore_user_abort(true);
at the top to tell it not to time out and not to be interrupted by anything. Then call it from cron to start it every day or whatever. I have this on many long daily processing tasks and it works great for me. However, it won't easily be able to talk to the outside world (other scripts can't query it or anything; if that is what you want, look into PHP services), so once you get it running, make sure it will stop, and have it print its progress to a logfile.
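Putting that together, a sketch of such a cron-launched task might look like this (file names are illustrative); the lock file keeps a second cron launch from overlapping the first:

<?php
// Long cron task sketch: no time limit, survives client aborts, refuses
// to start twice, and logs its progress.
set_time_limit(0);
ignore_user_abort(true);

$lock = fopen(__DIR__ . '/task.lock', 'c');
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit; // previous run still going - let it finish
}

$log = fopen(__DIR__ . '/task.log', 'a');
foreach (range(1, 1000) as $step) {
    // ... one unit of the real work ...
    fwrite($log, date('c') . " step $step done\n");
}

flock($lock, LOCK_UN);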
Does anyone know how to close the connection (besides just flush()?) but keep executing some code afterwards?
I don't want the client to see the long process that may occur after the page is done.
You might want to look at pcntl_fork(): it allows you to fork your current script and run the copy as a separate process.
I used it in a project where a user uploaded a file and the script then performed various operations on it, including communicating with a third-party server, which could take a long time. After the initial upload, the script forked and displayed the next page to the user, and the parent killed itself off. The child then continued executing, and the returned page queried it for its status using AJAX. It made the application much more responsive, and the user got feedback on the status while it was executing.
This link has more on how to use it:
Thorough look at PHP's pcntl_fork() (Apr 2007; by Frans-Jan van Steenbeek)
If you can't use pcntl_fork, you can always fall back to returning a page quickly that fires an AJAX request to execute more items from a queue.
mvds reminds the following (which can apply in a specific server configuration): Don't fork the entire apache webserver, but start a separate process instead. Let that process fork off a child which lives on. Look for proc_open to get full fd interaction between your php script and the process.
I don't want the client to see the long process that may occur after the page is done.
Sadly, the page isn't done until after the long process has finished; hence what you ask for is impossible (to implement in the way you imply), I'm afraid.
The key here, pointed to by Jhong's answer and inversely suggested by animusen's comment, is that the whole point of what we do with HTTP as web developers is to respond to a request as quickly as possible, full stop. So if you're doing anything else, it points to some design decision that could perhaps have been a little better :)
Typically, you take the additional task you were doing after returning the 'page' and hand it over to some other process; normally that means placing the task in a job queue and having a CLI daemon or a cron job pick it up and do what's needed.
The exact solution is specific to what you're doing, and the answer to a different (set of) questions; but for this one it comes down to: no, you can't close the connection, and one would advise you to look at refactoring the long-running process out of that script/page.
Take a look at PHP's ignore_user_abort-setting. You can set it using the ignore_user_abort() function.
An example of (optional) use has been given (and has been reported to work by the OP) in the following duplicate question:
close a connection early (Sep 2008)
It basically gives reference to user-notes in the PHP manual. A central one is
Connection Handling user-note #71172 (Nov 2006)
which is also the base for the following two I'd like to suggest you to look into:
Connection Handling user-note #89177 (Feb 2009)
Connection Handling user-note #93441 (Sep 2009)
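The pattern those notes describe boils down to something like this sketch: flush a complete response, tell the client the connection is closing, then keep working. Whether the connection actually closes depends on the server setup (output buffering, gzip, FastCGI); under PHP-FPM, fastcgi_finish_request() is the reliable way to get the same effect. do_long_running_work() is a hypothetical stand-in for the real task:

<?php
// Close-connection-early sketch, after user-note #71172.
ignore_user_abort(true);
set_time_limit(0);

ob_start();
echo 'All done, you can close this page.';
header('Connection: close');
header('Content-Length: ' . ob_get_length());
ob_end_flush();
flush();

// The browser has its complete response; the long work continues here.
do_long_running_work(); // hypothetical stand-in for the real task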
Don't fork the entire apache webserver, but start a separate process instead. Let that process fork off a child which lives on. Look for proc_open to get full fd interaction between your php script and the process.
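A sketch of that proc_open approach (the command and messages are illustrative); descriptors 0/1/2 are the child's stdin/stdout/stderr:

<?php
// Start a separate process and talk to it over pipes.
$spec = [
    0 => ['pipe', 'r'],  // child reads its stdin from us
    1 => ['pipe', 'w'],  // we read the child's stdout
    2 => ['file', '/tmp/worker.err', 'a'],
];

$proc = proc_open('php worker.php', $spec, $pipes);
if (is_resource($proc)) {
    fwrite($pipes[0], "start\n");        // talk to the child
    fclose($pipes[0]);
    echo stream_get_contents($pipes[1]); // read its reply
    fclose($pipes[1]);
    proc_close($proc);
}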
We solved this issue by inserting the work that needs to be done into a job queue and then having a cron script pick up the backend jobs regularly. Probably not exactly what you need, but it works very well for data-intensive processes.
(you could also use Zend Server's job queue, if you've got a wad of cash and want a tried-and-tested solution)
I am trying to create a multi-threaded PHP application right now. I have read lots of papers explaining how to do multi-threading. All of those examples are built on dividing the work among different worker PHP files. Actually, that is also what I am trying to do, but there is a problem :)
There are too many jobs to divide up even within 30 seconds (which is the execution time limit).
We are using a multi-server environment on the local network to complete the processes, since the processes are not linked to each other and do not share the same memory. We just need to fire them up and let them work at an exact time. Each process runs for about 0.5 seconds, but may take up to 30 seconds.
Most of the examples fire up the PHP scripts and wait for the results. Unfortunately, in my situation I don't need a result back from the thread; I just need it to execute the command and write the result to its own database.
How can I fire up the PHP scripts for 10,000 processes without waiting for them to finish?
ADDITIONAL INFO:
I know that PHP neither has multi-threading nor was built for it. But we have to find a way to use it. For instance, we can send a request to http://server1/dothis.php?jobid=5, but the standard methods make us wait for the result. If we can manage to send a request to this server without waiting for the result, I think it would solve our problem; otherwise we will need a completely different approach, such as a process divider in C++ or Qt.
As has been pointed out, PHP doesn't support multi-threading. However, as tomaszsobczak mentioned, there is a library called Gearman that will let you create "threads", leave them running, and reconnect to them from other scripts to check their status and so on.
From the project homepage: "Gearman provides a generic application framework to farm out work to other machines or processes that are better suited to do the work. It allows you to do work in parallel, to load balance processing, and to call functions between languages. It can be used in a variety of applications, from high-availability web sites to the transport of database replication events. In other words, it is the nervous system for how distributed processing communicates."
Rasmus' blog has a great write up about it here:
playing with gearman. For your case, it might just be the solution, although I've not read any in-depth test cases... I'd be interested to know, though, so if you end up using this, please report back!
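For reference, the worker side with the pecl Gearman extension looks roughly like this sketch (the function name "long_task" is illustrative); run it from the CLI and leave it up:

<?php
// Gearman worker sketch: blocks in work() waiting for jobs.
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);

$worker->addFunction('long_task', function (GearmanJob $job) {
    $payload = json_decode($job->workload(), true);
    // ... do the real work, optionally reporting progress:
    $job->sendStatus(1, 10); // numerator/denominator for jobStatus() polls
    return 'done';
});

while ($worker->work());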
As the comments say, multi-threading is not possible in PHP. But based on your comment:
If we can manage to send request to this server without waiting for result it would solve our problem I think
You can start a PHP script to run in the background using exec(), redirecting the script's output somewhere else (e.g. /dev/null). I think that's the best you will get. From the manual:
Note: If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
There are several notes and pointers in the User Contributed Comments, for example this snippet that allows background execution on both Windows and Linux platforms.
Of course, that PHP script will not share the state, or any data, of the PHP script you are running. You will have to initialize the background script as if you are making a completely new request.
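So pass the background script everything it needs as arguments, roughly like this sketch (*nix-style redirection; the snippet further down covers Windows; the script name and argument are illustrative):

<?php
// Hand the background script its state on the command line, since it
// won't share this request's state.
$jobId = uniqid('job', true);
exec('php ' . escapeshellarg(__DIR__ . '/worker.php')
    . ' ' . escapeshellarg($jobId) . ' > /dev/null 2>&1 &');
// worker.php reads the id via $argv[1] and loads/stores its own state.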
Since PHP does not support multi-threading, I don't quite know how to advise you. Each time a new script is loaded, a new instance of PHP is loaded, so if your server can handle that many PHP instances, you can do what you want.
Here is something however:
Code for background execution
<?php
// Launch a command without blocking the current script.
function execInBackground($cmd) {
    if (substr(php_uname(), 0, 7) == "Windows") {
        // start /B runs the command without opening a new window;
        // popen()/pclose() return immediately.
        pclose(popen("start /B " . $cmd, "r"));
    } else {
        // Redirecting output and appending & detaches the process on *nix.
        exec($cmd . " > /dev/null &");
    }
}
?>
Found at http://php.net/manual/en/function.exec.php#86329
Do you really want multi-threading in PHP? Or do you just want to execute a PHP script every second? In the latter case, a cron-like "execute this file every second" approach via Linux console tools should be enough.
If your task is to make a lot of HTTP requests, you can use curl_multi. There is a good library for doing it: http://code.google.com/p/rolling-curl/
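A minimal curl_multi sketch, firing several requests in parallel and draining them as they finish (the URLs are illustrative; rolling-curl wraps these same calls with a bounded concurrency window):

<?php
// Fire N HTTP requests in parallel with curl_multi.
$urls = ['http://server1/dothis.php?jobid=5', 'http://server1/dothis.php?jobid=6'];

$mh = curl_multi_init();
$handles = [];
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Pump all transfers until none are still running.
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh, 1.0); // sleep until there is activity
} while ($running > 0);

foreach ($handles as $ch) {
    echo curl_multi_getcontent($ch), "\n";
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);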
As everyone's already mentioned, PHP doesn't natively support multi-threading and the workarounds are, well, workarounds...
That said, have you heard of the Facebook PHP Compiler? Basically it compiles your PHP to highly optimized C++ and uses g++ to compile it. This opens up a world of opportunities, including but not limited to multi-threading!
The project is open source, and it's on GitHub.
If you just want to post an HTTP request, do it using PHP's cURL library. It will solve your issue.