The app is downloading a file (plist) which is generated by the server. The server takes a long time to generate the file, so I would like to be able to show progress (probably via a UIProgressView, but that's not important).
Since the file I'm downloading hasn't been created yet at the beginning of the request, we don't know the expectedContentLength. However, I have the means to provide progress updates from the [PHP] script itself. I'm using ob_flush() for each line in the file to do this, which works just fine in a browser.
But when I make the request from the app, I'm only getting a call from connection:didReceiveData: after the script has finished executing, so that's not of much use.
So my question boils down to this:
How can I tap into the progress of such a php script from my app?
I wouldn't mind sending 2 requests to the server, the first that generates the file and provides updates while doing so, and then another to download the actual file.
Since none of my NSURLConnection delegate methods are being called until the request completes, what does my script need to do to trigger these methods?
Your problem is most likely on the server. If the server is sending data as it is processed, and that processing takes a significant amount of time, you should be getting more than one connection:didReceiveData: callback.
There is some discussion that might be relevant in the PHP manual.
http://php.net/manual/en/function.ob-flush.php
I would verify using a packet analyzer that the server is actually sending data incrementally as you expect.
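On the PHP side, a script usually has to defeat several layers of buffering before lines actually leave the server. A minimal sketch of the kind of incremental flushing involved (the work step, chunk count, and settings here are illustrative, not a definitive recipe; proxies or mod_deflate can still batch output):

```php
<?php
// Sketch: emit progress lines while the file is generated, so the client
// receives data incrementally instead of in one burst at the end.
function stream_progress(int $steps): void
{
    for ($i = 1; $i <= $steps; $i++) {
        usleep(10000);                    // stand-in for one step of real work
        echo "progress: {$i}/{$steps}\n";
        flush();                          // push the line toward the client now
    }
}

header('Content-Type: text/plain');       // silently ignored on the CLI
ini_set('zlib.output_compression', '0');  // output compression batches data
while (ob_get_level() > 0) {
    ob_end_flush();                       // drop PHP's own output buffers
}
stream_progress(10);
```

Testing this from the command line (where there is no web-server buffering) is a good way to separate PHP-level buffering from Apache-level buffering.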
Related
I have a PHP script which is run by AJAX. At the end of the script, the server makes an HTTP request to a different server to log the successful completion of the script. The problem is, this second server sometimes takes a while to respond, and I would like this to happen after the AJAX client finishes its request.
Is there some PHP library or similar which could do this? Is my best bet to log the completion to a file on the first server, then have a cron script making the HTTP requests to the second server based on the contents of the file?
You can use file_get_contents() to call a remote server from PHP, or use the more complex but more feature-rich cURL wrapper library that PHP provides.
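A hedged sketch of the cURL route, with short timeouts so a slow logging server cannot stall the main script for long (the URL, payload, and timeout values are placeholders):

```php
<?php
// Sketch: POST a completion notice to a second server, bounded by timeouts
// so the caller is not held up if that server responds slowly.
function log_completion(string $url, array $data): bool
{
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => http_build_query($data),
        CURLOPT_RETURNTRANSFER => true,   // keep the response out of stdout
        CURLOPT_CONNECTTIMEOUT => 2,      // give up connecting after 2s
        CURLOPT_TIMEOUT        => 5,      // give up entirely after 5s
    ]);
    $ok = curl_exec($ch) !== false;
    curl_close($ch);
    return $ok;
}
```

If even a bounded delay is unacceptable, the file-plus-cron approach from the question remains the more robust option, since the AJAX response never waits on the second server at all.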
I want to have my own variable (most likely an array) that stores what my PHP application is up to right now.
The application can trigger few processes that are in background (like downloading files) and I want to have a list what is being currently processed.
For example
if php calls exec() that will be downloading for 15mins
and then another download starts
and another download starts
then if I access my application I want to be able to see that 3 downloads are in process. If none of them finished yet.
Can I do that using only memory, without storing anything on the disk?
I thought that the solution would be a some kind of server variable.
PHP doesn't have knowledge of previous processes. As soon as a PHP process finishes, everything it knows about itself goes with it.
I can think of two options. Write information about the spawned processes to a file or database and use it to sync all your PHP requests (store the PID of each spawned process).
Or
Create a daemon. The people behind PHP have worked hard to clean up PHP's memory handling and such to make this more feasible. Take a look at their PEAR package - http://pear.php.net/package/System_Daemon
Off the top of my head, a quick architecture would consist of three pieces:
Part A) The web app that will take in requests for downloads and report back the progress of all requests
Part B) Your daemon, which accepts requests for downloads, spawns processes, and will report back the status of all spawned requests
Part C) The spawned process that will perform the download you need.
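For the first option (PID tracking), a minimal sketch; this assumes a POSIX system with the posix extension, and uses wget purely as an example downloader (the paths and command are placeholders):

```php
<?php
// Sketch: spawn a background download and record its PID so later
// requests can list which downloads are still running.
function spawn_download(string $url, string $dest, string $pidDir): int
{
    $cmd = sprintf(
        'nohup wget -q %s -O %s > /dev/null 2>&1 & echo $!',
        escapeshellarg($url),
        escapeshellarg($dest)
    );
    $pid = (int) shell_exec($cmd);            // $! is the background PID
    file_put_contents("$pidDir/$pid.pid", $url);
    return $pid;
}

function active_downloads(string $pidDir): array
{
    $active = [];
    foreach (glob("$pidDir/*.pid") as $file) {
        $pid = (int) basename($file, '.pid');
        if (posix_kill($pid, 0)) {            // signal 0: just check liveness
            $active[$pid] = file_get_contents($file);
        } else {
            unlink($file);                    // finished: forget it
        }
    }
    return $active;
}
```

Any later request can call active_downloads() to show the "3 downloads in progress" view; the pid files are tiny, but note this does touch the disk, which the question hoped to avoid.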
Anyone for shared memory?
Obviously you would have to have some sort of daemon, but you could use the inbuilt semaphore functions to easily have contact between each of the scripts. You need to be careful though because sometimes if you're not closing the memory block properly, you could risk ending up with no blocks left.
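A sketch of that idea using PHP's built-in System V shared memory and semaphore functions (requires the sysvshm/sysvsem extensions; the key and variable slot are arbitrary demo values). Note the cleanup concern from above: blocks left behind by a crashed script must be removed with shm_remove()/sem_remove().

```php
<?php
// Sketch: share a task list between PHP processes via SysV shared memory,
// guarded by a semaphore so only one process reads/writes at a time.
const SHM_KEY  = 0x5450;   // arbitrary demo key
const TASK_VAR = 1;        // slot inside the segment holding the task array

function with_tasks(callable $fn)
{
    $sem = sem_get(SHM_KEY);
    sem_acquire($sem);                         // one process in here at a time
    $shm = shm_attach(SHM_KEY, 65536);
    $tasks = shm_has_var($shm, TASK_VAR) ? shm_get_var($shm, TASK_VAR) : [];
    $result = $fn($tasks);                     // callback may modify by ref
    shm_put_var($shm, TASK_VAR, $tasks);       // write the list back
    shm_detach($shm);
    sem_release($sem);
    return $result;
}

// Register a download; any other process can list them the same way.
with_tasks(function (array &$tasks) {
    $tasks[] = ['file' => 'demo.zip', 'started' => time()];
    return count($tasks);
});
```

This keeps everything in memory, as the question asked, at the cost of managing segment lifetime yourself.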
You can't store your own variables in $_SERVER. The best method would be to store your data in a database and query/update it as required.
Issue summary: I've managed to speed up the thumbing of images upon upload dramatically from what it was, at the cost of using concurrency. Now I need to secure that concurrency against a race condition. I was going to have the dependent script poll normal files for the status of the independent one, but then decided named pipes would be better. Pipes to avoid polling and named because I can't get a PID from the script that opens them (that's the one I need to use the pipes to talk with).
So when an image is uploaded, the client sends a POST via AJAX to a script which 1) saves the image 2) spawns a parallel script (the independent) to thumb the image and 3) returns JSON about the image to the client. The client then immediately requests the thumbed version, which we hopefully had enough time to prepare while the response was being sent. But if it's not ready, Apache mod_rewrites the path to point at a second script (the dependent), which waits for the thumbing to complete and then returns the image data.
I expected this to be fairly straightforward, but, while testing the independent script alone via terminal, I get this:
$ php -f thumb.php -- img=3g1pad.jpg
successSegmentation fault
The source is here: http://codepad.org/JP9wkuba I suspect that I get a segfault because that fifo I made is still open and now orphaned. But I need it there for the dependent script to see, right? And isn't it supposed to be non-blocking? I suppose it is because the rest of the script can run.... but it can't finish? This would be a job for a normal file as I had thought at the start, except if both are open I don't want to be polling. I want to poll once at most and be done with it. Do I just need to poll and ignore the ugliness?
You need to delete the FIFO files you created before the scripts finish.
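One way to guarantee that, sketched below: register the cleanup on shutdown so the pipe is removed even if the script exits early (assumes the posix extension; the path is a placeholder):

```php
<?php
// Sketch: create a named pipe and make sure it is removed when the
// script exits, so no orphaned FIFO is left behind.
$fifo = sys_get_temp_dir() . '/thumb_' . getmypid() . '.fifo';

if (!posix_mkfifo($fifo, 0600)) {
    exit("could not create fifo\n");
}

// Runs on normal exit and on fatal errors alike.
register_shutdown_function(function () use ($fifo) {
    if (file_exists($fifo)) {
        unlink($fifo);
    }
});

// ... open the fifo and talk to the dependent script here ...
```

Keep in mind that opening a FIFO for writing blocks until a reader attaches (and vice versa) unless you open it in non-blocking mode, which may be related to the hang you are seeing.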
I created a script that gets data from some web services and our database, formats a report, then zips it and makes it available for download. When I first started I made it a command line script to see the output as it came out and to get around the script timeout limit you get when viewing in a browser. But because I don't want my user to have to use it from the command line or have to run php on their computer, I want to make this run from our webserver instead.
Because this script could take minutes to run, I need a way to let it process in the background and then start the download once the file has been created successfully. What's the best way to let this script run without triggering the timeout? I've attempted this before (using backticks to run the script separately and such) but gave up, so I'm asking here. Ideally, the user would click the submit button on the form to start the request, then be returned to the page instead of having to stare at a blank browser window. When the zip file exists (meaning the process has finished), it should notify them (via AJAX? a reloaded page? I don't know yet).
This is on Windows Server 2007.
You should run it in a different process. Make a daemon that runs continuously, hits a database, and looks for a flag, like "ShouldProcessData". Then when you hit that website, switch the flag to true. Your daemon process will see the flag on its next iteration and begin the processing. Stick the results into the database. Use the database as the communication mechanism between the website and the long-running process.
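A minimal sketch of that flag pattern, using SQLite for illustration (the table, column names, and the "work" step are invented for the example; any database the website and daemon can both reach would do):

```php
<?php
// Sketch: the website inserts a pending job; the daemon polls for pending
// jobs, does the work, and marks them done with the result path.
function db(string $path): PDO
{
    $pdo = new PDO('sqlite:' . $path);
    $pdo->exec('CREATE TABLE IF NOT EXISTS jobs
                (id INTEGER PRIMARY KEY, status TEXT, result TEXT)');
    return $pdo;
}

// Website side: set the flag and return to the user immediately.
function request_report(PDO $pdo): int
{
    $pdo->prepare("INSERT INTO jobs (status) VALUES ('pending')")->execute();
    return (int) $pdo->lastInsertId();
}

// Daemon side: one iteration of the polling loop.
function process_pending(PDO $pdo): void
{
    $rows = $pdo->query("SELECT id FROM jobs WHERE status = 'pending'");
    foreach ($rows as $row) {
        $zip = "report_{$row['id']}.zip";   // stand-in for the real report job
        $pdo->prepare("UPDATE jobs SET status = 'done', result = ? WHERE id = ?")
            ->execute([$zip, $row['id']]);
    }
}
```

The page the user lands on can then poll the jobs table (via AJAX or a meta refresh) and offer the zip for download once the status flips to done.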
In PHP you have to specify what time-out you want for your process.
See set_time_limit() in the PHP manual.
You may have another problem: the time-out of the browser itself (which could be around 1-2 minutes). While that time-out should be changeable within the browser (for each browser), you can usually prevent the client-side time-out from being triggered by sending some data to the browser every 20 seconds or so (like the header for the download; you can then send other headers, like encoding, etc.).
Gearman is very handy for this (create a background task, let JavaScript poll for progress). It does of course require having Gearman installed and workers created. See: http://www.php.net/gearman
Why don't you make an AJAX call from the page where you want to offer the download, wait for the call to return, and call set_time_limit(0) on the other page?
I was wondering if it is possible to send output from an application run by PHP to the client.
For example, I have an application that outputs:
Hello world
And after 10 seconds it outputs
10 seconds passed
I'd like to know if it is possible to send "Hello world" and "10 seconds passed" to the client without waiting for the whole program to finish its job. The client would receive "Hello world" first, and the second output after 10 seconds.
Thank you.
Your title says "Asynchronous external application execution". By this, you would mean something that executes a program from your PHP script yet continues in its own process and does not hang the page load. You may want passthru(), specifically setting the command to output to a local file rather than to your script (personally not tested, though the PHP manual says you can), or pcntl_fork() to split your script into a separate process which handles the program execution on the side. However, sending more data to a browser after it has already disconnected from your server, and expecting it to display your uninvited message, is impossible unless you install a trojan on the client which will auto-accept your second, new forced TCP connection.
But, if you want a progress message for your page load, simply echo "still loading..." along your for or while loops. File download progress bars, on the other hand, cannot be dealt with in PHP: echoing "still loading..." in the middle of the download will corrupt the file. At the moment, I'm not aware of any facility to do this in PHP, JavaScript, or VB, except through the browser's own API (if documented) and only if the client allows it by installing a plugin you authored. But why, when browsers already have built-in progress bars?
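If the goal is the simpler case from the question, forwarding the program's output as it appears during a normal page load, a sketch like this works without any forking (the demo child command is a placeholder; the shell quoting assumes a POSIX system):

```php
<?php
// Sketch: run an external program and forward each line of its output to
// the client as soon as it is produced, instead of after the program exits.
function stream_command(string $cmd): void
{
    $proc = popen($cmd, 'r');
    if ($proc === false) {
        return;                   // could not start the program
    }
    while (($line = fgets($proc)) !== false) {
        echo $line;               // forward the line...
        flush();                  // ...and push it past remaining buffering
    }
    pclose($proc);
}

// Demo child: prints two lines with a pause between them.
$child = escapeshellarg(PHP_BINARY) . ' -r ' . escapeshellarg(
    'echo "Hello world\n"; usleep(200000); echo "10 seconds passed\n";'
);
stream_command($child);
```

The same output-buffering caveats apply as usual: PHP, the web server, and any proxy in between can each batch the lines back together, so this is only as incremental as the most aggressive buffer in the chain allows.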
I think you should do this with JavaScript. It's totally unnecessary to use CPU cycles on the server when all you need is to show how much time has passed.
Usually, a client pulls content from the server. If you want to push from the server to the client, you need to look into push technologies like Comet. There is not much available for PHP, though. Periodically pushing, with the PHP script terminating in between, requires a message queue.
I don't understand your application, but for batch processing this comes to mind:
php hello-world.php | php client.php
To scale it, use Hadoop.