run php process in background, send email when finished - php

I am writing a script that allows users to download VM images from a remote repository. The images have to be downloaded from the remote repository (a) to a local server (b), and then users can download the image from that local server (b) via a URL link. This is achieved via a PHP exec call on an API with URL endpoints.
The problem is that the image transfer from the "a" machine to the "b" machine can take a while. Is there a way to have the download process execute in the background, so that when the image transfer is done, the user gets an email containing the link to the file?
Otherwise, the user will just sit at a spinning page for as long as the max_execution_time setting will allow.
I was looking at this site for reference, but it was not super helpful.
Edit: I am running on a LAMP setup

You may want to look into starting a worker via beanstalk.
http://kr.github.io/beanstalkd/
You send a message containing the download link and the email address to notify. A worker can be started on demand when your message is sent and automatically start the download. When the download is complete, your worker would fire off the email.
The PHP library to allow you to interface with beanstalk can be found here:
https://github.com/pda/pheanstalk
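For this use case, the producer side might look roughly like the following sketch, assuming a classic pheanstalk v3-style API; the tube name and payload fields are invented for illustration:

```php
<?php
// Web-facing side: enqueue the transfer job and return to the user right away.
// Tube name and payload shape are our own invention, not a fixed schema.
require 'vendor/autoload.php';

$pheanstalk = new Pheanstalk\Pheanstalk('127.0.0.1');

$pheanstalk
    ->useTube('image-downloads')
    ->put(json_encode([
        'source_url' => 'https://repo.example.com/images/debian.qcow2', // remote repo (a)
        'notify'     => 'user@example.com',                             // who gets the link
    ]));

echo 'Your download has been queued; you will get an email when it is ready.';
```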

Beanstalkd
Beanstalkd is a daemon written to handle running jobs asynchronously so that your user is not left hanging while waiting for a task to finish. It's written in C, and there are client libraries in many languages to interface with it.
Pheanstalk
Pheanstalk is the PHP library for integrating with Beanstalkd. You can define Job classes and then use this API to submit those jobs for processing.
Most major frameworks have support for something like this.
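A matching worker, run from the CLI, could look like this sketch (same assumed pheanstalk v3-style API; copy() and mail() stand in for whatever transfer and mailer code you actually use):

```php
<?php
// CLI worker: blocks on reserve(), performs the slow transfer from (a) to (b),
// then emails the user a link to the local copy.
require 'vendor/autoload.php';

$pheanstalk = new Pheanstalk\Pheanstalk('127.0.0.1');
$pheanstalk->watch('image-downloads');

while (true) {
    $job  = $pheanstalk->reserve();            // blocks until a job arrives
    $data = json_decode($job->getData(), true);

    $dest = '/var/www/html/images/' . basename($data['source_url']);
    copy($data['source_url'], $dest);          // pull image from (a); assumes allow_url_fopen

    $link = 'http://b.example.com/images/' . basename($dest);
    mail($data['notify'], 'Your image is ready', "Download it here: $link");

    $pheanstalk->delete($job);                 // job done, remove it from the tube
}
```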

Related

Async/Thread on PHP7 with FPM

I found that pthreads does not work in a web environment. I use PHP 7.1 with FPM on Debian Linux, together with Symfony 3.2. All I want to do is, for example:
A user makes a request and PUTs a file (which may be 1 GB)
The PHP server receives the file and processes it
Immediately return true to the user (a JsonResponse) without waiting for the uploaded file to be processed
Later, when processing the file is finished (move, copy, duplicate, whatever you want), fire an event or callback from the background and notify the user
For now, I have created a console command. I execute Process('bin/console my:command')->start() in the background and do my processing there. But to me this is killing a fly with a bazooka: I have to pass many variables to this command.
All I want is to spawn another thread and return to the user without waiting for processing to finish.
You may say this is a duplicate and point to pthreads. But the pthreads documentation states that it is only intended for the CLI, and the latest version of pthreads doesn't work with Symfony (fatal error).
I am stuck at this point and am unsure whether I should stick with creating a process for each uploaded file or move to Python/Django.
You don't want threads. You want a job queue. Have a look at Gearman or similar things.
Gearman provides a generic application framework to farm out work to other machines or processes that are better suited to do the work. It allows you to do work in parallel, to load balance processing, and to call functions between languages. It can be used in a variety of applications, from high-availability web sites to the transport of database replication events. In other words, it is the nervous system for how distributed processing communicates.
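As a rough sketch of what that looks like with the PECL gearman extension (the function name, host, and payload are illustrative, not prescribed):

```php
<?php
// --- Web side (e.g. your Symfony controller) ---
// doBackground() returns immediately; the job server holds the task.
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);
$client->doBackground('process_upload', json_encode(['path' => '/tmp/upload_1.bin']));
// ...return a JsonResponse(['queued' => true]) to the user here...

// --- Worker side (separate CLI process, e.g. bin/worker.php) ---
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);
$worker->addFunction('process_upload', function (GearmanJob $job) {
    $data = json_decode($job->workload(), true);
    // ...move/copy/process $data['path'], then notify the user...
});
while ($worker->work());
```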

How to use ImageMagick via another server using PHP's system() function?

We do a lot of processing for images, and a lot of the time this processing maxes out our CPU and causes our site to crash. What we want to do is put the image processing on another server, so that we can scale that server as necessary and not have our current server crash.
I'm wondering how to go about this though. Our current process is:
1) Users make an AJAX request to our image processing script.
2) We construct a string based on the user's input. This string contains the commands to perform an ImageMagick process.
3) We run the string through PHP's system() command.
4) We then send headers to the page and use PHP's imagecreatefrompng() function on the file to output the image to the user.
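For reference, steps 2) through 4) amount to something like the following sketch; the paths and the resize operation are hypothetical, and escapeshellarg() is used to keep user input out of the shell:

```php
<?php
// 2) Build the ImageMagick command from user input, escaped for the shell.
$size = escapeshellarg($_GET['size'] ?? '100x100');
$src  = '/var/tmp/in.png';   // placeholder input path
$dst  = '/var/tmp/out.png';  // placeholder output path

// 3) Run it through system().
system(sprintf('convert %s -resize %s %s', escapeshellarg($src), $size, escapeshellarg($dst)));

// 4) Send headers and stream the result back with GD.
header('Content-Type: image/png');
$img = imagecreatefrompng($dst);
imagepng($img);
imagedestroy($img);
```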
So what I'd like to know is: what's the best way to hand off the ImageMagick processing? I thought of connecting to the other server remotely via SSH, but I'm sure there is a limit on the number of concurrent SSH connections, and with hundreds of users online at a time we would need that many connections at once.
Anyone with any ideas on how best to transfer our image processing to another server would be greatly welcomed.
Thanks!
SSH would not be an appropriate protocol for distributing work requests to another server. A popular approach is to leverage a messaging queue to dispatch tasks to "worker" nodes. The implementation can vary greatly depending on design, needs, and resource constraints. Here's a quick bare-bones outline...
A web server receives a new image item.
Writes the image to a CDN, network mount, etc.
Publishes a task to a messaging queue, like RabbitMQ
A worker node listens for new tasks.
Consumes and performs request.
Writes result output next to source on CDN
Notifies task completion by either updating a record in the DB or publishing back to the MQ.
Check out the RabbitMQ/PHP "Hello World" and "Work Queues" articles for detailed examples.
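Publishing the task to RabbitMQ with php-amqplib is only a few lines; a sketch, with the queue name and payload made up for the example:

```php
<?php
// Publisher: the web server drops a task onto a durable RabbitMQ queue.
require 'vendor/autoload.php';

use PhpAmqpLib\Connection\AMQPStreamConnection;
use PhpAmqpLib\Message\AMQPMessage;

$connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
$channel    = $connection->channel();

$channel->queue_declare('image_tasks', false, true, false, false); // durable queue

$msg = new AMQPMessage(
    json_encode(['image' => 'uploads/photo123.png', 'op' => 'resize 800x600']),
    ['delivery_mode' => AMQPMessage::DELIVERY_MODE_PERSISTENT]     // survive broker restarts
);
$channel->basic_publish($msg, '', 'image_tasks');

$channel->close();
$connection->close();
```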

PHP : How to run process separate from main process

I have a function to import data from Excel into a database. I want this function to run on the server so it no longer needs to interact with the client: the client's web browser just needs to upload the Excel file to the server, and after that the task runs entirely on the server, so even if the client closes the browser, the function keeps running. I have got this working. The problem is that when the client leaves the browser open, it keeps loading for as long as the function is still active. How can I make the browser not wait for a response from the server, so it is not stuck loading while the process runs on the server? Please help me.
Use a message queue to offload the task of processing the file from the web server to another daemon running separately.
You can take the cheap and easy route of execing a process with & in the command line, causing it to be backgrounded. However, that gives you little control / status.
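A minimal sketch of that cheap route (the worker script path is a placeholder); redirecting output is what lets PHP return without waiting:

```php
<?php
// Fire-and-forget: redirect stdout/stderr and background the process with "&",
// so exec() returns immediately instead of waiting for the import to finish.
exec('php /path/to/import_excel.php > /dev/null 2>&1 &');
echo json_encode(['queued' => true]);
```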
The right way to go about it IMO is to queue up these long-running tasks in a database, with some status info associated with them. Then have a dedicated process which runs separate from your webserver, checking the database for tasks, and performs them, updating the database with success/failure status.
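A bare-bones version of that dedicated process could look like this (the tasks table, its columns, and the import_excel() stub are assumptions for the sake of illustration):

```php
<?php
// Standalone worker: poll the database for pending tasks and process them.
$db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

function import_excel(string $file): bool
{
    // ...your existing Excel-to-database import logic goes here...
    return true;
}

while (true) {
    $task = $db->query(
        "SELECT id, file FROM tasks WHERE status = 'pending' ORDER BY id LIMIT 1"
    )->fetch(PDO::FETCH_ASSOC);

    if ($task === false) {
        sleep(5); // nothing to do; wait before polling again
        continue;
    }

    $db->exec("UPDATE tasks SET status = 'running' WHERE id = " . (int) $task['id']);
    $status = import_excel($task['file']) ? 'done' : 'failed';
    $db->exec("UPDATE tasks SET status = '$status' WHERE id = " . (int) $task['id']);
}
```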
Look into using a queue such as Mseven's Queue Plugin:
Msevens Queue Plugin
Or, if you want a more daemon-based job, look into Beanstalkd. The queue plugin by mseven is pretty self-explanatory, though. Stay away from forking processes using &; it can get out of control.

How to handle queueing of video encoding during multiple video uploads?

I am working on developing a video streaming site where users can upload videos to the site (multiple videos at once using the uploadify jquery plugin).
Now, I am faced with the question of encoding the videos to FLV for streaming them online.
When should the video encoding process take place? Should it take place immediately after uploads have finished (i.e. redirect the user to an upload-success page, then start encoding in the background using an exec command for ffmpeg)? Using this approach, how do I determine whether the encoding has finished successfully? What if users upload a corrupt video and ffmpeg fails to encode it? How do I handle this in PHP?
How do I queue the encoding of videos, since multiple users can upload at the same time? Does FFmpeg have its own encoding queue?
I also read about Gearman and message queueing options such as Redis and AMQP in another related SO thread. Are these potential solutions?
I would really appreciate it if someone could answer my questions.
You should use a piece of software called Gearman. It is a job server, and you can call its functions using PHP. You can push the process to the background, and it handles queuing automatically. Many processes can also run in parallel.
I've used it, and it is very easy to install and operate.
For your use case,
Wrap the ffmpeg process in a PHP file with exec.
Save the uploaded file to the db and give it an id.
Call the encode function after a user uploads a file.
As soon as the encode function starts, update the db to reflect that the file was "picked".
First run an ffmpeg info check on the file to see if it is fine.
Then encode the file, and after it is done, update the db flag to "done".
After the process is done, you can call another function to send an email, tweet etc. to the user saying that the encoding is complete.
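The ffmpeg wrapper from the first step can be as simple as checking exit statuses; a sketch, with the file paths as placeholders:

```php
<?php
// Probe the upload first: decoding to the null muxer with a non-zero exit
// status suggests a corrupt or unsupported file.
$src = '/uploads/video123.avi';
$dst = '/encoded/video123.flv';

exec(sprintf('ffmpeg -v error -i %s -f null - 2>&1', escapeshellarg($src)), $probe, $probeStatus);
if ($probeStatus !== 0) {
    // mark the db record as "failed" and notify the user
    exit(1);
}

exec(sprintf('ffmpeg -i %s %s 2>&1', escapeshellarg($src), escapeshellarg($dst)), $out, $status);
// $status === 0 means the encode succeeded; update the db flag to "done"
```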
Since encoding may take some time, you may want to put the input files in a folder and add an entry to a database. Then have a script that runs either constantly or every X minutes and converts the pending videos from the database to FLV format.
To queue them you will need to write a custom script that re-runs FFmpeg for each file.
You could use a cron job on the server, or use something like beanstalkd (or gearman as mentioned in other answers to this question).
FFmpeg is just a command-line utility. It doesn't have any queue, and you will need to build your own queuing system to perform the work asynchronously.
For PHP queues, I had a good experience with the beanstalkd server, using the pheanstalk PHP client. Your upload script should insert items into the queue for encoding and return a response to the user saying the video will be processed shortly. Your workers should fetch items from the queue and run FFmpeg on them. You can read FFmpeg's exit status to figure out whether the encoding completed successfully.

Using Javascript to perform a process and send updates/callbacks to a webserver

I am working on a process to allow people to upload PDF files and manage the document (page order) via a web based interface.
The pages of the PDF file need to be cropped to a particular size for printing and currently we run them through a Photoshop action that takes care of this.
What I want to do is upload the PDF files to a dedicated server for performing the desired process (photoshop action, convert, send images back to web server).
What are some good ways to perform these functions while sending updates to the web server, so we can provide process tracking/progress bars that keep the user informed of how long their files are taking to process?
Additionally what are some good techniques for queueing/tracking jobs/processes in general (with an emphasis on web based technologies)?
Derek, I'm sure you have your reasons for using Photoshop, but seriously, did ImageMagick prove insufficient for you? I once worked with a fax utility that converted Fax.g3 files to TIFF, increased the contrast and brightness by 15% using ImageMagick, and converted the result back to PDF. IM worked as a standalone Linux program invoked by a system() call, and I know there is a new ImageMagick PECL extension.
Create a queue and push jobs to it. Have a cron job or daemon running that takes jobs from the queue and processes them. Make sure that you use some sort of locking, so you can safely stop/start the daemon/job.
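For the locking part, a sketch using PHP's flock() (the lock file path is arbitrary):

```php
<?php
// Guarantee a single worker instance: try to take an exclusive, non-blocking
// lock on a well-known file, and bail out if another instance holds it.
$lock = fopen('/tmp/pdf-worker.lock', 'c');
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit("Another worker is already running.\n");
}

// ...fetch pending jobs from the queue and process them...

flock($lock, LOCK_UN);
fclose($lock);
```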
If you expect the job to finish quickly, you can use a technique known as "comet". Basically, you establish a connection from JavaScript (using XmlHttpRequest) to your server-side script. In this script, you check whether the job is completed. If not, you sleep for a second or two and then check again. You keep doing this until the job finishes, then you send a response back. The result is that the request takes a while to complete, but it returns as soon as the job is done. You can then take appropriate action in JavaScript (reload the page or whatever).
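A minimal server-side long-poll endpoint for this could look like the following sketch; the job id handling and the flag-file completion signal are assumptions, not a prescribed design:

```php
<?php
// "Comet"-style endpoint: hold the request open until the job finishes or
// we approach typical proxy timeouts, then answer the XmlHttpRequest.
function job_is_done(string $jobId): bool
{
    // Completion signal: a flag file written by the worker when it finishes.
    // A database lookup would work just as well.
    return is_file('/tmp/jobs/' . basename($jobId) . '.done');
}

$jobId    = $_GET['job'] ?? '';
$deadline = time() + 25; // stay under typical proxy/browser timeouts

while (time() < $deadline) {
    if (job_is_done($jobId)) {
        echo json_encode(['status' => 'done']);
        exit;
    }
    sleep(2); // check again in a couple of seconds
}
echo json_encode(['status' => 'pending']); // client retries with a new request
```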
