I have my web app hosted on Azure. When I start an ffmpeg process with exec(), if the file is larger than 50 MB, it stops the web app until it finishes processing the file. Why does that happen?
Funnily enough, I programmed this for a customer myself a few weeks ago.
You can use AJAX on the front end and start each conversion via JavaScript.
You can use a PHP "daemon":
2.0 Start the program independently:
php converter.php video1.mp4 >/dev/null 2>&1 &
2.1 You can start the normal script (via the website) and have it relaunch itself as an independent process (e.g. when a certain POST or GET parameter is given): the script starts itself in the background with shell_exec('php converter.php daemon'); (for example). You can then check via AJAX, or maybe a websocket, whether the process has finished; see the sketch after the tips below.
Tips:
Use a .lock file or something similar in case the program could be started a second time before the previous run has finished.
Use this library: composer require php-ffmpeg/php-ffmpeg (https://github.com/PHP-FFMpeg/PHP-FFMpeg)
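A minimal sketch of the 2.1 pattern with a lock file; the lock path, ffmpeg command and file names here are illustrative assumptions, not from the original setup:

<?php
// converter.php - sketch of the "relaunch yourself in the background" pattern.
$lock = '/tmp/converter.lock';

if (php_sapi_name() !== 'cli') {
    // Web request: relaunch this script as an independent CLI process
    // and return immediately so the web app is not blocked.
    if (!file_exists($lock)) {
        shell_exec('php ' . __FILE__ . ' daemon >/dev/null 2>&1 &');
    }
    echo 'conversion started';
    exit;
}

// CLI ("daemon") branch: do the actual work behind the lock file.
file_put_contents($lock, getmypid());
shell_exec('ffmpeg -i video1.mp4 video1.webm');
unlink($lock);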
Hi, I am doing a project with a Raspberry Pi. I made a Python program which has an endless loop inside. I also made a PHP website which calls that Python program and makes it run in the background this way:
$call = "sudo python ../python/readver12.py > /dev/null 2>&1 &";
shell_exec($call);
Everything seems okay, but I don't know how to find out whether my Python program is running in the background or not, and how to make that status available on my website with PHP.
I guess there are many ways to do that:
Get the logs from the terminal
You can emit some log output while running your endless Python script and capture it with PHP:
https://docs.python.org/3/howto/logging.html
Write to files and read them with PHP
Straightforward: you write to a file with Python and read it with PHP:
http://www.pythonforbeginners.com/files/reading-and-writing-files-in-python
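A hedged sketch of that file approach from the PHP side, assuming the Python loop touches a status file on every iteration; the path and the 15-second threshold are illustrative:

<?php
// Decide whether the background script is alive from the age of a
// status file that the Python loop updates on each iteration.
$statusFile = '/tmp/readver12.status';

$running = file_exists($statusFile)
    && (time() - filemtime($statusFile)) < 15;

echo $running ? 'python script is running' : 'python script is NOT running';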
REST API on the PHP website, called from Python with cURL
You can use cURL inside your Python script to communicate with your PHP endpoints and exchange the needed data:
cURL: http://pycurl.io/docs/latest/index.html
PHP Api Rest: https://www.codeofaninja.com/2017/02/create-simple-rest-api-in-php.html
I hope it helps.
With shell_exec(), PHP will wait to continue your script until the application you're calling exits, and will return that application's output as a string. By the time your script picks back up, the child is already done (or you've hit the PHP time limit).
It sounds like you want to start the process and monitor it while it's running. For that, look at proc_open() and proc_get_status().
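A minimal sketch of that approach; the command and log paths are illustrative:

<?php
// Launch a long-running command without blocking, then poll its status.
$descriptors = [
    0 => ['pipe', 'r'],                  // child stdin
    1 => ['file', '/tmp/job.log', 'w'],  // child stdout
    2 => ['file', '/tmp/job.err', 'w'],  // child stderr
];

$process = proc_open('python readver12.py', $descriptors, $pipes);

if (is_resource($process)) {
    fclose($pipes[0]);                   // we send no input
    $status = proc_get_status($process);
    // $status['running'] is true while the child is alive; keep
    // $status['pid'] around if you want to check again later.
    echo $status['running'] ? "still running\n" : "already exited\n";
}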
I am using a PHP server at the back end and a basic web page which asks the user to upload an image. This image is used as input to a MATLAB script executed on the server side.
What I need is something like a MATLAB session (not sure that's the right word) that is already running on the server side and executes the MATLAB script. I know about the command matlab -nodesktop -nojvm, but the point is that I don't want to invoke MATLAB again and again; rather, I want to execute the MATLAB script on the running MATLAB instance whenever a user uploads an image, and get the output.
There are some constraints:
1. OS -> Ubuntu
2. Can't use the Python engine.
There are multiple interfaces for controlling MATLAB. Probably the best choice in this case is matlabcontrol, or the MATLAB Engine for Python (which you can't use for some reason). On Windows, a third alternative would be COM.
Besides controlling the MATLAB process, you could implement an application in MATLAB which receives the data, processes it and sends it back. I solved a similar problem using Apache XML-RPC in MATLAB.
There are also some submissions on the MATLAB File Exchange that directly provide a MATLAB console via the web.
You could write Matlab code to check the upload folder for new images regularly. Process new images and then move the processed images to an archive folder.
To check for new files, use the dir command:
FILES = dir('path/to/upload/folder/*.PNG');
Replace PNG extension with that of your image files.
To move files use movefile command
movefile('path/to/upload/folder/Filename.PNG', 'path/to/archive/folder/', 'f')
To run the MATLAB script from a terminal and keep it running in the background:
/usr/local/MATLAB/R2014a/bin/matlab -nodisplay -nosplash -r "cd /path/to/matlab/code; MatlabScript"
Then press Ctrl+Z to suspend it, and run bg; disown -h %1 to resume it in the background and detach it from the shell.
There is a C++ API where you call the MATLAB engine using engOpen. This opens MATLAB and leaves it running until you close it. Your C++ program can then wait and listen for images to process.
http://www.mathworks.com/help/matlab/calling-matlab-engine-from-c-c-and-fortran-programs.html
Another option is to compile the MATLAB script as a standalone executable. Hard-code the input and output image names and let PHP handle moving the inputs and outputs around. All the server needs to do is call the executable. It takes about 5 seconds to start the MATLAB Runtime each time.
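A hedged sketch of the PHP side under that scheme; every path, the runtime location and the run_*.sh wrapper name are assumptions:

<?php
// Stage the upload under the fixed name the compiled MATLAB executable
// expects, run it, and collect the result. All paths are placeholders.
move_uploaded_file($_FILES['image']['tmp_name'], '/srv/matlab/input.png');

// mcc-compiled standalones ship with a run_<name>.sh wrapper that takes
// the MATLAB Runtime root as its first argument.
shell_exec('/srv/matlab/run_process_image.sh /usr/local/MATLAB/MATLAB_Runtime/v84');

$result = file_get_contents('/srv/matlab/output.txt');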
I use Phirehose to get a live and continuous stream from the Twitter UserStream API. So far I have been able to execute php -S localhost:8000 index.php and it fires up and works fine.
Now I want to use the data from the CLI script in Laravel.
1) How can I stream the Phirehose data to Laravel?
2) How can I get this script to stay active in the background on a non-GUI droplet at DigitalOcean?
In your Phirehose script, write each tweet to a database. In your Laravel application (which I assume is being accessed by users from their browsers?), query that database. The database need not be as heavy as MySQL; it could instead be memcache, Redis or one of the NoSQL options.
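A minimal sketch of that hand-off using Redis; the exact base class and constructor depend on your Phirehose version, and the credentials, stream method and queue key are placeholders:

<?php
// A Phirehose consumer that pushes each raw tweet onto a Redis list,
// where the Laravel app can pop and process it.
require_once 'lib/Phirehose.php';

class RedisQueueConsumer extends Phirehose
{
    private $redis;

    public function __construct($username, $password)
    {
        parent::__construct($username, $password, Phirehose::METHOD_SAMPLE);
        $this->redis = new Redis();          // phpredis extension
        $this->redis->connect('127.0.0.1');
    }

    // Phirehose calls this once per received status (tweet)
    public function enqueueStatus($status)
    {
        $this->redis->rPush('tweets', $status);  // raw JSON onto a list
    }
}

$consumer = new RedisQueueConsumer('user', 'pass');
$consumer->consume();                            // blocks; run from the CLI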
For getting a Phirehose script to run in the background I would login over ssh and do this:
nohup php myscript.php 2>&1 &
(This assumes you have installed the php-cli package for your distro.)
The nohup part means you can logout and it will keep running. The 2>&1 means both stdout and stderr messages will be written to nohup.out. The & at the end is what puts it into the background.
(In fact I do something a bit more complicated: I have my Phirehose script write to a keep-alive file every 10 seconds. I then have another PHP script, started from a 1-minute cron, that checks that the keep-alive file is being updated and, if not, starts the Phirehose script running.)
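A hedged sketch of that watchdog; the paths and the 30-second staleness threshold are illustrative:

<?php
// Cron watchdog: if the keep-alive file hasn't been touched recently,
// (re)start the Phirehose script. Paths and threshold are placeholders.
$keepAlive = '/var/run/phirehose.keepalive';

$stale = !file_exists($keepAlive)
    || (time() - filemtime($keepAlive)) > 30;

if ($stale) {
    // nohup + & so the restarted script survives this cron run
    shell_exec('nohup php myscript.php > /dev/null 2>&1 &');
}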
I wanted to use my local server, which is running Windows 7, to take advantage of the Task Scheduler to set up some cron jobs for some of my PHP files.
I can do this currently by:
start http://theurl
Which opens it in my default browser. However, I was hoping to accomplish this without physically opening a browser, so that when I come back to my computer after a few days I don't have millions of Chrome windows open.
How can I load a URL from Task Scheduler via cmd without opening a browser client?
I was able to accomplish the cron job by using a program called wget. I set up Task Scheduler to run wget.exe at my specified time with these arguments:
wget -q -O - http://theurl.com > tmp.txt
This loads the website and stores the output in a temporary text file, which is overwritten the next time it runs.
If you just want to run some PHP files, you don't need a browser. You can run them from the command line:
php -f /path/to/php/file.php
However, if you really need to access a page, you can do several things, like using file_get_contents() or making a cURL request from PHP.
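A minimal cURL sketch; the URL is a placeholder:

<?php
// Fetch the page from PHP instead of a browser.
$ch = curl_init('http://theurl.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the body rather than printing it
$body = curl_exec($ch);
curl_close($ch);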
You don't need cmd or shell access. If your host has the HTTP wrapper enabled, a call to file_get_contents() is all you need:
file_get_contents('http://theurl');
You can also use fopen() if you're not interested in the response from the server.
I am working on a web payroll project using the Symfony framework. We have 27,000 employees to process every month. When we run the payroll process we cannot depend on a browser request, as it takes a long time and the server times out. As a workaround we would like to execute the PHP script in the background on the Linux server; then, even if the browser is closed, the script keeps executing in the background.
What is the best way to do this task? Please help.
More info
We would like to provide a payroll-process start button in the web interface. When the user clicks the button, the payroll process should start, and even if the browser is closed it should keep executing in the background until done.
regards
To run it in the background, do the following steps:
Wrap your PHP command in a shell script:
#!/usr/bin/bash
# set up environment variables, PATH etc. here
php /home/yourapp/yourscript.php
Code up a web page/php script to request the start.
In the script you need the line:
system('/home/yourapp/yourscript.sh > scriptlog.txt &');
You probably need an extra link to browse the "scriptlog.txt" file from the web.
There are different ways to handle this problem.
The first way is to execute a shell command in the background (using nohup if you like) from your PHP code. That's nearly the same as the first answer in the duplicate question.
The second way is to use PHP's PCNTL feature.
Using PCNTL you can create child processes which run in the background. So it's possible to fork and return to the user: "the payroll process is running - you will get a mail when the system is done".
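A hedged sketch of that fork-and-return pattern; it requires the pcntl extension (CLI-style SAPIs only), and runPayrollForAllEmployees() and the mail address are hypothetical placeholders:

<?php
// Fork: the parent answers the user immediately, the child runs the
// long payroll job.
$pid = pcntl_fork();

if ($pid === -1) {
    die('could not fork');
} elseif ($pid > 0) {
    // Parent: return to the user right away
    echo "Payroll process is running - you will get a mail when it is done.\n";
} else {
    // Child: do the heavy work, then notify (both names are hypothetical)
    runPayrollForAllEmployees();
    mail('hr@example.com', 'Payroll done', 'All employees processed.');
    exit(0);
}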