I use Phirehose to get a live, continuous stream from the Twitter UserStream API. So far I have been able to execute php -S localhost:8000 index.php and it fires up and works fine.
Now I want to use the data from the CLI script in Laravel.
1) How can I stream the Phirehose data to Laravel?
2) How can I get this script to stay active in the background on a non-GUI DigitalOcean droplet?
In your Phirehose script, write each tweet to a database. In your Laravel application (which I am assuming is accessed by users from their browsers?), query that database. The database need not be as heavy as MySQL; it could instead be Memcached, Redis, or one of the NoSQL options.
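A minimal sketch of that approach, assuming Redis via the phpredis extension; the class and list names are illustrative, and the OAuth constructor details depend on your Phirehose version:
<?php
// Sketch: a Phirehose consumer that pushes each raw tweet onto a
// Redis list for the Laravel side to read. enqueueStatus() is the
// method Phirehose calls for every incoming status.
class RedisQueueConsumer extends OauthPhirehose
{
    private $redis;

    public function enqueueStatus($status)
    {
        if ($this->redis === null) {
            $this->redis = new Redis();              // phpredis extension assumed
            $this->redis->connect('127.0.0.1', 6379);
        }
        $this->redis->rPush('tweets', $status);      // $status is the raw JSON
    }
}
On the Laravel side you can then pull tweets off the same list, for example with Redis::lpop('tweets') via Laravel's Redis facade, and json_decode() each one.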
To get a Phirehose script running in the background, I would log in over SSH and do this:
nohup php myscript.php 2>&1 &
(This assumes you have installed the php-cli package for your distro.)
The nohup part means you can log out and it will keep running. The 2>&1 means both stdout and stderr will be written to nohup.out. The & at the end is what puts it into the background.
(In fact I do something a bit more complicated: I have my Phirehose script write to a keep-alive file every 10 seconds. Another PHP script, started from a 1-minute cron, then checks that the keep-alive file is being updated and, if not, starts the Phirehose script again.)
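A sketch of that watchdog, run from the 1-minute cron; the file path, the 60-second staleness threshold, and the script path are all assumptions:
<?php
// watchdog.php: restart the collector if its keep-alive file has gone stale.
$keepAlive = '/tmp/phirehose.alive'; // the collector touches this every ~10s
$stale = !file_exists($keepAlive) || (time() - filemtime($keepAlive)) > 60;
if ($stale) {
    // Relaunch detached from this process, as with the nohup command above.
    exec('nohup php /path/to/myscript.php >> /tmp/phirehose.log 2>&1 &');
}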
I have my web app deployed on Azure. When I start an ffmpeg process with exec(), if the file is larger than 50MB, it stops the web app until it finishes processing the file. Why does that happen?
Funnily enough, I programmed this for a customer myself a few weeks ago.
You can use AJAX on the front end and start each conversion via JS.
You can use a PHP "daemon":
2.0 Start the program independently: php converter.php video1.mp4 >/dev/null 2>&1 &
2.1 You can start the normal script (via the website) and have it relaunch the converter as an independent process (e.g. when a POST or GET is given): the script starts it in the background with shell_exec('php converter.php daemon >/dev/null 2>&1 &'); (for example). You can then check via AJAX, or maybe a WebSocket, whether the process has finished.
Tips:
Use a .lock file or something similar in case the program could be started a second time before the previous run has finished (see the sketch after these tips).
Use this library: composer require php-ffmpeg/php-ffmpeg (https://github.com/PHP-FFMpeg/PHP-FFMpeg)
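Here is a sketch combining 2.0/2.1 with the lock-file tip; the file names and the X264 output format are assumptions, and flock() is used because it is an atomic alternative to checking file_exists():
<?php
// converter.php (sketch): convert one video, guarded by a lock file.
require 'vendor/autoload.php';

if ($argc < 2) {
    exit("usage: php converter.php <video>\n");
}
$video = $argv[1];

// flock() acquires the lock atomically, so two background runs
// can't convert the same file at once.
$fp = fopen($video . '.lock', 'c');
if (!flock($fp, LOCK_EX | LOCK_NB)) {
    exit; // a conversion of this file is already running
}

$ffmpeg = FFMpeg\FFMpeg::create();
$ffmpeg->open($video)
       ->save(new FFMpeg\Format\Video\X264(), $video . '.converted.mp4');

flock($fp, LOCK_UN);
fclose($fp);
unlink($video . '.lock');
The website can then launch it with shell_exec('php converter.php video1.mp4 >/dev/null 2>&1 &'); and poll via AJAX for the .lock file to disappear (or for the output file to appear).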
Hi, I am doing a project with a Raspberry Pi. I made a Python program which has an endless loop inside. I also made a PHP website which calls that Python program and makes it run in the background this way:
$call = "sudo python ../python/readver12.py > /dev/null 2>&1 &";
shell_exec($call);
Everything seems okay, but I don't know how to find out whether my Python program is running in the background or not, and make that status available on my website with PHP.
I guess there are many ways to do that:
Get the logs from the terminal
You can emit some logs from your endless Python script and capture them with PHP:
https://docs.python.org/3/howto/logging.html
Write to files with Python and read them with PHP
Straightforward: you write to a file with Python and read it with PHP (see the sketch at the end of this list):
http://www.pythonforbeginners.com/files/reading-and-writing-files-in-python
REST API on the PHP website, called from Python with cURL
You can use cURL inside your Python script to communicate with your PHP endpoints and exchange the needed data:
cURL: http://pycurl.io/docs/latest/index.html
PHP Api Rest: https://www.codeofaninja.com/2017/02/create-simple-rest-api-in-php.html
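As a minimal sketch of the file-based option on the PHP side, assuming the Python loop rewrites a status file every few seconds (the path and the 15-second staleness threshold are arbitrary choices):
<?php
// status.php: report whether the Python loop is alive, based on how
// recently it touched its status file.
$statusFile = '/tmp/readver12.status';
$running = file_exists($statusFile)
        && (time() - filemtime($statusFile)) < 15;
echo $running ? 'running' : 'stopped';
On the Python side, an open(path, 'w') plus a write inside the loop is enough to keep the file fresh.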
I hope it helps.
With shell_exec(), PHP waits for the application you're calling to exit, and returns that application's output as a string. By the time your script picks back up, the child is already done (or you've hit the PHP time limit).
It sounds like you want to start the process and monitor it while it's running. For that, look at proc_open() and proc_get_status().
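A minimal sketch of that approach (the log paths are assumptions; note the process handle only lives for the duration of the request, so for cross-request checks you would persist the PID somewhere):
<?php
// Sketch: launch the script with proc_open() and poll it with
// proc_get_status() instead of blocking in shell_exec().
$descriptors = [
    1 => ['file', '/tmp/readver12.out', 'a'],  // stdout
    2 => ['file', '/tmp/readver12.err', 'a'],  // stderr
];
$process = proc_open('python ../python/readver12.py', $descriptors, $pipes);

$status = proc_get_status($process);
echo $status['running'] ? "running, pid {$status['pid']}" : 'already exited';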
How would you set up a Windows scheduled task to open a webpage, post login information, and then run the URL?
Background:
The CRM has cron jobs that were set up for Linux only. It has a manager where I can also run the jobs manually. I want to run the web URL that does these jobs manually from the Windows server, but it requires logging in with a specific user each time it connects.
How would I set up a scheduled task on Windows Server that:
1. Opens and logs into the page, then runs the URL for the manual job.
2. Runs every minute.
So essentially it needs to look like this:
http://thewebsitename.com/?username=someuser&password=apass
http://thewebsitename.com/theurltorunjobmanually.php
Can scheduled tasks run a PHP command instead as well? For example, if I set up a wget script, could the scheduler run that PHP script? I have not been able to figure out how to do this; Linux seems to make this scenario pretty easy.
This could be as simple as:
Download wget for Windows
Create a batch file with the following contents:
wget --save-cookies cookies.txt --keep-session-cookies --post-data "username=someuser&password=apass" http://thewebsitename.com/
wget --load-cookies cookies.txt http://thewebsitename.com/theurltorunjobmanually.php
(The cookie options carry the login session from the first request into the second; without them, each wget call starts a fresh session.)
You also asked about running a PHP script via the scheduled task; for that, you can add this line to the batch script:
C:\path\to\PHP.exe script.php
Not sure if you're looking for methodology or an actual solution, but we have a process somewhat like this, where we need to log in to our CRM and run an upload and task-creation process at regular intervals. It used to be manual, but now we use an automation software product, Foxtrot. You can find it here for whatever it is worth: http://www.enablesoft.com/foxtrot-professional/
You can put the cURL or wget commands in a batch file or PowerShell script and have the Windows Task Scheduler call it.
I need to use an Apache handler to run a PHP script, rather than running it through the CLI. I'm using the APC user cache, which stores variables in the Apache process. If I run my PHP script through the CLI, it won't have access to the APC variables.
A possible solution is creating a directory restricted to localhost and putting my scripts in there. Then, I can use a browser to run the PHP scripts. However, I'm not too experienced with Linux and I don't know how to implement this. Here's how I need it to work:
One of the cron jobs fires.
The cron job opens the PHP script using a web browser.
After the PHP script is finished processing, the web browser closes.
I don't know how to close the browser once the task is finished. Also, multiple PHP scripts will be running simultaneously (called by different cron jobs), and I'm not sure how this will work. I'm using the Lynx browser on CentOS.
On Debian/Ubuntu I can run a script using Lynx, say:
/usr/bin/lynx -source 'url'
For example:
/usr/bin/lynx -source http://google.com
Once execution is completed, the browser quits by default.
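So a crontab entry along these lines would cover your case, fetching the script every minute and discarding the output (the localhost-restricted path and the schedule are assumptions):
* * * * * /usr/bin/lynx -source http://localhost/restricted/myscript.php > /dev/null 2>&1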
I wanted to use my local server, which runs Windows 7, to take advantage of the Task Scheduler to set up some cron jobs for some of my PHP files.
I can do this currently by:
start http://theurl
Which opens the URL in my default browser. However, I was hoping to accomplish this without opening a browser, so that when I come back to my computer after a few days I don't have millions of Chrome windows open.
How can I load a URL from Task Scheduler via cmd without opening a browser client?
I was able to accomplish the cron job by using a program called wget. I set up Task Scheduler to run wget.exe at my specified time with these arguments:
wget -q -O - http://theurl.com > tmp.txt
This loads the website and stores the output in a temporary text file, which is overwritten the next time the task runs.
If you just want to run some PHP files, you don't need a browser. You can just run them from the command line:
php -f /path/to/php/file.php
However, if you really need to access a page, you can do several things, like using file_get_contents() or making a cURL request from PHP.
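For instance, a minimal cURL sketch (the URL is a placeholder):
<?php
// Fetch the page without a browser; CURLOPT_RETURNTRANSFER makes
// curl_exec() return the body instead of printing it.
$ch = curl_init('http://theurl');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);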
You don't need cmd or shell access. If your host has the HTTP wrapper enabled, a call to file_get_contents() is all you need:
file_get_contents('http://theurl');
You can also use fopen() if you're not interested in the response from the server.