For an iOS Push Notification server, I am implementing a web service that checks a feed on the net for a particular price.
I therefore need my PHP script to keep polling the feed (every 20 seconds or so) and check the price.
I was wondering (forgive my ignorance, I just started with PHP today): is a cron job the way people usually do this? Or is there some special way to fire a PHP script that runs until it's killed and repeats a task?
Thanks!
John
If PHP was your preferred route, a simple script such as the following can be set to run indefinitely in the background (name this grabber.php):
#!/usr/bin/php
<?php
do {
    // Grab the data from your URL
    $data = file_get_contents("http://www.example.com/data.source");
    // Write the data out somewhere so your push notifications script can read it
    file_put_contents("/path/to/shared/data.store", $data);
    // Wait and do it all over again
    sleep(20);
} while (true);
And to start it (assuming you're on a unixy OS):
$ chmod u+x grabber.php
$ ./grabber.php > /path/to/a/file/logging/script/output.log 2>&1 &
That & at the end sends the process to run in the background.
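If you also want grabber.php to keep running after you log out of the shell that started it, you can prefix the same command with nohup (assuming nohup is available, as it is on most unixy systems) so the process ignores the hangup signal:
$ nohup ./grabber.php > /path/to/a/file/logging/script/output.log 2>&1 &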
PHP is probably overkill for this, however; perhaps a simple bash script would be better:
#!/bin/bash
# This downloads data and writes to a file ('data-file')
doWork () {
    data=$(curl -L http://www.example.com/data.source)
    echo "$data" > data-file
}

# Start working: loop instead of recursing so the call stack doesn't grow
while true; do
    doWork
    sleep 20
done
$ chmod u+x grabber.sh
$ ./grabber.sh > /path/to/logger.log 2>&1 &
That is possible by setting up a cron job on your server.
Log in to your web hosting control panel (e.g. cPanel), create a new cron job, and add the command for the PHP file that you want to run, e.g. php /home/[your username]/public_html/rss/import_feeds.php. There is a field where you can specify how often (in minutes) you would like the PHP script to run.
Run a PHP file in a cron job using CPanel
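If you would rather edit the crontab directly instead of using the cPanel form, an equivalent entry would look something like this (using the example path above; once per minute is the shortest interval cron supports):
* * * * * php /home/[your username]/public_html/rss/import_feeds.php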
A 2-part question:
I am running a script that executes a second script.
I have it set up this way because I read that if I put a script in the /etc/init.d directory it will run at startup (true or false?).
I have tried adding >> LoopTriggerLogging.log at the end of each line, but nothing comes out in the log file.
So I have a first script as follows
#!/bin/bash
/var/www/Dev/LoopTrigger.sh
exit
This triggers the following script to run
#!/bin/bash
while true; do
    # do some work
    php /var/www/Dev/FindNearestDriverAndSendPush.php
    php /var/www/Dev/ProcessCustomerPayment.php
    php /var/www/Dev/ProcessDriversPayment.php
    # write to LoopTriggerLogging.log
    sleep 2 # sleep and repeat
done
What I would like is to have the commands logged along with any errors. I have tried to read a little on this, but I get lost in the answers and what they are trying to tell the user. I am still new at this and learning, so kindly give a definition for any commands or options you use. I am open to a best-practice scenario.
Also, will putting the script in the /etc/init.d directory tell it to run at startup?
Is there a way to run this script without it taking over the command line, since it's an endless script?
My ultimate goal is to get the 3 php files to execute every 2 seconds with some sort of logging.
I did some reading on cron, but it seems it is not meant for this type of use case.
I've also seen this:
exec > logfile 2>&1 (don't know what this does)
set -x makes bash print every command before executing it
FOO=BAR (don't know what this means)
echo $FOO (don't know what this means)
if I put a script in the /etc/init.d directory it will run at startup
(true or false?)
True, with one caveat: on most distributions the script also has to be registered with the init system (for example with update-rc.d on Debian/Ubuntu or chkconfig on RHEL/CentOS); once that is done it will run at every startup.
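As a rough sketch of what registering it looks like on a Debian/Ubuntu-style system (the name looptrigger is just an example, and a proper init script should also carry an LSB header and handle start/stop arguments):
sudo cp /var/www/Dev/LoopTrigger.sh /etc/init.d/looptrigger
sudo chmod +x /etc/init.d/looptrigger
sudo update-rc.d looptrigger defaults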
My ultimate goal is to get the 3 php files to execute every 2 seconds
You are using the correct way of running it approximately every 2 seconds (depending on how long your PHP scripts take to run). Crontab runs at a minimum interval of one minute, so that would not be an option.
I have tried adding >> LoopTriggerLogging.log at the end of each line
but nothing comes out in the log file
You can use /var/www/Dev/LoopTrigger.sh >> LoopTriggerLogging.log in your first script so that whenever it runs it will:
1. Create the file the first time and append to it on every run after that.
2. Push all the output of the second script into that file.
Note: as logs keep being appended to this single file it will become very large at some point, so make sure you handle that.
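Putting that together, the first script would look something like this (the log path is just an example; 2>&1 makes error output land in the same log):
#!/bin/bash
# Run the loop script and append both its output and its errors to the log
/var/www/Dev/LoopTrigger.sh >> /var/www/Dev/LoopTriggerLogging.log 2>&1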
I am facing a problem that I somehow don't see a solution to. I have an XML file that needs to be imported into a custom DB structure; when the user uploads / imports the file, the AJAX POST waits until the file import is finished, but this could take 5 hours or more, I don't know. What is the best way to handle this UI issue?
I was thinking about a threaded upload, splitting the file into multiple parts and uploading each with its own thread (pthreads, but I'm having problems with installation on CentOS 7 / PHP 7).
Or is there any other way I could import the file in the background, so that whenever the user refreshes the page there is a status log output and the user knows when the import is finished and whether it was successful?
You would want to run it as a background job (a detached process); this way the end user gets a confirmation message right away, and you then send an email when the long-running task is complete, so they don't have to wait for it to finish. As I mentioned in the comments, I have a class I wrote for this on my GitHub:
https://github.com/ArtisticPhoenix/MISC/blob/master/BgProcess.php
But it passes the args as a path because it's set up for CodeIgniter, so you would have to change that or split the arguments up within your code.
Anyway, the basics are similar to running a cron job. This varies in implementation depending on the OS of the server, but on Linux the command looks like this:
php -f "path/to/phpfile.php" "{args}" > /dev/null &
The > /dev/null part sends the output to null (throws it away), and the & runs it as a non-blocking process, meaning the script that starts the command can continue on. So, using an example like this:
.. other code before starting background job ..
exec( 'php -f "path/to/phpfile/xmlProcessor.php" "testXML/2" > /dev/null &');
.. code to tell user job is started .. this runs right after the call without waiting for that process to finish.
Then in xmlProcessor.php you would have this
<?php
$args = explode('/', $argv[1]);
$file = $args[0];
$user_id = $args[1];
// ... code to process xml
// ... email user confirmation of completion
http://php.net/manual/en/reserved.variables.argv.php
As I said, typically you would call it this way,
exec( 'php -f "path/to/phpfile/xmlProcessor.php" "testXML" "2" > /dev/null &');
And access them using
$argv[1] // = testXML
$argv[2] // = 2
But because I use this with CI, it does its routing for me to a special controller and handles all that. The nice thing about my class is that it should find the PHP executable in most cases, and it has Windows compatibility built in (which was a pain in the ...)
Using that class you would just call it like this
$command = new BgProcess( "path/to/phpfile/xmlProcessor.php", "testXML", 2);
echo $command;
Would output 'php -f "path/to/phpfile/xmlProcessor.php" "testXML/2" > /dev/null &' after starting the process (the return value is just for debugging).
Basically you're running a separate background job with PHP via the command line.
I have a simple question; I searched and couldn't find a solution.
I have a simple shell script that runs a small PHP script every 2 seconds. I wrote it and saved it as a file:
$ cat every-2-seconds.sh
#!/bin/bash
while true
do
    php /home/account/domains/domain.co.il/public_html/my-php-script.php
    sleep 2
done
Now, I need this script to always run in the background, but I also need it to run at startup, just like a service: it should always run in the background, and I never want to have to start it manually (of course, if something happens and it stops, I should be able to start it manually).
I heard about nohup, but it's not a service, right? And I can't start it on startup... :(
Can you help me with this?
You can make your script run with these commands (assuming you are in the directory with your script):
chmod +x every-2-seconds.sh
nohup ./every-2-seconds.sh &
The & will run this as a background task and nohup will keep the process running even after you've disconnected from your session.
To handle starting it on reboot you need to add this command to your crontab
crontab -e
@reboot /path/to/every-2-seconds.sh > /dev/null
In the crontab you need to specify the full path. You can change /dev/null to the file you want output to go to (assuming you want the output)
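For example, to append everything the script prints (including errors) to a log file instead of discarding it, the entry could look like this (the log path is just an illustration):
@reboot /path/to/every-2-seconds.sh >> /path/to/every-2-seconds.log 2>&1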
Currently I have a parser.php which loads an XML file and inserts new data from the XML file into a MySQL database. How would I go about re-running this PHP file every 30 seconds so my MySQL table always has fresh data? I think I could use short polling to do this, but I'm guessing this is not the most efficient of options.
Thanks in advance
This is a non-PHP solution which will require you to have shell (SSH) access in order to run the script; however, you can also run it through PHP with exec() if you want to. Shared hosting environments might present a challenge for this approach, but as long as you can execute scripts under your user credentials you should have no problems running it.
First you will need to create a bash script with the following content and save it (I'll use the name parser.sh for the purpose of this example). You can then adjust the timeout in the sleep 30 line if you want to.
#!/bin/sh
while true
do
    php parser.php
    sleep 30
done
In order to run the script you'll need to give it execute permissions.
chmod +x parser.sh
Now you can use the nohup command with the ampersand (&) to ensure that the script will keep running in the background even when a termination signal is sent after, let's say, closing the shell (SSH) session. The ampersand is important!
nohup ./parser.sh &
Now you can use top or ps aux | grep parser to ensure that the script is running. As I already said before you can also use PHP exec() to start the process but shell is still the preferred and most reliable way to do this.
If you want to stop the background process which executes your script, you'll simply have to kill it. Just use ps aux | grep parser to find the PID of the parser process (it's in the second column from the left) and use it with the kill command.
kill 4183
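If pkill is available on your system, you can skip looking up the PID and match the process by its command line instead (assuming parser.sh is a unique enough name):
pkill -f parser.sh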
You could use a cron job, but cron jobs run every 1 minute or more.
Another way is to make a "daemon".
Very basic example:
<?php
$last = 0;
while (true) {
    // check if 30 seconds have passed since the last run...
    if (time() - $last >= 30) {
        // ...then execute some function, e.g. run your parser
        $last = time();
    }
    sleep(1); // nap briefly so the loop doesn't burn CPU
}
Then you need to execute this in your terminal:
$ php script.php &
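If closing the terminal kills it, the nohup form shown in the other answer should work here as well (script.php and script.log are just example names):
$ nohup php script.php > script.log 2>&1 &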
This link should help.
Greetings!
I want to initiate one PHP page as a background process from another PHP page.
Use popen():
$command = 'php somefile.php > /dev/null &';
pclose(popen($command, 'r'));
This launches somefile.php as a background process; the & keeps pclose() from waiting for the script to finish, and the redirect stops it from writing to a pipe that no longer has a reader.
This is a technique I used to get around restrictions applied by my web host (who limited cron jobs to 15 minutes of execution time, so my backup scripts would always time out).
exec( 'php somefile.php > /dev/null &' );
The breakdown of this line is:
exec() - PHP reference Runs the specified command, as if from the Linux Command Line.
php somefile.php: Invokes PHP to open, and run, somefile.php. This is the same behaviour as what would happen if that file was accessed through a web browser.
> ("redirect") - Sends the output of the preceding command to a specified target. In this instance, it redirects the content which would normally be read by the web browser accessing the file.
/dev/null - A blackhole. No, not kidding. It is a place where you send output if you just want it to disappear.
& - Appending this character to the end of a Linux command means "Do not wait - Send this to the background and continue."
So, in summary, the provided code will execute a PHP script, return no output, and not wait for it to finish before continuing onto the next line.
(And, as always, if any of these assumptions on my part are in error, I would love to be corrected by more knowledgeable members of the community.)
You have to make sure that the background process is not terminated when the processing of the page finishes. If you are on a Linux system, you could try using the nohup command:
$command = 'nohup php somefile.php > /dev/null &';
pclose(popen($command,'r'));
If it still gets terminated, you could try the "daemon" command.