How to detect whether a file was uploaded by FTP - PHP

I have a client who uploads an XML file to a directory on my server using a third-party script over which I have no control. Upon upload, the file gets processed by a PHP script.
I have a cronjob which checks the directory every hour to see whether a file has been uploaded, but if the upload takes place at, say, 3:01pm it won't be processed until 4:00pm. My host forbids running a cronjob more frequently than once per hour.
Is there a way to detect a file arriving and act on it immediately?

The short answer is yes, it is possible to detect a file upload and act on it immediately. However, this is not what I would call a "simple" solution: you would want to set up some sort of daemon so that you can run PHP continually (or use a different language entirely to monitor for the file).
The simple solution is to lower your cron interval as far as your host allows and add some kind of anti-stacking mechanism (such as a lockfile) to ensure your script doesn't execute while a previous run is still going; a minimal lockfile guard is sketched below.
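As a rough illustration of the lockfile idea (all path names here are placeholders, not anything from the question):
#!/bin/bash
# process_if_present.sh -- run from cron; skip this run if a previous one is still working.
LOCK=/tmp/process_upload.lock
# mkdir is atomic: it fails if the lock directory already exists.
if ! mkdir "$LOCK" 2>/dev/null; then
    exit 0   # a previous run is still busy
fi
trap 'rmdir "$LOCK"' EXIT   # release the lock however the script exits
# Process any XML files that have arrived.
for f in /path/to/upload_dir/*.xml; do
    [ -e "$f" ] || continue   # the glob matched nothing
    php /path/to/process_file.php "$f"
done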

Make a file checker.sh and run it in the background with ./checker.sh > /dev/null 2>&1 &, or run it in screen:
#!/bin/bash
COUNTER=0
while [ $COUNTER -lt 99999 ]; do
    if [ -f "/path/to/your/file" ]; then
        # The file is there: run your action here (a script, a program, etc.)
        :   # placeholder; bash requires at least one command in the branch
    fi
    sleep 5
    let COUNTER=COUNTER+1
done

If you are running Linux you could look at the inotifywait command which is part of the inotify-tools package. This can be set up to wait until a file is created in a particular directory as follows:
#!/bin/bash
while true
do
    inotifywait -e create /home/myuser/upload_dir/ && php /home/myuser/scripts/process_file.php
done
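One caveat: the create event fires as soon as the file appears in the directory, which for an FTP upload can be before the transfer has finished, so the PHP script may see a partial file. The close_write event, which fires when the writer closes the file, is usually the safer trigger:
#!/bin/bash
while true
do
    # close_write fires only after the uploader has finished writing and closed the file
    inotifywait -e close_write /home/myuser/upload_dir/ && php /home/myuser/scripts/process_file.php
done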

Related

Logging script commands and errors in Linux

This is a two-part question.
I am running a script that executes a second script. I have it set up this way because I read that if I put a script in the /etc/init.d directory it will run at startup (true or false?).
I have tried adding >> LoopTriggerLogging.log at the end of each line, but nothing comes out in the log file.
So I have a first script as follows
#!/bin/bash
/var/www/Dev/LoopTrigger.sh
exit
This triggers the following script to run
#!/bin/bash
while true; do
    # do some work
    php /var/www/Dev/FindNearestDriverAndSendPush.php
    php /var/www/Dev/ProcessCustomerPayment.php
    php /var/www/Dev/ProcessDriversPayment.php
    # write to LoopTriggerLogging.log
    sleep 2 # sleep and repeat
done
What I would like is to have the commands logged along with any errors. I have tried to read up on this but get lost in the answers. I am still new at this and learning, so kindly give a definition for any commands or options. I am open to a best-practice scenario.
Also, will putting the script in the /etc/init.d directory tell it to run at startup?
Is there a way to run this script without it taking over the command line, since it's an endless script?
My ultimate goal is to get the 3 php files to execute every 2 seconds with some sort of logging.
I did some reading on cron, but it seems it is not meant for this type of use case.
I've also seen this:
exec > logfile 2>&1 (don't know what this does)
set -x makes bash print every command before executing it
FOO=BAR (don't know what this means)
echo $FOO (don't know what this means)
if I put a script in the /etc/init.d directory it will run at startup (true or false)
Broadly true, with one caveat: placing a script in /etc/init.d makes it available to the init system, but on most distributions it must also be registered (for example with update-rc.d on Debian/Ubuntu or chkconfig on RHEL/CentOS) before it actually runs at every startup.
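For example, on a Debian-style system (looptrigger here is a hypothetical name for the script from the question):
sudo cp LoopTrigger.sh /etc/init.d/looptrigger
sudo chmod +x /etc/init.d/looptrigger
sudo update-rc.d looptrigger defaults   # register it for the default runlevels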
My ultimate goal is to get the 3 php files to execute every 2 seconds
You are using roughly the right approach for running it approximately every 2 seconds (the exact interval depends on how long your php scripts take to run). Crontab runs at a minimum interval of one minute, so it would not be an option here.
I have tried adding >> LoopTriggerLogging.log at the end of each line but nothing comes out in the log file
You can use /var/www/Dev/LoopTrigger.sh >> LoopTriggerLogging.log in your first script. Whenever it runs, this will:
- Create the file the first time and append to it from then on.
- Capture all the logs of the second script into the file.
Note: as logs keep appending to this single file, it will become very large at some point, so make sure you handle that (with log rotation, for example).
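As for logging the commands themselves along with their errors, the two lines quoted in the question do exactly that and can sit at the top of the second script. A minimal sketch (file paths taken from the question):
#!/bin/bash
exec >> /var/www/Dev/LoopTriggerLogging.log 2>&1   # send stdout AND stderr of everything below into the log
set -x                                             # print each command before executing it
while true; do
    php /var/www/Dev/FindNearestDriverAndSendPush.php
    php /var/www/Dev/ProcessCustomerPayment.php
    php /var/www/Dev/ProcessDriversPayment.php
    sleep 2
done
(The FOO=BAR and echo $FOO lines from the question are simply a shell variable assignment and a command that prints that variable; they are unrelated to logging.)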

Running shell_exec() in background AND getting its output afterwards

First of all, I would like to point out that this is NOT a duplicate of this or this and other related questions. I am asking how to get the output as well, and my question is mostly about that.
I need to run a bash script from PHP and get its output in PHP. However, doing:
echo shell_exec('script');
times out after 60 seconds due to Apache's FcgidIOTimeout. If I use the following to run it in the background, I won't be able to get the output:
shell_exec('<myscript> > /dev/null 2>/dev/null &');
I cannot:
- Modify the bash script;
- Put the code and functionality anywhere but a single file (unless I use a temp file);
- Use a database, which would be overkill here.
If I were to write the output of shell_exec() to a temp file, I would also need to find a way to verify whether the process has finished.
My current and only idea is to create a tmp file using PHP and write the output to it, then use top -c | grep <myscript> (with a refresh button): if it returns a match, the script is still running. However, I suspect that would not be practical or efficient most of the time.
It is important that I use a temp file rather than creating a "permanent" file to write to.
Solution for a similar problem: A few years ago I had a similar problem. I had to upload a file to an FTP server and wondered how to signal to the FTP server that the file had been uploaded completely, so that the server could perform some tasks on it. My solution was to rename the file to something like *.completed once the upload had finished. The process on the FTP server could then look for *.completed files only.
Solution adjusted to your problem: I'd suggest renaming your temp file after it has been generated by your bash script. This way you can independently find out whether the script finished successfully. shell_exec() could look like this:
shell_exec('<myscript> > tmp-file && mv tmp-file tmp-file.completed &');
Be aware that this only redirects the STDOUT channel into the tmp-file. If you want STDERR captured in the tmp-file as well, use the portable 2>&1 form (shell_exec() runs its command via /bin/sh, where bash's &> shortcut is not guaranteed to work):
shell_exec('<myscript> > tmp-file 2>&1 && mv tmp-file tmp-file.completed &');

Write LogFile with date on running service

I've created a service with ReactPHP which runs and does some stuff. It is started as a daemon, so all output should be logged to a file.
This log file should be named 'foo-log-file-$(date "+%F")'; I want a single log file for each day.
Problem:
As mentioned the script runs as a service, without stopping. The starting call for the script is therefore only done once.
php my_script.php >> /var/log/bar/log-file--$(date "+%F") 2>&1
So everything the script prints to the console is saved into the file, but the date string in the file name is fixed at the moment the service was started and is never updated with a new date.
Question:
Is it possible to solve this without writing the log logic into the PHP script? Can I handle this requirement with bash?
FYI
The answer from @fedorqui was a good approach; I solved it with a cronjob which copies the file to a dated one and empties the original.
You cannot use mv, because the running service keeps the file open the whole time, so you get the error:
cannot move 'foo.log' to 'bar.log': Text file busy
So I cp it and clear the old one with:
cp foo.log foo.log.$(date +"%F");
cp /dev/null foo.log;
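For reference, the cronjob itself can be a single crontab entry (paths are illustrative, combining the question's /var/log/bar/ directory with the foo.log name above; note that % must be escaped as \% inside crontab):
# rotate the service log at 00:00 every day
0 0 * * * cp /var/log/bar/foo.log /var/log/bar/foo.log.$(date +"\%F") && cp /dev/null /var/log/bar/foo.log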

Waiting for sourced bash script to finish before running it again

I have 2 bash scripts..
The first one (begin.sh) receives variables from an external PHP script via SSH2 (not visible in this script, as they are dynamic), but they look something like this:
$1 = myfile.mp3
$2 = artwork.jpg
$3 = my title - my artist
Here is the first script (begin.sh):
#!/bin/bash
. ./process.sh
And here is the second (process.sh):
#!/bin/bash
wget -O /root/incoming/shows/$1 http://remoteserver.com/files/$1;
exec lame --decode /root/incoming/shows/$1 - | /root/incoming/stereo_tool_cmd_64 - - -s /usr/incoming/settings/settings.sts | lame -b 128 - /root/incoming/processing/$1;
wait
mv /root/incoming/processing/$1 /var/www/html/processed/$1;
#Send Email when process is complete
recipients="somebody#somewhere.com"
subject="Podcast Submission"
from="somebodyelse#somewhere.com"
importLink="http://thisserveraddress.com/processed/$1"
artwork="http://anotherserver.com/podcast-art/$2"
message_txt=$(echo -e "A new podcast has been submitted.\n\nTitle : $3\n\nImport : $importLink")
/usr/sbin/sendmail "$recipients" << EOF
subject:$subject
from:$from
$message_txt
EOF
The process in the script above is time-consuming (it takes about 8 minutes to complete) and very processor-intensive (it uses about 50% CPU), so I only want to run one of these processes at a time. The trouble is that the entire process can be executed remotely at any time by multiple users, so I need a way to run these jobs serially, in the order they come in.
I thought sourcing the process script would effectively queue the jobs, but it doesn't. If the script is executed again while it's already running, nothing happens.
Any suggestions?
Further explanation of what the process.sh script is doing, for clarity:
First the host downloads the mp3 file from remoteserver.com
Then it takes the downloaded mp3 file and uses lame to decode it to wav, then another app performs a whole bunch of audio processing on the file after which it re-encodes it back to mp3.
When that's done it moves the new mp3 file to a publicly accessible folder.
Once that's done it sends an email to inform that all this has taken place, and outlines the various links where everything can be downloaded from.
The lock principle could work as follows:
When your script starts, the first thing it does is create an empty script.lock file in its working folder,
and when it finishes it deletes the script.lock file.
EDIT: Even better, you can create a script.lock DIRECTORY with mkdir, as suggested by Dror Cohen in his comment; mkdir either succeeds or fails atomically, so there is no race between checking and creating.
That's the general idea.
For this to work, the script must in fact only start if no script.lock currently exists. If one does exist, the script instead creates a new file containing the parameters of the call in a /queue/ folder.
So in the end you would have a begin.sh that works like this (see the sketch after this list):
First check whether script.lock exists.
- If it does, write a new file in /queue/ and stop.
- If not, create script.lock and proceed.
At the very end of the script, check whether there are any files in the /queue/ folder.
- If there are none, delete script.lock and stop.
- If there is a file in /queue/, take the oldest one, delete it, and start the script again with the parameters saved in that file.
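A minimal sketch of that flow, assuming a writable ./queue/ directory next to the scripts and parameters that contain no newlines (illustrative only, and not fully race-free under heavy concurrency):
#!/bin/bash
# begin.sh -- run one job at a time; queue the rest.
QUEUE=./queue
LOCK=./script.lock
# mkdir is atomic: it fails if another run already holds the lock.
if ! mkdir "$LOCK" 2>/dev/null; then
    # Busy: save this call's parameters, one per line, and stop.
    printf '%s\n' "$1" "$2" "$3" > "$QUEUE/job-$(date +%s%N)"
    exit 0
fi
. ./process.sh   # do the actual work with $1 $2 $3
rmdir "$LOCK"    # release the lock
# If anything was queued meanwhile, restart with the oldest job's parameters.
next=$(ls "$QUEUE" 2>/dev/null | head -n 1)
if [ -n "$next" ]; then
    mapfile -t args < "$QUEUE/$next"
    rm "$QUEUE/$next"
    exec "$0" "${args[@]}"
fi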

PHP Recurring Operation

For an iOS Push Notification server, I am implementing a web service that checks a feed on the net for a particular price.
Therefore I need my PHP to keep checking the price (every 20 seconds or so) and comparing the values.
I was wondering (forgive my ignorance, I just started with PHP today): is the way people do this a cronjob? Or is there some special way to fire a PHP script that runs until it's killed and repeats a task?
Thanks!
John
If PHP was your preferred route, a simple script such as the following can be set to run indefinitely in the background (name this grabber.php):
#!/usr/bin/php
<?php
do {
    // Grab the data from your URL
    $data = file_get_contents("http://www.example.com/data.source");
    // Write the data out somewhere so your push notifications script can read it
    file_put_contents("/path/to/shared/data.store", $data);
    // Wait and do it all over again
    sleep(20);
} while (true);
And to start it (assuming you're on a unixy OS):
$ chmod u+x grabber.php
$ ./grabber.php > /path/to/a/file/logging/script/output.log 2>&1 &
That & at the end sends the process to run in the background.
PHP is probably overkill for this, however; perhaps a simple bash script would be better:
#!/bin/bash
# This downloads data and writes it to a file ('data-file')
doWork () {
    data=$(curl -L http://www.example.com/data.source)
    echo "$data" > data-file
    sleep 20
}
# Start working; loop rather than recurse, so the call depth stays bounded
while true; do
    doWork
done
$ chmod u+x grabber.sh
$ ./grabber.sh > /path/to/logger.log 2>&1 &
That is possible by setting up a cron job on your server.
Log in to your web hosting control panel (e.g. cPanel), create a new cron job, and add the path to the PHP file that you want to run, e.g. php /home/[your username]/public_html/rss/import_feeds.php. There is a field where you can set how many minutes apart the script should run.
Run a PHP file in a cron job using cPanel
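For reference, the equivalent raw crontab entry (path taken from the answer above) runs at most once per minute, which is cron's smallest interval:
# m h dom mon dow command
* * * * * php /home/[your username]/public_html/rss/import_feeds.php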
