This is the first time I've ever used cron.
I'm using it to parse external data that is automatically FTP'd to a subdirectory on our site.
I have created a controller and model which handles the data. I can access the URL fine in my browser and it works (however I will be restricting this soon).
My problem is, how can I test if it's working?
I've added this to my controller for a quick and dirty log:
$file = 'test.txt';
$contents = '';
if (file_exists($file)) {
    $contents = file_get_contents($file);
}
$contents .= date('m-d-Y') . ' --- ' . PHP_SAPI . "\n\n";
file_put_contents($file, $contents);
But so far I've only logged requests from myself in the browser, despite the cron job running every minute.
03-18-2010 --- cgi-fcgi
03-18-2010 --- cgi-fcgi
I've set it up using cPanel with the command
index.php properties/update/
the 2nd portion is what I use to access the page in my browser.
So how can I test this is working properly, and have I stuffed anything up?
Note: I'm using Kohana 3.
Many thanks
You're not using the correct command for calling Kohana.
Make sure you're using the full path to index.php so you can eliminate any path errors. Here are the switches available for use in Kohana:
--uri: Self-explanatory
--method: HTTP Request method (POST, GET, etc ...) (Overrides Kohana::$method)
--get: Formatted GET data
--post: Formatted POST data
You should be using something like this:
php /path/to/kohana/directory/index.php --uri=properties/update/
I can't remember whether the value needs double quotes around it; don't forget to try that if it doesn't work.
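Putting the pieces together, the command field in cPanel would then look something like this (the PHP binary path and account paths are assumptions; adjust them to your host):

```shell
# Full path to PHP and to index.php, plus the --uri switch.
# Logging output to a file lets you verify the job actually fires every minute.
/usr/bin/php /home/youruser/public_html/index.php --uri=properties/update/ >> /home/youruser/cron.log 2>&1
```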
You probably aren't running cron with root permissions on that file.
Put MAILTO="youremail@yourdomain.tld" at the start of the crontab to have it email you errors.
If you don't have shell access to the crontab (i.e. SSH), I don't know whether you can do this in cPanel.
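One more gotcha worth checking in the quick-and-dirty log above: under cron the working directory is usually not your document root, so a relative 'test.txt' gets written (or fails) somewhere else entirely. A sketch of a more cron-proof version:

```php
<?php
// __DIR__ is this file's directory, independent of cron's working directory,
// so browser and cron runs hit the same log file.
$file = __DIR__ . '/test.txt';

// FILE_APPEND avoids the read-then-rewrite dance in the original snippet.
file_put_contents($file, date('m-d-Y H:i:s') . ' --- ' . PHP_SAPI . "\n\n", FILE_APPEND);
```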
Background
Hi,
I am new to cron jobs and I am trying to set up a test one just to see how they work. I created it in cPanel like this:
wget -O - -q https://website.com/staging/wp-content/themes/AT103/test-cron.php
My file is pretty simple so far:
<?php
$email_to = "info@domain.com";
$title = "Test title";
$body = "Test body";
mail($email_to, $title, $body);
?>
All works fine here; I receive the email every interval that my cron job runs.
What I want
On my site people can put up ads/listings to sell their stuff. In the cron job I want to go through all listings and email the buyers and sellers for the ones that have expired. As a first step I need to access the variables on my website, e.g. a listing object which is put in a variable like $post, or through a function which returns the stuff I want if that's easier.
What I tried
I tried to access a random file and its functions by using lots of different code examples found online. The file is in the same folder as my test-cron.php. Here are a few things I tried to put at the top of test-cron.php (one at a time, not all at once):
require '/functions.php';
require 'https://website.com/staging/wp-content/themes/AT103/functions.php';
require '/home3/username/public_html/staging/wp-content/themes/AT103/functions.php';
This all resulted in the same thing: I no longer got an email. So I assume there is some sort of error in those lines? I also tried require_once() with the same result.
Questions
How do I access other scripts ("live" variables or functions) in my folder hierarchy through a cron job?
If it is not possible for some reason, can I instead access my database information somehow?
If the file to be required/included is in the same folder as the running script, you would use one of the following:
require 'functions.php'; no leading slash tells it to look in the include path, then in the current directory.
require './functions.php'; (better) the ./ explicitly says look in the current directory.
https://www.php.net/manual/en/ini.core.php#ini.include-path
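A note on why the attempts in the question failed: a URL can't be require'd unless allow_url_include is enabled (and it shouldn't be), and a leading slash means the filesystem root, not the site root. The most robust form anchors the path to the current file, something like this sketch (it assumes functions.php sits next to test-cron.php):

```php
<?php
// __DIR__ is the directory of the current file, so this resolves correctly
// no matter which working directory cron starts the process in.
$path = __DIR__ . '/functions.php';
if (file_exists($path)) {
    require $path;
} else {
    // Fail loudly: under cron, anything echoed here is typically emailed to you.
    echo "Missing include: $path\n";
}
```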
EDIT:
I just realized I did not address the fact that you are using cron, and that's because ...
You are still running PHP; cron job or not makes no difference, it still works the same!
However, it can be more difficult to debug on a production server. If you want to see exactly what's happening when the script fails then you can wrap it in a try block, catch and send the error to your email, or output the error and view it in the browser.
I know on Bluehost shared-hosting, if any of my cronjobs produce any output it will be automatically sent to me via email. I use the format below, and always get an email telling me when & why it happened. While developing you can simply navigate to your test-cron.php in the browser.
<?php
try {
    require './functions.php';
    /*
    all of your logic
    */
} catch (Throwable $e) {
    // Throwable (PHP 7+) catches both Exception and Error,
    // so fatal errors in the included file are reported too.
    echo "Caught Error: \n" . $e;
}
Sorry for the long text.
I am looking for an idea to trigger execution of a PHP function (in a different directory; also tested in the same directory) when a specific event occurs.
The system is Debian and the PHP framework is Yii.
Via an API we receive new data from a remote server, and for a specific value in that data I need to start a function in a separate process. There is no need to wait for its completion.
Because of the API's response time I can't integrate this function into it; integrating it slows the API down to the point of being unusable.
I have read dozens of answers on Stack Overflow and many other examples.
As a test I just tried to create a new folder at a specific location, and the folder wasn't created (a permission problem, I think, but I can't confirm, since it works when done from the main code - the API).
It is supposed to do the following:
pass 2 arguments received from the API to the function
create the folder if it doesn't exist
call an internal class that uses FPDF to generate a PDF file
save the document and mail it with PHPMailer
I can't use pcntl_fork because it requires an additional installation (which I am not allowed to do).
Covered topics:
forking with pcntl_fork (execution stops on reaching it.)
popen/pclose, exec, proc_open/proc_close (no reply and can't confirm it actually entered function).
Of course this situation rules out the use of include and require.
I can't release the API code, but here is what I was asking it to do:
background.php
$docs_dir = 'test_folder';
if (!file_exists('/var/www/created_documents/' . $docs_dir)) {
    mkdir('/var/www/created_documents/' . $docs_dir, 0777, true);
    chmod('/var/www/created_documents/' . $docs_dir, 0777);
}
And it does nothing.
A few examples of what I used in the API code to jump to it (many others were deleted along the way):
$proc = popen('php ' . __DIR__ . '/background.php &', 'r');
return $proc;
while (!feof($proc))
{
$data = fgets($proc);
var_dump($data);
}
exec("php background.php");
$cmd = "php background.php";
$timer = popen("start ". $cmd, "r");
pclose($timer);
exec($cmd . " > /dev/null &");
pclose(popen("php background.php &","r"));
You could make a separate internal HTTP request using cURL and its async functionality.
Or you may use a queuing mechanism where one part is firing an event and the other part is consuming it. RabbitMQ or Gearman could do this for you.
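For the fire-and-forget case specifically, the usual shape on Debian is to redirect all output and background the process; without the redirects, exec() and popen() keep the API request waiting on the child. A sketch (the script path and argument values are placeholders):

```php
<?php
// escapeshellarg() protects against shell injection in the two API values.
$arg1 = escapeshellarg('value-from-api-1');
$arg2 = escapeshellarg('value-from-api-2');

// "> /dev/null 2>&1 &" detaches the worker: stdout and stderr are discarded
// and the trailing & backgrounds it, so exec() returns immediately.
exec("php /var/www/app/background.php $arg1 $arg2 > /dev/null 2>&1 &");
```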
So I'm working with PHP to attempt to execute a script I have in the same directory. The script runs fine locally and the permissions for the http-data user are set to be able to execute the script referenced in this block of PHP
// system() returns the last line of the command's output
$last_line = system('th neural_style.lua -style_image ~/' . $style . '.jpg'
    . ' -content_image ' . $content_image
    . ' -gpu 0 -backend cudnn -save_iter 1 -cudnn_autotune'
    . ' -output_image /var/www/html/processed/' . $email . '/out.png 2>&1', $retval);
echo '
</pre>
<hr />Recent output: ' . $last_line . '
<hr />Return value: ' . $retval;
From what I understand, the script should execute fine using system() (I know the variables look messy), but this is the error I get from PHP:
sh: th: command not found
I set my default interpreter to bash instead of dash, thinking that might be the issue; no dice. Torch is in the same directory and, like I said, runs fine under my login.
I know what I'm trying to do is, in a way, sacrilege. If there is a better way to run a script that takes roughly 8 minutes to complete using some user input from the web, I want to know; this is just what came naturally to me. I'm looking to notify the user by email when the process is complete anyway, so any way of executing it is just dandy.
Edit: any mention of "http-data" was supposed to say "www-data".
Change the default shell for your http-data user to bash or dash. It is currently using sh.
Check what your $PATH variable is inside the PHP environment.
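"th: command not found" almost always means the PATH seen by the web server user differs from your login shell's. A quick way to check, and the usual fix of calling the binary by its absolute path (the Torch location below is an assumption; run `which th` in the shell where it works to find yours):

```php
<?php
// Print the PATH the web server's PHP actually uses; it is typically much
// shorter than your interactive shell's.
echo shell_exec('echo $PATH');

// Fix: call the binary by absolute path instead of relying on PATH.
// /home/user/torch/install/bin/th is a placeholder location.
$th = '/home/user/torch/install/bin/th';
$cmd = escapeshellcmd($th . ' neural_style.lua');
```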
I have a script that inserts into the database e.g. 20,000 users with email addresses, in batches of 1000
(so two tables, emailParent and emailChild; there are 1000 rows in emailChild for every row in emailParent).
I want to run a script that sends these emails, which basically says:
//check_for_pending_parent_row() returns the id of the first pending row found, or 0
while ($parentId = check_for_pending_parent_row()) { /* loop over children of parent row */ }
Now, because this is talking to the SendGrid servers, this can take some time.
So I want to be able to hit a page and have that page launch a background process which sends the emails to SendGrid.
I thought I could use exec(), but then I realized I am using CodeIgniter, which means the entry point must be index.php; hence, I don't think exec() will work.
How can I launch a background process that uses CodeIgniter?
This is not really an answer, just something too long to post as a comment.
@Frank Farmer: 70 lines seems a bit excessive; this example from a simple test does it in pretty much half that. What is the difference?
<?php
//---------------------------
// define required constants
//---------------------------
define('ROOT', dirname(__FILE__) . '/');
define('APPLICATION', ROOT . 'application/');
define('APPINDEX', ROOT . 'index.php');

//---------------------------
// check if required paths are valid
//---------------------------
$global_array = array(
    "ROOT"        => ROOT,
    "APPLICATION" => APPLICATION,
    "APPINDEX"    => APPINDEX);
foreach ($global_array as $global_name => $dir_check) {
    if (!file_exists($dir_check)) {
        echo "Cannot find " . $global_name . " file / directory: " . $dir_check;
        exit;
    }
}

//---------------------------
// load in CodeIgniter
//---------------------------
// Capture CodeIgniter output, discard it, and load the system into $ci
ob_start();
include (APPINDEX);
$ci = &get_instance();
ob_end_clean();

//do stuff here
Use exec() to run a vanilla CLI PHP script that calls the page via cURL.
See http://php.net/manual/en/book.curl.php for info on cURL.
This is what I have had to do with some of my codeigniter applications
(Also make sure you set the timeout to 0.)
And doing it this way, you are still able to debug it in the browser
Petah suggested cURL, but since 2.0, CodeIgniter permits calls to your controllers through the CLI.
This should be easier than cURL.
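As a command line it looks like this, where `tools` and `message` stand in for your own controller and method names (placeholders), per the CodeIgniter CLI documentation:

```shell
# Invokes application/controllers/tools.php::message() without going
# through HTTP; suitable for a cron entry or a backgrounded exec().
php /path/to/index.php tools message
```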
I want to download a large number of files to my server. I have a list of different files to download and locations to put them. This is all not a problem; I use wget to download the file, executed with shell_exec:
$command = 'wget -b -O ' . $filenameandpathtoput . ' ' . $submission['url'];
shell_exec($command);
This works great; the server starts all the threads and the files are downloaded in no time.
The problem is, I want to notify the user when the files are downloaded, and this does not work with my current way of doing things. So how would you implement this?
Any Suggestions would be helpful!
I guess that you are able to check whether all files are in place with something like
function checkFiles()
{
    foreach ($_SESSION["targetpaths"] as $p) {
        if (!is_file($p)) return false;
    }
    return true;
}
Now all you have to do is call a script on your server that runs this function every second (or so). You can accomplish this either with a meta refresh (forcing the browser to reload the page after n seconds) or with AJAX (have a look at jQuery's .getJSON, for example).
If the script is called and the files are not all downloaded yet, print something like "Please wait" and refresh again later. Otherwise, show the success message. That's all.
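The AJAX variant could be as small as an endpoint that reports the result of checkFiles() as JSON; this sketch inlines a stand-in for that function, and the file name and session key are assumptions:

```php
<?php
// check_status.php - polled by the browser with jQuery's .getJSON every
// few seconds until {"done":true} comes back.
session_start();

// Stand-in for checkFiles() from the answer above; it consults the list
// of expected target paths stored in the session.
function checkFiles()
{
    foreach ($_SESSION["targetpaths"] ?? array() as $p) {
        if (!is_file($p)) return false;
    }
    return true;
}

header('Content-Type: application/json');
echo json_encode(array('done' => checkFiles()));
```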
You can consider using exec() to run the external wget command. Your PHP script will block until the external command completes. Once it completes you can echo the name of the completed file.