Ok. I'm really stumped on this one.
Basically, I need to call a function for the WordPress plugin W3 Total Cache as part of a cron job in crontab. I'd like to automatically clear the entire page cache nightly.
Here's the code that works fine within WordPress that I need to call:
if (function_exists('w3tc_pgcache_flush')) {
    w3tc_pgcache_flush();
}
I'm currently using the following script:
#!/usr/bin/php
<?php
define('DOING_AJAX', true);
define('WP_USE_THEMES', false);

$_SERVER = array(
    "HTTP_HOST"      => "http://example.com",
    "SERVER_NAME"    => "http://example.com",
    "REQUEST_URI"    => "/",
    "REQUEST_METHOD" => "GET"
);

require_once('/path-to-file/wp-load.php');

wp_mail('email@example.com', 'Automatic email', 'Hello, this is an automatically scheduled email from WordPress.');

if (function_exists('w3tc_pgcache_flush')) {
    w3tc_pgcache_flush();
}
?>
and the command line:
php -q /path-to-file/flushtest.php
I used the wp_mail function to test and make sure I'm getting something.
The script is working fine except that the page cache is never flushed. I get the email and there aren't any errors in the log either.
Any ideas?
Thanks for your help.
The better approach now is to use wp-cli. The latest version (0.9.2.8) is compatible with this plugin. Just run this command from anywhere in your WordPress directory:
wp w3-total-cache flush
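With wp-cli in place, the nightly flush from the original question becomes a one-line crontab entry. A minimal sketch (the install path and the 03:00 schedule are placeholders to adapt):

```shell
# Flush the W3TC page cache every night at 03:00
0 3 * * * cd /var/www/example.com && wp w3-total-cache flush > /dev/null 2>&1
```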
Change the order a bit and try whether it still works:
w3tc_pgcache_flush(); # let it crash here so that you won't get the mail
wp_mail('email@example.com', 'Automatic email', 'Hello, this is an automatically scheduled email from WordPress.');
Background
Hi,
I am new to cron jobs and I am trying to set up a test one just to see how they work. I created it in cPanel like this:
wget -O - -q https://website.com/staging/wp-content/themes/AT103/test-cron.php
My file is pretty simple so far:
<?php
$email_to = "info@domain.com";
$title    = "Test title";
$body     = "Test body";
mail($email_to, $title, $body);
?>
All works fine here; I receive the email every interval that my cron job runs.
What I want
On my site people can put up ads/listings to sell their stuff. In the cron job I want to go through all listings and email the buyers and sellers for the ones that have expired. As a first step in this I need to access the variables on my website, e.g. a listing object which is put in a variable like $post. Or through a function which returns the stuff I want, if that's easier.
What I tried
I tried to access a random file and its functions by using lots of different code examples found online. The file is in the same folder as my test-cron.php. Here are a few things I tried to put at the top of test-cron.php (one at a time, not all at once):
require '/functions.php';
require 'https://website.com/staging/wp-content/themes/AT103/functions.php';
require '/home3/username/public_html/staging/wp-content/themes/AT103/functions.php';
This all resulted in the same thing: I did not get an email anymore, so I assume there is some sort of error with these lines? I also tried require_once() with the same result.
Questions
How do I access other scripts ("live" variables or functions) in my folder hierarchy through a cron job?
If it is not possible for some reason, can I instead access my database information somehow?
If the file to be required/included is in the same folder as the running script, you would use one of the following:
require 'functions.php'; (no leading slash: PHP looks in the include path first, then in the current directory)
require './functions.php'; (better: the ./ explicitly says look in the current directory)
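Worth noting for the cron case: cron usually starts the script from a different working directory, so even './functions.php' can miss. Anchoring the path on __DIR__ sidesteps that. A small sketch (functions.php is assumed to sit next to the cron script):

```php
<?php
// __DIR__ is the absolute directory of *this* file, no matter which
// working directory cron happened to start the script from.
$path = __DIR__ . '/functions.php';

// In the real cron script you would then simply do:
//   require $path;
echo $path, "\n";
```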
https://www.php.net/manual/en/ini.core.php#ini.include-path
EDIT:
I just realized I did not address the fact that you are using cron, and that's because ...
You are still running PHP; cron job or not makes no difference, it still works the same!
However, it can be more difficult to debug on a production server. If you want to see exactly what's happening when the script fails, you can wrap it in a try block, catch the error and send it to your email, or output the error and view it in the browser.
I know on Bluehost shared hosting, if any of my cron jobs produces any output, it is automatically sent to me via email. I use the format below and always get an email telling me when and why it happened. While developing, you can simply navigate to your test-cron.php in the browser.
<?php
try {
    require './functions.php';
    /*
       all of your logic
    */
} catch (Throwable $e) {
    // Throwable covers both Error and Exception (PHP 7+)
    echo "Caught Error: \n" . $e;
}
The situation is like this: I have created multiple PHP files with the name operation.php, hosted on my domains, something like this:
example.com/operation.php
example123.com/operation.php
example1234.com/operation.php
Okay, so now what I want is to code one single PHP script that will be the mother of all these operation.php scripts. Something like this: if I execute motherexample.com/operaterun.php from my browser, then all these PHP scripts will run one by one.
Is this possible? Will it cause a server outage? I do not want to run them all at once; maybe take a gap of 10 seconds between each script execution.
Need help!
UPDATE: I'm not sure whether you guys are getting it or not, so here's another example.
Let's say you have 100 sites, each with a maintenance.php located at example001.com/maintenance.php. It is not practical to load and run each of those 100 maintenance.php files in the browser one by one. That is why I want one single mother-maintenance.php that, when run from the browser, will execute each of those maintenance.php scripts one by one, perhaps with some time gap!
If a user is triggering it, I would recommend AJAX.
Otherwise you can try code like this:
<?php
$somearg = escapeshellarg('blah');
// launch file2.php in the background, discarding its output
exec("php file2.php $somearg > /dev/null &");
Found on (https://stackoverflow.com/a/1110260/4268741)
However, you need to make some changes to it to get it to work on your project.
You need to run something like a wget call from the mother script, motherexample.com/operaterun.php.
wget is a Linux tool; if you are on a Windows server, look for an alternative.
example
wget http://example.com/operation.php
wget http://example123.com/operation.php
wget http://example1234.com/operation.php
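Those three calls, plus the 10-second gap the question asks for, can be wrapped in one small shell script (the URLs are the ones from the question; everything else is a sketch):

```shell
#!/bin/sh
# Hypothetical mother script: hit each operation.php in turn,
# pausing 10 seconds between calls.
URLS="http://example.com/operation.php
http://example123.com/operation.php
http://example1234.com/operation.php"

for url in $URLS; do
    # -q: quiet, -O -: send the body to stdout (discarded here);
    # '|| true' keeps the loop going if one site is down
    wget -q -O - "$url" > /dev/null 2>&1 || true
    sleep 10
done
```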
You can use AJAX requests: just make a request from your mother script to each child page.
Or you can use a simpler way: an iframe for each child. If you have 100 child pages, just generate 100 iframes on the mother page, each with its src pointing to a child. Or you can create the iframes with JavaScript and add them to the page; then you can put a delay between requests.
I think file_get_contents can help you:
<?php
//index.php
set_time_limit(0);

// file_get_contents needs full URLs, including the scheme
$op_list = array(
    "http://example.com/operation.php",
    "http://example123.com/operation.php",
    "http://example1234.com/operation.php",
);

$opts = array(
    "http" => array(
        "method"  => "GET",
        "timeout" => 60,
    )
);
$context = stream_context_create($opts);

foreach ($op_list as $value) {
    if ($response = file_get_contents($value, false, $context)) {
        $json_decode = json_decode($response, true);
        echo "{$value}: time_start: {$json_decode['time_start']}, time_end: {$json_decode['time_end']} <br />";
    } else {
        echo "{$value}: fail <br />";
    }
}
?>
Here is a complete example:
http://try.valinv.com/php/21a6092d20601c77982f7c1ec6b2cc23.html
I've created a little command-line tool to help me launch sites. It uses the Symfony2 Console component to help create the commands and add some structure. What I'm trying to figure out is whether there is a way I can create a "blank" or default command, so that if you don't put in a command it just defaults to this. An example might help explain:
A "normal" console command would look like this:
php launch site foo
I want to make this do the exact same thing as above:
php launch foo
The only thing I can think of is to sort of short-circuit the application->run process and check if "foo" is in my own command list; if it's not, then force the console to run site foo.
The crappy thing about that is that if you just typoed a different command, the system would try to run it as a site, and instead of an error message you'd get an error saying it can't launch that site (which is an error, but the wrong error, and not a helpful one to a user).
Maybe I missed something in the console docs, but is there a way to do what I'm trying here?
So what I ended up doing was just attempting my own match: if I can find the command, I run the application as normal; if not, I try to launch the site:
if (!array_key_exists($argv[1], $list)) {
    $cmd = $this->application->find('launch');
    $args = array(
        'command' => 'launch',
        'alias'   => $argv[1]
    );
    $input  = new ArrayInput($args);
    $output = new ConsoleOutput();
    $cmd->run($input, $output);
} else {
    $this->application->run();
}
It works fine; it just feels a little meh. I'm open to other suggestions if anyone has them.
I have a script that inserts into the database e.g. 20,000 users with email addresses, in batches of 1000
(so two tables, emailParent and emailChild; there are 1000 rows in emailChild for every row in emailParent).
I want to run a script that sends these emails which basically says
// check_for_pending_parent_row() returns the id of the first pending row found, or 0
while ($parentId = check_for_pending_parent_row()) {
    // loop over children of parent row
}
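A runnable sketch of that loop, with the database swapped for an in-memory array (check_for_pending_parent_row() and its pass-by-reference argument are stand-ins for the real MySQL queries):

```php
<?php
// Simulated pending parent rows; in the real script these come from MySQL.
$pendingParents = array(1, 2, 3);

// Stand-in for the real query: returns the id of the first pending
// parent row, or 0 when none are left.
function check_for_pending_parent_row(array &$pending) {
    return $pending ? array_shift($pending) : 0;
}

$processed = array();
while ($parentId = check_for_pending_parent_row($pendingParents)) {
    // Here you would loop over the (up to 1000) child rows of this
    // parent and hand each batch to SendGrid.
    $processed[] = $parentId;
}
```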
Now because this is talking to the sendgrid servers this can take some time.
So I want to be able to hit a page and have that page launch a background process which sends the emails to sendgrid.
I thought I could use exec(), but then I realized I am using CodeIgniter, which means the entry point must be index.php; hence, I don't think exec() will work.
How can I launch a background process that uses CodeIgniter?
This is not really an answer, just something that is too long to post as a comment.
@Frank Farmer: 70 lines seems a bit excessive; this example from a simple test does it in pretty much half that. What is the difference?
<?php
//---------------------------
// define required constants
//---------------------------
define('ROOT', dirname(__FILE__) . '/');
define('APPLICATION', ROOT . 'application/');
define('APPINDEX', ROOT . 'index.php');

//---------------------------
// check if required paths are valid
//---------------------------
$global_array = array(
    "ROOT"        => ROOT,
    "APPLICATION" => APPLICATION,
    "APPINDEX"    => APPINDEX
);
foreach ($global_array as $global_name => $dir_check):
    if (!file_exists($dir_check)) {
        echo "Cannot Find " . $global_name . " File / Directory: " . $dir_check;
        exit;
    }
endforeach;

//---------------------------
// load in CodeIgniter
//---------------------------
// Capture CodeIgniter output, discard it, and load the system into $ci
ob_start();
include (APPINDEX);
$ci = &get_instance();
ob_end_clean();

// do stuff here
Use exec() to run a vanilla CLI PHP script that calls the page via cURL.
See http://php.net/manual/en/book.curl.php for info on cURL.
This is what I have had to do with some of my CodeIgniter applications.
(Also, make sure you set the timeout to 0.)
Doing it this way, you are still able to debug it in the browser.
Petah suggested cURL, but since version 2.0 CodeIgniter permits calls to your controllers through the CLI.
This should be easier than cURL.
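The shape of such a call, assuming a controller named tools with a method send_emails (both names are hypothetical stand-ins for the OP's email-sending code):

```shell
# From the CodeIgniter root (CI >= 2.0): run a controller method from the CLI.
#   php index.php <controller> <method>
#   php index.php tools send_emails
#
# Matching crontab entry, fired hourly in the background:
#   0 * * * * cd /var/www/app && php index.php tools send_emails > /dev/null 2>&1
```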
I'm discarding the old question in favor of a better-formulated one.
I'm using the Twitter stream API via PHP, like the script below.
When I run it via the command line, the script keeps running, and when I hit Ctrl+C the script stops.
That's great and all, but I'd like to run it in the background: when a user creates a new search, the script below gets activated and stays running till a signal is given to stop, just like the CLI version, but in the background.
How do I achieve this?
OK, here's a piece of code:
$opts = array(
    'http' => array(
        'method'  => 'POST',
        'content' => 'track=ipad'
    )
);
$context = stream_context_create($opts);
$stream  = fopen('http://test@stream.twitter.com/1/statuses/filter.json', 'r', false, $context);

while (!feof($stream)) {
    if (!($line = stream_get_line($stream, 200000, "\n"))) {
        continue;
    }
    $tweet = json_decode($line);
    // mysql query
}
I'm running it
This should be solved by running the PHP script as a daemon from the command line. There are a lot of things to consider, though:
what happens when the system reboots?
what happens when an error occurs?
what about memory management?
what about monitoring?
A good start is Jeroen Keppens' presentation "PHP in the Dark", which he has given at multiple PHP conferences.
You might want to look into ajax or other web-browser methods like comet or websockets to achieve what you want.
update
So if you want it to run on the server side, you should use PHP's exec() function, redirecting the output and backgrounding the process so the call returns immediately:
exec("php-cli runForever.php > /dev/null 2>&1 &");
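For reference, the same effect straight from a shell: nohup detaches the script from the terminal so it keeps running after logout (runForever.php and the log path are taken as placeholders):

```shell
# Start the stream consumer detached from the terminal; output goes to
# a log file instead of the console. (runForever.php is hypothetical.)
nohup php runForever.php >> /tmp/runforever.log 2>&1 &
PID=$!
echo "started with pid $PID"   # handy for stopping it later with kill
```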
I'm trying to do the same thing with this tutorial. It uses the System_Daemon PEAR package and provides start/stop functions.