CodeIgniter and exec()? - PHP

I have a script that inserts into the database, e.g., 20,000 users with email addresses in batches of 1,000
(so two tables, emailParent and emailChild; there are 1,000 rows in emailChild for every row in emailParent).
I want to run a script that sends these emails, which basically says:
// check_for_pending_parent_row() returns the id of the first pending row found, or 0
while ($parentId = check_for_pending_parent_row()) { /* loop over children of parent row */ }
Because this talks to the SendGrid servers, it can take some time.
So I want to be able to hit a page and have that page launch a background process which sends the emails to SendGrid.
I thought I could use exec(), but then I realized I am using CodeIgniter, which means the entry point MUST be index.php; hence, I don't think exec() will work.
How can I launch a background process that uses CodeIgniter?

This is not really an answer, just something too long to post as a comment.
@Frank Farmer: 70 lines seems a bit excessive; this example from a simple test does it in pretty much half that. What is the difference?
<?php
//---------------------------
// define required constants
//---------------------------
define('ROOT', dirname(__FILE__) . '/');
define('APPLICATION', ROOT . 'application/');
define('APPINDEX', ROOT . 'index.php');

//---------------------------
// check if required paths are valid
//---------------------------
$global_array = array(
    'ROOT'        => ROOT,
    'APPLICATION' => APPLICATION,
    'APPINDEX'    => APPINDEX,
);
foreach ($global_array as $global_name => $dir_check) {
    if (!file_exists($dir_check)) {
        echo "Cannot Find " . $global_name . " File / Directory: " . $dir_check;
        exit;
    }
}

//---------------------------
// load in CodeIgniter
//---------------------------
// Capture CodeIgniter output, discard it, and load the system into $ci
ob_start();
include APPINDEX;
$ci = &get_instance();
ob_end_clean();
// do stuff here
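For completeness, a minimal sketch of how a page could kick this bootstrap off in the background with exec() (the worker filename and path are placeholders, assuming a *nix host):
<?php
// Hypothetical launcher: assumes the bootstrap above is saved as worker.php.
// Redirecting output and appending '&' detaches the process,
// so exec() returns to the web request immediately.
exec('php ' . escapeshellarg('/path/to/worker.php') . ' > /dev/null 2>&1 &');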

Use exec() to run a vanilla CLI PHP script that calls the page via cURL.
See http://php.net/manual/en/book.curl.php for info on cURL.
This is what I have had to do with some of my CodeIgniter applications.
(Also make sure you set the timeout to 0.)
And doing it this way, you are still able to debug it in the browser.
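A rough sketch of such a worker script (the URL is a placeholder; only standard cURL calls are used):
<?php
// worker.php - plain CLI script that triggers the CodeIgniter page over HTTP.
set_time_limit(0); // don't let this script time out either
$ch = curl_init('http://example.com/index.php/email/send'); // assumed URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 0); // 0 = no cURL timeout; sending can take a while
$response = curl_exec($ch);
if ($response === false) {
    fwrite(STDERR, 'cURL error: ' . curl_error($ch) . PHP_EOL);
}
curl_close($ch);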

Petah suggested cURL, but since 2.0, CodeIgniter permits calls to your controllers through the CLI.
This should be easier than cURL.
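For example (a sketch; the controller and method names are made up, and is_cli_request() is the CodeIgniter 2.x input helper):
<?php
// application/controllers/email.php (hypothetical)
class Email extends CI_Controller {
    public function send()
    {
        // Optionally refuse web access so only the CLI/cron can trigger it.
        if (!$this->input->is_cli_request()) {
            show_error('CLI only', 403);
        }
        // ... loop over pending parent rows and send via SendGrid ...
    }
}
// Invoked from the shell (or a crontab line) as:
//   php /path/to/index.php email send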

Related

Access variables or functions from other scripts in cron job via cPanel

Background
Hi,
I am new to cron jobs and I am trying to set up a test one just to see how they work. I created it in cPanel like this:
wget -O - -q https://website.com/staging/wp-content/themes/AT103/test-cron.php
My file is pretty simple so far:
<?php
$email_to = "info@domain.com";
$title = "Test title";
$body = "Test body";
mail($email_to, $title, $body);
?>
All works fine here; I receive the email every time my cron job runs.
What I want
On my site people can put up ads/listings to sell their stuff. In the cron job I want to go through all listings and email the buyers and sellers for the ones that have expired. As a first step, I need to access the variables on my website, e.g. a listing object stored in a variable like $post, or, if easier, through a function which returns the data I want.
What I tried
I tried to access a random file and its functions by using lots of different code examples found online. The file is in the same folder as my test-cron.php. Here are a few things I tried to put at the top of test-cron.php (one at a time, not all at once):
require '/functions.php';
require 'https://website.com/staging/wp-content/themes/AT103/functions.php';
require '/home3/username/public_html/staging/wp-content/themes/AT103/functions.php';
These all resulted in the same thing: I did not get an email anymore, so I assume there is some sort of error in those lines? I also tried require_once() with the same result.
Questions
How do I access other scripts ("live" variables or functions) in my folder hierarchy through a cron job?
If it is not possible for some reason, can I instead access my database information somehow?
If the file to be required/included is in the same folder as the running script, you would use one of the following:
require 'functions.php'; — no leading slash tells it to look in the include path, then in the current directory.
require './functions.php'; — (better) the ./ explicitly says to look in the current directory.
https://www.php.net/manual/en/ini.core.php#ini.include-path
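Since cron jobs are often started from a different working directory, anchoring the path to the script's own directory is the most robust option; a minimal sketch:
<?php
// __DIR__ is the directory of this file, so the require works no matter
// which working directory cron starts the script from.
require __DIR__ . '/functions.php';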
EDIT:
I just realized I did not address the fact that you are using cron, and that's because...
You are still running PHP; cron job or not makes no difference, it still works the same!
However, it can be more difficult to debug on a production server. If you want to see exactly what's happening when the script fails, you can wrap it in a try block, then catch and send the error to your email, or output the error and view it in the browser.
I know that on Bluehost shared hosting, if any of my cron jobs produce any output, it is automatically sent to me via email. I use the format below and always get an email telling me when and why it happened. While developing, you can simply navigate to your test-cron.php in the browser.
<?php
try {
    require './functions.php';
    /*
    all of your logic
    */
} catch (Throwable $e) { // Throwable catches both Errors and Exceptions (PHP 7+)
    echo "Caught Error: \n" . $e;
}

How to bypass security checks in a PHP script if run from the CLI?

I have a PHP script which is typically run as part of a bigger web application.
The script essentially makes some changes to a database and reports back to the web user on the status/outcome.
I have an opening section in my PHP:
require $_SERVER['DOCUMENT_ROOT'].'/security.php';
// Only level <=1 users should be able to access this page:
if ($_SESSION['MySecurityLevel'] > 1) {
    echo '<script type="text/javascript" language="JavaScript">window.location = \'/index.php\'</script>';
    exit();
}
So, basically, if the authenticated web user's security level is higher than 1, they are just redirected to the web app's index.
The script works fine like this via web browsers.
Now to my issue...
I want to also cron-job this script, but I don't know how to bypass the security check when it's run from the CLI.
If I simply run it from the CLI/cron with 'php -f /path/to/report.php' and enclose the security check in an "if ( php_sapi_name() != 'cli' )", it spews out errors due to the multiple $_SERVER[] vars used in the script (there may be other complications, but this was the first error encountered).
If I run it using cURL, then the php_sapi_name() check won't work, as it's just being served by Apache.
Please can anyone offer some assistance?
Thank you! :)
If you invoke the script through the CLI, some of the $_SERVER variables will be defined; however, their values may not be what you expect: for instance, $_SERVER['DOCUMENT_ROOT'] will be empty, so your require will look for a file called security.php in the filesystem root. Other arrays such as $_SESSION will not be populated, as the CLI has no comparable concept.
You could get around these issues by manually defining the variables (see "Set $_SERVER variable when calling PHP from command line?"); however, a cleaner approach is to extract the code that makes the database changes into a separate file that is independent from any specific SAPI and does not depend on any SAPI-specific variables being defined.
For instance your PHP script (let's call it index.php) could be modified like this:
require $_SERVER['DOCUMENT_ROOT'].'/security.php';
require $_SERVER['DOCUMENT_ROOT'].'/db_changes.php';
// Only level <=1 users should be able to access this page:
if ($_SESSION['MySecurityLevel'] > 1) {
    echo '<script type="text/javascript" language="JavaScript">window.location = \'/index.php\'</script>';
    exit();
} else {
    do_db_changes();
}
Then in the SAPI-agnostic db_changes.php you would have:
<?php
function do_db_changes() {
    // Do the DB changes here...
}
?>
And finally you would have a file, outside the web root, which you can invoke from cron (say cron.php):
<?php
require '/absolute/path/to/db_changes.php';
do_db_changes();
?>
Like this you can continue using index.php for the web application and invoke cron.php from cron to achieve your desired results.
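If you would rather run the original script unmodified from cron, a rough sketch of the "manually define the variables" route mentioned above (the document root, session shape, and script name are all assumptions):
<?php
// cron_wrapper.php (hypothetical) - seed the values the script expects,
// then include it. security.php must tolerate running outside a web request.
$_SERVER['DOCUMENT_ROOT'] = '/home/user/public_html'; // placeholder path
$_SESSION = array('MySecurityLevel' => 0);            // level 0 passes the <=1 check
require $_SERVER['DOCUMENT_ROOT'] . '/report.php';    // assumed script name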

Call PHP script multiple times, with unique include

I'm trying to set up a cron job to update all of our clients. They each have their own db and directory in our web root. An individual call uses this script:
<?php
include_once 'BASEPATH'.$_REQUEST['client'].'PATHTOPHPLIB';
// Call some functions here
// backup db
$filename = 'db_backup_' . date('G_a_m_d_y') . '.sql';
$result = exec('mysqldump ' . Config::read('db.basename') . ' --password=' . Config::read('db.password') . ' --user=' . Config::read('db.user') . ' --single-transaction >BACKUPDIRECTORYHERE' . $filename, $output);
if (empty($output)) {
    /* no output is good */
} else {
    logit('Could not backup db');
    logit($output);
}
?>
I need to call this same script multiple times, each with a unique include based on a client variable being passed in. We originally had a unique cron job for each client, but this is no longer a possibility. What is the best way to call this script? I'm looking at creating a new PHP script that will have an array of our clients and loop through it, running this script for each, but I can't just include it because the libraries will have overlapping functions. I'm not considering cURL because these scripts are not in the web root.
First off, a quick advert for the Symfony Console component. There are others, but I've been using Symfony for a while and gravitate towards that. Hopefully you are PSR-0/Composer-able in your project. Even if you aren't, this could give you an excuse to do something self-contained.
You absolutely don't want these sorts of scripts under the web root. There is no value in having them run through Apache, and there are limitations imposed on them in terms of memory and runtime that are different in a command-line PHP context.
Base script:
<?php
if (PHP_SAPI != 'cli') {
    echo "Error: This should only be run from the command line environment!";
    exit;
}

// Script name is always passed, so $argc with 1 arg == 2
if ($argc !== 2) {
    echo "Usage: $argv[0] {client}\n";
    exit;
}

// Setup your constants
define('BASEPATH', '....');
define('PATHTOPHPLIB', '...');

require_once BASEPATH . $argv[1] . PATHTOPHPLIB;

// Call some functions here
// backup db
$filename = 'db_backup_' . date('G_a_m_d_y') . '.sql';
$result = exec('mysqldump ' . Config::read('db.basename') . ' --password=' . Config::read('db.password') . ' --user=' . Config::read('db.user') . ' --single-transaction >BACKUPDIRECTORYHERE' . $filename, $output);
if (empty($output)) {
    /* no output is good */
} else {
    logit('Could not backup db');
    logit($output);
}
Calling Script Runs in cron:
<?php
// Bootstrap your master DB
// Query the list of clients
define('BASE_SCRIPT', 'fullpath_to_base_script_here');
foreach ($clients as $client) {
    // escapeshellarg() guards against shell metacharacters in client names
    exec('/path/to/php ' . BASE_SCRIPT . ' ' . escapeshellarg($client));
}
If you want to keep things decoupled inside the caller script, you could pass the path to the backup-processing script rather than hardwiring it; if so, use the same techniques to get the param from $argc and $argv.
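A hypothetical crontab entry for the calling script (both paths are placeholders):
# Run the per-client backups nightly at 02:00
0 2 * * * /usr/bin/php /path/to/calling_script.php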

How can I run a PHP script exactly once - No sessions

I have the following question: how can I run a PHP script only once? Before people start to reply that this is indeed a similar or duplicate question, please continue reading...
The situation is as follows: I'm currently writing my own MVC framework, and I've come up with a module-based system so I can easily add new functionality to the framework. In order to do so, I created a /ROOT/modules directory in which one could add new modules.
So as you can imagine, the script needs to read the directory, read all the PHP files, parse them, and then is able to execute the new functionality; however, it has to do this for every browser request. This makes the task about O(nAmountOfRequests * nAmountOfModules), which is rather big on websites with a large number of user requests every second.
Then I figured: what if I introduced a session variable like $_SESSION['modulesLoaded'] and simply checked whether it is set? This would reduce the load to O(nUniqueAmountOfRequests * nAmountOfModules), but this is still a large Big O if the only thing I want to do is read the directory once.
What I have now is the following:
/** Load the modules */
require_once(ROOT . DIRECTORY_SEPARATOR . 'modules' . DIRECTORY_SEPARATOR . 'module_bootloader.php');
Which consists of the following code:
<?php
// TODO: Make sure the foreach only executes once across all requests instead of on every request.
if (!array_key_exists('modulesLoaded', $_SESSION)) {
    foreach (glob('*.php') as $module) {
        require_once($module);
    }
    $_SESSION['modulesLoaded'] = '1';
}
So now the question: is there a solution, like a superglobal variable, that I can access and that exists across all requests, so that instead of the previous Big Os I can get a Big O that consists only of nAmountOfModules? Or is there another way to easily read the module files only once?
Something like:
if ($isFirstRequest) {
    foreach (glob('*.php') as $module) {
        require_once($module);
    }
}
In its most basic form, if you want to run it once and only once (per installation, not per user), have your intensive script change something in the server state (add a file, change a file, change a record in a database), then check against that every time a request to run it is issued.
If you find a match, it means the script has already run, and you can continue with the process without having to run it again.
When called, lock a file; at the end of the script, delete the file. That way it runs only once at a time, and since the lock is no longer needed afterwards, it vanishes into nirvana.
This naturally works the other way round, too:
<?php
$checkfile = __DIR__ . '/.checkfile';
clearstatcache(false, $checkfile);
if (is_file($checkfile)) {
    return; // script did run already
}
touch($checkfile);
// run the rest of your script.
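For comparison, the lock-then-delete variant described above could look roughly like this (the lock filename and the flock() approach are one possible implementation):
<?php
$lockfile = __DIR__ . '/.lockfile';
$fp = fopen($lockfile, 'c'); // create the file if missing, don't truncate
if (!flock($fp, LOCK_EX | LOCK_NB)) {
    exit; // another instance is already running
}
// ... run the script body here ...
flock($fp, LOCK_UN);
fclose($fp);
unlink($lockfile); // gone until the next run needs it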
Just cache the array to a file and, when you upload new modules, delete the cache file. It will have to be recreated, and then you're all set again.
// If the $cache file does not exist or unserialize fails, rebuild it and save it
if (!is_file($cache) or (($cached = unserialize(file_get_contents($cache))) === false)) {
    // rebuild your array here into $cached
    $cached = call_user_func(function () {
        // rebuild your array here and return it
    });
    // store the $cached data into the $cache file, serialized so the
    // unserialize() above can read it back
    file_put_contents($cache, serialize($cached), LOCK_EX);
}
// Now you have a $cache file that holds your cached data
// Keep using the $cached variable, as it should hold your data
This should do it.
PS: I'm currently rewriting my own framework and do the same thing to store such data. You could also use an SQLite DB to store all the data your framework needs, but make sure to test performance and see if it fits your needs. With proper indexes, SQLite is fast.
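A rough sketch of that SQLite alternative using PDO (the table layout and cache key are made up for illustration):
<?php
// Store the cached module list in a small SQLite database instead of a flat file.
$db = new PDO('sqlite:' . __DIR__ . '/cache.sqlite');
$db->exec('CREATE TABLE IF NOT EXISTS cache (k TEXT PRIMARY KEY, v BLOB)');

$stmt = $db->prepare('SELECT v FROM cache WHERE k = ?');
$stmt->execute(array('modules'));
$blob = $stmt->fetchColumn();

if ($blob === false) {
    $cached = glob('*.php'); // rebuild the module list
    $ins = $db->prepare('INSERT OR REPLACE INTO cache (k, v) VALUES (?, ?)');
    $ins->execute(array('modules', serialize($cached)));
} else {
    $cached = unserialize($blob);
}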

How can I test a CRON job with PHP?

This is the first time I've ever used a CRON.
I'm using it to parse external data that is automatically FTP'd to a subdirectory on our site.
I have created a controller and model which handles the data. I can access the URL fine in my browser and it works (however I will be restricting this soon).
My problem is, how can I test if it's working?
I've added this to my controller for a quick-and-dirty log:
$file = 'test.txt';
$contents = '';
if (file_exists($file)) {
    $contents = file_get_contents($file);
}
$contents .= date('m-d-Y') . ' --- ' . PHP_SAPI . "\n\n";
file_put_contents($file, $contents);
But so far I've only got requests logged from myself from the browser, despite having my cron running every minute.
03-18-2010 --- cgi-fcgi
03-18-2010 --- cgi-fcgi
I've set it up using cPanel with the command
index.php properties/update/
The 2nd portion is what I use to access the page in my browser.
So how can I test that this is working properly, and have I stuffed anything up?
Note: I'm using Kohana 3.
Many thanks
You're not using the correct command for calling Kohana.
Make sure you're using the full path to index.php so you can eliminate any path errors. Here are the switches available for use in Kohana:
--uri: Self explanatory
--method: HTTP Request method (POST, GET, etc ...) (Overrides Kohana::$method)
--get: Formatted GET data
--post: Formatted POST data
You should be using something like this:
php /path/to/kohana/directory/index.php --uri=properties/update/
I can't remember if you need double quotes around the value; don't forget to try that if it doesn't work.
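Putting it together, a hypothetical cron entry with full paths (adjust both to your server):
* * * * * /usr/bin/php /full/path/to/kohana/index.php --uri=properties/update/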
You probably aren't running cron with root permissions on that file.
Put MAILTO="youremail@yourdomain.tld" at the start of the crontab file to have it email you errors.
If you don't have root access to the cron file (i.e. via SSH), I don't know if you can do this in cPanel.
