I have a crawler that scrapes a website for information and then inserts the values into a database. It inserts the first ~4000 rows fine, but then suddenly stops inserting values into the MySQL database even though the crawler is still scraping the website.
Database table
CREATE TABLE IF NOT EXISTS `catalog` (
`id` varchar(100) NOT NULL DEFAULT '',
`title` varchar(100) DEFAULT NULL,
`value` double DEFAULT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
PHP insert function
function addToCatalog($id, $title, $value){
    $q = "INSERT INTO catalog VALUES('$id', '$title', $value)";
    return mysql_query($q, $this->connection);
}
PHP scrape function
function scrape($pageNumber){
    $page = file_get_html('http://example.com/p='.$pageNumber);
    if($page){
        $id = array();
        $title = array();
        $value = array();
        //id
        if($page->find('.productid')){
            foreach($page->find('.productid') as $p){
                $id[] = $p->innertext;
            }
        }
        //title
        if($page->find('.title')){
            foreach($page->find('.title') as $p){
                $title[] = $p->innertext;
            }
        }
        //value
        if($page->find('.value')){
            foreach($page->find('.value') as $p){
                $value[] = $p->innertext;
            }
        }
        for($i=0; $i<sizeof($id); $i++){
            $add = $database->addToCatalog($id[$i], $title[$i], $value[$i]);
            echo $id[$i]." ".$title[$i]." ".$value[$i]."<br>";
        }
    }
}

for($i=0; $i<31300; $i++){
    scrape($i);
}
Any help with this problem would be appreciated.
If the execution of the process stops after about 30 seconds, your problem is probably the max_execution_time setting.
I had a similar issue not too long ago; it turned out to be PHP running as FastCGI with a process daemon terminating the script. Try counting the number of seconds it takes before the script exits; if it's the same amount each time, try switching to CGI and running it again.
It could also be your web host terminating the script to protect shared resources, so if you are using a shared hosting server, it may be worth an upgrade.
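If that is the case, you can test it by lifting the limit for this run (a minimal sketch; some hosts disallow this, and a long import like this is usually better run from the command line):
// Sketch: remove the execution time limit for this script only.
// Equivalent to ini_set('max_execution_time', 0); hosts running PHP in safe mode may ignore it.
set_time_limit(0);

for($i = 0; $i < 31300; $i++){
    scrape($i);
}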
The environment this is running in is as follows. It's an instance of Server2Go on a thumb drive.
Apache/2.2.15 (Win32),
PHP/5.3.2,
SQLite 2,
MySQL 5.1.46-community,
and Perl 5.8.
The PHP script opens a file and loops through it line by line. A query is built and executed with the information from each line of the file.
The table structure is as follows:
CREATE TABLE `exp_report` (
`b_unit` varchar(11) DEFAULT NULL, -- IMPORTANT TO THIS PROBLEM
`b_unit_title` varchar(255) DEFAULT NULL,
`act_code` varchar(11) DEFAULT NULL,
`act_title` varchar(255) DEFAULT NULL,
`adopted_bgt` varchar(20) DEFAULT NULL,
`amended_bgt` varchar(20) DEFAULT NULL,
`encumb` varchar(20) DEFAULT NULL,
`ytd_exp` float(14,2) DEFAULT NULL, -- IMPORTANT TO THIS PROBLEM
`encumb_ytdexp` float(14,2) DEFAULT NULL, -- IMPORTANT TO THIS PROBLEM
`available_bgt` varchar(20) DEFAULT NULL,
`percent` varchar(20) DEFAULT NULL
) ENGINE=MyISAM DEFAULT CHARSET=utf8;
When the script runs, I echo the generated query statement both before and after it is executed, to ensure nothing strange is happening to the data in PHP.
Example output:
INSERT INTO exp_report (b_unit, b_unit_title, act_code, act_title, adopted_bgt,
amended_bgt, encumb, ytd_exp, encumb_ytdexp, available_bgt,
percent
)
VALUES('01000101', 'COUNCIL', '414000', 'SALARIES & WAGES', '259500','214500',
'0', '2', '209296.72', '5203.28', '0.97574228'
);
The Result in the database looks like this:
010001 | COUNCIL | 414000 | SALARIES & WAGES | 259500 | 214500 | 0 | 2.00 | 2.00 | 5203.28 | 0.97574228
Please note the first field and the last 4 fields of the above data.
The first field is `b_unit` varchar(11), and the program attempts to insert 01000101, but the last two digits are cut off when it is stored in the database.
The next two fields are ytd_exp float(14,2) and encumb_ytdexp float(14,2). I attempt to insert '209296.72' in encumb_ytdexp and it is converted into 2.00.
The remaining two fields are varchars and they store the information properly.
If I copy the query that is echoed to the browser and run it within phpMyAdmin the data is stored properly and it looks exactly like what was passed through in the query statement.
I can't seem to figure out what is causing this behavior.
Any advice is appreciated.
Code for the PHP script:
<?php
include('includes\db_fns.php');
$filename = "d2013expRep.xls";
$lines = file($filename);
foreach($lines as $key=>$current){
    $contents = explode("\t",$current);
    if($key=="0"){
        continue;
    }
    $actr = 0;
    //var_dump($contents);
    foreach($contents as $key2=>$c2){
        $la = trim($c2);
        if($actr==0){
            $finout .= "'".$la."'";
        }else{
            if($key2=="7"){
                $la = (float) $la;
                $finout .= ",'".$la."'";
            }else{
                $finout .= ",'".$la."'";
            }
        }
        $actr++;
        unset($la);
    }
    $query = "INSERT INTO exp_report (b_unit,b_unit_title,act_code,act_title,adopted_bgt,amended_bgt,encumb,ytd_exp,encumb_ytdexp,available_bgt,percent) VALUES($finout); ";
    //echo($query."<BR><BR><BR>");
    $result = mysqli_query($dc2a,$query) or trigger_error("Failed Query: " . mysqli_error($dc2a));
    $finout .= "\r\n";
    echo $query."<BR><BR><BR>";
    unset($finout);
    unset($actr);
}
//$query = "LOAD DATA LOCAL INFILE '/Library/WebServer/Documents/".$filename."' INTO TABLE exp_report FIELDS TERMINATED BY '\s\t' LINES TERMINATED BY '\n' IGNORE 1 LINES ";
//$result = mysqli_query($dc2a,$query) or trigger_error("Failed Query: " . mysqli_error($dc2a));
//echo $query;
?>
I have attempted all of the respondents' suggestions, to no avail. Perhaps these screenshots will help shed some light on this question.
The data is inserted into the database once the script is executed. I'm using text/strings as opposed to numbers/decimals/floats. Now the database is telling me records do not exist when I know they do.
When I execute a query using LIKE, the result is an empty set. The field I am searching is set up as a text field.
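One way to see whether something invisible ended up in the stored values is to dump the raw bytes of a few rows (a quick sketch, reusing the $dc2a connection from the script above):
// Sketch: show the raw bytes of what was actually stored; hidden characters (e.g. null bytes)
// would explain both the truncation and the failing LIKE comparisons.
$result = mysqli_query($dc2a, "SELECT b_unit, HEX(b_unit), LENGTH(b_unit) FROM exp_report LIMIT 5");
while($row = mysqli_fetch_row($result)){
    print_r($row);
    echo "<br>";
}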
Does this make sense to anyone or am I chasing a ghost?
It's hard to see what's wrong from what you posted.
Ensure the file being read is a properly tab-separated text file: maybe I'm wrong, but from the .xls extension it looks like you're trying to read a native Excel binary file (when I save from Excel as tab-separated values I get a .txt extension, but I'm on a Mac...).
I rewrote your code to be clearer.
I also added a few things (which may fix the issue):
every value is escaped before being appended to the values string. This will save you if a value contains a single quote, for example;
the field names are enclosed in backticks to avoid conflicts with potential MySQL reserved keywords.
EDIT:
It seems you're parsing a UTF-16 file. After reading each line, it is converted to UTF-8.
Hope this helps
<?php
include('includes\db_fns.php');
$filename = "d2013expRep.xls"; // <--- make sure you're importing properly tab-separated text data (are you importing an xls file??)
$lines = file($filename);
foreach($lines as $lineNum => $line) {
    // It seems you're working with a UTF-16 text file: convert it to UTF-8
    $lineUTF8 = mb_convert_encoding( $line, "UTF-8", "UTF-16LE" );
    $contents = explode( "\t", $lineUTF8 );
    if($lineNum == 0) {
        continue;
    }
    $queryValues = "";
    foreach( $contents as $valueNum => $value ) {
        // good practice to escape every value (use the mysqli escape function, since the query runs through mysqli)
        $queryValues .= ( $valueNum == 0 ? "" : ", " ) . "'" . mysqli_real_escape_string( $dc2a, trim( $value ) ) . "'";
    }
    // backtick the field names to avoid reserved-keyword conflicts
    $query = "INSERT INTO `exp_report` ( `b_unit`,
                                         `b_unit_title`,
                                         `act_code`,
                                         `act_title`,
                                         `adopted_bgt`,
                                         `amended_bgt`,
                                         `encumb`,
                                         `ytd_exp`,
                                         `encumb_ytdexp`,
                                         `available_bgt`,
                                         `percent` ) VALUES( $queryValues ); ";
    $result = mysqli_query($dc2a, $query) or trigger_error("Failed Query: " . mysqli_error($dc2a));
    // echo $query . "<br /><br />";
}
?>
I have spent many hours debugging and scouring the internet for a solution to this unusual problem. Here's the deal:
I am working on a Work Order Submission and Tracking system. There are two databases involved:
The database where the submission data gets posted, which is located on the same physical machine, but in a separate virtual machine from the web server serving the PHP. They are on the same class C subnet.
The database of our tracking system, located on a different physical server with a different IP altogether, also a virtual machine.
Our work order system allows for multiple 'services requested', stored in an array. In our submissions database this is stored as a comma-separated string, e.g. "40,60,70", but in our tracking system database each 'service requested' needs a separate entry, so that the different aspects of the project can be tracked and completed at different times, by different staff.
THE PROBLEM IS: when I place my second insert statement, the one destined for the tracking database, in a for loop, it completely hangs and takes maybe 5 to 15 minutes before it passes that point in the code and sends the confirmation email. The data does not get inserted either.
When I take it out of the for loop and simply do one insert in the submissions database and one insert into the tracking system, it works fine.
First, I'll post the code that works, but only posts one 'service' to the tracking system:
public function insertOrder()
{
    $services = implode( ',', $this->model->chk );
    $curdate = $this->model->getMySQLDate( $this->model->curdate );
    $dueDate = $this->model->getMySQLDate( $this->model->dueDate );
    $sql = "INSERT INTO orders VALUES(DEFAULT,
            {$this->sanitize($services)},
            {$this->sanitize($curdate)},
            {$this->sanitize($this->model->submittedBy)},
            {$this->sanitize($this->model->shortDesc)},
            {$this->sanitize($this->model->projDetails)},
            {$this->sanitize($dueDate)},
            {$this->sanitize($this->model->dueDateNotes)},
            {$this->sanitize( $this->model->approveBy)},
            {$this->sanitize( $this->model->cost )} )";
    $this->execute( $sql );

    $this->convertServicesToTracks();
    $notes = $this->model->getTracksNotes();
    $dueDate = $dueDate.' 12:00:00';
    $shortDescNoQuotes = str_replace("\"","'",$this->model->shortDesc);
    $sqlTracks = "INSERT INTO todos VALUES(DEFAULT,
            {$this->sanitizeTracks($this->model->chk[0])},
            NULL,
            {$this->sanitizeTracks($shortDescNoQuotes)},
            {$this->sanitizeTracks($notes)},
            now(),
            {$this->sanitizeTracks($dueDate)},
            NULL,
            12,
            NULL,
            'active',
            NULL,
            now() );";
    //echo $sqlTracks;
    $this->executeTracks( $sqlTacks );
}

private function executeTracks( $sql )
{
    $db = $this->getTracksDB( );
    $this->check4Error( $db, $sql );
    return $result;
}

private function getTracksDB()
{
    if (!$this->tracksdb) $this->tracksdb = new mysqli(AbstractSQL::TRACKS_HOST, AbstractSQL::USER, AbstractSQL::PASS, AbstractSQL::TRACKS_SCHEMA);
    return $this->tracksdb;
}

private function convertServicesToTracks()
{
    //converts submission data to tracking system data
}

private function sanitizeTracks($arg)
{
    if (!isset($arg)) return "NULL";
    if (is_numeric($arg) && !is_double( $arg) ) return $arg;
    return "'{$this->getTracksDB()->escape_string($arg)}'";
}
When I add this simple for loop around the second INSERT statement, it hangs, even if the array only has one item!
for($i = 0; $i < count($this->model->chk); ++$i)
{
    $sqlTracks = "INSERT INTO todos VALUES(DEFAULT,
            {$this->sanitizeTracks($this->model->chk[$i])},
            NULL,
            {$this->sanitizeTracks($shortDescNoQuotes)},
            {$this->sanitizeTracks($notes)},
            now(),
            {$this->sanitizeTracks($dueDate)},
            NULL,
            12,
            NULL,
            'active',
            NULL,
            now() );";
    //echo $sqlTracks;
    $this->executeTracks( $sqlTacks );
}
Any help would be greatly appreciated. And I apologize for the long code snippets!!
Is it iterating through the for loop? I see you have an echo; did that write anything out? How many items does it have to iterate through? Five minutes seems like a long time, but if there are a lot of items, that could be why it's taking so long. Are you seeing any errors in your logs?
Something you might try is holding the count in a variable so it doesn't have to be calculated on each iteration. It might speed up your for loop, but I'm not sure it will insert the data.
for($i = 0, $count = count($this->model->chk); $i < $count; ++$i)
{
    $sqlTracks = "INSERT INTO todos VALUES(DEFAULT,
            {$this->sanitizeTracks($this->model->chk[$i])},
            NULL,
            {$this->sanitizeTracks($shortDescNoQuotes)},
            {$this->sanitizeTracks($notes)},
            now(),
            {$this->sanitizeTracks($dueDate)},
            NULL,
            12,
            NULL,
            'active',
            NULL,
            now() );";
    //echo $sqlTracks;
    $this->executeTracks( $sqlTacks );
}
(This suggestion comes from the PHP for loop reference: http://php.net/manual/en/control-structures.for.php)
Well, this may not be the problem, but shouldn't you generally use a foreach loop to avoid hitting parts of the array that may not exist? There is more about this here. If you loop over an empty index, it would break your SQL statement. Use something like this:
foreach($this->model->chk as $val)
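Fleshed out with the insert from the question, that might look like the following (a sketch; the loop body is unchanged apart from using $val instead of $this->model->chk[$i]):
foreach($this->model->chk as $val)
{
    $sqlTracks = "INSERT INTO todos VALUES(DEFAULT,
            {$this->sanitizeTracks($val)},
            NULL,
            {$this->sanitizeTracks($shortDescNoQuotes)},
            {$this->sanitizeTracks($notes)},
            now(),
            {$this->sanitizeTracks($dueDate)},
            NULL,
            12,
            NULL,
            'active',
            NULL,
            now() );";
    // pass the variable that was actually built above ($sqlTracks)
    $this->executeTracks( $sqlTracks );
}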
I have a checking script; it checks whether a server/switch/router is alive.
The records are all stored in one DB:
CREATE TABLE IF NOT EXISTS `mod_monitoring` (
`id` int(11) NOT NULL,
`parentid` int(11) NOT NULL,
...
) ENGINE=MyISAM DEFAULT CHARSET=latin1;
So a router could have a switch below it (connected via parent ID), and that could have a server under it. Now if a server goes down, that's fine, because nothing is under it and no duplicate email gets sent out. However, let's say a router goes down that has a router under it and a couple of servers.
Because we check them all, we would send an email to the admin for each item, telling them each one is dead, but I need to send out only one email, about the router going down. Hope that makes sense; I need to somehow make an array of only the IDs that have no children under them.
I could make an array of all the nodes that are down, but then how do I check whether it's the first one in the tree, and remove all the ones that are under it?
Can anyone help? I've been thinking about this for ages now!
If I understood correctly, you want to iterate from parent to parent, which requires an unspecified number of JOINs; for that you need a stored procedure. In fact, to achieve this goal you need the Kleene (transitive) closure, which is not doable in a plain SQL query.
In the end I made an array of all the dead IDs ($key => $id) and then used the following:
// First pass: keep the dead nodes that sit at the top of the tree (parentid == 0) for notification
if(is_array($dead)) {
    foreach($dead as $key => $id) {
        $conn = $db->query("SELECT * FROM mod_monitoring WHERE id = {$id}");
        $data = $db->fetch_array($conn);
        if($data['parentid'] == 0) {
            $final[] = $id;
            unset($dead[$key]);
        }
    }
}

// Second pass: prune the remaining dead nodes whose parent is already being reported,
// so children of a dead parent don't trigger their own email
if(is_array($dead)) {
    foreach($dead as $key => $id) {
        $conn = $db->query("SELECT * FROM mod_monitoring WHERE id = {$id}");
        $data = $db->fetch_array($conn);
        if(in_array($data['parentid'], $final)) {
            unset($dead[$key]);
        }
        if(in_array($id, $dead)) {
            unset($dead[$key]);
        }
    }
}
I have a script that opens a huge XLSX file and reads 3000 rows of data, saving it to a two-dimensional array. Of all the places for Apache to crash, it does so in a simple loop that builds a MySQL query. I know this because if I remove the following lines from my application, it runs without issue:
$query = "INSERT INTO `map.lmds.dots` VALUES";
foreach($data as $i => $row)
{
$id = $row["Abonnementsid"];
$eier = $row["Eier"];
$status = $row["Status"];
if($i !== 0) $query .= "\n,";
$query .= "('$id', '$eier', '$status', '0', '0')";
}
echo $query;
I can't see a thing wrong with the code.
I'm using PHPExcel and dBug.php
Why is this script crashing Apache?
EDIT: Perhaps I should elaborate on what I mean by crash. I mean a classic Windows "Program has stopped working" dialog.
EDIT: Another attempt inspired by one of the answers. Apache still crashes:
$query = "INSERT INTO `map.lmds.dots` VALUES";
$records = array();
foreach($data as $i => &$row)
{
$id = $row["Abonnementsid"];
$eier = $row["Eier"];
$status = $row["Status"];
$records[] = "('$id', '$eier', '$status', '0', '0')";
}
echo $query . implode(",", $records);
EDIT: I have narrowed it down further. As soon as I add a foreach loop, Apache crashes.
foreach($data as $i => $row) {};
Like the other respondents have said, this is most likely a memory issue; you should check both your Apache error logs and your PHP error logs for more info.
Assuming this is a memory problem, I suggest you change your code so that you execute multiple insert statements inside the foreach loop rather than storing the whole thing in one big string and sending it to the database all at once. Of course, this means you're making 3000+ calls to the database rather than just one, and I'd expect that to be a bit slower; you can mitigate this by using a prepared statement, which should be a bit more efficient. If it is still too slow, try changing your loop so that you only call the database every N iterations.
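A per-row prepared insert might look something like this (a minimal sketch, assuming a mysqli connection in $db and the same $data array and column layout as in the question):
// Sketch: one prepared statement, executed once per row, instead of building one huge INSERT string.
$stmt = $db->prepare("INSERT INTO `map.lmds.dots` VALUES (?, ?, ?, '0', '0')");
foreach($data as $row)
{
    $id     = $row["Abonnementsid"];
    $eier   = $row["Eier"];
    $status = $row["Status"];
    $stmt->bind_param("sss", $id, $eier, $status);
    $stmt->execute();
}
$stmt->close();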
The amount of string concatenation and the amount of string data involved could be too much to handle at once during the permitted execution time.
You could try to just collect the values in an array and put them together at the end:
$query = "INSERT INTO `map.lmds.dots` VALUES";
$records = array();
foreach($data as $i => $row) {
    $records[] = "('".mysql_real_escape_string($row["Abonnementsid"])."', '".mysql_real_escape_string($row["Eier"])."', '".mysql_real_escape_string($row["Status"])."', '0', '0')";
}
$query .= implode("\n,", $records);
Or insert the records in chunks:
$query = "INSERT INTO `map.lmds.dots` VALUES";
$records = array();
foreach($data as $i => $row) {
    $records[] = "('".mysql_real_escape_string($row["Abonnementsid"])."', '".mysql_real_escape_string($row["Eier"])."', '".mysql_real_escape_string($row["Status"])."', '0', '0')";
    if ($i % 1000 === 999) {
        mysql_query($query . implode("\n,", $records));
        $records = array();
    }
}
if (!empty($records)) {
    mysql_query($query . implode("\n,", $records));
}
Also try it with a reference in the foreach, to avoid an internal copy of the array being made:
foreach($data as $i => &$row) {
    // …
}
This does sound like a memory issue. Maybe it has nothing to do with the loop building the SQL query; it could be related to reading the "very" large file before that, with the loop then pushing memory usage over the limit. Did you try freeing up memory after reading the file?
You can use memory_get_peak_usage() and memory_get_usage() to get some more info about consumed memory.
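For example (a quick sketch), you could print something like this before and after loading the workbook, and again after the loop, to see which step pushes memory over the limit:
// Sketch: report current and peak memory consumption at interesting points.
printf("current: %.2f MB, peak: %.2f MB\n",
    memory_get_usage() / 1048576,
    memory_get_peak_usage() / 1048576);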
If that doesn't solve your issue, install a debugger like Xdebug or Zend Debugger and do some profiling.
Alright, it turns out that updating PHP from 5.3.1 to 5.3.5 made the problem go away. I still have no idea as to what made PHP crash in the first place, but I suppose my PHP could simply have been broken and in need of a reinstall.
I have a PHP application (on a LAMP stack) that sends thousands of emails. I would like to forcibly stop the emails from being sent; obviously I can't stop the sending by closing the browser.
Should I kill processes, or is there another way to do this? Which process should I kill? There may be more than one.
PS: Of course, the application is badly designed, but that is not the question here.
If it's your own (self written) application, perhaps you should add some functionality that allows you to suspend or halt the execution.
One example: every X iterations, the script checks a resource for commands. If there are commands in the queue, it executes them in order, removes them, and continues (if applicable).
For example, in a flat file or a DB you could add a STOP/SUSPEND_EXECUTION command. When your script reads that line or row, it suspends normal execution but keeps checking the resource periodically; if a RESUME command is then read, execution resumes from where it left off, since it hasn't left the iterative loop.
Now you can, either by CLI or other interface, add commands to the queue, and the application will respond accordingly.
You could even get fancy, adding timestamps to defer command execution.
PS: If you're performing tasks like mass mailing, etc., perhaps you'd consider moving these scripts to a command line interface. I mention this only based on your comment about "closing the browser".
It could use some work, but it does the trick. run() takes a callback function $job and an array $data as arguments. $job represents a single iteration of whatever batch job you're doing (mass mailing, etc.). With each iteration, $job is given the next element of the $data array as its set of arguments.
$data = array(
    array('name' => 'bob', 'email' => 'bob@site.com'),
    array('name' => 'jim', 'email' => 'jim@site.com'),
    array('name' => 'ann', 'email' => 'ann@site.com'),
);

$job = function($name, $email){
    // do something with $name
    // and $email
};

$batch->run($job, $data);
You need some tables (à la MySQL Workbench):
CREATE SCHEMA IF NOT EXISTS `batchtest` DEFAULT CHARACTER SET latin1 COLLATE latin1_swedish_ci ;
USE `batchtest` ;
CREATE TABLE IF NOT EXISTS `batchtest`.`job` (
`id` CHAR(24) NOT NULL ,
`alias` VARCHAR(255) NOT NULL ,
`status` INT NOT NULL DEFAULT 0 ,
`timestamp` TIMESTAMP NOT NULL ,
PRIMARY KEY (`id`) )
ENGINE = InnoDB;
CREATE TABLE IF NOT EXISTS `batchtest`.`queue` (
`id` INT UNSIGNED NOT NULL AUTO_INCREMENT ,
`job_id` CHAR(24) NOT NULL ,
`action` VARCHAR(255) NOT NULL ,
`params` TEXT NULL ,
`timestamp` TIMESTAMP NOT NULL ,
PRIMARY KEY (`id`) )
ENGINE = InnoDB;
Whenever you want to pause/resume/abort a job, add a row to the queue table with the job_id and action (pause, resume, or abort) and the job will respond. The job will automatically remove the completed commands from the queue table.
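For example, pausing a running job could look like this (a sketch using PDO; $pdo is assumed to be your connection and $jobId the id the job generated for itself when it started):
// Sketch: queue a "pause" command for a running job; the job picks it up on its next iteration.
$statement = $pdo->prepare("INSERT INTO queue (job_id, action) VALUES (:job_id, :action)");
$statement->execute(array(':job_id' => $jobId, ':action' => 'pause'));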
That's the gist of it.
class BatchJob{

    const STATUS_STARTING = 0;
    const STATUS_RUNNING = 1;
    const STATUS_PAUSED = 2;
    const STATUS_ABORTED = 4;
    const STATUS_COMPLETED = 5;

    protected $_id = null;
    protected $_alias = null;
    protected $_pdo = null;
    protected $_pauseSleep = null;
    protected $_status = self::STATUS_STARTING;
    protected $_jobTable = 'job';
    protected $_queueTable = 'queue';

    public function __construct($pdo, $alias){
        $this->_pdo = $pdo;
        $this->_alias = $alias;
        $this->_id = vsprintf('%04x%04x%04x%04x%04x%04x', array(
            mt_rand(0, 0xffff),
            mt_rand(0, 0xffff),
            mt_rand(0, 0xffff),
            mt_rand(0, 0xffff),
            mt_rand(0, 0xffff),
            mt_rand(0, 0xffff),
        ));
        $this->output("Initializing job");
        $this->_pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
        $statement = $this->_pdo->prepare("INSERT INTO {$this->_jobTable} (id, alias, status) VALUES (:id, :alias, :status)");
        $statement->execute(array(
            ':id' => $this->_id,
            ':alias' => $this->_alias,
            ':status' => $this->_status,
        ));
    }

    public function run($job, Array $data, $pauseSleep = 10){
        $this->_pauseSleep = $pauseSleep;
        $iteration = 0;
        $this->updateStatus(self::STATUS_RUNNING);
        while($this->_status != self::STATUS_ABORTED
           && $this->_status != self::STATUS_COMPLETED){
            $statement = $this->_pdo->prepare("SELECT id, action, params FROM {$this->_queueTable} WHERE job_id = :job_id");
            $statement->execute(array(
                ':job_id' => $this->_id,
            ));
            foreach($statement->fetchAll() as $command){
                switch($command['action']){
                    case 'resume':
                        $this->updateStatus(self::STATUS_RUNNING);
                        break;
                    case 'pause':
                        $this->updateStatus(self::STATUS_PAUSED);
                        break;
                    case 'abort':
                        $this->updateStatus(self::STATUS_ABORTED, true, false);
                        exit;
                        break;
                }
                $statement = $this->_pdo->prepare("DELETE FROM {$this->_queueTable} WHERE id = :id");
                $statement->execute(array(
                    ':id' => $command['id'],
                ));
            }
            if($this->_status == self::STATUS_PAUSED){
                sleep($this->_pauseSleep);
                continue;
            }
            call_user_func_array($job, (Array) current($data));
            if(!next($data)){
                $this->updateStatus(self::STATUS_COMPLETED, true, false);
            }
        }
    }

    protected function output($string){
        echo ">>> [{$this->_alias}:{$this->_id}] [" . date('Y-m-d H:i:s') . "] {$string}" . PHP_EOL;
    }

    protected function updateStatus($status = null, $updateDatabase = true, $updateOutput = true){
        if(!is_null($status)){
            $this->_status = $status;
        }
        if($updateDatabase){
            $statement = $this->_pdo->prepare("UPDATE {$this->_jobTable} SET status = :status WHERE id = :id");
            $statement->execute(array(
                ':id' => $this->_id,
                ':status' => $this->_status,
            ));
        }
        if($updateOutput){
            $reflection = new ReflectionClass(__CLASS__);
            $statusCodes = array_flip($reflection->getConstants());
            $this->output("Job status change [{$statusCodes[$this->_status]}]");
        }
    }

    public function __destruct(){
        $this->updateStatus();
    }
}
If you can modify the script you can insert a line like this in the main cycle of the script (preferably before the mail() line):
if (connection_aborted())
    exit();
This will stop the PHP script when you close the browser window. Although this is the default behavior, PHP often fails to stop scripts right away.
You can do this without knowing much about the inner workings of the script, and it is nicer than killing Apache.
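One caveat (an assumption about typical setups): PHP usually only notices that the connection was aborted when it tries to send output, so flushing a little output before the check makes it react more promptly:
// Sketch: PHP generally detects the closed connection only when it attempts output,
// so emit and flush a tiny bit of output before checking.
echo " ";
flush();
if (connection_aborted())
    exit();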
httpd - killing this will stop all of Apache.
Typically you would kill the web server. If you run the cgi exe you can kill that.
Are you asking how to shut down a PHP script gone wild? If so, you could always restart Apache. If I misunderstood your question, I apologize in advance.
Edit your php.ini to add mail to disable_functions. Then PHP will no longer be able to use the mail() function.
disable_functions = mail
http://www.php.net/manual/en/ini.core.php#ini.disable-functions