PHP weird Seg-faults on mysqli_stmt_bind_result - php

When migrating a PHP script from PHP 5.2 to PHP 5.3, I stumbled upon the following problem:
The general purpose of the script is data mining.
I have a procedure inside that adds data to the MySQL server.
Since it is really repetitive, I've rewritten it (a while ago) to use MySQLi, in particular prepared statements, since there are a total of 3 possible queries to perform.
Anyway, now, on the PHP 5.3 server, the script is crashing on the following line:
mysqli_stmt_bind_result($prepCheck, $id1);
Where $prepCheck is created with $prepCheck = mysqli_prepare($con, $checkQuery) or die("Error");. The query runs fine on the MySQL server ($checkQuery, that is) and the PHP code was working, too, on the previous server.
Running the script with strace didn't reveal anything, since the last thing in it is the system call for echo "Execute";, which is 29936 19:44:18 write(1, "Execute\n", 8) = 8.
The connection object is not FALSE, and even if it was, it should fail with another error, right?
Here comes the weirdest part:
This procedure does not fail when I run the script with a limit on the number of pages visited; the script completes successfully. However, when I set a higher limit, it fails, always on the first call to this procedure, and precisely on this line.
If anyone has any suggestions what could be causing this, they would be deeply appreciated.
I can paste code if anyone needs to see the larger picture, but the procedure is very long and boring to death (maybe that's why the script is failing :).
Here is how the script starts: error_reporting(E_ALL); ini_set('display_errors', '1');.
No error is reported besides the 'magical' Segmentation fault. I'm not using APC.
Not sure if it's relevant, but I'm using CLI to run the script, not a web-interface.
PHP version is 5.3.8, MySQL version is 5.1.56. The memory limit is set to 64MB.
EDIT: The procedure failing + some of the other code is uploaded here: http://codepad.org/KkZTxttQ. The whole file is huge and ugly, and I believe irrelevant, so I'm not posting it for now. The line that's failing is 113.

An answer to my own question, since I've solved the issue, and there are no other answers...
Credit goes to #jap1968 for pointing me to the function mysqli_stmt_error (which I had assumed I would not need, since I have error_reporting(E_ALL)).
The problem was that MySQL had a very unusual default configuration, in particular:
connect_timeout = 10
wait_timeout = 30
This caused the MySQL server to close the connection after only 30 seconds of inactivity (the default wait_timeout is 28800 seconds, i.e. eight hours, according to the MySQL documentation). This, in turn, caused the mysqli_stmt_bind_result function to fail with a segmentation fault.
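For anyone who hits the same symptom, here is a minimal sketch of the kind of per-call error checking that surfaces the real error instead of a bare segfault; mysqli_stmt_error() is the function mentioned above, while the host, credentials, table, and query below are placeholders:

```php
<?php
// Sketch: check the mysqli error state after every step instead of relying
// on error_reporting(E_ALL); a connection dropped by wait_timeout does not
// raise a PHP error on its own. Host, credentials, and query are placeholders.
$con = mysqli_connect('localhost', 'user', 'pass', 'db');
if ($con === false) {
    die('Connect failed: ' . mysqli_connect_error());
}

$stmt = mysqli_prepare($con, 'SELECT id FROM items WHERE name = ?');
if ($stmt === false) {
    die('Prepare failed: ' . mysqli_error($con));
}

$name = 'example';
mysqli_stmt_bind_param($stmt, 's', $name);

if (!mysqli_stmt_execute($stmt)) {
    // Once wait_timeout has expired, "MySQL server has gone away" (2006)
    // shows up here, before bind_result ever runs.
    die('Execute failed: ' . mysqli_stmt_error($stmt));
}

mysqli_stmt_bind_result($stmt, $id1);
```

For long-running CLI scripts, the other options are raising wait_timeout on the server, or calling mysqli_ping() to test a connection that may have idled past the limit before reusing it.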

Related

How to set TimeOut for execution the query from PHP Sqlsrv

How do I set the TimeOut for EXEC the Query or Stored Procedure in SqlSrv PHP
I am using PHP to call a SQL Server stored procedure, for example:
"EXEC SP_Name"
Sometimes it takes too long, so the PHP page shows a 500 Internal Server Error.
If possible, I would like to set a time limit so that the stored procedure is stopped and I can show an error description instead.
How do I fix this issue?
Thanks in advance.
In your php.ini file, set:
max_execution_time = 60 ; or whatever time you wish
Refer to this link for more details.
Jayasurya Satheesh's answer refers to the execution time of the whole PHP script, not to the execution time of a DB query. PHP actually lacks an easy way to control the execution time of a database query.
It is more on the DB/SQL settings side. If you can define the timeout on the SQL Server side, it will return a "timeout" error to PHP.
Anyway, you can call set_time_limit(25) (where 25 is the number of seconds you want) in your PHP code before your SQL EXEC; there is no need to touch php.ini if you only want to set a limit for one specific PHP file. So, if the query takes more than 25 seconds, PHP will stop execution.
Remember that set_time_limit() does not affect other PHP scripts; it applies only to the current file where it is placed.
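That said, the sqlsrv driver does expose a per-query timeout: the options array of sqlsrv_query() (and sqlsrv_prepare()) accepts a QueryTimeout key, in seconds. A minimal sketch, assuming a reachable SQL Server; the server name, credentials, and procedure name below are placeholders:

```php
<?php
// Sketch: per-query timeout with the sqlsrv driver.
// Server, credentials, and the procedure name are placeholders.
$conn = sqlsrv_connect('serverName', array(
    'Database' => 'myDb',
    'UID'      => 'user',
    'PWD'      => 'pass',
));
if ($conn === false) {
    die(print_r(sqlsrv_errors(), true));
}

// Abort the statement if it runs longer than 25 seconds.
$stmt = sqlsrv_query($conn, 'EXEC SP_Name', array(), array('QueryTimeout' => 25));
if ($stmt === false) {
    // A timeout surfaces here; sqlsrv_errors() describes the failure,
    // so the page can show a friendly message instead of a 500.
    die(print_r(sqlsrv_errors(), true));
}
```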

PHP terminates script unexpectedly

I'm working in PHP, creating a system with a lot of PHP-driven elements, and I have noticed that some of my pages stop displaying text produced using the echo command.
I have made a small example of this. Of course, my program is not supposed to just print all the numbers from 1 to 10000, but this example demonstrates how the script just terminates without any warning.
Example code:
<?php
for ($i = 1; $i <= 10000; $i++) {
    echo $i, '<br>';
}
?>
Output:
1
2
More numbers...
8975
8976
8977
8
What is causing this? Is it a buffer issue, and how do I resolve it?
The fact that your code ran to completion on the CLI suggests to me that your script is exceeding the max_execution_time runtime configuration.
Note in the linked documentation that the value of this setting on the CLI is 0, which means it does not time out when run in that environment.
The default setting in the browser is 30 seconds.
You can show your current setting in the browser with:
echo ini_get('max_execution_time');
And you should be able to increase it with:
ini_set('max_execution_time', 0); // turns off timeout
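A small standalone sketch of inspecting and changing the setting at runtime (runnable from the CLI, where the starting value is already 0):

```php
<?php
// Inspect and raise max_execution_time at runtime.
// Under the CLI SAPI the value is always 0 (no limit); under a web SAPI
// the default is typically 30 seconds.
$before = ini_get('max_execution_time');
ini_set('max_execution_time', 120); // allow up to two minutes
$after = ini_get('max_execution_time');

echo "before=$before after=$after\n"; // on the CLI this prints: before=0 after=120
```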
If the script you have shown us behaves as you describe, then there's something very wrong going on. If this is a Unix or Linux based system and it's repeatedly exhibiting this behaviour, then the kernel is terminating the script; unless it has been configured not to do so, the kernel will be forcing a core dump of the process.
Either go build a new system to run your code on or Google how to capture and diagnose a core dump on your operating system.
update
If xdebug is reporting that the process is still running, then it probably hasn't dumped its core, but "not producing output" != "not running". What state is the process in? What happens when you redirect the output? What is the end-to-end output channel when it misbehaves?
The problem did not lie directly with my PHP installation or the application itself, but somewhere in my IDE, PhpStorm. When running the code with the same PHP interpreter outside of the IDE's wrappers, it all works fine. The procedures described by the many users here helped with that. Thank you.

set_time_limit not working on heroku

I am using PHP with heroku. I keep on getting a request timeout error due to some database insertions and queries.
I added this line to all my php files in order to avoid this error:
set_time_limit(0);
However, I am still getting this error. Does heroku ignore this command?
I did a simple check to see if the time limit is being changed:
echo 'TIME : '.ini_get('max_execution_time');
set_time_limit(0);
echo 'TIME : '.ini_get('max_execution_time');
It is being changed from 30 (default value) to 0. Despite the change, I am still getting the error.
Also, I would like to add that the PHP file is being called via Ajax.
Furthermore, as far as I know, PHP is not running in safe mode, so there is no reason why the command should be ignored.
Heroku suggests using a background job, and as far as I can tell, it forces you to if the task takes more than 30 seconds. Has anybody managed without using a background job?
Update: Tried using:
ini_set('max_execution_time', 0);
It still does not work.
If you have to go over the 30s request timeout on Heroku, you'll need to use a background job - there is no way around that (Heroku will just kill the request if it takes longer than 30 seconds). Heroku has some documentation on this.
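The shape of that web/worker split can be sketched as follows. SQLite stands in for the shared database so the example is self-contained, and the jobs schema is purely illustrative; in a real Heroku app the two halves run in separate dynos against the same Postgres database:

```php
<?php
// Sketch of the web/worker split Heroku expects: the web request only
// enqueues the slow work and returns within the 30 s window; a separate
// worker process performs it. The jobs table here is illustrative.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE jobs (id INTEGER PRIMARY KEY, payload TEXT, done INTEGER DEFAULT 0)');

// --- web request side: enqueue and respond to the Ajax call immediately ---
$stmt = $db->prepare('INSERT INTO jobs (payload) VALUES (?)');
$stmt->execute([json_encode(['task' => 'import', 'rows' => 100000])]);
echo json_encode(['status' => 'queued']) . "\n";

// --- worker side: drain pending jobs with no request timeout ---
foreach ($db->query('SELECT id, payload FROM jobs WHERE done = 0') as $job) {
    $work = json_decode($job['payload'], true);
    // ... perform the slow insertions/queries here ...
    $db->prepare('UPDATE jobs SET done = 1 WHERE id = ?')->execute([$job['id']]);
}

echo $db->query('SELECT COUNT(*) FROM jobs WHERE done = 0')->fetchColumn() . "\n"; // prints 0
```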

AjaXplorer [written in PHP] is too slow on IIS

I've installed AjaXplorer (a very nice web file explorer), written in PHP, on my IIS (Windows Server 2008 SP2 x64). It works too slowly for me.
What can be the cause? Are there some settings in php.ini? Or, maybe, something is wrong with IIS?
I use 32-bit PHP, php-cgi.exe as interpreter.
Regards,
First off, CGI will always be slow: it needs to boot the entire PHP runtime for each request. Try using FastCGI (whether you're using IIS 7 or IIS 6)...
After that, try to see why it's slow. Is it because the PHP script takes a long time to execute (meaning it's a code issue), or is it because of the server configuration? To test, add this at the start of the entry point of the PHP program (index.php):
define('START_TIME_CUSTOM', microtime(true));
function onEndTimeCompute() {
    $timeTaken = microtime(true) - START_TIME_CUSTOM;
    echo "Completed In: ".number_format($timeTaken, 4)." Seconds\n";
}
register_shutdown_function('onEndTimeCompute');
That writes Completed In: n Seconds at the end of the generated output (even if die() is called). It may cause some issues if Ajax calls are expected to return JSON, so don't leave it in as a rule; use it only while trying to figure out what's going on.
So, if the total request takes 1 second, yet you see Completed In: 0.004 Seconds, you know that the PHP code itself is not the issue (it's either in the setup of the interpreter by CGI, or somewhere else in IIS)...
That should at least show you where the problem is...

set_time_limit() timing out

I have an upload form that uploads mp3s to my site. I have some intermittent issues with some users which I suspect to be slow upload connections...
But anyway, the first line of code is set_time_limit(0);, which did fix it for SOME users whose connections were taking a while to upload, but some are still getting timed out and I have no idea why.
It says the script has exceeded the execution limit of 60 seconds. The script has no loops, so it's not as if it were some kind of infinite loop.
The weird thing is that no matter what code is on the first line, it will always say "error on line one, two, etc.", even if that line is set_time_limit(0);. I tried erasing it, and the very first line of code always seems to be the error; it doesn't even give me a hint of why it can't execute the PHP page.
This is an issue only few users are experiencing and no one else seems to be affected. Could anyone throw some ideas as to why this could be happening?
set_time_limit() will only affect the actual execution of the PHP code on the page. You want to set the PHP directive max_input_time, which controls how long the script will accept input (like files). The catch is that you need to set this in php.ini: if the default max_input_time is exceeded, the request never reaches the script that is attempting to change it with ini_set().
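A minimal php.ini fragment for that; the values are illustrative, and note that max_input_time has to be set in php.ini or the server config, since by the time the script runs it is too late:

```ini
; Allow slow uploads: time spent receiving the request body counts against
; max_input_time, not max_execution_time.
max_input_time = 300        ; seconds to accept input (e.g. file uploads)
max_execution_time = 60     ; seconds of script execution after input is read
upload_max_filesize = 20M   ; illustrative; keep post_max_size a bit larger
post_max_size = 25M
```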
Sure, a couple of things noted in the PHP Manual.
Make sure PHP is not running in safe mode. set_time_limit() has no effect when PHP is running in safe_mode.
Second, and this is where I assume your problem lies.....
Note: The set_time_limit() function and the configuration directive max_execution_time only affect the execution time of the script itself. Any time spent on activity that happens outside the execution of the script such as system calls using system(), stream operations, database queries, etc. is not included when determining the maximum time that the script has been running. This is not true on Windows where the measured time is real.
So your stream may be the culprit.
Can you post a little of your upload script? Are you calling a separate file to handle the upload using headers?
Try ini_set('max_execution_time', 0); instead.
