Inconsistent behavior in PHP code - php

I coded a function to help me handle transactions with files in CodeIgniter.
Today I was trying this code:
function ($db_trans_func, $context) {
    if (is_callable($db_trans_func)) {
        $context = $db_trans_func($context);
        FirePHP_::info_(time(), "After Db trans");
    }
}
That is just a snippet from my helper. The problem is: when this code runs and the execution of $db_trans_func takes a long time, PHP seems to move on to the next call, FirePHP_::info_(time(), "After Db trans");, before the preceding line has finished.
That seems abnormal to me, because normally the lines should run one after the other.
Can anyone help me solve this problem? How can I tell PHP to run
FirePHP_::info_(time(), "After Db trans");
only after
$context = $db_trans_func($context);
has finished its execution?

I'm not entirely clear on your setup, but my assumption is:
$db_trans_func is running some function against the DB (such as issuing a transaction begin);
you are comparing the PHP call FirePHP_::info_(time(), "After Db trans"); against the time recorded in the DB, or similar.
In other words, you have a function that DOES fire first in PHP, then a second one. They ARE running consecutively; BUT the DB result takes longer, of course, and so the DB effect is seen afterwards. In other words, these are different threads running asynchronously.
Does that make sense to you, and is it possible?
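One quick way to confirm the calls really do run sequentially is to timestamp them with sub-second precision. A minimal sketch, reusing $db_trans_func, $context, and the FirePHP_ logger from the question:

$before = microtime(true);
$context = $db_trans_func($context); // blocks until the callable returns
$after = microtime(true);

// This line cannot start before the callable has returned; the logged
// duration shows how long the DB work actually blocked PHP.
FirePHP_::info_($after - $before, "Db trans duration (seconds)");

If the two timestamps bracket the full duration, the statements are sequential, and any lag you observe is on the DB side.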

Related

2006 MySQL server has gone away while saving object

I'm getting this error "General error: 2006 MySQL server has gone away" when saving an object.
I'm not going to paste the real code since it's way too complicated, and I can explain with this example, but first a bit of context:
I'm executing a function via the command line using Phalcon tasks. This task creates an object from a model class, and that object calls a casperjs script that performs some actions on a web page. When it finishes, it saves some data; this is where I sometimes get "MySQL server has gone away", but only when casperjs takes a bit longer.
Task.php
function doSomeAction() {
    $object = Class::findFirstByName("test");
    $object->performActionOnWebPage();
}
In Class.php
function performActionOnWebPage() {
    // exec() returns the last line of the command's output as a string
    $result = exec("timeout 30s casperjs somescript.js");
    if ($result) {
        $anotherObject = new AnotherClass();
        $anotherObject->value = $result;
        $anotherObject->save();
    }
}
It seems like the $anotherObject->save(); method is affected by the time exec("timeout 30s casperjs somescript.js"); takes to get an answer, when it shouldn't be.
It's not a matter of the data being saved, since it both fails and saves successfully with the same input; the only difference I see is the time casperjs takes to return a value.
It seems as if, for some reason, Phalcon keeps the MySQL connection open during the whole execution of the "Class.php" function, provoking the timeout when casperjs takes too long. Does this make any sense? Could you help me fix it or find a workaround?
The problem seems to be that either you are trying to fetch more data in a single packet than your MySQL config allows, or your wait_timeout value is not set appropriately for what your code requires.
Check your wait_timeout and max_allowed_packet values; you can check them with the commands below:
SHOW GLOBAL VARIABLES LIKE 'wait_timeout';
SHOW GLOBAL VARIABLES LIKE 'max_allowed_packet';
Then increase these values as required in your my.cnf (Linux) or my.ini (Windows) config file and restart the MySQL service.
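If raising the server timeouts is not enough (or not an option), another common workaround is to verify the connection is still alive after the long-running exec() call before saving. A minimal sketch of that pattern using plain mysqli rather than Phalcon's adapter; the host, credentials, and table name are placeholders:

<?php
// Connect, run the slow external command, then verify the connection
// survived before writing.
$db = new mysqli('localhost', 'user', 'pass', 'mydb');

$result = exec('timeout 30s casperjs somescript.js');

// ping() returns false once the server has closed the idle connection
// (error 2006); open a fresh connection in that case.
if (!@$db->ping()) {
    $db = new mysqli('localhost', 'user', 'pass', 'mydb');
}

$stmt = $db->prepare('INSERT INTO another_table (value) VALUES (?)');
$stmt->bind_param('s', $result);
$stmt->execute();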

Matlab urlread error within timer function

I am using a timer function in MATLAB to continuously execute a certain script. Within this script, I am using urlread to retrieve data from web services, which works like a charm.
I am now trying to use urlread to execute a simple HTTP request within this script to insert data into a MySQL database. Thus, I simply specify the URL string and append the value to be passed to the PHP script.
Code within the script being executed in the timer function:
db_url = 'http://someurl/update.php?value=';
db_url = strcat(db_url,num2str(value));
urlread(db_url);
clear db_url
My problem is the following: when I run the timer, it works fine for one execution, but then stops, displaying the following error:
"Either this URL could not be parsed or the protocol is not supported."
What is going wrong? When I check my MySQL database, I see that one new line has been added, which means it generally works; it just won't execute multiple times within the timer.
Any idea what is going wrong? Many thanks in advance!
I figured out what the problem was. The value variable is an array that grows in size with each iteration, so num2str(value) produced a space-separated string of the whole array, which made the URL invalid. What I needed to do was specify value(end), like so:
db_url = 'http://someurl/update.php?value=';
db_url = strcat(db_url,num2str(value(end)));
urlread(db_url);
clear db_url
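For reference, a minimal sketch of what an update.php endpoint like the one called above might look like; this is hypothetical, since the asker's actual script is not shown, and the DSN, credentials, and table name are placeholders:

<?php
// Read the value appended by MATLAB (update.php?value=42) and insert it.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

$value = isset($_GET['value']) ? $_GET['value'] : null;
if ($value !== null && is_numeric($value)) {
    $stmt = $pdo->prepare('INSERT INTO readings (value) VALUES (?)');
    $stmt->execute([$value]);
}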

PHP connection_aborted() and/or register_shutdown_function work intermittently

I have a PHP script that outputs a file to the user (as a download), and which is also used to record what the user is downloading.
The basic structure is this:
set_time_limit(0);
ignore_user_abort(true);
register_shutdown_function('shutdown_fn'); // as a fail-safe (I think)

// some other code here
// do some mysql queries

while (!feof($fh) && !connection_aborted()) {
    echo fread(....);
    ob_flush(); // push the buffered chunk out to the client
    flush();
    sleep(1);
}
fclose($fh);

// do some more mysql queries here and set a boolean to track
// whether they were done successfully

function shutdown_fn() {
    // check the boolean to see if the queries failed; if so, do them here
}
The above code seems to work just fine 99% of the time. However, there are some instances where the second set of queries doesn't execute at all (the other 1%), and I have no idea why. The files being sent to the user range from very small to very large, and in both cases they work just fine, so I can't see how a large (or small) file would be breaking the code.
Any thoughts? I hope I have explained myself well enough.
I would need to see some more code, such as the opening/reading of the file, to help you further. But if you really want to be sure, rather than depending on the shutdown_fn() function alone, why not also call it yourself at the end of the script? Set a flag in shutdown_fn() so that when the actual shutdown is triggered, your SQL queries are not run twice.
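A minimal sketch of that pattern, with a record_download() helper standing in for the asker's logging queries (the name is hypothetical):

// Run the logging queries from the normal path AND from the shutdown
// handler, guarded by a flag so they execute exactly once.
$queries_done = false;

function record_download() {
    global $queries_done;
    if ($queries_done) {
        return; // already logged, nothing to do
    }
    // ... run the MySQL logging queries here ...
    $queries_done = true;
}

register_shutdown_function('record_download');

// ... stream the file to the user ...

record_download(); // normal path; the shutdown handler becomes a no-op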

Stop PHP with ajax

I have a JavaScript function which calls a PHP script through AJAX.
The PHP script uses set_time_limit(0) for its purposes.
Is there any way to stop that script when I want, for example with an HTML button event?
I want to explain the situation better:
I have a PHP file which uses stream_copy_to_stream($src, $dest) to retrieve a stream on my local network. The function has to run until I decide: I can stop it at the end of the stream or whenever I want, so I can use one button to start and another to stop. The problem is the new instance created by the AJAX call: I can't act on the instance that is actually recording, because any new request is a different instance. I tried MireSVK's suggestion but it didn't work!
It depends on the function. If it is a while loop checking a certain condition each time, then you could add a condition that is modifiable from outside the script (e.g. make it check for a file, and create/delete that file as required); see the sketch below.
It looks like a bad idea, however. Why do you want to do it?
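A minimal sketch of the file-flag idea; the path is an arbitrary example, and a second request (the stop button) deletes the marker to break the loop:

set_time_limit(0);
$flag = '/tmp/keep_running.flag';
touch($flag); // create the marker before starting

while (file_exists($flag)) {
    // ... copy the next chunk of the stream here ...
    usleep(100000); // brief pause to avoid a busy loop
}

The stop endpoint then only needs to call unlink('/tmp/keep_running.flag');.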
var running = true;

function doSomething() {
    // do something ...
}

setInterval(function () { if (running) { doSomething(); } }, 2000); // runs doSomething every 2 seconds

On button click, simply set running = false;.
Your code looks like:

set_time_limit(0);
while (true) { // infinite loop
    doSomething(); // your code
}

Let's upgrade it:
set_time_limit(0);
session_start();
$_SESSION['do_a_loop'] = true;
session_write_close(); // release the session lock so stopit.php is not blocked

function should_i_stop_loop() {
    session_start(); // re-open the session to pick up changes made by stopit.php
    if ($_SESSION['do_a_loop'] == false) {
        // let's stop the loop
        exit();
    }
    session_write_close();
}

while (true) {
    doSomething();
    should_i_stop_loop(); // your new function
}
Create a new file, stopit.php:

session_start();
$_SESSION['do_a_loop'] = false;

All you have to do now is send a request to the stopit.php file (with AJAX or something similar).
Edit the code according to your needs; that is the point. This is one of many possible solutions.
Sadly this isn't possible (sort of).
Each time you make an AJAX call to a PHP script, the script spawns a new instance of itself. Thus anything you send to it will be sent to a new operation, not the operation you had previously started.
There are a number of workarounds:
Use readyState 3 in AJAX to create a non-closing connection to the PHP script; however, that isn't supported cross-browser and probably won't work in IE (not sure about IE 10).
Look into socket programming in PHP, which allows you to create a script with one instance that you can connect to multiple times.
Have PHP check a third party, i.e. have one script running in a loop checking a file or a database, then connect to another script to modify that file or database. The original script can be remotely controlled by what you write to the file/database.
Try another programming language (this is a silly option, but I'm a fan of Node). Node.js does this sort of thing very, very easily.

check cron job has run script properly - proper way to log errors in batch processing

I have set up a cron job to run a script daily. This script pulls a list of IDs from a database, loops through each to get more data from the database, and generates an XML file based on the data retrieved.
This seems to have run fine for the first few days; however, the list of IDs is getting bigger, and today I noticed that not all of the XML files have been generated. It seems to be random IDs that have not run. I have manually run the script to generate the XML for some of the missing IDs individually, and they ran without any issues.
I am not sure how to locate the problem, as the cron job is definitely running but not always generating all of the XML files. Any ideas on how I can pinpoint this problem and quickly find out which files have not been generated?
I thought perhaps I could add timestart and timeend fields to the database and enter these values at the start and end of each XML generator run; this way I could see what had run and what hadn't, but I wondered if there was a better way.
set_time_limit(0);

// connect to database
$db = new msSqlConnect('dbconnect');

$select = "SELECT id FROM ProductFeeds WHERE enabled = 'True' ";
$run = mssql_query($select);

while ($row = mssql_fetch_array($run)) {
    $arg = $row['id'];
    //echo $arg . '<br />';
    exec("php index.php \"$arg\"", $output);
    //print_r($output);
}
My suggestion would be to add some logging to the script. A simple
error_log("Passing ID: " . $arg . "\n", 3, "log.txt");
can give you some info on whether each ID is being passed. If you find that it is, you can introduce logging to index.php to further evaluate the problem.
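Expanding on that a little, a sketch of per-ID logging around the exec() call, so unprocessed IDs stand out; the log path and message format are arbitrary examples:

while ($row = mssql_fetch_array($run)) {
    $arg = $row['id'];
    error_log(date('c') . " start ID $arg\n", 3, "log.txt");
    exec("php index.php " . escapeshellarg($arg), $output, $status);
    error_log(date('c') . " end ID $arg (exit $status)\n", 3, "log.txt");
}

Any ID with a "start" entry but no matching "end" entry is where the run died.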
By the way, can you explain why you are using exec() to run a PHP script? Why not execute a function in the loop? That could well be the source of the problem.
Because with exec I think the process will run in the background and the loop will continue, so you could really choke your server that way; maybe that's worth trying out as well. I think this also depends on the way of outputting; the manual notes:
Note: If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
Maybe some other users can comment on this.
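To illustrate the note quoted above: exec() only returns immediately if the output is redirected and the command is backgrounded, for example (a sketch; the redirect target is an arbitrary choice):

exec("php index.php " . escapeshellarg($arg) . " > /dev/null 2>&1 &");

Without the redirect and trailing &, exec() blocks until index.php finishes, which is the sequential behavior the asker's loop actually relies on.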
It turned out that Apache was timing out, so it was nothing to do with using a function versus the exec() call.
