I have the following code, which executes a piece of JavaScript on the MongoDB side:
$mongoCode = new MongoCode('/* Some JS code */');
$db->execute($mongoCode, array(
'socketTimeoutMS' => 1000000,
));
As you can see, I have tried to set a timeout for the code's execution by setting the socketTimeoutMS value in the second parameter of the execute() function, but it does not work. The documentation on the PHP website indicates that the second parameter of the execute() command is passed to the code as arguments.
How can I set a timeout for MongoDB::execute()? Please note that I am using version 1.5 of the MongoDB driver for PHP, and MongoCursor::$timeout is deprecated and no longer works.
You can set socketTimeoutMS on the MongoClient:
$mongo = new MongoClient("mongodb://localhost:27017",
    array(
        "socketTimeoutMS" => 100000
    )
);
The args parameter of the execute() method is passed to the JavaScript code, not to the driver.
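For illustration, here is a minimal sketch of that behaviour (the JavaScript function is made up): the array elements become the arguments of the function you execute.
// The second parameter of execute() supplies arguments to the JavaScript
// function; it is not interpreted as driver options.
$code = new MongoCode('function (greeting, name) { return greeting + " " + name; }');
$result = $db->execute($code, array('Hello', 'world'));
// $result['retval'] is "Hello world"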
You can also set a timeout just when executing the command:
$result = $mongo->dbname->command(
    ['eval' => $code],
    ['socketTimeoutMS' => 1]
);
Alternatively, if you're not executing commands, you can set the timeout on the cursor:
$cursor = $collection->find([]);
$cursor->timeout(10000);
This will obviously not work on the execute command, because that command doesn't return a cursor.
You want the MongoDB::command implementation for this, which actually accepts the argument:
<?php
$mongo = new MongoClient('mongodb://192.168.2.3/test');
$db = $mongo->test;

$code = new MongoCode( 'sleep(100); return "hello";' );

try {
    $res = $db->command(
        array( 'eval' => $code ),
        array( 'socketTimeoutMS' => 1 )
    );
    var_dump( $res );
} catch (Exception $e) {
    echo 'Caught exception: ', $e->getMessage(), "\n";
}
?>
Note that even though the exception will be thrown for the timeout, this does not actually stop the code running on the server. That you would have to handle yourself.
Look into the killOp() and currentOp() methods, their usage and implementation, for a way to monitor and kill any processes left running after your timeout expires on this operation.
Really, try to look for other approaches rather than executing JavaScript on the server like this.
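If you do stay with server-side JavaScript, here is a hedged sketch of what cleanup could look like with the legacy driver, using the virtual collections $cmd.sys.inprog and $cmd.sys.killop that back the shell helpers db.currentOp() and db.killOp(). The field names are from older server versions and may differ on yours, so treat this as a starting point only:
// Hedged sketch: list running operations and kill the long-running ones.
$admin = $mongo->selectDB('admin');
$current = $admin->selectCollection('$cmd.sys.inprog')->findOne();
foreach ($current['inprog'] as $op) {
    if (isset($op['secs_running']) && $op['secs_running'] > 10) {
        // Equivalent of db.killOp(<opid>) in the shell
        $admin->selectCollection('$cmd.sys.killop')->findOne(array('op' => $op['opid']));
    }
}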
Ok, so I am a little stuck with this issue. I have a foreach loop (usually 50 results) that queries an API using Guzzle via Laravel's Http facade, and I am getting really inconsistent results.
I monitor the inserts in the database as they come in; sometimes the process seems slow, and other times it fails with the following error after x number of returned results.
cURL error 6: Could not resolve host: api.coingecko.com
The following is the actual code I'm using to fetch the results.
foreach ($json_result as $account) {
    var_dump($account['name']);
    $name = $account['name'];
    $coingecko_id = $account['id'];
    $identifier = strtoupper($account['symbol']);

    $response_2 = Http::get('https://api.coingecko.com/api/v3/coins/'.urlencode($coingecko_id).'?localization=false');

    if ($response_2->successful()) {
        $json_result_extra_details = $response_2->json();

        if (isset($json_result_extra_details['description']['en'])) {
            $description = $json_result_extra_details['description']['en'];
        }
        if (isset($json_result_extra_details['links']['twitter_screen_name'])) {
            $twitter_screen_name = $json_result_extra_details['links']['twitter_screen_name'];
        }
    } else {
        // Throw an exception if a client or server error occurred...
        $response_2->throw();
    }

    $crypto_account = CryptoAccount::updateOrCreate(
        [
            'identifier' => $identifier
        ],
        [
            'name' => $name,
            'identifier' => $identifier,
            'type' => "cryptocurrency",
            'coingecko_id' => $coingecko_id,
            'description' => $description,
        ]
    );

    //sleep(1);
}
Now, I know I am within the API rate limit of 100 calls a minute, so I don't think that is the issue. I am wondering if this is a server/API issue, which I don't really have any control over, or if it is related to my code and how Guzzle is implemented.
When I do single queries I don't seem to have a problem; the issue only seems to occur inside the foreach loop.
Any advice would be great. Thanks
EDIT
Ok, to update the question: I am now wondering if this is Guzzle/Laravel related. I changed the API to point to the Twitter API, and I am getting the same error after 80 synchronous requests.
I think it's better to use asynchronous requests directly with Guzzle:
$client = new \GuzzleHttp\Client();
$request = new \GuzzleHttp\Psr7\Request('GET', 'https://api.coingecko.com/api/v3/coins?localization=false');

$promises = [];
for ($i = 0; $i < 50; $i++) {
    $promises[] = $client->sendAsync($request)
        ->then(function ($response) {
            echo 'I completed! ' . $response->getBody();
        });
}

// Wait for all requests to settle at once instead of blocking inside the loop
\GuzzleHttp\Promise\Utils::settle($promises)->wait();
more information on Async requests: Doc
I had a similar problem to yours.
I was doing the HTTP requests in a loop, and the first 80 requests were okay.
But the 81st started throwing this "Could not resolve host" exception.
It was very strange to me because the domain could be resolved perfectly fine on my machine.
Thus I started digging into the code.
I ended up finding that Laravel's Http facade keeps generating a new client.
And I guess this eventually triggers the DNS resolver's rate limit?
So I have the following workaround:
// Not working: this way Laravel keeps getting a new HTTP client from Guzzle.
foreach ($rows as $row) {
    $response = Http::post(/* ... */);
}

// Workaround: create one Guzzle client and reuse it.
$client = new GuzzleHttp\Client();
foreach ($rows as $row) {
    $response = $client->post(/* ... */);
    // don't forget to use $response->getBody();
}
I believe this is because $client caches the DNS resolution result, which reduces calls to the DNS resolver and avoids triggering the rate limit.
I'm not sure whether that explanation is right, but it's working for me.
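If reusing one client is still not enough, a further (speculative) tweak is to lengthen cURL's per-handle DNS cache via Guzzle's curl request option. CURLOPT_DNS_CACHE_TIMEOUT is a standard cURL option, but whether it affects your resolver's rate limit is an assumption:
// Speculative sketch: keep resolved DNS entries around longer on the
// reused handle (cURL's default is 60 seconds).
$client = new GuzzleHttp\Client([
    'curl' => [
        CURLOPT_DNS_CACHE_TIMEOUT => 300, // seconds to cache resolved hosts
    ],
]);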
I am facing a weird problem using Smarty. I am generating an email's body from a template. Most times it works as expected, but from time to time the returned data is empty. However, I do not see any error in my logs, nor do I catch any exception. It is just as if the template were empty.
This is the piece of code I am using to get the email's body:
// $data is an array with template's data
// $tpl is the template's path
$s = new Smarty();
$s->assignArray( $data );

try {
    $body = $s->fetch( $tpl );
} catch ( \Exception $e ) {
    Debug::Log( $e->getMessage() );
}
// Sometimes $body is empty, but no exception is thrown.
I checked that the template has no errors; after all, it works in most cases.
I also saved the contents of $data when $body was empty, then ran the code manually to get the $body content, and it worked, so I do not think the problem is related to template variables.
Another test I did was to process the template up to 5 times, sleeping for a second between tries, but the result was always empty.
The template's cache path is writable.
I am using PHP 5.6.40, Smarty 3.1.21 and Apache2.
Can you give me a hand to debug this issue?
Update
I have been able to reproduce the problem. Smarty always returns an empty result whenever the fetch method is called after PHP detected that the client closed the connection. For example, take this code:
ignore_user_abort(1); // Continue running even if the connection is closed
set_time_limit(180);  // 3 minutes

$s = new Smarty();
$s->assignArray( $data );

// Keep writing data until PHP realises that the connection was closed
while (1) {
    if (connection_status() != CONNECTION_NORMAL || connection_aborted()) {
        break;
    }
    echo "123456789";
}

$body = $s->fetch( $tpl );
if ( '' == $body ) {
    throw new Exception("Result is empty");
}

die('Code never reaches this point');
If I call the script above and I close the connection immediately, the result of the fetch method is always empty.
However, if PHP did not detect that the connection was closed, even though it really was, the result of fetch is not empty:
ignore_user_abort(1); // Continue running even if the connection is closed
set_time_limit(180);  // 3 minutes

$s = new Smarty();
$s->assignArray( $data );

// Sleep to make sure the connection was closed.
// PHP does not realise the connection is closed until it tries to write something.
sleep(60);

$body = $s->fetch( $tpl );
if ( '' == $body ) {
    throw new Exception("Result is empty");
}

echo "Now the result is not empty";
This is the code I used to call the above scripts:
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'https://myhost/test.php');
curl_setopt($ch, CURLOPT_FRESH_CONNECT, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 1);
curl_exec($ch);
curl_close($ch);
echo "all done";
This seems to be related to this question: PHP ob_get_contents "sometimes" returns empty when it should not?
My script does a lot of things, so it takes quite a long time to finish. Some users close their browser before the script finishes, and that is when Smarty returns an empty result, as it uses ob_start a lot.
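Based on that finding, one workaround sketch (using the same $data and $tpl as above) is to render the template before sending any output, since PHP only detects the closed connection when it tries to write:
ignore_user_abort(true);

$s = new Smarty();
$s->assignArray( $data );
$body = $s->fetch( $tpl ); // render first, while the output buffers are intact

// ... the long-running work and any echo/flush come after the fetch ...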
Blanking the page without exceptions seems to be the way Smarty does everything.
I'm not familiar with Python; however, I suspect that only exceptions are thrown, not notices. You might try, at the end of your code, to check for thrown notices or other warnings, however they are called in Python.
It might still be a folder permission issue: have you also checked that the templates_c directory exists and has the right permissions? Or any {var.name} in the template missing the $ sigil?
It can be anything; Smarty never throws exceptions, it just blanks the page.
If that still does not help, create a basic, overly simplified template and try that for some time. If the blanking stops, the mistake is in your original template.
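For that last test, a quick isolation sketch using Smarty's string: resource skips template files (and their permissions) entirely; the template text here is made up:
// If even this trivial template blanks out, the problem is not in your
// original template.
$s = new Smarty();
$s->assign('name', 'world');
echo $s->fetch('string:Hello {$name}!');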
As far as I can tell, it turns out to be a bug in PHP 5.6. I made some tests using print_r with the return flag set to true, and the result after closing the connection was never empty with PHP 7.0 and PHP 8.0. However, when I used PHP 5.6 the result was empty.
Example:
<?php
error_reporting( E_ALL );
ini_set('display_errors', 1);
ignore_user_abort(true); // (curl disconnects after 1 second)
ini_set('max_execution_time','180'); // 3 minutes
ini_set('memory_limit','512M');      // 512 MB

function testPrint_r($length)
{
    $test1 = array('TEST' => 'SOMETHING');
    $test2 = print_r($test1, true);
    $test3 = "Array\n(\n    [TEST] => SOMETHING\n)\n";
    if (strcmp($test2, $test3) !== 0) {
        throw new Exception("Print_r check failed, output length so far: ".$length);
        // consult your error.log then, or use some other reporting means
    }
}

$message = "123456789\n";
$length = strlen($message);
$total_length = 0;

while (1) {
    echo $message;
    $total_length += $length;
    testPrint_r($total_length);
}

die('it should not get here');
Using PHP 5.6, if you call the script and close the connection, the exception is thrown because print_r returns an empty result. However, using PHP 7.0 or PHP 8.0 the script keeps running until it reaches the maximum execution time.
I'm trying to access the MongoDB profiler in PHP with the same query I would use in the mongo client:
$db = $mongo->selectDB('myapp_db');
$array = $db->execute('return db.system.profile.find();');
echo '<pre>' . print_r($array, true);
But I get this:
Array
(
    [retval] => Array
        (
            [value] => DBQuery: myapp_db.system.profile -> undefined
        )

    [ok] => 1
)
Profiling is enabled and works fine in the client.
Method MongoDB::setProfilingLevel — Sets this database's profiling level
<?php
$dbname = 'students';
$mongo = new MongoClient();
$db = $mongo->$dbname;

# 0 (off), 1 (queries > 100ms), and 2 (all queries)
$db->setProfilingLevel(2);

# …
# Some queries
# …

$response = $db->system->profile->find();
foreach ($response as $query) {
    print_r($query);
}
Also:
Method MongoDB::getProfilingLevel — Gets this database's profiling level
Method MongoCursor::explain — Return an explanation of the query, often useful for optimization and debugging
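For completeness, a small sketch of MongoCursor::explain() with the legacy driver (the collection and query are made up):
// Inspect the plan the server chose for a query.
$cursor = $db->grades->find(array('score' => array('$gt' => 90)));
print_r($cursor->explain());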
There's no need to execute a query in JavaScript, which blocks the server/database, when you can use PHP itself:
$mongo = new MongoClient();
// Alternatively, use selectCollection from $mongo->myapp_db
$collection = $mongo->selectCollection('myapp_db', 'system.profile');
foreach ($collection->find() as $document) {
    print_r($document);
}
This makes more efficient use of memory, since you can iterate through results instead of fetching the entire MongoDB::execute() response in a single array.
Additionally, your original code returns the cursor (a DBQuery object) from JavaScript. To ensure compatibility with other drivers, you should invoke cursor.toArray() before returning. This is discussed in Sammaye's answer to a similar question here.
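In other words, if you must keep the JavaScript route, the corrected call would look like this sketch:
// Returning cursor.toArray() instead of the raw cursor makes retval a
// plain array of documents.
$array = $db->execute('return db.system.profile.find().toArray();');
echo '<pre>' . print_r($array['retval'], true);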
I am trying to call a Magento API method through a PHP SoapClient object.
The problem is that the called method creates Magento products and can take relatively long (up to 2 minutes). I need to get the return values of this method, but after a while the SOAP call stops and returns null.
$session_id = _get_session_id();
$client = new SoapClient($api_url . '&SID=' . $session_id, array('trace' => 1));

try {
    $session = $client->login($api_user, $api_password);
    $result = $client->call($session, 'api_call.method', array($arg1, $arg2));
} catch (SoapFault $soapFault) {
    // ...
}
I really need to get the called method's return value, however long it takes.
Do you know why the call returns null after a while?
Is there a default timeout that can be configured?
Here is a solution, thanks to Jürgen's comment:
ini_set('default_socket_timeout', 120); // 2 minutes
This sets the call timeout to 2 minutes.
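To make the placement explicit, here is a minimal sketch using the names from the question; the ini_set call must run before the SoapClient starts waiting on the socket:
// Raise the socket read timeout before making the long-running call.
ini_set('default_socket_timeout', 120); // 2 minutes

$client = new SoapClient($api_url . '&SID=' . $session_id, array('trace' => 1));
$session = $client->login($api_user, $api_password);
$result  = $client->call($session, 'api_call.method', array($arg1, $arg2));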
I'm making a large request to the Brightcove servers to make a batch change of metadata in my videos. It seems like it only made it through 1000 iterations and then stopped. Can anyone help in adjusting this code to prevent a timeout from happening? It needs to make about 7000-8000 iterations.
<?php
include 'echove.php';

$e = new Echove(
    'xxxxx',
    'xxxxx'
);

// Read video IDs
# Define our parameters
$params = array(
    'fields' => 'id,referenceId'
);

# Make our API call
$videos = $e->findAll('video', $params);
//print_r($videos);

foreach ($videos as $video) {
    //print_r($video);
    $ref_id = $video->referenceId;
    $vid_id = $video->id;

    switch ($ref_id) {
        case "":
            $metaData = array(
                'id' => $vid_id,
                'referenceId' => $vid_id
            );

            # Update a video with the new metadata
            $e->update('video', $metaData);
            echo "$vid_id updated successfully!<br />";
            break;
        default:
            echo "$ref_id was not updated.<br />";
            break;
    }
}
?>
Thanks!
Try the set_time_limit() function. Calling set_time_limit(0) removes any time limit on the execution of the script.
Also use ignore_user_abort() to bypass a browser abort. The script will keep running even if you close the browser (use with caution).
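For example, placed at the top of the script from the question, before the foreach loop, the two calls would look like this sketch:
// Remove the execution time limit and survive a browser disconnect.
set_time_limit(0);
ignore_user_abort(true);

// ... Echove setup and the update loop from the question follow here ...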
Try sending a 'Status: 102 Processing' header every now and then to prevent the browser from timing out (your best bet is about 15 to 30 seconds in between). After the request has been processed, you may send the final response.
The browser shouldn't time out any more this way.