I am trying to include two cURL command blocks. The second block depends on a value returned by the first block, and that value will be saved in a database. I know there could be network issues when making cURL calls between servers. What I want to know is how to write the PHP file that executes both blocks, so that once the value is returned by the first block I can start the second block, keeping in mind that the network may fail. I know how to write the cURL command blocks themselves; I just need the part that guards against network issues. Here is what I was thinking (inside the PHP file):
$default = "default value";
$a = $default;
// first block executes, initializing $a
$a = "value returned by first block. Already saved to database";
// second block shouldn't execute while $a still holds the default value
while ($a != $default) {
    // second block executes
    // somewhere near the end of the second block:
    $result = curl_exec($ch); // where $ch is the handle for my cURL call
    // exiting the while loop:
    if (isset($result)) exit(0); // don't know if this is the proper way to exit
                                 // a while loop inside a PHP script
}
Please let me know if this seems correct or how I can improve it. One more thing: since I am fairly new to PHP, how do you comment out lines in PHP? Do you use // as in Java?
Thank you.
cURL is not asynchronous, so you don't have to worry about making your code wait for it to finish. That is, PHP will not execute the lines after curl_exec() until the cURL request has completed. Something like this should work for you:
$result = curl_exec($ch);
if ($result !== false) {
    // do stuff on a successful cURL call
}
else {
    echo curl_error($ch); // print the cURL error
}
Also, the way to exit a while loop is break;. If you only want to skip the current iteration, use continue;.
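Putting the two together for the original question: since curl_exec() blocks until it returns, guarding against network problems boils down to retrying a failed first request before starting the second block. Here is a minimal sketch of that idea; fetchWithRetry(), $firstUrl, $secondUrl and saveToDatabase() are illustrative names, not anything from your code:

function fetchWithRetry($url, $maxAttempts = 3) {
    for ($attempt = 1; $attempt <= $maxAttempts; $attempt++) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // give up on a hung connection
        $result = curl_exec($ch);
        curl_close($ch);
        if ($result !== false) {
            return $result; // success
        }
        sleep(2); // brief pause before the next attempt
    }
    return false; // every attempt failed
}

$a = fetchWithRetry($firstUrl);
if ($a !== false) {
    saveToDatabase($a);              // placeholder for your own persistence code
    $b = fetchWithRetry($secondUrl); // the second block only runs after the first succeeded
}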
I am a trainee learning PHP, and as a small project I wrote some code.
I am trying to make a (test) website about cars, and I want to show that each car has several options. I have a MySQL database.
I wrote this (it is just a part of the code):
foreach ($key['options'] as $options) {
    $contents = str_replace("[OPTION]", $options['text'], $contents);
}
$result = '';
foreach ($key['motor'] as $motor) {
    $nm = $motor['name'];
    $cnt = $motor['count'];
    $result .= $cnt . " " . $nm . " ";
}
return $result;
$paginas .= $contents;
echo $paginas;
So much for the code. The thing is, after the return statement the script stops (which is normal).
But I don't want the script to stop; the only thing I want is for the final echo to output all my $result options PLUS $paginas.
It is maybe a bit hard to explain, but I hope you understand.
Have you read the documentation of the return statement?
return returns program control to the calling module. Execution resumes at the expression following the called module's invocation.
If called from within a function, the return statement immediately ends execution of the current function, and returns its argument as the value of the function call. return also ends the execution of an eval() statement or script file.
If called from the global scope, then execution of the current script file is ended. If the current script file was included or required, then control is passed back to the calling file.
No matter where it is used, it transfers control to a different part of the code. The statement(s) following the return statement in the current context are not executed.
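To make the quoted behaviour concrete, here is a minimal sketch (the function and its contents are made up for illustration):

function buildResult() {
    return "engine: V6";
    echo "never printed"; // unreachable: return already ended the function
}

$result = buildResult(); // capture the value instead of ending the script early
echo $result;            // the script continues normally after this line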
I edited the code to echo instead of return. That works.
I made blocks on the website where the information about each car is shown, one block per car.
The thing with the $result echo is that I now do not get a list of all the different options in one block; instead I get a new block for every option. So for example:
Block: Steering wheel. Block: Headlights. Block: Engine type
My goal is to get all these things into one block.
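A minimal sketch of one way to do that, assuming $key['options'] is structured as in the snippet above: collect the option texts first and do a single str_replace per car, so every option lands in the same block.

$optionTexts = array();
foreach ($key['options'] as $options) {
    $optionTexts[] = $options['text']; // gather every option for this car
}
// one replacement per car, so all options end up in one block
$contents = str_replace("[OPTION]", implode(", ", $optionTexts), $contents);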
If some condition is met, how can I make the function:
Stop executing the rest of the function
Wait X time
Restart the function
Would it be something like:
function someFunc() {
    if ($x == 0) {
        sleep(60);
        someFunc();
        return;
    }
    // ...other code, only to be run if the condition above is false...
}
someFunc();
// ...other code, only to be run if the above function finishes running completely...
In case it's relevant (and there is some library to handle API limits or something), I am doing this for an API connection. First I receive a webhook hit via
file_get_contents('php://input')
which contains a URL. Then I hit that URL with
file_get_contents($url)
and, after parsing $http_response_header into a $headers array, check its headers as in if ($header['api_limit'] == 0) ... (in the example above this is x). If "x" is 0, I want the function to wait a minute until the limit cycle resets, then run the second file_get_contents($url) and the parsing that follows it again.
The main reason I wanted to handle it this way is to not have to record anything. The webhook received via file_get_contents('php://input') only happens once. If the API rate limit is hit and my attempt to use the URL from the webhook fails, then the URL is lost. So I was hoping the function would just wait X time until the rate limit resets before trying the webhook-received URL with file_get_contents($url) again. Is this bad practice somehow?
With rate-limited resources you usually want to cache a copy of the data for blocks of X minutes so that the limit is never actually exceeded. For example, with a maximum of 10 requests per hour you would cache the response for at least 6 minutes before attempting to fetch a new one.
It is not a good idea to stall the entire PHP interpreter until the rate limit is lifted.
As for approaching "repeat an attempt to do something until it works" in general, this is not something PHP handles very well: you usually want the PHP request/response cycle to be as fast as possible so it can move on to the next request. Your PHP application should give an immediate yes/no response to an external utility that triggers the task at a given interval.
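As an illustration of the caching idea, here is a minimal file-based sketch; the cache location and the 360-second window (6 minutes) are arbitrary choices, not requirements:

function cachedFetch($url, $ttl = 360) {
    $cacheFile = sys_get_temp_dir() . '/api_cache_' . md5($url);
    // serve the cached copy while it is younger than $ttl seconds
    if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return file_get_contents($cacheFile);
    }
    $response = file_get_contents($url);
    if ($response !== false) {
        file_put_contents($cacheFile, $response); // refresh the cache on success
    }
    return $response;
}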
I solved it like this:
// This will be the testing variable; in the actual script
// we'd check the response code of file_get_contents
$count = 0;

function funcTwo( &$count ) {
    // Here I'd run file_get_contents and parse the headers
    $count++;
    echo "funcTwo() running $count... \n";
    // Here I'll test for response code 429; if true, restart
    if ($count !== 5) {
        echo "Count only = $count so we're gonna take a nap... \n";
        sleep(1);
        echo "Waking from sleep $count and rerunning myself... \n";
        funcTwo($count);
        return;
    }
    echo "Count finally = $count, exiting funcTwo... \n";
}

// This function does the main work that relies on a successful response from funcTwo()
function funcOne( $count ) {
    echo "funcOne() running! \n";
    // The function will be delayed here until a successful response is returned
    funcTwo($count);
    echo "Count finally = $count, so now we can finally do the work that \n";
    echo "depends on a successful response from funcTwo() \n";
    // Do main work
    echo "Work done, exiting funcOne and script... \n";
}

funcOne($count);
I read that file_get_contents is synchronous, but after trying the code below I don't think so:
$url = "http://foo.com";
$a = array("file11.php", "file2.php", "file3.php");

foreach ($a as $file)
{
    $final = $url . "/" . $file;
    print "Calling $final ...";
    $res = file_get_contents($final);
    if ($res)
        print "OK";
    else
        print "ERR!";
    print "<br>";
}
Each file executes some complex tasks, so I know the minimal execution time of each script, but this code runs very fast and does not seem to wait for each request! How can I wait for each file request?
Thanks :)
The above code is definitely synchronous, so if it exits after a few seconds when it should take a lot longer, you probably have a problem in the code itself.
Try wrapping the code in a try/catch and printing the error to see what it says:
try { /* code here */ } catch (Exception $e) { echo $e->getMessage(); }
Also, the default max_execution_time in php.ini is usually 30 seconds per script; after that it exits with a fatal timeout error. Check the setting in your php.ini and adjust it to your needs.
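For example, you can inspect the limit from the script itself, or raise it for one long-running script (set_time_limit(0) removes the limit entirely, which is rarely wise on a web server):

echo ini_get('max_execution_time'); // the current limit in seconds, often 30
set_time_limit(120);                // allow this script up to 120 more seconds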
Edit:
Gathering your comments, I now assume you are trying to execute the PHP files you are referring to, which makes your question very confusing and the tags just wrong.
The code in your example only reads the contents of the file, so nothing is executed, which explains why it returns so fast while you expect it to take a while.
If you want to execute the referred PHP files, approach it like this instead of reading the contents:
include_once($final);
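If you want to verify that each request really does block, timing every iteration makes it obvious; a minimal sketch reusing $a and $url from the question's loop:

foreach ($a as $file) {
    $start = microtime(true);
    $res = file_get_contents($url . "/" . $file);
    printf("%s took %.2f seconds<br>", $file, microtime(true) - $start);
}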
My PHP script uses PHP Simple HTML DOM to parse HTML and collect all the links and images I want, and it can run for a while depending on the number of images to download.
I thought it would be a good idea to allow cancelling in this case. Currently I call my PHP using jQuery Ajax; the closest thing I could find is PHP's register_shutdown_function, but I'm not sure it can work for my case. Any ideas?
So once PHP is launched, it can't be interrupted? Like firing Ajax again to call an exit in the same PHP file?
This is only worthwhile in case you are processing really massive data loads through AJAX. For other cases, just handle it in JS and don't display the result if cancelled.
But as I said, if you are processing huge loads of data, then you can add an interrupt condition at every nth step of the running script and fulfil that condition using another script. For example you can use a file to store the interrupt flag, or a MySQL MEMORY table.
Example.
1. process.php (the AJAX script processing loads of data)

// clean up a previous potential interrupt flag
$fileHandler = fopen('interrupt_condition.txt', 'w+');
fwrite($fileHandler, '0');
fclose($fileHandler);

function interrupt_check() {
    $interruptfile = file('interrupt_condition.txt');
    // read the first line, trim it and parse the value - if it is 1, interrupt the script
    if (trim($interruptfile[0]) == "1") {
        echo json_encode(array("interrupted" => 1));
        die();
    }
}
$i = 0;
foreach ($huge_load_of_data as $object) {
    $i++;
    if ($i % 10 == 0) { // check for the interrupt condition every 10th record
        interrupt_check();
    }
    // your processing code
}
interrupt_check(); // check one last time (in case something changed while processing the last 10 entries)
2. interrupt_process.php (the AJAX script that propagates the cancel event to the file)
$fileHandler = fopen('interrupt_condition.txt', 'w+');
fwrite($fileHandler, '1');
fclose($fileHandler);
This will definitely affect the performance of your script, but it gives you a backdoor to stop execution. This is a very simple example - you would need to make it more complex to work for multiple simultaneous users, etc.
You can also use a MySQL MEMORY table, Memcached (a non-persistent caching server), or whatever non-persistent storage you can find.
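For illustration, the same flag could live in a MySQL MEMORY table instead of a file; the table, column, and job-id names here are made up:

// once, in MySQL:
// CREATE TABLE interrupt_flags (job_id VARCHAR(64) PRIMARY KEY,
//     cancel TINYINT NOT NULL DEFAULT 0) ENGINE=MEMORY;
function interrupt_check_db(PDO $pdo, $jobId) {
    $stmt = $pdo->prepare("SELECT cancel FROM interrupt_flags WHERE job_id = ?");
    $stmt->execute(array($jobId));
    if ($stmt->fetchColumn() == 1) {
        echo json_encode(array("interrupted" => 1));
        die();
    }
}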
I am having problems with cURL not being able to connect to a server that returns an XML feed, and I am not sure if my code is stacking up and causing the problem. Is it possible that the final function called in this foreach loop is still running when the next loop iteration comes round?
Is it possible to make sure all functions in the loop complete before the next iteration begins, or does foreach do this by default anyway? I tried having process_xml() return true and testing for it in the loop: if ($this->process_xml($xml_array)) continue;
but it didn't seem to have an effect, and it seems like a bad idea anyway.
foreach ($arrayOfUrls as $url) {
    // retrieve the xml from the url as a string
    if ($url_xml_string = $this->getFeedStringUsing_cURL($url)) {
        $xml_object = simplexml_load_string($url_xml_string);
        $xml_array = $this->feedStringToArray($xml_object);
        // process the xml
        $this->process_xml($xml_array);
    }
}
No, this is not possible. Each statement is executed and finished before the next statement is run.
and am not sure if my code is stacking up
Not sure? If it's important to you, why don't you find out? Without knowing what OS you are running on, it's rather hard to advise how you'd go about that - but netstat might be a good starting point.
Is it possible the final function called in this foreach loop is still running
It's highly improbable - PHP scripts run in a single thread of execution unless you tell them otherwise - but the curl extension lets you define callbacks into your PHP code that run before the operation completes, and the curl_multi_ family of functions also allows you to run PHP code while requests are in progress.
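For completeness, here is a minimal curl_multi sketch that fetches several URLs in parallel; it assumes the same $arrayOfUrls as the question and keeps the wait loop deliberately simple:

$mh = curl_multi_init();
$handles = array();
foreach ($arrayOfUrls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}
do {
    curl_multi_exec($mh, $running); // drive all transfers forward
    curl_multi_select($mh);         // wait for activity instead of busy-looping
} while ($running > 0);
foreach ($handles as $ch) {
    $xml_string = curl_multi_getcontent($ch); // body of each completed request
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);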