I'm sorry if this sounds confusing, but I will try to explain it as best I can.
In the controller I have a search function:
public function search(){
    /*
    I run my logic and get a few URLs from
    which I need to fetch further data.
    The URLs are saved in the $urls array:
    $urls[0] = "http://url1.com/search1";
    $urls[1] = "http://url2.com/search2";
    I then set this in the data variable and send it to the view
    so that it can be run via AJAX.
    I tried running file_get_contents but it executes
    in series, one URL after the other.
    If there are 10 URLs (5 secs per URL) the overall processing time
    increases drastically.
    */
    $data["urls"] = $urls;
    $resp = $this->load->view('ajaxer',$data,TRUE);
    /* based on $resp I need to run further business logic */
}
Now $resp is actually giving me only the HTML code. It is not executing the HTML, and hence the AJAX is not run.
Any thoughts on how to execute this would be really helpful.
Regards,
Amit
Your code is absolutely OK, but your JavaScript is not getting any response data (only headers), because you are not returning any output.
If you want to "execute your HTML", you need to change the line that loads the view to this:
$this->load->view('ajaxer',$data);
or this:
$resp = $this->load->view('ajaxer',$data,TRUE);
echo $resp;
You forgot to echo the output in the controller. Apart from this, you need a few minor modifications in your function.
public function search(){
    /*
    I run my logic and get a few URLs from
    which I need to fetch further data.
    The URLs are saved in the $urls array:
    $urls[0] = "http://url1.com/search1";
    $urls[1] = "http://url2.com/search2";
    I then set this in the data variable and send it to the view
    so that it can be run via AJAX.
    I tried running file_get_contents but it executes
    in series, one URL after the other.
    If there are 10 URLs (5 secs per URL) the overall processing time
    increases drastically.
    */
    // Check whether this request actually came in via AJAX. If not, echo a
    // message and stop; this prevents the function from being accessed
    // outside of an AJAX request.
    if (!$this->input->is_ajax_request()) {
        echo "Only AJAX requests are allowed.";
        die;
    }
    $data["urls"] = $urls;
    $resp = $this->load->view('ajaxer',$data,TRUE);
    // Standard way to set the response for output in JSON format.
    // The 'status' key helps the client check whether everything went correctly;
    // pass false when it did not, based on your feature's requirements.
    $this->output->set_output(json_encode(array('status'=>true,'response'=>$resp)));
    // Standard way to get the output set in the step above.
    $string = $this->output->get_output();
    echo $string;
    exit();
    /* based on $resp I need to run further business logic */
}
The updated code is above. Hope it answers your question.
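As a side note on the serial file_get_contents concern raised in the question: if the URLs can also be fetched in parallel on the PHP side (instead of, or in addition to, handing them to the view for AJAX), one option is curl_multi. This is only a minimal sketch, reusing the two example URLs from the question:
<?php
// Minimal curl_multi sketch: fetch several URLs in parallel.
// $urls is assumed to hold the same URLs as in the question.
$urls = array("http://url1.com/search1", "http://url2.com/search2");

$mh = curl_multi_init();
$handles = array();

foreach ($urls as $i => $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // don't wait forever on one slow URL
    curl_multi_add_handle($mh, $ch);
    $handles[$i] = $ch;
}

// Run all handles until every transfer has finished.
$running = null;
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh); // wait for activity instead of busy-looping
} while ($running > 0);

// Collect the responses; total time is roughly the slowest URL, not the sum of all of them.
$responses = array();
foreach ($handles as $i => $ch) {
    $responses[$i] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);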
I'm working with so-called webhooks. What basically happens is: there's a process happening in the background, and when that process finishes it sends a POST request to a URL that I have to specify, for example 'www.bla/process.php'.
The POST request that is sent will have a body of data. My question is: is it possible to read the data that is sent and just print it out, for example?
Yes.
It is possible to pass info from one page to another and print it out, for example.
There are many methods...
// These two lines do the same thing:
echo $_POST['DATA1'];
echo ($_POST['DATA1']);

// Only use the value if it is set and not empty,
// and echo inside the check so $DATA1 is never undefined:
if (isset($_POST['DATA1']) && !empty($_POST['DATA1'])) {
    $DATA1 = $_POST['DATA1'];
    echo $DATA1;
}
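Note that $_POST is only populated for form-encoded bodies. If the webhook sends its payload as raw JSON instead (check the sender's documentation), a minimal sketch for reading and printing it could look like this:
<?php
// Read the raw request body; $_POST stays empty for JSON payloads.
$raw = file_get_contents('php://input');

// Try to decode it as JSON; fall back to printing the raw body as-is.
$data = json_decode($raw, true);
if (is_array($data)) {
    print_r($data);   // decoded associative array
} else {
    echo $raw;        // not JSON, just show what was sent
}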
I've got a script which gets some values from an XML record.
Here's the code:
<?php
//Data
$xml_data = '<image_process_call><image_url>https://i.pinimg.com/originals/e4/41/54/e44154308e3466d987665c6d50887f06.jpg</image_url><methods_list><method><name>collage</name><params>template_name=Nun Face in Hole;</params></method></methods_list><result_format>jpg</result_format><result_size>800</result_size><template_watermark>false</template_watermark></image_process_call>';
//Settings
$app_id = '';
$key = '';
$sign_data = hash_hmac('SHA1', $xml_data, $key);
//Send request (the XML has to be URL-encoded to form a valid query string)
$request_url = 'http://opeapi.ws.pho.to/addtask?data='. urlencode($xml_data) .'&sign_data='. $sign_data .'&app_id='. $app_id;
$request_xml = simplexml_load_file($request_url);
$request_id = strval($request_xml->request_id);
// strval() always returns a string, so check for a non-empty value instead of isset()
if ($request_id !== '') {
    $result_url = 'http://opeapi.ws.pho.to/getresult?request_id='. $request_id;
    sleep(6);
    $result_xml = simplexml_load_file($result_url);
    $result_status = strval($result_xml->status);
    $result_img = strval($result_xml->result_url);
    if ($result_img !== '') {
        echo $result_img;
    } else {
        echo 'Result image not found';
    }
} else {
    echo 'Request ID not found';
}
?>
The problem depends on the time it takes to generate the second XML file. $result_xml takes a few seconds, so I have to use the sleep(6) function.
If I remove this, I need to refresh the page (at least three times) to get a link to the generated image from the second XML.
Do you have an idea how to do this more professionally? I can't be sure that every image will be generated within 6 seconds (sometimes it's shorter, sometimes longer).
Is there any method for fetching the result only after $result_img is available? Thanks in advance for your help!
I think it is worth describing how it looks in practice.
The script makes the $request_xml call and the site returns this XML:
<image_process_response>
<request_id>2d8d4dec-4344-4df0-a1e1-0c8df304ad11</request_id>
<status>OK</status>
<description/>
<err_code>0</err_code>
</image_process_response>
The script gets request_id from this XML and makes the $result_xml call. However, this is XML too, and the script doesn't get the image's URL immediately; it needs to wait a few seconds.
After refreshing the page three times, or using the sleep(6) function, we finally get:
<image_process_response>
<request_id>2d8d4dec-4344-4df0-a1e1-0c8df304ad11</request_id>
<status>OK</status>
<result_url>
http://worker-images.ws.pho.to/i1/9F1E2EAF-5B31-4407-8779-9A85F35862D3.jpg
</result_url>
<result_url_alt>
http://worker-images.ws.pho.to.s3.amazonaws.com/i1/9F1E2EAF-5B31-4407-8779-9A85F35862D3.jpg
</result_url_alt>
<limited_image_url>
http://worker-images.ws.pho.to/i1/3F797C83-2C2E-401C-B4AF-C4D36BBD442D.jpg
</limited_image_url>
<nowm_image_url>
http://worker-images.ws.pho.to/i1/9F1E2EAF-5B31-4407-8779-9A85F35862D3.jpg
</nowm_image_url>
<duration>2950.879097ms</duration>
<total_duration>2956.124067ms</total_duration>
</image_process_response>
Edit:
When I try to fetch the result immediately, I get this XML instead:
<image_process_response>
<request_id>e615f0a1-ddee-4d81-94c4-a392f8f123e8</request_id>
<status>InProgress</status>
<description>The task is in progress, you need to wait for sometime.</description>
</image_process_response>
So this is the reason why I see a blank page...
Does anyone have an idea how to force the script to re-request the second XML until it finds a result_url?
According to the Pho.to API, an add-task request is a queued POST request.
In my opinion, you should send the status request in a loop, but wait a smaller amount of time instead of a fixed 6 seconds: check the status in image_process_response and keep looping until it is no longer InProgress. After that, you can safely read the processed image result.
You may encounter a timeout issue, due to a low timeout configuration for DoS protection, if you run this script on a web server (via CGI/FastCGI). To resolve that, you need to queue the task in your HTTP request and then process it offline (that is, outside the web environment).
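A minimal polling sketch of that idea (the URLs and field names follow the code in the question; the one-second wait and the 30-attempt cap are arbitrary assumptions, not part of the API):
<?php
// Poll the result URL until the task is no longer "InProgress".
$result_url = 'http://opeapi.ws.pho.to/getresult?request_id=' . $request_id;
$result_img = '';

for ($attempt = 0; $attempt < 30; $attempt++) {   // cap the number of tries
    $result_xml = simplexml_load_file($result_url);
    $status = strval($result_xml->status);

    if ($status !== 'InProgress') {               // finished (OK or error)
        $result_img = strval($result_xml->result_url);
        break;
    }
    sleep(1);                                     // short wait instead of a fixed 6 seconds
}

if ($result_img !== '') {
    echo $result_img;
} else {
    echo 'Result image not found';
}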
I'm trying to build a little script that would let me do this:
http://example.com/appicons.php?id=284417350
and then display this in plain text
http://a3.mzstatic.com/us/r1000/005/Purple/2c/a0/b7/mzl.msucaqmg.png
This is the API query to get that information (artworkUrl512):
http://ax.itunes.apple.com/WebObjects/MZStoreServices.woa/wa/wsLookup?id=284417350
Any help and example code would be much appreciated!
I am not sure why you have jQuery in your tags, unless you want to make the request dynamically without a page refresh. However, you can do this simply in PHP using the following example:
$request = array(
    // fall back to an empty string when no id was passed
    "app_id" => isset($_GET["id"]) ? $_GET["id"] : ""
);
// parse the request
if (empty($request["app_id"])) {
    // redirect back / display an error
}
else {
    $app_uri = "http://ax.itunes.apple.com/WebObjects/MZStoreServices.woa/wa/wsLookup?id=" . urlencode($request["app_id"]);
    $data = file_get_contents($app_uri);
    $json = json_decode(trim($data));
    print($json->results[0]->artworkUrl512);
}
$request = file_get_contents($itms_url);
$json = json_decode(trim($request));
echo $json->results[0]->artworkUrl512;
should work in PHP, unless of course there is more than one hit for the search. A solution using jQuery is probably not much more difficult.
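Putting it together, a minimal appicons.php sketch could look like the following; the plain-text header and the empty-results check are additions of mine, not part of either answer above:
<?php
// appicons.php?id=284417350 -> prints the artwork URL as plain text
header('Content-Type: text/plain');

$id = isset($_GET['id']) ? $_GET['id'] : '';
if ($id === '') {
    echo 'Missing id parameter';
    exit;
}

$itms_url = 'http://ax.itunes.apple.com/WebObjects/MZStoreServices.woa/wa/wsLookup?id=' . urlencode($id);
$json = json_decode(trim(file_get_contents($itms_url)));

if ($json && $json->resultCount > 0) {
    echo $json->results[0]->artworkUrl512;
} else {
    echo 'No results found';
}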
I connect to a domain API and perform an availability check on a single domain.
I would like to loop this 10 times, in the most efficient way, to check for any changes in status.
I would like it to do the checks as quickly as possible (reduce the time between checks).
I would like it to output each time it completes a check (with a plain loop in place it only outputs all the checks in one go once finished, rather than one at a time after each check / iteration of the loop).
Cheers!
<?php
// connection credentials and settings
$location = 'https://TheApiServiceURL.com/';
$wsdl = $location.'?wsdl';
$username = 'APIuser';
$password = 'APIpass';
// include the console and client classes
include "class_console.php";
include "class_client.php";
// create a client resource / connection
$client = new Client($location, $wsdl, $username, $password);
/**
* Example usage and output results to screen
*/
// Example #1: Check domain name availability
print('========== consoleMethod[domainLookup] ==========<br/>');
$client->set('domain', 'domain.com');
$client->domainLookup();
$client->screen($client->response());
$client->unset('domain');
?>
I googled substrings of your code and found the documentation; the provided code is from its examples section.
About the Class "Client"
This is what it says about the screen method:
public function screen($var)
{
print '<pre>';
print_r($var);
print '</pre>';
return $this->connection;
}
and
public function response()
{
return $this->response;
}
Efficiently Looping 10 Times
If you want to get your response on every iteration (that is what you want, right?), do this:
$client->set('domain', 'domain.com');
$i = 0;
while ($i < 10)
{
    $client->domainLookup();
    echo $client->response();
    // or $client->screen($client->response());
    $i++;
}
$client->unset('domain');
According to this benchmark, while beats for, but it will be a minor difference at 10 iterations. However, if you really want to tweak it, I suggest timing different approaches, maybe even copy-pasting the commands 10 times.
DomainLookup() Speed
Or, as you referred to it, the "checking speed".
This depends on the function domainLookup(), which is provided by the API, so you'll have to see what this function is doing if you want to make the "checking speed" quicker. You could multi-thread the function, but PHP isn't really made for that.
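The question also mentions that output only appears once the whole loop has finished. That is usually output buffering; a minimal sketch that flushes after each check (the flush calls are my addition, the rest follows the loop above):
<?php
// Flush the output after every check so results appear one at a time.
$client->set('domain', 'domain.com');

for ($i = 0; $i < 10; $i++) {
    $client->domainLookup();
    $client->screen($client->response());

    // Push what we have so far to the browser instead of waiting for the end.
    if (ob_get_level() > 0) {
        ob_flush();
    }
    flush();
}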
Hi, I have a PHP script that receives GET data, and I want to redirect that data to another page in WordPress using POST. Is that possible, and how?
Thanks for the help.
The only way this could be done in pure PHP is by using cURL and printing the result of that request in the page:
<?php
// build the POST body from the incoming GET data
$postarray = array();
foreach ($_GET as $getvar => $getval){
    $postarray[] = urlencode($getvar).'='.urlencode($getval);
}
$poststring = implode('&', $postarray);

// fetch url
$curl = curl_init("http://www.yourdomain.com/yourpage.php");
curl_setopt($curl, CURLOPT_POST, true);
curl_setopt($curl, CURLOPT_POSTFIELDS, $poststring);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true); // return the response instead of printing it directly
$data = curl_exec($curl);
curl_close($curl);

// print data
print $data;
?>
Obviously you'd validate the GET data before you post it.
If there's another way you can do this I'd be interested to know, as this method is not ideal. Firstly, cURL must be enabled in PHP, and secondly there will be some overhead in requesting another URL.
Only by using a form and JavaScript, which is not bulletproof.
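A minimal sketch of that form-and-JavaScript approach (the target URL is a placeholder; the hidden inputs simply echo the incoming GET parameters back into a self-submitting POST form):
<?php
// Render a hidden form containing the GET data, then submit it via JavaScript.
?>
<form id="forward" action="http://www.yourdomain.com/yourpage.php" method="post">
<?php foreach ($_GET as $name => $value): ?>
    <input type="hidden"
           name="<?php echo htmlspecialchars($name); ?>"
           value="<?php echo htmlspecialchars($value); ?>">
<?php endforeach; ?>
</form>
<script>
    // Submit immediately; the visitor briefly sees a blank page.
    document.getElementById('forward').submit();
</script>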