PHP Ajax Curl Multithread

I have 3 APIs (www.api1.com, www.api2.com, www.api3.com) that must be called via Ajax and cURL. It works, but I realized that while api1 is still running, the api2 call waits until api1 is done. How can I make the calls run in parallel (so the fastest response is shown first)?
Note: I have a PHP function that executes cURL, and that function is called via Ajax.

If you are using Apache, then it may be a config issue. Look into Apache multithreading. Nginx may be better suited, but I'm not an expert on the multithreading part yet. I have no issues doing this with Nginx; I have many virtual servers set up that run simultaneously. Note that if Xdebug has been enabled and is monitoring each virtual web server, it will prevent them from running simultaneously.
There is no such thing as "ajax curl". Each browser implements Ajax differently, and cURL may be used under the hood, but it's confusing to say "ajax curl". You use Ajax with JavaScript, and likely use cURL (or an abstraction over cURL like Guzzle) on the server side to communicate with other servers.
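If the goal is to fire all three requests at once from a single PHP endpoint, PHP's built-in curl_multi API does this without any extra library. A minimal sketch, using the hostnames from the question:
$urls = ['www.api1.com', 'www.api2.com', 'www.api3.com'];
$mh = curl_multi_init();
$handles = [];
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}
// drive all transfers at once; curl_multi_select() sleeps until there is activity
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);
$results = [];
foreach ($handles as $url => $ch) {
    $results[$url] = curl_multi_getcontent($ch); // response body per API
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
Each entry in $results then holds the body of one API response, and none of the three transfers blocks the others.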

mpyw/co provides a very simple cURL- and Generator-based solution.
function curl_init_with($url, array $options = [CURLOPT_RETURNTRANSFER => true])
{
    $ch = curl_init($url);
    curl_setopt_array($ch, $options);
    return $ch;
}
Example
Execute 3 cURL requests in parallel and join their results:
$results = Co::wait([
    curl_init_with('www.api1.com'),
    curl_init_with('www.api2.com'),
    curl_init_with('www.api3.com'),
]);
Execute 2 generators that contain cURL requests in parallel:
Co::wait([
    function () {
        var_dump(yield [
            curl_init_with('www.api1.com'),
            curl_init_with('www.api2.com'),
            curl_init_with('www.api3.com'),
        ]);
    },
    function () {
        var_dump(yield [
            curl_init_with('www.api4.com'),
            curl_init_with('www.api5.com'),
            curl_init_with('www.api6.com'),
        ]);
    },
]);

Related

Is it possible to start executing PHP script on a multipart/form-data file upload request before file is uploaded?

It should be a common use case, but I can't find out whether it's achievable at all.
I want to validate the extension of a multipart/form-data uploaded file on the server side - must I wait for the file to fully upload?
I might be missing something, but waiting doesn't make sense, especially when handling large files.
Can't I execute PHP before the file is uploaded, read the metadata, and maybe cancel the request altogether?
I know I can split this into two separate requests; I'm looking for a single-request solution, if applicable.
You should wait until the file is fully uploaded so you can then validate it. There is no single-request solution.
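For completeness, the conventional check once the upload has fully arrived looks like this (the field name 'file', the extension whitelist, and the uploads/ destination are only examples):
// runs only after PHP has received the whole request body
$name = $_FILES['file']['name'];
$ext = strtolower(pathinfo($name, PATHINFO_EXTENSION));
if (!in_array($ext, array('jpg', 'png', 'pdf'), true)) {
    http_response_code(415); // reject unsupported file types
    exit('unsupported file type');
}
move_uploaded_file($_FILES['file']['tmp_name'], 'uploads/' . basename($name));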
If you use an Apache/Nginx HTTP server, it executes PHP scripts only after it has finished loading the whole request from the client - which is too late for your use case, as Sergio correctly points out in the other answer.
There is a single-request solution in PHP, but you need to have control over the HTTP requests in your PHP script.
You can choose not to use Apache, and instead start an HTTP server from your php-cli (either by using the native socket functions or an HTTP server package such as react/socket that uses them in the background).
$loop = React\EventLoop\Factory::create();
$socket = new React\Socket\Server('127.0.0.1:8080', $loop);
$socket->on('connection', function (React\Socket\ConnectionInterface $connection) {
    // here you can have $connection->on(...) event handlers
});
$loop->run();
Then you can attach handlers for each chunk of the incoming request (example from the react/socket package, specifically ReadableResourceStream):
$connection->on('data', function ($chunk) {
    echo $chunk;
});
Instead of echoing the chunk, you can validate its contents and call $connection->close() if needed, which effectively terminates the unfinished upload.
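As an illustration only - the filename regex and the allowed-extension list below are made up for this sketch, and it assumes the multipart part header (Content-Disposition: ... filename="...") arrives in the first chunks:
$connection->on('data', function ($chunk) use ($connection) {
    // look for the multipart header line that carries the client's filename
    if (preg_match('/filename="([^"]+)"/', $chunk, $m)) {
        $ext = strtolower(pathinfo($m[1], PATHINFO_EXTENSION));
        if (!in_array($ext, array('jpg', 'png', 'pdf'), true)) {
            $connection->write("HTTP/1.1 415 Unsupported Media Type\r\n\r\n");
            $connection->close(); // terminates the unfinished upload
        }
    }
});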
But this whole approach is complex, and I'd recommend it only for an upload service that is completely separate from the application that generates the form page (which can still run under a regular Apache HTTP server, because that's just much easier).
You can validate the file on the frontend before it is ever sent to the server; in PHP, the script executes only once the request has finished.

Guzzle slow on laravel forge and homestead

I don't understand why Guzzle requests are really slow on Laravel Forge and Laravel Homestead. I did not change the default server configuration on Forge or Homestead.
Every simple request like this one ...
$client = new GuzzleHttp\Client();
$response = $client->get('path-to-my-api');
... takes about 150 ms (on Homestead and Forge). This happens on every request (same network or the internet). I read some posts about Guzzle and it seems to be very fast for other users, but not for me.
Versions:
curl 7.35.0 (x86_64-pc-linux-gnu) libcurl/7.35.0 OpenSSL/1.0.1f zlib/1.2.8 libidn/1.28 librtmp/2.3
PHP Version 5.6.0
Guzzle 5.1.0
Something really weird is that when I do this (asynchronous) ...
$req = $client->createRequest('GET', 'path-to-my-api', ['future' => true]);
$client->send($req)->then(function ($response) {
});
... it takes about 10 ms. That's great, but I don't understand why. And I don't want to perform asynchronous requests.
Maybe my time measurement is buggy, but I think it's OK: I use PHP Debug Bar like this:
// .....
// synch
Debugbar::startMeasure('synch', 'SYNCH Request');
$response = $client->get('path-to-my-api');
Debugbar::stopMeasure('synch');
// asynch
Debugbar::startMeasure('asynch', 'ASYNCH Request');
$req = $client->createRequest('GET', 'path-to-my-api', ['future' => true]);
$client->send($req)->then(function ($response) {
    Debugbar::stopMeasure('asynch');
});
I know it's not easy to answer this question (because it's vague), but I have no clue for now :(. I can edit it if you want. Thanks a lot.
Guzzle itself cannot be slow - it's just a library. Your synchronous requests are probably taking long because your API is slow to respond, and your asynchronous requests seem faster because send() returns immediately instead of blocking on the network until a response arrives.
Try calling the API directly in your browser or using cURL in your terminal - you'll probably find the latency is there.
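One way to see where the time goes is to bypass Guzzle and ask cURL itself for timing details; a quick sketch ('path-to-my-api' stands in for the real endpoint):
$ch = curl_init('http://path-to-my-api');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
// break the ~150 ms down into its phases
printf(
    "dns: %.3fs connect: %.3fs total: %.3fs\n",
    curl_getinfo($ch, CURLINFO_NAMELOOKUP_TIME),
    curl_getinfo($ch, CURLINFO_CONNECT_TIME),
    curl_getinfo($ch, CURLINFO_TOTAL_TIME)
);
curl_close($ch);
A large name-lookup time, for example, would point at DNS resolution inside the VM rather than at Guzzle.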

php SoapClient and load balancing server

I am working on a PHP SoapClient. The SOAP server is a .NET server.
Everything works fine when the client calls an http address and the answer comes from the same server that the client called.
The issue I have is that the client must call an https address and the server uses a load-balancing system, so the answer can come from another server (the client calls serverX and gets an answer sometimes from serverY, sometimes from serverZ, etc.).
Here is the PHP code that I use (works fine with no https and no load balancing; doesn't work with https and load balancing):
$client = new SoapClient('http://www.serverA.com/serviceB.svc?wsdl');
$immat = 'yadiyadiyada';
$params = array('immat' => $immat);
$result = $client->__soapCall('setImmat', array($params));
$id_found = $result->setImmatResult;
Any idea what I should do? Any tips would be greatly appreciated!
Thanks
I at last found a workaround.
Instead of instantiating the PHP SoapClient with the WSDL file provided by the server, I made a copy of it on the client side and modified it a little. I only changed the "schemaLocation" attributes: on the server side the value was something like "https://www.serverY.com/serviceB.svc?xsd=xsd0", and I replaced it with "https://www.serverX.com/serviceB.svc?xsd=xsd0".
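I did the edit by hand, but as a sketch, a one-off helper could produce the patched copy (the hostnames and local path are the placeholders from this question):
// fetch the server's WSDL once, rewrite the schemaLocation host, save locally
$wsdl = file_get_contents('https://www.serverA.com/serviceB.svc?wsdl');
$wsdl = str_replace('https://www.serverY.com/', 'https://www.serverX.com/', $wsdl);
file_put_contents('/local_path/wsdl.xml', $wsdl);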
Now I instantiate the php SoapClient with this new file:
$client = new SoapClient('/local_path/wsdl.xml');
and it works!
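For reference, SoapClient also accepts a 'location' option that forces every call to one concrete endpoint. That may be worth trying as an alternative to (or together with) the WSDL edit, though I have not tested it against this load-balanced setup:
$client = new SoapClient('/local_path/wsdl.xml', array(
    'location' => 'https://www.serverX.com/serviceB.svc', // pin all calls to serverX
));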

Proper and fast way to TELNET in PHP. Sockets or cURL

Almost all examples of TELNET implementations in PHP use sockets (fsockopen). This does not work for me, because it takes an unacceptable amount of time (~60 seconds).
I have tried fsockopen for other purposes and found it slow in contrast to cURL.
Question #1: Why are sockets so slow?
Update: I found that we need to call the stream_set_timeout function, which lets us control the socket execution time. I'm still curious how to set a proper timeout, or how to make it "stop waiting" once the response is received.
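For reference, a minimal read loop built on stream_set_timeout() stops waiting shortly after data stops flowing. The host, port, and command below are placeholders, and real telnet option negotiation is skipped in this sketch:
$host = 'XXX.XXX.XXX.XXX'; // placeholder, as above
$port = 23;                // standard telnet port; adjust to your service
$fp = fsockopen($host, $port, $errno, $errstr, 5); // 5 s connect timeout
if (!$fp) {
    die("connect failed: $errstr ($errno)");
}
stream_set_timeout($fp, 2); // reads give up 2 s after data stops flowing
fwrite($fp, "my command\r\n"); // example command
$response = '';
while (!feof($fp)) {
    $chunk = fread($fp, 8192);
    if ($chunk === false || $chunk === '') {
        break; // timeout hit or connection closed: stop waiting
    }
    $response .= $chunk;
}
fclose($fp);
echo $response;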
I can't get the same thing implemented with cURL. Where should I put the commands that I need to send to telnet? Is CURLOPT_CUSTOMREQUEST the proper option? I'm doing something like this:
class TELNETcURL {
    public $errno;
    public $errstr;
    private $curl_handle;
    private $curl_options = array(
        CURLOPT_URL => "telnet://XXX.XXX.XXX.XXX:<port>",
        CURLOPT_TIMEOUT => 40,
        CURLOPT_RETURNTRANSFER => TRUE,
        CURLOPT_HEADER => FALSE,
        CURLOPT_PROTOCOLS => CURLPROTO_TELNET
    );

    function __construct() {
        $this->curl_handle = curl_init();
        curl_setopt_array($this->curl_handle, $this->curl_options);
    }

    public function exec_cmd($query) {
        curl_setopt($this->curl_handle, CURLOPT_CUSTOMREQUEST, $query . "\r\n");
        $output = curl_exec($this->curl_handle);
        return $output;
    }

    function __destruct() {
        curl_close($this->curl_handle);
    }
}
And then something similar to this:
$telnet = new TELNETcURL();
print_r($telnet->exec_cmd("<TELNET commands go here>"));
I am getting "Max execution time exceeded 30 seconds" on curl_exec command.
Question #2: What is wrong with the cURL implementation?
What you need to be doing is using non-blocking IO and then polling for the response. What you are doing now is waiting/hanging for a response that never comes - thus the timeout.
Personally, I've written a lot of socket apps in PHP and they work great - and I detest cURL as buggy, cumbersome, and highly insecure... just read its bug list; you should be appalled.
Go read the excellent PHP manual, complete with many examples of how to do polled IO - it even gives you an example telnet server and client.
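A bare-bones sketch of that polled-IO pattern with stream_select(), for illustration (host, port, and command are placeholders again, and telnet option negotiation is skipped):
$fp = stream_socket_client('tcp://XXX.XXX.XXX.XXX:23', $errno, $errstr, 5);
if (!$fp) {
    die("connect failed: $errstr ($errno)");
}
stream_set_blocking($fp, false); // non-blocking IO, as suggested above
fwrite($fp, "my command\r\n");   // example command
$response = '';
while (true) {
    $read = array($fp);
    $write = $except = null;
    // poll: wait at most 2 s for more data, then give up
    if (stream_select($read, $write, $except, 2) === 0) {
        break;
    }
    $chunk = fread($fp, 8192);
    if ($chunk === '' || $chunk === false) {
        break; // remote side closed the connection
    }
    $response .= $chunk;
}
fclose($fp);
echo $response;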
Sockets are not slow. Sockets are the basis of communication; cURL itself uses sockets to open a connection to the remote server. Everything works on sockets (I think).
I don't think you can use cURL for a telnet service. Well, that's not entirely true; I guess you can connect and send a single command. cURL was designed with the HTTP protocol in mind, which is stateless (you open a connection, send a request, wait for a reply, and then close the connection).
Sockets are the only option.
I am getting "Max execution time exceeded 30 seconds" on curl_exec command.
My guess is that the remote server is the culprit. Check whether it works using a regular terminal client, or increase max_execution_time in php.ini.
UPDATE
It seems it is possible to use cURL for telnet; check this:
http://www.cs.sunysb.edu/documentation/curl/
But I still think you are better off using sockets.
Use pfsockopen() instead of fsockopen(); it's much faster and keeps the connection alive the whole way.

All possible ways to read a file from a remote server

I want to provide the most flexibility possible for my script, so I need all possible ways in PHP and JavaScript to read the content (not the source code) of a PHP file from a remote server.
So far I have found curl, fopen, and include for PHP, and none for JavaScript, but I don't even know whether this is possible with JavaScript at all.
Thanks for any hint.
PHP:
- fopen() + fread()
- file_get_contents()
- cURL
- executing shell commands:
`wget 'www.google.com' -O saved.htm`;
$result = `cat saved.htm`;
JavaScript:
// same-origin only - this will not work against a remote server
var req = new XMLHttpRequest();
req.open('GET', 'http://www.google.com', false);
req.send(null);
if (req.readyState == 4) alert(req.responseText);
You've got the major options for PHP figured out.
As for JavaScript (assuming it's running in a web browser), the same-origin policy will complicate things.
Possible workarounds for JavaScript include:
Using a script-tag proxy (JSONP)
Using a PHP proxy script on the domain that your page is loaded from: your JavaScript asks the PHP script to grab the remote content, and the PHP script does that and outputs the contents back to your JavaScript (a minimal sketch follows).
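A hypothetical proxy.php, kept deliberately tiny (the whitelist array is an example; never proxy arbitrary user-supplied URLs):
// proxy.php - same-origin endpoint your JavaScript can XHR to
$allowed = array('http://www.google.com'); // example whitelist of remote targets
$url = isset($_GET['url']) ? $_GET['url'] : '';
if (!in_array($url, $allowed, true)) {
    http_response_code(403);
    exit('forbidden');
}
echo file_get_contents($url); // relay the remote content back to the browser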
JavaScript is primarily a client-side scripting language; you can't simply fetch an external resource with it unless either:
- you have server-side help (XHR to a server-side page that does the curl/wget), or
- the resource is on your own domain, in which case you can XMLHttpRequest it without server-side help.
