Guzzle slow on laravel forge and homestead - php

I don't understand why Guzzle requests are really slow on Laravel Forge and Laravel Homestead. I did not change the default server configuration on Forge or Homestead.
Every simple request like this one ...
$client = new GuzzleHttp\Client();
$response = $client->get('path-to-my-api');
... takes about 150ms (on Homestead and Forge). This happens on every request (same network or over the internet). I read some posts about Guzzle and it seems to be very fast for other users, but not for me.
Versions:
curl 7.35.0 (x86_64-pc-linux-gnu) libcurl/7.35.0 OpenSSL/1.0.1f zlib/1.2.8 libidn/1.28 librtmp/2.3
PHP Version 5.6.0
Guzzle 5.1.0
Something really weird is that when I do this (asynchronous) ...
$req = $client->createRequest('GET', 'path-to-my-api', ['future' => true]);
$client->send($req)->then(function ($response) {
});
... it takes about 10ms. It's great but I don't understand why. And I don't want to perform asynchronous requests.
Maybe my time measurement is buggy, but I think it's OK: I use PHP Debug Bar like this:
// .....
// synch
Debugbar::startMeasure('synch','SYNCH Request');
$response = $client->get('path-to-my-api');
Debugbar::stopMeasure('synch');
// asynch
Debugbar::startMeasure('asynch','ASYNCH Request');
$req = $client->createRequest('GET', 'path-to-my-api', ['future' => true]);
$client->send($req)->then(function ($response) {
    Debugbar::stopMeasure('asynch');
});
I know it's not easy to answer this question (because it's vague), but I have no clue for now :(. I can edit it if you want. Thanks a lot.

Guzzle cannot be slow - it's just a library. Your synchronous requests are probably taking longer because your API is slow to respond, and your asynchronous requests seem faster because they don't block on the network until a response is received.
Try calling the API directly in your browser or using cURL in your terminal - you'll probably find the latency is there.
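A quick way to sanity-check the Debug Bar numbers is to time both calls with microtime(). Note that with a future request the then() callback only runs once the future is dereferenced, so stopping the measure inside it does not capture the full round trip. A rough sketch using the same Guzzle 5 calls as in the question (wait() dereferences the future and blocks until the real response arrives):
$start = microtime(true);
$response = $client->get('path-to-my-api');              // blocks until the API answers
printf("sync: %.1f ms\n", (microtime(true) - $start) * 1000);
$start = microtime(true);
$req = $client->createRequest('GET', 'path-to-my-api', ['future' => true]);
$future = $client->send($req);                           // returns almost immediately
printf("async dispatch: %.1f ms\n", (microtime(true) - $start) * 1000);
$future->wait();                                         // dereference: now we actually wait
printf("async total: %.1f ms\n", (microtime(true) - $start) * 1000);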

Is there a maximum number of simultaneous localhost cURL requests with PHP?

I'm setting up a Vue.js app and multiple Laravel applications that must communicate with each other. Communication between the Laravel applications is done through cURL requests. For now, everything is on my dev machine (a recent MacBook Pro, macOS Mojave) with MAMP Pro (PHP 7.3).
The problem is that when I do simultaneous queries, I get:
CURL error 28 communication timeout ... with 0 bytes received.
It is of course not a timeout problem (I tried with 2 minutes on all applications and all timeouts - same result). Since I'm working with APIs, there is no PHP session (so no session file lock).
It seems that the cURL connection is closed, but I don't know why (I don't close it myself, and it doesn't hit the timeouts (connect/read/global)).
More visually :
vuejs --ajax1--> Laravel A --cUrl--> Laravel B --cUrl--> Laravel C
vuejs --ajax2--> Laravel A --cUrl--> Laravel B --cUrl--> Laravel C
vuejs <--500-- Laravel A --X-- Laravel B <---- Laravel C
vuejs <--500-- Laravel A --X-- Laravel B <---- Laravel C
ajax1 and ajax2 are sent at the same time.
It works if ajax1 and ajax2 are NOT sent at the same time.
What I know:
Communication is cut between Laravel A and Laravel B, but Laravel B
executes the code and returns a response (which never arrives because, I think, the connection is closed?). Both requests hit Laravel C, and Laravel C also runs.
What I tried:
Apache and nginx
Disable the firewall
Increase all timeouts (PHP and cURL)
Increase the memory limit (PHP)
The cURL CURLOPT_FORBID_REUSE and CURLOPT_FRESH_CONNECT options
Change the local domain name and TLD of the Laravel applications
What I'm wondering (a minimal reproduction is sketched after this list):
Is there a maximum number of requests that I can make at the same time with cURL on the same machine? (I did, of course, set CURLOPT_MAXCONNECTS to 20, without success.)
Is there a php.ini setting that I missed?
Is it possible that the problem comes from the fact that all these applications run on the same machine? If yes, why?
Do both servers (nginx and Apache) limit connections from the same IP? (Since all applications are on the same machine, they all have the same IP.)
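To take Laravel and the MAMP vhost setup out of the picture, a minimal reproduction could look like the sketch below (hypothetical filenames and port, not from the original setup): b.php just sleeps to stand in for Laravel B's own downstream call, and a.php calls it with the same cURL options already tried above. Opening a.php in two tabs at once shows whether the second chained request stalls on this stack:
<?php
// a.php - mimics Laravel A calling Laravel B on the same machine
$ch = curl_init('http://localhost:8888/b.php'); // hypothetical MAMP port
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_CONNECTTIMEOUT => 10,
    CURLOPT_TIMEOUT        => 120,
    CURLOPT_FORBID_REUSE   => true,  // options already tried in the question
    CURLOPT_FRESH_CONNECT  => true,
]);
$start = microtime(true);
$body  = curl_exec($ch);
printf("%.1fs, error: %s\n", microtime(true) - $start, curl_error($ch) ?: 'none');
curl_close($ch);
// b.php (separate file) - stands in for Laravel B:
// <?php sleep(5); echo 'ok';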
I just bumped into the same problem, where the main backend acts as a proxy to get data from another backend through cURL (Guzzle).
But I get a timeout only when doing more than 5 simultaneous requests.
All applications are live, and I've tried setting up a new VPS server to avoid having all backends on the same machine.
Setting CURLOPT_MAXCONNECTS also didn't work.
This is my fake proxy code (using Laravel):
// assumes "use GuzzleHttp\Client;" at the top of the controller
$uri     = $request->get('url');
$api_url = env('URL');
$token   = env('TOKEN');
$key     = 'Bearer ' . $token;

$client = new Client([
    'base_uri' => $api_url,
    'headers'  => [
        'Accept'        => 'application/json',
        'Content-Type'  => 'application/json',
        'Authorization' => $key,
    ],
]);

// Forward the request to the other backend and return its body as-is
$api_response = $client->request('GET', $uri, [
    'http_errors' => false,
]);

return $api_response->getBody()->getContents();
I haven't found a solution yet. Hopefully someone can help us solve this problem.

Handle concurrent puppeteer requests through php

I'm building a web app that retrieves dynamically generated content through Puppeteer. I have set up (Apache + PHP) Docker containers: one for the p5.js project that generates an SVG based on a (large, 2 MB) JSON file, and one PHP container that retrieves that SVG. Docker runs behind an Nginx config (Nginx for routing, Apache for quicker PHP handling). I'm using the cheapest CentOS server available on DigitalOcean, so upgrading would definitely help.
I don't want the JavaScript in the p5.js project to be exposed to the public, so I thought a Node.js solution would be best in this scenario.
The PHP page does a shell_exec("node pup.js"). It runs in approx 1-3 seconds, which is perfect.
The problem is that when I test a multi-user scenario and open 5 tabs to run this PHP page, the load time climbs to 10+ seconds, which is killing my app.
So the question is how to set up this architecture (PHP calling a Node command) for a multi-user environment.
===
I've tried several frameworks like x-ray, nightmare, jsdom, cheerio, axios, zombie, and phantom, just trying to replace Puppeteer. Some of the frameworks returned nothing, some just didn't work out for me. I think I need a headless browser solution to be able to execute the p5.js. Eventually Puppeteer gets the job done, only not in a multi-user environment (I think due to my current PHP shell_exec Puppeteer architecture).
Maybe my shell_exec workflow was the bottleneck, so I ended up building a simple node example.js that waits 5 seconds before finishing (not using Puppeteer), and I ran it in several tabs simultaneously; it works like a charm. All tabs load in about 5-6 seconds.
I've also tried pm2 to test whether my node command was the bottleneck. I did some testing on the command line with no notable results, and I couldn't get PHP to run a pm2 command, so I dropped this test.
I've tried setting up PuPHPeteer, but couldn't get it to run.
At some point I thought it had something to do with multiple Puppeteer browsers being launched, but I've read that this should be no problem.
The PHP looks like:
<?php
// Run the Puppeteer script and capture its output (2>&1 also captures stderr)
$puppeteer_command = "node /var/www/pup.js 2>&1";
$result = shell_exec($puppeteer_command);
echo $result;
?>
My puppeteer code:
const puppeteer = require('puppeteer');

let url = "http://the-other-dockercontainer/";
let time = Date.now();

let scrape = async () => {
    const browser = await puppeteer.launch({
        args: ['--no-sandbox']
    });
    const page = await browser.newPage();
    await page.goto(url);
    await page.waitForSelector('svg', { timeout: 5000 });

    let svgImage = await page.$('svg');
    await svgImage.screenshot({
        path: `${time}.png`,
        omitBackground: true,
    });

    await browser.close();
    return time;
};

scrape().then((value) => {
    console.log(value); // Success!
});
I was thinking about building the entire app in Node.js if that is the best solution, but I've put so many hours into this PHP infrastructure that I'd really like to get some advice :)
Since I have full control over the target and destination sites, one idea would be to have Node run a server that accepts a JSON file and returns the SVG based on a local p5.js page, but I don't know (yet) if that would be any different; a rough sketch of the PHP side of that idea follows.
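A minimal sketch of the PHP side under that assumption: a long-running Node service (hypothetical endpoint http://localhost:3000/render, not part of the current setup) keeps one headless browser open, and PHP posts the JSON to it instead of shelling out, so there is no Chromium cold start per request:
<?php
// Hypothetical: POST the JSON to a persistent Node render service and
// stream back the image it produces.
$json = file_get_contents('/var/www/data/input.json');   // assumed input path
$ch = curl_init('http://localhost:3000/render');          // hypothetical Node endpoint
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => $json,
    CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_TIMEOUT        => 30,
]);
$image = curl_exec($ch);
curl_close($ch);

header('Content-Type: image/png');
echo $image;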
UPDATE
So thanks to some comments, I've tried a new approach: not using p5.js, but native Processing code (Java). I exported the Processing sketch to a Linux 64-bit application and created this little Node.js example:
var exec = require('child_process').exec;
var cmd = '/var/www/application.linux64/minimal';

exec(cmd, processing);

// Callback for the command line process
function processing(error, stdout, stderr) {
    // I could do some error checking here
    console.log(stdout);
}
When I call this node example.js from shell_exec in PHP, I get this:
The first call takes about 2 seconds, but when I hit refresh many times, the time again builds up by several seconds per call. So clearly my understanding of multithreading is not that good, or am I missing something crucial in my testing?
I have a setup similar to yours. The biggest problem is your droplet. You should use at minimum the cheapest CPU-optimized droplet ($40 per month, from memory). The cheapest generic droplets on DO are in a shared environment (noisy neighbours create performance fluctuations). You can easily test this by making a snapshot of your server and cloning your drive.
Next, as someone else suggested, reduce the cold starts. On my server a cold start adds around 2 extra seconds. I take 10 screenshots before opening a new browser; anything more than that and you may run into memory issues.
If you are trying to use Puppeteer with PHP, make sure you have Composer installed, then open a terminal inside the project folder and run composer require nesk/puphpeteer; also install nesk/rialto, then require the autoloader and everything should work. One thing when working with Puppeteer from PHP: declare variables with $ instead of const, skip the await keyword, and replace "." with ->.
<?php
require 'vendor/autoload.php';
use Nesk\Puphpeteer\Puppeteer;
use Nesk\Rialto\Data\JsFunction;
$puppeteer = new Puppeteer;
$browser = $puppeteer->launch([
    'headless' => false,
    'args' => ['--proxy-server=000.00.0.0:80'], // type your proxy here
]);
$bot = $browser->newPage();
$bot->goto('https://google.com');               // goes to google.com
$bot->waitForTimeout(3000);                     // waits
$data = $bot->evaluate(JsFunction::createWithBody('return document.documentElement.innerHTML')); // grabs the page HTML
$urlPath = $bot->url();                         // current URL
$bot->type('div', '"cellphonemega"');           // searches
$bot->keyboard->press('Enter');                 // presses Enter
$bot->waitForTimeout(8000);                     // waits
$bot->click('h3');                              // clicks the first result heading
$bot->waitForTimeout(8000);                     // waits while the site loads
$bot->screenshot(['path' => 'screenshot.png']); // takes a screenshot
$browser->close();                              // shuts down

PHP Ajax Curl Multithread

I have 3 APIs (www.api1.com, www.api2.com, www.api3.com) that must be called via AJAX and cURL. It works fine now, but I realized that while api1 is not done, the api2 call waits until api1 finishes. How can I make the calls run in parallel (the fastest result will be shown)?
Note: I mean that I have a function in PHP that executes cURL, and that function is called via AJAX.
If you are using Apache, then it may be a config issue. Look into Apache multithreading. Nginx may be better suited, but I'm not an expert in the multithreading part yet. I have no issues doing this with Nginx; I have many virtual servers set up that run simultaneously. If XDEBUG has been set up and is monitoring each virtual web server, it will prevent them from running simultaneously.
There is no such thing as "AJAX cURL". Each browser implements AJAX differently, and cURL may be used under the hood, but it's confusing to say "AJAX cURL". You use AJAX with JavaScript, and you likely use cURL (or an abstraction over cURL like Guzzle) on the server side to communicate with other servers.
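If you want to stay with plain PHP cURL on the server side, curl_multi_* is the built-in way to run the three requests in parallel. A minimal sketch, reusing the placeholder API hosts from the question:
<?php
// Run the three API calls in parallel with curl_multi and collect the bodies.
$urls = ['www.api1.com', 'www.api2.com', 'www.api3.com'];

$mh = curl_multi_init();
$handles = [];
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}

// Drive all transfers until none are still running
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);

$results = [];
foreach ($handles as $url => $ch) {
    $results[$url] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
}
curl_multi_close($mh);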
mpyw/co provides a very simple cURL- and Generator-based solution.
function curl_init_with($url, array $options = [CURLOPT_RETURNTRANSFER => true])
{
    $ch = curl_init($url);
    curl_setopt_array($ch, $options);
    return $ch;
}
Example
Execute 3 cURL requests in parallel and join their results:
use mpyw\Co\Co;

$results = Co::wait([
    curl_init_with('www.api1.com'),
    curl_init_with('www.api2.com'),
    curl_init_with('www.api3.com'),
]);
Execute 2 generators that contain cURL requests in parallel:
Co::wait([
    function () {
        var_dump(yield [
            curl_init_with('www.api1.com'),
            curl_init_with('www.api2.com'),
            curl_init_with('www.api3.com'),
        ]);
    },
    function () {
        var_dump(yield [
            curl_init_with('www.api4.com'),
            curl_init_with('www.api5.com'),
            curl_init_with('www.api6.com'),
        ]);
    },
]);

PHP SOAP awfully slow: Response only after reaching fastcgi_read_timeout

The constructor of the SoapClient in my WSDL web service is awfully slow: I get the response only after the fastcgi_read_timeout value in my Nginx config is reached; it seems as if the remote server is not closing the connection. I also need to set it to a minimum of 15 seconds, otherwise I get no response at all.
I have already read similar posts here on SO, especially this one,
PHP SoapClient constructor very slow, and its linked threads, but I still cannot find the actual cause of the problem.
This is the part which takes 15+ seconds:
$client = new SoapClient("https://apps.correios.com.br/SigepMasterJPA/AtendeClienteService/AtendeCliente?wsdl");
It seems to be slow only when called from my PHP script, because the file opens instantly when accessed from one of the following locations:
wget from my server which is running the script
SoapUI or Postman (But I don't know if they cached it before)
opening the URL in a browser
Ports 80 and 443 are open in the firewall. Following the suggestion from another thread, I found two workarounds:
Loading the wsdl from a local file => fast
Enabling the wsdl cache and using the remote URL => fast
But I'd still like to know why it doesn't work with the original URL.
It seems as if the web service does not close the connection; in other words, I get the response only after reaching the timeout set in my server config. I tried setting keepalive_timeout 15; in my Nginx config, but it does not work.
Is there any SOAP/PHP parameter which forces the server to close the connection?
I was able to reproduce the problem and found a solution (it works, though it may not be the best one) in the accepted answer of a question linked from the question you referenced:
PHP: SoapClient constructor is very slow (takes 3 minutes)
As per that answer, you can adjust the HTTP headers using the stream_context option; forcing HTTP/1.0 with a Connection: Close header makes the server close the connection as soon as the WSDL has been sent, so PHP no longer sits there until the read timeout.
$client = new SoapClient("https://apps.correios.com.br/SigepMasterJPA/AtendeClienteService/AtendeCliente?wsdl",array(
'stream_context'=>stream_context_create(
array('http'=>
array(
'protocol_version'=>'1.0',
'header' => 'Connection: Close'
)
)
)
));
More information on the stream_context option is documented at http://php.net/manual/en/soapclient.soapclient.php
I tested this using PHP 5.6.11-1ubuntu3.1 (cli)

php SoapClient and load balancing server

I am working on a PHP SoapClient. The SOAP server is a .NET server.
Everything works fine when the client calls an http address and the answer comes from the server that the client calls.
The issue I have is that the client must call an https address and the server uses a load balancing system, so the answer comes from another server (the client calls serverX and gets an answer sometimes from serverY, sometimes from serverZ, etc.).
Here is the PHP code that I use (it works fine with no https and no load balancing; it doesn't work with https and load balancing):
$client = new SoapClient('http://www.serverA.com/serviceB.svc?wsdl');
$immat = 'yadiyadiyada';
$params = array('immat' => $immat);
$result = $client->__soapCall('setImmat', array($params));
$id_found = $result->setImmatResult;
Any idea of what I should do? Any tips would be greatly appreciated!
Thanks
I at last found a workaround.
Instead of instantiating the PHP SoapClient with the WSDL file provided by the server, I made a copy of it on the client side and modified it slightly. I only changed the "schemaLocation": on the server side the value was something like "https://www.serverY.com/serviceB.svc?xsd=xsd0", and I replaced it with "https://www.serverX.com/serviceB.svc?xsd=xsd0".
Now I instantiate the php SoapClient with this new file:
$client = new SoapClient('/local_path/wsdl.xml');
and it works!
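If editing the WSDL copy ever feels fragile, another option (a sketch, using the same serverX/serviceB naming as above) is to keep the local WSDL and pin the SOAP endpoint explicitly with SoapClient's 'location' option, which overrides whatever endpoint the WSDL advertises:
// Load the WSDL from the local copy, but send every call to serverX directly
$client = new SoapClient('/local_path/wsdl.xml', array(
    'location' => 'https://www.serverX.com/serviceB.svc',
));
$result = $client->__soapCall('setImmat', array(array('immat' => 'yadiyadiyada')));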
