I have an API created with Slim 3.
In it I have some cURL calls, such as sending a push notification to a user.
I want to send the response to the requester first and then execute the cURL call (or any other function).
I read about threads in PHP and used pthreads-polyfill, but it still sends the response only after the thread has finished.
Sample tested code:
$app->get('/test', function (Request $request, Response $response) {
    PushController::publish("1111111", "HELLO");
    $result = 'OK';
    $response->getBody()->write($result);
    return $response->withStatus(200);
});
I understand what you are trying to do, and threading is not the answer.
One solution is to call a script from the main one, as you mentioned.
A more elegant one, imho, is to call fastcgi_finish_request(). It will return the answer to the requester and continue to execute the script. Unfortunately this function is only available with PHP-FPM, which is the industry standard by now but does not necessarily come as the default when you install a LAMP stack.
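A minimal sketch of the idea in plain PHP-FPM terms (Slim buffers its response until the framework finishes, so in a real Slim app you would hook this in after the response has been emitted, e.g. via middleware):

// Keep running even if the client disconnects after getting the response.
ignore_user_abort(true);

echo 'OK';

// Flush the full response to the client; only available under PHP-FPM.
if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request();
}

// Everything below runs after the client has already received its answer.
PushController::publish("1111111", "HELLO");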
For your requirement, two solutions may suit:
Queue
Cron
Redis can be used as a queuing server. For that you need to install the Redis server on your system. There is a PHP client for Redis called Predis. For more details about Redis you can read the official Redis site. Beanstalkd can also be used as a queuing server.
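A rough sketch of the queue idea with Predis (assumes Redis on localhost; the push:notifications list name is just an example):

// In the request handler: enqueue the work instead of doing it inline.
require 'vendor/autoload.php';

$redis = new Predis\Client();
$redis->rpush('push:notifications', json_encode([
    'user'    => '1111111',
    'message' => 'HELLO',
]));

// In a separate long-running worker script (kept alive by cron/supervisor):
$redis = new Predis\Client();
while (true) {
    // BLPOP blocks until a job arrives and returns [listName, payload].
    list($list, $payload) = $redis->blpop(['push:notifications'], 0);
    $job = json_decode($payload, true);
    PushController::publish($job['user'], $job['message']);
}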
To learn how to create cron jobs you can refer to existing Stack Overflow questions on the topic.
I am trying to implement an async job in Laravel so I can send an email (using a 3rd party API) but let the user go on in the frontend, so the request doesn't wait for the email to be sent.
I am using Laravel 6.18.
So I've created a generic job with php artisan make:job EmailJob.
I've set a sleep of 60 seconds as a test of a long email send.
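The job class looks roughly like this (the sleep(60) stands in for the real email API call):

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class EmailJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function handle()
    {
        // Simulate a slow third-party email API call.
        sleep(60);
    }
}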
Then in my controller:
EmailJob::dispatchAfterResponse();
return response()->json($obj,200);
In the Chrome console I can see there is a 200 response, however the request is still not resolved and there is no data returned, so my ajax/axios request still waits for the full response; eventually it times out (60 seconds is too long) and produces an error in the frontend.
So the question is: how do I execute the job after the full response is sent?
You have to change the queue driver and run a queue worker (php artisan queue:work).
The following 2 resources will help you
https://laravel-news.com/laravel-jobs-and-queues-101
https://laravel.com/docs/6.x/queues#connections-vs-queues
Just like with Terminable Middleware, this will only work if the webserver has FastCGI implemented.
You can go that way, or you can use a queue with the database driver, which is simpler to set up than installing Redis.
You would still need to have a running process (a worker) to complete the jobs.
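A minimal sketch of the database-driver route (the artisan commands are shown as comments, and EmailJob is the job from the question):

// 1. In .env:  QUEUE_CONNECTION=database
// 2. Create the jobs table:  php artisan queue:table  then  php artisan migrate
// 3. Dispatch as usual from the controller; the job is stored in the jobs table:
EmailJob::dispatch();
return response()->json($obj, 200);

// 4. Run a worker process (this is what actually executes the queued jobs):
//    php artisan queue:work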
I have a lot of data, which I want to transfer to Logz.io.
Basically, the project is written in Laravel and I want to create an event manager which will do all the work of logging info to Logz.io. I need advice about which library is better to use, something like Guzzle or cURL, which can help me send log info to Logz.io in the background. I mean, I don't want to wait until the request with the data for Logz.io finishes.
Thank you.
Well, as far as I know, there are two approaches that you could take:
1) Using Laravel's queue system to send your logs later. For this you'd have to enable a worker or a supervisor process to send them in the background.
2) There's this package by the people at Spatie, spatie/async, which is a wrapper around PHP's PCNTL extension and allows executing PHP code in separate processes (see the sketch below).
I hope I could help you, even if a little.
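For the second option, a minimal sketch with spatie/async (needs the PCNTL extension, so it's for CLI/worker contexts; $logBatches and sendToLogzIo() are hypothetical placeholders):

use Spatie\Async\Pool;

require 'vendor/autoload.php';

$pool = Pool::create();

foreach ($logBatches as $batch) {
    $pool->add(function () use ($batch) {
        // Hypothetical helper that POSTs the batch to Logz.io (e.g. via Guzzle).
        return sendToLogzIo($batch);
    })->then(function ($output) {
        // Runs in the parent process when the child finishes.
    })->catch(function (Throwable $exception) {
        // Handle failures without blocking the other tasks.
    });
}

$pool->wait();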
PHP is not an async language. Laravel has a queue system that works out of the box.
Basically you put some jobs in a queue and another process (it can also be on another machine) runs them. It will work well in your scenario because you don't need real-time log collection and can delay it by a few seconds.
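A rough sketch of what such a queued job could look like (the class name, payload, and the Logz.io listener URL/token are placeholders):

namespace App\Jobs;

use GuzzleHttp\Client;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class ShipLogsToLogzIo implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $logData;

    public function __construct(array $logData)
    {
        $this->logData = $logData;
    }

    public function handle()
    {
        // Executed by the queue worker, not during the web request.
        // Placeholder listener URL and token; check the Logz.io docs.
        (new Client())->post('https://listener.logz.io:8071/?token=YOUR-TOKEN', [
            'json' => $this->logData,
        ]);
    }
}

// Anywhere in the application:
// ShipLogsToLogzIo::dispatch($logData);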
I have a PHP REST server. There is an endpoint where I make a cURL request, and this endpoint is the most used one. Everything else in the code is very fast except for the cURL request. The return value of curl_exec is not important to me, nor is the status. So is there a way I can put the cURL request in an AWS queue or something, so that instead of the PHP server doing the cURL, AWS does it later? That way the endpoint becomes super fast.
Edit
The thing is, this can be solved in many ways (for example cron, fork, fsockopen, etc.) but I want the smartest way, one that uses Amazon cloud services like SQS.
You can solve your problem using Supervisord (Linux) and Gearman. Please check the links below for more details:
Supervisord and Gearman
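A minimal sketch of the Gearman side (assumes the pecl gearman extension and a gearmand server on the default port; Supervisord's only role is to keep worker.php running):

// client.php - called from the fast endpoint: hand the work off and return.
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);
// doBackground() queues the task and returns immediately.
$client->doBackground('send_curl', json_encode(['url' => 'https://example.com/notify']));

// worker.php - long-running process kept alive by Supervisord.
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);
$worker->addFunction('send_curl', function (GearmanJob $job) {
    $payload = json_decode($job->workload(), true);
    $ch = curl_init($payload['url']);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch); // result and status are intentionally ignored
    curl_close($ch);
});
while ($worker->work());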
I have an app which communicates with my server through a websocket. I am using Ratchet and everything works perfectly. The next thing I want to implement is to make requests to some other server and push the responses through the websocket to clients. My question is how to make parallel requests with React. Let's say I have 5 endpoints whose responses I want to get in parallel (threads). I want to call each endpoint every 0.2s, for example, and send the response to the websocket server's clients. For example (this is just demonstration code):
$server->loop->addPeriodicTimer(.2, function ($timer) {
    curl('endpoint1');
});

$server->loop->addPeriodicTimer(.2, function ($timer) {
    curl('endpoint2');
});

$server->loop->addPeriodicTimer(.2, function ($timer) {
    curl('endpoint3');
});
But this timer does not work this way. Is it even possible to achieve this with React?
I am not showing the websocket code because the communication between clients works fine.
To start: React (Ratchet) operates in single-threaded mode (an event loop, e.g. libevent). That is, anything that will block the process is a bad idea... A cURL request will stop the socket server from working until it receives a response.
For your application I would use ZMQ.
The point is this:
You run a worker process (for example: ZMQWorker).
Your server sends the cURL data to the ZMQWorker (via ZMQ).
The ZMQWorker sends the cURL request.
After sending the request, the ZMQWorker sends the response back to the WebSocket server (via ZMQ). You can receive it via reactphp/zmq.
If you need to send a lot of concurrent requests, you will need the pthreads library; it provides multi-threading.
I've also heard that it is possible to make pthreads work together with libevent, but I haven't done that personally.
P.S. With the ZMQ architecture you also get a distributed and scalable setup!
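A rough sketch of that flow, loosely following the Ratchet/ZMQ push pattern (assumes ext-zmq plus react/zmq; the socket address and the broadcast() method on the websocket app are placeholders):

// Inside the Ratchet/React server: receive worker results without blocking the loop.
$context = new React\ZMQ\Context($server->loop);
$pull = $context->getSocket(ZMQ::SOCKET_PULL);
$pull->bind('tcp://127.0.0.1:5555');
$pull->on('message', function ($msg) use ($webSocketApp) {
    // Forward the worker's response to the connected websocket clients.
    $webSocketApp->broadcast($msg);
});

// worker.php: does the blocking cURL call, then pushes the result back.
$context = new ZMQContext();
$push = $context->getSocket(ZMQ::SOCKET_PUSH);
$push->connect('tcp://127.0.0.1:5555');
$ch = curl_init('https://endpoint1.example.com');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$push->send(curl_exec($ch));
curl_close($ch);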
My objective is to consume various web services and then merge the results.
I was doing this using PHP cURL, but as the number of web services increased, my service slowed down, since the process was waiting for a response before making the request to the next web service.
I solved this issue using curl_multi and everything was working fine.
Now I have a new problem, because I have new web services to add to my service that use the SOAP protocol, and I can't do simultaneous requests anymore, because I don't use cURL for SOAP web services, I use SoapClient.
I know that I can build the XML with the SOAP directives and then send it with cURL, but that seems like bad practice to me.
In short, is there some way to consume REST and SOAP Web Services simultaneously?
I would first try a unified, asynchronous guzzle setup as others have said. If that doesn't work out I suggest not using process forking or multithreading. Neither are simple to use or maintain. For example, mixing guzzle and threads requires special attention.
I don't know the structure of your application, but this might be a good case for a queue. Put a message into a queue for each API call and let multiple PHP daemons read out of the queue and make the actual requests. The code can be organized to use curl or SoapClient depending on the protocol or endpoint instead of trying to combine them. Simply start up as many daemons as you want to make requests in parallel. This avoids all of the complexity of threading or process management and scales easily.
When I use this architecture I also keep track of a "semaphore" in a key-value store or database. Start the semaphore with a count of API calls to be made. As each is complete the count is reduced. Each process checks when the count hits zero and then you know all of the work is done. This is only really necessary when there's a subsequent task, such as calculating something from all of the API results or updating a record to let users know the job is done.
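For illustration, the semaphore bookkeeping can be as small as this (sketch using Predis; the key name and the follow-up helper are made up):

$redis = new Predis\Client();

// Before dispatching: remember how many API calls the batch contains.
$redis->set('batch:1234:remaining', count($apiCalls));

// In each worker, after its API call finishes:
$remaining = $redis->decr('batch:1234:remaining');
if ($remaining === 0) {
    // All calls are done; kick off the merge/notification step (hypothetical helper).
    mergeResultsAndNotify('1234');
}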
Now this setup sounds more complicated than process forking or multithreading, but each component is easily testable and it scales across servers.
I've put together a PHP library that helps build the architecture I'm describing. It's basic pipelining that allows a mix of synchronous and asynchronous processes. The async work is handled by a queue and semaphore. API calls that need to happen in sequence would each get a Process class. API calls that could be made concurrently go into a MultiProcess class. A ProcessList sets up the pipeline.
Yes, you can.
Use an HTTP client (e.g. Guzzle, Httpful); most of them follow PSR-7, so you have a common contract. Most importantly, they have plenty of plugins for SOAP and REST.
For example, if you choose Guzzle as your HTTP client, it has SOAP plugins. REST is just calling a service over HTTP, so you don't need an extra package for that; just use Guzzle itself.
Write your API calls in an async (non-blocking) way; that will increase the performance. One solution is to use promises.
Read more
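A minimal sketch of the promise-based approach with a recent Guzzle (the URLs are placeholders):

use GuzzleHttp\Client;
use GuzzleHttp\Promise\Utils;

require 'vendor/autoload.php';

$client = new Client();

// Fire all requests without waiting for each one to finish.
$promises = [
    'users'  => $client->getAsync('https://api.example.com/users'),
    'orders' => $client->getAsync('https://api.example.com/orders'),
    // A SOAP endpoint can be called the same way with postAsync() and an XML
    // body, or through one of the Guzzle SOAP plugins.
];

// Wait for all of them to settle; one failure does not abort the others.
$results = Utils::settle($promises)->wait();

foreach ($results as $name => $result) {
    if ($result['state'] === 'fulfilled') {
        echo $name . ': ' . $result['value']->getBody() . PHP_EOL;
    }
}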
It's not something PHP is good at, and you can easily hit edge-case crash bugs by doing it, but PHP CAN do multithreading - check PHP pthreads and pcntl_fork. (Neither of which works on a webserver behind php-fpm / mod_php, btw, and pcntl_fork only works on unix systems (Linux/BSD); Windows won't work.)
However, you'd probably be better off switching to a master process -> worker processes model with proc_open & co. This works behind webservers both with php-fpm and mod_php, does not depend on pthreads being installed, even works on Windows, and won't crash the other workers if a single worker crashes. Also, you can drop PHP's curl_multi interface (which imo is very cumbersome to get right) and keep using the simple curl_exec & co functions. (Here's an example of running several instances of ping: https://gist.github.com/divinity76/f5e57b0f3d8131d5e884edda6e6506d7 - but I'm suggesting using the PHP CLI for this, e.g. proc_open('php workerProcess.php', ...); I have done it several times before with success.)
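A rough sketch of that master/worker pattern (workerProcess.php stands for whatever script does the actual request):

// master.php: start several worker processes, then collect their output.
$workers = [];
foreach (['job1', 'job2', 'job3'] as $arg) {
    $descriptors = [
        0 => ['pipe', 'r'], // child's stdin
        1 => ['pipe', 'w'], // child's stdout
        2 => ['pipe', 'w'], // child's stderr
    ];
    $proc = proc_open('php workerProcess.php ' . escapeshellarg($arg), $descriptors, $pipes);
    $workers[] = ['proc' => $proc, 'pipes' => $pipes];
}

// All workers are now running concurrently; read their results when they finish.
foreach ($workers as $w) {
    echo stream_get_contents($w['pipes'][1]);
    fclose($w['pipes'][0]);
    fclose($w['pipes'][1]);
    fclose($w['pipes'][2]);
    proc_close($w['proc']);
}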
You could run a cronjob.php with crontab and start other php scripts asynchronously:
// cronjob.php
$files = [
    'soap-client-1.php',
    'soap-client-2.php',
    'soap-client-2.php',
];

foreach ($files as $file) {
    $cmd = sprintf('/usr/bin/php -f "%s" >> /dev/null &', $file);
    system($cmd);
}
soap-client-1.php
$client = new SoapClient('http://www.webservicex.net/geoipservice.asmx?WSDL');

$parameters = array(
    'IPAddress' => '8.8.8.8',
);

$result = $client->GetGeoIP($parameters);

// #todo Save result
Each PHP script starts a new SOAP request and stores the result in the database. Now you can process the data by reading the results from the database.
This seems like an architecture problem. You should instead consume each service with a separate file/URL and scrape the JSON from those into an HTML5/JS front-end. That way, your service can be divided into many asynchronous chunks and the speed of each can be tweaked separately.