Multi Outbound Request - Curl
Here's the problem: I have several clients I have to send outbound cURL requests to. Say, for instance, I currently have 20 clients and I send around 100~1000 requests per minute to each of them. I also use curl_multi; however, it seems to have a limit on how many requests it can make at a time, and the whole routine takes as long as the slowest cURL response to complete.
For instance I have the following clients:
Client 1
url: http://www.exampleclient1.com/process/
Client 2
url: http://www.exampleclient2.com/process/
... and so on
The main issue here is that I have a single script that does the job for every client. Say,
http://localhost/app/send/client1
> will send out the pending outbound queue to client 1's url
http://localhost/app/send/client2
> will send out the pending outbound queue to client 2's url
... and so on
The reason I separated them is that each client should have its own dedicated connection, and their latencies differ: some clients respond quickly and have fast servers, while others have slow servers or take more hops to reach.
Here's my question: is there a way to simplify this process? It's a hassle that every time I add a client to my database I also have to add
http://localhost/send/newclient1
http://localhost/send/newclient2
.
.
.
http://localhost/send/newclientn
to the list of cron jobs. Is it possible to put it all in a single script, so that as my list of clients grows it does not affect the overall performance of my outbound cURL routine?
By the way, I'm using PHP and cURL. If there's a solution that uses a technology other than PHP, a Linux queuing manager, etc., you're welcome to suggest it.
Thanks!
I have a similar problem to yours; the common issues are:
- Different response times of the resource URLs (the clients, as you call them)
- The need to add new URLs, whether one or ten at a time.
In addition to those, my other observations:
- URLs from different service providers,
- with different data formats,
- that require different types of requests for what you want.
I have reviewed a couple of PHP cURL libraries:
Rolling-Curl and Parallel-Curl, covered in this article.
PHP-Multi-curl (epiCurl), which is the one I prefer.
multi-curl-php, with a code example.
phpmulticurl, with a code example.
and there are other libraries and classes...
Now, for managing the database and the clients' send requests: you should at least know an RDBMS such as MySQL or SQLite; other RDBMS options can also be considered.
In the RDBMS, create a table that maps each client to its URL, e.g. { "clientNo" => "URL" }.
You can add your list of clients using phpMyAdmin or any database management interface.
The script: create a script that fetches all the entries from the database and, within a foreach loop, creates a cURL handle for each one and adds it to a curl_multi queue (a minimal sketch follows these steps).
Test your script, and for production add it as a cron job.
Every time you have to add a client, just add it to the RDBMS (MySQL, SQLite) using the database management interface.
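A minimal sketch of what such a cron script could look like, assuming a hypothetical clients table with client_no and url columns; the table name, column names, and POST body are placeholders, not the original code:

<?php
// Minimal sketch, meant to run from cron. Assumes a hypothetical `clients`
// table with columns `client_no` and `url`; adjust names to your schema.
$pdo     = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$clients = $pdo->query('SELECT client_no, url FROM clients')->fetchAll(PDO::FETCH_ASSOC);

$multi   = curl_multi_init();
$handles = [];

foreach ($clients as $client) {
    $ch = curl_init($client['url']);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, 'queue=...');  // placeholder: this client's pending queue
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);              // so one slow client cannot stall the run
    curl_multi_add_handle($multi, $ch);
    $handles[$client['client_no']] = $ch;
}

// Run all transfers until they are finished.
do {
    $status = curl_multi_exec($multi, $active);
    if ($active) {
        curl_multi_select($multi);                      // wait for activity instead of busy-looping
    }
} while ($active && $status === CURLM_OK);

foreach ($handles as $clientNo => $ch) {
    $response = curl_multi_getcontent($ch);             // per-client response, if you need it
    curl_multi_remove_handle($multi, $ch);
    curl_close($ch);
}
curl_multi_close($multi);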
NOTE: if you are doing outbound management, you have to consider security issues and proxies, and possibly create a PHP/HTML form for submitting new clients.
I’m using Ionic Framework (Angular v9) to send POST requests to PHP files hosted on a GoDaddy server. POST requests work fine as long as the client sends them intermittently. However, when the client starts to send multiple POST requests in quick succession, the requests are rejected. The server responds with one of the following errors:
ERR_CONNECTION_CLOSED
ERR_CONNECTION_TIMED_OUT
ERR_CONNECTION_RESET
Scenario where such errors are spotted: an autocomplete implementation. I have a product table (phpMyAdmin / MySQL DB) and a PHP file to retrieve data from that table. There is a textbox in the client-side app. The user begins entering the name of a product. The app takes the value from the textbox and sends a POST request to the server. The server extracts the value from the POST request and queries the table for values that match the input string. The top 5 results are sent back to the client.
This flow works as long as the client fires the POST requests slowly, i.e. a sufficient interval (a few seconds) elapses between two requests. However, this ideal scenario is impractical: a user's typing speed wouldn't allow such an interval to pass. I want to query the DB and have the server return results on each keydown event.
But the problem is that the server closes or resets the connection when the same client sends too many POST requests without leaving enough of an interval between any two requests.
Thank you in advance for help.
If you can, find out the actual limitation (the allowed POST request rate from your client to the server), then work off that.
But generally speaking, you need to employ a debounce technique. Even if your hosting provider allowed that many requests, it is not a great idea to have that volume coming out of a single client.
You can use the built-in debounce of components such as ion-searchbar
You can use the debounceTime RxJS operator to pipe your POST requests (so that they don't fire more often than they should)
You can consider caching such autocomplete details on the client to prevent frequent requests
Example with ion-searchbar:
<ion-searchbar debounce="500"></ion-searchbar>
Example with debounceTime and formControl:
// Assumes: import { FormControl } from '@angular/forms';
// and:     import { debounceTime } from 'rxjs/operators';
constructor() {
  this.searchControl = new FormControl();
}

ngOnInit() {
  this.searchControl.valueChanges
    .pipe(debounceTime(700)) // emit only after 700 ms of typing silence
    .subscribe(search => {
      // send the POST request for `search` here
    });
}
Most of the answers recommended that I contact GoDaddy before diving headlong into this issue. After contacting GoDaddy, I found out that SHARED servers are not very good for data-intensive applications. If the client makes too many requests to the server in a short interval of time, it is in our best interest to upgrade to a PRIVATE VPS server. Simply beefing up a shared server won't necessarily improve the performance of the app or lift any of the restrictions imposed.
Another suggestion I received from a senior developer was to optimise the DB: normalisation, indexing, and applying appropriate constraints and relations among tables should be performed. If the DB is not optimised, certain servers might close the connection and the POST requests can fail (the indexing point is sketched below).
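To illustrate the indexing point for the autocomplete case, here is a minimal sketch; the products table, the name column, and the term POST field are hypothetical placeholders, not taken from the original question:

<?php
// Hypothetical table/column names (products, name) and POST field (term).
// A B-tree index speeds up prefix searches like 'abc%'; it does not help '%abc%'.
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

// One-time schema change (normally done in phpMyAdmin or a migration):
// CREATE INDEX idx_products_name ON products (name);

// Autocomplete query: top 5 matches for the typed prefix, using a prepared statement.
$stmt = $pdo->prepare(
    'SELECT name FROM products WHERE name LIKE :prefix ORDER BY name LIMIT 5'
);
$stmt->execute([':prefix' => ($_POST['term'] ?? '') . '%']);
echo json_encode($stmt->fetchAll(PDO::FETCH_COLUMN));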
What approach and mechanisms (and probably code) should one apply to fully implement Model-to-Views data updates (transfer) on a Model-State-Change event with pure PHP?
If I'm not mistaken, the MVC pattern carries an implicit requirement for data to be sent from the Model layer to all active Views, specifying that "the View is updated on Model change". (Otherwise it doesn't make sense, as users working with the same source would see its data out of date and disconnected from reality.)
But PHP is a scripting language, so it's limited to "connection threads" via processes, and its lifetime is limited to the request-response cycle (as tereško kindly noted).
Thus, one has to solve a couple of issues (a minimal SSE sketch follows this list):
The client must have a live tunnel connection to the server (Server-Sent Events),
The server must be able to push data to the client (flush(), ob_flush()),
The Model-State-Change event must be raised and the related data packed for transfer,
(?) The data must be sent to all active clients (connected to the same exact resource/URL) together, not just the one currently working with its own process and instance of the ModelClass.php file...
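As a rough illustration of the first two points (a live SSE connection and pushing with flush()/ob_flush()), here is a minimal sketch of a plain-PHP Server-Sent Events endpoint; the event name and the data source are placeholders, not something from the original post:

<?php
// Minimal SSE endpoint sketch; each connected client ties up one PHP process.
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

while (!connection_aborted()) {
    // Placeholder: in a real app, check here whether the Model state changed
    // (e.g. poll a table, a cache key, or a file) and send only when it did.
    $payload = json_encode(['time' => date('H:i:s')]);

    echo "event: model-update\n";
    echo "data: {$payload}\n\n";

    if (ob_get_level() > 0) {
        ob_flush();   // push PHP's own output buffer
    }
    flush();          // push the web server's buffer out to the client

    sleep(2);         // wait before checking the Model again
}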
UPDATE 1: So, it seems that "simultaneous" interaction with multiple users in PHP involves implementing a web server over sockets of some sort, independent of NGINX and others... making its core non-blocking I/O, storing connections, and "simply" looping over the connections, serving data...
Thus, if I'm not mistaken, the easiest way is still to go and get a ready-made solution like Ratchet, be it a 'concurrency framework' or a web server on sockets...
Too much overhead for a couple of messages a day, though...
AJAX short polling seems to be quite a solution for this dilemma....
Is simultaneously updating multiple clients easier with a different backend than PHP, I wonder? Look at C#: it's event-based, not limited to "connection threads" and the request-response life cycle, if I remember correctly... But it's still the web (over the same HTTP?)...
I want to implement a PHP chat with multiple rooms; however, I don't want each browser polling the server. Instead, I'd prefer the server to send updates to all users in each room. Ideally I would have just one PHP instance running for each room (plus, of course, the AJAX requests sent by users for updating the DB; I know Server-Sent Events are not widely supported):
users post messages using a POST AJAX request
when the PHP script of a room reads the DB and finds a new message, it sends the update to ALL the users connected to that room; this way it is more responsive and puts less pressure on the DB
So basically, if there are N users and K rooms, I want to reduce the overhead from
N database/php poll requests every while
to
K database/php poll requests every while
You might be better off using WebSockets for this purpose. If you want to use PHP, there are a few libraries for that:
1) Ratchet
2) ReactPHP
3) d-Node
and others. I have used Ratchet and React; they work fine for me.
Yes, but it will require writing your own web server: i.e. a socket server in PHP that receives the HTTP requests from clients. You then keep one array of sockets per chat room, and when you get a message you want to broadcast to all listeners, you create and send an SSE message to each client, something like:
data: {room:12,msg:"Hello World"}\n\n
(I think that by registering the socket in multiple arrays you could even use a single SSE connection to listen for messages from multiple chat rooms. So you could even have a single PHP process running all the chat rooms.)
However, if using, say, Apache+PHP, then what you want is not possible. Each SSE connection will get a dedicated PHP process. (If this is your only choice and polling the DB is really expensive, you could have a single process poll the DB, then push messages to an in-memory localhost DB, and have each PHP process poll that in-memory DB.)
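A rough sketch of the broadcasting idea (one array of client sockets per room), assuming the client sockets were already accepted by the custom socket server and the SSE/HTTP headers already sent; the room ids, variable names, and helper function are placeholders:

<?php
// Placeholder structure: room id => array of connected client stream sockets.
$rooms = [
    12 => [],   // sockets of clients listening to room 12
    13 => [],
];

// Send one chat message to every listener of a room as an SSE frame.
function broadcast(array $rooms, int $roomId, string $msg): void
{
    $frame = 'data: ' . json_encode(['room' => $roomId, 'msg' => $msg]) . "\n\n";
    foreach ($rooms[$roomId] as $clientSocket) {
        // fwrite() works on sockets obtained from stream_socket_accept()
        @fwrite($clientSocket, $frame);
    }
}

// Example: push a new message found in the DB to everyone in room 12.
broadcast($rooms, 12, 'Hello World');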
I currently have a bookmarklet that is used by many people in my department at work. The script takes a list of product IDs, or gathers a list from the current page, then visits each product detail page and gets various pieces of information. Using queries is not possible as I don't have database access. Anyway, I was thinking of building a fully fledged application for this plus other features, using either Rails or PHP/some other framework. The biggest thing is the ability to send asynchronous HTTP requests the way JavaScript does.
I found this project: http://code.google.com/p/multirequest/, but haven't looked into it yet. What is the best way to go about this?
See this library https://code.google.com/p/multirequest/
It allows you to send multiple concurrent requests with cURL from PHP 5+.
An example script is available which basically boils down to...
Define callback methods for when each request completes
Define what each request will look like (headers, etc)
Push each url to the MultiRequest_Handler class
Start the MultiRequest_Handler
At the most basic level, you can use curl_multi in PHP. I've tested up to 150 simultaneous requests. You'll want to set a timeout to make sure a single request doesn't stall the whole process. It's not asynchronous, it's simultaneous, but it's quick.
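A hedged sketch of just the timeout part, with placeholder URLs; CURLOPT_CONNECTTIMEOUT and CURLOPT_TIMEOUT are what keep a single stalled request from holding up the whole batch:

<?php
// Per-handle timeouts inside a curl_multi batch (placeholder URLs).
$urls  = ['http://example.com/a', 'http://example.com/b'];
$multi = curl_multi_init();

foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);  // give up connecting after 5 seconds
    curl_setopt($ch, CURLOPT_TIMEOUT, 15);        // abort the whole transfer after 15 seconds
    curl_multi_add_handle($multi, $ch);
}

do {
    curl_multi_exec($multi, $active);
    if ($active) {
        curl_multi_select($multi);
    }
} while ($active);
curl_multi_close($multi);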
I have a PHP script which queries a list of clients from a MySQL database, goes to each client's IP address, and picks up some information which is then displayed on the webpage.
But it takes a long time if the number of clients is high. Is there any way I can send those URL requests (file_get_contents) in parallel?
Lineke Kerckhoffs-Willems wrote a good article about multithreading in PHP with cURL. You can use that instead of file_get_contents() to get the needed information.
If this needs to scale, I would use something like Gearman to assign the requests as jobs in a queue for workers to come along and complete (a hedged sketch is below).
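A hedged sketch of the Gearman approach, assuming the pecl/gearman extension and a gearmand server on localhost; the function name fetch_client_info, the example IPs, and the /status path are placeholders, not from the original question (in practice the producer and worker would be separate scripts):

<?php
// Producer (web page or cron): queue one background job per client IP.
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);
foreach (['10.0.0.5', '10.0.0.6'] as $ip) {              // placeholder client IPs
    $client->doBackground('fetch_client_info', $ip);      // returns immediately, job is queued
}

// Worker (run several of these processes so the requests are handled in parallel):
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);
$worker->addFunction('fetch_client_info', function (GearmanJob $job) {
    $ip   = $job->workload();
    $data = file_get_contents("http://{$ip}/status");     // the slow HTTP call, now off the page request
    // store $data somewhere the page can read it later (DB, cache, file)
    return $data;
});
while ($worker->work());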
As another option, I have also written a PHP wrapper for the Unix at queue, which might be a fit for this problem. It would allow you to schedule the requests so that they can run in parallel. I have used this method successfully in the past to handle the sending of bulk email, which has similar blocking problems to your script.