How many concurrent connections can curl make to a remote server from the same server - PHP

I want to upload files to a remote server. Multiple users upload files at the same time, so is there any limitation on the number of curl connections?

There is no built-in limitation on the maximum number of curl_easy handles you can have running concurrently, nor any maximum number of curl handles in a curl_multi handle; curl could easily upload over 1000 files concurrently without problems. curl will be the last of your problems: most (probably all) operating systems have a maximum number of open sockets/file handles per process. On Windows Server 2008, for example, the limit seems to be somewhere around 16,000 handles per session, and if those limits don't bite you first, you'll probably run out of RAM long before you hit any limit in curl itself.
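For reference, here is a minimal sketch of concurrent uploads with curl_multi; the URL, the 'file' form field name and the file list are illustrative placeholders, not anything from the question:

<?php
// Hedged sketch: upload several files concurrently with curl_multi.
// $remoteUrl, the 'file' field name and $files are placeholders.
$remoteUrl = 'https://example.com/upload';
$files     = ['a.jpg', 'b.jpg', 'c.jpg'];

$mh      = curl_multi_init();
$handles = [];

foreach ($files as $path) {
    $ch = curl_init($remoteUrl);
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => ['file' => new CURLFile($path)],
        CURLOPT_RETURNTRANSFER => true,
    ]);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Drive all transfers until every upload has finished.
do {
    curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh); // wait for activity instead of busy-looping
    }
} while ($running);

foreach ($handles as $ch) {
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
?>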

Related

Apache/PHP response time and execution time

I have a web server application with Apache, PHP and MySQL on Windows Server 2008. The server also serves web pages and images.
Recently I have noticed that some users (8 out of 150) who upload images get a response time from Apache of, for example, 200 seconds, while the execution time of the PHP script is only 2 seconds. Other users are not affected, even though they're using the same script.
I know these times because I'm logging each request in a MySQL table.
To obtain the Apache response time before the execution ends I use
microtime(true) - $_SERVER["REQUEST_TIME_FLOAT"]
And to obtain the PHP execution time I use
microtime(true) - $GLOBALS["tiempo_inicio_ejecucion"];
where $GLOBALS["tiempo_inicio_ejecucion"] is another microtime(true) value that I capture at the beginning of the script execution.
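For context, a small sketch of the two measurements described (the variable name follows the question; everything else is illustrative):

<?php
// Captured at the very top of the script, as described above.
$GLOBALS['tiempo_inicio_ejecucion'] = microtime(true);

// ... the script does its work ...

// Time since Apache received the request (includes the upload itself):
$apache_time = microtime(true) - $_SERVER['REQUEST_TIME_FLOAT'];

// Time spent executing the PHP script only:
$php_time = microtime(true) - $GLOBALS['tiempo_inicio_ejecucion'];
?>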
The server load is low; CPU and RAM are far from their limits.
If I try to reproduce this behaviour by uploading files from my PC, I can't: the uploads are fast.
I suppose it is some network issue, but I can't get it solved; or maybe it is a network issue on the clients' side.
How can I know what is happening here?
Thanks in advance.
Possible suggestion: A virus checker.
Your server may have a virus checker installed which scans files automatically when they are created. This will be scanning the uploaded files. It is possible that it is running low on resources or being given a low priority by the server, and thus scans of the uploaded files are taking a long time. However, it won't release the file to the web server until the scan is complete, and so the server takes a long time to start running the PHP code.
I have no idea if this is actually your problem, but I have seen similar problems on other Windows Server boxes. It can be a very difficult problem to diagnose.

Server Sent Events Apache configuration

I'm running a web application using server-sent events (EventSource). I've been working to properly set up the Apache and PHP configuration files so that the program will accommodate all of my users and not time out. I've already set the timeout to an appropriate amount of time in both PHP and Apache, but I'm worried about ServerLimit, MaxClients, and MaxRequestsPerChild. I need to connect around 500 users to the PHP file that runs the eventsource and run a PHP script every time a message is sent to the server. The eventsource file seems to take up about 1/4 MB of RAM and a negligible amount of processing power. Can someone explain what these limits do, and advise me on how best to set them?
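For illustration, a minimal sketch of what such a PHP eventsource endpoint typically looks like; fetch_pending_message() is a hypothetical placeholder for however the application polls for new data:

<?php
// Minimal SSE endpoint sketch; fetch_pending_message() is hypothetical.
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
set_time_limit(0); // keep the connection open past the default limit

while (!connection_aborted()) {
    $message = fetch_pending_message(); // placeholder for your data source
    if ($message !== null) {
        echo 'data: ' . json_encode($message) . "\n\n";
    }
    @ob_flush();
    flush();
    sleep(2); // poll interval; tune to your latency needs
}
?>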
Each SSE connection will use a dedicated PHP process, so counts as one of the Apache processes. (Each will also be using a socket and a local port.)
500 simultaneous clients is a lot, even more so if they all use PHP, and you are going to need a lot of memory on your server. But, if you have enough memory, set both MaxClients and ServerLimit to 500. (I'd suggest starting with 50 or 100, run some stress tests, and keep increasing those limits and repeating until you see your server starting to swap.)
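As a rough sketch, assuming the prefork MPM, those directives would look something like this (the values are starting points for stress-testing, not recommendations):

<IfModule mpm_prefork_module>
    ServerLimit          500
    MaxClients           500
    MaxRequestsPerChild  10000
</IfModule>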
For stress-testing SSE, I've found SlimerJS to be the best choice. (The WebKit in PhantomJS (as of 1.9.x) is too old to support SSE.) Selenium would do the job too. Make sure to keep clients and server on different machines, as 100+ clients will also use a lot of memory and load.

PHP server timeout on script

I have a function in a WordPress plugin I'm developing that takes a lot of time.
It connects to the TMDb (movies database) and retrieves all movies one by one by ID (from 0 to 8000) and creates an XML document that is saved on the local server.
Of course it takes a lot of time, and PHP says "504 Gateway Time-out: The server didn't respond in time."
What can I do? Any suggestions?
Assuming this is a one-time execution that's bombing on you, you can set set_time_limit() to 0 and allow it to execute.
<?php
set_time_limit(0); // impose no limit
?>
However, I would make sure this is not in production and that it will only be run when you want it to (otherwise it will place, and continue to place, a large load on the server).
Try to set:
set_time_limit(0);
at the head of the script. But I think it's the server's problem: the read takes too long. Try reading in a threaded/background mode.
I think this is not related to script timeout.
A 504 Gateway Timeout problem is due to slow communication between back-end computers, possibly including the web server.
Fix:
Either use proxies or increase your cache size limit (search for "cache" in your php.ini and experiment with it).

Optimise Lighttpd for lots of connections serving small files

I have a Lighttpd(1.4.28) web server running on Centos 5.3 and PHP 5.3.6 in fastcgi mode.
The server itself is a quad core with 1 GB of RAM and is used to record viewing statistics for a video platform.
Each request consists of a very small bit of XML being posted, and the receiving PHP script performs a simple INSERT or UPDATE MySQL query. The PHP returns a very small response to acknowledge the request.
These requests are performed very frequently and I need the system to be able to handle as many concurrent connections as possible at a high rate of requests per second.
I have disabled keep alive as only single requests will be made and so I don't need to keep connections open.
One of the things that concerns me is that in server-status I am seeing a lot of connections in the 'read' state. I take it this is controlled by server.max-read-idle, which is set to 60 by default? Is it OK to change this to something like 5, as I am seeing the majority of connections being kept open for long periods of time?
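For clarity, the change I have in mind would look roughly like this in lighttpd.conf (directive names are from the 1.4.x docs; the values are just what I'm considering trying):

server.max-read-idle  = 5     # default 60; close slow/idle readers sooner
server.max-write-idle = 30    # default 360
server.max-fds        = 4096  # raise the file-descriptor ceiling for many connections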
Also, what else can I do to optimise lighttpd to be able to serve lots of small requests?
This is my first experience setting up lighttpd as I thought it would be more suitable than apache in this case.
Thanks
Irfan
I believe the problem is not in the web server, but in your PHP application, especially in the MySQL part.
I would replace lighty with Apache + mod_php, and MySQL with some NoSQL store such as Redis, which would queue the INSERT requests to the database. Then I would write a daemon / cron job that inserts the data into MySQL.
We had such a setup before, but instead of Redis we created TXT files in one directory.
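A rough sketch of that idea, assuming the phpredis extension; the Redis key, table and column names are made up for illustration:

<?php
// Web request: push the stat onto a Redis list instead of hitting MySQL directly.
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);
$redis->rPush('view_stats', json_encode($_POST));

// Separate worker (daemon or cron job): drain the queue into MySQL.
$pdo = new PDO('mysql:host=localhost;dbname=stats', 'user', 'pass');
while (($payload = $redis->lPop('view_stats')) !== false) {
    $row  = json_decode($payload, true);
    $stmt = $pdo->prepare(
        'INSERT INTO views (video_id, hits) VALUES (?, 1)
         ON DUPLICATE KEY UPDATE hits = hits + 1'
    );
    $stmt->execute([$row['video_id']]);
}
?>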

What are the limits of PHP's multi curl functions?

Are there any limits on the maximum number of concurrent connections a multi curl session can make?
I am using it to process batches of calls that I need to make to an API service; I just want to be careful that this does not affect the rest of my app.
A few questions: do curl sessions take up connections out of the number the Apache server can serve? Is multi curl a RAM- or CPU-hungry operation? I'm not concerned about bandwidth because I have lots of it, a mighty fast host, and only small amounts of data are sent and received per request.
And I imagine it depends on server hardware / config...
But I can't seem to find anything in the documentation about what limits the number of curl sessions.
PHP doesn't impose any limitations on the number of concurrent curl requests you can make. You might hit the maximum execution time or the memory limit, though. It's also possible that your host limits the number of concurrent connections you're allowed to make.
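If you do hit those limits on a long batch run, a minimal sketch of lifting them (illustrative values, not recommendations):

<?php
set_time_limit(0);               // lift max_execution_time for this run
ini_set('memory_limit', '512M'); // raise the memory ceiling if needed

// Processing the API calls in fixed-size batches (e.g. array_chunk($urls, 20))
// also keeps the number of simultaneous connections and memory use bounded.
?>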
