PHP cURL GET request slow speed

I use cURL in a custom Zend Framework library to make a GET request to a Drupal website. On the Drupal end I use REST export pages that receive the GET request and return some data.
This is my cURL request structure in ZF2:
$this->chanel = curl_init();
curl_setopt($this->chanel, CURLOPT_URL, "SOME URL LINK");
curl_setopt($this->chanel, CURLOPT_TIMEOUT, 30);
curl_setopt($this->chanel, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($this->chanel, CURLOPT_USERAGENT, 'Mozilla/5.001 (windows; U; NT4.0; en-US; rv:1.0) Gecko/25250101');
$result = curl_exec($this->chanel);
curl_close($this->chanel);
Both the Drupal and Zend Framework websites are located on my localhost.
The request normally takes around 15 seconds to execute, which is too long.
I tried the same link with a Restlet Client (Chrome Extension) and it takes around 1 second or less to execute and retrieve the data.
Do you have any suggestions why it is so slow and how I can improve the speed?

Try adding some logging to your code: put timestamps in the various code blocks and inside functions to check whether it is cURL that is taking the time or something else.
Also try the same request from the command line:
curl --get "URL HERE"
Check whether that is fast. If it is, the slowdown is in your code rather than in cURL itself, and you can execute the command directly from your code to compare. A timing sketch is shown below.
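For example, a minimal timing sketch (the URL is a placeholder) wraps curl_exec() with microtime() and reads cURL's own per-phase breakdown from curl_getinfo():
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://localhost/rest/export/page"); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$start = microtime(true);
$result = curl_exec($ch);
$elapsed = microtime(true) - $start;
$info = curl_getinfo($ch); // per-phase timings reported by cURL
curl_close($ch);
error_log(sprintf(
    "total %.3fs | dns %.3fs | connect %.3fs | first byte %.3fs",
    $elapsed,
    $info['namelookup_time'],
    $info['connect_time'],
    $info['starttransfer_time']
));
If namelookup_time dominates, the problem is DNS; if starttransfer_time dominates, the Drupal side is slow to respond.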

Try using an IP address instead of a hostname.
If your Drupal site is on the same machine as your ZF2 app, you can use 127.0.0.1.
The delay may be caused by a DNS lookup.
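If DNS is indeed the culprit, you can also keep the hostname in the URL and pin it to 127.0.0.1 with CURLOPT_RESOLVE. A sketch, assuming the Drupal site answers as drupal.local on port 80 (both the hostname and URL are placeholders):
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://drupal.local/rest/export/page"); // placeholder URL
curl_setopt($ch, CURLOPT_RESOLVE, ["drupal.local:80:127.0.0.1"]); // skip the DNS lookup
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$result = curl_exec($ch);
echo "DNS time: " . curl_getinfo($ch, CURLINFO_NAMELOOKUP_TIME) . "s\n";
curl_close($ch);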

Related

How do I get PHP to simulate a visit to a URL?

I set up a crontab to execute a php file every minute.
Now I need to create the php file but I’m clueless on what the contents should be.
All the code needs to do is visit the website url.
No need to save anything.
It just needs to mimic loading the home page just like a browser would.
That in turn triggers a chain of events which are already in place.
It is an extremely low traffic site so that’s the reason for it.
I know, I could do it with curl.
But for reasons I won’t get into, it needs to be a php file.
Can anyone point me in the right direction, please? I'm not expecting you to provide code, just direction.
Thanks!
You can use curl in PHP to just send a request to the page:
$curl_handle = curl_init();
curl_setopt($curl_handle, CURLOPT_URL, "the.url-of-the-page.here");
curl_exec($curl_handle);
curl_close($curl_handle);
You could also do it with one line (note that the whole HTML of the page is retrieved, which takes a bit longer):
file_get_contents('URL');
As Prince Dorcis stated, you could also use cURL. If the website is not yours, you may have to (or should) use cURL and send the request with a user agent (lists of user-agent strings are easy to find online):
curl_setopt($curl_handle, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.13) Gecko/20080311 Firefox/2.0.0.13');
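Putting those pieces together, a minimal self-contained script for the cron job could look like this (the URL, timeout, and user-agent string are placeholder values):
// ping.php - request the home page and discard the response
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "https://www.example.com/"); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // keep the HTML out of cron's output/mail
curl_setopt($ch, CURLOPT_TIMEOUT, 10); // don't let the cron job hang
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (compatible; cron-ping)');
curl_exec($ch);
curl_close($ch);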
Marco M is right, but there is a catch (it may not affect most setups, but sometimes it does):
file_get_contents("https://example.com");
normally does the trick (I use it more than I should), BUT:
there is a setting in php.ini (allow_url_fopen) that must be enabled for that function to be able to open URLs.
I ran into that once with a web host that had it disabled.
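If you are unsure whether your host allows it, a small sketch like this checks the setting at runtime and falls back to cURL (the URL is a placeholder):
if (filter_var(ini_get('allow_url_fopen'), FILTER_VALIDATE_BOOLEAN)) {
    $html = file_get_contents('https://example.com'); // placeholder URL
} else {
    // fall back to cURL when allow_url_fopen is disabled
    $ch = curl_init('https://example.com');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $html = curl_exec($ch);
    curl_close($ch);
}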

How to make a webapp (PHP) receive a response from an API (Python) after a long wait time?

I have a web application written in PHP, running on a Linux Azure virtual machine with NGINX. The application is connected to an API (written in Python) on a separate server with NGINX (a similar Linux Azure virtual machine). This API performs a complex operation which takes between 30 seconds and 20 minutes to complete, so the application has to wait for it.
The problem is that with long wait times, the API response is not registered in the web app. I have tried the following:
— verified at the API endpoint and in the logs that the API does provide a response after long processing times (it does)
I suspect it is a timeout issue, so I have tried:
— fixed the PHP timeout settings and the timeout for the  /login_c/check_login endpoint
— checked the code for the request and response sent to and received from the API, where I am using cURL. These are the timeout parameters for cURL:
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 0);
curl_setopt($ch, CURLOPT_TIMEOUT, 2100);
The exec method executes in the background:
exec($command);
The following articles did not provide a solution:
Setting Curl's Timeout in PHP
PHP cURL methods time out on some URLs, but command line always works
Any advice on how to solve this problem?
You must edit php.ini or add this to your PHP script:
ini_set("max_execution_time", 1800); // for a 30-minute request
It seems that this solved the problem:
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 0);
curl_setopt($ch, CURLOPT_TIMEOUT, 2100);
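For reference, a consolidated sketch of the settings discussed above (the API URL is a placeholder; the 2100-second budget is simply an example comfortably above the 20-minute worst case):
ini_set('max_execution_time', 2100); // let PHP itself wait long enough
$ch = curl_init('https://api.example.internal/long-job'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 0); // no limit on establishing the connection
curl_setopt($ch, CURLOPT_TIMEOUT, 2100); // overall transfer limit in seconds
$response = curl_exec($ch);
curl_close($ch);
Note that NGINX and PHP-FPM have their own timeouts (for example fastcgi_read_timeout and proxy_read_timeout) which may also need to be raised for very long requests.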

PHP file_get_contents behavior across versions and operating systems

I am using file_get_contents in PHP to get information from a client's collections on contentDM. CDM has an API, so you can get that info by making queries from PHP, like, say:
http://servername:port/webutilities/index.php?q=function/arguments
It has worked pretty well thus far, across computers and operating systems. However, this time things work a little differently.
http://servername/utils/collection/mycollectionname/id/myid/filename/myname
For this query I fill in mycollection, myid, and myname with relevant values. myid and mycollection have to exist in the system, obviously. However, myname can be anything you want. When you run the query, it doesn't return a web page or anything to your browser. It just automatically downloads a file with myname as the name of the file, and puts it in your local /Downloads folder.
I DON'T WISH TO DOWNLOAD THIS FILE. I just want to read the contents of the file it returns directly into PHP as a string. The file I am trying to get just contains xml data.
file_get_contents works to get the data in that file if I use it with PHP 7 and Apache on my laptop running Ubuntu. But on my desktop, which runs Windows 10 and XAMPP (Apache and PHP 5), I get this error (I've replaced sensitive data with ###):
Warning:
file_get_contents(###/utils/collection/###/id/1110/filename/1111.cpd):
failed to open stream: No such file or directory in
D:\Titus\Documents\GitHub\NativeAmericanSCArchive\NASCA-site\api\update.php
on line 18
My coworkers have been unable to help me so I am curious if anyone here can confirm or deny whether this is an operating system issue, or a PHP version issue, and whether there's a solid alternative method that is likely to work in PHP5 and on both Windows and Ubuntu.
file_get_contents() is a simple screwdriver. It's very good for simple GET requests where headers, the HTTP request method, timeouts, cookie jars, redirects, and other important things do not matter.
fopen() with a stream context, or cURL with setopt, are power drills with every bit and option you can think of.
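For example, file_get_contents() can take a stream context that covers several of those needs; a sketch with illustrative values, reusing the URL pattern from the question:
$context = stream_context_create([
    'http' => [
        'method' => 'GET',
        'header' => "Accept: application/xml\r\n",
        'timeout' => 10, // seconds
        'user_agent' => 'Mozilla/5.0 (compatible; MyScript)',
    ],
]);
$xml = file_get_contents('http://servername/utils/collection/mycollectionname/id/myid/filename/myname', false, $context);
This does not help when allow_url_fopen itself is disabled, though, which is where cURL comes in (see below).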
In addition to this, due to some recent website hacks, we had to secure our sites more. In doing so, we discovered that file_get_contents stopped working while cURL still did.
I'm not 100% sure, but I believe this php.ini setting may have been blocking the file_get_contents request:
; Disable allow_url_fopen for security reasons
allow_url_fopen = 0
Either way, our code now works with curl.
References:
http://25labs.com/alternative-for-file_get_contents-using-curl/
http://phpsec.org/projects/phpsecinfo/tests/allow_url_fopen.html
So, you can solve this problem by using the PHP cURL extension. Here is an example that does the same thing you were trying:
function curl($url)
{
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // return the body instead of printing it
    $data = curl_exec($ch);
    curl_close($ch);
    return $data;
}
$url = 'your_api_url';
$data = curl($url);
Finally, you can check your data with print_r($data). Hopefully this works for you.
Reference: http://php.net/manual/en/book.curl.php

PHP handling of unresponsive cURL

I have a PHP script that fetches data from external sites using cURL and then, after three minutes, reloads itself, fetches new data, and displays updates. It works fine, but if there is a network failure (and I presume it's cURL not getting responses), PHP just hangs without returning errors or anything. These hanging processes then need to be killed manually.
How can I deal with this situation? Tweak cURL options? Modify the PHP script so that it watches for an unresponsive cURL? Or handle everything from the browser through AJAX, including firing off a script that kills hanging PHP processes?
Solution: I've added
curl_setopt($ch, CURLOPT_FAILONERROR, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
to my cURL call and added handling for these errors to my response-checking code. Conceptually, that's all that was needed; CURLOPT_CONNECTTIMEOUT doesn't seem to be necessary because I already have reloading set up in case of errors.
It works with a manual disconnect, but I haven't seen how the script handles real-life network failures yet. It should be okay.
To handle network issues, use the CURLOPT_CONNECTTIMEOUT option to set a limit in seconds. cURL will wait at most that many seconds to connect to the target host.
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
And use the CURLOPT_TIMEOUT option to define the number of seconds you want to allow cURL for the whole operation. This is helpful if the target server doesn't release the connection.
curl_setopt($ch, CURLOPT_TIMEOUT, 30);
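With those limits in place, you can detect a timed-out or failed request explicitly instead of letting the page hang; a sketch (the URL is a placeholder):
$ch = curl_init('https://external.example.com/data'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
curl_setopt($ch, CURLOPT_TIMEOUT, 30);
$data = curl_exec($ch);
if ($data === false) {
    // a timeout reports cURL error 28; curl_error() gives a readable message
    error_log('cURL failed: [' . curl_errno($ch) . '] ' . curl_error($ch));
}
curl_close($ch);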

MJPEG Video Stream from a Security Camera using PHP and cURL

I have multiple self-hosted network security cameras (Axis 2100; they are pretty old) that I want to broadcast to a website. The cameras support live streaming in MJPG format, but in order to stream the video from these cameras, I must make them public. For security reasons, I want to restrict the viewing of these cameras through my website, where I can authenticate the users. Since the webcams are on a separate host, I'm using cURL and PHP to log in to the cameras, get the MJPG image stream, and echo the live stream back to be displayed on the web page.
header('content-type: multipart/x-mixed-replace; boundary=--myboundary');
while (@ob_end_clean()); // flush and remove all existing output buffers
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://XX.XXX.XXX.XX/axis-cgi/mjpg/video.cgi?resolution=320x240');
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_ANY | CURLAUTH_ANYSAFE );
curl_setopt($ch, CURLOPT_USERPWD, 'USERNAME:PASSWORD');
$im = curl_exec($ch);
echo $im;
curl_close($ch);
The problem is that when I have multiple sessions trying to access the same PHP page with the code above in the same browser, only one session gets loaded while the rest remain blank and eventually display a 500 Internal Server Error. It works when I open it through multiple different browsers, although with degraded performance.
Ultimately, I would like to make it a webcam server where I can make one input stream connection from the camera and broadcast it out to multiple clients.
My website is hosted by GoDaddy on a Linux server, and I'm not on a dedicated server, so I don't think I can install any fancy open-source video streaming server.
Thank You and sorry for such a long post.
-T.Ho
I had a mortal combat all night with the same problem here, and your detailed problem description helped me figure out what the problems were in my case.
nginx + PHP FastCGI on Windows must use a multiple php-cgi.exe configuration (some respawn-process problem)... but this is not the main thing (because your case is Linux...).
The main thing is:
running multiple simple 'wget httpTargetImgSite' commands from the command line does not reproduce the problem - it is OK!
So the conclusion is that the server side must be fine, and the cause must be browser-caching related.
If your PHP code is named videoproxy.php, then:
- calling http://serverip/videoproxy.php directly in the browser
- or calling some HTML file such as
<html><body><img src='videoproxy.php' /></body></html>
... will have the problem.
but this code WILL NOT:
<html><body><script>
document.write("<img src='videoproxy.php?dummy="
+(new Date().valueOf()) +"' />");
</script></body></html>
(the dummy unique number prevents the image from being cached)
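An alternative (or complement) to the client-side cache-buster is to tell the browser not to cache the proxy output at all. A sketch of headers videoproxy.php could send before any output (browser behavior varies, so this may only help alongside the trick above):
header('Cache-Control: no-store, no-cache, must-revalidate, max-age=0');
header('Pragma: no-cache');
header('Content-Type: multipart/x-mixed-replace; boundary=--myboundary');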
