fopen() very slow when used in Joomla - php

I'm writing an authentication plugin for Joomla.
The plugin calls an external site to verify the username and password.
The code works, but my problem is that when I call fopen() from the Joomla plugin, it takes a very long time (63 seconds) to respond.
When running the same code on the server (but not through Joomla), the fopen() call only takes 0.1 seconds.
Are there any Joomla settings that could make the fopen() call take so long? Or is there another function I should use instead of fopen()? (I have tried file_get_contents(), with the same result.)
Below is the code I'm using (based on this article: http://wezfurlong.org/blog/2006/nov/http-post-from-php-without-curl/). cURL is not installed, so that is not an option.
$username = "admin";
$password = "1234";
$verb = 'GET';
$url = "https://xxx.xxx.xxx/api.phtml/login";
$params = array("username" => $username, "password" => $password);
$params = http_build_query($params);
$url .= '?' . $params;
$cparams = array(
    'http' => array(
        'method'        => $verb,
        'ignore_errors' => true
    )
);
$context = stream_context_create($cparams);
$fp = fopen($url, 'rb', false, $context);
allow_url_fopen is enabled.
Joomla! Version: Joomla! 2.5.27 Stable
PHP Version: 5.2.6-1+lenny10
I have been struggling with this for three days now, so any help would be greatly appreciated!
Thanks!

Usually when we have a problem with fopen(), we switch to cURL, and that solves the problem. In any case, check the .htaccess and the local php.ini (or .user.ini) files for the Joomla site for any restrictions on fopen().
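If the cURL extension can be installed after all, a minimal equivalent of the fopen() call above would look like this (a sketch; the timeouts merely bound the 63-second hang rather than cure it):
$ch = curl_init($url); // $url already carries the query string from above
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5); // give up connecting after 5 seconds
curl_setopt($ch, CURLOPT_TIMEOUT, 10);       // total request budget of 10 seconds
$body = curl_exec($ch);
curl_close($ch);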

Related

Does Heartbleed affect third-party PHP receipt verification for iTunes?

I have working third-party PHP code that verifies receipts sent from an iPad,
but it seems https://sandbox.itunes.apple.com/verifyReceipt no longer responds to my PHP code.
There isn't even an error status like {"status":21000} if I visit the URL directly.
I've tried different approaches on the server side, like curl_exec($ch) or file_get_contents();
even this simple test returns nothing at all:
$result = file_get_contents('https://sandbox.itunes.apple.com/verifyReceipt');
echo $result;
I wonder if this is caused by Heartbleed, and what can I do?
My original working PHP code:
if ($isSandbox) {
    $endpoint = 'https://sandbox.itunes.apple.com/verifyReceipt';
} else {
    $endpoint = 'https://buy.itunes.apple.com/verifyReceipt';
}
// Connect to Apple's server and validate.
$postData = json_encode(array("receipt-data" => $receipt));
$options = array(
    'http' => array(
        'header'  => "Content-type: application/x-www-form-urlencoded",
        'method'  => 'POST',
        'content' => $postData
    ),
);
$context = stream_context_create($options);
$result = file_get_contents($endpoint, false, $context);
Well, after hours of searching and checking, I finally figured this out. Here's what happened:
1. I uploaded the test file to another server and it worked there, so this was not a problem with the PHP source.
2. I added
error_reporting(E_ALL);
ini_set('display_errors', '1');
to the PHP file and got the error: file_get_contents: Couldn't resolve host name.
3. Then I remembered I had run yum update on all the server software the day before to deal with Heartbleed, and it seems some update modified resolv.conf.
4. I changed resolv.conf to add the Google nameserver 8.8.8.8, but it still wasn't working.
5. After restarting nginx and php-fpm, the problem was solved.
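For anyone hitting the same thing, a quick sanity check from PHP itself (a sketch; gethostbyname() returns its input unchanged when resolution fails):
$host = 'sandbox.itunes.apple.com';
$ip = gethostbyname($host);
// gethostbyname() returns the hostname unchanged on failure
if ($ip === $host) {
    echo "DNS resolution failed for $host\n";
} else {
    echo "$host resolves to $ip\n";
}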

How to speed up file_get_contents()? Or is there an alternative way?

I have a website scraping project. Look at this code:
<?php
include('db.php');
$r = mysql_query("SELECT url FROM urltable");
$j = 0;
while ($row = mysql_fetch_assoc($r)) {
    $html = file_get_contents($row['url']);
    $file = fopen($j . ".txt", "w");
    fwrite($file, $html);
    fclose($file);
    ++$j;
}
?>
I have a list of URLs. This code creates one text file containing the contents (HTML) of each URL.
When running this code, I can only make about one file per second [each file is ~20 KB]. My connection provides 3 Mbps download speed, but I can't utilize that speed with this code.
How do I speed up file_get_contents()? Or how do I speed up this code using threading, php.ini configuration, or any other method?
As this was not one of the suggestions on the duplicate page, I will add it here.
Take a close look at the curl_multi page in the PHP manual.
It's not totally straightforward, but once you get it running it's very fast.
Basically, you issue multiple cURL requests and then collect the data as and when it returns. It returns in any order, so a bit of control is required.
I have used this on a data-collection process to reduce 3-4 hours of processing to 30 minutes.
The only issue could be that you swamp a site with multiple requests and the owner considers that an issue and bans your access. But with a bit of sensible sleep()'ing added to your process, you should be able to reduce that possibility to a minimum.
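A minimal sketch of that approach (variable names are illustrative; $urls stands in for the URLs pulled from urltable above):
// Fetch many URLs concurrently with curl_multi.
$mh = curl_multi_init();
$handles = array();
foreach ($urls as $i => $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);
    curl_multi_add_handle($mh, $ch);
    $handles[$i] = $ch;
}
// Drive all transfers until every handle has finished.
$running = null;
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh); // sleep until there is socket activity
} while ($running > 0);
// Collect results in the original order and clean up.
foreach ($handles as $i => $ch) {
    file_put_contents($i . ".txt", curl_multi_getcontent($ch));
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);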
You can add a few controls with streams, but cURL should be much better, if available.
$stream_options = array(
    'http' => array(
        'method'        => 'GET',
        'header'        => 'Accept-language: en',
        'timeout'       => 30,
        'ignore_errors' => true,
    )
);
$stream_context = stream_context_create($stream_options);
$fc = file_get_contents($url, false, $stream_context);

zend http client current url

I am porting an old project written in plain PHP to Zend Framework (and I am new to Zend). I am using the Zend HTTP client (with the cURL adapter) in the Zend project to replace the cURL part of the old PHP project. I got stuck because I don't know the Zend HTTP client alternative to
$landing_url = curl_getinfo($ch, CURLINFO_EFFECTIVE_URL);
which returns the URL of the landing page after any redirection during the cURL request. I was able to successfully do the following with Zend:
$config = array(
    'adapter'     => 'Zend_Http_Client_Adapter_Curl',
    'curloptions' => array(
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_FOLLOWLOCATION => true,
    ),
);
$redirecting_page_address = 'https://www.domain.tld/someredirectingurl';
$client = new Zend_Http_Client($redirecting_page_address, $config);
$response = $client->request();
and got the required page as output using $response->getBody(). Now I want to know the URL of the landing page that $redirecting_page_address redirected to. Thanks in advance.
As Cyril answered for himself, there is no high-level API in Zend Framework 2 (2.2.8, 2014-09-17) to fetch the effective URL. You can, however, fetch the underlying cURL handle and use native PHP functions on it:
// $client would be a Zend_Http_Client
$handle = $client->getAdapter()->getHandle();
$effectiveUrl = curl_getinfo($handle, CURLINFO_EFFECTIVE_URL);
If the redirect is done via headers, you can use:
if ($response->isRedirect()) {
    $newUrl = $response->getHeader('Location');
}
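If you need the final URL without touching the cURL handle at all, one option (a sketch, not part of the original answers) is to turn off CURLOPT_FOLLOWLOCATION in $config and follow the Location headers yourself:
// Follow redirects manually so the effective URL stays known.
// Assumes absolute Location headers; caps the hops to avoid loops.
$client = new Zend_Http_Client($redirecting_page_address, $config);
$finalUrl = $redirecting_page_address;
for ($hops = 0; $hops < 10; $hops++) {
    $response = $client->request();
    if (!$response->isRedirect()) {
        break;
    }
    $finalUrl = $response->getHeader('Location');
    $client->setUri($finalUrl);
}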
In my research on this, I found there is no way to do it as of the current version of Zend Framework (2.0.6); maybe this facility will be included in future versions.

Retrieving a URL through an authenticating proxy using PHP

I am attempting to script the retrieval of a secure (password protected + https) URL through a proxy server that requires authentication. I found many examples of how to use the 'proxy' option and how to send credentials with stream_context_create().
$url = "https://$server/$uri";
$cred = sprintf("Authorization: Basic %s\r\n", base64_encode("$user:$pass"));
$options['header'] = $cred;
$options['proxy'] = "tcp://10.1.1.1:3128";
$params = array('http' => $options);
$ctx = stream_context_create($params);
$fp = @fopen($url, 'rb', false, $ctx);
However, I did not find anywhere an example where a uid/password was provided to the proxy server. I attempted to add a header manually:
$prox_cred = sprintf("Proxy-Connection: Keep-Alive\r\nProxy-Authorization: Basic %s\r\n", base64_encode("$proxy_user:$proxy_pass"));
$options['header'] .= $prox_cred;
I have root access to the proxy server and I can see my request hitting it, but when I look at the traffic I don't see my headers. I guess they are being added to the portion that gets encrypted and sent to the end server.
For some reason I didn't want to complicate things by using the cURL module for the first time. However, based on the responses I tried it, and it took about 6 minutes to get things working.
if ($proxy_server != '') {
    curl_setopt($ch, CURLOPT_PROXY, $proxy_server);
    if ($proxy_user != '') {
        curl_setopt($ch, CURLOPT_PROXYAUTH, CURLAUTH_BASIC);
        curl_setopt($ch, CURLOPT_PROXYUSERPWD, "$proxy_user:$proxy_pass");
    }
}
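Putting it together, a self-contained sketch of the full request (assuming the $url, $user, $pass, and $proxy_* variables from above; all option names are standard cURL constants):
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// Credentials for the end server (HTTP basic auth)
curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_BASIC);
curl_setopt($ch, CURLOPT_USERPWD, "$user:$pass");
// Credentials for the authenticating proxy
if ($proxy_server != '') {
    curl_setopt($ch, CURLOPT_PROXY, $proxy_server);
    if ($proxy_user != '') {
        curl_setopt($ch, CURLOPT_PROXYAUTH, CURLAUTH_BASIC);
        curl_setopt($ch, CURLOPT_PROXYUSERPWD, "$proxy_user:$proxy_pass");
    }
}
$body = curl_exec($ch);
curl_close($ch);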
Thanks!

How can I add an exception?

I have a PHP site, www.test.com.
On the index page of this site, I am updating another site's (www.pgh.com) database with the following PHP code:
$url = "https://pgh.com/test.php?userID=" . $userName . "&password=" . $password;
$response = file_get_contents($url);
But now the site www.pgh.com is down, so it is also affecting my site www.test.com.
How can I add some exception handling or something similar to this code, so that my site keeps working when the other site is down?
$response = file_get_contents($url);
if ($response === false) {
    // Return an error / fall back gracefully
}
From the PHP manual
Adding a timeout:
$ctx = stream_context_create(array(
    'http' => array(
        'timeout' => 1   // seconds
    )
));
file_get_contents("http://example.com/", false, $ctx);
file_get_contents() returns false on failure.
You have two options:
1. Add a timeout to the file_get_contents() call using stream_context_create() (the manual has good examples; see the docs for the timeout option). This is not perfect, as even a one-second timeout will cause a noticeable pause when loading the page.
2. More complex but better: use a caching mechanism. Do the file_get_contents() request in a separate script that gets called frequently (e.g. every 15 minutes) using a cron job (if you have access to one), and write the result to a local file, which your actual script will read. A sketch follows below.
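A minimal sketch of that caching option (the file names and cache path are hypothetical; adjust to your setup):
// fetch_pgh.php -- run from cron, e.g. every 15 minutes
$ctx = stream_context_create(array('http' => array('timeout' => 5)));
$response = @file_get_contents($url, false, $ctx);
if ($response !== false) {
    file_put_contents('/tmp/pgh_response.cache', $response);
}

// index.php -- read the cached copy instead of hitting www.pgh.com directly
$response = @file_get_contents('/tmp/pgh_response.cache');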
