Equivalent of iframe on server side - php

Is there any way to create an "iframe-like" on the server side? The thing is, I need to access certain pages of my company's intranet from our website's administration area.
I already have a SQL link to the database that works fine, but here I would like to access the pages without duplicating the source code on the webserver.
My infrastructure is the following:
The Webserver is in a DMZ and has the following local IP: 192.168.63.10.
Our Intranet server is NOT in the DMZ and has the following IP: 192.168.1.20.
Our firewall has several rules and I've just added the following:
DMZ->LAN Allow HTTP/HTTPS traffic and LAN->DMZ Allow HTTP/HTTPS (just as we've done for the SQL redirection)
I've tried the following PHP function:
$ch = curl_init();
// set URL and other appropriate options (also tried with IP adress instead of domain)
curl_setopt($ch, CURLOPT_URL, "http://intranet.socname.ch/");
curl_setopt($ch, CURLOPT_HEADER, 0);
// grab URL and pass it to the browser
curl_exec($ch);
// close cURL resource, and free up system resources
curl_close($ch);
I've also tried:
$page = file_get_contents('http://192.168.1.20/');
echo $page;
Or:
header('Location:http://192.168.1.20');
But in all those cases, it works fine locally but not from the internet. From the internet, it doesn't load and, after a while, says that the server isn't responding.
Thanks for your help !

Your first and second solutions could work. Can your webserver access 192.168.1.20 (try ping 192.168.1.20 on your webserver) or resolve the hostname intranet.socname.ch (try nslookup intranet.socname.ch)?
What you're looking for is called a "proxy"; here is a simple PHP project that I found:
https://github.com/Alexxz/Simple-php-proxy-script
Download the repo, copy example.simple-php-proxy_config.php to simple-php-proxy_config.php and change $dest_host = "intranet.socname.ch";
It should do the trick! (you may also need to change $proxy_base_url)
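If you prefer not to pull in the whole project, a minimal sketch of the same idea follows (the filename proxy.php, the query parameter and the timeout are my assumptions, and unlike the real script it does not rewrite cookies, headers or links):
<?php
// proxy.php - minimal server-side "iframe": the webserver fetches the intranet
// page and relays it, so the visitor's browser never has to reach 192.168.1.20.
$path = isset($_GET['path']) ? $_GET['path'] : '/';
$ch = curl_init('http://intranet.socname.ch' . $path);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);   // follow intranet-side redirects
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);      // fail fast if the LAN host is unreachable
$body = curl_exec($ch);
if ($body === false) {
    http_response_code(502);
    echo 'Proxy error: ' . curl_error($ch);
} else {
    $type = curl_getinfo($ch, CURLINFO_CONTENT_TYPE);
    if ($type) {
        header('Content-Type: ' . $type);
    }
    echo $body;
}
curl_close($ch);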

Related

PHP CURL sometimes work and sometimes not

I have a weird scenario here. I have 3 servers:
1.) http://my-server1/test
--> This server URL will only return a JSON object "test"
2.) http://my-server2/get_request
--> This URL will send a request via the PHP cURL method
3.) http://mylocal-machine-server/get_request
--> The same as my server 2, except that it runs on my local machine via XAMPP
The get_request method on both the second and third servers has the following simple code to test cURL:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://www.google.com');
curl_exec($ch);
Both servers executed the request successfully and the content of google.com was displayed. Now, I changed the URL from google.com to my server 1 URL in the get_request method for both server 2 and my local server, so it looks like this now:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://my-server1/test');
curl_exec($ch);
I ran the get_request method on both the second server and my local server. The get_request on my local server was able to get the "test" JSON object. However, the get_request on my second server took a while to load, and when it finished loading, it didn't display anything.
I found the culprit. The IP address of my second server is not whitelisted on the firewall of my first server (the URL where I get the data). The second server timed out because it did not yet have access to the first server. I only realized this when I used curl_error as suggested by greeflas, and found that the error was a connection timeout.
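For reference, a minimal way to surface that kind of failure (the URL is the placeholder from the question; the timeout value is arbitrary) is to check curl_errno/curl_error after the call:
<?php
$ch = curl_init('http://my-server1/test');        // placeholder URL from the question
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);     // give up quickly instead of hanging
$result = curl_exec($ch);
if ($result === false) {
    // e.g. "Connection timed out" when the caller's IP is not whitelisted
    echo 'cURL error (' . curl_errno($ch) . '): ' . curl_error($ch);
} else {
    echo $result;
}
curl_close($ch);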

Godaddy fails to make connection with remote server via curl

OK, I am having a hard time simply trying to fetch content from our company's proprietary server from a site hosted on GoDaddy. Originally I was using file_get_contents, then I searched all over SO and realized cURL was a better option to get around security and configuration issues. Here is my code:
function get_content($URL){
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_URL, $URL);
    $data = curl_exec($ch);
    if (curl_errno($ch)) {
        echo 'Curl error: ' . curl_error($ch);
    }
    curl_close($ch);
    return $data;
}
echo 'curl:' . get_content('https://xxx-xxxxx:4032/test2.html');
Here is the error:
Curl error: Failed to connect to xxx-xxxxx.com port 4032: Connection refused
Here are some facts:
If I enter the URL into my browser, I am able to retrieve test2.html
If I execute the EXACT same script on a different web host (Lunar Pages), then it works perfectly fine
get_content() will work on google.com
Go Daddy representatives cannot help us
On our server, we've disabled the firewall (while we tested this)
I would have posted this as a comment, but I don't have enough reputation to do that. GoDaddy is one of the worst hosts for custom code. Sure, they're fine for things like WordPress, but if you want custom functionality in your code, they're one of the worst.
This is just an example: GoDaddy blocks most file_get_contents and cURL calls at their firewall. I would go with a host like HostGator or DigitalOcean; both are cheap but nowhere near as limiting.
Before making a switch, I would try to run this same code on another environment locally and make sure you can connect.
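One way to compare the two environments is to capture cURL's verbose log, which shows whether the TCP connection to the non-standard port is refused, times out, or succeeds (the URL is the masked one from the question; this is just a diagnostic sketch):
<?php
$log = fopen('php://temp', 'w+');                  // capture the verbose output in memory
$ch = curl_init('https://xxx-xxxxx:4032/test2.html');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_VERBOSE, true);
curl_setopt($ch, CURLOPT_STDERR, $log);
curl_exec($ch);
rewind($log);
echo '<pre>' . htmlspecialchars(stream_get_contents($log)) . '</pre>';
fclose($log);
curl_close($ch);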

Ubuntu LAMP - Proxy - Firewall - Joomla curl and fsockopen won't work

OK, I have been struggling with this for a couple of days now.
I have a Joomla installation on a local machine on our network for our intranet; JomSocial is also installed.
The problem is that when I go to the site configuration, edit an event, or navigate to any Joomla module that calls an external API, I get either
CURL error : 7 Failed to connect to maps.google.com port 80: Connection timed out
or
Connection timed out (110)
The issue is definitely not Joomla or JomSocial, as I have other PHP applications running on the same server that also can't contact external APIs.
The server setup is:
Ubuntu 14 LTS
PHP 5.5
Apache 2.4.7
MariaDB
The server sits behind a proxy, but has full internet access from the CLI. All the necessary PHP extensions are enabled. I have set the global proxy variable in /etc/environment and in the apt config, and set the proxy variable in Joomla. My Joomla updates and component updates work fine, but no cURL or fsockopen calls are working.
I have no idea where else to look for the error. My thinking is that the www-data user might not have sufficient privileges to execute fsockopen and curl from a browser.
Any advice?
UPDATE: I have tested the site on another machine which is not on the corporate network (directly connected to the internet) and everything works. So I am pretty certain that my issue is with my machine and permissions on the network, specifically my www-data user. How can I fix this?
It appears that the http_proxy variable is not used by PHP (mod_php) even if PassEnv is used to pass it, or if it is set directly with SetEnv, even though it is displayed correctly when getenv('http_proxy') is called in a PHP script.
However, there are two ways to get it working:
Set it in the Apache envvars (/etc/apache2/envvars) as follows:
export http_proxy=http://proxy.example.com:8080/
and restart Apache.
Alternatively, put the following in the PHP files that load the application (e.g. index.php, bootstrap.php, etc.):
putenv('http_proxy=http://proxy.example.com:8080/');
Again, if you test with getenv('http_proxy') you will see that it is set correctly.
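For example, applied at the very top of the application's entry script, the second option might look like this (the proxy host and port are placeholders, and setting https_proxy as well is my assumption, since some requests may go over HTTPS):
<?php
// index.php (or bootstrap.php): set the proxy before any outbound call is made.
putenv('http_proxy=http://proxy.example.com:8080/');
putenv('https_proxy=http://proxy.example.com:8080/');
// ... the rest of the application bootstrap follows ...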
I've just had the same problem with a pretty similar setup (the only differences being MySQL instead of MariaDB, and Joomla 3.4.1) and it took me quite a while to get everything together, so I will put the list of possible stumbling blocks here:
Make sure php5-curl is installed. Joomla can use a proxy only with cURL as the transport layer.
sudo apt-get install php5-curl
I found no use in entering the proxy in the Joomla configuration. The only good it did was that the update connection would not time out but return immediately.
It is not enough to place the environment variables in /etc/apache2/envvars; you also need to use "PassEnv" in /etc/apache2/apache2.conf,
i.e. (taken from https://stackoverflow.com/a/21571588/1967646)
PassEnv http_proxy
Also, I needed to pass both HTTP_PROXY and HTTPS_PROXY, as XML lists were fetched via HTTP and files later on via HTTPS (probably update files from GitHub). Possibly you need to have these variables in lower case, but on the Joomla "PHP information" configuration page, similarly named variables show up in upper case.
I don't know whether this really made any difference, but restarting apache2 as follows seems to be the right way (instead of apache2ctl).
sudo service apache2 restart
I put together some haphazard code for testing whether cURL and PHP would work together or not; most of it comes from https://stackoverflow.com/a/1477710/1967646. I only added plenty of error reporting. Put it in a file test.php in the web folder's root directory and look at it with your favorite browser.
<?php
ini_set('display_errors', 'On');
error_reporting(E_ALL);

$url = 'http://update.joomla.org/core/list.xml';

function get_page($url, $proxy = true) {
    if ($url != '') {
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        if ($proxy) {
            curl_setopt($ch, CURLOPT_PROXY, '<enter your proxy host here>');
            curl_setopt($ch, CURLOPT_PROXYTYPE, CURLPROXY_HTTP);
            curl_setopt($ch, CURLOPT_PROXYPORT, <enter your proxy port here>);
        }
        if (! $html = curl_exec($ch)) {
            echo '<br>Last CURL error is ' . curl_error($ch) . '<br>';
        } else {
            echo '<br>CURL without error.<br>';
        }
        curl_close($ch);
        return $html;
    } else {
        echo 'Empty URL.';
    }
}

echo 'Hello, getting pages via curl:';
$html = get_page($url);
var_dump($html);
echo bin2hex($html);
echo '<br>';
var_dump(get_page($url, false));
echo '<br>done.<br>';
?>
Use this:
export http_proxy=http://your.proxy.server:port/
or pass the proxy directly to curl with -x. From man curl:
-x, --proxy <[protocol://][user:password@]proxyhost[:port]>
Use the specified HTTP proxy.
If the port number is not specified, it is assumed at port 1080.

curl_exec($ch) not executing on external domains anymore, why?

I was using cURL to scrape content from a site, and just recently my page started hanging when it reached curl_exec($ch). After some tests I noticed that it could load any other page from my own domain, but when attempting to load anything external I get a connect() timeout! error.
Here's a simplified version of what I was using:
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL,'http://www.google.com');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 0);
$contents = curl_exec ($ch);
curl_close ($ch);
echo $contents;
?>
Here's some info I have about my host from my phpinfo():
PHP Version 5.3.1
cURL support enabled
cURL Information 7.19.7
Host i686-pc-linux-gnu
I don't have access to SSH or to modify the php.ini file (however, I can read it). But is there a way to tell if something was recently set to block cURL access to external domains? Or is there something else I might have missed?
Thanks,
Dave
I'm not aware of any setting like that; it would not make much sense.
As you said you are on a remote webserver without console access, I guess that your activity has been detected by the host or, more likely, it caused issues and so they firewalled you.
A silent iptables DROP would cause this.
When scraping Google you need to use proxies for anything more than a handful of requests, and you should never abuse your webserver's primary IP if it's not your own. That's likely a breach of their TOS and could even result in legal action if they get banned from Google (which can happen).
Take a look at Google rank checker; that's a PHP script that does exactly what you want using cURL and proper IP management.
I can't think of anything other than a firewall on your side that would cause a timeout.
I'm not sure why you're getting a connect() timeout! error, but note the following line:
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 0);
If it's not set to 1, curl_exec() will output the page directly rather than returning its content into your $contents.
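In other words, a corrected version of the snippet from the question would look roughly like this (the timeout is my addition, simply to avoid hanging on blocked connections):
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://www.google.com');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);   // return the body from curl_exec()
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);  // fail instead of hanging indefinitely
$contents = curl_exec($ch);
if ($contents === false) {
    echo 'cURL error: ' . curl_error($ch);     // e.g. the connect() timeout from a firewall
}
curl_close($ch);
echo $contents;
?>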

PHP curl and file_get_contents returns the main website located on the server when I enter a valid URL without DNS resolution

When I call the following:
file_get_contents('http://whgfdw.ca');
or
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://whgfdw.ca');
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_exec($ch);
The return value is the HTML of the homepage of the main site located on the local (dedicated) webserver. I would be grateful if anyone could help me understand this; perhaps there is something I'm overlooking.
I can't use simple URL validation because http://whgfdw.ca is a perfectly fine URL; it just doesn't have a DNS entry.
My ideal functionality is to be able to catch a DNS lookup failure or a 404 or a case of no content and then act on it. Thanks!
If you got a valid response then that DNS entry exists somewhere. It may be on an internal DNS server, in the /etc/hosts file of the local server, or somewhere else in the stack, but the bottom line is that it is being resolved in some way. So the question becomes: where is the entry it resolves to configured? It's possible that there is an application set to resolve all lookups to the local server (similar to how OpenDNS and many ISPs will resolve an unresolved DNS name to their search page).
Given that it's somehow being resolved, there really isn't a way to validate it unless you compare the content of the response to some content you expect. Catching a 404 is pretty easy, and you can also set up a reverse lookup in PHP to catch unresolved names, I believe. But you need to tackle that resolution first, I should think.
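For the 404 and empty-content cases, a minimal sketch (the hostname is the one from the question) can check the HTTP status code with curl_getinfo; checkdnsrr additionally asks DNS for an A record, which may help spot names that only resolve through a local catch-all:
<?php
$host = 'whgfdw.ca';
// Ask DNS explicitly for an A record instead of trusting the local resolver chain.
if (!checkdnsrr($host, 'A')) {
    echo "No A record found for $host\n";
}
$ch = curl_init('http://' . $host);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$body = curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);   // 0 on connection failure, 404 for "not found", etc.
curl_close($ch);
if ($body === false || $status >= 400 || trim((string)$body) === '') {
    echo "Lookup failed, error status ($status), or empty content\n";
} else {
    echo $body;
}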
