Problems in dynamic creation of subdomains in PHP

Hello,
I am creating subdomains in PHP with the following code.
function subd($host, $port, $ownername, $passw, $request) {
    $sock = fsockopen('localhost', 2082);
    if (!$sock) {
        print('Socket error');
        exit();
    }
    $authstr = "$ownername:$passw";
    $pass = base64_encode($authstr);
    $in = "GET $request\r\n";
    $in .= "HTTP/1.0\r\n";
    $in .= "Host:$host\r\n";
    $in .= "Authorization: Basic $pass\r\n";
    $in .= "\r\n";
    fputs($sock, $in);
    $result = '';
    while (!feof($sock)) {
        $result .= fgets($sock, 128);
    }
    fclose($sock);
    return $result;
}
$domain="memories.mydomain.com";
$subd="abcdef";
$request ="frontend/x3/subdomain/doadddomain.html?domain=$subd&rootdomain=$domain&dir=public_html/$subd&go=Create";
$host="ftp.mydomain.com";
$port="2083";
$ownername="ownername";
$passw="my_PASSWORD";
$result=subd($host,$port,$ownername,$passw,$request);
$show = strip_tags($result);
$d="http://$subdomainname.$domain";
echo '<META HTTP-EQUIV="Refresh" Content="0; URL='.$d.'">';
I can see in cPanel that the subdomain is being created, but I want to redirect to the subdomain once creation completes. Instead of the subdomain, I am redirected to an error page:
http://abcdef.memories.mydomain.com/cgi-sys/defaultwebpage.cgi. What is the problem behind this? Why am I being redirected to this unsupported link?
Thanks

Wow, you're writing an automated front end for an automated front end. Why don't you just create the subdomain directly in the PHP script, and skip messing around with cPanel?

Are you sure this should be done as a GET request? Doing things on a server via GET that have side effects is NEVER a good idea: http://thedailywtf.com/Articles/The_Spider_of_Doom.aspx
Don't roll your own HTTP client. Use curl (good; a sketch follows below) or something like file_get_contents (not as good).
Check whether cPanel wants this done as a POST instead.
You pass in port 2083, but then ignore it completely and use a hardcoded port 2082 in the function.
Your request URL has no leading /, which makes it an invalid request line. You can get away with such paths in an HTML page, because the browser resolves href="" values against the address of the page the link is in, but you're not using a browser. You're rolling your own client.
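For reference, here is a minimal sketch of the same call made with curl instead of a hand-rolled client. The endpoint and parameters are copied from the question; port 2083 (cPanel over SSL) and the GET verb are assumptions you should verify against your cPanel version:
function create_subdomain($host, $user, $pass, $sub, $domain)
{
    $query = http_build_query(array(
        'domain'     => $sub,
        'rootdomain' => $domain,
        'dir'        => "public_html/$sub",
        'go'         => 'Create',
    ));
    // Note the leading slash on the path; curl builds the Basic auth header from CURLOPT_USERPWD.
    $ch = curl_init("https://$host:2083/frontend/x3/subdomain/doadddomain.html?$query");
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of echoing it
    curl_setopt($ch, CURLOPT_USERPWD, "$user:$pass");
    $result = curl_exec($ch);
    if ($result === false) {
        die('curl error: ' . curl_error($ch));
    }
    curl_close($ch);
    return $result;
}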

Related

Connect through HTTPS instead of HTTP

I want to use a simple API and I want to do it in a secure way.
It currently uses sockets on port 80. As far as I know, traffic on port 80 is sent in the clear, so it doesn't seem like a secure connection.
As the data to send contains a username and password, I want to use HTTPS instead of HTTP to make it secure.
I was wondering if it is as simple as just changing this line:
$headers = "POST /api/api.php HTTP/1.0\r\n";
for this other line:
$headers = "POST /api/api.php HTTPS/1.0\r\n";
and changing the port to 443.
Here is the connect function:
// api connect function
function api_connect($Username, $Password, $ParameterArray)
{
    // Create the URL to send the message.
    // The variables are set using the input from an HTML form
    $err = array();
    $url = "api.text-connect.co.uk";
    $headers = "POST /api/api.php HTTP/1.0\r\n";
    $headers .= "Host: ".$url."\r\n";
    // Create post string
    // Username and Password
    $poststring = "Username=".$Username."&";
    $poststring .= "Password=".$Password;
    // Turn the parameter array into the variables
    while (list($Key, $Value) = @each($ParameterArray))
    {
        $poststring .= "&".$Key."=".urlencode($Value);
    }
    // Finish off the headers
    $headers .= "Content-Length: ".strlen($poststring)."\r\n";
    $headers .= "Content-Type: application/x-www-form-urlencoded\r\n";
    // Open a socket
    $http = fsockopen($url, 80, $err[0], $err[1]);
    if (!$http)
    {
        echo "Connection to ".$url.":80 failed: ".$err[0]." (".$err[1].")";
        exit();
    }
    // Socket was opened successfully, post the data.
    fwrite($http, $headers."\r\n".$poststring."\r\n");
    // Read the results from the post
    $result = "";
    while (!feof($http))
    {
        $result .= fread($http, 8192);
    }
    // Close the connection
    fclose($http);
    // Strip the headers from the result
    list($resultheaders, $resultcode) = split("\r\n\r\n", $result, 2);
    return $resultcode;
}
?>
Your code has a huge number of issues regardless of whether it's using HTTP or HTTPS: implementing an HTTP client (or server) is MUCH more complicated than simply throwing some headers across a socket and then sinking the response.
What's particularly bad about this approach is that it will work some of the time, then it will fail and you won't understand why.
Start again using curl.
Done that way, you only need to change the URL (curl also implements a cookie jar, support for header injection, automatic following of redirects, routing via proxies, and verification or non-verification of SSL certificates, among other things).
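As a rough sketch, assuming the API accepts ordinary form-encoded POSTs at the path shown in the question, the whole function collapses to a few curl calls, and switching to HTTPS becomes nothing more than a change of URL scheme:
function api_connect($Username, $Password, $ParameterArray)
{
    $ParameterArray['Username'] = $Username;
    $ParameterArray['Password'] = $Password;
    $ch = curl_init('https://api.text-connect.co.uk/api/api.php');
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($ParameterArray)); // urlencodes every value
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body, with headers already stripped
    $result = curl_exec($ch);
    if ($result === false) {
        echo 'curl error: ' . curl_error($ch);
    }
    curl_close($ch);
    return $result;
}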
"I was wondering if it is as simple as..."
No, it isn't. It really, really isn't.
HTTPS is HTTP tunnelled over SSL, so you don't change the content of the HTTP request at all.
You do, however, need to perform all the SSL handshaking before you do the HTTP stuff.
SSL is crypto; it is therefore hard. Don't try reinventing this wheel. Use a library such as cURL.
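(If you are forced to stay on raw sockets, PHP's stream layer can at least do the handshake for you via the ssl:// transport, assuming your PHP build has OpenSSL support; the HTTP request you write afterwards is unchanged:)
// ssl:// makes PHP perform the TLS handshake when connecting;
// everything written to $http afterwards is plain HTTP, as before.
$http = fsockopen('ssl://api.text-connect.co.uk', 443, $errno, $errstr, 30);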
Use curl, and set CURLOPT_SSL_VERIFYPEER to false. (Bear in mind that disabling peer verification gets you past certificate errors at the cost of man-in-the-middle protection, so avoid it in production.)

PHP Async GET request works on one server, but doesn't on the other

Please see the edits at the bottom for additional information!
I have two servers. Both should be able to call each other with a GET request.
To make the request (it's more firing an event than making a request, actually) I am using this code:
function URLCallAsync($url, $params, $type = 'POST')
{
    $post_params = array();
    foreach ($params as $key => &$val) {
        if (is_array($val)) $val = implode(',', $val);
        $post_params[] = $key.'='.urlencode($val);
    }
    $post_string = implode('&', $post_params);
    $parts = parse_url($url);
    $fp = fsockopen($parts['host'],
                    isset($parts['port']) ? $parts['port'] : 80,
                    $errno, $errstr, 30);
    // Data goes in the path for a GET request
    if ('GET' == $type) $parts['path'] .= '?'.$post_string;
    $out = "$type ".$parts['path']." HTTP/1.1\r\n";
    $out .= "Host: ".$parts['host']."\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: ".strlen($post_string)."\r\n";
    $out .= "Connection: Close\r\n\r\n";
    // Data goes in the request body for a POST request
    if ('POST' == $type && isset($post_string)) $out .= $post_string;
    fwrite($fp, $out);
    fclose($fp);
}
I feed the function the exact same data (except for the URL) on both servers (I copied the calling file to test it!), but it only works in one direction!
I log the calls to that function to a file so I can investigate if something goes wrong.
Server A -> Server B works exactly as it should; the log file on server A contains the correct URL.
Server B -> Server A only prints the correct information in the log file of server B, but server A never receives the request.
What could be the reason for something like this?
edit:
Could it be the different kinds of server?
Server A is nginx, server B is Apache.
Server A also has a '~' symbol in its URL; maybe that's the problem?
The parameters of the GET request are encoded with PHP's urlencode; maybe that creates problems?
I tried around a bit, but the problem remains that the request isn't coming through to server A. Yet from a browser it works perfectly somehow (assuming I enter the correct URL with the parameters).
edit2:
If I exchange URLCallAsync with file_get_contents, it works like it should. But the problem is that file_get_contents is blocking!
So it can only be the function itself. But strangely, it works in the opposite direction :(
edit3:
The function "URLCallAsync" runs trough without error, notice or anything else.
It just isn't received by the other server.
What exactly is file_get_contents doing so different???
I got it working.
After a lot of fiddling with Wireshark I found that file_get_contents is even simpler than my async function!
It simply omits the Content-Length field completely! It just provides the "GET ..." request line and "Host".
It also uses HTTP/1.0 instead of 1.1, but that didn't change anything.
So the solution is: also sending the Content-Length header (which, looking at my code above, was the length of $post_string even though a GET sends no body) will somehow make the server reject, or stall on, the request. I don't know for sure whether it was the server that rejected the request or something else, like a firewall that detected a "malformed" request, but at least the problem is solved.
So next time you send requests, don't provide the Content-Length header if you don't need it :)
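A minimal sketch of the fix inside URLCallAsync, reusing the variables from the function above: for a GET, send only the request line, Host, and Connection headers, and omit Content-Length entirely:
// Build a body-less GET: no Content-Length, no Content-Type.
$out = "GET {$parts['path']}?{$post_string} HTTP/1.1\r\n";
$out .= "Host: {$parts['host']}\r\n";
$out .= "Connection: Close\r\n\r\n";
fwrite($fp, $out);
fclose($fp);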

How do I ping a webpage in php to make sure it's alive?

I have a list of URLs for some web pages, and I want to make sure each URL (web page) exists and has not been deleted. I want to do it in PHP.
How do I ping a web page to make sure it's alive?
$urls = array(...);
foreach ($urls as $url) {
    $headers = get_headers($url);
    if (!$headers OR strpos($headers[0], '200 OK') === FALSE) {
        // Site is down.
    }
}
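If curl is available, here is a sketch of the same check with an explicit timeout and redirect handling (the option names are standard curl constants; some servers mishandle HEAD requests, in which case drop CURLOPT_NOBODY):
function is_alive($url, $timeout = 5)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);         // HEAD request: fetch headers only
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // count a redirect target as alive
    curl_setopt($ch, CURLOPT_TIMEOUT, $timeout);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return $code >= 200 && $code < 400;
}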
Alternatively you could use ping (note that ping expects a hostname rather than a full URL, and only tells you the machine is up, not that the web server is serving pages).
$response = shell_exec('ping ' . escapeshellarg($url));
// Parse $response.
Update
You mention you want this to be scheduled. Have a look into cron jobs.
Create a PHP script that fires off an HTTP request to each URL that you want kept alive.
PHP HTTP Request
I suggest setting up a task in your operating system that runs this script every 15 minutes in order to keep these applications alive. Here's some info on running PHP from the command line in Windows.
See http://www.planet-source-code.com/vb/scripts/ShowCode.asp?lngWId=8&txtCodeId=1786 for a thorough implementation.

Why is my script taking so long to retrieve headers?

<?php
set_time_limit(0);
$errorArr = array();
if (!isset($argv[1]))
{
    array_push($errorArr, "You forgot to enter a host.");
}
if ((isset($argv[1])) AND (!filter_var($argv[1], FILTER_VALIDATE_IP)))
{
    array_push($errorArr, "The host you entered is not a valid IP address.");
}
if (!isset($argv[2]))
{
    array_push($errorArr, "You forgot to select a port.");
}
if (!empty($errorArr))
{
    echo "You have the following errors:\n";
    print_r($errorArr);
    die("Syntax is as follows: php {$argv[0]} host port\n");
}
$host = $argv[1];
$port = $argv[2];
echo ":::Connecting...\n";
$fh = fsockopen($host, $port);
if (!$fh)
{
    die(":::Connection failed.\n:::Aborting.\n");
}
echo ":::Connected!\n:::Sending headers.\n";
$header = "PROPFIND /webdav/ HTTP/1.1\r\n";
$header .= "Host: {$host}\r\n";
$header .= "User-Agent: BitKinex/3.2.3\r\n";
$header .= "Accept: */*\r\n";
$header .= "Pragma: no-cache\r\n";
$header .= "Cache-Control: no-cache\r\n";
$header .= "Depth: 1\r\n";
$header .= "Content-Length: 220\r\n";
$header .= "Content-Type: text/xml\r\n\r\n\r\n";
if (!fwrite($fh, $header))
{
    die(":::Couldn't send headers. Aborting.\n");
}
$exHeader = explode("\r\n", $header);
foreach ($exHeader as $ecHeader)
{
    echo "<<<{$ecHeader}\n";
}
echo "\n:::Retrieving syntax...\n";
while (1)
{
    while ($data = fgets($fh, 512))
    {
        echo ">>>{$data}";
        flush();
    }
}
?>
I'm working on a script to connect to WebDAV, upload a file, and disconnect. It connects and sends the headers fine, but then it takes forever to retrieve the response. At times it takes several minutes, and I can't understand why. Is it a problem in my code?
And yes, I realize there's an infinite while loop there. That's on purpose, because I haven't figured out how to know when the server is done sending information to me. So I guess that's another question, if anyone could provide insight on that.
Thanks
Your problem is that you are sending a Content-Length header with a value of 220 while not sending any content at all. The server hangs in there expecting content that never arrives...
As for your infinite loop: you don't need it at all. fgets will return false once the connection has closed. Send the Connection: close header to tell Apache to end the connection after the data has been sent; your while condition will evaluate to false when the data has been read entirely and the connection has closed, and the loop will exit. A sketch follows.
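Here is a sketch of the corrected exchange. The XML body below is a generic allprop query, an assumption since the original body isn't shown; the point is that Content-Length must match the body you actually send:
$body = '<?xml version="1.0" encoding="utf-8"?>'
      . '<D:propfind xmlns:D="DAV:"><D:allprop/></D:propfind>';
$header = "PROPFIND /webdav/ HTTP/1.1\r\n";
$header .= "Host: {$host}\r\n";
$header .= "Depth: 1\r\n";
$header .= "Content-Type: text/xml\r\n";
$header .= "Content-Length: " . strlen($body) . "\r\n"; // matches the body actually sent
$header .= "Connection: close\r\n\r\n";                 // server closes when done, so fgets() can return false
fwrite($fh, $header . $body);
while ($data = fgets($fh, 512)) { // no infinite loop needed
    echo ">>>{$data}";
}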
You might want to test it using cURL then. Try this one out: http://curl.haxx.se/mail/archive-2006-02/0000.html
That way you can see whether the problem is server-side or in your code.
WebDAV can chug if the machine you are connecting to handles lots of traffic in general, and especially lots of web traffic. The reasons are complex, but the solutions I have used in the past have primarily involved coding around the delay: either by queuing things up to wait, or by pushing them to a box that isn't under heavy load but is more directly connected to the server in question and can push the files to it by different means.
This all requires access, however; if you have control over the machines you are connecting to, you should be able to reconfigure them to give yourself priority (which may not be an option if you are connecting to a production web server). That said, I've never had to deal with this in PHP, so the problem could certainly have other causes.

php libcurl alternative

Are there any alternatives to using curl on hosts that have curl disabled?
To fetch content via HTTP, first, you can try file_get_contents; your host might not have disabled the http:// stream wrapper:
$str = file_get_contents('http://www.google.fr');
But this might be disabled (see allow_url_fopen), and sometimes it is...
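It also handles more than plain GETs: with a stream context, file_get_contents can send POST data too. A sketch, assuming allow_url_fopen is enabled (the endpoint here is hypothetical):
$context = stream_context_create(array(
    'http' => array(
        'method'  => 'POST',
        'header'  => "Content-Type: application/x-www-form-urlencoded\r\n",
        'content' => http_build_query(array('key' => 'value')),
        'timeout' => 10,
    ),
));
$str = file_get_contents('http://www.example.com/api', false, $context);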
If it's disabled, you can try using fsockopen; the example given in the manual says this (quoting):
$fp = fsockopen("www.example.com", 80, $errno, $errstr, 30);
if (!$fp) {
    echo "$errstr ($errno)<br />\n";
} else {
    $out = "GET / HTTP/1.1\r\n";
    $out .= "Host: www.example.com\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    while (!feof($fp)) {
        echo fgets($fp, 128);
    }
    fclose($fp);
}
Considering it's quite low-level, though (you are working directly with a socket, and the HTTP protocol is not that simple), using a library built on top of it will make life easier for you.
For instance, you can take a look at Snoopy; here is an example.
http://www.phpclasses.org is full of these "alternatives"; you can try this one: http://www.phpclasses.org/browse/package/3588.html
You can write a plain curl server script with PHP and place it on curl-enabled hosting; when you need curl, you make client calls to it from the curl-less machine, and it returns the data you need. It could be a strange solution, but it was helpful once.
All the answers in this thread present valid workarounds, but there is one thing you should keep in mind. Your host has, for whatever reason, deemed that making HTTP requests from your web server via PHP code is a "bad thing", and has therefore disabled (or not enabled) the curl extension. There's a really good chance that if you find a workaround and they notice it, they'll block your requests some other way. Unless political reasons force you onto this particular host, seriously consider moving your app/page elsewhere if it needs to make HTTP requests.
