I am creating subdomains dynamically through a PHP script. The subdomain is created successfully, but when I try to open it I get cPanel's default placeholder page (http://subdomain.mydomain.com/cgi-sys/defaultwebpage.cgi). After 15-20 minutes it works fine. Why is this?
My subdomain creation code is below:
function subd($host, $port, $ownername, $passw, $request) {
    // Build the Basic auth credentials from the function arguments
    $authstr = $ownername . ":" . $passw;
    $pass = base64_encode($authstr);

    $sock = fsockopen($host, $port);
    if (!$sock) {
        print('Socket error');
        exit();
    }

    // The request path and the protocol version belong on the same line
    $in  = "GET $request HTTP/1.0\r\n";
    $in .= "Host: $host\r\n";
    $in .= "Authorization: Basic $pass\r\n";
    $in .= "\r\n";
    fputs($sock, $in);

    $result = '';
    while (!feof($sock)) {
        $result .= fgets($sock, 128);
    }
    fclose($sock);
    return $result;
}
I'm calling this function from my code. The subdomain is created successfully, but it takes quite a while before it is set up properly. Where am I going wrong? My server is running Linux.
Thanks.
I presume that your web host simply needs that time to completely set up the subdomain. Looks normal to me.
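If the rest of your script needs to know when the new subdomain is actually serving your content, one option is to poll it. This is only a rough sketch, not part of the original code: the helper name, marker string, retry count and interval are all assumptions to adapt.

<?php
// Rough sketch: poll the new subdomain until it serves our own content
// instead of cPanel's placeholder page. $marker is some string known to
// appear on our own index page (an assumption, adjust as needed).
function wait_for_subdomain($url, $marker, $tries = 30, $sleepSeconds = 30) {
    for ($i = 0; $i < $tries; $i++) {
        $body = @file_get_contents($url);
        if ($body !== false && strpos($body, $marker) !== false) {
            return true; // our own page is being served now
        }
        sleep($sleepSeconds);
    }
    return false; // gave up after $tries attempts
}

// Example: wait_for_subdomain('http://subdomain.mydomain.com/', 'Welcome');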
I have a weird issue and I can't seem to find a solution or anything close to it.
Here is the thing: I have a socket script run via PHP on the command line. It accepts connections, reads data in JSON format from mobile app clients and sends an appropriate response in JSON.
Everything works fine, except that the number of connections does not go above 256.
I would like to know why that is and how I can solve it. I have been at it for so many days, but no luck!
Here is the script snippet:
<?php
date_default_timezone_set("UTC");

$server = stream_socket_server("tcp://192.168.1.77:25003", $errno, $errorMessage);
if (!$server) {
    die("$errorMessage ($errno)");
}
echo "Server started..";
echo "\r\n";

$client_socks = array();

while (true) {
    //prepare readable sockets
    $read_socks = $client_socks;
    $read_socks[] = $server;

    //start reading and use a large timeout
    $write = null;
    $except = null;
    if (!stream_select($read_socks, $write, $except, 10000)) {
        die('something went wrong while selecting');
    }

    //new client
    if (in_array($server, $read_socks)) {
        $new_client = stream_socket_accept($server);
        if ($new_client) {
            //print remote client information, ip and port number
            echo 'Connection accepted from ' . stream_socket_get_name($new_client, true);
            echo "\r\n";
            $client_socks[] = $new_client;
            echo "Now there are total " . count($client_socks) . " clients";
            echo "\r\n";
        }
        // echo stream_socket_get_name($new_client, true);

        //delete the server socket from the read sockets
        unset($read_socks[array_search($server, $read_socks)]);
    }

    $data = '';
    $res = '';

    //message from existing client
    foreach ($read_socks as $sock) {
        stream_set_timeout($sock, 1000);
        while ($resp = fread($sock, 25000)) {
            $data .= $resp;
            if (strpos($data, "\n") !== false) {
                break;
            }
        }

        $info = stream_get_meta_data($sock);
        if ($info['timed_out']) {
            unset($client_socks[array_search($sock, $client_socks)]);
            @fclose($sock);
            echo 'Connection timed out!';
            continue;
        }

        $client = stream_socket_get_name($sock, true);
        if (!$data) {
            unset($client_socks[array_search($sock, $client_socks)]);
            @fclose($sock);
            echo "$client got disconnected";
            echo "\r\n";
            continue;
        }

        //decode the request and send the response back to the client
        //(response building omitted in this snippet)
        $decode = json_decode($data);
        $encode = json_encode($res);
        fwrite($sock, $encode . "\n");
    }
}
P.S.: What I did is an extensive search on the topic; I went over articles like this one,
http://smallvoid.com/article/winnt-tcpip-max-limit.html, and two dozen others.
I have Windows 7 running this thing with WAMP 2.5, which runs PHP 5.5.12.
It's nothing to do with your code; it's a "feature" of MS Windows intended to make you buy the server edition (or upgrade to a different OS). Functionally there's no difference between the server and desktop editions of the NT kernel (apart from some different optimization tweaks); it's just a means of ensuring you comply with the terms of the licence.
I did some Google searching for this, but all I can find are separate pieces, and cPanel's interface is a little bit confusing, to be honest.
Basically, I have this script to create a files + MySQL backup:
<?php
$auth = base64_encode(":");
$domain = "";
$theme = "";
$secure = false;
$ftp = false;
$ftpserver = "";
$ftpusername = "";
$ftppassword = "";
$ftpport = "21";
$ftpdirectory = "/";

if ($secure) {
    $url = "ssl://" . $domain;
    $port = 2083;
} else {
    $url = $domain;
    $port = 2082;
}

$socket = fsockopen($url, $port);
if (!$socket) {
    exit("Failed to open socket connection.");
}

// The space in "Generate Backup" must be URL-encoded, or the request line becomes invalid
if ($ftp) {
    $params = "dest=ftp&server=$ftpserver&user=$ftpusername&pass=$ftppassword&port=$ftpport&rdir=$ftpdirectory&submit=" . urlencode("Generate Backup");
} else {
    $params = "submit=" . urlencode("Generate Backup");
}

fputs($socket, "POST /frontend/" . $theme . "/backup/dofullbackup.html?" . $params . " HTTP/1.0\r\n");
fputs($socket, "Host: $domain\r\n");
fputs($socket, "Authorization: Basic $auth\r\n");
fputs($socket, "Connection: Close\r\n");
fputs($socket, "\r\n");

while (!feof($socket)) {
    $response = fgets($socket, 4096);
    echo $response;
}
fclose($socket);
?>
I would like to run that PHP file so it backs up my data exactly like cPanel would on any regular shared hosting plan: daily, weekly and monthly. I want to do this because my website got screwed up when the hosting company's backup was not working properly.
Do I need to create separate cron jobs for daily, weekly and monthly, or can I do all of them at once?
If I input a command in the Command field, is that enough to get it working, or do I need to set up the other input fields?
What's the correct command to run my backups as explained above?
Thanks!
In that interface, you have a few different options for the command:
php -q /path/to/your/script.php
would work.
If you add #!/usr/bin/php -q to the top of your PHP script, then you can just call it directly from cron:
/path/to/your/script.php
The general answers to your questions (not having to do with that web interface) are covered here:
How to execute a php script every day
This might also be helpful, a random but pretty decent reference on crontab syntax:
http://www.thegeekstuff.com/2009/06/15-practical-crontab-examples/
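To answer the scheduling question directly: the usual approach is one cron entry per frequency, so three separate jobs for daily, weekly and monthly. A rough sketch of what the crontab lines could look like (the script path and the run times are assumptions):

# minute hour day-of-month month day-of-week  command
0 2 * * *   php -q /path/to/your/script.php   # daily at 02:00
0 3 * * 0   php -q /path/to/your/script.php   # weekly, Sundays at 03:00
0 4 1 * *   php -q /path/to/your/script.php   # monthly, on the 1st at 04:00

In cPanel's cron screen those five time fields typically map to the separate input boxes, and the command part goes into the Command field.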
I was looking for a practical way to detect file system changes. Then I found this pretty simple script by Jonathan Franzone. My problem is that it doesn't scan subfolders. Since I'm just a newbie in PHP, I would like to ask here for robust suggestions on how to solve this.
Note: I did an extensive search on the site before writing this. Many questions ask about this approach to securing a website, but there are no complete replies at all.
<?php
/**
 * File : ftpMonitor.php
 * Monitors a remote directory via FTP and emails a list of changes if any are
 * found.
 *
 * @version June 4, 2008
 * @author Jonathan Franzone
 */

// Configuration ///////////////////////////////////////////////////////////////
$host = 'ftp.domain.com';
$port = 21;
$user = 'username';
$pass = 'password';
$remote_dir = '/public_html';
$cache_file = 'ftp_cache';
$email_notify = 'your.email@gmail.com';
$email_from = 'email.from@gmail.com';

// Main Run Program ////////////////////////////////////////////////////////////

// Connect to FTP Host
$conn = ftp_connect($host, $port) or die("Could not connect to {$host}\n");

// Login
if(ftp_login($conn, $user, $pass)) {
    // Retrieve File List
    $files = ftp_nlist($conn, $remote_dir);

    // Filter out . and .. listings
    $ftpFiles = array();
    foreach($files as $file)
    {
        $thisFile = basename($file);
        if($thisFile != '.' && $thisFile != '..') {
            $ftpFiles[] = $thisFile;
        }
    }

    // Retrieve the current listing from the cache file
    $currentFiles = array();
    if(file_exists($cache_file))
    {
        // Read contents of file
        $handle = fopen($cache_file, "r");
        if($handle)
        {
            $contents = fread($handle, filesize($cache_file));
            fclose($handle);

            // Unserialize the contents
            $currentFiles = unserialize($contents);
        }
    }

    // Sort arrays before comparison
    sort($currentFiles, SORT_STRING);
    sort($ftpFiles, SORT_STRING);

    // Perform an array diff to see if there are changes
    $diff = array_diff($ftpFiles, $currentFiles);
    if(count($diff) > 0)
    {
        // Email the changes
        $msg = "<html><head><title>ftpMonitor Changes</title></head><body>" .
               "<h1>ftpMonitor Found Changes:</h1><ul>";
        foreach($diff as $file)
        {
            $msg .= "<li>{$file}</li>";
        }
        $msg .= "</ul>";
        $msg .= '<em>Script by Jonathan Franzone</em>';
        $msg .= "</body></html>";

        $headers = "MIME-Version: 1.0\r\n";
        $headers .= "Content-type: text/html; charset=iso-8859-1\r\n";
        $headers .= "To: {$email_notify}\r\n";
        $headers .= "From: {$email_from}\r\n";
        $headers .= "X-Mailer: PHP/" . phpversion();

        mail($email_notify, "ftpMonitor Changes Found", $msg, $headers);
    }

    // Write new file list out to cache
    $handle = fopen($cache_file, "w");
    fwrite($handle, serialize($ftpFiles));
    fflush($handle);
    fclose($handle);
}
else {
    echo "Could not login to {$host}\n";
}

// Close Connection
ftp_close($conn);
?>
Thanks to anyone who has a solution or at least tries to help.
EDIT: Actually, I also wanted to ask about deleting automatically, but I don't want to be too demanding. Why not, though, if it won't be too much of a pain.
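One way to make the script look into subfolders is to replace the single ftp_nlist() call with a recursive walk. The following is only a rough sketch, not part of the original script: the helper name and the ftp_chdir() directory test are assumptions.

<?php
// Rough sketch of a recursive listing helper. Using ftp_chdir() as a
// directory test is a common but slow heuristic; adjust it to your server.
function ftp_list_recursive($conn, $dir)
{
    $result = array();
    $items = ftp_nlist($conn, $dir);
    if ($items === false) {
        return $result;
    }
    foreach ($items as $item) {
        $name = basename($item);
        if ($name === '.' || $name === '..') {
            continue;
        }
        $path = rtrim($dir, '/') . '/' . $name;
        if (@ftp_chdir($conn, $path)) {
            // It is a directory: go back and recurse into it
            ftp_chdir($conn, $dir);
            $result = array_merge($result, ftp_list_recursive($conn, $path));
        } else {
            // It is a file: keep the full path so nested names stay unique
            $result[] = $path;
        }
    }
    return $result;
}

// In the main program, the ftp_nlist() call and the basename filter could then
// be replaced with: $ftpFiles = ftp_list_recursive($conn, $remote_dir);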
I've installed PHP 5.3.14 on Ubuntu Desktop 12.04,
with:
allow_url_fopen = 1
The following doesn't work:
<?php
echo file_get_contents('http://www.example.com');
The following works:
<?php
$fp = fsockopen("www.example.com", 80, $errno, $errstr, 30);
if (!$fp) {
    echo "$errstr ($errno)<br />\n";
} else {
    $out = "GET / HTTP/1.1\r\n";
    $out .= "Host: www.example.com\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    while (!feof($fp)) {
        echo fgets($fp, 128);
    }
    fclose($fp);
}
Even curl_exec() works.
I've also tried the same thing with Python, and Python could fetch web content.
I'm not using a firewall or a proxy.
But there is no problem with the local network
(192.168.1.36 is my local server machine):
<?php
echo file_get_contents('http://192.168.1.36');
Is this a configuration or installation problem?
Thanks.
One of the answers is:
You also need to check in the php.ini file whether
extension = php_openssl.dll
is enabled or not; if it is not, enable it by removing the ";" sign.
This is a similar question:
PHP file_get_contents does not work on localhost
Check the answer there and it should work.
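As a quick sanity check, you can also verify the relevant settings at runtime. A small sketch (nothing here is specific to your setup; it only prints the current configuration):

<?php
// Print the settings that affect file_get_contents() over HTTP(S)
var_dump(ini_get('allow_url_fopen'));   // should be "1"
var_dump(extension_loaded('openssl'));  // needed for https:// URLs
var_dump(stream_get_wrappers());        // "http" (and "https") should be listed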
I'm trying to use file_get_contents to retrieve the output a browser would receive from another file on the same domain.
I've moved to another server and now it always times out.
Below is a sample of what I'm trying to do.
index.php
<?php
echo file_get_contents('http://'.$_SERVER['SERVER_NAME'].'/sample.php');
?>
sample.php
<?php
echo 'test';
?>
Any ideas what might be the cause of this problem?
EDIT
Our server manager mentioned something about Apache not responding to localhost; does that perhaps ring a bell?
Thank you
Are you sure the URL is actually correct? Have you tried using $_SERVER['HTTP_HOST'] instead? On the machine that runs PHP, what does the host from the generated URL resolve to? Has your web server (Apache?) been set up to listen on the localhost interface?
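If the server cannot reach its own public hostname, a common diagnostic is to request the page via the loopback interface while keeping the original Host header, with a short timeout so the script fails fast. A rough sketch (the loopback address, path and timeout value are assumptions):

<?php
// Request sample.php over 127.0.0.1 but send the site's Host header,
// and give up after 5 seconds instead of hanging until the default timeout.
$context = stream_context_create(array(
    'http' => array(
        'timeout' => 5,
        'header'  => 'Host: ' . $_SERVER['SERVER_NAME'] . "\r\n",
    ),
));
$body = file_get_contents('http://127.0.0.1/sample.php', false, $context);
var_dump($body); // false means the loopback request failed too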
You can use fsockopen to do the same thing; it also lets you specify a timeout:
<?php
$fp = fsockopen($_SERVER['SERVER_NAME'], 80, $errno, $errstr, 30 /* timeout */);
if (!$fp) {
    echo "$errstr ($errno)<br />\n";
} else {
    $out = "GET /sample.php HTTP/1.1\r\n";
    // Send the Host header for the server we actually connected to
    $out .= "Host: " . $_SERVER['SERVER_NAME'] . "\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    while (!feof($fp)) {
        echo fgets($fp, 128);
    }
    fclose($fp);
}
?>
Check the documentation for more details:
http://php.net/manual/en/function.fsockopen.php