PHP domain-based licensing system

I tried to create a domain-based licensing system. The script should check for a license each time it runs. I'll use two domains hosted on two different servers and networks, so that if one server is down, the other can still handle the license check.
Look at the code below:
if (file_get_contents("http://domain.com/lic/ok.xml", 0, null, 0, 1) !== false
    || file_get_contents("http://domain.net/lic/ok.xml", 0, null, 0, 1) !== false) {
    echo 'All is well';
} else {
    echo 'Error, sorry!';
}
And the output:
Warning: file_get_contents(http://domain.com/lic/ok.xml) [function.file-get-contents]: failed to open stream: HTTP request failed! HTTP/1.1 404 Not Found in C:\wamp\www\ok.php on line 3
All is well
I created the license file only on domain.net; on domain.com there was no "ok.xml". That means if either server goes down, an error message like the one above will be shown. What I actually want is to display "All is well" if any of the servers contains the license file.
So how should my code be rewritten? Which function should I use? Please help me.
PS: I'm completely new here and not yet familiar with the rules of this kind of community. So please help me find my way; don't close the topic outright, help me learn the correct way to post/ask questions.

Use @file_get_contents to suppress errors.
However, doing a remote HTTP request on every page load is pretty much unacceptable. If your license server (or one of them) is slow, the site doing the requests will be slow too.
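For example, a minimal sketch of the check with warnings suppressed; the two-second stream timeout is an assumption, chosen so a dead host doesn't stall the page:
<?php
// Short timeout so an unreachable host doesn't hang the page (value is an assumption).
$context = stream_context_create(array('http' => array('timeout' => 2)));

// @ suppresses the warning when a URL fails; we only care whether we got false.
if (@file_get_contents("http://domain.com/lic/ok.xml", false, $context, 0, 1) !== false
    || @file_get_contents("http://domain.net/lic/ok.xml", false, $context, 0, 1) !== false) {
    echo 'All is well';
} else {
    echo 'Error, sorry!';
}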

Related

How to restrict website access if the user is on a remote device and not on a work computer? (work time tracker app)

I would like to make a PHP website where employees can log themselves in and out, and these logs will count as the times they started and ended their working day. I would like to allow them to do that only on their work computers, and not, for example, on their phones while they are still on the way but want to avoid "being late".
So I'm struggling with a few ideas, but none of them seems to be the right solution.
Allow using the website only from specific IP addresses. But then I realized that our IP address is dynamic, and switching to a static one costs too much in our area.
Check the user's location. But when I checked my public IP address, the reported location was completely wrong! Our building isn't even close to the area it showed.
Use a COOKIE/token on each work computer. But it's very easy to set the same cookie on your own device, and I'm not the only IT employee here.
Check the MAC address. As I read here, that's possible only in specific cases.
Block access for mobile devices. But mobile detection is based on the browser, and if the user clicks "Request Desktop Site", scripts will report a desktop computer.
Is there another method, which I can use to achieve my goal? Am I missing something?
May I bind my app for example with some other technologies that will allow me to do that? Or maybe I should try a combination of all of them?
I couldn't find any script that takes care of this. In the worst case it doesn't have to be perfectly secure, but I would like it to be at least hard, annoying, or time-consuming to try to "cheat" the system.
I would run your app in the office LAN. Nobody will be able to access it from outside, except if they can remote-desktop to an office computer or have VPN access. But since you are on the IT team, you could fix IP ranges for the office computers so that you can exclude the VPN.
In terms of security, it is in any case better to have it running in your LAN. I'm sure you've got a server somewhere, and if not, you could use a NAS (Synology offers NGINX, Apache, PHP and much more) or a little Raspberry Pi or something similar.
If you haven't got a fixed IP, you could also use DynDNS, have it mapped to a sub-domain such as company-name.dyndns.org, and then have a cron job for your PHP app that resolves the IP address from the domain name every minute (it runs quickly) and stores it in a config file, this way:
<?php
define('ALLOWED_IP_FILE', 'allowed-ips.inc.php');

$ALLOWED_DOMAINS = [
    'your-company.dyndns.org',
    'you-at-home.dyndns.org',
];

$allowed_ips = [];
foreach ($ALLOWED_DOMAINS as $allowed_domain) {
    $ip = gethostbyname($allowed_domain);
    if ($ip !== $allowed_domain) {
        // Store the IP as both key and value for easy lookup when checking validity.
        $allowed_ips[$ip] = $ip;
    } else {
        // gethostbyname() returns the unmodified name on failure.
        fprintf(STDERR, "ERROR: Could not find the IP of $allowed_domain!\n");
    }
}

$allowed_ips_export = var_export($allowed_ips, true);
$config_file_content = <<<END_OF_CONFIG
<?php
// Don't edit! This config file is generated by cron-ip-address.php.
\$ALLOWED_IPS = $allowed_ips_export;
END_OF_CONFIG;

if (file_put_contents(ALLOWED_IP_FILE, $config_file_content) === false) {
    fprintf(STDERR, 'ERROR: Could not write config file ' . ALLOWED_IP_FILE . "\n");
}
This generates a config file to include in your app. Example of the content generated when you run the script above:
<?php
// Don't edit! This config file is generated by cron-ip-address.php.
$ALLOWED_IPS = array (
    '142.250.203.99' => '142.250.203.99',
    '23.201.250.169' => '23.201.250.169',
);
Now, in your app, just include it and test for the presence of the client IP in the $ALLOWED_IPS array:
<?php
include ALLOWED_IP_FILE; // If the constant is declared in a common config file.
// include 'allowed-ips.inc.php'; // If you haven't got a common config file.

if (!isset($ALLOWED_IPS[$_SERVER['REMOTE_ADDR']])) {
    http_response_code(403);
    die('Sorry, you cannot access it from here.');
}
Ideally, if what you actually want to track is when employees are in the workplace and logged in, and for how long, it would probably be better to just track local machine logins via a domain controller; a method reachable from the internet is suboptimal for exactly the reasons you mentioned.
If you have an intranet which users cannot tunnel into but can access from their work machines, I'd say hosting your login page only inside that intranet is the easiest way to achieve what you want with the methods you suggest.
Alternatively, if employee work machines run Windows under a domain controller, you can restrict access to the Windows certificate storage, then install/push a certificate and require it to be present via your server configuration. In that case, it doesn't matter whether the website is accessible from the internet. I'm sure there are similar solutions if the work machines are not on Windows.
This admittedly old question gives some pointers in the right direction on how to require client certificates from a web server (IIS in that case).
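On the PHP side, checking for a verified certificate can be as small as the sketch below. It assumes Apache with mod_ssl, SSLVerifyClient enabled and SSLOptions +StdEnvVars configured, so that the verification result reaches PHP in $_SERVER['SSL_CLIENT_VERIFY']:
<?php
// Assumes the web server itself validates the client certificate and exposes
// the result; 'SUCCESS' is mod_ssl's value for a successfully verified cert.
if (!isset($_SERVER['SSL_CLIENT_VERIFY']) || $_SERVER['SSL_CLIENT_VERIFY'] !== 'SUCCESS') {
    http_response_code(403);
    die('A valid client certificate is required to use this application.');
}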

Get the last bitcoin price from Bitstamp

My goal is fairly simple: this is a PHP file, and I included it in my header because I want to display the last bitcoin price using bitstamp.net, not any other bitcoin exchange's prices.
<?php
function getPrice($url) {
    $decode = file_get_contents($url);
    return json_decode($decode, true);
}

$btcUSD = getPrice('https://www.bitstamp.net/api/ticker/ '); // bitstamp
$btcPrice = $btcUSD["last"];
$tempround = round($btcPrice, 2);
$btc_Display = "$" . $tempround;
?>
Well, this seems to work, but sometimes upon refreshing the page I get an error.
Warning: file_get_contents(https://www.bitstamp.net/api/ticker/ ):
failed to open stream: HTTP request failed! HTTP/1.1 400 BAD_REQUEST
in C:\xampp\htdocs\hidden\btcprice.php on line 3
The error doesn't happen often, and its timing is very random, but what does it mean and how can I prevent it?
It took me a while to even trigger the error, because I don't know what is causing it. I'm curious how to prevent it; am I leaving something out? I used a guide that got the last bitcoin price from btc-e, but I don't want to use btc-e; I have to use the Bitstamp last price.
Also, no JavaScript is allowed (or should I say, I'm trying to avoid JavaScript for this little project), and I don't understand PHP OOP stuff, so please no examples using that.
Your code is working for me. The w3.org defines status 400 as follows:
The request could not be understood by the server due to malformed
syntax. The client SHOULD NOT repeat the request without
modifications.
However, that can happen when you use a web API. Especially APIs from Bitcoin exchanges can be pretty unstable and answer with errors from time to time, in my own experience. As RobotMind already mentioned, you should put a try { ... } catch { ... } block around the getPrice call. (Note that file_get_contents raises a warning rather than an exception on HTTP errors, so also check its return value for false.)
Another option is to use cURL. This way you can easily access the status code and react accordingly if an error happens.
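A minimal cURL sketch of that idea, using the same ticker endpoint as the question (the 5-second timeout is an arbitrary choice):
<?php
$ch = curl_init('https://www.bitstamp.net/api/ticker/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_TIMEOUT, 5);           // don't hang the page if the API stalls

$body   = curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

if ($body !== false && $status === 200) {
    $ticker = json_decode($body, true);
    echo '$' . round($ticker['last'], 2);
} else {
    echo 'Price temporarily unavailable'; // react to the error instead of showing a warning
}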

How to check if the web connection is established?

Disambiguation: 'connection' doesn't mean database connection.
REAL QUESTION
Scenario 1: I am taking a weather feed from a 3rd-party site, and it's working fine. Suddenly my client complains that it's broken and showing an error, so I quickly comment out the code.
Scenario 2: Suppose I embedded Google Fonts into my website and it looks fine. Suddenly Google's services are banned in a country, and my site looks dumb.
PROBABLE SOLUTION
If I could put in a checker like:
<?php
// suppose this is the URL my site needs to connect to on loading
$connection = 'http://www.feed-from-somewhere.com/feed/my-feed123';
if ( isset( $connection ) ) {
    // show the feed from the 3rd-party site
} else {
    // do my backup plan for the feed instead
}
?>
But I can't find any way to do this. If I could do such a thing, it would be better for all such 3rd-party integrations, and the site wouldn't malfunction in the future.
Waiting for a nice solution...
Maybe you can try the fopen() function in PHP.
As described on php.net for fopen():
"If filename is of the form "scheme://...", it is assumed to be a URL
and PHP will search for a protocol handler (also known as a wrapper)
for that scheme."...
An example could be:
$connection = @fopen("http://iewuhf.com/", "r");
// note: @ is used to suppress the error; otherwise, if the connection
// fails, a warning will be displayed
if (!$connection) {
    echo 'false';
} else {
    echo 'true';
}
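One refinement worth sketching: without a timeout, a slow remote host can hang the whole page while fopen() waits. The two-second value below is an arbitrary assumption:
<?php
// Limit how long the HTTP wrapper waits before giving up (assumed value).
ini_set('default_socket_timeout', '2');

$connection = @fopen('http://www.feed-from-somewhere.com/feed/my-feed123', 'r');
if ($connection) {
    fclose($connection); // reachable: safe to show the 3rd-party feed
    echo 'true';
} else {
    echo 'false';        // unreachable: fall back to the backup plan
}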

I get an error on one host, but not on the other

Sadly, one plugin is causing me problems. It works great on one website but gives me an error on another.
This is the error:
simplexml_load_file() [function.simplexml-load-file]: http://steamcommunity.com/profiles/76561197971691194/?xml=1:1: parser error : Document is empty
XenForo_Application::handlePhpError() in Steam/ControllerPublic/Register.php at line 117
If I go there, this is the code:
// Get User Profile Data
$id = $session->get('steam_id');
$xml = simplexml_load_file("http://steamcommunity.com/profiles/{$id}/?xml=1");
if (!empty($xml)) {
    $username = $xml->steamID;
    $avatar = $xml->avatarFull;
The link is valid; you can try it yourself, for example here: steamcommunity.com/profiles/76561197971691194 or http://steamcommunity.com/profiles/76561198041253738
I really need help; this is the only thing that is blocking me from starting the website!
That probably means that your host is blocking outbound HTTP requests.
Ask them to stop that, or find a better host.
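In the meantime, a defensive sketch of the load, so an empty or blocked response fails cleanly instead of raising a parser error ($id is assumed to hold the Steam ID from the session, as in the plugin code):
<?php
// @ suppresses the parser warning; simplexml_load_file() returns false on failure.
$xml = @simplexml_load_file("http://steamcommunity.com/profiles/{$id}/?xml=1");

if ($xml === false) {
    // The request failed or the document was empty - likely outbound
    // HTTP is blocked on this host, or Steam didn't answer.
    die('Could not reach the Steam community profile API.');
}

$username = (string) $xml->steamID;
$avatar   = (string) $xml->avatarFull;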

Why is my 301 Redirect taking so long?

In a long, tiresome quest to speed up my site, I have figured out that something is wrong with the redirection: currently my index.php handles all the homepage redirections via a PHP header location 301 Moved Permanently: website.com >> website.com/en/home, website.de >> website.de/de/home, etcetera (around 20 for this multilingual website). It takes anywhere from 200ms to 6000ms to do the redirect. Check out the waterfall!
After that, the page loads in a thunderbolt's blink of an eye!
What a waste of time, wouldn't you say? What is the server doing all this time?
After careful examination, my best guess is: IT'S DOING LAUNDRY!
I am almost giving up on PHP for this!
Any and all clues to my puzzling problem are very welcome. +1
A. Given facts: Apache/2.0.54, Fedora, PHP 5.2.9. There is no database: just flat PHP files, with around 15 PHP includes that complete my page with headers and footers. YSlow grade: 92/100! Google Page Speed: 93/100! JavaScript and CSS are combined as much as possible. Cache controls seem well set too (as proven by the grades). What's missing in those 7 points out of 100: not using Keep-Alive (beyond my control on shared hosting) and not using a Content Delivery Network. I can live with those missing 7 points, but this is a major hit on speed!
B. Furthermore: I was recently given great insights over here that I should use URL rewriting via .htaccess. Point taken, BUT perhaps there is something else wrong here that I should correct before moving on to the (for me) more difficult Apache regex syntax.
C. Faster way: when I PHP-include the intended homepage instead of redirecting, everything loads fast, but the URL is not rewritten: it stays at website.com in the browser bar, whereas I wish it would become website.com/en/home after the include. Is this possible with PHP: to include a page and change the current URL too?
Conclusions: you can redirect using index.php or using .htaccess. So far in my tests (based on the genius answers below, THANKS EVERYONE!) the latter is unmatched in speed: much faster redirecting than a PHP redirect, reducing the redirect to less than the first DNS lookup.
See here how to do this correctly for a multilingual site.
Damn, I hate getting stuck with this kind of problem. You need to eliminate some variables.
First I should point out that PHP will not flush all of its own headers until you start outputting things (or, if the output_buffering ini directive is set to x bytes, until you have output x bytes). So the following script will not finish sending headers until the very end:
<?php
header('Content-Type: text/pants');
sleep(6);
header('Ding-Ding: time to put the socks in the dryer');
echo "z"; // headers are sent here
What happens to the call to en/home if you put exit; or echo "wheeeee"; exit; at the very top of that PHP script? Then what happens when you substitute it with a plain, empty file? If the PHP script with exit is slow but the plain text file is fast, the PHP interpreter is probably playing funny buggers. If you still get the delay for both, you've eliminated the actual response generation as the cause (but I'm still trying to come up with some ideas in case that happens).
Also, can you SSH into the server? If so, can you try wgetting the same page from inside the server? If you can do that without the speed problem, I would be looking at the client side. If you can't SSH in, you could try making a request from PHP, though I'm really not sure if this will work:
<?php
$context = stream_context_create(array(
    'http' => array(
        // send request headers if you need to (one per line)
        'header' => "Foo: Bar\r\nBar: Baz\r\n",
    ),
));

$start = microtime(true);
$response = file_get_contents('http://yourserver.com/', false, $context);
$end = microtime(true) - $start;
var_dump($end);

// for some bizarre reason, PHP emits this variable into the local scope.
var_dump($http_response_header);
Have you tried doing the same request from other machines, or from other places in the world? This can confirm or deny whether it's just your machine.
Another thing you can try, if it is the response generation, is a little bit of hack-profiling on the production server. I hate having to do this stuff, but sometimes your code just refuses to behave on the production server the way it behaves in your development environment or on staging. Do this to the script that generates /en/home:
<?php
// put this at the very top (dirname(__FILE__) instead of __DIR__, for PHP < 5.3)
$rqid = uniqid('', true);
$h = fopen(dirname(__FILE__).'/crap.log', 'a');
fwrite($h, $rqid.' [START] '.microtime(true).PHP_EOL);
fclose($h);

// do all that other wonderful stuff, like laundry or making a cup of tea

// put this at the very end
$h = fopen(dirname(__FILE__).'/crap.log', 'a');
fwrite($h, $rqid.' [END] '.microtime(true).PHP_EOL.PHP_EOL);
fclose($h);
Run a few requests against it, check to make sure crap.log is getting stuff written to it (check permissions!), and then you'll have some data showing whether something in your script needs to be investigated further as the cause of the slowness.
Oh, did I mention MySQL indexes? Are you doing any queries during the request? Have you added all of the proper indexes to the tables?
Steven Xu raises a good point in the comments on your question: are you sure the program you're using to generate the waterfall is giving you good info? Try installing Firebug if you haven't already, click the little Firebug icon in the bottom right of Firefox, make sure the "Net" panel is open, then re-run your request and see if the waterfall is consistent with the results you're seeing in the program you used.
Also, I know this is kind of a boneheaded suggestion and I apologise, but I think it needs to be said: your host doesn't allow SSH and only runs PHP 4? I would seriously consider another host. It may even solve this specific problem.
I will add more stuff as I think of it.
If it is indeed the headers taking ages, then your JS/CSS/HTML is irrelevant.
You can do the forwarding in .htaccess.
RewriteEngine On
RewriteRule ^$ en/home [R=301,L]
This will essentially send the same header, but it won't invoke the PHP engine first to do it :)
Update
On closer inspection, it would seem to me that it's your en/home page that is taking the long time to download.
I think Ignacio Vazquez-Abrams may have the answer: after you call header() to do the redirection, you need to call exit() to stop the PHP script's execution. Without that, the script will keep executing, sending output to the browser, until the end. Since the browser has to wait for the server-side script to end before performing the redirection, that could cause the problem.
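A minimal sketch of that fix (the target URL here just mirrors the question's example):
<?php
// Send the permanent redirect, then stop immediately so nothing else
// runs or is output after the Location header.
header('Location: http://website.com/en/home', true, 301);
exit;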
Update
Just read Alex's update and he seems to be correct. The /en/home page is where the time is.
