PHP/Ruby captive portal: first domain seems cached

I have created a captive portal with iptables.
I use the approach many people seem to use: users can always make DNS requests, and every other packet is marked 99. A mark of 99 means no internet access; without it the user does have access.
When a user visits a page, for example Stack Overflow, they get the disclaimer. They click OK, and the server executes the following rules:
`sudo /sbin/iptables -t mangle -I captivePortal 1 -m mac --mac-source {$mac} -j RETURN`;
`sudo /sbin/iptables -t mangle -I captivePortal 1 -s {$_SERVER['REMOTE_ADDR']} -j RETURN`;
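In PHP, the handler behind the OK button might look roughly like this. It is only a sketch: it assumes the web server user may run iptables through sudo without a password, and the ARP-table lookup used to obtain the MAC address is illustrative, not part of the original setup:

<?php
// Sketch of the "accept" click handler.
$ip = $_SERVER['REMOTE_ADDR'];

// Illustrative: read the client's MAC address from the ARP table.
$mac = trim((string) shell_exec('arp -n ' . escapeshellarg($ip) . ' | awk \'NR > 1 { print $3 }\''));

if ($mac !== '') {
    shell_exec('sudo /sbin/iptables -t mangle -I captivePortal 1 -m mac --mac-source '
        . escapeshellarg($mac) . ' -j RETURN');
}
shell_exec('sudo /sbin/iptables -t mangle -I captivePortal 1 -s '
    . escapeshellarg($ip) . ' -j RETURN');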
What I have tried:
Used Sinatra standalone with Thin, rendering the template with ERB. When the user reloads after authenticating, they still get the disclaimer when visiting the initial domain. When they visit another domain, they don't get the disclaimer page.
Set up Apache2 with PHP and rewrote everything in PHP. Added meta tags to prevent caching in the browser, but same result: the original domain always redirects to the disclaimer, while other sites are okay.
What I want to achieve:
Users need to click accept on the disclaimer before they can use the Wi-Fi.
Edit: reloading Apache2 does correct the problem.

Used rmtrack as described here: http://www.andybev.com/index.php/Using_iptables_and_PHP_to_create_a_captive_portal
# For each ESTABLISHED port-80 conntrack entry belonging to the client
# ($1 is the client's IP), delete the tracked connection so that existing
# HTTP connections are re-evaluated against the new iptables rules.
/usr/sbin/conntrack -L \
    | grep $1 \
    | grep ESTAB \
    | grep 'dport=80' \
    | awk \
        "{ system(\"conntrack -D --orig-src $1 --orig-dst \" \
            substr(\$6,5) \" -p tcp --orig-port-src \" substr(\$7,7) \" \
            --orig-port-dst 80\"); }"
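From PHP, that rmtrack snippet is easiest to invoke through a small wrapper script. The following is a sketch only; the wrapper path and the sudo setup are assumptions rather than part of the linked article:

<?php
// After inserting the RETURN rules, flush the client's tracked HTTP
// connections so they are re-evaluated against the new rules instead of
// continuing under the old mark.
$ip = $_SERVER['REMOTE_ADDR'];
shell_exec('sudo /usr/local/bin/rmtrack.sh ' . escapeshellarg($ip));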

Related

WordPress/PHP: create a static copy of a page

I need to make a static copy of a dynamic page, with all assets downloaded and all links converted to local ones, and save it in some tmp folder, like when you press Ctrl+S in a browser. I tried using wget with shell_exec:
shell_exec("wget -E -H -k -p http://youmightnotneedjquery.com/ 2>&1");
The problem is that it works perfectly when I run it from the console, but when I use shell_exec, I get an error:
Permission denied youmightnotneedjquery.com/index.html: No such file
or directory Cannot write to 'youmightnotneedjquery.com/index.html'
(No such file or directory).
As I understand it, there is some problem with permissions. I tried to create a separate directory
with loose permissions and www-data as the owner and specify it in the command using the -O flag, but I get an error that I can't use the -k and -O flags at the same time. So I hope to solve the permission issue, but I still have to specify the destination folder somehow. Or maybe there's a PHP solution without wget that I can use; it seems not that hard, just a lot of work.
You may try something like
shell_exec("cd some_nice_dir && wget ...")
You may also want to read up on man wget, as it has a lot to say about how -O interacts with several of the other options you specify.
What helped was using the -P flag and creating a folder owned by www-data:
shell_exec("wget -E -H -k -p http://mysite.local/ -P some-temp-folder 2>&1")

Laravel OAuth: restrict users from requesting any scope they want

When requesting an OAuth password-grant token, the user can specify the desired scope. How can one prevent a regular user from requesting an admin scope?
The snippet below exemplifies a malicious request that asks for the admin scope, although the user shouldn't have access to it.
curl -X POST \
http://a.myapiserver.com/api/oauth/token \
-F grant_type=password \
-F client_id=2 \
-F client_secret=PpMrx32Zow5OcQf491GXXT0dlEzMNuYHt6fe4Wdy \
-F username=regularuser \
-F password=strongpasss \
-F scope=admin
The problem was solved by adding a ScopeLogic middleware and attaching it to the Passport routes.
Found the solution here: https://code.i-harness.com/en/q/259c0dd
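For reference, a sketch of what such a middleware could look like. The allowed-scope lookup and the lookup of the user by the username field are assumptions for illustration, not code from the linked answer:

<?php
// app/Http/Middleware/ScopeLogic.php (sketch)
namespace App\Http\Middleware;

use Closure;
use App\User;

class ScopeLogic
{
    public function handle($request, Closure $next)
    {
        // Scopes are passed as a space-separated list in the token request.
        $requested = array_filter(explode(' ', (string) $request->input('scope', '')));

        // Illustrative: the password grant usually sends the email as "username".
        $user = User::where('email', $request->input('username'))->first();

        // Hypothetical helper returning the scopes this user may request, e.g. ['basic'].
        $allowed = $user ? $user->allowedScopes() : [];

        if (array_diff($requested, $allowed)) {
            abort(403, 'Requested scope is not permitted for this user.');
        }

        return $next($request);
    }
}

How the middleware is attached to the /oauth/token route depends on the Passport version; the linked answer covers one way of registering it on the Passport routes.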

PHP exec() works on the command line but not on the web

I'm trying to use jarun's "googler" in a PHP script in order to search YouTube and find the URL of the first result. The command I'm executing is googler --np --json -C -n 1 -w youtube.com -x <name of youtube video>, and it works perfectly on my local machine. Here is my code:
<?php
exec("googler --np --json -C -n 1 -w youtube.com -x thomas the dank engine", $results);
var_dump($results);
?>
When I execute this in the command line, it works perfectly as it should, but when I do it via a web browser or a GET request, it does not work. I am aware that it is being executed as another user. In my case, it's the user www-data, so I gave that user full sudo permissions without a password, and did the following commands:
sudo -u pi googler --np --json -C -n 1 -w youtube.com -x thomas the dank engine
as well as
su - pi -c 'googler --np --json -C -n 1 -w youtube.com -x thomas the dank engine'
neither of these worked. Does it have to do with googler? What am I doing wrong?
When adding 2>&1 to the command, I get the following error message:
stdout encoding 'ascii' detected. googler requires utf-8 to work properly. The wrong encoding may be due to a non-UTF-8 locale or an improper PYTHONIOENCODING. (For the record, your locale language is and locale encoding is ; your PYTHONIOENCODING is not set.) Please set a UTF-8 locale (e.g., en_US.UTF-8) or set PYTHONIOENCODING to utf-8.
Try putting:
putenv("PYTHONIOENCODING=utf-8");
in the script before calling exec(). googler apparently requires the locale or this environment variable to be set.
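Putting it together, a sketch of the web-facing script with a UTF-8 environment forced before the call; the locale name is an example and must be one that is installed on the system:

<?php
// Force a UTF-8 environment for the child process started by exec().
putenv('PYTHONIOENCODING=utf-8');
putenv('LC_ALL=en_US.UTF-8');

exec('googler --np --json -C -n 1 -w youtube.com -x '
    . escapeshellarg('thomas the dank engine') . ' 2>&1', $results, $status);
var_dump($status, $results);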
You may also need to remove exec from the disable_functions parameter in the php.ini used by your server module installation of PHP (which is separate from your CLI installation); some distributions and hosting setups disable it there even though it works from the CLI.

Shell script searching for a string on an *.aspx site using curl

I am trying to write a simple script which continuously checks a website for a certain string and sends an email as soon as the page gets updated and contains the string.
I have managed most of the task so far, only it doesn't work on my intended website (it works on simple ones, though).
I figured it might have to do with cookies or JavaScript.
Here's my code:
#!/bin/bash
USERNAME="username"
PASSWORD="passwd"
URL="https://tickets.fcbayern.com/internetverkauf/EventList.aspx"

echo "before"
for (( ; ; ))
do
    # Count occurrences of the target string on the page
    count=`curl -s "$URL" | grep -c "Ausverkauft"`
    if [ "$count" != "0" ]
    then
        echo "Found Text"
        sendEmail -f $USERNAME -s smtp.gmail.com:587 \
            -xu $USERNAME -xp $PASSWORD -t $USERNAME \
            -o tls=yes -u "Web page changed" \
            -m "Visit it at $URL"
    fi
    echo "ende"
    # Repeat the check every 5 seconds
    sleep 5
done
The intention is to check whether the site contains "Ausverkauft" ("sold out" on the English version); if it does, send an email; if not, repeat the check after 5 seconds.
Would be amazing if you could help me.
Testing the script on simple sites worked fine!
Thanks a lot!

What is the best way to run a shell script (with root privileges) via PHP?

I've made a simple bash script for server administration and I cannot figure out how to run it safely from a PHP page: I'd like to create a PHP administration page, but I obviously don't want to hard-code the root password anywhere. Let's make an example (this is a foo script, of course):
#!/bin/bash
touch /$1
This simple/stupid script will not work if the user who runs it has no write permission on /.
The actual script adds Apache virtual hosts, FTP users and so on...
Any ideas?
Thanks
Use
sudo /path/to/executable/file
and set up sudo so that it can execute that command as root for the current user.
Here is the manual for sudoers, the configuration file you have to modify: http://www.sudo.ws/sudo/sudoers.man.html
zerkms ALL = (ALL) NOPASSWD: /sbin/iptables -L FORWARD -n -v -x
This is an example from my /etc/sudoers. Here I allow the user zerkms to run the command /sbin/iptables -L FORWARD -n -v -x as root without being asked for a password.
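Applied to this question, the PHP side might look like the following sketch; the script path, the argument handling and the sudoers entry are examples, not part of the original answer:

<?php
// Requires a sudoers entry such as:
//   www-data ALL = (ALL) NOPASSWD: /usr/local/sbin/add-vhost.sh
// so that only this one script can be run as root by the web server user.
$name = isset($_POST['name']) ? $_POST['name'] : '';
$name = preg_replace('/[^a-z0-9._-]/i', '', $name);   // sanitize before passing to the shell

if ($name !== '') {
    exec('sudo /usr/local/sbin/add-vhost.sh ' . escapeshellarg($name), $out, $rc);
}

Whitelisting one specific script in sudoers, rather than giving www-data unrestricted sudo, keeps the attack surface small; the script itself should validate its arguments again.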
