$_SERVER['HTTP_USER_AGENT'] extract mail client - php

Currently I'm trying to create a mailing system with statistics. I send a mail containing an image that links to a file where I read $_SERVER['HTTP_USER_AGENT']. When I open the mail in Thunderbird, this is my user agent:
Mozilla/5.0 (Windows NT 6.2; WOW64; rv:17.0) Gecko/20130328 Thunderbird/17.0.5
I'd like to get the name and the version of the mailclient.
I tried to use
stristr($_SERVER['HTTP_USER_AGENT'], 'thunderbird'), but there are a lot of email clients, and an if/else structure covering all of them would get very big and would still miss some. How can I extract the name and version the easy way and get a result like THIS?

I think you are looking for http://php.net/get_browser. Note that it requires a third-party text file holding information about browsers/clients, which get_browser() can parse.
You can obtain the latest php_browscap.ini at http://tempdownloads.browserscap.com/
Edit: I checked to be sure; the file contains Thunderbird too.
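A minimal sketch of this approach: use get_browser() when the browscap directive is configured in php.ini, and otherwise fall back to parsing the trailing "Product/version" token of the UA string (an assumption that happens to work for Thunderbird and many other mail clients, not a guarantee for every one):

```php
<?php
// Sketch: extract client name and version from a User-Agent string.
function detectClient(string $ua): array
{
    // get_browser() only works when the browscap directive points at a
    // php_browscap.ini file; guard on that so we don't trigger a warning.
    if (ini_get('browscap')) {
        $info = get_browser($ua, true); // second arg true => array, not object
        if ($info !== false) {
            return ['name' => $info['browser'], 'version' => $info['version']];
        }
    }
    // Fallback: the last "Name/1.2.3" token, e.g. "Thunderbird/17.0.5".
    if (preg_match('~([A-Za-z]+)/([0-9.]+)\s*$~', $ua, $m)) {
        return ['name' => $m[1], 'version' => $m[2]];
    }
    return ['name' => 'unknown', 'version' => ''];
}

$ua = 'Mozilla/5.0 (Windows NT 6.2; WOW64; rv:17.0) Gecko/20130328 Thunderbird/17.0.5';
print_r(detectClient($ua));
```

The regex fallback is crude on purpose; the browscap database remains the more complete option for the long tail of clients.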

Why can some text not be sent via the form?

Sending the following form ends with a single character # shown in MS Edge (Microsoft Edge 42.17134.1.0, Microsoft EdgeHTML 17.17134), while the expected final result is OK!. The expected result is shown in other popular browsers. Wrapping the form in a correct HTML5 skeleton with a Doctype etc. does not help. Why can this particular text not be sent via the form? To be strict, it can, as long as I do not touch $_POST or do not include this meta tag on the page the form is sent from. I can access the POST data via php://input and everything is there in raw form, but that is not a solution.
The issue was found in my own content management framework where some tags(including meta tags) are sent via the form to the PHP script.
<?php
if (isset($_POST['test'])) {
    echo "OK!";
} else {
    echo '
    <form action="/" name="template" method="post" accept-charset="UTF-8">
        <textarea name="test"><meta http-equiv="Content-Type" content="text/html; charset=UTF-8" /></textarea>
        <input type="submit" />
    </form>
    ';
}
?>
Naturally, the source code is saved as UTF-8 as well. Please steer me away from this path if I am doing something prohibited or potentially wrong. Eventually, I want to send the complete HTML head section via the POST method.
Here are my questions:
Why can text including <meta charset="UTF-8" /> not be properly sent using the POST method? (There is no question here about processing the data in the PHP script.)
Is there any reason to look for the issue not in the source code but in the web server installation? If so, why does it work in other browsers?
Can the HTTP requests differ significantly for some reason, causing the described issue?
Where does the # come from: the web server or the Edge browser?
I am using XAMPP 7.2.4 on Windows 10.
Not everything a user submits through a form should be allowed, but no tag by itself is a reason for a browser or httpd server to refuse it.
The web server could reject some requests, especially with the mod_rewrite module loaded and improperly configured,
...but what would be the point of writing # instead of an error message?
The main fear was that the HTTP requests differ and some anti-XSS mechanism incidentally catches something, but as it turned out, the issue here was connected to the cache, the default content language, and saved cookies.
The original issue from the question is related only to the XAMPP <=> MS Edge combination and has nothing to do with PHP scripts or UTF-8 encoding. It is established practice to include the meta charset tag as a safety mechanism, but it is unexpected that this tag could influence whether the website's content is shown at all, or just #. It should be stressed that a similar issue does not appear with other tags. The simplest case in which the scenario can be reproduced uses the GET method, sending the form data even to a plain HTML file:
http://127.0.0.1/?data=%3Cmeta+http-equiv%3D%22%22+%2F%3E
Other sound Apache installations do not have this issue (I tried Apache Lounge on Windows and Apache installations on different Linux distributions). Therefore, reproducing the issue may be difficult and is potentially installation-specific.
The simplest solution is to use a different version of XAMPP or another implementation of the httpd server.
For those curious as I am: it is still not explained why the XAMPP server has this issue only with requests from MS Edge. A typical HTTP request
GET /?data=%3Cmeta+http-equiv%3D%22%22+%2F%3E HTTP/1.1
Host: 127.0.0.1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:61.0) Gecko/20100101 Firefox/61.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: pl,en-US;q=0.7,en;q=0.3
Accept-Encoding: gzip, deflate
Connection: keep-alive
Upgrade-Insecure-Requests: 1
contains nothing unconventional. There is an obvious User-Agent difference with MS Edge
GET /?data=%3Cmeta+http-equiv%3D%22%22+%2F%3E HTTP/1.1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/64.0.3282.140 Safari/537.36 Edge/17.17134
Accept-Language: pl-PL
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Upgrade-Insecure-Requests: 1
Accept-Encoding: gzip, deflate
Host: 127.0.0.1
Connection: Keep-Alive
Cookie: PHPSESSID=*********
and otherwise it differs only in the Cookie and the Accept-Language: pl-PL headers. To my surprise, it turned out that clearing the browser data solved my issue.

PHP opening link from Excel runs page three times

I'm having a strange issue, which I find difficult to summarize in a title.
First:
I have a webpage, where people need to be logged in.
I have an Excel document with links to the webpage.
The problem:
When people are logged in and they click the link in the Excel document, the webpage tells them that they are not logged in.
What I found so far:
I'm using Office on Mac and I don't have any issues.
People using Office on Windows do have issues.
I think the issue is due to sessions; that might be the reason why users aren't logged in when they should be.
I did some tests.
Every URL goes through index.php
index.php
<?php
session_start();
file_put_contents('log.txt', microtime().': SERVER '.print_r($_SERVER, true).PHP_EOL, FILE_APPEND);
exit;
Now, when I click the link from Office on Mac (NO ISSUES!!!), I get a dump of the $_SERVER variable. Two important entries:
[HTTP_USER_AGENT] => Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.103 Safari/537.36
[HTTP_COOKIE] => PHPSESSID=77lpqmdmvskv33d2ddsdlfs5q7; rememberme=1%3Ae79e92271e7e05a5ee5679b659b3cb5cbb61e60d96c158f4648960136b175164%3Accdee80c3e42705fcd7e8c234525beda86d27394653dfdfb42bdd3ec98592ca1
You can see the browser (Chrome) and the cookie, which contains a rememberme cookie for login.
Now, when I do the same by clicking on a link in Excel on Windows, I get the $_SERVER variable printed three times in the log file!
First:
[HTTP_USER_AGENT] => Microsoft Office Excel 2014
[HTTP_COOKIE] => PHPSESSID=0ivlfjf49j4b82858tstc2lmm3; PHPSESSID=tv6gs33j721d0tmm3rrjdoho45
Notice the user agent and no rememberme cookie.
Second:
[HTTP_USER_AGENT] => Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 10.0; WOW64; Trident/7.0; .NET4.0C; .NET4.0E; .NET CLR 2.0.50727; .NET CLR 3.0.30729; .NET CLR 3.5.30729; ms-office)
[HTTP_COOKIE] => PHPSESSID=0ivlfjf49j4b82858tstc2lmm3
Notice: still not the Chrome browser, and still no rememberme cookie.
Third:
[HTTP_USER_AGENT] => Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.103 Safari/537.36
[HTTP_COOKIE] => PHPSESSID=3s0hvtssghk7uomvkpb5k70tc2; rememberme=1%3Aa9bd74ad58a0d7075c27108be1adbd26ba6d18f6e8b39073152d6780131ffe70%3A643852f8636c76c0bfc4017ec7fe3eab98dd57f5bcfdf86f0e37b5ec28a0c0ef
Finally the user agent is Chrome and the rememberme cookie is set.
So it's getting to be a long story, but clicking the link in Excel on Windows does strange things. Does anyone have an idea what is happening?
OK, I found the problem. Below is an answer from superuser.com:
The URL you're using needs some more information from a cookie to
display the search results rather than the search page. Paste the URL
into a different browser (or remove your cookies) and you'll get the
same results.
Clicking a URL in Excel seems to open it in your default browser. But
that's not really true. Before opening it in your browser, Excel first
runs Microsoft Office Protocol Discovery. This uses a Windows/Internet
Explorer component to determine if the URL works. (It does not
identify itself as Internet Explorer, but as "User Agent: Microsoft
Office Existence Discovery".) And if the results are (somehow) okay
then it will open the result of that check in your default browser.
Lacking the cookies (more precisely: lacking a session), GoDaddy gives
that Internet Explorer component some redirect. And the result of that
is opened in your default browser. That's the URL you're seeing.
Most likely your default browser is not Internet Explorer? Then
pasting the URL into IE directly and clicking it, to get the cookies,
might then also make the link work from Excel. (Just for testing; it's
not a permanent solution.)
You will have more luck using a URL that does not rely on some hidden
information from a cookie, like
http://www.godaddy.com/domains/search.aspx?domainToCheck=superuser.com
Source: https://superuser.com/a/445431
So to solve this issue:
When Excel checks the link, it gets redirected to '/login' because the checking component isn't logged in, and that redirect URL is the one Excel finally opens in the real browser.
So I changed the login script: a user is no longer redirected to '/login' but stays on the same URL, and the login form is shown there if they are not logged in. Excel now opens the original URL, and if the user is logged in they see the page; if not, the login form is shown.
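A minimal sketch of that fix, with isLoggedIn() and the form markup as hypothetical stand-ins for the site's real session check and template:

```php
<?php
// Sketch: render the login form at the requested URL instead of issuing
// a 302 redirect to /login. isLoggedIn() is a stand-in, not the real check.
function isLoggedIn(array $session): bool
{
    return !empty($session['user_id']);
}

function renderPage(array $session, string $pageHtml): string
{
    if (!isLoggedIn($session)) {
        // No redirect: Excel's pre-flight check and the real browser both
        // see the original URL, so the link keeps working after login.
        return '<form method="post" action="">'
             . '<input name="user"><input type="password" name="pass">'
             . '<input type="submit" value="Log in">'
             . '</form>';
    }
    return $pageHtml;
}

echo renderPage($_SESSION ?? [], '<h1>Members-only page</h1>');
```

Because the response URL never changes, the URL that Excel's protocol discovery probes and the URL the default browser finally opens are the same one.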

Force HTTP while fetching page source with PHP

How would I force HTTP (not HTTPS) while getting the source code of http://www.youtube.com/watch?v=2YqEDdzf-nY?
I've tried using file_get_contents, but it goes to HTTPS.
There is no way, because Google forces you to use HTTPS; it will no longer accept insecure connections.
They have even started to downrank websites that are not on SSL.
As for your comment, I have done a bit more research.
Maybe it depends on the user agent; I have not had time to confirm this.
Try cURL with this User-Agent:
Mozilla/5.0 (Windows NT 6.1; WOW64; rv:24.0) Gecko/20100101
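A sketch of the suggested cURL request with an explicit User-Agent. Note that YouTube redirects HTTP to HTTPS server-side, so this controls only the request you send, not the protocol the site ultimately serves:

```php
<?php
// Sketch: fetch a URL with cURL, sending a custom User-Agent header.
function fetchWithUserAgent(string $url, string $ua)
{
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,  // return the body instead of printing it
        CURLOPT_USERAGENT      => $ua,   // the custom User-Agent header
        CURLOPT_FOLLOWLOCATION => true,  // follow any HTTP -> HTTPS redirect
        CURLOPT_TIMEOUT        => 10,    // don't hang forever
    ]);
    $body = curl_exec($ch);
    curl_close($ch);
    return $body; // string on success, false on failure
}

// Usage (network required):
// $html = fetchWithUserAgent('http://www.youtube.com/watch?v=2YqEDdzf-nY',
//     'Mozilla/5.0 (Windows NT 6.1; WOW64; rv:24.0) Gecko/20100101');
```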

OS detection if useragent value is altered

Using PHP, is it possible to detect the exact OS even if the browser's user-agent value has been altered?
Here is the case
If someone overrides the Firefox user-agent value using "general.useragent.override":
Actual: Mozilla/5.0 (X11; Ubuntu; Linux i686; rv:12.0) Gecko/20100101 Firefox/12.0
Override: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_3) AppleWebKit/534.55.3 (KHTML, like Gecko) Version/5.1.3 Safari/534.53.10
The $_SERVER['HTTP_USER_AGENT'] value will be totally fake; it is not even useful for detecting the correct operating system.
Is there any PHP solution in this situation?
Thanks.
No, it is not possible. The only information you have is that supplied by the User-agent header, and if a user wants to send false information there is nothing you can do to detect it.
You can still use JavaScript to find the screen size, but not the OS. Here is how:
<script type="text/javascript">
document.write(screen.width+'x'+screen.height);
</script>
But this can be changed by the client anyway, since it runs on the client side. On iOS there is one way: setting up a temporary mobile device management profile to verify the device, but that is a lot of work for the client, so only do it if you have to.
In most cases, though, you cannot verify that the user agent has not been modified.

Finding HTTP_REFERER empty/blank when using URL shortener site

I have a client who is trying to measure traffic to his site coming from "URL shortener" sites (tinyurl.com, bit.ly and x.co, to be specific) and to take action based on that traffic.
We had assumed that the HTTP_REFERER variable would hold the referring resource name, i.e. the shortened URL from that service. Instead, the HTTP_REFERER field is empty, and in some browsers it is not even set (NULL, I guess).
Here is an example of an attempt to enter a shortened URL that goes to my client's site and the name of the page is x.php:
HTTP_CONNECTION:keep-alive HTTP_KEEP_ALIVE:115 HTTP_ACCEPT:text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8,application/json HTTP_ACCEPT_CHARSET:ISO-8859-1,utf-8;q=0.7,*;q=0.7 HTTP_ACCEPT_ENCODING:gzip,deflate HTTP_ACCEPT_LANGUAGE:en-us,en;q=0.5 HTTP_COOKIE:ASPSESSIONDQADBDABT=HAEFPIOBONKMOIJFDGNHHEM HTTP_HOST:www.<myclientswebsite>.com HTTP_USER_AGENT:Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.8) Gecko/20100722 Firefox/3.6.8 (.NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729)
Why wouldn't the http://x.co shortened URL show up in the header info, or am I not looking in the correct place?
My client would ultimately like to redirect incoming traffic to the appropriate resource within his website AND/OR out to other sites he owns.
UPDATE: I've looked through his raw traffic logs and I can't find a specific referrer other than x.co or bit.ly, etc.; I do not see the "/" part. Is there something I can change in his IIS6 settings on his web server that would let us see and use the information he's looking for, or are we just out of luck due to the design of the HTTP redirect?
Because the shortening services respond with a 301 or 302 and a Location header, there is no referer; your browser does not pass one.
HTTP_REFERER is an optional header. In some cases it is stripped out (security software, proxies, etc.).
In the case of URL-shortening services, they probably do a header redirect and the browser simply doesn't include HTTP_REFERER.
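Since the referer cannot be relied on after a redirect, a more robust approach is to tag each shortened link with a query parameter and read that instead. A sketch, where the 'src' parameter name is an assumption (any agreed-upon name added to the links works):

```php
<?php
// Sketch: determine the traffic source without depending on HTTP_REFERER.
// Each shortened link is created with a tag, e.g. http://example.com/x.php?src=bitly
function trafficSource(array $server, array $get): string
{
    if (!empty($get['src'])) {
        return $get['src']; // explicit tag added to each shortened link
    }
    if (!empty($server['HTTP_REFERER'])) {
        // Present only when the browser chose to send a referer at all.
        return parse_url($server['HTTP_REFERER'], PHP_URL_HOST) ?: 'unknown';
    }
    return 'direct'; // no referer: the common case after a shortener redirect
}

echo trafficSource($_SERVER, $_GET);
```

With the source known, the page can then log it and redirect the visitor to the appropriate resource, as the client intends.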
