What makes an image download instead of opening in a new tab? - php

Don't know where to begin debugging this.
I have a local Apache server running a PHP backend that spits out a list of links from an API to the front end.
...
<li>
<a href="..." target="_blank">Image</a>
</li>
<li>
<a href="..." target="_blank">Image</a>
</li>
...
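Roughly, the list is rendered like this (a simplified sketch; $links and the 'url' key are placeholder names, not the real field names):
<?php
// Simplified sketch of the rendering loop; $links and 'url' are placeholders.
foreach ($links as $link) {
    printf(
        '<li><a href="%s" target="_blank">Image</a></li>' . "\n",
        htmlspecialchars($link['url'], ENT_QUOTES)
    );
}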
The links are a mix of HTTP and HTTPS. I'm having a problem with Safari in particular: it downloads the linked HTTPS images (HTTP links open fine in a new tab) instead of viewing them in a new tab.
Expected behaviour: all links with the target="_blank" attribute should open the image in a new tab in all browsers.
Actual behaviour: all links open the image in a new tab in all browsers except Safari, which downloads the jpg file instead.
cURL on the HTTP links shows a 301 redirect (these work fine in all browsers):
> GET /path/to/image1.jpg HTTP/1.1
> Host: hostpath
> User-Agent: curl/7.64.1
> Accept: */*
>
< HTTP/1.1 301 Moved Permanently
< Date: Mon, 20 Feb 2023 07:10:41 GMT
< Content-Type: text/html
< Content-Length: 178
< Connection: keep-alive
< Server: nginx
< Location: https://newpath.com/overHTTPS/image1.jpg
< Strict-Transport-Security: max-age=31536000; includeSubDomains; preload;
cURL on the HTTPS links (these open fine in a new tab in all browsers EXCEPT Safari):
> GET /path/to/image2.jpg HTTP/2
> Host: hostpath
> User-Agent: curl/7.64.1
> Accept: */*
< HTTP/2 200
< content-type: image/jpg
< content-length: 150672
< last-modified: Thu, 24 Jun 2021 10:45:06 GMT
< x-amz-version-id: null
< accept-ranges: bytes
< server: AmazonS3
< strict-transport-security: max-age=31536000; includeSubdomains; preload
< date: Mon, 20 Feb 2023 07:16:15 GMT
< etag: "62a2466dbe39f0cd92908fa096ba9011"
< x-cache: RefreshHit from cloudfront
< via: 1.1 uid.cloudfront.net (CloudFront)
< x-amz-cf-pop: -cf-pop
< x-amz-cf-id: amz-cf-id==
cURL on a totally different HTTPS origin as an experiment (works! Safari opens this jpg in a new tab just fine):
> GET /path/to/differentHTTPS/image2.jpg HTTP/2
> Host: m.media-amazon.com
> User-Agent: curl/7.64.1
> Accept: */*
>
< HTTP/2 200
< content-type: image/jpeg
< content-length: 13470
< server: Server
< date: Mon, 20 Feb 2023 07:29:44 GMT
< x-amz-ir-id: 6e4a2087-7e28-47ca-bef1-f332c0575d92
< expires: Sun, 15 Feb 2043 04:07:45 GMT
< cache-control: max-age=630720000,public
< surrogate-key: x-cache-214 /images/I/51U-ZNaX5sL
< timing-allow-origin: https://www.amazon.in, https://www.amazon.com
< edge-cache-tag: x-cache-214,/images/I/51U-ZNaX5sL
< access-control-allow-origin: *
< last-modified: Sat, 24 Jul 2021 09:53:23 GMT
< x-nginx-cache-status: HIT
< accept-ranges: bytes
< via: 1.1 uid.cloudfront.net (CloudFront)
< server-timing: provider;desc="cf"
< x-cache: Miss from cloudfront
< x-amz-cf-pop: -cf-pop
< x-amz-cf-id: cf-id==
<
For the most part, my original HTTPS origin and the test HTTPS origin return nearly identical response headers.
I thought it might be how Safari treats requests to HTTPS resources from an insecure HTTP origin (security?), so I deployed to my server, which hosts everything over HTTPS; still the exact same problem. Safari just will not open a .jpg from this external HTTPS origin in a new tab; it always downloads it.
I swapped in a totally different HTTPS link to an image, and it WORKS: it opens the image in a new tab and DOESN'T download. Just not from the other HTTPS source.
The request headers from all browsers accept image/*.
Any ideas on how I can dig through this? Not sure what else I can try!
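One experiment I can think of (a sketch; proxy.php is a made-up name, and the URL is just the Location value from the 301 above) is to fetch the problem image server-side and re-serve it with rewritten headers, to see whether Safari's behaviour follows the headers. Note that in the dumps above the failing origin sends content-type: image/jpg while the working one sends image/jpeg.
<?php
// proxy.php -- hypothetical diagnostic: fetch the problematic image and re-serve
// it with an explicit image/jpeg type and an inline disposition, so Safari's
// reaction to different headers can be compared. Buffers the whole image in
// memory, which is fine for a ~150 KB jpg.
$url = 'https://newpath.com/overHTTPS/image1.jpg'; // placeholder, copied from the 301 Location

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$body = curl_exec($ch);
curl_close($ch);

header('Content-Type: image/jpeg');     // working origin sends image/jpeg, failing one image/jpg
header('Content-Disposition: inline');  // explicitly ask the browser to display, not download
header('Content-Length: ' . strlen($body));
echo $body;
If Safari displays the proxied copy inline, the difference lies in the response headers rather than in anything else about that origin.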

Related

Proper HTTP Headers to Play AIFF File in Browser using PHP

I'm trying to put some AIFF audio files behind a login wall on a PHP site (i.e. out of web root). The first challenge is that AIFFs are not supported in all browsers, but that's expected (see http://www.jplayer.org/HTML5.Audio.Support/). For now I'm using Safari to test because it supports AIFFs.
What I can't figure out is why Safari treats the 2 versions of the same file differently. For the direct file, it cues up the player and it works. For the streamed file, the player doesn't work.
Regular Download
Here are what the headers look like when I download the file directly (i.e. if I temporarily put the file into web root for testing):
curl -v http://audio.app/Morse.aiff -o /dev/null
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0* Trying 192.168.10.10...
* Connected to audio.app (192.168.10.10) port 80 (#0)
> GET /Morse.aiff HTTP/1.1
> Host: audio.app
> User-Agent: curl/7.49.1
> Accept: */*
>
< HTTP/1.1 200 OK
< Server: nginx/1.8.0
< Date: Sun, 06 Nov 2016 03:19:03 GMT
< Content-Type: application/octet-stream
< Content-Length: 55530
< Last-Modified: Sat, 05 Nov 2016 21:51:02 GMT
< Connection: keep-alive
< ETag: "581e5446-d8ea"
< Accept-Ranges: bytes
<
{ [5537 bytes data]
100 55530 100 55530 0 0 8991k 0 --:--:-- --:--:-- --:--:-- 10.5M
* Connection #0 to host audio.app left intact
Through PHP
And here are the headers when I stream the file through my PHP script (named source.php):
curl -v http://audio.app/source.php?file=Morse.aiff -o /dev/null
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0* Trying 192.168.10.10...
* Connected to audio.app (192.168.10.10) port 80 (#0)
> GET /source.php?file=Morse.aiff HTTP/1.1
> Host: audio.app
> User-Agent: curl/7.49.1
> Accept: */*
>
< HTTP/1.1 200 OK
< Server: nginx/1.8.0
< Date: Sun, 06 Nov 2016 03:36:46 GMT
< Content-Type: application/octet-stream
< Content-Length: 55530
< Connection: keep-alive
< Last-Modified: Sat, 05 Nov 2016 21:51:02 GMT
< ETag: "581e5446T-d8eaO"
< Accept-Ranges: bytes
<
{ [8431 bytes data]
100 55530 100 55530 0 0 4915k 0 --:--:-- --:--:-- --:--:-- 5422k
* Connection #0 to host audio.app left intact
The headers are almost identical -- the only difference I can make out is the order of them and the hashing algorithm my local dev box is using for the ETag value.
Here is the test PHP script (named source.php) that I'm using to stream the same file (located above webroot):
// Adapted from http://php.net/manual/en/function.readfile.php
$filename = (isset($_GET['file'])) ? $_GET['file'] : null;
// <do sanitization here>
$file = dirname(dirname(__FILE__)) . '/audio/' . $filename;
// Mimicking the AIFF headers seen in the curl output (does not work!)
$content_length = filesize($file);
$last_modified = gmdate("D, d M Y H:i:s", filemtime($file)) . ' GMT'; // note: $file, not $filename, and gmdate() for GMT
header("HTTP/1.1 200 OK");
header("Content-Type: application/octet-stream");
header('Content-Length: ' . $content_length);
header('Last-Modified: ' . $last_modified);
// Attempts to mimic nginx's ETag format... md5_file() would probably work too
$etag = sprintf("\"%xT-%xO\"", filemtime($file), $content_length);
header("ETag: $etag"); // quoting it exactly
header("Accept-Ranges: bytes");
// Output the file
readfile($file);
The expected behavior is that the browser would treat both versions the same. In my sample HTML page (adapted from http://www.w3schools.com/html/html5_audio.asp), only the direct download works -- the version of the file that's coming through PHP does not play. The same behavior happens when I hit both files in a browser directly.
<!DOCTYPE html>
<html>
<body>
<h2>From Stream</h2>
<audio controls>
<source src="/source.php?file=Morse.aiff&breakcache=<?php print uniqid(); ?>" type="audio/x-aiff">
Your browser does not support the audio element.
</audio>
<hr/>
<h2>Direct Downloads</h2>
<audio controls>
<source src="/Morse.aiff" type="audio/x-aiff">
Your browser does not support the audio element.
</audio>
</body>
</html>
The same approach has worked to play mp3s (but the headers are slightly different). Does anyone know what I'm doing wrong here, or why this approach isn't working with AIFFs? I haven't yet tried this same test with another server-side language, but I suspect this isn't a PHP issue and has something to do with AIFFs. Can anyone shed light on this?
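In case it helps anyone reproduce this, here is the variant I would try next (a sketch, not a known fix): serve the real audio MIME type instead of application/octet-stream and honour Range requests, since Safari's media player tends to probe media files with Range headers. The file path is hard-coded for brevity.
<?php
// range_source.php -- hypothetical variant of source.php: audio MIME type plus
// basic single-range support. Sanitization and error handling omitted.
$file = dirname(dirname(__FILE__)) . '/audio/Morse.aiff';
$size = filesize($file);
$start = 0;
$end = $size - 1;

header('Content-Type: audio/x-aiff');   // matches the <source type="..."> in the test page
header('Accept-Ranges: bytes');

if (isset($_SERVER['HTTP_RANGE']) &&
    preg_match('/bytes=(\d+)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
    $start = (int)$m[1];
    if ($m[2] !== '') {
        $end = (int)$m[2];
    }
    header('HTTP/1.1 206 Partial Content');
    header("Content-Range: bytes $start-$end/$size");
}

header('Content-Length: ' . ($end - $start + 1));

$fp = fopen($file, 'rb');
fseek($fp, $start);
echo fread($fp, $end - $start + 1);
fclose($fp);
If the MIME type alone already makes Safari's player work, the Range handling can be dropped again.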

Don't set charset in header

How can I prevent PHP (or Apache) from automatically adding charset=utf-8 to the Content-Type header when I send the content type?
(The reason is that this is causing issues with Internet Explorer.)
orange public$ cat test.php
<?php
header('Content-Type: text/xml');
orange public$ curl -v example.com.orange.me.local/test.php
* Trying 10.0.0.1...
* Connected to example.com.orange.me.local (10.0.0.1) port 80 (#0)
> GET /test.php HTTP/1.1
> Host: example.com.orange.imi.local
> User-Agent: curl/7.47.0
> Accept: */*
>
< HTTP/1.1 200 OK
< Date: Wed, 01 Jun 2016 16:24:49 GMT
< Server: Apache/2.4.18 (Ubuntu)
< Content-Length: 0
< Content-Type: text/xml;charset=UTF-8
<
* Connection #0 to host example.com.orange.me.local left intact
I was able to remove the charset using
ini_set('default_charset', '');
in my script and
AddDefaultCharset off
in my .htaccess
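For reference, a minimal test script that combines the two (the XML body is just a placeholder so there is some output):
<?php
// Clear PHP's default_charset so header() no longer gets ";charset=UTF-8"
// appended, and rely on "AddDefaultCharset Off" in .htaccess so Apache
// doesn't add one either.
ini_set('default_charset', '');
header('Content-Type: text/xml');

echo '<?xml version="1.0"?><feed/>';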

How can I prevent nginx from buffering within PHP?

I am trying to push a page to the browser while it is being generated from a PHP script. I have no access to my hosting provider's nginx configuration but they have told me that they use nginx 1.8.1. In my phpinfo() output I can see
output_buffering: 0 (local value), 0 (master value)
and the same script works as expected on my local PC.
This is my starting script:
<pre>
<?php
for ($i = 0; $i < 100; ++$i) {
    print('<b>.</b>');
    flush();
    usleep(100000); // 0.1 second
}
?>
</pre>
I start getting output immediately on my local PC but I have to wait the full 10 seconds before I see anything when the page is accessed from my hosting.
These are the default response headers:
HTTP/1.1 200 OK
Server: nginx
Date: Tue, 12 Apr 2016 12:32:05 GMT
Content-Type: text/html; charset=UTF-8
Connection: keep-alive
Vary: Accept-Encoding
X-Powered-By: PHP/5.4.45
Content-Encoding: gzip
If I add
<?php
header('X-Accel-Buffering: no');
I get
HTTP/1.1 200 OK
Server: nginx
Date: Tue, 12 Apr 2016 12:35:10 GMT
Content-Type: text/html; charset=UTF-8
Connection: keep-alive
Vary: Accept-Encoding
X-Powered-By: PHP/5.4.45
Content-Encoding: gzip
And if I add
<?php
header('X-Accel-Buffering: no');
header('Content-Encoding: identity');
I get
HTTP/1.1 200 OK
Server: nginx
Date: Tue, 12 Apr 2016 12:37:11 GMT
Content-Type: text/html; charset=UTF-8
Content-Length: 812
Connection: keep-alive
X-Powered-By: PHP/5.4.45
Content-Encoding: identity
Obviously, if the server knows the length of the content, it has waited for the script to finish before starting to send it to the browser.
These are the headers on my local machine:
HTTP/1.1 200 OK
Date: Tue, 12 Apr 2016 12:52:31 GMT
Server: Apache/2.4.7 (Win32) PHP/5.4.45 OpenSSL/1.0.1e
X-Powered-By: PHP/5.4.45
X-Accel-Buffering: no
Content-Encoding: identity
Keep-Alive: timeout=5, max=100
Connection: Keep-Alive
Content-Type: text/html; charset=utf-8
The X-Accel-Buffering header gets passed through because I am not running nginx locally.
Are there any other headers I can pass through from PHP to stop nginx from buffering the content? So far I've only found config options, which I don't have access to.
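For completeness, this is the kind of combined attempt I have in mind (a sketch; I don't know whether any of it gets past the provider's configuration):
<?php
// PHP-level measures only; no access to nginx.conf.
header('X-Accel-Buffering: no');            // ask nginx not to buffer this response
ini_set('zlib.output_compression', 'Off');  // must run before any output is sent

while (ob_get_level() > 0) {
    ob_end_flush();                         // drop any PHP-side output buffers
}
ob_implicit_flush(true);                    // flush after every output call

echo '<pre>';
for ($i = 0; $i < 100; ++$i) {
    echo '<b>.</b>';
    echo str_pad('', 4096);                 // padding hack: some proxies only flush full buffers
    flush();
    usleep(100000); // 0.1 second
}
echo '</pre>';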

wordpress on openshift custom domain "too many redirects" error, works on openshift default domain

WordPress on OpenShift gives a "too many redirects" error on my custom domain, but works on the OpenShift default domain.
A curl request on the custom domain does not work:
curl -vvv http://www.hobbyhap.com/h
* Hostname was NOT found in DNS cache
* Trying 54.204.79.83...
* Connected to www.hobbyhap.com (54.204.79.83) port 80 (#0)
> GET /h HTTP/1.1
> User-Agent: curl/7.35.0
> Host: www.hobbyhap.com
> Accept: */*
>
< HTTP/1.1 301 Moved Permanently
< Date: Tue, 27 Jan 2015 18:05:15 GMT
* Server Apache/2.2.15 (Red Hat) is not blacklisted
< Server: Apache/2.2.15 (Red Hat)
< Location: http://www.hobbyhap.com/h/
< Content-Length: 317
< Content-Type: text/html; charset=iso-8859-1
< Cache-control: private
< Set-Cookie: GEAR=local-54c70ac64382ec8161000031; path=/
< Accept-Ranges: none
<
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>301 Moved Permanently</title>
</head><body>
<h1>Moved Permanently</h1>
<p>The document has moved here.</p>
<hr>
<address>Apache/2.2.15 (Red Hat) Server at www.hobbyhap.com Port 80</address>
</body></html>
* Connection #0 to host www.hobbyhap.com left intact
The curl request on the OpenShift domain works:
curl -vvv http://hhapp-hobbyhap.rhcloud.com/h/
I updated the blogs, options, site and sitemeta tables with the custom domain, and it started working.
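For reference, a sketch of the constant-based alternative to editing the tables by hand (WP_HOME and WP_SITEURL are standard WordPress constants; DOMAIN_CURRENT_SITE only applies to multisite installs, which the site/sitemeta tables suggest this is):
<?php
// wp-config.php excerpt -- pin the URLs to the custom domain instead of
// editing the database tables directly.
define('WP_HOME',    'http://www.hobbyhap.com');
define('WP_SITEURL', 'http://www.hobbyhap.com');

// Multisite only:
define('DOMAIN_CURRENT_SITE', 'www.hobbyhap.com');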

User authentication in tornado based on other php site

I wrote my own long-polling Tornado/AJAX chat with rooms, whisper messages and other cool stuff. Until now, for test purposes, I've been using plain cookies for user authentication: you just entered your name, a cookie 'user' was created, and the chat reacted to that cookie. But the problem is that I wrote this chat for a friend who has a PHP site, so basically I need to authenticate users based on his sessions. That's where I got confused. I'm a bit ashamed, because I realized I don't know exactly how sessions work, which is kind of absurd, because I don't consider myself such a bad programmer. Of course I know that sessions only store an id on the client and the rest of the information is stored on the server, but that doesn't really help, because I need to know exactly what happens in detail. Sure, I googled a bit, but I'm still confused about how to solve this problem. So the basic questions are:
1) I would appreciate it if someone could explain in detail, one more time, exactly how sessions work, and what I need to know or have access to on the PHP site in order to use its sessions in another application.
2) For example, when I authenticate on my Django site, a session is created with some value like 's5ds6dssd6', and to tell the truth I don't know what to do with it from there. In PHP, to extract the username (if it was set) and check or do something, I would do something like $_SESSION['username'] === ...; in Django it's even less work, just a decorator or the user.is_authenticated method. But how it works inside, and what I need, I don't know.
There is a big chance that what I wrote is stupid and it's actually easy. Yet even if I could somehow get the data from the PHP site's sessions, how could I be sure that some guy didn't create a session with a random id by himself, without authenticating on the PHP site?
I hope someone can point me in the right direction. It felt necessary to write this much so you could understand what bothers me and respond accordingly. Sorry if I wrote something stupid.
1) I would appreciate it if someone could explain in detail, one more time, exactly how sessions work, and what I need to know or have access to on the PHP site in order to use its sessions in another application.
P.S.: I am using Linux (the freely available Ubuntu, the most popular/user-friendly Linux distro) as the OS below, and I would advise you to use a *nix distro for all your web development as well (Mac OS X is also pretty good, but expensive in my opinion), although all these commands are also available in Cygwin (Windows).
Sessions are:
Session support in PHP consists of a way to preserve certain data across subsequent accesses. This enables you to build more customized applications and increase the appeal of your web site.
Below I try to explain what sessions are and how they use cookies.
I created a simple no.php which does not use sessions and simply outputs Hello World:
Hello World
When we curl this script with the headers using -v we get the following output:
alfred@alfred-laptop:~/www/6500588$ curl http://localhost/6500588/no.php -v
* About to connect() to localhost port 80 (#0)
* Trying ::1... Connection refused
* Trying 127.0.0.1... connected
* Connected to localhost (127.0.0.1) port 80 (#0)
> GET /6500588/no.php HTTP/1.1
> User-Agent: curl/7.21.0 (i686-pc-linux-gnu) libcurl/7.21.0 OpenSSL/0.9.8o zlib/1.2.3.4 libidn/1.18
> Host: localhost
> Accept: */*
>
< HTTP/1.1 200 OK
< Date: Tue, 28 Jun 2011 02:10:53 GMT
< Server: Apache/2.2.16 (Ubuntu)
< X-Powered-By: PHP/5.3.3-1ubuntu9.3
< Vary: Accept-Encoding
< Content-Length: 12
< Content-Type: text/html
<
Hello World
* Connection #0 to host localhost left intact
* Closing connection #0
As you can see from the output no cookie has been set. If you do this repeatedly you will get the same output.
Next I create a simple yes.php file which does make use of sessions.
<?php
session_start();
if (!isset($_SESSION['count'])) {
    $_SESSION['count'] = 0;
}
echo $_SESSION['count']++;
Let's show the output from curl without storing the cookie:
alfred@alfred-laptop:~/www/6500588$ curl http://localhost/6500588/yes.php -v
* About to connect() to localhost port 80 (#0)
* Trying ::1... Connection refused
* Trying 127.0.0.1... connected
* Connected to localhost (127.0.0.1) port 80 (#0)
> GET /6500588/yes.php HTTP/1.1
> User-Agent: curl/7.21.0 (i686-pc-linux-gnu) libcurl/7.21.0 OpenSSL/0.9.8o zlib/1.2.3.4 libidn/1.18
> Host: localhost
> Accept: */*
>
< HTTP/1.1 200 OK
< Date: Tue, 28 Jun 2011 02:12:47 GMT
< Server: Apache/2.2.16 (Ubuntu)
< X-Powered-By: PHP/5.3.3-1ubuntu9.3
< Set-Cookie: PHPSESSID=hrduhht116e9mikhkkj0gu7126; path=/
< Expires: Thu, 19 Nov 1981 08:52:00 GMT
< Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
< Pragma: no-cache
< Vary: Accept-Encoding
< Content-Length: 1
< Content-Type: text/html
<
* Connection #0 to host localhost left intact
* Closing connection #0
0
As you can see, the count is 0, but a cookie has also been set: Set-Cookie: PHPSESSID=hrduhht116e9mikhkkj0gu7126; path=/, with session id hrduhht116e9mikhkkj0gu7126.
If we do not store this cookie and issue the same curl command again, we will still receive 0 as the answer (the count is forgotten) and also receive another cookie.
alfred@alfred-laptop:~/www/6500588$ curl http://localhost/6500588/yes.php -v
* About to connect() to localhost port 80 (#0)
* Trying ::1... Connection refused
* Trying 127.0.0.1... connected
* Connected to localhost (127.0.0.1) port 80 (#0)
> GET /6500588/yes.php HTTP/1.1
> User-Agent: curl/7.21.0 (i686-pc-linux-gnu) libcurl/7.21.0 OpenSSL/0.9.8o zlib/1.2.3.4 libidn/1.18
> Host: localhost
> Accept: */*
>
< HTTP/1.1 200 OK
< Date: Tue, 28 Jun 2011 02:16:42 GMT
< Server: Apache/2.2.16 (Ubuntu)
< X-Powered-By: PHP/5.3.3-1ubuntu9.3
< Set-Cookie: PHPSESSID=ihlj9c9fifl8f0lklu0umesas2; path=/
< Expires: Thu, 19 Nov 1981 08:52:00 GMT
< Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
< Pragma: no-cache
< Vary: Accept-Encoding
< Content-Length: 1
< Content-Type: text/html
<
* Connection #0 to host localhost left intact
* Closing connection #0
0
As you can see, hrduhht116e9mikhkkj0gu7126 is not equal to ihlj9c9fifl8f0lklu0umesas2, which means a new cookie has been set and the information in the previous session is lost.
Next we store the cookie to a cookie file using the -c flag:
alfred@alfred-laptop:~/www/6500588$ curl http://localhost/6500588/yes.php -v -c cookie
* About to connect() to localhost port 80 (#0)
* Trying ::1... Connection refused
* Trying 127.0.0.1... connected
* Connected to localhost (127.0.0.1) port 80 (#0)
> GET /6500588/yes.php HTTP/1.1
> User-Agent: curl/7.21.0 (i686-pc-linux-gnu) libcurl/7.21.0 OpenSSL/0.9.8o zlib/1.2.3.4 libidn/1.18
> Host: localhost
> Accept: */*
>
< HTTP/1.1 200 OK
< Date: Tue, 28 Jun 2011 02:27:11 GMT
< Server: Apache/2.2.16 (Ubuntu)
< X-Powered-By: PHP/5.3.3-1ubuntu9.3
* Added cookie PHPSESSID="1h6710hhk84e0k9bj2kg7p03u5" for domain localhost, path /, expire 0
< Set-Cookie: PHPSESSID=1h6710hhk84e0k9bj2kg7p03u5; path=/
< Expires: Thu, 19 Nov 1981 08:52:00 GMT
< Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
< Pragma: no-cache
< Vary: Accept-Encoding
< Content-Length: 1
< Content-Type: text/html
<
* Connection #0 to host localhost left intact
* Closing connection #0
0
As you can see from ls (directory listing), we stored the cookie in a file named cookie.
alfred@alfred-laptop:~/www/6500588$ ls -al
total 20
drwxr-xr-x 2 alfred alfred 4096 2011-06-28 04:27 .
drwxr-xr-x 19 alfred alfred 4096 2011-06-28 03:59 ..
-rw-r--r-- 1 alfred alfred 196 2011-06-28 04:27 cookie
-rw-r--r-- 1 alfred alfred 12 2011-06-28 04:00 no.php
-rw-r--r-- 1 alfred alfred 114 2011-06-28 04:12 yes.php
That cookie, used to keep track of the count, contains the following information according to cat (which shows the contents of the file):
alfred@alfred-laptop:~/www/6500588$ cat cookie
# Netscape HTTP Cookie File
# http://curl.haxx.se/rfc/cookie_spec.html
# This file was generated by libcurl! Edit at your own risk.
localhost FALSE / FALSE 0 PHPSESSID 1h6710hhk84e0k9bj2kg7p03u5
We next use that cookie to keep track of the count.
alfred@alfred-laptop:~/www/6500588$ curl http://localhost/6500588/yes.php -v -b cookie
* About to connect() to localhost port 80 (#0)
* Trying ::1... Connection refused
* Trying 127.0.0.1... connected
* Connected to localhost (127.0.0.1) port 80 (#0)
> GET /6500588/yes.php HTTP/1.1
> User-Agent: curl/7.21.0 (i686-pc-linux-gnu) libcurl/7.21.0 OpenSSL/0.9.8o zlib/1.2.3.4 libidn/1.18
> Host: localhost
> Accept: */*
> Cookie: PHPSESSID=1h6710hhk84e0k9bj2kg7p03u5
>
< HTTP/1.1 200 OK
< Date: Tue, 28 Jun 2011 02:40:18 GMT
< Server: Apache/2.2.16 (Ubuntu)
< X-Powered-By: PHP/5.3.3-1ubuntu9.3
< Expires: Thu, 19 Nov 1981 08:52:00 GMT
< Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
< Pragma: no-cache
< Vary: Accept-Encoding
< Content-Length: 1
< Content-Type: text/html
<
* Connection #0 to host localhost left intact
* Closing connection #0
1
As you can see, we used the cookie with the same ID 1h6710hhk84e0k9bj2kg7p03u5, and the count is now 1 instead of the 0 we got when not using a cookie (or not storing it and getting a new one each time).
So basically I need to authenticate users based on his sessions.
Sessions simply use cookies (the session id) under the cover. You could, for example, override the standard session implementation to use a database instead of the filesystem (an interesting read!). But I would just use the session_id you receive from PHP (session_id()) within your Tornado application to authenticate the session, because it should be unique (hard to guess).
session_id() returns the session id for the current session or the empty string ("") if there is no current session (no current session id exists).
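To make the database-backed idea concrete, here is a rough sketch (SQLite, the table name and the schema are all assumptions for the example, not anything from your setup):
<?php
// Store PHP sessions in a small SQLite table so another process (e.g. the
// Tornado chat) can look a PHPSESSID up in the same store.
$pdo = new PDO('sqlite:/tmp/sessions.db');
$pdo->exec('CREATE TABLE IF NOT EXISTS sessions (id TEXT PRIMARY KEY, data TEXT, mtime INTEGER)');

session_set_save_handler(
    function ($savePath, $name) { return true; },   // open
    function () { return true; },                    // close
    function ($id) use ($pdo) {                      // read
        $stmt = $pdo->prepare('SELECT data FROM sessions WHERE id = ?');
        $stmt->execute(array($id));
        $row = $stmt->fetch(PDO::FETCH_ASSOC);
        return $row ? $row['data'] : '';
    },
    function ($id, $data) use ($pdo) {               // write
        $stmt = $pdo->prepare('REPLACE INTO sessions (id, data, mtime) VALUES (?, ?, ?)');
        return $stmt->execute(array($id, $data, time()));
    },
    function ($id) use ($pdo) {                      // destroy
        return $pdo->prepare('DELETE FROM sessions WHERE id = ?')->execute(array($id));
    },
    function ($maxlifetime) use ($pdo) {             // gc
        return $pdo->exec('DELETE FROM sessions WHERE mtime < ' . (time() - $maxlifetime)) !== false;
    }
);

session_start();
$_SESSION['username'] = 'alice'; // placeholder for whatever the PHP site stores after login
The Tornado side can then read the PHPSESSID cookie from the request and look it up in the same table to decide whether the user is logged in.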
P.S.: I hope this answers your question at least a little. If not, you can ask in the comments for a bit more information.
