HTTP Headers difference - load page incrementally - php

I have an HTML page that shows a progress bar as it steps through a process. It uses flush() to send the data to the browser incrementally. I'm trying to get the same thing working in a Zend request, which I'm short-circuiting by sending a header and the content myself and then ending the request with an exit call.
The plain HTML page displays correctly (the progress bar steps through to completion). The Zend/PHP page only shows the finished page, not the intermediate steps. I'm assuming this is a header problem, since the method (flush()) is identical.
In Chrome, the headers for the HTML page come up as:
HTTP/1.1 200 OK
Date: Fri, 27 Jul 2012 14:38:07 GMT
Server: Apache/2.2.16 (Unix) mod_ssl/2.2.16 OpenSSL/0.9.8r DAV/2 PHP/5.3.2
X-Powered-By: PHP/5.3.2
Keep-Alive: timeout=5, max=100
Connection: Keep-Alive
Transfer-Encoding: chunked
Content-Type: text/html
And the headers for the Zend/PHP page come up as:
HTTP/1.1 200 OK
Date: Fri, 27 Jul 2012 14:44:13 GMT
Server: Apache/2.2.16 (Unix) mod_ssl/2.2.16 OpenSSL/0.9.8r DAV/2 PHP/5.3.2
X-Powered-By: PHP/5.3.2
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-cache
Pragma: no-cache
Keep-Alive: timeout=5, max=98
Connection: Keep-Alive
Transfer-Encoding: chunked
Content-Type: text/html; charset=utf-8
The only header information I'm specifying in the PHP is:
header('Content-Type: text/html; charset=utf-8');
I'm using this code from this page: http://w3shaman.com/article/php-progress-bar-script
Any help would be appreciated. Thanks.

Call ob_flush() before you call flush(), as Zend may have output buffering active.
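A minimal sketch of what that looks like inside the progress loop (the step count, padding, and sleep below are illustrative assumptions, not taken from the question):
<?php
// Minimal sketch: flush PHP's own output buffer before flushing to the browser.
header('Content-Type: text/html; charset=utf-8');
$totalSteps = 10; // illustrative number of work units
for ($i = 1; $i <= $totalSteps; $i++) {
    // ... do one unit of work here ...
    $percent = intval($i / $totalSteps * 100);
    echo '<div>Progress: ' . $percent . '%</div>';
    echo str_repeat(' ', 4096); // padding to get past typical server buffers
    if (ob_get_level() > 0) {
        ob_flush(); // empty PHP's output buffer (Zend may have started one)
    }
    flush();        // push the data to the browser
    sleep(1);       // stand-in for the real work
}
exit; // short-circuit the rest of the Zend dispatch, as described above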

Mathieu had the fix. Adding ob_flush() before flush() in the Zend/PHP page fixed the problem. I'm not sure whether Zend is activating output buffering as suggested or not.

Related

Joomla Not Sending Custom Header

So I've got a server-to-server application. The PHP script on server 1, domain 1 sets a custom header in the page (Authorization: Bearer 123456789). The script on server 2, domain 2 uses get_headers() to read the headers.
It all works fine when the files are served natively. But when the script on server 1 is included in a Joomla module, get_headers() doesn't retrieve the custom header.
In both cases, developer tools shows the custom header, but also shows somewhat different headers than get_headers() returns.
The code below uses JFactory to set the header when Joomla is loaded, but the result is the same using header(). Joomla just isn't passing the custom header through.
I don't get it. Anyone have any idea what is going on here? It's not a SEF or htaccess issue.
<?php
// Server 1
if (!class_exists("JFactory")) { // no Joomla
    header('Authorization: Bearer 123456789');
} else { // Joomla framework loaded
    $app = JFactory::getApplication();
    $app->setHeader('Authorization: ', 'Bearer 123456789');
    $app->sendHeaders();
}
The code on server 2:
<?php
// Server 2
$headers = get_headers("http://server1.com/");
foreach ($headers as $header) {
    echo $header . "<br/>";
}
Output from get_headers() when served natively:
HTTP/1.1 200 OK
Date: Thu, 19 Jan 2017 12:44:35 GMT
Server: Apache
Authorization: Bearer 123456789
Content-Length: 0
Connection: close
Content-Type: text/html
Output from get_headers() when served by Joomla:
HTTP/1.1 200 OK
Date: Thu, 19 Jan 2017 12:45:49 GMT
Server: Apache
Set-Cookie: 3c460b3da9ecb202e794816b4144c6ff=ja7mn4b4njov98lsv76kk8pvu2; path=/; HttpOnly
Vary: Accept-Encoding
Content-Length: 1264
Connection: close
Content-Type: text/html
Native headers displayed by developer tools:
Authorization: Bearer 123456789
Date: Thu, 19 Jan 2017 13:07:32 GMT
Server: Apache
Connection: Keep-Alive
Keep-Alive: timeout=5, max=100
Content-Length: 0
Content-Type: text/html
200 OK
Joomla headers displayed by developer tools:
Pragma: no-cache
Date: Thu, 19 Jan 2017 12:19:24 GMT
Last-Modified: Thu, 19 Jan 2017 12:19:25 GMT
Server: Apache
Authorization: : Bearer 123456789
Vary: Accept-Encoding
Content-Type: text/html; charset=utf-8
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Connection: Keep-Alive
Keep-Alive: timeout=5, max=100
Content-Length: 76888
Expires: Wed, 17 Aug 2005 00:00:00 GMT
Remove the colon (and trailing space) from the header name in the setHeader() call:
$app = JFactory::getApplication();
$app->setHeader('Authorization', 'Bearer 123456789');
$app->sendHeaders();
Thanks for the suggestion, Yoleth. I tested this and got the same result.
However, I have found the problem. The Joomla site setting the header is using a component called Site Lock. This is similar to putting the site offline, but it has some nice features for developers.
Basically, Site Lock was preventing the page from being served and just returning the headers from the lock page (as it should). I don't know why I didn't see it earlier. Sometimes you just can't see the forest for the trees!

goutte - guzzle: Not getting file download after form submission

I am using Goutte to click a button in a form and get the download link for a file. I am correctly logged in to the website.
The code I am using is what has already worked on other parts of the site and gave me the right results up to this point, but in this case I'm not sure it's enough:
$form = $crawler->selectButton('btn_name')->form();
$tableResults = $this->client->submit($form);
$result = $this->client->getResponse()->getContent(); // still contains the form page, not the file
The button sets some parameters and sends the request via AJAX (the page does not reload after the post):
<button id="btn_id_was_long" name="btn_name" onclick="PrimeFaces.ab({source:'btn_name'});return false;" type="submit"><span>DAE</span></button>
When I inspect the content of the response, though, I get the HTML page that contains the form, but there is no trace of an attachment.
When the button is clicked with the above code I get the following headers in response:
Date: Tue, 08 Nov 2016 16:37:42 GMT
Server: Apache/2.2.29 (Unix) mod_ssl/2.2.29 OpenSSL/1.0.1e-fips DAV/2 mod_jk/1.2.31
X-Content-Type-Options: nosniff
X-Frame-Options: SAMEORIGIN
X-XSS-Protection: 1
Liferay-Portal: Liferay Portal Enterprise Edition 6.2.10 EE GA1 (Newton / Build 6210 / November 1, 2013)
X-JAVAX-PORTLET-FACES-NAMESPACED-RESPONSE: true
X-Powered-By: JSP/2.2
Transfer-Encoding: chunked
Content-Type: text/html;charset=UTF-8
When I perform the same action in the browser I get this instead, with the file download:
HTTP/1.1 200 OK
Date: Tue, 08 Nov 2016 17:31:34 GMT
Server: Apache/2.2.29 (Unix) mod_ssl/2.2.29 OpenSSL/1.0.1e-fips DAV/2 mod_jk/1.2.31
x-content-type-options: nosniff
x-frame-options: SAMEORIGIN
X-XSS-Protection: 1
Content-Encoding: gzip
Liferay-Portal: Liferay Portal Enterprise Edition 6.2.10 EE GA1 (Newton / Build 6210 / November 1, 2013)
Content-Disposition: attachment;filename=DAE_00001_17/10/2016.pdf
Expires: 0
Pragma: public
Cache-Control: must-revalidate, post-check=0, pre-check=0
portlet.http-status-code: 200
Keep-Alive: timeout=5, max=99
Connection: Keep-Alive
Transfer-Encoding: chunked
Content-Type: application/pdf;charset=UTF-8
Any clue what I'm doing wrong?

How to fetch redirected URL using python? (CURLOPT_FOLLOWLOCATION not working)

I'm working on crawling information from a website: http://www.fatwallet.com
There are many redirected URLs. For instance: http://www.fatwallet.com/ticket/store/A4C?s=storepage
is redirected to http://www.a4c.com/?siteID=.7WaaTN6umc-s1Ih0x_Q67n6r7gInoh6Ug
I would like to use PHP to find out the redirected URL.
I have used "curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true)". I know it will automatically redirect 5 times.
However, the problem is that the page I get is not the final page; instead it's a page in between.
curl_exec returns:
HTTP/1.1 302 Moved Temporarily
Server: Apache
Location: www.fatwallet.com/interstitial/signin
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 20
Content-Type: text/html
Date: Mon, 13 Apr 2015 12:03:19 GMT
Connection: keep-alive
Set-Cookie: JSESSIONID=A9E28337052B56ADAC8451854A276210; Path=/; HttpOnly

HTTP/1.1 302 Moved Temporarily
Server: Apache
Location: www.fatwallet.com/interstitial/signin
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 20
Content-Type: text/html
Date: Mon, 13 Apr 2015 12:03:19 GMT
Connection: keep-alive

HTTP/1.1 200 OK
Server: Apache
Cache-Control: no-cache,no-store,max-age=0
Expires: Wed, 31 Dec 1969 23:59:59 GMT
X-UA-Compatible: IE=edge,chrome=1
Vary: User-Agent,Accept-Encoding
Content-Language: en
Content-Encoding: gzip
Content-Type: text/html;charset=UTF-8
Content-Length: 16949
Date: Mon, 13 Apr 2015 12:03:20 GMT
Connection: keep-alive
Set-Cookie: list_styles=grid; Expires=Sat, 01-May-2083 15:17:27 GMT; Path=/
Set-Cookie: non_mem=f86c0692-826f-40f2-9fa1-1e2f9a957af8; Expires=Sat, 01-May-2083 15:17:27 GMT; Path=/
...
It seems that the third response is HTTP/1.1 200 OK, but it is not the final page. If you check http://www.fatwallet.com/ticket/store/A4C?s=storepage you will understand what I mean. Also, I can't find the final URL anywhere in the page returned.
So my question is: is it possible to make cURL continue redirecting even after it receives HTTP/1.1 200 OK?
Is there another way to solve this (using Snoopy or Python)?
Thanks to all!
It seems that the last redirect is done via JS, not a native HTTP response. You just need a more advanced crawler that can execute JS code.
Just look at the source code of the first redirected page (view-source:https://www.fatwallet.com/interstitial/signin) and you will find the final URL in some HTML elements; it seems that some JS code reads those values and performs the last redirect.

Chrome totally ignoring Access-Control-Allow-Origin: * header

I am setting this with .htaccess. I know it's being set properly because if I set another header:
Header set Access-Control-Allow-Origin2: *
Then Chrome does see this. As soon as I remove the 2, however, Chrome just completely ignores it. If I make my file a PHP file and put this in it:
<?php header("Access-Control-Allow-Origin: *"); ?>
Then it works.
Here are the response headers as reported by Chrome for the .htaccess method, which I need to work and which does not:
HTTP/1.1 304 Not Modified
Date: Sun, 30 Mar 2014 00:13:06 GMT
Server: Apache/2.2.22 (Ubuntu)
Connection: Keep-Alive
Keep-Alive: timeout=5, max=100
ETag: "208f3-178a2-4f5c4f119cd34"
Vary: Accept-Encoding
Here are the response headers as reported by Chrome for the PHP method, which for some reason does work:
HTTP/1.1 200 OK
Date: Sun, 30 Mar 2014 00:13:09 GMT
Server: Apache/2.2.22 (Ubuntu)
X-Powered-By: PHP/5.3.10-1ubuntu3.10
Access-Control-Allow-Origin: *
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 23
Keep-Alive: timeout=5, max=99
Connection: Keep-Alive
Content-Type: text/html
Again, I know the .htaccess is setting the header: even if I go to an online service that checks response headers, I see this back:
HTTP/1.1 200 OK
Date: Sun, 30 Mar 2014 00:18:14 GMT
Server: Apache/2.2.22 (Ubuntu)
Last-Modified: Sat, 29 Mar 2014 20:48:34 GMT
ETag: "208f3-178a2-4f5c4f119cd34"
Accept-Ranges: bytes
Vary: Accept-Encoding
Content-Encoding: gzip
Access-Control-Allow-Origin: *
Content-Length: 33393
Content-Type: application/javascript

PHPSESSID cookie gets sent multiple times

In my current application the PHPSESSID cookie gets sent multiple times. Here's a sample response:
HTTP/1.1 200 OK
Date: Tue, 11 Jun 2013 08:18:29 GMT
Server: Apache/2.2.17 (Ubuntu)
X-Powered-By: PHP/5.3.15-1~dotdeb.0
Set-Cookie: ZDEDebuggerPresent=php,phtml,php3; path=/
Set-Cookie: PHPSESSID=625qvi6328pdq2t7psh4t3voi6; path=/
Set-Cookie: PHPSESSID=625qvi6328pdq2t7psh4t3voi6; path=/
Set-Cookie: PHPSESSID=625qvi6328pdq2t7psh4t3voi6; path=/
Cache-Control: no-cache
x-debug-token: 9dcc688323f1dad273d4c8fc7117f405a52ce998
Vary: Accept-Encoding
Content-Encoding: gzip
Keep-Alive: timeout=15, max=100
Connection: Keep-Alive
Transfer-Encoding: chunked
Content-Type: text/html; charset=ISO-8859-1
As you can see, there are three PHPSESSIDs.
I tried to reproduce this behavior with a single file with three session_start(); calls:
<?php
session_start();
session_start();
session_start();
but the cookie was sent only once.
Any idea how this could happen?
I found the culprit. Somewhere deep in the legacy code was a session_commit() that was called multiple times.
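A minimal sketch of how that pattern can produce the duplicate Set-Cookie headers on a PHP 5.3-era setup (the real legacy code isn't shown in the thread; this just reproduces the commit/re-open cycle as an assumption):
<?php
// Hypothetical reproduction: closing the session with session_commit() and then
// re-opening it makes PHP emit the PHPSESSID Set-Cookie header again.
session_start();   // first Set-Cookie: PHPSESSID=...
session_commit();  // write session data and close the session

session_start();   // re-open the session: the cookie is sent a second time
session_commit();

session_start();   // and a third time
session_commit();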
