How to change the Accept header in Google Chrome? - php

I want to see the error that will happen if I send invalid (according to the XHTML standard) markup to the browser. I'm using Google Chrome as the browser, Apache as the server, and PHP as the scripting language. I've created a script with these lines:
<?php
header('Content-type: application/xhtml+xml');
$content = <<< XHTML
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="Content-Type" content="application/xhtml+xml; charset=utf-8" />
<title>XHTML</title>
</head>
<BODY>
<P>Lorem ipsum dolor sit amet...</P>
</BODY>
</html>
XHTML;
echo $content;
It should be invalid because BODY is written in capital letters.
But I'm getting a correct result, because I see that Google Chrome accepts these MIME types:
Accept:text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
So it interprets the result as text/html and everything is fine.
I want to make this document invalid. How can I remove text/html from the Accept header in Google Chrome? Or am I wrong about something?
UPD
When I removed the closing BODY tag, I could see an error. It seems the browser doesn't care about such small mistakes.

You are misdiagnosing the problem.
The Accept header tells the server that HTML and XHTML are equally preferred.
Your script ignores the Accept header entirely and sends an XHTML content type.
XML parsers will not throw parsing errors on well-formed documents, even if they are invalid.
Current browsers (as far as I know; I haven't experimented with this in a while) will fall back to an HTML parser if they get non-well-formed XHTML (because telling end users that a page author made a typo is, 99 times out of 100, unhelpful).
If you want to detect validity errors, then use a validator (which is a tool designed to detect them) and not a browser (which is a tool designed to render web pages and recover from author errors).
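For example, a rough server-side check of the question's markup (a sketch, assuming the $content string from the script above and PHP's DOM extension; note that validating fetches the DTD from w3.org, which can be slow or blocked) might look like this:
<?php
// Collect libxml errors instead of emitting warnings.
libxml_use_internal_errors(true);

$dom = new DOMDocument();
// LIBXML_DTDLOAD fetches the DTD named in the doctype; LIBXML_DTDVALID
// validates against it, so the undeclared <BODY> element is reported.
$dom->loadXML($content, LIBXML_DTDLOAD | LIBXML_DTDVALID);

foreach (libxml_get_errors() as $error) {
    echo trim($error->message) . ' (line ' . $error->line . ")\n";
}
libxml_clear_errors();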

So it interprets the result as text/html and everything is fine.
I want to make this document invalid. How can I remove text/html from the Accept header in Google Chrome? Or am I wrong about something?
Although the browser sends several content types in the Accept header, the server is free to choose one of them for the response.
This is done by setting the Content-Type header in the response (as you do):
header('Content-Type: application/xhtml+xml');
This tells the browser that the response is an XHTML document. You have no further way to influence the browser's validation process.
It seems that Chrome is too forgiving and accepts the uppercase <BODY>.
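For completeness, a minimal sketch of how the server could make that choice from the Accept header it received (assuming the $content string from the question):
<?php
// Send application/xhtml+xml only to clients that advertise it; text/html otherwise.
$accept = isset($_SERVER['HTTP_ACCEPT']) ? $_SERVER['HTTP_ACCEPT'] : '';
if (strpos($accept, 'application/xhtml+xml') !== false) {
    header('Content-Type: application/xhtml+xml; charset=utf-8');
} else {
    header('Content-Type: text/html; charset=utf-8');
}
echo $content;
Note that this only controls which parser the browser picks; it does not make Chrome any stricter about validity.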

Related

Content type HTML on PHP Page

I found a webpage saved as something.php, but the source code shows <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />.
I also found out that PHP code does not work on the webpage.
What is the need for making the file extension PHP if HTML is used?
(Not exactly HTML, but XHTML)
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
What is the need for making the file extension PHP if HTML is used?
(Not exactly HTML, but XHTML)
Considering your comments so far, particularly that you stated there is no PHP, you can just change the file extension to XHTML. You can always change it back.
I wonder what other PHP files exist where you "found" this page, and why. Assuming someone before you developed the site, there is probably a reason they used PHP file extensions.
Unless your host doesn't support PHP, you should be able to run PHP code anywhere on that page by placing it inside <?php ... ?> tags. The Content-Type isn't relevant to whether PHP can run or not. Try adding the following code somewhere in your page:
<?php echo "Hey there, I'm a friendly PHP string!"; ?>
Add <?php echo "Hello!"; ?> to your page to test, and make sure that your server is running; normally that works.
Are you using a WAMP/MAMP server? Have you tried turning it on?
These lines are a doctype declaration and a meta tag:
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
and they have nothing to do with PHP unless you have included a PHP script in the page.
HTML/XHTML will render even if you are not using a PHP server. PHP files have a .php extension and will run only on a server with PHP, such as WAMP for Windows or MAMP for Mac.
You can still use HTML/XHTML code in a .php file.
For example, if you have an <h1>This is h1</h1> tag and want to make it dynamic, you can put <?php ?> inside the tag and echo the text: <h1><?php echo "This is h1"; ?></h1>.
If you want to put HTML code inside a PHP script, you can do it like this:
<?php echo "<h1>This is h1</h1>"; ?>
You can learn more about PHP and other programming languages with the help of Google. Just take your time, relax, and enjoy learning. Don't pressure yourself; learning isn't a medicine that works a few minutes after you take it. It takes time and practice. Enjoy coding.

Can a change of file encoding affect how my CSS displays an HTML page?

A weird thing happens when I use PHP Smarty.
It seems the encoding of the PHP file affects the CSS.
PHP file (ANSI) -- test2.php
<?php
include_once("inc/smarty_inc.php");
$smarty->display('test.tpl.htm');
Smarty file (ANSI) -- test.tpl.htm
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<title>JPR</title>
</head>
<body>
<div style="width:500px;height:200px; background-color:Red;margin:auto;">
test
</div>
</body>
</html>
When both files are ANSI, the div shows at the center of the page (in both IE and Firefox).
When one of them is converted to UTF-8, the div shows at the left of the page (only in IE; it's fine in Firefox).
What's going on? How can I make it work in IE with UTF-8?
You are saving the file as UTF-8 with a BOM. This causes the first bytes in the response to be the byte order mark rather than the doctype. When IE doesn't see the doctype first, it goes into quirks mode, where the box model is different.
You need to convert the file to UTF-8 without a BOM. How to do that depends on the text editor or converter you are using.
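If you would rather fix the file in code than re-save it in an editor, a small sketch (strip_utf8_bom is a hypothetical helper, and the template path is assumed to be test.tpl.htm) could remove the BOM before the page is served:
<?php
// Remove a leading UTF-8 BOM (EF BB BF) from a file, if present.
function strip_utf8_bom($path)
{
    $data = file_get_contents($path);
    if (substr($data, 0, 3) === "\xEF\xBB\xBF") {
        file_put_contents($path, substr($data, 3));
    }
}
strip_utf8_bom('test.tpl.htm');
Alternatively, many editors (Notepad++, for example) have an explicit "UTF-8 without BOM" save option.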

Do I need to declare content type/charset in both HTML and PHP?

If the content type and character set are declared in the PHP header() call, is there a reason to declare them again in the usual HTML markup?
<?php
ob_start('ob_gzhandler');
header('Content-type: text/html; charset=utf-8'); // here
?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" /> <!-- and here -->
...
If you are sending the charset in the headers, there is no need to repeat it in the HTML markup.
It is better to send this information from one place (DRY principle), because conflicting charsets (e.g. a header saying UTF-8 and a meta tag saying iso-8859-1) can confuse tools; when they conflict, the HTTP header normally wins.
Having said that, some automated tools (web scrapers) may not look at the header and will deduce the page encoding only from the meta tag.
It is important to keep the header and the meta tag the same for each page; mixing different charsets may cause display issues.
Having the charset in the HTML source is also helpful if someone saves the page to disk, and for web scrapers :) libxml looks at the meta tag to determine the charset to use when parsing the markup. Show your fellow developers some web-scraping love.
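One way to keep the two in sync from a single place is to define the charset once and reuse it in both the header and the meta tag; a minimal sketch:
<?php
// Single source of truth for the charset (DRY).
$charset = 'utf-8';
header('Content-Type: text/html; charset=' . $charset);
?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=<?php echo $charset; ?>" />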
If you declare it in the HTTP headers, it will survive transcoding by proxies and won't ever trigger a "whoops, I guessed the wrong encoding, restart parsing from the top" situation in browsers.
If you declare it in the body of the document, it will survive being accessed outside of HTTP (or another system with content-type headers, such as email).
If you declare it in both, you get the best of both worlds, so long as no transcoding happens.
Note that if you don't use UTF-8 or UTF-16, the XML spec requires you to specify the encoding in the XML prolog (and using an XML prolog triggers quirks mode in IE6).

no quirks mode on php pages?

My site is all PHP pages, since it's all database stuff. However, I'm having trouble getting the pages out of quirks mode... I did the regular thing:
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
and this doesn't work. If I create an HTML page with the same code, of course it works.
So this leads me to believe that I can only turn off quirks mode in HTML pages? Maybe this is a stupid question and I don't need to turn off quirks mode in PHP pages?.. Please help: <form> keeps breaking to a new line, and I've tried multiple fixes, but I think it has to do with the site being in quirks mode.
It doesn't matter how the page is generated, PHP or static. I suspect your issue is with the doctype declaration. Are you sure the PHP isn't outputting any characters, such as a line break, before the doctype?
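A quick way to check for stray output (a sketch; header_stuff.php is a hypothetical include, and this check is only reliable when output buffering is off) is to ask PHP whether output has already started before the doctype is printed:
<?php
require 'header_stuff.php'; // hypothetical include that might leak whitespace or a BOM

if (headers_sent($file, $line)) {
    // Something was output before this point, so the doctype will not be first.
    error_log("Output already started in $file on line $line");
}
?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">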
Your PHP script may be outputting a Content-Type header that tells the browser the page is in some other format than the doctype specifies. I'm not sure how the browser is supposed to resolve such a conflict. I'm also not sure how you've got PHP set up to output the Content-Type header, but if you look in the PHP manual for header(), you'll probably be on the right track. The setting may also be in your web server config.
If you're doing XHTML, shouldn't you have an XML declaration above the doctype?
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
"http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
Also, if you're serving XHTML, be sure the content type is an XML type (application/xhtml+xml), not text/html. That's a common error. XHTML is XML and should be served as such, or browsers will simply treat it as HTML (and the XML prolog itself triggers quirks mode in IE6).
(Better yet, don't use XHTML, use HTML5.)
If that doesn't fix it, I think Pete Michaud must have the right idea. Check with View Source to see what actually got sent to the browser.
Yes, so apparently I was adding the code to the wrong page, and the page I should have been adding it to had conflicting code. So thanks, it works now. The stupid page break is still there, creating new lines no matter what CSS I apply, but at least the site is standardized now. Thanks, everybody!

Blank page in IE6

A site I am working on that is built using PHP is sometimes showing a completely blank page.
There are no error messages on the client or on the server.
The same page may display sometimes but not others.
All pages are working fine in IE7, Firefox 3, Safari and Opera.
All pages are XHTML with this meta element:
<meta http-equiv="Content-Type" content="application/xhtml+xml; charset=utf-8" />
It appears that I have fixed the problem by adding this PHP code:
header('Content-type: text/html; charset=utf-8');
I have read that this problem may be caused by XHTML, encoding, gzip compression, or caching, but nobody has been able to back up these guesses.
As the problem was intermittent I am not confident that my solution has actually solved the problem.
My question is, are there reproducible ways of having IE6 show a blank page when other browsers display content?
If so, what causes it and what solves it?
This is a content-type problem with IE.
It does not know how to handle application/xhtml+xml.
Although you write application/xhtml+xml, IE only understands text/html.
It will be some time before all user agents understand application/xhtml+xml.
Change the content type in your meta tag to content="text/html; charset=utf-8".
Sounds like the "Self Closing Script Tag" bug (#153) in IE, which is well known to cause blank pages.
Due to IE's bug, you can NEVER code the following and expect it to work in IE.
<script src="...." />
(if the tag is self closing, you are in for a world of pain)
Instead, always code it as:
<script src="...."></script>
I had a similar problem that was language-specific: only the page with multibyte characters didn't show in IE6 and IE7. It turns out that in these two browsers, the order of the Content-Type meta tag and the title tag is a big deal. Putting the title tag (which contained Japanese characters) after the meta tag fixed the problem.
Not sure if this exactly matches your experience. It depends on which specific version of IE (including service packs) is being used.
A known rendering issue with IE6 SP2 & IE7 (both use the same rendering engine) is the existence of orphaned tags in your HTML. This may be an orphaned div or script tag.
<script language="javascript"> // no closing tag
alert('hello world');
<body>
hello world
</body>
The above renders just fine in IE6 SP1 and Firefox, but you will only see a blank page in IE6 SP2 & IE7.
There are certain tags that must have a separate closing tag. Check that any <div> and <script> tags have an ending </div> or </script> tag, not just a closing slash at the end of the opening tag. Another one is <textarea>: you have to have both tags.
You can test whether this is happening by using View Source on the blank page: if you get the full source HTML even though the page renders blank, this is likely the cause.
You should serve pages with the Content-Type header as text/html to IE users. You don't need to change the meta tag, just leave it as application/xhtml+xml (IE will ignore it).
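A hedged sketch of one way to do that in PHP (a simple user-agent check; negotiating on the Accept header, as shown in the first question above, is generally cleaner):
<?php
// IE6/IE7 never advertise application/xhtml+xml, so fall back to text/html for them.
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
if (strpos($ua, 'MSIE') !== false) {
    header('Content-Type: text/html; charset=utf-8');
} else {
    header('Content-Type: application/xhtml+xml; charset=utf-8');
}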
I got this bug due to a typing error.
I wrote the meta tag:
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-15" />
Thanks to you, I corrected it to:
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1" />
and I don't have the problem now.
