Question: Does Google support pages sent as application/xhtml+xml?
Answer:
Not really.
Posted: 2003-01-31 03:12:43 UTC by Xiven | Cross-references (0) | Comments (11)
Cross-references
None
Comments
Freaky (2003-01-31 14:16:14 UTC)
And does Google include application/xhtml+xml in its Accept: headers? OK, maybe it Accept:'s text/xml or so, but with a higher weight than text/html?
Xiven (Registered) (2003-02-03 16:27:46 UTC)
To be honest I don't know; I haven't actually looked at the Google Accept header. The way this site decides which to send is: if the Accept header _explicitly_ mentions support for application/xhtml+xml, or if the User-Agent is in the list of known supporting user agents, send XHTML; otherwise send HTML. I had mistakenly added Googlebot to this list (I had been led to believe that it supported it). I've now removed it, and it's all better: http://www.google.com/search?q=xiven
The list of supporting UAs is currently: W3C_Validator, WDG_Validator and Opera (with version >= 6). Gecko isn't in this list since it sends the Accept header anyway.
Obviously browser sniffing always has problems, specifically those of public caches. This site uses the HTTP 1.1 "Cache-control: private" header, but this doesn't guarantee that it won't get cached publicly. Ah well...
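The negotiation rule described above can be sketched like this (the function name and the exact UA patterns are illustrative, not the site's actual code):

```javascript
// Sketch of the rule above: send XHTML only if the Accept header explicitly
// names application/xhtml+xml, or the User-Agent is a known supporter.
// The UA patterns mirror the list in the comment; everything else is assumed.
function chooseContentType(acceptHeader, userAgent) {
  const supportingUAs = [
    /^W3C_Validator/,
    /^WDG_Validator/,
    /Opera[\/ ]([6-9]|\d{2})/, // Opera, version >= 6
  ];
  const explicitXhtml = (acceptHeader || "").includes("application/xhtml+xml");
  const knownUA = supportingUAs.some((re) => re.test(userAgent || ""));
  return explicitXhtml || knownUA ? "application/xhtml+xml" : "text/html";
}
```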
Basil Crow (2003-02-15 12:05:18 UTC)
Why does application/xhtml+xml break JavaScript?
I am writing a site in 100% valid XHTML 1.1 (with a proper XML declaration and DTD), sent as application/xhtml+xml. But even a simple "document.write" doesn't work! Mozilla 1.3b's JavaScript Console reports "Error: document.write is not a function." What??? When served as text/html there are no problems, but when served as application/xhtml+xml it seems you can kiss your JavaScript goodbye.
So in the meantime I am doing the Evil Thing and sending XHTML 1.1 as text/html. Any ideas on how to get JavaScript back?
Xiven (Registered) (2003-02-15 12:50:42 UTC)
There's a discussion about document.write not working under application/xhtml+xml in Mozilla here: http://www.dhtmlcentral.com/forums/topic.asp?TOPIC_ID=15994
Bugzilla bug: http://bugzilla.mozilla.org/show_bug.cgi?id=111514
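For what it's worth, the usual workaround is to replace document.write with the standard DOM creation methods, which do work under application/xhtml+xml. A minimal sketch (the element id and function name are made-up examples):

```javascript
// Append text to an existing element instead of calling document.write.
function appendText(targetId, text) {
  const target = document.getElementById(targetId);
  // createTextNode keeps the document well-formed: the string is inserted
  // as character data, never parsed as markup.
  target.appendChild(document.createTextNode(text));
}

// Guarded so the sketch only runs where a DOM is actually available.
if (typeof document !== "undefined") {
  appendText("output", "Hello from XHTML");
}
```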
Xiven (Registered) (2003-03-13 13:15:49 UTC)
Further to this, in Opera 7.02 the <script> element is non-functional with XHTML sent as application/xhtml+xml.
earnest boatright (2003-07-02 06:19:10 UTC)
Yes, I use Google for the search on my home page, which is http://knologyhome.net. In Google, when I want to type a word for Google to search, the cursor moves really, really slowly; it also wants to hang up.
Jim Dabell (2004-08-18 14:27:27 UTC)
The Googlebot uses a varying Accept header. Usually it uses:
Accept: text/html, application/*
Which means that text/html is favoured but application/xhtml+xml is acceptable.
"The way this site decides which to send is: if the Accept header _explicitly_ mentions support for application/xhtml+xml or if the User Agent is in the list of known supporting user agents, send XHTML otherwise send HTML."
Firstly, you need to list the User-Agent header in your Vary header. Secondly, the mechanism you describe is flawed in that it will serve application/xhtml+xml to user-agents that prefer text/html.
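If both the Accept header and the User-Agent are used to pick the representation, the response needs to advertise both, along the lines of (an illustrative header set, not this site's actual output):

```http
Content-Type: application/xhtml+xml; charset=utf-8
Vary: Accept, User-Agent
Cache-Control: private
```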
Xiven (Registered) (2004-08-22 11:09:13 UTC)
I have long since changed the way I do this for the very reason that it is just plain wrong, broken and bad for caches (see http://www.xiven.com/weblog/2003/11/30/Upgrades). There are still some basic flaws in my Accept header use though:
Accept: */*;q=0.2, application/xhtml+xml;q=0.1
currently causes the site to send application/xhtml+xml as it is a more specific match. When I get around to it, I will fix it.
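The fix amounts to honouring q-values first, and using specificity only to decide which Accept pattern supplies the q for each media type (per RFC 2616 §14.1). A hedged sketch of that ordering, not the site's actual code:

```javascript
// For a given media type, find the most specific Accept pattern that matches
// it and return that pattern's q-value (default q=1; q=0 if nothing matches).
function bestMatch(mediaType, acceptHeader) {
  let best = { specificity: -1, q: 0 };
  for (const entry of acceptHeader.split(",")) {
    const [pattern, ...params] = entry.trim().split(";").map((s) => s.trim());
    let q = 1;
    for (const p of params) {
      const m = /^q=([0-9.]+)$/.exec(p);
      if (m) q = parseFloat(m[1]);
    }
    let specificity;
    if (pattern === mediaType) specificity = 2; // exact match
    else if (pattern === mediaType.split("/")[0] + "/*") specificity = 1;
    else if (pattern === "*/*") specificity = 0;
    else continue;
    if (specificity > best.specificity) best = { specificity, q };
  }
  return best;
}

// Send XHTML only when its q beats text/html, or ties via a more
// specific (i.e. more deliberate) pattern.
function negotiate(acceptHeader) {
  const x = bestMatch("application/xhtml+xml", acceptHeader);
  const h = bestMatch("text/html", acceptHeader);
  if (x.q !== h.q) return x.q > h.q ? "application/xhtml+xml" : "text/html";
  return x.q > 0 && x.specificity > h.specificity
    ? "application/xhtml+xml"
    : "text/html";
}
```

With this ordering, "*/*;q=0.2, application/xhtml+xml;q=0.1" correctly yields text/html, and Googlebot's "text/html, application/*" also gets text/html.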
Anonymous (2007-04-05 05:46:09 UTC)
I suspect that document.write is a "bad" thing with regard to XML. Depending on what gets sent to it, it could invalidate the XML or even make it not well-formed, thereby losing the benefits of XHTML. I can only imagine that in Firefox the page is parsed for well-formedness only when it loads. Running scripts which can break things might break Firefox in various ways, or at least force it to re-parse the document every time it carries out a document.write request. In the event that a parse fails AFTER a page is already loaded, the user may be "confused" or otherwise upset.
I can see why they would have left it out. The only question is: why do the other browsers still support it inside valid XML?
Anonymous (2007-04-05 05:48:09 UTC)
I don't think it is part of the DOM spec. I'm sure someone can confirm/deny this.