"escape_fragment" pages. So, first page is created by HTMLUnit, and
then small JavaScript code is inserted, that would redirect a User to
"real" !# pages. In that case Google Bot would see "escape_fragment"
static pages, while if some "escape" page was ranked by another
crawler, and User comes to it - User would be redirected to !# pages.
But I am afraid this would be considered bad style (cloaking) and my site would end up on Google's blacklist (while I'm trying to accommodate other crawlers, Google is still number one, and I don't want to disappoint it :) )
What do you think about this approach? Could you comment on the chances of landing on Google's blacklist with it?
Thanks!
--
You received this message because you are subscribed to the Google Groups "Google Web Toolkit" group.
To post to this group, send email to google-web-toolkit@googlegroups.com.
To unsubscribe from this group, send email to google-web-toolkit+unsubscribe@googlegroups.com.
For more options, visit this group at http://groups.google.com/group/google-web-toolkit?hl=en.