"escape_fragment" pages. So, first page is created by HTMLUnit, and
"real" !# pages. In that case Google Bot would see "escape_fragment"
static pages, while if some "escape" page was ranked by another
crawler, and User comes to it - User would be redirected to !# pages.
But I am afraid this would be considered bad style and my site
would end up on Google's blacklist. (While I'm trying to accommodate
other crawlers, Google is still number one, and I don't want to
disappoint it.)
What do you think of this approach? Could you comment on the chances of
ending up on Google's blacklist with it?
You received this message because you are subscribed to the Google Groups "Google Web Toolkit" group.
For more options, visit this group at http://groups.google.com/group/google-web-toolkit?hl=en.