A potentially big story if you are a lead generator

Valleywag ran a story yesterday.

I almost fell out of my seat. I then followed the link to the primary source at Search Engine Watch.

I have not yet found a post on this at DMConfidential, the authority for direct marketers.

If the story is true, every performance marketer should immediately look at their server logs.

I have not seen any other publications pick up the story yet.

The story, if you did not click the links, is that Google is crawling the forms on “high quality” Web sites.

What the story found incredible is that this amounts to a hijacking of corporate data.

What this writer found more troubling is the next step of the equation: the reverse engineering of rulesets. Form decisioning and rulesets are core assets of companies like LowerMyBills, LendingTree, Autobytel, and, to some extent, even AT&T’s Web forms.

Decisioning trees and rulesets increase conversion by showing relevant content (and alternate user paths) once a user has self-profiled.
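To make that concrete, here is a toy sketch of what a form-decisioning ruleset might look like. The fields, thresholds, and offer names are invented for illustration; they are not any company’s actual rules:

```python
# A minimal sketch of a lead-form ruleset. Field names, thresholds, and
# offer names are hypothetical, not anyone's real decisioning logic.

def route_lead(profile: dict) -> str:
    """Return the next page/offer to show once the user has self-profiled."""
    age = profile.get("age", 0)
    credit = profile.get("credit_band", "unknown")

    # Each branch is a "rule": relevant content for that segment, plus an
    # alternate path when no paying buyer matches.
    if profile.get("vertical") == "edu":
        if 18 <= age <= 24:
            return "offer/full-time-campus-programs"
        if age >= 25:
            return "offer/online-degree-programs"
        return "path/no-matching-school"   # no buyer for this segment

    if profile.get("vertical") == "lending":
        if credit in ("good", "excellent"):
            return "offer/prime-refinance"
        return "offer/credit-repair-partner"

    return "path/generic-thank-you"


# Example: a 30-year-old EDU prospect gets routed to online programs.
print(route_lead({"vertical": "edu", "age": 30}))
```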

Hopefully, you see where I am going here.

In my former role, one project we considered as part of our competitive strategy was using that very same kind of crawler to see, for example, what the “service rate” was for a competitor on a given product. To use an example from the EDU category: was a competitor segmenting by “age” for EDU, and if so, did they route the user to a school we did not know about, or buy traffic based upon a different filter set?

In plain terms, after the user selected their age, did the EDU competitor have a school (and thus a payment for that lead) that we did not have? If they did, we could identify the school and add it to our sales prospect sheet; if not, we would know there was no way to “convert” that lead.
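For illustration only, a probe like the one described above could be as simple as the sketch below: submit the same form with different “age” answers and record where each segment gets routed. The URL, form fields, and values are hypothetical placeholders, not a real competitor endpoint:

```python
# Rough sketch of the probe described above -- the URL and field names
# are hypothetical placeholders.

import requests

FORM_URL = "https://competitor.example.com/edu/step-2"  # hypothetical

def probe_age_routing(ages):
    results = {}
    for age in ages:
        resp = requests.post(
            FORM_URL,
            data={"age": age, "zip": "90210"},   # hypothetical field names
            allow_redirects=False,
            timeout=10,
        )
        # The redirect target (or returned page) hints at which school/buyer
        # that segment is sold to -- i.e., whether they can "convert" it.
        results[age] = resp.headers.get("Location", "no-redirect")
    return results

if __name__ == "__main__":
    for age, destination in probe_age_routing([18, 25, 35, 50]).items():
        print(f"age {age}: routed to {destination}")
```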

By going through Web “forms,” Google is more apt to understand the conversion rate for an advertiser, whether or not that advertiser has Google Analytics on their page. Further, by combining some of this crawler data with keyword bidding and clickthrough data, Google might be able to improve its algorithm. Oh, and they won’t be giving back that money to those they spidered.

This is a big story. Hopefully, Google will issue a release offering clear transparency about its agreements for using this data. At the very least, firms that spend vast amounts of money on Google per month, maybe $500,000 or more, should be checking their server logs.
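If you want a starting point for that log check, a rough sketch like the following would surface Googlebot requests against your form endpoints in a standard combined-format access log. The log path and form paths are placeholders you would swap for your own:

```python
# Hedged sketch of the server-log check suggested above: scan a standard
# combined-format access log for Googlebot requests hitting form URLs.
# The log path and form paths are placeholders.

import re

LOG_FILE = "/var/log/nginx/access.log"        # adjust for your server
FORM_PATHS = ("/quote", "/apply", "/lead")    # your form endpoints

# Apache/Nginx "combined" format: ... "METHOD /path HTTP/1.x" ... "user-agent"
LINE_RE = re.compile(r'"(?P<method>GET|POST) (?P<path>\S+) HTTP/[\d.]+".*"(?P<ua>[^"]*)"$')

hits = []
with open(LOG_FILE) as f:
    for line in f:
        m = LINE_RE.search(line)
        if not m:
            continue
        if "Googlebot" in m.group("ua") and m.group("path").startswith(FORM_PATHS):
            hits.append((m.group("method"), m.group("path")))

print(f"{len(hits)} Googlebot request(s) against form endpoints")
for method, path in hits[:20]:
    print(method, path)
```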

Stay tuned.


2 comments so far

  1. Jens on

    there was an article on that in theregister [theregister.co.uk] as well – it’s certainly controversial to crawl such forms, and it remains to be seen how exactly Google will do it. If webmasters don’t like it at all, they should simply disallow the Google crawler from crawling them. Just wondering whether that actually helps a site increase visibility if they have the right “high quality” content [whatever that means is not really clear yet, right?] or is only good for Google to improve their algos.

  2. […] to Search Engine Watch via OverPricedDad: Google is now crawling website forms for relevant content. In very simple terms Google is filling […]

