Archive for the ‘lead generation’ Category

As if LendingTree needed more problems…

LendingTree, in the midst of a major revenue tailspin, has another problem to deal with, one that compromises the brand.

According to the Charlotte Observer, LendingTree alerted customers to a breach of their consumer profile security. Worse, it appears that this breach was ongoing for 1.5 years. According to the same report, “More than 160 data breaches were reported during the first quarter of 2008.”

How does this happen? It’s rather incredible that a company with such a large headcount, and arguably the leading mortgage lead generation firm, could let this continue for as long as it did.

Even Tier 2 lead generation firms guard their data more zealously. The lead generation space is usually inhabited by more questionable marketers than some other industries. However, with the continued news out of ValueClick and sensitivities over behavioral targeting and customer acquisition strategies, this is really poor timing…and certainly sure to at least impact near-term form conversion in a negative way.

Looks like Diller split the company apart just in the nick of time.

If I’m a unique supply-side firm, like Zillow or Bankrate even, I’m sending an email to my users right now stating how I protect their data. That’s sure to win points and increase brand equity.

More commentary:

Lead MarketWatch

Lead Critic…who had the story on Monday


AdTech: Some brief follow-up comments


JT Batson from the Rubicon Project was kind enough to respond in the comments below. Please check them out. Once again, thanks to all who respond as this blog develops its voice.

According to JT, their publisher acquisitions at the show made the booth expenditure worth it. So I stand corrected. Congratulations on the acquisitions, JT.

JT, what was the major selling point that resonated with the publishers you spoke with, versus competitors like PubMatic, AdBrite, and others? I’m happy to publish the answer in a separate post.


I had the chance to attend AdTech this week. I meant to write a more in-depth piece, but I’m more intrigued by going through Google’s numbers this morning.

Here are some bulleted comments:

– Google and Yahoo had a presence (Yahoo in the form of the former BlueLithium employees); interesting, but that’s about all

– AdReady had a large display; if you are not familiar with AdReady, check them out. They help performance marketers manage production costs and ROI in display. It will be interesting to see their product adoption.

– The Rubicon Project, for some reason, had a rather large booth. If you are not familiar with the Rubicon Project, check them out. You may like their single-integration product, or you might regard them as a souped-up Glam. The interesting thing is that the AdTech audience really isn’t their target audience, in this writer’s opinion.

– Nielsen and comScore had a small presence and few visitors.

– IndexTools completely abandoned their booth in the wake of the Yahoo acquisition.

– Very little in the way of targeting add-ons, though eBureau and TargusInfo were displaying lead quality and validation tools.

For a more in-depth review, visit DM Confidential.

A potentially big story if you are a lead generator

Valleywag ran a story yesterday.

I almost fell out of my seat. I then followed the link to the primary source at Search Engine Watch.

I have not found a further post on DMConfidential, the authority for direct marketers, on this yet.

If the story is true, every performance marketer should immediately look at their server logs.

I have not seen any other publications pick this up yet.

The story, if you did not click the links, is that Google is crawling forms of “high quality” Web sites.

What the story found incredible is that this amounted to hijacking of corporate data.

What this writer found preposterous is the next step of the equation in terms of reverse engineering rulesets. Form decisioning and rulesets are core assets of companies like LowerMyBills, LendingTree, Autobytel, and even to some extent AT&T’s Web forms.

Decisioning trees and rulesets increase conversion by showing relevant content to the user (and alternate user paths) once the user has self-profiled.

Hopefully, you see where I am going here.

In my former role, one project we considered in terms of competitive strategy was using that very same crawler, for example, to see what the “service rate” was for a competitor on a product. To use an example from the EDU category: was a competitor segmenting by “age” for EDU, and if so, did they route the user to a school that we didn’t know about, or buy traffic based upon a different filter set?

In plain terms, after the user selected their age, did the EDU competitor have a school (and thus a payment for that lead) that we did not have? If they did, we could identify the school and add it to our sales prospect sheet, or we would know that there was no way to “convert” that lead.

By going through Web “forms,” Google is more apt to understand the conversion rate for an advertiser, whether or not the advertiser has Google Analytics on their page. Further, by combining some of this crawler data with keyword bidding and clickthrough data, Google might be able to improve its algorithm. Oh, and they won’t be giving back that money to those they spidered.

This is a big story. Hopefully, Google will issue a release offering clear transparency on their agreements in using this data. At the very least, firms that spend vast amounts of money on Google, maybe $500,000 or more per month, should be checking their server logs.

Stay tuned.

The post that LeadPoint and Quinstreet may not want you to see

Actually, put a big asterisk next to the title for now. A colleague of mine asked me to help him evaluate the solution of a company called Kaleidico today.

(A few disclaimers: I don’t know anyone at Kaleidico, nor do I know how much currency and traction they have in the space.)

Kaleidico bills itself as a tool to manage lead quality, but what really enraptured me when I visited their Web site was their public display of a “30-day application” rate by lead provider across what would seem to be their clients. (I tried to cut and paste an image here; it didn’t work, so just visit the site and check it out.)

Having spent time in a very competitive lead generation vertical, lead quality was evaluated across every grain:
– by publisher
– by day (would you believe that quality of leads drops to make numbers, oops, I mean towards the end of the month)
– by creative message
– even by lead client

By lead client? Yes, there are consistently lead buyers who purchase leads yet fall below the baseline application rate.

For example, if all clients’ close rates were “down” for a specific source and below baseline, that might finger the publisher as the culprit for quality. However, if all clients were up against the baseline and one client was down, we would assume it was a client issue. If a certain lead client was always down against the average, we had two concerns:

1) That the customer wasn’t getting the best option or the customer wasn’t being serviced
2) More importantly for our business, that a client who can’t close leads on par with the average is destined not to be a client (or in business) upon a shakeout.
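The diagnostic logic above can be sketched in a few lines of code. This is a minimal illustration, not anything Kaleidico or any lead network actually runs; all client names, publisher names, and close rates below are hypothetical:

```python
# Illustrative diagnostic following the rule above:
#   - if EVERY client is below baseline on one source, suspect the source (publisher)
#   - if ONE client is below baseline across every source, suspect the client

def diagnose(close_rates, baseline):
    """close_rates[client][source] -> (suspect_sources, suspect_clients)."""
    clients = list(close_rates)
    sources = {s for rates in close_rates.values() for s in rates}

    # A source is suspect when no client manages to close it at baseline.
    suspect_sources = sorted(
        s for s in sources
        if all(close_rates[c].get(s, baseline) < baseline for c in clients)
    )
    # A client is suspect when it underperforms baseline on every source it buys.
    suspect_clients = sorted(
        c for c in clients
        if all(rate < baseline for rate in close_rates[c].values())
    )
    return suspect_sources, suspect_clients

# Hypothetical 30-day application rates, for illustration only.
rates = {
    "ClientA": {"pub1": 0.02, "pub2": 0.08},
    "ClientB": {"pub1": 0.03, "pub2": 0.09},
    "ClientC": {"pub1": 0.01, "pub2": 0.02},  # down everywhere -> client issue
}
print(diagnose(rates, baseline=0.05))  # (['pub1'], ['ClientC'])
```

In this toy data, every client underperforms on pub1 (a publisher problem), while ClientC underperforms everywhere (a client problem), matching the two concerns above.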

What Kaleidico is doing is creating a network effect amongst buyers by exposing their close data in aggregate. This is a very powerful tool, if accurate. I say if accurate, because I would like to understand how Kaleidico reaches these numbers.

If accurate, consider some of the consequences of the tool though:

– as a lead buyer, it allows me one further step down the funnel to evaluate my distribution or marketing mix (if I haven’t built this in-house already)
– as a lead buyer, it tactically allows me to estimate my effective lead price by provider before trying their leads
– as a lead buyer, it allows me to understand if my agents or sales staff are closing leads above or below baseline on a per-provider basis
– as a loan officer or agent, it creates a brand value around a lead. I might work a Root lead harder than a Quinstreet lead in this case (which impacts the close rate).
– as a lead provider, it either gives me pricing leverage or handicaps my pricing

However, to make Kaleidico’s chart truly valuable and actionable, you need to show not only the close rate by provider (Root, LMB, etc.) but also the close rate by advertiser or client (UofP, AIU, etc.). Isn’t that fair?

For example, why should LeadPoint suffer if Countrywide buys all their refinance leads at the best price but has a terrible close rate? (Disclaimer: I know nothing of Countrywide’s conversion; I’m using Countrywide just to make a point.)
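The two-dimensional view argued for here, close rate broken out by provider and by client, is easy to sketch. Again, all provider and client names and lead outcomes are hypothetical, purely to show the shape of the report:

```python
# Build a close-rate matrix keyed by (provider, client) from raw lead outcomes.
from collections import defaultdict

def close_rate_matrix(leads):
    """leads: iterable of (provider, client, closed) -> {provider: {client: rate}}."""
    counts = defaultdict(lambda: [0, 0])  # (provider, client) -> [closed, total]
    for provider, client, closed in leads:
        cell = counts[(provider, client)]
        cell[0] += int(closed)
        cell[1] += 1
    matrix = defaultdict(dict)
    for (provider, client), (won, total) in counts.items():
        matrix[provider][client] = round(won / total, 3)
    return dict(matrix)

# Hypothetical lead records: (provider, client, did_it_close).
leads = [
    ("Root", "LenderX", True), ("Root", "LenderX", False),
    ("Root", "LenderY", True), ("Root", "LenderY", True),
    ("LeadPoint", "LenderX", False), ("LeadPoint", "LenderX", False),
    ("LeadPoint", "LenderY", True), ("LeadPoint", "LenderY", False),
]
print(close_rate_matrix(leads))
```

Reading across a row shows whether a provider is weak with every client (a lead quality problem) or only with one buyer (that buyer’s problem), which is exactly the distinction a provider-only chart hides.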

Given their technology and business model, this probably will not occur, because Kaleidico is getting paid by the client.

For an industry that is razor-focused on daily metrics and sees sporadic innovation, Kaleidico’s product, again if accurate, is something to pay attention to.

More on lead gen from LeadsCon:

What agencies are seeing on leadgen.

A perspective from the search world.

Comparing Zillow and….Nextag? Really?

The recent press around Zillow’s new service makes me recall the early 2000’s.

For background, Zillow is launching a service that allows mortgage seekers to keep their identity private while soliciting bids from mortgage lenders looking to secure their loan.

It’s rather ironic because Zillow’s roadmap is nearly the exact inverse of NexTag’s–only a few years later.

For those that don’t know, NexTag started out as a reverse-auction model (but didn’t have the traffic), essentially allowing customers to name their price for any hard good. Zillow is betting that its collection of organic traffic is enough critical mass to make a reverse auction worthwhile for mortgage lenders who have seen their lead volumes drop in the wake of the subprime correction.

It’s a good bet–and personally I’d love to see a reverse auction work–but all the data suggests that it won’t.

It gets back to the ROI vs. effort-level tradeoff. If I’m a mortgage lender, I want to know how many leads I have coming in to satisfy my call center folks, at what price, at what quality level, and at what volatility (all things you’ve heard in this column before). By putting lead supply essentially in the hands of the seller instead of the buyer, this model creates a great deal of volatility, and lenders may not go for it, meaning the buyer may not get the best price for their mortgage (not enough competition).

What Zillow has going for it, obviously, is the origination point. Whereas NexTag still has to purchase inventory through search and display media at a cost, Zillow’s audience is captive and costs nothing to redistribute.

I’ll review this again in a few months. I think initially lenders using Zillow will use it much like LeadPoint to supplement lead volume on a remnant basis once their current lead sources are exhausted on a daily and monthly basis.

And that is not good for the consumer.

More on lead gen: Where’s the distribution?

Earlier this month, I commented on incentive marketing practices driving lead generation for Yahoo! Autos.

Some of today’s comments raise the question: Where does the distribution come from for a burgeoning lead generation firm?

To be fair, performance marketing is very Darwinistic; the fit survive by adapting and mastering their domains. However, consider some of the major news surrounding performance-based media in the last six quarters:

– Q3/Q4 2006 – Google changes their quality ranking, which has a profound impact on lead gen landing pages. Search traffic declines for lead gen companies.

– Q1 2007 – The mortgage shakeout. Well, that actually collapsed a vertical, not a distribution point, right? Sort of. It was the high economic monetization of mortgage ads that got many companies into display marketing. This allowed companies to sprinkle in other offers to generate traffic when mortgage marketing was not working. A common adage heard in lead gen: “Nothing monetizes like mortgage.” Take that away, and now you have companies like LMB trying desperately to optimize around auto insurance. The numbers just don’t compute.

– 2007 – The meteoric rise to economy of scale of Facebook (in parallel with MySpace), along with applications that contribute to more social media usage. Why does this matter? Less time in email (or for many potential lead gen customers, less exposure to spam and questionable subject lines). Combine that with better spam filters and declining open rates, and circle “email distribution” as a tougher acquisition channel.

– Q4 2007/Q1 2008 – Potential FTC legislation around behavioral marketing. In-market indicators in display? Poof, maybe.

To be sure many companies will adapt to these changes and continue to aggressively grow their businesses.

LowerMyBills upsells credit report customers. Quinstreet has moved into new verticals.

It will be interesting to see the new marketing vehicles that take hold if some of the tried and true lead generation distribution points stumble a bit.

Lead quality always drops, reviewing the classic UofP – Quinstreet issue

Update: John DeMayo, formerly of, has a good point in the comment section here…to which I’ve added a counterpoint. Thanks, John.


The post on lead verification is focusing my thoughts on lead generation today. 

I’m not going to use the Quinstreet-University of Phoenix analogy here.

Actually, it tells a great story and it’s rather short. So I’ll use it. I think it was in 2005.

In bulletpointed fashion:

– Quinstreet drives leads to University of Phoenix to the tune of half of Quinstreet’s revenue.

– This is done mostly through affiliate search, which brings quality leads to University of Phoenix at a respectable price.

– Quinstreet can’t figure out how to grow more profit with University of Phoenix, so they acquire lower quality leads at a lower price from distributors and blend these into the University of Phoenix feed.

– Quinstreet grows the University of Phoenix relationship.

– University of Phoenix gets frustrated and sends out an RFP for a new company to manage their lead distribution. That company becomes

The problem with the lead gen world is one of real time profit and companies without secret sauce.

Lesson: if you are using a new lead provider, quality will always, always, always be better in the beginning. Know your distributions; pay attention to their business goals and the challenges with their business. The only way a new lead gen company can really get the attention of a major advertiser or client is to drive them high quality leads.

Back to the metrics again, performance marketing is a relationship or index amongst the following factors: price, volume, quality, and volatility.

A new lead provider to a client (whether new or legacy) will be able to grow a relationship by focusing, typically, on quality first. Why? Let’s examine:

– Price: No way a leveraged lead buyer will pay top dollar for a new lead provider so that’s out

– Volume: Either the new lead provider is just growing their business (small volume) or needs to take the volume away from another partner; either way, this typically does not happen in a day’s, week’s, or even month’s time.

– Quality: Nothing like high quality leads. High quality leads are the safe harbor in any market. They convert, and that keeps the business going, especially in a recession.

– Volatility: What converts for one advertiser typically needs some massaging before converting for another. There are always tweaks. Expect some volatility.

By the way, fast forward to Q3 2007: University of Phoenix buys Aptimus to take their lead quality in-house.

Lead verification: Tough one

Greg Yardley linked to this blog last week and he’s got some comments on lead verification on his blog:

I thought I would augment his post.

Lead verification is the process of ascertaining, with some confidence factor, that the information in a lead is accurate.

Most firms bundle lead verification in with a lead quality review (many major lead companies have whole departments looking at lead quality).

However, the issue of lead quality is not one that can be solved at a service-based level. It’s sort of like the conundrum of Efficient Frontier, but I’ll leave that for another post.

To evaluate any solution effectively, I’ll review it against the two classic product development questions:

– What’s my ROI?

– How much effort do I need to accomplish it?

Unfortunately, upon this calculation, the slew of offsite lead verification services just doesn’t work.

In terms of lead quality, Greg is on target that there are a number of other factors to consider. These include: resubmission of a lead, resubmission of a doctored lead, and time sensitivity of that lead (even within a daily segment).

Do any lead verification services solve all of these problems? No.

And therein lies the problem.

If no solution can solve 90% of the problem, I am spending too much to work with an outside service.

I need to: negotiate and sign a contract, integrate with their system, ensure that I have support for when their system has releases, train my staff on using the data, etc. Further, I have concerns about the company’s trajectory, i.e., might my competitor buy them one day?

Integrations within the real-time supply chain in lead generation (from distribution source through lead transmission) are just too difficult to manage.

Then again, if anyone does have a great solution, maybe I’ll reconsider, but only after some other company works out the bugs.

Glam Media: I’ll just link to it

Awesome post by Michael Arrington at TechCrunch on Glam Media:

How the Glam Media saga plays out will be really telling as to how far the online media industry has come in terms of ethics, media analysis, and agencies growing up.

Here’s hoping that publishers have not bet their scalability and production costs on Glam or that agencies understand and can explain the value of impressions that are served on Glam.

If not, let’s hope we’re not doomed for another eFront Media. Read the post if you haven’t had a chance.