Downstream solutions. Upstream problems.

By Nick de la Mare

I saw an interesting article in the New York Times this weekend titled "Put Ad on Web. Count Clicks. Revise." The premise of the article goes something like this: because the web provides functionality to test every variation of a banner ad for effectiveness, the next big thing is tailoring advertising in the moment, and leveraging findings from click-thru rates to construct more relevant offerings for consumers.
If I had to construct a tag-line for the so-called "data practice" services cited in this article it would be "downstream solutions to upstream problems." From the media-buying perspective I understand the argument: if the chosen vehicle for the ad is wrong, the advertiser will recognize it faster and will be able to adapt on the fly. Quick changes in placement and timing make ads more effective at targeting particular populations. But from the standpoint of advertisers and brands trying to understand the consumers they serve, this service misses the boat.
Coming from a research-heavy design consultancy, I believe this effort represents not a huge step forward but a band-aid placed over a much larger issue. Ad agencies and the companies that hire them should be doing a much better job understanding their consumers before they ever put their banner ads out there.
The article cites a Vespa campaign of 27 web-based ads, with variations in messaging ranging from "Pure fun. And function" to "Smart looks. Smarter purchase." The second message, combined with a no money down, zero-percent interest offer, attracted 71% more responses than the average of other Vespa ads. The two underlying value propositions ("Vespa, all about the fun" vs. "Vespa, it's a prudent financial decision") represent wildly different core assumptions about the product and its users.
It seems like a no-brainer that doing a little research before designing the ads (speaking to customers and employees in-store, conducting contextual inquiries with existing owners, and tracking trends such as the rise of couponing and price consciousness) would surface the same insights the click-thru rates eventually did, as well as deeper data that could be leveraged to fill out the campaign and adapt the product offering itself.
I'm not saying that tracking click-thrus isn't sensible and smart; it's just reactive. Only after you put something out there can you judge the validity of your messaging, and when you do, your tool for judging that response is relatively blunt and binary (and the product, if off-base, remains fundamentally unchanged).
Taking a proactive approach instead, talking to people and testing your assumptions before ever constructing an ad, and then altering the product to align it more closely with your findings, allows you to build your offering holistically. Now your banners reflect your product, and vice versa, and there will likely be less need to retrofit the argument around a leap of faith.
The rise of data practices in digital advertising appears to be more of an effort by agencies to retain relevance than something that fundamentally creates value for the consumer. And calling it new is a bit of a misrepresentation: many of the old lessons of direct marketing are simply being ported over to the web by advertisers. Like the good old days of 800-numbers and rebate codes, I'm sure it'll be successful. But calling it a "radical new approach" may be an overstatement.