Decreasing CPAs by 50% in 2 years, increasing CVR by 90%
I started managing creative for Yelp’s performance team in mid-2020. At the time, the majority of assets were being created by a partner agency. This is a great way to scale and expand learnings, but not having an in-house voice on message and brand can be a pain point.
Brand consistency matters, as does adjusting for each channel’s best practices. Both can be done in a thoughtful and scalable way with the right creative strategy and the right team. This case study focuses on Facebook for a B2B audience.
The vision:
Test creative at pace
Deliver assets at speed to counter fatigue and feed learnings
Improve creative quality of the assets
Adhere to design best practice, branding, & message
Discover more of what works and why
Ongoing learnings about audience, message, and design
Listen and learn
One core element was obvious from the beginning: the branding was off. The majority of the early work focused on coaching the agency’s designers on brand design, message, and tone, as well as on the general quality of the assets.
Next, I got the lay of the land. Metrics at the ad level really matter here: the top-level KPIs as well as the secondary “storytelling” data. Once I’d started compiling week-over-week (WoW) and month-over-month (MoM) data, I was able to begin tagging ads and creating a win/learn/fail rubric for creative types.
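If I were to sketch that rubric in code, it might look something like the following. This is a minimal illustration in Python with pandas; the column names, tags, numbers, and thresholds are all invented for the example, not the real data or cutoffs.

```python
# Hypothetical sketch: tag ad-level results, then apply a win/learn/fail
# rubric by comparing each ad's CPA against the account-wide benchmark.
import pandas as pd

ads = pd.DataFrame({
    "ad_id": ["a1", "a2", "a3", "a4"],
    "tag":   ["illustration", "stock_photo", "illustration", "testimonial"],
    "spend": [1200.0, 800.0, 950.0, 400.0],
    "leads": [60, 20, 50, 12],
})
ads["cpa"] = ads["spend"] / ads["leads"]

# Benchmark CPA across all ads; the +/-10% bands are illustrative.
benchmark = ads["spend"].sum() / ads["leads"].sum()

def rubric(cpa: float) -> str:
    if cpa <= benchmark * 0.9:
        return "win"    # clearly beats benchmark: iterate on it
    if cpa <= benchmark * 1.1:
        return "learn"  # near benchmark: keep testing the variable
    return "fail"       # underperforms: retire the concept

ads["verdict"] = ads["cpa"].apply(rubric)
print(ads[["ad_id", "tag", "cpa", "verdict"]])
```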
In the meantime, we tested new creative weekly, at this point proposed by the agency and heavily edited by my team. Initially the agency encouraged a complete spread of asset types, but I found isolating variables and changing one element at a time the best method for learning what worked.
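As a rough illustration of that one-element-at-a-time approach (the ad fields and copy here are made up for the example):

```python
# Minimal sketch of single-variable testing: each variant changes exactly
# one element of the control ad, so differences in results can be
# attributed to that element alone.
control = {
    "headline": "Reach more customers",
    "visual": "brand_illustration",
    "cta": "Get Started",
}

# Only the headline varies; visual and CTA stay fixed.
headline_variants = ["Grow your business", "Be found by local customers"]
tests = [{**control, "headline": h} for h in headline_variants]

for ad in [control, *tests]:
    print(ad)
```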
Creative strategy
Once I had enough data, I created a tagging system to analyze what worked. Broad buckets are best initially, before drilling down into specific message keywords or design elements. For example, I quickly learned that the stock imagery the agency was selecting was not as effective as Yelp’s branded illustrations or photography. Common sense is crucial with tagging buckets: a data point is only as reliable as the label it’s been given.
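The broad-bucket analysis itself can be as simple as rolling results up by tag and comparing cost per acquisition across buckets. Again, a minimal sketch with invented tags and numbers:

```python
# Illustrative roll-up of ad results by creative tag ("bucket").
# All data, tags, and column names are made up for the example.
import pandas as pd

ads = pd.DataFrame({
    "tag":   ["illustration", "stock_photo", "illustration", "brand_photo"],
    "spend": [1200.0, 800.0, 950.0, 400.0],
    "leads": [60, 20, 50, 18],
})

by_tag = (
    ads.groupby("tag")
       .agg(spend=("spend", "sum"), leads=("leads", "sum"))
       .assign(cpa=lambda t: t["spend"] / t["leads"])
       .sort_values("cpa")  # cheapest acquisitions first
)
print(by_tag)
```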
In addition to investigating results per tag, I dug deep into audience and conversion data, campaign structure, and channel best practices. Cross-referencing key points with the results was what led to truly impactful creative concepts. From there I had enough information to develop a system.
Develop the system
Once we had a solid set of learnings around design and message type, I directed our team to start iterating on core top performers and on the design/message tags that outperformed. This allowed us to get extra mileage from our hero creative and find new top spenders far more often.
To keep learnings and strategy agile, I then developed an editorial calendar and matrix system for managing new creative concepts and weekly testing. I would plan four weeks out, but kept on top of weekly data so we could act nimbly as needed.
A scalable and speedy balance of testing for performance (iterations) and testing for knowledge (new ideas), grounded in learnings, was what led us to more and more wins. With each new quarterly brief, as well as week-over-week testing, we incrementally increased the quality, honed the message, and pushed the design.
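To show the shape of such a calendar, a four-week plan might pair one iteration of a proven winner with one new concept each week. The cadence, tags, and concept names here are hypothetical:

```python
# Hypothetical four-week testing matrix: each week pairs an iteration of a
# winning tag (testing for performance) with a new concept (testing for knowledge).
from itertools import cycle

winning_tags = ["illustration", "testimonial"]        # proven performers to iterate on
new_concepts = ["value_prop_headline", "social_proof_stat",
                "product_ui_shot", "seasonal_offer"]  # fresh ideas to learn from

tag_cycle = cycle(winning_tags)
calendar = [
    {"week": week, "iterate_on": next(tag_cycle), "new_test": concept}
    for week, concept in enumerate(new_concepts, start=1)
]

for slot in calendar:
    print(slot)
```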
Partner well
The performance team and the creative team should never be two sides of a coin that never meet. A crucial part of this win was partnering closely and often with the performance team. Earning trust and communicating well meant we could find the right balance: pushing the creative without getting in the way of scale, speed, and spend.