Facebook is a wonderful way to advertise ... if you like spending money like a sailor on shore leave. With its zillions of users, any budget can be spent quickly, and ineffectively, if not planned well. One of the larger challenges is figuring out your demographic targeting. With more traditional advertising platforms (e.g., the Yellow Pages) it was the world or nothing; online advertising, however, often offers some level of demographic targeting.
Facebook takes demographic targeting to the nth level, letting you target very specific interests, e.g., 'Peanut Butter and Jelly.' However, with so many interest groups to pick from, it's super easy to get overwhelmed, pick anything remotely relevant, and end up with a campaign where 90% of your budget is spent ineffectively. So what's a business owner to do?
In my case, I decided that, after months of guesswork, a full-fledged experiment in ad spending was needed to narrow down my most successful demographics for Spirit Pieces.
To structure this experiment, I decided to split my audience into four groups plus one control. Each of the four represented a different time period in the buying cycle; the control was pretty much everyone between the ages of 25 and 65. For each group, I set up a separate campaign and ad group with a different saved audience. Each one used the same image and roughly the same text.
One of the biggest goals I had for this experiment was to understand the buying intent of each of my groups. Spirit Pieces has some of the leading innovative artwork in the memorial space, and it gets a lot of 'pretty shiny' clicks. While it's wonderful to share the art on the site, at the end of the day I am a business and I need to convert. Enter Facebook Leads.
The first time I played around with Facebook Leads I wasn't that impressed. My normal link click usually costs under 20 cents; Facebook Leads cost me a lot more - not worth it, I thought. However, for answering the question of buyer intent it was, and is, critical.
I created a single lead form (shared between ads) asking not only for the standard email address, but also what people were interested in and how soon they were planning to purchase. Adding this question is what made the experiment worthwhile (more on that below).
To even out any daily bias, I ran the experiment for 2 weeks with a $10 daily budget, or $140 for each ad. (OK, I lied, the title should be 'What I learned spending $700 ...')
Each day I watched the leads come in and resisted ending the experiment early. I tried to spot early patterns in the data but after a few days stopped, as I realized I was driving myself nuts. As with all things, it went very slowly and then it was over too fast.
It took a bit of time to compile the data, but ultimately it did yield useful results.
The first graph shows how many leads I got per group (identified by letter - sorry, competitors). They were all roughly in the same ballpark, with Group E trailing by half. I was surprised by how strongly the control group did, even though it was second to last - go shiny bouncy ball!!
Number of leads per group
The next chart shows how much I spent per lead, along with each group's 'power ranking.' The power ranking is a calculation using the buying intent of the leads. I used a low number (1) for 'Under a year' and exponentially higher numbers for shorter horizons: 8 for '1-3 months' and 32 for 'Immediate.' The power ranking is the group's average lead cost divided by its average buying-intent score.
As an example, if the group's average lead cost was $3 and its average buying intent was 8 (1-3 months), the power ranking would be 3/8, or 0.375. To wit, a lower power ranking is better.
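For the curious, the power-ranking math can be sketched in a few lines of Python. The intent weights are the ones described above; the function name and the sample lead data are mine, purely for illustration:

```python
# Buying-intent weights from the article: exponentially higher for
# shorter purchase horizons.
INTENT_WEIGHTS = {
    "Under a year": 1,
    "1-3 months": 8,
    "Immediate": 32,
}

def power_ranking(intent_answers, total_spend):
    """Average lead cost divided by average buying-intent score.

    intent_answers: list of lead-form answers, e.g. ["Immediate", "1-3 months"]
    total_spend: dollars spent on this ad group
    Lower is better: cheap leads with high intent score lowest.
    """
    avg_cost = total_spend / len(intent_answers)
    avg_intent = sum(INTENT_WEIGHTS[a] for a in intent_answers) / len(intent_answers)
    return avg_cost / avg_intent

# The worked example from the text: two leads, $6 total spend,
# so a $3 average lead cost and an average intent of 8.
print(power_ranking(["1-3 months", "1-3 months"], total_spend=6.0))  # 0.375
```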
Cost Per Lead and Power Ranking
Here the differences between the groups stand out a bit more. As you can see, Group A had both the lowest cost AND the highest buyer intent, while Group D had the highest cost and the lowest buyer intent. While I was hoping for a single clear winner, knocking out one poor producer and validating that my targeting was better than random was a big win.
If I had the budget, the next step would be to further split each interest group into its own ad and run it. I could also repeat this experiment while playing with demographics. However, I be a wee little company, so this may be as far as I can go (at the moment) with this type of experimentation.
By Dave Blake
Owner, Spirit Pieces