
[Case Study] 55% Increase in Leads for Firegang Marketing

Firegang Dental Marketing helps dentists grow their practices. Firegang hired me in June 2016 to improve the rate at which visitors to their free-book landing page complete their order.

In plain English, Firegang has a book: this book…

[Image: Firegang's book]

They give it away for free on a landing page. The problem? Oh wait…

The Problem

Firegang had a very respectable 46% conversion rate from the book landing page to the checkout page.

The trouble was that only 33% of people who clicked to the checkout page were completing the order. That means an astounding 67% of people who already said “I want the book” were then abandoning the cart.

Here’s why that was important: book buyers were opted into a webinar funnel, and on the webinar Firegang offers their flagship marketing services. With only 33% of people completing the book order, a large number of qualified leads were likely never seeing that offer.

Firegang gave me a single directive: improve the “opt-in-to-book-order” rate.

I’ll talk about the methodology below, but for those of you who like to speed read, here’s a sneak preview of the results.

The Results

[Screenshot: split-test results]

To summarize what you’re looking at, this is a split test between the pre-intervention funnel and the post-intervention funnel.

 

Pre-intervention:
  • 126 people clicked “I want the book”
  • 38% of them completed their order
  • 48 people ended up on the webinar

Post-intervention:
  • 115 people clicked “I want the book”
  • 59% of them completed their order
  • 68 people ended up on the webinar

That’s a 55% improvement (59 ÷ 38 ≈ 1.55).

I won’t go into Firegang’s real financial details, but suppose they sell consulting for only $2,000 and convert at only 1% from the webinar.

At the old conversion rate, for every 1000 opt-ins, they’d get 380 on a webinar, and sell to between 3 and 4 of them. Let’s call that $8000.

At the new conversion rate, for every 1000 opt-ins, they get 590 on the webinar, and sell to between 5 and 6 of them. Let’s call that $12,000.
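For those who like the arithmetic spelled out, here's a quick Python sketch using the illustrative numbers above (not Firegang's real figures):

    def webinar_revenue(opt_ins, order_rate, close_rate=0.01, price=2000):
        """Estimated webinar attendance and revenue for a batch of opt-ins."""
        on_webinar = opt_ins * order_rate    # book buyers land on the webinar
        sales = on_webinar * close_rate      # e.g. 1% of attendees buy consulting
        return on_webinar, sales * price

    print(webinar_revenue(1000, 0.38))  # old rate: (380.0, 7600.0), call it $8,000
    print(webinar_revenue(1000, 0.59))  # new rate: (590.0, 11800.0), call it $12,000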

Ok, how did we do it?

How We Did It

Initially, Firegang asked me to rewrite their cart-abandonment email sequence, which I did. But my theory was that the biggest win would be in the checkout page itself.

Here’s what the funnel looked like:

[Screenshot: the funnel diagram]

As you can see, there are two pathways to the webinar: a direct book order from the checkout page, and a recovered order via the cart-abandonment email sequence.

I asked for the cart-abandonment figures, and when I saw only 33% were completing their order, I knew something must be wrong. Here’s why:

  1. You go to Amazon to get a widget. You add it to your cart, then you go to your cart page.
  2. You then decide, “meh, I don’t want it”. Even though it’s free.

When two out of three people abandon a free item at the cart, the problem is usually the page, not the people.

So I checked out the checkout page…

[Screenshot: Firegang's original checkout page]

That’s the above-the-fold portion. This isn’t standard for checkout pages. This looked more like a sales page. Here’s an example of a standard checkout page…

[Screenshot: a standard checkout page]

That’s for Ryan Levesque’s excellent book, Ask. Notice anything?

The checkout page is meant to collect shipping information with a minimum of friction, and entice the customer to confirm. That’s it.

Firegang’s existing page was making a great case for the value of the book to people who already wanted it and just wanted to check out hassle-free.

On this hypothesis, I drew up a wireframe and sent it over…

[Screenshot: the new checkout-page wireframe]

It took two months for the split test results to come back, but this simple change produced the potential four-figure earnings difference I described above.

What’s Next?

Remember that I also rewrote their email series.

The old email-to-order conversion rate was 6.57%. Think about that: even if I doubled that rate, only 6 or so additional leads out of every 100 cart abandoners would end up on the webinar. So it’s a lower-leverage intervention point than the checkout page.
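To put rough numbers on that leverage comparison, here's a back-of-the-envelope Python sketch using the figures from this post (note the bases differ: the checkout fix applies to everyone who clicks through, the email fix only to abandoners):

    # Checkout-page fix: completion went from 38% to 59%,
    # i.e. ~21 extra webinar leads per 100 checkout-page visitors.
    checkout_lift = (0.59 - 0.38) * 100

    # Email fix: even doubling the 6.57% email-to-order rate only
    # recovers ~6-7 extra leads per 100 cart abandoners.
    email_lift = (2 * 0.0657 - 0.0657) * 100

    print(f"checkout: +{checkout_lift:.0f}, email: +{email_lift:.1f}")  # 21 vs 6.6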

But with the biggest win out of the way, we’re currently testing the new email series. I’ll be back to update this post when we have the results.

Want to learn more? Get my free 5-minute report on the 3 biggest profit-levers most businesses don’t even know they’re sitting on.

Case Study – $7000+ in One Week

In Q2 of 2016, Taylor Pearson and I collaborated on the launch of his first course. After 3 months of experimentation and fine-tuning, we produced a result that allowed us to take screenshots like this:

[Image: Taylor's before-and-after launch results]

Taylor would caution me to include the caveat: that the launch in the “after” photo was to a finely curated cohort.


I want to go one further and make the caveat the entire point of this case study.

I love reading success stories, but we’re often left with correlation/causation confusion: did you convert gangbusters because your copy was fire, or because you found a seamless product-market-fit for a bleeding-neck pain?

Taylor and I took an iterative approach to his course rollout:

  1. A “soft launch” in April to his most engaged cohort to date: 150 people who had downloaded his Antifragile Planning Template, upon which the course would be based. We wrote copy based on the results of a “deep dive” survey to that group.
  2. A secondary launch in May to a cohort of around 1,000 (randomly selected from his email list), incorporating product and positioning improvements gleaned from the first round’s results and surveys.
  3. A “rollout” in late June to a carefully segmented cohort of the whole list, again incorporating the learnings and survey results from the second-generation launch.

The “rolling” launch gave us a luxury not available to all course-creators: a “control group”. In at least 3 instances, we changed only one variable and saw an order-of-magnitude difference in results.

(In at least a few others, I changed too many things, creating confusion about what was working and what wasn’t, a mistake I won’t repeat.)

All the same, there are two things I can tell you with high confidence worked. (I’ll also tell you about the things we have a good hunch about, but for which we can’t prove causation.)

2 Things We Know for Sure Worked

Cohort Control

For the final (“successful”) round of the launch, we launched to a group of only 260, all of whom had “opted in” after we sent a week of “teaser” content from the course to Taylor’s entire list.

Quick aside: Taylor chose 3 tantalizing excerpts from his course and embedded each on its own landing page on his site. Then, after bi-monthly “teasers” at the end of Taylor’s regular essays, we hit the whole list with emails on Monday, Wednesday, and Friday, each linking to one of the course excerpts.

We asked people to “opt in” just to participate in the launch, and set a time-limit, to maximize enrollment.

[Screenshot: the opt-in landing page]

By asking people to “opt in”, and selling only to those who demonstrated interest, we accomplished two things:

  • We identified the cohort of Taylor’s list with the most intense interest in the course, against which we could measure conversions.
  • We felt we could sell more unabashedly to those who had made the micro-commitment to opt in, since they had already marked themselves as interested. As I’ll describe below, we emailed these folks an average of twice a day during launch week.

 

Note from Taylor: I wanted to do this because I didn’t want to send a shitload of launch emails to everyone on the list who was potentially a long-term asset but not interested in the masterclass. I’ve had lots of interesting opportunities come up from cool people (partners at 500 Startups, PE guys, etc.) I’d like to stay in communication with, and who don’t want to get 10 emails in three weeks during a launch. I also try to be very conscious that most people have two inboxes: one they check, and one they let accumulate as a swipe file/newsletter. I do everything I can to stay in the latter.

Dopamine Week/Content Marketing

This is a secondary effect of the above. While creating the course excerpts gave us a way to identify the most engaged cohort of Taylor’s list, it also let us “educate” them and “warm them up”, so they knew roughly what to expect before we launched.

After we converted badly to the May cohort (those chosen at random), Taylor proposed this style of “feeler” launch for the final round. We called it “dopamine week” because it let people feel the “dopamine release” of getting a small win from his material.

How do we know for sure it worked? If what we were measuring was whether giving people a dose of the course material would make them want more, good metrics would be whether they opted in for the launch and, more importantly, whether many of them ended up buying.

If either the response to the “teaser/dopamine” week or the eventual sales from the opted-in cohort had been weak, we’d have known that content from the course wasn’t the 80/20 of selling it. Instead, people did opt in, and a high percentage of those folks ended up buying.

3 Things We’re Pretty Sure Worked

Copy/Positioning/Surveys

Between each pair of launch iterations, Taylor and I surveyed both his buyers and non-buyers, asking the former why they decided to buy and the latter why not. We also embedded “survey” style questions in the emails leading up to launch week: questions like “what’s your biggest challenge when it comes to productivity?” and “what’s the issue you most hope the course will help with?”

Can you ever be 100% sure copy made the difference unless you A/B test two sales pages with the same cohort and get statistically significant data? No. But we’re pretty sure it helped.
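For the curious, here's a minimal sketch of the kind of two-proportion z-test you'd run on such an A/B test. It's plain Python, and the counts are made up for illustration, not our real launch data:

    import math

    def two_proportion_ztest(buyers_a, n_a, buyers_b, n_b):
        """Is the difference between two conversion rates statistically significant?"""
        p_a, p_b = buyers_a / n_a, buyers_b / n_b
        p_pool = (buyers_a + buyers_b) / (n_a + n_b)  # pooled rate under the null
        se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        z = (p_a - p_b) / se
        # two-sided p-value from the normal CDF
        p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return z, p_value

    # Hypothetical: page A converts 40/500 visitors, page B converts 62/500.
    z, p = two_proportion_ztest(40, 500, 62, 500)
    print(f"z = {z:.2f}, p = {p:.3f}")  # p < 0.05 suggests a real difference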

What we changed: After reviewing Taylor’s initial deep-dive survey, I had a mental picture of a new entrepreneur, or somebody just on the verge of leaving a day job. He got into this for the freedom to choose his own path and live anywhere, but entrepreneurship is proving a lot more stressful than advertised.

After some spirited discussions, Taylor convinced me to think of someone with 5-10 years in the game, with ambitions to emulate heroes like Warren Buffett and Steve Jobs.

Why we’re pretty sure it helped: While the final version of the copy converted well, it’s hard to know exactly how big a part the copy played. But we have pretty strong anecdotal evidence: the survey results we got back started “echoing” the story we were already telling about the course. When it felt like we weren’t discovering anything new from the responses, we were pretty sure we had it dialed in.

Nate: My hunch is that the focus on a single pain point and value prop probably had a greater effect than the particular one we chose. Once we had conceptual agreement, you were a lot more comfortable turning up the emotional intensity of the copy.

Taylor’s Note: I basically didn’t want to sell someone on the idea that I had a magic pill that would let them quit their job and start a business. It’s a productivity masterclass, so yeah, it might help you do that, but that’s not the purpose. This was more to do with long-term branding considerations. TBH, it would not surprise me if Nate’s original copy would have sold better.

Massive Email Launch Week

“What would launch week look like if we sent twice as much email?” Taylor asked me on a strategy call.

What we changed: We decided to send a variation on an email I have in my evergreen funnel for 80/20 Drummer. Whenever someone clicked to the sales page but didn’t buy, they’d get an email: “I notice you checked out the course but haven’t pulled the trigger. Is there anything I can answer for you?”

Taylor took it one step further, asking to get on the phone with the clickers – even offering his cell # in the email.

The result was a real-time stream of questions and feedback that we then incorporated into future emails. Instead of sending something rote and generic, like “Reminder – course is still open”, this allowed us to talk about questions he’d received just the day before. We even got to have some fun with it: a customer who nearly decided not to buy because of a video (more on that below) wrote a constructive critique that became the whole subject of an email: “I get it – I should stick to books”.

Why we’re pretty sure it helped: Again, the only true way to measure the effectiveness of a strategy like this would be an A/B test between two random samples of the same cohort. But the volume of questions Taylor’s extra emails generated, not to mention the open and click rates of the “off-the-cuff” emails, told us they drove engagement pretty well.

Nate: What gave me the “spidey sense” this was succeeding was the response your self-effacing, “off-the-cuff” emails got. It’s also hard to argue that more proof is bad, and it was great to be able to incorporate some of the early rave-reviews into the subsequent emails.

Taylor’s Note: I felt way more comfortable sending a lot of messages during launch week because of the dopamine campaign. We explicitly told people, “if you click here, we are going to try very hard to sell you stuff, because you’ve expressed that you think you could benefit from it.”

We saw almost no unsubscribes from the cohort that opted in.

Shelling Out for Decent HTML for the Sales Page

After we switched over to Teachable’s native sales page, Taylor made the decision to shell out some money to my developer for decent HTML. The result was a clean sales page.

[Screenshot: the new sales page]

Why we’re pretty sure it helped: We didn’t split test it, but plenty of others have. And it’s hard to go wrong with a clear offer above-the-fold, and a visual flow that draws the reader in. We asked my developer to make a banner that included the video player on the left and clear CTA on the right, with the text headline visible below.

Compare that to the default sales page, which forced us to embed the video player beneath the welcome banner, and felt very distracting and “out of the box”, instead of cleanly-branded in Taylor’s aesthetic:

[Screenshot: Teachable's default sales page]

Nate: We did all that hard work on the positioning, and it was a bummer to see it buried by clunky graphics on the first-generation sales page. My feeling is: it’s intuitive that without good copy, people won’t buy. And if you’re going to bother to write good copy, you should make sure people can see it clearly; otherwise your efforts are wasted.

2 Things I Wouldn’t Do Again

The following are things that I know for sure didn’t help, and am 85% sure hurt.

Changing the Format of the Deliverable Between Launches

When we rolled out to the initial cohort, we offered a personal coaching call with Taylor as the deliverable. The idea was to validate without spending any time creating the course ahead of time. While the call sold, it’s my belief that we didn’t get a lot of good information about future deliverables/price points from the initial launch.

Why I’m pretty sure it hurt us: When we switched from a live call to a webinar, conversions went down, and when we switched from a webinar to the first generation of the Teachable course, they went down again. Because we were also launching to progressively colder cohorts, it’s impossible to tell exactly what part the changing deliverables played, but that’s just the point: next time I’d validate with as close to the final deliverable as I could. Then we’d know for sure that the deliverable wasn’t the problem.

Initially, we offered Taylor’s personal time. Then we took it off the table in future incarnations. As such, we knew people would pay for Taylor’s time, but we didn’t know if they’d pay for an automated course.

If I had this to do over again, I’d solve the problem of not building anything until we got paid by trying to pre-sell an MVP version of the eventual Teachable course, then spending 2 weeks building that MVP.

Taylor: I did this because it basically let me get paid to do customer development. I learned a lot that shaped the actual content. So I agree with Nate, from a purely marketing perspective, that it wasn’t a good benchmark, but that wasn’t necessarily the point in my mind. We did a second cohort that got what amounted to a live 2-hour webinar recording (so still automated), which I think was a better test.

Raising The Price Between Cohorts

Luckily, Taylor’s mid-launch-week intervention showed us price was an issue during the June launch. Otherwise, we’d have no way of knowing.

After launching in May at $199, we decided to test $299 for the next round. I didn’t voice any objection. I should have.

Raising the price gave us the “too many variables” conundrum again, and it’s intuitively a bad move when you’re rolling out to a “colder” cohort. I’d have kept the price the same: then we’d have known from Day 1 how well the course would perform with a randomly-selected cohort.

Were I to do this over, I’d start with the highest price at which I could find product-market fit with my initial cohort, then see how conversions fared as we rolled it out to less-engaged members of Taylor’s audience.

Nate: I’ve got a few big takeaways. Things that, if we were doing this again knowing what we know now, I’d want to nail if we got everything else wrong.

  • Product-market fit matters. Launching to a cohort that’s interested and “warm” can be make-or-break.
  • Content matters: letting people feel what it’s like to use your solutions likewise made the difference between zero interest and a lot of interest.
  • Validate with as close to the final deliverable, at the final price point, as possible.

Case Study – How I Beat My Last Product Launch by 3x

I usually hate case studies for online product launches: “I made x amount of money from my launch and I did ABC things, so I’m just going to assume a certain thing I did – and not something else, coincidence, or luck – made the difference.”

There’s no “control group”, so you’re not 100% certain which thing you did resulted in success or failure, if anything did.

That’s why I’m happy I did a launch to my 80/20 Drummer list last week, because it was the fourth launch I’ve done in the last year, and it had the following in common with my last 2 launches:

  • Same product at the same price.
  • Same list, with same average-subscriber-age (but actually fewer new subscribers because I’d suspended a paid traffic campaign).
  • Same format (Jeff Walker style funnel), with same videos.

I only changed 3 things, and I’m not going to speculate as to which had the largest effect. What I can say is that last week’s launch brought in more than 3x the average of the previous 3. Two of the three changes were things that worked brilliantly for Taylor Pearson’s course launch in June (that case study is coming soon).

[Screenshot: launch revenue comparison]

(BTW, I know there are some 80/20 Drummer subscribers reading this article. If you’re here, welcome, and I hope you take these tips and launch your own successful online product!)

Here are the three things I did differently this time around…

“Live Broadcast” The Launch

In previous launches, I pre-wrote the emails. Man, did I think I was brilliant. I had the “one thing you probably didn’t know about [product]” email, the “one person’s story with [product]” email, and the “it’s now or never – time to decide” email. All of which added up to decent-but-not-stellar conversions.

The one tweak I made this time, that I borrowed from Taylor’s and my strategy in his launch?

Write the emails interactively, in real-time.

Each email I sent out to my cohort of around 6,000 inevitably garnered some questions and comments. I just took screenshots of some of the best and incorporated them into the next day’s email. For instance, a student asked me if he should finish a classic drum book before enrolling in my course. So I made the next day’s email all about that.

[Screenshot: launch email excerpt]

I still tried to hit the Jeff Walker/Frank Kern beats of telling my story, resonating with my audience’s pains/desires/obstacles, and supplying proof, but I drew my inspiration from the current batch of responses in my inbox instead of recycling old ideas or writing off-the-cuff.

While I don’t know for sure if this made the crucial difference, I suspect “live-tweeting” the launch gave it a feeling of group-participation, and reassured people that if they asked questions, they’d be answered. I also stole a second element from Taylor’s and my launch strategy that ensured I’d get more such questions…

Give Out Your Number

Whenever someone clicked on a link to the sales page, he/she would receive a reply: “still have questions about the course? Call me: here’s my number.”

[Screenshot: the “call me” email]

I only had 5-6 phone conversations last week, but almost every person I talked to told me they were impressed by my accessibility. It wasn’t necessarily about answering their questions about my course – my sales page does a decent job of that. It was about reassuring people I cared enough to get on the phone, and that I wasn’t just out to make a quick buck.

The “call me” response also allowed me to send twice as many emails during launch week without annoying people. When it’s a personal email that promises some of your time, people will lower their guard. (Because you’re lowering yours.)

Will this work for everyone? No – especially if your product price is too low, or your conversion rates are garbage. And keep in mind the vast majority of people won’t call you. But it’s a good heuristic: are you charging enough and making a good enough case for your product that it’s worth 15 minutes of your time to talk to someone? And for every one person who takes you up on your invitation to take some of your time, how many more buy just because they’re impressed you offered?
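If you want to sanity-check that heuristic for your own product, the arithmetic is simple. Here's a hypothetical Python sketch, with every number invented for illustration:

    def value_per_call(price, caller_close_rate, halo_buyers_per_call=0.0):
        """Rough expected revenue per sales call taken.

        halo_buyers_per_call: extra buyers per call who never phone in but buy
        because the offer impressed them (pure guesswork; measure your own).
        """
        return price * (caller_close_rate + halo_buyers_per_call)

    # Invented example: a $300 course, half of callers buy, plus one
    # "impressed" buyer for every two calls taken.
    print(value_per_call(300, caller_close_rate=0.5, halo_buyers_per_call=0.5))  # 300.0

If that per-call figure comfortably beats what 15 minutes of your time is worth, the tactic pays for itself.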

Promise to Raise The Price (Then Really Do It!)

In previous launches, I was vague about whether or when the price was going up. That undermined my scarcity. Sure, I was only opening the course for enrollment a few times a year. But unless the timing was perfect, there was no reason not to just wait till next time.

This launch, I didn’t just promise to raise the price. I told people exactly how much I was raising it, and exactly when the next opportunity to buy would be.

I know this was effective, because I talked about it on a few of my phone calls:

Them: I’m just wondering whether the timing is right, or I should wait till next time.

Me (if I thought it was a fit): That’s your decision. Just to let you know, though, the next time it’s opening will be November, and the price will be $50 higher.

Them: Ok – but what if I’m not quite ready to start?

Me: You definitely shouldn’t invest money in anything unless you’re sure. But just to let you know, the course is on-demand, so you could lock in the savings this week, then start next month. And if you end up deciding it’s not the right way to go, I’ve got a 90-day, no-questions-asked money-back policy.

[Image: Don’t be a fence-sitter]

This is why I love sales calls. You can learn about objections in real time. And my exchange with the above would-be student was a near-perfect encapsulation of sales-page principles:

Scarcity: There’s only so much time to enroll, or the opportunity goes away, and the next time the price will be higher.

Risk Reversal 1: If you’re worried the timing’s not perfect, remember – it’s on-demand and self-paced so you can start any time.

Risk Reversal 2: If you decide to enroll then change your mind, there’s a money-back guarantee.

As I’ve said, it’s impossible to tell in exactly what proportions promising to raise the price, “live tweeting” the launch, and publishing my number each contributed.

But in aggregate, they added up to big wins.

Do you have a tip or tactic that you’re sure was causative of, and not just correlated with, success? Leave a comment below!