Google Analytics, with its tremendous ability to set up a visual sales funnel according to predetermined goals and very granular data segmentation, has earned its reputation as a powerful but often underutilized tool. NSI gets to help organizations of all sizes fix this, but we rarely get such a picture-perfect view of the ball whooshing right over the goal line, smack into the net, as we did this morning.
Our client wanted its website to decrease costly inbound sales calls by converting many more prospects more cheaply online. Ecommerce pages were set up, and, of course, Google Analytics was in the starting lineup. After a substantial effort, it was time to view the data and prepare a slide for the Board Meeting. But insights from the data weren’t immediately forthcoming.
First, the process of implementing virtual URLs was still in progress, so products weren’t yet fully distinguishable from one another. Second, there were many sales paths, so the funnel was complex, with customers entering, leaving, and reentering at many different points.
Further, the funnel, which would be expected to narrow progressively from accountholders to applicants to customers, suddenly bulged in the ecommerce phase, appearing to convert more people than had originally entered. Most disconcerting, a key product showed only a 12% conversion rate online, much lower than the same product sold by phone.
It all seemed to add up to what might be the most memorable, career-changing slide of the Board Meeting. Had the entire effort really been a disaster?
Not at all. Even without virtual URLs, we were still able to reduce the data noise. Step 1 was to take a closer look at the dates with the greatest numbers of abandoned carts. They clustered into two sizable time periods. After a short exchange with the client, we knew to revise the reporting period to exclude these: both were times when the ecommerce system wasn’t fully functional, prompting customers to abandon their carts and telephone instead. When we leveled the playing field between online and telephone, that one product’s conversion rate inched up to 17%. Still bad, but less so.
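The adjustment in Step 1 amounts to filtering sessions that fall inside known outage windows before recomputing the conversion rate. A minimal sketch, in which every date, session record, and outage window is invented purely for illustration (the client's actual data stayed in Google Analytics):

```python
from datetime import date

# Hypothetical session records: (session date, converted?) -- invented values.
sessions = [
    (date(2011, 3, 1), True),
    (date(2011, 3, 2), False),
    (date(2011, 3, 10), False),  # falls inside an assumed outage window
    (date(2011, 3, 11), False),  # falls inside an assumed outage window
    (date(2011, 3, 20), True),
]

# Periods when the ecommerce system wasn't fully functional (assumed dates).
outages = [(date(2011, 3, 9), date(2011, 3, 12))]

def in_outage(d):
    """True if the date falls inside any known outage window."""
    return any(start <= d <= end for start, end in outages)

# Exclude outage-period sessions, then recompute the conversion rate.
clean = [(d, converted) for d, converted in sessions if not in_outage(d)]
rate = sum(converted for _, converted in clean) / len(clean)
```

In Google Analytics itself this is just a change to the reporting date range; the sketch only shows why the rate moves when noisy periods are excluded.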
Another observation came to light: while an abnormally high number of prospects exited the online funnel, most of these exits occurred at the Review and Checkout steps. In fact, defections at every other point were below 5%. This indicated that the system worked fine; the culprit may have been sticker shock. Sure enough, the client used a hidden-pricing strategy to prevent prospects from incorrectly self-qualifying out. This tactic makes sense when the product’s value proposition is not easily assessed, but it can also frustrate comparison shoppers. Indeed, the call center confirmed that some of their sales were from prospects who called in to register surprise over cost.
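The exit analysis above can be sketched as a per-step defection rate: for each funnel step, compare how many visitors entered it with how many advanced to the next step. The step names and visitor counts below are made up for illustration, chosen only so the pattern matches the one described (small leaks everywhere except Review and Checkout):

```python
# Hypothetical visitor counts entering each funnel step (invented numbers).
funnel = [
    ("Product",  1000),
    ("Cart",      960),
    ("Review",    930),
    ("Checkout",  560),
    ("Confirm",   340),
]

# Fraction of visitors who exited at each step rather than advancing.
exit_rates = {}
for (step, entered), (_, advanced) in zip(funnel, funnel[1:]):
    exit_rates[step] = (entered - advanced) / entered
```

A table like this makes the diagnosis visual: if every step leaks a few percent, the mechanics are fine and the spikes point at a content problem (here, pricing revealed late).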
A real-time call with a salesperson who could reiterate the value proposition and offer installment plans, alternative forms of payment and discounts (not yet added to the ecommerce pages) often overcame the remaining objections. Still, that did not explain the entire conversion gap: the sales team was good, but why would sticker shock affect online sales so much more than telephone sales?
Step 2 was to take a closer look at this by examining the telephone-based sales process. When we downloaded the application form that prompted inbound calls, we discovered that, while the website was silent on price, the form listed it plainly! In other words, while almost all inbound callers were fully aware of the cost, online prospects included both those who considered the cost affordable and those who did not. Google Analytics revealed a significant flaw in the client’s sales strategy that was easily corrected: hidden pricing, if utilized, needed to cover both channels. Suddenly the conversion gap seemed easily explained.
Step 3 was to test a few of the entry URLs that had suddenly swelled the number of visitors to the shopping cart. They came from a mysterious page labeled “/billing.asp” which, on investigation, turned out to be the page where existing customers are billed for renewals, at which point they enter the same shopping cart.
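The referrer check in Step 3 boils down to tallying where cart entries come from and eyeballing the top sources for anything unexpected. A minimal sketch using Python's standard-library `Counter`; the list of referring pages is invented for illustration (only “/billing.asp” comes from the story above):

```python
from collections import Counter

# Hypothetical referring pages recorded for shopping-cart entries
# (invented sample; "/billing.asp" is the page from the investigation).
referrers = [
    "/product.asp", "/product.asp", "/billing.asp",
    "/billing.asp", "/billing.asp", "/promo.asp",
]

counts = Counter(referrers)
# most_common() ranks sources by volume, surfacing unexpected entry
# points -- like renewal billing traffic flowing into the sales funnel.
top = counts.most_common()
```

In Google Analytics the same question is answered by the referral-path dimension on the cart page; the point is simply to rank entry sources and question any that don't belong in a new-sales funnel.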
Now the online process made complete sense. We had cleared out data noise by simply adjusting the reporting parameters. By putting ourselves in the customers’ shoes, we identified a substantial incongruity in the information provided to online and telephone prospects. And, finally, by testing referring URLs, we understood where traffic truly originated. Our entire investigation took less than an hour, but an hour of data analysis can be worth weeks of data collection. Shoot and score!