Watch Kevin Charette chat with Matthew Gomes, Senior Director at OppLoans, about how the company has been continuously optimizing website conversions. If you prefer reading, we’ve got the summary below.
Matt Gomes: I have been with OppLoans for a little over four years at this point. I initially joined the marketing team and built out the direct mail pre-screen program from the ground up. I have owned the forecasting function for the firm for the last four years, and for the past year and a half to two years I have been focused on new product development.
To dive a little into what OppLoans does: we are a Chicago-based fintech platform that is hyper-focused on increasing credit access for middle-income, credit-challenged consumers. Think folks who have a sub-600 FICO score and an emergency expense like a car repair, a medical bill on a high-deductible plan, or something like that.
50% of Americans today don’t have 500 dollars in their savings account, and they don’t have many good options to meet that need. That is the role we play; we like to think of it as a rescue, rehab, graduate program, helping customers gradually move up the FICO spectrum.
At this point, I would say that our website is the lifeblood of the business for us. 100% of our applications are sourced online.
Yes, we use direct mail, like most personal lenders and banks. It is an incredibly powerful tool, but all applications are still taken online, and we don’t take any over the phone.
When the company was originally founded, the split was about 70% in favor of physical locations, and we made the pivot to 100% digital about five years ago. There is a whole host of reasons why we did it.
In the broader context, definitely in the fintech environment, the web is the primary acquisition channel regardless of where you are in the fintech landscape. More broadly, I think digital is becoming more important for traditional financial institutions as well.
80-90% of my banking is either on my cellphone or my laptop. I think the shift to digital is only going to get stronger.
When I joined around mid-2016, the firm had already made that decision in late 2015. Jared Kaplan, our CEO, led that charge. We wanted to expand our geographical footprint, and the cost of doing that with physical locations is astronomical compared to going the digital route.
When I joined, we had a very janky setup with Google Analytics that we had contracted someone else to set up, the tags and all the tracking. Pretty quickly we figured out that the data was just outdated. We could not update the tags ourselves. What we thought were applications were not applications; what we thought were high-quality applications were not high-quality. It was a bit of a disaster.
We immediately started thinking about how to figure out what is happening on the website. The first investment that I asked for after joining was Heap.
What attracted me to Heap is that we dropped in the snippet and immediately started tracking data. We use Heap to track 100% of what happens on the front end and a decent amount of what happens on the server-side.
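For readers who want a concrete picture of what that kind of setup looks like, here is a minimal sketch in TypeScript. The loader snippet itself comes from Heap’s own docs and is pasted into the page; the event name, properties, and helper function below are purely illustrative, not OppLoans’ actual instrumentation.

```typescript
// Illustrative only. Once the vendor's loader snippet is on the page,
// autocapture records clicks, pageviews, and form interactions without
// any extra code; explicit calls are just for events you want to name.
declare const heap: {
  track: (event: string, properties?: Record<string, string | number>) => void;
  identify: (identity: string) => void;
};

// Hypothetical named event for a milestone in the application flow.
function trackApplicationStep(step: number, device: string): void {
  heap.track("Application Step Completed", { step, device });
}

trackApplicationStep(2, "mobile");
```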
We have taken a phased approach there. Going back to 2016 and early 2017, we had bifurcated the analyses we were doing. So, yes, we had Heap sitting on the front end, and we could quantify our application conversion rate and knew where it was trending.
We identified a few pain points in the field so we could test against them and hopefully lift the application completion rate.
The bad answer for showing anything to the executive team is usually wrapping it up in PowerPoint. It depends on the complexity of the test we are doing, so we start at the highest level and move down. One of the nice things we have been able to do is let non-technical users on the marketing, product, or design teams dig into the data themselves.
Taking it one step further, we want to carry that analysis all the way through, from step one, which is the application, to how applicants perform at the first chokepoint, which is business rules and initial underwriting.
I led into this a little bit already; I have continued harping on our application flow. We are 100% digital, so the lifeblood of the business is the application.
One of the first things we did in setting up key events was defining each step of the application so that we could measure conversion from step one to step two, and so on.
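As a rough illustration of that step-to-step measurement, the math is simply each step’s completions divided by the previous step’s. The step names and counts below are made up, not OppLoans’ numbers:

```typescript
// Hypothetical funnel numbers; the point is the step-over-step ratio.
interface FunnelStep {
  name: string;
  users: number;
}

function reportConversions(steps: FunnelStep[]): void {
  for (let i = 1; i < steps.length; i++) {
    const rate = (steps[i].users / steps[i - 1].users) * 100;
    console.log(`${steps[i - 1].name} -> ${steps[i].name}: ${rate.toFixed(1)}%`);
  }
  const overall = (steps[steps.length - 1].users / steps[0].users) * 100;
  console.log(`Overall application completion: ${overall.toFixed(1)}%`);
}

reportConversions([
  { name: "Step 1: Started", users: 10000 },
  { name: "Step 2: Personal details", users: 7200 },
  { name: "Step 3: Submitted", users: 4100 },
]);
```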
What we started to see in 2017 was a slow, gradual decline in people completing the application. Diving deeper, we were seeing flat performance on desktop devices, while on the mobile side it was a pretty substantial hit month over month. The reason it was manifesting in the overall numbers was that, over time, traffic was shifting more and more to mobile.
At this point, we are at about 90% mobile traffic at OppLoans.
We thought it was annoying to have so many steps to fill out the application, so why not take it from five steps to three? Sure enough, we saw more than a 10% lift in app completion just by making that change.
The other biggest lift we have ever driven in testing also came from redesigning the application, which we did quite recently. It is the opposite of the approach we took back in 2017.
We thought, now that 90% of our traffic is mobile, what if we made the application look like an app? So we went completely in the other direction, and now it is, I believe, a 16-step application flow. It is very graphic-heavy, there is one question per page, and it is very easy to pick an answer because we eliminated the dropdowns. Interestingly, it only drove about a 2% lift in app conversion.
It depends on how mature your company is. If you find yourself in the same shoes as I was in back in 2016, the key to improving customer experience first and foremost is to make sure things are not breaking all the time.
When we first started setting up key definitions, that was the primary focus, and we were constantly monitoring conversions. If they suddenly nosedived, something was probably broken.
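A simple version of that kind of monitoring might look like the sketch below. The 30% drop threshold is an assumption for illustration, not a number from the interview.

```typescript
// Compare the current conversion rate against a trailing baseline and flag a
// sharp relative drop as likely breakage rather than normal fluctuation.
function conversionNosedive(
  currentRate: number,
  baselineRate: number,
  maxRelativeDrop = 0.3 // assumed threshold, tune to your own variance
): boolean {
  if (baselineRate <= 0) return false; // no baseline yet, nothing to compare
  const relativeDrop = (baselineRate - currentRate) / baselineRate;
  return relativeDrop > maxRelativeDrop;
}

// Example: baseline of 40% step conversion, today only 22% -> investigate.
if (conversionNosedive(0.22, 0.4)) {
  console.warn("Conversion nosedived vs. baseline; check whether a step is broken.");
}
```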
If you are an established company, the key is to have a complete view of what your customers are experiencing.
I think what worked well is that we have yet to run a test where we did not know what the outcome was.
A pain point that I felt earlier in my career was having to repeat the same tests because we did not have the tracking set up appropriately.
One thing we could certainly improve on is making our application more responsive. By that, I mean tailoring it to where the customer came from.
It depends on the marketing channel whether it is our primary source of truth. Anything digital-to-digital is a key evaluation point for us. So Google, SEM, Facebook Ads, and any other display networks we are using are the key data points we use to make campaign decisions.
We are a 100% online business, so I don’t think you can operate without a solution like Heap. You could always try and build one internally, but the investment is astronomical to do it internally.
Like any tool, you are not going to capture an absolute 100% of the data. I have yet to see anything that does. The only way you could is if you built it in-house and it lived natively in your site rather than sitting on top. We see less than a 5% gap over any period.
I think where we have had problems, and this is more of an internal bandwidth issue, is in standardizing the approach to creating events. But from an actual data collection standpoint, we have not had issues.
We have multiple analytics teams at OppLoans. Marketing analytics lives under marketing, and our role is to be 100% focused on improving the efficiency of the capital we put out in the marketplace. It’s through optimizing the credit files that we are using with various partners and things like that.
We also have a general business intelligence team, and they support the non-acquisition non-marketing focused functions across the rest of the firm. So that is everything from summarizing the data to supporting legal.
Like just about everyone, we saw about a 75% hit to overall demand and traffic levels in March and April, right when nobody knew what was going on and lockdowns were much stricter. We have since seen a bounce back in customer demand and broader economic activity.
We periodically review what we consider kind of the minimum requirements to be able to measure a test, and it has changed as the organization evolved.
At this point, anything we are testing will go to a maximum of 25% of traffic for the first 24 hours. Assuming nothing is broken, we will expand it to 100% of traffic at a 50-50 split, and it will run until we hit a minimum of 1,000 observations in each variation.
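That ramp-up rule is easy to express in code. The sketch below mirrors the numbers Matt mentions (25% of traffic for the first 24 hours, then a 50-50 split until 1,000 observations per variation); the function name and state shape are illustrative, not OppLoans’ implementation.

```typescript
interface TestState {
  hoursLive: number;        // time since the test went live
  somethingBroken: boolean; // result of the initial safety check
  observations: { control: number; variant: number };
}

// Returns how much traffic the new variation should get and whether the
// test has collected enough data to be read out.
function allocateTraffic(state: TestState): { variantShare: number; readyToReadOut: boolean } {
  if (state.hoursLive < 24) {
    return { variantShare: 0.25, readyToReadOut: false }; // initial 25% safety ramp
  }
  if (state.somethingBroken) {
    return { variantShare: 0, readyToReadOut: false };    // pull the variation
  }
  const enoughData =
    state.observations.control >= 1000 && state.observations.variant >= 1000;
  return { variantShare: 0.5, readyToReadOut: enoughData }; // 50-50 until 1,000 per arm
}
```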
It is certainly exciting that people are continuing to want to get more into testing and figuring out what the data is telling them. It is certainly encouraging that people are interested in it.