When I caught up with Will Foley, Director of Revenue Operations at Splash, I was impressed and intrigued by his unique four-step approach to forecasting. It was fascinating to hear how forecasting can deliver value back to a business beyond what it was intended for, and over the course of a very interesting conversation I was fortunate enough to pick up some tips myself.
Rory Brown (RB): Thanks so much for being here, Will. To start, perhaps you could quickly give an introduction to yourself, what you do, and how you got into sales and rev ops in the first place.
Will Foley (WF): My name is Will Foley. I work at Splash, an event marketing platform. I have been in SaaS for the last seven years wearing many different hats, and the single thread through all of them is my love for operational excellence: adhering to, as well as designing, beautiful processes that net really great results. I've worn customer success hats, I've renewed, I've expanded, and I've carried quotas on the floor as an individual contributor. That evolution led me to take my operational skill-set and my client-facing experience to leading the revenue operations team at Splash. What I really enjoy is providing not only our external customers, but also my internal customers, the sales and CS teams, with an amazing experience.
RB: Am I right in hearing that you've carried quotas in the past? So you know what all these folk are going through?
WF: It is one of the more critical positions to be in. I use the word motivating because it will spark action, it will spark emotion, it will spark a lot of different things. What's most important is figuring out how to channel that towards constructive outcomes, because I think there's a belief that it takes a special person to be a quota-carrying individual. But that special person just needs the ability to control and manage those emotions; it isn't something you're born with. That's what I help people do.
RB: What I'm very interested in is your take on the world of forecasting. I believe you've got an interesting four-step process.
WF: Yes. I think what's most important is really establishing a foundation, and in the metric world that foundation is specific activity metrics. That's the foundation: if we can get the team to do the behaviours that we want, we know that we're putting them in the best position to be successful.
Then step two is figuring out what the right weightings are, and building not a level of confidence in the actual numbers, but confidence in the ability to think through how to weight your own pipeline. We really want to empower each individual rep to think about weighting their own outcomes: forecasting at an individual level.
The third is process adherence. This is, I think, one of the most important. We know when we're most successful: when you do these activities, you are going to win, or be in a position to win, more times than not. So we look very heavily into "are we taking the right steps, in the right order, to close the cycle?" If not, that's okay too, but that deal shouldn't be in your forecast.
Then the fourth is our ability to ladder those individual forecasts up. That's our ability to take deal calls from the individual contributors to the managers, to our CRO, to the CEO, to the business. What's most important is keeping in mind that if there is a discrepancy at any of those levels, which we know happens, the smaller the discrepancies are, the less they compound when everything gets rolled up to the top. We're really looking to standardise the way that we forecast across individuals and across managers, so that the outcome at the top is consistently forecasted deals, which we believe results in the most accurate forecast possible.
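The compounding point can be made concrete with a toy calculation. This is an editorial illustration, not Splash's actual model: if several layers each quietly shave a bit off the forecast before passing it up, the cuts multiply by the time the number reaches the top.

```python
# Toy illustration: small per-layer "shaves" compound as a forecast
# rolls up a reporting chain. Numbers are invented for illustration.

def rolled_up_forecast(bottom_up_total, shave_per_level, levels):
    """Apply the same fractional shave at each management layer."""
    forecast = bottom_up_total
    for _ in range(levels):
        forecast *= (1 - shave_per_level)
    return forecast

# Reps submit 1,000,000; four layers each shave off 10%.
print(round(rolled_up_forecast(1_000_000, 0.10, 4)))  # 656100
```

A 10% shave at four levels leaves under two-thirds of the bottom-up number, which is exactly why Will wants the individual-level discrepancies kept small and consistent.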
RB: So with point four, you're going against the bad habit companies have where every layer of the company submits a forecast, and at every layer it goes up they shave off an extra bit, until eventually it reaches the CEO at 65%. Then they shave off another 10% for their investors, and off we go?
WF: Yes, and I wouldn't call that forecasting. Because rev ops at Splash is responsible for both the sales and revenue retention forecasts, we recently did a huge exercise with the CS leaders to standardise how their teams forecast at an individual deal level. We are taking what we learned in those conversations and applying it on the sales side as well. It is a pick-your-own-path forecast model: a flowchart that walks you through the forecast categories to the outcome that we think best represents the likelihood of that deal.
The flowchart helps every individual contributor who is responsible for forecasting their own category to find the weighting they should apply to an individual deal, client, or prospect. What we have done is create a standardised way to forecast. If rep A is forecasting their book of business and rep B is forecasting theirs, and they switched books the next week, the outcomes should be the same, because they're all using the same pick-your-own-path forecasting model.
RB: That's really clever. Going right back to the beginning, where you talked about managing emotions: your claim is that if you switch the pipelines around and get a rep to objectively forecast someone else's pipeline, taking the emotion out of it, they should come up with a very similar forecast to the person whose pipeline it was originally?
WF: 100%. That is exactly what we are driving for with this type of process.
RB: Absolutely love that. Could you give two or three examples of what’s in that flowchart?
WF: Yes. Very generically, we look at the activities in our tool. If they're highly active, or maybe moderately active, you go to the next node of the flowchart. That node could be use case: are they using the platform in a sophisticated, moderate, or unsophisticated way? You then go on to the next node. As you answer these types of questions, you get to the same answer every time, because we have created an objective evaluation framework that should, the majority of the time, lead two people to forecast the same way on a single deal or account.
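As a rough sketch of how a pick-your-own-path flowchart could be expressed, here is a tiny decision function. The node names, category labels, and weightings below are all invented for illustration; the point is only that the same objective answers always produce the same weighting, whoever is forecasting.

```python
# Hypothetical sketch of a two-node forecast flowchart. All labels and
# weightings are invented; they are not Splash's real framework.

def deal_weighting(activity, use_case):
    """Walk a tiny flowchart to a forecast weighting.

    activity: 'high' | 'moderate' | 'low'
    use_case: 'sophisticated' | 'moderate' | 'basic'
    """
    if activity == "high":
        return 0.8 if use_case == "sophisticated" else 0.6
    if activity == "moderate":
        return 0.5 if use_case != "basic" else 0.3
    return 0.1  # low activity: arguably shouldn't be in the forecast

# Two reps evaluating the same account land on the same number.
print(deal_weighting("moderate", "sophisticated"))  # 0.5
```

Because the function is deterministic, swapping rep A's book with rep B's cannot change the output, which is the consistency property Will describes.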
RB: And what you end up with is a weighting on the deal, a probability, is that right? Whereas standard good practice is a seven-point tick list for what a commit is, where we know a deal that ticks every box is a commit at, let's say, 96%, you're expanding that idea quite a bit. It's more complex, but you're enabling people to put a probability on a deal rather than just saying, "Yes, it's coming in"?
WF: Yes, and there are pros and cons to each. A tick-list process is much simpler than following a flowchart, and there are times in the marketplace when that type of forecasting methodology works, versus the times we live in today, which require a little more sophistication to guide us through some of the most uncertain conditions of the modern age. So we have taken the approach of, "Hey, ultimately we want to make it as simple as possible, but right now things aren't simple." We are investing more in the flowchart path to your weighting versus the tick-box methodology. In more consistent and certain times, we will start paring that back to exactly what you just said: a checklist rather than a flowchart.
RB: You obviously set out on this path to explore an alternative to the standard you just described. What were you aiming to achieve? I'm guessing visibility is one key thing, but perhaps you could share the why behind doing it.
WF: The biggest why, and you've nailed one of them, is visibility. But I think what's most important is consistency, and our belief that following a single process consistently delivers a better experience for the individuals involved. The clearer the instructions we can provide, the clearer the process is for each person. Rep A and rep B can now collaborate and work together because the process is consistent across the team and across offices. It's an evolution, and step one is consistency.
Step two is refining and improving our forecast. In the uncertain times we live in today, there's no historical data to suggest that the weightings we apply to risk categories, or to the likelihood of a deal closing or a client renewing, are actually true. So instead, what we do to build confidence is build a narrative and a story behind each of the deals we look at.
When we created these weightings, since we had no historical data, we started applying the weights to a cohort of prospects and clients. We said, "Hey, this client and this client both have the same weighting. When we talk about these clients with these parameters, and we say the weighting for each of them is 50%, which means we think one out of two of these deals is going to close, is that true? Does that feel right? Does that make sense?" Going through that process builds confidence in our weightings during these uncertain times, and allows us to fly the ship, maybe not as accurately as we want, but by no means blindly.
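That cohort sanity check can be sketched in a few lines. The outcomes and tolerance below are made up for illustration; the idea is simply comparing a cohort's realised close rate to the weighting it was assigned.

```python
# Hypothetical back-test of a weighting: a cohort weighted at 50%
# should close at roughly 50%. Data and tolerance are invented.

def weighting_holds(outcomes, weighting, tolerance=0.15):
    """True if the cohort's realised close rate is near its weighting."""
    close_rate = sum(outcomes) / len(outcomes)
    return abs(close_rate - weighting) <= tolerance

cohort = [1, 0, 1, 0, 1, 0, 0, 1]  # 1 = closed, 0 = lost
print(weighting_holds(cohort, 0.5))  # True: 4 of 8 closed
```

When the check fails, that is the signal Will describes to revisit either the weighting itself or the variables used to evaluate the deal.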
RB: Well, I think a lot of people would love that when it comes to forecasting. It sounds like you're turning forecasting on its head, and it's delivering something back to the business beyond what it was intended for, which is to tell people what's going to happen. I'm guessing that through this process of reps studying exactly why a deal has a given probability, they're exploring the journey, health, and scenario around that deal in the first place, which is good for coaching, I presume.
WF: You're nailing what I think of as phase two into phase three. Phase two is refining: we're learning, and we think we feel really good. As deals close and clients renew, we look at our narratives and ask, "Did they stand true? Did one out of two of those deals close?" If not, then we need to, one, look at the weighting, and two, look at the variables we're using to evaluate a deal. Once we refine, we move into phase three, which is positively influencing our forecast: when we pick up on a risk category, or a discount to the likelihood of a deal closing or a client renewing, we turn it into an opportunity. If, in a hypothetical situation, we know that having a third demo is really important for a deal, and that when we get our clients to have that third demo we are more likely to win, we can ask, "Okay, what is the playbook to get that third demo on the books and secure it?" We start increasing the likelihood of more deals closing because we know that behaviour has a positive influence on the business.
RB: Taking you back to the weightings, I've not come across a weighting structure like that before. Normally, people do it the mathematical way, which is, "Let's look at the values and industries and apply weightings based on stages," not this way, which is much more intimate to your business, isn't it?
WF: I think there is a place for weighting stages, but right now we have no historical data to say that this stage produces this outcome. This is where we are trying to rethink how we tackle these challenges, because there's so much uncertainty. At the end of the day, we are using both to triangulate accuracy and see which method proves more successful. We are definitely looking at stage progression, we are definitely looking at this behavioural weighting, and we are also looking at a predictive score. We know we don't have a lot of historical data to suggest one is right over the others, so we use all three to pick up on where we should be spending our time. Over time they should converge, but at this point the variability is so high that what we actually look for is the biggest gaps between those forecasts. That is an inspection point. In a time of limited resources, limited time, and high uncertainty, it's a way to determine where I should prioritise my time, because that's where interesting things can be uncovered.
RB: So what you're saying is that if we start from day one exploring, let's say, three different ways of forecasting, your flowchart weighting, your mathematical weighting, and maybe some sort of predictive model, at the start they're going to present gaps between what they're saying, which gives you areas to explore. Eventually, you're hoping you'll be sitting there with three forecasting systems that are always fairly accurate and always tell you the same story, so you can be super, super confident of the story you're telling the business?
WF: Exactly. Right now that requires additional overhead, but the most important metric to keep reviewing on an almost hourly basis is our ability to forecast our revenue, which reflects how intimately we know our business. That's our number one job.
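The "biggest gap as inspection point" idea can be sketched in a few lines. The account names and numbers below are invented; the mechanism is simply ranking accounts by the spread between the three forecasting methods, so attention goes where they disagree most.

```python
# Sketch of the triangulation idea: three independent forecasts per
# account, and the widest disagreements become inspection points.
# All names and numbers are invented for illustration.

forecasts = {
    # account: (flowchart_weighting, stage_based, predictive_score)
    "Acme": (0.8, 0.75, 0.7),
    "Globex": (0.9, 0.4, 0.6),
    "Initech": (0.3, 0.35, 0.25),
}

def biggest_gaps(forecasts):
    """Rank accounts by the spread between their forecasts, widest first."""
    spread = {name: max(vals) - min(vals) for name, vals in forecasts.items()}
    return sorted(spread, key=spread.get, reverse=True)

print(biggest_gaps(forecasts)[0])  # Globex: spend inspection time here
```

Here the three methods roughly agree on Acme and Initech, so Globex, where they diverge by 0.5, is where a manager's limited time is best spent.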
RB: I want to come back to point three of the four key ingredients, which was process adherence. This might tie into the flowchart part of things. You started to open up on this and I was really interested. Maybe you could explain a little of what it means?
WF: If I had to use some more familiar terms, it would be, "What is your selling strategy? What is your process?" Examples out there are MEDDPICC and value selling. What we're trying to do with the process score at Splash is build another layer of confidence in the playbooks we have put in place. We are recommending, and ultimately saying to our team, "If you follow them, you will be more successful at closing deals." It becomes a commitment from the individual contributors, and from the management who ultimately make those decisions with the team's feedback, that these are the processes and playbooks that give the best likelihood of closing a deal.
The commitment is that every rep will follow the process, and the management team commits to the sales team, the individual contributors, that we will keep the process as optimised as possible. As we know, things change all the time, faster than ever right now, and we need to make sure the process stays optimised for the current market. In order to optimise the process, we need to adhere to the process; if we don't adhere to it, we can't optimise it. Otherwise you're in probably one of the toughest positions anybody can be in, which is not knowing what behaviours are driving deals to close. That is a very scary position, because then forecasting is hard and everything else becomes that much harder: you don't know which behaviours are producing the forecasts being presented. That introduces a high level of uncertainty, which is a very nerve-wracking position to be in.
RB: So what you're saying is that there's almost a psychological and emotional commitment to sticking to what the business provides, so that the business can then see what's working and improve. Everyone wins. How does that work in terms of pushing it out to individuals? Every team has its 'lone wolves', people who think, "I'll get there my own way." How does it work in that scenario?
WF: I do think it ultimately comes down to belief and commitment. The way we build that belief and commitment is that we feverishly document all of our SLAs, so that everything we say is the process you should be adhering to is clearly outlined to the individuals, with a clear understanding of whether that process is being successful.
Commonly, you'll see an SLA between marketing and sales. The marketing team will provide this quality of MQLs at this volume, and the sales team can take that and say, "Great, our commitment is to deliver this amount of business from this amount of leads, and those that don't convert to bookings will be recycled. Our commitment is to provide that feedback." When you apply that to almost every stage of the business, you start developing belief in the processes, and visibility into whether those processes are succeeding or not.
RB: The score itself, can you tell me a little bit about how you produce that?
WF: When I think about the adherence score, it's less about the actual score and more about the consistency you can drive. The simpler it is, the easier it is to understand. An example would be a 1, 2, 3 or 0, 1, 2, 3 weighting on each of the process steps, "Did you hit this? Did you hit that?", which you can then aggregate across the revenue funnel to see where you have challenges and where you don't. The takeaway is not that 0, 1, 2, 3 is the right scale. What's most important is that everybody weights each stage consistently, so that when you aggregate the information, you can see the ebbs and flows of process adherence with whatever metrics you are using.
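A minimal sketch of such an adherence score, assuming a few invented process steps, might look like this. As Will says, the scale matters less than applying it consistently to every deal.

```python
# Minimal sketch of a 0-3 process adherence score. The step names are
# invented for illustration, not Splash's actual playbook.

STEPS = ["discovery_done", "champion_identified", "third_demo_booked"]

def adherence_score(deal):
    """Sum the 0-3 ratings across the process steps for one deal."""
    return sum(deal.get(step, 0) for step in STEPS)

def funnel_adherence(deals):
    """Average adherence across all deals, normalised to the 0-1 range."""
    max_score = 3 * len(STEPS)
    return sum(adherence_score(d) for d in deals) / (max_score * len(deals))

deals = [
    {"discovery_done": 3, "champion_identified": 2, "third_demo_booked": 0},
    {"discovery_done": 3, "champion_identified": 3, "third_demo_booked": 3},
]
print(funnel_adherence(deals))  # (5 + 9) / 18, roughly 0.78
```

Because every deal is scored on the same scale, the aggregate can be tracked over time to spot the ebbs and flows in adherence that Will describes.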
RB: Anything final that you think you would like to share?
WF: When I think about the challenges surrounding the integrity of the forecast, I am biased towards complex algorithms, which really interest me from a mathematical standpoint. But over time I have realised that good data comes from good process, and good process requires good process adherence.
When I think about coming into a situation where my forecast is inaccurate, or I don't have processes defined and need to start implementing some of these things to improve my forecast, I go back to the most basic level of that formula, which is point number one: can I set really simple activity metrics to start driving adherence to a process? Once I graduate from that, I can go to the next stage: can I get reps to start forecasting their ability to hit those metrics? Then move up the ladder again and ask, "How accurate am I against that forecast? What are the reasons I'm accurate or inaccurate?" The more I can teach my individual contributors that basic forecasting ability, the greater the likelihood that I can produce a company-wide accurate forecast.
RB: This is quite an interesting topic, because this is where I see a lot of problems. Many companies immediately think, "We need to be smart. Let's do something really cool, and smart, and complex, with metrics," but they've never actually put the basics and foundations in place. Let's say your sales process starts at X and finishes at Y. Between X and Y you could choose five points or you could choose 100. People normally go straight to 100, trying to capture them all, and it falls apart and they end up with nothing.
You've obviously had to rein yourself back, because you've just told me you're naturally the person who would go towards the other end. How did you do that? I think a lot of people struggle mentally with that. It's almost like they feel they're not capitalising on their own intelligence by keeping things simple.
WF: It is rooted in the fact that, in my view, process and data are one and the same; you can't have one without the other. And when you look at what drives good data, it's people. At the end of that stream, at the end of that click on a computer, wherever that data comes from, there's a person. When I think about the power of habits, I think of some key things I learned from The Power of Habit, a fantastic book that helped me learn more about myself: in order to build a habit, it takes about four to six weeks to even start to click. Therein lies the challenge I'm trying to solve in step one of creating a forecast. Can I create habits across the revenue team that provide consistent behaviours and result in a consistent forecast?
If you explore that book, one of the stories is about an individual who wanted to start working out to build a healthier habit. The way she coached herself through it was with micro-steps, breaking it down to its simplest form. For example, in order to work out in the morning, I need to wake up. So the first habit was setting an alarm clock, the second was turning the alarm off when it rings, the third was sitting up in bed, the fourth was taking your feet off the bed, and the fifth was standing up.
If you translate that to a sales process, you talk about emails and phone calls. Those are the basic value-adding activities that then turn into meetings. Once you establish those as behaviours, the mind goes onto cruise control a bit and can start tackling bigger and harder challenges. When you go from 0 to 100, you're just overloading people, and that's not a fun position to be in. You have to lead them to that 100 state, and it starts by building the habits and activities that get you there.
RB: What you're basically saying is you give bite-sized chunks, because ultimately you're building all this for individual contributors to use and navigate. You give them a bite-sized chunk and a grace period to digest it, prove they've digested it because you can see retrospectively, "Okay, this is being adhered to," then give them something new, and keep going like that. Otherwise, you overload them and nothing happens. It sounds very simple but very good. The best ideas always are, right?
WF: That's exactly the point I'm at: start simple, because those are the greatest ideas.