Shelf Life Podcast Episode 005 Post

EP05: Using Data to Drive Sales, Supply Chain and Product Decisions with USAopoly's Eric Richardson

An in-depth discussion on making data-driven sales, supply chain and product decisions. As a data expert, Eric shares how consumer brands can harness data without extensive resources, how to bring and present data to retail buyers to influence replenishment, and how to recognize and capitalize on sudden surges in consumer demand.

Transcript

Logan Ensign: 3:18

Eric Richardson, welcome to Shelf Life. To get us kicked off, we'd love to just learn a little bit more about you and your background, your career journey, what got you where you are today.

Eric Richardson: 3:30

Yeah, thank you. I have an interesting background that spans operations, consulting, banking and now games. As you mentioned, I did operations for Macy's and optimized their supply chain; I kind of cut my teeth on that. Then I went back to school, got an MBA from the University of San Francisco and did a stint in banking and capital stress testing, heavily using data to understand capital requirements. I helped out some startups in the Bay Area building algorithms and understanding how to forecast last-mile deliveries. Then randomly I got a call from a recruiter and he said, would you like to join this gaming company? He said they had surfboards, bikes and it was on the beach. I said sign me up, I'll do it. Now I'm the manager of forecasting and data analytics for USAopoly, based in Southern California. I work with data every day, from customer data to supply chain data to financial data. In a weird, full-circle way, it really encompasses everything I did before in one job. I really enjoy doing it.

Joel Beal: 4:49

Are there surfboards and bikes, Eric?

Eric Richardson: 4:52

Surprisingly there were. They were in the warehouse area downstairs. People didn't really ever use them. Some of the bike chains got rusty, and the surfboards, I didn't really trust them. There are also sharks right off the coast there and I'm kind of scared of sharks. I watch people surf a lot there.

Logan Ensign: 5:11

Well, I think in that description you used the word data, by my count, six or seven times. Eric, I know you personally as well, and we know that you very much know your way around data. One topic we'd love to dive into is what happens when your buyer doesn't. As you think about that dynamic, what pitfalls do you see people falling into, whether it's buyers or suppliers, when it comes to making data-driven decisions?

Eric Richardson: 5:39

Yeah, that's a great question. Thank you for having an accurate count on my use of the word data. The way that I approach this and the way that I look at it is that there's a human element to all purchasing decisions. When you are recommending things to a company that you're trying to sell something to, you're trying to convince someone else that it's a good idea to spend their money on your product. What I've found is that sometimes, no matter how much good data you show someone about how they can become more efficient, for reasons unknown they just don't want to purchase. What I focus on is the idea that it's kind of like pushing on a string. You can't really push on a string, it kind of bends, but you can pull on a string. So what you have to do is put all the data in front of someone in a really easy-to-understand, digestible way, so that anyone looking at it, from the top to the bottom, can see very clearly: I will make money from this, I won't make money from this, how quickly will I make my money back? And that's really what I focus on. I pull tons of information, so much it would make your head spin, but a lot of what I do is making it really simple. Maybe sometimes it's a million-row spreadsheet and I'll bubble it down to like four cells for someone to look at. It just captures, at a high level: here's when you're going to make the money off of this, here's how long it's going to be tied up, here's the benefit to you, your return.

Joel Beal: 7:15

Eric, do you find that, you know, you say, hey, I'm going to have millions of rows of data, right? There's a lot of data out there in the world today, and you want to bubble that up to a couple key things, right, simplify it. Do you find that the things you're focusing on depend a lot on who you're talking to, or is it the same general points being made over and over again?

Eric Richardson: 7:39

There are commonalities, things that are always the same, that people want to see, but a lot of it is knowing your audience. One thing I learned in school was: sell to your audience. You have to understand your product, but you also have to understand the person you're trying to sell it to even better. So when I'm presenting information to someone, I'm essentially selling something to them, but I'm just giving them data, trying to make them believe in what I see.

Joel Beal: 8:08

So, yes, it has to be tailored and simplified. Data tells stories and, as you said, you know, the data is the data, assuming it's correct; it's factual. But the data that's presented, the way you present it, can tell quite different stories, right? You see this across academia, science, etc. How often do you find you're able to change someone's opinion? Because often data confirms things we already think we know, hunches we have. It can be a little harder when you're like, wow, the data is very different than what I expected. I'm curious if you have examples of that, and whether you find it happens frequently or infrequently. Does that depend on whether you're talking to somebody internally, or you're trying to sell to a buyer at, you know, a retailer, etc.?

Eric Richardson: 8:49

You know, I wish I could tell you people change their mind more than they do, but that is one of the tough things. You can throw out the most accurate information to someone and they may not receive it as accurate, or they might have other sources telling them other things. Some of the wins I've had, you know, involve certain big companies' algorithms that they've spent millions of dollars to develop, where the buyer at this big company is really relying on that algorithm to tell them what to order, when to order, how much, down to the time they need to pick it up and ship it to get it into their warehouses. There's an example of when, by graphing out and visualizing how their forecasting algorithm was forecasting one of my products, I noticed something funny. One of my coworkers and I kind of dug into it and we said, oh my gosh, this is incorrect because of an anomaly that happened last year. How do we get this message across? How do we present this and let them know that it's wrong? It came down to simplifying again, and then not talking about why the data was incorrect, but talking about the main causes as to why the algorithm was misreading the historical data. So we took that information, went to the parties that needed to hear it and presented it to them in a simple way: hey, inventory was out of stock last year. Your algorithm didn't read that, because it was such an anomaly, such an outlier, that it just cleaned that outlier from the data set. It ignored it altogether. Therefore, you guys are going to be massively understocked this year. We recommend putting these numbers into your own algorithm, because you're comfortable with that, and seeing what happens. The result was a massive win for our company, because the data was corrected and they accepted the change that we recommended. But it doesn't happen like that a lot with some big retailers.
That's kind of the exception to the rule there.
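
The pitfall Eric describes can be sketched in a few lines of Python. This is a hypothetical illustration, not the retailer's actual model: a naive z-score outlier filter silently drops a genuine demand event, and a forecast built on the "cleaned" history understates real demand.

```python
# Hypothetical sketch: a naive outlier filter drops a real demand event.
from statistics import mean, stdev

def clean_outliers(series, z=2.0):
    """Remove points more than z sample standard deviations from the mean."""
    m, s = mean(series), stdev(series)
    return [x for x in series if abs(x - m) <= z * s]

# Weekly unit sales; week 6 was a genuine surge (the kind of anomaly an
# out-of-stock year produces), not bad data.
weekly_sales = [100, 95, 110, 105, 98, 400, 102, 97]

naive_forecast = mean(clean_outliers(weekly_sales))  # surge silently removed
honest_forecast = mean(weekly_sales)                 # surge kept
```

With the surge filtered out, the "clean" forecast lands near the quiet weeks (about 101 units) while the history including the event averages closer to 138, which is the gap Eric's team had to explain to the buyer.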

Logan Ensign: 10:52

But what I'm hearing is don't let that discourage you, it's still worth it.

Eric Richardson: 10:55

Yeah, the fun for any forecaster or quant is digging in and looking at the data to find the root causes of what's making these numbers appear in front of me, what's causing those numbers to change. So a lot of the fun is learning through digging in and exploring the data and joining it with other sets. And if the end result is a good one, like someone accepts and takes your recommendations, that's awesome. If someone doesn't, you simply look at that and say, look, I learned, my team learned, we're all smarter and better because of this. And now we know, next time this comes up, we're going to put it in front of the parties that need to see it, maybe in a different way, and maybe we'll get a different result. But it doesn't stop us from doing it.

Joel Beal: 11:44

Eric, you were mentioning earlier the counterparty in this being a large company, a large retailer. It sounded like lots of resources, probably lots of data scientists and a large engineering team to build all sorts of algorithms. Tell us about USAopoly, a smaller company, although your products are all over the place. Logan and I were just chatting about how we both interacted with them just over the weekend in unexpected places, you know, in a national park, in my case, with your National Parks Monopoly, which was cool to see. But how do you, as a smaller company, create this data-driven culture? How do you invest in a way that's appropriate for your size and your resources, and create that culture? We have a lot of smaller companies that I think are trying to figure that out.

Eric Richardson: 12:33

Joel, you hit the nail on the head when you say culture. It is about creating a culture, not creating a number system or a data practice. It all comes down to the culture. So when I came into my company, there was a forecasting function that had been running for 10 years. They largely already knew what they were doing. They could forecast really, really well, but there was a lot of hesitancy in upper management as far as accepting and trusting the information coming in front of them, and then using the great information that was being generated to impact and drive business decisions that would drive revenue, efficiencies and cost savings. I made the mistake of coming in at first like a bull in a china shop and saying, oh, look at all this stuff, let's do this, let's do this, let's do this. And it wasn't accepted too well. But what I learned from that is the importance of starting small. So in any company that's trying to adopt a data-driven culture, I would encourage the folks that are trying to implement it to start small, start with little wins. Start by validating and verifying: hey, my company is already doing this. Why don't I show them the data that proves that? Hey, you made a great decision by making business decision XYZ. I'm going to show you how that impacted your revenue. I'm going to show you how that impacted your inventory on hand. So you start small, get little wins and build from there. And what I found at my company is that the more little wins you can get, and the more you can show them that, hey, you already know this innately because you made a great business decision, and the data supports it, the more people start to trust and say, hey, what does the data say?
One of the cool things for me was the first time someone in the boardroom said, what's the data saying about this? Let's ask Eric to model this before we make this decision. That was a cool moment for me, because it had taken maybe three and a half years of proving little wins and putting data and models in front of the bosses before they were comfortable with getting ahead of it. So start small, prove out what already is doing well. And then I would also encourage people: when you are trying to develop a data-driven culture, make sure that you don't throw data out there that's just data. Whenever you're presenting data or analytics to someone, you should always give the business case around what the data says. The data says buy here, sell here, but then you have to translate that into the business case: buy this product, hold it here, move it there. You have to quantify and put context around the data rather than just throwing numbers at people. I would also say really important to driving a data-driven culture is hiring really smart people that are smarter than you are, and not being afraid to let them run free, to give them a long leash, per se, to have fun and explore and experiment with the data, because you never know what you'll find. We have a really, really smart young programmer, a database analyst, that works with me, and I always tell people that without him I wouldn't be able to do my job. The things that he thinks of, and the way that he writes code and serves up data for me to analyze and make great business decisions with, is really awesome. He's super, super smart and I want him to take my job someday. So you have to not be scared to hire talent that will surpass you.

Logan Ensign: 16:31

Man, I love all those points. That first point, Eric, is fascinating to me on data culture: that often data can reinforce things we already know, and that's an important part of the change management journey; that's not a waste of time. What I also heard was, as you present data, make sure you're presenting context and insights. And then, lastly, don't be afraid to hire great people. I think that's a fantastic blueprint to help organizations go on that journey.

Joel Beal: 17:01

So, as we look into the future of analytics and data, obviously everyone right now is talking about AI. I think the focus has probably shifted a little bit from machine learning a couple years ago and what was going to happen there with generative AI more recently. I'm curious if you've played with any or either of those and what you think the impact will be on analytics going forward.

Eric Richardson: 17:28

Yeah, I mean in grad school in Silicon Valley, pretty much in San Francisco, we would build models like random forest models and large language models. When we would build those out and train them and then have them give us results and predictions, this was, what, 10 years ago, we were thinking, man, this is so cool, but I don't know if we're ever going to have the computing power or if anyone will ever massively adopt this. Now, fast forward 10 years and it is everywhere, mainstream. I think it's largely a good thing. I think it will help simplify smaller tasks, which it's already doing: helping you manage your calendar at work, helping you write better code, helping to check that you might not have made simple input errors when you're inputting things into a spreadsheet. There's a theory out there that AI is going to take people's jobs. I think that AI is going to really amplify people's jobs and give people more time to hang out with their family and do things that they like doing, because it will make them more efficient. As far as forecasting and supply chain optimization, AI has been around for a long time. When people say artificial intelligence, it's these large, large models, but when you boil those models down, it's just a simple equation: hey, I have a Y variable, a Y target that I'm trying to predict, and here are my X variables, the things that will make that thing happen.
So those simple equations have been used for years and years in forecasting, and I think the computing power that comes with AI, and the power of AI to really comb through what everyone else has done in the past, is going to make my job a heck of a lot easier, to the point where I don't have to go through every SKU that I'm trying to forecast and figure out what's my seasonal demand pattern or what's my profile for when the demand will hit. I see AI being able to really quickly snap its fingers, serve that up to me, and then I can act on it.
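
The "Y from X" framing Eric describes can be sketched as a one-variable least-squares fit. This is a minimal illustration with hypothetical numbers, not his team's actual model; real demand forecasts would add seasonality, promotions, price and more X variables.

```python
# Minimal sketch of "predict Y from X": fit a least-squares line of
# demand vs. week number, then extrapolate.
def fit_line(xs, ys):
    """Return (slope, intercept) of the ordinary least-squares line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

weeks = [1, 2, 3, 4, 5, 6]
units = [120, 135, 150, 165, 180, 195]  # hypothetical steady +15/week trend

slope, intercept = fit_line(weeks, units)
forecast_week_8 = slope * 8 + intercept
```

The same Y-and-X structure underlies far larger models; here the fitted slope of 15 units per week extrapolates to 225 units in week 8.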

Joel Beal: 19:52

Well, I think that's a nice segue. We're in a world, particularly the last couple of years, where I think there's more unpredictability than ever before. I mean, there have been huge swings in demand: when COVID hit, during COVID, and now post-COVID, if we can call it that. I know that's really impacted the toy industry. I'm curious to hear how you dealt with that tricky couple of years, which probably continues even now. How you responded then, and how, maybe with things like AI and other tools, you get better at responding quickly to future changes that we can't predict.

Eric Richardson: 20:31

Yeah, COVID threw everyone an unpredictability wrench. It flipped a lot of things on their side and all the norms that we were used to changed. For the toy industry, it made a lot of changes for the better. Everyone couldn't go out, so what are they going to do? They're going to buy toys and games and they're going to enjoy time with the only people they can interact with: their family. Across the whole toy industry we really saw a massive demand spike. To be honest, none of us predicted that, because none of us could have predicted COVID happening. And then, following that massive demand spike, was a supply spike, which just so happened to get held up off the port of Los Angeles, creating a lot of empty shelves and tying up a lot of working capital in containers for a long time. But one of the cool things my team learned from that big unpredictable demand spike, which we couldn't have known about until it actually happened, is that we were pretty nimble. We would watch our data very closely at the daily level, sometimes at the hourly or minute level. We watched one retailer very closely, and their data in some ways represents demand overall; it can be used as a proxy for what the market is thinking. And we saw one interesting metric start to spike. My analysts and I were looking at it and we're like, that doesn't really make too much sense. Why are people looking at these products so much? During Christmas it makes sense, but not in February or March or April. This doesn't make sense. So we looked at that and said, all right, let's come back to this in a week and see if this is a trend or just an anomaly. Sure enough, we went back in a week and the interest in these products, which we were measuring via people looking at them and interacting with them, had really gone through the roof.
So we made a very simplified set of data and took it to our production managers and said, hey, I think people are going to be buying games a whole lot because of what I mentioned. They're only with their families, it's a really fun way to interact and it's relatively inexpensive. Here are five products that we need to really invest in now, get ahead of, and bring supply in to meet this demand, and I think it's gonna keep happening. So we did that with a couple of products and got ahead of it, and we had inventory in ahead of time. The problem was that we were a little too conservative; we couldn't have predicted that it was even more so. We sold through everything we had. We saw that spike and we brought the inventory in on time, but it still wasn't enough. We had a lot of really good learnings from that, as far as understanding tolerances in the models that we build, understanding how consumers think, and also having a great outlier case that we can throw into our models when we wanna throw some unpredictability into them in the future. I would encourage the data folks that are helping to give information to people making decisions: trust, but verify. If you see anomalies in your data, make a note, flag it and see if it creates a pattern. Once it creates a pattern, don't be afraid to build the business case that supports what you should do with the pattern you're seeing, and then act on it. But always put good numbers behind it as to the probability of it happening, how much incremental revenue it's gonna get you and what the cost is.
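
The "flag it, come back in a week, see if it's a pattern" routine Eric describes can be sketched mechanically. This is a hedged illustration with hypothetical numbers and thresholds: a point is flagged when it exceeds a trailing-average baseline by some ratio, and only a run of consecutive flags is treated as a real trend rather than a one-off anomaly.

```python
# Sketch of "flag the anomaly, then wait for it to repeat".
def flag_spikes(series, window=4, ratio=1.5):
    """Flag each point that exceeds ratio * trailing-window average."""
    flags = []
    for i in range(window, len(series)):
        baseline = sum(series[i - window:i]) / window
        flags.append(series[i] > ratio * baseline)
    return flags

def is_trend(flags, run=2):
    """A spike becomes a trend once it persists `run` periods in a row."""
    return any(all(flags[i:i + run]) for i in range(len(flags) - run + 1))

# Hypothetical weekly product-page views; interest takes off in week 5.
page_views = [200, 210, 190, 205, 520, 580, 610]
flags = flag_spikes(page_views)
```

A single flagged week is just a note; three in a row is the pattern that justifies building the business case.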

Joel Beal: 24:17

If we go back to earlier in our conversation, around you bringing data and the question of how often you can convince someone, change someone's mind, based on that: you used this example of, okay, you're seeing interest, people are looking at your products online. Maybe it hasn't quite converted into higher sales yet. I mean, it's a crazy time, right? Everyone's, as you said, holed up in their houses, and you flag it. It sounds like you come back later and say, okay, we really think there's something building here. You go to your production manager and say, hey, I want more. Was that a conversation? Did you find that people said, yeah, that makes a lot of sense, I can tell the story to myself around why there's this burgeoning demand for our product? Or was it a debate internally? How did that play out?

Eric Richardson: 25:03

Yeah. So at first, when I showed it to my analysts and the other team members, they were kind of like, you're crazy. And I was like, well, let's just all watch it, so we can all be in this together with this crazy decision. Then we went to the production folks, and we went to the finance folks that would be paying for the production, and we presented it to them, and they said, we think you're probably onto something here, but how can we know? So what we did is we trusted but verified. We modeled backwards. We essentially back-tested a model and asked: this metric that we were using, the interest in the product without buying it, how often has that interest correlated to a purchase in the past? Meaning, for every person that looks at this, how often did they buy it? So we ran that model and back-tested it. Then we ran it with the numbers we were seeing, and we said, okay, this is a little crazy and it's out of the ordinary for anything we've ever seen, but based on how history has played out previously, this interest should translate into this many purchases. And when we put that information in front of them, the green light was given to test it out: all right, cool, let's invest some money in this inventory that we need anyway. If it doesn't sell now and your prediction is wrong, it'll sell by Christmas anyway. So it really helped change their opinion, back-testing the model and showing them the proof that, hey, this metric I was looking at, which I think is gonna impact purchases, actually has predicted purchases in the past, so we believe it will now.

Joel Beal: 26:42

I love the trust-but-verify. Back testing is obviously always a great way of saying, okay, if we're seeing this, let's see how this model would have worked in the past. A good way, in my experience, of helping build that trust. But even your comment at the end: you did that, and it was probably, okay, let's try this out, and then in retrospect you're like, man, I wish I would have doubled that order. Yeah.

Eric Richardson: 27:04

I mean, in retrospect, we all wish we would have doubled and tripled our orders, but it always comes down to risk tolerance and understanding where your organization is and where your capital requirements are. At that same time, we didn't know if we were gonna get paid by vendors, and we didn't know what everything would look like as far as cash flow. So it was a calculated risk that paid off. But yeah, looking back, I wish we would have purchased five times more.

Joel Beal: 27:31

Yeah, I think there's a lot of companies in that boat that didn't even get that first one, so that already got you ahead.

Logan Ensign: 27:37

Well, Eric, I'll say one of the things I just love about these conversations is getting to learn more about folks, businesses and their products. I think you all have a pretty unique model, particularly around licensing. We know that's really big in toys and games, but we also know it's maybe especially critical for you all in your business model. So I'd love to hear more about that business model. How do you make decisions about what IP to license? What games have the longevity and popularity to fit your model? We'd just love to hear more about that.

Eric Richardson: 28:13

Yeah, that's a great question. I love talking about this. So at the core of our business we produce board games. We have our own IP, games like Blank Slate, Telestrations and Tapple; many people may have played those. But we have another segment of our business that drives a lot of revenue, where we work with large board game manufacturers like Hasbro, and we tie up their game mechanics with really cool IP that people are interested in. So if you're a big Star Wars fan, or if you're a big Frozen fan, we're gonna give you a game you're familiar with, like Monopoly, and we're gonna let you play it as Elsa or as Darth Vader. The interesting thing about licenses is a lot of it comes down to measuring consumer interest. So when we are evaluating licenses, we wanna look at: what's the staying power of this? Is it just a flash in the pan, like a TV show people are really interested in, and then once the season ends, no one's gonna be interested and our games are gonna sit on the shelves? A big way my team and I have helped our organization make data-backed decisions on choosing licenses is we've built models that take into account customers' propensity to purchase, meaning looking at other games and other products that are similarly licensed. Is this a very fanatic audience that searches on Google a lot for it and then doesn't purchase, or are they an audience that tends to purchase a lot of the products, that's committed to it? We found Disney, Star Wars, Super Mario, a lot of these licenses really have staying power, and I think we've just scratched the surface as far as tapping the consumer base and the consumer that wants to buy and really interact with their favorite licenses. We use a lot of Google Trends. We use a lot of market research information about who else is offering this type of license, what price point it's at and how long it has lasted.
This has really been part of creating a culture that makes data-backed decisions, because the license acquisition process used to just happen. We had experts that knew the licenses and had experience with the people that were licensing them out. So we got them and we tried them, but what we found is there would always be 30% that didn't hit, and my CFO kind of challenged me and said, how can we figure out how to make only 10% miss? So we developed this methodology around understanding propensity to purchase and understanding how deep someone is willing to go. If someone's willing to commit their hard-earned dollars to your product and that license over and over, it's worth committing to and saying, we're going to invest three years of development into making a game and printing a whole bunch of them.
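
A license screen like the one Eric describes could look something like this. The weights, names and threshold below are entirely hypothetical, a sketch of the idea only: combine raw search interest with how often that interest actually converts to purchases, weighting conversion more heavily, and shortlist licenses above a bar.

```python
# Hypothetical propensity-to-purchase screen for candidate licenses.
def propensity_score(search_index, purchase_conversion):
    """Both inputs on 0..1; conversion is weighted heavier than raw buzz."""
    return 0.4 * search_index + 0.6 * purchase_conversion

candidates = {
    "License A": (0.9, 0.7),   # big fan base that also buys
    "License B": (0.95, 0.1),  # lots of searching, little buying
    "License C": (0.4, 0.6),   # smaller but committed audience
}

# Keep licenses scoring at least 0.5, best first.
shortlist = sorted(
    (name for name, (s, c) in candidates.items()
     if propensity_score(s, c) >= 0.5),
    key=lambda name: -propensity_score(*candidates[name]),
)
```

The "fanatic but non-buying" audience (License B) is exactly what this kind of weighting screens out, while a smaller audience that spends makes the cut.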

Joel Beal: 31:24

Do you generally know, when you get a license, again getting back to this kind of gut idea, will you be like, I know this one's going to be a winner, and there might be others where you're like, it may work, it may not? I'm curious how that gut reaction to brands, we all have things that we gravitate toward, aligns with how things actually perform.

Eric Richardson: 31:46

Yeah, it's funny you asked that, because people at my work make fun of me sometimes, in a nice way. They're like, oh, I bet Eric likes this one, it's not going to sell. Because the one game that I really stuck my neck out on, I loved it, I played it and I thought it was so fun, so great. So then I forecasted a whole bunch, and we bought not that much, thank God, and it was just such a dog, it bombed. So I clearly wasn't the target audience for that. We do have kind of a gut reaction of this could be good, this could be bad, but I like to think in ones and zeros, black and white. So when I'm making a decision, or recommending a decision for a license, I need to strip that decision of all of my biases and all of my emotion and go, all right, I really like this, so I'm going to really dig in and try to prove with the data that this is going to work. Or if I don't like it, I still have to do that same process. So it really has to be process driven. But I would say my track record for my gut is really bad, so I'm really happy that I have a great team and we understand how to analyze it from a data perspective, because if we trusted my gut, I don't know if our company would still be around.

Logan Ensign: 33:05

Well, in the journey to get there, I mean, was there a lot of iteration and feedback loops based on, okay, these were our winners and this is what we could have seen beforehand? Or was it pretty clear to you: oh, let's look at these different data elements, and that's going to be really helpful in informing what to make bets on?

Eric Richardson: 33:25

Yeah, that's interesting, because I mean we have had so many meetings and so many strategy sessions about this, because the quantitative side of it's really easy. All right, you get together, you know, you get some coffee, you get a whiteboard and some markers, and you write down all of the variables, all things you can measure quantitatively that you think can predict the success of something. But what we always get hung up on, and it's so hard for us data people to measure, is how do you measure the qualitative things? Like, hey, Logan really likes this, or Joel has a Squishmallows keychain, so he's going to buy more Squishmallows. For the qualitative factors, we've found it helps to involve teams that are not familiar, or may not even be comfortable, with data, but have really good depth of insight as to how those licenses work, what the fan base is, how they react at conventions. We've even tapped into, you know, tell me how many people come dressed up as these characters from this license at Comic-Con, and we're like, that's a predictor that people are fanatical and they may buy it. So then we take that and we ask, does this scale? So we look even bigger.

Logan Ensign: 34:46

Well, one more question here for you, eric. We have heard from other folks, other customers of ours, a rethinking in how people approach manufacturing, particularly manufacturing in China. I'm curious if you've got a perspective there how you all are thinking about that. Are you evaluating different ways to source product? We'd just love to understand your thinking there.

Eric Richardson: 35:10

Yeah, that's a great question, and putting all the political answers aside and looking at this from the business perspective: the role of any forecaster is to measure demand, try to predict demand, and then create enough supply to meet that demand. A key element of that is transit time. So if we need stuff on shelves in a month but our production lead time is three months, we're kind of in a bad situation. So we have started to find domestic options to have dual production. Maybe we have capacity and components for 20% of our full production plan in a warehouse in the United States, where we can surge, produce it and get it to market in three weeks to a month in order to meet the demand. It's a little bit more expensive, but it pays off, because you get to meet the demand and you get all the incremental revenue. We've just started doing that, probably since COVID. For the industry as a whole, COVID was a real driver for that, because we all kind of realized, as all of our products were sitting off the coast not being sold, that we need a different solution to this. So I think industry-wide we've adopted the dual procurement strategy, where you get enough components and enough raw materials to make a certain amount, to surge for demand. One of the things people don't think about a lot with overseas production is counterfeiting of products. I'm a member of the IACC, the International AntiCounterfeiting Coalition. We meet in Washington with Homeland Security and reps from Amazon and Alibaba, and we talk about strategies for how to clean the markets up. From a board game perspective, they're made out of paper and plastic. They don't cost a ton of money to produce, so the consumer gets a lot of value out of them. That same factor, of not being very complicated or expensive to produce, has led to lots of counterfeits. We find that when we produce in the United States: zero counterfeits.
When we produce overseas, the longer a game stays in production, the more counterfeits enter the market. And the more counterfeits are in the market, the more they dilute your brand and steal sales from the partners you worked so hard to win sales and shelf space with. So to me it's not only a just-in-time, meet-the-demand question; it's also how we can best protect our brand, and which production locations balance those two things.

Joel Beal: 38:17

Very interesting on the counterfeiting point. So to summarize, it sounds as though the mindset is really flex production: that last 20% you keep more localized so you can flex it up and down based on shorter-term fluctuations in demand, but it's still cheaper to produce the bulk of it overseas, at least for now.
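The flex-production logic described here can be sketched in a few lines. This is a hypothetical illustration, not USAopoly's actual planning code: the function name, the 20% domestic reserve and every quantity below are assumptions made for the example.

```python
# Hypothetical sketch of the dual-sourcing "flex" decision: most production
# runs overseas on a long lead time, while a domestic reserve (assumed here
# to be 20% of the plan) can be surged to cover short-notice demand.

def plan_surge(forecast_units: int, overseas_inbound: int,
               domestic_capacity: int) -> int:
    """Return how many units to surge domestically.

    forecast_units:    demand expected within the domestic lead time
    overseas_inbound:  units already in transit that will arrive in time
    domestic_capacity: components/capacity reserved in the US warehouse
    """
    shortfall = forecast_units - overseas_inbound
    if shortfall <= 0:
        return 0                              # overseas supply covers demand
    return min(shortfall, domestic_capacity)  # surge only what we can build

# Example: a 10,000-unit plan with 20% reserved as domestic surge capacity.
plan = 10_000
domestic_reserve = int(plan * 0.20)           # 2,000 units

print(plan_surge(9_000, 8_000, domestic_reserve))   # prints 1000
print(plan_surge(12_000, 8_000, domestic_reserve))  # prints 2000 (capped)
```

The surge is deliberately capped at the reserved capacity: demand beyond that cap is simply missed, which is the trade-off the more expensive domestic reserve is sized against.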

Eric Richardson: 38:36

Yeah, it's all about the mix, the weighted average of your production cost. If we don't have to use the surge capacity, that's great, and our cost of goods is 10% lower or so. But when you have that flex built in for when you need it, we've seen a lot of success with it. The goal is just to forecast well enough to produce it all overseas.
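The weighted-average cost point can be made concrete with made-up numbers. Here the overseas unit cost is normalized to 1.00 and the domestic surge cost is assumed to be 50% higher; an 80/20 mix then lands about 10% above the all-overseas baseline, roughly matching the "10% lower or so" figure cited. All costs and volumes are illustrative assumptions.

```python
# Illustrative blended cost-of-goods calculation for a mixed sourcing
# strategy. Unit costs are invented: overseas normalized to 1.00, domestic
# assumed 50% higher. The point is the weighted average, not the numbers.

def blended_cogs(units_overseas: int, cost_overseas: float,
                 units_domestic: int, cost_domestic: float) -> float:
    """Per-unit cost of goods, weighted across the two sources."""
    total_cost = units_overseas * cost_overseas + units_domestic * cost_domestic
    return total_cost / (units_overseas + units_domestic)

# All-overseas baseline vs. an 80/20 mix that used the domestic surge:
baseline = blended_cogs(10_000, 1.00, 0, 1.50)   # 1.00 per unit
mixed = blended_cogs(8_000, 1.00, 2_000, 1.50)   # 1.10 per unit

print(f"{(mixed - baseline) / baseline:.0%}")    # prints 10%
```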

Joel Beal: 39:02

But I wish it worked that way more often. If you just nail the forecast perfectly, the problem solves itself: that three-month lead time doesn't matter if you can forecast six months out with perfection.

Eric Richardson: 39:15

Yeah, I wish it was that easy. Bringing it back to AI, though, I really think that with AI-driven methods this is going to happen more and more frequently. We're going to have fewer misses on our forecasts, better lead times and better visibility into where demand is going. So that's something I'm really excited about: AI making my record a little cleaner.

Joel Beal: 39:42

There you go. Well, I think it'll be both. The forecasts are going to get better, and you certainly see this a lot in fast fashion: there's also that ability to capture smaller, shorter-term trends, but your supply chain has to be able to handle it. To me that's a very interesting approach: is low-cost manufacturing going to come back to the United States? Probably not fully, but there's the ability to say, I'm seeing short-term trends I can capture, I can get those incremental sales, and I'm willing to pay a bit more for that production. And with AI, and the more data we're collecting, you can see those shorter-term patterns that fast, agile brands can adapt to pretty quickly. So my take is you're going to see both.

Eric Richardson: 40:29

I think you're 100% correct on that. I think it will start to happen sooner rather than later. It's all about which companies adopt it first; then it gets cheaper, and everyone adopts it.

Joel Beal: 40:42

Well, Eric, this has been fascinating.

Eric Richardson: 40:46

Yeah, thank you for having me. I just love talking data, I love talking supply chain, and I love talking with people who understand what I'm talking about and don't just go "huh" or give me a blank look. So I really appreciate you all having me on the program. I'd love to come back any time to talk about what I'm passionate about: data and supply chain.

Logan Ensign: 41:06

Amazing. Well, I don't know if I have the final count, but maybe we cleared 100 uses of the word "data" there. We'll look back at the transcript.

Joel Beal: 41:13

We'll have to add a counter, and you know.

Eric Richardson: 41:16

Well, I was actually using AI to count. We're at 258 right now.

Logan Ensign: 41:19

All right, well, fantastic, Eric, and thanks for joining us here on Shelf Life. Thank you. You've been listening to Eric Richardson, manager of forecasting and data analytics at USAopoly. That's all for this week. See you next time on Shelf Life.