Measuring, Capturing and Paying Attention
In this episode of the Consumer Insights Podcast, Thor is joined by Dr. Jonathan Stringfield, VP, Global Business Research and Marketing at Activision Blizzard.
A successful advertising campaign is determined by its impact on consumer behavior. But how can you best measure that impact?
Dr. Jonathan Stringfield, VP, Global Business Research and Marketing at Activision Blizzard, explores this question through innovations in attention measurement. With deep insights experience from the worlds of tech, media, and entertainment, he also highlights how the gaming ecosystem—with over 3.5 billion users—is a key space for insights professionals to learn from.
We discuss:
- The importance of integrating messy human factors into decision-making
- The need to understand the underlying principles of AI and other technologies to avoid blindly relying on them
- Why honing communication and soft skills can help you unlock your full potential as an insights professional
You can access all episodes of the Consumer Insights Podcast on Apple, Spotify, or Spreaker. Below, you'll find a lightly edited transcript of this episode.
Thor
Hello everyone and welcome to the Consumer Insights Podcast. Today, I'm excited to have an incredible insights leader joining me for what I know will be a riveting conversation. I'm thrilled to introduce today's guest, Dr. Jonathan Stringfield, VP of Global Business Research and Marketing at Activision Blizzard. Jonathan is a research and marketing executive with nearly 20 years of experience at leading technology, media, and entertainment companies like Twitter and Facebook, with deep expertise in quant and qual research methodologies, insights, empirically focused marketing, organizational strategy, and leadership. He is also the author of the book Get in the Game, which explores how to level up your business with gaming, e-sports, and emerging technology. Thank you so much for joining me, Jonathan.
Jonathan Stringfield
I'm really happy to be here. Thanks for having me. I'm excited to have a conversation.
Thor
Now to kick things off, could you take a couple of minutes to tell us about yourself, your company, and how you got to where you are today? How did it all begin?
Jonathan Stringfield
Sure. Yeah, of course. So, I think as noted, I work over at Activision Blizzard, where I oversee research and marketing for a media division. Long and short, Activision Blizzard is a video game company, a fairly large one, making legendary titles like Call of Duty, Candy Crush, World of Warcraft, and what have you. My team works with marketing, community, and advertisers to, one, talk about the power of gaming, and two, find creative ways to bring them into the environment in a way that makes sense for our players and drives value for advertisers, the studios, and everyone involved in between. Before that, I think the common denominator in my career has been the research component. At Activision Blizzard, my team works on everything from hand-to-hand combat in terms of ROI studies to bigger thought leadership studies about the power of gaming. This is very similar to the roles I had at social media platforms before—both at Twitter, where I was overseeing much of their measurement science practice in North America, and at Facebook, where I was one of the early members of their measurement science team. Going all the way back to Nielsen, I worked on television ratings. So, long and short, insights have been my entire career, always focused more on advertising measurement research, but heavily on the quantitative side. Along with all of that, I was pursuing a PhD in sociology, focusing on how folks think about and use technologies, how they use them to interface with one another, and how we get to know each other and think about our identities, intermediated by things like technology. When you look at that through the lens of social media, gaming, or whatnot, it all makes sense as a generalized trajectory.
Thor
With that background and that level of expertise and exposure to different industries, how do you define an insight?
Jonathan Stringfield
Yeah, I mean, I think insight is really what you do with the information that you gain from data. More specifically, it's what you can see through pattern recognition, experience, or otherwise. Data, in the abstract, you could go as far as to say is useless; at most, it's just a collection of values. Information is what you extract from data to derive value, but the insight is how you apply it.
Folks who are really successful in insights are those who can see beyond the superficial.
They do pattern recognition, or see connectivity between a given insight and other insights or other points of data or information. That's what makes something meaningful and ultimately leads to a high-quality decision for them, their organization, or the situation.
Thor
I love that definition. Having been in the industry for so long, do you feel that your way of looking at that definition has changed over the course of your career? If so, what is it that made it change?
Jonathan Stringfield
I mean, like so many things, there have been ebbs and flows. But that core concept of extracting the meaningful parts of information is what I've applied throughout my career to make it successful. It's about seeing the big picture and how it ratchets down to smaller, seemingly inconsequential pieces of information to derive value.
I also found success in being able to take highly technical insights or information that might not otherwise be understandable or perhaps complex to others, and explaining them in human terms. When you do that, it not only becomes more actionable, but also allows for much better internalization within the business. You're not operating at a level where people "kind of" understand what you're saying but still see it as a technical abstraction.
Thor
I love that. I know that attention measurement is one of your areas of expertise, and you’ve recently been involved in some really fascinating work on this topic. Could you share a bit more about that?
Jonathan Stringfield
I can, and you know, taking a step back from attention, as noted, if you look at the trajectory of my career, it's always either directly or indirectly been working within the advertising community in a data or insights-led capacity. Essentially, what I’ve been doing, one way or another, directly or indirectly, is focusing on measuring the value of advertising, right? So, if we start there and think about how value in advertising is determined, and if you think about the history of the advertising community, it hasn’t really changed a lot. I find myself having to talk about this often, one, from a data perspective, but two, from a broader perspective on what’s going on in the advertising industry. The way media is bought—whether we’re talking about placements in video games, social media, television, or anything in between—is still based on the same measures we’ve been using since, depending on where you want to start, the 1960s or 1970s: reach and frequency. That’s to say, how many people saw something and how often they saw it. Of course, that has value. We want to ensure people had an opportunity to see something. But if you think about it now, in the digital age—and indeed, most of my career has been in digital media or digital forms of media—I’ve fundamentally believed that we should be able to do better. Definitely not at the same level of bartering or arbitrage on media that we were using back in the day when television was at its advent and Nielsen was measuring things. I ground all that because what I think attention measurement represents—whether it’s specifically attention measurement or otherwise
It is a move towards a world where we’re not just looking at the breadth of media but also the depth, focusing on the things that actually have meaning.
Where I’ve been personally interested and where we’ve made the most advancements in the measurement world is not just about whether you or someone else saw a message, but whether it had some impact. Did it drive action? Did it have a cognitive impact? Ultimately, I would like to think that if an advertiser, a brand, or what have you, is going through the trouble of putting together an ad campaign, they want to effect change. They want to convince you to buy a product or to think better of their company, right?
So, I think attention is a step in that direction because we’re no longer just asking, “How many people saw it?” but “Did they internalize it?”
We can go as deep into that as you want, but even just having some of these baseline measures is a big step forward. If you think about reach as the de facto measurement for a lot of media—someone having the opportunity to see something—attention is a step beyond that, saying they definitely saw it. We’re not all the way to understanding cognition yet, but we’re getting a lot closer. And this is especially important for video gaming because one of the things we do really, really well is capture attention.
So, if we’re not in a position where we can ensure that advertisers and marketers understand that, we’re leaving a lot of value on the table. There’s a lot of incentive to make sure we’re thinking about this seriously.
Thor
Building on what you just said, let’s talk about your book Get in the Game. To my understanding, the book explains why marketers and executives should pay closer attention to gaming. What are some of the opportunities in the gaming ecosystem that you think more insights professionals should be aware of?
Jonathan Stringfield
Yeah, I think there are at least two levels I’d invite them to think about. The first is, again, particularly for those that work in the media world. Whether they work in social media, television, print, or whatever, we generally want to have marketing activities in media that consumers find meaningful, where they are spending a lot of time, and where you can find a lot of consumers. Gaming checks all those boxes, like, in spades.
In fact, you’ll hear all the stats: 3.5 to 3.6 billion people play games, and the gaming industry itself is bigger than the movie industry, the music industry, and so on.
So, it’s a massive, massive entertainment ecosystem, but one that historically hasn’t received a lot of attention from marketers, in part because it wasn’t super easy to tap into.
And just in full candor, even today, it's not super easy, but that's a lot of the hard work we're doing—trying to make it more seamless or turnkey. Even back in the day when I was at Facebook and Twitter, perhaps not surprisingly, I was a huge gaming fan. I'd be having conversations with CMOs and CEOs of big Fortune 100 companies, and they'd talk about the future of social media, which, of course, was partially true. But from a fan perspective, I kept looking at the gaming industry and thinking, “Are y’all seeing this? Are you paying attention to it?” Because gaming was bigger than even some of the platforms I was representing at the time. It always kind of confused me. So, as that business grew, particularly at Activision Blizzard, it was a great opportunity, and I’ve been happily working here ever since. On one hand, there’s a lot to be said about the where, when, and how of it. But in general, having folks, particularly those on the insights side, start to be cognizant that there are a lot of measurable and eminently valuable opportunities in marketing activities is one thing. The second level is perhaps even more fundamental. If you look at video gaming—and I talk a lot about this in the book—I go through a history of gaming. It’s not an exhaustive run, so if you're not interested in the history of technology, you won’t get bogged down. But I spend quite a bit of time tracing it out because what you’ll find is that
As long as there have been video screens, there have been video games. We, as humans, have tended to turn to games to help us understand and internalize new technologies.
That has been the case with video games forever, from the first computers to televisions, certainly mobile phones, and so on. And if you think a little further, we can talk about more speculative topics like the metaverse. It’s the same deal. Everything we talk about that’s ostensibly the metaverse right now is basically a video game. That was the big realization we landed on last year when metaverse hype was super high: we’re really just talking about gaming in a different form. So, I often encourage folks—insights professionals or otherwise—that if you want to understand the future of technology, how consumers internalize, think about, learn, adapt to, and maybe even grow to love new technologies, gaming is a really good way to do it. It fulfills deeply human needs and is a very human way for us to relate to technology.
Thor
I’ve never thought about it that way, but I love that you describe gaming as a human way to spend time understanding how technology evolves. Do you have any stories from your career where you’ve integrated insights to fuel innovation? It could be anything from building a better campaign to improving a product.
Jonathan Stringfield
Sure, so I think we're going to go back in time a little bit. One, because I think it's safer, but two, because it’s kind of an interesting one: a fairly junior employee asking a series of innocuous questions about insights-related practices may have changed the entire television industry. So, with that lead-up, back in the day, early in my career, I worked at Nielsen on television ratings, and I was part of a team that did universe estimation.
Universe estimation is basically demography. If we think about what a television rating is—historically speaking; this is super complicated these days—essentially, it’s how many people watched a program divided by how many people could have. So basically, I made denominators for a living. This was very much like big data before big data was a thing. We were working with census data and other forms of population projections to figure out how many people lived in a given DMA, or television market.
So long and short, it was very early big data work, quantitatively serious, and very tied to the academic community. We worked closely with them, and also with our federal data apparatus, like the US Census Bureau. That part is important because when you’re making a reliable estimate of what a TV rating is, it’s not just about how many homes exist in, say, the United States, but how many of them are “TV homes,” i.e., homes that had a television that qualified under Nielsen’s definition.
For a long time, Nielsen used data from the 1970 census, which was the last time the federal government asked all its citizens, 100% of them (because it's an enumeration), whether they had a television. The percentage was something like 97-98%, meaning effectively everyone. So, every year when we were doing population estimates, we would apply that factor, and out of 100 million homes, we’d estimate that 97-98% had a television.
Now, the problem was that I started working at Nielsen in 2005. So, we were talking about data that was at least 30 years old. Moreover, around the same time, what constituted a TV household started to shift. Nielsen's definition had been the same for a very long time, going back to the advent of television. It had to be a screen with specific specs that could get linear broadcasts—cable, over the air, anything like that. But today, that sounds silly, right? We don’t even call it TV anymore. People watch video content on all sorts of monitors and devices that wouldn’t be captured under that old definition.
But Nielsen was still caught in that rut. So, first, just based on the insight of “Hey, maybe we shouldn’t use data that’s this old,” and second, wondering how we could expand or improve our estimates of how many people had a television in their household, I started asking some questions. In the past, when Nielsen would go out and invite people to be panelists, if they didn’t have a TV by Nielsen’s definition, we’d move on. Nielsen would thank them for their time and go to the next person. But I was like, “Well, hold on, let’s still talk to them. Let’s figure out who they are, what their demographics are, and what technologies they do have.”
Without getting into all the specifics, what essentially happened was that we discovered the percentage of TV households was actually dramatically lower than what we had been using. By dramatic, I mean only by a percentage point or two, but at the scale of hundreds of millions of households, bartering $70 billion worth of TV advertising, that’s a big shift. This eventually pushed Nielsen to reset the fundamentals of their business. The definition of what a TV is or how people consume content could no longer be the one that had been in place since the 1950s or 60s.
So whenever I hear broader conversations about the decline of television, my joke is, “Well, you're welcome.” I may have started it—not that I’m solely responsible for it—but I’m pretty sure I was one of the maybe 50 people who helped push it over the edge. So, if you’ve been displeased with how reliable linear television advertising is today, I’m sorry, but also kind of not.
Thor
That's a fascinating story. I think people will appreciate it more than complain. I know you've also expressed concerns about AI and machine learning models not being interrogated enough. How do you think insights professionals should be approaching these technologies? Any advice you would give?
Jonathan Stringfield
Yeah, I mean, in my mind, and again, we'll be a little reductive here, but I think necessarily so, it comes down to good practice for a statistician, right? You can create a model pretty easily these days. You can get insights from it, you can get information from it, you can get all sorts of nice coefficients that have more or less meaning, but a lot can go wrong, right? And the ways in which you construct a model can really drastically shift the outcomes. Even for folks that took entry-level stats in college, that got as far as regression analysis, that's pretty basic stuff. But I feel like we don't quite internalize that part enough when we talk about what is essentially a broader, more elaborate, more sophisticated utilization of what are fundamentally statistical models. Maybe not frequentist models, more Bayesian or predictive or what have you, but still statistical models. Models that have been and can be influenced by human intervention, right? By the biases that we pull into them, by the information that we put into them. Again, if we think back to our Stats 101 course, and you may not even have needed to take a Stats 101 course to hear this old truism, "garbage in, garbage out," which is a big debate in, you know, generative AI communities right now. If you're putting bad data into these things, you're going to get bad data out. So I think, on the one hand, it's created this interesting movement that I think could get dangerous, because these are powerful tools, right? When we think about generative AI, or any of these new technologies that allow you to supercharge your ability to do programming or what have you, folks are getting this mentality of, "Wow, with these tools, I can do anything. So all this stuff they're teaching you in school isn't needed anymore." And I have seen that sentiment a ton of times, right? Like, oh, kids should be pulled out of school, all they're learning is stuff from the 1950s, that's useless, we're in a new world, and so on and so forth. And I was like, okay, hold on, right? So, you know, obviously I have my biases, right? I have a PhD, I work at a university, so obviously I have a vested interest in saying, no, maybe there's some value to this stuff. But I've had to cheekily remind some of these folks that most of what we call AI—and again, that's a broad definition and a sliding scale, and that in and of itself could be an hour-long conversation—usually when people are talking about it, they're talking about ML. And most of what we know about machine learning is based upon statistical research and various work that was done in the 1950s. So, again, I'm not trying to be too reductive or too historical, but in that world, if you're just marching forth blindly without understanding how some of the underlying probability theorems work, and I'm not saying you need to understand that to be an expert, but
I think it's a very self-defeating view to believe that all expertise has died and we don't need to pay attention to that and so on and so forth, because you're going to start really believing in a black box that you don't understand.
And I think that's the biggest danger and where I really ask folks to have some degree of pause: we need to understand not just how we use them as tools—and again, they're powerful tools, and I think they have a lot of applications and what have you—but also what's fueling them. Because if we don't, then again, the information's off, the insights you derive from the information will be bad, and so on and so forth.
Thor
That's such a good reminder. Thank you for highlighting that. With your background and expertise, I think many people will take it seriously. You've previously mentioned that insights professionals need to be realistic about the degree to which research is actually guiding decisions within an organization. What skills do you think are essential for insights professionals today?
Jonathan Stringfield
Yeah, I mean, like I think there's no better demonstration of that than working in the advertising industry, right? Like, you know, it became very fashionable, I think over the past 15 years, for agencies—even creative agencies—to say, "Oh, we're data-led, and you know, we're so sophisticated with data," and so on. And they were, but a lot of decision-making was still kind of made by heuristics of the past, like passions, intuition, things of that nature, right?
And that's one industry, but it's very common in many industries, right? Even aside from the fact that any given piece of information or insight we give you has liabilities to it, has conditions, has error, has variability, right? It can be an abstraction that is more or less wrong. Even if you take all of that aside, that still just assumes that all decision-making should be done in this very rational way, or, in a more pessimistic view, that folks can understand how to apply...
an insight or information to the business decision process. So that's all kind of a lead-up to what I generally mean when I say it’s not enough to be right. And for people listening to this who can't see me, I'm putting up air quotes around "right." That's not enough, right? Because one, you might not be. And two, like any statistician worth their salt will tell you, we never have the answer, we have an answer, right? And we think it's an answer that's better than other answers. And hopefully, if we're really good at what we're doing, we're telling you how much belief we have around that.
But even aside from that, there are other factors that go into decision-making, whether it’s in business or otherwise. So, I think we need to understand that.
And I think, importantly, understanding all of those messy human factors, like how we can be empathetic and relate to the problems that our stakeholders have, and how our insights either inform or abate that, is almost as crucial as the hard technical skills.
And what I've found when working with insights professionals in the past is that we do tend to highly privilege and believe in the virtues of highly technical skills, which is absolutely true. But then all the soft skills in terms of how we actually put that into action aren’t as focused on. And again, partially because I don't think that can be taught, but I think it is something that is just as important as finding the best answer given the information at a given time and understanding the variability around it.
Thor
And if we take a slightly broader perspective, what would you say is the DNA of a successful insights team?
Jonathan Stringfield
I think in that light, then, it really comes down to this—and it seems counterintuitive given that same mentality where we really think about hard technical skills—a good insights team is super creative, right? Like any given potential data project, you can go in with a very concrete ask of, like, “I need to understand X,” and that's fine. And, like, a good insights team will generally be able to get that for you.
But the biggest value is, okay, there's X, but then also if we look at A, B, C, and W, Y, and Z, there's a lot of other interesting things that we weren’t even considering, right? So, what I've generally tried to encourage in my teams is: don't just give me the answer I'm looking for; give me how you would answer it, tell me what the data is telling you, all these things that are kind of in that more amorphous area of creativity. But I really think that...
being able to look at and understand an insights-based problem, look at complex data, and just kind of be like, “Well, gee, what if I turn this the other way, or transform the data in that way, or do some other kind of creative data transformation.” These little changes can make a big difference in terms of the interpretation that comes out of it. So, I think there's that necessary bar of entry that you obviously need to have, typically, right? And there's qualitative research—I know I’m...
largely speaking on the quantitative side—but just to kind of keep on track, some degree of quantitative know-how is the bar to entry. But those that are great, I think, are those that are just super creative and therefore also super curious about, “How can I look at the same problem like five or six different ways to come back with a solution or an answer that's even better than what the original question would have even implied?”
Thor
And if we double-click on this a bit more, from the angle of inspiration: which people have inspired you during the course of your career? What's the best career advice you've ever received?
Jonathan Stringfield
Yeah, I mean, I've been lucky to work with, I think, a lot of great leaders. And I think, you know, each kind of had their strengths and weaknesses, and each had various degrees of influence and what have you. But I still think one of my earliest supervisors at Nielsen was probably the most formative in terms of really helping me balance between being a fairly straight-laced, hardcore, quant-focused type person and, you know, still being able to talk with internal stakeholders and make it meaningful or what have you. Her name was Christine Pierce, and I think she's still at Nielsen. She's wonderful. And, like, you know, I think that that's always what she did, right? She was super smart, super quantitatively curious and capable, but also really excellent at communicating things. And I think I learned a lot from her in terms of how to actually not just take a good answer but make it meaningful for other folks.
And then the second piece is, you know, again, I think for any insights person, the work we do is valuable, but because we tend to lack those soft skills, what tends to fall by the wayside is self-advocacy and career progression, things like that. We're not always awesome at that, right? Early on in my career—and I think to a certain degree, I still kind of fall into this habit, even though I'm old enough to definitely know better—I had the belief that, “Oh, well, if I do awesome, great work, well, that speaks for itself.” Right? Like everything else doesn’t matter, right? Because I'm just, like, so good at being able to do this very complex thing, and therefore, that's all that should matter. But no, right? That's not how businesses or the world work, or what have you. So, you know, one of my managers, a gentleman named Tim at Twitter, gave me this level set at one point, where he was like, “Listen, I can't be the only person that advocates for you, right? You have to do that too.” And I was like, “Oh.” And, like, that's one of those things that's super uncomfortable—self-advocacy or what have you. There are bad ways of doing it and better ways of doing it, or what have you. But even just from a career progression standpoint,
You need to be able to support yourself, defend yourself, talk about the value you bring.
Because again, if you're an insights professional—and again, I'm super biased, but I believe you bring disproportionate value to an organization—they might not always know that. And you need to be able to be assertive in knowing, understanding, and communicating what that value is.
Thor
I think that's advice that a lot of people could really benefit from, and that is, you know, often forgotten by many people within the insights community. And looking ahead to the future, what opportunities do you think there are for insights professionals to make even more business impact and to challenge the status quo? How do you think the insights function should evolve?
Jonathan Stringfield
I'm probably really preaching to the choir, but I nonetheless believe it to be true that, you know, insights need to be at the table at the highest levels of an organization, right? And what I've found is that research teams, or insights teams, or data teams, or data science, or what have you, have always kind of been embedded in other orgs, other teams, typically because of either historical precedent or just because organizationally it’s easier, right? Like data science would be in your engineering team, market research typically under marketing, and so on and so forth. But, you know, again, I think that's kind of like a means-to-an-end view of the value that research and insights can bring, rather than making sure that it is something that is, again, taken seriously and used well within an organization, making sure that it's kind of baked into decision-making at the core. So, you know, even in my current role where I was brought in to oversee research but then picked up, you know, the marketing components and creative and what have you, I was kind of in this interesting role where it was kind of a hodgepodge of different teams. And in general, it would be very easy to say, “Well, okay, I could just say I'm VP of marketing, and you know, marketing usually has research and all these other things within it.” And that was very deliberately not what I wanted. I asked that my title be VP of Research and Marketing, with research before marketing, because research comes before good marketing, right? And research is kind of core to my career. So even that, in kind of my own small way, is kind of like my push in that direction.
I think every organization should have a Chief Research Officer or a Chief Insights Officer or something similar to it.
And again, I'm not saying that these disparate insights-led functions, whether it's data science, market research, measurement science, or anything in between, shouldn’t exist within other teams—like, of course, right? And again, we get value by being cross-functional, but there also needs to be a world where insights sit at that level, where, whether it's marketing or engineering or product or what have you, research and insights are considered just as vital to the future of an organization or business, because they are. I think that’s definitely the way things can be lined up if we just look at the problems that modern organizations are facing more directly.
Thor
Jonathan, it hurts me so much, but we've come towards the end of this podcast. And I only have one more question. It is a question I'm very curious to hear your answer to, which is: who in the world of insights would you love to have lunch with?
Jonathan Stringfield
Yeah, I feel like this is one of those questions where, no matter what, someone's going to get snubbed. But, like, you know, honestly, I wouldn't even—I mean, maybe this is a cop-out, and if it is, you can call me on it—but I don't even think it would be a person. I would be more interested, as those opportunities arise, in talking with folks now just entering this arena, right? Like the new entrants, the new hires, the promising talent of tomorrow. Not because I'm arrogant enough to believe that I can give them any profound degree of mentorship or what have you that, you know, they're not going to get otherwise, but, if anything, to kind of challenge their worldview, and just as importantly, challenge mine in terms of, you know, what's going on in this broader world. Because, you know, look, on the other hand, I could sit here and name someone very senior and established in the industry, and I'm sure I'd sit down with this person at lunch, and we'd have a wonderful conversation, but, like, you know, what we do and our perceptions are pretty baked, right? Like, I don't think there's going to be a lot of that more useful back-and-forth and reshaping of thinking. Which is why I would bias not towards what I think is the obvious choice of picking some of these, you know, kind of superstars out there, but more towards, “Bring in someone who's young, who's ready, and who thinks I'm an idiot.” And I'd love to hear where I am an idiot because that's the only way you learn.
Thor
Thank you so much for that. Wow. This has been such an amazing conversation, Jonathan. Your perspective on insights is truly unique, and I think we can all learn from it. I'd like to rewind and return to some of the moments of our conversation that have really stuck with me.
I'll start off with when you reminded us that an insight is really what you do with the information you gain from data—it lies in how you apply it and in your ability to see beyond the superficial. You also reminded us that a good insights team is actually super creative. The biggest value is not just in the insights they provide, but in the creative approaches to how they apply them. And talking about inspiration, you brought our attention to the video game industry. If you look at video gaming, you will find that as long as there have been video screens, there have been video games. And if you think further, this is still true. When you look at new use cases, such as the metaverse, the first use cases are, in fact, in gaming. Gaming is a great window for us to see the future of human interaction with new media.
When we talked about AI and machine learning, you reminded us to be cautious and to spend time researching and understanding whatever technology we plan to apply. Otherwise, you're going to start believing in a black box that you do not understand. And lastly, on the topic of career advice, remember that your boss can't be the only person advocating for you. You need to do that too. You need to be able to support yourself, promote yourself, and highlight the value you bring. I know that I've learned a lot from talking to you today, and I'm sure our audience has as well. Thank you so much for joining me.
Jonathan Stringfield
Thanks for having me. I had a lot of fun.