One of the great convergences between a traditionally qualitative discipline like marketing and quantitative disciplines like manufacturing or production has been the tsunami of data now at our fingertips--data we can use to drive improvement with quantitative insight.
Much like the manufacturing Kaizen mentality and its process of "continuous improvement," marketers now have the capability--and the obligation--to use insights and data to test and iterate. Whether it's a simple A/B test or serving different messages or web pages to different audiences, we're able to adjust in real time and find the right swim lane for our program to succeed.
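To make that concrete, here is a minimal Python sketch of one common way to evaluate a simple A/B test: a two-proportion z-test comparing conversion rates. The visitor and conversion counts are purely hypothetical numbers for illustration.

```python
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    meaningfully different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the assumption of no real difference
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return p_a, p_b, z

# Hypothetical campaign: 10,000 visitors per variant
p_a, p_b, z = ab_test_z(conv_a=420, n_a=10_000, conv_b=480, n_b=10_000)
print(f"A: {p_a:.2%}  B: {p_b:.2%}  z = {z:.2f}")
# |z| > 1.96 is roughly a 95% confidence call; otherwise keep collecting data
```

The point isn't the statistics so much as the discipline: declare a winner only when the data actually supports one.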
Increasingly, our tools are making these decisions for us--it's now routine for automation systems to run these tests, choose a winner based on audience response, and then optimize the rest of your spend and campaign accordingly. But even as machines do more of the analysis, we can't lose sight of our need to learn from what we did. Why did something work or not? What did we hear from our customers--what is their feedback, and what did they mention on the phone to the sales team?
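For the curious, here is a rough sketch of what that automated winner-picking can look like under the hood: an epsilon-greedy approach that steers most traffic toward whichever message is converting better while still exploring the alternative. The variant names and response rates below are assumptions for a simulation, not real campaign data.

```python
import random

def choose_variant(stats, epsilon=0.1):
    """Epsilon-greedy allocation: mostly exploit the current best
    performer, but keep exploring the other variants."""
    if random.random() < epsilon:
        return random.choice(list(stats))  # explore
    # Exploit: highest observed conversion rate so far
    return max(stats, key=lambda v: stats[v]["conversions"] / max(stats[v]["visitors"], 1))

def record(stats, variant, converted):
    stats[variant]["visitors"] += 1
    stats[variant]["conversions"] += int(converted)

# Hypothetical campaign with two messages
stats = {"message_a": {"visitors": 0, "conversions": 0},
         "message_b": {"visitors": 0, "conversions": 0}}

for _ in range(5_000):  # simulated visitors
    v = choose_variant(stats)
    # Assumed "true" response rates, purely for the simulation: 4% vs. 6%
    record(stats, v, random.random() < (0.04 if v == "message_a" else 0.06))

print(stats)  # traffic drifts toward message_b over time
```

The machine can reallocate spend like this automatically; what it can't do is tell you why message_b resonated.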
It's that learning part we often blow right past on the way to iterating. But if we skip it, we often miss the nugget of insight and feedback that will help us fine-tune our programs just that little bit. And often it's that "little bit" that makes the difference between a home run and a base hit. We can't always hit a home run, of course--but by taking the time to learn what worked and what didn't before we commit to the "winning path," we increase our chances and earn more at-bats for our team.