

How to Win Big with Actionable Analytics

A Turtle with Strategic Soft Skills Will Beat a Tactical Data Scientist Rabbit Every Time

by Eric King, President, The Modeling Agency

Sure, It’s Great to Run Fast

I recently read a great article on Software Advice’s Plotting Success blog, authored by Victoria Garment, entitled “3 Ways to Test the Accuracy of Your Predictive Models”.  In the post, Garment features the expertise of three legends in the predictive analytics industry: Dr. John Elder, Dr. Karl Rexer and Dean Abbott.  I know each of them directly, and they possess a vast amount of practical experience.  They each shared highly valuable tips on how to test and validate the accuracy of your predictive models so that they hold up well on deployment.  I highly recommend the article and its advice to modeling practitioners at any level as a way to avoid a myriad of tactical pitfalls that cause models to fall short of their potential or give misleading results.
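Garment’s article covers the specifics far better than I can here, but as a rough illustration of the discipline it calls for, below is a minimal sketch – assuming scikit-learn, with a function name of my own choosing – of tuning a model on a working split and touching the held-out test set exactly once, so the test data never gets “memorized”:

# A minimal sketch (not from Garment's article), assuming scikit-learn.
# The point: tune with cross-validation on the working data only, and
# evaluate the held-out test set exactly once, at the very end.
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.ensemble import RandomForestClassifier

def fit_and_score(X, y):
    # Hold out a final test set that plays no part in model selection.
    X_work, X_test, y_work, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42, stratify=y)

    # Hyperparameter search is confined to the working data.
    search = GridSearchCV(
        RandomForestClassifier(random_state=42),
        param_grid={"max_depth": [3, 5, None]},
        cv=5)
    search.fit(X_work, y_work)

    # One look at the test set: an honest estimate of deployed performance.
    return search.best_estimator_, search.score(X_test, y_test)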

In this post, I wish to offer up a strategic consideration to complement Garment’s post and ensure that the great efforts to arrive at high-performance validated models do not disintegrate at the project level – which is where the real organizational impact occurs.  In our practice, we see far too often that analytic practitioners strive to build superior models – only to find that they’re not well-aligned with organizational objectives, not well adapted to the overall environment, not ultimately adopted, or not understood by leadership.

But First, Know Where the Finish Line Is

It almost sounds trite, but in my company’s experience it is incredibly rare for analytics teams in large organizations to take the time at the start of an analytic project to conduct a comprehensive assessment and develop a targeted project plan from it.  They start with data and software, which is like starting a novel at chapter 9.

If leadership’s motivation for the analytics is not known; if users are not on board; if an analytic sandbox wasn’t prepared and IT won’t tolerate repeated data pulls; if significant retrofitting is required after deployment – then even a model with a validated low error rate just won’t survive.  It’s akin to setting a fast rabbit 45 degrees off course.  Even a turtle with a mediocre model will outperform the rabbit if that model communicates moderate lift in terms that leadership can apply.
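To make “lift” concrete for a non-technical audience, here is a minimal sketch of one common way to report it (the function below is hypothetical, not from any particular courseware): compare the response rate among the model’s top-scoring decile against the overall baseline response rate.

import numpy as np

def top_decile_lift(scores, outcomes):
    # Lift of the model's top 10% of scores over the overall response rate.
    scores = np.asarray(scores, dtype=float)
    outcomes = np.asarray(outcomes, dtype=float)   # 1 = responded, 0 = did not
    cutoff = np.percentile(scores, 90)             # score threshold for the top decile
    top_rate = outcomes[scores >= cutoff].mean()   # response rate among top scorers
    baseline = outcomes.mean()                     # overall response rate
    return top_rate / baseline                     # e.g. 2.4 = 2.4x better than random targeting

A statement such as “contacting the model’s top decile yields 2.4 times the responses of contacting customers at random” is exactly the kind of moderate-but-applicable lift that leadership can act on.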

Take the Time to Sharpen Your Axe

The current environment of fast pace and immediate demands is actually costing organizations far more than they realize.  Abraham Lincoln is credited with saying, “Give me six hours to chop down a tree and I will spend the first four sharpening the axe.”  If analytic teams followed the guidance of public-domain modeling process guides such as CRISP-DM, which begin with the planning phases of business understanding and data understanding, they would arrive at analytics that are truly measurable, actionable and sustainable.

Instead, most organizations continue to build individual custom cars.  If they took a little extra time to design an analytic factory that churns out well-fitted models nearly at the speed of business demand, imagine how agile and effective that would be.

The Best Solution Combines Sound Tactics and Strategy

In my experience with analytics, a turtle running on a prioritized opportunity and sound project plan will win the race over a technically accurate model that’s misapplied.  However, with modern software and strategic training, there’s no reason not to have both.

The Modeling Agency’s vendor-neutral courseware establishes a purposeful blend of tactical and strategic predictive analytics training.  In fact, the courseware places slightly more emphasis on project planning and strategic implementation than on tactical model development.  At the same time, the tactical and strategic orientations may be taken independently or as a series, depending on experience, role and objectives.


Taken as a full series, these courses will ensure that you build accurate, validated models and avoid common tactical shortcomings such as test-data memorization and the other effects that the Elder / Rexer / Abbott article so skillfully presented.  At the same time, there is no other courseware on the market that insists upon making predictive analytics actionable at the project level with a purely strategic orientation.

To really stand out with the soft skills to manage a modeling effort at the project level and arrive at measurable results that catch the attention of leadership, first get trained on the strategic practice of analytics, then start with a thorough assessment and project definition.  Your organization will then move beyond the common dysfunction in analytics toward tangible and sustainable performance, and you will have a powerful case summary for your professional profile.

About the Author

Eric A. King is the president and founder of The Modeling Agency, LLC – an advanced analytics training and consulting company providing strategic guidance and impactful results for the data-rich yet information-poor.  Eric is a co-presenter in a popular monthly live, interactive analytics webinar entitled “Data Mining: Failure to Launch”.  Eric may be reached at (281) 667-4200 x210 or eric@the-modeling-agency.com.