Aptitude, Application, Simulation, Or Concordance — It Is Time To End The Hypocrisy

Make any assertion to any useful analyst and they are likely to ask to see the test results. Except one.

Every day, analysts accept the assertion that other analysts know what they are talking about, test results unseen. I am not saying that assertion goes completely unchallenged. I am saying no one asks for test results. Why the hypocrisy?

Analysts test everything. While we do not always test with the discipline we should, we are all wired to test. At public speaking events, I am often asked about testing. So many of us are thinking about it. Why don’t we do it?

I offer two hypotheses. And yes, I am aggressively testing both and several others for that matter…

One, analysts as a group lack confidence. Way too many of you fear that others know more than you do. You are afraid to call others out directly. Asking for test results is a crutch. It lets you challenge people indirectly. And that is healthy and wise… unless you really need to challenge people.

Two, there is no test data! Way too many of you fear that others know more than you do, and it has to come down to fear, because there is no testing available to actually know. There is no real data to compare or debate.

These two hypotheses, if true, create a chicken-and-egg scenario. How do we know whom we can trust to tell us whom we can trust? At least one group is offering an alternative:

http://www.chicagotribune.com/business/ct-biz-data-science-standard-test-20180625-story.html

If you will allow me to summarize, this article introduces the idea of a panel or consortium of various businesses and groups in Chicago coming together to apply a test. Simple enough, right? If we don’t know whom to trust, get a big enough group and trust them! Wait… why?

Forgive me, but explaining the hundred-plus ways this is all wrong would take too long. If you can pass a test on cognitive bias, you will be able to figure it out yourself. Instead, let me finish this article by offering ideas on what an alternative might look like.

Who Is Best Positioned To Develop A Disciplined Testing Approach?

For starters, rather than crowd-sourced, panel-based, or group-sourced, it should be market-based. Sorry, non-profits and consortiums: we need something with strong and meaningful feedback loops. It is the only way to know if this is actually working!

The best way to develop a good test is to put it into the market. Testing that actually allows the adopter to successfully hire and develop good teams will eventually win market share. It may not be exceptionally efficient, but it is the only clear path to defensible outcomes.

Next, don’t look to the “masters of the universe”. Many of the powerhouse analytic shops have internal testing. Big tech, big banks, and big pharma, to name a few, leverage testing. It is a competitive advantage. So… they don’t share. The day they opt to share will be the day after it stops working. So look to start-ups, or at least a smaller player.

How Would It Work?

Mediums and styles are debatable. Disciplined testing should, however, leverage some of our best analytics. This would include artificial intelligence, natural language processing, and strong experimental design.

Testing should not only focus on aptitude, but also application. Clinical tests employ concepts like script concordance testing, while other disciplined fields make strong use of simulations. Analytics needs its own version of these. Testing must reach beyond terms to concepts and principles. It must reach beyond simple application to efficient execution and innovation. It must identify those capable of teaching, evangelizing, and elevating this science.
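For readers unfamiliar with script concordance testing, the core idea is aggregate scoring: a panel of experts answers each judgment-under-uncertainty item, and a candidate earns credit in proportion to how many panelists chose the same answer. A minimal sketch of that scoring rule, with a purely hypothetical panel:

```python
from collections import Counter

def score_item(candidate_answer, panel_answers):
    """Aggregate scoring from script concordance testing: credit is
    the number of panelists who chose the candidate's answer,
    normalized by the count of the most popular (modal) answer."""
    counts = Counter(panel_answers)
    modal_count = max(counts.values())
    return counts.get(candidate_answer, 0) / modal_count

# Hypothetical panel of 10 experts rating an item on a -2..+2 scale.
panel = [1, 1, 1, 0, 1, 2, 1, 0, 1, -1]  # modal answer is +1 (6 votes)

print(score_item(1, panel))   # matches the modal answer: full credit (1.0)
print(score_item(0, panel))   # a minority answer: partial credit (2/6)
print(score_item(-2, panel))  # no panelist chose this: 0.0
```

The appeal for analytics assessment is the same as in medicine: there is no single "correct" answer to an ambiguous judgment call, so the test measures concordance with expert reasoning rather than recall of a key.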

Beyond Hiring

Hiring is certainly the easiest way to garner market-based feedback, but testing and assessment should power a range of activities. Training, planning, and other aspects of professional development would benefit greatly from insightful and effective testing.

At best, existing training programs offer post-course certification tests. But what is the feedback loop? These folks are selling training. Do you really think they are interested in calling out their students’ failures and shortfalls? Is there any penalty for providing certifications to pretty much anyone who paid for the training? Is it any wonder few hiring managers even read your certifications section?

Full Disclosure

Corsair’s Ventures, the parent company of Corsair’s Publishing, is developing just such an analytic assessment tool. We believe, of course, that we have an ideal framework and format for providing the industry’s first commercially viable tool. We expect that our success will be measured by the success of our clients and our ability to identify analytic aptitude, talent, and ability. Time will tell. We expect competition, but we also expect to succeed. Either way, this solution is long overdue.

Learn more here:

http://corsairs.ventures

http://register.corsairs.network/?sc=art
