Friday, October 7, 2016

Smarter Survey Results and Impact: Abandon the Asker-Puker Model!

Today's post comes from a source of deep pain: Analysis Ninjas are valued less than I would prefer them to be.

The post is also sourced from a recent edition of my newsletter, The Marketing – Analytics Intersect. I send it once a week, and it contains my insights and recommendations on those two topics. Like this blog, the newsletter is geared towards being instantly actionable (rather than just theory-smart, which is pretty cool too). Do sign up if you want to deliver a small electric shock of stimulation to your brain each week.

TMAI #41 covered a graph that resulted from a survey done by Econsultancy and Lynchpin. I received a ton of responses to it, and great discussion ensued. It prompted me to write this post, essentially an expanded version of TMAI #41. I've added new insights, recommendations, two bonus lessons on how to do surveys better, and a direct challenge to your company's current analytics strategy.

If your heart is weak, you can stop reading now. I promise, I won't mind one bit. I heart you. If you are open to being challenged… then read on.

Let's go and challenge our collective thinking!

The World Needs Reporting Squirrels. Wait. What!

Some of you know that I created the phrases Reporting Squirrels and Analysis Ninjas to emphasize the difference between those who puke data and those who deliver insights with actions attached to them.

Here is my slide from the first time I presented the concept in a keynote…

[Image: Reporting Squirrels vs. Analysis Ninjas slide]

Cute, right? :)

While companies, medium and large, often need both roles, I've massively pushed for every company to want more Analysis Ninjas, and for analysts to have careers where they can rapidly undertake the metamorphosis from Reporting Squirrel to Analysis Ninja (after all, the difference in salary is tens of thousands of dollars).

If you are curious, here is an April 2011 post: The Difference Between Web Reporting And Web Analysis.

With that as context, you can imagine how heartbroken I was when Jane shared the following visual from a study done by Econsultancy and Lynchpin. It contains the answers to the question of which analytics skills are most in demand…

[Image: Econsultancy/Lynchpin chart of most in-demand analytics skills]

Check out the y-axis… what do you see as the common pattern across them all?

Just data puking.

One row after another of data puking skills.

Ranked.

Almost nothing that quite captures the value of Analysis Ninjas! N. O. T. H. I. N. G.

I did a random Google search and got this list of analytical skills:

+ Understanding relationships between numbers

+ Interpreting mathematical information

+ Visual perception of information

+ Ability to organize information

+ Pattern recognition and understanding trends

+ Argumentation and logical thinking

+ Ability to create insightful data visualizations

+ Hypothesis development and design of experimentation

+ Strategic thinking skills

And, that is just a random list!

None of these are in demand?

Look at the list in the graph: what kind of purple squirrel with ant legs and an elephant's nose, one that nobody needs, is Lynchpin describing?

Even a random search surfaces real analytical skills; that did not happen at Econsultancy. But the data did cause introspection at my end.

And my first question was the one that is also top of mind for all readers of Occam's Razor… Is the world so dark that the only "analytical" skills that are valued are directly tied to data puking, and should you immediately shut down your Analysis Ninja efforts?

Let me share three thoughts for your consideration, then some guidance on how to do surveys right, and end with a call to arms for all of you and the "data people" you choose to work with.

Three thoughts that explain the Econsultancy/Lynchpin graph.

1. The survey design is at fault.

The otherwise well-respected Econsultancy and Lynchpin dropped the ball massively in creating the list of answers for the respondents to choose from.

I have to admit, I believe this is a major flaw (and not just for this question, but across the entire report). What is disappointing is that they have done this for nine years in a row!

It poses these questions…

How is it that in nine years no one at these organizations realized they were simply asking people to rank data puking answers? Did the survey list the skills Econsultancy and Lynchpin hire for and value in their own analysts?

The graph illustrates data for three years… Why did the fact that almost nothing changed in three years in terms of priority not trigger a rethink of the options provided for this question? Anyone reading the report at the two companies creating it should have thrown up a red flag and said, "Hey, the respondents keep rating the answers the same; maybe we are not asking the right question or providing the best choices for our respondents to pick."

More on how to avoid this flaw in your surveys, of any kind, below.

2. The survey is targeted to the wrong folks.

They might be the wrong folks to accurately judge which analytical skills matter, and to appreciate the value of each skill as they rank them. That could explain the results (though not the answer choices).

Econsultancy/Lynchpin provides this description in the report: "There were 960 respondents to our research request, which took the form of a global online survey fielded in May and June 2016. Respondents included both in-house digital professionals and analysts (56%) and supply-side respondents, including agencies, consultants and vendors (44%)."

Seventy-six percent of respondents were from the UK and EU. Respondents were solicited from each company's database as well as social media.

Here is the distribution provided in the report:

[Image: Econsultancy/Lynchpin survey audience distribution]

On paper, the departments look to be what you would expect. It is difficult to ascribe any blame to the folks who got the survey. There is a chance that there is a UK and EU nuance here, but I don't think so.

3. It is our fault.

My first instinct in these cases is to look into the mirror.

Perhaps we have not succeeded as much as we should when it comes to showcasing the value of true data analysis. Perhaps all the people involved in all digital analytics jobs/initiatives, inside and outside companies, are primarily data pukers, and none of them have the skills to teach companies that there is such a thing as data analysis that is better.

Then you and I, and especially our friends in the UK and EU, need to work harder to prove to companies that CDPs (customized data pukes, my name for reporting) do not add much value; the rain of data does not drive much action. You and I need to truly move to the IABI model, where we send very little data, and what little we send out is sent with copious amounts of Insights from the data, the Actions leaders need to take, and the computation of the Business Impact.

The more we deliver IABI, by using our copious analytical skills, the more leaders will start to recognize what real analytical skills are and be able to distinguish between Reporting Squirrels and Analysis Ninjas.

Bottom-line… I would like to blame the competency at Econsultancy and Lynchpin, especially because I truly believe that is at fault, but I must also take some responsibility on behalf of the Analysis Ninjas of the world. Perhaps we suck more than we would like to admit. I mean that sincerely.

Bonus #1: Lessons from Econsultancy/Lynchpin Survey Strategy.

There is a small clump of lessons from my practice in collecting qualitative feedback that came to the fore in thinking about this particular survey. Let me share those with you; they cover challenges that surely the E+L team faced as they put this initiative together.

If your survey has questions that cease to be relevant, should you ask them again for the sake of consistency, as you have done this survey for nine years?

There is a huge amount of pressure for repeated surveys to keep the questions the same because Survey Data Providers love to show time trends – month over month, year over year. It might seem silly that you would keep asking a question when you know it is not relevant, but there is pressure.

This is even worse when it comes to answer choices. Survey Creators love stability and being able to show how things have changed, so they keep irrelevant/awful/dead answers around.

If you are in this position… You will be brave, you will be a warrior, you will be the lone against-the-tide swimmer, and you will slay non-value-added stuff ruthlessly. You will burn, for from the ashes shall rise glory.

If you are the Big Boss of such an initiative, here is a simple incentive to create, especially for digital-anything surveys: give your team a standard goal that, for any survey repeated each year, 30% of the questions have to be eliminated and 10% new ones added.

This goal will 1. force your employees to think hard about what to keep and what to kill (imagine that, thinking!), 2. create a great and fun culture in your analytical (or reporting :( ) team, and 3. push them to stay on top of the latest and greatest and include that in the survey.
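To make that arithmetic concrete, here is a minimal sketch in Python (the question bank and the function are mine, purely illustrative, not from E+L) of what the 30%/10% churn goal means for a question bank:

```python
import math

def churn_targets(question_bank, kill_rate=0.30, add_rate=0.10):
    """Given this year's question bank, return how many questions
    must be eliminated and how many brand-new ones must be added
    before the survey is repeated."""
    n = len(question_bank)
    to_kill = math.ceil(n * kill_rate)  # questions to slay
    to_add = math.ceil(n * add_rate)    # fresh questions to write
    return to_kill, to_add

# Hypothetical example: a 40-question annual survey.
questions = [f"Q{i}" for i in range(1, 41)]
kill, add = churn_targets(questions)
print(f"Eliminate {kill} of {len(questions)} questions, add {add} new ones.")
```

For a 40-question survey, that is 12 questions slain and 4 fresh ones every year. Enough to keep the thinking muscle alive.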

If I feel I have a collection of terrible choices, do you have a strategy for how I can identify that?

This does not work for all questions of course, but here is one of my favourite strategies in cases where the questions relate to organizations, people skills, and other such elements.

Take this as an example…

[Image: skills gap survey question]

How do you know that this is a profoundly sub-optimal collection of choices to provide?

For anyone with even the remotest amount of relevant experience or subject matter expertise, it is easy to see these are crazy choices – essentially implying purple squirrels exist. But how would you know?

Simple.

Start writing down how many different roles are represented in the list.

That is just what I did…

[Image: skills gap question, with roles noted against each choice]

It turns out there are at least five roles in a normal company that would possess these skills.

So. Is this a good collection of skills to list? Without that relevant context? If you still go ahead and ask this question, what are you conditioning your audience to look for/understand?

Oh, and I am still not over the fact that, in looking for what analysis skills are missing in the company, no actual analytical skills are listed above! Ok, maybe statistical modeling smells like an analytical skill. But that's as close as it gets.

I share this simple strategy, identifying the number of different roles the list represents, to help illuminate that you might have a sub-optimal collection of choices.
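If you want to operationalize the roles test, here is a minimal sketch (the answer choices and the skill-to-role mapping are my own illustrative assumptions, not E+L's actual list): tag each answer choice with the role that would realistically own that skill, then count the distinct roles.

```python
# Hypothetical mapping of survey answer choices to the company role
# that would realistically own each skill. Both the choices and the
# role tags are my assumptions, for illustration only.
choices_to_role = {
    "Statistical modelling": "Data Scientist",
    "Tag management": "Analytics Engineer",
    "Data visualisation": "Analyst",
    "Campaign reporting": "Marketing Ops",
    "Data warehousing": "Data Engineer",
}

distinct_roles = set(choices_to_role.values())
print(f"{len(distinct_roles)} distinct roles hiding in one question.")

# If one answer list spans several roles, no single respondent can
# rank it meaningfully -- a red flag that the choices need a rethink.
if len(distinct_roles) >= 3:
    print("Red flag: this question describes a purple squirrel.")
```

The moment the count crosses two or three roles, you know no single respondent can rank the list meaningfully.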

There are many other strategies like this one for other questions. Look for them!

If your survey respondents are not the ideal audience for a question, what's your job when crafting the survey?

J. K. I.

Just kill it.

If you don't want to kill it… Personally interview a random sample of 50 people (for a 1,000-person survey). Take 10 mins each. Ask primitive, basic questions about their job, their actual real work (not job title), and their approximate knowledge. If these 50 pass the sniff test, send the survey. Else, know that your survey stinks. JKI.

I know that I am putting an onerous burden on the survey company; talking to 50 people even for 10 mins comes at a cost. It does. I am empathetic to it. Consider it the cost of not putting smelly stuff out into the world.
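For what it is worth, the sampling itself is the cheap part. A minimal sketch, assuming a hypothetical list of respondent emails (none of this reflects the actual E+L methodology):

```python
import random

# Hypothetical respondent pool for a 1,000-person survey.
respondents = [f"person{i}@example.com" for i in range(1, 1001)]

# Draw 50 people, without replacement, for the 10-minute
# sniff-test interviews before the full survey goes out.
rng = random.Random(42)  # fixed seed so the sample is reproducible
interview_sample = rng.sample(respondents, k=50)

print(interview_sample[:5])
```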

If your survey respondents won't be able to answer a question perfectly, what is a great strategy for crafting questions?

Oh, oh, oh, I love this problem.

It happens all the time. You as the survey creator don't know what you are talking about, the audience does not quite know what they are talking about, but there is something you both want to know/say.

Here's the solution: Don't do drop-down answers or radio-button answers!

The first couple of times you do this, ask open-ended questions. What analytical skills do you think you need in your company? Let them type out in their own words what they want.

Then find a relatively smart person with subject matter expertise, give them a good salary and a case of Red Bull, and ask them to categorize.

It will be eye opening.

The results will improve your understanding, you'll have a stronger assessment of what you are playing with, and the audience will not feel boxed in by your choices; instead they will tell you how they see the answers. (Maybe, just maybe, they'll give you my list of analytical skills above!)

Then run the survey for a couple years with the choices from above. In year four, go back to the open text strategy. Get new ideas. Get smarter. Rinse. Repeat.
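Here is a minimal sketch of that categorization pass (the answers, categories, and keywords are all my illustrative assumptions; in practice your Red Bull-fueled expert would draft the buckets from actually reading the responses):

```python
from collections import Counter

# Hypothetical open-text answers to "What analytical skills do you
# think you need in your company?" -- invented for illustration.
answers = [
    "Someone who can find the story in the numbers",
    "Hypothesis testing and running experiments",
    "Building dashboards people actually read",
    "Spotting trends before the quarter ends",
]

# Keyword buckets a subject matter expert might draft after a first
# read of the raw responses (my assumptions, not real categories).
categories = {
    "Insight finding": ["story", "why", "insight"],
    "Experimentation": ["hypothesis", "experiment", "test"],
    "Visualization": ["dashboard", "chart", "visual"],
    "Pattern recognition": ["trend", "pattern"],
}

counts = Counter()
for answer in answers:
    text = answer.lower()
    for category, keywords in categories.items():
        if any(kw in text for kw in keywords):
            counts[category] += 1

for category, count in counts.most_common():
    print(f"{category}: {count}")
```

Crude keyword buckets like these are only a starting point; the human read is what surfaces the categories you never thought to offer.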

I would like to think I know all the answers in the world. Hubris. I use the strategy above to become knowledgeable about the facts on the ground, and then use those facts (on occasion complemented by one or two of my choices) to run the survey. This rule is great for all kinds of surveys: always start with open-text. It is harder. But that is what being a brave warrior of knowledge is all about!

If your survey results cause your senior executives, or random folks on the web, to question them, what is the best response?

The instinct to close in and be defensive, to even counter-attack, is strong.

As I'm sure your mom taught you: Resist. Truly listen. Understand the higher order bit. Evolve. Then let your smarter walk do the talking.

Simple. Awfully hard to do. Still. Simple.

Bonus #2: The Askers-Pukers Business Model.

The biggest thing a report like Econsultancy/Lynchpin's suffers from is that this group of individuals, perhaps even both these companies, sees their role in this initiative as Askers-Pukers.

It is defined as: Let us go ask 960 people we can find amongst our customers and on social media a series of questions, convert that into tables and graphs, and sell it to the world.

Ask questions. Puke data. That is all there is in the report. Download the sample report if you don't have a paid Econsultancy subscription. If you don't want to use your email address, use this wonderful service: http://ift.tt/1E6xzmk

Even if you set aside the surveying methodology, the question framing, the answer choices, and all else, there is negative value in anything you get from Askers-Pukers, because the totality of the interpretation of the data is either writing in text what the graphs/tables already show, or extremely generic text.

Negative value also because you are giving money for a report that is value-deficient, and you are investing time in reading it to try and figure out something valuable. You lose twice.

Instead, one would hope that Econsultancy, Lynchpin, the team you interact with from Google, your internal analytics team, any human you interact with who has data, sees their role as IABI providers (Insights – Actions – Business Impact).

This is the process IABI providers follow: Ask questions. Analyze the data for why the trends exist (Insights). Identify what a company can/should do based on the why (Actions). Then have the courage, and the analytical chops, to predict the impact on the company's business if they do what was recommended (Business Impact).

Insights. Actions. Business Impact.
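If you want to hold your data providers, or yourself, to that contract, here is a minimal sketch of what one row of an IABI deliverable might look like (the structure and the example values are my illustration, not a formal spec from anyone):

```python
from dataclasses import dataclass

@dataclass
class IABIFinding:
    """One row of an IABI deliverable: no data point leaves the
    building without the why, the what, and the how much attached."""
    data_point: str       # the observation (all a CDP ever contains)
    insight: str          # why the trend in the data exists
    action: str           # what the company can/should do about it
    business_impact: str  # predicted impact if the action is taken

# Entirely made-up example, for illustration only.
finding = IABIFinding(
    data_point="Cart abandonment rose 12% quarter over quarter",
    insight="Shipping cost is revealed too late in the checkout flow",
    action="Show estimated shipping cost on the product page",
    business_impact="Recover an estimated $40k/month in lost orders",
)

print(finding.action)
```

Notice what the structure forces: the data point is the smallest, least interesting field. Everything else is the analysis.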

Perhaps the fatal flaw in my analysis above, my hope above, is that I expected Econsultancy and Lynchpin to be really good at business strategy, industry knowledge, and on-the-ground understanding of patterns across their massive collection of clients, and hence to know what actually works. I expected them to be Analysts. Instead, they perhaps limit their skills inside their respective companies to being Askers-Pukers.

Both companies are doing extremely well financially, hence I do appreciate that the Askers-Pukers model does work.

But for you, and for me, and for anyone else you are paying a single cent to when it comes to data – be it data reported from a survey, data reported from your digital analytics tool, or data reported from other companies you work with like Facebook or Google or GE – demand IABI. Why. What. How Much. If they don't have that, you are talking to the wrong people. Press the escape button, don't press the submit order button.

[Isn't it ironic? Econsultancy and Lynchpin did exactly what their survey has shown for nine years is not working for companies in the UK: Reporting. The outcome for both of them is exactly the same as the outcome for those companies: Nothing valuable. This is explicitly demonstrated by their full report.]

Bottom-line.

I hope you see that this one survey is not the point. E+L are not the point. What their work in this specific example (and you should check other examples if you pay either company money) illuminates is a common problem that is stifling our efforts in the analytics business.

This applies to E+L, but it applies even more to your internal analytics team, it applies intensely to the consultants you hire, and it applies to anyone you are giving a single cent to when it comes to data.

Don't hire Askers-Pukers. Don't repeat things for years without constantly stress-testing for reality. Don't make compromises when you do surveys or mine Adobe for data. Don't create pretty charts without seeing, really looking with your eyes, what is on the chart and thinking about what it really represents.

Applied to your own job inside any company, using Google Analytics or Adobe or iPerceptions or Compete or any other tool… don't be an Asker-Puker yourself. Be an IABI provider. That is where the money is. That is where the love is. That is where the glory is.

Carpe Diem!

As always, it is your turn now.

Is your company hiring Reporting Squirrels or Analysis Ninjas? Why? Is the work you are doing at your company/agency/consulting entity/survey data provider truly Analysis Ninja work? If not, why does it remain an Asker-Puker role? Are there skills you've developed in your career to shift to being the person whose business is why, what, how much? Lastly, when you do surveys, of the type above or others, are there favourite strategies you deploy to get a stronger signal rather than just strong noise?

Please share your life lessons from the front lines, critique, praise, fun-facts and valuable guidance for me and other readers via comments.

Thank you. Merci. Arigato.

PS: I hope this post illuminates the valuable content The Marketing – Analytics Intersect shares each week. Sign up here.
