Analysis is about understanding

The buzz about Target's data mining team may finally be dying down. Of all the commentaries I've read about it, Dean Abbott's blog posts were by far the best. He pointed out the hypothesis-testing, or "forensic," aspects of the data mining process. One point tends to get overlooked, probably because it's so obvious: definitions. In other words, what is a pregnant woman?

At Target, a pregnant woman was someone who signed up for their baby-shower registry. Presumably, only pregnant women signed up for the baby-shower registry. But once they defined what a pregnant woman was, they were able to start data mining.
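
In code, a definition like that is nothing more than a filter. Here is a minimal sketch in SAS, with hypothetical dataset and variable names (registry_signups, registry_type), since Target's actual implementation is obviously not public:

```sas
/* A minimal sketch of definition-as-code: keep only customers who */
/* signed up for the baby-shower registry. Dataset and variable    */
/* names here are hypothetical, not Target's actual schema.        */
data pregnant_customers;
    set registry_signups;
    where registry_type = 'BABY_SHOWER';
run;
```

Notice that nothing in this definition mentions age, a gap that will matter in a moment.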

But this is too easy. So let’s take another practical example. What is a household? Analysts, programmers, and statisticians talk about householded data all the time, like it’s a given.  But how do we define a household?

Is a household every person living at the same postal address? If so, does that mean two or more college students sharing the same postal address are a household? Should we impose an arbitrary limit and exclude postal addresses with 5 or more individuals with different last names? Does that mean a household is any postal address with fewer than 5 such individuals? And what does that make the postal addresses with 5 or more? Frat houses?

Is everyone at the same postal address with the same last name a household? This is a common definition used by a variety of third-party data vendors and market research firms. But what if a wife doesn’t take her husband’s last name? Are they a household?
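
To see how much work that definition is doing, here's a minimal sketch of the "same postal address, same last name" rule in PROC SQL. Table and column names (customer_master, postal_address, last_name) are hypothetical:

```sas
/* One hypothetical householding rule: group customers by postal   */
/* address and last name. Under this definition, a couple with     */
/* different last names becomes two "households."                  */
proc sql;
    create table households as
    select postal_address,
           last_name,
           count(*) as member_count
    from customer_master
    group by postal_address, last_name;
quit;
```

The wife who kept her own last name falls right out of that GROUP BY.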

What about same sex couples living in states where same-sex marriage is NOT legal? Are they a household? 

What about a young married couple who move back in with one spouse's parents for economic reasons? These multi-generational or "accordion" households may be increasingly common, but do we define them as one household or two?

Does context matter? Should the definition of a household for a retailer be the same as the definition for a bank? 

Definitions are vital both for communication and for the analysis process. We have to know what we're looking for before we can begin mining the data. And we have to remain flexible as we mine the data, reconsidering our definitions as we go.

Getting back to the example of Target, I was struck by the fact that a MINOR was in their database. Perhaps Target has a data quality issue with one of their vendors and assumed, incorrectly, that the customer database consisted only of adults. But it could have also been a definition issue.

Did someone forget to define "those who signed up for the baby-shower registry" as ADULT women?

How to get ahead in analytics

Advice on getting a “job” in analytics is a dime a dozen right now, as businesses insist there is a talent shortage, while “talent” insists there is a “job” shortage. I’ve said it before and I’ll say it again – there is a problem in matching the appropriate analytical professionals to the appropriate roles. But there are some things you can do to get at least a potential advantage or two.

1. Ditch the accent.
Accents say a lot about who you are and where you're from. When I lived in North Carolina, there was a clear distinction between the proper Southern accent and the rural, lower-class accent. If you want to hear the difference, listen to a few Southern executives, and then listen to the salaried and hourly individual contributors. Like hires like, putting someone with the rural accent at a distinct disadvantage.

There is a second reason for ditching the accent. No one is going to hire you if they don’t understand what you are saying. There are very talented people in analytics who moved here from overseas, but it’s very difficult to understand them. We’re simply not used to the accents, and some accents are so thick that we don’t get used to them over time. Seriously. I worked with someone for years and always had to concentrate on his every word. A complete stranger might not exert the effort to hire an otherwise stellar candidate.

2. Learn to write
Some people on LinkedIn make me want to scream. They can't or won't use proper grammar, punctuation, or even the proper case. Maybe you're on a mobile device. Maybe English is a second language. Maybe you didn't realize that the slang in your country is completely incomprehensible to someone in another country. Fine. But don't complain that no one will hire you if you present yourself as functionally illiterate. If you can't write on LinkedIn, how in the world will you ever communicate a complex analysis or model to a non-technical audience?

3. LinkedIn
Like it or not, you will be judged by your behavior online, especially LinkedIn. Be yourself, but be professional. Make sure you have genuine recommendations from credible people. I remember one particular analyst who would try to get additional recommendations for himself by recommending others. That’s not a bad idea normally, but only if you mean it. He was copying and pasting the exact same recommendation time and again. Apparently he didn’t realize that his copying-and-pasting was visible on his own profile.

4. Be nice
We all have different experiences, and some of us are better at SQL, SAS, R, etc. If someone has a question, be nice: try to answer it or ask for additional information. Be helpful. Yes, they aren't as good as you are at something; we can't all be good at everything. But a request for help is not an opportunity to prove you're a rocket scientist by throwing the other person under the bus. No one wants that on their team.

5. Avoid job boards
Here is a blog from a recruiter in Boston about his recent job board experience:
http://www.yourversion.com/index.php?p=viewpage&url_id=13397496
Read that blog carefully. Hiring is done through online and offline connections, including LinkedIn. If you're looking on job boards or exclusively through recruiters, then you're in trouble. I've met the same unprofessional recruiters discussed in that blog, and I swore I would never work with a recruiter again (save for one or two trusted recruiters I have good working relationships with).

Remember that being an analyst is a career, a profession, a calling. Professionals have good working relationships with others – including those they have never met. People move in and out of jobs as quickly as they can punch a time card. Do you want a career or do you want a job?

6. It’s not all about you
You aren’t the only analyst on the planet. Plenty of other people will apply for the same role. How those other people behave impacts how you’re perceived and whether you get hired. It never fails that when I start a new role, the hiring manager will mention what I brought to the table that the others did NOT. It’s not all about you – it’s also about the other people who applied for the role and what they did wrong.

7. Be positive
No one likes a negative Nellie. If everyone is out to get you, the world is unfair, and no one listens to you, then guess what happens? No one listens to you. They won’t return your calls. And after voice mail or e-mail #50 they may think you’ve stopped taking your anti-psychotic meds.

Yes, recruiters and hiring managers should return a phone call or e-mail. But they are overwhelmed and just may not have the time to do it. It’s unfair but it’s life. Don’t take it personally. Move on. The sooner you do, the sooner you’ll find something.

What about skills, like SAS or R?
Yes, I realize I haven't said a thing about college degrees, majors, R, or SAS. But that should be obvious. If you know SAS, apply for SAS roles. If you know R, apply for R roles. Step one is getting past the software screening your resume for those keywords. That's the stuff that gets you the phone call. It's the soft stuff that seals the deal.

Why I Prefer SAS

I follow about 20-odd blogs about stats and analytics in Google Reader, everything from R to SAS, frequentists and Bayesians, genuine opinions and marketing propaganda. The sheer number of discussions, debates, and volleys between the R and SAS fanboys is getting tiresome. There are pros and cons to both platforms, as there are with any software platform. But I'm always struck by one difference between the R and SAS camps: SAS is more business-focused, whereas R seems more code-focused.

R: The Tinker’s Tool?

The R camp always seems so much more technical. They are concerned with code, algorithms, and platforms. And there is code. Lots of code. Code without context or application. The marketing propaganda is handled by Revolution Analytics, whose goal is to persuade me that I have to use R to analyze "big data." Every reason they list applies equally well to SAS.

SAS: Science, Business, and Beyond

If I flip over to the SAS blogs, I can find SAS code, similar marketing propaganda, and business advice. There are blog entries about analyzing data within context, systems thinking, and customer segmentation. It was on a SAS blog that I was first introduced to the idea of "beginning with the end in mind" – not through Stephen Covey or Fitzgerald Analytics (sorry, Jaime!).

I've found the SAS blogs and training instrumental in advancing my analytical career over the years. SAS knows that analysts have to understand the business, and that we don't necessarily graduate with that understanding or the other "soft" skills required for analysts to get a "seat at the table."

If R users have this same understanding, then I’ve definitely missed those blogs. Until the R users and propagandists offer the same business sense and training as SAS, I’ll continue preferring SAS over R. Analytics is about creating business value, not code.

Analysts: “Seek First to Understand, Then to be Understood”

A lot has been made of improving analytics by applying Covey's principle of "beginning with the end in mind." By beginning with the end in mind, a good analyst can identify the gap between the requested deliverable and the current state of analytical affairs, rectify that gap, and deliver what the business requested.

All that work is for naught unless the analyst has a variety of other, non-analytical, skills. One of these skills, or habits, is what Covey calls “empathic listening.”

Why Empathic Listening?

Empathic listening involves understanding a person by understanding their world – their role, their job, their stress, their goals, their aspirations. In short, by putting yourself in that person’s shoes. More often than not, we don’t listen empathically. We’re told what to deliver – a report, an analysis, a predictive model – and deliver what we were told to. And then we find ourselves having coffee with our fellow analysts, saying the following:

  • I don’t understand what she wants.
  • We gave him what he asked for, why isn’t he happy?
  • Why don’t they listen to us? We told them that months ago. Why didn’t they act on it when they could have?
  • She never listens to me.

We've all said or heard these things. It obviously goes to show the superiority of the analytical approach to intuition. Right?

    It’s All About YOU

    Wrong.
    YOU are the one making these statements.
    YOU are communicating the fact that
    YOU don’t listen, that
    YOU don’t understand the perspective, the situation, that the non-analytical business person is in. And until
    YOU understand that person’s world,
    YOU won’t be able to deliver and communicate the analytics that person requires to succeed in his or her non-analytical world.

    Now that I’ve pissed YOU off, let me assure you that I’m not talking about faking the data the Director needs to get his promotion to VP. We know these types and they are helpful reminders to either keep our resumes up-to-date or to polish up on our political skills.

What I am talking about is understanding what the business person really needs, not what they said they needed. And that means using your analytical skills to help that business person get what he or she needs to work and thrive in their non-analytical world. You have to understand where that business partner is coming from. That means going beyond merely hearing the person's request and fulfilling it. You have to first understand what they mean by the request – what they need. After all, if they knew exactly what they needed, they could pull and analyze the data themselves, and they wouldn't need you. Your job isn't just pulling data, coding, and doing math. Your job is to understand what your business partner needs to do their job. Understanding begins by turning business requests into conversations.

    Conversations as Data Collection

    Think of a conversation as a data collection effort that ends with multiple observations. The first set of observations is what the business partner asked you to do. Perhaps they asked you to create product segments, re-run an existing analysis, or to create “an econometric model predicting foreclosures.” Any single request is simply a conversation starter, not a command requiring you to immediately drop what you’re doing and pull data to deliver as an e-mail attachment or instant message.

A long time ago I received an instant message from my then-boss, asking me to re-run some code he linked to. Being a good boss, he also provided the context: we were assessing the impact some recent tax rebates had on customer sales. They hadn't found any impact and wanted me to re-run the code to see if anything had changed. I glanced at the code and IMed him back that it didn't include household income. Adjusted gross income determined the size of the tax rebates. Wouldn't it be better to segment by household income, as a proxy for adjusted gross income, to ascertain whether the rebates had impacted sales?
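
A minimal sketch of what that refinement might have looked like in SAS. The dataset and variable names (customer_sales, income_band, rebate_period, sales_amount) are hypothetical, not the actual code from that project:

```sas
/* Compare pre- and post-rebate sales within household income      */
/* bands, since adjusted gross income determined rebate size.      */
/* All names here are hypothetical.                                 */
proc means data=customer_sales n mean sum;
    class income_band rebate_period;   /* rebate_period: PRE or POST */
    var sales_amount;
run;
```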

    Any single analytical request, even in the form of an instant message, can be the beginning of a two-way conversation. But that was just getting the conversation started. We’re not empathically listening yet. All I really did was acknowledge what was requested, process it, probe, and re-state the goal back to my boss. Doing this was how I knew I understood what he needed.

    How Would You Feel If…?

But all that was pretty straightforward active listening, or parroting, and it's a way to get to the facts: what data needed analyzing, and how. "How" is cognitive, not emotional. Empathy doesn't involve checking off a box that the person is being "emo." It means understanding that person's perspective so you can share their feelings – to know how you would feel were you in that same situation. How would you feel if you had to tell the CEO of a Fortune 50 company whether the recent tax rebates had an impact on sales? There was a reason that code was being run multiple times, and it wasn't cognitive. It was emotional.

    And that’s the key. The last observation in any conversation a good analyst has with a business partner, or their boss, is noting the emotional undertones – the why – driving the request. Watch and listen carefully to how requests are made. Are they expressing frustration? Are they worried? What’s the context? Maybe they’re worried about missing a revenue goal they have to hit this quarter. Maybe they have a sick spouse or child at home. Maybe they are worried that you don’t have the time or data to give what they think they need. Maybe they are worried about the CEO’s reaction if the news they deliver is not what he or she wants to hear.

All these things go unsaid. That's why empathic listening is so important. Unless you acknowledge their emotions – their worldview – and ask them to help you understand what's going on, you won't be able to deliver what they need to do their jobs. That's why they don't listen to you.

    You don’t listen. You walk in with all the answers – with the prescription – without understanding, and diagnosing, where they are coming from.

    As you understand their jobs, needs, stress, and frustrations, you’ll understand what analytics they need – what they can act on – and then and only then will you offer something they can and will act on.

    And they’ll listen to you.

    Dear Analytics Recruiter

    Dear L***a,

I received your e-mail this past week regarding that new analytical opportunity in M*******. And yes, I do remember you, which is why I'm surprised to hear from you. Perhaps you've forgotten our conversation a year ago. Please bear with me as I provide a little context to jog your memory.

Last year I was searching for new analytical opportunities. You posted several positions within various analytical groups on LinkedIn, and I contacted you regarding one opportunity that appeared to be a good fit with my skills and experience. You responded by claiming that I had only process and strategy experience, but no hands-on analytical experience. You also unwittingly contacted people in my network who recommended me as a good fit for the opportunity.

    In the end, the company in question hired one of my former co-workers.  And he didn’t get the opportunity through you, but through his network.  

    And so here we are, a year later. I’m doing what I love… analyzing customer behavior, and putting those insights into production *after* explaining them to the executives. 

    And you’re asking me if I remember you. 

    Yes, we remember you. 

We can plan, manage, strategize, and verbalize what we do, as well as actually do it. You don't know it when you see it. So of course I'm not forwarding this opportunity to anyone in my network. We are obviously doing quite well without you.

    Why hello again!

    This past week I received an automated notification that someone had subscribed to my blog. My poor ignored blog that, to my shock and horror, hasn’t seen a new post since May. Admittedly, I’ve been busy since then, moving back to New England, starting a new analytics role, and overdosing on the Red Sox and whoopie pies. But I hadn’t forgotten this blog, and did spend time scribbling ideas in a notebook. Unfortunately, said notebook has been misplaced.

Aside from my lame excuses for not writing, there is one thing that strikes me as odd: the sheer number of analytical blogs already out there. Seriously. I follow about 10 to 15 very useful statistical blogs. I don't include the bloggers who whine that no one listens to them. Those blogs are amusingly ironic, but not terribly useful.

More troubling are the popular bloggers who haven't analyzed a single datum in their lives. Their apparent popularity, and penchant for self-promotion, makes me wonder whether all the concern about the tech/social media bubble is misplaced.

Perhaps we should worry more about an analytics bubble and the damage caused by the influx of non-analytical self-promoters selling snake oil masquerading as analysis. A non-analytical business audience cannot discern between analytics and snake oil. And once that "analytics" credibility is gone, it won't come back. The business audience will broadly stereotype analytics as snake oil and dismiss it, no matter how useful it is to those who know better.

    Which leads me to the real reason for this posting – whether I should shut down this blog. Time is finite and most of mine is spent with my head down in SAS and SQL. I’m not convinced that this blog adds to the analytics conversation in a productive manner that builds and promotes the field. And I certainly don’t want to give the appearance of a self-promoting snake oil saleswoman.

    And so, assuming anyone is out there reading this…

    What do you think? Should I put a bullet in this blog?

    Reporting is to Analysis as …

On television, when we watch Doctor Who, Sherlock Holmes, or Castle, we watch the scientists, detectives, and writers discover the truth and make the world a safer place.

    Every week we tune in and watch these fictitious characters:
    1. Find a mystery to solve or mission to accomplish
    2. Collect objective evidence
    3. Make observations on the behavior of the people involved in the mystery or mission
    4. Ask questions to gather additional subjective evidence from these people
    5. Form hypotheses
    6. Test the hypotheses using all the available evidence
7. Solve the case or return to a previous step

This circular process has entertained generations since Sir Arthur Conan Doyle published the first story featuring Sherlock Holmes.

In the real world, data miners, data scientists, analysts, and statisticians may call this process CRISP-DM, SEMMA, or just plain old analysis.

[Figure: The analytical process defined as CRISP-DM]

Contrast this to a linear reporting process. On Castle or The Good Wife, the detectives or lawyers receive specialized reports on the evidence collected. Perhaps the dried blood collected from a glove is tested for DNA and the results are given to the detective or lawyer. The detective or lawyer uses the report to test their hypotheses about who really murdered the victim. But the report itself is not "analysis." The report was the output, or deliverable, of a well-defined, linear production process.

    The Linear Reporting Process

Everything in the process is well-defined and controlled to ensure an accurate report. But the successful production of the report ends the process. The programmer, technician, or proverbial "white lab coat" delivers the report, checks the deliverable off their status report, and moves on to another case, starting the linear process over again – much like a factory produces widgets, or a business intelligence tool, like Cognos or Omniture, automates dashboard and report production.

    Detectives use the report, but the report is not, in and of itself, analysis. That’s why the fictional detective may ask for evidence to be re-tested – perhaps there is a new suspect whose DNA should be tested against the sample from the bloody glove. The DNA report is a report and reports are used in the circular analysis process, but the reports are not themselves “analysis.” Analysis is circular, done manually, and incorporates reports, data, and subjective observations. But reports are produced through a linear production process.

    So what?

Why go on and on about the differences between analysis and reporting? One recycled plot line involves the innocent person wrongly convicted of a crime he or she didn't commit. Inevitably some other detective, our Lestrade, found a single report, took it as proof of the accused's guilt, and stopped looking. Psychologists call this confirmation bias, but what it amounts to is a linear process instead of the circular process in which multiple data sources are tested until a clear and convincing story or conclusion is reached.

In other words, he or she failed to go through the circular analytical process that data miners, analysts, statisticians, the Doctor, and Sherlock Holmes follow. And the audience eats up these stories week after week, as the planet is saved, murderers are convicted, and innocent victims are spared wrongful convictions.

In the real world, businesses make important decisions that would bore audiences. It's important to understand the difference between analysis and reporting if you're trying to create a data-driven culture. If you're trying to create an executive dashboard with the KPIs that managers and executives need for Monday morning decision-making, then you need to hire the teams, vendors, and technologies for a linear production process. If you have tough business problems that need solving – in other words, your business needs insights and recommendations – then hire the people and agencies that engage in a circular analytical process.

There's a big difference between the consulting detective and the white lab coat. They are equally important, and the consulting detective can't function without the white lab coat (e.g., the analytical consultant can't function without the DBA). But it's important for the business to know what it wants, needs, and is trying to solve for before sending out the RFP or posting a role to LinkedIn. Hiring the wrong agency or person is great television, but it's a costly mistake that will hurt your company's bottom line.

    Note: I would be remiss if I did not provide a link to a classic, and brilliant, blog on this issue published by Brent Dykes for Omniture in 2010. Brent outlines the differences between reporting and analysis in a way that applies to all reporting and analysis, not just web analytics.

    What is Modeling without Analysis?

    There have been a lot of discussions, buzz, and white papers about analytics, statistics, data, and big data lately. These discussions can get pretty heated, insofar as any analytical discussion can get heated.

Just this past week I was discussing the pros and cons of variable clustering for variable/dimensionality reduction when the other person interrupted me. He kindly informed me that I was wrong. He was of the "school of thought" that one should never begin by exploring one's data, but should just throw it into a regression model, "see what sticks," and be done with it.

    OMG

I could write a textbook about why that's wrong, but I don't have to. Anyone who has taken an introductory statistics course knows it's wrong. Take, for example, "Using Multivariate Statistics" by Tabachnick and Fidell: every chapter includes the assumptions underlying the statistical procedure being reviewed.

Now, some people may argue that we shouldn't mindlessly adhere to (frequentist?) statistical procedures and their assumptions. For example, Olivia Parr Rud, in the "Data Mining Cookbook," argues against being overly concerned with multicollinearity (pages 106 through 108). In some instances, we might want to change approaches midstream in our analytical process if it doesn't look like the original plan is going to work given the available data (e.g., switching to ridge regression if multicollinearity is a perceived problem on a particular project).

Nevertheless, we won't know whether any assumptions are violated if the data isn't examined before being modeled. There are several established, reputable "schools of thought" regarding data analysis, such as CRISP-DM (shown in the figure below) and SEMMA.

[Figure: CRISP-DM]

Regardless of the approach you prefer, the data should be cleaned, explored, mined, selected, modified, classified, reduced, and so on. We should always analyze the data before modeling it; otherwise, the quality of our predictive models won't be very good. (How to evaluate the quality of one's statistical model is a separate issue.)
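
For instance, here is a minimal sketch of two of the pre-modeling checks discussed above – variable clustering for dimensionality reduction and a multicollinearity diagnostic. The dataset and variable names (modeling_data, x1-x25, y) are hypothetical:

```sas
/* Variable clustering for variable/dimensionality reduction.      */
proc varclus data=modeling_data maxclusters=10;
    var x1-x25;
run;

/* Variance inflation factors flag multicollinearity before any    */
/* regression coefficients are trusted.                            */
proc reg data=modeling_data;
    model y = x1-x25 / vif;
run;
quit;
```

Neither step is exotic; both are exactly the kind of examination the "throw it all in and see what sticks" approach skips.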

The point I'm trying to make here, and which I was trying to make with this analyst, is that there is no reputable "school of thought" advocating throwing unanalyzed data into a model. If such a "school of thought" exists, it's the "garbage in, garbage out" (GIGO) "school of thought."

    The Current State of Analytics: Power and Learned Helplessness

    Seth Godin claims that we live in an age of Linchpins: empowered employees going above and beyond the factory-work of the past to produce works of artistic excellence. As someone who spent 10 years producing strategic insights through analytics, I find that today’s companies want analytical factory workers, not linchpins.

In the past 6 months I've met many managers and analysts from different industries, all stuck in a state of learned helplessness. Each one tells the same story: they have given up on their attempts to generate insights. They just produce the numbers and reports the executives and clients want in exchange for a paycheck.

How is it that, in the midst of the golden age of analytics, analysts ended up in a collective state of learned helplessness?

    The Analyst’s Perspective

Analysts are required to be mindless factory workers, producing models and metrics through productionalized methods and processes, filling in predetermined reporting templates. These reports are then automatically published through Business "Intelligence" systems, or into Excel, and e-mailed to executives with a few uncreative bullet points – placed there by a non-analytical manager – masquerading as analysis.

I can already hear you clicking away from this blog as I hit that nerve of disbelief. But what I just described is what many analysts suffer through in exchange for a paycheck. This is what recruiters and hiring managers are looking for. Don't believe me? Search through the job postings on LinkedIn. As of May 1, 2011, "analytics process" brings up 985 job postings. At the same time, "analytics creativity" brings up 179. The former reflects the dominance of analytical positions requiring mindless automatons and factory workers. The latter reflects the very few positions where analysts have the autonomy and control to truly analyze data and provide insights – not reports.

In some of the discussions in the analytics and research LinkedIn groups, many analysts complained about the absence of opportunities to provide creative, strategic analytics and insights to companies and clients. Managers turned the conversation around to the tried-and-true formula of blaming the analysts themselves for failing to have the talent and education that companies allegedly require. But blaming the analyst doesn't fix the fact that these companies want, and advertise for, factory workers.

    The Executive’s Perspective

    It’s also easy to blame companies for creating the roles and situations that result in factory work and reports instead of analytical insights. It’s harder to ask the question of why these process-obsessed roles exist, given that executives and companies say they want creative and strategic analytical insights and recommendations.

    The reality is that what executives really want is power. There are a number of ways of defining power, and one of those definitions includes knowledge or information.

    Traditional management structures are hierarchical and command-and-control in nature. Analysts report to a Director or Manager, who in turn reports to a Director or Vice President, and so on. Even within matrixed organizations, we still have someone we report to, with the positional power to make or break our careers through a performance review or pink slip.

    Analysts have a unique form of power – they have data. Analysts have the power to analyze that data and provide recommendations to management on what to change and how to change it to increase revenue and/or reduce costs.

    Executives don’t get into positions of power by permitting analysts to go running around telling everyone what strategies did or did not work and by providing recommendations on what to do next. Executives get and maintain power by maintaining control over the analysts and the data. Hence, the “need” for those command-and-control management structures.

    To maintain their power, executives have to find an efficient way to control their analysts and the potential power those analysts have. That’s where the processes and reporting templates come in. By forcing the analysts to calculate predetermined metrics into predefined reporting templates, the executives undermine analysts’ power by preventing them from doing what they do best: analyzing data to provide strategic and actionable insights.

    What the executives get instead is the predetermined answers they were looking for. In other words, executives get the numbers they need to look amazing and retain their positions and power.

    Publicly, the executives whine about the absence of business-minded analysts capable of producing strategic insights. Privately, these same executives created the processes to prevent the insights they publicly claim to seek.

Analytics, as currently practiced, requires mindless automatons to produce the metrics populating report templates used by decision makers to make themselves look good, justify their decisions, and maintain or achieve promotions to increasing levels of power.

    The truly talented, and employed, analysts have learned to adapt through learned helplessness.

    The Path to Insights

    These disempowered analysts I speak of have one thing in common: the structure of their organizations. Each one works in a company where analytics has been decentralized into verticals, silos, lines of business, or in an agency/vendor. In my experience, empowered and creative analysts work in centralized analytics organizations or “centers of excellence.”

In Analytics at Work, Davenport et al. support the notion of empowering analysts, advocating:

1. "autonomy at work – the freedom and flexibility to decide how their jobs are done" (p. 103)
2. "a strong culture of trust – where they believe that the other people in the organization are open and honest and act with integrity" (p. 104).

Davenport et al. note that these conditions, and centralized analytics organizations, are the exception rather than the rule. I submit that these are the conditions necessary to empower analysts – that by changing the management structures under which analysts work, companies can achieve the strategic insights and recommendations they claim to want.

These conditions do not exist because it is not in executives' own self-interest to allow them to exist. To maintain their power, executives control the stories analysts can tell, through predetermined processes and decentralized analytics teams that report to the very executives who could be negatively impacted by those stories, were the analysts permitted to analyze the data.

    The Solution

Both parties – executives and analysts – have to take responsibility for the current state of affairs. Consistent with Seth Godin's Linchpin concept, analysts are going to have to stand up for themselves and defy their managers and/or processes. I encourage every analyst to defy the process: analyze the data and provide your manager and/or client with insights and recommendations. Who knows? Maybe the managers and clients are frustrated by the same reporting straitjackets and are looking for the path to those strategic insights themselves. If so, you have taken the initiative to improve your company – something many managers, including myself (once upon a time), always appreciated, encouraged, and enjoyed seeing.

If that isn't the case – if the manager rejects the analysis and demands that the process be followed and the reporting template be filled out – then the analyst must start looking elsewhere for employment. And yes, this is a dire situation given the keyword search I did earlier. But if we stay in these soul-sucking roles, then we empower these executives to disempower us and destroy our souls.

The best thing we can do is to maintain the current "talent" shortage – to force companies to change their organizational structures and processes – to teach executives that the best way to maintain their power is to share or delegate that power to their analysts. With luck, those executives who fail to learn this lesson will themselves receive a pink slip.

    Social Media Won’t Tell You Why – Your Analyst Will

One distinction some market researchers and/or social media pundits like to draw is the difference between why and what. "What" is behavior – what purchases a customer makes or what links someone clicks on. "Why" is the reason for the "what" – for the observed behavior, such as the purchase made or the link clicked.

It's been argued that behavior is merely a lagging indicator and that the real key is to understand the why through social media or qualitative research. By listening to what your customers tell you, you'll know why they did what they did and can act accordingly.

    There are two reasons why this is misguided.

    We don’t know why we do what we do

    Much of our behavior is driven by motivations and emotions that we are not aware of. Lacking such self-awareness, we may offer socially acceptable reasons or reasons that fit within the context of the survey question being asked. This is the reason why conducting good qualitative research is so hard. It takes a good researcher, with a good understanding of human psychology, to be able to get past the superficial reasons to understand the underlying emotional drivers of the behaviors we observe within the particular situations and contexts that the behaviors occur in.

Good luck getting that out of 140 characters.

    Deriving the Why from the What

Sometimes the reasons why we do what we do can be discerned from our behavior. A former colleague of mine had to discover why retail customers were returning ivory-colored toilets. After all, once a customer returns a toilet, it can't be re-sold; presumably this behavior was driving costs up. To change this behavior, the retailer had to understand why it was happening in the first place.

    The analyst looked at the transaction data before and after customers purchased and returned ivory toilets. She found that those same customers returning ivory toilets turned around and promptly bought white toilets. Customers wanting white toilets were mistakenly purchasing ivory toilets instead.
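
Her approach can be sketched in a few lines of PROC SQL. The table and column names (toilet_returns, toilet_purchases, item_color, txn_date) are hypothetical, not her actual code:

```sas
/* Join returns to subsequent purchases by the same customer:      */
/* ivory toilet returned, white toilet bought afterward.           */
/* Table and column names are hypothetical.                        */
proc sql;
    create table why_returns as
    select r.customer_id
    from toilet_returns as r
         inner join toilet_purchases as p
             on  r.customer_id = p.customer_id
             and p.txn_date > r.txn_date
    where r.item_color = 'IVORY'
      and p.item_color = 'WHITE';
quit;
```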

    But why were they making this mistake? Because there was no picture of the toilet on the package to illustrate that an ivory toilet is not a white toilet. Apparently customers thought ivory and white were the same color.

Using only behavioral data – and after making a trip to the store's toilet aisle – she was able to discover the why behind the what. Could the retailer have conducted a survey to get the same information? Sure. But a good analyst knows which data and methods are most appropriate for a particular business objective or question.

    A good analyst knows when to conduct a survey, when to analyze the behavioral data, and when to collect and analyze social data.

    Social media isn’t going to tell you the why, your analyst will.