Lack of hands-on testing for products by industry analysts [AR Practitioner Question]

QUESTION: It disturbs me that analysts don’t have hands-on experience with the products they advise their clients about. How can they be credible?

ANSWER: Analysts, especially at the larger end-user centric firms like Gartner and Forrester, have access to what they would claim is the world’s largest testing lab – their clients. Analysts covering hot product areas, both software and hardware, are hearing from scores of clients every month who are evaluating products or using them in production. As a consequence, the analyst firms claim that they can develop a series of data points that quite accurately portray the quality of a vendor’s products. 

Note: This is a classic SageCircle article from our pre-blogging newsletter. This particular article appeared in March 2001. It is being brought into the blogging era because ZL Technologies in its lawsuit against Gartner criticizes Gartner because it does not do “a single minute of independent testing of the products it purports to evaluate” (page 9 of the original court filing).

SageCircle Technique:

  • Vendors can help themselves by ensuring that their product evaluation programs for prospects are Continue reading

Evidence Sidebar – Gartner needs to cover the role of information from end-user inquiries (part 6 of 7 about Gartner’s Q3 AR Call)

Gartner’s Analyst Relations team holds a quarterly conference call for the analyst relations (AR) community. SageCircle occasionally will post about the call, but for this particular call there was so much information that we have a seven-part series to highlight details and provide commentary. See below for links to all seven posts.

One of the announcements at the Gartner Q3 AR Call was the rollout of the “Evidence Side-bar” for written research. During the presentation the Evidence Side-bar was described as “a description of the evidence behind the written research” and it will be “positioned on the front page of each document.” This is a welcome development, as increased transparency can only enhance the credibility and usefulness of Gartner’s research. Additional detail about the Evidence Side-bar taken from the AR Call presentation includes: 

  • Methodology
    • A high level view of the methodology  …or…
    • A link to the Methodology Document  …or…
    • A pointer to the Methodology Statement
  • Source
    • Primary research, e.g., “Gartner Survey”
    • Secondary research
    • Reference to another Gartner note, etc – with appropriate details
  • Notes
    • Additional information or commentary
    • A description of models used
    • Criteria for inclusion of technologies
    • Forecast assumptions

This is all well and good. However, there was a glaring omission in the discussion of sources so SageCircle submitted the following question:

Question from Gartner AR Call

Frankly, we did not think that the Gartnerians would respond to the question. Much to our surprise, they did. Here is VP Mike Anderson’s reply:

“Inquiry is a great source of the evidence that a lot of analysts use for those results. I do not believe that we have standardized on the content of how inquiry is presented or how inquiry was used.

Instances where there were substantial numbers and quantities or where particular demographics have become important in the analysis that is being presented, those will be the things that analysts will be putting into the Side-bar. We’ll see feedback to them to ensure the Continue reading

Former Forrester analyst criticizes the Wave… but why did he wait?

Source: Corporate Integrity website

Michael Rasmussen (blog, bio, Twitter) is a former Forrester analyst now with his own boutique firm, Corporate Integrity. In his recent blog post The Forrester GRC ‘Ripple’ (OOOPS . . . I Mean, ‘Wave’) Michael calmly dissects the Wave methodology and makes several suggestions for improving it. It is well worth the read. 

However, this SageCircle blog post is actually in response to a Twitter direct message Carter received about Rasmussen’s post: “while part of me admires Rasmussen for offering critique, why didn’t he do so while AT Forrester? Hints at sour grapes.”

Probable Answer: It was only after he left the firm that he could see the problems in the methodology

Major firms are constantly tweaking their methodologies with input from the analysts. But that is typically done around the edges, with the goal of increasing efficiency, fixing minor problems, or silencing vendor complaints. Frankly, it is the rare analyst at a major firm who takes the time to do an in-depth analysis of her firm’s research methodology to see where it is really broken. This lack of critical observation occurs for a variety of reasons:

  • Analysts are too busy with day-to-day activities
  • They have drunk the Kool-Aid and believe that what the firm does is perfect
  • “If it ain’t broken, don’t fix it” attitude toward things that appear to be working

So Rasmussen should not be criticized for not criticizing the Wave methodology when he was an employee of Forrester. Rather than think of this post as “sour grapes,” it is much more likely that he did not have an “a-ha!” moment until he was on the outside looking in. We all know that hindsight is 20/20.

SageCircle Technique:

  • Research consumers and AR professionals should develop a deep understanding of the methodologies used by their key analysts
  • Research consumers should press analysts for detail on Continue reading

Presentation déjà vu could be a sign you are dealing with a problem analyst

Something that always stands out in my experience as an AR manager at a major vendor is what I call “presentation déjà vu.” This happens when you are reviewing an analyst presentation and just feel like you’ve seen it before. This typically occurs when you are looking at the slides of an analyst you have not dealt with before. Perhaps you have seen the presentation before or maybe the deck is just so generic or archetypical that it is immediately recognizable. No big deal. However, presentation déjà vu might also be a warning signal that you are dealing with a type of problem analyst.

Some analysts fall into the trap of doing a light revision of a past presentation for an upcoming conference. This is especially true for Gartner analysts who have to do essentially the same presentation year after year at Symposium (e.g., the Powerhouse Vendor and its successor session Gartner Compares). This can be a real problem because if the analyst is not paying careful attention in the revision process, old information and recommendations could be repeated. This may cause tech buyers to make wrong decisions, resulting in missed sales opportunities for vendors who are misrepresented. This could be disastrous for both the IT manager and the vendor.

Rather than pouncing on the analyst for using Continue reading

Essential AR skill – knowing analyst firm research methodologies

One of the essential skills for all analyst relations (AR) professionals is knowing the different types of research methodologies used by analysts and the implications for vendors. 

For example, advisory analysts* are very well positioned to spot a vendor’s message inconsistencies. That is because advisory analysts are talking to a wide variety of people that interact with that vendor, in addition to the analysts’ direct interactions which are arranged by the vendor. Advisory analysts hear from their end-user clients during inquiries what the vendor sales representatives are saying to the end user. The analyst is then in a position to compare what the sales rep is saying to prospects/customers versus what AR and executives are directly telling the analyst. Advisory analysts are also talking to financial analysts and the press, which provides additional opportunity to compare what AR is saying to the analyst against what the vendor is saying to these other communities.

This is a critical insight for AR teams that Continue reading

Spoon feed analysts public information

This should not have been a surprise to me, but when I first started dealing with analysts as an analyst relations (AR) professional, I was shocked by the number of analysts who never bothered to check my company’s public information. Yeah, it was OK that they never read the marketing content on the website. But they also never perused the quarterly financial statements, even when they were basing part of their analysis on the financial strength of my employer and very visibly stating the “facts.” Here is an example. 

A Gartner analyst sent me a courtesy review copy of slides for an upcoming Symposium presentation. One of the statements on the “Challenges and Strengths” slide was that the margins for a particular business were a “challenge.” Huh? This particular division had consistently improved its margins – year-over-year and quarter-over-quarter – for ten straight quarters. What was going on? That was when the light bulb went “Click!” for me. The analyst had not read the quarterly statements. So I put together a simple table that extracted a few relevant financial facts for the business group going back four years. It showed the challenges the business had early on, but then it illustrated the consistent, never-wavering progress for 2.5 years. After reviewing the simple table consisting of public information, the analyst moved margins from “Challenge” to “Strength.”

That was a win for AR, but it outraged me that a former colleague was making public speeches about a company without bothering to check the facts. Yes, perhaps this was a bit naïve of me. Once I took a few deep breaths and calmed down, I set about spoon feeding this analyst and the other Gartner analysts in the same research area the basics about this particular business group’s financials. Every quarter I would add to the aforementioned Continue reading

Using social media as a research tool is not the end-all, be-all even for social computing analysts

In another example of his radical transparency – at least for an analyst at a major firm – Jeremiah Owyang posted How crowdsourcing helps some – but not all research activities, which discusses how he uses both social media-based and traditional Forrester research methodologies. It is an interesting read.

This post provides insights for analyst relations (AR) professionals into how a leading edge analyst is leveraging social media today as a research tool. These insights could prove useful as Continue reading
