Is it time to incorporate risk analysis into analyst list rankings?

Every AR team needs to manage its analyst list(s) to ensure it focuses the right attention on the right analysts. SageCircle stands on the “analyst list management” soapbox a lot because it is such an important aspect of an effective and efficient AR program. Creating a list ranked by impact and then tiering it based on available resources is the way to manage your service levels for analysts and, ultimately, your stress. Many data points go into an analyst ranking framework: visibility, research coverage, reputation, firm, geography, and so on. This post opens a discussion on whether risk should be added to the ranking criteria.

In this context, the risk being discussed is the potential damage to sales deals, market perception, internal politics, and such that can be caused by an analyst with a negative opinion. How much effort should you put into negative analysts?

So, should risk be incorporated into the analyst ranking framework as either a primary or secondary criterion? For instance, of two analysts who are roughly equal on all other criteria, the negative analyst could end up ranked higher than the positive one, because more risk is associated with the negative analyst and AR wants to invest more time in moving that analyst’s opinion. If the two analysts are on the border between Tier 1 and Tier 2 […]
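As a sketch of how risk might work as a secondary criterion, the snippet below breaks near-ties in favor of the higher-risk analyst. The names, scores, and threshold are illustrative assumptions, not a SageCircle-prescribed formula:

```python
def rank_with_risk(analysts, epsilon=0.25):
    """Order analysts by base score, then break near-ties (within
    epsilon) in favor of the higher-risk analyst, on the theory that
    AR should invest more time moving a negative analyst's opinion."""
    ordered = sorted(analysts, key=lambda a: a["score"], reverse=True)
    for i in range(len(ordered) - 1):
        a, b = ordered[i], ordered[i + 1]
        if abs(a["score"] - b["score"]) <= epsilon and b["risk"] > a["risk"]:
            ordered[i], ordered[i + 1] = b, a  # risk wins the near-tie
    return ordered

# Two analysts nearly equal on everything except risk (hypothetical data):
pair = [
    {"name": "Positive analyst", "score": 8.1, "risk": 2},
    {"name": "Negative analyst", "score": 8.0, "risk": 9},
]
ranked = rank_with_risk(pair)  # the negative analyst moves to the top
```

Treating risk as a tie-breaker like this makes it a secondary criterion; folding it into the weighted score (as in the ranking definition below) would make it a primary one.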

Defining “Analyst List Service Level Framework”

An analyst list service level framework defines the resources that are available and the type and amount of attention that will be provided to each group of analysts, as defined by ranking and tiering. For example, a service level framework will indicate which analysts receive one-on-one briefings from executives, get top priority when their information requests are handled, and are contacted first when major news breaks. Because service level frameworks reflect the resources that the analyst relations (AR) team has available, they must evolve as those resources change.

Service level frameworks typically align with the tiers (e.g., Tier 1 and Tier 2) established by the analyst list management process.
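One way to make such a framework concrete is a simple tier-to-services mapping. The specific entitlements below are purely illustrative assumptions, not a prescribed framework:

```python
# Hypothetical service levels per tier; each AR team would define its own
# entitlements based on the resources actually available.
SERVICE_LEVELS = {
    "Tier 1": {"executive_briefings": True,  "inquiry_priority": 1,
               "news_contact": "first call"},
    "Tier 2": {"executive_briefings": False, "inquiry_priority": 2,
               "news_contact": "same day"},
    "Tier 3": {"executive_briefings": False, "inquiry_priority": 3,
               "news_contact": "email update"},
}

def services_for(tier):
    """Look up the service level an analyst's tier entitles them to."""
    return SERVICE_LEVELS[tier]
```

Keeping the mapping in one place makes it easy to adjust as AR resources change, per the definition above.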

Defining “Tiering an Analyst List”

Tiering is a process for segmenting an analyst list so that analyst relations (AR) can prioritize its activities. The most common labels are numeric (e.g., Tier 1, Tier 2, and Tier 3). Tiering starts with a ranked list of analysts and then draws lines between analysts to create groups. Where the lines fall depends on AR resources (e.g., AR headcount, executive bandwidth for briefings, budget, et cetera): the fewer the resources, the smaller the number of analysts that can be included in the Tier 1 group.

Tiering is one step in analyst list management: it follows ranking and provides the structure for the service level framework.
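The “draw lines based on resources” step can be sketched as a simple split of the ranked list, where per-tier capacities stand in for AR resources. The names and numbers are illustrative:

```python
def tier_list(ranked_names, capacities):
    """Split an already-ranked analyst list into tiers; capacities[i]
    is how many analysts AR can serve at Tier i+1 service levels."""
    tiers, start = {}, 0
    for i, cap in enumerate(capacities, start=1):
        tiers[f"Tier {i}"] = ranked_names[start:start + cap]
        start += cap
    return tiers

ranked = ["A", "B", "C", "D", "E", "F"]
tiers = tier_list(ranked, [2, 3, 1])  # fewer resources -> shrink the 2
```

When resources tighten, only the capacity list changes; the ranking itself stays the same.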

Defining “Ranking an Analyst List”

Ranking is a process for ordering a list of analysts based on formal weighted criteria. The criteria can include research coverage, visibility (e.g., publications, press quotes, social media, speeches), firm affiliation, geography, risk, and others. Criteria and weights are driven by the vendor’s objectives at both the business unit and analyst relations level, and they should evolve over time as those objectives change.

While firm affiliation is an important data point, it is not the primary driver for analyst lists. Ranking should focus on individual analysts and not automatically grant top placement to analysts employed by the largest analyst firms.

Ranking is one step in analyst list management and precedes tiering.
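A minimal weighted-scoring sketch of the ranking step, assuming illustrative criteria, weights, and 1–10 scores (a real program would derive all three from its own objectives):

```python
# Illustrative weights summing to 1.0; not a recommended allocation.
WEIGHTS = {"coverage": 0.30, "visibility": 0.25, "firm": 0.15,
           "geography": 0.10, "risk": 0.20}

def rank(analysts):
    """Order analysts by weighted score, highest first."""
    def score(a):
        return sum(w * a["scores"][c] for c, w in WEIGHTS.items())
    return sorted(analysts, key=score, reverse=True)

analysts = [  # hypothetical analysts with 1-10 scores per criterion
    {"name": "Analyst A",
     "scores": {"coverage": 9, "visibility": 7, "firm": 8,
                "geography": 5, "risk": 3}},
    {"name": "Analyst B",
     "scores": {"coverage": 7, "visibility": 8, "firm": 6,
                "geography": 9, "risk": 9}},
]
ranked = rank(analysts)
```

Note that firm affiliation is just one weighted criterion here, consistent with the point above: a big-firm analyst does not automatically outrank everyone else.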