Tag Archives: Tactics

Sharing shows how you’re faring on the web

How frequently people share content they find on your website, blog or other electronic outreach may ultimately be the most important measure of how well you’re building and maintaining your issue or action coalition.

If the people you reach out to are forwarding, Digging, Delicious-ing the content you are generating, then they’re voting with their actions that what you have to say is important because it:

  • Contributes to or furthers a conversation they think matters;
  • Advances their self or civic interest; and/or
  • Confirms their values, beliefs or ideas.

The key, then, is to make sure that the content you’re generating is “share worthy” by concentrating on its:  

  • Trustworthiness – Do you take every step possible to make sure that content is accurate, complete, low on spin and authentic to the style and culture of your organization?
  • Relevance – Do you know in great detail who your coalition partners are, what interests and motivates them, and do you provide them with what they need and want?
  • Immediacy – Is the value of your content instantly recognizable without requiring a complicated explanation?
  • Usefulness – More than ever before, content is king, especially well-written, timely and relevant news, how-tos and other material that adds value to everyday life, or at least makes it easier and more productive.  

Bottom line, any time you’re posting information, ask yourself: “Will my partners and audiences use this material and, if so, how will their task/day/life go better?” If your content provides an answer, then odds are what you have to say is “share worthy” and thus an Internet success.


Quick Tip: Heads up on a new Twitter tool

From E-Media Tidbits:

“A new Twitter interface application, Twitterfall, has been around for a month now.  … this is a must-see — for about 10 minutes. Then it becomes a must-use.

Here’s what Twitterfall does:

  • Scanning. You can choose to watch everyone’s tweets go by, or log in to watch only the tweets of those you follow. Thanks to Comet technology, Twitterfall has an especially fast search service. You can alter the speed from 0.3 tweets per second to a mind-scrambling 10 tweets per second.
  • Keyword tracking. You can see the most popular terms of the moment, and just follow tweets containing those keywords (including hashtags). Or you can enter your own search term (as on the Web-based Twitter service Monitter) to track tweets mentioning it. You can combine keywords, too.
  • Geo-filtering. You can enter a location to narrow down your display to tweets from that location that also mention keywords you choose (again as with Monitter). The words Mumbai and Chengdu come to mind.
  • Basic usability. Unlike Monitter, you can use Twitterfall to post tweets yourself, reply to tweets and mark tweets as favorites. Just hovering over a tweet pauses the whole thing. You can also follow a user with one click — a feature some popular clients like Tweetdeck lack. You can filter by language and choose to exclude retweets. You can save favorite searches. And you can customize the appearance of the interface, including the font size.

This is quite simply the best-designed Twitter interface …”

Quick Tip: Test your coalition to build understanding

People love quizzes and surveys – at least when their GPA isn’t at risk.

And it appears that, besides the entertainment value, taking “tests” actually helps you better remember what you’ve learned, even if it wasn’t covered on the test.

Testing works even better than simply giving people more time to study, at least in terms of long-term recall of the material, according to the Journal of Experimental Psychology.

The lesson here is that an effective – and entertaining – way of educating coalition members about information and messaging may be to occasionally let them test and cement their knowledge with a casual survey or quiz offered online, at meetings and in other forums.

Quick Tip: Making sure we speak the same language

If you feel that you and your coalition or network members are talking past one another, you might try mediating the conversation through Wordle.net. (Another take on this concept with richer features can be found at http://manyeyes.alphaworks.ibm.com/manyeyes/.)

Wordle describes itself as a tool “for generating ‘word clouds’ from text that you provide. The clouds give greater prominence to words that appear more frequently in the source text. You can tweak your clouds with different fonts, layouts, and color schemes” to emphasize differences in frequency of use.

Wordle’s real beauty is that it gives you an easily understood quantitative visual analysis of whether you and your audiences are using the same language to talk about common issues and concerns.
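Under the hood, a word cloud is just word-frequency counting, with type size standing in for the counts. If you want the raw numbers behind the picture, a rough sketch like the following does the same comparison in a few lines of Python (the file names are placeholders for your own exports, not anything Wordle supplies):

```python
# Rough sketch: compare the most frequent words in two text samples,
# e.g. your own outreach copy vs. comments or survey responses from
# your audience. The file names are placeholders for your own exports.
import re
from collections import Counter

STOPWORDS = {"the", "and", "a", "an", "of", "to", "in", "is", "it", "that",
             "for", "on", "with", "as", "are", "this", "we", "you", "our"}

def word_counts(path):
    """Count lowercased words, skipping punctuation and common stopwords."""
    with open(path, encoding="utf-8") as f:
        words = re.findall(r"[a-z']+", f.read().lower())
    return Counter(w for w in words if w not in STOPWORDS)

ours = word_counts("our_content.txt")           # placeholder file name
theirs = word_counts("audience_comments.txt")   # placeholder file name

print("Our top terms:     ", [w for w, _ in ours.most_common(15)])
print("Audience top terms:", [w for w, _ in theirs.most_common(15)])

# Words prominent for the audience but absent from our own material are a
# quick signal that we may be talking past each other.
gap = [w for w, _ in theirs.most_common(50) if ours[w] == 0]
print("Audience terms we never use:", gap[:15])
```

The gap list at the end is often the most revealing part: those are the words your audience reaches for that you never do.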

It’s not just text responses that you can run through Wordle.  Some use it to analyze how people are tagging content (see http://www.wordle.net/gallery/wrdl/505252/WRI_Delicious_Tags:_4_Feb_2009) to see if the language they use is the same as that of their audiences. (This visual example of a Wordle chart may take some time to load.)

All in all, it’s a good, fast way to run a mid-course reality check on whether you and those you’re trying to motivate are talking about the same things in the same way.

Are phone surveys dead?

We can no longer expect representative samples from random-digit-dial (RDD) phone surveys, according to The Metrics Insider.

That’s a problem for those of us who rely on phone surveys to help define the audiences, issues and messages that are most likely to compel our coalition members to coalesce and take action.

The full article is here; an excerpt follows:

“The CDC’s National Health Interview Survey has become a highly visible source in the research community for tracking the incidence of what they call wireless-only, and what media researchers generally call cell phone-only, households.

On Dec. 17, their latest findings were released, covering the first six months of 2008. According to the NHIS, 17.5% of US households were wireless only. To put that into perspective, just three years prior, in the same study, only 7.3% of US households were wireless-only.

When we look at target demographics, the cell-only situation becomes even more dramatic. Fully 21.6% of US Hispanics live in cell-only households. And consider this: 31.4% of 18-24 year-olds live in cell-only households, as do 35.7% (over a third!) of 25-29 year-olds…

Why is this a research issue? Well, most RDD is done at calling centers, using auto-dial systems that automatically place hundreds of calls, handing off the call to a live interviewer when someone answers. But it is against the law to auto-dial cell phones. So RDD sampling systematically and by design excludes cell phone exchanges. In order to sample cell phones, a human has to manually dial the number, rendering the process several times more costly than RDD dialing of land lines…

…we recruit our panelists exclusively online (save for our “Calibration sample,” a control sample that is recruited randomly and offline, but not via RDD). This has many benefits, but for today’s purposes, it allows us to assure that we represent persons from all phone-status households.

A year ago we did a study of our U.S. panel, in order to understand both phone status composition of the panel, and how Web usage might vary by phone status. We found that 19% of our panelists were cell-only, and that another 23% were cell-primary; in other words, if we’d relied on RDD we would have totally excluded 19% of the panel, and dramatically under-represented another 23%.”

Measuring your Facebook make-up

You can’t deny the appeal of MySpace and Facebook if you’re trying to organize thought and action around an issue or project – unless you’re a corporate IT manager (but that’s a whole ‘nother story).

Social sites command large audiences whose members potentially can become your advocates. And by listening to their conversations, you can get ideas and feedback on how to improve your outreach and advocacy.

But how do you know that you’re making a real impact with Facebook, et al?  And how do you get the numbers and analysis that enable you to report back results that are meaningful and understandable to other, perhaps less Web 2.0-savvy, members of your organization?

At least some answers to those questions can be found in this article from The Measurement Standard.  It provides some simple benchmarks, as well as a framework for how to go about measuring your social site presence.

It also implicitly underscores a key point about social media’s impact on most organizations and their communicators.

Most groups, on most issues, aren’t going to move the needle by sheer numbers. The real value comes from the insights and analysis gained from relatively unfiltered access to people who’ve just proven they care enough, or are interested enough, to act. And action is the most important attribute you want from potential allies or coalition members when pressing for change.

In case you’re counting: eight more social media metrics

If I’m a little OCD-ish these days about measuring social media effectiveness, I’ve got a reason.

The decision makers I often work with are skeptical of Web 2.0.  They have at least a vague sense that it is a field of endeavor on which they need a substantial presence.  However, they often are of a generation for whom social media is not broadly familiar or attractive. Additionally, they often work on initiatives or in organizations where caution is a desirable and appropriate thing.

Besides their social media skepticism, they share another trait.  They’re almost universally driven as decision makers by cold, hard numbers, particularly if those numbers can be contextualized to show progress toward organizational goals.

So measurement rationale, strategies and methodologies are always on my reading list – at least until a short upcoming combo business/pleasure trip to Berkeley brings fiction back into my life for five days.

And that’s why what follows is eight more social media metrics rather than a review of the newest Great American Novel:

  1. Unique visitors — human log-ons minus duplications indicates reach.
  2. Duration — length of stay demonstrates reach and intensity of engagement.
  3. Inbound links — a high “link to” count demonstrates credibility and influence. 
  4. Downloads — video views, document downloads, etc.  are measures of engagement.
  5. User ratings — user-generated rankings such as star ratings and favorites show credibility, influence, reach and engagement.
  6. Conversation — chat, comments and conversation indicate engagement and impact.
  7. Return visits — frequency of return visits displays credibility, influence and “stickiness.”
  8. Next clicks — where visitors go next can indicate credibility and satisfaction (satisfied – may go to an unrelated topic site; dissatisfied – may go to a related, perhaps oppositional, site).
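None of these metrics requires exotic tooling. As a very rough illustration (the visit records and field names below are invented for the sketch, not pulled from any particular analytics product), here is how a few of them (unique visitors, duration and return visits) might be tallied from raw visit data:

```python
# Toy illustration: tally unique visitors, average duration and return-visit
# rate from a list of visit records. The records and field names are invented
# for this sketch, not taken from any real analytics export.
from collections import Counter

visits = [
    {"visitor_id": "a1", "seconds_on_site": 45},
    {"visitor_id": "b2", "seconds_on_site": 210},
    {"visitor_id": "a1", "seconds_on_site": 120},
    {"visitor_id": "c3", "seconds_on_site": 30},
]

visits_per_person = Counter(v["visitor_id"] for v in visits)

unique_visitors = len(visits_per_person)                                # metric 1: reach
avg_duration = sum(v["seconds_on_site"] for v in visits) / len(visits)  # metric 2: engagement
returners = sum(1 for n in visits_per_person.values() if n > 1)
return_rate = returners / unique_visitors                               # metric 7: "stickiness"

print(f"Unique visitors:   {unique_visitors}")
print(f"Average duration:  {avg_duration:.0f} seconds per visit")
print(f"Return-visit rate: {return_rate:.0%}")
```

Even numbers this simple, trended week over week, give skeptical decision makers the kind of context they keep asking for.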