Mapping Corporate Twitter Account Networks Using Twitter Contributions/Contributees API Calls

Savvy users of social networks are probably well versed in the idea that corporate Twitter accounts are often “staffed” by several individuals (often identified by the ^AB convention at the end of a tweet, where AB are the initials of the person wearing that account’s hat (^)); they may also know that social media accounts for smaller companies may actually be operated by a PR company or “social media guru” who churns out tweets on their behalf via Twitter accounts operated in the company’s name and in support of its online marketing activity.

Rooting around the Twitter API looking for something else, I spotted a GET users/contributees API call, along with a complementary GET users/contributors call; these return “an array of users (i.e. Twitter accounts) that the specified user can contribute to”, and the accounts that can contribute to a particular Twitter account, respectively.
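Bearing in mind that these endpoints may well have changed since, here’s a minimal sketch of the sort of unauthenticated request URLs the calls use; the base path and parameter name are assumptions based on the version 1 REST API conventions of the time, not a guaranteed-current endpoint:

```python
# Build request URLs for the Twitter v1 contributees/contributors calls.
# The base path and screen_name parameter follow the v1 API conventions;
# treat them as assumptions rather than a currently working endpoint.
API_BASE = "https://api.twitter.com/1"

def contributees_url(screen_name):
    """URL for the accounts the named user can contribute to."""
    return f"{API_BASE}/users/contributees.json?screen_name={screen_name}"

def contributors_url(screen_name):
    """URL for the accounts that can contribute to the named account."""
    return f"{API_BASE}/users/contributors.json?screen_name={screen_name}"

print(contributees_url("twitterapi"))
```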

I didn’t know this functionality existed, so I put out a fishing tweet to see if anyone knew of any accounts running this feature other than the twitterapi account used by way of example in the API documentation. A response from Martin Hawksey (on whom I’m increasingly reliant for helping me keep up with, and get my head around, the daily novelties that the web throws up!) suggested it was a feature that had been quietly rolling out to premium users: Twitter Starts Rolling Out Contributors Feature, Salesforce Activated. Via his reading of that post (I think), Martin suggested that a Bing(;-) search for “via web by” would turn up a few likely candidates, and so it did…

So why’s this interesting? Because given the ID of an account that a company uses for corporate tweets, or the ID of a user who also contributes to a corporate account via their own account, we might be able to map out something of the corporate comms network for an organisation operating multiple accounts (maybe a company, but maybe also a government department, local council, or lobbyist group), or the client list of a “social media guru” operating various accounts for different SMEs.

Anyway, here’s a quick script for exploring the Twitter contributors/contributees API. The output is a GraphML file we can visualise in Gephi.
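The script itself isn’t reproduced here, but for a flavour of the output format, here’s a hypothetical standard-library-only sketch (the actual script’s internals may differ) that serialises contributor→account edges as the sort of minimal GraphML document Gephi can open:

```python
import xml.etree.ElementTree as ET

def contributions_graphml(edges):
    """Serialise (contributor, account) pairs as a minimal directed GraphML doc."""
    root = ET.Element("graphml", xmlns="http://graphml.graphdrawing.org/xmlns")
    graph = ET.SubElement(root, "graph", id="G", edgedefault="directed")
    # Declare each distinct account once as a node...
    nodes = set()
    for src, dst in edges:
        nodes.update((src, dst))
    for n in sorted(nodes):
        ET.SubElement(graph, "node", id=n)
    # ...then add a directed edge for each contribution relationship.
    for src, dst in edges:
        ET.SubElement(graph, "edge", source=src, target=dst)
    return ET.tostring(root, encoding="unicode")

doc = contributions_graphml([("themattharris", "twitterapi")])
print(doc)
```

Gephi handles the layout from a file like this; node sizing and colouring are then done in the tool itself.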

And here are a couple of views over what it comes up with. Firstly, a map bootstrapped from the @twitterapi account:

Twitter contributors network

And here’s one I built out from HuffingtonPost:

HuffingtonPost twitter contributors network

So what do we learn from this? Firstly it’s yet another example of how networks get everywhere. Secondly, it raises the question (for me) of whether there are any cribs in other multi-contributor social network apps (maybe in tweet metadata) that allow us to identify originating authors/users and hence find a way into mapping their contribution networks.

As well as building out from an account name to which users contribute, we can bootstrap a map from a user who is known to contribute to one or more accounts (code not included in Github gist atm).

So for example, here’s a map built out from user @VeeVee:

Twitter contributors network built out from @veevee

I guess one of the next questions from a tool building point of view is: is there a more reliable way of getting cribs into possible contributor/contributee networks? Another is: are any other multi-contributor services (on Twitter or other networks, such as Google+) similarly mappable?

PS Just noticed this: Google to drop Google Social API. I also read on a Google blog that the Needlebase screenscraping tool Google acquired as part of the ITA acquisition will be shut down later this year…

Author: Tony Hirst

I'm a Senior Lecturer at The Open University, with an interest in #opendata policy and practice, as well as general web tinkering...

17 thoughts on “Mapping Corporate Twitter Account Networks Using Twitter Contributions/Contributees API Calls”

  1. Big red circles around things always catch my attention, which is why the TechCrunch article stuck in my mind (I only went looking for it to try and work out what was going on). On a further trawl for more info it was interesting to see the Salesforce Twitter account had gone back to ^AB type signatures, using their ‘Social Media Monitoring and Engagement’ platform radian6 for tweeting. That the search term is ‘via web by’ suggests that, almost 2 years on, Twitter hasn’t got around to adding a ‘post as a contributor’ part to their API (I imagine this has left some businesses scratching their heads).

    The HuffingtonPost web is interesting. Given that it appears updates are via the web, why bother with a network of nameless accounts?

    [Didn’t know Social API was closing – balance is restored ;)]


  2. Hi there,
    Goldsmiths CAST student here. I get the following error when I run your script:

    Traceback (most recent call last):
    File “”, line 15, in

    Am I doing something wrong here?


  3. @sam how are you running the script… it’s all a bit clunky (didn’t I warn you?!;-)

    An example way of calling it is:
    python -contributeto twitterapi -depth 5

    PS I also updated the gist just now to the copy I currently have running locally, just in case..

  4. @tony Thanks for this, the updated script seemed to work, however, now I only have one Node when I open the graph.graphml into Gephi. Should I change something in the script?

  5. @tony. Not sure what’s happening, as it looks like I’m putting in the required data in lines 6-9:

    parser = argparse.ArgumentParser(description='Mine Twitter account contributions')
    parser.add_argument('-contributeto', nargs='*', help="MichelleObama,whitehouse,datastore,castlondon,GdnDevelopment")
    parser.add_argument('-contributeby', nargs='*', help="danmcquillan,zia505,datastore,castlondon,AlexGraul")

    But I get this at the end of my python output and no nodes:

    fetching fresh copy of fetched url:
    {'userlist': [], 'graph': , 'contributors': {'twitterapi': []}, 'accountlist': ['twitterapi'], 'contributees': {}}
    contributors {'twitterapi': []}
    contributees {}
    accountlist ['twitterapi']
    userlist []

    1. @sam are simplejson and urllib2 loading? From a console, run Python (just type: python); then:
      import simplejson
      import urllib2
      Or start putting print statements everywhere to try to track what’s going on;-)

    2. @sam: also note that: ‘help=”A space separated list of account names (without the @) for whom you want to find the contributors.”)’ is a statement that appears when you call the help file relating to the script from the command line ( python -h ), not a “PUT SPACE SEPARATED VALUES HERE” instruction to the user.

      The ‘parser’ commands in the script set up a parser that python uses when you execute a command from the command line. So from the command line, if you type something like:

      python -contributeto twitterapi

      the script knows about -contributeto
      whereas it doesn’t know about:
      python -someRandomCommandLineArgument somerandomvalue

      (If you look up the python documentation – use your favourite search engine to search for: python argparse – it will explain what argparse is about.)

      Also note that the script accepts space separated multiple values [help=”A space separated list of account names (without the @) for whom you want to find the contributors.”] so you can run things like: python -contributeto twitterapi starbucks

      If you try comma separated vals, it probably won’t work…

      It’s also worth bearing in mind that most accounts aren’t associated with contributions to/by other accounts…
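      For what it’s worth, here’s a stripped-down sketch of that argparse behaviour (a toy parser, not the actual script):

```python
import argparse

# Mimic the script's parser: nargs='*' gathers space separated values into a list.
parser = argparse.ArgumentParser(description="Mine Twitter account contributions")
parser.add_argument("-contributeto", nargs="*",
                    help="A space separated list of account names (without the @)")

# Space separated values each become a list element...
args = parser.parse_args(["-contributeto", "twitterapi", "starbucks"])
print(args.contributeto)  # ['twitterapi', 'starbucks']

# ...whereas a comma separated string arrives as a single (useless) element.
args = parser.parse_args(["-contributeto", "twitterapi,starbucks"])
print(args.contributeto)  # ['twitterapi,starbucks']
```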

    3. @sam oh yes, one final thing… the script uses unauthenticated twitter api calls, so it maxes out quite quickly (150 calls an hour). I should probably print an error message when this happens, but I don’t (feel free to add it into the script). A quick way to check (though it uses an API call) is just to call the API from your browser eg paste:
      into your browser location bar. If you get a message along the lines of “Error – too many calls/API rate limit exceeded, back off for an hour..” then you’ve maxed out for a bit… You can get more calls per hour using OAuth/authenticated calls to the API, but that’s more code, more things to go wrong, etc etc.
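      By way of illustration, here’s a hypothetical retry wrapper for the rate-limit case (the error codes and the injectable opener are assumptions for the sketch; the script itself just fails silently):

```python
import time
import urllib.request
import urllib.error

def fetch_with_backoff(url, opener=urllib.request.urlopen, max_retries=2, pause=5):
    """Fetch a URL, pausing and retrying when a rate-limit style HTTP error comes back."""
    for attempt in range(max_retries + 1):
        try:
            return opener(url).read()
        except urllib.error.HTTPError as e:
            # 420 was Twitter's "Enhance Your Calm" rate-limit code; 429 is the modern one.
            if e.code in (400, 420, 429) and attempt < max_retries:
                time.sleep(pause)  # rate limited: back off, then retry
            else:
                raise
```

Passing the opener in as an argument keeps the backoff logic testable without hitting the live API.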

  6. @tony Almost got it working – just a few glitches which I am ironing out through printing – will give it another try first thing in the morning – my eyes are going a bit matrix at the mo’. Thanks for your help!

      1. @tony Nothing wrong with your script – just one of my libraries not properly installed. All sorted now. Thanks for your help. :-)

        1. @sam Ah, thanks… maybe I need to write a diagnostic script, testing for the libraries I commonly use, that folk can run; would that be useful?

Comments are closed.