Collective Intelligence: The Genomics of Crowds
Copyright by Stephan Klaschka 2010-2024
From my series on how to build a successful BRG.1
Group intelligence beats individual brilliance – and businesses are willing to pay for the crowd’s wisdom in the social sphere. MIT’s ‘genetic’ model allows combining social ‘genes’ to harness the wisdom of crowds successfully and sustainably, for example in scientific research or in business/employee resource groups.
We use collective intelligence every day
Whenever we face a big decision, we turn to our friends, our family, or our confidants. We seek information, guidance, advice, confirmation, or an alternative perspective. Whether we make a life decision (partnership, job, picking a school, etc.), a purchasing decision (house, car, mobile phone), or a less monumental decision (which movie to watch, which restaurant to go to), we decide more confidently and feel better informed after reaching out to our personal network.
What we are doing is tapping into the collective intelligence, knowledge, or wisdom of a crowd that we know and trust: we are ‘crowd-sourcing’ on a small scale. We do this because we instinctively know that focused collective intelligence exceeds the intelligence of any single individual.
What is collective intelligence or the ‘wisdom of the crowd’?
Wikipedia, the iconic product of global collaboration and collective knowledge, puts it succinctly:
“The wisdom of the crowd is the process of taking into account the collective opinion of a group of individuals rather than a single expert to answer a question. A large group’s aggregated answers to questions involving quantity estimation, general world knowledge, and spatial reasoning has generally been found to be as good as, and often better than, the answer given by any of the individuals within the group. An intuitive and often-cited explanation for this phenomenon is that there is idiosyncratic noise associated with each individual judgment, and taking the average over a large number of responses will go some way toward canceling the effect of this noise.”
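To make that noise-canceling intuition concrete, here is a minimal simulation (my own toy illustration, not part of the Wikipedia article): a thousand people guess a quantity, each with their own random error, and the crowd’s average lands far closer to the true value than the typical individual guess.

```python
import random

random.seed(42)
true_value = 120          # e.g., the number of jelly beans in a jar
crowd_size = 1000

# Each person's guess = truth + idiosyncratic noise (some guess high, some low)
guesses = [true_value + random.gauss(0, 30) for _ in range(crowd_size)]

crowd_estimate = sum(guesses) / len(guesses)
typical_individual_error = sum(abs(g - true_value) for g in guesses) / len(guesses)

print(f"Crowd average: {crowd_estimate:.1f} (error {abs(crowd_estimate - true_value):.1f})")
print(f"Typical individual error: {typical_individual_error:.1f}")
```

With zero-mean noise, the individual errors largely cancel out in the average – which is exactly the effect the quote describes.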
Scaling up to a ‘crowd’
When we read a movie review and rating on Netflix or customer ratings of a product on Amazon, for example, we tap into a larger, anonymous crowd. On the other end, Netflix and Amazon know how to get people like you and me to deliver free content (reviews, ratings) that runs their business.
So, let’s take this to a level where it really gets interesting for you! How can you get a crowd to do your work? How do you build a framework in which strangers work on your business problems and deliver quality results for free?
Genetics of Collective Intelligence
MIT professor Tom Malone dissects the mechanics of collective intelligence in his groundbreaking article in the MIT Sloan Management Review (April 2010). The MIT Center for Collective Intelligence set out to understand this matter better and identified a number of building blocks, or ‘genes’, that need to come together to engage and tap into the ‘wisdom of crowds’ successfully and sustainably.
Since these ‘genomic combinations’ are not random at all, we can also combine genes deliberately to build a collective intelligence system. Depending on what you want to achieve, the genes can be combined into a model that suits your specific purpose. This is ‘social genomics’ made easy, and you don’t need a biology major!
Interestingly, this social genomics can be used independently for social projects you have in mind, but also in relation to Employee or Business Resource Groups (ERG/BRG). – The common link lies in the organizational design, which is similar to the generic BRG/ERG business model discussed previously. Thus, collective intelligence systems need to address the same questions as a business model (a small sketch of how these ‘genes’ combine follows the list):
Strategy, or the goal: what needs to be accomplished?
Staffing, or the people: who does the work? Are specific individuals doing the work, or is there collaboration within a more or less anonymous crowd?
Structure and processes, or the work: how is the work organized and conducted? How is the product created, and how are decisions made?
Rewards, or the motivation: why do they do it? What are the incentives, and what is the measure of success?
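To give these questions a more tangible shape, here is a small sketch; the field names and the two example ‘genomes’ are my own shorthand for the what/who/how/why questions above, not MIT’s official taxonomy:

```python
from dataclasses import dataclass

@dataclass
class CollectiveIntelligenceGenome:
    """One 'genome': a combination of design choices for a crowd system."""
    what: str  # strategy: what needs to be accomplished?
    who: str   # staffing: specific individuals or an anonymous crowd?
    how: str   # structure and processes: how is the work organized and decided?
    why: str   # rewards: money, love, or glory?

# Two hypothetical gene combinations for illustration
wikipedia_like = CollectiveIntelligenceGenome(
    what="create encyclopedia articles",
    who="open crowd of volunteers",
    how="collaborative editing with community review",
    why="love and glory",
)

prize_contest_like = CollectiveIntelligenceGenome(
    what="find the best predictive model for a data set",
    who="open crowd of competing teams",
    how="contest ranked by a single quality metric",
    why="money and glory",
)
```

Different answers to the same four questions yield very different systems – an encyclopedia and a prize contest are built from different gene combinations.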
Motivation is Key
It is crucial to get the motivation right, i.e. why people engage and keep coming back to contribute more to the cause or project. It comes down to finding the basic drivers of human motivation; these explain why people invest so much of their time and resources in crowd-sourcing.
The famous $1 million Netflix Prize was a 5-year open competition for the best collaborative filtering algorithm to predict user ratings for films, based on previous ratings. To win, a team had to improve on Netflix’s own algorithm by 10%. The million-dollar reward, announced in 2006, gives a flavor of just how valuable the crowd’s wisdom is to a company! Contrary to common belief, money is not always the driver. If it were, how would you explain the popular virtual ‘farming’ games on Facebook, for example, where players pay hard cash for virtual goods?
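To give a flavor of what ‘collaborative filtering’ means in practice, here is a minimal sketch of one simple approach – a user-based, similarity-weighted predictor on a toy ratings table. It is my own simplified illustration, not Netflix’s algorithm or the winning entry:

```python
# Toy ratings: user -> {movie: rating}. A missing rating is predicted from
# other users' ratings, weighted by how similarly they rated shared movies.
ratings = {
    "alice": {"A": 5, "B": 3, "C": 4},
    "bob":   {"A": 4, "B": 2, "C": 4, "D": 5},
    "carol": {"A": 1, "B": 5, "D": 2},
}

def similarity(u, v):
    """Crude similarity: inverse of the mean absolute rating difference on shared movies."""
    shared = set(ratings[u]) & set(ratings[v])
    if not shared:
        return 0.0
    mean_diff = sum(abs(ratings[u][m] - ratings[v][m]) for m in shared) / len(shared)
    return 1.0 / (1.0 + mean_diff)

def predict(user, movie):
    """Similarity-weighted average of other users' ratings for the movie."""
    num = den = 0.0
    for other, theirs in ratings.items():
        if other != user and movie in theirs:
            w = similarity(user, other)
            num += w * theirs[movie]
            den += w
    return num / den if den else None

print(round(predict("alice", "D"), 2))  # ~4.12: bob's similar taste outweighs carol's
```

The Netflix Prize models were vastly more sophisticated, but the core idea is the same: predict what a user will like from the collective behavior of many other users.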
In the more clandestine intelligence community, recruiting an operative plays to four motivational drivers: Money, Ideology, Coercion (or Compromise), and Ego – easy to remember as ‘MICE’.
The drivers for attracting collective intelligence are a bit different, as Tom Malone found out. Nonetheless, there are parallels: he calls the key motivators Money, Love, and Glory.
Real-World Examples
Everyone knows Wikipedia, arguably the best-known social collaboration and crowd-sourcing project, thriving on an intellectual competition for Love and Glory – there are no monetary incentives for the authors.
We also see how powerful Glory and Honor are in areas away from the mainstream, where you may not expect to find crowd-sourcing and gamification: in scientific research. The following two impactful examples reflect successful implementations in which large crowds collaborate and compete to solve scientific problems:
Seth Cooper’s AIDS research challenge on the (former) FoldIt online platform challenged players to find the best way of folding a specific protein. We will not dive into the science behind it and its medical significance; here are the details for those who are interested in digging deeper: MedCrunch Interview with Seth Cooper at TEDMED 2012. For our purpose, it is enough to note that a relevant scientific problem in AIDS research, which had remained unsolved within the scientific community for a decade, took the crowd only 10 days to solve!
You may find it surprising that there was no monetary incentive involved whatsoever – yet FoldIt attracted over 60,000 players(!) from around the world. The winner of the AIDS-related challenge was later recognized and honored at TEDMED 2012. It was not a Nobel-prize laureate from an Ivy-League institution but a laboratory assistant from Britain – who, well, enjoys folding proteins and collaborating on the puzzle with like-minded people from other countries. This is the power of Love and Glory!
Another example is the ongoing “Predicting a Biological Response” competition on Kaggle.com, a geeky online platform for people who like developing predictive models. My friend and colleague David Thompson (whom you might remember from Difference affords opportunity – social media leverage by an ERG) of Boehringer Ingelheim (a major yet privately held bio-pharmaceutical company) designed this scientific competition to find the best bio-response model for a given data set of scientific relevance.
The challenge offers a $10,000 prize for the winning model and lesser amounts for the models coming in second and third. The monetary award, together with a three-month time limit, helps speed up the process and keep up the competitive pressure. Last time I checked, 467 competing teams had already submitted 4,300 entries, with another month to go. The quality of each model is summarized in a single number (‘log loss’), so competitors can compare their results directly and immediately; the same quantifier determines the winner.
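For those curious what that single number looks like, here is a minimal sketch of how a log-loss score is typically computed; the function and the tiny example data are my own illustration, not taken from the Kaggle challenge:

```python
import math

def log_loss(y_true, y_prob, eps=1e-15):
    """Average negative log-likelihood of the true labels under the
    predicted probabilities; lower is better, 0 is a perfect score."""
    total = 0.0
    for y, p in zip(y_true, y_prob):
        p = min(max(p, eps), 1 - eps)  # clip to avoid log(0)
        total += y * math.log(p) + (1 - y) * math.log(1 - p)
    return -total / len(y_true)

# Hypothetical example: true biological responses and two competing models
y_true  = [1, 0, 1, 1, 0]
model_a = [0.9, 0.2, 0.8, 0.7, 0.1]  # confident and mostly right
model_b = [0.6, 0.5, 0.5, 0.6, 0.4]  # hedging near 0.5

print(round(log_loss(y_true, model_a), 3))  # ~0.203 -> lower, i.e. better
print(round(log_loss(y_true, model_b), 3))  # ~0.584 -> higher, i.e. worse
```

Because every submission is reduced to this one score, the leaderboard gives competitors instant, unambiguous feedback – a simple but effective ingredient of the competitive pressure described above.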
Note that Kaggle participation is not driven primarily by the monetary incentive; otherwise, the number of participants would correspond directly with the amount of money offered for a particular challenge, which is not the case. Thus, participants are in it more for the challenge and the fun than for the cash. (If you are a participant and disagree, please correct me!)
On the other hand, don’t underestimate the business value of the gamification of science either: another ongoing competition on Kaggle offers a serious $3 million reward!
The bottom line
Social collaboration, crowd-sourcing, and collective intelligence all depend on humans collaborating to make things happen. What holds true in the real world seems to hold true in the virtual world as well: the magic formula is all in the genes…
Stay tuned for my next post: Imitators beat Innovators!
From my series on how to build a successful BRG (= Business Resource Group), i.e. a business-focused ERG (= Employee Resource Group), first published on OrgChanger.com.