

Making Sense of Data



"Not everything that counts can be counted, and not everything that can be counted counts."
Sign hanging in Einstein's office at Princeton, Collected Quotes from Albert Einstein

IBM has estimated that the Predictive Analytics market will grow to $120 Billion by
2015 (IBM Report, March 20, 2012). This prediction relies on two assumptions:

• that this market can be separated from intersecting markets and quantified; and
• that the task of making sense of big data can be handled by Predictive Analytics.

If, instead, making sense of the deluge of Big Data requires collaborative intelligence, operating
on next-generation crowdsourcing platforms that evolve through use toward Ecosystem Utility, then
the capacity to exploit synergies between computer data processing and human pattern recognition
is needed.

[Figure: Collaborative Intelligence – Making Sense of Data]

[Figure: Predictive Analytics + Collaborative Intelligence – Making Sense of Data]


Collaborative intelligence harnesses diverse stakeholder perspectives (eyes on the world), skills,
priorities, and expertise for cross-disciplinary problem-solving. Data (objective) is interpreted
(subjective) as different eyes on the problem world interpret data differently, integrating
knowledge for problem-solving and innovation.

Whereas collective intelligence identifies similarities in aggregates to improve prediction,
collaborative intelligence characterizes next-generation social networks as crowdsourcing
ecosystems where disparities among agents strengthen the “genetic diversity” of the ecosystem.
Just as in nature each organism plays a unique role in its ecosystem, so collaborative
intelligence characterizes distributed, multi-agent networks where each agent is an autonomous,
unique contributor.
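
To make the contrast concrete, a minimal Python sketch follows, with invented agents and
estimates (none drawn from the source): collective intelligence averages similar judgments into a
single prediction, while collaborative intelligence keeps each agent's distinct, attributed
interpretation so that disparities can be integrated rather than averaged away.

```python
from statistics import mean
from collections import defaultdict

# Collective intelligence: aggregate similar numeric judgments into a single prediction.
estimates = [120, 95, 110, 105]            # four independent forecasts of the same quantity
collective_prediction = mean(estimates)    # similarities in the aggregate improve the estimate

# Collaborative intelligence: keep each agent's distinct interpretation, attributed and tagged,
# so disparities among agents stay visible and can be integrated rather than averaged away.
contributions = [
    {"agent": "ecologist",    "tag": "habitat",   "reading": "decline tracks wetland loss"},
    {"agent": "economist",    "tag": "incentive", "reading": "decline tracks a subsidy change"},
    {"agent": "statistician", "tag": "sampling",  "reading": "decline is partly a survey artifact"},
]

by_tag = defaultdict(list)
for c in contributions:
    by_tag[c["tag"]].append((c["agent"], c["reading"]))   # integrate, preserving who saw what

print(collective_prediction)   # 107.5
print(dict(by_tag))
```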

The term crowdsourcing was introduced by Jeff Howe in a Wired Magazine article (June 2006) and
defined as the act of sourcing tasks traditionally performed by specific individuals to a group
of people or community (a crowd) through an open call.

Responders, unknown to the task requester, can work on the task. Crowdsourcing is advantageous
for tasks where the perspectives of many people on a given question are needed. Thus far,
crowdsourcing has typically been used for small, rapidly performed tasks, e.g. identifying
galaxies in photographs, circling craters on Mars, providing opinions about user interface
design, translating text, arriving at consensus about how to tag image search results or products
in online catalogs, or testing the quality of responses to search queries.

Using next-generation crowdsourcing platforms to engage stakeholder expertise for more complex
tasks requires the capacity to aggregate and integrate contributions, and a platform that can
evolve, respond to feedback, credit contributors, and tag projects, tasks, and topics. Design is
complemented by analytics. For complex problem-solving, stakeholders with diverse expertise
should weigh in on a question, just as evolution in nature benefits from "genetic diversity."
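
As one possible reading of these requirements, the sketch below (hypothetical Contribution and
Project structures, not the platform described here) shows the minimal bookkeeping involved:
every contribution is credited to its contributor, tagged by topic, and aggregable across a
project.

```python
from dataclasses import dataclass, field
from collections import Counter

@dataclass
class Contribution:
    contributor: str     # credit stays attached to the content
    task: str
    tags: tuple          # topics, so contributions can be found and integrated later
    content: str

@dataclass
class Project:
    name: str
    contributions: list = field(default_factory=list)

    def add(self, c: Contribution) -> None:
        self.contributions.append(c)

    def credit(self) -> Counter:
        """Count contributions per contributor: the basis for crediting participants."""
        return Counter(c.contributor for c in self.contributions)

    def by_tag(self, tag: str) -> list:
        """Aggregate contributions that address the same topic."""
        return [c for c in self.contributions if tag in c.tags]

p = Project("water-quality")
p.add(Contribution("rivera", "sensor-review", ("nitrates",), "spike after rainfall"))
p.add(Contribution("chen", "field-survey", ("nitrates", "runoff"), "fertilizer runoff upstream"))
print(p.credit())                                     # Counter({'rivera': 1, 'chen': 1})
print([c.contributor for c in p.by_tag("nitrates")])  # ['rivera', 'chen']
```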

TRANSFORMING DATA INTO KNOWLEDGE
Michael Witbrock distinguishes traditional relational databases from what he calls
“intelligence knowledge bases”: although both are used to store and retrieve information, the
knowledge base offers far greater query flexibility.* His distinction is that in the former,
domain information is captured implicitly in the design, while in the latter, domain information
is explicitly represented. Knowledge bases, beyond simply retrieving previously stored data, can
infer knowledge from content. Witbrock focuses on ontologies that allow automated reasoning,
which is only one component of a smart system.
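
A toy illustration of the distinction, using invented facts and a single transitivity rule rather
than anything from Witbrock's system: a database-style lookup returns only what was stored, while
a knowledge-base-style query can also infer statements that were never stored explicitly.

```python
facts = {("Rex", "is_a", "dog"), ("dog", "is_a", "mammal")}

def retrieve(s, p, o):
    """Database-style retrieval: return only what was explicitly stored."""
    return (s, p, o) in facts

def infer(s, p, o):
    """Knowledge-base-style query: also derive unstored statements via a transitivity rule."""
    if retrieve(s, p, o):
        return True
    if p == "is_a":
        return any(infer(mid, "is_a", o)
                   for (subj, pred, mid) in facts
                   if subj == s and pred == "is_a" and mid != s)
    return False

print(retrieve("Rex", "is_a", "mammal"))  # False: this triple was never stored
print(infer("Rex", "is_a", "mammal"))     # True: it follows from the two stored facts
```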

The complementary component is collaborative intelligence, built on next-generation crowdsourcing
platforms: the capacity to aggregate human input from a range of pattern-recognizers, each reading
and interpreting data from the perspective of their unique expertise and priorities as
interpreters. The challenge is not simply “getting the ontologies right” so that an inference
engine can perform automated reasoning on the data presented, but building a system with the
capacity to evolve its ontologies in response to user input and user needs.
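
One way such evolution might look in miniature, with an assumed promotion threshold and invented
terms: tags that users keep applying, but that the ontology lacks, are promoted into it.

```python
from collections import Counter

ontology = {"sensor": "device", "gauge": "device"}   # child term -> parent term
pending_terms = Counter()
PROMOTE_AFTER = 3   # assumed policy: adopt a term once three users have applied it

def record_user_tag(term: str, proposed_parent: str) -> None:
    """Called whenever a user tags data; grows the ontology when a new term proves useful."""
    if term in ontology:
        return
    pending_terms[term] += 1
    if pending_terms[term] >= PROMOTE_AFTER:
        ontology[term] = proposed_parent             # the ontology evolves from user input

for _ in range(3):
    record_user_tag("drone", "device")
print(ontology)   # {'sensor': 'device', 'gauge': 'device', 'drone': 'device'}
```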

Predictive Analytics complements Collaborative Intelligence in making sense of data.
Collaborative intelligence emerges from Design Ecosystems, smart learning networks that connect
humans and devices. Collaborative intelligence tracks indicators that drive effective
decision-making in developing new products and programs by

• engaging stakeholders and gathering their contributions to learn their priorities and expertise;
• monitoring use of the system to capture new learning and development opportunities; and
• using Value Stream Mapping to track improvements in the performance of monitored projects and the Return-on-Investment of the testbed as collaborative intelligence evolves (a rough calculation is sketched below).
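
The Value Stream Mapping bullet can be sketched roughly as follows; the cycle times, day value,
and testbed cost are invented figures, not measurements from any deployment.

```python
# Compare project cycle times before and after the testbed, and relate the gain to its cost.
baseline_days = [40, 38, 44, 41]       # cycle times of comparable projects before the testbed
monitored_days = [33, 31, 35, 30]      # cycle times of projects tracked on the platform
value_per_day = 2_000.0                # assumed value of one project-day saved
testbed_cost = 25_000.0                # assumed cost of running the testbed

mean_before = sum(baseline_days) / len(baseline_days)
mean_after = sum(monitored_days) / len(monitored_days)
days_saved = (mean_before - mean_after) * len(monitored_days)
roi = (days_saved * value_per_day - testbed_cost) / testbed_cost

print(f"days saved: {days_saved:.1f}, ROI: {roi:.0%}")   # days saved: 34.0, ROI: 172%
```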

Day-to-day online project tracking gathers data for analytics. Key Performance Indicators (KPIs),
project manager queries, and system usage data and patterns make it possible to gauge and respond
to user needs, gathering feedback for improved performance and effective outreach. Models and
simulations test alternative assumptions, predicting the likely outcomes of alternative courses
of action, showing which assumptions, uncertainties, and risks those outcomes are most sensitive
to, and enabling risk and tradeoff analysis (a small simulation sketch follows the list below).
As crowdsourcing evolves from early platforms toward the capacity to aggregate data, it becomes
possible to integrate diverse interpretations of that data and converge toward collaborative
intelligence. Embedded Continual Assessment (ECA), built into online project tracking, can

• gauge risks where early intervention is needed;
• identify skill-sets and gaps where learning and outreach are needed;
• address complex project management issues where focus group sessions are needed;
• offer Just-in-Time Knowledge Access, providing capacity for the system to evolve online;
• default to the process in play, using Embedded Continual Assessment to gather feedback about
where specific interventions and process changes are needed;
• capture signals to trigger interventions that drive continuous improvement; and
• converge toward collaborative intelligence, linking people and knowledge in a system with the
capacity to respond to continual feedback.
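
The modeling-and-simulation step mentioned above can be illustrated with a small Monte Carlo
sketch; the toy model, the two plans, and their assumption ranges are invented for illustration.

```python
import random

random.seed(0)

def net_benefit(adoption_rate, unit_cost, demand):
    """Toy model of a program's net benefit under one set of assumptions."""
    return demand * adoption_rate * (30.0 - unit_cost) - 5_000

def simulate(assumptions, runs=10_000):
    """Sample each uncertain assumption, returning the median outcome and the 5% downside."""
    results = sorted(
        net_benefit(
            random.uniform(*assumptions["adoption_rate"]),
            random.uniform(*assumptions["unit_cost"]),
            random.uniform(*assumptions["demand"]),
        )
        for _ in range(runs)
    )
    return results[len(results) // 2], results[int(len(results) * 0.05)]

# Two alternative courses of action, each expressed as ranges over uncertain assumptions.
plan_a = {"adoption_rate": (0.2, 0.5), "unit_cost": (4.0, 6.0), "demand": (800, 1_200)}
plan_b = {"adoption_rate": (0.3, 0.4), "unit_cost": (3.0, 7.0), "demand": (900, 1_100)}

for name, plan in (("A", plan_a), ("B", plan_b)):
    median, downside = simulate(plan)
    print(f"plan {name}: median {median:,.0f}, 5% downside {downside:,.0f}")
```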




* Witbrock, M. (2007). Knowledge is More than Data: A Comparison of Knowledge Bases with
Databases. Report, Cycorp Inc., January.





