The latest edition featured a new venue and a new structure. We dubbed this session "inunData," and we set out to explore the latest developments in targeted marketing, cognitive computing, and so-called "big data." These days, it seems as though we are all swimming in data. We invited three presenters to discuss the latest developments in data analysis, and the latest tools being employed to make sense of, not just collect, actionable information about the connected universe.
Ken Hertz made the point that "We've gone from largely connectable (everyone had internet access, but nobody had a smart phone) to totally connected (everyone has a smart phone, all the time) in only about six years." When being connected is no longer an intermittent activity, we acquire a staggering amount of continuous observational data, the opportunity to be led (or misled) by that data becomes overwhelming, and some very enticing predictive technologies have begun to emerge.
As Edd Dumbill wrote recently in Forbes:
The mainstream media has adopted a definition of big data that's broadly synonymous with 'analytics.' We're now in the age of social networking, pervasive mobile phones, and ubiquitous network-connected sensors. We end up in a place where the available data is too big, unstructured or fast-moving for our conventional approaches to work. Hence the emergence of big data technologies, and their support for uncertain and evolving business processes, where analytics and probabilistic understandings are often the chief ways of deriving benefit from the data. Business needs to become deeply familiar with the new canvas on which they're painting, the potentials and caveats of using data.
Our first speaker was Will Page, the Director of Economics at Spotify. Page was previously the Chief Economist at PRS for Music, where he published pioneering work on the so-called "Long Tail," the broader value of the music industry, and Radiohead's In Rainbows, about which he famously asked whether legal free could compete with illegal free.
Since joining Spotify, Page has revealed more insights into how hits happen in a digital age, uncovering encouraging trends in both advertising and the economy. Spotify has launched in 55 markets around the world, and recently announced 40 million active users and over 10 million paying subscribers. That made Spotify a great place to start when discussing digital music success.
Page's presentation opened with a brief overview of the global state (or stasis) of the music industry. Music sales outside of Japan (the #2 music market, behind the U.S.) have been flat for the last two years.
But beyond the simple top-line sales figures, Page worked out some fascinating formulae on the economics of digital music, specifically how many digital song streams were needed to generate the same amount of revenue as one album sold. Based on his calculations, in 2013, 2,000 music streams were the equivalent of one album sold. For the first quarter of this year, the value of a stream had increased. The new formula, according to Page, is 1,500 streams = one album sold.
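Page's conversion ratios can be expressed as a small calculation. The sketch below is purely illustrative: the function name and the 3-million-stream example are mine, but the per-album ratios (2,000 streams in 2013, 1,500 in early 2014) are the ones Page cited.

```python
# Album-equivalent units from raw stream counts, using Page's ratios:
# 2,000 streams per album sold in 2013, 1,500 in the first quarter of 2014.
STREAMS_PER_ALBUM = {2013: 2000, 2014: 1500}

def album_equivalents(streams: int, year: int) -> float:
    """Convert a stream count into album-equivalent sales for that year."""
    return streams / STREAMS_PER_ALBUM[year]

# The same 3 million streams are "worth" more albums in 2014 than in 2013,
# reflecting the increased per-stream value Page described.
print(album_equivalents(3_000_000, 2013))  # 1500.0
print(album_equivalents(3_000_000, 2014))  # 2000.0
```

The shrinking denominator is the whole story here: each stream converts to a larger fraction of an album sale than it did a year earlier.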
Page then moved on to discuss how old media has begun working with (and not against) new media. He cited some examples from the Netherlands and the U.K. But he believes that everyone has a stake in this. "If you partner with a telco," he said, "it can raise the entire industry, not just one company."
Page closed by revisiting a favorite topic of many in the entertainment industry, Chris Anderson's The Long Tail. Anderson's thesis challenged the traditional 80/20 rule, under which 80 percent of sales (of albums, movies, etc.) come from just 20 percent of titles, arguing that the "tail" of niche titles would claim a growing share of sales.
Page proposes a corollary he calls "the emerging 95/5 rule." His contention is that we're moving towards a place where the hits are becoming even more concentrated. He pointed out that increasing a consumer's choice doesn't mean an increased number of options chosen. Paradoxically, it may mean that choice-besieged consumers are more likely to fall back to the comfort of the familiar.
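Concentration claims like 80/20 or 95/5 are easy to check against play data. The sketch below uses an entirely hypothetical catalogue (the numbers are mine, not Page's) to show how one would measure what share of total plays the top slice of titles captures.

```python
def top_share(plays, fraction):
    """Share of total plays captured by the top `fraction` of titles."""
    ranked = sorted(plays, reverse=True)
    k = max(1, round(len(ranked) * fraction))
    return sum(ranked[:k]) / sum(ranked)

# Hypothetical 100-title catalogue: a handful of hits, a long tail of niches.
catalogue = [9500, 9000, 500, 400, 300] + [10] * 95

# In this made-up data, the top 5% of titles take roughly 95% of plays,
# i.e. a hit distribution consistent with Page's "95/5" contention.
print(f"top 5% of titles: {top_share(catalogue, 0.05):.0%} of plays")
print(f"top 20% of titles: {top_share(catalogue, 0.20):.0%} of plays")
```

Running the same function over real play counts for successive years would show whether the head of the curve is actually tightening, which is the empirical core of Page's argument.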
And there's more to it than paralysis by analysis. "Accelerators, such as Facebook, Twitter, Instagram, and others," Page asserted, "have favored the head of the curve."
The second presenter of the morning was Stephen DeAngelis, President and CEO of Enterra Solutions. Enterra is a cognitive computing firm focused on next-generation big data-enabled predictive analytics, cognitive computing, and consumer insights for CPG companies and government agencies.
DeAngelis is a technology and supply chain entrepreneur and patent holder with more than 25 years of experience in building, financing and operating technology and manufacturing companies. In 2012, Forbes magazine recognized him as one of the "Top Influencers in Big Data." In 2014, he became a contributing member of Wired magazine's Innovation Insights blog.
DeAngelis presented a primer on cognitive computing: using computers to catch signals, collect data, and learn from it. He articulated his view of the evolution of analytics:
• Analytics 1.0 - Diagnostic - Collecting basic facts and doing preliminary analysis
• Analytics 2.0 - Predictive - Data expansion
• Analytics 3.0 - Prescriptive - Self-learning and insights-based actions
This third iteration, the one we're going through now (prescriptive analytics), is where companies "need to combine semantic and mathematical analyses," according to DeAngelis. This is cognitive computing, where systems are trained to "use the reasoning capability to constrain the mathematical capability."
DeAngelis stressed that this goes well beyond collecting data that merely places individual consumers into a pre-determined box. "Using these techniques," DeAngelis said, "we can get beyond the stereotypes and increase the satisfaction of the consumer."
He continued, "It involves a qualitative constraint on the front, and a qualitative understanding at the end."
The final segment of Big Bang 12 was led by Damien Patton, Founder and CEO of Banjo, which was recently named "The Fastest Growing Engineering Company in Silicon Valley" by Wired Magazine.
Banjo's impressive growth is perhaps surpassed only by the eclectic nature of Patton's resume. His background includes seven years in NASCAR, two tours of duty in Desert Storm, and even time spent as a crime scene investigator.
Patton described how events around the world are often just "a bunch of unstructured stuff." But with Banjo, he and his team try to make sense of these things. "We're trying to boil the ocean," he said. "We're trying to make sense of all the information. There's just too much data."
So Banjo was established to create a platform that would instantly update users on news and events happening around the world. It constantly monitors social network data, with special emphasis on location information. On the entertainment side, they've geofenced every venue in the world.
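Geofencing a venue reduces, in its simplest form, to a distance check against a point and radius. The sketch below is my own illustration, not Banjo's implementation: the venue coordinates and the 200-meter radius are hypothetical, and a circular fence via the haversine formula is the simplest possible approach (real systems often use polygons).

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def in_geofence(point, venue, radius_m):
    """True if a geotagged post falls inside a circular venue fence."""
    return haversine_m(*point, *venue) <= radius_m

# Hypothetical venue fence: Madison Square Garden, 200 m radius.
msg = (40.7505, -73.9934)
print(in_geofence((40.7509, -73.9930), msg, 200))  # post from inside: True
print(in_geofence((40.7580, -73.9855), msg, 200))  # Times Square: False
```

Once every venue has a fence like this, any geotagged post can be bucketed to a venue in constant time per fence, which is what makes location the organizing axis for the social data Banjo monitors.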
Banjo learns about the normal patterns of activity of various locations. As a result of this learning process, it can then detect disruptions in the normal rhythm of activity, and alert users to breaking news and unusual events.
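The "disruption in the normal rhythm" idea is, at its core, anomaly detection against a learned baseline. The sketch below is an assumption-laden illustration, not Banjo's method: the hourly post counts are invented, and a simple standard-deviation threshold stands in for whatever models a production system would use.

```python
from statistics import mean, stdev

def is_anomalous(history, current, threshold=3.0):
    """Flag a count that deviates more than `threshold` standard
    deviations from a location's historical baseline."""
    mu, sigma = mean(history), stdev(history)
    return abs(current - mu) > threshold * sigma

# Hypothetical hourly post counts for one geofenced location.
baseline = [12, 9, 11, 10, 13, 8, 11, 10, 12, 9]

print(is_anomalous(baseline, 11))   # a normal hour: False
print(is_anomalous(baseline, 140))  # a sudden spike: True
```

A spike like the second call is the kind of signal that would prompt an alert: the location's rhythm has broken, so something newsworthy may be happening there.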
One of Patton's important insights was that "Mobile data and web data are not the same thing, at all." He explained that the location data from mobile content is a key to sensing the changes in what happens around us.
Patton brought up the issue of privacy unprompted. He cited the EU Court's recent decision on Google and Internet privacy. "We [Banjo] make a distinction between public and private sharing," he said. "When it comes to private sharing, we use the data for signaling, but we anonymize it. Our engineers are not allowed to do a damned thing until they take privacy into account."
In answer to a request to hear a specific example of how to use Banjo, Damien cited a TV station that uses it to see if a breaking news story is worth spending the money to send out their news helicopter or other remote units.
Patton brought the day's discussions full circle with a quote from an executive at NBC, one of Banjo's key clients: "Banjo doesn't replace old-fashioned good journalism." It's a tool. It's an innovative tool that takes enormous quantities of data, learns from it, then uses the lessons learned to separate the signals from the noise.