Category Archives: business-analytics

IT Tech Trends Survey: what does the future hold for business analytics?

The latest IT Tech Trends Survey is now open.

We’re looking to hear what you feel are the major drivers and motivators in the tech industry right now. This year we’ve devoted a considerable chunk of the survey to investigating business analytics – its role in the workplace and where we currently are on the adoption cycle. Typical questions:

  • Where is it used in the organization?
  • Where does your organization need to be?
  • What are the concerns you face?

Think you have the answers? Want to share your perspective? Complete the Tech Trends survey now!

In 2010, more than 50% of the respondents told us mobile application development will overtake other types of development by 2015. And nearly 70% felt cloud computing will become the primary way organizations will acquire IT in the next five years.

So, what exactly happens to the results of this survey? Last year the survey garnered press coverage across the globe, allowing IBM to highlight its offerings for the top trends identified.

We also use the survey here at developerWorks to help shape the strategy for our content and site. It helps us ensure we’re delivering relevant content that best fits the needs of the 4 million visitors we receive each month.

Take the survey now and pass the word along!

Whatever happened to artificial intelligence?

At a recent business analytics event, Lennart Frantzell demonstrated how (at least at a practical level) there has been a shift in business computing from Artificial Intelligence (AI) to Business Analytics:

Using a healthcare example, Lennart explained how 20 years ago AI was used to form a diagnostic method for treating snakebites in Australia. The approach was to look at the cognitive process doctors go through when treating snakebites and build a system of complex algorithms to mimic this process. The emphasis was on the algorithm – not the underlying dataset. Any sub-optimal decisions made by doctors (say, as a result of bias in their individual experience) would also be reflected in the system.

Fast forward 20 years. In order to treat HIV in Ethiopia, business analytics is being used to mine more than 41,000 HIV treatment histories. The EuResist system takes data from a new patient and matches this against patients who have been successfully treated in the past, thereby determining the most appropriate treatment. The treatment consists of a cocktail of drugs, and the proportion of each drug in the cocktail can affect how successful the treatment will be – which adds a layer of complexity to determining the ideal solution. What success are they seeing on this project? Over 78% accuracy, outperforming 9 out of 10 human experts.

The key difference here compared to the snakebite project is the focus on data. The EuResist project pulls data from disparate databases into a flexible DB2 platform that can be analyzed using business analytics. The algorithms are simpler than those used in AI, but the results can be impressive because the reliance is on exposing trends in the data.
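At its simplest, the matching step described above can be sketched as a nearest-neighbor lookup over historical records. This is a purely hypothetical simplification – the feature names, data, and scoring are invented here for illustration, and the real EuResist system uses far richer statistical models:

```python
# Hypothetical sketch: match a new patient against historical treatment
# records and suggest the drug cocktails that worked for similar patients.
# Feature names and data are illustrative, not from the EuResist system.

def similarity(a, b):
    """Fraction of shared feature values between two patient records."""
    keys = set(a) & set(b)
    return sum(a[k] == b[k] for k in keys) / len(keys) if keys else 0.0

def recommend_cocktails(new_patient, histories, k=3):
    """Return cocktails from the k most similar successfully treated cases."""
    successes = [h for h in histories if h["outcome"] == "success"]
    ranked = sorted(successes,
                    key=lambda h: similarity(new_patient, h["features"]),
                    reverse=True)
    return [h["cocktail"] for h in ranked[:k]]

histories = [
    {"features": {"viral_load": "high", "strain": "C"},
     "cocktail": ("drugA", "drugB"), "outcome": "success"},
    {"features": {"viral_load": "low", "strain": "B"},
     "cocktail": ("drugC",), "outcome": "success"},
    {"features": {"viral_load": "high", "strain": "C"},
     "cocktail": ("drugD",), "outcome": "failure"},
]

print(recommend_cocktails({"viral_load": "high", "strain": "C"}, histories))
```

In practice the similarity would be computed over viral genotype and treatment variables, and the drug proportions themselves would be part of the search space – but the emphasis on data over algorithm is the same.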

The separation of the algorithms and the data also makes it easier to create products that can be implemented with minimal customization, compared to large AI systems that need to be custom-built. For example, the underlying technology and methodology used to treat HIV in Ethiopia can be applied to asthma in Western Europe.

As we continue to produce more data (just take a look at the 389,000 datasets the US government makes publicly available), business analytics can play a significant role in turning this data into insight and solving problems that were previously out of the reach of artificial intelligence systems.

See more on this business analytics presentation.

Learn about IBM’s Business Analytics solutions.

Can Twitter sentiment analysis predict outcomes (like the Irish election)?

When I was growing up, election coverage was characterized by an exuberant political pundit leaping around large cardboard charts of the UK with the kind of coloring normally reserved for the weather report. The ‘exit polls’ we were familiar with only updated about every four hours and only included those people who were prepared to be cornered by the political researchers hanging around near polling stations.

Fast forward to 2011.

We currently have a general election unfolding in Ireland. The Irish online news site The Journal has been crawling over Twitter, that political social network du jour, using the conversations that happen there to predict which way the election will sway. And so far the headline graphic looks like this:


It’s a great case study in the current status of analytics and throws up some wonderful points that have relevance beyond the Irish political scene.

Data is everywhere

Researchers no longer need to go in search of data. Whilst I don’t deny the added color and in-depth insight from questionnaires, focus groups and other tools used by human researchers (whether in the political or commercial realm), there is rich data out there that you don’t have to force out of people. Social networks like Twitter and Facebook give us access to voluntarily-provided information on social groups. We no longer have to bug people to provide us with data.

Growing importance of social media analysis

Let’s face it, we’ve seen a huge growth in the use of social networks over the last two years (not sure why I pick that time frame, maybe it’s tied up with when Twitter/Facebook buttons first started appearing in ads and on TV). We’ve taken our social lives online. And the beauty of being online is that everything can be tracked. We leave traces, and when you aggregate all of these, patterns start to appear. Is this level of analysis creepy? The privacy issue definitely has to be considered; however, I’d contend that the information is so much more valuable in aggregate (effectively anonymized) than it is at the individual level.

Sentiment analysis can throw a curve-ball

Here is what the volume of conversations around the Irish election shows us:


Now look at the sentiment:


Fine Gael have by far the most conversations. However, much of this conversation is not positive. I’d say from a marketing perspective this is something we need to pay more attention to. Far too often we’re still using raw numbers as a determinant of campaign success. We need to add the sentiment layer on top to understand more of the nature of the conversations we ignite.
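To illustrate the idea of layering sentiment on top of raw volume, here is a minimal, hypothetical sketch using a tiny hand-made word list. Real sentiment analysis (including whatever The Journal's tracker uses) relies on trained classifiers or large lexicons; the word lists and tweets below are invented:

```python
# Hypothetical sketch: report both mention volume and net sentiment per
# party, rather than volume alone. Word lists and data are illustrative.

POSITIVE = {"great", "win", "support", "love"}
NEGATIVE = {"scandal", "fail", "angry", "cuts"}

def score_tweet(text):
    """Crude lexicon score: positive words minus negative words."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def party_sentiment(tweets_by_party):
    """Return (mention volume, net sentiment) for each party."""
    return {party: (len(tweets), sum(score_tweet(t) for t in tweets))
            for party, tweets in tweets_by_party.items()}

tweets = {
    "Fine Gael": ["great rally today", "angry about the cuts", "another scandal"],
    "Labour": ["love the manifesto"],
}

print(party_sentiment(tweets))
```

Even in this toy form, the output makes the point: a party can dominate the volume count while the sentiment layer tells a very different story.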

Presentation is everything

The first image I highlight in this post is so immediately descriptive. Newspapers have been producing wonderful infographics for decades. In the business world we still end up with reports that look more like this:

(not meaning to pick on anyone, this is just an image that came up in a search)

Our story travels so much further when we take the time to package it up. Business analytics will only move further into the mainstream if the findings are presented in an easily-consumable fashion.

So, having stuck my neck out in favor of The Journal’s Twitter Tracker, I’ll have to come back next week with some post-election analysis. In the meantime, back to Twitter to watch this election unfold.

IBM Watson: counting down to the Jeopardy challenge

I’ve covered this before, but there is palpable excitement in the air as mere minutes remain before an IBM computer competes against Jeopardy all-time champions Ken Jennings and Brad Rutter (see here for local US show times).

Delaney Turner over on the IBM Software Blog does an excellent job of running through the different ways you can connect and learn more about every aspect of IBM Watson and this fascinating project.

For those interested in some of the specs of Watson, check out this technical post on Wikibon.

On February 9, NOVA aired this breathy background piece on Watson’s four-year build-up to this event, with an in-depth look at the technology used and the team that created it.

I thought it worth sharing the chapters:

Chapter 1: smartest machine on earth – preview

Watch the full episode. See more NOVA.

Chapter 2: the challenge

Watch the full episode. See more NOVA.

Chapter 3: programming intelligence

Watch the full episode. See more NOVA.

Chapter 4: Watson’s audition

Watch the full episode. See more NOVA.

Chapter 5: machine learning

Watch the full episode. See more NOVA.

Chapter 6: playing the game

Watch the full episode. See more NOVA.

For more information, follow IBM Watson on Twitter.  

What would YouTube want with a recommendation engine?

TechCrunch recently reported that Google (as the owner of YouTube) is looking at the purchase of Twitter-based movie recommendation site Fflick. Judging by the Fflick site today, this is more than just idle rumor:


What are the implications for YouTube?

On the one hand it signals a more concerted effort from the beefy video sharing site to play nicer with the other social networks in the playground (or at least hold hands with Twitter whilst working out a relationship with arch-rival Facebook). It could mean we see the kind of functionality in YouTube present on other video networks like Livestream: a display of all the Twitter backchannel related to a piece of content. For instance:


See the running commentary down the right? This is more ‘chat’ than ‘comments’ with tight integration with Facebook/Twitter. 

On the other hand, it also opens up the possibility for YouTube to start mining user data to offer recommendations. How useful is this on a video site? Just look at the Netflix story. The popular US video rental service has made a big deal of its ability to guess what movie you want to add to your rental wishlist. It bases its recommendations on what you’ve seen in the past, how you rated it, what others like you have seen (and a bunch of other variables, even including the day of the week on which you’re viewing the site!). Netflix prizes this technology enough to have made it a central part of the site navigation, and even paid a team from AT&T $1 million for coming up with a winning algorithm in 2009.

YouTube has a much bigger collection of content and a wealth of behavioural data from its huge viewing figures. However, it generally knows less about its visitors than Netflix does, as the site doesn’t require you to log in to engage. Potentially, that’s where the Twitter piece comes into play: you give up some of this information about yourself each time you tweet. Fflick provides the service to tie the tweet back to the video; it can also pick through your tweets and use them to determine what content you might like to see next.
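The core of that recommendation idea can be sketched as simple collaborative filtering: infer a viewer's tastes from the videos they have engaged with, then suggest videos favored by viewers with overlapping tastes. Everything below (names, data, the similarity measure) is illustrative, not how Fflick, Netflix, or YouTube actually implement it:

```python
# Hypothetical sketch of user-based collaborative filtering: recommend
# videos liked by users whose tastes overlap with yours. Data is invented.

def jaccard(a, b):
    """Overlap between two sets of liked videos (0.0 to 1.0)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(user, liked_by_user):
    """Score unseen videos by the similarity of the users who liked them."""
    mine = liked_by_user[user]
    scores = {}
    for other, theirs in liked_by_user.items():
        if other == user:
            continue
        sim = jaccard(mine, theirs)
        for video in theirs - mine:          # only videos the user hasn't seen
            scores[video] = scores.get(video, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)

liked = {
    "alice": {"cats", "drones"},
    "bob": {"cats", "drones", "cooking"},
    "carol": {"opera"},
}

print(recommend("alice", liked))
```

Real systems add ratings, time decay, and (as Netflix showed with its prize-winning ensemble) far more sophisticated models, but the principle of mining behavioral overlap is the same.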

This kind of application of predictive analytics is hot right now in the social media space. Foursquare is believed to be using predictive analytics to keep Facebook at bay in the location-based-services sector.

Social media is making us increasingly impatient and we are starting to demand more from our interfaces. Add to that the growing market for hand-held devices that offer precious little space for content, let alone navigation, and you have a compelling case for services using whatever technology they can to pinpoint what you probably want to do next, and serve that up. If they don’t engage, the next video-sharing site is only a short URL away.

More on the Fflick acquisition

Lotusphere 2011: building collaborative business intelligence with Cognos 10

Over on the IBM Software Blog, Cognos Product Marketing Manager Brendan Farnand explains just why business intelligence solutions from Cognos have a place at the Lotusphere social business event:

"Everyone involved in a decision or a solution needs to know who else is involved, what transpired before they were asked to contribute and what other ideas are out there for that decision or solution."

Business intelligence shouldn’t happen in isolation. As I’ve pointed out before, many reports, from sales figures to customer service levels, have added value if key constituents can comment on the results and define follow-up actions. Pairing key functions from the Lotus suite with Cognos Business Intelligence allows exactly that:


As I won’t be at Lotusphere this year, I’m looking forward to following Brendan on Twitter.

If you can’t make it to Lotusphere, check out this Tech Talk webinar where Brendan highlights Cognos’ built-in collaboration and social networking functionality.

IBM Watson: is artificial cleverness the same as AI?

Let’s start with the obvious: this is the opinion of one mere human. Someone who would fail miserably at the US quiz show Jeopardy: it’s that ‘start-with-the-answer’ approach that just screws me up every time. Not being a native of this soil, I claim it’s just not part of my DNA.

But an IBM supercomputer called Watson (which was indeed conceived on US soil) appears to be performing awfully well at the contest, and as such is attracting a lot of media attention, much of it centered on the whole field of artificial intelligence (AI) and IBM’s involvement in this area.

As PC World reports, Watson overcame two Jeopardy all-time champs in a practice round recently. How does it do this? The silicon contestant has read countless encyclopedias and other tomes, contains natural language processing capabilities and can even determine how confident it is in its response. Couple this with industry-leading computational power and you have one efficient competitor.

IBM has a history of pitting computers against humans on the cerebral battlefield. In the late nineties, Deep Blue defeated chess grandmaster Garry Kasparov (although Kasparov disputes that he was indeed beaten). However, the team behind the Watson project are quick to point out that the level of computing required to deal with the high-level semantic reasoning they are up against is different to the logic-bound nature of chess. Chess is a game of limited moves on an 8×8 grid; Jeopardy is a game of infinite words.

I can’t help but think back to my Philosophy of Mind classes where we studied the Turing test – that black-box approach to measuring AI proposed by Alan Turing in the 1950s. Sometimes called the ‘imitation game’, the concept was that if someone could put questions to a black box and not discern whether a computer or a person was inside, you could attribute to the machine intelligence on a par with that which we humans enjoy. This Stanford article does a good job of discussing the Turing test and its objections in some detail.

One objection that stands out is that of origination: could a computer do more than just perform tasks (or deal with questions) set by humans? In the case of Watson, it was a team of people within IBM Research that came up with the idea to build a supercomputer to compete in Jeopardy. The motivations? Showcase technology. A fun work-related project. Team-building. The question is whether a computer could have had the ‘wisdom’ (foolhardiness) to come up with the idea of the project in the first place.

I’d suggest this level of decision-making is a quantum leap beyond the semantic analysis of IBM Watson.

Jonah Lehrer, in the provocatively-titled Proust Was a Neuroscientist, uses the filter of art to illustrate what neuroscience is uncovering about the complexity of our intelligence. Within the poetry of Walt Whitman you find the idea that feelings and emotions are born in our bodies, not our minds:

"Antonio Damasio, a neuroscientist who has done extensive work on the etiology of feeling, calls this process the body loop. In his view the mind stalks the flesh; from our muscles we steal our moods."

You can’t separate our thought process from our bodily existence. This could be a problem for a computer lacking flesh and bones.

I don’t just bring this up in the vein of being a contrarian or mean-spirited towards what is quite an astounding piece of computing. I think there is a message here that relates to the technology at the core of Watson: business analytics.

Decision-making within the enterprise happens at different levels and business analytics doesn’t necessarily apply at all of those. For instance, business analytics is ideal at helping a marketer pinpoint prospects who might be interested in a particular offer. It’s less good at determining whether that same marketer should run a conference program if they’ve never run one before. We’re still not close to being able to automate that intuitive part of the decision-making process in business.
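That first kind of decision, pinpointing prospects for an offer, can be sketched as a simple propensity score. The features and weights below are invented for illustration; a real system would fit them from past response data with something like logistic regression:

```python
# Hypothetical sketch: rank prospects for a marketing offer by a simple
# propensity score. Features and weights are illustrative, not from any
# real product; in practice they would be learned from response data.

WEIGHTS = {
    "opened_last_email": 2.0,
    "visited_pricing_page": 3.0,
    "recent_purchase": 1.5,
}

def propensity(prospect):
    """Sum the weights of the behavioral signals this prospect shows."""
    return sum(w for feature, w in WEIGHTS.items() if prospect.get(feature))

def shortlist(prospects, top_n=2):
    """Return the top_n prospects most likely to respond to the offer."""
    return sorted(prospects, key=propensity, reverse=True)[:top_n]

prospects = [
    {"name": "A", "opened_last_email": True, "visited_pricing_page": True},
    {"name": "B", "recent_purchase": True},
    {"name": "C"},
]

print([p["name"] for p in shortlist(prospects)])
```

Scoring like this automates well precisely because the decision is repetitive and data-rich; the conference-program question has neither property, which is the point.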

Last year I sat in a discussion around decision management and heard from a product marketing manager that a barrier to adoption of business analytics systems is the fear among decision-makers that this technology will take away their jobs (the very same people who normally sign the check on these kinds of purchases). This would suggest we in the field of business analytics need to do a better job of explaining that some decisions can be automated and others cannot. Business analytics is a set of tools that we humans can use to make smarter decisions, but like all tools, it has limits.

So whilst IBM Watson shows what computers can achieve in the human realm, it’s worth bearing in mind (no pun intended) that computers pose little threat to the human realm. The Jeopardy contest coming up on February 14 is a battle of one computer against two humans. If Watson wins, we’re not talking about the dawn of a new era where Jeopardy is played out by tin robots bearing the IBM insignia. We are talking about a triumph of a technology that has applications in healthcare and customer service and beyond – a technology that remains a tool in the hands of us mere humans.

More about IBM Watson, including some wonderful videos on its construction

(Image courtesy of The Doctor Fun Archive)

The growing role of predictive analytics in data center management

Crisis management is generally a costly business. Switching gears away from your forward-thinking strategy and pulling in resources to deal with issues not on the radar can really stymie growth and efficiency.

Especially in the IT management space.

In the past, the role of the IT manager was largely reactive: as soon as a problem occurs, they would have to jump in and manage the crisis. This was, and continues to be, a costly exercise for IT departments – often costing organizations millions of dollars annually.

Investment in predictive analytics has the potential to drastically reduce the surprises faced by IT management. In a recent article in Enterprise Networking Planet, Drew Robb shows how predictive analytics can be used to monitor networks across enterprises and mine behavioral patterns to get out in front of potential issues like usage spikes and plan for them before they occur. As IT moves towards virtualization and cloud models which allow for flexibility in terms of resource allocation, predictive analytics really comes into its own as a tool to help manage these spaces. For instance, with a cloud-based installation, resources can be deployed or changed in minutes, rather than weeks. If you have multiple users and applications on the installation, predictive analytics can be used to determine where resources should be apportioned prior to any impact on service levels.
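As a toy illustration of that "get out in front of the spike" idea, here is a minimal sketch that fits a linear trend to recent utilization samples and flags when forecast demand will cross a capacity threshold. All the numbers and thresholds are invented; real data center tools model seasonality, multiple metrics, and far richer patterns:

```python
# Hypothetical sketch: extrapolate recent resource utilization with a
# least-squares trend line and flag a future capacity breach early enough
# to provision more resources. Data and thresholds are illustrative.

def linear_forecast(samples, steps_ahead):
    """Fit a least-squares line through (index, value) and extrapolate."""
    n = len(samples)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(samples) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, samples))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return intercept + slope * (n - 1 + steps_ahead)

cpu_utilization = [52, 55, 59, 62, 66, 70]    # % over the last six intervals
forecast = linear_forecast(cpu_utilization, steps_ahead=6)

if forecast > 85:                             # illustrative capacity threshold
    print(f"Provision more capacity: forecast utilization {forecast:.0f}%")
```

The value of the cloud model described above is exactly that acting on such a forecast takes minutes rather than weeks, so the prediction horizon needed is much shorter.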

Maintenance isn’t the only area where predictive analytics plays a role.

Steven Sams, IBM’s vice president of Global Site and Facilities Services, points out that by 2012 global data storage capacity will need to be 6.5 times what it is today (fueled largely by cloud-based internet services). He recently explained to Forbes’ Quentin Hardy how predictive analytics can be used by data center managers to plan for this growth:

"Tech planners need the same kind of big pattern-finding software more commonly used by designers, chief executives, and finance types. Among the new analytic offerings from IBM are cash flow-based scenario software, for figuring out whether to build, consolidate, or do nothing"

Obviously these decisions can have serious implications on business operations and costs. Sams highlights a Chinese bank that has managed to go from 38 to 2 data centers with a cost saving of $180 million a year using this technology. To better serve this market, IBM has launched a predictive analytics tool for use by the Global Business Services division on data center engagements.

As we move into 2011 and beyond, predictive analytics can play a major role in the way IT departments manage data centers and their operations. Given what’s at stake, expect to see a lot more interest in this area.