Blog: David Loshin

David Loshin

Welcome to my BeyeNETWORK Blog. This is going to be the place for us to exchange thoughts, ideas and opinions on all aspects of the information quality and data integration world. I intend this to be a forum for discussing changes in the industry, as well as how external forces influence the way we treat our information asset. The value of the blog will be greatly enhanced by your participation! I intend to introduce controversial topics here, and I fully expect that reader input will "spice it up." Here we will share ideas, vendor and client updates, problems, questions and, most importantly, your reactions. So keep coming back each week to see what is new on our Blog!

About the author

David is the President of Knowledge Integrity, Inc., a consulting and development company focusing on customized information management solutions including information quality solutions consulting, information quality training and business rules solutions. Loshin is the author of The Practitioner's Guide to Data Quality Improvement, Master Data Management, Enterprise Knowledge Management: The Data Quality Approach and Business Intelligence: The Savvy Manager's Guide. He is a frequent speaker on maximizing the value of information. David can be reached at loshin@knowledge-integrity.com or at (301) 754-6350.

Editor's Note: More articles and resources are available in David's BeyeNETWORK Expert Channel. Be sure to visit today!

Recently in Business Intelligence Category

Quick thought experiment: You are configuring a scorecard to report a rolled-up key performance indicator, or KPI. The scorecard starts out with a KPI based on a single measured metric, and you have a process in place to measure that metric, apply some weights to the raw score, and then present the weighted score in relation to previously reported scores for the same KPI.

As time progresses, the managers decide that the KPI can be improved by integrating a second measurement and weighted raw score. This is implemented, but here is the issue: with the additional measurement, the KPI is now a different indicator than it was before. So can the score associated with this new incarnation of the (same old) KPI be compared with the previously reported scores?

There are two sides to this question. On the one hand, if the KPI being reported is different from the one reported earlier, then a comparison is unreasonable, since the two are effectively measuring different things. On the other hand, one might anticipate that the weightings applied to the old and new raw scores would be adjusted to keep the result in scale with the previous set of scores.
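To make the comparability issue concrete, here is a minimal sketch of the roll-up described above. The metric names and weights are invented for illustration; the point is that adding a second weighted measurement shifts the rolled-up number even when underlying performance is unchanged.

```python
# Hypothetical sketch of a weighted KPI roll-up; metric names and
# weights are illustrative, not taken from any real scorecard.

def kpi_score(measurements, weights):
    """Roll up weighted raw scores into a single KPI value."""
    return sum(weights[name] * value for name, value in measurements.items())

# Version 1 of the KPI: a single measured metric.
v1 = kpi_score({"on_time_delivery": 0.92}, {"on_time_delivery": 100.0})

# Version 2: a second measurement is integrated. Even with identical
# delivery performance, the rolled-up number changes unless the
# weights are rescaled against the historical series.
v2 = kpi_score(
    {"on_time_delivery": 0.92, "order_accuracy": 0.85},
    {"on_time_delivery": 60.0, "order_accuracy": 40.0},
)

print(v1)  # 92.0
print(v2)  # 89.2
```

Whether 89.2 may honestly sit next to 92.0 on the same trend line is exactly the question posed to the readership below.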

So let's throw this out to the general readership: how do you suggest presenting the historical view of this KPI whose underlying measures are adjusted over time?


Posted January 5, 2009 9:59 AM
Permalink | No Comments |

I am currently at the Teradata Third-Party Influencers meeting. One observation based on the various architecture diagrams and case studies presented over the past two days is the wisdom of the Teradata/SAS strategic partnership. The partnership is a win for Teradata, which can provide the scalability and system support for analyzing massive data sets, and is likewise a win for SAS in opening up new markets for software sales. But the true winner is the shared customer community, which apparently is making significant strides in exploiting the computational power to speed up their SAS analytic applications, integrate real-time analytics into operational environments, and achieve a high return on their technology investments.


Posted July 22, 2008 4:06 PM
Permalink | No Comments |

First of all, the canonical example of the power of data mining and predictive analytics, the correlation of purchasing beer and diapers, is widely misused. The notion is that through analysis, one data miner discovered that males typically buy diapers along with beer; this is typically followed by an explanation of why males buy beer with diapers, and then a claim that putting beer and diapers together will increase overall sales of both items.

Anyone familiar with urban legends would immediately be dubious, and so I did a quick search and found a good analysis of the history (and relevance) of diapers and beer at http://www.dssresources.com/newsletters/66.php.
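For what it's worth, the "correlation" in the legend is usually an association-rule statistic such as lift. A toy sketch, with invented transactions, shows the kind of arithmetic involved:

```python
# Toy illustration of the association analysis behind the
# beer-and-diapers story; these transactions are invented.

transactions = [
    {"beer", "diapers", "chips"},
    {"beer", "diapers"},
    {"beer", "bread"},
    {"diapers", "milk"},
    {"milk", "bread"},
]

def support(itemset):
    """Fraction of transactions containing every item in the itemset."""
    return sum(1 for t in transactions if itemset <= t) / len(transactions)

# Lift > 1 suggests the items co-occur more often than chance alone
# would predict -- the (much-misused) "discovery" in the legend.
lift = support({"beer", "diapers"}) / (support({"beer"}) * support({"diapers"}))
print(round(lift, 2))  # 1.11
```

Of course, a lift above 1 in a sample says nothing about why the items co-occur, which is where the storytelling takes over.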


Posted July 1, 2008 1:35 PM
Permalink | No Comments |

Yet another quickie from the Independent Analyst Platform in Phoenix. I am listening to the Informatica presentation by Karen Hsu, and she is discussing the introduction of business process orchestration and workstreaming into Informatica's support platform. I could see an interesting workflow integration impact for extra-enterprise information quality management and information integration.

Something to explore a little more: Business Process Execution Language (BPEL).


Posted July 1, 2008 9:41 AM
Permalink | No Comments |

Next up at IAP: Composite Software, introducing a combination of a search capability and the use of a relatively sophisticated approach to profiling across federated data in order to present a portal for searching through collections of data and prioritizing views that can be materialized in real time. Noted expert Clive Finkelstein commented on the similarity with what used to be the Axio product from Evoke (now part of Informatica), but the interesting part is their use of the relationship discovery purely for searching.

Also: the product is an "appliance," meaning that it is packaged software on top of hardware. No details of the hardware were presented, but it probably uses a number of multi-core CPUs with a lot of memory (how else could they do the analysis?).

Seems like an extremely interesting product, especially in the context of supporting e-discovery.


Posted June 30, 2008 5:02 PM
Permalink | No Comments |

The second set of presentations at the Independent Analyst Platform was by Kevin Quinn and Vincent Lam, representing Information Builders and iWay Software, an owned subsidiary of Information Builders. Kevin's presentation screamed through the extremely versatile organization and presentation of reporting, analysis, and some of the ways that Information Builders' product landscape feeds into an organizational business productivity and improvement activity. Reliance on data integration that has evolved over 30+ years from within lends a degree of credibility to the claims of pervasiveness and scalability.

Vincent's presentation on iWay spanned a broad spectrum of data integration capabilities. One interesting note: many other BI vendors recognized the need for a data integration (or ETL) capability, went out and bought a vendor or two to fill that void, and then wriggled and writhed through the process of making the purchased tools work together. Information Builders has grown its own internal data integration suite, which obviates the need to make things work together, and that is an extremely appealing notion.


Posted June 30, 2008 2:01 PM
Permalink | No Comments |

I am sitting at McCarran airport waiting to board my flight back from TDWI, and am thinking about one trend I noticed at the vendor exhibits: there is a growing set of vendors selling high-performance columnar-based database systems. Interestingly, the common denominator is the positioning of the software as a means for virtualizing a data warehouse appliance.

Orienting the data in a columnar manner is nicely suited to analytic applications, so the clear opportunities for these kinds of products are partnered solution providers for specific types of analytics, or with data aggregators and providers to allow for data linkage and then analysis.
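A rough sketch of why column orientation suits analytic scans, using a made-up table: an aggregate over one attribute touches only that attribute's values, instead of dragging every record through memory.

```python
# Illustrative contrast of row vs. columnar layouts; the table and
# column names are invented for this example.

# Row-oriented: each record is stored together -- good for
# transactional access to whole records.
rows = [
    {"order_id": 1, "region": "East", "amount": 120.0},
    {"order_id": 2, "region": "West", "amount": 75.0},
    {"order_id": 3, "region": "East", "amount": 200.0},
]

# Column-oriented: each attribute is stored contiguously -- an
# aggregate over one column reads only that column.
columns = {
    "order_id": [1, 2, 3],
    "region": ["East", "West", "East"],
    "amount": [120.0, 75.0, 200.0],
}

# A typical analytic query: total order amount. In the columnar
# layout the other attributes are never touched.
total = sum(columns["amount"])
print(total)  # 395.0
```

Contiguous storage of like-typed values also compresses well, which is part of what lets these products position themselves as virtual appliances.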

Some of the vendors (or vendor reps) I bumped into over the past few days include ParAccel, Vertica, Sybase IQ, and Infobright. Kognitio, alternatively, is not columnar, but through data distribution across parallel systems it can also talk the virtual appliance talk.

One conclusion that can be drawn: an emerging market for high-performance analytics platforms with a low barrier to entry points toward cracking open the small/medium business market. One interesting thing to watch is how these guys partner with other BI vendors (e.g., OLAP, visualization, end-user analytics) to see who can put together a robust end-to-end BI solution suitably priced for the $50-$100 million company.


Posted February 21, 2008 12:50 PM
Permalink | No Comments |

One of the hazards of advocating techniques intended to improve business through better customer insight is the occasional question of faith: does a good business intelligence strategy and program necessarily equate to greater profits? Sometimes I wonder: if customer analysis and predictive analytic techniques work so well, then one who is knowledgeable in the area should be able to apply the ideas directly to his/her own business, right? Isn't this just another example of eating one's own dog food?

Here is what I mean: using our business intelligence and data analysis and data mining and predictive analytics, we claim that we can increase response, reduce costs, extend customer lifetimes, improve lifetime values, etc. So as an experiment, I should be able to start a retail business and accumulate a bunch of customers who will always be satisfied, will never threaten to cancel their service, and will always be just about to buy the products I have already determined they need. They will each be at the center of a huge sphere of influence, and I will exploit the viral marketing opportunities by turning every satisfied customer into a walking advertisement for my products and services. I will have optimized my product and service offerings so that as one product becomes obsolete, the customer is dying to upgrade to the next level, and I will time their releases so that no follow-on product cannibalizes its predecessors' sales.

The idea intrigues me: pick a product or service to sell and then apply the performance improvement techniques driven by business intelligence. Some thoughts:

I would want to pick a business that is recession-proof (plumber? pest exterminator? funeral director?).

I would have to sell a product that needs updating or replacement within a relatively short cycle. Selling replacement windows is probably out. Selling office supplies is more like it.

You get the picture: a broad market where some knowledge of the customer community can drive repeatable sales, and where customer data is easy to get, maintain, enhance, analyze, and exploit.

There are a lot of success stories out there for applications of BI to business productivity improvement. Yet that is not true across the board, and that probably means that owning the software doesn't necessarily imply achieving the benefits without a little hard work. Ultimately, the successful organizations profit from BI by adapting their business processes to exploit the knowledge discovered, and by putting practices in place to measure the value of each decision. Maybe that is what drives the belief in BI?


Posted February 11, 2008 7:21 PM
Permalink | 1 Comment |

I am probably one of the few people still sitting at my office desk in 2007, but I have one more thing to do before the remaining seconds in 2007's clock tick down to 0 - my BI predictions for 2008:

1) Greater emphasis on embedding and integration of business intelligence componentry into operational and analytico-operational applications. 2006-7 showed that there was a never-slaked thirst on the part of the mega-vendors to gobble up vendors providing the various component capabilities for end-to-end enterprise information management. As such, most of the big BI vendors are now absorbed into even larger monoliths, and the result is the further integration of their capabilities into a stacked set of offerings. Second, smaller BI tool vendors (open-source included) are demonstrating more and more OEM involvement with vertical solution vendors in the telecommunications, health care, financial, pharma industries, etc. Third, Microsoft is pulling desktop-users up the stack as more BI functionality is provided (consider the launch of Sharepoint, the inclusion of advanced analytics into Excel...). In addition, we are seeing more development of business applications that exploit BI within real-time (or near real-time) operations, such as embedded call center analytics. This points to a trend to move business intelligence out of its own arena and further its inclusion into the mainstream. My prediction: Across the board, vendor messaging will focus on performance management and integrated reporting within both "pull" and "push" frameworks, increased embedding of reporting and performance metrics into existing operational applications, and establishment of BI capabilities as a "dial-tone" service.

2) Recognition of the criticality of location: Mobile phones have it. Automobiles have it built-in. Now you can purchase one for less than $150. What is it? A device that can access and manipulate data from global positioning satellites. Of course, I am referring to the soon-to-be ubiquitous GPS device that hangs suctioned off a growing number of windshields. Most of the more sophisticated mobile devices have it also. In fact, the major geographic data companies are in the midst of acquisitions, one by mobile device manufacturer Nokia. This underscores a growing understanding of the nature of location, not just in determining how to get from here to there, but also information about all the points in between, as well as where things happen. Consider this: no matter what, almost every business activity takes place somewhere, and the more you know about how these activities reflect the location in which they take place, the better your operational decisions will be. My prediction: increased incorporation of geographic business intelligence into analytical applications and platforms.

3) Emergence of policy management solutions to supplement MDM: Many of the MDM case studies are largely siloed consolidation and management of a focused data collection. We have seen a number of customer data integration hubs, product information management systems, even geographic data hubs, largely for analytical uses. However, the value of MDM is largely increased when the master data is used in both operational and analytic environments. But to establish reasonable master data services for operational or transactional applications, the MDM systems must be able to demonstrate management of the relationships between master objects within the operational contexts, which are typically designed to address defined business process requirements. On the other hand, the combination of siloed data sets into a master environment introduces numerous data quality and business operations requirements to be imposed across the master data as well as the upstream data sources. This means that the policies guiding business operations and data quality management must be absorbed into the MDM environment and integrated into the information flows. My prediction: as the importance of policy management for enterprise information management is observed, vendors will introduce a "metadata-like" mechanism for managing collections of business rules that compose the business and information policies to which master data must comply.
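A purely speculative sketch of the kind of "metadata-like" rule registry predicted above: business and data quality policies kept as named, inspectable rules applied to master records, rather than buried in application code. The domain, rule names, and record fields here are all invented.

```python
# Hypothetical rule registry for master data policies; every name
# and field below is illustrative, not from any real MDM product.

policies = {
    "customer_master": [
        ("non_empty_name", lambda r: bool(r.get("name", "").strip())),
        ("valid_country", lambda r: r.get("country") in {"US", "CA", "MX"}),
    ],
}

def check(domain, record):
    """Return the names of the policies the record violates."""
    return [name for name, rule in policies[domain] if not rule(record)]

print(check("customer_master", {"name": "Acme", "country": "FR"}))
# ['valid_country']
```

Managing rules as data like this is what would let the same policies be imposed on the master environment and on the upstream sources feeding it.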

Well, now that the predictions are done, I guess I can wrap it up for 2007. Happy New year to everyone, and best wishes for a great 2008!


Posted December 31, 2007 2:10 PM
Permalink | No Comments |

Surely, you could not have been surprised to hear that IBM is buying Cognos. After months of transactions in which business intelligence vendors buy component vendors, only to be purchased by larger vendors as part of "industrial" demand-information programs, it looks like most, if not all of the major BI suite vendors are now absorbed (Cognos, Hyperion, Business Objects, as well as others such as Microsoft's acquisition of ProClarity). This does leave a few morsels left at the table, namely Microstrategy and Information Builders, although whether either is available or up for grabs is just grist for the rumor mill.

These announcements are always somewhat disruptive to the industry, since they shake up expectations about existing alliances and partnerships, and raise the question of whether a solution needs to be a monolithic, stacked one. So here is a quick thought: Large-scale (and pervasive) acquisitions are good for the industry, and are a little like forest fires in that they clear the way for smaller, innovative start-ups to create new tools to fill the void. We might expect that by next year at this time, we will see some interesting vendor offerings that can satisfy the growing market need.



Posted November 12, 2007 7:10 AM
Permalink | 2 Comments |