I received a funny email the other day about excuses that school children use to explain why they haven’t done their homework. The examples were pretty creative: “my mother took it to be framed”, “I got soap in my eyes and was blinded all night”, and (an oldie but a goodie) “my dog ate my homework”. It’s a shame that such a creative approach yielded such a high rate of failure. Most of us learn at an early age that you can’t talk your way out of failure; success requires that you do the work. You’d also think that as people got older and more evolved, they’d realize that there are very few shortcuts in life.
I’m frequently asked to conduct best practice reviews of business intelligence and data warehouse (BI/DW) projects. These activities usually come about because either users or IT management is concerned with development productivity or delivery quality. The review activity is pretty straightforward: interviews are scheduled and artifacts are analyzed to review the various phases, from requirements through construction to deployment. It’s always interesting to look at how different organizations handle architecture, code design, development, and testing. One of the keys to conducting a review effort is to focus on the actual results (or artifacts) that are generated during each stage. It’s foolish to discuss someone’s development method or style prior to reviewing the completeness of the artifacts. It’s not necessary to challenge someone’s approach if their artifacts reflect the details required for the other phases.
And one of the most common problems I’ve seen with BI/DW development is the lack of documented requirements. Zip, zero, zilch, nothing. While discussions about requirements gathering, interview styles, and even document details occur occasionally, it’s the complete absence of documented requirements that’s the norm. I can’t imagine how any company allows development to begin without ensuring that requirements are documented and approved by the stakeholders. Believe it or not, it happens a lot.
So, as a tribute to the creative school children of yesterday and today, I thought I would devote this blog to some of the most creative excuses I’ve heard from development teams to justify their beginning work without having requirements documentation.
- “The project’s schedule was published. We have to deliver something, with or without requirements.”
- “We use the agile methodology; it doesn’t require written requirements.”
- “The users don’t know what they want.”
- “The users are always too busy to meet with us.”
- “My bonus is based on the number of new reports I create. We don’t measure our code against requirements.”
- “We know what the users want, we just haven’t written it down.”
- “We’ll document the requirements once our code is complete and testing is finished.”
- “We can spend our time writing requirements, or we can spend our time coding.”
- “It’s not our responsibility to document requirements; the users need to handle that.”
- “I’ve been told not to communicate with the business users.”
Many of the above items clearly reflect broken management or communication methods. Expecting a development team to adhere to a project schedule when they don’t have requirements is ridiculous. Forcing a team to commit to deliverables without requirements defies both conventional development methods and financial common sense. It also reflects leadership that focuses on schedules and utilization rather than business value.
A development team that is asked to build software without a set of requirements is being set up to fail. I’m always astonished that anyone would think they can argue and justify that the lack of documented requirements is acceptable. I guess there are still some folks that believe they can talk their way out of failure.
I always find it interesting when people pile onto the company’s latest and most popular project or initiative. People love to gravitate to whatever is new and sexy within the company, regardless of what they’re working on or their current responsibilities. There never seems to be a shortage of the “bright shiny object” syndrome – you know, organizational ADHD. This desire to jump on the bandwagon often positions individuals with limited experience to own and drive activities they don’t fully understand. The world of data governance is rife with supporters and promoters who are thrilled to be involved, but a bit unprepared to participate and execute. It’s like loading a gun and pulling the trigger before aiming – you’ll make a lot of noise and likely miss the target. If only folks spent a bit of time educating others about the meaning and purpose of data governance before they got started.
Let me first offer up some definitions from a few reputable sources…
“Data governance is a set of processes that ensures that important data assets are formally managed throughout the enterprise” (Wikipedia)
“The process by which an organization formalizes the ‘fiduciary duty’ for the management of data assets” (Forrester Research)
“…the overall management of the availability, usability, integrity, and security of the data employed in an enterprise” (TechTarget)
For those of you who have experience with data governance, the above definitions are unlikely to be much of a surprise. For the other 99%, there’s likely to be some head scratching. I actually think most folks who haven’t been indoctrinated into the religion of data have simply assumed that data governance is a new incarnation of yesterday’s data quality or metadata discussion. That probably shouldn’t be much of a surprise; the discussion of data inaccuracy and data dictionaries has gotten so much air time over the past 30 years that the typical business user probably feels brainwashed when they hear anything with “data” in the title. In fact, data governance may win the prize for being among the most misunderstood concepts within Information Technology.
Data governance is actually a very simple concept: it’s about establishing the processes for accessing and sharing data, and for resolving conflict when those processes don’t work.
A Data Governance initiative is really about instilling the concept of managing data as a corporate asset. Companies have standard methods and processes for asset management: your Procurement group has a slew of rules and processes to support the purchasing of office supplies; the HR organization has rules and guidelines for hiring and managing staff; and the finance organization follows “generally accepted accounting principles” to handle managing the company’s fixed and financial assets. Unfortunately, what we don’t have is a set of generally accepted principles for data. This is what data governance establishes.
The reason you see the term “process” in nearly every definition of data governance is that until you establish and standardize data-related processes, you’ll never get any of the work done. Getting started with data governance isn’t about establishing a committee; it’s about identifying the goals and the policies and processes that will direct the work activities. You can’t be successful in managing an asset if everyone has their own rules and methods for accessing, manipulating, and using it. This isn’t rocket science – geez – the world of ERP implementations and even business reengineering projects learned this lesson more than 10 years ago.
The reason to manage data as a corporate asset is to ensure that business activities that require data can use and access it in a simple, uniform, consistent manner. Unfortunately, even in the era of search engines, content indexing, data warehouses, and the Cloud, finding and acquiring data to support a new business need can be painful, time consuming, and expensive. Everyone has their own terms, their own private data stash, and their own rules dictating who is and isn’t allowed to access data. This isn’t corporate asset management; this is corporate asset chaos. A data governance initiative is one of the best ways to get started in managing data as a corporate asset.
It’s rare these days to find clients who haven’t already decided on a standard BI platform. Most of the new BI tool discussions we get into with clients are with companies who’ve decided that it’s time to broaden their horizons beyond Microsoft.
The dirty little secret in most companies is that the BI reporting team has morphed into a de-facto enterprise reporting team. Why is this?
When it comes to reporting, there’s a difference between the BI team and the rest of IT. The fact is that BI teams are successful not because of the infrastructure technologies, but because of the technologies in front of the users: the actual BI tool. To the end user, data visualization and access are much more important than database management and storage infrastructure. So when a new operational system is introduced, users expect the same functionality, look and feel as their other reports.
An insurance company we’re working with is replacing its operational systems. The company’s management has already decided not to use the vendor’s reports; they’re too limited and brittle. Management expects these reports to dovetail into the company’s information portal and work alongside the existing BI reporting. Companies typically refresh their operational platforms every seven to ten years. It’s now 2009, and for many companies the last operational refresh was a reaction to Y2K. It’s once again time to revisit those operational systems.
If you look at the challenges BI tool vendors are facing, growth in data warehousing is limited. Most companies have already standardized on a BI tool suite, and absent disruptive technology or new functionality, there’s little remaining opportunity for BI tools in the data warehousing space.
But for every data warehouse or data mart within a company, there are likely dozens of operational systems that users need access to. The opportunity for BI vendors now is delivering operational information to business users. This isn’t about complex analytics or advanced computation. This is the retrieval of operational information from where it lives.
Photo by jakeliefer (via Flickr)
By Evan Levy
Sometimes we find clients who overestimate their need for analytics. IT is often focused on using BI to analyze a problem exhaustively, when exhaustive analysis just isn’t necessary. Our analytics requirements frequently just aren’t that sophisticated.
Twenty years ago, WalMart knew when it needed to pull a product from the shelf. This didn’t require advanced analytics to drill down on category, affinities, seasonality, or the purchaser. It was simple: if the product didn’t sell after six days, free up the shelf space and move on. After all, there were other products to sell.
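A rule that simple can be expressed in a few lines of code, which makes the contrast with heavy analytics tooling plain. The function name and the six-day threshold below are illustrative assumptions for this sketch, not WalMart’s actual logic:

```python
# A minimal sketch of the simple shelf-space rule described above.
# All names and the six-day threshold are illustrative, not real retail logic.

def should_pull_product(days_on_shelf: int, units_sold: int,
                        threshold_days: int = 6) -> bool:
    """Pull the product if it still hasn't sold after the threshold number of days."""
    return days_on_shelf >= threshold_days and units_sold == 0

# A product past the threshold with no sales gets pulled; anything that sold stays.
print(should_pull_product(days_on_shelf=7, units_sold=0))  # True
print(should_pull_product(days_on_shelf=7, units_sold=3))  # False
```

No drill-down, no affinity model, no seasonality adjustment: just a threshold and an action, which was good enough to support the decision being made.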
Why does this matter? Because we get so wrapped up in new, more sophisticated technologies that we forget about our requirements. Sometimes we just need to know what the problem and the resulting action are. We don’t necessarily need to know the “why” every time. Often, all business users want is information that’s good enough to support the decision they need to make.