Why the CIO Should Heed Stephen Covey’s 7 Habits

Stephen Covey, author of the highly acclaimed management book “The 7 Habits of Highly Effective People,” passed away recently. I’m dedicating a post to him because his book and teachings were memorable during my early management training, and his simple but profound principles are particularly timely for CIOs today. When I look back at Covey’s legacy, two overarching themes come to mind: working effectively with others and time management. Both are interrelated and incredibly relevant […]

If you liked this, you might also like:

  1. 5 Steps to Brighten Shadow IT

Entities and Entitlement – The Bigger Picture of Identity Management

In this fourth “Entities and Entitlement” video, we explain the bigger picture – why identity is not just about people. It’s about all things – we call them “entities” – that we want to identify in our digital world. An identity ecosystem d…

Sequestration Planning May Illuminate Value Engineering via Enterprise Architecture

The next five months portend a spectacle of US Congressional battles waged ahead of the pending “sequester” – automatic, mandatory federal government spending reductions of about $1 trillion over nine years in non-exempt, discretionary appropriations, set to take effect January 2, 2013. Forward-thinking planners in government IT organizations, in large programs that depend upon IT, and in the Systems Integration (SI) community are likely to dust off their Enterprise Architecture skills to analyze budget-cut implications across their IT investment portfolios, and possible cost-savings opportunities to offset them. A methodical, EA-guided approach that both assesses impacts and adjusts spending priorities, while illuminating new areas of savings, is a sure route to mitigating serious risks and delays to the delivery of critical citizen services.

Whether you’re dusting off existing EA artifacts or need a very rapid, optimized route to constructing initial enterprise IT models, the driving principle at this time will be rapid, absolute reduction in complexity with a clear line of sight to cost savings. “Complexity” here simply means an inefficient or needlessly large volume of time and resources applied to delivering IT solutions – time spent re-engineering processes, building redundant interfaces and monitors, and installing hardware and software piecemeal.

Driving complexity out of engineered systems, and thereby introducing cost savings, is the central tenet of “Value Engineering” – “the optimization of a system’s outputs by crafting a mix of performance (function) and costs”. Essentially, deliver the same capabilities with better value by driving down the cost to build and/or operate. Section 52.248-1 of the FAR (Federal Acquisition Regulation) describes the “Value Engineering Clause” that is inserted into many large Federal IT contracts – enabling the contractor to propose changes to the system being developed (a “Value Engineering Change Proposal”, or VECP). If the proposal is accepted, the actual or collateral savings derived by the government (through cost modification to the contract) can be shared with the contractor. It’s a win-win opportunity for the government, system beneficiaries and the contractor community to discover and propose engineering changes that will lower costs, yet still deliver the same or better results.
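The Value Engineering idea of “the same function at lower cost” is often summarized as the classic value ratio V = F / C. The sketch below is a purely illustrative toy (the capability scores and dollar figures are invented), showing how a VECP that preserves function while reducing cost raises the value index:

```python
# Hypothetical illustration of the classic Value Engineering ratio,
# Value = Function / Cost. All numbers are invented for illustration.

def value_index(function_score: float, cost: float) -> float:
    """Return the value index V = F / C for a design alternative."""
    if cost <= 0:
        raise ValueError("cost must be positive")
    return function_score / cost

# Baseline system: a given capability score at the original cost.
baseline = value_index(function_score=100.0, cost=4_000_000)

# VECP alternative: identical capability delivered at lower cost.
proposal = value_index(function_score=100.0, cost=3_200_000)

# The proposal delivers the same function at better value.
assert proposal > baseline
print(f"value improvement: {proposal / baseline:.2f}x")
```

The point of the ratio is simply that value can be improved either by raising function at equal cost or, as in most VECPs, by holding function constant while cutting cost.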

Many VECPs are submitted for technology assets – contained systems that may work better when newer, less costly components are substituted, such as missile systems or electronic devices. This may be because the contractor is typically the sole source of knowledge and research concerning how to optimize and build the components, the “value” and function of the asset is very clear (i.e. it’s delivered and explodes on target), and constant innovation is a demand of the environment. VECPs are also submitted for IT systems and programs, though there it is much more difficult to identify and propose the specific cost savings or cost avoidance that might result – since IT systems frequently depend upon many external or interfaced elements, vendor products and processes.

An EA-centric review of a program or line-of-business IT investment may quickly yield insight that leads to specific value engineering opportunities, and therefore reductions in IT costs. For example, a particular set of information may be created, shared and recreated across several systems, using different processes, datastores and technologies. A segmented EA approach, driving down from the particular business, process and information domains, may quickly illuminate targets of opportunity for database or interface consolidation – and therefore potential consolidation of supporting technologies (e.g. storage, networking, processors). This may lead to optimized technical operations and business process performance, which can be clearly mapped back to the Enterprise Architecture to validate that governance and mission requirements are (still) met, cross-enterprise risks are mitigated, and IT investment portfolios and procurement activities are properly adjusted or re-aligned.
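The cross-system analysis described above can be sketched mechanically: given an inventory mapping systems to the datasets they maintain, flag any dataset held redundantly by more than one system as a consolidation candidate. The system and dataset names below are hypothetical, and a real EA repository would of course carry far richer metadata:

```python
# Toy sketch of duplicate-data discovery across an IT portfolio.
# System and dataset names are invented for illustration.
from collections import defaultdict

inventory = {
    "benefits-portal":  {"citizen-profile", "case-history"},
    "call-center-crm":  {"citizen-profile", "call-logs"},
    "field-office-app": {"citizen-profile", "case-history", "scheduling"},
}

def consolidation_targets(inventory):
    """Return datasets maintained by two or more systems,
    mapped to the sorted list of systems holding them."""
    holders = defaultdict(set)
    for system, datasets in inventory.items():
        for ds in datasets:
            holders[ds].add(system)
    return {ds: sorted(systems)
            for ds, systems in holders.items()
            if len(systems) > 1}

targets = consolidation_targets(inventory)
# "citizen-profile" is replicated in all three systems,
# "case-history" in two -- both are consolidation candidates.
assert set(targets) == {"citizen-profile", "case-history"}
```

Each flagged dataset points at both a data consolidation opportunity and, transitively, at the supporting storage, network and processing capacity that could be consolidated with it.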

With this kind of information, a VECP could be constructed that very clearly proposes both program-specific and collateral (i.e. across the rest of the enterprise) savings resulting from introduction of state-of-the-art consolidating technologies (for example, pre-integrated, self-contained and consolidated database engineered systems, perhaps cloud-enabled). At the very least, an EA view can help identify and prioritize targets of opportunity for Value Engineering that may become part of an effective and timely sequestration response – both before and after such an event, and in fact as part of the annual capital planning and investment control (CPIC) processes.
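The shared-savings mechanics can be sketched as simple arithmetic. Note the 50% sharing rate below is purely illustrative: actual rates depend on the contract type and the clause alternates in FAR 52.248-1, and collateral savings are shared under their own rules:

```python
# Back-of-the-envelope sketch of VECP savings sharing.
# The sharing_rate default is illustrative only; real rates are set
# by the contract and FAR 52.248-1, not by this example.

def vecp_shares(acquisition_savings: float,
                collateral_savings: float,
                sharing_rate: float = 0.5) -> dict:
    """Split total proposed savings between government and contractor
    at a single flat rate (a simplification of the real clause)."""
    total = acquisition_savings + collateral_savings
    contractor = total * sharing_rate
    return {
        "total_savings": total,
        "contractor_share": contractor,
        "government_share": total - contractor,
    }

# Hypothetical VECP: $2M program savings plus $500K collateral
# savings across the rest of the enterprise.
shares = vecp_shares(acquisition_savings=2_000_000,
                     collateral_savings=500_000)
assert shares["contractor_share"] == shares["government_share"]
```

The win-win framing in the text follows directly: the government keeps its share of savings it would not otherwise have found, and the contractor is paid for finding them.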


Defining Architecture

Architecture is a concept everyone knows, but what actually is architecture? Chapter 2.2 of TOGAF 9.1 says: TOGAF embraces but does not strictly adhere to ISO/IEC 42010:2007 terminology. Yet, in its list of definitions, TOGAF attributes ISO/IEC 42010:2007 when it defines architecture: 1. A formal description of a system, or a detailed plan …read more

Summer School (Week 31, 2012)

I was fortunate to participate in the Enterprise Architecture summer school that took place in week 31 of 2012 at the IT University of Copenhagen. There were a lot of interesting presenters, both academics and practitioners. One of the more notable presenters was Martin van den Berg, who happens to be one of the […]

Link Collection — August 5, 2012

  • The Perils of Highly Interconnected Systems – Technology Review

    Totally agree with this statement: “The key to thriving in an increasingly complex world is to develop a nuanced, stable theory of interoperability.”

    We need to switch our thinking to systems-of-systems, in which information flow, interoperability, change, systemic implications and dot-connecting are key.

    But, as the commenters say, I’m not sure I could make my way through the book either… 

    tags: systems systemsofsystems interop

  • Big Data Presents Opportunities and Costs – The CIO Report – WSJ

    Agreed: data is inherently messy

    “But because Big Data requires companies to accumulate increasing amounts of data, even free software like Hadoop is causing companies to spend more money. According to Rubin, whose clients include several of the world’s largest banks, many CIOs believe data is inexpensive because storage has become inexpensive. But data is inherently messy – it can be wrong, it can be duplicative, and it can be irrelevant – which means it requires handling, which is where the real expenses come in. “The cost of more data is the application and the computing power and the processes to reconcile all these things,” Rubin said.”

    tags: bigdata cio wsj rubin

  • Why Procrastination is Good for You | Science & Nature | Smithsonian Magazine

    “Much recent research about decisions helps us understand what we should do or how we should do it, but it says little about when,” he says.

    In his new book, Wait: The Art and Science of Delay, Partnoy claims that when faced with a decision, we should assess how long we have to make it, and then wait until the last possible moment to do so.

    tags: procrastination delay

Posted from Diigo. The rest of my favorite links are here.

Related posts:

  1. Link Collection — January 8, 2012
  2. Link Collection — January 29, 2012
  3. Link Collection — March 18, 2012