Business Architecture in the New Normal

Business architecture is a challenging capability. Designing the business, optimizing its processes and streamlining the way information is collected and used is fascinating work, but above all it is hard. We have entered the New Normal, shaped by the crises we had (or still have), by the rapid spread and adoption of ideas and technology, and by the nearing expiration date of common business paradigms.

I read a very interesting blog post by KLM’s SVP e-commerce Mr. Martijn van der Zee: “I Have a Lack of Strategic Vision”, in which he points out that strategic visions in PowerPoint won’t cut it for Air France-KLM in today’s digital era. He gives a number of reasons why long-term strategic plans do not work for him:

  1. Vision documents or strategy PowerPoints are drafted from an internal perspective (the company, or the department of a specific employee).
  2. These documents present a simplified overview of what’s happening in the outside world. 
  3. The author feels really good about the document and confuses strategic vision and effort with progress.

“In this new digital world, nobody knows where we’re going. The only way to get a glimpse of what’s happening is by trying to understand what our customer wants, build a working prototype and test it in the real world. Fail fast and often. PowerPoints won’t help you do that; building, testing and tweaking will.”

Prototyping in projects and setting up experiments are part of an organization’s learning cycle. John Boyd introduced the OODA loop to express a decision cycle of observe, orient, decide, and act. Since opportunities pass by ever faster, your organization needs to speed up its OODA cycle. How do you contribute to the OODA loop of your organization?

OODA-loop for learning in organizations
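The OODA cycle described above can be sketched as an explicit loop. The four phase names come from Boyd; everything else in this sketch (the world dictionary, the toy threshold rule) is hypothetical, purely to illustrate that speeding up the loop means running more complete observe-orient-decide-act cycles per unit of time:

```python
# A minimal sketch of John Boyd's OODA loop as a decision cycle.
# The phase names are Boyd's; the data structures and the simple
# rule in decide() are illustrative assumptions, not part of the model.

def observe(world):
    """Collect raw signals from the environment."""
    return world["signals"]

def orient(observations):
    """Interpret observations against experience and context."""
    return {"opportunity": max(observations) > 0.5}

def decide(orientation):
    """Choose an action based on the current orientation."""
    return "launch_experiment" if orientation["opportunity"] else "keep_watching"

def act(decision, world):
    """Execute the decision, which changes the world and yields new signals."""
    world["actions"].append(decision)
    return world

def ooda_cycle(world, iterations=3):
    # A faster organization runs more of these cycles in the same time.
    for _ in range(iterations):
        world = act(decide(orient(observe(world))), world)
    return world

world = ooda_cycle({"signals": [0.2, 0.7], "actions": []})
print(world["actions"])  # one decision recorded per completed cycle
```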

The classic architects’ view

Architects will typically argue that the temporary websites and databases behind the suggested prototype approach tend to stay around longer than innovators and project managers promise when deploying them. Architects know all about legacy, the organic growth of organizations and their application landscapes, and the enormous effort it takes to rationalize those landscapes. Architects prefer to craft a plan up front, discuss the underlying principles (hoping to get them approved by senior management), design an integral picture of the preferred future, analyze the impact of the proposed changes and only then start projects.

The internal focus and lack of speed in this approach are exactly what KLM’s Martijn van der Zee, mentioned above, brings to the table. MIT’s Joi Ito goes even further in his TED talk on innovation in the era of the Internet, stating that the internet is fundamentally changing the way we innovate. Thanks to the internet, connecting, sharing and solving problems is faster than ever. The old-school “MBA way of innovation” is way too slow and does not benefit from the connected world we live in today.

Okay, right… but what does that mean for the way we are architecting (in) organizations?

The alternative: Contribute to flexibility and scalability

  1. Architects in the New Normal should not hit the brakes on innovation, but facilitate innovation with all the means they have at hand. By creating and presenting reusable building blocks (e.g. standard processes, information bundles, application services, technical standards), architects contribute to speeding up change. The more well-documented reusable building blocks you have available, the faster you can chain them into a working prototype, with added steps or functionality to be bought (from the cloud) or built.
  2. Architects in the New Normal should be able to engage with people who prefer other communication and learning styles than the one they prefer themselves. “Doing” and “concrete experience” are what business managers typically prefer, whereas architects lean towards “thinking” and, in some cases, “observation and reflection”. There is no right or wrong in these learning styles; being aware of them just helps you take a different approach and go through all steps of learning to maximize the learning experience.

    Kolb’s learning styles

  3. Architects in the New Normal should have a vision on speed. What are the fast-moving processes and channels in the organization, and where do we need to maximize stability? Gartner refers to this as Pace Layering, distinguishing “systems of innovation”, “systems of differentiation” and “systems of record”. Although Gartner applies this concept to applications, one can abstract from that and look at business capabilities in general from the perspective of these layers. Each layer has its own pace of change, and capabilities can move over time from the innovation layer to the differentiation layer and on to the record layer.
    Architects in the New Normal should also provide a set of criteria to help business managers decide on scaling up an experiment, including scenarios for integrating or re-engineering functionality that was developed in a stand-alone experiment, so that it fits with the rest of the application landscape. Scalability is a key challenge, for start-ups as well as for experiments in larger organizations. Other kinds of -ilities are an issue too in moving from experiment to full-blown solution: maintainability, security, interoperability, etc.

    Architecture contributes to scalability

    A very important development in the New Normal in this respect is the cloud. Embracing the cloud with a clear cloud strategy is beneficial for all three tasks pointed out above. Cloud-based experiments are often much easier to set up than on-premise solutions (where you need to buy servers, licenses, etc. before you can even start experimenting), and scalability is often less of an issue. On the other hand, integrating cloud solutions with on-premise ones (or with other cloud solutions) is not always easy. Architects can contribute there with a vision on cloud integration and a set of standards to minimize integration effort.

    Business architecture techniques help you in this process. BiZZdesign applies these techniques in the tools, training and consultancy we provide. We strongly believe architecture capabilities should focus on creating business value from rationalization and optimization as well as from growth and innovation.


Enterprise Risk Management approach

In this blog post, Marc Lankhorst discussed the value of EA in managing risk, compliance and security in the enterprise. He suggested a number of next steps, two of which are discussed in more detail in this blog:

  • Capture and visualize risk and security aspects of your organization. Visualize hazards, risks and mitigation measures in relation to the overall architecture and business strategy.
  • Measure and visualize the impact of risks and use these insights for decision making. Visualize data from e.g. penetration tests and use this to decide at the business level about necessary IT measures.

 

Enterprise Risk Management approach overview

The two steps above are incorporated in an Enterprise Risk Management approach, visualized in Figure 1. This approach helps in understanding the consequences of risk & security policies, because the definition of risks and control measures at the strategic level is refined step by step into operational control measures.

 


Figure 1: Enterprise Risk Management approach

 

This is a model-driven and cyclic approach that can be started at multiple points in the cycle, depending on whether you use a more top-down or a more bottom-up approach. Each phase is briefly explained below:

  1. Assess risks. In this step, the risks the enterprise has to cope with are identified and documented. This covers multiple risk types: IT-related risks (like cyber-attacks), but also business-related risks. Furthermore, risks can be based on identified threats (see step 6).
  2. Specify required control measures. For each risk, it is identified which control measures are required. Some risks may require extensive control measures (because of the high impact of the risk), while others may require fewer. The combination of risks and control measures can be modelled with elements of the ArchiMate motivation extension (Assessment, Goal and Requirement), which makes the relation between these aspects clear. Furthermore, it can be incorporated in your existing EA models by linking risks and control measures to ArchiMate core elements. More details on this approach will be presented in a follow-up blog.
  3. Implement control measures. The required control measures need to be implemented. This is the step where the shift from design to implementation is made. Control measures can be implemented in several ways: some may be IT control measures like firewalls or authentication mechanisms, others business-focused control measures like the four-eyes principle.
  4. Execute & monitor. The implemented control measures need to be executed. Furthermore, monitoring at the operational level is necessary to get statistics on the performance and effectiveness of implemented controls. An example is pentesting of the technical infrastructure: pentesting systematically and automatically searches for weak spots in the infrastructure. Results of pentests are used to analyze vulnerabilities in the infrastructure and define new control measures.
  5. Analyze vulnerabilities. Executing & monitoring yields the necessary insights into the performance and effectiveness of implemented controls (for example via pentesting). In this step this data is analyzed to determine which vulnerabilities exist and how dangerous they are. Using the existing EA models, these vulnerabilities are linked to the risks identified in step 1. This gives insight into how well the risks are managed and whether new or improved control measures are needed.
  6. Identify threats. In this step, threats from the external or internal environment are identified. Threats from the internal environment can be based on the results of the previous step (analyze vulnerabilities). The identification of new threats can lead to new or changed risk assessments in step 1.
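The cyclic nature of the six phases above, including the different entry points, can be sketched as a small data structure. The phase names follow the text; the code itself is a hypothetical illustration, not part of the approach:

```python
from enum import Enum

# Sketch of the six-phase Enterprise Risk Management cycle from Figure 1.
# Phase names follow the blog text; this code is illustrative only.

class Phase(Enum):
    ASSESS_RISKS = 1
    SPECIFY_CONTROLS = 2
    IMPLEMENT_CONTROLS = 3
    EXECUTE_AND_MONITOR = 4
    ANALYZE_VULNERABILITIES = 5
    IDENTIFY_THREATS = 6

def next_phase(phase: Phase) -> Phase:
    """The cycle wraps around: identified threats feed new risk assessments."""
    return Phase(phase.value % 6 + 1)

def run_cycle(start: Phase, steps: int = 6):
    """Walk the cycle from any entry point (top-down: ASSESS_RISKS,
    bottom-up: EXECUTE_AND_MONITOR)."""
    phase, visited = start, []
    for _ in range(steps):
        visited.append(phase)
        phase = next_phase(phase)
    return visited

# Bottom-up entry: start by monitoring the current implementation.
trace = run_cycle(Phase.EXECUTE_AND_MONITOR)
print([p.name for p in trace])
```

Note how a bottom-up run still passes through all six phases; the only difference between the two approaches is where in the cycle you enter.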

 

Top down vs bottom up

The approach described above can be applied top-down or bottom-up. A top-down approach starts with the identification of threats and the assessment of risks, which serve as a basis for the design and implementation of control measures. A bottom-up approach would typically start at the execute & monitor step: investigating the current implementation with pentests or other mechanisms and using this information to determine vulnerabilities in the current landscape.
Which approach fits your organization best depends on a number of aspects. In general, organizations with a more mature EA approach can more easily follow a top-down approach.

Benefits of this approach

This approach includes the following benefits:

  • Systematic analysis of threats and vulnerabilities
  • Integrated design of control measures
  • EA models support business impact analysis of technical risks / vulnerabilities
  • Translate business risk & security decisions into effective enterprise changes. This requires strong cooperation between business and IT.

These benefits help to embed security more firmly in the business layer of your organization and to make well-informed decisions based on operational risk impact and costs.

Learn more about Security Architecture in our webinar on September 18th, or join our Security Architecture training course in the Netherlands on October 2nd.


Welcome to the world of BPMN 2.0 – Some valuable tips!

In this blog I elaborate on the Business Process Model and Notation (BPMN) standard, the advantages it brings you and how to use it in a powerful manner. BPMN is the de facto international standard for modeling business processes. Version 2.0 was released in 2011 and the standard is maintained by the Object Management Group. It is an extensive standard with formal semantics, which enables you to describe your business processes up to the level of automating them with a process engine. In my work I use BPMN 2.0 to create process models, and I find it a powerful notation for describing business processes. However, you need to keep some caveats in mind. By presenting the advantages and some guidelines for applying the standard, I hope to make you enthusiastic about this beautiful standard!

Why use BPMN 2.0?

First of all, I would like to present three major advantages of using BPMN 2.0:

World of BPMN

  • Just as English is used for verbal and written communication all over the world, BPMN is an international graphical ‘language’ (standard) that is used to communicate all over the process world. Since it is universally adopted as a process modeling standard, stakeholders outside the organization, like auditors, partnering organizations and system implementers, understand it. This enables you to communicate about your business processes in a clear and consistent manner.

  • BPMN 2.0 is tool- and vendor-independent. The standard is free to use and has a standardized underlying XML schema, which gives you flexibility in contracting tool vendors when using tool support for describing or executing your business processes.

  • BPMN 2.0 is a very rich standard with many different concepts. It enables you to describe your processes in great detail and use the right semantics for each of the details. The nature of the language enables you to bridge the gap from IT to business (and vice versa), and models can be executed directly in an automation engine. Especially the many possibilities for modeling event-driven behavior make BPMN very powerful compared to other modeling languages.
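To give an impression of the standardized XML interchange format mentioned above, here is a minimal, hypothetical process (the element types and the BPMN 2.0 namespace are from the standard; the process content and identifiers are made up for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- A minimal, hypothetical BPMN 2.0 process in the standard's XML
     interchange format: start event, one user task, end event. -->
<definitions xmlns="http://www.omg.org/spec/BPMN/20100524/MODEL"
             id="defs_1" targetNamespace="http://example.com/bpmn">
  <process id="handle_claim" isExecutable="false">
    <startEvent id="start" name="Claim received"/>
    <userTask id="review" name="Review claim"/>
    <endEvent id="end" name="Claim handled"/>
    <sequenceFlow id="f1" sourceRef="start" targetRef="review"/>
    <sequenceFlow id="f2" sourceRef="review" targetRef="end"/>
  </process>
</definitions>
```

Because every compliant tool reads and writes this same format, a model created in one vendor’s tool can, in principle, be opened in another’s.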

BPMN 2.0: Handle with care

However, there is firm criticism that BPMN 2.0 is too complicated for business stakeholders. I can understand that the BPMN 2.0 standard becomes complicated when you ‘just simply’ start using the standard with all of its concepts. Therefore, I want to present some valuable tips for applying BPMN in practice:

  • Keep it pragmatic: The above-mentioned richness of BPMN is also the downside (and an often-expressed criticism) of the language; it seduces people into creating theoretically perfect models that are no longer understood by stakeholders. Realize that many of the BPMN concepts are intended for expressing automation details (i.e. process execution) or exceptional situations. Since approximately 90% of BPMN 2.0 users only use the standard to visually describe their business processes rather than execute them directly in a process engine, I strongly advise you to keep the number of concepts used in your process models limited.
  • Less is more: The bottom line is simple; the seemingly endless number of concepts is not in the standard for communicating with business stakeholders. Up to a certain level BPMN is quite intuitive to understand, but a huge number of concepts should not be used to communicate with business stakeholders. Think about your own language: how many of the words in it (count them in a dictionary) do you use to make something clear to a three-year-old child? And that child has already had three years to learn the language! Most business stakeholders simply do not speak BPMN at a mature level, so you should adapt your use of the language to your target audience. Only use those concepts that are essential to communicate your message!

  • Consistency results in clarity: Besides easing communication of BPMN process models to business stakeholders, applying BPMN in a simple manner also has the advantage of securing consistency across the different process models created in your organization. This requires clear conventions between the users of the standard: which concepts do we use, how do we use them and how do we name them in a recognizable way? Besides that, conventions with respect to layout are important for bringing your message to your business stakeholders in an unambiguous way.
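Such conventions can even be checked mechanically. The sketch below restricts models to an agreed subset (a "palette") of BPMN concepts and one naming rule; the element types are real BPMN concepts, but the chosen palette and the capital-letter naming rule are example conventions of my own, not part of the standard:

```python
# Hypothetical convention checker: limit models to an agreed palette of
# BPMN concepts and enforce a simple naming rule. The palette and the
# naming rule are illustrative conventions, not prescribed by BPMN 2.0.

ALLOWED = {"startEvent", "endEvent", "userTask", "serviceTask",
           "exclusiveGateway", "sequenceFlow"}

def check_conventions(elements):
    """elements: list of (bpmn_type, name) pairs; returns violation messages."""
    violations = []
    for bpmn_type, name in elements:
        if bpmn_type not in ALLOWED:
            violations.append(f"{bpmn_type} '{name}': concept not in agreed palette")
        if bpmn_type.endswith("Task") and not name[:1].isupper():
            violations.append(f"{bpmn_type} '{name}': task names start with a capital")
    return violations

model = [("startEvent", "Claim received"),
         ("userTask", "review claim"),          # violates the naming rule
         ("complexGateway", "Merge anything"),  # concept outside the palette
         ("endEvent", "Claim handled")]
for violation in check_conventions(model):
    print(violation)
```

A check like this could run whenever models are saved to a shared repository, so deviations from the conventions surface early instead of during a review.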

Hopefully this gives you my perspective on using BPMN 2.0. Although BPMN is a great standard, creating BPMN models is not an end in itself. The tips presented above may help you use BPMN in an effective manner. The ultimate goal of creating process models is to deliver value to your stakeholder(s). Since value is expressed in terms of benefits perceived by the stakeholder, the stakeholder is the starting point of creating process models. So start by identifying your stakeholder and the information he or she needs in a process model! If you do so, I am sure BPMN will bring you the value you are looking for!

The modeling process in BPMN

Learn more about BPMN during our BPMN Foundation training course. Do you have any additional do’s and don’ts for using BPMN 2.0? Please share them by leaving a comment!




Enterprise Architecture: Key to Successful Business Transformations

In a recent article on Forbes.com with the provocative title “Is Enterprise Architecture Completely Broken?”, Jason Bloomberg gives a scathing critique of the state of practice in EA. His main point is that many enterprise architects overly focus on documentation and frameworks, instead of delivering real value and effecting business change. In short: architecture slows things down, rather than providing business value and agility. Although the article exaggerates the problem, it does touch on an important challenge for enterprise architects.

Some enterprise architecture practices are indeed concerned first and foremost with documenting the current state of the enterprise in detail. Often this is the result of the risk-averse and bureaucratic culture we encounter in many large, IT-intensive organizations. This kind of bookkeeping will never be complete, given the increasing pace of change organizations have to deal with. Of course, there is value in having a sufficiently accurate view of the business processes, applications and infrastructure of an organization. Herein lies the real challenge: we need just enough insight (a) to keep these complicated environments running and (b) to aid the enterprise architect, who should concentrate on the future direction of the enterprise.

Key to a successful EA practice is to focus on change. On the one hand, this requires process agility: rapid development and realization processes consisting of feedback loops to design, implement and measure enterprise-wide change. Classical, feed-forward, waterfall-like approaches simply won’t work in today’s volatile business environment. Using agile practices in EA processes is an important way to foster this focus on change. On the other hand, EA products need to realize system agility: having organizational and technical systems that are easy to reconfigure, adapt and extend when the need arises. Together, agile processes and systems create a foundation for true business agility: using your ability to change as an essential part of your enterprise strategy, outmaneuvering competitors with shorter time-to-market, smarter partnering strategies, lower development costs and higher customer satisfaction.

This focus on change is not just an EA issue, but involves all disciplines contributing to business transformation. This point is missed by Bloomberg’s article, which focuses on EA as a stand-alone discipline. However, successful enterprises manage their business transformations from an integral perspective in which EA is a key player, strongly interlinked with other fields. The difficulties many organizations have with strategy execution show that this is not a sinecure.

As I have described in previous blog posts (1, 2, 3), a number of important connections between EA and other disciplines help in putting the architecture to work in effecting business change with a holistic perspective:

  • Strategy development: Of course, it all starts with strategic direction. Many organizations have great strategies, but lack the ability to execute them. A sound EA practice bridges the gap between abstract, high-level strategies and concrete operational decisions, and provides feedback on the feasibility and impact of strategic choices.

  • Capability-based planning: Business capabilities are a pivot between strategy and realization. They provide a high-level view of the current and desired abilities of an organization, in relation to the organization’s strategy and its environment, and they are a starting point for more concrete developments in the enterprise.

  • Enterprise portfolio management: Managing investments in your capabilities is essential in effecting business change. Enterprise portfolio management provides the instruments to select areas for investment and change initiatives to realize the enterprise strategy. Enterprise architecture provides the necessary information about current and required capabilities and their dependencies to support a coherent portfolio management practice.

  • Program management: Managing change initiatives from the perspective of business outcomes is the core task of program management. The overview provided by EA is key in managing change in an integral manner, taking account of various dependencies and interactions within the enterprise and across its boundaries.

  • Risk management: Better alignment between enterprise strategy, architecture and implementation helps the organization to spend its risk and security budget wisely, focused on business-relevant risks. This may lead to both cost savings and lower risks at the same time, because you invest to protect the things your enterprise really cares about.

  • Regulatory compliance: Implementing standards and policies such as SEPA, Solvency II, Basel III and others requires enterprise-wide coordination, visibility and traceability from boardroom-level decisions on e.g. risk appetite of the organization, down to the implementation of measures and controls in business processes and IT systems. Enterprise architecture is indispensable to manage the wide-ranging impact of such developments.

  • Continuous delivery and improvement: Modern approaches to realization manage the assets that make up your enterprise across their entire life cycle, doing away with the artificial distinction between ‘development’ and ‘maintenance’. Continuous improvement of business processes through Lean management and continuous delivery of software by agile & DevOps teams provide a steady stream of business value, in close interaction with customers and other stakeholders. Enterprise architecture connects these value streams to ensure a coherent approach to change and to avoid building ‘agile silos’.

In all of these areas, EA functions as a kind of ‘enterprise knowledge hub’, integrating and sharing information on various structures across the enterprise and the business transformation value chain. It provides you with relevant input for prioritizing and planning transformations. It gives you program-level coordination across value streams to realize these changes in a coherent manner. It helps you to track the realization of the expected benefits, and hence to correct your course if necessary.

Enterprise architecture as knowledge hub.

 

At BiZZdesign, we help our clients establish such a change capability, pragmatically employing practices from various methods and techniques. Each activity and deliverable in your change processes should add business value, and we take a ‘lean and mean’ approach to the use of established methods such as TOGAF.

No one in their right mind would implement something like TOGAF cover to cover (which isn’t TOGAF’s intention anyway, though this is not always understood). That would indeed lead to the document- and framework-centric EA practice that Bloomberg condemns, but to me this is largely a straw-man argument. Instead, smart organizations use such methods as a source of inspiration and pick and choose the elements that fit their specific context. Professional communities like those of The Open Group or the OMG are invaluable in bringing together practitioners and collecting best practices. At BiZZdesign, we also share such practices with our clients and others, by writing books, whitepapers and blogs like this one, and by actively contributing to the activities of these communities.


How About Strategy? – Learning about strategic alignment

Nowadays, organizations operate in a dynamic and fast-changing environment, which makes formulating a consistent strategy a challenging task and executing that strategy even more difficult. More than half of the organizations surveyed in previous economic studies indicated that they have not been successful at executing strategic initiatives. Moreover, a majority of organizations face problems when executing their strategic vision.

In an environment where competition and the globalization of markets are intensifying, managing and surviving change becomes increasingly important. A business strategy determines the decisions and course of action a business takes to achieve competitive advantage, and is therefore crucial to surviving change. Nonetheless, several economic studies indicate that many organizations fail to implement strategic alternatives. It is therefore important to know more about the reasons why organizations find it difficult to reach strategic alignment.

Strategic alignment

Organizations develop and implement strategies to achieve (strategic) goals. The development of a strategy is about formulating what should be changed to evolve from the current situation to the desired future state. Strategy implementation is about translating the strategic plans into clear actions to execute the strategy. Strategic alignment is the ability to create a fit or synergy between the position of the organization within the environment (business) and the design of the appropriate business processes, resources and capabilities (IT) to support the execution. Strategic alignment cannot be reached when strategy development is considered to be a separate process from strategy implementation. Strategy development and strategy implementation are intertwined processes which both need to be successful for superior firm performance.

The way organizations move from strategy development to strategy implementation is influenced by many factors. Consequently, strategic alignment is influenced by several factors which all contribute to the successful development and implementation of a strategy. We distinguish three categories of factors that influence strategic alignment. How organizations manage the factors within these three categories determines whether they are able to reach strategic alignment. These three categories are:

  • Culture and shared beliefs: the collective thoughts and actions of employees towards the strategic orientation of the organization determine whether strategy implementation will be successful or not. Consequently, all the employees must be clear on the what, why, when and how of the strategy. According to previous studies the inability of management to overcome resistance to change is an important obstacle to strategy execution.

  • Organizational capabilities: capabilities, resources, systems and processes should be aligned with the strategy to be able to execute the strategy properly. An organization needs to consider their existing and needed capabilities and resources during strategy development and implementation. Strategic change gets obstructed when long-term strategic goals are not translated to short-term objectives or actions.

  • Communication: creating understanding throughout the organization about the strategy – why it is developed and how it is implemented – is essential. There should be a clear definition of purpose, values and behaviors to guide the implementation process. A poor or vague strategy makes successful execution nearly impossible, which makes unclear communication a killer of strategy implementation.

Strategic Alignment Survey

To get a better understanding of the strategic alignment efforts of organizations, we have created the Strategic Alignment survey. We seek to understand more about the way in which organizations move from strategy development to strategy implementation. The information gathered from this survey contributes to the work done on improving strategic alignment within organizations. We would like to learn from your organization’s experiences regarding strategy development and implementation and its efforts towards strategic alignment. For this reason we kindly ask you to fill in the Strategic Alignment survey.

In return for the time and effort you spend, there are several rewards that might interest you. The analyzed results of this survey will be published in a whitepaper to which you will have access. In addition, you can receive the book ‘Strategizer – The Method’, in which initial results on strategic alignment are documented, and you have a chance to win a book voucher worth €200,-.


Are Direct Messages really private, or not?

Social media have penetrated our lives. We share ideas, experiences, thoughts, complaints, and compliments with everybody. At the same time, we see quite some controversy concerning the privacy policies of companies such as Facebook and Twitter. Some people even think that European privacy regulation does not apply to them, as they are US-based companies.

In our research project “New Models for the Social Enterprise”, privacy regulation was one of the topics we tackled. The good news is that we as consumers outside the US are still governed by EU regulation and national policies. If a Dutch company, for example, uses data from Twitter to analyze what is mentioned concerning their brand, the mood, etc., Dutch law applies (the Wet Bescherming Persoonsgegevens). The data collected and analysed by social media mining companies, such as Coosto, falls under the jurisdiction of the country in which they operate. If a company uses or stores tweets, Facebook messages, etc., it is a data processor itself. This does not hold for the use or storage of aggregated data that cannot be linked to individual users (which is not the same as anonymised data; data is hardly ever anonymous if collected in larger quantities…).

Mail exchange between companies and customers, or mail in general, is private. This raised the question whether direct messages in, for example, Twitter are really private. In the example data we had from Twitter, direct messages did not appear, but this could be coincidental. The privacy rules of Twitter do not mention direct messages.

In order to get clarification, we simply asked Twitter (privacy@twitter.com), on April 24th:

Dear sir, madam, 

I have been a frequent and enthusiastic user of Twitter, and will be so in the years to come, I expect. However, I do have a question concerning privacy. Everything I tweet is open to the public, and tweets are brought together and sold for business purposes to companies etc. So much is clear.

Direct messages, however, give the feeling of being private, similar to e-mail messages. From your privacy statement I cannot derive whether or not DMs are treated differently than normal tweets. I.e., are they also analyzed, aggregated and/or sold to third parties?

 Kind regards,

@WilJanssen

It took a while and some friendly reminders, but in the end, I received an answer:


Encouraging, to say the least. Everything in a DM remains within Twitter and is not shared or sold. The answer may not be phrased with legal precision, but it is clear. DMs are commonly used in webcare, but should be used wisely. It is more effective to make sure customers’ mail addresses are known and correct, and to use mail for information that is personal or private. Mail is an effective means of communication, easy to store and archive, and legally binding. Social media have a role in swift and informal discussion. Use them wisely in a business relation.

Wil Janssen is managing director of InnoValor and guest author for our blog. InnoValor and BiZZdesign are research partners in the ‘New Models for the Social Enterprise’ project.


Systems are temporary, data is forever

“Data” is a big topic for many organizations. It may not come as a surprise that a lot is being said and written about using / securing / managing data. From a technical perspective, topics that are increasingly popular are big data, open data, and linked data, frequently in conjunction with security management, privacy, and business impact. More and more business forums also write about data.

Here are some highlights from last week’s news in the Netherlands (most of these are in Dutch):

What is interesting about these articles is that the focus seems to shift away from traditional ‘systems thinking’ and towards data and business impact. This fits well with the growing realization that “systems are temporary, but data is forever”. Take for example the articles about Customer Relationship Management (CRM). Over the last few years, many organizations have updated their CRM capability – including systems and processes – many times. A switch in systems typically means migrating data, which is often seen as one of the most technically challenging aspects of a system upgrade due to changes in standards, definitions, data structures and so on.

Data issues, benefits of managing data as an asset

The good news is that this hard work typically has big benefits: many issues with data quality become visible during (or: shortly after) migration. Perhaps the following sounds familiar:

  • Records about key entities (such as customer or product) are incomplete. There may be missing data for key attributes, so the new system will not accept these records. Where should the missing data come from?

  • We find out that data is incorrect: there is a mismatch between what we think is true according to the data, and what can be observed in reality.

  • Data may be inconsistent: we have multiple records (which are potentially inconsistent) about our customers. Why is this the case? Can we reconcile these records? How does that affect different groups in our organization, such as Marketing, Sales, Finance, or the delivery organization?


  • We are once more faced with the challenge that we have different definitions of key concepts. For example, Marketing and Finance have different definitions of “Customer”. Therefore, when management compares reports about Customer from these two functions, there have always been inconsistencies (which may or may not have been ‘fixed’ with all sorts of local solutions based on extensive use of spreadsheets). We now have to buckle up and agree on a standardized definition, or choose to re-develop our local solutions…

The list goes on and on. The good news, though, is the increased attention for managing data as an asset. John Ladley frames it nicely: data is ‘the new oil’ for many organizations. Like oil, data can be dangerous. If you don’t manage it properly, it may explode.
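Issues like the ones listed above can often be surfaced before migration day with a simple profiling script. A minimal sketch, in which the field names and matching rules are purely illustrative:

```python
# Minimal data-profiling sketch: flag incomplete and duplicate records
# before a migration. Field names and the matching rule are illustrative.

REQUIRED_FIELDS = ["customer_id", "name", "email"]

records = [
    {"customer_id": 1, "name": "Acme BV", "email": "info@acme.example"},
    {"customer_id": 2, "name": "Acme BV", "email": "info@acme.example"},  # duplicate
    {"customer_id": 3, "name": "Foo Ltd", "email": None},                 # incomplete
]

def profile(records):
    # Incomplete: any required attribute missing or empty.
    incomplete = [r for r in records
                  if any(r.get(f) in (None, "") for f in REQUIRED_FIELDS)]
    # Duplicates: match on a business key (name + email), not the record id.
    seen, duplicates = set(), []
    for r in records:
        key = (r["name"], r["email"])
        if key in seen:
            duplicates.append(r)
        seen.add(key)
    return incomplete, duplicates

incomplete, duplicates = profile(records)
print(len(incomplete), len(duplicates))  # 1 1
```

In practice the matching rules (which attributes form a business key, how to treat near-matches) are exactly where the different definitions of “Customer” surface.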

Maturity

Over the last few months we have conducted many interviews with organizations about data management and the maturity of the data management capability. We have developed a “data management maturity scan” based on the DAMA DMBOK. This experience confirms the trend identified above: organizations are starting to take data management seriously and invest in the maturity of the data management capability. Some key findings:

  • Several organizations report that ‘culture’ is a key aspect in being successful with data. If the culture is all about systems (in Dutch: “probleempje systeempje”, which loosely translates to “build a new system for each problem”) then nothing will change.

  • We see some organizations take a “technology route” to solving data issues: start with tooling around meta data, data quality management etcetera, and “experiment” in projects to see what can be achieved. This is a minority.

  • More and more organizations focus on outcomes: what do we want to get out of our data? What do we need to make that happen? Based on goals and principles, sound investments in the data management capability can be made.

  • Handling (large volumes of) unstructured (and highly volatile) data has a lot of potential. However, most organizations recognize that this requires a more advanced capability than handling structured (“rows and columns”) data.

  • Some notable exceptions aside, many organizations are starting to recognize the value of business-focused models of the data landscape: yes, it requires an investment, but the resulting models are almost a conditio sine qua non for data management.

Food for thought

So what does all this mean for you and your organization? As in so many domains, there is no silver bullet that will magically solve all your data problems, and no cookie-cutter approach: there are no ready-made answers, only (more and more) questions. Therefore, we offer some food for thought: questions to answer in the context of your organization. First of all, have a look at your change portfolio and focus on IT. How many of the upcoming projects revolve around fixing “stuff that has gone wrong with data”? Once you have figured this out, ask yourself the follow-up question: who is my go-to person for data? Do I trust IT to fix my data issues, or should this be done by a data steward who truly understands the business?

Try to create a simple business case: do a quick “back of the napkin” assessment of how many hours per week a typical employee is busy with data issues (searching for missing data, fixing broken data, reconciling duplicate and inconsistent data, etcetera). Multiply this by the number of staff and an average salary to get a sense of how much bad data is costing you.
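The calculation itself is deliberately trivial; the point is making the cost explicit. A sketch of such a napkin estimate, with assumed numbers rather than benchmarks:

```python
# Back-of-the-napkin cost of bad data. All figures below are assumptions
# chosen to illustrate the calculation, not benchmarks.

hours_lost_per_week = 2    # per employee: searching, fixing, reconciling data
staff = 500                # number of employees affected
hourly_rate = 50           # fully loaded cost per hour, in euros
weeks_per_year = 46        # working weeks

annual_cost = hours_lost_per_week * staff * hourly_rate * weeks_per_year
print(f"Estimated annual cost of bad data: EUR {annual_cost:,}")
```

Even with conservative inputs like these, the estimate runs into millions per year, which usually makes the case for investing in data management easy to open.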

Focusing more on benefits: do you know which business entities are crucial for running the day-to-day business? Also, does management have a solid understanding of processes, the associated KPIs, and the reports / dashboards / scorecards that go with them? Suppose we need new (BI-related) insights: how quickly can we typically deliver? Can we also show (e.g., in case of an audit) that we are in control with respect to managing this data as a crucial business asset? Or would someone get really nervous when that question is asked…?

Of course we’ll gladly assist you in assessing where you are with respect to data. Our two-day course in Enterprise Data Management can, combined with a data management maturity assessment, be a great kick-start for getting the most out of your data!


Your enterprise and social media?! We’ve got an IDEA!

Social media is inseparable from the organizational environment. Where people collaborate, interaction exists, and since society’s large-scale adoption of the internet, social media have shaped online conversations about, with and within organizations. Social media is a fact of life; the question is no longer whether an organization should use social media, but how. However, research by Gartner shows that most social media initiatives fail to achieve participation from the community or any meaningful purpose. So why do some organizations fail at social media, while others – for instance KLM – are extremely successful? I think it is because many organizations do not understand the importance of adequately incorporating social media initiatives within their organizational structure. They do not know how to use social media in the context of their enterprise and become a ‘Social Enterprise’.

Designing the Social Enterprise

I strongly believe in a certain ‘manufacturability’ of organizations. By manufacturability I mean designing the organization – and organizational change – through business models, enterprise architecture and process management. These are the fundamentals of delivering customer value in an effective and efficient way (although more than just these disciplines is required). I think social media should be subject to these disciplines too. Social media may not be as ‘manufacturable’ as a business process or an architecture, but in the end it is part of a ‘social system’, and we should carefully think about why and how we participate in it. It is, after all, the field in which most of our customers (internal or external to our organization) are active. That is why social media offers us great opportunities – and hazards – for creating and delivering customer value.

IDEA for the Social Enterprise

In the consortium project ‘New Models for the Social Enterprise’ we designed ‘IDEA for the Social Enterprise’. IDEA is an abbreviation of Interactive Design and Engineering Approach. It offers a method – with its roots in design thinking – to incorporate social media in your organization. Through several diverging and converging phases, it proposes coherent instruments that help you understand the value of social media in relation to your business model, the related business processes and your customers.

To conclude: whether you like it or not, social media is one of the trends we cannot ignore from the perspective of organizational design. Social media has become an important channel for creating and delivering customer value. In order to use social media to deliver optimal customer value, I am convinced that organizations need a good IDEA about how to integrate social media in their enterprise!

If you have any questions or interest in IDEA or our research project ‘New models for the Social Enterprise’, feel free to contact me at b.beuger@bizzdesign.nl


Achieving agility with data virtualization (2/2)

This posting is the follow up from a previous post where we described the need for agility as well as a setting where we believe that data virtualization techniques can help.

Following Rick van der Lans’s definition, we see data virtualization as a group of technologies that makes a heterogeneous set of databases and files look like one integrated database – which has some commonality with how many people see the concept of a federated database. As we will see shortly, though, data virtualization picks up where “traditional” data federation stops and provides organizations with a rich set of techniques for data integration issues.

Starting at the bottom, we see a series of source systems (or at least: the data part of it). The data structures are replicated and wrapped in the data virtualization server. The idea is that the virtualization software discovers the data structures in the source systems to make them available as virtual table structures. This achieves the notion of federation as mentioned earlier. If desired, the actual content of the source systems may be (partially) cached: this has the advantage that queries can be handled mostly in the virtualization environment to prevent huge workloads for the source systems.

Based on this virtualized ‘foundation’ layer, it is fairly straightforward to build new layers of virtual tables on top. This allows for building data structures that are close to the needs of end users (e.g. star schemas). It also allows for easy integration, application of transformation and integration rules, and so on. In practice we increasingly see virtualized data warehouses, master data management hubs, etcetera.
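As a toy illustration of these layers, the sketch below mimics a virtualization server with plain SQL views in SQLite. Both “source systems” live in one in-memory database here for simplicity – a real data virtualization server federates physically separate, heterogeneous systems – and all table and column names are made up.

```python
# Toy sketch of a virtual 'foundation' layer plus an integration view.
# Both "sources" share one SQLite database for simplicity; a real
# virtualization server wraps physically separate systems.
import sqlite3

conn = sqlite3.connect(":memory:")

# Two wrapped source structures with different schemas: a Dutch legacy CRM
# and a webshop. These play the role of the discovered source tables.
conn.execute("CREATE TABLE crm_klant (naam TEXT, land TEXT)")
conn.execute("CREATE TABLE shop_customer (name TEXT, country TEXT)")
conn.execute("INSERT INTO crm_klant VALUES ('Acme BV', 'NL')")
conn.execute("INSERT INTO shop_customer VALUES ('Foo Ltd', 'UK')")

# The virtual table on top: one integrated customer structure, including a
# simple transformation rule (renaming the Dutch columns).
conn.execute("""
    CREATE VIEW v_customer AS
    SELECT naam AS name, land AS country FROM crm_klant
    UNION ALL
    SELECT name, country FROM shop_customer
""")

rows = conn.execute(
    "SELECT name, country FROM v_customer ORDER BY name").fetchall()
print(rows)  # [('Acme BV', 'NL'), ('Foo Ltd', 'UK')]
```

Consumers only ever see `v_customer`; where the rows physically live, and in which schema, stays a concern of the virtualization layer.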

One aspect of agility should be obvious from this discussion: development and data integration within the virtualized environment can be considerably more agile than in a traditional setting. Requirements and specification (e.g. meta-data management) could still be used, but rather than a long build and deploy time, we now have results available immediately in a virtual table structure. As a result, it is easy to learn-while-doing in quick and highly interactive cycles with end users: quick sprints will deliver a working prototype and later adjustments can easily be made without having wasted many valuable development hours.

This also demonstrates that such a system is itself agile:

  • It will be fairly easy – and most of all: fast – to adapt to ever changing business needs for information.

  • Deploying changes to a virtualized data model is easier than changing the data structure of a physical database, which can entail all kinds of data conversion issues.

  • Dealing with the impact of changes is easier, since no software is adapted, lowering the risk of disruptions and keeping the impact localized.

  • Integration of the solution is simple, since existing interfaces remain stable.
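The second bullet in particular can be made concrete with a small sketch: redefining a virtual table is a metadata operation that leaves the physical data untouched. The schema and VAT figure below are purely illustrative.

```python
# Toy sketch: deploying a change to a virtual data model is a metadata
# operation -- the physical table is untouched and no data is converted.
# Table, column, and VAT figures are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (amount REAL, vat_rate REAL)")
conn.execute("INSERT INTO sales VALUES (100.0, 0.25)")

# Version 1 of the virtual model only exposes the net amount.
conn.execute("CREATE VIEW v_sales AS SELECT amount FROM sales")

# A changed reporting requirement: also expose the gross amount. Instead of
# altering and converting the physical table, we redefine the view.
conn.execute("DROP VIEW v_sales")
conn.execute("""
    CREATE VIEW v_sales AS
    SELECT amount, amount * (1 + vat_rate) AS gross_amount FROM sales
""")

gross = conn.execute("SELECT gross_amount FROM v_sales").fetchone()[0]
print(gross)  # 125.0
```

Compare this with the physical alternative: an `ALTER TABLE`, a data conversion script, and a deployment window.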

Built-in features around security, auditing, logging, and monitoring (e.g., when things change in the source systems) provide the organization with the means to stay in control of their data. In short:

  • Data virtualization decouples access to data from the source systems. This allows further manipulation of data without impacting the original systems.

  • Virtualized access to structured and unstructured data allows for uniform querying. Caching avoids heavy workloads on the original transaction systems.

  • Data access can be optimized for various stakeholders with different needs, concept definitions, permissions etc.

  • Virtualization allows for rapid, incremental development & delivery of information with minimal impact on source systems.

This mechanism can be considered a key resource for agility that supports key activities in the organization. A virtual data warehouse with rapid, agile development of new data structures makes it easier to accommodate management that increasingly seeks data-based, rationalized decision making to complement creative strategic skill. Suppose, for example, that there is a feeling that international markets can be conquered with a cross-selling strategy: offer one product at a discount to generate interest in high-end services that will generate revenue. Running the numbers based on historic sales in countries where the organization is already active must be swift. Moreover, when executing this strategy, the system should be flexible enough to easily monitor actual investments and revenues in near real-time.

The other obvious need for system agility in the field of data lies with compliance and regulations. Many industries are heavily regulated, for example finance and healthcare, and rules for compliance reporting change all the time. In and by itself this need not be an issue. However, we often see that concept definitions change slightly, derivations and key calculations become more complex, other types of information are required, and so on. Here also, the rapid development cycles and flexibility of a virtualized environment make the difference.

If you have any questions or suggestions: either drop a note or get in touch via E-mail!

