Enterprise Architecture-Based Risk Assessment with ArchiMate

Until quite recently, IT security was the exclusive domain of security specialists. In the last couple of years, however, organizations have started to realize that IT-related risks cannot be seen in isolation and should be considered an integral part of Enterprise Risk and Security Management (ERSM). ERSM comprises the methods and techniques organizations use to manage all types of risks related to the achievement of their objectives.

It is only natural to place ERSM in the context of Enterprise Architecture (EA), which provides a holistic view of the structure and design of the organization. It is therefore not surprising that EA methods such as TOGAF include chapters on risk and security (although the integration of these topics in the overall approach still leaves room for improvement), and that a security framework such as SABSA shows a remarkable similarity to the Zachman framework for EA. As a corollary, it also makes perfect sense to use the ArchiMate language to model risk and security aspects.

The previous blog post in this series outlined a method for EA-based ERSM with ArchiMate. This article proposes an initial mapping of risk and security concepts to ArchiMate concepts, and illustrates how these concepts can be used as a basis for performing an organization-wide risk assessment.

ArchiMate mapping of risk concepts

Most of the concepts used in ERSM standards and frameworks can easily be mapped to existing ArchiMate concepts. And since ERSM is concerned with risks related to the achievement of business objectives, these are primarily concepts from the motivation extension. 

  • Any core element represented in the architecture can be an asset, i.e., something of value susceptible to loss that the organization wants to protect. These assets may have vulnerabilities, which may make them the target of attack or accidental loss.

  • A threat may result in threat events, targeting the vulnerabilities of assets, and may have an associated threat agent, i.e., an actor or component that (intentionally or unintentionally) causes the threat. Depending on the threat capability and vulnerability, the occurrence of a threat event may or may not lead to a loss event.

  • Risk is a (qualitative or quantitative) assessment of probable loss, in terms of the loss event frequency and the probable loss magnitude (informally, ‘likelihood times impact’).

  • Based on the outcome of a risk assessment, we may decide to either accept the risk or set control objectives (i.e., high-level security requirements) to mitigate it, leading to requirements for control measures. The selection of control measures may be guided by predefined security principles. These control measures are realized by any set of core elements, such as business processes (e.g., a risk management process), application services (e.g., an authentication service) or nodes (e.g., a firewall).
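As a sketch of this mapping, the concepts above can be captured as simple data structures attached to core elements. The class names and the example elements (a payment system, a man-in-the-middle threat) are illustrative choices, not part of the ArchiMate standard or the FAIR taxonomy:

```python
from dataclasses import dataclass, field

@dataclass
class Vulnerability:
    name: str

@dataclass
class Asset:
    """Any ArchiMate core element of value that the organization wants to protect."""
    name: str
    archimate_concept: str                          # e.g. "BusinessProcess", "Node"
    vulnerabilities: list = field(default_factory=list)

@dataclass
class Threat:
    """May result in threat events targeting the vulnerabilities of assets."""
    name: str
    agent: str                                      # actor or component causing the threat
    targets: list = field(default_factory=list)     # Vulnerability instances

@dataclass
class ControlMeasure:
    """Realized by core elements such as processes, services, or nodes."""
    name: str
    realized_by: str                                # e.g. "ApplicationService: authentication"

# Hypothetical example elements:
weak_encryption = Vulnerability("Outdated encryption protocol")
payment_system = Asset("Payment system", "ApplicationComponent", [weak_encryption])
mitm = Threat("Man-in-the-middle attack", agent="External hacker", targets=[weak_encryption])
stronger_protocol = ControlMeasure("Upgrade encryption protocol", realized_by="Node: TLS gateway")
```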

Figure: ArchiMate mapping of risk concepts

Using one of the extension mechanisms as described in the ArchiMate standard, risk-related attributes can be assigned to these concepts. The Factor Analysis of Information Risk (FAIR) taxonomy, adopted by The Open Group, provides a good starting point for this.

Qualitative risk assessment

If sufficiently accurate estimates of the input values are available, quantitative risk analysis provides the most reliable basis for risk-based decision making. However, in practice, these estimates are often difficult to obtain. Therefore, FAIR proposes a risk assessment based on qualitative (ordinal) measures, e.g., threat capability ranging from ‘very low’ to ‘very high’, and risk ranging from ‘low’ to ‘critical’. The following picture shows how these values can be linked to elements in an ArchiMate model, and how they can be visualized in ‘heat maps’:

  • The level of vulnerability (Vuln) depends on the threat capability (TCap) and the control strength (CS). Applying control measures with a high control strength reduces the vulnerability level.

  • The loss event frequency (LEF) depends on both the threat event frequency (TEF) and the level of vulnerability. A higher vulnerability increases the probability that a threat event will trigger a loss event.

  • The level of risk is determined by the loss event frequency and the probable loss magnitude (PLM). 

Figure: Qualitative risk assessment

The example below shows a simple application of such an assessment. A vulnerability scan of the payment system of an insurance company has shown that the encryption level of transmitted payment data is low (e.g., due to an outdated version of the encryption protocol used). This enables a man-in-the-middle attack, in which an attacker may modify the data to make unauthorized payments, e.g., by changing the receiving bank account. For a hacker with medium skills (medium threat capability) and no additional control measures, this leads to a very high vulnerability (according to the vulnerability matrix above). Assuming a low threat event frequency (e.g., on average one attempted attack per month), according to the loss event frequency matrix, the expected loss event frequency is also low. Finally, assuming a high probable loss magnitude, the resulting level of risk is high. As a preventive measure, a stronger encryption protocol may be applied. By modifying the parameters, it can be shown that increasing the control strength to ‘high’ or ‘very high’ reduces the residual risk to medium. Further reduction of this risk would require other measures, e.g., measures to limit the probable loss magnitude.
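The assessment chain in this example can be sketched in code. The ordinal lookup rules below are illustrative stand-ins for the FAIR matrices referred to in the text (the official Open Group tables may differ in detail), but they reproduce the numbers of the payment-system example:

```python
# Ordinal scales used for the qualitative FAIR-style assessment.
SCALE = ["very low", "low", "medium", "high", "very high"]
RISK_SCALE = ["low", "medium", "high", "critical"]

def vulnerability(tcap: str, cs: str) -> str:
    """Vuln rises with threat capability (TCap) and falls with control strength (CS)."""
    idx = SCALE.index(tcap) - SCALE.index(cs) + 2
    return SCALE[max(0, min(4, idx))]

def loss_event_frequency(tef: str, vuln: str) -> str:
    """A higher vulnerability lets threat events turn into loss events more often."""
    idx = SCALE.index(tef) + SCALE.index(vuln) - 4
    return SCALE[max(0, min(4, idx))]

def risk(lef: str, plm: str) -> str:
    """Informally 'likelihood times impact', on an ordinal scale."""
    idx = (SCALE.index(lef) + SCALE.index(plm)) // 2
    return RISK_SCALE[min(3, idx)]

# The payment-system example: medium threat capability, no control measures,
# low threat event frequency, high probable loss magnitude.
vuln = vulnerability("medium", "very low")   # -> "very high"
lef = loss_event_frequency("low", vuln)      # -> "low"
print(risk(lef, "high"))                     # -> "high"

# Residual risk after applying a stronger encryption protocol (CS = "high"):
vuln2 = vulnerability("medium", "high")      # -> "low"
lef2 = loss_event_frequency("low", vuln2)    # -> "very low"
print(risk(lef2, "high"))                    # -> "medium"
```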

Figure: Risk analysis example

By linking risk-related properties to ArchiMate concepts, risk analysis can be automated with the help of a modeling tool. In this way, it becomes easy to analyze the impact of changes in these values throughout the organization, as well as the effect of potential control measures to mitigate the risks. For example, the business impact of risks caused by vulnerabilities in IT systems or infrastructure can be visualized in a way that optimally supports security decisions made by managers.
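One way such tool-based analysis could work is to propagate risk levels along the dependency relations in the model, so that the business impact of a technical vulnerability becomes visible at the business layer. The element names, dependency graph, and propagation rule below are hypothetical, a minimal sketch rather than any tool's actual algorithm:

```python
RISK_ORDER = ["low", "medium", "high", "critical"]

# ArchiMate-style dependencies: business process -> application -> node.
depends_on = {
    "Claims handling": ["Payment system"],
    "Payment system": ["Database server"],
    "Database server": [],
}
# Risk levels assigned to individual elements:
element_risk = {
    "Claims handling": "low",
    "Payment system": "high",
    "Database server": "medium",
}

def propagated_risk(element: str) -> str:
    """Risk of an element, taking the risks of everything it depends on into account."""
    levels = [element_risk[element]] + [propagated_risk(d) for d in depends_on[element]]
    return max(levels, key=RISK_ORDER.index)

# The payment-system risk surfaces at the business level:
print(propagated_risk("Claims handling"))   # -> "high"
```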


SABSA

There are so many reference models and so much open-source material available for enterprise architects that it isn’t surprising some of this material is less well known. One useful specialized resource is the free-to-use, open-source security architecture development and management method and framework SABSA, which stands for ‘Sherwood Applied Business Security Architecture’. This summary…


The Best Enterprise Architecture Tool

So recently my buddy over at OTN, Bob Rhubart, asked: “What tool or tools are indispensable in your role as an architect? When faced with a new project, what’s the first thing you reach for? Why?”. I instantly protested regarding the brevity of the answers. He suggested I blog about it. So here goes. So the […]

Office Space 2: The Rise of Milton


By: Dave Hood, CEO, Troux

Forbes contributor Jason Bloomberg recently wrote a few articles addressing the current state of Enterprise Architecture. In his first post, he explored whether enterprise architecture is completely broken, a question not uncommonly heard in our industry.

We often discuss the changing world of Enterprise Architecture on the Troux blog, and it has been an ongoing debate inside Troux as to whether we should even use the term EA when defining our market. I have to admit, when I first arrived at Troux, I didn’t even know what Enterprise Architecture was. I have always viewed what we do as being about delivering business value. I think the original EA practitioners had that in mind too, but maybe they were a decade too early or just couldn’t land at the right process and tools to deliver on the vision.

In his article, Bloomberg compares EA practitioners to Milton, the Initech employee from the ’90s movie classic Office Space, who continued to get paid while not actually having a role in the organization.

While EA’s role has drastically changed over the years, I like to think we are all now part of a sequel: Office Space 2: The Rise of Milton. Enterprise architecture is no longer an IT-centric discipline focused on creating complex, colorful maps and models understood only by a few. To use another Office Space analogy, it is no longer about producing “pieces of flair” in hopes of proving to the business that EA is delivering some sort of value.

Does that mean EA is obsolete? Quite the contrary.

The art of making business decisions has been around since the dawn of trade. Today, every aspect of a business is part of a digitally connected enterprise, meaning the impact of every business decision ripples across the entire organization. Making critical decisions without understanding these ripple effects can be devastating. The speed of industry change and the complexity of the portfolios that make up your business mean that informed decisions need to happen more quickly than ever to remain competitive. It’s our EA friends who were shamed into the basement office who are now poised to make that happen.

Sounds overwhelming, but at Troux we teach our customers that there is no need to boil the ocean. Understanding your connected enterprise can happen in bite-sized undertakings, along a logical timeline. By identifying critical business capabilities and harnessing the right data to gain perspective, we can land at an ideal course of action for moving the business forward.

While Bloomberg’s article starts out questioning whether there is a future for EA, he actually arrives at a conclusion similar to ours and expands on that vision in his follow-up article, “Agile Enterprise Architecture Finally Crosses the Chasm.”

The sequel is here, and from what we have experienced with our own customers, it is going to be a big hit at the box office. Milton was able to quickly determine that the “people-to-cake ratio” was too big. With today’s data, knowledge and tools, we can quickly learn much more. Here are just a few examples of companies using Troux’s version of enterprise architecture to make timely, informed business decisions.

Cisco: Global networking solutions giant Cisco has successfully implemented Troux’s Enterprise Portfolio Management solution to help define a common desired operating model across its business units. This, in turn, helps the company identify and divest businesses that are unlikely to deliver the desired top- or bottom-line results. In addition, Troux is used to compare potential acquisition targets to the target model, helping to quickly identify their true value.

U.S. Census Bureau: In 2012, the U.S. Census Bureau set out to build optimal IT solutions to handle a myriad of business challenges. With an Enterprise Architecture (EA) discipline enabled by Troux, the Bureau now has a more integrated business overall, underpinned by an IT decision process, collaborative governance, and a common knowledge base. It all adds up to increased agility, efficiency and innovation.

Bayer: Bayer started working with Troux in 2008, and the two companies have had much success together. To date, Bayer has used Troux to manage and optimize its landscape across information, technology, applications and business architecture portfolios.





Enterprise Risk Management approach

In this blog post, Marc Lankhorst discussed the value of EA in managing risk, compliance and security in the enterprise. He suggested a number of next steps. Two of these steps are discussed in more detail in this blog:

  • Capture and visualize risk and security aspects of your organization. Visualize hazards, risks and mitigation measures in relation to the overall architecture and business strategy.
  • Measure and visualize the impact of risks and use these insights for decision making. Visualize data from e.g. penetration tests and use this to decide at the business level about necessary IT measures.

 

Enterprise Risk Management approach overview

The two steps above are incorporated in an Enterprise Risk Management approach, visualized in Figure 1. This approach helps in understanding the consequences of risk & security policies, because risks and control measures defined at the strategic level are detailed step by step into operational control measures.

 

Overview of Risk Management approach

Figure 1: Enterprise Risk Management approach

 

This is a model-driven and cyclic approach that can be started at multiple points in the cycle, depending on whether you use a more top-down or a more bottom-up approach. Each phase is explained briefly below:

  1. Assess risks. In this step, the risks that the enterprise has to cope with are identified and documented. This covers multiple risk types: IT-related risks (like cyber-attacks), but also business-related risks. Furthermore, risks can be based on identified threats (see step 6).
  2. Specify required control measures. For each risk, it is determined which control measures are required. Some risks may require extensive control measures (because of their high impact), while others may require fewer. The combination of risks and control measures can be modeled with elements of the ArchiMate motivation extension (Assessment, Goal and Requirement), which makes the relation between these aspects clear. Furthermore, it can be incorporated in your existing EA models by linking risks and control measures to ArchiMate core elements. More details on this approach will be presented in a follow-up blog.
  3. Implement control measures. The required control measures need to be implemented. This is the step where the shift from design to implementation is made. Control measures can be implemented in several ways: some are IT control measures like firewalls or authentication mechanisms; others are business-focused control measures like the four-eyes principle.
  4. Execute & monitor. The implemented control measures need to be executed. Furthermore, monitoring at the operational level is necessary to obtain statistics on the performance and effectiveness of the implemented controls. An example is pentesting the technical infrastructure: with pentesting, you search systematically and in an automated fashion for weak spots in the infrastructure. Results of pentests are used to analyze vulnerabilities in the infrastructure and define new control measures.
  5. Analyze vulnerabilities. Executing & monitoring yields the necessary insights into the performance and effectiveness of the implemented controls (for example via pentesting). In this step, this data is analyzed to determine which vulnerabilities exist and how dangerous they are. Using the existing EA models, vulnerabilities are linked to the risks identified in step 1, which gives insight into how well the risks are managed and whether new or improved control measures are needed.
  6. Identify threats. In this step, threats from the external or internal environment are identified. Threats from the internal environment can be based on the results of the previous step (analyze vulnerabilities). The identification of new threats can lead to new or changed risk assessments in step 1.
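As a small sketch of step 2, risks, control objectives, and control measures can be represented as motivation-extension elements (Assessment, Goal, Requirement) and linked to a core element. All element names and realization links below are hypothetical examples:

```python
from dataclasses import dataclass

@dataclass
class MotivationElement:
    concept: str   # "Assessment", "Goal", or "Requirement"
    name: str

# Hypothetical risk, control objective, and control measure:
risk = MotivationElement("Assessment", "Risk of unauthorized payments")
objective = MotivationElement("Goal", "Ensure integrity of payment data")
measure = MotivationElement("Requirement", "Use strong transport encryption")

# Realization chain (Requirement realizes Goal, Goal addresses Assessment),
# plus a link from the control measure into the core model:
realizes = {measure.name: objective.name, objective.name: risk.name}
realized_by_core_element = {measure.name: "Node: TLS gateway"}

# Tracing a control measure back to the risk it mitigates:
print(realizes[realizes["Use strong transport encryption"]])   # -> "Risk of unauthorized payments"
```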

 

Top down vs bottom up

The approach described above can be applied top-down or bottom-up. A top-down approach starts with the identification of threats and the assessment of risks, which serve as a basis for the design and implementation of control measures. A bottom-up approach typically starts at the execute & monitor step: investigating the current implementation with pentests or other mechanisms and using this information to determine vulnerabilities in the current landscape.
Which approach fits your organization best depends on a number of aspects. In general, organizations with a more mature EA practice can follow a top-down approach more easily.

Benefits of this approach

This approach includes the following benefits:

  • Systematic analysis of threats and vulnerabilities
  • Integrated design of control measures
  • EA models support business impact analysis of technical risks / vulnerabilities
  • Translate business risk & security decisions into effective enterprise changes. This requires a strong cooperation between business and IT.

These benefits help to embed security more firmly in the business layer of your organization and to make well-informed decisions based on operational risk impact and costs.

Learn more about Security Architecture in our webinar, September 18th. Or Join our Security Architecture training course in the Netherlands, October 2nd. 


Lessons learned in managing engineering team growth

Over the last couple of years, the engineering team at Mendix has grown fast. Over the last 1.5 years the team has almost doubled, and we are still looking for bright minds. There is a lot that can and will go wrong when you grow this fast. Here are my four most important lessons learned during the process (disclaimer: a lesson learned doesn’t necessarily mean that I execute it flawlessly).

The post Lessons learned in managing engineering team growth appeared first on The Enterprise Architect.

Welcome to the world of BPMN 2.0 – Some valuable tips!

In this blog I elaborate on the Business Process Model and Notation (BPMN) standard, the advantages it brings you, and how you can use it in a powerful manner. BPMN is the de facto international standard for modeling business processes. Version 2.0 was released in 2011, and the standard is maintained by the Object Management Group. It is an extensive standard with formal semantics, which enables you to describe your business processes up to the level of automating them with a process engine. In my work I use BPMN 2.0 to create process models, and I think it is a powerful notation for describing business processes. However, you need to keep some cautions in mind. By presenting the advantages and some guidelines for applying the standard, I hope to make you enthusiastic about this beautiful standard!

Why use BPMN 2.0?

First of all, I would like to present three major advantages of using BPMN 2.0:


  • Like English is used for verbal and written communication all over the world, BPMN is an international graphical ‘language’ (standard) used to communicate all over the process world. Since it is universally adopted as a process modeling standard, stakeholders outside the organization, like auditors, partnering organizations and system implementers, understand it. This enables you to communicate about your business processes in a clear and consistent manner.

  • BPMN 2.0 is tool- and vendor-independent. The standard is free to use and has a standardized underlying XML schema, which keeps you flexible with respect to contracting tool vendors when using tool support for describing or executing your business processes.

  • BPMN 2.0 is a very rich standard with a lot of different concepts. It enables you to describe your processes in great detail and use the right semantics for each of those details. The nature of the language enables you to bridge the gap from IT to business (and vice versa), and processes can be executed directly in an automation engine. Especially the many possibilities for modeling event-driven behavior make BPMN very powerful compared to other modeling languages.
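As a small illustration of that tool independence, the standardized XML schema makes BPMN files processable with generic tooling. Below is a minimal, hand-written BPMN 2.0 fragment parsed with Python's standard library; a real export would contain many more attributes plus the diagram interchange data:

```python
import xml.etree.ElementTree as ET

# A minimal BPMN 2.0 process: start event -> task -> end event.
BPMN_NS = "http://www.omg.org/spec/BPMN/20100524/MODEL"
bpmn_xml = f"""
<definitions xmlns="{BPMN_NS}">
  <process id="claims" name="Handle claim">
    <startEvent id="start"/>
    <task id="assess" name="Assess claim"/>
    <endEvent id="end"/>
    <sequenceFlow id="f1" sourceRef="start" targetRef="assess"/>
    <sequenceFlow id="f2" sourceRef="assess" targetRef="end"/>
  </process>
</definitions>
"""

# Any XML-aware tool can extract the model content, regardless of vendor:
root = ET.fromstring(bpmn_xml)
tasks = [t.get("name") for t in root.iter(f"{{{BPMN_NS}}}task")]
print(tasks)   # -> ['Assess claim']
```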

BPMN 2.0: Handle with care

However, some firm criticism exists that BPMN 2.0 is too complicated for business stakeholders. I can understand that the BPMN 2.0 standard becomes complicated when you ‘just simply’ start using the standard with all of its concepts. Therefore, I want to present some valuable tips for applying BPMN in practice:

  • Keep it pragmatic: The above-mentioned richness of BPMN is also the flip side (and an often-expressed criticism) of the language; it seduces people into creating theoretically perfect models that are no longer understood by stakeholders. Realize that a lot of the BPMN concepts are intended for expressing automation details (i.e., process execution) or exceptional situations. Since approximately 90% of BPMN 2.0 users only use the standard to visually describe their business processes, rather than to execute them directly in a process engine, I strongly advise you to keep the number of concepts used in your process models limited.
  • Less is more: The bottom line is easy: the seemingly endless number of concepts is not part of the standard in order to communicate with business stakeholders. To a certain level BPMN is quite intuitive to understand, but a large share of its concepts should not be used to communicate with business stakeholders. Think about your own language: how many of the words in your language (count them in a dictionary) do you use to make something clear to a three-year-old? And that kid already had three years to learn the language! Most business stakeholders simply do not speak BPMN at a mature level, so you should adapt the use of the language to your target audience. Only use those concepts that are essential to communicate your message!

  • Consistency results in clarity: Besides helping to communicate BPMN process models to business stakeholders, applying BPMN in a simple manner has the advantage of securing consistency across the process models created in your organization. This requires clear conventions between the users of the standard: which concepts do we use, how do we use them, and how do we name them in a recognizable way? Besides that, layout conventions are important for bringing your message to your business stakeholders in an unambiguous way.

Hopefully this gives you my perspective on using BPMN 2.0. Although BPMN is a great standard, the creation of BPMN models is not an end in itself. The tips presented above may help you use BPMN in an effective manner. The ultimate goal of creating process models is to deliver value to your stakeholders. Since value is expressed in terms of benefits perceived by the stakeholder, the stakeholder is the starting point of creating process models: identify your stakeholder and the information he or she needs in a process model! If you do so, I am sure that BPMN will bring you the value you are looking for!

The modeling process in BPMN

Learn more about BPMN during our BPMN Foundation training course. Do you have any additional do’s or don’ts for using BPMN 2.0? Please share them by leaving a comment!




10 Easy Steps to Good Data

Last time I talked about the untold benefits of an Enterprise Data Model as a Reference Architecture for the Connected Enterprise. This week I would like to discuss the most valuable asset in any company: data.
Good Data: Your Most Valuable Asset
An…
