Knowing Your Exposure Limitations

April 1, 2013

For those familiar with risk analysis – specifically the measurement [or estimate] of frequency and severity of loss – there are many scenarios where the severity of loss, or the resultant expected loss, can have a long tail. In FAIR terms, we may have a scenario where the minimum loss is $1K, the most likely loss is $10K and the maximum loss is $2M. In this type of scenario – using Monte Carlo simulation, and depending on the kurtosis of the estimated distribution – it is very easy to derive a severity distribution or an aggregate distribution (one that takes into account both frequency and severity) whose descriptive statistics do not accurately reflect the organization's real exposure should the adverse event occur. While understanding the full range of severity or overall expected loss may be useful, a prudent risk practitioner should also understand and account for the details of the organization's business insurance policies, so they know when insurance controls will be invoked to limit financial loss from significant adverse events.

Using the example values above, an organization may be willing to pay out of pocket for adverse events – similar to the scenario above – up to $1M per event, and then rely upon insurance to cover the rest. This in turn changes the maximum amount of loss the company is directly exposed to (per event) from $2M to $1M. In addition, this understanding could be a significant information point for decision makers as they ponder how to treat an issue. Given this information, consider the following questions (a rough R sketch after the list shows how a per-event limit changes the picture):

1. How familiar are you with your organization's business or corporate insurance program?

2. Does your business insurance program cover the exposures whose risk you are responsible or accountable for managing?

3. Are your risk analysis models flexible enough to incorporate limits of loss relative to potential loss?

4. When you talk with decision makers, are you even referencing the existence of business insurance policies or other risk financing / transfer controls that limit your organization's exposure when significant adverse events occur?
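
To make the per-event limit idea concrete, here is a rough R sketch. The lognormal shape, its parameters and the flat $1M per-event retention are illustrative assumptions only – not a statement about any particular policy's terms – but it shows how a limit changes the tail of what the organization actually retains.

# Illustrative only: simulate per-event severity with an assumed lognormal
# roughly spanning the $1K / $10K / $2M estimates, then cap the
# organization's retained loss at an assumed $1M per-event limit.
set.seed(42)
n <- 10000
severity <- rlnorm(n, meanlog = log(1e4), sdlog = 1.5)   # assumed severity shape
retained <- pmin(severity, 1e6)                          # organization pays up to $1M per event
insured  <- pmax(severity - 1e6, 0)                      # insurance covers the remainder

quantile(severity, c(0.50, 0.95, 0.99))                  # gross exposure
quantile(retained, c(0.50, 0.95, 0.99))                  # exposure net of the insurance limit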

The more we can leverage other risk-related controls in the organization and paint a more accurate picture of exposure, the more we become a trusted advisor to our decision makers and other stakeholders in the larger risk management life-cycle.

Want to learn more?

AICPCU – Associate in Risk Management (ARM) – http://www.aicpcu.org/comet/programs/arm/arm.htm

SIRA – Society of Information Risk Analysts – http://www.societyinforisk.org


Wonder Twin Powers Activate…

March 7, 2013

…form of risk professional. I really miss blogging. The last year or so has been a complete gaggle from a relocation and time-management perspective. So naturally, discretionary activities – like blogging – take a back seat. I want to share a few quick thoughts around the topic of transitioning from a pure information technology / information security mindset to a risk management professional mindset.

1. Embrace the Gray Space. Information technology is all about bits, bytes, ones and zeros. Things either work or they don't; it is either black or white, either good or bad – you get the point. In the discipline of risk management we are interested in everything between the two extremes. It is within this space that there is information to allow decision makers to make better-informed decisions.

2. Embrace Uncertainty. Intuitively, the concept of uncertainty is contrary to a lot of information technology concepts. Foundational risk concepts revolve around understanding and managing uncertainty and infusing it into our analysis and conversations with decision makers. There is no reason why this cannot be done within information risk management programs as well. At first, it may feel awkward as an IT professional to admit to a leader that there is uncertainty inherent in some of the variables included in your analysis. However, what you will find – assuming you can clearly articulate your analysis – is that infusing the topic of uncertainty into your conversations and analysis has indirect benefits. Such an approach implies rigor and maturity, and it builds confidence with the decision maker.

3. Find New Friends. Notice I did not type find different friends. There is an old adage that goes something to the effect of "you are who you surround yourself with". Let me change this up: "you are who you are learning from". You want to learn risk management? Immerse yourself in non-IT risk management knowledge sources, learn centuries-old principles of risk management and then begin applying what you have learned to the information technology / information security problem space. Here are just a few places to begin:

a. https://www.societyinforisk.org/
b. Risk Management Magazine
c. The Risk Management Society
d. Property & Casualty – Enterprise Risk Management

4. Change Your Thinking. This is going to sound heretical but bear with me. Stop thinking like an IT professional and begin thinking like a business and risk management professional. Identify and follow the money trails for the various risk management problem spaces you are dealing with. Think like a commercial insurer. An entire industry exists to reduce the uncertainty associated with technology-related, operational risk – when bad things happen. So learn how commercial insurers think, so you can manage risk more effectively without having to overspend on third-party risk financing products – and manage risk in a way that ties back to the financials rather than to feelings and emotions. This is why I am so on board with the AICPCU's Associate in Risk Management (ARM) professional designation. You can also check out the FAIR risk measurement methodology, which is very useful for associating loss forms with adverse events and can help tell the story around financial consequences.

5. Don't Die On That Hill. I have to thank my new boss for this advice. Choose your risk management battles wisely, and in the heat of the conversation ask yourself if you need to die on this hill. Not all of our conversations with decision makers, leaders or even between ourselves – as dear colleagues – are easy. It is way too easy for passion to get in the way of progress and influence. Often, if you find yourself "on the hill" asking if you need to die – something has gone terribly wrong. Instead of dying and ruining a long-term relationship, take a few steps back, get more information that will help in the situation, regroup and attack again. This is an example of being a quiet professional.

That is all for now. Take care.


The AICPCU ‘Associate in Risk Management’ (ARM)

September 14, 2012

A year or so ago I stumbled upon the ARM designation that is administered through the AICPCU, or 'the Institutes' for short. What attracted me then to the designation was that it appeared to be a comprehensive approach to performing a risk assessment for scenarios that result in some form of business liability. Unfortunately, I did not start pursuing the designation until July 2012. The base designation consists of passing three tests on the topics of 'risk assessment', 'risk control' and 'risk financing'. In addition, there are a few other tests which allow one to extend the designation to include disciplines such as 'risk management for public entities' and 'enterprise risk management'.

I am about two months into my ARM journey and just passed the ARM-54 ‘Risk Assessment’ test. I wanted to share some perspective on the curriculum itself and some differentiators when compared to some other ‘risk assessment’ and ‘risk analysis / risk measurement’ frameworks.

1. Proven Approach. Insurance and risk management practices have been around for centuries. Insurance carriers, especially those who write commercial insurance products, are very skilled at identifying and understanding the various loss exposures businesses face. Within the information risk management and operational risk management space, many of the loss exposures we care about and look for are the same ones that insurance carriers look for when they assess a business for business risk and hazard risk so they can write a business insurance policy. In other words, the 'so what' associated with the bad things that we and insurance carriers care about is essentially a business liability that we want to manage. Our problem space, skills and risk treatment options may be slightly different, but the goal of our efforts is the same: risk management.

2. Comprehensive. The ARM-54 course alone covers an enormous amount of information. The material easily encompasses the high level learning objectives of six college undergraduate courses I have taken in the last few years:

- Insurance and Risk Management
- Commercial Insurance
- Statistics
- Business Law
- Calculus (Business / Finance Problem Analysis / Calculations)
- Business Finance

The test for ARM-54 was no walk in the park. Even though I passed on the first attempt, I short-changed myself on some of the objectives which caused a little bit of panic on my part. The questions were well written and quite a few of them forced you to understand problem context so you could choose the best answer.

3. 'Risk Management Value Chain'. The following is one of the largest selling points of this designation compared to other IT risk assessment frameworks, IT risk analysis frameworks and IT risk certifications / designations. The ARM curriculum connects the dots between risk assessment activities, risk management decisions and the financial implications of those decisions at various levels of abstraction. This is where existing IT-centric risk assessment / analysis frameworks fall short – they are either too narrow in focus, do not incorporate business context, are not practical to execute or, in some cases, not useful at all in helping someone or a business manage risk.

4. Cost Effective. For between $300 and $500 per ARM course, one can get some amazing reference material and pay for the test. Compare that to the cost of six university courses (between $6K and $9K) or the cost of one formal risk measurement course (~$1K). I am convinced that any risk management professional can begin applying concepts learned from the ARM-54 text within hours of having been introduced to it. So the cost of the textbooks alone (~$100, give or take) is justified even if you do not take the test(s).

5. Learn How To Fish. Finally, I think it is worth noting that there is nothing proprietary about the objectives and concepts presented in the ARM-54 'Risk Assessment' curriculum. Any statistical probability calculations or mathematical finance problems are exactly that – good ole math and probability calculations. In addition, there is nothing proprietary about the methods or definitions presented as they relate to risk assessments or risk management proper. This is an important selling point to me because there are many information risk management practitioners who are begging for curricula or training such as ARM, where they can begin applying what they are learning without being dependent on proprietary tools, proprietary calculations or a license to use a proprietary framework.

In closing, the ARM-54 curriculum is a very comprehensive risk management curriculum that establishes business context, introduces proven risk assessment methods, and reinforces sound risk management principles. In my opinion, it is very practical for the information / operational risk management professional – especially those who are new to risk management or looking for a non-IT, non-security-biased approach to risk management – regardless of the industry you work in.

So there you have it. I am really psyched about this designation and the benefits I am already realizing in my job as a Sr. Risk Advisor for a Fortune 200 financial services firm. I wish I had pursued this designation two years ago, but I am optimistic that I will make up for lost time and deliver tangible business value very quickly.


Assurance vs. Risk Management

August 29, 2012

One of my current hot buttons is the over-emphasis of assurance with regards to risk management. I was recently given visibility into a risk management framework where 'management assurance' was listed as the goal of the framework. However, the framework did not allow for management to actually manage risk.

Recently at BSidesLA I attempted to reduce the definitions of risk and ‘risk management’ down to fundamental attributes because there are so many different – and in a lot of cases contextually valid – definitions of risk.

Risk: Something that can happen that can result in loss. It is about the frequency of events that can have an adverse impact on our time, our resources and, of course, our money.

Risk Management: Activities that allow us to reduce our uncertainty about risk(s) so we can make good trade-off decisions.

So how does this tie into assurance? The shortcoming with an assurance-centric approach to risk management is that assurance IMPLIES 100% certainty that all risks are known and that all identified controls are comprehensive and effective. An assurance-centric approach also implies that a control gap, control failure or some other issue HAS to be mitigated so management can have FULL assurance regarding their risk management posture.

Where risk management comes into play is when management does not require 100% assurance because there may not be adequate benefit to their span of control or to the organization proper. Thus, robust risk management frameworks need to have a management response process – i.e. risk treatment decisions – for when issues or gaps are identified. A management response and risk treatment decision process has a few benefits:

1. It promotes transparency and accountability of management’s decisions regarding their risk management mindset (tolerance, appetite, etc.).

2. It empowers management to make the best business decision (think trade-off) given the information (containing elements of uncertainty) provided to them.

3. It potentially allows organizations to better understand the ‘total cost of risk’ (TCoR) relative to other operational costs associated with the business.

So here are the take-aways:

1. Assurance does not always equate to effective risk management.

2. Effective risk management can facilitate levels of assurance and confidence, as well as one's understanding of uncertainty regarding the loss exposures one is faced with.

3. Empowering and enabling management to make effective risk treatment decisions can provide management a level of assurance that they are running their business the way they deem fit.


Heat Map Love – R Style

January 20, 2012

Over the last several years, not a month has gone by where I have not heard someone mention R – with regards to risk analysis or risk modeling – either in discussion or on a mailing list. If you do not know what R is, take a few minutes to read about it at the project's main site. Simply put, R is a free software environment for statistical computing and graphics. Most of my quantitative modeling and analysis has been strictly Excel-based, which to date has been more than sufficient for my needs. However, Excel is not the 'end-all-be-all' tool. Excel does not contain every statistical distribution that risk practitioners may need to work with, there is no native Monte Carlo engine, and it has graphing limitations short of purchasing third-party add-ons (advanced charts, granular configuration of graphs, etc.).

Thanks to some industry peer prodding (Jay Jacobs of Verizon's Risk Intelligence team, and Alex Hutton suggesting that 'Numbers' is a superior tool for visualizations), I finally bit the bullet, downloaded and installed R. For those completely new to R, realize that R is a platform to build amazing things upon. It is very command-line-like in nature: you type in instructions and it executes them. I like this approach because you are forced to learn the R language and syntax. Thus, in the end you will probably understand your data and the resulting analysis much better.

One of the first graphics I wanted to explore with R was heat maps. At first, I was thinking of a standard risk management heat map: a 5×5 matrix with issues plotted on the matrix relative to frequency and magnitude. However, when I started searching Google for 'R heat map', a similar yet different style of heat map – referred to as a cluster heat map – was first returned in the search results. A cluster heat map is useful for comparing data elements in a matrix against each other, depending on how your data is laid out. It is very visual in nature and allows the reader to quickly zero in on data elements or visual information of importance. From an information risk management perspective, if we have quantitative risk information and some metadata, we can begin a discussion with management by leveraging a heat map visualization. If additional information is needed as to why there are dark areas, then we can have the discussion about the underlying quantitative data. Thus, I decided to build a cluster heat map in R.

I referenced three blogs to guide my efforts – they can be found here, here and here. What I am writing here is in no way a complete copy and paste of their content, because I provide additional details on some steps that generated errors for me – errors that in some cases took hours to figure out. This is not unexpected given the differences in data sets.

Let’s do it.

1.    Download and install R. After installation, start an R session. The version of R used for this post is 2.14.0. You can check your version by typing version at the command prompt and pressing ENTER.

2.    You will need to download and install the ggplot2 package / library. Do this through the R GUI by referencing an online CRAN repository (packages -> install packages …). This method seems to be cleaner than downloading a package to your hard disk and then telling R to install it. In addition, if you reference an online repository, it will also grab any dependent packages at the same time. You can learn more about ggplot2 here.
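
If you prefer the R console to the GUI menus, the same install can be done with a one-line command (R may prompt you to choose a CRAN mirror the first time):

install.packages("ggplot2")   # also pulls required dependencies from CRAN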

3.    Once you have installed the ggplot2 package, we have to load it into our current R workspace.

> library(ggplot2)
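
One caveat, depending on your version of ggplot2: the melt, ddply and rescale functions used in the steps below may not be loaded automatically along with ggplot2. In newer installations they live in separate packages, so if R reports that it cannot find them, loading those packages explicitly should help:

library(reshape2)   # provides melt()
library(plyr)       # provides ddply()
library(scales)     # provides rescale()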

4.    Next, we are going to import data to work with in R. Download ‘risktical_csv1.csv’ to your hard disk and execute the following command, changing the file path to match where you saved the file.

risk <- read.csv("C:/temph/risktical_csv1.csv", sep=",", check.names=FALSE)

a.    We are telling R to import a Comma Separated Value file and assign it to a variable called ‘risk’.
b.    read.csv is the function that performs the import.
c.    Notice that the slashes in the file path are forward slashes – the opposite of what they normally would be when working with other common Windows-based applications.
d.    'sep=","' tells R which character is used to separate values within the data set.
e.    'check.names=FALSE' tells R not to check the column headers for correctness. R expects column names to begin with letters; if it sees numbers, it will prepend an X to the column headers – we don't want that given the data set we are using.
f.    Once you hit enter, you can type ‘risk’ and hit enter again. The data from the file will be displayed on the screen.
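
As a quick sanity check on the import, R's built-in str() and head() functions will show the structure of the data frame and its first few rows:

str(risk)    # column names and data types
head(risk)   # first six rows of the imported data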

5.    Now we need to ‘shape’ the data. The ggplot graphing function we want to use cannot consume the data as it currently is, so we are going to reformat the data first. The ‘melt’ function helps us accomplish this.

risk.m <- melt(risk)

a.    We are telling R to use the melt function against the ‘risk’ variable. Then we are going to take the output from melt and create a new variable called risk.m.
b.    Melt rearranges the data elements. Type ‘help(melt)’ for more information.
c.    After you hit enter, you can type ‘risk.m’ and hit enter again. Notice the way the data is displayed compared to the data prior to ‘melting’ (variable ‘risk’).
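
If the reshaping feels abstract, a tiny made-up data frame (the column names here are hypothetical, just mimicking the layout of the CSV) shows what melt does:

# Hypothetical two-row example just to illustrate the reshaping.
toy <- data.frame(BU.Name = c("BU1", "BU2"),
                  Compliance = c(10, 20),
                  Access.Control = c(5, 15))
melt(toy)
# melt keeps the character/factor column (BU.Name) as an identifier and
# stacks the numeric columns into two new columns: 'variable' and 'value'.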

6.    Next, we have to rescale our numerical values so we know how to shade any given section of our heat map. The higher the numerical value within a series of data, the darker the color or shade that tile of the heat map should be. The ‘ddply’ function helps us accomplish the rescaling; type ‘help(ddply)’ for more information.

risk.m <- ddply(risk.m, .(variable), transform, rescale = rescale(value), reorder=FALSE)

a.    We are telling R to execute the ‘ddply’ function against the risk.m variable.
b.    We are also passing some arguments to ‘ddply’ telling it to transform and reshape the numerical values. The result of this command produces a new column of values between 0 and 1.
c.    Finally, we pass an argument to ‘ddply’ not to reorder any rows.
d.    After you hit enter, you can type ‘risk.m’ and hit enter again and observe changes to the data elements; there should be two new columns of data.

7.    We are now ready to plot our heat map.

(p <- ggplot(risk.m, aes(variable, BU.Name)) + geom_tile(aes(fill = rescale), colour = "grey20") + scale_fill_gradient(low = "white", high = "red"))

a.    This command will produce a very crude looking heat map plot.
b.    The plot itself is assigned to a variable called p
c.    ‘scale_fill_gradient’ is the argument that associates color shading to the numerical values we rescaled in step 6. The higher the rescaling value – the darker the shading.
d.    The ‘aes’ function of ggplot is related to aesthetics. You can type in ‘help(aes)’ to learn about the various ‘aes’ arguments.

8.    Before we tidy up the plot, let’s set a variable that we will use in formatting axis values in step 9.

base_size <- 9

9.    Now we are going to tidy up the plot. There is a lot going on here.

p + theme_grey(base_size = base_size) + labs(x = "", y = "") + scale_x_discrete(expand = c(0, 0)) + scale_y_discrete(expand = c(0, 0)) + opts(legend.position = "none", axis.ticks = theme_blank(), axis.text.x = theme_text(size = base_size * 0.8, angle = -90, hjust = 0, colour = "black"), axis.text.y = theme_text(size = base_size * 0.8, angle = 0, hjust = 0, colour = "black"))

a.    'labs(x = "", y = "")' removes the axis labels.
b.    'opts(legend.position = "none", …)' gets rid of the scaling legend.
c.    'axis.text.x = theme_text(size = base_size * 0.8, angle = -90, …)' sets the X axis text size as well as its orientation.
d.    The heat map should look like the image below.

A few final notes:

1.    The color shading is performed within series of data, vertically. Thus, in the heat map we have generated, the color for any given tile is relative to the tiles above and below it – IN THE SAME COLUMN – or in our case, for a given ISO 2700X policy section.

2.    If we transposed our original data set – risktical_csv2 – and applied the same commands, with the exception of replacing BU.Name with Policy in our initial ggplot command (step 7), we would get a heat map that looks like the one below.

3.    In this heat map, we can quickly determine key areas of exposure for all 36 of our fictional business units relative to ISO 2700X. For example, most of BU3’s exposure is related to Compliance, followed by Organizational Security Policy and Access Control. If the executive in that business unit wanted more granular information in terms of dollar value exposure, we could share that information with them.

So there you have it! A quick R tutorial on developing a cluster heat map for information risk management purposes. I look forward to learning more about R and leveraging it to analyze and visualize data in unique and thought-provoking ways. As always, feel free to leave comments!


Personal Risk Management

November 4, 2011

Somewhere between self-improvement, the feedback process, perception management and total quality management (TQM) is a lesson to be learned and an opportunity for introspection. I want [need] to document a few thoughts about the intersection of these concepts based on recent personal and professional experiences.

Self-Improvement. At some point while serving in the Marine Corps it became very obvious that there were three performance paths: be a bad performer and let the system make your life a living heck; be an average performer and let the system carry you along; or be a stellar performer and push the system to its limits and possibly change it. I have always chosen to chase after stellar, and it has worked pretty well for me over the years. However, in some professions, to maintain stellar status you have to constantly be seeking self-improvement.

Feedback. The term feedback means different things depending on the context in which it is being used. I find the act of feedback to be challenging on both the giving end and the receiving end – especially when the feedback is not complimentary. I have had both great and absolutely horrendous experiences as an actor in both roles. The reality is that having feedback mechanisms in place – whether formal or informal – is critical, regardless of the merit of the feedback or how the feedback was communicated. More on this later when I attempt to tie all of this together.

Perception Management. Perception is reality to most people regardless of the facts. Anyone that is actively managing their career or personal life probably cares about perception. Furthermore, we probably want to be in control of how people perceive our actions, thoughts, attitudes and even mannerisms – lest it be established by others.

Total Quality Management. My current school studies revolve around operations management – specifically quality improvement, TQM, Six Sigma, etc. There are concepts within TQM that can be applied to various dimensions of our lives: personal, professional, ethical, moral, giving, etc. Without going down a rabbit hole, I am convinced that quality improvement concepts allow us to construct guard rails (control limits) for the aforementioned dimensions.

So how does all of this tie together?

If you are serious about self-improvement and managing perception, you have to embrace feedback and consider whether you are approaching a quality limit when a feedback opportunity presents itself (with yourself as the recipient). You may not agree with the merit of the feedback or with the delivery mechanism, but you have to listen to – not just hear – what is being communicated. This is really hard to do sometimes, and how we react to the feedback experience can destroy relationships and further erode trust. When it comes to constructive criticism – if someone is taking the time to give it, regardless of its validity – could this possibly be an indicator that we are approaching some of our quality limits, whether we have defined them or not?

For example, here are two commonly used rules for determining if a process is out of control:
1.    A single point outside the control limits.
2.    Obvious consistent or persistent patterns that suggest that there is something unusual about the data.

Keeping these two rules in mind, we can go through this exercise of introspection. Such an exercise requires one to put their pride on the shelf, set aside emotions, and really try to flush out the opportunity for self-improvement. And if all of this can be done with a redemptive mindset – better yet. At the end of such an exercise, there should always be one or more questions we strive to answer:

1.    Is there something minor I can improve on? Is a slight adjustment needed to pull me back from the guard rails or better manage perception?
2.    Is there something major going on that calls for a massive adjustment? Is there really a fire that is producing all this feedback smoke?
3.    Was I a good partner in the feedback process? Did I listen? Did I have a redemptive mindset?

Hear me, folks – this topic and what I have outlined is not something I consider myself to be a stellar example of. However, I do care about self-improvement, managing perception, and adhering to quality in the execution of my responsibilities, and I will strive to keep what I have outlined in mind moving forward.

That’s it.


OpenPERT – A FREE Add-In for Microsoft Office Excel

August 15, 2011

INTRODUCTION. In early June of this year, Jay Jacobs and I started having a long email / phone call discussion about risk modeling, model comparisons, descriptive statistics, and risk management in general. At some point in our conversation the topic of Excel add-ins came up and how nice it would be to NOT have to rely upon 3rd-party add-ins that cost between hundreds and thousands of dollars to acquire. You can sort of think of the 80-20 rule when it comes to out-of-the-box Excel functionality – though it is probably more like 95-5 depending on your profession – most of the functionality you need to perform analysis is there. However, there are at least two capabilities not included in Excel that are useful for risk modeling and analysis: the betaPERT distribution and Monte Carlo simulation. Thus the need for either costly 3rd-party add-ins or a free alternative: the OpenPERT add-in.

ABOUT BETAPERT. You can get very thorough explanations about the betaPERT distribution here, here, and here. What follows is the ‘cliff notes’ version. The betaPERT distribution is often used for modeling subject matter expert estimates in scenarios where there is no data or not enough of it. The underlying distribution is the beta distribution (which is included in Microsoft Office Excel). If we can over-simplify and define a distribution as a collection or range of values, then the betaPERT distribution – when used with three values such as minimum, most likely (think mode) and maximum – will create a distribution of values (output) that can then be used for statistical analysis and modeling. By introducing a fourth parameter – which I will refer to as confidence regarding the ‘most likely’ estimate – we can account for the kurtosis – or peakedness – of the distribution.
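
For those curious what this looks like outside of Excel, here is a minimal R sketch of the common betaPERT parameterization built on the beta distribution. The function name and the lambda parameter (where lambda = 4 reproduces the classic PERT shape and other values adjust the peakedness) are my own illustrative choices, not part of the OpenPERT add-in itself:

# Minimal betaPERT sampler, assuming the standard parameterization
# of the underlying beta distribution.
rpert <- function(n, min, mode, max, lambda = 4) {
  alpha <- 1 + lambda * (mode - min) / (max - min)
  beta  <- 1 + lambda * (max - mode) / (max - min)
  min + (max - min) * rbeta(n, alpha, beta)
}

# Example: 10,000 severity draws from the $1K / $10K / $2M estimates
severity <- rpert(10000, 1e3, 1e4, 2e6)
summary(severity)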

WHO USES BETAPERT? There are a few professions and disciplines that leverage the betaPERT distribution:

Project Management – The project management profession is most often associated with betaPERT. PERT stands for Program (or Project) Evaluation and Review Technique. PERT was developed by the Navy and Booz-Allen-Hamilton back in the 1950s (ref. 1; see below) as part of the Polaris missile program. It is often used today in project management for project / task planning, and I believe it is covered as part of the PMP certification curriculum.

Risk Analysis / Modeling – There are some risk analysis scenarios where due to a lack of data, estimates are used to bring form to components of scenarios that factor into risk. The FAIR methodology – specifically some tools that leverage the FAIR methodology as applied to IT risk – is such an example of using betaPERT for risk analysis and risk modeling.

Ad-Hoc Analysis – There are many times where having access to a distribution like betaPERT is useful outside the disciplines listed above. For example, if a baker is looking to compare the price of her/his product with the rest of the market – data could be collected, a distribution created, and analysis could occur. Or, maybe a church is analyzing its year to year growth and wants to create a dynamic model that accounts for both probable growth and shrinkage – betaPERT can help with that as well.

OPENPERT ADD-IN FOR MICROSOFT OFFICE EXCEL. Jay and I developed the OpenPERT add-in as an alternative to paying money to leverage the betaPERT distribution. Of course, we underestimated the complexity of not only creating an Excel add-in but also working with the distribution itself and specific cases where divide by zero errors can occur. That said, we are very pleased with version 1.0 of OpenPERT and are excited about future enhancements as well as releasing examples of problem scenarios that are better understood with betaPERT analysis. Version 1.0 has been tested on Microsoft Office Excel 2007 and 2010; on both 32 bit and 64 bit Microsoft Windows operating systems. Version 1.0 of OpenPERT is not supported on ANY Microsoft Office for Mac products.

The project home of OpenPERT is here.

The downloads page is here. Even if you are familiar with the betaPERT distribution, please read the reference guide before installing and using the OpenPERT add-in.

Your feedback is welcome via support@openpert.org

Finally – On behalf of Jay and myself – a special thank you to members of the Society of Information Risk Analysts (SIRA) that helped test and provided feedback on the OpenPERT add-in. Find out more about SIRA here.

Ref. 1 – Malcolm, D. G., J. H. Roseboom, C. E. Clark and W. Fazar, “Application of a Technique for Research and Development Program Evaluation,” Operations Research, Vol. 7, No. 5, September–October 1959, pp. 646–669.

