Tuesday, October 1, 2019

Data Envelopment Analysis and Related Literature

As already stated, there is a rich body of literature and research demonstrating the importance of processes in analyzing the performance of an organization (Chase, 1981; Chase et al., 1983; Levitt, 1972; Roth et al., 1995). In particular, Roth et al. (1995; here and in the following) showed empirically that the key drivers are process capability and execution. Their study describes how an inappropriate design of certain processes, as well as poor process execution, can lead to process inefficiency, and identifies process capabilities and people as the major factors affecting business performance.

When estimating the performance of processes, a number of different outputs usually have to be taken into consideration. Data Envelopment Analysis, the estimation method described in this chapter and used as a basis for measuring the efficiency of business processes, deals with these multiple outputs by means of frontier estimation. Specifically, it determines the relative performance across multiple inputs and outputs. This is achieved by calculating ratios of weighted outputs to weighted inputs and by determining the relative efficiency (seen as the distance of a peer object from the best-practice frontier) in comparison with other so-called Decision Making Units (Charnes et al., 1978). Decision Making Units can be defined as firms or public-sector agencies, but also as single processes or process instances (Sengupta, 1995). Data Envelopment Analysis is therefore used in many different areas, for example in the education programs of schools or in the production and retail business (Metters et al., 2003).

The Data Envelopment Analysis method was introduced into the operations research literature by Charnes, Cooper, and Rhodes in 1978 (see Charnes et al., 1978). They presented it as a new nonparametric, multi-factor productivity analysis model (Sengupta, 1995); nonparametric means that it is based entirely on the observed input-output data and not on the assumption of a normal (Gaussian) distribution underlying the measured parameters. The model as originally formulated by Charnes, Cooper, and Rhodes, later referred to as the “CCR model”, has the important and critical characteristic of reducing the multi-output, multi-input situation of each Decision Making Unit to that of a single “virtual” output and a single “virtual” input. The original Data Envelopment Analysis model identifies the efficiency measure of a Decision Making Unit by maximizing the ratio of its weighted outputs to its weighted inputs, subject to the condition that the corresponding ratios of all Decision Making Units are not larger than one (here and in the following Frei et al., 1999). This in turn results in a set of efficiency scores less than or equal to one, as well as a reference set of Decision Making Units identified as efficient. The method has also come to be known as the “input-oriented model”, because the overall efficiency score is determined by holding the outputs constant and evaluating to what degree the inputs would have to be changed for a Decision Making Unit to be considered efficient. The likewise existing “output-oriented method” is very similar to the input-oriented method.
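For reference, the ratio formulation just described can be stated compactly in standard DEA notation (a sketch; these symbols do not appear in the original text): let $y_{rj}$ be the $r$-th output and $x_{ij}$ the $i$-th input of Decision Making Unit $j$, let $u$ and $v$ be the output and input weights, and let $o$ index the unit under evaluation. Then

\[
\max_{u,v}\; h_o \;=\; \frac{\sum_r u_r\, y_{ro}}{\sum_i v_i\, x_{io}}
\qquad \text{subject to} \qquad
\frac{\sum_r u_r\, y_{rj}}{\sum_i v_i\, x_{ij}} \;\le\; 1 \;\;\text{for all } j,
\qquad u_r \ge 0,\; v_i \ge 0.
\]

Normalizing the denominator of the objective to one (the Charnes-Cooper transformation) turns this fractional program into the linear program that is solved in practice.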
In this approach, the ratio of weighted inputs to weighted outputs is minimized in order to evaluate by how much each Decision Making Unit’s outputs could be improved while the inputs are held constant. In summary, in both cases a Decision Making Unit identified as efficient has no potential for improvement, whereas Decision Making Units seen as inefficient receive efficiency scores that reflect their actual potential for improvement, based on what other Decision Making Units achieve. A separate linear program must be solved for each Decision Making Unit in order to determine the relative efficiency scores. Because linear functions are used, the implicit assumption is that the efficient frontier is piecewise linear. As a result, the original Data Envelopment Analysis model yields a ranking of the Decision Making Units on a scale of relative efficiency from lowest to highest, where the highest is considered one hundred percent efficient (Sengupta, 1995).
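To make the one-linear-program-per-unit idea concrete, here is a minimal Python sketch (not part of the original text) that computes input-oriented CCR efficiency scores by solving the envelopment form, the linear-programming dual of the ratio model above, once per Decision Making Unit with scipy.optimize.linprog. The function name and the small example data are hypothetical.

import numpy as np
from scipy.optimize import linprog

def ccr_input_efficiency(X, Y):
    """Input-oriented CCR efficiency scores, one LP per DMU.

    X: array of shape (n_dmus, n_inputs), observed inputs.
    Y: array of shape (n_dmus, n_outputs), observed outputs.
    Returns scores in (0, 1]; a score of 1 marks an efficient DMU.
    """
    n, m = X.shape
    s = Y.shape[1]
    scores = np.empty(n)
    for o in range(n):
        # Variables: [theta, lambda_1, ..., lambda_n]; minimize theta.
        c = np.concatenate([[1.0], np.zeros(n)])
        # Input rows: sum_j lambda_j * x_ij - theta * x_io <= 0.
        A_inputs = np.hstack([-X[o].reshape(m, 1), X.T])
        # Output rows: -sum_j lambda_j * y_rj <= -y_ro
        # (the composite unit must produce at least DMU o's outputs).
        A_outputs = np.hstack([np.zeros((s, 1)), -Y.T])
        res = linprog(
            c,
            A_ub=np.vstack([A_inputs, A_outputs]),
            b_ub=np.concatenate([np.zeros(m), -Y[o]]),
            bounds=[(0, None)] * (1 + n),  # theta and all lambdas nonnegative
            method="highs",
        )
        scores[o] = res.fun
    return scores

# Five hypothetical DMUs, two inputs, one identical output each.
X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0], [2.0, 4.0]])
Y = np.ones((5, 1))
print(ccr_input_efficiency(X, Y))  # units on the frontier score 1.0

The output-oriented variant described above can be obtained analogously by holding the inputs fixed and maximizing a scaling factor on the outputs instead.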
