


Implementation and Impact of IT supported Performance Measurement Systems

Sai S. Nudurupati and Umit S. Bititci

CSM, University of Strathclyde, Glasgow, UK.

E-mail: sai.nudurupati@strath.ac.uk

Abstract

There is little solid research evidence that illustrates the implementation and impact of performance measurement systems. A structured pilot case study demonstrated that an appropriately designed performance measurement system with appropriate IT support, driven by senior management commitment, delivered a positive impact on management and the business when implemented. Based on these results, a hypothesis was constructed around several enabling factors (technical and people factors) which ensure the appropriateness of the IT support and its use for performance measurement, and hence allow the impact of IT supported Performance Measurement Systems (IT-PMS) to be assessed. The hypothesis was tested in three other companies using action research as the main methodology. From the results obtained, it was found that IT-PMS brings significant improvement in identifying business strengths and weaknesses, making pro-active decisions and facilitating continuous improvement, as well as in disseminating information and knowledge throughout the organisation.

Keywords

Performance measurement, IT platform, drive from management.

Introduction

Despite the considerable amount of research carried out on performance measurement, little research has focused on its implementation and use (Bourne et al, 2000a). The major barriers identified for the implementation and use of performance measurement are the lack of IT platforms (Bierbusse et al, 1998; Bititci et al, 2000a; Bourne et al, 2000b) and people's behaviour with information (Davenport, 1997; Eccles, 1991; Bititci et al, 2002a). There has been a revolution in the market place over the last decade, with a number of software vendors offering different IT platforms to support performance measurement.

Many organisations find it difficult to produce a persuasive business justification for such investment opportunities because there is often high uncertainty about the scale of impact and the scale of costs likely to be incurred. Despite the large number of implementations of IT supported Performance Measurement Systems (IT-PMS) in different industrial sectors, almost no research has been done to establish the scale of impact. According to Bititci (2002a), Holloway (2001) and Neely (2002), there is very little solid research evidence that illustrates the impact of performance measurement systems. Hence the objective of the research presented in this paper was to implement IT-PMS and assess its impact on business and management.

Background

The performance measurement revolution started in the late 1970s and early 1980s with dissatisfaction with traditional, backward-looking accounting systems. Since then, there has been constant development in the design of performance measurement systems. However, the implementation and use of performance measurement have received considerable attention only in recent years (Kennerly and Neely 2003, Bourne et al 2000, Nudurupati et al 2003).

2.1 Designing performance measures

There are a number of frameworks and models developed for designing performance measurement from strategy. Some of these models include Strategic Measurement and Reporting Technique (Cross et al 1989), The Performance Measurement Matrix (Keegan et al 1989), Performance Measurement Questionnaire (Dixon et al, 1990), Results and Determinants Framework (Fitzgerald et al 1991), Balanced Scorecard (Kaplan and Norton, 1992 & 1996), Cambridge Performance Measurement System Design Process (Neely et al, 1996), Integrated Dynamic Performance Measurement System (Ghalayini et al, 1997), Integrated Performance Measurement System Reference Model (Bititci et al, 1998), Performance Prism (Neely et al, 2001).

2.2 Implementing performance measures

When implementing a performance measurement system, indicators are often poorly defined (Schneiderman, 1999), which can lead to misunderstanding. Hence the measures and indicators should be clearly defined (Bourne et al 1998, Neely et al, 1996), understood and communicated. According to Bourne et al (2000), Marr et al (2002) and Nudurupati et al (2002), implementing each measure requires the following tasks:

• Data Creation: The policies, procedures and systems required to create the required data.

• Data Collection: The policies, procedures and systems required for data collection at regular intervals.

• Data Analysis: The policies, procedures and systems required to convert the collected data into useful information such as trend charts, comparison charts, summary reports, statistical analysis, etc.

• Information Distribution: The policies, procedures and systems required to communicate this information to the right people at the right time supporting decision-making.
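The four tasks above can be pictured, purely as an illustration, as stages of a small pipeline. The sketch below is an assumption of how such a pipeline might look in code; all data, names and figures are hypothetical, not taken from the case companies:

```python
from statistics import mean

# Hypothetical sketch of the four implementation tasks for a single measure.

# Data creation: the business process produces raw records.
def create_data():
    return [{"week": w, "oee": v}
            for w, v in enumerate([0.62, 0.60, 0.65, 0.71, 0.74], start=1)]

# Data collection: gather the recorded values at regular intervals.
def collect(records):
    return [r["oee"] for r in records]

# Data analysis: convert the collected data into useful information.
def analyse(values):
    return {"mean": round(mean(values), 3),
            "trend": "improving" if values[-1] > values[0] else "declining"}

# Information distribution: communicate the result to the right people.
def distribute(summary, recipients):
    return {name: summary for name in recipients}

report = distribute(analyse(collect(create_data())), ["ops_manager", "md"])
print(report["md"])
```

Each stage corresponds to one of the bullet points: the value of separating them is that the analysis and distribution stages can later be automated without disturbing how the data is created or collected.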

According to Bititci et al (2000), Bourne et al (2000), Hudson et al (1999), Neely (1999) and Bierbusse et al (1998), performance measurement implementation fails in many companies because of a lack of IT support. Hence IT is clearly one of the critical success factors for performance measurement implementation. According to Meekings (1995), successful implementation of performance measurement depends less on selecting the right measures and more on the way the measures are implemented and used by the people in the business. Data collection, analysis and reporting should be automated as much as possible to save time and effort as well as to provide consistency (Bourne et al 2000). It should also be made a routine part of the business where applicable. IT support during the implementation of performance measurement can be provided as follows:

• Building an in-house IT platform from the available resources using tools such as MS Excel, MS Access, web development tools, etc. Using these tools, the users can configure and build their own performance reports (Bititci et al, 2002, Nudurupati et al 2003).

• Buying an IT platform/software package available in the market (Nudurupati et al 2001, Marr et al 2001). In this case there are a number of options:

▪ Enterprise Resource Planning (ERP) solution. ERP is a multi-module software system that includes a central relational database and several software modules for managing purchasing, inventory, production, personnel, shipping, customer service, financial planning, and other important aspects of the business. Some ERP vendors started to integrate performance measurement as a module or feature within their ERP platform, e.g. SAP Ltd., PeopleSoft, Oracle Corp. Ltd. etc.

▪ Business Intelligence (BI) Solution. BI is a continuous and systematic process, which facilitates the improvement of business processes and reduces the time used for decision-making. Typically, it includes software functions, such as data sourcing, data analysis, risk assessment, decision support, etc. It includes software tools, such as Data Marts, Data Warehousing, On-line Transactional Analysis tools, Multidimensional Databases or On-Line Analytical Processing (OLAP) tools, Ad-hoc and Prepackaged Query tools, etc. Typically BI platforms are provided by: Hyperion, SAS Institute, Cognos Ltd., Pilot Software Ltd, SAP Ltd., PeopleSoft, CorVu Plc., Gentia Software Ltd., Comshare etc.

▪ Dedicated software for performance measurement. These platforms allow organisations to implement performance measurement frameworks, such as the Balanced Scorecard, EFQM, etc. Typically, they collect the relevant information from different sources, analyse it and communicate it to the different users who make decisions. Vendors of dedicated performance measurement platforms include PB Views Ltd., QPR Software Plc., Inphase Software Ltd., Hyperion, Cognos Ltd., Gentia Software Ltd., Comshare, Active Strategy, etc.

2.3 Using and updating performance measures

Providing information on performance is not sufficient to improve business performance results; the real success lies in how people use this performance information (Prahalad et al, 2002, Davenport, 1997, Eccles, 1991). Many believe that the main reason performance measurement initiatives are short-lived is people's behaviour with information (PIB) (Bititci et al, 2002, Marchand et al, 2000). Meekings (1995) argues that making people use measures properly not only delivers performance improvement but also becomes a vehicle for cultural change, which helps liberate the power of the organisation.

“PIB is defined as people’s behaviour with performance information in hand. It can be positive behaviour such as pro-active and confident decision making, continuously improving, etc. or negative behaviour such as resistance, wrong interpretation of information, etc.”

Senior managers are responsible for changing the way they manage their business (Hope et al 1998, Eccles 1991). They should attend the workshops and become deeply involved in shaping the objectives and the measurement system (Meekings 1995, Coch et al 1948). The commitment from the senior managers should come in the form of a drive: they should start using the system (i.e. the performance information) and ask questions in the management briefings as a learning process, with a non-threatening management style (Bititci et al 2002). This makes the next level of management realise the interest shown by senior management, and they in turn will start using the system and examining the performance information before going to the management briefings. In this way, use of the system will be deployed down through the organisation to the lower levels, which will enhance the usage of the performance measurement system throughout. The next best way of making people use the system is by building confidence and demonstrating the benefits of IT-PMS.

Just as a company's strategy changes dynamically based on external fluctuations, the relevant performance measures/indicators should also be reviewed to retain their relevance to the strategy (Bourne et al, 2000, Dixon et al, 1990, Bititci et al 2000). Hence a performance measurement system should include an effective mechanism for reviewing targets (Ghalayini et al, 1996) and a process for developing measures or indicators as circumstances change (Meekings, 1995, Dixon et al, 1990, Kennerly et al 2003).

Methodology

Studying the effect of a dynamic socio-technical system such as an IT enabled Performance Measurement System (IT-PMS) can only be realised in action (Coughlan et al, 2002). Since our objective was to find these intended and un-intended changes, action research was chosen as the main methodology of the research. The intended and unintended changes were first observed in a structured pilot case study, from which a hypothesis was generated (induction). The hypothesis was then tested (deduction) at three companies. Data was collected before and after implementing the system; at both stages, structured interviews were conducted with all managers who used the system. In total, 28 managers were involved in 56 interviews across the three companies. Personal observations (through involvement) also assisted in data analysis. Cross-case analysis across the three companies was carried out to generate theory and create new knowledge.

Hypothesis

There is anecdotal evidence for the success of IT-PMS in four Scottish-based companies. In one of these companies, AFE (Packaging, Foil Rolling and Technical Products), a more detailed and structured case study was carried out. The design, implementation and impact of IT-PMS at AFE were the subject of previous papers (Bititci, 2002a and 2002b). This study provided strong empirical evidence that appropriately designed performance measurement systems (Neely et al 1996, Bititci et al 1998, Kaplan and Norton 1992 & 1996), if supported through appropriate IT platforms and appropriately implemented and used, will:

1. Hypothesis 1.1: Improve Business Benefits by focusing attention on critical areas of the business (this is facilitated by identifying strengths and weaknesses of the business). Some of the business benefits include:

• making performance information more transparent and visible

• improving accuracy, reliability and credibility of performance information

• creating awareness of issues and focus around critical problems

2. Hypothesis 1.2: Improve Relationship with customers and suppliers by sharing the key performance related information

3. Hypothesis 1.3: Facilitate management to make more pro-active decisions (In this context proactive decision means - making decisions faster than before, with up-to-date information in near real-time basis)

4. Hypothesis 1.4: Act as a critical component of an ongoing process improvement tool

5. Hypothesis 1.5: Disseminate the performance information effectively throughout the organisation.

6. Hypothesis 1.6: Change the behaviour of people in the organisation (either negatively or positively)

In this context, the appropriateness of the IT platform is defined as how well it is designed and how suitable it is for the organisation; it is achieved with the technical factors shown in Table 1. The appropriate usage of the system is defined as how well the people in the organisation use the system, which is achieved with the people factors shown in Table 1.

| |Does not exist at all|Exists at basic level|Exists at emerging level|Exists at standard level|Exists at significant level|
|1. Technical Factors| | | | | |
|Factor 1.1: provides up-to-date performance information and access to managers in (near) real time| | | | | |
|Factor 1.2: provides means to compare performance against targets and best-in-class performances| | | | | |
|Factor 1.3: provides open communication of information throughout the organisation (through the web)| | | | | |
|Factor 1.4: provides secured access to customers and suppliers and facilitates communication with them (if necessary)| | | | | |
|Factor 1.5: provides consistent and accurate information| | | | | |
|Factor 1.6: reduces the time and effort required to gather data, analyse it and communicate it throughout the organisation| | | | | |
|Factor 1.7: provides statistical analysis for monitoring and controlling key variables of the process| | | | | |
|Factor 1.8: simple and easy to use| | | | | |
|2. People Factors| | | | | |
|Factor 2.1: senior management are committed and are driving the implementation and use| | | | | |
|Factor 2.2: people are using the system in identifying business trends| | | | | |
|Factor 2.3: people are using the system for decision-making| | | | | |
|Factor 2.4: people are acting as teams to solve the issues (internal teams as well as external teams including suppliers and customers)| | | | | |
|Factor 2.5: people are using the system as a routine part of their business| | | | | |
|Factor 2.6: people are not resistant to using the system or statistical analysis| | | | | |
|Factor 2.7: people are knowledgeable enough to use the system and statistical approaches| | | | | |
|Factor 2.8: people are empowered to make decisions based on the information available| | | | | |
|Factor 2.9: people are confident about the information in their management briefings| | | | | |

Table 1: Technical and People factors used to assess the IT support and its use for performance measurement

These technical factors (for the appropriateness of the system) and people factors (for the system's use) were assessed, and the impact of the system on business and management was then tested across three action case studies.
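As an illustration of how the five-point scale in Table 1 might be scored, the sketch below averages ratings on a 0-4 ordinal scale, one per factor. The ratings and the aggregation rule are assumptions for illustration only, not the assessment procedure used in the study:

```python
# Hypothetical scoring of Table 1's five-point maturity scale, 0-4,
# from "does not exist at all" (0) to "exists at significant level" (4).
SCALE = ["does not exist", "basic", "emerging", "standard", "significant"]

def assess(ratings):
    """Average an assessor's 0-4 ratings and name the nearest level."""
    score = sum(ratings) / len(ratings)
    return score, SCALE[round(score)]

technical = [3, 4, 3, 2, 4, 3, 2, 4]   # one rating per technical factor 1.1-1.8
people = [4, 3, 3, 2, 3, 2, 2, 3, 3]   # one rating per people factor 2.1-2.9

print(assess(technical))
print(assess(people))
```

Treating the scale as ordinal and averaging is a simplification; a real assessment would also weigh which individual factors are weakest, since a single missing factor (e.g. no senior management drive) can undermine the whole system.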

5. Action Cases

The IT-PMS was implemented at three companies.

5.1 Action Case 1

Company SLC, based in Edinburgh and founded in 1858, had vast experience in producing different types of labels using digital and combination press technology, and employed 64 people. Before the IT-PMS was implemented, the Company was not using any performance indicators derived from its strategy. It had a few financial measures, and its information systems were built around these measures. The Company as a whole was suffering from:

• Lack of a balanced set of indicators (such as customer measures, operational measures, etc.)

• Hidden information

• Difficulties with data processing and sorting information in a meaningful way

• Lack of effective communication of the information

• Lack of visibility over changes and trends

• Lack of readily available information to support decision-making

• Lack of culture in using information for decision-making

The researchers presented their work with AFE (Bititci, 2002a and 2002b) to the management team in SLC, who decided to adopt and implement IT-PMS using low-cost SPC software. Since the Company already had financial measures in place, the management team wanted to deploy leading indicators (which directly affect the financial measures) and decided on Overall Equipment Effectiveness (OEE) as the leading indicator. This meant measuring the availability, performance, quality and hence OEE of all the machines. Due to the lack of IT skills to support these measures, the Company decided to bring in the researchers (authors), who played a leading role in implementing this project as well as in assessing the impact of IT-PMS.
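OEE is conventionally computed as the product of the availability, performance and quality rates, which is why measuring those three quantities yields OEE "for free". A minimal sketch, with hypothetical figures rather than data from the case company:

```python
def oee(availability, performance, quality):
    """Overall Equipment Effectiveness as the product of the three rates,
    each expressed as a fraction between 0 and 1."""
    return availability * performance * quality

# Hypothetical machine: available 90% of planned time, running at 85%
# of rated speed, producing 98% good parts.
print(round(oee(0.90, 0.85, 0.98), 3))
```

Because the three rates multiply, a modest shortfall in each compounds: three individually respectable figures around 90% still yield an OEE near 75%, which is why OEE is a sensitive leading indicator.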

The in-house information technology (MS Access, HTML, JavaScript, VBScript, CGI scripts, SPC software and Internet Information Server) was used for data analysis, reporting and communication of information as normal trend charts, statistical charts and summary reports, as shown in Figure 1. These reports and charts were communicated to all managers through the Company intranet. The system was built to suit any person with basic PC knowledge: once a user selects an option and clicks a button on the web page shown in Figure 1a, the software platform automatically connects to the MS Access database and retrieves the information in the form of normal charts and statistical quality control charts, as shown in Figures 1c and 1d, as well as summary reports, as shown in Figure 1b.

[Figure omitted: (a) menu page; (b) summary report on OEE performance; (c) Shewhart chart of OEE performance, Week 1 to Week 42, 2001; (d) Shewhart chart of OEE performance, Week 42, 2001 to Week 22, 2002]

Figure 1: Sample pages available on the Intranet of Company SLC.

The IT-PMS implementation started in Week 42 and was completed in Week 50. The chart in Figure 1c represents the OEE performance of the whole factory (all machines) before Week 42, prior to implementation; the OEE performance measure is almost constant. The implementation of IT-PMS during Weeks 42 to 50 drove an improvement in OEE, as shown in Figure 1d. This improvement in factory OEE was reflected in the financial figures. Initially this gained the Marketing and Financial Managers' confidence in the system, and it also drew the Managing Director's attention. However, the system included only a limited set of measures (within operations) with limited flexibility, due to the limitations of the data-capturing system on the shop floor.
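The Shewhart charts used for reporting plot a measure against statistically derived control limits, so that a genuine shift in OEE stands out from week-to-week noise. One common construction, an individuals chart with limits derived from the average moving range, can be sketched as follows; the weekly figures are hypothetical, and this is not necessarily the exact method built into the SPC software used at SLC:

```python
def shewhart_limits(values):
    """Individuals (X) chart limits from the average moving range.
    The constant 2.66 is 3/d2 with d2 = 1.128 for subgroups of size 2."""
    centre = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return centre - 2.66 * mr_bar, centre, centre + 2.66 * mr_bar

# Hypothetical weekly factory OEE figures before an improvement drive.
weekly_oee = [0.61, 0.63, 0.60, 0.64, 0.62, 0.63]
lcl, centre, ucl = shewhart_limits(weekly_oee)
out_of_control = [v for v in weekly_oee if not lcl <= v <= ucl]
```

A stable process stays inside the limits, as here; a sustained run of points above the upper limit after an intervention is the chart's evidence that the improvement is real rather than noise.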

Four to six months after the implementation, the interviews with the managers demonstrated that there was not much impact on the business and its management, as shown in Figure 3. The reasons for the lack of a significant impact/improvement were:

• The IT-PMS was implemented only for measuring the operational activities of the Company

• There were limitations in the data collected (the automated data-capturing system was originally designed around financial measures and could not be changed to capture the data required to measure OEE by shift, by product, etc.)

• The senior management did not develop confidence in the system due to the limitations in the data capturing and hence did not drive the company-wide use of the system

• The performance results were not communicated to everyone in the organisation, in particular to team leaders and people on the shop floor

5.2 Action Case 2

Company HSL, based in Perthshire and formed in 1979, was established as a leading source of pure, unspoilt natural mineral water, and employed 200 people. Before the IT-PMS was implemented, the Company did not have performance measures derived from its strategy, although it had a few financial measures. Data was collected on paper on the shop floor. The Company was suffering from:

• Information was hidden in paper sources

• Difficulties associated with data collection on the shop floor

• Lack of visibility over changes and trends

• Lack of effective communication of the information

The researchers presented their work with AFE (Bititci, 2002a and 2002b) to the management team in HSL, who decided to adopt and implement IT-PMS. Since the Company already had financial measures in place, the management team wanted to deploy leading indicators (which would directly affect the financial measures) and decided on Overall Equipment Effectiveness (OEE) and downtime as the leading indicators. This meant measuring the availability, performance, quality and hence OEE, as well as the downtime, of all machines. This work was included as part of a bigger continuous improvement project and was resourced internally by the Company. The researchers (authors) played a participating role (as external participants) in implementing this project. The in-house information technology (Cognos reporting tool) was used for data analysis, reporting and communication of information as normal trend charts and summary reports. However, the researchers played the lead role in observing and collecting information on the impact of the IT-PMS.

After the IT-PMS was implemented at HSL and used for six months, the interviews with the managers demonstrated that there were significant improvements, as shown in Figure 3. The reasons for these improvements were:

• The Senior Management was committed to the system and its use

• The management team gained confidence in performance information generated by the system because the data was accurate and timely

• The performance results were communicated to everyone in the organisation, in particular to team leaders and people on the shop floor

5.3 Action Case 3

Company ADL, based in Dumbarton, was founded in 1988, although the roots of the Company originated in 1827. It is a leading international spirit producer with 1200 employees. Before the IT-PMS was implemented at ADL, data was stored in different sources, such as Data3, Lotus Notes, Lotus Approach databases, AS/400, the Cognos reporting tool, MS Access, etc. There was no single source to collate data from these sources, analyse it and communicate it. The business was functionally organised and there was no communication of information between departments. The departments were completely isolated, and the improvement projects launched by one department did not always tie in with those of the others: "One person's music was sometimes noise to the other". Most of the managers were bureaucratic and based their decisions on experience rather than on information. Some of the managers were unfocused and not clear about their objectives. The culture of the employees at ADL was not to change their roles ("They always wanted to hold on to their cheese and did not want to change their cheese"). The Company as a whole was suffering from:

• Performance information was not available at a single source (as data was available at different sources)

• Having few lagging indicators and not having leading indicators

• Lack of visibility because information was hidden

• Difficulties associated with data processing and sorting information

• Changes and trends were not transparent to everyone

• Information available for decision-making was out-of-date

The researchers presented their work with AFE (Bititci, 2002a and 2002b), SLC and HSL to the management team at ADL, who decided to adopt and implement IT-PMS using low-cost SPC software for their Operations and Quality Departments. The management decided to measure line reliability, end-of-line quality, customer complaints, On Time In Full (OTIF) and absenteeism. The Company brought in the researchers, who played a leading role in implementing this project.

The in-house information technology (MS Access, HTML, JavaScript, VBScript, CGI scripts and SPC software) was used for data analysis, reporting and communication of information as normal trend charts, statistical charts, Pareto analyses and summary reports, as shown in Figure 2. These reports and charts were communicated to all managers through the Company intranet. The system was built to suit any person with basic PC knowledge: once a user selects an option and clicks a button on the web page shown in Figure 2a, the software platform automatically connects to the source database and retrieves the information in the form of a summary report (Figure 2b), a Pareto chart (Figure 2c), a normal trend chart with target (Figure 2d) and a statistical chart (Figure 2e).

[Figure omitted: (a) menu page; (b) summary report of Major A and B and Critical A and B defects on Line 01; (c) Pareto analysis of defect classification on Line 14; (d) 10-week rolling average of line reliability of all lines with a target of 63.5; (e) percentage of Major A defects on Line 12 including an upper control limit of 0.4%]

Figure 2: Sample pages available on the Intranet of Company ADL.

After the IT-PMS was implemented, the key information became available to everyone in the Company for decision-making. However, the information was initially restricted to the senior managers for the first three months and later rolled out to the lower levels. Since the information available in one department was also available to the other departments, and vice versa, managers started communicating and acting as a team. This created visibility and transparency of the available information. The major impact of the system at this company was the Pareto analysis presenting the defect classification on each line. This made each line crew concentrate on reducing the top two or three defects on their line and on identifying process variables.
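Such a Pareto analysis ranks defect categories by frequency and reports the cumulative share accounted for by the leading categories, which is what lets a line crew focus on the top two or three. A sketch with hypothetical defect counts (the categories are invented for illustration, not ADL's):

```python
def pareto(defect_counts, top_n=3):
    """Rank defect categories by count and report cumulative percentage."""
    total = sum(defect_counts.values())
    ranked = sorted(defect_counts.items(), key=lambda kv: kv[1], reverse=True)
    cumulative, rows = 0, []
    for category, count in ranked[:top_n]:
        cumulative += count
        rows.append((category, count, round(100 * cumulative / total, 1)))
    return rows

# Hypothetical defect counts for one production line over a period.
counts = {"label skew": 120, "print smear": 45, "cap torque": 30,
          "fill level": 15, "other": 10}
for category, count, cum_pct in pareto(counts):
    print(f"{category}: {count} ({cum_pct}% cumulative)")
```

In this example the top three categories account for almost 90% of all defects, illustrating the "vital few" pattern the analysis is designed to expose.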

After the IT-PMS had been in use at ADL for three months, the interviews with the managers demonstrated that there were some improvements, as shown in Figure 3. The reasons for these improvements were:

• The Senior Management was very committed to the implementation and its use

• All the management gained confidence in performance information generated by the system in near-real-time

• The performance results are communicated to everyone in the organisation, especially to all team leaders and people on the shop floor

However, the reasons why the system did not make a more significant impact were:

• The people on the shop floor had not started using the system (at the time of the interviews), because they did not have access to the computers which display information on the production lines

• Even though the managers were committed to the implementation, some of them lacked vision in terms of what they wanted from it

• More than three months are required for impacts such as improved cultural behaviour and improved relationships with customers and suppliers to become visible

6. Discussion

The final results of the interviews at each company were compared against each other, as shown in Figure 3. It is clearly evident that the impact of the implementations at AFE and HSL was more significant than at ADL and SLC. The reason is that performance measurement was implemented throughout the organisation at AFE and HSL, whereas it was restricted to operational and quality activities at ADL and SLC. Senior management commitment was significant at AFE, ADL and HSL. These managers used the system in their routine business to identify trends and make pro-active decisions. They insisted that everyone use the system, with an open and non-threatening management style. Initially, at AFE and HSL, there was some resistance at the lower levels to using the system, which was overcome as people realised the benefits.

At SLC, however, the management were not confident in the information generated by the system because of its limitations. Hence the senior management were not committed to the rest of the project and, as a result, did not drive people to use the system. This was reflected in the interviews, as shown in Figure 3, with very little improvement at SLC.

[Figure omitted: cross-case analysis results]

Figure 3: Cross Case Analysis - Results

The relationship with customers and suppliers was enhanced in the case of AFE, as it shared some of the critical statistical information with customers to improve the output product delivered to them. In the remainder of the companies, however, this relationship did not improve much because the companies did not provide their customers and suppliers with access to information. These companies wanted to exploit the system within the organisation first and then involve suppliers and customers in improvement projects.

The dissemination of information was significant in all four companies, as the IT-PMS was similar in all of them. However, due to some limitations of data capturing on the shop floor at SLC, not all of the required information could be disseminated, which is reflected in Figure 3 with less improvement than in the rest of the companies.

The culture of the people was almost the same in all four companies. Some of the managers described the people's culture as "Who moved my cheese": the employees always wanted to do routine tasks without changing, no matter how the world changed. However, with the drive from senior management, people realised the benefits of the system, and changes for the better were observed in some individuals. The management reckoned that there was still room for improving the behaviour of the people, which may take a few years.

The other factors that contributed to the significant improvements at AFE, ADL and HSL were:

• Drive and commitment from senior management

• Open and non-threatening management style

• Simplicity in selecting few measures

• Adoption of Shewhart charts as a standard method of reporting

• Integration of data collection and analysis into the business as part of everyday job

• Automation of data collection, analysis and communication as much as possible

• Maintenance of data accuracy

• Facilitation of cross-functional teams for continuous improvement programmes

• Empowerment of people to make fact based and information driven decisions

7. Conclusion

This paper provided empirical evidence that appropriately designed performance measurement systems, if supported through appropriate IT platforms and appropriately implemented and used with senior management commitment, will improve the identification of business strengths and weaknesses, pro-active decision making, continuous improvement, the dissemination of information and knowledge, and the behaviour of people. The following factors significantly contribute to these outcomes when implementing performance measurement:

|Data accuracy |Drive from senior management |

|IT support |Commitment from senior management |

|Resistance from people |Launch of other projects during this period |

However, this conclusion is based on four case studies, with interviews conducted before implementation and six months after it. The authors, and the senior managers interviewed, expressed their confidence that the benefits enjoyed will become more significant as time elapses, i.e. one or two years after the implementation.

8. References

Bierbusse, P. and Siesfeld, T. (1998), "Measures that Matter", Journal of Strategic Performance Measurement, Vol. 1, No. 2. 6-11

Bititci U S and Carrie A S, (1998), Integrated Performance Measurement Systems: Structures and Relationships, EPSRC Final Research Report, Grant No. GR/K 48174, Swindon UK.

Bititci, U. S, Turner, T. and Begemann, C. (2000), “Dynamics of Performance Measurement Systems”, International Journal of Operations and Production Management, Vol. 20, No. 6, pp 692-704.

Bititci, U. S., Nudurupati, S. S., Turner, T. J. and Creighton, S. (2002a), “Web enabled Performance Measurement System: Management Implications”, International Journal of Operations and Production Management, Vol. 22, Issue 11, Pages 1273-1287.

Bititci, U. and Nudurupati, S. (2002b), “Using Performance Measurement to drive Continuous Improvement”, Manufacturing Engineer, Vol. 81, No. 5, Pages 230-235.

Bourne, M. and Wilcox, M. (1998), “Translating strategy into action”, Manufacturing Engineer. Vol. 77, Issue 3, Pages 109-112.

Bourne, M., Mills, J., Wilcox, M., Neely, A. and Platts, K. (2000a), “Designing, implementing and updating performance measurement systems”, International Journal of Operations and Production Management, Vol. 20, Issue. 7, Pages 754-771.

Bourne, M. and Neely, A. (2000b), “Why performance measurement interventions succeed and fail”, Proceedings of the 2nd International Conference on Performance Measurement, Cambridge UK, pp 165-173.

Coch L. and French J. (1948), “Overcoming resistance to change”, Human Relations, Vol. 1, No. 4 Pages 512-532.

Coughlan, P. and Coghlan, D. (2002), "Action research for operations management", International Journal of Operations and Production Management, MCB University Press Ltd., Vol. 22, No. 2, Pages 220-240.

Cross, K. F. and Lynch, R. L. (1988-1989), "The SMART way to define and sustain success", National Productivity Review, Vol. 9, No. 1.

Davenport, T. H. (1997), Information Ecology, Oxford University Press, US.

Dixon, J. R., Nanni, A. J. and Vollmann, T. E. (1990), The New Performance Challenge: Measuring Operations for World Class Competition, Dow Jones-Irwin, Homewood, IL.

Eccles, R. G. (1991), "The Performance Measurement Manifesto", Harvard Business Review, January-February, Pages 131-137.

Fitzgerald L., Johnston R., Brignall T. J., Silvestro R. and Voss C. (1991), Performance Measurement in Service Industries, CIMA, London.

Ghalayini, A. M. and Noble, J. S. (1996), “The changing basis of performance measurement”, International Journal of Operations and Production Management, Vol. 16, No. 8, Pages 63-80.

Holloway, J. (2001), “Investigating the impact of performance measurement”, Int. J. Business Performance Management, Vol. 3, Nos. 2/3/4. Pages 167-180.

Hope J. and Fraser R. (1998), “Measuring Performance in the new Organisational Model” Management Accounting, June 1998, Pages 22-23.

Hudson, M., Bennet, J. P., Smart, A. and Bourne, M. (1999), "Performance measurement in planning and control in SMEs", in Global Production Management, edited by Mertins, K., Krause, O. and Schallock, B., Kluwer Academic Publishers, ISBN 0-7923-8605-1.

Kaplan R. S. and Norton D. P. (1992), “The Balanced Scorecard – Measures That Drive Performance”, Harvard Business Review, January-February 1992 issue.

Kaplan, R. S. and Norton, D. P. (1996), Translating Strategy into Action: The Balanced Scorecard, Harvard Business School Press, Boston, MA.

Keegan D. P., Eiler R. G. and Jones C. R. (1989), “Are your performance measures obsolete?”, Management Accounting, June 1989, Pages 45-50.

Kennerley, M. and Neely, A. (2003), "Measuring performance in a changing business environment", International Journal of Operations and Production Management, Vol. 23, No. 2, Pages 213-229.

Marchand, D., Kettinger, W. and Rollins, J. (2000) “Company Performance and IM: the view from the top” – Mastering Information Management edited by Marchand D., Davenport T. and Dickson T., Financial Times, Prentice Hall, London. Pages 10-16.

Marr, B. and Neely, A. (2002) Balanced Scorecard Software Report, published by InfoEdge, Two Stamford Landing, Stamford, Connecticut 06902-7649, USA.

Meekings, A. (1995). "Unlocking the Potential of Performance Measurement: A Practical Implementation Guide", Public Money & Management, Issue October-December, Pages 5-12.

Neely, A., Mills, J., Gregory, M., Richards, H., Platts, K. and Bourne, M. (1996), Getting the measure of your business, University of Cambridge, Manufacturing Engineering Group, Mill Lane, Cambridge.

Neely A. (1999), “The performance measurement revolution: why now and what next?”, International Journal of Operations and Production Management, Vol. 19, No. 2. Pages 205-228.

Neely A. and Adams C. (2001), “The Performance Prism Perspective”, Journal of Cost Management, January/February 2001.

Neely, A. (2002), "Research Priorities", Centre for Business Performance, Cranfield University.

Nudurupati, S. S. and Bititci, U. S. (2001) “Review of Performance Management Information Systems (PerforMIS)”, Internal Report, Centre for Strategic Manufacturing, DMEM, University of Strathclyde, Glasgow, UK.

Nudurupati S. S., Bititci U. S., and Maddocks S. (2002) “Web Enabled Performance Measurement Systems”, Conference Proceedings – PMA Conference 2002, The World Trade Center, Boston, USA. 17-19 July 2002.

Prahalad, C. K. and Krishnan, M. S. (2002), “The Dynamic Synchronisation of Strategy and Information Technology”, MIT Sloan Management Review, Summer 2002. Pages 24-33.

Schneiderman, A. M. (1999) "Why Balanced Scorecards Fail", Journal of Strategic Performance Measurement, (1999) Special Edition, Pages 6 – 11.
