Quality Improvement Assignment Help
Applications of theories of quality management and tools for quality improvement to the anti-social behaviour management service of a Registered Social Landlord
- Why Connect Should be Concerned about Quality – the Strategic Imperative
- The Costs of Failing to Provide Quality Services
- Difficulties faced by Services in Delivering Quality
- The Measurement of Quality
- Appendix One – Harassment, Racial Harassment and Anti-Social Behaviour Service Standards
Connect Housing is an RSL that provides housing and housing management services to over 3,000 households across West Yorkshire. The majority of services are paid for from rental income and customers do not usually pay again to access a particular service.
The anti-social behaviour (ASB) management service is one of those services. Staffed by three Housing Officers and one Senior Housing Officer, it receives and manages reports of ASB from customers and anyone else living, working in or visiting the areas in which Connect operates. The ASB service is not open to competition (people cannot choose to use the service of another RSL), though customers may choose not to use it at all or to use an agency like the Police for certain cases. The service deals with around 300 new reports each year, ranging from noise nuisance to incidents of violence, drug dealing etc.
Though service standards are in place (appendix one), these focus on:
- definitions of types of ASB
- Connect’s approach to the management of ASB
- the speed with which the association should respond to certain categories of report.
The standards include little information that would enable a customer to assess service quality. Though levels of customer satisfaction are measured, Connect has not asked customers what is important to them in the service and does not benchmark it against other organisations. Service improvement is ad hoc and unfocussed.
This paper outlines how certain theories of quality management and particular tools for quality improvement apply to the ASB service and discusses reasons why application of some aspects of theory may be difficult. It also makes recommendations about how service quality may be defined and measured in future and about new approaches to service improvement.
Slack et al note “There is no clear or agreed definition of what quality means.” (2007, p 538). Compounding that lack of clarity is the fact that “Much of the theory of quality comes from work with processes which are by their nature reproduced many times -” (Brown et al, 2000, p 216). The ASB service, called upon in 300+ cases in the last 12 months, is effectively a different service every time it is provided because:
- It is provided by different staff, likely to provide inconsistent levels of service no matter what training is provided.
- It is provided to different customers, interacting with the service (and with other customers) in different ways, regardless of their obligations under the terms of their tenancy agreements.
- Each case is unique – even noise nuisance cases require different types of management, dependent on factors like the type of property in which they occur (flat, house etc), the nature of the noise nuisance (vehicles, domestic disputes etc), the involvement of other agencies (local authority noise teams, Police etc) and so on.
The degree of customisation of the service affects the extent to which quality management theories, developed for use in processes that are repeated many times, can be appropriately applied to the service.
Quality gurus like Feigenbaum extended principles developed in production settings to services operations and developed definitions of quality such as the following:
“The total composite product and service characteristics…through which the product or service in use will meet the expectations of the customer”. (Feigenbaum, 1983 in Brown et al, 2000, p 194).
Such definitions are difficult to apply to services such as the ASB service. Dotchin and Oakland (1994, p 14) note “a service package constitutes elements which are different from each other, are difficult to measure objectively and for which the consumer may use completely different methods of assessment”. Garvin (1988, in Brown et al, 2000, p 194) raises two significant difficulties of relevance to the ASB service:
- Customers may have very different perceptions of quality
- It may be difficult to identify the key attributes that connote quality
Such confusion about the definition of quality affects how RSLs measure service quality. Williams et al (1999, p 367) note that measuring service quality in the housing sector is done largely through use of hard, easily quantifiable data. That approach to the measurement of quality is reflected in the ASB service standards, in which the only measurable attributes of the service are defined as:
- speed with which staff will respond to reports
- fact that Connect will accept hate incident reports
- use of methods to resolve disputes, including use of legal action where appropriate
Slack et al (2007, p 541) provide a definition of quality that may be helpfully applied to the ASB service, that being:
“Quality can be defined as the degree of fit between customers’ expectations and customer perception of the product or service.”
The diagram below outlines the implications of such expectation / perception gaps.
|Customers’ expectations exceed their perceptions of the service||Customers’ perceptions of the service exceed their expectations|
|Perceived quality is poor||Perceived quality is good|
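The gap model can be expressed as a simple calculation. The sketch below uses illustrative 0–10 scores, not Connect data:

```python
def perceived_quality(expectation: float, perception: float) -> str:
    """Classify perceived quality from the expectation / perception gap
    (perception minus expectation), following the gap model above."""
    gap = perception - expectation
    if gap > 0:
        return "good"        # perceptions exceed expectations
    if gap < 0:
        return "poor"        # expectations exceed perceptions
    return "expectations met"  # no gap

# Illustrative scores only (not Connect survey data)
print(perceived_quality(expectation=8.0, perception=6.5))  # poor
print(perceived_quality(expectation=6.0, perception=7.5))  # good
```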
Though Buttle (1996, p 10) noted that there is little evidence that customers assess service quality in terms of expectation / perception gaps, Connect recognises the importance of the extent to which services meet customers’ expectations. The association incorporated questions designed to assess this into its last satisfaction survey (a survey of a range of Connect services) but it is likely that the association asked the wrong question! As seen below, the survey measured the degree of fit between importance to customers and satisfaction with service.
|Service aspect||Importance||Satisfaction|
|General condition of the inside of your home||9.1||7.9|
|General upkeep of the outside of your home||8.8||7.3|
|Safety and security of your home||9.2||7.5|
|The general surroundings where you live||8.8||7.4|
|Overall quality of repair work||8.9||8.1|
|Appointment being offered at a convenient time||8.9||8.2|
|The attitude of workmen/women||9.0||8.8|
|Repairs completed in good time||8.9||8.1|
|Repairs being completed on first visit||8.6||7.8|
|How clean and tidy your home was left||9.0||8.6|
|Helpfulness of Connect staff||9.0||8.5|
|Staff respecting you as an individual||9.1||8.5|
|Ability to listen to your views||9.0||8.3|
|The tenant newsletter – Get Connected||7.5||8.1|
|Other information provided by Connect (letters, leaflets, etc.)||7.9||8.1|
|Help given to resolve disputes with neighbours||8.1||7.3|
|Effective handling of other problems & complaints||8.5||7.3|
|The range of ways to pay your rent||8.6||9.0|
|The value you get for your rent money||8.8||7.9|
(Leadership Factor, 2007)
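The importance / satisfaction gaps in the table can be computed directly. The sketch below takes a subset of the survey rows (scores as published above) and ranks attributes by the size of the gap:

```python
# Importance and satisfaction scores from Connect's 2007 survey
# (Leadership Factor, 2007); a subset of rows for illustration.
survey = {
    "Help given to resolve disputes with neighbours": (8.1, 7.3),
    "Effective handling of other problems & complaints": (8.5, 7.3),
    "Safety and security of your home": (9.2, 7.5),
    "The range of ways to pay your rent": (8.6, 9.0),
}

# Rank attributes by the importance-minus-satisfaction gap:
# the larger the gap, the stronger the candidate for improvement.
gaps = sorted(
    ((name, round(importance - satisfaction, 1))
     for name, (importance, satisfaction) in survey.items()),
    key=lambda item: item[1],
    reverse=True,
)
for name, gap in gaps:
    print(f"{gap:+.1f}  {name}")
```

On these figures the ASB-related attribute ("Help given to resolve disputes with neighbours") shows a gap of 0.8, smaller than the gaps for safety and complaint handling, which illustrates why a gap ranking, rather than satisfaction scores alone, should inform improvement priorities.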
In relation to the ASB service, the table indicates a clear gap between views on importance and satisfaction with the service. However, customers may view a service as important yet still expect that it will not be of high quality. Buttle (1996, p 21) noted that customers’ expectations may be low because of previous experience with the service provider and that meeting these low expectations will result in no expectation / perception gap. An importance / satisfaction gap therefore does not necessarily indicate that customers view the service as performing poorly, and Connect should ensure that it understands the expectations of customers by explicitly asking about them in future surveys.
Why Connect Should be Concerned about Quality – the Strategic Imperative.
Providing quality services has not always been an imperative for RSLs but has become so in recent years (in theory at least) for most services. RSLs are inspected by the Audit Commission, which assesses the quality of services and the prospects for improvement in their delivery. Poorly rated RSLs face the prospect of supervision by the lead regulator (the Housing Corporation), with an associated loss of control over many functions.
Brown et al (2000, p 195) suggest two other possible reasons why service quality has become strategic:
- The number and capabilities of new entrants into many markets
- The amount of choice that customers now have as a result of that increased competition
Though those factors do have an impact on some of the services Connect provides (particularly supported housing services, subject to external competition since 2003), they do not yet affect the ASB service. Customers may choose not to use the service (possibly using agencies like the Police) but such choices would not adversely affect the association’s revenue. Customers experiencing ASB have little choice about use of the service and this lack of competition may be one reason why Connect has not focussed strongly on defining service quality, in asking customers for views on that matter or in continuous service improvement.
Longenecker and Scazzero (2000, p 227) provide the counter-argument that service organisations have lagged behind manufacturing in developing improvement systems and give a number of reasons for that lag, the most pertinent of which to the ASB service are:
- Lack of competition
- Difficulties in establishing and measuring quality.
However, though the lack of competition may inhibit the desire to improve the service, the prospect of a poor AC inspection rating should provide a similar incentive.
The Costs of Failing to Provide Quality Services
Deming comments “Defects are not free. Somebody makes them and gets paid for making them.” (1982, in Sower et al, 1995, p 235). Within the ASB service, not only do defects in service create additional costs for Connect but putting those defects right adds further cost to the operation.
The association does not fully measure the quality costs within the ASB service and it may be appropriate to begin to do so, using the Total Quality Management categories of prevention, appraisal, internal failure and external failure costs (Slack et al, 2007, p 658). Understanding the full costs of service failure is likely to be a significant incentive to improving service quality. Aspects of the service that may relate to each of the TQM quality costs categories are summarised below.
|Category||Service aspect to be costed|
|Prevention||Youth diversion activities|
|Appraisal||Case reviews by line managers; monitoring performance (currently response times and satisfaction surveys only)|
|Internal failure||Cost of revising and reissuing incorrect court applications, witness statements etc; staff time involved in the above|
|External failure||Customer complaints about the service (including complaints going to the Ombudsman); repairs costs arising from ASB; rent loss from homes left by customers leaving because of ASB; re-let costs related to the above|
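The categorisation above lends itself to a simple costing exercise once the cost data are collected. The sketch below is a minimal illustration using invented placeholder figures (Connect does not currently measure these costs):

```python
# Totting up quality costs by TQM category, using the categories from
# the table above. All figures are invented placeholders, not Connect's
# actual costs.
from collections import defaultdict

cost_items = [
    ("prevention", "Youth diversion activities", 5000.0),
    ("appraisal", "Case reviews by line managers", 12000.0),
    ("internal failure", "Reissuing incorrect court applications", 3000.0),
    ("external failure", "Repairs arising from ASB", 8000.0),
    ("external failure", "Rent loss from homes vacated due to ASB", 6000.0),
]

totals = defaultdict(float)
for category, _description, cost in cost_items:
    totals[category] += cost

# Failure costs (internal + external) are the strongest incentive to
# invest in prevention and appraisal.
failure = totals["internal failure"] + totals["external failure"]
print(f"Failure costs: £{failure:,.0f} of £{sum(totals.values()):,.0f} total")
```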
Difficulties faced by Services in Delivering Quality
Haywood-Farmer (1988, pp 20 – 21) notes that service organisations face inherent difficulties in delivering quality, all of which are relevant to the ASB service:
- The intangible nature of services and that, as the service cannot be stored, a final quality check before provision is often impossible.
- The heterogeneous nature of services – what customers value at one stage of service provision (when first making a report) may be different at another stage (when going to court as a witness).
- The involvement of customers in the service production and the lack of control the service has over what is effectively a new production worker. Staff can advise the customer about the appropriate ways to resolve cases but cannot insist that advice is followed.
- The need to view staff and facilities as marketing tools, given how visible they are to the customer, the level of interaction between them and the customer and the fact that these tangible service elements will be some of the few cues the customer can use to assess service quality.
He notes that these considerations mean that managing services is significantly different from managing manufacturing and that, for services managers, the quality control targets are “camouflaged, fuzzy and moving” (p 21).
Wisniewski and Donnelly (1996, p 359) provide a model (overleaf) of the factors that may affect the extent to which a perceived service matches expectations of that service (items in italics relate to applications of the model to the ASB service):
Whilst Connect has some control over some of the factors in the first row of influences and over the dimensions of service quality, it has little or no control over others.
Wisniewski and Donnelly (1996, p 364) also note that services may have to balance conflicting expectations from a range of customers and stakeholders. This will inevitably mean that the provider fails to meet some customers’ expectations and the ASB service faces this prospect many times each year. A customer may expect that Connect will take legal action against their neighbour who has repeatedly caused noise nuisance. Staff may have no intention of doing so because of their understanding of the perpetrator’s mental health needs and the impact these have upon their capacity to understand the consequences of their actions. Failure to take the action the first customer expects will result in a perception of a poor quality service. Taking that action will result in the same perception from the Social Worker of the perpetrator.
Hill (1993, p 117) notes difficulties that service managers should take into account, including:
- Standards of service need to take account of providers’ and customers’ perceptions of quality. (Connect’s service standard takes no account of customer perceptions).
- Controlling quality during the process must avoid interfering with service provision. (Regular checks on whether staff are following procedures or providing appropriate services will reduce the time to work on cases and will be detrimental to service provision).
The Measurement of Quality
As noted in section 2, Connect has poorly defined quality standards and this is likely to have a negative impact on prospects for service improvement. It may be helpful for the ASB service to apply the notion that separating out aspects of the service package assists in the establishment of specifications and standards and facilitates control of service operations (Hill 1993, p 117).
One way of doing so would be to define key aspects of the service in ways that reflect Garvin’s 8 dimensions of quality (1987, in Sower et al 1995, pp 325 – 328). The table below maps significant aspects of the service against each of those dimensions. Specifying service aspects in a revised service standard is likely to help customers form their judgements on service quality.
|Garvin’s dimension||Aspect of ASB service|
|Performance||Speed of response to reports; length of time to satisfactorily close case|
|Features||Witness support, youth diversion schemes, security improvements|
|Reliability||Consistency of speed of response; frequency of feedback to customers|
|Conformance||Accuracy of case recording, adherence to procedures on reporting back to original complainant etc|
|Durability||Length of time case remains resolved (i.e. does not have to be re-opened)|
|Aesthetics||Politeness of staff; empathy of staff; quality of monitoring equipment provided etc|
|Perceived Quality||Customer satisfaction levels|
If Connect is to establish standards as outlined in section 6, it must also review its approach to quality control. Seddon comments that “Quality by inspection is not quality” (1997, p 164), echoing the views of other management gurus that quality must be designed in and cannot be inspected in. This is unfortunate (to say the very least) for Connect, since the association’s approach to quality control within the ASB service is to inspect absolutely everything! Cases are reviewed by a line manager:
- every four weeks
- prior to expensive parts of the process (applications for Court hearings etc)
- at case closure.
This is a considerable task when one remembers that the service dealt with over 300 new cases in the last financial year.
Whilst Beaumont et al (1997, p 826) suggest that 100% inspection may be appropriate for less automated and less consistent services, Slack et al (2007, p 549) point out a number of reasons why checking everything may not be appropriate. These include the cost of doing so, the fact that staff become tired (so fail to spot errors) and the possibility that staff may not understand what to check for (more likely in a service with poorly defined quality standards). Add to that the following factors:
- the possibility that, by the time the check has been made, the service has failed (and so the check is simply recognising this failure rather than preventing it from impacting on the customer)
- judgements about some aspects of quality are necessarily subjective (such as whether the outcome of a case met the needs of a whole estate or just an individual tenant)
- there are no measurements for some aspects of quality (noted by Slack et al, 2007, p 545) such as the empathy of staff, apart from the judgement of the customer (and by the time this is received, the service may have failed)
and it becomes obvious that a different approach to quality control of the service is required.
It may be appropriate for Connect to consider the use of statistical process control to identify cases that need inspection or times of the year when performance is falling outside acceptable limits. Current monitoring of response times and customer satisfaction could chart the service’s performance trends over time but additional measures would be required to identify specific cases that may need inspection. The most useful additional measures are likely to be the following:
- Length of time between contacts with the customer (long contact intervals suggest that customers are not receiving regular feedback on case progress and this is likely to result in dissatisfaction).
- Number of case notes being added to specific cases (few notes suggest little action on the case).
- Cost of managing each case (rising costs suggest either complicated cases that could result in expensive court action or an inefficient Housing Officer who spends too much time on cases).
Such measures would need to be built into case management software prior to measuring performance over a trial period and then establishing appropriate control limits. Once established, they are likely to ensure that line managers have clear prompts for inspection, thus reducing the time involved in that process now. Slack et al (2007, p 553) explain that use of the tool may also help to inform service improvement – steadily improving performance may require investigation into what is driving that improvement and the findings could help to improve performance in other parts of the organisation (an important consideration for Connect that provides services from a number of offices and does face difficulties in ensuring consistency of service).
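The control-limit approach can be sketched as follows, assuming the contact-interval measure suggested above and invented trial-period data:

```python
# A minimal SPC sketch: flag cases whose interval between customer
# contacts falls outside mean +/- 3 standard deviations established
# over a trial period. The intervals below are invented for
# illustration, not drawn from Connect's case data.
import statistics

trial_intervals = [7, 9, 8, 10, 6, 8, 7, 9, 8, 7]  # days between contacts

mean = statistics.mean(trial_intervals)
sd = statistics.stdev(trial_intervals)
upper_limit = mean + 3 * sd
lower_limit = mean - 3 * sd

def needs_inspection(interval_days: float) -> bool:
    """True when the interval falls outside the control limits,
    prompting a line-manager inspection of the case."""
    return not (lower_limit <= interval_days <= upper_limit)

print(needs_inspection(8))   # within limits: no inspection prompted
print(needs_inspection(30))  # far outside: inspect this case
```

In practice the limits would be recalculated periodically and the same charting approach applied to the case-note and cost measures.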
The association uses a number of tools to drive service improvement:
- systems thinking reviews (based on models developed by Toyota)
- learning from customer and stakeholder feedback
- customer care training
Use of the first two tools is sporadic and has focussed on services that have the capacity to lose significant amounts of money (the repairs service and the rent accounting services have been through systems thinking reviews in the last two years).
Drew (1997, p 427) has described benchmarking as a legal way of finding out “how others do something better than you do – so you can imitate – and perhaps improve upon – their techniques”. Notwithstanding concerns that it does not lead to best practice, only to the level of practice that other organisations have achieved (Slack et al, 2007, p 588), benchmarking the effectiveness of the ASB service may encourage service improvement and should be considered by Connect.
Slack et al provide examples of types of benchmarking that organisations may consider (2007, p 587) and the association does already use some of those types. Internal benchmarking is used in the comparison of Housing Officer performance, competitive and performance benchmarking are used in most services (with the exception of those subject to extensive external competition) and practice benchmarking is used occasionally (usually as part of a systems thinking review).
However, it is likely that Connect has not derived maximum benefit from the tool. Slack et al caution against using it as a “one off” project or as a tool that will provide solutions. They also counsel that resources need to be devoted to the activity, though these need not be expensive ones (2007, pp 587 – 588). Apart from ensuring that the association devotes more time and attention to use of the tool, one further profitable avenue of research may relate to the practice of non-competitive benchmarking. Haywood-Farmer’s three-dimensional classification model for the service sector (1988, p 25) is helpful in understanding which organisations the association could compare its performance with. Given the:
- high level of contact between customers and staff who deliver the ASB service
- level of labour intensity involved in provision of the service
- extent to which the service has to be customised to manage different types of incident and needs of customers
the ASB service may be seen as fitting into octant 8 of that model. As such, it may help Connect to consider benchmarking against advisory services (e.g. Citizen’s Advice Bureau) or design services (e.g. architects with whom the association works on the development of new housing).
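The classification can be sketched as a simple mapping, assuming each of the three dimensions is reduced to a high / low judgement; the octant numbering used here is an illustrative convention in which high scores on all three dimensions give octant 8:

```python
# Illustrative encoding of Haywood-Farmer's three-dimensional service
# classification. The binary-encoding-plus-one numbering is an assumed
# convention, chosen so that high/high/high maps to octant 8.
def octant(contact_high: bool, labour_intensity_high: bool,
           customisation_high: bool) -> int:
    """Map three high/low dimensions to an octant number 1-8."""
    return 1 + (contact_high << 2
                | labour_intensity_high << 1
                | customisation_high)

# The ASB service: high contact, high labour intensity, high customisation.
print(octant(True, True, True))  # 8
```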
A further area for consideration in relation to both service improvement and quality control is the application of techniques such as Poka-Yoke. Dyer (date unknown, quoted in Taguchi and Clausing, 1990, p 74) comments that the method builds the function of a checklist into an operation so “we can never forget what we have forgotten”. The idea could profitably be applied to the ASB service by formalising checklists in some of the following areas:
- questions customers should be asked (time of incident, who witnessed it, which Police Officer attended etc)
- whether particular parts of the procedure have been followed (have witnesses been interviewed, has photographic evidence been acquired etc)
- whether specific case management options have been considered (e.g. mediation, the implementation of a good neighbour agreement)
- how frequently customers have been contacted to give them updates on progress

Implementation of such checklists may:
- assist in reducing time taken to inspect cases (by providing a summary of actions taken and not taken that would alert a line manager to the need to look more closely at a specific case)
- help customers to judge the quality of the service provided (by showing them what they should have been asked or offered)
- improve the quality of the service (by preventing staff from forgetting to comply with specific parts of the management procedures).
Connect should, though, beware the temptation to view such a tool as it is commonly branded – it will not “mistake-proof” the service. Shigeo Shingo, who created the technique, made a clear distinction between mistakes and defects (Fisher, 1999, p 264), regarding the former as inevitable but the latter as avoidable.
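Such a checklist could be sketched as follows; the field names are hypothetical, and the check surfaces forgotten steps rather than guaranteeing a defect-free case:

```python
# A minimal poka-yoke-style checklist for ASB case management. The step
# names are hypothetical illustrations of the areas listed above.
REQUIRED_STEPS = [
    "incident_time_recorded",
    "witnesses_interviewed",
    "photographic_evidence_considered",
    "mediation_considered",
    "complainant_updated_recently",
]

def missing_steps(case: dict) -> list:
    """Return the checklist items not yet completed for a case,
    giving a line manager a clear prompt for closer inspection."""
    return [step for step in REQUIRED_STEPS if not case.get(step)]

case = {"incident_time_recorded": True, "witnesses_interviewed": True}
print(missing_steps(case))  # three procedural steps remain outstanding
```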
Whilst it is clear that there is little competitive imperative to define, measure and improve the ASB service quality, the regulatory and financial impacts of failing to do so act as a similar incentive. Consequently, Connect should take the following steps (as a minimum):
- Measure the gap between customer expectations and perceptions of the service
- Assess the costs of service provision and service failure according to TQM principles
- Establish clear specifications and standards for the service, against which staff and customers could measure it
- Introduce use of SPC to monitor performance and inform judgements about which cases need to be inspected and when
- Introduce non-competitive benchmarking to drive service improvement
- Introduce use of Poka-Yoke mechanisms to minimise the possibility of defective performance and to further inform customers about the standards of quality that they should expect.
Though such measures will not ensure the provision of a quality service, they will:
- facilitate understanding of customer expectations
- inform customers’ judgements about service quality
- reduce quality control costs
- drive service improvement
- help to prevent service failures.
Beaumont, N B, Sohal, A S and Terziovski, M – (1997) – “Comparing quality management practices in the Australian service and manufacturing industries.” – International Journal of Quality and Reliability Management 14 (8) – pp 814 – 833.
Brown, S, Lamming, R, Bessant, J and Jones, P – (2000) – Strategic Operations Management – Oxford: Butterworth-Heinemann.
Buttle, F – (1996) – “SERVQUAL: review, critique, research agenda.” – European Journal of Marketing 30 (1) – pp 8 – 32.
Dotchin, J A and Oakland, J S – (1994) – “Total Quality Management in Services. Part 3: Distinguishing Perceptions of Service Quality.” – International Journal of Quality and Reliability Management 11 (4) – pp 6 – 28.
Drew, S A W (1997) – “From Knowledge to Action: the Impact of Benchmarking on Organizational Performance.” – Long Range Planning 30 (3) – pp 427 – 441.
Fisher, M – (1999) – “Process improvement by poka-yoke.” – Work Study 48 (7) – pp 264 – 266.
Haywood-Farmer, J – (1988) – “A Conceptual Model of Service Quality.” – International Journal of Operations and Production Management 8 (6) – pp 19 – 29.
Hill, T – (1993) – The Essence of Operations Management – Hemel Hempstead: Prentice Hall.
Leadership Factor – (2007) – Customer Satisfaction Survey 2007 – Unpublished survey, Connect Housing, Leeds.
Longenecker, C O and Scazzero, J A – (2000) – “Improving service quality: a tale of two operations.” – Managing Service Quality 10 (4) – pp 227 – 232.
Seddon, J – (1997) – “Ten arguments against ISO 9000” – Managing Service Quality 7 (4) – pp 162 – 168.
Slack, N, Chambers, S and Johnston, R – (2007) – Operations Management (fifth edition) – Harlow: Prentice Hall.
Sower, V E, Motwani, J and Savoie, M J – (1995) – Classic Readings in Operations Management – Orlando: The Dryden Press.
Taguchi, G and Clausing, D – (1990) – “Robust Quality” – Harvard Business Review 90114 – pp 65 – 75.
Williams, C S, Saunders, M N K and Staughton, R V W – (1999) – “Understanding service quality in the new public sector: An exploration of relationships in the process of funding social housing.” – The International Journal of Public Sector Management 12 (4) – pp 366 – 379.
Wisniewski, M and Donnelly, M – (1996) – “Measuring service quality in the public sector: the potential for SERVQUAL.” – Total Quality Management 7 (4) – pp 357 – 365.
Harassment, Racial Harassment and Anti-Social Behaviour Service Standards.