Customer Involvement and Satisfaction

SECTION 3.1

Practically every element in this program depends on extensive "customer involvement." Customers drive the selection of Business Process Innovation Projects, as outlined below, and customer input is utilized at the front end of every process improvement initiative. Customers are heavily involved in pilot projects, such as those summarized throughout this booklet, and "customer feedback" travels in both directions -- to and from customers of A&BS services -- especially during the analytical and redesign phases of process improvement/innovation. Newly implemented process improvements and current services are evaluated in terms of customer satisfaction outcomes. Many "key deliverables" (services) provided by A&BS units are accompanied by customer satisfaction instruments -- especially point-of-service surveys, designed according to a checklist included later in this section.

The Valuable Role of Academic Administrative Officers

A&BS utilizes two groups of administrators -- Campus Academic Business Officers (CABOs) and Health Sciences Administrators (HSAs) -- as "Customer Panels." These groups are themselves direct customers of, or function as customer-surrogates for, essentially all services delivered campuswide. As customer-surrogates, these department administrators coordinate and track central services provided for faculty and others in their organizations, including trouble-shooting problems that materialize and negotiating when intervention is needed. They provide a coherent, informed view of customer expectations and customer satisfaction. These departmental customer panels, which include more than forty people, provide reliable and comprehensive feedback -- the response rate is high, evaluations are well informed and "calibrated" against a consistent level of expectations across various functions and services, and the groups' compositions are fairly consistent from year to year.

Department Administrators' Role in BPI Projects

In determining priorities for BPI (Business Process Innovation) projects -- cross-functional "re-engineering" projects with broad campus impact (a term used sparingly in this program, since it is not accurate when applied to processes that were never "engineered" in the first instance) -- the CABOs and HSAs participated in a multi-phase process:

  • Interviews and meetings produced a list of over twenty BPI project "candidates."
  • CABOs and HSAs completed a matrix survey that scored each project "candidate," on a 0-10 scale, against sixteen scoring criteria:

1. Expensive process with substantial apparent or suspected waste, including pieces/steps with "no value added"

2. Process generates low "customer satisfaction"

3. Process regarded by knowledgeable users as "broken"

4. Process strategically linked to overall performance of the institution

5. Process affects other key business processes that touch much of the organization

6. Process feasible to "re-engineer" -- not politically impossible, unbounded, or externally limited. (In other words, complex enough to be worth redesigning, yet simple enough for "re-engineering" to succeed.)

7. Improvement in this process can leverage other process improvements

8. Process produces or involves large quantities of defects, rework, redundancy, checking, rechecking

9. Process has muddy objectives and/or appears undefined or out-of-control

10. Process characterized by unanswerable performance questions

11. Process administration valued as an end unto itself, more than process performance

12. Exceptions and special cases contribute greater than 20 percent of workload

13. Process under widespread pressure to change

14. Process shadowed by a companion process that compensates for its quality gaps

15. Extensive, fragmented, and redundant information exchange with other business systems, processes, and "customers"

16. Process contains multiple check-points and approvals.

Priority was scored based on the following "formula":

  • Priority score is the product of four factor-classes:
    1. potential value (or savings)
    2. dysfunctionality of the process
    3. feasibility of redesign
    4. typicality (of a "re-engineering" project)
  • Scoring criteria (listed above) were categorized into four factor groups:
    1. Criteria 1, 4, 5, and 7 pertain to potential value.
    2. Criteria 2, 3, and 8 pertain to perceived dysfunctionality of the process.
    3. Criterion 6 pertains to feasibility as a potential "re-engineering" project.
    4. Criteria 9-16 are typical of opportune candidates for "re-engineering."
  • Within each factor-group, criterion scores were summed and "normalized" to a 0-1 scale, producing four factors that could each range from zero to one:
    1. Potential value
    2. Dysfunctionality
    3. Feasibility
    4. Typicality (of a "re-engineering" project).
    These four factors were then multiplied to arrive at an overall priority score between zero and one.
  • This multiplicative scoring technique meant that if a "candidate" BPI project scored high on potential savings, dysfunctionality, and "typical of re-engineering" but low on feasibility, it would rank low. Conversely, if it scored high on the latter three factors but low on potential savings, it would likewise score low. This system reflects the logic a decision-maker would actually use in allocating resources or effort. By contrast, an additive scheme (scores summed rather than multiplied) could produce high scores for BPI candidates with low feasibility or low savings potential if all other scoring categories ranked high. (A brief worked sketch of this calculation follows this list.)
  • Despite the care exercised in developing and administering the scoring formula outlined above, the resulting priority rankings for the prospective Business Process Innovation projects were also reviewed on an intuitive basis -- to assure that they "made sense" -- with the Campus Academic Business Officers and the Health Science Administrators functioning as customer panels.
  • In order to determine whether our perceptions paralleled those of our customers, the same survey and scoring process were administered to A&BS managers. This effort produced similar rankings, with a few exceptions, which were resolved in favor of customer preferences.
  • A number of CABOs and HSAs have participated on Business Process Innovation teams, providing in-depth customer perspectives. (Team participants and their affiliations are summarized in section 4.1.) These projects have collected customer input through interviews, focus groups, and surveys, as well as through the direct experience of customer-participants.
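
To make the multiplicative logic concrete, the following sketch (in Python, using hypothetical scores rather than the actual survey tallies) shows how the four normalized factors combine into a single priority score, and why a low feasibility score pulls down an otherwise attractive candidate. The criterion-to-factor grouping and the 0-10 scale are taken from the description above.

  # Illustrative sketch only -- scores are hypothetical; the grouping and
  # 0-10 scale come from the scoring criteria described in the text.
  FACTOR_GROUPS = {
      "potential_value":  [1, 4, 5, 7],
      "dysfunctionality": [2, 3, 8],
      "feasibility":      [6],
      "typicality":       [9, 10, 11, 12, 13, 14, 15, 16],
  }

  def priority_score(criterion_scores):
      """criterion_scores maps criterion number (1-16) to a 0-10 rating.
      Returns the product of the four normalized (0-1) factors."""
      priority = 1.0
      for criteria in FACTOR_GROUPS.values():
          group_sum = sum(criterion_scores[c] for c in criteria)
          factor = group_sum / (10 * len(criteria))   # normalize group sum to 0-1
          priority *= factor                          # multiplicative, not additive
      return priority

  # Hypothetical candidate: strong on value, dysfunctionality, and typicality,
  # but weak on feasibility (criterion 6 scored 1 of 10).
  scores = {c: 8 for c in range(1, 17)}
  scores[6] = 1
  print(round(priority_score(scores), 3))   # 0.051 -- ranks low, as intended

An additive scheme applied to the same hypothetical scores would still give this candidate a fairly high total, which is precisely the distortion the multiplicative formula avoids.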

The Role of Customers in Pilot Projects

Many process improvements and innovations summarized throughout this booklet started out as pilot projects. In these projects, customer input is utilized in three ways:

  • Pilots are selected based on willingness of customers to participate in project design and "real time" evaluation, particularly during the startup phase.
  • A customer-based advisory group typically oversees the pilot project.
  • Evaluative feedback from the pilot unit and the advisory group is sought before the project is extended to other units.

Two-Way Feedback Needed for Customer Satisfaction

The "ZotMail/ZotFax" electronic communications system and UCI News are used to communicate with A&BS customers campuswide about both process improvement endeavors and resource-allocation decisions affecting A&BS services. Maximizing customer satisfaction involves communicating openly about budget-driven service priorities and service cutbacks; the extent to which cutbacks can be offset by productivity improvements or process streamlining; the degree to which service reductions are unavoidable; and realistic adjustments in service-expectations, especially as consolidated systems and simplified programs are introduced. Customer satisfaction must be evaluated in light of resource cutbacks and adjusted service standards. This view has been articulated by Chancellor Wilkening, and the Academic Senate Committee on Planning and Budget has acknowledged the importance of "downsizing the work" as we adjust to reduced budgets.

The criteria in the checklist below reveal a preference for point-of-service customer satisfaction measurement tools, for the reasons noted there. Additionally, our objective is to make feedback mechanisms as simple for the customer as possible.

Key Elements: Customer Satisfaction Tools
  • Point-of-service surveys require less than one minute to complete
  • Targeted at specific customers of "key deliverables"
  • Designed for high response-rate (rather than sample size)
  • Satisfaction questions preceded by stated service standards (quantified unless a standard such as "courtesy" is implicit)
  • Customers rate satisfaction with reference to service standards (not abstractly or subjectively)
  • Point-of-service instruments are preferred due to ease of administration, economy, high response-rate, immediacy of expression, and efficiency in targeting customers
  • A retrospective (vs. point-of-service) survey requires less than five minutes
  • A retrospective survey measures satisfaction in relation to stated standards, as well as the perceived importance of, and satisfaction with, the service standards themselves
  • A retrospective survey should focus on fewer than six key deliverables
  • Both types of survey instruments should be pre-tested to ensure ease of error-free completion

Customer Satisfaction Data

Accounting and Fiscal Services

Cost Accounting Standards Workshop Performance Evaluation

Date: Ongoing

Survey Focus: Designed as a "one-minute" evaluation of the CAS workshop, the Program Evaluation is utilized to elicit attendees' views immediately after each workshop. Feedback from workshop attendees:

  1. measures customer satisfaction
  2. evaluates presenters
  3. provides suggestions for improving course materials
  4. provides suggestions for enhancing presentation of materials
  5. suggests topics for future workshops

Survey Instrument: Questionnaire

Sample Size: 780

Response rate: 99%

Outcomes: Survey feedback is a valuable tool for immediately improving course materials and their presentation, and for identifying future training needs of our customers. Based upon customer feedback, we have offered additional workshops pertaining to Contracts and Grants Accounting.

Administrative Computing Services

Applications Access Survey

Date: Summer 1998

Survey Focus: Determine satisfaction with the administrative process used to access secure application systems.

Survey Instrument: Web questionnaire

Sample Size: 93

Response rate: 33%

Outcomes: The application access process is being redesigned to simplify it for end users.

Course Evaluation Survey

Date: Ongoing

Survey Focus: Improvement of Training Class

Survey Instrument: Questionnaire

Sample Size: 9 per class, with about 4 classes per week

Response rate: 100%

Outcomes: More courses are being scheduled. On a scale of 1-10, the average score was 8.5 or greater.

Distribution and Document Management

Official Policies on the Web

Date: Two focus group sessions, September 23, 1998

Survey Focus: Evaluate effectiveness of Web interface for Official University Policies and Procedures

Survey Instrument: Hands-on exercise with follow-up questionnaire and group discussion.

Sample Size: 16 focus group participants

Response rate: 100%

Outcomes: Improvement opportunities identified, and revisions to design were implemented. Decision made to offer an introductory workshop.

Goods and Storehouse Receiving Survey

Date: November 1998

Survey Focus: Evaluate customer satisfaction with delivery of Storehouse orders and purchased goods.

Survey Instrument: Point-of-service questionnaire.

Sample Size: 150

Response rate: 43%

Outcomes: Although results were overwhelmingly positive, comments indicate delivery of Storehouse orders may be a problem. An addition to the tracking system is being developed to capture the time from receipt of a Storehouse order to its delivery, in order to identify whether delays occur on the receiving dock or in the Storehouse order-filling process.

Environmental Health & Safety

EH&S Customer Service Evaluation

Date: Ongoing

Survey Focus: To assess degree of satisfaction with key deliverable services

Survey Instrument: Questionnaire and telephone

Sample Size: 4900

Response rate: Approximately 70%

Outcomes: Identified customer needs in the areas of the EH&S website and inspections. Feedback from customers resulted in website improvement (including posting of training schedule with sign-up by e-mail), coordinating directly with Facilities Management on facility-related problems identified during inspections (rather than expecting the customer to do so), and prioritizing for the customer the action items identified during an inspection.

EH&S Customer Satisfaction Survey

Date: May 1998

Survey Focus: To obtain customer feedback on awareness and basic knowledge of services provided, needs and expectations, service attributes, and satisfaction with key deliverables. Conducted as part of EH&S participation in the UC Partnership for Performance program.

Survey Instrument: Questionnaire

Sample Size: 103

Response rate: 72%

Outcomes: Discovered a lack of awareness regarding departmental Injury and Illness Prevention Programs (IIPPs), how to access an MSDS (Material Safety Data Sheet), and what to do when regulatory agency inspectors visit. EH&S training and consultation programs are now giving priority to addressing these issues and enhancing other program elements identified on the survey.

Facilities Management

Rental Vehicle Survey

Department: Fleet Services

Date: Ongoing

Survey Focus: To obtain feedback from our daily customers on the level and quality of services provided and on whether we are meeting their needs and expectations

Survey Instrument: Questionnaire tag that hangs in each vehicle

Sample Size: Monthly average is approximately 25

Response rate: Approximately 90%

Outcomes: We use the feedback to simplify the checkout procedure for the customer. Complaints about the condition of a vehicle upon delivery are followed up both with staff and with a phone call to the customer to remediate any problems.

Customer Satisfaction Survey

Date: April 1998

Survey Focus: To determine, from the point of view of the facility managers, how Facilities is doing in the areas of quality, responsiveness, and ease of doing business with the department.

Survey Instrument: Telephone survey

Sample Size: 25

Response rate: 88%

Outcomes: Opportunities for improvement were identified and discussed at staff meetings. Individual units evaluated the customer feedback and are implementing action items to address the issues identified.

Service Completion Survey

Department: Building Services – Labor Pool

Date: Ongoing

Survey Focus: To obtain feedback from customers as to the timeliness, worker courtesy, and level of customer satisfaction with services provided

Survey Instrument: Point-of-service questionnaire

Sample Size: 60

Response rate: 90%

Outcomes: Feedback is reviewed and discussed at monthly departmental meetings.

Facilities Management Employee Survey

Date: Annually

Survey Focus: To determine the level of supervisory skill in each of the major work groups in the department

Survey Instrument: Questionnaire

Sample Size: 170

Response rate: 100%

Outcomes: Feedback is reviewed, discussed, and published. Action plans to remediate deficiencies are formulated and implemented.

Service Completion Survey

Department: Building Services – Custodial

Date: Ongoing

Survey Focus: To obtain feedback from customers as to the timeliness, quality, and level of customer satisfaction with services provided

Survey Instrument: Point-of-service questionnaire

Sample Size: 50

Response rate: 95%

Outcomes: Feedback is used in performance reviews with the contractor.

Human Resources

How Are We Doing?

Date: Ongoing

Survey Focus: Point-of-service survey, distributed randomly to customers after they receive primary HR services.

Survey Instrument: Questionnaire. Telephone surveys also conducted to augment feedback received via questionnaires.

Sample Size: Approximately 250 surveys were sent out during the course of the year.

Response Rate: 32%

Outcome: A number of useful comments and suggestions surfaced. As a result, the following steps have been completed or are underway: (1) a back-up plan has been established for each HR unit and cross-training has occurred for key functions; (2) guidelines have been established for making effective, customer-friendly use of voice mail; (3) a full review of HR items and linkages on the web has been completed and appropriate adjustments made; (4) the HR website is being redesigned to make it more user-friendly; and (5) a web-based matrix is being developed to include the appropriate policy or labor-contract reference for each personnel issue.

Materiel & Risk Management

Customer Satisfaction Survey

Department: Copy Centers

Date: July 1998

Survey Focus: To assess customer satisfaction with Copy Centers.

Survey Instrument: Questionnaire

Sample Size: 140

Response rate: 45%

Outcomes: There was a high level of satisfaction with the service provided by the department; given this high rating, no changes are being made as a result of the survey.

UCI Interior Design Services Survey

Department: Interior Design

Date: July – December 1998

Survey Focus: To assess customer satisfaction.

Survey Instrument: Questionnaire

Sample Size: 18

Response rate: 38%

Outcomes: Overall, customers were satisfied to very satisfied. The department has improved the level and increased the frequency of feedback to customers.