
Quality Can Be Designed into the Production Process

Quality means meeting operational and customer expectations; quality design means meeting those expectations without massive error-correction systems.

Quality is the responsibility of every participant in the workplace, from clerks to managers, from operations to systems. It is achieved by teaching employees their role in the organization.

Quality does not evolve from planning. It is instead a concept that must be accepted as the basis for planning. It requires creativity, risk taking, and initiative.

The Key Elements

The potential quality of a process depends on its original design and on its enhancement and maintenance over time. The design, workmanship, materials, tools, and time invested determine the finished product's quality.

Take the case of a bank that has a check processing "factory." It is designed to handle large volumes of items and dollars efficiently in a short period. The check processing design allows an "acceptable" amount of error, where "acceptable" is determined by error repair costs and complaints from customers.

Check operations directly affect every one of the bank's key transactions and financial systems, and indirectly determine sales, customer retention, and profitability.

Expected Errors

A large check operation supporting one million transaction accounts can expect to process an average of 1.5 million items a day. The check operation itself will generate 15,000 rejects, 500 or more out-of-balance conditions, 200 to 300 adjustments, and 50 to 100 posting corrections daily. In addition, customers can be expected to generate two to three error inquiries per week per 10,000 accounts, or about 50 per day.

This amounts to an average of 850 to 1,000 errors per day, excluding interest float and other accounting adjustments. Errors at that rate are so routine that they become effectively invisible to check managers, even where quality is considered important, poor system designs are rapidly repaired, and the technology is advanced.
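As a rough check on these figures, the daily error estimate can be reproduced with simple arithmetic. The sketch below uses only the volumes quoted above; the five-day processing week for the inquiry figure and the exclusion of the 15,000 machine rejects from the error total are assumptions made for the sketch, not statements from the article.

```python
# Rough arithmetic check of the daily error estimate for a check operation
# supporting one million transaction accounts.  Volumes are quoted from the
# article; the five-day week and the exclusion of the 15,000 machine rejects
# from the error total are assumptions made for this sketch.

accounts = 1_000_000

out_of_balance = 500              # "500 or more out-of-balance conditions"
adjustments = (200, 300)          # "200 to 300 adjustments"
posting_corrections = (50, 100)   # "50 to 100 posting corrections"

# Customer inquiries: two to three per week per 10,000 accounts,
# spread over an assumed five-day processing week.
inquiries_low = 2 * (accounts / 10_000) / 5
inquiries_high = 3 * (accounts / 10_000) / 5

errors_low = out_of_balance + adjustments[0] + posting_corrections[0] + inquiries_low
errors_high = out_of_balance + adjustments[1] + posting_corrections[1] + inquiries_high

print(f"customer inquiries per day: {inquiries_low:.0f} to {inquiries_high:.0f}")
print(f"estimated errors per day:   {errors_low:.0f} to {errors_high:.0f}")
# Prints roughly 40 to 60 inquiries and 790 to 960 errors per day,
# consistent with the article's "about 50" and 850-to-1,000 figures.
```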

Check operations quality has generally been defined by those factors that the customer directly relates to service: the number of statement errors, customer deposit corrections, and the posting of account activity. Although these items are key measures of customer service, they do not provide the check manager with enough data to evaluate the quality of the bank's internal operations.

Seven Guidelines

Most quality measures fail because they focus on symptoms, not causes. One of the most effective ways to integrate quality measures is to focus on where problems originate. Many managers tend to focus quality management on people or machines, the physical creators of errors. However, the real problem is that errors occur because the process is designed to allow them.

Whether designing or redesigning a product or service, these seven guidelines aid quality design.

* Simplify the input process. The more elements that are matched or constructed into a product, the more chances for error. The production pipeline should be linear, with no backtracking. An individual should see an item only once.

* Move verification and audit steps into the input process. As each step or process is performed, the design structure must require that the completed product is error free. Known deficiencies must not be passed on to the next task in the production chain.

* Minimize the number of steps to complete the product. Reducing the number of times the product is handled by people or machines lowers the incidence of error. When possible, combine steps and accountability, even at an increased cost. A slower, accurate process saves money in the long run.

* Maximize automation of routine processing and decision-making steps. Most processes that don't require judgment can be automated. Automation is appropriate in environments like check processing that have few product lines and heavy volumes.

* Simplify error-handling routines. Correcting an error and returning the item promptly to the production process minimizes costs. Automating repair steps lets systems hold repaired products to the original input quality requirements, ensuring that a repaired item passes all audits and verifications required of new items. (A brief sketch following this list illustrates this guideline together with the verification guideline above.)

* Minimize the training required to perform any task. Management must ensure that each task can be learned quickly and accurately by minimizing the knowledge required to perform each step. The more complex the judgment and the association of information required, the higher the error rate. Complexity can be reduced by compartmentalizing the training process into tasks learned incrementally.

* Assess and adapt validation processes to product requirements. A significant factor in quality management is the degree and type of monitoring performed. The style of monitoring must match the product delivered. For example, one does not review every stage of production to test the quality of a hamburger; one simply tastes it. In the same vein, to monitor check operations quality, one does not review posted items but monitors resolved adjustments.
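To make the second and fifth guidelines concrete, here is a minimal sketch, in Python, of an input step that refuses to pass known deficiencies downstream and routes failures through one automated repair attempt that is re-validated exactly like new input. The item fields, validation rules, and repair logic are invented for the illustration and do not describe any particular bank system.

```python
"""Minimal sketch of an input step that refuses to pass known deficiencies
downstream and routes failures through one automated repair attempt that is
re-validated exactly like new input.  The item fields, validation rules, and
repair logic are invented for this illustration."""

from dataclasses import dataclass


@dataclass
class CheckItem:
    account: str        # account number as captured at input
    amount_cents: int   # item amount in cents


def validate(item: CheckItem) -> list[str]:
    """Return a list of deficiencies; an empty list means the item is clean."""
    problems = []
    if not (item.account.isdigit() and len(item.account) == 10):
        problems.append("bad account number")
    if item.amount_cents <= 0:
        problems.append("non-positive amount")
    return problems


def try_repair(item: CheckItem) -> CheckItem:
    """Automated repair: strip formatting noise from the account field."""
    return CheckItem(account=item.account.replace("-", "").strip(),
                     amount_cents=item.amount_cents)


def accept(item: CheckItem) -> bool:
    """Accept an item only if it is error-free at input.  One automated
    repair attempt is made, and the repaired item must pass the same
    validation required of new items; otherwise it goes to manual review."""
    if not validate(item):
        return True
    return not validate(try_repair(item))


# A mis-keyed account number is repaired and accepted automatically;
# a non-positive amount cannot be repaired here and is rejected.
print(accept(CheckItem("123-456-7890", 12_500)))   # True
print(accept(CheckItem("1234567890", -500)))       # False
```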

Policies and Procedures

The policies and procedures of an organization reflect how it views its goals. Almost always, business objectives change faster than the policies and procedures supporting them. Business strategies often move so quickly that operations infrastructures become significantly outdated.

Case study: A large West Coast bank averaged 20,000 overdrawn demand deposit accounts daily and returned checks on 95% of the consumer accounts (10,000 checks a day). Studies by the operations area showed that over 94% of these accounts would have had positive balances within five days of the overdraft had the item been paid instead of returned.

Action: Negotiations with the business unit were conducted and financial commitments made to cover potential operating losses. In exchange, the business unit agreed to test a policy of paying overdrafts for all but new customers up to a specified amount. After a short pilot, the results were so successful that the process was introduced throughout the bank.

Result: Returned items decreased by 65%, processing costs were reduced by 40%, overdraft losses increased less than 3%, fee income increased, and a measurable enhancement in customer satisfaction was achieved. Within two years, a number of competitors adopted similar, though less aggressive, policies.
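The scale of the improvement can be illustrated with a back-of-the-envelope calculation. In the sketch below, the daily volumes, the 94% finding, and the 65% reduction come from the case study; the per-item handling cost and the business-day count are hypothetical placeholders used only to show the order of magnitude.

```python
# Back-of-the-envelope view of the overdraft case study.  The daily volumes,
# the 94% clearance finding, and the 65% reduction come from the case study;
# the per-item handling cost and business-day count are hypothetical.

returned_checks_per_day = 10_000
would_have_cleared = 0.94          # positive balance within five days if paid
reduction_after_pilot = 0.65

unnecessary_returns = returned_checks_per_day * would_have_cleared
returns_after = returned_checks_per_day * (1 - reduction_after_pilot)

cost_per_return = 2.50             # hypothetical handling cost per returned item
business_days = 250                # assumed business days per year
annual_savings = (returned_checks_per_day - returns_after) * cost_per_return * business_days

print(f"returns that would have cleared if paid: {unnecessary_returns:,.0f} per day")
print(f"returned items after the pilot:          {returns_after:,.0f} per day")
print(f"illustrative annual handling savings:    ${annual_savings:,.0f}")
# About 9,400 avoidable returns a day before the change, roughly 3,500 a day
# afterward, and on the order of $4 million a year at the placeholder cost --
# figures meant only to show scale.
```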

Designing the Reports

The more complex the function, the more staff is required to complete it.

Reports are originally designed to support a task, but, as with policies and procedures, tasks and systems change faster than the reports can be reassessed for value.

Case study: Cash accounting audits utilized balances and activity for selected general ledger lines. Teller activity resided in one set of lines, but the control process, which had evolved in a decentralized environment, required access to a number of general ledger reports in order to proof cash activity.

Action: General ledger reports were redesigned to reflect composite activity. All activity lines for a given teller were reflected in one report, ordered in the way lines were normally reconciled. Branch-generated cash control forms were modified to match reports. Preaudit recaps summarized each clerk's assigned teller lines, allowing management immediate access to each clerk's daily workload.

Result: The report changes simplified the cash reconciliation process, allowing a reduction in staff and less training time for new employees. The changes also became the model for automating the function, further reducing operating expenses by 50%.

This project provided immediate improvements for under $40,000 in software development. It also provided specifications for systems automation when analysis showed that carryforward activities for unposted entries, including reconciliation of outstanding entries, could be fully automated.

Putting the Stress on Function

Internal control can be one of the most costly functions in a financial institution. In many cases, the controls are oriented toward transactions rather than toward people and processes. Functional control design should weigh control costs against the risks, identify key indicators of risk, and establish a level of review appropriate to the transaction.

Case study: General ledger controls in a major bank were developed for transactions that were controlled in a decentralized (branch) environment. However, as activities became centralized, control processes were left unchanged.

Action: An analysis of general ledger transaction activity was undertaken to assess control processes. Three recommendations were adopted:

* Minimize the number of general ledger lines accessed by any branch.

* Emphasize controls based on assessed risk.

* Automate exception tracking and review of high-risk activity.

Result: The number of general ledger lines was reduced by 70%, primarily by initiating "one-way" transfer lines to manage activity between major traffic points.

Exception activity and balances in high-risk lines approached zero. A 30% reduction in operating costs was achieved six months after implementation.

The results of quality design are largely dependent on the methods used to assess the design and market needs.

The ultimate success of any quality design project depends on a leader who believes in it and inspires others to believe, too. In the end, it is people -- not just methodologies, research, analysis, or design -- who succeed.

Mr. Propheter is a consultant with the Bentley Group, San Francisco.
