
statistical process control (SPC)

Contributor(s): Corinne Bernstein

Statistical process control (SPC) is a scientific, data-driven methodology for monitoring, controlling and improving procedures and products. This industry-standard quality control (QC) method entails gathering information about a product or process on a near real-time basis so that steps can be taken to ensure the process remains under control.

Using SPC, manufacturing engineers or production supervisors can obtain and analyze statistical data about the manufacturing process at a site. The data collected helps determine whether characteristics of the product or process conform to specifications and meet acceptable quality levels.

Eliminating deviations from standards helps ensure that the process is stable, and monitoring the production process helps spot significant departures from the mean. SPC helps reduce waste by focusing on early detection and prevention of problems, rather than on correcting problems after the fact.

SPC is sometimes used interchangeably with the term statistical quality control (SQC). However, SQC typically focuses on process outputs, or dependent variables, while SPC focuses on process inputs, or independent variables.

SPC tools

Statistical process control techniques and tools can be employed to monitor process behavior, discover issues in internal systems and develop solutions for production issues. SPC can be applied to any manufacturing or non-manufacturing process whose output can be measured against specifications. The methodology works well for determining whether a process can generate products at an acceptable quality level when a project produces many similar items; it is not suited to projects that create only a small number of customized deliverables.

Control charts

[Figure: Example of a control chart]

In SPC, process variability is examined, and control charts and other tools are used to monitor and refine the process. Control charts enable users to record data over time and spot unusual events, such as an unexpectedly high or low observation. Engineers typically use standard-deviation calculations to set a chart's control limits. Those seeking to improve a process can examine the causes of variation and apply decision rules -- for example, flagging any point that falls outside the control limits -- to keep the process under control.
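The control-limit idea above can be sketched in a few lines of code. This is a minimal illustration, not a production SPC implementation: it assumes an individuals (X) chart with limits set at the mean plus or minus three standard deviations of a baseline, in-control sample, and the measurement values are hypothetical.

```python
import statistics

def control_limits(baseline, sigma_level=3.0):
    """Compute lower and upper control limits from an in-control
    baseline sample, using mean +/- sigma_level * sample stdev."""
    mean = statistics.fmean(baseline)
    stdev = statistics.stdev(baseline)
    return mean - sigma_level * stdev, mean + sigma_level * stdev

# Hypothetical baseline measurements (e.g., part diameters in mm)
# taken while the process was known to be in control.
baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 10.0, 9.9]
lcl, ucl = control_limits(baseline)

# New observations are checked against the limits as they arrive;
# any point outside [lcl, ucl] signals a possible special cause.
new_points = [10.0, 9.9, 10.6, 12.5]
flagged = [x for x in new_points if not (lcl <= x <= ucl)]
print(f"LCL={lcl:.3f}, UCL={ucl:.3f}, out of control: {flagged}")
```

Computing the limits from a separate in-control baseline, rather than from the data being monitored, mirrors how control charts are used in practice: an out-of-control reading would otherwise inflate the standard deviation and mask itself.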

History of SPC

The concept of statistical process control has a long history. Bell Laboratories' Walter A. Shewhart, sometimes referred to as the father of statistical quality control, pioneered SPC in the early 1920s to measure variance in production systems. Renowned quality expert W. Edwards Deming, a student of Shewhart, expanded the concept and introduced it to Japanese industry after World War II. Today, organizations around the world use SPC to improve product quality by reducing process variation. Thanks partly to the proliferation of comprehensive quality systems -- such as ISO 9000, QS-9000, Six Sigma and measurement system analysis (MSA) -- many companies work actively with SPC.

This was last updated in June 2019


