PPAP Documentation: What Goes in Each Element, Why Submissions Get Rejected, and How to Stop Resubmitting
A practical PPAP guide for automotive suppliers — what each of the 18 elements requires, the rejection patterns that keep killing submissions, and how to stop them.
A second-tier stamping supplier in northern Indiana submitted a Level 3 PPAP package to a Tier 1 customer at 4:47 p.m. on a Friday. The package was rejected at 4:53 p.m. — six minutes later, before anyone at the customer had opened a single attachment. The reason was simple. The Part Submission Warrant listed drawing revision F. The dimensional layout in the package was tied to drawing revision E. The customer's PPAP portal flagged the mismatch automatically. The reviewer never saw the rest of the submission, which by every other measure was clean — Cpk of 2.1 on the controlled characteristic, complete material certs, an approved control plan, an MSA done correctly. None of it mattered. The package went back into the queue, and the launch slipped a week because the resubmission landed at the back of the next review cycle.
This is the rejection pattern that keeps showing up in the PPAP discussions on Elsmar Cove and in the rejection-reason data that customers' supplier quality teams quietly track. The headline causes are not bad parts or failed capability studies. They are paperwork mismatches between documents that should have been internally consistent before the package ever left the supplier's hands. This guide covers what each of the 18 PPAP elements actually requires, where the consistency traps are, and the structural reasons most suppliers keep falling into them.
What PPAP is and what level you're submitting
PPAP — the Production Part Approval Process — is the AIAG-defined process by which automotive suppliers prove to their customers that the production process can make conforming parts at the required rate, and that the supplier has the documentation to demonstrate it. The current reference is the AIAG PPAP manual, 4th edition, supplemented by each customer's customer-specific requirements (CSRs).
There are five submission levels. Level 1 is a Part Submission Warrant only, with the rest of the package retained at the supplier. Level 2 is the warrant plus product samples and limited supporting data. Level 3 is the default — the full 18-element package plus samples submitted to the customer. Level 4 is whatever the customer specifies, which usually means Level 3 minus a few elements. Level 5 is the full package retained at the supplier and reviewed at the supplier's site.
Level 3 is what most automotive customers expect for new parts, design changes, and material changes. Level 1 is common for low-risk catalog parts and proprietary fasteners. Submitting at the wrong level — usually Level 1 when the customer's engineering bulletin called for Level 3, or Level 3 with elements missing because the supplier assumed they weren't required — is one of the standard rejection causes. The level is not the supplier's choice. It is the customer's choice, communicated in the engineering bulletin, the SQ engineer's email, or the customer's CSR document. Read the request before assembling the package.
The 18 elements, what they actually contain, and where they trip people up
Not every element is required for every submission. The level dictates what gets sent to the customer and what stays at the supplier, and the customer's CSR can add or waive elements. The package that ships, however, has to be internally consistent — which means every element has to be aligned with every other element on the part number, the drawing revision, the date, and the reason for submission.
1. Design records. The part drawing or 3D model that defines the part. The drawing revision listed on the design record has to match the revision on every other element in the package. This sounds trivial. It is not. The drawing revision is the most common mismatch in rejected submissions because the design record sits in the engineering system and the dimensional layout sits in a quality system, and the two systems update on different cadences.
2. Engineering change documents. If the part is being submitted because of an engineering change, the change documents have to be in the package. The trap here is partial submission — the change notice is included, but the predecessor change notice that the current change supersedes is not, and the customer can't trace the part's revision history.
3. Customer engineering approval. Where the customer's engineering organization approves a deviation, an interim spec, or a special characteristic, the approval record goes in. Suppliers commonly leave this out because the conversation happened by email between the supplier engineer and the customer engineer and never got formalized. The email is the approval record. It still has to be in the package.
4. Design FMEA. Required when the supplier is design-responsible. The DFMEA has to be current — same revision discipline as the drawing — and has to address every special characteristic. A DFMEA dated three years before the current drawing revision is a finding even if the design hasn't changed, because the FMEA review history isn't visible.
5. Process flow diagram. Every step in the manufacturing process from receipt of raw material through shipment, including inspection points, rework loops, and material handling. The flow diagram has to be consistent with the PFMEA and the control plan. If the flow shows three inspection stations and the control plan only addresses two, the package is internally inconsistent.
6. Process FMEA. The PFMEA covers every step in the process flow and identifies the failure modes, effects, and controls. The PFMEA has to address every special characteristic from the DFMEA or the customer drawing, the RPN scoring has to be current, and the action items have to be either closed or have a documented closure plan. Missing special characteristics is a common rejection reason.
7. Control plan. The control plan operationalizes the PFMEA. Every special characteristic has a row, every reaction plan is documented, and every measurement method is specified. The control plan has to be consistent with the PFMEA, the process flow, and the dimensional layout. The "reaction plan" column being blank or saying "see procedure" is a finding — auditors and SQEs both want the reaction documented in the row.
8. Measurement System Analysis. Gauge R&R studies on every gauge used to measure a special characteristic, plus bias, linearity, and stability where applicable. The MSA has to be current, the format has to match what the customer asks for in the CSR (some customers reject anything not in the latest AIAG MSA manual format), and the gauge identification numbers have to match the control plan. Submitting an MSA from an old gauge that's since been replaced is a common rejection cause when the SQE cross-checks the gauge ID against the control plan.
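The arithmetic behind a gauge R&R verdict is worth seeing once. Below is a minimal sketch of the average-and-range method for the common 3-appraiser, 10-part, 3-trial study layout; the K1/K2/K3 constants are the tabulated values for that specific layout (other layouts use other constants), and the usual acceptance bands are under 10% acceptable, 10 to 30% marginal, over 30% unacceptable. The data here is synthetic and illustrative, not from a real study.

```python
import math

# Tabulated average-and-range constants for this layout only:
# K1 for 3 trials, K2 for 3 appraisers, K3 for 10 parts.
K1, K2, K3 = 0.5908, 0.5231, 0.3146

def percent_grr(data):
    """data[appraiser][part] -> list of trial readings. Returns %GRR."""
    n_app, n_parts, n_trials = len(data), len(data[0]), len(data[0][0])

    # Repeatability (EV): average within-part range, scaled by K1.
    r_bars = [sum(max(p) - min(p) for p in app) / n_parts for app in data]
    ev = (sum(r_bars) / n_app) * K1

    # Reproducibility (AV): spread of appraiser averages, scaled by K2,
    # with the repeatability contribution subtracted back out.
    app_means = [sum(sum(p) for p in app) / (n_parts * n_trials) for app in data]
    av_sq = ((max(app_means) - min(app_means)) * K2) ** 2 - ev ** 2 / (n_parts * n_trials)
    av = math.sqrt(av_sq) if av_sq > 0 else 0.0

    grr = math.hypot(ev, av)

    # Part variation (PV): range of part averages across all readings.
    part_means = [
        sum(data[a][p][t] for a in range(n_app) for t in range(n_trials))
        / (n_app * n_trials)
        for p in range(n_parts)
    ]
    pv = (max(part_means) - min(part_means)) * K3
    return 100.0 * grr / math.hypot(grr, pv)  # %GRR of total variation

# Deterministic illustration: parts spread 10.0-14.5, gauge noise of +/-0.02.
data = [[[10.0 + 0.5 * p + 0.02 * ((a + p + t) % 3 - 1) for t in range(3)]
         for p in range(10)] for a in range(3)]
print(f"%GRR = {percent_grr(data):.1f}%")  # well under 10% -> gauge acceptable
```

The same structural point applies as in the prose: the gauge IDs the study was run against have to be the ones on the control plan, or the number above proves nothing about the package.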
9. Dimensional results. A layout — every dimension on the drawing measured on a sample, with the actual values, the spec limits, and a pass/fail indication. The dimensional layout has to match the drawing revision, has to cover every dimension on the drawing (not just the special characteristics), and has to be signed and dated. Out-of-spec dimensions without an attached deviation request are an automatic rejection.
10. Material and performance test results. Material certifications from the steel mill or resin supplier, plus any performance tests the drawing or spec calls for. Material certs have to trace back to the actual material lot used to make the parts, not a generic mill cert from a different heat. Performance test reports have to be on the supplier's letterhead or a qualified lab's letterhead, dated, and signed. A performance test result with no signature is one of the most common rejection causes for first-time submitters.
11. Initial process studies. Process capability data on every variable special characteristic. AIAG requires an index of at least 1.67 for the initial study at PPAP (conventionally reported as Ppk, since a first run only captures overall variation), relaxing to 1.33 (conventionally Cpk) for ongoing production. The data has to come from the actual PPAP significant production run, typically a minimum of 300 consecutive parts, and the index has to be computed from data the customer can verify. Small sample sizes and capability values that look too perfect both attract scrutiny.
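For a single variable characteristic, the index itself is a short computation. A minimal sketch, using the overall (n-1) standard deviation that initial studies conventionally report as Ppk; the readings and spec limits are illustrative, not from the article's example:

```python
import statistics

def initial_ppk(values, lsl, usl):
    """Ppk = min(USL - mean, mean - LSL) / (3 * overall sigma)."""
    mu = statistics.fmean(values)
    sigma = statistics.stdev(values)  # overall variation, n-1 denominator
    return min(usl - mu, mu - lsl) / (3 * sigma)

readings = [9.97, 9.98, 9.99, 10.00, 10.01, 10.02, 10.03]
index = initial_ppk(readings, lsl=9.88, usl=10.12)
print(f"Ppk = {index:.2f}")  # 1.85 -> clears the 1.67 initial-study gate
```

Seven readings keeps the arithmetic visible; a real initial study runs on the significant production run, and a sample this small would itself attract the scrutiny the prose above describes.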
12. Qualified laboratory documentation. Scope and accreditation evidence for any testing performed by an in-house or external lab. The lab has to be accredited (usually ISO/IEC 17025) for the specific tests in the PPAP, not just generally accredited. Submitting a PPAP from a supplier whose lab scope doesn't cover one of the tests in the package is a published rejection cause and a particularly painful one because the fix usually involves outsourcing the test, which adds weeks.
13. Appearance Approval Report. Required for parts where appearance is a customer requirement — paint, plating, visible interior trim, exterior body panels. The AAR has to be signed by the customer's appearance approval authority. Appearance approvals from someone other than the named authority — common when the customer engineer is on PTO and the supplier engineer signs off with the buyer — get rejected.
14. Sample production parts. The actual parts. Quantity, identification, and packaging are all specified by the customer. The parts shipped to the customer have to be from the same production run that generated the dimensional layout and the capability data, and the part identification has to be traceable back to that run.
15. Master sample. A sample part retained at the supplier, signed and dated by both the supplier and the customer, that becomes the reference for any future production part disputes. The master sample requirement is sometimes waived in the CSR, sometimes not. Reading the CSR matters.
16. Checking aids. Fixtures, gauges, templates, and check fixtures used to verify the part. The PPAP package documents these — drawings, calibration records, MSA results — and the physical aids are retained at the supplier. The records have to be current; a checking aid drawing from before the last design change is a finding.
17. Customer-specific requirements. The CSR is its own element. Every customer publishes its own — Ford's PPAP requirements, GM's, Stellantis's, Toyota's — and every Tier 1 layers its own on top. The element's evidence is a checklist showing every CSR requirement with its compliance status. Suppliers who don't have a CSR checklist in the package are signaling to the SQE that they didn't read the CSR. The SQE will have read it. The rejection follows.
18. Part Submission Warrant. The PSW is the cover sheet that summarizes the package — part number, drawing revision, reason for submission, level, dimensional and material results status, and the supplier's declaration of conformance. The PSW is the most common single source of rejection because it has to be perfectly consistent with every other element. A PSW that says "submission level 3" with a package missing two of the Level 3 required elements is a rejection. A PSW signed by someone whose name isn't in the customer's approved signer list is a rejection. A "reason for submission" of "annual" when the customer is expecting "engineering change" is a rejection.
Why PPAPs actually get rejected
The published rejection data and the practitioner conversations on Elsmar and in supplier quality groups tell the same story. Most rejections aren't about the parts. They're about the documents not lining up with each other or with the customer's expectations.
The dominant pattern is internal inconsistency across the package. The drawing revision on the PSW doesn't match the dimensional layout. The gauge ID on the MSA doesn't match the control plan. The PFMEA addresses six special characteristics; the control plan only operationalizes four. The process flow shows a heat treat step; the PFMEA doesn't. Each of these is a single-line discrepancy, easy to miss inside the supplier and impossible to miss in an SQE's first-pass review.
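Every one of those single-line discrepancies is mechanically checkable before the package ships. A sketch of what a pre-flight cross-check could look like; the field names and element keys here are illustrative, not an AIAG schema:

```python
def cross_check(elements):
    """elements: {element_name: metadata dict}. Returns discrepancy strings."""
    findings = []
    # Every element that carries these fields must agree on them.
    for field in ("part_number", "drawing_rev"):
        seen = {name: meta[field] for name, meta in elements.items() if field in meta}
        if len(set(seen.values())) > 1:
            findings.append(f"{field} differs across elements: {seen}")
    # Every gauge on the control plan needs a matching MSA.
    cp = elements.get("control_plan", {})
    msa_gauges = set(elements.get("msa", {}).get("gauge_ids", ()))
    for g in sorted(set(cp.get("gauge_ids", ())) - msa_gauges):
        findings.append(f"control plan gauge {g} has no MSA on file")
    # Every PFMEA special characteristic needs a control plan row.
    pfmea_sc = set(elements.get("pfmea", {}).get("special_chars", ()))
    for sc in sorted(pfmea_sc - set(cp.get("special_chars", ()))):
        findings.append(f"special characteristic {sc} missing from control plan")
    return findings

# The Friday-afternoon rejection from the opening, caught before shipping:
package = {
    "psw":          {"part_number": "1234-A", "drawing_rev": "F"},
    "dimensional":  {"part_number": "1234-A", "drawing_rev": "E"},
    "control_plan": {"gauge_ids": ["G-101", "G-102"], "special_chars": ["SC1"]},
    "msa":          {"gauge_ids": ["G-101"]},
    "pfmea":        {"special_chars": ["SC1", "SC2"]},
}
for finding in cross_check(package):
    print(finding)
```

This run reports three findings: the revision mismatch, a control plan gauge with no MSA, and a PFMEA special characteristic with no control plan row. The point is not this particular script; it is that none of these checks requires human judgment, so none of them should ever reach the SQE.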
The second pattern is missing or vague required content. Reason for submission is "design change" without specifying which change. Statement of conformance reads "all dimensions conform" without referencing the specific test results. PSW signed without a date. Material certs included but not the lot traceability link. AAR signed by the wrong authority. None of these are about whether the parts are good. They are about whether the package proves it.
The third pattern is failure to meet customer-specific requirements. The supplier built a textbook AIAG-compliant package and didn't read the customer's CSR document. The CSR called for an additional reliability test, a specific MSA format, a particular packaging layout, or evidence of supplier-tier-2 PPAP. Without it, the package is incomplete by the customer's definition even if it is complete by AIAG's.
The fourth pattern is nonconforming parts submitted with deviation requests attached or — worse — without them. Submitting a part with a known dimensional miss and asking for the deviation in the cover note is the wrong sequence. The deviation has to be approved before the PPAP is submitted. A deviation request inside a PPAP package is a signal to the SQE that the planning process didn't complete, and the typical response is rejection followed by a request to resubmit after the deviation is closed out.
The fifth pattern is timing — submitting on Friday afternoon, mid-shutdown, the day before a customer holiday, or four days before the customer's launch readiness review. The package may eventually be approved, but the queue position can cost a week of program time and trigger an escalation that puts the supplier on a watch list.
The PSW is where the rejection actually happens
Of all 18 elements, the PSW is the one most likely to be the proximate cause of rejection. It is the element that has to summarize all the others, which means every inconsistency anywhere in the package surfaces on the PSW first.
A clean PSW carries the part number exactly as the customer specifies it, including any prefix or suffix the customer's system requires. The drawing revision matches the design record. The reason for submission is one of the AIAG-listed reasons, written in the customer's preferred wording — not "engineering change" when the customer's portal expects "design change," even though the standard treats them as synonyms. The submission level matches what the customer requested. The dimensional and material results check boxes match the actual results status of the embedded reports. The signer is on the customer's approved list. The date is real and recent. The conformance statement, where the customer requires one, is specific to the package and not a copy-paste from a previous PSW.
Suppliers who have figured this out treat the PSW not as the cover sheet but as a final cross-check tool. The PSW gets filled in last, every field gets verified against the underlying element, and the package doesn't ship until the cross-check is signed off by someone other than the person who built the package. The teams that ship rejected PSWs are almost always the teams that built the PSW first and assembled the supporting elements around it.
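That final cross-check can itself be mechanized. A sketch, with a hypothetical hard-coded required-element set; in practice the list comes from the AIAG manual plus the customer's CSR and level request, and the field names are illustrative:

```python
# Illustrative subset of elements a Level 3 package submits; the real
# list comes from the AIAG manual plus the customer's CSR.
LEVEL_3_REQUIRED = {"design_record", "dimensional_results", "control_plan",
                    "pfmea", "msa", "material_results", "capability_study"}

def verify_psw(psw, package, approved_signers):
    """psw: dict of warrant fields; package: {element_name: metadata}."""
    problems = []
    if psw["level"] == 3:
        missing = LEVEL_3_REQUIRED - set(package)
        if missing:
            problems.append(f"Level 3 claimed but package lacks: {sorted(missing)}")
    if psw["signer"] not in approved_signers:
        problems.append(f"signer {psw['signer']!r} not on customer's approved list")
    design_rev = package.get("design_record", {}).get("drawing_rev")
    if psw["drawing_rev"] != design_rev:
        problems.append(f"PSW rev {psw['drawing_rev']} vs design record rev {design_rev}")
    return problems

psw = {"level": 3, "signer": "J. Doe", "drawing_rev": "F"}
package = {name: {"drawing_rev": "F"} for name in LEVEL_3_REQUIRED - {"msa"}}
print(verify_psw(psw, package, approved_signers={"A. Smith"}))
```

The run above flags two problems, a missing MSA and an unapproved signer, before the warrant ever leaves the building. The independent-reviewer step in the prose maps to whoever runs and signs off on this check.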
Where the records have to live
A working PPAP system has to do something most spreadsheet-and-folder document control architectures struggle with. It has to keep eighteen interrelated documents in sync across multiple revisions, multiple part numbers, multiple submissions to multiple customers, and multiple regulatory windows — and it has to keep the audit trail intact for a minimum of one production year past the part's last shipment, often longer per the customer's CSR.
Most suppliers approximate this with a folder structure on a shared drive, an Excel index of submissions, and a working assumption that the team building the next PPAP can find the previous one. They can, until they can't. The original Word file of the PSW gets overwritten with the resubmission. The dimensional layout for revision E is moved into an "old" folder that gets cleaned up when the IT team reorganizes the drive. The MSA spreadsheet is updated for a new gauge and the previous version of the file no longer exists in any form. When a customer comes back two years later asking for the historical PPAP because of a field issue, the package the supplier sent on the day is no longer fully reconstructable. The supplier produces what is recoverable, the customer notices the gaps, and the supplier moves a tier on the customer's risk score.
The structural problem is that PPAP is a multi-document, multi-revision, multi-submission record-keeping requirement running on top of tooling that wasn't built to retain that kind of trail. Drawings live in the engineering system. Capability data lives in the SPC software. MSAs live in spreadsheets. CSRs live in PDFs in an email archive. PSWs live in Word documents named v3-final-FINAL-actuallyfinal.docx. Tying them together for the submission is a manual integration job, and the audit trail across them is whatever the most disciplined person on the team manages to maintain.
This is the same record-keeping problem that surfaces under document control, change management, and CAPA. SheetLckr exists to close this specific gap: a compliance-grade spreadsheet platform with built-in version history, approval workflows, and a tamper-evident audit trail, so the dimensional layout, the MSA, the capability study, the CSR checklist, and the PSW cross-check can live in one connected, auditable record that holds up when a customer SQE walks the trail, whether during the submission review or two years later investigating a field claim. The package isn't passed or failed at the PSW. It's passed or failed on whether every supporting element can be reproduced exactly as it was on the day the PSW was signed.
PPAP is one of the higher-stakes record-keeping exercises in automotive supply because the consequences of a rejection are not abstract. A rejected PPAP holds up production, slips a launch, attracts an SQE visit, and eventually shows up on the supplier's scorecard. The discipline involved is not principally technical. It is the discipline of building a package whose 18 elements are internally consistent, externally aligned with the customer's CSR, and assembled in an order that lets the PSW be a cross-check rather than a guess. Suppliers who handle PPAP well treat every submission as a coordinated document set rather than a checklist to fill, and they keep the trail intact long enough that the package is still reconstructable years after the parts have shipped. Suppliers who handle it badly assemble the package from whatever is current in whichever system happens to have it, ship it on a Friday afternoon, and learn at 4:53 p.m. that the drawing revision on the PSW didn't match the dimensional layout. The rejection isn't really about the revision. It's about the system that let the mismatch ship.
Stop patching Excel. Run audits with confidence.
SheetLckr gives quality teams a spreadsheet with built-in audit trails, version locking, approvals, and CAPA tracking — so you're always audit-ready, not scrambling the week before.