The most common rejection in a job hunt isn't a "no thanks" email. It's silence. You spend an hour tailoring a resume, paste it into a portal, click Submit, and… nothing. Statistically, the resume probably never reached a recruiter at all — it was filtered out by software.
Industry estimates put that filter rate around 75%. Three out of four resumes get screened out by an Applicant Tracking System (ATS) before a human reviewer sees them. The frustrating part is that most of those rejections aren't about whether you're qualified for the role. They're about whether the software could parse your resume and match it to the job description.
This article walks through what an ATS actually does, the specific patterns that get resumes filtered, and how to fix them.
What an ATS is, in one paragraph
An Applicant Tracking System is the software an employer uses to receive, store, and rank job applications. Think Workday, Greenhouse, Lever, iCIMS, Taleo, SAP SuccessFactors. When you upload a resume, the ATS does three things: it parses your file into structured fields (name, contact, work history, education, skills), it scores you against the job description's keywords, and it produces a ranked list for the recruiter.
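The parse → score → rank flow can be sketched as a toy pipeline. Everything here is illustrative — the field names, the heading-splitting regex, and the scoring rule are assumptions for the sake of the example, not any vendor's actual implementation:

```python
# Toy sketch of an ATS pipeline: parse -> score -> rank.
# All names and logic are illustrative, not any real product's API.
import re

def parse(resume_text):
    """Split raw text into crude structured fields by standard headings."""
    fields = {}
    sections = re.split(r"\n(?=(?:Summary|Experience|Education|Skills)\b)", resume_text)
    for section in sections:
        heading, _, body = section.partition("\n")
        fields[heading.strip().lower()] = body.strip()
    return fields

def score(fields, jd_keywords):
    """Count JD keywords that appear verbatim in the parsed fields."""
    text = " ".join(fields.values()).lower()
    return sum(1 for kw in jd_keywords if kw.lower() in text)

def rank(resumes, jd_keywords):
    """Produce the ranked list the recruiter actually reads."""
    return sorted(resumes, key=lambda r: score(parse(r), jd_keywords), reverse=True)
```

Even this toy version shows the core mechanic: a resume that fails at `parse` or scores zero at `score` never surfaces in `rank` — which is the ranked list the recruiter reads.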
Recruiters at large companies receive hundreds of applications per role. They don't read every resume — they read the top of the ranked list. If you're not on it, you're invisible.
The five patterns that get resumes filtered
In our analysis of thousands of resume rewrites, the same five patterns account for most ATS-related rejections.

1. Keyword mismatch
The ATS scores your resume against the keywords in the job description. If the JD asks for "TypeScript" and your resume says "TS" or "JavaScript with strong types," the parser may not match it. If the JD asks for "stakeholder management" and your resume says "worked with internal partners," same problem.
The fix isn't keyword stuffing — it's mirroring the JD's exact phrasing where it's factually accurate. If you used TypeScript on a project, the resume should say "TypeScript" verbatim, not a synonym.
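A minimal illustration of why synonyms don't count, assuming the simplest possible matcher — verbatim substring matching, which is roughly what the least forgiving parsers do. The function and the sample phrases are hypothetical:

```python
# Naive keyword matcher: only verbatim matches count.
# A real ATS may add stemming, but it will not map "TS" to "TypeScript".
def matched_keywords(jd_keywords, resume_text):
    text = resume_text.lower()
    return [kw for kw in jd_keywords if kw.lower() in text]

jd = ["TypeScript", "stakeholder management"]

matched_keywords(jd, "JavaScript with strong types (TS); worked with internal partners")
# -> no matches: the synonyms never hit the JD's phrasing

matched_keywords(jd, "Built a TypeScript service; led stakeholder management for 3 teams")
# -> both keywords match verbatim
```

The second resume line says the same things as the first — but only the second earns the score, because it mirrors the JD's exact words.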
2. Formatting that the parser can't read
Resumes built in Canva, two-column Word templates, and PDFs exported from Figma frequently fail at the parsing step: the parser gets scrambled text, or no text at all. Common offenders:
- Tables for layout — most ATS parsers read tables as a single confused string
- Text boxes — many parsers can't see their contents at all
- Headers/footers — many parsers strip them entirely
- Two-column layouts — reading order is unpredictable
- Icons used as bullet points — read as garbage characters
The safest format is a single-column document with standard headings, plain bullet points, and a real .pdf or .docx (not a scanned image of either).
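To see why two-column layouts misfire, simulate a parser that reads strictly left-to-right, line by line — a common failure mode, though actual parsers vary. The resume content here is made up:

```python
# Simulate a parser reading a two-column resume left-to-right, line by line.
# Left column: a skills sidebar. Right column: work history.
left = ["SKILLS", "Python", "SQL"]
right = ["EXPERIENCE", "Acme Corp, Backend Engineer", "Cut API latency 40%"]

# What a line-by-line parser actually extracts from the side-by-side layout:
extracted = " ".join(f"{l} {r}" for l, r in zip(left, right))
print(extracted)
# -> SKILLS EXPERIENCE Python Acme Corp, Backend Engineer SQL Cut API latency 40%
```

The sidebar and the work history interleave into one confused string — the "Skills" and "Experience" fields the ATS needs are gone.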
3. Non-standard section names
The ATS knows what "Experience" and "Education" mean. It does not know what "My Journey," "What I've Built," or "Adventures" mean. If you rename your sections to be cute, the parser may not categorize the content correctly — and the recruiter searching for "5+ years experience" might not match you even if you have it.
Stick to the obvious: Summary, Experience, Education, Skills, Projects, Certifications.
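One plausible way a parser categorizes sections is a plain whitelist of headings. This is a sketch — real parsers add fuzzier matching — but the whitelist mindset is exactly why cute section names fail:

```python
# Hypothetical heading classifier: anything off the whitelist falls into
# an "unknown" bucket that recruiter searches never touch.
CANONICAL = {
    "summary": "summary", "about": "summary",
    "experience": "experience", "work experience": "experience",
    "education": "education",
    "skills": "skills", "technical skills": "skills",
    "projects": "projects", "certifications": "certifications",
}

def classify_heading(heading):
    return CANONICAL.get(heading.strip().lower(), "unknown")

classify_heading("Work Experience")  # -> "experience"
classify_heading("My Journey")       # -> "unknown"
```

"My Journey" may contain five years of directly relevant work, but if it lands in the unknown bucket, the tenure search never finds it.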
4. Inconsistent date formats
Mixing "Jan 2022 – Present" with "06/2020-12/2021" confuses the parser's tenure calculation. So does writing "2 yrs" instead of giving start and end dates. Pick one format ("Jan 2022 – Present" is a safe default) and use it everywhere, including any dates given in parentheses after a role.
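Here's a sketch of the tenure math a parser attempts, and why mixed formats break it. The regex and the single-format whitelist are illustrative assumptions, not any product's actual rules:

```python
# Hypothetical tenure calculator: understands exactly one date format.
import re
from datetime import date

MONTHS = {"jan": 1, "feb": 2, "mar": 3, "apr": 4, "may": 5, "jun": 6,
          "jul": 7, "aug": 8, "sep": 9, "oct": 10, "nov": 11, "dec": 12}

def parse_date(text):
    """Return a date, or None if the format isn't recognized."""
    text = text.strip().lower()
    if text == "present":
        return date.today()
    m = re.fullmatch(r"([a-z]{3})\w* (\d{4})", text)  # "Jan 2022" / "January 2022"
    if m and m.group(1) in MONTHS:
        return date(int(m.group(2)), MONTHS[m.group(1)], 1)
    return None  # "06/2020", "2 yrs", etc. fall through unrecognized

def tenure_months(start, end):
    s, e = parse_date(start), parse_date(end)
    if s is None or e is None:
        return None  # parser gives up; the role contributes zero tenure
    return (e.year - s.year) * 12 + (e.month - s.month)
```

`tenure_months("Jan 2022", "Present")` computes fine; `tenure_months("06/2020", "12/2021")` returns `None`, so that eighteen months of experience silently vanishes from your total.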
5. PDFs that are actually images
If you "print to PDF" from a system that flattens text into an image, or if you export from a design tool, the resulting PDF may contain no machine-readable text at all. The ATS gets a single picture and can't extract anything.
The test: open your PDF and try to select a paragraph. If you can highlight individual words, the text is real. If selection grabs a whole rectangle of pixels, it's an image, and the ATS can't extract anything from it.
A worked example
Here's a realistic rewrite of a single experience bullet:
Original:
Worked on improving application performance and helped with backend stuff.
Why it gets filtered:
- "Worked on" and "helped with" don't read as ownership
- No keywords match a typical backend JD ("Node.js," "PostgreSQL," "latency," "throughput")
- No quantification
Rewritten for a backend JD asking for Node.js, Postgres, and performance optimization:
Reduced p95 API latency 40% by tuning Postgres queries and adding Redis caching to a Node.js service handling 10K requests/minute.
Same fact, parsed and scored very differently. The ATS reads "Postgres," "Node.js," "latency," "performance," and "caching" — every keyword from the JD. The recruiter reads ownership and a real metric.
What this means for you
If you're applying to 10+ roles a week and getting silence, the problem is almost never that you're underqualified. It's that the resume isn't being scored well by the systems doing the first read.
The fix is to tailor each resume to the JD before submitting. That sounds like an hour of work per application — and it used to be. With AI tooling that reads both the resume and the JD, scores the gap, and rewrites your bullets in seconds, it's now a thirty-second step.
We built Resuvia for exactly this reason. Paste your resume, paste the JD, see your ATS match score, see the keywords you're missing, get a tuned rewrite. No signup to see the score; pay only when you want to download the final PDF.
If you're targeting a specific role, our software engineer, data analyst, product manager, and marketing manager pages explain exactly which keywords the model recognizes for each.
The bots aren't the enemy. They're just software, and software has rules. Once you know what the rules are, getting past them is mechanical.