How to Handle Typing Test Score Disputes and Build Fair Retake Policies
Build a fair, defensible process for handling typing test score disputes. Learn how to create retake policies, structure appeals, and communicate with candidates at every stage.

A candidate finishes their typing test, sees a score of 43 WPM, and immediately fires off an email: "Something was wrong. My keyboard was lagging. I type way faster than that. Can I retake it?"
What do you do?
If you don't have a clear answer ready, you're not alone. Most hiring teams create typing assessments with plenty of thought around pass/fail thresholds and test content, but very little thought around what happens when someone pushes back on their results. That gap can lead to inconsistent decisions, frustrated candidates, and even legal exposure.
The good news? Building a fair, transparent dispute and retake process isn't complicated. It just takes some intentional planning. This guide walks you through how to handle score disputes professionally, create retake policies that hold up under scrutiny, and communicate everything in a way that protects both your organization and your candidates.
Whether you're hiring data entry clerks or customer service representatives, having a defensible process matters. A platform like TypeFlow gives you detailed keystroke analysis, security monitoring, and candidate performance data that make dispute resolution far more objective. But the policy framework comes first. Let's build it.
Why Score Disputes Happen and What They Really Mean
Before you can handle disputes well, you need to understand why candidates challenge their scores in the first place. Not every dispute is a complaint. Some are genuine signals that something went wrong during the assessment.
Legitimate Technical Issues
Typing tests depend on technology, and technology fails. A candidate might experience browser lag, an unresponsive keyboard, a spotty internet connection, or an unexpected system notification that pulled focus during the test. These are real problems that can genuinely suppress a person's score.
The challenge is distinguishing legitimate technical issues from convenient excuses. This is where data becomes your best friend. Modern testing platforms track more than just the final WPM number. They capture keystroke timing patterns, pause intervals, tab switches, paste attempts, and focus loss events. If a candidate claims their browser froze mid-test, you can look at the keystroke timeline and see whether there's a gap consistent with that claim.
Without this data, you're stuck making judgment calls based on who sounds more convincing. With it, you can make evidence-based decisions.
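To make this concrete, here is a minimal sketch of the kind of gap analysis described above. It assumes you can export keystroke timestamps (in milliseconds) from your testing platform; the function name, thresholds, and export format are all illustrative, not a real TypeFlow API.

```python
from statistics import median

def find_interruptions(timestamps_ms, gap_factor=10, min_gap_ms=2000):
    """Flag gaps between keystrokes that are far longer than the
    candidate's typical inter-key interval -- a pattern consistent
    with a freeze or focus loss rather than normal typing.
    Returns (timestamp_before_gap, gap_length_ms) pairs."""
    intervals = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    typical = median(intervals)
    threshold = max(gap_factor * typical, min_gap_ms)
    return [
        (timestamps_ms[i], gap)
        for i, gap in enumerate(intervals)
        if gap > threshold
    ]

# A steady typist (one keystroke every 250 ms) with a single
# 5-second stall around the 30-second mark:
ts = list(range(0, 30_000, 250)) + [t + 5_000 for t in range(30_000, 60_000, 250)]
print(find_interruptions(ts))  # → [(29750, 5250)]
```

A single large gap like this is consistent with a browser freeze; dozens of scattered gaps look more like ordinary hesitation, which is why the comparison against the candidate's own typical interval matters.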
Test Anxiety and Unfamiliarity
Some candidates simply don't perform well under timed pressure, especially if they weren't expecting a typing assessment. A candidate who types 60 WPM all day at their current job might freeze up and score 40 WPM when a countdown timer is staring them down.
This doesn't mean your test is unfair. But it does mean your communication before the test matters. Candidates who know what to expect, how long the test takes, what the passing threshold is, and whether they'll have multiple attempts tend to perform closer to their true ability.
Misaligned Expectations
Sometimes a dispute isn't about the score itself. It's about the candidate not understanding how the score was calculated. If your test weights accuracy heavily (penalizing errors, for example), a fast but error-prone typist might score lower than they expected. If you measure net WPM rather than gross WPM, that distinction matters.
Clear documentation of your scoring methodology prevents this category of disputes almost entirely. When candidates understand the rules before they start, they're far less likely to challenge the outcome.
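The gross-versus-net distinction is easy to state precisely. The sketch below uses one widely used convention (a "word" is five characters, and each uncorrected error costs one word per minute); your platform may define these differently, so treat the formulas as an example of what to document, not the definitive calculation.

```python
def gross_wpm(chars_typed, minutes):
    """Raw speed: one 'word' equals 5 characters, by convention."""
    return chars_typed / 5 / minutes

def net_wpm(chars_typed, uncorrected_errors, minutes):
    """Speed adjusted for errors: each uncorrected error subtracts
    one word per minute, floored at zero."""
    return max(gross_wpm(chars_typed, minutes) - uncorrected_errors / minutes, 0)

# A fast but error-prone typist: 300 characters in 1 minute, 6 uncorrected errors.
print(gross_wpm(300, 1))   # → 60.0
print(net_wpm(300, 6, 1))  # → 54.0
```

Publishing exactly this kind of formula in your pre-test materials is what heads off the "but I type 60 WPM" dispute before it starts.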
The bottom line: most disputes fall into a few predictable categories, and most can be prevented or resolved with better communication and better data. The Uniform Guidelines on Employee Selection Procedures emphasize that employment tests must be administered consistently and documented thoroughly. Your dispute process should reflect those principles.
Building a Retake Policy That's Both Fair and Defensible
A retake policy is your first line of defense against score disputes. When candidates know upfront that retakes are available under certain conditions, it reduces anxiety, increases perceived fairness, and gives you a structured framework for saying "yes" or "no" without it feeling arbitrary.
Here's how to build one that works.
Define Who Qualifies for a Retake
Not every dispute warrants a retake. Your policy should clearly spell out the circumstances under which a candidate can request one. Common qualifying scenarios include:
Documented technical failure: The candidate experienced a verifiable issue (browser crash, connectivity loss, hardware malfunction) that is supported by session data or a contemporaneous report.
Test administration error: The wrong test version was sent, the time limit was configured incorrectly, or the test link expired before the candidate could complete it.
Accommodation need: The candidate has a documented disability that wasn't properly accommodated during the first attempt.
Your policy should also clarify what does not qualify. A candidate who simply scored lower than they hoped, or who says they "weren't ready," generally shouldn't receive a retake unless your policy allows practice attempts for everyone.
The key principle: whatever rules you set, apply them uniformly. If Candidate A gets a retake because of a reported keyboard issue, Candidate B in the same situation should get one too. Inconsistency is what creates legal risk.
If you're looking for guidance on where to set your pass/fail line in the first place, the companion guide on how to set defensible typing test pass/fail thresholds walks through anchoring cutoffs to actual job requirements. Defensible thresholds are the foundation that makes your retake policy credible.
Set the Logistics
Once you've defined eligibility, nail down the practical details:
How many retakes are allowed? One retake is standard for most organizations. Allowing unlimited attempts creates gaming opportunities and administrative headaches. Two total attempts (original plus one retake) strikes a good balance.
What's the waiting period? A 48-to-72-hour cooling-off period between attempts prevents candidates from immediately retaking the test while still emotionally charged. It also gives them time to address any legitimate technical issues.
Which score counts? Most organizations use the higher of the two scores, which feels fair to candidates. Some use the most recent score. Others average them. Pick one approach and stick with it.
Is the retake the same test or different? Using a different text passage of equivalent difficulty prevents memorization effects while keeping scores comparable across attempts. If your platform supports multiple test templates, rotate them.
What's the request deadline? Candidates should submit retake requests within 24 to 48 hours of their original attempt. This keeps the process moving and prevents disputes from surfacing weeks later.
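The logistics above boil down to a handful of parameters, which is worth capturing in one place so every reviewer applies the same rules. Here's a hypothetical policy config and a helper that applies the "which score counts" rule; the field names are illustrative, not a real product setting.

```python
# Illustrative policy parameters -- adjust to your organization's choices.
RETAKE_POLICY = {
    "max_attempts": 2,           # original plus one retake
    "cooling_off_hours": 48,     # minimum wait between attempts
    "score_rule": "highest",     # "highest", "latest", or "average"
    "rotate_passages": True,     # different passage of equal difficulty
    "request_window_hours": 48,  # deadline after the original attempt
}

def resolve_official_score(scores, rule=RETAKE_POLICY["score_rule"]):
    """Apply the documented score rule to a candidate's attempts,
    listed in chronological order."""
    rules = {
        "highest": max,
        "latest": lambda s: s[-1],
        "average": lambda s: sum(s) / len(s),
    }
    return rules[rule](scores)

# Candidate scored 43 WPM, then 58 WPM on the retake:
print(resolve_official_score([43, 58]))  # → 58
```

Whichever rule you pick, encoding it once and reusing it beats deciding case by case.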
Document all of this in writing and share it with candidates before they take the test. Transparency isn't just good ethics. It's good risk management.
For a deeper dive into handling specific retake scenarios, including what to do when technology fails mid-test, see the typing test retake policy and tech issue playbook.
Put It in Writing
Your retake policy should live in a formal document that candidates acknowledge before testing. This can be as simple as a paragraph on the test invitation email or a checkbox on the test landing page. It should include:
The number of attempts allowed
Conditions under which retakes are granted
The waiting period between attempts
How scores are calculated (which attempt counts)
The deadline for requesting a retake
Who to contact with questions
This written acknowledgment protects you if a candidate later claims the process was unfair. You can point to the policy they agreed to before starting.
Creating a Structured Appeals Process for Score Challenges
Retake policies handle the straightforward cases. But what about the candidate who doesn't qualify for a retake and still believes their score is wrong? You need an appeals process: a formal pathway for candidates to challenge their results and receive a considered response.
This doesn't need to be bureaucratic. For most organizations, a simple three-step framework works well.
Step 1: Receive and Acknowledge the Dispute
When a candidate disputes their score, respond within one business day with a templated acknowledgment. Something like:
Thank you for reaching out about your typing assessment results. We take all candidate concerns seriously. We'll review your submission and respond within [X] business days with a determination.
This accomplishes two things. First, it tells the candidate they've been heard, which immediately reduces tension. Second, it buys you time to actually review the data rather than making a snap decision.
Create a simple intake form or email template that captures:
The candidate's name and test ID
The date and time of the assessment
The specific concern (technical issue, scoring question, accommodation request, other)
Any supporting details the candidate wants to provide
Step 2: Review the Evidence
This is where objective data transforms your process from guesswork into something defensible. Pull the candidate's test session and examine:
Keystroke timeline: Are there unusual gaps, bursts, or patterns that suggest a technical interruption?
Security flags: Did the system detect tab switches, paste attempts, or focus loss events? If so, how many and when?
Score breakdown: What were the gross WPM, net WPM, accuracy percentage, and error count? Does the pattern suggest a technical issue or simply a lower skill level?
Comparison data: How does this candidate's session compare to others who took the same test in the same time period? If multiple candidates reported issues, that points to a systemic problem.
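For the comparison step, a simple standardized score is often enough to tell whether a session is an outlier or whether the whole cohort struggled (which would suggest a systemic problem). This is a generic statistical sketch, not a platform feature; it assumes you can export WPM scores for everyone who took the same test in the same period.

```python
from statistics import mean, stdev

def session_z_score(candidate_wpm, cohort_wpms):
    """How many standard deviations a candidate's score sits from the
    cohort of candidates who took the same test in the same period.
    Strongly negative values for one candidate suggest an individual
    issue; a low cohort mean suggests a systemic one."""
    mu, sigma = mean(cohort_wpms), stdev(cohort_wpms)
    return (candidate_wpm - mu) / sigma

# A 30 WPM session against a cohort scoring 40-60 WPM:
print(session_z_score(30, [40, 50, 60]))  # → -2.0
```

If several sessions in the same window come back two or more standard deviations low, that's the point to start checking for a platform-side problem rather than individual retake requests.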
The reviewer should be someone other than the person who initially administered the test. This separation creates objectivity and demonstrates procedural fairness.
Document your findings. Even a brief paragraph summarizing what the data showed is valuable if you ever need to justify the decision later.
Step 3: Communicate the Decision
Respond to the candidate with a clear determination. If you're granting a retake, provide the logistics (new test link, deadline, any conditions). If you're denying the appeal, explain why in specific but professional terms.
A denial might look like:
After reviewing your test session data, we found no evidence of technical interruption during your assessment. Your keystroke patterns were consistent throughout the test duration, and no system anomalies were detected. Based on this review, we're unable to grant a retake. Your score of [X] WPM with [Y]% accuracy stands as your official result.
Notice the language: specific, factual, and respectful. You're not questioning the candidate's honesty. You're stating what the data showed.
For the small number of candidates who remain unsatisfied after the initial appeal, consider offering a secondary review by a different manager or HR representative. This final layer of review is rarely used but demonstrates that your process is thorough.
To maintain fairness across all candidates, it helps to normalize your scoring approach so that different test passages, times of day, or testing conditions don't create unintended score variations.
Communicating Your Policies to Build Candidate Trust
The best dispute resolution process in the world fails if candidates don't know about it. Communication is the thread that ties everything together, and it starts well before anyone sits down to type.
Before the Test
Your test invitation should set clear expectations. Tell candidates:
What the test measures (net WPM, accuracy, or both)
How long it takes
What the passing threshold is (or at minimum, that there is one)
Whether retakes are available and under what conditions
Who to contact if they experience technical issues during the test
What accessibility accommodations are available
This isn't just about being nice. The EEOC's guidance on employment tests makes clear that selection procedures should be administered consistently and that candidates should understand the process. Proactive communication is a core part of compliance.
A sample pre-test message might include:
You'll complete a 3-minute typing assessment. Your score will be based on net words per minute (adjusted for errors) and accuracy percentage. You may request one retake within 48 hours if you experience a verified technical issue. Please test your keyboard and internet connection before beginning.
During the Test
The testing interface itself should reinforce fairness. Clear countdown timers, visible progress indicators, and straightforward instructions reduce confusion and the disputes that come with it. If your platform monitors for tab switches or other security events, consider disclosing that monitoring upfront. Candidates who know they're being observed behave differently than those who don't, and transparency about monitoring reduces post-test challenges about "unfair surveillance."
After the Test
Provide results promptly. Delayed results create anxiety and speculation. When sharing scores, include enough context for the candidate to understand their performance:
Their WPM and accuracy scores
The passing threshold (so they know where they stand)
Next steps (whether they passed, failed, or are being considered)
How to request a retake or file a dispute if applicable
Candidates who receive clear, timely feedback are significantly less likely to dispute their results. Most disputes stem not from actual unfairness but from the feeling of unfairness that comes with opaque processes.
Training Your Team
Finally, make sure everyone involved in your hiring process understands the dispute and retake policies. Recruiters should know how to respond to initial complaints. Hiring managers should understand when to escalate. HR should be prepared to conduct secondary reviews if needed.
Create a simple decision tree:
Candidate reports a technical issue during the test → Grant retake per policy
Candidate requests retake within deadline, qualifying reason → Grant retake per policy
Candidate requests retake, no qualifying reason → Deny, explain policy, offer appeal
Candidate appeals denial → Escalate to secondary reviewer
Secondary review complete → Final determination, documented
This structure removes individual judgment from the equation and replaces it with consistent, documented process.
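The decision tree above is simple enough to express directly in code, which is one way to guarantee every reviewer routes disputes identically. The flags and outcome strings below just mirror the tree; they're illustrative, not a real system.

```python
def route_dispute(*, tech_issue_during_test=False, within_deadline=False,
                  qualifying_reason=False, appealing_denial=False,
                  secondary_review_complete=False):
    """Route a dispute to the same outcome regardless of who reviews it.
    Checks mirror the decision tree, from final stage back to intake."""
    if secondary_review_complete:
        return "Final determination, documented"
    if appealing_denial:
        return "Escalate to secondary reviewer"
    if tech_issue_during_test:
        return "Grant retake per policy"
    if within_deadline and qualifying_reason:
        return "Grant retake per policy"
    return "Deny, explain policy, offer appeal"

# Candidate reported a keyboard issue mid-test:
print(route_dispute(tech_issue_during_test=True))  # → Grant retake per policy
```

Even if you never automate this, writing the tree down this explicitly exposes gaps (for example, a dispute that matches no branch) before a real candidate finds them.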
Handling typing test score disputes doesn't have to be stressful or risky. With a clear retake policy, a structured appeals process, and transparent communication at every stage, you protect your organization while treating candidates with the respect they deserve.
The foundation of all of this is good data. When you can pull up a candidate's keystroke timeline, security monitoring logs, and detailed performance breakdown, disputes become conversations grounded in evidence rather than arguments based on feelings. TypeFlow's plans include the detailed analytics, security monitoring, and candidate performance data that make this kind of objective dispute resolution possible.
Start by documenting your retake policy. Share it with your hiring team. Add it to your test invitation emails. Then, when the inevitable dispute arrives, you'll have a clear, fair, and defensible path forward.
Recommended Reading
How Staffing Agencies Standardize Typing Tests Across Multiple Clients
Staffing agencies juggling different typing tests for different clients waste time and risk bad placements. Here's how to build a standardized assessment framework that scales.
How to Set Up a Pre-Employment Typing Test in Minutes
Set up a professional pre-employment typing test in minutes. This step-by-step guide covers test configuration, sending invitations, reading results, and avoiding common pitfalls.
How to Send Typing Tests to Candidates by Email
A step-by-step guide for recruiters on sending typing tests to candidates by email, from configuring assessments and crafting invitations to tracking results and making data-driven hiring decisions.
How to Set Defensible Typing Test Pass/Fail Thresholds for Hiring
Setting typing test cutoffs shouldn't be guesswork. Learn the data-driven process for building defensible pass/fail thresholds that hold up to legal scrutiny and actually predict job performance.