Artifact content
Public source: pyavchik.space/requirements.html
This artifact demonstrates requirements testing as a QA activity. The goal is not to execute the application, but to review the specification itself for quality, so that flaws in the requirements are caught before implementation turns them into avoidable defects.
- Type of QA work: static testing / requirements review / testability analysis
- Base document: public SRS-style specification with 91 requirements
- Modules covered: Auth, Game, UI, API, SQL, NFR, Wallet
When testing requirements, I focus on questions such as:
- Is the requirement unambiguous, or can it be interpreted in more than one way?
- Is expected behavior measurable and verifiable?
- Are validation rules, edge cases, and error conditions explicit enough for testing?
- Are browser support, API contracts, and database assumptions defined clearly?
- Does the requirement conflict with another requirement or leave important gaps?
- Can the requirement be traced to one or more meaningful test scenarios?
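Parts of this checklist can even be partially automated. The sketch below is a minimal, illustrative Python helper (not part of the actual review process described here) that flags wording which commonly fails the ambiguity and measurability questions above; the requirement ID and text in the usage example are invented, not taken from the real SRS.

```python
import re

# Hypothetical screen for wording that often signals an ambiguous or
# unverifiable requirement. The term list is illustrative, not exhaustive.
VAGUE_TERMS = re.compile(
    r"\b(fast|quick|easy|user-friendly|appropriate|adequate|"
    r"robust|flexible|as needed|etc\.?|and so on)\b",
    re.IGNORECASE,
)

def review_requirement(req_id: str, text: str) -> list:
    """Return review findings for a single requirement statement."""
    findings = []
    for match in VAGUE_TERMS.finditer(text):
        findings.append(
            f"{req_id}: vague term '{match.group()}' - not measurable as written"
        )
    if "should" in text.lower():
        findings.append(
            f"{req_id}: 'should' is non-binding - mandatory ('shall') or optional?"
        )
    return findings

# Invented example requirement, deliberately written badly:
for finding in review_requirement("AUTH-3", "Login should be fast and user-friendly."):
    print(finding)
```

A script like this only surfaces candidates for discussion; judging whether a flagged requirement is genuinely ambiguous still requires the human review questions above.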
The public requirements.html sample is useful because it already exposes elements that make requirement testing practical: priorities, browser support, API contracts, database-related expectations, and a traceability matrix. That makes it possible to evaluate not only what the product should do, but also how testable and complete the specification is.
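A traceability matrix like the one in the sample can also be checked mechanically for coverage gaps. The following sketch assumes the matrix has been exported as a mapping from requirement IDs to linked test-case IDs; all IDs shown are invented for illustration, not taken from the real matrix.

```python
# Illustrative traceability data: requirement ID -> linked test cases.
# IDs are invented; a real export would come from the actual matrix.
matrix = {
    "AUTH-1": ["TC-01", "TC-02"],
    "GAME-4": [],            # gap: requirement with no test coverage
    "API-2":  ["TC-07"],
}

def coverage_gaps(matrix: dict) -> list:
    """Requirement IDs that no test scenario traces back to."""
    return sorted(req for req, tests in matrix.items() if not tests)

print(coverage_gaps(matrix))  # → ['GAME-4']
```

This answers only the "can it be traced" question; whether the linked tests are meaningful still depends on the qualitative review.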
For QA, this matters because weak requirements create weak testing. Reviewing ambiguity, completeness, measurability, contradictions, and traceability early helps reduce rework, improve test coverage quality, and support better communication with developers and product stakeholders before defects move deeper into the delivery cycle.
This artifact supports the claim that I can engage critically with requirements, not only test finished UI behavior. It is especially relevant for roles that expect tasks to be analyzed for ambiguity, completeness, and measurability.