User Acceptance Testing, Go-Live Readiness
CPA Information Systems and Controls (ISC) › User Acceptance Testing, Go-Live Readiness
A financial services firm is at the end of the SDLC (deployment readiness) for a loan origination platform developed under Scrum with a separate release management gate. UAT was executed by loan officers using production-like data; all critical paths passed, but testers discovered that the system allows a loan to be submitted without attaching a required income verification document if the user saves the application as a draft first. The team is considering go-live with this defect logged as “medium” because it occurs only in a specific sequence. What potential risk identified during testing requires immediate attention?
A. A vendor lock-in risk, because the platform uses proprietary document storage formats.
B. A performance risk, because saving drafts could increase database storage and slow down reporting.
C. A user adoption risk, because loan officers may prefer the legacy system if a draft feature exists in the new platform.
D. A compliance and control failure risk, because the workflow can bypass a required documentation control and may lead to noncompliant loan processing.
Explanation
This question assesses risk identification during UAT at the deployment-readiness stage of a Scrum SDLC, specifically compliance risks created by workflow bypasses in regulated environments. The key facts: the defect allows a loan to be submitted without the required income verification document when the application is first saved as a draft, and the team has classified it as medium severity because it occurs only in a specific sequence. Option D (compliance and control failure risk) is correct: the workflow can bypass a required documentation control, which may lead to noncompliant loan processing, so the defect must be remediated before go-live regardless of its severity label. Option A is incorrect because vendor lock-in is unrelated to the defect; option B is incorrect because performance is not the primary concern here; option C is incorrect because it misclassifies a control gap as a user adoption issue. In analogous cases, conduct risk assessments during UAT that prioritize compliance over convenience, and implement release gates that require remediation of control failures before go-live.
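The release-gate idea can be sketched in a few lines. A minimal illustration in Python, assuming hypothetical `Defect` records with a `bypasses_control` flag (the names and fields are invented for this sketch, not from any real tracker): the gate blocks go-live whenever any open defect circumvents a required control, regardless of its assigned severity label.

```python
from dataclasses import dataclass

@dataclass
class Defect:
    defect_id: str
    severity: str           # e.g. "low", "medium", "high"
    bypasses_control: bool  # does the defect let users skip a required control?

def go_live_approved(open_defects: list[Defect]) -> bool:
    """Block the release if any open defect bypasses a required control,
    no matter how its severity was labeled during triage."""
    return not any(d.bypasses_control for d in open_defects)

# The scenario's defect: "medium" severity, but it skips the
# income-verification control via the draft-save sequence.
defects = [Defect("D-101", "medium", bypasses_control=True)]
print(go_live_approved(defects))  # → False: the severity label does not matter
```

The point of the sketch is that the gate keys on control impact, not on the triage label the team assigned.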
A healthcare provider is completing the deployment phase of a new patient scheduling system built using a hybrid SDLC (waterfall requirements and design, Agile configuration). UAT included scripted tests and a small set of exploratory tests by clinic staff; most defects were minor, but users reported that appointment reminders sometimes display the wrong clinic address for patients who recently changed primary location. The go-live decision is scheduled for tomorrow, and leadership is focused on meeting the regulatory reporting timeline. Which criteria should be prioritized to determine go-live readiness?
A. Whether the defect backlog is below a numeric threshold set at project kickoff, even if high-severity issues remain open.
B. Whether all test cases were executed, regardless of severity, to demonstrate complete UAT coverage.
C. Whether critical business workflows and patient-facing communications are accurate, with high-severity defects remediated and re-tested, and sign-off obtained from process owners.
D. Whether performance testing results meet targets, since functional defects can be corrected after go-live without impacting operations.
Explanation
This question examines go-live readiness criteria in a hybrid SDLC, emphasizing critical workflows and defect remediation over timelines and non-essential metrics. The key facts: appointment reminders display the wrong clinic address for patients who recently changed primary location, a high-impact defect in patient-facing communications in a healthcare context under regulatory pressure, even though most other defects are minor. Option C (accuracy of critical workflows and patient-facing communications, with high-severity defects remediated, re-tested, and signed off) is correct: it protects patient safety and compliance before go-live. Option A is incorrect because a numeric defect threshold overlooks unresolved high-severity issues; option B is incorrect because executing every test case regardless of severity ignores risk-based testing principles; option D is incorrect because it prioritizes performance over functional accuracy, and critical defects must be resolved before deployment. For similar situations, use a readiness checklist that weights business impact and stakeholder sign-off, and establish clear UAT exit criteria to guide go-live decisions objectively.
A public sector agency is at the end of the SDLC for an online permitting portal using a waterfall methodology with a formal acceptance phase. UAT included accessibility checks by business users, and testers reported that certain form error messages are not read by screen readers, preventing visually impaired users from completing submissions. The agency’s go-live is scheduled to align with a public communication campaign. What potential risk identified during testing requires immediate attention?
A. A network capacity risk, because assistive technology increases bandwidth consumption during peak usage.
B. A data conversion risk, because screen readers may alter form field values during submission.
C. A reputational risk only, since accessibility issues do not affect the underlying permit workflow logic.
D. A compliance and service delivery risk, because accessibility failures can prevent equal access and may violate applicable accessibility requirements.
Explanation
This question assesses risk prioritization during UAT in a waterfall SDLC, focusing on accessibility compliance in public-facing systems. The key facts: form error messages are not read by screen readers, preventing visually impaired users from completing submissions, and go-live is tied to a public communication campaign. Option D (compliance and service delivery risk) is correct: accessibility failures can deny equal access and may violate applicable accessibility requirements, so remediation cannot wait. Option A is incorrect because assistive technology does not create a meaningful network capacity issue; option B is incorrect because screen readers do not alter form field values, so data conversion is not at issue; option C is incorrect because framing the issue as reputational only downplays the agency's legal obligations. In like situations, make accessibility testing a mandatory part of UAT and apply a compliance checklist to flag and remediate barriers before launch.
A university is in the final SDLC stage for a new student information system built using Agile with incremental releases. UAT was conducted by registrars using a test script library, but several departments reported they could not validate end-to-end scenarios because test data did not include students with multiple concurrent program enrollments. The team is considering go-live based on successful execution of available test scripts. Based on the user acceptance testing results, what is the most appropriate action?
A. Close UAT and rely on unit testing evidence, since unit tests provide better coverage than user-driven scenarios.
B. Expand UAT to include representative test data and execute end-to-end scenarios for complex enrollment cases before final go-live approval.
C. Proceed with go-live because missing test data is an expected UAT limitation and can be addressed through post–go-live monitoring.
D. Replace UAT with a penetration test to confirm the system is secure enough for production use.
Explanation
This question tests the importance of representative test data in UAT for an Agile SDLC, ensuring end-to-end validation before go-live. The key facts: the test data lacks students with multiple concurrent program enrollments, so several departments could not validate end-to-end scenarios even though all available scripts passed. Option B is correct: expand UAT with representative data and execute the complex enrollment scenarios before final go-live approval, reducing deployment risk through thorough acceptance. Option A is incorrect because unit tests do not substitute for user-driven, end-to-end validation; option C is incorrect because treating the data gap as an expected limitation and relying on post–go-live monitoring ignores the need for realistic test data; option D is incorrect because penetration testing addresses security, not functional completeness. In comparable scenarios, define the UAT scope to include diverse, representative data sets, and require full scenario coverage before readiness approval.
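The data-coverage gap can be made concrete with a small pre-UAT check. A sketch in Python, assuming hypothetical enrollment records as (student, program) pairs (invented for illustration): before end-to-end scenarios are signed off, confirm the test data actually contains students with multiple concurrent enrollments.

```python
from collections import Counter

def lacks_multi_enrollment_cases(enrollments: list[tuple[str, str]]) -> bool:
    """Return True when no student in the UAT data set has more than one
    concurrent program enrollment -- a coverage gap for end-to-end scenarios."""
    per_student = Counter(student for student, _program in enrollments)
    return all(count < 2 for count in per_student.values())

# A data set like the scenario's: every student has exactly one enrollment.
test_data = [("S1", "BSc Biology"), ("S2", "BA History")]
print(lacks_multi_enrollment_cases(test_data))  # → True: expand the data set first
```

A check like this can run automatically when the UAT environment is refreshed, so the gap is caught before testers start, not after.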
A subscription software company is preparing a major billing platform release and is in the final release readiness review under an Agile methodology with a defined “definition of done.” UAT passed for standard invoicing, but a defect remains open where refunds processed on the last day of the month are posted to the next accounting period in certain time zones. Finance indicates this could affect period-end close and revenue reporting. Based on the UAT results, what is the most appropriate action?
A. Proceed with go-live if performance and security testing are complete, since UAT is not intended to validate accounting period logic.
B. Go live as planned and instruct finance to manually reclassify the entries at month-end until the next release.
C. Defer go-live until the defect is fixed and re-tested in UAT, because it impacts financial reporting accuracy and period-end controls.
D. Proceed with go-live and accept the risk because the issue occurs only once per month and affects a limited set of transactions.
Explanation
This question evaluates how to handle a financial accuracy defect found in UAT under an Agile SDLC: defer go-live or accept a workaround. The key facts: refunds processed on the last day of the month post to the next accounting period in certain time zones, and finance warns this could affect period-end close and revenue reporting. Option C is correct: defer go-live until the defect is fixed and re-tested in UAT, because it impairs financial reporting accuracy and period-end controls. Option A is incorrect because UAT is precisely where business logic such as accounting period assignment should be validated; option B is incorrect because manual reclassification is an error-prone compensating workaround that weakens control reliability; option D is incorrect because accepting a recurring monthly misstatement risk violates financial accuracy standards. For analogous defects, prioritize financial controls in readiness assessments and use regression testing to confirm the fix without introducing new issues.
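The time-zone mechanism behind this defect is easy to reproduce. A sketch in Python (the period-assignment functions are hypothetical, not the platform's actual code): if the system derives the accounting period from the UTC timestamp, a refund processed late on the last day of the month in a western time zone lands in the next period.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def posting_period_utc(ts: datetime) -> str:
    """Assign the accounting period from the UTC timestamp --
    the buggy behavior described in the scenario."""
    return ts.astimezone(timezone.utc).strftime("%Y-%m")

def posting_period_local(ts: datetime) -> str:
    """Assign the period from the transaction's local business date instead."""
    return ts.strftime("%Y-%m")

# A refund processed at 8 p.m. on Jan 31 in Los Angeles (UTC-8)
# is 4 a.m. on Feb 1 in UTC.
refund = datetime(2024, 1, 31, 20, 0, tzinfo=ZoneInfo("America/Los_Angeles"))
print(posting_period_utc(refund))    # → 2024-02 (wrong period)
print(posting_period_local(refund))  # → 2024-01
```

A regression test pinning transactions near each month boundary in every supported time zone would catch a reintroduction of this defect.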
A bank is completing the SDLC for a mobile app update using Agile delivery with a formal production change advisory board (CAB) approval. UAT was executed by a pilot group and showed intermittent login failures after password resets; the issue is not consistently reproducible, and the vendor suggests monitoring logs in production to gather more data. The release includes new authentication flows and is scheduled for a marketing launch. How should the IT team address this issue before going live?
A. Replace UAT with code review evidence, since authentication issues are primarily developer concerns rather than user acceptance concerns.
B. Proceed with go-live and rely on production monitoring, since intermittent issues are best diagnosed under real user load.
C. Move the issue to a future backlog item because UAT cannot reliably reproduce it and the marketing date is fixed.
D. Treat the issue as a potential critical defect, expand UAT with targeted test cases and logging in a staging environment, and require resolution or a validated workaround before CAB approval.
Explanation
This question tests how to address an intermittent defect found in UAT under an Agile SDLC with CAB approval. The key facts: login failures occur intermittently after password resets, the issue is not consistently reproducible, the release introduces new authentication flows, and the vendor suggests gathering data through production monitoring. Option D is correct: treat the issue as a potential critical defect, expand UAT with targeted test cases and enhanced logging in a staging environment, and require resolution or a validated workaround before CAB approval, minimizing security and availability risk. Option A is incorrect because code reviews do not replace user acceptance testing; option B is incorrect because diagnosing in production exposes real users to authentication failures and skips pre-deployment validation; option C is incorrect because deferring the issue to a future backlog ignores its potential impact and weakens the release gate. For similar intermittent issues, enhance logging and targeted testing, and establish reproducibility criteria before CAB approval.
A nonprofit is in the final SDLC stage for a donor management system implemented with a vendor using a waterfall approach and formal acceptance sign-off. UAT scripts were executed, but only by the fundraising team; finance and data privacy stakeholders did not participate. The system includes automated tax receipt generation and donor communication preferences. Which criteria should be prioritized to determine go-live readiness?
A. Stakeholder acceptance across impacted business functions, including validation of tax receipt accuracy and communication preference controls, with documented UAT sign-off.
B. Whether all minor UI enhancements requested during UAT have been implemented before go-live.
C. Whether the number of UAT participants meets the minimum headcount target, regardless of their functional roles.
D. Whether the vendor’s implementation checklist is complete, since vendor certification replaces internal UAT participation.
Explanation
This question examines go-live readiness in a waterfall SDLC, emphasizing broad stakeholder involvement in UAT. The key facts: only the fundraising team executed UAT scripts, while finance and data privacy stakeholders did not participate, even though the system automates tax receipt generation and donor communication preferences. Option A is correct: readiness should be judged by acceptance across all impacted business functions, including validation of tax receipt accuracy and communication preference controls, with documented UAT sign-off. Option B is incorrect because minor UI enhancements are secondary to core functions; option C is incorrect because a headcount target ignores whether the right functional roles participated; option D is incorrect because vendor certification does not substitute for internal validation and accountability. For comparable projects, mandate cross-functional stakeholder sign-off in the readiness review and use a RACI matrix to define UAT participants.
A logistics company is completing a transportation management system and is in the UAT and go-live readiness checkpoint under a DevOps-oriented SDLC with automated deployments and manual approval gates. UAT results show all functional scenarios passed, but users report that the new route-planning screen is confusing and increases time-to-complete by about 20% compared with the legacy system. Leadership wants to proceed to meet a contract start date. Which criteria should be prioritized to determine go-live readiness?
A. Whether system documentation is fully updated, since UAT results already confirm the system is acceptable.
B. Whether the deployment pipeline can complete in under 10 minutes, since rapid releases offset any usability concerns.
C. User experience and operational efficiency for key workflows, ensuring the new process does not materially degrade service levels at go-live.
D. Whether the number of UAT defects is lower than the prior project, regardless of impact on user productivity.
Explanation
This question examines go-live readiness in a DevOps-oriented SDLC, prioritizing user experience and operational efficiency over schedule pressure and non-functional metrics. The key facts: all functional scenarios passed, but the new route-planning screen is confusing and increases time-to-complete by about 20% versus the legacy system, and leadership wants to proceed to meet a contract start date. Option C is correct: user experience and operational efficiency for key workflows should be prioritized so the new process does not materially degrade service levels at go-live. Option A is incorrect because updated documentation does not address the efficiency loss; option B is incorrect because deployment speed does not offset usability problems; option D is incorrect because comparing defect counts to a prior project ignores the impact on user productivity. For similar cases, incorporate usability metrics into readiness criteria and use feedback loops to improve the user experience before deployment.
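A usability metric like this can be wired into the readiness check itself. A minimal sketch, assuming a hypothetical degradation threshold agreed with the business (the 10% figure is illustrative, not from the scenario):

```python
def usability_regression(legacy_seconds: float, new_seconds: float,
                         threshold: float = 0.10) -> bool:
    """Flag the release when time-to-complete for a key workflow grows
    more than the agreed threshold relative to the legacy system."""
    return (new_seconds - legacy_seconds) / legacy_seconds > threshold

# Scenario: route planning takes about 20% longer than the legacy system.
print(usability_regression(legacy_seconds=100, new_seconds=120))  # → True: hold the release
```

Encoding the threshold as an explicit gate keeps the go/no-go decision objective even under schedule pressure.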
A utilities company is preparing to deploy a customer outage notification system and is at the UAT completion gate in a phased rollout SDLC. UAT uncovered that customers who opt out of SMS still receive messages if they previously opted in and then changed preferences within 24 hours of an outage event. The team considers this a low impact defect because it affects a narrow timing window. What potential risk identified during testing requires immediate attention?
A. A scalability risk, because opt-out logic increases CPU usage during peak outages.
B. A training risk, because customers may not understand how to opt out properly.
C. A data retention risk, because SMS logs may be stored longer than email logs.
D. A privacy and consent management risk, because messaging users who opted out can violate customer preferences and applicable communication rules.
Explanation
This question assesses risk identification during UAT in a phased rollout SDLC, focusing on privacy and consent compliance in customer notifications. The key facts: customers who opted in and then opted out within 24 hours of an outage event still receive SMS messages, and the team has labeled the defect low impact because of the narrow timing window. Option D is correct: messaging customers who have opted out violates their recorded preferences and may breach applicable communication rules, so consent enforcement must be fixed regardless of how narrow the window is. Option A is incorrect because the defect is a consent-logic failure, not a CPU or scalability issue; option B is incorrect because training does not fix a systemic preference-handling defect; option C is incorrect because log retention is unrelated to the defect. In similar privacy defects, prioritize remediation under applicable data protection rules and build consent scenarios into UAT scripts.
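Correct consent handling is simple to state in code, which also makes it simple to script in UAT. A minimal sketch, assuming a hypothetical preference-change log of (timestamp, opted_in) pairs: the most recent preference recorded at or before the send time must win, even if it changed shortly before the outage event.

```python
from datetime import datetime

def may_send_sms(preference_log: list[tuple[datetime, bool]],
                 send_time: datetime) -> bool:
    """Honor the most recent preference recorded at or before the send time,
    rather than any earlier opt-in."""
    prior = [opted_in for ts, opted_in in sorted(preference_log) if ts <= send_time]
    return prior[-1] if prior else False  # no recorded consent means no message

log = [
    (datetime(2024, 5, 1, 9, 0), True),    # customer opts in
    (datetime(2024, 5, 1, 18, 0), False),  # opts out hours before the outage
]
print(may_send_sms(log, datetime(2024, 5, 2, 2, 0)))  # → False
```

A UAT script built from this rule would include the scenario's exact sequence: opt in, opt out within 24 hours, then trigger an outage notification and assert no SMS is sent.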
A global distributor is at the end of the SDLC for a warehouse scanning solution developed using Agile and integrated with legacy systems. UAT results are mixed: scanning and receiving workflows passed, but cycle count adjustments occasionally post to the wrong GL account due to a mapping table default. The warehouse team is ready, but finance has not signed off. Based on the user acceptance testing results, what is the most appropriate action?
A. Close UAT and move directly to disaster recovery testing, since accounting mapping is better validated during recovery exercises.
B. Defer go-live until the GL mapping defect is corrected and finance completes UAT validation and provides formal sign-off.
C. Proceed with go-live because warehouse operations are the primary users and finance can reconcile discrepancies after go-live.
D. Proceed with go-live if the vendor confirms the issue is limited to test data and not present in production.
Explanation
This question tests UAT sign-off requirements in an Agile SDLC with legacy integrations, ensuring all stakeholders validate before go-live. The key facts: warehouse scanning and receiving workflows passed, but cycle count adjustments occasionally post to the wrong GL account because of a mapping table default, and finance has not signed off. Option B is correct: defer go-live until the GL mapping defect is corrected and finance completes UAT validation with formal sign-off, preserving financial accuracy. Option A is incorrect because disaster recovery testing does not validate accounting mappings or functional defects; option C is incorrect because post-go-live reconciliation is inefficient and leaves the control gap open; option D is incorrect because vendor assurances about test data do not replace finance's own validation. For integrated systems, require end-to-end UAT with stakeholder approval and use defect impact analysis to guide deferral decisions.