Integrated Budget Documentation and Execution System (IDECS)
How TestWheel reduced testing effort for an application used by the White House, US Congress, and the Department of Defense



The Integrated Budget Documentation and Execution System, or IDECS, is an IT system used by the US Air Force to develop and compile budgeting information for reporting to governmental entities such as the Office of the President of the United States, the United States Congress, the Secretary of Defense, and the White House Office of Management and Budget. IDECS supports the Planning, Programming, Budgeting, and Execution (PPBE) process (formerly the Planning, Programming, and Budgeting System, PPBS) used in preparing budget submissions such as the President’s Budget (PB) and the Budget Estimate Submission (BES), and it is used to build the Air Force (AF) budget documentation.
Challenges
Lengthy, manual regression cycles
Regression testing was fully manual, using existing test cases housed in Jira. A full regression cycle often spanned 2–3 weeks, monopolizing most of the sprint’s testing window.
Limited coverage due to effort constraints
Only 10–15 regression cases could realistically be executed per sprint. As new features accumulated, the risk of side effects grew, but full-system testing was too time-consuming to be practical.
Tester bottleneck & resource risk
All regression work was handled by a single tester, creating a single point of failure, limiting parallelization, and raising the risk of burnout or human error under tight deadlines.
Maintenance overhead with evolving application
Application changes (UI tweaks, flow updates) required frequent updates to test cases. Some adjustments took as little as 30 minutes, while others extended into a full day of rework per test case.
Lack of formal defect leakage tracking & reporting
Although defects found in production were reviewed via feedback from developers, there was no systematic defect leakage metric tied to regression tests. Test results were documented manually in Jira.
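One common way to close this gap is a simple defect leakage rate: the share of all known defects that escaped testing and surfaced in production. A minimal sketch in Python (the counts are illustrative, not IDECS data):

```python
def defect_leakage_rate(found_in_test: int, found_in_prod: int) -> float:
    """Percentage of all known defects that escaped to production."""
    total = found_in_test + found_in_prod
    if total == 0:
        return 0.0  # no defects recorded at all
    return 100.0 * found_in_prod / total

# Illustrative numbers only: 40 defects caught in regression, 5 from production
print(f"Leakage: {defect_leakage_rate(40, 5):.1f}%")  # -> Leakage: 11.1%
```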
Goals & Requirements
To modernize their regression approach, IDECS sought:
- Significant reduction in regression execution time
- The ability to scale coverage beyond only 10–15 tests
- Maintainability of tests as the application evolves
- Traceable reporting and dashboards for stakeholders
- A low-barrier adoption path (i.e., minimal disruption during the switch)
Solution: Introducing TestWheel at IDECS
- The team began by converting ~12 of their most critical regression tests into TestWheel automation in the first cycle.
- They continued using their Jira-based documentation, but leveraged TestWheel’s interface to map Jira test steps into automated workflows (see the sketch after this list).
- TestWheel’s built-in editing allowed the tester to fine-tune test steps directly in the app, reducing the need for external updates.
- Automated runs produced detailed logs, screen captures, and visual step-by-step outputs, giving clarity to both testers and stakeholders.
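To give a sense of how such a mapping might look in practice, the sketch below pulls a test case from Jira’s REST API and converts each step into an ordered workflow entry. The Jira endpoint is real, but the step layout (one step per description line) and the output structure are assumptions for illustration; TestWheel’s actual import mechanism may differ.

```python
# Hypothetical sketch: pull a Jira test case and convert its steps into a
# generic automation workflow. The Jira REST endpoint is real; the step
# layout and the output format are assumptions for illustration only.
import requests

JIRA_URL = "https://example.atlassian.net"   # placeholder instance
AUTH = ("user@example.com", "api-token")     # placeholder credentials

def fetch_test_steps(issue_key: str) -> list[str]:
    """Fetch a Jira issue and treat each description line as a test step."""
    resp = requests.get(
        f"{JIRA_URL}/rest/api/2/issue/{issue_key}",
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()
    description = resp.json()["fields"]["description"] or ""
    return [line.strip() for line in description.splitlines() if line.strip()]

def to_workflow(issue_key: str) -> dict:
    """Map the Jira steps into a simple ordered-workflow structure."""
    steps = fetch_test_steps(issue_key)
    return {
        "source": issue_key,
        "steps": [{"order": i + 1, "action": s} for i, s in enumerate(steps)],
    }

if __name__ == "__main__":
    print(to_workflow("IDECS-101"))  # hypothetical issue key
```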
Results: Early Outcomes and Benefits
| Metric / Area | Before (Manual) | After TestWheel | Observed Impact |
|---|---|---|---|
| Regression cycle duration | 2–3 weeks | 1–2 days (for the automated subset) | 80–90% faster turnaround |
| Test coverage | 10–15 manual tests | 12 automated + growing | More regression breadth with less effort |
| Maintenance time per case | Up to 1 full day | Minutes | ≈ 80–90% reduction in rework |
| Tester effort allocation | 100% manual execution | Automation + oversight | Frees up tester bandwidth for higher-value testing |
| Reporting / traceability | Manual via Jira notes | Automated dashboards & logs | Clearer visibility, real-time insights |
| Defect leakage tracking | Ad hoc (via dev feedback) | Better alignment with test logs & result history | Ability to tie production issues to test gaps |
In early cycles, the team regained nearly two weeks of sprint time that had previously been consumed by regression execution: a 2–3 week manual cycle is roughly 10–15 working days, so a 1–2 day automated run amounts to the 80–90% reduction shown above. Long term, the automated subset is expanding, and test coverage is steadily increasing without a commensurate manual burden.
Beyond pure numbers, IDECS’s QA leadership reports improved confidence in releases, stronger collaboration with development (faster feedback), and more predictable regression cycles.
Why it Worked
Low-friction adoption
Because IDECS already managed test cases in Jira, the transition to TestWheel didn’t require scrapping existing test artifacts.
Incremental rollout
Automating a dozen high-value regression cases first allowed wins to be demonstrated early, reducing resistance.
Resilient automation
TestWheel’s system-level editing and self-adjusting logic reduced test failures caused by UI updates, minimizing maintenance.
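TestWheel’s internals aren’t documented here, but the general “self-healing locator” idea behind this kind of resilience can be sketched with Selenium: try a prioritized list of locators and fall back gracefully when a UI tweak invalidates the preferred one. This is an illustration of the technique, not TestWheel’s implementation:

```python
# Illustrative sketch of a "self-healing locator" fallback strategy, one
# common way resilient UI automation copes with UI tweaks. Not TestWheel's
# actual implementation.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.common.exceptions import NoSuchElementException

def find_resilient(driver, locators):
    """Try a prioritized list of locators; return the first element found."""
    for by, value in locators:
        try:
            return driver.find_element(by, value)
        except NoSuchElementException:
            continue  # fall through to the next, more generic locator
    raise NoSuchElementException(f"No locator matched: {locators}")

# Usage: if the stable ID is renamed in a UI tweak, the test still finds the
# element via the name attribute or visible text instead of failing outright.
driver = webdriver.Chrome()
submit = find_resilient(driver, [
    (By.ID, "submit-budget"),                 # preferred, most stable (hypothetical IDs)
    (By.NAME, "submitBudget"),                # fallback on attribute
    (By.XPATH, "//button[text()='Submit']"),  # last resort: visible text
])
```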
Clear stakeholder visibility
The screenshot logs, reports, and historical trends made it easier to communicate test status to leadership and product owners.
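Step-level evidence of this kind is cheap to produce in any UI-automation stack. A minimal Selenium sketch of the underlying idea (TestWheel generates its own captures and reports; this is illustrative only):

```python
# Illustrative only: producing step-level screenshot evidence in a generic
# Selenium run. TestWheel has its own logging pipeline; this shows the idea.
import datetime
import os
from selenium import webdriver

def run_step(driver, name: str, action) -> None:
    """Execute one test step, then save a timestamped screenshot as evidence."""
    action()
    os.makedirs("evidence", exist_ok=True)
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    driver.save_screenshot(f"evidence/{stamp}-{name}.png")

driver = webdriver.Chrome()
run_step(driver, "open-login", lambda: driver.get("https://example.mil/login"))
```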
What’s Next? Future Roadmap
Building on the success of the initial automation phase, IDECS plans to gradually expand TestWheel across its full regression suite, prioritizing modules with the highest business impact. The QA team will continue running a combination of manual and automated tests in parallel to maintain coverage while scaling automation safely. As the framework matures, IDECS intends to introduce formal defect leakage tracking to strengthen visibility between test coverage and production outcomes. The team also aims to incorporate API and integration testing within TestWheel to broaden automation scope and ensure end-to-end validation across all critical workflows. Over time, IDECS will periodically review and refine its regression library, retiring low-value tests and optimizing the suite for maintainability and efficiency.
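As a sense of what that API-level coverage could look like, here is a minimal pytest-style check using the requests library. The base URL, endpoint, and payload are hypothetical placeholders, not real IDECS interfaces:

```python
# Hypothetical sketch of the kind of API regression check the roadmap
# describes, written as a pytest test. Endpoint and payload are placeholders.
import requests

BASE_URL = "https://idecs.example.mil/api"  # placeholder base URL

def test_budget_document_roundtrip():
    """Create a draft budget document, then verify it can be retrieved."""
    created = requests.post(
        f"{BASE_URL}/documents",
        json={"title": "FY26 BES Draft", "type": "BES"},
        timeout=30,
    )
    assert created.status_code == 201
    doc_id = created.json()["id"]

    fetched = requests.get(f"{BASE_URL}/documents/{doc_id}", timeout=30)
    assert fetched.status_code == 200
    assert fetched.json()["title"] == "FY26 BES Draft"
```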