# ADR-01: Static Analysis-Based Instant Analysis
🇰🇷 Korean Version
| Date | Author | Repos |
|---|---|---|
| 2024-12-17 | @KubrickCode | All |
## Context
### Problem Statement
Existing test management tools have significant adoption barriers:
- High Setup Complexity: CI/CD pipeline integration requires authentication, environment variables, and infrastructure configuration
- Delayed Time-to-Value: Users must complete setup before seeing any results (hours to days)
- Technical Expertise Required: DevOps knowledge prerequisite excludes non-technical stakeholders (PMs, QA managers)
- Test Execution Dependency: Most tools require actual test runs, which need proper environments and passing tests
### Market Landscape
| Competitor | Approach | Entry Barrier | Time-to-First-Value |
|---|---|---|---|
| TestRail | Manual entry | Medium | Minutes (manual) |
| Qase | AI-assisted | Medium | Minutes (manual) |
| Testomat.io | CI/CD integration | High | Hours-Days |
| PractiTest | Enterprise workflows | Very High | Days-Weeks |
### Core Question
How can we deliver immediate value to users while minimizing adoption friction?
## Decision
Adopt static analysis-based instant analysis, with no CI/CD integration required.
Key implementation:
- Users provide only a GitHub repository URL
- System performs AST-based code analysis using Tree-sitter (see the sketch after this list)
- Test inventory is generated without executing any tests
- Results are available within seconds of submission
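A minimal sketch of the analysis step, assuming a Python codebase under analysis. The engine itself uses Tree-sitter parsers per framework; for brevity this sketch substitutes Python's built-in `ast` module to show the same idea, and `extract_test_inventory` is an illustrative name, not the actual API:

```python
import ast
from pathlib import Path

def extract_test_inventory(file_path: Path) -> list[dict]:
    """List test cases declared in one file without executing anything.

    Illustrative stand-in for the Tree-sitter-based parsers: walk the
    syntax tree and record pytest-style test functions.
    """
    tree = ast.parse(file_path.read_text(encoding="utf-8"))
    tests = []
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)) and node.name.startswith("test_"):
            tests.append({"name": node.name, "file": str(file_path), "line": node.lineno})
    return tests
```

Because nothing executes, the output is an inventory of declared tests rather than execution results.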
## Options Considered
### Option A: Static Analysis-Based Instant Analysis (Selected)
`URL input → Code fetch → AST parsing → Result generation`
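A compressed sketch of that flow under the same assumptions as the sketch above (illustrative names, an arbitrary shallow-clone depth of 1, and a pytest-style `test_*.py` glob), reusing `extract_test_inventory` from that sketch:

```python
import subprocess
import tempfile
from pathlib import Path

def analyze_repository(repo_url: str) -> dict:
    """URL input -> code fetch -> AST parsing -> result generation."""
    with tempfile.TemporaryDirectory() as workdir:
        # Code fetch: a shallow clone keeps the download small and fast.
        subprocess.run(["git", "clone", "--depth", "1", repo_url, workdir], check=True)
        # AST parsing: apply the per-file extractor from the previous sketch.
        inventory = []
        for path in Path(workdir).rglob("test_*.py"):
            inventory.extend(extract_test_inventory(path))
        # Result generation: a simple summary payload returned to the user.
        return {"repo": repo_url, "test_count": len(inventory), "tests": inventory}
```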
Pros:
- Zero-friction onboarding (Time-to-Value: seconds)
- No authentication required for public repositories
- No test execution environment needed
- Accessible to non-technical users
- Enables PLG (Product-Led Growth) strategy
- Cost-effective (no compute for test execution)
Cons:
- Cannot detect dynamically generated test cases (see the example after this list)
- No pass/fail execution results
- AST parsing accuracy varies by framework complexity
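For example, a parameterized test is declared once but expands into multiple cases at runtime, which is why static counts are best presented as estimates (a hypothetical pytest snippet):

```python
import pytest

# Static analysis sees a single test function here; at runtime pytest expands
# it into three cases, so a purely structural count undercounts unless the
# parser also inspects the decorator's argument list.
@pytest.mark.parametrize("value", [1, 2, 3])
def test_is_positive(value):
    assert value > 0
```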
### Option B: CI/CD Pipeline Integration Required
`Integrate into CI pipeline → Execute tests → Report results`
Pros:
- Complete test execution data (pass/fail, timing, coverage)
- Full support for dynamic/parameterized tests
- Natural private repository access
- Industry-standard approach
Cons:
- High setup friction (configuration files, secrets, permissions)
- Time-to-Value: hours to days
- Requires DevOps expertise
- Test environment dependencies
- Higher infrastructure costs
### Option C: AI-Based Test Inference
`Use LLM to analyze and infer test structure`
Pros:
- Can handle unconventional patterns
- Natural language descriptions possible
Cons:
- Accuracy uncertainty (false positives/negatives)
- High compute costs
- Latency issues
- Hallucination risks
- Unnecessary here: static analysis already provides sufficient accuracy for a structural test inventory
## Consequences
### Positive
#### Zero-Friction Onboarding
- Time-to-Value: ~5 seconds vs hours/days
- Viral potential: easy sharing and demonstration
#### PLG (Product-Led Growth) Enablement
- "Try before you buy" experience
- Self-service adoption without sales involvement
#### Broad Accessibility
- Non-developers can access test insights
- No DevOps knowledge required
#### Cost Efficiency
- No test execution infrastructure
#### Competitive Differentiation
- Unique position: instant static analysis
- Not competing on CI/CD features
### Negative
#### Dynamic Test Limitations
- Parameterized tests may show incomplete counts
- Data-driven tests not fully captured
- Mitigation: Clear documentation of limitations, "estimated" labels
#### No Execution Results
- Cannot show pass/fail status
- No timing or coverage data
- Mitigation: Position as "test inventory" not "test results"
#### Framework Coverage
- Must implement parser for each framework
- Edge cases in complex test structures
- Mitigation: Prioritize popular frameworks, community contributions
## Technical Implications
| Aspect | Implication |
|---|---|
| Architecture | Core library must be framework-agnostic with pluggable parsers |
| Performance | Shallow clone + parallel parsing for large repos |
| Scalability | Async queue processing for analysis workload |
| Extensibility | Plugin architecture for new framework support |
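A sketch of what the pluggable parser layer and parallel parsing noted above could look like, continuing the Python stand-in from the earlier sketches; `FrameworkParser`, `PytestParser`, and the worker count are illustrative assumptions, not the core library's actual API:

```python
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path
from typing import Protocol

class FrameworkParser(Protocol):
    """Contract that each framework plugin implements."""

    def matches(self, path: Path) -> bool:
        """Return True if this parser should handle the given file."""
        ...

    def parse(self, path: Path) -> list[dict]:
        """Return the statically detected test cases in the file."""
        ...

class PytestParser:
    """Example plugin, reusing the ast-based extractor from the first sketch."""

    def matches(self, path: Path) -> bool:
        return path.name.startswith("test_") and path.suffix == ".py"

    def parse(self, path: Path) -> list[dict]:
        return extract_test_inventory(path)

def parse_repo(root: Path, parsers: list[FrameworkParser], workers: int = 8) -> list[dict]:
    """Fan parsing out over a thread pool to keep large repositories fast."""
    jobs = [
        (parser, path)
        for path in root.rglob("*") if path.is_file()
        for parser in parsers if parser.matches(path)
    ]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(lambda job: job[0].parse(job[1]), jobs)
    return [test for file_tests in results for test in file_tests]
```

Supporting a new framework then means adding another `FrameworkParser` implementation rather than changing the analysis pipeline.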
## References
- Product Overview - Core value proposition
- Core Engine - Parser implementation details
- Architecture - System design
