# OSIPI Standards Compliance
osipy aligns with the Open Science Initiative for Perfusion Imaging (OSIPI) standards for parameter naming, units, and validation.
## What is OSIPI?
OSIPI is an ISMRM initiative that aims to:
- Develop consensus-based standards for perfusion imaging
- Create publicly available software tools
- Share code and data for validation
- Promote reproducibility in perfusion research
## OSIPI Task Forces
OSIPI is organized into task forces focusing on different aspects:
| Task Force | Focus | osipy Relevance |
|---|---|---|
| TF 1.1 | ASL lexicon | Parameter naming |
| TF 1.2 | DSC/DCE lexicon | Parameter naming |
| TF 2.1 | DCE software inventory | Validated against |
| TF 2.3 | ASL software inventory | Validated against |
| TF 2.4 | IVIM/DWI software | Validated against |
| TF 4.1 | DCE DRO | Validation data |
| TF 6.1 | ASL DRO | Validation data |
## CAPLEX Naming Convention

### What is CAPLEX?

CAPLEX (Contrast Agent based Perfusion Lexicon) standardizes parameter names and units across contrast-agent-based perfusion software.
### Standard Parameters
| CAPLEX Name | Description | Units |
|---|---|---|
| Ktrans | Volume transfer constant | min⁻¹ |
| ve | EES volume fraction | mL/100mL |
| vp | Plasma volume fraction | mL/100mL |
| kep | Rate constant (EES to plasma) | min⁻¹ |
| Fp | Plasma flow | mL/min/100mL |
| PS | Permeability-surface area | mL/min/100mL |
| CBV | Cerebral blood volume | mL/100g |
| CBF | Cerebral blood flow | mL/100g/min |
| MTT | Mean transit time | s |
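Several of these parameters are linked: the rate constant kep is Ktrans divided by the EES volume fraction ve. A minimal worked sketch (illustrative values, with ve expressed as a dimensionless fraction rather than mL/100mL):

```python
import numpy as np

# kep = Ktrans / ve links three of the CAPLEX parameters above
ktrans = np.array([0.10, 0.25])   # min^-1
ve = np.array([0.20, 0.50])       # EES volume fraction (dimensionless)

kep = ktrans / ve                 # min^-1
print(kep)                        # [0.5 0.5]
```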
### osipy Parameter Classes

Create a `ParameterMap` with OSIPI-compliant metadata:

```python
import numpy as np

from osipy.common.parameter_map import ParameterMap

# ktrans_array and quality_mask come from a model fit
ktrans_map = ParameterMap(
    name="Ktrans",              # CAPLEX name
    symbol="Ktrans",            # ASCII symbol
    units="1/min",              # standard units
    values=ktrans_array,
    affine=np.eye(4),           # NIfTI affine
    quality_mask=quality_mask,
)
```
## Digital Reference Objects (DROs)

### What are DROs?

DROs are synthetic datasets with known ground-truth parameters:
- Generated from mathematical models
- Include realistic noise and artifacts
- Enable quantitative validation
### OSIPI DCE DRO

osipy is validated against the OSIPI DCE-MRI DRO:

```python
# Validation workflow
import osipy
from osipy.common.validation import load_dro, validate_against_dro

# Load DRO ground-truth parameters
dro_data = load_dro("path/to/dro/")
# dro_data.parameters contains ground truth: {"Ktrans": ..., "ve": ..., "vp": ...}

# Fit with osipy (concentration, aif, time come from your imaging data)
result = osipy.fit_model("extended_tofts", concentration, aif, time)

# Compare computed maps to DRO ground truth
validation = validate_against_dro(
    computed=result.parameter_maps,
    reference=dro_data,
)
print(validation.summary())
```
### Validation Metrics
osipy reports standard validation metrics:
| Metric | Description |
|---|---|
| Bias | Mean(estimated - true) |
| RMSE | Root mean squared error |
| CCC | Concordance correlation coefficient |
| %CV | Coefficient of variation |
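These metrics can be computed directly from paired estimated/true arrays. A small illustrative sketch with made-up values (not osipy's internal implementation):

```python
import numpy as np

true = np.array([0.10, 0.20, 0.30, 0.40])
est = np.array([0.12, 0.19, 0.33, 0.38])

# Bias: mean(estimated - true)
bias = np.mean(est - true)

# Root mean squared error
rmse = np.sqrt(np.mean((est - true) ** 2))

# Lin's concordance correlation coefficient
cov = np.cov(est, true, bias=True)[0, 1]
ccc = 2 * cov / (est.var() + true.var() + (est.mean() - true.mean()) ** 2)

print(f"bias={bias:+.4f}  RMSE={rmse:.4f}  CCC={ccc:.3f}")
```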
## Unit Conventions

### Time Units
OSIPI specifies time units for each context:
| Context | Unit | Example |
|---|---|---|
| Time arrays (all modalities) | seconds | time = np.linspace(0, 300, 60) |
| DCE models (internal) | minutes | Ktrans in min⁻¹ |
| ASL (PLD, τ, T1) | milliseconds | pld=1800.0, label_duration=1800.0 |
| DSC (TE, TR in params) | milliseconds | te=30.0, tr=1500.0 |
osipy converts between these unit conventions automatically.
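A minimal sketch of the conversions implied by the table above (illustrative arithmetic, not osipy's actual code):

```python
import numpy as np

# Time arrays are supplied in seconds (OSIPI convention) ...
time_s = np.linspace(0, 300, 60)

# ... while DCE models work in minutes internally, so Ktrans comes out in min^-1
time_min = time_s / 60.0

# ASL/DSC timing parameters are given in milliseconds; convert to seconds as needed
pld_ms = 1800.0
pld_s = pld_ms / 1000.0

print(time_min[-1], pld_s)   # 5.0 1.8
```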
### Concentration Units
| Technique | Unit |
|---|---|
| DCE (tissue) | mM (millimolar) |
| DSC (ΔR2*) | s⁻¹ |
| ASL (ΔM) | arbitrary units |
## Quality Control Standards

### R² Threshold

osipy uses R² = 0.5 as the default quality threshold, following OSIPI recommendations for excluding unreliable fits.
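The thresholding rule amounts to a simple comparison per voxel. A sketch of how such a quality mask could be derived (illustrative, not osipy's internal code):

```python
import numpy as np

# Voxels with R^2 below the default threshold of 0.5 are excluded
R2_THRESHOLD = 0.5

r_squared = np.array([0.92, 0.40, 0.75, 0.10])
quality_mask = r_squared >= R2_THRESHOLD
print(quality_mask)   # [ True False  True False]
```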
### Quality Mask Convention

Each `ParameterMap` carries a boolean `quality_mask` identifying voxels that passed quality control; downstream statistics should use only masked-in values.
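Assuming the convention that `True` marks a reliable voxel, filtering is plain boolean indexing (illustrative values):

```python
import numpy as np

# Keep only voxels that passed QC (True in the mask)
ktrans_values = np.array([0.10, 0.85, 0.22, 0.05])
quality_mask = np.array([True, False, True, True])

reliable = ktrans_values[quality_mask]
print(reliable)          # [0.1  0.22 0.05]
print(reliable.mean())
```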
## BIDS Compliance

### BIDS Derivatives

osipy outputs are organized following the BIDS derivatives specification.
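As a sketch of what such a layout might look like (file and folder names are illustrative, not a guaranteed osipy layout):

```text
derivatives/
└── osipy/
    ├── dataset_description.json
    └── sub-01/
        └── perf/
            ├── sub-01_Ktrans.nii.gz
            ├── sub-01_Ktrans.json
            ├── sub-01_ve.nii.gz
            └── sub-01_ve.json
```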
### Provenance JSON

Each parameter map is accompanied by a JSON sidecar recording provenance metadata.
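An illustrative sidecar might look like the following; apart from `GeneratedBy` (a standard BIDS derivatives field), the field names here are assumptions, not osipy's exact schema:

```json
{
  "GeneratedBy": [
    {"Name": "osipy", "Version": "x.y.z"}
  ],
  "ModelName": "extended_tofts",
  "Units": "1/min",
  "QualityThresholdR2": 0.5
}
```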
## Interoperability

### Compatible with OSIPI Software
Results from osipy can be compared with:
- ROCKETSHIP (MATLAB)
- TOPPCAT (Python)
- NordicICE (commercial)
- Olea Sphere (commercial)
### Data Exchange
Standard file formats ensure interoperability:
| Format | Use |
|---|---|
| NIfTI | Parameter maps |
| JSON | Metadata |
| BIDS | Dataset organization |
## OSIPI CodeCollection Compliance

osipy is validated against the OSIPI DCE-DSC-MRI CodeCollection, the community benchmark for DCE/DSC-MRI implementations. Reference test data and tolerances are committed in `tests/fixtures/osipi_codecollection/` for automated cross-implementation testing.
### OSIPI Tolerances
Validation uses OSIPI-agreed tolerances per parameter:
| Parameter | Absolute Tolerance | Relative Tolerance |
|---|---|---|
| Ktrans | 0.005 | 0.1 |
| ve | 0.05 | 0.0 |
| vp | 0.025 | 0.0 |
| Fp | 5.0 | 0.1 |
| PS | 0.005 | 0.1 |
| delay | 1.0 s | 0.0 |
| CBF (DSC) | 15.0 | 0.1 |
These tolerances are defined in `osipy.common.validation.comparison.DEFAULT_TOLERANCES`.
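A sketch of how combined absolute/relative tolerances can be applied per voxel. The pass rule shown (mirroring `numpy.isclose`: `|est - true| <= atol + rtol * |true|`) is an assumption about how osipy combines the two columns:

```python
import numpy as np

# Ktrans tolerances from the table above
atol, rtol = 0.005, 0.1

true_ktrans = np.array([0.10, 0.30])
est_ktrans = np.array([0.104, 0.36])

# Combined absolute + relative pass criterion per voxel
passed = np.abs(est_ktrans - true_ktrans) <= atol + rtol * np.abs(true_ktrans)
print(passed)   # [ True False]
```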
### Validation Reports

`validate_against_dro()` returns a `ValidationReport` dataclass with comparison metrics:

```python
from osipy.common.validation import validate_against_dro

# Compare computed results against DRO ground truth
report = validate_against_dro(
    computed=result.parameter_maps,
    reference=dro_data,
)

# Inspect results
print(f"Overall pass: {report.overall_pass}")
for param, rate in report.pass_rate.items():
    print(f"  {param} pass rate: {rate:.1%}")
```
### Export as JSON

`ValidationReport` supports structured export for CI/CD integration and record-keeping.
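Because `ValidationReport` is a dataclass, it serializes to JSON via the standard library. The sketch below uses a minimal stand-in dataclass with illustrative fields (not osipy's exact API):

```python
import json
from dataclasses import asdict, dataclass


# Minimal stand-in for ValidationReport (illustrative fields only)
@dataclass
class Report:
    overall_pass: bool
    pass_rate: dict


report = Report(overall_pass=True, pass_rate={"Ktrans": 0.97, "ve": 0.93})

# Any dataclass converts to a plain dict, which dumps cleanly to JSON
payload = json.dumps(asdict(report), indent=2)
print(payload)
```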
## OSIPI Resources

### Useful Links
- OSIPI Website
- CAPLEX Lexicon
- DCE-MRI DRO (OSF)
- ASL DRO (OSF) — link pending verification
- OSIPI GitHub
### Task Force Reports
- TF 1.2: DCE/DSC Lexicon (Dickie et al.)
- TF 2.1: DCE Software Inventory (van Houdt et al.)
- TF 4.1: DCE DRO Description (Shalom et al.)
## Contributing to OSIPI
osipy contributes to the OSIPI ecosystem through:
- Code contributions: Sharing implementations
- Validation data: Testing against DROs
- Documentation: Standards compliance docs
- Benchmarks: Performance comparisons