Search before asking
Feature Description
What: EU AI Act compliance checks mapped to computer vision pipeline operations — risk classification for detection/tracking use cases, structured logging for detection events, confidence thresholds for human review, and adversarial robustness testing.
Why: With the EU AI Act enforcement deadline on August 2, 2026, computer vision systems built with supervision will need to demonstrate compliance. The Act specifically calls out biometric identification and real-time monitoring as high-risk categories, which are common use cases for CV pipelines. With 36K+ stars and Fortune 100 adoption, many enterprise users have EU operations where the Act applies.
Specific areas that map to the Act:
- Art. 9 (Risk Management): Risk classification for detection/tracking use cases (surveillance vs. manufacturing vs. sports analytics have different risk levels)
- Art. 12 (Record-Keeping): Structured logging for detection events, tracking decisions, and zone analytics
- Art. 14 (Human Oversight): Configurable confidence thresholds, human review triggers for high-stakes detections
- Art. 15 (Security): Adversarial robustness testing for detection models, input validation for video streams
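To make the Art. 9 item concrete, here is a minimal sketch of use-case risk classification. The tier names follow the Act's four-level scheme, but the use-case-to-tier mapping is illustrative only (the real classification depends on Annex III and legal review), and none of these names come from supervision's API:

```python
from enum import Enum

class RiskLevel(Enum):
    """EU AI Act risk tiers (the Act defines four)."""
    UNACCEPTABLE = "unacceptable"
    HIGH = "high"
    LIMITED = "limited"
    MINIMAL = "minimal"

# Hypothetical mapping of CV use cases to risk tiers;
# the actual classification requires Annex III analysis.
USE_CASE_RISK = {
    "biometric_identification": RiskLevel.HIGH,
    "real_time_surveillance": RiskLevel.HIGH,
    "manufacturing_qc": RiskLevel.MINIMAL,
    "sports_analytics": RiskLevel.MINIMAL,
}

def classify_use_case(use_case: str) -> RiskLevel:
    # Default unknown use cases to HIGH, so unclassified
    # pipelines fail toward stricter oversight, not weaker.
    return USE_CASE_RISK.get(use_case, RiskLevel.HIGH)
```

Defaulting unknown use cases to HIGH is a deliberate fail-safe choice: a pipeline nobody has classified gets the stricter treatment.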
I ran supervision through AIR Blackbox, an open-source EU AI Act compliance scanner (Apache 2.0). supervision scored 7/44 checks passing (16%), but the documentation quality is exceptional: 98% type hint coverage and 66% docstring coverage, the best I've seen in any project.
You can run it yourself:

```shell
pip install air-blackbox
air-blackbox comply --scan . --no-llm --format table --verbose
```

Everything runs locally; no data leaves your machine.
How (optional): A compliance module that maps supervision's existing detection/tracking/zone analytics to EU AI Act articles, with configurable risk levels per use case and structured audit logging for detection events.
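A rough sketch of what the audit-logging half of such a module could look like, covering Art. 12 (record-keeping) and the Art. 14 human-review trigger together. The record fields (`frame_id`, `class_id`, `bbox_xyxy`) and the `log_detection` helper are hypothetical names for this sketch, not part of supervision's API:

```python
import json
import time
from dataclasses import dataclass, asdict

# Hypothetical audit record; field names are illustrative,
# not supervision API.
@dataclass
class DetectionAuditRecord:
    timestamp: float
    frame_id: int
    class_id: int
    confidence: float
    bbox_xyxy: tuple
    needs_human_review: bool

def log_detection(frame_id, class_id, confidence, bbox_xyxy,
                  review_threshold=0.5, sink=print):
    """Emit one JSON-lines audit record per detection (Art. 12)
    and flag low-confidence detections for review (Art. 14)."""
    record = DetectionAuditRecord(
        timestamp=time.time(),
        frame_id=frame_id,
        class_id=class_id,
        confidence=confidence,
        bbox_xyxy=tuple(bbox_xyxy),
        needs_human_review=confidence < review_threshold,
    )
    sink(json.dumps(asdict(record)))
    return record
```

The `sink` parameter keeps the sketch transport-agnostic: in a real module it could write to a rotating file, a database, or whatever append-only store the deployer uses for audit trails.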
Example Usage
No response
Are you willing to submit a PR?