Hexis · Fundamental Rights Impact Assessment
FRIA — Article 27
Conduct a structured fundamental rights impact assessment as required by Article 27 of the EU AI Act (Regulation (EU) 2024/1689) before deploying high-risk AI systems.
Step 0 of 5
Is a FRIA required for your AI system?
Article 27 requires deployers of certain high-risk AI systems to conduct a fundamental rights impact assessment before use.
Please select a deployer type.
Not yet classified? Use the Risk Classifier →
Please select a risk classification.
← Back to Generator
Step 1 of 5
Describe the AI system under assessment
Article 27(1)(a) — Identify the AI system, its purpose, provider, and deployer information.
System name is required (min 3 characters).
Please describe the intended purpose (min 40 characters).
Please select a sector.
Please select a decision influence level.
Provider name is required.
Deployer name is required.
Responsible function is required.
Please provide a summary of provider instructions.
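The Step 1 validation messages above imply a small set of field rules (minimum lengths, required selections). A minimal sketch of such a check is below; the field names are assumptions, while the length thresholds (3 and 40) come directly from the messages in this step. This is illustrative, not the tool's actual implementation.

```python
# Illustrative validation for the Step 1 fields described above.
# Field names are assumed; thresholds mirror the validation messages.

REQUIRED_FIELDS = [
    "sector", "decision_influence", "provider_name",
    "deployer_name", "responsible_function", "provider_instructions",
]

def validate_step1(form: dict) -> list[str]:
    """Return a list of validation messages; an empty list means the step passes."""
    errors = []
    if len(form.get("system_name", "").strip()) < 3:
        errors.append("System name is required (min 3 characters).")
    if len(form.get("intended_purpose", "").strip()) < 40:
        errors.append("Please describe the intended purpose (min 40 characters).")
    for field in REQUIRED_FIELDS:
        if not form.get(field, "").strip():
            errors.append(f"Please provide {field.replace('_', ' ')}.")
    return errors
```

An empty form fails every rule; a form with a name of at least 3 characters, a purpose of at least 40, and all required selections passes.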
Step 2 of 5
How and how often will the system be used?
Article 27(1)(b) — Describe the deployment context, frequency of use, and human oversight arrangements.
Please select a deployment stage.
Leave blank if ongoing
Please select frequency of use.
Please select human-in-the-loop level.
Fully automated systems require heightened scrutiny under Article 14 of the AI Act and Article 22 GDPR.
Please select output type.
Please describe the primary data sources.
Step 3 of 5
Who is affected by this AI system?
Article 27(1)(c) + 27(3) — Identify directly and indirectly affected groups, including vulnerable populations.
Please select at least one directly affected group.
Please estimate the number of affected people.
Please indicate whether vulnerable groups are present.
Please describe the vulnerability.
Please select geographic scope.
Step 4 of 5
Identify risks and assess impact on fundamental rights
Article 27(1)(d) — Document harm risks and map their impact on fundamental rights guaranteed by the EU Charter.
ID | Risk Type | Risk Scenario | Likelihood | Severity | Existing Controls | Residual
At least one risk entry is required.
Fundamental Right | Impact | Linked Risks | Severity | Mitigation Summary
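The risk-register columns above suggest one record per identified risk. A minimal sketch follows, assuming a 1–5 likelihood/severity scale and a residual score derived from likelihood × severity reduced by existing controls; both the scale and the scoring formula are assumptions for illustration, not requirements of Article 27.

```python
from dataclasses import dataclass

@dataclass
class RiskEntry:
    """One row of the Step 4 risk register (columns as in the table above)."""
    risk_id: str           # e.g. "R1"
    risk_type: str         # e.g. "Discrimination"
    scenario: str          # free-text harm scenario
    likelihood: int        # assumed 1-5 scale
    severity: int          # assumed 1-5 scale
    existing_controls: str # free-text description of current controls
    control_effect: int    # assumed score reduction attributed to controls

    def residual(self) -> int:
        """Illustrative residual score: likelihood x severity minus control effect,
        floored at 1 because a documented risk never scores zero."""
        return max(1, self.likelihood * self.severity - self.control_effect)
```

Keeping the residual score derived (rather than hand-entered) forces the register to stay internally consistent when likelihood, severity, or controls are revised at review time.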
Step 5 of 5
Define safeguards and governance measures
Articles 27(1)(e)-(f), 27(2), 27(3), 27(4) — Human oversight, incident response, complaint mechanisms, review schedule, and sign-off.
5A — Human Oversight (Art. 27(1)(e))
Please describe human oversight implementation.
Please describe the escalation path.
Please indicate whether operators have been trained.
Please provide training details.
5B — Incident Response (Art. 27(1)(f))
Please describe incident response procedures.
Please indicate rollback capability.
5C — Complaint Mechanism (Art. 27(1)(f))
Please select a complaint channel.
Please specify maximum response time.
Please describe the appeal process.
Please describe logging and audit trail.
5D — Review Schedule (Art. 27(2))
Please select review frequency.
Please select at least one change trigger.
5E — DPIA Linkage (Art. 27(4))
Please select DPIA status.
5F — Authority Notification (Art. 27(3))
Please select notification status.
Please provide the rationale.
5G — Sign-off
Prepared by is required.
Date is required.
← Back to Checklist

What Is a Fundamental Rights Impact Assessment (FRIA)?

Article 27 of the EU AI Act requires organisations that deploy certain high-risk AI systems to conduct a fundamental rights impact assessment before use. The assessment documents which people the system affects, which fundamental rights it may put at risk, and what measures have been taken to mitigate those risks.

Who Must Conduct a FRIA?

The obligation under Article 27 applies to two separate categories of deployers: bodies governed by public law (or private entities providing public services), and deployers of the high-risk AI systems referred to in points 5(b) and (c) of Annex III (creditworthiness assessment and credit scoring, and risk assessment and pricing for life and health insurance).

How Do I Use This Tool?

The FRIA template presents the assessment steps required by Article 27 as a structured form. Document your system's scope, the affected groups, and the safeguards in place. The assessment must be completed before first use and updated whenever circumstances change.

If you have not yet determined your risk level, use the Risk Classifier tool first.

To track your general compliance obligations, see the EU AI Act Checklist page.