AI Bias & Explainability Statement

Updated: 29 May 2025

At Cykel, we believe AI should open doors, not close them. Lucy is built to help organizations identify the best talent based on merit, while actively working to minimize bias throughout the hiring process.

Our Approach

Diverse AI Perspectives

Lucy employs multiple frontier LLMs alongside semantic analysis, neural search, and keyword matching. This approach reduces the risk that any single algorithm's limitations or biases dominate the assessment process. By combining different AI perspectives, we create more balanced and less bias-prone candidate evaluations.
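The combination described above can be illustrated as a simple weighted ensemble. This is a hypothetical sketch, not Lucy's actual implementation: the scorer names, weights, and normalization are all assumptions made for illustration.

```python
# Hypothetical sketch: combining scores from several independent
# assessment methods so that no single one dominates the result.
# Scorer names and weights are illustrative only.

def ensemble_score(scores, weights=None):
    """Average normalized scores (0-1) from several independent methods."""
    if not scores:
        raise ValueError("at least one scorer is required")
    weights = weights or {name: 1.0 for name in scores}
    total_weight = sum(weights[name] for name in scores)
    return sum(scores[name] * weights[name] for name in scores) / total_weight

candidate = {
    "llm_review": 0.82,      # frontier-LLM assessment
    "semantic_match": 0.74,  # embedding similarity to the job description
    "keyword_match": 0.60,   # exact skill/keyword overlap
}
print(round(ensemble_score(candidate), 2))  # unweighted mean of the three
```

Because each method scores the candidate independently, a systematic skew in one of them shifts the combined score less than it would on its own.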

Smart Customization

We recognize that every organization has unique hiring needs. Lucy's screening criteria are highly customizable, but this flexibility comes with responsibility. We provide clear guidance and guardrails to help customers avoid configurations that might inadvertently introduce bias.

Human Judgment Remains Central

Lucy is designed to augment human decision-making, not replace it. Recruiters maintain full control and can override any recommendation in real time. We log actions to create accountability and enable continuous learning about when and why human judgment differs from AI suggestions.

How We Test for Fairness

Systematic Evaluation

We test using standardized resume sets where only names and demographic indicators vary while qualifications remain identical. This allows us to measure and address any disparities in Lucy's scoring patterns.
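The paired-resume methodology above can be sketched as follows. This is a minimal illustration, not our test harness: the `score_resume` callable, the resume template, and the toy scorer are all hypothetical stand-ins.

```python
# Illustrative sketch of paired-resume fairness testing: score the
# same qualifications under different names and measure the gap.
# All names and the scoring function here are hypothetical.

def max_score_disparity(score_resume, base_resume, names):
    """Score identical resumes that differ only by name; return the max gap."""
    scores = [score_resume(base_resume.format(name=name)) for name in names]
    return max(scores) - min(scores)

# Toy scorer that (correctly) ignores the name field entirely.
def toy_scorer(resume_text):
    return 0.75  # a fair scorer gives identical scores to identical CVs

resume = "Name: {name}\nSkills: Python, SQL\nExperience: 5 years"
gap = max_score_disparity(toy_scorer, resume, ["Emily", "Lakisha", "Wei"])
print(gap)  # 0.0 means no disparity when only the name differs
```

A nonzero gap flags a scoring pattern worth investigating, since everything except the name was held constant.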

Focus on What Matters

Lucy analyzes text content from CVs, job descriptions, and publicly available professional data. We deliberately do not process images, video, or voice data, eliminating common sources of unconscious bias and focusing solely on job-relevant information.

Transparency & Control

Explainable Decisions

While Lucy uses AI to evaluate candidates across multiple dimensions, we prioritize making these assessments understandable. Lucy provides insights into scoring decisions, helping recruiters understand not just who scored well, but why.

Comprehensive Audit Trails

Key interactions are logged, creating a record of screening and sourcing activity. This transparency supports both compliance requirements and continuous improvement.

Candidate Privacy

We process only professional information relevant to job qualifications. Lucy sees names when provided but focuses assessment on skills, experience, and job fit rather than demographic indicators.

Continuous Improvement

We don't claim Lucy is perfect – no AI system is. Bias in hiring is a complex challenge that requires ongoing attention and refinement. We commit to:

  • Internal testing using industry-standard fairness metrics
  • Transparent communication about our methods and limitations
  • Continuous updates based on new research and real-world outcomes
  • Full compliance with relevant regulations and global standards

Our Commitment

Creating fair AI is not a destination but a journey. We're committed to:

  • Measuring honestly - Bias testing with transparent methodology
  • Improving continuously - Refining our algorithms based on evidence and outcomes
  • Partnering openly - Working with customers to implement fair hiring practices
  • Staying accountable - Maintaining audit trails and welcoming external scrutiny

We encourage all organizations using Lucy to:

  • Regularly review and refine their screening criteria
  • Use AI insights as one input in a holistic hiring process
  • Monitor outcomes across different candidate groups
  • Take advantage of bias-reduction features like name-blind screening

Questions or feedback? We welcome dialogue about AI fairness in hiring. Contact us at hello@cykel.ai.