About the Project

Turning a Complex Decision Into a Fair, Transparent Process

Built in partnership with AAU, this system was commissioned by the Iraqi Ministry of Higher Education to manage one of its most consequential HR processes: the study break.

When a ministry employee wants to pursue a master's or PhD degree, they need official approval for a period of leave from their position. The decision isn't arbitrary — it's based on a set of criteria that weigh the employee's record, seniority, department needs, and other factors into a score. Previously, this process was manual, inconsistent, and slow. We replaced it with a transparent, automated system.

How It Works

An employee logs in and submits a study break application — entering their academic intention, target institution, and supporting information. The system then does the work:

  • Automated point calculation — the system evaluates the application against the ministry's eligibility criteria and calculates a total score
  • Eligibility determination — based on the score, the system determines whether the employee qualifies for the study break
  • Transparent result — the applicant sees their score breakdown, understands exactly why they did or did not qualify, and can track their application status throughout the process
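The core idea — a weighted point total checked against a threshold, with a per-criterion breakdown returned alongside the verdict — can be sketched as follows. The criterion names, weights, and threshold here are invented for illustration; the Ministry's actual scoring model is not described in this page.

```python
# Hypothetical sketch of a point-based eligibility check.
# Criterion names, point values, and the threshold are invented
# for illustration; they are not the Ministry's actual model.

def score_application(app: dict, criteria: dict) -> tuple:
    """Return (total score, per-criterion breakdown) for an application."""
    breakdown = {name: points for name, points in criteria.items() if app.get(name)}
    return sum(breakdown.values()), breakdown

CRITERIA = {                          # points awarded when the condition holds
    "five_years_service": 20,
    "clean_disciplinary_record": 25,
    "department_approval": 30,
    "field_matches_role": 25,
}
THRESHOLD = 70                        # minimum total score to qualify

application = {
    "five_years_service": True,
    "clean_disciplinary_record": True,
    "department_approval": True,
    "field_matches_role": False,
}

total, breakdown = score_application(application, CRITERIA)
eligible = total >= THRESHOLD
print(total, eligible)                # 75 True
```

Returning the breakdown dict rather than only the verdict is what makes the transparent result screen possible: each criterion's contribution can be shown to the applicant directly.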

The Administration Side

Ministry HR staff and decision-makers have a dedicated dashboard to oversee all applications — reviewing scores, managing edge cases, generating reports by department, and maintaining the scoring criteria that power the engine.

Why It Matters

Before this system, the study break decision process was vulnerable to inconsistency — outcomes could vary depending on who reviewed the application and when. The automated scoring engine applies the same criteria to every applicant without exception, making the process fairer and fully auditable.

  • Employee Study Break Application Portal
  • Automated Eligibility Point Calculation
  • Transparent Score Breakdown for Applicants
  • Automated Approval or Rejection Decision
  • Application Status Tracking
  • HR Administration Dashboard
  • Department-Level Application Overview
  • Configurable Scoring Criteria Engine
  • Application History & Archive
  • Reporting & Statistics by Department
  • Notification System for Application Updates
  • Document Attachment & Supporting Evidence Upload
MOHE Study Leave System

How It Was Built

Our Process
01. Criteria Analysis & Scoring Model

Worked closely with Ministry HR specialists and AAU to fully document the eligibility criteria used to evaluate study break applications — every factor, its weight, and the thresholds that determine approval. This scoring model became the core logic of the system.

02. System Architecture & Workflow Design

Designed the full application lifecycle — from employee submission through automated scoring, decision output, and HR oversight — defining every state an application can be in and how it transitions between them.
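A lifecycle of this kind — every state an application can be in, and the legal transitions between them — is commonly modelled as an explicit state machine. The state names below are assumptions for illustration, not the system's actual workflow definition:

```python
# Illustrative application state machine. State names and transitions
# are invented; the real system's workflow is not described here.

ALLOWED_TRANSITIONS = {
    "draft":        {"submitted"},
    "submitted":    {"scored"},
    "scored":       {"approved", "rejected", "under_review"},  # edge cases go to HR
    "under_review": {"approved", "rejected"},
    "approved":     set(),    # terminal
    "rejected":     set(),    # terminal
}

def transition(state: str, new_state: str) -> str:
    """Move an application to new_state, refusing any undefined transition."""
    if new_state not in ALLOWED_TRANSITIONS[state]:
        raise ValueError(f"cannot go from {state!r} to {new_state!r}")
    return new_state

state = "draft"
for step in ("submitted", "scored", "approved"):
    state = transition(state, step)
print(state)                          # approved
```

Making the transition table explicit means an application can never skip scoring or be re-opened after a terminal decision, which also gives HR an unambiguous status to display to the applicant.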

03. Platform Development

Built the employee application portal, the automated scoring engine, the HR administration dashboard, and the reporting layer — with a role-based access model that separates what applicants, HR staff, and ministry administrators can see and do.
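One straightforward way to enforce that separation is a role-to-permission map checked on every action. The role and permission names here are hypothetical, chosen only to mirror the three audiences named above:

```python
# Hypothetical role-based access model; role and permission names
# are invented for illustration.

ROLE_PERMISSIONS = {
    "applicant":     {"submit_application", "view_own_score"},
    "hr_staff":      {"view_all_applications", "review_edge_cases"},
    "administrator": {"view_all_applications", "review_edge_cases",
                      "edit_scoring_criteria", "generate_reports"},
}

def can(role: str, permission: str) -> bool:
    """True if the given role holds the given permission; unknown roles get nothing."""
    return permission in ROLE_PERMISSIONS.get(role, set())

print(can("applicant", "edit_scoring_criteria"))      # False
print(can("administrator", "edit_scoring_criteria"))  # True
```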

04. Scoring Engine Validation

Ran the scoring engine against historical application data provided by the Ministry to validate that the automated results matched the outcomes of past manual decisions. Iterated on the model until accuracy was confirmed before go-live.
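Validation of this kind can be framed as a back-test: run the engine over historical cases and measure how often its decision agrees with the recorded manual outcome, then investigate the mismatches. A minimal sketch — the sample records and the 70-point threshold are invented:

```python
# Back-testing sketch: compare automated decisions against recorded
# manual outcomes. Records and threshold are invented for illustration.

THRESHOLD = 70

historical = [            # (engine score, recorded manual decision)
    (82, "approved"),
    (65, "rejected"),
    (71, "approved"),
    (70, "rejected"),     # a mismatch worth investigating
]

def agreement_rate(records) -> float:
    """Fraction of historical cases where the engine's decision matches the record."""
    matches = sum(
        1 for score, decision in records
        if ("approved" if score >= THRESHOLD else "rejected") == decision
    )
    return matches / len(records)

print(agreement_rate(historical))     # 0.75
```

Each mismatch is a signal either that the model mis-weights a criterion or that the original manual decision deviated from policy — distinguishing the two is what the iteration rounds before go-live were for.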

05. Deployment & Staff Handover

Deployed the system within the Ministry's infrastructure, trained HR administrators on managing the scoring criteria and reviewing applications, and handed over documentation for ongoing operation and criteria updates.

Project Insights

Behind the decisions we made.

Encoding a Human Decision Into an Algorithm

The hardest part of this project was not the development — it was the requirements. The Ministry's eligibility criteria existed in policy documents and in the institutional knowledge of HR staff, not in a clean, unambiguous specification. Translating years of human judgement into a deterministic scoring algorithm required extensive workshops with HR decision-makers, careful documentation of edge cases, and multiple validation rounds against historical data before we were confident the engine was accurate.

Fairness Through Transparency

An automated decision system is only trustworthy if applicants understand how their score was produced. We designed the result screen to show the full score breakdown — every criterion, the value assigned to it, and how it contributed to the total — so an employee who is rejected understands exactly why, rather than receiving an opaque outcome. This transparency was a deliberate design decision, not an afterthought.

Get In Touch

Contact Us