This article draws on Creative Navy's project work in complex technical and scientific software, spanning computational fluid dynamics, surgery planning systems, scientific research software, CAD/CAM platforms, circuit simulation, vessel tracking systems, air traffic control, and mission control environments. We have designed for demanding technical experts such as CFD analysts, circuit design engineers, surgeons, air traffic controllers, mission controllers, and maritime operators. A central competency in this work is the visualisation of complex, dynamic, multi-dimensional data under real operational conditions, where clarity and precision directly affect decision quality. Several of these environments are governed by specific human factors standards, including EUROCONTROL and ICAO guidelines for ATC, IEC 62366 for medical software, and NASA requirements for mission-critical systems.
An aircraft is on short final. It is night. Visibility is at CAT I minima and every edge light on the approach path should be burning at assigned intensity. A red fault indicator appears on the tower workstation. The operator needs to identify which specific lamp has failed, on which circuit, at which position on the runway. They have seconds before the aircraft touches down.
This is the operational scenario that airfield lighting control and monitoring systems (ALCMS) are designed to support. Whether the operator resolves that fault in three seconds or thirteen depends not on the hardware monitoring the lamp, but on what the interface does with the information it has already received.
This benchmark evaluates eight ALCMS and air traffic control platforms against seven criteria drawn from operational requirements and the regulatory standards governing airfield lighting. The systems range from purpose-built ALCMS to full air traffic management suites that include lighting control as one component. All are commercially deployed.
An Airfield Lighting Control and Monitoring System (ALCMS) gives tower operators and maintenance centres remote control and real-time monitoring of every runway and taxiway light on an airfield. ICAO Annex 14, Volume I (9th Edition, as of November 2022) requires automatic monitoring of approach and runway lighting with immediate fault detection. Minimum interface requirements for safe operations include spatial fault location on a map, single-lamp resolution, consistent alert coding, and corrective action reachable without multi-screen navigation.
Key Statistics
- Runway incursion reports filed annually with the FAA Aviation Safety Reporting System: approximately 2,000 (as of August 2017; data period January 2012 to August 2017)
- Share of reported runway incursions classified as ATC incidents: 16% (FAA/ASRS, 2012 to 2017)
- ASRS reports citing flight crew fatigue in 2022: 1,024; in the same year, ATC staffing shortages were cited as a contributing factor in rising runway incursion rates
- Share of fatal commercial aviation accidents attributed to human factors: approximately 70%
- Share of human-factor accidents where pilot fatigue is a contributing cause: 15 to 20%, per industry figures cited in peer-reviewed aviation research
- ICAO Annex 14 (9th Edition, as of November 2022): a runway light is classified unserviceable when its beam intensity falls below 50% of the required standard
- Low-visibility restriction: when runway visual range is below 350 m, no two adjacent taxiway centreline lights may be simultaneously unserviceable
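The two serviceability rules above are concrete enough to state as executable checks, which is useful for seeing exactly what an ALCMS back end must evaluate before raising an alert. The sketch below is illustrative only, assuming a per-lamp intensity measurement and an ordered list of centreline lamp states; the function and parameter names are ours, not drawn from any vendor API.

```python
def is_unserviceable(measured_intensity: float, required_intensity: float) -> bool:
    """ICAO Annex 14 rule: a lamp is unserviceable when its beam intensity
    falls below 50% of the required standard."""
    return measured_intensity < 0.5 * required_intensity

def low_visibility_violation(centreline_serviceable: list, rvr_m: float) -> bool:
    """Low-visibility rule: when runway visual range is below 350 m, no two
    adjacent taxiway centreline lights may be simultaneously unserviceable.

    `centreline_serviceable` is an ordered list of booleans, one per lamp
    along the centreline (an assumed representation)."""
    if rvr_m >= 350:
        return False
    return any(
        not a and not b
        for a, b in zip(centreline_serviceable, centreline_serviceable[1:])
    )
```

Note that the adjacency rule is a property of the spatial sequence of lamps, not of any single lamp: it can only be evaluated by a system that knows where each lamp sits relative to its neighbours, which is one more reason the map model matters.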
ALCMS Evaluation Criteria
Seven criteria form the basis of this benchmark. They reflect the operational demands placed on ALCMS interfaces across facilities ranging from small regional airstrips to multi-runway international airports. The evaluation methodology applied here draws on the same structured benchmarking approach we use across complex interface programmes, where criteria are developed from observed use conditions rather than vendor specification sheets.
| Criterion | What it assesses |
|---|---|
| Geographically accurate aerodrome map | Spatial accuracy, completeness, whether the map is treated as the primary display element |
| Consolidated status dashboard | Information density, layout logic, and customisation options |
| Remote control and monitoring | Scope of remote capability, reliability, single-lamp resolution |
| Data organisation and clarity | Information hierarchy, labelling, control over abbreviation use |
| Alert and error visibility | Colour consistency, priority coding, fault triage speed |
| Learnability | Onboarding friction for new operators, familiarity of navigation model |
| Overall UI and UX quality | Visual coherence, interaction consistency, layout logic |
All scores in the individual reviews that follow are the analyst's professional assessment based on interface analysis, screenshot review, and product documentation. They are not externally verified ratings.
The Expert Operator Fallacy
The most consistent objection we encounter when recommending interface improvements to airfield management teams is a version of the same argument: their operators are trained professionals, and trained professionals do not need a simplified tool. The assumption is that deep system familiarity compensates for any interface deficiency, and that investing in clarity risks condescending to an expert workforce.
EUROCONTROL's PJ16 Controller Working Position/HMI project (as of 2024, an active SESAR programme) states explicitly that its goal is to reduce controller workload and stress levels through improved HMI design. That is not a concession to novice users. It is a regulatory programme treating interface quality as a structural condition for safe expert performance. The same framing runs through the foundational EUROCONTROL EATCHIP programme principles: display only minimal information, keep it simple, use colour-coded guidance.
Expertise in a domain and fluency with a particular interface are not the same capability. Domain expertise is stable and accumulates over years. Interface fluency degrades under fatigue, time pressure, and night-shift conditions. In 2022, the ASRS received 1,024 flight crew fatigue reports, while simultaneous ATC staffing shortages in the United States were contributing to a documented increase in runway incursion rates. The operator who decoded a shorthand label correctly on Monday morning may not do so reliably at 03:00 on Friday after a twelve-hour shift.
Expert proficiency is a resource. An interface that forces experts to spend that resource locating information rather than acting on it is not a professional-grade tool. It is a liability with a clean specification sheet.
Insero RCMS
Analyst scores: UX 8/10, UI 7/10
Insero leads this benchmark on spatial completeness and scalability. The aerodrome map displays all lights, equipment, runways, and taxiways in accurate position. Insero allocates generous space to the aerodrome map, the most critical element of the software, reflecting the core principle of evidence-based embedded GUI design: the primary instrument must dominate the spatial hierarchy. The system supports full remote control and monitors to the level of individual lamps rather than circuits or zones only. ICAO Annex 14 (9th Edition, as of November 2022) treats individual lamp unserviceability as a compliance-relevant event, making single-lamp resolution a regulatory feature rather than a premium one.

Air traffic control UX-UI Example 1
Two capabilities separate Insero from every other system in this benchmark. The first is the search function on the main dashboard, which allows operators to locate any aircraft, vehicle, or asset in real time. No other system reviewed here provides map-level search for the primary operational view. The second is a scalability model that runs unchanged from a small airstrip to a multi-runway military or civil airport. The system's deployment in the defence sector, including upgrades at multiple air bases, reflects its capacity to operate under demanding operational conditions.

Air traffic control UX-UI Example 2

Air traffic control UX-UI Example 3
Where Insero loses points is in secondary panel design. The meteorological report section presents data without visual structure: a dense sequence of abbreviations and numbers with no hierarchy indicating which values are decision-critical and which are background context. A panel consulted during approach and landing operations should not require the same decoding effort as a raw data output. This is a solvable problem that does not undermine the system's core strengths, but it illustrates that even the strongest performers in this benchmark have not resolved information hierarchy at every level.

Air traffic control UX-UI Example
CORTEX ALCMS
Analyst scores: UX 6.5/10, UI 5/10
CORTEX presents a map that is simpler and less spatially detailed than Insero's. Reduced visual density in the map layer is not inherently a weakness; lower complexity can meaningfully reduce onboarding friction for new operators. The problem is not the map. The dashboard does not appear to be customisable, a significant limitation given the volume of controls packed into its top and bottom bars, a pattern that recurs consistently in high-density industrial dashboards where customisation is treated as optional rather than operational. When a high-density control bar is fixed in position and cannot be reorganised, every operator must work within an information arrangement designed once for a generalised user, regardless of their specific context.

UI of airfield lighting software

UI of airfield lighting software

UI of airfield lighting software
The alert system is a more consequential problem. CORTEX uses multiple colours across its error states, with red, yellow, and blue appearing for different fault conditions. Multiple colour codes for the same functional category require operators to maintain a secondary translation layer. The cognitive cost is not large in isolation. Across a shift, under fatigue and time pressure, it accumulates. Consistent single-purpose colour coding is the standard the alert literature supports, and CORTEX does not meet it.
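The single-purpose coding rule described above reduces to a one-line invariant: the mapping from alert tier to colour must be one-to-one. A minimal sketch, with hypothetical tier names; no vendor palette is being quoted here.

```python
# Hypothetical alert tiers and colours; the invariant, not the palette, is the point.
ALERT_COLOURS = {
    "critical_fault": "red",
    "degraded": "amber",
    "advisory": "white",
}

def palette_is_single_purpose(palette: dict) -> bool:
    """A coding scheme is single-purpose when each tier has exactly one colour
    and no colour is reused across tiers (the tier-to-colour map is one-to-one)."""
    return len(set(palette.values())) == len(palette)
```

A palette that assigns red to two different fault conditions, as CORTEX effectively does, fails this check: the operator must then consult context to recover what the colour means, which is exactly the translation layer the paragraph describes.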
Data organisation across the system follows a logic of completeness rather than hierarchy. Information is presented sequentially with no visual prioritisation. The underlying assumption appears to be that displaying everything eliminates the risk of missing something. What it actually produces is an interface where the signal competes with the noise at the same visual weight. This is a structural design problem, not a cosmetic one, and the distinction matters for any improvement programme.
S4GA ALCMS
Analyst scores: UX 8.5/10, UI 7.5/10 (advanced tier)
S4GA's showing in this benchmark comes down to a single design decision played out across two product tiers. The basic product tier ships without an aerodrome map. This is a remarkable omission for software whose stated purpose is controlling runway lighting. S4GA markets itself with the phrase "the world's safest runway lighting." The basic tier cannot answer the question that matters most in a fault condition: where on the airfield is the problem?

UX airfield lighting system

UX airfield lighting system
The advanced tier includes a geographically accurate layout with detailed light grouping, separate taxiway control, and the ability to mark temporarily closed areas. At this level S4GA earns high scores on data organisation and learnability. The table layout with expandable groups is well designed. The column selector gives operators genuine control over what data they see. Visual inconsistency across icon and component sizes is a surface problem rather than a structural one.

UX airfield lighting system

UX airfield lighting system
ICAO Annex 14 (9th Edition, as of November 2022) requires automatic fault monitoring and immediate fault detection at the individual lamp level. A list of lamp identifiers and status indicators satisfies the monitoring requirement technically. It does not provide the spatial context that makes a fault response operationally fast. Whether the decision to withhold the map from the entry-level product reflects deliberate market segmentation or a safety concession is a question worth putting directly to the product team.
Adacel Aurora ATM
Analyst scores: UX 6.5/10, UI 2/10
Adacel Aurora is included in this benchmark because of its significant market position in North America rather than as a direct ALCMS comparator. Aurora is a full air traffic management platform covering oceanic, en route, terminal, and tower operations. Its scope is broader than any other system evaluated here, and its interface density reflects that scope.

Old-school UX UI for airfields

Old-school UX UI for airfields

Old-school UX UI for airfields

Old-school UX UI for airfields
The visual design is severely dated. Dated appearance alone does not disqualify a professional tool. High-reliability systems operate on legacy interfaces because replacement costs are high and institutional familiarity has value. What the visual age signals is that the interaction model has not been revised alongside two decades of progress in display density management and information hierarchy design.
Aurora includes customisation options for data viewing, which is meaningful given the complexity it manages. The alert system makes correct use of colour separation: saturated red errors are readable against a more muted background palette. This is the right underlying instinct. It is not sufficient to offset the information density that the platform imposes across every screen, which presents a significant onboarding barrier regardless of an operator's domain experience.
AP-TECH Innovence ALCM
Analyst scores: UX 7.5/10, UI 5/10
AP-TECH's Innovence ALCM achieves something few systems in this benchmark manage consistently: it uses a header-and-tabs navigation model, familiar from modern web applications, without abandoning the aerodrome map as the primary display element. The feature set is distributed across tabs rather than loaded onto a single screen. This reduces the operational density of the primary view without hiding available functionality, a balance most competitors in this category have not found.

UX Design airfield software

UX Design airfield software

UX Design airfield software

UX Design airfield software
The argument that functional depth and interface quality are competing properties is not supported by the evidence. In interface design for safety-critical industrial systems, the two are not a trade-off, and the case for pursuing both together is well documented across industrial embedded GUI programmes. The systems that score highest in this benchmark achieve their scores by organising their functional depth, not by reducing it. Innovence demonstrates that the navigation model is the lever.
Information hierarchy is inconsistent across the platform. Some features render larger than the assets they are attached to, inverting the expected visual relationship between component and context. A floating panel that changes position between tabs introduces a memorisation cost that grows across a shift. Despite managing a large inventory of assets, the map itself has no search or filtering mechanism. This gap appears in system after system in this benchmark and is worth naming as a structural pattern rather than a product-specific observation.
OCEM INFINITE ALCMS
Analyst scores: UX 9/10, UI 8.5/10
OCEM INFINITE is the top-scoring system in this benchmark on both dimensions. It is web-based, which raises legitimate questions about performance under the concurrent monitoring loads that complex airport operations generate. None of the other systems evaluated here are web-based, so a direct comparison is not possible. The interface quality warrants detailed attention regardless.

Airfield technology UX and UI

Airfield technology UX and UI

Airfield technology UX and UI
The aerodrome map is the benchmark's most capable. OCEM renders the live state of the airfield lighting system directly on the map, so the current condition of every light is readable as a spatial picture rather than a status list requiring contextual translation. This is the clearest available demonstration of what a map-anchored interface accomplishes: it eliminates the cognitive step between reading an alert and knowing where on the airfield to act.
The feature set is distributed in a collapsible bottom panel, which means the map can occupy the full display when operational conditions require maximum spatial context. Data organisation reaches a level not achieved elsewhere in this benchmark. Preset configuration is presented as a grid showing intensity values by zone and condition. The log system includes search and filtering. Modal windows, tabs, and tables are used consistently across pages to organise a large feature set into readable sections.
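A preset grid of the kind described maps naturally onto a table keyed by zone and visibility condition. The zones, conditions, and intensity steps below are invented for illustration; OCEM's actual preset values are not public in the documentation we reviewed.

```python
# Invented zones, conditions, and intensity steps (%); illustrative only.
PRESETS = {
    ("runway_edge", "day"): 10,
    ("runway_edge", "CAT I"): 30,
    ("runway_edge", "CAT III"): 100,
    ("taxiway_centreline", "day"): 3,
    ("taxiway_centreline", "CAT I"): 10,
    ("taxiway_centreline", "CAT III"): 30,
}

def preset_intensity(zone: str, condition: str) -> int:
    """Look up the commanded intensity step for a zone under a visibility condition."""
    return PRESETS[(zone, condition)]
```

Presenting this structure as a visible grid, rather than burying each value in a separate settings page, is what earns OCEM its data-organisation score: the operator can read an entire operating condition at a glance.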

Airfield technology UX and UI

Airfield technology UX and UI
OCEM's genuine limitation is colour. The system uses more distinct colours than the alert literature recommends, which begins to erode the parsability of a coding system that is otherwise well-constructed. Whether that colour palette can be rationalised without losing the expressive resolution of the live lighting visualisation is a design tension without a clean resolution at this product generation. The "connect airport" positioning announced at Airspace World 2025 signals an ambition toward integration with broader airport systems; whether that ambition will be matched by equivalent interface investment is a question the next product generation will answer.
Indra ATCS
Analyst scores: UX 9/10, UI 8.5/10
Indra ATCS is the only system in this benchmark that matches OCEM INFINITE on both scores, and it does so across a platform managing ground movement, tower operations, and en route radar. The scope is the largest of any system reviewed here.

User Interface Aerodrome software

User Interface Aerodrome software

User Interface Aerodrome software
The aerodrome map is the most spatially complete in the benchmark. Runways, taxiways, aprons, and buildings are all rendered. Several other systems omit built structures, which reduces the fidelity of the positional model an operator builds through repeated use. A controller working from a map that includes buildings will develop more accurate spatial memory than one working from a schematic of runways and taxiways alone.
Data density is high across most screens, and the organisation, while present, is calibrated primarily for experienced users. Cluttered screens with multiple colour-coding systems can extend the time taken on critical tasks even for domain-proficient operators, because the first step is confirming that the correct element has been identified before acting. This finding applies to comprehensive ATC platforms specifically: the principles of spatial anchoring and consistent alert coding are not negotiable at any scope level, but implementing them across a platform of Indra's breadth is a harder design problem than implementing them in a purpose-built ALCMS.
Thales TopSky ATC
Analyst scores: UX 5/10, UI 5/10
The Thales TopSky analysis is limited by documentation availability. The accessible materials cover the airspace map and flight management views rather than the full platform. The scores should be treated as partial assessments of what was evaluable rather than a complete system rating.
The available screens demonstrate a design philosophy that treats abbreviation as the default and full labels as an exception. Data fields, alert states, and contextual menus are shortened throughout. The airspace map provides genuine spatial context for experienced controllers. The problem is that spatial utility is undermined by the requirement to decode every data element before acting on it.

Airfield management software UX

Airfield management software UX

Airfield management software UX
The reasoning behind pervasive abbreviation is not evident from the available screens. Display space is not visibly constrained, and many contextual menus are collapsible and therefore not competing with the map for screen real estate. Choosing abbreviation under these conditions sacrifices clarity for convention. A tired controller approaching the end of a shift does not make the abbreviation-to-meaning translation less reliably because their domain expertise has lapsed. They make it less reliably because fatigue imposes a processing cost on every translation step the interface requires.
The Map-Anchor Standard
Across the eight systems in this benchmark, one pattern predicts more of the final score than any other single criterion. Systems that treat the aerodrome map as the primary visual instrument, that organise their alerts, controls, and status indicators in relation to spatial position, consistently score higher across every other criterion in the evaluation. Systems that treat the map as one component among several competing for screen space score lower across every criterion, because they are solving a different operational problem.
Fewer than half of the benchmarked systems consistently give the map that primary position. Two systems provide search or asset-level filtering on the map itself: Insero RCMS and OCEM INFINITE. These are also the only two systems in the benchmark that approach the combination of spatial clarity, data organisation, and alert management that operational conditions require.
An ALCMS interface that meets the Map-Anchor Standard treats the geographically accurate aerodrome map as the single primary visual instrument rather than one panel among several. Every alert, fault status, and control function is spatially anchored to map position so that an operator reading a fault condition immediately identifies its location on the airfield without additional navigation. Search and asset-level filtering on the map, consistent single-purpose colour coding for alerts, and collapsible secondary panels are the minimum supporting requirements.
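In data-model terms, the standard amounts to making position a required field of every fault record, so that no alert can reach the display without a map anchor. A minimal sketch of that idea, with illustrative field names of our own; no benchmarked system's schema is being quoted.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MapAnchoredFault:
    """A fault record that cannot be constructed without a spatial anchor."""
    lamp_id: str
    circuit: str
    easting_m: float    # position on the aerodrome map (illustrative coordinates)
    northing_m: float
    severity: str       # e.g. "critical_fault", "degraded", "advisory"

def triage_order(faults):
    """Most severe first; within a tier, the map, not the list, disambiguates."""
    rank = {"critical_fault": 0, "degraded": 1, "advisory": 2}
    return sorted(faults, key=lambda f: rank.get(f.severity, 3))
```

The design choice is structural rather than visual: when position is mandatory at the data layer, spatial anchoring on screen stops being a rendering option a product team can defer.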
| System | UX | UI | Map quality | Alert consistency | Data organisation | Search / filter |
|---|---|---|---|---|---|---|
| OCEM INFINITE | 9/10 | 8.5/10 | Live lighting visualisation | Good (colour excess) | Best in benchmark | Logs only |
| Indra ATCS | 9/10 | 8.5/10 | Full spatial with buildings | Dense but separated | Structured, expert-calibrated | No |
| S4GA ALCMS | 8.5/10 | 7.5/10 | Advanced tier only | Adequate | Strong at advanced tier | No |
| Insero RCMS | 8/10 | 7/10 | Full, searchable | Good red coding | Inconsistent in secondary panels | Map search |
| AP-TECH Innovence | 7.5/10 | 5/10 | Conventional map | Inconsistent | Variable by screen | Charts only |
| CORTEX ALCMS | 6.5/10 | 5/10 | Simplified | Multiple codes | No hierarchy | No |
| Adacel Aurora ATM | 6.5/10 | 2/10 | Not confirmed | Partial | Dense throughout | No |
| Thales TopSky ATC | 5/10 | 5/10 | Route focus | Partial | Abbreviations throughout | No |
All scores are the analyst's professional assessment based on interface analysis and product documentation review. They are not independently verified ratings.
What ALCMS Scores Reveal
Three principles emerge from the cross-system analysis that apply to any organisation currently evaluating, operating, or redesigning an ALCMS interface.
Map spatial accuracy is a baseline safety requirement. ICAO Annex 14 (9th Edition, as of November 2022) requires automatic monitoring at the individual lamp level. An interface that cannot place that lamp on a map does not serve the regulatory intent of the standard it supports, even when it satisfies the monitoring requirement technically. Spatial location is the information that transforms a fault alert into an actionable response.
Alert consistency matters more than alert visibility. Every system in this benchmark makes red fault indicators visible. Fewer than half apply colour consistently enough to support rapid fault triage under time pressure. When a fault appears in one colour in one screen and a different colour for an equivalent event in another screen, the operator's first task becomes confirming what the colour means rather than responding to the condition it signals.
Search is a safety feature. In a system managing several hundred individual lamps, vehicles, and assets, an operator who cannot filter or search the map in real time must navigate it manually during a live aircraft event. In the patterns we observe consistently across interfaces without map search capability, that navigation takes time the operational situation does not reliably provide. The system that resolves this is not more convenient. It is safer.
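At minimum, the search capability described above is a case-insensitive filter over the asset inventory rendered on the map. A sketch under assumptions of ours: assets are dicts carrying an identifier and a kind, a shape we have invented for illustration; a production ALCMS would index this inventory rather than scan it.

```python
def search_map_assets(assets, query):
    """Case-insensitive substring match over asset id and kind.

    `assets` is assumed to be a list of dicts with "id" and "kind" keys;
    the matching assets are returned for highlighting on the map."""
    q = query.strip().lower()
    return [a for a in assets if q in a["id"].lower() or q in a["kind"].lower()]
```

Even this trivial filter changes the operational task: instead of visually scanning several hundred map symbols during a live aircraft event, the operator types a fragment of an identifier and acts on the highlighted result.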
Limits and Gaps
This benchmark has constraints that readers should factor into how they apply its findings.
The analysis is based on interface review, screenshot analysis, and product documentation. No formal usability testing with trained operators in live or controlled environments was conducted. The scores reflect professional interface analysis, not measured task time, error rate, or operator performance under operational conditions. The findings identify design patterns; they do not quantify their operational consequences.
Documentation availability varied across the benchmark. The Thales TopSky analysis is based on partial documentation and should be treated as a partial assessment. The Adacel Aurora review reflects a broader ATC platform evaluated against criteria weighted toward ALCMS functions, which creates an imperfect comparison for that system specifically.
The benchmark applies a single criterion set to both purpose-built ALCMS and comprehensive ATC platforms. This allows comparison on the dimensions most operationally significant for airfield lighting management, but it underweights capabilities that matter for the broader ATC context that Adacel, Indra, and Thales serve. A platform-level evaluation of those systems would require a different and wider criterion set.
What this benchmark cannot settle is whether the interface design patterns it identifies as problematic produce measurably worse outcomes in operational terms. The regulatory literature specifies what is required. It does not document what happens when specific interface criteria fall below the standard it implies. Closing that gap would require controlled studies or structured incident analysis that is not yet systematic in this domain.
The systems reviewed here represent the current commercial field. The next article in this series turns to the design principles that the highest-scoring systems share, and what a purpose-built Map-Anchor interface would look like from the ground up.
The aircraft that opened this benchmark lands safely. Whether the operator resolves the lighting fault in three seconds or thirteen is a function of interface design, not monitoring hardware. Every system reviewed here detects faults. The separator is whether the interface makes that fault spatially readable at the moment it matters.
The map is the instrument. Everything else — the dashboards, the alert panels, the remote control interfaces — exists to support it. That principle, obvious in retrospect, is violated by the majority of systems in this benchmark. Not because the engineering teams who built them are unaware of it, but because the assumption that expert operators will learn the system consistently overrides the investment in making the system meet the operators where they are.
OCEM INFINITE and Indra ATCS demonstrate that high scores across every criterion are achievable within the same functional scope that lower-scoring competitors manage. The difference is not capability. It is structural. Map primacy, consistent alert coding, collapsible secondary panels, and asset-level search are design decisions, not hardware constraints. They are available to every system in this benchmark.
The organisations responsible for the lowest-scoring systems in this review have signed off on interfaces that their operators use every night, at every airport those systems serve. The question those organisations should be asking is not whether their operators are trained to use the system. The question is what happens when those operators are also tired.
Organisations reviewing ALCMS interface quality or planning a procurement decision are welcome to request a direct briefing on the findings of this benchmark through interface-design.co.uk.
Frequently Asked Questions
What is the Map-Anchor Standard for ALCMS interface design?
The Map-Anchor Standard is the evaluative principle introduced in this benchmark: every criterion for ALCMS interface quality, from dashboard customisation to alert management, should be evaluated relative to whether the aerodrome map is treated as the single primary visual instrument. An interface meets the standard when every fault alert, status indicator, and control function is spatially anchored to map position, making fault location immediate without additional navigation. Search on the map, consistent single-purpose colour coding, and collapsible secondary panels are the minimum supporting requirements.
Which ALCMS software scores highest in this benchmark?
OCEM INFINITE Airfield Technology ALCMS and Indra ATCS share the highest scores, both receiving UX 9/10 and UI 8.5/10 in the analyst's assessment. OCEM INFINITE is distinguished by live lighting visualisation rendered directly on the aerodrome map. Indra ATCS scores equally on spatial completeness across a broader platform managing ground movement, tower, and en route operations. Both scores are the analyst's professional evaluation based on interface analysis and product documentation; they are not independently verified ratings.
Does ICAO Annex 14 specify interface design requirements for ALCMS?
ICAO Annex 14, Volume I (9th Edition, effective November 2022) requires automatic monitoring of airfield lighting including approach and runway lights, and specifies that individual lamps are unserviceable when their intensity falls below 50% of the required standard. It establishes monitoring obligations at the lamp level and requires immediate fault detection. The standard does not prescribe specific interface design features, but its monitoring requirements imply that spatial fault location must be available to the operator within the response window a live aircraft event allows.
What causes alarm fatigue in airfield lighting control systems?
Alarm fatigue in ALCMS interfaces typically results from three interface conditions: multiple colour codes used inconsistently across the same alert category, absence of a priority hierarchy distinguishing safety-critical faults from status notifications, and alert panels presenting information sequentially with no visual separation between urgency levels. These conditions require operators to evaluate every alert individually rather than responding to its visual tier. Over a shift, the assessment load accumulates. Systems with consistent single-purpose colour coding and explicit priority hierarchies reduce this burden measurably.
How does S4GA ALCMS differ from OCEM INFINITE in its interface approach?
S4GA's highest-scoring tier includes a customised airfield layout with detailed light grouping and strong data organisation built on familiar web-application navigation conventions. Its entry-level tier omits the aerodrome map entirely, relying on a tabular interface. OCEM INFINITE provides a geographically accurate map with live lighting visualisation and the benchmark's most structured data organisation at all product levels. The two systems represent different philosophies: S4GA uses navigation model familiarity to ease onboarding; OCEM INFINITE centres every interaction on the spatial model of the airfield.
Why does airfield lighting software UX affect runway safety?
ICAO Annex 14 and FAA specification L-890 require ALCMS to provide immediate fault detection across the entire airfield lighting system. The regulatory obligation is on the system. The operational obligation, responding to a fault before it affects an aircraft on approach or departure, is on the operator. Interface design determines how quickly a trained operator can locate a fault, identify its severity, and initiate corrective action. When the interface requires multi-screen navigation to answer a spatial question, the response time lengthens. That lengthening is not a performance metric. It is a safety margin being consumed.
References
Aviation Renewables. (n.d.). ALCMS product documentation, citing FAA 150/5345-56 and FAA specification L-890. https://aviationrenewables.com/product/alcms-airport-lighting-control-monitoring-system/
Daramola, A. O., et al. (2025). Evolution of human factors research in aviation safety: A systematic review and bibliometric analysis of the intellectual structure. ScienceDirect. https://www.sciencedirect.com/science/article/pii/S2666449625000830
EUROCONTROL. (2024). Controller working position/human machine interface (PJ16 CWP/HMI). https://www.eurocontrol.int/project/controller-working-position-human-machine-interface
FAA/ASRS. Runway incursion data, January 2012 to August 2017. Referenced via Wikipedia, Runway incursion. https://en.wikipedia.org/wiki/Runway_incursion
ICAO Annex 14, Volume I, Aerodromes, 9th Edition. (2022, November). International Civil Aviation Organization. Cited via River Island Airport Solutions. https://riverisland.aero/index.php/icao-standards-and-photometric-testing/
Kuffner, T. (1999). Controller centered HMI: Exploiting modern graphical capabilities for en route air traffic control. MIT Lincoln Laboratory. https://archive.ll.mit.edu/mission/aviation/publications/publication-files/ms-papers/Kuffner_1999_CCHMI_MS-13545_WW-18698.pdf
Olaganathan, R., Holt, T. B., Luedtke, J., & Bowen, B. D. (2021). Fatigue and its management in the aviation industry, with special reference to pilots. Journal of Aviation Technology and Engineering, 10(1). https://commons.erau.edu/cgi/viewcontent.cgi?article=2705&context=publication
In this story
This UX benchmark evaluates eight airfield lighting control and monitoring systems on map quality, alert consistency, data organisation, and learnability. The Map-Anchor Standard emerges as the single most reliable predictor of interface performance across every criterion.