Throughout the 2010s, as the world rebounded from the financial crisis, regulators worldwide focused intently on macroprudential reform. Today, however, with the financial sector having weathered the pandemic and the market turmoil following the war in Ukraine, they are increasingly turning their scrutiny to non-financial risk.
To understand these new challenges, the Taiwan Banker spoke with Erich Hoefer, Co-Founder and COO, and Cameron Lawrence, Director of Research, at Starling Trust Sciences – a US-based startup that helps banks manage their compliance culture through technology. “These agencies have realized that financial risk controls and capital buffers may address the ‘too big to fail’ problem, but they leave unsolved the ‘too big to manage’ problem,” said Starling.
The growth in connectivity brought by technological change, along with the increasing complexity of regulations against money laundering and sanctions evasion, makes corporate culture ever more important. These risks usually do not show up on banks’ balance sheets until it is far too late, so a proactive approach is required.
The Taiwan Banker: Could you first provide some background on Starling and its business model?
Starling: Starling Trust Sciences is an applied behavioral sciences company with a mission to strengthen trust in organizations and institutions. We do this through three core offerings: predictive behavioral analytics tools, related advisory work, and a curated global dialogue on how best to govern and supervise cultural, behavioral, and other non-financial risks.
Through the application of what is known as computational social science, we are making it possible for complex organizations to anticipate and address such risks with reliable foresight. And we provide customers with highly-specialized advisory support in connection with the adoption of such proactive risk management tools.
The Taiwan Banker: Could you explain more about this “computational social science”? What does it entail?
Starling: Seeking to manage employee conduct risk without addressing culture is unlikely to achieve desired outcomes. Yet organizations regularly focus on strengthening controls, adding processes, or adopting intrusive surveillance. These approaches ignore fundamental aspects of human behavior — that people are strongly influenced by social cues received from their peers, and the behavioral norms seen to be at work among them. These informal influences – collectively, a firm’s culture – can and do overwhelm formal management structures.
A growing number of banks and other financial institutions have realized this and are investing in culture management capabilities. However, the most common tools available for measuring culture (employee surveys, interviews, etc.) are highly subjective and expensive in terms of both cost and employee distraction. Because they are periodic and based on samples of employees, managers cannot access continuous metrics covering the entire organization.
Starling’s predictive behavioral analytics platform delivers culture and governance metrics in real time, spanning the entire enterprise. Furthermore, we do this without relying on surveys or other intrusive interventions by analyzing the standard internal datasets a firm already collects.
This is possible because these data sets, particularly electronic communications metadata, contain signals that reflect the behavioral and cultural proclivities of the organization. Without reliance on specific text content, Starling’s AI interprets patterns in the flow of an organization’s internal communications and ties these patterns to behaviors of interest.
This may include behaviors related to speak-up and challenge, employee engagement, diversity & inclusion, or team performance. This equips managers with accurate leading indicators of behaviors that are likely to promote or hinder desired objectives, so that they can act before problems are flagged by traditional reporting systems.
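Starling does not publish its algorithms, but the general idea of deriving behavioral signals from communications metadata – without reading message content – can be illustrated with a toy metric. The sketch below is purely hypothetical (the names, the data, and the `team_insularity` metric are invented for illustration): it measures what fraction of each team’s outbound messages stay inside the team, a crude proxy for the kind of insularity that might hinder speak-up and challenge.

```python
from collections import defaultdict

# Hypothetical email metadata records: (sender, recipient, sender's team,
# recipient's team). Only the pattern of who communicates with whom is used --
# no message content.
messages = [
    ("ana", "ben", "risk", "risk"),
    ("ana", "carl", "risk", "trading"),
    ("ben", "ana", "risk", "risk"),
    ("dee", "eve", "trading", "trading"),
    ("dee", "fay", "trading", "trading"),
    ("eve", "dee", "trading", "trading"),
]

def team_insularity(records):
    """Fraction of each team's outbound messages that stay within the team.

    A persistently high value can flag an insular group that rarely engages
    with the wider organization -- one possible leading indicator, not a
    verdict on any individual.
    """
    sent = defaultdict(int)      # total messages sent by each team
    internal = defaultdict(int)  # messages that stayed inside the team
    for sender, recipient, s_team, r_team in records:
        sent[s_team] += 1
        if s_team == r_team:
            internal[s_team] += 1
    return {team: internal[team] / total for team, total in sent.items()}

print(team_insularity(messages))
```

In this toy data, the trading team never messages outside itself, so its insularity score is 1.0; a real system would track many such metrics continuously and look for changes over time rather than judging a single snapshot.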
The Taiwan Banker: Can you provide a practical example of the use of technology to uncover problems ahead of time?
Starling: One project Starling did with HSBC was turned into a Harvard Business School case study.
Following the financial crisis, HSBC had paid out over US$5 billion in fines and penalties as a result of employee misconduct, and was the subject of a Deferred Prosecution Agreement with the US government. As a consequence, the bank was under intense pressure to demonstrate a more effective operational risk management framework.
This required a wholesale update of the bank’s “Three Lines of Defense” implementation, including hiring and reallocating thousands of executives, assigning new roles and reporting relationships, and implementing expensive new systems and processes.
Despite over $3 billion invested, the firm’s Global Head of Operational Risk still struggled to anticipate and avoid various non-financial risks, and the firm faced continued regulatory wrath. A capable Three Lines of Defense framework relies on a complicated web of interactions across business lines and management tiers, and it was difficult to demonstrate, to the satisfaction of the bank’s regulators and Board, that the expected behaviors were taking place.
Specifically, the bank’s leadership faced:
· Long periods of uncertainty between deep-dive audits of risk conditions;
· Reliance on ‘executive intuition’ to identify management blind-spots;
· Risk management failures often discovered only during regulatory reviews; and
· Punitive fines in the wake of these lapses.
HSBC turned to Starling’s AI-powered platform for a solution. Starling’s platform was able to distinguish between teams that were achieving their goals and those where risk oversight was not meeting expectations. Starling could also identify the specific behaviors that differentiated good performers from poor performers. And, because the platform covered the entire bank and updated in real time, management didn’t have to wait for survey results or status updates to learn whether the right behaviors were taking place. In one case, Starling’s algorithms were able to detect an underperforming group six months before they self-reported their condition — giving management months of lead time to address shortcomings.
The Taiwan Banker: Presumably when people are considering unethical actions, they are likely to move off their corporate email systems into informal chats, or perhaps unmonitored video calls. How can one system measure all aspects of communication?
Starling: We developed Starling specifically with this challenge in mind. Traditional surveillance and control technologies rely on catching a single message among millions of emails, chats, and other communications; only with 100% coverage could firms be truly ‘protected’ under such an approach.
Instead, Starling focuses on behavioral patterns persistent across time and communications channels. This means that, even if misconduct is taking place off-channel, Starling can detect relevant patterns in the approved communications channels. Thus, managers can focus less on reacting to risk events and spend more time addressing their root causes.
The Taiwan Banker: To what extent do the compliance problems you deal with involve malice versus negligence?
Starling: We discuss this in terms of misconduct – things done wrongly – and poor conduct – things not done as well as necessary. Our technology helps to address both issues, because in either case, the firm’s culture allows harmful behaviors to propagate, contagion-like, across the firm.
Many observers assume that conduct issues are driven by individuals deciding to cheat, steal, or be careless. That happens, of course, but the behaviors considered normal in one's peer group are far more impactful.
A key driver is what social psychologists refer to as “normative expectations” of behavior. Often this comes about when misaligned incentives, competitive pressures, and uncertain institutional ethical values encourage individuals to pursue self-interest rather than the firm’s stated goals. Some research suggests that people who work at banks — when primed to think of themselves in the context of their workplace — are more likely to behave dishonestly.[1] A culture problem.
Poor conduct occurs when normative expectations allow for weak systems, poor training, and pressures that deviate from established standards. Despite management's best efforts, mistakes are allowed to happen at scale without nefarious intentions. This has been called “the normalization of deviance.”[2]
As London School of Economics and Political Science Professor Tom Reader wrote in the preamble to our recent report, The Costs of Misconduct, “whereas misconduct is arguably motivated (whether by organizations or individuals), poor conduct is a byproduct of poor system design.” In both cases, the important lesson for management is to focus not only on formal systems and mechanisms, but also to address cultural drivers that can work against them.
The Taiwan Banker: One emerging issue for banks in Taiwan and elsewhere is cybersecurity. To what extent is this a human governance, versus a technological problem?
Starling: With the costs of cyberattacks continuing to rise, one of Europe’s largest insurers recently warned that cyberattacks may soon become ‘uninsurable’.[3] Improved management of risk culture may be especially important in this context.
Social engineering – whereby attackers lure employees into exposing internal systems – is a major vulnerability. Training helps, but unless it is reinforced by a culture of risk prevention, the necessary behaviors may not take hold. Maintaining strong cyber defenses also requires sound governance and coordination across the enterprise to ensure that defenses keep pace with evolving business needs. At a minimum, problems and vulnerabilities must be escalated and resolved quickly and efficiently. Fortunately, the behaviors that support effective governance of cybersecurity can, in fact, be assessed proactively.
1. Alain Cohn & Ernst Fehr, “Business Culture and Dishonesty in the Banking Industry,” Nature, vol. 516, no. 7529 (2014).
2. Diane Vaughan, The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA, University of Chicago Press, 1997.
3. Ian Smith, “Cyber Attacks Set to Become ‘Uninsurable’, Says Zurich Chief,” Financial Times, Dec. 26, 2022.