Bringing Artificial Intelligence into Pay Decisions

January 9, 2020

As employers look for new ways to hire and keep skilled employees, some have begun to leverage artificial intelligence (AI) to more precisely compensate workers. They want pay offers and salary adjustments to reflect the value of an employee's skills rather than the compensation level of a specific job, particularly when those skills are in high demand and critical to the organization's success.

By using AI, employers can gain "new opportunities by rethinking the value that compensation programs deliver to the business and employees," said George Zarkadakis, digital lead at consultancy Willis Towers Watson in London. AI could help develop compensation metrics that reward employee efforts to advance an organization's goals. It can also analyze great swaths of labor market data to provide localized and up-to-the-minute competitive pay rates.

Shifting from Jobs to Skills

IBM has been using AI in its compensation systems for several years as it has shifted performance management to focus on ongoing feedback rather than a single periodic performance rating, while also tying salary increases more closely to employee skills. As a result, pay decisions more accurately reflect what the market is paying for certain skills as demand for those competencies ebbs and flows.

"Certain new skills are scarce and high in value, while other skills have become commoditized," said Joanna Daly, the company's Vice President of Compensation, Benefits and Corporate Health and Safety, based in Armonk, N.Y. In this environment, "managers must make more complex compensation decisions, so they need to have an understanding of the supply and demand for specific skills."

IBM managers use the in-house system to make better-informed decisions during compensation discussions; its machine-learning component recommends salary increases on a scale from high to average to none.

Managers can also leverage this information to explain how pay decisions are linked to employees' skill levels. "It gives employees an incentive to keep their skills competitive," said Daly, who noted that the AI system allows IBM to react immediately to changes in the market for specific skills.

Although the AI tool makes salary increase recommendations, the final decision is still up to the individual manager. However, "less than 5 percent of managers have disagreed with these recommendations," Daly said. Managers who follow AI compensation recommendations have cut their attrition rates by 50 percent.
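The tiered recommendation idea can be illustrated with a minimal sketch. This is not IBM's actual system; the inputs (a skill-demand score and a gap to market pay), the weights, and the thresholds are all hypothetical stand-ins for what a trained model would learn:

```python
# Illustrative sketch (not IBM's system): a rule-based stand-in for a model
# that maps skill-market signals to a salary-increase tier.

def recommend_increase(skill_demand: float, market_gap: float) -> str:
    """Return a salary-increase tier: "high", "average", or "none".

    skill_demand: 0-1 score for how scarce/in-demand the employee's skills are.
    market_gap: fraction by which current pay trails the market rate
                (positive means the employee is paid below market).
    """
    # Hypothetical weighting of the two signals.
    score = 0.6 * skill_demand + 0.4 * max(market_gap, 0.0)
    if score >= 0.5:
        return "high"
    if score >= 0.2:
        return "average"
    return "none"

print(recommend_increase(0.9, 0.15))  # scarce skills, underpaid -> high
print(recommend_increase(0.1, 0.0))   # commoditized skills, at market -> none
```

A real system would learn the weighting from labor-market data rather than hard-code it, but the manager-facing output is the same kind of tiered recommendation the article describes.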

Creating Nimble Systems

Employers can leverage AI not only for current compensation needs but also to model how skills might change over the next few years and how much it will cost to acquire and retain those skills in the workforce.

To price skills, employers can use AI to:

  • Harvest datasets from both internal and external sources.
  • Separate out the skills various roles require now and are likely to require in the future.
  • Determine pay for those skills based on geography.
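The three steps above can be sketched in a few lines. The data, skill labels, and cities here are invented for illustration, and a production system would extract skills from job descriptions rather than receive them pre-tagged:

```python
# Illustrative sketch of the three skill-pricing steps, with made-up data.
from collections import defaultdict
from statistics import median

# Step 1: harvest pay records from internal and external sources
# (each record: skill, city, observed salary).
internal = [("python", "Austin", 115000), ("cobol", "Austin", 95000)]
external = [("python", "Austin", 125000), ("python", "NYC", 150000),
            ("cobol", "NYC", 105000)]
records = internal + external

# Step 2: separate observations by skill (a real system would first
# infer which skills each role requires).
by_skill_geo = defaultdict(list)
for skill, city, salary in records:
    by_skill_geo[(skill, city)].append(salary)

# Step 3: price each skill per geography, here via the median observation.
prices = {key: median(vals) for key, vals in by_skill_geo.items()}
print(prices[("python", "Austin")])  # -> 120000
```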

Employers that are not ready to develop these systems themselves can rely on vendors to do it for them. For example, PayScale, which provides online data about compensation and benefits, relies on AI to help employers price jobs based on small differences among employees' skills and local labor market conditions.

That's become such an important element of compensation planning that PayScale updates its compensation database of skill differentials every two weeks, compared to quarterly updates for geographic differences.

AI Challenges

Using AI in this way is not without challenges and risk. If not managed and monitored appropriately, AI-based compensation tools could start out with ingrained biases or become further biased over time.

"The risk is in the variables in connection with the data in AI," said Peter Cassat, a partner with law firm Culhane Meadows Haughian & Walsh PLLC in Washington, D.C.

Research on AI outcomes by The Brookings Institution, a Washington, D.C.-based think tank, shows generally that if biased data feed the algorithm, results may be biased. For example, if some employees are being paid less than others despite having the same job, experience and skill levels, simply inputting that data into an AI-based pay system could perpetuate that bias.
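The mechanism is easy to demonstrate. In this toy sketch (groups, salaries, and the averaging "model" are all hypothetical), a system that simply learns from biased historical pay reproduces the gap in its recommendations:

```python
# Illustrative sketch: a naive pay model trained on biased history
# perpetuates the bias. All data here is invented.
from statistics import mean

# Historical pay for the same job, experience, and skill level,
# but group "B" has been paid less than group "A".
history = [("A", 100000), ("A", 102000), ("B", 90000), ("B", 88000)]

# A simplistic "model": recommend each group's historical average.
def recommend(group: str) -> float:
    return mean(pay for g, pay in history if g == group)

print(recommend("A") - recommend("B"))  # the historical gap persists: 12000.0
```

No one programmed the gap into the model; it emerged entirely from the training data, which is exactly the failure mode the Brookings report describes.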

"Bias in algorithms can emanate from unrepresentative or incomplete training data or the reliance on flawed information that reflects historical inequalities," the Brookings report stated. If left unchecked, biased algorithms can perpetuate biases against certain groups of people "even without the programmer's intention to discriminate."

Employers should be mindful of how AI tools are functioning and what data they are collecting. "It is important to make sure this does not favor some groups over others based on factors like gender," Cassat said.

Avoiding these problems begins with due diligence before choosing AI tools. Over time, it is also important to remain alert for any unintended consequences, not only in the recommendations the system outputs but also in how managers use the results.

"Don't just implement and forget," Cassat said. "Look at the results and whether and how they differ following implementation."

Rigorous data governance for AI also is important to ensure AI-supported compensation decisions are fair and unbiased. "AI systems that are not ethically governed can promote exclusion and feel too intrusive—and even threatening—to those impacted by their decisions," Zarkadakis said.

The Author:

Joanne Sammer is a New Jersey-based business and financial writer.