Challenges and Opportunities with Employee Engagement AI

Martin van Blerk
4 min read · Mar 11, 2024

During the COVID-19 pandemic, some teachers looking for ways to engage and support their remote students were excited by the prospect of an AI program claiming to analyze emotions based on facial expressions and body language. Developed through a partnership between Intel and a company called Classroom Technologies, the Zoom-compatible system was designed to discern whether a student was bored or confused by the material, and even whether their mind had started wandering.

The concept involved the program capturing images of a student’s face, aligning those images with real-time information about what the student was working on at the time, and inferring the most likely way the student was responding to the lesson. The ultimate goal was to help instructors figure out when a student needed additional help, or perhaps a change of lesson plan.
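The article doesn’t detail how the system was actually built, but as a rough, hypothetical sketch, a pipeline of this general shape would pair a captured video frame with the student’s current activity and record a coarse engagement estimate. Every name below (EmotionClassifier, analyze_student, the label set) is an illustrative placeholder and an assumption on my part, not the Intel/Classroom Technologies software.

```python
# Hypothetical sketch of an engagement-analysis pipeline of the kind described
# above. All names and types are illustrative placeholders, not the actual
# Intel / Classroom Technologies API.

from dataclasses import dataclass
from datetime import datetime
from typing import Literal

# Coarse labels of the sort such a system might report.
Engagement = Literal["engaged", "bored", "confused", "distracted"]

@dataclass
class EngagementSignal:
    student_id: str
    timestamp: datetime
    activity: str          # e.g. "quiz-3" or "lecture-slides"
    estimate: Engagement   # the model's best guess, not ground truth

class EmotionClassifier:
    """Placeholder for a facial-expression model returning a coarse label."""
    def predict(self, frame: bytes) -> Engagement:
        # A real system would run a trained vision model here.
        return "engaged"

def analyze_student(student_id: str, frame: bytes, activity: str,
                    model: EmotionClassifier) -> EngagementSignal:
    """Pair a face frame with what the student is working on and record
    the model's (inherently uncertain) engagement estimate."""
    return EngagementSignal(
        student_id=student_id,
        timestamp=datetime.now(),
        activity=activity,
        estimate=model.predict(frame),
    )
```

Even in this toy form, the output is only a model’s best guess attached to some context about the lesson, which is worth keeping in mind for the concerns raised below.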

Some educators in Hong Kong have begun to deploy a similar system, which measures the movement of muscle points on students’ faces to gauge emotions such as anxiety and fear.

Concerns about overreach

While some teachers have found these kinds of programs helpful, others have been vocally critical. Drawing on research into human emotion, they note that it is often impossible to pinpoint someone’s emotional state from the multifaceted array of gestures, facial expressions, and other physical signals we all display in different situations. For many educators, this type of AI application can become intrusive and a source of student anxiety without demonstrating any objective benefit to students.

Studies have also shown that the way people express strong emotions nonverbally takes different forms depending on their cultures of origin. This last point is especially crucial, given that some AI systems have been shown to replicate the biases of the humans who build them and supply their training data.

What can this tell us about the use of employee engagement AI in the workplace? The conversation on this topic within the education field has strong parallels for the human resources industry. After all, both HR and education are inherently people-focused — in both fields, human connection and empathy are paramount. Like their counterparts in schools and universities, many HR departments are proceeding with caution when it comes to any AI product that presents itself as able to analyze employee emotions.

Focus on AI as a tool

As a December 2023 Forbes Human Resources Council article put it, emotional intelligence is the key to sustaining any complex effort involving interpersonal dynamics or changes to company culture. Even the most sophisticated AI system cannot make these efforts successful by itself. If a company simply deploys an AI system as a “crutch,” as the Forbes piece put it, instead of as a genuine “tool” to assist its human professionals, there is little chance of achieving tangible results.

So how can a company use AI to help talent acquisition professionals and other HR leaders really get to know a candidate or employee? How can these professionals use AI-powered tools in ethical ways that enhance trust and connection rather than undermining them?

Ensuring regulatory compliance

Any time a company uses an algorithmic program to help make decisions that involve hiring, retention, and development, the legal team should be integrated into the process to ensure compliance with all regulatory requirements. HR teams that focus on supporting a diverse workforce should also be part of the design and implementation process, to enhance intercultural understanding and empathy, and to ensure that anti-bias goals are met.

Practicing transparency

It’s also vitally important to practice transparency. A company rolling out a new AI system should engage in substantive conversations with employees about its purpose, how it will be used, and how any data collected will be stored and safeguarded.

Planning strategically

Strategize in advance with vendors about how to build AI HR protocols that will help you accomplish your goals while bringing everyone on the team on board.

If the objective is to help the talent acquisition team scan through a backlog of resumes, one type of pattern-recognition and sorting program may do the job better than another. When you’re looking to build a more diverse workforce, another system may be the best choice. Letting potential vendors know exactly what the strategy is will allow them to offer products more likely to meet the need.

When interviewing potential vendors, discuss their password-protection, data-collection, and data-retention policies. Cross-check these policies to be sure they comport with all local, state, and national regulations on data privacy.

Supportive — not intrusive — AI

With transparency and appropriate guardrails in place, an HR department can deploy AI in ways that support everyone on the team. In addition to its growing ability to help weed out bias in hiring and promotion, and its consistent strength in organizing data, AI can help identify and address serious issues like burnout.

Going beyond the window dressing of many standard employee wellness programs, an effective AI system can help employers identify anxiety and stress points and make appropriate referrals to supportive services. It can also bolster meaningful employee engagement programs and deliver data-driven insights to inform evaluation, training, and career-development goals.

Martin van Blerk is a NZ entrepreneur who studied business, management, marketing, and game development at the University of Waikato and joined the University Game Developers Programme.