Balancing Innovation & Integrity: Ethical AI Practices in Recruiting and Talent Management for Life Sciences Companies
The use of AI tools in the workplace is emerging across all industries, including life sciences. There are countless examples of AI technology being deployed by life sciences organizations in research and development, drug discovery, and even clinical trials. At the same time, use cases for these tools are emerging in recruiting, onboarding, and talent management.
As life sciences organizations work to utilize AI to automate processes, many companies find themselves behind the curve in developing policies to oversee the use of these tools by their workforce, including their HR departments. Robust policies, procedures, and training protocols are critical to mitigating the potential risks associated with the improper use of AI tools.
In this article, we dive into the top AI use cases in recruiting and talent management, their associated risks and pitfalls, and best practices for life sciences companies to avoid those risks.
Practical AI Applications and Their Potential Pitfalls
Use Case: Recruiting Software
HR departments are using AI tools to help with many aspects of recruiting, including everything from content creation for job postings and candidate communications to the explanation of benefits and candidate tracking. In onboarding processes and on the talent management side, AI is being used to develop training modules, create HR-related messages, administer and analyze engagement surveys, and identify skills gaps in the workforce.
Potential Pitfalls: While these tools can certainly save time and money, they cannot fully account for potential bias in hiring processes. For example, if you’re using an AI tool to screen applications and you’re looking for someone who is an “aggressive” and “seasoned leader,” the tool is likely to screen out any resume that doesn’t contain those exact keywords. As a result, you might end up with a batch of candidates skewed heavily toward one demographic, as the sketch below illustrates.
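To make the screening-out mechanism concrete, here is a minimal sketch of a naive keyword filter. The keywords, resumes, and passes_keyword_screen function are hypothetical illustrations, not any real vendor’s tool:

```python
# Hypothetical sketch of a naive keyword-based resume screen.
# The keywords and resumes are illustrative, not real data.

REQUIRED_KEYWORDS = {"aggressive", "seasoned leader"}

def passes_keyword_screen(resume_text: str) -> bool:
    """Reject any resume that lacks an exact required phrase."""
    text = resume_text.lower()
    return all(keyword in text for keyword in REQUIRED_KEYWORDS)

resumes = [
    "Seasoned leader with an aggressive record of commercial growth.",
    "Collaborative manager who built and mentored high-performing teams.",
    "Drove 40% revenue growth by rebuilding the sales organization.",
]

shortlist = [r for r in resumes if passes_keyword_screen(r)]
print(shortlist)
# Only the resume echoing the exact phrasing survives; equally qualified
# candidates who describe the same accomplishments differently are dropped,
# which is how a keyword screen can quietly skew the candidate pool.
```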
The U.S. Equal Employment Opportunity Commission (EEOC) has issued AI-specific guidance emphasizing that employers must carefully assess any AI tools used in the hiring process and ensure they do not have adverse impacts on applicants with protected characteristics. In fact, the EEOC has already brought suits against companies for allegedly improper use of AI in their hiring processes. For example, it previously brought suit against iTutorGroup, a provider of English-language tutoring services to students in China using remote tutors, alleging that the company’s software was designed to automatically reject U.S. applicants who were women over age 55 and men over age 60. In its suit, the EEOC alleged that as a result of the improperly designed software, over 200 qualified applicants were rejected. iTutorGroup eventually settled with the EEOC for $365,000 and has stopped hiring tutors in the U.S. as a result of the litigation.
Use Case: Process Improvements and Auditing
AI tools are also being used for HR process improvements, including streamlining the ADA accommodation request process, minimizing language barriers, and analyzing HR liability risks. For example, AI tools can help review reductions in force (RIFs) for disparate impact, assess data associated with pay equity audits, and screen internal communications.
Potential Pitfalls: When it comes to using AI for process improvements or audits, it is still important to maintain the human element; these tools should not fully replace tasks that require a human touch. When reviewing pay data, AI can deliver real benefits in time saved and the ability to quickly examine data from a variety of angles, but companies cannot take the output at face value. Rather, they must examine the AI output with a human lens to determine the best approach to mitigating disparate impact and to understand why certain gaps in pay might exist.
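As a hedged illustration of the kind of human-in-the-loop check this calls for, the sketch below compares selection rates across groups against the EEOC’s traditional four-fifths (80%) rule of thumb for potential adverse impact. The groups, counts, and adverse_impact_flags function are hypothetical, and a flagged ratio is a prompt for human review, not a legal conclusion:

```python
# Minimal sketch of a four-fifths (80%) rule screen on selection rates.
# Groups and counts are hypothetical; real audits need human and counsel review.

def adverse_impact_flags(selections: dict[str, tuple[int, int]]) -> dict[str, float]:
    """selections maps group -> (selected, total applicants).
    Returns each group's selection rate relative to the highest-rate group."""
    rates = {group: sel / total for group, (sel, total) in selections.items()}
    top_rate = max(rates.values())
    return {group: rate / top_rate for group, rate in rates.items()}

data = {"group_a": (48, 100), "group_b": (30, 100)}
for group, ratio in adverse_impact_flags(data).items():
    marker = "REVIEW" if ratio < 0.8 else "ok"
    print(f"{group}: impact ratio {ratio:.2f} ({marker})")
# group_b's ratio is 0.62 (< 0.8), so a human reviewer should examine why
# the gap exists before acting on the tool's output.
```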
Use Case: Performance Evaluations and Mentorship
AI tools can also be used to help inform employee development programming. For example, some organizations are currently using AI to collect and analyze performance review data and even to manage mentorship matchmaking.
Potential Pitfalls: AI tools can help automate programs such as performance evaluations and mentorship, but doing so also invites the risk of bias. AI has a low level of emotional intelligence and creativity, meaning companies will need to ensure a human touch is still present in these use cases. Companies also need to ensure the input data for the algorithm is accurate and sound and is informed by the performance program’s specific goals.
When it comes to mentor matchmaking, setting the right parameters will be critical to ensure the tool searches not only for similarities between prospective mentors and mentees but also for differences, so that workers from different in-groups have the opportunity to be paired with one another. For example, if educational information is an input to the algorithm, the AI might automatically match people from the same school, potentially missing an entire category of candidates for mentee opportunities.
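The sketch below shows one hypothetical way to parameterize a matching score so that shared development goals drive the pairing while cross-school and cross-department pairings earn a small bonus rather than a penalty. The names, fields, and weights are illustrative assumptions, not a recommended formula:

```python
# Hypothetical sketch of a mentor-mentee scoring rule that rewards shared
# goals but does not reward alma-mater or department sameness.

from dataclasses import dataclass

@dataclass
class Person:
    name: str
    school: str
    department: str
    goals: set[str]

def match_score(mentor: Person, mentee: Person) -> float:
    """Score pairs on overlapping development goals, with a small bonus
    for cross-school and cross-department pairings."""
    goal_overlap = len(mentor.goals & mentee.goals)
    diversity_bonus = (mentor.school != mentee.school) + (mentor.department != mentee.department)
    return goal_overlap + 0.5 * diversity_bonus

mentor = Person("A. Rivera", "State U", "R&D", {"leadership", "public speaking"})
mentee = Person("B. Chen", "Tech Institute", "Regulatory", {"leadership", "networking"})
print(match_score(mentor, mentee))  # 1 shared goal + cross-school/department bonus = 2.0
```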
Best Practices to Avoid Bias and Protect Trade Secrets
The risks that come with AI tools shouldn’t deter you from implementing them in your life sciences company. But as you introduce these tools, it’s vital to follow best practices to ensure they are not inviting new biases into HR processes and that important employee and company data, like trade secrets, remain protected. To that end, some best practices include the following:
1. Create rigorous training programs
Despite a major uptick in job postings requiring AI skills, few workers are currently being offered AI-specific training. If your employees or HR departments are using AI tools, training programs need to be implemented to ensure security and proper use. At a minimum, AI training programs should cover ethical use of AI, bias mitigation, accuracy assurance, and appropriate use of confidential data. They should also include real-world examples to help employees contextualize the information. Additionally, this training should be continuously refreshed and re-administered to employees as the technology continues to evolve.
2. Manage your vendors and ask the right questions
There are a number of vendors that offer various AI algorithms with machine learning capabilities, especially in the life sciences industry. Companies working with vendors to develop these tools need to go in with the expectation that the vendor is not aware of all of the regulations and requirements applicable to life sciences companies. This will require you to conduct deep dives into the vendor’s AI systems and processes to ensure compliance. The relevant EEOC guidance makes clear that employers are responsible for the use of AI or algorithm-based hiring tools. At the end of the day, the company will bear responsibility, not the vendor.
In one case, CVS Pharmacy is facing a class action lawsuit in the District of Massachusetts alleging that the company’s use of a third-party-administered, AI-assisted video interview tool violates the state’s law against lie detector/integrity testing as a condition of employment. The tool was allegedly designed to gauge applicants’ “integrity and honor and help with lie detection.”
3. Be aware of state, local and even global requirements
When deploying AI across HR functions, companies will need to ensure compliance with all local, state, and national laws, which can vary significantly, from both employment law and data gathering perspectives. Life sciences companies must ensure that they obtain the appropriate permissions for the use of data and comply with notification requirements.
When developing or implementing AI tools, it’s important that data scientists, IT teams and HR professionals collaborate to develop a compliance structure that mitigates the risk of bias and ensures the right data is being used to inform algorithms based on your desired output and goals.
At Buchanan, our life sciences attorneys are on top of the ever-evolving landscape of AI and how it’s being used in employment practices. We’re ready to help your company ensure regulatory compliance and data security when introducing AI tools for recruiting, talent management, and more.