Department of Labor Guidance on the Use of Artificial Intelligence Tools


PintoBrown

In response to President Biden’s October 30, 2023, Executive Order 14110 on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence, the U.S. Department of Labor’s Wage & Hour Division and Office of Federal Contract Compliance Programs (OFCCP) recently issued guidance documents on the use of Artificial Intelligence (AI) in the workplace.

Wage & Hour Division: FLSA, FMLA & PUMP Act Compliance

The Wage & Hour Division’s (WHD) Field Assistance Bulletin No. 2024-1, issued on April 21, 2024, focuses largely on the potential compliance risks associated with the use of AI tools to track working time and legally protected leave or breaks required by the FLSA, PUMP Act, FMLA, and Employee Polygraph Protection Act. While Field Assistance Bulletins are written to guide WHD staff in the field, the information they contain is equally useful to the business community.

Employers are increasingly relying on automated timekeeping and monitoring systems to track employees’ working time for purposes of compliance with the overtime requirements of the FLSA. While the use of such technology is not prohibited, automated time-tracking or location-monitoring tools may not accurately record the time employees spend working. For example, periods of inactivity, including required breaks and waiting time, may be incorrectly recorded as unpaid non-work time even though that time is compensable under the FLSA. The Bulletin also stresses that employers must be vigilant when relying on AI tools to calculate employee wages, particularly for non-exempt positions in which wage rates may vary significantly based on the work performed.

Similarly, AI technologies used to process leave requests and record time off, including tracking FMLA-protected leave, can be unreliable. WHD warns that “without responsible human oversight, relying on automated systems to process leave requests, including determining eligibility, calculating available leave entitlements, or evaluating whether leave is for a qualifying reason, can create potential compliance challenges,” exposing employers to liability for violating the FMLA. Similar concerns arise under the PUMP Act, where AI technology may incorrectly limit the length and timing of breaks for nursing employees.

OFCCP Guidance: AI & Compliance with Federal Contractors’ EEO Obligations

OFCCP’s guidance opens with a warning to contractors. While AI selection tools may make the employment decision-making process more efficient, they also have the potential to perpetuate bias, leading to discriminatory employment decisions. Federal contractors cannot escape liability by using a screening tool developed by a third-party vendor. The guidance is very clear on this point: when using another entity’s product or service, “[a] federal contractor … cannot delegate its nondiscrimination and affirmative action obligations.” While this has always been the standard, AI assessment tools add an additional layer of complexity and risk for federal contractors.

As a general matter, OFCCP does not assess individual selection procedures unless the overall hiring process results in adverse impact of two or more standard deviations against a particular gender, race, or ethnic group. However, the current scheduling letter requests information about any selection tools used to make hiring decisions and specifically asks about AI-based assessments. The recently issued guidance expands on OFCCP’s expectations with respect to contractors’ use of AI-based selection procedures, including:

  • Contractors must “understand” and be able to explain to the OFCCP the “business need” for any AI tool used to make selection decisions. 
  • Contractors are expected to assess the job-relatedness of the factors tested by the tool. Although not stated in the guidance, contractors should be prepared to produce this assessment during a compliance review where adverse impact is identified. 
  • Additionally, contractors are advised to request that the vendor share the results of any bias assessments, debiasing efforts, and/or studies of system fairness performed by the vendor prior to purchasing and implementing an AI-based selection device. Contractors should anticipate that OFCCP will seek to review these assessments during a compliance review.
  • Contractors should regularly assess the results of the AI tool for adverse impact. If the tool results in adverse impact, the next step is to identify whether there is an alternative job-related selection device with less impact. This recommendation is in line with the current Executive Order regulations and the Uniform Guidelines on Employee Selection Procedures (UGESP); an illustrative adverse impact calculation appears after this list.
  • Applicants should be given advance notice that the selection procedure involves an AI-based selection tool.  The notice should identify the data that will be captured during the assessment, steps taken to ensure the security of the information collected, and instructions for requesting reasonable accommodation, if needed.
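
For readers less familiar with the underlying math, the “two or more standard deviations” threshold and UGESP’s four-fifths rule are statistical screens applied to applicant-flow data. The sketch below is a minimal illustration only, not part of the DOL or OFCCP guidance; the function names and applicant counts are hypothetical, and the binomial approximation shown is just one of several methods used in practice.

```python
# Minimal sketch (not from the guidance): two common screens for adverse impact
# in applicant-flow data. Function names and counts are illustrative only.
from math import sqrt

def impact_ratio(focal_selected, focal_applicants, ref_selected, ref_applicants):
    """UGESP 'four-fifths rule': ratio of the focal group's selection rate to the
    reference group's rate; a ratio below 0.80 is one indicator of adverse impact."""
    return (focal_selected / focal_applicants) / (ref_selected / ref_applicants)

def standard_deviation_test(focal_selected, focal_applicants, ref_selected, ref_applicants):
    """Approximate z-score comparing the focal group's selections to what a neutral
    process would produce; a magnitude of 2 or more corresponds to the
    'two or more standard deviations' threshold (binomial approximation)."""
    total_applicants = focal_applicants + ref_applicants
    total_selected = focal_selected + ref_selected
    p = focal_applicants / total_applicants        # focal share of the applicant pool
    expected = total_selected * p                  # expected focal selections if neutral
    sd = sqrt(total_selected * p * (1 - p))        # binomial standard deviation
    return (focal_selected - expected) / sd

# Hypothetical applicant-flow numbers for illustration only.
print(round(impact_ratio(15, 100, 65, 200), 2))             # 0.46 -> below the 0.80 benchmark
print(round(standard_deviation_test(15, 100, 65, 200), 2))  # -2.77 -> beyond two standard deviations
```

In practice, contractors typically run these screens with counsel and, where adverse impact appears, a labor economist, since small samples and how applicant data are pooled can distort both measures.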

Compliance Tips

AI is an evolving and complex area rife with potential liability. As the guidance documents make clear, employers are liable for decisions and actions stemming from the use of AI technology in the workplace. There is no “AI made me do it” defense. Accordingly, it is critical that employers be aware of any AI-based tools used to make employment-related decisions. Begin with an in-depth review of the documentation for each tool, and follow up with vendors if the documentation is not sufficiently detailed.

Employers using AI to track working time are advised to periodically assess the accuracy of timekeeping reports.  Similarly, AI tools used to make selection decisions should be monitored for adverse impact. To the extent compliance gaps are identified, employers should take appropriate action to resolve these issues.

All assessments, analyses and corrective actions taken should be documented. Finally, conduct all assessments under the protection of the attorney-client privilege.
