Department of Labor Issues Guidance in Connection with President Biden’s Artificial Intelligence Order

Late last year, President Biden issued a sweeping Executive Order concerning Artificial Intelligence (“AI”). Last month, the U.S. Office of Management and Budget released a Memorandum providing “requirements and guidance for AI governance, innovation, and risk management” in the federal public sector. (I previously wrote about the Executive Order and Memorandum). On April 29, 2024, the U.S. Department of Labor Wage and Hour Division issued guidance concerning the private sector’s use of AI through Field Assistance Bulletin (“FAB”) No. 2024-1, titled “Artificial Intelligence and Automated Systems in the Workplace under the Fair Labor Standards Act and Other Federal Labor Standards.”

Although written as an internal document, this 12-page FAB provides private-sector employers with important information and warnings about how to use (and not use) AI in the workplace. The FAB focuses on three federal laws:

  • The Fair Labor Standards Act (“FLSA”);
  • The Family and Medical Leave Act (“FMLA”); and
  • The Employee Polygraph Protection Act (“EPPA”).

Boiled down to its essence, the DOL cautions companies against relying solely on the output of AI systems and reminds employers that they must comply with the law regardless of the use of AI technologies. In other words, pointing the finger at AI will not be credited as a defense or excuse for violating federal law. The DOL ultimately calls upon employers to exercise and maintain “proper human oversight” of AI systems.

Below is a detailed overview of the FAB’s discussion of these three laws. If your business is using AI in any way, which it almost certainly is, then it needs a plan to ensure consistent and legally compliant use of the technology. RIW is happy to advise your company on how to maximize your use of AI while avoiding the issues laid out by the DOL concerning the FLSA, the FMLA, and the EPPA.

The Fair Labor Standards Act

The FLSA requires, among other things, that covered employees receive at least the federal minimum wage for every hour worked and overtime for each hour worked in excess of 40 hours in a workweek. To demonstrate compliance with the FLSA, employers are responsible for properly monitoring and recording the hours their employees work. As this can often be a tedious and time-consuming task, companies have welcomed the use of AI as a way to streamline the process. As outlined in the FAB, the DOL warns employers about how they are using this technology to track and record their employees’ time.

With the prevalence of remote work, some employers use AI to track employee productivity. These systems range from reviewing internet browsing history to monitoring keystrokes or mouse clicks, GPS data from an employee’s cellphone, and even eye movements through an employee’s webcam. As creepy and “big brother” as this may sound, companies have found these systems to be an effective and efficient way of determining whether employees are working when they say they are (or should be). Through the FAB, the DOL states that employers should be wary of relying on these systems to track employees’ work or break time. According to the DOL, the technology may not be able to determine when an employee takes breaks to, for example, get a cup of coffee, use the bathroom, or pump breast milk, all of which are the types of breaks for which an employee should be paid under the FLSA.

The DOL also touches upon the use of predictive modeling for timekeeping purposes. For those unfamiliar with predictive modeling, a good example is when your text message or e-mail provider recommends words to complete a sentence. The recommendation is made based on historical data. For timekeeping purposes, an AI system could know how many hours an employee typically works and use that historical data to auto-populate timekeeping records. The DOL cautions against this because predictive modeling, just like the monitoring systems, may not properly account for fluctuations in an employee’s schedule.

Unsurprisingly, the DOL warns about the use of AI systems to calculate wages. The FLSA ensures, among many other things, that employees are paid for the time they work at their regular rate of pay, plus overtime (if applicable). The DOL expresses concern about algorithmic decision-making to determine a worker’s rate of pay, especially in situations where a worker’s wage rate varies, such as when overtime pay is owed.

The Family and Medical Leave Act

As many readers likely know, the FMLA provides eligible employees with job-protected leave for qualifying family and medical reasons. For employers using AI systems to process and/or consider FMLA leave requests, the DOL states such use “should be overseen by the employer to avoid the risk of widespread violations of FMLA rights when eligibility, certification, and anti-retaliation and anti-interference requirements are not complied with.” Employers should closely monitor AI systems used to do the following:

  • Process FMLA leave requests;
  • Determine FMLA eligibility and/or benefits;
  • Calculate available leave entitlements; or
  • Evaluate whether the leave is for a qualifying reason.

The Employee Polygraph Protection Act

The EPPA generally prohibits private employers from using lie detector tests as part of the hiring process. There are limited exemptions to this prohibition; however, in such instances, an employer must provide the prospective employee with written notice of the use of a lie detector test.

The DOL notes that certain AI technologies input an individual’s eye measurements, voice analysis, micro-expressions, and other body movements into a proprietary algorithm to provide an opinion on whether that individual is lying or being deceptive. In the FAB, the DOL states that the use of AI technology in this manner as part of the hiring process would violate the EPPA unless it is used in an industry for which an exemption applies and, even then, only if the employer provides written notice of its use of this AI technology.

This is not a hypothetical concern. Litigation is presently pending in the U.S. District Court for the District of Massachusetts, where an individual brought suit against CVS, alleging it used this type of AI system as part of its hiring process in violation of a state law substantially similar to the EPPA.

RIW is Here to Help

AI is here to stay as part of the workplace. In fact, it will only become more prevalent in the coming months and years. The DOL’s FAB confirms that federal (and likely state) agencies will not wait for legislation or regulations governing the use of AI. Rather, they can and will find that companies’ use of AI violates existing laws.

We at RIW understand this ever-changing landscape and can assist your company in utilizing technology in a productive, efficient, and legally compliant manner.

Adam G. Gutbezahl is an associate in RIW’s Litigation Department, Employment Law Group, and Commercial Real Estate Group. 
