New York City Enacts AI Bias Employment Discrimination Law
September 23, 2022 | 5 minute read
In December 2021, New York City became the first jurisdiction within the U.S. to enact a law regulating the use of artificial intelligence and machine learning algorithms to recruit, hire, and promote employees who work within the city. Int. No. 1894-A, described as “A Local Law to amend the administrative code of the city of New York, in relation to automated employment decision tools”, is set to take effect in January 2023 and will place new responsibilities on businesses within the city that use artificial intelligence in their hiring practices, with the goal of reducing the influence of algorithmic bias on hiring decisions in and around the NYC area.
How does the law define an automated employment decision tool?
Under the provisions of NYC’s Int. No. 1894-A, an automated employment decision tool is defined as “any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output, including a score, classification, or recommendation, that is used to substantially assist or replace discretionary decision making for making employment decisions that impact natural persons. The term ‘automated employment decision tool’ does not include a tool that does not automate, support, substantially assist or replace discretionary decision-making processes and that does not materially impact natural persons, including, but not limited to, a junk email filter, firewall, antivirus software, calculator, spreadsheet, database, data set, or other compilation of data.”
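To make the definition concrete, the kind of system it targets can be as simple as a model that converts candidate attributes into a single number used to screen applicants. The sketch below is a hypothetical illustration only; the feature names, weights, and scoring logic are invented and are not prescribed by the law.

```python
# Hypothetical illustration of an "automated employment decision tool":
# a computational process that issues a simplified output (here, a score)
# used to substantially assist an employment decision.
# All feature names and weights are invented for illustration.

def score_candidate(years_experience: float,
                    skills_matched: int,
                    assessment_result: float) -> float:
    """Return a 0-100 suitability score for a candidate."""
    # A simple weighted sum; a real tool might use a trained ML model.
    raw = 2.0 * years_experience + 5.0 * skills_matched + 0.5 * assessment_result
    return max(0.0, min(100.0, raw))

if __name__ == "__main__":
    score = score_candidate(years_experience=4, skills_matched=6, assessment_result=80)
    print(f"Candidate score: {score:.1f}")  # 78.0
```

A recruiter who relies on a score like this to screen or rank applicants would likely bring the tool within the law’s definition, whereas the items the definition excludes (spreadsheets, junk email filters, and so on) do not issue outputs that drive employment decisions.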
What are the duties of businesses under the law?
Under the provisions of NYC’s Int. No. 1894-A, businesses that use automated employment decision tools are required to have those tools undergo a “bias audit.” As stated in the law, these audits must be conducted by an independent auditor and must “assess the tool’s disparate impact on persons of any component 1 category required to be reported by employers pursuant to subsection (c) of section 2000e-8 of title 42 of the United States Code as specified in part 1602.7 of title 29 of the code of federal regulations.” For reference, 42 U.S.C. § 2000e-8(c) requires covered employers to keep records and file reports on the demographic makeup of their workforces, and 29 C.F.R. § 1602.7 implements that requirement through the EEO-1 report, whose “component 1” demographic categories are the ones the bias audit must cover.
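The law’s text does not prescribe a particular disparate-impact metric, but one common approach an auditor might take is to compare selection rates across the reported demographic categories, along the lines of the “four-fifths rule” found in federal EEOC guidance. The sketch below is a minimal, hypothetical illustration of that style of analysis; the group labels, counts, and the 0.8 threshold are assumptions, not requirements of Int. No. 1894-A.

```python
# Minimal sketch of a selection-rate comparison an independent auditor
# might perform. The "four-fifths rule" threshold (0.8) comes from
# federal EEOC guidance, not from Int. No. 1894-A itself, and all
# counts below are invented for illustration.

applicants = {  # category -> (number selected, number who applied)
    "group_a": (45, 100),
    "group_b": (30, 100),
    "group_c": (12, 40),
}

selection_rates = {g: sel / total for g, (sel, total) in applicants.items()}
highest_rate = max(selection_rates.values())

for group, rate in selection_rates.items():
    impact_ratio = rate / highest_rate  # compare each group to the most-selected group
    flag = "potential disparate impact" if impact_ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.2f}, impact ratio {impact_ratio:.2f} -> {flag}")
```

Here, group_a is selected at a rate of 0.45 while group_b and group_c are each selected at 0.30, giving impact ratios of roughly 0.67 and flagging both groups for closer review.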
In addition, the law prohibits employers from using automated employment decision tools to recruit, screen, hire, or promote employees unless certain stipulations are first met. For instance, a tool’s bias audit must have been conducted no more than one year before the tool is put to use. Employers also have a duty to make the results of these bias audits available to the general public, whether on the public website of the business that underwent the audit or through a similar channel. Moreover, all businesses that use automated employment decision tools, irrespective of the results of any independent audit, are required to provide prospective employees with a notice containing various details regarding such tools.
Public notification
To this last point, employers within New York City that use automated employment decision tools to aid in their hiring practices must provide job applicants with a notice that includes the following:
- Notice that an automated employment decision tool will be used in assessing the candidate for the position, provided no less than 10 business days before the use of such a tool. On top of this, employers must also “allow a candidate to request an alternative selection process or accommodation.”
- The specific job qualifications and characteristics that the automated employment decision tool will use to determine a candidate’s potential match for the position, no less than 10 business days before the use of such a tool.
- The specific types of personal information regarding a candidate that the automated employment decision tool will collect, as well as the source of such data, if this information is not already publicly available on the employer’s website. Job applicants also have a right to request a personal copy of this information, and employers within NYC are obligated under the law to furnish it within 30 business days of receiving such a request.
As for enforcement, businesses within NYC that fail to comply with Int. No. 1894-A are subject to a range of monetary penalties. Most notably, employers found to be in violation of the law are subject to a fine of $500 for the first offense, as well as an additional fine of $500 for each subsequent violation that occurs on the same day as the first. Employers that are repeatedly found in violation are subject to fines of up to $1,500 for each subsequent violation. What’s more, the law also gives NYC residents the right to bring civil lawsuits against employers they believe have violated their rights under the law.
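To make the fine schedule concrete, the short sketch below tallies maximum exposure under a straightforward reading of those figures. This is an illustrative calculation only, not legal guidance, and the violation counts are invented.

```python
# Illustrative tally of civil penalties under a straightforward reading
# of Int. No. 1894-A's fine schedule: $500 for the first violation and
# each additional violation on the same day, and up to $1,500 for each
# subsequent violation. Violation counts are hypothetical.

def estimate_max_penalty(first_day_violations: int, later_violations: int) -> int:
    """Return a maximum-exposure estimate in dollars."""
    first_day_total = 500 * first_day_violations   # first offense + same-day repeats
    later_total = 1500 * later_violations          # up to $1,500 each thereafter
    return first_day_total + later_total

# An employer with 3 violations on day one and 4 more in later weeks:
print(estimate_max_penalty(first_day_violations=3, later_violations=4))  # 7500
```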
As AI algorithms are used with increasing frequency across different sectors of business, instances in which an automated employment decision tool makes a biased decision will continue to occur as well. For this reason, legislation such as NYC’s Int. No. 1894-A is very much needed, as artificial intelligence playing a substantial role in a business’s decisions to recruit, hire, or promote employees is a new reality without precedent. That said, while NYC is the only U.S. city to have passed a law relating to AI and employment discrimination as of 2022, many more jurisdictions around the country will surely consider similar laws in the near future.