
On May 1, 2026, the Connecticut General Assembly passed Senate Bill 5, the Connecticut Artificial Intelligence Responsibility and Transparency Act, with a House vote of 131 to 17 and a Senate vote of 32 to 4. Governor Ned Lamont confirmed he will sign it. The first wave of compliance obligations lands October 1, 2026. That is less than five months away.
For years, Connecticut tried and failed to pass comprehensive AI legislation. A similar effort died under a veto threat in 2025. This year the bill was negotiated to include governor-backed provisions, which secured his support and cleared the path for a 71-page omnibus to become law. The result is one of the broadest state AI frameworks in the country, covering automated employment decisions, AI companion chatbots, synthetic content labeling, and whistleblower protections for frontier model developers.
Most of the coverage has focused on large technology companies and enterprise HR platforms. The businesses that need to read this are the ones in Stamford, Hartford, New Haven, and Bridgeport using AI tools in their hiring stack, their customer service workflows, or their employee performance systems. Those businesses are directly in scope, and the Connecticut Attorney General has already signaled that enforcement is not theoretical.
SB 5 covers several categories. For most SMBs in Connecticut, and for New Jersey and Delaware firms with Connecticut employees or operations, three areas are immediately relevant.
Automated employment decision technology. The law applies to any tool that processes personal data and produces a score, rank, classification, or recommendation that is a substantial factor in a hiring, promotion, discipline, or termination decision. Resume screening software, interview scoring platforms, performance management systems with AI outputs, workforce analytics tools that flag employees for review: all of these fall within scope. If your business uses any of them, you are a deployer under the law with obligations beginning October 1, 2026.
AI is not a legal defense. SB 5 amends Connecticut's anti-discrimination statutes to make explicit that using an automated decision tool is not a defense against a discrimination claim. If your AI-assisted hiring process produces a discriminatory outcome, the automated nature of the recommendation does not shield your business from liability. Courts may consider proactive bias testing as a mitigating factor, but that testing must be documented and in place before a complaint is filed, not assembled in response to one.
WARN Act AI disclosure. Beginning October 1, 2026, any Connecticut employer filing a WARN Act notice with the Department of Labor must disclose whether the layoffs are related to the use of artificial intelligence or another technological change. This is not a complex requirement, but it requires an internal process that connects your HR, legal, and operations teams before you ever need to use it. Organizations that have not built that process before the deadline will be building it under pressure, with a filing clock running.
Violations of the automated employment provisions are treated as unfair or deceptive trade practices under Connecticut law, enforced exclusively by the Attorney General. That framing carries more weight than a standard regulatory penalty structure.
The AG can pursue statewide enforcement actions without a private plaintiff. The unfair trade practices framework has historically created parallel class action exposure in other consumer protection contexts. And before SB 5 even passed, Connecticut AG William Tong published a February 2026 advisory memorandum to businesses making clear that his office already views existing Connecticut law as applying to AI systems. The new law gives his office purpose-built enforcement authority on top of that existing posture.
There is a 60-day cure period for violations occurring on or before December 31, 2027. That is not a safety net for businesses that have not done the underlying work. The cure period assumes you can identify the violation, demonstrate a remediation path, and show good-faith compliance effort. Organizations without an AI inventory and documented accountability structure have nothing to cure from. The period benefits businesses that are already operating a compliance program, not ones that are starting from zero.
An inventory of every AI tool touching employment decisions. This includes resume screening platforms, applicant tracking systems with AI scoring features, performance management tools with automated outputs, call scoring software used to evaluate employees, and any AI embedded within HR platforms your business already uses. When the question is asked systematically, most organizations find in-scope tools they were not tracking. Start with your HR, legal, and IT leads in the same conversation.
Vendor documentation you probably do not have yet. Developers of automated employment tools are required under SB 5 to provide deployers with compliance information, including the data categories the tool uses, the logic behind its outputs, and anti-bias testing documentation. If your vendor cannot or will not provide that information, the compliance gap is your problem, not theirs. Every vendor contract in your employment AI stack needs review now, and missing data-sharing agreements need to be resolved before October, not after.
Named ownership for each AI system in your employment stack. The law assumes a deployer knows which tools it is running, who is accountable for them, and which decisions they influence. A named role with a documented mandate for each AI system is the foundation the rest of the compliance structure is built on. Most SMBs do not have this documentation today.
A built WARN Act disclosure process. The time to design this process is before you need it. Establish the internal review that determines whether any future workforce reduction is linked to AI adoption. That requires connecting HR, legal, operations, and whoever owns your AI tools in a defined workflow. It is straightforward to build, but it takes coordination across teams that rarely sit together until a filing forces them to.
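For teams starting the inventory from a blank page, the record-keeping itself does not need special software. A minimal sketch in Python of what one inventory entry and its gap check might look like (the field names, tool name, and gap categories here are illustrative assumptions for this sketch, not terms defined by SB 5):

```python
from dataclasses import dataclass

# Illustrative inventory record for an AI tool that touches employment
# decisions. Field names are assumptions for this sketch, not SB 5's
# statutory language.
@dataclass
class EmploymentAITool:
    name: str
    vendor: str
    business_function: str          # e.g. "resume screening"
    decisions_influenced: list      # hiring, promotion, discipline, termination
    owner: str                      # named accountable role; empty = gap
    vendor_docs_on_file: bool       # data categories, output logic, bias testing
    bias_testing_documented: bool

def compliance_gaps(tool: EmploymentAITool) -> list:
    """Return the gaps to resolve for this tool before the deadline."""
    gaps = []
    if not tool.owner:
        gaps.append("no named owner")
    if not tool.vendor_docs_on_file:
        gaps.append("missing vendor compliance documentation")
    if not tool.bias_testing_documented:
        gaps.append("no documented bias testing")
    return gaps

inventory = [
    EmploymentAITool(
        name="ResumeRank",           # hypothetical tool name
        vendor="ExampleVendor",      # hypothetical vendor
        business_function="resume screening",
        decisions_influenced=["hiring"],
        owner="",                    # gap: nobody accountable yet
        vendor_docs_on_file=False,
        bias_testing_documented=False,
    ),
]

for tool in inventory:
    print(tool.name, "->", compliance_gaps(tool))
```

A spreadsheet with the same columns works just as well; the point is that every tool gets a row, every row gets an owner, and the gaps are visible in one place rather than scattered across vendor contracts and tribal knowledge.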
Connecticut did not pass SB 5 in isolation. Colorado introduced a replacement for its 2024 AI Act in the same week. New York's employment AI and pricing bills are advancing through committee. California continues to expand its AI governance frameworks. The Troutman Pepper state AI law tracker, published May 4, 2026, documented active AI legislation in more than a dozen states, with employment and automated decision-making as the most common targets.
For a business operating only in Connecticut, SB 5 is the immediate deadline. For a New Jersey or Delaware business with Connecticut employees, customers, or vendors, SB 5 applies to those relationships regardless of where your headquarters sits. Employment law does not stop at state borders when the affected workers are in Connecticut.
The businesses that build compliance posture before October 1 are not doing extra work. They are doing the work that every state in this region will eventually require, and they are doing it at their own pace rather than on a regulator's timeline. That difference shows in how the program is built, how much it costs, and what it looks like to an auditor or an AG inquiry.
The SamurAI's AI Governance and Compliance practice is built around the exact gaps SB 5 exposes. Not a generic compliance checklist. A structured engagement that produces documented, audit-ready outputs your business can stand behind if the Attorney General comes asking.
AI system inventory and classification. We map every AI tool in use across your organization, including approved tools, tools your teams adopted without a formal procurement process, and vendor-embedded AI inside platforms you already run. The output is a documented inventory tied to business function and decision impact, organized against SB 5's deployer definitions. During this process, most organizations find in-scope tools they were not tracking before we started.
Vendor compliance documentation review. We work through your employment AI vendor agreements and identify where the data category disclosures, output logic documentation, and anti-bias testing requirements SB 5 demands from developers are missing or insufficient. We help you structure the conversations with vendors to close those gaps before October, and we flag the contracts where the risk sits with you if the vendor cannot or will not comply.
Accountability mapping and ownership documentation. We identify who owns each AI system in your employment stack, document the mandate for each role, and build the governance structure SB 5 assumes you already have. This is the foundation of your compliance posture. Without it, every other control is unsupported.
WARN Act disclosure process design. We build the internal review workflow that connects your HR, legal, and operations teams for any future workforce reduction that could involve AI-related factors. The process is designed to run in the background until you need it, and to produce the right output on the first filing without scrambling to reconstruct the decision logic after the fact.
Ongoing AI governance posture. SB 5 compliance is not a one-time audit. New tools get deployed. Vendors update their systems. The classification you establish today needs a review cycle to stay accurate. We build that cadence into the engagement so the posture you build now does not degrade by Q1 2027 when the next wave of obligations kicks in.
The inventory, the vendor outreach, the documentation, and the internal process design all require lead time that compresses the closer you get to October 1. Organizations that begin in September will not finish cleanly. Organizations that begin now have a workable window.
The SamurAI works with businesses across NJ, DE, and CT on AI governance, compliance readiness, and cybersecurity posture. If your organization uses AI tools in any employment or operations context and has not mapped its exposure under SB 5, that is the gap worth addressing before the Attorney General identifies it first.
