Dear ,
Last Friday, Governor Healey committed Massachusetts to a three-year, multimillion-dollar contract with a single company, OpenAI, to deploy its AI tool for all 40,000 executive branch employees. She did this based on the recommendation of a task force dominated by executives with financial ties to OpenAI. She did it without consulting the workers who will use the tool or the communities who rely on the services that will be shaped by this tool. And when the State House News Service requested a copy of the contract, her administration did not respond.
If you rely on MassHealth, the workers who handle your coverage will have this tool with access to your personal data. If you call a state agency about unemployment benefits or housing, the person on the other end may use this tool to draft their response. This one company's technology will be woven into how your state government serves you for the next three years.
That means the company Governor Healey chose matters. For you, your family, and our community.
So let me tell you about the company she chose.
OpenAI is not only building tools for state governments. According to the Department of Homeland Security's own public inventory, published this January, OpenAI's technology is being used by ICE across enforcement and operations. Carahsoft, the reseller that won the Massachusetts contract, is also OpenAI's channel for pursuing Department of Defense work. OpenAI quietly removed its own ban on military uses of its technology. And OpenAI hired its chief security officer directly from Palantir, the surveillance company that built ICE's deportation targeting system, a tool that pulls Medicaid data to map where immigrants live.
I am not saying the state's AI tool will share your data with ICE. The administration says state data will be protected. But we have not seen the data processing agreement, because Governor Healey's administration won't release the contract. We don't know what protections actually exist for your health information, your address, your family's data. And the company we're asked to trust is the same company embedded in a federal enforcement infrastructure that is using health benefits data to target immigrant communities for deportation.
We deserve a Massachusetts that is a sanctuary state
In Somerville, we know why Massachusetts needs to become a sanctuary state. We watched masked agents take Rümeysa Öztürk from our streets with impunity. Right now, immigrant families across the country are afraid to access Medicaid because the Trump administration began sharing Medicaid data with ICE. A mother in Chicago delayed prenatal care until her third trimester because she feared enrolling would put her husband at risk.
This is the context in which our Governor signed a three-year contract with OpenAI and won't show us the data protections. This is why who builds our government's infrastructure matters. And this is why every person in this Commonwealth deserves to see exactly how this choice was made.
How this decision was made
The administration calls this procurement "rigorous" and "transparent." Here is what actually happened. Judge for yourself whether it deserves either word.
The state did not issue an open call for proposals. Instead, it issued a Request for Quotes under an existing contract held by Carahsoft Technology Corporation. Carahsoft was declared the winning bidder on January 27th, two weeks before the public heard about it. The cost is $13 per month per employee, roughly $4.3 million a year.
Why does the name Carahsoft matter? Because Carahsoft is OpenAI's exclusive government reseller, their designated sales channel for every government deal in the country. The "competitive procurement" was conducted through a contract vehicle controlled by the winning vendor's own sales partner. Let me say that again, because it matters: the competition ran through a channel that was built for OpenAI to win.
And the body that recommended this direction, Governor Healey's AI Strategic Task Force, has deep ties to the vendor that won.
- Soundar Srinivasan sits on the Task Force. He is a director in Microsoft's AI program. Microsoft owns 27% of OpenAI and provides its cloud infrastructure.
- Spyros Matsoukas sits on the Task Force. He is a VP at Amazon Alexa AI. Amazon signed a $38 billion deal with OpenAI and is negotiating $50 billion more.
- Vipin Mayar sits on the Task Force. He is an EVP at Fidelity Investments. Fidelity invested directly in OpenAI's most recent funding round.
- Meghan Verena Joyce sits on the Task Force. Her company builds its products on OpenAI's technology.
- Sears Merritt sits on the Task Force. His employer, MassMutual, has deployed OpenAI-powered tools enterprise-wide.
These are accomplished professionals. I don't question their expertise. But I keep coming back to the question I ask about every decision on Beacon Hill: who was not in the room? No consumer advocates. No AI safety researchers. No procurement specialists. No representatives of the 40,000 workers whose jobs will change. No members of the public whose personal data will flow through this tool.
The people shaping our AI future cannot only be the people who directly profit from it. Our communities have expertise too, the expertise of people who will live with the consequences. Right now, that expertise has no seat at the table.
The executed contract has not been released. The competing bidders, if any, have not been named. The evaluation scores have not been disclosed. If this process was fair, releasing the documents would prove it. Governor Healey's silence does the opposite.
Who was left out
Imagine you've worked for the Commonwealth for fifteen years processing benefits applications. One day you're told a tool you've never heard of will now be part of how you do your job, every day, for three years. You weren't consulted. Your union was given a brief overview days before the announcement, and its leaders said the plan that was announced was "very different than what we thought we were agreeing to."
That's what happened to 40,000 state employees. The National Association of Government Employees (NAGE), representing 15,000 of them, described the rollout as "putting the cart before the horse," saying the administration moved forward before completing mandatory collective bargaining. No workforce survey. No feedback mechanism. Workers were treated as recipients of this decision, not participants.
I, like so many of you, believe that workers are not obstacles to progress. They are the people who make government work. Once 40,000 people build their routines around one company's tool, switching becomes practically impossible. They deserved a voice. They did not get one.
What a real evaluation would have found
Other governments looked at the full landscape before choosing an AI vendor. Governor Healey’s administration, as far as we can tell, did not.
The federal Department of Health and Human Services gives its workers access to tools from two vendors, OpenAI's ChatGPT and Anthropic's Claude, because having multiple tools prevents any single company from becoming the government's only option. Maryland started with a small pilot before expanding. Georgia approved multiple vendors through a transparent public process. Massachusetts locked 40,000 workers into a single vendor for three years.
The differences between vendors matter, and I want to explain one that affects every resident whose data will be processed through this tool. When a company wants to handle sensitive government information, the federal government requires it to pass an independent security review, a program known as FedRAMP. The highest level of that review is designed for systems where a breach could cause severe harm. Anthropic's Claude has passed this review in its own right. OpenAI's ChatGPT has not; OpenAI relies on Microsoft's certification rather than its own. For a tool that will process your health information, your communications with state agencies, and internal government documents across every executive branch office, that distinction matters.
Beyond security, there is a question of values. Anthropic has maintained restrictions on military applications of its technology. OpenAI removed those restrictions and is actively pursuing Pentagon contracts through the same reseller that won the Massachusetts deal. OpenAI's technology appears in ICE enforcement operations. As your legislator, I believe Massachusetts should be a sanctuary state, and that's why I am calling for a serious evaluation that weighs these facts. Whether Governor Healey's administration weighed them at all is unknown, because the evaluation has not been disclosed.
This contract was also announced the same week OpenAI disbanded its Mission Alignment team, at least the second safety team the company has dissolved in under two years. Forty-two state attorneys general, including our own AG Campbell, have raised serious concerns about AI safety. Whether any of this factored into the decision is unknown.
I don't know. You don't know. And that is the problem.
What I'm doing about it
I call on Governor Healey to make the following available to the public and the Legislature:
- The complete procurement documents, including the bidder list, evaluation scores, and the rationale for choosing a single vendor over the multi-vendor approach used by federal agencies.
- The full data processing agreement, including specific protections for residents who interact with MassHealth and other safety-net programs, and assurances about how sensitive personal data is firewalled from any other use.
- Conflict-of-interest disclosures from Task Force members whose employers have financial ties to OpenAI, and documentation of how the vendor's safety record, security certifications, and federal enforcement relationships were weighed in the selection.
- A plan for bringing the state workforce and the public into AI governance going forward.
These are not radical requests. They are the basic building blocks of accountable and transparent government. I will continue to push on this, and I am working on additional steps to ensure that the people of Massachusetts have a real voice in how AI is used in our government, not just this contract, but every decision that follows from it.
If the process was fair, release the documents. Let the people of this Commonwealth see for themselves.
What you can do
- Contact the Governor's office. Call (617) 725-4005 or visit mass.gov/governor. Ask for the release of the full procurement documents and the data processing agreement. Ask why communities and workers were excluded from this decision.
- Contact your State Representative and Senator. Ask whether the Joint Committee on Advanced IT plans to hold hearings on this contract. Ask who should have a seat at the table when government makes decisions about AI. Find your rep at malegislature.gov.
- If you rely on MassHealth or other state services: You have a right to know how your personal information will be handled. Demand the data processing agreement be made public.
- If you are a state employee: Your union has demanded to bargain over this rollout. You have collective bargaining rights. Connect with your union representative.
- Share this newsletter. This is how inside-outside organizing works. I push from inside the State House, and you push from outside. Neither of us wins alone.
I'll be honest with you: this is the kind of issue that is easy to look away from. It's complicated. And life is hard enough without worrying about government procurement contracts.
But this is exactly how the decisions that shape our lives get made, quietly, in rooms we're not invited into, by people with interests we aren't told about. I co-founded Act on Mass because I believed that the rules that keep you in the dark about how your representatives vote are the rules that keep the status quo in place. This is the same fight. It's AI instead of parliamentary procedure, but the principle is identical: if you can't see it, you can't shape it.
Massachusetts has 22 people advising the Governor on AI. They include executives from the companies that profit most from the technology. They do not include a single worker who will use it. They do not include a single community organization. They do not include a single resident whose services will be shaped by it.
If we are going to lead on artificial intelligence, we should be the state that proves AI governance can be democratic. That the people affected by these decisions can shape them, not just be subjected to them. That Massachusetts can show the country what it looks like when technology serves the public because the public had a hand in deciding how.
We can build something better than this. And I believe we will.
If the process was fair, release the documents. Let the people of this Commonwealth see for themselves.