
Small City, Real Choices

by Alton Henley


The conference session on AI assumes you have an IT department.

You don’t.

You have eight employees, and you are personally responsible for permits, parks, code enforcement, public works coordination, council preparation, and HR. “Figure out AI” is not a line item that fits anywhere in that list.

This is not a failure of ambition. It's a description of the typical American municipality: of the country's roughly 19,500 municipal governments, the large majority are small operations like this one, almost entirely absent from the national conversation about AI in the public sector.

What the Data Shows

The ICMA 2024 Survey on Artificial Intelligence in Local Government reveals stark disparities in AI readiness by community size:

| Metric | Small Communities | Large Communities |
| --- | --- | --- |
| Rate AI as high/moderate priority | 15% | 57% |
| Using AI in at least one service area | 40% | 73% |
| Have established AI policies | 2% | 25% |
| Have hired or appointed AI staff | 2% | 27% |

A 12.5:1 ratio on policy adoption. A 13.5:1 ratio on designated AI staff. These aren't small differences; they describe a fundamentally different universe of readiness.
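Those two ratios fall straight out of the table. A minimal sketch, using only the ICMA percentages quoted above, recomputes the gap for each metric:

```python
# Disparity ratios from the ICMA 2024 survey figures quoted above.
# Each entry: (metric, small-community %, large-community %).
survey = [
    ("High/moderate priority", 15, 57),
    ("Using AI in a service area", 40, 73),
    ("Established AI policies", 2, 25),
    ("Hired/appointed AI staff", 2, 27),
]

for metric, small, large in survey:
    ratio = large / small  # large-community rate relative to small
    print(f"{metric}: {large}% vs {small}%  ->  {ratio:.1f}:1")
```

The last two rows print 12.5:1 and 13.5:1, the gaps cited above.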

The open-ended survey responses bring these numbers to life:

“We’re still trying to get reliable internet service to all our residents. AI feels like science fiction.”

“We don’t have the legal expertise to draft an AI policy, and we can’t afford outside counsel for something like this.”

“I wear six hats on any given day. Adding ‘figure out AI’ to that list just isn’t realistic.”

Two Different Problems Wearing the Same Face

Before deciding how to respond to these disparities, it’s worth distinguishing between two different explanations.

The first is capacity constraints. Small communities want to use AI—or might benefit from it—but lack the staff, expertise, and time to do so safely. ChatGPT doesn’t cost more in rural Kansas than in Los Angeles, but evaluating it, governing it, and training staff to use it responsibly requires capacity that small communities often don’t have.

The second is appropriate fit. Small communities may have correctly assessed that AI doesn’t match their service delivery model. When a community of 3,000 residents relies on personal relationships—the city clerk knows every resident, the public works director lives down the street—automation may not improve governance. It may undermine it. And the fixed costs of responsible AI implementation (policy development, oversight, training) don’t scale down proportionally. For a community with eight employees, governance overhead may exceed productivity benefit.

The honest answer is that both things are true for different communities, and often for the same community at the same time. Some small communities are capacity-constrained in ways that block genuinely valuable applications. Others have correctly concluded that AI doesn’t fit their needs. Effective support has to acknowledge this distinction rather than assuming every small community needs the same intervention.
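To see why the fixed costs bite hardest at small scale, consider a back-of-the-envelope sketch. Every number in it (120 hours of annual governance overhead, 10 hours saved per employee) is a hypothetical placeholder, not survey data; the point is only that a fixed overhead spread across eight employees looks very different from the same overhead spread across 300:

```python
# Hypothetical break-even sketch for AI governance overhead.
# All figures are illustrative placeholders, not measured values.
GOVERNANCE_HOURS = 120     # assumed annual fixed cost: policy, training, review
SAVED_PER_EMPLOYEE = 10    # assumed annual hours saved per employee

for employees in (8, 50, 300):
    saved = employees * SAVED_PER_EMPLOYEE
    net = saved - GOVERNANCE_HOURS
    print(f"{employees:>3} employees: {saved:>5} hrs saved "
          f"- {GOVERNANCE_HOURS} overhead = {net:+5d} hrs net")
```

Under these assumed numbers, the eight-employee office comes out 40 hours behind; only at larger scale does the fixed overhead amortize.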

One Voice Worth Hearing

One open-ended ICMA survey response stands out. The administrator of a small West Coast city—one of the most incisive observers in the entire survey—described it this way:

“There is an irrational exuberance that is building up in the profession from championing AI use in local government. I see it trumpeted as a miracle answer to manage workload, with lip service towards accountability for AI outputs. It can alleviate workload, but only in exchange for more rigorous oversight and a greater need for personal accountability for automated output, i.e., a different kind of work, but work nonetheless.

“Local government must reliably deliver basic day-to-day services while striving to balance excess demand for services in a scarce-resources environment. As stewards of our communities, we should always curiously and prudently experiment with efficiency and effectiveness to a degree that aligns with our elected officials’ community vision.”

This administrator has done what the survey data shows most leaders haven’t: thought through what AI actually requires for daily operations, weighed costs against benefits, and arrived at a position grounded in reality rather than enthusiasm or fear. This is exactly what closing the knowledge gap is supposed to produce.

A Decision Framework

Not adopting AI is a legitimate decision. But it should be a decision, not a default. A few indicators:

AI is probably not right for you now if:

  • Your service delivery depends primarily on personal relationships, and residents prefer it that way
  • You have fewer than 10 employees and no one with time to own even basic AI governance
  • Your technology infrastructure can’t reliably support the tools you already have
  • The problems you’re trying to solve are better addressed by hiring, training, or process improvement

AI may be worth exploring if:

  • Staff are already using consumer AI tools informally—ChatGPT for drafting emails, translation apps—without any organizational guidance
  • You face a specific recurring bottleneck that matches a lower-risk use case: grant writing, meeting minutes, document drafting
  • A neighboring community or regional council offers shared governance support you could join
  • Workforce shortages are making it difficult to fill positions that AI could partially augment

If you decide “not now”:

  1. Document the decision and why. A one-page memo explaining that your community assessed AI readiness and concluded it isn’t the right fit protects you from future pressure and gives you a baseline for revisiting.
  2. Monitor for invisible AI. Even if you don’t adopt AI deliberately, it may arrive through software updates or informal staff use. Know what you have.
  3. Stay connected. Join your state municipal league’s AI updates. When your circumstances change, you’ll have the information to act.

Communities that work through this and conclude “not now” haven’t failed to adopt AI. They’ve made an informed decision.
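For managers who want to walk through the framework deliberately, the indicators above can be encoded as a short interactive checklist. The questions paraphrase the bullets in this section, and the simple decision rule (any "not now" signal means wait) is one possible reading, not an official instrument:

```python
# Interactive sketch of the decision framework above. The questions
# paraphrase this section's indicators; the "any red flag means wait"
# rule is one possible encoding, not a validated assessment tool.

NOT_NOW = [
    "Does service delivery depend primarily on personal relationships",
    "Do you have fewer than 10 employees and no one to own AI governance",
    "Is your infrastructure unreliable for the tools you already have",
    "Would hiring, training, or process improvement solve the problem",
]

WORTH_EXPLORING = [
    "Are staff already using consumer AI tools without guidance",
    "Does a recurring bottleneck match a lower-risk use case",
    "Does a neighboring community or regional council offer shared support",
    "Are workforce shortages leaving work undone that AI could augment",
]

def yes(question: str) -> bool:
    return input(f"{question}? [y/N] ").strip().lower().startswith("y")

def assess() -> str:
    # any() short-circuits: the checklist stops at the first "yes".
    if any(yes(q) for q in NOT_NOW):
        return ("Probably not now. Document the decision, monitor for "
                "invisible AI, and stay connected for future updates.")
    if any(yes(q) for q in WORTH_EXPLORING):
        return "Worth exploring a narrow, lower-risk pilot."
    return "No strong signal either way; revisit when circumstances change."

if __name__ == "__main__":
    print(assess())
```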

What Small Communities Actually Need

The survey responses—not the statistical patterns, but the words behind them—suggest what would actually help:

Support designed for communities without IT departments. Resources that assume the city manager is also the finance director, the HR department, and the person who plows the parking lot in a snowstorm. Not template policies designed for cities with 300 employees, but guidance that works when you have eight.

Peer connections to other small communities facing the same decisions. Not case studies from Denver and San José—valuable as those are—but from places that look more like your community.

Protection from coercion. One survey respondent put it plainly: “I’m afraid AI will be imposed upon us.” State mandates, vendor requirements embedded in software renewals, professional pressure from conferences—any of these can push communities toward adoption they’re not ready for and don’t want. Support systems need to explicitly protect the right to say no.

The goal isn’t to push AI adoption on communities that don’t need it. It’s to ensure that communities aren’t blocked from beneficial applications solely by capacity constraints—and that communities considering AI have honest information about what it actually requires.


Alton Henley is the author of The Knowledge Barrier: AI Adoption in American Local Government and Ready or Not: A City Manager’s Guide to AI in Local Government.
