This piece, written by Mitchell Weiss, originally ran on his newsletter on June 18th, 2024.
I had been sitting anticipating Sam Altman's appearance. He of OpenAI, of ChatGPT. I had been waiting with a colleague, oddly and perhaps portentously for a talk on AI, in Harvard's Memorial Church. It was before Altman appeared and said, "Where I think there is a huge misconception is just how good these tools are about to become." It was before he said, "When we were making the critical decisions, we didn't know we were making the critical decisions." It was before his interviewer asked him, "What do you hope society looks like when AGI gets built?" and he replied, "My answer has changed over time," that my colleague, while we were waiting and shifting in the pews, had asked me a question: "What will AI do in government?"
"A lot."
A lot.
It wasn't the first time I'd been asked this over the last two years. I'd been asked in the US and abroad. By mayors and police chiefs, diplomats and community leaders. As someone who studies Possibility Government, it wasn't the first time I'd been asked, nor wondered myself, what AI would do in government or even what it would be.
What it may be is a Chief of Stuff.
Experts offer many metaphors for what generative AI is. "Generative AI is a co-pilot." (Hence, Copilot.) An "assistant." An "intern." "Co-intelligence." Depending on your opinion of both: "A smart high-schooler." We're also provided anti-metaphors. "It's not a Google search box." "It's not just auto-complete." The metaphors are a way of making sense of the moment and helping people think about how to use these new tools.
I like metaphors. When audiences ask me why I am being "nice" to my AI, I tell them I am not always nice. I say "Please" a lot, but also "Ugh" or "I'm frustrated." But if I seem like I am being nice, I tell them there are three potential reasons you might imagine, and the real one is only the last one. It's the metaphorical one.
- "Just in case." People laugh, but it's not meant to be just funny. I heard this one from my brother and a colleague. I share it because it is important to recognize the real dangers. The non-zero existential ones. The ones from superintelligence.
- Because it will be more helpful to you. But this is not a reason to be overly nice. Sycophancy is not good in humans or AI.
- Because it helps me think of AI as a person. Because it helps me think about using AI as much more than "a Google search box". Because when I interact with AI tools like people, I ask them to do more and be better.
But "AI is a person" is not a nuanced metaphor.
So here's one "Generative AI is..." for government users and watchers. What is generative AI? It's a Chief of Stuff.
This one came to me on the heels of my friend's question and a career that preceded it. I'd earlier mulled writing a book about chiefs of staff by that title. (All former chiefs of staff think they have a chief of staff book in them. I was no different and probably no more correct.) It would have been about all the roles chiefs of staff played. I'd write the book - I once thought - one chapter, one chief of staff role, one canonical chief of staff exemplar at a time. Or even better, I later came to imagine: one value at a time. "Creativity." "Accountability." "Empathy." "Candor." "Resilience." "Relentlessness." So many government officials do "a lot" in government. But chiefs of staff were on my mind.
This metaphor came to me from my days in the role. I'd been called many names as a chief of staff, not all to my face. But one came racing back waiting for Altman, thinking about all that AI might do in government. When I was in Boston's City Hall more than a decade ago, Tom Tinlin, our transportation commissioner and resident wit, had made it his "good morning" to me. "Chief of Stuff." It was a term of endearment (although it needn't be; more on this below) and a recognition of just how much every day had in store.
There are three reasons Chief of Stuff seems an apt metaphor for AI in government. Variety is the first one. Stuff. The other two are Agency and Power.
Variety.
Chiefs of staff do a lot of stuff; AI can, too. One thing I loved about my job was how varied it was. In the role, I came up with new ideas. I assessed others' new ideas. I stood in for the mayor when he couldn't be around. Followed up on tasks. Tracked results. Gave directions during crises. Ordered food. Inspected neckties for spots. Inspected streets for potholes. Gathered data. Presented data. Wrote speeches. Trashed speeches. Played adversaries in debate preparations. Gamed out difficult conversations. Gamed out politics. Hired and fired people. Mentored them. I learned. I invited and arm-twisted. I cooperated and competed. Led and followed. Worked alone and on teams. Respected the old and raced after the new. I advised on lots of decisions. AI can do all of these things.
I was asked recently about the top 10 things I'd use AI in government for. "Chief of Stuff" invites us to think about a top 1,000. Specifically, I'd been asked about AI-enabled citizen chatbots for government. I said it wasn't at the top of my list, but in any event, Chief of Stuff invites us to think that the list is really long.
Five categories for uses of AI in government could be:
- Decision aiding.
- Automation.
- Data analysis and visualization.
- Simulation.
- Communication.
Each has hundreds of sub-uses. There are other categories with hundreds more.
There are many policy areas, too. Not long ago, building permits were reported to take close to two years in housing-starved San Francisco. DMV wait times can be close to an hour in impatient L.A. A nation facing waves of immigrants and asylum seekers had a backlog in immigration court of 3 million cases at the end of last year. Almost a million veterans we made promises to are waiting to hear the outcomes of their claims for pensions and disability payments. Generative AI tools might be brought to bear in all those places and more.
And they might be brought to bear on the details. I looked at the process for opening a bakery in one major US city. All over I could see where AI tools now or soon could expedite the process for the public and the public workers. (There are implications here for fairness to the public and for the nature and number of government jobs. I will write more on both.) But the simple exercise helps one see that AI isn't just for the high-level. It's also in the nitty gritty, and that's where good chiefs of staff operate, too.
Across all levels of government and around the world, public leaders are exploring AI for the high-flying and the nitty gritty. If you give mayors and other city leaders, for example, encouragement to try AI and an invitation and place to share their uses, as Bloomberg Philanthropies has, they will do both. I'm partial to the mayor who had an AI tool read his nasty-grams so he could be spared the filth but still get the substance of constituent concerns.
AI does not do all of these things accurately or easily. AI has a "jagged technological frontier" in government, too. It will do some things well and some even similar things not well. I will write about government's jagged frontier. For now, use AI in areas you know well so that you know whether it is getting things right.
Chiefs of staff do a lot of things. Even the best don't do all of them well. But the best do get better, so don't freeze your impressions. Keep experimenting. Across a variety of uses.
Agency.
Chiefs of staff delegate. "Chief of Stuff" invites us to think about delegation, about granting agency. Chief of staff often conjures up the vision of a consigliere or an aide-de-camp: a singular, loyal partner to the principal. I often have to remind people that the chief of staff is also the head of the principal's staff. In many organizations a chief of staff oversees a team, and in some government organizations these teams can be large. In the mayor's office where I worked, the team was several dozen, and I had dotted-line responsibility for many of the two dozen more cabinet members. Chiefs of staff grant some amount of autonomy to those team members to do things, to act. Chiefs of staff therefore manage. What is here and coming is that government managers will manage both people and robots. They will continue to set goals and delegate some amount of autonomy in achieving those goals. What is new is that some of those goals will be pursued by people and others by AI, with agency.
I gave myself a tiny peek into agentic government when OpenAI provided the ability to build custom GPTs months ago. Just to experiment, I made a quick version of an AI "Chief of Stuff." (It so happened that I had on hand a job description for a chief of staff. I get asked with some frequency by mayors what they look for in one. So often, in fact, that I'd written one up and could upload it as part of Chief of Stuff's "knowledge base.") When it came time to "train" the GPT to act like a chief of staff, I told it to have what I thought were the essential qualities of a good one.
There isn't much to this custom GPT I made. I trained it on very little. I could have trained it on much more. (Imagine that I fed it performance reviews of my work, examples of other chiefs of staff I respected and their behaviors, including some legendary ones. Imagine if I also asked its human principal how more specifically she would like it to act, and then trained it on those inputs, too.) Moreover, to ask it to craft an agenda for a meeting on snow, as I did in this quick video, is to ask very little of it. And even so, I think many would argue that a real chief of staff would have done considerably better. I think I would have.
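For readers curious what such a bot amounts to under the hood, here is a minimal sketch in Python. The job description and quality list below are illustrative placeholders I've invented, not the actual documents described above; the assembled messages could be handed to any chat-completions-style model endpoint.

```python
# A minimal sketch of an AI "Chief of Stuff": a system prompt built from a
# job description ("knowledge base") plus desired qualities, assembled into
# the messages a chat-style model API would receive. The texts here are
# illustrative stand-ins, not the author's real documents.

JOB_DESCRIPTION = """Serves as the mayor's chief of staff: sets agendas,
tracks follow-ups, prepares briefings, and coordinates the cabinet."""

QUALITIES = ["creativity", "accountability", "empathy", "candor",
             "resilience", "relentlessness"]


def build_system_prompt(job_description: str, qualities: list) -> str:
    """Fold the knowledge base and the desired qualities into one instruction."""
    return (
        "You are a Chief of Stuff for a mayor's office.\n"
        "Act according to this job description:\n"
        f"{job_description}\n"
        f"Embody these qualities: {', '.join(qualities)}."
    )


def build_request(user_task: str) -> list:
    """Assemble the message list a chat-completions endpoint would receive."""
    return [
        {"role": "system",
         "content": build_system_prompt(JOB_DESCRIPTION, QUALITIES)},
        {"role": "user", "content": user_task},
    ]


# Example task, echoing the snow-meeting request described above.
messages = build_request(
    "Draft an agenda for tomorrow's snow-preparedness meeting.")
```

Sending `messages` to a model is the easy part; the point of the sketch is that the "training" here is just a carefully assembled instruction, which is why richer inputs (performance reviews, exemplars, the principal's preferences) could make the bot considerably better.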
But even a minimally trained and tasked custom bot like this augurs the arrival of agentic government. Public officials may task their AI tools to do work for them and may task multiple AI tools to work in teams. Public leaders may even task AI agents to manage other AI agents. AI will put the "agent" in government agents. (I am aware of how dark that sounds and could even be.) I will write more about this shortly, too.
Power.
Chiefs of staff have a complicated relationship with power, and so do (will) we with AI. Chiefs of staff can be unceasingly loyal. They can faithfully channel their principal's pull. But they can also usurp, machinate, and manipulate. Chiefs of staff say things like, "There is no space between the Mayor/Governor/Secretary and me," but also say, "What the Mayor/Governor/Secretary wants..." and not precisely truthfully or omnisciently. Chiefs of staff think they know better. Other people can think chiefs of staff know more than they do, and chiefs of staff can let them think that. Chiefs of staff can be overestimated. Same for AI. But chiefs of staff, and AI, can be underestimated too. In my observation, AI is being underestimated and under-anticipated in governments, even for all its current estimation and anticipation. And one under-anticipation is its power.
The AI way to think about power and AI in government would be to think about "alignment." To what extent will how AI acts for government stakeholders be consistent with those stakeholders' desires? Our mental model might be that chiefs of staff work for their principals and serve at their pleasure. That is not always true. Savvy principals are alert to that. (Even the mayor boss I much loved wanted to know my constant whereabouts; that attitude was part of what got him elected five times.) Our mental model may be that AI will work for us and serve at our pleasure. That may not always be true, especially as AI's capabilities increase. "Chief of Stuff" invites us to think about AI for government and its power and its predilections. You might be imagining your AI as Leo McGarry. It might be Doug Stamper.
What will AI in government be? Recently, I ran this "AI in government is..." question back not just metaphorically, but bureaucratically. What if AI had a government job, what GS would it be?
Many federal US jobs are paid according to grades on the General Schedule. The levels are based on education and experience. The Partnership for Public Service summarizes them this way:
- GS-3 or GS-4: typically internships, student jobs, or lower-level administrative work.
- GS-5 to GS-7: mostly entry-level and administrative positions.
- GS-8 to GS-12: mostly mid-level technical and first-level supervisory positions.
- GS-13 to GS-15: top-level technical and supervisory positions.
Where would AI Jane or AI Joe rank, their first day on the job? Might they be beyond this list?
There are of course jobs that grade beyond a GS-15. Many of the people in them are part of the Senior Executive Service. The SES. These leaders have consequential authority. Their jobs require navigating substantial complexity. They coordinate and they collaborate. And they do so often over wide portfolios.
Many chiefs of staff grade beyond GS-15. They are in the SES.
What will AI be in government? One provocative way to think about it would be to envision it as part of the SES.
For now, a more practical question than "What will AI be in government?" is "What will you be in government with AI?" There is an AI talent surge underway in the US government. The same is true across the globe and, to varying extents, at sub-national levels. States and cities are in on the action, too. These are institutional efforts. The aim is to bring people into governments who can ably use AI tools in the public's service, to help regulate AI, and to help nations stay at the cutting edge. Some efforts have more specific targets for up-skilling the public workforce in AI. But all bring to mind a talent surge analogous to the institutional one: the individual one. How will you become proficient in these tools? How will you, in your government job, put them to best use?
One way to begin to think about this is to use generative AI - with its variety, its coming agency, and its rising power - like your personal chief of stuff and then to be one yourself. A Chief of Stuff.