I am interviewing for a job that doesn't exist, and neither does a company that works this way. Probably.
I said to the recruiter and the senior VP, who was slightly goofy in a good way: I'm Vlad, very happy to be here. On the way up I passed your tech team and we got my digital team hooked in. My personal API was accepted, no problems, I was audited a month ago by an EU AI compliance org and got my stamp, good for another six months. So your agents and mine are now running whatever it is that agents drink when they test each other, I saw the brief you sent me in advance and I love the technical questions and I'm sure my team will do well. Now let's talk culture, strategy, and how I can help you move forward.
What's not to get here? Somewhere in the future, I hope that version of me is having this conversation and AI is actually not a buzzword but how we work and operate.
I learned to drive in the nineties, winter, in an Eastern European town, on a very old-school Lada where even the gear stick could come out in your hand if you grabbed it wrong. I never quite mastered getting the car moving on an uphill road and I kept mixing up my feet, not sure where the clutch ended and the brake started. At the end of three brutal weeks of not liking the mechanics of driving, I got out and just wished self-driving cars into existence. They are here now, operating them is another story as we don't have the laws or the regulations yet.
I think AI agents might follow that pattern. The human-agent symbiosis is already working technically and improvements will keep coming. The task itself is solved while the legal, financial, and company infrastructure is not.
Say I have my own digital team of agents, trained on everything on the internet but, most importantly, on the personal knowledge and context I've given them over time. They're powerful. And I can expose my team through an API, make my agents and my underlying system fully queryable except for what I decide is personal, and show up to a new workplace with my fleet already running. Why wouldn't I? Why wouldn't you? Why spend so much time getting up to speed at the new place when I could be solving problems from day one, provided your company has systems I can connect to?
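To make the idea concrete, here is a minimal sketch of what "queryable except what I decide is personal" could look like. Everything in it is invented for illustration: the class names, the privacy flag, the whole schema are assumptions, not a real product or standard.

```python
from dataclasses import dataclass

# Hypothetical sketch: a personal agent fleet exposed as a queryable API,
# with a per-item privacy flag the owner controls. Nothing here refers to
# a real library or protocol; it only illustrates the concept.

@dataclass
class KnowledgeItem:
    topic: str
    content: str
    personal: bool = False  # the owner marks what never leaves the fleet

class PersonalAgentAPI:
    def __init__(self, items: list[KnowledgeItem]):
        self._items = items

    def query(self, topic: str) -> list[str]:
        """Return matching knowledge, silently withholding personal items."""
        return [i.content for i in self._items
                if i.topic == topic and not i.personal]

fleet = PersonalAgentAPI([
    KnowledgeItem("pricing-models", "Tiered SaaS pricing playbook"),
    KnowledgeItem("pricing-models", "My salary history", personal=True),
])
print(fleet.query("pricing-models"))  # the personal item never surfaces
```

The point of the sketch is the boundary: the employer queries capability, and the privacy filter sits on my side of the API, not theirs.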
The legal infrastructure for this doesn't exist yet. Employment contracts were written before anyone's agents were at the table: non-competes, IP agreements, trade secrets, all designed for a world where the thing that learned and built and knew things was you, the human. Now consider: if you work at a company for a year with your AI team, what can you take back when you leave? Who owns what was built together? If AI does the work, who pays the tax? Will companies start paying more for people who arrive with a pre-built team, the way they once paid more for people who brought their own clients?
Microsoft's Work Trend Index 2026 calls the working layer mostly solved. Mode-fit, knowing when to ask versus when to delegate versus when to collaborate, is already the new professional skill. The ownership layer is still blank. What we'd need is a way to track what my agents did and what they learned, so that when we decide to part ways the audit is possible: this is what the company keeps, this is what I take back, these are the skills that developed during our time here, this is the institutional knowledge that stays. We don't have a name for that mechanism yet.
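A sketch of what that nameless mechanism might look like, under one big assumption: that ownership of each artifact is tagged when the work happens, not argued about at the exit interview. The tags, the schema, and the ledger itself are all hypothetical.

```python
from collections import defaultdict

# Hypothetical sketch of the missing audit mechanism: every agent
# contribution is logged with an ownership tag agreed up front, so a
# separation audit becomes a simple partition of the ledger.

COMPANY, PORTABLE, SHARED = "company", "portable", "shared"

class WorkLedger:
    def __init__(self):
        self._entries = []

    def record(self, agent: str, artifact: str, ownership: str):
        """Log one piece of work an agent produced and who will keep it."""
        self._entries.append(
            {"agent": agent, "artifact": artifact, "ownership": ownership}
        )

    def separation_audit(self) -> dict[str, list[str]]:
        """Partition everything built together by who keeps it."""
        split = defaultdict(list)
        for entry in self._entries:
            split[entry["ownership"]].append(entry["artifact"])
        return dict(split)

ledger = WorkLedger()
ledger.record("research-agent", "client churn model", COMPANY)
ledger.record("research-agent", "improved prompting patterns", PORTABLE)
ledger.record("writing-agent", "internal style guide", SHARED)
print(ledger.separation_audit())
```

The hard part, of course, is not the data structure; it is getting an employment contract to agree on the tags in the first place.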
The question behind the mechanism is the harder one: what happens when the thing that learned isn't me but my agent, who is also me, but not only me?
Milena Nikolova seeded this in April, before we'd even started writing together: "The competitive unit becomes the human plus their AI team. Employer-employee bargains change because the capacity a professional brings is now a package, not just a skillset." A 2026 Harvard Business School working paper tracked 244 BCG management consultants working on real client tasks with AI tools available and found three patterns.
- Some fused with it, AI woven into the workflow and judgment intact, and they outperformed peers on both speed and quality — cyborgs.
- Others directed it, keeping strategic judgment human and delegating execution, which also worked — centaurs.
- And some outsourced their thinking entirely and handed over the judgment calls too, called it collaboration, and underperformed on the tasks that required reasoning — self-automators.
The practical thing practitioners need to understand is that cyborgs and self-automators look identical from the outside: same tool, same interface, same shortcut. The difference is only visible in what comes back, whether the judgment was yours or whether you accepted the first answer and called it done. The unit of professional value has shifted from what you know to how you work when AI is in the room. People are landing in very different professional futures depending on how deliberately they've understood that. The people doing the hiring haven't asked this question yet; they should, and they will.
Your professional identity has always been partly about what you know and partly about who you are in the room. What's changing is that who you are now includes the team you bring. Social identity theory tells us people internalize roles: I am the expert, I am the one who figures things out. That internalization is being rewritten, and the question "what do you bring?" has a new answer. Whether you've built something you trust and can articulate, or whether you're still treating it as a tool you borrow, that distance is behavioral. It's a new identity, not a bunch of tools sloppily bundled under the evergreen AI term.
Johnny Mnemonic carried data in his head that corporations wanted access to. Motoko Kusanagi in Ghost in the Shell was inseparable from her network, her professional capacity was her augmented self, queryable, extensible, impossible to fully separate from the organic part. We've been imagining this for decades, in American and Japanese and Eastern European film, in different registers: sometimes as horror, sometimes as liberation, sometimes as a question we couldn't quite answer yet.
The question is getting simpler now.
Do you have an API? Are we ready to work like that?
Sources: Microsoft Work Trend Index 2026 — Beyond the Prompt; Randazzo, Lifshitz, Kellogg, Dell'Acqua, Mollick, Candelon, Lakhani — Cyborgs, Centaurs and Self-Automators: The Three Modes of Human-GenAI Knowledge Work, HBS Working Paper 26-036 (2026); Nikolova, M. — Boundborn framework seed, unpublished (April 2026).
*This article reflects an AI × HI collaboration. The ideas, stories, and perspective are my own, developed through iterative interaction with AI as a thinking partner to sharpen structure, clarity, and expression.*