**Unsolicited advice on the HCI (Industry) Job Market**

Apparently I'm the resident "HCI hater" now (after the HCI doomerism blog post), yet two students were still brave enough to ask me for career advice today. The reality of the 2026 job market is that the "modeling" breakthrough is the sun around which everything else orbits (in fact, that's been the reality in 2025, 2024, 2023, and yeah, you get it).

If you are an HCI PhD, the most productive move you can make is to drop the defensive posture. Stop trying to convince the world that "HCI is important" or that "users matter." If you are still standing in meetings arguing for "the human element" while everyone else is talking about inference scaling, you aren't being a visionary; you're being irrelevant.

To survive and thrive, you have to kill the "HCI Researcher" persona. The core values of human-centeredness (usability, trust, safety) are still your secret weapons, but you need to stop calling them that. The industry doesn't want to hear about "empathy." It wants to hear about utility. It doesn't care about "user delight." It cares about unit economics.

When you pivot to being a **System-Product Hybrid**, you stop being the person who asks for a seat at the table. You become the person who makes the table work. You are the one who ensures that a multi-million-dollar model doesn't fail the "last mile" of deployment. You aren't there to make things "more pleasant"; you are there to make them billable.

In my experience, here are the three most viable ways to position yourself right now:

1. **The Strategic AI PM (The Business Person).** In this role, you're banking on the fact that engineers are often terrible at figuring out what to do with the capabilities they build. You aren't "designing an interface"; you are identifying the **commercial use case**. You're taking a raw capability (e.g., "we can now do long-context retrieval") and turning it into a product that someone will actually pay for. Why you? Because you've been trained to look at the messiness of the world. While the engineer is optimizing the loss function, you are figuring out how to power a mission-critical business process.

2. **The Vertical Domain Architect (The "Last Mile" Person).** AI is a generalist's game. But general models fail in specialized fields. Making a model work for a structural engineer or a pediatric surgeon is **hard engineering** that requires deep context. You are the person who builds the "last mile." You understand the professional constraints that aren't in the model-builders' context window. You have the stomach for the unglamorous, niche work. You aren't building a "better" model; you are building the specific guardrails and data pipelines that let the model survive in a specific industry.

3. **The Evaluation & Alignment Specialist (The Technical Person).** This is the only "Research" role that is actually mainstream. If the model is the meat, you are the **Quality Control and Training Co-Pilot**. The caveat is that you cannot do this with just "user studies." You have to be able to code ML experiments, understand model weights, and know how a change in the reward function will manifest in the model's latent space. ML pipelines find human data "noisy" and "annoying" in its raw form. If you can speak their language and handle the "translation to computational format" side of the training loop (a rough sketch of what that looks like follows below), you are the bridge.
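To make that "translation" concrete, here is a minimal Python sketch of what the bridge work often looks like in practice: collapsing noisy, per-rater user-study scores into clean pairwise preference records that a reward-model trainer could consume. Everything here is hypothetical (the CSV layout, field names like `score_a`/`score_b`, and the simple mean-based aggregation rule); it illustrates the shape of the task, not any particular team's pipeline.

```python
# Hypothetical sketch: turn raw per-rater scores into pairwise preference data.
# Assumed input CSV columns: prompt_id, rater_id, score_a, score_b (all made up).
import csv
import json
from collections import defaultdict
from statistics import mean

def aggregate_preferences(raw_csv_path, out_jsonl_path, min_raters=3):
    """Collapse per-rater scores into one preference label per prompt."""
    by_prompt = defaultdict(list)
    with open(raw_csv_path, newline="") as f:
        for row in csv.DictReader(f):
            by_prompt[row["prompt_id"]].append(
                (float(row["score_a"]), float(row["score_b"]))
            )

    with open(out_jsonl_path, "w") as out:
        for prompt_id, ratings in by_prompt.items():
            if len(ratings) < min_raters:
                continue  # drop under-rated prompts rather than pass noise downstream
            mean_a = mean(a for a, _ in ratings)
            mean_b = mean(b for _, b in ratings)
            if mean_a == mean_b:
                continue  # ties carry no signal for a pairwise preference loss
            out.write(json.dumps({
                "prompt_id": prompt_id,
                "chosen": "a" if mean_a > mean_b else "b",
                "margin": abs(mean_a - mean_b),  # optional confidence weight
            }) + "\n")
```

The point is less the code than the habit: every messy human signal has to end up as a typed, machine-readable record before the training loop will touch it, and being the person who owns that step is what makes you the bridge.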
**The Bottom Line**

HCI was never really the driver of innovation; the model always was. But as models become commoditized, the "AI" world is looking for people who can actually **deploy** these things without them breaking. Stop trying to prove HCI is "important" and start proving that you are the best person to handle the **messy, human-shaped problems**. Is the model really the driver now?