Why do you think this has been one of the most unusual hiring periods of the past decade or so, and how has AI affected that lately? “I do fundamentally think we’ve had a platform shift. We had this around mobile. We had this around e-commerce. Or, if you go back far enough, we had the shift from mainframes to client-server. So, I do believe this [AI] is fundamentally a platform shift.
“From that perspective, the most critical thing when I sit down with clients is data; I always ask them, ‘How’s your data doing?’ We all know nobody has perfect data, and in the AI world data is going to become even more important. If it was difficult to manage your data before, it’s even more so now with graph databases and vector databases in the mix. We see a lot of investment by enterprises in getting their data right for AI, and that translates into ensuring you have the right resources: data architects, analysts, AI engineers and all those sorts of positions driving it.”
A lot of organizations are relying on cloud-based AI services from the likes of Microsoft, Oracle, Amazon and Google. Are you seeing an increase in the use of proprietary small language models based on open source versus the large language models (LLMs) offered through SaaS-style services? “I think it’s both. I think it’s still very early days. I think most enterprises are continuing to work with large language models, and I think that will be the trend over the near horizon. Most of the key cloud providers, and even the frontier [companies], are building small models, too. I believe over time you will see specialization, verticalization and small models being of distinct value.