Confidence, Trust and Stronger Human Identity in an AI World
Feb/March Leadership Roundtables Summary
In February and March, we conducted a series of discussions with senior leaders on the key insights from NWC’s Professional Confidence in the Age of AI national leadership study, partnering with host organisations in Perth, Sydney and Melbourne.
Across our discussions over the last few weeks, a consistent picture emerged:
AI is already embedded in day-to-day work. Its impact is uneven, fast-moving, and not fully understood. And while the conversation often centres on capability and efficiency, what surfaced most strongly was something more human: confidence, judgement, trust and clarity.
The discussions did not land on a single answer, but they did point to a set of tensions that leaders are now navigating in real time.
1. Clarity is harder to provide, but more necessary than ever
Leaders are being asked to provide direction in an environment that is constantly shifting.
Discussion participants questioned how leaders can offer meaningful clarity when AI capabilities, risks and implications evolve every few months. Traditional roadmaps feel insufficient. In their place, some described the need for clearer principles and more adaptive forms of leadership.
The absence of clarity, however, has a cost. Where leaders are hesitant or inconsistent in how they communicate AI’s role, confidence within teams can quickly erode.
2. Confidence is shifting from knowledge to judgement
A noticeable shift is occurring in how professionals define their contribution.
Where confidence was once grounded in “having the answer”, it is increasingly tied to how individuals interpret, challenge and apply what AI produces. Several participants described moving beyond initial discomfort with AI-generated outputs, towards a more confident stance: using AI as a baseline, and adding experience, context and discernment.
This is not a removal of expertise, but a reframing of it.
3. Critical thinking is becoming a fault line
Across all discussions, there was concern about over-reliance on AI and a decline in critical evaluation.
This was most pronounced when discussing less experienced professionals. Without a foundation of experience, it becomes harder to assess whether AI outputs are accurate, biased or relevant.
The implication is not just a capability gap, but a development challenge.
4. Trust remains unresolved, internally and externally
Trust in AI itself remains uneven. Participants raised concerns about data integrity, ethical sourcing, bias and the limits of AI-generated outputs. At the same time, organisations are grappling with how to ensure employees use AI safely, particularly when much of its use occurs outside formal systems.
Externally, trust also shapes how value is perceived. Even where AI improves efficiency, clients and stakeholders still look for human interpretation, reassurance and accountability before acting.
5. Efficiency is expected; value is being redefined
AI is already changing expectations around speed, productivity and costs. Internal teams can access and synthesise information more quickly. Clients are arriving better informed, often with more questions.
In some cases, there is emerging pressure on fees, based on the assumption that AI should reduce effort.
Yet across discussions, efficiency alone was not seen as sufficient. The work that remains valued is the work that helps people understand what to do with the information – interpretation and guidance through complexity.
6. The human element is not diminishing – it is becoming more visible
Despite advances in AI, participants repeatedly returned to the importance of human capabilities such as judgement, empathy and context – the ability to read a situation, challenge assumptions, and guide others through uncertainty.
These are not new capabilities, but in an environment where information is abundant and easily generated, they are becoming more differentiating.
These discussions point to a subtle but important shift:
As AI expands access to information and accelerates delivery, the basis of professional confidence is moving.
Less about what we know.
More about how we think, how we apply judgement, and how we show up in moments that require interpretation, reassurance and direction.
For leaders, this creates a more complex task:
Not only to adopt and govern AI effectively, but to ensure that their teams have a clear sense of what they stand for, how they create value, and how they operate when the answers are no longer the differentiator.
In other words: what makes us unique and valuable, as humans, in this organisation? Clarity on this will materially shape culture, brand and customer value.
Related material
Rate your team on the Team Brand Confidence Scale
What’s your organisation’s ‘special sauce’?