On Thursday, March 19, ESCP Business School hosted a new edition of its Cross-generational Dialogues at the Amphitheater Jacques Cœur in Paris. Organised as part of the Cartier-ESCP-HEC Paris Turning Points Research Chair, the event brought together academic leaders, students and business voices to explore a shared contemporary concern: FOBO, the Fear of Being Obsolete.
From Gen Z students preparing to enter the workforce to senior leaders with decades of experience, the discussion made one thing clear: in the age of artificial intelligence, uncertainty is no longer confined to one generation. It is collective.
Opening the evening, Léon Laulusa, Executive President and Dean of ESCP Business School, framed the discussion around a simple but powerful idea: while AI may outperform humans in speed and computation, it does not replace what makes us fundamentally human. In a moment of technological acceleration, he called for more connection, more learning across generations, and more confidence in the value of human judgment.
The panel then brought together Cyrille Vigneron, Chairman of Cartier Culture and Philanthropy, Professor Anne-Laure Sellier of HEC Paris, Professor Benjamin Voyer of ESCP Business School, and Gen Z Observatory content curators Audrey Tsai (HEC Paris) and Alix Nikolov (ESCP), representing the next generation of observers of social change.
What emerged was not a single view of AI, but a rich and nuanced conversation about its promises, tensions and consequences.
From fear to questions
One of the first ideas discussed was whether “fear” is the right starting point. While FOBO captures a real sentiment, the panel questioned the usefulness of fear as a framework. Fear can be paralysing. It can narrow thinking just when openness, experimentation and reflection are most needed.
Rather than asking whether AI should be feared, participants explored a more productive question: what exactly is changing, and how should we respond?
This shift from emotional reaction to critical inquiry set the tone for the evening.
What makes AI different
The discussion highlighted that technological disruption is not new. Previous generations also faced major shifts, from industrialisation to the internet. But AI feels different for at least two reasons.
First, its scope is broader. Unlike earlier changes that affected specific professions or sectors, AI appears to touch nearly every field at once, from education and marketing to healthcare, law, communication and creative work.
Second, its speed is unprecedented. Several speakers pointed to the unusually fast pace of development, experimentation and adoption. Even if actual transformation may still take time in practice, the perception of acceleration is itself reshaping expectations, work and identity.
A shared anxiety across generations
A key strength of the event was its refusal to reduce the conversation to stereotypes. Gen Z is not simply “comfortable with technology,” and senior generations are not simply “resistant to change.” Instead, the panel showed that anxiety and opportunity coexist across age groups.
For younger participants, AI raises immediate questions about employability, learning and professional development. If entry-level tasks are increasingly automated, how will young professionals build experience, judgment and confidence?
At the same time, students also described AI as a tool that can accelerate learning, increase efficiency and push them to raise their own standards.
For more experienced leaders, the concerns were often broader: governance, inequality, over-automation, loss of discernment, and the societal consequences of concentrating power and decision-making in too few hands.
Critical thinking as a common answer
Across these different perspectives, one point generated strong convergence: critical thinking is more valuable than ever.
If AI can produce content, summarise information, automate routine tasks and imitate styles, then human added value shifts elsewhere. It lies in asking better questions, exercising judgment, understanding context, identifying blind spots, and deciding what truly matters.
This has major implications for education. The role of institutions is no longer simply to transfer knowledge, but to help students develop the intellectual reflexes needed to navigate a world where knowledge is abundant, but discernment is scarce.
Human connection remains irreplaceable
Another strong theme of the evening was that AI, for all its power, does not eliminate the need for human relationships. On the contrary, moments of uncertainty often make solidarity, trust and dialogue even more essential.
Several examples shared during the panel illustrated a broader point: when systems fail, when technology reaches its limits, or when decisions become deeply personal, people still turn to one another. This is especially relevant in a world where automation can easily create distance, standardisation and depersonalisation.
Governance, bias and diversity
The panel also addressed one of the most important long-term issues surrounding AI: governance.
AI systems reflect the data, assumptions and structures that shape them. Without vigilance, they can reproduce and amplify existing biases. This makes diversity in design, leadership and technological development not only a matter of fairness, but also a condition for building better systems.
In that sense, the future of AI is not just a technical question. It is a social, ethical and political one.
In the end, the evening did not resolve every tension surrounding AI, nor was that its purpose. What it offered instead was something more useful: a shared space to think, question and listen.
And perhaps that is the most important response to FOBO. Not denying uncertainty, but refusing to face it alone.