Over lunch last week, an industry veteran from a top tech company told me that they had made a hiring policy decision to recruit software engineers only if they had hands-on experience from before November 2022 (when ChatGPT was launched). The conversation was not an anomaly.
There is now global data showing that across domains and industries, fresher recruitment has fallen over the last 2-3 years as companies push for productivity improvements using AI. How far and how quickly this approach will succeed remains to be seen, but most leaders expect the impact of AI to be irreversible.
Across closed-door conversations, panel discussions and board rooms – there is a clear recognition that there is a tectonic shift underway in the nature of work, skills, jobs and careers. Sitting at the intersection of Academia and Industry, we saw these trends a couple of years earlier than most and started adapting to them. Over the past two years, our experience at Atria University has surfaced five lessons that we believe every educator, employer, and student should consider…
1. The Value of Friction
I love AI tools like NotebookLM, Gemini Deep Research and Claude. They have made my life as a researcher and as a professor so much better. The speed at which I can learn new concepts, understand the latest research and synthesize new information has increased phenomenally – thanks to these tools. And yet, I find that many students using these tools actually learn less than they would have without access to them. It took us some time to understand this contradiction.
Whether AI aids or hinders learning depends critically on how you use it. AI eliminates the friction from the learning experience:
- All topics can be explained with a click of a button.
- All questions can be answered instantaneously.
- All doubts can be cleared the moment they arise.

Technology has delivered a personalized tutor for every student. So, what is the issue?
The issue is that Learning needs Friction. To understand a new concept or skill, you must literally change the neural wiring of your brain. That takes effort. Even if you have the explanation on the screen, the answer to the question you asked, the clarification of the doubt that you had, your brain needs to absorb it. That takes time … and effort. Skimming through the ChatGPT response or copy-pasting it to complete an assignment is counter-productive because it creates the illusion of learning without the actual learning.
“AI eliminates friction from learning—but without friction, there is no real learning, only the illusion of it.”
How do we add friction to improve the learning process? How do we slow down and let students absorb the content generated by AI? One experiment we are trying at Atria University (AU) is to allow students to create one-page handwritten cheat-sheets using AI tools and lecture notes. Students are allowed to use this cheat-sheet during the exam. This is a modern-day interpretation of the open-book exam that encourages handwritten note-taking. Using AI tools during note-making is allowed and, in fact, encouraged. However, limiting the notes to one page requires students to summarize, synthesize and internalize the concepts. We are still collecting data to evaluate the results of this experiment, but anecdotal evidence shows that student engagement with course content has gone up.
2. Questions are more important than Answers
A unique challenge in the Indian context is students’ reluctance to ask questions in class. We see this consistently in every new batch of students. A simple solution that works consistently for me is to reward questions: I carry a box of chocolates to class and hand them out to students who ask (good) questions. This increases class engagement and encourages students to think. There is a deeper reason too…
When AI makes answers virtually free – the questions you ask are what differentiates you.
This is probably the single most important challenge for developing human capital over the next decade. One place to search for the solution is Liberal Arts – where the emphasis is on critical thinking over rote memorization. The Socratic Method of Inquiry (similar to the Gurukul method) teaches the art of questioning by relying on class discussions, encouraging students to interrogate arguments, asking “why,” and examining the evidence, rather than accepting information at face value.
These are exactly the skills essential in the AI age because they provide the critical thinking required to write effective AI prompts, verify outputs, ensure ethical AI use, and bridge technology with human context. As part of our “Liberal STEM” philosophy at AU, we have incorporated these pedagogical approaches across the curriculum. We have also introduced dedicated courses like Critical Thinking, Critiquing Data and Problem Decomposition to develop these foundational skills.
The Critical Thinking course, for example, incorporates TED talks, question generation, thinking frameworks, pen-and-paper mind-maps, group discussions, and an introspection-and-reflection journal. The objective is to expand students’ horizons so that they have the breadth to examine problems from different perspectives. It helps students move beyond headlines and “gut feelings” to understand the real trends shaping our planet. It encourages independent thinking by challenging students to ask questions, challenge assumptions, spot flaws in their reasoning and identify cognitive mistakes in arguments.
3. The joy of reinventing the wheel
As AI increasingly provides instant answers, the temptation to skip the struggle is stronger than ever. But the struggle is actually the point. Consider this: “Is a solved problem worth solving again?”
It turns out the answer is a resounding Yes – as far as learning is concerned. It is human nature to find joy in solving a challenging problem and learning something new, even if others have solved the same problem earlier, elsewhere … millions of times. Researchers call this “productive struggle” – the idea that working through difficulty is itself how we learn. Every parent has seen this in action as babies learn to walk, talk, swim, catch a ball, and read.
Formal academic pedagogy, however, treats lectures and information delivery as the primary form of “teaching”. We need to reorient our perspective and realize that knowledge is constructed by learners, not transmitted by teachers. This paradigm shift is critical in the age of AI. When tools like ChatGPT can provide instantaneous answers, learning must focus on questioning, verifying, and applying knowledge.
How do we convert learners from passive consumers into active thinkers?
One way is to help students “discover” concepts rather than tell them. There are many ways to do this. For example, when small teams work on structured activities to construct understanding, the classroom gets very noisy, but concept retention is higher. When students solve real-world problems in group projects by applying what they have learnt, they develop essential, lifelong skills – communication, management, and problem-solving – alongside content mastery.
A word of caution: this approach of learning-by-doing makes the role of the teacher or the professor more important, not less. Designing structured activities that help students discover a concept requires creative thinking and a solid grasp of that concept. If students get lost, they need to be guided and nudged in the right direction. And the transition from intuitive understanding to formal abstractions (e.g. mathematical concepts) requires explanation.
4. The future is AI+X
One common mistake I see many people make is to treat AI as a branch of Computer Science. Modern AI grew out of a very diverse set of fields: math, biology, psychology, physics, electrical engineering and computer science. Its application span will be even more diverse: from healthcare to finance, from agriculture to biotech, from weather forecasting to marketing.
Does this mean that everyone should become an AI engineer?
No. Think of AI like math. The math that a Chartered Accountant uses is very different from the math that a physicist uses. The math that an interior designer uses is very different from the one that the electrical engineer uses. Correspondingly different are the math-based tools: from calculators and spreadsheets to differential equation solvers.
AI will likely follow the same path. How each profession uses AI will be different; the tools will be different, and the knowledge required to use those tools will be different. It is early in this journey, but we can see emerging patterns. Chatbots like ChatGPT and Gemini will become general-purpose tools used commonly across a wide variety of professions (similar to spreadsheets and calculators). Large Language Models are the new microprocessor: software developers will use them as an “Intelligence API” – a building block to add AI capabilities to apps. Scientists and engineers will need domain-specific foundation models (specialized AI) – sometimes as-is and sometimes wrapped in friendly user interfaces.
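To make the “Intelligence API” idea concrete, here is a minimal sketch in Python. Everything in it is illustrative: `call_llm` is a hypothetical stand-in for whatever vendor endpoint a developer wires in (it is stubbed here so the sketch runs offline), and `summarize_ticket` is an invented app function showing how the model becomes just another building block.

```python
def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for a real LLM completion endpoint.
    A real app would replace this stub with an HTTP call to its provider."""
    # Stubbed response so the sketch runs without network access.
    return f"[summary of: {prompt[:40]}...]"

def summarize_ticket(ticket_text: str) -> str:
    """An ordinary application function that uses the LLM as a building block,
    the way earlier software used a spreadsheet engine or a math library."""
    prompt = f"Summarize this support ticket in one sentence:\n{ticket_text}"
    return call_llm(prompt)

print(summarize_ticket("Customer reports the invoice PDF fails to download on mobile."))
```

The point of the sketch is the shape, not the stub: the application logic stays ordinary code, and intelligence is consumed through a narrow function-call interface.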
Universities will need to adapt to this new reality. At AU, for example, we have opened the Minor in AI to all students irrespective of the Major (Degree) they are pursuing; a student earning a degree in Life Science, for example, can simultaneously complete a minor in AI. This is an explicit recognition of our belief that AI will be a horizontal – an enabler and accelerator for multiple disciplines. Now, consider the AI-for-CXO weekend program for senior leaders that we offer: this two-day program is not a shortened form of a 32-credit AI minor. It cannot be. It is a fundamentally different course for a distinctly different audience, intended to achieve a differentiated objective.
5. Lifelong Learning
Last but not least, the longest shadow of AI will fall on learning journeys. In the 20th century, a worker could learn one skill and use it for an entire 20- to 30-year career. Modern economies demand a broad spectrum of specialized skills: from narrow, domain-specific skills (e.g. programming) to broad, foundational skills (e.g. critical thinking).
Many of the narrower skills have a short shelf life – they become obsolete or inefficient as technology evolves (e.g. programming languages, software versions). In the early 2010s, the half-life of technical skills – how long it takes for a skill to lose half its value – was generally considered to be around 5 years. In the AI era (2020s), the rise of AI tools means that entire workflows are appearing, evolving, and disappearing within a year. In fast-changing fields like IT, software development, and AI, the half-life of skills is much shorter, averaging only 2.5 years.
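The half-life framing implies simple exponential decay, and a few lines of Python make the difference vivid. The decay model is my own illustration of the half-life numbers above, not a formula from the cited research.

```python
def remaining_skill_value(years: float, half_life: float) -> float:
    """Fraction of a skill's original value left after `years`,
    assuming simple exponential decay with the given half-life."""
    return 0.5 ** (years / half_life)

# With a 5-year half-life (early 2010s), a skill retains 50% of its
# value after 5 years; with a 2.5-year half-life (AI era), only 25%.
print(remaining_skill_value(5, 5.0))  # 0.5
print(remaining_skill_value(5, 2.5))  # 0.25
```

In other words, halving the half-life squares the decay: the same five years that once cost a professional half a skill's value now costs three-quarters of it.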
- How do we cope with the accelerating pace of change, where skills get outdated in ~2 years?
- How do we prepare today’s generation for a career that will last till 2070? Do we even know what the world will look like then?
- What skills will they need? What role will they play in the economy?
The goal of a modern university must shift from content delivery to fostering learning-to-learn. I have already talked about the growing focus on foundational skills like critical thinking (see Lesson 2) and the importance of learning by doing (see Lesson 3). Both are steps toward shaping self-learners. And yet, this is not enough. We need to break the hard mold of a four-year degree followed by a job and a lifelong career.
It is time to break the one-way street of going from academia to industry. Can we create models where industry professionals can take sabbaticals and go back to universities for upskilling? For a weekend? For a quarter? For a year?
Can we have academic programs (degrees) where students earn half their credits from the university and half from industry? Or where students at the undergraduate (Bachelor’s) or postgraduate (Master’s) level work on industry-sponsored projects – just as PhD students do?
The future is evolving right in front of our eyes. It is time to act and adapt.
Conclusion
The industry veteran who told me over lunch that he only hires engineers with pre-ChatGPT experience was, in his own way, making a profound point — not about AI, but about the irreplaceable value of humans who learned to struggle, question, and think before the answers came easy. His concern is legitimate, but the solution is not to look backwards. It is to ensure that the next generation gets that same foundation, even as the world around them changes faster than any curriculum can keep up with.
A student entering university today will still be working in 2070. No one — not educators, not economists, not AI researchers — can tell you with confidence what that world will look like. What we can say is this: the students most likely to thrive in it are not the ones who knew the most answers in 2025. They are the ones who learned how to keep asking better questions.
“The future will belong not to those who knew the most answers, but to those who learned how to keep asking better questions.”
The changes needed are not waiting for a policy document or a government mandate. They are available to any educator willing to rethink one assignment, any employer willing to offer one sabbatical, any university willing to open one course beyond its traditional boundary. The future will not be built by those who adapted eventually — it will be built by those who started now.