Nika

What will the future of studying be like when AI does everything for us?

Today, I came across an article on TechCrunch: The great computer science exodus (and where students are going instead).

It shows that UC campuses saw a drop in computer science enrollment for the first time since the dot-com crash (6% in 2025, 3% in 2024), but students are shifting to AI-focused programs.

AI-focused programs can be found now at:

  • UC San Diego

  • MIT

  • University of South Florida

  • University at Buffalo

  • Not to mention, China is making AI literacy mandatory and creating AI-specific colleges

How do you think education will evolve, and how will artificial intelligence influence it?

For example, in terms of study programs, attending classes, or teaching methods?

Besides this one trend, there is still the ongoing trend of more dropouts and students who go straight into entrepreneurship instead of continuing their studies, partly thanks to AI, which helps solo founders build their products and startups more effectively.

442 views


Replies

AJ

I think we are going to see an age of mis- or under-educated people, and not through any fault of their own.

We barely understand the cognitive effects of generative AI as it is. We have studies claiming it reduces learning or cognitive abilities, we have people using it as a tutor, and we have the phenomenon of learning by fixing its hallucinated messes.

Truth is we do not understand how to best utilize AI in education at any level. Until we form a better understanding, there will be side effects.

In my view, using AI as a personalized tutor is not the best way to utilize it.

And a distinction should be drawn between foundational knowledge and the knowledge required to perform an economic activity.

Thus I see a future where foundational knowledge is taught to a certain standard, but beyond that, people will learn whatever minimum they need to accomplish their goals with AI's help.

There will also be those who reject it, of course. I foresee a trend of Waldorf-style academies popping up at various levels. I also see a move towards testing that cannot be cheated even with AI. It worries me that right now we do not have an answer, and that by the time we do, so many will have grown up in the mostly failed experiments of trying to find it.

Nika

@build_with_aj I personally think there will be a huge gap between people who are educated (who will try learning with AI) and a larger group of passive consumers. I also read a study where people wrote an essay using AI, vs. Google, vs. their own thinking. The people writing with AI showed 55% lower brain activity.

AJ

@busmark_w_nika Yeah, exactly. TBH I've made a vow not to use AI while learning and to only use it when I need to build something that requires the speed/expertise tradeoff. It's a form of tech debt if you think about it.

If I'm building a commercial project, I will use AI tooling. If I'm learning or building just for fun, I won't.

And I will be launching a couple of things here that are just for funsies: projects that I think are cool, and products in the sense of being a result of my work.

There will be a gap, and we must be mindful of the skills we choose to nurture.

I've written on my blog about the sort of skills that will arise from working with AI a lot. They are mostly operational-expertise skills.

Nika

@build_with_aj I think LinkedIn also published something on emerging skills. Can you please share your article? :)

Kalani Growney

Wow, great article. Like most college students, the uncertainty of finding a job after graduation is nerve-racking. I’m currently pursuing a BA in Art with a focus on UX design, and I’ve decided to add a second major in Information Technology to better prepare for an AI-driven workplace. Even with the additional major, I still feel a lot of uncertainty about job opportunities after graduation.

Nika

@kalanigrowney Are you building something besides attending university as well?

Kalani Growney

@busmark_w_nika I've been building workjourney.ai with my Dad to learn vibe coding.

Sandun

Computer science is shifting because the "boring" parts of coding are being automated. As a founder currently building an AI tool (SwiftTag.ai) on Google Vertex AI, I see this shift every day. We aren't seeing an exodus from technology; we're seeing an exodus from syntax. The reason students are dropping out to become entrepreneurs isn't just impatience; it's that the barrier to entry has collapsed. In the past, you needed a CS degree to build a production-ready app. Today, with Gemini 2.5 Flash and modern orchestration, you can build in a weekend what used to take a team a month.

Nika

@sanduns Are you going to launch here? :)

Igor Lysenko

You’re right that when someone uses AI with their own knowledge and experience, it functions as a tool. But when a new generation uses AI without their own foundation of knowledge, their creativity might simply be limited. It’s possible things could turn out differently, though :/

Nika

@ixord I heard there was a study about AI use and cognitive debt, but I'm not sure whether it was this one: https://arxiv.org/abs/2506.08872

Igor Lysenko

@busmark_w_nika Thank you for the link; I read the article. I am sure that in the future AI will not be as accessible to people, because you need to understand everything yourself. Also, when we are alone and bored as children, the brain makes us come up with ideas, and that idea generation is actually a very good skill when applied to work. If using AI blocks creativity, those people may find problem-solving more difficult than others do.

Gianmarco Carrieri

Really interesting data on the CS enrollment shift. As someone who studied computer science and now builds AI products, I see both sides of this.

The shift from "learning to code" to "learning to think with AI" is real, and I think it's actually healthy. When I'm building Aitinery (AI travel planner), I spend way more time on system design, prompt engineering, and understanding user problems than writing boilerplate code. The skill that matters is knowing WHAT to build and WHY, not just HOW to write a for-loop.

But here's what worries me: if people skip the foundational understanding, they won't know when AI is wrong. In my experience, the best AI builders are the ones who deeply understand the domain they're automating. You can't evaluate AI output on something you don't understand yourself.

I think the future isn't "AI replaces studying" — it's "AI changes what's worth studying." Critical thinking, domain expertise, and the ability to ask the right questions will matter more than ever. The students shifting to AI-focused programs are probably making a smart bet, as long as those programs teach them to think, not just to prompt.

Nika

@giammbo + it is also about ambition. Some people use AI to learn, but some only to burn time (nothing productive, just completing tasks without thinking or any additional effort). So yeah, it depends on personality too.

Gianmarco Carrieri

@busmark_w_nika 100% agree on the personality angle. I see it with my own users too — some use Aitinery as a starting point and then customize everything (the curious ones), while others just hit "generate" and follow the AI plan blindly. The second group often has worse trips because they don't adapt when things change. AI amplifies what you already are: if you're curious, it makes you more efficient. If you're lazy, it makes you lazier. That's the real education challenge nobody's talking about.