AI and the Modern Workforce

I was invited to be a panel speaker for a session at the Festival of Digital Innovation event in BSD, South Tangerang. I was set to be in a discussion session with Fetri Miftach from Xynexys, Yusri Amsal from PwC, and Rendra Perdana from DANA, where the session was moderated by James Lim from Centre for Cybersecurity Institute, Singapore.

This post was written on the night after the event.

I was a last-minute addition to the panel, which is probably why I'm the only speaker from the session whose name and picture aren't listed on the event landing page. But hey, the organizer at least made promotional images with my face on them:

The images are pretty big and irrelevant to what I'm going to write here, so I'll settle for linking to them instead of rendering them here.

The topic of the panel discussion was mostly about how AI is affecting work and the workforce, and how we expect workers and students to adapt to a future of working with AI. In this post, I'm putting a few of the points I brought up in the panel discussion into written form, so friends who are interested in what was discussed can refer to this if needed.

I know at least one friend who seemed interested after I told them my view on one of the points the moderator brought up, so writing it down seemed like a good idea.

I'm writing this in a Q&A format since that's how the discussion was carried out. But I'll focus only on the relevant questions and my own thoughts on each issue, since I probably couldn't represent the other panel speakers' opinions accurately (assuming I even remember them all correctly).

Does the rise of AI technologies change the criteria you’re looking for in candidates you’re hiring for your team?

With or without AI, the qualities I look for in a strong candidate stay the same:

What should academia do to keep up with the latest AI technologies, considering it seems that academia is lagging in preparing their students to leverage AI?

Academia lagging behind industry in adopting new technologies isn't new, and it isn't limited to AI. We've heard people complain for decades about academia teaching students outdated concepts and technologies.

I happened to meet and talk with faculty members from two different universities on two separate occasions within the last six months or so. From those discussions, I found that both universities had basically the same problem: how to teach the latest industry trends to their students.

I can understand that they want to make sure their students can easily adapt to industry demands after graduating. But I think they're approaching it the wrong way.

People always complain that universities teach outdated technologies and practices. That's to be expected: many university lecturers don't work full-time in the industry (except for adjunct lecturers, who may hold full-time industry jobs but are probably less involved with the students), so it's unreasonable to expect them to keep up with industry trends while the industry itself is changing so fast. New tools pop up at a rapid rate, existing tools change quickly with each new release, and industry practices shift constantly.

Given the number of faculty members and lecturers a university can have at any one time, it's unrealistic for them to stay that up-to-date with the industry. Universities with more resources can probably do a better job of keeping up, but even then they can only cover a small slice of the ever-changing industry trends.

I think the better approach is for universities to focus on teaching their students the fundamentals and to make sure the students understand them properly. After all, once someone understands the fundamentals, they should be able to derive the skills they need to adapt to the latest trends from there.

Given that AI can learn very fast, how do you think we should keep up, since it takes humans far more time to learn a skill than it takes AI? What should academia do so that students aren't outcompeted by AI?

Given enough resources, AI can learn at superhuman speed and accumulate knowledge at a rate no human can compete with.

But when the AI is wrong, we never consider the AI itself to be at fault. The blame goes to someone else, maybe the AI's developers or its users, but never to the AI itself.

The AI doesn't care how many mistakes it has made, and it faces no real consequences for making them. With current AI technology, at least, we can't punish a misbehaving AI in a way that makes it feel regret and live with that regret.

Humans are different. They can be punished, their reputation suffers when they make mistakes, and they bear the consequences. Unlike AI, humans must live with whatever mistakes they have made in the past.

Hence, humans have far more at stake than AI does when acting and making decisions. As long as humans have these characteristics (and as long as AI doesn't), there should always be jobs where a human is the better choice.

References

3rd Festival of Digital Innovation (FDI) - BSD 2025