As artificial intelligence continues to reshape the tech industry, professionals are left wondering how to stay relevant in an environment where machines handle more and more of the heavy lifting.
However, according to Google Cloud CTO Will Grannis, keeping pace with AI doesn’t mean abandoning traditional knowledge; it means building on it.
In a recent discussion with Business Insider, Grannis stressed that the core principles of computer science are not just still relevant; they are more important than ever.
Don’t ditch the basics—build on them
Despite the rapid rise of AI tools such as Copilot and Codex, Grannis maintains that foundational computer science skills remain vital. “You still have to understand how computers work, how data stores work,” he said. These fundamentals provide the framework needed to design systems that are both effective and useful, even as the way people interact with technology changes.
Grannis encourages job seekers to “lean into the education” they’ve received, underscoring that a traditional computer science degree or coding bootcamp still holds value. While AI may be automating many coding tasks, understanding why and how code works is what sets great engineers apart.
Modern tools require a modern mindset
Holding onto the fundamentals, however, doesn’t mean resisting change. Grannis explained that staying competitive in tech today requires curiosity and a willingness to learn beyond the standard curriculum. He suggests picking up modern tools, experimenting with new systems, and integrating AI into workflows and roadmaps.
At an upcoming hackathon, Grannis is pushing his global team to immerse itself in what he calls “vibe coding,” using AI not just to generate code but to iterate on, refine, and reimagine it. He sees this as a preview of the future of development: foundational knowledge combined with modern adaptability.
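The article stops at the concept, but as a rough illustration, a “generate, then critique, then revise” loop might look something like the sketch below. The generate() helper is a hypothetical stand-in for whichever code-generation model a team actually calls; nothing here is a specific vendor API.

```python
# A minimal sketch of an AI-assisted "generate, critique, revise" loop.
# generate() is a hypothetical placeholder for a real model API call.

def generate(prompt: str) -> str:
    """Placeholder for a call to a code-generation model."""
    # In practice this would call your model client of choice.
    return f"# model output for: {prompt[:40]}..."

def vibe_code(spec: str, rounds: int = 3) -> str:
    """Draft code from a spec, then repeatedly critique and refine it."""
    draft = generate(f"Write code that satisfies this spec:\n{spec}")
    for _ in range(rounds):
        critique = generate(f"Review this code for bugs and clarity:\n{draft}")
        draft = generate(f"Revise the code based on this review:\n{critique}\n\nCode:\n{draft}")
    return draft

if __name__ == "__main__":
    print(vibe_code("Parse a CSV file and return rows as dictionaries."))
```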
Welcome to the era of context engineering
Grannis sees the next frontier beyond prompt engineering in what he calls “context engineering.” This means understanding the entire ecosystem that AI systems need in order to function effectively, from the data they consume to the tools and platforms they interact with.
“We’re moving from the application layer to a more holistic view,” Grannis said. As AI systems evolve into multi-agent platforms, professionals must understand the larger architecture they are building within. Knowing how to write a good prompt is no longer enough; developers must also design the context in which AI performs best.
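As a concrete illustration (not from the article), context engineering can be thought of as assembling everything a model needs before it is ever prompted: instructions, retrieved data, tool definitions, and conversation history. The sketch below uses only hypothetical names and structures; it is not any vendor’s actual API.

```python
# A rough sketch of "context engineering": bundling instructions, retrieved
# data, and tool definitions into a single request before calling a model.
# All names here are illustrative placeholders.
from dataclasses import dataclass, field

@dataclass
class ToolSpec:
    name: str
    description: str

@dataclass
class ModelContext:
    system_instructions: str
    documents: list[str] = field(default_factory=list)   # retrieved knowledge
    tools: list[ToolSpec] = field(default_factory=list)  # what the agent may call
    history: list[str] = field(default_factory=list)     # prior conversation turns

    def to_prompt(self, user_request: str) -> str:
        """Flatten the engineered context plus the user request into one prompt."""
        tool_lines = "\n".join(f"- {t.name}: {t.description}" for t in self.tools)
        docs = "\n---\n".join(self.documents)
        return (
            f"{self.system_instructions}\n\n"
            f"Available tools:\n{tool_lines}\n\n"
            f"Reference material:\n{docs}\n\n"
            f"Conversation so far:\n" + "\n".join(self.history) + "\n\n"
            f"User request: {user_request}"
        )

ctx = ModelContext(
    system_instructions="You are a billing-support agent. Cite sources.",
    documents=["Refund policy: purchases are refundable within 30 days."],
    tools=[ToolSpec("lookup_order", "Fetch an order record by ID")],
)
print(ctx.to_prompt("Can I get a refund for order #1234?"))
```

The point of the sketch is that the prompt itself is only the last step; most of the design effort goes into deciding what data, tools, and history surround it.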
Grannis’s underlying message is clear: the fundamentals are not obsolete; they are the springboard for the next wave of innovation. Simply put, understand the past, but be ready to build the future.