I graduated in Computer Science in the early 2000s.

When I took a Databases class, NoSQL didn't exist.
When I took a Computer Graphics class, OpenGL didn't support shaders.
When I took a Computer Security class, no one knew about botnets yet.
When I took an Artificial Intelligence class, deep learning didn't exist.
When I took a Programming Languages class, reactive programming wasn't a "thing".
When I took a Distributed Systems class, there was no Big Data or cloud computing.
When I took an Operating Systems class, hypervisors didn't exist (in PCs at least).
When I took a Networking class, there was no wifi in my laptop or internet in my phone.

Learn the fundamentals. The rest will change anyway.

This rings very true to my own experience, and yet, more than 8 years ago, when people were giving me this very same advice, I didn't believe them (most of the time, not always, luckily).

I realize this now, and with it the irony that the same must be true for so many people. I wish I had learnt more fundamentals, and I wish I had studied them better. I am doing it now, from time to time, re-learning the fundamentals while casting a condescending eye at my past self.


Hisham H. Muhammad holds a PhD from the Pontifícia Universidade Católica do Rio de Janeiro, uses his homegrown Linux distro, and types in Dvorak into his own text editor (cit.)