I was chatting with a couple of old, crusty software developers who were talking about updating resumes. The theme of the conversation had turned to our advanced age (over 40!) and the impact that being so “seasoned” had on the marketability of developers.
Let’s qualify “seasoned” as 20+ years into a software development career. Clearly, any developer in their 20s or 30s is probably pretty safe from age bias.
Although my buddies were pure software gurus, I hadn’t written a line of code in years. (I do still dream about using a VMS debugger to figure out why my FORTRAN code isn’t working.)
As a manager, I don’t think age is viewed negatively until you have about 30 years in. But for developers, I do think age can have an impact on people’s perceptions.
Keep in mind that under the federal Age Discrimination in Employment Act (ADEA), workers 40 and over cannot be arbitrarily discriminated against because of age in employment decisions, including hiring. But proving discrimination is almost impossible if a job candidate wasn’t invited to interview because of their age.
I have written before about my past as a hiring manager, noting that when I reviewed resumes I always jumped down to the bottom to learn where the candidate went to college. As part of that quick scan, I’d naturally notice their graduation year and do the quick math.
In my eyes, someone with 20+ years of experience has more upside than downside. They typically have learned valuable lessons throughout their career, and their maturity makes them more accountable than younger programmers.
Actually, it isn’t so much how old they are, but where their graduation year fell in the era of software development. Here are some milestone dates that may trigger a preconception of your skill set, depending on the type of job you are applying for.
While listening to Michael Jackson’s classic music (RIP, King of Pop), computer science students were still punching their code onto printed punch cards in college. I do get a kick out of the stories of people who dropped their punch cards on the way to the compiler minutes before a project was due.
I’m sure most of them adjusted to the not-so-tedious move to online compilers, so this year marker is not really a big deal.
There is a good chance they didn’t receive formal education in web development. Even though Tim Berners-Lee invented the World Wide Web in 1989, the mainstream explosion of software built on TCP/IP and HTML didn’t happen until the mid-’90s.
This could be a key demarcation line when looking at the value of someone’s computer science education for any web development position. The good news is that any Windows-based development would have been covered around this time period, which coincided with the release of Windows NT.
Object-oriented development didn’t become popular until the late ’90s, after Java was released. That’s not to say developers didn’t learn OO concepts and design with C++ after 1989, when version 2.0 moved the language into the mainstream.
And for you hard-core OO developers, yes, Smalltalk (the only true OO development environment) was making strides in the early ’90s, but it was unlikely to be a core topic in computer science programs.
Even though the World Wide Web Consortium (W3C) published the first XML specification in 1998, it wasn’t widely accepted and taught right away. Even as late as 2000 the future of XML was being debated, so it’s a good bet that college curriculums didn’t catch up for a few years.
Of course, everyone in college at this time was desperately trying to learn web development so they could join a startup and retire a few years after graduation. So maybe their ambition pushed them to learn it as a leading-edge technology. If not, one concern would be that without XML foundations they may be weaker at SOA development.
This marked the beginning of the .NET era, and I imagine most universities weren’t teaching C#, VB.NET, and ASP.NET until the early-to-mid 2000s. If a hiring manager is looking for a .NET developer, then anyone who graduated in the ’90s is going to be more closely scrutinized.
And those who graduated after the bubble burst in 2000 had more realistic expectations of a long-term career, so maybe their attitudes were better than those of graduates from a few years prior.
In fact, 2001 marked another milestone when considering not what technologies a candidate learned in college, but how a candidate was taught to write code. With the release of Schwaber and Beedle’s book Agile Software Development with Scrum in 2001, the old waterfall approach to managing software projects went out the window. Programmers classically trained in older methods may have a difficult time adjusting to the fast-paced, fluid Scrum methodology.
Anyone who graduated in the mid-to-late 2000s is (hopefully) a pretty safe bet to be well grounded in the latest and greatest development technologies and methodologies.
But hold the phone! (Landline, mobile, or Skype?) Does it really matter what someone learned in their comp sci courses? Have the foundations changed that much since 1990? Many a COBOL programmer has learned OO development. Many a PowerBuilder developer has learned and excelled at .NET development.
To offset any age concerns (whether they are valid or not), it’s critical that your resume shows an earnest effort to stay on top of the latest innovations. For instance, when a candidate showed ongoing education, such as a master’s degree or relevant certifications demonstrating a commitment to staying current in their area of expertise, I’d grant them a reprieve.
However, if they had no significant formal education since their bachelor’s degree, that raised a red flag.
Leaving your graduation year off your resume is likely to draw even more attention to it. After all, your years of experience are pretty obvious from the length of your resume, so what does omitting your graduation year really accomplish?
What does matter, very much, are your salary requirements.
Should a programmer with 30 years of experience who is now a Java expert be paid more than a developer with 10 years of Java experience? Don’t the extra years of experience count for something, even though they aren’t specific to the programming language for the job position?
When it comes down to it, you should have reasonable salary expectations and be proud of your experience. Highlight everything you’ve done throughout your career to stay current.
And if you haven’t stayed current, you’d better start now. To look really hip and leading edge, go take a course on iPhone or Android development.
Then again, about 75% of the world’s business data is still processed in COBOL, so some of us old farts may someday find a lucrative job writing mainframe code once again. Which just goes to show that there is a place for old programmers after all!
Eric Spiegel is CEO and co-founder of XTS, which provides software for planning, managing and auditing Citrix and other virtualization platforms.