A full-stack developer is a jack of all trades and a highly sought-after job candidate. The title implies a breadth of knowledge that can be invaluable to short-staffed startups and big companies managing complex apps alike.
However, the term “full-stack developer” is controversial among developers. Some disparage the idea that anyone could be equally competent across an entire software stack, while others believe that the term has been so overused by employees and employers that it has become somewhat meaningless.
Defined: What is a full-stack developer?
A full-stack developer is someone competent with the technologies behind the entire application stack—that is, the different layers of technology that make up a modern application. The term is meant to contrast with developers who focus exclusively on an application’s front end (the UI, usually a website or mobile app) or exclusively on its back end (the business logic that drives the application and the database where the information the application needs is stored).
In theory, a full-stack developer would be as comfortable with the JavaScript code running in the user’s browser as they would be with the MySQL queries that get the information the user wants from a database.
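To make that contrast concrete, here is a minimal sketch of both ends of such a stack, written in TypeScript. It assumes a Node.js back end built on the Express and mysql2 libraries; the route, table, and column names are hypothetical, chosen only for illustration.

// Back end (Node.js): an Express route that answers a browser request
// by querying MySQL. The users table and its columns are hypothetical.
import express from "express";
import mysql from "mysql2/promise";

const app = express();
const pool = mysql.createPool({ host: "localhost", user: "app", database: "appdb" });

app.get("/api/users/:id", async (req, res) => {
  // A parameterized query; mysql2 substitutes the ? placeholder safely
  const [rows] = await pool.query(
    "SELECT id, name FROM users WHERE id = ?",
    [req.params.id]
  );
  res.json(rows);
});

app.listen(3000);

And the matching front-end code, running in the user’s browser:

// Front end (browser): fetch the record from the route above and
// render it into the page.
fetch("/api/users/42")
  .then((response) => response.json())
  .then((users) => {
    document.querySelector("#name")!.textContent =
      users.length > 0 ? users[0].name : "not found";
  });

A production application would add error handling, connection configuration, and authentication; the point of the sketch is simply that the two ends involve different languages, APIs, and failure modes, and a full-stack developer is expected to be fluent in both.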
Of course, this calls for mastery of a lot of technologies, a combination so rare that people use the term “unicorn” to describe practitioners. In a long and influential diatribe, developer Andy Shora argued that true full-stack developers are a myth—that everyone has more mastery of some aspects of the stack than others, but that the existence of the term “full-stack developer” encourages people to overstate some of their skills.