One of the most dreaded software development methodologies might make sense again for small companies from an operational perspective.
Ken Schwaber — the cofounder of Scrum and founder of Scrum.org — says Waterfall “literally ruined our profession. It made it so people were viewed as resources rather than valuable participants.” With so much planning done upfront, employees became mere cogs in the wheel.
The Waterfall development model originated in the manufacturing and construction industries, where highly structured physical environments meant that design changes became prohibitively expensive much earlier in the development process. When it was first adopted for software development, there were no recognized alternatives for knowledge-based creative work.
The effort required to build a software program was well understood. Although software evolved rather quickly compared to every other type of engineering, there were no fancy, fast machines available that could compile and process our work in half a second, show the result on a screen, or expose it through an interactive layer. Cards had to be punched with all of the program's logic, and only then could a box of punch cards be read by the machines. A typing error generally meant re-punching an entire card.
Of course, looking back this sounds painful, and it was! Several professionals were needed to run a simple compilation. There were people whose only responsibility was to organize cards in boxes. Not a very specialized job, but someone had to do it. Reducing the cost of this cumbersome operation is where Waterfall came in: there was no need to have lots of highly specialized professionals sit and punch cards all day if planning was done beforehand. This worked perfectly. Many errors could be avoided, and less time was needed to develop a program.
Organizations, notoriously slow to adopt new concepts, couldn't keep up with the evolution of machines. Hardware resources became easier and cheaper to access, compiling a program ceased to require physical interaction with the computer, and the web was invented and deployed globally, enabling developers to share knowledge from wherever they were.
Waterfall stopped making much sense in such a fast and collaborative environment. Anyone could have a machine in the comfort of their own home and write programs, some of them disruptive enough to shake the foundations of established market practices. Dedicating months to planning and writing hundreds of pages (commonly entire books) of requirement documentation and diagrams started to sound like an unnecessary step imposed by oppressive management. Changes had to be made.
The old vs the new
In recent years many have talked of Agile as the core, must-follow methodology for software (and even non-software) development processes inside an organization. The common mindset is that it is the answer to every problem: no matter the sector, who or what is involved, or the nature of the work, there will be an Agile framework for you.
Every organization is unique and faces various internal elements (e.g., size and current structure) and external elements (e.g., customers and business model). Much work has been done creating new frameworks that follow the Agile manifesto to meet the demands of different kinds of emerging problems. The simplicity comes from the fact that it's a set of principles to follow and tune to your needs. There's no one-size-fits-all solution. Each sector has its own roles, problems, processes, people, and skill sets. Which mix is good for a team depends on the internal and external factors, wants, and goals.
In an Agile environment, collaboration between teams is far more constant than in Waterfall methodologies, where documentation was written and then passed down to the next in line, and every question was answered with a simple, lazy “read the docs.” Whether inspired by a vision of the future or simply by the accumulated bad experiences that seemed common to everyone in the room, the Agile manifesto came to life. As with every good methodology, only one goal matters: solving a problem.
The (old) new problem
At the time (the '90s and '00s) there was little to no segregation of responsibility for building a good software product. Most of the work was in the hands of developers. User experience wasn't a discipline of its own per se. It was evidently always present, but if a product enjoyed good acceptance, the reason was somewhat a mystery. User interface design was treated as unimportant and produced some dreadful results.
Today, however, you may need several people with different skill sets to build and maintain a competitive product in the market: UX/UI designers, product managers (and their variations), backend developers, frontend developers, mobile developers, infrastructure professionals, etc. (See a pattern here?) It's clear that the need for more personnel and more job titles is increasing by the day.
Add to that the fact that big tech companies are starting to get desperate: there aren't enough professionals to meet their needs. In one decade (2001–2011) wages increased by up to 35%, and by up to 47% in the following one, depending on the area. Could an era of software crises happen again?
The main goal of Waterfall is to reduce the cost of the development phase (the actual code writing). With salaries so high and so few professionals available, the signs point to a comeback of upfront planning. Small companies simply cannot compete with big tech salaries and keep professionals around to “make mistakes faster”.
What do you think? Is it worth planning before the actual execution? Or should everyone use trial and error? Maybe a mix of both? Leave your comments below.
This is not a comparison between methodologies. I'm just pointing out patterns in the history of the industry, the motivations behind them, and the operational and technical problems they try to solve.