
The birth of the workweek

The health and wellbeing of the workforce are closely intertwined with the state of the post-pandemic economy.

From the Great Resignation and four-day workweek trials to employee wellbeing being thrust firmly into the limelight, employers have a unique opportunity to create lasting positive change and redefine the modern workplace. After all, employers who support work-life balance benefit from improved engagement, productivity, and retention, while those who flounder face lower morale, higher absenteeism, and greater attrition.

How will our choices today shape tomorrow’s workplace? Which path should we take to ensure work-life balance? In this blog, we look at work-life balance through a historical lens to consider how we got to this position in the first place.

How we got here

For most of human history, ‘work’ and ‘life’ were not conceptually distinct. From hunter-gathering to subsistence farming, the entire family unit would generally be involved in labor, and the demands of that labor would determine where they lived and what shape their days took on.

This all changed with the Industrial Revolution when the world of work was dislodged from domestic life, and the notion that the two spheres were fundamentally different first took hold.

Clocking in

By the mid-18th Century, a series of rapid technological innovations had fundamentally altered production. Suddenly, industrial machines could be used to mass produce goods, and large factories housing such machines became ubiquitous.

The competitive advantages of mass production were clear, and the vast majority of people had to take work in these factories in order to survive. However, because operating this heavy machinery required brute strength, work became gendered, creating a domestic sphere where women reigned, and a working world that was mostly inhabited by men.

The notion of one’s time as something to be sold also originates here, along with the idea of ‘clocking in’ and hourly wages. In fact, the widespread dissemination of domestic clocks and personal timepieces largely coincided with this movement towards regimented working hours.

By the late 1800s, the number of hours women and children could work each week was legally limited, solidifying the symbolic separation between ‘work’ and ‘life’. The upshot was that workers’ labor could now be quantified in new ways, ultimately leading to the paradigm of work-life balance that we recognize today.

Working 9-to-5: what a way to make a living

The idea of a regular 9-to-5 workday was not always as intuitive as it might seem today; the notion was first forwarded by American labor unions in the mid-19th Century, but it wasn’t until Henry Ford used the model as a way of attracting workers to his car factories that the idea really took off. In 1938, the Fair Labor Standards Act mandated extra pay for hours worked beyond 40 per week, and since the Second World War, the 5-day workweek with a standard 9-to-5 daily shift has been the norm across most developed nations.

This was an exceptionally fast transition: when governments first began tracking the average length of a workweek in 1890, they found it was around 100 hours for manufacturing jobs and 102 for building tradesmen. By the early 1950s, the average worker was putting in 48 hours per week. However, this change was not solely the result of pressure from unions or legislation: it was primarily to do with technological advances and changes to the kinds of work being done. And that means that while progress has continued in many ways, it has not been nearly as linear as we might imagine.


In the wake of the Second World War, extraordinary economic progress led many to believe that a post-work society – powered by technological innovations that would free workers from uncreative tasks – was just on the horizon. The reality, however, has been one of increasingly stratified work – greater degrees of precarity and higher rates of part-time work coexisting with an epidemic of corporate ‘burnout’ – a term tellingly coined in the 1970s.

Since the war, a combination of forces has created greater competition for jobs: globalization has driven wages down as outsourcing to cheaper labor markets has grown; the increasing acceptance of female and minority workers has added greater diversity to the competitive pool; and exponential population growth has produced a far vaster workforce than ever before. The domestic sphere has failed to keep pace with this evolution: female participation in the workforce has not brought about a more equitable division of familial responsibilities, producing persistent tensions and forcing women to juggle work and life in ways that are often detrimental to their wellbeing.

Now, with the proliferation of smartphones and 24/7 email, we have arrived at a moment in history where workers often voluntarily work excessive hours with no obvious distinction between ‘work’ and ‘life’.

Shaping the future

In 1974, the futurist Arthur C. Clarke stood in front of the whirring cacophony of a gigantic computer center. Interviewed by an Australian reporter with impressive sideburns who asked what life would be like in 2001, Clarke predicted that every household would have its own computer console and access to all the information needed to function in a complex society. This, he said, would connect and enrich lives in unimaginable ways: “any businessman or executive could live almost anywhere on earth and still do his business,” Clarke foretold, “and this is a wonderful thing – it means we won’t have to be stuck in cities, we can live out in the country or wherever we please.”

It was not until Covid-19 that this future was realized in any meaningful sense. For much of the developed world, the global pandemic enforced working from home – and home became anywhere with an internet connection that would make do. Global lockdowns forced organizations to try new ways of working. Of course, the situation is more complicated than that – change involves risk and incurs costs. What seems like a reasonable idea to an individual employee can look very different from the top, where an operations manager has to consider how a whole organization or business unit can work.

If we can learn anything from history, it is that progress is a messy business. While the advent of industrial factories brought about unprecedented economic growth, it also shattered previously stable notions of the family unit. Additionally, while technology has freed us from countless mundane tasks, it has also enslaved large swathes of the population to 24/7 availability. Carving out a path forward must bear this in mind. While it’s important that we see past the arbitrary strictures of our current conceptions of work, we must also be wary of throwing them off without a clear sense of what risks might be associated with their replacement.

Head over to The work-life rebalance blog for more insights.

As a global talent partner, we are constantly on the pulse of evolving workplace patterns and their impact on talent management. If you’re an employer looking to make a business-critical hire, or an employee interested in understanding how to navigate this evolving landscape, contact us today.