Many of the digital world’s greatest inventions have resulted from people experimenting in their spare time.
Thirty years ago, a young CERN engineer called Tim Berners-Lee developed the World Wide Web while attempting to find an easier way to share documents among his colleagues.
At the same time, a Finnish university student called Linus Torvalds was developing a simpler alternative to the clunky Unix operating system used at his university.
The results of these endeavours made their debuts in 1991, and both went on to change the world in very different ways.
Throughout the early Nineties, a keen team of young programmers evolved the Linux kernel (the core of the operating system) from a basic prototype into a fully functioning platform.
By the time Linux 1.0 launched in 1994, the kernel had become a highly sophisticated – and therefore very complex – operating system for desktop and laptop computers.
The following year, Microsoft unveiled Windows 95, and text-based platforms like Linux instantly seemed dated.
Yet the subsequent ubiquity of Windows worked in Linux’s favour.
People unwilling to fund a corporate behemoth (or accept the limitations of rival Apple products) suddenly had a third operating system to choose from.
And while its text interface was rather passé, Linux 2.0’s adoption of network functionality chimed perfectly with the times.
Its potential was quickly spotted by computing companies, and blue-chip partners including Compaq and Dell came on board.
Growing investment and support saw subsequent versions of Linux becoming far more polished, sporting stylish interfaces reminiscent of Windows 10 and macOS Catalina.
Thanks to Linus Torvalds’ generosity, Linux has always been open source.
In other words, its program code can be modified and developed freely, with no usage restrictions.
This caused the ecosystem to fragment into a number of different (and therefore incompatible) distributions, known as distros.
Some of these provided greater compatibility with computer peripherals and third-party software than others, limiting their appeal among less technically savvy consumers.
Fierce competition also erupted between rival platforms like the community-oriented Debian (named after the founder and his girlfriend) and the more corporate CentOS.
Many people settled on a middle ground with Ubuntu, which holds the largest percentage of the Linux market and receives twice-yearly updates.
Because smaller distros are developed by teams of volunteer enthusiasts, regular updates and support can't be taken for granted the way they can with Windows or macOS.
However, the platform’s inherent flexibility has seen it adopted by numerous web applications, including many ecommerce websites requiring product databases.
It underpins Google’s Chrome OS for Chromebooks, powers Amazon Web Services and Google Cloud Services, and runs most in-car engine management software and satnav systems.
Despite these common uses, Linux could easily have remained in the shadow of its two arch-rivals.
Its incorporation into the fledgling Android ecosystem proved to be transformative, as the smartphone sector exploded into view.
Android now dominates the UK smartphone operating system market, meaning most phone users interact with a Linux-based OS on a daily basis.
It’s perhaps ironic that a platform designed to combat failings in established software has become so widely adopted, especially as a community-powered rival to monolithic corporations.
Yet Linux’s stability and versatility secured its place in history alongside other concepts which debuted in the same year – the Web, cyber cafes, the solid-state hard drive and webcams.
Viewed in hindsight, 1991 was the year the modern world was born.