The Ant Press will run a short series of articles on the Fourth Industrial Revolution (4IR), starting with this post to introduce and begin to unpack the concept and related aspects of this evolving revolution. We will look briefly at the origins of 4IR and attempt to locate ourselves within this new era, before exploring how the concept translates into practice and the possible impacts and opportunities, particularly in terms of the future of ‘work’, in this period of rapid, disruptive and pervasive technological change.
So… What is 4IR?
Three years ago, 4IR was the central theme of the World Economic Forum (WEF) annual meeting, Davos 2016. 4IR and Artificial Intelligence (AI) have remained prominent each year since, and a substantial body of research documentation has followed. At Davos 2019, the theme was “Globalisation 4.0: Shaping a Global Architecture in the Age of the Fourth Industrial Revolution”, based on the idea that we “are entering a fourth industrial revolution, where a new wave of technological progress will launch us into a new era of globalisation.”
The 4IR, or “Industry 4.0” (manufacturing-focused), is the term popularised by the WEF for the current and developing environment in which disruptive technologies and trends are changing the way we live, work and relate to one another. Such emerging technologies include artificial intelligence (AI), the Internet of Things (IoT), cloud computing, cyber security, the sharing economy, robotics and autonomous vehicles, additive manufacturing (3D printing), cyber-physical systems, genome editing, blockchain technologies, big data and analytics, virtual reality (VR), augmented reality (AR) and hybrid or mixed reality (MR).
How did we get here?
The Industrial Revolution (IR) – now termed the First IR and also regarded as the first wave of innovation (Kondratieff cycle or K-wave) – is dated from the mid-18th century in Britain, when the Agricultural Revolution and rural life gave way to rapid population growth, urbanisation and mechanised production. Enabled by steam and water power, it shifted our reliance from animals, human effort and biomass as primary sources of energy to fossil fuels.
The Technological or Second IR – the age of science or “Age of Synergy” – dates from the end of the 19th century, with the adoption of the production line, first used on a large scale by the meat-packing industries of Chicago and Cincinnati during the 1870s. The slaughterhouse system informed Henry Ford’s design and innovations in the first permanent, moving assembly line in 1913, and the advent of electrically powered, time-efficient mass production and division of labour. The period saw major breakthroughs in power generation and distribution, communications (e.g. telegraph, radio, telephone), new materials and substances including alloys and chemicals, mass-produced pencils and wood-based paper, the internal combustion engine and petroleum.
The Digital Revolution or Third IR – also termed the “Information Age” – began in the late 1940s with the development of digital systems, communication and rapid advances in computing power, which enabled new ways of generating, processing and sharing information. Computerisation, electronics, automated mass production, and information and communications technology (ICT) characterise the Third IR, coinciding with the fifth wave of innovation, which enabled the transformation of the world into a “global village”.
While 4IR is arguably an extension of the Digital Revolution and builds on the widespread availability of digital technologies, it is considered to be a distinct era for several reasons including the complexity and sheer speed of technological breakthroughs, the pervasiveness of scope, the tremendous systems impact of changes, the role of ‘big data’, and “because it will even challenge our ideas about what it means to be human”.
The 4IR sees the dawn of “cyber-physical systems” and the convergence of digital, biological, and physical innovations enabling entirely new capabilities for people and machines and entirely new ways of embedding technology within larger societies, communities, and even in the human body. 4IR has a disruptive effect on all economies and, as with any revolution, there are associated risks and benefits, and a series of social, political, cultural, and economic upheavals are expected to unfold over this century.
4IR, with its associated unprecedented growth in digitisation and internet connectivity, is argued to have “the potential to drive Africa forward like never before, enabling innovation, spurring new business models and improving the delivery of public services.”[iii] In South Africa, the inaugural Digital Economy Summit of 4IRSA – an initiative between several universities, government and major private-sector players – was convened in July 2019, with the theme: “Positioning SA to be Globally Competitive in the Fourth Industrial Revolution.”
OK… now for all the acronyms and a few more definitions.
We’ll keep the “Acronyms and Definitions” list updated, and please feel free to contribute both acronyms and definitions!
ACRONYMS AND DEFINITIONS
4IR – Fourth Industrial Revolution
4IRSA – 4IR South Africa initiative: A platform to bring together key stakeholders, decision makers and pioneers to define the principles, visions and outcomes of Industry 4.0 and its future effects on South Africa, and to engage in constructive discussions, explore best practices and propose solutions to address the challenges.
AI – Artificial Intelligence is the simulation of human intelligence processes by machines, especially computer systems, including learning (the acquisition of information and rules for using it), reasoning (using rules to reach approximate or definite conclusions) and self-correction. Also termed Augmented Intelligence, to emphasise that cognitive technology is designed to enhance and/or reinforce human intelligence rather than replace it. Real-world applications of AI include expert systems (e.g. aviation flight management and autopilot; AI legal research and counsel, such as ROSS) and navigation and self-driving vehicles, which use a combination of computer vision, image recognition and deep learning to pilot a vehicle and avoid unexpected obstructions.
Examples of AI technology:
Automation: Programming a system or process to function automatically. An example is robotic process automation (RPA), which uses software with AI and machine-learning capabilities to handle high-volume, repeatable tasks that previously required human effort. RPA software robots (bots) can mimic a human worker in tasks such as queries, calculations and the maintenance of records and transactions. Three broad RPA technologies are currently identified: probots (process data); knowbots (source and store information); and chatbots or virtual agents (respond to user enquiries in real time).
Machine learning: The science of getting a computer to act without being explicitly programmed, or “the scientific study of the algorithms and statistical models that computer systems use to effectively perform a specific task or program without using direct instructions, relying on models and inference instead.”
Deep learning is a subset of machine learning that, in very simple terms, can be thought of as the automation of predictive analytics, or “an application of AI that helps create these ‘thinking machines’ by providing systems the ability to automatically learn and improve from experience without being explicitly programmed.”
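As a toy illustration of “learning from experience without being explicitly programmed”, the short Python sketch below lets gradient descent recover a hidden rule (y = 2x) from example data alone; the data, learning rate and iteration count are invented for illustration:

```python
# A toy illustration of "learning from experience": instead of hand-coding
# the rule y = 2x, we let gradient descent infer it from example data.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]   # generated by the hidden rule y = 2x

w = 0.0                     # model parameter, initially a guess
lr = 0.01                   # learning rate
for _ in range(1000):       # each pass improves the guess from the data
    # gradient of the mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad

print(round(w, 3))  # converges to 2.0: the rule was learned, not programmed
```

The same principle, scaled up to millions of parameters and layered models, is what deep learning systems do.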
Machine vision: The science of allowing computers to see, where visual information is captured and analysed, using a camera, analogue-to-digital conversion and digital signal processing. Not bound by biology, machine vision can be programmed, for instance, to see through walls, and is used in a range of applications from signature identification to medical image analysis.
Natural language processing (NLP): The processing of human language by a computer programme. One of the older and best-known examples of NLP is e-mail spam detection, while current approaches to NLP are based on machine learning with tasks including text translation, sentiment analysis and speech recognition.
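As a toy sketch of the spam-detection idea (not a production filter; the tiny training corpora and the `spam_score` helper are invented for illustration), a message can be scored by how many of its words occur more often in known spam than in known legitimate mail:

```python
# Toy spam detector: score a message by comparing its words against
# word counts from known spam and known legitimate ("ham") messages.
from collections import Counter

spam_corpus = "win free prize click now free money".split()
ham_corpus = "meeting agenda attached see you at lunch".split()

spam_counts = Counter(spam_corpus)  # missing words count as zero
ham_counts = Counter(ham_corpus)

def spam_score(message):
    """Positive score => more spam-like words than ham-like words."""
    words = message.lower().split()
    return sum(spam_counts[w] - ham_counts[w] for w in words)

def is_spam(message):
    return spam_score(message) > 0

print(is_spam("click now to win a free prize"))  # True
print(is_spam("see you at the meeting"))         # False
```

Modern machine-learning approaches replace these hand-counted word tallies with statistical models trained on large corpora, but the underlying idea of learning word patterns from labelled examples is the same.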
Robotics: A field of engineering focused on the design and manufacture of robots, which are often used to perform tasks that humans find difficult or cannot perform consistently. Robots are used on assembly lines in car production and by NASA to move large objects in space.
[Figure: Components of AI]
AR – Augmented Reality combines real and virtual worlds in an interactive experience, with real-time interaction and accurate 3D registration of virtual and real objects, sometimes spanning multiple sensory modalities, including visual, auditory, haptic, somatosensory and olfactory.
Blockchain – Blockchain is the record-keeping technology behind Bitcoin. At its most basic level, blockchain is literally just a chain of blocks, albeit in a non-traditional sense, where a “block” refers to stored digital records or pieces of information and the “chain” refers to a public database which links the blocks using cryptography. The potential for blockchain technology is not limited to Bitcoin and it has gained a lot of attention in a variety of industries, including non-profits, the arts, and e-commerce.
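The block-and-chain linkage described above can be sketched in a few lines of Python; the `make_block` and `block_hash` helpers below are purely illustrative, not part of Bitcoin or any real blockchain implementation:

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents deterministically with SHA-256."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(data, prev_hash):
    """A 'block' stores a record plus the hash of the previous block."""
    return {"data": data, "prev_hash": prev_hash}

# Build a three-block chain: each block is cryptographically linked
# to its predecessor through prev_hash.
genesis = make_block("genesis record", prev_hash="0" * 64)
second = make_block("second record", prev_hash=block_hash(genesis))
third = make_block("third record", prev_hash=block_hash(second))

def chain_is_valid(chain):
    """Tampering with any earlier block breaks every later link."""
    return all(chain[i + 1]["prev_hash"] == block_hash(chain[i])
               for i in range(len(chain) - 1))

print(chain_is_valid([genesis, second, third]))  # True
genesis["data"] = "tampered record"
print(chain_is_valid([genesis, second, third]))  # False
```

The final two lines show why the structure resists tampering: altering one record changes its hash, which no longer matches the link stored in the next block. Real blockchains add consensus mechanisms and distributed copies of the database on top of this basic linkage.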
Cloud computing – The use of various services, such as software development platforms, servers, storage and software, over the internet, often referred to as “the cloud.” Generally, three characteristics are common to all cloud-computing vendors: the back-end of the application (especially hardware) is completely managed by the vendor; a user pays only for the services used (memory, processing time, bandwidth, etc.); and services are scalable.
CRISPR – Clustered Regularly Interspaced Short Palindromic Repeats
FOBI – Fear of Being Irrelevant
Genome/Gene-Editing – A group of technologies that give scientists the ability to change an organism’s DNA, allowing genetic material to be added, removed, or altered at particular locations in the genome. Several approaches have been developed, such as the CRISPR-Cas9 (CRISPR with associated protein 9) system, which has generated a lot of excitement in the scientific community because it is faster, cheaper, more accurate and more efficient than other existing genome editing methods. CRISPR holds promise for the treatment and prevention of a range of disorders and diseases; however, genome editing is currently illegal in many countries based on concerns about ethics and safety.[xi]
Gig economy – A free market system in which temporary positions are common and organisations contract with independent workers for short-term engagements, where “gig” is slang for “a job for a specified period of time” that was typically used in referring to musicians. Gig ‘employees’ include freelancers, independent contractors, project-based workers and temporary or part-time hires. The digital age enables a gig economy, where the workforce is increasingly mobile and where job and location are decoupled.[xii]
GPU – Graphics Processing Unit
ICT – Information and Communications Technology
IR – Industrial Revolution
IT – Information Technology
IoT – The Internet of Things is a sensor network of billions of web-enabled smart devices that connect people, systems and other applications to collect and share data acquired from their environments, using unique identifiers (UIDs), embedded processors, communication hardware and software applications. The devices do most of the work without human intervention, although people can interact with them to, for instance, set the devices up, give them instructions or access the data. Real-world examples include Smart Homes, Buildings and Agriculture, as well as wearable devices, where sensors and associated software allow for monitoring and automated response and/or remote control.
K-Wave – A Kondratieff Wave is a long-term economic cycle believed to be born out of technological innovation, which results in a long period of prosperity.[xiv]
MR – Mixed Reality or hybrid reality starts with the virtual world and can only be experienced using MR headsets. The digital environment is anchored to the real world, where physical and digital objects co-exist and interact in real time. Using MR, you can see virtual objects just as in AR, but these objects can also interact with the real world. In a sense, MR is a more immersive and interactive type of AR.
The first immersive MR system (with enveloping sight, sound and touch) was the Virtual Fixtures platform, developed in 1992 at the US Air Force’s Armstrong Laboratories.
NASA – National Aeronautics and Space Administration
NLP – Natural Language Processing
RPA – Robotic Process Automation
SA – South Africa
SET – Science, Engineering, Technology
The Sharing Economy – or collaborative consumption – describes a non-traditional means of distributing goods and services (e.g. Airbnb listings; Uber transport).
UID – Unique Identifier
US – United States
VR – Virtual Reality is a simulated experience that can be similar to, or completely different from, the real world. Applications range from entertainment and gaming to education (e.g. medical training) and military training.