The Gartner Hype Cycle

No, it isn't a new fad for a Peloton device...

What to do when surrounded by people who are losing their minds about the Newest New Thing? Answer: reach for the Gartner Hype Cycle, an ingenious diagram that maps the progress of an emerging technology through five phases: the “technology trigger”, which is followed by a rapid rise to the “peak of inflated expectations”; this is succeeded by a rapid decline into the “trough of disillusionment”, after which begins a gentle climb up the “slope of enlightenment” – before eventually (often years or decades later) reaching the “plateau of productivity”.

[Image: the Gartner Hype Cycle diagram]


(Something like the CD craze of the early 80s, when the geeks told vinyl collectors we were wasting our time. The new CD promised better sound quality, couldn't be scratched and was indestructible. Well, it was crystal-clear sound, but the other two... not so much. Sales peaked in the 90s, but the Millennium brought a decline as the Zombie Generation looked back to the 70s, when vinyl was king, and decided it was time to reassess its value. Now my record collection is worth a great deal more than I paid for it, and CDs are something handy to rest your coffee cup on.)

Given the current hysteria about AI, I thought I’d check to see where it is on the chart. It shows that generative AI (the polite term for ChatGPT and co) has just reached the peak of inflated expectations. That squares with the fevered predictions of the tech industry (not to mention governments) that AI will be transformative and will soon be ubiquitous. This hype has given rise to much anguished fretting about its impact on employment, misinformation, politics etc, and also to a deal of anxious extrapolations about an existential risk to humanity.

Curiously, though, one risk rarely gets a mention in these speculations: AI's environmental impact. How come? Basically, because AI requires staggering amounts of computing power. And since computers require electricity, and the necessary GPUs (graphics processing units) run very hot (and therefore need cooling), the technology consumes electricity at a colossal rate. Which, in turn, means CO2 emissions on a large scale – about which the industry is extraordinarily coy, while simultaneously boasting about using offsets and other wheezes to mime carbon neutrality.

A study in 2019, for example, estimated the carbon footprint of training a single early large language model (LLM) such as GPT-2 at about 300,000kg of CO2 emissions – the equivalent of 125 round-trip flights between New York and Beijing. Since then, models have become exponentially bigger and their training footprints will therefore be proportionately larger.
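The flight equivalence quoted above implies a per-flight figure that is easy to sanity-check. A quick sketch (the two numbers are the article's; the division is mine):

```python
# Back-of-the-envelope check on the flight equivalence quoted above.
# Both input figures come from the article, not independent measurements.
training_co2_kg = 300_000   # estimated CO2 for training one early LLM (2019 study)
round_trips = 125           # quoted round-trip flights, New York <-> Beijing

co2_per_round_trip_kg = training_co2_kg / round_trips
print(f"Implied CO2 per round trip: {co2_per_round_trip_kg:,.0f} kg")  # 2,400 kg
```

Roughly 2,400 kg of CO2 per round trip, which is in the right ballpark for a long-haul economy return, so the comparison holds together.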

Generative tasks (text generation, summarising, image generation and captioning) are predictably more energy- and carbon-intensive than discriminative tasks. Tasks involving images emit more carbon than ones involving text alone. Surprisingly (at least to this columnist), training AI models remains much, much more carbon-intensive than using them for inference. The researchers tried to estimate how many inferences would be needed before their carbon cost equalled the environmental impact of training. In the case of one of the larger models, it would take 204.5m inference interactions, at which point the carbon footprint of the AI would be doubled.
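The break-even figure implies a tiny per-inference footprint. A sketch of that calculation (the 204.5m count is the article's; pairing it with the 300,000 kg training estimate is my own illustrative assumption, since the article doesn't tie the two figures together):

```python
# Break-even logic from the paragraph above: cumulative inference emissions
# equal training emissions after `breakeven` inference calls.
breakeven_inferences = 204_500_000   # quoted figure for one of the larger models

def co2_per_inference(training_co2_kg: float, breakeven: int) -> float:
    """Implied per-inference footprint if inference matches training at `breakeven` calls."""
    return training_co2_kg / breakeven

# Illustrative only: reusing the 2019 training estimate of 300,000 kg.
grams = co2_per_inference(300_000, breakeven_inferences) * 1000
print(f"{grams:.2f} g CO2 per inference")  # 1.47 g
```

About a gram and a half per interaction on those assumptions, which shows why the per-query cost looks negligible even though the aggregate is not.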

John Naughton @ The Guardian

The irony is that we develop technology in order to 'save the planet' and make our lives easier, when in truth it could do neither and could end up making things much worse.
 
I always like considering the Gartner Hype Cycle for everything, including 3D TV, heat pumps, and steam engines in the 1860s.

But, as per the wiki link, it should always be remembered that the graph is subjective to the viewer; not everyone agrees with it. As the article puts it:

With the subjective terms 'disillusionment', 'enlightenment' and 'expectations', it cannot be described objectively or clearly where a technology now really is.

An example: many in the programming industry have already moved through that graph and now use AI as a 'productive tool'.
 
I was interested to read that a supercomputer scheduled to go online in April 2024 will rival the estimated rate of operations in the human brain, according to researchers in Australia. The machine, called DeepSouth, is capable of performing 228 trillion operations per second. It's the world's first supercomputer capable of simulating networks of neurons and synapses (key biological structures that make up our nervous system) at the scale of the human brain. DeepSouth belongs to an approach known as neuromorphic computing, which aims to mimic the biological processes of the human brain. It will be run from the International Centre for Neuromorphic Systems at Western Sydney University.

Blow your mind @ TheConversation.com

If quantum computing is applied within a machine of such capability, would Heisenberg's theory become our new reality?

Bearing in mind that...

With the subjective terms 'disillusionment', 'enlightenment' and 'expectations', it cannot be described objectively or clearly where a technology now really is.
 