
Feeling the AGI yet?

  • Writer: Atmn Intelligence
  • Apr 23

Updated: Jul 21

Understanding the scaling hypothesis in AI & estimating the arrival of AGI.



The scaling hypothesis in AI explains the emergence of AGI as the result of training an AI model on “D” amount of data with “C” amount of compute for “T” amount of time.


The output is a network with “N” connections.


We call this network AGI-equivalent for its ability to solve any problem that is potentially solvable by human beings.


Using the above variables, we can benchmark a frontier AI model against the human brain.


Data (D)

A human brain has ~86 billion neurons, which are estimated to hold ~2.5 petabytes (2.5e+15 bytes) of storage. GPT4 was trained on a data set of ~10 trillion words, or roughly 4e+13 bytes. So the training data in an AGI-equivalent model would be 62.5X the size of GPT4's.
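
For a quick sanity check, here is a minimal sketch of that arithmetic in Python (all numbers are the estimates quoted above):

```python
# Rough data comparison: estimated human-brain storage vs. GPT4 training set.
BRAIN_STORAGE_BYTES = 2.5e15  # ~2.5 PB across ~86 billion neurons (estimate)
GPT4_DATA_BYTES = 4e13        # ~10 trillion words, roughly 4 bytes per word

print(f"Data gap: {BRAIN_STORAGE_BYTES / GPT4_DATA_BYTES:.1f}X")  # -> 62.5X
```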


Compute (C) & Time (T)

A human brain is estimated to deliver ~1 exaflop of compute, roughly equal to a cluster of 51.3k A100s. GPT4 was trained on a cluster of ~25,000 A100 GPUs, each capable of delivering 19.5 TFLOPS in FP32 or FP64 Tensor Core format. Increasing the cluster size or the training time by just over 2X would remove compute as a limitation on AGI arrival.
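
The same comparison in Python, using the per-GPU throughput and cluster size cited above:

```python
# Rough compute comparison: estimated brain throughput vs. an A100 cluster.
BRAIN_FLOPS = 1e18     # ~1 exaflop (estimate)
A100_FLOPS = 19.5e12   # 19.5 TFLOPS per A100, FP32 / FP64 Tensor Core
GPT4_CLUSTER = 25_000  # ~25k A100s reportedly used to train GPT4

brain_equivalent_gpus = BRAIN_FLOPS / A100_FLOPS
print(f"Brain ~ {brain_equivalent_gpus / 1e3:.1f}k A100s")          # -> 51.3k
print(f"Cluster gap: {brain_equivalent_gpus / GPT4_CLUSTER:.2f}X")  # -> 2.05X
```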


Number of connections (N)

A human brain is estimated to have ~100 trillion synapses. GPT4 has 1.76 trillion parameters, which can be considered analogous to synapses. By this comparison, an AGI-equivalent model would need to be ~57 times bigger than GPT4.
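
And the parameter-count gap, under the synapse-to-parameter analogy:

```python
# Rough connectivity comparison: synapses vs. model parameters.
BRAIN_SYNAPSES = 100e12  # ~100 trillion synapses (estimate)
GPT4_PARAMS = 1.76e12    # ~1.76 trillion parameters

print(f"Scale gap: {BRAIN_SYNAPSES / GPT4_PARAMS:.0f}X")  # -> 57X
```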


GPT1 was launched in June, 2018 with 117 million parameters.

GPT2 followed on 14th of February, 2019 (~250 days later) with 1.5 billion parameters (~13X).

GPT3 followed in June, 2020 (~490 days later) with 175 billion parameters (~117X).

GPT4 launched on 14th of March, 2023 (~1000 days later) with 1.76 trillion parameters (~10X).


An AGI-equivalent model with 100 trillion parameters has a >95% probability of arriving between December, 2027 and June, 2043, with the most likely scenario around October, 2028.
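
The post does not spell out the model behind this interval, but one way to land near the central date is to assume the release gap keeps roughly doubling (250 → 490 → 1,000 → ~2,000 days) and that the next generation makes the ~57X jump to 100 trillion parameters. A minimal sketch under those assumptions:

```python
from datetime import date, timedelta

# Assumption: release gaps roughly double each generation
# (250 -> 490 -> 1,000 days), so the next gap is ~2,000 days,
# and that generation makes the ~57X jump to 100T parameters.
gpt4_launch = date(2023, 3, 14)
next_gap_days = 2 * 1_000

print(gpt4_launch + timedelta(days=next_gap_days))  # -> 2028-09-03
```

A naive doubling of the GPT3-to-GPT4 gap lands in September 2028, close to the October 2028 mode; the wide December 2027 to June 2043 interval presumably reflects uncertainty in both the gap and the required scale.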


We have made a lot of assumptions to arrive at this estimate. The most important one remains the scaling hypothesis itself.


Human brains arguably resemble a quantum computer more than a traditional binary computer. Additionally, human brains often rely on heuristics & algorithmic decision-making for efficiency & speed.

As AI/ML research continues to tackle these challenges, we can safely refer to this AGI-equivalent model as an AGI network.


We will share more as we develop further understanding of AI, AGI, and ASI at Atmn Labs. 


While we remain skeptical about certain claims from frontier AI labs, we recognize that intelligence, awareness, and consciousness are all emergent phenomena.


They arise suddenly & immediately fill us with wonder.


