Summary: Estimating Likelihood of Transformative AGI by 2043 (arxiv.org)
43,226 words - PDF document
One Line
Transformative AGI by 2043 is estimated to be less than 1% likely.
Slides
Slide Presentation (18 slides)
Key Points
- The likelihood of transformative AGI by 2043 is estimated to be less than 1%.
- The development of transformative AGI requires fundamental algorithmic improvements and efficient computation.
- The cost of computation for AGI training and inference tasks is currently underestimated.
- The development of post-silicon technology to replace silicon transistors is unlikely in the near future.
- The availability and scalability of robot bodies are concerns for achieving transformative AGI.
- The semiconductor shortage and limitations in the semiconductor supply chain may affect AGI development.
- Power infrastructure and energy generation present significant challenges for achieving transformative AGI.
- War and engineered pandemics have the potential to delay AGI development.
Summaries
32 word summary
The likelihood of transformative AGI by 2043 is estimated to be less than 1%. Potential hindrances include slowing improvements in chip density and energy efficiency, as well as the possibility of great power wars.
113 word summary
The likelihood of transformative AGI by 2043 is estimated to be less than 1%. This estimation considers the high bar set for AGI, the necessary steps required, and the probabilities of success for each step. Potential implications of AGI include social anxiety, national security concerns, and warnings from AGI itself.
AMD, ASML, and TSMC all predict slowing improvements in chip density and energy efficiency, which could hinder progress towards AGI. Chip design improvements may only result in a 6.5x increase in efficiencies over 15 years, far short of the roughly five orders of magnitude required.
Expert forecasters predict a substantial likelihood of great power wars in the next year, with a range of 2% to 10%, and as high as 20% to 45% by 2043. The likelihood of war is influenced by AI progress increasing the incentives and capabilities for war.
1854 word summary
The likelihood of transformative AGI by 2043 is estimated to be less than 1%. This estimation takes into account the high bar set for AGI, the necessary steps required, and the probabilities of success for each step. The cascading conditional probabilities multiply to a joint likelihood below 1%.
The likelihood of transformative AGI by 2043 is estimated to be less than 1%. There may be social anxiety, national security implications, and potential warnings from AGI itself. The risk of war and engineered pandemics may increase.
This essay aims to quantify and analyze the likelihood of transformative AGI developing by 2043. The framework is based on a cascade of seven conditional probabilities, including algorithmic improvements, efficient computation, cheap robot bodies, and geopolitical stability. The authors defend their probability estimates for each step.
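The cascade framework can be sketched numerically. The step names and probability values below are illustrative placeholders chosen for this sketch, not the authors' actual estimates; the point is only that individually plausible factors multiply down to a small joint probability.

```python
from math import prod

# Hypothetical conditional probabilities for the seven cascade steps
# (illustrative values only, not the authors' published estimates).
steps = {
    "algorithmic breakthroughs": 0.6,
    "efficient computation": 0.35,
    "cheap inference at scale": 0.3,
    "cheap robot bodies": 0.5,
    "scaled chips and power": 0.6,
    "no derailment by war": 0.7,
    "no derailment by pandemic/depression": 0.7,
}

# Multiply the conditional probabilities to get the joint probability.
joint = prod(steps.values())
print(f"Joint probability: {joint:.2%}")
```

Even though no single factor here is below 30%, the product falls under 1%, which is the structural argument behind the paper's headline number.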
The authors have experience in the AI field and have made successful predictions in the past. Their expertise in semiconductor and AI technology informs their perspective on hardware and software progress toward AGI, and they have a track record of success in forecasting competitions.
By 2043, transformative AGI must already exist and be deployed at scale to address existential risks. AGI scenarios include exceptional tasks beyond human capability and the performance of many already-possible tasks at large scale. Deployment and scale are necessary for AGI to count as transformative.
It is a mistake to assume that something is impossible just because one cannot imagine it. Historical examples show that experts have often made incorrect predictions about technological advancements. Forecasting technological development is like navigating a foggy forest with blocked paths.
Efforts to develop walking robots using deep neural networks have been limited, with Boston Dynamics relying on control theory algorithms instead. Even Tesla's Optimus, which uses neural networks, struggles to walk and requires task-specific engineering. The OpenWorm project, which aims to simulate the 302-neuron C. elegans worm, has yet to succeed despite years of effort.
Early progress in AI does not guarantee all difficulties will be overcome. Self-driving has taken longer than chess or Go because of expensive and slow reinforcement learning. Radiologists are still needed despite AI tools being used regularly. Radiology requires reasoning about a 3D world from 2D images.
Radiologists face various challenges that make automation difficult, such as complex interactions with physicians and unique clinic environments. The healthcare system's emphasis on patient safety and the need for specific domain knowledge further complicate the adoption of automated solutions. Machine learning practitioners must also contend with these constraints.
The emergence of transformative AGI relies on fundamental algorithmic improvements, such as AI sample efficiency, training feedback, and parallelization. However, these improvements are not guaranteed to occur on convenient timelines, and it is unclear what exactly needs to be developed.
To achieve transformative AGI by 2043, the current sequential reinforcement learning approach needs to be modified. This could involve constructing self-supervised training sets or simulated reinforcement learning loops. Building realistic simulations or collecting large amounts of real-world data are potential methods.
Current estimates of the computational intensity required for AGI training and inference tasks are too low. The cost-efficiency of today's silicon and the computational intensity of the human brain suggest that a human-level AGI would cost about $1 million per hour to run.
A synapse is a complex biological assembly, not a single number, and real neurons perform complex activity that computational "neurons" cannot replicate. The computational complexity of a biological neuron is much higher than previously estimated.
Conservative estimates suggest that the human brain collects and processes about 1e8 bits of data per second. By updating assumptions from previous studies, it is estimated that training a human-scale AGI would require approximately 1e30 floating-point operations.
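The ~$1 million per hour figure cited above can be sanity-checked with a back-of-envelope calculation. Every input below (brain-equivalent FLOPS, per-GPU throughput, hourly GPU cost) is an assumed round number for illustration, not a figure taken from the paper.

```python
# Illustrative back-of-envelope for the ~$1M/hr inference claim.
# All inputs are assumptions chosen for round numbers.
brain_flops = 1e21        # assumed compute to match a human brain, in FLOPS
gpu_flops = 1e15          # assumed effective throughput of one modern GPU
gpu_cost_per_hour = 2.0   # assumed $/GPU-hour, a rough cloud-like rate

gpus_needed = brain_flops / gpu_flops            # GPUs running in parallel
cost_per_hour = gpus_needed * gpu_cost_per_hour  # total hourly bill
print(f"{gpus_needed:.0e} GPUs -> ${cost_per_hour:,.0f}/hr")
```

Under these assumptions the bill lands at the same order of magnitude as the paper's estimate, which is all a sketch like this can show.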
Transformative AGI by 2043 requires a five-orders-of-magnitude drop in the cost of computation per unit of AGI. This drop can come from increases in computation per dollar and decreases in the computation used per human-equivalent task.
NVIDIA's DGX H100 has a max power of 11.3 kW. Pricing is not a reliable metric due to high fixed costs. The trend of sub-Moore scaling in computation has been consistent for several years.
AMD, the second-largest GPU maker, and ASML, the leading chip manufacturing tool company, both forecast slowing improvements in chip density and energy efficiency. TSMC, the world's leading chipmaker, also predicts decelerating density improvements.
In the race towards AGI, progress may decelerate due to diminishing returns and the difficulty of reducing energy consumption. Chip design improvements may only result in a 6.5x increase in efficiencies over 15 years. FLOPS/$ may improve more slowly than historical trends suggest.
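The gap between projected chip gains and the required cost drop can be made explicit, taking the 6.5x design-driven gain and the five-orders-of-magnitude requirement from the text:

```python
import math

required_gain = 1e5      # five orders of magnitude cost drop (from the text)
chip_design_gain = 6.5   # projected design-driven efficiency gain over 15 years

# How much improvement must come from sources other than chip design?
remaining = required_gain / chip_design_gain
print(f"Remaining gain needed: {remaining:,.0f}x "
      f"(~{math.log10(remaining):.1f} orders of magnitude)")
```

Chip design alone covers less than one of the five orders of magnitude, leaving more than four to come from algorithms, specialized hardware, or new device physics.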
The prospect of five orders of magnitude of energy efficiency improvement in transistors within 20 years is unlikely. An entirely new transistor platform or extremely precise manufacturing would be required to beat the physical limit faced by all field-effect transistors.
The efficiency of artificial general intelligence (AGI) compared to evolved biological systems is uncertain. AGIs can use efficient algorithms and purpose-specific hardware, reducing the need for extensive training and inference. The amount of compute used by AGI could vary significantly.
It is highly unlikely that a new computing technology will surpass field-effect transistors in the near future, according to experts. The consensus among top computer hardware researchers is that transistors will continue to dominate the industry for the next few decades.
The development of a post-silicon technology to replace silicon transistors within the next 15 years is unlikely due to various barriers, including the challenges of scaling up manufacturing, building trust with major companies, and adapting software and systems.
There is currently no candidate technology that is close to surpassing silicon transistors. Various alternative technologies have been investigated, including quantum computing, optical computing, spintronics, DNA computing, graphene or carbon nanotube transistors, and memristors.
By 2043, if AGIs perform 10% of human jobs and 80% of them require robots, we would need robots to work 800 billion hours per year. The cost and scalability of these robots are concerns.
To estimate the likelihood of achieving transformative AGI by 2043, several factors need to be considered. The availability of robot bodies in 2043 has a 60% chance of happening, with non-humanoid robot bodies being simpler to manufacture.
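The 800 billion robot-hours figure above implies a very large fleet. The annual utilization per robot below is my own assumption; only the hours requirement comes from the text.

```python
robot_hours_per_year = 800e9  # robot labor needed per year (from the text)
hours_per_robot = 5000        # assumed annual utilization per robot,
                              # roughly 14 hours/day with some downtime

# Fleet size implied by the labor requirement.
robots_needed = robot_hours_per_year / hours_per_robot
print(f"Fleet size: {robots_needed:.1e} robots")
```

At this assumed utilization, meeting the demand would take on the order of 10^8 robots, a manufacturing scale comparable to today's global automobile industry.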
The time it takes to buy compute depends on the scale of the order. For individuals, buying compute is easy, but larger orders take longer to fulfill due to semiconductor supply chain limitations. At the scale needed for transformative AGI, expanding the industry could take many years.
The semiconductor shortage is improving but may last until 2024 due to the time it takes to build new manufacturing tools. Lead times for tools and chips are still long, and would be even longer with higher demand. Investment decisions by TSMC and other chipmakers further constrain how quickly capacity can grow.
Semiconductor production may fall short of compute requirements for AGI by 2043. The estimate is based on lithography tool capacity and the potential reorientation of EUV-powered supply chains. There is a 50% chance that semiconductor production can meet these requirements.
Building the necessary power infrastructure for transformative AGI by 2043 presents significant challenges. The scale of power generation required, potentially reaching 10-100 GW concentrated around data centers, is unprecedented and would require immense cooperation and financing. While global power generation is large in aggregate, concentrating new capacity around data centers is the harder problem.
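The 10-100 GW figure can be put in context with a rough annual-energy conversion. The US consumption baseline below is a commonly cited approximation, used here only for scale.

```python
datacenter_power_gw = 100   # upper end of the range in the text
hours_per_year = 8760

# Continuous 100 GW draw converted to annual energy, in TWh.
energy_twh = datacenter_power_gw * hours_per_year / 1000
us_consumption_twh = 4000   # rough US annual electricity use (assumption)

print(f"{energy_twh:.0f} TWh/yr, "
      f"~{energy_twh / us_consumption_twh:.0%} of US annual consumption")
```

Sustaining the top of the range would consume roughly a fifth of current US electricity generation, which is why the text frames this as an infrastructure problem rather than a purchasing problem.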
The likelihood of AGI solving physical constraints is uncertain due to long time lags and dependence on scarce resources. AGIs are unlikely to immediately solve logistical problems. Estimates account for the possibility of rapid growth in wafer production and solar power installations.
The global AGI supply chain and competition-driven instability could delay AGI development. AGIs themselves may take action to slow or halt AGI progress due to X-risk or self-interest. Even a small delay could be decisive.
War has the potential to significantly delay the development of transformative AGI by 2043. It can destroy the demand for semiconductor products, human capital, and funding for AGI projects. War can also hinder progress in AI software development by reducing the concentration of research talent.
Expert forecasters predict a substantial likelihood of great power wars in the next year, with a range of 2% to 10%, and as high as 20% to 45% by 2043. There is a high chance of conflict.
There is a 40% chance of severe war erupting by 2042 if transformative AGI is on track. The likelihood of war is influenced by AI progress increasing the incentives and capabilities for war. The most probable scenario for delaying AGI progress is a war over Taiwan that disrupts the semiconductor supply chain.
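The jump from single-digit annual war probabilities to 20-45% by 2043 is consistent with simple compounding, assuming (unrealistically) independent and constant annual risk:

```python
def cumulative_risk(annual_p: float, years: int) -> float:
    """Probability of at least one event over `years`, assuming
    independent, identically distributed annual risk."""
    return 1 - (1 - annual_p) ** years

# Annual probabilities roughly consistent with the 20%-45% range by 2043:
for annual in (0.011, 0.03):
    risk = cumulative_risk(annual, 20)
    print(f"annual {annual:.1%} -> 20-year risk {risk:.0%}")
```

An annual risk of only 1-3% compounds to the 20-45% range over twenty years, so the near-term and 2043 forecasts quoted in the text are mutually consistent.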
The likelihood of transformative AGI being delayed by an engineered pandemic is estimated to be low but non-zero, ranging from 2%-6%. Experts predict a 10% chance of over 1 billion people being killed by a pandemic within 92 years.
A mild recession in 1969 may have delayed interplanetary space travel. The likelihood of transformative AGI being derailed by depression is estimated at 10% in the next 20 years.
The likelihood of transformative AGI by 2043 is uncertain. The authors believe they have not made errors in their modeling approach, which avoids rigid assumptions about AGI technology. They argue that AGI by 2043 will likely be powered by electricity and run on silicon chips, so physical constraints still apply.
Investors and the educational system need to invest in AI. Historical examples show that technological advancements have gradual impacts. AGI takeoff will be gradual due to high costs, unreliability, and the need for companies to figure out how to fully utilize it.
The integration of AI into companies may take years, even when the technology is advanced. The lack of takeoff in computer and narrow AI tools suggests that improvements in narrow AI will not lead to rapid progress. The limiting factor in technological progress is willingness to adopt, not just technical capability.
It is good news that there is a fair degree of certainty that the fundamental problems around AGI alignment and X-risk will not be pressing within the next twenty years. This provides time for critical work to be done.
The text discusses the estimation of the likelihood of transformative AGI by 2043. It emphasizes the importance of considering others' views and the impact of extremizing on estimates. The neutral starting point for a forecast is subjective and depends on how the problem is framed.
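Extremizing, mentioned above, typically pushes an aggregated forecast away from 50%. One standard form raises the odds to a power; the exponent below is a conventional illustrative choice, not necessarily what the authors used.

```python
def extremize(p: float, a: float = 2.5) -> float:
    """Push probability p away from 0.5 by raising its odds to power a
    (odds-power extremizing, a common aggregation adjustment)."""
    odds = (p / (1 - p)) ** a
    return odds / (1 + odds)

for p in (0.2, 0.5, 0.8):
    print(f"{p:.2f} -> {extremize(p):.3f}")
```

Forecasts at exactly 50% are unchanged, while those on either side move toward the extremes, which is why the choice of a "neutral" starting point matters.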
A superintelligence, even with a refined understanding of physics, is still bound by physical laws. Like humans, an AGI will require an energy industry to extract energy from natural sources and be limited by Carnot efficiency. Energy will need to be transported to where it is used.
AlphaZero's chess strength is estimated at around 3500 Elo, but even with superintelligence, it cannot overcome certain disadvantages. While AlphaZero has inspired players, it hasn't revolutionized human chess or provided secret shortcuts to winning. Chess computers have surpassed humans for decades without transforming the wider world.
The cost of running systems like GPT-4 can reach over $100,000/hr. Scaling up GPT models may have diminishing returns. OpenAI is not currently developing GPT-5. AI progress depends on software efficiency as well as hardware efficiency.
Scaling AI to transformative AGI by 2043 will require significant investments in data centers, power plants, and fabs. It will be slower and more challenging than previous scaling. Only a few industrial labs backed by deep-pocketed investors can afford it.
Asset prices and valuations of computer hardware and software companies appear normal, providing weak evidence for or against the likelihood of transformative AGI. Natural resources have not experienced significant changes in value. Public market investors and informed intracompany investors both appear to forecast business as usual rather than imminent transformation.