Future Computers Will Be Radically Different

The Resurgence of Analog Computers

The Historical Context of Computing

  • For centuries, analog computers were the most powerful computing machines, used to predict eclipses and tides and to aim anti-aircraft guns.
  • With the advent of solid-state transistors, digital computers surpassed analog ones, leading to a predominance of digital systems in current use.

The Revival of Analog Technology

  • A perfect storm of factors is causing a resurgence in analog technology; an example is shown with an analog computer that can be programmed to solve various differential equations.
  • Unlike digital computers, which operate on binary zeros and ones, analog computers use continuously varying voltages to represent the quantities in a physical problem directly.

Demonstration of Analog Capabilities

  • By changing electrical connections, the analog computer can simulate different scenarios like damped oscillations in a spring system.
  • The Lorenz system is highlighted as a basic model for atmospheric convection and chaos theory, showcasing real-time parameter changes on the analog computer.
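
The Lorenz system mentioned above is easy to state digitally as well; a minimal sketch using Euler integration (the classic chaotic parameters σ = 10, ρ = 28, β = 8/3 are assumed here, since the exact settings used in the demonstration are not given):

```python
# Euler integration of the Lorenz system, the digital counterpart of the
# analog computer's continuous real-time solution. The parameter values
# are the classic chaotic ones and are an assumption here.
def lorenz_step(x, y, z, sigma=10.0, rho=28.0, beta=8.0 / 3.0, dt=0.001):
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

def simulate(steps=50_000, state=(1.0, 1.0, 1.0)):
    x, y, z = state
    trajectory = []
    for _ in range(steps):
        x, y, z = lorenz_step(x, y, z)
        trajectory.append((x, y, z))
    return trajectory

traj = simulate()  # chaotic but bounded: the butterfly-shaped attractor
```

On the analog computer the same three equations are wired up with integrators and solved continuously in real time; the digital version must step through them one small dt at a time.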

Advantages and Disadvantages of Analog Computers

  • Analog computers are powerful computational devices capable of rapid calculations while consuming less energy compared to digital counterparts.
  • Adding two 8-bit numbers takes roughly 50 transistors in a digital computer, but in an analog one it takes just two wires joined together, since their currents simply sum. Multiplication is similarly cheap: a voltage applied across a resistor produces a current proportional to the product of voltage and conductance.
  • However, analog computers are not general-purpose devices; you could not run Microsoft Word on one, because they take continuous voltages as inputs and produce continuous voltages as outputs.
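
The wire-and-resistor arithmetic above can be sketched numerically. A toy model (function names are illustrative): joining wires sums currents (Kirchhoff's current law), and a voltage across a conductance yields their product as a current (Ohm's law):

```python
# Toy model of analog arithmetic (function names are illustrative).

def analog_add(current_a, current_b):
    # Joining two wires: currents simply sum (Kirchhoff's current law).
    return current_a + current_b

def analog_multiply(voltage, conductance):
    # A voltage across a conductance G = 1/R drives a current
    # I = V * G (Ohm's law), i.e. the product of the two quantities.
    return voltage * conductance

total = analog_add(3.0, 4.0)         # 7.0 (e.g. milliamps)
product = analog_multiply(2.0, 0.5)  # 1.0
```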

Limitations Affecting Popularity

  • Inaccuracy arises from manufacturing variation in components such as resistors and capacitors, giving errors on the order of 1%.
  • While speed and energy efficiency are advantages, their single-purpose nature and inherent inaccuracies contributed significantly to their decline once digital computing became viable.
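
The roughly 1% figure can be illustrated with a quick Monte Carlo sketch: perturb each component by a random tolerance and watch the computed product drift. This is a simplification; real analog error sources are more varied:

```python
import random

def noisy_multiply(v, g, tolerance=0.01):
    # Each "component" deviates from its nominal value by up to +/-1%.
    v_actual = v * (1 + random.uniform(-tolerance, tolerance))
    g_actual = g * (1 + random.uniform(-tolerance, tolerance))
    return v_actual * g_actual

random.seed(0)  # reproducible sketch
exact = 2.0 * 0.5
samples = [noisy_multiply(2.0, 0.5) for _ in range(10_000)]
worst = max(abs(s - exact) / exact for s in samples)
# Two independent 1% tolerances bound the relative error at roughly 2%.
```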

The Role of Artificial Intelligence

Introduction to AI Concepts

  • The discussion transitions into artificial intelligence (AI), tracing its origins back to 1956 when the term was coined.
  • Frank Rosenblatt's Perceptron model aimed at mimicking neuron behavior by processing inputs as binary activations (1 or 0).

Neural Network Functionality Explained

  • Each neuron's activation level varies based on weighted connections from other neurons—some weights being positive (excitatory), others negative (inhibitory).
  • To decide whether a neuron fires, compare the sum of its weighted input activations with a threshold called the bias.
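
The firing rule just described can be written down directly; a minimal sketch:

```python
def neuron_fires(activations, weights, bias):
    # Weighted sum of incoming activations: positive weights excite,
    # negative weights inhibit. The neuron fires (outputs 1) only when
    # the sum exceeds the bias threshold.
    weighted_sum = sum(a * w for a, w in zip(activations, weights))
    return 1 if weighted_sum > bias else 0

# Two excitatory inputs and one inhibitory input:
print(neuron_fires([1.0, 0.5, 1.0], [0.8, 0.6, -0.4], bias=0.5))  # 1
```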

Training Mechanism for Perceptrons

  • Rosenblatt's Perceptron used an array of photocells as its input, one per pixel of the image; each pixel's brightness set an activation level between 0 and 1.
  • A single output neuron assesses whether it should fire based on summed inputs multiplied by their respective weights against bias thresholds.

Learning Process Through Adjustment

  • The goal was reliable image distinction between shapes like rectangles and circles; training involved adjusting weights based on correct or incorrect outputs.
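
That adjustment procedure can be sketched with the classic perceptron learning rule on a toy task (the two-pixel example below is illustrative, not Rosenblatt's actual photocell hardware):

```python
def predict(pixels, weights, bias):
    total = sum(p * w for p, w in zip(pixels, weights))
    return 1 if total > bias else 0

def train(examples, n_pixels, lr=0.1, epochs=20):
    # Start from zero weights; nudge them toward inputs the perceptron
    # wrongly rejected, and away from inputs it wrongly accepted.
    weights = [0.0] * n_pixels
    bias = 0.0
    for _ in range(epochs):
        for pixels, label in examples:
            error = label - predict(pixels, weights, bias)
            if error != 0:
                weights = [w + lr * error * p for w, p in zip(weights, pixels)]
                bias -= lr * error
    return weights, bias

# Toy task: fire when the first pixel is brighter than the second.
data = [([1.0, 0.0], 1), ([0.0, 1.0], 0), ([0.9, 0.1], 1), ([0.2, 0.8], 0)]
w, b = train(data, n_pixels=2)
```

After training, the learned weights classify all four examples correctly; real image tasks need many more pixels and examples.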

Artificial Intelligence Evolution

Early Developments in AI

  • The concept of mapping two categories into different groups was emphasized as a foundational approach in early AI research.
  • Rosenblatt's perceptron struggled to move beyond basic shapes to harder distinctions such as telling animals apart, highlighting the limits of early machine learning.
  • The press nevertheless described it as an embryonic electronic computer that, after training on enough examples, might one day walk, see, and perform other advanced functions.

Challenges Faced by Neural Networks

  • Critics pointed out that the perceptron could not even distinguish dogs from cats, and enthusiasm and funding collapsed into what became known as the first AI winter.
  • Rosenblatt's untimely death during this period compounded the setback, and neural networks faced lasting skepticism from the scientific community.

Resurgence of AI in the 1980s

  • In the 1980s, Carnegie Mellon researchers built one of the first self-driving cars, steered by an artificial neural network called ALVINN, which added a hidden layer beyond earlier single-layer models.
  • ALVINN processed road images via matrix multiplication: input neurons fed hidden neurons, whose activation levels in turn drove the output steering decision.
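
The input-to-hidden-to-output flow is just a pair of matrix-vector multiplications; a minimal sketch with toy dimensions (ALVINN's real network was far larger):

```python
import math

def matvec(matrix, vector):
    # One network layer: each output neuron's input is a row of weights
    # dotted with the incoming activation vector.
    return [sum(w * x for w, x in zip(row, vector)) for row in matrix]

def sigmoid_vec(values):
    # Squash each weighted sum into an activation between 0 and 1.
    return [1.0 / (1.0 + math.exp(-v)) for v in values]

def forward(image, w_hidden, w_output):
    hidden = sigmoid_vec(matvec(w_hidden, image))  # input -> hidden
    return sigmoid_vec(matvec(w_output, hidden))   # hidden -> output

# Toy dimensions and made-up weights, for illustration only:
image = [0.2, 0.8, 0.5, 0.1]                # 4 "pixels"
w_hidden = [[0.1, -0.2, 0.3, 0.0],
            [0.5, 0.1, -0.1, 0.2]]          # 2 hidden neurons
w_output = [[0.4, -0.3],
            [0.2, 0.6]]                     # 2 candidate steering outputs
steer = forward(image, w_hidden, w_output)  # pick the most active output
```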

Training Neural Networks

  • During training, a human driver supplied the correct steering angle for each image, and backpropagation adjusted the network's weights to reduce the error.
  • The initially random weights gradually organized into recognizable patterns, showing how the machine learned from data over time.
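
Backpropagation can be shown in miniature on a two-weight chain: the chain rule propagates the output error back to each weight, which is then nudged downhill. All numbers below are illustrative:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_step(x, target, w1, w2, lr=1.0):
    # Forward pass through a minimal chain: input -> hidden -> output.
    h = sigmoid(w1 * x)
    y = sigmoid(w2 * h)
    # Backward pass: the chain rule assigns each weight its share of
    # the output error (this is backpropagation in miniature).
    d_y = (y - target) * y * (1 - y)  # error signal at the output
    d_h = d_y * w2 * h * (1 - h)      # error propagated to the hidden unit
    return w1 - lr * d_h * x, w2 - lr * d_y * h

# Repeated steps pull the output toward the target of 0.9, mimicking
# how ALVINN's weights were pulled toward the human steering angles.
w1, w2 = 0.5, -0.5
for _ in range(2000):
    w1, w2 = train_step(1.0, 0.9, w1, w2)
```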

Limitations and Further Developments

  • Despite these advances, artificial neural networks still struggled with seemingly simple tasks like distinguishing cats from dogs, and it was unclear whether hardware or software was the bottleneck.
  • A shift came when researcher Fei-Fei Li argued that more data could make training effective; from 2006 to 2009 she built ImageNet, a large database of millions of labeled images.

Breakthrough Competitions and Innovations

  • From 2010 to 2017, the annual ImageNet Large Scale Visual Recognition Challenge pitted software against image categories ranging from everyday objects to specific dog breeds.
  • Competing networks had an output layer with one neuron per object category; performance was measured by the top-five error rate, the fraction of images whose correct label did not appear among the network's five best guesses.
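
The top-five error rate can be computed in a few lines; a sketch:

```python
def top5_error(predictions, true_labels):
    # predictions: for each image, a list of per-category scores.
    # An image counts as correct if its true category index appears
    # among the five highest-scoring categories.
    errors = 0
    for scores, truth in zip(predictions, true_labels):
        top5 = sorted(range(len(scores)), key=lambda i: scores[i],
                      reverse=True)[:5]
        if truth not in top5:
            errors += 1
    return errors / len(true_labels)

# Toy example: 6 categories, 2 images.
preds = [[0.1, 0.5, 0.2, 0.05, 0.05, 0.1],     # true label 3 is in the top 5
         [0.9, 0.02, 0.02, 0.02, 0.02, 0.02]]  # true label 5 is not
print(top5_error(preds, [3, 5]))  # 0.5
```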

Advancements with AlexNet

  • In 2012, AlexNet cut the top-five error rate to roughly 16%, with a then-remarkable depth of eight layers and some 60 million adjustable parameters.
  • Training and running AlexNet demanded heavy computation, on the order of 700 million operations per image, made practical by GPUs (graphics processing units).

Impact on Future Research

  • The success of AlexNet set new benchmarks for performance in image classification tasks; subsequent models followed suit leading to further reductions in error rates over time.

The Future of Neural Networks and Analog Computing

Challenges in Scaling Neural Networks

  • The demand for larger neural networks is increasing, but this poses several challenges:
  • High energy consumption: Training a neural network can consume as much electricity as the annual usage of three households.
  • Digital computers face the von Neumann bottleneck: data sits in memory and must be shuttled to the processor over a bus, which is inefficient for the massive matrix multiplications deep neural networks require.
  • Limitations of Moore's Law are becoming apparent; while transistor counts have doubled approximately every two years, physical constraints are making further miniaturization difficult.
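
The doubling in Moore's Law compounds dramatically; a quick calculation:

```python
# Transistor counts doubling roughly every two years (Moore's law).
def transistor_growth(years, doubling_period=2):
    return 2 ** (years / doubling_period)

# Fifty years of doubling multiplies the count by 2**25, about 33 million.
print(transistor_growth(50))  # 33554432.0
```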

The Rise of Analog Computers

  • As digital computers reach their limits, analog computers present a promising alternative. They excel at tasks like matrix multiplication without requiring the same level of precision as digital systems.
  • A visit to Mythic, an analog computing startup in Texas, reveals their focus on creating chips specifically designed for neural networks. Their technology enhances virtual reality applications by quickly rendering user poses in shared virtual spaces.

Innovations in Memory Technology

  • Mythic utilizes flash memory cells not just for binary storage (0 or 1), but as variable resistors. By controlling the number of electrons stored, they can manipulate resistance levels to perform calculations.
  • Each flash cell performs a multiplication: the current through the cell equals the applied voltage times its conductance (I = V × G), so the stored charge can encode a neural-network weight.
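
That scheme, with voltages on the rows, tunable conductances in the cells, and currents summed along the columns, is exactly a matrix-vector multiplication. A numerical sketch with ideal, noise-free cells:

```python
def analog_matvec(voltages, conductances):
    # conductances[i][j]: tunable conductance of the flash cell at row i,
    # column j, encoding weight w[i][j]. Each cell passes I = V * G
    # (Ohm's law); currents sharing a column wire add up (Kirchhoff's
    # current law), so each column outputs one entry of the product.
    n_rows, n_cols = len(conductances), len(conductances[0])
    return [sum(voltages[i] * conductances[i][j] for i in range(n_rows))
            for j in range(n_cols)]

v = [1.0, 0.5]              # input activations as row voltages
g = [[0.2, 0.4],
     [0.6, 0.8]]            # stored weights as cell conductances
print(analog_matvec(v, g))  # [0.5, 0.8]
```

In hardware the whole product appears in one step, with no clocked sequence of fetches and multiplies.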

Performance Comparison with Digital Chips

  • Mythic's chip achieves 25 trillion mathematical operations per second using only 3 watts of power. In contrast, newer digital systems require significantly more power (50 to 100 watts) and cost thousands of dollars.
  • While digital systems excel at training algorithms on powerful GPUs, Mythic’s approach offers potential applications across various fields such as security cameras and autonomous systems due to its efficiency and lower energy requirements.
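
The quoted figures imply a sizable efficiency gap. Rough arithmetic, assuming a digital system delivering comparable throughput at 75 W (the midpoint of the 50 to 100 W range cited):

```python
mythic_ops_per_sec = 25e12  # 25 trillion operations per second
mythic_watts = 3.0
digital_watts = 75.0        # assumed midpoint of the 50-100 W range cited

# Operations per joule is throughput divided by power draw.
mythic_ops_per_joule = mythic_ops_per_sec / mythic_watts
# If a digital chip matched that throughput at 75 W:
digital_ops_per_joule = mythic_ops_per_sec / digital_watts
advantage = mythic_ops_per_joule / digital_ops_per_joule  # about 25x
```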

Addressing Challenges with Analog Systems

  • Some propose using analog circuits for specific tasks like voice activation in smart speakers (e.g., Alexa), which could reduce energy consumption while maintaining reliability.

The Future of Analog vs. Digital Computers

Exploring the Viability of Analog Computers

  • The speaker reflects on the potential of analog computers, questioning whether they can match the success of digital ones.
  • There is a suggestion that analog computers may be more suitable for certain tasks compared to digital ones, despite the historical dominance of digital computing.

Digital Information and Its Evolution

  • Over the past 50 years, all forms of information—music, images, videos—have transitioned to digital formats.
  • The speaker posits that in a century, we might view digital technology not as an endpoint but rather as a starting point in the evolution of information technology.

Video description

Digital computers have served us for decades, but the rise of artificial intelligence demands a totally new kind of computer: analog. Thanks to Mike Henry and everyone at Mythic for the guided tour of analog computing: https://www.mythic-ai.com/ Thanks to Dr. Bernd Ulmann, who created The Analog Thing and taught us to use it: https://the-analog-thing.org The Moore's Law segment was filmed at the Computer History Museum in Mountain View, CA. ALVINN footage by Welch Labs: https://www.youtube.com/watch?v=H0igiP6Hg1k

References:

  • Crevier, D. (1993). AI: The Tumultuous History of the Search for Artificial Intelligence. Basic Books. https://ve42.co/Crevier1993
  • Valiant, L. (2013). Probably Approximately Correct. HarperCollins. https://ve42.co/Valiant2013
  • Rosenblatt, F. (1958). The perceptron: A probabilistic model for information storage and organization in the brain. Psychological Review, 65(6), 386-408. https://ve42.co/Rosenblatt1958
  • New Navy Device Learns by Doing; Psychologist Shows Embryo of Computer Designed to Read and Grow Wiser (1958). The New York Times, p. 25. https://ve42.co/NYT1958
  • Mason, H., Stewart, D., & Gill, B. (1958). Rival. The New Yorker, p. 45. https://ve42.co/Mason1958
  • Footage of ALVINN driving on NavLab: https://ve42.co/NavLab
  • Pomerleau, D. (1989). ALVINN: An Autonomous Land Vehicle in a Neural Network. NeurIPS, (2)1, 305-313. https://ve42.co/Pomerleau1989
  • ImageNet website: https://ve42.co/ImageNet
  • Russakovsky, O., Deng, J., et al. (2015). ImageNet Large Scale Visual Recognition Challenge. https://ve42.co/ImageNetChallenge
  • Krizhevsky, A., Sutskever, I., & Hinton, G. (2012). ImageNet Classification with Deep Convolutional Neural Networks. NeurIPS, (25)1, 1097-1105. https://ve42.co/AlexNet
  • Karpathy, A. (2014). Blog post: What I learned from competing against a ConvNet on ImageNet. https://ve42.co/Karpathy2014
  • Fick, D. (2018). Blog post: Mythic @ Hot Chips 2018. https://ve42.co/MythicBlog
  • Jin, Y., & Lee, B. (2019). 2.2 Basic operations of flash memory. Advances in Computers, 114, 1-69. https://ve42.co/Jin2019
  • Demler, M. (2018). Mythic Multiplies in a Flash. The Microprocessor Report. https://ve42.co/Demler2018
  • Aspinity (2021). Blog post: 5 Myths about AnalogML. https://ve42.co/Aspinity
  • Wright, L., et al. (2022). Deep physical neural networks trained with backpropagation. Nature, 601, 549-555. https://ve42.co/Wright2022
  • Waldrop, M. M. (2016). The chips are down for Moore's law. Nature, 530, 144-147. https://ve42.co/Waldrop2016

Credits: Written by Derek Muller, Stephen Welch, and Emily Zhang. Filmed by Derek Muller, Petr Lebedev, and Emily Zhang. Animation by Iván Tello, Mike Radjabov, and Stephen Welch. Edited by Derek Muller. Additional video and photos from Getty Images and Pond5. Music from Epidemic Sound. Produced by Derek Muller, Petr Lebedev, and Emily Zhang.

Original English-language video from the @veritasium channel: https://youtu.be/GVsUOuSjvcg (Future Computers Will Be Radically Different). This YouTube channel is managed by https://www.unilingo.tv/. Translation comments or suggestions: info@unilingo.tv