Future Computers Will Be Radically Different (Analog Computing)

  • Published Mar 28, 2024
  • Visit brilliant.org/Veritasium/ to get started learning STEM for free, and the first 200 people will get 20% off their annual premium subscription. Digital computers have served us well for decades, but the rise of artificial intelligence demands a totally new kind of computer: analog.
    Thanks to Mike Henry and everyone at Mythic for the analog computing tour! www.mythic-ai.com/
    Thanks to Dr. Bernd Ulmann, who created The Analog Thing and taught us how to use it. the-analog-thing.org
    Moore’s Law was filmed at the Computer History Museum in Mountain View, CA.
    Welch Labs’ ALVINN video: • Self Driving Cars [S1E...
    ▀▀▀
    References:
    Crevier, D. (1993). AI: The Tumultuous History Of The Search For Artificial Intelligence. Basic Books. - ve42.co/Crevier1993
    Valiant, L. (2013). Probably Approximately Correct. HarperCollins. - ve42.co/Valiant2013
    Rosenblatt, F. (1958). The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain. Psychological Review, 65(6), 386-408. - ve42.co/Rosenblatt1958
    NEW NAVY DEVICE LEARNS BY DOING; Psychologist Shows Embryo of Computer Designed to Read and Grow Wiser (1958). The New York Times, p. 25. - ve42.co/NYT1958
    Mason, H., Stewart, D., and Gill, B. (1958). Rival. The New Yorker, p. 45. - ve42.co/Mason1958
    Alvinn driving NavLab footage - ve42.co/NavLab
    Pomerleau, D. (1989). ALVINN: An Autonomous Land Vehicle In a Neural Network. NeurIPS, (2)1, 305-313. - ve42.co/Pomerleau1989
    ImageNet website - ve42.co/ImageNet
    Russakovsky, O., Deng, J. et al. (2015). ImageNet Large Scale Visual Recognition Challenge. - ve42.co/ImageNetChallenge
    AlexNet Paper: Krizhevsky, A., Sutskever, I., Hinton, G. (2012). ImageNet Classification with Deep Convolutional Neural Networks. NeurIPS, (25)1, 1097-1105. - ve42.co/AlexNet
    Karpathy, A. (2014). Blog post: What I learned from competing against a ConvNet on ImageNet. - ve42.co/Karpathy2014
    Fick, D. (2018). Blog post: Mythic @ Hot Chips 2018. - ve42.co/MythicBlog
    Jin, Y. & Lee, B. (2019). 2.2 Basic operations of flash memory. Advances in Computers, 114, 1-69. - ve42.co/Jin2019
    Demler, M. (2018). Mythic Multiplies in a Flash. The Microprocessor Report. - ve42.co/Demler2018
    Aspinity (2021). Blog post: 5 Myths About AnalogML. - ve42.co/Aspinity
    Wright, L. et al. (2022). Deep physical neural networks trained with backpropagation. Nature, 601, 549-555. - ve42.co/Wright2022
    Waldrop, M. M. (2016). The chips are down for Moore’s law. Nature, 530, 144-147. - ve42.co/Waldrop2016
    ▀▀▀
    Special thanks to Patreon supporters: Kelly Snook, TTST, Ross McCawley, Balkrishna Heroor, 65square.com, Chris LaClair, Avi Yashchin, John H. Austin, Jr., OnlineBookClub.org, Dmitry Kuzmichev, Matthew Gonzalez, Eric Sexton, john kiehl, Anton Ragin, Benedikt Heinen, Diffbot, Micah Mangione, MJP, Gnare, Dave Kircher, Burt Humburg, Blake Byers, Dumky, Evgeny Skvortsov, Meekay, Bill Linder, Paul Peijzel, Josh Hibschman, Mac Malkawi, Michael Schneider, jim buckmaster, Juan Benet, Ruslan Khroma, Robert Blum, Richard Sundvall, Lee Redden, Vincent, Stephen Wilcox, Marinus Kuivenhoven, Clayton Greenwell, Michael Krugman, Cy 'kkm' K'Nelson, Sam Lutfi, Ron Neal
    ▀▀▀
    Written by Derek Muller, Stephen Welch, and Emily Zhang
    Filmed by Derek Muller, Petr Lebedev, and Emily Zhang
    Animation by Ivy Tello, Mike Radjabov, and Stephen Welch
    Edited by Derek Muller
    Additional video/photos supplied by Getty Images and Pond5
    Music from Epidemic Sound
    Produced by Derek Muller, Petr Lebedev, and Emily Zhang

Comments • 13K

  • @5MadMovieMakers
    @5MadMovieMakers 2 years ago +12930

    Hyped for the future of computing. Analog and digital could work together to make some cool stuff

    • @teru797
      @teru797 2 years ago +109

      True AI is going to be the end of us. Why would you want that?

    • @kalindibang9578
      @kalindibang9578 2 years ago +517

      @@teru797 true AI won't be possible for the next 200 years, and by then, if humanity keeps living the way it does, we aren't going to survive anyway

    • @michaelschiller8143
      @michaelschiller8143 2 years ago +141

      @@teru797 it would still take quantum computers to be able to have the memory necessary to run

    • @jpthepug3126
      @jpthepug3126 2 years ago +13

      @@teru797 cool

    • @jonathanthomasjohn8348
      @jonathanthomasjohn8348 2 years ago +254

      @@teru797 we are already the end of us

  • @NotWhatYouThink
    @NotWhatYouThink 2 years ago +2193

    Great episode. Hadn’t considered the mix of digital and analog computers in a complementary fashion. I guess it’s not what I thought!

    • @WeponizedAutism
      @WeponizedAutism 2 years ago +47

      True, but the actual impact of this is not what you think.

    • @mushin111
      @mushin111 2 years ago +17

      Jesus, could you astroturf a bit harder please?

    • @LeoStaley
      @LeoStaley 2 years ago +30

      Until the '90s, US warships used mechanical calculators to aim their guns, something that would be perfect for your channel.

    • @deusexaethera
      @deusexaethera 2 years ago +4

      I see what you did there.

    • @dieSpinnt
      @dieSpinnt 2 years ago

      BS! Fourier ... ROTFL

  • @koborkutya7338
    @koborkutya7338 1 year ago +360

    i recall our control-systems teacher at university in the '90s saying the Space Shuttle flight controls contained analogue computing, because they had to process input from several thousand sensors to produce outputs and digital was just too slow for the job.

    • @rogerphelps9939
      @rogerphelps9939 1 year ago +7

      He lied.

    • @TARS-CASE
      @TARS-CASE 9 months ago +47

      @@rogerphelps9939 The Space Shuttle did indeed use analog computing for some of its flight control systems: it used a hybrid digital/analog system for flight controls. Most of the high-level control logic was handled by digital computers, but critical low-level control functions were performed by analog circuits.
      The analog components could process sensor inputs and produce control outputs much faster (on the order of microseconds) than even the fastest digital computers of the era, which took milliseconds. This speed was essential for stability during flight.

    • @user-tg5sv5ps2i
      @user-tg5sv5ps2i 1 month ago +5

      I can imagine it also just being more fault tolerant. Discrete = hard, continuous = easy. An overflow in digital can literally crash a whole system. In analog there is more room for error.
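The overflow failure mode mentioned in the reply above can be shown in a few lines: fixed-width digital arithmetic wraps abruptly at its limits, where an analog voltage would merely saturate or drift. A minimal Python sketch of 8-bit two's-complement wraparound (the `int8_add` helper is illustrative, not from the video):

```python
import struct

# 8-bit two's-complement arithmetic wraps from its maximum value straight
# to its minimum, whereas an analog quantity would merely saturate or drift.
def int8_add(a, b):
    """Add two numbers the way an 8-bit signed register does."""
    return struct.unpack('b', struct.pack('B', (a + b) & 0xFF))[0]

print(int8_add(100, 27))  # 127: still fine, right at the edge of the range
print(int8_add(127, 1))   # -128: one more step and the sign flips
```

Downstream logic that assumes the value only ever grows can fail badly on that sign flip, which is the "crash" the comment alludes to.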

  • @avinashkrishnamurthy6251

    Analogue was never meant to die; the technology of the time was the limiting factor, IMO. It looks like an analogue-digital hybrid system could do wonders in computing.

    • @DigitalJedi
      @DigitalJedi 1 month ago +2

      I know this is an old comment, but I figured I'd add that, as far as physical packaging goes, nothing stops us from putting one of these next to a conventional CPU. Cooling it would be the hard part, as temperature swings would shift the outputs by introducing noise. It might be better as an M.2 PCIe device.

    • @goldenhate6649
      @goldenhate6649 1 month ago

      @@DigitalJedi It's incredibly unlikely this will ever expand into the home. These would likely be built entirely differently from traditional computers.

    • @DigitalJedi
      @DigitalJedi 1 month ago +3

      @goldenhate6649 As we've seen, they can be built on NAND processes already, which are widely adopted in consumer electronics. The use case of low-power wake-word and condition detection seems like a great application if they can find the right product in the consumer space.

  • @ElectroBOOM
    @ElectroBOOM 2 years ago +513

    Awesome information!

    • @Mani_Umakant23
      @Mani_Umakant23 2 years ago +8

      I gave you your first like 😁

    • @N____er
      @N____er 2 years ago +16

      @@Mani_Umakant23 Why would you like such an unoriginal comment that provides so little value or thought?

    • @Mani_Umakant23
      @Mani_Umakant23 2 years ago +12

      @@N____er It just looked sexy, that's all.

    • @40.vedantdubey8c6
      @40.vedantdubey8c6 2 years ago +13

      @@N____er Don't say anything bad about ElectroBOOM he is such a wonderful creator

    • @40.vedantdubey8c6
      @40.vedantdubey8c6 2 years ago +4

      Hi sir I am a big fan of yours

  • @ModernBuilds
    @ModernBuilds 2 years ago +481

    Your videos are always awesome, and the fact that even I can comprehend them is amazing 🔥

    • @loturzelrestaurant
      @loturzelrestaurant 2 years ago +3

      I love science, so randomness be damned: I'm asking around to see if anyone
      wants some scientific watch suggestions, because learning never ends.

    • @yvettedath1510
      @yvettedath1510 2 years ago

      fuk all this future scamputing and globalist agenda

    • @yvettedath1510
      @yvettedath1510 2 years ago

      AI they want AI to control humanity

    • @HzGP
      @HzGP 2 years ago

      @@yvettedath1510 you are already controlled, my dear, and they didn't have to do much to achieve it... why would they need to control something or someone they ALREADY control? Why is there a need to control a sheep that, in the blink of an eye, will give up a kidney to have the latest smartphone on the market? Why are you so special that you would need to be controlled?

    • @radiopete7290
      @radiopete7290 2 years ago

      small brain person

  • @williamtell1477
    @williamtell1477 1 year ago +336

    AI researcher here, you did a great job on this. For anyone interested the book Perceptrons by Minsky/Papert is a classic with many proofs of limitations and explorations of the limits of the paradigm. It still holds up today and its fascinating to read what scientists were thinking about neural networks during the year of the moon landing!

    • @musbiq
      @musbiq 5 months ago +2

      Great recommendation. Thanks.

    • @christophertown7136
      @christophertown7136 3 months ago +1

      A Logical Calculus of the Ideas Immanent in Nervous Activity
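The limitation Minsky and Papert formalized in Perceptrons is easy to reproduce. The sketch below (plain Python; the learning rate and epoch count are arbitrary) trains a single-layer perceptron with Rosenblatt's rule: it converges on AND, which is linearly separable, but no choice of weights can ever reproduce XOR:

```python
def train_perceptron(data, epochs=100, lr=0.1):
    """Rosenblatt's rule: nudge each weight by lr * (target - output) * input."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in data:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    # return the trained linear-threshold classifier
    return lambda x1, x2: 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

f_and = train_perceptron(AND)
f_xor = train_perceptron(XOR)
print([f_and(*x) for x, _ in AND])  # [0, 0, 0, 1]: converges
print([f_xor(*x) for x, _ in XOR])  # can never equal [0, 1, 1, 0]
```

Fixing XOR takes a hidden layer, which is exactly the multi-layer direction the field eventually took.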

  • @YolandaEzeagwu
    @YolandaEzeagwu 1 year ago +10

    I had a business analysis course that tried to explain the perceptron, and I didn't understand anything; I don't have a strong maths background. This video is pure genius. The way you explain ideas is amazing and easy to understand. Thank you so much. This is my favourite channel.

  • @marsgizmo
    @marsgizmo 2 years ago +360

    amazing episode, well explained! 👏

  • @anishsaxena1226
    @anishsaxena1226 2 years ago +2650

    As a young PhD student in Computer Science, your explanation of how neural networks come to be and evolved, and the math behind it, is the cleanest and most accessible that I have come across. As I focus on computer architecture, I came to this video without much expectations of learning anything new, but I am glad I was wrong. Keep up the great work!

    • @deepblue3682
      @deepblue3682 2 years ago +3

      From USA?

    • @alex.g7317
      @alex.g7317 2 years ago +18

      There’s a reason he has 11, 000, 000, 000 subscribers after all 😏

    • @unstable-horse
      @unstable-horse 2 years ago +115

      @@alex.g7317 Wow, that's more than the population of Earth. Where does he find all those subscribers??

    • @exoops
      @exoops 2 years ago +31

      @@unstable-horse Mars

    • @alex.g7317
      @alex.g7317 2 years ago +44

      @@unstable-horse omg, lol 😂. That was a typo!
      I meant 11, 000, 000!

  • @NoahSpurrier
    @NoahSpurrier 1 year ago +69

    I remember seeing an analog differential calculator in high school in my physics and electronics teacher’s lab. It was more of a museum piece. It was never used.
    RIP Mr. Stark

    • @elliott8596
      @elliott8596 1 year ago +5

      To be fair, many of the tools we use are analog. We just don't call them "analog computers"... even though, they kind of are.

    • @rogerphelps9939
      @rogerphelps9939 1 year ago

      Exactly. Museums is where analog computers belong.

    • @certainlynotmalo1.0.06
      @certainlynotmalo1.0.06 29 days ago +2

      @@rogerphelps9939
      The words of someone who knows nothing but his own little world. And he is content with it. Honestly, I'm jealous.
      For real, stay that way or life will get an awful lot harder. I would give everything I have to acquire such luxury.

  • @asg32000
    @asg32000 5 months ago +5

    I've watched a lot of your stuff for years now, but this is the best one by far. Great job of explaining something so complex, difficult, and relevant!

  • @KraftyB
    @KraftyB 2 years ago +2032

    Fun fact: the graphics card he's holding at 18:00 is a Titan Xp with an MSRP of $1200. He says it draws 100 W, but it actually draws about 250 W, so that tiny chip that draws only 3 W is even more impressive.

    • @Matthew-rl3zf
      @Matthew-rl3zf 2 years ago +237

      Let's hope these new analog chips can solve our GPU shortage problem 😂

    • @justuskarlsson7548
      @justuskarlsson7548 2 years ago +411

      In general, when doing machine learning you are only using the CUDA cores of a graphics card, so the wattage never gets close to its maximum.
      A lot of the processing units are simply not being used, for example the shaders and 3D processing units. On my GTX 1080 I sit between 60-90 W out of 200 W when doing PyTorch machine learning, so 100 W out of a maximum of 250 W seems reasonable.

    • @chrisoman87
      @chrisoman87 2 years ago +68

      you can underclock GPUs; that's what they do in crypto mining to improve profit margins. Depending on the chip, they can operate efficiently at a fraction of their nominal power

    • @AC3handle
      @AC3handle 2 years ago +57

      man, I'm old enough to remember when a $1200 card was considered EX PENS >IVE<
      And not...'going price'.

    • @chrisoman87
      @chrisoman87 2 years ago +14

      @@AC3handle $1200 won't buy you enough power for a decent DL rig either. An RTX 3090 goes for ~$3000 USD

  • @Septimius
    @Septimius 2 years ago +6169

    I see Derek is getting into modular synthesizers! Also, funny to see how the swing in musical instruments to go from analog to digital and back is being mirrored in computing generally.

    • @paradox9551
      @paradox9551 2 years ago +290

      My first thought when he pulled out the analog computer was "Hey that looks like a modular synth!"

    • @toddmarshall7573
      @toddmarshall7573 2 years ago +27

      Witness Audio Modeling (search for it on KRplus).

    • @p1CM
      @p1CM 2 years ago +28

      Music has always been an AI task

    • @theisgunvald4219
      @theisgunvald4219 2 years ago +112

      As a semi-professional music producer with almost half a decade of working with professional musicians, I would agree - and this is mainly because people feel a lack of "soul" in music: those small human errors that we've spent decades trying to get rid of with Autotune, drum machines, sequencers, digital synthesisers and digital samplers (the last two CAN create sounds that will always come out the same way as long as the input stays the same; however, there are exceptions).
      This is probably something the people I know in the music industry refer to as “The generation-rule”, in brief the music today is a result of what our parents and grandparents heard combined with new technologies and pop culture. - If you’re interested in music and maybe want to stay ahead of the game look it up. Some refer to it as the “30 year rule” as well.

    • @PetraKann
      @PetraKann 2 years ago +5

      @@p1CM AI has no tasks

  • @bishalpaudel5747
    @bishalpaudel5747 9 months ago +3

    This is a very well explained video on analog computing. Never could I have thought the topic of analog computing could be put into a 20-minute video with such phenomenal animation and explanation. I respect your work and effort to make science available to all for free. Respect 🙏

  • @Psrj-ad
    @Psrj-ad 1 year ago +5

    this makes me want Derek to talk about neural networks and AI-related topics a lot more.
    it's not just extremely interesting but also constantly developing.

  • @belsizebiz
    @belsizebiz 2 years ago +420

    For amusement only: my first day at work was in 1966, as a 16-year-old trainee technician, in a lab dominated by a thermionic-valve analogue computer (or two). These kept us very warm through the winter months and cooked us during the summer. The task was calculating miss distances for a now-obsolete missile system.
    One day I was struggling to set up a resistor/diode network to model the required transfer function, but the output kept drifting. I found the resistors were warming up and changing value. Such are the trials and tribulations of analogue computing....

    • @_a_x_s_
      @_a_x_s_ 2 years ago +23

      Thus the temperature coefficient is very important for precision devices. And a high-accuracy, low-ppm resistor is expensive, which is one of the reasons high-end electronic instruments cost so much.

    • @mikefochtman7164
      @mikefochtman7164 2 years ago +13

      I was going to comment that one disadvantage of analog computers is keeping them calibrated. If you want a precise voltage or movement to represent a real-world value, you have to keep it calibrated. Older mechanical ones had wear and tear; electronic ones have issues as well.

    • @stefangriffin2688
      @stefangriffin2688 2 years ago

      Ah!? But what if the resistors were warming up, digitally?

    • @Cat-ir8cy
      @Cat-ir8cy 2 years ago +8

      @@stefangriffin2688 you can't have a digital resistor

    • @aravindpallippara1577
      @aravindpallippara1577 2 years ago

      @@stefangriffin2688 yeah, digital signals work with gates - on or off
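The drifting-resistor story at the top of this thread is easy to put numbers on. A toy model (all component values and the 200 ppm/°C temperature coefficient are invented for illustration): an ideal inverting op-amp stage has gain -Rf/Rin, so a warming feedback resistor shifts the answer directly:

```python
ALPHA = 200e-6                   # assumed tempco: 200 ppm per degree C
R_F0, R_IN = 10_000.0, 1_000.0   # nominal resistances at 25 C (illustrative)
V_IN = 0.5                       # the "problem input" encoded as a voltage

def v_out(temp_c):
    """Output of an ideal inverting op-amp stage whose feedback resistor drifts."""
    r_f = R_F0 * (1 + ALPHA * (temp_c - 25.0))   # linear resistance drift
    return -(r_f / R_IN) * V_IN                  # gain is -Rf/Rin

for t in (25, 45, 65):   # the resistor self-heating over a working day
    print(f"{t} C: Vout = {v_out(t):+.4f} V")
```

Twenty degrees of self-heating moves a 5 V answer by about 20 mV, which is exactly the kind of drift that constant recalibration (or digital encoding) exists to suppress.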

  • @jeffc5974
    @jeffc5974 2 years ago +917

    One of the first things I learned in Electrical Engineering is that transistors are analog. We force them to behave digitally by the way we arrange them to interact with each other. I'm glad there are some people out there that remember that lesson and are bringing back the analog nature of the transistor in the name of efficiency and speed.

    • @jasonbarron6164
      @jasonbarron6164 2 years ago +22

      At the expense of accuracy?

    • @JKPhotoNZ
      @JKPhotoNZ 2 years ago +59

      Well, semi-analogue. Don't forget the bias (voltage drop) before you get current amplification.
      Also, to say that analogue computers are more power efficient than digital is pretty hard to back up. A $2 microcontroller can run on a few mA for the desired task, then sleep on µA. You'll need at least 5 mA for an analogue computer to start with, and you can't make it sleep.

    • @danimayb
      @danimayb 2 years ago +22

      @@JKPhotoNZ Great point. And with current nano-scale transistor technology, that efficiency (along with raw power) goes far beyond what a true analogue system could produce.

    • @rahulseth7485
      @rahulseth7485 2 years ago +2

      Yeah, but then you'll never know which zone it is operating in, because amplification happens differently for different input parameters. And not all transistors from the same batch perform the same, i.e. it will lack repeatability (as Derek mentioned).

    • @mycosys
      @mycosys 2 years ago +20

      the insoluble (even in theory) problems of analog are noise and signal integrity, which is why he didn't even mention them. This channel has gone to poop honestly.

  • @lc5945
    @lc5945 1 year ago +14

    I remember the first time I heard the term "deep networks", it was back in 2009 when I was starting my MSc (using mainly SVMs), a guy in the same institute was finishing his PhD and introduced me to the concept and the struggles (Nehalem 8 cores era)...the leaps in performance made in NN since then thanks to GPGPU are enormous

  • @HrLBolle
    @HrLBolle 7 months ago +4

    Mythic's floating-gate approach:
    It kind of reminds me of the copper-wire memory planes with ferromagnetic rings representing the bits, used as memory for the AGC (Apollo Guidance Computer).
    The video this memory is based on was released by Destin, aka Smarter Every Day, and accompanied his and Linus Sebastian's meeting with Luke Talley, a former IBM employee and, at the time of the Apollo missions, a member of the data analysis teams responsible for the analysis, evaluation and processing of the telemetry data received from the Apollo instrument ring.

  • @funktorial
    @funktorial 2 years ago +388

    started watching this channel when I started high school and now that I'm about to get a phd in mathematical logic, I've grown an even deeper appreciation for the way this channel covers advanced topics. not dumbed down, just clear and accessible. great stuff! (and this totally nerd-sniped me because i've been browsing a few papers on theory of analog computation)

    • @gautambidari
      @gautambidari 2 years ago +14

      Absolutely. Love the way he covers the concept for everyone. Those who don't know in depth about it can still go away with a sort of basic understanding. And those who do understand it in depth will enjoy discovering new areas of invention that they can further explore.
      Looking forward to reading some papers on using analog computing in neural network applications

    • @victorymorningstar
      @victorymorningstar 2 years ago +8

      I'm smart too.

    • @mentaltfladdrig
      @mentaltfladdrig 2 years ago +5

      Same here. But I didn't go to high school, my life became a total mess, and I haven't graduated whatsoever :)

    • @SteveAcomb
      @SteveAcomb 2 years ago

      “nerd-sniped” lmao I feel exactly the same
      and here I was thinking I was way ahead of the curve on alternate computing 😂
      jokes on me

  • @SandorFule
    @SandorFule 1 year ago +498

    I am a process control engineer, born in '63. In the '80s we used analog computers to calculate natural gas flow for the oil and gas company. A simple flow computer was around 10 kilos, full of op amps and trimmer pots. It was a nightmare to calibrate. :)

    • @deang5622
      @deang5622 1 year ago +17

      Op amps with their offset voltages and input bias currents leading to inaccuracy.
      Sounds like a nightmare.
      Constant recalibration required?

    • @rogerphelps9939
      @rogerphelps9939 1 year ago +5

      Absolutely.

    • @victorblaer
      @victorblaer 1 year ago +6

      @@rogerphelps9939 just calculating the uncertainty at each step sounds like a nightmare.

    • @percutseituan
      @percutseituan 1 year ago

      but you can mix in digital control for adjustment and decision-making

    • @benoitroehr4100
      @benoitroehr4100 1 year ago +1

      I think NASA (or was it still NACA at the time?) was able to simulate flight characteristics with analog circuits too. I'm thrilled to see this tech coming back!

  • @snerttt
    @snerttt 6 months ago +21

    I'd be interested to see a digital computer adopt an analogue component, possibly utilized for physics simulation, much like how a GPU is used to create graphics independently of the CPU.

  • @davidchristensen4643
    @davidchristensen4643 3 months ago +1

    It's interesting how circular technology is. Back in the 1970's my first job out of uni was with a national research association focused on all things to do with ships in the UK. Whilst the primary focus of my work was providing QA services to the various research teams, including maintaining and enhancing language systems like RATFOR, and system management of the ICL, IBM, Perq, CV & DEC systems, I was also involved in developing two specific analogue/digital hybrid projects. One was focused on managing and monitoring loading balances for bulk cargo ships and the other was simulating ship navigation into ports in real-time. Both of these projects involved interfacing the analogue data from real-time sensors to digital monitoring and mapping algorithms. Unfortunately, at that time, analogue was seen as a historical burden and both were eventually canned. Now, almost 50 years later, it's great to see that our ideas of the 70's are coming back into fashion.

  • @PersonaRandomNumbers
    @PersonaRandomNumbers 2 years ago +446

    My professor always said that the future of computing lies in accelerators -- that is, more efficient chips that do specific tasks like this. He even mentioned analog chips meant to quickly evaluate machine learning models in one of his lectures, back in 2016! It's nice seeing that there's been some real progress.

    • @xveluna7681
      @xveluna7681 2 years ago +13

      That's pretty much where things have always been: using basic building blocks that do specific functions. A linear voltage regulator has the job of maintaining a constant output voltage over a given range of currents and input voltages. You can buy an op-amp and use resistors to make a circuit called a Schmitt trigger. Or you might just buy a Schmitt trigger from Texas Instruments and put it onto a board, consuming less board space. Or a Schmitt trigger might be embedded for free in certain other ICs (integrated circuits).
      The major computing engines I have seen so far have effectively been GPUs, CPUs, and FPGAs. Xilinx and Altera (now Intel) have specialized in making FPGAs. An FPGA's basic internal components are logic elements: flip-flops with reset and async-reset inputs, a 4-input look-up table, etc. Cascade these to make larger units like a multiplexer, a floating-point arithmetic unit, and so on. It's programmable, so you can effectively emulate a worse-performing specialized CPU; a CPU is still more efficient at doing CPU-type functions. A GPU does specific stuff as well.
      The idea of doing analog computations honestly just sounds like another building block to add to a complex system. There simply hasn't been a large enough demand to justify generating specialized hardware like what was described in this video. That one start-up sounds like it's developing a chip that will do a series of very specific functions and will need to be integrated into larger systems to accomplish a specific task.

    • @matsv201
      @matsv201 2 years ago +2

      Well, that has sort of always been the case.
      I don't know what the first accelerator was, but one of the fairly early ones was the FPU. We now just take it for granted.
      Sprite accelerators were also fairly early.
      Then graphics accelerators.
      Then video decoders/encoders.
      Then MMU accelerators.
      Then 3D accelerators.
      Then SIMD accelerators.
      Then T&L accelerators.
      Then physics accelerators.
      Then raytracing accelerators.
      Then deep learning accelerators.

    • @LundBrandon
      @LundBrandon 2 years ago +2

      ASIC devices have existed for decades...

    • @calculator4482
      @calculator4482 2 years ago

      @@LundBrandon they will soon become obsolete though, due to reconfigurable computing devices like FPGAs

    • @LundBrandon
      @LundBrandon 2 years ago +1

      @@calculator4482 FPGAs have also been around for decades, plus they draw more power. I'm a computer engineering student right now currently designing a CPU to be synthesized onto an FPGA. I'm not dumb.

  • @suivzmoi
    @suivzmoi 2 years ago +272

    as a NAND flash engineer, that bit about using floating gate transistors as analog computers is interesting, particularly because in flash memory there is a thing known as "read disturb", where even low voltages applied to the gate (like during a flash read) to query its state can eventually cause the state itself to change. you would think it is a binary effect where if the voltage is low enough it would just be a harmless read, but no... eventually there will be electron build-up in the gate (reading it many times at low voltage has a similar effect to programming it once at high voltage). in this particular application the weight would increase over time the more you query it, even though you didn't program it that high in the beginning. it's interesting because it's sort of analogous to false memories in our brains, where the more we recall a particular memory, the more inaccurate it can become.

    • @donkisiko
      @donkisiko 2 years ago +15

      Underrated comment!

    • @Xavar1us
      @Xavar1us 2 years ago +13

      Absolutely love this comment! This has been on my mind for at least an hour now, the point you make is intriguing and a bit haunting, thanks for that!

    • @JeyeNooks
      @JeyeNooks 2 years ago +3

      Fkin right on!!

    • @Lassana_sari
      @Lassana_sari 2 years ago +2

      Very interesting.

    • @sampathsris
      @sampathsris 2 years ago +6

      Underrated comment. Then in Eternals style we will have to reprogram the memories of our servants every now and then.
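The read-disturb creep described at the top of this thread can be caricatured in a few lines. This is a toy linear model, not a device equation; the charge-injected-per-read figure is invented:

```python
# Toy model of read disturb: a floating-gate cell stores an analog weight as
# charge, and each low-voltage read injects a tiny extra charge, so the
# stored weight creeps upward the more often it is queried.
PROGRAMMED = 1.0          # weight as programmed once, arbitrary charge units
DISTURB_PER_READ = 1e-9   # assumed charge added by a single read (invented)

def weight_after(reads):
    """Effective stored weight after a given number of read operations."""
    return PROGRAMMED + DISTURB_PER_READ * reads

drift = weight_after(10_000_000) - PROGRAMMED
print(f"after 10M reads the weight has crept up by {drift:.3%}")  # about 1%
```

In a digital cell that creep is invisible until a threshold flips; in an analog weight it biases every multiply, which is why the false-memory analogy in the parent comment is apt.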

  • @fierybones
    @fierybones 2 months ago +1

    I happened to watch this just after playing with a modular (audio) synthesizer. In these, each module is controlled by a voltage, originating from an oscillator, a keyboard, or a "sequencer". The concept that makes a modular synth interesting is, the voltage pattern (waves) output from a module can either be used as an audio signal (if it's in the audio spectrum), or to control another module. In the simplest case, output from a voltage controlled oscillator (VCO) can be routed to a speaker to produce a sound. But it can also be routed to a module that filters a signal in some way, based on the output voltage of its predecessor.
    Maybe the thing that makes "ambient" music's slowly-shifting textures interesting is that they mimic the neural networks of our brains.

    • @certainlynotmalo1.0.06
      @certainlynotmalo1.0.06 29 days ago

      A lot of them actually do! You can even (kind of) help your brain waves synchronise with the oscillations; it's not by brute force (you have to play along or it doesn't work very well), but it can greatly help with sleeping, learning and related things.
      Reminds me of old hardware synths, where you had to connect each of the synth parts with cables. But that gave you an amazing amount of flexibility! BUT no one cared to write the configurations down... That was the funniest and most awful part at the same time...
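The patching idea in this thread (one module's output voltage steering another module's behavior) translates directly into code. Below is a sketch of a basic vibrato patch, assuming the common 1 V/octave pitch convention; the sample rate, LFO rate, and modulation depth are all illustrative:

```python
import math

SR = 8_000  # sample rate in Hz (illustrative)

def lfo(t, freq=5.0):
    """Slow sine oscillator whose output we treat as a control voltage, +/-1 V."""
    return math.sin(2 * math.pi * freq * t)

# Patch: LFO output -> VCO pitch input (1 V/octave), VCO output -> "speaker".
out, phase = [], 0.0
for n in range(SR):                   # one second of audio
    cv = 0.05 * lfo(n / SR)           # gentle vibrato depth on the pitch CV
    freq = 440.0 * 2 ** cv            # exponential 1 V/octave response
    phase += 2 * math.pi * freq / SR  # accumulate phase so freq can vary
    out.append(math.sin(phase))       # the audible, wobbling sine
```

Swap the LFO for a sequencer or an envelope and the same few lines describe most classic modular patches, which is why the front panel of an analog computer looks so familiar to synth players.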

  • @jasonturner1045
    @jasonturner1045 1 year ago +1

    How have I not found this channel before now?? Fascinating topics.

  • @TerryMurrayTalks
    @TerryMurrayTalks 2 years ago +333

    As a 70-year-old boomer my technical education involved building and testing very basic analog devices. Thanks for this video, it helped me to a better understanding of neural networks.

    • @magma5267
      @magma5267 2 years ago +16

      You must be really healthy, because you don't even look close to 70! :D

    • @TerryMurrayTalks
      @TerryMurrayTalks 2 years ago +32

      @@magma5267 Thanks for the heads up. I've got a good woman, 6 children, 8 grandchildren and a recently placed stent in my heart that keeps me going :)

    • @vedkorla300
      @vedkorla300 2 years ago +5

      @@TerryMurrayTalks Good for you my man! I am still 20 and don't know what to do in life :(

    • @vource2670
      @vource2670 2 years ago +1

      Yep, you're 70, mate

    • @TerryMurrayTalks
      @TerryMurrayTalks 2 years ago +11

      @@vedkorla300 You have plenty of road left to travel, follow what you love, enjoy the journey, don't let the bumps in the road stop you and if you can get a soul mate to share it with you it will all be good.

  • @IronAsclepius
    @IronAsclepius 1 year ago +1383

    My undergraduate work was actually with a professor who did research on the brain as an analog computer, using neural networks and analog computing in an attempt to achieve super-Turing computation. A researcher whose name is worth looking into in all this is Hava Siegelmann. At the time I understood much less about the problem. My task was essentially to try to prove that analog computation could be modeled with a neural network on a digital computer. Not sure if my comment will be buried or not, but it's an area worth looking into if you're more deeply interested in this problem.

    • @AayushPatel-gc3fw
      @AayushPatel-gc3fw 1 year ago +15

      I have never done much/extremely deep research on a topic, but this seems very interesting.

    • @raystir98
      @raystir98 a year ago +19

      I'd like your comment to not be buried.

    • @noblenessdee6151
      @noblenessdee6151 a year ago

      0s and 1s, highs and lows, voltage and no voltage (digital representations of numbers) have absolutely nothing to do with brain neurons. It's complete BS. For all we truly know about the brain, there could be a nearly limitless amount of information with every firing of a neuron. We have no idea what format consciousness information is in, and likely never will as "in time" humans.

    • @AayushPatel-gc3fw
      @AayushPatel-gc3fw a year ago +6

      @@noblenessdee6151 Engineers : well I will approximate every thing a neuron is saying, to just two numbers. 🙂...

    • @beeswaxlover
      @beeswaxlover a year ago +18

      @@AayushPatel-gc3fw words are the limitations, not numbers, all words can be expressed in code, not all humanity can be expressed in words.

  • @gregseljestad2793
    @gregseljestad2793 10 months ago +2

    I just found out that the SR-71 engines had a hydraulic computer that ran the system. That would be amazing to see. I worked at Caterpillar and a friend of mine was tasked with converting a scraper transmission module from a hydraulic base to electronic. It was a very old design and all its engineers had passed on. They had a team of engineers that had to replicate all hydraulic functions into an electrical equivalent. It was fascinating to me. One of the functions they had to replicate is going up a steep hill with a full load and being able to shift without rolling backwards: holding the load, sharing the load with two clutches, and increasing one clutch while reducing the other clutch to make it a seamless shift. So I enjoyed this topic. Thanks!

    • @SaanMigwell
      @SaanMigwell 3 months ago

      Most nuclear power plants are pneumatic computers. Well, the old subs and breeder reactors anyway.

  • @Paul-rs4gd
    @Paul-rs4gd a year ago +7

    I can see this analog technology being used in special purpose AI processors attached to normal digital computers. It makes sense - they could provide very large scale, cheap and energy efficient Neural Net acceleration. Since it appears that 'scale' is the most important thing for AI, it is really important to bring down the cost and energy consumption, so we can all run GPT3 on our laptops :)

  • @dust7962
    @dust7962 2 years ago +1093

    The problem with this system of computing is that interference is a huge factor. When you only test to see if there is voltage or not, then you don't need to worry about interference. But when you get into building systems that use varying voltages, say 0 V, 0.5 V, 1 V, then you need to worry about interference, and the more numbers you add, the greater a factor this becomes. Interference could come in the form of microwaves, radiation, mechanical vibration (think fans cooling off a server rack), and the list drags on, as almost anything can cause interference. That oscilloscope used in the example is an expensive piece of equipment that minimizes interference, but the cost is far higher than with a binary number system.
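
The noise-margin point in the comment above can be sketched numerically. A minimal illustration (level counts, voltage range, and the 0.4 V disturbance are all made up for the example): encode a value as one of N evenly spaced levels in a 0-1 V range, add interference, and snap back to the nearest level.

```python
# Illustrative sketch of noise margins: a value becomes one of N evenly
# spaced voltage levels in a 0-1 V range, then a noisy readout is
# snapped back to the nearest level. All numbers are made up.

def encode(value, levels, vmax=1.0):
    """Map an integer 0..levels-1 to a voltage."""
    return value * vmax / (levels - 1)

def decode(voltage, levels, vmax=1.0):
    """Snap a (possibly noisy) voltage back to the nearest level."""
    step = vmax / (levels - 1)
    return min(levels - 1, max(0, round(voltage / step)))

# A binary signal survives 0.4 V of added noise: the levels are 1 V apart.
assert decode(encode(1, 2) + 0.4, 2) == 1
# With 16 levels packed into the same range (1/15 V apart), the same
# interference corrupts the stored value.
assert decode(encode(8, 16) + 0.4, 16) != 8
```

This is why multi-level systems need tighter shielding or more error correction than binary ones, as the thread discusses.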

    • @introprospector
      @introprospector 2 years ago +141

      Binary computers have to deal with interference too, that's handled by error correction. Error correction is already baked into the infrastructure of every digital component, to the point where we don't realize it's there. They suggested one method of error correction in the video, and they're probably not even scratching the surface of what's possible.

    • @fltfathin
      @fltfathin 2 years ago +5

      I think the crux is the medium: the AI models the brain, which is so good at rebuilding itself, and it uses electrons and chemicals to convey information. Our transistors are too limited to mimic that interaction. For example, the new 3 W chip needs to be custom made for each model, if I got it right.

    • @dust7962
      @dust7962 2 years ago +73

      @@introprospector Yes, but with binary, error correcting is simpler as interference isn't as much of a burden on the architecture. When the job is to check whether there is or isn't voltage, it is a lot less complex than checking 8 different voltage thresholds.

    • @dust7962
      @dust7962 2 years ago +35

      @@fltfathin This is called an ASIC (application-specific integrated circuit); the computer is pretty much just sent to the landfill after it has outlived its usefulness instead of having the ability to be repurposed. Which is another concern about where computing in general is heading: as PCBs use less and less semi-precious or precious metals, there is less incentive to recycle.

    • @JayJay-dp8ky
      @JayJay-dp8ky 2 years ago +6

      @@dust7962 Yeah but I put my mobo in the case first and then the radiator wouldn't fit, so I had to take it out and install the radiator first. It was really annoying. I didn't watch this video, but I'm assuming this is what he was talking about.

  • @DomDomPop
    @DomDomPop a year ago +996

    It’s funny, for those of us who are into electronic music production, analog never left! There are lots of great analog synthesizers out there that can produce all kinds of complex waveforms, and some of us have been known to tack an oscilloscope on to modular gear to view and use those waveforms. Even some relatively simple gear can produce complex, “3D” structures with the correct cable patches. A lot of what you described at the beginning is the backbone of synthesis for music, and the same principles obviously apply to mathematical operations.

    • @rogerphelps9939
      @rogerphelps9939 a year ago +17

      You can do everything digitally that an analog system can do and more. An example is resampling in order to change the frequency scale of a recording. This can be done in real time using digital methods, not so much for analog methods.

    • @DomDomPop
      @DomDomPop a year ago +53

      @@rogerphelps9939 Depends on what you’re doing and what’s important to you. Analog synths are great for experimenting with the knobs and patch bay (if available) and learning what effect each change has on the overall waveforms. They’re really great for learning what exactly you’re doing and what you’re getting as a result. Yeah, there are software synths meant to emulate hardware knobs and a patch bay, but I haven’t found clicking through all that as valuable as plugging in and experimenting yourself. That stuff really depends on the person, though.
      What doesn’t depend on the person, and is arguably more important, is the fact that aliasing can end up being a problem on digital synths. When you start doing some crazy cross modulation between sources and/or you’re dealing with lots of harmonics, if the processor can’t keep up, your sound will suffer. Same with super high frequencies. Depends on the synth, of course, but analog synths can tend to have a warmer, purer sound to them as well, because you don’t have to emulate all those harmonics. It really comes down to the same arguments being made here regarding analog computers: there’s no processor overhead needed to create some very complex shapes, and to do so perfectly accurately, on analog. I use both types of synths, as lots of people do, and I would never say that one somehow makes the other unnecessary. Hell, there are hybrid synths that give a mostly analog signal path while allowing for, say, a digital sample and hold circuit and the ability to save certain parameters. People make those kinds of things for a reason, you know?
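
The aliasing issue mentioned above can be demonstrated in a few lines. A minimal sketch (frequencies chosen arbitrarily for the example): a 7 kHz sine sampled at 10 kHz produces exactly the same sample values as a phase-inverted 3 kHz sine, so once sampled the two tones are indistinguishable.

```python
import math

# Illustrative aliasing demo: at a 10 kHz sample rate, a 7 kHz tone
# and a phase-inverted 3 kHz tone produce identical samples, because
# 7 kHz folds back around the Nyquist frequency (5 kHz).
fs = 10_000                     # sample rate in Hz
f_high, f_alias = 7_000, 3_000  # original tone and its alias

for n in range(50):
    t = n / fs
    s_high = math.sin(2 * math.pi * f_high * t)
    s_alias = -math.sin(2 * math.pi * f_alias * t)
    assert abs(s_high - s_alias) < 1e-9  # samples match to rounding error
```

An analog signal path never samples, so this particular failure mode simply does not arise there.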

    • @victorpereira8000
      @victorpereira8000 a year ago +3

      Pythagoras discovered math with music I think right? Really like your comment

    • @RAndrewNeal
      @RAndrewNeal a year ago +22

      @@rogerphelps9939 The difference is that you need billions to trillions of transistors to do digitally what can be done with tens to hundreds of transistors in analog.

    • @rogerphelps9939
      @rogerphelps9939 a year ago +8

      @@RAndrewNeal Wrong. The errors arising from component tolerances, noise and temperature dependent offsets make anything complicated pretty much impossible in analog. Transistors in digital processors are extremely cheap. Provided you have good DACs and ADCs you can do anything to whatever precision you need in digital.

  • @activision4170
    @activision4170 3 months ago +1

    Great video. Never knew this was a thing. Very useful. Might one day just be an extra part on the motherboard designed for fast approximation calculations

  • @kasparsiricenko2240

    When I was at my institute back in 2016, I was thinking of these specific “gates” too, as an undergraduate. I knew someone was already implementing it, but I still miss the time when I could have been part of the innovations. What a genius way of reimplementing circuits for neural networks. Maybe that's what the future of FPGAs is: neural networks

  • @melanezoe
    @melanezoe 2 years ago +236

    Freaked me out to see that opening analog plug board. That’s how I learned programming in my first data processing class at Fresno State University, in 1964. Eerie to have that memory rise.

    • @Ozhull
      @Ozhull 2 years ago +9

      Damn you're old! Glad you're still kicking around

    • @jimmysyar889
      @jimmysyar889 2 years ago +9

      @@Ozhull he’s only like 75 chill

    • @amaan06
      @amaan06 2 years ago

      Lol

    • @beesharp9503
      @beesharp9503 2 years ago

      Woo fresno!

    • @PaulJosephdeWerk
      @PaulJosephdeWerk 2 years ago

      I graduated Fresno State in 1993 with a BS in CS (after a stint in the military). I even took an artificial intelligence class. I still have my perceptron book.

  • @Ave117
    @Ave117 2 years ago +575

    This actually helped me a lot to understand how neural networks work in general. For me it was kinda like black magic before. It still is to an extent, but knowing that modern neural networks are kind of more complex multi-layered perceptrons helped a lot.
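
The "more complex multi-layered perceptrons" idea can be made concrete with a toy example. A minimal sketch with hand-picked (not trained) weights: two threshold neurons feeding a third compute XOR, something no single threshold neuron can do.

```python
# Minimal multi-layer perceptron sketch with hand-picked weights
# (nothing is trained here): two hidden threshold units feed one
# output unit to compute XOR, which a single unit cannot.

def neuron(inputs, weights, bias):
    """Classic perceptron unit: weighted sum, then a hard threshold."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) + bias > 0 else 0

def xor(a, b):
    h1 = neuron((a, b), (1, 1), -0.5)       # fires on "a OR b"
    h2 = neuron((a, b), (1, 1), -1.5)       # fires on "a AND b"
    return neuron((h1, h2), (1, -2), -0.5)  # "OR, but not AND"

for a in (0, 1):
    for b in (0, 1):
        assert xor(a, b) == (a ^ b)
```

Stacking such layers (with smooth activations instead of hard thresholds) is exactly what makes modern networks trainable and expressive.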

    • @chibicitiberiu
      @chibicitiberiu 2 years ago +17

      Yes, indeed. Something I find fascinating are recurrent networks, where some neurons feed back into the network which would allow some information to be saved from one image to the next one. This would allow the AI to process things that change in time, like music and video. For example, if you're tracking a subject with a camera and he turns around, a recurrent AI would be able to continue tracking the subject.

    • @connorjohnson4402
      @connorjohnson4402 2 years ago +3

      Yeah, in the end some of it is kind of voodoo black magic though, I mean they call them black boxes for a reason.

    • @Blox117
      @Blox117 2 years ago

      speak in english pls

    • @aladdin8623
      @aladdin8623 2 years ago +32

      The video seems to contain some quite biased info though. The top-5 error rate of 5.1% for humans is of course not accurate. If human beings were that bad, we would accordingly have much, much higher car accident rates. Those kinds of inaccurate percentages come from statistics based on captchas, and several conditions distort the results there.
      - Human users often don't bring up the concentration and attention to solve captchas that they actually could. In fact they are angered by them and often just click quickly through them.
      In traffic, while driving a car, human beings are much more alert and make tremendously fewer mistakes. Here human beings still beat autonomous driving computers by several orders of magnitude, measured in car accidents per million driving hours.
      - The captchas often do not match actual human perception. The captcha images are often unclean, with low resolution and distortions. In the real world humans perceive much higher quality from their surroundings than some crippled captchas. A clearer image increases recognition dramatically.
      It is really crucial in educational videos from whom and from where you take your numbers. Science is not always as objective as we are told, especially when corporations with their own interests are funding it.
      Other than that the video is quite interesting. I also wish though that many common misconceptions had been cleared up. For example many people still believe that computers work like human brains. This is plain nonsense, mostly spread by science fiction. The brain still poses big mysteries to us, especially the "hard problem of consciousness".

    • @anteshell
      @anteshell 2 years ago +14

      It is still kind of voodoo or black magic. While the overall working mechanisms are well known and the output can be estimated based on the input, how the neural network exactly reaches the answer is nearly impossible to inspect because of the sheer number of variables.
      In essence, you feed a black box with something and you can expect it to give you a particular answer with some confidence, but no-one has any damn idea what exactly happens inside the black box.

  • @santiagojimenezpinedo3473
    @santiagojimenezpinedo3473 10 months ago +4

    This is really cool, and there is another startup that has a different approach using analog, but instead of using voltages and currents they use light, so it is really interesting how analog is coming back. I would really appreciate it if you would make a video about this. The startup is Lightelligence. As always, thanks for these videos.

    • @frightenedsoul
      @frightenedsoul 2 months ago

      Terrible name, though lol. Lightelligence. I get the idea behind it but it just doesn’t work as a satisfying portmanteau

  • @di380
    @di380 2 months ago

    I agree. One point I was going to mention regarding analog computers is that they are susceptible to voltage fluctuations and environmental noise, and the accuracy of your results is directly dependent on the accuracy of the equipment reading the output voltages. There is that, but this makes sense when talking about specific applications like this one 👌

  • @quietcanadian5132
    @quietcanadian5132 2 years ago +155

    I’ve been an engineer for 44 years. Great video. I actually worked on analog computers in the 70s when digital processing was still new. Never to this level though. Great job!

    • @apollochaoz
      @apollochaoz a year ago

      🇨🇦🏳‍🌈

    • @raijuko
      @raijuko a year ago +3

      It's amazing how fast all of this is evolving. Looking at this, and comparing it to facial recognition software in simple phone apps we have now, really shows how much all of this has influenced what kids and teens easily use today.

    • @johndoh5182
      @johndoh5182 a year ago +1

      I didn't catch the part where he quit talking about analog systems, though, when he went to the logic systems being used for matrix operations, because that was digital. There may have been analog inputs into the system, but there's an A-to-D conversion, and everything he showed at the end was strictly digital, so it's a bit misleading there. Current systems for AI are digital.

    • @GeovanniCastro666
      @GeovanniCastro666 a year ago

      @@raijuko Yes, but I still believe in God. And I am a fan of science, experimenting and inventing

  • @lonewulf0328
    @lonewulf0328 2 years ago +490

    This was one of the best layman's explanations of neural net training models that I have ever seen. Awesome content!

    • @duongchuc1834
      @duongchuc1834 2 years ago +1

      ok

    • @patakk8145
      @patakk8145 2 years ago +3

      but it isn't, he literally said he's going to skip back propagation (which is how models are trained nowadays)

    • @PaulAVelceaVSC
      @PaulAVelceaVSC 2 years ago +7

      I am a layman; I did not understand a bit of it, pun intended

  • @spoidermon2515
    @spoidermon2515 a year ago +1

    Damn Man!! You explained it pretty well!! All that history and theory wrapped in 22 mins! Incredible!

  • @user-ou8qw2sg3d
    @user-ou8qw2sg3d 24 days ago

    This blows my mind. Thank you. It's so cool to learn this way about algorithms.

  • @robertb6889
    @robertb6889 2 years ago +308

    As a guy who helps manufacture flash memory I find this really intriguing: especially because flash memory is continuing to scale via 3-D layering, so there’s a lot of potential, especially if you can build that hardware for multiplication into the chip architecture.

    • @ravener96
      @ravener96 2 years ago +3

      You are still struggling with interconnects from one side to the other

    • @Zeuskabob1
      @Zeuskabob1 2 years ago +4

      @@ravener96 With many ML algorithms you can split problems into multiple sub-problems for different networks to handle. I wonder if developing that area of ML would be helpful to make effective analog systems? For example, in image processing a pixel at the top left of the image has little interaction with a pixel in the bottom right of the image compared to nearby pixels. If you wait to compare them until multiple layers later, it speeds up processing the image and allows algorithms to become more adept at finding sub-patterns in the image.

    • @martiddy
      @martiddy 2 years ago +3

      @@Zeuskabob1 Depends on what kind of image processing the neural network is doing. If the computer wants to identify a face on a person, maybe it doesn't need to process all pixels once it has processed all the pixels near the face, but in some cases distant pixels can indeed be correlated, like images from a camera in an autonomous car identifying the white lines of a street, where it could be 99% sure it is a straight line but the corner pixels clearly indicate that it is a curved line.

    • @robertb6889
      @robertb6889 2 years ago

      @@ravener96 Yeah, but interconnects can be designed around with clever architecture to an extent. It's still quite interesting.

    • @seldompopup7442
      @seldompopup7442 2 years ago

      Flash cells are micron scale, while the AI accelerators doing integer operations are built with the latest 4 nm technology. And floating gates have really limited life compared to pure logic circuits.

  • @BrianBoniMakes
    @BrianBoniMakes 2 years ago +417

    I used to calibrate analog computers that ran experiments and test equipment. They were often odd mixtures of analog and digital technologies. Near the end I had to keep a few machines alive as they aged out of tolerance; there was always a way you could tweak out some more performance by shifting the calibration away from areas you didn't need, in a much more forgiving way than any new digital machine could.

    • @nenmaster5218
      @nenmaster5218 2 years ago +2

      Anyone know some good science channels for me to check out?

    • @yash1152
      @yash1152 2 years ago +2

      thanks a lot Brian Boni for your valuable input
      (keys: computers: mix of analog and digital)

    • @yuro5833
      @yuro5833 2 years ago +1

      @@nenmaster5218 Nile red and Nile blue

    • @nenmaster5218
      @nenmaster5218 2 years ago

      @@yuro5833 Thx!
      Know Hbomberguy?

    • @yuro5833
      @yuro5833 2 years ago

      @@nenmaster5218 I actually thought I didn’t then realized I had seen several of his videos and forgot about him so thank you as well

  • @joaoluizpestanamarcondes6219

    Bro, this channel is crazy top-shelf stuff, I'm amazed, thank you for that

  • @javierperea8954

    That's so beautiful. Using a photocell as an analog to digital interface, with the advantages of both systems applied effectively in a system.

  • @FragEightyfive
    @FragEightyfive 2 years ago +463

    Using an analog computer to demonstrate differential equations is a perfect teaching tool. I really wish tools like this were used in colleges more often.

    • @Elrog3
      @Elrog3 2 years ago +10

      They already use crap like this far too often. This isn't something useful for a differential equations course. Maybe it would be OK for a circuits course, or even a computer science course focused solely on analog computers. In math, just give us the numbers and the logic... Don't waste our time with this stuff.

    • @Elrog3
      @Elrog3 2 years ago +4

      @@JackFalltrades I am an engineering student.

    • @Elrog3
      @Elrog3 2 years ago +3

      @@JackFalltrades I'm not calling letting students know of use-cases for things crap. I'm calling it crap to take up class time that is meant for teaching students the logic of how to solve differential equations (because that is the class the original poster said it would be good for) and instead use that class time to teach something that only a tiny fraction of the class would ever use.

    • @quotidian8720
      @quotidian8720 2 years ago +5

      it is used in control systems

    • @Noootch
      @Noootch 2 years ago +18

      @@Elrog3 He never said it should be used in a differential equations course. You just sound like the type of students that go to university and ask which courses they need in order to get a high salary position in industry.

  • @nicholasjayaputra5754
    @nicholasjayaputra5754 2 years ago +476

    I thought there was no follow-up to the first part; thank you for the satisfaction you have given me through the knowledge I got from this video

    • @zaksmith1035
      @zaksmith1035 2 years ago +1

      Can't wait to watch this with my kids. I forgot it was coming, we were waiting so long for it.....

    • @AxxLAfriku
      @AxxLAfriku 2 years ago +1

      NO! NO! NO! Many people say I am sick in the head. NOOOO!!!! I don't believe them. But there are so many people commenting this stuff on my videos, that I have 1% doubt. So I have to ask you right now: Do you think I am sick in the head? Thanks for helping, my dear nico

    • @byronvries3826
      @byronvries3826 2 years ago +1

      @@AxxLAfriku 0

    • @nicholasjayaputra5754
      @nicholasjayaputra5754 2 years ago +2

      @@zaksmith1035 That's awesome mate

    • @nicholasjayaputra5754
      @nicholasjayaputra5754 2 years ago +4

      @@AxxLAfriku I'm honoured to have your bot-like reply in my comment. As for the answer, well, I don't know, but have a good day mate!

  • @NR-bt7yz
    @NR-bt7yz 5 months ago

    I've recently started learning ML and this video helps so much. You just made me a Patreon supporter. Thanks Derek!

  • @mbharatm
    @mbharatm a year ago

    Amazing, thought-provoking two-part video on analog computing. Veritasium never disappoints!

  • @harrybarrow6222
    @harrybarrow6222 2 years ago +346

    Rosenblatt’s Perceptron was essentially a one-neuron network, although it could perform logical operations on the binary data inputs before passing results, which gave it more power.
    Minsky and Papert at MIT were concerned that Rosenblatt was making extravagant claims for his Perceptron and scooping up a lot of the available funding.
    In their book, “Perceptrons”, Minsky & Papert proved that one-neuron networks were limited in the tasks they could perform.
    You could build networks with multiple Perceptrons, but since Perceptrons had binary outputs, nobody could think of a way to train such networks.
    That killed funding for neural networks for decades.
    In the late 1980s, interest was re-kindled when John Hopfield, a physicist, came up with a training technique that resembled cooling of a physical spin-glass system.
    But the big breakthrough came when the error back-propagation technique was developed by Rumelhart, Hinton & Williams.
    In this, neurons were modified to have a continuous non-linear function for their outputs, instead of a thresholded binary output.
    Consequently, outputs of the network were continuous functions of the inputs and weights.
    A hill-climbing optimisation process could then be used to adjust weights and hence minimise network errors.
    The rest is history.
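
Rosenblatt's training rule for that single neuron is simple enough to sketch. A minimal illustration (learning rate, epoch count, and sample order are arbitrary): the update w += lr * (target - output) * input converges on a linearly separable task like AND, while the same loop on XOR never converges, which is the limitation Minsky & Papert formalized.

```python
# Sketch of Rosenblatt's perceptron learning rule on a linearly
# separable task (AND). Learning rate and epoch count are arbitrary.

def train_perceptron(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out       # +1, 0, or -1
            w[0] += lr * err * x1    # nudge weights toward the target
            w[1] += lr * err * x2
            b += lr * err
    return w, b

and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_data)
for (x1, x2), target in and_data:
    assert (1 if w[0] * x1 + w[1] * x2 + b > 0 else 0) == target
```

Swapping in XOR targets leaves the loop cycling forever, since no single line separates the two classes; that is what the multi-layer, continuous-output networks described above fixed.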

    • @3nertia
      @3nertia 2 years ago +3

      And now, we're "evolution" but with awareness and intent heh

    • @slatervictoroff3268
      @slatervictoroff3268 2 years ago +7

      Critically wrong. Not one-neuron - that doesn't even make sense. One *layer*.

    • @brunsky277
      @brunsky277 2 years ago +6

      @@slatervictoroff3268 I have to disagree. A perceptron is one neuron (one neuron that receives multiple inputs and puts out one output). This also makes it a one-layer network, I would say

    • @meateaw
      @meateaw 2 years ago

      @@brunsky277 Thinking about it though, the inputs all had their own weights, and those weights correspond to a neuron. A modern AI model has inputs, and the weights exist on the layers.
      Therefore the perceptron had 400 inputs, 400 weights, and 1 output signal.
      That implies to me 400 neurons, in a single layer, leading to a single output value.

    • @WilisL
      @WilisL 2 years ago +2

      @@slatervictoroff3268 No, one layer can be multiple perceptrons; it's technically one neuron (which is technically one layer, though)

  • @SIBUK
    @SIBUK 2 years ago +658

    The most interesting thing I found in this was when he was saying that in the chip they had to make it alternate between analog and digital signals to maintain coherence. It's interesting because the brain does something similar where it alternates between electric pulses and chemical signals.

    • @chrisfuller1268
      @chrisfuller1268 2 years ago +42

      The problem is machine learning is still not capable of being used commercially in general environments (e.g. security cameras) because they can't handle unpredictable situations. The brute force method of AI is still the only solution for general environments (e.g. self driving cars)

    • @riskyraccoon
      @riskyraccoon 2 years ago +62

      @@chrisfuller1268 people also suffer from the brute force nature of processing information, aka confirmation bias. Thankfully we can take steps to correct this, but many people lack the tools and mindset to make these self corrections.

    • @chrisfuller1268
      @chrisfuller1268 2 years ago +13

      @@riskyraccoon Yes, humans are flawed, but we are capable of recognizing objects no matter what else is in our field of view. This is a task machine learning will never be able to solve in 100% of all possible (infinite) environments. Brute force AI requires more development effort, but is capable of also identifying objects in many environments. This is why machine learning is a step backwards in technology and why it should never be used in life critical applications.

    • @chrisfuller1268
      @chrisfuller1268 2 years ago +2

      @Adam H Amen, I never thought of the beast as an AI! The beast will be cast into the lake of fire so I believe he will be flesh and blood human with a soul, but the 'image of the beast'!

    • @chrisfuller1268
      @chrisfuller1268 2 years ago +1

      @Adam H yes, that is a very interesting way of looking at it! I think we're a very long way from an AI being able to reason, but we've been using AI to kill people for decades.

  • @ChrisWalker-fq7kf
    @ChrisWalker-fq7kf 3 months ago +2

    That analog neural network was really interesting. But to me it's still essentially digital, i.e. discrete.
    In a normal digital solution you might have 16 possible values for the weights which would be encoded as 4 bits and would need to undergo addition/multiplication. But in the "analog" solution you encode the weights by setting one of 16 distinct voltage levels. The available voltage levels are quantised, not continuous so it's still a discrete system.
    It's great that you can do addition by just summing currents and multiplication by changing resistance. But you can even do this with binary: AND gates are multipliers and OR gates are adders if you only have 1 bit of data (1 OR 1 gives an overflow condition but the "analog" design will need enough voltage levels to avoid overflow also e.g. 7 + 13 would give the answer of 16 if this was the highest voltage level). I'd say it's still digital but it's not binary. It's multi-level logic.
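
The gate identities in the comment above are easy to check directly: for single bits, AND coincides with multiplication exactly, and OR coincides with addition everywhere except the 1 + 1 overflow case.

```python
# Checking the comment's gate identities for single bits:
# AND is exactly multiplication, and OR is addition except that
# 1 OR 1 = 1 while 1 + 1 = 2 (the overflow case).

for a in (0, 1):
    for b in (0, 1):
        assert (a & b) == a * b          # AND is a 1-bit multiplier
        if (a, b) != (1, 1):
            assert (a | b) == a + b      # OR adds, barring overflow
assert (1 | 1) == 1 and 1 + 1 == 2       # the overflow exception
```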

  • @sushaanpatel1337
    @sushaanpatel1337 a year ago +1

    He has changed the title and thumbnail of this video for the third time, and I watch it every time with the same curiosity

  • @Deveyus
    @Deveyus 2 years ago +238

    A couple missed points:
    Things like Google's Coral are also pushing incredibly high values, and to my knowledge are doing it digitally as an ASIC.
    Large models are expensive to train; there's no contention here, from Mythic, you, or the wider AI community, but several advancements have been made in the last couple of years that are letting models be compressed and refined to less than 1% of their original size. This makes them incredibly small and efficient to operate, even on traditional CPUs.
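
One common compression step along the lines the comment describes is weight quantization. A minimal sketch (weights and scaling scheme purely illustrative): float weights become 8-bit integers plus one shared scale, roughly quartering storage; the much larger reductions mentioned typically also involve pruning and distillation, which are not shown here.

```python
# Illustrative 8-bit quantization sketch: float weights become small
# integers plus a shared scale factor, roughly quartering storage.
# Pruning/distillation (not shown) is how far larger reductions
# are reached. All weight values here are made up.

def quantize(weights):
    scale = max(abs(w) for w in weights) / 127  # map the largest weight to +/-127
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    return [v * scale for v in q]

weights = [0.02, -1.27, 0.5, 0.31, -0.08]
q, scale = quantize(weights)
restored = dequantize(q, scale)

# Every quantized value fits in a signed byte, and each weight is
# recovered to within half a quantization step.
assert all(-127 <= v <= 127 for v in q)
assert all(abs(w - r) <= scale / 2 for w, r in zip(weights, restored))
```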

    • @Zeuskabob1
      @Zeuskabob1 2 years ago +11

      I'd love to read about that! I've been dipping my toes in ML algorithms and many of the really interesting networks require an immense amount of memory to function, on the order of tens of gigabytes. I'm curious why those models require such an immense amount of memory, and what can be done to improve that.

    • @siddharthagrawal8300
      @siddharthagrawal8300 2 years ago +2

      @@Zeuskabob1 You don't really need tens of gigabytes to get a good model that can perform well on a task (usually). Most people still use models of size less than 5 GB or so.

    • @flightrisk7566
      @flightrisk7566 2 years ago +7

      thanks for pointing this out 🙄 seems like it was deliberately ignored for the sake of promoting this dumb startup

    • @moonasha
      @moonasha 2 years ago +2

      just another case of Veritasium making a bait video to make experts respond

  • @TimeBucks
    @TimeBucks 2 years ago +198

    Amazing video!

  • @coleballenger4595
    @coleballenger4595 a month ago

    12:48
    They did my man Alex so wrong there lol!
    Great video as usual.

  • @grabdoel
    @grabdoel 3 months ago

    I learned more through your video than I did in engineering class :(. Thanks a lot; it opens a great perspective on new innovations where analog is combined with digital. Will dive into it.

  • @The1wsx10
    @The1wsx10 2 years ago +472

    Wow, that analog chip sounds extremely competitive. I'm surprised they already have something that good. Mad props to the guy who figured out the hack with the flash storage

    • @dorusie5
      @dorusie5 2 years ago +36

      I wonder how temperature sensitive it is.

    • @hughJ
      @hughJ 2 years ago +137

      @@dorusie5 I'm mostly curious about the write-cycles and lifespan of the flash cells. Is the network going to get Alzheimer's after a few days?

    • @SharienGaming
      @SharienGaming 2 years ago +74

      Integrated circuits like that have always been really efficient - the downside is that they are extremely specialized... as the guy said: it's not a general computation chip.
      It can literally only do matrix multiplication - but that it can do really damn efficiently (though slightly imprecisely - which is likely still good enough for neural network purposes, since they aren't interested in the exact value of the result).
      So... sure, that's competitive for that one purpose - but useless for anything that isn't that purpose.
      But if the type of calculation they can do is in high demand - they can likely sell a lot of specialized hardware, either for specific devices or as plug-in cards for computers that supply fast matrix multiplication operations.
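
The matrix multiplication such a chip specializes in can be modeled in a few lines. A toy sketch (all numbers made up): each stored weight acts as a conductance G, each input as a voltage V, and by Ohm's law plus Kirchhoff's current law the cell currents I = G·V simply sum on each output wire, with no explicit adder circuit.

```python
# Toy model of the matrix multiply an analog crossbar performs: each
# stored weight is a conductance G, each input is a voltage V, and
# Ohm's law (I = G*V) plus Kirchhoff's current law make the currents
# sum on every output wire "for free". All values are illustrative.

G = [  # conductances, one row per output wire (the weight matrix)
    [0.5, 1.0, 0.0],
    [0.25, 0.5, 1.0],
]
V = [1.0, 0.5, 2.0]  # input voltages (the activation vector)

# Currents from each cell merge on the shared output wire.
I = [sum(g * v for g, v in zip(row, V)) for row in G]

assert I == [1.0, 2.5]  # one matrix-vector product, no multiplier logic
```

The digital-to-analog and analog-to-digital conversion between layers, as described in the video, is what keeps errors from accumulating across such stages.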

    • @wouterhenderickx6293
      @wouterhenderickx6293 2 years ago +11

      I've been wondering about analog usage of SSDs for a long time. It's an oversimplification, but each cell holds a voltage which can also be interpreted as an analog signal. If we take music as an example, you could basically write the value of one sample point to a cell, writing 16 bits' worth of information to 1 NAND cell. This of course makes it impossible to compress the music, but it would allow storing music 'losslessly' at the same cell usage as a compressed 256 kb/s file on TLC storage.
      Of course, NAND reproduction isn't perfect (and as such, music reproduction wouldn't actually be lossless), but I wonder how close this would come compared to the compressed digital file. I think this could be potentially useful for offline caches and downloads from Spotify, for example, as the data can be checked and corrected when a high-speed network connection is actually available.

    • @JustNow42
      @JustNow42 2 years ago +2

      Already? We did this before 1960.

  • @carterbentley9030
    @carterbentley9030 a year ago +957

    Back in the mid-1960s my uncle, Joseph Grandine, designed a combination analog/digital computer that could optimally combine the two modes to solve complex problems in signal processing and data analysis. He called his computer the Ambilog 200. At that time, digital computing won the day. Now it sounds like he was a few generations ahead of his time.

    • @dinoschachten
      @dinoschachten a year ago +74

      Amazing. Just found two articles about it in the Internet Archive.

    • @IAreBean
      @IAreBean a year ago +20

      That is awesome

    • @LeKhang98
      @LeKhang98 a year ago +12

      That's amazing. We should show him this video and ask him what he thinks about it.

    • @stwessboi
      @stwessboi a year ago +2

      cap

    • @hamzahbalogun4220
      @hamzahbalogun4220 a year ago +4

      I would love to know him

  • @dominikhauk4638

    This has to be the most insightful and entertaining channel on YouTube

  • @aetre1988
    @aetre1988 2 years ago +515

    My dad's "Back when I was your age" stories on computing were about how he had to learn on an analog computer, which, according to him, you "had to get up and running, whirring at just the right sound - you had to listen for it - before it would give you a correct calculation. Otherwise, you'd input 100+100 and get, say, 202 for an answer." He hasn't been able to remember what make/model that computer was, but I'm curious: any old-school computer geeks out there know what he may have been talking about? The era would have been the late 60s or early 70s.

    • @GDScriptDude
      @GDScriptDude 2 years ago +85

      It sounds like your dad's computer was before the invention of the transistor. There was an analog computer with moving parts in the electronics lab at the University of Hull, UK, when I was a student there in the 80s. I remember when it became unstable and the professor sprinted across the lab to shut it down before it self-destructed. Something spinning suggests a sine wave generator, for example.

    • @sapinva
      @sapinva 2 years ago +25

      Yeah, just like analog synthesizers. You have to let them warm up to a stable temperature first or they would constantly drift out of tune while playing. This was later solved with digital controllers.

    • @murmamirrmohaimen2271
      @murmamirrmohaimen2271 2 years ago +5

      Maybe the older mechanical calculators. Linus Tech Tips did a video on those. Super interesting stuff.

    • @urlkrueger
      @urlkrueger 2 years ago +20

      I can't address your question directly, but in the latter half of the 1960s I worked on a helicopter simulator, used to train military pilots, in which all computations simulating flight were performed by analog circuits made up of transistorized (no ICs) operational amplifiers and servo motors with feedback.
      This whole machine was housed in a 40 foot long semi trailer. In the rear of the trailer was a cockpit from a CH-46 helicopter including all the controls and instruments but the windows were frosted over so you were always flying IFR in a fog, i.e. no visuals. Next as you moved forward was an operator's station where you could control parameters such as air pressure and temperature, activate failures such as engine fire or hydraulic failure and such. The remainder of the trailer contained a row of electronics racks on each side housing the amplifiers, servos and other circuits that performed all the calculations.
      We can look at main rotor speed as an example of how it worked. Rotor speed was represented by the position of a servo motor from 0 to 120 degrees. The position of the motor was determined by the output of an amplifier whose inputs were derived from many variables such as engine power (there were two), collective control position and altitude. Attached to the servo motor was a potentiometer whose output drove a cockpit instrument but was also fed back to amplifiers/servos which were used to calculate engine power and such.
      There were many such subsystems with feedback loops interconnecting them so that failures were very difficult to diagnose. Often the only way to resolve a problem was to take a guess at which part might have failed and replace it. Also routine maintenance was very labor intensive as the many potentiometers would wear and need to be cleaned and then realigned which might take an hour for each one.
      As a young man I was totally amazed and fascinated by this technology. As an old man I can't believe that it really worked at all. But it did, at least some of the time.
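
The rotor-speed loop described above can be caricatured in a few lines: a proportional "amplifier" drives the servo, and the servo's own position is fed back as the error input (gain and time step are invented for illustration, not taken from the actual simulator):

```python
# One feedback loop, in the spirit of the rotor-speed servo above.
DT = 0.001      # seconds per simulation step
GAIN = 40.0     # amplifier gain, 1/s (invented)

def run_servo(command_deg, steps=5000):
    position = 0.0                          # servo angle, degrees
    for _ in range(steps):
        error = command_deg - position      # potentiometer feedback
        position += GAIN * error * DT       # amplifier drives the motor
    return position

final = run_servo(90.0)
print(round(final, 3))   # settles at the commanded angle
```

The interesting (and hard-to-debug) part of the real machine was that many such loops fed each other, so a fault anywhere propagated everywhere.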

    • @dick7540
      @dick7540 2 years ago +4

      Back in the day, circa 1957, I was an Electrical Engineering student at the City College of New York. In one of the labs we constructed an analog computer using physical components like motors, gears, etc. There was absolutely nothing binary/digital involved except whether you passed or failed the course.
      A couple of years later I worked with a Bendix G15 computer with an optional DDA (Digital Differential Analyzer). The DDA was an analog computer: input and output were analog. You can look it up on Google - search for "Bendix G15 computer with dda".

  • @joesterling4299
    @joesterling4299 2 years ago +109

    The biggest issue is distortion. Inexact calculations due to imperfect components, degradation of the data when transmitted (wired or wireless), external EM interference, all conspire to make the use of analog a special challenge. Mixing digital and analog to play to the strengths of each along the way intrigues me. I'm old enough to have experienced the full evolution of digital computing. My mindset is therefore quite biased toward it. What you propose would be quite the eye opener for me, if it actually can be made to work as prolifically as current digital technology.

    • @WilcoVerhoef
      @WilcoVerhoef 2 years ago +17

      I assume there's a lot to be discovered on the topic of self-correcting algorithms, or even error-correcting analog circuits that compensate partially for the inaccuracies. Like what Hamming codes are for digitally transmitted data.
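
For reference, the digital scheme named above - a Hamming(7,4) code - corrects any single flipped bit in a 7-bit word; a minimal sketch (there is no direct equivalent for continuous analog values, which is the open problem the comment points at):

```python
import numpy as np

# Hamming(7,4): codeword = [d1 d2 d3 d4 p1 p2 p3].
G = np.array([[1,0,0,0,1,1,0],    # generator matrix
              [0,1,0,0,1,0,1],
              [0,0,1,0,0,1,1],
              [0,0,0,1,1,1,1]])
H = np.array([[1,1,0,1,1,0,0],    # parity-check matrix
              [1,0,1,1,0,1,0],
              [0,1,1,1,0,0,1]])

def encode(data):
    return (np.array(data) @ G) % 2

def correct(word):
    syndrome = (H @ word) % 2
    if syndrome.any():
        # the syndrome equals the column of H at the flipped position
        pos = next(j for j in range(7)
                   if np.array_equal(H[:, j], syndrome))
        word = word.copy()
        word[pos] ^= 1
    return word

code = encode([1, 0, 1, 1])
code[2] ^= 1                 # flip one bit in "transmission"
fixed = correct(code)
print(fixed[:4])             # original data bits recovered
```

The correction works because every single-bit error produces a distinct nonzero syndrome; noise on an analog value has no such discrete signature, which is why analog error correction is a research topic rather than a textbook recipe.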

    • @slippio
      @slippio 2 years ago +1

      nature exists in chaos, technology is more and more approaching the chaos orchestra.

    • @StevenSiew2
      @StevenSiew2 2 years ago +6

      Distortion, really? I am under the impression that the biggest problem with analog computers is NOISE. You can never get rid of noise in an electrical system. Even if the hardware has no distortion, the inherent thermal noise in the system will cause some small calculation error.

    • @leftaroundabout
      @leftaroundabout 2 years ago +7

      @@StevenSiew2 That's true, but noise is something that AI needs to deal with anyway, because the inputs will always be noisy to begin with. It can actually be useful to _add artificial noise_ while training a digital NN, to avoid overfitting issues. (Stochastic gradient descent can also be seen as a way of making the training "noisy".) As long as the perturbations are small and random, training won't be affected negatively.
      Distortions, however, are hard to deal with. You may be able to train a model on a particular chip that has such and such distortion; but because the distortion properties don't fluctuate and act as constant-but-unknown biases, the weights will ruthlessly overfit to that particular chip, and then it probably won't work at all on another copy.
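
A minimal sketch of the random-noise half of that argument, on an invented toy regression: injecting fresh input noise at every training step barely moves the learned weights, because the noise averages out rather than biasing every step the same way:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented toy problem: recover a known linear map from noisy inputs.
true_w = np.array([2.0, -3.0])
X = rng.normal(size=(500, 2))
y = X @ true_w

w = np.zeros(2)
for _ in range(2000):
    Xn = X + rng.normal(0.0, 0.05, size=X.shape)  # fresh noise each step
    grad = Xn.T @ (Xn @ w - y) / len(X)
    w -= 0.1 * grad

print(np.round(w, 2))   # lands close to the true weights [2, -3]
```

A fixed distortion (the same offset applied to X on every step) would instead shift the converged weights by a constant bias - the overfit-to-one-chip effect described above.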

    • @Opsse
      @Opsse 2 years ago +1

      As a PhD student in this field, I can answer some of your questions.
      Yes, we usually talk more about noise than distortion. And thermal noise is not the only issue: there is read and write variability, resistance drift over time, the resistance of interconnections, ...
      However, it is true that neural networks can sometimes take advantage of noise to avoid overfitting, but only a reasonable amount of noise and only in some cases.
      Self-correcting algorithms and error correction are options, but it's not that easy. Usually, these methods sacrifice performance or require more energy (which is the opposite of what we want).
      About mixing digital and analog: they presented it nicely in the video, but the digital/analog converters require a lot of energy (sometimes more than the vector-matrix multiplication itself). So we don't want to do it too often.

  • @emmateedub9672
    @emmateedub9672 a year ago

    An interesting video covering some of the beginnings of AI, how computers work, and also environmental considerations. I would like to find out more about Rosenblatt; however, I was expecting something more along the lines of mechanical computers. Good information, good video, thanks!

  • @photorealm
    @photorealm 7 months ago +3

    When I started thinking about artificial neural nets, I just assumed they would really only happen on specialized analog computers in the future. Then google and others along with more powerful digital computers made it work pretty darn great.
    I love being in this time of history, watching so much science fiction slowly become reality.

  • @steveipsen6293
    @steveipsen6293 2 years ago +239

    One of my first "computer" classes in engineering school was learning to wire up an analog computer and solve differential equations. Because I had to "assemble" the hardware for the process, it felt much more hands-on than when I took a deck of punch cards to the little window and waited for up to 20 minutes for the compiler to tell me I had no idea how Fortran worked. At the time, I really appreciated that parameters on the analog could be changed quickly in order to see how different currents, voltages, resistances, etc. affected the outcome. Of course, now with the speed of digital processors, the efficiency of Python libraries, and the Interwebs, I have largely come to appreciate the digital world. Now, Derek has got me jazzed to buy a portable analog. $200 on eBay?
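
The kind of problem patched on such a machine - say a damped oscillator built from two integrators and a summing amplifier - can be mimicked step by step in software (parameter values invented just to show the decaying oscillation):

```python
import math

# Damped harmonic oscillator  x'' = -2*zeta*w*x' - w^2 * x,
# stepped the way an analog computer is patched: a summing
# amplifier feeding two integrators in a loop.
W, ZETA, DT = 2.0 * math.pi, 0.1, 1e-4

x, v = 1.0, 0.0              # initial conditions set on the integrators
for _ in range(20000):       # 2 "seconds" of machine time
    a = -2 * ZETA * W * v - W**2 * x   # summing amplifier
    v += a * DT                        # first integrator
    x += v * DT                        # second integrator

print(round(x, 4))   # amplitude has decayed from 1.0 but not died out
```

Changing a "parameter knob" here is just editing `ZETA` or `W` and re-running - the digital echo of what made the analog machine feel so interactive.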

    • @neeneko
      @neeneko 2 years ago +2

      Yeah, my computer classes in engineering school had a similar thing, though with us it was op-amps. It was not a full class, but we did it around the same time as learning FPGAs and having to implement complex programmable digital logic, so it was a good reminder that 'digital logic with an ADC/DAC pair is not always the best or simplest solution'.

    • @swapode
      @swapode 2 years ago +2

      While it's absolutely not the same thing, I encourage newish programmers to write a 6502 emulator. It's about as close as one can realistically get to building your own CPU hands on, which IMHO gives a worthwhile different perspective to the field than the now common approach to never leave the comfort of interpreters and virtual machines.

  • @aidanl.9946
    @aidanl.9946 2 years ago +251

    I've always mused about this to myself; I always thought, 'Why not use analogue to calculate certain things?' There's lots of stuff in physics that's extremely hard to calculate but just 'happens' in the real world in an efficient way. The surface of a bubble, for instance, minimises surface area very rapidly in a way that takes no effort on the bubble's part, but is incredibly hard for a digital PC to calculate. The tricky part (and the reason people doing this are smart scientists/engineers and I'm not) is figuring out how to wrangle "the bubble" into a portable and responsive piece of hardware, and it's super cool to see efforts made in this direction are having success.

    • @jimmysyar889
      @jimmysyar889 2 years ago +19

      Same thought. I used this technique to figure out a way to solve mazes super efficiently with flowing water. I think that’s what’s happening with quantum computers also.
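
A digital caricature of the water trick: a breadth-first flood fill whose frontier spreads one cell per step the way a water front would (maze layout invented for illustration):

```python
from collections import deque

MAZE = ["#########",
        "#S#     #",
        "# # ### #",
        "#   #   #",
        "### # #E#",
        "#########"]   # '#' wall, 'S' inlet, 'E' outlet

def flood(maze):
    """Return the number of steps the 'water front' takes from S to E."""
    grid = [list(row) for row in maze]
    start = next((r, c) for r, row in enumerate(grid)
                 for c, ch in enumerate(row) if ch == "S")
    frontier, dist = deque([start]), {start: 0}
    while frontier:
        r, c = frontier.popleft()
        if grid[r][c] == "E":
            return dist[(r, c)]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (r + dr, c + dc)
            if grid[nxt[0]][nxt[1]] != "#" and nxt not in dist:
                dist[nxt] = dist[(r, c)] + 1   # front advances one cell
                frontier.append(nxt)
    return None

print(flood(MAZE))
```

Real flowing water does all of this "for free" and in parallel - the digital version has to visit cells one at a time, which is exactly the contrast the comment is making.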

    • @yoshienverde
      @yoshienverde 2 years ago +20

      It always comes back to the drawbacks Derek mentions at the beginning of the video: analog processing is single-purpose, error-prone, and hard to repeat.
      As such, for your physics example, it would invalidate A LOT of the data you get back, since you cannot guarantee a certain level of falsifiability, auditability, and error margins.
      You CAN get there, but you start requiring A LOT of boilerplate circuitry around the actual solution-solving hardware. As a silly and basic example that is almost trivial nowadays, but still there, think of the necessity of adding a lot of surge protection and current stabilization to a circuit to ensure that the natural unsteadiness of current in the power grid won't skew your results.
      And that's just taking into account discrete and "simple" issues to calculate. Imagine processing data for some chaos-related physics theory and basically getting pure rubbish at the end, because even the slightest microvolt-level disturbance automatically distorts everything.
      How about external interference? Or electromagnetic interference between the actual wires in the circuit?
      As I said, not impossible to tackle, but you suddenly have an overhead of 90% boilerplate just to make the results useful for anything practical.
      I can't even imagine all the engineering that must have gone into those Mythic semi-analog chips for AI, just to keep everything tidy. The fact that a Realtek-sized chip can give you one third the performance of some Nvidia Quadro (or similar) card, for a fourth of the power consumption of a cheap entry-level mobile Core i3, is just astounding!

    • @yoshienverde
      @yoshienverde 2 years ago +10

      To be clear, these Mythic chips point towards a future resurgence of analog processors not dissimilar to what digital ones brought in with their unparalleled versatility.
      Outside of very bespoke chips for very high amounts of money, probably in the realms of high-end research and science, I can see a general idea of modularity at a functional level. Say, you manufacture analog chips that can do some very important but expensive math calculations that are common for most science in some specific branch (say, a lot of transformations, or integration, maybe some Lorentzians, and so on). Then, at research groups, institutes, and universities, they do the same as electronic engineers do with good old breadboards: DIY some complex formulae on the fly, test their hypotheses, and iterate over the formulae as needed.
      Imagine those astrophysicists doing 2k-term polynomials being able to duct-tape a dozen chips together, the same way electronic engineers use logic gates as basic digital units, and getting the results out in a couple of hours, instead of having to write a piece of software that will take a couple of days to run and a week to write, where any mistake or failed result requires another week of debugging just to make sure it failed because you were wrong, and not because you input a 5 where a 6 should have gone when writing all 1500 terms of one of the formulae.

    • @squeakybunny2776
      @squeakybunny2776 2 years ago

      Yes I've always thought this too.
      Aside from the negatives mentioned in the vid and comment above:
      "if you can't calculate it, let nature do it"
      I've used the term 'calculate' here, but I think it applies in a broader sense. If something is too hard to manufacture / produce precisely maybe nature can do it better.

    • @DrVonJay
      @DrVonJay 2 years ago

      @@yoshienverde wish I understood what you were saying but great rebuttal

  • @wattafakka4186
    @wattafakka4186 a month ago

    great video, I always wondered about neural networks. Now I got it!!👍👍

  • @timobakenecker7314
    @timobakenecker7314 10 months ago

    This video really added new aspects to my knowledge of AI. Thanks for that!

  • @TheWhatnever
    @TheWhatnever 2 years ago +566

    This is missing any mention of the other big alternative: photonics. Startups like Lightmatter have shown that this is another very potent alternative. And I believe its benefits - not being limited by electronic bandwidth/losses, and the ability to use one circuit to run the same calculation multiple times at once by using multiple colors/wavelengths - are just astonishing. It was also left out that a big problem of these systems is the bottleneck in the conversion from general compute to these analog domains.

    • @Xenko007
      @Xenko007 2 years ago +30

      Hopefully he covers this topic in the future

    • @perc-ai
      @perc-ai 2 years ago +10

      how are u so smart

    • @KWifler
      @KWifler 2 years ago +34

      Probably because it is also an emerging system. But also because photons are used like electrons as the actor, a new actor, while the video is explaining two fundamentally different ways to act.

    • @ChristopherCricketWallace
      @ChristopherCricketWallace 2 years ago +11

      I was waiting for him to get to photonics, too. It's a HUGE opportunity for crazy amounts of parallel processing. And then there's the quantum computing white whale, too...

    • @blueredbrick
      @blueredbrick 2 years ago +1

      I want my positronic brain patch

  • @masterbulgokov
    @masterbulgokov 2 years ago +128

    "Better suited" is the key. Quantum computing will fall into the same clause: there are some things quantum computing is "better suited" for.

    • @BreaksFast
      @BreaksFast 2 years ago +7

      Quantum computers (ones that use physical qubits) are only hypothetical, but people talk as if they already exist in reality. They don't: there is not a single, fully functional quantum computer on the planet, and there might never be.

    • @ninjafruitchilled
      @ninjafruitchilled 2 years ago +7

      @@BreaksFast Sure they exist, they just don't have very many qubits.

    • @RyanGrissett
      @RyanGrissett 2 years ago +5

      @@BreaksFast The computers do exist, but there is a lack of understanding in programming them to do classical computing problems.

    • @scyfrix
      @scyfrix 2 years ago +3

      @@BreaksFast They can and do exist, albeit with very limited qubit counts. The first experimental demonstration of one was in 1998. D-Wave Systems are selling computers with 2048+ qubits right now.

    • @jamesx9881
      @jamesx9881 2 years ago

      @@BreaksFast Tell that to IBM?

  • @michaelperry9180
    @michaelperry9180 3 months ago

    Funnily enough, this video series helped me understand a bit better how analog music production works. "Modular setups" look a lot like the computer you used to model the Lorenz System.
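
For anyone curious, the Lorenz system demonstrated on the analog computer in the video can also be stepped digitally in a few lines (classic parameter values; step size chosen arbitrarily):

```python
# Lorenz system: the attractor patched on THE ANALOG THING, stepped
# with plain forward Euler instead of op-amp integrators.
SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0
DT = 0.001

x, y, z = 1.0, 1.0, 1.0
for _ in range(50000):
    dx = SIGMA * (y - x)
    dy = x * (RHO - z) - y
    dz = x * y - BETA * z
    x, y, z = x + dx * DT, y + dy * DT, z + dz * DT

# The trajectory wanders chaotically but stays on the bounded
# butterfly-shaped attractor.
print(round(x, 2), round(y, 2), round(z, 2))
```

Each line of the derivative computation corresponds to one summing/multiplying patch on the analog panel, which is why modular synth racks look so familiar next to it.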

  • @yourright4510
    @yourright4510 10 months ago +1

    While it may be true that we are reaching a limit, we're not quite certain what computational power new neural networks will need for future applications - which hints at analog computation coming to the forefront.

  • @StratEdgyProductions
    @StratEdgyProductions 2 years ago +908

    This was a banger of an episode. I was enraptured the entire time. Tight storytelling with a great hook and title. You're a pro, man.

    • @Strawberry_ZA
      @Strawberry_ZA 2 years ago

      Fancy seeing you here ❤️

    • @oDxrk
      @oDxrk 2 years ago +1

      hm

    • @trec_log
      @trec_log 2 years ago +3

      hook, line and thinker

    • @memyselfandi6364
      @memyselfandi6364 2 years ago

      Damn Canadians keep blowing my mind.
      TELL THEM TO STOP IT!

  • @pavanagrawal6397
    @pavanagrawal6397 2 years ago +200

    Fantastic video, and I learnt a lot, being a biologist. Small correction: neurons (the real ones) are indeed analog in the sense that they can tweak their output and fire, fire more, or fire less, just like an analog computer. This happens through a combination of changes in neurotransmitters, their release at the synapses, and neuropeptides that can change the 'gain' of the neural networks.

    • @michaelmeichtry316
      @michaelmeichtry316 2 years ago +17

      Exactly! The analog behavior of neurons is closely modeled by the analog current/voltage exhibited by the tweaked transistor cells, as so well demonstrated and visualized in the video.

    • @kalliste23
      @kalliste23 2 years ago +13

      Neurons have a lot going on inside, and things are happening outside, that affect what they do and when they do it. It amazes me that computer neural networks work at all, let alone as well as they do.

    • @vyor8837
      @vyor8837 2 years ago +1

      Ya, so take what he's wrong about in the field you know and apply it to the field I know(comp sci) and suddenly the entire video is a load of rubbish.

    • @grumpystiltskin
      @grumpystiltskin a year ago +4

      @@kalliste23 Don't get me started about the neurons in a squid vs a human... they have fewer, bigger and more complex neurons.

    • @blucat4
      @blucat4 a year ago +1

      @@vyor8837 Not a load of rubbish, just amazingly primitive compared to what it's trying to mimic. And also use-specific. And not really capable of learning new kinds of tasks. ;-)

  • @gg-qj3gc
    @gg-qj3gc a year ago +1

    For anyone considering buying the "The Analog Thing" Computer. The site says "offering our low, not-for-profit unit price". Well, they increased the price from 299€ to 499€ in late 2022.

  • @metimulugeta8062

    I was thinking that the hardware is not keeping pace with the software advancement that is taking place, and you just gave the right amount of knowledge to see clearly through the fog.

  • @siemensmolders4131
    @siemensmolders4131 2 years ago +538

    Interesting video, but it felt a little too hyped up for me ^^
    The discussed challenge appears to be a highly specific application: matrix multiplication. The solution shown here was an analog ASIC (application-specific integrated circuit), a type of chip we've been making for over half a century. Once a task becomes both computationally expensive and very specific, the fastest method has always been to make a specific chip for it. Nor is analog multiplication anything new; I remember being taught the little analog multiplier circuit with the Gilbert cell over a decade ago.

    • @matteod2567
      @matteod2567 2 years ago +89

      most of his videos are like this lol

    • @aceman0000099
      @aceman0000099 2 years ago +17

      I believe Derek found a little niche to focus on since he did the video on the ancient Greek analogue computer, which had an almost identical conclusion

    • @ejpmooB
      @ejpmooB 2 years ago +5

      I feel he is on to something here ... maybe the real benefit is that you don't have to make all these specific chips, because in principle one fairly big analog one could do everything you threw at it. But it feels a bit scary to me too, because you are getting closer to biological systems.

    • @danielraymond3045
      @danielraymond3045 2 years ago +27

      Yeah, the reduction in power consumption I'd imagine is mostly due to it being an ASIC, not being analog. There are quite a few digital AI inference ASICs coming onto the market as well - I'm curious to see which ones will reign supreme

  • @KarthiSrinivasan
    @KarthiSrinivasan 2 years ago +83

    There's an entire field of research called neuromorphic computing/engineering looking into this very problem. It was pioneered by Carver Mead in the 90s and has seen a lot of interest lately.

    • @lxschwalb
      @lxschwalb 2 years ago +8

      I was waiting for him to either mention the words "neuromorphic" or "memristors"

    • @jecelassumpcaojr890
      @jecelassumpcaojr890 2 years ago +1

      I remember reading about Mead's analog stuff in the 1980s, something related to hearing. Perhaps my memory is wrong.

  • @pbinnj3250
    @pbinnj3250 5 months ago

    I cannot express all of my appreciation for this video. I understood it and I gained an enormous amount from it. If I sound unduly excited, it’s because I thought this stuff was beyond me. Thank you.

  • @rule1dontgosplat
    @rule1dontgosplat 23 days ago +1

    Holy crap… I remember seeing the ALVINN van somewhere in the 1980s. Not sure if it was on PBS or something like that. That’s hilarious.

  • @Crowald
    @Crowald a year ago +86

    So, this was Harold Finch's solution in Person of Interest. His ability to create an autonomous, observant AI to identify dangerous behavior was the result of Rosenblatt's approach, and he did it 15 or 20 years before anyone else would even attempt to do so.
    Missed an opportunity to mention him in PoI. Von Neumann was the mathematics, Turing is the father of modern computing, but Rosenblatt was a maverick on the nature of neural networks.

    • @johndawson6057
      @johndawson6057 1 year ago +4

      Oh my god, thank you for bringing this up. Ever since I watched that show I have been set on learning everything and anything about AI. It inspired me and set me on my current course in Computer Science.

  • @scottmarquardt8770
    @scottmarquardt8770 2 years ago +46

    Yeah, the old Navy fire control systems - along with the directional aspects of sonar/radar - were analog from beginning to end, and the math required to come up with a fire-control solution stabilized in 3D on a moving ship was intrinsic to the hardware. It didn't compute as we think of it today - the problem and the solution were just a single feedback loop.
    I remember early in my training when I grasped this; it seemed like magic. Completely steeped in digital computation in my current work, it still seems more magical.

  • @stargaming1635
    @stargaming1635 28 days ago

    You're an absolute legend!
    I love your videos ❤

  • @mikegiles1821
    @mikegiles1821 1 year ago

    Very informative. Thanks for posting!

  • @dekev7503
    @dekev7503 2 years ago +104

    This just goes to show that no knowledge is useless. In the final year of my undergraduate degree (Electrical Engineering) I took a course on analog computers, and the general consensus was that the field was obsolete. That year was the last time the course was taught, as it was phased out of the new curriculum.

  • @louis-patrickrancourt1565

    Funny, I always thought of Asimov's positronic robots as analog computing, but as a programmer it was difficult to understand how to work with analog instead of binary. This video makes a lot of sense: I can see how the combination of voltage and frequency can influence the result, and how the combinatory power of multiple inputs, each weighted differently, can determine the final result. I only understand a glimpse of it, I know, but my imagination and this video let me see how it can relate to our brain's neural network. Amazing!!

    • @aoeu256
      @aoeu256 1 year ago +13

      Analog has some sort of an error factor, but that error factor can be used for good in terms of evolutionary algorithms.

    • @mtgatutorials368
      @mtgatutorials368 1 year ago +6

      The world is NOT Digital aka quantum, but it is Analog. These machines will prove this fact and change how we come to see reality.

    • @RAF-777
      @RAF-777 1 year ago +6

      @@mtgatutorials368 I am not 100% sure, but quantum computing seems a bit like an analogue of the analog computer - similar principles, but at a much smaller scale. An analog computer uses thousands to millions of atoms to represent a value, while a quantum one operates on single atoms, electrons, photons and other particles. Quantum computers perhaps also use entanglement, which I think no one quite understands, so it gets explained with nonsense ideas like faster-than-light communication. People use the paradoxes to argue for such a possibility, but that would still break the basic law of physics: nothing is ever faster than light. But I am not sure 😃

    • @mtgatutorials368
      @mtgatutorials368 1 year ago

      @@RAF-777 oh, I can explain it really easy. 4 spatial dimensions, in an expanding HyperTorus and HyperSphere

    • @cdreid9999
      @cdreid9999 1 year ago +2

      @@aoeu256 No, analog doesn't. He was describing our implementations of analog. There is no inherent theoretical error rate in analog systems - they are in fact theoretically perfectly accurate, while digital can't be. An analog circuit could theoretically carry the precise value of pi; a digital one can't.

  • @gmeast
    @gmeast 10 months ago

    25 years ago I designed and built an analog computer using a handful of summing and differencing amps, resistor arrays, log and anti-log amps, and more. These components were interconnected by a whole bunch of addressable cross-point/cross-bar switches and buffers. An array of inverting and non-inverting buffers served as analog inputs and variables. A digital word was shifted onto the switches from a PC. You could "build" any math equation. It was eerie seeing a real-time answer emerge as variables and data were being input. Because op amps were a major part of the architecture, speed was limited by the slew rates of the amps.
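
The building blocks listed above map onto simple math: summing amps add voltages, and log plus anti-log amps turn multiplication into addition via log(a) + log(b) = log(ab). A minimal idealized sketch (function names are illustrative, not from the actual hardware; real op-amps add offsets and slew-rate limits):

```python
import math

# Ideal models of the analog building blocks described above.
def log_amp(v):
    # A log amp outputs the logarithm of its input voltage (v > 0).
    return math.log(v)

def summing_amp(*voltages):
    # A summing amp adds its input voltages.
    return sum(voltages)

def antilog_amp(v):
    # An anti-log (exponential) amp inverts the log amp.
    return math.exp(v)

def analog_multiply(a, b):
    # log(a) + log(b) = log(a*b), so exponentiating the sum multiplies.
    return antilog_amp(summing_amp(log_amp(a), log_amp(b)))

product = analog_multiply(3.0, 4.0)  # ideally 12.0
```

Chaining these blocks with switches is what lets such a machine "build" an equation in hardware.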

  • @ProfRvS
    @ProfRvS 12 days ago

    Thanks for this video (and many others you are making) - I have become a big fan of your channel! However, watching this video in connection with your clickbait video got me thinking. I will be recommending this video to my students because of the parts on AI, ANNs and ImageNet, yet I never would have guessed those excellent sections were here from the title (clickbait ...). I only saw them by chance, since I watched the video out of general interest. This led me to ask myself whether it wouldn't make sense to have something like alternate titles for different target groups - something KRplus could spend a bit of AI research on, maybe?
    Other than that: keep up the good work!

  • @adamkallaev3573
    @adamkallaev3573 1 year ago +820

    If it makes my graphics card cheaper, I'm all for it

    • @hridayawesomeheart9477
      @hridayawesomeheart9477 1 year ago +57

      Finally, a fellow PCMR member

    • @cdreid9999
      @cdreid9999 1 year ago +41

      you dreamer you

    • @jerycaryy4342
      @jerycaryy4342 1 year ago +68

      @@hridayawesomeheart9477 finally, an average redditor

    • @BlueDrew10
      @BlueDrew10 1 year ago +43

      It sounds like it could make GPUs more power efficient. GPUs are starting to use AI to make certain computations more accurate, so maybe an analog chip on our GPUs could handle that instead.

    • @notisike3553
      @notisike3553 1 year ago +11

      @@BlueDrew10 I agree, but the first major bottleneck is, like he said in the video, the massive power requirement to train each AI - each needing three households' combined annual energy usage - so mass production seems inefficient.

  • @carstenpxi
    @carstenpxi 2 years ago +176

    Analog computers are actually a hardwired set of circuits "programmed" for a particular task. They excel at massive parallelism and true real-time performance. In addition to analog circuits built from transistors or tubes, optical devices such as prisms (or rainbows) do real-time spectrum analysis at light frequencies, and have real-time color displays. To duplicate the performance of an optical prism at those frequencies using digital circuitry would require massive arrays of digital hardware multiplier/accumulators. I did the calculation once in the mid 1990s, and at that time it would have required about 600 MW of power. Early spectrum analysers developed for military applications took audio or radio waves, upconverted them to light, and used a prism to do the spectrum analysis.
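
For a sense of what the prism does "for free": done digitally, a spectrum is a discrete Fourier transform, and the naive form costs N² complex multiply-accumulates, which is where those massive multiplier/accumulator arrays come from. A small illustrative sketch (not the military hardware described above):

```python
import cmath

def naive_dft(samples):
    # O(N^2) multiply-accumulates: one complex multiply-add per (k, t) pair.
    n = len(samples)
    return [
        sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
        for k in range(n)
    ]

# A 4-sample cosine: its energy shows up in bins 1 and 3.
spectrum = naive_dft([1.0, 0.0, -1.0, 0.0])
```

Even with the FFT's O(N log N) trick, keeping up with optical frequencies in real time is a staggering digital workload.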

    • @LeTtRrZ
      @LeTtRrZ 2 years ago +6

      If need be, could analog computers be made to go digital temporarily? If so, it would mean that they can perform accurately for a time and then go back to analog for complexity.

    • @srpenguinbr
      @srpenguinbr 2 years ago +2

      @@LeTtRrZ I don't think so, they are so fundamentally different it would be hard to integrate them. It would be easier to have 2 circuits

    • @andreafedeli3856
      @andreafedeli3856 2 years ago +4

      @@LeTtRrZ There are tons of studies on reconfigurable architectures, and the theory of how good a digital computer can be at representing analog behaviors (so, I reckon, the opposite of what you were conjecturing) is well known, as are the implied constraints. The open question is which set of building blocks should make up the reconfigurable fraction of the architecture, and in what abundance. As soon as you fix the number of available components and the maximum degree of connection reconfigurability, you set a limit on what you can represent in a given amount of time. As a passage in the video suggests, there are studies on architectures with digital boundaries between analog slices, but the possibility of error correction is very often - I'd dare say always - a consequence of knowing what you want to represent, simply because if you don't know what you're representing, you cannot tell whether you're doing it right or wrong. At best you may exploit some underlying characteristics of the representational space: for example, if you know your values should fall on a grid, you may correct an analog result by choosing the nearest grid element, which in a sense re-digitizes the result. But knowing what you're representing constrains the freedom of the intermediate representations...

    • @LeTtRrZ
      @LeTtRrZ 2 years ago +1

      @@andreafedeli3856 Why not just allow the computer to ration between digital and analog based on the demand of the task it’s attempting?

    • @smithsmithington
      @smithsmithington 2 years ago +2

      @@LeTtRrZ He says that in the video. It's exactly what they do. @ 18:56

  • @mdzaid5925
    @mdzaid5925 9 months ago +23

    I feel that analog will make a very strong comeback, but only in specialized applications. For general-purpose computing, digital will retain its dominance.

    • @GGSHeadoR
      @GGSHeadoR 6 months ago +8

      Congratulations on repeating what you just heard in the video.

    • @GrassXMagnum
      @GrassXMagnum 6 months ago +1

      @@GGSHeadoR Considering most people comment before even finishing the video, there's a chance they didn't actually listen to that part 😅

    • @schrodingerscat1863
      @schrodingerscat1863 6 months ago

      It never went away; it just became easier to model simple stuff purely in the digital domain. Some operations were always easier to model with analogue components, sampling and displaying the results with digital computers.

  • @nixtoshi
    @nixtoshi 3 months ago +1

    Can someone explain to me why the result of the multiplication of the matrices is so high at minute 17:22?
    Shouldn't the result be 8.1, 7.9 and 9.8 instead of 341, 473, and 291?
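
Without the exact on-screen values it isn't possible to recheck the video's arithmetic here, but note that each output of a matrix-vector multiply is a dot product - a sum of many individual products - so results in the hundreds fall out naturally once the entries are larger than 1. A sketch with made-up numbers (not the values from the video):

```python
# Each output element is a dot product: a sum of weight * input terms.
# With entries around 10, even a 3-term dot product lands in the hundreds.
matrix = [
    [9, 12, 4],
    [15, 8, 11],
    [6, 10, 7],
]
vector = [10, 11, 9]

result = [sum(w * x for w, x in zip(row, vector)) for row in matrix]
# result == [258, 337, 233]
```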

  • @WahPony
    @WahPony 2 years ago +103

    Processors have always been getting new modules slapped on to cover different types of problems. The GPU was added to do repetitive operations; receivers of radio signals usually have an analog demodulator to make signal processing on them practical; quantum computers are right around the corner, where 12 or so qubits could be connected to your processor in a separate box via Thunderbolt to perform verification and encryption tasks; GPUs now have internal tensor cores to perform the operations you discussed (and even more general ones) at lower bit depth; and even operations like "multiply then add" have separate modules inside the processor that compute more efficiently than doing the multiply and the add as separate operations.
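
The "multiply then add" modules mentioned above compute a*b + c in one step, rounding once instead of twice. A rough illustrative model (not any particular processor's FMA unit), emulating the single rounding with exact rational arithmetic:

```python
from fractions import Fraction

def fma_like(a, b, c):
    # Emulate a fused multiply-add: compute a*b + c exactly,
    # then round only once when converting back to float.
    return float(Fraction(a) * Fraction(b) + Fraction(c))

def mul_then_add(a, b, c):
    # Two separate operations, hence two separate rounding steps.
    return a * b + c

fused = fma_like(0.1, 0.2, 0.3)
separate = mul_then_add(0.1, 0.2, 0.3)
```

Besides the accuracy benefit, a fused unit finishes the pair of operations in a single pass through the hardware, which is why dot products and neural-network layers lean on it so heavily.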

    • @joefish6091
      @joefish6091 2 years ago +4

      Speaking of radio receivers, think SDR: software-defined radio, a revolution in radio.

    • @chychywoohoo
      @chychywoohoo 2 years ago +2

      Quantum computers are not right around the corner

    • @Kyrator88
      @Kyrator88 2 years ago +1

      Quantum computers are not gonna be beside your computers unless you have a massive shed with space for a state of the art helium/nitrogen cooling system on hand

    • @SimonBuchanNz
      @SimonBuchanNz 2 years ago +1

      @@Kyrator88 I'm not completely discounting the possibility of solid state room temperature qubits, but yeah, excitement about personal quantum computers is pretty silly.

    • @TheDXPower
      @TheDXPower 2 years ago

      A quantum computer is not required for performing quantum-safe encryption/decryption. NIST is very close to standardizing one of many candidates that provides this functionality.

  • @matthewsylvester9103
    @matthewsylvester9103 2 years ago +234

    Hey Veritasium, you should really look into the hardware used specifically to simulate neurons like those in the human brain. There are chips built at a physical level to simulate individual neurons, and as of right now consumers cannot buy them. It would be so cool if you could get an in-depth look at these; information about them is kind of difficult to find. Artificial intelligence in the traditional sense has neurons that do not even come close to how they work in real life - they should be called artificial function generators. Generating consciousness with things like TensorFlow is just very unlikely. Most have no memory, and when they do it's simplified, and recursion is also simplified compared to real-life neurons. Also, I think it would be really cool if you dove into the technicals of artificial consciousness.

    • @MGHOoL5
      @MGHOoL5 2 years ago +6

      Do you have anything about them like their name or resources to look into?

    • @skierpage
      @skierpage 2 years ago +26

      @@MGHOoL5 DuckDuckGo for "neuromorphic computing"; University of Manchester's SpiNNaker chip, Intel's Loihi chip, something out of Stanford, are actual implementations.
      No one knows if General Intelligence and "consciousness" require the detailed neurobiology of the human brain, or whether it only requires a few more breakthroughs in conventional AI like backpropagation plus a few more orders of magnitude increase in the size of digital neural networks.

    • @tylerevans1700
      @tylerevans1700 2 years ago +13

      @@skierpage good call on the recommendation to search with something other than Google, won't find anything there...

    • @MGHOoL5
      @MGHOoL5 2 years ago +7

      @@tylerevans1700 Wait really? Is that a thing where you would go to another search engine like DuckDuckGo to find something 'more relevant' than you would find in google?

    • @skierpage
      @skierpage 2 years ago +10

      @@MGHOoL5 An ad blocker is essential to filter out paid search results in Google. However, Google still prioritizes search results aligned with its business interests: its own services, sites that serve tons of ads using Google's ad tech, $^&#@! 8-minute videos on KRplus instead of simple text explanations, showing sites with ads instead of providing the answer right in the search results, etc.
      DuckDuckGo does less of this, but its search algorithm isn't as smart at understanding what you're looking for. I use Firefox, which lets you easily choose which search engine to use from the location field.

  • @laikavoid3364
    @laikavoid3364 10 months ago

    Such an amazing video! Great work!

  • @ghattassaliba4910
    @ghattassaliba4910 1 month ago

    Thanks and keep making such content please❤

  • @sternis1
    @sternis1 2 years ago +173

    I remember a friend of mine once made a completely analog line-follower robot. He implemented a PID controller using op-amps, trimming the parameters with 3 variable resistors. It actually worked quite well!
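
A discrete-time version of that PID controller fits in a few lines; each gain below plays the role of one trim pot. The gains and the toy "plant" are made up for illustration, not taken from the robot described above:

```python
def make_pid(kp, ki, kd, dt):
    # Returns a stateful controller: call it with the current error,
    # get back the control output.
    state = {"integral": 0.0, "prev_error": 0.0}

    def step(error):
        state["integral"] += error * dt
        derivative = (error - state["prev_error"]) / dt
        state["prev_error"] = error
        return kp * error + ki * state["integral"] + kd * derivative

    return step

pid = make_pid(kp=2.0, ki=0.1, kd=0.1, dt=0.01)

# Toy plant: the robot's offset from the line shrinks in proportion
# to the control output; the controller drives it toward zero.
offset = 1.0
for _ in range(200):
    offset -= 0.01 * pid(offset)
```

The op-amp version does the same integral and derivative continuously with capacitors, which is exactly the analog-computing point of the video.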

    • @gamedominatorxennongdm7956
      @gamedominatorxennongdm7956 2 years ago +4

      That's pretty creative and clever of him.

    • @allowambeBOWWAMB
      @allowambeBOWWAMB 2 years ago +3

      opamps are wonderful

    • @omniverideus
      @omniverideus 2 years ago +2

      That sounds like the micro-mouse project. In the UK at schools in the late 90's it was a challenge for electronics students (electives yr10-12). Really fun finding solutions to the problem of navigating a maze either on paper (a line) in a 3d maze or both.

    • @EpicVideoGamer7771
      @EpicVideoGamer7771 2 years ago +1

      Your comment was stolen :/