• LughOPMA
    11 months ago

    Computers are starting to consume staggering amounts of electricity. There is a trade-off between the utility of the tasks they perform and the climate damage caused by generating the electricity they need. Bitcoin mining alone is thought to currently use about 2% of America’s electricity, and seems an especially egregious waste of energy.

    Radically reducing computers’ electricity requirements as they become more powerful should be treated as an urgent task.

  • pelya@lemmy.world
    11 months ago

    So the whole chip is a complicated lens that can somehow perform multiplication using ‘analogue computation’.

    Arxiv link
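    [Editor’s note: not from the paper — just a toy sketch of the general idea behind optical analog multiply-accumulate. Inputs are encoded as light intensities, weights as the transmittance of spots on the chip, and a detector sums whatever light arrives; the small per-element noise term models analog imprecision. All names and numbers here are made up for illustration.]

    ```python
    import random

    def optical_matvec(weights, inputs, noise=0.01):
        """Matrix-vector product as an idealised analog optical chip."""
        out = []
        for row in weights:
            # Each detector integrates light from every input channel:
            # a multiply-accumulate done by physics, not logic gates.
            acc = sum(w * x * (1 + random.gauss(0, noise))
                      for w, x in zip(row, inputs))
            out.append(acc)
        return out

    W = [[0.2, 0.8], [0.5, 0.5]]
    x = [1.0, 2.0]
    print(optical_matvec(W, x))  # close to the exact result [1.8, 1.5]
    ```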

    • kakes@sh.itjust.works
      11 months ago

      Imo, analog computation is the way forward with this whole AI thing. It seems like a waste to perform calculations bit-by-bit when neural nets are generally okay with “fuzzy math” anyway.
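      [Editor’s note: a minimal illustration of the “fuzzy math” point, with made-up weights. Perturb every weight of a tiny network by a few percent — mimicking analog error — and the output barely moves.]

      ```python
      import math
      import random

      def forward(weights, x):
          # One tanh hidden layer, one linear output.
          hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x)))
                    for row in weights["h"]]
          return sum(w * h for w, h in zip(weights["o"], hidden))

      def perturbed(weights, noise=0.05):
          # Multiply every weight by (1 + small gaussian error).
          return {
              "h": [[w * (1 + random.gauss(0, noise)) for w in row]
                    for row in weights["h"]],
              "o": [w * (1 + random.gauss(0, noise)) for w in weights["o"]],
          }

      net = {"h": [[0.9, -0.4], [-0.3, 0.7]], "o": [1.2, -0.8]}
      x = [0.5, 0.25]
      print(forward(net, x), forward(perturbed(net), x))  # differ only slightly
      ```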

    • Anticorp@lemmy.world
      11 months ago

      So first we tricked rocks into doing math, and now we’ve figured out how to trick glass into doing math? This is truly amazing.

  • Yer Ma@lemm.ee
    11 months ago

    Automobile analogy: there is no replacement for displacement… until there is?