They use HBM (High Bandwidth Memory); PCs, laptops, and phones don’t use that type of RAM.

  • kmirl@lemmy.world · 5 days ago

    If RAM and GPUs were cheap, people like us would be more likely to set up local LLMs to prevent our data from being productized by power-grabbing corporations.

      • kmirl@lemmy.world · 5 days ago

        I’m not claiming it’s the reason, since it clearly isn’t; only that it will help drive traffic to commercial AI products.

        • village604@adultswim.fan · 4 days ago

          I think it’s more likely that they’re setting up to push VDI (virtual desktop infrastructure).

          The vast majority of consumers would not be able to set up a local LLM, and they know the people who are able to do so aren’t going to use their services in the first place.