• Hawk@lemmynsfw.com
    8 months ago

    They were running inference with a CNN on a mobile device? I have no clue, but that would be costly, battery-wise at least.

    • didnt_readit@sh.itjust.works
      7 months ago

      They’ve been doing ML locally on devices for like a decade, since way before all the AI hype. They’ve had dedicated ML inference cores in their chips for a long time too, which helps the battery-life situation (see the sketch of on-device inference after this thread).

      • Hawk@lemmynsfw.com
        7 months ago

        It couldn’t quite be a decade; a decade ago we’d only just gotten VGG. But sure, broad strokes, they’ve been doing local stuff, cool.
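
For context, here is a minimal sketch of what on-device CNN inference looks like with Apple’s Core ML stack, the kind of thing the thread is describing. The model name `MyClassifier` is hypothetical; the point is that setting `computeUnits` to `.all` lets Core ML schedule the work on the dedicated inference hardware (Neural Engine/GPU) instead of burning CPU cycles, which is the battery-life argument made above.

```swift
import CoreML
import Vision
import CoreGraphics

/// Runs a bundled Core ML image classifier on-device.
/// "MyClassifier" is a hypothetical model compiled into the app bundle.
func classify(_ image: CGImage) throws {
    // Locate the compiled model in the app bundle.
    guard let url = Bundle.main.url(forResource: "MyClassifier",
                                    withExtension: "mlmodelc") else {
        fatalError("Model not found in bundle")
    }

    // .all lets Core ML dispatch to the Neural Engine or GPU when available,
    // which is far cheaper on battery than CPU-only inference.
    let config = MLModelConfiguration()
    config.computeUnits = .all

    let mlModel = try MLModel(contentsOf: url, configuration: config)
    let vnModel = try VNCoreMLModel(for: mlModel)

    // Vision handles resizing and cropping the image to the model's input size.
    let request = VNCoreMLRequest(model: vnModel) { request, _ in
        if let top = (request.results as? [VNClassificationObservation])?.first {
            print("\(top.identifier): \(top.confidence)")
        }
    }

    try VNImageRequestHandler(cgImage: image).perform([request])
}
```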