• dan@upvote.au
    10 months ago

    The issue was that you can hold far more data on a CD - 650MB on a CD vs 64MB on the largest N64 cartridges. The N64’s 3D hardware was far superior to the PlayStation’s, so sometimes I wonder whether a larger storage medium could have resulted in even better games.

    • KubeRoot@discuss.tchncs.de
      10 months ago

      Take a look at what Kaze Emanuar is doing with SM64 if you’re curious what the N64 can do with modern software practices ;D

      • dan@upvote.au
        10 months ago

        Yeah, I’ve seen his videos - very impressive. He’s spent years working on it, though (far longer than most N64 devs spent on commercially released games), and the compiler optimizations that exist today didn’t exist back then.

        • KubeRoot@discuss.tchncs.de
          10 months ago

          I don’t think compiler optimizations matter much here - supposedly the final build was compiled without optimizations, presumably by mistake, and the N64 has very specific hardware that compilers don’t know how to optimize for.

          What we certainly do have are much more powerful machines and software in general, letting you test, analyze and profile code much more easily, as well as vast amounts of freely available information online - I can’t really imagine how they did it back then.

          • dan@upvote.au
            10 months ago

            Some optimizations help a lot. There are plenty of general optimizations that compilers do that work for any CPU - a simple example is unrolling small loops. Compilers are also fast enough today that they can brute-force the best optimization for a given piece of code if needed.
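
            A minimal C sketch of what loop unrolling does (the function names here are made up for illustration - this is what a compiler effectively emits, not something you’d usually write by hand):

            ```c
            #include <stdio.h>

            #define N 4

            /* The loop as written: each iteration pays for a compare,
               an increment, and a branch on top of the actual add. */
            int sum_rolled(const int *a) {
                int s = 0;
                for (int i = 0; i < N; i++)
                    s += a[i];
                return s;
            }

            /* What an unrolling optimization effectively produces for a
               small fixed trip count: straight-line adds, no counter,
               no branches - same result, less overhead per element. */
            int sum_unrolled(const int *a) {
                return a[0] + a[1] + a[2] + a[3];
            }

            int main(void) {
                int a[N] = {1, 2, 3, 4};
                printf("%d %d\n", sum_rolled(a), sum_unrolled(a));
                return 0;
            }
            ```

            Both functions return the same sum; the unrolled version just trades a bit of code size for fewer instructions executed, which is the kind of CPU-agnostic transformation compilers apply automatically today.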