Like I’m not one of THOSE. I know higher = better with framerates.

BUT. I’m also old. And depending on when you ask me, I’ll name The Legend of Zelda: Majora’s Mask as my favourite game of all time.

The original release of that game ran at a glorious twenty frames per second. No, not thirty. No, not even twenty-four like cinema. Twenty. And sometimes it’d choke on those too!

… And yet. It never felt bad to play. Sure, it’s better at 30FPS on the 3DS remake. Or at 60FPS in the fanmade recomp port. But the 20FPS original is still absolutely playable.

Yet like.

I was playing Fallout 4, right? And when I got to Boston it started lagging in places, because, well, it’s Fallout 4. It always lags in places. The lag felt awful, like it really messed with the gamefeel. But checking the FPS counter it was at… 45.

And I’m like – Why does THIS game, at forty-five frames a second, FEEL so much more stuttery and choked up than ye olde video games felt at twenty?

  • BombOmOm@lemmy.world

    It’s a few things, but a big one is the framerate jumping around (inconsistent frame time). A consistent 30fps feels better than 30, 50, 30, 60, 45, etc. Many games will have a frame cap feature, which is helpful here. Cap the game off at whatever you can consistently hit in the game that your monitor can display. If you have a 60hz monitor, start with the cap at 60.
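    A toy numeric sketch of that consistency point (the FPS values are my own made-up examples, not from any game): runs in the same FPS ballpark can hide very different frame-time spreads.

```python
# Toy illustration (made-up numbers, not from any game or engine):
# two runs in the same FPS ballpark, very different frame-time consistency.
def frame_times_ms(fps_samples):
    """Convert instantaneous FPS readings to per-frame times in milliseconds."""
    return [1000.0 / fps for fps in fps_samples]

steady = [30, 30, 30, 30, 30]   # locked 30 fps
jumpy = [30, 50, 30, 60, 45]    # the "30, 50, 30, 60, 45" case

for name, run in (("steady", steady), ("jumpy", jumpy)):
    times = frame_times_ms(run)
    spread = max(times) - min(times)
    print(f"{name}: spread between slowest and fastest frame = {spread:.1f} ms")
```

    The locked run has a 0 ms spread; the fluctuating one swings by over 16 ms from frame to frame, and that swing is what you feel as stutter even when the average number looks fine.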

    Also, many games add motion blur, AI generated frames, TAA, and other things that really just fuck up everything. You can normally turn those off, but you have to know to go do it.


    If you are on console, good fucking luck. Developers rarely include such options on console releases.

    • IceFoxX@lemmy.world

      30, 50, 30, 60, 30… that’s FPS. Frame time is the time between individual frames, not the count of frames in a second.

  • over_clox@lemmy.world

    My favorite game of all time is Descent, the PC version to be specific. I didn’t have a PlayStation when I first played it.

    The first time I tried it, I had a 386sx 20MHz, and Descent, with the graphics configured at absolute lowest size and quality, would run at a whopping 3 frames per second!

    I knew it was basically unplayable on my home PC, but did that stop me? Fuck no, I took the 3 floppy disks installer to school and installed it on their 486dx 66MHz computers!

    I knew it would just be a matter of time before I got a chance to upgrade my own computer at home.

    I still enjoy playing the game even to this day, and have even successfully cross compiled the source code to run natively on Linux.

    But yeah I feel you on a variety of levels regarding the framerate thing. Descent at 3 frames per second is absolutely unplayable, but 20 frames per second is acceptable. But in the world of Descent, especially with modern upgraded ports, the more frames the better 👍

      • over_clox@lemmy.world

        I haven’t actually played Free Space before, but I did manage to get a copy and archive it a few years ago.

        I also got a copy of Overload and briefly tried that, but on my current hardware it only runs at about 3 frames per second…

        The Descent developers were really ahead of their time and pushing gaming to the extreme!

  • lurch (he/him)@sh.itjust.works

    Old games’ animations were sometimes made frame by frame. Like the guy who drew the character pixel by pixel was like “and in the next frame of this attack the sword will be here”

  • tomkatt@lemmy.world

    Couple things. Frame timing is critical and modern games aren’t programmed as close to the hardware as older games were.

    Second is the shift from CRT to modern displays. LCDs have inherent latency that is exacerbated by lower frame rates (again, related to frame timing).

    Lastly with the newest displays like OLED, because of the way the screen updates, lower frame rates can look really jerky. It’s why TVs have all that post processing and why there’s no “dumb” TVs anymore. Removing the post process improves input delay, but also removes everything that makes the image smoother, so higher frame rates are your only option there.

    • Lojcs@lemm.ee

      First time hearing that about OLEDs, can you elaborate? Is it that the lack of inherent motion blur makes it look choppy? As far as I can tell, that’s a selling point that even some non-OLED displays emulate with backlight strobing, not something displays try to get rid of.

      Also, the inherent LCD latency thing is a myth: modern gaming monitors have little to no added latency even at 60 Hz, and at high refresh rates they’re faster than 60 Hz CRTs.

  • SolidShake@lemmy.world

    Bro when Majora’s mask came out nothing was 60fps lol. We weren’t used to it like how we are today. I’m used to 80fps so 60 to me feels like trash sometimes.

    • Count Regal Inkwell@pawb.socialOP

      Ackshuli – By late 2000 there were a couple games on PC that could get there.

      … If you were playing on high-end hardware. Which most PC gamers were not. (despite what Reddit PCMR weirdos will tell you, PC gaming has always been the home for janky hand-built shitboxes that are pushed to their crying limits trying to run games they were never meant to)

      Regardless that’s beside the point – The original MM still doesn’t feel bad to go back to (it’s an annual tradition for me, and I alternate which port I play) even though it never changed from its 20FPSy roots.

    • TheFogan@programming.dev

      Yeah but even now you can go back and play Majora’s mask, and it not feel bad.

      But as mentioned, the real thing is consistency, as well as the scale of action, pace of the game, etc. Zelda games weren’t sharp pinpoint control games like, say, a modern FPS. Gameplay was fairly slow. And the second factor is simply that games that were 20 FPS were made to be a 100% consistent 20 FPS. A game locked at 20 will feel way smoother than one that alternates between 60 and 45.

      • IceFoxX@lemmy.world

        Games just don’t get optimized like they used to; the cost gets pushed onto the end user in the form of raw computing power. It’s a cost decision. On top of that, game scope has grown enormously, which makes optimization more time-consuming and therefore more expensive. With consoles, there’s also the fact that optimizations target one fixed hardware configuration, unlike PCs, where the range of available components keeps growing. Either way, the aim is to cut costs while maximizing profit.

    • otp@sh.itjust.works

      Bro when Majora’s mask came out nothing was 60fps lol

      Huh? 60fps was the standard, at least in Japan and North America, because TVs were at 60Hz/fps.

      Actually, 60.0988fps according to speed runners.

      • bleistift2@sopuli.xyz

        The TV might refresh the screen 60 times per second (or actually refresh half the screen 60 times per second, or actually 50 times per second in Europe), but that’s irrelevant if the game only throws 20 new frames per second at the TV. The effective refresh rate will still be 20Hz.

        That’s just a possible explanation. I don’t know what the refresh rate of Majora’s Mask was.

        • otp@sh.itjust.works

          I was looking it up, and games like Super Mario World are allegedly at 60fps according to some random things on the internet

  • RobotZap10000@feddit.nl

    FPS counters in games usually display an average across multiple frames. That makes the number legible when the FPS fluctuates, but if it fluctuates really hard frame-by-frame, the average can be misleading. If a few frames here were output at 20 FPS, and a few there at 70 FPS, the average of those would be 45 FPS; you could still very much tell that the framerate was either very low or very high, which is perceived as stutter. Your aforementioned old games were probably frame-capped to 20 while still having lots of processing headroom to spare for more intensive scenes.
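    A quick sketch of that averaging effect (hypothetical numbers, not from any real counter): a display that averages instantaneous FPS readings reports 45 even though no frame was actually delivered anywhere near 45 FPS.

```python
# Hypothetical frame samples: five frames rendered at 20 fps, five at 70 fps.
samples = [20.0] * 5 + [70.0] * 5

# A naive counter that averages the instantaneous FPS readings:
naive_average = sum(samples) / len(samples)
print(naive_average)  # 45.0

# Averaging frame *times* instead weights the slow frames properly:
frame_times = [1.0 / fps for fps in samples]
true_average = len(frame_times) / sum(frame_times)
print(round(true_average, 1))  # ~31.1, noticeably lower than the naive 45
```

    The frame-time average lands around 31 FPS because the slow frames occupy far more wall-clock time than the fast ones, which is exactly why the experience feels worse than “45 FPS” suggests.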

  • Ghostalmedia@lemmy.world

    Some old games are still pretty rough with their original frame rate. I recently played four-player GoldenEye on an N64, and that frame rate was pretty tough to deal with. I had to retrain my brain to process it.

    • Crozekiel@lemmy.zip

      Out of curiosity, did you have an actual N64 hooked up to a modern TV? A lot of those old games meant to be played on a CRT will look like absolute dog shit on a modern LCD panel. Text is harder to read, it is harder to tell what a shape is supposed to be, it’s really bad.

  • dukeofdummies@lemmy.world

    Well a good chunk of it is that older games only had ONE way they were played. ONE framerate, ONE resolution. They optimized for that.

    Nowadays they plan for 60, 120, and if you have less too bad. Upgrade for better results.

  • AdrianTheFrog@lemmy.world

    Game design is a big part of this too. Particularly first person or other fine camera control feels very bad when mouse movement is lagging.

    I agree with what the other commenters are saying too, if it feels awful at 45 fps your 0.1% low frame rate is probably like 10 fps
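    The “0.1% low” (or 1% low) figure mentioned above is usually derived from frame times, roughly like this sketch (the function name and numbers are my own, not from any particular benchmarking tool):

```python
def percentile_low_fps(frame_times_ms, fraction=0.01):
    """Average FPS over the slowest `fraction` of frames (0.01 = the 1% low)."""
    worst_first = sorted(frame_times_ms, reverse=True)
    n = max(1, int(len(worst_first) * fraction))
    slowest = worst_first[:n]
    avg_ms = sum(slowest) / len(slowest)
    return 1000.0 / avg_ms

# 99 smooth ~60 fps frames plus a single 100 ms hitch: the overall average
# still looks decent, but the 1% low exposes the stutter.
times = [16.7] * 99 + [100.0]
print(round(percentile_low_fps(times), 1))  # 10.0
```

    The same function with `fraction=0.001` gives the 0.1% low on a long enough capture; a single hitchy frame drags the metric down hard even when the average FPS barely moves.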

  • Raltoid@lemmy.world

    Stuttering, but mostly it’s the FPS changing.

    Lock the FPS to below the lowest point where it lags, and suddenly it wont feel as bad since it’s consistent.


    EDIT: I completely skipped over that you used Fallout 4 as an example. That engine tied game speed and physics to FPS last time I played, so unless you mod the game, things will literally move “slower” as the FPS drops.

  • Crozekiel@lemmy.zip

    Part of it is how close you are to the target FPS. They likely made the old N64 games to run somewhere around 24 FPS, since that was an extremely common “frame rate” for the CRT TVs of the time. Therefore, the animations of basically everything that moves in the game could be tuned to that frame rate. It would probably look like jank crap if they made the animations have 120 frames for 1 second of animation, but they didn’t.

    On to Fallout 4… Oh boy. Bethesda jank. Creation engine game speed is tied to frame rate. They had several problems with the launch of Fallout76 because if you had a really powerful computer and unlocked your frame rate, you would be moving 2-3 times faster than you should have been. It’s a funny little thing to do in a single-player game, but absolutely devastating in a multi-player game. So, if your machine is chugging a bit and the frame rate slows down, it isn’t just your visual rate of new images appearing that is slowing down, it’s the speed at which the entire game does stuff that slowed down. It feels bad.
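    A toy sketch of that frame-tied-speed difference (my own numbers and function, not actual Creation Engine code): movement advanced by a fixed amount per frame speeds up as FPS rises, while movement scaled by frame time covers the same distance at any FPS.

```python
SPEED = 5.0  # units per second, as the designer intended

def simulate(fps, seconds, per_frame_step=None):
    """Advance a position for `seconds` of play at a fixed `fps`.

    If `per_frame_step` is given, movement is tied to frames (the jank way);
    otherwise it is scaled by delta time (the robust way).
    """
    pos = 0.0
    dt = 1.0 / fps
    for _ in range(round(fps * seconds)):
        pos += per_frame_step if per_frame_step is not None else SPEED * dt
    return pos

step_60 = SPEED / 60  # per-frame step tuned for 60 fps
print(round(simulate(60, 1.0, per_frame_step=step_60), 2))   # 5.0 units
print(round(simulate(144, 1.0, per_frame_step=step_60), 2))  # 12.0 units, 2.4x too fast
print(round(simulate(144, 1.0), 2))                          # 5.0 units with delta time
```

    Run the frame-tied version at a low FPS instead and everything moves proportionally slower, which matches the “chugging” feel described above.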

    And also, as others have said, frame time, dropped frames, and how stable the frame rate is makes a huge difference too in how it “feels”.

      • Count Regal Inkwell@pawb.socialOP

        Yeah it’s actually around the 30s (or 60s, depending on whether you consider interlaced frames to be ‘true’ or just ‘halves’)

        A CRT television runs at 60Hz because it uses the alternating current from the wall as a reference, but in every half cycle it only actually draws half of the image. “60i” as they call it.

        So you can say it’s 60 interlaced frames a second, which is about comparable to 30 progressive frames.

      • Crozekiel@lemmy.zip

        That’s true, but I didn’t say that was the native “frame rate” of the TVs, just a very commonly used frame rate at the time. And, at least in my experience, desync’ed frame rate and refresh rate is less of a problem on CRTs than LCDs - you don’t “feel” it as much generally.

  • dontbelievethis@sh.itjust.works

    Probably consistency.

    Zelda was 20 fps, but it was 20 fps all the time so your brain adjusted to it. You could get lost in the world and story and forget you were playing a game.

    Modern games fluctuate so much that it pulls you right out.

  • Yermaw@lemm.ee

    The display being at a higher resolution doesn’t help either. Running retro games on my fancy flatscreen hi-def massive TV makes them look and feel so much worse than on the smaller fuzzy CRT screens of the time.

    I can’t stand modern games with lower frame rates. I had to give up on Avowed and a few other late titles on the Series S because it makes me feel sick when turning the camera. I assume most of the later titles on Xbox will be doing this, as they’re starting to push what the systems are capable of and the Series S can’t really cope as well.