For those veteran Linux people, what was it like back in the 90s? I saw and heard of Unix systems being available for use, but I did not see much in use apart from old versions of Debian.

Were they prominent in education, like universities? Was it mainly a hobbyist thing at the time, compared to the business use of Windows 95, 98 and classic Mac OS?

I ask this because I found out that some PC games I owned were apparently also released for Linux, even in CD format, by a firm named Loki.

  • LeFantome@programming.dev

    Well, XFree86 ( before Xorg and before KMS ) was an adventure. I spent hours guessing the vertical and horizontal frequencies of my monitor trying to get decent resolutions.

    Other than that, Linux was way more work but “felt” powerful relative to OS options of the time. Windows was still crashy. The five of us that used OS/2 hated that it still had a lot of 16 bit under the hood. Linux was pure 32 bit.

    Later in the 90’s, you could run a handful of Windows apps on Linux and they seemed to run better on Linux. For example, file system operations were dramatically faster.

    And Linux was improving incredibly rapidly so it felt inevitable that it would outpace everything else.

    The reality though was that it was super limited and a pain in the ass. “Normal” people would never have put up with it. It did not run anything you wanted it to run: if you had apps you liked on Mac, Windows, OS/2, Amiga, NeXTstep, BeOS, or whatever else you were using ( there were lots of potential options at the time ), you were out of luck. But even for the pure UNIX and POSIX stuff, it was hard.

    Obviously installation was technical and complex. And everything was a hodge-podge of independently developed software. “Usability” was not a thing. Ubuntu was not released until 2004.

    Linux back then was a lot of hitting FTP sites to download software that you would build yourself from source. Stuff could be anywhere on the Internet and your connection was probably slow. And it was dependency hell, so you would be building a lot of software just to be able to build the software you actually wanted. And there was a decent chance that applications would disagree about what dependencies they needed ( like versions ). Or the config files would be expected in a different location. Or the build system could not find the required libraries because they were not where the Makefile was looking for them.
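
    To make the shape of that problem concrete, here is a minimal sketch ( in Python, with made-up package names ) of the bookkeeping you were effectively doing by hand: walking the dependency graph and building things in an order where every library exists before the program that needs it. This is roughly the part that later package managers automated.

    ```python
    # Toy model of 90's-style dependency hell: before package managers,
    # you resolved this graph yourself, one FTP download and one build at a time.
    # The package names and dependencies below are made up for illustration.

    DEPENDS_ON = {
        "my-app":  ["libgui", "libjpeg"],
        "libgui":  ["libx11", "libjpeg"],
        "libjpeg": [],
        "libx11":  [],
    }

    def build_order(target, deps=DEPENDS_ON, seen=None, order=None):
        """Return an order in which packages must be built so that every
        dependency is compiled and installed before anything that needs it."""
        if seen is None:
            seen, order = set(), []
        for dep in deps.get(target, []):
            if dep not in seen:
                build_order(dep, deps, seen, order)
        if target not in seen:
            seen.add(target)
            order.append(target)
        return order

    if __name__ == "__main__":
        # You wanted one program; you end up building four tarballs.
        print(build_order("my-app"))  # ['libx11', 'libjpeg', 'libgui', 'my-app']
    ```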

    Linux in the 90’s had no package management. This is maybe the biggest difference between Linux then and Linux now. When package management finally arrived, it came in two stages. First came packages, but you were still grabbing them individually from FTP. Second came package managers, which handled dependencies and retrieval.

    The most popular Linux in the mid to late 90’s was Red Hat. This was before RHEL and before Fedora. There was just “Red Hat Linux”. Red Hat featured RPMs ( packages ) but you were still installing them and any required dependencies yourself at the command line. YUM ( the precursor to DNF ) was not added until Fedora Core 1 was released in 2003!

    APT ( apt-get ) was not added to Debian until 1998.

    And all of this meant that every Linux system ( not every distro, every individual computer ) was a unique snowflake. No two were alike. So bundling binary software to work on “Linux” was a real horror show. People like Loki gave it a good run, but I cannot imagine the pain they went through. To make matters worse, the Linux “community” was almost entirely people who had self-selected to give up pre-packaged software and to put in sweat equity instead of paying for stuff. Getting large numbers of people to give you money for software was hard. I mean, as far as we have come, that is still harder on Linux than on Windows or macOS.

    You can download early Debian or Red Hat distros today if you want to experience it for yourself. That said, even the world of hardware has changed. You will probably not be wrestling IRQs to get sound or networking running on modern hardware or in a VM. Your BIOS will probably not be buggy. You will have VESA at least and not be stuck on VGA. But all of that was just “computing” in the 90’s and the Windows crowd had the same problems.

    One 90s hardware quirk though was “Windows” printers and modems, where the firmware was half implemented in Windows drivers. This was because the hardware was too limited or too dumb to work on its own, and to save money your computer would do some of the work. Good luck having Linux support for those though.

    Even without trying old distros, just try to go a few days on your current Linux distro without using apt, dnf, pacman, zypper, the GUI app store, or what have you. Imagine never being able to use those tools again. What would that be like?

    Finally, on my much, much slower 90’s PC, I compiled my own kernel all the time. Honestly multiple times per month I would guess. Compiling new kernels was a significant fraction of where my computing resources went at the time. I cannot remember the last time I compiled a kernel.

    It was a different world.

    When Linus from LTT tried Linux not that long ago ( he wanted to game ), he commented that he felt like he was playing “with” his computer instead of playing “on” his computer. That comment still describes Linux to some extent but it really, really captures Linux in the 90’s.

  • The Zen Cow Says Mu@infosec.pub

    Way back in the early 90s I needed to use LaTeX for university. The DOS version was awful and couldn’t handle large documents. So the options were (1) a NeXTcube for $$$$, (2) NeXTSTEP 3.3 for PCs for $$$ (some faculty had this), or (3) Linux. So I downloaded Slackware on dozens of disks.

    You had to configure the kernel, which wasn’t too hard since the autoconfig walked you through it. The hardest part was setting up X11, which required a lot of manual config, and if you screwed up the timings you could destroy a CRT monitor. OpenStep was an option, so there was a moderately friendly window manager available.

    Learning Emacs was also fairly unpleasant, but that was the best option for editing TeX at the time.

    Everything would work until it suddenly broke. But nonetheless I was somehow able to get that thesis done.

    Ugh, modern Linux is SOOOOOOOOOOOOOOOOOO much better

    • Aceticon@lemmy.world

      Just to add to this, early on there was no such thing as kernel modules, so you had to compile your own kernel with the support you needed for anything beyond the most basic hardware (if I remember it correctly, that was only basic processor stuff, the keyboard and text-mode VGA).

    • JaxNakamura@programming.dev

      “So I downloaded Slackware on dozens of disks.”

      This is no joke. When I downloaded Slackware in '95 or '96, it was over 100 3.5" floppies of 1.44 MB each. And there were still more available, those were just the ones I thought I’d need. And before you could even begin installing, each of those had to be downloaded, written and verified because floppies were not terribly reliable.
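
      For a sense of scale, a rough back-of-the-envelope calculation ( the 28.8 kbps modem speed is an assumption about typical mid-90s dial-up, and real throughput was lower still ):

      ```python
      # Rough arithmetic for pulling a 100-floppy Slackware set over dial-up.
      # The 28.8 kbps line rate is an assumed figure for a mid-90s modem.

      floppies = 100
      floppy_mb = 1.44          # MB per 3.5" high-density disk
      modem_bps = 28_800        # line rate in bits per second (assumed)

      total_bits = floppies * floppy_mb * 1_000_000 * 8
      hours = total_bits / modem_bps / 3600

      print(f"{floppies * floppy_mb:.0f} MB total, "
            f"roughly {hours:.0f} hours of continuous downloading")
      # -> 144 MB total, roughly 11 hours of continuous downloading
      ```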

  • porl@lemmy.world

    Hearing your monitor squeal when you got the modelines wrong was fun.

    • gari_9812@lemmy.world

      Could you please elaborate? I’ve no idea what that sentence means, so it sounds really wild to me 😅

      • IsoKiero@sopuli.xyz

        Back when CRT monitors were a thing and all this fancy plug’n’play technology wasn’t around, you had modelines in your configuration files which told the system what resolutions and refresh rates your hardware could support. And if you put the wrong values there, your dumb analog monitor would just try to eat them as-is, with wildly varying results. Most of the time it just resulted in a blank screen, but other times the monitor would literally squeal as it attempted to push its components well past their limits. And in extreme cases, with older monitors, it could actually physically break your hardware. And everything was expensive back then.

        Fun times.

      • Aceticon@lemmy.world

        CRT monitors internally use an electron gun which just fires electrons at the phosphor screen (from the back, obviously, and the whole assembly is one big vacuum chamber with the phosphor screen at the front and the electron gun at the back), using magnets to steer the electron beam left/right and up/down.

        In practice the way it was used was to point the beam at the start of a line, where it would start moving toward the other side; then after a few clock ticks start sending the line data; then, after as many clock ticks as there were points on the line, stop for a few ticks and sweep it back to the start of the next line (and there was a wait period for this too).

        Back in those days, when configuring X you actually configured all of this in a text file, at a low level, for each resolution you wanted to have: literally the clock frequency, the total lines, the total points per line, the empty lines before sending data (the top of the screen) and after it, as well as the off ticks from the start of each line before sending data and after it.

        All this let you define your own resolutions and even shift the whole image horizontally or vertically to your heart’s content (well, there were limitations on things like the minimum and maximum supported clock frequency of the monitor and such). All that freedom also meant that you could exceed the capabilities of the monitor and even break it.
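
        To make that arithmetic concrete, here is a small sketch using the standard VESA 640x480 timings ( those particular numbers are just a textbook example, not something from the comments above ): the refresh rate is simply the pixel clock divided by the total points per line and the total lines, and if the result lands outside what the monitor could scan, you were in squealing territory.

        ```python
        # Modeline-style arithmetic: an X11 modeline is essentially a pixel
        # clock plus horizontal and vertical totals (visible area + blanking).
        # The values below are the standard VESA 640x480@60 timings.

        pixel_clock_hz = 25_175_000   # dot clock (25.175 MHz)
        h_total = 800                 # total points per line: 640 visible + blanking
        v_total = 525                 # total lines: 480 visible + blanking

        h_scan_khz = pixel_clock_hz / h_total / 1_000        # horizontal scan rate
        v_refresh_hz = pixel_clock_hz / (h_total * v_total)  # vertical refresh rate

        print(f"horizontal: {h_scan_khz:.2f} kHz, vertical: {v_refresh_hz:.2f} Hz")
        # -> horizontal: 31.47 kHz, vertical: 59.94 Hz
        # Ask the monitor for rates its electronics could not handle and you
        # got the blank screen, the squeal, or worse.
        ```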

  • mortalic@lemmy.world

    This was me, you’re talking about me. 😂 In the 90’s Linux was barely getting started, but Slackware was probably the main distro everyone was focused on. That was the first one I ran across. This was probably the late 90’s; I don’t remember when Slackware first came about though.

    By the time the 2000’s came around, it was basically a normal thing for people in college to have used or at least tried. Linux was in the vernacular, textbooks had references to it, and the famous SCO v. IBM lawsuit was in full swing. There were distro choices for days, including Gentoo, on which I spent literally a week getting everything compiled on an old Pentium, only for it to not support some of the hardware and refuse to boot.

    There was a company I believe called VA Linux that declared that year to be the year of the Linux desktop. My memory might be faulty on this one.

    Loki was a company that specialized in porting games to Linux, and they did a good job of it but couldn’t make money. I remember being super excited about them and did buy a few games. I was broke too, so that was a real splurge for me. I feel like they launched in the late 90’s and crashed in the early 2000’s.

    • constantokra@lemmy.one

      I think you need to qualify that having used or tried Linux in college was normal in the 2000s for someone in computer science or engineering, or basically my fellow undiagnosed autistics and autistic adjacents. In my experience it was fairly normal in college for most people to have trouble operating a basic word processor, and they would not have had any idea what Linux was at all.

  • Shimitar@feddit.it

    Ah, Linux from scratch…

    Also, hardware was… harder back then on Linux (mostly modems).

    Besides that, software-wise there was less stuff on Linux than today, so you had to check carefully that you had what you needed.

    But I was already a Linux user, and a Linux-only user at that.

      • cmnybo@discuss.tchncs.de

        It was those crappy winmodems that caused all the problems. They cheaped out on the hardware, so you basically got a sound card. All of the work had to be done by the driver, which also put a lot of load on your CPU. Serial modems just worked, since everything was done in hardware.
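
        As a toy illustration ( not real driver code, just a sketch of the idea ): a winmodem driver had to synthesize and decode the actual line audio itself, sample by sample, on the host CPU, roughly like the little FSK modulator below, whereas a serial modem’s own hardware did all of that and just handed the OS a stream of bytes.

        ```python
        import math

        # Toy sketch of the kind of work a winmodem pushed onto the host CPU:
        # generating the line audio in software. Real softmodem DSP (V.34 etc.)
        # is far more complex; this just FSK-modulates a few bits using
        # Bell 103-style mark/space tones to show the idea.

        SAMPLE_RATE = 9600               # audio samples per second
        BAUD = 300                       # symbols per second
        MARK_HZ, SPACE_HZ = 1270, 1070   # originate-side mark/space tones

        def modulate(bits):
            """Return audio samples encoding the bits as FSK tones."""
            samples, phase = [], 0.0
            per_bit = SAMPLE_RATE // BAUD
            for bit in bits:
                freq = MARK_HZ if bit else SPACE_HZ
                for _ in range(per_bit):
                    phase += 2 * math.pi * freq / SAMPLE_RATE
                    samples.append(math.sin(phase))
            return samples

        if __name__ == "__main__":
            audio = modulate([1, 0, 1, 1, 0])
            # Every one of these samples was computed by your CPU in the
            # driver, instead of by dedicated hardware on the modem card.
            print(f"{len(audio)} samples for 5 bits at {BAUD} baud")
        ```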