IT needs more brains, so why is it so bad at getting them?
Open-book exams aren’t nearly open enough

  • lilShalom@lemmy.basedcount.com · 1 year ago

    IT requires you to constantly learn new things to stay relevant. I don’t know of any other industry that demands this as much.

    • Oliver Lowe@lemmy.sdf.org · 1 year ago

      For me, that feeling of needing to learn new things comes not from new tech or tooling, but from needing to solve different problems all the time. There is definitely a fast-moving, hype-driven churn in web development (particularly frontend development!), and it really does wear me down. But outside of that, in IT you’re almost always interacting with stuff that has been the same for decades.

      Off the top of my head…

      Networking. From ethernet and wifi, up to TCP/IP, packet switching, and protocols like HTTP.

      Operating systems. Vastly dominated by Windows and Linux. UNIX dates back to the 70s, and Windows on the NT kernel is no spring chicken either.

      Hardware. There have been amazing developments over the years. But incredibly this has been mostly transparent to IT workers.

      Programming. Check The Top Programming Languages 2023. Python, Java, C: decades old.

      User interfaces. Desktop GUI principles are unchanged. iOS and Android are almost 15 years old now.

      Dealing with public cloud infrastructure, for example, you’re still dealing with datacentres and servers. Instead of connecting over a serial console, you get the same data delivered over VNC tunnelled through HTTP. When you ask for 50 database servers, you make an HTTP request to some service. You wait, and you get a cluster of MySQL or PostgreSQL (written in C!) running on a UNIX-like OS (written in C!), and you interact with it using SQL (almost 50 years old now?) over TCP/IP.
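      That point is easy to demonstrate in miniature. Here is a minimal, self-contained sketch (the provisioning endpoint and JSON fields are hypothetical, and a local stdlib server stands in for the cloud service): the "modern" API call is still plain HTTP text sent by hand over a TCP socket, the same stack we have had for decades.

```python
# Sketch: a "cloud provisioning" call is HTTP text over a TCP socket.
# The endpoint /v1/databases and its JSON fields are invented for
# illustration; a local stdlib server stands in for the real service.
import json
import socket
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class FakeCloudAPI(BaseHTTPRequestHandler):
    def do_POST(self):
        # Drain the request body, then answer with a canned JSON response.
        length = int(self.headers.get("Content-Length", 0))
        self.rfile.read(length)
        body = json.dumps({"status": "provisioning", "servers": 50}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), FakeCloudAPI)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Hand-written HTTP/1.1 over a plain TCP socket: nothing magic underneath.
payload = json.dumps({"engine": "postgres", "count": 50})
request = (
    "POST /v1/databases HTTP/1.1\r\n"
    "Host: 127.0.0.1\r\n"
    "Content-Type: application/json\r\n"
    f"Content-Length: {len(payload)}\r\n"
    "Connection: close\r\n\r\n"
    f"{payload}"
)
with socket.create_connection(server.server_address) as sock:
    sock.sendall(request.encode())
    response = b""
    while chunk := sock.recv(4096):
        response += chunk
server.shutdown()

print(response.split(b"\r\n")[0].decode())          # status line
print(response.split(b"\r\n\r\n", 1)[1].decode())   # JSON body
```

      Swap the local server for a real cloud endpoint and only the hostname and payload change; the wire format is the same.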

      As I spend more time in the industry I am constantly learning. But this comes more from me wanting to, or needing to, dig deeper.

      • kapx132@lemmy.world · 1 year ago

        Hardware. There have been amazing developments over the years. But incredibly this has been mostly transparent to IT workers.

        I’d argue that hardware has actually gotten worse over the years, at least in terms of repairability.

      • Aceticon@lemmy.world · 1 year ago (edited)

        This is also my experience.

        Whilst one can viably move around in IT to stay near the bleeding edge (which itself shifts from area to area slowly, over timeframes of a decade or so), most of what’s done in IT is pretty much the same old same old, maybe with bigger tech stacks, because expectations of fancy features keep going up while the timeframes stay the same. For example, integration with remote systems over the network used to be a pretty big deal, but nowadays it’s very much expected as the norm in plenty of situations.

        So you end up with ever larger frameworks and ever larger, thicker stacks of external dependencies: 20 or 30 years ago it was normal to manually manage the entire hierarchy of library dependencies, whilst nowadays you pull a clean project from source control and spend the next half an hour waiting for the dependencies to be downloaded by whatever dependency management system the project’s build framework (itself much more complex) uses.