• weird_nugget@lemmy.world · 11 points · 10 months ago

      Yeah, I do too, and it is indeed frequently incorrect. It's good when you have basically no idea what you're doing: it can help you get on track, and then you can research things yourself.

    • TehBamski@lemmy.world (OP) · 8 points · 10 months ago

      Not picking fights. Just curious.

      Is this an improvement or a decline in your overall programming success?

      • Matriks404@lemmy.world · 14 points · 10 months ago

        I am a hobbyist (and not very good) programmer. While ChatGPT (free version) often gives me wrong answers, it still gives me some insight into how things could be done (intentionally or not) or how something works, and it's actually somewhat helpful for learning, though I guess that could be a double-edged sword.

        It is also pretty good at detecting simple code errors, from what I have seen.

        Overall more positive than negative, but I wouldn't recommend using it blindly.

      • Deceptichum@kbin.social · 8 up / 1 down · 10 months ago

        Huge improvement in workflow.

        Don’t get it to write your code for you; it’s not gonna work 3/10 times. Instead, use it to review your code and help remove code smells when refactoring.
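        As a sketch of the kind of review this means (a hypothetical example, with made-up function names and rates): an AI reviewer can flag a classic smell like duplicated branch logic and suggest collapsing it into a lookup table.

```python
# Hypothetical "before" code with a common smell: near-identical
# branches repeating the same base + per-kg structure.
def shipping_cost_v1(weight_kg: float, region: str) -> float:
    if region == "domestic":
        return 5.0 + weight_kg * 1.2
    elif region == "europe":
        return 9.0 + weight_kg * 2.5
    elif region == "overseas":
        return 15.0 + weight_kg * 4.0
    raise ValueError(f"unknown region: {region}")

# "After" the suggested refactor: the repeated structure moves
# into a data table, leaving a single code path.
RATES = {
    "domestic": (5.0, 1.2),
    "europe": (9.0, 2.5),
    "overseas": (15.0, 4.0),
}

def shipping_cost_v2(weight_kg: float, region: str) -> float:
    try:
        base, per_kg = RATES[region]
    except KeyError:
        raise ValueError(f"unknown region: {region}") from None
    return base + weight_kg * per_kg

# Sanity check that the refactor preserved behavior.
assert shipping_cost_v1(2.0, "europe") == shipping_cost_v2(2.0, "europe")
```

        The point of the comment above is exactly this division of labor: you still verify the refactor (e.g. with the assertion) rather than trusting the generated code outright.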

      • blackbirdbiryani@lemmy.world · 7 up / 1 down · 10 months ago

        I don’t use ChatGPT, but work with colleagues who do. Their productivity visibly drops, and half the time I gotta fix their shitty code.

    • LanternEverywhere@kbin.social · 8 points · 10 months ago

      I use ChatGPT for any topic I’m curious about, and about half the time when I double-check the answers, it turns out they’re wrong.

      For example, I asked for a list of phones with screens that don’t use PWM, and when I looked up the specs of the phones it recommended, it turned out they all had PWM, even though the ChatGPT answer explicitly stated that each of these phones doesn’t use PWM. Why does it straight up lie?!