• souperk@reddthat.com · 11 months ago

    The title is pretty self-explanatory. Yes, I want to know if it’s AI generated, because I don’t trust it.

    I agree with the conclusion that it’s important to disclose how the AI was used. AI can be great for reducing the time needed for boilerplate work, so the authors can focus on what’s important, like reviewing and verifying the accuracy of the information.

    • Otter@lemmy.ca · 11 months ago

      Yep, my trust would go:

      1. Sites that state they don’t use AI to generate articles
      2. Sites that label when they use AI-generated articles
      3. Sites that don’t say anything and write in a weird way
      4. Sites that get caught using AI without disclosing it

      So ideally don’t use AI, but if you do, make it clear when and how. If a site gets CAUGHT using AI, then I’m probably going to avoid it altogether.

    • jarfil@beehaw.org · 11 months ago

      “reduce the time needed for boilerplate work”

      Or… and this is just an idea… don’t add “boilerplate” to articles.

      If the content of an article can be summarized in a single table, I don’t want to read 10 paragraphs explaining the contents of the table row by row. The main reason to do that is to pad the article and let the publisher put more ad sections between paragraphs, while making it harder to find the data I’m interested in.

      Still, I foresee a future where humans will fill out the table, shove it at an AI to do the “boilerplate work”, and then… users will shove the whole article into an AI to strip the boilerplate and summarize it.

      A great scenario for AI vendors, not so great for anyone else.