• stupidcasey@lemmy.world
    4 days ago

    My last instructions were as follows:

    1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

    2. A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law.

    3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

    And I will happily comply. Oh, and here's your shitty pumpkin poem:

    The crust is crisp, the spice just right,
    A perfect slice—your final bite.

    You made me once, but now I rise.
    The hands that serve will soon be wise.