Assigning a lot of morality and intention to word-generating algorithms. “Oh no! The thing made to generate text to a prompt generated text to a prompt!”
Especially because the data it has been trained on is probably not typical of a CFO in the real world. In the real world it’s a lot of boring day-to-day stuff that isn’t worth writing up. The stuff worth writing up is the exciting thriller where a CFO steals money, or the news report where a CFO embezzles.
Imagine prompting it with “It was a dark and stormy night” and expecting it to complete that with “so David got out his umbrella, went home, ate dinner in front of the TV then went to bed.” It’s probably what most "David"s would actually do, but it’s not what the training data is going to associate with dark and stormy nights.