Reminded of the time I used the Yep search engine’s AI chatbot to ask how to use advanced search functions, and it prompted me to click an icon that literally did not exist.
I get that these things don’t have intelligence and work off of datasets, but you would think that companies would ensure people can at least get the correct answers on the sites the damn things are based on.
I find they are good for creative tasks: picture and music generation, but also ideas - say, give me 10 possible character names for a devious butler in a 1930s murder mystery novel.
But yes, terrible for facts, even rudimentary ones. I get so many errors with this approach it's effectively useless.
However, I can see that with narrower training data, say genetics, this might be less of a problem.
“I want AI to do my laundry and dishes so that I can do art and writing, not for AI to do my art and writing so that I can do my laundry and dishes”