4 Comments
Kent Robertson

Remember, opinions are like a$$ holes, everyone's got one.

Kent Robertson

If, when asking AI platforms a question, the user specifically asks for facts and not opinion, do these LLMs do a better job of giving these facts in an unbiased manner?

Asking AI platforms for opinions is fraught with the same problems that current media sources have.

Even if differing opinions are given, the choice of which opinions to cite can be perceived as biased.

Our biggest problem is getting the facts, the truth and nothing but the truth. We are capable of making good decisions when we are dealing with all the facts. Media bias is the root of all evil in my book. News should be unbiased and search out facts, leaving opinions to people.

Andy Hall

That's an interesting question. For our study we focused on stripped-down policy questions as prompts, but in the future it would be very interesting to see how differently the LLM behaves when the prompt specifies whether it's looking for a factual answer or an opinion.

Michael

A fascinating article. Gave me a lot to think about. I can see people using "AI" to determine their political choices. Just my humble opinion, but is the use of "AI" much different than relying upon a favorite "podcaster" or "YouTuber" to make your decisions? (Just the rambling thoughts of an old hermit. Hope I haven't upset anyone.)