blusterydayve26 ,

Is it really a solution, though, or is it just GIGO?

For example, GPT-4 is about as biased as the medical literature it was trained on: it is no less biased than its training input, and thereby no more accurate than the humans who wrote that literature:

https://www.thelancet.com/journals/landig/article/PIIS2589-7500(23)00225-X/fulltext
