Hacker News

Imagine an LLM tuned to eliminate misunderstanding and to ask "why" at least five levels deep, without fear of irritating the boss or of coming across as not smart. Both are potentially harmful to a human career, but irrelevant to an unthinking software tool.


I too like science fiction. People keep acting like it will be easy to bolt features like "eliminating misunderstandings" onto LLMs, and quite frankly I would be incredibly surprised if that happens any time soon.


Eliminating misunderstanding comes down to a willingness to ask more questions when you have low confidence. The main reason this doesn't happen is that subordinates are afraid of looking stupid or losing their jobs. Neither is a concern for an unthinking machine.
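The mechanism described here (ask clarifying questions while confidence is low, bounded at five levels of "why") can be sketched as a simple loop. This is a toy illustration under stated assumptions, not a real LLM API: `estimate_confidence` and the question template stand in for actual model calls, and the threshold is an arbitrary choice.

```python
# Hypothetical sketch of a confidence-gated clarification loop.
# All names and numbers here are illustrative assumptions.

MAX_DEPTH = 5             # ask "why" at most five levels deep
CONFIDENCE_THRESHOLD = 0.8

def estimate_confidence(request: str) -> float:
    # Toy stand-in for a model's self-reported confidence:
    # treat longer, more specific requests as clearer.
    return min(1.0, len(request.split()) / 10)

def clarify(request: str, answers: list[str]) -> list[str]:
    """Return the clarifying questions asked before either
    confidence crosses the threshold or the depth limit hits."""
    questions: list[str] = []
    context = request
    for depth in range(MAX_DEPTH):
        if estimate_confidence(context) >= CONFIDENCE_THRESHOLD:
            break  # understood well enough; stop asking
        questions.append(
            f"Why (level {depth + 1}): what is the goal behind '{context}'?"
        )
        if depth < len(answers):
            # Fold the human's answer back into the context.
            context = context + " " + answers[depth]
        else:
            break  # no answer available yet
    return questions
```

A vague request like `clarify("fix it", [])` triggers a question, while a specific one sails through with none; the point is that the machine pays no social cost for asking.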



