For anything where you would expect a predictable, useful outcome from an arbitrary input. There is no possible path to LLMs ever doing anything close to that.
LLMs aren’t driving cars. LLMs aren’t doing financial modeling. Those are entirely different tools, with models heavily hand-crafted for specific applications.
Anyone using an LLM to provide therapy should get multiple life sentences in prison regardless of outcomes. There is no possible way for LLMs to ever be actually useful for therapy. An LLM is just a random text generator tuned well enough to sound good. It has no substance, and the underlying tech cannot possibly develop substance.
Can I get server blocking based on city, too?
Stupid puritan bullshit makes just picking a random US server really annoying.