We might eventually get to a point where LLMs are a useful conversational user interface for systems that are actually intrinsically useful, like expert systems, but it will still be hard to justify their energy cost for such a trivial benefit.
The costs of operation aren’t intrinsic, though. There’s already a lot of progress in bringing computational costs down, and I imagine we’ll see much more of that going forward. Here’s one example of a new technique resulting in cost reductions of over 85%: https://lmsys.org/blog/2024-07-01-routellm/
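The core idea behind routing techniques like the one linked above is simple: send easy queries to a cheap model and reserve the expensive model for hard ones. Here’s a minimal toy sketch of that idea — note the complexity heuristic, model names, and threshold are all illustrative placeholders, not RouteLLM’s actual method (which trains a learned router on preference data):

```python
# Toy sketch of query routing between a cheap and a strong model.
# Everything here (scoring heuristic, model names, threshold) is an
# illustrative assumption, not the RouteLLM implementation.

def complexity(query: str) -> float:
    """Crude difficulty score: longer, question-dense queries score higher."""
    words = query.split()
    return min(1.0, len(words) / 50 + query.count("?") * 0.1)

def route(query: str, threshold: float = 0.5) -> str:
    """Dispatch to a model tier based on the toy score."""
    return "strong-model" if complexity(query) >= threshold else "cheap-model"

if __name__ == "__main__":
    easy = "What is 2 + 2?"
    hard = ("Compare the trade-offs of three distributed consensus protocols "
            "under partial network partitions, and explain how each handles "
            "leader election, log replication, and split-brain recovery? "
            "Also, which would you pick for a geo-replicated database and why?")
    print(route(easy))   # cheap-model
    print(route(hard))   # strong-model
```

The cost savings come from the fact that most real-world traffic is closer to the easy case, so the expensive model only sees a small fraction of queries.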