
Metaphorical

(2,507 posts)
7. Nope
Sat Aug 23, 2025, 07:27 PM

As someone who is immersed in AI systems daily: large language models really do suck at math. They first have to recognize that they are in a mathematical context, and then they typically OFFLOAD those computations to an external processor, because of the way latent spaces work, they are simply not well optimized for arithmetic. Now, higher-order symbolic mathematics? Yeah, they actually do pretty well with that, because it is essentially the manipulation of blocks of symbols (most of which have been tokenized) and pattern matching. But anyone using LLMs for statistical work is really getting the benefit of a lot of back-end Python code, custom written by humans, that takes those computations out of the equation.
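A minimal sketch of that offload pattern, for anyone curious. This is not any vendor's actual API; the `tool_call` dict and `calc` helper are hypothetical, standing in for the step where the model, having recognized a math context, emits a structured request that deterministic back-end Python evaluates instead of the model predicting digit tokens:

```python
# Illustrative only: an LLM "tool call" routed to a deterministic calculator.
# The tool_call dict and calc() helper are invented for this sketch.
import ast
import operator

# Whitelisted operators so we can evaluate plain arithmetic safely,
# without reaching for eval().
_OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.Pow: operator.pow,
    ast.USub: operator.neg,
}

def calc(expr: str):
    """Evaluate an arithmetic expression deterministically via the AST."""
    def _eval(node):
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.operand))
        raise ValueError("unsupported expression")
    return _eval(ast.parse(expr, mode="eval").body)

# A hypothetical structured request the model might emit after
# recognizing a mathematical context:
tool_call = {"tool": "calculator", "expression": "2507 * 3 + 19"}
print(calc(tool_call["expression"]))  # 7540
```

The point of the pattern: the model only has to produce the expression string (pattern matching over tokens, which it is good at); the arithmetic itself is done by ordinary human-written code.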

