
Bernardo de La Paz

(56,202 posts)
1. If they froze the trained weights, they wouldn't decline. But they keep "training" them on new users & own output
Mon Feb 17, 2025, 11:05 AM

Training on their own output means training on their "hallucinations". They also ingest the output of other LLM AIs.

A weight is the amount of influence a given artificial neuron (one of millions, or hundreds of millions) has on the next neurons in the chain from input to output. Training is an iterative process of homing in on those weights. It takes millions of runs, a lot of time, and a lot of energy (enough that big players are looking to locate data centers beside nuclear power plants).
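A minimal sketch of the idea, not anything from an actual LLM: one artificial neuron whose output is a weighted sum of its inputs, with training nudging the weights a little on each iteration to shrink the error. All names and numbers here are illustrative assumptions.

```python
def neuron(weights, inputs):
    # Each weight scales how much impact its input has on the output.
    return sum(w * x for w, x in zip(weights, inputs))

def train_step(weights, inputs, target, lr=0.1):
    # One iteration of "homing in": nudge each weight in the direction
    # that reduces the squared error between output and target.
    error = neuron(weights, inputs) - target
    return [w - lr * error * x for w, x in zip(weights, inputs)]

weights = [0.0, 0.0]
inputs, target = [1.0, 2.0], 3.0
for _ in range(100):  # real training takes millions of runs, not 100
    weights = train_step(weights, inputs, target)
# The weights have converged so the neuron's output is close to the target.
```

A real model repeats this kind of update across billions of weights and examples, which is where the time and energy cost comes from.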

