The Intarweb is your friend for researching things. It's a constant battle to disprove cruft and outdated 'knowledge', to reprocess patterns, and to reassess beliefs.
Neural networks are like great compression algorithms.
You're not just replacing a single, self-contained entity each time: you have to retrain all of the other bits of knowledge that might have used that one bit as an underlying part of the compression "tables".
Unfortunately, when you replace something, some related things will come up and get reprocessed, but others will just get corrupted and forgotten until they come back later to bug you.
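To make the analogy concrete, here's a minimal sketch (just NumPy; all names, sizes, and hyperparameters are illustrative, not from any real system): a tiny one-hidden-layer network memorises a set of associations through shared weights, then gets retrained on a single changed association. Because the hidden layer is shared (the "compression tables" of the analogy), updating one fact disturbs the others.

```python
import numpy as np

rng = np.random.default_rng(0)
n_items, n_hidden, n_out = 20, 8, 5

X = np.eye(n_items)                      # one-hot "facts"
Y = rng.normal(size=(n_items, n_out))    # the values associated with them

# Shared weights play the role of the compression "tables".
W1 = rng.normal(scale=0.1, size=(n_items, n_hidden))
W2 = rng.normal(scale=0.1, size=(n_hidden, n_out))

def forward(x):
    h = np.tanh(x @ W1)
    return h, h @ W2

def train(X, Y, epochs, lr=0.1):
    """Plain full-batch gradient descent on squared error."""
    global W1, W2
    for _ in range(epochs):
        h, pred = forward(X)
        err = pred - Y
        dW2 = h.T @ err
        dW1 = X.T @ ((err @ W2.T) * (1 - h**2))
        W2 -= lr * dW2 / len(X)
        W1 -= lr * dW1 / len(X)

def mse(X, Y):
    return float(np.mean((forward(X)[1] - Y) ** 2))

# Learn all the associations, then "replace" just one of them.
train(X, Y, epochs=2000)
print("error on all facts after initial learning:", mse(X, Y))

Y_new = Y.copy()
Y_new[0] = rng.normal(size=n_out)        # fact 0 gets new contents
train(X[:1], Y_new[:1], epochs=500)      # retrain only on the changed fact

print("error on the untouched facts afterwards:", mse(X[1:], Y[1:]))
```

Run it and the error on the untouched facts typically jumps after the single-fact retraining: the knowledge that shared weights with the replaced bit got corrupted along the way, which is exactly the "forgotten until it comes back to bug you" effect.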