I've only been working on PCs for the past two years, and I haven't really had to know much about power distribution until now.
A 1000 watt supply might still be too small. Wattage says almost nothing. What matters is the current available on each voltage rail. A 1000 watt supply that does not provide sufficient current on even one voltage can still fail.
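A rough sketch of that per-rail check, using hypothetical label ratings and draw figures (all the numbers below are assumptions for illustration, not measurements from any real supply):

```python
# Hypothetical label values: a PSU lists a total wattage plus a maximum
# current (amps) for each voltage rail.
psu_rail_limits = {"+12V": 30.0, "+5V": 20.0, "+3.3V": 20.0}  # amps, assumed

# Assumed current the system draws on each rail, in amps.
system_draw = {"+12V": 38.0, "+5V": 8.0, "+3.3V": 5.0}

def insufficient_rails(limits, draw):
    """Return the rails where demand exceeds the supply's rating."""
    return [rail for rail, amps in draw.items() if amps > limits.get(rail, 0.0)]

bad = insufficient_rails(psu_rail_limits, system_draw)
for rail in bad:
    print(f"{rail}: needs {system_draw[rail]} A, supply rates only {psu_rail_limits[rail]} A")
```

Here the total wattage looks generous, but the +12V rail alone is short by 8 amps, which is exactly the failure the paragraph above describes.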
Most computers consume only between 100 and 200 watts. In one rare case, someone reported that his larger system consumed as much as 420 watts.
So let's put both paragraphs together. Since most computer assemblers do not learn from numbers, we first calculate actual power consumption (e.g. 300 watts), then tell them to buy a supply at least double that size. Most then assume that the doubled figure (600 watts) is what the computer actually consumes. Too many assume rather than measure.
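The arithmetic behind that rule of thumb, with hypothetical component draws (the individual wattages are assumptions chosen to sum to the 300 watt example above):

```python
# Hypothetical per-component consumption in watts.
component_watts = {"CPU": 90, "GPU": 150, "drives_and_fans": 30, "motherboard_ram": 30}

actual = sum(component_watts.values())  # actual consumption: 300 W
recommended = 2 * actual                # the "double it" rule of thumb: 600 W

print(f"actual draw: {actual} W, recommended supply: {recommended} W")
# → actual draw: 300 W, recommended supply: 600 W
```

The point of the paragraph stands: the computer still draws only 300 watts; the 600 watt figure is the supply's headroom, not its consumption.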
How hot is your computer? Is it as hot as a kitchen toaster? If not, then it consumes nowhere near 600 watts. The two paragraphs above explain why so many recommend massively oversized supplies, and why that 600 watt supply (which probably only outputs 450 watts maximum) is more than sufficient.
Is the power supply sufficient? A too-small supply can often be identified immediately with a multimeter. It is normal for an undersized supply to still power a computer for months or a year without fault. Just because the computer works does not mean the supply is OK or large enough. Is your new supply defective or undersized? A meter can identify that immediately, months before failure happens.
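What the meter check amounts to: the ATX specification allows about ±5% on the main rails, so each reading taken under full load is compared against that window. The measured values below are hypothetical multimeter readings, not from any real system:

```python
# ATX nominal rail voltages and the ±5% tolerance the spec allows.
ATX_NOMINALS = {"+12V": 12.0, "+5V": 5.0, "+3.3V": 3.3}
TOLERANCE = 0.05  # ±5%

# Hypothetical readings taken with the system under maximum load.
measured = {"+12V": 11.32, "+5V": 4.97, "+3.3V": 3.28}

def out_of_spec(readings, nominals=ATX_NOMINALS, tol=TOLERANCE):
    """Return rails whose measured voltage falls outside the tolerance window."""
    bad = {}
    for rail, volts in readings.items():
        nominal = nominals[rail]
        if abs(volts - nominal) > tol * nominal:
            bad[rail] = volts
    return bad

print(out_of_spec(measured))  # → {'+12V': 11.32}
```

A +12V rail sagging to 11.32 volts under load (below the 11.4 volt floor) flags a defective or undersized supply today, long before it fails outright.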
Since these simple electrical concepts are not well known, many will simply recommend a 600 or 900 watt supply. Better is to connect that 600 watt supply to the new system, measure it, and know without doubt whether that supply is sufficient. The only useful answer is one backed by measurement numbers.