Design principles: Lean data vs. bloated data

I've been tasked with designing tables to store civil infrastructure asset data. Users have asked for a multitude of fields; for instance, 50+ fields in a roads table alone. The tables will contain tens of thousands of records.

In my experience, 50+ fields in a table is far too many: not from a system performance perspective, but from a data entry and data maintenance perspective. Users simply can't keep up with that level of detail long-term. Maintaining 50+ fields per asset is overwhelming and impractical, and it provides little value compared to the effort required.

I would like to set up a data model that is lean, mean and reliable, rather than bloated, unreliable and probably full of nulls. It seems to me that principles like ‘less is more’, ‘keep it simple, stupid’, and ‘do a few things really well’ are the key to success here, but I struggle to communicate this to stakeholders when designing such systems.
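
To make concrete what I mean by lean, here is one sketch of the kind of design I have in mind (table and field names are hypothetical, purely for illustration): a narrow core table holding only the fields users reliably maintain, with rarely-recorded attributes pushed out to an optional side table, so that missing data shows up as absent rows rather than columns full of nulls.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Lean core table: only the handful of fields that get maintained
# reliably. (Hypothetical field names, for illustration only.)
conn.execute("""
    CREATE TABLE roads (
        road_id        INTEGER PRIMARY KEY,
        name           TEXT NOT NULL,
        surface_type   TEXT NOT NULL,
        length_m       REAL NOT NULL,
        last_inspected DATE
    )
""")

# Rarely-maintained attributes live in a side table, one row per
# attribute actually recorded -- absent rows instead of NULL columns.
conn.execute("""
    CREATE TABLE road_attributes (
        road_id   INTEGER NOT NULL REFERENCES roads(road_id),
        attribute TEXT NOT NULL,
        value     TEXT,
        PRIMARY KEY (road_id, attribute)
    )
""")
```

The trade-off is that the side table is harder to query and constrain, so I'd only move a field out there once it's clear users won't keep it up to date; but that's exactly the kind of guideline I'm hoping is already established somewhere.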

So I'm wondering, are there any established design principles that talk about the approximate number of fields that can be reasonably/sustainably/accurately maintained by users? (In other words, a rationale for designing lean data.)