I'm going to define a unified data model as a model of data architecture that is necessarily rooted in actual digital hardware (otherwise it would just be a DataModel or an ObjectModel), that can encompass every data relationship, and that can scale arbitrarily large.
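As a rough sketch of what I mean (the names and the shape of the record are my own assumptions, not anything specified on this page): one way to picture the atomic unit is a fixed, hardware-friendly record, with every data relationship built by composing many such records rather than by changing their shape.

```python
from dataclasses import dataclass

# Hypothetical atomic unit: a fixed-shape record (source, relation, target).
# The fixed shape is the imposed limit; composition of many atoms
# provides the flexibility to encompass arbitrary data relationships.
@dataclass(frozen=True)
class Atom:
    source: int    # id of the owning node
    relation: str  # name of the relationship
    target: int    # id of the related node

class Store:
    """A flat, append-only collection of atoms. It scales by adding
    more atoms, never by changing the atom's shape."""
    def __init__(self):
        self.atoms = []

    def add(self, source, relation, target):
        self.atoms.append(Atom(source, relation, target))

    def related(self, source, relation):
        return [a.target for a in self.atoms
                if a.source == source and a.relation == relation]

store = Store()
store.add(1, "author-of", 2)
store.add(1, "author-of", 3)
print(store.related(1, "author-of"))  # [2, 3]
```

The point of the sketch is only that the constraint lives in the atom: no matter how large the store grows, every relationship is expressed in the same three-field form.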
Somewhere on this wiki it's pointed out that tools are often defined by their limits as well as by what they can do. Standards impose limits; otherwise they wouldn't be standards. You could call it a form of discipline or regimentation meant to "tame" things. The trick is to find something flexible enough to cover a lot of domains, yet not so open-ended that it becomes RAM-like mush.
That's an interesting concept, although I would argue that the limit you're suggesting *is* imposed: the atomic unit itself imposes the constraint. The VotingModel does the rest and keeps the data from becoming mush. See DataEcosystem. The other limit is external: when people corrupt the data or the organization, they don't get the love.