I'm going to define a unified data model as a model of data architecture that is necessarily rooted in actual digital hardware (otherwise it would just be a DataModel or ObjectModel), that can encompass every data relationship, and that can scale arbitrarily large.
Somewhere on this wiki it's pointed out that tools are often defined by their limits as well as by what they can do. Standards impose limits; otherwise they wouldn't be standards. It could be called a form of discipline or regimentation in order to "tame" things. The trick is to find something that's flexible and can cover a lot of domains, yet not so open-ended that it's RAM-like mush.
Yes, but the idea of a unified data model is to impose a standard that is universal enough to encompass a large scope of data and types of data. This is in contrast to an ad hoc data model concocted to solve business problems, even at the enterprise level.
That's an interesting concept, although I would argue that the limit you're suggesting *is* imposed: the atomic unit imposes the constraint. The choice of atomic unit determines what types of relationships your model can encode. The VotingModel does the rest and prevents the data from becoming mush. See DataEcosystem. The other limit is external: when people corrupt the data or the organization, they don't get the love.
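A minimal sketch of how the atomic unit constrains what a model can encode, assuming an RDF-like (subject, predicate, object) triple as the atomic unit. The class and names here are illustrative, not anything defined on this wiki: with a triple as the unit, only binary relationships fit directly, and an n-ary fact must be reified into several triples around a fresh node.

```python
class TripleStore:
    """Every fact is one triple; only binary relationships fit directly."""

    def __init__(self):
        self.triples = set()

    def add(self, subject, predicate, obj):
        self.triples.add((subject, predicate, obj))

    def query(self, subject=None, predicate=None, obj=None):
        # None acts as a wildcard in any position.
        return [t for t in self.triples
                if (subject is None or t[0] == subject)
                and (predicate is None or t[1] == predicate)
                and (obj is None or t[2] == obj)]


store = TripleStore()

# A binary fact fits in one atomic unit:
store.add("alice", "manages", "bob")

# A ternary fact ("alice sold a widget to bob") does not fit one triple;
# it has to be reified into several triples hung off a new node:
store.add("sale1", "seller", "alice")
store.add("sale1", "item", "widget")
store.add("sale1", "buyer", "bob")

print(sorted(store.query(subject="sale1")))
```

The point is the same as above: pick the triple as your atomic unit and binary relationships come for free, while everything richer is forced through reification — the limit is imposed by the unit, not by the standard around it.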