Unified Data Model

I'm going to define a unified data model as a model of data architecture that is necessarily rooted in actual digital hardware (otherwise it would just be a simple DataModel or naive ObjectModel), that can encompass every data relationship, and that can scale arbitrarily large. Ironically, in order to hold any kind of data relationship, one must constrain the problem with and within machine types. The foundation of such an ability would necessarily lie in a DataStructure like a FractalGraph, rooted in a universal and "atomic" integer: 1.
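The page doesn't spell out what a FractalGraph looks like, but the idea of a structure "rooted in the atomic integer 1" can be sketched: every node carries the same atom, all meaning lives in the edges, and any node's edges may lead to whole sub-graphs, giving the structure its self-similar (fractal) character. The class and method names below are hypothetical illustrations, not a specification from the original text.

```python
# Hypothetical sketch of a FractalGraph node. Assumption: each node's
# "value" is just the atomic integer 1; structure and meaning emerge
# entirely from edges, which may point to nodes or to nested sub-graphs.

class Node:
    """A FractalGraph node: the atom is always 1; relationships carry
    everything else."""

    def __init__(self):
        self.atom = 1      # the universal "atomic" integer
        self.edges = []    # relationships to other nodes / sub-graphs

    def link(self, other):
        """Attach another node (or the root of a sub-graph) as an edge."""
        self.edges.append(other)
        return other

    def magnitude(self):
        """Aggregate value of this node's subtree: repeated 1s summed,
        i.e. larger integers built from the atomic integer.
        (Assumes an acyclic graph for this sketch.)"""
        return self.atom + sum(child.magnitude() for child in self.edges)

# Linking two atoms under a root aggregates to the integer 3:
root = Node()
root.link(Node())
root.link(Node())
print(root.magnitude())  # -> 3
```

The design choice to illustrate: no node holds data beyond the atom, so arbitrarily rich values and relationships must be composed from structure alone, which is exactly the constraint-within-machine-types the paragraph above argues for.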

How would you scale such a data model infinitely large? CrowdSourcing, of course: taking a quadratic (O(n^2)) problem and turning it back into a linear one. (Such a project is over at PangaiaProject.)
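The arithmetic behind that claim can be made concrete (this worked example is mine, not from the original page): n items have n(n-1)/2 pairwise relationships, which grows quadratically, but if each of the n participants curates only the relationships involving their own items, the per-person workload is about (n-1)/2 links, which grows linearly.

```python
# Illustrative arithmetic: crowdsourcing splits a quadratic total
# workload into linear per-contributor shares.

def total_relationships(n):
    # Number of unordered pairs among n items: n choose 2.
    return n * (n - 1) // 2

def per_contributor(n):
    # Each of n contributors handles the pairs touching their item.
    return (n - 1) / 2

for n in (10, 1_000, 1_000_000):
    print(n, total_relationships(n), per_contributor(n))
```

At a million items the total is about half a trillion relationships, yet each contributor handles only about half a million: quadratic in aggregate, linear per person.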

Somewhere on this wiki it's pointed out that tools are often defined by their limits as well as by what they can do. Standards impose limits; otherwise they wouldn't be standards. Call it a form of discipline or regimentation to "tame" things. The trick is to find something flexible enough to cover many domains, yet not so open-ended that it's RAM-like mush.

Of course, for such a unified model to scale and to be navigable, we need a ThreeDimensionalVisualizationModel.
See also UnifiedObjectModel.

Last edited April 5, 2014.