Dbas Gone Bad

Database Administrators Gone Bad and things that bother one about DBAs


It may be imprecise to use the term "database administrator" for this discussion. Data Analyst or Data Architect seems more correct. That doesn't invalidate the overall point of this page, however.

Since when are IT JobTitles precise?


(moved from PrevalenceLayer)

...remind me of conversations I've had over the years with some DBAs. I'm pretty well versed in relational DB design, and have a good technical understanding of what goes on in RelationalDatabases (more than I can say for many DBAs). I've even done some backups and admin work!

But the times I've said something that makes their role less relevant, in terms of day-to-day busywork, like allowing developers to have their own db instances, or simply saving something to a file when a db was complete overkill, they go into these gyrations. I think they must feel threatened, like all this RDB trivia they know will simply become worthless if they don't immediately stop the slide down a slippery slope to a world where persistence is done somewhere besides a RDB. The horror! So you start hearing all these densely worded cryptic jargon-filled arguments intended to instill FUD (FearUncertaintyAndDoubt) in the management types, that have nothing to do with improving the quality of the software or getting anything done. Sometimes it works, and the engineers who understand the game that's going on just shake their heads. Some/many/most? DBAs aren't happy unless they're well established as the bottlenecks in the development process.

Having said that, I've known, and can easily count on one hand, open-minded DBAs who are truly interested in helping create good software, and constantly improving the process, with no regard for protecting their fief. Those people are worth a bazillion dollars.

A similar thing could be said about some of the OO approaches: they build a custom database-like thing or features into the app for job security and/or a sense of control. BTW, I see nothing wrong with allowing developers to have their own database instance. Just because some DBAs are evil does not mean they all are. There are evil OOers also. As far as Oracle allegedly being too convoluted or whatever, take that issue up with the vendor. Relational systems don't have to be bulky (although I agree the bulky kind is sort of in style right now).

Regarding using files instead of tables, one problem is that if the application later "grows up" and needs to share info with other applications or languages, then your roll-your-own data persistence mechanism is different from all the other roll-your-own data thingies. Using a database helps make the access to data consistent so that one does not have to overhaul all the data calls or APIs when requirements change and/or more sharing is needed.

Switching persistence mechanisms isn't much of a problem with a well-factored application. Planning for these types of contingencies is itself a source of code bloat and project failure. YouArentGonnaNeedIt.

Having worked in a big organization with large databases, the problem I saw come up repeatedly when applications stored data outside of the main database was duplication. Eventually, the application does "grow up", and meanwhile it has its own complete set of data relationships. It also has its own copy of, say, the set of current employees, but each copy is in a different state of accuracy and currency, and different data-entry conventions were used. Now merging the databases is a very big job!

There's no reason not to start out with a minimalist approach, and as requirements change and learning happens, "back in" to using a RDB. The same can be said of using a J2EE appserver. If you have a change-friendly software development process like XP, this won't be a problem. The challenge is getting that process in place, not IMHO stuff like changing your persistence layer.

This seems to be turning into an XP versus non-XP argument.

(Moved further discussion to RelationalVsXp)


One annoying thing about DB "views" is that we often have to depend on the DBA to create and change them. I think more RDBMSes should offer temporary or user-defined views as a feature, so that we could avoid both writing complex, deeply nested SQL and a trip to the DBA. See SqlFlaws.
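To make the wish concrete: some engines already support connection-local temporary views, which is roughly the feature being asked for. A minimal sketch using SQLite (the table and column names are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES (1, 'acme', 100.0), (2, 'acme', 50.0),
                              (3, 'globex', 75.0);
    -- A temporary view: visible only to this connection, gone when the
    -- session ends, and created without involving the DBA at all.
    CREATE TEMP VIEW customer_totals AS
        SELECT customer, SUM(amount) AS total
        FROM orders GROUP BY customer;
""")
rows = conn.execute(
    "SELECT total FROM customer_totals WHERE customer = 'acme'").fetchone()
print(rows[0])  # prints 150.0
```

The deeply nested SQL the text complains about gets a name instead, without touching the shared schema.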


Re: A similar thing could be said about some of the OO approaches: they build a custom database-like thing or features into the app for job security and/or a sense of control.

Most of the DBAs I've known have openly attacked the OO paradigm, but I've never met an OO programmer who attacked the relational paradigm. I assume there are some out there I haven't met, but all of the OO programmers I've known and worked with accept and respect databases as useful tools.

Sorry, I don't find all OO fans as charitable as you do. Besides, the quoted sentence is more about possible bad practices than about explicit criticism. They don't bash DBAs or RDBMSes; they simply go around them by hand-building database-like features into the application itself. A majority of the OO fans I encounter would probably either build their own database-like features (such as persistence, indexing, multi-user semaphores, queries) into their applications, or would prefer to use an OODBMS instead of an RDBMS if given a choice. Of course, this is not a formal survey.

I agree, most OO programmers want to use something lighter than a commercial RDBMS when they can get away with it. And why shouldn't they? Why use a tool when you don't need it? When they need it, they use it without attacking relational fans or the theory behind RDBMSs. I've heard a lot of them gripe about DBAs taking too long to create schema, but that's about the worst I've heard. Of the people I've worked with, OO programmers tend to know much more about SQL and commercial RDBMSs than DBAs know about OO languages and programming.

Assuming this is true (I am skeptical), what is it about being a DBA that allegedly turns one into a stubborn ignoramus?

Your raison d'ĂȘtre is tied to one particular breed of persistence. More than that, you're tied to one of probably 3 or 4 major databases, since no one makes much money doing DBA work on HSqlDB. These incentives all favor JobSecurity behavior like making yourself a bottleneck, putting logic in triggers and stored procedures...in other words, stuff that generally works against most modern software development processes.

Since RDBMS tend to overlap with territory of other development processes (such as OO), this could be said about both sides. Each side thinks their technology is superior and that more should be shifted to it. It is a complex situation where there are no clear guidelines. It is like having supply-side and demand-side economists both involved in a project.

[aside: supply-side economics is a theory peddled only by cranks.]

I did mention that there are DBAs that aren't like this at all. I just wish there were a LOT more of them.

They aren't stubborn ignoramuses. They are tool custodians. Their survival is tied to the continued use of a tool. Any attempt to ignore that tool threatens them.

Well, it is perhaps a ZeroSum decision. Either you use the typical database features (about 8 of them) from the database, or you write or put those things in application code. If you put those 8 in the app, you diminish the role and influence of the DBA; if you put it all in (or get it from) the database, you diminish the role and influence of OO designers. It is perhaps one of those unsolvable HolyWarWalls. I personally think they belong in the database: it is better factoring, sharing, and reuse of typical and common data needs, IMO.


They aren't stubborn ignoramuses. They are tool custodians. Their survival is tied to the continued use of a tool. Any attempt to ignore that tool threatens them.

They are professionals who have been successful and who feel threatened by ongoing changes in their field. It will happen to every one of us sooner or later. It is an example of how important it is to include the human impacts of technology in our work - especially in our interactions. I suggest that someone who has just been labelled a "stubborn ignoramus" correctly feels threatened - not by a technology, but by an abusive individual.


On the other hand, it is true that developers can be a particularly dangerous kind of ignoramus as well, particularly the likes that whine about the DBAs :), and they are much more hell-bent on JobSecurity. Put your "persistence" in Python or use Prevayler, for example, and the company data, one of the most valuable assets of an enterprise, becomes the prisoner of your application. Try to access that data from Lisp or Haskell, or VB or C++, if you will. Oh, and by the way, you can't get any more ignorant than comparing ThePrevayler with Oracle without knowing that JavaSerializationIsBroken (among the many other things that differentiate Prevayler from a real DBMS).

Hmmm, it seems to me it's trivial to write a program that extracts data from objects and puts it in any given persistence store. Yet one more source of ignorance is the confusion between persistence and databases; for this the only cure is AnIntroductionToDatabaseSystems. A more subtle source of ignorance is the claim of "modern development processes" (that are supposedly hampered by DBAs). Well, you might want to read EwDijkstra's papers on software engineering and processes before expecting any claim of a "modern software development process" to be taken seriously.

Now, one more issue with ignorance: the claim that stored procedures and triggers are "poo". That's absolute and utter nonsense, and a typical example of JobSecurity (what I don't know, am not qualified for, and am not willing to learn must be bad, so let's fight against it). There are countless examples where a masterful use of triggers, views, and stored procedures will simplify the application code, replacing many lines of convoluted logic in your favorite O/R mapping framework with a handful of lines in a trigger plus the simplest insert or update in the application.
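As a hedged illustration of that point (SQLite standing in for a big-iron DBMS; the schema is invented), a three-line trigger can centralize audit logic that would otherwise be repeated in every application that touches the table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE account (id INTEGER PRIMARY KEY, balance REAL);
    CREATE TABLE account_audit (account_id INTEGER, old_balance REAL,
                                new_balance REAL);
    -- The trigger fires on every balance change, no matter which
    -- language or tool issued the update.
    CREATE TRIGGER audit_balance AFTER UPDATE OF balance ON account
    BEGIN
        INSERT INTO account_audit VALUES (OLD.id, OLD.balance, NEW.balance);
    END;
    INSERT INTO account VALUES (1, 100.0);
""")
# The application issues the simplest possible update; the audit row
# appears without a line of O/R-mapper logic.
conn.execute("UPDATE account SET balance = 75.0 WHERE id = 1")
audit = conn.execute("SELECT * FROM account_audit").fetchall()
print(audit)  # prints [(1, 100.0, 75.0)]
```

The same guarantee written in application code would have to be duplicated in every client, which is exactly the OnceAndOnlyOnce argument made further down the page.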

At the (severe) cost of 1) the ability to find bugs and 2) the speed with which they can be fixed. And most of the DBAs who write these things have probably never heard of UnitTesting. {That's only a great cost for the kind of developer ignoramuses we're talking about. As for UnitTests, that's all fine and dandy, but how about ProofObligation?}

You said it. At one job I worked, an enlightened DBA wanted to write UnitTests with PlSqlUnit, and his CowOrkers in the database group rejected it outright.

Like any tool, there are good and bad uses for stored procedures. I don't think anybody could say that all uses are bad. For one, they can be executed by a wide variety of languages and applications. The alternative is often language- or app-specific code. This may violate OnceAndOnlyOnce if you have to rewrite the same logic for each language.

Another huge plus is throughput. If you're using an O/R mapper, any algorithm you can't directly write into the object-selection query carries a big performance penalty, since it means either post-search filtering, additional database round trips, or possibly both. A PL/SQL, cursor-based solution can pay big dividends in this type of scenario, provided you have the necessary background to write an efficient implementation. Just remember not to use it for write operations, since the mapper will have no visibility into them.

Valuable Section moved to SocialDynamics, as key points get drowned in the merits of technical and implementation discussions. -- dl Apr05


What if they just fundamentally disagree? What if the BenefitsAreSubjective between paradigms? Some people may think naturally in OO, others relationally.


Let's not get too mushy-headed with the peace/love/split-the-blame-down-the-middle business. My own observations over 10 years of experience are that DBAs are usually a bottleneck and frequently an active impediment to the software development process. In other words, getting software into production that meets customer requirements and is maintainable. Software engineers I've talked to, as well as some honest/humble DBAs (i.e. the really good ones who are secure) mostly corroborate this, and I personally rarely hear the reverse being a problem.

Perhaps because you tend to hang out with, or get hired by, a certain "type" of software engineer. Custom, roll-your-own database-like application features are not a small problem in many OO shops.

I don't see any "peace/love/split-the-blame-down-the-middle business" here - perhaps you could be more specific about where any of this gets to "mushy-headed". Perhaps you might agree that your words might, to some DBAs in particular, be viewed as hostile, prejudicial, and even bigoted. Try substituting an ethnic or racial group for "DBAs" in your paragraph and see if you still want to have written it. Such attitudes, in my over 30 years of experience, tend to be extraordinarily divisive and destructive to any work team.

So unfortunately, the real problem engineers are faced with is how to keep the DBA bottleneck effect to a minimum. I will completely agree that taking an adversarial approach is almost sure to fail. But simply splitting the blame and allocating it evenly to all parties is dishonest.

Isn't it up to your supervisor to determine how to handle such? For example, if it takes too long to add new columns and tables and it slows down your projects, then bring that up with your supervisor. If there is a mandate that all "persistent" and/or shared data design must go through the DBA, then you will probably just have to live with such a rule. It is like the selection of language or OS: Somebody else picks it, and you love it or leave it. PlayHurt. You can try to bring about change, but at the risk to your job or career.

Who said anything about "simply splitting the blame and allocating it evenly to all parties"? When there are legitimate bones to pick, a well-functioning group hassles it out. When the team members trust each other, when they have a common vision, mission and goals to provide a framework for discussion, when they respect the differing perspectives that each bring to the argument, and when they are each committed to getting the job done, then such conflicts can usually be worked out - in fact, many groups find that their most successfully creative results arise from such conflicts. I note that a fundamental aspect of Alexander's patterns is that they arise from forces in tension. Conflict is an expected and important aspect of every team's life. The question is how the team chooses to manage such conflict.


DBAs are naturally less inclined to refactoring and the "simplest thing" or other things that may look justified to a developer. If you add, subtract, move around attributes to your classes every other day, don't expect a DBA to be willing to work the same way. That is a very unrealistic and unjustified expectation. Crying foul afterward is really pointless, and at best can be seen as whining.

Get used to the unavoidable fact that data modeling will be more or less a BigDesignUpFront effort. That's the nature of the beast. The data is not there merely to persist objects; data quality and the quality of the database schema are central concerns for a DBA, while for your typical database-illiterate developer they are secondary concerns he would rather forget about and have the tooth fairy (read: O/R mapping tool) take care of "transparently".

One solution: take care of your data-modeling needs more up front, and don't set unrealistic expectations for DBAs. Then you can play with your "non-persistent" objects any way you want, without hitting "the bottleneck".

The DBA's assumption is often that the data will be around a long time and eventually may be shared by multiple applications and possibly multiple languages. Changing everything around in the schema is considered a change to an interface for the most part: it will create ripple effects. The developer's assumption is often that the data is only for his/her application. Which one is the most "right" has to be looked at on a case-by-case basis and perhaps some compromise needs to be reached. Do developers underestimate the need to share? Do DBAs over-estimate? My experience is that pulling info out of a custom-made or esoteric "persistence mechanism" is indeed a bear, especially from "legacy" languages. The idea of a database is that the data is readily accessible (if permission given) regardless of the language or tool used to generate or save it. Yes, this violates pure encapsulation. Just another ObjectRelationalPsychologicalMismatch item.


Unrealistic? Unavoidable? Unjustified? Those are definitely the words of someone who has his heels dug in and isn't ready to EmbraceChange. Here are three papers that demonstrate how it's possible to be agile with RDBMSes, if you're so inclined.

Reeves, Gareth M. Evolutionary Design with Databases http://jstorm.sourceforge.net/documentation/Evolutionary%20Design%20with%20Databases.pdf

Fowler, Martin, Sadalage, Pramod. Evolutionary Database Design, http://martinfowler.com/articles/evodb.html

Fowler, Martin. Domain Logic and SQL http://martinfowler.com/articles/dblogic.html

"Often application developers aren't allowed to define views and database developers form a bottleneck that discourages application developers from getting views like this done. DBAs may even refuse to build views that are only needed by a single application."

-- StevenNewton

Now, a DBA who is not inclined to define views, that's a curiosity. Typically it's more like the other way around: they will define views for security purposes, for controlling who updates what and what the impact is, and so on and so forth. They are trained to do that, and most DBMS documentation strongly encourages such an approach. There might be amateur DBAs, just the same as there are amateur developers.

The bottom line is that DBAs and developers often have conflicting interests. For a DBA, the quality of the data and of the data schema is paramount; 24x7 availability and minimizing downtime and related risks are other huge concerns. Those are legitimate concerns of the whole enterprise, and there is a reason you have DBMSes and DBAs (for the time being, at least). Antagonizing the DBAs and insisting that they had better get along with the agile mentality (or else...) rather misses the point. Also missing the point is quoting a handful of pages dealing with marginal issues only, when you know you can get lots of documented books with real case studies on the other side.

Agreed that these are all legitimate concerns with regards to a production environment, but far too often the same habits are applied to the development sandbox, and I have a hard time conceiving of a point of view from which one could see that as anything but harmful.

The issue of defining temporary or local views may be partly a fault of current implementations of RDBMS and SqlFlaws, as described above.


A true and recent (July 2004) story from the real world.

The application involves tracking sums of money, in the form of allocations which have conditions attached specifying what they may be used for. These allocations are of course tracked and persisted. On the other side, money goes out as it is authorized to specific spending instances in accordance with the limitations set for the allocations. Those authorizations are also tracked and persisted. At any given time the allocations, which may or may not be aggregated into larger sums, have a certain balance remaining for their specified purpose.

In theory, it is possible to determine how much money is available for a specific kind of authorization by taking all the allocations in effect and subtracting all the previously handled authorizations over the necessary span of time. No profiling has been done, but it is known there are many allocations and authorizations, with various complex rules, so it appears this calculation is computationally expensive. It also just doesn't make sense to derive these numbers every time they are needed.

The developers want to store these sums back in the database as they are calculated each time, so that the next subtraction need only examine the remaining available calculated amount rather than rerun all the calculations to determine the available balances. However, they cannot convince the database expert to create any sort of rows, tables, or anything else to allow these available sums to be stored. The developer in charge of database "stuff" insists that because the available sums can be derived at any time from the allocation and authorization data, the application must do this calculation every time. In the last conversation there was talk of putting these sums in some sort of scratch-table area, not part of the domain schema for the business, but it was not well received.

At this point, given the resistance from the database gatekeeper, it may be that the best remaining solution is for the application to keep the available balances but not ever persist them, meaning that if there is a restart or some other loss of runtime state, then there must be a facility in the code to rerun the calculations forward from the raw data. That facility, on first analysis, appears resource-intensive and to require careful optimization. It may be unneeded complexity.

You seem to be agreeing with the (de facto) DBA. So this is an example of a conflict rather than a "bad" DBA, correct? I am not sure why the app developers don't want to re-sum though. Code-wise, it should not make much difference.

[I have to agree here. This isn't complicated at all, it's a basic performance optimization - caching of results. Whether that caching should be done at the database level or at the application level is pretty much a matter of opinion and the issue here is that your opinions are clashing, not that the DBA is being obstructionist. From your brief description I would most likely implement it on the database side with a stored procedure that can transparently implement caching using a local table. I wouldn't do the actual calculations clientside whether there was caching required or not. If you're using the database solely as a place to dump data, and you are insisting on doing data manipulation at the app level then your DBA is totally correct to not let you munge up the database with your scratch space. That's an application level problem, and you should solve it there.]
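One way to sketch the DBA's "derive, don't store" position is to put the derivation behind a view, so the application never duplicates the calculation and a cache table could later be swapped in behind the same name. A SQLite sketch with invented table names (the real schema is obviously richer than this):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE allocation (id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE spend_auth (alloc_id INTEGER, amount REAL);
    INSERT INTO allocation VALUES (1, 1000.0);
    INSERT INTO spend_auth VALUES (1, 200.0), (1, 300.0);
    -- The derived balance lives behind a view. If profiling later shows
    -- it is too slow, caching can be added behind this same name
    -- without changing any caller.
    CREATE VIEW remaining_balance AS
        SELECT a.id, a.amount - COALESCE(SUM(s.amount), 0) AS remaining
        FROM allocation a LEFT JOIN spend_auth s ON s.alloc_id = a.id
        GROUP BY a.id, a.amount;
""")
remaining = conn.execute(
    "SELECT remaining FROM remaining_balance WHERE id = 1").fetchone()[0]
print(remaining)  # prints 500.0
```

This is neutral ground between the two camps: the application sees a simple stored-looking balance, while the database keeps the allocations and authorizations as the single source of truth.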

There is often a need for app-specific info to be cached or saved somehow. Whether the DBA can or should be heavily involved is an issue to settle.


Some of the conflict between programmers and DBAs could perhaps be reduced with NimbleDatabases.

It would be nice if NimbleDatabases were compatible with the big-iron databases. However, I feel that the big-iron vendors purposely obscure their languages and syntax to reduce this possibility. It may be something only OpenSource can pull off.


Production DBA and development DBA are different roles. The former will correctly be resistant to change. The latter should be the reverse. Doesn't always work that way, unfortunately.

How does that work? The development DBA allows change which the production DBA resists until one of them is carted from the field of battle? What happens to the changes the development DBA makes that the production DBA resists?

Migrations into production depend on an extended formal process (via a QA activity). This is designed to guarantee synchronization of releases with external factors such as user-training, data migration, extended backups of the new data, operational procedure enhancements, contingency updates, network changes, storage reconfiguration, volume testing, data-feed updates, etc., etc., etc. Generally, a development system doesn't have such dependencies.

I guess I'm not making myself clear. It's natural for the development environment (including database) to change more rapidly than the production environment. What I'm confused about is the idea of having two DBA roles, one that is more resistant to change than the other. If a change is allowed by the development DBA role and passes QualityAssurance/UserAcceptanceTest?, shouldn't the production DBA role accept it without resistance? Shouldn't there be a set of roles (QualityAssurer?, UserAcceptanceTester?) that approve changes to the production environment?

If I may interject - the distinction between development and production DBAs is a useful one. The former is a team member along with the actual developers: they're responsible for helping the developers build a fast, maintainable, and reliable system. They're experts in their particular database platform, and they have the patience to deal with developers who aren't necessarily SQL or relational experts. The production DBA usually does not have a project-team reporting responsibility - they are more likely part of a larger infrastructure group. They are responsible for ensuring that no data is lost and that downtime due to failure is minimal. They are also responsible for production performance, and are typically blamed for performance problems that can be traced back to inappropriate design by OO developers who didn't understand their underlying database platform, which has tremendous impact on things such as concurrency design.

Such things usually fall outside the scope of any particular UserAcceptanceTest?, as they are more like "hygiene" factors, or "environmental" ones (what was the term TomGilb used for these things? sigh). Hygiene factors tend not to get explicit support in UAT, by users, or by developers, who are all focusing on the schedule or function (XP in particular has a tendency to sweep hygiene factors under the rug, because a "good response time" user story is pretty amorphous). So these "pain in the ass DBAs" are often doing the organization an invaluable service. On the other hand, there definitely are plenty of JobSecurity types out there. -- StuCharlton

I must say, I find the acrimony on this page a little perplexing. As a developer, persistence stores, ACID transaction systems, and the like are at the very bottom of my list of things I'd like to write; I have no desire to put the DBA out of business. My existence is not threatened by the database, nor is the DBA's existence threatened by my code. We each have different and, IMO, pretty clearly divided roles.

I embrace stored procedures and views. They provide a convenient layer of abstraction between my code and the data it requires and generates. If I can give my requirements for a stored proc to a DBA and he implements it for me, I no longer need to care about the internal representation of that data, and can get on with my life. If the DBA wants to move stuff around to reduce contention and the like, I don't have to care. And I like it that way. (It never is that way, and I constantly find myself pushing for it. The ones who resist do not, as a rule, do so politically; they just don't understand the benefits.)

I admit it may be different elsewhere, but at all the places I've worked at, the relational/OO religious war rhetoric above would sound pretty absurd. It's like a turf war between the developers and the network administrators. Why would I care whether we're using BGP or not - does it make socket() work any differently?

Reconciling a rapid evolutionary style of coding with the less mutable world of databases is a constant challenge. Stored procs and views mitigate this somewhat. That does not in any way diminish my respect for the job a DBA has to do.

I realize that the name of the page is DbasGoneBad, but it seems odd that all the developers here have defended the straw-man position offered by (presumably) the DBA contingent, without pointing out that many (most?) of us don't harbor any of the institutional anti-DBA bias with which we are being painted. As with developers, there are bad DBAs in the world; to paraphrase Tolstoy, each is bad in his own way (although there may be interesting commonalities). If someone was passing out torches for the DBA lynch mob, though, I must have missed it. -- PhilGroce


I think there are two camps of object-oriented developers. One camp I'll call the "originals" - which I consider myself to be a part of. We're the programmers that stumbled onto OO in the late 1980s early 1990s. I'd say most of the originals used Smalltalk as their learning language; maybe they did some C++. The point, however, is that during this early timeframe there was a certain mindset about object-oriented programming. I like to call it the "live data" mindset. I notice that people from the originals camp tend to view objects as an ordering concept and not just a convenient packaging mechanism for code. Indeed, we originals tend to view objects as being "alive"; when you view objects in this fashion and recognize that they're not always transitory, it allows for truly dynamic and flexible models. I believe the Dolphin Smalltalk developers had a talk a year or so ago where they called Smalltalk a "sea of live objects". That's exactly what object-oriented software is all about. It's an ordering paradigm unto itself. To make this possible, however, objects must have the capability of long lifespans. The idea of an object database came about to support this but they weren't widely accepted. It is my opinion they weren't accepted, in large part, because the corporate community had already gone through a rough transition to relational database technology. The mood and the time simply weren't right.

Now, there is another camp. I'll call this camp the "settlers". These folks came to object-oriented programming later in the game, most likely through Java or maybe C++. In either case, these languages tend to treat object orientation as a cute and convenient packaging concept. These programmers tend to view object-oriented coding in the same terms as their previous procedural world. They don't see objects as being "alive" but rather as transitory things that do some work and go away via the garbage collector. I'd say they see objects as fancy C functions without the hassle of having to allocate and free memory. These programmers have a hard time using ResponsibilityDrivenDesign and tend to see objects as large function buckets. With this worldview, using external relational technology makes sense: it's no different than it was when they were using it in C, COBOL, or BASIC. This group of programmers started using object-oriented languages not necessarily because they wanted to or because they saw "the magic", but rather because it became the "hot" thing to do.

The bottom line is that, to have what the "originals" want, one needs a fast, efficient way of persisting object instances. Doing this with relational databases isn't easy and, quite frankly, is a kludge. Objects are their own ordering concept, just as tables and tuples are ordering concepts for relational databases. I don't believe there's anyone in the originals camp who believes relational databases don't have a purpose. I, personally, have never advised against an RDBMS when I felt it was appropriate. However, a lot of time and effort is wasted trying to mesh the two worlds together. People from the originals camp see this and they complain. I think the folks who are talking about DbasGoneBad are either from, or think like, those in the originals camp. It is my opinion that those in this camp are the visionaries of object-oriented design and programming.

Those in the "settlers" camp either haven't seen the light, or they simply don't understand the benefits of viewing objects as live, ordering concepts. Since most of these folks came into object-oriented programming due to market conditions and from predominantly procedural ways of thinking, it doesn't matter to them to store things in a relational database. This seems natural, even though tons of effort is wasted in doing so. After all, they had to put their data in an RDBMS before, so why not do it now?

I would argue that there is a fundamental difference if one views objects as an ordering concept. When one views objects as real, live entities, it changes the way one models software. This is the magic of object-oriented software design and analysis that most of the industry is missing right now. What I, and many other "originals", have encountered is a wavelength disconnect with DBAs. We try to use databases in dynamic ways they were never really intended for. This is because they're what we're given to work with; we try to "make do", and that causes friction. We want to constantly refactor the schema and dynamically generate SQL to hide the cruft of the relational database, while the DBAs want to lock everything down and approve one "standard" query.
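"Dynamically generate SQL to hide the cruft" might look like the following sketch (the table name, the User class, and its attributes are all hypothetical). The mapping layer derives the INSERT statement from whatever attributes the object happens to have, so refactoring the schema means changing the class, not hunting down hand-written SQL:

```python
def insert_statement(table, obj):
    # Build a parameterized INSERT from the object's instance attributes.
    # Sorting the names makes the generated SQL deterministic.
    cols = sorted(vars(obj))
    placeholders = ", ".join("?" for _ in cols)
    sql = f"INSERT INTO {table} ({', '.join(cols)}) VALUES ({placeholders})"
    params = tuple(getattr(obj, c) for c in cols)
    return sql, params

class User:
    """Hypothetical domain object for the example."""
    def __init__(self, name, email):
        self.name = name
        self.email = email

sql, params = insert_statement("users", User("bob", "bob@example.com"))
```

This is exactly the kind of code a lockdown-minded DBA tends to distrust, since no single "standard" query can be reviewed in advance.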

The solution here is really very simple. Object databases must gain acceptance. They, like RDBMSes, serve a specific purpose in the grand scheme of software engineering. Without them, good object-oriented programmers are left with suboptimal conditions under which to develop their software. Ironically, in every case where I've recommended an OODBMS - and subsequently had DBAs fight it to the end - I've also recommended an RDBMS as a data warehouse. I think DBAs hear "database" in OODBMS and grow fearful that it's a replacement for products they already know and maintain. However, the reality is that OODBMS and RDBMS products are complementary. I believe that once we convince the DBAs we aren't trying to put them out of a job, OODBMS products might have a better chance, and then everyone would be happy. -- JeffPanici

{You may be interested in ProgrammingWithoutRamDiskDichotomy.}

Perhaps we are off-topic here, but my reply can be moved along with the rest when a better home is found. Your characterization of the "OO generations" tends to match EdYourdon's observation that the early OO adopters were more productive or successful in his satisfaction surveys than the later batch. Being a RelationalWeenie, my interpretation of this is that the early adopters had minds that naturally fit OO, while later generations went OOP because it was in style, not because it fit their minds. Paradigm preference is probably subjective, IMO, barring objective, external evidence otherwise.

Re: "However, the reality is that OODBMS and RDBMS products are complementary." I don't see how this can be so without violating OnceAndOnlyOnce. You suggest that RDBMSes are best used only as historical research tools (data warehouses) rather than for "live" data, which I tend to disagree with. If you have a specific example where RDBMSes tend to choke, I would like to examine it.


Challenge about how to represent a many-to-many relationship (users and user groups) extracted to ManyToManyChallenge.

I moved it there so I could answer it without making this already long page even longer; I hope that's OK. -- FalkBruegmann


This discussion might belong under another topic perhaps, like OoVsRelational or something.

Well, the unsupported claim has been made that the "real" solution to our problems is that OODBMSes "should" gain acceptance and replace the antiquated SQL DBMSes. I'm not opposed to that in principle, but let's first see what the claimed advantages of the ObjectModel approach could buy us. Oh, I almost forgot to mention the not-so-subtle reference to some kind of "settler" developers who apparently don't get the "true" OO :) Such BlahBlahBlah claims need to be substantiated with real evidence, and I'm only trying to help the "real" OO folks do that at least halfway decently.

{Note that many systems have GUI schema tools to reduce typing and make it more visual. Some might prefer text. See Also: AccessControlList. [this comment belongs to something that was moved elsewhere.]}

Again, here we are: You're saying that it must be one or the other. I'm saying it can be, and probably should be, both. Yes, I said OODBMSes should gain acceptance as a valid tool that object-oriented developers use in their trade. How is this a bad thing? Finally, I rather thought I was being blunt about the two "object-oriented" developer camps. I wasn't trying to hide the fact that I think some of us get it [the object-oriented paradigm] and others don't. This topic alone could be a discussion that spans several pages. -- JeffPanici

And I'm saying that, from my point of view, they do not deserve widespread acceptance as long as they suffer from spectacular technical limitations and persist in their chasing of pointers. I strongly believe that the minute they support relational algebra, they will be more than ready for prime time.

As for being blunt, I'm all for CriticizeBluntly as long as you can back your critique with real substance.

I don't believe there's any substance (you certainly haven't shown it) in your sweeping generalization that there are OO camps that "get it" and OO camps that just don't "get it". There's nothing magical about OO such that one can "get it" or "not get it". If you want to go into more detail, you'll find the functional camp (most notably figures such as PaulGraham and PeterNorvig) will argue that the OO die-hards just don't "get it" about the true nature of computation, and have cornered themselves into a dead end by choosing objects, which are bulky and not composable (as opposed to functions), as their primitives.

Related: PeopleWhoDontGetOo, YouJustDontGetIt

In the end it all boils down to ProgrammingIsMath, and one has to choose one formalism over the others, one notation over the others. And guess what: the truly professional software engineer should be able to master several of them. If you want to represent curved trajectories, polar coordinates often simplify everything; try to use them for a linear trajectory and you'll have created an unnecessary complication for yourself, and vice versa. In data management it is as simple as "you have to be able to use something at least as powerful as relational algebra". In certain programs a functional approach far outweighs any OO approach, and in other situations an OO approach far outweighs a functional one.
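"At least as powerful as relational algebra" is a concrete, testable bar. As a rough sketch (the relations and attribute names below are invented for the example), the core operators can be expressed over sets of tuples, here modeled as frozensets of attribute/value pairs:

```python
def select(rel, pred):
    """Restriction: keep the tuples satisfying the predicate."""
    return {t for t in rel if pred(dict(t))}

def project(rel, attrs):
    """Projection: keep only the named attributes of each tuple."""
    return {frozenset((k, v) for k, v in t if k in attrs) for t in rel}

def join(r, s):
    """Natural join: combine tuples that agree on all shared attributes."""
    out = set()
    for t in r:
        for u in s:
            td, ud = dict(t), dict(u)
            if all(td[k] == ud[k] for k in td.keys() & ud.keys()):
                out.add(frozenset({**td, **ud}.items()))
    return out

employees = {frozenset({"name": "ann", "dept": 1}.items()),
             frozenset({"name": "bob", "dept": 2}.items())}
depts = {frozenset({"dept": 1, "dname": "ops"}.items())}

joined = join(employees, depts)  # only ann's dept matches
```

The point of the exercise: these operators compose into arbitrary queries, which is exactly the expressive power the "chasing of pointers" in a navigational object store gives up.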

As for why you need relational algebra for any serious approach to data management problems, I can recommend such fine books as AnIntroductionToDatabaseSystems and TheThirdManifesto. To see the same area approached from an OODBMS perspective, reaching the same conclusion that OODBMSes should extend the relational model rather than go against it, you can read FundamentalsOfObjectOrientedDatabases.


This page is getting off-topic, IMO. It is about DbasGoneBad, and it is degenerating into the OO-vs-relational arguments. When someone mentions DbasGoneBad, I think we need to look at roles. In many application development scenarios you end up with two sets of databases, production and development, with mutually exclusive functions.

A development database exists for development and some testing work. Programmers often have a high level of control to make changes, etc. DBAs provide tools and assistance, often also working as data analysts and modelers in smaller environments. Programmers are given wide latitude, though provisions are made to ensure that developers do not trip over each other. In my situation I often end up playing "traffic cop", which is sort of an SCM problem. In addition, I often spend time educating programmers on what a DB is and what our particular brand of DB can and cannot do.

A production DB is very different. Often, a company will have millions or even billions of USD invested in the data. Before a production move, the programmer must prove that the code they are going to unleash (and any schema changes) will work correctly and maintain data integrity. Often, this is a complex task. But the stakes of a mistake are high enough that a good DBA should "go nazi" on anyone who may threaten the production DB. That includes having them provide a detailed work plan with fallback positions. This may slow development down a bit, but it is necessary. It does not provide an excuse for doing nothing, however.

To me, DbasGoneBad do not properly support the development process and/or make things so complicated to move that nothing gets done. Comments?

I would concur. In general, "DBAs Gone Bad" is merely a specific instance of a common problem: independent groups. We have developers, DBAs, QA, QC, CM, etc., each focused on their individual tasks and ignoring the whole. Each group tends to blame the others for interfering with its specific task. DBAs tend to have their own isolated view of the world and do not want developers to mess it up. Developers are just as guilty, as are the other departments listed above. It is not the DBAs (or the other groups) at fault here; it is really a management error to set up isolated groups, and no one should be surprised that the result is conflict. Instead, the groups need to be subservient to the overall task at hand, try to ensure its timely and successful completion, and not spend time defending their private turfs.

Also, I must point out (I cannot find the reference for it, but IIRC) that DBAs are the only IT workers to have been successfully sued for errors and omissions. In the case I heard of, the DBA had directly changed data in the database and caused financial damages. This is another reason DBAs are conservative. AFAIK, no "software engineer" has ever been sued for their work. And in fact, one marker of a profession is that you, as a professional, are personally responsible for your mistakes.

Discussion continued in RealProfessionalsGetSued


Perhaps the field should be split between database designers and database administrators (the titles still need work). Those who focus on performance and hardware tend to let those issues bias schema designs, in my observation. It seems hard to keep the user/developer perspective while also having one's head in the technical guts. Making a design that is app-developer-friendly may make a DB performance tuner's job harder, and thus there is a conflict at play.


See DatabaseBestPractices, SqlFlaws, ObjectRelationalPsychologicalMismatch, YagniAndDatabases


EditText of this page (last edited November 21, 2014) or FindPage with title or text search