This is the conceiving of objects as wee people who have particular knowledge and can do certain things, following from the idea that objects are "things that know stuff and can do stuff".
The interactions between objects can be thought of as conversations between these people, engaged in for some particular purpose. Applications are then a collaborative community of these folks, and the set of behaviours they can express in response to the events they are exposed to.
One technique I find attractive in application design is to ask the people who will use the application questions such as "What would you like to ask the application to do, and what would you expect/hope its answer to be?" If I can get people started with this, it's usually fairly easy for them to imagine reasonably well-formed conversations which convey the essence of what they're trying to accomplish.
hmmmm - Use Cases as screenplays? --ChrisGerrard
This is not so outlandish as it sounds. I find a deep similarity between software development and film development. For example, the concept of character seems very close to types, and plot to actions (use cases) -- JeffMantei
The big problem with this is that it's not a binary distinction. I can't claim my object is better than yours because "it's more anthropomorphic". This sophistry is okay for a pep talk, but it stands dangerously close to the idea that "ObjectOrientedProgramming models how humans think". Stop the insanity! --PhlIp
I think the issue is one of mental ability. Most of what people say is gossip. Other humans are our biggest environmental threat. We are our own predators and prey. Humans are great at keeping track of who knows what about whom. Our survival depends on it. It's in our genes. For some time I've suspected that OOP succeeded not for the reasons its creators imagined, but because it's easy for our minds to build and manipulate models of interactive, secretive agents. -- EricHodges
There are levels here. A play usually doesn't describe the set in detail. A lot of code is really part of the set and not the play.
My team and I find this a natural way to envision objects. Usually if two of us are working on objects that communicate with one another we will end up in a dialogue like this, "Well, I pass this int to you and you return the square root," "oh, but do I have to store that value somewhere?" or something similar. It's much easier to conceptualize objects anthropomorphically, I guess because it's easier for humans to understand interactions between themselves than interactions between objects. --BrianRobinson
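The int-and-square-root exchange above can be sketched as two collaborating objects. This is a minimal illustration; the class and method names are invented, not taken from the original dialogue.

```python
import math

class RootFinder:
    """The object that 'knows' how to answer square-root questions."""
    def square_root(self, n: int) -> float:
        return math.sqrt(n)

class Caller:
    """The object that asks, and must decide whether to store the answer."""
    def __init__(self, finder: RootFinder):
        self.finder = finder
        self.last_answer = None  # "oh, but do I have to store that value somewhere?"

    def ask(self, n: int) -> float:
        # "Well, I pass this int to you and you return the square root."
        self.last_answer = self.finder.square_root(n)
        return self.last_answer
```

The conversation between the two developers maps directly onto the message send: one object's question becomes the other object's method.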
I've played this game too. I think it works well not because of the anthropomorphism, but IMHO simply because when you "put yourself in the shoes" of an object, you have a clearer grasp of how you'd want to limit its responsibilities, or balance them with those of collaborating objects. In other words, you don't imagine objects as people; rather, you project onto objects patterns of division of labor which are integral to our daily practice. The familiarity of those patterns makes them useful tools for assigning responsibilities to objects when we invoke the appropriate frame of mind. -- LaurentBossavit
Some people have trouble understanding the difference between objects in the computer and the real-world objects they represent.
Like... "No, no, no! You can't ask the 'customer' object [in the computer] for their credit limit because you can't trust the customer [in the real world] to give you an honest answer."
Honest, I had this happen... with a developer who was having a really hard time learning OO.
It was clearly a good question. The problem was in who was being asked. The lender determines the credit limit, the customer requests credit to cover purchases. Definitions are required, even in OO.
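That division of responsibility can be sketched like so; the class and method names here are hypothetical, chosen only to show the lender, not the customer, answering the credit-limit question.

```python
class Customer:
    """Requests credit to cover purchases; does NOT report its own limit."""
    def __init__(self, name: str):
        self.name = name

class Lender:
    """The lender determines the credit limit, so it owns the answer."""
    def __init__(self):
        self._limits = {}  # customer name -> approved limit

    def set_credit_limit(self, customer: Customer, limit: int) -> None:
        self._limits[customer.name] = limit

    def credit_limit_for(self, customer: Customer) -> int:
        return self._limits.get(customer.name, 0)
```

Asking the `Lender` object resolves the confusion: the in-computer object can be trusted precisely because it models the party who actually holds the knowledge.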
Don't anthropomorphize computers. They hate that.
I'm trying to make sense out of PhlIp's response. No claim was made regarding the relative value of individual objects, so why so strident? What's the burr under your saddle? (Or maybe animal metaphors are no good also.) --ChrisGerrard
This is fine until you embarrass yourself when you speak your mental model aloud. In the distant past, I started saying "wants to be" to distinguish assignment from testing for equality. Reading code for a class, I learned that not everyone wrote code like "a wants to be the square root of x".
It was so catchy that it sticks with me to this day, mumble years later. I'm not sure if that's good or bad.
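In code, the "wants to be" reading of assignment, as opposed to testing for equality, looks like this minimal sketch (Python here, purely for illustration):

```python
import math

x = 16.0
a = math.sqrt(x)        # assignment: "a wants to be the square root of x"
is_root = (a * a == x)  # equality test: "is a squared equal to x?"
```

The first line grants the wish; the second merely asks a yes/no question about values that already exist.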