
Object-Oriented Design/Programming – How We Objectify in Our Culture

What a title. Well, is this a technical article, or isn’t it?

It’s a view of culture, psychology, and their influence on technology. If this matter has been addressed before, it wasn’t in any article that had come to my attention as of this writing.

Object-Oriented design in software development has been a convenience, a problem solver, a boon that makes a wide range of applications possible and practical. But where does the idea come from, really? Don’t answer “Bjarne Stroustrup”; he was just a point of crystallization. If it hadn’t been he, someone else would have conceived of it eventually; I believe it’s in our nature to lean in that direction.

Look at our modern (at this writing) culture and its antecedents. We have a tendency to objectify everything. It’s cause for concern in some areas, as when women complain of being objectified by men. But people regardless of gender have this tendency; it’s a convenience that somehow makes ideas more comprehensible to our brains. Our physical foundation is animal. If we accept biological evolution as fact, we can assume our ability to think in abstractions is a relatively recent development. The forebrain that contains much of our intellect is a wonderful tool, but the animal part of our brain is more thoroughly integrated into our physiology, and it is adapted to reckoning in a universe where physical objects are the primary frame of reference. Our attempts to make sense of our world apparently default to that framework.

Our ancestors looked at forces of nature, and attributes of the mind, heart, and character, and personified them as pantheons of gods or divinities. Personification is a kind of objectification, since a live solid person is more tangible than an idea. Our ancient myths are full of such manifestations, e.g., thunder and lightning (Zeus, Jove, Thor), death (Hades, Pluto, Hela, the Grim Reaper), cleverness and mischief (Loki), speed (Hermes, Mercury, Caber), the hunt (Artemis, Diana, Sif), love (Aphrodite, Venus), intellect (Athena), marriage (Frigga), wealth and fertility (the Lord of Misrule, the Ghost of Christmas Present, Santa Claus, Ganesh), happiness (Pan, the Blue Bird of Happiness), and many more.

Aside: what is reality? The word “real” comes from a Latin root, “res,” meaning “thing” or “fact.” It’s a word that has been twisted out of its original meaning, much like “oxidation” and “fundamentalism.” Oxidation is a chemical process of energy discharge first observed involving the element oxygen; when similar processes were seen, such as hydrogen burning in chlorine, the term “oxidation” persisted even though oxygen was not involved in the reaction. Fundamentalism is a strict, rigid adherence to principles, named for a sect of Christians; when similar strictness was seen among Muslims and those of other faiths, the term “fundamentalist” stuck, even though Christianity was not involved. So too with “real”: that which is real is that which exists independently of the perceptions of the observer, and is usually a tangible physical object, while an idea, an image in one’s mind, or a dream is considered unreal, an illusion. But some metaphysically inclined scientists, philosophers, and spiritualists believe that an idea or a mental image is more real than a gross physical manifestation: an invention begins with an idea, the image exists at a subtler, purer level of reality, and so it is more real than the physical object. What? What just happened here? If “reality” means “thingness,” and a thing is a gross physical object, how can something as ungross, fine, or rarefied as an image or an idea be “thingier” than a gross physical object?

Perhaps, to be precise in our use of language, we ought to say that the idea is more “true” than the physical expression, and adopt the increasingly popular adjective “classical” to refer to classical reality (where a thing is a solid object), classical oxidation (energy discharge involving the element oxygen), classical fundamentalism (strict adherence to a Christian code), Classic Coke (the original recipe), and classical liberalism (free markets and “laissez-faire” politics).

Back on track: look at how our semantics is influenced by our tendency to objectify. Verbs are words of action; nouns are words that name subjects or objects, which we might typically consider tangible things. Yet “action” is a noun! Our inclination to objectify is so entrenched in us that the step from verb to noun, from action to pseudo-“thing,” is not a distant leap. My very attempt (a noun derived from a verb) to describe all this (a noun, an abstraction) is caught up in a tendency (a noun referring to an action, itself a noun) to assign labels (noun) to things (noun)! We’ve gone so far as to call an action (noun) a thing (tangible object)!

Right about now one of my friends would probably say, “Wilfred, stop. Drop it. Forget it. Relax. Have a drink.”

I could go on (“No!” says the reader), but I’ve made my point. We’re so deeply involved in making sense (noun) of our world by creating tangibles that something like Object-Oriented software design, with or without Stroustrup, was inevitable, even if it weren’t obvious to old-guard programmers such as myself.

Software architecture is considered cutting-edge research in Computer Science: a discipline that abstracts instructions to the computer, rarefying them so that at a high level they are free of the constraints of the machine. There is a strange irony in the fact that software objects, as tools of this architecture, are a kind of concession to the animal part of us that needs objects in order to cope with reality. (A parallel irony is that rockets, considered among the greatest technological achievements of modern times, so resemble primitive projectiles such as spears and arrows!)

Addendum: I presented a copy of the Objectification article (as given above) to a mentor, who wishes to remain anonymous. His response:

“Interesting perspective…

“OO as a concept certainly preceded Mr. Stroustrup. I doubt very much that the concept evolved because of any inherent tendency on the part of our species to ‘objectify’ stuff – the term ‘object’ in ‘object oriented’ was used as a metaphorical ‘hook’ to get software developers to think differently about how code should be organized. Prior to OO languages, we had methods (functions, subprograms, macros – take your pick of terms) that ‘did stuff’, BUT, those methods did NOT (and still don’t) have a persistent state – that is, they had no memory of what they did in prior executions, and that posed a very real problem. The data that the methods acted on had to be passed into the method, and results passed out – state (‘memory’) was achieved by making sure that any value that needed to ‘hang around’ for any future references was available as one of the values that was inside a method that was active or that remained on the call stack (and would thus become active at some point in the future as the call stack was unrolled). Absent the presence of the value on the call stack, the only way to ‘remember’ a value was to write it to disk (which was, and is, way too slow) or to set aside an area of memory for persistent storage (e.g., COMMON blocks in Fortran) (which do solve the persistence/’state’ problem, but leave the values far too accessible and therefore far too susceptible to accidental misuse). SO – pre-OO, we had a language environment in which we had to spend a lot of time worrying about value persistence and value visibility, and the language mechanisms that we had to use were relatively simple and absolutely unsafe.

“OO was an attempt to get developers to re-think how methods and data are organized relative to each other – that really is all that OO does differently when compared to pre-OO languages. HOWEVER, that single difference allows all sorts of benefits to accrue. Creating ‘objects’ solves the ‘persistence’/state problem quite nicely and elegantly; it also allows values to be guarded/protected within an object from direct outside influences; it also makes it MUCH easier to find the code that actually manipulates a specific value (it’s a whole lot easier to find a single mutator method than it was to find all of the references to a variable in a language like Fortran or COBOL and then figure out if that reference represented logic that needed to change) in order to update the code to reflect changes in business rules; the list goes on and on.

“The ‘object’ in ‘object oriented’ was and is simply a cognitive concept that any thinking person can understand: an object in the real world has state (values that describe it and that change according to certain specific rules) and behavior (activities that it is capable of performing). Virtually anyone can grasp the basic idea that in an OO programming universe, we define our code in units that describe the values and actions of a ‘thing’ (in Java, ‘unit’ = class, ‘values’ = attributes, and ‘actions’ = methods).”
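To make my mentor’s description of the pre-OO problem concrete, here is a minimal sketch in Java (the language his own example names). The class and method names are mine, invented for illustration, and public static fields stand in for a Fortran COMMON block: the value does persist, but it is visible and mutable from anywhere, so nothing prevents the “accidental misuse” he mentions.

// A rough Java analogue of a Fortran COMMON block: state that is
// public, static, and globally visible. Nothing guards it.
class SharedState {
    public static double balance = 0.0; // anyone, anywhere, can read or write this
}

public class CommonBlockDemo {
    static void deposit(double amount) {
        SharedState.balance += amount;  // the intended mutation
    }

    static void printReport() {
        // A typo or misunderstanding here silently corrupts the shared
        // value, and the compiler has no way to object:
        SharedState.balance = 0.0;      // accidental misuse
        System.out.println("Balance: " + SharedState.balance);
    }

    public static void main(String[] args) {
        deposit(100.0);
        deposit(25.0);
        printReport();                  // prints 0.0 -- the 125.0 is gone
    }
}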
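And here is the OO reorganization he describes, again as a sketch with invented names: the value lives inside an object, persists across calls, is guarded from outside influence, and can change only through a single mutator method.

// The same idea reorganized as an object: state ("values") and
// behavior ("actions") defined together in one unit (a class).
public class Account {
    private double balance = 0.0;        // state: guarded and persistent

    public void deposit(double amount) { // behavior: the single mutator
        if (amount < 0) {
            throw new IllegalArgumentException("amount must be non-negative");
        }
        balance += amount;
    }

    public double getBalance() {         // read access without exposure
        return balance;
    }

    public static void main(String[] args) {
        Account acct = new Account();
        acct.deposit(100.0);
        acct.deposit(25.0);
        // The object "remembers" prior calls; no call-stack tricks needed.
        System.out.println("Balance: " + acct.getBalance()); // prints 125.0
    }
}

When a business rule changes, the update goes in that one mutator; nobody has to hunt down every reference to a shared variable. That is exactly the maintenance benefit my mentor points to.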

My mentor addresses the issue from the practical engineering standpoint, while I address it from the psychological standpoint. It may be that the computer scientists were simply “building a better mousetrap,” and that it just happened to take the form of an object quite by accident. But with my degree in Psychology, I’m still inclined to believe that people lean toward objects naturally.


