View Full Version : what does object-oriented mean



JeenLeen
2018-01-09, 12:25 PM
In programming terminology, what does it mean for a language to be object-oriented? And what is one that isn't object-oriented?
I hear it used a lot, and I know some languages definitely are that, but I don't have a clear idea of what it means and I have no idea of what a non-object-oriented language would be.

Also, would languages like SAS or R be considered object-oriented? They don't really work on individual objects (like a simple declared variable in C++), but rather on elements of a dataset (something like a column in Excel).
Would SQL be considered object-oriented, or is it somewhat outside those definitions as a database language?

Grey_Wolf_c
2018-01-09, 01:09 PM
In programming terminology, what does it mean for a language to be object-oriented? And what is one that isn't object-oriented?
I hear it used a lot, and I know some languages definitely are that, but I don't have a clear idea of what it means and I have no idea of what a non-object-oriented language would be.

Also, would languages like SAS or R be considered object-oriented? They don't really work on individual objects (like a simple declared variable in C++), but rather on elements of a dataset (something like a column in Excel).
Would SQL be considered object-oriented, or is it somewhat outside those definitions as a database language?

Object-Oriented means that the language enforces certain rules of how you create a program. An "object" means something slightly different in each OO language, but broadly, it is a "thing" that you declare and that has properties. The rest of the program doesn't get to "know" how the object works, just that it has methods that can be invoked from outside. It is a philosophy of compartmentalization: it is supposed to make development easier by dividing the program into entities that operate independently. Think of it like a cross between a variable and a subroutine: you interact with it as if it was a subroutine, but it has memory like a variable.

Objects have certain advantages that can be very useful, such as inheritance: any object that inherits the properties of another will have the same methods, and thus it is interchangeable with the parent one in any program. So, in a game, you may have an object such as "cover" which has certain properties (like damage reduction) that can be returned on demand. Then, there are a bunch of other objects ("table" and "wall" and "door") all of which inherit from "cover" but go on to have their own specific methods while still all counting as cover. So, the game doesn't care if you just moved near a table or a wall; it just cares that you moved next to a cover object, and if there is an attack, it can ask the object how much damage reduction said cover grants without needing individual code to deal with every different piece of cover in the game. This means that it is also easier to add objects that provide cover, because all the programs that deal with them won't care that they are new, as long as they still work like the others.

They are more than just static subroutines, though. Maybe the wall loses damage reduction every time it gets shot, so the program will tell the cover object that it got shot, and the object itself will remember, and next time it is asked for damage reduction, the number may have changed (and how it changes will depend on what actual object it is).
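
To make that less abstract, here is a rough C++ sketch of the cover idea (all the class and method names are invented for the example, not taken from any real game):

#include <iostream>

// Base class: the rest of the game only knows about "Cover".
class Cover {
public:
    virtual ~Cover() = default;
    // Ask the object how much damage reduction it currently grants.
    virtual int damageReduction() const = 0;
    // Tell the object it was shot; by default nothing changes.
    virtual void onShot() {}
};

// A table grants a small, fixed amount of cover.
class Table : public Cover {
public:
    int damageReduction() const override { return 2; }
};

// A wall grants more, but degrades every time it is shot.
class Wall : public Cover {
    int reduction = 10;
public:
    int damageReduction() const override { return reduction; }
    void onShot() override { if (reduction > 0) --reduction; }
};

// Combat code that works for any present or future kind of cover.
void resolveAttack(Cover& cover, int damage) {
    int taken = damage - cover.damageReduction();
    std::cout << "target takes " << (taken > 0 ? taken : 0) << " damage\n";
    cover.onShot();
}

int main() {
    Wall wall;
    Table table;
    resolveAttack(wall, 12);   // wall soaks 10 damage, then degrades
    resolveAttack(wall, 12);   // now it only soaks 9
    resolveAttack(table, 12);  // same call, different object
}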



SQL is not object-oriented, because it is not a programming language in the same sense the term usually implies. PL/SQL, which adds some programming-language concepts (such as variable declaration and loops), is not really OO either, although databases themselves can be a bit OO in spirit: you don't need to know whether the thing you are querying is a view, a table or a function, as long as it returns table-like results, for example.

This is a very broad topic, though, and I've barely scratched the surface.

Grey Wolf

JeenLeen
2018-01-09, 01:25 PM
In a non-object-oriented language, do concepts like 'numeric' or 'character' not exist as, well, objects? Do you instead have to define how to interact with each individual object, rather than stating that it should be treated the way the system treats numeric/character/whatever-else-the-language-has?

Grey_Wolf_c
2018-01-09, 01:40 PM
In a non-object-oriented language, do concepts like 'numeric' or 'character' not exist as, well, objects? Do you instead have to define how to interact with each individual object, rather than stating that it should be treated the way the system treats numeric/character/whatever-else-the-language-has?

Yes and no. Yes, there are variables in procedural languages which store values. But you can't usually create your own types, and they definitely can't do inheritance. The strength of OO is that you can create your own objects, which through inheritance can be slotted into old programs that aren't expecting them.

Imagine you have a program that adds/multiplies/divides numbers. It was created to accept integers. At some point, you need the program to use complex numbers instead. In procedural code, you are stuck, because integers cannot be switched for complex numbers without re-coding the bits that do the operations. But in OO, "number" is just an object with "add/multiply/divide by" as functions that take another "number". If you create a "complex" that inherits from "number", and then feed the program complex numbers instead of integers, the program will still work, because it can still call the "add" function on every object. You just need to code the complex object to perform those operations in its own way, and once you know it works, the rest of the change is easy.
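
A rough C++ sketch of that idea (the Number, Integer and Complex names are invented for the example, and add() is simplified to assume both operands are the same kind of number; real code would more likely use operator overloading or templates):

#include <iostream>
#include <vector>

// The calculator code only sees this interface.
class Number {
public:
    virtual ~Number() = default;
    virtual void add(const Number& other) = 0;  // this += other
    virtual void print() const = 0;
};

class Integer : public Number {
    long value;
public:
    explicit Integer(long v) : value(v) {}
    void add(const Number& other) override {
        // Simplification: assumes the other operand is also an Integer.
        value += static_cast<const Integer&>(other).value;
    }
    void print() const override { std::cout << value << '\n'; }
};

// Added later, without touching the summing code below.
class Complex : public Number {
    double re, im;
public:
    Complex(double r, double i) : re(r), im(i) {}
    void add(const Number& other) override {
        const auto& o = static_cast<const Complex&>(other);
        re += o.re;
        im += o.im;
    }
    void print() const override { std::cout << re << " + " << im << "i\n"; }
};

// The "old program": it adds up whatever numbers it is given.
void sumAll(Number& total, const std::vector<const Number*>& rest) {
    for (const Number* n : rest) total.add(*n);
    total.print();
}

int main() {
    Integer a(3), b(4);
    sumAll(a, {&b});             // works with integers...

    Complex c(1.0, 2.0), d(3.0, -1.0);
    sumAll(c, {&d});             // ...and with complex numbers, unchanged
}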

Don't think of it as "what can it do that procedural can't". At its base, OO is still procedural; there is nothing new in what it can do. It is a way to facilitate development by enforcing certain rules that should make coding easier. It is hard to show in small examples, because the strengths of OO become more and more obvious the bigger the program gets - that's why I went with a videogame example.

Grey Wolf

factotum
2018-01-09, 04:52 PM
SQL is not object-oriented, because it is not a programming language in the same sense the term usually implies. PL/SQL, which adds some programming-language concepts (such as variable declaration and loops)

I thought those were part of all SQL dialects? Certainly T-SQL (the dialect used in Microsoft SQL Server) has variable declarations and loops.

Errata
2018-01-09, 05:29 PM
In a non-object-oriented language, do concepts like 'numeric' or 'character' not exist as, well, objects? Do you instead have to define how to interact with each individual object, rather than stating that it should be treated the way the system treats numeric/character/whatever-else-the-language-has?

The difference in a nutshell is that in a non object oriented language, code and data are two very distinct things. Data exists as variables, and code exists as functions. Functions can pass around data as parameters to other functions, directly by value or indirectly through pointers. In object oriented programming, the key difference is that code and data are unified. The data structure also has its own functions for how you access that data. Different variations of data structures may have the same function interface but implement them differently according to how each class of object is supposed to work.

Object oriented languages are generally extra syntax built on top of non-object oriented languages. It's just a paradigm that may help you structure and organize your code in a way that increases reusability and reduces redundancy. But you can do the same things in procedural languages, possibly with more code (though in practice sometimes people go overboard with object oriented languages and aren't saving themselves much of anything). At a fundamental level, many object oriented compilers are internally just translating objects into regular data and regular functions, but the object oriented overlay may be an easier way for a human to keep track of it all and navigate that code.
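
As a rough illustration of that last point, the two halves of this C++ sketch do the same thing; the class version is more or less what a compiler lowers into the plain struct-plus-functions version (a simplification, with invented names):

#include <iostream>

// Plain-data style: data and code are separate things.
struct CounterData {
    int value;
};
void counter_increment(CounterData* c) { c->value += 1; }
int  counter_get(const CounterData* c) { return c->value; }

// Object-oriented style: the same data and code, packaged together.
class Counter {
    int value = 0;                     // the data lives inside the object
public:
    void increment() { value += 1; }   // roughly counter_increment(this)
    int  get() const { return value; } // roughly counter_get(this)
};

int main() {
    CounterData d{0};
    counter_increment(&d);
    std::cout << counter_get(&d) << '\n';  // prints 1

    Counter c;
    c.increment();
    std::cout << c.get() << '\n';          // prints 1
}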

pendell
2018-01-09, 05:29 PM
Object orientation means three specific concepts:

1) Encapsulation. We tie procedures and data together into a single object.

For instance

car
- color (data attribute)
- drive() - method that operates on it.

In the old days before OO existed, these two things would be separated -- the data would be stored in a structure or an array, while the procedures would be stored separately and be called from something like the main program.


2) Inheritance.

This means that objects can be categorized into classes, and you can have subclasses which inherit from the parent class.

For instance
car might be the top-level class.
-color
- drive()
And we could have the classes

SportsCar
-decals
- race()

and
PickupTruck
- goOffRoad()

which inherit from car.

So SportsCar would have two data attributes (color and decals) and two methods (drive() and race()) -- two inherited from its parent, and two which it implements itself.


3) Polymorphism.

This is a fancy word meaning you can have the same method be implemented differently across subclasses. For instance, let's imagine a class called vehicle

vehicle
-operate()

with subclasses

submarine
-operate()

car
-operate()

aircraft
-operate()

The implementation details are very different for each of these vehicles -- a submarine requires dive control, an aircraft has a rudder and an engine, a car has just a throttle -- but as a developer you don't have to worry about any of that. You simply have to call operate() on any vehicle, confident that it will do what it's supposed to do.

There may be only one method specification, but there are many forms that method can take. Hence the word 'polymorphism'.
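
Putting 2) and 3) together, a small C++ sketch of that vehicle hierarchy might look like this (the method bodies are placeholders; only the structure matters):

#include <iostream>
#include <memory>
#include <vector>

// Parent class: one method specification...
class Vehicle {
public:
    virtual ~Vehicle() = default;
    virtual void operate() = 0;
};

// ...many forms that method can take.
class Submarine : public Vehicle {
public:
    void operate() override { std::cout << "dive planes set, ballast adjusted\n"; }
};

class Car : public Vehicle {
public:
    void operate() override { std::cout << "throttle applied\n"; }
};

class Aircraft : public Vehicle {
public:
    void operate() override { std::cout << "rudder trimmed, engine throttled up\n"; }
};

int main() {
    std::vector<std::unique_ptr<Vehicle>> fleet;
    fleet.push_back(std::make_unique<Submarine>());
    fleet.push_back(std::make_unique<Car>());
    fleet.push_back(std::make_unique<Aircraft>());

    // The caller doesn't care which kind of vehicle each one is.
    for (auto& v : fleet) v->operate();
}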

====
Very few computer languages are totally object-oriented. Smalltalk is one, and everything in the language is an object. But the larger languages, such as Java and C#, have a set of primitives which are not objects.

primitives could be things like
float
int
char
boolean

These primitive types are either used directly, or they are used to create data attributes of objects which we then construct.

Again: An OO language has three things: Encapsulation, inheritance, and polymorphism.


Respectfully,

Brian P.

halfeye
2018-01-09, 05:53 PM
There are many non-object-oriented languages.

C, BASIC, PL/I, FORTRAN, COBOL, ALGOL and Forth, to name but a few. The main thing they have in common is being old. Some of them may have had OOP bolted on in later versions.

Excession
2018-01-09, 07:02 PM
Would SQL be considered object-oriented, or is it somewhat outside those definitions as a database language?

SQL, at least in its purest form, is a "declarative" programming language rather than procedural or functional. Rather than programming a sequence of operations for the computer to take, you declare what result you want (e.g. the names and addresses of all the customers that bought a laptop last month) and the database engine works out the best way to get that result. The reality is a bit more complex than that, of course.

It is possible for SQL databases to be object oriented as well. PostgreSQL notably allows you to define new types for columns, and methods on those types to control how the data is handled. This can be used, for example, when storing domain specific things like mapping or astronomical data.

BannedInSchool
2018-01-09, 11:13 PM
Seems to me the key point of OO can be lost in the mechanisms that achieve it. You're putting off decisions about exactly what's going to happen at runtime until runtime. The object interface is a wall separating abstract code from runtime execution. You have these things that you don't know exactly what they are or exactly what they'll do, but you tell them to do things which will be determined later. But maybe that's my own mental quirk that makes thinking of implementation as something unknown happening in the future useful. :smallbiggrin:

Errata
2018-01-10, 01:44 AM
The object interface is a wall separating abstract code from runtime execution.

How so? That seems to be any kind of API, not specifically an object oriented one. You can absolutely define an abstract interface with an unspecified implementation in terms of a procedural language, and in practice it is extremely common to do so. You can also use object oriented programming in a way where you understand exactly how it is implemented but still have reasons for structuring it that way that have nothing to do with that.

Object oriented programming is fairly orthogonal to that. It might be one tool that in some situations may prove helpful in achieving that, but that's not the only thing that it's for, and it's not essential for that.

gomipile
2018-01-10, 07:20 AM
Seems to me the key point of OO can be lost in the mechanisms that achieve it. You're putting off decisions about exactly what's going to happen at runtime until runtime. The object interface is a wall separating abstract code from runtime execution. You have these things that you don't know exactly what they are or exactly what they'll do, but you tell them to do things which will be determined later. But maybe that's my own mental quirk that makes thinking of implementation as something unknown happening in the future useful. :smallbiggrin:

Key phrase: "can be." Procedural languages have their own possible issues that allow some users to create problems for themselves.

Also, you're using the word "runtime" for things that in compiled OO languages happen once at compile time, and are thus firmly in the past at run time.

In mathematics and scientific computing, your characterization "You have these things that you don't know exactly what they are or exactly what they'll do, but you tell them to do things which will be determined later. " is not true for many use cases of OO language features. For example, there are a lot of mathematical objects that have a "vector space structure." This means you can take any two of them (of the same type and dimension) and add them together to get a new one of the same type and dimension. You can also do "scalar multiplication" on them with real or complex scalars. This makes OO very convenient for scientific programming, since these operations with the same pattern of use and the same name crop up for many different variable types.

So, when a mathematician uses the same "addition" method on a pair of four-dimensional vectors, then later on a pair of 4x7 matrices, she knows exactly what those operations are and exactly what they will do; she just doesn't have to use a slightly different function name for each of those operations.
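
In C++, for example, this is typically done with operator overloading. A stripped-down sketch (fixed sizes, no error checking, type names invented for the example) might look like:

#include <array>
#include <iostream>

// A 4-dimensional vector with its own "+".
struct Vec4 {
    std::array<double, 4> v{};
};
Vec4 operator+(const Vec4& a, const Vec4& b) {
    Vec4 r;
    for (int i = 0; i < 4; ++i) r.v[i] = a.v[i] + b.v[i];
    return r;
}

// A 4x7 matrix with its own "+".
struct Mat4x7 {
    std::array<std::array<double, 7>, 4> m{};
};
Mat4x7 operator+(const Mat4x7& a, const Mat4x7& b) {
    Mat4x7 r;
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 7; ++j) r.m[i][j] = a.m[i][j] + b.m[i][j];
    return r;
}

int main() {
    Vec4 x{{1, 2, 3, 4}}, y{{4, 3, 2, 1}};
    Vec4 z = x + y;                     // same "+" notation...

    Mat4x7 a{}, b{};
    Mat4x7 c = a + b;                   // ...for a very different type
    std::cout << z.v[0] << ' ' << c.m[0][0] << '\n';
}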

Khedrac
2018-01-10, 03:11 PM
Object orientation means three specific concepts:
1) Encapsulation. We tie procedures and data together into a single object.
...
2) Inheritance.
...
3) Polymorphism.
...

Respectfully,

Brian P.
Brian, may I say that that was a better summary of Object Orientation than any explanation I ever encountered at university. Thank you.

On the subject of Object-Oriented databases, I have long thought that there is a lot of confusion.
First, consider relational databases - relational database theory is about finding the most efficient (in terms of space used) storage structure for the data. The nature and purpose of the data affect how far down the path of 'normalization' the database needs to go (so it is not always more efficient to go further), but the original principle was about storage efficiency, because disk space used to be expensive.

OO databases are all about supporting the required objects (see Brian P's definition above) and ignore the question of efficiency (because disk space is cheap).

Now a relational database can be defined to hold OO structures, but that isn't a true OO database, even if used to support a program written using OO principles. Personally I never understood the advantages of true OO databases (seeing as I could see how to mimic them in a relational database), but that was probably because no-one was as good at explaining them as Brian P.
Now SQL is (or was) very specifically designed for relational databases, so if you are querying a database using SQL it probably isn't an OO database - they are not going to work like that, and querying them using SQL is likely to result in spurious results (e.g. when you take tables apart and put them back together and get more records than you started with - it's called 'additive decomposition' and is one of the things normal forms are designed to prevent).

wumpus
2018-01-11, 12:21 PM
A quick take from a sometime software writing old timer who mostly deals with hardware (so self-taught OO stuff).

In the beginning programming was invented. It quickly devolved into spaghetti code, the less said about that the better.

Then modular coding was invented. This was huge: it allowed programs far too large and complex for spaghetti style to be written quickly and easily. Modular coding made the program flow obvious, and equally showed how data flowed through the program.

OO came along and destroyed all that, and it was a good thing. How? Why? you may ask. It turned out that coding flow mattered a lot on machines with memory measured in kilobytes, a lot less on machines measured in megabytes, and not at all (at least compared to anything else) on machines [or at least software using data] measured in gigabytes. Look at the features:

Encapsulation: easily my favorite bit of OO. While traditionally this is said to be important in hiding data (which could really be done with C structs), where I find it excels is in letting "the data" (i.e. an object) manage itself. Instead of lots of careful tending of your arrays, you simply access the data via an object, and the object both provides the data and manages arrays/files/whatever whenever needed. When the importance of this finally hit me, I was able to rewrite and debug, in a day, some code that I hadn't been able to fix (thanks to heisenbugs) for over a month. In modern systems the amount of data the machine can process so wildly outweighs the amount of code you could possibly write that it isn't funny: managing the data is wildly more important than managing program flow.
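
A small C++ sketch of what "the object manages itself" can look like in practice (the names are invented; the point is that the caller never touches the underlying array directly):

#include <iostream>
#include <stdexcept>
#include <vector>

// The caller just appends readings and asks questions; the object
// owns the storage, grows it as needed, and checks its own bounds.
class ReadingLog {
    std::vector<double> readings;   // hidden: nobody outside resizes or indexes this directly
public:
    void record(double value) { readings.push_back(value); }

    double at(std::size_t i) const {
        if (i >= readings.size()) throw std::out_of_range("no such reading");
        return readings[i];
    }

    double average() const {
        if (readings.empty()) return 0.0;
        double sum = 0.0;
        for (double r : readings) sum += r;
        return sum / readings.size();
    }
};

int main() {
    ReadingLog log;
    for (int i = 1; i <= 5; ++i) log.record(i * 1.5);
    std::cout << "average: " << log.average() << '\n';
    std::cout << "third reading: " << log.at(2) << '\n';
}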

Inheritance: allows objects to use previously written code in different ways. This comes in handy because so many times you get similar but slightly different conditions that almost need the same code. It also means that if you debug one error you made, you will have to do less searching to find the same error in each of those "slightly different conditions that need almost the same solution", as most should point to the same code. I'm a bit suspicious that having more people and more teams tends to lead to abuse, especially considering all the arcane discussion of exactly which bits get inherited in various languages.

Polymorphism: obviously useful if you have more than one type of IO, simply feed it to an object and let the object determine how to format and route the data. It is also great for debugging, replace any call (even a system call) with your own debug code and watch what it does. Also this allows the reverse: taking an object and placing it in a test rig to make sure it is doing exactly what you think it does. I'm sure there are others, but these stand out as how I keep using this.
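
A rough C++ sketch of that swap-in-debug-code trick: the routine under test talks to an interface, and the test rig substitutes its own implementation (all names invented for the example):

#include <iostream>
#include <string>
#include <vector>

// The code under test only sees this interface.
class Output {
public:
    virtual ~Output() = default;
    virtual void write(const std::string& line) = 0;
};

// The real implementation, used in production.
class ConsoleOutput : public Output {
public:
    void write(const std::string& line) override { std::cout << line << '\n'; }
};

// The test-rig implementation: records what was written so we can inspect it.
class RecordingOutput : public Output {
public:
    std::vector<std::string> lines;
    void write(const std::string& line) override { lines.push_back(line); }
};

// The routine we want to observe; it has no idea which Output it got.
void report(Output& out, int errors) {
    out.write(errors == 0 ? std::string("all clear")
                          : "errors found: " + std::to_string(errors));
}

int main() {
    ConsoleOutput console;
    report(console, 0);                 // normal run, goes to the screen

    RecordingOutput recorder;
    report(recorder, 3);                // same code, captured for inspection
    std::cout << "captured: " << recorder.lines.at(0) << '\n';
}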

So what happened to program flow? Every time you access an object (and in plenty of languages *any* variable can be an object, or perhaps *is* an object with just one variable) you could be executing whatever routine the object defines for that interaction. And because the object definition depends on the *data* passed to the routine, you can't just look through the code and see what it is (thanks to polymorphism), except in trivial cases. Is it worth it? There is an old saying, "show me your flowcharts and I will be confused, show me your data structures and I will understand" [you can tell it is old because when was the last time you saw a flow chart], and now the flow is practically indecipherable while the data structures can be easily laid out in specific classes, along with all the data management routines showing how they are handled. Furthermore, the sheer amount of code has been reduced (well, hopefully) thanks to the ability to reuse that code in various ways (thanks to inheritance and polymorphism), meaning the coder has to understand and search through far less code. On the other hand, it becomes next to impossible to just "jump in" (although considering the lines of code that can be created using an OO style, just "jumping in" isn't for the faint of heart anyway).

When architecting a program in OO, I'd still recommend trying to leave the flow of execution obvious (you really want to be able to follow that) but be more concerned with your data structures and how transfers (internal or IO) are handled. I suppose that eventually it will turn to spaghetti, but try not to start out that way. All that inheritance and spaghetti shouldn't be needed on small programs, but might be unavoidable in large, complex jobs (that really need those gigabytes for program data).

Balain
2018-01-19, 06:17 AM
...And what is one that isn't object-oriented?.....

Some other types of languages I didn't notice mentioned are functional languages and logic languages. I found both interesting to use.