95% of all "Introduction to " books tend to dedicate the first couple of chapters to the fundamentals, but with a specific bias towards the language in question. Seek out a few of those at a library or online equivalent and you'll start to see patterns cropping up.
Anything that doesn't have that bias is likely to use pseudocode, which looks like a programming language anyway.
Object orientation works around the concept that things in the program "know" things about themselves and how to do things for themselves, rather than having some other part of the program do things to them. Commonly you'll see something like `doSomethingWith(someObject)` being the non-OO way and `someObject.doSomething` being the OO way. That is, in the latter `someObject` contains the knowledge of how to `doSomething`, and the dot notation urges it to do whatever that is.
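A minimal sketch of that contrast in Python (the `Animal`/`describe` names are just made up for illustration):

```python
# Non-OO style: a standalone function acts on data handed to it.
def describe(animal):
    return f"{animal['kind']} says hello"

# OO style: the object itself carries the knowledge of how to respond.
class Animal:
    def __init__(self, kind):
        self.kind = kind

    def describe(self):
        return f"{self.kind} says hello"

print(describe({"kind": "elephant"}))   # the function is told about the object
print(Animal("elephant").describe())    # the object is asked to do it itself
```

Both lines print the same thing; the difference is where the knowledge lives.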
For a silly but more concrete example, `x ← 2 + 2` is non-OO, but `x ← 2.add(2)` is at least partially OO, because the first 2 is expected to know how to add another 2 to itself and give out the correct answer. Presumably in whatever language that is, someone has created a method for numbers to know what to do when told to `add`. The other 2 doesn't really get a say in things. We might also have, say, `elephant.putOn(hat)`, but it might not be possible to `hat.putOn(elephant)` because no-one thought to teach the hat how to wear things, let alone elephants.
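Python actually works this way under the hood: `2 + 2` is sugar for asking the first 2 to add the second one to itself via its `__add__` method. The `Elephant`/`Hat` classes below are my own invented sketch of the second example:

```python
# Numbers are objects too: the first 2 knows how to add the second one.
print((2).__add__(2))  # same result as 2 + 2

class Hat:
    pass  # nobody taught Hat how to wear anything

class Elephant:
    def __init__(self):
        self.wearing = []

    def put_on(self, item):
        # Elephant carries the knowledge of how to put things on
        self.wearing.append(item)

elephant = Elephant()
hat = Hat()
elephant.put_on(hat)    # works: the elephant knows how
# hat.put_on(elephant)  # AttributeError: Hat has no such method
```

Trying the commented-out line raises an `AttributeError`, which is the language's way of saying no-one taught the hat that trick.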
There was that one bash.org quote where a script kiddie was given 127.0.0.1 as part of an "oh yeah, I dare you" taunt after he said he could hack anyone, and he fell for it hook, line, and sinker. He was posting things like "Hahaha your K drive is being deleted! Now your H drive! [connection reset by peer]", and right after that the challenger was like "I don't even have a K drive."
(RIP bash.org though. I would have tried to link it otherwise)