(I’m sorry, this turned out longer than I imagined. I hope you don’t mind my ramble…)
Honestly, the terms “Functional programming”, “Object-oriented programming” or “Procedural programming” are incredibly overrated:
You’d think you could apply them to, or detect them in, a project or source file. Well, you can’t.
They’re more like a way of thinking about programs, they’re fundamentally not in programs.
More specifically, they’re called “paradigms” for a reason. If you google “paradigm definition” you get “a typical example or pattern of something; a pattern or model.”, and “programming paradigms” are exactly that:
Patterns that you can apply when programming.
You can think about the same program from a functional or procedural or object-oriented perspective.
What you get out of it depends on your knowledge of said paradigms, and on what happens when you try to recognize the patterns of said paradigms.
I know you’re not asking for a religious language war, it’s stupid, I agree.
But let me give you an example from a language that I’m very familiar with, Lua.
You can write programs in all paradigms in it, just like JavaScript, Python, and a lot of other languages. Also, its syntax is fairly idiomatic, which makes for good examples.
A simple for loop for counting down “procedural style”:
function count_down(max)
    for i = max, 1, -1 do
        print("i is", i)
    end
end

count_down(10)
Why is this “procedural”? It defines a procedure as a list of steps, to be executed in order.
How could someone (mis-)treat this as functional? Well, count_down is a well-behaved function with no side effects*:
It always maps the integer max to the nil (nothing) return value. While typically not very useful in a functional programming language, it’s still valid.
And how is this object-oriented? Well, if you knew Lua, you’d know that this is essentially three things happening here:
We define a function, then add it to our environment (_G or _ENV), then run it.
Defining it and adding it to our environment can be interpreted as adding a method to the environment, and running it can be interpreted as invoking that method on the environment. Not how I’d usually treat this code, but still valid.
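To make that reading concrete, here’s a minimal sketch in plain Lua (using the standard global table _G) showing that a top-level function definition really is just an assignment into the environment:

```lua
-- "function count_down(max) ... end" at the top level is sugar for
-- an assignment into the environment table _G:
function count_down(max)
    for i = max, 1, -1 do
        print("i is", i)
    end
end

-- The global name and the table entry are the same value:
assert(count_down == _G.count_down)

-- A plain call is just "look the entry up in the environment, then run it":
_G.count_down(3)
```

So “invoking the method on the environment” is less exotic than it sounds: every global function call already works that way under the hood.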
A simple recursive function for counting down “functional style”:
function count_down(cur)
    print("i is", cur)
    if cur ~= 1 then
        count_down(cur - 1)
    end
end

count_down(10)
Why is this “functional”? Well, it uses recursion and causes no side effects*.
How is this procedural? By default: after all, we’re just defining a single procedure and then running it.
Sure, this procedure causes no side effects* and happens to be recursive, both things typically associated with functional programming, but both are perfectly valid in procedural programming (and quite common in e.g. C).
A simple “object-oriented” example:
obj = {
    min = 1,
}

function obj:count_down(max)
    for i = max, self.min, -1 do
        print("i is", i)
    end
end

obj:count_down(10)
Why is this object-oriented? Well, we have a thing (call it an object) that holds both its specific state (the number min) and a function used to access that state (a “method”): typical object-oriented stuff.
Why is this procedural? Well, the object is in no way special: it’s just a regular Lua table that happens to have a number value at the index “min” and a function value at the index “count_down”. The function is in no way special either; Lua is just clever with its use of the colon. When defining a function with :, Lua adds a hidden first parameter called “self” to the function. When calling a function from a table using : (instead of the normal indexing operation using .), Lua passes the table the function was indexed from as the first argument. As you can see, the function is in no way specialized to the table. It’s just syntax sugar. In fact, this below would be exactly the same to Lua:
local function my_count_down(self, max)
    for i = max, self.min, -1 do
        print("i is", i)
    end
end
obj.count_down = my_count_down
In fact, Lua has no built-in concept of object orientation; it’s just easy to use in an object-oriented way.
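To see the sugar in action, here’s a self-contained sketch (a variant of the example above that returns the count instead of printing, so the two call styles can be compared directly):

```lua
local obj = { min = 1 }

-- Defined with the colon, so a hidden first parameter "self" is added:
function obj:count_down(max)
    local steps = 0
    for i = max, self.min, -1 do
        steps = steps + 1
    end
    return steps
end

-- The colon call and the explicit-self call are exactly equivalent;
-- the colon just passes obj as the hidden first argument:
print(obj:count_down(3))       --> 3
print(obj.count_down(obj, 3))  --> 3
```

Both calls compile to the same thing; there is no class, no dispatch, just a table lookup and an extra argument.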
Now, all these examples look very different, but fundamentally, they are the same program, at most it’s the implementation that differs. All examples will count down from 10 to 1.
They were written differently, with specific paradigms in mind. And maybe you can recognize some specific patterns in them to associate them with a specific paradigm. But that’s only useful as far as the found patterns help you understand, these programs “are” not “in” a paradigm - It’s merely useful to view them as such, sometimes.
All programming paradigms are useful, if applied to the right thing.
But as you can see in the examples above, if applied to the wrong thing they make things needlessly complex.
With regards to a recommendation on learning: While you can learn the patterns of these concepts in basically any order, I’d recommend this:
Learn procedural programming to understand how computers work. While programming is not bound to a specific paradigm, in a way computers will always be best represented as “procedural”. It’s a good first starting point, as it helps you understand the other paradigms.
Learn functional programming to become a better programmer. At some point you can implement basically anything in a procedural language, just not well. While you likely won’t be using functional programming for anything work-related (there are exceptions), it’s great for understanding algorithms and data structures, which will help tremendously even with procedural programming.
To truly appreciate functional programming you’ll also want to understand its implementation; functional programming languages are usually easier to implement.
Learn object-orientation to be a productive programmer. The main advantage of objects is that you can easily compose them into very large, complicated programs; they make organizing your code easy, they enable interoperability between implementations via interfaces, etc. There is a reason a lot of large software is implemented in C++, Java, etc.
The reason object-orientation is so powerful is also the reason it’s a bad start for learning and a bad fit for small-scale programs: it tries to hide the complexity of the implementation. In a large program you want that, but in a small program you want to know exactly what you’re doing.
I hope that cleared things up a bit and that you don’t mind my ramblings 
I’m very sorry for this wall-of-text…
* Keep in mind that I play it fast and loose with some of the terms used here, because truly side-effect-free programs do nothing, if you think about it: every kind of IO operation is a side effect, including print. Depending on your definitions, just *starting a program* is a side effect.
Also keep in mind that I’m by no means an expert.