Before starting to program, it is important to have an appreciation for how the computer will look at your program. The fact of the matter is that, when your program is executed, the computer does nothing unless your program tells it to. This includes reading input from the user or a file, doing something with that input, and delivering the results in a predictable manner (usually to the screen or to a printer). None of this happens unless you, the programmer, tell the computer to do it. You do not need to spell out the fine details of how to do it, but you must tell it to do it.
At their core, computers are machines. Machines operate on the basis of commands: you tell them to do something and they do it. This is an essential aspect of computers that anyone who wants to program must recognise: the computer does what you tell it to do, and does it faster than you could. But if it receives bad input, it will give bad output. In the early days of computing, programmers coined an acronym for this: GIGO (Garbage In, Garbage Out). Like speaking Swahili to a Parisian, if you do not give the computer the right commands in the way it expects, you will not get the right results in the way you expect.
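The GIGO idea can be made concrete with a small sketch. The example below (in Python, chosen only for illustration; the function name and data are hypothetical) computes an average. The computer carries out the instructions faithfully either way, so a flawed value in the input produces a flawless-looking but meaningless answer:

```python
def average(values):
    """Return the arithmetic mean of a list of numbers."""
    return sum(values) / len(values)

# Sensible input -> sensible output.
good = average([10, 20, 30])
print(good)  # 20.0

# One garbled value ("garbage in"): the computer does not notice,
# it simply computes the mean it was told to ("garbage out").
bad = average([10, 20, -9999])
print(bad)
```

The machine never judges whether `-9999` is a plausible value; checking the input is the programmer's job, not the computer's.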