I am working through a beginner's Python book, and I think I understand what the exercise is asking: encapsulate the code below in a function, and generalize it so that it accepts the string and the letter as arguments. Here is the original code:
fruit = "banana"
count = 0
for char in fruit:
    if char == 'a':
        count += 1
print count
My answer is:
def count_letters(letter, strng):
    fruit = strng
    count = 0
    for char in fruit:
        if char == letter:
            count += 1
    print count
count_letters(a, banana)
But it fails with NameError: name 'a' is not defined, and I don't see where I'm going wrong. I thought the interpreter would work out that a is the argument for letter and banana is the argument for strng.
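To check it wasn't something specific to my function, I tried the same kind of call on a tiny one-argument function of my own (this is just my minimal reproduction, not from the book), and it fails in exactly the same way:

def show(letter):
    # just echo whatever was passed in
    print letter

show(a)    # fails with the same error: NameError: name 'a' is not defined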
So I must be missing something fundamental.
Can you help?