I have noticed a behavior of Python functions that I cannot fit into my naive understanding of local vs. global scope. For example, this code:
x = 1
y = 1
def f():
    print(x + y)
f()
appears to be valid Python code: it produces no error message, and the function runs. But it seems to break the rules, since x and y are not declared global, are not arguments to f(), and are not defined in the local scope of f(); still, f() can access them.
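For contrast, here is a minimal variation I put together myself (my own sketch, not from any documentation) that fails the way I would have expected the original to fail: assigning to x anywhere inside f() seems to make x local to f(), so reading it before the assignment raises UnboundLocalError.

x = 1
def f():
    print(x)  # UnboundLocalError: the assignment below makes x local to f()
    x = 2
f()

So reading a name that was defined outside the function apparently works as long as the function never assigns to it. Can someone explain what is happening here?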