1st test:
var a = 1;
function b() {
    a = 10;
    return;
    function a() {}
}
b();
alert(a); // 1
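I also tried rewriting the first test with the inner function declaration moved to the top of b, and it behaves the same way, which makes me guess the declaration is somehow treated as if it were at the top and creates a local a (this is just my assumption; console.log stands in for alert so it runs outside the browser):

```javascript
var a = 1;
function b() {
    // my guess: `function a() {}` acts as if declared here,
    // so the assignment below hits this local a, not the outer one
    function a() {}
    a = 10;
    return;
}
b();
console.log(a); // 1
```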
2nd test:
var a = 1;
function b() {
    a = 10;
    return;
}
b();
alert(a); // 10
In the first test, a is still equal to 1 after calling b(), even though I set it to 10 inside the function. In the second test, I set it to 10 and it is 10 when I output it. How does this work?