I think statusBarFrame and applicationFrame complement each other exactly to form the whole screen's bounds. Both frames are expressed in the "raw" portrait-orientation screen coordinate space (which I believe is the screen's bounds, and does not rotate with the device). So, for example, when an iPad is rotated upside down, the output of
NSLog(@"the status bar frame is %@",
NSStringFromCGRect([[UIApplication sharedApplication] statusBarFrame]));
NSLog(@"the applicationFrame is %@",
NSStringFromCGRect([[UIScreen mainScreen] applicationFrame]));
is
the status bar frame is {{0, 1004}, {768, 20}}
the applicationFrame is {{0, 0}, {768, 1004}}
But one question remains: why do these two complementary values come from two different objects, the UIApplication instance and the UIScreen instance, rather than from the same object (say, both from the UIScreen object)?