I am seeing a difference between what TalkBack reads out loud when a screen is displayed on a Nexus 7 (4.4.2) and a Samsung SGH-I317 (4.1.2).
- On the Samsung SGH-I317 (4.1.2), when a screen is displayed, some elements are read aloud, but not all of the elements displayed.
- On the Nexus 7 (4.4.2), TalkBack reads only those elements that are explicitly selected/tapped.
Is this a Samsung/Nexus difference or an Android 4.1.2/4.4.2 difference?
What is the expected behaviour when a screen is displayed? Should TalkBack read all the elements on the screen, or should it only announce which screen is displayed and leave it to the user to touch a UI element before it is read aloud?
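For context, part of this behaviour can be influenced from the app side. TalkBack reads a view's `android:contentDescription` (or text) when the view gains accessibility focus, and as of API 19 (4.4, KitKat) a view can be marked as a live region so that TalkBack announces changes to it without the user touching it; that attribute does not exist on 4.1.2, which could contribute to device differences. The sketch below shows both; the IDs, strings, and drawable are illustrative, not from the original question.

```xml
<!-- Hypothetical layout fragment; only the attribute names are real framework APIs. -->
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical">

    <!-- TalkBack reads the contentDescription when this view
         receives accessibility focus (e.g. is touched). -->
    <ImageButton
        android:id="@+id/refresh_button"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:contentDescription="Refresh"
        android:src="@drawable/ic_refresh" />

    <!-- API 19+ only: TalkBack announces changes to this view's text
         automatically, without the user touching it first. -->
    <TextView
        android:id="@+id/status_text"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:accessibilityLiveRegion="polite" />
</LinearLayout>
```

On pre-4.4 devices a similar effect can be approximated in code with `View.announceForAccessibility()` (API 16+), which may explain why different Android versions behave differently when a screen first appears.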