How does the browser utilize the cache, does it load webpage specific cache content, or will it access scripts from other pages?
It's up to each browser how caching works. More often than not, each hosted file (whether it's a script, HTML, or an image) can carry an expiry time telling the browser when to re-download it. Sometimes the browser overrides that, depending on its defaults or the user's configuration.
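For example, the server's response for a script file might carry caching headers along these lines (the values here are purely illustrative):
Cache-Control: public, max-age=86400
Expires: Thu, 02 Aug 2012 12:00:00 GMT
The browser uses headers like these to decide whether a later request for the same URL can be served from its cache or has to hit the network again.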
When you say scripts from other pages do you mean:
//Option 1 - remote script
<script src="somesource" type="text/javascript"></script>
//Option 2 - inline script
<script type="text/javascript">
$(document).ready(function() { /* ... */ });
</script>
With Option 1, a browser will typically reuse the already-downloaded somesource if the URL exactly matches. With Option 2 the code is part of the page and must be downloaded each time, because it's inline in the HTML.
If I'm loading jQuery, I'm assuming it's best practice to check if it's in the user's cache before I load it, right?
I'm assuming you're only serving pages from remote hosts and targeting standard web browsers (e.g. Chrome, IE, Firefox, etc.). In this case you don't have access to the cache, nor can you detect whether the browser has already cached the script. Not only that, it doesn't matter: the browser decides whether or not to re-download it (Option 1).
Also, it seems like you're confusing how browsers work. On each page request, the browser generally discards the previous page's images, HTML, CSS, JavaScript and other resources, and fetches the URL specified. If that primary request references further secondary requests (scripts/images/CSS, etc.), the browser loads each secondary request as needed/configured, falling back on its cache as in Option 1. As far as I'm aware, web browsers don't keep secondary resources loaded across every page just because one page used them, as this would cause massive problems.
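To illustrate the distinction, here's a minimal page (the local file names are made up): the HTML document you navigate to is the primary request, and everything it references is a secondary request that the browser either re-downloads or serves from cache as it parses the markup.
<!-- Primary request: the HTML document at the URL you navigated to -->
<html>
<head>
    <!-- Secondary requests, resolved per Option 1 (cache or re-download) -->
    <link rel="stylesheet" href="site.css">
    <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js" type="text/javascript"></script>
</head>
<body>
    <img src="logo.png">
</body>
</html>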
So I guess my question primarily pertains to how browser caching works. If the user has loaded jQuery from Google CDN on another website, and I'm using the Google CDN Version, will it load a second time from Google? Or the same jQuery version from another CDN, such as Microsoft or the jQuery CDN?
Again, this is up to each browser. Generally speaking, if two independent websites use the same request URL (for example, jQuery on Google's CDN), then the browser will most likely use the cached version (Option 1).
If two independent websites request a file with the same name but from different URLs, then most likely each one will be downloaded separately, because the web browser can't be 100% certain the files are the same based on the name alone.
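For example, these two tags point at the same jQuery 1.7.2 release, but because the URLs differ the browser caches them as two separate resources (the exact Microsoft CDN path is from memory, so treat it as an illustration):
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js" type="text/javascript"></script>
<script src="http://ajax.aspnetcdn.com/ajax/jQuery/jquery-1.7.2.min.js" type="text/javascript"></script>
Loading one of them does nothing to warm the cache for the other.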
How could all the common script libraries be stored locally and accessed rather than loading online every time?
I don't believe you can manually tell web browsers to use local resources out of the box (see my final section below). If you're truly intent on building as low-bandwidth an application as possible, I'd recommend searching for resources on that topic (mostly around mobile and offline-oriented solutions such as jQuery Mobile).
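One common compromise, which doesn't read the browser's cache but does remove the hard dependency on the CDN, is to request the CDN copy and fall back to a copy hosted alongside your own site if it fails. The local path below is hypothetical; adjust it to wherever you host your own copy:
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js" type="text/javascript"></script>
<script type="text/javascript">
// If the CDN copy didn't load, window.jQuery is undefined, so write a tag for the local copy instead
window.jQuery || document.write('<script src="/scripts/jquery-1.7.2.min.js"><\/script>');
</script>
If the user already has the CDN file cached from another site, the first tag costs nothing; otherwise the fallback keeps your page working.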
Loading Resources Locally
You can certainly do this, although I still would not recommend it. What you can do is run a web server locally on the PC (IIS Express, IIS, Apache, whatever). You can then either run a local DNS server or edit the hosts file. If you changed the hosts file to look like this:
127.0.0.1 ajax.googleapis.com
then requests for https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js would actually go to the local web server. Your application running on that local web server could check whether the file exists locally and serve the local version, and if not, use an external DNS server to resolve the real IP and download it.
This is all well and good, but it would almost exactly duplicate what the web browser already does. The only advantage is being able to preload scripts onto a local machine, and that seems like a lot of extra effort that shouldn't be necessary.