Here's what I've come up with. Since I've put my store in the root of my web, it's a little different.
User-agent: *
Disallow: /cgi-bin/
Disallow: /_borders/
Disallow: /_derived/
Disallow: /_fpclass/
Disallow: /_overlay/
Disallow: /_private/
Disallow: /_themes/
Disallow: /_vti_bin/
Disallow: /_vti_cnf/
Disallow: /_vti_log/
Disallow: /_vti_map/
Disallow: /_vti_pvt/
Disallow: /_vti_txt/
Disallow: /aspnet_client/
Disallow: /images/
Disallow: /scripts/
Disallow: /stylesheets/
Disallow: /global
Disallow: /paypal
Disallow: /shop
Disallow: /ssl
Disallow: /tmp_
Disallow: /update_
Disallow: /ups
Disallow: /vs
Disallow: /yellow_

User-agent: Googlebot-Image
Disallow: /
This should keep bots from indexing the store scripts they have no business crawling, while still letting them spider the rest of the content.
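If you want to double-check that the rules do what you expect before deploying them, Python's built-in urllib.robotparser can parse the file and answer fetch questions. This is just a sanity-check sketch: it uses an abbreviated copy of the rules above, and the hostname, paths, and the "somebot" agent name are placeholders.

```python
# Sanity-check robots.txt rules with Python's standard-library parser.
# Only a few of the Disallow lines are reproduced here for brevity.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
Disallow: /shop

User-agent: Googlebot-Image
Disallow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Store scripts are blocked for a generic crawler...
print(rp.can_fetch("somebot", "https://example.com/shop/cart.asp"))   # False
# ...regular content is still crawlable...
print(rp.can_fetch("somebot", "https://example.com/products.html"))   # True
# ...and Googlebot-Image is shut out entirely.
print(rp.can_fetch("Googlebot-Image", "https://example.com/images/logo.gif"))  # False
```

Note that Disallow lines are prefix matches, so "Disallow: /shop" also blocks /shop/cart.asp and anything else under /shop.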
If anyone is running a FrontPage web, it's wise not to let the FrontPage _vti folders get indexed either.
Alternatively, you can put the following meta tag in the <head> section of each file you don't want indexed, such as the store admin pages.
<meta name="robots" content="noindex, nofollow">
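For example, the top of a store admin page might look like this (the title and content are just placeholders):

```
<html>
<head>
<title>Store Administration</title>
<meta name="robots" content="noindex, nofollow">
</head>
<body>
...
</body>
</html>
```

The tag must appear inside <head>; noindex tells crawlers not to add the page to their index, and nofollow tells them not to follow the links on it.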
If anyone sees any mistakes, or knows of a better way to do this, please let us know.
Always remember that wherever you go, there you are.