I am looking for software that will download a complete working copy of a web page. Right now I want this specifically for a simple Neatline map page (one that uses a historical map); I realize the site is database-backed, but I am hoping something can recreate a simple static copy of a site of this sort.
I'd like to do this in general for a couple of reasons: it would let us archive a few (non-Neatline) things built on aging technologies I'm afraid will go away soon, and it would let me work on someone else's code locally when they ask me for troubleshooting advice. I have used HTTrack in the past, but for more complex websites it seems to miss big swaths of both the CSS and the JavaScript, or fails to link them up properly.
Software I have tried: HTTrack, SiteSucker, and the ScrapBook Firefox extension, as well as most browsers' "save complete web page" option.