Exodus - a Web Application Review tool for Java
- Changed the request and response panels to wrap long lines onto the next line instead of using a horizontal scrollbar. This makes it a bit easier to see what is happening in a long form submission, POST, cookie, etc.
- Fixed a problem with the processing of Headers, where all but the last of multiple Set-Cookie headers were discarded.
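The usual way to avoid losing repeated headers is to keep a list of values per header name rather than a single value. A minimal sketch of the idea (the `Headers` class below is illustrative, not Exodus's actual implementation):

```java
import java.util.*;

// Illustrative sketch: keep a list of values per header name so that
// repeated headers such as Set-Cookie are all retained.
class Headers {
    private final Map<String, List<String>> headers = new LinkedHashMap<>();

    void add(String name, String value) {
        // header names are case-insensitive (RFC 2616), so normalise the key
        headers.computeIfAbsent(name.toLowerCase(Locale.ROOT), k -> new ArrayList<>())
               .add(value);
    }

    List<String> get(String name) {
        return headers.getOrDefault(name.toLowerCase(Locale.ROOT),
                                    Collections.emptyList());
    }
}
```

A map from name to value-list also preserves insertion order of header names, which matters when replaying a request verbatim.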
- TranscoderFrame now has undo/redo support - Thanks for the suggestion, Dave.
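In Swing, undo/redo for a text window generally amounts to registering an `UndoManager` on the underlying document; a minimal sketch (the class and method names here are my own, not the TranscoderFrame's):

```java
import javax.swing.text.PlainDocument;
import javax.swing.undo.UndoManager;

// Illustrative sketch: an UndoManager listens for edits on a text
// document and can replay them. The frame would then bind undo()/redo()
// to menu items or Ctrl-Z/Ctrl-Y keystrokes.
class UndoSupport {
    static UndoManager attach(PlainDocument doc) {
        UndoManager undo = new UndoManager();
        doc.addUndoableEditListener(undo);  // every edit becomes undoable
        return undo;
    }
}
```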
- Exodus no longer deletes all files in the save directory prior to starting. Sorry, Tim.
Development has been progressing apace, albeit without much exposure to the outside world. Still no Internet at home, which makes it more difficult to keep this page updated.
- Significantly, the plugin interface has changed, moving away from the "Observer/Observable" paradigm. This may or may not be a good idea; the jury is still out :-)
- The Request and Response classes have been reimplemented as stream-based messages (inheriting functionality from the Message and Header classes). This should improve performance, since messages are now streamed back and forth when they are not being manually edited, rather than being downloaded completely to the proxy and then written out to the browser.
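The streaming idea can be sketched as a simple pump loop that relays each chunk of bytes as it arrives, instead of buffering the whole message in the proxy first (illustrative code, not the actual Message class):

```java
import java.io.*;

// Illustrative sketch: relay bytes from server to browser as they
// arrive, instead of buffering the complete message in the proxy.
class StreamPump {
    static void pump(InputStream from, OutputStream to) throws IOException {
        byte[] buf = new byte[4096];
        int n;
        while ((n = from.read(buf)) != -1) {
            to.write(buf, 0, n);  // forward each chunk immediately
        }
        to.flush();
    }
}
```

Memory use then stays bounded by the buffer size rather than growing with the size of the response, which is what makes large downloads cheap.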
- Display of requests and responses has been moved to a new Panel class, which supports editing, parsing, and tabular and MIME views. This is now the preferred way of presenting Request and Response instances to the user. Some migration still needs to happen, but it is mostly done.
- HTML rendering has been reimplemented in the MIME panels of the Response. As noted previously, this was disabled because rendering can resubmit requests to the originating server if, for example, the document contains frames, image HREFs, etc. The HTML is only rendered if you select the MIME tab, so if this bothers you, don't do it :-) This can be a problem if, for example, one of the frames would log you off, or has side effects of some sort. The originating server will receive requests if the document specifies a BASE HREF, or its links are absolute rather than relative; and if the originating server is not on the other side of a proxy, or the default Java proxy settings are correct. (Exodus makes use of its own classes to retrieve requests, so any proxies configured in Exodus would not be used automatically.)
- Saving and restoring of properties is also implemented. Exodus reads properties from the jar, and then overrides them with any properties from "Exodus.Properties" in your home directory. If you select "Options/save", Exodus will save any changed properties to your local props file.
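The two-layer scheme maps naturally onto `java.util.Properties` defaults, where user values shadow the defaults from the jar. A sketch under those assumptions (the class, method, and property names are illustrative):

```java
import java.io.*;
import java.util.Properties;

// Illustrative sketch of the two-layer scheme: defaults loaded from the
// jar, shadowed by the user's "Exodus.Properties" file if it exists.
class PropertyLoader {
    static Properties load(Properties defaults, File userFile) throws IOException {
        Properties props = new Properties(defaults);  // lookups fall back to defaults
        if (userFile.exists()) {
            try (InputStream in = new FileInputStream(userFile)) {
                props.load(in);  // user values override the defaults
            }
        }
        return props;
    }
}
```

Saving only the changed properties then amounts to writing out `props` itself, since the `Properties` object holds just the overrides while the defaults stay in the wrapped object.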
- Memory utilisation has been significantly reduced. Conversations are now cached up to a limit (currently hardcoded at 10, but it will become configurable). If a "backingstore" directory has not been specified, conversations older than the limit will not be accessible. Unfortunately, reading and writing to a backingstore directory is a little :-) broken at this point. Future releases will have this fixed, I hope.
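A bounded cache like this is often built on `LinkedHashMap` with its `removeEldestEntry` hook; a sketch of the idea (not Exodus's actual code, and the fallback to the backing store on eviction is omitted):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative sketch of a bounded conversation cache: once the limit
// is exceeded, the least-recently-used entry is evicted (the tool would
// then re-read it from the backing store, if one is configured).
class ConversationCache<K, V> extends LinkedHashMap<K, V> {
    private final int limit;

    ConversationCache(int limit) {
        super(16, 0.75f, true);  // access-order: least-recently-used entry is eldest
        this.limit = limit;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > limit;   // evict once we pass the configured limit
    }
}
```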
- The SessionID analysis module has been revised. Currently, selecting a conversation does not copy it to the request box for editing; simply copy and paste it from one of the other panes in the interim.
- A "transcoder" window has been added. This allows you to "transcode" text between formats; currently URL(en|de)coding and Base64(en|de)coding are supported, as well as MD5 and SHA-1 hashing. Routines exist for Hex(en|de)coding, but are not accessible (no buttons :-)
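On a modern JDK these transcoder operations map directly onto standard-library routines; a sketch (note: `java.util.Base64` and the `Charset` overload of `URLEncoder.encode` post-date Exodus, which would have used its own codecs):

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.Base64;

// Illustrative modern equivalents of the transcoder's routines.
class Transcoder {
    static String urlEncode(String s) {
        return URLEncoder.encode(s, StandardCharsets.UTF_8);  // Java 10+ overload
    }

    static String base64Encode(String s) {
        return Base64.getEncoder().encodeToString(s.getBytes(StandardCharsets.UTF_8));
    }

    static byte[] hash(String algorithm, String s) throws NoSuchAlgorithmException {
        // algorithm is e.g. "MD5" or "SHA-1"
        return MessageDigest.getInstance(algorithm).digest(s.getBytes(StandardCharsets.UTF_8));
    }
}
```

For example, `Transcoder.urlEncode("a b&c")` yields `"a+b%26c"`, and `Transcoder.base64Encode("hello")` yields `"aGVsbG8="`.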
- Interactive Intercept now pops up a separate window for each request/response if it matches the criteria, rather than being edited in the main window. This reduces clutter, and makes it more obvious when a request is waiting for user action.
- Implemented a URL Fuzzer module. Selecting values from the table is not entirely reliable yet; please select column 1 or 2 before clicking on column 3 to select a historical value to use when fuzzing. You can also type in your own value in column 4, if you'd rather not have your own username locked out while the password field is being fuzzed! (Hi Spike :-) I still need to make the list of fuzz strings user selectable/updatable - look for this in the next version.
- The URL Fuzzer module also tests whether a resource REQUIRES a cookie, if one is sent. We only test simple GET requests, so as not to cause any unwanted side effects. We highlight any URLs that return a different response depending on whether there is a cookie or not (or an authorised cookie, maybe). I'll also do this with BasicAuth, and look for 401's. Not too tricky.
- General clean ups, made more robust, etc.
- Start of a cookie gathering/analysis plugin
I have actually been working on it; I just have no Internet access at home, and thus find it difficult to upload :-)
- Implemented a new plugin model, so that it is trivial to add new functionality to the app. The plugin model includes intercepting requests, intercepting responses, and being updated whenever anything interesting happens in the Model. "Interesting" generally includes a new or updated conversation, a new or updated URL, a new cookie, etc
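A plugin contract along those lines might look like the following sketch (the interface and type names are stand-ins, not the actual Exodus API):

```java
// Illustrative sketch of a plugin contract: intercept requests and
// responses, and be notified of interesting changes in the Model.
class Request {}
class Response {}
class ModelEvent {}  // e.g. a new/updated conversation, URL, or cookie

interface ExodusPlugin {
    Request interceptRequest(Request request);                       // may edit or replace it
    Response interceptResponse(Request request, Response response);  // likewise
    void modelChanged(ModelEvent event);  // called when something interesting happens
}
```

A plugin that only cares about model updates can simply return its arguments unchanged from the two intercept methods.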
- Among the plugins implemented so far is a manual request plugin, which allows the operator to review a previous request, edit it, and then resubmit it to the server as a new conversation,
- as well as a spider plugin that shows all followed and unfollowed links, and allows the operator to specify header fields before fetching the links singly, or fetching all current "unseen" links
- Made Exodus more robust in the face of rapid-fire requests, e.g. if a spider is running through the proxy. It used to throw ConcurrentModificationExceptions; hopefully that is now sorted out through extensive locking.
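The standard cure for a `ConcurrentModificationException` in this situation is to synchronise mutation and hand the GUI a snapshot copy to iterate over; a sketch of the pattern (not the actual Exodus classes):

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch: proxy threads add conversations under a lock,
// and the GUI iterates over a snapshot copy, so a request arriving
// mid-iteration can no longer trigger ConcurrentModificationException.
class ConversationLog {
    private final List<String> conversations = new ArrayList<>();

    synchronized void add(String conversation) {
        conversations.add(conversation);
    }

    synchronized List<String> snapshot() {
        return new ArrayList<>(conversations);  // safe to iterate without the lock
    }
}
```

The trade-off is an extra copy per refresh, which is cheap next to the cost of painting the table.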
- Sketched out an idea for a Cookie analysis module
- Enhanced the sketch of the spider module. Now I need to start coding it!
- Fragments (bits of forms, scripts, comments, etc.) are now actually displayed in the GUI
- fixed SSL intercept with IE - there is a slight delay while IE looks for new certs from MS, but it works.
- Proxy authentication now works - we no longer attempt HTML analysis if the document is not obviously HTML, so AnalyseResponse no longer dies. (Seen with a Cisco content filter)
- Fixed the SSL support. It was pretty badly broken.
- SSL upstream proxies now seem to work. (Tested by running two levels of Exodus :-)
- Reading from and writing to a file works properly
- Proxies are now configured in a dialog, which better reflects the fact that the upstream proxies configured apply to all aspects of the application, not just the proxy. The dialog includes a "No Proxy" option, but this is ignored for now.
- Errors found during the request stage are now propagated back to the browser, so you'll see them, rather than the browser sitting there waiting for the socket to close
- Fixed proxy download of large documents
- Model now returns copies, so changes to the conversations are not necessarily reflected in the Model
- Model maintains links between conversations and URLs
- Model can now read from and write to a directory (persistent storage)
- Exodus now includes a log pane, so messages are not lost to stdout when run from a GUI. Low-level classes still log to stdout, so if you are having problems, run it from a DOS box or shell
Exodus is © 2003 by Rogan Dawes <firstname.lastname@example.org>