This software is OSI Certified Open Source Software. OSI Certified is a certification mark of the Open Source Initiative.
The Toolserver Framework for Python is a framework for easy web service creation. Its main property is that it runs as a standalone server and so can easily be deployed. It uses Medusa as its web server core, but sports a threading model much like the process model of Apache 1.3.
For the license under which this software is distributed, read LICENSE.
For installation instructions, read the file INSTALL (essentially you can just do easy_install toolserver, but read it anyway for some info on dependencies you have to satisfy manually).
For more info, look at the Bitbucket page: http://bitbucket.org/rfc1437/toolserver/
Some highlights of why you might want to look into TooFPy:
- an industrial-strength, fast web server as the base. Static content is delivered directly by the high-performance Medusa server; dynamic content is delivered by dynamically managed worker threads.
- namespaces for tools as the base for URL construction, intermingled with a standard static web server to fill in those areas where no tools are registered.
- concentrates on building the API, not on building a full web application. You can build a web application if you want, but you don't need to. Want to just expose some quick hacks via SOAP or XMLRPC? Piece of cake.
- all APIs are dual use: they can be accessed over the web or internally by other tools. If you build a full web application, it will be composed of communicating tools, where tool communication happens by calling well-defined APIs that are used internally and can be used externally as well (if you so decide).
- automatic documentation generation for tools, both in HTML format (for human consumption) and WSDL format (for SOAP clients that need it). Just go to http://localhost:4334/API/ (assuming your server runs on the default port and the localhost IP) and you will get a list of links to installed tools with their respective API documentation - all generated on the fly from your tools' sources!
- programming by contract for web APIs! Of course, contracts are part of the human-readable documentation. Contract checking can be switched on or off.
- wrapper-based REST style APIs built on top of your standard RPC style APIs - no need to fight over API style anymore!
- factory tools that generate transient tools that themselves can be factories. This gives you fully dynamic namespaces.
- integrated authentication scheme for REST style API usage. This enables you to build your own authentication level on top of the tools without forcing some prebuilt scheme on you. This changed in 0.3.6, so now there are actually two authentication schemes; the newer one is built on standard user and group definitions, can be based on IP addresses and HTTP basic authentication, and can make use of a non-standard RSA authentication scheme.
- integrated tools for powerful thread management. Method calls can be made synchronously or asynchronously (using the same thread pool as the dynamic web content handling). Asynchronous calls can be immediate, queued (serialized by priority on a per-tool queue) or timed.
- powerful live debugger for the toolserver based on the Medusa monitor. You can see what your server is doing and inject Python code into a running system, peek at variables, even change them.
- simple HTML rendering library to ease production of well-formed HTML and XML content.
- dynamic and static namespaces for requests: global config variables are available during the lifetime of the server, context variables are available during a request, and static variables are available during normal method calls.
- integrated transaction hooks that allow you to tie your method's success or failure to database transaction calls. If your method fails, the database is rolled back.
- there are base classes for special purposes. For example, you can choose whether all methods should be externally accessible by default or whether none should be. Another base class provides RPC style access with a challenge-response scheme for authentication. And then there is an all-singing, all-dancing base class if you want very simple integration of authentication - just subclass from AuthenticatedTool instead of StandardTool.
- IP based access lists (hosts.allow for fixed host assignments: the given IP sets the designated client system name and no additional authentication is needed). Just add <clientname>:<ip> lines to either hosts.allow or hosts.deny to allow or deny access. Entries in hosts.allow also set the client name. This can be used as a simple authentication check for tools that want to restrict access to known systems. Systems in hosts.deny aren't allowed any call at all (this applies to both RPC and REST style access!).
- RSA key based authentication to prevent unauthorized system access when you don't have static IPs to check or need extra protection. If you activate RSA authentication, it is mandatory - no other (weaker) authentication scheme will be accepted.
- new RPC mechanisms can be installed easily, much like tools. Just write your RPC handling class like the SOAPHandler or XMLRPCHandler and put it into your local tools directory (or a global tools directory if you want to use the RPC mechanism in all instances of your toolserver). There is a new RPC mechanism based on base64-encoded binary pickles - look into PickleRPCClient.py for the client code and PickleRPCHandler.py for the handler code. It is much faster than XML based RPC mechanisms, as only very fast standard libraries are used. PickleRPC only works with RSA authentication, to ensure maximum security.
- you can easily plug in WSGI compliant applications if you have wsgiref installed. This allows merging your toolserver with functionality from other sources.
- the ReactorChain gives tools a powerful pattern for hooking into other tools' processing. This mechanism is reliable: if hooked tools fail, the chain reaction will still take place, so the original tool's functionality won't be compromised. One use for this is the system.request.rewrite chain, which can be used to add URI preprocessing that rewrites parts of URIs before any tool has a chance to see them. For example, this can be used to do virtual host handling.
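The contract checking highlighted above can be illustrated generically. This is only a sketch of the design-by-contract idea, not Toolserver's actual contract API; the require decorator and the CHECK_CONTRACTS switch are made-up names:

```python
# Generic design-by-contract sketch: a precondition runs before the call and
# checking can be switched off globally. Illustrative only, not Toolserver code.

CHECK_CONTRACTS = True

def require(predicate, message):
    """Attach a precondition to a function (hypothetical helper)."""
    def decorator(func):
        def wrapper(*args, **kwargs):
            if CHECK_CONTRACTS and not predicate(*args, **kwargs):
                raise ValueError("contract violated: " + message)
            return func(*args, **kwargs)
        return wrapper
    return decorator

@require(lambda n: n >= 0, "n must be non-negative")
def sqrt_floor(n):
    return int(n ** 0.5)

print(sqrt_floor(10))   # 3
```

Setting CHECK_CONTRACTS to False skips the predicate entirely, which matches the "can be switched on or off" behaviour described above.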
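The queued asynchronous calls mentioned above (serialized by priority on a per-tool queue) follow a common pattern that can be sketched with the standard library. This is not the Toolserver thread-management API, just an illustration of the idea:

```python
import itertools
import queue
import threading

# Sketch of per-tool queued calls: a single worker thread drains a priority
# queue, so calls are serialized and ordered by priority. Not Toolserver code.

class QueuedTool:
    def __init__(self):
        self._queue = queue.PriorityQueue()
        self._tiebreak = itertools.count()   # keeps FIFO order within a priority
        self.results = []

    def call_async(self, priority, func, *args):
        # Lower priority numbers are served first.
        self._queue.put((priority, next(self._tiebreak), func, args))

    def start(self):
        threading.Thread(target=self._run, daemon=True).start()

    def _run(self):
        while True:
            _prio, _n, func, args = self._queue.get()
            self.results.append(func(*args))
            self._queue.task_done()

tool = QueuedTool()
tool.call_async(2, lambda x: x * 2, 10)   # lower urgency
tool.call_async(1, lambda x: x + 1, 10)   # runs first despite being queued later
tool.start()
tool._queue.join()
print(tool.results)   # [11, 20]
```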
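The transaction hooks from the list above boil down to "commit on success, roll back on failure". A minimal sketch of that pattern, with a made-up helper name and a fake connection standing in for a real DB-API connection:

```python
# Sketch of the transaction-hook idea: a method's outcome decides between
# commit and rollback. run_in_transaction and FakeConnection are illustrative
# names, not Toolserver APIs.

def run_in_transaction(conn, func, *args):
    try:
        result = func(*args)
    except Exception:
        conn.rollback()    # method failed: undo database work
        raise
    conn.commit()          # method succeeded: make database work permanent
    return result

class FakeConnection:
    """Stand-in for a DB-API connection, for demonstration only."""
    def __init__(self):
        self.state = "open"
    def commit(self):
        self.state = "committed"
    def rollback(self):
        self.state = "rolled back"

conn = FakeConnection()
run_in_transaction(conn, lambda: "ok")
print(conn.state)    # committed

conn2 = FakeConnection()
try:
    run_in_transaction(conn2, lambda: 1 / 0)
except ZeroDivisionError:
    pass
print(conn2.state)   # rolled back
```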
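The encoding idea behind the PickleRPC mechanism described above (base64-encoded binary pickles) can be shown with two standard-library calls. This is only the serialization step, not the actual wire protocol in PickleRPCHandler.py; and remember that unpickling untrusted data is dangerous, which is why PickleRPC requires RSA authentication:

```python
import base64
import pickle

# PickleRPC-style payload encoding: arguments and results travel as
# base64-encoded binary pickles. Never unpickle data from untrusted sources.

def encode(obj):
    return base64.b64encode(pickle.dumps(obj, protocol=2))

def decode(data):
    return pickle.loads(base64.b64decode(data))

payload = encode({"method": "echo", "args": [1, 2, 3]})
print(decode(payload))   # {'method': 'echo', 'args': [1, 2, 3]}
```

Because only the C-implemented pickle and base64 codecs are involved, this is much cheaper than building and parsing XML, which is where the speed advantage over SOAP and XMLRPC comes from.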
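Any standard WSGI callable can be mounted via the WSGI support mentioned above. Here is the canonical minimal WSGI application, exercised directly the way a server would call it (how it gets registered with the toolserver is configuration-specific and not shown):

```python
# A minimal WSGI application per PEP 333: a callable taking the environ dict
# and a start_response callback, returning an iterable of byte strings.

def simple_app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Hello from a WSGI app\n"]

# Drive it by hand, the way a WSGI server would:
collected = {}
def fake_start_response(status, headers):
    collected["status"] = status
    collected["headers"] = headers

body = b"".join(simple_app({"REQUEST_METHOD": "GET"}, fake_start_response))
print(collected["status"], body)   # 200 OK b'Hello from a WSGI app\n'
```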
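The reliability property of the ReactorChain described above (a failing hook must not break the chain) can be sketched in a few lines. This class is a simplified illustration of the concept, not Toolserver's real implementation:

```python
# Sketch of a reliable hook chain: every hook runs even if an earlier one
# raises, so one broken tool cannot stop the chain reaction.

class ReactorChain:
    def __init__(self):
        self.hooks = []

    def add(self, hook):
        self.hooks.append(hook)

    def run(self, value):
        for hook in self.hooks:
            try:
                value = hook(value)
            except Exception:
                pass   # a failing hook must not compromise the original tool
        return value

chain = ReactorChain()
chain.add(lambda uri: uri.replace("//", "/"))
chain.add(lambda uri: 1 / 0)               # broken hook is simply skipped
chain.add(lambda uri: "/rewritten" + uri)
print(chain.run("//API/test"))   # /rewritten/API/test
```

A URI-rewriting chain like this is exactly the shape of the system.request.rewrite use case: each hook sees the value left by the previous one, and the chain's final value is what the server goes on to use.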
Some random notes (yes, some day there might be real documentation):
IMPORTANT NOTE ON PICKLERPC: Pickles should be considered insecure, as the binary encoding of data could introduce things you don't want to happen on your server. For that reason PickleRPC is disabled by default and needs to be enabled in the configuration. You should put some kind of protection in front of your server, for example checking connecting IPs against a list of trusted sources, and you SHOULD NOT expose a server with PickleRPC to untrustworthy systems. It would be a very bad idea to enable PickleRPC and open your server to the internet without any security measures! To help prevent damage, PickleRPC is tied to RSA authentication - if you don't set up RSA authentication, PickleRPC can't be enabled.
HOW TO GET RSA AUTHENTICATION GOING: you first have to generate useful key pairs. Do this with "tsctl keygen" for the server and "tsctl keygen clientname" for the client, and move the client keys to the client machine. When instantiating a RemoteToolserver instance (from the Toolserver.Client module) you can now pass pubkey, privkey and localname options. If you do, your call will carry additional HTTP headers with a SHA256 hash, an RSA signature and your name. The server will check those (only if you enabled rsaauthenticate in the config) and provide you with the same headers in the result. Due to the RSA key handling this will be noticeably slower but much more secure, as you will only allow communication with known hosts. To enable new hosts, just put their public keys into the pubkeys directory of your toolserver; the key files need to be named after the client. To disable a client, just remove its public key and restart the server.
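Of the three headers mentioned above, the content hash is plain hashlib; the RSA signature additionally needs an RSA library and your generated keys. A sketch of the digest part only (the header names here are made up for illustration, the real ones are defined by Toolserver.Client):

```python
import hashlib

# Compute a SHA256 digest of the request body, the kind of value the RSA
# scheme sends alongside the signature and client name. Header names below
# are hypothetical placeholders.

body = b'<?xml version="1.0"?><methodCall>...</methodCall>'
digest = hashlib.sha256(body).hexdigest()

headers = {
    "X-TooFPy-Content-SHA256": digest,       # hypothetical header name
    "X-TooFPy-Client": "clientname",         # hypothetical header name
}
print(digest)
```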
A NOTE ON ENCRYPTION: this toolserver framework could have easily adopted SSL as the main transport security protocol. Actually, it would have been much more natural to use it - every proxy should now support CONNECT and so should support HTTPS connections. The reason I didn't (I might opt to do it in the future, though) is that the toolserver usually doesn't run on well-known ports, and CONNECT to servers with ports other than the default ports nowadays requires administrator intervention - every port allowed for CONNECT needs to be entered in the system configuration. This is to prevent hijacking of HTTPS proxying for unauthorized outside connections. So I opted for the non-standard way of encrypting requests and responses instead - the call itself is still transported over standard HTTP and so should travel through almost every proxy you could think of, without administrator intervention or frontend servers. Remember, TooFPy isn't about building web applications, it's about building web services and distributed applications. If you want interop with other software, you should use the SOAP or XMLRPC protocols and keep off the RPC encryption stuff, as that is non-standard. You can still set up frontend servers that do the SSL stuff and put TooFPy behind them, and so can still use TooFPy as a building block for web applications, relying solely on standard protocols and formats. Just see the encryption stuff and the PickleRPC stuff as additional options if you run in a pure Python environment.
A NOTE ON COMPATIBILITY: some protocols are non-standard (for example, PickleRPC can only be used with Python) or have non-standard extensions (for example, RSA authentication is done in a non-standard way that currently only works with Toolserver.Client). If you want to access your web services from a wide range of clients, you shouldn't use those non-standard features. Just don't enable PickleRPC and don't use RSA authentication - use CRAM authentication instead. CRAM authentication uses a very simple scheme based on SHA1, so you can call methods with that authentication scheme from most languages (using Java or Perl is quite easy). CRAM authentication does rely on a shared secret, though. If you are forced to invent your own authentication mechanism, you can use context.request in your _validate methods and access the request data and headers directly - for example, to build some IP based authentication or something using identd lookups (the IP is hooked into the request via the medusa transport - you might need to dig through the medusa source to see what I am talking about). If nothing fits your needs, you can always pass user and password fields in your calls - just be aware that they are transferred in the clear over the wire, so set up some encrypted channel to communicate over. One simple way to do this is to run the toolserver only on localhost, put an Apache in front of it and let that handle all the authentication and SSL stuff.
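The general challenge-response idea behind CRAM-style authentication mentioned above looks like this: the server sends a one-time challenge, the client answers with a SHA1-based digest keyed by the shared secret, and the secret itself never travels over the wire. Toolserver's exact scheme may differ in detail; this is a generic illustration using HMAC-SHA1:

```python
import hashlib
import hmac

# Generic CRAM-style exchange: both sides derive the same digest from the
# shared secret and the server's challenge, so only the digest is transmitted.
# Illustrative sketch, not Toolserver's exact wire format.

def cram_response(secret, challenge):
    return hmac.new(secret, challenge, hashlib.sha1).hexdigest()

secret = b"shared-secret"
challenge = b"1746000000.toolserver.example"   # made-up server challenge

client_answer = cram_response(secret, challenge)
server_expected = cram_response(secret, challenge)
print(client_answer == server_expected)   # True
```

Since HMAC-SHA1 is available in practically every language, this is the kind of scheme you can reimplement in Java or Perl clients with a few lines of code.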
A NOTE ON NEW PYTHON VERSIONS: Python 2.6 deprecates the md5 and sha modules and wants you to use hashlib instead. Toolserver does so, but medusa still imports md5, so you get a deprecation warning from that old code.