Lying Through Their Teeth: Easy vs. Simple

January 14th, 2008  |  Published in CORBA, design, distributed systems, REST, WS-*

I have to say that I agree with Ryan Tomayko on this one.

Among other things, Ryan touches on one of the favorite assertions of the REST detractors, which is that REST can’t be effective without an interface/service/resource definition language. After all, without such a language, how can you generate code, which in turn will ease the development of the distributed system by making it all look like a local system? Not surprisingly, the first comment on Ryan’s blog entry is exactly along these lines.

As I’ve been saying for years, trying to reverse-map your programming language classes into distributed services, such as via Special Object Annotations, is an attempt to turn local design artifacts into distributed ones, which we learned long ago is just plain wrong. You often end up paying for such shortcuts in areas such as reliability, flexibility, extensibility, versioning, reusability, and especially scalability.

Back in the halcyon days of CORBA, we generated code from OMG IDL, but IDL is not a local design artifact. OMG IDL was designed from Day One to define distributed systems (though we did add the “local” keyword to IDL sometime around 1999 or so to allow for easier local call optimizations). Note also that unlike the usual approach to defining WSDL, we never reverse-generated IDL from C++, Java, or any other programming language (though a questionable group eventually did come along and, trying to ride the Java popularity wave, define an OMG standard reverse IDL mapping for Java, despite strenuous objections from a number of us, including me). IDL also allowed for generating code in different programming languages for different parts of the same system. But the RPC roots of CORBA, its interface specialization requirements, and the inflexibility of the generated code, especially with respect to versioning, ultimately limited CORBA’s possibilities when it came to medium- to large-scale systems.

Proponents of definition languages seem to assert that such languages help with understandability. Such languages, they say, are required because they alone tell you how to invoke the service, what to pass to it, and what to expect in return. The problem with the way they make this assertion, though, is they make it sound like the application figures all that stuff out on its own with no human involvement. What happens in reality is that an actual human programmer sits down, reads the interface definition, more than likely reads some comments in the definition or a whole separate document that describes the interface in more detail, and perhaps even talks to the person who wrote the interface definition in the first place. Based on the knowledge gained, he then writes the application to call that interface. Similarly, with REST, you read the documentation and you write your applications appropriately, but of course the focus is different because the interface is uniform. Depending on the system, and assuming REST as implemented by HTTP, you might also be able to interact with it via your browser to help understand how it works, which I’ve found extremely valuable in practice (and yes, this works for application-to-application systems that are not designed primarily for browsers or human consumption). But ultimately, there’s no magic, regardless of whether or not you have a definition language.

What the proponents of definition languages seem to miss is that such languages are primarily geared towards generating tedious interface-specific code, which is required only because the underlying system forces you to specialize your interfaces in the first place. Keep in mind that specialized interfaces represent specialized protocols, and IDL was developed oh so long ago to generate the nontrivial code required to have RPC applications efficiently interact over such protocols, since back then computers and networks were far slower and less reliable than they are today, and getting that code right was really hard. When you have a uniform interface, though, the need to generate interface-specific interaction code basically goes away.
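
To make that concrete, here’s a minimal sketch (purely illustrative; the host, paths, and media types are made-up assumptions) of what generic uniform-interface client code can look like in Python. The same few lines work against any resource, identified only by its URI, with no per-interface stubs to generate:

    import http.client

    def get_resource(host, path):
        # Fetch a representation of any resource via the uniform GET method.
        conn = http.client.HTTPConnection(host)
        conn.request("GET", path, headers={"Accept": "application/xml"})
        resp = conn.getresponse()
        body = resp.read()
        conn.close()
        return resp.status, resp.getheader("Content-Type"), body

    def put_resource(host, path, representation, content_type="application/xml"):
        # Update any resource by PUTting a new representation to its URI.
        conn = http.client.HTTPConnection(host)
        conn.request("PUT", path, body=representation,
                     headers={"Content-Type": content_type})
        resp = conn.getresponse()
        resp.read()
        conn.close()
        return resp.status

    # The same two functions handle orders, customers, or anything else;
    # only the URIs and the representations differ.
    status, ctype, body = get_resource("example.org", "/orders/42")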

(BTW, the first IDL I ever saw was at Apollo, where it was used not only for RPC in the Apollo Network Computing System (NCS) but also to define Domain/OS header files once and generate them into their C and Domain Pascal equivalents, rather than writing and maintaining them twice, once for each language.)

Some REST proponents like WADL. I’ve looked at it but haven’t used it, so I can’t really comment on it. I’ve never felt the need to seek out a resource definition language of any kind for my REST work, at least to date. YMMV.

BTW, on a somewhat related note, I still use CORBA, contrary to what some jackasses out there would like you to believe. In some industries, certain CORBA interfaces are standardized and even legally enforced. In others, leading players have defined CORBA interfaces for 3rd-party integration. These interfaces work, so those companies and industries have no intention of changing them to another technology anytime soon, and in fact they simply have no need to change them at all. I’ve had to work within some of these CORBA scenarios lately, and I have to say I’ve found it to be fun, like meeting up with an old friend you haven’t seen in a while. I’m sure many of these interfaces could be done better with REST, but they work as is, and there’s just no need to throw them out. Coincidentally, Ryan spoke of CORBA when he responded to the commenter mentioned above. All in all, I remain proud of my CORBA work over the years, as we did a lot of good stuff back then, even if since then we’ve found simpler ways of doing a few things.

Responses

  1. Pavel Rodionov says:

    January 14th, 2008 at 7:23 am

    What REST implementation do you use in practice? We’re currently trying to implement some services in a REST style, but every solution we’ve tried lacks something. Maybe we’re constrained because we’re on the Java side (we know that Rails and Django exist). We’ve been trying the following:
    1) REST plugin for Struts2
    2) Restlet framework
    3) RESTfaces

    On the client side we found that Flex and Flash applications can’t make the appropriate HTTP method calls, which pushes us toward POX, which isn’t truly REST for us. So what’s the strategy today: wait for some vendor to implement a ubiquitous REST framework, or start down our own REST path?

  2. steve says:

    January 14th, 2008 at 3:32 pm

    Pavel: I don’t use a REST framework. I’ve heard good things about Restlet, but have never tried it. I write my REST code in Python and Erlang (never been a big fan of Java, sorry). As for the client side, yes, there can be restrictions depending on your client software, so you either have to choose clients that allow for the proper methods, get the service to support alternatives that work with the methods available to you, or maybe even write a gateway that translates your client’s methods into proper ones that it forwards to the service.
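
    To sketch the gateway idea a bit (this is only illustrative: the backend host is made up, and the X-HTTP-Method-Override header is just one common convention, not something your particular clients necessarily use), a tiny WSGI app along these lines can accept POSTs from a limited client and forward them to the real service using the proper method:

        import http.client
        from wsgiref.simple_server import make_server

        BACKEND_HOST = "backend.example.org"  # hypothetical real service

        def gateway(environ, start_response):
            # Translate a limited client's POST into the method it really meant.
            method = environ["REQUEST_METHOD"]
            override = environ.get("HTTP_X_HTTP_METHOD_OVERRIDE")
            if method == "POST" and override in ("PUT", "DELETE"):
                method = override

            path = environ.get("PATH_INFO", "/") or "/"
            query = environ.get("QUERY_STRING", "")
            if query:
                path = path + "?" + query
            length = int(environ.get("CONTENT_LENGTH") or 0)
            body = environ["wsgi.input"].read(length) if length else None

            # Forward the translated request to the actual service.
            conn = http.client.HTTPConnection(BACKEND_HOST)
            conn.request(method, path, body=body,
                         headers={"Content-Type": environ.get("CONTENT_TYPE",
                                                              "application/octet-stream")})
            resp = conn.getresponse()
            data = resp.read()
            conn.close()

            start_response("%d %s" % (resp.status, resp.reason),
                           [("Content-Type", resp.getheader("Content-Type",
                                                            "application/octet-stream"))])
            return [data]

        if __name__ == "__main__":
            make_server("", 8080, gateway).serve_forever()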

  3. Peter Cousins says:

    January 14th, 2008 at 3:34 pm

    Steve, is this finally a clue about Verivue? ;-)

  4. steve says:

    January 14th, 2008 at 3:50 pm

    Peter: not as far as I can tell. :-)

  5. Dmitry says:

    January 14th, 2008 at 7:47 pm

    Steve,

    I think that definition/schema/constraint languages can help describe complex representations better. We can guess or induce the structure of simple representations by looking at several examples, by “browsing” resources, and by reading informal comments.

    But for more complicated representations I don’t mind looking at a schema (a more formal definition). Schemas (and constraints) can also be used for efficient validation and screening of representations.

    If we have to use a static language (or framework) to build a client for a REST-based service, then schemas can help us build (manually or automatically) “helper” classes. But these static classes can “freeze” representations and restrict our ability to build more adaptive clients.

    Dynamic languages can build helpers on the fly, so we don’t feel this “freezing” effect. XSLT, XQuery, and XForms support flexible data representations by design, so they also play well with REST-based services.

    I think that in the ideal situation, all components of a RESTful solution (including the GUI) would support dynamic, flexible data representations. In that case schemas can play a positive role: they don’t create a “freezing” effect and don’t restrict the ability to adapt.

  6. Patrick Mueller says:

    January 15th, 2008 at 8:20 am

    I posted a response here.

    [Steve remarks: I’m including Patrick’s link to his response above, but I want to note that the entire premise of his response is based on the idea that I wrote my posting about schema, which is incorrect. Patrick even titled his response “steve vinoski on schema.” Unfortunately for Patrick, I was not writing about schema, and the word “schema” does not even appear in my posting. I was writing about interface definition languages, which are not at all the same thing as schema languages. So take his response with a large grain of salt.]

  7. rascunho » Blog Archive » links for 2008-01-18 says:

    January 18th, 2008 at 3:33 pm

    […] lying-through-their-teeth-easy-vs-simple (tags: steve.vinoski.net 2008 mes0 dia18 at_tecp REST WSDL blog_post webservices) […]