For many months now – dating back to the middle of last year when Domino app dev was in limbo – I’ve been wrestling with what seems to me to be an elephant in the room with regard to Domino, NoSQL and REST. At the time many were looking at alternatives to part or all of the Domino stack. An alternative backend was one area developers looked at, many considering alternative NoSQL databases like MongoDB or Cloudant. Using a JavaScript web development framework (e.g. Angular or React) talking to the database layer via REST services was another. Neither option really appealed to me. At the heart of that decision was an obvious aspect that didn’t seem to receive the acknowledgement it deserved. It has also been at the heart of my choice for building custom REST services.

CRUD

The focus of REST access to Domino seemed predominantly based on reproducing the style of access to other NoSQL backends. Namely REST access for basic CRUD operations: the ability to create, read, update and delete documents. This is what Domino Access Services provided, albeit imperfectly, because read operations returned far more than was necessary and create / update operations offered unsatisfactory options for validation. And the demos I’ve seen of migrating Domino apps to other NoSQL backends focus on migrating apps where basic CRUD operations are sufficient – for example the Discussion Template.
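
For illustration, this is roughly what a basic DAS read looks like from Java. The server, database path and UNID below are placeholders, and Domino Access Services has to be enabled on the database:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class DasReadExample {
    public static void main(String[] args) throws Exception {
        // Placeholder server, database and UNID; DAS exposes documents
        // under /api/data/documents/unid/{unid}
        URL url = new URL("https://server.example.com/requests.nsf"
                + "/api/data/documents/unid/ABCD1234ABCD1234ABCD1234ABCD1234");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        conn.setRequestProperty("Accept", "application/json");

        // DAS returns every item on the document as JSON, whether the
        // consumer needs it or not (the "far more than necessary" problem)
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
```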

NSF Interaction

But Domino pre-dates REST access to NoSQL databases. It pre-dates basic CRUD access to backends. Consequently, interaction with the NSF is via a much more sophisticated set of APIs – whether that be in Formula Language, LotusScript, Java, or a mixture of those. Even where interaction with the NSF was from a browser, although read access might have been via URL commands that gave basic CRUD operations, that was rarely sufficient for a Domino web application. Instead, there were URL commands for ?OpenAgent and, more recently, URL command access to web services. Agents and web services then used the more sophisticated and flexible backend languages. This is a key factor worth bearing in mind amid the desire to embrace what’s “cool”.
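
For context, that pre-REST pattern looked something like the minimal sketch below: a Java agent invoked via ?OpenAgent, reading the request from the CGI-style items on the document context and writing its own response. The agent name and workflow step are illustrative only:

```java
import java.io.PrintWriter;
import lotus.domino.AgentBase;
import lotus.domino.AgentContext;
import lotus.domino.Document;
import lotus.domino.Session;

public class ApproveAgent extends AgentBase {
    public void NotesMain() {
        try {
            Session session = getSession();
            AgentContext ctx = session.getAgentContext();

            // For web-invoked agents the HTTP request is exposed as
            // CGI-style items on the in-memory document context
            Document request = ctx.getDocumentContext();
            String query = request.getItemValueString("Query_String");
            // e.g. ...?OpenAgent&unid=ABCD1234

            // ... run the workflow logic against the target document here ...

            // The agent writes its own response back to the browser
            PrintWriter out = getAgentOutput();
            out.println("{\"status\": \"Approved\"}");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
```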

What Is The Sweet-Spot for Domino Applications?

And this brings me to the other key factor. Setting aside the main IBM templates, the vast majority of Domino applications focus on a common purpose: workflow. Another key aspect of Domino is hierarchies, namely parent and child documents related for a specific reason. CRUD, however, focusses on creating a single document, reading a document or a collection of similar documents, updating a document, or deleting a document. Typical actions in Domino applications are updating approval fields on documents that are not exposed to editing via normal CRUD operations, sending notifications at the same time as an approval or CRUD operation, updating a parent and its associated children at the same time, or updating a child and its associated parents at the same time. This is not done by separate CRUD actions, as with other NoSQL databases. Because Domino has a sophisticated and flexible set of backend APIs, it’s done in a single chunk of code – behind a button, in a querySave / postSave event, in an agent, and so on. CRUD could be used to change the status of a document, but it would not as seamlessly update a set of related documents, send an email, create another notification document, or post to Watson Workspace.
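
As a rough illustration of that single chunk of code, here is the kind of unit of work the Domino back end handles naturally, sketched against the lotus.domino Java API. The class, form and field names are purely hypothetical:

```java
import lotus.domino.Database;
import lotus.domino.Document;
import lotus.domino.DocumentCollection;
import lotus.domino.NotesException;
import lotus.domino.Session;

// Hypothetical workflow helper: approves a request, cascades the status to its
// response documents and sends a notification, all in one call.
public class WorkflowService {

    public void approveRequest(Session session, Database db, String unid)
            throws NotesException {
        Document parent = db.getDocumentByUNID(unid);

        // Approval fields that should never be editable via a generic CRUD endpoint
        parent.replaceItemValue("Status", "Approved");
        parent.replaceItemValue("ApprovedBy", session.getEffectiveUserName());
        parent.save(true, false);

        // Cascade the change to the child (response) documents
        DocumentCollection children = parent.getResponses();
        Document child = children.getFirstDocument();
        while (child != null) {
            child.replaceItemValue("ParentStatus", "Approved");
            child.save(true, false);
            Document next = children.getNextDocument(child);
            child.recycle();
            child = next;
        }

        // Notify the requester as part of the same operation
        Document memo = db.createDocument();
        memo.replaceItemValue("Form", "Memo");
        memo.replaceItemValue("Subject", "Your request has been approved");
        memo.send(parent.getItemValueString("Requester"));
    }
}
```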

The Elephant in the Room

So why the desire to get a “primitive” CRUD REST API for Domino? SmartNSF seems to be addressing this to some extent with pre and post code for its REST services. But it remains to be seen whether it will manage the kinds of simultaneous workflow notifications many of our apps use and take for granted. A custom REST service – just like a LotusScript agent – can. But LotusScript is not designed to deal well with REST service calls – handling request objects, flexibly dealing with headers, converting JSON to LotusScript objects, easily reading JSON content, easily building JSON content, handling HTTP error codes etc. Nor is it easily extensible to allow the community to fill that gap. Node requires communication from JavaScript to the C API layer, which is platform-specific and has a limited set of experts in the community. Java has it all already, with the Apache Wink servlet and JAX-RS libraries. That can be used from Domino, from Vertx, from CrossWorlds, or beyond Domino.
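
As a minimal sketch of what that can look like, assuming a JAX-RS runtime such as Apache Wink is wired up and reusing the hypothetical WorkflowService from the earlier sketch, here is a custom resource. The paths, database name and session helper are assumptions, not a definitive implementation:

```java
import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

import lotus.domino.Database;
import lotus.domino.Session;

// Hypothetical JAX-RS resource wrapping the WorkflowService sketched earlier.
@Path("/requests")
public class ApprovalResource {

    @POST
    @Path("/{unid}/approve")
    @Consumes(MediaType.APPLICATION_JSON)
    @Produces(MediaType.APPLICATION_JSON)
    public Response approve(@PathParam("unid") String unid, String body) {
        try {
            Session session = getSessionForCurrentRequest();
            Database db = session.getDatabase("", "requests.nsf"); // hypothetical path
            new WorkflowService().approveRequest(session, db, unid);
            return Response.ok("{\"status\":\"Approved\"}").build();
        } catch (Exception e) {
            // Map back-end failures onto a meaningful HTTP status code
            return Response.status(Response.Status.INTERNAL_SERVER_ERROR)
                    .entity("{\"error\":\"" + e.getMessage() + "\"}")
                    .build();
        }
    }

    private Session getSessionForCurrentRequest() {
        // Placeholder: obtain a lotus.domino.Session in whatever way the host
        // environment provides (Domino OSGi servlet, Vertx, CrossWorlds, etc.)
        throw new UnsupportedOperationException("environment-specific");
    }
}
```

The point is that the consumer gets one purposeful endpoint rather than a set of generic CRUD calls, and the approval fields never become directly editable from outside it.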

For me, so many Domino apps are intrinsically linked with custom workflow and a single update process updating multiple documents that it seems unnatural to try to shoehorn into CRUD operations what has been natural and straightforward using the Domino APIs. And the custom functionality required is too varied to expect a template from IBM to provide it. There are a variety of options, both provided by IBM (the Extension Library, or agents if you wish) and from the community (SmartNSF, ODA Starter Servlet). These are a better fit for the kind of functionality I tend to see in most of the applications I’ve built.

5 thoughts on “CRUD, NoSQL, Domino and Workflow”

  1. Paul, may I know what your proposed solution is? I think I read an article saying that ODA or DAS has a speed issue when getting a document; would this be a problem for you?

    1. “Performance” is another topic I have strong and probably unconventional opinions about! I really need to write a blog post on that too. I’ve not found the speed issue of getting a document to be an issue in custom REST services I’ve built. There tends to be a lot of focus on that in articles, but the speed of getting a document on the server has to be considered in the wider context of the speed of the whole request. A quick network means a few more milliseconds getting the document is still acceptable. A slow network means saving a few milliseconds getting the document won’t be perceptible, because there’s a bigger issue. ODA Graph architecture heavily leverages getDocumentByUNID() rather than navigating views, which again improves speed on the server side. Looking at the speed to and from the server, a conventional REST service sends everything the provider thinks any given consumer might need. A GraphQL approach trims it down to what the consumer asks for, so that makes for another win. On top of that, from what I’ve read this week, HTTP/2 allows concurrent pushes of resources and cuts the size of what’s sent, and although that can’t be done on Domino today, it can be done on Vertx running its own web server in an architecture analogous to CrossWorlds. But that too would use the Java API and probably ODA too. But some situations (an internal app, or an app only used via decent connections) won’t really need that, so why over-complicate things? But, for me, the key is a REST API that handles workflow requirements without CRUD-based workarounds, as well as one that prevents editability of restricted fields outside specific REST service endpoints.

    1. Yes, I have been working with it over the last couple of weeks. I’ve had it accessing names.nsf from Eclipse and a fat jar (with Notes.jar as an external jar). It may make more sense (and be easier for running) to bundle it. It works against the Notes Client from Eclipse, though I’m not totally sure what changed to suddenly make it work. Once I get my head round benefits like HTTP/2 and get into bundling a client-side app in src/main/resources, there is definitely potential for microapps using Java and JS.

  2. I fully agree that CRUD implementations of REST won’t help much with modernization of Notes applications.
    Personally I’m a fan of custom REST implementations built from the customer domain’s point of view. The REST requests and responses are much shorter, and the approach hides the internal data structure.
    Generic CRUD approaches are only useful for proofs of concept.
