This article was prompted by my previous one on Domino application development. One of the major differences between Domino application development in the past and the approaches for Node.js-related development in the future is architecture. But that difference could also be an inhibitor for existing customers, because of the insular world Domino has inhabited.

History

I’m a firm believer that a key to understanding the future is understanding history. One of the drivers for basing the Notes Client on Eclipse, and for the use of OSGi in Domino, was “easier” extensibility. With the Eclipse-based Notes Client we got plugins for things like File Explorer from Rene Winkelmeyer, Connections, Tungle and Gist (back before RIM bought the latter two and cancelled the open source versions in favour of proprietary contacts and calendaring solutions for BlackBerry devices). Business partners could also develop extensions for customers, and I was involved in developing one for adding conferencing information into mail, calendaring and applications. I’m not sure whether this was possible in the old C++ client, but it would certainly have been more challenging. On the Domino server, OSGi plugins could extend the XPages, HTTP and DOTS environments to add fixes, add components and renderers (for Bootstrap, for example), extend the core Java and SSJS APIs, and add frameworks. Again, this would have been considerably harder, and even more specialised, before OSGi.

But all of these bundled functionality onto the existing Notes Client and Domino server, keeping them a monolithic “one-stop shop”.

There were attempts to break out of that model, with blog posts from Jesse Gallagher about using Nginx and recommendations from IBM to use IBM HTTP Server as a web server in front of Domino. I’m not sure how many companies have deployed Nginx in front of Domino – it’s in use for OpenNTF. But the recommendation to use IBM HTTP Server was doomed once POODLE hit. The initial advice to deploy IBM HTTP Server was rejected by customers, who demanded that Domino HTTP be upgraded to TLS 1.2 rather than deploying a separate, better HTTP server as a proxy. So successful was that demand that the entitlement Domino customers had to deploy IBM HTTP Server with Domino has since been removed.

The World Outside Domino

Meanwhile, the world outside Domino moved on. Cloud, and specifically cloud microservices, proliferated. In the past an application might have integrated with one external service, for example Google Translate, PayPal or some other custom REST service, but these were often restricted to large or specialist offerings. Now there are a variety of Watson services, APIs for integration with large cloud services like Office 365 or Connections, and a huge rise in IoT. Cloud is no longer just about hosting monolithic applications: it’s now more about microservices.

At the same time, developers were building smaller and smaller applications. Node.js and Vert.x enabled a web server to be bundled into the application, as did Spring Boot for Java applications. Because applications integrated with various external services – including databases and authentication services – it made less sense to build the application as a monolith. This brought added challenges, like making sure all parts of the “application” were up and running, which drove the rise of orchestration tools like Kubernetes and Istio. It also meant each “compartmented microservice” needed its own operating environment, which drove the rise of Docker.
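To make that per-service packaging concrete, here is a minimal sketch of a Dockerfile for one such Node.js microservice. The file names, port and base image are illustrative assumptions, not taken from any real Domino project:

```dockerfile
# Minimal image for one self-contained Node.js microservice.
# The service bundles its own web server, so the container needs
# nothing beyond the Node.js runtime. (Illustrative example only.)
FROM node:10-alpine

WORKDIR /app

# Install only production dependencies to keep the image small.
COPY package*.json ./
RUN npm install --production

# Copy the application source itself.
COPY . .

# Port the bundled web server listens on (an assumption for this sketch).
EXPOSE 3000
CMD ["node", "server.js"]
```

Each microservice gets its own image like this, which is what lets an orchestrator such as Kubernetes scale or replace one “part” of the application independently of the rest.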

The challenges were offset by the benefit of being able to scale up and down as required, and also to swap different “parts” of the application in and out. Need a new front end? That’s all you need to change and deploy. Changing the database layer is easier, because the architecture has encouraged a more segmented code structure. Want to get feedback on a new UI? Redirect those users to a different front-end server and leave everyone else where they are.

Packaging everything into a WAR file and deploying it to a large-scale application server like Tomcat or WebSphere Liberty was challenged by this new architecture.

Domino V10

When Node.js support for Domino was announced earlier this year, the initial plan was for Node.js to run within the Domino HTTP task. That changed, for very good reason. The gRPC layer that Node.js talks to – the Proton task – is very efficient. So efficient that Jason Roy Gary generated a huge bill with a cloud provider by maxing out a load of Node.js applications… which the gRPC layer of Domino handled with ease!

The requirements of scalability of Node.js as well as the expectations of the JavaScript community outside of Domino made it more sensible to allow developers to deploy their own Node.js applications with Express web servers packaged in them.

But this probably means a learning curve for managers, administrators and developers. A monolith is binary: everything is working or nothing is working. A microservices architecture is less clear-cut. It also requires deployment to a different architecture. Those who are already using cloud services may be ready to hit the ground running. Those who have stuck to on-premises and/or virtualised environments, each with their own OS, will likely need to change.

Regardless, the architectural changes and challenges are ones DevOps teams need to be ready for. Developers will need to become more adept at building their development environments accordingly. Administrators will need to become more adept at tuning and managing the architecture of the microservices that make up the application. And managers need to be aware that hardware and software changes may be needed – and indeed expected, if they seek to recruit JavaScript developers – even for experimentation. Some of this I’ll be covering in subsequent blog posts and videos, and others have already blogged about it. My knowledge is fledgling – as my XPages knowledge was when I started – but it’s been good enough for development purposes. Learn from elsewhere as well, and please add best-practice recommendations and corrections in the comments where appropriate. But to properly embrace the future and the power of Domino V10 and beyond, it’s time to step beyond the Domino server and learn.

2 thoughts on “The Changing Domino Environment Architecture”

  1. What percentage of Domino apps are you covering? I still build mostly 100% big monolithic apps.

    Frankly, I have no idea where to start learning. I wonder whether any other developer knows, or whether IBM/HCL is going to guide us through this.

    XPages was a big investment for each developer. I think Domino v10 is even more.

    1. Percentage is not really relevant, I think. Pre-XPages, not a large percentage of applications used data from a different NSF. Now, most of my applications do. Architecture comes down to what’s available and what’s possible, and the introduction of Node.js is one of the biggest changes we’ve had.

      Applications have tended to be monolithic because that’s what we’re used to and that’s what was possible. Similarly, many XPages applications are what I would call “Notes applications on the web” rather than web applications built with XPages. By that I mean navigation on the left, because that’s how it works in the Notes Client; categorised views and sorting, because that’s what there is in the Notes Client.

      But equally, I’ve built REST services for applications in addition to the existing data interface (dominoDocument datasource, Java class + methods etc.). Going forward there’s a change of approach: the REST service becomes both the interface for external integration and the web user interface for your own application. Similarly, web access may be via a Node.js application that provides an interface or external integration (though with different restrictions on different APIs). The big question is how this works for Notes Client applications.

      The strength of a microservices architecture is its granularity. Developing a new web interface for an existing XPages application would mean redeveloping the whole application, but developing a new web interface to replace, say, Angular 1 with React, or adding a React Native UI for mobile access, is far less significant. That matters all the more when web frameworks have ever-shorter lifespans.

      I agree that it’s a huge learning curve. But unlike XPages, the only proprietary parts are the API classes and methods (e.g. server.useDatabase()). Those won’t take huge learning, particularly compared to other technologies, and this is why cross-learning between existing Notes developers and newly employed JavaScript developers is so important. There have been discussions with IBM about how to modernise developers. But IBM and HCL shouldn’t be prescribing whether or not you use Docker, which front-end framework to use, or whether you code in Node.js directly or use low-code options like Node-RED. Developers shouldn’t be looking to them to do so. They may be the right place to host webinars, samples etc., especially since NotesIn9 seems less active. But choice in this wider world is often down to personal preference, and the only thing that relates to Domino is the Node.js APIs themselves, not where they’re used.

      There are various training courses available on the web on sites like Coursera or Lynda, as well as self-paced tutorials. There are also blog posts on Node.js from Tim Davis and on Docker and Domino from Tim Clark. Beyond that, it’s a case of choosing your preferred IDE (Visual Studio Code, Atom, Sublime etc.), your preferred front-end framework (Vue, React, Angular), and picking up experience of Node.js from whatever you’re learning. This is why conferences throughout 2018 have been so critical – ICON UK, CollabSphere and Engage all had sessions giving an introduction to the new technologies. Some of those sessions may be repeated next year, but those who regularly attend and speak will be looking for new learning. A single “Introduction to XPages” jumpstart session was repeated for several years at Lotusphere. Repeating intro sessions on Vue, React, Node.js and others starts to squeeze the slots available for other sessions aimed at returning delegates, and I don’t expect them at user groups beyond this year at the latest.
