Over the last month or so, I’ve been working on an application that pushes performance quite a bit. Last week I published a blog post about aspects of developing for performance. I’ll be writing a future blog post (maybe more than one) on the approaches I’ve taken for performance. But in this blog post I want to take a step back and discuss why I had to address performance, even after quite a bit of optimisation from the initial development. That’s because the learning really builds on what Notes and Domino is, and why it has strengths, even today. It also raises aspects of Notes and Domino that may be relevant for innovative approaches going forward.

History of Notes / Domino

Historically, connectivity was poor. When Notes was at its height, most people didn’t have the internet at home. Business people had dial-up connections that were slower than millennials could ever believe. Notes came into its own because of local replicas of databases, which enabled offline access with strong, reliable replication to the Domino server. Servers themselves could replicate, so different offices often had their own server. Office staff worked on the server at their office. Home-based staff replicated with the server when they came into the office. And the servers replicated with each other. Conflicts needed resolving (this was how I, as an end user, brought myself to the attention of managers and, through them, IT). But that was within the skill-set of end users.

As connectivity improved, servers were consolidated and it became less common to have a Domino server in every site. Some companies used hosted servers, moving them off premises. As the cloud gained prominence, the aim to have everything in the cloud, hosted and managed externally, became more of an end-game for IT managers. The push to move applications to the web and potentially mobile increased, so the Notes Client was seen as outdated and legacy. “Outsourcing”, “cloud-first”, “web” and “mobile” became buzzwords that any “cool” IT department had to embrace. And in the push for “cool”, I believe the reasons behind Domino’s infrastructure architecture were forgotten.

To quote Jurassic Park: people were so preoccupied with whether or not they could, they didn’t stop to think whether they should.

Application / Business Logic / Database Architecture

Let’s step back and look at the architecture of an application. The user interface is either installed or runs locally for the user. Whether it’s mobile, browser or rich client, it’s happening on the user’s device. This is true of all applications, not just Domino.

Some degree of coding logic runs on that device too. Every browser “application” (I’m ignoring basic websites, they’re sites not applications) runs JavaScript on the device, whether it’s a desktop PC, a mobile phone, or a smart-enabled device like a till. I’m yet to come across any application that consists only of HTML, CSS and basic links. If it’s a rich client application, regardless of the language, again the coding logic runs on the device.

The database is then somewhere. In Notes Client it can be on a server or locally. In a browser, it’s remote somewhere. As and when interaction is required with stored data, communication is passed via the appropriate port and protocol (HTTP/S, TCP/IP or other) to the database.

Browser Access

In the browser world this is something that, as Domino developers, we’ve not really dug deeply into. Older Domino developers will have dug into AJAX requests or iFrames to “chunk” calls and hide them a bit. But with purely JavaScript applications, the user interface and logic run on the browser client and data / files come from a server. With XPages, the partial refresh means this is done by the platform and is less commonly something we bother about coding. View Panels, Data Views etc. mean we no longer have to think about manually chunking the view contents into reasonably-sized requests. So we often don’t think about it. If a page calls a lot of data, we blame the platform and not the chosen architecture. Beyond XPages, technologies like Sass, LESS, GraphQL and HTTP/2 have gained prominence because of the need to address those elements, when architectural design alone is insufficient.
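The kind of chunking those older AJAX applications did, and which View Panels now handle for us, can be sketched as below. The `?ReadViewEntries` URL command with `Start` and `Count` parameters is standard Domino; the endpoint path and page size are illustrative assumptions.

```javascript
// Sketch: paging a large Domino view into reasonably-sized requests,
// the way AJAX-era applications "chunked" their calls by hand.
// Domino's Start parameter is 1-based.
function buildChunkUrls(viewUrl, totalEntries, pageSize) {
  const urls = [];
  for (let start = 1; start <= totalEntries; start += pageSize) {
    const count = Math.min(pageSize, totalEntries - start + 1);
    urls.push(
      `${viewUrl}?ReadViewEntries&OutputFormat=JSON&Start=${start}&Count=${count}`
    );
  }
  return urls;
}

// A 250-entry view fetched 100 entries at a time needs three requests
const urls = buildChunkUrls("/app.nsf/ByDate", 250, 100);
```

Each URL can then be fetched on demand as the user scrolls, rather than pulling the whole view in one heavy request.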

This assumes there is no business logic on the server side, that it’s just loading static HTML, passing static JSON, loading files from a file system. In most cases – even when that’s not openly visible – that’s not what happens. There is a server in between that’s doing something, running some business logic. In the case of Sass / LESS it may be dynamically parsing files to inject context-specific content into the final CSS file. In the case of GraphQL, it’s parsing the incoming request and generating the JSON dynamically for the response. In the case of basic REST services, again it has to do the same thing. We may overlook that it’s happening, but it’s happening. In the case of HTTP/2, it may be sending additional content as subsequent responses.
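As a sketch of that hidden server-side work, here is the shape of what even a basic REST service does on every request: parse the incoming query, run some business logic, and generate the JSON dynamically. All names here are illustrative, not any particular framework’s API.

```javascript
// Hypothetical REST handler: the "something" the in-between server does.
function handleRequest(query, documents) {
  // 1. Parse the incoming request
  const status = query.status || "all";
  // 2. Business logic: filter and shape the stored data
  const matches = documents.filter(
    (doc) => status === "all" || doc.status === status
  );
  // 3. Generate the response JSON dynamically
  return JSON.stringify({
    count: matches.length,
    items: matches.map((d) => d.title),
  });
}

// Illustrative data standing in for documents from a backend
const docs = [
  { title: "Invoice 1", status: "open" },
  { title: "Invoice 2", status: "closed" },
];
const body = handleRequest({ status: "open" }, docs);
```

None of this is free: every request pays for the parsing and filtering, whether or not the developer thinks of the endpoint as “static”.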

And to make it absolutely clear here, the database backend is a separate concern from that business logic. The two may sit on the same server, or the business logic may need to connect to the database via internal connections (JDBC, HTTP/S, NRPC etc.). In traditional XPages applications, that may be the same server. But it may not be, and it’s important to understand the distinction. What becomes important is how, when and how often it’s communicating. To get a better understanding of this, it’s useful to forget browser access for the moment and think about Notes Client access.

Notes Client

In the Notes Client, communication has continued very much as it always has. The Notes Client communicates via @Formula, LotusScript or Java APIs that are constantly passing messages between client and server. Every part of a complex @Formula that refers to a database or server needs to communicate with it. Every line of LotusScript or Java that refers to a database or server needs to do the same. Historically with the Notes Client, as I said, the Notes Client in question and the server / database being communicated with were in close proximity. This is what gives the smoothly scrolling views and removes any concern about huge Forms or regular round-tripping in a Form in the Notes Client.

But use that Notes Client against a remote server via a VPN and the performance isn’t the same. Why? Because you’re waiting longer for the initial view to load, and each line of code that needs to speak to the server takes longer. You’re using the Notes Client communication method to, for example, navigate the view over the kind of network connectivity a browser application was designed for. Look back at the history I gave above and you’ll see that’s not the way it was designed or the way it was designed to be used. This is why developing against remote servers in Domino Designer is a painful experience, and another reason why development should be against a nearby – or even locally installed – Domino server. Often that’s not the approach, but that’s because we’ve forgotten how Notes and Domino is designed to be used. And in the same way that user experience suffers if people stop using our applications the way they were designed to be used, so our experience suffers when we use Domino without a robust network between clients and servers.
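A crude model makes the VPN effect concrete: total wait time is dominated by the number of server round trips multiplied by the network latency, and it’s the latency, not the code, that changes when you move the server away. The figures below are illustrative assumptions, not measurements.

```javascript
// Rough model of chatty client/server protocols: each line of code that
// touches the server is (at least) one round trip, so total wait scales
// with round trips x latency.
function totalWaitMs(roundTrips, latencyMs) {
  return roundTrips * latencyMs;
}

// Suppose opening a document triggers 200 server-touching calls:
const lanMs = totalWaitMs(200, 1);  // assume ~1 ms latency on a LAN
const vpnMs = totalWaitMs(200, 60); // assume ~60 ms latency over a VPN
// Identical application code, 60x the waiting, purely from distance.
```

This is why the fix is architectural (fewer round trips, closer servers, local replicas) rather than a matter of optimising individual lines of code.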

Coming back to XPages for a moment, this is why it is inadvisable to try to reproduce a Notes Client application in XPages. It’s something Notes developers can instinctively try to do and expect XPages to handle. But the method of communication will not be as efficient, and your proximity to the server may differ.

Notes Client and SoftLayer

But if you move your Notes Client application to SoftLayer, your server isn’t in the same building. So there’s likely to be an impact on performance compared to a server in the same building and it may make more sense to use a local replica. This starts to explain why managed replicas are recommended for SmartCloud Notes mail and why Verse is the preferred approach.

It’s not the only approach, and cloud-first has been a priority for IBM. As an IBMer said to me recently, cloud-first requires thought about that architecture. And that’s why we’ve seen optimisation in recent releases (FP7, 8, 9) around view indexing. From what I’ve heard, FP9 brings some significant performance benefits because of reduced contention between read and write operations. Keeping up to date, both in terms of installed versions and admin knowledge, is key here.

XPages and Bluemix

Think about it this way, and you start to see the problems with Bluemix. For XPages on Bluemix, your application is accessed via a browser, connecting to Bluemix for the business logic, communicating with a hybrid server for your data. That’s obviously going to have a different result and different use cases to an on-premises server. My understanding is that the communication for both is still via NRPC, but the network connectivity between the data server and the business logic server will be different. In theory it seems comparable to the Notes Client against a server over VPN. Each line of SSJS / Java connecting to a server, a database, a view etc. needs to communicate remotely. Again, anyone who has taken this approach should ensure their hybrid Domino server versions are as up-to-date as possible and their Domino knowledge is current and strong.

But conversely, XPages against another backend also on Bluemix, or communicating with other services like Watson, will perform better than XPages running on premises against those same services.

XPages against Network Server

Similarly, XPages could be running against a different server. In the case of XPiNC without the “run on server” option, the code runs on the Notes Client but communicates with the data on the network server. So obviously the same impact of each line of code communicating remotely applies, though the network connectivity should be much better.

In the case of an XPages application running on a server against a different Domino server for the data, the business logic still needs to run on a server, and it’s important to ensure that it can perform well. I’ve actually seen some performance impacts with the recent project I’ve been working on – admittedly with a development server – where XPages running locally and communicating with a network server was actually quicker than XPages running on a dev server communicating with another network server. Assumptions shouldn’t be made, and benchmarking / profiling is always good.
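As a minimal sketch of that benchmarking, a simple timing harness like the one below is often enough to test an assumption before committing to an architecture. The helper and the workloads are illustrative, not from any specific project.

```javascript
// Minimal benchmarking helper: run a code path many times and record
// elapsed wall-clock time, so comparisons are measured rather than assumed.
function benchmark(label, fn, iterations) {
  const start = Date.now();
  for (let i = 0; i < iterations; i++) fn();
  const elapsed = Date.now() - start; // milliseconds
  return { label, iterations, elapsed };
}

// Example: compare two ways of building the same string
const a = benchmark("concat", () => {
  let s = "";
  for (let i = 0; i < 1000; i++) s += i;
}, 100);

const b = benchmark("join", () => {
  const parts = [];
  for (let i = 0; i < 1000; i++) parts.push(i);
  parts.join("");
}, 100);
```

In a real XPages application you would wrap the server- or database-touching calls in the same way, running each candidate architecture against the same workload before drawing conclusions.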

XPages on SoftLayer

For browser and mobile applications, solely SoftLayer vs solely on-premises will be more comparable, because the application user interface and the server-side logic / database have a similar architecture. But SoftLayer may mean it’s easier to keep up to date on modern hardware / software. Just bear in mind that any admin or support done via the Notes Client will not perform comparably.


Whatever the architecture, what’s important is to be more aware of how the architecture has an impact. As applications split components up more and more, as content is gathered from different locations, it’s important to bear this in mind. In some cases it may require innovative caching approaches, as I’ve used recently, and will cover in an upcoming blog post. It may even be necessary to code for poor performance and even failure. What we shouldn’t do is ignore the differences between communication methods and architectures. There are impacts, and we need to consider them.
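As one example of such a caching approach, a simple time-to-live cache keeps the last result of an expensive remote call and only re-fetches once it has gone stale. The API shape below is an illustrative sketch, not the implementation from my project.

```javascript
// Hypothetical time-to-live (TTL) cache: avoid repeating a remote call
// while a recent result is still fresh enough to reuse.
class TtlCache {
  constructor(ttlMs, now = Date.now) {
    this.ttlMs = ttlMs;
    this.now = now; // injectable clock, which makes the cache testable
    this.entries = new Map();
  }
  get(key, loader) {
    const hit = this.entries.get(key);
    if (hit && this.now() - hit.at < this.ttlMs) return hit.value; // fresh
    const value = loader(); // the expensive remote call goes here
    this.entries.set(key, { value, at: this.now() });
    return value;
  }
}

// Usage: repeated reads within the TTL never hit the loader again
let clock = 0;
let loads = 0;
const cache = new TtlCache(5000, () => clock);
cache.get("view", () => { loads++; return "data"; }); // loads once
cache.get("view", () => { loads++; return "data"; }); // served from cache
clock = 6000;
cache.get("view", () => { loads++; return "data"; }); // expired, reloads
```

The trade-off is staleness: the TTL has to be chosen per data set, and anything business-critical may still need a way to invalidate the cache explicitly.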

1 thought on “Notes, Domino, The Cloud and Performance”

  1. Paul, definitely agree with you. Unfortunately, a majority of Notes applications were developed quickly and there was no consideration for designing with performance in mind. As we have been building our current framework for the past 6 years, performance is now always a consideration. This is the result of experience building poorly performing Notes applications and trying to have them run on the internet with huge latency. Notes applications, with their NRPC, just choke on the internet.
