I didn’t realise this post was still in draft, so it’s taken much longer than expected to publish. But the timing is quite good, because this covers Node-RED, and a blog post was published this week covering some of the content from the slides of my session with Fredrik Malmborg at Engage. Still, better late than never.

In part one we covered Notes Client and XPages. In part two we covered Domino Mobile Apps. Technically, in both the previous parts we’ve covered interacting with a Domino application via a REST service. That’s nothing new – a LotusScript agent could have been a poor man’s REST service endpoint. XAgents are one option, an OSGi plugin is another, Smart NSF is a further one. In previous Domino versions web service providers and consumers were available and, although web services are still in some use, REST services are a more open and modern alternative. Ideally a REST service should be accompanied by an OpenAPI 3.0 specification, and I’ve covered previously how that could be built and delivered in a secured manner within an NSF.

With the final part I’m stepping beyond traditional approaches and tools, and beyond the comfort zone of many. Also, none of the code is in the NSF, so it’s not in the GitHub repository. The starting point is DQL Explorer and you would need the source code from the DQL Explorer GitHub repository. The Domino server will need configuring, as in the “Configure” section of the README. But we’re making changes to the existing code, so we won’t be using DQL Explorer on the Domino server, we’ll be using it locally against the Domino server.

DQL Explorer is a Node.js application, so you should first be familiar with Node.js and your chosen IDE. Familiarity with both comes from taking time to step beyond the HCL sites and the Domino community. The strength of standard app dev approaches, frameworks and tools is that they are standard; the challenge is that they are constantly evolving, and best practice means embracing those communities. There are tutorials on the Node.js website, and my preferred IDE is Visual Studio Code, which also has a Node.js tutorial. Different people will choose and need different things. The community may be able to offer advice, but pro code and microservices development is about integrating various tools and solutions. If you’re lucky enough to get the answer to your question, great. But being able to understand what you’re doing and work some things out for yourself will be critical.

So I opened the repository in Visual Studio Code. As the “Configure” section of the README says, I updated the “proxy” setting in the package.json to point to my Domino server. As the documentation says in the “Available Scripts” part, I ran npm install to download all the dependencies and npm start to start running the application locally. Then I was ready to start developing.

DQL Explorer is a Node.js application with a React UI. There’s a lot that’s new here, but there are similarities to XPages – providing you’ve gone beyond the basics: each JavaScript file is a page or component, and it’s the render() function that prints HTML and JavaScript to the browser. I’m modifying the Results.js file. From line 152, it creates a CommandBar component for the button definitions in this._getCommandItems. That function returns an array of button definitions, and I added an additional one, “Send as Spreadsheet to Connections”. I just reused an existing button image. Different applications will use different image libraries, so you need to work out what yours is using and what to use instead.
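As a sketch, the extra entry added to the array returned by _getCommandItems() could look like this. The property names follow the CommandBar item shape DQL Explorer already uses; the key, icon name and handler here are hypothetical:

```javascript
// Hypothetical sketch of the extra button definition appended to the array
// returned by _getCommandItems() in Results.js. The key, icon and handler
// names are assumptions, not the actual DQL Explorer code.
function getCommandItems(onSendToConnections) {
  return [
    // ...the existing button definitions (Download JSON, etc.) go here...
    {
      key: 'sendToConnections',
      name: 'Send as Spreadsheet to Connections',
      iconProps: { iconName: 'ExcelDocument' }, // reusing an existing icon
      onClick: onSendToConnections
    }
  ];
}
```

The CommandBar then renders the new button alongside the existing ones, with no other changes needed to the component.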

I want to send JSON in this button, so I took the code of the _onDownloadJSON() function and modified it.

The first two lines are the same: get the results returned, which are stored as JSON in DQL Explorer, and stringify them so they can be passed to a REST service. But instead of passing the JSON back to the browser, we post it to a local endpoint that’s not part of the DQL Explorer app (running locally on port 3000) and not the Domino server (which is running on port 80 / 443). We make that request using an npm module called axios, which DQL Explorer is already using and which I’ve had some familiarity with from the To Do application John Jardin built for our IBM Think 2018 session. We make the request and, if we get a response in the 200 range, let the user know it’s been uploaded successfully; otherwise we assume an error.
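A minimal sketch of that handler, based on the description above: the Node-RED URL and port (1880, its default) are assumptions, and the HTTP client is passed in as a parameter, standing in for the axios instance DQL Explorer already imports:

```javascript
// Hypothetical sketch of the modified handler derived from _onDownloadJSON().
// The endpoint URL and port are assumptions; `client` stands in for the
// axios instance DQL Explorer already uses.
async function sendResultsToNodeRed(results, client) {
  const json = JSON.stringify(results); // results are already held as JSON
  const response = await client.post('http://localhost:1880/receiveJson', json, {
    headers: { 'Content-Type': 'application/json' }
  });
  if (response.status >= 200 && response.status < 300) {
    return 'Uploaded successfully'; // surface this message to the user
  }
  throw new Error('Upload failed with status ' + response.status);
}
```

In the real component the returned message or error would feed whatever notification mechanism the UI uses.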

The endpoint we’re calling is on Node-RED. The significant nodes we’re using here are Stefano Pogliani’s Connections nodes for Node-RED, which make interfacing with Connections data significantly easier than any other approach. The flow itself looks quite complex, so we’ll step through it. There are also some parts here that are specifically proof of concept and would not work in a production application.

The entry point is the http in node, which receives a POST request to the “receiveJson” endpoint on the Node-RED server. Everything there corresponds to what’s in lines 559 – 564 of the code block in the React DQL Explorer application. That then goes to the “Set FilePath” function node, which is quite an involved one.

It uses the flow object, which holds variables scoped to the current Node-RED flow tab. A specific communityId will have been set there at startup, retrieved from Connections Cloud (or any Connections instance) by name. You’ll notice from the screenshot of the flow that this node has two output points. The first part of the if statement routes through the second output by returning null for the first output and the msg object for the second. If the communityId is available in the flow, we create a random string and store it as an Excel filename in both the “filepath” and “filename” properties of the msg object. We also store the communityId. We then ensure we only trigger the first output point by returning the msg object as the first element in the array and null as the second.
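The logic could be sketched like this. It’s wrapped in a function here so it stands alone; inside a Node-RED function node the body runs directly, with flow and msg provided by the runtime, and returning a two-element array routes the message to one of the node’s two outputs. The file path prefix is an assumption:

```javascript
// Sketch of the "Set FilePath" function node logic. In Node-RED the body
// would run directly with `flow` and `msg` in scope; it's wrapped in a
// function here so the sketch is self-contained. The /tmp/ prefix is an
// assumption about where the excel node writes.
function setFilePath(flow, msg) {
  const communityId = flow.get('communityId');
  if (!communityId) {
    // community lookup failed at startup: route to the second output
    msg.payload = 'Community not set';
    msg.statusCode = 500;
    return [null, msg];
  }
  const random = Math.random().toString(36).slice(2, 10); // random string
  msg.filepath = '/tmp/' + random + '.xlsx'; // where the excel node writes
  msg.filename = msg.filepath;               // where the file in node reads
  msg.communityId = communityId;
  return [msg, null]; // trigger the first output only
}
```

Setting both filepath and filename to the same value matters for the nodes further down the flow, as we’ll see shortly.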

If all has gone well, we then hit an excel node. This node uses msg.filepath to give it a filepath relative to the Node-RED server to store the resulting Excel document created from the JSON object passed into the http node.

That then routes to the file in node. Although the label above it says “Save to Docker”, this node is actually reading the Excel file saved by the excel node and outputting a single buffer object. This node expects the msg.filename property to define where to read the file from. Now you understand why we set two different properties to the same value in the function node.

This then calls the Upload File node, a Connections node to upload a file to Connections. The configuration for the node specifies the user credentials with which to connect, so the file will always be created by a specific user. This node also uses msg.communityId to specify the community into which to upload the file and msg.filename to specify the filename to save it as. The filename will also get appended with a timestamp, just to ensure it’s always unique – that’s a side-effect of the Node-RED node.

You’ll notice from the flow that this node connects to two nodes. The bottom one is self-explanatory: the File Delete node does what it says on the tin. It deletes the Excel file we created on the server and is there just for cleanup. The second one is a function node, and that sets msg.payload = msg.fileUrl. This property has been set by the Upload File node and is the URL of the file on Connections. That means we could potentially display it as a link for the user.
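That second function node is a one-liner. Again wrapped in a function here so the sketch stands alone; in Node-RED the body runs directly against msg:

```javascript
// Sketch of the function node after Upload File: expose the Connections
// file URL (set on msg.fileUrl by the Upload File node) as the payload,
// so the http response node returns it to the caller.
function setPayloadToFileUrl(msg) {
  msg.payload = msg.fileUrl;
  return msg;
}
```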

Finally we end with an http response node, which will either return this URL if we were successful, or the error and a status code of 500 from the function node under “Community not set” if the first function node was unable to retrieve the communityId.

One problem I found, and I’m not sure where the problem lies, but the Excel file cannot be opened by the Connections viewers. It looks fine and opens fine in Excel. I’m not sure if there’s a problem with the spreadsheet the Excel node creates from JSON, or whether it’s a limitation with the Connections viewers. I didn’t investigate in depth, and I’m not sure how I could.

But this shows how even a modern application can be “modernised” by adding additional functionality and additional integration. It also shows how Node-RED can be used for easy integration between Domino and Connections, and how there are additional nodes for file operations which make that a trivial task.
