Integrating GCP Secret Manager with App Engine environment variables

Showing a convenient way to use Secret Manager to securely pass sensitive data as environment variables to Google App Engine (GAE) services running on Node.js.

Unfortunately, App Engine doesn’t deliver an out-of-the-box solution for passing env vars from Secret Manager like the one available in Cloud Functions via the --set-secrets option of gcloud functions deploy.

This article shows a convenient way to achieve this using a simple npm package. The goals are:

  • Direct use of Secret Manager secret references in the standard App Engine deployment descriptor.
  • Minimal impact on the code.
  • No vendor and platform lock-in, no hard dependency on App Engine. The solution should still run in any other environment.
  • Should work with CommonJS as well as ESM/ECMAScript.

Let’s get to it…

Integrating Secret Manager with App Engine

Setting-up a secret in GCP Secret Manager

First, create one or more secrets in Secret Manager of your GCP project. Here, the secret is named MY_SECRET and has a reference path of projects/100374066341/secrets/MY_SECRET.

[Screenshot: secret MY_SECRET in GCP Secret Manager]

For a more detailed guide on how to enable Secret Manager and create secrets, please refer to this section about secure configuration management in Cloud Functions.

Granting Secret Manager rights to the GAE service account

In order to resolve secrets from Secret Manager, the service account principal running your App Engine service – by default PROJECT_ID@appspot.gserviceaccount.com – must have at least the Secret Manager Secret Accessor role. For more details refer to the Secret Manager access control documentation.

To do so, go to IAM in the console and edit the App Engine principal. There, click “Add another role” and search for Secret Manager Secret Accessor and save, like so.

[Screenshot: adding the Secret Manager Secret Accessor role in IAM]
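
Alternatively, the role can be granted via the gcloud CLI. A minimal sketch – my-project is a placeholder for your actual project ID:

# grant the Secret Accessor role to the default App Engine service account
gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:my-project@appspot.gserviceaccount.com" \
  --role="roles/secretmanager.secretAccessor"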

Referencing a secret in app.yaml

In the standard app.yaml deployment descriptor of your App Engine service, create an appropriate environment variable in the env_variables section containing the secret’s reference path. Like so…

service: my-service
runtime: nodejs20

env_variables:
  MY_SECRET: "projects/100374066341/secrets/MY_SECRET/versions/latest"

Note that you have to add /versions/latest to reference the latest version of the secret, or /versions/x to reference the version with number x, e.g. /versions/2. For details see referencing secrets.

Add the gae-env-secrets package to your project

Next, add the gae-env-secrets package as a dependency in your project. This will provide the functionality to retrieve Secret Manager values for environment variables used in App Engine.

npm i gae-env-secrets --save

Use the Secret Manager value in your code

Import the gae-env-secrets package in your code and call its async getEnvSecrets function. Once completed, you’ll be able to access the values stored in GCP Secret Manager by simply accessing the env vars used in the deployment descriptor. Works with CommonJS as well as ESM.

CommonJS

const { getEnvSecrets } = require('gae-env-secrets');

getEnvSecrets().then(() => {
  const secret = process.env['MY_SECRET']; // value of MY_SECRET from Secret Manager
});

ESM

import { getEnvSecrets } from 'gae-env-secrets';

await getEnvSecrets();
const secret = process.env['MY_SECRET']; // value of MY_SECRET from Secret Manager

That’s it. You can now seamlessly use Secret Manager secret values in your App Engine services by referencing env vars.

To learn more on how the gae-env-secrets package is working and how its usage can be customized, read on.

Under the hood

Referencing secrets in the deployment descriptor

To reference secrets in the app.yaml deployment descriptor, you’ll need to pass the versioned reference of the secret from Secret Manager. This has the form of…

projects/[Project-Number]/secrets/[Secret-Name]/versions/[Version-Number|latest]

To retrieve the reference path of a secret’s version in Secret Manager, simply click “Copy resource name” in the three-dots menu behind a version. Specifying latest as the version instead of a number will always supply the highest active version of a secret.

[Screenshot: copying a secret version’s resource name in Secret Manager]

Then pass the secret’s reference to the desired variable in the env_variables block of the deployment descriptor, like so…

env_variables:
  SECRET_ENV_VAR: "projects/100374066341/secrets/MY_SECRET/versions/1"

For more details, refer to the app.yaml reference.

Determining the runtime environment

gae-env-secrets evaluates environment variables to detect if it is running directly in App Engine. If the following env vars are both present, the library assumes it’s running in GAE and substitutes relevant env vars with their respective secret values from Secret Manager:

  • GAE_SERVICE
  • GAE_RUNTIME

If these two env vars are not present, the library won’t do anything. So it should be safe to call it unconditionally in your code without interfering with local development, testing etc.

To simulate running under GAE, simply set those two env vars to anything.
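
For example, in a local shell – the values below are arbitrary, any non-empty strings work:

GAE_SERVICE=my-service GAE_RUNTIME=nodejs20 node index.js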

Substituting env vars from Secret Manager

If running under GAE is detected, calling getEnvSecrets will iterate through all env vars and substitute the value with the corresponding secret retrieved from Secret Manager if one of the following conditions is true:

  • The name of the env var ends with _SECRET (default suffix) or another deviating suffix passed via the options
  • Auto-detection is enabled via options and the value of the env var matches a Secret Manager secret reference

For accessing the Secret Manager, the library uses the package @google-cloud/secret-manager.
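
Conceptually, resolving a single reference with that client looks roughly like the following – a minimal sketch, not the package’s actual code:

const { SecretManagerServiceClient } = require('@google-cloud/secret-manager');

async function resolveSecret(reference) {
  // reference is a full path like 'projects/.../secrets/MY_SECRET/versions/latest'
  const client = new SecretManagerServiceClient();
  const [version] = await client.accessSecretVersion({ name: reference });
  // the secret payload is returned as a Buffer
  return version.payload.data.toString('utf8');
}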

Error handling

By default and for security reasons, the library will throw an error if substituting an env var’s value from Secret Manager fails for any reason…

  • secret reference is invalid
  • secret is inactive or not present
  • invalid version number
  • missing permissions to access Secret Manager
  • or else…

So make sure to use appropriate error handling with try/catch or .catch().
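
A minimal sketch of guarding the call (ESM, where top-level await is available; exiting on failure is just one possible strategy):

import { getEnvSecrets } from 'gae-env-secrets';

try {
  await getEnvSecrets();
} catch (err) {
  console.error('Resolving secrets failed', err);
  process.exit(1); // assumption: fail fast if secrets are unavailable
}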

To change this behaviour, use the strict property available in the options.

Passing options to getEnvSecrets

You can pass an options object when calling getEnvSecrets to customize the behaviour. The following options are available.

suffix

Type: String – Default: _SECRET

All env vars whose name ends with the suffix will be substituted with secrets from Secret Manager.

Pass a different suffix to substitute the env vars of your choice.

// will substitute all env vars ending with '_KEY'
getEnvSecrets({ suffix: '_KEY' });

strict

Type: Boolean – Default: true

By default, strict is true, which means that an error will be thrown if a secret cannot be resolved.

Setting strict to false will change this behaviour so that the error is only written to console.error. The value of the env var(s) where the error occurred will remain unchanged.

// error will only be logged and respective env vars remain unchanged
getEnvSecrets({ strict: false });

autoDetect

Type: Boolean – Default: false

The autoDetect feature enables automatic detection of env var values containing a Secret Manager secret reference; those are substituted regardless of the env var’s name and suffix.

This feature is additional to the provided suffix, meaning that all env vars ending with the suffix AND all automatically detected ones will be substituted.

To turn on this feature, pass true in the options object.

// turn on autoDetect
getEnvSecrets({ autoDetect: true });

Example: With this feature enabled, the following env var would be substituted with version 2 of the secret MY_SECRET regardless of the suffix because it contains a Secret Manager secret reference as its value.

env_variables:
  VAR_WITH_ANY_NAME: "projects/00112233/secrets/MY_SECRET/versions/2"

Considerations & limitations when using gae-env-secrets

Please keep in mind the following points when using this solution.

  • Since the getEnvSecrets function is async you’ll need to await the result or chain on it using .then to be able to work with the secret values. CommonJS does not support top-level await – see the sketch after this list.
  • As the env var secrets are resolved at runtime of your code, any top-level code of other modules that is executed upon require/import cannot make use of the secret values and instead would see the secret references as values of the env vars.
  • Resolving the secrets from Secret Manager using the underlying Google library will usually take 1-2 seconds.
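
For CommonJS, this typically means bootstrapping the app inside the .then() callback, so that modules reading the env vars are only loaded after resolution. A minimal sketch – ./app is a hypothetical module reading process.env at load time:

const { getEnvSecrets } = require('gae-env-secrets');

getEnvSecrets().then(() => {
  // only require modules that use the secret env vars after resolution
  const app = require('./app');
  app.listen(3000);
}).catch((err) => {
  console.error(err);
  process.exit(1);
});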

Summary

This article shows how to integrate Secret Manager easily with App Engine by using one simple package and a few lines of code. No vendor or platform lock-in is created.

However, once Google supplies an out-of-the-box feature that makes the integration work like in Cloud Functions, switching to it should be considered to overcome the limitations of this solution, e.g. secret resolution at runtime.

Happy coding πŸ™‚

CommonJS vs. ESM/ECMAScript cheat-sheet

A short comparison of the most common statements in CommonJS vs. ESM/ECMAScript for importing, exporting and in your package.json.

See the tables below for a brief comparison of the most used statements that should cover the vast majority of use-cases.

Importing

| Use-Case | CommonJS | ESM / ECMAScript |
| --- | --- | --- |
| Default import (NPM module) | const imp = require('module'); | import imp from 'module'; |
| Default import (own module) | const imp = require('./myModule'); – path is mandatory, file extension is optional | import imp from './myModule.js'; – path and file extension are mandatory |
| Named import | const { namedImp } = require('module'); | import { namedImp } from 'module'; |
| Import with function invocation | const funcImp = require('module')(myParam); | import imp from 'module'; const funcImp = imp(myParam); – ESM doesn’t support invocations on importing, so two lines of code are needed |

Exporting

| Use-Case | CommonJS | ESM / ECMAScript |
| --- | --- | --- |
| Default export (unnamed) | module.exports = function() {/* */} | export default function() {/* */} |
| Named export (e.g. a function; works also with objects, classes etc.) | module.exports.myFunction = function() {/* */} | export function myFunction() {/* */} |
| Exporting an arrow function | module.exports.myFunction = () => {/* */} | export const myFunction = () => {/* */} – the const keyword is needed here |

package.json entries

| Use-Case | CommonJS | ESM / ECMAScript |
| --- | --- | --- |
| Module type | nothing, or "type": "commonjs" – since CommonJS is the default, normally no type entry is present in package.json | "type": "module" – tells Node.js to treat all .js files as ES modules without renaming them to .mjs, which is often the preferred approach |
| Entry point | "main": "index.js" | "exports": "./index.js" – path and file extension are mandatory |

Please note that this cheat-sheet is just an excerpt of all possible module.exports/require and import/export constellations as well as all available package.json options. For more details, refer to the comprehensive official documentation.

If you are about to migrate from CommonJS, also check out the article on converting an existing Node.js project to ESM.

Happy coding πŸ˜‰

Destructuring nested object properties in JavaScript

A quick tutorial on how to use destructuring for accessing nested JavaScript object properties.

This is a really short article on an interesting JavaScript feature I wasn’t aware of for a long time: destructuring of nested object properties. It is really helpful when you need to deal with only some properties of a heavily sub-structured object passed around, for example parts of a Redux state object in a React app.

Normal JavaScript object destructuring

First things first. Let’s assume you have the following JavaScript object.

const person = {
  name: 'John',
  address: {
    country: 'USA',
    city: 'New York',
    details: {
      street: 'Broadway',
      number: 3155
    }
  }
}

To access only the name property of the object when it’s passed over, you can use standard destructuring like so…

const writeName = ({ name }) => {
  console.log(name);
}

writeName(person);
// John

Naive approach to access nested properties using destructuring

But what if you need to access the city property of the object, and only that? With standard destructuring the approach would look something like this.

const writeCity = ({ address }) => {
  console.log(address.city);
}

writeCity(person);
// New York

That’s definitely working, but there is an even more elegant approach to directly access the desired property instead of still needing to step down one level via address.city.

Accessing nested object properties using deep destructuring

Using the following syntax for deep object destructuring, a direct extraction of the needed nested property is possible.

const writeCity = ({ address: { city } }) => {
  console.log(city);
}

writeCity(person);
// New York

That’s it! A really elegant way to access nested object properties. This also works for deeper nested properties – just repeat the destructuring syntax using colons. The innermost destructured properties will be available as local variables. And of course you can also destructure a list of properties, like so…

const writeAddressDetails = ({ address: { details: { street, number } } }) => {
  console.log(street);
  console.log(number);
}

writeAddressDetails(person);
// Broadway
// 3155

BTW: this is an official JavaScript feature also included in the JavaScript documentation for object destructuring. But it’s a bit hidden πŸ˜‰
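
One caveat worth knowing (an addition beyond the original example): deep destructuring throws a TypeError if an intermediate object is missing. A default value guards against that…

const { address: { city } = {} } = { name: 'Jane' }; // no address present
console.log(city);
// undefined (instead of a TypeError)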

Putting it all together

const person = {
  name: 'John',
  address: {
    country: 'USA',
    city: 'New York',
    details: {
      street: 'Broadway',
      number: 3155
    }
  }
}

const writeName = ({ name }) => {
  console.log(name);
}

const writeCityLong = ({ address }) => {
  console.log(address.city);
}

const writeCity = ({ address: { city } }) => {
  console.log(city);
}

const writeAddressDetails = ({ address: { details: { street, number } } }) => {
  console.log(street);
  console.log(number);
}

writeName(person);
// John

writeCityLong(person);
// New York

writeCity(person);
// New York

writeAddressDetails(person);
// Broadway
// 3155

Happy coding πŸ™‚

Convert an existing NodeJS project from CommonJS to ESM / ECMAScript

A quick guide for converting your existing NodeJS projects from CommonJS to ESM/ECMAScript, including Express, Jest, Supertest and ESLint.

ECMAScript or ESM modules are the official way of developing JavaScript software today and many projects and libraries are moving towards this format to leverage its advantages. Support for CommonJS, which was the de facto standard for NodeJS projects so far, is even being dropped by some famous libs. The question is: how can your existing NodeJS projects be migrated?

Fortunately, for the vast majority of projects this should be a no-brainer. In this guide we’ll migrate a minimal project using popular libs and tools like Express, Jest, Supertest and ESLint from CommonJS to an ESM project.

tl;dr – fast-lane conversion

For the impatient, here’s the straightforward, uncommented list of steps to convert an existing project using Jest and ESLint from CommonJS to ESM. For more details on each step and other side notes see further below.

  1. In package.json: add "type": "module" and "exports": "./start.js", remove the "main": "start.js" entry.
  2. Replace all module.exports statements with export and all require statements with import in your source files.
  3. Set or add "sourceType": "module" and "ecmaVersion": "latest" in the parserOptions section of your ESLint configuration.
  4. Replace the "jest" start command in package.json with "NODE_OPTIONS=--experimental-vm-modules npx jest".

That’s already it. You should now be able to run the project and all tests using ESM. Keep on reading for an example and more details on the conversion steps.

Example project on GitHub

This post is accompanied by the node-commonjs-to-esm example project on GitHub which uses Express, Jest, Supertest and ESLint. The project comes with two branches commonjs and esm showing the original state using CommonJS and the migrated one using ESM.

# clone the example project
git clone https://github.com/tsmx/node-commonjs-to-esm.git

# install needed dependencies
cd node-commonjs-to-esm
npm install

# check out the original CommonJS project
git checkout commonjs

# check out the migrated ESM project
git checkout esm

To start the project or test suite – regardless of the branch you are in – run the following commands.

# run the project to start a simple server on localhost:3000 
# with GET routes for '/', '/route1' and '/route2'
npm run start

# run the Jest tests
npm run test

Note: When switching between the ESM and the CommonJS branch, a new npm install is not necessary as the conversion doesn’t affect the dependencies.

Required NodeJS version

To make full use of ECMAScript/ESM and convert your projects accordingly, a NodeJS version of at least v12.20.0 or v14.13.0 is needed.

Please note that if you are on an older version of NodeJS it’s highly recommended to update anyway, since these versions are quite old and even support for v14 had ended at the time of writing this article. For the example, NodeJS v18 LTS was used.

CommonJS to ESM conversion steps in detail

ESM modifications to package.json

To switch from CommonJS to ECMAScript/ESM we’ll first make two slight changes to our package.json:

  • Replacing "main": "start.js" with "exports": "./start.js"
    Please note the leading "./" as with ESM every reference to own files/modules has to be a full pathname including the directory and also file extension.
  • Adding "type": "module"
    This is to tell NodeJS we are on ESM now and prevents you having to rename all *.js files in your project to *.mjs as proposed in some guides. Also with ESM we can stay with the *.js file extension as it is still normal JavaScript.
# in package.json

# before
"main": "start.js",

# change to
"exports": "./start.js",
"type": "module",

For more details on exporting the entry point and the module type declaration refer to the official NodeJS documentation for package.json fields.

Replacing require/module.exports with import/export

Now the biggest change has to be done: in all of your source code files you have to replace the statements for exporting and importing, as module.exports and require are no longer supported with ESM.

For unnamed exports do the following changes…

// before: CommonJS - unnamed export

module.exports = app;

// after: ESM - unnamed export

export default app;

And for named exports of functions…

// before: CommonJS - named exports

module.exports.route1 = function (req, res) { ... }
module.exports.route2 = (req, res) => { ... };


// after: ESM - named exports

export function route1 (req, res) { ... }
export const route2 = (req, res) => { ... };

Please note the const keyword instead of function when exporting an arrow function. For a complete list of available export statements for other types like arrays, classes, literals and so on please refer to the export statement documentation.

After changing all the exports, let’s move on with replacing require by import…

// before: CommonJS requiring in dependencies

const express = require('express'); // standard module
const app = require('./app'); // own module    
const routes = require('./handlers/routes'); // own module with named exports routes.route1 and routes.route2

// after: ESM importing dependencies

import express from 'express'; // without trailing .js
import app from './app.js'; // with trailing .js
import * as routes from './handlers/routes.js'; // '*' to import all named exports, with trailing .js

Note that own modules must always be imported by providing the path and file extension, whereas standard modules installed via npm (like express) don’t.

Importing named exports can also be done selectively if you need only some imports from a module by using destructuring in curly braces like so…

import { route1, route2 } from './handlers/routes.js';

For a full list of available import declarations please refer to the import statement documentation.
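
One more migration note not covered by the example project: a conditional require has no static import equivalent, but ESM offers the dynamic import() function, which returns a promise. A hedged sketch – needsHelper and ./helper.js are hypothetical:

if (needsHelper) {
  // dynamic import returns a promise resolving to the module namespace
  const { default: helper } = await import('./helper.js');
  helper();
}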

You should now already be able to start the migrated ESM project by running npm run start.

Updating the ESLint configuration

Although everything is running, you should notice some ESLint errors in your code…

[Screenshot: ESLint errors on import/export statements]

This is because ESLint isn’t aware that you’ve switched from CommonJS to ESM. To fix that, simply add the following entries at the top level of your ESLint configuration – in case of the sample project the .eslintrc.json file.

"parserOptions": {
  "ecmaVersion": "latest",
  "sourceType": "module"
},

Now ESLint knows to validate against ECMAScript/ESM syntax with the latest features and no errors should show up anymore. For a complete description of the available options have a look at the ESLint parsing docs.

Making Jest working again

The last thing to fix is running the unit tests with Jest. Invoking npm run test would give you an error like this…

[Screenshot: Jest error when running tests with ESM]

To fix this, we change the start command for Jest in our package.json as suggested in the Jest documentation for ECMAScript modules.

# before: CommonJS Jest start script under 'scripts'

"test": "jest"

# after ESM Jest start script

"test": "NODE_OPTIONS=--experimental-vm-modules npx jest"

Although this feature is still considered experimental, the tests are now running again.

[Screenshot: Jest tests passing under ESM]

That’s it! The project is now completely converted to ESM.

You may also check out the CommonJS vs. ESM cheat-sheet for further reading.

Happy coding πŸ™‚

Express: passing dates in an URL using route parameters including regex validation

Showing an elegant way of passing dates to your REST APIs and web services using Express route parameters and regex validation as standard features.

When implementing RESTful APIs or other web services in NodeJS with Express, you will sooner or later come to the point where you need to pass dates as parameters – something like YYYY-MM-DD. This article provides a way to do that while getting the most out of Express’s standard features.

Query strings and route parameters quick intro

For this showcase let’s assume you want to write an API with a GET route /valueofday on localhost:3000. This route needs to receive a specific day as an input parameter to serve the values for this day.

To pass a date value to your Express application you have two possibilities in general:

  • Passing as a query string parameter:
    http://localhost:3000/valueofday?day=20220210
  • Passing as a URL route parameter:
    http://localhost:3000/valueofday/20220210

If you pass the required date as a query string like that, it will be available through the variable req.query.day in the request handler.

Although this is an easy way of grabbing the value ‘20220210’, you should keep in mind that a completely unchecked string value is returned. So every bit of post-processing needed to create a date – splitting the string, validating that numbers were passed, casting to numbers etc. – is completely up to you.
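
Just to illustrate the kind of boilerplate this leads to – a hypothetical sketch, not part of the final solution:

app.get('/valueofday', (req, res) => {
  const day = req.query.day || ''; // expected format: 'YYYYMMDD', e.g. '20220210'
  if (!/^\d{8}$/.test(day)) {
    return res.status(400).send('Invalid date');
  }
  const queryDate = new Date(+day.slice(0, 4), +day.slice(4, 6) - 1, +day.slice(6, 8));
  // logic using queryDate goes here...
});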

This would bloat your APIs with a lot of boilerplate code having no business value. In this article you will see how to use out-of-the-box Express features to…

  • Separate the passed values for year, month and day
  • Ensure that only valid numbers are passed

…without the need for any self-written code. This helps keep your REST APIs and web services clean and robust. So let’s go for it.

Exploring Express route parameters features

Recognizing that query strings would be a sub-optimal solution, let’s go ahead with route parameters in Express. To pass a day value to our route, we would naively specify it like so:

app.get("/valueofday/:day", (req, res) => {
    // logic goes here...
});

With that, Express will make the last URL portion available as req.params.day. Anyway, this is not any better than using a query string as it still returns an unchecked arbitrary string we expect to be in the form of YYYY-MM-DD.

Fortunately, Express has some nice out-of-the box features for route parameters we can use to optimize that.

Route parameter separators

First, we’ll use the Express parameters separation feature. To construct a JavaScript date object, we’ll need the values for YYYY, MM and DD separately. Here, Express can do the work for us by using the allowed '-' or '.' route parameter separators, like so:

app.get("/valueofday/:year-:month-:day", (req, res) => {
  // logic goes here...
});

This will give us three variables req.params.year, req.params.month and req.params.day instead of one. E.g. when calling http://localhost:3000/valueofday/2022-04-11 the params would be year=”2022″, month=”04″ and day=”11″. So far, so good!

But the returned variables are still of arbitrary value. So how to ensure that only numbers are passed?

Regex validation for route parameters

To check the passed URL route parameters, we will make use of the regex parameter validation Express offers. For every route parameter you can add a regular expression (in brackets) which it should be validated against. For our showcase, let’s use trivial regular expressions to ensure that only numbers of correct length can be passed, like so:

app.get("/valueofday/:year(\\d{4})-:month(\\d{2})-:day(\\d{2})", (req, res) => {
  // logic goes here...
});

This will make Express ensure that req.params.year is a four-digit number and req.params.month and req.params.day are two-digit numbers. Of course you can use much more sophisticated regexes in your solution for tighter checks. This is just to give you an idea of how this feature works.

Putting it all together: passing a date to an Express API

Having our YYYY-MM-DD values separated in req.params and having ensured we’ll only get passed numbers, we can now easily create our JavaScript date object in the request handler for further processing. The easiest way is to use the unary + operator to convert everything to a number.

app.get("/valueofday/:year(\\d{4})-:month(\\d{2})-:day(\\d{2})", (req, res) => {
  let queryDate = new Date(
    +req.params.year,
    +req.params.month - 1,
    +req.params.day
  );
  // logic using queryDate goes here...
});

Please note the “-1” here for the month value as the JavaScript date constructor treats this as an index value with a range of 0..11.

That’s it. We are now able to pass a date value to an Express API without the need of any boilerplate code.
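
One caveat beyond the original scope: the regexes only check digit counts, not calendar validity – e.g. 2022-13-40 would still match the route, and the JavaScript Date constructor silently rolls such values over. If you need that level of validation, a small plausibility check comparing the constructed date back against the input helps; a hedged sketch:

function isValidDate(year, month, day) {
  const date = new Date(year, month - 1, day);
  // rolled-over values (e.g. month 13) won't match the original input anymore
  return date.getFullYear() === year && date.getMonth() === month - 1 && date.getDate() === day;
}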

Testing the solution

For testing purposes we’ll put together a minimal working Express solution with a catch-all route indicating any miss of our date-passing route.

var express = require("express");
var app = express();

app.get("/valueofday/:year(\\d{4})-:month(\\d{2})-:day(\\d{2})", (req, res) => {
  res.status(200).json({
      queryDate: new Date(
          +req.params.year,
          +req.params.month - 1,
          +req.params.day
      )
  });
});

app.get('*', (req, res) => {
  res.status(404).send('Non-existing route');
});

app.listen(3000);

Now let’s use curl to fire some requests and check the result…

$ curl -w "\n" localhost:3000/valueofday/2022-01-11
{"queryDate":"2022-01-11T00:00:00.000Z"}

$ curl -w "\n" localhost:3000/valueofday/2022-01-1
Non-existing route

$ curl -w "\n" localhost:3000/valueofday/1999-03-11
{"queryDate":"1999-03-11T00:00:00.000Z"}

$ curl -w "\n" localhost:3000/valueofday/1999-03-111
Non-existing route

$ curl -w "\n" localhost:3000/valueofday/1999-03.11
Non-existing route

$ curl -w "\n" localhost:3000/valueofday/1999-03-10
{"queryDate":"1999-03-10T00:00:00.000Z"}

$ curl -w "\n" localhost:3000/valueofday/abcd
Non-existing route

$ curl -w "\n" localhost:3000/valueofday/
Non-existing route

Happy coding πŸ™‚

MongoDB: add/update fields referencing other existing fields using aggregation pipelines

A short explanation of how to add or update fields in documents referencing the values of other, already existing fields by leveraging the aggregation pipeline framework.

Initial document situation

Let’s assume you have the following documents with hourly temperature values in a collection called temperatures…

[
  {
    _id: ObjectId("62508bd0742bfb98b29dbe71"),
    date: ISODate("2022-04-08T08:00:00.000Z"),
    tempC: 7.3
  },
  {
    _id: ObjectId("62508bf0742bfb98b29dbe8c"),
    date: ISODate("2022-04-08T09:00:00.000Z"),
    tempC: 7.8
  },
  {
    _id: ObjectId("62508c02742bfb98b29dbe93"),
    date: ISODate("2022-04-08T10:00:00.000Z"),
    tempC: 8.5
  }
]

The given temperature in field tempC is in degrees Celsius, but you may also need the temperature in degrees Fahrenheit. For various reasons it’ll make sense to have the Fahrenheit values persisted in MongoDB instead of always calculating them on the fly in your application.

So you want to add a field tempF to every document which holds the temperature in Fahrenheit. The calculation formula for that would be easy: tempF = tempC * 1.8 + 32. But how to achieve that in MongoDB?

Caveats with common update operators

Trying to solve this simple-appearing task with the basic MongoDB functions, you will quickly face the following problems:

  • The $set operator used to add new fields or update existing ones cannot be used together with expressions. It only takes plain values, so it is not possible to reference other fields.
  • The traditional MongoDB update operators like $mul and $inc which would be needed here to calculate tempF are not sufficient. This is because they are only made to change an existing field in-line together with a fixed value, e.g. “add 10 to a field” or “multiply a field by 2”.

So what is the way to go in MongoDB for adding or updating a field whose resulting value references another existing field?

Updating documents using aggregation pipelines

Starting with MongoDB 4.2, it is possible to use the powerful aggregation pipeline with updates. This enables the usage of aggregation pipeline operators in normal update statements. These operators are more flexible than the traditional ones and allow expressions referencing other fields by using the ‘$…’ field path notation.

Having this in mind, we can now add the needed field using a $set aggregation pipeline stage with $add and $multiply as follows…

db.temperatures.updateMany(
  {},
  [{ $set: { tempF: { $add: [ { $multiply: ['$tempC', 1.8] }, 32] } } }]
);

The new field tempF is now added to every document based on the already existing tempC field…

[
  {
    _id: ObjectId("62508bd0742bfb98b29dbe71"),
    date: ISODate("2022-04-08T08:00:00.000Z"),
    tempC: 7.3,
    tempF: 45.14
  },
  {
    _id: ObjectId("62508bf0742bfb98b29dbe8c"),
    date: ISODate("2022-04-08T09:00:00.000Z"),
    tempC: 7.8,
    tempF: 46.04
  },
  {
    _id: ObjectId("62508c02742bfb98b29dbe93"),
    date: ISODate("2022-04-08T10:00:00.000Z"),
    tempC: 8.5,
    tempF: 47.3
  }
]

Note: As the $set aggregation operator would overwrite the value of the specified field if it already exists in the document, this approach also works perfectly for updating fields.
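
If you only want to backfill documents that don’t have the field yet – an assumption beyond the example – the filter part of the update can express that:

db.temperatures.updateMany(
  { tempF: { $exists: false } }, // only documents still missing the field
  [{ $set: { tempF: { $add: [ { $multiply: ['$tempC', 1.8] }, 32] } } }]
);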

At the end of the day, that was easy – right? πŸ˜‰

Testing process.exit with Jest in NodeJS

Demonstrating a pragmatic way to test code paths resulting in process.exit using the Jest test library.

The test scenario

With NodeJS, Jest is a very popular and powerful testing library. Consider you want to test the following function in your project…

function myFunc(condition) {
  //
  // ...do "stuff"
  //
  if (condition) {
      process.exit(ERROR_CODE);
  }
  //
  // ...do "other stuff"
  //
 }

To reach full test coverage, you’ll need to set up a test for the branch where condition is true. Setting up such a test in Jest without any precautions would result in a real exit of the test process before it is finished, which would cause a failed test.

Mocking process.exit

To safely test the branch where condition is true, you have to mock process.exit. Of course this mocking has to be done in a way that the following code – “other stuff” – is never executed, just as if the original process.exit would kick in.

To achieve that, we use Jest’s spyOn to implement a mock for process.exit. The mock method introduced with mockImplementation will throw an exception, replacing the original implementation which would exit the entire process. The mock function will receive a number (the error code) as argument just like the original exit function.

const mockExit = jest.spyOn(process, 'exit')
  .mockImplementation((number) => { throw new Error('process.exit: ' + number); });

This mock ensures that the execution of our test function ends immediately without doing “other stuff” and without ending the Jest test process. Also, this mock serves to check if process.exit was really called and what the exit code was. We do this with Jest’s toHaveBeenCalledWith test function.

Putting it all together

To get the test case finally up and running, we have to wrap our function execution in an expect( ... ).toThrow() statement because it is now throwing an exception in the mock implementation. Also, it is a good practice to restore the original mocked function by calling mockRestore to avoid unintended side-effects.

Assuming we want to test a process exit code of -1, our final test case would look like this…

it('tests myFunc with process.exit', async () => {
  const mockExit = jest.spyOn(process, 'exit')
      .mockImplementation((number) => { throw new Error('process.exit: ' + number); });
  expect(() => {
      myFunc(true);
  }).toThrow();
  expect(mockExit).toHaveBeenCalledWith(-1);
  mockExit.mockRestore();
});
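
If several tests mock process.exit, restoring in an afterEach hook avoids repeating the cleanup – a common Jest pattern, not part of the original example:

let mockExit;

beforeEach(() => {
  mockExit = jest.spyOn(process, 'exit')
      .mockImplementation((number) => { throw new Error('process.exit: ' + number); });
});

afterEach(() => {
  mockExit.mockRestore(); // alternatively: jest.restoreAllMocks()
});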

The complete example code is available in a GitHub repo.

Happy coding πŸ™‚

JavaScript Map: sorted for-loop with a custom iterator

An elegant way to do a sorted iteration over all entries of a JavaScript Map with a for-loop, using a custom iterator implemented with Symbol.iterator, function* and yield*.

Your data in a Map

When you need fast key-based access to your data, storing it in a JavaScript Map is perfect. So let’s assume you have the following data of different countries, with their country code as key and the full name and number of your active users in that country as the value or payload.

var countries = new Map();

countries.set('DE', { name: 'Germany', activeUsers: 15000 });
countries.set('PL', { name: 'Poland', activeUsers: 13900 });
countries.set('UK', { name: 'United Kingdom', activeUsers: 14500 });

Having the Map object in place, it is very easy and efficient to access every dataset by its key, e.g. countries.get('UK'). Also checking if a particular key exists with countries.has('DE') or getting all keys as an Array with countries.keys() is well supported by JavaScript’s Map.

But what if you also need the Map’s entries sorted by an attribute of their value part and want to iterate over them with a for loop? E.g. listing all countries descending by their number of active users? Well, this is where things start to become a bit more tricky – so let’s see how to solve this.

Iterate the Map – standard (unsorted) way

Maps support a convenient way to iterate over all their entries with for...of and forEach. However, these iterations will always return the elements of the Map in the order they were inserted. For our example above this looks like this.

for (let [key, info] of countries) {
  console.log(key + ' - ' + info.name + ', ' + info.activeUsers);
}

// DE - Germany, 15000
// PL - Poland, 13900
// UK - United Kingdom, 14500

Straight-forward approach – creating a sorted Array out of the Map

The obvious approach to achieve an ordered iteration according to the number of active users in each country would be to create an array out of the Map’s entries using the spread operator (…). This can then be sorted using standard Array functionality.

let sortedCountries = [...countries.entries()].sort((a, b) => b[1].activeUsers - a[1].activeUsers);
for (let [key, info] of sortedCountries) {
    console.log(key + ' - ' + info.name + ', ' + info.activeUsers);
}

// DE - Germany, 15000
// UK - United Kingdom, 14500
// PL - Poland, 13900

However, this approach has a major drawback. The sortedCountries variable has to be refreshed every time an entry is added or deleted in the Map. This is because it is created from a shallow copy of the Map contents using the spread (…) operator and therefore represents only the state of the Map at a certain point in time (precisely: the state of the Map when sortedCountries is set).

let sortedCountries = [...countries.entries()].sort((a, b) => b[1].activeUsers - a[1].activeUsers);

// ...

countries.set('IT', { name: 'Italy', activeUsers: 14200 });

for (let [key, info] of sortedCountries) {
    console.log(key + ' - ' + info.name + ', ' + info.activeUsers);
}

// Italy missing in output...

Elegant approach – custom iterator with function* and yield*

To overcome the drawbacks of the naive approach, JavaScript provides the functionality of creating custom iterators with Symbol.iterator. To do so, we have to assign a generator function to this iterator symbol. A generator function is declared with function* and returns an iterable generator object. Within this generator function, we’ll yield-return all entries of the Map sorted in the way we want using the yield* operator.

Ahh… what’s he saying? Let’s first see the code, then it becomes far easier to understand how it works…

countries[Symbol.iterator] = function* () {
    yield* [...countries.entries()].sort((a, b) => b[1].activeUsers - a[1].activeUsers);
}

For better understanding, let’s break down these two lines of code.

With countries[Symbol.iterator] we define a new custom iterator. Or in other words, we declare what will be used/iterated when the countries Map is put into a for…of loop.

This iterator is then assigned a so-called generator function with the keyword function*. Such a function returns – or yields – an iterable object that is used by the calling for loop.

With […countries.entries()].sort((a, b) => b[1].activeUsers - a[1].activeUsers) we create a new array containing all the entries of the countries Map and sort it descending by the number of active users each country has.

Instead of specifying a separate yield for each element of the sorted array being returned from the function, we use the yield* expression. This automatically iterates over the array and returns (yields) every element in the iterable result. So the above code is just a short form for…

countries[Symbol.iterator] = function* () {
    let sorted = [...countries.entries()].sort((a, b) => b[1].activeUsers - a[1].activeUsers);
    for (const [key, value] of sorted) yield [key, value];
}

Since an iterator is always automatically evaluated when it is used in a for…of loop, we only need to declare it once. All changes to the Map – adding or deleting entries – are automatically reflected by the iterator when looping over it again.

Sorted for-loop over a Map – putting it all together

Now let’s put the complete example together. All the magic happens in just two lines of code…

var countries = new Map();

countries.set('DE', { name: 'Germany', activeUsers: 15000 });
countries.set('PL', { name: 'Poland', activeUsers: 13900 });
countries.set('UK', { name: 'United Kingdom', activeUsers: 14500 });

// *** Here comes the magic :) ***
countries[Symbol.iterator] = function* () {
   yield* [...countries.entries()].sort((a, b) => b[1].activeUsers - a[1].activeUsers);
}

console.log('with custom iterator:');
for (let [key, info] of countries) {
    console.log(key + ' - ' + info.name + ', ' + info.activeUsers);
}

// DE - Germany, 15000
// UK - United Kingdom, 14500
// PL - Poland, 13900

// Add an entry to the Map and check if sorting is still valid
countries.set('IT', { name: 'Italy', activeUsers: 14200 });

console.log('after adding an entry:');
for (let [key, info] of countries) {
    console.log(key + ' - ' + info.name + ', ' + info.activeUsers);
}

// DE - Germany, 15000
// UK - United Kingdom, 14500
// IT - Italy, 14200
// PL - Poland, 13900
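
If you need this pattern for several Maps, the custom iterator can be produced by a small generic factory – a hedged sketch going beyond the original example:

function withSortedIterator(map, compareFn) {
    // attach a custom iterator yielding the Map's entries sorted by compareFn
    map[Symbol.iterator] = function* () {
        yield* [...map.entries()].sort(compareFn);
    };
    return map;
}

withSortedIterator(countries, (a, b) => b[1].activeUsers - a[1].activeUsers);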

A public Gist is also available for that example.

Happy coding πŸ™‚

Use CommonJS npm packages in the browser with Browserify

A convenient way to use CommonJS npm packages in client-side JavaScript using Browserify.

Sometimes you might need to use functionality already implemented in a “good old” CommonJS npm package in client-side JavaScript. Unfortunately there’s no require to directly use it there, so another solution is needed.

I recently needed my own human-readable package in a client-side script. Here is one convenient way to accomplish this using Browserify.

Initial situation

Let’s assume you have the following client-side JavaScript that dynamically adds rows to a table of uploaded files and their sizes given in bytes.

// File: js/upload-utils.js

function createUploadTableEntry(filename, size) {
    var uploadTable = document.getElementById('upload-table');
    var row = uploadTable.insertRow();
    var cellName = row.insertCell();
    cellName.textContent = filename;
    var cellSize = row.insertCell();
    cellSize.textContent = size;
};

Resulting in a table looking like that:

Now it would be great to change the size column to a more readable expression using our recently created package human-readable.

Preparing the code

To do so using Browserify, we first have to add the package we want to use to our project.

npm i @tsmx/human-readable --save

Note: Since the package dependency is only needed for creating the browserified JavaScript at development time, --save-dev should also be enough in most cases.

Now in our NodeJS project we add a small wrapper script which imports the package and exports exactly the functionality we need in our existing client-side code for building the table. In our case it is a simple function creating a readable string out of an amount of bytes.

// File: browser-utils/human-readable.js

const hr = require('@tsmx/human-readable');

module.exports.getReadableSize = function (bytes) {
    return hr.fromBytes(bytes);
};

This file should be created in a folder which is committed to your code repo but is neither deployed to production nor served statically as an asset by the server.

In my case the folder is browser-utils, which I added to my .gcloudignore (because I’m using Google App Engine) to exclude it from production deployment and which is not served statically by Express – in contrast to the js folder where all client-side scripts reside.

Browserifying it

Once that is done, we’re ready to generate a client-side JavaScript bundle using Browserify. For that you first need to install Browserify.

npm i -g browserify

In our browser-utils folder we then call Browserify to pack our wrapper script and create the client-side script.

browserify human-readable.js --standalone hr >  ../js/human-readable-utils.js

This creates a new script human-readable-utils.js in the static served folder js for client-side usage. Note the standalone option: with this Browserify creates a “named” UMD module that can be used in other modules/scripts. In this case the name is hr. You can omit this option if your client-side script is not used by any other script. Here it is needed because I want to use a function from the generated script human-readable-utils.js in the already existing script upload-utils.js. For more details see the Browserify options docs.

The final file/folder structure looks like that:

path-to-your-app/
β”œβ”€β”€ js/                  # served static e.g. via express.static 
β”‚   β”œβ”€β”€ ...
β”‚   β”œβ”€β”€ human-readable-utils.js
β”‚   └── upload-utils.js
β”œβ”€β”€ browser-utils/       # not served, not deployed in production 
β”‚   β”œβ”€β”€ ...
β”‚   └── human-readable.js
β”œβ”€β”€ app.js
└── package.json

Having the browserified client-side script, we can change the existing script and use our exported function via the hr name prefix.

// File: js/upload-utils.js

function createUploadTableEntry(filename, size) {
    var uploadTable = document.getElementById('upload-table');
    var row = uploadTable.insertRow();
    var cellName = row.insertCell();
    cellName.textContent = filename;
    var cellSize = row.insertCell();
    cellSize.textContent = hr.getReadableSize(size); // use browserified function
};

The last thing you have to do is make sure that the browserified bundle is loaded before the existing code so that it is known and can be used.

<script src="/js/human-readable-utils.js"></script>
<script src="/js/upload-utils.js"></script>

That’s it – we’re done. Reloading the page, the npm package is now used to create human-readable strings for the file sizes.

Improving the development process with watchify

So far so good. We’re now able to use our npm package in client-side JavaScript in the browser. One drawback is that we always have to browserify the script again whenever we change something in the underlying wrapper script. This could easily be forgotten…

To mitigate this issue we can use watchify during development. It’s like a watchdog for changed files which automatically triggers the browserifying process. It takes the same arguments as the browserify command, except that -o is mandatory. First install it as a global package.

$ npm i -g watchify

Then run it in the main folder of the project, e.g. in a VS Code terminal.

$ watchify ./browser-utils/human-readable.js --standalone hr -v -o ./js/human-readable-utils.js
5583 bytes written to ./js/human-readable-utils.js (0.03 seconds) at 8:49:12

Watchify will now create a new browserified script each time the source is saved. To integrate it a bit more into your development and deployment process, I suggest creating script entries in package.json for both one-time browserifying and continuous watching.

...
"scripts": {
    "start": "node app.js",
    "build-utils": "browserify ./browser-utils/human-readable.js --standalone hr > ./js/human-readable-utils.js",
    "build-utils-dev": "watchify ./browser-utils/human-readable.js --standalone hr -v -o ./js/human-readable-utils.js"
  },
...

This allows you to access these commands via npm run build-utils and npm run build-utils-dev where needed, or also via the VS Code task runner features (F1 –> task –> …).

Additionally, you could set up a task in VS Code to run the watchify process every time you open a specific folder.
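
A possible .vscode/tasks.json for that could look like the following sketch, running the build-utils-dev script from above when the folder is opened (VS Code will ask you once to allow automatic tasks):

{
  "version": "2.0.0",
  "tasks": [
    {
      "label": "watchify human-readable-utils",
      "type": "npm",
      "script": "build-utils-dev",
      "isBackground": true,
      "runOptions": { "runOn": "folderOpen" }
    }
  ]
}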
